Variations in the policies that virtual schools use for course enrollment trial periods and course completion affect the comparability of attrition statistics. We contacted 159 U.S. virtual schools and received responses from 86 schools, a response rate of 54%. Of the respondents, 68.6% had trial periods, which varied in length from one day to 185 days. Course completion definitions varied considerably, from remaining in the course irrespective of the final grade to receiving an A-, considered a passing grade. These differences were examined by geographical region and school type. We recommend virtual schools adopt multiple measures for calculating student attrition to allow meaningful comparisons among virtual schools and with brick-and-mortar schools.
E-Learn 2008 - U.S. Virtual School Trial Period and Course Completion Policy Study
1. U.S. Virtual School Trial Period and Course Completion Policy Study
Abigail Gonzales, Brigham Young University & University of Nevada, Las Vegas
Dr. Michael K. Barbour, Wayne State University
E-Learn Conference 2008
3. State of Virtual Schools in U.S.
Explosive growth
Student population primarily supplementary
Variety of types of virtual schools
Statewide, virtual charter, multi-district/consortia, single-district, private, for-profit, & university
Geographic location
High concentration in Western & Southeastern states
Northeastern states slow adopters
4. Challenges of virtual schooling
Attrition is a significant problem (Carr, 2000; Lary, 2002; Rice, 2006)
Multiple factors contribute to differences
Non-learning related factors – Policy adoption
When we start counting students
How we count them
5. Purpose of Study
1. Examine variation in trial period policies in the US
Variability across school types & geographic regions
2. Examine variation in how US virtual schools define course completions
Variability across school types & geographic regions
6. Significance of Study
Is there a need to standardize?
Cannot standardize a metric without knowing the current landscape
Are the adopted policies context-specific?
7. Review of Literature
Researchers call for standardizing performance measures (Smith et al., 2005; Pape et al., 2006; Watson et al., 2006)
Limited research examining these two policies
Pape et al. (2006) compared 3 virtual schools
2 trial periods: 3 and 5 weeks
2 defined completion as 60%, 1 used a “qualitative tag”
Evidence trial periods can sift out weaker students (Ballas & Belyk, 2000; Cavanaugh, Gillan, Bosnick, Hess, & Scott, 2005; McLeod, Hughes, Brown, Choi, & Maeda, 2005)
When to count
Course completion definitions affect retention
8. Methods
Sampling Procedures
159 US schools
Schools listed in NACOL’s Online Learning Clearinghouse List ’07
State-led schools in Keeping Pace with K-12 Online Learning (Watson, 2007)
Survey Study
3-question email survey with introduction and purpose
Presence of trial period
Length of trial period in days
Definition of course completion
9. Survey email
4 contact attempts (2 emails, fax, phone)
Addressed to school principal, director, or
registrar
Addressed by name when possible
10. Methods
Virtual school: a state-approved / regionally accredited school offering credit through distance learning methods including the internet (Clark, 2001)
School type taxonomy from Cavanaugh, Barbour, and Clark (2008)
Regional divisions for the US from Watson & Ryan (2007)
11. US Geographical Regions
Northeastern States
Central States
Western States
Southeastern States
12. Sample by Region
Region               US Sample   US % of Sample
Central States       41          25.5
Northeastern States  18          11.2
Southeastern States  33          20.5
Western States       67          41.6
Total                159         100
13. Sample by School Type
School type                        US    US %
Cyber Charter                      34    21.1
For Profit                         9     5.6
Multi-district                     11    6.8
Private                            21    13.0
Single-district                    49    30.4
State-led                          24    14.9
University-led                     11    6.8
Other (Aboriginal, Unknown, etc.)  0     0
Total                              159   100%
15. Responses by School Type
School type       US    US %
Cyber Charter     16    18.2
For Profit        1     1.1
Multi-district    7     8.0
Private           13    14.8
Single-district   26    29.5
State-led         17    19.3
University-led    8     9.1
Totals            88    100%
16. Representativeness by School Type
School type       US Sample %   US Response %   % Difference
Cyber Charter     21.1          18.2            2.9
For Profit        5.6           1.1             4.5
Multi-district    6.8           8.0             -1.2
Private           13.0          14.8            -1.8
Single-district   30.0          29.5            .5
State-led         14.9          19.3            -4.4
University-led    6.8           9.1             -2.3
17. Representativeness by Region
Region               US Sample %   US Response %   % Difference
Central States       25.5          26.1            -.6
Northeastern States  11.2          9.1             2.1
Southeastern States  20.5          22.7            -2.2
Western States       41.6          42              -.4
21. Trial period length variations by…
School type:
Significant at p = .05, df = 5, F = 3.909
Differences: private schools vs. state-led, cyber charter, and single-district schools
Private schools had shorter trial periods compared to other schools
Geographical region:
No significant difference
25. Course Completion Definitions: Other
Definitions                            US   US %
Mastery not defined by grade           1    1.2
Individual schools define completion   4    4.7
Totals                                 5    5.9%
28. Findings Summary
Trial Period Presence
Prevalent practice: ~70%
Trial Period Length
Average length: ~20 days
Most common lengths: 2 and 4 weeks
Regional differences: not significant
School type: significant – private schools
29. Findings Summary
Course completion definitions
Wide variation between and within groups
Remain in course
Future Research
Student characteristics, experiences, and reasons for dropping out during the trial period
Comparison study with Canadian trial period and course completion policies
30. Implications
Need common metrics for calculating attrition
Best if the same as brick-and-mortar schools
Gather data for internal and external reporting
Internal = institutional metrics
External = standardized metrics
Determining a metric is easier since geography and school type factor little
31. Participant Discussion
How do you determine or set your trial period policies and completion definitions?
What influences you?
Should a common metric be established?
Who would determine the standardized metric?
What would be the optimal trial period / course completion policy?
What other metrics / policies need standardization?
Questions?
32. References
Ballas, F. A., & Belyk, D. (2000). Student achievement and performance levels in online
education research study. Red Deer, AB: Schollie Research & Consulting. Retrieved July
31, 2005, from http://www.ataoc.ca/files/pdf/AOCresearch_full_report.pdf
Carr, S. (2000). As distance education comes of age, the challenge is keeping the
students. The Chronicle of Higher Education, 46(23), A39-41.
Cavanaugh, C., Gillan, K. J., Bosnick, J., Hess, M., & Scott, H. (2005). Succeeding at
the gateway: Secondary algebra learning in the virtual school. Jacksonville, FL: University of
North Florida.
Cavanaugh, C., Barbour, M., & Clark, T. (2008, March). Research and practice in K-12
online learning: A review of literature. Paper presented at the annual meeting of the
American Educational Research Association, New York.
Clark, T. (2000). Virtual high schools: State of the states - A study of virtual high school
planning and preparation in the United States: Center for the Application of Information
Technologies, Western Illinois University. Retrieved July 4, 2005, from
http://www.ctlt.iastate.edu/research/projects/tegivs/resources/stateofstates.pdf
Lary, L. (2002). Online learning: Student and environmental factors and their
relationship to secondary student school online learning success. Unpublished
dissertation, University of Oregon.
33. References Continued
McLeod, S., Hughes, J. E., Brown, R., Choi, J., & Maeda, Y. (2005). Algebra
achievement in virtual and traditional schools. Naperville, IL: Learning Point Associates.
Pape, L., Revenaugh, M., Watson, J., & Wicks, M. (2006). Measuring outcomes in K-
12 online education programs: The need for common metrics. Distance Learning, 3(3),
51-59.
Rice, K. L. (2006). A comprehensive look at distance education in the K-12 context.
Journal of Research on Technology in Education, 38(4), 425-448.
Roblyer, M. D. (2006). Virtually successful: Defeating the dropout problem through
online school programs. Phi Delta Kappan, 88(1), 31-36.
Smith, R., Clark, T., & Blomeyer, R. L. (2005). A synthesis of new research on K-12 online
learning. Naperville, IL: Learning Point Associates.
Tucker, B. (2007). Laboratories of reform: Virtual high schools and innovation in public
education. Retrieved April 20, 2008, from
http://www.educationsector.org/usr_doc/Virtual_Schools.pdf
Watson, J. F., & Ryan, J. (2007). Keeping pace with K-12 online learning: A review of state-
level policy and practice. Vienna, VA: North American Council for Online Learning.
Retrieved September 23, 2007, from http://www.nacol.org/docs/KeepingPace07-color.pdf
Editor's Notes
Ask audience members to introduce themselves and say what they do. What interested them in attending? Does their institution have a trial period? If so, what length? How do they define course completions? Illustrate the wide variability even within the room.
1. From 2005 to 2007, state-led schools went from 21 to 42 according to Watson and Ryan’s Keeping Pace with K-12 Online Learning. 2. An estimated 700,000 (Tucker, 2007) to 1 million (Christensen & Horn, 2008) students are participating in K-12 online learning. 3. Michigan requires an e-learning component as a high school graduation requirement; other states may soon follow. 4. The student population is primarily using online learning to supplement their brick-and-mortar courses, but there is a growing trend toward more full-time programs. 5. Variety in providers. 6. Variety in geographic distribution.
Retention rates are a key indicator of the health of a school. In virtual schools, however, attrition tends to be a significant problem, and reported attrition rates range widely: 12-40% (Lary, 2002); 50% (Rice, 2006); as low as 3% and as high as 70% (Roblyer, 2006). The wide range in what is reported has different causes: some are learning-related, others are policy factors and a lack of agreement on how to calculate the metric.
Is there a need to standardize? Or have we already done this organically? There has been a call to standardize, but before we can do this it is important to know the current landscape. If there are policies that institutions have gravitated to en masse, determining a standard metric becomes easier. Are these policies context-specific?
In Pape et al.'s 2006 article, Measuring outcomes in K-12 online education programs: The need for common metrics, she examined 3 virtual high schools: Virtual High School Global Consortium (VHS), Illinois Virtual High School (IVHS), and Connections Academy (CA). The qualitative tag included “intertwining metrics: attendance, participation, and performance” (p. 55); these data were combined to calculate a “qualitative tag” depicting performance ranging from “Satisfactory” to “Alarm.” Long trial periods can act as a sifting mechanism during which weaker students drop out, masking attrition rates for lower-performing students. The FLVS 1999-2000 evaluation report reported a 71% retention rate; if dropout students from the trial period were included, it was 54%. In turn, virtual schools with generous trial periods would be able to report high retention rates because students who were having trouble and would likely have struggled to complete the course would have dropped out by the time the virtual school began counting them as students.
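The sifting effect described above is simple arithmetic. A minimal sketch, using hypothetical enrollment numbers chosen so they reproduce the 71%/54% FLVS figures (they are not taken from the evaluation report itself):

```python
def retention_rate(completers, enrolled):
    """Retention as the share of counted students who complete."""
    return completers / enrolled

# Hypothetical cohort: 1,000 students enroll, 240 drop during the
# trial period, and 540 ultimately complete the course.
enrolled = 1000
trial_drops = 240
completers = 540

# Counting only students who survive the trial period
after_trial = enrolled - trial_drops          # 760 "official" students
print(round(retention_rate(completers, after_trial) * 100))  # 71

# Counting every student who ever enrolled
print(round(retention_rate(completers, enrolled) * 100))     # 54
```

The same cohort yields a 71% or a 54% retention rate depending solely on when the school starts counting, which is why trial period policy matters for comparability.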
159 US schools located via the NACOL Clearinghouse list and state-led schools from Keeping Pace with K-12 Online Learning 2007. Canadian schools selected based on. Email survey: 3 questions, 2 open-ended.
Used Clark’s 2001 definition of a virtual school as “a state approved and/or regionally accredited school that offers secondary credit courses through distance learning methods that include Internet-based delivery.” Limitations: the definition of a virtual school was not accepted by some people in our study, and they dropped out.
Almost 50% of schools in the sample were western, which is not surprising given the history of distance education in the west. Then Central, then Southeastern.
US: Single district, state led, and cyber charter schools accounted for 66.4% of virtual schools in our sample.
Canada: 29.9%. Response breakdown by country: the majority of responses came from US schools.
Single-district, cyber charter, and state-led: 67%. A fairly representative set of responses compared to the sample set.
US: Of the 88 schools surveyed, 27 schools had no trial period compared to 61 schools that had one, so trial periods were a common practice in the US. There were several instances where the trial period was marked by an event such as submitting your first assignment, taking your first quiz, or paying your tuition, in contrast to a fixed time period, which was the more common practice in the US.
US most common: 28-30 days (about 4 weeks) accounted for 28.3% of the sample; 14-15 days (2 weeks) accounted for 26.7% of the sample.
Ran one-way ANOVAs to see if there were any significant differences in trial length based on school type or region. For school type, the US result was significant, with an F value of 3.909. Did a post hoc Tukey test to see which variables were significant and found that private schools differed significantly from state-led, cyber charter, and single-district schools.
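The one-way ANOVA used here compares between-group to within-group variance. A plain-Python sketch of the F statistic, using hypothetical trial-period lengths by school type (not the study's actual data):

```python
def one_way_f(groups):
    """One-way ANOVA F statistic: ratio of between-group to
    within-group mean squares for a list of sample groups."""
    all_vals = [x for g in groups for x in g]
    grand_mean = sum(all_vals) / len(all_vals)
    k = len(groups)            # number of groups
    n = len(all_vals)          # total observations
    # Between-group sum of squares (group means vs. grand mean)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares (observations vs. their group mean)
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Hypothetical trial-period lengths in days, by school type
private = [5, 7, 10, 14]
state_led = [21, 28, 30, 28]
cyber_charter = [14, 28, 30, 21]
print(one_way_f([private, state_led, cyber_charter]))
```

A large F relative to the critical value at p = .05 with the relevant degrees of freedom indicates at least one group mean differs; a post hoc test such as Tukey's HSD is then needed to identify which pairs differ.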
Wide range in completion definitions.
Wide range, from passing the course with 60% to a mastery level of 90% or better. The US had significant variation within this category.
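How much the chosen definition matters is easy to see with a toy gradebook. The grades below are hypothetical; the three thresholds illustrate the "remain in course", 60% passing, and 90% mastery definitions reported in the study:

```python
# Hypothetical final percentage grades for ten students in one course
grades = [95, 88, 72, 65, 61, 59, 55, 90, 45, 78]

def completion_rate(grades, threshold):
    """Share of students counted as completers under a grade threshold."""
    return sum(g >= threshold for g in grades) / len(grades)

# The same class under three completion policies
print(completion_rate(grades, 0))    # "remain in course": 1.0
print(completion_rate(grades, 60))   # passing at 60%: 0.7
print(completion_rate(grades, 90))   # mastery at 90%: 0.2
```

One class, three defensible completion rates: this is why completion statistics cannot be compared across schools without knowing the definition behind each number.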
Not defined by grade: an elementary school.
This study gives us evidence, beyond anecdote or guess, that variations are significant and there is a need to standardize trial period policies and course completion definitions. We need to count students at the same time and in the same manner. Ideally, we would align this with how brick-and-mortar schools calculate attrition/retention to allow for comparisons.
How do you determine…? Drivers: Colorado Online Learning changed their trial period from 2 weeks to 5 weeks because a competitor school had this length and they wanted their attrition rates to be comparable. If a standardized metric were to be established, who should determine it?