
Economics of Education Review 37 (2013) 46–57


The impact of online learning on students' course outcomes: Evidence from a large community and technical college system

Di Xu*, Shanna Smith Jaggars

Community College Research Center, Teachers College, Columbia University, Box 174, 525 West 120th Street, New York, NY 10027, United States

ARTICLE INFO

Article history:
Received 27 November 2012
Received in revised form 1 August 2013
Accepted 5 August 2013

JEL classification: I21, I23, I28

Keywords: Online learning; Community colleges; Student performance; Instrumental variable analysis

ABSTRACT

Using a large administrative dataset from a statewide system including 34 community and technical colleges, the authors employed an instrumental variable technique to estimate the impact of online versus face-to-face course delivery on student course performance. The travel distance between each student's home and college campus served as an instrument for the likelihood of enrolling in an online section of a given course. In addition, college-by-course fixed effects controlled for within- and between-course selection bias. Analyses yield robust negative estimates for online learning in terms of both course persistence and course grade, contradicting the notion that there is no significant difference between online and face-to-face student outcomes—at least within the community college setting. Accordingly, both two-year and four-year colleges may wish to focus on evaluating and improving the quality of online coursework before engaging in further expansions of online learning.

© 2013 Elsevier Ltd. All rights reserved.

* Corresponding author. Tel.: +1 212 678 3044; fax: +1 212 678 3699.
E-mail addresses: [email protected], [email protected] (D. Xu), [email protected] (S.S. Jaggars).

1. Introduction

For two decades, state financing of higher education has been on the decline (Kane, Orszag, & Gunter, 2003). Public postsecondary institutions have responded by raising tuition, increasing class sizes, cutting programs, and otherwise seeking to reduce costs and improve efficiency. At the same time, colleges have sharply increased their distance education offerings through online coursework—though often with an intent to improve access and convenience for students rather than to reduce costs. In the wake of the recent recession, policy leaders in several states, assuming that online courses must be more cost-effective than face-to-face courses, have championed further expansions in online learning (e.g., Chen, 2012; Fain & Rivard, 2013; Texas Higher Education Coordinating Board, 2011). The notion that online courses are more cost-effective than traditional, face-to-face courses is predicated on two assumptions: first, that online course sections are consistently less expensive; and second, that they yield fairly comparable student outcomes.

Although it may seem self-evident that online courses are consistently cheaper than face-to-face courses, there is surprisingly little evidence on online and face-to-face course costs. Most research on the topic is dated (e.g., Hawkes & Cambre, 2000; Jewett, 2000; Jung, 2003; Levine & Sun, 2002; Rogers, 2001; Virginia Community College System, 2001; Whalen & Wright, 1999), and the conclusions drawn from relevant studies are mixed. Rumble (2003) discussed the complexities involved in making generalizations about costs across different types of courses and institutions and concluded that there can be no clear-cut answer as to whether online courses are

0272-7757/$ – see front matter © 2013 Elsevier Ltd. All rights reserved.
http://dx.doi.org/10.1016/j.econedurev.2013.08.001

indeed cheaper. Schiffman (2005) noted that development costs for online courses varied across institutions from $10,000 to $60,000 per course. Based on interviews with presidents, provosts, and other senior academic leaders at more than 25 higher education institutions,1 Bacow, Bowen, Guthrie, Lack, and Long (2012) reported that most institutions provided distance education to better serve student needs rather than to save on costs. In fact, many interviewees believed that online courses were at least as expensive as traditional courses, not only due to their substantial start-up costs (e.g., investments in technology, course design, and instructor training) but also due to recurring costs (e.g., those resulting from increased coordination demands and technical support). Moreover, studies of online course costs have not taken into account the quality or effectiveness of the courses examined, and it is possible that online courses with high completion rates and strong learning outcomes require substantial investments to design and teach.

1 The institutions included public and private research universities, four-year colleges, and community colleges.

The second assumption underlying the cost-effectiveness argument—that online courses produce student outcomes comparable to those produced by face-to-face courses—is also based on relatively weak evidence. Although dozens of studies have compared student performance between online and face-to-face courses, most have been descriptive studies, with no controls for student self-selection. Moreover, the majority have focused on populations (e.g., K-12 students) or contexts (e.g., hour-long educational modules) that are not relevant to the typical online college course. Only a few random-assignment or quasi-experimental studies have focused on semester-length college courses (Caldwell, 2006; Cavus & Ibrahim, 2007; Coates, Humphreys, Kane, & Vachris, 2004; Figlio, Rush, & Lin, 2010; LaRose, Gregg, & Eastin, 1998; Mentzer, Cryan, & Teclehaimanot, 2007; Odell, Abbitt, Amos, & Davis, 1999; Peterson & Bond, 2004; Schoenfeld-Tacher, McConnell, & Graham, 2001). Results of these studies are mixed, leading many college leaders to conclude that online learning at least "does no harm." However, two considerations limit the usefulness of this conclusion.

First, nearly all previous studies have focused on learning outcomes among students who completed the course, and thus have disregarded the potential impact of online delivery on course withdrawal. Ignoring course withdrawal may be reasonable within the context of selective four-year institutions, which typically have low course withdrawal rates. In the community college context, however, descriptive studies have typically reported course withdrawal rates in the 20–30% range, with higher withdrawal rates for online courses (Beatty-Guenter, 2002; Carr, 2000; Chambers, 2002; Moore, Bartkovich, Fetzner, & Ison, 2003). Course persistence and completion is a particularly important issue in community colleges, where most students are low-income, many are working or have dependents, and few can readily afford the time or money required to retake a course they did not successfully complete the first time (Adelman, 2005; Bailey & Morest, 2006; Planty et al., 2009). Thus, studies that focus solely on course completers are minimally helpful to community college administrators contemplating the potential costs and benefits of expanding online course offerings.

Second, it is unclear whether the sets of courses examined in previous research represent the larger body of online courses available in the postsecondary setting, and particularly in the community college setting. Each study in the literature tends to focus on one or two specific courses, which in some cases are selected because they are thought to represent high-quality examples of online coursework. Moreover, each course included in the rigorous research cited above was conducted within a selective college or university (Jaggars & Bailey, 2010)—institutions that are not representative of the less-selective or open-access colleges that make up the bulk of the nation's postsecondary sector. Qualitative research conducted in the community college setting has revealed that most online instructors simply convert their face-to-face instructional materials to printed handouts and text-heavy slide presentations, with few of the interactive technologies that may effectively engage students in online learning (Cox, 2006; Edgecombe, Barragan, & Rucks-Ahidiana, 2013). Although no parallel studies have been conducted in the four-year sector, these findings raise the question of how high-quality the "typical" or "average" online college course may be.

In order to understand student performance in the typical online course within a given sector, it would be most useful to compare a large and representative set of online courses against a similar set of face-to-face courses. Thus far, only one study has done so: Using a dataset including hundreds of course sections from 23 colleges in Virginia's community college system, Xu and Jaggars (2011) found that students fared significantly worse in online courses in terms of both course persistence and end-of-course grades. However, the study was limited to entry-level English and math courses in community colleges in one state, raising the question of whether the results apply to other academic subjects and other state contexts. Moreover, although Xu and Jaggars controlled for a wide array of student, course, and institutional characteristics using multilevel propensity score matching, they could not control for unobserved influences on students' course selection, such as employment status, actual working hours, educational motivation, and academic capacity. Thus, the results could have remained subject to selection bias. Indeed, using an endogenous switching model, Coates et al. (2004) found that online students tended to have "higher levels of unobservable ability that improves their performance under both types of instruction" (p. 543). Thus, failure to account for unobservables underlying student self-selection may underestimate any negative impacts of the online format on student course performance.

This paper builds on Xu and Jaggars' (2011) study of Virginia community colleges by focusing on a different region of the country and using an instrumental variable (IV) technique to control for unobserved confounding

variables. Using a large administrative dataset from Washington State's community and technical college system, we used the distance from a student's home to college as an instrument for the likelihood of enrolling in an online rather than a face-to-face section of a given course. We augmented the IV strategy using college-by-course fixed effects, which allowed us to compare students who took the same course but were enrolled in sections with different delivery formats, potentially controlling for biases related to within- and between-course selection. To assess the effects of taking a course online rather than face-to-face, we explored two course outcomes: (1) course persistence, or whether a student remained in the course through the end of the semester; and (2) final course grade among those who persisted to the end of the course. Our analyses yielded robust estimates of negative impacts of online learning on both course persistence and course grade. Moreover, our IV estimates were consistently stronger than the corresponding OLS estimates across all model specifications, lending support to the Coates et al. (2004) argument that students tend to be positively selected into online coursework, which may bias the negative impacts of online learning toward zero when student self-selection is not well addressed.

2. Data

2.1. Data and institutional characteristics

The study used an administrative dataset of students who initially enrolled in one of Washington State's 34 two-year public community or technical colleges during the fall term of 2004. These first-time college students were tracked for approximately five years, through the summer of 2009. The dataset, provided by the Washington State Board of Community and Technical Colleges, included information on student demographics;2 institutions attended; transcript data on courses taken and grades received; and information on each course, such as course number, course subject, and course delivery format.3 The dataset also included information from Washington State Unemployment Insurance wage records, which allowed us to control for students' working status and working hours in each term.

2 In addition to information on the set of demographic characteristics available in most administrative datasets (e.g., gender, race, age, and financial aid receipt), the dataset included information on socioeconomic status (SES). Students were divided into five quintiles of SES based on census data on the average income in the census block in which the student lived.

3 The system divides course sections into three categories: face-to-face, online, and hybrid. Given that less than 2% of courses were offered in a hybrid format, and that these courses included a substantial on-campus component (i.e., online technology displaced at most 50% of the course delivery), we combined the hybrid and face-to-face formats in this analysis. In a robustness check, we excluded all hybrid courses from the analysis; the results are nearly identical to those presented in Tables 1–4.

The system's dataset does not include courses dropped early in the semester (prior to the course census date, or the 10th instructional day after the quarter begins). After the census date, students are not entitled to a full refund if they drop the course. Those who choose to drop after that point receive a grade of "W," indicating withdrawal. Thus, in our study, "course withdrawal" denotes that a student paid tuition for a course but officially dropped prior to the term's end. "Course persistence" indicates that a student formally remained through the end of the term—although some may have informally stopped working in the course and thus received a failing grade. Students who persisted in each course received a grade ranging from 0.0 to 4.0.4

4 Each student's grade is recorded to one decimal place.

Table 1
Characteristics of Washington State community and technical colleges vs. all U.S. public two-year colleges.

                                    All U.S. public        Washington State
Variables                           two-year colleges      two-year colleges
Student demographics
  % White                           65.89 (23.69)          67.06 (12.96)
  % Black                           14.22 (17.02)          3.82 (3.11)
  % Hispanic                        8.54 (13.67)           5.68 (5.67)
  % Receive federal financial aid   43.94 (18.71)          27.94 (10.63)
  % Enrolled full-time              64.53 (11.87)          64.93 (6.71)
Academics
  Graduation rates                  29.03 (19.42)          32.79 (10.95)
  First-year retention rates        57.73 (13.85)          57.85 (9.76)
Expenditures (dollars per FTE)
  Instructional                     5261.52 (20,987.74)    4848.71 (2133.11)
  Academic                          1003.05 (4365.67)      578.26 (229.78)
  Institutional                     1684.28 (4236.92)      1302.03 (1391.40)
  Student                           1037.52 (1378.74)      1237.12 (1544.99)
Location
  Urban                             39.40%                 59.38%
  Suburban                          23.72%                 21.88%
  Rural                             36.81%                 18.75%
Observations (N)                    1165                   34

Source: Statistics reported to the 2004 Integrated Postsecondary Education Data System (IPEDS) database.
Note: Standard deviations for continuous variables are in parentheses.

The 34 Washington community colleges have widely varying institutional characteristics. The system comprises a mix of large and small schools, as well as institutions located in rural, suburban, and urban settings. Most colleges are comprehensive (offering both transfer-oriented and occupationally oriented degrees), but five are technical colleges that primarily offer occupational degrees. Table 1 describes the 34 colleges' institutional characteristics in fall 2004, based on statistics reported to the Integrated Postsecondary Education Data System (IPEDS) database. Compared to the national sample,

Washington's community and technical colleges are more likely to be located in urban areas and serve lower proportions of African American and Hispanic students, as well as lower proportions of students who receive federal financial aid.

2.2. Sample description

A major assumption underlying the use of distance as an instrument (discussed further in Section 3) is that students do not choose where to live based on unobserved confounding variables that influence both online enrollment and course outcomes. One such potential confounding variable is educational motivation, which may be particularly relevant in the context of community colleges, given the wide variation in their students' educational intent (Alfonso, 2006; Alfonso, Bailey, & Scott, 2005). To address this concern, we focused on in-state students enrolled in an academic transfer-oriented track (N = 22,624), who intended to eventually transfer to a four-year school and earn a bachelor's degree. Among these students, 95% lived within 65 miles of their college, with an average distance of 17 miles.5

5 About 1% lived a considerable distance from their college (182 miles). Given that some of these students also took face-to-face courses at the college, some may have provided their parents' home address rather than their own address. We excluded these students in a robustness check and the results remained consistent.

Because our goal was to understand the impact of online versus face-to-face delivery within specific courses, we excluded courses where all sections were offered through the same delivery format. That is, all courses in our analysis were offered through both online and face-to-face sections. In addition, we excluded developmental education (or "remedial") courses, given that very few of them were offered online. Finally, a handful of courses (<0.003%) were taken at a school that was not the student's primary college, raising the concern that distance could be endogenous in these cases. To be conservative, we dropped those courses from the analysis.6

6 In a separate robustness check, we included those courses in the analysis, and the results were almost identical to those presented in Tables 1–4.

The final analysis sample included 125,218 course enrollments among 18,567 students; approximately 22% of enrollments were in online sections. Student summary statistics are displayed in Table 2. In addition to the statistics for the full student sample (column 1), the table presents the characteristics of students who ever attempted an online course across the five-year period of study ("ever-online" students, column 2) and the characteristics of students who never took an online course during that period (column 3). On a descriptive basis, it appears that the ever-online student population comprised larger proportions of females, White students, students of higher socioeconomic status (SES), students who applied and were eligible for need-based aid, students who lived slightly farther away from their college of attendance, and students who worked more hours in a term. The ever-online student sample also seems to have had a higher level of academic preparedness; larger proportions of ever-online students were dual enrolled prior to college, and ever-online students had higher grade point averages (GPAs) and had earned more credits by the end of their first term.7 These statistics imply that students with stronger academic preparation were more likely to attempt an online section of a given course. However, it is also possible that more prepared students tended to take courses in certain subjects that also happened to have more online sections. To account for this possibility, we used academic subject fixed effects to control for student self-selection into different subject areas (see Section 3.1 for details).

7 Although first-term GPA provides a useful sense of students' initial academic performance, it could be affected by students' choices of online versus face-to-face formats during their first term. However, less than 13 percent (N = 2376) of our sample took an online course in their first term, and when we excluded these students from our analysis, the academic advantage in first-term GPA persisted for ever-online students.

2.3. Online courses in Washington community and technical colleges

Washington's community and technical college system provides a number of supports intended to create an environment conducive to high-quality online learning. In 1998, the system implemented several supports for students in online courses (including an online readiness assessment, a course management system tutorial, and online technical support services) as well as supports for instructors (including required training on the online course management system and voluntary training on effective online pedagogies, advanced technological tools, and other topics).

As in most community college systems (see Cox, 2006), however, each Washington institution developed its online program locally, according to the college's own priorities and resources and the perceived needs of its particular student population. Accordingly, colleges varied considerably in the proportion of online course enrollments (ranging from 10% to 37%). Colleges also exerted local control over course quality standards, instructor evaluations, and campus-level supports for online students and faculty. These varying practices, together with varying student characteristics and programs across colleges, likely contribute to variation in online course outcomes. For example, average online course persistence rates ranged from 84% to 96% across colleges, and average online course grades ranged from 1.54 to 2.97. This school-level variation highlights the importance of controlling for school-level effects in our analysis.

Across the five-year period of the study, online course-taking increased substantially. In the fall of 2004, entering students attempted an average of 1.03 credits online (12% of their term credits); by the spring of 2009, still-enrolled students in the 2004 cohort had more than doubled their rate of online credit attempts, to an average of 2.56 credits (39% of their term credits). This growth was due to two separate trends. First, students in the 2004 cohort were increasingly likely to try at least one online course over

Table 2
Summary statistics.

(I) Student-level characteristics

                                 Full student       Ever-online        Never-online       Diff
                                 sample             student sample     student sample     (ever – never)
Demographic characteristics
  Female                         0.525 (0.499)      0.571 (0.495)      0.475 (0.499)      0.096***
  White                          0.697 (0.460)      0.710 (0.454)      0.682 (0.466)      0.028***
  African American               0.044 (0.205)      0.037 (0.188)      0.052 (0.222)      0.015***
  Hispanic                       0.022 (0.148)      0.021 (0.143)      0.024 (0.154)      0.003
  American Indian                0.014 (0.118)      0.012 (0.108)      0.017 (0.129)      0.005***
  Asian                          0.075 (0.264)      0.077 (0.266)      0.074 (0.262)      0.003
  Alaska Native                  0.001 (0.034)      0.001 (0.029)      0.001 (0.038)      0.000
  Native Hawaiian                0.004 (0.060)      0.004 (0.059)      0.004 (0.062)      0.000
  Pacific Islander               0.002 (0.050)      0.001 (0.035)      0.004 (0.062)      0.003***
  Multiracial                    0.041 (0.199)      0.042 (0.200)      0.041 (0.198)      0.001
  Unknown race                   0.062 (0.242)      0.061 (0.239)      0.064 (0.245)      0.003
  Age                            21.304 (6.585)     21.444 (6.641)     21.151 (6.521)     0.293***
  Eligible for need-based aid    0.421 (0.494)      0.444 (0.497)      0.397 (0.489)      0.047***
  Highest SES                    0.177 (0.382)      0.188 (0.391)      0.165 (0.371)      0.023***
  Higher SES                     0.223 (0.417)      0.229 (0.420)      0.218 (0.413)      0.011*
  Middle SES                     0.206 (0.405)      0.202 (0.402)      0.211 (0.408)      0.009
  Lower SES                      0.180 (0.385)      0.176 (0.381)      0.185 (0.388)      0.009
  Lowest SES                     0.137 (0.344)      0.131 (0.337)      0.145 (0.351)      0.014***
  Unknown SES                    0.076 (0.265)      0.074 (0.263)      0.078 (0.267)      0.004
  Hours worked per week          14.889 (13.380)    15.536 (13.201)    14.187 (13.357)    1.349***
  Distance to college (miles)    17.248 (13.895)    17.537 (14.228)    16.935 (13.519)    0.602***
Academic characteristics
  Took developmental ed.         0.601 (0.490)      0.594 (0.491)      0.607 (0.489)      0.013*
  Limited English proficiency    0.002 (0.040)      0.002 (0.041)      0.002 (0.040)      0.000
  Dual enrolled prior to entry   0.087 (0.282)      0.094 (0.292)      0.080 (0.272)      0.014***
  GPA in first term††            2.888 (0.947)      2.981 (0.872)      2.784 (1.014)      0.197***
  Credits accrued first term     11.200 (4.857)     11.633 (4.717)     10.731 (4.963)     0.902***
  Credits taken per term         12.847 (3.302)     13.033 (3.109)     12.650 (3.484)     0.383***
Observations                     18,567             9655               8912

(II) Course-level characteristics and outcomes

                                 Full course        Online course      Face-to-face       Diff
                                 sample             sample             course sample      (face-to-face – online)
  Online delivery format         0.218 (0.413)      1.000 (0.000)      0.000 (0.000)      –
  Course persistence             0.933 (0.249)      0.907 (0.293)      0.941 (0.235)      0.034***
  Course grade†††                2.652 (1.281)      2.539 (1.416)      2.682 (1.240)      0.143***
Observations                     125,218            27,331             97,887

Notes: Standard deviations are in parentheses. * Significant at the 10% level. *** Significant at the 1% level.
†† For "GPA at the end of first term," N = 17,355 for the full student sample, N = 9170 for the ever-online student sample, and N = 8185 for the never-online student sample.
††† For "course grade," N = 116,830 for the full course sample.

time. Second, among only students who were actively online in a given term, the percentage of credits taken online also increased across terms. To account for growth over time, we include controls for term-level variation in our analysis.

3. Methods

3.1. Basic empirical model

To assess the effects of online course delivery, we used regression techniques, beginning with a probit model for course persistence and an OLS model for course grade. Our basic strategy related student i's course outcomes in subject k at campus j in term t to the course format in the following equation (using course grade as an example):

    Y_itkj = α + β·online_itkj + γX_i + π_t + ρ_k + σ_j + μ_itkj    (1)

In this equation, online is the key explanatory variable and is equal to 1 if the course is taken online. We incorporated a rich set of controls into our model, where X_i includes demographic attributes (e.g., age, gender, race, and SES), academic preparedness (e.g., remedial status and previous dual enrollment), and semester-level information (e.g., working hours in the current term and total credits taken in the current term).8 In addition, we included fixed effects for the term of enrollment in the course (π_t), the subject of the course (ρ_k), and the campus of attendance (σ_j).

8 The full list of covariates includes dummy variables for gender, race, socioeconomic status, receipt of federal financial aid, limited English proficiency, dual enrollment prior to college, whether the student enrolled in a remedial course, and whether the student was enrolled full-time in the given term. Continuous variables include the total number of credits enrolled in that term and total working hours in that term.

3.2. Addressing between-course selection using a college-by-course fixed effects approach

By including college, term, and course subject fixed effects, Eq. (1) addresses two potential problems related to student selection of online courses. First, students may choose course subjects based on their preference for online or face-to-face course formats. For example, if a campus offers sociology but not psychology online, then a student who prefers to take online courses may choose to fulfill his or her social science requirement with the online sociology course rather than the face-to-face psychology course. Second, online courses may be more prevalent within particular colleges, terms, departments, or course subjects. Thus, for example, students enrolled in an English program may be more likely to enroll in online courses than those in an engineering program.

Although Eq. (1) addresses these issues, it cannot account for a potential third problem: Certain courses (even within a particular college, term, and subject) may be more likely to be offered online. For example, suppose that within a given department, advanced courses were more likely to be offered online than entry-level courses. A direct comparison of online and face-to-face sections across these courses would then result in biased estimates. To address this problem, we used an additional model that combined college-by-course fixed effects with term fixed effects, thus effectively comparing online and face-to-face sections of the same course.9

9 Each college-by-course fixed effect includes the subject area and college of attendance (for example, "coll101math111"). Thus academic subject and college fixed effects become redundant and are automatically dropped when college-by-course fixed effects are added to the model.

3.3. Addressing within-course selection using an instrumental variable approach

Although college-by-course fixed effects are an effective means of controlling for student self-selection into different courses, there may be some remaining selection issues if students systematically sort between online and face-to-face sections of a single course. To deal with this concern, we employed an IV approach, using a variable related to the treatment but theoretically unrelated to the outcome to identify the treatment effect. In this analysis, we used the distance from each student's home to their college campus as an instrument for the student's likelihood of enrolling in an online rather than face-to-face section. Specifically, we first identified the associated geocode for each address in the dataset, including both student home addresses and college addresses; we then used Google Maps to calculate the "travel distance" between each student's home and their college of attendance. Given that online courses offer the flexibility of off-site education, students who live farther from their own college campus might be more likely to take advantage of online courses, compared with students who live closer to their college. Using distance as an instrumental variable, we modified Eq. (1) to use an IV approach. Specifically, we first predicted the probability that an individual i took a particular course c online using a probit model:

    Prob(online_ict) = Φ(α + δ₁·distance_i + δ₂·distance_i² + γX_i + π_t + ζ_c + μ_ict)    (2)

where Φ represents the cumulative distribution function of the standard normal distribution and ζ_c represents college-by-course fixed effects. Consistent estimates of the relative impact of online course delivery can then be derived by using the estimated probabilities from Eq. (2) as instruments for the endogenous dummy variable online_ict in a 2SLS estimation process.10

10 See Wooldridge (2002) for a detailed discussion of using nonlinear models in the first stage of an instrumental variable analysis. Similar procedures are also illustrated in Coates et al. (2004).

There are four potential concerns with using distance as an instrument. First, researchers (e.g., Long & Kurlaender, 2009) have argued that distance may be a problematic instrument when using national datasets because of differences in the way distance is perceived across the country. This concern is limited in the current context, given that we focused on one state; in our sample, the average distance from a student's home to the college of attendance was 17 miles, with nearly 90% of students

living within 25 miles. It is unlikely that perceptions of distance would be fundamentally different within such a small range. In addition, given the mountainous terrain in Washington State, where short distances may translate into long commutes, we used travel distance rather than direct-line distance. Moreover, we also used a nonlinear probit model and added a quadratic term (d²) into the first-stage IV equation to take into account the possibility that the relationship between travel distance and the probability of online enrollment may not be linear.

Second, one might be concerned about potential endogeneity issues in terms of travel distance. Some researchers have suggested that individuals or families who value education might choose to live near a college campus (e.g., Card, 1995; Long & Kurlaender, 2009; Rouse, 1995); similarly, the quality of a student's previous high school instruction might vary systematically with the high school's distance from the nearest college. Moreover, students who live close to a college campus may be able to access support services more readily, which could heighten their academic success. Our sample limitation (including only students who were homogeneous in their intent to earn a four-year degree) may limit the impacts of such factors; however, to more rigorously assess their potential impacts, we also conducted a validity check by assessing the relationships between course outcomes and distance for a sample of face-to-face courses (see Section 4.3).

Third, using an instrumental variable strategy may be more appropriate for examining course completion among all students who enrolled in a course than for examining course grades among those who persisted in the course. Examining the outcome of course grades only among persisters may introduce additional self-selection bias, if persistence rates differ by course delivery format. However, as discussed in Section 4, online courses have higher attrition rates, which may leave online courses with relatively better-prepared students by the end of the course. Thus, using grades conditional on persistence as the outcome is likely to underestimate rather than overestimate the negative effect of online delivery on students' grades.

Finally, distance will be effective as an instrumental variable only if it has a relationship to online course enrollment. We explore this issue in Section 4.2.

4. Results

4.1. Ordinary least squares estimates

In descriptive terms, across the total sample of 125,218 course enrollments, the overall course persistence rate was 93%, with a gap between online courses (91%) and face-to-face courses (94%). For enrollments that persisted until the end of the semester (N = 116,830), the average grade was 2.65 (on a 4-point scale), also with a gap between online courses (2.54) and face-to-face courses (2.68).11

11 Please see the bottom panel in Table 2 for the standard deviation of each outcome variable by course delivery format.

Table 3
OLS/probit estimates of the impact of the online format (and each covariate) on course persistence and course grade.

                                        Course persistence     Course grade
                                        Coefficient (SE)       Coefficient (SE)
Online delivery format                  -0.257*** (0.018)      -0.197*** (0.018)
  (Marginal effect)                     -0.036*** (0.003)      –

Covariates: demographic characteristics
Female                                  0.007*** (0.002)       0.198*** (0.009)
African American (base group: White)    -0.014*** (0.005)      -0.464*** (0.024)
Hispanic                                -0.017*** (0.007)      -0.169*** (0.038)
American Indian                         -0.020*** (0.008)      -0.257*** (0.040)
Asian                                   0.006** (0.003)        0.021 (0.018)
Alaska Native                           -0.097*** (0.039)      -0.627*** (0.141)
Native Hawaiian                         -0.032** (0.016)       -0.168*** (0.063)
Pacific Islander                        -0.036** (0.021)       -0.544*** (0.097)
Multi-racial                            -0.014*** (0.004)      -0.225*** (0.023)
Unknown race                            -0.002 (0.003)         -0.041** (0.019)
Age                                     0.000 (0.000)          0.024*** (0.001)
Eligible for need-based aid             0.017*** (0.002)       0.081*** (0.010)
Higher SES (base group: highest SES)    -0.003 (0.002)         -0.041*** (0.014)
Middle SES                              -0.010*** (0.003)      -0.002 (0.016)
Lower SES                               -0.001 (0.003)         -0.013 (0.017)
Lowest SES                              -0.014*** (0.003)      -0.121*** (0.019)
Unknown SES                             -0.005 (0.004)         -0.045** (0.021)
Hours worked per week                   -0.000*** (0.000)      -0.002*** (0.000)

Covariates: academic characteristics
Took developmental education            -0.003 (0.002)         -0.141*** (0.011)
Limited English proficiency             -0.026 (0.013)         -0.198** (0.083)
Dual enrolled prior to entry            0.002 (0.003)          0.127*** (0.016)
Credits taken this term                 0.001*** (0.000)       0.019*** (0.003)
Enrolled full time this term            0.012*** (0.003)       0.048** (0.019)
Observations                            125,218                116,830

Notes: Because the data include multiple observations within each course, standard errors for all models are adjusted for clustering at the course level. See Wooldridge (2003) for a detailed discussion of the necessity and methods of adjusting standard errors when individual observations are clustered. We used the student-level variable "average credits taken per term" in Table 1 to describe student sample characteristics; in the regression analysis on the course-level sample, we used the course-level variable of the actual number of credits enrolled in the given term as the covariate.
** Significant at the 5% level.
*** Significant at the 1% level.
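The marginal effects reported alongside the probit coefficients in Table 3 are calculated by averaging the derivative at each observation. A minimal sketch of that calculation in Python, using simulated data and illustrative coefficient values of roughly the paper's magnitude (not the authors' data, code, or estimates):

```python
import math
import random

def norm_pdf(z):
    """Standard normal density phi(z)."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

# Illustrative probit coefficients: intercept, online dummy, one covariate.
# Made-up values chosen so the baseline persistence rate is near 93%.
beta = [1.5, -0.257, 0.05]

random.seed(0)
# Simulated design matrix: constant, online indicator (~30% online), covariate.
X = [[1.0, 1.0 if random.random() < 0.3 else 0.0, random.gauss(0.0, 1.0)]
     for _ in range(20_000)]

def average_marginal_effect(X, beta, k):
    """Average over observations of dP(y=1)/dx_k = phi(x'beta) * beta_k."""
    total = 0.0
    for x in X:
        xb = sum(b * v for b, v in zip(beta, x))
        total += norm_pdf(xb) * beta[k]
    return total / len(X)

ame = average_marginal_effect(X, beta, k=1)
print(round(ame, 3))  # negative: a few-percentage-point drop in persistence
```

Statistical packages report the same quantity as an "average marginal effect"; for a dummy regressor, the discrete-change version (averaging the difference in predicted probabilities with the dummy set to 1 versus 0) gives a very similar answer at these magnitudes.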
Table 4
Estimates of the effect of taking a course online, based on different model specifications.

                          OLS/probit estimates               IV estimates
                          (1)        (2)        (3)          (4)        (5)        (6)
Dependent variable: course persistence
Online format (SE)        -0.257***  -0.298***  -0.311***    -0.420***  -0.457***  -0.509***
                          (0.018)    (0.017)    (0.017)      (0.118)    (0.126)    (0.132)
Marginal effect (SE)      -0.036***  -0.041***  -0.044***    -0.054***  -0.058***  -0.065***
                          (0.003)    (0.003)    (0.003)      (0.016)    (0.017)    (0.018)
Dependent variable: course grade
Online format (SE)        -0.196***  -0.233***  -0.266***    -0.228***  -0.270***  -0.324***
                          (0.018)    (0.017)    (0.016)      (0.089)    (0.098)    (0.110)
College and subject FE    No         Yes        Yes          No         Yes        Yes
Year-term FE              No         Yes        Yes          No         Yes        Yes
College-by-course FE      No         No         Yes          No         No         Yes

Notes: N = 125,218 for the analysis on course persistence; N = 116,830 for the analysis on course grade. Standard errors for all models are adjusted for clustering at the course level. Each cell represents a different regression specification. All models also include the following covariates: gender, ethnicity dummy variables, socioeconomic status dummy variables, receipt of federal financial aid, limited English proficiency, dual enrolled prior to college, ever enrolled in remedial courses, total credits taken in that term, total working hours in that term, and full-time (vs. part-time) college enrollment in that term.
*** Significant at the 1% level.
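The pattern in Table 4, with IV estimates consistently larger in magnitude than the OLS estimates, is what one would expect if more motivated students select into online courses. A small stylized simulation (our illustration, with made-up parameter values, not the authors' data or code) shows how an unobserved motivation term that raises both online enrollment and grades pulls the naive comparison toward zero, while a distance-style instrument recovers the true negative effect via the simple Wald estimator:

```python
import random

random.seed(42)
n = 80_000
grades, online, distance = [], [], []
for _ in range(n):
    motivation = random.gauss(0.0, 1.0)    # unobserved; raises both outcomes
    miles = abs(random.gauss(15.0, 10.0))  # travel distance to campus
    # Selection: farther-living AND more-motivated students enroll online more.
    takes_online = 1.0 if 0.02 * miles + 0.3 * motivation + random.gauss(0, 1) > 1.0 else 0.0
    # True causal effect of online delivery on course grade: -0.3 points.
    grade = 2.8 - 0.3 * takes_online + 0.4 * motivation + random.gauss(0, 0.5)
    grades.append(grade); online.append(takes_online); distance.append(miles)

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

ols = cov(online, grades) / cov(online, online)     # naive comparison, biased toward zero
iv = cov(distance, grades) / cov(distance, online)  # Wald/IV estimator, near the true -0.3

print(f"OLS: {ols:.2f}, IV: {iv:.2f}")
```

The paper's actual estimation uses probit first stages, fixed effects, and standard errors clustered at the course level; the Wald ratio above is simply the most compact IV estimator for one instrument and one endogenous regressor.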
Table 3 presents baseline probit and OLS estimates of the relationship between online course format and the outcomes of course persistence and course grade. The regression includes the vector of student characteristics Xi but does not include any fixed effects. The results suggest that the online course format had a significant negative relationship with both course persistence and course grade. Converting the probit coefficient (b = -0.257) for course persistence to a marginal effect12 indicates that online course persistence rates were 3.6 percentage points lower than face-to-face persistence rates. Among students who persisted through the course, the average grade in online courses was approximately 0.19 points lower than in face-to-face courses.

12 Calculated by averaging the derivative at each observation.

Table 3 also shows the estimated coefficients for the controls in the model. Overall, women, full-time students, older students, and those eligible for financial aid tended to perform better academically, while ethnic minority and low-income students and those working more hours per week tended to perform worse.

The left panel of Table 4 contrasts the baseline estimates for online learning with the estimates from the fixed-effects models. When fixed effects for college, course subject, and term were included (column 2), the estimated negative relationship became larger for both outcome measures; when college-by-course fixed effects were included (column 3), the gaps between online and face-to-face outcomes were further magnified to 4.4 percentage points for course persistence and 0.26 grade points for course grade.

4.2. Instrumental variable estimates

Table 5
Results of probit model for first-stage IV (probability of taking a course online).

                           (1)             (2)             (3)
Distance to college (SE)   0.025***        0.025***        0.022***
                           (0.001)         (0.001)         (0.001)
Marginal effect (SE)       0.007***        0.007***        0.006***
                           (4.28e-04)      (3.86e-04)      (2.74e-04)
Distance squared (SE)      -1.01e-04***    -9.82e-05***    -7.76e-05***
                           (1.01e-05)      (9.76e-06)      (7.52e-06)
Marginal effect (SE)       -2.90e-05***    -2.71e-05***    -2.34e-05***
                           (2.89e-06)      (2.65e-06)      (1.88e-06)
College and subject FE     No              Yes             Yes
Year-term FE               No              Yes             Yes
College-by-course FE       No              No              Yes
F-test on excluded
  instruments              497.32          557.87          471.50
(Prob > F)                 <0.001          <0.001          <0.001

Notes: N = 125,218. Standard errors for all models are adjusted for clustering at the course level. Each cell represents a different regression specification. All models also include the following covariates: gender, ethnicity dummy variables, socioeconomic status dummy variables, receipt of federal financial aid, limited English proficiency, dual enrolled prior to college, ever enrolled in remedial courses, total credits taken in that term, total working hours in that term, and full-time (vs. part-time) college enrollment in that term.
*** Significant at the 1% level.

To control for selection into online coursework based on unobservable student characteristics, our IV strategy used the distance between a student's home and college of attendance as an instrument for the likelihood of enrolling in an online rather than face-to-face section of a particular course, controlling for all other available covariates. Table 5 shows the first-stage results using Eq. (2) and indicates that the travel distance between a student's home and college is a significant and positive predictor of online enrollment across all models. The quadratic term is significantly negative though small in magnitude, indicating that while students who live farther from their own
college campus are more likely to take advantage of online courses compared with students who live closer to their college, this positive relationship between distance and online enrollment becomes less strong as distance increases.13 We conducted F-tests on the excluded instrument to test its strength,14 and our results indicated that travel distance does indeed help explain which students choose online course sections after controlling for all other covariates, no matter which model specification is employed.

However, for the IV estimates to be consistent, it must also be the case that travel distance is uncorrelated with the error term. As a validity check, we excluded all online courses from the sample and examined the relationship between course outcomes and distance for the subsample of face-to-face courses.15 If students living closer to campus were systematically more motivated, received better instruction in high school, or had better access to school resources, then distance would be directly related to course outcomes for this subsample. The results of this exploration, which are robust to all model specifications, suggest no relationship between course outcomes and distance for face-to-face courses. This evidence of independence strengthens our interpretation that the IV estimates reflect the impact of course delivery format on course outcomes.

The right panel in Table 4 shows the IV estimates for online learning in terms of each course outcome measure. The results echo the OLS estimates: The online course format had a negative estimate for both course persistence and course grade, and the impacts became stronger when we added fixed effects. In addition, the IV estimates are noticeably and consistently stronger than the corresponding OLS estimates using each model specification. For course persistence, the marginal effect derived from the IV estimate controlling for all fixed effects (column 6) is -0.07, compared with -0.04 based on the OLS model. For course grade, the column 6 estimate is -0.32, compared with -0.27 based on the OLS model. The magnification of the estimates after controlling for both observed and unobserved characteristics supports the notion that online courses are more popular among more motivated and academically better prepared students. As a result, straightforward OLS estimates may be subject to a downward bias when precise measures of academic ability and motivation are unavailable.

13 For example, the probability of taking a particular course online increased 0.6 percentage points when travel distance increased from 0 to 1 mile, but increased by a slightly smaller 0.5 percentage points when travel distance increased from 20 to 21 miles (based on the marginal effect from the college-by-course fixed effects model, column 3).
14 Stock, Wright, and Yogo (2002) described a rule of thumb for estimating the strength of the instrument in models using one instrumental variable for one endogenous covariate, as in the current case: the instrumental variable is regarded as a weak predictor of the endogenous covariate if the F-statistic against the null hypothesis—that the excluded instrument is not a significant predictor in the first-stage equation—is less than 10.
15 Removing online courses from the sample did not substantially curtail our student sample size or variability among the student sample in terms of distance from campus; more than 97% of students took at least one face-to-face course during their time at college.

4.3. Robustness checks

Given that the colleges in our sample varied widely in terms of their enrollment sizes and in the proportion of course enrollments that were online, we conducted two robustness checks to ensure that our results did not reflect the effectiveness of online courses in particular schools. We reran analyses based on a sample excluding the three colleges with the largest student enrollments, as well as on a sample excluding the three colleges with the largest online enrollments. Despite small variations, the results were similar to those presented in Table 4.

Another potential concern is that our results may be driven by a small set of individuals who took an entirely online curriculum or a high proportion of courses online. Yet among the 18,567 students in the sample, only 3% (N = 574) took all of their courses online; most students who attempted online courses enrolled in them intermittently, or as one course among several face-to-face courses. In addition, the majority of "fully online" students took no more than three online courses before they dropped out from the college. The courses taken by these students (N = 1778) make up only 1% of the full course sample, and thus should not exert a large impact on the estimates. As a robustness check, however, we excluded all fully online students from the sample, and the results were nearly the same as those presented in Table 4.

In a similar vein, we considered the possibility that our results were driven by a few large courses that offered a high number of online sections. To address this concern, we restricted the data to courses in which at least 30% of enrollments were in face-to-face sections (N = 120,066) and reran the analysis on this subsample. Despite minor variations in the coefficients, the results were qualitatively similar to those presented in Table 4.

A final concern with our analysis is that we relied primarily on a cohort that entered college nearly a decade ago, in 2004. The advantage of examining this cohort is that it supplies several years of data for each student, making the college-by-course fixed effects strategy more plausible. The disadvantage is that online course technologies may have evolved since these students entered college, resulting in improved outcomes vis-à-vis face-to-face courses. To investigate this possibility, we examined changes over time in course outcomes. Descriptive data shown in Fig. 1 suggest that although course outcomes varied over time, the gap in performance between online and face-to-face outcomes remained fairly consistent. To conduct a more explicit test of whether the gap remained consistent, we added interaction terms between year dummies and online format into the model shown in column 6 of Table 4. We used an F-test to examine the joint statistical significance of these interaction terms; the null hypothesis—that they were jointly insignificant—failed to be rejected for either course persistence (F = 1.20, p = 0.28) or course grade (F = 0.21, p = 0.93). That is, the adjusted association between course format and student performance did not change significantly over the four-year span of the study, suggesting that evolving technologies either were not adopted or did not have a strong impact on online success.

[Fig. 1 here: two line charts by academic year, online versus face-to-face courses; left panel, course persistence (approximately 0.88–0.96); right panel, course grade (approximately 2.4–2.8).]
Fig. 1. Online and face-to-face course outcomes, 2004–2005 to 2008–2009.

5. Discussion and conclusion

Using a unique dataset with information on a large and representative set of online courses and similar face-to-face courses, we explored the impact of online delivery on student course performance in the community college setting. Estimates across all model specifications suggest that the online format had a significant negative impact on both course persistence and course grade. This relationship remained significant even when we used an IV approach and college-by-course fixed effects to address within- and between-course selection. In practical terms, these results indicate that for the typical student, taking a particular course in an online rather than face-to-face format would decrease his or her likelihood of course persistence by 7 percentage points (e.g., from 95% to 88%), and if the student persisted to the end of the course, would lower his or her final grade by more than 0.3 points (e.g., from 2.85 to 2.52).

Some proponents of online learning argue that high withdrawal rates in online courses are due to self-selection bias (Howell, Laws, & Lindsay, 2004; Hyllegard, Heping, & Hunter, 2008). In our study, we explored the direction of the purported selection bias by comparing IV estimates with the straightforward OLS estimates; the fact that the IV estimates were consistently stronger than the corresponding OLS estimates across all model specifications suggests that students who take online courses in community colleges tend to be better prepared and more motivated. As a result, descriptive comparisons are likely to underestimate rather than overestimate the gap between online and face-to-face performance outcomes.

Two factors may influence the generalizability of these results to other postsecondary settings: the population of students served, and colleges' philosophies of course design and support. First, recent research (Figlio, Rush, & Yin, 2010; Kaupp, 2012; Xu & Jaggars, 2013) suggests that gaps between online and face-to-face outcomes may be stronger among less-advantaged populations—particularly among ethnic minorities and students with below-average prior GPAs. If so, then the gaps we observed in Washington State community colleges may be even wider in colleges that serve high proportions of disadvantaged students, but diminished in colleges that serve more academically prepared and socially advantaged students.

Second, some colleges may be more thoughtful than others in terms of how they design and support online courses. Well-regarded online courses are often designed through a team-based approach, with faculty collaborating with an instructional designer and often with additional support staff (Alvarez, Blair, Monske, & Wolf, 2005; Hawkes & Coldeway, 2002; Hixon, 2008; Knowles & Kalata, 2007; Puzziferro & Shelton, 2008; Thille, 2008; Xu & Morris, 2007). High-quality online courses may need to be designed to promote strong interpersonal connections, which a large body of empirical research suggests is important to students' motivation, engagement, and academic performance in the course (Bernard et al., 2009). Effective online teaching may also require explicitly developing students' time management and independent learning skills, which are thought to be critical to success in distance and online education (e.g., Bambara, Harbour, Davies, & Athey, 2009; Bork & Rucks-Ahidiana, in press; Ehrman, 1990; Eisenberg & Dowsett, 1990).

The extent to which the typical college supports its faculty in designing and teaching high-quality courses is unknown. Most community college systems, such as that in Washington State, have already expended substantial resources to support online learning. However, most of these supports are provided on a passive basis rather than proactively integrated into the everyday activities of students and faculty,16 as recent research suggests is necessary in order for such supports to have sustained effectiveness (Karp, 2011). In particular, studies in the community college setting suggest that most faculty are left to design online courses on their own and keenly feel a lack of training and ongoing support (Cox, 2006; Millward, 2008; Pagliari, Batts, & McFadden, 2009).

16 For example, during the timeframe under study, the Washington State system's online readiness assessment provided students with feedback as to whether an online course would be a good option for them; however, the assessment was voluntary, and many students did not take advantage of it.

Overall, it seems likely that the applicability of our results for a given college will vary depending on the college's student population and its philosophies of course design and support. Accordingly, both two-year and four-year colleges may wish to examine the success of their own students in online and face-to-face courses, in order to identify potential gaps in performance and discuss strategies to help eliminate any such gaps.

Despite the negative results of our study, we acknowledge that online learning is an important strategy to improve course access and flexibility in higher education, with benefits from both the student perspective and the institutional perspective. From the student perspective, the convenience of online learning is particularly valuable to adults with multiple responsibilities and highly scheduled lives; thus, online learning can be a boon to workforce development, helping adults to return to school and complete additional education that otherwise could not fit into their daily routines. From an institutional perspective, online modalities allow colleges to offer additional courses or course sections to their students, increasing student access to (and presumably progression through) required courses. Given the value of these benefits, online courses are likely to become an increasingly important feature of postsecondary education. The results of this study, however, suggest that colleges need to take steps to ensure that students perform as well in online courses as they do in face-to-face courses, before continuing to expand their online course offerings.

Creating more in-depth, systematic, and proactive supports for online faculty and students may not be an inexpensive endeavor. To help clarify the costs associated with such supports, researchers should work to identify high-quality online courses and programs—those that yield strong student outcomes, particularly among disadvantaged populations—and quantify the costs associated with them. Until such research is conducted, it will remain unclear whether online courses currently do, or eventually will, represent a cost-effective alternative to face-to-face courses.

Acknowledgements

The authors would like to thank all the staff at the Community College Research Center for their support during this research project. We are also indebted to Thomas Bailey, Judith Scott-Clayton, Jonah Rockoff, Aaron Pallas, Clive Belfield, Jennifer Hill, and the Association for Institutional Research's grant review panel for their valuable comments and suggestions. The 34 community and technical colleges in Washington State provided invaluable data on which this paper is based. In addition, we are grateful to the Association for Institutional Research, the Bill & Melinda Gates Foundation, and Lumina Foundation for providing funding that was crucial to this research. All opinions and mistakes are our own.

References

Adelman, C. (2005). Moving into town—And moving on: The community college in the lives of traditional-age students. Washington, DC: US Department of Education.
Alfonso, M. (2006). The impact of community college attendance on baccalaureate attainment. Research in Higher Education, 47(8), 873–903.
Alfonso, M., Bailey, T., & Scott, M. (2005). Educational outcomes of occupational sub-baccalaureate students: Evidence from the 1990s. Economics of Education Review, 24(2), 197–212.
Alvarez, D. M., Blair, K., Monske, E., & Wolf, A. (2005). Team models in online course development: A unit-specific approach. Journal of Educational Technology & Society, 8(3), 176–186.
Bacow, L. S., Bowen, W. G., Guthrie, K. M., Lack, K. A., & Long, M. P. (2012). Barriers to adoption of online learning systems in U.S. higher education. New York, NY: ITHAKA S+R.
Bailey, T., & Morest, V. S. (2006). Introduction. In T. Bailey & V. S. Morest (Eds.), Defending the community college equity agenda (pp. 1–27). Baltimore, MD: Johns Hopkins University Press.
Bambara, C. S., Harbour, C. P., Davies, T. G., & Athey, S. (2009). Delicate engagement: The lived experience of community college students enrolled in high-risk courses. Community College Review, 36(3), 219–238.
Beatty-Guenter, P. (2002, May). Why distance education? Mimetic forces in institutions of adult education. Paper presented at the 21st annual national conference of the Canadian Association for the Study of Adult Education.
Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., et al. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243–1290.
Bork, R. H., & Rucks-Ahidiana, Z. (in press). Role ambiguity in online courses: An analysis of student and instructor expectations. New York, NY: Columbia University, Teachers College, Community College Research Center.
Caldwell, E. R. (2006). A comparative study of three instructional modalities in a computer programming course: Traditional instruction, Web-based instruction, and online instruction (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses database (UMI No. AAT 3227694).
Card, D. (1995). Using geographic variation in college proximity to estimate the return to schooling. In L. N. Christofides, E. K. Grant, & R. Swidinsky (Eds.), Aspects of labor market behavior: Essays in honour of John Vanderkamp. Toronto, Canada: University of Toronto Press.
Carr, S. (2000, February). As distance education comes of age, the challenge is keeping the students. The Chronicle of Higher Education.
Cavus, N., & Ibrahim, D. (2007). Assessing the success rate of students using a learning management system together with a collaborative tool in Web-based teaching of programming languages. Journal of Educational Computing Research, 36(3), 301–321.
Chambers, T. E. (2002). Internet course student achievement: In Ohio's two-year community and technical colleges, are online courses less effective than traditional courses? (Doctoral dissertation). Bowling Green, OH: Bowling Green State University.
Chen, A. (2012, September). Florida ponders opening an online-only public university. The Chronicle of Higher Education. Retrieved from http://chronicle.com/.
Coates, D., Humphreys, B. R., Kane, J., & Vachris, M. A. (2004). "No significant distance" between face-to-face and online instruction: Evidence from principles of economics. Economics of Education Review, 23, 533–546.
Cox, R. D. (2006). Virtual access. In T. Bailey & V. S. Morest (Eds.), Defending the community college equity agenda (pp. 110–131). Baltimore, MD: Johns Hopkins University Press.
Edgecombe, N., Barragan, M., & Rucks-Ahidiana, Z. (2013). Enhancing the online experience through interactive technologies: An empirical analysis of technology usage in community college (Unpublished manuscript).
Ehrman, M. (1990). Psychological factors and distance education. American Journal of Distance Education, 4(1), 10–24.
Eisenberg, E., & Dowsett, T. (1990). Student drop-out from a distance education project course: A new method of analysis. Distance Education, 11(2), 231–253.
Fain, P., & Rivard, R. (2013, March). Outsourcing public higher ed. Inside Higher Ed. Retrieved from http://www.insidehighered.com/news/2013/03/13/california-bill-encourage-mooc-credit-public-colleges.
Figlio, D. N., Rush, M., & Yin, L. (2010). Is it live or is it Internet? Experimental estimates of the effects of online instruction on student learning (NBER working paper no. 16089). Cambridge, MA: National Bureau of Economic Research.
Hawkes, M., & Cambre, M. (2000). The cost factor: When is interactive distance education justifiable? Technology Horizons in Education Journal, 28(1), 26–32.
Hawkes, M., & Coldeway, D. O. (2002). An analysis of team vs. faculty-based online course development: Implications for instructional design. Quarterly Review of Distance Education, 3(4), 431–441.
Hixon, E. (2008). Team-based online course development: A case study of collaboration models. Online Journal of Distance Learning Administration, 11(4). Retrieved from http://www.westga.edu/distance/ojdla/winter114/hixon114.html.
Howell, S. L., Laws, R. D., & Lindsay, N. K. (2004). Reevaluating course completion in distance education. Quarterly Review of Distance Education, 5(4), 243–252.
Hyllegard, D., Heping, D., & Hunter, C. (2008). Why do students leave online courses? Attrition in community college distance learning courses. International Journal of Instructional Media, 35(4), 429–434.
Jaggars, S. S., & Bailey, T. (2010). Effectiveness of fully online courses for college students: Response to a Department of Education meta-analysis. New York, NY: Columbia University, Teachers College, Community College Research Center.
Jewett, F. I. (2000). A model for comparing the costs of using distance instruction and classroom instruction. American Journal of Distance Education, 14(2), 37–47.
Jung, I. (2003). Cost-effectiveness of online education. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 717–726). Mahwah, NJ: Lawrence Erlbaum Associates.
Kane, T. J., Orszag, P. R., & Gunter, G. (2003). State fiscal constraints and higher education spending (Discussion paper no. 12). Urban-Brookings Tax Policy Center.
Karp, M. M. (2011). Toward a new understanding of non-academic student support: Four mechanisms encouraging positive student outcomes in the community college (CCRC working paper no. 28, Assessment of Evidence Series). New York, NY: Columbia University, Teachers College, Community College Research Center.
Kaupp, R. (2012). Online penalty: The impact of online instruction on the Latino-White achievement gap. Journal of Applied Research in Community Colleges, 12(2), 1–9.
Knowles, E., & Kalata, K. (2007). A model for enhancing online course

Proceedings of Society for Information Technology & Teacher Education International Conference 1999 (pp. 126–130). Chesapeake, VA: Association for the Advancement of Computing in Education.
Pagliari, L., Batts, D., & McFadden, C. (2009). Desired versus actual training for online instructors in community colleges. Online Journal of Distance Learning Administration, 12(4). Retrieved from http://www.westga.edu/distance/ojdla/winter124/pagliari124.html.
Peterson, C. L., & Bond, N. (2004). Online compared to face-to-face teacher preparation for learning standards-based planning skills. Journal of Research on Technology in Education, 36(4), 345–360.
Planty, M., Hussar, W., Snyder, T., Kena, G., KewalRamani, A., Kemp, J., et al. (2009). The condition of education 2009 (Report no. NCES 2009-081). Washington, DC: US Department of Education, Institute of Education Sciences, National Center for Education Statistics.
Puzziferro, M., & Shelton, K. (2008). A model for developing high-quality online courses: Integrating a systems approach with learning theory. Journal of Asynchronous Learning Networks, 12(3/4), 119–136.
Rogers, B. (2001). Analysis of funding issues related to distance learning in the North Carolina Community College System. Retrieved from http://www.nccommunitycolleges.edu/Reports/docs/LegislativeReports/funding.pdf.
Rouse, C. E. (1995). Democratization or diversion? The effect of community colleges on educational attainment. Journal of Business and Economic Statistics, 13(2), 217–224.
Rumble, G. (2003). Modeling the costs and economics of distance education. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education (pp. 703–716). Mahwah, NJ: Lawrence Erlbaum Associates.
Schiffman, S. (2005). Business issues in online education. In J. Bourne & J. C. Moore (Eds.), Elements of quality education: Engaging communities (pp. 151–172). Needham, MA: Sloan Consortium.
Schoenfeld-Tacher, R., McConnell, S., & Graham, M. (2001). Do no harm—A comparison of the effects of on-line vs. traditional delivery media on a science course. Journal of Science Education and Technology, 10(3), 257–265.
Stock, J. H., Wright, J. H., & Yogo, M. (2002). A survey of weak instruments and weak identification in generalized method of moments. Journal of Business and Economic Statistics, 20(4), 518–529.
Texas Higher Education Coordinating Board. (2011). Summary of higher education legislation: 82nd Texas Legislature. Austin, TX: Author.
Thille, C. (2008). Building open learning as a community-based research
development. Innovate, 4(2). Retrieved from http://www.innovateonli- activity. In T. Iiyoshi & M. S. V. Kumar (Eds.), Opening up education:
ne.info/pdf/vol4_issue2/A_Model_for_Enhancing_Online_Course_Deve- The collective advancement of education through open technology,
lopment.pdf. open content, and open knowledge (pp. 165–180). Cambridge, MA: MIT
LaRose, R., Gregg, J., & Eastin, M. (1998). Audiographic telecourses for the Press.
Web: An experiment. Journal of Computer-mediated Communication, 4(2). Virginia Community College System, Distance Learning Taskforce. (2001).
Retrieved from http://jcmc.indiana.edu/vol4/issue2/larose.html. Virginia Community College System organizational strategy for distance
Levine, A., & Sun, J. (2002). Barriers to distance education. Distributed educa- learning: Final report. Richmond, VA: Virginia Community College Sys-
tion: Challenges, choices, and a new environment, sixth in a series. tem.
Washington, DC: American Council on Education. Whalen, T., & Wright, D. (1999). Methodology for cost-benefit analysis of
Long, B. T., & Kurlaender, M. (2009). Do community colleges provide a viable Web-based tele-learning: Case study of the Bell Online Institute. Ameri-
pathway to a baccalaureate degree? Educational Evaluation and Policy can Journal of Distance Education, 13(1), 23–44.
Analysis, 31, 30–53. Wooldridge, J. M. (2002). Econometric analysis of cross section and panel data.
Mentzer, G. A., Cryan, J., & Teclehaimanot, B. (2007). Two peas in a pod? A Cambridge, MA: MIT Press.
comparison of face-to-face and Web-based classrooms. Journal of Tech- Wooldridge, J. M. (2003). Cluster-sample methods in applied econometrics.
nology and Teacher Education, 15(2), 233–246. American Economic Review, 93(2), 133–138.
Millward, J. (2008). An analysis of the national ‘‘TYCA Research Initiative Xu, D., & Jaggars, S. S. (2011). The effectiveness of distance education across
Survey Section III: Technology and Pedagogy’’ in two-year college Virginia’s community colleges: Evidence from introductory college-level
English programs. Teaching English in the Two-Year College, 35(4), math and English courses. Educational Evaluation and Policy Analysis,
372–398. 33(3), 360–377.
Moore, K., Bartkovich, J., Fetzner, M., & Ison, S. (2003). Success in cyberspace: Xu, D., & Jaggars, S. S. (2013). Adaptability to online learning: Differences across
Student retention in online courses. Journal of Applied Research in the types of students and academic subject areas. (CCRC working paper no. 54)
Community College, 10(2), 107–118. New York, NY: Columbia University, Teachers College, Community
Odell, M., Abbitt, J., Amos, D., & Davis, J. (1999). Developing online courses: A College Research Center.
comparison of Web-based instruction with traditional instruction. In J. Xu, H., & Morris, L. V. (2007). Collaborative course development for online
D. Price, J. Willis, D. A. Willis, M. Jost, & S. Boger-Mehall (Eds.), In courses. Innovative Higher Education, 32(1), 35–47.