Abstract
Even prior to the COVID-19 pandemic, online learning had become a fundamental part of post-secondary education. At the same time, empirical evidence from the last decade documents higher dropout rates in online than in face-to-face courses for some students. Thus, while online learning may provide students access to post-secondary education, concerns about academic momentum and degree attainment dominate the higher education online learning landscape. Because course completion is often used as a measure of effectiveness, there is a strong need for institutions to be able to predict the potential persistence of online students to direct efforts towards ameliorating dropout. Yet currently, a widely tested and validated archetypical predictive model of retention and success does not exist for undergraduate online learning. This integrative review of the literature examines evidence gathered over the last decade, organizing and summarizing major findings, to help identify potential undergraduate student characteristics for inclusion in such a model. The body of literature collected in this review suggests ten factors for consideration.
Keywords: Online learning, Online enrollment, Online retention, Online attrition
Introduction
Even before the global pandemic necessitated a radical shift in course delivery in the spring of 2020, online learning had become fundamental to postsecondary education. In the U.S., online learning has increased at a substantial rate over the last decade, and this is expected to continue: the 7% annual growth rate for online enrollments stands in marked contrast to the decline found in general post-secondary enrollments (Allen & Seaman, 2016). Pre-COVID-19, between 28 and 36% of all post-secondary students in the U.S. enrolled in at least one online course annually (Allen & Seaman, 2016; Lederman, 2019; NCES, 2013), and online learning was a critical part of two-thirds of post-secondary institutions’ long-term strategies (Allen & Seaman, 2010). Initially led by community colleges (Pearson, 2011; Shea & Bidjerano, 2014), online learning is now a reality for virtually all undergraduate institutions. Yet, because of the potentially high rates and costs of online attrition, there remains a need for institutions to be able to predict the potential persistence of online undergraduate students to foster inclusion and direct efforts towards ameliorating dropout (Hachey et al., 2013; Lee & Choi, 2011). Hence, enrollment and retention research–and the identification of influential factors to include in pre-enrollment predictive models–is critical.
Background: how do outcomes differ for online versus face-to-face instruction on average?
Course-level learning outcomes
There is support that students can learn as much online as they do face-to-face via comparison of course-level factors. An oft-cited meta-analysis of 232 online learning studies (Bernard et al., 2004) concluded no significant difference between student outcomes online versus face-to-face. After controlling for G.P.A., Driscoll et al. (2012) found that grade differences between online students and their face-to-face counterparts were statistically insignificant. Numerous other studies and meta-analyses support that there are no positive or negative effects on outcomes online versus face-to-face as measured by exam scores and aggregate course grades—the “no significant difference” claim (Ashby et al., 2011; Bowen et al., 2014; Brown, 2012; Cavanaugh & Jacquemin, 2015; Dell et al., 2010; Enriquez, 2010; Fish & Kang, 2014; Holmes & Reid, 2017; Jones & Long, 2013; Lack, 2013; Lyke & Frank, 2012; McCutcheon et al., 2015; McDonough et al., 2014; Meder, 2013; Mollenkopf et al., 2017; Nemetz et al., 2017; Rich & Dereshiwsky, 2011; Schmidt, 2012; Stack, 2015; Stocker, 2018; Tallent-Runnels et al., 2006; Wagner et al., 2011; Wu, 2015; Wuellner, 2013; Ya Ni, 2013; Zhao et al., 2005). Thus, collectively, there is a general consensus in the literature of little to no difference in student performance between instructional mediums for courses where both mediums are available.
Nonetheless, as noted in an early review by Zhao et al. (2005) and supported by more recent findings, although aggregated data show no significant difference in outcomes between online and face-to-face learning, some differences continue to be reported within studies, often connected to student characteristics. Several studies report higher exam scores or final grades in online courses compared to face-to-face courses (Holbert, 2017; Jorczak & Dupuis, 2014; Soffer & Nachmias, 2018). A meta-analysis of 176 studies from the Department of Education (2010) suggests that online learning outcomes are superior to those found in face-to-face courses. However, Jaggars and Bailey (2010) question these findings, as the only studies in the meta-analysis that were fully online and semester long (seven in total) did not find a difference between online vs. face-to-face course outcomes. Atchley et al. (2013) found differences in student outcomes between online vs. face-to-face courses, with online students earning more A, D and F grades, and Buchanan and Palmer (2017) report that median learning outcomes were similar but online assessments had larger standard deviations and higher F grades; yet, for both of these studies, differences did not necessarily mean better or worse overall outcomes between modalities.
Fendler et al. (2018) contend that while no significant difference may exist between instructional modes at the aggregate level, the choice between online and face-to-face course modes may matter at the individual level. A meta-review by Nguyen (2015) contends that characteristics such as gender, race/ethnicity, and ability moderate the learning outcomes for online learning. In line with this, Figlio et al. (2013) report that students with specific characteristics (Latinx, men, low achievers) do poorly online in comparison to face-to-face. Xu and Jaggars (2011b) report lower grades for online English and Math “gatekeeper” courses and further (2013, 2014) identify lower grades in online versus face-to-face course sections for specific populations (i.e., men, Black students, younger students and students with lower G.P.A.s). Tanyel and Griffin (2014), in a 10-year longitudinal study, found final grades for students in face-to-face sections were significantly higher than in online sections of the same course. Bettinger et al. (2017) report that online learning produced lower grades on average and was more likely to negatively impact future enrollment than in-person courses. Similar findings of lower grades and negative impact on later enrollment are reported for both baccalaureate and community college students across a variety of disciplines (Alpert et al., 2016; Burns et al., 2013; Faidley, 2018; Francis et al., 2019; Garman, 2012; Hart et al., 2018; Keramidas, 2012). It is important to note that none of these studies compared outcomes within students; rather, they compared outcomes in online and face-to-face courses by different students. While the inferences we can make from these particular studies about which characteristics put students at highest risk online are limited, they do suggest that a consideration of student characteristics may be important when investigating outcomes.
Attrition rates
Course completion (i.e., successful finish of a course and final grade earned for which a student receives credit from the institution) is often used to measure the effectiveness of online courses (Boston & Ice, 2011). Attrition rates among online learners fluctuate between 40 and 80% (Smith, 2010) and research shows that online attrition may be 7–20 percentage points higher than the attrition found in face-to-face courses (Anderson & Kim, 2006; Boston & Ice, 2011; Garman, 2012; Hachey et al., 2013; Jaggars et al., 2013; Johnson & Mejia, 2014; Kinder, 2013; Morris & Finnegan, 2009; Muse, 2003; Nora & Snyder, 2009; Patterson & McFadden, 2009; Smith, 2016; Smith & Ferguson, 2005; Tanyel & Griffin, 2014). Online dropout rates have been linked by some researchers to academic non-success (Angelino et al., 2007; Atchley et al., 2013; Carr, 2000; Diaz, 2002; Gregory, 2016; Jaggars & Bailey, 2010; Rovai & Downey, 2010; Tyler-Smith, 2006; Wojciechowski & Palmer, 2005). In contrast to most of the literature, Moore and Fetzner (2009) present a review of six higher education institutions with 80% or greater online retention rates, Ashby et al. (2011) report better completion online, and others have found no differences in the completion rates between online and face-to-face students (Soffer & Nachmias, 2018; Wilson & Allen, 2018). Despite the few noted exceptions, overwhelmingly, aggregate findings in the literature have demonstrated that students enrolled in online courses are more likely to drop the course than those enrolled in a face-to-face course. Yet, one major limitation of this research is that studies have largely not addressed differences in the characteristics of students who enroll in online courses (versus face-to-face courses).
College access and persistence
At least 74% of all U.S. undergraduate students have been found to have one (or more) non-traditional characteristic (U.S. Department of Education, 2015). [Non-traditional/underserved often includes students who are minorities, low-income, rural/inner-city, first-generation, or academically unprepared; the most common definition of non-traditional characteristics comprises delayed enrollment (age > 24), no high school diploma, part-time enrollment, financial independence, having dependents, single-parent status, and working full-time while enrolled (NCES, 1996, 2002).] Online courses are often believed to increase college access, particularly for non-traditional and/or underserved students (Epper & Garn, 2003; Jaggars & Bailey, 2010; Johnson & Mejia, 2014; Parsad et al., 2008; Picciano et al., 2010). Many students who take online courses do so because they need the flexibility that these courses offer due to a wide range of life challenges that make it difficult to attend face-to-face courses (Brown, 2012; Burnette, 2015; Daymont & Blau, 2011; Fox, 2017; Jaggars, 2013, 2014; Jaggars & Bailey, 2010; Lei & Gupta, 2010; Pontes et al., 2010; Skopek & Schuhmann, 2008). Yet, there is little evidence to support the claim that online courses increase access (Jaggars, 2011), although improved access is often assumed (Allen & Seaman, 2010).
Moreover, it is still unclear how online offerings impact college persistence, as results have been very mixed (Jaggars, 2011; Shea & Bidjerano, 2014; Wladis et al., 2016). In some large-scale multi-institution studies (Jaggars & Xu, 2010; Xu & Jaggars, 2011a, 2011b), results suggest that online coursework may hinder subsequent semester re-enrollment and that students who took a higher proportion of online courses were less likely to graduate or transfer to a senior college. Xu and Jaggars (2013) found that community college students who were men, younger, Black, or had lower G.P.A.s struggled in online courses. Similar findings for community college students are reported by Huntington-Klein et al. (2017). Shea and Bidjerano (2019) report that minority community college students with higher online loads were more likely to drop out of college than non-minority students. Additionally, Smith (2016), using multi-institution data and controlling for student self-selection bias, found results for 4-year colleges that confirm Jaggars and Xu’s earlier findings of an overall negative impact of online enrollments on course and college retention.
However, these initial studies have not been confirmed by other recent research on academic momentum and degree completion that attempts to control for the student characteristics which often differ between online and face-to-face learners, or that includes nuanced analyses of online course taking and graduation attainment. Johnson and Mejia (2014), assessing all community colleges in the California system and controlling for demographics and G.P.A., found that in the short term, student outcomes were worse in online courses versus face-to-face courses. Yet, they also found that students who took at least some online courses were more likely than those who took only face-to-face courses to earn an associate degree or to transfer to a 4-year institution. Shea and Bidjerano (2014, 2016), using national data (rather than the state-wide datasets that other large-scale studies utilize) and controlling for student characteristics, also found that students (even those less academically prepared) who had taken some of their early courses online attained degrees at significantly higher rates than those who did not. More recently, Shea and Bidjerano (2018) report that community college students who enrolled in online course loads up to 40% had better chances of completing their college degree (although it is important to note that beyond 40% online enrollment, the enhanced rate of degree completion was shown to begin to decline rapidly). They also found, on average, that online course completion significantly increased the odds of completing a community college degree and transferring to a 4-year institution; more specifically, online course taking was shown to improve the degree completion/transfer of low-G.P.A. part-time students relative to their full-time counterparts (Shea & Bidjerano, 2017, 2019). Also using national data, Sublett (2019a) reports that community college students who engaged in online learning obtained their BA degrees faster than their peers who did not.
James et al. (2016), in a large-scale, multi-institutional study including community colleges and 4-year institutions, report that students taking a mix of online and face-to-face courses had higher retention and passing grades than students attending fully online or fully face-to-face. However, these differences were explained by student characteristics rather than delivery mode, showing little evidence that the act of taking online courses itself placed students at greater risk of dropout. Similarly, Wladis et al. (2016), using data from a large, urban, east coast, multi-institution university system, report that it cannot be assumed that students who withdraw after taking online courses do so because of the outcomes of the online course; rather, other student characteristics (which drive the students to take online courses in the first place) may be significant in determining college persistence. Recent findings, which all attempt to control for student characteristics, suggest that restricting access to online courses could reduce academic momentum and student progress toward a degree. However, given the conflicting results, more research is needed (particularly accounting for student characteristics) before firm conclusions can be drawn about how enrollment in online courses impacts college persistence and degree completion.
Study purpose
This review initially arose from the question “What factors are important to include in any archetypical pre-enrollment predictive model of undergraduate online learning enrollment, retention and success?” To date, a widely empirically tested, archetypical predictive model of retention and success does not exist for undergraduate online learning. Traditional models of persistence and retention (i.e., Bean & Metzner, 1985; Nora & Cabrera, 1996; Tinto, 1975, 1986, 1993, 2016) may not be applicable to online learning because these models emphasize the role of social integration, a process very different for the online student (Boston & Ice, 2011; Hachey et al., 2013; Rovai, 2002, 2003).
Scholars have posited online retention models, although none have been completely validated. Kember (1995) proposed a two-prong model of persistence for students in online courses based on how well students socially integrate school with their lives and the number of external demands they face. Kember’s own research found that there are likely other factors not included in the model (it did not account for 20% of the variance in student persistence). Further, Kember’s model has been empirically challenged, with Woodley et al. (2001) finding little internal consistency in the sub-scales for key constructs in the model. Rovai (2003) proposed a model of online student retention that synthesized traditional departure models with research on the skills and needs of online students; it included student characteristics, skills, and external factors as the major inputs that lead to internal factors affecting student persistence. Studies have validated and extended some of the external and internal factors identified in Rovai’s model (Lehman & Conceição, 2013; Packham et al., 2004), although other specific sub-factors that were identified have not been supported (Lee et al., 2013). More critically, both Kember’s and Rovai’s models are dated and do not reflect recent knowledge about online learning, particularly related to modern social media phenomena (Lint, 2013).
Park (2007) proposed a model of online dropout based on Rovai’s model and a review of the online learning literature at that time. Efforts to validate Park’s model (Park & Choi, 2009) used a small sample and did not test all components. Money and Dean (2019) presented a model with major categories of antecedents and processes that they suggest may directly or indirectly impact online outcomes. Aligning with factors in Park’s model, they placed individual antecedents in seven broad categories and included broad categories of student internal processes (cognitive, affective, behavioral, and social). This model has yet to be validated and while the broad categories do reflect the literature, the model does not seem to account for the totality of potential predictor factors.
Research on students who succeed in the online environment has identified internal affective qualities such as self-discipline, independence, and motivation (Dabbagh, 2007; Johnson & Mejia, 2014; Stevens & Switzer, 2006) or skills such as time management and computer literacy (Dutton et al., 2002; Johnson & Galy, 2013), which (although not detailed) would likely fall under the internal process category in Money and Dean’s model. However, these internal qualities and skills may be no less applicable to success in the face-to-face environment than in the online environment, or they may be less predictive than other factors (Kauffman, 2015; Park & Choi, 2009). Further, it is not well understood how such qualities and skills can be identified pre-enrollment; the effectiveness of screening tools is often assumed, but surveys developed to test student readiness for online learning are often costly and time-consuming, and may not be as effective in predicting student retention and success in online courses as other measures (Wladis & Samuels, 2016).
A cursory examination of early reviews devoted to online learning points to two broad conclusions. First, there is a plethora of variables that have been investigated as impacting retention and success in online learning; as Tyler-Smith (2006) contends, the reasons for attrition in online courses are numerous and complex. While many of the factors reported as impacting the online student’s decision to drop out likely contribute to online attrition, these variables may not easily be obtained pre-enrollment, nor may they account for most of the variability in retention and success. Second, discussions of early online learning research reveal a “lack of rigor” in research design (e.g., Jaggars & Bailey, 2010; Lack, 2013; Wu, 2015). This probably stems from the lack of controlled experimental designs. For example, in an oft-cited meta-analysis focused on learning outcomes (Means et al., 2010), the authors analyzed 176 articles, but a critique of this work (Jaggars & Bailey, 2010) notes that only seven of the studies included experimental/quasi-experimental designs and reflected conditions in typical postsecondary education. This may provide a partial explanation for why there is little consensus on which factors are significant and lead to persistence for online students and, further, why there is currently no archetypical predictive model.
The ideal approach to analyzing the effects of the online environment on attrition would be one in which students were randomly assigned to either an online or a face-to-face section of a course. Some studies have attempted to do this with mixed results (i.e., Alpert et al., 2016; Arias et al., 2018; Figlio et al., 2013; Stack, 2015), though randomized studies typically compare hybrid courses to face-to-face courses (see Bowen et al., 2014; Vernadakis et al., 2012). This is problematic because most online courses at colleges are fully online [80% or more online] rather than hybrid [30–79% online] (Instructional Technology Council, 2010; Lutzer et al., 2007), and because hybrid students seem to be comparable to face-to-face students in terms of demographics, course persistence and success (Jaggars & Xu, 2010; Xu & Jaggars, 2011a).
The dearth of large-scale randomized studies is likely due to the difficulties of recruiting students to participate in a study in which they could be randomly placed in a fully online or a face-to-face course. This is because a critical aspect of online learning is that students choose to enroll in the online modality (Fendler et al., 2018). Lack (2013) and Paulson and McCormick (2020) note that the differences in pre-existing background characteristics of students taking online versus face-to-face courses are not trivial; the characteristics that drive students to enroll in online courses are likely correlated to course outcomes (Muljana & Luo, 2019; Nguyen, 2015; Smith, 2016; Wuellner, 2013). It has been posited that it is the characteristics of students who select online classes, rather than the instructional methodology itself, which impacts dropout (Jaggars & Bailey, 2010; McDonough et al., 2014; Wladis et al., 2015a).
A review of the literature from 1999 to 2009 conducted by Lee and Choi (2011) identified 69 possible factors from 35 empirical studies that may impact the online student’s decision to drop out. However, Lee and Choi deliberately excluded findings related to student demographics (such as age, gender, or ethnicity), citing that the literature at the time offered inconsistent results. They are not incorrect; much of the research on the impact of student characteristics on online retention and success is conflicting (Jones, 2010). Yet, the fact that findings are mixed does not mean that student characteristics should be completely ignored, nor does it mean that findings and related theorizing cannot be informative. This connects back to the earlier “lack of rigor” contention; because students opt into online courses, and because of our limited knowledge of which factors matter, many of the empirical studies on retention and success lack sufficient controls (Cavanaugh & Jacquemin, 2015; Lack, 2013; Wladis et al., 2016; Wu, 2015). This clarified for us that an integrative review of the literature focused on the question of “What student characteristics lead to online enrollment, retention, success and later college persistence?” is needed, both to foster better controls in future research and to help clarify potential factors for inclusion in an archetypical pre-enrollment predictive model of undergraduate online learning enrollment, retention and success.
Method of the review
Our goal was to be as inclusive as possible, providing a comprehensive view of the undergraduate literature for the last decade (January 2010–March 2021). In line with the methodology employed by Wallace (2003) and the guidelines for an integrative review as outlined by de Souza et al. (2010) and the PRISMA statement (Page et al., 2021), we present a wide-ranging overview that includes findings from previous reviews and research summaries, and any empirical research that addresses student factors in online learning that affect enrollment, retention, success and/or college persistence at institutions that also offer face-to-face courses in the U.S. (this review excludes research conducted at online-only institutions). The reasons for the inclusion of these types of work are two-fold. First, previous reviews and policy reports offer conclusions that are often used by researchers and policy makers as the basis for current interventions and research directions, and therefore deserve to be noted. Second, we include all research studies that may offer insight, regardless of sample size, methodology, or type. We do note that multi-institutional or national datasets present findings that are likely more generalizable; therefore, these results may be more critical to the development of any archetypical predictive model (See the Appendix, Tables 5, 6).
Table 5.
Quantitative and mixed method (MM) multi-site studies (n = 16)
| Study/work | Sample/context | Design/method | Measure | Key findings |
|---|---|---|---|---|
| Gregory (2016) | n = over 4000 students from multiple campuses at a TN CC (2012–2015) | Quasi-experimental (QE) design, retrospective IR data analysis | Attrition; Grades | Online students more likely to withdraw than face-to-face (F2F) students; online students more likely to earn an ‘A’ or ‘F’, with more ‘B’, ‘C’, or ‘D’ grades for F2F students; non-traditional students more likely to earn an ‘A’ in both modalities; no significant differences online by gender for completion, but women outperformed men online in terms of grades; online students had higher ACT scores, which correlated to higher grades; part-time students more likely to withdraw than full-time students in both modalities; students with more credit hours had higher grades in both modalities; Pell Grant-eligible students less likely to earn an ‘A’ grade in an online course than non-Pell Grant-eligible students; no impact of first-generation status in either modality; married online students more likely to earn an ‘A’ grade than single students; no impact of dependent-child status in either modality |
| Harrell and Bower (2011) | n = 225 online students at five FL CCs | QE design, quantitative performance data analysis | Attrition | Predictor variables for persistence in online courses identified as auditory learning style, increased GPA, and basic computer skills |
| Jaggars and Xu (2010) | n = 24,000 from 23 VA CCs examined over a 4-year period | QE design, retrospective IR data analysis | Attrition; Persistence (re-enrollment) | Students, regardless of academic preparation, more likely to fail or withdraw from online courses than from F2F courses; students who took online courses in early semesters slightly less likely to reenroll; students who took a higher proportion of credits online slightly less likely to transfer to a 4-year institution |
| Kaupp (2012) | n = 4.5 million Latino and White CA CC students; n = 10 faculty and n = 10 Latino students at a single CA CC | MM design; retrospective IR data analysis; interviews with faculty and students | Enrollment by student characteristics; Attrition | Latino students 30% less likely than White students to enroll in the online section but overrepresented in F2F classes; Latino students online had higher withdrawal rates and achieved lower grades than peers in F2F classes; faculty attribute the Latino achievement gap to motivation, technology and engagement, while students attribute it to instructor-student relationships |
| Palacios and Wood (2016) | n = 3,936,284 students from 112 CA CCs | QE design, retrospective IR data analysis | Attrition | Asian and White men had higher online retention than Black and Latino men; Black men were least likely to be retained online; asynchronous media courses were shown to be beneficial for Black male students |
| James et al. (2016) | n = 656,258 (n = 213,056 CC; n = 113,036 4-year universities; n = 330,166 online-only institutions) from PAR member institutions [online-only institution results not reported here] | QE design, retrospective IR data analysis | Attrition | CC and 4-year university students taking a mix of online and F2F courses had higher retention and passing grades than students fully online or fully F2F, with the difference explained by student characteristics; little evidence that taking online courses increased attrition once student characteristics were accounted for; students with Pell grants retained at higher rates regardless of modality; more women enrolled online, yet no significant difference by gender for modality; older students had greater online enrollment and retention than younger students in online-only courses (the reverse was found for F2F, with younger students outperforming older students) |
| Johnson and Mejia (2014) | n = 8,959,319 (n = 957,888 online; n = 7,790,510 F2F) students enrolled in CA CCs (2006–2012) | QE design, retrospective IR data analysis | Attrition; Persistence/completion | Short-term: student outcomes (passing and completion) worse in online versus F2F courses; online course retention rates 11–14 percentage points lower than F2F course rates; Long-term: students taking some online courses more likely to earn an associate degree or transfer to a 4-year institution than students who took only F2F courses |
| Kinder (2013) | n = 148,939 students at public WV CCs (2009–2010) | QE design, retrospective IR data analysis | Attrition; Grades | Online students more likely than their F2F peers to fail or withdraw; online students received more grades of A than in F2F courses; older students less likely to withdraw from, and more likely to receive higher grades in, online or hybrid courses |
| Shea and Bidjerano (2017) | n = 41,616 students from 30 CCs within the New York State system [SUNY] (2012–2015) | QE design, retrospective IR data analysis | Enrollment by student characteristics; Grades | Female students, White students, full-time students, older students, and Pell grant recipients much more likely to take both online and face-to-face courses than to be face-to-face only; lower G.P.A.s online vs. face-to-face in 4/7 semesters for the same student; neither gender nor minority status were significant predictors of differences in G.P.A. between face-to-face and online courses |
| Streich (2014) | n = 265,296 students from 58 CCs (2003–2011) | QE design, retrospective IR data analysis; state unemployment-insurance record data analysis | Enrollment by student characteristics; Attrition | Working adults (30+ years), women, and Pell recipients more likely to enroll in online courses; results suggest online offerings induce new enrollments rather than simply shifting enrollments from other courses |
| Smith (2016) | n = 195,237 students from 15 4-year institutions within the UNC system (2012–2014) | QE design, retrospective IR data analysis | Enrollment by student characteristics; Attrition; Persistence (re-enrollment) | Online enrollment negatively impacted course retention and performance; English enrollments had a greater negative impact than math enrollments; online students more likely to be fourth-year students, older, women, and to file as independent for FAFSA; online students also had lower high school G.P.A.s and SAT scores |
| Wladis et al. (2016) | n = 9663 students at CUNY, a large urban university system | QE design, retrospective IR data analysis | Attrition; Persistence (re-enrollment) | Native-born students at greater risk online than foreign-born students; having a child under the age of 6 predicted lower rates of online course completion; while online students were more likely to drop out of college, online course outcomes had no direct effect on college persistence; other characteristics seemed to make students simultaneously both more likely to enroll online and to drop out of college |
| Xu and Jaggars (2011a) | n = 51,017 students enrolled in WA community and technical colleges (2004–2009) | QE design, retrospective IR data analysis | Enrollment by student characteristics; Attrition | Online students more likely to work more hours and have better academic preparation, but also more likely to fail or withdraw compared to F2F students; students who took online courses in earlier terms less likely to reenroll; students who took a higher number of courses online less likely to transfer to 4-year institutions; students were equally likely to complete hybrid and F2F courses |
| Xu and Jaggars (2011b) | n = 24,000 VA CC students (2004–08) | QE design, retrospective IR data analysis | Attrition; Grades | After controlling for some student characteristics, robust negative impact for students taking introductory math and English online (lower persistence and lower grades) |
| Xu and Jaggars (2013) | n = 51,017 students enrolled in WA community and technical colleges (2004–2009) | QE design, retrospective IR data analysis | Attrition; Grades | Students who are men, younger, Black, and/or have lower G.P.A.s struggled to adapt to the online environment; older students adapted more readily to online courses than younger students did; online learning effects varied across subject areas |
| Xu and Jaggars (2014) | n = over 40,000 students in 500,000 courses at WA’s 34 community and technical colleges | QE design, retrospective IR data analysis | Attrition; Grades | All student groups suffered decrements in performance online compared to F2F; those with the strongest declines were men, younger students, Black students, and students with lower G.P.A.s; suggests there may be an achievement gap online versus F2F in some academic areas |
Table 6.
Quantitative nationally representative studies (n = 6)
| Study/work | Sample/context | Design/method | Measure | Key findings |
|---|---|---|---|---|
| Ortagus (2017) | n = 94,200 students | Quasi-experimental (QE) design, National Post-secondary Student Aid Study data | Enrollment by student characteristics | Minority and low-income students less likely to enroll online; found increasing participation of low-income students in online courses; however, low-income students were also less likely to enroll in online-only course loads |
| Pontes et al. (2010) | n = 80,000 students enrolled in 1400 institutions (2003–2004) | QE design, 2004 National Postsecondary Student Aid Study data | Attrition | Non-traditional undergraduates, who have more risk factors for degree non-completion than traditional undergraduates, and students with physical disabilities that limit their mobility prefer/enroll in online courses |
| Shea and Bidjerano (2014) | n = 4600 first-time students at a post-secondary institution from Spring 2004, 2006, and 2009 | QE design, Beginning Postsecondary Students Longitudinal Study data from NCES | Attrition; Persistence/completion | Less prepared students who took online courses early on in their time at college were more likely to attain a degree compared to students who did not; more community college (CC) students who took courses online completed a credential 4 years later compared to students who did not; women are overrepresented in online courses |
| Pao (2016) | n = 123,600 students (2011–12) | QE design, National Postsecondary Student Aid Study data | Enrollment by student characteristics | More non-traditional-age students enrolled online than traditional college-age students; more non-traditional-age women enrolled in online courses compared to non-traditional-age men; more non-traditional women over 30 enrolled in online courses than younger non-traditional students aged 24–29 |
| Wladis, Hachey, et al. (2015) | n = 27,800 undergraduate STEM majors | QE design, 2008 National Postsecondary Student Aid Study data | Enrollment by student characteristics | Hispanic and Black STEM majors less likely, and women STEM majors more likely, to take online courses when controlling for academic preparation, SES, citizenship, and ESL status; non-traditional student characteristics increased the likelihood of enrolling online more strongly than any other characteristic, and the probability increased steeply as the number of non-traditional factors increased; impact of non-traditional factors on online enrollment significantly stronger for STEM than non-STEM majors |
| Wladis et al. (2015c) | n = more than 2000 CC STEM majors | QE design, 2008 National Postsecondary Student Aid Study data | Enrollment by student characteristics | Hispanic students significantly less likely to enroll online, with Black and Hispanic men particularly underrepresented; women significantly more likely to enroll online, as were students with non-traditional student characteristics; at CCs, ethnicity was a stronger predictor than non-traditional characteristics, whereas at 4-year colleges the reverse was true: each additional non-traditional risk factor increased the likelihood of online enrollment by two and five percentage points at 2-year and 4-year colleges, respectively |
To conduct the search (see Fig. 1), relevant articles on undergraduate students enrolled in online learning in the U.S. were first located through a search of publicly available literature published between January 2010 and March 2021. Using key words, searches were conducted in three online educational databases: ERIC, Education Full Text, and PsycINFO; these databases were chosen because they cover adult education and are large, commonly searched educational databases devoted to peer-reviewed literature. The key words were: (Higher Education OR College OR University OR Postsecondary) AND (Online Learning OR Distance Education) AND (Attrition AND/OR Retention AND/OR Persistence AND/OR Dropout AND/OR Learning Outcomes). Our search of the three online educational databases initially gathered 682 articles. These articles were then screened to remove duplicates and articles that fell outside the scope of the search criteria; screening the initial collection against the described criteria reduced the works gathered from the educational databases to 17.
For the initial screening, articles had to meet the following criteria: published in 2010 or later; focused on online learning/online courses, with online learning classified as taking place over the Internet (i.e., excluding articles on self-paced video-only, hybrid, or blended courses); focused on typical, semester-long, for-credit undergraduate courses at institutions that offer both online and face-to-face courses/programs (i.e., excluding articles on graduate populations, MOOCs, and online-only institutions); focused on outcomes as measured by grades, completion, and/or future enrollment (excluding articles reporting only perceived learning, satisfaction, or engagement by faculty/students, only student affective/motivation data, and/or only self-reported data); specifically reported the impact of student demographics/characteristics on enrollment, course success, course retention, or future online persistence; and examined enrollment taking place in the United States, with the work published in English.
Fig. 1.
Flowchart outlining the search protocol adopted in this integrative review
Adopting Wallace’s (2003) comprehensive methodology, our search was supplemented by an ancestral/snowball review of the initial articles found, as well as of works cited in several meta-reviews and summaries of research on online learning (i.e., Hart, 2012; Jaggars et al., 2013; Lee & Choi, 2011; Money & Dean, 2019; Muljana & Luo, 2019; Sublett, 2019b; Wu, 2015). Additional articles were gathered through manual searches in Google Scholar of the first five pages brought up in our searches [with the key words: “enrollment”, “attrition” in online learning]. Gathered articles were screened by our original criteria, and then we also applied an ancestral/snowball review to these articles. Finally, abstracts were manually reviewed based on our initial search criteria for articles published from January 2010 to March 2021 in the following journals (which focus on/often include the topic of online learning): American Journal of Distance Education; Computers & Education; Internet and Higher Education Journal; Online Journal of Distance Learning Administration; Online Learning (Journal of the Online Learning Consortium); Briefs from the Community College Research Center; Journal of Computing in Higher Education; Journal of Online Learning and Teaching; and The Journal of Educators Online. [Note: The Journal of Online Learning and Teaching merged with Online Learning, so an independent search of the former was conducted from 2010–2015]. As the search process progressed, duplications were removed. Note: although pre-dating 2010, we have chosen to include the meta-reviews of Bernard et al. (2004), Tallent-Runnels et al. (2006) and Zhao et al. (2005) due to the applicability of the findings and to provide critical historical context. This inclusion and exclusion process brought the number of articles detailed in the results section of this review to 83 (See study summaries in the Appendix, Tables 2, 3, 4, 5, 6).
Table 2.
Theoretical papers, models of online learning and model validation research (n = 5)
| Study/work | Design/method | Key findings |
|---|---|---|
| Kember (1995) | Model derived from quantitative data (factor analysis of Distance Education Student Progress questionnaire) and qualitative data (interviews) | Development of a conceptual model of student progress in online courses; components include entry characteristics, academic & social integration, external attribution, and academic incompatibility; less academically successful students cite external demands (time, work, unexpected events) |
| Lint (2013) | Mixed method study of MD community college students (n = 169) to investigate relationships identified in Kember’s (1995) model | Suggests social media can be interpreted as the combination of external attribution and social integration of the Kember model; social media may act as a distractor from academic focus (not addressed in Kember), thus there is a need to increase academic input by increasing academic integration to mediate the interference; G.P.A. and previous online course experience shown to be significant predictors of persistence |
| Park (2007) | Model derived from review of 18 studies (1987–2006) related to online dropout, focusing on non-traditional distance learners | Makes modifications to Rovai’s (2003) model, redefining some variables and changing their location within the model; author notes some research relies on variables identified by earlier researchers rather than exploring variables on a more systematic, ongoing basis |
| Rovai (2003) | Builds on Tinto and Bean & Metzner, adding skills required by/needs of online students | Model is divided into student characteristics and skills prior to admission and external and internal factors affecting students after admission |
| Woodley et al. (2001) | Replicates Kember’s (1995) model with Open University (U.K.) students (n = 457) | Uses Kember’s Distance Education Student Progress (DESP) inventory (1995); considers social integration, academic integration, external attribution, and academic incompatibility; concluded Kember's model does not fit data from this study |
Table 3.
Meta-analyses (MA), literature reviews, research summaries/overviews and higher education policy reports (n = 21)
| Study/work | Design/method | Key findings |
|---|---|---|
| Bernard et al. (2004) | MA examining 232 studies on distance education; studies published from 1985 to 2002 | Wide variability in effect sizes on all measures suggests mixed outcomes for online courses; literature weak in terms of design features—more than half of the codable study features were missing; distinction between synchronous and asynchronous forms of online learning moderates effect sizes for achievement and attitudes |
| Ford and Vignare (2015) | Reviews scholarly and grey literature between 2000 and 2014 to begin addressing the research needed on online military students | Military learner demographics and academic risk profiles are most similar to non-traditional and first-generation learners, although military learners face additional challenges; military learners have become increasingly reliant on online learning; significant lack of research examining online military learners |
| Hart (2012) | Review of post-1999 peer reviewed journals, addresses student factors leading to persistence, includes original data; search strategy yielded 11 articles | Factors associated with student persistence online include satisfaction with online learning, a sense of belonging to the learning community, motivation, peer, and family support, time management skills, and increased communication with the instructor; if persistence factors are not present in sufficient quantity, the student may be at risk of withdrawing from an online course |
| Ice et al. (2012) | MA by six institutions to determine factors contributing to retention, progression, and completion of online learners; aggregated student and course data into one dataset; n = 661,705 (n = 550,172 private for-profit institutions; n = 91,128 community colleges (CC); n = 20,405 public 4-year institutions) | Creation of the Predictive Analytics Reporting (PAR) Framework; identified and defined 33 variables that impact retention, progression, and completion; number of degree hours completed highly associated with a student’s likelihood of remaining enrolled or graduating; race next most highly associated variable if the student does not have any degree hours; used statistical analyses including descriptive statistics, CHAID analysis, regression analysis, and group differences and found that CHAID analysis was useful/beneficial for the stakeholders because they could produce data outputs that are “more visually appealing and relatively easy for non-statisticians to understand.” |
| Jaggars (2011) | Working paper; review examining the impact of online learning on low-income and underprepared college students | CC students taking online courses more likely to withdraw; online courses may discourage students from reenrolling and persisting; completers, both online and face-to-face (F2F), have similar end-of-semester learning outcomes; more rigorous studies tended to find negative effects for online learning; studies with few controls that allowed self-selection of modality found no effect for online learning; online course completion lower due to technical difficulties, social distance, lack of structure and student supports, and cost barriers |
| Jaggars and Bailey (2010) | MA of seven fully online semester-long college courses found in the Means et al. (2010) meta review sponsored by the DOE | Found lack of consistent differences in online vs. F2F course outcomes; lack of generalization to larger online classes or classes with non-technology related subject matter; relied on studies from mid-sized or large universities (mostly selective or highly selective schools) |
| Jaggars et al. (2013) | Overview of previous work; examined online course outcomes at two large statewide CC systems | Overview of other work by Jaggars & Xu, whose findings represent differences in online and F2F outcomes based on descriptive data; studies suggest online CC students are less likely to complete and perform well; online courses may exacerbate already persistent achievement gaps between student subgroups |
| Lack (2013) | MA of 30 studies not in Means et al. (2010) that compare F2F to hybrid or online learning; examined learning outcomes or academic performance in for-credit undergraduate courses | Yielded little evidence suggesting online or hybrid learning, on average, is more or less effective than F2F learning. Most studies included have mixed results: on some measures, students in the online or hybrid format did better, but on others did worse, relative to students in the F2F format; or else, on some measures online- or hybrid-format students did significantly better or worse than students in F2F, but on other measures there was no significant difference between groups |
| Lee and Choi (2011) | Reviewed studies (1999–2009) from common educational databases; per inclusion criteria, examined 35 studies focusing on online dropout | Identified 69 factors that influence students’ decisions to drop out; classified into three main categories: (a) Student factors, (b) Course/Program factors, and (c) Environmental factors; most distinctive online dropout factors were previous academic and professional experiences and performance, learning skills, and psychological attributes |
| Lei and Gupta (2010) | Short evaluations of distance education from institutional, faculty, and students’ perspectives | Examined institutional, faculty and student benefits and costs. Student benefits included: flexibility; time management; self-paced; limited peer distractions; less instructor bias; access to course materials; less culture shock; easier for students with disabilities; skill building. Student costs: technology access, fees and difficulties; advanced understanding of computer skills, lack of F2F interactions with peers and instructors; more assignments; apprehension about online courses; self-discipline, self-motivation, delayed feedback, lack of direct assistance |
| McCutcheon et al. (2015) | Review of the impact of online and blended learning vs. F2F learning for undergraduate nursing students | Found wide variation of online and blended approaches to develop clinical skills; online learning is as effective as F2F; insufficient evidence on the implementation of a blended learning approach to teaching nursing clinical skills |
| Means et al. (2010) | Review (1996–2008) of 1000+ empirical studies of online learning; 45 studies had rigorous design and adequate effect size and contrasted online to F2F and measured student learning outcomes | Fifty independent effects identified that could be subjected to meta-analysis; found that, on average, students in online learning conditions performed modestly better than those receiving F2F instruction |
| Money and Dean (2019) | Review of online learning; utilized a content-based literature review methodology to produce new model of online retention | Literature did not coherently/consistently identify factors impacting online learning; developed retention model citing cognition; prior knowledge and experience; personal traits; motivation; familiarity with and preference for online learning; demographic attributes; and learning styles |
| Nguyen (2015) | MA of studies found on nosignificantdifference.org comparing online versus F2F | 92% of all online education studies find online education is at least as effective, if not better, than F2F; author notes the lack of rigorous methodology of earlier studies and selection issues related to voluntary submittal to the site reviewed |
| Picciano et al. (2010) | MA of 6 years of national studies of online learning in American education to examine the macrolevel impact | In higher education, 4-year private liberal arts schools in particular ignore or do not prioritize online education; issues remain regarding the quality of online learning and the level of effort required to teach online courses at all levels of education, suggesting more developmental work needs to be done; also examined K-12 |
| Rovai and Downey (2010) | Review of literature and U.S. Securities and Exchange Commission (SEC) filings | Discusses major factors in online learning (e.g. planning; marketing & recruitment; financial management; quality assurance; student retention and faculty development); states online learning attracts a higher percentage of non-traditional students and details related risk factors |
| Shachar and Neumann (2003) | MA of 86 experimental and quasi-experimental studies (1990–2002) that met inclusion criteria | Found in 2/3 of the cases, students taking courses online outperformed their student counterparts enrolled in F2F courses |
| Tallent-Runnels et al. (2006) | Review (1993–2004) of common educational databases, identified 76 studies focusing on distance education | Mainly descriptive and exploratory studies; online students are non-traditional and Anglo American; synchronous communication facilitated in-depth communication (but not more than F2F classes); students liked self-pacing; online learning outcomes same as in F2F courses; students with prior computer experience were more satisfied with online courses |
| Wu (2015) | MA of twelve studies published between 2013 and 2014 which compared at least one F2F section to at least one hybrid or fully online section | Three studies employing randomization or quasi-experimental strategies found students in online or hybrid courses performed slightly worse to no differently than peers in F2F courses; descriptive studies incorporating control variables found online and hybrid courses associated with lower learning outcomes; six studies employing strictly observational analyses found students in online and hybrid formats performed no worse and, in some cases, better than F2F sections |
| Xu and Xu (2019) | Review of online learning utilizing author’s research, IPEDS data, National Center for Education Statistics [NCES] data and summaries of a selection of studies from the online literature | Online education can benefit some CC students; however, students learn less well in online courses compared to similar students in F2F classes—particularly at 2-year and nonselective institutions; online learning can exacerbate education inequality, since online courses are substantially more prevalent at nonselective institutions that disproportionately enroll students from underrepresented groups and lower socioeconomic backgrounds |
| Zhao et al. (2005) | Review of 51 journal articles (1966–2002) which compared F2F versus online outcomes | Aggregated data show no significant difference in outcomes between online and F2F courses, but remarkable difference across studies; effective online education needs interaction by live human instructors and the right mix of technology; online may be more appropriate for specific content and student groups |
Table 4.
Quantitative and mixed method (MM) single site or single course studies (n = 36)
| Study/work | Sample/context | Design/method | Measure | Key findings |
|---|---|---|---|---|
| Ary and Brune (2011) | n = 185 (half the students online) in personal finance courses in small private AK university (2009–2010) | Quasi-experimental (QE) design, quantitative performance data analysis | Grades | Modality made little difference in final grade; higher mean G.P.A.s and ACT scores found for students in online section; results identify G.P.A.s, and to a lesser extent ACT scores, as predictors of student online success, regardless of modality |
| Ashby et al. (2011) | n = 167 (35% F2F; 28% blended; 38% online) students in an Algebra course at a Mid-Atlantic CC | QE design, quantitative performance data analysis | Attrition | Online and blended students performed worse than F2F students without taking attrition into account; for students who completed the course, F2F students performed worse; among students who completed all assignments, online students had highest success rate |
| Bettinger et al. (2017) | n = 230,484 students at a large for-profit university | QE design, retrospective Institutional Research (IR) data analysis | Attrition | Online students perform substantially worse in terms of grade and future courses than students in F2F courses; results show differentially larger negative effects of online course-taking for students with lower G.P.A.s |
| Brown (2012) | n = 324 students from an AL college of education (2007–2010) | QE design, retrospective IR data analysis | Attrition; Grades | No difference in the grades for online and F2F courses; more students on average dropped online courses compared to F2F courses |
| Burns et al. (2013) | n = 382 students in information science courses at a Midwest land grant institution (2010–2012) | QE design, retrospective IR data analysis | Grades | G.P.A. was a highly significant factor for student success in the introductory course, regardless of modality; students had better grades F2F; students who took online or hybrid section of the introductory course did better than F2F in the next, more advanced course, regardless of modality |
| Cavanaugh and Jacquemin (2015) | n = 140,444 students from a large public university (2010–2013) | QE design, retrospective IR data analysis | Grades | Little or no difference between course offering formats; non-minority, older, women students had higher grades than minority, younger, men students; students with higher GPAs performed better in online courses |
| Cochran et al. (2014) | n = 2314 students in online courses (spring 2010) at a large state university | QE design, quantitative performance data analysis | Attrition | Strongest predictor of online withdrawal is academic experience; seniors less likely to withdraw than non-seniors; students who previously withdrew from online courses more likely to withdraw in the current term; students with G.P.A.s > 3.0 less likely to withdraw online |
| Conway et al. (2011) | n = 2330 students in a large NE urban CC | QE design, retrospective IR data analysis | Enroll-ment by student characteristics | Minority students less likely to enroll in online courses; Black and Hispanic students, regardless of modality, had lower G.P.A.s |
| Driscoll et al. (2012) | n = 368 (n = 212 online; n = 231 F2F) sociology students at an urban NC university (2010) | MM design; quantitative performance data analysis; survey data analysis | Enrollment by student characteristics; Perceived satisfaction | No significant difference in student satisfaction between modalities; online students had lower grades than F2F students, but after controlling for G.P.A., differences were insignificant; online students had lower G.P.A.s, were more likely to be older, college seniors, took fewer credit hours and worked more hours per week than F2F students |
| Faidley (2018) | n = 557 (n = 124 online; n = 433 F2F) accounting students in a public university in the SE U.S. (2015–2017) | QE design, retrospective IR data analysis | Attrition; Grades | Students scored significantly higher on final grades and had higher G.P.A.s in F2F section than online; women outperformed men in both modalities; older students (> 25 years) scored significantly higher than younger students in online courses; women enrolled more online than F2F and in comparison to men |
| Fendler et al. (2018) | n = 504 (n = 219 F2F; n = 285 online) business students in a finance course | MM design; retrospective IR data analysis; survey data analysis | Grades | Self-selection into courses may be important; G.P.A. highly significant predictor of final grades in both modalities; men earned better grades than women online (at the 10% significance level); no gender differences observed in F2F courses; greater number of higher education credits completed generally positively impacts online grades |
| Fetzner (2013) | n = 438 online students from a Rochester, NY CC (2000–2001, 2005–2006, and 2009–2010) | MM design; retrospective IR data analysis; survey data analysis | Grades; Perceived impact factors | Older (> 25 years) students, early enrollees and those with more credits had better grades online; online non-success largely due to falling behind/inability to catch up; other reasons: personal problems (health, balancing study, work and family responsibilities), didn’t like modality or instructor’s teaching style, technical difficulties, course difficult or time intensive, unmotivated, heavy course load |
| Figlio et al. (2013) | n = 312 (n = 215 online; n = 97 F2F) students from a Microeconomics course taught at a large selective university | Experimental design with random assignment, quantitative performance data analysis | Grades | F2F students performed better on exams than online students, but not statistically significant; Hispanic students, men students, and lower-achieving students had higher grades F2F versus online; no difference between modalities among students with higher G.P.A.s, but lower G.P.A. students in online course scored significantly lower on in-class exams than those F2F |
| Francis et al. (2019) | n = 2411 developmental mathematics students from a SE U.S. CC | MM design; quantitative performance data analysis; self-report survey data analysis | Grades; Motivation | Online students failed, withdrew and earned lower grades more often than F2F students; no impact of gender, ethnic/racial status or generation status; older online learners had lower grades and pass rates than F2F older learners and also younger learners in either modality; F2F and online students did not show differential patterns of motivational change |
| Garman (2012) | n = 6582 (n = 2237 Online; n = 4345 F2F) students enrolled in biology courses in a TN CC (2008–11) | QE design, retrospective IR data analysis | Attrition; Grades | Final grades and completion rates higher for F2F students than the online students; student success greater for women, men, and health program majors in F2F sections compared to online sections; students older than 25 had a higher success rate F2F compared to online |
| Goomas and Clayton (2013) | n = 115 (n = 52 online; n = 63 F2F) first-time students at a TX CC | QE design, quantitative performance data analysis | Attrition; Grades | First-time-in-college online students fared worse as measured by final semester grades and retention in comparison to first-time-in-college F2F students |
| Hachey et al. (2012) | n = 880 students at a large urban CC | QE design, retrospective IR data analysis | Attrition | Prior online course experience strongly correlated with online course success in future courses |
| Hachey et al. (2013) | n = 1284 students at a large urban CC (2004–2005); additional data through 2008 | QE design, retrospective IR data analysis | Attrition | Restrictive G.P.A. policies did not make a significant difference in attrition rates; no significant drop in online attrition or in the gap between online and F2F attrition when prohibiting students with a G.P.A. < 2.0; students with G.P.A. < 2.0 had the lowest online vs. F2F attrition rate ratio; online students had significantly higher average G.P.A.s than F2F students |
| Hachey et al. (2014) | n = 962 students in a large urban CC (2004–2010) | QE design, retrospective IR data analysis | Attrition | For students with previous online course experience, prior online course outcomes more significant predictor of future online course grades and retention than G.P.A.; for students without prior online course experience, G.P.A. was a good predictor of online course outcomes |
| Hachey et al. (2015) | n = 1566 students who took a STEM course online at a large urban NE CC (2004–2012) | QE design, retrospective IR data analysis | Attrition; Grades | Prior online course experience added significant information about likely future STEM online outcomes when controlling for G.P.A.; students who dropped or earned a D or F in even one prior online course had significantly lower rates of successful online STEM course completion than students with no prior online experience, even when controlling for G.P.A. |
| Helms (2014) | n = 96 (n = 58 online; n = 32 F2F) enrolled in undergraduate course limited to psychology majors | QE design, quantitative performance data analysis | Enrollment by student characteristics; Grades | Online students’ G.P.A.s significantly lower than F2F students; no significant differences in communication patterns, amount of time spent online; online students performed worse on course assignments than F2F peers and F2F students had significantly higher final grades; women students more likely to enroll online |
| Johnson and Galy (2013) | n = 268 students at Hispanic-serving institution in south TX | MM design; quantitative performance data analysis; surveys | Grades | Factors identified as impacting online grades include self-efficacy with computers, time management and ability to work independently; found lower online grades for older Hispanic students in comparison to younger Hispanic students |
| Johnson and Palmer (2015) | Two samples; n = 317 and n = 331 (each about half online) Linguistics students at a large SE state university | MM design; retrospective IR data analysis; surveys | Grades | Students had better grades F2F in comparison to online; students with higher G.P.A.s were F2F; convenience was cited as reason for enrolling online; no difference in age between students enrolling F2F and online |
| Krajewski (2015) | n = 687 students in an online biology course at a Midwest CC | QE design, retrospective IR data analysis | Attrition; Grades | Age, race/ethnicity, academic load and first-term status were predictive of completion; Pell recipient status was not; age, race/ethnicity, and Pell recipient status were significant predictors of grade of “C” or better; no gender effects found; full-time students more likely to complete than part-time students but academic load did not impact grade for completers; first-term students less likely to complete online but first-term status did not impact grade for completers |
| Park and Choi (2009) | n = 147 students from a large Midwest University | MM design; quantitative performance data analysis; surveys | Attrition | Age, gender and educational level found to not have a significant and direct effect on online dropout. Online dropouts found to be significantly different in family and organizational support |
| Rich and Dereshiwsky (2011) | n = 101 students in accounting courses at a large urban CT university | MM design; quantitative performance data analysis; self-reported improvements data analysis | Grades | Online students had similar results on all assessments as those enrolled F2F; online students were older than F2F students and were more often part-time with work experience |
| Ryabov (2012) | n = 286 online students in sociology courses at a large public university (2009–2010) | QE design, quantitative performance data analysis | Grades | More women students enrolled online than men; found a positive relationship between time spent online and grade; G.P.A. was the most significant predictor of online grade; students majoring in sociology were advantaged academically compared to students who chose other majors |
| Soffer and Nachmias (2018) | n = 968 undergraduate students at a university | MM design; quantitative performance data analysis; surveys | Attrition; Grades; Perceived satisfaction | No significant differences between online and F2F courses related to grades, communication with other students or completion rate; students in online courses had better understanding of the course structure, reported better communication with instructors and higher engagement and satisfaction compared to F2F |
| Strang (2017) | n = 228 students in an online business course | Compared to F2F; data analytics; student essay data analysis | Grades | Age, gender, and culture were unrelated to a student’s outcome online; students who spent more time on quizzes received higher online grades |
| Tanyel and Griffin (2014) | n = 5621 (n = 2226 online; n = 3355 F2F) students at a small SE regional state university over a 10-year period | QE design, retrospective IR data analysis | Attrition; Grades | Final grades for F2F students were significantly higher than for online sections; attrition higher in online courses; greater number of students older than 25 in online courses than traditional-aged students; found non-significant difference in prior G.P.A. for online and F2F students (although online students had slightly higher G.P.A.s) |
| Wagner et al. (2011) | n = 624 students in a business application software course (2001–2010) | QE design, retrospective IR data analysis | Grade | Men had lower average grades in the online course than in F2F courses |
| Wang et al. (2013) | n = 256 undergraduate and graduate students at a SE university (2008–2009) | MM design; quantitative performance data analysis; survey data analysis | Grade | Students with previous online learning experiences had more effective learning strategies when taking online courses, and as a result had higher levels of motivation in their online courses; this correlated with technology self-efficacy and course satisfaction, which positively impacted online grades |
| Wilson and Allen (2018) | n = 101 students at a Historically Black College or University [HBCU] (Spring 2010) | QE design, quantitative performance data analysis | Enrollment by student characteristics; Attrition | Online students tended to be women, older and to have earned greater semester credit hours than F2F students; no differences in the completion and failure rates between online and F2F students; G.P.A. was greatest predictor of course grades, regardless of modality; online completers earned greater credit hours than F2F completers |
| Wladis et al. (2015a) | n = 3600 students from an urban NE CC who took a STEM course between 2004 and 2011 | QE design, retrospective IR data analysis | Attrition | Women significantly more likely than men to succeed in F2F STEM courses; both genders had nearly identical success rates in online STEM courses; students < 24 years of age significantly less likely to successfully complete STEM courses online versus F2F, and less likely when compared to their older peers |
| Wuellner (2013) | n = 105 students in two courses in wildlife and fisheries sciences at South Dakota State University | MM design; quantitative performance data analysis; survey data analysis | Grades | Student performance was similar between both modalities but mixed between courses; one online section reported lower levels of satisfaction than all other sections; author notes student characteristics may explain differences |
| Yeboah and Smith (2016) | n = 149 minority online students at a SE U.S. university | MM design; quantitative performance data analysis, survey data analysis | Attrition; Perceived satisfaction | Satisfaction and social media usage had no relationship with academic performance; positive relationship found between technology usage, number of online courses, program of study, and academic performance; minority students’ academic performance was facilitated by factors such as cultural, language, personal, and self-efficacy skills |
Results: What are the characteristics of undergraduate students who enroll in online courses and their relation to online retention and success?
Except under emergency conditions such as pandemic-driven necessity, students elect to enroll in online courses (versus face-to-face courses). The research suggests that students who enroll in online courses likely differ, on average, from those who enroll in face-to-face courses in terms of academic preparation/G.P.A., ethnicity/English language skills, gender, age, family/employment, part-time enrollment, non-traditional student characteristics, and other characteristics such as socioeconomic status, prior online experience, and credit accumulation. The literature on each of these factors is discussed below, and we share what is known about their relationship with undergraduate student enrollment and success in online courses.
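To make concrete how such factors could feed a pre-enrollment predictive model, the following sketch (our illustration, not a model from any study in this review) shows a handful of these candidate characteristics entering a simple logistic regression predicting online course completion. All variable names, effect sizes, and data below are synthetic assumptions.

```python
# Illustrative sketch only: candidate pre-enrollment characteristics in a
# logistic regression predicting online course completion. All predictors,
# coefficients, and data are synthetic assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical pre-enrollment predictors
gpa = rng.uniform(1.0, 4.0, n)         # prior G.P.A.
prior_online = rng.integers(0, 2, n)   # successfully completed a prior online course
part_time = rng.integers(0, 2, n)      # part-time enrollment flag
age = rng.uniform(18.0, 50.0, n)       # age at enrollment

# Standardize continuous predictors so gradient ascent converges cleanly
gpa_z = (gpa - gpa.mean()) / gpa.std()
age_z = (age - age.mean()) / age.std()
X = np.column_stack([np.ones(n), gpa_z, prior_online, part_time, age_z])

# Synthetic "true" relationship used only to generate completion labels
logit = 0.5 + 0.7 * gpa_z + 1.0 * prior_online - 0.5 * part_time + 0.3 * age_z
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

# Minimal logistic regression fitted by gradient ascent on the log-likelihood
beta = np.zeros(X.shape[1])
for _ in range(3000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (y - p) / n

p = 1 / (1 + np.exp(-X @ beta))
accuracy = ((p > 0.5) == y).mean()
print("fitted coefficients:", np.round(beta, 2))
print(f"in-sample accuracy: {accuracy:.2f}")
```

In practice, a validated model would of course require real institutional data, careful handling of correlated factors (e.g., non-traditional characteristics mediating gender and ethnicity effects), and out-of-sample testing.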
Academic Preparation and G.P.A.
Enrollment
Several large-scale and multi-institution studies suggest that undergraduate students who enroll in online classes generally have higher G.P.A.s than those who only take face-to-face courses (Ary & Brune, 2011; Jaggars & Xu, 2010; Xu & Jaggars, 2011a*). In more recent multi-institution studies, Johnson and Mejia (2014*) found that online course enrollment is more prevalent among more academically prepared students, and Cavanaugh and Jacquemin (2015) report that online students have higher G.P.A.s despite similar re-enrollment credit accumulation. On the other hand, some single course studies (Driscoll et al., 2012; Faidley, 2018; Helms, 2014; Johnson & Palmer, 2015) found that students who chose online courses had lower G.P.A.s and performed more poorly than face-to-face peers. Additionally, both a 10-year longitudinal study and a study using national data report that online students may be no more prepared, or even less academically prepared, than students who only take face-to-face courses (Shea & Bidjerano, 2014; Tanyel & Griffin, 2014).
Outcomes
Academic preparation (usually measured by G.P.A., SAT score and/or class rank) has been found to be a mediating variable affecting online grades and dropout rates (Bettinger et al., 2017; Figlio et al., 2013; Gregory, 2016; Harrell & Bower, 2011; Ice et al., 2012; Johnson & Mejia, 2014; Lee & Choi, 2011; Lint, 2013; Ryabov, 2012; Wladis et al., 2016; Xu & Jaggars, 2011a*), although the extent to which this differs between online and face-to-face courses is unclear. Remedial placement (a strong indicator of academic preparedness) has been found to negatively impact online success, as measured by lower grades and higher dropout (Jaggars & Xu, 2010; Xu & Jaggars, 2013, 2014*). Recently, Francis et al., (2019*) reported that online students in developmental math courses1 had worse outcomes than comparable face-to-face students, including lower grades and lower pass rates. Research also suggests that taking online developmental courses may negatively impact future online enrollment (Xu & Jaggars, 2011b*). Using multi-institutional data, Johnson and Mejia (2014*) found that students who were less academically prepared performed worse in online courses. Along the same lines, Cavanaugh and Jacquemin (2015) report that students with higher G.P.A.s perform better in online courses compared to a face-to-face format. Overall, these findings substantiate earlier assertions by Lee and Choi (2011), who, in their review of the literature from 1999 to 2009, cite eight empirical studies indicating that students with higher levels of academic preparation were less likely to drop out of online courses.
Research has shown that G.P.A. was a highly significant predictor of grades for both online and face-to-face formats (Burns et al., 2013; Fendler et al., 2018; Wilson & Allen, 2018). Still, we note that there are nuances in the research. Figlio et al. (2013) found no significant difference between online and face-to-face course formats among students with higher prior G.P.A.s. However, among those with lower G.P.A.s, students in the online course scored significantly lower on in-class exams than did those in the face-to-face section. G.P.A. has been noted as having a significant relationship with online course dropout (Harrell & Bower, 2011; Helms, 2014). Cochran et al. (2014) found that students with lower G.P.A.s were more likely to withdraw from online courses than those with higher G.P.A.s. Similarly, Wladis et al., (2016*) report that the weakest G.P.A.s were associated with higher risk online, though the effect size in this study was small and students with G.P.A.s in the 2.0–3.0 range did not have significantly different risk online than students with higher or lower grades. Additionally, Hachey et al., (2013*) found that students in the 2.0–3.5 G.P.A. range had the highest proportional difference in attrition between online and face-to-face courses. For students with no prior online course experience, G.P.A. was a good predictor of future online course outcomes; but (corroborating earlier research) for students with previous online course experience, prior online course outcomes were a better predictor of future online course grades and retention than G.P.A. (Hachey et al., 2014, 2015*). While the literature indicates that G.P.A. can be a predictor of online course retention, there is no evidence that G.P.A. is any better at predicting student outcomes in online courses than in face-to-face courses.
Ethnicity & ESL
Enrollment
Available data generally seems to suggest that there is a proportional disparity in online enrollment by ethnicity, but the relationship between ethnicity and online success is less clear. Online learners are less likely than face-to-face learners to be Latinx or Black (Ashby et al., 2011; Cavanaugh & Jacquemin, 2015; Conway et al., 2011; Kaupp, 2012; Ortagus, 2017; Xu & Xu, 2019*). Jaggars and Xu (2010*) and Xu and Jaggars (2011a*, 2011b*) assessed students who took online courses at community and technical colleges in Virginia and Washington State, and reported that online students tended to be White. Similarly, Johnson and Mejia (2014*) report that online enrollments were proportionally lower for Black and Latinx California community college students than for White students, with Latinx students lagging all other student groups in online enrollment. Within the New York SUNY system, Shea and Bidjerano (2017*) report that White community college students were much more likely to take both online and face-to-face courses than to be face-to-face only students. Using national data, Shea and Bidjerano (2014) found that the odds of taking an online course were lower for Black students in comparison to White students. Specific to STEM online courses, Wladis et al. (2015b) found that Latinx and Black STEM majors were significantly less likely to take online courses than their White peers. Further, national data indicate that Black and Latino men are particularly underrepresented proportionally in STEM online courses (Wladis et al., 2015c*). Additionally, ESL and immigrant status have been suggested as potentially having a negative effect on enrollment in online courses, as some research has shown that online students tend to be native English speakers (Xu & Jaggars, 2011a*).
Outcomes
Research regarding the impact of ethnicity on online success is mixed, with most studies failing to differentiate between overall relationships which may exist both online and face-to-face, versus those that may be specific to the online environment. Figlio et al. (2013) randomly assigned students in a single course and report that the face-to-face format produced more favorable outcomes than the online format for students who were lower-achieving, men, and Latinx (we note this study tested one narrow definition of online learning among a small sample from a single class at a single university). Several studies using statewide data from California community colleges found that Latinx and Black students in online classes had higher withdrawal rates and lower grades than their peers in face-to-face classes (Johnson & Mejia, 2014; Kaupp, 2012*); however, we note that online and face-to-face students may not be comparable on key characteristics. Ice et al. (2012) report race and ethnicity as a factor in online retention and success. Krajewski (2015*) found that non-White students were 2.4 times less likely to complete an online course than White students and, further, that White students were 2.5 times more likely to earn a C or above in comparison to non-White students. Similarly, Xu and Jaggars (2013*), using multi-institutional datasets, found that Black students had poorer performance (i.e., lower grades, higher withdrawal) in comparison to White students in online courses. In the other direction, Yeboah and Smith (2016) found a positive relationship for minority students (defined as African American, Hispanic, Asian, Native American or Pacific Islander, including international students) between the number of online courses taken and academic performance. The results found in these studies may simply be reflecting general trends, and not ones that are specific to the online environment.
In contrast, Xu and Jaggars (2014*) report that for students of different ethnicities, although all types of students were more likely to drop out from an online course than a face-to-face course, the size of this difference did not significantly vary across ethnic groups. Similarly, Strang (2017) found that there was no interaction between ethnicity and course medium, suggesting that gaps in success rates between Latinx and White students were no greater online than face-to-face. Shea and Bidjerano (2017*), analyzing community college student data from the New York SUNY system, found that minority status was not a significant predictor of differences in G.P.A. between face-to-face and online courses. Lei and Gupta (2010) posited that the online environment can build a bridge between academia and students from diverse cultures which can lead to better academic outcomes. Supporting this, Wladis et al., (2016*) found that foreign-born students (with English as a second language) in a large, urban, U.S. university system were at less risk of dropout online than native-born students.
Gender
Enrollment
There is strong evidence that women (who enroll in higher education in greater numbers than men) are also represented to an even greater degree in the online environment. Numerous studies, which confirm a plethora of earlier research, indicate that a larger proportion of women students enroll in online courses (Ashby et al., 2011; Cavanaugh & Jacquemin, 2015; Faidley, 2018; Helms, 2014; James et al., 2016; Pao, 2016; Ryabov, 2012; Streich, 2014*; Wilson & Allen, 2018). Within the New York SUNY system, Shea and Bidjerano (2017*) report that women community college students were much more likely to take both online and face-to-face courses than to be face-to-face only students. Jaggars and Xu (2010*), Johnson and Mejia (2014*), Smith (2016) and Xu and Jaggars (2011a*, 2011b*), all using different multi-institution state datasets, report that online courses are significantly more popular among women students in comparison to men. Shea and Bidjerano (2014), using national data, confirm that there is an overrepresentation of women among students who take online courses. Also using national data, Wladis et al. (2015b*, 2015c*) found that women STEM majors are significantly more likely than men STEM majors to take online courses, even when academic preparation, socioeconomic status (SES), citizenship and English-as-second-language status were controlled.
Outcomes
Research on the relationship between gender and online course retention and success is inconclusive. Ice et al. (2012) report gender as a factor in online retention and success. Wagner et al. (2011) and Gregory (2016*) found an interaction between gender and course delivery method, with lower average grades for men online than in face-to-face courses. Using multi-institutional datasets, Xu and Jaggars (2013*) and Johnson and Mejia (2014*) found that women community college students had better online course persistence and grades than men students. Cochran et al. (2014) found that men were more likely to withdraw from online courses than women. Other studies report no relationship between gender and online course grades or completion rates (Burns et al., 2013; James et al., 2016; Krajewski, 2015; Park & Choi, 2009; Strang, 2017). Shea and Bidjerano (2017*), analyzing community college student data from the New York SUNY system, report gender was not a significant predictor of differences in G.P.A. between face-to-face and online courses. Specific to STEM online courses, one study found that women students did significantly worse online than would be expected based on their comparable face-to-face STEM outcomes [and yet, still no worse than men students] (Wladis et al., 2015a*). Palacios and Wood (2016*) found that community college men had higher success face-to-face in comparison to men online, even when accounting for ethnicity. While some studies cite no differences and others have found women outperform men, we found only one study (Fendler et al., 2018) that reports men obtaining better grades than women in online courses (at the 10% significance level); this study reported no gender differences observed in face-to-face courses.
Age
Enrollment
Research suggests that many students who take online courses do so because they need the flexibility that these courses offer due to life challenges that make it difficult to attend face-to-face courses (Jaggars & Bailey, 2010; Lei & Gupta, 2010; Pontes et al., 2010). Studies have found that online learners are more likely to be older (> 24), married with families, employed and with other responsibilities (Ashby et al., 2011*; Cavanaugh & Jacquemin, 2015; Driscoll et al., 2012; Faidley, 2018; Jaggars & Xu, 2010*; Johnson & Mejia, 2014*; Ortagus, 2017; Pao, 2016; Rich & Dereshiwsky, 2011; Smith, 2016; Tanyel & Griffin, 2014; Wilson & Allen, 2018; Xu & Jaggars, 2011a*). Within the New York SUNY system, Shea and Bidjerano (2017*) report that older community college students were much more likely to take both online and face-to-face courses than to be face-to-face only students. The current literature overwhelmingly confirms an early research review by Tallent-Runnels et al. (2006), which puts the average age of online students above the traditional college age range of 18–24 years.
Outcomes
Most large-scale studies indicate that on average students in online courses are older (> 25) and that older students tend to be more successful in the online environment than younger students, both in terms of grades and completion rates (Garman, 2012; Ice et al., 2012; Kinder, 2013*). Older students (> 25) have been reported to adapt more readily to online courses than younger students, and to benefit from higher levels of critical thinking, responsibility, and maturity (Kinder, 2013; Wuellner, 2013; Xu & Jaggars, 2013, 2014*). Johnson and Mejia (2014*) report that older students (> 24) did better than younger students in online courses, with a 4.6 percentage point achievement gap. In another multi-institutional study, James et al. (2016) report that among students taking only online courses, older students had better retention. Krajewski (2015*) reports that for every year of age, students were 1.1 times more likely to complete an online course. Xu and Jaggars (2014*) found that older online students did better than younger online students (but not better than comparable younger face-to-face students). Specific to STEM online courses, Wladis et al., (2015a*) found that older students did significantly better online than would be expected based on their outcomes in comparable face-to-face STEM courses.
Park and Choi (2009), in a small-scale study, did not find age differences between those who dropped out and those who were retained in online courses. Similarly, Burns et al. (2013) and Strang (2017) found no impact of age on online performance. In a few exceptions to general trends, Francis et al., (2019*) found that older students had lower grades than both face-to-face peers and younger students in either delivery modality, and Johnson and Galy (2013) report lower online grades for older Hispanic students in comparison to younger Hispanic students. Studies that attempted to compare trends by age online versus face-to-face seem to suggest that age is a factor. Yet, as with other characteristics, it is often difficult to determine the extent to which this relationship is specific to the online environment, or simply reflective of overall academic trends by age.2
Family and employment
While most studies seem to point to greater success for older students, the mixed findings may be due to the added complexity of many older students’ lives: they are more likely to work, be married, have children and have other responsibilities that compete for time to devote to academics and impact decisions to drop out (Jaggars et al., 2013*; Lee & Choi, 2011; Smith, 2016). Students with the additional complexities of work, family commitments, and other life responsibilities have been shown to have significantly greater preference for the flexibility and convenience of online courses (Brown, 2012; Pontes et al., 2010). Enrollment in online courses helps these students better manage their busy schedules and provides better work/life balance (Helms, 2014; Jaggars, 2013*; Wladis et al., 2016*).
Nonetheless, time commitments outside of school also seem to impact their online academic performance: Wladis et al., (2016*) found that students with insufficient time for their academics (i.e., those with “time poverty” due to family and work responsibilities) may be more likely to drop an online course than a face-to-face class, all else being equal. Additionally, they found evidence that the number of credits/courses taken (increasing time poverty) may be a significant predictor of differential online versus face-to-face course retention, with students taking more credits/courses at higher risk of online dropout. They further found that parents of young children (under age six) were particularly at risk of online dropout; in contrast, Gregory (2016*) did not find an impact for having a dependent child (but being married made it more likely to earn an “A” online). Park and Choi (2009) found no difference in completion results with adult learners taking online courses related to their work. There is almost no research investigating the extent to which increased work and family commitments may simultaneously relate to online course enrollment and to outcomes in online versus face-to-face courses, despite the extensive research showing that online students are significantly more likely to have work and family commitments. Thus, this is a significant area where more research is needed.
Part-time enrollment
Several studies have found that students who enrolled in online classes were employed for more hours than their face-to-face counterparts (Driscoll et al., 2012; Xu & Jaggars, 2011a, 2011b*) and as a result are more likely to attend part-time, which may impact enrollment and persistence. Part-time enrollment, for example, may be a weak proxy variable for time poverty, which may significantly relate to both online enrollment and course outcomes. Johnson and Mejia (2014*) report that part-time students do significantly worse in online courses. Similarly, Krajewski (2015*) found that full-time students were 2.1 times more likely to complete an online course than part-time students. Research suggests that without adequate support (e.g., childcare; financial aid to reduce work hours), the flexibility that online courses offer may not be enough to compensate for the time demands of non-academic responsibilities (Wladis et al., 2016*). Still, for students juggling school, family, and work obligations, the ability to maintain a full-time course load by taking some online courses each term may outweigh any theoretical risks of higher attrition online in terms of actual degree attainment (Johnson & Mejia, 2014*). Perhaps in support of this, within the New York SUNY system, Shea and Bidjerano (2017*) found that full-time community college students were much more likely to take both online and face-to-face courses than to be face-to-face only students.3
Non-traditional characteristics
As noted earlier, there are seven characteristics typically considered non-traditional: delayed enrollment (> age 24); no high school diploma; part-time enrollment; financially independent; have dependents; single parent status; working full-time while enrolled (NCES, 1996, 2002). While some of these characteristics were addressed above, studies have often looked at these characteristics as a group, as there is evidence that online learners are more likely to possess non-traditional student characteristics (Pao, 2016; Pontes et al., 2010; Rovai & Downey, 2010; Xu & Jaggars, 2013). Jaggars and Xu (2010*), Xu and Jaggars (2011a*, 2011b*) and Streich (2014*), using multi-institutional state data, report that being financially independent, having dependents and/or being employed was associated with online course taking. Shea and Bidjerano (2014), using national data, report an overrepresentation of online students in comparison to face-to-face students in six out of the seven categories of non-traditional characteristics. Wladis et al. (2015b*, 2015c*), also using national data, found that having non-traditional student characteristics strongly increased the likelihood of enrolling in a STEM course online, more so than any other characteristic, with online enrollment probability increasing steeply as the number of non-traditional factors increased. Further, they found that the impact of non-traditional factors on online enrollment was significantly stronger for STEM than non-STEM majors. Wladis et al. (2015b*) contend that as non-traditional students are more likely to be women and non-White, non-traditional characteristics may serve as a mediating variable for differences in online enrollment by gender and ethnicity.
Further, military students, also shown to often have non-traditional characteristics, have overwhelmingly moved to online learning modalities, yet there is little specific research on this population (Ford & Vignare, 2015). [Outside our search parameters due to its self-report methodology, we note the only research found related to military students (Wright, 2015; see the Appendix, Table 7), which suggests a connection between military duty, online enrollment, and dropout.]
Table 7.
Qualitative single site or single course studies
| Study/work | Sample/context | Design/method | Measure | Key findings |
|---|---|---|---|---|
| Wright (2015) | n = 152 adult learners (2013–2015) | Qualitative design, survey and interview data analysis | Attrition | Students did not complete online courses because of: military duty (15%); work (8%); course difficulty (33%); modality preference (33%), communication with the instructor (33%); financial need (8%) |
There is concern about the online course performance of underprepared and/or traditionally underserved students who are already at higher risk of course dropout (Jaggars & Bailey, 2010*). Evidence that non-traditional students tend to enroll in online courses, combined with research reporting that non-traditional students are more likely to be non-White and women, as well as significantly more vulnerable to negative academic outcomes (NCES, 1996, 2002), suggests that non-traditional characteristics may be a mediating variable for differences in online outcomes by gender and ethnicity (Wladis et al., 2015c*). Yet, data on the effect of all non-traditional characteristics on online enrollment and persistence are to date inconsistent across the literature (Wladis et al., 2015c*) or generally missing. The high proportion of non-traditional students who enroll online compared to face-to-face students should make us cautious about generalizing inferences based on simple comparisons of online versus face-to-face students, as these two populations are significantly different; imposing norms developed with traditional students on non-traditional students may lead to invalid inferences.
Other characteristics (SES, prior online experience & credit accumulation)
Other characteristics that warrant further exploration include socioeconomic status (SES), prior online course taking experience, and credit accumulation. SES has been posited as an important mediating variable when considering online course enrollment and ethnicity (Wladis et al., 2015b*). Ortagus (2017) found increasing participation of low-income students in comparison to middle/high income students in online courses, though low-income students were also less likely to enroll completely online. Shea and Bidjerano (2017*) found that community college students within the New York SUNY system who were Pell Grant recipients were much more likely to take both online and face-to-face courses than to be face-to-face only students. Additionally, several large-scale and national data research studies have reported that in comparison to face-to-face enrollment, students enrolling online are more likely to have applied for, or received, financial aid (Jaggars & Xu, 2010; Shea & Bidjerano, 2014; Smith, 2016; Streich, 2014; Xu & Jaggars, 2011a*).
Krajewski (2015*) did not find that financial aid was related to online completion but did report that Pell Grant recipients were 2.7 times less likely to achieve a C or better in an online course. Similarly, Gregory (2016*) reports that students ineligible for Pell Grants were more likely to earn As, Bs, and Cs in online courses than those who were eligible. However, Burns et al. (2013) report that Pell Grant eligibility was not a significant factor in online grades. Cochran et al. (2014) found that students with loans were more likely to withdraw from online courses than those without loans. Wladis et al. (2016) and James et al. (2016) report that while household income and Pell Grant eligibility were strongly correlated with course and college outcomes generally, they were not relevant to the online environment specifically. This does not eliminate SES as a factor; it may simply be that lower-income students need significant support regardless of course modality.
Both prior online experience and college experience generally may have an impact on online success. Several studies report that students with a greater number of completed higher education credits had better performance in terms of grades and/or retention online (Cochran et al., 2014; Fendler et al., 2018; Fetzner, 2013*; Wilson & Allen, 2018), though Burns et al. (2013) did not find an effect for class status. Krajewski (2015*) found that students in their first term were 2.6 times less likely to complete an online course than students who completed at least one semester. We note it is difficult to determine whether these patterns are specific to the online medium, or just reflective of a relationship between course outcomes and college level.
Previous online course experience may have a particularly significant impact on future online success (Burns et al., 2013; Lint, 2013*). Wang et al. (2013) report that online experience directly affects self-regulation, which in turn impacts course performance. Hachey et al. (2012*) report that a student's prior online course success explains 13.2% of the variation found in retention and 24.8% of the variation found in online success over and above GPA alone, making prior online experience a significant predictor variable, one more specific than prior academic outcomes generally. Students with no previous online course experience (including new freshmen) or those with prior unsuccessful online experience (i.e., course withdrawal or low grade) have been found to be at greatest risk (Cochran et al., 2014; Goomas & Clayton, 2013*; Hachey et al., 2012*, 2014*; Ice et al., 2012; Jaggars, 2011*), suggesting that this may be an important factor to consider in future models.
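Incremental variance explained "over and above GPA alone," as reported by Hachey et al. (2012*), is the kind of quantity typically obtained via hierarchical regression: fit a baseline model with GPA only, then add the prior online outcome and compare the variance explained (R²). The sketch below uses synthetic data; all variable names, effect sizes, and the resulting ΔR² are illustrative and are not drawn from the study.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit of y on X (with intercept)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - resid.var() / y.var()

# Synthetic data: the coefficients here are illustrative only.
rng = np.random.default_rng(0)
n = 500
gpa = rng.normal(3.0, 0.5, n)
prior_online_success = rng.binomial(1, 0.5, n)  # 1 = passed a prior online course
course_grade = 0.5 * gpa + 0.8 * prior_online_success + rng.normal(0, 0.7, n)

# Step 1: baseline model using GPA alone.
r2_gpa = r_squared(gpa.reshape(-1, 1), course_grade)

# Step 2: add the prior online outcome; the increase in R^2 is the
# variance explained over and above GPA alone.
r2_full = r_squared(np.column_stack([gpa, prior_online_success]), course_grade)

delta_r2 = r2_full - r2_gpa
print(f"Incremental R^2 from prior online outcome: {delta_r2:.3f}")
```

Because the models are nested, the full model's in-sample R² can never be lower than the baseline's; a substantial positive ΔR² is what would mark prior online experience as adding predictive information beyond GPA.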
Limitations
In applying Wallace (2003) and de Souza et al.'s (2010) integrative methodology, we attempted to cast a wide net, conducting a comprehensive review of the U.S. literature from the past decade on student characteristics that may impact the decision to enroll, and subsequent success, in online learning. To this end, we employed multiple search techniques. As all-encompassing as our method tried to be, we still cannot be certain that all relevant literature was gathered within our defined search parameters. Further, the search criteria in our protocol serve, in their own way, as limitations; by definition, setting exclusions means that some data are lost. We note that while this review covered a large time span, it does not include articles prior to 2010 barring a few exceptions, so knowledge gained in previous decades is not accounted for in this study. And, while we selected for manual searching the databases, meta-reviews, and journals most likely to contain relevant articles, other databases and journals not included could have contributed to our findings. Additionally, we acknowledge that excluding studies conducted outside the U.S. or not written in English likely filtered out studies pertinent to online student characteristics and, in this way, does not allow generalizability of our findings to global contexts. Finally, our findings may only be applied to undergraduate students at institutions that offer both online and face-to-face courses, thus excluding graduate and online-only sectors of higher education. That said, the multiple search approaches and the protocol adopted did yield a relatively large number of articles for this review and thus serve to shed some light on the U.S. online learning literature that focuses on undergraduate student characteristics and their potential relationships to enrollment and outcomes.
Concluding thoughts and future directions
The results of this integrative review, though in many ways inconclusive, are also informative. We identified ten student characteristics which most past research suggests are relevant in describing online undergraduate students (G.P.A./academic preparation; ethnicity; ESL; gender; non-traditional status; age; family responsibilities; employment; student level; and SES). Thus, this review provides more systematic evidence that certain student characteristics significantly differentiate those who enroll online from those who enroll face-to-face. Further, while Money and Dean (2019) provide several likely broad categories of antecedents in their model of student differences for learning in online education, this review enlarges their limited sub-category discussion of demographic attributes likely to impact outcomes, expanding it into a more comprehensive set of categories.
To accurately assess whether a factor puts an undergraduate student at greater risk in the online environment, it is essential to analyze the interaction between that factor and course medium, while simultaneously controlling for the characteristics of students who select online courses (Wladis et al., 2016). However, clear evidence of the potential impact of the characteristics of undergraduate students who select online courses has yet to be determined; the research for many of the variables influencing outcomes is indeed mixed (see Table 1). Further, most of the studies found have limited potential for more generalized inferences (e.g., because they considered only patterns in online courses without direct face-to-face controls, or because they compared the same courses taken by different students without thoroughly controlling for all of the factors that correlate with student selection into online courses). That said, there seems to be evidence that the ten student variables identified are sufficiently correlated with both online enrollment and academic outcomes that they cannot be completely ignored in any online enrollment/retention model. Thus, this review supports previous assertions that several student characteristics have emerged which need further empirical testing, both regarding their impact and the predictive power these factors may have in terms of identifying students at highest risk online (Hachey et al., 2014; Street, 2010).
Table 1.
Overall summary description of findings related to identified student characteristics
| Characteristic | Enrollment | Outcomes^a |
|---|---|---|
| Academic preparation and G.P.A. | Relationship suggested; evidence mixed with some indication of higher GPA/preparation = more enrollment | Relationship suggested; strong support that higher GPA/preparation = more successful outcomes |
| Ethnicity and ESL | Relationship suggested, some evidence that enrollees tend to be White and native English-speaking | Relationship suggested; evidence mixed with some indication that people of color may have less successful outcomes |
| Gender | Relationship suggested; strong support of higher enrollment of women in comparison to men | Relationship suggested; evidence mixed; tends to show genders equal or women do better than men |
| Age | Relationship suggested; strong support of students > 24 enrolling online | Relationship suggested; direction mixed with some indication that older students do better than younger students |
| Family, employment | Relationship suggested; support of higher enrollment of students with children/family commitments and those working | Relationship suggested; evidence mixed with some indication that work/family responsibilities may correlate with higher dropout |
| Non-traditional characteristics | Relationship suggested; strong support of higher enrollment of non-traditional students | Relationship posited; evidence mixed or missing |
| SES | Relationship suggested; some indication that students who enroll online may have lower SES | Relationship suggested; evidence mixed, with some indication lower-income online students may earn lower grades |
| Prior experience (College-level; previous online coursework) | Research not found | Relationship suggested; evidence supports that more college or online prior experience = more successful outcomes |
^a While the evidence from the studies found does seem to suggest a relationship between enrollment/outcomes and each of the characteristics identified (and thus, that they should be considered in online enrollment/retention models), we note that the literature remains unclear regarding whether the relationship for each factor is specific to the online modality or reflects a trend that holds for courses generally, regardless of modality
The characteristics we identify above, plus the prior online course experience factor, combined with the additional factors identified in the review by Lee and Choi (2011) (69 factors, including items related to course design and institutional support), offer insight into the creation of an archetypical predictive model of retention and success for undergraduate online learning. However, before such a model can be constructed, further work is needed. First, more large-scale, well-controlled empirical research is needed to narrow down which student characteristics truly do impact outcomes in the online course modality and, specifically, in what direction and to what extent. Many of the supports or interventions relevant to all students (e.g., supporting low-income students, students with lower levels of college preparation, minoritized students, etc.) remain equally relevant to online students, as this literature review demonstrates. What is less clear is which specific student characteristics may be most relevant to predicting differential outcomes online versus face-to-face. While some of the studies in this review have attempted to investigate this question, they are in the minority, and many suffer from critical limitations in the groups to which their inferences can be generalized.
In the absence of large-scale randomized control trials (RCTs) and clearer evidence, this study suggests that future observational studies should be careful to systematically measure and control for all of the student-level demographic variables identified in this review which have been shown to differ between online and face-to-face students. While many of the characteristics of undergraduate students who enroll in online courses are readily available through routine and/or automated institutional research data collections, others (e.g., work and family commitments) are less readily available, but could become so if they were included in routine data collection efforts (e.g., colleges could ask these questions on routine admissions surveys, or federal forms such as the FAFSA could collect more specific information about the ages and number of children).
In addition, both the relevant student characteristics and the possible factors identified by Lee and Choi (2011) need to be further assessed to determine which of the multitude are most predictive of academic outcomes. An archetypical model with 79+ variables is overwhelming and likely of little practical use to advisors and policy makers at undergraduate institutions; such a model need not include every possible variable, only those that capture enough of the variability in outcomes to accurately identify those at highest risk. Finally, effective and targeted interventions need to be developed and empirically tested to support those undergraduate students identified as being at most risk of dropout in the online environment. Because of the critical role that online courses may play in providing college access to students who might otherwise not have been able to take particular courses face-to-face, it is essential that we consider ways to provide all students with the supports they need to succeed in this environment. Until this work is accomplished, there is likely to be little reduction in online attrition.
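One common way to prune a large candidate set down to the few most predictive variables is penalized (L1/lasso) logistic regression, which shrinks uninformative coefficients exactly to zero. The sketch below is illustrative only, using synthetic data in place of institutional records; the variable counts, effect sizes, and penalty strength are assumptions, not a recommendation from the literature reviewed here.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for a large candidate set of student characteristics.
rng = np.random.default_rng(1)
n, p = 1000, 20
X = rng.normal(size=(n, p))

# In this illustration, only the first three predictors actually drive retention.
logits = 1.2 * X[:, 0] - 0.9 * X[:, 1] + 0.7 * X[:, 2]
retained = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

# The L1 penalty zeroes out weak predictors, leaving a sparse, usable model.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
model.fit(X, retained)

selected = np.flatnonzero(model.coef_[0])
print(f"Predictors retained by the model: {selected.tolist()}")
```

A sparse model of this kind is exactly what the paragraph above calls for: rather than all 79+ candidates, advisors and policy makers would work with the handful of variables that survive the penalty.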
Acknowledgements
This work was supported by The National Science Foundation [Award Numbers: 1920599 and 1431649] and the City University of New York William P. Kelly Research Fellowship Program.
Biographies
Alyse C. Hachey
is an Associate Professor at The University of Texas at El Paso (UTEP). Dr. Hachey serves as the Teacher Education Department Co-Chair, Division Director of BELSS [Bilingual Education, Early Childhood Education, Literacy & Sociocultural Studies], Lead Early Childhood Faculty and Codirector of the College of Education Makerspace within the College of Education. Her teaching and research interests focus on early childhood STEM curriculum development and access and degree attainment in higher education with a focus on community college, STEM, online learning and traditionally marginalized college populations.
Katherine M. Conway
is a Professor of Business at the Borough of Manhattan Community College, City University of New York (CUNY). Dr. Conway’s background spans both corporate and academia; she has a M.B.A. in Finance and a Ph.D. in Administration, Leadership and Technology, both from New York University. Her research explores student success and retention in the online environment, with a focus on community college, STEM and non-traditional students.
Claire Wladis
is a Professor of Mathematics Education at the Borough of Manhattan Community College and of Urban Education at the Graduate Center at the City University of New York (CUNY). Dr. Wladis conducts National Science Foundation (NSF) supported research in both mathematics education (related to foundational knowledge in elementary algebra) and postsecondary education (related to STEM education and online learning). Across both research foci, Dr. Wladis studies access and retention for students who have traditionally been underrepresented in higher education and STEM fields.
Shirsti Karim
has a Master of Education from the University of Toronto. She served as a research assistant at the Borough of Manhattan Community College, City University of New York (CUNY) on a funded National Science Foundation (NSF) grant. Ms. Karim is a Research Associate for Data and Cohort Management at New York University.
Appendix 1
Footnotes
We note that developmental courses (sometimes called remedial courses) are those that prepare students for college-level credit courses.
Family and employment have been reported in a combined way because that is the way that they are presented in the literature and therefore, we cannot tease them apart.
We acknowledge that non-traditional characteristics include factors that are separated out in the results (i.e., age, part-time enrollment). We report this way because some of the literature reports non-traditional characteristics as a single, inclusive construct, and yet many studies have also investigated age/part-time enrollment alone and thus provide more information for these specific factors.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- Allen, I. E., & Seaman, J. (2010). Class differences: Online education in the United States, 2010. (No. ED529952). Sloan Consortium. http://sloanconsortium.org/publications/survey/class_differences.
- Allen, I. E., & Seaman, J. (2016). Online report card: Tracking online education in the United States. (No. ED572777). Babson Survey Research Group. https://onlinelearningconsortium.org/read/online-report-card-tracking-online-education-united-states-2015/.
- Alpert WT, Couch KA, Harmon OR. A randomized assessment of online learning. American Economic Review. 2016;106(5):378–382. doi: 10.1257/aer.p20161057.
- Anderson E, Kim D. Increasing the success of minority students in science and technology (No. 310736). American Council on Education; 2006.
- Angelino LM, Williams FK, Natvig D. Strategies to engage online students and reduce attrition rates. The Journal of Educators Online. 2007;4(2):n2. doi: 10.9743/JEO.2007.2.1.
- Arias JJ, Swinton J, Anderson K. Online vs. face-to-face: A comparison of student outcomes with random assignment. E-Journal of Business Education and Scholarship. 2018;12(2):1–23.
- Ary EJ, Brune CW. A comparison of student learning outcomes in traditional and online personal finance courses. MERLOT Journal of Online Learning and Teaching. 2011;7(4):465–474.
- Ashby J, Sadera WA, McNary SW. Comparing student success between developmental math courses offered online, blended, and face-to-face. Journal of Interactive Online Learning. 2011;10(3):128–140.
- Atchley W, Wingenbach G, Akers C. Comparison of course completion and student performance through online and traditional courses. The International Review of Research in Open and Distance Learning. 2013;14(4):104–116. doi: 10.19173/irrodl.v14i4.1461.
- Bean JP, Metzner BS. A conceptual model of nontraditional undergraduate student attrition. Review of Educational Research. 1985;55(4):485–540. doi: 10.3102/00346543055004485.
- Bernard RM, Abrami PC, Lou Y, Borokhovsk E, Wade A, Wozney L, Huang B. How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research. 2004;74(3):379–439. doi: 10.3102/00346543074003379.
- Bettinger EP, Fox L, Loeb S, Taylor ES. Virtual classrooms: How online college courses affect student success. American Economic Review. 2017;107(9):2855–2875. doi: 10.1257/aer.20151193.
- Boston, W. E., & Ice, P. (2011). Assessing retention in online learning: An administrative perspective. Online Journal of Distance Learning Administration, 14(2).
- Bowen WG, Chingos MM, Lack KA, Nygren TI. Interactive learning online at public universities: Evidence from a six-campus randomized trial. Journal of Policy Analysis and Management. 2014;33(1):94–111. doi: 10.1002/pam.21728.
- Brown JLM. Online learning: A comparison of web-based and land-based courses. Quarterly Review of Distance Education. 2012;13(1):39–42.
- Buchanan TC, Palmer E. Role immersion in a history course: Online versus face-to-face in reacting to the past. Computers & Education. 2017;108:85–95. doi: 10.1016/j.compedu.2016.12.008.
- Burnette DM. Negotiating the mine field: Strategies for effective online education administrative leadership in higher education institutions. Quarterly Review of Distance Education. 2015;16(3):13–25.
- Burns K, Duncan M, Sweeney DC, II, North JW, Ellegood WA. A longitudinal comparison of course delivery modes of an introductory information systems course and their impact on a subsequent information systems course. MERLOT Journal of Online Learning and Teaching. 2013;9(4):453–467.
- Carr S. As distance education comes of age, the challenge is keeping the students. Chronicle of Higher Education. 2000;46(23):A39–A41.
- Cavanaugh J, Jacquemin S. A large sample comparison of grade based student learning outcomes in online vs face-to-face courses. Online Learning. 2015;19(2):n2. doi: 10.24059/olj.v19i2.454.
- Cochran JD, Campbell SM, Baker HM, Leeds EM. The role of student characteristics in predicting retention in online courses. Research in Higher Education. 2014;55:27–48. doi: 10.1007/s11162-013-9305-8.
- Conway KM, Wladis CW, Hachey AC. Minority student access in the online environment. HETS Hispanic Educational Technology Services Online Journal. 2011;2(1):52–77.
- Dabbagh N. The online learner: Characteristics and pedagogical implications. Contemporary Issues in Technology and Teacher Education (CITE). 2007;7(3):217–226.
- Daymont T, Blau G. Deciding between traditional and online formats: Exploring the role of learning advantages, flexibility and compensatory adaptation. Journal of Behavioral and Applied Management. 2011;12(2):156–175.
- Dell CA, Low C, Wilker JF. Comparing student achievement in online and face-to-face class formats. MERLOT Journal of Online Learning and Teaching. 2010;6(1):30–42.
- de Souza MT, Dias da Silva M, de Carvalho R. Integrative review: What is it? How to do it? Einstein. 2010;8(1):102–106. doi: 10.1590/S1679-45082010RW1134.
- Diaz, D. P. (2002). Online drop rates revisited. Technology Source. Retrieved from http://technologysource.org/article/online_drop_rates_revisited/.
- Driscoll A, Jicha K, Hunt AN, Tichavsky L, Thompson G. Can online courses deliver in-class results? A comparison of student performance and satisfaction in an online versus a face-to-face introductory sociology course. Teaching Sociology. 2012;40(4):312–331. doi: 10.1177/0092055X12446624.
- Dutton J, Dutton M, Perry J. How do online students differ from lecture students? Journal for Asynchronous Learning Networks. 2002;6(1):1–20.
- Enriquez, A. (2010). Assessing the effectiveness of synchronous content delivery in an online introductory circuits analysis course. In Proceedings of the annual conference of the American Society for Engineering Education, Louisville, Kentucky, June 2010 (pp. 15.27.2–15.27.14). https://peer.asee.org/assessing-the-effectiveness-of-dual-delivery-mode-in-an-online-introductory-circuits-analysis-course.
- Epper RM, Garn M. Virtual college and university consortia: A national study. Instructional Technology Council; 2003.
- Faidley, J. (2018). Comparison of learning outcomes from online and face-to-face accounting courses. Electronic Theses and Dissertations. Paper 3434. Retrieved from https://dc.etsu.edu/etd/3434.
- Fendler RJ, Ruff C, Shrikhande MM. No significant difference—unless you are a jumper. Online Learning. 2018;22(1):39–60. doi: 10.24059/olj.v22i1.887.
- Fetzner M. What do unsuccessful online students want us to know? Journal of Asynchronous Learning Networks. 2013;17(1):13–27.
- Figlio D, Rush M, Yin L. Is it live or is it Internet? Experimental estimates of the effects of online instruction on student learning. Journal of Labor Economics. 2013;31(4):763–784. doi: 10.1086/669930.
- Fish K, Kang H-G. Learning outcomes in a stress management course: Online versus face-to-face. MERLOT Journal of Online Learning and Teaching. 2014;10(2):179–191.
- Ford K, Vignare K. The evolving military learner population: A review of the literature. Online Learning. 2015;19:7–30. doi: 10.24059/olj.v19i1.503.
- Fox, H. L. (2017). What motivates community college students to enroll online and why it matters. Insights on equity and outcomes, 19. Office of Community College Research and Leadership. Retrieved from https://files.eric.ed.gov/fulltext/ED574532.pdf.
- Francis MK, Wormington SV, Hulleman C. The costs of online learning: Examining differences in motivation and academic outcomes in online and face-to-face community college developmental mathematics courses. Frontiers in Psychology. 2019;10:2054. doi: 10.3389/fpsyg.2019.02054.
- Garman, D. E. (2012). Student success in face-to-face and online sections of biology courses at a community college in east Tennessee. (3515024). Electronic Theses and Dissertations. Paper 1408. Retrieved from https://dc.etsu.edu/etd/1408.
- Goomas DT, Clayton A. New-to-college “academic transformation” distance learning: A paradox. Community College Journal of Research and Practice. 2013;37(11):915–918. doi: 10.1080/10668926.2012.718712.
- Gregory, C. B. (2016). Community college student success in online versus equivalent face-to-face courses. Electronic Theses and Dissertations. Paper 3007. Retrieved from https://dc.etsu.edu/etd/3007.
- Hachey AC, Wladis CW, Conway KM. Is the second time the charm? Investigating trends in online re-enrollment, retention and success. Journal of Educators Online. 2012;9(1):n1. doi: 10.9743/JEO.2012.1.2.
- Hachey AC, Wladis CW, Conway KM. Balancing retention and access in online courses: Restricting enrollment… Is it worth the cost? Journal of College Student Retention: Research, Theory & Practice. 2013;15(1):9–36. doi: 10.2190/CS.15.1.b.
- Hachey AC, Wladis CW, Conway KM. Do prior online course outcomes provide more information than GPA alone in predicting subsequent online course grades and retention? An observational study at an urban community college. Computers & Education. 2014;72:59–67. doi: 10.1016/j.compedu.2013.10.012.
- Hachey AC, Wladis CW, Conway KM. Prior online course experience and GPA as predictors of subsequent online STEM course outcomes. The Internet and Higher Education. 2015;25:11–17. doi: 10.1016/j.iheduc.2014.10.003.
- Harrell IL, Bower BL. Student characteristics that predict persistence in community college online courses. American Journal of Distance Education. 2011;25(3):178–191. doi: 10.1080/08923647.2011.590107.
- Hart C. Factors associated with student persistence in an online program of study: A review of the literature. Journal of Interactive Online Learning. 2012;11(1):1541–4914.
- Hart CM, Friedmann EA, Hill M. Online course-taking and student outcomes in California community colleges. Education Finance and Policy. 2018;13(1):42–71. doi: 10.1162/edfp_a_00218.
- Helms JL. Comparing student performance in online and face-to-face delivery modalities. Journal of Asynchronous Learning Networks. 2014;18(1):n1.
- Holbert, E. (2017). Comparison of traditional face-to-face and online student performance in two online-delivered engineering technical electives. Paper presented at the American Society for Engineering Conference [ASEE], Tempe, Arizona, April 2017. Paper ID: 20668. Retrieved from https://strategy.asee.org/comparison-of-traditional-face-to-face-and-online-student-performance-in-two-online-delivered-engineering-technical-electives.
- Holmes CM, Reid C. A comparison study of on-campus and online learning outcomes for a research methods course. The Journal of Counselor Preparation and Supervision. 2017. doi: 10.7729/92.1182.
- Huntington-Klein N, Cowan J, Goldhaber D. Selection into online community college courses and their effects on persistence. Research in Higher Education. 2017;58(3):244–269. doi: 10.1007/s11162-016-9425-z.
- Ice P, Diaz S, Swan K, Burgess M, Sharkey M, Sherrill J, Huston D, Okimoto H. The PAR framework proof of concept: Initial findings from a multi-institutional analysis of federated postsecondary data. Journal of Asynchronous Learning Networks. 2012;16(3):63–86.
- Instructional Technology Council. (2010). 2010 distance education survey results: Trends in eLearning: Tracking the impact of eLearning at community colleges. (No. 66). Instructional Technology Council. Retrieved from http://www.itcnetwork.org/attachments/article/66/ITCSurveyResultsMay2011Final.pdf.
- Jaggars, A., & Bailey, T. (2010). Effectiveness of fully online courses for college students: A response to a department of education meta-analysis. Teachers College Columbia University, Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/publications/effectiveness-fully-online-courses.html.
- Jaggars, S. S. (2011). Online learning: Does it help low-income and underprepared students? Community College Research Center, Paper No. 26. https://files.eric.ed.gov/fulltext/ED515135.pdf
- Jaggars, S. S. (2013). Choosing between online and face-to-face courses: Community college student voices. Community College Research Center, Columbia University, Working paper No. 58. https://anitacrawley.net/Resources/Reports/Online-Demand-Student-Voices.pdf
- Jaggars SS. Choosing between online and face-to-face courses: Community college student voices. American Journal of Distance Education. 2014;28(1):27–38. doi: 10.1080/08923647.2014.867697.
- Jaggars, S. S., Edgecombe, N., & West Stacey, G. (2013). What we know about online course outcomes. Research Review. Teachers College Columbia University, Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/publications/what-we-know-online-course-outcomes.html.
- Jaggars, S., & Xu, D. (2010). Online learning in the Virginia community college system. Teachers College Columbia University, Community College Research Center. Retrieved from https://ccrc.tc.columbia.edu/publications/online-learning-virginia.html.
- James S, Swan K, Daston C. Retention, progression and the taking of online courses. Online Learning. 2016;20(2):75–96.
- Johnson J, Galy E. The use of E-learning tools for improving Hispanic students’ academic performance. MERLOT Journal of Online Learning & Teaching. 2013;9(3):328–340.
- Johnson, H., & Mejia, M. C. (2014). Online learning and student outcomes in California’s community colleges. Public Policy Institute of California. Retrieved from http://www.ppic.org/content/pubs/report/R_514HJR.pdf.
- Johnson, D. M., & Palmer, C. C. (2015). Comparing student assessments and perceptions of online and face-to-face versions of an introductory linguistics course. Online Learning Journal, 19(2).
- Jones, E. H. (2010). Exploring common characteristics among community college students: Comparing online and traditional student success. Dissertation. Retrieved from https://libres.uncg.edu/ir/asu/f/Jones,%20Elizabeth_2010_Dissertation.pdf.
- Jones SJ, Long VM. Learning equity between online and on-site mathematics courses. MERLOT Journal of Online Learning and Teaching. 2013;9(1):1–12.
- Jorczak, R. L., & Dupuis, D. N. (2014). Differences in classroom versus online exam performance due to asynchronous discussion. Journal of Asynchronous Learning Network, 18(2).
- Kauffman H. A review of predictive factors of student success in and satisfaction with online learning. Research in Learning Technology. 2015. doi: 10.3402/rlt.v23.26507.
- Kaupp R. Online penalty: The impact of online instruction on the Latino-White achievement gap. Journal of Applied Research in the Community College. 2012;19(2):8–16.
- Kember D. Open learning courses for adults: A model of student progress. Educational Technology; 1995.
- Keramidas CG. Are undergraduates ready for online learning? A comparison of online and face-to-face sections of a course. Rural Special Education Quarterly. 2012;31(4):25–32. doi: 10.1177/875687051203100405.
- Kinder, A. M. (2013). Completion rates in West Virginia community and technical colleges. Graduate Theses, Dissertations, and Problem Reports. 3636. Retrieved from https://researchrepository.wvu.edu/etd/3636.
- Krajewski, S. (2015). Retention of community college students in online courses. Dissertations. 1180. Retrieved from https://scholarworks.wmich.edu/dissertations/1180.
- Lack, K. A. (2013). Current status of research on online learning in postsecondary education. Ithaka S+R. Retrieved from https://sr.ithaka.org/wp-content/uploads/2015/08/ithaka-sr-online-learning-postsecondary-education-may2012.pdf.
- Lederman, D. (2019). Online enrollment grows, but pace slows. Inside Higher Ed. Retrieved from https://www.insidehighered.com/digital-learning/article/2019/12/11/more-students-study-online-rate-growth-slowed-2018.
- Lee, Y., & Choi, J. (2011). A review of online course dropout research: Implications for practice and future research. Educational Technology Research and Development, 59(5), 593–618. https://doi.org/10.1007/s11423-010-9177-y
- Lee, Y., Choi, J., & Kim, T. (2013). Discriminating factors between completers of and dropouts from online learning courses. British Journal of Educational Technology, 44(2), 328–337. https://doi.org/10.1111/j.1467-8535.2012.01306.x
- Lehman, R. M., & Conceição, S. C. (2013). Motivating and retaining online students: Research-based strategies that work. Wiley.
- Lei, S. A., & Gupta, R. K. (2010). College distance education courses: Evaluating benefits and costs from institutional, faculty and student perspectives. Education, 130(4), 616–631.
- Lint, A. (2013). Academic persistence of online students in higher education impacted by student progress factors and social media. Online Journal of Distance Learning Administration, 16(4).
- Lutzer, D. J., Rodi, S. B., Kirkman, E. E., & Maxwell, J. W. (2007). Statistical abstract of undergraduate programs in the mathematical sciences in the United States: Fall 2005 CBMS survey. American Mathematical Society.
- Lyke, J., & Frank, M. (2012). Comparison of student learning outcomes in online and traditional classroom environments in a psychology course. Journal of Instructional Psychology, 39(4), 245–250.
- McCutcheon, K., Lohan, M., Traynor, M., & Martin, D. (2015). A systematic review evaluating the impact of online or blended learning vs. face-to-face learning of clinical skills in undergraduate nurse education. Journal of Advanced Nursing, 71(2), 255–270. https://doi.org/10.1111/jan.12509
- McDonough, C., Roberts, R. P., & Hummel, J. (2014). Online learning: Outcomes and satisfaction among underprepared students in an upper-level psychology course. Online Journal of Distance Learning Administration, 17(3).
- Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2010). Evaluation of evidence based practices in online learning: A meta-analysis and review of online learning studies. U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. Retrieved from http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf.
- Meder, C. (2013). Counselor education delivery modalities: Do they affect student learning outcomes (Doctoral dissertation). ProQuest Dissertations and Theses database. UMI Number: 3573598. Retrieved from https://www.proquest.com/docview/1442476250.
- Mollenkopf, D., Vu, P., Crow, S., & Black, C. (2017). Does online learning deliver? A comparison of student teacher outcomes from candidates in face-to-face and online program pathways. Online Journal of Distance Learning Administration, 20(1).
- Money, W. H., & Dean, B. P. (2019). Incorporating student population differences for effective online education: A content-based review and integrative model. Computers & Education, 138, 57–82. https://doi.org/10.1016/j.compedu.2019.03.013
- Moore, J. C., & Fetzner, M. J. (2009). The road to retention: A closer look at institutions that achieve high course completion rates. Journal of Asynchronous Learning Networks, 13(3), 3–22. https://files.eric.ed.gov/fulltext/EJ862352.pdf
- Morris, L. V., & Finnegan, C. L. (2009). Best practices in predicting and encouraging student persistence and achievement online. Journal of College Student Retention: Research, Theory & Practice, 10(1), 55–64. https://doi.org/10.2190/CS.10.1.e
- Muljana, P. S., & Luo, T. (2019). Factors contributing to student retention in online learning and recommended strategies for improvement: A systematic literature review. Journal of Information Technology Education: Research, 18, 19–57. https://doi.org/10.28945/4182
- Muse, H. E., Jr. (2003). The web-based community college student: An examination of factors that lead to success and risk. Internet and Higher Education, 6(3), 241–261. https://doi.org/10.1016/S1096-7516(03)00044-7
- National Center for Education Statistics [NCES]. (1996). Digest of education statistics 1996. U.S. Department of Education.
- National Center for Education Statistics [NCES]. (2002). The condition of education 2002. U.S. Department of Education.
- National Center for Education Statistics [NCES]. (2013). The condition of education at a glance: Distance education in post-secondary institutions. U.S. Department of Education.
- Nemetz, P. L., Eager, W. M., & Limpaphayom, W. (2017). Comparative effectiveness and student choice for online and face-to-face classwork. Journal of Education for Business, 92(5), 210–219. https://doi.org/10.1080/08832323.2017.1331990
- Nguyen, T. (2015). The effectiveness of online learning: Beyond no significant difference and future horizons. MERLOT Journal of Online Learning and Teaching, 11(2), 309–313.
- Nora, A., & Cabrera, A. F. (1996). The role of perceptions in prejudice and discrimination and the adjustment of minority students to college. Journal of Higher Education, 67(2), 119–148. https://doi.org/10.2307/2943977
- Nora, A., & Snyder, B. P. (2009). Technology and higher education: The impact of e-learning approaches on student academic achievement, perceptions and persistence. Journal of College Student Retention: Research, Theory & Practice, 10(1), 3–19. https://doi.org/10.2190/CS.10.1.b
- Ortagus, J. C. (2017). From the periphery to prominence: An examination of the changing profile of online students in American higher education. Internet and Higher Education, 32, 47–57. https://doi.org/10.1016/j.iheduc.2016.09.002
- Packham, G., Jones, G., Miller, C., & Thomas, B. (2004). E-learning and retention: Key factors influencing student withdrawal. Education & Training, 46(6/7), 335–342. https://doi.org/10.1108/00400910410555240
- Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., et al. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. PLoS Medicine, 18(3), e1003583. https://doi.org/10.1371/journal.pmed.1003583
- Palacios, A. M. G., & Wood, J. L. (2016). Is online learning the silver bullet for men of color? An institutional-level analysis of the California community college system. Community College Journal of Research and Practice, 40(8), 643–655. https://doi.org/10.1080/10668926.2015.1087893
- Pao, T. C. (2016). Nontraditional student risk factors and gender as predictors for enrollment in college distance education (Doctoral dissertation). https://doi.org/10.36837/chapman.000009
- Park, J. (2007). Factors related to learner dropout in online learning. Paper presented at the International Research Conference in the Americas of the Academy of Human Resource Development. Retrieved from http://eric.ed.gov/?id=ED504556.
- Park, J., & Choi, H. J. (2009). Factors influencing adult learners’ decision to drop out or persist in online learning. Educational Technology and Society, 12(4), 207–217.
- Parsad, B., Lewis, L., & Tice, P. (2008). Distance education at degree-granting postsecondary institutions: 2006–2007. (NCES 2009-044). U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubs2009/2009044.pdf.
- Patterson, B., & McFadden, C. (2009). Attrition in online and campus degree programs. Online Journal of Distance Learning Administration, 12(2).
- Paulson, J., & McCormick, A. C. (2020). Reassessing disparities in online learner student engagement in higher education. Educational Researcher, 49(1), 20–29. https://doi.org/10.3102/0013189X19898690
- Pearson Foundation. (2011). Community college student survey: Summary of results. Pearson Foundation.
- Picciano, A. G., Seaman, J., & Allen, I. E. (2010). Educational transformation through online learning: To be or not to be. Journal of Asynchronous Learning Networks, 14(4), 17–35.
- Pontes, M. C. F., Hasit, C., Pontes, N. M. H., Lewis, P. A., & Siefring, K. T. (2010). Variables related to undergraduate students’ preference for distance education classes. Online Journal of Distance Learning Administration, 13(2).
- Rich, A. J., & Dereshiwsky, M. I. (2011). Assessing the comparative effectiveness of teaching undergraduate intermediate accounting in the online classroom format. Journal of College Teaching & Learning, 8(9), 19–28. https://doi.org/10.19030/tlc.v8i9.5641
- Rovai, A. P. (2002). Building sense of community at a distance. International Review of Research in Open and Distance Learning, 3(1).
- Rovai, A. P. (2003). In search of higher persistence rates in distance education online programs. Internet and Higher Education, 6(1), 1–16. https://doi.org/10.1016/S1096-7516(02)00158-6
- Rovai, A. P., & Downey, J. R. (2010). Why some distance education programs fail while others succeed in a global environment. The Internet and Higher Education, 13(3), 141–147. https://doi.org/10.1016/j.iheduc.2009.07.001
- Ryabov, I. (2012). The effect of time online on grades in online sociology courses. MERLOT Journal of Online Learning and Teaching, 8(1), 13–23.
- Schmidt, S. (2012). The rush to online: Comparing students' learning outcomes in online and face-to-face accounting courses. ProQuest Dissertations & Theses Global (1039150240). Retrieved from https://search.proquest.com/docview/1039150240.
- Shachar, M., & Neumann, Y. (2003). Differences between traditional and distance education academic performances: A meta-analytic approach. International Review of Research in Open and Distributed Learning, 4(2), 1–20.
- Shea, P., & Bidjerano, T. (2014). Does online learning impede degree completion? A national study of community college students. Computers & Education, 75, 103–111. https://doi.org/10.1016/j.compedu.2014.02.009
- Shea, P., & Bidjerano, T. (2016). A national study of differences between online and classroom-only community college students in time to first associate degree attainment, transfer, and dropout. Online Learning, 20(3), 14–15. https://doi.org/10.24059/olj.v20i3.984
- Shea, P. & Bidjerano, T. (2017). Online learning in community colleges of the State University of New York: Initial results on differences between classroom-only and online learners. 2017 American Educational Research Association Annual Meeting. http://www.sunyresearch.net/hplo/wp-content/uploads/2017/06/AERA-2017-Final-1.pdf.
- Shea, P., & Bidjerano, T. (2018). Online course enrollment in community college and degree completion: The tipping point. The International Review of Research in Open and Distributed Learning, 19(2). https://doi.org/10.19173/irrodl.v19i2.3460
- Shea, P., & Bidjerano, T. (2019). Effects of online course load on degree completion, transfer, and dropout among community college students. Online Learning, 23(4), 6–22. https://doi.org/10.24059/olj.v23i4.1364
- Skopek, T. A., & Schuhmann, R. A. (2008). Traditional and non-traditional students in the same classroom? Additional challenges of the distance education environment. Online Journal of Distance Learning Administration, 11(1), Article 6.
- Smith, B. (2010). E-learning technologies: A comparative study of adult learners enrolled on blended and online campuses engaging in a virtual classroom (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses Database. https://search.proquest.com/docview/746605945.
- Smith, N. D. (2016). Examining the effects of college courses on student outcomes using a joint nearest neighbour matching procedure on a state-wide university system. North Carolina State University.
- Smith, G., & Ferguson, D. (2005). Student attrition in mathematics e-learning. Australasian Journal of Educational Technology, 21(3), 323–334. https://doi.org/10.14742/ajet.1323
- Soffer, T., & Nachmias, R. (2018). Effectiveness of learning in online academic courses compared with face-to-face courses in higher education. Journal of Computer Assisted Learning, 34(5), 534–543. https://doi.org/10.1111/jcal.12258
- Stack, S. (2015). Learning outcomes in an online vs traditional course. International Journal for the Scholarship of Teaching and Learning, 9(1), Article 5. https://doi.org/10.20429/ijsotl.2015.090105
- Stevens, T., & Switzer, C. (2006). Differences between online and traditional students: A study of motivational orientation, self-efficacy, and attitudes. Turkish Online Journal of Distance Education, 7(2), Article 8.
- Strang, K. D. (2017). Beyond engagement analytics: Which online mixed-data factors predict student learning outcomes? Education and Information Technologies, 22, 917–937. https://doi.org/10.1007/s10639-016-9464-2
- Street, H. (2010). Factors influencing a learner’s decision to drop-out or persist in higher education distance learning. Online Journal of Distance Learning Administration, 13(4), Article 4.
- Streich, F. E. (2014). Online education in community colleges: Access, school success, and labor-market outcomes. Dissertation. Retrieved from https://deepblue.lib.umich.edu/bitstream/handle/2027.42/108944/fstreich_1.pdf?sequence=1&isAllowed=y.
- Stocker, B. L. (2018). Transitioning from on-campus to online in a master of science nursing program: A comparative study of academic success. American Journal of Distance Education, 32(2), 113–130. https://doi.org/10.1080/08923647.2018.1443371
- Sublett, C. (2019). Examining distance education coursetaking and time-to-completion among community college students. Community College Journal of Research and Practice, 43(3), 201–215. https://doi.org/10.1080/10668926.2018.1453889
- Sublett, C. (2019). What do we know about online coursetaking, persistence, transfer, and degree completion among community college students? Community College Journal of Research and Practice, 43(12), 813–828. https://doi.org/10.1080/10668926.2018.1530620
- Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., & Shaw, S. M. (2006). Teaching courses online: A review of the research. Review of Educational Research, 76(1), 93–135. https://doi.org/10.3102/00346543076001093
- Tanyel, F., & Griffin, J. (2014). A ten-year comparison of outcomes and persistence rates in online versus face-to-face courses. B>Quest (2014). Retrieved from https://www.westga.edu/~bquest/2014/onlinecourses2014.pdf.
- Tinto, V. (1975). Dropout from higher education: A theoretical synthesis of recent research. Review of Educational Research, 45(1), 89–125. https://doi.org/10.3102/00346543045001089
- Tinto, V. (1986). Theories of student departure revisited. In J. C. Smart (Ed.), Higher education handbook of theory and research (pp. 359–384). Agathon Press.
- Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). University of Chicago Press.
- Tinto, V. (2016, September 26). From retention to persistence. Inside Higher Ed. Retrieved from https://www.insidehighered.com/views/2016/09/26/how-improve-student-persistence-and-completion-essay.
- Tyler-Smith, K. (2006). Early attrition among online E-learners: A review of factors that contribute to dropout, withdrawal and non-completion rates of adult learners undertaking E-learning programs. Journal of Online Learning and Teaching, 2(2), 73–85.
- U.S. Department of Education. (2015). Demographic and enrollment characteristics of nontraditional undergraduates: 2011–2012. NCES 2015-025. Retrieved from https://nces.ed.gov/pubs2015/2015025.pdf.
- Vernadakis, N., Giannousi, M., Tsitskari, E., Antoniou, P., & Kioumourtzoglou, E. (2012). A comparison of student satisfaction between traditional and blended technology course offerings in physical education. Turkish Online Journal of Distance Education, 13(1), 137–147.
- Wagner, S. C., Garippo, S. J., & Lovaas, P. (2011). A longitudinal comparison of online versus traditional instruction. MERLOT Journal of Online Learning and Teaching, 7(1), 30–42.
- Wallace, R. M. (2003). Online learning in higher education: A review of research on interactions among teachers and students. Education, Communication & Information, 3(2), 241–280. https://doi.org/10.1080/14636310303143
- Wang, C., Shannon, D. M., & Ross, M. E. (2013). Students’ characteristics, self-regulated learning, technology self-efficacy, and course outcomes in online learning. Distance Education, 34(3), 302–323. https://doi.org/10.1080/01587919.2013.835779
- Wilson, D., & Allen, D. (2018). Success rates of online versus traditional college students. Research in Higher Education Journal, 14, 1–9.
- Wladis, C. W., Conway, K. M., & Hachey, A. C. (2015). The online STEM classroom—Who succeeds? An exploration of the impact of ethnicity, gender, and non-traditional student characteristics in the community college context. Community College Review, 43(2), 142–164. https://doi.org/10.1177/0091552115571729
- Wladis, C., Hachey, A. C., & Conway, K. M. (2015). Which STEM majors enroll in online courses, and why should we care? The impact of ethnicity, gender, and non-traditional student characteristics. Computers & Education, 87, 285–308. https://doi.org/10.1016/j.compedu.2015.06.010
- Wladis, C. W., Conway, K. M., & Hachey, A. C. (2015). The representation of minority, female, and non-traditional STEM majors in the online environment at community colleges: A nationally representative study. Community College Review, 43(1), 89–114. https://doi.org/10.1177/0091552114555904
- Wladis, C. W., Conway, K. M., & Hachey, A. C. (2016). Assessing readiness for online education—Research models for identifying students at risk. Online Learning Journal, 20(3), 97–109.
- Wladis, C. W., & Samuels, J. (2016). Do online readiness surveys do what they claim? Validity, reliability, and subsequent student enrollment decisions. Computers & Education, 98, 39–56. https://doi.org/10.1016/j.compedu.2016.03.001
- Wojciechowski, A., & Bierlein Palmer, L. (2005). Individual student characteristics: Can any be predictors of successes in online classes? Online Journal of Distance Learning Administration, 8.
- Woodley, A., de Lang, P., & Tanewski, G. A. (2001). Student progress in distance education: Kember revisited. Open Learning, 16(2), 113–131. https://doi.org/10.1080/02680510123105
- Wright, L. (2015). Identifying successful online adult learners. Walden Dissertations and Doctoral Studies. 1430. Retrieved from https://scholarworks.waldenu.edu/dissertations/1430.
- Wu, D. D. (2015). Online learning in post-secondary education: A review of the literature 2013–2014. ITHAKA S+R. Retrieved from https://sr.ithaka.org/wp-content/uploads/2015/08/SR_Report_Online_Learning_Postsecondary_Education_Review_Wu_031115.pdf.
- Wuellner, M. R. (2013). Student learning and instructor investment in online and face-to-face natural resources courses. Natural Sciences Education, 42(1), 14–23. https://doi.org/10.4195/nse.2012.0023
- Xu, D., & Jaggars, S. S. (2011). Online and hybrid course enrollment and performance in Washington state community and technical colleges. Teachers College Columbia University, Community College Research Center.
- Xu, D., & Jaggars, S. S. (2011). The effectiveness of distance education across Virginia's community colleges: Evidence from introductory college-level Math and English courses. Educational Evaluation and Policy Analysis, 33(3), 360–377. https://doi.org/10.3102/0162373711413814
- Xu, D., & Jaggars, S. S. (2013). Adaptability to online learning: Differences across types of students and academic subject areas. (No. CCRC Working Paper No. 54). Teachers College Columbia University, Community College Research Center. Retrieved from http://ccrc.tc.columbia.edu/media/k2/attachments/adaptability-to-online-learning.pdf.
- Xu, D., & Jaggars, S. S. (2014). Performance gaps between online and face-to-face courses: Differences across types of students and academic subject areas. The Journal of Higher Education, 85(5), 633–659. https://doi.org/10.1080/00221546.2014.11777343
- Xu, D., & Xu, Y. (2019). The promises and limits of online higher education: Understanding how distance education affects access, cost, and quality. American Enterprise Institute.
- Ya Ni, A. (2013). Comparing the effectiveness of classroom and online learning: Teaching research methods. Journal of Public Affairs Education, 19(2), 199–215. https://doi.org/10.1080/15236803.2013.12001730
- Yeboah, A. K., & Smith, P. (2016). Relationships between minority students' online learning experiences and academic performance. Online Learning, 20(4), 577.
- Zhao, Y., Lei, J., Yan, B., Lai, C., & Tan, H. S. (2005). What makes the difference? A practical analysis of research on the effectiveness of distance education. Teachers College Record, 107(8), 1836–1884. https://doi.org/10.1111/j.1467-9620.2005.00544.x

