TechTrends. 2021 Feb 6;65(4):576–588. doi: 10.1007/s11528-021-00587-8

Community College Student Perceptions of Remote Learning Shifts Due to COVID-19

Christopher Prokes 1, Jacqueline Housel 2
PMCID: PMC7865100  PMID: 33585838

Abstract

COVID-19 challenged higher education to rapidly shift to remote course delivery. This study surveyed community college students (N = 356) about their confidence in completing learning-related tasks before and after the shift, access to technologies used in remote learning, and disruptions that impacted their learning. Results indicated notable declines in confidence across all demographics, with significant changes among those aged 18–21 and those without prior online course experience. Technology use for remote courses centered primarily on laptops and smartphones. Students reported that the most significant changes to work-life balance came through employment changes and mental health issues. Instructional changes were both positive and negative in workload, organization, course delivery, communication and technology. Institutions can use this study’s findings to enact contingency planning, expand online and blended course options, refine academic and social support, and allocate resources to mental health.

Keywords: Remote learning, COVID-19, eLearning, Online learning, Perceptions, Self-efficacy


While higher education is certainly no stranger to adapting to societal shifts and even responding to traumatic events, the COVID-19 (coronavirus) pandemic challenged such adaptations in both immediacy and extent. In responding to various demands, higher education is most often the laggard (Pincus et al. 2017). However, COVID-19 forced an expeditious transition from in-person and blended instruction to fully remote delivery (Kim 2020). The pace of the transition saw historic institutional resource commitment in several peripheral areas, such as the acquisition of synchronous web conferencing technology (such as Zoom), the loaning of hardware and software licenses for faculty and students and an exponential increase in learning management system (LMS) usage (Flaherty 2020). While the stress of this quick transition was certainly felt by faculty, those with experience and training in multiple teaching modalities (e.g. online and blended classrooms) more effectively re-tooled courses.

In this transition, much of the focus has been on faculty (and other college staff) and their role in making the pivot (Bal et al. 2020). Missing are the perceptions of students. Once courses were retooled and classes were ‘live,’ students were left to complete them in this new remote learning format (Grajek 2020). This transition from face-to-face to remote learning should also be examined from the students’ perspective. Prior to COVID-19, research found that many students took in-person or blended/mixed classes because they preferred these over online courses (Jaggars 2014; Kemp and Grieve 2014; Malarkodi et al. 2018). Since remote delivery of existing classes was mandatory, understanding student perceptions of the transition can help practitioners plan for more robust use of remote instruction, regardless of whether the approach becomes normalized.

The purpose of this survey research was to assess student perceptions (N = 356) of the transition of existing face-to-face or blended courses to remote learning in the spring 2020 academic term. Specifically, we focused on views of the transition, confidence with remote learning, technology used and its limitations, and issues regarding work-life balance from before to after the switch. We addressed the following research questions:

  1. How do student perceptions of confidence in learning tasks before the COVID-19 remote learning transition compare to those after the transition?

  2. What technologies did students employ to engage with remote learning courses?

  3. What changes in student work-life balance affected their participation in remote learning courses?

Literature Review

At the research site, remote learning was defined as a course delivered through synchronous means (such as Zoom) at scheduled times on specific days (Sinclair College 2020). Effectively, an instructor would deliver the course on the same schedule as if it were offered in a traditional face-to-face or blended format and use the institution’s learning management system for assignment submission, quizzes/exams and other course materials. Although remote instruction as the delivery of synchronous, real-time interaction to many students is not new (White et al. 2010), the literature contains little research on the effects of a rapid transition to remote learning. Still, as a system, education is not limited to the physical contexts of a traditional setting on a college campus (Frick 2020). To that end, we situate this study in research on the conversion of face-to-face courses to other delivery methods (i.e. online, blended, etc.), studies of self-efficacy and confidence, the use of technology in eLearning courses and the impact of household or peripheral situations on how students complete coursework.

Transition of Courses to Other Modalities

Numerous examples in the literature detail the transition of in-person courses to online and other modalities, which can help in understanding what occurred during the shift to remote instruction. The advent of the internet led educators to design courses online, which effectively replaced other forms of distance education (Carr-Chelman and Duchastel 2001), such as correspondence or video courses, by connecting users across any distance (Conceicao 2006). Even so, many challenges exist in converting a face-to-face course to an online format in the middle of the semester. No ‘one size fits all’ approach exists (Gillet-Swan 2017). Much work has been directed towards the quality of these converted courses and their impact on students (Pentina and Neely 2007; Xu and Jaggars 2013).

Educators continually compare face-to-face and online instruction, particularly how delivery method may affect students taking the course. From a student outcome perspective, Xu and Jaggars (2013) found that students enrolled in face-to-face courses had both higher persistence and better outcome measures than those enrolled in online courses. Bettinger et al. (2017) found a similar result when examining performance based on grade point average (GPA) and persistence. Uniquely, the Bettinger et al. (2017) study described all aspects of course delivery as identical between online and in-person courses, including the professor, class materials and class size. That study more closely resembles what happened during the rapid conversion in the spring semester. One might assume that remote courses would see reduced performance as well, though the present study focuses not on success but on perceptions and confidence.

Self-Efficacy of Learners

Self-efficacy is defined as the belief in one’s ability to succeed in a given task (Bandura 1977). Bandura (1986) argues that self-efficacy is a primary factor in determining the success of an individual. Those with higher self-efficacy report higher confidence and experience a lower likelihood of failure. According to Bandura (1997), four factors affect self-efficacy: (1) mastery experiences, in which repeated effort produces higher confidence; (2) vicarious experiences, in which a mentor guides navigation of a challenge; (3) verbal persuasion, consisting of being told success is possible; and (4) physiological states, in which completing a challenging task leads to positive emotions (and vice versa). These factors lay the groundwork for many instruments measuring self-efficacy, particularly as it relates to teaching and learning (Chen et al. 2001; Gaumer Erickson et al. 2018; Schwarzer and Jerusalem 1995; Sherer et al. 1982). Our study is based largely on these constructs.

The concept of self-efficacy as it relates to online courses was slow to gain traction (Hodges 2008), but more recently researchers have used self-efficacy to better understand online courses (Alqurashi 2016). Studies with an online course focus tend to examine self-efficacy in relation to a specific aspect of learning, such as internet self-efficacy (Kuo et al. 2014), use of computers in learning (Lim 2001) or self-efficacy as a standalone construct (Artino 2007). The results of studies that connect computer self-efficacy and performance often conflict. For example, Jan (2015) found a positive relationship between self-efficacy in using a computer and overall academic efficacy. In contrast, Puzziferro (2008) studied 815 community college students and found no correlation between performance and self-efficacy with technology. Lee and Witta (2001) found self-efficacy with course technologies did not predict later performance, although self-efficacy and course satisfaction did improve over the term. These conflicting findings could be attributed to any number of factors, such as student educational experience, the ways different institutions design and deliver online courses or the faculty-student relationship in the course.

One question is whether prior online experience is a factor in one’s confidence to succeed. Not surprisingly, Zimmerman and Kulikowich (2016) found higher learning self-efficacy in students with prior online experience. Although they examined multiple tasks in a class, the authors found the largest between-groups difference concerned learning when not in the same physical location as the instructor. These studies on self-efficacy differ from ours in the abruptness of the disruption to learning that occurred during the COVID-19 pandemic. Students lost their shared physical space and potentially their reported self-efficacy. Our study found that students with prior online experience were more confident that they could succeed.

Technology to Access Courses

Technologies used for remote learning were largely the same as those already used in the design and delivery of online courses. The literature contains examples of such access. For example, Magda and Aslanian (2018) found 67% of students accessed full or partial coursework on a mobile device (smartphone, tablet, notebook). Seilhamer et al. (2018) surveyed more than 4000 students, finding 99.8% of respondents owned a mobile device, with 86% accessing their institution’s LMS through a mobile app. The EDUCAUSE Horizon Report (2019) cited mobile learning as a short-term trend. These percentages, while striking, warrant caution, as less than 30% of instructors design courses with mobile access in mind and many students only use such devices for basic course access such as grades or announcements rather than deeper engagement with instructional materials (Dahlstrom and Bischel 2014).

Mobile devices account for only one common piece of technology. Brooks and Pomerantz (2017) surveyed 43,559 students in ten countries and found nearly all learners owned a smartphone. Further, one third owned a laptop, smartphone and a tablet. The EDUCAUSE Report (2019) determined student preferences leaned towards a smartphone and laptop combination when owning multiple devices. Dello Stritto and Linder (2019) found 99.9% of participants (N = 2035) owned a smartphone, 99% a laptop, 56.3% a tablet and 34.9% a desktop. They also found just under 75% of users accessed their eLearning courses on a laptop owing to convenience, ease of use and effectiveness for viewing content, though users also reported desktop machines were more suited to viewing videos. Our findings also indicate preferences by device type.

Trauma and Disruptions to Learning

The literature documents the disruption to learning caused by traumatic events, including disasters, pandemics and mass violence. Challenging situations in ‘normal’ times are exacerbated during and following such events, including changes in employment or household situation, loss of educational or social support and an increase in general stress and anxiety. Much of this research has been aimed at traditional students at four-year colleges and universities (e.g. Carrns 2020). This work, however, may not translate to community college students. Students at two-year colleges are more likely to be balancing work, family needs and school (Karp 2011; Sprung and Rogers 2020). Even during normal circumstances, faculty observe students facing homelessness, food insecurity and health problems, among other challenges. Our study exposes such challenges with respect to COVID, including changes in residence or employment, fast-changing work schedules (especially for ‘essential workers’) and increasing child or elderly care responsibilities.

Much work on disruption due to traumatic events focuses on the mental health of students. Following the 2009 L’Aquila, Italy earthquake, Di Pietro and Mora (2015) found students were less likely to complete degrees on time. Causes included relocation and physiological effects from the earthquake, which led to PTSD through symptoms such as “poor concentration, depression, anxiety and insomnia” (p. 63). Each factor affected the ability to focus on academic work. Certainly, the severity of the event affects the degree of stress and anxiety (McCarthy and Butler 2003), and the effects of PTSD can linger long after the event has occurred (Sparks 2017). That said, the response of the college following a traumatic event matters. A study of campus preparedness following the 1999 Clarksville, Tennessee tornado highlighted the importance of addressing mental health needs of students immediately following the disaster (Shepherd and Sheu 2014). Hinson et al. (2007) underscored preparedness in advance of a disaster and, more presciently, recommended planning for the next great pandemic (in their case the avian flu; in ours, the coronavirus).

Methods

Sinclair Community College is a large, urban community college with approximately 25,000 students. To mitigate the effects of COVID-19, following a weeklong break to revise courses, Sinclair transitioned to remote learning on March 16, 2020. Our survey focused on students and their perceptions of how the transition to remote learning affected their coursework. Faculty in high-enrollment social science and humanities courses (e.g. sociology, geography, history, English) received an invitation to distribute the online survey to students in their remote sections, which had initially started as face-to-face or hybrid/blended classes. Instructors could choose whether to distribute the survey; all responses were anonymous and not tied to particular sections. For that reason, the response rate is not quantifiable. The collection period began a month after the transition to remote learning and ended the week following finals (April 17 to May 11, 2020).

The instrument assessed student perceptions of the transition of existing face-to-face or blended courses to remote learning for the remainder of the spring 2020 term. We first asked demographic questions, followed by Likert-style items focused on student confidence in completing common core tasks before and after the shift, such as completion of homework assignments or accessing learning resources. These items were asked in the context of both pre- and post-shift. We continued with items asking students about the technology used in the shift. Finally, we asked open-ended questions about technical issues, the impact of life changes from COVID-19 and instructional changes that worked – or not – in the shift.

Data analysis consisted of two parts. First, we used descriptive statistics for each survey item to find the mean, median and standard deviation of responses and the difference in means pre- and post-shift. For open-ended items, we used in vivo coding (Miles et al. 2014). We created the instrument ourselves out of concern that no existing instrument measured all the constructs we wished to include. Though we did not have time to field test or pilot the instrument prior to this study, the quantitative items on the survey were found to be highly reliable (20 items; α = .91), suggesting the items measured the underlying constructs consistently. Cronbach’s alpha values fall between 0 and 1, with values closer to 1 indicating stronger reliability.
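To make the quantitative portion of the analysis concrete, the sketch below shows how per-item descriptive statistics, the pre/post difference in means and a Cronbach’s alpha reliability check could be computed. It is an illustrative sketch only, not the authors’ analysis code; the DataFrame and its pre/post column naming are hypothetical assumptions.

```python
# Illustrative sketch only (not the authors' analysis script). Assumes a
# hypothetical DataFrame with one "<item>_pre" and one "<item>_post" column
# per survey item, with star ratings already translated to the numbers 1-5.
import pandas as pd

def describe_item(df: pd.DataFrame, item: str) -> dict:
    """Per-item descriptive statistics plus the pre/post difference in means."""
    pre, post = df[f"{item}_pre"], df[f"{item}_post"]
    return {
        "pre": (pre.mean(), pre.median(), pre.std()),
        "post": (post.mean(), post.median(), post.std()),
        "mean_diff": post.mean() - pre.mean(),  # negative = decline in confidence
    }

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha = k/(k-1) * (1 - sum of item variances / variance of
    the total score), computed across the k Likert items (k = 20 here)."""
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))
```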

Findings

General Characteristics of Survey Respondents

A total of 356 students took part in the study. Table 1 displays demographic results. Most students were enrolled in introductory social science classes. Most participants identified as female, white and between the ages of 18 and 21. The demographics of respondents closely reflected the college’s demographics, with the exception of gender, where the college’s breakdown is closer to 60% female. In terms of prior experience with face-to-face or online modalities (including mixed modalities), respondents were nearly evenly split. In the discussion that follows, we examine differences between subsets of each demographic factor.

Table 1.

Demographics and Experience with Learning Modality

Subject Classification (N, %): Social Sciences 204 (57.3); Humanities 135 (37.9); Other 17 (4.8)

Gender (N, %): Female 253 (66.4); Male 98 (25.7); Non-binary 2 (0.5); Prefer not to answer 3 (0.8)

Age (N, %): 18–21 237 (66.6); 22 and over 117 (32.8); No answer 2 (0.6)

Ethnicity (N, %): African American 72 (20.2); White 247 (69.4); Other 18 (5.1); Prefer not to answer 19 (5.3)

Prior Learning Modality (N, %): Not specified 2 (0.5); F2F only 183 (51.4); Online experience 171 (48.0), comprising Online only 45, Blended only 16, Online & Blended 1, Online & F2F 59, Blended & F2F 16, All three modalities 34

Overall Confidence Pre- and Post-Remote Shift

The primary focus of the survey was confidence in completing in-class work/college support tasks and study habits outside of class, both pre- and post-shift. Items used a five-point Likert-type scale presented as a star rating from one to five stars, later translated to the numbers one to five. Overall, confidence in all areas decreased from pre- to post-shift (see Table 2). Not surprisingly, the use of the library to obtain information saw the largest decline (−1.49). Other noticeable differences came in taking notes on instruction (−1.46), motivating oneself to do schoolwork (−1.44) and remembering information presented in textbooks or in class (−1.34). While students were also less confident of successfully completing tasks involving organization, planning and finishing work on time, the decline was smaller than for tasks involving remembering information or motivating oneself. In fact, some students later commented that they thought remote instruction provided a more rigid structure than their previous in-person class. Students with more online experience were already familiar with organizing and planning their work.

Table 2.

Overall Descriptive Statistics for Pre- and Post-COVID Confidence Measures

Each item lists Pre and Post values as M, Median, SD; Diff is the post-minus-pre difference in means.

In-class Work/College Support Tasks
Finish Homework by Deadline: Pre 4.59, 5, .69; Post 3.43, 4, 1.28; Diff −1.15
Study when there are other things to do: Pre 4.10, 4, .94; Post 3.00, 3, 1.34; Diff −1.10
Concentrate on school subjects: Pre 4.25, 4, .85; Post 2.96, 3, 1.35; Diff −1.29
Take class notes on instruction: Pre 4.38, 5, 1.00; Post 2.92, 3, 1.43; Diff −1.46
Use library for information for class assignments: Pre 4.05, 5, 1.21; Post 2.56, 2, 1.41; Diff −1.49
Participate in class discussions: Pre 4.37, 5, .92; Post 3.12, 3, 1.46; Diff −1.25

Study Habits Outside of Class
Plan your schoolwork: Pre 4.38, 5, .87; Post 3.31, 3, 1.39; Diff −1.07
Organize your schoolwork: Pre 4.37, 5, .88; Post 3.36, 4, 1.39; Diff −1.01
Remember information from class or materials: Pre 4.29, 4, .85; Post 2.95, 3, 1.30; Diff −1.34
Arrange a place to study without distractions: Pre 4.28, 5, .95; Post 3.05, 3, 1.47; Diff −1.24
Motivate yourself to do schoolwork: Pre 4.21, 4, .92; Post 2.77, 3, 1.40; Diff −1.44

Comparison of Confidence by Gender, Race and Age Range

While we have a general sense that student confidence declined overall from pre- to post-shift, differences observed by gender, age and race are enlightening. The majority of participants were female – 66% compared to 26% male. With two exceptions (taking class notes on instruction and participating in class discussions), males reported a slightly smaller drop in confidence post-COVID than female participants. The differences across racial and age demographics were more marked and are reported in Table 3. A comparison of ages found that students aged 18–21 generally saw larger decreases in confidence measures than those in the older age range. Students aged 22 and over had smaller changes in confidence in successfully completing tasks. Logically, older students tend to have more life experience and often have learned how to persevere through challenging circumstances, like COVID. These older students have learned to navigate work-life balance, as they often have full-time employment, childcare considerations or other significant life issues that have given them more practice tackling challenges and overcoming adversity than younger students. We do not suggest that younger students face no similar hurdles; however, they have spent far less time facing such challenges, and the older students’ longer experience suggests a greater ability to adapt, akin to Bandura’s vicarious experiences (Bandura 1986, 1997).

Table 3.

Descriptive Statistics for Pre- and Post-COVID Confidence Measures by Race and Age

African American White 18–21 22-over
Item M Diff. Med. SD M Diff. Med. SD M Diff. Med. SD M Diff. Med. SD
In-class Work/College Support Tasks
Finish Homework by Deadline
Pre 4.41 −0.88 5 0.88 4.58 −0.91 5 0.68 4.58 −1.19 5 0.71 4.62 −1.08 5 0.63
Post 3.53 3 1.18 3.67 3 1.67 3.39 3 1.28 3.55 4 1.27
Study when there are other interesting things to do
Pre 4.02 −1.08 4 1.02 4.01 −1.12 4 0.91 4.03 −1.18 4 0.93 4.27 −0.93 4 0.91
Post 2.93 3 1.28 2.81 3 1.30 2.84 3 1.33 3.34 3 1.32
Concentrate on school subjects
Pre 4.21 −1.26 4 0.93 4.26 −1.49 4 0.76 4.22 −1.40 4 0.84 4.35 −1.05 5 0.84
Post 2.95 3 1.30 2.77 3 1.32 2.82 3 1.30 3.30 3 1.38
Take class notes on instruction
Pre 4.38 −1.35 5 0.87 4.28 −1.59 5 1.06 4.33 −1.52 5 1.03 4.50 −1.34 5 0.94
Post 3.03 3 1.31 2.72 3 1.43 2.81 3 1.41 3.15 3 1.47
Use the library to get information for class assignments
Pre 3.85 −1.34 4 1.22 3.93 −1.59 4 1.28 3.94 −1.47 4 1.28 4.30 −1.51 5 1.01
Post 2.51 2 1.28 2.37 2 1.43 2.46 2 1.45 2.79 3 1.33
Study Habits outside of Class
Plan your schoolwork
Pre 4.18 −1.01 4 1.06 4.39 −1.15 5 0.85 4.34 −1.05 5 0.90 4.48 −1.09 5 0.79
Post 3.17 3 1.32 3.25 3 1.42 3.29 3 1.40 3.39 3 1.36
Organize your schoolwork
Pre 4.20 −0.97 4 0.93 4.40 −1.06 5 0.83 4.36 −1.01 5 0.92 4.43 −1.01 5 0.79
Post 3.23 4 1.35 3.35 4 1.45 3.34 4 1.41 3.41 4 1.36
Remember information presented in class and/or in textbooks
Pre 4.38 −1.44 5 0.74 4.29 −1.41 4 0.82 4.27 −1.40 4 0.85 4.37 −1.22 5 0.86
Post 2.94 3 1.24 2.88 3 1.33 2.87 3 1.31 3.15 3 1.27
Arrange a place to study without distractions
Pre 4.19 −1.34 5 1.05 4.32 −1.41 5 0.90 4.27 −1.31 5 0.98 4.37 −1.12 5 0.85
Post 2.87 3 1.38 2.91 3 1.50 2.96 3 1.48 3.25 3 1.44
Motivate yourself to do schoolwork
Pre 4.15 −1.33 4 1.11 4.17 −1.59 4 0.87 4.15 −1.53 4 0.94 4.34 −1.28 5 0.88
Post 2.82 3 1.40 2.58 2 1.38 2.63 2 1.39 3.05 3 1.36
Participate in class discussions
Pre 4.33 −1.22 5 0.99 4.33 −1.32 5 0.89 4.30 −1.29 5 0.93 4.54 −1.21 5 0.88
Post 3.12 3.5 1.46 3.01 3 1.50 3.01 3 1.47 3.33 4 1.45

We also examined differences among participants identifying as part of an ethnic/racial group. Among the students responding to the survey, only a few identified as Hispanic (7) or Asian (11) (see Table 1). As such, we compared the data for the two largest groups: Black/African American and White. In examining individual confidence measures, African American students had a smaller decline in confidence from pre- to post-shift than their white peers. Only two areas – remembering information and arranging a place to study without distractions – were higher for white students.

Previous Experience with Online Classes

Finally, Table 4 provides a breakdown by prior course modality, comparing students with online experience (completely online or in conjunction with blended courses) to those with face-to-face experience only. Students with prior online course experience had far smaller differences between pre- and post-transition confidence. We attribute this to familiarity and comfort with taking courses in such modalities, experience using the learning management system, familiarity with course expectations and requirements, and practical experience planning and organizing coursework at a distance (either fully or in part).

Table 4.

Descriptive Statistics for Pre- and Post-COVID Confidence Measures by prior Online Experience or Face-to-Face Only

F2F only Online Experience
Item M Diff. Med. SD M Diff. Med. SD
In-class Work/College Support Tasks
Finish Homework by Deadline
Pre 4.62 −1.38 5 0.62 4.59 −0.77 5 0.51
Post 3.24 3 1.32 3.82 4 1.38
Study when there are other interesting things to do
Pre 4.21 −1.45 4 0.88 4.05 −0.36 4 0.91
Post 2.76 3 1.32 3.69 4 0.91
Concentrate on school subjects
Pre 4.42 −1.67 5 0.78 4.08 −0.72 4 0.91
Post 2.75 3 1.42 3.36 4 1.3
Take class notes on instruction
Pre 4.58 −1.91 5 0.79 4.00 −0.59 4 1.63
Post 2.67 3 1.42 3.41 4 1.43
Use the library to get information for class assignments
Pre 4.26 −1.84 5 1.07 3.89 −1.02 4 1.60
Post 2.42 2 1.46 2.87 3 1.85
Study Habits outside of Class
Plan your schoolwork
Pre 4.42 −1.37 5 0.84 4.44 −0.70 5 0.58
Post 3.05 3 1.45 3.74 4 1.43
Organize your schoolwork
Pre 4.39 −1.29 5 0.87 4.51 −0.69 5 0.55
Post 3.09 3 1.45 3.82 4 1.45
Remember information presented in class and/or in textbooks
Pre 4.44 −1.69 5 0.74 4.13 −0.92 4 1.02
Post 2.75 3 1.29 3.21 4 1.87
Arrange a place to study without distractions
Pre 4.32 −1.55 5 0.98 4.43 −0.85 5 0.58
Post 2.77 3 1.49 3.57 4 1.92
Motivate yourself to do schoolwork
Pre 4.33 −1.79 5 0.83 4.15 −0.90 4 1.06
Post 2.53 2 1.41 3.25 3 1.66
Participate in class discussions
Pre 4.48 −1.69 5 0.83 4.18 −0.60 5 0.95
Post 2.80 3 1.49 3.58 4 1.92

Access to Technology

Open-ended questions in the latter part of the survey asked about access to technology, technical issues and shifts in work-life balance due to COVID. Access to technology became critical during the transition to remote learning, with the college concerned about student access to both a computer and the internet during the crisis. Studies have tried to assess student access to technology with varying results. For example, Seilhamer et al. (2018) found that 99.8% of students had a smartphone, while Dello Stritto and Linder (2019) found 56.3% of students had a tablet. These results vary across institutions and over time. In our study, respondents reported relatively high access to computers (91.6%) and cell phones (86.6%) (Table 5). While most students had access to technology during remote instruction, many still experienced technical issues when working at home (Table 6). The most common issues involved the internet: slowdowns from the number of household members working at the same time, insufficient device memory or simply spotty or slow access.

Table 5.

Technologies used for Remote Coursework Completion

Total Study Population Students with previous F2F Experience Only
Technology Available Number Percentage Number Percentage
Cell Phone 330 86.6 170 92.9
Computer 349 91.6 180 98.4
Tablet 96 25.2 43 23.5
Webcam 181 47.5 94 51.4
Wi-Fi/Internet 336 88.3 170 92.9

Table 6.

Technical Issues Faced Completing Classwork at Home

Summary of technical issue (N): examples of problems

No issues (123)

Limited or slow Internet/Wi-Fi access (200): slow due to number of family members working; spotty internet in house; slow loading; live in country/rural area with spotty access; slow/dropped connections

No internet (4): cannot afford internet service; rural location of residence limits access to service

Zoom issues (25): lose connections; trouble figuring out Zoom; video often freezes or is dropped

eLearn/website issues (21): email issues; website down; using wrong browser; eLearn deleted work; slow loading

Instructor-related problems (17): messy eLearn environment, lack of organization; online lectures poorly done; professors not answering emails; no office hours or online access to instructor; unclear about assignments and/or due dates

Computer software/hardware difficulties (31): old computer crashed; computer not compatible; no headphone/no printer

No response (48)

COVID Life Changes

Changes during the onset of the COVID shutdown in March and April were not limited to learning in new ways. Students experienced changes, some abrupt, in many areas of their lives, including employment, housing and household/family responsibilities, among others. In the survey, students were asked to comment on how changes in their lives as a result of the pandemic affected their ability to engage in their education at Sinclair (see Table 7). To better understand the range of responses, their comments were coded and grouped into three general categories. First, students reported changes in their work-life balance, which we coded as environmental issues. While some students experienced the loss of employment, others were deemed ‘essential workers,’ which led to more hours or a change in their schedule. Some students picked up extra work to compensate for their parents’ loss of income. Others experienced changes to their household situation or an increase in childcare responsibilities, which made it challenging to find the time and space to study.

Table 7.

Changes in Student Lives Affecting Class Participation

Category (N): examples of impact

No impact (66)

Environmental Issues
  Change in work situation (39): I had to pick up more shifts at work; forced to find a new job and it is third shift; work in the hospital, hours surged; deemed essential worker; increased work because parents lost job
  Change in household situation (30): lack of sufficient/dedicated place to study and do my work, loud roommates; family distractions; moving to new house/apartment; lost my job and had to move
  Increase in childcare responsibilities (24): “siblings to take care of”; “home with my two-year-old”; teaching my siblings; single mom with young children

Mental/Physical Health Issues
  Less motivated (37): lost motivation to do a lot of things; lacking motivation, human contact/connections are really important to me; I am depressed and lost; it defeated me from wanting to continue education; changes in my life left me feeling defeated and I lost part of my drive to complete my work; without being in class, I lost a good amount of motivation to finish my coursework
  Mental health (13): it’s really hard to get out of bed; just feeling sad, not being able to go out; battling mental health; my depression and anxiety have come back; I was a bit scared and not sure what is going to happen
  Ill (4): I got COVID; sick; family member sick

Learning Issues (+ positive, - negative)
  + Increased concentration (15): more productive and successful; helped me get motivated to actually do my work; learned that online suits me better; more time to study
  + Easier, saved time (4): no longer had to spend time driving to work
  - Prefer face-to-face (40): less class discussion led to less ability to understand the content; felt isolated and detached from education, I’m a f2f learner; virtual was harder without the teacher-student connection in class; not able to build connections with classmates and professor
  - Difficulty focusing (14): difficult maintaining focus; many things that I have to worry about, so I cannot focus
  - Challenge managing work (19): hard doing high school work and Sinclair work; made it harder to keep track of what/when to do it
  - Lack of resources (6): cannot get same kind of tutoring; cannot go to the library to work
  - Less peer communication (4): could not meet with groups
  - Increased homework (3): professors assigned more work
  - Other: less time, more responsibility, procrastination (10): felt less confident to join in Zoom meetings; sitting at computer for long time

Second, students commented on their health, both mental and physical. Some students wrote directly about their mental health: depressed, anxious, sad, ‘can’t get out of bed,’ while others spoke more indirectly about their lack of motivation and drive to complete their studies. Unsurprisingly, illness surfaced as family members and/or students got COVID. Finally, students commented on changes in their learning environment. Any classes with a face-to-face component were moved to synchronous remote learning (virtual meeting of a course with all participants engaged simultaneously) or asynchronous online delivery (participants engage with course content at their convenience). While a few students found that remote/online learning was easy or took less time, others found the work challenging in terms of content (some felt it was more work), experienced a loss of academic and/or technology resources and felt the loss of the social support that can positively affect student success.

The study revealed marked differences in responses across demographic groups defined by age and race. A comparison of responses from respondents aged 18–21 and those 22 and over revealed important differences. Younger respondents were more likely to report that they were less motivated and/or had mental health issues (16% compared to 8.4%). Not surprisingly, that same younger group was less likely to be impacted by work issues (8.3% compared to 15%) and less likely to have childcare issues (2.8% compared to 15%). A reported preference for face-to-face classes was nearly the same: 11% for those aged 18–21 and 14% for those over 22. In terms of race/ethnicity, the data were only sufficient to report on respondents self-identifying as African American/Black only and White only. Respondents identifying as African American/Black only were twice as likely to report a preference for face-to-face classes compared to those identifying as White only (17% compared with 8.8%); more likely to report difficulties managing work (17% compared with 7.6%); and less likely to report work issues (4.3% compared to 9.6%). It is also important to note that White students did not report being impacted by illness, while three African American/Black students reported effects from illness (self or family).

COVID Instructional Changes

Students were also asked, “What instructional changes worked (or not) for your situation during the pandemic?” Each of the responses (N = 314) was coded as an instructional change that was either positive (N = 188) or negative (N = 106). Occasionally, students had both negative and positive statements in their response (N = 17). Responses were categorized as relating to workload, organization/structure, delivery of material, communication and technology (see Table 8). Students’ responses were not unexpected: what works well during a time of crisis also worked well for students prior to COVID-19.

Table 8.

Instructional Changes during the Pandemic

Category: Worked (Positive) / Did NOT Work (Negative)

Workload
  Worked: reduced workload; flexible deadlines; single weekly deadlines for all work
  Did not work: increased workload; too many changes to due dates

Organization/Structure
  Worked: updated syllabus; instructors were more organized; structured schedule
  Did not work: syllabus not updated; professors kept changing weekly study plan; dropboxes not open; instructor not organized

Delivery
  Worked: recorded lectures; recorded explanations of labs; pre-recorded lectures; posting notes/PowerPoints for review
  Did not work: confusing instructions

Communication
  Worked: weekly check-in emails from professor; alerts sent about assignments; better able to plan what to say during discussions on Zoom; sending info through email
  Did not work: difficult to reach professors (no response); expectations not clear

Technology
  Worked: Zoom worked well; learned new technology
  Did not work: disruptive Zoom sessions when students left early or were not visible; poor internet connections

Suggested Next Steps

As the pandemic has progressed through the fall term, we have already observed colleges adjust their plans in response to changing government guidelines and their own local circumstances. It is certainly a fluid environment. Even with this changing terrain, we suggest ‘next steps’ that colleges should consider as they anticipate similar disruptions to learning. Ultimately, these lessons will help other practitioners grow in their continued response to the pandemic while simultaneously preparing for future, similarly significant disruptions to higher education.

Contingency Planning

First, this study underscores the importance of training faculty, staff and students to use elements of online instruction to prepare them for the flexibility required when learning disruptions occur due to natural disasters, traumatic community events or societal changes. This preparation, which might include learning to use a learning management system, synchronous technologies and other integrated tools, would reduce the kinds of technical issues reported by students during the COVID transition, including messy course environments, poorly constructed online lectures and unclear instructions. In fact, elements of online courses can provide more structure in face-to-face courses, helping students access course materials consistently throughout a semester. Our results (Table 4) demonstrate that some knowledge of and experience with online learning helped students maintain confidence in completing many tasks related to their academic work.

Access to Technology

Given the socio-economic status of many of our community college students, we were surprised at the high percentage of students with access to technology. Still, for a variety of reasons, students had trouble with internet access. As such, institutions would benefit from periodically surveying students to capture their access to different types of technology and the internet, as well as the quality of those technologies. This knowledge would be useful in contingency planning prior to a rapid shift to remote instruction and may help institutions reach students who might benefit from the flexibility of online courses.

Expansion of Online/Blended Modalities

The transition to remote learning has opened opportunities for institutions, particularly in expanding the number of classes offered in online/blended modalities. Now that many students have both access to technology (even with internet problems) and online experience, they may find that synchronous remote or asynchronous online classes provide some much-needed flexibility in their work-life balance. Colleges may also find that expanding online options opens new markets.

Academic and Social Support

The COVID transition highlighted the need for flexible academic and social support services, which contribute to students successfully completing a course, certificate or program. Services such as advising, libraries, tutoring, information technology and mental health offices help students overcome hurdles in normal circumstances and become even more important during disruptions to learning. In addition, the technology and compatibility issues students reported suggest that institutions should plan for more technology services, such as the deployment of wireless hotspots, restricted in-person access points and the loaning of devices compatible with student course needs in the event of a future disruption.

Attention to Mental Health

Mental health issues were not listed as impactful to the extent we anticipated, though we partially attribute this to the timeframe in which the survey took place (April–May 2020) and to the absence of directed questions on mental health. Here we lean on studies that identify mental health as a cause inhibiting completion following a traumatic event (e.g. Di Pietro and Mora 2015; McCarthy and Butler 2003). Certainly, colleges and universities should ensure staff are prepared to address issues arising during crises so they can meet student needs. What we did find were students who reported issues related to social isolation, a facet certainly connected to mental health. Students missed in-person interaction in classes and group meetings, tutoring sessions and visits to the library. To prepare for disruptions to learning, institutions should develop ready-made resources for students to navigate sudden solitude. Knowledge of these resources will help students feel supported and able to tackle the challenges encountered during sudden disruptions to learning, particularly for students accustomed to in-person offerings.

Conclusion

Preparing institutions for future calamities requires that we continue to examine and assess what happened during the COVID-19 pandemic as well as during other, perhaps highly localized, traumatic events. While there have been local or regional events that caused sharp changes in delivery, higher education has not previously faced a critical global situation like the COVID-19 pandemic (Polkoff et al. 2020).

The present study captures some early information from a student perspective despite certain limitations, which we discuss here. One limitation was the abbreviated timeframe required to develop and administer a survey before the end of the semester. To create a survey quickly, we based the instrument on a previous confidence survey (Bandura 2006). With more time, we might have supplemented it with questions vetted from a psychometric standpoint to ensure validity and reliability. A second limitation was the sample size, which is small for a community college with a population of 25,000. Early on, we decided to target students in introductory, general education classes in the social sciences and humanities where the delivery was face-to-face or blended. We did this to reach students who did not previously have experience with online classes. Still, it would have been helpful to survey a larger sample. Although we had enough responses to compare students with online experience and those without, it would have been helpful to reach more students distinguished by race/ethnicity and gender. A final limitation relates to the timeline of the COVID-19 pandemic. The present study captures the early stages of the pandemic in the spring of 2020. Since then, institutions of higher education have continued to shift their responses based on their local and national situations during the ongoing pandemic. Further research may help in understanding how students are impacted by these changes in higher education. In particular, a focus on mental health would likely be informative, as the results of this study were limited in terms of student mental health needs.

As institutions continue to grapple with the effects of the COVID-19 pandemic, they must consider the voice of students academically, emotionally and technologically. Their perspective is essential in helping to learn from this unprecedented national pivot to remote instruction. Although the findings of this study are based on students attending a two-year community college, the views of students certainly apply to other institutions of higher education. Students faced all sorts of challenges as they balanced a new learning environment with many adjustments to work, family and home life. Just as students come to college with differing personal and educational experiences, the mix of challenges is different for each student. The key is to continue working towards understanding the differing circumstances of students to help them be successful during these unexpected (and expected) disruptions to learning. While institutions can use the findings from this study to continue to support students throughout the pandemic, the result might be fuller services for students during ‘normal’ times.

Funding

No funding was received to assist with the preparation of this manuscript. The authors have no relevant financial or non-financial interests to disclose.

Compliance with Ethical Standards

Conflict of Interest

The authors have no relevant conflicts of interest in the completion of this study.

Ethics Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. The study was approved by the Institutional Research Board of Sinclair Community College.

Consent to Participate

Informed consent was obtained from all participants included in the study.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Christopher Prokes, Email: christopher.prokes@sinclair.edu.

Jacqueline Housel, Email: jacqueline.housel@sinclair.edu.

References

  1. Alqurashi E. Self-efficacy in online learning environments: A literature review. Contemporary Issues in Education Research. 2016;9:45–52. doi: 10.19030/cier.v9i1.9549. [DOI] [Google Scholar]
  2. Artino AR. Motivational beliefs and perceptions of instructional quality: Predicting satisfaction with online training. Journal of Computer Assisted Learning. 2007;24(3):260–270. doi: 10.1111/j.1365-2729.2007.00258.x. [DOI] [Google Scholar]
  3. Bal, I, A., Arslan, O., Budhrani, K., Mao, Z., Novak, K., & Muljana, P, S. (2020). The balance of roles: Graduate student perspectives during the COVID-19 pandemic. TechTrends. 10.1007/s11528-020-00534-z. [DOI] [PMC free article] [PubMed]
  4. Bandura A. Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review. 1977;84(2):191–215. doi: 10.1037//0033-295x.84.2.191. [DOI] [PubMed] [Google Scholar]
  5. Bandura A. Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice-Hall; 1986. [Google Scholar]
  6. Bandura A. Self-efficacy: The exercise of control. New York: W.H. Freeman and Company; 1997. [Google Scholar]
  7. Bandura A. Guide for constructing self-efficacy scales. In: Pajares F, Urdan T, editors. Self-efficacy beliefs of adolescents. Charlotte, NC: Information Age Publishing; 2006. pp. 307–337. [Google Scholar]
  8. Bettinger EP, Fox L, Loeb S, Taylor ES. Virtual classrooms: How online college courses affect student success. American Economic Review. 2017;107:2855–2875. doi: 10.1257/aer.20151193. [DOI] [Google Scholar]
  9. Brooks, D, C., & Pomerantz, J. (2017). ECAR study of undergraduate students and information technology. Louisville, CO: ECAR. https://library-educause-edu.libproxy.boisestate.edu/-media/files/library/2017/10/studentitstudy2017.Pdf. Accessed 2 May 2020
  10. Carr-Chelman, A., & Duchastel, P. (2001). The ideal online course. Library Trends, 50, 145–158 https://www.ideals.illinois.edu/bitstream/handle/2142/8379/librarytrendsv50i1k_opt.pdf?sequence=1. Accessed 1 May 2020
  11. Carrns, A. (2020). The ‘indirect’ costs at college can involve nasty surprises. New York Times. https://www.nytimes.com/2020/08/07/your-money/college-costs-tuition.html.  Accessed 20 Aug 2020
  12. Chen G, Gully SM, Eden D. Validation of a new general self-efficacy scale. Organizational Research Methods. 2001;4:62–83. doi: 10.1177/109442810141004. [DOI] [Google Scholar]
  13. Conceicao S. Faculty lived experiences in the online environment. Adult Education Quarterly. 2006;56(1):26–45. doi: 10.1177/1059601106292247. [DOI] [Google Scholar]
  14. Dahlstrom, E., & Bischel, J. (2014). ECAR study of undergraduate students and information technology. Louisville, CO: ECAR. https://library.educause.edu/~/media/files/library/2014/10/ers1406-pdf.pdf?la=en. Accessed 20 May 2020
  15. Dello Stritto, M, E., & Linder, K. (2019). Uncovering student device preferences for online course access and multimedia learning. EDUCAUSE. https://er.educause.edu/blogs/2019/1/uncovering-student-device-preferences-foronline-course-access-and-multimedia-learning. Accessed 2 May 2020
  16. Di Pietro G, Mora T. The effect of the L’Aquila earthquake on labour market outcomes. Environment and Planning C: Politics and Space. 2015;33:239–255. doi: 10.1068/c12121r. [DOI] [Google Scholar]
  17. EDUCAUSE. (2019). 2019 horizon report: Higher education. EDUCAUSE. https://library.educause.edu/resources/2019/4/2019-horizon-report. Accessed 1 May 2020
  18. Flaherty, C. (2020). ‘Massive’ increases in LMS and synchronous video usage. Inside Higher Education. https://www.insidehighered.com/news/2020/04/03/%E2%80%98massive%E2%80%99-increases-lms-and-synchronous-video-usage. Accessed 19 May 2020
  19. Frick, T, W. (2020). Education systems and technology in 1990, 2020 and beyond. TechTrends. 10.1007/s11528-020-00527-y. [DOI] [PMC free article] [PubMed]
  20. Gaumer Erickson, A, S., Soukup, J, H., Noonon, P, M., & McGurn, L. (2018). Self-efficacy formative questionnaire technical report. Research Collaboration. http://www.researchcollaboration.org/uploads/Self-EfficacyQuestionnaireInfo.pdf. Accessed 18 April 2020
  21. Gillet-Swan, J. (2017). The challenges of online learning: Supporting and engaging the isolated learner. Journal of Learning Design, 10. https://www.jld.edu.au/article/view/293. Accessed 29 May 2020
  22. Grajek, S. (2020). EDUCAUSE COVID-19 quickpoll results: Help for students. EDUCAUSE Review. https://er.educause.edu/blogs/2020/4/educause-covid-19-quickpoll-results-help-for-students. Accessed 8 May
  23. Hinson J, LaPrarie K, Carroll E. Emergency preparedness and e-learning: Recommendations for readiness. Journal of Interactive Instruction Development. 2007;20:3–7. [Google Scholar]
  24. Hodges, C. (2008). Self-efficacy, motivational email and achievement in an asynchronous math course. Journal of Computers in Mathematics and Science Teaching, 27(3), 265–285. https://www.learntechlib.org/primary/p/25282/. Accessed 1 Jun 2020
  25. Jaggars SS. Choosing between online and face-to-face courses: Community college student voices. American Journal of Distance Education. 2014;28:27–38. doi: 10.1080/08923647.2014.867697. [DOI] [Google Scholar]
  26. Jan SK. The relationships between academic self-efficacy, computer self-efficacy, prior experience and satisfaction with online learning. American Journal of Distance Education. 2015;29:30–40. doi: 10.1080/08923647.2015.994366. [DOI] [Google Scholar]
  27. Karp, M, M. (2011). Toward a new understanding of non-academic support: Four mechanisms encouraging positive student outcomes in the community college (CCRC working paper no. 28). Assessment of evidence series. Community College Research Center, Columbia University. https://ccrc.tc.columbia.edu/publications/non-academic-student-support-mechanisms.html. Accessed 22 April 2020
  28. Kemp, N., & Grieve, R. (2014). Face-to-face or face-to-screen? Undergraduates’ opinions and test performance in classroom vs. online learning. Frontiers in Psychology, 12,10.3389/fpsyg.2014.01278. [DOI] [PMC free article] [PubMed]
  29. Kim, J. (2020). COVID-19, remote learning and the beauty of all hands on deck. Inside Higher Education.https://www.insidehighered.com/digital-learning/blogs/learning-innovation/covid-19-remote-learning-and-beauty-all-hands-deck. Accessed 13 May 2020
  30. Kuo YC, Walker AE, Schroder KEE, Belland BR. Interaction, internet self- efficacy, and self-regulated learning as predictors of student satisfaction in online education courses. The Internet and Higher Education. 2014;20:35–50. doi: 10.1016/j.iheduc.2013.10.001. [DOI] [Google Scholar]
  31. Lee C, Witta EL. Online students’ perceived self-efficacy: Does it change? Proceedings of the Association for Educational Communications and Technology International Conference (pp. 228–236) Atlanta, GA: AECT; 2001. [Google Scholar]
  32. Lim CK. Computer self-efficacy, academic self-concept and other predictors of satisfaction and future participation of adult distance learners. The American Journal of Distance Education. 2001;15:41–51. doi: 10.1080/08923640109527083. [DOI] [Google Scholar]
  33. Magda, A, J., & Aslanian, C, B. (2018). Online college students 2018: Comprehensive data on demands and preferences. The learning house, Inc. https://www.learninghouse.com/knowledge-center/research-reports/ocs2018/. Accessed 1 May 2020
  34. Malarkodi, M., Indumathi, V, M., & Praveena, S. (2018). Preference towards online mode of distance education courses-conjoint analysis. International Journal of Bio-Resource & Stress Management, 1. 10.23910/IJBSM/2018.9.1.1858.
  35. McCarthy M, Butler L. Responding to traumatic events on college campuses: A case study and assessment of student post-disaster anxiety. Journal of College Counseling. 2003;6:90–96. doi: 10.1002/j.2161-1882.2003.tb00230.x. [DOI] [Google Scholar]
  36. Miles MB, Huberman AM, Saldana J. Qualitative data analysis (2nd ed.) London: SAGE; 2014. [Google Scholar]
  37. Pentina, I., & Neely, C. (2007). Differences in characteristics of online versus traditional students: Implications for target marketing. Journal of Marketing for Higher Education, 17, 49–65. 10.1300/J050v17n01_05.
  38. Pincus, K. V., Stout, D. E., Sorensen, J. E., Stocks, K. D., & Lawson, R. A. (2017). Forces for change in higher education and implications for the accounting academy. Journal of Accounting Education, 40, 1–18. 10.1016/j.jaccedu.2017.06.001.
  39. Polkoff, M., Silver, D., & Korn, S. (2020). What’s the likely impact of COVID-19 on higher ed? Inside Higher Education. https://www.insidehighered.com/views/2020/08/04/analysis-data-national-survey-impact-pandemic-higher-ed-opinion. Accessed 18 Aug 2020
  40. Puzziferro M. Online technologies self-efficacy and self-regulated learning as predictors of final grade and satisfaction in college-level online courses. American Journal of Distance Education. 2008;22:72–89. doi: 10.1080/08923640802039024. [DOI] [Google Scholar]
  41. Schwarzer R, Jerusalem M. Generalized self-efficacy scale. In: Weinman J, Wright S, Johnson M, editors. Measures in health psychology: A user’s portfolio. Causal and control beliefs (pp. 35-37) Windsor, UK: NFER-Nelson; 1995. [Google Scholar]
  42. Seilhamer, R., Chen, B., de Noyelles, A., Raible, J., Bauer, S., & Salter, A. (2018). 2018 Mobile Survey Report. University of Central Florida Digital Learning. https://digitallearning.ucf. Accessed 1 May 2020 
  43. Shepherd, M., & Sheu, T. S. (2014). The effects of informal faculty-student interaction and use of information technology on non-traditional students’ persistence intentions and educational outcomes. Journal of Higher Education Theory and Practice, 14, 46–60. http://www.na-businesspress.com/Subscriptions/JHETP/JHETP_14_2__Master.pdf.
  44. Sherer M, Maddux JE, Mercandanate B, Prentice-Dunn S, Jacobs B, Rogers RW. The self-efficacy scale: Construction and validation. Psychological Reports. 1982;51:663–667. doi: 10.2466/pr0.1982.51.2.663. [DOI] [Google Scholar]
  45. Sinclair College. (2020). SinclairOnline. https://www.sinclair.edu/locations/online/. Accessed 19 May 2020
  46. Sparks, S. (2017). Students feel trauma’s aftereffect long after crises end, studies find. Education Week. https://www.edweek.org/leadership/childrens-trauma-lasts-long-after-disasters-studiesshow/2017/09. Accessed 18 May 2020
  47. Sprung JM, Rogers A. Work-life balance as a predictor of college student anxiety and depression. Journal of American College Health. 2020;68:1–8. doi: 10.1080/07448481.2019.1706540. [DOI] [PubMed] [Google Scholar]
  48. White CP, Ramirez R, Smith JG, Plonowski L. Simultaneous delivery of a face-to-face course to on-campus and remote off-campus students. TechTrends. 2010;54:34–40. doi: 10.1007/s11528-010-1418-z. [DOI] [Google Scholar]
  49. Xu D, Jaggars SS. The impact of online learning on students’ course outcomes: Evidence from a large community and technical college system. Economics of Education Review. 2013;37:46–57. doi: 10.1016/j.econedurev.2013.08.001. [DOI] [Google Scholar]
  50. Zimmerman WA, Kulikowich JM. Online learning self-efficacy in students with and without online learning experience. American Journal of Distance Education. 2016;30:180–191. doi: 10.1080/08923647.2016.1193801. [DOI] [Google Scholar]
