ATS Scholar. 2022 Aug 31;3(3):399–412. doi: 10.34197/ats-scholar.2022-0001OC

Learning Outcomes in a Live Virtual versus In-Person Curriculum for Medical and Pharmacy Students

Sedtavut D Nilaad 1, Erica Lin 1, Jacob Bailey 1, Caitlyn Truong 1, Samvel Gaboyan 1, Ankita Mittal 1, Brookie M Best 2,3, Kama Guluma 4, Alana Iglewicz 5, Lina Lander 6, Sean Evans 7, Charles Goldberg 1, Laura E Crotty Alexander 1,8,
PMCID: PMC9585697  PMID: 36312802

Abstract

Background

The coronavirus disease (COVID-19) pandemic has been a source of disruption, changing the face of medical education. In response to infection control measures at the University of California, San Diego, the hybrid in-person and recorded preclerkship curriculum was converted to a completely virtual format. The impact of this exclusive virtual teaching platform on the quality of trainee education is unknown.

Objective

To determine the efficacy of a virtual course, relative to traditional hybrid in-person and recorded teaching, and to assess the impact of supplementary educational material on knowledge acquisition.

Methods

A retrospective observational cohort study was performed to assess an introductory course, held mostly in person in 2019 versus completely virtual in 2020, for first-year medical students and second-year pharmacy students at the University of California, San Diego, School of Medicine and Skaggs School of Pharmacy and Pharmaceutical Sciences.

Results

The midterm and final examination scores were similar for the hybrid and virtual courses. There was no association between the hours of recorded lectures watched and final examination scores for either course. In the 2019 in-person and recorded course, students who demonstrated consistent on-time use of practice quizzes scored significantly higher on the final examination (P = 0.0066). In the 2020 virtual course, students who downloaded quizzes regularly scored significantly higher on the midterm examination (P < 0.0001).

Conclusion

The similar examination scores for the hybrid in-person and recorded and exclusively virtual courses suggest that the short-term knowledge acquired was equivalent, independent of the modality with which the content was delivered. Consistent on-time use of practice quizzes was associated with higher examination scores. Future studies are needed to assess the difference between a completely in-person versus virtual curriculum.

Keywords: medical education, coronavirus disease, virtual learning, preclerkship learning


The unique circumstances of the coronavirus disease (COVID-19) pandemic caused by the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) resulted in an unprecedented disruption of medical education. Adherence to infection control measures led to the swift cancellation of in-person didactics (1). With the social distancing measures required to prevent the spread of SARS-CoV-2 in 2020, there was a drastic shift toward an online environment, with an almost exclusive replacement of in-person learning by virtual modalities.

Although educational systems have traditionally relied on face-to-face teaching, since its introduction, web-based learning has played a growing role in medical education (2, 3). The addition of technology has been adopted by most institutions to facilitate learning by trainees (2). Beginning even before the COVID-19 pandemic, preclerkship teaching has shifted toward e-learning, as in-person large-group lecture attendance has declined ever since recording and online posting of lectures became commonplace (4). The addition of asynchronous learning to the traditional curriculum has been viewed positively by learners (5, 6). Online teaching platforms have been shown to have similarly high degrees of student engagement compared with in-person approaches (7).

The switch to a completely virtual curriculum as a result of the COVID-19 pandemic may have profound consequences in the future. Although small studies have examined the efficacy of virtual formats (8–10), there are minimal data on larger-scale applications. As such, there is still limited understanding of the impact of this nearly exclusively online version of medical education. Some medical educators have expressed concerns about the quality of trainee education, citing video/audio integrity, social isolation, student engagement, and reliance on learner motivation (4, 11). Therefore, an evaluation of knowledge transfer and acquisition from online teaching formats is necessary.

The aim of this study was to determine the impact of an exclusive virtual curriculum on examination performance in comparison with traditional hybrid in-person/recorded delivery. In addition, because much time and effort are spent on developing and distributing supplemental educational materials for medical trainees, we sought to understand the patterns of use of online supplemental educational resources and the impact on overall knowledge acquisition.

Methods

Study Design

A retrospective cohort study assessing the impact of a hybrid in-person and recorded versus completely virtual curriculum was conducted at the University of California, San Diego, School of Medicine and Skaggs School of Pharmacy and Pharmaceutical Sciences. In this study, we examined the Foundations of Medicine course for a cohort of first-year medical students (MS1) and second-year pharmacy students (P2) during 2019 and 2020. The study was determined to meet exempt status by the University of California, San Diego, Institutional Review Board.

Course Specifics

The Foundations of Medicine course is the first course in the MS1 sequence and P2 sequence, spanning five weeks and covering core biomedical science content including cell biology, molecular biology, and genetics. The course is divided into core lectures (40%), small-group sessions (50%), and histology and anatomy labs (10%).

In 2019, this course, which consisted of 36 hours of material, was conducted in a hybrid in-person and recorded format. Specifically, lectures were taught in person, but in-person attendance was not mandatory; students had the option to watch the recorded versions online instead of, or in addition to, attending lectures in person. Although exact in-person attendance was not recorded, it was estimated to be high, with 75–80% of the class present in person in 2019. This high attendance is typical of the first course of the year, with in-person attendance waning in subsequent courses. In 2020, this course, which consisted of 30 hours of material, was conducted in a COVID-19 virtual format (CVF) because of the pandemic. All lectures were either delivered live remotely via Zoom (Zoom Video Communications, San Jose, CA) or prerecorded and posted online, with the majority prerecorded. Notably, these lectures were taught primarily in a didactic format, with the ability to ask follow-up questions during and after the lecture. The lecture content was unchanged between the 2019 and 2020 courses and consisted of 38 lectures. The CVF course had fewer hours of material than the hybrid course because the prerecorded lectures did not use the full allocated time.

In addition to these core lectures, the course includes small groups dedicated to literature review, genetics workshops, problem-based learning, and practice of medicine. In the literature review sessions, students learn to critically appraise research articles, applying core concepts from their cell biology and molecular biology lectures to real-world examples. In the genetics workshops, students work through cases to better understand genetic inheritance patterns. More than 80% of the literature review and genetics workshop facilitators were the same in both years, having participated as facilitators the year prior. In problem-based learning, students are taught to approach patient-centered cases with a focus on clinical decision making. In practice of medicine, students are exposed to nonmedical topics that are integral to patient care, including professionalism, ethics, the importance of the patient–physician relationship, and communication skills. In 2019, all of these small-group sessions were conducted in person, with mandatory attendance. In 2020, these sessions were converted to a purely virtual format, again with mandatory attendance. Approximately 70% of the problem-based learning and practice of medicine facilitators were the same in both years. In addition, the University of California, San Diego, School of Medicine’s Office of Educational Support Services provides educational support for both MS1 and P2 students for all content related to the course, offering optional weekly sessions that review the lecture and small-group material in a more interactive format. These sessions were held in person in 2019 and virtually in 2020.

Supplementary course material, including task-oriented learning objectives (TOLOs) for each lecture and weekly practice quizzes covering several lectures, was provided to the students in both versions of the course. TOLOs were developed by the course faculty members to help organize, prioritize, and even reframe the learning material for the students. Whereas ordinary learning objectives may simply list topics students should learn, TOLOs are designed to make students think critically about a topic; for example, they may pose questions and tasks to be completed while applying the material learned from lectures, labs, and small groups. Practice quizzes were also developed by the course faculty members to mirror the types of questions on the midterm and final examinations. In addition, examination questions from previous years are pulled into these weekly practice quizzes each year, as new questions are written and added to the examinations. To assess knowledge acquisition by the learners, a midterm examination was administered after three weeks and a final examination after five weeks. These examinations were conducted in person for the 2019 hybrid course and virtually for the 2020 CVF course. Seventy-two percent of questions across examinations were unchanged between 2019 and 2020, with the remaining questions either modified for clarity or replaced with newer questions. To mitigate the risk of cheating, we requested that each student adhere to and sign the following honor code statement: “I pledge on my honor that I will not give or receive any unauthorized assistance on this exam.”

Data Collection

We collected deidentified records from ExamSoft, Panopto, and the Canvas learning management system (Instructure; Salt Lake City, UT) and analyzed examination scores and use patterns for the two courses. Examination scores were obtained through ExamSoft. Data on the use of recorded material, such as the number of recorded lectures viewed per student, the percentage of each lecture completed, and the total time spent watching lectures, were collected through Panopto. Information on the use of supplementary resources was extracted from Canvas. To assess consistent use of course material, we counted the number of on-time weekly downloads of each resource, with "on time" defined as downloading the material within the week it was uploaded. Student activity was based on a traditional calendar week, starting on Sunday and ending on Saturday.
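As an illustration of this definition, the minimal sketch below shows one way an on-time weekly download count could be computed from a deidentified download log; the column names and data layout are assumptions for illustration only and do not reflect the actual Canvas or Panopto export formats.

```python
# Minimal sketch of the "on-time weekly download" definition, assuming a
# deidentified log with one row per download event. Column names
# (student_id, resource_id, uploaded_at, downloaded_at) are hypothetical.
import pandas as pd

def week_start(ts: pd.Timestamp) -> pd.Timestamp:
    """Return the Sunday that starts the traditional calendar week (Sun-Sat)."""
    # pandas weekday(): Monday=0 ... Sunday=6, so shift back (weekday + 1) % 7 days.
    return (ts - pd.Timedelta(days=(ts.weekday() + 1) % 7)).normalize()

def on_time_download_counts(log: pd.DataFrame) -> pd.Series:
    """Count, per student, the resources downloaded within the week they were uploaded."""
    log = log.copy()
    log["on_time"] = log.apply(
        lambda r: week_start(r["downloaded_at"]) == week_start(r["uploaded_at"]), axis=1
    )
    # A resource counts once per student if any download of it was on time.
    return (
        log[log["on_time"]]
        .drop_duplicates(subset=["student_id", "resource_id"])
        .groupby("student_id")["resource_id"]
        .count()
    )

# Example usage with fabricated timestamps:
# log = pd.DataFrame({
#     "student_id": ["s1", "s1"],
#     "resource_id": ["quiz1", "quiz2"],
#     "uploaded_at": pd.to_datetime(["2020-08-02", "2020-08-09"]),
#     "downloaded_at": pd.to_datetime(["2020-08-05", "2020-08-20"]),
# })
# print(on_time_download_counts(log))  # quiz1 counts as on time; quiz2 does not
```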

Statistical Analysis

Statistical analysis was conducted using Prism (GraphPad Software). Quantitative variables are expressed as mean ± standard deviation in the text and as 95% confidence intervals in the figures, and qualitative variables are expressed as counts and percentages. Examination scores were assumed to be non-normally distributed, which was confirmed using the D’Agostino-Pearson normality test, so nonparametric tests were used. Kolmogorov-Smirnov tests were used to compare the cumulative grade point average (GPA) and the biology, chemistry, physics, and mathematics (BCPM) GPA between the cohorts. Mann-Whitney tests were used to compare medical students’ Medical College Admission Test (MCAT) scores between the 2019 and 2020 courses. The Wilcoxon signed rank test was performed to compare examination scores between the two courses. The Kruskal-Wallis test was performed to assess for any association between examination scores and factors such as time spent watching lectures and consistent use of supplementary course material. A one-sided P value <0.05 was considered to indicate statistical significance.
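For orientation, the sketch below illustrates how comparisons of this type could be run with SciPy in Python. The score arrays are placeholders generated for the example, the authors' actual analysis was performed in Prism, and the exact test options (e.g., sidedness, tie handling) are assumptions rather than a re-creation of the study's analysis.

```python
# Illustrative use of the tests named above via SciPy; the score arrays
# here are simulated placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores_2019 = rng.normal(83, 12, size=203)   # hypothetical 2019 final exam scores (%)
scores_2020 = rng.normal(86, 7, size=195)    # hypothetical 2020 final exam scores (%)

# D'Agostino-Pearson omnibus normality test (small P suggests non-normality).
k2, p_normality = stats.normaltest(scores_2019)

# Kolmogorov-Smirnov test comparing two cohorts' distributions (e.g., GPA).
ks_stat, p_ks = stats.ks_2samp(scores_2019, scores_2020)

# Mann-Whitney U test comparing scores between two independent cohorts (e.g., MCAT).
u_stat, p_mw = stats.mannwhitneyu(scores_2019, scores_2020, alternative="two-sided")

# Kruskal-Wallis test for an association between a grouping factor
# (e.g., number of on-time quiz downloads) and examination scores.
group0, group1, group2 = scores_2020[:60], scores_2020[60:120], scores_2020[120:]
h_stat, p_kw = stats.kruskal(group0, group1, group2)

print(f"normality P={p_normality:.3f}, KS P={p_ks:.3f}, "
      f"Mann-Whitney P={p_mw:.3f}, Kruskal-Wallis P={p_kw:.3f}")
```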

Results

Demographics

In 2019, 203 students were enrolled in the Foundations of Medicine course, including 134 medical students and 69 pharmacy students (Table 1). The class had a female predominance, with 59% women and 41% men. Ages ranged from 21 to 41 years (mean, 24 yr). The majority of medical and pharmacy students self-identified as Asian (38% and 58%, respectively) or White (34% and 20%, respectively), with smaller percentages of Hispanic, Black, and American Indian/Alaskan Native/Native Hawaiian/Pacific Islander students. The entering medical students had a mean cumulative GPA of 3.77 ± 0.20 and a mean BCPM GPA of 3.73 ± 0.25, while the pharmacy students had a mean cumulative GPA of 3.66 ± 0.17. The mean MCAT score was 516 ± 5. Pharmacy College Admission Test scores were not collected for entering P2 students.

Table 1.

Student demographics

| Characteristic | 2019 Hybrid Course, MS1 (n = 134) | 2019 Hybrid Course, P2 (n = 69) | 2019 Hybrid Course, Overall (n = 203) | 2020 Virtual Course, MS1 (n = 133) | 2020 Virtual Course, P2 (n = 62) | 2020 Virtual Course, Overall (n = 195) | P Value |
| Age, yr | 24 ± 3 | 24 ± 2 | 24 ± 3 | 25 ± 3 | 25 ± 4 | 25 ± 3 | N/A |
| Entering class statistics | | | | | | | |
|  Cumulative GPA | 3.77 ± 0.20 | 3.66 ± 0.17 | 3.73 ± 0.20 | 3.74 ± 0.20 | 3.66 ± 0.16 | 3.71 ± 0.19 | 0.45, 0.89, and 0.52* |
|  BCPM GPA | 3.73 ± 0.25 | N/A | N/A | 3.70 ± 0.23 | N/A | N/A | 0.10 |
|  MCAT | 516 ± 5 | N/A | N/A | 515 ± 5 | N/A | N/A | 0.14 |
| Gender | | | | | | | |
|  Female | 78 (58) | 42 (61) | 120 (59) | 76 (57) | 45 (64) | 121 (62) | N/A |
|  Male | 56 (42) | 27 (39) | 83 (41) | 57 (43) | 17 (36) | 74 (38) | N/A |
| Ethnicity | | | | | | | |
|  Asian | 51 (38) | 40 (58) | 91 (45) | 51 (38) | 31 (50) | 82 (42) | N/A |
|  Black | 8 (6) | 1 (1) | 9 (4) | 7 (5) | 2 (3) | 9 (5) | N/A |
|  Hispanic | 9 (7) | 2 (3) | 11 (5) | 13 (10) | 6 (10) | 19 (10) | N/A |
|  White | 46 (34) | 14 (20) | 60 (30) | 38 (29) | 18 (29) | 56 (29) | N/A |
|  Unknown | 19 (14) | 12 (17) | 31 (15) | 10 (8) | 5 (8) | 15 (8) | N/A |
|  Other† | 0 (0) | 0 (0) | 0 (0) | 7 (5) | 0 (0) | 7 (4) | N/A |
|  Native American Indian, Native Alaskan, Native Hawaiian, or Pacific Islander | 1 (1) | 0 (0) | 1 (0) | 7 (5) | 0 (0) | 7 (4) | N/A |

Definition of abbreviations: BCPM = biology, chemistry, physics, and mathematics; GPA = grade point average; MCAT = Medical College Admission Test; MS1 = first-year medical; N/A = not applicable; P2 = second-year pharmacy.

Data are expressed as mean ± SD or number (percentage).

* Comparison of MS1 students alone, P2 students alone, and the combined students in the two courses, respectively.

† Students who self-reported as “other” described themselves as “other Middle Eastern.”

For the 2020 academic year, a total of 195 students, including 133 medical students and 62 pharmacy students, participated in the introductory course (Table 1). One pharmacy student deferred during the middle of the course, and her data were excluded from statistical analysis. This cohort included 121 (62%) women and 74 (38%) men, ranging from 21 to 51 years of age (mean, 25 yr). The largest proportions of medical and pharmacy students self-identified as Asian (38% and 50%, respectively) and White (29% and 29%, respectively). Medical students in the 2020 academic year had a mean cumulative GPA of 3.74 ± 0.20 and a mean BCPM GPA of 3.70 ± 0.23, while pharmacy students had a mean cumulative GPA of 3.66 ± 0.16. The medical students had a mean MCAT score of 515 ± 5. There were no significant differences in cumulative GPA between the two cohorts for medical students alone, pharmacy students alone, or the combined MS1 and P2 classes (P = 0.45, P = 0.89, and P = 0.52, respectively; Table 1). Furthermore, no differences were observed in the medical students’ BCPM GPA or MCAT scores between the two cohorts (P = 0.10 and P = 0.14, respectively), suggesting that the two classes entered at similar academic levels.

Examination Scores between In-Person and CVF Modalities

The scores for the midterm examinations in the 2019 hybrid in-person and recorded course and the 2020 CVF course were similar, with mean ± standard deviation percentages of 84 ± 12 and 84 ± 11, respectively (P = 0.58; Figure 1A). Although the mean percentage for the final examination was slightly higher in the CVF course (86 ± 7) than in the hybrid course (83 ± 12), the difference was not statistically significant (P = 0.19; Figure 1B). The final examination scores for the CVF course did, however, show less variability, with a narrower standard deviation than in the hybrid course.

Figure 1.

Examination scores for the hybrid in-person and recorded course and the coronavirus disease (COVID-19) virtual format course. (A) Midterm examination scores (P = 0.58). The red line represents the mean. (B) Final examination scores (P = 0.19).

Relationship between Hours of Recorded Lectures Watched and Examination Scores

Use patterns of recorded lectures in the hybrid in-person and recorded and CVF courses were examined. Students were categorized by the total number of hours of lectures watched over the five-week block. In the hybrid course, a right-skewed distribution was observed, with an average of 16 hours of recorded lectures watched (Figure 2A). In the CVF course, a normal distribution was observed, with an average of 31 hours watched (Figure 2B). This average exceeds the total duration of the lecture material (30 h), indicating that some students rewatched lectures. There was no association between the number of hours of recorded lectures watched and final examination scores for either the hybrid or the CVF course (P = 0.97 and P = 0.42; Figures 2C and 2D, respectively). Specifically, students who rewatched recorded lectures on multiple occasions did not consistently score higher on the final examination (Figures 2C and 2D).

Figure 2.

Relationship between lecture recordings watched and examination scores. Students were separated into 6-hour increments for the 2019 hybrid in-person and recorded course and 5-hour increments for the 2020 CVF course. (A) Number of students in each 6-hour block in the 2019 hybrid in-person and recorded course. The orange line represents the total number of lecture hours (36.08 h). (B) Number of students in each 5-hour block in the 2020 CVF course. The orange line represents the total number of lecture hours (30.15 h). (C) Final examination scores of students in each 6-hour block for the 2019 hybrid in-person and recorded course. (D) Final examination scores of students in each 5-hour block for the 2020 CVF course. (E) Students were categorized by the number of lecture recordings viewed in the 2019 hybrid in-person and recorded course. For example, 18 students watched one or two lectures (“2” on the x-axis). (F) Students were categorized by the number of lecture recordings watched in the 2020 CVF course. (G) Percentage of the class that watched each lecture in the 2019 hybrid in-person and recorded course. The 38 lectures are denoted L1–L38. The green line represents the average of the 38 percentages generated across the lectures for the 2019 hybrid in-person and recorded course (40 ± 13%). (H) Percentage of the class that watched each lecture in the 2020 CVF course. The orange bar denotes the synchronous lectures (i.e., held live virtually). The green line represents the average of the 38 percentages generated across the lectures for the 2020 virtual course (80 ± 27%). (I) Average completion rate of each lecture for the 2019 hybrid in-person and recorded course. The green line represents the average of each lecture’s completion rate for the 2019 hybrid in-person and recorded course (47 ± 16%). (J) Average completion rate of each lecture for the 2020 virtual course. The orange bar denotes the synchronous lectures (i.e., held live virtually). The green line represents the average of each lecture’s completion rate for the 2020 virtual course (86 ± 14%). CVF = coronavirus disease (COVID-19) virtual format; ID = identifier.

Thirty-eight lectures were unchanged across both years and were examined to determine the number of lectures each student viewed. In the 2019 hybrid in-person/recorded course, students viewed an average of 15 ± 11 of the recorded lectures (Figure 2E). In the virtual course, students viewed an average of 31 ± 5 of the recorded lectures (Figure 2F). By reviewing the details of each individual lecture, we were able to determine the percentage of the class who viewed each lecture and its completion rate. In the hybrid course, each lecture was viewed on average by 40 ± 13% of the entire class, and there was an average completion rate of 47 ± 16% for each lecture (Figures 2G and 2I). In contrast, for the 2020 virtual course, each lecture was viewed on average by 80 ± 27% of the class, with a completion rate of 86 ± 14% (Figures 2H and 2J).

Impact of Supplementary Course Material Used and Examination Scores

Use patterns of the supplementary material provided to the students were analyzed. During this course, 79% of the students in the hybrid course (160 of 203) and 74% of the students in the 2020 CVF course (144 of 195) downloaded a practice quiz at least once. Consistent use of practice quizzes and TOLOs was defined as regularly downloading the material during the same week it was uploaded. In the hybrid course, students who demonstrated consistent use of practice quizzes scored significantly higher on the final examination than students who did not consistently download these materials (P < 0.01; Figure 3C). There was no relationship between on-time use of practice quizzes and midterm examination scores (P = 0.52; Figure 3A). In the CVF course, students who downloaded the practice quizzes regularly scored significantly higher on the midterm examination than students who did not routinely download them (P < 0.0001; Figure 3B). However, there was no difference in final examination scores between those who did and did not consistently use the practice quizzes on time (P = 0.09; Figure 3D). In this introductory course, 99% of the students in the 2019 course (201 of 203) and 86% of the students in the virtual course (168 of 195) downloaded at least one of the TOLOs. In the hybrid course, students who routinely used TOLOs had higher scores on their final examinations (P = 0.02; Figure 4A), but there was no relationship between regular on-time use of TOLOs and final examination scores in the CVF course (P = 0.13; Figure 4B).

Figure 3.

Relationship between use of practice quizzes and examination grades. Students were sorted on the basis of the number of times they downloaded a quiz the same week it was uploaded. (A) Comparison of the number of on-time weekly quiz downloads and midterm scores for the 2019 hybrid in-person and recorded course. (B) Comparison of the number of on-time weekly quiz downloads before the midterm examination and midterm scores for the 2020 CVF course (P < 0.0001 for zero vs. one and P = 0.0037 for zero vs. two). (C) Comparison of the number of on-time weekly quiz downloads and final examination scores for the 2019 hybrid in-person and recorded course (P = 0.009 for zero vs. five). (D) Comparison of the number of on-time weekly quiz downloads and final examination scores for the 2020 CVF course (P = 0.086). **P < 0.01 and ****P < 0.0001. CVF = coronavirus disease (COVID-19) virtual format.

Figure 4.

Relationship between use of task-oriented learning objectives (TOLOs) and examination grades. Students were sorted on the basis of the number of times they downloaded the TOLOs the same week they were uploaded. (A) Comparison of the number of on-time weekly TOLO downloads and final examination scores for the 2019 hybrid in-person and recorded course. (B) Comparison of the number of on-time weekly TOLO downloads and final examination scores for the 2020 coronavirus disease (COVID-19) virtual format course.

Discussion

The COVID-19 pandemic forced the preclerkship curriculum to transition from a hybrid in-person and recorded to completely virtual format. In this retrospective cohort study, we determined that mean examination scores were similar in the two courses, suggesting that the quality of education and knowledge acquired were equivalent independent of the modality used to deliver the content. In addition, we determined that on-time use of practice quizzes was associated with higher examination scores, and thus, these additional resources can be used to optimize student learning.

The move to an exclusively remote teaching environment caused a fundamental shift in medical education. Although medical educators expressed concerns that students would have difficulty comprehending lecture material on a virtual platform because of decreased access to faculty members, we found no overt evidence of a difference in knowledge acquisition with the transition to this format. Specifically, midterm and final examination scores were similar in the two courses. Interestingly, although there was no statistically significant difference in final examination grades between the hybrid in-person and recorded course and the CVF course, students in the CVF course had a slightly higher mean and narrower variance than students in the hybrid course. This difference may reflect the beneficial aspects of a technology-enhanced teaching modality. Specifically, the virtual learning platform provides a flexible framework: the availability of prerecorded lectures allows students to individualize their schedules and prioritize learning on the basis of their own habits. Other studies of the effectiveness of online curricula have yielded mixed results (12, 13), suggesting the need for additional research in this arena.

Over the years, much time and effort have been put toward developing and distributing additional educational materials for use by both MS1 and P2 students; however, there are no data regarding whether the use of these materials leads to increased knowledge acquisition. As part of our study, we first evaluated use patterns of these supplemental online educational resources. In the 2019 hybrid course, each recorded lecture was watched by only 40 ± 13% of the class on average, and students viewed an average of 15 ± 11 of the 38 recorded lectures. This distribution was expected, as a significant portion of the students attended the lectures in person. Because exact in-person attendance was not obtained, a direct comparison with the subset of individuals who participated only in face-to-face lectures could not be conducted. In 2020, each recording was watched on average by 80 ± 27% of the class. As the CVF course relied only on online videos, the students would, on average, be expected to watch more recorded material than in the 2019 course, which offered both in-person lectures and online recordings. Even in 2020, a percentage of the class did not consistently watch these recorded lectures, which is not unexpected, as prior studies have shown that a majority of students do not view all asynchronous learning modules provided (5).

An interesting finding is the viewing rate for the 12 lectures held synchronously in 2020. A lower percentage of students watched the recordings of these live-streamed lectures. Although attendance at the live virtual sessions was not recorded, we suspect that students preferentially watched these lectures live (i.e., synchronously) rather than as recordings (i.e., asynchronously). In fact, a survey of the students in the 2020 CVF course showed that, on average, they would prefer that at least 28% of lectures be delivered live on Zoom rather than prerecorded and posted, suggesting that these virtual formats can be better optimized to meet the needs of the learners.

In our study, the number of hours spent watching or rewatching lecture material did not correlate with student performance on examinations. This finding supports the widespread belief that passive learning has less impact on student comprehension (14). Knowledge acquisition may not improve with repeated rewatching of lectures because passive viewing of information is known to be less effective at conveying a deep understanding of complex topics. Students may benefit from reallocating time spent on passive learning to more productive activities, such as small-group study sessions and weekly interactive review sessions. Medical educators who use these virtual formats may consider embedding more interactive elements to encourage participation despite an asynchronous approach (15). Future studies are needed to determine whether recorded didactics can be improved to increase student engagement and overall knowledge acquisition.

When we assessed the impact of supplementary learning materials on overall knowledge acquisition, we found that routine on-time use of some supplementary resources was associated with better student performance. Specifically, consistent use of practice quizzes was associated with higher midterm examination scores in the CVF course and higher final examination scores in the hybrid in-person and recorded course. These additional course materials likely facilitate student learning by encouraging data retention, identifying knowledge gaps, and alleviating anxiety. This study did not assess the impact of practice quizzes and TOLOs on non–examination-related factors such as student comfort and learning environment. Future studies are needed to assess the effects of supplementary educational materials on non–knowledge-based aspects of medical education.

Strengths and Limitations

Our study has several limitations. Students differ each year, which may make a direct comparison of examination scores and use patterns difficult. Although the majority of the lectures, small groups, examinations, and facilitators were similar, there were still slight differences in these variables between the two courses. Because the noncurricular schedule changed between the two academic years, there may be other confounding variables that were unaccounted for in this study. Because live lecture attendance was not taken, we cannot perform comparisons with the subset of individuals who used only the in-person curriculum. This study assumes that higher examination scores equate to knowledge acquisition. Assessing knowledge acquisition can be challenging when comparing in-person and online examinations, as online examinations are at higher risk for cheating, which may prevent an accurate assessment of student performance. We evaluated the impact of this curriculum on short-term knowledge assessments, including the midterm and final examinations, but did not evaluate its impact on long-term knowledge retention. In the future, once both cohorts have completed the United States Medical Licensing Examination Step 1, we will conduct a follow-up analysis of the impact of hybrid versus virtual learning on long-term knowledge retention. Our conclusions are based on material downloaded by each student and do not account for the possibility that students shared course material or did not use the material after downloading it. As student activity was based on the traditional calendar week, the analysis may be affected by this delineation. We did not evaluate other important aspects of the medical curriculum, such as the development of personal identity, self-reflection, and student-to-student interaction, all of which may be difficult to achieve in a solely online environment.

Conclusions

Our study showed no major difference in the quality of education provided or the knowledge acquired between a hybrid in-person and recorded curriculum and an exclusively virtual curriculum. In addition, consistent on-time use of practice quizzes was associated with higher examination scores, suggesting that these supplementary resources are an effective tool to augment medical education in a virtual environment. Future research is needed to better understand how curricular changes affect knowledge acquisition and how educational materials may be deployed to optimally support medical trainee learning and development.

Footnotes

Supported by VA Merit Award 1I01BX004767 (L.E.C.A.) and NIH NHLBI grants K24 HL155884 and R01 HL147326 (L.E.C.A.).

Author Contributions: All authors contributed substantially to the conception and design of the work, the collection and interpretation of data, and drafting and revising the manuscript.

Author disclosures are available with the text of this article at www.atsjournals.org.

References

1. Rose S. Medical student education in the time of COVID-19. JAMA. 2020;323:2131–2132. doi: 10.1001/jama.2020.5227.
2. Robin BR, McNeil SG, Cook DA, Agarwal KL, Singhal GR. Preparing for the changing role of instructional technologies in medical education. Acad Med. 2011;86:435–439. doi: 10.1097/ACM.0b013e31820dbee4.
3. Han H, Resch DS, Kovach RA. Educational technology in medical education. Teach Learn Med. 2013;25:S39–S43. doi: 10.1080/10401334.2013.842914.
4. Emanuel EJ. The inevitable reimagining of medical education. JAMA. 2020;323:1127–1128. doi: 10.1001/jama.2020.1227.
5. Lew EK, Nordquist EK. Asynchronous learning: student utilization out of sync with their preference. Med Educ Online. 2016;21:30587. doi: 10.3402/meo.v21.30587.
6. Vo T, Ledbetter C, Zuckerman M. Video delivery of toxicology educational content versus textbook for asynchronous learning, using acetaminophen overdose as a topic. Clin Toxicol (Phila). 2019;57:842–846. doi: 10.1080/15563650.2019.1574974.
7. Kay D, Pasarica M. Using technology to increase student (and faculty satisfaction with) engagement in medical education. Adv Physiol Educ. 2019;43:408–413. doi: 10.1152/advan.00033.2019.
8. Caldwell KE, Hess A, Wise PE, Awad MM. Maintaining effective senior resident-led intern education through virtual curricular transition. J Surg Educ. 2021;78:e112–e120. doi: 10.1016/j.jsurg.2021.05.009.
9. Tuma F, Nituica C, Mansuri O, Kamel MK, McKenna J, Blebea J. The academic experience in distance (virtual) rounding and education of emergency surgery during COVID-19 pandemic. Surg Open Sci. 2021;5:6–9. doi: 10.1016/j.sopen.2021.03.001.
10. Nujum ZT, Devanand P, Remya G, Anuja U. Efficacy of an online course in developing competency for prescribing balanced diet by medical students: a non-inferiority trial. Indian J Public Health. 2021;65:51–56. doi: 10.4103/ijph.IJPH_1248_20.
11. Song L, Singleton ES, Hill JR, Hwa Koh M. Improving online learning: student perceptions of useful and challenging characteristics. Internet High Educ. 2004;7:59–70.
12. Thom ML, Kimble BA, Qua K, Wish-Baratz S. Is remote near-peer anatomy teaching an effective teaching strategy? Lessons learned from the transition to online learning during the COVID-19 pandemic. Anat Sci Educ. 2021;14:552–561. doi: 10.1002/ase.2122.
13. Rossettini G, Geri T, Turolla A, Viceconti A, Scumà C, Mirandola M, et al. Online teaching in physiotherapy education during COVID-19 pandemic in Italy: a retrospective case-control study on students’ satisfaction and performance. BMC Med Educ. 2021;21:456. doi: 10.1186/s12909-021-02896-1.
14. Freeman S, Eddy SL, McDonough M, Smith MK, Okoroafor N, Jordt H, et al. Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci U S A. 2014;111:8410–8415. doi: 10.1073/pnas.1319030111.
15. Dong C, Goh PS. Twelve tips for the effective use of videos in medical education. Med Teach. 2015;37:140–145. doi: 10.3109/0142159X.2014.943709.
