PLOS One. 2020 Dec 18;15(12):e0243958. doi: 10.1371/journal.pone.0243958

How medical education survives and evolves during COVID-19: Our experience and future direction

Ju Whi Kim 1, Sun Jung Myung 1,*, Hyun Bae Yoon 1, Sang Hui Moon 1, Hyunjin Ryu 1, Jae-Joon Yim 1,2
Editor: Cesario Bianchi3
PMCID: PMC7748283  PMID: 33338045

Abstract

Background

Due to the outbreak of coronavirus disease 2019 (COVID-19), school openings were postponed worldwide to stop its spread, and most classes, including medical school classes, moved online. The authors present their experience of running such online classes alongside an offline clinical clerkship under pandemic conditions, together with data on student satisfaction, academic performance, and preference.

Methods

The medical school changed every first-year to fourth-year course to an online format, except the clinical clerkship, clinical skills training, and basic laboratory classes such as anatomy lab sessions. Online courses consisted of pre-recorded video lectures or live classes streamed using video communication software. At the end of each course, students and professors were asked to report their satisfaction with the online course and comment on it. The authors also compared students’ academic performance before and after the introduction of online courses.

Results

A total of 69.7% (318/456) of students and 35.2% (44/125) of professors answered the questionnaire. Students were generally satisfied with the online course, and 62.2% of them preferred the online course to the offline course. The majority (84.3%) of students wanted to keep the online course after the end of COVID-19. In contrast, just 13.6% of professors preferred online lectures, and half (52.3%) wanted to go back to the offline course. With the introduction of online classes, students' academic achievement did not change significantly in four subjects but decreased in two.

Conclusions

The inevitable transformation of medical education caused by COVID-19 is still ongoing. As the safety of students and the training of competent physicians are the responsibilities of medical schools, further research into how future physicians will be educated is needed.

Introduction

Medical education has been changing gradually, and one significant part of this change has been the introduction of online learning, which is now widespread not only in medical education but in many other fields [1]. Online learning has been demonstrated to be as effective as conventional didactic teaching and can be used to promote self-directed learning [2, 3]. According to a recently published meta-analysis, blended learning for the medical professions, comprising face-to-face and online learning, increased knowledge compared with education using only one or the other method [4]. However, many medical schools in Korea still rely on face-to-face lectures, and many professors prefer offline lectures to online ones.

Schools are closed in many parts of the world to mitigate the outbreak of COVID-19 [5]. In the Republic of Korea, after the number of confirmed cases increased exponentially following regional outbreaks in the Daegu and Gyeongbuk areas linked to religious gatherings (Shincheonji), the Ministry of Education postponed the start of the new school year until late May 2020 [6]. Moreover, the risk alert level for infectious diseases was upgraded to "serious." We no longer had the option to choose between online and offline lectures: all face-to-face classes had to move online, and non-lecture practicums such as anatomy labs and clinical skills training had to be implemented in a way that minimized the risk of infection. To minimize the spread of infection, we made it mandatory for all students and professors to wear masks, keep 2 meters apart, fill out a health condition questionnaire, and measure their body temperature before these classes every day.

In this study, we present our experience in running a medical school curriculum under COVID-19 pandemic conditions by moving all offline classes online and minimizing face-to-face practices. We also present data on student satisfaction, problems, and achievements, and some perspectives on the future.

Methods

Every course for all years from first-year to fourth-year medical students, except basic laboratory practicums such as anatomy labs and clinical clerkships, was switched to an online program.

Curricular change

In the first semester of 2020, the school opening was postponed due to the regional COVID-19 outbreaks involving the Daegu and Gyeongbuk areas in February 2020. Because of the outbreak, the courses were re-organized and re-opened online 2 weeks later. Online learning was run on the e-Teaching and Learning System, the learning management system of Seoul National University, and was delivered mainly through pre-recorded lecture video clips, with some courses using live online classes.

The first year started with a 1-week integrated medical humanities course followed by basic medical science courses such as anatomy, biochemistry, and physiology. Basic laboratory classes, which require face-to-face contact and used to run in parallel with the lectures, were postponed until social distancing was loosened in early May. We first moved all basic science lectures online, and for basic laboratory classes we divided students into small groups to reduce the spread of possible infections. To protect our students against infection and toxic materials, we provided personal protective equipment (PPE) such as face shields and masks. We also asked students to fill out a health condition questionnaire and measured their body temperature before class every day.

The second-year curriculum was mainly composed of an organ-system-based integrated course. When the medical school stopped face-to-face classes in late February, the second-year curriculum was in the middle of the integrated gastrointestinal system course. The first half of the gastrointestinal system course was delivered in the classroom and the other half was delivered online using lecture video clips. The courses that followed, such as those on the respiratory system and circulatory system, were also delivered mainly online using video clips.

The third-year curriculum was composed of the core clinical clerkship covering internal medicine, surgery, and other departments. As infection rates increased, we put off the clinical clerkship until late April and ran a 2-week online course on integrated medical humanities. We also provided online clinical didactic sessions to allow for later entry into the clinical clerkship. After social distancing was loosened by the government, students could participate in the core clinical clerkship at the hospital under new COVID-19 guidance; they were not involved in the care of patients with suspected or confirmed COVID-19. Students were required to take preventive measures against the epidemic, such as hand washing and wearing a mask, and were allowed COVID-19 testing if indicated. The fourth-year curriculum consisted of elective clinical clerkships. As with the third-year curriculum, online clinical didactic sessions were provided first, and the clerkship started after the social distancing requirement was loosened. A schematic diagram summarizing the curriculum changes is presented in Fig 1.

Fig 1. Schematic diagram of the curriculum.


The courses were re-organized to be delivered mainly online. After the social distancing requirement was loosened by the government, students could participate in face-to-face activities.

Subjects

The subjects of the study were first-, second-, third-, and fourth-year medical students and professors at Seoul National University College of Medicine (Republic of Korea). In the year 2020, there were 145 students in the first and second years of the medical course, 155 students in the third year, and 149 students in the fourth year. Professors who participated in online courses were included in this study.

Survey

After each course, students were asked to complete a questionnaire that included the following items: 1) overall satisfaction with the online course, 2) satisfaction with technical aspects of the online lectures, 3) preference for an online course, 4) strengths of the online course, 5) weaknesses of the online course, and 6) any other comments or suggestions regarding the online course. Students were asked to respond using a 5-point scale that ranged from 1 (very dissatisfied) to 5 (very satisfied). Professors who participated in online courses were asked to complete a questionnaire similar to the students’ questionnaire and revised for the professors.

Academic achievement

In the midst of the COVID-19 crisis, we had many concerns about how to conduct academic testing. Since proper assessment is part of learning, and minimizing the spread of infection was also important, test timing and methods raised many concerns. After each course, we evaluated students’ academic performance through test questions written by the professors who ran the course. After the large outbreak, the number of new infections remained under 30 cases per day, and the requirement for social distancing was relaxed in late April. We decided to proceed with offline examinations because academic misconduct in online examinations is a key concern of many educators [7]. With preventive measures such as hand washing, mask wearing, and keeping 2 meters apart, we divided students into small groups and recruited additional professors and officers to conduct the exams under infection control guidelines. By checking for symptoms such as fever on the test day, we ensured that any student with symptoms took the test alone at a prepared place or was instructed to apply for reexamination. The examination was composed of multiple-choice questions.

We analyzed the distributions and means of the scores to find out whether there was a difference in students’ academic performance with the introduction of online learning.
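The year-to-year comparison of mean scores can be illustrated with a one-way ANOVA. The sketch below uses synthetic cohorts whose means and spreads are loosely modeled on the summary statistics reported in Table 3; the study's raw scores are not public, so all numbers here are assumptions for illustration only.

```python
import numpy as np
from scipy import stats

# Synthetic cohorts only, loosely modeled on Table 3 (not the study data).
rng = np.random.default_rng(7)
scores_2018 = rng.normal(86, 7, 150)
scores_2019 = rng.normal(88, 10, 147)
scores_2020 = rng.normal(82, 11, 143)

# One-way ANOVA: does at least one yearly mean differ from the others?
f_stat, p_value = stats.f_oneway(scores_2018, scores_2019, scores_2020)
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")
```

With cohort means this far apart relative to their standard errors, the ANOVA flags a significant difference between years, which is the pattern reported for the anatomy and respiratory system courses.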

Statistical analysis

Statistical analysis was performed using SPSS (version 23; IBM SPSS Statistics) and SAS (version 9.3; SAS Institute). We performed Pearson’s chi-squared test as a measure of association. Means were compared using analysis of variance (ANOVA). We used a mixed effects model to identify patterns of score change over the years and to determine the effect of online classes on academic achievement. Effect sizes and 95% confidence intervals were calculated using Cohen’s d. P values < 0.05 were taken to indicate statistically significant differences.
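The analyses were run in SPSS and SAS; as an illustration of the same model structure, a mixed effects model with a random intercept per student can be sketched in Python with statsmodels. The data below are entirely synthetic (student IDs, subjects, and the assumed −3-point 2020 effect are all invented for this sketch), mimicking the setup where each student's scores across several subject exams are correlated.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration only: each yearly cohort of students takes the
# same set of subject exams, so scores are repeated measures per student.
rng = np.random.default_rng(42)
rows = []
for year in (2018, 2019, 2020):
    for i in range(60):
        student = f"{year}-{i}"               # hypothetical student ID
        ability = rng.normal(0, 4)            # student-level random effect
        drop = -3.0 if year == 2020 else 0.0  # assumed online-class effect
        for subject in ("anatomy", "histology"):
            rows.append({
                "student": student,
                "year": str(year),
                "subject": subject,
                "score": 85.0 + ability + drop + rng.normal(0, 5),
            })
df = pd.DataFrame(rows)

# Fixed effects for year, subject, and their interaction; a random
# intercept per student accounts for the correlated repeated measures.
model = smf.mixedlm("score ~ C(year) * C(subject)", df, groups=df["student"])
result = model.fit()
print(result.summary())
```

The `C(year)[T.2020]` coefficient then estimates the 2020 shift relative to the 2018 baseline, analogous to the comparison reported in the Results.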

Ethical considerations

The Seoul National University College of Medicine institutional review board provided study approval and waived the requirement for written informed consent (IRB No.2003-159-1111).

Results

A total of 69.7% (318/456) of students and 35.2% (44/125) of professors answered the questionnaire.

Student satisfaction with the online course

Students were generally satisfied with the online course (Table 1). They agreed that they were "generally satisfied with the course" (3.97/5), that "the educational objectives of the course were clearly presented" (4.14/5), that "the course was organized well" (4.08/5), and that "the volume of learning was reasonable" (3.85/5). Among online courses, students were most satisfied with the integrated medical humanities course. Average satisfaction with individual lectures did not differ significantly from the previous year.

Table 1. Students’ satisfaction with the online course.

Item Mean (SD) *
Overall satisfaction on the course
    • I am generally satisfied with the course 3.97 (0.95)
    • The educational objectives of the course were clearly presented 4.14 (0.87)
    • The course lectures were well-organized in relation to each other 4.08 (0.94)
    • I am generally satisfied with the volume of learning 3.85 (1.10)
Technical aspects of online lectures
    • I am generally satisfied with the progress of the online lecture 3.96 (1.11)
    • I am generally satisfied with the video quality of the lecture 4.13 (0.96)
    • I am generally satisfied with the sound quality of the lecture 3.82 (1.17)
    • I am generally satisfied with the speed of the lecture 3.92 (1.05)
    • Feedback via email was done properly 3.73 (1.06)
The strengths of online learning
    • Taking the course at any time 4.64 (0.67)
    • Taking the course anywhere 4.66 (0.63)
    • Flexibility in the sequence of the lecture 4.07 (1.11)
    • Playing the lecture at any speed they want 3.72 (1.33)
    • Reviewing multiple times any portion of the lecture 4.57 (0.78)
The weaknesses of online learning
    • Lack of interaction between professor and student 3.10 (1.20)
    • Lack of interaction among students 3.11 (1.33)
    • Difficulty in concentrating during online lectures 2.86 (1.40)
    • Difficulty in maintaining self-directed learning 2.73 (1.32)

5-point Likert scale: 5: strongly agree, 4: agree, 3: neither agree nor disagree, 2: disagree, and 1: strongly disagree.

*SD: standard deviation.

As to their satisfaction with the technical aspects of online lectures, students answered that they were satisfied with the video quality, sound, and speed of the video clips. Students pointed out the following strengths of online learning: 1) they can take the course at any time they want (4.64/5), 2) they can take the course anywhere they want (4.66/5), 3) they can review any portion of the lecture multiple times (4.57/5), 4) they can alter the sequence of the lectures (4.07/5), and 5) they can play the lecture at any speed they want (3.72/5). They pointed out that the weaknesses of online learning were the lack of interaction between the professor and each student and among students. As to difficulty in concentrating during online lectures or difficulty in maintaining self-directed learning, students answered neutrally (2.86/5 and 2.73/5, respectively).

Faculty satisfaction with the online course

The professors were satisfied with the guide for online lectures (4.05/5), the overall process of online class operation (3.77/5), and the technical aspects of online lectures (3.81/5). They pointed out that the strengths of online learning were that “they can give a lecture anywhere (3.68/5) and anytime they want (4.01/5).” As for the weaknesses of online learning, the lack of interaction between professor and student (2.02/5) and difficulty in grasping the students’ level of understanding (1.93/5) were suggested (Table 2). They also answered that it took more time and effort to prepare lectures, and that they had difficulty in preparing lecture materials due to copyright issues and personal information protection.

Table 2. Professors’ satisfaction with the online course.

Mean (SD)*
Overall satisfaction with the course
    • Guidance on online training was appropriate and easy to understand 4.05 (0.77)
    • Online teaching (making a lecture video clip or conducting a live online class) was easy 3.77 (1.15)
    • The environment for making the lecture clip was satisfactory 3.91 (0.95)
    • There was no inconvenience in booking the place to make the lecture clips 3.81 (1.06)
    • I am satisfied with the QnA** process after class 2.02 (2.02)
The strengths of online learning
    • Giving the lecture at any time 4.02 (0.84)
    • Giving the lecture anywhere 3.68 (1.06)
    • Correcting the part of the lecture flexibly 3.48 (0.97)
    • Using the given class time more efficiently 3.09 (1.04)
The weaknesses of online learning
    • Taking more time and effort to prepare for the online lecture 2.57 (0.86)
    • Copyright issues make it difficult to prepare lecture materials 2.66 (0.74)
    • The computers and related equipment for online lectures are unfamiliar 3.09 (1.02)
    • Difficulty in grasping the students’ level of understanding 1.93 (0.94)
    • Lack of interaction between professor and student 1.66 (0.93)

5-point Likert scale: 5: strongly agree, 4: agree, 3: neither agree nor disagree, 2: disagree, and 1: strongly disagree.

*SD: standard deviation.

**QnA: question and answer

Preference for online learning

Students’ preference for online lectures was much higher than for offline lectures (online vs. offline: 63% vs. 29%), and they preferred recorded video (75.5%) to live online classes (11.3%) across all years and courses (Fig 2). In contrast, professors preferred offline lectures (77.3%) over online lectures (13.6%) and preferred live online classes (61.3%) to recorded video (31.9%). In the survey on future education plans, 84.3% of students wanted to keep online courses even after the COVID-19 pandemic ends; among them, 45.5% wanted to combine offline and online classes, and 38.8% wanted most lectures to remain online. Although the proportion was lower than among students, 47.7% of professors also hoped to maintain online lectures, while over half (52.3%) wanted to go back to offline lectures.

Fig 2. Students’ and professors’ preference for online learning.


Students preferred online lectures over offline lectures, and video-recorded lectures over real-time lectures.

Academic performance

To compare achievement, we compared examination scores from 2018 to 2020, although the exams were not standardized for difficulty. As the competencies to be acquired through each course did not change significantly, and the composition of the professors who wrote the test questions was similar, we expected the difficulty level of the exams not to change much. In some courses, such as the anatomy course and the respiratory system course, the mean score in 2020 was lower than in 2018 or 2019 (Table 3). However, since the mean exam score for each subject changes year by year, and the ANOVA only indicates whether the difference in yearly means is significant, it is necessary to analyze whether the overall pattern of change is significant and, even if it is, how meaningful the magnitude of the change is.

Table 3. Students’ examination scores for 3 years (2018–2020).

Subject (N: 2018/2019/2020)  2018 Mean (SD)*  2019 Mean (SD)*  2020 Mean (SD)*  p-value
Anatomy (150/147/143)  86.0 (7.0)  88.1 (10.3)  82.0 (11.5)  <0.0001
Biochemistry (149/152/144)  79.7 (11.5)  70.9 (17.1)  74.1 (17.3)  <0.0001
Histology (152/150/144)  86.2 (6.7)  85.1 (12.9)  83.4 (12.0)  0.0754
Gastrointestinal system (150/153/145)  86.6 (8.8)  88.4 (10.5)  85.9 (10.4)  0.0825
Respiratory system (151/158/145)  78.7 (13.1)  88.2 (9.2)  76.9 (11.7)  <0.0001
Circulatory system (150/157/145)  79.2 (10.6)  80.1 (10.5)  77.3 (12.1)  0.0854

*SD: standard deviation.

Because exam scores are repeated measurements, with each student taking several subject tests whose scores are correlated, a mixed model analysis was performed. Subjects were analyzed as fixed effects, and the year-by-subject interaction was included to check whether the pattern of change differed by subject. The resulting p-value was less than 0.0001, indicating that the pattern of change differed significantly between subjects. We therefore analyzed whether scores differed across 2018, 2019, and 2020 for each subject. Since the conversion to online classes due to COVID-19 was a major change, we compared the average of the 2018 and 2019 scores with the 2020 score to investigate its impact.

For anatomy, the average score for 2018 and 2019 was 86.67, higher than the 2020 average of 82.55. For the remaining subjects as well, 2020 scores were lower than the 2018–2019 average (the mean difference was negative for all subjects). The differences were statistically significant for anatomy and the circulatory and respiratory systems (P = 0.0004, 0.0138, and <0.0001, respectively) (S1 Table). Using the mixed model, we found that exam scores in some subjects changed significantly with the introduction of online classes. To analyze the overall score change across years, subjects were treated as random effects, considering that difficulty may differ between subjects. There was an overall difference between years: the 2020 score was 2.10 points lower than the average of the 2018 and 2019 scores, and this difference was statistically significant (P = 0.0001).

Since the ANOVA results and the mixed model results were similar, the effect size of the difference between the 2020 scores and the average of the 2018 and 2019 scores was calculated using the mean and standard deviation rather than the least squares mean (Table 4). The effect sizes for the anatomy, respiratory system, and circulatory system course scores were -0.5150, -0.5504, and -0.2116, respectively. For the anatomy and respiratory system courses, the change in academic achievement with online classes was moderate; for the circulatory system, it was small.

Table 4. Effect size of students’ examination scores of 2020 compared to 2018 and 2019.
Subject Cohen’s d effect size
Anatomy -0.5150
Biochemistry -0.0754
Histology -0.2127
Gastrointestinal system -0.1605
Respiratory system -0.5504
Circulatory system -0.2116
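The effect sizes in Table 4 are Cohen's d values computed from summary statistics. A minimal sketch of the pooled-standard-deviation form (the group means, SDs, and sizes below are illustrative round numbers, not the study's raw data):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d with the pooled standard deviation as the denominator."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Illustrative values only: a 2020 group scoring 5 points lower than a
# pooled 2018-2019 comparison group with the same spread.
d = cohens_d(82.0, 10.0, 143, 87.0, 10.0, 297)
print(round(d, 2))  # -0.5
```

A d near -0.5 is conventionally read as a moderate effect, matching how the anatomy and respiratory system decreases are characterized above.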

Discussion

In this study, we present our experience of moving our classes online and our survey of students’ and professors’ feedback. We continued the offline clinical clerkship, clinical skills training, and basic laboratory classes with preventive measures such as PPE. Contrary to the professors' concerns, students were generally satisfied with the online courses and seemed to adjust well. Moreover, they preferred online to offline lectures and wanted to keep the online courses even after the COVID-19 pandemic is over. Professors, in contrast, preferred offline lectures, and more than half of them wanted to go back to them. Students' academic performance did not differ significantly from the years before the curricular change in most courses, although test scores decreased in some courses, notably anatomy and the respiratory system.

The educational effects of online learning have been demonstrated through research. Its biggest advantage is that learning is possible at any time and anywhere, using the internet. Online learning also allows for learner-oriented learning: each student can study at their own speed and repeat what is needed, ultimately enabling them to learn according to their ability. A systematic review of 59 studies suggested that online learning is equivalent to traditional teaching in terms of knowledge and skills gained and student satisfaction [8]. In addition, online learning, which uses a variety of multimedia content, can be useful not only in medical classes where photographs, paintings, or other images are used to describe clinical presentations, but also in the evaluation of students’ academic performance. Given these advantages, good student and professor satisfaction with the online courses in this study could be expected. Satisfaction nevertheless differed by course type. We assume that satisfaction with the basic medical science courses was lower because the laboratory classes (cadaver dissection) were not provided in a timely way and total laboratory practice was insufficient, given the relatively short time allocated compared with 2019. As laboratory practice can enhance students’ learning, the low test scores in the anatomy course might have been caused by the shortage of timely anatomy practice sessions.

It is very important to maintain students’ academic achievement after the conversion to online classes. In our study, moderately decreased exam scores were observed in the anatomy and respiratory system courses. Accurate comparison is difficult because the degree of difficulty may vary between tests, but a statistically significant decrease was observed in these two subjects. In anatomy, the aforementioned lack of practice seems to be the cause; in the respiratory system course, we judged that precise comparison was difficult because of the unusually large annual variation in the difficulty index. Apart from these explanations, if the academic achievement of medical students really declines with online learning, this is a serious problem. It is necessary to observe whether actual academic achievement decreases and, if so, to find ways to resolve the decrease. It is perhaps predictable that the effectiveness of medical education decreases when practice is insufficient.

Interestingly, students preferred recorded lectures over live online lectures, whereas professors preferred live online lectures. This finding contrasts with the result of Brockfeld et al. that students preferred live online lectures [9]. Recently, Ashokka et al. introduced the transition of lectures to online streaming with interactive components set according to the pandemic alert level [10–12]. Students in this study were dissatisfied with the disadvantages of live online lectures, such as having to sit in front of a laptop at a fixed class time and being unable to replay the class.

As for the limitations of online learning, the practical problems associated with designing and developing online learning programs are drawbacks. Professors' conservative tendencies and reluctance also serve as obstacles to the spread and long-term adoption of online learning in medical education. As many scholars have pointed out [13–15], professors familiar with traditional teaching methods are reluctant to introduce online learning into their courses and feel burdened by the current situation of having to do so. Active faculty support at the college level, and close cooperation through multiple meetings between schools, faculty, and students, helped ease this situation [10].

Since the COVID-19 outbreak rapidly transitioned into a worldwide pandemic, we are facing unprecedented times. This pandemic has disrupted medical education, will change many things, and will make it difficult to go back to the past; the term "new normal" has been coined [16]. Even before the COVID-19 era, online lectures were already showing their effectiveness and were being used by many educational institutions. The pandemic made offline lectures disappear, and most lectures are currently delivered online. However, there is still no substitute for the clinical clerkship, which is the core curriculum of medical schools. Virtual clinical learning, virtual care, hospital at home, and other innovations are being proposed as complements to the clinical clerkship, but they still offer a relatively limited learning experience [17]. We are maintaining the clerkship while devising ways to minimize risk, aware that asymptomatic students may spread the virus and may acquire it in the course of training. In Korea, during the Daegu outbreak crisis, all medical schools suspended their academic schedules, as did schools in the United States and other countries experiencing their own regional outbreaks. As the crisis slowly passed, the school schedule slowly resumed. Medical schools should make decisions that balance student safety from COVID-19 infection with training students with sufficient clinical experience. Decisions include triaging which activities should be continued, postponed, adapted, dropped, or added [18]. We continued the lectures by putting them online, postponed the clinical clerkship and basic medical practice, dropped a few parts of the clerkship with a high risk of infection, and added a regular live online discussion session to help students lead self-directed learning.

Our study has several limitations. First, it was performed at a single institution. As each medical school has different situations and circumstances, our curricular change and results may not be generalizable to other institutions. Second, as we used exam scores that were not standardized for difficulty level, accurate comparison of academic achievement before and after the introduction of online classes was limited. If we had used a nationwide examination or an item response theory-based computer adaptive test, a more accurate comparison would have been possible. Finally, our exam was a multiple-choice test that evaluates academic achievement in the cognitive domain. Assessing achievement in the psychomotor or affective domains would have required other assessment tools.

The medical education environment is changing rapidly with COVID-19, and we are only at the beginning. COVID-19 may forever change how future physicians are educated. Further research is needed to maximize the benefits of online education, compensate for its shortcomings, and try various educational approaches.

Supporting information

S1 Fig. Box and whisker plot of the test scores.

(PPTX)

S1 Table. Students’ examination scores in 2020 compared with those in 2018 and 2019.

(DOCX)

S1 File. Anonymized data set.

(XLSX)

S1 Appendix. Questionnaire.

(DOCX)

Acknowledgments

The authors acknowledge all students, professors, and teaching assistants in Seoul National University College of Medicine for helping us move our curriculum smoothly online and the Medical Research Collaborating Center at Seoul National University Hospital for their support for statistical analyses.

Data Availability

Data cannot be shared publicly because of ethical concerns. Data are available from the Seoul National University Institutional Data Access / Ethics Committee (contact via Tel: 82-2-2072-0694) for researchers who meet the criteria for access to confidential data.

Funding Statement

The authors received no specific funding for this work.

References

  • 1. Cheng B, Wang M, Mørch AI, Chen N-S, Kinshuk, Spector JM. Research on e-learning in the workplace 2000–2012: A bibliometric analysis of the literature. Educ Res Rev. 2014;11:56–72. doi:10.1016/j.edurev.2014.01.001
  • 2. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of E-learning in medical education. Acad Med. 2006;81(3):207–12. doi:10.1097/00001888-200603000-00002
  • 3. Huynh R. The Role of E-Learning in Medical Education. Acad Med. 2017;92(4):430. doi:10.1097/ACM.0000000000001596
  • 4. Liu Q, Peng W, Zhang F, Hu R, Li Y, Yan W. The Effectiveness of Blended Learning in Health Professions: Systematic Review and Meta-Analysis. J Med Internet Res. 2016;18(1). doi:10.2196/jmir.4807
  • 5. World Health Organization. Considerations for school-related public health measures in the context of COVID-19: annex to considerations in adjusting public health and social measures in the context of COVID-19, 10 May 2020. Geneva: World Health Organization; 2020. Contract No.: WHO/2019-nCoV/Adjusting_PH_measures/Schools/2020.1.
  • 6. Korean Society of Infectious Diseases, Korean Society of Pediatric Infectious Diseases, Korean Society of Epidemiology, Korean Society for Antimicrobial Therapy, Korean Society for Healthcare-associated Infection Control and Prevention, et al. Report on the Epidemiological Features of Coronavirus Disease 2019 (COVID-19) Outbreak in the Republic of Korea from January 19 to March 2, 2020. J Korean Med Sci. 2020;35(10):e112. doi:10.3346/jkms.2020.35.e112
  • 7. Cleland J, McKimm J, Fuller R, Taylor D, Janczukowicz J, Gibbs T. Adapting to the impact of COVID-19: Sharing stories, sharing practice. Med Teach. 2020;42:1–4. doi:10.1080/0142159X.2019.1691909
  • 8. George PP, Papachristou N, Belisario JM, Wang W, Wark PA, Cotic Z, et al. Online eLearning for undergraduates in health professions: A systematic review of the impact on knowledge, skills, attitudes and satisfaction. J Glob Health. 2014;4(1). doi:10.7189/jogh.04.010406
  • 9. Brockfeld T, Muller B, de Laffolie J. Video versus live lecture courses: a comparative evaluation of lecture types and results. Med Educ Online. 2018;23(1):1555434. doi:10.1080/10872981.2018.1555434
  • 10. Ashokka B, Ong SY, Tay KH, Loh NHW, Gee CF, Samarasekera DD. Coordinated responses of academic medical centres to pandemics: Sustaining medical education during COVID-19. Med Teach. 2020:1–10. doi:10.1080/0142159X.2020.1757634
  • 11. Ramlogan S, Raman V, Sweet J. A comparison of two forms of teaching instruction: video vs. live lecture for education in clinical periodontology. Eur J Dent Educ. 2014;18(1):31–8. doi:10.1111/eje.12053
  • 12. Ranasinghe L, Wright L. Video lectures versus live lectures: competing or complementary? Med Educ Online. 2019;24(1):1583970. doi:10.1080/10872981.2019.1583970
  • 13. Wolcott L. Dynamics of faculty participation in distance education: Motivations, incentives, and rewards. In: Moore MG, Anderson WG, editors. Handbook of distance education. Mahwah: Lawrence Erlbaum Associates; 2003. p. 549–66.
  • 14. Shepherd SSG. Relationships between Computer Skills and Technostress: How Does This Affect Me? Association of Small Computer Users in Education (ASCUE); 2004.
  • 15. Lee H, Choi HS, Lee WJ. A strategy to improve faculties’ LMS usability in a blended learning environment: SNU case of menu template development. 7th International Conference on Education Research; Seoul, South Korea; 2006. p. 392–412.
  • 16. Lakhani M, Lakhani S, Lakhani P. Reimagining healthcare after covid-19: a new normal for medicine. BMJ. 2020;369:m2220. doi:10.1136/bmj.m2220
  • 17. Woolliscroft JO. Innovation in Response to the COVID-19 Pandemic Crisis. Acad Med. 2020. doi:10.1097/ACM.0000000000003402
  • 18. Tolsgaard MG, Cleland J, Wilkinson T, Ellaway RH. How we make choices and sacrifices in medical education during the COVID-19 pandemic. Med Teach. 2020:1–3. doi:10.1080/0142159X.2020.1767769

Decision Letter 0

Cesario Bianchi

5 Oct 2020

PONE-D-20-26059

How medical education survives and evolves during COVID-19: our experience and future direction

PLOS ONE

Dear Dr. Myung,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

The work is of interest and addresses issues we are all currently facing. Please carefully revise your manuscript according to the expert reviewers' comments, with a point-by-point response letter. Submit a revised version at your earliest convenience.

Please submit your revised manuscript by Nov 19 2020 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,

Cesario Bianchi

Academic Editor

PLOS ONE

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2.We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially identifying or sensitive patient information) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. Please see http://www.bmj.com/content/340/bmj.c181.long for guidelines on how to de-identify and prepare clinical data for publication. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

Additional Editor Comments (if provided):

Dear Dr. Myung:

Thank you for submitting your work. It was reviewed by two experts who found the data of interest. Reviewer #1 and I found that the manuscript could be more focused and that the statistical analyses should be clearly stated and in line with the discussion. Please carefully revise your interesting work according to the reviewers' suggestions and, if you find it appropriate, incorporate the changes in your revised version.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes

Reviewer #2: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

Reviewer #2: No

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: Yes

Reviewer #2: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

Reviewer #2: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The theme discussed in the article is very important and timely. It is important for other medical schools to know what is being done about medical education. There are a lot of medical students around the world who have been affected by the pandemic, like other students. But medical schools have many hours of practical and theoretical classes, and continuing medical education is a challenge.

Reviewer #2: Overall, I think this is a very interesting and timely paper as it captures real-life data on the sudden change in curriculum that so many educators have been faced with. Unfortunately, this paper in some ways tries to do too many things. It gives an example of how a curriculum can be adjusted for the current conditions, but also tries to make statements about satisfaction with the online curriculum. I think the authors would be better served by focusing on the latter goal, briefly describing their curriculum, then moving into the data-driven portion of the manuscript and centering the manuscript more around the actual study than just the logistics of education during COVID.

I have several additional comments/concerns:

- For the evaluation of academic performance, what tests were used? Were these national standardized tests or tests specific to the university? Given that you make important comments about the academic performance not suffering from this change in curriculum, we need more details about the tests that were used to do this evaluation.

- Also, on the topic of testing, you mention that tests were not standardized from year to year; how are these tests created, and how much do they change year to year?

- In the statistical analysis of test performance, what was the N for each class? Is the N high enough to assume normal distribution? If not, should this be reported as a median? Also, what is the standard deviation with median or standard error with mean? Need some idea of the distribution.

- There are differences in how much the academic performance changed between the different subjects. Is this worth looking into further or discussing further? Was this a trend between year groups (1st vs 2nd year) or any other trend noted here? Potentially something to look at and/or discuss.

- Finally, you concluded no significant differences in academic performance, yet almost all of the p values you report are statistically significant; this must be addressed. What is your threshold for “significant” differences? The language here needs to be clear. Also, if there are subtle differences, this should not be ignored, as the trend is certainly towards more online learning; so are there ways to improve this and make academic performance better?

- You state that “Students pointed out the following strengths of online learning…” What exactly does this mean? Was there a focus group where students pointed out benefits and problems, and then the Likert scale on the survey was used to assess agreement with these statements? Or did the authors come up with the statements used for the survey? The language here could be clearer. If the point is just that the score on the survey of these statements was higher than three, the language would be better as “Students agreed that the following were strengths of…” or something like that. If a focus group came up with the strengths/weaknesses, this should be pointed out.

- The portions of the manuscript describing the specifics of how students were able to go to labs and clinical clerkships are of interest, but I think the analysis of surveys and academic performance are interesting as they give some data on how things went. I would emphasize these areas and tighten up the data as above.

- Authors should discuss the limitations of their study in the discussion.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

Reviewer #2: Yes: Timothy Vreeland

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 Dec 18;15(12):e0243958. doi: 10.1371/journal.pone.0243958.r002

Author response to Decision Letter 0


5 Nov 2020

PLOS ONE/ PONE-D-20-26059

We thank the Reviewers for their thoughtful and expert review of our manuscript and for their valuable and insightful comments. We have responded to each of the Reviewers’ comments and have incorporated all modifications suggested by the Reviewers into the revised manuscript. The changes within the revised manuscript are highlighted (underlined and in blue). Our responses to the Reviewers’ comments are as follows:

Journal Requirements:

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming.

Author’s Response: According to the Reviewer’s comment, we have confirmed our manuscript meets PLOS ONE's style requirements.

2. If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. Please see http://www.bmj.com/content/340/bmj.c181.long for guidelines on how to de-identify and prepare clinical data for publication. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

Author’s Response: According to the Reviewer’s comment, we have uploaded the minimal anonymized data set necessary to replicate our study findings as Supporting Information files.

Additional Editor Comments

Thank you for submitting your work. It was reviewed by two experts who found the data of interest. Reviewer #1 and I found that the manuscript could be more focused and that the statistical analyses should be clearly stated and in line with the discussion. Please carefully revise your interesting work according to the reviewers' suggestions and, if you find it appropriate, incorporate the changes in your revised version.

Author’s Response: According to the Reviewer’s comment, we have made the manuscript more focused and stated the statistical analyses clearly and in line with the discussion. Please refer to the following answers.

Review Comments to the Author

Reviewer 1 Comments:

1. Reviewer’s comment: The theme discussed in the article is very important and timely. It is important for other medical schools to know what is being done about medical education. There are a lot of medical students around the world who have been affected by the pandemic, like other students. But medical schools have many hours of practical and theoretical classes, and continuing medical education is a challenge.

Author’s Response:

Thank you for the Reviewer's encouragement. As the Reviewer notes, many medical schools around the world are struggling to educate the next generation of doctors even in the COVID-19 pandemic. We will seek a breakthrough to innovate medical education even in the current chaotic situation through ceaseless effort and communication.

Reviewer 2 Comments:

1. Reviewer’s comment: Overall, I think this is a very interesting and timely paper as it captures real life data on the sudden change in curriculum that so many educators have been faced with. Unfortunately, this paper in some ways tries to do too many things. It gives an example of how a curriculum can be adjusted for the current conditions, but also tries to make statements about satisfaction with the online curriculum. I think the authors would be better served by focusing on the latter goal, and briefly describing their curriculum, then moving into the data driven portion of the manuscript and centering the manuscript more around the actual study than just the logistics of education during COVID.

Author’s Response: The authors agree with the Reviewer that the manuscript should focus on the actual study rather than just the logistics of education during COVID. However, to present students’ satisfaction with the online curriculum and their academic achievement, describing our effort to adjust the curriculum under pandemic conditions, and what that curriculum looked like, was indispensable. As the Reviewer pointed out, we have focused more on the latter portion. According to the Reviewer’s comment, we have added the sentence “As there is no significant change in the competencies to be acquired through the course, and the composition of the professors who made the test questions is similar, we can expect that the difficulty level of the exam would not change much. In some courses, such as the anatomy course and the respiratory system course, the mean score in 2020 was lower than that in 2018 or 2019 (Table 3). However, since the mean exam score of each subject changes year by year, and the ANOVA analysis only indicates whether the difference in the yearly means is significant, it is necessary to analyze whether the overall pattern of change is significant and, if so, how meaningful the amount of change is.” on page 11-12, lines 202-210, and “As the exam scores are repeated-measures data, in which each student participates in several subject tests that are related to each other, a mixed model analysis was performed. For the analysis of each subject, subjects were analyzed as fixed effects, and the interaction between the two variables (year, subject) was included to check whether the pattern of change differed by subject. As a result, the p-value was less than 0.0001, indicating that the pattern of change differed significantly across subjects. Therefore, we analyzed whether there was a difference in scores across 2018, 2019, and 2020 for each subject. Since the conversion to online classes due to COVID-19 is a major change, we examined whether the average score of 2018 and 2019 differed from the score in 2020 in order to investigate this impact.

Among the subjects in Table 3, anatomy, biochemistry, and histology are first-year subjects, whereas the gastrointestinal, respiratory, and circulatory systems are second-year subjects. For anatomy, the average score for 2018 and 2019 was 86.67, higher than the 2020 average of 82.55. For the remaining subjects as well, the 2020 scores were lower than the average of the 2018 and 2019 scores (the average difference was negative for all subjects). In addition, the differences were statistically significant for anatomy and the circulatory and respiratory systems (p-value = 0.0004, 0.0138, and <0.0001, respectively) (Supplementary Table 1). Using the mixed model, we found that the exam scores of some subjects changed significantly with the introduction of online learning.

Since the ANOVA results and the mixed model results are similar, the effect size of the difference between the 2020 scores and the average of the 2018 and 2019 scores was calculated using the mean and SD, not the least squares mean (Table 4). The effect sizes for the anatomy, respiratory system, and circulatory system courses are -0.5150, -0.5504, and -0.2116, respectively. For the anatomy and respiratory system courses, the change in academic achievement with e-learning is moderate, and for the circulatory system course, the change is small.” on page 12-13, lines 215-242 within the revised manuscript.
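The effect sizes quoted in this response follow Cohen's d, a standardized mean difference. As a rough illustration of the calculation described (2020 mean vs. the pooled 2018-2019 mean), here is a minimal pooled-SD sketch; the anatomy means are taken from the response above, but the SD (8.0) and group sizes (150) are hypothetical placeholders, not the study's actual data.

```python
import math

def cohens_d(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Cohen's d: standardized mean difference using the pooled SD."""
    pooled_sd = math.sqrt(
        ((n_a - 1) * sd_a ** 2 + (n_b - 1) * sd_b ** 2) / (n_a + n_b - 2)
    )
    return (mean_a - mean_b) / pooled_sd

# Anatomy means from the response (82.55 in 2020 vs. 86.67 for 2018-2019);
# the SD (8.0) and group sizes (150) are hypothetical placeholders.
d = cohens_d(82.55, 8.0, 150, 86.67, 8.0, 150)
print(round(d, 4))  # a negative d indicates lower 2020 scores
```

By the conventional rule of thumb (|d| near 0.2 small, 0.5 moderate, 0.8 large), this matches the "moderate" and "small" labels the authors apply to their reported values.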

2. Reviewer’s comment: For the evaluation of academic performance, what tests were used? Were these national standardized tests or tests specific to the university. Given that you make important comments about the academic performance not suffering from this change in curriculum, we need more details about the tests that were used to do this evaluation.

Author’s Response: We evaluated students’ academic performance using tests specific to the university. As national standardized tests are usually scheduled at the end of the semester, we could not use those tests to compare students’ academic performance with and without the curricular change under COVID-19. According to the Reviewer’s comment, we have added the sentences “After each course, we evaluated students’ academic performance through an exam developed by the professors who ran the course.” on page 6, lines 113-115, and “As there is no significant change in the competencies to be acquired through the course, and the composition of the professors who made the test questions is similar, we can expect that the difficulty level of the exam would not change much.” on page 11-12, lines 202-205 within the revised manuscript.

3. Reviewer’s comment: Also, on the topic of testing, you mention that tests were not standardized from year to year, how are these tests created and how much do they change year to year?

Author’s Response: According to the Reviewer’s comment, we have added the sentences “After each course, we evaluated students’ academic performance through an exam developed by the professors who ran the course. As there is no significant change in the competencies to be acquired through the course, and the composition of the professors who developed the exam is similar, the exam was developed in the same way as in previous years, so we can expect that the difficulty level of the exam would not change much.” on page 6, lines 114-119 within the revised manuscript. Please also refer to the answer above.

4. Reviewer’s comment: In the statistical analysis of test performance, what was the N for each class? Is the N high enough to assume normal distribution? If not, should this be reported as a median? Also, what is the standard deviation with median or standard error with mean? Need some idea of the distribution.

Author’s Response: According to the Reviewer’s comment, we have added the N for each class to Table 3 on page 12, lines 212-213 within the revised manuscript. The N for each data set ranges from 143 to 158, which is high enough to assume a normal distribution. We also added a supplementary figure presenting the distribution of the test scores (Supplementary Figure 1). The mean score and standard deviation reported in our study describe the distribution of students’ test scores.

5. Reviewer’s comment: There are differences in how much the academic performance changed between the different subjects. Is this worth looking into further or discussing further? Was this a trend between year groups (1st vs 2nd year) or any other trend noted here? Potentially something to look at and/or discuss.

Author’s Response: Among the subjects in Table 3, anatomy, biochemistry, and histology are first-year subjects, whereas the gastrointestinal, respiratory, and circulatory systems are second-year subjects. There was no significant change in average scores that could be attributed to the difference between year groups (first vs. second year). As the exam scores are repeated-measures data, in which each student participates in several subject tests that are related to each other, a mixed model analysis was performed.

6. Reviewer’s comment: Finally, you concluded no significant differences in academic performance, yet almost all of the p values you report are statistically significant, this must be addressed. What is your threshold for “significant” differences, the language here needs to be clear. Also, if there are subtle differences, this should not be ignored as the trend is certainly towards more online learning, so are there ways to improve this and make academic performance better?

Author’s Response: We reviewed the data and consulted a statistician (Professor Choi), and the final results are presented in the Results section of the revised manuscript.

According to the Reviewer’s comment, we have added the sentence “Statistical analysis was performed using the SPSS (version 23) statistical package (IBM SPSS Statistics) and the SAS (version 9.3) statistical package (SAS Institute). We performed Pearson’s chi-squared test as a measure of association to analyze the data. Means were compared using analysis of variance (ANOVA). We also used a mixed effects model to identify patterns of score change over the years and to determine the effect of e-learning on academic achievement. Effect sizes and 95% confidence intervals were calculated using Cohen’s d. P values of <0.05 were taken to indicate statistically significant differences.” on page 7, lines 128-134, “As there is no significant change in the competencies to be acquired through the course, and the composition of the professors who made the test questions is similar, we can expect that the difficulty level of the exam would not change much. In some courses, such as the anatomy course and the respiratory system course, the mean score in 2020 was lower than that in 2018 or 2019 (Table 3). However, since the mean exam score of each subject changes year by year, and the ANOVA analysis only indicates whether the difference in the yearly means is significant, it is necessary to analyze whether the overall pattern of change is significant and, if so, how meaningful the amount of change is.

As the exam scores are repeated-measures data, in which each student participates in several subject tests that are related to each other, a mixed model analysis was performed. For the analysis of each subject, subjects were analyzed as fixed effects, and the interaction between the two variables (year, subject) was included to check whether the pattern of change differed by subject. As a result, the p-value was less than 0.0001, indicating that the pattern of change differed significantly across subjects. Therefore, we analyzed whether there was a difference in scores across 2018, 2019, and 2020 for each subject. Since the conversion to online classes due to COVID-19 is a major change, we examined whether the average score of 2018 and 2019 differed from the score in 2020 in order to investigate this impact.

For anatomy, the average score for 2018 and 2019 was 86.67, higher than the 2020 average of 82.55. For the remaining subjects as well, the 2020 scores were lower than the average of the 2018 and 2019 scores (the average difference was negative for all subjects). In addition, the differences were statistically significant for anatomy and the circulatory and respiratory systems (p-value = 0.0004, 0.0138, and <0.0001, respectively) (Supplementary Table 1). Using the mixed model, we found that the exam scores of some subjects changed significantly with the introduction of online classes. To analyze the overall score changes over the years, subjects were analyzed as random effects, considering that the degree of difficulty may differ between them. As a result, there was an overall difference between the years: the difference between the 2020 score and the average score of 2018 and 2019 was -2.10, i.e., lower in 2020, and this difference was statistically significant (p=0.0001).

Since the ANOVA results and the mixed model results are similar, the effect size of the difference between the 2020 scores and the average of the 2018 and 2019 scores was calculated using the mean and standard deviation, not the least squares mean (Table 4). The effect sizes for the anatomy, respiratory system, and circulatory system course scores are -0.5150, -0.5504, and -0.2116, respectively. For the anatomy and respiratory system courses, the change in academic achievement with online classes is moderate, and for the circulatory system course, the change is small.” on page 11-14, lines 202-245, and “It is very important to maintain students’ academic achievement after the conversion to online classes. In our study, moderately decreased exam scores were observed in the anatomy and respiratory system courses. It is difficult to make an accurate comparison because the degree of difficulty may vary between tests, but a statistically significant decrease was observed in the above two subjects. In anatomy, the aforementioned lack of practice seems to be the cause; in the respiratory system course, we judged that a precise comparison was difficult because of the unusually large annual variation in the difficulty index. However, apart from these reasons, if the academic achievement of medical students really declines due to online learning, this is a serious problem. It is necessary to observe whether actual academic achievement decreases and, if so, to find ways to resolve this decrease. To some extent, it may be predictable that the efficiency of medical education decreases when practice is insufficient.” on page 15, lines 276-286 within the revised manuscript.
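The quoted methods state that yearly means were compared with one-way ANOVA before the mixed-model follow-up. For illustration only, the F statistic behind that comparison can be computed in a few lines of pure Python; the three score vectors below are toy data, not the study's exam scores.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over k independent groups."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n_total
    # Between-group sum of squares: group sizes times squared mean deviations
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

# Toy exam-score vectors for three years (hypothetical, not the study data)
scores_2018 = [84.0, 86.0, 88.0]
scores_2019 = [85.0, 87.0, 89.0]
scores_2020 = [80.0, 82.0, 84.0]
f_stat = one_way_anova_f([scores_2018, scores_2019, scores_2020])
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that at least one yearly mean differs, which is why the authors then needed the mixed model and effect sizes to say *which* years and *how much*.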

7. Reviewer’s comment: You state that “Students pointed out the following strengths of online learning…” What exactly does this mean? Was there a focus group where students pointed out benefits and problems and then the Leikert scale on survey was used to assess agreement with these statements? Or did the authors come up with the statements used for the survey? The language here could be more clear. If the point is just that the score on the survey of these states was higher than three, the language would be better as “Students agreed that the following were strengths of…” something like that. If this was a focus group coming up with strengths/weaknesses, this should be pointed out.

Author’s Response: In the survey, we included the question “Please indicate on a 5-point scale how satisfied you are with the online lectures compared with the existing offline lectures.” and also asked students to rate the following statements on a Likert scale: 1) they can take the course anywhere they want, 2) they can take the course at any time they want, 3) they can review any portion of the lecture multiple times, 4) they can alter the sequence of the lectures, and 5) they can play the lecture at any speed they want. We have also attached the survey form (Appendix) to the revised manuscript.

8. Reviewer’s comment: The portions of the manuscript describing the specifics of how students were able to go to labs and clinical clerkships are of interest, but I think the analysis of surveys and academic performance are interesting as they give some data on how things went. I would emphasize these areas and tighten up the data as above.

Author’s Response: According to the Reviewer’s comment, we have added further data analysis results. These sentences are described in the author’s response to reviewer’s comment 6.

9. Reviewer’s comment: Authors should discuss the limitations of their study in the discussion.

Author’s Response: According to the Reviewer’s comment, we have added the sentences “Our study has several limitations. First, our study was performed at a single institution. As each medical school faces different situations and circumstances, our curricular change and results may not be generalizable to other institutions. Second, as we used exam scores that were not standardized for difficulty level, an accurate comparison of academic achievement before and after the introduction of online classes was difficult. If we had used a nationwide examination or an item response theory-based computerized adaptive test, a more accurate comparison would have been possible. Finally, our exam was an MCQ test that evaluates students’ academic achievement in the cognitive domain. To assess students’ achievement in the psychomotor or affective domains, other assessment tools would have been necessary.” on page 16, lines 322-330 within the revised manuscript.

Attachment

Submitted filename: Response to Reviewers_e-learning_Myung.docx

Decision Letter 1

Cesario Bianchi

2 Dec 2020

How medical education survives and evolves during COVID-19: our experience and future direction

PONE-D-20-26059R1

Dear Dr. Myung,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Cesario Bianchi

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Dear Myung:

Thank you for carefully revising your manuscript in line with the reviewer comments. I find your manuscript acceptable for publication. Congratulations.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the “Comments to the Author” section, enter your conflict of interest statement in the “Confidential to Editor” section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: All comments have been addressed. The authors have done a thorough job of examining and explaining their data.

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: Yes: Timothy J Vreeland, MD

Acceptance letter

Cesario Bianchi

11 Dec 2020

PONE-D-20-26059R1

How medical education survives and evolves during COVID-19: our experience and future direction

Dear Dr. Myung:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Cesario Bianchi

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Fig. Box and whisker plot of the test scores.

    (PPTX)

    S1 Table. Students’ examination scores in 2020 compared with those in 2018 and 2019.

    (DOCX)

    S1 File. Anonymized data set.

    (XLSX)

    S1 Appendix. Questionnaire.

    (DOCX)

    Attachment

    Submitted filename: Response to Reviewers_e-learning_Myung.docx

    Data Availability Statement

    Data cannot be shared publicly because of ethical concerns. Data are available from the Seoul National University Institutional Data Access / Ethics Committee (contact via Tel: 82-2-2072-0694) for researchers who meet the criteria for access to confidential data.

