Abstract
Objectives
This study investigates the effectiveness of a virtual format of an advanced communication skills objective structured clinical examination (OSCE) for senior medical students in comparison to an in-person format. The study also examines the emotional support students experience in the virtual setting. Our analysis was based on quantitative data collected through objective checklists and post-OSCE survey results.
Methods
The virtual OSCE was a revision of an earlier in-person formative advanced communication skills OSCE for fourth-year medical students. Student performances were assessed by self and peers using objective checklists: the modified Master Interview Rating Scale (mMIRS) and the Communication Behavior Checklist (CBC). The mMIRS measured interview process skills, such as avoiding jargon and demonstrating empathy. The CBC examined interview content, including tasks specific to each case. The OSCE was followed by a faculty-led debrief and a quantitative survey. The virtual OSCE was conducted in 2021, and the results of the checklists and survey were compared with those collected from two earlier in-person OSCEs.
Results
Eighty-three students participated in the virtual OSCE. There was no difference in mMIRS scores between the virtual and in-person OSCE. Overall CBC scores were lower in the virtual OSCE compared to in-person (p < 0.05). Sixty-seven out of 83 (80.7%) students completed the post-OSCE survey. There were no differences between the virtual and in-person OSCE in terms of educational value, whether the OSCE would change the way participants talk to patients, and preparedness to have serious conversations with patients. All respondents somewhat or strongly agreed with feeling emotionally supported during the virtual OSCE.
Conclusion
The virtual format was a suitable alternative to an in-person, formative, advanced communication skills OSCE for medical students. The virtual OSCE was educationally effective and was met with student satisfaction and a sense of emotional support. Future virtual iterations must ensure adequate instruction on interview content.
Keywords: OSCE, communication skills, undergraduate medical education, virtual learning
Introduction
Since its development in 1975, 1 the objective structured clinical examination (OSCE) has been increasingly utilized for summative 2 and formative assessments in medical education.3,4 OSCEs have been an effective means of training communication skills for health providers at various levels.5,6
In response to the COVID-19 pandemic in March 2020, medical schools rapidly adopted virtual learning across all parts of their curricula. 7 Pre-clinical classroom teaching transitioned online,8,9 and a workshop on "webside" manner was devised. 10 Clinical rotations adopted virtual bedside teaching rounds, 11 and electives specific to telemedicine were developed.12,13 While in-person learning has returned to most facets of medical education, virtual sessions are projected to persist for the foreseeable future due to their flexibility. 14
A number of studies have demonstrated the effectiveness of virtual teaching in comparison to in-person lectures, 15 small group discussions, 16 and problem-based learning. 17 Unlike these other pedagogies used in medical schools, virtual OSCEs present unique challenges given the logistical complexities involved in this modality 3 and their emphasis on teaching and assessing effective communication. 18 Since the start of the pandemic, many institutions have developed virtual OSCEs 19 and have described effective solutions. Grover et al conducted a virtual communication skills OSCE in 19 UK medical schools and showed increased learner confidence in history-taking, communication, and data interpretation. 20 Another virtual OSCE was conducted for students rotating through obstetrics and gynecology, and the majority deemed it to be of excellent or above-average educational value. 21 Residents of three Physical Medicine and Rehabilitation residency programs in Canada also perceived a virtual OSCE as acceptable and authentic, with thematic analysis suggesting that the virtual format was particularly effective for assessing communication skills. 22
While these studies described creative shifts in response to the pandemic, we have returned to an era where in-person OSCEs are safe and feasible. If virtual OSCEs are to continue, data are needed to support their use in place of in-person offerings. To our knowledge, only one prior study has compared the effectiveness of a virtual versus in-person OSCE for medical students. 23 Prasad and colleagues found that a virtual OSCE on delivering bad news for clerkship students met curricular goals comparably to an earlier in-person OSCE and was well received by students. However, standardized patients noted that students appeared uncomfortable and some cried during the encounter.
The pandemic has posed a potential challenge to students' comfort with sharing emotions and feeling supported in the virtual setting. Several studies have shown higher incidences of depression, emotional instability, and mental distress from isolation among medical students during the pandemic.24,25 One group conducted a virtual advanced care planning workshop for internal medicine residents that included a group debriefing in which residents were able to share challenges and triumphs, achieving a sense of camaraderie. 26 Prior work has shown that medical students can experience a range of negative emotions during challenging in-person simulation exercises, which can adversely impact performance and learning.27,28 In addition to student satisfaction and faculty perception of effectiveness, psychological safety can be an important component of OSCEs. 29 To our knowledge, no study to date has examined the emotional support medical students experience during OSCEs in a virtual format.
We previously described an in-person, peer-assisted formative OSCE on advanced communication skills for senior medical students.4,30 We reconfigured this for a virtual set-up in 2021. The objectives of the current study were to (a) assess students’ performance between an in-person and virtual OSCE based on objective self- and peer-evaluations; (b) evaluate students’ perspective on a virtual OSCE in comparison to a prior similar in-person OSCE based on quantitative survey responses; and (c) gauge students’ sense of emotional support during an OSCE conducted within a virtual environment.
Methods
Place, period, and nature of study
The study was conducted at Yale School of Medicine in New Haven, Connecticut, USA. The MD program at Yale School of Medicine is a 4-year curriculum consisting of an 18-month pre-clerkship period covering basic science education, a 12-month clerkship period during which all students rotate through core clinical disciplines, and a 17-month post-clerkship period. During the post-clerkship period, students are required to complete a 4-week sub-internship, at least 28 weeks of clinical electives and research, and a 4-week Capstone course. The remaining time is spent studying for licensure exams, applying to residency programs, doing additional electives and research, and taking vacation.
The advanced communication skills OSCE is part of the Capstone course which is scheduled 2 months before graduation. During the Capstone course, students participate in core classes/workshops and individual experiences based on their desired specialty choice. The Capstone course is formative, requiring only attendance for a passing grade.
Quantitative data were collected from the advanced communication skills OSCE for senior medical students conducted in 2017, 2019, and 2021. The 2021 OSCE was conducted in a virtual format, and the results were compared to those from similar in-person OSCEs in 2017 and 2019. Data from 2018 and 2020 were not included because no data were collected in 2018 and the OSCE was canceled in 2020. The post-OSCE survey in 2021 also measured students' sense of emotional support during the virtual OSCE.
Study participants
To be eligible for the study, participants had to be fourth-year medical school students in good standing during the years 2017, 2019, and 2021. There were no exclusion criteria. Informed consent for inclusion of OSCE results was obtained by written form before the OSCE. The post-OSCE survey was administered as part of routine course evaluation with no risk associated with completion; completion of the survey constituted consent. The study was granted exemption from review by the Yale University IRB (Protocol ID 2000020576).
In order to assess for differences in student cohorts participating in the in-person OSCEs (2017 and 2019) and virtual OSCE (2021), class demographic factors (age, gender), number of clinical electives taken before the OSCE, and scores from a separate in-person summative OSCE (C-OSCE) were obtained from the registrar. The C-OSCE is a 7-station summative assessment conducted at the end of the clerkship period at Yale School of Medicine. The C-OSCE assesses problem-focused history taking and physical exam skills based on core content from the clerkships. Case content and scoring were unchanged for the duration of the present study. Passing the C-OSCE is required for advancement in the curriculum.
Original in-person OSCE format
The advanced communication skills OSCE consists of five stations, each simulating a challenging clinical scenario with a standardized patient: disclosure of an intraoperative complication, a goals of care discussion with the family of a critically ill patient, disclosure of a medical error, introduction of palliative care, and death notification over the phone (Table 1). Students were organized into triads and rotated through three of the five OSCE stations. Students took turns being the examinee while being observed by two peers. After each station, examinees completed a self-assessment checklist (Communication Behavior Checklist; CBC) and received checklist-based assessments (CBC and modified Master Interview Rating Scale; mMIRS) and verbal feedback from peers. Faculty and standardized patients did not complete checklists because prior work demonstrated similar scoring among faculty, standardized patient, and peer observers, and because students preferred a peer-assisted learning model for the in-person OSCE. 30 The checklists were developed via the Delphi method, with four content experts developing the CBC and two modifying the MIRS, 30 an assessment tool with strong validity evidence. 4 After students completed their three stations, a faculty-led debrief reviewed all five scenarios, emotional responses, and student questions. At the end of the debrief, an online survey was distributed. The survey was designed by the investigators due to the lack of existing post-OSCE survey tools with validity evidence for quantitative data. The survey included 5-point Likert scales for quantitative analyses.
Table 1.
Summary of five cases in advanced communication skills OSCE.
| Competency addressed | Brief summary of scenario |
|---|---|
| Communication with an Angry Patient | Surgical intern talks with a patient who is furious because of an unexpected, but unavoidable, surgical complication. |
| Goals of Care for a Patient with Serious Illness | Intensive care unit intern leads conversation with adult child of patient with multi-organ failure and terminal prognosis. |
| Medical Error Disclosure | Floor intern informs patient of delay in diagnosis due to personal error. |
| Palliative Care Assessment | Primary care intern discusses care plan with patient with incurable metastatic cancer in outpatient setting. |
| Phone Death Notification (added in 2021) | Emergency department intern calls mother of adolescent to notify her of her child's death in a motor vehicle collision. |
During the week prior to the OSCE, the Capstone course included brief, relevant didactics on advanced communication skills, though the particular scenarios to be addressed during the OSCE were not revealed to the students.
Revisions with virtual OSCE format
In preparation for the virtual OSCE, all pre-OSCE didactics were given virtually, and for some topics consisted of recordings from previous years. All OSCE cases and faculty debriefs were conducted on Zoom. During the OSCE, examinees and standardized patients had their cameras on, with the exception of the case designed to be a phone call where the standardized patient camera was also off. Additional questions were added to the post-OSCE survey focusing on the virtual format and sense of emotional support. Otherwise, the virtual OSCE format was identical to the in-person OSCE format described above.
Statistical analysis
We focused on two primary quantitative outcomes: self- and observer-checklist scores (CBC and mMIRS) and post-OSCE survey results. Only fully completed checklists were included in the analysis. Of note, CBC and mMIRS scores were collected during the 2017 OSCE but not the 2019 OSCE, while the post-OSCE survey was collected for the 2019 OSCE but not the 2017 OSCE. As a result, checklist scores from the virtual OSCE in 2021 were compared to those from the in-person OSCE in 2017, while the survey results from the virtual OSCE in 2021 were compared to those from the in-person OSCE in 2019. CBC and mMIRS scores from the 2017 in-person OSCE and 2021 virtual OSCE were compared using Welch's t-test. 31 Checklist scores for the death notification case were omitted from this comparison because the case had been added after the 2017 OSCE. For the post-OSCE survey, results from the 2019 in-person OSCE and 2021 virtual OSCE were compared using either Fisher's exact test or Chi-squared analysis. The data were analyzed using SPSS Statistics, version 25 (IBM Corp., Armonk, NY, USA).
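For readers who wish to reproduce this style of comparison, the three tests described above can be sketched in Python with SciPy. This is an illustrative sketch only: the study itself used SPSS version 25, and the scores and counts below are made-up placeholders, not study data.

```python
# Illustrative sketch of the statistical comparisons described above,
# using SciPy in place of SPSS. All numbers are hypothetical examples.
from scipy import stats

# Welch's t-test (unequal variances) for checklist scores, e.g. mMIRS,
# comparing the in-person cohort to the virtual cohort.
scores_in_person = [4.7, 4.6, 4.8, 4.5, 4.9]   # made-up 2017 scores
scores_virtual = [4.6, 4.7, 4.5, 4.6, 4.8]     # made-up 2021 scores
t_stat, p_val = stats.ttest_ind(
    scores_in_person, scores_virtual, equal_var=False  # equal_var=False => Welch's test
)

# Fisher's exact test on a 2x2 table of survey responses, e.g. counts
# of "favorable (4-5)" vs "other (1-3)" responses by year.
table = [[58, 2],    # 2019: favorable, other (illustrative counts)
         [62, 5]]    # 2021: favorable, other (illustrative counts)
odds_ratio, p_fisher = stats.fisher_exact(table)

# Chi-squared test of independence on the same contingency table,
# appropriate when expected cell counts are sufficiently large.
chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
```

Fisher's exact test is typically preferred over the chi-squared test when any expected cell count is small, which is likely why the authors report using both depending on the survey item.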
Results
Participant demographics
Eighty-three students participated in the virtual OSCE in 2021, compared to 91 and 88 students in 2017 and 2019, respectively. Comparison of class demographics among 2017, 2019, and 2021 showed no difference in average age, gender distribution, or average C-OSCE score. The number of electives taken before the OSCE was similar between 2019 and 2021 but lower in 2017. Of note, the nine students who did not participate in 2021 were included in the class demographic analysis, while the two additional students who participated in 2017 were not. A summary of class demographics is shown in Table 2.
Table 2.
Demographic factors of student classes of 2017, 2019, and 2021.
| | 2017 | 2019 | 2021 | p-Value |
|---|---|---|---|---|
| Average age (years) | 29.06 | 28.64 | 29.00 | 0.41 |
| Gender distribution | Male 52, Female 37 | Male 44, Female 44 | Male 53, Female 39 | 0.46 |
| Number of electives taken before OSCE (mean) | 3.78 | 4.35 | 4.23 | 2019 vs 2021: 0.51; 2017 vs 2021: 0.007; 2017 vs 2019: 0.048 |
| C-OSCE score (average; max score 100) | 83.35 | 83.58 | 83.64 | 0.90 |
Checklist scores
Fully completed checklist scores were available for 45 OSCE participants in 2017 and 64 in 2021. There was no difference in the all-case average mMIRS score between the 2017 in-person OSCE (4.69) and the 2021 virtual OSCE (4.63) (p = 0.18). Mean mMIRS scores for individual cases ranged from 4.55 to 4.79 for the 2017 in-person OSCE and from 4.60 to 4.67 for the 2021 virtual OSCE. No statistical differences were found in mean mMIRS scores for individual cases between the in-person and virtual OSCEs (Figure 1).
Figure 1.
Mean mMIRS scores for overall and individual cases from in-person (2017) and virtual (2021) advanced communication OSCE.
All-case average CBC scores were 0.84 for self and 0.87 for observers in the 2017 in-person OSCE, compared with 0.76 for self and 0.82 for observers in the 2021 virtual OSCE (p = 0.004 and p = 0.01, respectively). CBC scores were significantly lower for two of the cases: both self (p = 0.03) and observer (p = 0.02) scores in the Palliative Care case, and observer scores (p = 0.008) in the Medical Error case. Other cases showed no statistical difference between the in-person and virtual OSCEs (Figure 2).
Figure 2.
Mean CBC scores for overall and individual cases from in-person (2017) and virtual (2021) advanced communication OSCE (* indicates statistically significant difference, p < 0.05).
Post-OSCE survey
Sixty out of 88 (68.1%) students completed the post-OSCE survey for the 2019 in-person OSCE, compared to 67 out of 83 (80.7%) students for the 2021 virtual OSCE. For all questions, higher scores on the 1-to-5 Likert scale reflected more favorable responses. Responses revealed no difference between the in-person and virtual OSCEs in terms of the overall educational value of the OSCE, how the OSCE would change the way participants talk to patients, and how well the OSCE prepared participants to have serious conversations with patients during their upcoming residencies (Table 3). However, analysis of the distribution of survey responses found a significant difference for the question regarding preparedness to have serious conversations: in 2019, 42 (70.0%) students rated 5 (excellent) on the Likert scale, compared to 32 (47.8%) students in 2021 (p = 0.03). In addition, mean Likert scores from peer assessors regarding knowledge improvement as an observer were significantly higher for the 2019 in-person OSCE (4.77) than for the 2021 virtual OSCE (4.36; p = 0.01). There were no differences in peer assessors' reports of the benefit of practicing giving feedback to peers. Lastly, 55 out of 67 (82.1%) respondents in 2021 felt that the virtual OSCE was an effective substitute for an in-person offering, and all 67 students somewhat or strongly agreed with feeling emotionally supported during the 2021 virtual OSCE.
Table 3.
Post-workshop survey results from in-person (2019) and virtual (2021) OSCEs. Values are mean Likert scores, with the percentage of responses of 4 or 5 in parentheses (NS: non-significant, p > 0.05).
| Survey question | In-person (2019) | Virtual (2021) | p-Value (statistical test) |
|---|---|---|---|
| Overall, the educational value of today's OSCE was: | 4.83 (100%) | 4.64 (92.5%) | NS (Fisher's exact) |
| Will today's OSCE change how you talk to patients? | 4.36 (93.2%) | 4.22 (85.1%) | NS (chi-squared) |
| After today's OSCE, do you feel more prepared to have serious conversations with patients during your upcoming residency? | 4.68 (96.6%) | 4.42 (95.5%) | NS (chi-squared) |
| How beneficial was your participation as the Observer in terms of improving your knowledge about the topic? | 4.77 (100%) | 4.36 (82.0%) | 0.01 (Fisher's exact) |
| How beneficial was your participation as the Observer in terms of practicing giving feedback to peers? | 4.5 (88.5%) | 4.44 (87.2%) | NS (chi-squared) |
| The virtual set-up was an effective substitute for an in-person session in regards to the cases | n/a | 4.18 (82.1%) | n/a |
| I felt emotionally safe during today's session. | n/a | 4.87 (100%) | n/a |
Discussion
A virtual formative OSCE in 2021 on advanced communication skills for senior medical students showed mixed results across a range of outcomes compared to in-person OSCEs. While students' survey responses about educational value and emotional support in the virtual setting were reassuring, objective assessment scores for interview content (CBC), though not interview process (mMIRS), were lower in the virtual format compared to a prior in-person OSCE. Nevertheless, quantitative survey responses suggested that the virtual OSCE was an effective substitute for an in-person offering. This is in keeping with previous literature on virtual shifts for other pedagogies in medical education.15–17
The preserved mMIRS scores in the virtual environment suggest that examinees were engaged with the OSCE. Indeed, we would predict disengagement during the OSCE to lead to decreased mMIRS scores due to lack of nonverbal and empathic communication. Decreases in these domains were major concerns for telemedicine during the initial phases of the COVID-19 pandemic.32,33 Instead, students achieved mMIRS scores equivalent to those during the 2017 in-person OSCE, and higher than those reported in other studies using the MIRS for in-person assessments.34–36 The equivalent mMIRS scores in the virtual and in-person formats support findings from previous studies that trainees can successfully display well-practiced skills in a virtual OSCE environment.10,20,22,37–39 Langewitz et al suggest that students' ability to display these skills in a virtual OSCE may have been further augmented by effective use of technology, such as the ability for everyone else to turn off their audio/camera to simulate a 1:1 environment with a patient. 40
The lower assessment scores in the virtual OSCE were restricted to the CBC, which was designed to measure interview content. There are several reasons why interview content may have suffered in the virtual format. While engagement by examinees during the standardized patient interviews may not have suffered, as explained above, prior studies have identified poor student engagement as one of the major barriers to effective medical education during the COVID-19 pandemic. 41 Our virtual OSCE was conducted after a series of virtual didactics about communication skills. This didactic content was unchanged over the years of the study, and both live and pre-recorded lecture-based virtual instruction can negatively impact learner engagement and retention.42,43 This difference is accentuated by the fact that students in 2021 had taken more clinical electives than the 2017 cohort. Clinical electives give students the opportunity to explore specialties and engage with patients of interest. With this additional focused experience, students could be expected to display improved clinical skills, including interview content. At the same time, the quality of the clinical electives for the 2021 cohort cannot be presumed to be the same as those conducted pre-COVID.
In contrast, scores related to interview process, measured by the mMIRS, were similar between the virtual and in-person OSCEs. Students learn about interview process starting in the first year of medical school, whereas the specialized interview content required for this workshop was provided only a few days before the OSCE. While well-ingrained skills related to interview process did not suffer, application of new knowledge did. The reasons for this are unclear, though virtual learning may have an inherently different learning curve than in-person learning, requiring more time and/or repetition to achieve the same level of competency. These findings also highlight a potential need for more engaging case review in virtual learning, rather than simply transferring in-person content to a virtual platform.
The possibility of student disengagement with some aspects of the virtual environment is further supported by the quantitative survey results. While there were no differences reported by students in the examinee role, students reported lower benefit in terms of knowledge improvement for cases they observed in the virtual OSCE, suggesting that the observer role in a virtual OSCE may not be as effective for content review as the examinee role. Alternatively, there may be a subset of students who do not engage with the virtual format as easily as their peers. This may have also contributed to the difference in response distribution for perceived preparedness for future serious conversations with patients. Experiential learning theory asserts that students learn most effectively when engaged in reflective and cognitive work through active exercises or role-play. 44 Some students may require a higher level of cognitive load to stay focused during virtual education, challenging educators to utilize innovative teaching methods to increase and maintain learner engagement in the virtual setting.45,46
Nonetheless, students valued the opportunity to practice giving feedback both virtually and in-person, and clearly were engaged enough to make the examinees feel that the feedback they received was helpful. Additionally, overall survey results showed students found the virtual OSCE educationally valuable, agreed that it changed the way they will communicate with patients, and felt more prepared to have serious conversations with patients in the future. The majority of students thought the virtual OSCE was an effective substitute for an in-person session, in keeping with prior studies that have described virtual communication skills training in graduate medical education.38,39,47 Their views were likely influenced by the quickly evolving comfort with the Zoom platform among students, SPs, and workshop organizers, who had all been involved with virtual education for a year. As a result, there were no technical concerns, allowing examinees to focus on the cases. Previous work has highlighted other benefits of using virtual learning towards clinical competency, 48 including ease of access 49 and reduced travel time. 20
The clinical cases designed for our OSCE involved intense emotions, and our examinees were physically separated from their peers and instructors in the virtual format. We thus prioritized assessing whether students experienced adequate emotional support in their examinee role. Reassuringly, all the participants agreed they felt emotionally supported during the OSCE. Several structural aspects of the OSCE may have served as safeguards. Pre-OSCE didactics were provided to review interview content. The formative nature of the workshop, along with peer-assisted learning,2,50 may have helped facilitate a collaborative environment. The faculty-led debrief after the OSCE specifically included space to process emotional responses. These structural components of the OSCE are aligned with previous research showing instructional elements such as structured tutorials, reflection, human feedback, and scaffolding, help to increase learning in the virtual environment. 33
Despite the overall positive results, there are limitations to our study. First, the students are from a single institution and from different cohorts, hence confounders inherent to each group or a single institution may exist. The most visible confounder related to cohorts is the possible and unknown effect of their pandemic adjustment to education. Since the COVID-19 pandemic affected medical schools simultaneously, our level of preparation and subsequent response to setting up a virtual OSCE would most likely be comparable to other institutions. 51 Also, students in our different cohorts were comparable in terms of class demographic factors and scores on identical in-person summative assessments (C-OSCE) earlier in the curriculum. Second, our OSCE focuses on communication skills which can be conducted without hands-on assessments or in-person interactions. Different logistical challenges are inherent for OSCEs involving physical examination and/or procedures 52 and our findings cannot be generalized to those activities. Third, the post-OSCE survey used in this study was created ad hoc for this exercise and requires further validity assessment.
In terms of validity, the assessments used in the virtual OSCE do not present a similar level of validity evidence compared to the in-person format. According to Messick's validity framework, 53 both formats have identical content (Delphi method for consensus for both CBC and mMIRS items) and response process (same methods and statistical analysis) validity evidence. However, internal structure through inter-rater reliability and relationship to other variables were only examined for the in-person format. 30 Additionally, consequential validity evidence has not been investigated for either format. This could be collected in subsequent studies. Our future work will explore the reasons behind the measured differences in virtual versus in-person OSCEs through an ongoing qualitative analysis.
Conclusions
While previous work has revealed similar levels of educational value and satisfaction among learners in virtual OSCEs, our study is the first to directly compare objective assessment scores and survey results from medical students between in-person and virtual OSCEs, and to show that students can feel emotionally supported while undergoing intense simulations in the virtual environment. If future virtual iterations are utilized, modifications may be necessary to ensure adequate instruction on interview content and optimize learner engagement, especially for those in the peer-assessor role. Overall, the virtual format should be considered an alternative for peer-assisted, formative communication skills OSCEs for medical students.
Supplemental Material
Supplemental material, sj-docx-1-mde-10.1177_23821205241241375 for A Comparison Between In-Person and Virtual Communication Skills OSCE for Medical Students by Alex Choi, Tanya D. Murtha, Laura J. Morrison and Jaideep S. Talwalkar in Journal of Medical Education and Curricular Development
Supplemental material, sj-docx-2-mde-10.1177_23821205241241375 for A Comparison Between In-Person and Virtual Communication Skills OSCE for Medical Students by Alex Choi, Tanya D. Murtha, Laura J. Morrison and Jaideep S. Talwalkar in Journal of Medical Education and Curricular Development
Supplemental material, sj-docx-3-mde-10.1177_23821205241241375 for A Comparison Between In-Person and Virtual Communication Skills OSCE for Medical Students by Alex Choi, Tanya D. Murtha, Laura J. Morrison and Jaideep S. Talwalkar in Journal of Medical Education and Curricular Development
Acknowledgements
The authors acknowledge assistance from Dr Michael Green for his expertise in validity assessment.
Footnotes
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The authors received no financial support for the research, authorship, and/or publication of this article.
ORCID iD: Alex Choi https://orcid.org/0000-0002-5387-687X
Supplemental material: Supplemental material for this article is available online.
References
- 1. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447-451. doi: 10.1136/bmj.1.5955.447
- 2. Etheridge L, Bouriscot K. Performance and workplace assessment. In: A practical guide for medical teachers. 4th ed. Elsevier; 2013:307-313.
- 3. Talwalkar JS, Cyrus KD, Fortin AH. Twelve tips for running an effective session with standardized patients. Med Teach. 2020;42(6):622-627. doi: 10.1080/0142159x.2019.1607969
- 4. Talwalkar JS, Fortin AH, Morrison LJ, et al. An advanced communication skills workshop using standardized patients for senior medical students. MedEdPORTAL. 2021;17:11163. doi: 10.15766/mep_2374-8265.11163
- 5. Stepien KA, Baernstein A. Educating for empathy. A review. J Gen Intern Med. 2006;21(5):524-530. doi: 10.1111/j.1525-1497.2006.00443.x
- 6. Batt-Rawden SA, Chisolm MS, Flickinger AB, et al. Teaching empathy to medical students: an updated, systematic review. Acad Med. 2013;88(8):1171-1177. doi: 10.1097/ACM.0b013e318299f3e3
- 7. Daniel M, Gordon M, Patricio M, et al. An update on developments in medical education in response to the COVID-19 pandemic: a BEME scoping review: BEME Guide No. 64. Med Teach. 2021;43(3):253-271. doi: 10.1080/0142159x.2020.1864310
- 8. Sud R, Sharma P, Budhwar V, Khanduja S. Undergraduate ophthalmology teaching in COVID-19 times: students' perspective and feedback. Indian J Ophthalmol. 2020;68(7):1490-1491. doi: 10.4103/ijo.IJO_1689_20
- 9. K N, D D, A J, G A. Study of the effectiveness of e-learning to conventional teaching in medical undergraduates amid COVID-19 pandemic. Natl J Physiol Pharm Pharmacol. 2020;10(7):1.
- 10. Mulcare M, Naik N, Greenwald P, et al. Advanced communication and examination skills in telemedicine: a structured simulation-based course for medical students. MedEdPORTAL. 2020;16:11047. doi: 10.15766/mep_2374-8265.11047
- 11. Hofmann H, Harding C, Youm J, Wiechmann W. Virtual bedside teaching rounds with patients with COVID-19. Med Educ. 2020;54(10):959-960. doi: 10.1111/medu.14223
- 12. Ray JM, Wong AH, Yang TJ, et al. Virtual telesimulation for medical students during the COVID-19 pandemic. Acad Med. 2021;96(10):1431-1435. doi: 10.1097/ACM.0000000000004129
- 13. Bautista CA, Huang I, Stebbins M, et al. Development of an interprofessional rotation for pharmacy and medical students to perform telehealth outreach to vulnerable patients in the COVID-19 pandemic. J Interprof Care. 2020;34(5):694-697. doi: 10.1080/13561820.2020.1807920
- 14. Kim KJ. Moving forward: embracing challenges as opportunities to improve medical education in the post-COVID era. Humanit Soc Sci Commun. 2022;9(1):419. doi: 10.1057/s41599-022-01451-7
- 15. Teichgräber U, Mensel B, Franiel T, et al. Virtual inverted classroom to replace in-person radiology lectures at the time of the COVID-19 pandemic—a prospective evaluation and historic comparison. BMC Med Educ. 2021;21(1):611. doi: 10.1186/s12909-021-03061-4
- 16. Xiong W, Singh S, Wilson-Delfosse A, et al. "Flipped" clinical rotations: a novel approach. Clin Teach. 2022;19(5):e13520. doi: 10.1111/tct.13520
- 17. Koch LK, Correll-Buss A, Chang OH. Implementation and effectiveness of a completely virtual pathology rotation for visiting medical students. Am J Clin Pathol. 2022;157(3):406-412. doi: 10.1093/ajcp/aqab140
- 18.Petrusa ER, Blackwell TA, Ainsworth MA. Reliability and validity of an objective structured clinical examination for assessing the clinical performance of residents. Arch Intern Med. 1990;150(3):573-577. [PubMed] [Google Scholar]
- 19.Sartori DJ, Olsen S, Weinshel E, Zabar SR. Preparing trainees for telemedicine: a virtual OSCE pilot. Med Educ. 2019;53(5):517-518. doi: 10.1111/medu.13851 [DOI] [PubMed] [Google Scholar]
- 20.Grover S, Pandya M, Ranasinghe C, Ramji SP, Bola H, Raj S. Assessing the utility of virtual OSCE sessions as an educational tool: a national pilot study. BMC Med Educ. 2022;22(1):178. doi: 10.1186/s12909-022-03248-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Reid HW, Branford K, Reynolds T, Baldwin M, Dotters-Katz S. It’s getting hot in here: piloting a telemedicine OSCE addressing menopausal concerns for obstetrics and gynecology clerkship students. MedEdPORTAL. 2021;17:11146. doi: 10.15766/mep_2374-8265.11146 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Kelly R, Leung G, Lindstrom H, Wunder S, Yu JC. Virtual objective structured clinical examination experiences and performance in physical medicine and rehabilitation residency. Am J Phys Med Rehabil. 2022;101(10):947-953. doi: 10.1097/phm.0000000000001942 [DOI] [PubMed] [Google Scholar]
- 23.Prasad L, Hockstein S, Safdieh JE, Harvey K, Christos PJ, Kang Y. An objective structured clinical exam on breaking bad news for clerkship students: in-person versus remote standardized patient approach. MedEdPORTAL. 2023;19:11323. doi: 10.15766/mep_2374-8265.11323 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Sahi PK, Mishra D, Singh T. Medical education amid the COVID-19 pandemic. Indian Pediatr. 2020;57(7):652-657. doi: 10.1007/s13312-020-1894-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Michaeli D, Keough G, Perez-Dominguez F, et al. Medical education and mental health during COVID-19: a survey across 9 countries. Int J Med Educ. 2022;13:35-46. doi: 10.5116/ijme.6209.10d6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Mills S, Cioletti A, Gingell G, Ramani S. Training residents in virtual advance care planning: a new twist in telehealth. J Pain Symptom Manage. 2021;62(4):691-698. doi: 10.1016/j.jpainsymman.2021.03.019 [DOI] [PubMed] [Google Scholar]
- 27.Madsgaard A, Smith-Strøm H, Hunskår I, Røykenes K. A rollercoaster of emotions: an integrative review of emotions and its impact on health professional students’ learning in simulation-based education. Nurs Open. 2022;9(1):108-121. doi: 10.1002/nop2.1100 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Fraser K, Huffman J, Ma I, et al. The emotional and cognitive impact of unexpected simulated patient death: a randomized controlled trial. Chest. 2014;145(5):958-963. doi: 10.1378/chest.13-0987 [DOI] [PubMed] [Google Scholar]
- 29.McLeod E, Gupta S. The role of psychological safety in enhancing medical Students’ engagement in online synchronous learning. Med Sci Educ. 2023;33(2):423-430. doi: 10.1007/s40670-023-01753-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Talwalkar JS, Murtha TD, Prozora S, AHt F, Morrison LJ, Ellman MS. Assessing advanced communication skills via objective structured clinical examination: a comparison of faculty versus self, peer, and standardized patient assessors. Teach Learn Med. 2020;32(3):294-307. doi: 10.1080/10401334.2019.1704763 [DOI] [PubMed] [Google Scholar]
- 31.Welch BL. On the comparison of mean values in small samples. Biometrika. 1951;38(3-4):330-336. [Google Scholar]
- 32.Holstead RG, Robinson AG. Discussing serious news remotely: navigating difficult conversations during a pandemic. JCO Oncol Pract. 2020;16(7):363-368. doi: 10.1200/op.20.00269 [DOI] [PubMed] [Google Scholar]
- 33.Lee J, Kim H, Kim KH, Jung D, Jowsey T, Webster CS. Effective virtual patient simulators for medical communication training: a systematic review. Med Educ. 2020;54(9):786-795. doi: 10.1111/medu.14152 [DOI] [PubMed] [Google Scholar]
- 34.Chandawarkar RY, Ruscher KA, Krajewski A, et al. Pretraining and posttraining assessment of residents’ performance in the fourth accreditation council for graduate medical education competency: patient communication skills. Arch Surg. 2011;146(8):916-921. doi: 10.1001/archsurg.2011.167 [DOI] [PubMed] [Google Scholar]
- 35.Pfeiffer CA, Kosowicz LY, Holmboe E, Wang Y. Face-to-face clinical skills feedback: lessons from the analysis of standardized patient's work. Teach Learn Med. 2005;17(3):254-256. doi: 10.1207/s15328015tlm1703_9 [DOI] [PubMed] [Google Scholar]
- 36.Wagner JA, Pfeiffer CA, Harrington KL. Evaluation of online instruction to improve medical and dental students’ communication and counseling skills. Eval Health Prof. 2011;34(3):383-397. doi: 10.1177/0163278710380612 [DOI] [PubMed] [Google Scholar]
- 37.VanLangen KM, Sahr MJ, Salvati LA, Meny LM, Bright DR, Sohn M. Viability of virtual skills-based assessments focused on communication. Am J Pharm Educ. 2021;85(7):8378. doi: 10.5688/ajpe8378 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38.Frydman JL, Gelfman LP, Lindenberger EC, et al. Virtual Geritalk: improving serious illness communication of clinicians who care for older adults. J Pain Symptom Manage. 2021;62(3):e206-e212. doi: 10.1016/j.jpainsymman.2021.02.024 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39.Boardman D, Wilhite JA, Adams J, et al. Telemedicine training in the COVID era: revamping a routine OSCE to prepare medicine residents for virtual care. J Med Educ Curric Dev. 2021;8:23821205211024076. doi: 10.1177/23821205211024076 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Langewitz W, Pleines Dantas Seixas U, Hunziker S, et al. Doctor-patient communication during the Corona crisis—web-based interactions and structured feedback from standardized patients at the University of Basel and the LMU Munich. GMS J Med Educ. 2021;38(4):Doc81. doi: 10.3205/zma001477 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41.Wilcha RJ. Effectiveness of virtual medical teaching during the COVID-19 crisis: systematic review. JMIR Med Educ. 2020;6(2):e20963. doi: 10.2196/20963 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Schnee D, Ward T, Philips E, et al. Effect of live attendance and video capture viewing on student examination performance. Am J Pharm Educ. 2019;83(6):6897. doi: 10.5688/ajpe6897 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Cacault MP, Hildebrand C, Laurent-Lucchetti J, Pellizzari M. Distance learning in higher education: evidence from a randomized experiment. J Eur Econ Assoc. 2021;19(4):2322-2372. doi: 10.1093/jeea/jvaa060 [DOI] [Google Scholar]
- 44.Kolb AY, Kolb DA. Experiential learning theory as a guide for experiential educators in higher education. Exp Learn Teach Higher Educ. 2017;1(1):7-44. [Google Scholar]
- 45.Dickinson KJ, Caldwell KE, Graviss EA, et al. Perceptions and behaviors of learner engagement with virtual educational platforms. Am J Surg. 2022;224(1 Pt B):371-374. doi: 10.1016/j.amjsurg.2022.02.043 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Bientzle M, Hircin E, Kimmerle J, et al. Association of online learning behavior and learning outcomes for medical students: large-scale usage data analysis. JMIR Med Educ. 2019;5(2):e13529. doi: 10.2196/13529 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47.Carter K, Podczerwinski J, Love L, et al. Utilizing telesimulation for advanced skills training in consultation and handoff communication: a post-COVID-19 GME bootcamp experience. J Hosp Med. 2021;16(12):730-734. doi: 10.12788/jhm.3733 [DOI] [PubMed] [Google Scholar]
- 48.Waseh S, Dicker AP. Telemedicine training in undergraduate medical education: mixed-methods review. JMIR Med Educ. 2019;5(1):e12515. doi: 10.2196/12515 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Prettyman AV, Knight EP, Allison TE. Objective structured clinical examination from virtually anywhere!. J Nurse Pract. 2018;14(8):e157-e163. doi: 10.1016/j.nurpra.2018.05.007 [DOI] [Google Scholar]
- 50.Swallow MA, Wride AM, Donroe JH. Peer-assisted learning in a longitudinal hybrid physical exam course. Med Sci Educ. 2023;33(2):359-362. doi: 10.1007/s40670-023-01755-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 51.Binks AP, LeClair RJ, Willey JM, et al. Changing medical education, overnight: the curricular response to COVID-19 of nine medical schools. Teach Learn Med. 2021;33(3):334-342. doi: 10.1080/10401334.2021.1891543 [DOI] [PubMed] [Google Scholar]
- 52.Blythe J, Patel NSA, Spiring W, et al. Undertaking a high stakes virtual OSCE (“VOSCE”) during COVID-19. BMC Med Educ. 2021;21(1):221. doi: 10.1186/s12909-021-02660-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.Messick S. Validity. In: Educational measurement. 3rd ed. Macmillan Publishing; 1989:13–103. [Google Scholar]
Associated Data
Supplementary Materials
Supplemental material for "A Comparison Between In-Person and Virtual Communication Skills OSCE for Medical Students" by Alex Choi, Tanya D. Murtha, Laura J. Morrison and Jaideep S. Talwalkar, in Journal of Medical Education and Curricular Development:

- sj-docx-1-mde-10.1177_23821205241241375
- sj-docx-2-mde-10.1177_23821205241241375
- sj-docx-3-mde-10.1177_23821205241241375