In response to the COVID-19 pandemic, the American Board of Surgery (ABS) cancelled the March and April 2020 Certifying Exams (CE) and adapted the July Qualifying Exams (QE) to a virtual format. Surgical educators faced a similar choice: cancel mock oral examinations (MOE) or shift to a virtual format.1 We chose to adapt our MOE to a virtual format, recognizing the MOE as a valuable educational and assessment tool that helps residents identify areas of weakness and guides their self-preparation for the ABS CE.2-4 Here, we report our methods for conducting a virtual multi-institutional MOE, faculty and resident perceptions of the virtual format, and challenges unique to delivering a virtual examination.
For the virtual exam, we used an online survey tool (Qualtrics, Provo, UT) to distribute test materials to faculty examiners. We selected 12 cases from an existing institutional catalog, with 4 cases in each of 3 thematic categories: Trauma and Critical Care, Acute Care and GI Surgery, and Subspecialty and Oncologic Surgery. For each case, the survey first displayed a clinical prompt for the examiner to read aloud, followed by 4–6 open-ended clinical questions. Examiners could show examinees clinical images embedded in the survey via screen sharing.
Scoring criteria were included with each question, and faculty examiners scored resident responses in real time as pass, fail, or, when applicable, critical fail. A critical fail resulted in automatic failure of the entire case. At the end of each completed case, examiners also assigned an overall “case score” of 0–3 (0 = did not complete, 1 = fail, 2 = borderline, 3 = pass) and submitted feedback to the examinee. One week prior to the exam, each faculty examiner received a personalized web link to the survey to preview case content.
To facilitate the virtual exam, we used a teleconferencing platform (Zoom Video Communications, San Jose, CA). General surgery residents in post-graduate years 3–5 of clinical training and surgical faculty from 3 academic medical centers were invited to participate. The exam day began with a brief orientation, after which an administrator assigned participants to breakout rooms within one of three Zoom meetings. In each breakout room, 2 examiners administered the 4 cases of one thematic category to a single examinee over a 20-minute period. Examiners then remained in the same virtual room while examinees rotated to the next room, until each examinee had completed all 12 cases. At the end of the exam, faculty and residents completed a survey on exam satisfaction, perceptions of the virtual format compared with the in-person format, and any technical issues encountered. Examinees were scored on 2 separate metrics: the average case score across completed cases and the percentage of completed questions passed. Examinees failed the MOE if they scored more than one standard deviation below the mean on either metric.
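For illustration, the following minimal sketch shows how the two exam-level metrics and the failure threshold described above could be computed. The data structures, field names, and functions are our own assumptions for this example, not the actual scoring software used in the study.

```python
from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class CaseResult:
    """One examiner-scored case (illustrative field names)."""
    question_outcomes: list  # each entry: "pass", "fail", or "critical_fail"
    case_score: int          # 0 = did not complete, 1 = fail, 2 = borderline, 3 = pass

def apply_critical_fail(case: CaseResult) -> CaseResult:
    """A critical fail on any question fails the entire case."""
    if "critical_fail" in case.question_outcomes and case.case_score > 1:
        case.case_score = 1
    return case

def exam_metrics(cases: list) -> tuple:
    """Return (average case score across completed cases,
    fraction of completed questions passed)."""
    completed = [c for c in cases if c.case_score > 0]  # score 0 = not completed
    avg_case_score = mean(c.case_score for c in completed)
    questions = [q for c in completed for q in c.question_outcomes]
    pct_questions_passed = sum(q == "pass" for q in questions) / len(questions)
    return avg_case_score, pct_questions_passed

def failed_moe(examinee: tuple, cohort: list) -> bool:
    """Fail if either metric falls more than one standard deviation below
    the cohort mean (our reading of the stated failure criterion)."""
    for i in range(2):
        values = [metrics[i] for metrics in cohort]
        if examinee[i] < mean(values) - pstdev(values):
            return True
    return False
```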
In total, 44 general surgery residents and 42 faculty examiners participated in the virtual MOE. The exam completion rate among examinees was 97.6% (43/44); one resident ended the exam early due to inadequate Internet access and cell phone reception. The overall pass rate was 75%, with an average question pass rate of 84% [95% CI: 68–99%] and an average case score of 2.6 [95% CI: 2.0–3.0].
The virtual MOE was well received, with high completion and satisfaction rates. Post-exam surveys had response rates of 72% (32/44) among residents and 100% (42/42) among faculty. Overall, 87.5% (28/32) of residents and 100% (42/42) of faculty were satisfied with the virtual format (p = 0.03), and nearly all residents and faculty believed the MOE would help residents prepare for the ABS CE (94% vs. 100%, p = 0.18). Comparing the two formats, 75% (24/32) of residents and 73% (11/15) of faculty felt the virtual format made participation easier (p = 0.99). However, residents were significantly more likely than faculty to report better communication with the other party in the in-person format than in the virtual format (78% vs. 40%, p = 0.02).
The main challenges of the virtual format were communication and technical audiovisual issues. Technical issues were prevalent during the exam, with significantly more residents than faculty reporting audiovisual or connectivity problems (65% vs. 28%, p = 0.02). While participants considered most technical issues minor, 12% of residents and 6% of faculty experienced a major technical issue that they perceived to have affected the exam (p = 0.65). Despite these issues, nearly all faculty were comfortable with the web-based interface and believed the virtual format was an efficient way to conduct the MOE and the ABS CE [97.6% (41/42) and 95.2% (40/42), respectively].
Overall, we found that a multi-institutional virtual MOE was feasible, well received, and offered several advantages. More faculty members from separate institutions were able to participate, allowing examinees to be tested by unfamiliar examiners. The virtual format also eliminated the need for on-site logistical preparations and minimized travel time. Clear case prompts displayed on screen reduced the preparation time required of faculty members, and the online scoring system allowed faster, semi-automated generation of score reports.
Residents may be more likely than faculty to experience and recognize technical issues, possibly due to the stress of undergoing an examination. We therefore recommend that all participants perform an advance audiovisual check of their Internet connection and microphone.5,6 Accommodations should be arranged for those who identify Internet connectivity issues.
For better or worse, virtual assessments are anticipated to be necessary for the foreseeable future. Future efforts should focus on the standardization of an online exam format, prevention and minimization of technical issues, and improving virtual communication. In this difficult time, we must continue to adapt education modalities and formats for our trainees using creative yet pragmatic methods and further explore the use of virtual technology in education.
Funding
This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
References
1. Chick R.C., Clifton G.T., Peace K.M. Using technology to maintain the education of residents during the COVID-19 pandemic. J Surg Educ. 2020;77(4):729–732. doi: 10.1016/j.jsurg.2020.03.018.
2. Lu Y., Miranda R., Quach C. Standardized multi-institutional mock oral examination: a feasible and valuable educational experience for general surgery residents. J Surg Educ. 2020. doi: 10.1016/j.jsurg.2020.05.015.
3. Higgins R., Deal R., Rinewalt D. The utility of mock oral examinations in preparation for the American Board of Surgery certifying examination. Am J Surg. 2016;211(2):416–420. doi: 10.1016/j.amjsurg.2015.09.008.
4. Meyerson S., Lipnick S., Hollinger E. The usage of mock oral examinations for program improvement. J Surg Educ. 2017;74(6):946–951. doi: 10.1016/j.jsurg.2017.05.003.
5. Jones R., Abdelfattah K. Virtual interviews in the era of COVID-19: a primer for applicants. J Surg Educ. 2020;77(4):733. doi: 10.1016/j.jsurg.2020.03.020.
6. McKinley S.K., Fong Z.V., Udelsman B., Rickert C.G. Successful virtual interviews: perspectives from recent surgical fellowship applicants and advice for both applicants and programs. Ann Surg. 2020;272(3):e192–e196. doi: 10.1097/SLA.0000000000004172.