Missouri Medicine. 2017 Sep-Oct;114(5):396–399.

Comparison of High-Fidelity Medical Simulation to Short-Answer Written Examination in the Assessment of Emergency Medicine Residents in Medical Toxicology

Michael R Christian, Michelle J Sergel, Mark B Mycyk, Steven E Aks
PMCID: PMC6140179  PMID: 30228643

Abstract

We compared high-fidelity medical simulation to short-answer written examination in the assessment of emergency medicine residents (EMR) on a month-long medical toxicology rotation. Knowledge-based assessment tools using cases of an aspirin overdose and a tricyclic antidepressant overdose were used to assess all consecutive rotating EMR (n=53). Assessment by simulation had similar accuracy and precision but higher satisfaction rates when compared to written examination. Incorporating simulation into the American Board of Emergency Medicine (ABEM) certifying examination warrants further study.

Introduction

Medical simulation is a popular and accepted teaching tool for post-graduate education. In addition, simulation is increasingly being used as an assessment tool, particularly for procedures.1,2,3 However, data are limited regarding the use of simulation as a tool for competency assessment. Some have advocated for the use of simulation in board certification and re-certification exams.3,4,5

Furthermore, simulation may have several advantages over traditional written tests in assessing procedural competency and certain Accreditation Council for Graduate Medical Education (ACGME) core competencies (interpersonal & communication skills, professionalism, patient care, and systems-based practice). The Next Accreditation System, which started in July 2013, incorporates “milestones” into the traditional core competencies.6 Because the types of procedures and the complexity of simulation cases can be scaled up or down to match the learner’s level of training, simulation may lend itself to assessing these “milestones” more readily than traditional written tests for physicians-in-training. In a similar vein, a study by McGaghie et al. demonstrated that standard examination tools like the USMLE Steps 1 and 2 are not correlated with reliable measures of medical students’, residents’, and fellows’ clinical skill acquisition. Simulation may be a more accurate assessment tool, but more research is needed.7

Also, simulation evokes a physiologic response in physicians comparable to real-life clinical scenarios.8 Despite these possible advantages, medical simulation is much more expensive to proctor than written exams, and comparative research is warranted.9,10 We sought to assess the precision and accuracy of simulation-based testing versus traditional written testing.

Methods

Knowledge-based assessment tools using simulation and written examinations were developed, piloted, and revised by four physicians (the authors) board-certified in Medical Toxicology and/or Emergency Medicine. These physicians were familiar with the Emergency Medicine core content requirements set forth by the American Board of Emergency Medicine (ABEM). From July 2011 to June 2012, all consecutive rotating EMR on a month-long Medical Toxicology rotation at an academic Emergency Medicine training program were eligible for participation in the study. Ten board-certified medical toxicologists and four fellows-in-training staffed the Medical Toxicology rotation. The rotation consists of daily patient rounds, daily didactics, and a weekly journal club. At any given time, approximately 20–25 medical students, pharmacy students, pediatric residents, emergency medicine residents, and pediatric emergency medicine fellows are on the service. However, only emergency medicine residents were assessed in this pilot study.

On odd months, EMR were assessed using a simulation case of an aspirin (ASA) overdose patient and a written examination of a tricyclic antidepressant (TCA) overdose patient. On even months, the testing modalities were switched (TCA simulation and ASA written). Each assessment tool contained ten knowledge-based questions. For each toxin (ASA and TCA), the scenario, questions, and answers were exactly the same; only the assessment tool (simulation versus written) varied. We specifically avoided multiple-choice questions as an assessment modality because, with five answer choices, a resident would have a 20% chance of guessing the correct answer. Instead, we compared two open-ended assessment modalities (simulation versus written). At the end of the assessment, the EMR completed a survey covering demographic items, previous experience with simulation, and learner satisfaction. EMR were blinded to the purpose of the study, and scores were not used in the final rotation grade. Final scores ranged from 0 to 10, one point for each knowledge-based question answered correctly. The fourth author scored both the simulation and written tests to ensure consistency in grading. Data were analyzed using descriptive and inferential statistics. This study received IRB approval.
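As a concrete illustration of the month-based crossover assignment and the 0-to-10 scoring described above, the short Python sketch below alternates modalities by calendar month and tallies one point per correct answer. The function names and example data are hypothetical illustrations, not study code.

    # Minimal sketch of the alternating (crossover) assignment and scoring
    # described above. Names and data are hypothetical, not study code.

    def assign_modalities(month):
        """Return the toxin-to-modality mapping for a calendar month.

        Odd months: ASA via simulation, TCA via written exam.
        Even months: modalities switched.
        """
        if month % 2 == 1:
            return {"ASA": "simulation", "TCA": "written"}
        return {"ASA": "written", "TCA": "simulation"}

    def score_assessment(answers_correct):
        """Score one assessment: one point per correct answer out of ten questions."""
        assert len(answers_correct) == 10, "each tool contains ten questions"
        return sum(answers_correct)

    print(assign_modalities(7))                    # July (odd): ASA sim, TCA written
    print(score_assessment([True]*6 + [False]*4))  # prints 6 (a 6/10 score)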

Results

From July 2011 to June 2012, 53 EMR [23 females, 30 males; PGY-3 (n=38), PGY-4 (n=11), and PGY-5 (n=4)] from six different emergency medicine residencies were eligible for our study; all 53 participated and completed the study. All EMR were considered senior residents by their home institutions. Sample sizes, mean scores, and standard deviations were: ASA sim: N=26, M=6.04, SD=2.38; ASA written: N=27, M=7.30, SD=1.27; TCA sim: N=27, M=6.41, SD=1.22; and TCA written: N=26, M=6.27, SD=2.20 (see Figure 1). Although our study was not powered to find a correlation between sim and written examinations, there was no statistically significant Spearman correlation between ASA sim and written (ρ=0.698, p=0.742) or between TCA sim and written (ρ=−0.222, p=0.276). Both the written tests (51/53; 96%) and the sim tests (53/53; 100%) were judged to be fair (as opposed to not fair) by EMR. On a three-point Likert scale (very satisfied, satisfied, and not satisfied), the sim tests had a higher satisfaction rate (45/53 very satisfied; 85%) than the written tests (27/53 very satisfied; 51%) (see Figure 2). Although only 6/53 EMR (11%) had previously been assessed using simulation, the majority (38/53; 72%) would prefer to see simulation incorporated into the ABEM certifying (oral) examination.
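For readers who wish to reproduce this style of analysis, the following Python sketch computes means, standard deviations, and a Spearman rank correlation using scipy. The score vectors below are hypothetical placeholders, not the study data.

    # Sketch of the descriptive statistics and Spearman correlation reported
    # above, run on hypothetical placeholder scores (not the study data).
    import statistics
    from scipy.stats import spearmanr

    asa_sim = [6, 4, 8, 7, 5, 6, 9, 3, 7, 6]      # hypothetical 0-10 scores
    asa_written = [7, 8, 6, 7, 9, 7, 8, 6, 7, 8]  # hypothetical 0-10 scores

    print(f"ASA sim: M={statistics.mean(asa_sim):.2f}, "
          f"SD={statistics.stdev(asa_sim):.2f}")
    print(f"ASA written: M={statistics.mean(asa_written):.2f}, "
          f"SD={statistics.stdev(asa_written):.2f}")

    # Spearman rank correlation and its p-value for the paired score vectors
    rho, p = spearmanr(asa_sim, asa_written)
    print(f"Spearman rho={rho:.3f}, p={p:.3f}")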

Figure 1. Examination scores (mean and standard deviation) by toxin and assessment modality.

Figure 2. Learner satisfaction rates for simulation versus written tests.

Discussion

We feel that our study has several important implications. With regard to graduate medical education, written examinations have many shortcomings in the assessment of future physicians. Specifically, it is very difficult, if not impossible, to assess some of the ACGME’s core competencies (i.e., interpersonal & communication skills, professionalism, patient care, and systems-based practice) with a written examination. Medical simulation may be a better tool to assess some of these areas. In addition, medical simulation has a distinct advantage regarding patient safety in comparison to other methods of assessment such as direct observation. The era of “see one, do one, teach one” is quickly ending in the wake of patient safety. In fact, some literature demonstrates that simulation-based education can have a significant impact on patient outcomes (e.g., lowering the central line-associated bloodstream infection (CLABSI) rate at two participating hospitals).11 Furthermore, some literature supports the use of simulation-based assessment tools in the assessment of residents in an Anesthesiology residency.12 However, to our knowledge, no such data exist regarding residents training in Emergency Medicine and/or Medical Toxicology.

While medical simulation is certainly more expensive than proctoring a standard written test, it may be more cost-effective than other commonly used methods of assessment such as the so-called “standardized patient.” These actor-patients are costly and difficult to standardize. There is certainly an opportunity for a comparative study between a medical simulation model and a standardized patient model. Currently, every allopathic medical student in the country is required to take the United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills (CS) exam prior to receiving his or her medical license. There are only five testing centers in the U.S., and the exam is very costly. The use of a standardized medical simulation experience may lower costs overall and allow for a greater number of testing sites.

This study had several limitations. First, it was a small pilot study with only 53 participants; a larger sample size may have yielded different results. Similarly, we compared only two simulation scenarios (ASA and TCA) versus two written examinations (ASA and TCA) using ten knowledge-based questions. We chose two of the most important topics in emergency toxicology, tricyclic antidepressant and salicylate poisoning, because they are low-frequency yet high-morbidity events and are major learning objectives relevant to patient care in Emergency Medicine. We estimated that two scenarios were adequate and feasible for an initial proof-of-concept study.

There is also uncontrolled variability in the EMRs’ previous experience with medical simulation as well as medical toxicology. In addition, a senior EM resident on July 1 is obviously not as experienced in emergency medicine or medical toxicology as a senior resident at the end of the same academic year. However, this inter-learner variability would be expected to progress similarly in both study groups throughout the year, and all assessments used by the American Board of Emergency Medicine and other specialty certifying organizations must contend with variability in training and experience. While the residents were blinded to the purpose of the study and the cases were confidential, it is possible that some residents discussed the cases outside of the rotation. Finally, the scoring was performed by a single physician (the fourth author), which could introduce some bias. However, we feel that this design was an overall strength of our study: it eliminates inter-rater variability and thereby potentially reduces bias on the whole.

In the future, it would be insightful to include more patient scenarios (sim versus written) among a larger pool of EMR. Similar results in simulation accuracy and precision, as well as continued higher satisfaction, may justify a trial incorporation of medical simulation as an assessment tool in the ABEM certifying (oral) examination. As discussed previously, medical simulation may provide an avenue to assess procedural competency and certain Accreditation Council for Graduate Medical Education (ACGME) core competencies (i.e., interpersonal and communication skills, professionalism, patient care, and systems-based practice) that cannot easily be assessed using the current ABEM certifying examination format. Other medical subspecialties might utilize medical simulation for similar reasons. Eventually, it would be ideal to quantify a real change in patient outcomes and/or patient satisfaction after an intervention with a simulation-based assessment tool versus a standard written examination. It should be pointed out, however, that simulation is merely a tool that can be used during assessment; simulation cannot replace direct faculty supervision during patient care.

Conclusions

In this pilot study, assessment by simulation had similar accuracy (demonstrated by similar mean scores) and precision (demonstrated by similar standard deviations) but higher satisfaction rates when compared to a written examination. Most participants felt simulation should be incorporated into the ABEM certifying (oral) examination.

Acknowledgments

Thank you to the residents who participated in this study. We would also like to thank those who attended and participated in the 2011 North American Congress of Clinical Toxicology (NACCT) Fellow-In-Training Research Symposium and the faculty and students of the 2011–2012 Scholars for Teaching Excellence Faculty Fellowship (STEFF) at the University of Illinois at Chicago for their valuable input. Special thanks to Errick Christian, MS, for his assistance with data management and analysis.

Biography

Michael R. Christian, MD, MSMA member since 2012, is Assistant Professor of Emergency Medicine and Pediatrics, University of Missouri-Kansas City School of Medicine, Department of Emergency Medicine, Truman Medical Center, and a Medical Toxicologist, Division of Clinical Pharmacology and Medical Toxicology. Michelle J. Sergel, MD, Mark B. Mycyk, MD, and Steven E. Aks, DO, are with the John H. Stroger, Jr. Hospital in Chicago, Ill.

Email: michael.christian@tmcmed.org


Footnotes

Disclosure

None reported.

References

1. Boulet JR, Murray DJ. Simulation-based assessment in Anesthesiology: requirements for practical implementation. Anesthesiology. 2010;112(4):1041–1052. doi:10.1097/ALN.0b013e3181cea265.
2. Kaye AR, Salud LH, Domont ZB, et al. Expanding the use of simulators as assessment tools: the new pop quiz. Stud Health Technol Inform. 2011;163:271–273.
3. Marco J, Holmes DR. Simulation: present and future roles. J Am Coll Cardiol Intv. 2008;1(5):590–592. doi:10.1016/j.jcin.2008.08.015.
4. Crosby E. The role of simulator-based assessments in physician competency evaluations. Can J Anesth. 2010;57(7):627–635. doi:10.1007/s12630-010-9323-3.
5. Decker S, Utterback VA, Thomas MB, Mitchell M, Sportsman S. Assessing continued competency through simulation: a call for stringent action. Nurs Educ Perspect. 2011;32(2):120–125. doi:10.5480/1536-5026-32.2.120.
6. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system – rationale and benefits. N Engl J Med. 2012;366:1051–1056. doi:10.1056/NEJMsr1200117.
7. McGaghie WC, Cohen ER, Wayne DB. Are United States Medical Licensing Exam Step 1 and 2 scores valid measures for postgraduate medical residency selection decisions? Acad Med. 2011;86(1):48–52. doi:10.1097/ACM.0b013e3181ffacdb.
8. Bong CL, Lightdale JR, Fredette ME, Weinstock P. Effects of simulation versus traditional tutorial-based training on physiologic stress levels among clinicians: a pilot study. Simul Healthc. 2010;5(5):272–278. doi:10.1097/SIH.0b013e3181e98b29.
9. Cook DA. One drop at a time: research to advance the science of simulation. Simul Healthc. 2010;5(1):1–4. doi:10.1097/SIH.0b013e3181c82aaa.
10. Okuda Y, Bryson EO, DeMaria S, et al. The utility of simulation in medical education: what is the evidence? Mt Sinai J Med. 2009;76(4):330–343. doi:10.1002/msj.20127.
11. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;0:1–8. doi:10.1136/bmjqs-2013-002665.
12. Blum RH, Boulet JR, Cooper JB, Muret-Wagstaff SL. Simulation-based assessment to identify critical gaps in safe anesthesia resident performance. Anesthesiology. 2014;120:129–141. doi:10.1097/ALN.0000000000000055.
