AEM Education and Training. 2017 Jan 19;1(1):43–47. doi: 10.1002/aet2.10002

Emergency Radiology “Boot Camp”: Educating Emergency Medicine Residents Using E‐learning Radiology Modules

Shlomo Minkowitz 1, Kristen Leeman 1, Ashley E Giambrone 3, Jennifer F Kherani 2, Lily M Belfi 1, Roger J Bartolotta 1
Editor: Teresa Chan
PMCID: PMC6001497  PMID: 30051008

Abstract

Objectives

There is an overall paucity of literature on the radiologic education of emergency medicine (EM) clinicians. Given that many EM clinicians preliminarily review images for their patients, we hypothesized that a brief imaging curriculum could be efficacious in teaching basic and relevant radiologic interpretation.

Methods

We designed a 4‐hour “radiology boot camp” for a group of 20 EM residents (from all years of training) covering several subject‐specific e‐learning modules, which included interactive PowerPoint‐based tutorials, games, and imaging decision support simulators. The residents completed precourse and postcourse quizzes to evaluate the efficacy of these modules. Matched results from the pre‐ and posttests were analyzed using a paired t test. An additional questionnaire was administered to the EM residents to evaluate their perception of the educational experience.

Results

The precourse and postcourse quizzes demonstrated a statistically significant improvement in knowledge attributable to the educational modules (p < 0.0001). In addition, all of the participants believed the modules were a useful learning experience (100%), and the vast majority described them as a valuable resource for future reference (95%).

Conclusion

We demonstrate a model that uses e‐learning modules as an easy and effective means of educating EM residents on basic imaging interpretation and utilization.


Imaging plays a crucial role in the clinical work of the emergency medicine (EM) physician. Patients who visit the emergency department (ED) will often undergo some form of diagnostic imaging, ranging from chest radiography to brain magnetic resonance imaging to ultrasound of the lower extremities. For example, a recent study demonstrated that computed tomography (CT) scans are ordered in approximately 16.7% of adult patients visiting the ED.1

In most hospital settings, a board‐certified diagnostic radiologist is relied upon for the formal interpretation of these imaging studies. However, radiologists are not available around the clock at every institution, and immediate interpretation by a radiologist may not be feasible.2 Given the acuity of many clinical scenarios in the ED, it often benefits the EM physician to be able to interpret basic imaging studies (and/or key urgent findings on more complex imaging studies) to guide urgent clinical decisions.

Our review of the literature shows that EM physicians may have difficulty with certain radiologic interpretations, such as chest x‐ray and head CT, as discussed below in the Discussion. At present, EM residents at our institution receive no formal radiology training within their curriculum, and there is an overall paucity of literature on the topic of radiologists providing education to EM physicians. We hypothesized that a dedicated radiology curriculum for the EM residents at our institution would improve their diagnostic skills, as measured by precourse and postcourse assessments. We developed educational modules to teach EM residents to make basic and clinically relevant imaging findings in the acute setting, as well as to provide a resource for them to use going forward.

Methods

Course Description

Faculty and residents from our Department of Radiology created a variety of e‐learning modules, RadTorials, RadGames, and an interactive imaging decision support simulator (ICARUS; Interactive Clinical Anatomy and Radiology Utilization Simulator), which serve as the platform for this course and other projects. These resources are available at http://www.create-rad.com, and the various modules, described in detail elsewhere,3 are briefly explained below.

The RadTorial modules review the imaging evaluation of various clinical conditions or patient populations. The RadGames modules are interactive learning modules in question/answer format (Game of Unknowns, Ace the Case), each with a specific subspecialty focus. ICARUS uses a simulation platform to teach basic radiologic anatomy, imaging appropriateness, and basic imaging review. There were approximately 40 fully developed modules at the time of writing.

The current course was designed as a one‐time, 4‐hour session. The participants were 20 EM residents, chosen randomly and distributed equally across all years of residency. They were split into smaller groups of five to seven that rotated through each module station. We chose three sets of modules for the course: module set 1 consisted of RadTorials on stroke and cervical spine injury; module set 2 included a RadTorial on female pelvic ultrasound and an ICARUS module on small bowel obstruction; and module set 3 consisted of a RadGame on basic chest radiographic findings and a RadTorial on the radiographic appearance of tubes and lines. Each group of EM residents rotated through all three module sets and completed precourse and postcourse examinations (as described below).

Each module set was facilitated by a radiologist, with approximately five to seven EM residents per radiologist. During each module set, the EM residents worked independently, with the radiologist answering questions and offering teaching points throughout the session.

Data Collection

After institutional review board approval, the EM residents completed precourse and postcourse knowledge assessments as well as a perception survey of their educational experience. The knowledge assessment, consisting of 15 multiple‐choice questions, was administered immediately before and after the course. It was created by our radiology faculty specifically for the course and covered material from all elements of the course, with five questions relating to each of the three module sets. For example, one of the neuroradiology questions asked: “What is the key CT finding that differentiates chronic infarction from acute infarction?” (answer choices: vasogenic edema, hemorrhage, density, or volume loss). Additional sample questions are included in Appendix 1. The residents used anonymous numeric identifiers, allowing us to match their pretest and posttest responses. Additionally, after the session, the residents responded anonymously to three perception questions asking whether they found the modules useful, whether they would recommend them to other EM residents, and whether they would refer to them in the future as a resource. For these three questions, the response options were “strongly agree,” “agree,” “neutral,” “disagree,” or “strongly disagree.”

Data Analysis

Continuous variables are presented as mean ± standard deviation and categorical variables as number (percentage). Precourse and postcourse knowledge was analyzed by comparing the mean test scores via a paired t test. All p‐values are two‐sided, with significance evaluated at the 0.05 alpha level. Data were analyzed with SAS version 9.3 (SAS Institute).
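
For illustration only, the sketch below shows how such a matched pre/post comparison could be reproduced in Python rather than SAS; the score arrays are hypothetical placeholders (not our study data), and scipy's ttest_rel performs a two‐sided paired t test, consistent with the analysis described above.

  # Minimal sketch (not the authors' SAS workflow): a two-sided paired t test
  # comparing matched precourse and postcourse quiz scores at the 0.05 alpha level.
  # The score arrays below are hypothetical placeholders, not study data.
  import numpy as np
  from scipy import stats

  # Each position corresponds to one resident, matched by anonymous identifier.
  pre_scores = np.array([33, 40, 27, 47, 33, 40, 53, 33, 40, 47])    # percent correct
  post_scores = np.array([67, 73, 60, 80, 67, 73, 87, 60, 73, 80])   # percent correct

  t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)  # two-sided by default

  print(f"Mean precourse score:  {pre_scores.mean():.1f}%")
  print(f"Mean postcourse score: {post_scores.mean():.1f}%")
  print(f"Paired t = {t_stat:.2f}, p = {p_value:.4f}")
  if p_value < 0.05:
      print("Difference is statistically significant at the 0.05 level.")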

Results

On the precourse and postcourse examinations administered to the EM residents, our results demonstrated a statistically significant improvement in knowledge between the pre‐ and posttests. The mean precourse and postcourse examination scores were 38% and 71%, respectively (paired t test, p < 0.001). In addition, on the perception questions, the vast majority of participants believed that the modules were not only a useful learning experience (100%) but also a valuable resource for future reference (95%; Figure 1). The three radiologists who served as on‐site facilitators also found the sessions engaging and appreciated the opportunity to interact with residents from another department and hear their perspectives. All involved radiologists indicated that they would happily participate in a similar educational experience in the future.

Figure 1. Perception questions.

Discussion

The current project demonstrates an innovative curriculum developed by our radiology department and implemented as a 4‐hour course for the EM residents at our institution, for whom radiology education is not currently part of the curriculum. The efficacy of the course was then demonstrated using precourse and postcourse assessments. An additional benefit was the collaboration between the residents of the radiology and EM departments: both radiology and EM residents indicated that they viewed the sessions favorably and would like to be involved in something similar in the future.

Our review of the literature demonstrates that there are indeed areas in which EM physicians may have difficulty with radiologic interpretation. For example, in one study of performance‐based clinical skill assessment, EM residents were scored on their ability to perform multiple clinical and diagnostic tasks. The diagnostic tasks included the interpretation of chest radiographs, one of the most commonly performed radiologic examinations in the ED setting. Of all the clinical and diagnostic tasks required of them, the residents scored lowest on chest x‐ray interpretation.4

Additionally, several studies have compared the accuracy of imaging interpretation by EM physicians with that of diagnostic radiologists, many of which demonstrate lower accuracy among EM physicians. For example, one study demonstrated that EM physicians interpreted head CTs with a significant number of clinically important false‐positive and false‐negative interpretations (sensitivity of 88% and specificity of 80%) compared with neuroradiologists.5 Similarly, another investigation concluded that a significant number of findings were missed on chest radiography when interpreted by the EM physician (with a reported sensitivity of 20%–65%) compared with the diagnostic radiologist.6

Our hypothesis therefore was that implementation of a radiology curriculum could be efficacious, and we indeed found that our modules provided an effective way of teaching radiology to the EM residents. When searching for similar prior work, we found an overall paucity of literature on the topic of radiologists providing education to nonradiology clinicians of any specialty, including EM. The literature does describe EM departments that provide their own radiology education to their trainees, with one publication describing a course designed to educate EM residents on the assessment of critical findings on head CT.7 That study demonstrated that EM residents had some deficiencies in their interpretation of head CTs (prior to intervention) but that an educational course was beneficial. Similar paradigms exist in the education of medical students, for whom radiologists (and other physicians) have thoroughly evaluated effective teaching methods. For example, a recently published study described a method in which student preclass review of material was combined with dedicated class time for interactive knowledge application.3

Limitations

Our study has several limitations. Our sample size was small and included only EM residents, because such a group was easy to convene and, from an educational standpoint, is more controlled than a group of attendings. A more important limitation is that we evaluated only immediate postcourse knowledge and did not assess the long‐term effect on diagnostic interpretation by the EM residents, nor long‐term change in clinical management. Furthermore, our assessment was not piloted beforehand and consisted of a multiple‐choice quiz, which, although standardized, does not simulate the challenges of diagnostic interpretation as accurately as an assessment of actual interpretive skills would. A follow‐up study assessing whether this cohort of EM residents demonstrates improved diagnostic skills would be helpful to evaluate the longitudinal effect of our intervention.

Conclusion

In conclusion, our 4‐hour “boot camp” was well received by the emergency medicine residents and demonstrated that a single interactive educational session can have a significant positive impact on emergency medicine residents' knowledge of emergency radiology. Additionally, the results of our analysis provide evidence for the validity of our online educational platform (http://www.create-rad.com) as an effective way to educate nonradiologists (such as emergency medicine residents) through its various modules and tools.

At present, emergency radiology education is not part of the formal emergency medicine resident curriculum at our institution. We created an effective model for delivering educational radiology modules to nonradiology residents, which could be expanded to meet the needs of other departments' residents as well as other clinical scenarios.

Appendix 1.

ID number: ______________________________

  1. A 36‐year‐old female with right upper quadrant abdominal pain and fever presents to the ED. Her WBC count is elevated. You suspect acute cholecystitis. Which of the following is the best initial imaging test?

    1. HIDA scan

    2. CT scan of the abdomen

    3. Right upper quadrant ultrasound

    4. Upper GI series

  2. Which of the following is NOT a relative contraindication to intravenous contrast for CT?

    1. Asthma

    2. Multiple myeloma

    3. Shellfish allergy

    4. Sickle cell disease

    5. Collagen vascular disease

  3. Which of the following values is closest to the effective dose of radiation for CT abdomen/pelvis?

    1. 1 mSv

    2. 10 mSv

    3. 20 mSv

    4. 30 mSv

    5. 50 mSv

  4. Which of the following ovarian abnormalities can demonstrate the “string of pearls sign” (enlarged ovary with peripheralized follicles)?

    1. Corpus luteum

    2. Hemorrhagic cyst

    3. Ovarian torsion

    4. Endometrioma

    5. Dermoid

  5. Which of the following is the most reliable differentiator between retained products of conception and gestational trophoblastic disease?

    1. Beta‐HCG levels

    2. Color Doppler flow

    3. Echogenicity

    4. Free fluid

    5. Internal septations


The authors have no relevant financial information or potential conflicts to disclose.

References

1. Kirsch TD, Hsieh YH, Horana L, Holtzclaw SG, Silverman M, Chanmugam A. Computed tomography scan utilization in emergency departments: a multi‐state analysis. J Emerg Med 2011;41:302–9.
2. DeFlorio R, Coughlin B, Coughlin R, Li H, Santoro J, Akey B, Favreau M. Process modification and emergency department radiology service. Emerg Radiol 2008;15:405–12.
3. Belfi LM, Bartolotta RJ, Giambrone AE, Davi C, Min RJ. “Flipping” the introductory clerkship in radiology: impact on medical student performance and perceptions. Acad Radiol 2015;22:794–801.
4. Burdick WP, Ben‐David MF, Swisher L, et al. Reliability of performance‐based clinical skill assessment of emergency medicine residents. Acad Emerg Med 1996;3:1119–23.
5. Boyle A, Staniciu D, Lewis S, et al. Can middle grade and consultant emergency physicians accurately interpret computed tomography scans performed for head trauma? Cross‐sectional study. Emerg Med J 2009;26:583–5.
6. Gatt ME, Spectre G, Paltiel O, Hiller N, Stalnikowicz R. Chest radiographs in the emergency department: is the radiologist really necessary? Postgrad Med J 2003;79:214–7.
7. Perron AD, Huff JS, Ullrich CG, Heafner MD, Kline JA. A multicenter study to improve emergency medicine residents' recognition of intracranial emergencies on computed tomography. Ann Emerg Med 1998;32:554–62.
