Journal of the American College of Emergency Physicians Open
2021 Jun 14;2(3):e12439. doi: 10.1002/emp2.12439

Teaching emergency ultrasound to emergency medicine residents: a scoping review of structured training methods

Leila L PoSaw 1, Brandon M Wubben 2, Nicholas Bertucci 2, Gregory A Bell 2, Heather Healy 3, Sangil Lee 2
PMCID: PMC8202829  PMID: 34142104

Abstract

Background

Over the past 2 decades, emergency ultrasound has become essential to patient care and is a mandated competency for graduation from emergency medicine residency. However, the best evidence regarding emergency ultrasound education in residency training has not been established. We performed a scoping review to determine the (1) characteristics and (2) outcomes of published structured training methods, (3) the quality of publications, and (4) the implications for research and training.

Methods

We searched broadly on multiple electronic databases and screened studies from the United States and Canada describing structured emergency ultrasound training methods for emergency medicine residents. We evaluated methodological quality with the Medical Education Research Study Quality Instrument (MERSQI), and qualitatively summarized study and intervention characteristics.

Results

A total of 109 studies were selected from 6712 identified publications. Publications mainly reported 1 group pretest–posttest interventions (38%) conducted at a single institution (83%), training in the image acquisition (82%) and interpretation (94%) domains with assessment of knowledge (44%) and skill (77%) outcomes, and training in cardiac (18%) or vascular access (15%) applications. Innovative strategies such as gamification, cadaver models, and hand motion assessment are described. The MERSQI scores of 48 articles ranged from 0 to 15.5 (median, 11.5; interquartile range, 9.6–13.0) out of 18. Low scores reflected the absence of reported valid assessment tools (73%) and higher level outcomes (90%).

Conclusions

Although innovative strategies are illustrated, the overall quality of research could be improved. The use of standardized planning and assessment tools, intentionally mapped to targeted domains and outcomes, might provide valuable formative and summative information to optimize emergency ultrasound research and training.

Keywords: emergency medicine, emergency ultrasound, graduate medical education, scoping review, teaching, training

1. INTRODUCTION

1.1. Background

In the United States, the leading organizations in emergency medicine, including the American College of Emergency Physicians (ACEP), the Society for Academic Emergency Medicine (SAEM), the Council of Emergency Medicine Residency Directors (CORD), the Accreditation Council for Graduate Medical Education (ACGME), and the American Board of Emergency Medicine (ABEM), released collaborative guidelines in 2001 that listed “bedside ultrasonography” as 1 of the procedures and skills integral to the practice of emergency medicine. 1 In 2012, the ACGME designated emergency ultrasound as an essential patient care skill and mandated that all emergency medicine residents attain competency in emergency ultrasound by the completion of residency training. 2 Subsequently, this was endorsed by ABEM, and an additional framework for defining competency was provided by the consensus guidelines from the CORD‐SAEM emergency ultrasound milestones project in 2013. 3 , 4

To meet expanding competency requirements, emergency medicine residency programs in the United States have been tasked with providing residents adequate emergency ultrasound instruction. Since the first model emergency ultrasound curriculum was developed by Mateer et al 5 in 1994, guidelines for emergency ultrasound training have evolved significantly. In 2002, the Scope of Training Task Force recommended, as best practice, teaching applications in “discrete sessions” within a 2‐day course with both didactic (lecture) and hands‐on (laboratory) components. 6 In 2008, ACEP published comprehensive guidelines, subsequently recognized by the SAEM and the American Institute of Ultrasound in Medicine, that recommended a 1‐day orientation course early in residency training and a standard 2‐day course with lectures and technical components. 7 , 8 , 9 Also in 2008, CORD published a model emergency ultrasound curriculum with minimum education standards and a framework for the integration of emergency ultrasound into resident education. 10 Revised guidelines were published by ACEP in 2017, which further defined the components of emergency ultrasound competency and detailed expanded core applications and competency assessment. 11

The Canadian Association of Emergency Physicians (CAEP) first published a general position statement on the use of ultrasound in the emergency department (ED) in 1999. 12 , 13 This was updated in 2006 and 2012 to include training recommendations, including that all practicing emergency medicine physicians be competent in the core applications of focused assessment with sonography for trauma (FAST), abdominal aorta aneurysm identification, first trimester pregnancy, thoracic ultrasound, focused cardiac ultrasound, and guided vascular access. 12 , 13 The 2014 guidelines from the Royal College of Physicians and Surgeons of Canada also included 6 targeted ED ultrasound examinations as a core competency. 14 In 2018, the CAEP working group discussed expanding the core ultrasound applications, and although agreement was not reached, it concluded that there was a need for frequent review and reassessment of the core emergency ultrasound curriculum. 15

1.2. Importance

The translation of these guidelines into published literature related to emergency ultrasound education during residency training is largely unknown. We conducted a comprehensive scoping review of the emergency ultrasound education literature with the express purpose of collating, critically appraising, and highlighting quality structured training interventions to better understand the state of emergency ultrasound education research. The scoping review method was selected for its rigor and transparency, with the potential to map primary research, identify gaps in the evidence base, summarize findings, and facilitate use by policy makers and practitioners.

1.3. Goals of this investigation

Our objectives were framed by the following questions:

  1. What are the reported range and characteristics of structured emergency ultrasound interventions that have been used to train emergency medicine residents?

  2. What types of outcome evidence support the effectiveness of these published methods?

  3. What is the quality of the selected emergency ultrasound publications?

  4. What are the implications for emergency ultrasound research and training?

2. METHODS

2.1. Protocol and registration

We registered our study protocol on the Open Science Framework (OSF) registry. 16 Our study followed the Preferred Reporting Items for Systematic Reviews and Meta‐Analyses extension for scoping reviews (PRISMA‐ScR) standards, 17 which incorporate the guidelines of Arksey and O'Malley, 18 Levac and colleagues, 19 and the Joanna Briggs Institute. Our checklist is presented as Supporting Information Table S1. Our protocol included the critical appraisal of individual studies (optional items 12 and 16, PRISMA‐ScR). This scoping review was exempt from institutional ethics approval.

2.2. Eligibility criteria

We included studies that met the following criteria: (1) described an educational intervention (systematic instruction as a program, course, curriculum, or pedagogical technique), (2) trained emergency medicine residents at any post‐graduate level, (3) trained in the United States or Canada, (4) were described in English, (5) were prospective studies, surveys, or descriptions, and (6) trained in any of the 12 core emergency ultrasound applications (focused assessment with sonography for trauma, gallbladder, aorta, renal, cardiac, airway/thoracic, gastrointestinal, lower extremity venous, ocular, soft tissue/musculoskeletal, pelvic/obstetric, procedures) or in adjunct emergency ultrasound applications (including but not limited to trans‐esophageal echocardiogram and transcranial Doppler). 11 , 15 Mixed learning groups that included emergency medicine residents were included.

We excluded studies that (1) did not describe an educational intervention, (2) did not have education as their primary focus, (3) were published in a language other than English, or (4) were review articles, clinical practice guidelines, or editorials.

2.3. Information sources

The searches were developed and conducted by a health science librarian (HH) trained in systematic searching. A broad set of search terms was identified, and search strategies using both subject heading and keyword methods were created for PubMed, Cochrane CENTRAL (Wiley), ERIC (EBSCOhost), and Embase (Elsevier). The initial search was conducted in November 2017 and was updated in May 2019. No date restrictions were applied. The PubMed and Embase searches were limited to English. Studies were de‐duplicated using the method described by Bramer and colleagues. 20

2.4. Search

Our PubMed search strategy is presented in Supporting Information Table S2. The search strategies for the other databases are available on request.

2.5. Selection of sources of evidence

All retrieved studies were transferred to Covidence software. 21 First, the reviewers held several discussions to confirm the concepts and definitions relevant to article selection. Then, 2 of 4 reviewers (SL, GAB, LLP, and BMW) independently screened studies by title and abstract and identified articles for full text review. Articles were assigned to a single exclusion category based on a predetermined ordered list of categories. Discordance was resolved at group meetings of 3 reviewers, where the third reviewer mediated after discussion. Second, full text articles were reviewed by 2 reviewers (LLP and SL), with discordance resolved by reviewer discussion. The data for both screenings are archived in the Covidence software and are available on request.

2.6. Data charting process

Select data items were charted on custom forms (Microsoft Excel software) by a single reviewer (LLP) and checked for accuracy by other reviewers (SL and BMW); disagreements were resolved by discussion. Form 1 collected data on study characteristics (year, country of implementation, design, sample size, number of institutions, and learner type); Form 2 collected data on intervention characteristics (application taught; training domains [indications, image acquisition, image interpretation, clinical integration, and documentation/reimbursement]; and learner outcomes [benefit to patients, behavior change, skills, knowledge, self‐efficacy, attitudes, and reaction]); and Form 3 collected data on educational strategies (pre‐intervention [asynchronous learning], intervention [design, learner assessment], and post‐intervention [program and learner evaluation survey]). Individual studies were mapped to 1 or more domains.

2.7. Critical appraisal of individual studies

A total of 48 studies were critically appraised for methodological quality using the Medical Education Research Study Quality Instrument (MERSQI) by 2 reviewers (LLP and BMW). Because abstracts are brief, a quantitative MERSQI analysis was not performed on the 61 abstracts. The MERSQI is a 10‐item instrument organized into 6 domains (study design, sampling, type of data, validity, data analysis, and outcomes). Each domain has a maximum score of 3, for a total score of 18. 22 , 23 There are no published (print or online) training modules available for the MERSQI, so both reviewers (LLP and BMW) self‐trained by reading the primary MERSQI articles, 22 , 23 , 24 and item and domain definitions were thoroughly discussed before scoring. Constructs for the 3 validity measures, (1) content, (2) internal structure, and (3) relationship to other variables, followed those presented by Beckman and colleagues. 25 The wide variability in reporting made it necessary for both reviewers to meet frequently and review their independent scores for each of the 48 studies; all items were discussed to consensus.

2.8. Synthesis of results

Studies were grouped by general study characteristics, including design, sample size, learner type and response rate, number of institutions, and by educational intervention characteristics, including emergency ultrasound application, techniques, training domains, outcomes, learner assessment, and program evaluation.

Training strategies were grouped as: (1) pre‐intervention asynchronous learning, (2) intervention design, (3) learner assessment, and (4) post‐intervention survey. Training domains were grouped as: (1) indications, (2) image acquisition, (3) image interpretation, (4) clinical integration, and (5) quality, documentation, and reimbursement.

Training outcomes were grouped as: “Reaction to educational experience,” “Attitudes,” “Self‐efficacy,” “Knowledge,” “Skills,” “Behaviors,” and “Benefit to patients.” The outcome categories were adapted from Kirkpatrick's Hierarchy of Levels of Outcomes. 26 , 27 , 28

3. RESULTS

3.1. Selection of studies

Our PRISMA‐ScR flow diagram is shown as Figure 1. A broad search on multiple electronic databases yielded 6712 studies. For the first screening, a total of 4852 unique studies were screened by title and abstract, yielding 1320 studies. For the second screening, these 1320 studies were screened by full text, and 1211 studies were excluded for the following reasons: 558 studies had the wrong study intention (not an educational focus), 192 studies targeted the wrong learners (not emergency medicine residents), 133 studies were located outside the United States and Canada, 141 studies had the wrong intervention (116 studies were not structured training methods and 25 studies were not emergency ultrasound), 104 studies were not emergency medicine based, and 83 studies were the wrong study design (reviews, guidelines, editorials). The remaining 109 studies, 48 articles 5 , 29 , 30 , 31 , 32 , 33 , 34 , 35 , 36 , 37 , 38 , 39 , 40 , 41 , 42 , 43 , 44 , 45 , 46 , 47 , 48 , 49 , 50 , 51 , 52 , 53 , 54 , 55 , 56 , 57 , 58 , 59 , 60 , 61 , 62 , 63 , 64 , 65 , 66 , 67 , 68 , 69 , 70 , 71 , 72 , 73 , 74 , 75 and 61 abstracts, 76 , 77 , 78 , 79 , 80 , 81 , 82 , 83 , 84 , 85 , 86 , 87 , 88 , 89 , 90 , 91 , 92 , 93 , 94 , 95 , 96 , 97 , 98 , 99 , 100 , 101 , 102 , 103 , 104 , 105 , 106 , 107 , 108 , 109 , 110 , 111 , 112 , 113 , 114 , 115 , 116 , 117 , 118 , 119 , 120 , 121 , 122 , 123 , 124 , 125 , 126 , 127 , 128 , 129 , 130 , 131 , 132 , 133 , 134 , 135 , 136 were included in this review.

FIGURE 1.


Preferred Reporting Items for Systematic Reviews and Meta‐Analyses extension for scoping reviews (PRISMA‐ScR): study selection process

The chance‐adjusted interrater agreement (Cohen's κ with 95% confidence interval [CI]) for each set of paired reviewers in the first screening was 0.33 (0.28–0.38), 0.39 (0.33–0.44), 0.48 (0.37–0.58), and 0.57 (0.51–0.62); in the second screening it was 0.36 (0.26–0.47).
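For readers unfamiliar with the statistic, Cohen's κ compares observed agreement with the agreement expected by chance from each rater's label frequencies. The sketch below is a minimal illustration in plain Python, using hypothetical include/exclude screening decisions (not the study's actual data), and omits the 95% CI calculation.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Chance-adjusted agreement between 2 raters labeling the same items."""
    n = len(rater_a)
    # Observed agreement: proportion of items with identical labels.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical include/exclude decisions for 10 screened titles
a = ["in", "in", "out", "out", "in", "out", "out", "out", "in", "out"]
b = ["in", "out", "out", "out", "in", "out", "in", "out", "in", "out"]
print(round(cohen_kappa(a, b), 2))  # 0.58
```

Here the raters agree on 8 of 10 items (p_o = 0.80) but would agree on 0.52 by chance alone, giving κ ≈ 0.58, in the same "moderate agreement" range as the values reported above.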

3.2. Characteristics of studies

The 109 studies (48 articles and 61 abstracts) were published from 1994 to 2019: 9 (8.2%) studies were published prior to 2005, 10 (9.1%) between 2006 and 2010, 55 (50.4%) between 2011 and 2015, and 35 (32.1%) between 2016 and 2019. Most studies (86, 78.8%) were conducted in the United States and 23 (21.1%) in Canada.

The study designs of the 109 studies were analyzed: 41 (37.6%) were one group pretest and posttest studies, 34 (31.1%) were cross‐sectional or one group posttest studies, 16 (14.7%) were randomized studies, and 5 (4.6%) were descriptive studies. The majority (91, 83.4%) were conducted at a single institution, and 12 (11%) were conducted at 2 or more sites. The median sample size was 30 (range, 0–99) in the 48 articles and 18 (range, 0–900) in the 61 abstracts. A total of 57 (52.2%) studies taught emergency medicine residents exclusively.

3.3. Critical appraisal of quality

The MERSQI composite scores of the 48 articles, at the domain and item level, are presented in Table 1. Four (8%) of the 48 studies were descriptive in nature; they were included in the MERSQI calculations but received a score of 0 for all items. Out of a possible 18, the MERSQI scores of individual studies ranged from 0 to 15.5 (median, 11.5; interquartile range, 9.6–13.0). At the individual domain level, studies scored best on data measurement and analysis. Common reasons for lower scores were omitting a control group (77%), studying a single institution (78%), not reporting the validity of assessment tools (73%), and evaluating skills or attitudes rather than behaviors or patient outcomes (90%). The MERSQI scores of individual studies are presented in Supporting Information Table S4.

TABLE 1.

Summary of MERSQI domain and item scores for 48 selected studies a

Each domain has a maximum possible score of 3 (total maximum, 18). Entries show the points assigned per response, studies no. (%), b and median score (Q1–Q3). c

Study design (domain median 1.5 [1.0–1.5])
  1. Study design (item median 1.5 [1.0–1.5])
    Descriptive: 0 points, 4 (8)
    Single group cross‐sectional or single group posttest only: 1 point, 13 (27)
    Single group pre‐ and posttest: 1.5 points, 20 (42)
    Non‐randomized, 2 group: 2 points, 5 (10)
    Randomized controlled trial: 3 points, 6 (13)
Sampling (domain median 2.0 [1.0–2.0])
  2. No. of institutions studied (item median 0.5 [0.5–0.5])
    None: 0 points, 4 (8)
    Single institution: 0.5 points, 38 (79)
    Two institutions: 1 point, 3 (6)
    More than 2 institutions: 1.5 points, 3 (6)
  3. Response rate, % (item median 1.5 [0.5–1.5])
    Not applicable: 0 points, 6 (13)
    <50 or not reported: 0.5 points, 8 (17)
    50–74: 1 point, 4 (8)
    >75: 1.5 points, 30 (63)
Type of data (domain median 3.0 [3.0–3.0])
  4. Type of data (item median 3.0 [3.0–3.0])
    No assessment: 0 points, 4 (8)
    Assessment by study participant: 1 point, 7 (15)
    Objective measurement: 3 points, 37 (77)
Validity of evaluation instrument (domain median 0 [0–2.0])
  5. Internal structure (item median 0 [0–1.0])
    Not reported: 0 points, 35 (73); reported: 1 point, 13 (27)
  6. Content (item median 0 [0–1.0])
    Not reported: 0 points, 34 (71); reported: 1 point, 14 (29)
  7. Relationship to other variables (item median 0 [0–1.0])
    Not reported: 0 points, 35 (73); reported: 1 point, 13 (27)
Data analysis (domain median 3.0 [3.0–3.0])
  8. Appropriateness of analysis (item median 1.0 [1.0–1.0])
    Inappropriate for study design or type of data: 0 points, 5 (10)
    Appropriate for study design and type of data: 1 point, 43 (90)
  9. Complexity of analysis (item median 2.0 [2.0–2.0])
    No analysis: 0 points, 4 (8)
    Descriptive analysis only: 1 point, 7 (15)
    Beyond descriptive analysis: 2 points, 37 (77)
Outcomes (domain median 1.5 [1.5–1.5])
  10. Outcomes (item median 1.5 [1.5–1.5])
    None: 0 points, 4 (8)
    Satisfaction, attitudes, perceptions, opinions, general facts: 1 point, 7 (15)
    Knowledge, skills: 1.5 points, 32 (67)
    Behaviors: 2 points, 2 (4)
    Patient/health care outcome: 3 points, 3 (6)
Total: maximum 18; median 11.5 (9.6–13.0)

MERSQI, medical education research study quality instrument; Max, maximum.

a Table adapted from Reed et al. 22

b Percentages may not total 100 due to rounding.

c Interquartile range reported as Q1–Q3.

An example of a study 42 with a high MERSQI score (15.5/18) was a randomized trial with a control group (3/3) that had >75% learner response rate (1.5/1.5). Learners were assessed with a written knowledge pretest and posttest, as well as a skill assessment (3/3). Multiple, blinded evaluators performed a validated objective structured clinical examination (OSCE) test (3/3). Data analysis was appropriate and beyond descriptive analysis (3/3). This study lost points, however, because it was conducted at a single institution (0.5/1.5) and only assessed knowledge and skill outcomes (1.5/3).

This is in contrast to a study 30 with a low MERSQI score (5/18) that was a single group cross‐sectional study (1/3) conducted at a single institution (0.5/1.5), with a sampling response rate that was not reported (0.5/1.5). No validated evaluation instrument was used (0/3), and learners assessed themselves (1/3). Data analysis was descriptive and inappropriate for the study design (1/3), and only satisfaction, attitude, and perception outcomes were assessed (1/3).
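As an arithmetic check, the two example totals above can be reproduced by summing points within the 6 MERSQI domains. The per-domain breakdowns below are a sketch that follows the narrative descriptions, not the authors' actual scoring forms.

```python
# Hypothetical per-domain point breakdowns for the 2 studies described above.
# Each MERSQI domain is capped at 3 points; the total maximum is 18.
high_scoring = {                   # randomized trial scored 15.5/18
    "study design": 3.0,           # randomized controlled trial
    "sampling": 0.5 + 1.5,         # single institution + >75% response rate
    "type of data": 3.0,           # objective measurement
    "validity": 1.0 + 1.0 + 1.0,   # content, internal structure, other variables
    "data analysis": 1.0 + 2.0,    # appropriate + beyond descriptive
    "outcomes": 1.5,               # knowledge and skill outcomes only
}
low_scoring = {                    # cross-sectional study scored 5/18
    "study design": 1.0,           # single group cross-sectional
    "sampling": 0.5 + 0.5,         # single institution + rate not reported
    "type of data": 1.0,           # self-assessment by study participants
    "validity": 0.0,               # no validated instrument
    "data analysis": 1.0,          # descriptive only, inappropriate for design
    "outcomes": 1.0,               # satisfaction, attitudes, perceptions
}
print(sum(high_scoring.values()), sum(low_scoring.values()))  # 15.5 5.0
```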

3.4. Individual study characteristics

Detailed characteristics of each of the 109 studies are presented in Supporting Information Table S3.

3.5. Synthesis of results

For all 109 studies, training strategies are presented in Table 2, training domains are presented in Table 3, and training outcomes are presented in Table 4.

TABLE 2.

Pre‐intervention, intervention, learner assessment, and post‐intervention educational strategies performed in 109 selected studies

Domains Studies N (%)
Pre‐intervention
Asynchronous learning Amini, 32 Arntfield, 33 Caffery, 39 Chenkin, 42 Gable, 97 Hafez, 101 Hall, 103 Jang, 49 Jang, 50 Laack, 53 Lewiss, 56 Liteplo, 57 McGraw, 63 Minnigan, 114 Norris, 119 Parks, 67 Parks, 123 Parks, 124 Stolz, 74 Stolz, 132 Woodcroft 136 21 (19)
Intervention
Model curriculum Adhikari, 77 Alkhalifah, 30 Amini, 32 Amini, 31 Bahner, 78 Bayci, 34 Boulger, 37 Chenkin, 82 Field, 94 Gable, 97 Grall, 45 Hall, 103 Hayward, 48 Jones, 51 Lall, 54 Lanoix, 55 Lee, 112 Leung, 113 Mahler, 60 Mandavia, 62 Mateer, 5 McGraw, 63 Noble, 66 Norris, 119 Shah, 70 Stolz 74 , Stolz, 132 Woodcroft 136 28 (26)
Large scale institutional training Grudziak, 47 Sessler 128 2 (2)
Simulation
Human models Amini, 31 Amini, 32 Bayci, 34 Berg, 35 Chao, 81 Chenkin, 42 De Lucia, 90 Dulani, 91 Duran Gehring, 92 Hall, 103 Hrymak, 109 Jones, 51 Lewiss, 56 Liteplo, 57 Noble, 66 Salen, 69 Shah, 70 Shah, 129 Shokoohi, 71 Williams 135 20 (18)
Cadaver models Adan, 76 Ghosh, 99 Herring, 107 Laack, 53 4 (4)
Animal models Berg, 35 Bloch, 36 Campanella, 40 Ferre, 93 Nguyen‐Phuoc 117 5 (5)
Patients Jang, 49 Jang, 50 Lanoix, 55 MacVane, 59 Mandavia, 62 Miller, 64 Nguyen, 116 Nguyen‐Phuoc, 117 Norris, 119 Shokoohi, 71 Smalley 72 11 (10)
Mannequins and phantoms Adan, 76 Akhtar, 29 Alkhalifah, 30 Arntfield, 33 Bayci, 34 Bayers, 38 Caffery, 39 Chenkin, 84 Chenkin 83 , Chenkin, 41 Chenkin, 42 Cho, 86 Corujo, 88 Furman, 96 Girzadas, 44 Godbout, 100 Greenstein, 46 Grudziak, 47 Hakmeh, 102 Hall, 103 Haydel, 106 Hayward, 48 Hrymak, 108 Hrymak, 109 Jagneaux, 110 Laack, 53 Lall, 54 Lewiss, 56 Liteplo, 57 Lobo, 58 Mallin, 61 McGraw, 63 Minnigan, 114 Nguyen, 118 Norris, 119 Olson, 121 Olszynski, 122 O'Keefe, 120 Parks, 67 Parks, 123 Parks, 124 Runde, 127 Salen, 69 Sessler, 128 Sommerkamp, 73 Staum, 131 Woo, 75 Woodcroft 136 48 (44)
Novel educational techniques Chenkin, 83 Clinton, 87 Field, 95 Gelabert, 98 Kerwin, 52 Kluger, 111 Mallin, 61 Miller, 64 Morse, 65 Nelson, 115 Nguyen, 116 O'Keefe, 120 Olszynski, 122 Shokoohi, 71 Sommerkamp, 73 Williams 135 16 (15)
Case‐based learning Adhikari, 77 Alkhalifah, 30 Amini, 31 Amini, 32 Bharati, 79 Byars, 38 Byars, 80 Chao, 81 Chenkin, 41 Datta, 89 Duran Gehring, 92 Field, 95 Gable, 97 Girzadas, 44 Grall, 45 Greenstein, 46 Hall, 103 Hayward, 48 Jones, 51 Laack, 53 Lall, 54 Leung, 113 MacVane, 59 Minnigan, 114 Parks, 67 Parks, 123 Parks, 124 Rohra, 126 Stolz, 74 Stolz 132 31 (28)
Social media: blog, Twitter, Facebook, YouTube Bahner, 78 Hafez, 101 Tyler 133 3 (3)
Multimedia/online modules Amini, 31 Amini, 32 Bayci, 34 Bharati, 79 Byars, 80 Chao, 81 Chenkin, 42 Chenkin, 43 Chenkin, 85 Datta, 89 Dulani, 91 Field, 94 Field, 95 Gable, 97 Hafez, 101 Hall, 103 Hassani, 105 Kerwin, 52 Laack, 53 McGraw, 63 Minnigan, 114 Nguyen, 116 Norris, 119 Peterson, 125 Platz, 68 Rohra, 126 Shah, 70 Smalley, 72 Stolz, 74 Tyler, 133 Wadhawan, 134 Williams 135 , Woo 75 , Woodcroft 136 35 (32)
Gamification Lewiss, 56 Liteplo, 57 Lobo, 58 Olson 121 4 (4)
Novel track/elective/rotation/shifts Boulger, 37 Chenkin, 82 Lee, 112 Haney, 104 Hayward, 48 Mahler, 60 Smalley 72 7 (6)
Deliberate practice, blocked practice, mastery learning Chenkin, 83 Chenkin, 85 Chenkin, 43 Hayward, 48 McGraw, 63 Smalley, 72 Smith, 130 Woodcroft 136 8 (7)
Learner assessment
Pretest: knowledge, skills, confidence Akhtar, 29 Alkhalifah, 30 Bayci, 34 Bharati, 79 Campanella, 40 Chenkin 83 , Chenkin 83 , Chenkin 41 , Chenkin, 42 Chenkin, 43 Chenkin, 85 Clinton, 87 Corujo, 88 Datta, 89 Dulani, 91 Ferre, 93 Gable, 97 Gelabert, 98 Greenstein, 46 Grudziak, 47 Hassani, 105 Haydel, 106 Jagneaux, 110 Jones, 51 Kerwin, 52 Kluger, 111 Laack, 53 Lee, 112 Leung, 113 Lewiss, 56 Lobo, 58 Mahler, 60 Mandavia, 62 McGraw, 63 Morse, 65 Nelson, 115 Nguyen, 116 Noble, 66 Parks, 67 Parks, 123 Peterson, 125 Platz, 68 Rohra, 126 Sessler, 128 Shah, 70 Shah, 129 Stolz, 132 Stolz, 74 Wadhawan, 134 Williams, 135 Woo 75 51 (47)
Posttest: knowledge Akhtar, 29 Amini, 31 Amini, 32 Bayci, 34 Bharati, 79 Byars, 38 Campanella, 40 Chao, 81 Chenkin, 82 Chenkin, 42 Chenkin, 43 Chenkin, 85 Cho, 86 Clinton, 87 Corujo, 88 Datta, 89 Gable, 97 Gelabert, 98 Grall, 45 Grudziak, 47 Hafez, 101 Hassani, 105 Haydel, 106 Jagneaux, 110 Jones, 51 Kerwin, 52 Kluger, 111 Lee, 112 Leung, 113 Lewiss, 56 Lobo, 58 MacVane, 59 Mahler, 60 Mandavia, 62 McGraw, 63 Morse, 65 Nelson, 115 Nguyen, 116 Noble, 66 Norris, 119 Olson, 121 Parks, 67 Parks, 123 Peterson, 125 Platz, 68 Rohra, 126 Salen, 69 Sessler, 128 Shah, 70 Shah, 129 Smalley, 72 Stolz, 132 Wadhawan, 134 Woo 75 54 (49)
Posttest: skills (including OSCE, SDOT, OSATS, GRS, checklist, video review) a Akhtar, 29Alkhalifah, 30 Amini, 31 Amini, 32 Arntfield, 33 Bayci, 34 Bharati, 79 Byars, 80 Caffery, 39 Chao, 81 Chenkin, 82 Chenkin, 83 Chenkin, 84 Chenkin, 41 Chenkin, 42 Cho, 86 Clinton, 87 Corujo, 88 Datta, 89 De Lucia, 90 Dulani, 91 Duran Gehring, 92 Ferre, 93 Gable, 97 Ghosh, 99 Girzadas, 44 Godbout, 100 Grall, 45 Greenstein, 46 Grudziak, 47 Hall, 103 Hayward, 48 Hrymak, 108 Hrymak, 109 Jagneaux, 110 Jang, 49 Jang, 50 Jones, 51 Laack, 53 Lall, 54 Lanoix, 55 Lee, 112 Leung, 113 Lewiss, 56 Lobo, 58 MacVane, 59 Mahler, 60 Mallin, 61 McGraw, 63 Miller, 64 Nguyen, 116 Norris, 119 O'Keefe, 120 Olson, 121 Parks, Salen, 69 Sessler, 128 Shah, 70 Shah, 129 Smalley, 72 Smith, 130 Sommerkamp, 73 Stolz, 74 Stolz, 132 Williams, 135 Woo, 75 Woodcroft 136 67 (61)
Long‐term assessment Akhtar, 29 Amini, 32 Arntfield, 33 Bahner, 78 Bayci, 34 Bharati, 79 Chao, 81 Chenkin, 83 Chenkin, 84 Chenkin, 41 Chenkin, 42 Cho, 86 Clinton, 87 Datta, 89 Ferre, 93 Furman, 96 Gable, 97 Godbout, 100 Grall, 45 Haydel, 106 Hayward, 48 Jang, 49 Jang, 50 Kluger, 111 Laack, 53 Lanoix, 55 Leung, 113 MacVane, 59 Mallin, 61 McGraw, 63 Miller, 64 Morse, 65 Noble, 66 Rohra, 126 Smith 130 36 (33)
Assessment: hand motion analysis Chenkin, 84 Chenkin, 83 McGraw, 63 Woodcroft 136 4 (3.7)
Post‐intervention
Subjective program or learner assessment survey Adan, 76 Adhikari, 77 Alkhalifah, 30 Amini, 32 Arntfield, 33 Bahner, 78 Bayci, 34 Berg, 35 Bloch, 36 Caffery, 39 Chenkin, 84 Chenkin, 42 Chenkin, 85 Cho, 86 Clinton, 87 Corujo, 88 Dulani, 91 Furman, 96 Ghosh, 99 Girzadas, 44 Grall, 45 Greenstein, 46 Grudziak, 47 Hakmeh, 102 Haney, 104 Hassani, 105 Haydel, 106 Hrymak, 108 Hrymak, 108 Leung, 113 Liteplo, 57 Lobo, 58 Mallin, 61 Nguyen‐Phuoc, 117 Nguyen, 118 Noble, 66 O'Keefe, 120 Olson, 121 Parks, 67 Parks, 123 Parks, 124 Runde, 127 Salen, 69 Sessler, 128 Shah, 70 Shah, 129 Sommerkamp, 73 Staum, 131 Stolz, 74 Stolz, 132 Woo 75 52 (48)

OSCE, Objective Structured Clinical Examination; SDOT, Standardized Direct Observational Assessment Tool; OSATS, Objective Structured Assessment of Technical Skills; GRS, Global Rating Scale.

a Checklist and video review are bolded.

TABLE 3.

Training domains of 109 selected articles

Domain Studies No. (%) a
Recognition of indications/contraindications Adhikari 77 , Akhtar, 29 Alkhalifah, 30 Amini, 31 Amini, 32 Arntfield, 33 Berg, 35 Byars, 38 Byars, 38 Caffrey, 39 Chao, 81 Field, 94 Gable, 97 Grall, 45 Girzadas, 44 Greenstein, 46 Grudziak, 47 Herring, 107 Kluger, 111 Lall, 54 Lanoix, 55 Mateer, 5 Nelson, 115 O'Keefe, 120 Parks, 123 Sessler, 128 Shah, 129 Woo, 75 Woodcroft 136 29 (27)
Image acquisition Adan, 76 Akhtar, 29 Alkhalifah, 30 Amini, 31 Amini, 32 Arntfield, 33 Bayci, 34 Berg, 35 Bharati, 79 Bloch, 36 Boulger, 37 Byars, 80 Byars, 38 Caffrey, 39 Chao, 81 Chenkin 41 , Chenkin, 42 Chenkin, 43 Chenkin, 83 Chenkin, 84 Cho, 86 Clinton, 87 Corujo, 88 Datta, 89 De Lucia, 90 Dulani, 91 Duran Gehring, 92 Ferre, 93 Field, 94 Furman, 96 Gable, 97 Ghosh, 99 Girzadas, 44 Godbout, 100 Grall, 45 Greenstein, 46 Grudziak, 47 Hakmeh, 102 Hall, 103 Haney, 104 Hayward, 48 Haydel, 106 Herring, 107 Hrymak, 108 Hrymak, 109 Jagneaux, 110 Jang, 49 Jang, 50 Jones, 51 Laack, 53 Lall, 54 Lanoix, 55 Lee, 112 Leung, 113 Lewiss, 56 Liteplo, 57 Lobo, 58 MacVane, 59 Mahler, 60 Mallin, 61 Mandavia, 62 Mateer, 5 McGraw, 63 Miller, 64 Nguyen, 116 Nguyen, 118 Nguyen‐Phuoc, 117 Noble, 66 Norris, 119 O'Keefe, 120 Olson, 121 Parks, 67 Parks, 123 Parks, 124 Runde, 127 Salen, 69 Sessler, 128 Shah, 70 Shah, 129 Shokoohi, 71 Smalley, 72 Smith, 130 Sommerkamp, 73 Staum, 131 Stolz, 74 Stolz, 132 Williams, 135 Woo, 75 Woodcroft 136 89 (82)
Image interpretation Adan, 76 Adhikari, 77 Akhtar, 29 Alkhalifah, 30 Amini, 31 Amini, 32 Arntfield, 33 Bayci, 34 Berg, 35 Bharati, 79 Bloch, 36 Boulger, 37 Byars, 38 Byars, 80 Caffrey, 39 Campanella, 40 Chao, 81 Chenkin, 41 Chenkin, 42 Chenkin, 43 Chenkin, 82 Chenkin, 83 Chenkin, 84 Cho, 86 Clinton, 87 Corujo, 88 Datta, 89 De Lucia, 90 Dulani, 91 Duran Gehring, 92 Ferre, 93 Field, 95 Furman, 96 Gable, 97 Gelabert, 98 Ghosh, 99 Godbout, 100 Girzadas, 44 Grall, 45 Greenstein, 46 Grudziak, 47 Hafez, 101 Hakmeh, 102 Hall, 103 Haney, 104 Hassani, 105 Haydel, 106 Hayward, 48 Herring, 107 Hrymak, 108 Hrymak, 109 Jagneaux, 110 Jang, 49 Jang, 50 Jones, 51 Kerwin, 52 Kluger, 111 Laack, 53 Lall, 54 Lanoix, 55 Lee, 112 Leung, 113 Lewiss, 56 Liteplo, 57 Lobo, 58 MacVane, 59 Mallin, 61 Mandavia, 62 Mateer, 5 McGraw, 63 Miller, 64 Minnigan, 114 Morse, 65 Nelson, 115 Nguyen, 116 Nguyen‐Phuoc, 117 Nguyen, 118 Noble, 66 Norris, 119 O'Keefe, 120 Olson, 121 Olzsynski, 122 Parks, 67 Parks, 123 Parks, 124 Peterson, 125 Platz, 68 Rohra, 126 Runde, 127 Sessler, 128 Salen, 69 Shah, 70 Shah, 129 Shokoohi, 71 Smalley, 72 Smith, 130 Staum, 131 Sommerkamp, 73 Stolz, 74 Stolz, 132 Williams, 135 Woo, 75 Woodcroft 136 103 (94)
Clinical integration Adhikari, 77 Alkhalifah, 30 Amini, 31 Amini, 32 Bayci, 34 Byars, 80 Boulger, 37 Byars, 38 Chao, 81 Clinton, 87 Datta, 89 Dulani, 91 Duran Gehring, 92 Field, 94 Furman, 96 Gable, 97 Girzadas, 44 Godbout, 100 Grall, 45 Greenstein, 46 Hafez, 101 Hayward, 48 Herring, 107 Hrymak, 108 Hrymak, 108 Jang, 49 Jang, 50 Lanoix, 55 Leung, 113 Lewiss, 56 Lobo, 58 MacVane, 59 Mahler, 60 Mandavia, 62 Mateer, 5 Miller, 64 Minnigan, 114 Nelson, 115 Norris, 119 Okeefe, 120 Olson, 121 Olszynski, 122 Parks, 67 Platz, 68 Sessler, 128 Shah, 70 Stolz, 74 Stolz, 132 Woo 75 49 (45)
Accuracy, documentation, quality assurance, reimbursement Boulger, 37 Lanoix, 55 Mateer 5 3 (3)
a Numbers (percentages) total >109, as studies may train >1 domain.

TABLE 4.

Outcomes assessment of 109 selected studies (adapted from Kirkpatrick's hierarchy of levels of outcomes a )

Assessment category Method of assessment Studies N (%) b
Benefit to patients Patient‐oriented outcomes Furman, 96 Jang, 49 Jang, 50 Lanoix, 55 MacVane, 59 Mandavia, 62 Miller, 64 O'Keefe, 120 Sessler 128 9 (8)
Behaviors Activity monitoring Amini 31 , Arntfield, 33 Furman, 96 Godbout, 100 Jang, 49 Jang, 50 Laack, 53 Lanoix, 55 MacVane, 59 Mandavia, 62 Nelson, 115 O'Keefe, 120 Tyler 133 13 (12)
Skills (image acquisition, image interpretation) Performance assessment Adan, 76 Adhikari, 77 Akhtar, 29 Amini, 31 Amini, 32 Arntfield, 33 Bayci, 34 Bharati, 79 Bloch, 36 Byars, 38 Caffrey, 39 Campanella, 40 Chao, 81 Chenkin, 83 Chenkin, 84 Chenkin, 85 Chenkin, 41 Chenkin, 42 Cho, 86 Clinton, 87 Corujo, 88 Datta, 89 De Lucia, 90 Dulani, 91 Duran Gehring, 92 Ferre, 93 Gable, 97 Gelabert, 98 Ghosh, 99 Godbout, 100 Greenstein, 46 Haydel, 106 Hakmeh, 102 Hassani, 105 Jagneaux, 110 Jang, 49 Jang, 50 Jones, 51 Kerwin, 52 Laack, 53 Lall, 54 Lanoix, 55 Lee, 112 Leung, 113 Liteplo, 57 Lobo, 58 MacVane, 59 Mahler, 60 Mandavia, 62 McGraw, 63 Miller, 64 Minnigan, 114 Morse, 65 Nelson, 115 Nguyen 113 , Nguyen‐Phuoc, 117 Norris, 119 Noble, 66 O'Keefe, 120 Olson, 121 Parks, 67 Parks, 123 Parks, 124 Peterson, 125 Platz, 68 Salen, 69 Sessler, 128 Shah, 70 Shah, 129 Shokoohi, 71 Smith, 130 Sommerkamp, 73 Stolz, 74 Stolz, 132 Smalley, 72 Williams, 135 Woo, 75 Woodcroft 136 84 (77)
Knowledge Cognitive testing Adhikari, 77 Akhtar, 29 Amini, 31 Amini, 32 Bayci, 34 Bharti, 79 Bloch, 36 Campanella, 40 Chao, 81 Chenkin, 42 Chenkin, 43 Cho, 86 Clinton, 87 Corujo, 88 Datta, 89 Gable, 97 Grudziak, 47 Haydel, 106 Jagneaux, 110 Jones, 51 Kluger, 111 Laack, 53 Lall, 54 Lee, 112 Leung, 113 Liteplo, 57 Lobo, 58 MacVane, 59 Mahler, 60 Mandavia, 62 McGraw, 63 Minnigan, 114 Nelson, 115 Nguyen, 116 Noble, 66 Norris, 119 Olson, 121 Parks, 123 Parks, 124 Platz, 68 Rohra, 126 Sessler, 128 Shah, 70 Shah, 129 Stolz, 74 Stolz, 132 Tyler, 133 Wadhawan 134 48 (44)
Self‐efficacy: “confidence,” “comfort” Self‐report/opinion Adan, 76 Arntfield, 33 Bayci, 34 Berg, 35 Caffrey, 39 Chenkin, 42 Chenkin, 43 Chenkin, 84 Chenkin, 83 Clinton, 87 Corujo, 88 Dulani, 91 Furman, 96 Grall, 45 Greenstein, 46 Grudziak, 47 Hakmeh, 102 Haney, 104 Haydel, 106 Hrymak, 108 Hrymak, 109 Kerwin, 52 Lall, 54 Leung, 113 Lobo, 58 Mallin, 61 Noble, 66 Nguyen, 118 Parks, 67 Parks, 123 Parks, 124 Runde, 127 Sessler, 128 Shah, 70 Shah, 129 Staum, 131 Stolz, 74 Stolz, 132 Woo 75 39 (36)
Attitudes: “useful,” “valuable,” “effective,” “easy” Self‐report/opinion Adan, 76 Adhikari, 77 Alkhalifah, 30 Amini, 31 Arntfield, 33 Bahner, 78 Berg, 35 Bloch, 36 Chenkin, 42 Dulani, 91 Grall, 45 Grudziak, 47 Haney, 104 Lall, 54 Liteplo, 57 Lobo, 58 Mallin, 61 Noble, 66 Olson, 121 O'Keefe, 120 Runde, 127 Salen, 69 Shah, 70 Shah, 129 Stolz, 74 Woo 75 26 (24)
Reaction to educational experience: “satisfied,” “enjoyed” Self‐report/opinion Adhikari, 77 Amini, 31 Alkhalifah, 30 Arntfield, 33 Berg, 35 Chenkin, 42 Corujo, 88 Girzadas, 44 Grall, 45 Greenstein, 46 Grudziak, 47 Leung, 113 Liteplo, 57 Lobo, 58 Mallin, 61 Noble, 66 Salen, 69 Sessler, 128 Shah, 70 Stolz, 74 Stolz, 132 Woo 75 22 (20)
Not applicable Description Boulger, 37 Byars, 80 Hayward, 48 Lewiss, 56 Mateer, 5 Chenkin, 82 Field, 94 Field, 94 Hafez, 101 Hall, 103 Herring, 107 Olszynski 122 12 (11)
a

Adapted from Tilson et al. 26

b

Totals do not equal 109 because a study may have measured >1 outcome. Total percentage may not equal 100 because of rounding.

The emergency ultrasound applications taught in the published literature are presented in Supporting Information Table S5. The most common applications were cardiac (20, 18.3%), followed by ultrasound guided vascular access (17, 15.6%). Studies also reported training in the pelvic application (9, 8.2%), undifferentiated hypotension (8, 7.3%), and nerve blocks (6, 5.5%). Twenty‐five (22.9%) studies reported training in multiple (4 or more) applications.

Cadaver models were used to train residents in ultrasound guided peripheral 107 or regional nerve blocks, 76 ultrasound guided vascular access, 53 and the diagnosis of Achilles tendon rupture. 99 Collaborative learning, through teamwork and gamification, was reported by 4 studies (SonoGames, 56 , 57 Sound Games, 58 and UltraSimageddon 121 ). Additionally, 4 studies, which taught ultrasound guided vascular access 63 , 136 and transesophageal echocardiography, 83 , 84 assessed their learners with transducer motion metrics (hand motion analysis).

4. LIMITATIONS

We are confident that this scoping review provides a representative range of published work. Along with articles, we intentionally included conference abstracts; although these may lack the methodological rigor of articles, we felt their inclusion was necessary to fully describe the vast range of research in emergency ultrasound training. Our review focused on the United States and Canada, and we do not believe that restricting our review to research published in English negatively affected our results. 147 We performed comprehensive searches of the most relevant databases using a broad search strategy; however, the sheer volume of publications precluded a search of the grey literature. Another concern is that hand‐searching of relevant journals and reviewing of reference lists might impair the reproducibility of our results.

Our review has several limitations. First, we recognize that published training methods may not reflect the actual practice of emergency ultrasound training. Much training occurs during clinical shifts and through unstructured learning processes, which are difficult to capture with planned research. Second, we encountered a number of studies investigating the feasibility and test characteristics of emergency ultrasound. Often these included an educational component, and many included residents. These studies were excluded because their focus was not the evaluation of a structured educational intervention; however, consideration of these studies may provide another facet of emergency ultrasound education. Third, although the use of the MERSQI enabled us to quantitatively evaluate the quality of research, there may be discrepancies between what is reported in studies and what occurs in practice. Fourth, our reviewer training on the MERSQI may be seen as a limitation; however, reviewers felt comfortable with the concepts and definitions before screening, and final decisions for our MERSQI analysis were reached by discussion and consensus. We recognize that others might not arrive at the same consensus. Last, despite our best efforts, our interrater agreement was low. When screening, we faced a large array of literature beset with inherent heterogeneity and a lack of clear definitions. Because we were concerned that overly strict criteria might exclude important evidence, we took a conservative approach during our first and second screening processes, being more inclusive of studies than exclusive. Some reviewers were more conservative than others, and differences were resolved by consensus.

5. DISCUSSION

To the best of our knowledge, this is the first scoping review to evaluate publications on structured methods for emergency ultrasound training of emergency medicine residents in the United States and Canada. This review covers >2 decades of relevant research, from the first published article in 1994 to mid‐2019. A total of 109 publications (48 articles and 61 abstracts) describe an extensive range of educational characteristics, strategies, and outcomes of structured training methods that has evolved with the development of the field.

There is substantial evidence that emergency ultrasound education has developed significantly in the last 2 decades. An early curriculum, exemplified by that of Mandavia et al, 62 describes a 2‐day, 16‐hour course that teaches “7 indications” workshop‐style, with lectures followed by an ultrasound “lab.” A contemporary curriculum that blends modern technology with traditional methods is exemplified by that of Stolz et al, 74 which describes a 1‐day course consisting of flipped classroom didactics with asynchronous learning, case‐based interactive teaching, and goal‐oriented skills training using checklists. Emergency ultrasound training has progressed from the teaching of basic emergency ultrasound applications (focused assessment with sonography for trauma [FAST]) 7 , 69 to more advanced applications (such as transesophageal echocardiography) 33 , 38 , 41 , 83 , 84 and the clinical integration of skills with protocols and algorithms (such as for undifferentiated hypotension). 67 , 81 , 108 , 109 , 123 , 124 , 131

There is also strong evidence of the remarkable innovations that have shaped emergency ultrasound education over this period. Although many are not new to the field of general medical education, several classic instructional techniques have been creatively adapted to provide novel and fresh approaches to emergency ultrasound training. Twenty‐eight unique curricula are described. Online learning, with multimedia modules and through social media, 78 , 101 , 133 is reported. Sixty‐eight studies report training with simulation using phantoms and mannequins, human models, animal models, 36 , 40 , 93 , 117 and cadavers. 76 , 99 , 107 Other successful innovations include large‐scale multi‐institution initiatives, 47 , 128 collaborative learning through gamification, 56 , 57 , 58 , 121 case‐based learning, and learning through deliberate practice, blocked practice, and mastery learning.

The published literature leaves pronounced gaps in our knowledge of training domains, learner assessment, long‐term learning retention, and the translation of training into practice.

The most recent ACEP guidelines 11 recommend 5 emergency ultrasound training domains: image acquisition; image interpretation; recognition of indications; clinical integration; and quality assurance, documentation, and reimbursement. However, the majority of studies report training only in image acquisition and interpretation; training in the latter 3 domains is marginally reported. Although we may hope that training in these domains takes place in the larger clinical arena, all 5 domains address important and separate educational needs for emergency ultrasound in emergency medicine and should be an integral part of any focused emergency ultrasound training.

The majority of reported interventions assess learners on outcomes at the lower Kirkpatrick levels; behavior change and beneficial patient outcomes are seldom reported. Most publications report assessment with surveys, subjective self‐assessment, and single‐observer ratings. Only 28 studies (26%) report using validated and standardized assessment instruments, such as the OSCE (Objective Structured Clinical Examination), SDOT (Standard Direct Observational Assessment Tool), OSATS (Objective Structured Assessment of Technical Skills), GRS (Global Rating Scale), and checklists. Only 36 studies (one‐third) report long‐term assessment of learners, and only slightly >10% of studies report the translation of training into practice.

Our review reflects selective research interest in the training of applications and procedures. Based on the number of studies, there appears to be strong interest in procedural guidance (nerve blocks and vascular access) and cardiac applications. However, publications on gallbladder, lower extremity venous, musculoskeletal, and renal applications are limited to single studies, and there is no dedicated gastrointestinal study. Publication trends over time also indicate strong interest in emerging applications, such as transesophageal echocardiography. This publication bias makes the evaluation of North American content training guidelines challenging.

We have intentionally focused our review on studies conducted in the United States and Canada. Given the diversity of country‐specific emergency ultrasound training requirements, this geographic focus was necessary, if only to enable the authors to better evaluate training methods against a familiar contextual backdrop of emergency ultrasound guidelines and recommendations. Although it is likely that emergency ultrasound education is influenced by processes particular to individual countries, we believe that our study can inform and empower astute researchers and educators on an international level. In our search, we did find several noteworthy international studies; however, a discussion of these is beyond the scope of this article.

The MERSQI was selected to critically appraise our 48 articles because of its objective methodological rigor and growing body of validity evidence. 22 , 23 Although the MERSQI has been used to evaluate general education research in internal medicine, 23 obstetrics and gynecology, 137 and surgery, 138 and to evaluate research on echocardiography teaching to cardiology fellows, 139 this is the first time that the MERSQI has been used to evaluate research on emergency ultrasound training of emergency medicine residents.

Our critical analysis revealed several common weaknesses in study design: a low number of sampled institutions (usually a single site), the use of assessment instruments with unknown or unreported validity, and the assessment of low‐level outcomes. These weaknesses negatively affect the overall quality of emergency ultrasound education research.

Significant quality improvement would require the careful selection of assessment instruments and reference standards. An ideal study would use a randomized design at >2 institutions, achieve a learner response rate of more than 75%, assess learners objectively with validated evaluation instruments (with validity evidence for at least internal structure, content, and relationships to other variables), apply appropriate inferential data analysis, and assess patient and healthcare outcomes.

Based on our review, several shortcomings need consideration, including how to (1) overcome the heterogeneity in research, (2) gather high‐level outcome data, and (3) best assess learner proficiency. Of these, we believe that a structured and validated learner assessment strategy should be the priority. A practical approach, suggested by Hamstra, is a 7‐step checklist that includes validating content with experts from multiple institutions, inter‐rater training and assessment, and ongoing item‐writing development, pilot testing, and construct validity reassessment. 140 Alternately, SDOT checklists for 10 common emergency ultrasound applications are included in the supplement of the CORD‐AEUS 2013 consensus guidelines. 3

Emergency ultrasound might benefit from the experience of other medical education fields, such as evidence‐based medicine, which has developed guidelines for the development of assessment tools, defined a taxonomy, and created the Classification Rubric for Evidence‐based Practice Assessment Tools in Education (CREATE), a framework that ties the modified Kirkpatrick outcome levels to intentional instrument design. 26 , 27 , 28 Consideration should be given to the formation of a focused group or collaborative network dedicated to enhancing the quality of emergency ultrasound education research through the development of robust reporting guidelines and frameworks. We recognize that there may be practical considerations of cost and funding; however, these should not preclude the development of quality standards.

Daily training of emergency ultrasound consists of apprentice‐type encounters in the ED. These are poorly represented in the literature and vary widely with the individual styles of attending emergency physicians and the workload of the ED. Although structured training methods represent only a fragment of a larger educational system, they play an important role in ensuring that all learners receive a baseline of high‐quality, standardized training and assessment.

A robust emergency ultrasound education program requires considerable faculty expertise, dedicated faculty time, training resources, and departmental support, 10 and programs are faced with the challenge of creating curricula that meet training goals and are time‐ and cost‐effective. Our review suggests that evaluating a curriculum or intervention during the planning phase, using the MERSQI or another validated quality instrument, is likely to be valuable to educators.

In summary, this scoping review covers >2 decades of structured emergency ultrasound training and illustrates several innovative advances that mirror the rapid expansion of emergency ultrasound to its current status as an essential component of emergency medicine training and practice. Overall, we found a dearth of rigorous, high‐quality studies. Instead, we found many articles on novel interventions conducted as small, single‐institution studies using unvalidated assessment tools.

Our findings have several important implications for educators and researchers. Research in emergency ultrasound structured training methods would benefit from careful attention to several areas: underrepresented emergency ultrasound applications, higher‐level outcomes assessment of behavior change and benefit to patients, and measures of instrument and content validity. The use of standardized and intentionally developed planning and assessment tools, mapped to targeted content and outcome domains, might provide valuable formative and summative assessments that would benefit not only research but also training.

CONFLICTS OF INTEREST

The authors declare no conflicts of interest.

AUTHOR CONTRIBUTIONS

GB, Conceptualization; Formal analysis. HH, Data curation (lead), Visualization, Writing‐original draft, review and editing. NB, Formal analysis. SL and LP, Conceptualization, Formal Analysis, Supervision, Validation, Writing‐original draft, review and editing. BW, Formal analysis, Validation, Visualization, Writing‐original draft, review and editing. LLP takes final responsibility for all contents of this manuscript.

Supporting information

Supporting Information


PoSaw LL, Wubben BM, Bertucci N, Bell GA, Healy H, Lee S. Teaching emergency ultrasound to emergency medicine residents: a scoping review of structured training methods. JACEP Open. 2021;2:e12439. 10.1002/emp2.12439

Supervising Editor: Michael Blaivas, MD

Presented at the American College of Emergency Physicians Scientific Assembly 2019, October 27–30, 2019, Denver, Colorado, United States of America.

Funding and support: By JACEP Open policy, all authors are required to disclose any and all commercial, financial, and other relationships in any way related to the subject of this article as per ICMJE conflict of interest guidelines (see www.icmje.org). The authors have stated that no such relationships exist.

REFERENCES

  • 1. Hockberger RS, Binder LS, Graber MA, et al. The model of the clinical practice of emergency medicine. Ann Emerg Med. 2001;37(6):745‐770. [DOI] [PubMed] [Google Scholar]
  • 2. Beeson MS, Carter WA, Christopher TA, et al. The development of the emergency medicine milestones. Acad Emerg Med. 2013;20(7):724‐729. [DOI] [PubMed] [Google Scholar]
  • 3. Lewiss RE, Pearl M, Nomura JT, et al. CORD‐AEUS: consensus document for the emergency ultrasound milestone project. Acad Emerg Med. 2013;20(7):740‐745. [DOI] [PubMed] [Google Scholar]
  • 4. Accreditation Council for Graduate Medical Education, American Board of Emergency Medicine. The emergency medicine milestone project. ACGME. Published July 2015. Accessed March 22, 2020. https://www.acgme.org/Portals/0/PDFs/Milestones/EmergencyMedicineMilestones.pdf
  • 5. Mateer J, Plummer D, Heller M, et al. Model curriculum for physician training in emergency ultrasonography. Ann Emerg Med. 1994;23(1):95‐102. [DOI] [PubMed] [Google Scholar]
  • 6. Heller MB, Mandavia D, Tayal VS, et al. Residency training in emergency ultrasound: fulfilling the mandate. Acad Emerg Med. 2002;9(8):835‐839. [DOI] [PubMed] [Google Scholar]
  • 7. American College of Emergency Physicians . Emergency ultrasound guidelines. Ann Emerg Med. 2009;53(4):550‐570. [DOI] [PubMed] [Google Scholar]
  • 8. Moak J. SAEM endorses the 2008 ACEP ultrasound guidelines. Sinai EM. Published April 23, 2010. Accessed March 22, 2020. http://new.sinaiem.us/saem-endorses-the-2008-acep-ultrasound-guidelines/
  • 9. American Institute of Ultrasound in Medicine . Recognition of ACEP "Ultrasound Guidelines: Emergency, Point‐of‐care, and Clinical Ultrasound Guidelines in Medicine." AIUM. Accessed March 22, 2020. https://www.aium.org/officialStatements/45
  • 10. Akhtar S, Theodoro D, Gaspari R, et al. Resident training in emergency ultrasound: consensus recommendations from the 2008 Council of Emergency Medicine Residency Directors Conference. Acad Emerg Med. 2009;16(suppl 2):S32‐S36. [DOI] [PubMed] [Google Scholar]
  • 11. American College of Emergency Physicians . Ultrasound guidelines: emergency, point‐of‐care and clinical ultrasound guidelines in medicine. Ann Emerg Med. 2017;69(5):e27‐e54. [DOI] [PubMed] [Google Scholar]
  • 12. Kim D, Theoret J, Liao M, Hopkins E, Woolfrey K, Kendall JL. The current state of ultrasound training in Canadian emergency medicine programs: perspectives from program directors. Acad Emerg Med. 2012;19(9):1073‐1078. [DOI] [PubMed] [Google Scholar]
  • 13. Lewis D, Rang L, Kim D, et al. Recommendations for the use of point‐of‐care ultrasound (PoCUS) by emergency physicians in Canada: CAEP position statement. 2018. https://caep.ca/wp-content/uploads/2018/11/CAEP-PoCUS-Position-Statement-Full-Text-2018-V7-draft.pdf [DOI] [PubMed]
  • 14. Royal College of Physicians and Surgeons of Canada . Objectives of training in the specialty of emergency medicine. Published June 2018. Accessed September 2, 2020. http://www.royalcollege.ca/rcsite/documents/ibd/pediatric-emergency-medicine-otr-e.pdf
  • 15. Olszynski P, Kim D, Chenkin J, Rang L. The core emergency ultrasound curriculum project: a report from the Curriculum Working Group of the CAEP Emergency Ultrasound Committee. Can J Emerg Med. 2018;20(2):176‐182. [DOI] [PubMed] [Google Scholar]
  • 16. PoSaw LL, Wubben B, Bertucci N, Bell G, Healy HS, Lee S. Teaching emergency ultrasound to emergency medicine residents: a scoping review of structured training methods. OSF. Published May 1, 2020. Accessed August 21, 2020. osf.io/c2zyg [DOI] [PMC free article] [PubMed]
  • 17. Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for scoping reviews (PRISMA‐ScR): checklist and explanation. Ann Intern Med. 2018;169(7):467‐473. [DOI] [PubMed] [Google Scholar]
  • 18. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8(1):19‐32. [Google Scholar]
  • 19. Levac D, Colquhoun H, O'Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5:69. Published 2010 Sep 20. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20. Bramer WM, Giustini D, de Jonge GB, et al. De‐duplication of database search results for systematic reviews in EndNote. J Med Libr Assoc. 2016;104(3):240‐243. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Covidence systematic review software, Veritas Health Innovation, Melbourne, Australia. Available at www.covidence.org
  • 22. Reed DA, Cook DA, Beckman TJ, et al. Association between funding and quality of published medical education research. JAMA. 2007;298(9):1002‐1009. [DOI] [PubMed] [Google Scholar]
  • 23. Reed DA, Beckman TJ, Wright SM, et al. Predictive validity evidence for medical education research study quality instrument scores: quality of submissions to JGIM's Medical Education Special Issue. J Gen Intern Med. 2008;23(7):903‐907. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24. Cook DA, Reed DA. Appraising the quality of medical education research methods: the Medical Education Research Study Quality Instrument and the Newcastle‐Ottawa Scale‐Education. Acad Med. 2015;90(8):1067‐1076. [DOI] [PubMed] [Google Scholar]
  • 25. Beckman TJ, Cook DA, Mandrekar JN. What is the validity evidence for assessments of clinical teaching? J Gen Intern Med. 2005;20(12):1159‐1164. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26. Tilson JK, Kaplan SL, Harris JL, et al. Sicily statement on classification and development of evidence‐based practice learning assessment tools. BMC Med Educ. 2011;11:78. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27. Kirkpatrick DL. Evaluation of Training Programs: The Four Levels. Berrett‐Koehler. 1994. [Google Scholar]
  • 28. Freeth D, Hammick M, Koppel I, Reeves S, Barr H. A Critical Review of Evaluations of Interprofessional Education. Higher Education Academy, Health Sciences and Practice Network; 2002. http://www.health.ltsn.ac.uk/publications/occasionalpaper/occasionalpaper02.pdf. [Google Scholar]
  • 29. Akhtar S, Hwang U, Dickman E, et al. A brief educational intervention is effective in teaching the femoral nerve block procedure to first‐year emergency medicine residents. J Emerg Med. 2013;45(5):726‐730. [DOI] [PubMed] [Google Scholar]
  • 30. Alkhalifah M, McLean M, Koshak A. Acute cardiac tamponade: an adult simulation case for residents. MedEdPORTAL. 2016;12:10466. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31. Amini R, Stolz LA, Kartchner JZ, et al. Bedside echo for chest pain: an algorithm for education and assessment. Adv Med Educ Pract. 2016;7:293‐300. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32. Amini R, Stolz L, Javedani P, et al. Point‐of‐care echocardiography in simulation‐based education and assessment. Adv Med Educ Pract. 2016;7:325‐328. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33. Arntfield R, Pace J, McLeod S, Granton J, Hegazy A, Lingard L. Focused transesophageal echocardiography for emergency physicians—description and results from simulation training of a structured four‐view examination. Crit Ultrasound J. 2015;7:10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34. Bayci AW, Mangla J, Jenkins CS, Ivascu FA, Robbins JM. Novel educational module for subclavian central venous catheter insertion using real‐time ultrasound guidance. J Surg Educ. 2015;72(6):1217‐1223. [DOI] [PubMed] [Google Scholar]
  • 35. Berg DA, Milner RE, Demangone D, et al. Successful collaborative model for trauma skills training of surgical and emergency medicine residents in a laboratory setting. Curr Surg. 2005;62(6):657‐663. [DOI] [PubMed] [Google Scholar]
  • 36. Bloch AJ, Bloch SA, Secreti L, et al. A porcine training model for ultrasound diagnosis of pneumothoraces. J Emerg Med. 2011;41(2):176‐181. [DOI] [PubMed] [Google Scholar]
  • 37. Boulger C, Adams DZ, Hughes D, et al. Longitudinal ultrasound education track curriculum implemented within an emergency medicine residency program. J Ultrasound Med. 2017;36(6):1245‐1250. [DOI] [PubMed] [Google Scholar]
  • 38. Byars DV, Tozer J, Joyce JM, et al. Emergency physician‐performed transesophageal echocardiography in simulated cardiac arrest. West J Emerg Med. 2017;18(5):830‐834. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39. Caffery T, Jagneaux T, Jones GN, et al. Residents' preferences and performance of training. Ochsner J. 2018;18(2):146‐150. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 40. Campanella LM, Pancu D, Gang M, Marill KA, Ort V. The use of a dissected bovine heart to teach cardiac sonography. Acad Emerg Med. 2004;11(7):782‐785. [DOI] [PubMed] [Google Scholar]
  • 41. Chenkin J, Hockmann E, Jelic T. Simulator‐based training for learning resuscitative transesophageal echocardiography. CJEM. 2019;21(4):523‐526. [DOI] [PubMed] [Google Scholar]
  • 42. Chenkin J, Lee S, Huynh T, et al. Procedures can be learned on the web: a randomized study of ultrasound‐guided vascular access training. Acad Emerg Med. 2008;15(10):949‐954. [DOI] [PubMed] [Google Scholar]
  • 43. Chenkin J, McCartney CJ, Jelic T, et al. Defining the learning curve of point‐of‐care ultrasound for confirming endotracheal tube placement by emergency physicians. Crit Ultrasound J. 2015;7(1):14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 44. Girzadas DV Jr, Antonis MS, Zerth H, et al. Hybrid simulation combining a high fidelity scenario with a pelvic ultrasound task trainer enhances the training and evaluation of endovaginal ultrasound skills. Acad Emerg Med. 2009;16(5):429‐435. [DOI] [PubMed] [Google Scholar]
  • 45. Grall KH, Stoneking LR, DeLuca LA, et al. An innovative longitudinal curriculum to increase emergency medicine residents' exposure to rarely encountered and technically challenging procedures. Adv Med Educ Pract. 2014;5:229‐236. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46. Greenstein YY, Martin TJ, Rolnitzky L, Felner K, Kaufman B. Goal‐directed transthoracic echocardiography during advanced cardiac life support: a pilot study using simulation to assess ability. Simul Healthc. 2015;10(4):193‐201. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 47. Grudziak J, Herndon B, Dancel RD, et al. Standardized, interdepartmental, simulation‐based central line insertion course closes an educational gap and improves intern comfort with the procedure. Am Surg. 2017;83(6):536‐540. https://www.ncbi.nlm.nih.gov/pubmed/28637552. [PubMed] [Google Scholar]
  • 48. Hayward M, Chan T, Healey A. Dedicated time for deliberate practice: one emergency medicine program's approach to point‐of‐care ultrasound (PoCUS) training. CJEM. 2015;17(5):558‐561. [DOI] [PubMed] [Google Scholar]
  • 49. Jang TB, Kaji AH. A 2‐week elective experience provides comparable training as longitudinal exposure during residency for pelvic sonography. J Ultrasound Med. 2015;34(2):221‐224. [DOI] [PubMed] [Google Scholar]
  • 50. Jang TB, Ruggeri W, Kaji AH. Emergency ultrasound of the gall bladder: comparison of a concentrated elective experience vs. longitudinal exposure during residency. J Emerg Med. 2013;44(1):198‐203. [DOI] [PubMed] [Google Scholar]
  • 51. Jones AE, Tayal VS, Kline JA. Focused training of emergency medicine residents in goal‐directed echocardiography: a prospective study. Acad Emerg Med. 2003;10(10):1054‐1058. [DOI] [PubMed] [Google Scholar]
  • 52. Kerwin C, Tommaso L, Kulstad E. A brief training module improves recognition of echocardiographic wall‐motion abnormalities by emergency medicine physicians. Emerg Med Int. 2011;2011:483242. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53. Laack TA, Dong Y, Goyal DG, Sadosty AT, Suri HS, Dunn WF. Short‐term and long‐term impact of the central line workshop on resident clinical performance during simulated central line placement. Simul Healthc. 2014;9(4):228‐233. [DOI] [PubMed] [Google Scholar]
  • 54. Lall M, Beck S, Meer J. Advanced ultrasound workshops for emergency medicine residents. JETem. 2017;2(1):C1‐13. [Google Scholar]
  • 55. Lanoix R, Baker WE, Mele JM, et al. Evaluation of an instructional model for emergency ultrasonography. Acad Emerg Med. 1998;5(1):58‐63. [DOI] [PubMed] [Google Scholar]
  • 56. Lewiss RE, Hayden GE, Murray A, Liu YT, Panebianco N, Liteplo AS. SonoGames: an innovative approach to emergency medicine resident ultrasound education. J Ultrasound Med. 2014;33(10):1843‐1849. [DOI] [PubMed] [Google Scholar]
  • 57. Liteplo AS, Carmody K, Fields MJ, et al. SonoGames: effect of an innovative competitive game on the education, perception, and use of point‐of‐care ultrasound. J Ultrasound Med. 2018;37(11):2491‐2496. [DOI] [PubMed] [Google Scholar]
  • 58. Lobo V, Stromberg AQ, Rosston P. The sound games: introducing gamification into Stanford's orientation on emergency ultrasound. Cureus. 2017;9(9):e1699. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 59. MacVane CZ, Irish CB, Strout TD, et al. Implementation of transvaginal ultrasound in an emergency department residency program: an analysis of resident interpretation. J Emerg Med. 2012;43(1):124‐128. [DOI] [PubMed] [Google Scholar]
  • 60. Mahler SA, Swoboda TK, Wang H, et al. Dedicated emergency department ultrasound rotation improves residents' ultrasound knowledge and interpretation skills. J Emerg Med. 2012;43(1):129‐133. [DOI] [PubMed] [Google Scholar]
  • 61. Mallin M, Louis H, Madsen T. A novel technique for ultrasound‐guided supraclavicular subclavian cannulation. Am J Emerg Med. 2010;28(8):966‐969. [DOI] [PubMed] [Google Scholar]
  • 62. Mandavia DP, Aragona J, Chan L, Chan D, Henderson SO. Ultrasound training for emergency physicians—a prospective study. Acad Emerg Med. 2000;7(9):1008‐1014. [DOI] [PubMed] [Google Scholar]
  • 63. McGraw R, Chaplin T, McKaigney C, et al. Development and evaluation of a simulation‐based curriculum for ultrasound‐guided central venous catheterization. CJEM. 2016;18(6):405‐413. [DOI] [PubMed] [Google Scholar]
  • 64. Miller AH, Roth BA, Mills TJ, Woody JR, Longmoor CE, Foster B. Ultrasound guidance versus the landmark technique for the placement of central venous catheters in the emergency department. Acad Emerg Med. 2002;9(8):800‐805. [DOI] [PubMed] [Google Scholar]
  • 65. Morse JW, Saracino BS, Melanson SW, et al. Ultrasound interpretation of hydronephrosis is improved by a brief educational intervention. Am J Emerg Med. 2000;18(2):186‐188. [DOI] [PubMed] [Google Scholar]
  • 66. Noble VE, Nelson BP, Sutingco AN, Marill KA, Cranmer H. Assessment of knowledge retention and the value of proctored ultrasound exams after the introduction of an emergency ultrasound curriculum. BMC Med Educ. 2007;7:40. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 67. Parks AR, Atkinson P, Verheul G, et al. Can medical learners achieve point‐of‐care ultrasound competency using a high‐fidelity ultrasound simulator?: a pilot study. Crit Ultrasound J. 2013;5(1):9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 68. Platz E, Liteplo A, Hurwitz S, et al. Are live instructors replaceable? Computer vs. classroom lectures for EFAST training. J Emerg Med. 2011;40(5):534‐538. [DOI] [PubMed] [Google Scholar]
  • 69. Salen P, O'Connor R, Passarello B, et al. Fast education: a comparison of teaching models for trauma sonography. J Emerg Med. 2001;20(4):421‐425. [DOI] [PubMed] [Google Scholar]
  • 70. Shah S, Adedipe A, Ruffatto B, et al. BE‐SAFE: bedside sonography for assessment of the fetus in emergencies: educational intervention for late‐pregnancy obstetric ultrasound. West J Emerg Med. 2014;15(6):636‐640. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71. Shokoohi H, Boniface KS, Siegel A. Horizontal subxiphoid landmark optimizes probe placement during the focused assessment with sonography for trauma ultrasound exam. Eur J Emerg Med. 2012;19(5):333‐337. [DOI] [PubMed] [Google Scholar]
  • 72. Smalley CM, Thiessen M, Byyny R, Dorey A, McNair B, Kendall JL. Number of weeks rotating in the emergency department has a greater effect on ultrasound milestone competency than a dedicated ultrasound rotation. J Ultrasound Med. 2017;36(2):335‐343. [DOI] [PubMed] [Google Scholar]
  • 73. Sommerkamp SK, Romaniuk VM, Witting MD, et al. A comparison of longitudinal and transverse approaches to ultrasound‐guided axillary vein cannulation. Am J Emerg Med. 2013;31(3):478‐481. [DOI] [PubMed] [Google Scholar]
  • 74. Stolz LA, Amini R, Situ‐LaCasse E, et al. Multimodular ultrasound orientation: residents' confidence and skill in performing point‐of‐care ultrasound. Cureus. 2018;10(11):e3597. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75. Woo MY, Frank J, Lee AC, et al. Effectiveness of a novel training program for emergency medicine residents in ultrasound‐guided insertion of central venous catheters. CJEM. 2009;11(4):343‐348. [DOI] [PubMed] [Google Scholar]
  • 76. Adan A, Gibbons R, Patterson J, et al. Human cadaver vs simulator nerve model for ultrasound‐guided regional anesthesia resident education. Ann Emerg Med. 2017;70(4 suppl):S47. [Google Scholar]
  • 77. Adhikari S, Fiorello A. Introduction of symptom‐based point‐of‐care ultrasound lecture curriculum into an emergency medicine residency: a novel approach. Acad Emerg Med. 2012;19(suppl 1):S395‐S396. SAEM 2012 Annual Meeting Abstracts. [Google Scholar]
  • 78. Bahner DP, Adkins E, Patel N, et al. How we use social media to supplement a novel curriculum in medical education. Med Teach. 2012;34(6):439‐444. [DOI] [PubMed] [Google Scholar]
  • 79. Bharati A, Datta A, Gupta S, et al. Resident education in ultrasonography: assessment of identification of basic cardiac anatomy and pathology after a multimedia tutorial. Ann Emerg Med. 2012;60(4 suppl):S41. [Google Scholar]
  • 80. Byars D. Vertically integrated multidisciplinary multimedia (VIMM) modules to teach ultrasound in emergency medicine, an ACEP teaching fellowship project. Ann Emerg Med. 2011;58(4 suppl):S332. [Google Scholar]
  • 81. Chao A, Ennis J, Jammal M, et al. Can the rapid ultrasound in shock (RUSH) exam be performed by emergency physicians with varying experience in point‐of‐care ultrasound? Acad Emerg Med. 2015;22(5):S414. SAEM 2015 Annual Meeting Abstracts. [Google Scholar]
  • 82. Chenkin J. Development and implementation of core competencies for an emergency medicine point‐of‐care ultrasound rotation using the Can MEDS framework. Can J Emerg Med. 2014;16(S1):S37. CAEP/ACMU 2014 Scientific Abstracts. [Google Scholar]
  • 83. Chenkin J, Brydges R, Jelic T, et al. Blocked practice outperforms random practice for learning resuscitative transesophageal echocardiography: a randomized controlled trial. Can J Emerg Med. 2018;20(S1):S63. [Google Scholar]
  • 84. Chenkin J, Hockmann E. A brief educational session is effective for teaching emergency medicine residents resuscitative transesophageal echocardiography. Can J Emerg Med. 2017;19(S1):S35. [Google Scholar]
  • 85. Chenkin J, McCartney C, Jelic T, et al. Practice makes perfect: defining the learning curve for emergency physicians undertaking point‐of‐care ultrasound for confirming endotracheal tube placement. Can J Emerg Med. 2015;17(S2):S19. CAEP/ACMU 2015 Scientific Abstracts: Plenary Oral Presentations. [Google Scholar]
  • 86. Cho DD, Chenkin J. Optimizing practice for learning emergency department transthoracic echocardiography using an ultrasound simulator. Can J Emerg Med. 2016;18(S1):S86‐S87. [Google Scholar]
  • 87. Clinton ME, Young J. An innovative approach to junior resident introductory E‐fast education and outcome assessment. Acad Emerg Med. 2015;22(5):S418. SAEM 2015 Annual Meeting Abstracts. [Google Scholar]
  • 88. Corujo O, Romney M, Lema PC, et al. Use of a simulation model as an adjunct for transvaginal emergency ultrasound teaching and a novel evaluation tool to assess competency. Ann Emerg Med. 2013;62(4 suppl):S90‐S91. [Google Scholar]
  • 89. Datta A, Lema PC, Bharati A, et al. Resident education in ultrasonography: improving identification of abdominal anatomy and pathology after a multimedia tutorial. Ann Emerg Med. 2012;60(4 suppl):S80‐S81. [Google Scholar]
  • 90. De Lucia A, Jackson S, Paul C, et al. How many ultrasound examinations are necessary to gain proficiency in accurately identifying the nerves of the forearm? Acad Emerg Med. 2013;20(suppl 1):S251. SAEM 2013 Annual Meeting Abstracts. [Google Scholar]
  • 91. Dulani T, Bajaj T, Ayala S, et al. Assessing the need for dedicated inferior Vena Cava ultrasound education in emergency medicine residents. Ann Emerg Med. 2014;64(4 suppl):S123‐S124. [Google Scholar]
  • 92. Duran Gehring P, Jacobson L, Saldana N. Fast wars: assessing competency and speed in a simulated trauma patient. Ann Emerg Med. 2013;62(5):S178‐S179. [Google Scholar]
  • 93. Ferre R, Bowser C, Kacprowicz V. Proctored thoracic ultrasound exams with a pneumothorax pig model increases emergency medicine residents' ability to detect a pneumothorax by ultrasound. Acad Emerg Med. 2010;17(suppl 1):S29. SAEM 2010 Annual Meeting Abstracts. [Google Scholar]
  • 94. Field J. The online emergency ultrasound microsimulator: a novel approach to assessing resident emergency ultrasound competency. Ann Emerg Med. 2016;68(4 suppl):S154‐S155. [Google Scholar]
  • 95. Field J, Brown C, Pacifique M. Asynchronous crowdsourced education for clinical ultrasound: a Curated FOAM for CUS curriculum. Acad Emerg Med. 2016;23(suppl 1):S283‐S284. SAEM 2016 Annual Meeting Abstracts. [Google Scholar]
  • 96. Furman AM, Foster T, Chaput J. Ultrasound guided central line placement training. Am J Respir Crit Care Med. 2011;183:A5859. [Google Scholar]
  • 97. Gable B. Learner curriculum for bedside evaluation of patients with undifferentiated hypotension using the rapid ultrasound for shock and hypotension exam. Ann Emerg Med. 2013;62(4 suppl):S161. [Google Scholar]
  • 98. Gelabert C, Geckle R, Malhotra R, et al. Comparison of different methods of left ventricular ejection fraction in novice sonographers. Ann Emerg Med. 2014;64(4 suppl):S114. [Google Scholar]
  • 99. Ghosh R, Das D, Kapoor M, et al. Point‐of‐care ultrasound to diagnose Achilles tendon rupture in cadaver models. Acad Emerg Med. 2019;26(suppl 1):S294. SAEM 2019 Annual Meeting Abstracts. [Google Scholar]
  • 100. Godbout B, Clark M, Egan D, et al. A prospective randomized controlled trial comparing the impact of high fidelity simulation versus standard lecture‐based training on long term clinical performance of emergency airway management by emergency medicine residents. Ann Emerg Med. 2011;58(4 suppl):S286‐S287. [Google Scholar]
  • 101. Hafez NM. Sono buff, an online multimedia point of‐care sonography training instrument designed for medical students, resident physicians, and emergency medicine faculty. Ann Emerg Med. 2016;68(4 suppl):S151. [Google Scholar]
  • 102. Hakmeh W, Sabak M, Zengin S, et al. Training with noncommercial homemade phantoms increases ultrasound guided venous cannulation procedural competence and confidence levels among emergency medicine residents. Acad Emerg Med. 2017;24(suppl 1):S261. SAEM 2017 Annual Meeting Abstracts. [Google Scholar]
  • 103. Hall B. Curriculum for teaching focused bedside echocardiography. Ann Emerg Med. 2011;58(4 suppl):S334. [Google Scholar]
  • 104. Haney R, Baran E. Excellence in ultrasound education: an innovative longitudinal approach to bedside hands‐on ultrasound teaching. West J Emerg Med. 2016;17(4.1):S55. https://escholarship.org/uc/item/4p86x547. [Google Scholar]
  • 105. Hassani B, McLeod SL, Shah A, et al. Evaluation of a brief on‐line teaching module training emergency physicians and residents how to interpret hydronephrosis and its gradations using point of care ultrasonography. Can J Emerg Med. 2014;16(S1):S91. CAEP/ACMU 2014 Scientific Abstracts. [Google Scholar]
  • 106. Haydel M, Butts C. The use of an endovaginal task‐training manikin as an adjunct in teaching emergency ultrasound of early pregnancy to residents‐cord educational grant. Acad Emerg Med. 2013;20(5):S83. SAEM 2013 Annual Meeting Abstracts. [Google Scholar]
  • 107. Herring A, Nagdev A, Durant A, et al. Cadaveric models for training emergency medicine residents in ultrasound‐guided peripheral nerve blocks. Acad Emerg Med. 2010;17(suppl 1):S207. SAEM 2010 Annual Meeting Abstracts. [Google Scholar]
  • 108. Hrymak C, Pham C. Introduction of a formalized RUSH (Rapid Ultrasound in Shock) protocol in emergency medicine residency ultrasound training. Can J Emerg Med. 2016;18(S1):S61. [Google Scholar]
  • 109. Hrymak C, Weldon E, Pham C. The educational impact of a formalized RUSH (Rapid Ultrasound in Shock) protocol in emergency medicine residency ultrasound training. Can J Emerg Med. 2016;18(S1):S61‐S62. [Google Scholar]
  • 110. Jagneaux T, Caffery T, Jones G. The benefit of a standardized simulation‐based approach in teaching ultrasound‐guided central venous access. Crit Care Med. 2011;39(12 suppl):19. [Google Scholar]
  • 111. Kluger SB, Dickman E, Homel P. Ability of emergency medicine residents in the diagnosis of CHF with a preserved ejection fraction by echocardiogram. Am J Emerg Med. 2018;36(6):1113‐1114. [DOI] [PubMed] [Google Scholar]
  • 112. Lee D, Woo MY, Lee CA, et al. A PILOT evaluation of the effectiveness of a novel emergency medicine ultrasound curriculum for residents at a Canadian Academic Centre. Can J Emerg Med. 2010;12(3):260‐261. [Google Scholar]
  • 113. Leung D, Chenkin J. Education innovation: implementing a point‐of‐care ultrasound curriculum for emergency medicine residents. Can J Emerg Med. 2016;18(1 suppl):S64. [Google Scholar]
  • 114. Minnigan H, Snead G, Fecher A, et al. The utility of a novel simulation assessment method for emergency and critical care cardiac ultrasound training. Acad Emerg Med. 2012;19(suppl 1):S240. [Google Scholar]
  • 115. Nelson BP, Noble VE, Choi J, et al. Improved resident knowledge and adherence to care guidelines using an algorithm for ectopic pregnancy evaluation. Ann Emerg Med. 2009;54(3 suppl):S34. [Google Scholar]
  • 116. Nguyen N, Paschal G, Saul T, et al. Pilot study: ejection fraction estimation by emergency medicine residents using limited bedside echocardiography in the emergency department. Ann Emerg Med. 2013;62(4 suppl):S80. [Google Scholar]
  • 117. Nguyen‐Phuoc A, Cardell A, Barca M, et al. Porcine tissue simulation training decreases time required for successful ultrasound guided peripheral access by novice sonographers. Acad Emerg Med. 2018;25(suppl 1):S59. SAEM 2018 Annual Meeting Abstracts. [Google Scholar]
  • 118. Nguyen TC, Holley C, Meador J, et al. Ultrasound‐guided nerve blocks: a comparison of two teaching methods: traditional identification of nerves versus hands‐on practice with a gel phantom model. Ann Emerg Med. 2016;68(4 suppl):S40. [Google Scholar]
  • 119. Norris DL. Comprehensive approach to teaching first trimester emergencies utilizing multiple teaching modalities. Ann Emerg Med. 2011;58(4 suppl):S335‐S336. [Google Scholar]
  • 120. O'Keefe K. Procedure simulation cart. Ann Emerg Med. 2013;62(4 suppl):S162. [Google Scholar]
  • 121. Olson A, Olson N, Chin E, Gelabert C, Sisson C. "UltraSimageddon": an intra‐city emergency medicine residency competition. West J Emerg Med. 2018;19(4.1):S28. https://escholarship.org/uc/item/0zq5f6jr. [Google Scholar]
  • 122. Olszynski PA. The edus2 workout: a stepwise approach to learning critical care emergency ultrasound. Can J Emerg Med. 2014;16(S1):S34‐S35. CAEP/ACMU 2014 Scientific Abstracts. [Google Scholar]
  • 123. Parks A, Atkinson PR, Verheul G. Point‐of‐care ultrasound (ACES protocol) improves diagnostic accuracy and confidence in simulated cardiorespiratory scenarios. Can J Emerg Med. 2013;15:S33. CAEP/ACMU 2013 Scientific Abstracts. [Google Scholar]
  • 124. Parks A, Atkinson PR, Verheul G. Can medical learners achieve ultrasound competency using a high‐fidelity ultrasound simulator? Can J Emerg Med. 2013;15(S1):S13. CAEP/ACMU 2013 Scientific Abstracts. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 125. Peterson D, McLeod SL, Ahn J, et al. Ten‐minute educational intervention improves emergency physicians' ability to interpret left ventricular function. Can J Emerg Med. 2013;15(S1):S5. CAEP/ACMU 2013 Scientific Abstracts. [Google Scholar]
  • 126. Rohra A. Use of multimedia for resident education in aorta ultrasound. Ann Emerg Med. 2013;62(4 suppl):S78. [Google Scholar]
  • 127. Runde D, Jafri F, Lewiss R, et al. Does the use of a novel ocular ultrasound model increase the level of confidence and likelihood that an emergency medicine resident will perform ocular ultrasound? Acad Emerg Med. 2011;18(5):S170‐S171. SAEM 2011 Annual Meeting Abstracts. [Google Scholar]
  • 128. Sessler CN, Seago B, Gray ND, et al. Central venous catheterization education and task simulation training: large scale implementation and reduced rates of bloodstream infection. Chest. 2009;136(4 suppl). [Google Scholar]
  • 129. Shah S, Adedipe A, Ruffatto B, et al. Effect of educational intervention on ED physician ability to perform a rapid, bedside ultrasound assessment in late pregnancy. Acad Emerg Med. 2013;20(suppl 1):S258. SAEM 2013 Annual Meeting Abstracts. [Google Scholar]
  • 130. Smith S, Lobo V, Hicks M. Mastery learning of point‐of‐care ultrasound by emergency medicine residents: a randomized study. Acad Emerg Med. 2019;26(suppl 1):S273. SAEM 2019 Annual Meeting Abstracts. [Google Scholar]
  • 131. Staum M, Radomski M. Low‐cost, open source ultrasound simulator enhances resident ultrasound education. Acad Emerg Med. 2016;23(suppl 1):S259. SAEM 2016 Annual Meeting Abstracts. [Google Scholar]
  • 132. Stolz L, Situ‐LaCasse E, Acuna J. Multimodular ultrasound orientation: residents' confidence and skill in performing point‐of‐care ultrasound. Acad Emerg Med. 2018;25(suppl 1):S201. SAEM 2018 Annual Meeting Abstracts. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 133. Tyler MD, Leo M. Exploring the use of social media and staged incentives to increase resident participation in a self‐directed emergency ultrasonography learning tool. Acad Emerg Med. 2015;22(suppl 1):S19. SAEM 2015 Annual Meeting Abstracts. [Google Scholar]
  • 134. Wadhawan A. Can podcasts replace traditional lecture for teaching ultrasound? Poster presented at: SAEM 2017 Annual Meeting. May 15‐20, 2017. Orlando, FL. Accessed March 20, 2020. https://www.eventscribe.com/2017/SAEM/ [Google Scholar]
  • 135. Williams SJ, Wu S, Peacock WF, et al. The effect of an educational intervention on the ability to identify peripheral nerves with ultrasound. Acad Emerg Med. 2013;20(suppl 1):S82‐S83. SAEM 2013 Annual Meeting Abstracts. [Google Scholar]
  • 136. Woodcroft M, Holden M, Chaplin T, et al. Development of a simulation‐based curriculum for ultrasound guided internal jugular central venous catheterization. Can J Emerg Med. 2016;18(S1):S60. [DOI] [PubMed] [Google Scholar]
  • 137. Smith RP, Learman LA. A plea for MERSQI: the medical education research study quality instrument. Obstet Gynecol. 2017;130(4):686‐690. [DOI] [PubMed] [Google Scholar]
  • 138. Reed DA, Beckman TJ, Wright SM. An assessment of the methodologic quality of medical education research studies published in The American Journal of Surgery. Am J Surg. 2009;198(3):442‐444. [DOI] [PubMed] [Google Scholar]
  • 139. Ruden EA, Way DP, Nagel RW, Cheek F, Auseon AJ. Best practices in teaching echocardiography to cardiology fellows: a review of the evidence. Echocardiography. 2016;33(11):1634‐1641. [DOI] [PubMed] [Google Scholar]
  • 140. Hamstra SJ. Keynote address: the focus on competencies and individual learner assessment as emerging themes in medical education research. Acad Emerg Med. 2012;19(12):1336‐1343. [DOI] [PubMed] [Google Scholar]
  • 141. Morrison A, Polisena J, Husereau D, et al. The effect of English‐language restriction on systematic review‐based meta‐analyses: a systematic review of empirical studies. Int J Technol Assess Health Care. 2012;28(2):138‐144. [DOI] [PubMed] [Google Scholar]

Associated Data

This section collects any data citations, data availability statements, or supplementary materials included in this article.

Supplementary Materials

Supporting Information