ATS Scholar. 2020 Jul 1;1(3):260–277. doi: 10.34197/ats-scholar.2020-0002OC

Development of a Focused Cardiac Ultrasound Image Acquisition Assessment Tool

Rosemary Adamson 1,2, Amy E Morris 1, Jessica Sun Woan 1,2, Irene W Y Ma 3, Daniel Schnobrich 4, Nilam J Soni 5,6
PMCID: PMC8043312  PMID: 33870293

Abstract

Background: Focused cardiac ultrasound (FCU) is widely used by healthcare providers to answer specific questions about cardiac structure and function at the bedside. Currently, no widely accepted FCU image acquisition checklist exists to assess learners with varying skill levels from different specialties.

Objective: The primary objective of this project was to develop a consensus-based FCU image acquisition checklist using a multispecialty group of point-of-care ultrasound (POCUS) experts.

Methods: The essential components of an FCU examination were identified on the basis of published recommendations from echocardiography and international ultrasound societies. A checklist of the essential components of an FCU examination was drafted. A panel of POCUS experts from different medical specialties in the United States and Canada was convened to vote on each checklist item by answering two questions: 1) Is this item important to include in a checklist of essential FCU skills applicable to any medical specialty? and 2) Should the learner be required to successfully complete this item to be considered competent? A modified Delphi approach was used to assess the level of agreement for each checklist item during four rounds of voting. Checklist items that achieved an agreement of 80% or greater were included in the final checklist.

Results: Thirty-one POCUS experts from seven different medical specialties voted on 65 items for inclusion in the FCU image acquisition assessment tool. The majority of POCUS experts (61%) completed all four rounds of voting. During the first round of voting, 59 items reached consensus, and after revision and revoting, an additional 3 items achieved 80% or greater consensus. A total of 62 items were included in the final checklist, and 57 items reached consensus as a requirement for demonstration of competency.

Conclusion: We have developed a multispecialty, consensus-based FCU image acquisition checklist that may be used to assess the skills of learners from different specialties. Future steps include studies to develop additional validity evidence for the use of the FCU assessment tool and to evaluate its utility for the translation of skills into clinical practice.

Keywords: point-of-care ultrasound, bedside ultrasound, focused cardiac ultrasound, echocardiography, competency assessment


Focused cardiac ultrasound (FCU) is a bedside examination performed to rapidly detect potentially significant valvular, hemodynamic, or structural abnormalities to guide appropriate early triage and management (1, 2). A broad range of healthcare providers with varying levels of clinical training, ranging from medical students to experienced subspecialty attending physicians, have begun to use FCU in recent years (3–8). Noncardiologists with basic FCU training can reliably obtain cardiac views in the assessment of acutely ill patients to identify clinically significant findings such as severe left ventricular systolic dysfunction or a large pericardial effusion (9–21). Though not a replacement for traditional transthoracic echocardiography, FCU can identify certain cardiac abnormalities with higher accuracy than physical examination alone (22–27). Furthermore, FCU is readily available in settings with limited resources or in situations in which rapid diagnosis is crucial to guide immediate decision making, such as the evaluation of hemodynamically unstable patients or those in respiratory distress (28–33).

As more providers have begun to use FCU, questions have been raised about training and assessment of competence of providers incorporating FCU into clinical care (1, 34–38). For cardiologists, competence in echocardiography is most often determined by board certification by the National Board of Echocardiography, which entails performing a specified number of echocardiograms, completing formal training in cardiovascular diseases, and passing a knowledge assessment (39). Although certification in medical procedures has traditionally been determined by performing a specified number of examinations, recent literature in different specialties has suggested that competence may be unrelated to numerical thresholds (40–42). Development of structured assessment tools reflects a broad movement in medical training toward the use of objective assessments to determine competence (43). Nielsen and colleagues developed one such tool for cardiologists to assess competence in echocardiography (44).

Structured assessment tools to objectively assess competence in noncardiac applications of point-of-care ultrasound (POCUS) have been published (45–47). For FCU specifically, a few structured tools to assess skills in cardiac image acquisition have been published, but they have important limitations. Most important, currently available assessment tools represent the consensus of individual specialties or institutions, which limits their generalizability. Furthermore, current cardiac ultrasound skill assessment tools evaluate image acquisition only broadly (i.e., they do not define the individual components of adequate cardiac image acquisition), which limits their ability to assess performance of specific skills (46, 48–51). Given these limitations, currently available FCU image acquisition assessment tools have not been widely adopted across institutions for assessing the competence of noncardiologists.

Development of a standardized FCU image acquisition assessment tool has important implications for an increasing number of specialties, including critical care medicine, anesthesiology, and emergency medicine, that require FCU training during residency or fellowship per the Accreditation Council for Graduate Medical Education, as well as for an increasing number of medical schools that are teaching students how to perform FCU examinations (3, 7). The purpose of this study was to develop a multispecialty, consensus-based FCU image acquisition checklist that includes the specific skills needed to acquire the five common FCU views. A structured and objective FCU image acquisition checklist is a critical step toward developing a standardized competency assessment tool for learners with varying skills within different specialties.

Methods

We conducted a literature review to identify the key components of an FCU examination. Sixty-four separate components of an FCU examination were identified by three of the authors (R.A., A.E.M., and J.S.W.), drawn from international ultrasound and echocardiography society guidelines, previously published checklists and curricula, and frequently cited textbooks and educational websites (1, 28, 44, 46, 52). We were intentionally inclusive in our approach, and these items were reviewed by all the authors to ensure that the list was complete. Through a series of conference calls and e-mail correspondence, the FCU examination components were organized and incorporated into the initial checklist with the following sections: 1) ultrasound knowledge and ultrasound machine functions; 2) scanning techniques; and 3) key anatomic structures and features of an optimal image for the parasternal long axis (PLAX), parasternal short axis (PSAX), apical four-chamber (A4C), subcostal four-chamber (S4C), and subcostal inferior vena cava (IVC) views.

Between August 2016 and February 2017, a multispecialty group of POCUS experts was invited to participate on the expert panel. Experts were defined as individuals who teach POCUS courses at the local, national, or international level and/or have published on the subject and who regularly use FCU in their clinical practice. A target of at least 20 panel members from a diverse range of clinical specialties that use POCUS was sought. All expert panel members were required to disclose any conflicts of interest before inclusion on the panel, and any conflicts of interest were adjudicated by the project leaders. This project was reviewed by the University of Minnesota Institutional Review Board, which determined the study to be exempt from full institutional review board review.

The checklist items were entered into an internet-based electronic data collection tool (Research Electronic Data Capture [REDCap]) hosted on a server at the University of Texas Health Science Center at San Antonio (see Appendix E1 in the data supplement) (53). The checklist was circulated to all expert panel members for voting. For each item, panel members were asked two yes-or-no questions, with skip logic incorporated to ease the burden of answering questions: 1) Is the item important enough to include in a checklist of essential FCU skills? and 2) Should the item be required for a learner to be considered competent? Panel members were encouraged to provide any relevant comments while voting on each item. Four rounds of voting were conducted between April 2017 and January 2018.
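As a concrete illustration of the two-question flow, the sketch below models the skip logic in Python; the function and field names are hypothetical, since the study implemented this behavior as branching logic within its REDCap survey rather than in code.

```python
# Hypothetical sketch of the two-question skip logic described above.
# The study configured this as REDCap branching logic; names here are invented.

def vote_on_item(answers):
    """Record one expert's vote on one checklist item.

    answers: dict of the expert's yes/no responses.
    Returns (important, required); required is None when the second
    question was skipped.
    """
    important = answers["important_to_include"]
    if not important:
        return important, None  # skip logic: second question is never shown
    return important, answers["required_for_competency"]

# An expert who votes "not important" is never asked the competency question.
print(vote_on_item({"important_to_include": False}))
print(vote_on_item({"important_to_include": True,
                    "required_for_competency": True}))
```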

A modified Delphi approach was used to determine consensus on each item to be included in the FCU image acquisition checklist. If at least 80% of respondents voted to accept or reject an item, the panel's vote was accepted as consensus, and the item was accordingly included on or excluded from the checklist. If the panel's agreement failed to reach 80%, the item was revised on the basis of the panel's comments and reconsidered during up to two additional rounds of voting. In each round after the first, respondents were informed of the level of consensus reached for each item in the prior round. In the final round of voting, respondents could select from among a prespecified list of reasons why they did not agree with an item, or they could offer their own explanation (Figure 1).
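To make the consensus rule concrete, the following minimal Python sketch classifies one item from one round of yes/no votes in the way the modified Delphi procedure describes; it is an illustration under stated assumptions, not code used in the study, and all names are invented.

```python
# Illustrative sketch of the 80% consensus rule described above.

THRESHOLD = 0.80   # 80% or greater agreement counts as consensus
MAX_ROUNDS = 3     # an item could be revised and revoted up to two more rounds

def classify_item(votes, round_number):
    """Classify one checklist item from a round of votes.

    votes: list of booleans, True meaning the respondent voted to accept.
    Returns "accepted", "rejected", "revise" (reword and revote), or
    "unresolved" once the voting rounds are exhausted.
    """
    agreement = sum(votes) / len(votes)
    if agreement >= THRESHOLD:
        return "accepted"           # >=80% voted to accept the item
    if (1 - agreement) >= THRESHOLD:
        return "rejected"           # >=80% voted to reject the item
    if round_number < MAX_ROUNDS:
        return "revise"             # revise on panel comments and revote
    return "unresolved"

# Example: 25 of 31 experts vote yes, so agreement is about 81% -> accepted.
print(classify_item([True] * 25 + [False] * 6, round_number=1))
```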

Figure 1. Modified Delphi consensus method.

The authors held a conference call after the completion of each round of voting to review the results. The voting results, including all free-text comments, were shared by e-mail for preliminary discussion before each conference call. During the conference calls, all free-text comments were reviewed, and the authors discussed whether and how to modify checklist items on the basis of the comments. Any changes to checklist items were shared by e-mail after each conference call to confirm consensus on the new wording before the next round of voting.

Results

A total of 33 POCUS experts were invited to participate on the expert panel, and 31 participated in at least one round of voting. The expert panel consisted of physicians specializing in anesthesiology, cardiology, critical care, emergency medicine, internal medicine, pediatrics, and pulmonology. More than half of the experts had completed 1 year of fellowship training in ultrasound, had been practicing FCU for more than 8 years, and had been teaching FCU for at least 5 years (Table 1). Response rates of the expert panel were 84% in round 1, 87% in round 2, 84% in round 3, and 74% in round 4. A total of 19 experts (61%) completed all four rounds of voting.

Table 1.

Demographics of the expert panel

Characteristic n (%)
Specialty  
 Internal medicine and/or pediatrics 9 (29)
 Emergency medicine 6 (19)
 Critical care and pulmonology 6 (19)
 Cardiology 4 (13)
 Anesthesiology 2 (6)
 Critical care and emergency medicine 2 (6)
 Critical care and anesthesiology 2 (6)
Sex, F 13 (42)
U.S. region or Canadian province  
 Northeast (New York, Pennsylvania) 6 (19)
 South (Georgia, North Carolina, South Carolina, Texas) 6 (19)
 Midwest (Illinois, Kansas, Michigan, Minnesota) 4 (13)
 West (California, Oregon, Washington) 8 (26)
 Alberta 4 (13)
 Nova Scotia 1 (3)
 Ontario 1 (3)
 Quebec 1 (3)
Completed ultrasound fellowship  
 Yes 18 (58)
Experience using focused cardiac ultrasound  
 0–4 yr 2 (6)
 5–8 yr 13 (42)
 >8 yr 16 (52)
Experience teaching focused cardiac ultrasound  
 0–4 yr 10 (32)
 5–8 yr 9 (29)
 >8 yr 12 (39)
Years assessing focused cardiac ultrasound skills  
 0–4 yr 12 (39)
 5–8 yr 9 (29)
 >8 yr 10 (32)
Ultrasound-related peer-reviewed publications  
 0–5 20 (65)
 6–15 5 (16)
 >15 6 (19)
Voting  
 Voted in all 4 rounds 19 (61)
 Voted in 3 rounds 6 (19)
 Voted in 2 rounds 2 (6)
 Voted in 1 round only 4 (13)

Round 1 Voting

A table summarizing all voting results is presented in Appendix E2. The initial FCU image acquisition checklist contained 64 items divided into seven sections: basic ultrasound machine settings, cardiac ultrasound setup, and the five most common cardiac ultrasound views (PLAX, PSAX, A4C, S4C, and subcostal IVC). After the first round of voting, 50 checklist items (78%) achieved 80% or greater agreement as being both important to include in the checklist and required for competency. Of the remaining 14 checklist items, 9 achieved 80% or greater consensus as being important to include in the checklist but not required for competency. The remaining 5 checklist items did not achieve consensus either for inclusion on (or exclusion from) the checklist or as requirements for competency.

Round 2 Voting

The nine checklist items from round 1 that did not reach consensus as a requirement for competency were modified on the basis of expert panel comments for subsequent voting. One checklist item that stated, “measures the largest and smallest diameters of IVC,” was separated into two items: “assesses the largest and smallest diameters of IVC qualitatively using eyeballing” and “measures the largest and smallest diameters of IVC quantitatively using calipers.” Thus, round 2 voting included a total of 10 checklist items.

Two additional items reached consensus for inclusion on the checklist (items 30 and 40). Three items achieved consensus as being important but not required for competency (items 6, 7, and 55). The remaining five checklist items, concerning general machine use and quantitative assessment of IVC diameter, failed to reach consensus and were carried forward to round 3 (items 1 and 62 and draft checklist items F5b, F6, and F7; listed in Table 2).

Table 2.

Results of checklist items not achieving consensus during round one voting

Proposed Checklist Item Modified Checklist Item Important to Include? (Round)* Required for Competency? (Round)*
1. Enters appropriate identifying information (e.g., provider name, patient name, and medical record number) A1. Enters appropriate identifying information if applicable (e.g., provider name, patient name, and medical record number) Yes (1) No
6. Places patient in supine or left lateral decubitus position B1. Places patient in optimal position to obtain parasternal long-axis views, typically supine or left lateral decubitus position, when feasible Yes (2) Yes (4)
7. Begins examination with transducer located to left of sternum at approximately the fourth intercostal space B2. Begins examination with transducer placed left of sternum, approximately midsternal level Yes (1) No
10. Correctly identifies the right ventricular outflow tract B5. Correctly identifies the right ventricle Yes (1) Yes (4)
23. Left ventricle is approximately centered on screen. Yes (1) No
28. Positions patient appropriately: left lateral decubitus position or supine. If unable to obtain adequate views in supine position, moves to left lateral decubitus position. Yes (1) Yes (4)
30. Cardiac apex is centered on screen. D3. Attempts to place cardiac apex approximately in center of screen Yes (1) Yes (2)
31. Interventricular septum is approximately vertical. Yes (1) Yes (4)
40. Places patient in the supine position. If unable to obtain adequate views, bends knees to relax abdominal muscles. E1. Places patient in optimal position to obtain subcostal views, typically supine position. If unable to obtain adequate views, repositions patient (e.g., bends knees to relax abdominal muscles). Yes (1) Yes (2)
51. Places transducer subxiphoid in cephalad–caudad orientation with transducer marker toward patient’s head F1. Places transducer subxiphoid in cephalad–caudad orientation Yes (1) Yes (4)
55. Measures the largest and smallest diameters of IVC F5a. Assesses the largest and smallest diameters of IVC qualitatively using eyeballing Yes (2) No
  F5b. Measures the largest and smallest diameters of IVC quantitatively using calipers No No
Draft item F6. When using M-mode, places caliper at approximately 90 degrees to IVC F6. When using M-mode, attempts to place calipers approximately perpendicular to IVC No No
Draft item F7. When using M-mode, measures largest and smallest diameters F7. Measures the largest and smallest diameters of IVC using M-mode and calipers No No
62. Digitally stores and archives ultrasound images. G3. Digitally stores and archives ultrasound images if applicable Yes (2) No
* Numbers in parentheses indicate the voting round in which consensus was achieved.

† Based on two rounds of voting.

‡ Excluded from checklist because less than 20% of experts voted to include this item in round 3.

Round 3 Voting

Round 3 included five items. Two items on general machine use had reached consensus as being important but not required for competency: “enters appropriate identifying information” (item 1) and “digitally stores and archives ultrasound images” (item 62). Several experts commented on technical and institutional limitations that can affect storage and archiving capability unrelated to an operator’s skills. Despite clarification of the wording of these two checklist items (Table 2), only 77% and 50% of experts voted in favor of requiring items 1 and 62, respectively, for competency in round 3. Thus, items 1 and 62 are included in the final checklist as being important but not required for competency. The other three items in round 3 voting pertained to quantitative assessment of IVC diameter, but none of these items reached consensus as a requirement for competency (“measures IVC diameters with calipers” [draft checklist item F5b], “places calipers perpendicular to IVC” [draft checklist item F6], and “measures IVC diameter in M-mode with calipers” [draft checklist item F7]). All three items were therefore excluded from the final checklist.

Round 4 Voting

During data analysis after round 3 voting, it was discovered that some checklist items had duplicate responses from the same expert panel member. After removal of duplicate responses, five checklist items that had reached consensus in round 1 fell below the 80% threshold to be required for competency: item 10 (76%), item 23 (72%), item 28 (72%), item 31 (76%), and item 51 (79%). Three checklist items from round 2 voting were similarly affected: item 6 (74%), item 7 (78%), and item 55 (69%). Therefore, a fourth round of voting was conducted to reevaluate and finalize the results for these eight pending items. Five items were confirmed as required for competency (items 6, 10, 28, 31, and 51), and three items were included as being important but not required for competency (items 7, 23, and 55) in the final checklist. Of note, the removal of duplicate responses did not affect the inclusion or exclusion of any items on the final checklist, and the additional round of voting provided reassurance that the results accurately represent the expert panel’s consensus on these items.
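The correction described above amounts to keeping a single response per expert per item and then recomputing percentage agreement. The short Python sketch below illustrates that logic; the data layout and names are hypothetical, as the actual analysis was performed on the study’s REDCap export.

```python
# Hypothetical sketch of the duplicate-response correction described above:
# keep one vote per (expert, item) pair, then recompute percentage agreement.

def agreement_after_dedup(responses):
    """responses: iterable of (expert_id, item_id, vote) tuples, where vote is
    True if the expert voted to require the item for competency.
    Returns {item_id: fraction of deduplicated voters who voted yes}."""
    latest = {}
    for expert_id, item_id, vote in responses:
        latest[(expert_id, item_id)] = vote  # duplicates collapse to one vote

    tallies = {}  # item_id -> (yes_votes, total_votes)
    for (_, item_id), vote in latest.items():
        yes, total = tallies.get(item_id, (0, 0))
        tallies[item_id] = (yes + int(vote), total + 1)

    return {item: yes / total for item, (yes, total) in tallies.items()}

# Expert 2's duplicated vote on item 10 is counted only once: 2 of 3 voters
# favor the item, so agreement is about 67%, below the 80% threshold.
print(agreement_after_dedup([(1, 10, True), (2, 10, True),
                             (2, 10, True), (3, 10, False)]))
```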

In summary, the initial checklist included 64 items, and the final checklist included 62 items based on expert panel consensus. Of these 62 items, 57 achieved consensus as a requirement for demonstration of competency in image acquisition for FCU examinations (Table 3). The items that did not achieve consensus and were excluded from the final checklist focused on quantitative measurement of IVC diameter using calipers or M-mode. Of the five items included in the final checklist considered to be important but not required for competency, two items were related to basic machine operation (entering patient identifying information [item 1] and digitally storing and archiving images [item 62]), two items involved nuanced aspects of cardiac image acquisition (beginning the image acquisition of a PLAX view from the left midsternal level [item 7] and centering the left ventricle on the screen in the PSAX view [item 23]), and one item involved assessment of IVC diameter (assessing the largest and smallest diameters of the IVC qualitatively [item 55]).
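The article defines which items are required for competency but does not prescribe a scoring rule for assessors. One natural reading, sketched below purely for illustration, is that a learner demonstrates competency only when every required item is performed; the abbreviated item set and the all-required-items pass rule are assumptions, not part of the published checklist.

```python
# Hypothetical scoring sketch: the learner must perform every item marked
# required. The item subset below abbreviates the 62-item checklist in Table 3.

CHECKLIST = {
    2: ("Selects cardiac mode", True),                         # required
    4: ("Selects phased-array transducer", True),              # required
    1: ("Enters appropriate identifying information", False),  # not required
    23: ("Left ventricle is approximately centered", False),   # not required
}

def meets_required_items(performed_item_ids):
    """Return True if every required checklist item was performed."""
    return all(
        item_id in performed_item_ids
        for item_id, (_, required) in CHECKLIST.items()
        if required
    )

print(meets_required_items({2, 4}))   # True: both required items performed
print(meets_required_items({2, 23}))  # False: required item 4 was missed
```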

Table 3.

Focused Cardiac Ultrasound Skills Checklist developed using modified Delphi process

Checklist Item
Required for Competency?
Section A: Appropriate use of ultrasound machine and basic settings
1. Enters appropriate identifying information (e.g., provider name, patient name, and medical record number) No
2. Selects cardiac mode Yes
3. Places indicator on screen in location consistent with local or specialty convention Yes
4. Selects phased-array transducer Yes
5. Interacts in a professional manner with the patient Yes
 
Section B: Parasternal long axis (PLAX)
6. Places patient in optimal position to obtain parasternal long-axis view, typically supine or left lateral decubitus position, when feasible Yes
7. Begins examination with transducer placed left of sternum, approximately midsternal level No
8. Positions transducer in correct axis with transducer marker toward right shoulder/left hip Yes
9. Displays left ventricular cavity at its fullest diameter Yes
10. Correctly identifies right ventricle Yes
11. Correctly identifies left ventricular outflow tract Yes
12. Correctly identifies aortic valve Yes
13. Correctly identifies ascending aorta Yes
14. Correctly identifies left ventricle Yes
15. Correctly identifies septum Yes
16. Correctly identifies left atrium Yes
17. Correctly identifies mitral valve Yes
18. Correctly identifies anterior leaflet of mitral valve Yes
19. Correctly identifies pericardium Yes
20. Correctly identifies descending thoracic aorta Yes
 
Section C: Parasternal short axis (PSAX) at midpapillary muscle level
21. Rotates transducer 90 degrees clockwise from parasternal long-axis view Yes
22. Captures a circular left ventricular image in normal patients Yes
23. Left ventricle is approximately centered on screen No
24. Correctly identifies right ventricle Yes
25. Correctly identifies left ventricle Yes
26. Correctly identifies both papillary muscles Yes
27. Correctly identifies septum Yes
 
Section D: Apical 4-chamber (A4C)
28. Positions patient appropriately: left lateral decubitus position or supine. If unable to obtain adequate views in supine position, moves to left lateral decubitus position. Yes
29. Left ventricle and atrium appear on right side of the screen, and right ventricle and atrium are on the left. Yes
30. Attempts to place cardiac apex approximately in center of screen Yes
31. Interventricular septum is approximately vertical Yes
32. Avoids foreshortening. Heart should appear long and oval rather than short and globular in normal patients. Yes
33. Correctly identifies right ventricle Yes
34. Correctly identifies tricuspid valve Yes
35. Correctly identifies right atrium Yes
36. Correctly identifies left ventricle Yes
37. Correctly identifies mitral valve Yes
38. Correctly identifies left atrium Yes
39. Correctly identifies septum Yes
 
Section E: Subcostal long axis (SLAX) or subcostal 4-chamber (S4C)
40. Places patient in optimal position to obtain subcostal views, typically supine position. If unable to obtain adequate views, repositions patient (e.g., bends knees to relax abdominal muscles). Yes
41. Places transducer in subxiphoid position Yes
42. Positions transducer in correct orientation to obtain 4-chamber view of the heart Yes
43. Correctly identifies liver Yes
44. Correctly identifies right ventricle Yes
45. Correctly identifies tricuspid valve Yes
46. Correctly identifies right atrium Yes
47. Correctly identifies septum Yes
48. Correctly identifies left ventricle Yes
49. Correctly identifies mitral valve Yes
50. Correctly identifies left atrium Yes
 
Section F: Subcostal inferior vena cava (SIVC)
51. Places transducer subxiphoid in cephalad–caudad orientation Yes
52. Obtains view of inferior vena cava (IVC) entering right atrium Yes
53. Acquires view at widest diameter of IVC Yes
54. Chooses an appropriate point to assess IVC diameter Yes
55. Assesses the largest and smallest diameters of IVC qualitatively using estimation (“eyeballing”) No
56. Correctly identifies IVC Yes
57. Correctly identifies liver Yes
58. Correctly identifies right atrium Yes
59. Correctly identifies hepatic vein Yes
 
Section G: Examination conclusion
60. Adjusts depth appropriately in order to capture all key structures Yes
61. Adjusts gain appropriately in order to optimize visualization of all key structures Yes
62. Digitally stores and archives ultrasound images, if applicable No

Discussion

Using a large international panel of multispecialty POCUS experts, we have developed a consensus-based FCU image acquisition checklist. The final checklist contains 62 items, 57 of which are required for demonstration of competency in FCU image acquisition. The checklist covers skills in basic ultrasound machine operation, techniques of cardiac image acquisition, and structures visualized from the five most common cardiac views.

At least five FCU assessment tools have been published previously, but none have been widely accepted for assessing competence in FCU image acquisition by noncardiologists. Important limitations of these previously published checklists include omission of some standard FCU views, recruitment of a small number of experts, and design for use by a single specialty (46, 48–51). Patrawalla and colleagues developed a multisystem critical care ultrasound assessment tool that was limited by the number of expert panel members (n = 4) and inclusion of only the parasternal long- and short-axis views (46). Similarly, Rebel and colleagues and Mitchell and colleagues published multisystem checklists for anesthesiology residents, but only the former checklist specified a few required structures to be identified from the parasternal short-axis and apical four-chamber views (50, 51). The tools developed by Gaudet and colleagues and Millington and colleagues included all five standard FCU views, but the number of experts recruited to develop these tools was small and limited to critical care (48, 49). The checklist by Gaudet and colleagues, developed by echocardiography-certified critical care physicians, includes four levels of the parasternal short-axis view and quantitative IVC assessments, which can be considered advanced skills, limiting its generalizability to basic FCU users from other specialties.

The FCU skill assessment tool published by Millington and colleagues used a global rating scale approach with a single question to assess image quality of each standard FCU view, rather than articulating the individual components of each view (49). Global rating scales have some advantages over checklists. For example, many checklists include items not essential to the procedure itself, such as cleaning the ultrasound machine, which can cloud judgment regarding technical skill competence. If every checklist item has equal weight, learners who correctly perform most steps of a procedure yet make a critical mistake can theoretically still receive a passing score (54). In such cases, global rating scales may be more effective at discriminating between competent and incompetent performances (55). However, the use of global rating scales requires significant training and expertise among the examiners to clearly define the essential components of a given skill and maintain consistency in rating learners. Given the limited availability of POCUS experts at institutions, it is probable that many FCU skill assessors have limited experience in using global rating scales, and therefore a checklist with specific items was an appropriate tool to develop.

We have created an FCU image acquisition checklist that is unique compared with prior tools because it was developed by a multispecialty panel of experts from different institutions. The expert panel represented seven specialties from different regions of the United States and Canada. Our goal in recruiting a broad array of experts was to create a checklist that is useful for all FCU users. Despite this diversity, our panel reached consensus on 81% of the checklist items during the first round of voting. The use of such a broad expert panel provides validity evidence for the content of our checklist. The distinction between “important to include” and “required for competence” for individual checklist items merits further discussion. By asking our expert panel to determine whether each element is not only important but also required for competence, we hoped to establish consensus on the components of a universal FCU examination while identifying critical items that a learner must complete to be deemed competent.

The process of developing this consensus-based checklist revealed some important caveats about assessing FCU image acquisition skills. First, the panel agreed it is important for FCU users to enter patient data and archive images, but the majority believed it was not necessary for learners to demonstrate these skills to be deemed competent. The panel believed emphasis should be placed on using FCU to make accurate, rapid clinical decisions in the care of acutely ill patients rather than on entering patient data and archiving images. Furthermore, the expert panel expressed concerns about variability in access to image-archiving software and potential challenges in connecting handheld ultrasound devices to institutional servers. Second, there was a common sentiment in favor of focusing on the quality of an acquired image rather than the details of the maneuver used to obtain the view. Third, there was controversy about the importance and clinical utility of quantitative measurement of respiratory variation in IVC diameter during an FCU examination. Although a strong majority (85%) agreed that an assessment of IVC diameter should be included in the checklist, none of the items with quantitative IVC measurements reached the 80% threshold for consensus. Experts commented on the minimal advantage of quantitative over qualitative assessment and raised concerns about inexperienced FCU users misinterpreting or attaching undue clinical significance to quantitative IVC measurements. These views reflect the ongoing debate about the clinical utility of IVC assessments in certain settings (56, 57). The final checklist therefore contains items that require the user to visualize the IVC but not to measure its diameter.

Our study has important limitations related to the use of an expert panel, the length of the checklist, and the focus on image acquisition. First, our panel of POCUS experts was based exclusively in North America, and panel members interacted only electronically, via e-mail and the online survey tool. Compared with in-person meetings or video-based conference calls, this purely electronic interaction may have precluded discussion of important nuances of FCU image acquisition, especially in areas of disagreement. However, to mitigate loss of these qualitative data, we provided a free-text comment box under each checklist item and reviewed the panel’s comments with each round of voting. In addition, not all experts completed all rounds of voting. When selecting expert panelists for this modified Delphi design, an approach that can be difficult to coordinate because of the sheer number of participants, we invited individuals with known or suspected high levels of engagement in cardiac ultrasound from a broad range of specialties and diverse practice settings. We clearly communicated the survey completion deadline in the invitation e-mail for each round of voting and sent one reminder e-mail before the deadline.

Second, regarding the length of the checklist, although a checklist that includes 62 items may seem long, many of the items can be assessed simultaneously in each view. Checklist items that require identification of structures in a view can typically be completed quickly given the binary (yes or no) nature of these items. For ease of use, we have developed a one-page version of the checklist that may be preferred (Table E1). Future studies should evaluate the feasibility of using the checklist and assess whether the number of items can be reduced without reducing the discriminatory performance of the checklist.

Last, our checklist focuses only on FCU image acquisition skill, which is essential but not sufficient to determine whether a provider is competent to use FCU in patient management. For a complete assessment of FCU competence, a learner’s image interpretation and clinical integration skills must also be assessed. However, acquisition of high-quality FCU images is a critical first step toward accurately interpreting cardiac structure and function, and our checklist therefore addresses this first step toward integrating FCU in patient care. In addition, some ancillary skills, such as appropriate care and cleaning of the ultrasound machine, were not captured by our checklist.

Conclusions

We have developed an FCU image acquisition checklist based on the consensus of a large, multispecialty panel of ultrasound experts. Our checklist contains 62 items that may be used as a standardized tool to provide both summative and formative assessments for a broad range of medical specialties and disciplines that perform FCU examinations. The checklist can help identify healthcare providers who are competent to perform FCU examinations, as well as individuals who require further practice. Future steps include development of a visual anchoring system for each view to aid users, as well as gathering usage data to provide further evidence of validity of the checklist for determining learner competence in FCU performance in patient care.


Acknowledgment

The authors express their thanks to Elizabeth Haro, M.P.H., for assisting with the preparation of the manuscript; to Michael Mader, M.S., for collating the results; and especially to the members of the expert panel for participating in this consensus-based process. The expert panel included Robert Arntfield, M.D.; Yuriy Bronnshteyn, M.D., FASE; Andrew Cheng, M.D.; David Clanton, M.D.; Rosaleen Chun, M.D., FRCPC, M.H.Sc.; Ria Dancel, M.D., FAAP, FHM; Renee Dversdal, M.D.; Amy Hendricks, M.D.; Heather Hurdle, M.D., M.Sc., FRCPC; Trevor Jensen, M.D.; Nicholas J. Johnson, M.D.; Ross Kessler, M.D.; Vijay Krishnamoorthy, M.D., Ph.D.; Mangala Narasimhan, M.D.; Bret P. Nelson, M.D., FACEP; Laura Oh, M.D.; Nova Panebianco, M.D.; Paru Patrawalla, M.D., FCCP; Danny Peterson, B.Sc., M.D., Ph.D., R.D.M.S., FRCPC; Lewis Satterwhite, M.D.; Ariel Shiloh, M.D.; Kirk Spencer, M.D.; Michael Vrablik, D.O.; Patrick Willemot, M.D.C.M., FRCPC; Gulrukh Zaidi, M.D., FCCP; and Elisa Zaragoza-Macias, M.D.

Footnotes

Author Contributions: Conception and design: R.A., A.E.M., J.S.W., I.W.Y.M., D.S., and N.J.S. Analysis and interpretation: R.A., A.E.M., J.S.W., I.W.Y.M., D.S., and N.J.S. Drafting of the manuscript for important intellectual content: R.A., A.E.M., J.S.W., and N.J.S. Review and revisions to the final manuscript: R.A., A.E.M., J.S.W., I.W.Y.M., D.S., and N.J.S.

This article has a data supplement, which is accessible from this issue’s table of contents at www.atsjournals.org.

Author disclosures are available with the text of this article at www.atsjournals.org.

References

1. Spencer KT, Kimura BJ, Korcarz CE, Pellikka PA, Rahko PS, Siegel RJ. Focused cardiac ultrasound: recommendations from the American Society of Echocardiography. J Am Soc Echocardiogr. 2013;26:567–581. doi: 10.1016/j.echo.2013.04.001.
2. Labovitz AJ, Noble VE, Bierig M, Goldstein SA, Jones R, Kort S, et al. Focused cardiac ultrasound in the emergent setting: a consensus statement of the American Society of Echocardiography and American College of Emergency Physicians. J Am Soc Echocardiogr. 2010;23:1225–1230. doi: 10.1016/j.echo.2010.10.005.
3. Bahner DP, Goldman E, Way D, Royall NA, Liu YT. The state of ultrasound education in U.S. medical schools: results of a national survey. Acad Med. 2014;89:1681–1686. doi: 10.1097/ACM.0000000000000414.
4. Steinmetz P, Dobrescu O, Oleskevich S, Lewis J. Bedside ultrasound education in Canadian medical schools: a national survey. Can Med Educ J. 2016;7:e78–e86.
5. Johri AM, Durbin J, Newbigging J, Tanzola R, Chow R, De S, et al. Cardiac point-of-care ultrasound: state-of-the-art in medical school education. J Am Soc Echocardiogr. 2018;31:749–760. doi: 10.1016/j.echo.2018.01.014.
6. Kumar A, Barman N, Lurie J, He H, Goldman M, McCullough SA. Development of a point-of-care cardiovascular ultrasound program for preclinical medical students. J Am Soc Echocardiogr. 2018;31:1064–1066.e2. doi: 10.1016/j.echo.2018.05.008.
7. Baltarowich OH, Di Salvo DN, Scoutt LM, Brown DL, Cox CW, DiPietro MA, et al. National ultrasound curriculum for medical students. Ultrasound Q. 2014;30:13–19. doi: 10.1097/RUQ.0000000000000066.
8. Cantisani V, Dietrich CF, Badea R, Dudea S, Prosch H, Cerezo E, et al. EFSUMB statement on medical student education in ultrasound [short version]. Ultraschall Med. 2016;37:100–102. doi: 10.1055/s-0035-1566959.
9. Smith CJ, Morad A, Balwanz C, Lyden E, Matthias T. Prospective evaluation of cardiac ultrasound performance by general internal medicine physicians during a 6-month faculty development curriculum. Crit Ultrasound J. 2018;10:9. doi: 10.1186/s13089-018-0090-7.
10. Lucas BP, Candotti C, Margeta B, Evans AT, Mba B, Baru J, et al. Diagnostic accuracy of hospitalist-performed hand-carried ultrasound echocardiography after a brief training program. J Hosp Med. 2009;4:340–349. doi: 10.1002/jhm.438.
11. American College of Emergency Physicians (ACEP) Emergency Ultrasound Section Writing Committee. Ultrasound guidelines: emergency, point-of-care and clinical ultrasound guidelines in medicine. Ann Emerg Med. 2017;69:e27–e54. doi: 10.1016/j.annemergmed.2016.08.457.
12. Melamed R, Sprenkle MD, Ulstad VK, Herzog CA, Leatherman JW. Assessment of left ventricular function by intensivists using hand-held echocardiography. Chest. 2009;135:1416–1420. doi: 10.1378/chest.08-2440.
13. Vignon P, Dugard A, Abraham J, Belcour D, Gondran G, Pepino F, et al. Focused training for goal-oriented hand-held echocardiography performed by noncardiologist residents in the intensive care unit. Intensive Care Med. 2007;33:1795–1799. doi: 10.1007/s00134-007-0742-8.
14. Johnson BK, Tierney DM, Rosborough TK, Harris KM, Newell MC. Internal medicine point-of-care ultrasound assessment of left ventricular function correlates with formal echocardiography. J Clin Ultrasound. 2016;44:92–99. doi: 10.1002/jcu.22272.
15. Mjølstad OC, Andersen GN, Dalen H, Graven T, Skjetne K, Kleinau JO, et al. Feasibility and reliability of point-of-care pocket-size echocardiography performed by medical residents. Eur Heart J Cardiovasc Imaging. 2013;14:1195–1202. doi: 10.1093/ehjci/jet062.
16. Beraud AS, Rizk NW, Pearl RG, Liang DH, Patterson AJ. Focused transthoracic echocardiography during critical care medicine training: curriculum implementation and evaluation of proficiency. Crit Care Med. 2013;41:e179–e181. doi: 10.1097/CCM.0b013e31828e9240.
17. Bustam A, Noor Azhar M, Singh Veriah R, Arumugam K, Loch A. Performance of emergency physicians in point-of-care echocardiography following limited training. Emerg Med J. 2014;31:369–373. doi: 10.1136/emermed-2012-201789.
18. Panoulas VF, Daigeler AL, Malaweera AS, Lota AS, Baskaran D, Rahman S, et al. Pocket-size hand-held cardiac ultrasound as an adjunct to clinical examination in the hands of medical students and junior doctors. Eur Heart J Cardiovasc Imaging. 2013;14:323–330. doi: 10.1093/ehjci/jes140.
19. Duvall WL, Croft LB, Goldman ME. Can hand-carried ultrasound devices be extended for use by the noncardiology medical community? Echocardiography. 2003;20:471–476. doi: 10.1046/j.1540-8175.2003.03070.x.
20. Stokke TM, Ruddox V, Sarvari SI, Otterstad JE, Aune E, Edvardsen T. Brief group training of medical students in focused cardiac ultrasound may improve diagnostic accuracy of physical examination. J Am Soc Echocardiogr. 2014;27:1238–1246. doi: 10.1016/j.echo.2014.08.001.
21. Martin LD, Howell EE, Ziegelstein RC, Martire C, Shapiro EP, Hellmann DB. Hospitalist performance of cardiac hand-carried ultrasound after focused training. Am J Med. 2007;120:1000–1004. doi: 10.1016/j.amjmed.2007.07.029.
22. Kimura BJ. Point-of-care cardiac ultrasound techniques in the physical examination: better at the bedside. Heart. 2017;103:987–994. doi: 10.1136/heartjnl-2016-309915.
23. Martin LD, Howell EE, Ziegelstein RC, Martire C, Whiting-O’Keefe QE, Shapiro EP, et al. Hand-carried ultrasound performed by hospitalists: does it improve the cardiac physical examination? Am J Med. 2009;122:35–41. doi: 10.1016/j.amjmed.2008.07.022.
24. Razi R, Estrada JR, Doll J, Spencer KT. Bedside hand-carried ultrasound by internal medicine residents versus traditional clinical assessment for the identification of systolic dysfunction in patients admitted with decompensated heart failure. J Am Soc Echocardiogr. 2011;24:1319–1324. doi: 10.1016/j.echo.2011.07.013.
25. Kobal SL, Trento L, Baharami S, Tolstrup K, Naqvi TZ, Cercek B, et al. Comparison of effectiveness of hand-carried ultrasound to bedside cardiovascular physical examination. Am J Cardiol. 2005;96:1002–1006. doi: 10.1016/j.amjcard.2005.05.060.
26. Smallwood N, Dachsel M. Point-of-care ultrasound (POCUS): unnecessary gadgetry or evidence-based medicine? Clin Med (Lond). 2018;18:219–224. doi: 10.7861/clinmedicine.18-3-219.
27. Bhagra A, Tierney DM, Sekiguchi H, Soni NJ. Point-of-care ultrasonography for primary care physicians and general internists. Mayo Clin Proc. 2016;91:1811–1827. doi: 10.1016/j.mayocp.2016.08.023.
28. Kimura BJ, Yogo N, O’Connell CW, Phan JN, Showalter BK, Wolfson T. Cardiopulmonary limited ultrasound examination for “quick-look” bedside application. Am J Cardiol. 2011;108:586–590. doi: 10.1016/j.amjcard.2011.03.091.
29. Breitkreutz R, Price S, Steiger HV, Seeger FH, Ilper H, Ackermann H, et al.; Emergency Ultrasound Working Group of the Johann Wolfgang Goethe-University Hospital, Frankfurt am Main. Focused echocardiographic evaluation in life support and peri-resuscitation of emergency patients: a prospective trial. Resuscitation. 2010;81:1527–1533. doi: 10.1016/j.resuscitation.2010.07.013.
30. Arntfield RT, Millington SJ. Point of care cardiac ultrasound applications in the emergency department and intensive care unit: a review. Curr Cardiol Rev. 2012;8:98–108. doi: 10.2174/157340312801784952.
31. Bøtker MT, Jacobsen L, Rudolph SS, Knudsen L. The role of point of care ultrasound in prehospital critical care: a systematic review. Scand J Trauma Resusc Emerg Med. 2018;26:51. doi: 10.1186/s13049-018-0518-x.
32. Gaspari R, Weekes A, Adhikari S, Noble VE, Nomura JT, Theodoro D, et al. Emergency department point-of-care ultrasound in out-of-hospital and in-ED cardiac arrest. Resuscitation. 2016;109:33–39. doi: 10.1016/j.resuscitation.2016.09.018.
33. Jones AE, Tayal VS, Sullivan DM, Kline JA. Randomized, controlled trial of immediate versus delayed goal-directed ultrasound to identify the cause of nontraumatic hypotension in emergency department patients. Crit Care Med. 2004;32:1703–1708. doi: 10.1097/01.ccm.0000133017.34137.82.
34. Pustavoitau A, Blaivas M, Brown SM, Gutierrez C, Kirkpatrick AW, Kohl BA, et al.; Ultrasound Certification Task Force on behalf of the Society of Critical Care Medicine. Official statement of the Society of Critical Care Medicine: recommendations for achieving and maintaining competence and credentialing in critical care ultrasound with focused cardiac ultrasound and advanced critical care echocardiography. Mount Prospect, IL: Society of Critical Care Medicine.
35. Kaplan DA. The trouble with ultrasound’s pervasive use by non-radiologists. Diagnostic Imaging. Cranbury, NJ: MJH Life Sciences; 2011 Mar 3 [accessed 2019 May 19]. Available from: https://www.diagnosticimaging.com/ultrasound/trouble-ultrasound%E2%80%99s-pervasive-use-non-radiologists.
36. Mulvagh SL, Bhagra A, Nelson BP, Narula J. Handheld ultrasound devices and the training conundrum: how to get to “seeing is believing.” J Am Soc Echocardiogr. 2014;27:310–313. doi: 10.1016/j.echo.2014.01.011.
37. Spencer KT. Focused cardiac ultrasound: where do we stand? Curr Cardiol Rep. 2015;17:567. doi: 10.1007/s11886-015-0567-y.
38. Ruddox V, Norum IB, Stokke TM, Edvardsen T, Otterstad JE. Focused cardiac ultrasound by unselected residents: the challenges. BMC Med Imaging. 2017;17:22. doi: 10.1186/s12880-017-0191-y.
39. Mayo PH, Beaulieu Y, Doelken P, Feller-Kopman D, Harrod C, Kaplan A, et al. American College of Chest Physicians/La Société de Réanimation de Langue Française statement on competence in critical care ultrasonography. Chest. 2009;135:1050–1060. doi: 10.1378/chest.08-2305.
40. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Residents’ procedural experience does not ensure competence: a research synthesis. J Grad Med Educ. 2017;9:201–208. doi: 10.4300/JGME-D-16-00426.1.
41. Stride HP, George BC, Williams RG, Bohnen JD, Eaton MJ, Schuller MC, et al. Relationship of procedural numbers with meaningful procedural autonomy in general surgery residents. Surgery. 2018;163:488–494. doi: 10.1016/j.surg.2017.10.011.
42. Blehar DJ, Barton B, Gaspari RJ. Learning curves in emergency ultrasound education. Acad Emerg Med. 2015;22:574–582. doi: 10.1111/acem.12653.
43. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32:676–682. doi: 10.3109/0142159X.2010.500704.
44. Nielsen DG, Gotzsche O, Eika B. Objective structured assessment of technical competence in transthoracic echocardiography: a validity study in a standardised setting. BMC Med Educ. 2013;13:47. doi: 10.1186/1472-6920-13-47.
45. Todsen T, Tolsgaard MG, Olsen BH, Henriksen BM, Hillingsø JG, Konge L, et al. Reliable and valid assessment of point-of-care ultrasonography. Ann Surg. 2015;261:309–315. doi: 10.1097/SLA.0000000000000552.
46. Patrawalla P, Eisen LA, Shiloh A, Shah BJ, Savenkov O, Wise W, et al. Development and validation of an assessment tool for competency in critical care ultrasound. J Grad Med Educ. 2015;7:567–573. doi: 10.4300/JGME-D-14-00613.1.
47. Tolsgaard MG, Todsen T, Sorensen JL, Ringsted C, Lorentzen T, Ottesen B, et al. International multispecialty consensus on how to evaluate ultrasound competence: a Delphi consensus survey. PLoS One. 2013;8:e57687. doi: 10.1371/journal.pone.0057687.
48. Gaudet J, Waechter J, McLaughlin K, Ferland A, Godinez T, Bands C, et al. Focused critical care echocardiography: development and evaluation of an image acquisition assessment tool. Crit Care Med. 2016;44:e329–e335. doi: 10.1097/CCM.0000000000001620.
49. Millington SJ, Arntfield RT, Hewak M, Hamstra SJ, Beaulieu Y, Hibbert B, et al. The rapid assessment of competency in echocardiography scale: validation of a tool for point-of-care ultrasound. J Ultrasound Med. 2016;35:1457–1463. doi: 10.7863/ultra.15.07083.
50. Rebel A, Srour H, DiLorenzo A, Nguyen D, Ferrell S, Dwarakanatli S, et al. Ultrasound skill and application of knowledge assessment using an innovative OSCE competition-based simulation approach. J Educ Perioper Med. 2016;18:E404.
51. Mitchell JD, Amir R, Montealegre-Gallegos M, Mahmood F, Shnider M, Mashari A, et al. Summative objective structured clinical examination assessment at the end of anesthesia residency for perioperative ultrasound. Anesth Analg. 2018;126:2065–2068. doi: 10.1213/ANE.0000000000002826.
52. Via G, Hussain A, Wells M, Reardon R, ElBarbary M, Noble VE, et al.; International Liaison Committee on Focused Cardiac UltraSound (ILC-FoCUS); International Conference on Focused Cardiac UltraSound (IC-FoCUS). International evidence-based recommendations for focused cardiac ultrasound. J Am Soc Echocardiogr. 2014;27:683.e1–683.e33. doi: 10.1016/j.echo.2014.05.001.
53. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap): a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform. 2009;42:377–381. doi: 10.1016/j.jbi.2008.08.010.
54. Ma IW, Pugh D, Mema B, Brindle ME, Cooke L, Stromer JN. Use of an error-focused checklist to identify incompetence in lumbar puncture performances. Med Educ. 2015;49:1004–1015. doi: 10.1111/medu.12809.
55. Ilgen JS, Ma IW, Hatala R, Cook DA. A systematic review of validity evidence for checklists versus global rating scales in simulation-based assessment. Med Educ. 2015;49:161–173. doi: 10.1111/medu.12621.
56. Schmidt GA. POINT: should acute fluid resuscitation be guided primarily by inferior vena cava ultrasound for patients in shock? Yes. Chest. 2017;151:531–532. doi: 10.1016/j.chest.2016.11.021.
57. Kory P. COUNTERPOINT: should acute fluid resuscitation be guided primarily by inferior vena cava ultrasound for patients in shock? No. Chest. 2017;151:533–536. doi: 10.1016/j.chest.2016.11.017.
