Author manuscript; available in PMC: 2020 Aug 5.
Published in final edited form as: Echocardiography. 2019 Aug 5;36(8):1515–1523. doi: 10.1111/echo.14441

A Standardized Imaging Protocol Improves Quality and Reduces Practice Variability in Echocardiographic Assessment of Ventricular Function by First-Year Pediatric Cardiology Fellows

Brian R White 1,*, Deborah Y Ho 1, Lindsay S Rogers 1, Shobha S Natarajan 1
PMCID: PMC6717032  NIHMSID: NIHMS1042277  PMID: 31385382

Abstract

Background:

Echocardiography education for pediatric cardiology fellows has been a recent focus, leading to the implementation of “boot camps”. Less has been described about continuing education through fellowship and improving image quality. We noticed practice variation in echocardiograms assessing ventricular function performed on nights and weekends. Thus, we implemented a standardized protocol and assessed its impact on imaging and reporting completeness.

Methods:

We created an imaging protocol for the assessment of ventricular function in the acute setting. The protocol included demographic information, a list of images to be obtained, and the methods to quantify ventricular function. The protocol was explained to first-year fellows and distributed on an electronic quick-reference card. Echocardiograms independently performed by first-year fellows during their first four months of on-call time were assessed pre- and post-intervention using a standard rubric.

Results:

Compliance with demographic reporting was high pre- and post-intervention, but significantly improved after the standardized protocol (p<0.001). Use of the protocol increased the median number of unique images obtained per echocardiogram from 13 to 17 (out of 23 required views, p<0.001). Particularly improved was the performance of quantitative evaluations of function, including Simpson’s method for left ventricular ejection fraction (four chamber: 40% vs. 67%, p<0.001; two chamber: 33% vs. 67%, p<0.001) and tricuspid annular plane systolic excursion (45% vs. 80%, p<0.001).

Conclusions:

The introduction of a standardized imaging protocol and its distribution to first-year fellows resulted in improvements in echocardiographic reporting completeness and increased the quality of information obtained by providing more quantitative assessments of ventricular function.

Keywords: Quality Improvement, Pediatric Cardiology, Education, Fellowship Training, Image Quality

Introduction:

Educating fellows to perform and interpret transthoracic echocardiograms (TTEs) is a critical component of pediatric cardiology fellowship1. As most general pediatrics residents are not exposed to echocardiography, it is a skill that must be learned from scratch. Thus, improving and standardizing echocardiography education has been a recent focus at many fellowship programs. A common development has been the implementation of a “boot camp” at the beginning of fellowship2–6. The amount of time dedicated to teaching echocardiography skills in these programs varies, but intensive educational sessions have been shown to lead to higher-quality echocardiograms and greater independence7. However, an understanding of best practices in echocardiography education after fellowship orientation is lacking.

While observation by experienced sonographers and attendings occurs throughout fellowship, much experience and learning occurs during nights and weekends, when fellows perform echocardiograms independently. At our hospital, the majority of echocardiograms during these times are performed by first-year pediatric cardiology fellows. We focused our improvement efforts on one of the most common types of TTE performed in this setting: the acute inpatient assessment of right and left ventricular function. We have anecdotally noted unintended practice variation in the manner in which these echocardiograms are performed, particularly with regard to the use of Doppler interrogation and quantitative evaluation of function. Such deficiencies are likely due to the many barriers to effectively performing echocardiograms in the on-call environment, including frequent interruptions, patient instability, and fellow inexperience.

In our current educational practice, sonographers and attendings often provide feedback about echocardiograms performed on-call after imaging is complete. As real-time feedback is usually not feasible, fellows frequently learn by trial-and-error regarding what echocardiographic images to perform in each situation and improve with experience. In order to provide practical support to new fellows and standardize this process, we created an imaging protocol for the acute inpatient assessment of systolic ventricular function (which includes an assessment of pericardial and pleural effusions) and distributed the protocol with electronic quick reference cards. We hypothesized that the use of a standardized TTE imaging protocol for focused ventricular function assessment would lead to more complete echocardiograms earlier in the first-year of fellowship and improve the ability to assess function qualitatively and quantitatively.

Methods:

Imaging Protocol and Educational Intervention:

We created an echocardiographic imaging protocol for assessing ventricular function in the acute setting (Table I). The protocol was developed based on consensus among our echocardiography faculty and the recommendations of professional societies8,9. The imaging goals of this protocol were for the standard assessment of systolic ventricular function, effusions, and valvar regurgitation by on-call pediatric cardiology fellows in patients who are admitted to the hospital or being evaluated in the emergency department. As such, it did not include advanced evaluation of function (e.g., strain, three-dimensional ejection fraction, or tissue Doppler imaging).

Table I:

The ventricular function protocol quick reference card including information and views to be obtained.*

FOCUSED VENTRICULAR FUNCTION CHECK PROTOCOL
   • This protocol is a guideline for focused function checks and effusion checks
   • If specific information is requested, prioritize those images
   • Ensure ECG tracing is present throughout study
   • Record BP at the time of imaging & any cardiac inotropes

Study information

   • Include ordering attending and indication for study

   • Please indicate on the report if a view cannot be obtained

Imaging: Subcostal views

   • 2D subcostal transverse with sweeps through right & left pleural spaces and heart ⋀°

   • Subcostal frontal or LAO sweep, entire heart

   • Color Doppler subcostal frontal or LAO of atrial septum

   • Subcostal sagittal with color over DAo
   ○ Pulse wave Doppler descending aorta

Imaging: Apical

   • 2D 4 chamber view, no sweep

   • Repeat with focus on RV to include free wall, if necessary
   ○ For effusions: 2D of right heart for RA or RV collapse

   • 2D 4 chamber sweep all the way posterior and anterior

   • M-mode for TAPSE

   • Color Doppler over tricuspid valve, with sweep
   ○ Pulse wave Doppler at leaflet tips
   ○ Continuous wave of TR jet, if present
   ○ For effusions: TV inflow with lowest sweep speed, respiratory tracing on

   • Color Doppler over mitral valve, with sweep
   ○ Pulse wave at leaflet tips
   ○ For effusions: MV inflow with lowest sweep speed, respiratory tracing on

   • 2D 2-chamber view, no sweep (turn counterclockwise 90°)

   • 2D 5-chamber view, no sweep
   ○ Color Doppler over LV outflow tract

Imaging: Parasternal long axis ⋀°

   • 2D reference view at aortic valve, no sweep
   • 2D sweep to tricuspid & pulmonary valves
   • For effusions: include 2D clip to assess for diastolic collapse of RV free wall

Imaging: Parasternal short axis ⋀°

   • 2D reference view at aortic valve en face, no sweep

   • 2D sweep from base to apex

   • 2D short-axis view at papillary muscles, no sweep

   • M-mode for shortening fraction

   • Color Doppler over pulmonary valve

Imaging: Additional views

   • If parasternal short windows are poor, obtain short-axis of ventricles from subcostal LAO or sagittal

   • If RV pressure estimate by TR jet elevated, transition to PH protocol

   • Evaluate valves from multiple views if abnormality or insufficiency noted
*

The wedges with adjacent dot indicate our laboratory standard for image orientation. ECG: electrocardiogram, BP: blood pressure, 2D: two-dimensional, LAO: left anterior oblique, DAo: descending aorta, RV: right ventricle, RA: right atrium, TAPSE: tricuspid annular plane systolic excursion, TR: tricuspid regurgitation, TV: tricuspid valve, MV: mitral valve, LV: left ventricle, PH: pulmonary hypertension

The first portion of the protocol stipulated important information to be included in the report, including notation of the height, weight, and blood pressure as well as a proper indication for the study10. The indication was expected to include the original cardiac diagnosis, any prior surgeries or procedures, and the immediate clinical concern prompting the need for the study during off hours (proper indication). The cardiology attending ordering the study also was required to be listed by name on the report (i.e., not “CICU” or “Consult Service”) to allow for ease in communication of important findings. The second portion of the protocol listed the images to be obtained from each echocardiographic view (e.g., subcostal, apical, parasternal view), including M-mode and color and spectral Doppler acquisitions as well as methods to use to quantify ventricular function.

A quick reference card detailing the above information was created. In September 2018, the protocol and quick reference card were distributed to fellows via e-mail, and the protocol was available on the fellows’ shared hard drive for reference. The purpose of the protocol was explained to the fellows in person with discussion of the requirements, and any questions were answered. All of the required views and techniques had been taught during the preceding echocardiography boot camp as well as during standard echocardiography rotations. However, all of these components had not previously been compiled into a standard imaging protocol.

The study was reviewed by the institutional review board and judged to be quality improvement and standard educational practice, not meeting criteria for human subjects research. Thus, informed consent was waived for both fellows and the patients whose echocardiograms were reviewed. The study was conducted in accordance with ethical guidelines and patient privacy protections.

Study Population (Fellow-Performed Echocardiograms):

TTEs were eligible for inclusion if they were (1) assessments of ventricular function or pericardial effusion and (2) performed by first-year fellows without sonographer supervision. Exclusion criteria were (1) early study termination due to patient instability, (2) mechanical circulatory support, or (3) use of a different standardized imaging protocol (including those for a complete assessment of congenital heart disease, pulmonary hypertension, or the full postoperative assessment of a particular surgical repair). Pre-intervention baseline data were obtained from fellow-imaged TTEs from October through January of fellowship years 2015–2017. Post-intervention data were collected from fellow-imaged TTEs from October through January of the 2018 fellowship year. These months were selected as these were the first four months of each fellowship year during which first-year fellows took overnight and weekend call.

All such studies were reviewed for demographic reporting completeness; a total of 292 TTEs met these criteria (226 before and 66 after the protocol intervention). In order to limit the imaging review to those studies where all requested images could reasonably have been obtained, additional exclusion criteria were applied prior to image evaluation: (1) single-ventricle heart disease, (2) systemic right ventricles, and (3) unrepaired complex congenital heart disease (e.g., common atrioventricular canal or tetralogy of Fallot). These criteria left 235 TTEs (186 pre- and 49 post-intervention) for a full image review.

Study Measure (Echocardiogram Grading Rubric):

In order to judge whether demographic reporting and image acquisition improved after the start of the standardized protocol, echocardiograms were reviewed and compared against a standardized grading rubric (Tables II–III). For demographic criteria, information was extracted from the clinical echocardiographic reports by two authors (BRW and DYH); by its nature, this analysis could not be blinded. Echocardiograms were graded with a yes/no score as to whether they included (1) a height, (2) a weight, (3) a blood pressure, (4) the requesting cardiologist by name (e.g., “CICU” or “consult team” was not considered sufficient), (5) the original cardiac diagnosis (e.g., “tetralogy of Fallot” or “previously normal”), (6) any cardiac surgeries (e.g., “s/p repair” or “s/p Glenn”), and (7) the proper indication for the TTE (e.g., “hypotension”). In addition to individual component scores, a demographic completeness score was calculated as the total number of requested components completed (i.e., scores could range from 0 to 7).
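The seven-item demographic completeness score described above amounts to a checklist tally. A minimal sketch follows; the field names are illustrative placeholders, not taken from the study's actual report database.

```python
# Sketch of the 0-7 demographic completeness score.
# Field names below are hypothetical, chosen to mirror Table II.
DEMOGRAPHIC_CRITERIA = [
    "height", "weight", "blood_pressure", "ordering_physician",
    "cardiac_diagnosis", "prior_surgeries", "study_indication",
]

def demographic_score(report: dict) -> int:
    """Count how many of the seven yes/no criteria a report satisfies."""
    return sum(bool(report.get(field)) for field in DEMOGRAPHIC_CRITERIA)

# Example: a report missing only the ordering physician scores 6 of 7.
example = {"height": True, "weight": True, "blood_pressure": True,
           "ordering_physician": False, "cardiac_diagnosis": True,
           "prior_surgeries": True, "study_indication": True}
print(demographic_score(example))  # 6
```

Because each criterion is binary, the same tally generalizes directly to the 23-view imaging completeness score used later in the Methods.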

Table II:

The echocardiographic demographic grading rubric.

Demographic Reporting Criteria
Was a height listed?
Was a weight listed?
Was a blood pressure listed?
Was the ordering physician listed by name?
Was the original cardiac diagnosis listed appropriately?
Were prior surgeries listed appropriately?
Was the immediate study indication listed appropriately?

Table III:

The echocardiographic imaging grading rubric with inter-reader reliability (number of raters: 4, number of studies: 20).

Gwet’s AC1 Percent Agreement
Subcostal Views
Were subcostal views obtained? 0.881 91.2%
Was an effusion sweep (abdominal transverse) performed? 0.875 91.2%
Were frontal or LAO views performed? 0.843 92.1%
Was the atrial septum interrogated by color Doppler? 0.942 94.7%
Was the descending aorta interrogated by spectral Doppler? 0.857 91.2%
Apical Views
Were apical views obtained? 0.973 97.4%
Was a 2D four-chamber view of the ventricles (no sweep) performed? 0.890 90.4%
Was a 2D four-chamber sweep of the ventricles performed? 0.650 82.5%
Was an apical two-chamber view performed? 0.807 90.4%
Was Simpson’s for LVEF performed from apical four-chamber? 0.950 97.4%
Was Simpson’s for LVEF performed from apical two-chamber? 0.956 97.4%
Was M-mode for TAPSE performed? 0.933 96.5%
Was color Doppler of the tricuspid valve performed? 0.963 97.4%
Was spectral Doppler of the tricuspid valve performed? 0.884 92.1%
Was color Doppler of the mitral valve performed? 0.970 97.4%
Was spectral Doppler of the mitral valve performed? 0.875 91.2%
Was an apical 2D view of left ventricular outflow tract performed? 0.479 73.7%
Was the left ventricular outflow tract interrogated by color Doppler? 0.868 92.1%
Parasternal Long and Short Axis
Were parasternal long axis views obtained? 1.000 100%
Was a 2D parasternal long axis of the left ventricle (no sweep) performed? 0.809 89.5%
Was a 2D parasternal long axis sweep performed? 0.396 69.3%
Were parasternal short axis views obtained? 1.000 100%
Was a 2D parasternal short axis view performed at the aortic valve (no sweep)? 0.836 91.2%
Were appropriate views performed of cross-sectional ventricular function (no sweeps)? 0.799 86.0%
Was a 2D parasternal short axis sweep of the heart performed? 0.518 74.6%
Was M-mode for shortening fraction performed? 0.955 96.5%
Was the right ventricular outflow tract interrogated by color Doppler (either parasternal short or long)? 0.653 82.5%

All of the echocardiograms were reviewed for imaging completeness by one reader (BRW), with a subset of twenty randomly selected echocardiograms reviewed by all four authors for the purposes of inter-reader reliability. Echocardiographic image acquisition was similarly judged with a binary yes/no score for whether each requested echocardiographic view was obtained. This assessment was based on whether a particular image was attempted; no judgment was made about image quality. This methodology is consistent with the guidelines of the Intersocietal Accreditation Commission and American Society of Echocardiography8,11 on how components should be recorded. An echocardiographic imaging completeness score was calculated as the total number of requested views obtained (the maximum possible score was 23).

Statistical Analysis:

Inter-reader reliability on the twenty echocardiograms reviewed by all four raters was calculated using percent agreement and Gwet’s AC1 statistic12. Gwet’s AC1 was chosen due to its superior performance compared to Light’s kappa in the setting of high inter-rater agreement13,14. Excellent agreement was defined as an AC1 greater than 0.80, and good agreement as an AC1 greater than 0.60.

For each individual metric (demographic and imaging), the percent of compliant echocardiograms was calculated both pre- and post-intervention. These percentages were compared using Pearson’s chi-squared test. Total compliance scores (both demographic and imaging) are presented as medians and interquartile ranges (IQR). As the distributions of these scores were not normal, they were compared between the pre- and post-intervention groups using the Wilcoxon rank-sum test. For all statistical tests, the threshold for significance was set at p≤0.05.
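Both comparisons described above can be run with standard SciPy routines. The sketch below is illustrative only: the 2x2 counts are hypothetical values chosen to approximate the reported 45% pre (of 186) vs. 80% post (of 49) TAPSE rates, and the completeness scores are invented; SciPy availability is assumed.

```python
import numpy as np
from scipy.stats import chi2_contingency, ranksums

# Pearson chi-squared on a 2x2 table for one binary metric.
# Rows: pre/post intervention; columns: compliant / not compliant.
# Counts are hypothetical (approximately 45% of 186 vs. 80% of 49).
table = np.array([[84, 102],
                  [39, 10]])
chi2, chi2_p, dof, expected = chi2_contingency(table, correction=False)

# Completeness scores are ordinal and non-normal, so compare medians
# with the Wilcoxon rank-sum test. Scores below are hypothetical.
pre_scores = [13, 9, 17, 12, 14, 10, 16, 8]
post_scores = [17, 15, 19, 13, 18, 16]
rank_stat, rank_p = ranksums(pre_scores, post_scores)

print(f"chi-squared p = {chi2_p:.3g}, rank-sum p = {rank_p:.3g}")
```

With a difference as large as 45% vs. 80% and these group sizes, the chi-squared p-value falls well below 0.001, consistent with the magnitude of effects reported in the Results.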

Results:

The echocardiographic imaging grading rubric demonstrated high inter-reader reliability (Table III). Inter-reader reliability was lower for judgments on whether or not complete sweeps were performed.

Compliance with demographic reporting was high both pre- and post-intervention (Figure 1), but overall improvement following the educational intervention was demonstrated (median: 6, IQR: 5–7 increasing to median: 7, IQR: 6–7, p<0.001). A particular focus of our echocardiography laboratory pre-intervention was inconsistency in including the name of the ordering cardiologist, as its presence improves the reading cardiologist’s ability to ensure critical information is directly communicated. We saw marked improvement in this field: prior to the intervention, this information was included only 45% of the time, but reporting increased to 83% post-intervention (p<0.001). Almost all components of demographic reporting improved post-intervention (Table IV) including recording the blood pressure (88% vs 97%, p=0.04) and listing the original cardiac diagnosis (82% vs. 94%, p=0.016). Inclusion of the immediate indication for the study was unchanged with the intervention.

Figure 1.

Distribution of demographic completeness scores before and after the educational intervention.

Table IV:

Frequency with which components of the ventricular function protocol were obtained by first-year pediatric cardiology fellows both before and after the educational intervention.

Demographic Criteria Pre-Intervention (N=226) Post-Intervention (N=66) p-value
Notate weight 98% 100% 0.28
Notate height 85% 94% 0.06
Notate blood pressure 88% 97% 0.04
Notate ordering physician 45% 83% <0.001
List Diagnosis 82% 94% 0.016
List Surgeries 83% 92% 0.06
Proper Indication 82% 80% 0.77
Echocardiographic Image Review Pre-Intervention (N=186) Post-Intervention (N=49) p-value
Subcostal Views
Any 90% 94% 0.38
Effusion Sweep 87% 94% 0.19
Frontal or LAO Sweep 55% 69% 0.08
Color Doppler of the atrial septum 15% 22% 0.22
Spectral Doppler of the descending aorta 27% 51% 0.001
Apical Views
Any 97% 100% 0.2
Four-Chamber View 93% 96% 0.46
Four-Chamber complete sweep 40% 55% 0.05
Two-Chamber View 33% 67% <0.001
Simpson’s: Four-Chamber 37% 61% 0.002
Simpson’s: Two-Chamber 23% 44% <0.001
TAPSE 45% 80% <0.001
Tricuspid valve color Doppler 83% 86% 0.69
Tricuspid valve pulse wave Doppler 78% 84% 0.38
Mitral valve color Doppler 87% 90% 0.61
Mitral valve pulse wave Doppler 74% 82% 0.25
Left ventricular outflow tract 42% 55% 0.1
Left ventricular outflow tract color Doppler 54% 69% 0.05
Parasternal Long and Short Axis
Any Long Axis 75% 82% 0.32
Left ventricle long axis 62% 73% 0.13
Long axis complete sweep 47% 41% 0.46
Any Short Axis 92% 92% 0.88
Short axis, base 32% 49% 0.03
Short axis, mid-ventricular 86% 86% 0.96
Short axis complete sweep 64% 71% 0.33
M-Mode for shortening fraction 81% 82% 0.94
Right ventricular outflow tract color Doppler 41% 53% 0.13

There was significant practice variation pre-intervention in the views obtained in ventricular function assessments (Table IV). Eleven views were obtained less than 50% of the time, including color Doppler assessment of the atrial septum, apical two-chamber views, and complete assessments of the right and left ventricular outflow tracts. Subcostal, apical, and parasternal short axis views were each included in over 90% of echocardiograms; however, only 75% of echocardiograms included parasternal long axis views.

The number of echocardiographic views increased with the intervention from a median of 13 views (IQR: 9–17) pre-intervention to a median of 17 views (IQR 13–19) post-intervention (p<0.001, Figure 2). However, there continued to be unintended variation, particularly with color Doppler of the atrial septum and complete parasternal long axis sweeps.

Figure 2.

Distribution of echocardiographic imaging completeness scores before and after the educational intervention.

A particular area of improvement after protocol introduction was first-year fellows’ performance of quantitative evaluation of systolic function (Table IV). There were statistically significant increases in the use of Simpson’s method for left ventricular ejection fraction in both the four-chamber (40% vs. 67%, p<0.001) and two-chamber (33% vs. 67%, p<0.001) views, as well as in the use of tricuspid annular plane systolic excursion (TAPSE, 45% vs. 80%, p<0.001) to evaluate right ventricular systolic function. The intervention also increased the use of Doppler interrogation, specifically spectral Doppler of the descending aorta (27% vs. 51%, p=0.001) and color Doppler of the left ventricular outflow tract (54% vs. 69%, p=0.05).

Conclusion:

There is unintended practice variation amongst newly independent first-year pediatric cardiology fellows when performing echocardiograms to assess systolic function. The introduction of the standardized imaging protocol reduced practice variation and improved the use of Doppler and quantitative methods to assess systolic function. Despite these improvements, practice variation remains. However, the overall post-intervention rates of image acquisition compare favorably to prior studies of practice variation in image acquisition15,16, in which about 60–70% of requested views were obtained by experienced echocardiographic sonographers and anesthesia attendings, respectively. We would not expect perfect compliance: because of limitations such as postoperative dressings and poor ultrasound windows, not every view can be obtained on every patient. However, per guidelines, specific documentation should be provided when a view cannot be obtained.

While national professional society guidelines11 and accreditation standards8 for pediatric echocardiography recommend that images be acquired with standardized protocols when possible, there are few studies on whether standardized protocols improve echocardiographic quality. The topic of image quality was studied by Parthiban et al.15, who developed standardized imaging protocols for postoperative assessment of multiple congenital cardiac surgeries (tetralogy of Fallot repair, arterial switch operation, Glenn, and Fontan). Echocardiograms were judged using a binary score indicating whether or not a structure was adequately imaged (note that this metric differs slightly from our own in that the requirement was that a structure was imaged, from any view, not that a specific view was performed). This quality improvement intervention (which, in addition to standardized protocols, also included an increase in the use of sedation) led to an increase in image quality. Interestingly, two studies of the introduction of a standardized imaging protocol for adult transesophageal echocardiograms have shown opposing results16,17. In Graham et al.16, TEEs performed by anesthesiologists did not show any change in the number of views acquired. Conversely, Chen et al.17 developed a TEE grading system and provided ongoing feedback by e-mail, which improved compliance by cardiology attendings. While prior studies have focused on imaging by experienced attendings and sonographers, our results show that a standardized protocol can serve as a roadmap for early learners who must perform studies in unsupervised environments without the knowledge necessary to prioritize image acquisition.

There has been little research into practice variation in image acquisition in pediatric echocardiography. More attention has been paid to differences in reporting, where quality improvement methodologies have been shown to decrease variation18–20, especially with regard to quantitative reporting of function21,22. While the variety of congenital heart disease means that image acquisition often needs to be individually tailored, standards have been developed for a complete structural diagnostic assessment23 as well as for some individual lesions24. A recent survey of pediatric echocardiography laboratory directors demonstrated substantial variability in the protocols and methods used25. Only 55% of centers had written protocols for “limited echocardiograms”. Quantitative methods for LV ejection fraction were routinely used in less than half of centers; quantitative assessment of RV function was routine in less than 60% of centers. While our echocardiography laboratory had protocols for a first echocardiographic study, evaluation of specific repaired congenital heart defects, assessment of pulmonary hypertension, and a complete evaluation of ventricular performance for outpatient monitoring of cardiomyopathy and heart transplant patients, no standardized protocol existed for acute evaluation of function and effusions in the inpatient setting. This deficiency persisted despite this being one of the most common indications for echocardiography, especially among studies performed by cardiology fellows.

Our study demonstrates that image acquisition is highly variable and that the development and implementation of a clinically-tailored standardized protocol can reduce unintended practice variation and improve the quality of information echocardiographers can provide to the ordering team. Specifically, substantial improvements were seen in areas of particular clinical relevance. An increase in the listing of the ordering physician allows timely communication of important results. An increase in the use of quantitative functional assessment should improve detection of longitudinal changes in function. Additionally, we would hypothesize that reducing variability in imaging and the quantitative methods used should in turn reduce reporting practice variation.

As a quality improvement project, fellows could not be randomized to receive the standardized protocol; the present study is a pilot intervention with data from a single plan-do-study-act (PDSA) cycle. Each individual fellow enters fellowship with different levels of experience and has different exposures when on call. While the echocardiography “boot camp” can mitigate some of these differences, variation in skills will remain. Additionally, while all fellows in this study (pre- and post-intervention) had a fellowship orientation that included echocardiography training, the nature of this introduction varied over the course of the study period as we constantly worked to improve the quality of the delivered education. Thus, some of the observed improvements may be due to other educational changes and not solely due to the standardized protocol.

The results of our study, showing the successful implementation of standardized protocols as part of fellow education, add to the growing literature on methods to improve echocardiography education for pediatric cardiology fellows. Continued improvement in echocardiography education will likely require a combination of standardized reporting and imaging protocols, simulator training26,27, and observed, formalized testing28–30. Future work in this area will focus on developing educational strategies to sustain these improvements over the course of fellowship.

Acknowledgments

Funding: Dr. White is funded by the National Heart, Lung, and Blood Institute grant T32HL007915 and the Children’s Hospital of Philadelphia Research Institute.

References:

  • 1. Srivastava S, Printz BF, Geva T, et al.: Task force 2: pediatric cardiology fellowship training in noninvasive cardiac imaging. J Amer Coll Cardiol 2015;66:687–98. 10.1016/j.jacc.2015.03.010
  • 2. Maskatia SA, Altman CA, Morris SA, et al.: The echocardiography “boot camp”: a novel approach in pediatric cardiovascular imaging education. J Amer Soc Echocardiogr 2013;26:1187–92. 10.1016/j.echo.2013.06.001
  • 3. Ceresnak SR, Axelrod DM, Motonaga KS, et al.: Pediatric cardiology boot camp: description and evaluation of a novel intensive training program for pediatric cardiology trainees. Pediatr Cardiol 2016;37:834–44. 10.1007/s00246-016-1357-z
  • 4. Allan CK, Tannous P, DeWitt E, et al.: A pediatric cardiology fellowship boot camp improves trainee confidence. Cardiol Young 2016;16:1514–21. 10.1017/s1047951116002614
  • 5. Ceresnak SR, Axelrod DM, Sacks LD, et al.: Advances in pediatric cardiology boot camp: boot camp training promotes fellowship readiness and enables retention of knowledge. Pediatr Cardiol 2017;38:631–40. 10.1007/s00246-016-1560-y
  • 6. Burstein DS, Mille FK, Cohen MS, et al.: A new expanded approach to pediatric cardiology fellowship orientation. Prog Pediatr Cardiol 2018;50:23–8. 10.1016/j.ppedcard.2018.06.001
  • 7. Maskatia SR, Cabrera AG, Morris SA, et al.: The pediatric echocardiography boot camp: four-year experience and impact on clinical performance. Echocardiography 2017;34:1486–94. 10.1111/echo.13649
  • 8. Intersocietal Accreditation Commission: The IAC standards and guidelines for pediatric echocardiography accreditation. 2018.
  • 9. Kirk R, Dipchand AI, Rosenthal DN, et al.: The international society for heart and lung transplantation guidelines for the management of pediatric heart failure: executive summary. J Heart Lung Transplant 2014;33:888–909. 10.1016/j.healun.2014.06.002
  • 10. Kossaify A, Grollier G: Echocardiography practice: insights into appropriate clinical use, technical competence and quality improvement program. Clin Med Insights Cardiol 2014;8:1–7. 10.4137/cmc.s13645
  • 11. Picard MH, Adams D, Bierig SM, et al.: American society of echocardiography recommendations for quality echocardiography laboratory operations. J Amer Soc Echocardiogr 2011;24:1–10. 10.1016/j.echo.2010.11.006
  • 12. Gwet KL: Computing inter-rater reliability and its variance in the presence of high agreement. Br J Math Stat Psychol 2008;61:29–46.
  • 13. Wongpakaran N, Wongpakaran T, Wedding D, et al.: A comparison of Cohen’s kappa and Gwet’s AC1 when calculating inter-rater reliability coefficients: a study conducted with personality disorder samples. BMC Med Res Methodol 2013;13:61. 10.1186/1471-2288-13-61
  • 14. Walsh P, Thornton J, Asato J, et al.: Approaches to describing inter-rater reliability of the overall clinical appearance of febrile infants and toddlers in the emergency department. PeerJ 2014;2:651. 10.7717/peerj.651
  • 15. Parthiban A, Levine JC, Nathan M, et al.: Implementation of a quality improvement bundle improves echocardiographic imaging after congenital heart surgery in children. J Amer Soc Echocardiogr 2016;29:144–50. 10.1016/j.echo.2016.09.002
  • 16. Graham JM, Sidebotham D, Story DA, et al.: Adequate images in intraoperative transoesophageal echocardiography: a quality improvement project. Anaesth Intensive Care 2013;42:640–8.
  • 17. Chen T, Carlson S, Cheney A, et al.: Performance of comprehensive transesophageal echocardiography: quality improvement through educational intervention. J Amer Soc Echocardiogr 2019; in press. 10.1016/j.echo.2019.03.019
  • 18. Jone PN, Gould R, Barrett C, et al.: Data-driven quality improvement project to increase the value of the congenital echocardiography report. Pediatr Cardiol 2018;39:726–30. 10.1007/s00246-018-1812-0
  • 19. Statile C, Statile A, Brown J, et al.: Using improvement methodology to optimize echocardiography imaging of coronary arteries in children. J Amer Soc Echocardiogr 2016;29:247–52. 10.1016/j.echo.2015.08.019
  • 20. Daubert MA, Yow E, Barnhart HX, et al.: Quality improvement implementation: improving reproducibility in the echocardiography laboratory. J Amer Soc Echocardiogr 2015;28:959–68. 10.1016/j.echo.2015.03.004
  • 21. Johri AM, Picard MH, Newell J, et al.: Can a teaching intervention reduce interobserver variability in LVEF assessment? JACC Cardiovasc Imaging 2011;4:821–9. 10.1016/j.jcmg.2011.06.004
  • 22. Johnson TV, Symanski JD, Patel SR, et al.: Improvement in the assessment of diastolic function in a clinical echocardiography laboratory following implementation of a quality improvement initiative. J Amer Soc Echocardiogr 2011;24:1169–79. 10.1016/j.echo.2011.08.018
  • 23.Lai WW, Geva T, Shirali GS, et al. : Guidelines and standards for performance of a pediatric echocardiogram: a report from the task force of the pediatric council of the American society of echocardiography. J Amer Soc Echocardiogr 2006;19:1413–30. 10.1016/j.echo.2006.09.001 [DOI] [PubMed] [Google Scholar]
  • 24.Silvestry FE, Cohen MS, Armsby LB, et al. : Guidelines for the echocardiographic assessment of atrial septal defect and patent foramen ovale: from the American society of echocardiography and society for cardiac angiography and interventions. J Amer Soc Echocardiogr 2015;28:910–58. 10.1016/j.echo.2015.05.015 [DOI] [PubMed] [Google Scholar]
  • 25.Camarda JA, Patel A, Carr MR, et al. : Practice variation in pediatric echocardiography laboratories. Pediatr Cardiol 2019;40:537–45. 10.1007/s00246-018-2012-7 [DOI] [PubMed] [Google Scholar]
  • 26.Weidenbach M, Razek V, Wild F, et al. : Simulation of congenital heart defects: a novel way of training in echocardiography. Heart 2009;95:636–41. http://10.1136/hrt.2008.156919 [DOI] [PubMed] [Google Scholar]
  • 27.Dayton JD, Groves AM, Glickstein JS, et al. : Effectiveness of echocardiography simulation training for paediatric cardiology fellows in CHD. Cardiol Young 2018;28:611–5. 10.1017/s104795111700275x [DOI] [PubMed] [Google Scholar]
  • 28.Nair P, Siu SC, Sloggett CE, et al. : The assessment of technical and interpretative proficiency in echocardiography. J Amer Soc Echocardiogr 2006;19:924–31. 10.1016/j.echo.2006.01.015 [DOI] [PubMed] [Google Scholar]
  • 29.Hao M, Ippisch HM, Cook RS, et al. : Implementation of an objective testing system in noninvasive cardiac imaging for evaluation of pediatric cardiology fellows. J Amer Soc Echocardiogr 2007;20:1211–8. 10.1016/j.echo.2007.03.012 [DOI] [PubMed] [Google Scholar]
  • 30.Levine JC, Geva T, Brown DW: Competency testing for pediatric cardiology fellows learning transthoracic echocardiography: implementation, fellow experience, lessons learned. Pediatr Cardiol 2015;35:1700–11. 10.1007/s00246-015-1220-7 [DOI] [PubMed] [Google Scholar]