Editorial. 2019 Sep 11;12(9):e009727. doi: 10.1161/CIRCIMAGING.119.009727

Combining Artificial Intelligence With Human Insight to Automate Echocardiography

Paul Leeson,1 Andrew J Fletcher2
PMCID: PMC7099860  PMID: 31522554

See Article by Asch et al

A quick eyeball estimate of left ventricular function is often the first thing an echocardiographer or clinician makes, whether consciously or subconsciously, at the start of an examination.1 By the time they are writing their report, a precise, quantified measure will have replaced this estimate.2 Over time, M-mode quantification3 has been superseded by a series of new ultrasound technologies, to the point where 3D echocardiography is now accepted as providing measures as precise as cardiovascular magnetic resonance.4 So what next? Asch et al5 in this issue of Circulation: Cardiovascular Imaging propose that echocardiography may have reached a stage in life perhaps best described by Marcel Proust in his Remembrance of Things Past, when he realizes that ‘the real voyage of discovery is not in seeking new landscapes but in having new eyes’.6 The new eyes for echocardiography are those of a computer, trained using artificial intelligence (AI) methodology to mimic a human expert’s eye.

To develop an AI that can eyeball left ventricular function, the team had to give the computer some guidance. Based on the a priori observation that operators do not rely on volume measures when they estimate left ventricular function, they trained the computer to look for proportional, rather than absolute, changes in the size of the heart. When measured along 2 orthogonal axes within the apical long-axis views, these proportional changes can replace volume measures in the equations used to calculate ejection fraction.5 The approach overcomes several technical challenges that limit some other automated techniques,7,8 including the need to account for image scaling when generating measures and to identify borders in poor-quality images. The major limitation is that, because the computer makes no measures of left ventricular size, no volume or size measures are reportable with the technology and, as only 2 planes are used, unusual pathologies or regional variations are overlooked.
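The arithmetic behind this substitution can be illustrated with a toy calculation. This is a simplified sketch of the general idea only, not the authors’ algorithm: if ventricular volume is assumed to scale with the product of dimensions along orthogonal axes, ejection fraction can be written entirely in terms of systole-to-diastole dimension ratios, so absolute sizes, and with them image scaling, cancel out.

```python
def ef_from_proportional_changes(ratio_axis1, ratio_axis2, ratio_long=1.0):
    """Estimate ejection fraction from proportional (end-systolic /
    end-diastolic) dimension ratios, assuming volume scales with the
    product of dimensions measured along orthogonal axes.

    Because each input is a ratio, absolute calibration never enters:
    EF = (EDV - ESV) / EDV = 1 - ESV/EDV, and the constant relating
    dimensions to volume cancels in the quotient.
    """
    return 1.0 - (ratio_axis1 * ratio_axis2 * ratio_long)

# Toy example: each short-axis dimension shrinks to 70% of its
# diastolic value and the long axis to 90% at end systole:
ef = ef_from_proportional_changes(0.7, 0.7, 0.9)
print(round(ef, 3))  # 0.559, ie, an ejection fraction of about 56%
```

A heart that does not contract at all (all ratios 1.0) correctly yields an ejection fraction of zero, and nowhere does a millimetre or millilitre appear.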

After training on 50 000 real-world patients, the method was tested on scans from 99 new studies, which had also been analyzed by independent experts. The ability to identify patients with severe left ventricular dysfunction (ejection fraction <35%) was particularly impressive, achieving a sensitivity of 0.90 and a specificity of 0.92. The overall performance across a range of ejection fraction levels was also good, but with apparently large limits of agreement of over 10%. This degree of variation from a reference standard was similar to that recorded for clinical readers. If most of the time we are eyeballing within 5% to 10% of the right answer, is that acceptable? Some of this variation may also represent an experimental problem of comparing 2 methods against each other when neither generates gold standard measures. Which is closer to the truth, the AI method or the reference method?
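The headline statistics here are standard ones, and it can help to see how they are computed. The sketch below is illustrative only, with hypothetical paired readings rather than the study data: sensitivity and specificity for the <35% cut point treat the expert reading as reference, and the Bland-Altman 95% limits of agreement are the mean difference ± 1.96 SD of the differences.

```python
import statistics

def sensitivity_specificity(ai_ef, expert_ef, threshold=35.0):
    """Agreement on detecting severe dysfunction (EF < threshold),
    treating the expert reading as the reference standard."""
    pairs = list(zip(ai_ef, expert_ef))
    tp = sum(a < threshold and e < threshold for a, e in pairs)
    fn = sum(a >= threshold and e < threshold for a, e in pairs)
    tn = sum(a >= threshold and e >= threshold for a, e in pairs)
    fp = sum(a < threshold and e >= threshold for a, e in pairs)
    return tp / (tp + fn), tn / (tn + fp)

def limits_of_agreement(ai_ef, expert_ef):
    """Bland-Altman bias and 95% limits of agreement (bias +/- 1.96 SD)."""
    diffs = [a - e for a, e in zip(ai_ef, expert_ef)]
    bias = statistics.mean(diffs)
    spread = 1.96 * statistics.stdev(diffs)
    return bias, bias - spread, bias + spread

# Hypothetical paired EF readings (%), for illustration only:
ai = [30.0, 40.0, 60.0, 25.0, 55.0]
expert = [28.0, 42.0, 58.0, 36.0, 57.0]
print(sensitivity_specificity(ai, expert))
print(limits_of_agreement(ai, expert))
```

Note that wide limits of agreement can coexist with excellent classification at a single cut point, which is exactly the pattern the study reports.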

Interestingly, on closer examination, there are subtle differences that separate AI eyeballing from human eyeballing. The clinician tends to identify severe or normal function confidently but finds intermediate or borderline cases more difficult. In the correlation and Bland-Altman plots, clinical readers appear to report measures closer to the upper or lower limits and to cluster around cut points such as 35% and 55%. This leads to narrower bounds of variation below 35% and above 55%, and a tendency for intermediate ejection fractions to be classed higher or lower, apparently at random. Severe is obvious; normal is obvious; in the middle, it is a split decision. In contrast, the AI has more uniform bounds of error across the range of ejection fractions but, like the clinician, trends toward underestimating lower ejection fractions and overestimating higher ones.

As a source of immediate feedback on cardiac function at the point of care, possibly with handheld devices, this technology appears attractive. However, 2 apical views are required, and some adaptation and retraining may need to be considered to take account of how echocardiography is used in emergency departments or the operating theater, where the first estimate of ejection fraction is often from a single view, typically the parasternal or transgastric short axis.9 There is also the question of whether the echocardiography community, after years of increasing accuracy of ejection fraction measures, and with AI-driven tools now available to auto-contour and measure function,8,10 is ready to take an apparent step back to automated eyeballing. A novice needs training on only 50 images to achieve acceptable eyeballed estimates of ejection fraction11 and yet, despite many publications showing its accuracy,12 everyone still expects a quantified, contoured measure to end up on the report. Increasingly, this report also needs to include detailed regional assessment and strain measures.2

But there is an elegance to this work. This is not another description of brute-force, black-box AI applied to large-scale data, but the use of clever human observation and insight into how an operator looks at images to carefully direct machine learning. The tangible excitement in this article is that, because of this approach, a computer that has never seen an image before can now take an echocardiogram, look at 2 views and, without any user input or need to contour the ventricle, estimate what it thinks is the ejection fraction with astonishing precision.

Footnotes

The opinions expressed in this article are not necessarily those of the editors or of the American Heart Association.

Disclosures

Dr Leeson acknowledges current grant support related to medical imaging from the British Heart Foundation, Wellcome Trust, National Institute for Health Research, and Lantheus Medical Imaging. Dr Leeson is a stockholder and founder of Ultromics, a medical imaging artificial intelligence company. The other author reports no conflicts.

References

1. Rich S, Sheikh A, Gallastegui J, Kondos GT, Mason T, Lam W. Determination of left ventricular ejection fraction by visual estimation during real-time two-dimensional echocardiography. Am Heart J. 1982;104:603–606. doi: 10.1016/0002-8703(82)90233-2
2. Lang RM, Badano LP, Mor-Avi V, Afilalo J, Armstrong A, Ernande L, Flachskampf FA, Foster E, Goldstein SA, Kuznetsova T, Lancellotti P, Muraru D, Picard MH, Rietzschel ER, Rudski L, Spencer KT, Tsang W, Voigt JU. Recommendations for cardiac chamber quantification by echocardiography in adults: an update from the American Society of Echocardiography and the European Association of Cardiovascular Imaging. J Am Soc Echocardiogr. 2015;28:1–39.e14. doi: 10.1016/j.echo.2014.10.003
3. Pombo JF, Troy BL, Russell RO Jr. Left ventricular volumes and ejection fraction by echocardiography. Circulation. 1971;43:480–490. doi: 10.1161/01.cir.43.4.480
4. Dorosz JL, Lezotte DC, Weitzenkamp DA, Allen LA, Salcedo EE. Performance of 3-dimensional echocardiography in measuring left ventricular volumes and ejection fraction: a systematic review and meta-analysis. J Am Coll Cardiol. 2012;59:1799–1808. doi: 10.1016/j.jacc.2012.01.037
5. Asch FM, Poilvert N, Abraham T, Jankowski M, Cleve J, Adams M, Romano N, Hong H, Mor-Avi V, Martin RP, Lang RM. Automated echocardiographic quantification of left ventricular ejection fraction without volume measurements using a machine learning algorithm mimicking a human expert. Circ Cardiovasc Imaging. 2019;12:e009303. doi: 10.1161/CIRCIMAGING.119.009303
6. Proust M. Remembrance of Things Past: La Prisonnière. New York: Random House; 1924.
7. Medvedofsky D, Mor-Avi V, Byku I, Singh A, Weinert L, Yamat M, Kruse E, Ciszek B, Nelson A, Otani K, Takeuchi M, Lang RM. Three-dimensional echocardiographic automated quantification of left heart chamber volumes using an adaptive analytics algorithm: feasibility and impact of image quality in nonselected patients. J Am Soc Echocardiogr. 2017;30:879–885. doi: 10.1016/j.echo.2017.05.018
8. Alsharqi M, Woodward WJ, Mumith JA, Markham DC, Upton R, Leeson P. Artificial intelligence and echocardiography. Echo Res Pract. 2018;5:R115–R125. doi: 10.1530/ERP-18-0056
9. Leeson P, Augustine D, Mitchell ARJ, Becher H. Echocardiography. 2nd ed. Oxford: Oxford University Press; 2012.
10. Dey D, Slomka PJ, Leeson P, Comaniciu D, Shrestha S, Sengupta PP, Marwick TH. Artificial intelligence in cardiovascular imaging: JACC state-of-the-art review. J Am Coll Cardiol. 2019;73:1317–1335. doi: 10.1016/j.jacc.2018.12.054
11. Lee Y, Shin H, Kim C, Lee I, Choi HJ. Learning curve-cumulative summation analysis of visual estimation of left ventricular function in novice practitioners: a STROBE-compliant article. Medicine. 2019;98:14. doi: 10.1097/MD.0000000000015191
12. Gudmundsson P, Rydberg E, Winter R, Willenheimer R. Visually estimated left ventricular ejection fraction by echocardiography is closely correlated with formal quantitative methods. Int J Cardiol. 2005;101:209–212. doi: 10.1016/j.ijcard.2004.03.027

Articles from Circulation. Cardiovascular Imaging are provided here courtesy of Wolters Kluwer Health
