Open Heart
. 2021 Dec 23;8(2):e001874. doi: 10.1136/openhrt-2021-001874

AI and the cardiologist: when mind, heart and machine unite

Antonio D'Costa, Aishwarya Zatale
PMCID: PMC8705226  PMID: 34949649

Abstract

Artificial intelligence (AI) and deep learning have made much headway in the consumer and advertising sector, affecting not only how and what people purchase but also behaviour and cultural attitudes. AI is poised to influence nearly every aspect of our being, and the field of cardiology is no exception. This paper aims to brief the clinician on the advances in AI and machine learning in the field of cardiology and their applications, while also recognising the potential for future development in these two mammoth fields. With the advent of big data, new opportunities are emerging to build more accurate AI tools that will not only directly aid the clinician but also allow nations to provide better healthcare to their citizens.

Keywords: cardiac imaging techniques, arrhythmias, cardiac, echocardiography

Introduction

Despite significant advances in diagnosis and treatment, cardiovascular disease (CVD) remains the most common cause of morbidity and mortality worldwide, accounting for approximately one-third of annual deaths.1 2 Early and accurate diagnosis is key to improving CVD outcomes, and much of this can be achieved through regular screening. Although screening programmes for niche diseases can at present be cost inefficient, artificial intelligence (AI) has redefined what our present cardiovascular health monitoring tools are capable of: from using ECGs to detect left ventricular systolic dysfunction, to cardiovascular risk prediction with accuracies higher than a mammogram.3 In this paper, we briefly review recent advances in the field and discuss how AI could not only give birth to new technology but also expand the capabilities of the tools currently available to us.

A brief introduction to AI principles for the clinician

Machine learning and deep learning

AI has under it a few major subsets, two of them being machine learning (ML) and deep learning (DL).4 ML is an application of AI comprising algorithms that parse data, learn from that data, and then go on to make informed decisions based on what they have interpreted from it. In the music streaming industry, for example, ML can analyse the bulk of songs a user listens to, compare it for similarities with other users of the same service, and then provide suggestions drawn from listeners with similar musical taste.
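The streaming example above can be sketched as a toy similarity-based recommender. Everything here is hypothetical and purely illustrative: the play counts, the function names and the single-neighbour strategy are assumptions, not any real service's algorithm.

```python
import numpy as np

# Hypothetical play-count vectors: rows = users, columns = songs.
plays = np.array([
    [5, 0, 3, 1],   # user 0
    [4, 0, 2, 0],   # user 1 (similar taste to user 0)
    [0, 6, 0, 4],   # user 2
], dtype=float)

def cosine_similarity(a, b):
    """Cosine of the angle between two listening vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def recommend(user, plays):
    """Suggest the unheard song best liked by the most similar other user."""
    sims = [cosine_similarity(plays[user], plays[v]) if v != user else -1.0
            for v in range(len(plays))]
    neighbour = int(np.argmax(sims))          # most similar listener
    unheard = np.where(plays[user] == 0)[0]   # songs this user has not played
    return int(unheard[np.argmax(plays[neighbour, unheard])])

print(recommend(0, plays))  # user 1 is the nearest neighbour -> suggests song 1
```

Real recommenders learn from millions of users and add many refinements, but the core idea — compare usage vectors, borrow preferences from similar users — is the same.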

A subfield of ML is what is known as DL (figure 1). DL is more akin to how we as humans ‘think’, working through what is known as a ‘neural network’. Unlike an ML algorithm, a DL neural network has multiple layers, each consisting of an algorithm that, in the simplest terms, takes an input, runs it through a mathematical function, and provides a relevant ‘insightful’ output. This output can then be passed on as input to another layer to extract another feature detail, and so on, each layer homing in on the details most relevant to the task at hand (figure 2).

Figure 1. The brain, AI, ML and DL—the relationship.35 AI, artificial intelligence; ARDA, automated retinal disease assessment; DL, deep learning; ML, machine learning.

Figure 2. A DL neural network using data from multiple variables to predict the visibility in a foggy situation at an airport.36 In the simplest of terms, the neural network accepts multiple inputs through its input layer. At each node, the data are analysed using a mathematical filter function before being passed on to the next node. This repeats for hidden layers of ‘N’ depth, creating a feature map that summarises the presence of detected features in the input, honing and refining itself at each depth of layer before finally being passed out through the output layer. DL, deep learning.

In DL, one can stack multiple such layers to create a neural network of ‘N’ layers, limited only by the computing power and processing time available at hand.
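This stacking can be sketched in a few lines. The sketch below is illustrative only: real networks are trained on data, whereas these weights are random, and the layer sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """Simple non-linearity applied between layers."""
    return np.maximum(0.0, x)

def make_layer(n_in, n_out):
    """One layer: a weight matrix and bias, i.e. the 'mathematical function'."""
    return rng.normal(scale=0.1, size=(n_in, n_out)), np.zeros(n_out)

def forward(x, layers):
    """Pass the input through each stacked layer in turn."""
    for w, b in layers[:-1]:
        x = relu(x @ w + b)      # hidden layers refine the feature map
    w, b = layers[-1]
    return x @ w + b             # output layer produces the prediction

# A network of 'N' = 3 stacked layers: 4 inputs -> 8 -> 8 -> 1 output.
layers = [make_layer(4, 8), make_layer(8, 8), make_layer(8, 1)]
y = forward(np.array([1.0, 0.5, -0.2, 0.3]), layers)
print(y.shape)  # (1,)
```

Training consists of adjusting each layer's weights so the final output matches known answers; the stacked structure itself is what the figure depicts.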

DL, with its ability to learn by itself, has opened new avenues in AI research. In cardiology, this technology is being used to detect and classify arrhythmias and murmurs from ECG tracings and stethoscope recordings, respectively. In echocardiography (ECHO), AI image processing can help automate the measurement of multiple parameters, such as ejection fraction, as well as enable quick screening exams.

Natural language processing

While most fields applying AI rely on data that a computer can readily consume, medicine is not one of them. Clinical narratives comprise more than 80% of the data in electronic health records.5 Much of this data is unstructured free text whose manual summarisation would be time- and labour-intensive. To make it computer-manageable, tools for the automatic identification and extraction of relevant data are needed.

Fortunately, recent advances in technology, especially natural language processing (NLP), have enabled this automatic information extraction from narrative text. NLP is another AI method that converts unstructured text into a structured, machine-readable form. Presently, NLP has been used to extract information from clinical notes, radiology reports and pathology reports.
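As a toy illustration of turning narrative text into structured fields: production NLP systems use learned language models rather than hand-written patterns, and the note text, patterns and field names below are invented for the example.

```python
import re

# Hypothetical free-text clinical note.
note = ("65 y/o male, BP 142/90 mmHg, EF 35% on echo. "
        "Known case of atrial fibrillation, on warfarin.")

def extract(note):
    """Pull a few structured fields out of narrative text."""
    bp = re.search(r"BP\s+(\d{2,3})/(\d{2,3})", note)
    ef = re.search(r"EF\s+(\d{1,2})\s*%", note)
    return {
        "systolic_bp": int(bp.group(1)) if bp else None,
        "diastolic_bp": int(bp.group(2)) if bp else None,
        "ejection_fraction": int(ef.group(1)) if ef else None,
        "afib": "atrial fibrillation" in note.lower(),
    }

print(extract(note))
# {'systolic_bp': 142, 'diastolic_bp': 90, 'ejection_fraction': 35, 'afib': True}
```

Real clinical NLP must additionally handle negation ("no atrial fibrillation"), abbreviations and free-form phrasing, which is why learned models, fed by big data, outperform fixed patterns.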

Advances in NLP have been tied to advances in AI and DL, which rely on ‘big data’ for their accuracy, making its venture into the field of medicine possible.

AI in cardiology

Electrophysiology

An ECG records the electrical signals produced by the heart through sensors placed on the skin. Although it is one of the most common tests used to quickly detect a variety of heart diseases, its utility has thus far been limited in comparison to more advanced imaging techniques such as two-dimensional (2D)-ECHO.

Asymptomatic left ventricular dysfunction (ALVD), a treatable condition, is present in 3%–6% of the general population and is associated with reduced quality of life and longevity. Classified as stage B heart failure, it is defined as depressed left ventricular systolic function in the absence of clinical heart failure. Early detection and initiation of therapy in patients with presumed ALVD has been shown to lead to better outcomes.6 Presently 2D-ECHO is the only way to diagnose this condition, but AI seems to be on the verge of changing that.

Attia et al trained a convolutional neural network to identify patients with ventricular dysfunction using ECG data alone.7 In their study, of the patients without ventricular dysfunction on presentation, those with a positive AI screen were at four times the risk of developing future ventricular dysfunction compared with those with a negative screen. This makes the study a pivotal one in the field, as it uses a cheaper and more widely accessible modality, the ECG, to provide a functionality once exclusive to ECHO.

While long-term cardiac monitoring, such as a Holter exam, provides information mostly about cardiac rhythm and repolarisation, the standard, short-duration, 12-lead ECG can detect a wider range of cardiac electrical activity. This includes arrhythmias, conduction disturbances, acute coronary syndromes, chamber hypertrophy and enlargement, and the effects of drugs and electrolyte disturbances. Thus, a DL approach that allows accurate interpretation of the 12-lead ECG would have the greatest impact.

A majority of physicians, including cardiologists, calculate the QTc incorrectly, potentially missing a long QT syndrome, which can be deadly.8 Although present ECG machines can provide automated estimation of various intervals, AI-based automatic ECG interpretation could reduce physician error by offering better accuracy (4% error rate) and additional functionality such as detecting ischaemic cardiac beats.9 Furthermore, with the development of newer architectures and faster chips, smaller mobile devices capable of interpreting an ECG may well become an effective screening tool for both acquired and congenital long QT syndrome in a variety of clinical settings, especially where standalone 12-lead electrocardiography is not accessible or cost-effective.10
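For reference, one widely used heart-rate correction is Bazett's formula, QTc = QT/√RR (RR in seconds). The snippet below is a simple sketch of that arithmetic; the abnormality cut-off quoted in the comment is an approximate convention, not a figure from this paper.

```python
import math

def qtc_bazett(qt_ms, rr_ms):
    """Bazett's correction: QTc = QT / sqrt(RR), with RR in seconds."""
    return qt_ms / math.sqrt(rr_ms / 1000.0)

def qtc_from_heart_rate(qt_ms, heart_rate_bpm):
    """The RR interval in milliseconds is 60 000 / heart rate."""
    return qtc_bazett(qt_ms, 60000.0 / heart_rate_bpm)

# A QT of 400 ms at 60 bpm (RR = 1 s) is unchanged by the correction;
# the same QT at 100 bpm corrects to a much longer QTc.
print(round(qtc_bazett(400, 1000)))          # 400
print(round(qtc_from_heart_rate(400, 100)))  # 516
# A QTc above roughly 450-470 ms raises concern for long QT syndrome.
```

The arithmetic itself is trivial; the cited errors arise from misreading intervals off the tracing, which is exactly where automated interpretation can help.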

AI has also been used to detect arrhythmias such as atrial fibrillation and conduction blocks. Lyon et al have been able to identify and classify ECG phenotypes associated with arrhythmic risk markers in hypertrophic cardiomyopathy.11

The accuracy of these predictions rests on large datasets, which are now increasingly available thanks to digitisation, paving the way for future advances in this modality.

Echocardiography

ECHO remains the principal imaging modality in cardiology for the evaluation of cardiac structure and function. Unfortunately, being an ultrasonography-based imaging modality, the acquisition and interpretation of echocardiograms remain highly dependent on operator experience and hence open to human error. This presents an opportunity for AI, which could be used to minimise such errors and open up the possibility of standardisation.

A recent study by Narang et al concluded that AI can indeed be used to assist untrained personnel in acquiring echocardiographic studies with diagnostic potential.12 In this study, eight nurses untrained in ultrasonography used AI guidance to scan 30 patients with a 10-view ECHO protocol. Five expert echocardiographers performed a blind review of these scans and judged them to be of diagnostic quality for left ventricular size and function in 98.8% of patients, right ventricular size in 92.5% and presence of pericardial effusion in 98.8%. Their AI guidance algorithm represents a step forward in the interaction between medical imaging and novice sonographers, while also opening up ultrasonography to settings that would ordinarily lack access due to a shortage of trained personnel.

Compared with a human, ML models have also been shown to provide an almost instantaneous assessment of an echocardiogram. In a study by Knackstedt et al, the left ventricular ejection fraction could be analysed in approximately 8 s,13 far quicker than a trained cardiologist with years of experience could achieve. Such rapid measurement would save cardiologists time, allowing more scans, shorter reporting times and a cost-benefit advantage.

Further work (table 1) has shown that AI models used to identify borders can provide an accurate identification of the left and right ventricular cavities so as to derive their respective volumes, comparable to those measured by cardiac MRI (figure 3).14–18
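Once the cavity volumes are derived, the ejection fraction follows directly from the standard definition, EF = (EDV − ESV)/EDV. A minimal sketch, with illustrative volumes and a hypothetical function name:

```python
def ejection_fraction(edv_ml, esv_ml):
    """EF (%) from end-diastolic and end-systolic left ventricular volumes."""
    if edv_ml <= 0 or esv_ml < 0 or esv_ml > edv_ml:
        raise ValueError("implausible volumes")
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# Illustrative values: EDV 120 mL, ESV 50 mL.
print(round(ejection_fraction(120, 50), 1))  # 58.3
```

The hard part automated by the AI models above is the border detection that yields the volumes; the final calculation is this simple ratio.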

Table 1.

Findings in the field of echocardiography and machine learning

Reference | Study year | Application | Machine learning model used | Training/validation set | Test set | Sensitivity/specificity/accuracy
37 | 2018 | Recognise 15 echocardiography views | Convolutional neural network | 200 000 images | 20 000 images | –/–/91.7%
38 | 2018 | Quantification of wall motion abnormalities | Double density-dual tree discrete wavelet transform | 279 images | – | 96.12%/96%/96.05%
39 | 2016 | Classification/discrimination of pathological patterns (HCM vs ATH) | Support vector machine, random forest, artificial neural network | – | – | 96%/77%/–
40 | 2016 | Quantification of MR | Support vector machine | 5004 frames | – | 99.38%/99.63%/99.45%
13 | 2015 | Calculation of EF and LS | AutoEF software | 255 patients | – | –
41 | 2013 | Automated detection of LV border | Random forest classifier with an active shape model | 50 images | 35 images | –/–/90.09%

ATH, athletes’ heart; EF, ejection fraction; HCM, hypertrophic cardiomyopathy; LS, longitudinal strain; LV, left ventricle; MR, mitral regurgitation.

Figure 3. Automatic border detection and strain analysis from the three standard apical views, calculating regional and global longitudinal strain. Adapted from Davis et al.42 Global longitudinal strain (GLS) is a simple parameter that expresses longitudinal shortening as a percentage (change in length as a proportion of baseline length), and is a newly emerging topic with a significant role in predicting cardiovascular outcomes.

Stress ECHO

Stress ECHO is one of the most commonly used functional imaging tests for coronary artery disease.

A meta-analysis of 62 published stress ECHO studies demonstrated a wide variation in reported sensitivities and specificities for dobutamine stress echocardiography. Sensitivity ranged from 33% to 98%, while the specificity ranged from 38% to 97% resulting in average sensitivity and specificity for dobutamine stress ECHO of 81% and 82%, respectively. In essence, one in every five patients could be potentially misdiagnosed.19

Quantitative assessment of changes in regional wall motion is important in stress ECHO to identify patients with prognostically significant coronary disease. It is also used in the assessment of systolic heart failure. A study by Omar et al20 found that a DL technique using convolutional neural networks provided a sensitivity of 81.1% compared with expert operator interpretation, although most studies until now have been on relatively small datasets. Nevertheless, they show promise that ML models may be able to support decision making in stress ECHO, reducing the incidence of misdiagnosis.

AI in clinical cardiology and daily life

Clinical decision support and preventive cardiology

In clinical practice, the main goals are the right diagnosis and effective treatment of the patient.

Yan et al21 propose an interesting concept: whereas in the traditional model the clinician analyses and gives ‘instructions’ to the patient directly, in a novel approach the clinician would instead give instructions to an AI solution acting as a liaison.22 The AI programme would then search for flaws in the clinician’s interpretation and, if any were found, request help from a senior clinician before finally passing the corrected advice on to the patient. This approach could help decrease errors in medical practice, acting as a redundancy tool.

For example, Google has been able to determine cardiovascular risk factors, such as age, gender, smoking status and blood pressure, as well as major adverse events, from retinal fundus photographs.23 This work allowed the scientists to use these data to predict a patient’s risk of CVD with an accuracy as high as 70%.

It gives hope that the future clinician may be able to ascertain a patient’s past history in much greater depth using devices that incorporate this technology, and hence better guide their medications and lifestyle.

When it comes to predicting prognosis, studies have also shown that echocardiographic data and clinical factors can be used by AI tools to facilitate heart failure diagnosis, classification, severity estimation and prediction of adverse events.24–26 Work by Nakashima et al produced highly precise estimates of risk for out-of-hospital cardiac arrests using an ML model.27 Although few, studies in the in-patient setting are nevertheless promising. Zhang et al integrated AI with their hospital management system to analyse 14 clinical variables and predict in real time the risk of major adverse cardiac events in patients presenting with chest pain.28 Other studies have shown that DL had high sensitivity and a low false-alarm rate in detecting patients with cardiac arrest in an in-patient setting.29 This would mean that in the future, AI-based tools integrated with the hospital record management system could well act as an early warning system, alerting the clinician to patients who could potentially worsen in the coming days so that treatment can be tailored accordingly.

Worth noting is that, as with any new technology, implementation remains a challenge, and AI is no stranger to this. Implementation could well be guided by the Nonadoption, Abandonment, and Challenges to the Scale-Up, Spread, and Sustainability framework proposed by Greenhalgh et al.30 It is based on the premise that, when considering whether a technology will be successfully accepted, ‘it is not individual factors that make or break a technology implementation effort but the dynamic interaction between them’. With AI, these factors range from the cost of setup and integration to ethical and humanitarian issues and funding. In resource-poor countries, where most clinical setups lack even a basic electronic medical record system and hence the relevant digital infrastructure, the challenge is bigger still.

The digital stethoscope

With the development of higher quality and more robust microphones, such as Micro-Electro-Mechanical System (MEMS) microphones, the digital stethoscope soon followed, opening up the field of computer-aided auscultation.

Today’s digital stethoscopes not only allow recordings to be shared wirelessly, but also use ML software to provide automatic arrhythmia detection with accuracies as high as 87%, and acoustic-based automatic diagnosis of cardiac dysfunction. Audio recorded on such stethoscopes can further be uploaded to online services that greatly enhance the scope of the acoustic data.31

Wearable sensors

The Food and Drug Administration recently approved an ECG acquisition technology designed by Apple for use in its Apple Watch devices.32 In a standard ECG, limb lead I is the potential difference between the right arm and the left arm. For the Apple Watch to record an ECG, the user touches the digital crown of the watch; electrodes on the back of the device remain in continuous contact with the user’s wrist. The watch then measures the electrical potential between the back electrodes and the digital crown (essentially the potential difference between the two arms) to display a waveform. Software in the watch can then use these data to detect sinus rhythm or an abnormality such as atrial fibrillation (AFib).

Wearable photoplethysmographs, available in most digital watches today, can also provide AFib detection. AI has been shown to improve the sensitivity and specificity of atrial fibrillation detection in wearable devices dramatically compared with conventional methods.33 The WATCH-AF (SmartWATCHes for Detection of Atrial Fibrillation) trial showed that a photoplethysmographic algorithm had very high specificity and diagnostic accuracy compared with ECG data assessed by cardiologists, but was limited by a high dropout rate owing to insufficient signal quality.34
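As a heavily simplified sketch of why rhythm irregularity is detectable from beat timings alone: AFib produces an ‘irregularly irregular’ rhythm, so the variability of the intervals between successive beats is informative. Real wearable algorithms apply learned models to the raw photoplethysmographic signal; the single threshold, beat times and function names below are invented for illustration.

```python
import statistics

def rr_intervals_ms(beat_times_s):
    """Intervals (ms) between successive detected beats."""
    return [(b - a) * 1000.0 for a, b in zip(beat_times_s, beat_times_s[1:])]

def irregular_rhythm(beat_times_s, cv_threshold=0.15):
    """Naive screen: flag a rhythm whose beat-to-beat intervals vary widely,
    measured by the coefficient of variation (stdev / mean)."""
    rr = rr_intervals_ms(beat_times_s)
    cv = statistics.stdev(rr) / statistics.mean(rr)
    return cv > cv_threshold

regular = [i * 0.8 for i in range(10)]                # steady 75 bpm
erratic = [0.0, 0.6, 1.7, 2.1, 3.4, 3.9, 5.3, 5.8]    # erratic beat times
print(irregular_rhythm(regular))  # False
print(irregular_rhythm(erratic))  # True
```

The WATCH-AF dropout problem maps onto the first step here: if the optical signal is too noisy to detect beats reliably, no downstream rhythm analysis is possible.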

Moving forward: future prospects

‘Cultural lag’, as proposed by William Ogburn, is the idea that as technological change leaps forward, it creates a lag in society, which must then adapt to the new realities introduced by innovation (Ogburn, 1922). The day when AI replaces the cardiologist is not yet in sight, and may never be. Although not yet mainstream, we are firmly in an era in which AI assists cardiologists across the world daily in faster and better diagnosis and image interpretation. The future lies in using this technology in areas not yet ventured into because of cost constraints. Furthermore, integrating automatic diagnosis and cardiovascular risk assessment systems into existing electronic medical record software would not only improve holistic treatment but also aid in counselling patients on modifiable risk factors, improving morbidity and mortality.

Acknowledgments

Special thanks and much gratitude to Dr. Shreepal Jain and Dr. Sujoy Fernandes for all the help in reviewing the work. A special mention to Dr. Shakuntala Prabhu and Dr. Sudha Rao for their kind assistance. Special mention to Dr. Deepak Salgare, Dr. Norris Rodrigues, Dr Shrinivas Waghmare, Dr. Kanhaiya Kumar, Dr. Noopur Girmal.

Footnotes

Contributors: AD'C: author; AZ: coauthor.

Funding: The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

Competing interests: None declared.

Provenance and peer review: Not commissioned; externally peer reviewed.

Data availability statement

Data sharing not applicable as no datasets generated and/or analysed for this study.

Ethics statements

Patient consent for publication

Not applicable.

Ethics approval

This study does not involve human participants.

References

  • 1.Wilkins E, Wilson L, Wickramasinghe K. European cardiovascular disease statistics 2017. Brussels: European Heart Network, 2017. [Google Scholar]
  • 2.Ritchie H, Roser M. Our world in data. In: Causes of death, 2018. https://ourworldindata.org/causes-of-death [Google Scholar]
  • 3.Artificial intelligence (AI) in cardiovascular medicine. Available: https://www.mayoclinic.org/departments-centers/ai-cardiology/overview/ovc-20486648
  • 4.Ai, ml, and DL: how not to get them mixed! Available: https://towardsdatascience.com/understanding-the-difference-between-ai-ml-and-dl-cceb63252a6c
  • 5.Jensen PB, Jensen LJ, Brunak S. Mining electronic health records: towards better research applications and clinical care. Nat Rev Genet 2012;13:395–405. 10.1038/nrg3208 [DOI] [PubMed] [Google Scholar]
  • 6.Sara JD, Toya T, Taher R, et al. Asymptomatic left ventricle systolic dysfunction. Eur Cardiol 2020;15:e13. 10.15420/ecr.2019.14 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Attia ZI, Kapa S, Lopez-Jimenez F, et al. Screening for cardiac contractile dysfunction using an artificial intelligence-enabled electrocardiogram. Nat Med 2019;25:70–4. 10.1038/s41591-018-0240-2 [DOI] [PubMed] [Google Scholar]
  • 8.Viskin S, Rosovski U, Sands AJ, et al. Inaccurate electrocardiographic interpretation of long QT: the majority of physicians cannot recognize a long QT when they see one. Heart Rhythm 2005;2:569–74. 10.1016/j.hrthm.2005.02.011 [DOI] [PubMed] [Google Scholar]
  • 9.Ronzhina M, Potocnak T, Janousek O. Spectral and higher-order statistical analysis of the ECG: application to the study of ischemia in rabbit isolated hearts. Computing in Cardiology 2012:645–8. [Google Scholar]
  • 10.Giudicessi JR, Schram M, Bos JM, et al. Artificial Intelligence-Enabled assessment of the heart rate corrected QT interval using a mobile electrocardiogram device. Circulation 2021;143:1274–86. 10.1161/CIRCULATIONAHA.120.050231 [DOI] [PubMed] [Google Scholar]
  • 11.Lyon A, Mincholé A, Martínez JP, et al. Computational techniques for ECG analysis and interpretation in light of their contribution to medical advances. J R Soc Interface 2018;15:20170821. 10.1098/rsif.2017.0821 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Narang A, Bae R, Hong H, et al. Utility of a Deep-Learning algorithm to guide novices to acquire Echocardiograms for limited diagnostic use. JAMA Cardiol 2021;6:624–32. 10.1001/jamacardio.2021.0185 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Knackstedt C, Bekkers SCAM, Schummers G, et al. Fully automated versus standard tracking of left ventricular ejection fraction and longitudinal strain: the FAST-EFs multicenter study. J Am Coll Cardiol 2015;66:1456–66. 10.1016/j.jacc.2015.07.052 [DOI] [PubMed] [Google Scholar]
  • 14.Levy F, Dan Schouver E, Iacuzio L, et al. Performance of new automated transthoracic three-dimensional echocardiographic software for left ventricular volumes and function assessment in routine clinical practice: Comparison with 3 Tesla cardiac magnetic resonance. Arch Cardiovasc Dis 2017;110:580–9. 10.1016/j.acvd.2016.12.015 [DOI] [PubMed] [Google Scholar]
  • 15.Domingos JS, Stebbing RV, Leeson P. Structured random forests for myocardium delineation in 3D echocardiography. Cham, Switzerland: Springer International Publishing, 2014. [Google Scholar]
  • 16.Stebbing RV, Namburete AIL, Upton R, et al. Data-Driven shape Parameterization for segmentation of the right ventricle from 3D+t echocardiography. Med Image Anal 2015;21:29–39. 10.1016/j.media.2014.12.002 [DOI] [PubMed] [Google Scholar]
  • 17.Tsang W, Salgo IS, Medvedofsky D, et al. Transthoracic 3D Echocardiographic Left Heart Chamber Quantification Using an Automated Adaptive Analytics Algorithm. JACC Cardiovasc Imaging 2016;9:769–82. 10.1016/j.jcmg.2015.12.020 [DOI] [PubMed] [Google Scholar]
  • 18.Otani K, Nakazono A, Salgo IS, et al. Three-Dimensional echocardiographic assessment of left heart chamber size and function with fully automated quantification software in patients with atrial fibrillation. J Am Soc Echocardiogr 2016;29:955–65. 10.1016/j.echo.2016.06.010 [DOI] [PubMed] [Google Scholar]
  • 19.Geleijnse ML, Krenning BJ, van Dalen BM, et al. Factors affecting sensitivity and specificity of diagnostic testing: dobutamine stress echocardiography. J Am Soc Echocardiogr 2009;22:1199–208. 10.1016/j.echo.2009.07.006 [DOI] [PubMed] [Google Scholar]
  • 20.Omar HA, Domingos JS, Patra A. Quantification of cardiac bull’s-eye map based on principal strain analysis for myocardial wall motion assessment in stress echocardiography. In: 2018 IEEE 15th International Symposium on biomedical imaging (ISBI 2018), 2018. [Google Scholar]
  • 21.Yan Y, Zhang J-W, Zang G-Y, et al. The primary use of artificial intelligence in cardiovascular diseases: what kind of potential role does artificial intelligence play in future medicine? J Geriatr Cardiol 2019;16:585–91. 10.11909/j.issn.1671-5411.2019.08.010 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Mei X, Lee H-C, Diao K-Y, et al. Artificial intelligence-enabled rapid diagnosis of patients with COVID-19. Nat Med 2020;26:1224–8. 10.1038/s41591-020-0931-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Poplin R, Varadarajan AV, Blumer K, et al. Prediction of cardiovascular risk factors from retinal fundus Photographs via deep learning. Nat Biomed Eng 2018;2:158–64. 10.1038/s41551-018-0195-0 [DOI] [PubMed] [Google Scholar]
  • 24.Anchouche K, Singh G, Singh G, et al. Clinical applications of machine learning in cardiovascular disease and its relevance to cardiac imaging. Eur Heart J 2019;40:1975–86. 10.1093/eurheartj/ehy404 [DOI] [PubMed] [Google Scholar]
  • 25.Cikes M, Sanchez-Martinez S, Claggett B, et al. Machine learning-based phenogrouping in heart failure to identify responders to cardiac resynchronization therapy. Eur J Heart Fail 2019;21:74–85. 10.1002/ejhf.1333 [DOI] [PubMed] [Google Scholar]
  • 26.Horiuchi Y, Tanimoto S, Latif AHMM, et al. Identifying novel phenotypes of acute heart failure using cluster analysis of clinical variables. Int J Cardiol 2018;262:57–63. 10.1016/j.ijcard.2018.03.098 [DOI] [PubMed] [Google Scholar]
  • 27.Nakashima T, Ogata S, Noguchi T, et al. Machine learning model for predicting out-of-hospital cardiac arrests using Meteorological and chronological data. Heart 2021;107:1084–91. 10.1136/heartjnl-2020-318726 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Zhang P-I, Hsu C-C, Kao Y, et al. Real-Time AI prediction for major adverse cardiac events in emergency department patients with chest pain. Scand J Trauma Resusc Emerg Med 2020;28:93. 10.1186/s13049-020-00786-x [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Kwon Joon‐myoung, Lee Y, Lee Y. An algorithm based on deep learning for predicting In‐Hospital cardiac arrest. J Am Heart Assoc 2018;7. 10.1161/JAHA.118.008678 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Greenhalgh T, Wherton J, Papoutsi C, et al. Beyond adoption: a new framework for theorizing and evaluating Nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J Med Internet Res 2017;19:e367. 10.2196/jmir.8775 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 31.Artificial intelligence gives stethoscopes a much-needed upgrade. Available: https://engineering.berkeley.edu/news/2020/03/artificial-intelligence-gives-stethoscopes-a-much-needed-upgrade/
  • 32.De novo classification Request for ECG APP. Available: https://www.accessdata.fda.gov/cdrh_docs/reviews/DEN180044.pdf
  • 33.Torres-Soto J, Ashley EA. Multi-task deep learning for cardiac rhythm detection in wearable devices. NPJ Digit Med 2020;3:116. 10.1038/s41746-020-00320-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 34.Dörr M, Nohturfft V, Brasier N, et al. The WATCH AF Trial: SmartWATCHes for Detection of Atrial Fibrillation. JACC Clin Electrophysiol 2019;5:199–208. 10.1016/j.jacep.2018.10.006 [DOI] [PubMed] [Google Scholar]
  • 35.Drukker L, Noble JA, Papageorghiou AT. Introduction to artificial intelligence in ultrasound imaging in obstetrics and gynecology. Ultrasound Obstet Gynecol 2020;56:498–505. 10.1002/uog.22122 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Oğuz K, Pekin MA. Predictability of fog visibility with artificial neural network for Esenboga airport. Avrupa Bilim ve Teknoloji Dergisi 2019:542–51. https://dergipark.org.tr/en/pub/ejosat/issue/43603/452598 [Google Scholar]
  • 37.Madani A, Arnaout R, Mofrad M, et al. Fast and accurate view classification of echocardiograms using deep learning. NPJ Digit Med 2018;1:6. 10.1038/s41746-017-0013-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Raghavendra U, Fujita H, Gudigar A, et al. Automated technique for coronary artery disease characterization and classification using DD-DTDWT in ultrasound images. Biomed Signal Process Control 2018;40:324–34. 10.1016/j.bspc.2017.09.030 [DOI] [Google Scholar]
  • 39.Narula S, Shameer K, Salem Omar AM, et al. Machine-learning algorithms to Automate morphological and functional assessments in 2D echocardiography. J Am Coll Cardiol 2016;68:2287–95. 10.1016/j.jacc.2016.08.062 [DOI] [PubMed] [Google Scholar]
  • 40.Moghaddasi H, Nourian S. Automatic assessment of mitral regurgitation severity based on extensive textural features on 2D echocardiography videos. Comput Biol Med 2016;73:47–55. 10.1016/j.compbiomed.2016.03.026 [DOI] [PubMed] [Google Scholar]
  • 41.Gregg Belous AB, Rowlands D. Segmentation of the left ventricle from ultrasound using random forest with active shape model in artificial intelligence, modelling and simulation (AIMS). Kota Kinabalu, Malaysia: IEEE, 2013. [Google Scholar]
  • 42.Davis A, Billick K, Horton K, et al. Artificial intelligence and echocardiography: a primer for cardiac Sonographers. Journal of the American Society of Echocardiography 2020;33:1061–6. 10.1016/j.echo.2020.04.025 [DOI] [PMC free article] [PubMed] [Google Scholar]

Articles from Open Heart are provided here courtesy of BMJ Publishing Group
