The Lancet Regional Health: Western Pacific

Letter. 2024 Jul 13;48:101145. doi: 10.1016/j.lanwpc.2024.101145

Unveiling the black box: imperative for explainable AI in cardiovascular disease prevention

Yanyi Wu a,b, Chenghua Lin a,b
PMCID: PMC11298886 PMID: 39104749

Dalakoti et al. take a positive view of the application of artificial intelligence (AI) to cardiovascular disease (CVD) prevention in Singapore, emphasizing in particular the potential of AI tools such as CardioSight and CHAMP to identify high-risk individuals and implement preventive measures.1 However, a core issue is missing from their argument: the explainability and transparency of these AI systems. As AI technology penetrates ever deeper into medical decision-making, it has become a top priority to uncover the mystery of the "black box" and to ensure that the recommendations these systems provide can be understood and trusted by physicians and patients alike.

The lack of explainability in AI systems poses significant challenges in the medical domain, because medical decisions directly affect patient health and safety. Physicians need to understand the logical basis of AI recommendations in order to communicate effectively with patients and make prudent decisions.2 For example, when CardioSight identifies a patient as being at high risk for CVD, the physician must be able to understand and explain the factors contributing to that assessment; otherwise, opportunities for early intervention may be missed because of uncertainty about the AI's recommendation. In parallel, patients have the right to know how their health data are being used and how AI algorithms arrive at specific conclusions. Opaque systems erode patients' trust, hinder patient-physician communication, and impede shared decision-making.3 It must be recognized that the success of AI-driven CVD prevention depends not only on the accuracy of predictions but also on the transparency of the decision-making process.

Therefore, researchers and developers should prioritize the development of explainable AI models in healthcare, which involves designing algorithms that can provide clear, interpretable explanations for their predictions and recommendations.4 Techniques such as feature importance analysis, rule extraction, and counterfactual explanation can help unravel the complex process of AI decision-making, as sketched below.5 Greater investment in explainable AI will foster trust, facilitate effective communication, and ensure that AI tools such as CardioSight and CHAMP are not only powerful but also transparent and accountable, enabling a fundamental change in cardiovascular disease prevention and patient care.
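To illustrate one such technique, the following is a minimal Python sketch of feature importance analysis using permutation importance from scikit-learn. The model, synthetic data, and feature names (age, systolic blood pressure, LDL cholesterol, smoking status) are illustrative assumptions only and do not reflect CardioSight, CHAMP, or any real patient data.

# Minimal sketch: permutation feature importance for a hypothetical CVD risk classifier.
# All data and feature names are synthetic and illustrative.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Synthetic patient features: age, systolic BP, LDL cholesterol, smoking status
X = np.column_stack([
    rng.normal(55, 10, n),    # age (years)
    rng.normal(130, 15, n),   # systolic blood pressure (mmHg)
    rng.normal(3.0, 0.8, n),  # LDL cholesterol (mmol/L)
    rng.integers(0, 2, n),    # current smoker (0/1)
])
# Synthetic outcome loosely tied to the features, for illustration only
logit = (0.04 * (X[:, 0] - 55) + 0.03 * (X[:, 1] - 130)
         + 0.6 * (X[:, 2] - 3.0) + 0.8 * X[:, 3])
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance: how much does shuffling each feature degrade accuracy?
result = permutation_importance(model, X_test, y_test, n_repeats=30, random_state=0)
for name, mean, std in zip(["age", "systolic_bp", "ldl", "smoker"],
                           result.importances_mean, result.importances_std):
    print(f"{name}: {mean:.3f} +/- {std:.3f}")

Permutation importance is model-agnostic: it measures how much predictive performance drops when each input is shuffled, giving clinicians a ranked view of which factors drive a given risk score and a concrete basis for explaining that score to patients.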

Declaration of interests

Authors declare no competing interests.

References

1. Dalakoti M., Djohan A., Ong J., et al. Incorporating AI into cardiovascular diseases prevention–insights from Singapore. Lancet Reg Health West Pac. 2024;48. doi: 10.1016/j.lanwpc.2024.101102.
2. Amann J., Blasimme A., Vayena E., et al. Explainability for artificial intelligence in healthcare: a multidisciplinary perspective. BMC Med Inform Decis Mak. 2020;20(1):310. doi: 10.1186/s12911-020-01332-6.
3. Ploug T., Holm S. The four dimensions of contestable AI diagnostics: a patient-centric approach to explainable AI. Artif Intell Med. 2020;107. doi: 10.1016/j.artmed.2020.101901.
4. Tjoa E., Guan C. A survey on explainable artificial intelligence (XAI): toward medical XAI. IEEE Trans Neural Netw Learn Syst. 2021;32(11):4793–4813. doi: 10.1109/TNNLS.2020.3027314.
5. Henry K.E., Kornfield R., Sridharan A., et al. Human–machine teaming is key to AI adoption: clinicians' experiences with a deployed machine learning system. NPJ Digit Med. 2022;5:97. doi: 10.1038/s41746-022-00597-7.
