Introduction
Digital transformation marks a defining inflection point in cardiovascular care, supported by robust evidence of measurable benefits across multiple domains of practice. Yet emerging evidence suggests that, despite the promise of these advances, we may inadvertently be creating new barriers for the disadvantaged, high-risk patients who are likely to benefit most.
Discussion
Understanding the emerging disparities
Accumulating evidence reveals patterns of digital health disparities that cardiovascular practitioners must address. Patients aged 80 years and older have 76% lower odds of accessing video telemedicine consultations than younger patients.1 Black patients in the USA have 36% lower odds of utilizing video-based cardiovascular care, while those with limited English proficiency encounter 48% reduced access to digital health platforms.2 These differences suggest that current implementation approaches may unintentionally exclude vulnerable populations from technological advances. Importantly, digital technology and artificial intelligence (AI) have, in certain contexts, been used effectively to narrow gaps in cardiovascular care for underserved populations, demonstrating that exclusion is not inevitable.
The geographical dimension proves equally concerning. Rural residents in high-income countries, who already face elevated cardiovascular mortality rates,3 may be further disadvantaged by video-based telemedicine consultations due to unreliable broadband access and limited digital literacy.4 The extent of disadvantage varies by specific digital health intervention, with robust evidence currently limited primarily to video telemedicine and remote patient monitoring platforms.5,6
The unintended consequences of artificial intelligence implementation
Perhaps nowhere do we see greater potential for both benefit and unintended bias than in AI applications. As we develop cardiovascular AI systems, the datasets used for training may not fully represent the populations most burdened by disease. When algorithms are deployed at scale without comprehensive validation across diverse populations, we risk amplifying existing healthcare disparities.
Performance data reveal important patterns. Deep learning models for electrocardiogram interpretation show lower accuracy in patients over 80 years (area under the curve [AUC] 0.73 vs. 0.81 in younger adults),7 and heart failure prediction algorithms show AUC falling from 0.80 in younger patients to 0.66 in older populations, a pattern attributed to algorithmic bias.8
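The stratified comparisons above can be reproduced for any model by computing the AUC separately within each demographic stratum rather than pooling all patients. The sketch below illustrates this with the rank-based (Mann-Whitney) AUC estimator on synthetic predictions; the data and strata are hypothetical, chosen only to show the mechanics of subgroup evaluation, not drawn from the cited studies.

```python
# Illustrative sketch: subgroup AUC evaluation to surface performance gaps
# across demographic strata. All data below are synthetic.

def auc(labels, scores):
    """Mann-Whitney estimate of the area under the ROC curve:
    the probability that a random positive case is scored above
    a random negative case (ties count as half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical (label, model score) pairs for two age strata
younger = [(1, 0.9), (1, 0.8), (0, 0.3), (0, 0.85), (1, 0.7), (0, 0.4)]
older   = [(1, 0.6), (1, 0.4), (0, 0.5), (0, 0.3), (1, 0.5), (0, 0.45)]

for name, data in [("<80 years", younger), (">=80 years", older)]:
    labels = [l for l, _ in data]
    scores = [s for _, s in data]
    print(f"{name}: AUC = {auc(labels, scores):.2f}")
```

Reporting the per-stratum values side by side, instead of a single pooled AUC, is what makes an age- or race-related performance gap visible in the first place.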
These findings highlight the importance of ensuring our AI systems perform equitably across all patient populations. When algorithms are trained predominantly on data from younger, more homogeneous populations, we may inadvertently create digital disparities mirroring historical healthcare inequities.
The wearable technology challenge
Consumer wearables offer democratized cardiac monitoring, yet adoption data expose significant gaps. Despite cardiovascular disease affecting predominantly older adults, fewer than 20% of established cardiovascular patients utilize wearable devices, and only half of adopters report consistent use.9 While historical concerns have been raised regarding photoplethysmography-based pulse oximetry accuracy in individuals with darker skin tones,10–13 recent evidence suggests a more nuanced picture. The EquiOx study in critically ill patients found that although pulse oximetry measurements showed an overall negative bias, the bias was less negative in darker skin categories,14 and newer devices demonstrated improved performance. Furthermore, limited evidence suggests that photoplethysmography-based heart rate measurement, which underpins most consumer wearables, is not significantly affected by skin tone.15 These findings caution against overgeneralization, though continued vigilance regarding device performance across individuals with diverse skin tones remains essential.
Even when wearables generate data, questions remain about clinical utility. Over 90% of AI-generated alerts from wearable devices prove clinically non-actionable, with atrial fibrillation diagnosed by consumer devices confirmed by cardiologists in only 34–65% of cases.16 This creates situations where some users receive excessive false alarms, contributing to anxiety,17 while others lack access to potentially beneficial monitoring.
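Part of the confirmation-rate problem is simple arithmetic: when a condition is rare in the screened population, even a highly specific device yields many false positives per true case, via Bayes' rule for positive predictive value (PPV). The sensitivity, specificity, and prevalence figures below are hypothetical round numbers chosen to illustrate the effect, not measurements from any cited device study.

```python
# Sketch of why accurate wearables can still generate many false alarms when
# screening low-prevalence populations: PPV via Bayes' rule.
# All parameter values are hypothetical.

def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value: P(disease | positive alert)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A device with 98% sensitivity and 98% specificity still performs
# very differently depending on who is being screened:
for prevalence in (0.01, 0.05, 0.20):
    print(f"AF prevalence {prevalence:.0%}: PPV = {ppv(0.98, 0.98, prevalence):.0%}")
```

At 1% prevalence, roughly two-thirds of positive alerts are false even for this nominally excellent device, which is one mechanism by which population-wide consumer screening can produce low confirmation rates despite good per-device accuracy.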
Reconsidering our digital strategy
The cardiovascular community is navigating digital transformation, recognizing both its transformative potential and the imperatives it creates for equitable implementation.
Substantial infrastructure investments for comprehensive digital health equity, including broadband access, device subsidies, multilingual interfaces, and digital literacy training, carry significant costs. These investments must be evaluated against alternative strategies for achieving the same goal, including both technological modifications (such as simplified interfaces or offline-capable tools) and complementary non-digital interventions that address the same health outcomes in underserved populations.
Patient-facing digital health implementations—including consumer health applications, telemedicine platforms, and wearables—often assume patients possess smartphones, reliable internet access, and sufficient technological literacy. These assumptions may exclude elderly patients managing multiple conditions, immigrants navigating language barriers, and economically disadvantaged individuals managing competing priorities. These concerns primarily address consumer-facing technologies. However, provider-facing digital health applications, including AI-driven clinical support systems and electronic health record analytics, represent a substantial portion of digital health solutions with different equity implications that merit separate consideration.
A comprehensive approach
The development of responsible AI and digital health approaches has been substantially advanced by foundational work from multiple organizations. The World Health Organization’s (WHO) Ethics and Governance of Artificial Intelligence for Health established six core principles—protecting human autonomy, promoting well-being and safety, ensuring transparency, fostering accountability, ensuring inclusiveness and equity, and promoting responsive and sustainable AI.18 The Guidelines International Network (GIN) has proposed eight principles specifically for AI use in the health guideline enterprise, including transparency, pre-planning, additionality, credibility, ethics, accountability, compliance, and evaluation.19 The National Academy of Medicine’s AI Code of Conduct provides a comprehensive framework addressing governance, development, and monitoring across the AI lifecycle.20 Building upon these established frameworks, cardiovascular-specific implementation requires tailored approaches.
Cardiovascular medicine can operationalize these established principles through specific actions. Training AI algorithms on diverse datasets to adequately represent high-risk populations addresses the inclusiveness principle while improving clinical validity. Designing interfaces that prioritize accessibility alongside functionality implements the equity and transparency principles in practical terms. Developing workflows that accommodate varying technological capabilities ensures the protection of human autonomy. Measuring success not only by adoption rates but by impact on health disparities across all populations enacts the accountability and evaluation principles.
The evaluation paradigm for cardiovascular digital health tools should extend beyond technical validation metrics. Following the principles established by the WHO and echoed in the GIN guidance, assessment frameworks should include algorithmic performance across demographic strata, accessibility of implementation requirements, transparent reporting of training data composition, mechanisms for ongoing monitoring and refinement, and measurable impact on health disparities.
Towards equitable digital innovation
Digital health systems may reflect their creators’ perspectives and priorities when development teams lack diversity or when insufficient attention is paid to potential biases. However, this limitation is not universal, and thoughtfully designed development processes that prioritize diverse input and rigorous bias testing can substantially mitigate these concerns.
Cardiovascular medicine has an opportunity to ensure technological advancement serves equity through thoughtful resource allocation and honest examinations of potential biases. Applying the principles outlined above, we can reconsider how to conceive, develop, and successfully implement digital health technologies in cardiovascular care. The potential benefits are substantial: ensuring that our most powerful tools reach those most vulnerable to cardiovascular disease.
Conclusion
The digital transformation of cardiovascular care presents remarkable opportunities and important challenges requiring thoughtful navigation. As we develop increasingly sophisticated technologies, we can ensure innovation serves all patients equitably, particularly those high-risk populations who stand to benefit most but may face access barriers. The fundamental question is not whether we can develop sophisticated AI algorithms or capable wearable devices, but whether we can implement these tools to reduce rather than exacerbate cardiovascular health disparities. With thoughtful design, inclusive development, and equity-focused metrics, digital health technologies could advance cardiovascular health equity. Our choices will determine whether digital transformation becomes a force for greater health equity or an inadvertent contributor to widening disparities.
Contributor Information
Joshua J Hon, Faculty of Medicine, Imperial College London, Exhibition Road, London SW7 2AZ, UK.
Gerald Carr-White, Guy’s and St Thomas’ NHS Foundation Trust, Westminster Bridge Road, London SE1 7EH, UK.
Author contributions
Joshua J Hon (Conceptualisation [lead], Writing—original draft [lead], Writing—review & editing [equal]) and Gerald Carr-White (Conceptualisation [supporting], Methodology [supporting], Supervision, Writing—review & editing [equal])
Funding
No funding from any specific agency was received for this work.
References
- 1. Osmanlliu E, Kalwani NM, Parameswaran V, Qureshi L, Dash R, Scheinker D, et al. Sociodemographic disparities in the use of cardiovascular ambulatory care and telemedicine during the COVID-19 pandemic. Am Heart J 2023;263:169–176.
- 2. Pierce RP, Stevermer JJ. Disparities in the use of telehealth at the onset of the COVID-19 public health emergency. J Telemed Telecare 2023;29:3–9.
- 3. Liu M, Marinacci LX, Joynt Maddox KE, Wadhera RK. Cardiovascular health among rural and urban US adults—healthcare, lifestyle, and social factors. JAMA Cardiol 2025;10:585–594.
- 4. Harrington RA, Califf RM, Balamurugan A, Brown N, Benjamin RM, Braund WE, et al. Call to action: rural health: a presidential advisory from the American Heart Association and American Stroke Association. Circulation 2020;141:e615–e644.
- 5. Rowe Ferrara M, Intinarelli-Shuler G, Chapman SA. Video and telephone telehealth use and web-based patient portal activation among rural-dwelling patients: retrospective medical record review and policy implications. J Med Internet Res 2025;27:e67226.
- 6. Burchim S, Miller S, Beima-Sofie K, Spencer AG, Selah B, Wadden E, et al. Rural perspectives on digital health in cardiovascular care: qualitative study of interviews with rural and rural-serving primary care providers and cardiologists. J Med Internet Res 2025;27:e77234.
- 7. Mihan A, Pandey A, Van Spall HGC. Artificial intelligence bias in the prediction and detection of cardiovascular disease. NPJ Cardiovasc Health 2024;1:31.
- 8. Kaur D, Hughes JW, Rogers AJ, Kang G, Narayan SM, Ashley EA, et al. Race, sex, and age disparities in the performance of ECG deep learning models predicting heart failure. Circ Heart Fail 2024;17:e010879.
- 9. Dhingra LS, Aminorroaya A, Oikonomou EK, Nargesi AA, Wilson FP, Krumholz HM, et al. Use of wearable devices in individuals with or at risk for cardiovascular disease in the US, 2019 to 2020. JAMA Netw Open 2023;6:e2316634.
- 10. Osorio-Sanchez L, May JM, Kyriacou P. Evaluation of skin pigmentation effect on photoplethysmography signals using a vascular finger phantom with tunable optical and mechanical properties. J Biomed Opt 2025;30:117002.
- 11. Martin D, Johns C, Sorrell L, Healy E, Phull M, Olusanya S, et al. Effect of skin tone on the accuracy of the estimation of arterial oxygen saturation by pulse oximetry: a systematic review. Br J Anaesth 2024;132:945–956.
- 12. Shi C, Goodall M, Dumville J, Hill J, Norman G, Hamer O, et al. The accuracy of pulse oximetry in measuring oxygen saturation by levels of skin pigmentation: a systematic review and meta-analysis. BMC Med 2022;20:267.
- 13. Gudelunas MK, Lipnick M, Hendrickson C, Vanderburg S, Okunlola B, Auchus I, et al. Low perfusion and missed diagnosis of hypoxemia by pulse oximetry in darkly pigmented skin: a prospective study. Anesth Analg 2024;138:552–561.
- 14. Hendrickson CM, Lipnick MS, Chen D, Chen D, Law TJ, Pirracchio R, et al. EquiOx: a prospective study of pulse oximeter bias and skin pigmentation in critically-ill adults. medRxiv, 10.1101/2025.10.06.25337217, 7 October 2025, preprint: not peer reviewed.
- 15. Mulholland AM, MacDonald HV, Aguiar EJ, Wingo JE. Influence of skin pigmentation on the accuracy and data quality of photoplethysmographic heart rate measurement during exercise. Eur J Appl Physiol 2025. Online ahead of print.
- 16. Armoundas AA, Narayan SM, Arnett DK, Spector-Bagdady K, Bennett DA, Celi LA, et al. Use of artificial intelligence in improving outcomes in heart disease: a scientific statement from the American Heart Association. Circulation 2024;149:e1028–e1050.
- 17. Rosman L, Gehi A, Lampert R. When smartwatches contribute to health anxiety in patients with atrial fibrillation. Cardiovasc Digit Health J 2020;1:9–10.
- 18. World Health Organization. Ethics and Governance of Artificial Intelligence for Health: WHO Guidance. Geneva: World Health Organization; 2021.
- 19. Sousa-Pinto B, Marques-Cruz M, Neumann I, Chi Y, Nowak AJ, Reinap M, et al. Guidelines International Network: principles for use of artificial intelligence in the health guideline enterprise. Ann Intern Med 2025;178:408–415.
- 20. National Academy of Medicine. Krishnan S, Matheny M, Fontaine E, Adams L, eds. The Learning Health System Series. Washington (DC): National Academies Press (US); 2025.
