Abstract
Purpose
This article explores the transformative role of artificial intelligence (AI) in healthcare education, highlighting the urgent need to integrate AI into medical curricula. It examines the current gaps in AI literacy among healthcare professionals, proposes practical teaching strategies, and discusses the ethical considerations essential for the responsible implementation of AI.
Methods
An extensive literature review was conducted to identify key challenges and opportunities in integrating AI into healthcare education. Case studies, existing curricula, and frameworks were analysed to explore effective teaching methodologies, including problem-based learning (PBL), simulations, and ethical embedding. Multidisciplinary collaboration among academic institutions, technology companies, and government organisations was also examined as a pathway to bridge the gap between AI innovation and practical application.
Results
The findings reveal that while AI is rapidly becoming an integral part of healthcare practice, current medical curricula do not adequately prepare students for its adoption. Integrating AI into curricula through PBL, simulations, and real-world case studies fosters critical thinking and practical skills. Embedding ethical considerations throughout the curriculum addresses data privacy, bias, and accountability issues. Interdisciplinary partnerships and continuous professional development are key strategies for enhancing AI education and ensuring alignment with evolving healthcare needs.
Conclusion
Reimagining healthcare education to incorporate AI is essential for preparing healthcare professionals to navigate an AI-driven future. Embedding AI into curricula, addressing ethical concerns, and fostering collaborations between academia, industry, and government can responsibly equip healthcare professionals to harness AI’s potential. This paradigm shift is critical for improving patient care, advancing medical innovation, and ensuring the ethical integration of AI in healthcare systems.
Keywords: Healthcare education, Artificial intelligence (AI), AI in healthcare, Medical curricula, AI competency training, Ethical AI in medicine
Introduction
Artificial Intelligence (AI) is transforming healthcare by reshaping clinical decision-making, diagnostic imaging, and patient care [1, 2]. Deep learning algorithms, for instance, have demonstrated remarkable performance in tasks such as image classification, leading to improved diagnostic accuracy [3]. In imaging applications, AI contributes to enhanced reconstruction, motion correction, and automatic detection of clinical features, as demonstrated in cardiac computed tomography [4]. AI is also reshaping oncology and radiology through applications such as tissue phenotyping and integration with next-generation sequencing [5, 6].
Beyond diagnostics, AI plays an increasingly important role in patient monitoring and care delivery. For example, AI-enabled implantable sensors and digital health platforms are enabling real-time, remote, and personalised health interventions aligned with the Healthcare 5.0 paradigm [7, 8]. Applications also extend into mental health and chronic disease management, where AI supports improved access, triage, and tailored feedback [9, 10].
Despite these innovations, healthcare professionals often lack the necessary training to engage critically and effectively with AI technologies [2, 11]. This mismatch between rapid technological advancement and current medical training has generated increasing calls to rethink healthcare education [12, 13]. Medical curricula must now be adapted not only to familiarise learners with AI as a tool, but also to equip them with the competencies to evaluate, interpret, and integrate AI into practice responsibly.
This article aims to support this educational transformation by proposing practical strategies for embedding AI into formal medical and healthcare curricula. The focus lies on undergraduate, postgraduate, and continuing professional development programmes. Broader domains such as patient education, informal learning, and public health literacy are outside the scope of this analysis. Drawing on a structured review of literature and case studies, this paper examines current educational gaps, explores innovative teaching methodologies, and outlines a competency-based framework to reimagine healthcare education for an AI-driven future.
Methods
To explore how AI can be meaningfully integrated into healthcare education, we conducted a structured narrative review informed by scoping review principles [14]; however, we did not strictly adhere to PRISMA guidelines. The review synthesised literature on the integration of AI into healthcare education published between 2018 and 2024 across PubMed, Scopus, and ERIC, focusing on curricular design, competency models, pedagogical strategies, and ethical considerations. Key findings were thematically grouped and are presented in the subsequent sections of this paper, which outline (i) current gaps in AI education, (ii) proposed competency frameworks, (iii) implementation strategies, (iv) ethical governance, and (v) implications for continuous professional development.
Literature was identified through electronic searches of PubMed, Scopus, and ERIC (Education Resources Information Center). Searches were conducted between January 15 and February 10, 2025, and included studies published between 2018 and 2024. An update was performed on July 20, 2025 to capture any newly added studies. Search terms included combinations of:
“artificial intelligence” AND “medical education”.
“AI curriculum” OR “healthcare training” AND “ethics” OR “competency”.
“AI literacy” AND “teaching strategy” OR “problem-based learning” OR “simulation”.
“AI in clinical education” OR “professional development”.
Reference lists of included articles were also hand-searched to identify additional relevant studies.
Included sources met the following criteria:
Published in English between 2018 and 2024;
Focused on AI in formal healthcare education (undergraduate, graduate, or CPD);
Described empirical results, conceptual frameworks, curricular initiatives, or pedagogical strategies.
Articles were excluded if they focused solely on patient education, dealt exclusively with the technical aspects of AI (e.g., algorithm development) without educational context, or were opinion pieces without substantive discussion of curricula.
After initial retrieval of 122 articles, the author screened titles and abstracts to assess relevance. A final set of 67 articles was selected for full-text analysis. Thematic synthesis was employed to categorise findings into key dimensions of AI in healthcare education, including curriculum design, competency models, ethical integration, and professional development.
Case studies and real-world examples were extracted and integrated into the narrative to illustrate implementation strategies and measurable outcomes.
In addition to peer-reviewed studies, we also reviewed a purposive sample of curriculum descriptions, institutional AI education programmes, and case-based learning initiatives identified in the grey literature and on academic websites. The reviewed materials included AI-focused modules, interdisciplinary hackathons, and simulated clinical learning environments. Altogether, we examined 6 distinct educational initiatives spanning North America, Europe, and Asia (cf. A new paradigm for AI education in healthcare section). These initiatives were selected for their innovative use of AI technologies, relevance to professional healthcare education, and the availability of reported learning outcome data.
The gap in traditional medical education
Thematic synthesis of the 67 included studies identified recurrent themes that frame the ensuing discussion: (i) curricular gaps and barriers (overload, limited faculty expertise, lack of standard frameworks); (ii) competency framing (consumer–translator–developer tiers); (iii) pedagogical approaches (PBL, simulation, case-based learning, design thinking, multimodal/AI-supported learning); (iv) institutional readiness (faculty development, cross-disciplinary delivery, academic–industry–government partnerships); (v) ethics and governance (bias/fairness, transparency, accountability, data protection); and (vi) continuous professional development (lifelong learning pathways and critical appraisal of AI tools). Table 1 summarises these themes and their indicative prevalence across the literature.
Table 1.
Themes from the thematic synthesis of included studies (2018–2024)
| Theme | Brief description | Indicative prevalence | Illustrative references |
|---|---|---|---|
| Curricular gaps and barriers | Curriculum overload; lack of faculty AI expertise; absence of standardised frameworks; uneven global adoption. | High | Ng et al. [12]; Charow et al. [15]; Grunhut et al. [16]; Pucchio et al. [17]; Mikeladze et al. [18]. |
| Competency framing | Three-tier learner profiles (consumer, translator, developer) guiding depth and assessment. | High | Ng et al. [12]; Laupichler et al. [19]; Russell et al. [20]. |
| Pedagogical approaches | PBL, simulation, case-based learning, design thinking; AI-supported tutoring for self-directed learning. | High | Benedict [21]; Benedict [22]; Hmelo-Silver [23]; Cain and Rajan [24]; Hui et al. [25]; Lin and Chang [26]. |
| Institutional readiness | Faculty development, interprofessional co-teaching, global networks, partnerships. | Moderate–High | Mah et al. [27]; Garas et al. [28]; Liaw et al. [29]. |
| Ethics and governance | Bias/fairness, explainability, accountability, privacy/data governance; embedding ethics longitudinally. | High | Amini et al. [30]; Ferrara [31]; Russell et al. [20]; Lysaght et al. [32]; Murdoch [33]. |
| Continuous professional development (CPD) | Personalised/adaptive learning; critical appraisal; alignment with clinical realities. | Moderate | Lin et al. [34]; Rubin [35]; Zuhair et al. [36]; Lee et al. [37]. |
Current medical curricula are not adequately preparing future healthcare professionals for the growing integration of AI into clinical practice. As noted by Li and Qin [38], this disconnect has led to a significant deficit in AI literacy among healthcare professionals, which in turn undermines the adoption and impact of AI innovations in patient care. Despite AI’s increasing relevance across diagnostic, prognostic, and therapeutic domains, most clinicians reported a lack of exposure to AI principles during their formal training [17].
This educational gap persists despite mounting awareness of AI’s importance in medicine. Surveys across various regions confirm that while medical students recognise the transformative potential of AI, they often feel unprepared to engage with these tools in clinical contexts [16, 39, 40]. For instance, in a national Canadian study, 85% of medical students reported receiving no formal training in AI, despite 94% believing AI would play a significant role in their future careers [17]. These findings suggest a critical misalignment between medical training and technological advancements.
Multiple structural and institutional barriers hinder the integration of AI education into healthcare training. Curriculum overload, limited faculty expertise, lack of institutional support, and the absence of standardised frameworks are among the most commonly cited challenges [12, 15]. Furthermore, healthcare educators face challenges in keeping pace with the rapid evolution of AI technologies, which often necessitate continuous content updates and collaboration across disciplines [18].
Another significant issue is the lack of validated AI competence frameworks tailored for healthcare education. As Mikeladze et al. [18] emphasise, most educators operate without clear guidelines on what competencies to teach, at what depth, and through which pedagogical methods. This results in fragmented, ad hoc efforts that fail to address the full spectrum of AI-related knowledge and skills necessary for clinical practice. Compounding this issue is the insufficient focus on the foundational quantitative, computational, and data science concepts underpinning AI systems. Noack and Reyes [41] argue that educational programmes often prioritise user-friendly tools while neglecting the mathematical and statistical underpinnings of AI. This superficial exposure limits students’ ability to critically assess the outputs of AI models or engage with their development. Blanco-González et al. [42] also highlight the academic challenges of integrating AI-assisted outputs, such as LLM-generated draft text, code suggestions, and model-derived figures, into scholarly work, noting that such materials typically require substantial human curation and revision to meet scientific standards.
AI education also remains unevenly distributed across geographic regions, institutional types, and medical specialities. While leading academic centres in Europe, North America, and Asia have launched AI modules and interdisciplinary initiatives, many institutions, particularly in low- and middle-income countries, still lack the infrastructure or policy support to do so [15, 43]. This exacerbates existing disparities in training quality and workforce preparedness.
Recommendations to close this gap include adopting a tiered competency model, which categorises learners as AI “consumers,” “translators,” or “developers” based on the expected level of interaction with AI technologies [12]. This approach allows for differentiated learning pathways tailored to learners’ professional roles and prior knowledge. Early integration of AI into the core curriculum, ideally at the undergraduate level, has also been advocated [44], with AI being framed as an essential component of modern evidence-based medicine [16].
However, implementing such changes requires substantial curricular reform and faculty development. Educators require support in designing interdisciplinary, competency-based training modules that bridge medical knowledge and AI proficiency. Moreover, without institutional commitment and investment in AI infrastructure, both human and technological, these efforts risk remaining isolated or unsustainable.
In sum, the persistent disconnect between current healthcare education and the realities of AI-driven clinical environments demands urgent and systemic attention. Addressing this gap is not merely a technical challenge but a curricular and organisational imperative. As AI continues to reshape healthcare delivery, the ability of medical professionals to engage meaningfully with these technologies will increasingly define the quality, equity, and effectiveness of care.
A new paradigm for AI education in healthcare
To prepare healthcare professionals for the emerging realities of AI-enhanced clinical practice, a fundamental shift in medical education is required. This new paradigm must move beyond sporadic exposure to digital tools and instead offer structured, longitudinal, and competency-based training that reflects the complexity of AI applications in healthcare [11, 12].
Rethinking competency: the consumer–translator–developer model
A widely cited framework proposes a three-tier classification of AI engagement for healthcare professionals: consumers, translators, and developers [12, 19]. Table 2 outlines the core learning objectives and assessment strategies associated with the three-tier AI competency framework proposed by Ng et al. [12] for differentiated integration of AI into healthcare education. This model supports differentiated learning trajectories that are aligned with the specific responsibilities and needs of various professional profiles:
Consumers (e.g., general practitioners, nurses) should be equipped to interpret AI outputs critically, understand model limitations, and apply AI-generated insights to patient care.
Translators (e.g., clinical informaticians, physician–data science liaisons) act as intermediaries between developers and clinicians, helping integrate AI tools into workflows while ensuring clinical relevance.
Developers (e.g., clinician scientists with programming expertise) require technical training to design, evaluate, and validate AI models using healthcare data.
Table 2.
AI competency profiles in healthcare education
| Category | Learning objectives | Suggested assessment methods |
|---|---|---|
| Consumer | • Identify clinical tasks appropriate for AI assistance | • Multiple-choice questions (MCQs) |
| | • Interpret AI-generated risk scores or predictions | • OSCE station with AI tool interpretation |
| | • Recognise limitations and biases in AI outputs | • Structured reflection |
| Translator | • Critically evaluate the clinical relevance of AI models | • Case-based discussions |
| | • Communicate AI results to patients or interdisciplinary teams | • Role-play or communication OSCE |
| | • Propose workflows for integrating AI | • Reflective portfolios |
| Developer | • Implement basic predictive models using healthcare data | • Project-based assignments |
| | • Validate models against clinical metrics | • Code walkthroughs |
| | • Explain algorithmic design choices | • Oral examination of model logic |
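To make the developer-tier objective "validate models against clinical metrics" concrete, the following is a minimal sketch, not taken from the article, of how a project-based assignment might ask students to compute sensitivity and specificity for a model's risk predictions. The function name, threshold, and patient data are illustrative assumptions.

```python
# Illustrative sketch: validating a risk model's outputs against clinical
# metrics (sensitivity and specificity), as a developer-tier exercise might.

def sensitivity_specificity(y_true, y_prob, threshold=0.5):
    """Compute sensitivity and specificity for binary risk predictions.

    y_true: ground-truth labels (1 = disease present, 0 = absent)
    y_prob: model-predicted risk scores in [0, 1]
    """
    tp = sum(1 for t, p in zip(y_true, y_prob) if t == 1 and p >= threshold)
    fn = sum(1 for t, p in zip(y_true, y_prob) if t == 1 and p < threshold)
    tn = sum(1 for t, p in zip(y_true, y_prob) if t == 0 and p < threshold)
    fp = sum(1 for t, p in zip(y_true, y_prob) if t == 0 and p >= threshold)
    sens = tp / (tp + fn) if (tp + fn) else 0.0
    spec = tn / (tn + fp) if (tn + fp) else 0.0
    return sens, spec

# Hypothetical model outputs for eight patients
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_prob = [0.9, 0.8, 0.4, 0.7, 0.2, 0.6, 0.1, 0.3]

sens, spec = sensitivity_specificity(y_true, y_prob, threshold=0.5)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```

An assessment built on such a sketch pairs naturally with the "code walkthrough" method in Table 2: students can be asked to justify the threshold choice and discuss how lowering it trades specificity for sensitivity.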
Pedagogical strategies and curriculum integration
Moving from theory to practice, various institutions have developed AI-integrated curricula that illustrate feasible and scalable approaches to teaching AI in medical settings. The following examples from Europe, North America, and Asia illustrate diverse formats, institutional contexts, and pedagogical strategies:
Stanford Medicine’s AI Clinical Coach (AI CliC), which uses AI to support metacognitive development through reflective clinical reasoning1;
The Charité “AI in Medicine” elective2, which incorporates ethical and technical content into clinical modules;
The NUS Healthcare AI Expo & Datathon, which promotes interdisciplinary collaboration through hands-on prototype development3;
The AiM-PC curriculum by the Society of Teachers of Family Medicine, which offers modular, CME-accredited training for primary care professionals4;
Stanford Online’s AI in Healthcare Certificate, a professional development course covering practical tools, clinical use cases, and implementation strategies5;
Guangdong Medical University’s AI Medical School, launched in 2025 as China’s first dedicated artificial intelligence medical school, which aims to cultivate AI-literate healthcare professionals through a fully integrated virtual–real learning environment6.
A promising strategy is to integrate AI content longitudinally across the medical curriculum, rather than isolating it in elective modules or standalone courses. Embedding AI concepts into preclinical and clinical subjects, such as radiology, pathology, pharmacology, and public health, enables learners to encounter AI in real-world contexts [16, 44]. For instance, students may be asked to critically assess AI-generated risk scores during case-based learning exercises or consider AI-supported imaging tools in diagnostic simulations [45].
This integrated approach allows students to engage with AI as both a tool and a subject of analysis, building their capacity to interpret, evaluate, and apply AI outputs responsibly. The aim is not to transform all medical students into AI developers, but to ensure they acquire sufficient literacy to collaborate with technical teams, advocate for patient interests, and identify the clinical implications of emerging technologies.
Multimodal learning and educational technology
Pedagogical innovation is central to this paradigm shift. A mix of active learning methods, including problem-based learning (PBL), simulations, design thinking, and case-based analysis, has demonstrated strong potential to foster critical thinking, adaptability, and interprofessional collaboration [22, 23, 46].
For example, PBL supported by AI-enhanced virtual patients has been found to increase engagement and learning retention among pharmacy and medical students [21]. In simulation-based studies, AI-driven diagnostic tools have enhanced clinical reasoning and decision-making skills [22, 47]. These results suggest that combining AI content with experiential learning enhances both conceptual understanding and real-world readiness.
Design thinking, which involves iterative problem-solving through the phases of inspiration, ideation, and implementation, has also proven effective in developing AI competencies. Lin and Chang [26] demonstrated that STEAM-based design thinking instruction enhanced both creativity and AI conceptualisation among European university students. Similarly, the HPI Design Thinking Studio in Digital Health offers a 22-day interdisciplinary programme where students, clinicians, and engineers co-develop AI-based healthcare solutions grounded in patient needs.
AI also enables personalised and adaptive learning. Large language models, such as ChatGPT, can serve as virtual tutors, facilitating self-directed learning and providing context-specific feedback [24, 25]. These tools can support students in reviewing material, generating clinical scenarios, or exploring decision trees, thus reinforcing both domain knowledge and digital fluency [48].
Institutional readiness and faculty development
Embedding AI into medical education also requires institutional investment in infrastructure, faculty training, and curriculum reform. Many educators currently lack expertise in AI concepts, pedagogy, or tool evaluation [18]. Faculty development programmes must address this gap by equipping instructors with the skills to teach AI fundamentals and collaborate across disciplines [27].
Collaborative networks, within and between institutions, are crucial for pooling resources, sharing best practices, and maintaining curricular relevance. Studies suggest that open, geographically distributed academic networks foster more innovation than closed, localised ones [28]. Hackathons, co-taught modules, and international exchange programmes can accelerate knowledge transfer and pedagogical experimentation.
Moreover, integrating AI education into existing competency frameworks ensures alignment with accreditation standards and national benchmarks. The inclusion of AI within evidence-based medicine (EBM), clinical reasoning, or digital health competencies helps normalise its presence and reinforce its legitimacy within core training structures [20, 29].
These strategies, ranging from curriculum integration and interdisciplinary team teaching to experiential learning, represent key levers for embedding AI education into healthcare training. Figure 1 illustrates these core pedagogical pillars, synthesising the foundational elements of a reimagined AI-ready curriculum.
Fig. 1.
Core pedagogical elements for integrating AI into healthcare education: interdisciplinary collaboration, competency-based design, curriculum integration, and innovative teaching strategies. These pillars reflect the practical foundations for building an AI-literate healthcare workforce
Practical approaches to AI education
A multifaceted teaching approach that combines active and passive learning strategies, such as PBL and simulations, and also integrates AI as an educational tool (e.g., ChatGPT as a virtual tutor [24, 25], AI-driven diagnostic algorithms used in PBL case studies [45, 47]) fosters an effective learning environment for healthcare students [21]. As AI continues to advance, integrating AI-based technologies, such as chatbots, into these educational strategies may further enhance student motivation and learning outcomes, particularly when supported by instructors [49].
PBL has proven highly effective in healthcare education by enhancing student learning and problem-solving skills. It enables students to learn course content by solving real-world problems, fostering flexible knowledge acquisition, self-directed learning, collaboration skills, critical thinking, and intrinsic motivation [23]. In a study by Benedict [21], students working with AI-enhanced virtual patients reported higher engagement and knowledge-retention scores than those in traditional case-based sessions. Similarly, Dubovi [22] demonstrated that students trained using AI-based simulation modules achieved significantly higher clinical reasoning scores in post-intervention assessments. In nursing education, PBL has demonstrated significant positive effects on teamwork, problem-solving, and critical thinking skills, which are vital for healthcare professionals [46].
Simulations, especially when combined with PBL, provide a safe and controlled environment where students can apply theoretical knowledge to practical scenarios. Online computer-based simulations have become increasingly prevalent in nursing education, improving students’ clinical reasoning skills [22]. Integrating high-quality learning objects such as interactive web pages and video-based reflections has increased student satisfaction, exam performance, and self-efficacy in nursing tasks [50]. Additionally, the use of AI-powered applications in language learning has shown promise, with students expressing positive perceptions of their utility in educational contexts [51]. Indeed, in today’s increasingly globalised medical education landscape, many students train away from their home countries and conduct their studies in English, even if it is not their native language. This makes professional English proficiency essential for clinical communication and academic success. Consequently, AI-driven language-learning tools remain relevant within AI-integrated curricula, particularly in supporting English as a Foreign Language (EFL) learners in healthcare contexts. For instance, a 2024 study found that AI-powered EFL platforms enhanced pronunciation, vocabulary retention, and metacognitive strategies among users by providing adaptive, real-time feedback and fostering self-regulated learning [52].
Real-world case studies play a pivotal role in contextualising AI applications in healthcare, offering valuable insights into the practical implementation and impact of these technologies. Educators and policymakers can identify critical success factors and challenges by examining successful AI implementations and shaping strategies for future adoption [53]. One effective method involves categorising case studies based on key factors such as policy setting, technological implementation, and measurement of medical and economic impact. This structured framework highlights essential elements, such as privacy-focused technology infrastructure and quantifiable impact metrics, which are crucial for assessing the real-world value of AI technologies [53].
Despite the demonstrated efficacy of AI in healthcare research, a significant gap remains between theoretical advancements and their practical applications in preventive, diagnostic, and therapeutic contexts [53]. Bridging this gap requires systematically analysing real-world case studies to derive actionable insights into best practices, challenges, and success factors for AI implementation. Such studies can inform policy development, technological design, and impact assessment strategies, ultimately translating scientific innovation into real-world healthcare improvements [53, 54].
The practical integration of AI into healthcare education requires a balanced approach that combines established pedagogical strategies with innovative AI-driven solutions. By leveraging techniques such as PBL, simulations, and real-world case studies alongside emerging AI technologies, educational institutions can provide healthcare students with a dynamic and engaging learning experience. This approach enhances technical and problem-solving skills, ensuring that students are prepared to navigate the ethical and practical challenges of implementing AI in clinical practice. Collaboration among educators, AI experts, and policymakers is essential to create a responsible and practical framework for AI education, ultimately bridging the gap between innovation and practical application.
The role of AI in continuous professional development
Artificial intelligence (AI) has the potential to transform healthcare by reshaping how clinicians are trained to diagnose, treat, and manage care using advanced digital tools. Beyond formal education, AI is also playing a pivotal role in lifelong learning and continuous professional development (CPD), enabling adaptive, data-informed training tailored to the evolving needs and contexts of healthcare professionals.
One of AI’s most significant contributions to CPD is its capacity to deliver personalised learning experiences tailored to individual clinicians’ learning styles, preferences, and knowledge gaps [34]. By analysing performance data and learning patterns, AI systems can optimise content delivery, enhancing learning outcomes and ensuring educational interventions are both efficient and impactful. Ultimately, this personalisation supports higher-quality care delivery to patients.
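The personalisation described above ultimately reduces to analysing performance data and prioritising content. As a minimal sketch, not drawn from the article or any cited platform, an adaptive CPD step might surface the topics where a clinician's recent scores are weakest; the topic names and scores below are hypothetical.

```python
# Illustrative sketch: a naive adaptive-learning step that prioritises CPD
# topics where a clinician's recent assessment scores are lowest.

def next_topics(scores, k=2):
    """Return the k topics with the weakest average performance."""
    averages = {topic: sum(s) / len(s) for topic, s in scores.items()}
    # Sort topics by ascending average score: weakest first
    return sorted(averages, key=averages.get)[:k]

# Hypothetical performance data (assessment scores out of 1.0)
scores = {
    "AI model limitations": [0.4, 0.5],
    "Sepsis risk scores":   [0.9, 0.8],
    "Imaging AI triage":    [0.6, 0.7],
}

print(next_topics(scores))  # weakest topics first
```

Real CPD platforms layer far richer signals (engagement, spaced repetition, clinical context) on top of this idea, but the core loop, measure performance and reprioritise content, is the same.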
AI-enabled platforms also offer data-driven insights into professionals’ performance, engagement levels, and even emotional states, enabling educators to adapt their teaching strategies and deliver targeted support [34]. Such continuous feedback mechanisms enable clinicians to identify areas for improvement and stay current with the latest advancements and best practices.
In specialised areas such as diabetes education, AI technologies facilitate tailored educational interventions that address individual characteristics and needs [55]. This customised approach supports healthcare professionals in managing complex, chronic conditions and contributes to improved patient outcomes and quality of life. Similarly, the application of AI tools in mental health care and neurobiological research, fields where AI adoption has been limited thus far, holds promise in alleviating workforce shortages while enhancing diagnostic and treatment capabilities [37, 56].
Nevertheless, the assistive role of AI is not guaranteed and may be compromised under financial or workforce constraints. For example, start-ups such as Sword Health7 reportedly aim to increase clinician caseloads from approximately 200–300 to 700 patients per provider through AI-driven automation for messaging and triage. Hinge Health8 claims to have reduced clinician therapy time by up to 95% by leveraging AI and computer vision. While such automation may increase access and lower costs, it also risks clinician deskilling, reduced patient interaction, and prioritisation of throughput over care quality. These concerns underscore the necessity for robust institutional policies, effective regulatory oversight, and professional standards to ensure that AI continues to augment, rather than replace, clinical practice.
To remain proficient in an AI-enabled clinical landscape, healthcare professionals should (1) assess the clinical need for AI tools in their own practice, (2) conduct formal evaluations of such tools before adoption, and (3) use AI as a support mechanism, not a substitute for professional judgement [35].
Engagement in AI-focused CPD across domains such as diagnostics, prognostics, and hospital operations is also vital [36]. Furthermore, active involvement in AI research and development can ensure that technologies evolve in alignment with clinical realities.
AI is reshaping healthcare through innovation in diagnosis, treatment recommendations, and patient engagement [57]. To fully harness these benefits, clinicians must not only develop fluency with AI tools but also learn to assess their relevance, limitations, and context-specific value [35]. Given that AI development is still primarily led by computer scientists, engineers, and entrepreneurs, with limited input from healthcare practitioners, particularly in radiology, greater clinical engagement is essential to ensure that implementation is both safe and clinically meaningful [35].
AI-powered clinical decision support tools exemplify how AI can enhance professional judgement by streamlining information synthesis and supporting more accurate diagnoses and treatment planning. However, to ensure their responsible use, healthcare professionals must receive targeted training that addresses the ethical, social, and technical dimensions of these systems [20, 32, 58]. This includes understanding the implications of fairness, transparency, and patient autonomy, competencies that are critical for integrating AI tools effectively into clinical workflows [20]. Moreover, as AI technologies continue to evolve, their role is best conceived as augmentative rather than substitutive. Properly implemented, AI has the potential to relieve clinicians from routine administrative tasks, thereby allowing more time for compassionate, patient-centred care [59]. Yet, realising this potential depends on careful alignment between technological design and professional development strategies that foreground ethical considerations such as privacy, bias mitigation, and explainability [37] (cf. Ethics and governance section).
In summary, AI presents transformative opportunities for CPD in healthcare. When supported by ethical safeguards and collaborative frameworks, AI can enhance clinical knowledge, promote informed decision-making, and ultimately advance medical care.
Ethics and governance
The integration of artificial intelligence into healthcare education and clinical practice introduces complex ethical and governance challenges that must be addressed to ensure responsible use. As AI systems increasingly influence diagnostic pathways, treatment decisions, and workflow design, healthcare professionals must possess a foundational understanding of their ethical implications [20, 58].
Core ethical dimensions
Four major ethical dimensions must be considered in the context of AI in healthcare:
Bias and Fairness: AI systems trained on non-representative or biased datasets risk perpetuating or amplifying existing disparities in healthcare access and outcomes. Bias can enter at various stages such as data collection, algorithm development, model validation, and deployment [31, 60]. Fairness-aware design and routine bias audits are necessary, and clinicians must be trained to critically assess AI outputs for potential inequities. Vulnerable populations, such as older adults, are particularly at risk of harm when AI-based decisions fail to account for individual values, cultural contexts, or communication barriers. If left unchecked, algorithmic bias may compound health inequities or erode trust. However, there is evidence that well-designed AI tools can also enhance understanding and shared decision-making by providing clear, personalised insights, thereby increasing patient trust in care pathways [61, 62].
Transparency and Explainability: Many AI systems operate as opaque “black boxes,” hindering clinical trust and regulatory approval. Healthcare professionals must be able to interpret and explain AI-generated recommendations, particularly in high-stakes settings. Explainable AI (XAI) is an expanding field that aims to make algorithmic decisions interpretable. Training clinicians to critique rather than unquestioningly accept AI outputs is crucial for preserving professional autonomy [3, 63].
Accountability and Liability: When AI systems lead to error or harm, the question of legal responsibility, whether attributed to developers, clinicians, or institutions, remains unresolved in many jurisdictions. Medical education should prepare learners to engage with complex scenarios involving shared accountability and legal ambiguity, enabling them to manage risk proactively [32, 60].
Privacy and Data Governance: The large-scale collection of personal health data required to train AI systems raises pressing concerns regarding consent, data minimisation, third-party use, and cybersecurity. While the European General Data Protection Regulation (GDPR) offers a legal foundation for data governance, ethical AI education must also emphasise a principled understanding of privacy, digital rights, and data ownership [30, 33].
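To make the "routine bias audits" mentioned above concrete, the sketch below illustrates one narrow fairness check, comparing a model's sensitivity (true positive rate) across demographic groups, a criterion often called equal opportunity. The function names and data are invented for illustration only; a real audit would cover multiple metrics, larger cohorts, and clinically meaningful subgroups.

```python
# Minimal sketch of a subgroup bias audit for a binary classifier.
# Assumes per-patient labels, predictions, and a demographic attribute.
# All data below are illustrative, not real clinical data.

def sensitivity(y_true, y_pred):
    """True positive rate: of the truly positive cases, how many were flagged."""
    positives = [(t, p) for t, p in zip(y_true, y_pred) if t == 1]
    if not positives:
        return None  # no positive cases in this subgroup
    return sum(p for _, p in positives) / len(positives)

def audit_by_group(y_true, y_pred, groups):
    """Per-group sensitivity plus the largest pairwise gap (equal-opportunity check)."""
    per_group = {}
    for g in set(groups):
        idx = [i for i, gg in enumerate(groups) if gg == g]
        per_group[g] = sensitivity([y_true[i] for i in idx],
                                   [y_pred[i] for i in idx])
    rates = [r for r in per_group.values() if r is not None]
    gap = max(rates) - min(rates) if rates else None
    return per_group, gap

# Illustrative data: the model misses more positive cases in group "B".
y_true = [1, 1, 0, 1, 1, 0, 1, 1]
y_pred = [1, 1, 0, 1, 0, 0, 0, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
per_group, gap = audit_by_group(y_true, y_pred, groups)
# Here group A has sensitivity 1.0 but group B only 1/3: a gap this
# large would trigger further investigation before deployment.
```

Even a simple check of this kind, run routinely on deployed models, gives clinicians and governance committees a concrete artefact to review, rather than relying on vendors' aggregate accuracy claims.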
Embedding ethics into AI education
Ethics should not be confined to isolated modules within the curriculum. Instead, ethical considerations must be integrated both vertically and horizontally across AI education. This approach helps learners develop applied ethical reasoning and understand the balance between technological innovation and clinical responsibility.
Key strategies include:
Interdisciplinary Team Teaching: Involving educators from medicine, ethics, law, computer science, and the social sciences to reflect the interdisciplinary collaboration required in real-world AI development and governance [64].
Case-Based Discussions: Using scenarios that simulate ethical dilemmas, such as algorithmic triage, data use, or AI-driven decision-making, to promote critical reflection and moral reasoning.
Reflective Assignments and Portfolios: Encouraging students to document their interactions with AI tools, both in simulations and clinical environments, helps build ethical awareness and professional accountability.
Familiarity with Regulatory Frameworks: Students should be introduced to key legal instruments, including the GDPR, the EU AI Act, and health-specific data protection regulations. These frameworks serve as essential anchors for discussing accountability, transparency, and trust in AI [30, 65].
Faculty development and institutional support
Effective integration of AI ethics also requires investment in faculty training. Many healthcare educators lack formal expertise in AI technologies and digital ethics. Institutions should offer professional development programmes covering:
Core AI principles and limitations;
Ethical design and evaluation of AI tools;
Data protection regulations and compliance;
Inclusive and bias-conscious pedagogical strategies.
Beyond training, institutions should establish governance mechanisms to ensure responsible use of AI within education. Tools such as ChatGPT or adaptive learning systems must be assessed not only for educational value but also for their risks concerning data collection, autonomy, and equity [66, 67]. This requires transparent procurement practices, oversight committees with multidisciplinary representation, and institution-wide innovation policies grounded in professional values and a commitment to public trust.
In conclusion, ethical engagement with AI is not an optional skill; it is foundational. Embedding ethics and governance throughout medical curricula, investing in faculty capacity, and implementing institutional safeguards are essential steps towards cultivating healthcare professionals who are both technologically fluent and morally grounded. Only then can the promise of AI enhance, rather than undermine, the human element at the core of medical practice.
Conclusion: a roadmap to reimagining healthcare education
AI is transforming clinical practice, yet medical education has not kept pace. As this manuscript has shown, embedding AI into healthcare education requires not only curricular reform but also ethical preparedness, institutional collaboration, and tailored professional development [12, 16, 17].
To guide this transformation, Fig. 2 presents a conceptual framework structured around four foundational pillars, each supported by targeted implementation strategies:
Fig. 2.
Strategic framework for integrating AI into healthcare education, comprising four pillars: curriculum integration, cross-sector collaboration, ethical and regulatory preparedness, and professional development
Strategic roadmap for action
Table 3 outlines how this model can be operationalised within healthcare education, detailing the competencies, assessment methods, and delivery formats tailored to each learner profile.
Table 3.
Curriculum structure for AI integration in healthcare education
| Tier | Target learner role | Learning objectives | Assessment methods | Delivery format |
|---|---|---|---|---|
| Consumer | All medical students, generalists | Interpret AI outputs, recognise limitations, apply responsibly | MCQs, OSCEs, structured reflections | Embedded into clinical case studies and PBL |
| Translator | Clinical leaders, informaticians | Communicate model outputs, evaluate relevance, guide implementation | Role plays, peer feedback, implementation plans | CPD modules, interdisciplinary simulation workshops |
| Developer | Research-oriented clinicians, scientists | Build and validate AI models using healthcare data | Capstone projects, code reviews, oral examinations | AI electives, summer schools, thesis-based projects |
By following this roadmap, institutions can prepare healthcare professionals to critically engage with AI technologies while upholding patient-centred values and professional responsibility.
To advance the field, several research questions warrant further investigation:
What is the optimal timing and sequence for integrating AI competencies into medical curricula?
Which assessment tools most effectively measure both technical and ethical AI competence in healthcare trainees?
How can faculty development programs best support interdisciplinary AI instruction across medicine, computer science, and ethics?
Answering these questions is essential to operationalising this framework and ensuring AI integration in healthcare education is evidence-informed, ethically grounded, and globally scalable.
Acknowledgements
A special thank you is directed to the NMS for motivating and supporting this manuscript.
Abbreviations
- AI
Artificial Intelligence
- ChatGPT
Chat Generative Pre-trained Transformer
- CPD
Continuous Professional Development
- EBM
Evidence-Based Medicine
- EFL
English as a Foreign Language
- ERIC
Education Resources Information Center
- GDPR
General Data Protection Regulation
- OSCE
Objective Structured Clinical Examination
- PBL
Problem-Based Learning
- UK
United Kingdom
- XAI
Explainable AI
Authors’ contributions
JMM is responsible for designing, drafting and revising the work.
Funding
The present publication was funded by Fundação para a Ciência e a Tecnologia, I.P., through national support to UID/04923 - Comprehensive Health Research Centre.
Data availability
No datasets were generated or analysed during the current study.
Declarations
Ethics approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Competing interests
The author declares no competing interests.
Footnotes
https://med.stanford.edu/edtech/portfolio/ai-clinical-coach.html, accessed July 18, 2025.
https://volkamerlab.org/education/ai-in-medicine, accessed July 18, 2025.
https://medicine.nus.edu.sg/dbmi/events/singapore-healthcare-ai-datathon-expo, accessed July 18, 2025.
https://stfm.org/teachingresources/curriculum/aim-pc/aiml_curriculum, accessed July 18, 2025.
https://online.stanford.edu/programs/artificial-intelligence-healthcare, accessed July 18, 2025.
https://www.chinadaily.com.cn/a/202507/02/WS6864833fa31000e9a5739a62.html, accessed July 18, 2025.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.Varnosfaderani SM, Forouzanfar M. The role of AI in hospitals and clinics: transforming healthcare in the 21st century. Bioengineering. 2024. 10.3390/bioengineering11040337.
- 2.Kolomenskaya E, Butakova M, Poltavskiy A, Soldatov A, Butova V. Application of artificial intelligence at all stages of bone tissue engineering. Biomedicines. 2023;12(1):76. 10.3390/biomedicines12010076.
- 3.Nazir S, Dickson DM, Akram MU. Survey of explainable artificial intelligence techniques for biomedical imaging with deep neural networks. Comput Biol Med. 2023;156:106668. 10.1016/j.compbiomed.2023.106668.
- 4.Tatsugami F, Nakaura T, Yanagawa M, Fujita S, Kamagata K, Ito R, et al. Recent advances in artificial intelligence for cardiac CT: enhancing diagnosis and prognosis prediction. Diagn Interv Imaging. 2023;104(11):521–8. 10.1016/j.diii.2023.06.011.
- 5.Dlamini Z, Francies FZ, Hull R, Marima R. Artificial intelligence (AI) and big data in cancer and precision oncology. Comput Struct Biotechnol J. 2020;18(4):2300–11. 10.1016/j.csbj.2020.08.019.
- 6.Oikonomou EK, Siddique M, Antoniades C. Artificial intelligence in medical imaging: a radiomic guide to precision phenotyping of cardiovascular disease. Cardiovasc Res. 2020;116(13):2040–54. 10.1093/cvr/cvaa021.
- 7.Yogev D, et al. Current state of the art and future directions for implantable sensors in medical technology: clinical needs and engineering challenges. APL Bioeng. 2023. 10.1063/5.0152290.
- 8.Adibi S, Shojaei D, Rajabifard A, Wickramasinghe N. Enhancing healthcare through sensor-enabled digital twins in smart environments: a comprehensive analysis. Sensors. 2024;24(9):2793. 10.3390/s24092793.
- 9.Saqib M, et al. Artificial intelligence in critical illness and its impact on patient care: a comprehensive review. Front Med. 2023. 10.3389/fmed.2023.1176192.
- 10.Talyshinskii A, Naik N, Hameed BMZ, Juliebø-Jones P, Somani BK. Potential of AI-driven chatbots in urology: revolutionizing patient care through artificial intelligence. Curr Urol Rep. 2024;25(1):9–18. 10.1007/s11934-023-01184-3.
- 11.Moldt JA, Festl-Wietek T, Mamlouk AM, Nieselt K, Fuhl W, Herrmann-Werner A. Chatbots for future docs: exploring medical students’ attitudes and knowledge towards artificial intelligence and medical chatbots. Med Educ Online. 2023. 10.1080/10872981.2023.2182659.
- 12.Ng FYC, Thirunavukarasu AJ, Cheng H, Tan TF, Gutierrez L, Lan Y, et al. Artificial intelligence education: an evidence-based medicine approach for consumers, translators, and developers. Cell Rep Med. 2023. 10.1016/j.xcrm.2023.101230.
- 13.Paranjape K, Nanayakkara P, Panday RN, Car J, Schinkel M. Introducing artificial intelligence training in medical education. JMIR Med Educ. 2019;5(2):e16048. 10.2196/16048.
- 14.Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol Theory Pract. 2005;8. 10.1080/1364557032000119616.
- 15.Charow R, Salhia M, Lalani N, Dolatabadi E, Tripp T, Peteanu W, et al. Artificial intelligence education programs for health care professionals: scoping review. JMIR Med Educ. 2021;7(4):e31043. 10.2196/31043.
- 16.Grunhut J, Marques O, Wyatt ATM. Needs, challenges, and applications of artificial intelligence in medical education curriculum. JMIR Med Educ. 2022;8(2):e35587. 10.2196/35587.
- 17.Pucchio A, Rathagirishnan R, Caton N, Nabhen JJ, Papa JD, Lee W, et al. Exploration of exposure to artificial intelligence in undergraduate medical education: a Canadian cross-sectional mixed-methods study. BMC Med Educ. 2022. 10.1186/s12909-022-03896-5.
- 18.Mikeladze T, Meijer PC, Verhoeff RP. A comprehensive exploration of artificial intelligence competence frameworks for educators: a critical review. Eur J Educ. 2024. 10.1111/ejed.12663.
- 19.Laupichler MC, Aster A, Schirch J, Raupach T. Artificial intelligence literacy in higher and adult education: a scoping literature review. Comput Educ Artif Intell. 2022;3:100101. 10.1016/j.caeai.2022.100101.
- 20.Russell RG, Moore D, Craig KJT, Novak LL, Patel M, Miller BM, et al. Competencies for the use of artificial intelligence-based tools by health care professionals. Acad Med. 2022;98(3):348–56. 10.1097/acm.0000000000004963.
- 21.Benedict N. Virtual patients and problem-based learning in advanced therapeutics. Am J Pharm Educ. 2010;74(8):143. 10.5688/aj7408143.
- 22.Dubovi I. Designing for online computer-based clinical simulations: evaluation of instructional approaches. Nurse Educ Today. 2018;69:67–73. 10.1016/j.nedt.2018.07.001.
- 23.Hmelo-Silver CE. Problem-based learning: what and how do students learn? Educ Psychol Rev. 2004;16(3):235–66. 10.1023/b:edpr.0000034022.16470.f3.
- 24.Cain J, Rajan AS. Proof of concept of ChatGPT as a virtual tutor. Am J Pharm Educ. 2024;88(12):101333. 10.1016/j.ajpe.2024.101333.
- 25.Hui Z, Zewu Z, Jiao H, Cui Y. Application of ChatGPT-assisted problem-based learning teaching method in clinical medical education. BMC Med Educ. 2025;25(1):50. 10.1186/s12909-024-06321-1.
- 26.Lin MY, Chang YS. Effects of design thinking STEAM instruction on AI learning and creativity. Int J Technol Des Educ. 2025;14. 10.1007/s10798-025-09977-y.
- 27.Mah D, Knoth N, Egloffstein M. Perspectives of academic staff on artificial intelligence in higher education: exploring areas of relevance. Front Educ. 2025;10:1484904. 10.3389/feduc.2025.1484904.
- 28.Garas G, Patel V, Panzarasa P, Darzi A, Cingolani I, Athanasiou T, et al. Surgical innovation in the era of global surgery: a network analysis. Ann Surg. 2020;271(5):868–74. 10.1097/sla.0000000000003164.
- 29.Liaw W, Kakadiaris I, Lin S, Kueper JK, Bazemore A. Competencies for the use of artificial intelligence in primary care. Ann Fam Med. 2022;20(6):559–63. 10.1370/afm.2887.
- 30.Amini MM, Sheikholeslami DF, Jesus M, Alves P, Benam AH, Hariri F. Artificial intelligence ethics and challenges in healthcare applications: a comprehensive review in the context of the European GDPR mandate. Mach Learn Knowl Extr. 2023;5(3):1023–35. 10.3390/make5030053.
- 31.Ferrara E. Fairness and bias in artificial intelligence: a brief survey of sources, impacts, and mitigation strategies. Sci. 2023;6(1):3. 10.3390/sci6010003.
- 32.Lysaght T, Lim HY, Ngiam KY, Xafis V. AI-assisted decision-making in healthcare. Asian Bioeth Rev. 2019;11(3):299–314. 10.1007/s41649-019-00096-0.
- 33.Murdoch B. Privacy and artificial intelligence: challenges for protecting health information in a new era. BMC Med Ethics. 2021;22(1):122. 10.1186/s12910-021-00687-3.
- 34.Lin CC, Huang AYQ, Lu OHT. Artificial intelligence in intelligent tutoring systems toward sustainable education: a systematic review. Smart Learn Environ. 2023. 10.1186/s40561-023-00260-y.
- 35.Rubin DL. Artificial intelligence in imaging: the radiologist’s role. J Am Coll Radiol. 2019;16(9 Pt B):1309–17. 10.1016/j.jacr.2019.05.036.
- 36.Zuhair V, Rehman LU, Ali R, Oduoye MO, Noor Z, Babar A, et al. Exploring the impact of artificial intelligence on global health and enhancing healthcare in developing nations. J Prim Care Community Health. 2024. 10.1177/21501319241245847.
- 37.Lee EE, Torous J, Choudhury MD, Depp CA, Graham SA, Kim HC, et al. Artificial intelligence for mental health care: clinical applications, barriers, facilitators, and artificial wisdom. Biol Psychiatry Cogn Neurosci Neuroimaging. 2021;6(9):856–64. 10.1016/j.bpsc.2021.02.001.
- 38.Li Q, Qin Y. AI in medical education: medical student perception, curriculum recommendations and design suggestions. BMC Med Educ. 2023. 10.1186/s12909-023-04700-8.
- 39.Sit C, Poon DS, Srinivasan R, Amlani A, Muthuswamy K, Azam A, et al. Attitudes and perceptions of UK medical students towards artificial intelligence and radiology: a multicentre survey. Insights Imaging. 2020;11(1):14. 10.1186/s13244-019-0830-7.
- 40.Rjoop A, Al-Qudah M, Alkhasawneh R, Bataineh N, Abdaljaleel M, Rjoub MA, et al. Awareness and attitude toward artificial intelligence among medical students and pathology trainees: survey study. JMIR Med Educ. 2025;11:e62669. 10.2196/62669.
- 41.Noack MM, Reyes KG. Mathematical nuances of Gaussian process-driven autonomous experimentation. MRS Bull. 2023;48(2):153–63. 10.1557/s43577-023-00478-8.
- 42.Blanco-González A, Seco-González A, Antelo-Riveiro P, Conde-Torres D, Garcia-Fandino R, Cabezón A, et al. The role of AI in drug discovery: challenges, opportunities, and strategies. Pharmaceuticals. 2023;16(6):891. 10.3390/ph16060891.
- 43.Tomašev N, Snyder K, Khan ME, Ezer D, Glasmachers T, Abila G, et al. AI for social good: unlocking the opportunity for positive impact. Nat Commun. 2020. 10.1038/s41467-020-15871-z.
- 44.Ötles E, James CA, Lomis KD, Woolliscroft JO. Teaching artificial intelligence as a fundamental toolset of medicine. Cell Rep Med. 2022;3(12):100824. 10.1016/j.xcrm.2022.100824.
- 45.Abouzeid E, Harris P. Insights gained from using AI to produce cases for problem-based learning. Proceedings. 2025;114(1):5. 10.3390/proceedings2025114005.
- 46.Chan ZCY. Role-playing in the problem-based learning class. Nurse Educ Pract. 2011;12(1):21–7. 10.1016/j.nepr.2011.04.008.
- 47.Sriram A, Ramachandran K, Krishnamoorthy S. Artificial intelligence in medical education: transforming learning and practice. Cureus. 2025;17(3):e80852. 10.7759/cureus.80852.
- 48.Lee H. The rise of ChatGPT: exploring its potential in medical education. Anat Sci Educ. 2023;17(5):926–31. 10.1002/ase.2270.
- 49.Chiu TKF, Moorhouse BL, Chai CS, Ismailov M. Teacher support and student motivation to learn with Artificial Intelligence (AI) based chatbot. Interact Learn Environ. 2023;32(7):1–17. 10.1080/10494820.2023.2172044.
- 50.Docherty C, Hoy D, Topp H, Trinder K. Elearning techniques supporting problem based learning in clinical simulation. Int J Med Inform. 2005;74(7–8):527–33. 10.1016/j.ijmedinf.2005.03.009.
- 51.Moulieswaran N, Kumar PNS. Investigating ESL learners’ perception and problem towards artificial intelligence (AI)-assisted English language learning and teaching. World J Engl Lang. 2023;13(5):290–8. 10.5430/wjel.v13n5p290.
- 52.Qiao H, Zhao A. Artificial intelligence-based language learning: illuminating the impact on speaking skills and self-regulation in Chinese EFL context. Front Psychol. 2023;14:1255594. 10.3389/fpsyg.2023.1255594.
- 53.Wolff J, Keck A, Pauling J, Baumbach J. Success factors of artificial intelligence implementation in healthcare. Front Digit Health. 2021. 10.3389/fdgth.2021.594971.
- 54.Roppelt JS, Kanbach DK, Kraus S. Artificial intelligence in healthcare institutions: a systematic literature review on influencing factors. Technol Soc. 2023;76:102443. 10.1016/j.techsoc.2023.102443.
- 55.Li J, Huang J, Li X, Zheng L. Application of artificial intelligence in diabetes education and management: present status and promising prospect. Front Public Health. 2020. 10.3389/fpubh.2020.00173.
- 56.Graham S, Depp C, Lee EE, Nebeker C, Kim HC, Jeste DV, et al. Artificial intelligence for mental health and mental illnesses: an overview. Curr Psychiatry Rep. 2019;21(11):116. 10.1007/s11920-019-1094-0.
- 57.Alowais SA, Alghamdi SS, Alsuhebany N, Alqahtani T, Alshaya AI, Almohareb SN, et al. Revolutionizing healthcare: the role of artificial intelligence in clinical practice. BMC Med Educ. 2023. 10.1186/s12909-023-04698-z.
- 58.Cobianchi L, Kaafarani HM, Mascagni P, Piccolo D, Peter A, Vazquez AG, et al. Artificial intelligence and surgery: ethical dilemmas and open issues. J Am Coll Surg. 2022;235(2):268–75. 10.1097/xcs.0000000000000242.
- 59.Oosterhoff JHF, Doornberg JN. Artificial intelligence in orthopaedics: false hope or not? A narrative review along the line of Gartner’s hype cycle. EFORT Open Rev. 2020;5(10):593–603. 10.1302/2058-5241.5.190092.
- 60.Polevikov S. Advancing AI in healthcare: a comprehensive review of best practices. Clin Chim Acta. 2023;548:117519. 10.1016/j.cca.2023.117519.
- 61.Skuban-Eiseler T, Steger F, Orzechowski M, Denkinger M, Leinert C, Kocar TD. Artificial intelligence-based clinical decision support systems in geriatrics: an ethical analysis. J Am Med Dir Assoc. 2023;24(9):1271-1276.e4. 10.1016/j.jamda.2023.06.008.
- 62.Benrimoh D, Tanguay-Sela M, Perlman K, Israel S, Mehltretter J, Armstrong C, et al. Using a simulation centre to evaluate preliminary acceptability and impact of an artificial intelligence-powered clinical decision support system for depression treatment on the physician-patient interaction. BJPsych Open. 2021. 10.1192/bjo.2020.127.
- 63.Bienefeld N, Blaser M, Keller E, Boss JM, Lithy R, Willms J, et al. Solving the explainable AI conundrum by bridging clinicians’ needs and developers’ goals. NPJ Digit Med. 2023;6(1). 10.1038/s41746-023-00837-4.
- 64.Wong MK, Hong DZH, Wu J, Ting JJQ, Goh JL, Ong ZY, et al. A systematic scoping review of undergraduate medical ethics education programs from 1990 to 2020. Med Teach. 2021;44(2):167–86. 10.1080/0142159x.2021.1970729.
- 65.Thomasian NM, Adashi EY, Eickhoff C. Advancing health equity with artificial intelligence. J Public Health Policy. 2021;42(4):602–11. 10.1057/s41271-021-00319-5.
- 66.Chang DH, Hajian S, Lin MPC, Wang QQ. Educational design principles of using AI chatbot that supports self-regulated learning in education: goal setting, feedback, and personalization. Sustainability. 2023;15(17):12921. 10.3390/su151712921.
- 67.Zhang K, Aslan AB. AI technologies for education: recent research & future directions. Comput Educ Artif Intell. 2021;2:100025. 10.1016/j.caeai.2021.100025.