Abstract
Introduction
Digital transformation is reshaping healthcare education, requiring educators to demonstrate competence in digital teaching, patient care, and communication. Existing frameworks such as DigCompEdu, DigHealthCom, and UNESCO’s AI Competency Framework offer guidance, yet their suitability for clinical educators is not well established. This study explored how clinical educators integrate digital tools in teaching and practice, and examined strengths and gaps in their digital competencies through the lens of these frameworks.
Methods
A qualitative, exploratory study using semi-structured interviews was conducted with clinical educators and, for comparison, basic sciences educators from both public and private institutions. Questions informed by the DigCompEdu, DigHealthCom, and UNESCO artificial intelligence (AI) Competency Frameworks for teachers examined the types of digital resources educators used and their confidence in applying these tools for teaching, learning, and patient care. Thematic analysis revealed five major themes, which were mapped to the three frameworks, highlighting critical gaps in digital pedagogy, health communication, and AI literacy within clinical education settings.
Results
Participants expressed varying levels of confidence and competence. Basic sciences educators were more confident in using structured tools such as LMS platforms and interactive apps, while clinicians tended to use WhatsApp, PowerPoint, or email with limited integration into teaching practices; student assessment and learner empowerment were particularly deficient areas.
Conclusion
This study highlights both strengths and gaps in the digital competencies of healthcare educators. While educators showed confidence in communication through mobile platforms, they lacked skills in resource creation, data literacy, student engagement, and interaction with emerging technologies such as artificial intelligence (AI). Basic sciences educators showed greater confidence in structured digital teaching, whereas clinical educators struggled to integrate technology into both patient-centered contexts and teaching. The study underscores gaps in current digital competence frameworks and advocates for inclusive, clinically grounded models that recognize and respond to the dual responsibilities of clinician-educators as teachers and caregivers.
Supplementary Information
The online version contains supplementary material available at 10.1186/s12909-025-08389-9.
Keywords: Digital competencies, Clinical digital integration, Digital competency frameworks, Digital literacy, Clinical educator, Digital pedagogy
Background
As digital technology continues to transform healthcare and education, clinical educators must integrate digital tools into their teaching practices to prepare future healthcare professionals. The adoption of artificial intelligence (AI), virtual simulations, and online learning platforms is reshaping medical education and their role extends beyond delivering content to fostering a learning environment that aligns with technological advancements in healthcare delivery [1]. Moreover, clinical educators also play a pivotal role in developing these competencies among students to ensure they are well-prepared for a technology-enhanced healthcare system [2, 3]. Digital competence refers to the “skills, knowledge, and attitudes required to use digital technologies effectively and responsibly across various contexts” [4]. A clinical educator is a health professional who is actively involved in the education, supervision, and assessment of students and trainees in clinical settings, aiming to facilitate the development of clinical competence and professional skills [5].
Several competency frameworks have been developed to guide the integration of digital skills in education; this study is guided by three of them [6–8]. The DigCompEdu framework, developed by the European Commission, is a widely used framework that defines digital competence in six key areas and 22 sub-competencies for educators, further categorized into six proficiency levels from A1 (Newcomer) to C2 (Pioneer) [8]. Its strength lies in its comprehensive focus on teaching and learning; however, it is largely designed for general educators and does not adequately address the specific challenges of clinical education, such as balancing patient care with digital teaching or navigating healthcare-specific technologies (Fig. 1).
Fig. 1.
Combined competency frameworks
The DigHealthCom offers a valuable healthcare-specific lens by emphasizing digital health literacy, patient communication, and responsible data use; however, it is limited as a standalone framework for clinical educators. Its focus remains on clinical practice rather than digital pedagogy, online assessment, or student engagement, leaving critical gaps in teaching-related competencies. Moreover, it does not address emerging areas such as AI literacy and ethical integration of intelligent systems in education. Therefore, while DigHealthCom is important for contextualizing digital competence within healthcare, it must be complemented by broader educational (DigCompEdu) and future-oriented (UNESCO AI framework for teachers) frameworks to provide a holistic understanding of clinical educators’ digital competencies [7]. The UNESCO AI Competency Framework for Teachers fills this gap by explicitly integrating AI literacy, critical engagement with AI tools, and ethical considerations for responsible use in education. This makes it particularly future-oriented and highly relevant in the context of rapidly evolving digital healthcare [6].
This study therefore aims to fill this important gap by examining clinical educators’ use and understanding of digital skills in their teaching and clinical practice, guided by a combination of established competency frameworks, through a qualitative approach. Our research question is: “How do clinical educators integrate digital tools into their teaching and clinical practice, and what are the strengths and gaps in their digital competencies based on the DigCompEdu, DigHealthCom, and UNESCO AI Competency Framework for Teachers?”
Methodology
A qualitative exploratory study was conducted from October 2024 to June 2025 using semi-structured interviews to explore medical educators’ digital competencies.
Participant selection
The undergraduate medical program in Pakistan consists of a five-year MBBS degree, entered directly after higher secondary (pre-medical) studies. Purposive sampling was used to recruit 20 participants (10 clinical educators and 10 basic sciences teachers) from three PMDC-recognized institutions (one private tertiary-care hospital and two large public hospitals). Basic sciences educators were included to highlight differences and commonalities in digital competencies across both groups, given their distinct teaching contexts. All clinical educators had over 10 years of experience and were actively engaged in undergraduate teaching within integrated curricula.
Data collection
The interview guide was based on combined competencies from the three frameworks, reviewed by three experts, and piloted with two interviews. Following DigCompEdu, participants were categorized as confident users, emerging users, or novices based on their responses. Since our study was a qualitative adaptation, we supported this categorization with direct quotes from the data that reflect participants’ competency levels. For example,
Novices (Basic Competence): Struggle with digital tools, are unsure about online resources, and need assistance. Example: “I find it difficult to use online platforms and usually ask my colleagues for help.”
Emerging users (Intermediate Competence): Use some digital tools independently and experiment actively, but need more support. Example: “I’m starting to use more digital tools in my teaching, but I still need some help figuring out the best ones.”
Confident users (Advanced Competence): Use digital tools extensively, are comfortable creating, modifying, and evaluating digital resources, and guide others. Example: “I regularly use digital simulations for teaching, and I help my team integrate new tools into our practice.”
We categorized digital competence broadly, considering both general platforms and virtual delivery tools together rather than as separate categories. Interviews were conducted by the principal researcher (trained in qualitative research), face-to-face or via Zoom, lasted 30–40 min, and were audio-recorded with consent using Otter.ai [9]. The researcher had no prior relationship with participants, to reduce bias and encourage openness. Transcripts were anonymized and coded by two independent researchers, with participant feedback obtained on summaries.
Data analysis
Thematic analysis followed Braun and Clarke’s six-phase approach, combining deductive (framework-based) and inductive coding [10]. In line with Braun and Clarke’s approach to thematic analysis, our focus was on capturing patterns of meaning and the depth of participants’ experiences rather than counting the frequency of comments, as would be done in content analysis. Codes were grouped into subthemes and synthesized into overarching themes, with disagreements resolved by consensus. Prolonged engagement was ensured as researchers visited participants to introduce the project and obtain consent, and also emailed for record. Time was spent before and during interviews to build rapport and understand teaching contexts. AI-based tools (e.g., ChatGPT, Paperpal) supported the literature search and refinement of the manuscript’s language.
Results
The analysis of the interview data resulted in the identification of five key themes (see Table 1).
Table 1.
Themes and subthemes
| S. No | Themes | Subthemes | DigCompEdu (classroom teaching) | DigHealthCom (clinical spaces) | UNESCO AI competencies for teachers |
|---|---|---|---|---|---|
| 1 | Digital Integration in Teaching | Widespread Use of Basic Communication Tools; Limited Integration of Advanced Digital Pedagogies; YouTube and Multimedia Use for Visual Learning; Instructional Practices for Student Engagement | Digital resources; Professional resources; Teaching & learning; Assessment; Empowering/facilitating learners | Digital solutions; Information & communication technology (ICT) competence; Competence utilizing digital solutions | Use of AI in teaching and learning |
| 2 | Digital Professionalism | Challenges in Empathetic Communication and Professional Boundaries; Ethical Use of Digital Tools and Data Privacy; Digital Etiquette and Professional Role Modelling; Critical Digital Literacy and Evidence-Based Practice | Responsible use of digital resources | Ethical competence related to digital solutions | AI & ethical considerations |
| 3 | AI Literacy in Education & Research | Limited and Emerging Integration of AI Tools | — | — | Understanding AI; Use of AI in teaching and learning; Use of AI in professional development |
| 4 | Clinical Digital Integration | Limited Integration into Bedside Teaching; Limited Exposure to Advanced Digital Tools; Underutilized Potential in Student Engagement and Patient Communication | — | Digital solutions; ICT competence in utilizing and evaluating digital solutions | — |
| 5 | Digital Skills Development | Predominantly Self-Directed Learning; Supplementary Use of Online Learning Platforms; Institutional Gaps in Training and Support | Professional engagement | Human-centered remote counselling; Digital solutions; ICT competence in utilizing and evaluating digital solutions | — |
We will consecutively discuss these themes in the following sections with representative quotes.
Theme 1: Digital integration in teaching
Widespread use of basic communication tools
WhatsApp was the most used platform, valued for its speed and accessibility in communicating with both students and colleagues. Zoom was widely used during the COVID-19 pandemic for lectures and discussions, facilitating continuity of education.
“We often use WhatsApp groups to quickly update students about clinical rounds or case changes—it’s immediate and accessible” Participant 4.
“Zoom was our go-to for staying connected—it allowed us to continue our lectures and discussions almost seamlessly during the lockdown” Participant 10.
Limited integration of advanced digital pedagogies
While there was some awareness of interactive tools and Learning Management Systems (LMS) like “Moodle” and “Google Classroom”, usage remained basic and sporadic. Participants often used these platforms for uploading materials but rarely for interactive learning or assessment.
“We have very limited exposure to platforms like Google Classroom, so most of us don’t feel confident using them for teaching” Participant 5.
YouTube and multimedia use for visual learning
Many participants incorporated YouTube videos to visually demonstrate rare procedures or complex topics.
“I rely on YouTube videos to explain rare procedures. It’s much easier for students to visualize when they see it being done” Participant 3.
Instructional practices to enhance digital student engagement
A few educators interacted with students through online platforms such as Moodle and Google Classroom, but rarely used dedicated tools to enhance student engagement. Some used in-session quizzes for this purpose.
“I often use interactive quizzes during live sessions to keep students engaged and to check their understanding in real time” Participant 15.
“Students are more responsive when they feel in control. I let them choose digital platforms for group projects. It also increases their engagement with the session” Participant 18.
Theme 2: Digital professionalism
Challenging digital dialogue
Many participants emphasized the challenges of maintaining human connection and empathy when using digital platforms. The impersonal nature of virtual communication made it difficult for educators to assess students’ emotional states or levels of engagement.
“It’s harder to connect with students online, especially when they’re silent. I can’t gauge if they’re struggling” Participant 6.
They also expressed uncertainty about how to strike the right balance between being approachable and maintaining professionalism in online environments.
“I want to be approachable online, but sometimes I worry that being too casual might come across as unprofessional” Participant 12.
Data ethics and privacy
Participants highlighted the importance of safeguarding sensitive information and adhering to academic integrity in digital spaces. While some took personal initiative to anonymize patient data, others noted the lack of clear institutional guidance.
“Before sharing cases, I remove all personal details to safeguard patient privacy.” Participant 16.
“There’s no official protocol, so I rely on my own judgment to anonymize information” Participant 20.
Digital etiquette and professional role modeling
Educators saw themselves as role models for digital professionalism, emphasizing the need to uphold academic integrity and maintain professional boundaries across virtual settings.
“As educators, we are expected to set an example in how we use digital tools—our practices become a model for our students” Participant 20.
They stressed that ethical standards should remain consistent regardless of the medium.
“Sometimes students message at odd hours, and it becomes difficult to maintain boundaries in digital communication” Participant 16.
Critical digital literacy and evidence-based practice
Beyond personal conduct, participants underlined the responsibility of teaching students how to critically evaluate digital content, identify credible sources, and apply healthcare data ethically in clinical contexts.
“We encourage students to question the credibility of digital sources and guide them on using data ethically and effectively in clinical settings” Participant 7.
“We remind students that copying material from online platforms is still plagiarism. Ethics don’t change with the medium” Participant 8.
Theme 3: AI literacy in education and research
Emerging integration of AI tools
Participants reported a general awareness of AI technologies—such as ChatGPT, diagnostic applications, and AI-enhanced platforms like WhatsApp’s Meta features—but minimal hands-on experience. Active integration of AI into teaching or scholarly work remained limited.
“I’ve heard of AI tools like ChatGPT or diagnostic apps, but I haven’t really used them in my teaching. It’s still quite new for many of us” Participant 3.
“I use ChatGPT and WhatsApp’s meta feature for information search, but I haven’t really explored how these can be applied in teaching—I’m not sure what all they can do” Participant 6.
Some participants had begun experimenting with AI tools for specific tasks, such as generating quizzes or simplifying complex concepts. However, their use was still tentative, and many lacked the confidence or support to adopt these tools consistently.
“I’ve started experimenting with AI tools to create quizzes or simplify complex concepts, but I’m still not fully confident in using them regularly” Participant 5.
Theme 4: Clinical digital integration
Limited integration into bedside teaching
Although there is a growing awareness of the potential of digital tools to enhance clinical education, most participants reported limited integration of such tools into bedside teaching, or simulation-based learning. This gap is particularly evident in high-pressure, time-constrained clinical environments, where digital solutions could improve efficiency and innovation but are rarely utilized.
“I see the potential of digital tools to enhance clinical teaching, especially in fast-paced settings—but without proper training, they feel like distant possibilities rather than practical solutions” Participant 13.
Limited exposure to advanced digital tools
Some participants mentioned exposure to advanced tools—such as simulation software and clinical decision-support systems—while training abroad or within select private institutions. However, these experiences were not representative of broader practice. Most departments continued to rely on traditional methods, with minimal institutional support or strategic integration of digital technologies.
“I know there are tools available, but we don’t really use them in clinical teaching as much. It’s mostly the traditional methods that we rely on” Participant 7.
Underutilized potential in student engagement and patient communication
The use of digital resources to enhance student engagement or support patient communication in virtual environments was similarly limited. Although the COVID-19 pandemic prompted some experimentation with telemedicine, ongoing challenges, including poor internet connectivity and the absence of secure platforms, hindered sustained adoption.
“We struggled getting good connectivity. Besides, there was no secure channel to connect to patients. We weren’t sure how to do it” Participant 11.
Some educators continued to prefer face-to-face interactions with patients, especially for educational purposes, citing digital tools as less effective in building rapport and delivering meaningful communication.
“We’ve had some training on telemedicine, but in my day-to-day practice, I still find face-to-face communication to be more effective. Digital tools haven’t really made a big impact on patient education yet” Participant 4.
Theme 5: Digital skills development
Predominantly self-directed learning
A consistent theme across participants was the self-directed nature of their digital skills development. Most had not received any formal training in educational technologies and instead relied on self-exploration and trial-and-error to build digital competency.
“I’ve mostly learned digital tools on my own—just experimenting with them and figuring out what works best for my teaching. It’s all about trial and error” Participant 4.
Supplementary use of online learning platforms
Some educators enhanced their learning by accessing online courses through platforms like Coursera, edX, and Google’s digital training resources. However, these were used by only a few, with most participants still preferring informal, hands-on experimentation over structured e-learning.
“While platforms like Coursera and edX are great, I’ve mostly developed my skills by diving into the tools myself. It’s about playing around with apps and seeing how they fit into my teaching style” Participant 7.
Institutional gaps in training and support
Although there was an expectation for educators to use learning management systems like Google Classroom or Moodle, most reported a lack of institutional training, guidance, and access to full (paid) versions of these platforms. These limitations hindered the effective and confident use of digital tools in their teaching.
“While we expected to use platforms like Google Classroom or Moodle, there was little institutional guidance or training on how to use them effectively. And without access to paid versions of the software, it often made things harder to navigate” Participant 2.
One of the most critical gaps identified was the lack of strategies to meaningfully engage students with digital tools or empower them to use such technologies independently.
“We expect students to be independent with technology, but no one has really taught them how to use it effectively in a clinical or academic setting” Participant 18.
Participants held mixed views on students’ digital skills. Some believed students were naturally tech-savvy and needed little support,
“Students these days are more tech-savvy than us. They teach us to use technologies” Participant 15.
while others felt students required guidance in evaluating digital content critically and using online resources meaningfully.
“There’s a big gap between students using technology socially and using it academically. They know how to browse or use apps, but not how to apply digital tools for learning” Participant 15.
One participant highlighted the importance of teaching students how to access and evaluate online medical information, emphasizing the need for digital health literacy for students.
“Giving students access to online medical databases and teaching them how to filter good-quality information is key to their digital growth” Participant 8.
The basic sciences educators in our sample demonstrated stronger mastery of digital tools, frequently incorporating YouTube, multimedia, and interactive platforms such as Kahoot into their teaching.
“I often use YouTube animations and Kahoot quizzes—it helps students grasp concepts much faster” Participant 10.
“We have more pressure to use tools like Moodle or classroom and engage students on online platforms, thereby we are bound to learn these skills. Our faculty development programs also provide us with insight into this” Participant 13.
In contrast, clinical educators relied on traditional classroom lectures or PowerPoint presentations.
“I usually just use simple tools for my teaching; I haven’t had much exposure to advanced platforms or interactive technologies, so I don’t really bring them into my classes” Participant 6.
Integration into bedside teaching was similarly limited due to contextual constraints.
“In the ward, I mostly share PowerPoint slides or send updates on WhatsApp; it’s hard to use more advanced platforms during rounds” Participant 14.
Despite these differences, both groups showed comparable awareness of digital professionalism and ethics, emphasizing privacy and professional boundaries.
“I remind students not to share patient details on social media—it’s about keeping professional boundaries intact” Participant 16.
Awareness of AI was limited in both groups, though basic sciences faculty showed slightly more experimentation with tools like ChatGPT for research and teaching.
“I tried ChatGPT to simplify some complex topics for my lectures—it saves time” Participant 10.
Finally, while basic sciences educators benefitted from some institutional training, clinical educators largely relied on self-directed learning.
“I just learned through trial and error—no formal training was given for digital tools in clinical teaching” Participant 8.
Most participants, whether from clinical or basic sciences backgrounds, demonstrated confidence levels that fell within the novice to emergent levels. Basic sciences educators generally expressed greater ease with structured digital tools, such as multimedia resources and interactive platforms, reflecting more consistent integration into their teaching practices.
“For us in basic sciences, digital platforms like videos and quizzes have become part of routine teaching” Participant 13.
In contrast, clinical educators showed lower confidence in adapting advanced technologies within patient-centered contexts as well as teaching and learning, often restricting their use to basic communication tools. Despite these differences, neither group reached mastery, underscoring a shared need for targeted professional development to advance beyond foundational competence.
“At the bedside, it’s harder—digital tools don’t always blend naturally into patient teaching.” Participant 17.
Discussion
The findings of this study highlight the complex and evolving nature of digital integration in clinical education. The study explored how medical educators incorporate digital technologies into both clinical and academic teaching, and it revealed several key themes. There is a predominant reliance on basic communication tools, with limited adoption of advanced digital pedagogies. Integration of digital tools into clinical teaching remains sparse, and digital skill acquisition occurs largely through self-directed learning rather than structured institutional training.
Since our study is a qualitative adaptation of the frameworks, direct comparison with studies using scoring systems is not possible, but inferences can be drawn from the data. Almost all educators acknowledged that digital tools help them teach effectively, but their confidence fluctuated depending on the tool used. Our results are similar to those of another study [11], which places the technical level of medical teachers above their pedagogical level. Other surveyed teachers have obtained scores placing them at intermediate levels of digital competence, comparable to the moderate confidence levels observed in our study population [12]. Similarly, research on the Allied Health Professionals (AHP) Digital Competency Framework in the UK found that AHPs reported moderate to high confidence using digital technology at work, although more than 50% of participants perceived their ability to perform specific digital competencies as poor or very poor [13]. In a qualitative study, Roe et al. analyzed the digital competencies of educators in physiotherapy and health professions education using the DigCompEdu framework. Their findings align with ours: educators predominantly exhibited competencies in professional engagement and basic teaching and learning pedagogies, with notable deficiencies in areas like advanced digital resources, assessment, and facilitating learners’ digital competence [14].
There are notable disparities in digital engagement between basic science and clinical faculty. Clinical educators relied almost exclusively on WhatsApp for patient care and teaching due to its accessibility and efficiency, consistent with prior studies highlighting its popularity in medical education [15, 16]. In contrast, basic science educators used a wider range of platforms such as Google Classroom, Google Meet, and Zoom, often supplemented with interactive tools to enhance engagement [17]. While they demonstrated greater proficiency in the use of digital tools, the literature notes steep learning curves even for them, both in pedagogy and in determining the clinical relevance of their teaching [18, 19]. These differences in competencies and learning needs between basic science and clinical faculty highlight the importance of tailoring faculty development activities accordingly [19].
Digital integration in “clinical workflow” among educators appears significantly underdeveloped. Although participants recognized the potential of simulation tools, clinical decision-support systems, and mobile applications, their use in bedside teaching and patient care remained limited. Several studies have similarly observed that clinical educators often possess only basic digital competencies, with an urgent need to develop more advanced, context-specific digital teaching strategies [1, 20, 21]. Zainal et al. raised concerns about an overreliance on digital tools potentially leading to an “erosion” of core clinical competencies. Additional barriers—such as inadequate internet infrastructure, lack of access to secure digital platforms, and a strong preference for face-to-face communication—further restrict the adoption of digital tools [20, 21]. Notably, even during the COVID-19 pandemic, which globally accelerated telemedicine, participants reported only temporary and reluctant use of digital patient communication [22, 23]. This suggests that while awareness of digital tools exists, integration into clinical routines remains superficial.
Findings from our study indicate that most educators developed their digital skills through self-directed learning and experimentation, with only a few having attended formal workshops or seminars. This aligns with previous literature suggesting that self-motivation and informal learning play a crucial role in digital competence acquisition among educators [24, 25]. However, the presence of institutional policies and structured learning opportunities can significantly enhance and accelerate this process [24]. Supporting educators through formal programs, accessible digital tools, and a culture of digital innovation may therefore facilitate more equitable and efficient development of digital health competencies [19].
Our clinical educators were aware of the potential use of artificial intelligence (AI) in teaching and research and used the basic AI tools available, such as ChatGPT, but its application in teaching was limited. Findings of a systematic review similarly highlight that while there is consensus on the use of AI in education and health informatics, the specific skill set required is missing; educators struggle to find the right sources and tools for teaching and patient management [26, 27]. Similarly, there was apprehension about its use in research for fear of plagiarism [28]. This contrasts with a study by Banerjee et al. of postgraduate trainees in the UK, which found that AI does improve ‘research and quality improvement’ skills and facilitates ‘curriculum mapping’, though there was scepticism about its utilization in clinical judgment and clinical skills [29].
Ethical considerations remain one of the most debated aspects of digital literacy [30]. Many educators have voiced concerns about the difficulty of establishing meaningful connections with students in virtual settings [28]. These concerns echo broader discussions in medical education about the changing nature of teacher–student and clinician–patient interactions in digital environments. Furthermore, there was a shared awareness of ethical responsibilities—such as data privacy and academic integrity—though many participants reported limited guidance or formal training in these areas. Supporting this, a related study also highlights that teachers often struggle to form emotional bonds with students in online environments, emphasizing the need for educators to demonstrate and model emotional intelligence [28]. Additionally, Carlos et al. emphasized the importance of providing structured training in digital professionalism for both teachers and students to foster a more effective and respectful digital learning environment [31]. Similarly, Flott et al. highlighted concerns regarding the use of digital platforms, particularly when sharing patient information, and stressed the importance of implementing structured institutional guidelines to govern digital information sharing [32].
One of the most critical gaps identified was the lack of strategies to meaningfully engage students with digital tools or to empower them to use such technologies independently. A few studies have likewise observed that educators make little effort to engage students in online sessions [13, 33]. It was also observed that many students lacked the skills to critically assess and apply digital tools in academic or clinical contexts, suggesting a pressing need to incorporate digital literacy and critical thinking into both faculty and student development strategies [34, 35].
Conclusion & strengths of the study
This study explored how clinical educators integrate digital tools into their teaching and clinical workflows, guided by three competency frameworks. The findings reveal that clinical educators demonstrate strong digital competence in areas such as communication, collaboration, and professional engagement, most notably through the widespread use of mobile-based platforms like WhatsApp. However, the study also identified gaps in competencies related to the creation of digital resources, data literacy, and critical engagement with emerging technologies such as artificial intelligence. Overall, the results of this study underscore the importance of supporting clinical educators with structured, framework-aligned training that reflects both the current realities of clinical education and the evolving demands of digital healthcare and AI-enhanced learning environments.
Limitations (weaknesses) of the study
This study has several limitations. The findings are based on a small, purposively selected group of clinical educators, limiting generalizability. Responses were self-reported, which may introduce bias due to social desirability. Additionally, the use of frameworks such as DigCompEdu and DigHealthCom, while valuable, may not fully capture the contextual nuances of clinical teaching environments. Finally, given the rapid evolution of digital technologies, some findings may become outdated over time.
Recommendations
Targeted faculty development should advance beyond basic digital skills to include digital health and pedagogical competencies.
Clinical educators need tailored support, including time allowances, access to appropriate technologies, and simulation-based, mobile-friendly tools suited to clinical settings.
Institutional guidelines on digital professionalism should be integrated into faculty orientation and student curricula, with emphasis on role modelling and digital etiquette.
Educators should be encouraged to explore AI tools in teaching, assessment, and research through pilot projects and collaborative learning communities.
Existing frameworks such as DigCompEdu, DigHealthCom, and the UNESCO AI Competency Framework for Teachers offer useful guidance but do not fully address the unique dual roles of clinical educators in teaching and patient care. This highlights the need for a context-specific digital competency framework tailored to clinical education.
Acknowledgements
The authors thank all faculty members for their active participation, as well as the developers of the DigHealthCom tool for their permission to use the qualitative adaptation of their instrument. The authors also acknowledge the use of AI tools such as ChatGPT and Paperpal for language refinement and assistance in organizing the literature during manuscript preparation.
Abbreviations
- DigCompEdu
- DigHealthCom
- UNESCO AI framework for teachers
Authors’ contributions
Author SS is responsible for data collection, analysis, and writing of results. Author SA helped in data collection and compilation of results. Author NA supervised all the steps of the research and helped in correcting the manuscript. Author IK helped in the conception of the research idea and proofreading of the manuscript.
Funding
This project was self-funded.
Data availability
The data supporting the findings of this study are contained within the manuscript. There is no additional data to be shared.
Declarations
Ethics approval and consent to participate
Ethical approval was obtained from the institutional review committee of Shifa College of Medicine (Islamabad, Pakistan) under license number IRB#615-24. Informed consent was obtained from all participants. All identities were kept confidential and not revealed in the study, and all interviews were conducted in compliance with the Helsinki Declaration.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Footnotes
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1.O’Brien A, Forde C. Health science staff and student experiences of teaching and assessing clinical skills using digital tools: a qualitative study. Ann Med. 2023;55(2):2256656. 10.1080/07853890.2023.2256656. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Car J, Ong QC, Erlikh Fox T, Leightley D, Kemp SJ, Švab I, et al. The Digital Health Competencies in Medical Education Framework: an international consensus statement based on a Delphi study. JAMA Netw Open. 2025;8(1):e2453131. 10.1001/jamanetworkopen.2024.53131. [DOI] [PubMed] [Google Scholar]
- 3.Nazeha N, Pavagadhi D, Kyaw BM, Car J, Jimenez G, Car LT. A digitally competent health workforce: scoping review of educational frameworks. J Med Internet Res. 2020;22(11):e22706. 10.2196/22706. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Hesketh EA, Laidlaw JM. Developing the teaching instinct: feedback to medical teachers on their teaching. Med Teach. 2002;24(3):241–4. 10.1080/01421590220134051. [DOI] [PubMed] [Google Scholar]
- 5.Khurana MP, Raaschou-Pedersen DE, Kurtzhals J, Bardram JE, Ostrowski SR, Bundgaard JS. Digital health competencies in medical school education: a scoping review and Delphi method study. BMC Med Educ. 2022;22(1):805. 10.1186/s12909-022-03163-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.UNESCO. AI and education: guidance for policy-makers. Paris: United Nations Educational, Scientific and Cultural Organization; 2021. [Google Scholar]
- 7.Jarva E, Oikarinen A, Andersson J, Tomietto M, Kääriäinen M, Mikkonen K. Healthcare professionals’ digital health competence and its core factors: development and psychometric testing of two instruments. Int J Med Inform. 2023;171:104995. 10.1016/j.ijmedinf.2023.104995. [DOI] [PubMed] [Google Scholar]
- 8.Ghomi M, Redecker C. Digital competence of educators (DigCompEdu): development and evaluation of a self-assessment instrument for teachers’ digital competence. Comput Hum Behav. 2019;102:146–63. 10.1016/j.chb.2019.07.002. [Google Scholar]
- 9.Otter.ai. Otter.ai: Automated transcription and note-taking tool. 2025. Available from: https://otter.ai.
- 10.Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. 10.1191/1478088706qp063oa. [Google Scholar]
- 11.Caton E, Philippou J, Baker E. Exploring perceptions of digital technology and digital skills among newly registered nurses and clinical managers. Nurs Manag. 2024;31(1):7. 10.7748/nm.2024.e2051. [DOI] [PubMed] [Google Scholar]
- 12.Pervez Z, Nadeem I, Ghani BM, Fatima N. Assessing the level of digital health literacy in health care professionals in a tertiary care hospital. Esculapio. 2024;20(01):42–7. Available from: https://esculapio.pk/journal/index.php/journal-files/article/view/1065/945.
- 13.Dai K, Ma J, Han F, Althubyani AR. Digital competence of teachers and the factors affecting their competence level: a nationwide mixed-methods study. Sustainability. 2024;16(7):2796. 10.3390/su16072796. [Google Scholar]
- 14.Røe Y, Vik Torbjørnsen AC, Admiraal W. Educators’ digital competence in physiotherapy and health professions education: insights from qualitative interviews. DIGITAL HEALTH. 2024;10:20552076241297044. 10.1177/20552076241297044. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Salam MAU, Oyekwe GC, Ghani SA, Choudhury RI. How can WhatsApp® facilitate the future of medical education and clinical practice? BMC Med Educ. 2021;21(1):442. 10.1186/s12909-020-02440-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Aizaz Salam MAU, Oyekwe GC, Ghani SA, Choudhury RI. The role of WhatsApp® in medical education: a scoping review and instructional design model. BMC Med Educ. 2019;19(1):279. 10.1186/s12909-019-1706-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Boté-Vericad JJ, Palacios-Rodríguez A, Gorchs-Molist M, Cejudo-Llorente C. Comparison of the teaching of digital competences between health science faculties in Andalusia and Catalonia. Educación Médica. 2023;24(2):100791. 10.1016/j.edumed.2023.100791. [Google Scholar]
- 18.Wong LW, Wong AHP, Lian DWQ, Hooi SC. Perceived challenges and opportunities for basic scientists transitioning to basic science educators in medical and health professions education. Discov Educ. 2024;3(1):1–10. 10.1007/s44217-024-00311-7. [Google Scholar]
- 19.Farghaly A. Comparing faculty development needs of basic sciences and clinical teachers during major curricular reform at Prince Sattam Bin Abdulaziz medical college in Saudi Arabia. Educ Med J. 2021;13(1):31–41. 10.21315/eimj2021.13.1.4. [Google Scholar]
- 20.Jobst S, Lindwedel U, Marx H, Pazouki R, Ziegler S, König P, et al. Competencies and needs of nurse educators and clinical mentors for teaching in the digital age: a multi-institutional, cross-sectional study. BMC Nurs. 2022;21(1):258. 10.1186/s12912-022-01018-6. [DOI] [PMC free article] [PubMed]
- 21.Zainal H, Xin X, Thumboo J, Fong KY. Medical school curriculum in the digital age: perspectives of clinical educators and teachers. BMC Med Educ. 2022;22(1):807. 10.1186/s12909-022-03454-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.Ammenwerth E, Zwan L. Integration of digital tools into clinical reasoning education: a rapid review. Stud Health Technol Inform. 2025;324:49–50. 10.3233/SHTI250159. [DOI] [PubMed] [Google Scholar]
- 23.Manesh R, Dhaliwal G. Digital tools to enhance clinical reasoning. Med Clin North Am. 2018;102(3):475–83. 10.1016/j.mcna.2017.12.015. [DOI] [PubMed] [Google Scholar]
- 24.Kulju E, Jarva E, Oikarinen A, Hammarén M, Kanste O, Mikkonen K. Educational interventions and their effects on healthcare professionals’ digital competence development: a systematic review. Int J Med Inform. 2024;185:105396. 10.1016/j.ijmedinf.2024.105396. [DOI] [PubMed] [Google Scholar]
- 25.Liu Z, Gao Y, Zhang N, Long T, Liu S, Peng X. Effects of self-regulated learning on cognitive engagement and learning achievement in online discussions. Curr Psychol. 2024;43(35):28147–62. 10.1007/s12144-024-06445-z. [Google Scholar]
- 26.Hasan Sapci A, Sapci HA. Artificial intelligence education and tools for medical and health informatics students: systematic review. JMIR Med Educ. 2020;6(1):e19285. 10.2196/19285. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Paranjape K, Schinkel M, Nannan Panday R, Car J, Nanayakkara P. Introducing artificial intelligence training in medical education. JMIR Med Educ. 2019;5(2):e16048. 10.2196/16048. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Masters K. Ethics in medical education digital scholarship: AMEE guide no. 134. Med Teach. 2020;42(3):252–65. 10.1080/0142159X.2019.1695043. [DOI] [PubMed] [Google Scholar]
- 29.Banerjee M, Chiew D, Patel KT, Johns I, Chappell D, Linton N, et al. The impact of artificial intelligence on clinical education: perceptions of postgraduate trainee doctors in London (UK) and recommendations for trainers. BMC Med Educ. 2021;21(1):84. 10.1186/s12909-021-02870-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Pittmann R, Danaher-Garcia N, Adair White BA, Thompson A. The impact of a professional development workshop on healthcare professionals’ knowledge and readiness to use telehealth etiquette in virtual care. J Telemed Telecare. 2024;(9):1357633X241285938. 10.1177/1357633X241285938. [DOI] [PubMed] [Google Scholar]
- 31.Novella-García C, Cloquell-Lozano A. The ethical dimension of digital competence in teacher training. Educ Inf Technol. 2021;26(3):3529–41. 10.1007/s10639-021-10436-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Flott K, Maguire J, Phillips N. Digital safety: the next frontier for patient safety. Future Healthc J. 2021;8(3):e598-601. 10.7861/fhj.2021-0089. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Chang CY, Panjaburee P, Lin HC, Lai CL, Hwang GH. Effects of online strategies on students’ learning performance, self-efficacy, self-regulation and critical thinking in university online courses. Educ Technol Res Dev. 2022;70(1):185–204. 10.1007/s11423-021-10034-8. [Google Scholar]
- 34.Kwiatkowska W, Wiśniewska-Nogaj L. Digital skills and online collaborative learning: the study report. Electron J E-Learn. 2022;20(5):510–22. Available from: https://academic-publishing.org/index.php/ejel/article/view/2412. Cited 4 Jun 2025. [Google Scholar]
- 35.Carabregu-Vokshi M, Ogruk-Maz G, Yildirim S, Dedaj B, Zeqiri A. 21st century digital skills of higher education students during COVID-19: is it possible to enhance digital skills of higher education students through e-learning? Educ Inf Technol. 2024;29(1):103–37. Available from: https://link.springer.com/article/10.1007/s10639-023-12232-3. [Google Scholar]