BMC Medical Education
2026 Feb 9;26:401. doi: 10.1186/s12909-026-08699-6

Dental students’ knowledge, perceptions, and educational needs regarding artificial intelligence: a multinational cross-sectional survey

Argyro Kavadella 1, Sergio E Uribe 2,3,4,5, Marco Antonio Dias da Silva 6,7,8, Falk Schwendicke 5, Antonín Tichý 5,9, Reinhilde Jacobs 10,11, Daniel Karni 1, Akhilanand Chaurasia 12, Kostis Giannakopoulos 1
PMCID: PMC12977397  PMID: 41652399

Abstract

Aims

To explore undergraduate dental students’ AI knowledge, perceptions, and concerns, and to identify their educational needs based on these findings.

Methods

A cross-sectional, anonymous survey was conducted using a 30-item online questionnaire distributed to dental schools across multiple countries. The survey employed an exploratory, observational approach with convenience and snowball sampling methods. The population included dental students from all academic semesters, and participation was voluntary. The questionnaire consisted of multiple-choice and Likert-scale questions organized into five sections: consent form, demographic data, knowledge/awareness, perceptions/attitudes, and ethics-related questions. Data were analysed using Jamovi and R. Descriptive statistics summarised the demographic characteristics and responses to survey questions. Non-parametric correlation analysis served as the primary measure of association between ordinal variables. For each Likert-scale question, an ordinal logistic regression model was constructed with that question as the dependent variable, the factual knowledge score as a covariate, and the nominal questions as factors, to identify predictors for specific outcomes.

Results

A total of 508 students completed the questionnaire. Most students (76.2%) agreed they understood what AI entails, and 67.4% were familiar with generic AI tools; however, only 34.7% were familiar with AI's dental applications. A majority (70.3%) supported AI education during undergraduate studies, favoring case-based teaching, and 53.7% felt their current education had not adequately prepared them for AI technologies. Students declared that AI would be beneficial in diagnostic analysis (64.5%), enhance clinical practice (69.5%), and improve patient care (60.4%). In addition, 41.7% believed that AI would reduce professionals’ skills, and 29.2% that it would dehumanize healthcare. Three-quarters of students agreed that AI ethics should be taught from a multidisciplinary perspective, and 65.3% declared that AI in healthcare should be legally regulated.

Conclusions

This study establishes baseline data on dental students' AI knowledge and educational requirements across multiple countries. Despite general AI familiarity, understanding of dental applications remains limited. The results highlight the need for structured AI education programs tailored to students’ knowledge gaps and learning preferences. Dental students’ understanding and perceptions of AI can effectively guide the identification of their learning needs and inform curriculum integration.

Supplementary Information

The online version contains supplementary material available at 10.1186/s12909-026-08699-6.

Keywords: Artificial intelligence, AI curriculum, Dental education, Dentistry, Health education, Undergraduate, Higher education, Educators, Students questionnaire, ChatGPT

Introduction

Background

Artificial Intelligence (AI) has changed various fields of medicine, such as radiology, genetics, surgery, and health status assessments [1–3]. AI's pattern-recognition abilities have impacted healthcare, enabling the identification of clinical conditions and the prediction of adverse events [4]. AI-powered systems have outperformed humans in identifying skin conditions and predicting cancer from radiographic examinations [3, 5].

In dentistry, AI can enhance diagnostics, support clinical decision-making, and improve patient safety by reducing errors [6, 7]. Key advancements include image-based disease detection, predictive modeling for oral health outcomes, and treatment simulations [8]. In dental imaging, AI diagnostic accuracy is comparable to or better than that of specialists [9]. Multimodal AI can integrate health records and socio-demographic data [6], reducing administrative tasks and improving efficiency and accessibility. AI-powered systems merged with gamification and virtual/augmented reality can also enhance students’ and professionals’ training [10]. Further advancements in dentistry involve the development of reasoning, multimodal, integrated AI systems capable of interacting with the environment and even performing complex tasks comparable to human capabilities [6].

Despite growing interest in AI, health education is still in the process of integrating these advancements. There is a need to improve digital literacy in healthcare to ensure the effective and safe use of AI technologies in clinical practice [11], and higher education plays a crucial role in this. Formal education on AI is limited, inconsistent across higher education institutions, and mostly research-focused rather than addressing real-life situations [5, 11–13]. This might be due to the lack of evidence to inform the integration of AI in undergraduate curricula [5], inadequate support and training of educators [14], or resistance to change [15]. Student surveys have demonstrated that most dental and medical students report limited knowledge, a lack of formal instruction, and a strong interest in integrating AI concepts into their training [2, 8, 11, 15–20].

Rationale and objectives

Since undergraduate students are the primary beneficiaries of AI education, their perspectives and attitudes play a crucial role in shaping this educational intervention [19, 21]. Existing research has primarily focused either on exploring students’ attitudes toward AI [1, 2, 8, 10] or on proposing AI competences and curricula [3, 13]. Yet no studies have combined these perspectives, even though new AI curricula should be firmly grounded in the opinions and attitudes of students, who, as the primary stakeholders in education, are crucial for ensuring both relevance and acceptance. This study therefore aims to address this gap by assessing dental students’ knowledge, perceptions, and concerns regarding AI in dentistry, and by identifying their educational needs based on these insights. The results will contribute to the ongoing efforts to identify key insights for integrating AI into dental education.

Materials and methods

Study design

A cross-sectional, anonymous survey was conducted using an online questionnaire. The questionnaire was developed by adapting and refining previously published and validated questionnaires relevant to the study topic [2227]. The multiple-choice knowledge questions were developed with assistance from ChatGPT-4, which provided relevant input following targeted prompts. Reporting of this study followed the STROBE statement for cross-sectional studies [28, 29]. The study’s ethical clearance was obtained from the Institutional Committee on Bioethics and Ethics of the European University Cyprus (approval number EUC ETHICS COMMITTEE.2024-03, date 26 March 2024).

Setting, participants, sample size

The survey was conducted between March and May 2024 and employed an exploratory, observational approach with convenience and snowball sampling methods. Given the study’s exploratory nature, sample size estimation and power calculations were waived. Dental students from all semesters of the seven participating universities were invited to join the study voluntarily; no financial or academic incentives were offered. Participants were free to withdraw from the study at any time without any consequences. Students could complete the survey via a QR code or a direct link to the survey webpage after reading and providing informed consent. The estimated survey duration was less than 10 min, and contact information for the principal investigator and the privacy policy were provided. All participants could ask questions via email. Only two researchers could access the student data.

Data sources

The survey consisted of an anonymized questionnaire with multiple-choice and Likert-scale questions, designed using Google Forms (Google LLC, United States). To assess usability and ensure the reliability of the online survey, a pilot test was conducted with 10 dental students from the target population. The questionnaire was administered in English and was not translated into other languages. The students independently completed the questionnaire without consulting others and assessed its relevance, clarity, and comprehensibility. Based on their feedback, modifications were made to enhance clarity and user-friendliness, including adding examples to two of the questions. For example, we added examples of generic AI tools (Siri, chatbots, image generation, etc.) to the question asking about students’ knowledge of such tools. Responses from the pilot study were excluded from the data.

The survey was divided into five subsections:

  • 1st: consent form accompanied by a brief presentation of the study.

  • 2nd: demographic data (i. dental school and country of the respondent, ii. year of study, and iii. their self-assessed knowledge and IT proficiency, using a 5-point Likert scale).

  • 3rd: 10 knowledge/awareness-related questions, exploring factual knowledge level, dental and general applications, and personal sources of information.

  • 4th: 14 questions related to students’ perceptions and attitudes towards AI (advantages and disadvantages of using AI in education and practice, opinions regarding the integration of AI into formal education, and beliefs and future expectations).

  • 5th: 6 ethics-related questions.

  • Questions were designed to measure the cognitive (knowledge and skills, understanding), affective (self-efficacy, confidence, motivation), behavioral (AI implementation and usage), and ethical aspects of AI in education and practice [30].

Sources of bias

As the questionnaire was administered only in English, a potential language-related bias may have occurred among non-native English-speaking participants. Differences in language proficiency could have influenced comprehension of the questions and, consequently, the accuracy or depth of the responses. Moreover, the use of snowball sampling may have further introduced bias, as participants were recruited through academic networks, potentially leading to an overrepresentation of students with similar backgrounds, interests, or attitudes toward AI. In addition, participants might have provided socially desirable answers regarding their knowledge or attitudes, and students with a greater interest in AI may have been more likely to participate, resulting in response and selection bias. Finally, under-coverage bias is inherent to all web-based surveys, as only respondents with Internet access were able to complete the questionnaire [31].

Statistical analysis

Data were analysed using Jamovi (Version 2.6) [32] and R [33].

Descriptive statistics summarised the demographic characteristics and responses to survey questions. The relationships between students’ responses were explored across the three core conceptual domains of the questionnaire (knowledge-awareness, perceptions-attitudes, and ethics) and three key areas: (a) their self-assessed IT proficiency, (b) their scores on the three factual AI knowledge questions, and (c) their primary source of initial AI learning. For these investigations, Kendall’s Tau-b (τb) was employed as the primary measure of association between ordinal variables. Additionally, Spearman’s rank-order correlation (ρ) was used to assess monotonic relationships between other relevant ordinal or continuous-ordinal variables within the dataset. The statistical significance of these associations, particularly when dealing with categorical variables or when expected cell frequencies were low, was determined using the Monte Carlo simulated Fisher’s exact test for contingency tables, which provides more robust p-values than the standard chi-squared test. Cramer’s V was used consistently to quantify effect sizes for associations between relevant categorical variables.
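As a minimal illustration of the two association measures named above (the study itself used Jamovi and R; this is a self-contained Python sketch, not the authors’ code), Kendall’s tau-b can be computed from paired ordinal responses and Cramer’s V from a contingency table:

```python
import math

def kendall_tau_b(x, y):
    """Kendall's tau-b between two equal-length ordinal vectors,
    correcting the denominator for ties in either variable."""
    concordant = discordant = ties_x = ties_y = 0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            dx, dy = x[i] - x[j], y[i] - y[j]
            if dx == 0 and dy == 0:
                continue            # tied on both: excluded from all counts
            elif dx == 0:
                ties_x += 1         # tied on x only
            elif dy == 0:
                ties_y += 1         # tied on y only
            elif dx * dy > 0:
                concordant += 1
            else:
                discordant += 1
    denom = math.sqrt((concordant + discordant + ties_x) *
                      (concordant + discordant + ties_y))
    return (concordant - discordant) / denom

def cramers_v(table):
    """Cramer's V effect size from a contingency table (list of rows)."""
    n = sum(sum(row) for row in table)
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    chi2 = sum((obs - row_tot[i] * col_tot[j] / n) ** 2
               / (row_tot[i] * col_tot[j] / n)
               for i, row in enumerate(table)
               for j, obs in enumerate(row))
    return math.sqrt(chi2 / (n * (min(len(table), len(table[0])) - 1)))

# Perfectly concordant Likert responses give tau-b = 1.0:
print(kendall_tau_b([1, 2, 2, 3, 5], [1, 2, 2, 3, 5]))  # 1.0
# A moderately associated 2x2 table:
print(round(cramers_v([[10, 20], [20, 10]]), 3))         # 0.333
```

In practice, statistical software additionally supplies p-values for these statistics (e.g. via permutation or Monte Carlo simulation, as done here for Fisher’s exact test); the sketch shows only the point estimates.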

For regression analyses, ordinal logistic regression models were constructed to identify predictors for specific outcomes. For each Likert-scale question, an ordinal logistic regression model was constructed with that question as the dependent variable, the factual knowledge score as a covariate, and the nominal questions as factors.

The level of significance for all statistical tests was set at p < 0.05.

Results

Descriptive statistics

The study involved 508 students who completed the questionnaire. Most respondents came from Cyprus (225; 47.3%), followed by India (64; 13.4%) and Latvia (51; 10.7%); no other country exceeded 10% of participants (Fig. 1). The distribution of participants across the years of study was 111 students in Year 1 (21.9%), 117 in Year 2 (23.0%), 87 in Year 3 (17.1%), 141 in Year 4 (27.8%), 46 in Year 5 (9.1%), and 6 in Year 6 (1.2%).

Fig. 1.

Fig. 1

Chart depicting the frequency of countries

1. Most students considered their own IT knowledge as “adequate” (44.2%) (level 3 on the 5-point Likert scale), 23% as “limited or no knowledge” (levels 1 and 2), and only 6.3% as “highly proficient”. The average self-assessed IT proficiency was 3.18 (SD = 0.97).

2. Regarding where students first learnt about artificial intelligence during their studies, the most frequent source was social media and forums (35.7%), followed by self-directed research and exploration (33.4%), peers (19.7%), and formal university courses (11.1%).

3. Regarding the 3 factual knowledge questions, 77% understood that AI products can improve their performance and make decisions based on data, 66.5% of students correctly answered that 3D printing is not an AI application, and 51% correctly defined artificial intelligence as “mimicking human intelligence by machines.” Most participants answered two questions correctly (40%), and less than one-third (29.5%) answered all questions correctly.

4. Overall, most students (76.2% agreeing or strongly agreeing) indicated they understand what AI entails, and 67.4% reported being familiar with using generic AI tools. However, the understanding of AI’s applications in dentistry was lower, with only 34.7% demonstrating comprehension, and awareness of AI’s use in periodontal diagnosis was 44.1%. In fact, 52.5% of students reported a lack of familiarity with dental AI applications, and 30.3% were unfamiliar with the concept of prompting AI (Fig. 2).

Fig. 2.

Fig. 2

Chart of students’ responses on AI knowledge and awareness

5. Concerning students’ attitudes and perceptions towards AI in education and clinical practice, the majority showed positive attitudes. Approximately 70% believed that AI tools would enhance their clinical practice, and 60.4% felt they would improve the quality of patient care. Also, nearly 70% considered AI education essential during undergraduate studies, favoring case-based and application-focused teaching (70.3%). Interestingly, 53.7% felt their current education had not adequately prepared them to use AI technologies in clinical practice. Despite these positive views, less than half (45%) imagined themselves working closely with AI algorithms. Regarding the use of generative AI, 60.3% of students admitted to using AI in their studies, 41.3% thought that AI undermines educational value, and 26.9% preferred ChatGPT over traditional search engines. Regarding AI’s potential impact, approximately 30% disagreed that AI would degrade professional skills (while 41.7% agreed), and 41.6% believed that it would not harm the patient-doctor relationship (29.2% believed the opposite). Furthermore, 47.1% did not think AI’s diagnostic abilities were superior to humans’, and nearly 60% believed that large language models cannot replace human educators in theoretical courses (Fig. 3).

Fig. 3.

Fig. 3

Chart of students’ responses on attitudes and perceptions towards AI

6. Students considered dental diagnosis (64.5%), treatment planning (52.5%), and educational tools (47.8%) the areas most likely to benefit from AI. Dental practice management and patient management were the least selected areas, each mentioned by roughly one-third of respondents.

7. The ethics-related questions revealed that most students agreed that AI ethics should be taught from a multidisciplinary perspective during undergraduate studies (75%), believed AI in healthcare should be legally regulated (65.3%), and expressed concerns about potential breaches of sensitive private data (57%). Half of the students (51.9%) were concerned about the general ethical implications of AI in healthcare, while 13% did not share these concerns. Concerning liability for errors caused by AI, 54.9% of students believed healthcare professionals should be held responsible (16.9% disagreed), and 47.1% expressed uncertainty about accountability (Fig. 4).

Fig. 4.

Fig. 4

Chart of students’ responses on ethical issues regarding AI in dentistry

Analytical statistics

Students’ self-assessed IT proficiency

Those students who initially learned about AI through their own research or university courses tended to rate their IT knowledge and skills more highly (Fisher’s exact test p = 0.001, Cramer’s V = 0.156). Positive correlations were observed between students’ self-assessed computer skills (IT skills) and their understanding of what AI actually is (Spearman’s rho = 0.431, p < 0.001) and between their perceived IT skills and understanding how AI is used in dentistry (Spearman’s rho = 0.282, p < 0.001).

Students’ scores on the factual knowledge

No significant correlation was observed between students’ scores and the self-assessed IT skills (p > 0.05), nor between the scores and the primary source of learning about AI (p > 0.05).

Students who demonstrated more AI knowledge were less satisfied with their current dental school education’s preparation for AI use (Fisher’s exact test p = 0.002; Kendall’s Tau-b p < 0.001), and were more confident that AI tools would improve clinical practice (Fisher’s exact test p = 0.002; Kendall’s Tau-b p < 0.001) and dental patient care (Fisher’s exact test p = 0.011; Kendall’s Tau-b p < 0.001). Moreover, as students’ AI knowledge increased, their concerns about AI making healthcare less personal and negatively affecting the doctor-patient relationship tended to decrease (Fisher’s exact test p = 0.001; Kendall’s Tau-b p < 0.001), as did their agreement with the ideas that AI could replace dental educators for theoretical courses (Fisher’s exact test p < 0.001; Kendall’s Tau-b p < 0.001) or could reduce health professionals’ skills (Fisher’s exact test p = 0.028; Kendall’s Tau-b p < 0.001).

Students who answered more AI knowledge questions correctly generally reported a higher understanding of AI (Fisher’s exact test p = 0.046; Kendall’s Tau-b p = 0.041) and perceived AI as an educational tool (Fisher’s exact test p = 0.009; Cramer’s V = 0.152); however, they were not more familiar with dental AI applications (Fisher’s exact test p = 0.020; Kendall’s Tau-b p < 0.001).

Students with higher scores were uncertain about AI accountability in dentistry (Fisher’s exact test p = 0.031; Kendall’s Tau-b p = 0.269), were concerned about AI privacy violations in dentistry (Fisher’s exact test p = 0.029; Kendall’s Tau-b p = 0.060), and believed that AI ethics should be taught from a multidisciplinary perspective (Fisher’s exact test p < 0.001; Kendall’s Tau-b p < 0.001).

The primary source of students’ learning about AI

Those students who learned about AI through their own research or university courses tended to self-assess their IT knowledge and proficiency more highly (Fisher’s exact test p = 0.001, Cramer’s V = 0.156), as well as their knowledge and understanding of AI (Fisher’s exact test p = 0.013, Cramer’s V = 0.123) and their understanding of how AI is applied in dentistry (Fisher’s exact test p < 0.001, Cramer’s V = 0.177). Those who learned through their own research and exploration were more likely to report a higher ability to use generic AI tools (Fisher’s exact test p = 0.006, Cramer’s V = 0.135). While an association existed between the initial learning method and familiarity with AI in dentistry (Fisher’s exact test p = 0.001, Cramer’s V = 0.100), no clear pattern emerged regarding the primary source of information about AI.

Additional correlations among various questions

Students who knew how to use general AI tools tended to understand what AI is (Spearman’s rho = 0.348, p < 0.001), how AI is used in dentistry (Spearman’s rho = 0.170, p < 0.001), and that AI can help diagnose periodontitis (Spearman’s rho = 0.167, p < 0.001). Skill at “prompting” (giving instructions to AI, like in ChatGPT) correlated with understanding what AI is (Spearman’s rho = 0.420, p < 0.001), how AI is used in dentistry (Spearman’s rho = 0.295, p < 0.001), knowing how to use AI tools (Spearman’s rho = 0.482, p < 0.001), and that AI can help diagnose periodontitis (Spearman’s rho = 0.459, p < 0.001).

Conversely, students concerned that AI can make healthcare less personal did not believe AI would be helpful in their practice (Spearman’s rho = -0.190, p < 0.001) and did not think AI education should be compulsory (Spearman’s rho = -0.105, p < 0.01). Those who did think AI education should be compulsory also tended to prefer AI to be taught using real-world dental examples (Spearman’s rho = 0.124, p < 0.01) and believed AI would improve patient care (Spearman’s rho = 0.138, p < 0.01).

Furthermore, students who preferred using ChatGPT to search engines were more likely to know how to use AI tools (Spearman’s rho = 0.282, p < 0.001) and have used ChatGPT in their studies (Spearman’s rho = 0.171, p < 0.001). Students who felt their dental school has prepared them well for using AI tended to have a better understanding of AI in general (Spearman’s rho = 0.143, p < 0.01), and of AI’s dental applications (Spearman’s rho = 0.301, p < 0.001), and knew how to use AI tools (Spearman’s rho = 0.350, p < 0.001). Finally, students with a good general understanding of AI were more likely to understand its use in dentistry (Spearman’s rho = 0.340, p < 0.001).

Ordinal logistic regression

Support for mandatory education

In the model for the belief that AI education should be mandatory, Year 6 students were more supportive than Year 1 students (p = 0.039, +523%). Greater factual knowledge about AI predicted stronger support for mandatory AI education (p = 0.034, +24%), as did favoring AI for diagnostic analysis (p < 0.001, +100%), as an educational tool (p = 0.009, +59%), and for dental practice management (p < 0.001, +96%). Conversely, favoring AI for patient management (p = 0.031, -36%) and learning about AI via peers (p = 0.017, -53%) were associated with unwillingness to include AI in the dental school curriculum.

Scores to the knowledge questions (factual knowledge)

For each unit increase in knowledge score, participants were more likely to agree that AI enhances clinical practice (+27%, p = 0.013), to believe AI improves patient care (+30%, p = 0.009), to express willingness to work alongside AI (+33%, p = 0.002), and to agree that AI ethics should be taught from a multidisciplinary perspective (+48%, p < 0.001). Students with higher scores were also less likely to believe AI’s diagnostic ability is superior to humans’ (-20%, p = 0.024), to prefer ChatGPT over traditional search engines (-26%, p = 0.002), and to agree that AI can replace pre-clinical dental educators (-38%, p < 0.001). Factual knowledge did not predict their reported use of generative AI technologies (p = 0.839), concern about the ethical implications of AI (p = 0.555), the perceived need for AI to be regulated by legislation (p = 0.235), uncertainty about accountability for AI accidents (p = 0.189), or certainty that the healthcare professional using AI is responsible for its errors (p = 0.246).
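The percentages attached to each predictor in these regression results (e.g. 27%) read naturally as ordinal-logistic odds ratios re-expressed as percent changes in the odds of a higher Likert response; this reading is our assumption, since the coefficient scale is not stated in the text. A minimal Python sketch of that conversion:

```python
import math

def pct_change_in_odds(beta):
    """Percent change in the odds of a higher Likert category per
    one-unit increase in a predictor, from a log-odds coefficient."""
    return (math.exp(beta) - 1) * 100.0

# An odds ratio of 1.27 corresponds to a 27% increase in odds;
# an odds ratio of 0.80 to a 20% decrease:
print(round(pct_change_in_odds(math.log(1.27))))  # 27
print(round(pct_change_in_odds(math.log(0.80))))  # -20
```

Under this reading, a reported -38% for replacing pre-clinical educators would correspond to an odds ratio of 0.62 per additional correct knowledge answer.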

Discussion

This study explored dental students’ knowledge and perceptions of AI to identify learning needs and key insights for integrating AI into dental education [19, 21]. As AI-based tools advance in clinical applications and research, it is crucial to better understand how to include AI education in the dental curriculum. Dental students and professionals are now expected to use AI systems and methods critically to enhance their professional skills and employability [15, 34].

Knowledge and awareness

Corroborating previous studies [1, 8, 20, 35], students’ self-reported knowledge of computers and IT was rather average (mean IT proficiency of 3.18 on the 5-point scale). Accordingly, high knowledge levels were reported at low percentages [1, 8, 20, 35]. The occasional reports of higher frequencies of high knowledge levels are attributed to the country and the characteristics of the student population [2, 11, 36].

On the factual AI knowledge questions, students scored rather high, with the lowest score concerning the correct definition of AI (50%). The highest percentages of top scores were achieved by students from the Czech Republic (67%), Japan (62%), and the USA (56%), followed by Germany and Belgium. In all other countries, the percentage of top scorers was lower than 40%, with India and Nepal obtaining the lowest scores. Our results highlight the educational differences between countries, potentially confirming that students from developed countries had a better understanding of AI and its developments [2, 16]. Beyond educational systems, students in high-income countries often benefit from better technological access and infrastructure, which provides more opportunities to engage with AI concepts and tools. This is closely linked to the earlier integration of digital literacy and AI-related content in their university curricula, giving them a systematic advantage. Finally, cultural perceptions of technology may further reinforce these differences, with some societies (e.g. the USA and Japan) being more open to innovation.

Based on the regression analysis, the profile of students with high factual knowledge describes students who are willing to work alongside AI, believe that AI enhances clinical practice and improves patient care, and want AI ethics to be taught from a multidisciplinary perspective. They do not believe that AI’s diagnostic ability is superior to humans, do not use ChatGPT over traditional search engines for dental topics, and do not agree that AI can replace preclinical educators. These findings suggest that students with greater AI knowledge have a more balanced and realistic view of AI’s potential, recognizing its supportive role in clinical practice, and understanding its limitations and the importance of human expertise.

Only half of the students were able to correctly define AI, while 77% were able to identify what makes a product AI, indicating that students knew specific products and applications better than the theoretical background of AI [37]. This was partly confirmed by the knowledge question about what AI technology is, in which most could distinguish AI from other modern technologies. It was surprising, though, that the remaining third thought that 3D printing was an AI technology, and that only one-third of students knew about AI dental applications. This suggests that students had limited exposure to AI-related educational content, supported by the fact that only 11% of the participants declared having first learned about AI in university courses.

Students who first learned about AI on their own or through university courses rated their IT knowledge and skills more highly, suggesting that exposure to AI is associated with stronger digital literacy. This aligns with the finding that students who rated their IT knowledge highly also reported a better understanding of what AI is and of its dental uses, and that stronger IT skills are linked to a better perception of general and dental AI [20, 21, 36].

Complementing this finding, students possessing emerging technological skills – such as using chatbots, image generators, and prompt engineering – demonstrated a better understanding of AI. It has been demonstrated that students who perceived themselves as technologically proficient reported higher awareness of AI’s dental applications [10], were more confident [20], and were likely to have basic AI understanding [17].

Interestingly, no correlation between self-assessed IT proficiency and factual AI knowledge was observed, as students at all proficiency levels (apart from those with no proficiency at all) performed well in the knowledge questions. In contrast, it has been reported that tech-savvy students were more likely to have high or moderate AI knowledge [20]. AI-related knowledge may not depend on general IT skills, as understanding AI concepts is not directly related to using digital tools. Furthermore, self-assessed IT proficiency is subjective and may not reflect actual skills; highly knowledgeable students may rate themselves only moderately, being aware of what they have yet to learn, while others may overrate themselves. In addition, AI is a popular topic, which might contribute to a baseline familiarity across all proficiency levels. Lastly, students may have explored AI topics even if they do not consider themselves proficient.

Dental students’ knowledge and awareness of AI seem superficial. Low awareness of AI in dentistry is common among both dentists and students [38]. Although 67.4% of participants declared they knew how to use AI tools, 30% were unfamiliar with the concept of prompting, and 52.5% with dental AI tools. This may be related to the current social media hype regarding AI: on social media, AI content is often superficial and offered by unreliable sources [39, 40], yet it remains students’ primary source of information [41].

It seems that students have a basic understanding of AI but lack in-depth knowledge of its dental applications. Similarly, most health students have a moderate understanding of AI technologies (68%) and applications (56%), but a higher understanding of daily life AI applications (85%) [2]. Most medical students were more familiar with everyday life than with medical applications [37] and aware of the use of AI in medicine (83%), but only 39% were able to describe specific AI concepts [11]. The superficial awareness was also shown in Australian and Pakistani students and professionals who could not name a specific software and were unaware of AI applications [10, 42]. In the present study, students who scored high in the knowledge questions were not necessarily more familiar with dental AI applications, highlighting that general AI literacy does not automatically translate into specific domain application knowledge. It suggests that dental education needs to focus not just on general AI concepts but also on dental-specific applications and tools.

Confirming previous research, social media, self-research, and peers served as the main information sources [8, 11, 17, 21], while dental courses trained only 11% of students. Similarly, most medical students and trainees reported receiving no formal AI education [20, 21]. However, university education has the potential to significantly enhance students’ AI and IT competencies. The present study revealed that students who first learnt about AI in formal education rated their IT knowledge and skills more highly and demonstrated a better understanding of general and dental AI, evidencing the courses’ effectiveness in enhancing students’ confidence and their understanding of AI concepts. Similar results were also observed in medical students [20].

Perceptions and attitudes

Students’ perceptions of the effects of AI on healthcare are inconsistent [43]. Students believed that AI would improve clinical practice (70%) and the quality of patient care (60%), but also lead to the dehumanization of healthcare (30%) and a reduction of professional skills (40%). Our dental students share professionals’ ongoing worry that AI may fall short in addressing the human side of patient care [8, 15], and view AI integration into clinical practice with cautious optimism [10]. Similar studies revealed students’ optimism toward AI, acknowledging its potential to improve clinical practice, support professional growth [8, 15], and facilitate daily work activities [11, 20].

In particular, students with higher factual knowledge were more confident that AI would improve clinical practice and patient care [20] and less concerned about a reduction of skills or negative effects on the doctor-patient relationship. A better understanding is believed to reduce misconceptions and foster trust in AI as a supportive tool rather than a replacement for humans [44, 45], supporting the argument for formal AI education to build confidence, address anxieties, and promote realistic expectations among future dental professionals [46].

Our present observations match previous research by confirming that the majority of students viewed AI as beneficial for diagnostic analysis [1, 8, 10, 21], although only 20% perceived AI’s diagnostic ability as superior to that of humans. A possible explanation for this difference is that students either believe AI will never reach human-level performance yet consider it a helpful tool that can support diagnosis, or think that AI may eventually surpass humans but has simply not reached that level yet.

About half of the students perceived AI as beneficial in treatment planning and education, which may have contributed to their optimistic expectations. In a different dental setting, 91% of students recognized AI as a valuable support tool for clinicians, with radiology and implantology identified as the most suitable application fields [10]. Clinicians largely (80%) believe AI will enhance the consistency of diagnostic workflows [35], as do half of the medical students [20].

At present, the integration of AI education into higher education curricula remains insufficient [21, 26, 39]; the limited availability of guidelines [12, 47] and the lack of AI-trained educators [14] contribute to this. The lack of formal learning opportunities has led most students to feel unprepared to work with AI [18, 24]. Our study corroborates the literature in demonstrating that current AI education in the respondents’ dental schools is insufficient and that AI should be part of the dental curriculum [19, 20, 37].

On the other hand, students who felt that their dental school had prepared them well for using AI demonstrated a better understanding of AI (general and dental) and knew how to use AI tools. It has been reported that formal AI training leads to greater overall AI knowledge and a perception of being better prepared, fostering positive attitudes and the intention to use AI [15, 18].

Based on the regression analysis, the students who believe that AI education in undergraduate studies should be mandatory are typically 6-year students with high factual knowledge who support the use of AI for diagnostics, for dental practice management, and as an educational tool. They have not learned about AI from peers and do not agree with using AI for patient management. This profile may be attributed to their advanced clinical exposure, which likely enhances their awareness of AI’s relevance in diagnostics, education, and practice management, as well as their wish for formal training [23, 39]. Their reluctance to support AI for patient management may reflect a critical understanding of AI’s limitations in areas requiring human judgment and empathy.

Generative AI has already penetrated the academic environment, as 60% of students admitted using it in their studies, formally or informally; educators therefore have no choice but to adapt [36]. Writing an essay on any topic is no longer a challenge (which might be why 40% of students believe that generative AI tools undermine university education), so emphasis now needs to be placed on critical thinking and the appropriate use of information sources. In contrast to most Pakistani medical students (60%), only 27% of the dental students used ChatGPT as a major search tool [36]; however, these students were significantly more familiar with other AI tools, apparently because they are more curious about or engaged with emerging technologies.

Most students (60%) did not think LLMs would replace pre-clinical dental educators, validating the role of human educators. Similar perceptions have been reported in studies where undergraduate students did not support the idea that AI would replace dentists and doctors in the foreseeable future [8, 11, 17, 21, 36].

Ethics

Half of the students were aware of the ethical implications of AI use and of possible privacy violations. Similarly, research has shown that dental students’ principal concerns include insurance liability, accuracy and responsibility for machine errors, data security, and privacy issues [8, 10, 15]. Understanding and addressing ethical concerns is essential to ensure the responsible deployment of AI in healthcare in terms of patient safety, privacy, and autonomy.

The reinforcement of ethical conduct is complex, not only for medical professionals but also for AI-tool developers. Students seem to realize that the use of AI in health should be regulated by legislation. AI is developing quickly, making it difficult for legislative procedures to catch up; therefore, the legislature should at least oversee developers and ensure adequate validation of tools, while ethical issues at the individual level will largely remain a matter of personal responsibility. In fact, more than half of the students declared that they were certain that a healthcare professional bears full responsibility for an error made by AI. Interestingly, though not surprisingly, when more choices were offered in a separate question on responsibility for an AI error, half of the students were uncertain about the responsible stakeholder. It is not unexpected that they seek to attribute responsibility elsewhere. Indeed, as identified in the early days of AI adoption in healthcare, when a patient experiences significant adverse outcomes due to an inaccurate diagnosis or prediction, liability becomes complex and may fall on any of several parties, such as the physician, the hospital, the software company, the developer, or even the individual who supplied the data [48]. In a recent study, half of the students also believed that the dentist should be responsible in case of AI misdiagnosis and 45% attributed responsibility to the company that developed the AI [8], while in another study the percentages were 39% and 33%, respectively [1]. We also observed that students with higher factual knowledge were more uncertain about AI accountability and more concerned about privacy violations in dentistry, possibly because they are more aware of AI’s technical complexities and ethical challenges.

Educational needs assessment

The knowledge and education-related questions, combined with students’ opinions and perceptions, offer valuable insights into students’ educational needs. Key findings to inform this process include:

  1. Feeling better prepared by the dental school for using AI was associated with a better understanding of general AI, dental AI applications, and AI tools.

  2. Believing that AI education should be mandatory was associated with higher factual knowledge and positive perceptions of AI in clinical practice.

  3. Students exhibited superficial AI knowledge and awareness, with little knowledge of dental AI applications.

  4. Students knew specific products or AI applications better than the theoretical background of AI.

  5. Higher self-rated IT proficiency and prompting skills were associated with a better perception of AI and its dental applications.

  6. Higher factual AI knowledge was associated with positive and realistic perspectives on AI in education and practice, and with awareness of the associated ethical concerns.

  7. Negative attitudes toward AI were associated with disagreement with mandatory AI education and with the beliefs that AI will dehumanize healthcare and will not benefit clinical practice.

  8. Around 70% of students believed that being able to evaluate and use AI tools would enhance their clinical practice.

  9. Over 70% of students thought that AI dental education should be based on case studies and real-life scenarios, and that AI ethics should be taught by experts in various fields.

Students’ educational needs encompass basic IT and AI literacy, dental applications and tools, critical evaluation, cutting-edge skills (prompting and alternative information retrieval), and education on ethical and legal implications. An educational intervention addressing such needs has been applied in a postgraduate module on AI in radiography. Students enrolled in this module explored a range of topics, including foundational concepts in AI and data science, AI applications across various radiographic imaging modalities, ethical principles for responsible, transparent, and equitable AI use in medical imaging, AI governance and regulatory frameworks, technology acceptance and adoption by clinical practitioners, the impact of AI on radiography workflows, and approaches to evaluating and validating AI systems. Teaching was delivered through a combination of lectures, industry-led interactive seminars, hands-on workshops, and tutorials [49]. In the sonography profession, AI and ML fundamentals, the impact on the profession, and privacy, ethical, and legal aspects have been suggested for curriculum improvement [50]. In line with our findings, Valikodath et al. proposed the components of a core AI curriculum for medical education in ophthalmology, including basic mathematics and statistics; fundamentals of AI, machine learning, and deep learning; evaluation of AI literature; clinical and surgical applications; ethics; and medicolegal implications [51].

Although interest in AI medical education is increasing among researchers and practitioners, the translation into structured curricula has been limited [52]. Common curricular concepts mentioned in the studies reporting AI medical curricula include fundamentals of AI, strengths and limitations of AI, ethical and legal considerations, clinical applications of AI systems, impact of AI on clinical reasoning and decision-making, and critical appraisal of AI systems [52].

As dental students’ background IT knowledge is variable and their limited understanding of programming can be a barrier [11], basic foundational training should be offered to enhance the understanding of how AI systems are designed [34, 48]. Addressing similar considerations in higher education, an AI framework with core AI competencies was suggested for the education of non-computer-science students [34]. This is an essential component for developing, trusting, and critically evaluating AI outputs within the complex realities of healthcare practice [13, 48]. Future healthcare professionals must understand the basic principles of data science and AI to be competent in managing data, supervising AI tools, and using applications wisely to provide treatments for their patients [48, 53]. Computer science and dental educators must collaborate to achieve this in a tailored way, without elaborating on technical details [10, 11].

Understanding and applying ethical concepts is a core competency for responsible AI use [34]. This is particularly important not only for future dental professional practice but also during undergraduate studies, as it underpins the safe and responsible use of AI-powered applications.

Conceptual frameworks and competencies have been proposed to address students’ educational needs and promote AI education in undergraduate healthcare curricula. Medical students should receive training on data science and machine learning, AI ethics and patient safety, governance and regulation, and AI evaluation, accompanied by hands-on experience, in addition to being acquainted with AI diagnostic systems, robotics and e-patients [13, 52, 54]. In dental education, competencies included computing and AI, data-related and machine learning, human-AI interaction, and responsible use of digital systems [34, 55].

Based on students’ educational needs, as identified through the questionnaire results, a framework for teaching AI in undergraduate dental education can be proposed:

Module 1: Foundations of IT and AI
Topics: basic computer literacy and software tools; introduction to AI concepts (machine learning, neural networks, and data-driven systems).
Learning outcomes: explain fundamental AI concepts and their relevance to healthcare; understand how AI systems are structured and trained; recognize the role of IT literacy in interacting with AI systems.

Module 2: AI Applications in Dentistry
Topics: AI in dental imaging and radiography; AI tools for diagnosis, treatment planning, and workflow optimization; case studies of AI implementation in dental clinics; hands-on training with AI-powered software tools.
Learning outcomes: identify key AI applications in dental practice; demonstrate the ability to use AI tools in simulated and real clinical scenarios; critically evaluate the performance of AI applications in dental imaging; appreciate the potential and limitations of AI in clinical decision-making.

Module 3: Critical Evaluation and Evidence-Based Use
Topics: methods for evaluating AI tools (accuracy, bias, transparency, and reproducibility); critical appraisal of scientific literature on AI in healthcare; identification of common pitfalls and biases in AI applications.
Learning outcomes: assess the validity and reliability of AI tools; apply critical thinking to evaluate AI-driven research studies; recognize sources of bias and limitations in AI applications; make informed decisions about integrating AI into clinical practice.

Module 4: LLMs and Generative AI
Topics: prompting and alternative methods of AI information retrieval; exploring emerging AI technologies and their applications in healthcare; hands-on exercises with AI chatbots, generative models, and decision support systems.
Learning outcomes: understand the principles behind LLMs and GenAI; apply effective prompting strategies for information retrieval; critically evaluate the reliability and limitations of LLM-generated output in healthcare contexts; use LLMs responsibly in clinical, educational, and research contexts.

Module 5: Ethics, Regulation, and Responsible AI Use
Topics: ethical principles in AI (fairness, accountability, transparency); legal and regulatory considerations for AI in healthcare; patient privacy, data protection, and consent; integrating ethical reasoning into AI-supported clinical decisions.
Learning outcomes: identify ethical and legal issues related to AI in dentistry; apply ethical reasoning when using AI tools in clinical practice; demonstrate awareness of patient data privacy and regulatory requirements; promote responsible, safe, and transparent use of AI in healthcare.

This proposal aims to provide foundational suggestions for curriculum designers to build upon; the educational framework would therefore need to be tailored to the specific student population, educational aims, regional context, and pre-existing courses of each institution.

Integrating a new educational topic into an already overcrowded curriculum is challenging due to constraints on time allocation, the need to revise program structures and objectives, and the development of new learning outcomes [5, 11, 48]. AI-focused courses can be implemented using a modular, scaffolded, and phased strategy, gradually advancing each year, with learning outcomes evolving from theoretical understanding to practical application [13]. AI courses can initially be offered as electives and gradually incorporated into existing courses such as Evidence-Based Dentistry, Patient Safety, Governance and Regulation, or Medical Informatics [11, 13, 48]. The different sub-subjects of AI courses can be taught by experts from various fields (computer science, law, philosophy, mathematics, economics) using a multidisciplinary approach [10, 48, 56]. The teaching methodology should include a combination of lectures, dedicated seminars, hands-on workshops, case studies, and collaborative activities with other departments [11, 13].

Limitations

In this study, the questionnaire was administered only in English for feasibility and consistency across countries, as English is a widely used academic language. While this may have introduced language bias by favoring respondents with higher proficiency, clear and simple wording was used to minimize this. Future multinational studies should consider validated translations and cross-cultural adaptation to reduce such bias.

The use of a web survey implies under-coverage, self-selection, and response biases, as it requires an internet connection and, ultimately, individuals selected themselves into the survey and self-reported their answers [10, 31, 36].

Also, the majority of responding students were from Cyprus, likely because the principal investigators were based in this country, resulting in higher student engagement; this could skew the analysis. However, it is important to note that 70% of these students are international and thus bring diverse educational and cultural backgrounds, perspectives, and opinions.

Additionally, a country/educator-specific response bias may be inherent in this multinational study, given the variation in teaching protocols across different educational systems.

The questionnaire used was not designed to specifically identify learning modules or a comprehensive AI curriculum; however, it offered valuable insights into students’ perceptions, knowledge, and educational needs, which can contribute to AI curriculum development. A more focused questionnaire designed specifically for educational purposes would offer the opportunity to inform educational stakeholders about the development of such a curriculum more accurately and comprehensively.

The survey used in this exploratory research was carefully constructed using questions adapted from previously validated surveys. However, while individual items have a basis in established research, the overall internal consistency, construct validity, and reliability of this specific composite questionnaire, as a novel instrument within this study, have not been empirically established. Future studies should address the psychometric properties of this instrument, including analyses of internal consistency and factorial structure. The diverse nature of the participating partners also presented challenges for a comprehensive analysis in this initial phase. Consequently, the findings regarding the measured data should be interpreted with this consideration in mind.

Conclusions

This multinational survey revealed a complex and varied landscape of dental students’ knowledge and perceptions. Students reported a basic understanding of AI but lacked in-depth knowledge of its dental applications. Formal AI training was minimal, yet students believed that AI tools would enhance clinical practice and improve patient care. Students recognized that AI education should be mandatory, and they had used generative AI tools during their studies. A significant proportion of students believed that AI would reduce health professionals’ skills and dehumanize healthcare; however, they did not think that LLMs could replace educators. Most students were concerned about the ethical implications of AI. Students with higher factual AI knowledge and those supportive of mandatory undergraduate AI education held more balanced and realistic views on AI’s potential. The results indicate that integrating AI into dental education is crucial for equipping future professionals with the basic knowledge and skills needed to work ethically in a technology-driven healthcare system.

Supplementary Information

Supplementary Material 1. (30.6KB, docx)

Acknowledgements

Not applicable.

Authors’ contributions

Conceptualization, AK; questionnaire, AK and DK; study design, AK, KG, and FS; literature review, AK and DK; statistical analysis, MADS and SEU; writing – original draft preparation, AK; writing – content development and editing, MADS, AT; manuscript review and suggestions, AT, MADS, FS, KG, AC, RJ. All authors read and approved the final manuscript.

Funding

The study received no funding.

SEU acknowledges financial support from the European Union’s Horizon 2020 research and innovation program under grant agreement No 857287 for the Baltic Biomaterials Centre of Excellence and from the Latvian Council of Science, project No lzp-2022/1–0047, “IEVA Project.”

Researcher MADS is an MSCA COFUND fellow funded by the European Union’s Horizon Europe research and innovation programme under the Marie Skłodowska-Curie Actions grant agreement 2430429511.

Data availability

The dataset supporting the conclusions of this article is available in the Zenodo repository: Kavadella A, Uribe SE, Dias da Silva MA, Schwendicke F, Tichý A, Jacobs R, Chaurasia A, Karni D, Giannakopoulos K. Dataset: Dental students’ knowledge, perceptions, and educational needs regarding artificial intelligence: a multinational cross-sectional survey [Data set]. Zenodo; 2025. 10.5281/zenodo.15825108.

Declarations

Ethics approval and consent to participate

The study adhered to the Declaration of Helsinki. Informed consent was obtained from all participants.

The Institutional Committee on Bioethics and Ethics of the European University Cyprus approved this study (approval number EUC ETHICS COMMITTEE.2024-03, date 26 March 2024).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  • 1.Sassis L, Kefala–Karli P, Sassi M, Zervides C. Exploring medical students’ and faculty’s perception on artificial intelligence and robotics: a questionnaire survey. J Artif Intell Med Sci. 2021;2(1–2):76–84. 10.2991/jaims.d.210617.002. [Google Scholar]
  • 2.Bisdas S, Topriceanu C-C, Zakrzewska Z, Irimia A-V, Shakallis L, Subhash J, Casapu M-M, Leon-Rojas J, Pinto dos Santos D, Andrews DM, Zeicu C, Bouhuwaish AM, Lestari AN, Abu-Ismail L, Sadiq AS, Khamees A, Mohammed KMG, Williams E, Omran AI, Ismail DYA, Ebrahim EH. Artificial intelligence in medicine: a multinational multi-center survey on the medical and dental students’ perception. Front Public Health. 2021;9:795284. 10.3389/fpubh.2021.795284. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 3.Çalışkan SA, Demir K, Karaca O. Artificial intelligence in medical education curriculum: an e-Delphi study for competencies. PLoS ONE. 2022;17(7):e0271872. 10.1371/journal.pone.0271872. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Liu DS, Sawyer J, Luna A, Aoun J, Wang J, Boachie L, Halabi S, Joe B. Perceptions of US medical students on artificial intelligence in medicine: mixed methods survey study. JMIR Med Educ. 2022;8(4):e38325. 10.2196/38325. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5.Lee J, Wu AS, Li D, Kulasegaram KM. Artificial intelligence in undergraduate medical education: a scoping review. Acad Med. 2021;96:S62–70. 10.1097/ACM.0000000000004291. [DOI] [PubMed] [Google Scholar]
  • 6.Schwendicke F, Samek W, Krois J. Artificial intelligence in dentistry: chances and challenges. J Dent Res. 2020;99(7):769–74. 10.1177/0022034520915714. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Davenport T, Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc J. 2019;6(2):94–8. 10.7861/futurehosp.6-2-94. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Yılmaz C, Erdem RZ, Altınok Uygun L. Artificial intelligence knowledge, attitudes and application perspectives of undergraduate and specialty students of faculty of dentistry in Turkey: an online survey research. BMC Med Educ. 2024;24:1149. 10.1186/s12909-024-06106-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Hung KF, Yeung AWK, Bornstein MM, Schwendicke F. Personalized dental medicine, artificial intelligence, and their relevance for dentomaxillofacial imaging. Dentomaxillofac Radiol. 2023. 10.1259/dmfr.20220335. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Hegde S, Nanayakkara S, Jordan A, Jeha O, Patel U, Luu V, Gao J. Attitudes and perceptions of Australian dentists and dental students towards applications of artificial intelligence in dentistry: a survey. Eur J Dent Educ. 2024;0:1–10. 10.1111/eje.13042. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Pucchio A, Rathagirishnan R, Caton N, Gariscsak PJ, Del Papa J, Nabhen JJ, Vo V, Lee W, Moraes FY. Exploration of exposure to artificial intelligence in undergraduate medical education: a Canadian cross-sectional mixed-methods study. BMC Med Educ. 2022;22:815. 10.1186/s12909-022-03896-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Uribe SE, Maldupa I, Schwendicke F. Integrating generative AI in dental education: A scoping review of current practices and recommendations. Eur J Dent Educ. 2025;0:1–15. 10.1111/eje.13074. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Crotty E, Singh A, Neligan N, Chamunyonga C, Edwards C. Artificial intelligence in medical imaging education: recommendations for undergraduate curriculum development. Radiography. 2024;30:S67–73. 10.1016/j.radi.2024.01.007. [DOI] [PubMed] [Google Scholar]
  • 14.Uribe SE, Maldupa I, Kavadella A, El Tantawi M, Chaurasia A, Fontana M, Marino R, Innes N, Schwendicke F. Artificial intelligence chatbots and large language models in dental education: worldwide survey of educators. Eur J Dent Educ. 2024;28:865–76. 10.1111/eje.13009. [DOI] [PubMed] [Google Scholar]
  • 15.Derakhshanian S, Wood L, Arruzza E. Perceptions and attitudes of health science students relating to artificial intelligence (AI): a scoping review. Health Sci Rep. 2024;7:e2289. 10.1002/hsr2.2289. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Mousavi Baigi SF, Sarbaz M, Ghaddaripouri K, Ghaddaripouri M, Mousavi AS, Kimiafar K. Attitudes, knowledge, and skills towards artificial intelligence among healthcare students: a systematic review. Health Sci Rep. 2023;6:e1138. 10.1002/hsr2.1138. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Pinto dos Santos D, Giese D, Brodehl S, Chon SH, Staab W, Kleinert R, Maintz D, Baeßler B. Medical students’ attitude towards artificial intelligence: a multicentre survey. Eur Radiol. 2019;29:1640–6. 10.1007/s00330-018-5601-1. [DOI] [PubMed] [Google Scholar]
  • 18.Busch F, Hoffmann L, Truhn D, Palaian S, Alomar M, Shpati K, Makowski MR, Bressem KK, Adams LC. International pharmacy students’ perceptions towards artificial intelligence in medicine—A multinational, multicentre cross-sectional study. Br J Clin Pharmacol. 2024;90:649–61. 10.1111/bcp.15911. [DOI] [PubMed] [Google Scholar]
  • 19.Almusharraf A, Shaikh AK, Attar RW, Alwassil O. Perceptions of artificial intelligence in healthcare curricula: insights from a nationwide survey of medical students. Front Educ. 2025;10:1550671. 10.3389/feduc.2025.1550671. [Google Scholar]
  • 20.Allam AH, Eltewacy NK, Alabdallat YJ, Owais TA, Salman S, Ebada MA, EARG Group. Knowledge, attitude, and perception of Arab medical students towards artificial intelligence in medicine and radiology: a multinational cross-sectional study. Eur Radiol. 2024;34:4393–406. 10.1007/s00330-023-10509-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.Rjoop A, Al-Qudah M, Alkhasawneh R, et al. Awareness and attitude toward artificial intelligence among medical students and pathology trainees: survey study. JMIR Med Educ. 2025;11:e62669. 10.2196/62669. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Boillat T, Nawaz FA, Rivas H. Readiness to embrace artificial intelligence among medical doctors and students: questionnaire-based study. JMIR Med Educ. 2022;8(2):e34973. 10.2196/34973. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Chawla RL, Gadge NP, Ronad S, et al. Knowledge, attitude and perception regarding artificial intelligence in periodontology: a questionnaire study. Cureus. 2023;15(11):e48309. 10.7759/cureus.48309. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Sit C, Srinivasan R, Amlani A, Muthuswamy K, Azam A, Monzon L, Poon DS. Attitudes and perceptions of UK medical students towards artificial intelligence and radiology: a multicentre survey. Insights Imaging. 2020;11:14. 10.1186/s13244-019-0830-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Tamori H, Yamashina H, Mukai M, Morii Y, Suzuki T, Ogasawara K. Acceptance of the use of artificial intelligence in medicine among Japan’s doctors and the public: a questionnaire survey. JMIR Hum Factors. 2022;9(1):e24680. 10.2196/24680. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Weidener L, Fischer M. Artificial intelligence in medicine: cross-sectional study among medical students on application, education, and ethical aspects. JMIR Med Educ. 2024;10:e51247. 10.2196/51247. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.Shinners L, Aggar C, Grace S, Smith S. Exploring healthcare professionals’ perceptions of artificial intelligence: validating a questionnaire using the e-Delphi method. Digit Health. 2021;7:1–9. 10.1177/20552076211003433. [DOI] [PMC free article] [PubMed] [Google Scholar]

Associated Data

This section collects any data citations, data availability statements, or supplementary materials included in this article.

Supplementary Materials

Supplementary Material 1. (30.6KB, docx)

Data Availability Statement

The dataset supporting the conclusions of this article is available in the Zenodo repository: Kavadella, A., Uribe, S., Dias da Silva, M. A., Schwendicke, F., Tichy, A., Jacobs, R., Chaurasia, A., Karni, D., & Giannakopoulos, K. (2025). Dataset: Dental students' knowledge, perceptions, and educational needs regarding artificial intelligence: a multinational cross-sectional survey [Data set]. Zenodo. https://doi.org/10.5281/zenodo.15825108.
