Abstract
Background
This study aims to assess the effectiveness of a virtual scenario-based clinical reasoning training system in improving clinical reasoning and communication skills among dental students.
Methods
Seventy fourth-year dental students, who had previously completed dental basic science courses and the theoretical components of clinical dentistry courses, engaged in a four-week clinical communication and reasoning training programme using the virtual scenario-based clinical reasoning training system. All students underwent a communication skills and clinical reasoning assessment before and after using the system, and their scores, including module and total scores, were compared using paired t-tests or Wilcoxon signed-rank tests. The level of significance was set at P < 0.05. Students were asked to provide feedback through a survey to identify the system's usability and their perceived benefits or drawbacks.
Results
After using the virtual scenario-based clinical reasoning training system four times with different cases, students' scores in all modules (history-taking, examination, diagnosis, and treatment planning) and their total scores improved significantly (P < 0.05). Post-training clinical reasoning competence scores (86.13 ± 1.79) were significantly higher than pre-training scores (74.66 ± 2.18), and communication skills also showed significant improvement (P < 0.05). The average System Usability Scale (SUS) score was 70.14 ± 4.96, indicating above-average usability. Most students were satisfied with the system (92.86%) and acknowledged its advantages: innovative, engaging instruction and effectiveness in improving clinical reasoning (71.43%) and communication skills (67.14%).
Conclusion
Applying the virtual scenario-based clinical reasoning training system in a clinical communication and reasoning training programme can improve students' clinical reasoning and communication skills.
Keywords: Simulation training, Virtual reality, Dental education, Patient communication, Clinical competence, Clinical reasoning
Background
Clinical reasoning and critical thinking, though essential to developing clinical competence, are complex processes involving the identification, analysis, and prioritisation of clinical information to formulate and test hypotheses [1]. These skills are critical not only for making accurate diagnoses and developing effective treatment plans but also for building trusting relationships through effective provider-patient communication. Stomatology is a medical discipline that closely combines theory and practice, where clinical decision-making relies not only on theoretical knowledge but also on effective provider-patient communication and practical skills. Clinical case-based teaching effectively improves students' clinical reasoning and critical thinking by immersing them in clinical scenarios [2, 3]. This, in turn, cultivates their clinical competence, enabling them to confidently meet the complex needs of patients in daily clinical practice.
To date, training in clinical reasoning and critical thinking for dental students mostly relies on conventional teaching methods such as Problem-Based Learning (PBL) and Case-Based Learning (CBL). PBL facilitates the development of clinical reasoning, self-directed learning, and teamwork skills by engaging students in active exploration centred around clinical problems [4–7]. CBL is grounded in clinical cases and emphasises the integration of theoretical knowledge with practical application, thereby effectively enhancing students' clinical decision-making abilities and their competence in applying knowledge to practice [5, 6, 8–10]. These conventional methods benefit oral clinical education by integrating theoretical knowledge with practical case-based reasoning to enhance students' problem-solving skills and clinical competence.
Despite these advantages, clinical reasoning is not easily transferred from classroom to clinic. The transition from pre-clinical to clinical education poses significant challenges for students, as it demands not just the recall of theoretical knowledge but its application in dynamic and unexpected clinical contexts. According to Kolb's Experiential Learning Theory (ELT), learning is a cyclical process in which knowledge is created through the transformation of experience. It involves four stages: concrete experience, reflective observation, abstract conceptualisation, and active experimentation [11]. In dental education, engaging students in genuine clinical scenarios allows them to move iteratively through this cycle, thereby enhancing their clinical reasoning and decision-making capabilities. A safe learning environment, where learners can engage in repeated practice without the risk of harming real patients, is thus essential for supporting this experiential learning process and promoting confidence in clinical performance [12].
With the rapid development of digital technology [13], the field of stomatology has actively embraced digital approaches. For instance, virtual simulation-based teaching models have been actively integrated into dental education [14–17]. The current application of virtual simulation technology in dental education primarily focuses on clinical practice training [18], such as root canal treatment [19, 20], crown preparation [21, 22], periodontal surgery [23], and anatomy education [24]. In other clinical disciplines such as pharmacy and nursing, scenario-based virtual simulation systems have already been integrated into preclinical education, effectively enhancing students' comprehensive clinical diagnostic and treatment competencies through simulated patient consultations [25–30]. In dental education, however, a gap remains: virtual simulation has not yet been used to integrate clinical reasoning and communication skills training.
To address this gap, a virtual scenario-based clinical reasoning training system (Unidraw Co. Ltd, Beijing, China) was designed and developed by Nanjing Medical University. The system was built upon real-world dental clinical cases, and the modules were developed following a typical dental consultation process, which included history-taking, examination, diagnosis, and treatment plan, simulating clinical procedures and provider-patient communication scenarios. A distinctive feature of this system is the integration of biomechanically accurate haptic feedback with dynamic 3D anatomical visualization. In real dental encounters, tactile findings, such as resistance during probing, sensitivity upon percussion, or tissue consistency during palpation, play a critical role in guiding clinical reasoning and must be interpreted and then communicated to patients in a clear and professional manner [16]. By reproducing these tactile cues, the system not only enhances manual dexterity but also enables learners to practice linking tactile–visual information with diagnostic reasoning and verbal explanation. In this way, haptic feedback simultaneously strengthens diagnostic accuracy and communication competence in a safe, immersive environment.
The aim of this study was to assess the effectiveness of a clinical training programme using the virtual scenario-based clinical reasoning training system to improve clinical reasoning and communication skills among dental students.
Methods
Participants
This study included all fourth-year dental students from Nanjing Medical University. A total of 70 dental students participated in this clinical communication and reasoning training programme (32 males and 38 females), and their average age was 22.14 ± 0.38 years. All students had previously completed dental basic science courses and theoretical components of clinical dentistry courses. The protocol of this study was approved by the Ethical Committee Department, Affiliated Hospital of Stomatology, Nanjing Medical University (PJ2023-129-001).
Training procedures
All students were asked to complete the virtual training within one month, using the virtual scenario-based clinical reasoning training system once a week for four consecutive weeks. This frequency was determined by the course timetable: the module was scheduled to last only four weeks, with one virtual simulation class per week, so each student was limited to a total of four training sessions.
The virtual scenario-based training system was developed by Unidraw Co. Ltd (Beijing, China) and operated on a dedicated hardware device. Upon logging into the system, students interacted with a virtual patient, whose "self-reported" descriptions guided them through a standardised dental consultation process comprising history-taking, examination, diagnosis, and treatment planning. After the training, the system provided comprehensive feedback and evaluation covering the entire training process, based on the student's interactions with the virtual patient, such as the completeness and accuracy of history-taking, intraoral and extraoral examination, diagnosis, and treatment planning. These feedback domains were designed to align with the assessment framework subsequently used in the traditional clinical reasoning and communication examination, which is based on the standards of the China National Certificated Dentist Qualification Examination and detailed in Table 1. Based on these evaluation results and specific feedback, students could revisit the case to focus on areas requiring improvement. The application procedure of the system is shown in Fig. 1.
Table 1.
The format and scoring criteria of the clinical communication and reasoning assessment
| Module | Criteria | Score |
|---|---|---|
| History-taking | Comprehensive consultation, including chief complaint, medical history, family history, social history | 20 |
| | Accurate record and clear summary | 5 |
| Examination | Adjust position and light; complete inspection examination | 5 |
| | Complete probing examination | 5 |
| | Complete palpation examination | 5 |
| | Complete percussion examination | 5 |
| | Select proper auxiliary examination | 10 |
| Diagnosis | Provide a complete diagnosis based on the examination results | 10 |
| | Generate 2–3 differential diagnoses with supporting evidence | 10 |
| Treatment planning | Suggest appropriate investigations and management steps | 10 |
| | Propose clinically justified treatment methods based on the examination results and diagnosis | 10 |
| | Explain goals clearly to the patient to support understanding and adherence | 5 |
| Total | | 100 |
Fig. 1.
The Training procedures of the Virtual Scenario-based Clinical Reasoning Training System
Assessment of clinical reasoning
A before-and-after study design was applied. All students underwent the clinical communication and reasoning assessment before and after the virtual simulation training. The assessment method, level of difficulty, and scoring criteria were based on the standards of the China National Certificated Dentist Qualification Examination. The pre- and post-assessments were divided into four parts: history-taking, examination, diagnosis, and treatment planning, with a total score of 100 points (Table 1). Students were randomly allocated clinical cases from a scenario bank of cases derived from real patients, compiled by the School of Stomatology and covering multiple dental disciplines, including orthodontics, oral and maxillofacial surgery, endodontics, periodontology, and prosthodontics. Students then completed the assessment through interaction with Standardised Patients [31]. Two senior clinicians with extensive clinical experience assessed each student simultaneously, and the final score was the average of the two clinicians' results. All examiners were rigorously trained and calibrated to ensure consistency and reliability in scoring.
Assessment of communication skills
During the clinical reasoning assessments, students' communication skills were assessed using the SEGUE framework (Set the stage, Elicit information, Give information, Understand the patient's perspective, and End the encounter), a validated instrument with demonstrated reliability in standardised patient-based clinical assessments [32]. The SEGUE consists of five evaluation dimensions and 25 dichotomous items (yes = 1 point, no = 0 points). The five dimensions are: preparation, eliciting information from the patient, providing information to the patient, understanding the patient's perspective, and closing the encounter. Scores range from 0 to 25, with higher scores indicating better communication skills.
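The dichotomous scoring described above can be sketched in code. This is a minimal illustration only, not the study's actual instrument: the dimension names are paraphrased, and the item counts per dimension are taken from the sub-scores reported in Table 3.

```python
# Sketch of SEGUE-style scoring: five dimensions, 25 yes/no items in total.
# Item counts per dimension follow the sub-scores reported in Table 3.
SEGUE_DIMENSIONS = {
    "preparation": 5,
    "eliciting_information": 10,
    "giving_information": 4,
    "understanding_patient": 4,
    "ending_encounter": 2,
}

def segue_total(observed: dict) -> int:
    """Sum dichotomous items (yes = 1, no = 0) across the five dimensions.

    `observed` maps each dimension name to a list of 0/1 ratings whose
    length must match that dimension's item count.
    """
    total = 0
    for dim, n_items in SEGUE_DIMENSIONS.items():
        items = observed[dim]
        if len(items) != n_items or any(v not in (0, 1) for v in items):
            raise ValueError(f"{dim}: expected {n_items} dichotomous items")
        total += sum(items)
    return total  # ranges from 0 to 25
```

A student rated "yes" on every item would score the maximum of 25.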
Feedback questionnaires
After completing the curriculum, all participants were invited to provide feedback on their training experiences. To assess students' user experience and acceptance of the system, the first section of the questionnaire employed the System Usability Scale (SUS) used in previous studies [33, 34]. The SUS is a widely used, validated instrument for assessing the usability of products and services, and it has been shown to be highly reliable, with a Cronbach's alpha of 0.91 [35]. It is a 10-item questionnaire that users complete after interacting with a system, rating their agreement with each statement on a scale from 1 ("Strongly Disagree") to 5 ("Strongly Agree"). For items 1, 3, 5, 7, and 9, the contribution equals the item score minus 1; for items 2, 4, 6, 8, and 10, it equals 5 minus the item score. The contributions from all items are summed and multiplied by 2.5 to obtain the overall SUS score, which ranges from 0 to 100, with higher scores indicating better perceived usability [36]. An SUS score above 68 is considered above average; a score above 80.3 falls within the top 10% of scores [34, 35, 37].
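The standard SUS scoring rule described above can be expressed as a short function (a minimal sketch for illustration; it is not part of the study's software):

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 ratings.

    Odd-numbered items (1-indexed) are positively worded: contribution = score - 1.
    Even-numbered items are negatively worded: contribution = 5 - score.
    The summed contributions (0-40) are scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten ratings, each between 1 and 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-indexed, so even i = odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# "Strongly Agree" on positive items and "Strongly Disagree" on negative
# items yields the maximum score of 100:
best = sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])
```

A respondent answering "3" throughout would score exactly 50, the midpoint of the scale.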
The second section of the survey was a questionnaire developed by the research team to gather feedback on the students' learning experiences. It used a 5-point scale (1 = "Strongly Disagree", 5 = "Strongly Agree") and primarily addressed the system's role in enhancing learning efficiency, improving clinical decision-making skills, strengthening provider-patient communication abilities, and promoting overall clinical competence, as well as the level of satisfaction with each module (history-taking, examination, diagnosis, and treatment planning). To ensure content validity, the questionnaire was reviewed by three senior clinical faculty members for clarity and relevance. Additionally, a pilot test was performed with five students to confirm that the questions were unambiguous before formal administration.
Statistical analysis
Data were analysed using SPSS version 29.0 (IBM Corporation, Armonk, NY). The assessment results before and after the simulation training were tested for normality and homogeneity of variance. A paired-sample t-test or a Wilcoxon signed-rank test was used to compare pre- and post-training scores, depending on whether the data were normally distributed. The total and module scores across the four training sessions were compared using one-way ANOVA (when the data met the assumptions of normality and homogeneity of variance) or Kruskal-Wallis tests (when they did not). A post hoc test (Least Significant Difference, LSD) was performed to compare system results among the training sessions. The level of significance was set at P < 0.05.
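The paired comparison at the heart of this analysis reduces to a simple statistic: the mean of the per-student post-minus-pre differences divided by its standard error. The study itself used SPSS; the following standard-library Python sketch, with invented toy data, only illustrates the arithmetic of the paired t statistic.

```python
import math
import statistics

def paired_t(before, after):
    """Paired-sample t statistic: t = mean(d) / (stdev(d) / sqrt(n)),
    where d are the per-subject after-minus-before differences."""
    if len(before) != len(after):
        raise ValueError("paired samples must have equal length")
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical toy data: a consistent gain of about 11-12 points per student
# produces a large positive t, mirroring the direction of the pre/post
# comparison reported in the study (the numbers themselves are invented).
pre = [74, 75, 73, 76, 74, 75]
post = [85, 87, 84, 88, 86, 86]
t = paired_t(pre, post)
```

In practice the statistic is compared against the t distribution with n − 1 degrees of freedom to obtain the p-value; with consistent differences and small within-pair variability, even a modest sample yields a very large t.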
Results
The scores of the virtual training system
After using the virtual training system four times (once a week), students' scores in each module (history-taking, examination, diagnosis, and treatment planning) as well as their total scores improved significantly from the week-one to the week-four session (one-way ANOVA, P < 0.05, Table 2). As the number of practice sessions increased, students' overall performance gradually improved (Fig. 2A). Moreover, a post hoc test (Least Significant Difference, LSD) suggested that students' scores plateaued after two to three practice sessions (P < 0.05).
Table 2.
Student performance across virtual training sessions and pre and post training assessments of clinical reasoning
| Module | Practice 1 | Practice 2 | Practice 3 | Practice 4 | Pre-assessment | Post-assessment |
|---|---|---|---|---|---|---|
| History-taking | 19.0 | 19.7 | 20.6 | 20.9* | 17.3 | 20.9* |
| Examination | 23.1 | 24.2 | 24.5 | 24.8* | 19.9 | 25.4* |
| Diagnosis | 16.2 | 16.9 | 18.0 | 18.1* | 17.8 | 18.8* |
| Treatment planning | 20.5 | 21.0 | 21.5 | 21.7* | 19.7 | 20.9* |
| Total | 78.9 | 81.9 | 84.6 | 85.5* | 74.7 | 86.1* |
*P < 0.05
Fig. 2.
The student experiences of using the Virtual scenario-based clinical reasoning training system (A The results of the virtual training system, including four modules (History Taking, Examination, Diagnosis and Treatment Planning) and total scores. Error bars represent the standard deviation (SD); B Comparison of scores of clinical reasoning assessment before and after training; C Investigation of teaching effect based on the virtual scenario-based clinical reasoning training system; D Students’ satisfaction of each module of the virtual scenario-based clinical reasoning training system)
Comparison of clinical reasoning competence before and after the training programme
Before and after the virtual scenario-based clinical reasoning training, students underwent clinical reasoning assessments. All scores followed a normal distribution. The post-training scores (86.13 ± 1.79) were significantly higher than the pre-training scores (74.66 ± 2.18; t = 32.78, P < 0.05; Fig. 2B). The scores for each individual module are presented in Table 2, and all show statistically significant differences.
Comparison of the communication skills before and after the training programme
The evaluation of communication skills, which comprised initiating communication, collecting information, giving information, understanding the patient's perspective, and ending communication, showed noticeable improvement after the training programme, and the differences were statistically significant (P < 0.05, Table 3).
Table 3.
Comparison of the communication skills evaluation using SEGUE
| Item | Before training (n = 70) | After training (n = 70) | Difference | P |
|---|---|---|---|---|
| Preparation (5 points) | 1.63 ± 0.80 | 3.24 ± 0.92 | 1.61 ± 1.08 | < 0.05 |
| Collecting information (10 points) | 5.59 ± 1.12 | 7.14 ± 0.80 | 1.56 ± 1.43 | < 0.05 |
| Giving information (4 points) | 0.86 ± 0.82 | 1.98 ± 0.91 | 1.13 ± 1.24 | < 0.05 |
| Understanding patients (4 points) | 1.73 ± 0.67 | 2.73 ± 0.65 | 1.00 ± 1.03 | < 0.05 |
| Ending communication (2 points) | 0.56 ± 0.62 | 1.20 ± 0.53 | 0.64 ± 0.78 | < 0.05 |
| Total (25 points) | 10.36 ± 1.88 | 16.30 ± 1.67 | 5.94 ± 2.59 | < 0.05 |
Acceptability and student experiences of the virtual training system
To further evaluate students' perceptions of the system, questionnaires were administered after the training. For the first section of the survey, the mean SUS score was 70.14 ± 4.96, indicating above-average usability [34, 37]. This suggests that the system's usability is acceptable and that users were relatively satisfied with the overall experience, though there is still room for improvement. Table 4 shows the mean score on each item; per the SUS scoring template, these were totalled (28.06) and multiplied by 2.5 to obtain the overall score of 70.14.
Table 4.
The acceptability of the virtual training system using system usability scale (SUS) 5-point scale where 1 = “Strongly Disagree” and 5 = “Strongly Agree”
| Questions | Mean | SD |
|---|---|---|
| 1. I think that I would like to use the system frequently. | 3.33 | 0.58 |
| 2. I found the system unnecessarily complex. | 3.16 | 0.50 |
| 3. I thought the system was easy to use. | 2.77 | 0.68 |
| 4. I think that I would need the support of a technical person to be able to use the system. | 3.11 | 0.47 |
| 5. I found the various functions in the system were well integrated. | 3.03 | 0.72 |
| 6. I thought there was too much inconsistency in the system. | 2.37 | 0.54 |
| 7. I would imagine that most people would learn to use the system very quickly. | 2.26 | 0.47 |
| 8. I found the system very cumbersome to use. | 2.73 | 0.51 |
| 9. I felt very confident using the system. | 3.04 | 0.84 |
| 10. I needed to learn a lot of things before I could get going with the system. | 2.26 | 0.63 |
In addition to acceptability, students were asked to provide feedback on their overall learning experiences. Almost all students expressed general satisfaction with the virtual scenario-based clinical reasoning training system (97.14%). Students found the system effective in motivating learning, improving clinical reasoning and communication competence, and facilitating a better understanding and reinforcement of theoretical knowledge (Fig. 2C). The module students were most satisfied with was "Diagnosis" (72.86%), while 7.14% (n = 5) of students indicated that the "History-taking" module required improvement (Fig. 2D). This feedback comprised two specific concerns: 2.86% of participants (n = 2) felt the force-feedback system could be more sensitive, and 4.29% (n = 3) observed that the simulated patients' speech rate was slower than in real-world clinical encounters.
Discussion
As technology grows more complex and patients expect more personalised care, conventional dental education needs new approaches to teach all the skills required for modern practice. This study assessed the effectiveness of a virtual scenario-based clinical reasoning training system, created by Unidraw Co. Ltd (Beijing, China), in improving clinical reasoning and communication skills among dental students. Digital simulation systems such as this one offer a promising solution to the challenge of developing dental students' clinical reasoning and communication skills. Virtual simulations provide a safe, controlled learning environment where students can repeatedly practise their skills on clinical scenarios without risking patient safety [38]. Such systems also allow learners to experience authentic patient interactions, make diagnostic and treatment decisions, and receive immediate feedback, elements that are crucial for developing clinical reasoning and communication skills [39, 40]. Moreover, by simulating real-world situations, digital tools can help students transition smoothly from preclinical learning to actual clinical practice, improving their readiness for real patient encounters.
Despite the growing interest in virtual simulations, no previous studies have been found that evaluate a virtual simulation system specifically designed to integrate clinical reasoning and communication skills training in dental education. By focusing on this integration, the research aims to validate whether such digital tools can effectively enhance students’ ability to navigate the complex interplay of technical diagnostic and interpersonal communication skills—a critical competency for modern dental practice.
The present study applied a virtual scenario-based clinical reasoning training system built on virtual reality simulation. We evaluated students' clinical communication skills and diagnostic and treatment abilities before and after using the system and found that their abilities had significantly improved (Tables 2 and 3; Fig. 2B). Although some degree of improvement might be expected simply as students gain more practice over time, the magnitude of the observed gains and the positive student feedback suggest that the virtual training provided benefits beyond natural progression. One possible explanation is that the system provides a highly immersive and realistic learning environment in which students could systematically integrate aetiology, disease symptoms, diagnosis, and treatment planning with communication skills. Another advantage is that the system allows students to make mistakes and learn from them without harming real patients [40]; this process of trial and error is crucial for developing clinical reasoning skills. Regarding training dosage, while the programme was limited to four sessions due to curriculum constraints, the observation that performance plateaued after two or three sessions suggests that, although greater frequency could theoretically provide additional reinforcement, a point of cognitive saturation may exist for these specific clinical cases.
The conventional evaluation of clinical competence relies heavily on subjective judgment, which is inherently subject to inter-rater variability [41]. In contrast, our system automatically records each student's actions for every case and presents comprehensive scores on completion, which is more objective and standardised. Additionally, the system's feedback mechanism provides teachers with precise student data, allowing them to adjust teaching strategies and optimise personalised learning paths to enhance learning outcomes. It also alleviates the teaching burden on educators, allowing them to dedicate more time and energy to course design and personalised student guidance, thereby stimulating innovation in teaching practice.
The positive feedback from students obtained through the questionnaires further supports the effectiveness of this virtual training system (Fig. 2C and D). Students found the system "engaging" and recognised that it helped them improve their clinical reasoning abilities. However, the usability ratings in Table 4 reflect more tempered enthusiasm about user-friendliness. This could be attributed to the initial familiarisation required by the multifunctional virtual training system. In addition, the incorporation of haptic feedback may have increased the initial learning demands on novice learners, contributing to a more moderate usability perception despite the system's perceived educational value. This high level of acceptance of the system's educational value among students is essential for successfully integrating such a system into the dental curriculum. In comparison with previous studies involving virtual reality systems in dental and medical education, our findings align with the general trend of VR demonstrating positive impacts [42, 43]. Nevertheless, our study extends beyond practical skills: by focusing on clinical reasoning and communication, we have identified an additional benefit of VR in dental education.
The virtual scenario-based clinical reasoning training system is a valuable supplementary tool for training dental students in clinical reasoning and communication. However, the dental issues encountered by dentists are highly complex, and digital systems are still unable to fully replace real-life clinical scenarios. The system therefore should not and cannot fully replace clinical internships; instead, it can serve as an effective tool for pre-clinical training, helping dental students adapt to clinical internships more quickly and effectively.
The virtual scenario-based clinical reasoning training system has achieved positive results, yet it has room for growth. Future improvement will focus on leveraging AI to craft intelligent virtual patients and enable personalized case generation. By integrating AI, the system can analyse students’ learning progress and adaptability, generating cases that precisely match their individual needs [44, 45]. The integration of Large Language Models (LLMs) in dental pedagogy has already demonstrated the capacity to deliver high-fidelity, individualized feedback that markedly enhances diagnostic precision [46]. This personalised approach ensures that students at different learning stages are challenged appropriately, promoting more efficient skill acquisition [47]. It should also be noted that minor system responsiveness issues were occasionally observed, indicating the need for further technical refinement in future iterations. In addition, enhancing virtual patient responses is a key aspect of the upgrade. AI-driven tools will endow virtual patients with the ability to respond flexibly to students’ questions and even pose proactive inquiries, fostering students’ critical thinking and clinical reasoning [48].
Nevertheless, our study inevitably has some limitations. It was conducted in a specific context with a limited sample size, which in some cases resulted in relatively large standard deviations, indicating variability in individual responses and limiting the precision of the estimates. Additionally, the study used a before-and-after design without a control group. This decision was primarily because the training programme was newly introduced into the curriculum this semester, no comparable module had previously existed, and our initial objective was to evaluate whether adding this module could benefit students. While this design allowed us to capture changes in student performance before and after the intervention, it limits causal inference and leaves the study prone to biases such as maturation effects and selection bias [49], which in turn limits the generalisability of the findings to other groups or settings. Finally, although this study demonstrated that virtual training can enhance skills in the short term, the long-term retention of these skills remains unexplored. Future research should investigate whether the learned skills are retained over time, for example through longitudinal assessments several weeks or months after the training.
Conclusions
The virtual scenario-based clinical reasoning training system integrates the dental consultation process with virtual reality technology, creating a highly realistic virtual clinical environment. This allows students to overcome the limitations of time and space, enabling autonomous learning, improving learning efficiency, mastering provider-patient communication skills, and ensuring patient safety and well-being. However, while the system addresses the shortcomings of conventional case-based teaching, it cannot fully replace clinical internships at this stage, and there is still room for improvement and refinement.
Acknowledgements
We thank all the students and teachers for participating.
Abbreviations
- CBL
Case-Based Learning
- ELT
Experiential Learning Theory
- PBL
Problem-Based Learning
- SEGUE
Set the stage, Elicit information, Give information, Understand the patient's perspective, and End the encounter
- SUS
System Usability Scale
Authors’ contributions
YC, contributed to conception and design, data acquisition, analysis, and manuscript writing; SY, contributed to analysis, data interpretation, and manuscript editing; YP, ZY, contributed to implementation and data acquisition; ZW, ZG, contributed to data interpretation and manuscript revision; BY, LL, contributed to conception and design, supervision, and manuscript revision. All authors gave final approval and agreed to be accountable for all aspects of the work.
Funding
This work was supported by National Online Course Research Project for Graduate Students in Medicine and Pharmacy (2024-28); “Healthy China” Thematic Case Project of the Degree and Graduate Education Development Centre of the Ministry of Education (ZT-2410312003); Educational Research Project of Stomatology of Nanjing Medical University (2024-01); Innovation Project of Jiangsu Province (BZ2023042).
Data availability
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Declarations
Ethics approval and consent to participate
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. The protocol of this study was approved by the Ethical Committee Department, Affiliated Hospital of Stomatology, Nanjing Medical University (PJ2023-129-001). Informed consent was obtained from all individual participants included in the study.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Contributor Information
Luwei Liu, Email: liuluwei_orth@njmu.edu.cn.
Bin Yan, Email: byan@njmu.edu.cn.
References
- 1. Richards JB, Hayes MM, Schwartzstein RM. Teaching clinical reasoning and critical thinking: from cognitive theory to practical application. Chest. 2020;158(4):1617–28.
- 2. Levin M, Cennimo D, Chen S, Lamba S. Teaching clinical reasoning to medical students: a case-based illness script worksheet approach. MedEdPORTAL. 2016;12:10445.
- 3. Alavi-Moghaddam M, Zeinaddini-Meymand A, Ahmadi S, Shirani A. Teaching clinical reasoning to medical students: a brief report of case-based clinical reasoning approach. J Educ Health Promot. 2024;13:42.
- 4. Puranik CP, Pickett K, de Peralta T. Evaluation of problem-based learning in dental trauma education: an observational cohort study. Dent Traumatol. 2023;39(6):625–36.
- 5. Wang H, Xuan J, Liu L, Shen X, Xiong Y. Problem-based learning and case-based learning in dental education. Ann Transl Med. 2021;9(14):1137.
- 6. Manas A, Ismail PMS, Mohan R, Madiraju GS, Mulla M, Mulla M, Babaji P. The role of combined problem-based learning (PBL) and case-based learning (CBL) teaching methodologies in dental education. J Educ Health Promot. 2024;13:417.
- 7. Chaturvedi S, Elmahdi AE, Abdelmonem AM, Haralur SB, Alqahtani NM, Suleman G, Sharif RA, Gurumurthy V, Alfarsi MA. Predoctoral dental implant education techniques: students’ perception and attitude. J Dent Educ. 2021;85(3):392–400.
- 8. Dong H, Guo C, Zhou L, Zhao J, Wu X, Zhang X, Zhang X. Effectiveness of case-based learning in Chinese dental education: a systematic review and meta-analysis. BMJ Open. 2022;12(2):e048497.
- 9. Zhang SY, Zheng JW, Yang C, Zhang ZY, Shen GF, Zhang JZ, Xu YJ, Cao X. Case-based learning in clinical courses in a Chinese college of stomatology. J Dent Educ. 2012;76(10):1389–92.
- 10. Du GF, Li CZ, Shang SH, Xu XY, Chen HZ, Zhou G. Practising case-based learning in oral medicine for dental students in China. Eur J Dent Educ. 2013;17(4):225–8.
- 11. Yardley S, Teunissen PW, Dornan T. Experiential learning: transforming theory into practice. Med Teach. 2012;34(2):161–4.
- 12. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28.
- 13. Sereewisai B, Chintavalakorn R, Santiwong P, Nakornnoi T, Neoh SP, Sipiyaruk K. The accuracy of virtual setup in simulating treatment outcomes in orthodontic practice: a systematic review. BDJ Open. 2023;9(1):41.
- 14. Yamakami SA, Nagai M, Chutinan S, Ohyama H. 3D digital technology as an alternative educational tool in preclinical dentistry. Eur J Dent Educ. 2022;26(4):733–40.
- 15. Ishida Y, Kuwajima Y, Kobayashi T, Yonezawa Y, Asack D, Nagai M, Kondo H, Ishikawa-Nagai S, Da Silva J, Lee SJ. Current implementation of digital dentistry for removable prosthodontics in US dental schools. Int J Dent. 2022;2022:7331185.
- 16. Bandiaky ON, Lopez S, Hamon L, Clouet R, Soueidan A, Le Guehennec L. Impact of haptic simulators in preclinical dental education: a systematic review. J Dent Educ. 2024;88(3):366–79.
- 17. Felszeghy S, Mutluay M, Liukkonen M, Flacco N, Bakr MM, Rampf S, Schick SG, Mushtaq F, Sittoni-Pino MF, Ackerman K, et al. Benefits and challenges of the integration of haptics-enhanced virtual reality training within dental curricula. J Dent Educ. 2025;89(7):1070–83.
- 18. Huang Y, Huang S, Liu Y, Lin Z, Hong Y, Li X. Application of virtual reality and haptics system Simodont in Chinese dental education: a scoping review. Eur J Dent Educ. 2025;29(3):585–93.
- 19. Yuan CY, Wang XY, Dong YM, Gao XJ. [Effect of digital virtual simulation system for preclinical teaching of access and coronal cavity preparation]. Zhonghua Kou Qiang Yi Xue Za Zhi. 2021;56(5):479–84.
- 20. Duan M, Lv S, Fan B, Fan W. Effect of 3D printed teeth and virtual simulation system on the pre-clinical access cavity preparation training of senior dental undergraduates. BMC Med Educ. 2024;24(1):913.
- 21. Li Y, Ye H, Wu S, Zhao X, Liu Y, Lv L, Zhang P, Zhang X, Zhou Y. Mixed reality and haptic-based dental simulator for tooth preparation: research, development, and preliminary evaluation. JMIR Serious Games. 2022;10(1):e30653.
- 22. Liu L, Zhou R, Yuan S, Sun Z, Lu X, Li J, Chu F, Walmsley AD, Yan B, Wang L. Simulation training for ceramic crown preparation in the dental setting using a virtual educational system. Eur J Dent Educ. 2020;24(2):199–206.
- 23. Gurbuz E, Gurbuz AA. Investigation of the effect of virtual reality distraction in patients undergoing mandibular periodontal surgery: a randomized controlled study. J Esthet Restor Dent. 2024;36(5):813–22.
- 24. Reymus M, Liebermann A, Diegritz C. Virtual reality: an effective tool for teaching root canal anatomy to undergraduate dental students - a preliminary study. Int Endod J. 2020;53(11):1581–7.
- 25. Bodein I, Forestier M, Le Borgne C, Lefebvre JM, Pincon C, Garat A, Standaert A, Decaudin B. Evaluation of simulation-based training program intended to improve interprofessional communication skills of community pharmacy and general medicine students. Ann Pharm Fr. 2023;81(2):354–65.
- 26. Bartlett JL, Kinsey JD. Large-group, asynchronous, interprofessional simulation: identifying roles and improving communication with student pharmacists and student nurses. Curr Pharm Teach Learn. 2020;12(6):763–70.
- 27. Faferek J, Cariou PL, Hege I, Mayer A, Morin L, Rodriguez-Molina D, Sousa-Pinto B, Kononowicz AA. Integrating virtual patients into undergraduate health professions curricula: a framework synthesis of stakeholders’ opinions based on a systematic literature review. BMC Med Educ. 2024;24(1):727.
- 28. Quail NPA, Boyle JG. Virtual patients in health professions education. Adv Exp Med Biol. 2019;1171:25–35.
- 29. Dyer E, Swartzlander BJ, Gugliucci MR. Using virtual reality in medical education to teach empathy. J Med Libr Assoc. 2018;106(4):498–500.
- 30. Bracq MS, Michinov E, Jannin P. Virtual reality simulation in nontechnical skills training for healthcare professionals: a systematic review. Simul Healthc. 2019;14(3):188–94.
- 31. Pohlenz P, Grobe A, Petersik A, von Sternberg N, Pflesser B, Pommert A, Hohne KH, Tiede U, Springer I, Heiland M. Virtual dental surgery as a new educational tool in dental school. J Craniomaxillofac Surg. 2010;38(8):560–4.
- 32. Makoul G. The SEGUE framework for teaching and assessing communication skills. Patient Educ Couns. 2001;45(1):23–34.
- 33. Ghorayeb A, Darbyshire JL, Wronikowska MW, Watkinson PJ. Design and validation of a new healthcare systems usability scale (HSUS) for clinical decision support systems: a mixed-methods approach. BMJ Open. 2023;13(1):e065323.
- 34. Huang Y, Cheng X, Chan U, Zheng L, Hu Y, Sun Y, Lai P, Dai J, Yang X. Virtual reality approach for orthodontic education at School of Stomatology, Jinan University. J Dent Educ. 2022;86(8):1025–35.
- 35. Bangor A, Kortum PT, Miller JT. An empirical evaluation of the System Usability Scale. Int J Hum Comput Interact. 2008;24:574–94.
- 36. Brooke J. SUS: a quick and dirty usability scale. In: Usability evaluation in industry. London: Taylor & Francis; 1996. pp. 189–94.
- 37. Bangor A, Kortum P, Miller J. Determining what individual SUS scores mean: adding an adjective rating scale. J Usability Stud. 2009;4(3):114–23.
- 38. Mistry D, Geevarughese S, Brock CA, Nelson J, Drew D, Goddard CM, West RT, Kleiman K, Lindsey T. Utilizing a virtual reality matrix in medical education. Cureus. 2024;16(8):e66446.
- 39. Haowen J, Vimalesvaran S, Myint Kyaw B, Tudor Car L. Virtual reality in medical students’ education: a scoping review protocol. BMJ Open. 2021;11(5):e046986.
- 40. Buono FD, Marks A, Lee D. Virtual reality in medical education. Cyberpsychol Behav Soc Netw. 2024;27(6):361–2.
- 41. Kao CT, Wu TY, Kuo CL, Li CH, Cheng CF, Chih YK. Comparative evaluation of scenario-based clinical examinations in orthodontic certification: insights from Taiwan and the United States. J Dent Sci. 2025.
- 42. Al-Saud LM, Mushtaq F, Allsop MJ, Culmer PC, Mirghani I, Yates E, Keeling A, Mon-Williams MA, Manogue M. Feedback and motor skill acquisition using a haptic dental simulator. Eur J Dent Educ. 2017;21(4):240–7.
- 43. Moro C, Stromberga Z, Raikos A, Stirling A. The effectiveness of virtual and augmented reality in health sciences and medical anatomy. Anat Sci Educ. 2017;10(6):549–59.
- 44. Uribe SE, Maldupa I, Schwendicke F. Integrating generative AI in dental education: a scoping review of current practices and recommendations. Eur J Dent Educ. 2025;29(2):341–55.
- 45. Ardila CM, Yadalam PK. AI and dental education. Br Dent J. 2025;238(5):294.
- 46. Yilmaz BE, Ozbey F, Gokkurt Yilmaz BN, Akpinar H. Can large language models perform clinical anamnesis? Comparative evaluation of ChatGPT, Claude, and Gemini in diagnostic reasoning through case-based questioning in oral and maxillofacial disorders. J Stomatol Oral Maxillofac Surg. 2025;127(2):102644.
- 47. Gokkurt Yilmaz BN, Ozbey F, Yilmaz BE. Effect of artificial intelligence-assisted personalized feedback on radiographic diagnostic performance of dental students: a controlled study. BMC Med Educ. 2025;25(1):1403.
- 48. Sezer B, Sezer TA, Teker GT, Elcin M. Developing a virtual patient: design, usability, and learning effect in communication skills training. BMC Med Educ. 2023;23(1):891.
- 49. Torgerson DJ, Torgerson CJ. The limitations of before and after designs. In: Designing randomised trials in health, education and the social sciences: an introduction. London: Palgrave Macmillan; 2008. pp. 9–16.