Abstract
Purpose
To assess a modified use of Compare software as a resource to (1) improve students’ ability to self‐assess their endodontic access preparations (EAPs) and (2) evaluate students’ opinions of this adjunct.
Methods
Sixty second‐year dental students were randomly assigned to two groups (n = 30). A questionnaire was developed and validated. Both groups performed two accesses on #14 acrylic teeth (one at the course's outset, another at its conclusion), evaluated each using a traditional method of assessment, completed the self‐assessment form, and answered items 1‒4 of the questionnaire. The experimental group (G2) received training and was asked to evaluate their EAPs three‐dimensionally (3D), complete the self‐assessment form a second time, and answer items 5‒14 of the questionnaire. Data were analyzed statistically (significance set at 5%).
Results
Using the traditional method, G1 (control group) showed improvement in “size‐shape,” G2 in “encroachment,” and both increased their “overall rate” (p < 0.05). Using the 3D method, G2 showed improvement in the “mesial extent” (p < 0.05). No difference was found between groups for self‐assessment or opinions (items 1‒4) (p > 0.05). However, both had an increase in confidence from the first access to the second, and G1 participants believed they improved their ability to perform and assess EAPs (p < 0.05).
Conclusion
No difference was observed between the two groups when relying upon the traditional method alone. Both groups experienced an increase in confidence but only G1 believed that their ability to perform and assess EAPs improved. Notably, 96.7% of G2 believed that the 3D method should be incorporated into preclinical endodontic training.
Keywords: education, educational technology, endodontics, predental, self‐assessment, teaching methods
1. INTRODUCTION
Self‐assessment is a crucial skill predoctoral dental students need to develop during their education to ultimately improve and refine their hand skills, quality of work, and capacity to work independently as clinicians. To bridge the gap between inadequate and acceptable performance, students should be able to identify specific areas needing improvement. 1 The ability to critically appraise one's own work is a skill that students begin to cultivate during preclinical training. 2 Self‐assessment is not only a beneficial tool used throughout dental school, but also an imperative lifelong skill for providing patients with safe and effective treatment after graduation. 3
Traditional self‐assessment relies on subjective judgment by eye and on physical measurements. CAD/CAM technology, often referred to as digital dentistry, has been widely adopted into dental curricula and has been shown in restorative dentistry to improve self‐assessment skills, reduce one‐on‐one teaching time, and enhance teacher‒student communication. 4 CAD/CAM technology minimizes subjectivity, providing objective standards for assessment and detection of errors. 5 The use of three‐dimensional (3D) analysis software in preclinical courses has received positive feedback, with students reporting quicker and more objective assessment of their work. 6
Endodontic access, considered a challenging part of root canal treatment, poses stress for students due to internal anatomy variations, access difficulties, and fear of complications. 7 Traditionally, students have relied solely upon visual assessment of their access preparations based upon a general description of an ideal access (e.g., near‐ideal shape, size, and location; straight‐line access achieved; and completely unroofed), hereafter the course evaluation criteria. Alternatively, the criteria for the CDCA‐WREB‐CITA ADEX exam (ADEX) (www.cdcaexams.org/wp‐content/uploads/2022/02/2022‐CDCA‐WREB‐Endo‐Pros‐Manual.pdf) define acceptable access in terms of distances from specific landmarks, hereafter the licensure evaluation criteria. A preliminary study suggested that enhancing self‐assessment methods could improve students' understanding of the access opening procedure and their skills. 8 A recent pilot study successfully adapted the use of Planmeca Romexis Compare (Compare software; Planmeca, Hoffman Estates), formerly E4D Compare, to evaluate endodontic access preparation (EAP) three‐dimensionally. 9 However, no study has evaluated the benefits of a 3D method of assessment of EAPs compared to a traditional method.
Therefore, this study aims to (1) determine the impact of integrating 3D technology on students’ self‐assessment compared to a traditional method and (2) assess students' perceptions of the software's effectiveness. The null hypotheses were that (1) there is no difference in student self‐assessment between the traditional and 3D methods and (2) students do not consider the 3D method helpful for EAP evaluation.
2. MATERIALS AND METHODS
The study protocol was approved by the local Ethics Committee (IRB2022‐1492‐CD‐EXM) and registered on ClinicalTrials.gov (NCT06567236) and the Open Science Framework database (https://osf.io/dpbw3).
At Texas A&M University College of Dentistry, second‐year dental students are enrolled in a year‐long integrated preclinical laboratory course that encompasses several disciplines. In the fall semester, as part of the fixed prosthodontics discipline, students learn and receive extensive training in scanning and comparing their tooth preparation to an ideal model selected by the course director using Compare software. In the spring 2023 semester, there were 102 students enrolled in the integrated laboratory course. All students were invited to participate and 60 students volunteered and signed the informed consent.
The study was not part of their regular endodontic course but was performed concurrently with it. One of the researchers (C.A.B.) created a unique identification code to track participants’ assignments and ensure blind evaluation throughout the experiment. Participants were then randomly (www.random.org) allocated into G1 (n = 30; control group) or G2 (n = 30; experimental group) (Figure 1). Both groups received regular traditional endodontic training in the simulation laboratory. Additionally, G2 received training on how to use the Compare software adapted to evaluate EAPs. Training for the software consisted of one 3‐h hands‐on laboratory session and unlimited after‐hours access to the software.
FIGURE 1.

CONSORT diagram showing the flow of participants through each stage of a randomized trial.
Both G1 and G2 performed EAPs in #14 acrylic teeth (RTE #14 With Insert; Acadental) on the benchtop at the beginning of the semester, evaluated their EAPs based on the traditional method, and completed the self‐assessment form (Supporting Information S1). An ideal EAP model was always available for visual comparison in the simulation laboratory. Both groups responded to items 1‒4 of the questionnaire (Supporting Information S2). Afterward, G2 scanned and evaluated their EAPs a second time using the 3D method and completed the self‐assessment form.
Later in the semester, both groups returned to the simulation laboratory and performed a second EAP on a new #14 acrylic tooth. Once again, both groups evaluated their second EAPs using the traditional method and completed the self‐assessment form. Afterward, G2 scanned and evaluated their second EAPs using the 3D method and completed the self‐assessment form. Both groups answered items 1‒4 of the questionnaire, and only G2 answered items 5‒14 (Supporting Information S2).
2.1. Self‐assessment
The self‐assessment form (Supporting Information S1) was designed for comparing participants’ EAPs with the ideal EAP model based on the course's practical rubric, and for evaluating the opening size of each EAP against the ADEX Access Opening Size criteria.
2.2. Questionnaire
A questionnaire was developed to address handling, didactic benefit, motivation, and overall assessment of participants' EAPs.
For the questionnaire's content validation, three dental educators with expertise in the software reviewed it for needed revisions. The agreement rate (%) between experts was calculated, and each item was individually validated by calculating the content validity index (CVI). The expert agreement rate was 100% for all items, but the CVI for item 4 was 0.67. After adjustments, item 4 reached a CVI of 1. Subsequently, a pretest with 15 randomly selected first‐year dental students was performed. They were asked to assess item intelligibility and make relevant suggestions. Items 6, 7, 8, 10, 11, and 12 received suggestions regarding grammar and sentence structure. They were adjusted accordingly, after which the questionnaire was considered validated and ready to be administered.
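The item‐level CVI arithmetic described above can be sketched as follows. This is a hypothetical illustration only: the 4‐point relevance scale and the rating cutoff are common validation conventions, not details stated in the study, though the result for a 3‐expert panel with 2 agreeing (0.67) matches item 4's reported initial CVI.

```python
# Item-level content validity index (I-CVI): the fraction of expert judges
# who rate an item as relevant. Scale and ratings here are hypothetical.

def item_cvi(ratings, relevant=(3, 4)):
    """ratings: one rating per expert on an assumed 4-point relevance scale;
    ratings of 3 or 4 count as judging the item relevant."""
    agreeing = sum(1 for r in ratings if r in relevant)
    return round(agreeing / len(ratings), 2)

# Three experts, two rating the item relevant -> I-CVI = 0.67,
# prompting revision of the item.
print(item_cvi([4, 3, 2]))  # 0.67
# After revision, all three experts agree -> I-CVI = 1.0.
print(item_cvi([4, 4, 3]))  # 1.0
```

With a panel of only three experts, an I‐CVI below 1.0 is usually taken as a signal to revise, which is consistent with item 4 being adjusted until it reached a CVI of 1.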
The validated 14‐item questionnaire (Supporting Information S2) was uploaded to the Qualtrics Survey Platform (Qualtrics) and administered as described above. Items 1‒4 covered participants' perception of their own capacity to perform endodontic accesses and were answered by both groups, while items 5‒14 covered opinions on the use of Compare software and the 3D method of assessment and were answered only by G2. All responses were collected on a Likert scale from 1 (strongly disagree) to 5 (strongly agree) and tabulated.
2.3. Statistical analysis
The normality assumption was not met (Shapiro‒Wilk test; p < 0.05), so nonparametric tests were applied. The independent‐samples Mann‒Whitney U test was used for comparisons between groups, and the Wilcoxon signed‐rank test for two related samples was used for intragroup comparisons. Descriptive analyses of questionnaire responses were expressed as absolute and relative frequencies (%). The significance level was set at 5% (SPSS v29.0; IBM Corp.).
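For readers who wish to reproduce this test sequence outside SPSS, a minimal sketch in Python with SciPy follows. The Likert‐style scores below are fabricated for illustration only; they are not the study's data.

```python
# Sketch of the reported test sequence (the study itself used SPSS v29):
# 1) Shapiro-Wilk normality check, 2) between-group Mann-Whitney U test,
# 3) within-group Wilcoxon signed-rank test. Data are fabricated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
g1_first = rng.integers(2, 6, size=29)   # G1, first access (scores 2-5)
g1_second = rng.integers(3, 6, size=29)  # G1, second access (scores 3-5)
g2_first = rng.integers(2, 6, size=30)   # G2, first access

# 1. Normality check; p < 0.05 -> reject normality, use nonparametric tests.
_, p_norm = stats.shapiro(g1_first)

# 2. Between-group comparison at one timepoint (independent samples).
_, p_between = stats.mannwhitneyu(g1_first, g2_first)

# 3. Intragroup comparison, first vs. second access (paired samples).
_, p_within = stats.wilcoxon(g1_first, g1_second)

alpha = 0.05  # significance level set at 5%, as in the study
print(p_norm < alpha, p_between < alpha, p_within < alpha)
```

Note that SPSS and SciPy may use different tie corrections and exact versus asymptotic p‐value methods, so results can differ slightly at the margins.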
3. RESULTS
One participant from G1 withdrew from the study (n = 29) (Figure 1).
3.1. Self‐assessment
Median and range (minimum‒maximum) of self‐assessment parameters associated with traditional and 3D methods are shown in Tables 1 and 2, respectively.
TABLE 1.
Median (range) of participants' self‐assessment using traditional method of assessment.
| Parameter | Group | First access | Second access | p‐value |
|---|---|---|---|---|
| Course's practical rubric | | | | |
| Preparation size‐shape | G1 | 4 (2‒5) | 5 (2‒5) | 0.043 a |
| | G2 | 4 (3‒5) | 5 (3‒5) | 0.175 |
| | p‐Value | 0.452 | 0.522 | |
| Preparation location | G1 | 5 (3‒5) | 5 (3‒5) | 0.153 |
| | G2 | 5 (4‒5) | 5 (3‒5) | 0.782 |
| | p‐Value | 0.531 | 0.561 | |
| Encroachment | G1 | 5 (3‒5) | 5 (2‒5) | 0.615 |
| | G2 | 5 (2‒5) | 5 (4‒5) | 0.046 a |
| | p‐Value | 0.756 | 0.465 | |
| Straight line | G1 | 2 (1‒2) | 2 (1‒2) | 0.655 |
| | G2 | 2 (2‒2) | 2 (2‒2) | 1.000 |
| | p‐Value | 0.073 | 0.147 | |
| Overextension | G1 | 2.5 (1‒3) | 3 (1‒3) | 0.197 |
| | G2 | 2 (1‒3) | 2 (1‒3) | 0.808 |
| | p‐Value | 0.933 | 0.481 | |
| Drilling into root | G1 | 3 (2‒3) | 3 (1‒3) | 0.317 |
| | G2 | 3 (2‒3) | 3 (3‒3) | 0.564 |
| | p‐Value | 0.369 | 0.078 | |
| External crown shape | G1 | 2 (1‒2) | 2 (2‒2) | 0.083 |
| | G2 | 2 (2‒2) | 2 (2‒2) | 1.000 |
| | p‐Value | 0.073 | 1.000 | |
| Perforation | G1 | 2 (1‒2) | 2 (2‒2) | 0.317 |
| | G2 | 2 (2‒2) | 2 (1‒2) | 0.317 |
| | p‐Value | 0.309 | 0.326 | |
| Unroofing | G1 | 4 (3‒5) | 5 (3‒5) | 0.071 |
| | G2 | 5 (4‒5) | 5 (4‒5) | 0.058 |
| | p‐Value | 0.269 | 0.342 | |
| Overall rate | G1 | 4 (2‒5) | 4 (2‒5) | 0.008 a |
| | G2 | 4 (3‒5) | 4 (3‒5) | 0.008 a |
| | p‐Value | 0.536 | 1.000 | |
| ADEX access opening size criteria | | | | |
| Mesial | G1 | 2 (1‒2) | 2 (1‒2) | 1.000 |
| | G2 | 2 (2‒2) | 2 (1‒2) | 0.317 |
| | p‐Value | 0.309 | 0.981 | |
| Buccal | G1 | 2 (1‒2) | 2 (1‒2) | 0.414 |
| | G2 | 2 (1‒2) | 2 (1‒2) | 0.257 |
| | p‐Value | 0.787 | 0.965 | |
| Distal | G1 | 2 (1‒2) | 2 (1‒2) | 0.564 |
| | G2 | 2 (1‒2) | 2 (1‒2) | 1.000 |
| | p‐Value | 0.177 | 0.417 | |
| Palatal | G1 | 2 (1‒2) | 2 (1‒2) | 0.564 |
| | G2 | 2 (2‒2) | 2 (1‒2) | 0.317 |
| | p‐Value | 0.147 | 0.981 | |
| Access size rate | G1 | 2 (1‒2) | 2 (1‒2) | 0.564 |
| | G2 | 2 (1‒2) | 2 (1‒2) | 1.000 |
| | p‐Value | 0.537 | 0.981 | |
a Significant difference between first and second accesses (Wilcoxon signed‐rank test for two related samples, p < 0.05). No statistically significant difference was found between groups (independent‐samples Mann‒Whitney U test, p > 0.05).
TABLE 2.
Median (range) of participants' self‐assessment using 3D method of assessment.
| Parameter | First access | Second access | p‐value |
|---|---|---|---|
| Course's practical rubric | | | |
| Preparation size‐shape | 4 (2‒5) | 4.5 (3‒5) | 0.180 |
| Preparation location | 5 (2‒5) | 5 (3‒5) | 0.382 |
| Encroachment | 5 (1‒5) | 5 (3‒5) | 0.059 |
| Straight line | 2 (1‒2) | 2 (1‒2) | 0.096 |
| Overextension | 2 (1‒3) | 2 (1‒3) | 0.071 |
| Drilling into root | 3 (2‒3) | 3 (2‒3) | 0.655 |
| External crown shape | 2 (1‒2) | 2 (2‒2) | 0.317 |
| Perforation | 2 (1‒2) | 2 (2‒2) | 0.317 |
| Unroofing | 5 (3‒5) | 5 (3‒5) | 0.963 |
| Overall rate | 4 (1‒5) | 4 (2‒5) | 0.060 |
| ADEX access opening size criteria | | | |
| Mesial | 2 (1‒2) | 2 (1‒2) | 0.035 a |
| Buccal | 2 (1‒2) | 2 (1‒2) | 0.527 |
| Distal | 2 (1‒2) | 2 (1‒2) | 0.705 |
| Palatal | 2 (1‒2) | 2 (2‒2) | 0.157 |
| Access size rate | 2 (1‒2) | 2 (1‒2) | 0.102 |
a Significant difference between first and second accesses (Wilcoxon signed‐rank test for two related samples, p < 0.05).
There was no significant difference between groups for any of the variables tested (p > 0.05). Using the traditional method, G1 showed significant improvement in preparation size‐shape (p = 0.043), G2 in encroachment of marginal ridges (p = 0.046), and both groups significantly increased their overall rate (p < 0.05). Using the 3D method, G2 showed improvement in the mesial criterion of access opening size (p = 0.035).
When G2 self‐assessed the same access using the traditional method and later using the 3D method, significant differences were noted. For the first access, overextension (p = 0.003) and deficiency of the mesial criterion for access opening size (p < 0.001) were better identified using the 3D method. Moreover, the access opening size rating changed significantly from acceptable to deficient (p = 0.046). For the second access, overextension (p = 0.029), lack of straight‐line access (p = 0.008), and deficiency of the mesial criterion for access opening size (p = 0.014) were better identified using the 3D method.
3.2. Questionnaire
Table 3 shows the relative and absolute frequencies of G1's and G2's perceptions of their capacities related to performing endodontic accesses. No significant difference was seen between groups for items 1‒4 (p > 0.05). Both G1 and G2 had an increase in confidence from the first to the second EAP (p < 0.001 and p = 0.042, respectively). However, G1 also reported a significant increase in their perceived ability to perform (p < 0.001) and assess (p = 0.032) EAPs.
TABLE 3.
Relative (%) and absolute (n) frequencies of participants' perceptions of their own capacities.
| Questions 1‒4 | Group | First access: 1—strongly disagree | 2—disagree | 3—neither agree nor disagree | 4—agree | 5—strongly agree | Second access: 1—strongly disagree | 2—disagree | 3—neither agree nor disagree | 4—agree | 5—strongly agree | p‐value |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| I feel confident | G1, frequency (n) | 6.7% (2) | 23.3% (7) | 30.0% (9) | 36.7% (11) | – | – | 10.0% (3) | 13.3% (4) | 43.3% (13) | 30.0% (9) | <0.001 a |
| | G2, frequency (n) | – | 33.3% (10) | 16.7% (5) | 46.7% (14) | 3.3% (1) | 10.0% (3) | 3.3% (1) | 10.0% (3) | 53.3% (16) | 23.3% (7) | 0.042 a |
| | p‐Value | 0.464 | | | | | 0.612 | | | | | |
| I am able to perform | G1, frequency (n) | – | 10.0% (3) | 43.3% (13) | 36.7% (11) | 6.7% (2) | – | 3.3% (1) | 6.7% (2) | 53.3% (16) | 33.3% (10) | <0.001 a |
| | G2, frequency (n) | – | 6.7% (2) | 33.3% (10) | 53.3% (16) | 6.7% (2) | 10.0% (3) | 3.3% (1) | – | 63.3% (19) | 23.3% (7) | 0.160 |
| | p‐Value | 0.299 | | | | | 0.320 | | | | | |
| I am able to assess | G1, frequency (n) | – | 13.3% (4) | 20.0% (6) | 46.7% (14) | 16.7% (5) | – | – | 16.7% (5) | 53.3% (16) | 26.7% (8) | 0.032 a |
| | G2, frequency (n) | – | 3.3% (1) | 20.0% (6) | 70.0% (21) | 6.7% (2) | 10.0% (3) | – | – | 60.0% (18) | 30.0% (9) | 0.264 |
| | p‐Value | 0.744 | | | | | 0.681 | | | | | |
| I am able to identify errors | G1, frequency (n) | – | 10.0% (3) | 23.3% (7) | 46.7% (14) | 16.7% (5) | – | 6.7% (2) | 20.0% (6) | 46.7% (14) | 23.3% (7) | 0.334 |
| | G2, frequency (n) | 3.3% (1) | 10.0% (3) | 20.0% (6) | 56.7% (17) | 10.0% (3) | 10.0% (3) | – | – | 53.3% (16) | 36.7% (11) | 0.101 |
| | p‐Value | 0.704 | | | | | 0.157 | | | | | |
a Significant difference between first and second accesses (Wilcoxon signed‐rank test for two related samples, p < 0.05). No statistically significant difference was found between groups (independent‐samples Mann‒Whitney U test, p > 0.05).
Table 4 shows the relative and absolute frequencies of G2's perception of the 3D method using the Compare software adapted to evaluate EAPs. One participant in G2 selected "strongly disagree" for every item. Furthermore, at least 90% of G2 agreed that the Compare software was easy to learn, was more helpful than the traditional method of assessment, and should be included in preclinical training, including preparation for the licensure exam.
TABLE 4.
Absolute (n) and relative (%) frequencies of G2's perception of the 3D method of assessment using Compare software.
| Questions 5‒14 | 1—strongly disagree, n (%) | 2—disagree, n (%) | 3—neither agree nor disagree, n (%) | 4—agree, n (%) | 5—strongly agree, n (%) |
|---|---|---|---|---|---|
| Easy to learn | 1 (3.3) | – | – | 19 (63.3) | 10 (33.3) |
| Compare tool helps identify weaknesses and strengths | 1 (3.3) | – | – | 7 (23.3) | 22 (73.3) |
| Shoulder width tool helps determine whether opening sizes are acceptable or deficient | 1 (3.3) | – | 1 (3.3) | 9 (30.0) | 19 (63.3) |
| Overall, software helps see positive and negative aspects | 1 (3.3) | – | – | 11 (36.7) | 18 (60.0) |
| Compare tool is more helpful than traditional/visual method | 1 (3.3) | – | 2 (6.7) | 9 (30.0) | 18 (60.0) |
| Shoulder width tool is more helpful than traditional/visual method | 1 (3.3) | – | 2 (6.7) | 10 (33.3) | 17 (56.7) |
| Overall, 3D software is more helpful than traditional/visual method | 1 (3.3) | – | – | 9 (30.0) | 20 (66.7) |
| Using software to self‐assess improves technical skills | 1 (3.3) | – | 2 (6.7) | 14 (46.7) | 13 (43.3) |
| Should be included in preclinical training for all students | 1 (3.3) | – | – | 9 (30.0) | 20 (66.7) |
| Would use software for training/self‐evaluation for the licensure exam | 1 (3.3) | – | 2 (6.7) | 8 (26.7) | 19 (63.3) |
4. DISCUSSION
EAPs are the initial step in nonsurgical root canal treatment. Understanding the anatomy of the pulp helps identify factors that complicate access cavity preparation. Common errors include leaving carious tissue in the pulp chamber, selecting an incorrect access point, using an excessively large drill, improper tooth positioning, incomplete access, failure to unroof the chamber, and perforations in anterior teeth or molar floors. 10 Special care is needed to avoid missing canals in the mesiobuccal root of the maxillary first molar, multi‐rooted premolars, mandibular central incisors, and teeth with dens invaginatus. Predoctoral students often make procedural errors in access cavity preparation, indicating a need for more training. 10 , 11
The application of randomized controlled trials (RCTs), a cornerstone of medical sciences research, is increasingly being explored within the field of education. 12 , 13 Moreover, 3D educational technology tools can provide students with more interactive and personalized learning experiences. These tools support dental education by allowing students to immerse themselves in realistic scenarios, enhancing their skills and understanding of procedures. 14 In endodontics, a recent study 15 developed a novel augmented reality method for guided access cavity preparation using 3D‐printed jaws. Another study 16 implemented haptic virtual reality simulator training using microcomputed tomography tooth models to minimize procedural errors in EAP. Thus, the present study employed an RCT to compare two distinct methods of self‐assessment in preclinical endodontic education: traditional versus 3D.
Properly applying self‐assessment criteria is crucial for developing a solid understanding of the subject, and students should be allowed to effectively implement these criteria in their work. 17 Our study utilized rubrics familiar to students from their preclinical course, alongside the ADEX criteria for access opening (Supporting Information S1). This approach was deliberately chosen to avoid confusing students with a different rubric not utilized in their coursework, thereby ensuring practical relevance. The course's practical rubric is a simplified version of the rubric outlined by Abiad in 2017, 18 which focuses on broader criteria not directly tied to the students’ immediate educational and licensure requirements.
The second goal of this study was to assess students' perceptions of their capacities and the software's effectiveness through a survey. More than 80% of G2 participants agreed/strongly agreed with all items on the use of Compare software and 3D method of assessment, except item 12, which considered the improvement of their technical skills (Table 4; Supporting Information S2). This suggests that while participants perceived that the software significantly enhances knowledge and self‐assessment abilities, it does not directly translate to improved technical skills, which requires physical and mental training. 19 The software aids in identifying flaws and providing objective feedback, thereby refining self‐assessment capabilities rather than directly enhancing hands‐on skills.
Both groups in the study reported an increase in confidence after the second access preparation, likely due to increased familiarity and comfort with the procedure (Table 3). However, only G1 felt their ability to perform and assess had improved, a phenomenon potentially explained by the Dunning‒Kruger effect. This cognitive bias causes individuals with lower competence to overestimate their abilities due to a lack of self‐awareness. 20 For low performers, the less calibrated their self‐estimates are, the more confident they are in their accuracy, supporting the idea that awareness of one's knowledge and ignorance depends partly on the extent of one's knowledge. 21
The critical feedback provided by the software acted as a magnifying glass, increasing students' awareness of minor flaws and leading to a more accurate self‐assessment. G1, using the traditional method of self‐assessment, may have developed an overconfident perception of their skills, unlike G2, who utilized the 3D method of assessment. The software provided a more accurate and critical evaluation, highlighting flaws that traditional methods might overlook, thus tempering their self‐assessment. Our finding aligns more closely with the results described by Surdilovic et al., 22 who evaluated cognitive bias among fifth‐year dental students in the United Arab Emirates in the competency domains of communication, diagnosis, and clinical skills in pediatric dentistry. Throughout the year, the students’ level of confidence declined and then shifted to a higher level across all domains, demonstrating the Dunning‒Kruger effect. 22
This study lays the groundwork for integrating 3D evaluation tools into dental curricula, potentially revolutionizing endodontic education and assessment. A limitation of the present study was the participation rate of 57.8%; a larger sample size might have revealed additional significant differences, a limitation that future multi‐center studies could overcome. Additionally, the online survey format posed challenges, as uncontrolled environments may have affected the accuracy and genuineness of responses, exemplified by the outlier in G2 who consistently selected "strongly disagree."
The study advances our understanding of the potential application of Compare software in evaluating EAPs. Previously used for assessing crown and operative preparations, 4 , 6 this novel application within the endodontics discipline underscores the potential of CAD/CAM educational software as an adjunctive learning tool. The findings suggest that students better engage and learn with technology, preparing them for practical and board exams. Other dental schools can adopt this method, using various CAD/CAM software, to enhance their endodontic training programs.
5. CONCLUSION
This study reveals that while no significant difference was observed between the group using the 3D method of assessment and the group relying on traditional assessment alone for endodontic evaluation, both groups experienced an increase in confidence. However, only the group using the traditional method felt that their ability to perform and assess endodontic accesses improved, while the group equipped with the addition of 3D assessment examined their work more critically. Notably, 96.7% of participants who used the 3D method believed that 3D evaluation should be integrated into preclinical endodontic training.
CONFLICT OF INTEREST STATEMENT
The authors declare they have no conflicts of interest.
Supporting information
Supporting Information S1: Self‐assessment form
Supporting Information S2: Questionnaire
ACKNOWLEDGMENTS
We thank the volunteers for their invaluable participation and the Departments of Endodontics and Comprehensive Dentistry for their support.
Poly A, Harness C, Vu E, et al. Integrating digital technology in endodontic education: A randomized controlled trial evaluating student self‐assessment and perspectives. J Dent Educ. 2025;89:1294–1302. 10.1002/jdd.13821
REFERENCES
1. Gadbury‐Amyot CC, Woldt JL, Siruta‐Austin KJ. Self‐assessment: a review of the literature and pedagogical strategies for its promotion in dental education. J Dent Hyg. 2015;89(6):357‐364.
2. Rung A, George R. A systematic literature review of assessment feedback in preclinical dental education. Eur J Dent Educ. 2021;25(1):135‐150.
3. Burrows RS. Understanding self‐assessment in undergraduate dental education. Br Dent J. 2018;224(11):897‐900.
4. Chiang H, Staffen A, Abdulmajeed A, et al. Effectiveness of CAD/CAM technology: a self‐assessment tool for preclinical waxing exercise. Eur J Dent Educ. 2021;25(1):50‐55.
5. Arnetzl G, Dornhofer R. PREPassistant: a system for evaluating tooth preparations. Int J Comput Dent. 2004;7(2):187‐197.
6. Hamil LM, Mennito AS, Renné WG, Vuthiganon J. Dental students' opinions of preparation assessment with E4D compare software versus traditional methods. J Dent Educ. 2014;78(10):1424‐1431.
7. Picart G, Pouhaër M, Dautel A, et al. Dental students' observations about teaching of endodontic access cavities in a French dental school. Eur J Dent Educ. 2022;26(3):499‐505.
8. Choi S, Choi J, Peters OA, Peters CI. Design of an interactive system for access cavity assessment: a novel feedback tool for preclinical endodontics. Eur J Dent Educ. 2023;27(4):1031‐1039.
9. Poly A, Burnett JE, Buie CA, Schweitzer JL. Three‐dimensional software adapted to evaluate endodontic access cavity preparation. J Dent Educ. 2023;87(suppl 3):1848‐1851.
10. Estrela C, Pécora JD, Estrela CRA, et al. Common operative procedural errors and clinical factors associated with root canal treatment. Braz Dent J. 2017;28(2):179‐190.
11. Almutairi M, Alattas MH, Alamoudi A, et al. Challenges assessment in endodontics among undergraduate students. Cureus. 2023;15(8):e43215.
12. Gutersohn C, Schweingruber S, Haudenschild M, et al. Medical device education: study protocol for a randomised controlled trial comparing self‐directed learning with traditional instructor‐led learning on an anaesthesia workstation. BMJ Open. 2023;13(9):e070261.
13. Moll‐Khosrawi P, Zöllner C, Cencin N, Schulte‐Uentrop L. Flipped learning enhances non‐technical skill performance in simulation‐based education: a randomised controlled trial. BMC Med Educ. 2021;21(1):353.
14. Aminoshariae A, Nosrat A, Nagendrababu V, et al. Artificial intelligence in endodontic education. J Endod. 2024;50(5):562‐578.
15. Farronato M, Torres A, Pedano MS, Jacobs R. Novel method for augmented reality guided endodontics: an in vitro study. J Dent. 2023;132:104476.
16. Suebnukarn S, Hataidechadusadee R, Suwannasri N, et al. Access cavity preparation training using haptic virtual reality and microcomputed tomography tooth models. Int Endod J. 2011;44(11):983‐989.
17. Abiad RS. Effect of self‐assessment training in preclinical endodontic courses on the clinical performance of undergraduate dental students. AMJ. 2018;11(5):316‐321.
18. Abiad RS. Rubrics for practical endodontics. J Orthod Endod. 2017;3(2):5.
19. Uziel N, Lugassy D, Ghanaym K, et al. Impact of physical and mental training on dental students' fine motor skills. J Oral Rehabil. 2023;50(8):698‐705.
20. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self‐assessments. J Pers Soc Psychol. 1999;77(6):1121‐1134.
21. Coutinho MVC, Thomas J, Fredricks‐Lowman I, et al. Unskilled and unaware: second‐order judgments increase with miscalibration for low performers. Front Psychol. 2024;15:1252520.
22. Surdilovic D, Adtani P, Fuoad SA, et al. Evaluation of the Dunning‒Kruger effects among dental students at an academic training institution in UAE. Acta Stomatol Croat. 2022;56(3):299‐310.