Abstract
Highly competent clinical practice requires cognitive, psychomotor and affective skills. Therefore, the ultimate goal of dental education is for practitioners to be competent in all of these domains. While many methods have been introduced to assess knowledge and non-technical skills, it is still very difficult for educators to assess technical skill.
Assessment methods for technical skills are still not well established because it is very difficult to assure objectivity, validity and fairness. Nonetheless, technical skill is especially important in dental treatments, along with knowledge and attitude. The aim of this review was to summarize the methods of technical skill training in dental education and how they are assessed.
This is a literature review. We searched PubMed MEDLINE, as of June 2020, using terms related to technical skill training and its assessment, and reviewed the retrieved reports.
There have been many reports introducing methods of technical skill training and assessment, including the use of digital technology. However, no single assessment method has demonstrated sufficient validity on its own. Technical skill training is very important in dental education and there are various ways of learning. The validity of current assessment methods is limited; therefore, a combination of several methods may achieve the best results.
Keywords: Dental education, Technical skill, Assessment
1. Introduction
Highly competent clinical practice requires skill in the cognitive, affective and psychomotor domains. Therefore, the ultimate goal of dental education is for students to become competent in each of these domains. However, while many methods to assess knowledge and non-technical skills have been introduced, it is still very difficult for educators to assess technical skill.
There are many established methods of assessing knowledge [1]. Portfolios [2] and the Objective Structured Clinical Examination (OSCE) [3] have been reported to be effective for assessing elements of the affective domain. Other methods, such as simulation and video recording followed by assessment with rating scales, have been reported to assess affective skills [4].
Assessment methods for technical skills are still not well established because it is very difficult to assure objectivity, validity and fairness. Nonetheless, technical skill is especially important in dental treatments, along with knowledge and attitude. The aim of this review was to summarize the methods of technical skill training in dental education and how they are assessed.
To cover a broad range of learning and assessment methods, this study was designed as a narrative rather than a systematic review. We first searched PubMed MEDLINE, as of June 2020, using terms related to technical skill training and reviewed the results. We then searched for reports on methods of technical skill assessment. From the retrieved articles, we selected reports considered representative of each category for this summary.
2. Methods of technical skill training
2.1. Conservative skill training with typodonts
Typodonts are a common educational tool for pre-clinical practice and have been incorporated into all sorts of ingenious programs. Gartenmann et al. reported that even short pre-clinical training using typodonts was sufficient for students to acquire the basic scaling/root planing skills needed [5]. Typodonts have also been used in a variety of courses for basic endodontic skills [6] and even tooth extraction skills [7]. It is still very common to use artificial teeth in typodonts for the training of crown preparations [8].
Various studies evaluating learning methods using typodonts for skill training have been conducted. Huth et al. assigned students six tasks in a phantom course on conservative dentistry, with pre-defined assessment criteria [9]. Lugassy et al. developed a portable tool named PhantHome, composed of jaws, gingival tissue, and a rubber cover with specific 3D-printed teeth and a compatible stand [10]. They gave students tasks that required only tweezers and a dental mirror so that PhantHome could be used at home prior to the actual class, and concluded that training with PhantHome improved students’ motor skills [10]. Settings such as simulation clinics have also proved effective for clinical skill improvement [11], and several support systems have been introduced for periodontal therapy training [12] and tooth preparation practice [13].
However, while almost all of the reports extracted acknowledged the effectiveness of the programs in improving students’ manual skills, limitations were also pointed out. Polyzois et al. [14] and Nunez et al. [15] concluded that conventional preclinical training was effective in improving technical skills, but it did not necessarily predict subsequent performance at the preclinical or clinical level.
2.2. Simulation models
Although typodonts are effective for dental skill training, they could be more useful when simulating the whole or part of a patient’s oral condition, or when combined with cognitive or affective elements such as treatment planning and consideration of patient safety. Akiba et al. introduced a comprehensive simulation model with decayed teeth, periodontal problems, missing teeth and many other dental problems in one typodont for pre-clinical training [16]. They used a rubric to assess students’ treatment planning for this model. A full-body patient simulation system called SIMROID is also available these days for dental trainees. Abe et al. concluded that SIMROID was effective in improving students’ affective elements while treating the teeth [17]. Another robot patient was developed, and Tanzawa et al. reported its usefulness for risk management education [18]. Heym et al. developed new manikin models for student periodontal examination training that could be used without systematic or technical difficulties [19].
Digital technologies have also been utilized in dental education. 3D-printed models created from oral scans of actual patients are more realistic than conventional plastic typodonts; therefore, 3D-printed simulation models have recently been suggested for dental education. PolyJet printing has been used to reproduce the realistic hardness of enamel, dentin and caries of natural teeth and could be effective for pre-clinical training of dental students [20]. 3D printing technologies have been utilized for dental education in other clinical disciplines as well [21].
2.3. Dental trainer with virtual reality (VR) technology
VR is defined as a technology that generates highly realistic images in a closed world, such as a computer screen. Recently, several virtual dental trainers have been developed and some are commercially available. One example is SIMODONT, one of the most widely used VR dental trainers worldwide, and many reports evaluating this machine have been published [[22], [23], [24]]. Many other VR dental trainers have also been introduced and evaluated [[25], [26], [27], [28], [29], [30], [31], [32], [33], [34], [35], [36]]. Since the utility of VR machines in dental education has been widely described recently, many review papers have been published as well [[37], [38], [39], [40], [41], [42], [43], [44], [45]]. While some of these concluded that VR trainers are effective in improving technical skill, others were skeptical [46], and it might be better to use them as auxiliary resources for the time being.
2.4. Dental trainer with augmented reality (AR) technology
AR is defined as a technology in which computer-generated information is superimposed on real objects. AR is therefore a fundamentally different technique from VR, and the structure of dental trainers utilizing it also differs from that of VR trainers. DentSim is often considered an example of an AR dental trainer. However, even though students using DentSim prepare actual artificial teeth, they look at a monitor presenting completely computer-generated images; strictly speaking, this is not AR. In this sense, no true AR machine exists to date. Nonetheless, there are several other digital technologies, such as computerized dental simulators, image-guided interventions and the Implant Real-time Imaging system.
There are relatively few reports regarding AR dental trainers [[47], [48], [49], [50], [51]]. Some concluded that these trainers are effective for technical skill improvement, while others suggested that an instructor’s feedback would be required in addition to the digital feedback for better results.
3. Methods for technical skill assessment
3.1. Assessment of clinical skill
From a clinical point of view, clinical skill includes both technical skills and soft skills and is thus too complicated to assess with a single method. Miller proposed a framework, known as Miller’s pyramid, for assessing clinical competence through a hierarchy of levels [52]. As he pointed out, no single assessment method can achieve sufficient validity to judge clinical skills. The American Dental Education Association (ADEA) also mentioned, in a report called the “Dental student assessment toolbox”, that using only one or two methods to assess students’ attainment of clinical skills would not be efficient or effective [53]. As a method to assess comprehensive clinical skill, the American Board of Internal Medicine developed the mini-clinical evaluation exercise (mini-CEX) in 1995 [54]. An example of the mini-CEX form is available on their website (https://www.abim.org/∼/media/ABIM%20Public/Files/pdf/paper-tools/mini-cex.pdf). Since then, many studies have reported on the use of the mini-CEX and its effectiveness [[55], [56], [57]]. However, this method focuses mainly on the assessment of soft skills in clinical practice. Therefore, Prescott et al. implemented a new assessment format, similar to the mini-CEX but including the technical domain, named the Longitudinal Evaluation of Performance (LEP) [58]. Under the concept of direct observation of student performance in clinical practice as a form of assessment, Workplace-based Assessment (WBA) and the Direct Observation of Procedural Skills (DOPS) have been commonly used recently [59]. Examples of DOPS checklists are also available online, for instance from Health Education England (https://www.nwpgmd.nhs.uk/sites/default/files/VED%20Guidance%20Pack%20-%20PLVE%20FINAL%20May%202018.pdf). These methods are thought to be effective for the assessment of both technical and soft skills in clinical practice.
All of these assessment methods have been established because assessing technical skills alone does not help improve learners’ clinical competences. The ADEA-ADEE also stated that the ultimate goal of assessment strategies such as WBA is to assess competence at a higher level and to develop the skills needed [60]. Meanwhile, Magnier et al. concluded that the competence of health professionals should continue to be measured using a combination of the mini-CEX, DOPS and multi-source feedback [61].
The OSCE was also designed to assess clinical competencies [62] and is widely used for formative and summative purposes. It can be used to assess technical skill along with clinical competencies. Interestingly, a comparison of students’ preclinical typodont scores, OSCE scores and performance with actual patients revealed no correlation between typodont and patient performance, while there was a slight correlation between OSCE scores and patient performance [63]. However, the most common challenge is the number of OSCE stations; Gruppen et al. pointed out that approximately 30 stations, or about 10 h of testing, would be necessary to obtain a reliability score of 0.80 [64].
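The relationship between the number of stations and reliability that underlies this estimate can be sketched with the Spearman-Brown prophecy formula from classical test theory (the formula is standard; the per-station reliability value below is an illustrative assumption, not a figure reported by Gruppen et al.):

```latex
% Spearman-Brown prophecy formula: reliability R_k of an examination
% composed of k parallel stations, each with reliability r.
R_k = \frac{k\,r}{1 + (k-1)\,r}
\qquad \Longrightarrow \qquad
k = \frac{R_k\,(1 - r)}{r\,(1 - R_k)}

% Illustration (assumed per-station reliability r = 0.12):
% number of stations needed to reach R_k = 0.80
k = \frac{0.80 \times (1 - 0.12)}{0.12 \times (1 - 0.80)}
  = \frac{0.704}{0.024} \approx 29.3 \approx 30 \text{ stations}
```

In other words, if a single station measures technical performance only weakly (here assumed r ≈ 0.12), on the order of 30 stations are needed before the examination as a whole reaches the conventional 0.80 reliability threshold, which is consistent with the figure cited above.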
3.2. Assessment of technical skill
As mentioned above, a single assessment method might not provide sufficient validity to assess clinical skill; however, different methods could provide standard milestones towards assessing ultimate clinical competency. Each method could utilize several measuring formats such as rubrics, checklists, the Global Rating Scale and Structured Rating Scale.
Rubrics became a popular educational tool in the 1990s; they provide guidelines describing the standards for scoring students’ performance [65]. A strict definition of the term is difficult, but rubrics were originally meant to provide a separate description for each qualitative level of student performance. Thus, rubrics may help standardize otherwise subjective assessment criteria for technical skill, which distinguishes them from simple checklists or rating scales. We should select the appropriate tools according to the purpose of the assessment [66]. It has also been reported that the inter-rater reliability of rubrics and graded category rating scales showed no significant difference [67]. Rubrics can also be used as learning tools for students [68].
The Global Rating Scale and the Structured Rating Scale are both assessment tools for raters observing students’ performance. They are usually used in clinical settings but can also be used to assess technical skills. The Global Rating Scale is usually a five-point scale, and students’ performance is judged according to designated indicators or rubrics to assess competencies [53,69]. The Structured Rating Scale is a series of hierarchical rating scales in which students perform a specific skill and its important components/attributes are assessed. It can be used extensively for assessing psychomotor skills [53,70].
Practical examinations using typodonts [71] are thought to be useful and are the most widely adopted method for technical skill assessment. However, there are several other methods, such as digital assessment and examination using high-fidelity simulators.
As for digital assessment of technical skills, Miyazono et al. [72], Tiu et al. [73] and Greany et al. [74] reported that digital assessment can be more accurate and consistent than visual inspection for tooth preparation and waxing. On the other hand, Gratton et al. concluded that the use of digital evaluation technology did not impact students’ prosthodontic technique, regardless of the type of software [75,76]; digital assessment alone might not be effective for student learning, whereas narrative feedback could improve student performance. Full-scale or high-fidelity simulation could serve as an assessment tool as well [77].
4. Programmatic assessment
Under the concept of “assessment for learning”, Schuwirth and van der Vleuten proposed a new approach called programmatic assessment [78,79]. Establishing the validity of programmatic assessment requires the implementation of a multitude of assessment methods, which in turn increases reliability. Although this concept was originally developed for effective learning, it ensures both the reliability and validity of clinical assessment outcomes for competency, and it could also be applied to technical skill assessment [80].
5. Conclusion
While training with typodonts is still effective for technical skill acquisition, many educators are providing novel programs with variations of the model itself or unique support systems. Digital devices such as VR and AR dental trainers are also effective, but it might be better to utilize them as auxiliary resources at the moment. Further improvements are expected as technology develops.
Assessment of technical skill remains a challenge and not all dental schools are assessing core clinical skills before allowing students to treat patients [81]. Single assessment methods are only valid to a certain extent, and we should combine several assessment methods and continue working towards greater validity and increased objectivity for methods of assessing dental training.
Declaration of Competing Interest
The authors report no declarations of interest.
References
- 1. Kim M.K., Patel R.A., Uchizono J.A., Beck L. Incorporation of Bloom’s taxonomy into multiple-choice examination questions for a pharmacotherapeutics course. Am J Pharm Educ. 2012;76(6):114. doi: 10.5688/ajpe766114.
- 2. Yielder J., Moir F. Assessing the development of medical students’ personal and professional skills by portfolio. J Med Educ Curric Dev. 2016;3. doi: 10.4137/JMECD.S30110.
- 3. Khan K.Z., Ramachandran S., Gaunt K., Pushkar P. The objective structured clinical examination (OSCE): AMEE guide No. 81. Part I: an historical and theoretical perspective. Med Teach. 2013;35(9):e1437–46. doi: 10.3109/0142159X.2013.818634.
- 4. Rehim S.A., DeMoor S., Olmsted R., Dent D.L., Parker-Raley J. Tools for assessment of communication skills of hospital action teams: a systematic review. J Surg Educ. 2017;74(2):341–351. doi: 10.1016/j.jsurg.2016.09.008.
- 5. Gartenmann S.J., Hofer D., Wiedemeier D., Sahrmann P., Attin T., Schmidlin P.R. Comparative effectiveness of hand scaling by undergraduate dental students following a two-week pre-clinical training course. Eur J Dent Educ. 2019;23(1):1–7. doi: 10.1111/eje.12361.
- 6. Pileggi R., Glickman G.N. A cost-effective simulation curriculum for preclinical endodontics. Eur J Dent Educ. 2004;8(1):12–17.
- 7. Stelzle F., Farhoumand D., Neukam F.W., Nkenke E. Implementation and validation of an extraction course using mannequin models for undergraduate dental students. Acta Odontol Scand. 2011;69(2):80–87. doi: 10.3109/00016357.2010.517560.
- 8. Nishida M., Sohmura T., Takahashi J. Training in tooth preparation utilizing a support system. J Oral Rehabil. 2004;31(2):149–154. doi: 10.1046/j.0305-182x.2003.01216.x.
- 9. Huth K.C., Baumann M., Kollmuss M., Hickel R., Fischer M.R. Assessment of practical tasks in the Phantom course of Conservative Dentistry by pre-defined criteria: a comparison between self-assessment by students and assessment by instructors. Eur J Dent Educ. 2017;21(1):37–45. doi: 10.1111/eje.12176.
- 10. Lugassy D., Levanon Y., Shpack N., Levartovsky S., Pilo R. An interventional study for improving the manual dexterity of dentistry students. PLoS One. 2019;14(2). doi: 10.1371/journal.pone.0211639.
- 11. Clancy J.M., Lindquist T.J., Palik J.F., Johnson L.A. A comparison of student performance in a simulation clinic and a traditional laboratory environment: three-year results. J Dent Educ. 2002;66(12):1331–1337.
- 12. Tani Botticelli A., Schittek Janda M., Botticelli D., Mattheos N., Attström R. The effectiveness of video support in the teaching of manual skills related to initial periodontal therapy tested on phantoms. Int J Comput Dent. 2005;8(2):117–127.
- 13. Nishida M., Sohmura T., Takahashi J. Training in tooth preparation utilizing a support system. J Oral Rehabil. 2004;31(2):149–154. doi: 10.1046/j.0305-182x.2003.01216.x.
- 14. Polyzois I., Claffey N., McDonald A., Hussey D., Quinn F. Can evaluation of a dental procedure at the outset of learning predict later performance at the preclinical level? A pilot study. Eur J Dent Educ. 2011;15(2):104–109. doi: 10.1111/j.1600-0579.2010.00647.x.
- 15. Nunez D.W., Taleghani M., Wathen W.F., Abdellatif H.M. Typodont versus live patient: predicting dental students’ clinical performance. J Dent Educ. 2012;76(4):407–413.
- 16. Akiba N., Nagasawa M., Ono K., Maeda T., Uoshima K. An introduction to the undergraduate comprehensive model practice course at the Faculty of Dentistry, Niigata University. JJDEA. 2017;33(2):106–114.
- 17. Abe S., Noguchi N., Matsuka Y., Shinohara C., Kimura T. Educational effects using a robot patient simulation system for development of clinical attitude. Eur J Dent Educ. 2018;22(3):e327–e336. doi: 10.1111/eje.12298.
- 18. Tanzawa T., Futaki K., Tani C., Hasegawa T., Yamamoto M. Introduction of a robot patient into dental education. Eur J Dent Educ. 2012;16(1):e195–e199. doi: 10.1111/j.1600-0579.2011.00697.x.
- 19. Heym R., Krause S., Hennessen T., Pitchika V., Ern C. A new model for training in periodontal examinations using manikins. J Dent Educ. 2016;80(12):1422–1429.
- 20. Oberoi G., Nitsch S., Edelmayer M., Janjic K., Müller A.S., Agis H. 3D printing—encompassing the facets of dentistry. Front Bioeng Biotechnol. 2018;6:172. doi: 10.3389/fbioe.2018.00172.
- 21. Marty M., Broutin A., Vergnes J.N., Vaysse F. Comparison of student’s perceptions between 3D printed models versus series models in paediatric dentistry hands-on session. Eur J Dent Educ. 2019;23(1):68–72. doi: 10.1111/eje.12404.
- 22. Serrano C.M., Wesselink P.R., Vervoorn J.M. Real patients in virtual reality: the link between phantom heads and clinical dentistry. Ned Tijdschr Tandheelkd. 2018;125(5):263–267. doi: 10.5177/ntvt.2018.05.17192.
- 23. Mirghani I., Mushtaq F., Allsop M.J. Capturing differences in dental training using a virtual reality simulator. Eur J Dent Educ. 2018;22(1):67–71. doi: 10.1111/eje.12245.
- 24. de Boer I.R., Lagerweij M.D., Wesselink P.R., Vervoorn J.M. The effect of variations in force feedback in a virtual reality environment on the performance and satisfaction of dental students. Simul Healthc. 2019;14(3):169–174. doi: 10.1097/SIH.0000000000000370.
- 25. Khelemsky R., Hill B., Buchbinder D. Validation of a novel cognitive simulator for orbital floor reconstruction. J Oral Maxillofac Surg. 2017;75(4):775–785. doi: 10.1016/j.joms.2016.11.027.
- 26. Thomas G., Johnson L., Dow S., Stanford C. The design and testing of a force feedback dental simulator. Comput Methods Programs Biomed. 2001;64(1):53–64. doi: 10.1016/s0169-2607(00)00089-4.
- 27. Pulijala Y., Ma M., Pears M., Peebles D., Ayoub A. Effectiveness of immersive virtual reality in surgical training—a randomized control trial. J Oral Maxillofac Surg. 2018;76(5):1065–1072. doi: 10.1016/j.joms.2017.10.002.
- 28. Rhienmora P., Haddawy P., Khanal P., Suebnukarn S., Dailey M.N. A virtual reality simulator for teaching and evaluating dental procedures. Methods Inf Med. 2010;49(4):396–405. doi: 10.3414/ME9310.
- 29. Rhienmora P., Haddawy P., Suebnukarn S., Dailey M.N. Intelligent dental training simulator with objective skill assessment and feedback. Artif Intell Med. 2011;52(2):115–121. doi: 10.1016/j.artmed.2011.04.003.
- 30. Gal G.B., Weiss E.I., Gafni N., Ziv A. Preliminary assessment of faculty and student perception of a haptic virtual reality simulator for training dental manual dexterity. J Dent Educ. 2011;75(4):496–504.
- 31. Ben-Gal G., Weiss E.I., Gafni N., Ziv A. Testing manual dexterity using a virtual reality simulator: reliability and validity. Eur J Dent Educ. 2013;17(3):138–142. doi: 10.1111/eje.12023.
- 32. Serrano C.M., Wesselink P.R., Vervoorn J.M. First experiences with patient-centered training in virtual reality. J Dent Educ. 2020;84(5):607–614. doi: 10.1002/jdd.12037.
- 33. Steinberg A.D., Bashook P.G., Drummond J., Ashrafi S., Zefran M. Assessment of faculty perception of content validity of PerioSim, a haptic-3D virtual reality dental training simulator. J Dent Educ. 2007;71(12):1574–1582.
- 34. Chen X., Sun P., Liao D. A patient-specific haptic drilling simulator based on virtual reality for dental implant surgery. Int J Comput Assist Radiol Surg. 2018;13(11):1861–1870. doi: 10.1007/s11548-018-1845-0.
- 35. Yamaguchi S., Yamada Y., Yoshida Y., Noborio H., Imazato S. Development of three-dimensional patient face model that enables real-time collision detection and cutting operation for a dental simulator. Dent Mater J. 2012;31(6):1047–1053. doi: 10.4012/dmj.2012-164.
- 36. Pohlenz P., Gröbe A., Petersik A., von Sternberg N., Pflesser B. Virtual dental surgery as a new educational tool in dental school. J Craniomaxillofac Surg. 2010;38(8):560–564. doi: 10.1016/j.jcms.2010.02.011.
- 37. Nassar H.M., Tekian A. Computer simulation and virtual reality in undergraduate operative and restorative dental education: a critical review. J Dent Educ. 2020:1–18. doi: 10.1002/jdd.12138.
- 38. Perry S., Bridges S.M., Burrow M.F. A review of the use of simulation in dental education. Simul Healthc. 2015;10(1):31–37. doi: 10.1097/SIH.0000000000000059.
- 39. Plessas A. Computerized virtual reality simulation in preclinical dentistry: can a computerized simulator replace the conventional phantom heads and human instruction? Simul Healthc. 2017;12(5):332–338. doi: 10.1097/SIH.0000000000000250.
- 40. Huang T.K., Yang C.H., Hsieh Y.H., Wang J.C., Hung C.C. Augmented reality (AR) and virtual reality (VR) applied in dentistry. Kaohsiung J Med Sci. 2018;34:243–248. doi: 10.1016/j.kjms.2018.01.009.
- 41. Joda T., Gallucci G.O., Wismeijer D., Zitzmann N.U. Augmented and virtual reality in dental medicine: a systematic review. Comput Biol Med. 2019;108:93–100. doi: 10.1016/j.compbiomed.2019.03.012.
- 42. Towers A., Field J., Stokes C., Maddock S., Martin N. A scoping review of the use and application of virtual reality in pre-clinical dental education. Br Dent J. 2019;226(5):358–366. doi: 10.1038/s41415-019-0041-0.
- 43. Roy E., Bakr M.M., George R. The need for virtual reality simulators in dental education: a review. Saudi Dent J. 2017;29(2):41–47. doi: 10.1016/j.sdentj.2017.02.001.
- 44. Zitzmann N.U., Matthisson L., Ohla H., Joda T. Digital undergraduate education in dentistry: a systematic review. Int J Environ Res Public Health. 2020;17(9). doi: 10.3390/ijerph17093269.
- 45. Kapoor S., Arora P., Kapoor V., Jayachandran M., Tiwari M. Haptics — touch feedback technology widening the horizon of medicine. J Clin Diagn Res. 2014;8(3):294–299. doi: 10.7860/JCDR/2014/7814.4191.
- 46. Nassar H.M., Tekian A. Computer simulation and virtual reality in undergraduate operative and restorative dental education: a critical review. J Dent Educ. 2020. doi: 10.1002/jdd.12138.
- 47. Wierinck E., Puttemans V., Swinnen S., van Steenberghe D. Effect of augmented visual feedback from a virtual reality simulation system on manual dexterity training. Eur J Dent Educ. 2005;9(1):10–16. doi: 10.1111/j.1600-0579.2004.00351.x.
- 48. Kikuchi H., Ikeda M., Araki K. Evaluation of a virtual reality simulation system for porcelain fused metal crown preparation at Tokyo Medical and Dental University. J Dent Educ. 2013;77(6):782–792.
- 49. Wierinck E., Puttemans V., van Steenberghe D. Effect of reducing frequency of augmented feedback on manual dexterity training and its retention. J Dent. 2006;34(9):641–647. doi: 10.1016/j.jdent.2005.12.005.
- 50. Llena C., Folguera S., Forner L., Rodríguez-Lozano F.J. Implementation of augmented reality in operative dentistry learning. Eur J Dent Educ. 2018;22(1):e122–e130. doi: 10.1111/eje.12269.
- 51. Suebnukarn S., Haddawy P., Rhienmora P., Jittimanee P., Viratket P. Augmented kinematic feedback from haptic virtual reality for dental skill acquisition. J Dent Educ. 2010;74(12):1357–1366.
- 52. Miller G.E. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9):63–67.
- 53. Kramer G.A., Albino J.E., Andrieu S.C., Hendricson W.D., Henson L., Horn B.D. Dental student assessment toolbox. J Dent Educ. 2009;73(1):12–35.
- 54. Norcini J.J., Blank L.L., Arnold G.K., Kimball H.R. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med. 1995;123(10):795–799. doi: 10.7326/0003-4819-123-10-199511150-00008.
- 55. Norcini J.J., Blank L.L., Duffy F.D., Fortna G.S. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138(6):476–481. doi: 10.7326/0003-4819-138-6-200303180-00012.
- 56. Al-Jewair T., Kumar S. Review and application of the mini-clinical evaluation exercise (Mini-CEX) in advanced orthodontic education: a pilot study. J Dent Educ. 2019;83(11):1332–1338. doi: 10.21815/JDE.019.131.
- 57. Lörwald A.C., Lahner F.M., Nouns Z.M., Berendonk C., Norcini J. The educational impact of Mini-Clinical Evaluation Exercise (Mini-CEX) and Direct Observation of Procedural Skills (DOPS) and its association with implementation: a systematic review and meta-analysis. PLoS One. 2018;13(6). doi: 10.1371/journal.pone.0198009.
- 58. Prescott L.E., Norcini J.J., McKinlay P., Rennie J.S. Facing the challenges of competency-based assessment of postgraduate dental training: longitudinal evaluation of performance (LEP). Med Educ. 2002;36(1):92–97. doi: 10.1046/j.1365-2923.2002.01099.x.
- 59. Wiles C.M., Dawson K., Hughes T.A., Llewelyn J.G., Morris H.R. Clinical skills evaluation of trainees in a neurology department. Clin Med (Lond). 2007;7(4):365–369. doi: 10.7861/clinmedicine.7-4-365.
- 60. Tonni I., Gadbury-Amyot C.C., Govaerts M., Ten Cate O., Davis J. ADEA-ADEE shaping the future of dental education III: assessment in competency-based dental education: ways forward. J Dent Educ. 2020;84(1):97–104. doi: 10.1002/jdd.12024.
- 61. Magnier K.M., Dale V.H., Pead M.J. Workplace-based assessment instruments in the health sciences. J Vet Med Educ. 2012;39(4):389–395. doi: 10.3138/jvme.1211-118R.
- 62. Harden R.M., Stevenson M., Downie W.W., Wilson G.M. Assessment of clinical competence using an objective structured examination. Br Med J. 1975;1(5955):447–451. doi: 10.1136/bmj.1.5955.447.
- 63. Curtis D.A., Lind S.L., Brear S., Finzen F.C. The correlation of student performance in preclinical and clinical prosthodontic assessments. J Dent Educ. 2007;71(3):365–372.
- 64. Gruppen L.D., Davis W.K., Fitzgerald J.T., McQuillan M.A. Reliability, number of stations, and examination length in an Objective Structured Clinical Examination. Adv Med Educ. 1997:441–442.
- 65. Popham W.J. What’s wrong — and what’s right — with rubrics. Educ Leadersh. 1997;55(2):72–75.
- 66. Hauser A.M., Bowen D.M. Primer on preclinical instruction and evaluation. J Dent Educ. 2009;73(3):390–398.
- 67. Doğan C.D., Uluman M. A comparison of rubrics and graded category rating scales with various methods regarding raters’ reliability. Educ Sci Theory Pract. 2017;17(2):631–651.
- 68. O’Donnell J.A., Oakley M., Haney S., O’Neill P.N., Taylor D. Rubrics 101: a primer for rubric development in dental education. J Dent Educ. 2011;75(9):1163–1175.
- 69. Gray J.D. Global rating scale in residency education. Acad Med. 1996;71:555–563. doi: 10.1097/00001888-199601000-00043.
- 70. Linacre J.M. Structured rating scales. Sixth International Objective Measurement Workshop; Chicago, Illinois, April 1991.
- 71. Hill G.L., Hampel A.T. A typodont used exclusively for practical examinations. J Dent Educ. 1987;51(9):553–554.
- 72. Miyazono S., Shinozaki Y., Sato H., Isshi K., Yamashita J. Use of digital technology to improve objective and reliable assessment in dental student simulation laboratories. J Dent Educ. 2019;83(10):1224–1232. doi: 10.21815/JDE.019.114.
- 73. Tiu J., Cheng E., Hung T.C., Yu C.C., Lin T., Schwass D. Effectiveness of crown preparation assessment software as an educational tool in simulation clinic: a pilot study. J Dent Educ. 2016;80(8):1004–1011.
- 74. Greany T.J., Yassin A., Lewis K.C. Developing an all-digital workflow for dental skills assessment: part I, visual inspection exhibits low precision and accuracy. J Dent Educ. 2019;83(11):1304–1313. doi: 10.21815/JDE.019.132.
- 75. Gratton D.G., Kwon S.R., Blanchette D., Aquilino S.A. Impact of digital tooth preparation evaluation technology on preclinical dental students’ technical and self-evaluation skills. J Dent Educ. 2016;80(1):91–99.
- 76. Gratton D.G., Kwon S.R., Blanchette D.R., Aquilino S.A. Performance of two different digital evaluation systems used for assessing pre-clinical dental students’ prosthodontic technical skills. Eur J Dent Educ. 2017;21(4):252–260. doi: 10.1111/eje.12231.
- 77. Langdon M.G., Cunningham A.J. High-fidelity simulation in post-graduate training and assessment: an Irish perspective. Ir J Med Sci. 2007;176(4):267–271. doi: 10.1007/s11845-007-0074-2.
- 78. van der Vleuten C.P.M., Schuwirth L.W.T. Assessing professional competence: from methods to programmes. Med Educ. 2005;39:309–317. doi: 10.1111/j.1365-2929.2005.02094.x.
- 79. Schuwirth L.W.T., van der Vleuten C.P.M. Programmatic assessment: from assessment of learning to assessment for learning. Med Teach. 2011;33(6):478–485. doi: 10.3109/0142159X.2011.565828.
- 80. Patel U.S., Tonni I., Gadbury-Amyot C., van der Vleuten C.P.M., Escudier M. Assessment in a global context: an international perspective on dental education. Eur J Dent Educ. 2018;22(Suppl 1):21–27. doi: 10.1111/eje.12343.
- 81. Field J., Stone S., Orsini C., Hussain A., Vital S., Crothers A. Curriculum content and assessment of pre-clinical dental skills: a survey of undergraduate dental education in Europe. Eur J Dent Educ. 2018;22(2):122–127. doi: 10.1111/eje.12276.
