Abstract
Background
The use of simulators in ophthalmology training is growing globally; however, every simulator developed to date has limitations that depend on its design and setting. This study aims to evaluate the training performance and student satisfaction of a new refractive adjustment simulator, the "ICEyeModel", compared with a traditional simulator for direct ophthalmoscopy training among medical students.
Methods
This was a comparative, randomised, cross-over study. We enrolled 50 participants from a six-year medical student training programme at Phramongkutklao Hospital. They underwent a refresher lecture on basic direct ophthalmoscope use and a short review course on common retinal diseases, and were then randomised into two training sequence groups. Training Sequence 1 started with a traditional film photograph simulator, the Eye Retinopathy Trainer (Adam, Rouilly Co., Sittingbourne, UK), followed by the ICEyeModel; Training Sequence 2 started with the ICEyeModel, followed by the traditional simulator. Participants in both groups completed fundoscopic description tests and satisfaction questionnaires immediately after each simulator training session.
Results
On an 18-point prospective rubric scale, medical students trained with the ICEyeModel achieved significantly higher fundoscopic examination scores (14.42 ± 2.34) than those trained with the traditional simulator (11.30 ± 2.64), p < 0.001. With the ICEyeModel, 86% of participants could correctly adjust the direct ophthalmoscope power to match the refractive state of the trial lens placed in the simulator. The ICEyeModel also received higher satisfaction scores than the traditional simulator for picture quality, motivation, and confidence.
Conclusion
In comparison with the Eye Retinopathy Trainer, the ICEyeModel significantly enhanced performance with increased satisfaction and self-confidence in simulated direct ophthalmoscopy training. Although these improvements were observed in a simulation setting and do not necessarily translate to superior performance in patient examinations, our findings suggest that the ICEyeModel may offer a promising alternative for training with direct ophthalmoscopes, indirect ophthalmoscopes, and retinoscopes in clinical settings.
Supplementary Information
The online version contains supplementary material available at 10.1186/s12909-025-07418-x.
Keywords: Direct ophthalmoscope, Fundoscopic examination, Simulator, Medical student
Background
Fundoscopic examination is essential for accurately diagnosing many vision- and life-threatening conditions. Several studies [1, 2] reported that 1.3% of emergency room chief complaints each year involve eye manifestations, and some of these patients need a further fundoscopic examination to confirm the diagnosis. Although the development of the ophthalmoscope began with Helmholtz’s seminal work in 1850 [3]—a milestone that established direct ophthalmoscopy as a fundamental component of medical training—effective training remains challenging. In 2004, Chung and Watzke developed a closed plastic chamber with 37 mm photographs, which proved effective for training [4]; since then, a variety of simulators employing diverse design approaches have been developed to enhance training outcomes [5–12]. Early models used retinal photograph quizzes and simple mechanical devices, while more advanced systems, such as the Eyesi virtual reality simulator [13–15], have aimed to create immersive training experiences. However, many of these systems face issues related to cost, accessibility, and functionality. For example, the widely used Eye Retinopathy Trainer (Adam, Rouilly Co., Sittingbourne, UK) has been employed globally for over a decade [12, 16], but it requires an extra light source to enhance image visualisation, which may limit its effectiveness during extended training sessions. To address these challenges, we developed the ICEyeModel, a novel refractive adjustment simulator that replicates the human optical eye system with adjustable refraction features. This design enables medical students to practise adjusting the direct ophthalmoscope for optical refractive correction, thereby improving their fundoscopic examination skills. In this study, we evaluate the clinical relevance of the ICEyeModel by assessing student direct ophthalmoscopy training compared with the traditional simulator.
Methods
Study design
We conducted a randomised, cross-over trial in the Department of Ophthalmology, Phramongkutklao Hospital, Thailand. Sixth-year medical students were recruited from August 2019 to July 2020, and written informed consent was obtained from all participants. Using a computer-generated randomisation list, participants were assigned to one of two groups: Training Sequence 1 trained with Simulator A first and then Simulator B, whereas Training Sequence 2 trained with Simulator B first and then Simulator A. This randomisation ensured that the two simulators were experienced in counterbalanced order. The Institutional Review Board of the Royal Thai Army Medical Department approved the study (No. R058q/62) before participant recruitment and data collection. The study adhered to the principles of the Declaration of Helsinki and the guidelines of the International Network for Simulation-based Paediatric Innovation, Research, and Education [17].
The inclusion criteria were as follows:
1) Sixth-year medical students who had completed the annual ophthalmology rotation in the fifth academic year.
2) Best corrected visual acuity (BCVA) of 0.0 logMAR, converted from the Early Treatment Diabetic Retinopathy Study (ETDRS) chart [18].
3) No previous participation in any other study using an ophthalmic simulator.
We excluded medical students who had not completed the three-week eye block in the fifth academic year.
Sample size calculations
Based on a pilot study, we calculated that a minimum sample size of 40 participants was required to detect a significant difference, with an α of 0.05 and a power of 80%.
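The underlying power calculation can be sketched with the standard normal-approximation formula for a paired design. The pilot-study effect size is not reported in the paper, so the value used below is purely hypothetical, chosen only to show how a minimum of about 40 participants could arise; this is an illustrative sketch, not the authors' calculation.

```python
from math import ceil
from statistics import NormalDist

def paired_n(effect_size, alpha=0.05, power=0.80):
    """Minimum number of pairs for a paired comparison
    (normal approximation to the paired t-test).

    effect_size: standardized mean difference, i.e. the mean of the
    paired score differences divided by their standard deviation.
    """
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # two-sided significance threshold
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return ceil(((z_alpha + z_beta) / effect_size) ** 2)

# With a hypothetical standardized effect size of 0.45,
# this yields 39 pairs, close to the reported minimum of 40.
print(paired_n(0.45))  # 39
```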
Innovation phase
The ICEyeModel simulator was developed to resemble the human optical system. It consists of an adult male half-head manikin with a neck, constructed from hard resin using three-dimensional (3D) printing technology [19–23], as shown in Fig. 1. The head model includes four slots at the back for interchangeable components: replacement parts for two optical half-eyeballs and slots for commercial trial lenses used for refractive adjustment. A 5 cm diameter front half-eyeball was attached to a high-plus convex lens. The front half had a hole cut in its centre to serve as a "pupil", with three adjustable pupil diameters (3, 6, and 9 mm), and the back half held an exchangeable ocular fundus photograph glued to the inner wall, as shown in the video in Appendix 2. Each set comprised eight fundus photographs: normal fundus, glaucomatous optic neuropathy, diabetic retinopathy, hypertensive retinopathy, central retinal vein occlusion, central retinal artery occlusion, optic disc swelling, and optic disc atrophy. The fundus photographs were obtained at Phramongkutklao Hospital using a Kowa VX-10 digital fundus camera, as shown in Appendix 1. Figure 1 demonstrates (a) the front view of the ICEyeModel and (b) the back view, including the two half-eyeballs, storage drawers, and slots behind the optical system.
Fig. 1.
The front (a) and back view (b) of the ICEyeModel
Intervention phase
Participants were assessed using a set of validated questionnaires, shown in Appendix 3, developed to evaluate training performance and to compare student satisfaction between the traditional simulator and the ICEyeModel. The questionnaire set consisted of two sections: participant information and assessments. The participant information section covered demographic data (age, sex, refraction status, and frequency of direct ophthalmoscope use within the past three months). Prospective rubrics guiding the evaluator in assigning scores and grades to participant responses are shown in Table 1. The assessment section comprised two major segments: fundoscopic description and satisfaction. The fundoscopic description segment assessed participants' ability to describe fundoscopic findings using a checkbox on a three-grade scale (unseen, seen with the incorrect answer, seen with the correct answer) and one short answer for the diagnosis. The scoring system for all reports was as follows: unseen = 0 points, seen with incorrect answer = 1 point, and seen with correct answer = 2 points (Table 1). All participants were asked to complete all questionnaires and describe the fundoscopic findings; the total fundoscopic description score was 18 points. The primary outcome of this study was the comparison of the fundoscopic description score between the two simulators, serving as an objective measure of training performance. The secondary outcome was the satisfaction questionnaire, which assessed participants' subjective experiences and provided additional insight into their training preferences. The questionnaire was validated by three attending ophthalmologists at Phramongkutklao Hospital, with a content validity index (CVI) exceeding 0.5; reliability testing using Cronbach's alpha yielded values of 0.7 to 0.8 [24].
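For readers unfamiliar with the reliability statistic, Cronbach's alpha [24] can be computed from item variances as sketched below. This is a generic illustration, not the authors' analysis code, and the example data are hypothetical.

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.

    items: list of k item-score lists, each covering the same n
    respondents (k items x n respondents).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(totals))
    """
    k = len(items)
    # Total score per respondent, summed across all items.
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(pvariance(col) for col in items)
    return (k / (k - 1)) * (1 - item_var / pvariance(totals))

# Two perfectly correlated items give the maximum alpha of 1.0.
print(cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]]))  # 1.0
```

Values of 0.7 to 0.8, as reported for this questionnaire, are conventionally regarded as acceptable internal consistency.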
Table 1.
Prospective Rubrics Guiding the Evaluator in Assigning Scores and Grades to Participant Responses
| Score title | Description |
|---|---|
| Description | |
| Score 0 (unseen) | Student would not be able to find any basic regulatory structures |
| Score 1 (seen and incorrect answer) | Student would be able to find basic regulatory structures; noted a "normal" fundus despite the presence of abnormalities; or listed incorrect findings |
| Score 2 (seen and correct answer) | Student would be able to find basic regulatory structures; a lesion was described accurately with greater detail |
| Diagnosis | |
| Score 0 (incorrect) | Incorrect diagnosis |
| Score 1 (correct) | Correct diagnosis |
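The rubric translates into a simple additive score. The sketch below is an illustrative reimplementation (the field names are our own, not from the paper), showing how two photograph reports yield the 18-point maximum: 4 structures × 2 points × 2 photographs, plus 1 diagnosis point per photograph.

```python
# Hypothetical field names; the grades and point values follow Table 1.
STRUCTURE_POINTS = {"unseen": 0, "seen_incorrect": 1, "seen_correct": 2}
STRUCTURES = ("red reflex", "optic nerve", "background and vessel", "macula")

def session_score(photo_reports):
    """Total fundoscopic description score for one training session.

    photo_reports: one dict per fundus photograph, mapping each
    structure to a rubric grade, plus a boolean diagnosis flag.
    Two photographs x (4 structures x 2 pts + 1 diagnosis pt) = 18 max.
    """
    total = 0
    for report in photo_reports:
        total += sum(STRUCTURE_POINTS[report[s]] for s in STRUCTURES)
        total += 1 if report["diagnosis_correct"] else 0
    return total

# A session with two fully correct photograph reports reaches the maximum.
perfect = {s: "seen_correct" for s in STRUCTURES}
perfect["diagnosis_correct"] = True
print(session_score([perfect, perfect]))  # 18
```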
Pre-training period
All participants completed a refresher course comprising a ten-minute E-learning video on the principles of direct ophthalmoscopy (ophthalmoscope controls and proper technique). As part of the normal ophthalmology curriculum, all participants were assigned to small groups of two or three for one hour of clinical fundoscopic examination instruction and a lecture on common retinal diseases. This training ensured that all participants shared a consistent core understanding before the study began.
Training period
A Welch Allyn 3.5 V Coaxial direct ophthalmoscope with a C-Cell handle in a hard case was used in the study. All participants were randomised by concealed envelopes into two training sequence groups, as shown in Fig. 2. Training Sequence 1 started with direct ophthalmoscopy training on the traditional simulator (fixed 5 mm pupil), the Eye Retinopathy Trainer (Adam, Rouilly Co., Sittingbourne, UK), coded Simulator A, followed by the ICEyeModel (pupil adjusted to 6 mm), coded Simulator B, as shown in Fig. 3. The traditional simulator (Simulator A) requires a table lamp as a light source for visualisation, which is not shown in the figure.
Fig. 2.
Flow diagram of the training sequences in the two parallel groups
Fig. 3.
Front and back views of Simulators A and B with sample fundus photographs: traditional simulator, or Simulator A (a, c), and ICEyeModel, or Simulator B (b, d); fundus photographs of Simulator A (e, g) and Simulator B (f, h); glaucomatous disc (e, f); optic disc swelling (g, h)
Training Sequence 2 started with direct ophthalmoscope training in the reverse order: the ICEyeModel (B), then the traditional simulator (A). Both sequences used two randomly selected fundus photographs per simulator.
Each direct ophthalmoscopy simulator training session was limited to 30 min. The fundoscopic examination tests assessed various fundus findings, including the red reflex, optic disc, retinal background, vessel configuration, macula, and diagnosis. Additionally, all participants were tested on refractive adjustment of the direct ophthalmoscope using the ICEyeModel, with a +3.00 dioptre trial lens simulating a refractive error. Correct performance was defined as accurately adjusting the refractive setting of the ophthalmoscope to match the trial lens before performing the fundoscopic examination.
Post-training period
A survey was conducted to evaluate participants' preference between the traditional simulator and the ICEyeModel for direct ophthalmoscope training.
Statistical analysis
Demographic data were presented as mean and standard deviation or median and IQR for continuous data, and as percentages for categorical data. Paired nominal data were analysed using the McNemar and extended McNemar tests. The primary outcome was the comparison of the total fundoscopic description score (18 points) between simulators; the paired-samples t-test was used for normally distributed data, and the Wilcoxon matched-pairs signed-rank test for non-normally distributed data. Refractive adjustment and satisfaction scores were reported as percentages for the other outcomes. A two-tailed p-value of less than 0.05 was considered statistically significant. All statistical analyses were carried out using STATA 14.0 software.
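As a minimal illustration of the paired analysis (the study itself used STATA 14.0), the paired-samples t statistic reduces to the mean of the per-student score differences divided by its standard error. The scores below are hypothetical, used only to show the computation.

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(scores_a, scores_b):
    """Paired-samples t statistic and degrees of freedom for two
    score lists from the same students (same order in both lists)."""
    diffs = [b - a for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    # t = mean difference / standard error of the differences
    t = mean(diffs) / (stdev(diffs) / sqrt(n))
    return t, n - 1

# Hypothetical per-student total scores for Simulators A and B.
t, df = paired_t([11, 12, 10, 13, 11], [14, 14, 14, 16, 13])
print(round(t, 2), df)  # 7.48 4
```

The resulting t value would then be compared against the t distribution with n − 1 degrees of freedom to obtain the two-tailed p-value.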
Results
A total of 52 sixth-year medical students were enrolled in our study; two were excluded owing to amblyopia and an incomplete refresher course, and fifty participants completed all required elements. Twenty-nine (58%) were men, and the mean (SD) age was 23.14 (0.64) years. Most participants had myopia and had not used a direct ophthalmoscope in the past three months (68%), as shown in Table 2. According to the pre-training survey, 68% of participants strongly agreed that general physicians should be trained in direct ophthalmoscopy and should practise on a simulator before examining actual patients. Additionally, 64% strongly agreed that direct ophthalmoscopy skills should be reviewed on a simulator before patient examination and that regular training should continue until confidence and proficiency are attained. These results highlight the case for including simulator-based training as a requirement for clinical practice. In the pre-training self-evaluation survey, 46% of participants rated their understanding of direct ophthalmoscope use at the highest level, while 36% rated it as average. Half of the participants reported moderate confidence in diagnosing retinal pathology using direct ophthalmoscopy.
Table 2.
Baseline characteristics
| Characteristic | Value |
|---|---|
| Gender (n, %) | |
| Male | 29 (58.00) |
| Female | 21 (42.00) |
| Age, years (mean, SD) | 23.14 (0.64) |
| Refractive error status (n, %) | |
| Emmetropia | 21 (42.00) |
| Myopia | 27 (54.00) |
| Hyperopia | 0 (0.00) |
| Astigmatism | 2 (4.00) |
| Refractive error correction status (n, %) | |
| With spectacle correction | 20 (40.00) |
| Without spectacle correction | 25 (50.00) |
| Post refractive surgery | 3 (6.00) |
| Contact lens | 2 (4.00) |
| Frequency of direct ophthalmoscope use in the past 3 months (n, %) | |
| 0 times | 34 (68.00) |
| 1–5 times | 11 (22.00) |
| 5–10 times | 4 (8.00) |
| More than 10 times | 1 (2.00) |
According to the evaluation conducted using the prospective rubrics in Table 1, participants training with Simulator B identified structures and answered correctly more frequently than those training with Simulator A, and reported significantly better visualisation of the macula (p < 0.05, Fig. 4). Participants trained with Simulator B scored significantly higher on all four major fundus structures. In the macula region, which is known to be clinically challenging to visualise, the median scores for Simulators A and B were 1 (IQR 0, 2) and 3 (IQR 0, 4) points, respectively (p < 0.05). The diagnosis score was also significantly higher with Simulator B than with Simulator A (p < 0.05). Finally, the total fundoscopic examination score was considerably higher for Simulator B than for Simulator A (p < 0.05), as shown in Table 3.
Fig. 4.
Frequency of fundoscopic description answers. Q1, question number 1; Q2, question number 2; P, p-value. Statistical analysis: extended McNemar test
Table 3.
Fundoscopic description score by retinal structure
| Description score | Simulator A | Simulator B | p-value |
|---|---|---|---|
| Red reflex (4 points) | | | |
| Mean ± SD | 3.56 ± 0.79 | 3.84 ± 0.51 | 0.038(p) |
| Median (IQR) | 4 (2, 4) | 4 (2, 4) | |
| Optic nerve (4 points) | | | |
| Mean ± SD | 3.00 ± 0.97 | 3.44 ± 0.67 | 0.001(p) |
| Median (IQR) | 3 (1, 4) | 4 (2, 4) | |
| Background and vessel (4 points) | | | |
| Mean ± SD | 3.10 ± 0.76 | 3.46 ± 0.61 | 0.030(p) |
| Median (IQR) | 3 (2, 4) | 4 (2, 4) | |
| Macula (4 points) | | | |
| Mean ± SD | 0.82 ± 0.97 | 2.58 ± 1.36 | |
| Median (IQR) | 0 (0, 4) | 3 (0, 4) | < 0.001(w) |
| Diagnosis (2 points) | | | |
| Mean ± SD | 0.82 ± 0.72 | 1.10 ± 0.74 | |
| Median (IQR) | 1 (0, 2) | 1 (0, 2) | 0.013(w) |
| Total score | | | |
| Mean ± SD | 11.30 ± 2.64 | 14.42 ± 2.34 | < 0.001(p) |
*p Paired-samples t-test; w Wilcoxon matched-pairs signed-rank test
For each simulator, the total fundoscopic description score did not differ significantly between Training Sequence 1 and Training Sequence 2, and Simulator B achieved a higher total description score than Simulator A in both training sequences (Table 4).
Table 4.
Comparison of total scores for both simulators by training sequence
| Total score | Mean ± SD | p-value* |
|---|---|---|
| Traditional simulator (Simulator A) | | |
| Sequence 1 (1st order) | 11.69 ± 2.42 | 0.231 |
| Sequence 2 (2nd order) | 10.76 ± 2.90 | |
| ICEyeModel (Simulator B) | | |
| Sequence 1 (2nd order) | 14.17 ± 2.48 | 0.247 |
| Sequence 2 (1st order) | 14.76 ± 2.14 | |
*Paired-samples t-test
In the refractive adjustment test with Simulator B, 43 participants (86%) could adjust the direct ophthalmoscope and correctly identify the simulator's refractive status. In all, 88% of participants preferred Simulator B as the simulator of choice for direct ophthalmoscopy training, 82% favoured it for its human-like appearance, and 80% rated it superior to Simulator A for the quality of its fundus photographs. As shown in Fig. 5, training with Simulator B showed higher satisfaction in all aspects.
Fig. 5.
Satisfaction analysis of quality aspects of each simulator
Discussion
Direct ophthalmoscopy is a difficult skill for many medical students and represents a recognised gap in medical education [25]. Skill development in fundoscopic examination depends on various factors, such as exposure to ophthalmoscopy, the inherent difficulty of the technique, and differences in the learning curve associated with each simulator. Our survey showed that 34 participants (68%) considered direct ophthalmoscopy a vital skill and agreed that practising ophthalmoscopy on a simulator before examining patients, together with regular practice, is essential. Multiple studies [26–30] reported a lack of confidence among medical students and general practitioners, and our survey pointed in the same direction: most medical students (54%) had only moderate confidence in using the direct ophthalmoscope, which may correlate with infrequent use and a lack of motivation. Training with simulators helps practitioners achieve proficiency: it provides an environment that supports repeatable skill practice, is independent of time, and causes no inconvenience or harm to patients. Participants trained with the ICEyeModel performed better in describing fundus pathology and making diagnoses, and their total fundoscopic description scores were significantly higher than with the traditional simulator. These results suggest improved competency in fundoscopic examination, likely due to the realistic features of the ICEyeModel. The model incorporates real fundoscopic photographs printed at 300 DPI, which improves the accuracy of the retinal background, colours, sharpness, and overall visualisation. Furthermore, its human-like design, mimicking the optical system of the human eye, contributes to improved learning outcomes compared with traditional simulators.
At present, VR simulators are the most advanced technology available worldwide for training various ophthalmology skills [31–34]. A perspectives trial conducted at the Indiana University School of Medicine [14] reported that the Eyesi VR simulator likewise supports the idea that simulation enhances the technical skill of direct ophthalmoscopy. Owing to resource constraints and the high cost of the Eyesi simulator—a prospective gold standard for direct ophthalmoscopy training—a direct comparison was not feasible in this study. In contrast, the ICEyeModel is an in-house innovation that focuses on cost-effectiveness, offering a more accessible alternative for simulation-based training that may suit countries with limited financial resources.
The FOTO-ED study [35] reported that nonmydriatic fundus photography taken by nurses is an appropriate alternative to direct ophthalmoscopy in the emergency room. While various commercially available fundus cameras can substitute for the direct ophthalmoscope, they are not available in the emergency rooms of some hospitals, and a direct ophthalmoscope may be more suitable for patients who are uncooperative or unable to sit upright. With our innovation, training can be practised in both upright and supine positions, improving competency for examination in various situations, especially in the emergency unit. The high-resolution printed fundus photographs also provide a more realistic fundus appearance than those of other technologies.
The highlight of our innovation is the integration of both ophthalmoscopy and retinoscopy capabilities—a combination not previously described in simulation-based training models. Traditional methods facilitate retinoscopy training by assessing refractive error and best-corrected visual acuity [36, 37], but they often involve repeated testing on human subjects, which presents challenges such as difficult enrolment, fatigue, and potential risks to patients. Schematic mechanical eyes may be a reasonable solution: in 2008, a device was described that assisted retinoscopy training in low-resource countries [20], and Donghyun Kim et al. reported a 3D-printed eye model that enhances retinoscopy skills, with improvements in refractive accuracy non-inferior to those of students practising on actual patients [19]. Our institute has also long used the Heine Retinoscope Trainer (model 13301, manufactured in Germany) for refractive training, as reported in another study [38]. The ICEyeModel incorporates a true optical system with an adjustable refractive feature, and 92% of participants observed a clear red reflex, suggesting that the simulator offers a promising alternative for retinoscopy training. Although the current study focused on ophthalmoscopy performance and student satisfaction, the inclusion of retinoscopy—validated by experts during development—represents a significant innovation. Future studies will be necessary to compare the retinoscopy functionality of our simulator directly with traditional training methods.
A strength of this study is its crossover design, which enhances statistical power with fewer participants by using each subject as their own control, thereby minimising confounding effects. However, the potential for a carryover effect remains a limitation inherent to crossover designs. Ideally, a longer washout period, equivalent to participants' typical period of non-use of the direct ophthalmoscope, would have been implemented to mitigate this concern; owing to scheduling constraints and the daily academic commitments of the participants, only a short washout period was feasible. Despite this limitation, we evaluated whether the fundoscopic description score was influenced by the order of simulator training and found no significant effect of training sequence on the scores. The significantly higher score for Simulator B (ICEyeModel) across both training sequences indicates that a carryover effect was unlikely to have influenced the overall findings. Other limitations include our inability to compare the findings with the Eyesi simulator, a prospective gold standard for direct ophthalmoscopy, and the fact that, unlike VR systems, this new simulator cannot provide real-time assessment of students' labelling of pathological findings. Other training skills, including time to diagnosis, instrument handling, and examination in a nonmydriatic setting, were not assessed in our study.
The ICEyeModel was registered as a Thai petty patent (No. 2103002366) following the conclusion of this study, which we disclose here. The authors declare a possible conflict of interest as the innovators of the ICEyeModel, although the study results were obtained independently of any commercial considerations, and every effort has been made to present the findings objectively.
Conclusion
In comparison with the Eye Retinopathy Trainer, the ICEyeModel significantly enhanced performance with increased satisfaction and self-confidence in simulated direct ophthalmoscopy training. Although these improvements were observed in a simulation setting and do not necessarily translate to superior performance in patient examinations, our findings suggest that the ICEyeModel may offer a promising alternative for training with direct ophthalmoscopes, indirect ophthalmoscopes, and retinoscopes in clinical settings.
Authors’ contributions
All authors contributed to the conception of the study. Ratima Chokchaitanasin and Raveewan Choontanom contributed to the drafting and revising of the manuscript. All authors contributed to the article and approved the submitted version.
Funding
Phramongkutklao College of Medicine Grants, 2021–2022. Funding was used to obtain supplies for the study.
Data availability
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Declarations
Ethics approval and consent to participate
All study procedures were approved by the Institutional Review Board of the Royal Thai Army Medical Department (protocol ID R058q/62), and adhered to the principles of the Declaration of Helsinki.
Consent for publication
Not applicable.
Competing interests
The authors declare that the New Refractive Adjustment Fundoscopic Examination Simulator for Medical Students, the ICEye model, has been granted a petty patent in Thailand number 2103002366, effective February 2, 2023.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Imsuwan I, Amnuaypattanapon K, Vongkittirux S, Imsuwan Y. The Study of Incidence and Characteristics of Patients with Eye-Related Chief Complaints at the Emergency Department of Thammasat University Hospital. Emergency Medicine International. 2020;2020:e4280543. Available from: https://www.hindawi.com/journals/emi/2020/4280543/
- 2. Nash EA. Patterns of Emergency Department Visits for Disorders of the Eye and Ocular Adnexa. Arch Ophthalmol. 1998;116(9):1222.
- 3. Pearce JMS. The Ophthalmoscope: Helmholtz’s Augenspiegel. Eur Neurol. 2009;61(4):244–9.
- 4. Chung KD, Watzke RC. A simple device for teaching direct ophthalmoscopy to primary care practitioners. American Journal of Ophthalmology. 2004;138(3):501–2. Available from: https://pubmed.ncbi.nlm.nih.gov/15364247/
- 5. Hoeg TB, Sheth BP, Bragg DS, Kivlin JD. Evaluation of a tool to teach medical students direct ophthalmoscopy. WMJ. 2009;108(1):24–26. Available from: https://pubmed.ncbi.nlm.nih.gov/19326631/
- 6. Li D, Zhang W, Li X, Zhen S, Wang Y. Stressful life events and problematic Internet use by adolescent females and males: A mediated moderation model. Comput Hum Behav. 2010;26(5):1199–207. 10.1016/j.chb.2010.03.031.
- 7. McCarthy DM, Leonard HR, Vozenilek JA. A new tool for testing and training ophthalmoscopic skills. J Grad Med Educ. 2012;4(1):92–6. 10.4300/JGME-D-11-00052.1.
- 8. Larsen P, Stoddart H, Griess M. Ophthalmoscopy using an eye simulator model. Clin Teach. 2014;11(2):99–103.
- 9. Wang H, Liao X, Zhang M, Pang CP, Chen H. A simple eye model for objectively assessing the competency of direct ophthalmoscopy. Eye. 2021;36(9):1789–94. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8351584/
- 10. Kelly LP, Garza PS, Bruce BB, Graubart EB, Newman NJ, Biousse V. Teaching ophthalmoscopy to medical students (the TOTeMS study). Am J Ophthalmol. 2013;156(5):1056–1061.
- 11. Lucas HR, Caroline AR. Ophthalmoscopy simulation: advances in training and practice for medical students and young ophthalmologists. 2017;8.
- 12. Ricci L, Ferraz C. Ophthalmoscopy simulation: advances in training and practice for medical students and young ophthalmologists. Adv Med Educ Pract. 2017;8:435–9. 10.2147/amep.s108041.
- 13. Ting DSW, Sim SSKP, Yau CWL, Rosman M, Aw AT, Yeo IYS. Ophthalmology simulation for undergraduate and postgraduate clinical education. Int J Ophthalmol. 2016;9(6):920–924.
- 14. Tso HL, Young J, Yung CW. Comparing Eyesi Virtual Reality Simulator and Traditional Teaching Methods for Direct Ophthalmoscopy: Students’ Perspectives at Indiana University School of Medicine. Journal of Academic Ophthalmology. 2021;13(01):e66–72.
- 15. Boden KT, Rickmann A, Fries FN, Xanthopoulou K, Alnaggar D, Januschowski K, et al. [Evaluation of a virtual reality simulator for learning direct ophthalmoscopy in student teaching]. Der Ophthalmologe. 2020;117(1):44–9. Available from: https://pubmed.ncbi.nlm.nih.gov/31073679/
- 16. Androwiki JE, Scravoni IA, Ricci LH, Fagundes DJ, Ferraz CA, et al. Evaluation of a simulation tool in ophthalmology: application in teaching funduscopy. Arquivos Brasileiros de Oftalmologia. 2015;78(1):36–9. Available from: https://www.scielo.br/scielo.php?script=sci_arttext&pid=S0004-27492015000100010
- 17. Cheng A, Kessler D, Mackinnon R, et al. Reporting Guidelines for Health Care Simulation Research. Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare. 2016;11(4):238–48. 10.1097/SIH.0000000000000150.
- 18. Tiraset N, Poonyathalang A, Padungkiatsagul T, Deeyai M, Vichitkunakorn P, Vanikieti K. Comparison of Visual Acuity Measurement Using Three Methods: Standard ETDRS Chart, Near Chart and a Smartphone-Based Eye Chart Application. Clinical Ophthalmology. 2021;15:859–69.
- 19. Kim DH, Yang HK, Baek C, Seo J, Hwang JM. Efficacy of 3D-printed eye model to enhance retinoscopy skills. Sci Rep. 2024;14(1):4207. Available from: https://www.nature.com/articles/s41598-024-53321-8#ref-CR11
- 20. Donovan L, Brian G, du Toit R. A device to aid the teaching of retinoscopy in low-resource countries. Br J Ophthalmol. 2008;92(2):294–304.
- 21. Baek C, Seo JM. Development of Schematic Eye for Retinoscopy Training Using 3D Printer. Annals of Optometry and Contact Lens. 2016;15(4):145–9. Available from: https://www.annocl.org/journal/view.php?number=222
- 22. Kang S, Kwon J, Ahn CJ, Esmaeli B, Kim GB, Kim N, et al. Generation of customized orbital implant templates using 3-dimensional printing for orbital wall reconstruction. Eye. 2018;32(12):1864–70.
- 23. Kim BR, Kim SH, Ko J, Baek SW, Park YK, Kim YJ, et al. A Pilot Clinical Study of Ocular Prosthesis Fabricated by Three-dimensional Printing and Sublimation Technique. Korean J Ophthalmol. 2021;35(1):37–43.
- 24. Tavakol M, Dennick R. Making Sense of Cronbach’s Alpha. International Journal of Medical Education. 2011;2(2):53–5. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4205511/
- 25. Gilmour G, McKivigan J. Evaluating medical students’ proficiency with a handheld ophthalmoscope: a pilot study. Adv Med Educ Pract. 2016;8:33–6.
- 26. Mackay DD, Garza PS, Bruce BB, Newman NJ, Biousse V. The demise of direct ophthalmoscopy. Neurology: Clinical Practice. 2014;5(2):150–7. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4404284/
- 27. Stern GA. Teaching Ophthalmology to Primary Care Physicians. Arch Ophthalmol. 1995;113(6):722.
- 28.Kelly LP, Garza PS, Bruce BB, Graubart EB, Newman NJ, Biousse V. Teaching Ophthalmoscopy to Medical Students (the TOTeMS Study). Am J Ophthalmol. 2013;156(5):1056-1061.e10. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Gupta RR, Lam WC. Medical students’ self-confidence in performing direct ophthalmoscopy in clinical training. Can J Ophthalmol. 2006;41(2):169–74. [DOI] [PubMed] [Google Scholar]
- 30.Shuttleworth GN, Marsh GW. How effective is undergraduate and postgraduate teaching in ophthalmology? Eye. 1997;11(5):744–50. [DOI] [PubMed] [Google Scholar]
- 31.Jacobsen MF, Konge L, Bach-Holm D, la Cour M, Holm L, Hφjgaard-Olsen K, et al. Correlation of virtual reality performance with real-life cataract surgery performance. J Cataract Refract Surg. 2019;45(9):1246–51. [DOI] [PubMed] [Google Scholar]
- 32.Mellum ML, Vestergaard AH, Grauslund J, Vergmann AS. Virtual vitreoretinal surgery: effect of distracting factors on surgical performance in medical students. Acta Ophthalmologica. 2020;98(4):378–83. Available from: https://pubmed.ncbi.nlm.nih.gov/31580012/. [DOI] [PubMed]
- 33.Ropelato S, Menozzi M, Michel D, Siegrist M. Augmented reality microsurgery: a tool for training micromanipulations in ophthalmic surgery using augmented reality. Simul Healthc. 2020;15(2):122-7. 10.1097/SIH.0000000000000413. [DOI] [PubMed]
- 34.Ng D, Sun Z, Young AL, Ko STC, Lok J, Lai T, et al. Impact of virtual reality simulation on learning barriers of phacoemulsification perceived by residents. Clin Ophthalmol. 2018;12:885–93. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 35.Bruce BB, Thulasi P, Fraser CL, Keadey MT, Ward A, Heilpern KL, et al. Diagnostic Accuracy and Use of Nonmydriatic Ocular Fundus Photography by Emergency Physicians: Phase II of the FOTO-ED Study. Ann Emerg Med. 2013;62(1):28-33.e1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Akil H, Keskin S, Çavdarli C. Comparison of the Refractive Measurements with Hand-held Autorefractometer, Table-mounted Autorefractometer and Cycloplegic Retinoscopy in Children. Korean J Ophthalmol. 2015;29(3):178. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Yoo SG, Cho MJ, Kim US, Baek SH. Cycloplegic Refraction in Hyperopic Children: Effectiveness of a 0.5% Tropicamide and 0.5% Phenylephrine Addition to 1% Cyclopentolate Regimen. Korean J Ophthalmol. 2017;31(3):249-56. [DOI] [PMC free article] [PubMed]
- 38.Estay AM, Plaza-Rosales I, Torres HR, Cerfogli FI. Training in retinoscopy: learning curves using a standardized method. BMC Med Educ. 2023;23:874. 10.1186/s12909-023-04750-y. [DOI] [PMC free article] [PubMed]
Data Availability Statement
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.