Abstract
Purpose
To evaluate the fundus examination accuracy of medical students using an unmodified iPhone X or a direct ophthalmoscope, compared against a staff ophthalmologist's retinal examination.
Methods
In this prospective comparative analysis, patients underwent dilated fundus examination by novice medical trainees using either an unmodified iPhone X or a standard direct ophthalmoscope. The primary outcome was the mean difference, and degree of agreement, in cup-to-disc ratio between the students' examinations and the staff ophthalmologist's assessment.
Results
A total of 18 medical students conducted 230 retinal examinations, 117 with the iPhone X and 113 with the direct ophthalmoscope. A greater proportion of students were unable to report cup-to-disc ratio using the iPhone X (81.2%) vs the direct ophthalmoscope (30.1%). Student examination of cup-to-disc ratio led to a systematic bias (95% limits of agreement) of +0.16 (−0.22 to +0.54) and +0.10 (−0.36 to +0.56) with the iPhone X and direct ophthalmoscope, respectively. iPhone X and direct ophthalmoscope student observation concordance for optic disc colour (88.7 and 82.4%, respectively) and contour (68.3 and 74.2%, respectively) demonstrated low agreement with staff ophthalmologist findings. Student iPhone X observations demonstrated lower agreement with staff findings compared to direct ophthalmoscope observations for spontaneous venous pulsations (Cohen's Kappa = −0.044 vs 0.099).
Conclusion
Amongst medical trainees, optic disc visualization using an unmodified iPhone X was inferior to the direct ophthalmoscope. When able to visualize the optic nerve head, there was no significant difference in reported cup-to-disc ratio between modalities. However, both modalities demonstrated poor reliability in comparison to staff ophthalmologist findings.
Supplementary Information
The online version contains supplementary material available at 10.1007/s10792-022-02377-4.
Keywords: Direct ophthalmoscopy, iPhone X, Direct ophthalmoscope, Medical trainees, Telemedicine, Medical education
Introduction
Ophthalmoscopy is a critical component of a standard ocular physical examination. The reference standard for retinal examination is binocular ophthalmoscopy, using either a slit-lamp biomicroscope or an indirect ophthalmoscope. Direct ophthalmoscopy (DO), using a portable hand-held ophthalmoscope, has become a viable option for general physicians and medical trainees because of the complexity, cost, and limited portability associated with binocular ophthalmoscopy [1]. Despite DO being preferred in the medical education setting, medical students and novice ophthalmology residents have expressed minimal confidence in their ability to perform this skill [2, 3].
It has been reported that up to 98% of physicians use a smartphone; this number is expected to rise [4]. Due to their prevalence, excellent camera quality, and accessibility, smartphones present a possible alternative to DO as both clinical and teaching tools. Smartphone ophthalmoscopy allows the capture of retinal images or videos to appreciate the retina in real time. This is a useful application for teaching purposes, as a preceptor can review the captured video to see if a student is viewing and interpreting the field correctly.
Gunasekera and Thomas first reported the novel use of an unaided iPhone X for direct ophthalmoscopy [5]. This method is appealing due to the superior camera quality of the iPhone X compared to previous generation smartphones, thereby eliminating the need for expensive attachments to perform ophthalmoscopy. However, there is a paucity of literature investigating the accuracy and reliability of unmodified, newer model smartphones as ophthalmoscopes in the clinical setting. This study was conducted to compare the accuracy of medical students evaluating the optic nerve head using an unmodified iPhone X smartphone compared to a standard direct ophthalmoscope.
Materials and methods
Study design
In this prospective comparative analysis, patients being followed in a comprehensive, tertiary ophthalmology clinic in Toronto, Canada, and first- and second-year medical students from the University of Toronto were recruited between November 2019 and January 2020. This study adhered to the tenets of the Declaration of Helsinki, and University of Toronto Institutional Review Board approval (35158) was obtained. To be considered for inclusion, patients 18 years or older were required to have provided written informed consent to participate. Exclusion criteria were as follows: 1) no light perception vision, 2) documented nuclear sclerotic cataract graded > 3 as per the Lens Opacities Classification System III, 3) presence of posterior subcapsular cataract, and 4) presence of corneal oedema or dystrophy. To be considered for inclusion, participating medical students from the University of Toronto in years 1 or 2 of training were required to provide written informed consent. Exclusion criteria were as follows: 1) previous training with DO exceeding 3 h, 2) no previous experience using a smartphone, and 3) corrected Snellen visual acuity worse than 20/40.
Study participants and data collection
First- and second-year medical students at the University of Toronto were recruited to participate in the study. Prior to data collection, all students received a 3-h hands-on training session with the direct ophthalmoscope and iPhone X. For iPhone X observations, students were instructed to place the rear camera 3 cm from the eye, with video mode selected at 2× magnification and 4K resolution (60 frames per second); these parameters mirrored those used by Gunasekera and Thomas [5].
A bilateral dilated fundus examination, using either an unmodified iPhone X or direct ophthalmoscope, was performed by 6 students on each study patient. Students were randomized to either fundoscopy modality for each eye included in the study. Time of examination was standardized to a maximum of 1 min per eye.
Students were instructed to independently report each patient's cup-to-disc ratio (CDR), using the average of their vertical and horizontal CDR estimates, to one decimal place, on a standardized questionnaire (Online Resource 1). Colour of the optic disc (pale or pink), contour of the disc (sharp or blurred margins), and presence of spontaneous venous pulsations (SVPs) (yes or no) were also reported. Students were instructed to record an observation only if they were truly able to visualize the relevant fundus structures; otherwise, they designated the observation as "not accessible." Student observations were compared to the findings of a masked staff ophthalmologist, who performed the standard-of-care fundus examination with a 78D non-contact lens at the slit-lamp biomicroscope using the same reporting parameters.
Primary and secondary outcomes
The primary outcome of this study was the mean difference in CDR between the novice trainees and the staff ophthalmologist for iPhone X and direct ophthalmoscope observations. Secondary outcomes included the percentage of correct evaluations of optic disc colour, contour, and presence of SVPs by the medical students compared to the reference ophthalmologist, stratified by observation modality. The percentage of student CDR evaluations within 0.2, 0.1, and 0.0 (identical) disc diameters of the staff ophthalmologist's estimate was also recorded.
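The tolerance-based CDR outcome is straightforward to compute. The sketch below is a minimal illustration (in Python rather than the SPSS used in this study; the paired estimates are hypothetical):

```python
def proportion_within(student_cdr, staff_cdr, tol):
    """Proportion of paired student CDR estimates within `tol` disc
    diameters (inclusive, per the study's definition) of the staff value."""
    hits = sum(abs(s - r) <= tol + 1e-9  # epsilon guards float round-off at 0.1/0.2
               for s, r in zip(student_cdr, staff_cdr))
    return hits / len(student_cdr)

# Hypothetical paired estimates, for illustration only
student = [0.4, 0.5, 0.3, 0.6]
staff = [0.3, 0.3, 0.3, 0.4]
for tol in (0.0, 0.1, 0.2):
    print(f"within {tol:.1f}: {proportion_within(student, staff, tol):.0%}")
```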
Statistical analysis
The difference between each medical trainee's CDR finding and the CDR determined by the staff ophthalmologist was calculated for every observation. These differences were averaged to obtain a mean difference with its standard deviation, stratified by modality (iPhone X or direct ophthalmoscope). Bland–Altman analysis was performed to evaluate bias and the limits of agreement between direct ophthalmoscope or iPhone X observations and reference standard staff ophthalmologist findings.
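For readers wishing to reproduce the Bland–Altman computation, a minimal sketch follows (illustrative Python, not the SPSS workflow used in the study; the input arrays are hypothetical):

```python
import numpy as np

def bland_altman(student_cdr, staff_cdr):
    """Return the mean bias and 95% limits of agreement between paired readings."""
    diff = np.asarray(student_cdr, dtype=float) - np.asarray(staff_cdr, dtype=float)
    bias = diff.mean()        # positive bias = students over-estimate relative to staff
    sd = diff.std(ddof=1)     # sample standard deviation of the paired differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired CDR observations, for illustration only
bias, (lo, hi) = bland_altman([0.4, 0.5, 0.3, 0.6, 0.4], [0.3, 0.3, 0.3, 0.4, 0.3])
print(f"bias = {bias:+.2f}, 95% limits of agreement = ({lo:+.2f}, {hi:+.2f})")
```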
Secondary outcomes were analysed similarly, comparing the percentage of accurate examinations to the standard-of-care reference. Cohen's Kappa was calculated and reported, taking into consideration the probability of chance agreement between medical student and staff observations. A complete-case analysis was used when comparing the concordance of direct ophthalmoscope and iPhone X observations with staff observations, to account for students unable to report findings. Fisher's exact test was performed to evaluate for statistical differences between comparison groups. SPSS Statistics Software (IBM SPSS Statistics for Windows, Version 22.0, Armonk, NY, USA, April 2020) was used for all analyses.
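The two significance and agreement measures named above can be sketched as follows (again illustrative Python; `scipy` and `scikit-learn` stand in for SPSS, and the kappa labels are hypothetical):

```python
from scipy.stats import fisher_exact
from sklearn.metrics import cohen_kappa_score

# 2x2 table of able/unable to comment on CDR by modality (counts from Table 3)
table = [[79, 34],   # direct ophthalmoscope: able, unable
         [22, 95]]   # iPhone X: able, unable
odds_ratio, p_value = fisher_exact(table)
print(f"Fisher's exact p = {p_value:.3g}")

# Chance-corrected agreement between paired binary calls
# (hypothetical labels; 1 = SVPs present, 0 = absent)
student = [1, 0, 0, 1, 0, 1, 0, 0]
staff   = [1, 0, 1, 0, 0, 1, 0, 1]
print(f"Cohen's kappa = {cohen_kappa_score(student, staff):.3f}")
```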
Results
A total of 18 medical students conducted 230 retinal examinations on 38 unique patient eyes. A total of 117 student-performed fundus examinations were completed using the iPhone X compared to 113 using the direct ophthalmoscope (Table 1).
Table 1.
Demographics of patients included in study analysis (n = 24)
| Characteristic | Value |
|---|---|
| Age, years (range) | 72.1 (52 to 87) |
| Number of unique eyes, n | 38 |
| Male Sex, n (%) | 16 (57.9%) |
| Mean IOP, mmHg (SD) | 13.0 (3.67) |
| BCVA, LogMAR (SD) | 0.11 (0.15) |
| Past Ocular history | |
| AMD, n | 1 |
| Pseudophakic, n | 25 |
| Healthy, n | 12 |
Data are listed as mean (range), mean (SD) or number (%). IOP intraocular pressure, BCVA best-corrected visual acuity, AMD Age-related macular degeneration
Grading of CDR
Mean CDR (±SD) amongst patients included in the study was 0.33 ± 0.18 as measured by the staff ophthalmologist (Table 2). A statistically significantly higher proportion of students were unable to comment on CDR using the iPhone X (81.2%) compared to the direct ophthalmoscope (30.1%), p < 0.001 (Table 3). Amongst students who were able to provide CDR estimates, a greater proportion reported values within 0.2 disc diameters of staff ophthalmologist findings in the iPhone X group (72.7%) than in the direct ophthalmoscope group (62.0%) (Table 3). For students able to visualize the disc, the mean bias was +0.11, with 95% limits of agreement of −0.33 to +0.56 (Fig. 1a). When using the iPhone X to assess CDR, there was a systematic bias of +0.16 compared to the reference observation, with 95% limits of agreement of −0.22 to +0.54 (Fig. 1b). When using the direct ophthalmoscope to assess CDR, there was a systematic bias of +0.10 compared to the reference standard observation, with 95% limits of agreement of −0.36 to +0.56 (Fig. 1c).
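As a consistency check on these intervals: in a Bland–Altman analysis the limits of agreement equal the bias ± 1.96 standard deviations of the paired differences, so the reported figures imply

$$ \mathrm{SD}_{\text{iPhone X}} \approx \frac{0.54 - 0.16}{1.96} \approx 0.19, \qquad \mathrm{SD}_{\text{DO}} \approx \frac{0.56 - 0.10}{1.96} \approx 0.23. $$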
Table 2.
Patient optic nerve head findings as determined by staff ophthalmologist
| Optic disc parameter | Staff ophthalmologist finding | |
|---|---|---|
| CDR, mean (SD) | 0.33 (0.18) | |
| Colour, n | Pink disc: 38 | Pale disc: 0 |
| Contour, n | Crisp margins: 38 | Blurred margins: 0 |
| SVPs, n | Yes: 11 | No: 27 |
Data are listed as mean (SD) or number (n). CDR cup-to-disc ratio, SVPs spontaneous venous pulsations
Table 3.
Student CDR observation concordance with staff ophthalmologist findings using the direct ophthalmoscope and iPhone X
| | Direct ophthalmoscope (n = 113), N (%) | iPhone X (n = 117), N (%) | P value |
|---|---|---|---|
| Able to comment | 79 (69.9) | 22 (18.8) | < 0.001 |
| Exact match | 10 (12.7) | 5 (22.7) | 0.308 |
| ≤ 0.1 disc diameter | 30 (38.0) | 9 (40.9) | 0.809 |
| ≤ 0.2 disc diameter | 49 (62.0) | 16 (72.7) | 0.454 |
| Unable to comment | 34 (30.1) | 95 (81.2) | < 0.001 |
Data are listed as number and percentage of student responses
The "≤ 0.1" and "≤ 0.2 disc diameter" thresholds are inclusive
Fig. 1.
Bland–Altman plots of the difference in cup-to-disc ratio between staff ophthalmologist findings by 78D slit-lamp biomicroscopy and a student observations with either the iPhone X or direct ophthalmoscope, b student observations with the iPhone X only, and c student observations with the direct ophthalmoscope only
Evaluation for optic disc pallor
The staff ophthalmologist classified the disc of all 38 eyes as 'pink' (Table 2). Of the 230 learner observations, 51/230 (22.2%) were recorded as 'no response' (Table 4). A statistically significantly higher proportion of students were unable to evaluate disc pallor using the iPhone X (39.3%) compared to the direct ophthalmoscope (4.4%), p < 0.001. Of the observations in which students were able to visualize the disc, 152/179 (84.9%) agreed with the staff ophthalmologist's finding; optic disc colour concordance with the staff ophthalmologist finding was 89/108 (82.4%) for students using the direct ophthalmoscope and 63/71 (88.7%) for students using the iPhone X. The proportion of observations reporting disc pallor was 19/108 (17.6%) and 8/71 (11.3%) in the direct ophthalmoscope and iPhone X groups, respectively.
Table 4.
Student optic nerve head exam findings using the direct ophthalmoscope and iPhone X
| | Direct ophthalmoscope (n = 113), N (%) | iPhone X (n = 117), N (%) | P value |
|---|---|---|---|
| Disc colour | | | |
| Correct | 89 (82.4) | 63 (88.7) | |
| Incorrect | 19 (17.6) | 8 (11.3) | |
| No response | 5 (4.4) | 46 (39.3) | < 0.001 |
| Disc contour | | | |
| Correct | 72 (74.2) | 28 (68.3) | |
| Incorrect | 25 (25.8) | 13 (31.7) | |
| No response | 16 (14.2) | 76 (65.0) | < 0.001 |
| SVPs | | | |
| Correct | 45 (65.2) | 16 (55.2) | |
| Incorrect | 24 (34.8) | 13 (44.8) | |
| No response | 44 (38.9) | 88 (75.2) | < 0.001 |
Data are listed as number and percentage of student responses. SVPs spontaneous venous pulsations
Evaluation for optic disc contour
The staff ophthalmologist classified the disc margins of all 38 eyes as crisp (Table 2). Of the 230 learner observations, 92/230 (40.0%) were recorded as 'no response' (Table 4). A statistically significantly higher proportion of students were unable to evaluate disc contour using the iPhone X (65.0%) compared to the direct ophthalmoscope (14.2%), p < 0.001. Of the observations in which students were able to evaluate disc contour, 100/138 (72.5%) agreed with the staff ophthalmologist's finding; optic disc contour concordance with the staff ophthalmologist finding was 72/97 (74.2%) for students using the direct ophthalmoscope and 28/41 (68.3%) for students using the iPhone X. The proportion of observations reporting blurred disc margins was 25/97 (25.8%) and 13/41 (31.7%) in the direct ophthalmoscope and iPhone X groups, respectively.
Evaluation for SVPs
A total of 11 out of 38 eyes (28.9%) had SVPs as noted by the staff ophthalmologist (Table 2). Of the 230 learner observations, 132/230 (57.4%) were unable to evaluate SVPs (Table 4). A statistically significantly higher proportion of students were unable to evaluate the presence of SVPs using the iPhone X (75.2%) compared to the direct ophthalmoscope (38.9%), p < 0.001. Of the observations in which students were able to evaluate SVPs, 61/98 (62.2%) agreed with the staff ophthalmologist's finding; concordance for the presence of SVPs with the staff ophthalmologist finding was 45/69 (65.2%) for students using the direct ophthalmoscope (Cohen's Kappa 0.099, SE = 0.125) and 16/29 (55.2%) for students using the iPhone X (Cohen's Kappa −0.044, SE = 0.167), p = 0.37.
Discussion
We present the first prospective clinical study evaluating the accuracy of fundus observations made with an unmodified smartphone, following the 2019 report by Gunasekera and Thomas [5]. Gunasekera and Thomas hypothesized that inexperienced students would be able to perform DO more accurately with the iPhone X than with the direct ophthalmoscope, given the prevalence and familiarity of smartphones in the student population [5]. However, in our study, medical students demonstrated greater difficulty performing DO using the iPhone X, as indicated by the lower proportion able to make observations of CDR (iPhone X: 18.8%; direct ophthalmoscope: 69.9%), optic disc colour (iPhone X: 60.7%; direct ophthalmoscope: 95.6%) and optic disc contour (iPhone X: 35.0%; direct ophthalmoscope: 85.8%).
Students rated CDR higher by an average of 0.11 compared to the staff ophthalmologist finding. A mean bias of approximately 0.10 is generally considered clinically acceptable for CDR [6]. This level was achieved in the direct ophthalmoscope group (bias = +0.10, 95% limits of agreement: −0.36 to +0.56) but not the iPhone X cohort (bias = +0.16, 95% limits of agreement: −0.22 to +0.54). Furthermore, a greater proportion of students were unable to comment on CDR using the iPhone X compared to the direct ophthalmoscope (81.2% vs 30.1%). Amongst trained ophthalmologists, Tielsch et al. found that CDR estimates varied by 0.2 disc diameters or more in 17–19% of stereoscopic observations [7]. In comparison, medical student CDR estimates in our study differed from staff findings by more than 0.2 disc diameters in 38.0% and 27.3% of direct ophthalmoscope and iPhone X observations, respectively. While inter-user variability is expected, the limits of agreement for CDR scoring using the iPhone X and direct ophthalmoscope were both too large to suggest agreement between student and staff observations. Therefore, we cannot infer that one modality is superior to the other with respect to evaluating CDR. Our findings suggest that trainees were not comfortable with DO despite the 3-h training session prior to study initiation, which is in line with the literature [8].
Gunasekera and Thomas were able to easily perform iPhone X ophthalmoscopy, as demonstrated by the embedded video in their publication; however, the amount of practice required to become proficient in this technique is unclear [5]. Trainees enrolled in our study received only a 3-h training session and had no extensive prior training. Despite our use of the same video parameters as the previous publication, the training that students in this study received may not have been sufficient, contributing to poor performance. Gunasekera and Thomas did not comment on how long they trained with the iPhone X, but both are trained ophthalmologists who may have better baseline proficiency at finding the optic nerve head.
At first glance, it appeared that students were able to accurately assess optic disc colour and margins when they could visualize the disc. Students using the iPhone X demonstrated greater concordance with the staff ophthalmologist's disc pallor assessment (iPhone X = 88.7%, direct ophthalmoscope = 82.4%), but poorer concordance with the staff ophthalmologist's disc margin assessment (iPhone X = 68.3%, direct ophthalmoscope = 74.2%). All patients enrolled in the study had pink discs and crisp margins; therefore, we were unable to assess students' ability to distinguish disc pallor or blurred margins from normal disc colour and contour. The main conclusion is that students demonstrated less ability to characterize disc colour and margins with the iPhone X, as indicated by the proportion of 'no responses' in each modality group.
Unlike disc contour and disc pallor, SVPs were present in some enrolled eyes, as noted by the staff ophthalmologist (11/38, 28.9%). Students experienced the greatest difficulty in evaluating SVPs compared to the other secondary outcomes, as demonstrated by the highest proportion of observations for which students were unable to comment on the presence of SVPs (57.4%). This may be explained by the greater relative difficulty of this task compared to the evaluation of optic nerve pallor and contour. Concordance was greater in the direct ophthalmoscope group (65.2%) than in the iPhone X group (55.2%). However, only slight agreement between students and staff was observed in the direct ophthalmoscope group (Cohen's kappa = 0.099), and no agreement was observed in the iPhone X group (Cohen's kappa = −0.044).
The use of smartphones for retinal examination has been reported by several studies [9–11]; however, previously investigated smartphones have required additional equipment. Smartphone attachments such as the "D-Eye" [9–11] and "Peek Retina" [11] are available to enhance camera quality and provide a light source for ophthalmic purposes. Using indirect ophthalmoscopy as the reference standard, Dickson et al. found that D-Eye iPhone CDR assessments agreed within 0.1 disc diameters in 94% of observations [9]. The proportion of students able to visualize the optic disc using the D-Eye smartphone attachment (82.3%) has also been shown to be greater than with the direct ophthalmoscope (48.5%) in a study by Kim and Chao [10]. Bastawrous et al. reported a CDR mean bias using the Peek Retina of −0.02 compared with digital fundus camera readings, with 95% limits of agreement of −0.21 to +0.17 (Kappa = 0.69) [11]. Although Peek Retina CDR assessment was compared with fundus camera readings rather than slit-lamp biomicroscopy, the results showed greater agreement than observed in our study. There is potential for smartphone ophthalmoscopy; however, the need for additional attachments makes this method inconvenient.
Developments in telemedicine, robust security channels, and smartphone-accessible electronic medical records, in combination with smartphone ophthalmoscopy, may lead to improved patient care. If proven equivalent or superior to the direct ophthalmoscope, iPhone X ophthalmoscopy could adopt a role similar to fundus photography, which has become a popular referral option for providers wishing to consult ophthalmologists [12]. Electronic consultation through the secure sharing of smartphone ophthalmoscopy images and videos is especially applicable during the present COVID-19 pandemic, as it can provide useful clinical information without requiring face-to-face patient interaction.
The limitations of this study deserve mention. First, the level of training of the student participants resulted in a large proportion of incomplete observations, as trainees were often unable to comment on optic disc parameters. This limits our ability to draw absolute conclusions about the superiority of one DO modality over the other in the novice student population. While the large proportion of incomplete observations was not ideal, it reflects the high level of difficulty novice students experience performing DO. In addition, the patient sample did not include eyes with optic disc pallor or blurred disc margins; therefore, we were unable to reliably assess students' ability to distinguish disc pallor and blurred disc margins from normal disc colour and contour. Strategic guessing of optic disc parameters amongst students was also an uncontrollable potential bias. We aimed to minimize guessing by providing clear instructions to participants and by masking each student's observations from the other participants and the staff ophthalmologist.
Conclusions
In conclusion, amongst medical trainees, optic disc visualization using the iPhone X was inferior to the direct ophthalmoscope. However, students demonstrated poor reliability performing fundus examinations with either device, and there was no significant difference in CDR assessment accuracy between modalities. Further evaluation of the unmodified iPhone X as a tool for ophthalmoscopy is necessary to provide evidence for implementation into medical education settings.
Supplementary Information
Below is the link to the electronic supplementary material.
Abbreviations
- DO
Direct ophthalmoscopy
- CDR
Cup-to-disc ratio
- SVPs
Spontaneous venous pulsations
Authors’ contributions
All authors contributed to the manuscript and have reviewed/approved its contents.
Funding
No funding was received by any author for the completion of this study.
Availability of data and material
The data that support the findings of this study are available on request from the corresponding author.
Code availability
The SPSS Statistics Software (IBM SPSS Statistics for Windows, Version 22.0, Armonk, NY, USA, April 2020) was utilized for analysis.
Declarations
Conflict of interest
Dr. Amandeep S. Rai reported receiving payment/honoraria from Alcon and Bausch Health to support an educational event as well as receiving consulting fees from Alcon and Bausch Health. The other authors have no conflict of interests to disclose.
Ethics approval
This study adhered to the tenets of the Declaration of Helsinki and University of Toronto Institutional Review Board approval was obtained (REB# 35158).
Consent to participate (ethics)
Written informed consent was obtained from all individual participants included in the study.
Consent to publish (ethics)
Written informed consent was obtained from all individual participants included in the study.
Research involving human participants
All procedures performed in this study involving participants were in accordance with the ethical standards of the institutional research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent
Written informed consent was obtained from all individual participants included in the study. The study was approved by the University of Toronto research ethics board and adhered to the tenets of the Declaration of Helsinki.
Meeting Presentations
None
Footnotes
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Bifolck E, Fink A, Pedersen D, Gregory T. Smartphone imaging for the ophthalmic examination in primary care. JAAPA. 2018;31:34–38. doi:10.1097/01.JAA.0000541482.54611.7c
- 2. Shah M, Knoch D, Waxman E. The state of ophthalmology medical student education in the United States and Canada, 2012 through 2013. Ophthalmology. 2014;121:1160–1163. doi:10.1016/j.ophtha.2013.12.025
- 3. Mackay DD, Garza PS, Bruce BB, Newman NJ, Biousse V. The demise of direct ophthalmoscopy: a modern clinical challenge. Neurol Clin Pract. 2015;5:150–157. doi:10.1212/CPJ.0000000000000115
- 4. Mobasheri MH, King D, Johnston M, Gautama S, Purkayastha S, Darzi A. The ownership and clinical use of smartphones by doctors and nurses in the UK: a multicentre survey study. BMJ Innov. 2015;1:174–181. doi:10.1136/bmjinnov-2015-000062
- 5. Gunasekera CD, Thomas P. High-resolution direct ophthalmoscopy with an unmodified iPhone X. JAMA Ophthalmol. 2019;137:212–213. doi:10.1001/jamaophthalmol.2018.5806
- 6. Haslett RS, Batterbury M, Cuypers M, Cooper RL. Inter-observer agreement in clinical optic disc measurement using a modified 60 D lens. Eye (Lond). 1997;11:692–697.
- 7. Tielsch JM, Katz J, Quigley HA, Miller NR, Sommer A. Intraobserver and interobserver agreement in measurement of optic disc characteristics. Ophthalmology. 1988;95:350–356. doi:10.1016/S0161-6420(88)33177-5
- 8. Gilmour G, McKivigan J. Evaluating medical students' proficiency with a handheld ophthalmoscope: a pilot study. Adv Med Educ Pract. 2016;8:33–36. doi:10.2147/AMEP.S119440
- 9. Dickson D, Samiksha F, MacDonald C, Song H, Agraz D, Morgan L, Suh D. Comparison study of funduscopic exam of pediatric patients using the D-EYE method and conventional indirect ophthalmoscopic methods. Open J Ophthalmol. 2017;7:150–152.
- 10. Kim Y, Chao DL. Comparison of smartphone ophthalmoscopy vs conventional direct ophthalmoscopy as a teaching tool for medical students: the COSMOS study. Clin Ophthalmol. 2019;13:391–401. doi:10.2147/OPTH.S190922
- 11. Bastawrous A, Giardini ME, Bolster NM, Peto T, Shah N, Livingstone IA, Weiss HA, Hu S, Rono H, Kuper H, Burton M. Clinical validation of a smartphone-based adapter for optic disc imaging in Kenya. JAMA Ophthalmol. 2016;134:151–158. doi:10.1001/jamaophthalmol.2015.4625
- 12. Panwar N, Huang P, Lee J, Keane PA, Chuan TS, Richhariya A, Teoh S, Lim TH, Agrawal R. Fundus photography in the 21st century: a review of recent technological advances and their implications for worldwide healthcare. Telemed J E Health. 2016;22:198–208. doi:10.1089/tmj.2015.0068