Abstract
Purpose
To compare the vision-screening results of school-aged children tested with EyeSpy software and those of children examined by a pediatric ophthalmologist. We also compared combined results of an electronic visual acuity (EVA) tester and stereopsis testing to the results of a professional eye examination.
Methods
In this pilot study, all children were tested with EyeSpy, tested with an ETDRS-format electronic visual acuity (EVA) tester plus stereopsis testing, and examined (including cycloplegic refraction) by a pediatric ophthalmologist. The order of the EVA and EyeSpy assessments was assigned randomly. The EyeSpy test was performed twice (once with an occlusive eye patch and once with red-blue dissociative goggles). EyeSpy registered pass or refer results at thresholds of 20/32 visual acuity and 300 arcsec stereopsis. Similar threshold values were used in the EVA/stereopsis testing.
Results
The average age of the 72 subjects was 11.4 ± 2.2 years. Prevalence of visual impairment, as determined by the professional examination, was 25 of 72 (34.7%). Compared to the gold-standard professional eye examination, the sensitivity, specificity, and conventional positive likelihood ratio were 88%, 87%, and 6.8 when EyeSpy was used with a patch; 88%, 74%, and 3.44 when EyeSpy was used with goggles; and 88%, 94%, and 13.79 for EVA/stereopsis. EyeSpy screening results using a patch were not significantly different from those of the professional examination (p = 0.508); the two results concurred in 63 of 72 (87.5%) subjects.
Conclusions
EyeSpy software has potential for use as a vision-screening device. EyeSpy using an occlusive patch outperformed EyeSpy using dissociative glasses.
Introduction
Vision screening for school-aged children is important to detect uncorrected refractive errors, problems with binocular vision, and missed amblyopia.1-6 Ideally, children should have repeated vision screening as they age, since refractive errors and disruptions in binocular vision can develop at any age. Vision screening in school-aged children has been made even more relevant by the results of a recent study suggesting that amblyopia therapy is effective even at older ages.7 Visual acuity is the most widely used method for screening vision in school-aged children; however, visual-acuity testing in schools is not standardized, and binocular vision is often not tested. Professional examinations provide a comprehensive assessment of vision but are cost prohibitive for repeated testing during school years for all children.
EyeSpy 20/20 (VisionQuest 20/20, Mesa, AZ) is an automated computer program that assesses vision while a child plays a video game. In addition to visual acuity, it incorporates an analysis of binocular function. The video game format, with highly engaging graphics, motion, and sound, appeals to children.3 Computers are readily available in most schools in the United States, and vision screening software is easily distributed. Computer software applications allow standardization of logic protocols for vision screenings. In addition, the video game format allows for automated testing, which eliminates the need for training and certification of vision-screening proctors; thus it allows screenings to be performed by lay volunteers or other school personnel.
Our pilot study aimed to compare vision-screening results using EyeSpy software on an off-the-shelf laptop computer performed by a lay screener to the results of a gold-standard professional eye examination by a pediatric ophthalmologist. We also compared combined results of an electronic visual acuity (EVA) tester and stereopsis testing, both administered by an ophthalmic technician, to the results obtained from a professional eye examination by a pediatric ophthalmologist.
Methods
Subjects
This study complied with the Health Insurance Portability and Accountability Act and was approved by the Institutional Review Board (IRB) for Human Research at the Medical University of South Carolina (MUSC). To attract study participants, advertisements approved by the IRB were sent out by the MUSC Broadcast Email Messaging system, and letters were mailed to the school nurses at nearby schools and to various Boy and Girl Scout Clubs. The advertisements included specific inclusion and exclusion criteria. By parental report, children with developmental delay, psychiatric or attention deficit disorders, or motor skills insufficiency were excluded from consideration for the study. Children aged 8-16 years were enrolled. Informed written consent was provided by a parent of each enrolled subject. In addition, assent was obtained from children above 12 years of age prior to testing. Only subjects enrolled after the prototype software had evolved to the beta stage were included in the analysis.
Apparatus
EyeSpy
EyeSpy is a computerized eye chart and stereopsis tester. In the current study we used HP laptop computers with 15.4″ diagonal screens at 1280 × 1024 resolution running Windows XP. The minimum requirements stated by the software designers are (1) a laptop or desktop computer with an LCD monitor of 13.3″ to 30″ diagonal and at least 1024 × 768 resolution; (2) the Windows XP, Vista, or Windows 7 operating system; and (3) 100 megabytes of available space on the primary hard drive.
The stimuli for the visual acuity tasks were a subset of ETDRS optotypes. The optotypes used in the software version evaluated in this study were the letters R, S, Z, and K,8 chosen by the manufacturers. The software allows the optotypes to be modified (eg, use of HOTV optotypes). EyeSpy presents differently sized optotypes with crowding bars one at a time. The optotypes were presented with white crowding bars on a black background and were white, red, or blue in color. When patching was used to dissociate the eyes, the optotypes were always presented in white. When colored goggles were used to dissociate the eyes, the optotypes were presented in white (to evaluate both eyes), red (to evaluate the left eye), or cyan/blue (to evaluate the right eye). Stimuli for the stereopsis assessment were random dot stereograms that presented one of four shapes when stereo vision was present: circle, square, triangle, or diamond (Figure 1).
FIG 1.
Screen shot of EyeSpy stereo protocol (provided by Dr. Jim O’Neil and Rich Tirendi).
EyeSpy follows the Amblyopia Treatment Study (ATS) visual acuity testing protocol.9-11 E-Supplement 1 (available at jaapos.org) illustrates the generic protocol for EyeSpy (provided by Dr. Jim O'Neil and Rich Tirendi). To establish the child's ability to master the concept of matching letters, EyeSpy has an initial comprehension phase, in which large optotypes are presented. The letters then rapidly decline in size until they reach the threshold value, which is set at the pass/refer criterion. For this study, the threshold visual acuity value was 20/32 (ability to read 20/32 required to pass). These parameters are adjustable within the software but remained at this value for all children enrolled in this study. For the threshold line to be counted as a pass, three of four optotypes had to be correctly identified. If fewer than three of four optotypes were correctly identified, a reinforcement phase was employed, in which larger optotypes were shown; optotypes were then diminished in size back to the pass/refer level so that the threshold level was retested before a failing result was recorded. In addition, if the response time was 60% longer in one eye than the other when wearing the red/blue goggles, the difference triggered a referral rather than a pass for that child. When a patch was used, the software was set to detect a 100% (doubling) or greater difference in testing speed between the eyes, which likewise triggered a referral.
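The pass/refer and timing logic described above can be sketched as follows. This is an illustrative reconstruction under the stated thresholds (3 of 4 optotypes; 60% response-time asymmetry with goggles, 100% with a patch); all function and parameter names are hypothetical and are not taken from the EyeSpy source code.

```python
# Hypothetical sketch of the EyeSpy pass/refer rules; not the vendor's code.

def eye_passes(correct_responses, required=3, total=4):
    """A threshold line passes if at least 3 of 4 optotypes were identified."""
    return correct_responses >= required

def timing_refer(time_right_s, time_left_s, dissociation="goggles"):
    """Refer if one eye is much slower: >60% longer with goggles, >100% (doubling) with a patch."""
    ratio_limit = 1.6 if dissociation == "goggles" else 2.0
    slower = max(time_right_s, time_left_s)
    faster = min(time_right_s, time_left_s)
    return slower > ratio_limit * faster

def screen_result(correct_r, correct_l, time_r, time_l, dissociation="goggles"):
    """Combine the acuity and timing rules into a single pass/refer outcome."""
    if not (eye_passes(correct_r) and eye_passes(correct_l)):
        return "refer"
    if timing_refer(time_r, time_l, dissociation):
        return "refer"
    return "pass"
```

For example, a child who reads 4 of 4 optotypes with each eye but takes twice as long with one eye would pass at the goggle threshold only if the slower eye stayed within 60% of the faster one.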
Electronic Visual Acuity Tester
For the EVA (model 4-WRH), the stimuli were high-contrast, black-and-white letters with luminance of 85 to 105 candelas/m² and contrast of 98% (Jaeb Center for Health Research Foundation, Inc, Florida). Single letters were framed with crowding bars. With a high-resolution (1600 × 1200 pixels) 17″ monitor, the system was capable of displaying letters from 20/800 to 20/12 at a test distance of 3 meters.10-11
Procedure
Each child underwent 3 separate evaluations (EyeSpy testing, EVA testing plus stereopsis, and a comprehensive professional eye examination, including cycloplegic refraction). EyeSpy testing was conducted twice by two lay examiners (one testing with an eye patch and the other with red/blue glasses). EVA and stereopsis testing was conducted by two ophthalmic technicians. The comprehensive eye examination was conducted by a pediatric ophthalmologist. The order of evaluation by EVA and EyeSpy was assigned randomly. Children were evaluated without optical correction. The children were monitored during the testing, which was conducted in the Storm Eye Institute MUSC pediatric ophthalmology clinic.
EyeSpy Testing
The EyeSpy video-game vision testing was performed twice (once with an eye patch and once with red/blue glasses). Two laptop computers were used for the study: one exclusively to test all children with an eye patch, the other to test with goggles. The order was not randomized; however, approximately half of the children underwent examination by patch first and the other half by red/blue glasses first.
Red/blue glasses were used to dissociate the eyes to allow each eye to be tested separately. The testing distance was set to 10 feet for our study. Children sat at a table positioned so that they were 10 feet from the computer screen. The game was played using a computer mouse placed on the table in front of the child. The EyeSpy program allowed the child to choose from several animated characters. The chosen character then moved through the game as directed by the child using the mouse. The software program took the child through a series of video game tasks while it performed visual acuity testing. The child had to use the computer mouse to select the corresponding optotype from one of four choices located along the top of the screen. The proper matching selection was necessary for the child to move to the next stage of the video game. For purposes of this study, EyeSpy registered pass/refer for threshold visual acuity testing at 20/32 (ability to read 20/32 required for a pass) and the presence or absence of low-grade stereopsis (300 arcsec). In order to perform uninterrupted testing on the computer, stereopsis was tested at the same distance as visual acuity, that is, 10 feet. Red/blue glasses were used for stereopsis testing regardless of the method of dissociation used during visual acuity testing (patch or goggles).
EVA Testing and Stereopsis
The standard visual acuity testing was performed by an ophthalmic technician using the EVA tester while the child wore an eye patch over the eye not being tested. The ETDRS visual acuity testing protocol was used, consisting of an initial screening phase followed by a testing phase. The smallest line for which at least 3 out of 5 letters could correctly be identified was recorded as the visual acuity for that eye.
Stereopsis testing was performed using the Random dot E stereotest (Stereo Optical Co, Inc, Chicago, IL) at 50 cm while the child wore polarized glasses. Presence or absence of near stereopsis at 300 arcsec was recorded. Stereopsis testing was performed per the Prevent Blindness America physician protocol, which requires correctly selecting the random dot E rather than a blank card four times in a row within a maximum of six attempts. The patient was asked to distinguish between a raised E and a nonstereo target.
The subject was assigned “refer” if the child could not read 20/32 optotypes with either the right eye or the left eye, if there was a two-line difference in visual acuity between the eyes, or if stereopsis was not demonstrated.
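The referral rule above can be expressed as a short check. This is a hypothetical sketch for illustration only: the line layout of the ETDRS chart is an assumption, acuities are given as the Snellen denominator (eg, 32 for 20/32), and the function name is invented, not taken from study materials.

```python
# Illustrative sketch of the EVA/stereopsis referral rule; not study code.
# Assumed ETDRS chart lines, expressed as the Snellen denominator 20/x.
ETDRS_LINES = [10, 12.5, 16, 20, 25, 32, 40, 50, 63, 80, 100]

def eva_refer(acuity_r, acuity_l, has_stereopsis):
    """Return True (refer) per the criteria: <20/32 in either eye,
    a two-line interocular difference, or absent stereopsis at 300 arcsec."""
    if acuity_r > 32 or acuity_l > 32:            # cannot read the 20/32 line
        return True
    line_r = ETDRS_LINES.index(acuity_r)
    line_l = ETDRS_LINES.index(acuity_l)
    if abs(line_r - line_l) >= 2:                  # two-line difference between eyes
        return True
    return not has_stereopsis                      # no stereopsis demonstrated
```

Under this assumed chart layout, for example, 20/20 in one eye and 20/32 in the other is a two-line difference and would trigger a referral even though both eyes meet the 20/32 threshold.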
Professional Eye Examination
The professional eye examinations included routine clinical assessments, including ocular motility. A penlight was used to detect any structural deformities in or around the eyes. A slit lamp was utilized at the examiner’s discretion. Cycloplegic retinoscopy and indirect ophthalmoscopy were performed 30-40 minutes after instillation of 1 drop of 0.5% proparacaine, followed by 1 drop each of 1% cyclopentolate and 1% tropicamide. A second set of drops was instilled at the examiner’s judgment. In circumstances where retinoscopic reflexes were difficult to interpret, subjective cycloplegic refraction was also performed.
The subject was considered affected (having an eye problem) if the pediatric ophthalmologist detected any of the following abnormalities during the professional eye examination: clinically significant ocular pathology (including blepharoptosis, strabismus, corneal or lenticular opacities, and retinal disorders); hyperopia ≥ +4.00 D; hyperopia ≥ +2.00 D associated with esotropia; anisometropia ≥ +1.00 D spherical equivalent; astigmatism ≥ +1.25 D; or myopia ≥ −1.00 D sphere. The subject as a whole was considered an appropriate referral if either eye was affected.
Each examiner was masked to the results of the other examinations. The results were not discussed with the subject or parents until all examinations were completed. The examiner and the subject or parents were instructed not to discuss the performance on the previous examination with the subsequent examiner. Each examiner was given a separate copy of the data recording form. Statistical analysis was performed using SPSS for Windows (SPSS, Chicago, IL). The data analyses were performed using sensitivity and specificity along with confidence intervals. We also calculated conventional positive and negative likelihood ratios. These analyses were performed using an online calculator (http://faculty.vassar.edu/lowry/VassarStats.html). The results of the two tests were compared using a paired-data test (McNemar's test).
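The analyses described above can be verified with a few lines of arithmetic. The sketch below recomputes sensitivity, specificity, and the likelihood ratios from the patch-condition 2×2 counts reported in Table 1A, and the exact (binomial) McNemar p value from the discordant pairs. It is a worked check using standard formulas, not the SPSS or VassarStats code actually used in the study.

```python
# Worked check of the screening statistics, using standard definitions.
from math import comb

def screening_stats(tp, fp, fn, tn):
    """Sensitivity, specificity, and conventional likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)          # positive likelihood ratio
    lr_neg = (1 - sens) / spec          # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg

def mcnemar_exact_p(b, c):
    """Exact two-sided McNemar test on the discordant cells b and c."""
    n, k = b + c, min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

# Table 1A (EyeSpy with patch vs professional exam): tp=22, fp=6, fn=3, tn=41
sens, spec, lr_pos, lr_neg = screening_stats(tp=22, fp=6, fn=3, tn=41)
# sens = 0.88, spec ≈ 0.87, LR+ ≈ 6.9 (reported as 6.8), LR- ≈ 0.14

# Discordant pairs 6 and 3 give p ≈ 0.508, matching the reported McNemar result;
# the goggle condition's discordant pairs 12 and 3 give p ≈ 0.035.
```

The exact binomial form of McNemar's test reproduces both reported p values (0.508 for the patch condition and 0.035 for goggles), which suggests this is the variant the online calculator applied.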
Results
The average age of the 72 subjects was 11.4 ± 2.2 years (range, 8.2-15.8 years). Forty-three subjects were female (60%). Racial distribution was as follows: white, 42; African American, 20; Asian, 8; and Hispanic, 2. Table 1 illustrates the overall results of the study. The sensitivity, specificity, and conventional likelihood ratios are shown in Table 2. Results of the professional examination were not significantly different from the EyeSpy screening results using a patch (p = 0.508). In 63 of 72 subjects (87.5%) the results of EyeSpy concurred with those of the professional eye examination; in the remaining 9 subjects (12.5%), they differed. However, there was less agreement (a significant difference) between the professional examination and EyeSpy using goggles (p = 0.035): in 15 of 72 subjects (20.8%), the EyeSpy results using goggles did not concur with those of the professional examination.
Table 1A.
Professional examination and EyeSpy using patch
| EyeSpy (patch) | Prof exam: affected | Prof exam: not affected | Total |
|---|---|---|---|
| Refer | 22 | 6 | 28 |
| Pass | 3 | 41 | 44 |
| Total | 25 | 47 | 72 |
Table 1B.
Professional examination and EyeSpy using goggles
| EyeSpy (goggles) | Prof exam: affected | Prof exam: not affected | Total |
|---|---|---|---|
| Refer | 22 | 12 | 34 |
| Pass | 3 | 35 | 38 |
| Total | 25 | 47 | 72 |
Prevalence of visual impairment, as determined by the professional examination, was 25 of 72 subjects (34.7%). The most common cause for a refer rather than pass designation on the professional examination was myopia (21/25, 84%). Of the 21 children found to have a visual deficit, 17 had myopia in both eyes. For the remaining 4 subjects, the causes for visual impairment were as follows: astigmatism in one eye, astigmatism in both eyes, strabismus and astigmatism, and myopia in both eyes with anisometropia. If only the EVA analysis had been used, 25 subjects would have been referred for visual problems (visual acuity <20/32 in both eyes, n = 16; visual acuity <20/32 in one eye, n = 8; one subject failed stereopsis). Average time taken for EyeSpy using goggles was 4.43 ± 1.91 minutes (n = 71); for patching the average time was 3.92 ± 1.37 minutes (n = 59).
Discussion
We evaluated vision screening performed by lay examiners using EyeSpy automated software in school-aged children, with a professional examination by a pediatric ophthalmologist as the gold standard for comparison. An exact visual acuity is not obtained with the EyeSpy threshold testing employed in our study; the software simply reports whether a child is able to see 20/32 or better. The advantage of threshold testing over exact visual acuity testing is that it is performed much more rapidly. One disadvantage, however, is that some traditional vision screening programs employ a two-line interocular difference as an additional failing criterion; the EyeSpy software can be set to detect a two-line difference, but at the cost of a much longer testing time. A trained vision screener often notices when a child reads the eye chart rapidly with one eye but is hesitant or slow with the other, even though he or she may ultimately identify the letters correctly. Based on early clinical experience, the software manufacturers determined that a response time 60% longer in one eye than the other while wearing the red/blue goggles was suggestive of a true visual disparity; when a patch was used, the software was set to detect a 100% (doubling) or greater difference in testing speed between the eyes. Either difference triggered a referral rather than a pass for that child.
The sensitivity value was 88% when the EyeSpy video-game vision screening was compared with the gold-standard professional eye examination, while specificity was 87% and 74% (using eye patch and goggles, respectively). The published reports for vision screening in younger children have sensitivities and specificities around 85% to 90%.12-18 The results obtained in our study of school-aged children cannot be directly compared with published studies reporting the screening of younger children; however, the results of our study comparing EyeSpy with a professional eye examination are encouraging.
When compared to the gold-standard professional eye examination, sensitivity was 88% in all three comparisons (EyeSpy using a patch, EyeSpy using goggles, and EVA/stereopsis), suggesting that EyeSpy was comparable to EVA/stereopsis in correctly identifying subjects who truly had vision impairment. Specificity, however, was highest for EVA/stereopsis (94%, vs 87% for EyeSpy with a patch and 74% for EyeSpy with goggles), suggesting that EVA/stereopsis was better at correctly identifying subjects who truly did not have vision impairment. Among the EyeSpy conditions, the patch outperformed the goggles. We did not set out to directly compare the results of EyeSpy with those of EVA/stereopsis, but these results were also encouraging, with sensitivity of 92% and specificity of 89%.
In 9 of 72 subjects (12.5%), the results of EyeSpy did not concur with those of the professional eye examination. Although this difference was not statistically significant (p = 0.508), it may be clinically important. EyeSpy using a patch performed better than EyeSpy using goggles. One of the most important concerns about using goggles to dissociate the two eyes is the possibility of bleed-through, where the eye not being tested is able to see the testing optotypes, as opposed to being fully blocked by an eye patch. Additionally, viewing through the colored goggles reduces stimulus contrast. It is possible that in the presence of refractive error and amblyopia, the reduced contrast could influence the visual acuity pass/fail threshold results.
The odds of having a failed examination by a pediatric ophthalmologist increased 6.8 times when EyeSpy results suggested a vision problem. When an EyeSpy test was negative for a vision problem, the odds of having a vision problem decreased. We reported the likelihood ratio instead of predictive values. The predictive values are dependent on the prevalence of the disease, while sensitivity, specificity, and likelihood ratio are independent of prevalence of the disease.19 The higher the prevalence of the disorder, the higher the positive predictive value of the test.20 Thus, with a different prevalence of at-risk subjects, the same test with the same cutoff value has greatly differing predictive values. The prevalence of children having eye problems that require either glasses or treatment for strabismus was reported as about 10% of 5-to-6-year-old children.21 Although we intended to mimic the general population as much as possible, our study participants had a higher percentage of children with eye problems than in the general population (34.7%). For example, many children came because they failed a vision screening examination in their school. Thus we are not reporting predictive values.
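The prevalence dependence described above can be made concrete with the standard formula for positive predictive value. The sketch below is purely illustrative: it plugs in this study's approximate sensitivity and specificity (patch condition) at the study prevalence and at a prevalence closer to the cited general-population figure, showing why predictive values were not reported.

```python
# Illustrative arithmetic: PPV depends strongly on prevalence, while
# sensitivity, specificity, and likelihood ratios do not.

def ppv(sens, spec, prev):
    """Positive predictive value from sensitivity, specificity, and prevalence."""
    true_pos = sens * prev
    false_pos = (1 - spec) * (1 - prev)
    return true_pos / (true_pos + false_pos)

ppv(0.88, 0.87, 0.347)   # at this study's 34.7% prevalence: PPV ≈ 0.78
ppv(0.88, 0.87, 0.10)    # at ~10% general-population prevalence: PPV ≈ 0.43
```

With the same test and the same cutoff, PPV falls from roughly 78% to roughly 43% as prevalence drops from the study population to a general-population level, which is exactly why likelihood ratios were reported instead.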
Average total testing time, from the start of the game to completion, was 4.43 minutes for EyeSpy using goggles and 3.92 minutes using a patch, which seems comparable to other screening tools. Salcido and colleagues22 reported a mean time of 2.5 minutes using photoscreening and 5.9 minutes with traditional screening.
We did not use the uniform guidelines for reporting results of vision screening studies,23 because those guidelines were developed for preschool vision screening, whereas our study included school-aged children. EyeSpy vision screening serves not only to identify amblyogenic risk factors but also any cause of visual impairment in school-aged children that could negatively affect their quality of life. Thus our results may differ in the rate of failed screenings and, in particular, the rate of identification of visual impairment.
This study has several limitations. For this pilot trial, we enrolled older children with average comprehension, attention, and cooperation skills, because the study design required several tests to be performed on the same day and thus longer periods of attention than a typical vision screening. Further study in a younger age group is warranted based on the promising results in our older cohort. The small sample size of this pilot study is an additional limitation; subsequent studies could include a wider range of ages and larger numbers. Although we avoided recruiting children from the pediatric ophthalmology clinic, the prevalence of visual impairment was high in our study, and the number of false positives may be higher in a normal population. The EyeSpy software used in the study has since undergone several upgrades, which may improve outcomes, shorten testing time, and minimize bleed-through when red-blue goggles are used; the results of our current study therefore may not be directly comparable to those of future studies performed after software upgrades.
In conclusion, EyeSpy has potential for use as a vision screener for school-aged children: compared to the results of professional eye examinations, it effectively identifies children with visual impairment.
Supplementary Material
Table 1C.
Professional examination and EVA
| EVA/stereopsis | Prof exam: affected | Prof exam: not affected | Total |
|---|---|---|---|
| Refer | 22 | 3 | 25 |
| Pass | 3 | 44 | 47 |
| Total | 25 | 47 | 72 |
Table 2.
Vision screening results of EyeSpy software using patch or goggles and EVA/stereopsis testing to the reference standard professional examination
| | EyeSpy (patch) vs professional examination | EyeSpy (goggles) vs professional examination | EVA/stereopsis vs professional examination |
|---|---|---|---|
| Sensitivity (CI) | 88% (68-97) | 88% (68-97) | 88% (68-97) |
| Specificity (CI) | 87% (73-95) | 74% (59-85) | 94% (81-98) |
| Positive likelihood ratio (CI) | 6.8 (3.21-14.75) | 3.4 (2.07-5.73) | 13.8 (4.57-41.6) |
| Negative likelihood ratio (CI) | 0.13 (0.04-0.40) | 0.16 (0.05-0.47) | 0.13 (0.04-0.37) |
EVA, electronic visual acuity; CI, confidence interval.
Acknowledgments
Supported in part by a grant from the Amblyopia Foundation of America, NIH grant EY-14793, and an unrestricted grant to MUSC-SEI from Research to Prevent Blindness, Inc., New York, NY.
Footnotes
The authors have no financial or proprietary interest in any product mentioned herein.
Presented in part as a poster presentation at the 35th Annual Meeting of the American Association for Pediatric Ophthalmology and Strabismus, San Francisco, CA, April 17-21, 2009.
References
1. Ganley JP, Roberts J. Eye conditions and related need for medical care among persons 1-74 years of age: United States, 1971-72. Vital Health Stat. 1983;11:21.
2. Lai YH, Hsu HT, Wang HZ, Chang SJ, Wu WC. The visual status of children ages 3 to 6 years in the vision screening program in Taiwan. J AAPOS. 2009;13:58-62.
3. Proctor SE. Vision screening: New and time-honored techniques for school nurses. Nasnewsletter. 2009;24:62-8.
4. Rutstein RP, Corliss DA. BVAT distance vs near stereopsis screening of strabismus, strabismic amblyopia and refractive amblyopia; a prospective study of 68 patients. Binocul Vis Strabismus Q. 2000;15:229-36.
5. Rose K, Younan C, Morgan I, Mitchell P. Prevalence of undetected ocular conditions in a pilot sample of school children. Clin Experiment Ophthalmol. 2003;31:237-40.
6. Jamali P, Fotouhi A, Hashemi H, Younesian M, Jafari A. Refractive errors and amblyopia in children entering school: Shahrood, Iran. Optom Vis Sci. 2009;86:364-9.
7. Scheiman MM, Hertle RW, Beck RW, et al. Randomized trial of treatment of amblyopia in children aged 7 to 17 years. Arch Ophthalmol. 2005;123:437-47.
8. Ferris FL 3rd, Kassoff A, Bresnick GH, Bailey I. New visual acuity charts for clinical research. Am J Ophthalmol. 1982;94:91-6.
9. Holmes JM, Beck RW, Repka MX, et al.; Pediatric Eye Disease Investigator Group. The amblyopia treatment study visual acuity testing protocol. Arch Ophthalmol. 2001;119:1345-53.
10. Moke PS, Turpin AH, Beck RW, et al. Computerized method of visual acuity testing: Adaptation of the amblyopia treatment study visual acuity testing protocol. Am J Ophthalmol. 2001;132:903-9.
11. Beck RW, Moke PS, Turpin AH, et al. A computerized method of visual acuity testing: Adaptation of the early treatment of diabetic retinopathy study testing protocol. Am J Ophthalmol. 2003;135:194-205.
12. Ying GS, Kulp MT, Maguire M, Ciner E, Cyert L, Schmidt P. Sensitivity of screening tests for detecting vision in preschoolers-targeted vision disorders when specificity is 94%. Optom Vis Sci. 2005;82:432-8.
13. Moll AM, Rao RC, Rotberg LB, Roarty JD, Bohra LI, Baker JD. The role of the random dot Stereo Butterfly test as an adjunct test for the detection of constant strabismus in vision screening. J AAPOS. 2009;13:354-6.
14. Arthur BW, Riyaz R, Rodriguez S, Wong J. Field testing of the plusoptiX S04 photoscreener. J AAPOS. 2009;13:51-7.
15. Ottar WL, Scott WE, Holgado SI. Photoscreening for amblyogenic factors. J Pediatr Ophthalmol Strabismus. 1995;32:289-95.
16. Kennedy RA, Thomas DE. Evaluation of the iScreen digital screening system for amblyogenic factors. Can J Ophthalmol. 2000;35:258-62.
17. Schmidt P, Maguire M, Dobson V, et al. Comparison of preschool vision screening tests as administered by licensed eye care professionals in the Vision in Preschoolers Study. Ophthalmology. 2004;111:637-50.
18. Van Eenwyk J, Agah A, Giangiacomo J, Cibis G. Artificial intelligence techniques for automatic screening of amblyogenic factors. Trans Am Ophthalmol Soc. 2008;106:64-73; discussion 73-4.
19. Altman DG, Bland JM. Diagnostic tests 2: Predictive values. BMJ. 1994;309:102.
20. Stewart SH, Connors GJ. Screening for alcohol problems: What makes a test effective? Alcohol Res Health. 2004;28:5-16.
21. Traboulsi EI, Cimino H, Mash C, Wilson R, Crowe S, Lewis H. Vision First, a program to detect and treat eye diseases in young children: The first four years. Trans Am Ophthalmol Soc. 2008;106:179-85; discussion 185-6.
22. Salcido AA, Bradley J, Donahue SP. Predictive value of photoscreening and traditional screening of preschool children. J AAPOS. 2005;9:114-20.
23. Donahue SP, Arnold RW, Ruben JB. Preschool vision screening: What should we be detecting and how should we report it? Uniform guidelines for reporting results of preschool vision screening studies. J AAPOS. 2003;7:314-6.