Medical Devices (Auckland, N.Z.). 2019 Jan 15;12:21–40. doi: 10.2147/MDER.S186529

Undermining a common language: smartphone applications for eye emergencies

Jennifer M Charlesworth,1,2 Myriam A Davidson2
PMCID: PMC6339640  PMID: 30697086

Abstract

Background

Emergency room physicians are frequently called upon to assess eye injuries and vision problems in the absence of specialized ophthalmologic equipment. Technological applications that can be used on mobile devices are only now becoming available.

Objective

To review the literature on the evidence of clinical effectiveness of smartphone applications for visual acuity assessment distributed through two vendors (Google Play and iTunes).

Methods

The websites of two mobile technology vendors (iTunes and Google Play) in Canada and Ireland were searched on three separate occasions using the terms “eye”, “ocular”, “ophthalmology”, “optometry”, “vision”, and “visual assessment” to determine what applications were currently available. Four medical databases (Cochrane, Embase, PubMed, Medline) were subsequently searched with the same terms AND (“mobile” OR “smartphone”) for papers in English published between 2010 and 2017.

Results

A total of 5,024 Canadian and 2,571 Irish applications were initially identified. After screening, 44 were retained. Twelve relevant articles were identified in the health literature. After screening, only one validation study referred to one of our identified applications, and it validated the application only partially as being useful for clinical purposes.

Conclusion

Mobile device applications in their current state are not suitable for emergency room ophthalmologic assessment, because systematic validation is lacking.

Keywords: visual assessment, visual acuity and emergency medicine, epidemiology, methodology, ophthalmology, ocular

Background

Clinical utility of available smartphone applications for emergency health care providers who evaluate ophthalmologic complaints has not yet been established. Emergency room physicians evaluate a variety of ophthalmologic emergencies, including acute glaucoma, retinal detachment, and episcleritis/scleritis. These emergencies potentially threaten vision and require careful visual examination. A quick, accessible, portable electronic tool that evaluates vision in patients of all ages at the bedside is required.1–4 Before use, however, such tools need to be rigorously evaluated. Transferring a tool from its paper to smartphone version does not necessarily mean that reliability and validity remain intact.5–7

Visual acuity (VA) tools (eg, Eye Handbook, Visual Acuity XL) are available on smartphones, and are employed variably in emergency departments. VA tests give clinicians an estimate of a patient’s ability to perceive spatial detail,8 and are one aspect of a full assessment. VA is the easiest and most important test for bedside evaluation, because it correlates positively with both quality of life and degree of limitation in independent activities of daily living, especially in the geriatric population.9–11

Evaluation of VA faces a number of challenges,12–14 as it comprises detection acuity (the ability to detect whether a visual stimulus is present or absent), resolution acuity (the ability to resolve spatial detail from the background), and recognition acuity (the ability to identify and recognize a target). This evaluation can be especially difficult when assessing young children13 or the elderly.15

Today, eye care professionals use the Bailey–Lovie chart14 and the Early Treatment Diabetic Retinopathy Study (ETDRS) chart.16 Both tools have standard letter optotypes (letter-like images), with five optotypes per line. These tools correlate well with ocular pathology in the adult population and are the gold standard for VA. In research circles, VA is now expressed in terms of logarithm of the minimum angle of resolution (logMAR) equivalents, as opposed to Snellen equivalent distances (eg, 20/40 feet [6/12 m]), although the latter are often still used in modern emergency departments.13,17
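
For readers who move between the two notations, the conversion is a simple base-10 logarithm of the minimum angle of resolution. The following minimal sketch (illustrative only; not part of the original review) shows the relationship:

```python
import math

def snellen_to_logmar(numerator: float, denominator: float) -> float:
    """Convert a Snellen fraction (eg, 20/40) to logMAR.

    MAR (minimum angle of resolution) = denominator / numerator;
    logMAR is its base-10 logarithm, so 20/20 -> 0.0 and 20/40 -> ~0.30.
    """
    return math.log10(denominator / numerator)

print(snellen_to_logmar(20, 20))  # 0.0  (normal acuity)
print(snellen_to_logmar(20, 40))  # 0.301...
print(snellen_to_logmar(6, 12))   # 0.301... (metric 6/12 m, same acuity)
```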

There are smartphone applications for VA tests that could replace the older paper versions. The problem is that these applications may not have undergone the rigorous methodological assessment necessary for either screening out pathological conditions or arriving at an accurate diagnosis. Previous reviews of smartphone applications in other areas of medicine have demonstrated substantial variation in quality.17–20 Quality is especially important in acute care settings, where urgent treatment decisions need to be made. The aim of this paper was to assess the evidence for the usefulness and validity of selected smartphone applications intended for ophthalmologic assessment of acute eye emergencies.

Methods

This systematic review identified relevant smartphone applications in Canada and Ireland through a search of the websites of two mobile technology vendors (iTunes and Google Play) on three separate occasions between November 2014 and July 2017, using the search terms “eye”, “ocular”, “ophthalmology”, “optometry”, “vision”, and “visual assessment”. Second, we searched four medical databases (Cochrane, Embase, PubMed, Medline) for research papers on the applications we had identified, using the same search terms with the addition of “mobile” or “smartphone”. We included only papers written in English and published from 2010 to 2017. Throughout the analysis, the two authors performed data extraction independently, and disagreements about relevance were resolved by discussion. This systematic review thus evaluates existing smartphone applications marketed to health care professionals for the determination of VA in Canada and Ireland.
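
Purely as an illustration of how these terms combine (the actual searches were run manually through each store and database interface), the Boolean pairing of eye terms with device terms can be expressed as follows:

```python
# Illustrative only: shows how the search terms were combined; the review
# itself was performed manually in each store and database interface.
eye_terms = ["eye", "ocular", "ophthalmology", "optometry", "vision",
             "visual assessment"]
device_terms = ["mobile", "smartphone"]

# Each eye-related term AND ("mobile" OR "smartphone"), per the Methods.
queries = [f'"{term}" AND ({" OR ".join(device_terms)})' for term in eye_terms]
for query in queries:
    print(query)  # eg, "ocular" AND (mobile OR smartphone)
```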

Application search

Identification of mobile applications

An iterative, ongoing search for applications was conducted in both countries in the iPhone (iTunes App Store) and Android (Google Play) online stores, with search terms (Figure S1) adapted for each store. Both authors independently reviewed applications for inclusion on the basis of a priori criteria (Figure S2). The final update was completed in November 2017. The search was limited to the two stores listed, as they represent the majority (99.7%) of smartphone user platforms according to the International Data Corporation Worldwide Quarterly Mobile Phone Tracker (May 2017) and make up the majority of the market share in the two target regions.21

Selection criteria

English-language applications marketed for the evaluation of vision by health care professionals were screened by title and description. Applications targeted at educational purposes/knowledge dissemination, games, self-monitoring, multimedia/graphics, recreational health and/or fitness, business, travel, weather, or sports, or that were clinically outdated, were excluded. Where it was unclear whether an application should be included, any linked websites were reviewed further.
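
A rough sketch of this two-stage title/description screen is shown below. The keyword lists and example records are invented for illustration; the actual screening was performed manually by both authors against the a priori criteria in Figure S2.

```python
# Hypothetical sketch of the two-stage screen described above.
RELEVANT = ("visual acuity", "eye chart", "vision test", "snellen", "logmar")
EXCLUDED = ("game", "quiz", "fitness", "travel", "weather", "sport")

def looks_relevant(text):
    text = text.lower()
    return any(k in text for k in RELEVANT) and not any(
        k in text for k in EXCLUDED)

def screen(apps):
    # Stage 1: screen by title; stage 2: screen survivors by description.
    by_title = [a for a in apps if looks_relevant(a["title"])]
    return [a for a in by_title if looks_relevant(a["description"])]

apps = [
    {"title": "Eye Chart Pro - Snellen",
     "description": "Test visual acuity with Snellen and ETDRS charts."},
    {"title": "Eye Trivia Game",
     "description": "A fun quiz about eyes."},
]
print([a["title"] for a in screen(apps)])  # ['Eye Chart Pro - Snellen']
```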

Data extraction and encoding

Data elements extracted included year of release, affiliation (academic, commercial), target as stated, content source, and cross-platform availability (for use with tablets and/or computers). A preliminary coding system was developed based on the applications found in the first store searched.
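
By way of illustration, an extracted record could be represented as follows. The field names mirror the elements above, but the structure shown is illustrative rather than the exact coding sheet used:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AppRecord:
    """One extracted record per application (illustrative structure)."""
    name: str
    release_year: Optional[int]
    affiliation: str              # "academic" or "commercial"
    stated_target: str            # which HCPs the store text names
    content_source: Optional[str]
    cross_platform: bool          # also usable on tablets/computers
    validation_claimed: bool
    disclaimer: Optional[str]

record = AppRecord(
    name="Visual Acuity XL", release_year=None, affiliation="commercial",
    stated_target="vision care specialists", content_source=None,
    cross_platform=False, validation_claimed=True,
    disclaimer="not validated on iPad mini with Retina display",
)
```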

Health literature search

A systematic search was conducted of the four major databases (Figure 1) from January 1, 2010 to July 31, 2016. The search strategy was developed in consultation with a medical librarian and methodological search professional (Figure S1). This was supplemented with a review of relevant reference material for any missed literature.

Figure 1. Identification of relevant smartphone studies.

Identification of articles for literature review

Selection criteria, data extraction, and coding

Twelve relevant articles were identified in the literature using the processes described (Figure 1). Additional exclusion criteria for articles included preclinical studies or those addressing clinically specialized ophthalmologic/neurological populations not seen in the emergency room (Table S1).

Results

Applications

A total of 44 applications were retained in the final data set after screening of 7,595 applications. In the Canadian iTunes store, 2,526 applications met our initial search criteria. Of those, 927 were unique and suitable for detailed review (Figure 2); 229 passed screening by title, of which 21 were selected on the basis of their descriptions. Similar selections were made in Google Play. The results were combined and four additional duplicates removed, for a final Canadian data set of 24 applications, whose characteristics are summarized in Table 1a. The Irish iTunes store had substantially fewer applications and duplicates: 1,100 applications were identified and 307 duplicates removed, leaving 793 applications to be screened by title (Figure 3).

Figure 2. Selection of Canadian smartphone applications.

Table 1a. Results of systematic application review, Canada (n=24)

| App name | Provider | Cost ($) | Store | Explicitly specifies which HCP | For whom | Explicitly mentions validation | How | Disclaimer present | Disclaimer (up to 44 words) |
|---|---|---|---|---|---|---|---|---|---|
| AmblyoCare | RISC Software GmbH | Free | Google Play | Yes | Ophthalmologists, orthoptists, ESIT specialists, pediatrists, neurologists, medical specialists | No | n/a | Yes | “the software may only be used in those European countries where one or both of these [English or German] languages is defined as an official language” |
| AmblyopiaVA | Manuel Rodriguez Vallejo | 21.99 (iTunes), 26.78 (Google Play) | Both | Yes | Optometry and ophthalmology practice | No | n/a | No | n/a |
| ClinicCSF | Manuel Rodriguez Vallejo | 59.99 | iTunes | Yes | Optometrists and ophthalmologists | Yes | Measured the chromatic characteristics of the devices [computers or portable devices such as tablets or iPad] with the Spyder4Elite colorimeter | Yes | For tablets, not for use on smartphones |
| Eye Chart HD - Screen Vision with Pocket Snellen, Sloan, Near Vision, and Amsler Grid Test | Dok LLC | Free | iTunes | No | “Triage, house visits, patients who have memorized the office chart” selected as implied HCP | No | n/a | Yes | “Should not be used as a primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall-based Snellen charts.” |
| Eye Chart Premium | Dok LLC | 39.99 | iTunes | No | “Triage and hospital bedside, testing patients who have memorized the office chart” selected as implied HCP | No | n/a | Yes | “Should not be used as a primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall-based Snellen charts.” |
| Eye Chart Pro Test Vision and Visual Acuity better with Snellen, Sloan, ETDRS, and Near Vision! | Dok LLC | Free | iTunes | No | “Triage, house visits, patients who have memorized the office chart” selected as implied HCP | No | n/a | Yes | “Eye Test Pro should not be used as a primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall-based Snellen charts.” |
| Eye Chart Professional | Dok LLC | 139.99 | iTunes | No | “House visits, hospital bedside testing, or mounted to a wall” selected as implied HCP | No | n/a | No | n/a |
| Eye Emergency Manual | Agency for Clinical Innovation | Free | Both (iTunes and Google Play) | Yes | Medical, nursing, and allied health clinicians in emergency departments across New South Wales | No | n/a | Yes | “These guidelines have not undergone a formal process of evidence based clinical practice guideline development however they are the result of consensus opinion determined by the expert clinician working group” |
| Eye Handbook | Cloud Nine Developer | Free | Both (iTunes and Google Play) | No | “Anyone involved in eye care” selected as implied HCP | No | n/a | No | n/a |
| EyeTest Droid | George YX Kong | Free | Google Play | Yes | General practitioners, ophthalmologists, optometrists, opticians, and medical and optometry students | No | n/a | Yes | “Note eyeTests is not a substitute for a full ophthalmic examination by an optometrist of [sic] ophthalmologist” |
| EyeTestsFree | George Kong Software | Free | iTunes | Yes | General practitioners, ophthalmologists, optometrists, opticians, and medical and optometry students | No | n/a | Yes | “Note eyeTests is not a substitute for a full ophthalmic examination by an optometrist of [sic] ophthalmologist” |
| EyeTests Easy | George Kong Software | Free | iTunes | Yes | General practitioners, ophthalmologists, optometrists, opticians, and medical and optometry students | No | n/a | Yes | “Note eyeTests is not a substitute for a full ophthalmic examination by an optometrist of [sic] ophthalmologist” |
| Morphision | Constantin Chifor | Free | iTunes | Yes | Ophthalmic professionals | No | n/a | No | n/a |
| NEOD Acuity | NEOD Inc. | 209.99 | iTunes | Yes | Vision care specialists | No | n/a | No | n/a |
| OKN+ | Medical | 3.99 | iTunes | Yes | Healthcare professionals | No | n/a | No | n/a |
| OphthDocs Eye App | Sheng Chiong Hong | Free | iTunes | Yes | Healthcare providers, ophthalmologists and optometrists | Yes | By ophthalmologist. For more information visit www.ophthalmicdocs.com | No | Website no longer exists |
| Pocket Eye Exam | Nomad | 2.79 (iTunes), Free (Google Play) | Both | Yes | Neuro-ophthalmologists, neurologists, optometrists, medical school students and residents | Yes | Updated for Android 4.0 devices. Not tested for 5.0+. Tested on HDPI screen (Samsung Captivate) and works. MDPI and LDPI screens do not display Snellen and pupil chart correctly. | No | n/a |
| Professional Clinical | Ossibus Software | 26.52 | Google Play | Yes | Optometrists, ophthalmologists, neurologists, medical doctors, optometric technicians, nurses, EMTs, medical assistants, and other healthcare providers | No | n/a | Yes | “These tools are very useful for screening purposes, but are not necessarily recommended for diagnosis” … “This app is intended for informational purposes only and not for the diagnosis, treatment, or cure of any medical conditions or illness.” |
| Random Eye Chart Generator | Dok LLC | 2.79 | iTunes | No | “Triage, house visits, patients who have memorized the office chart” selected as implied HCP | No | n/a | Yes | “Should not be used as a primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall-based Snellen charts.” |
| MultiQuity | Sightrisk | Free | iTunes | Yes | Eye care professionals | No | n/a | Yes | “The results and interpreation [sic] of the multiQuity suite are to be entrusted to eyecare professionals only” |
| Smart Optometry | Smart Optometry | 9.99 | iTunes | Yes | Eye-care practitioners | No | n/a | No | “Precise: Eye testing and screening oftern [sic] requires eye-care professionals to make calculations - giving the room for error. Eliminate this risk with precise calculations done by our Smart Optometry application. Our test are as precise if not more than the currently use on-the-wall testing equipment” |
| Smart Optometry Lite | Smart Optometry | Free | iTunes | Yes | Eye-care practitioners | No | n/a | No | “Precise: Eye testing and screening oftern [sic] requires eye-care professionals to make calculations - giving the room for error. Eliminate this risk with precise calculations done by our Smart Optometry application. Our test are as precise if not more than the currently use on-the-wall testing equipment” |
| Visual Acuity XL | Kybervision Japan LLC | 69.99 | iTunes | Yes | Vision care specialists | Yes | “Validated by a study from the University of Auckland, ‘An assessment of the iPad as a testing platform for distance visual acuity in adults’ published in BMJ Open, June 2013” | Yes | “This app has not been validated yet on ipad [sic] mini with retina display” |
| VisualFields easy | George Kong Software | Free | iTunes | Yes | Clinicians in clinic or at the bedside | No | n/a | Yes | “This app is not designed to substitute formal visual fields testing by optometrists or opthalmologists [sic]. Not all abnormalities found on visual fields test indicate an actual problem. This App does not provide formal interpretation of visual fields” |

Table 1b. Results of systematic application review, Ireland (n=24)

| App name | Provider | Cost ($) | Store | Explicitly specifies which HCP | For whom | Explicitly mentions validation | How | Disclaimer present | Disclaimer |
|---|---|---|---|---|---|---|---|---|---|
| AmblyoCare | RISC Software GmbH | Free | Google Play | Yes | Ophthalmologists, orthoptists, ESIT specialists, pediatrists, neurologists, medical specialists, and affected people under medical guidance | No | n/a | No | n/a |
| AmblyopiaVA | Manuel Rodriguez Vallejo | 21.99 (iTunes), 23.27 (Google Play) | Both | Yes | Optometrists and ophthalmologists | No | n/a | No | n/a |
| AMD Pro, A Metamorphopsia Det. | app4eyes | 69 | Google Play | Yes | Ophthalmologists, optometrists, opticians | No | n/a | No | n/a |
| AMD, A Metamorphopsia Detector | app4eyes | Free | Google Play | Yes | Ophthalmologists, optometrists, opticians | No | n/a | No | n/a |
| Central Vision Test | healthcare4mobile | Free | Google Play | Yes | Implied mobile HCP | No | n/a | Yes | “this application is not intended to replace optician’s regular full examination. We recommend you get a full eye test after using it”. |
| Eye Chart HD Screen Vision with Pocket Snellen, Sloan, Near Vision, and Amsler Grid Test | Dok LLC | Free | iTunes | Yes | Eye care professionals | No | n/a | Yes | “Though this unique, pocket-sized randomizable eye chart should not be used as primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall based Snellen charts”. |
| Eye Chart Premium | Dok LLC | 29.99 | iTunes | Yes | Doctor or nurse | No | n/a | Yes | “Though this unique, notebook-sized randomizable eye chart should not be used as primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall-based Snellen charts”. |
| Eye Emergency Manual | Agency for Clinical Innovation | Free | Both (iTunes and Google Play) | Yes | All medical, nursing, and allied health clinicians in emergency departments across New South Wales | No | n/a | No | n/a |
| Eye Handbook | Cloud Nine Development LLC | Free | Both (iTunes and Google Play) | Yes | iTunes: eye care professionals; Google Play: for anyone involved in eye care. Note: top of description states “Complete overhaul of previous design. Extensive reconstruction of form and function. Now available on Android Tablet!” | No | n/a | No | n/a |
| eyeTests Easy | George Kong Softwares | Free | iTunes | Yes | Eye care clinicians including general practitioners, ophthalmologies [sic], optometrists, opticians, medical and optometry students | No | n/a | Yes | Note: EyeTests is not a substitute for full ophthalmic examination by an optometrist pr [sic] ophthalmologist |
| eyeTestsDroid | George YX Kong | Free | Google Play | Yes | General practitioners, ophthalmologists, optometrists, opticians, medical and optometry students | No | n/a | Yes | Note: eyeTests is not a substitute for a full ophthalmic examination by an optometrist or ophthalmologist. |
| LogMar Snellen | Hilmi Software | 15.65 | Google Play | Yes | Doctor or nurse | No | n/a | No | n/a |
| Medmont AT20P | Medmont Australia PT | Free | Google Play | No | “people with patients”; implied HCP | No | n/a | No | n/a |
| Optical Tool | Brent McCardle | 2.99 | iTunes | Yes | Optometric office, optometrist, and opticians | No | n/a | No | n/a |
| Optician | familion.ru | Free | Google Play | Yes | Students, nurses, and doctors | No | n/a | Yes | please note that these tools provided in this app are used at your own risk and we do not make any claim as to their effectiveness as a testing tool. You should always seek the advice of a trained professional where possible. |
| Pocket Eye Exam | Nomad | 1.99 (iTunes), 1.40 (Google Play) | Both | Yes | Neuro-ophthalmologists, neurologists, optometrists, medical school students and residents | No | n/a | Yes | No disclaimer for iTunes. Google Play (after overhaul): Tested on HDPI screen (Samsung Captivate) and works; MDPI and LDPI screens do not display Snellen and pupil chart correctly. Updated for Android 4.0 devices, not tested for 5.0+ |
| Pocket Ophthalmology | centricweb.co.uk | Free | Google Play | No | “This app aims to discuss treatment protocols, tests, and induction guides followed by a tertiary ophalmology [sic] clinic in the UK”; implied HCP | No | n/a | No | n/a |
| Professional Clinic | Ossibus Software | 27.99 | Google Play | Yes | Optometrists, ophthalmologists, neurologists, medical doctors, optometric technicians, nurses, EMTs, medical assistants, and other healthcare providers | No | n/a | Yes | This app is intended for informational purposts [sic] only and not for the diagnosis, treatment, or cure of any medical condition or illness. The app should not be used as a substitute for professional medical care or interpretation. Ossibus Software and its writers are not responsible or liable for the use or misuse of the information contained in this app. |
| Random Eye Chart Generator | Dok LLC | 1.99 | iTunes | Yes | Eye care professionals | No | n/a | Yes | This unique, pocket-sized randomizable eye chart should not be used as primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall based Snellen charts. |
| Smart Optometry | Smart Optometry | 4.99 | iTunes | Yes | Eye care professionals | No | n/a | No | n/a |
| StereoTAB | Manuel Rodriguez Vallejo | 19.99 | iTunes | Yes | Optometric and ophthalmology practice | No | n/a | No | n/a |
| Vision Tests (Complete Package) | Manuel Rodriguez Vallejo | 94.99 | iTunes | Yes | Optometrists and ophthalmologists | Yes | “gold standards or current vision tests”. Indicates that further information is available on the developer’s website; however, the link from the iTunes website is not available. test-eye.com, the developer’s website, includes self-referencing peer-reviewed articles published after the literature search was performed. | No | n/a |
| Visual Acuity XL | Kybervision Japan LLC | 49.99 | iTunes | Yes | Vision care specialists | Yes | Peer-reviewed journal (BMJ Open, June 2013) | No | n/a |
| visualFields easy | George Kong Softwares | Free | iTunes | Yes | Clinicians | No | n/a | Yes | This app is not designed to substitute formal visual fields testing by optometrists or ophthalmologists. Not all abnormalities found on VisualFields test indicate an actual problem. This app does not provide formal interpretation of visual fields. Fixation losses are not tracked in this version of the app. |

Abbreviations: ESIT, early support for infants and toddlers; HCP, health care professional; EMTs, emergency medical technicians; HDPI, high dots per inch; MDPI, medium dots per inch; LDPI, low dots per inch; n/a, not applicable.

Figure 3. Selection of Irish smartphone applications.

Of the two Canadian stores, Google Play had a much lower percentage of on-target applications (0.5% vs 2.3%), and iTunes had a much higher percentage of paid versus free applications (52% vs 29%; Figure 4). Applications in the Google Play store were more likely to be free and were less expensive overall. This trend was also seen in the Irish stores (Figure 5). In the iTunes store, only a single application cost above $200, but 19% of the identified applications carried a cost of $50 or more (Figure 4). Of the applications that appeared in both stores, 50% (n=2) remained free regardless of where they appeared; one was more expensive in the iTunes store, and one was more expensive in Google Play. The same pattern held in the Irish stores.

Figure 4. Cost of Canadian smartphone applications by platform.

Figure 5. Cost of Irish smartphone applications by platform.

None of the listed providers are academic institutions. One application (4%) referred to an academic affiliation in the form of a validation study from the University of Auckland. The majority (75%, n=18) of the identified applications originated from companies. Individuals provided the remaining 21% (n=5) of applications, while 4% (n=1) of applications did not list any provider.

Eighteen of 24 applications in the two Canadian stores vs 22 of 24 in the two Irish stores (75% vs 92%) explicitly indicated which health care professionals they were directly marketed toward. Targeted professionals included medical doctors, ophthalmologists, opticians, and medical students, to name a few. However, 58% of these applications in Canadian stores vs 42% in Irish stores included some sort of disclaimer, eg, “Should not be used as a primary visual acuity measuring tool, it can provide a handy rough vision screen when a chart is not available, or it can be used to complement static, wall-based Snellen charts” (Eye Chart HD by Dok LLC).

In the Canadian stores, four applications (17%) claimed they were validated, although none, when investigated, had used large-scale head-to-head studies under clinical conditions or included a variety of modern smartphones or tablet technology. In the Irish stores, even fewer (two of 24, 8%) applications stated some type of validation. Of the four Canadian applications claiming validation, Visual Acuity XL showed the most methodological rigor, as evidenced by its reference to an affiliation with the University of Auckland and a 2013 validation study by Black et al.22 All four were available in the iTunes store, but only one, Pocket Eye Exam by Nomad, was also available on the Google Play platform. In the Irish stores, applications claiming validation appeared only in the iTunes store (Table 2).

Table 2. Quality assessment of selected apps

| App name | Country | Provider | Vendor | Quality assessment |
|---|---|---|---|---|
| AmblyoCare | Both | RISC Software | Google Play | *** |
| AmblyopiaVA | Canada only | Manuel Rodriguez Vallejo | Both | * |
| Eye Handbook | Both | Cloud Nine Developer | Both | *** |
| Pocket Eye Exam | Both | Nomad | Both | ** |
| Visual Acuity XL | Both | KyberVision Japan | iTunes | *** |

Notes: *no measures of internal or external validation noted; **some discussion of internal validation or research noted; ***partial peer or government validation.

Of note, applications varied by country, although 58% (14 of 24) of applications were present in both countries. Each country had ten distinct applications. Moreover, within a given country, applications were not universally available for both platforms. For example, KyberVision Japan’s Visual Acuity XL, the most robustly validated application, was available in both countries, but only on the iTunes platform.

Overall, only 21% (n=5) of applications in the Canadian store were affiliated with academic institutions, which (given limited information) was our best proxy measure for academic rigor. Only one application, Visual Acuity XL, explicitly mentioned a validation study. In the Irish store, the same four applications were affiliated with academic institutions. The additional application in the Canadian store spoke of a research group without explicitly naming an affiliated academic institution. However, it was retained based on consensus of the investigators.

Systematic review of the literature

Twelve relevant articles were retained from an initial 5,648 identified by the systematic review of the literature (Figure 1). Only ten of these assessed methods of validation for smartphone applications. Two review articles discussed a variety of applications, but their primary data sources were not identified. Cheng et al23 discussed, in limited fashion, some of the challenges in this area in 2014. Most articles compared one to three applications on the basis of a specific smartphone or tablet. One article assessed validation of the Eye Handbook against a near vision chart and found that the application tends to overestimate near VA relative to a conventional near vision card, by an average of 0.11 logMAR (P<0.0001).
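
For readers unfamiliar with this type of agreement analysis, the mean difference (bias) and 95% limits of agreement between paired logMAR measurements can be computed as in the minimal sketch below. The data are invented for illustration (chosen so that the bias works out to 0.11 logMAR); the published figure comes from Tofigh et al, not from this code.

```python
import statistics

# Paired logMAR readings from two methods (invented example data).
app_va  = [0.20, 0.35, 0.10, 0.50, 0.30]   # app-measured logMAR
card_va = [0.10, 0.20, 0.00, 0.40, 0.20]   # near-card logMAR

diffs = [a - c for a, c in zip(app_va, card_va)]
bias = statistics.mean(diffs)               # mean difference (bias)
sd = statistics.stdev(diffs)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
print(f"bias={bias:.3f} logMAR, 95% LoA=({loa[0]:.3f}, {loa[1]:.3f})")
```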

Given the rate of mobile technology change and upgrade,21 the specific applications reviewed here are likely already obsolete. Moreover, the health literature correlates poorly with the identified smartphone applications: there is little cross-referencing between the applications and the literature, and only one application cited a validation study. Furthermore, the health literature, when referenced, may not always be accessible to the consumer, owing to copyright limitations when journals are not open access. While the Eye Handbook did have some validation in the health literature,24,25 this was not referred to on the online platform of the application (Table 3). This makes it impossible for busy health care professionals to distinguish a validated application from among the others.

Table 3. Demographic data of articles identified in a systematic search of the health literature (n=12)

Reference | Study design | Subjects (n) | Results | Applications reviewed (n) | Methods of assessment (validity/reliability/responsiveness to change) | Primary outcome | Conclusions | Limitations of paper
Aslam et al33 Threshold testing in four phases (initial threshold, threshold, attention check, and final threshold): ETDRS (Landolt C) vs MAVERIC system (specialized software on a tablet computer, housed in a bespoke viewing booth) 78 High-contrast visual acuity measurements (normal):
• mean 0.003, SD ±0.09
• ±2 SD = 0.17 (95% CI)
Low-contrast acuity measurements:
• mean difference 0.02, SD ±0.12
• ±2 SD = 0.23 (95% CI)
2 Reliability assessment at a fixed distance of 40 cm: high- and low-contrast visual acuity measures versus ETDRS Landolt C Reliable self-testing of VA, with a high degree of reliability and agreement compared with gold standard chart-based measurements. The major limitation at present is screen resolution, which will inevitably improve with time. The ease of availability and mobility of the components means that such devices could in future be used for home testing or testing in general practice settings; further studies are needed to confirm this.
Bastawrous et al29 Validation study 300 older-aged Kenyan adults
• Test-retest variability of smartphone VA: ±0.029 logMAR (95% CI)
• Mean differences: smartphone vs ETDRS 0.07 (95% CI, 0.05–0.09) logMAR; smartphone-based test vs Snellen acuity 0.08 (95% CI, 0.06–0.10) logMAR
• Agreement of Peek Acuity with the ETDRS chart was greater than that of the Snellen chart with the ETDRS chart (95% CI, 0.05–0.10; P=0.08)
• Required minimal training and took no longer than the Snellen test (77 vs 82 seconds; 95% CI, 71–84 vs 73–91 seconds, respectively; P=0.13)
3 Variability test Monocular logMAR visual acuity scores for each test: ETDRS chart logMAR, Snellen acuity, and Peek Acuity The Peek Acuity smartphone test is capable of accurate and repeatable acuity measurements consistent with published data on the test-retest variability of acuities measured using 5-letter-per-line retroilluminated logMAR charts May not be generalizable to other populations
Black et al22 Blinded diagnostic test study 85
• iPad tablet with its glossy screen was significantly poorer (approximately 2 logMAR lines) than the ETDRS chart and a standard computerized testing system (n=56), owing to glare
• iPad with an antiglare screen, positioned away from sources creating reflected (veiling) glare, gave VA equivalent to the gold standard (n=29)
2 Comparison Visual acuity measured under a number of conditions Tablets are only suitable for measuring VA in situations where sources of glare can be eliminated
• Technological advancement: higher-resolution screens
• Distance acuity measurements at 6 m were not limited by the resolution of the screen, as they were indistinguishable from the printed ETDRS chart measurements
Brady et al34 Validation study in participants’ homes and temporary clinic settings in rural Kenya 300
• Test-retest variability of smartphone acuity data: ±0.029 logMAR
• Mean differences: smartphone-based test vs ETDRS chart 0.07 (95% CI, 0.05–0.09) logMAR; smartphone-based test vs Snellen 0.08 (95% CI, 0.06–0.10) logMAR
• Agreement of Peek Acuity with the ETDRS chart was greater than that of Snellen with the ETDRS chart (95% CI, 0.05–0.10; P=0.08)
• Required minimal training and took no longer than the Snellen test (77 vs 82 seconds; 95% CI, 71–84 vs 73–91 seconds, respectively; P=0.13)
3 Comparison
• Monocular logMAR visual acuity scores for ETDRS chart logMAR, Snellen acuity, and Peek Acuity
• Test-retest variability of Peek Acuity and measurement time
Test-retest variability of Peek smartphone visual acuity is consistent with published data using 5-letter-per-line retroilluminated logMAR charts Generalizability to Western populations
Chacon et al35 Validation vs gold standard netbook display for color vision assessment in normal versus color-deficient subjects 32 (16 color vision normal, 16 color vision deficient) Both displays showed 100% specificity for confirming CVN and 100% sensitivity for detecting CVD. In CVNs there was no difference between scores on netbook and tablet displays. G-cone CVDs showed slightly lower G-cone contrast test scores on the tablet. None noted specifically Validity: ability of the tablet to discriminate known normal from deficient color vision subjects
• Sensitivity 100% and specificity 100% of the tablet screen for identifying subjects with normal versus pathologic color vision
Construct validity shows that the tablet can be used to discriminate color vision normal from color vision deficient individuals Not used as a diagnostic test; did not assess discrimination threshold; did not address glare and distance
Cheng et al23 Systematic review: the Apple iTunes store was searched for iPhone eye care-themed apps None In total, 182 apps were identified. The majority of apps lacked community user ratings and had 3,000 or fewer downloads (84% and 69%, respectively). Consistent with other medical specialties, only 37% of apps had documented qualified professional involvement in their development. When stratified by intended audience, 52% and 44% of apps designed for ophthalmologists and optometrists, respectively, had professional input, compared with 31% for non-eye care clinicians and 21% for the general public. 182 None Description of available applications
• Rapid emergence of eye care apps, but a low level of qualified professional involvement in app development and a lack of peer review after publishing remain
• Need for evidence-based principles and standards of app development to be adopted in this emerging area
Single store; no accompanying literature analysis
Kollbaum et al36 Validation of an iPad-based letter contrast sensitivity test versus two tests, including the gold standard paper-based clinical contrast sensitivity measure (Pelli-Robson test) and the computer-based Freiburg test (variable-contrast Landolt C presented at eight possible orientations, using a 30-trial Best PEST procedure)
• n=40 (20 normal, 20 low vision)
• Tested monocularly at 1 m using each test, wearing habitual correction
Test-retest 95% LoA: iPad ±0.19, Pelli-Robson ±0.19, Freiburg ±0.15. The iPad test showed good agreement with the Freiburg test, with similar mean (SD) logCS (iPad 1.98±0.11, Freiburg 1.96±0.06) and 95% LoA of ±0.24; the Pelli-Robson gave lower values (1.65±0.04). In low-vision subjects (iPad ±0.24, Pelli-Robson ±0.23, Freiburg ±0.21), agreement between the iPad and Freiburg tests was good (iPad 1.45±0.40, Freiburg 1.54±0.37), but the Pelli-Robson test gave significantly lower values (1.30±0.30). 1
• iPad showed good precision (test-retest) and good agreement with the two tests
• Low-vision subjects had slightly poorer repeatability (iPad ±0.24, Pelli-Robson ±0.23, Freiburg ±0.21)
• Poor agreement with the Pelli-Robson test (1.30±0.30)
• Precision
• Limits of agreement
• The iPad test showed good precision
• Poor limits of agreement with the Pelli-Robson test
Newly developed application; did not test visual acuity
Perera et al37
• Review of Snellen equivalents for optotype size
• Prospective cohort study of visual acuity using the ‘Snellen’ application on an Apple iPhone 4
88
• Mean difference in visual acuity, iPhone versus paper, was 0.02 logMAR (95% limits of agreement −0.332, 0.372 logMAR)
• The subgroup of patients with Snellen VA worse than 6/18 (n=5) had the largest mean difference, equivalent to two Snellen visual acuity lines between the charts (0.276 logMAR)
11 Validity logMAR/Snellen-equivalent VA
• iPhone apps could not validly discriminate Snellen visual acuity within one line
• There was considerable variability in the optotype accuracy of apps; further validation is required for assessment of acuity in patients with severe vision impairment
• Possible selection bias, as the right eye is more frequently the dominant eye and was the chosen tested eye; if ocular dominance were an important factor in reading the chart, results may be influenced
• Chart testing order was not randomized: all Snellen visual acuity testing came first, followed by the iPhone Snellen acuity chart, introducing the possibility of a learning effect
Phung et al38
• Retrospective chart review of reliability and validity
• Visual acuity measured during the same routine clinic visit with Snellen chart, Rosenbaum near vision card, and SightBook mobile app
126
• SightBook, Snellen, and near-card acuities had excellent test-retest reproducibility
• SightBook acuities were significantly different from the other measures
• Agreement was also poor between the near-card and Snellen acuities (mean absolute difference of 6.4 and 7.6 letters in the right and left eyes)
1 Validity and reliability
• Acuities were converted to approximate ETDRS letters for statistical purposes
The discrepancy between SightBook mobile app and clinic chart acuities is large; however, the results are highly reproducible
• Small sample size (n=123 right eyes and 115 left eyes)
• Visual acuities were not always best-corrected visual acuities
• Only one mobile app was evaluated as a representative application
Tofigh et al26 Cross-sectional comparison of near visual acuity:
• Near vision card
• Eye Handbook app for iPhone 5
100
• The Eye Handbook application overestimates near VA vs the conventional near vision card by an average of 0.11 logMAR (P<0.0001)
1 Validity and reliability Near logMAR visual acuity
• Potential disparity in VA measurement between different platforms
• Contrast and brightness levels of the smartphone’s high-definition screen compared with the near vision card can affect results
• Currently outdated technology
• Did not account for glare
Toner et al39
• Prospective cohort: validity and reliability study in children aged 6–18 years
• Excluded if visual acuity in the worse-seeing eye was less than 20/200 (for validity testing, but not reliability testing)
73
• Strong linear correlation (r=0.92)
• Mean difference in acuity 0.005 logMAR (less than one letter; 95% CI, −0.03 to 0.02)
• The 95% CI was two lines
• Test-retest reliability: 81% of retest scores within 0.1 logMAR (5 letters) and 100% within 0.2 logMAR (10 letters)
• Intraclass correlation coefficient of 0.93 and standard error of measurement of 0.08
2 (Handy Eye Check app; Handy Eye Chart) Test-retest reliability and validity; monocular visual acuity testing using the subject’s poorer-seeing eye The mobile application is a valid and reliable test of VA in children aged 6–18 years compared with its paper counterpart Test order not randomized; not for children with poor vision
Zhang et al40 Prospective cohort: validation study 120 patients (240 consecutive eyes)
• Mean difference (bias) of 0.02 logMAR units between the VA results from the iPad chart and the light-box chart
• 95% limits of agreement of −0.14 to 0.19
• Better vision: 182 eyes with VA better than 0.1 according to the light-box VA test; median logMAR VA was 0.54 by the iPad and 0.52 by the light-box chart, with no significant difference between them (P=0.69)
• Low vision: 58 eyes with VA equal to or worse than 0.1 according to the light-box VA test; median logMAR VA was 1.26 by the iPad and 1.10 by the light box; the result from the iPad was significantly lower (P<0.001)
2 Validity Snellen VA
• Good agreement in patients with better VA (ie, Snellen VA better than 20/200)
• In patients with poor vision (ie, Snellen VA equal to or worse than 20/200), the results from the iPad chart were worse than from the light-box chart
• Visual charts with Snellen notations were compared; Snellen visual charts are not in logMAR steps
• No strict standardization of the examination process and conditions
• The brightness of the iPad screen could not be adjusted to be precisely the same as the light-box chart, because a light meter was not available
• The height of the light box could not be adjusted to suit each patient’s eye height
• The reading scales of the two kinds of charts were not identical

Abbreviations: ETDRS, Early Treatment Diabetic Retinopathy Study; VA, visual acuity; logMAR, logarithm of the minimum angle of resolution; LoA, limits of agreement; SD, standard deviation; CVN, color vision normal; CVD, color vision deficient.

Discussion

This systematic review demonstrates that, despite the availability of many mobile device applications for ophthalmologic assessment, these applications are either not suitable for the emergency room or lack systematic validation. A total of 5,024 Canadian and 2,571 Irish applications were identified on Google Play and iTunes as having potential for use in ocular emergency diagnostics. Less than 1% of the identified applications (n=44) were unique and on target as potentially suitable. Four applications available in both countries, plus one found only in the Canadian store (n=5), were affiliated with an academic institution. Only a single application explicitly cited a validation study in its online store, and by current standards of best practice that validation must be described as only partial.

In the academic literature, three applications (Visual Acuity XL, Eye Handbook, and AmblyoCare) had some evidence of validation. The Eye Handbook was validated by a single study26 on the iPhone 5 that did not address glare, a factor shown in studies of other applications to make results unreliable.27 Black et al22 used a cross-sectional design with a convenience sample of 85 healthy volunteers to demonstrate that a first-generation iPad and Visual Acuity XL could reproduce gold standard eye chart evaluation data, but only with significant attention to positioning and modifications to the tablet’s screen to avoid glare. Outside these standardized conditions, iPad results were significantly poorer than those of standardized paper-based or wall-mounted eye charts.28

Arora et al28 noted the advantage of preliminary screening with the Eye Handbook application for home monitoring or public health data collection, but considered it not yet useful in the clinical setting. These authors did not comment on the reliability or validity of the Eye Handbook’s adaptation of the gold standard ETDRS chart to a now-obsolete application for the iPod Touch and iPhone 3G, nor did they address the impact on test validity of screen glare or of device size, which determines the patient’s required viewing distance.

The online store text for the AmblyoCare28 application indicates that it is registered as a medical device with the Austrian medical board. However, no original research data for this application emerged in our literature review, leaving the limits of this application for clinical use unclear.28 There is thus no transparent validation of this tool accessible to health care professionals.

The difficulty with all these applications is that adapting a visual chart (eg, a Snellen chart) to the varying screen sizes and properties of different smartphone devices does not ensure that the size, font, or required distance from the image preserves the diagnostic properties of the original paper chart.5,7,29,30 One may argue that these tools could provide a rough estimate of vision sufficient for an emergency clinician. We would respond that results from tools that have not been validated cannot be usefully compared with known benchmarks when making treatment decisions concerning a patient’s eyesight. For example, if an application categorizes someone’s sight as normal, that is not comparable to 20/20 vision on a logMAR chart. Such results may generate treatment decisions based on faulty information.

Limitations

We did not examine smaller electronic markets, and our analysis considered only the store descriptions of the applications. Applications for non-English-speaking foreign markets were excluded from the review. We did not address the variation in regulatory requirements across global markets. Unlike some similar studies, we did not solicit the opinions of professionals in the field.

Conclusion

We conclude that efficient regulation and standardization of valid clinical tools for smartphones are needed.18,31,32 This is a major challenge. One possible solution could come from the business world. Instead of a free, ad-based revenue model that rewards individual developers for marketing low-quality applications to individual health care professionals, a business case could be made for amalgamating resources, first nationally and perhaps eventually internationally, to fund high-quality applications that can be used globally. This would consolidate funds and expertise into a high-quality, validated application with international value (on a par with paper tools).

Despite the bright future for smartphone technology, mobile device applications in their current state are not suitable for emergency room ophthalmologic assessment. Furthermore, education of clinicians about measurement science and the limits of technological validation is required. The importance of quality electronic diagnostic tools for patients, and the challenges introduced by nonvalidated tools, need to be disseminated to all health professionals.

Data-sharing statement

Most data generated or analyzed during this study are included in this published article and its supplementary information files. The original data sets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Acknowledgments

This work would not have been possible without the generous support of the following dedicated individuals. Thank you: Dr Mary V Seeman, Mrs Jennifer Desmarais, Mr Tobias Feih, Dr Joshua Chan, Ms Chelsea Lefaivre, Ms Johanna Tremblay, Ms Jennifer Lay, Dr Amanda Carrigan, Dr Clarissa Potter, Dr Gerald Lane, and Dr Vinnie Krishnan.

Footnotes

Author contributions

JMC conceived and designed the study and supervised the conduct of the study and data collection. Both authors collected and analyzed the data, managed the data and quality control, provided statistical advice, drafted the manuscript, and contributed substantially to its revision, gave final approval of the version to be published, and agree to be accountable for all aspects of the work.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Bourges JL, Boutron I, Monnet D, Brézin AP. Consensus on severity for ocular emergency: the BAsic SEverity Score for Common OculaR Emergencies [BaSe SCOrE]. J Ophthalmol. 2015;2015:576983. doi: 10.1155/2015/576983.
2. Collignon NJ. Emergencies in glaucoma: a review. Bull Soc Belge Ophtalmol. 2005;(296):71–81.
3. Muth CC. Eye emergencies. JAMA. 2017;318(7):676. doi: 10.1001/jama.2017.9899.
4. Tarff A, Behrens A. Ocular emergencies: red eye. Med Clin North Am. 2017;101(3):615–639. doi: 10.1016/j.mcna.2016.12.013.
5. Bellamy N, Campbell J, Hill J, Band P. A comparative study of telephone versus onsite completion of the WOMAC 3.0 osteoarthritis index. J Rheumatol. 2002;29(4):783–786.
6. Bond M, Davis A, Lohmander S, Hawker G. Responsiveness of the OARSI-OMERACT osteoarthritis pain and function measures. Osteoarthritis Cartilage. 2012;20(6):541–547. doi: 10.1016/j.joca.2012.03.001.
7. Hawker GA, Davis AM, French MR, et al. Development and preliminary psychometric testing of a new OA pain measure: an OARSI/OMERACT initiative. Osteoarthritis Cartilage. 2008;16(4):409–414. doi: 10.1016/j.joca.2007.12.015.
8. Kniestedt C, Stamper RL. Visual acuity and its measurement. Ophthalmol Clin North Am. 2003;16(2):155–170. doi: 10.1016/s0896-1549(03)00013-0.
9. Chou R, Dana T, Bougatsos C, Grusing S, Blazina I. Screening for impaired visual acuity in older adults: updated evidence report and systematic review for the US Preventive Services Task Force. JAMA. 2016;315(9):915–933. doi: 10.1001/jama.2016.0783.
10. Matthews K, Nazroo J, Whillans J. The consequences of self-reported vision change in later-life: evidence from the English Longitudinal Study of Ageing. Public Health. 2017;142:7–14. doi: 10.1016/j.puhe.2016.09.034.
11. Hochberg C, Maul E, Chan ES, et al. Association of vision loss in glaucoma and age-related macular degeneration with IADL disability. Invest Ophthalmol Vis Sci. 2012;53(6):3201–3206. doi: 10.1167/iovs.12-9469.
12. Gerra G, Zaimovic A, Gerra ML, et al. Pharmacology and toxicology of Cannabis derivatives and endocannabinoid agonists. Recent Pat CNS Drug Discov. 2010;5(1):46–52. doi: 10.2174/157488910789753521.
13. Sonksen PM, Salt AT, Sargent J. Re: the measurement of visual acuity in children: an evidence-based update. Clin Exp Optom. 2014;97(4):369. doi: 10.1111/cxo.12185.
14. Bailey IL, Lovie JE. New design principles for visual acuity letter charts. Am J Optom Physiol Opt. 1976;53(11):740–745. doi: 10.1097/00006324-197611000-00006.
15. Abdolali F, Zoroofi RA, Otake Y, Sato Y. Automatic segmentation of maxillofacial cysts in cone beam CT images. Comput Biol Med. 2016;72:108–119. doi: 10.1016/j.compbiomed.2016.03.014.
16. Elliott DB, Whitaker D, Bonette L. Differences in the legibility of letters at contrast threshold using the Pelli-Robson chart. Ophthalmic Physiol Opt. 1990;10(4):323–326. doi: 10.1111/j.1475-1313.1990.tb00877.x.
17. Anstice NS, Thompson B. The measurement of visual acuity in children: an evidence-based update. Clin Exp Optom. 2014;97(1):3–11. doi: 10.1111/cxo.12086.
18. Bender JL, Yue RY, To MJ, Deacken L, Jadad AR. A lot of action, but not in the right direction: systematic review and content analysis of smartphone applications for the prevention, detection, and management of cancer. J Med Internet Res. 2013;15(12):e287. doi: 10.2196/jmir.2661.
19. Lalloo C, Shah U, Birnie KA, et al. Commercially available smartphone apps to support postoperative pain self-management: scoping review. JMIR Mhealth Uhealth. 2017;5(10):e162. doi: 10.2196/mhealth.8230.
20. Larsen ME, Nicholas J, Christensen H. A systematic assessment of smartphone tools for suicide prevention. PLoS One. 2016;11(4):e0152285. doi: 10.1371/journal.pone.0152285.
21. International Data Corporation. Smartphone OS market share, 2017 Q1. 2017. Accessed November 17, 2017. Available from: https://www.idc.com/promo/smartphone-market-share/os.
22. Black JM, Jacobs RJ, Phillips G, et al. An assessment of the iPad as a testing platform for distance visual acuity in adults. BMJ Open. 2013;3(6):e002730. doi: 10.1136/bmjopen-2013-002730.
23. Cheng NM, Chakrabarti R, Kam JK. iPhone applications for eye care professionals: a review of current capabilities and concerns. Telemed J E Health. 2014;20(4):385–387. doi: 10.1089/tmj.2013.0173.
24. Perera C, Chakrabarti R. Response to: ‘Comment on The Eye Phone Study: reliability and accuracy of assessing Snellen visual acuity using smartphone technology’. Eye (Lond). 2015;29(12):1628. doi: 10.1038/eye.2015.169.
25. Perera C, Chakrabarti R, Islam A, Crowston J. The Eye Phone Study (EPS): reliability and accuracy of assessing Snellen visual acuity using smartphone technology. Paper presented at: 44th Annual Scientific Congress of the Royal Australian and New Zealand College of Ophthalmologists (RANZCO); November 24–28, 2012; Melbourne, VIC, Australia.
26. Tofigh S, Shortridge E, Elkeeb A, Godley BF. Effectiveness of a smartphone application for testing near visual acuity. Eye (Lond). 2015;29(11):1464–1468. doi: 10.1038/eye.2015.138.
27. Black JM, Hess RF, Cooperstock JR, To L, Thompson B. The measurement and treatment of suppression in amblyopia. J Vis Exp. 2012;(70):e3927. doi: 10.3791/3927.
28. Arora KS, Chang DS, Supakontanasan W, Lakkur M, Friedman DS. Assessment of a rapid method to determine approximate visual acuity in large surveys and other such settings. Am J Ophthalmol. 2014;157(6):1315–1321. doi: 10.1016/j.ajo.2014.02.031.
29. Bastawrous A, Rono HK, Livingstone IA, et al. Development and validation of a smartphone-based visual acuity test (Peek Acuity) for clinical practice and community-based fieldwork. JAMA Ophthalmol. 2015;133(8):930–937. doi: 10.1001/jamaophthalmol.2015.1468.
30. Sullivan GM. A primer on the validity of assessment instruments. J Grad Med Educ. 2011;3(2):119–120. doi: 10.4300/JGME-D-11-00075.1.
31. Gagnon L. Time to rein in the “Wild West” of medical apps. CMAJ. 2014;186(8):E247. doi: 10.1503/cmaj.109-4772.
32. Cole BL. Measuring visual acuity is not as simple as it seems. Clin Exp Optom. 2014;97(1):1–2. doi: 10.1111/cxo.12123.
33. Aslam TM, Parry NR, Murray IJ, et al. Development and testing of an automated computer tablet-based method for self-testing of high and low contrast near visual acuity in ophthalmic patients. Graefes Arch Clin Exp Ophthalmol. 2016;254(5):891–899. doi: 10.1007/s00417-016-3293-2.
34. Brady CJ, Eghrari AO, Labrique AB. Smartphone-based visual acuity measurement for screening and clinical assessment. JAMA. 2015;314(24):2682–2683. doi: 10.1001/jama.2015.15855.
35. Chacon A, Rabin J, Yu D, Johnston S, Bradshaw T. Quantification of color vision using a tablet display. Aerosp Med Hum Perform. 2015;86(1):56–58. doi: 10.3357/AMHP.4045.2015.
36. Kollbaum PS, Jansen ME, Kollbaum EJ, Bullimore MA. Validation of an iPad test of letter contrast sensitivity. Optom Vis Sci. 2014;91(3):291–296. doi: 10.1097/OPX.0000000000000158.
37. Perera C, Chakrabarti R, Islam FM, Crowston J. The Eye Phone Study: reliability and accuracy of assessing Snellen visual acuity using smartphone technology. Eye (Lond). 2015;29(7):888–894. doi: 10.1038/eye.2015.60.
38. Phung L, Gregori NZ, Ortiz A, Shi W, Schiffman JC. Reproducibility and comparison of visual acuity obtained with SightBook mobile application to near card and Snellen chart. Retina. 2016;36(5):1009–1020. doi: 10.1097/IAE.0000000000000818.
39. Toner KN, Lynn MJ, Candy TR, Hutchinson AK. The Handy Eye Check: a mobile medical application to test visual acuity in children. J AAPOS. 2014;18(3):258–260. doi: 10.1016/j.jaapos.2014.01.011.
40. Zhang ZT, Zhang SC, Huang XG, Liang LY. A pilot trial of the iPad tablet computer as a portable device for visual acuity testing. J Telemed Telecare. 2013;19(1):55–59. doi: 10.1177/1357633X12474964.
