Clinical Diabetes: A Publication of the American Diabetes Association. 2023 Sep 18;42(1):142–149. doi: 10.2337/cd23-0019

Clinical Implementation of Autonomous Artificial Intelligence Systems for Diabetic Eye Exams: Considerations for Success

Risa M Wolf, Roomasa Channa, Harold P Lehmann, Michael D Abramoff, TY Alvin Liu
PMCID: PMC10788651  PMID: 38230333

Diabetic eye disease (DED), including diabetic retinopathy (DR) and diabetic macular edema (DME), is a complication of diabetes and the leading cause of vision loss among working-age adults in the United States (1–3). Screening for DED can lead to its early identification and treatment, thereby preventing irreversible vision loss (4–7). However, rates of diabetic eye exams for DED screening remain suboptimal, with reported rates in the United States ranging from 11 to 70% (8,9).

Autonomous artificial intelligence (AI)–based diabetic eye exams have the potential to increase access to these exams and facilitate the early identification of DED so that timely treatment can be administered to prevent blindness. Autonomous AI systems use a robotic nonmydriatic fundus camera with a built-in AI algorithm to provide feedback to the operator to acquire high-quality fundus photographs for determining the presence or absence of referable DED, with immediate results after image acquisition at the point of care (10–12). The autonomous AI system guides the operator to acquire two color fundus images determined to be of adequate quality using an image quality algorithm, with one each centered on the fovea and the optic nerve, and guides the operator to retake any images of insufficient quality. It is important to note that these systems have been rigorously validated against a prognostic standard to identify DR and DME and do not diagnose other eye conditions.
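The acquisition protocol described above amounts to a simple retake loop. The sketch below is illustrative only; `capture` and `quality_check` are hypothetical stand-ins for the camera interface and the device's proprietary image-quality algorithm:

```python
# Illustrative sketch of the operator-guided acquisition loop: two fundus
# fields (fovea- and optic disc-centered), each retaken until an image-
# quality check passes. `capture` and `quality_check` are hypothetical
# stand-ins, not vendor APIs.

def acquire_exam(capture, quality_check, max_attempts=3):
    images = {}
    for view in ("fovea", "optic_disc"):
        for _attempt in range(max_attempts):
            img = capture(view)
            if quality_check(img):
                images[view] = img
                break
        else:
            return None  # quality never reached; diagnostic AI cannot run
    return images

# Toy demonstration: the first fovea shot fails quality, the retake passes.
shots = iter([0.4, 0.9, 0.8])
exam = acquire_exam(lambda view: next(shots), lambda img: img >= 0.5)
```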

The first autonomous AI diagnostic system (LumineticsCore [formerly IDx-DR], Digital Diagnostics, Coralville, IA) was de novo–authorized by the U.S. Food and Drug Administration (FDA) in 2018 after completion of a pivotal trial demonstrating its safety, efficacy, and equity in diagnosing referable DED compared with a prognostic standard outcome (10). In the pivotal trial, the autonomous AI system demonstrated 87% sensitivity and 90% specificity in detecting referable DR and/or DME in adults (10,13). Since this first autonomous AI system was approved, two additional autonomous AI systems have been authorized under the FDA’s 510(k) authorization process using it as a predicate, and there are several other autonomous AI systems in different stages of development (11,12).
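For context on how these operating characteristics are derived, sensitivity and specificity follow directly from a 2×2 confusion matrix. The counts below are invented for illustration and are not the trial's data:

```python
# Sensitivity and specificity from a 2x2 confusion matrix; the counts
# passed in below are invented for illustration only.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=173, fn=25, tn=556, fp=63)
```

With these made-up counts, the function returns roughly 0.87 and 0.90, mirroring the reported figures.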

Because of widespread stakeholder support and based on the ethical framework under which the first autonomous AI diagnostic system was developed and validated (13,14), performance of diabetic eye exams using an autonomous AI system was added to the American Diabetes Association’s Standards of Medical Care in Diabetes—2019 guidelines (15). Furthermore, in HEDIS-MIPS (Healthcare Effectiveness Data and Information Set Merit-Based Incentive Payment System) measurement year 2020, eye exams interpreted by an AI system were determined to meet criteria to fill care gaps for value-based care (16,17). As of 1 January 2021, the new Current Procedural Terminology (CPT) code 92229 was established to allow for billing and reimbursement using autonomous AI technology to diagnose DED, and, as of 2023, the term “autonomous AI” was reintroduced into CPT coding (18,19).

Prospective cohort studies have demonstrated autonomous AI to improve rates of diabetic eye exams (8,20), and several health care modeling and cost-savings analyses have shown these systems to be cost-saving for patients and health care systems (21–25) and to lead to greater prevention of vision loss on a population level than can be achieved via standard-of-care dilated eye exams (26). In addition, autonomous AI has been shown to potentially reduce greenhouse gas emissions (27).

Although there are many studies evaluating the diagnostic accuracy of different autonomous AI systems (28), as well as published observational studies demonstrating improved screening completion rates, there is limited information on successful clinical implementation of these systems and incorporation into clinical workflow. This article reviews the key considerations for successful clinical implementation of these systems in primary care and endocrinology clinics providing care to people with diabetes.

At our integrated health care system, consisting of six hospitals and more than 30 community-based primary care clinics, autonomous AI screening for DED using LumineticsCore (Digital Diagnostics) has been implemented in both the adult and pediatric populations. We were the first center to implement this autonomous diagnostic AI system at scale. The start-up time from first initiation to actual camera use was approximately 6 months, including contracting, workflow integration in the electronic medical record (EMR) system (Epic, Madison, WI), order and result integration, and workflow planning. In the first year of use in our multidisciplinary pediatric diabetes center, we completed 310 autonomous AI diabetic eye exams in youth with diabetes, demonstrating 85.7% sensitivity and 79.3% specificity in detecting referable DED in youth compared with a level 2 reference standard of image interpretation by retina specialists (20). Based on our successful implementation in pediatrics, our health care system expanded diagnostic autonomous AI throughout the adult primary care network and adult diabetes centers. In this review, we share our experience and strategies for success from both the pediatric and adult care perspectives. Table 1 summarizes the tips highlighted throughout the article.

TABLE 1.

Summary of Tips for Successful Implementation and Use of Autonomous AI for DED Screening

Key stakeholders
  • Endocrinology and primary care teams

  • IT telemedicine teams

  • Ophthalmology/optometry clinics

Camera setup
  • Location: set up in a dimly lit room

  • Operator: MA or nurse; no prior experience in eye imaging needed

  • EMR system integration: results available in patient charts; auto-population of patient information in the camera

Patient imaging
  • Adults: reflexive versus predictive pharmacologic dilation for nondiagnostic exams

  • Pediatrics: no pharmacologic dilation required

Workflow
  • Use BPAs for patients who need eye exams

  • Establish a referral route to get timely follow-up eye care if AI is positive for referable disease

Billing
  • CPT 92229: “point of care autonomous analysis that uses innovative autonomous AI technology to perform the interpretation of the eye exam, without requiring that an ophthalmologist interpret the results.”

Engaging Key Stakeholders

Key to successful implementation is engaging stakeholders within the system at the inception of the program, while it is still in the planning phase. These stakeholders include physician champions in the deploying department (i.e., the primary care or endocrinology clinic), in the departments of ophthalmology/optometry, and on the information technology (IT) teams. Stakeholders need to be comfortable with the safety, efficacy, and equity of the system, including such issues as mitigation of racial bias, data usage and cybersecurity, reimbursement, and value-based care.

The primary users of the system (i.e., primary care or endocrinology teams) should identify a champion within their practice or division to facilitate and oversee implementation of the AI technology. This champion should be responsible for informing and educating providers about the AI system, its functionality, and the workflow around its use. It is important to be considerate of the current workflow and to work with providers and clinic staff to optimize the new workflow that will incorporate implementation of diagnostic AI (29).

Getting buy-in from the institution’s ophthalmology/optometry departments, if available, is helpful in evaluating which AI system to choose, providing technical camera support and camera operator expertise, and setting up scheduling to ensure there is an efficient referral pathway for patients identified as having potentially blinding disease.

It is also important to engage the IT team, who will play a crucial role in helping to set up the camera in the clinic, connecting it to the network, assisting with any local software issues, and connecting the AI camera to the EMR system. Connecting the local IT team with the vendor IT team will also facilitate a smooth startup. The experience of implementing other systems worldwide has highlighted the importance of multidisciplinary team collaboration in the integration of AI into routine workflows (29).

Camera Setup and Integration With the EMR System

Autonomous AI systems (i.e., nonmydriatic fundus cameras with assistive AI for image quality feedback and autonomous diagnostic AI—what the FDA refers to as “medical devices”) can be used as standalone, independent systems. In this case, a formatted report of the diabetic eye exam results can be uploaded into the EMR system or printed and scanned into the medical record. PDF documents of the results can be printed for patients or e-mailed to them.

At our institution, the autonomous AI system is fully integrated through HL7 (Health Level Seven) with our EMR system in a secure manner. A provider places an order for an autonomous AI diabetic eye exam in the EMR system. This order is then released to the camera software, and the patient’s name is automatically added to the worklist on the camera software. The camera operator then clicks on the patient’s name from the worklist and proceeds with obtaining the retinal images for the diabetic eye exam. Once the exam is complete, the results are generated as a PDF report within 30–60 seconds, automatically uploaded/pushed to the EMR system, and stored under the “Media” tab of the EMR. The actual images also remain on the camera and can also be pushed to an ophthalmic data management platform if necessary. Many ophthalmic data management platforms exist. Currently, our institution uses Forum Viewer, v. 4.2.4.15 (Zeiss, Dublin, CA).
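The order-to-result loop described above can be simulated in miniature. Real integrations exchange HL7 v2 order (ORM) and result (ORU) messages through an interface engine; everything below (class names, status strings, the "EMR Media tab" destination) is a hypothetical simplification for illustration:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the order-to-result flow: the EMR releases an
# order, the patient appears on the camera worklist, and the completed
# exam's report is pushed back to the EMR. All names are illustrative.

@dataclass
class ExamOrder:
    patient_id: str
    patient_name: str
    status: str = "ORDERED"   # ORDERED -> ON_WORKLIST -> RESULTED

@dataclass
class CameraWorklist:
    orders: list = field(default_factory=list)

    def receive_order(self, order: ExamOrder):
        """EMR releases the order; patient appears on the worklist."""
        order.status = "ON_WORKLIST"
        self.orders.append(order)

    def complete_exam(self, order: ExamOrder, result: str) -> dict:
        """AI result (PDF report) is pushed back to the EMR."""
        order.status = "RESULTED"
        return {"patient_id": order.patient_id,
                "result": result,
                "destination": "EMR Media tab"}

worklist = CameraWorklist()
order = ExamOrder("12345", "Doe, Jane")
worklist.receive_order(order)
report = worklist.complete_exam(order, "negative for referable DED")
```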

Once the order for the autonomous diabetic eye exam is placed and completed, the best practice alert (BPA) notification for DR screening on the diabetes dashboard is fulfilled. This is important, particularly in adult clinical practices where HEDIS-MIPS measures for care gap closure should be met (17).

Of note, the autonomous AI system reformats the retina images from the fundus camera in DICOM (Digital Imaging and Communications in Medicine) format, which can then be saved in any imaging database. In our system, the images are stored on the camera and manually pushed into our ophthalmic data management platform from the camera.

Camera Location and Operator

A tabletop nonmydriatic fundus camera requires a designated space in the clinic setting with enough room for the camera and its associated computer to sit on an adjustable tabletop, as well as room for the patient chair for imaging. Space requirements would be minimal for handheld cameras with autonomous AI technology, and although they are used widely around the world with varying image quality, these devices currently are not approved by the FDA (30). For best imaging results, it is ideal to place the camera in a dimly lit room and away from any windows to avoid glare from outside light. If there is a window in the room, we recommend darkening shades to block most outside light from entering the space.

The computer and camera software can be operated by a technician or operator (who should be a high school graduate, per FDA labeling for the device). The AI system of LumineticsCore (formerly IDx-DR) has an image quality algorithm that determines whether the images are of sufficient quality for the diagnostic algorithm to process and provide a result (10). The camera software also guides the operator on image acquisition to further ensure sufficient images for interpretation. In the pediatric study, an operator learning curve was constructed and demonstrated that it took ∼40 patient eye exams for an operator to reach a consistent average of 6 minutes to complete image acquisition (20). In subsequent use of the camera with new operators, we have recommended at least 20 practice eye exams before imaging patients for clinical care. Additionally, in the real-world clinical practice setting where there may be many medical assistants (MAs) or nurses triaging patients, we recommend training a few designated camera operators who can quickly build their imaging experience by using the camera and software regularly, as opposed to investing time and resources for training all of the MAs and nurses in a clinic.

Image Acquisition, Imageability, and Diagnosability

In the first pivotal trial conducted using LumineticsCore in adults with diabetes, the overall imageability was 96.1%, and 23.6% of adults required pharmacologic dilation to obtain an AI system result (10). The pivotal trial of the EyeArt Automated DR Detection System, v. 2.1.0, also demonstrated high imageability of 87.4% without dilation and up to 97.4% with a dilation protocol (11).

Our experience in pediatrics is that no pharmacologic dilation is required; simply dimming the lights causes spontaneous pupil dilation sufficient to obtain images for AI system interpretation (20). In contrast, our experience with the adult population yielded two main findings. First, the diagnosability rate in real-world deployment can be lower than that of clinical trials. Second, in an analysis of our data, we identified patient factors associated with nondiagnostic results, including older age, current smoking, and having type 1 diabetes. Using these factors, we have developed a model that can predict nondiagnostic results for future design of a novel predictive dilation workflow, in which patients most likely to benefit from pharmacologic dilation are dilated a priori to improve workflow efficiency and optimize completion of the diabetic eye exam (31).
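As a sketch of what a predictive dilation rule might look like, the fragment below scores each patient's risk of a nondiagnostic exam from the three factors mentioned (older age, current smoking, type 1 diabetes) using a logistic function. The coefficients and threshold are invented for illustration and are not from the published model:

```python
import math

# Hypothetical predictive-dilation sketch. Coefficients are invented for
# illustration only; the factors mirror those reported as associated with
# nondiagnostic results (older age, current smoking, type 1 diabetes).

COEFS = {"intercept": -2.0, "age_over_65": 0.9,
         "current_smoker": 0.6, "type_1_diabetes": 0.5}

def nondiagnostic_risk(age_over_65: bool, current_smoker: bool,
                       type_1_diabetes: bool) -> float:
    z = (COEFS["intercept"]
         + COEFS["age_over_65"] * age_over_65
         + COEFS["current_smoker"] * current_smoker
         + COEFS["type_1_diabetes"] * type_1_diabetes)
    return 1 / (1 + math.exp(-z))   # logistic probability

def dilate_first(risk: float, threshold: float = 0.3) -> bool:
    """Dilate a priori when predicted risk of an ungradable exam is high."""
    return risk >= threshold

high = nondiagnostic_risk(True, True, True)     # all three risk factors
low = nondiagnostic_risk(False, False, False)   # none of the risk factors
```

Under this toy parameterization, a patient with all three factors would be dilated before imaging, while a patient with none would be imaged undilated first.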

Several factors leading to insufficient (i.e., nondiagnosable) images have been identified in studies using retinal images in both teleretinal networks and autonomous AI. In a study of adults by Liu et al. (8), small pupils, cataracts, or other media opacities led to suboptimal retinal image quality. Inconsistent image quality in combination with comorbidities has made real-world implementation challenging in adults (32,33). In our pediatric study, we identified image quality issues of blurring, edge artifacts, dark images, and diffuse reddish hue as causes of insufficient images. The real-world use of autonomous AI systems has demonstrated that images of inadequate quality can lead to false-positive results by the AI system (13,34,35). Thus, if the operator obtains images with these artifacts, the images should be retaken before processing the images through the autonomous AI algorithm (Figure 1).

FIGURE 1.

FIGURE 1

Images of insufficient quality and with artifacts.

Clinic Workflow Considerations

Developing a streamlined workflow for integrating point-of-care (POC) diabetic eye exams into any busy clinical practice is important for success. We suggest considering three factors: 1) identifying patients who need a diabetic eye exam, 2) integrating the AI exam into the clinic workflow, and 3) establishing referral routes for abnormal AI exam results (Figure 2).

FIGURE 2.

FIGURE 2

Incorporating autonomous AI into clinical workflow.

There are several potential methods for identifying patients who are due for a diabetic eye exam, and this process may depend on the practice’s EMR system and/or clinical workflow. Institutions using Epic may have a diabetes dashboard that provides “care gaps” for patients with a diabetes diagnosis or best practice alerts (BPAs) that inform providers when a diabetic eye exam is due. If these systems are not available, a manual chart review of upcoming patients can identify who is due for diabetic eye exams.
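A manual chart review of this kind reduces to a simple date filter. The sketch below assumes a hypothetical record layout (not an Epic schema) and flags patients whose last documented eye exam is more than 12 months old or missing:

```python
from datetime import date

# Hypothetical "care gap" filter: flag patients with diabetes whose last
# documented eye exam is missing or older than the screening interval.
# Field names are illustrative, not an actual EMR schema.

def due_for_eye_exam(patients, today, interval_days=365):
    due = []
    for p in patients:
        last = p.get("last_eye_exam")
        if last is None or (today - last).days > interval_days:
            due.append(p["id"])
    return due

patients = [
    {"id": "A", "last_eye_exam": date(2022, 1, 10)},   # overdue
    {"id": "B", "last_eye_exam": None},                # never screened
    {"id": "C", "last_eye_exam": date(2023, 6, 1)},    # up to date
]
due = due_for_eye_exam(patients, today=date(2023, 9, 1))
```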

The clinical workflow of a multidisciplinary diabetes clinic is complex, so one option is to notify patients in advance of their visit that a POC diabetic eye exam with immediate results is available at their appointment. Once a patient arrives in the clinic, there are several possibilities for the workflow: 1) the eye exam can be performed by the MA as part of the check-in process, 2) the eye exam can be performed after the patient checks in but while he or she is waiting to be seen by the provider, or 3) the eye exam can be performed at the end of the visit. The limitation to performing the eye exam at the end of the visit is that this process would require the provider to wait and return to the patient once the eye exam is complete and results are available. Educating patients about the importance of the diabetic eye exam and the importance of having a follow-up eye care visit when referable eye disease is identified at the point of care has been shown to improve adherence with eye care follow-up (8,36).

Establishing a Referral Route for Positive or Abnormal Eye Exams

Providing abnormal results at the point of care instead of in a deferred manner is associated with higher adherence to follow-up eye care. In a randomized controlled trial using AI-assisted DR screening in a low-resource setting, the researchers found that providing an abnormal result at the point of care instead of 3–5 days later was associated with higher rates of follow-up with an eye care provider (37). Additionally, a prospective cohort study performed using AI screening also demonstrated improved adherence to follow-up eye care when abnormal results were provided at the point of care with education on the importance of eye care follow-up (8).

To facilitate follow-up for patients with AI-positive eye exams, develop a referral route, such as providing a list of local ophthalmologists/optometrists to help patients find an eye doctor with whom to schedule an appointment. In health care systems with optometry or ophthalmology practices, the diabetes clinic or primary care office should consider setting up a direct referral system in which referral orders are placed and the follow-up eye care visit is scheduled in one of three ways: by patients on their own, by eye clinic staff who contact patients, or before patients leave their diabetes appointment.

Billing and Reimbursement

Widespread adoption of autonomous AI for DED screening requires alignment of incentives among the different stakeholders within a particular health care ecosystem. An established screening program, such as the Singapore Integrated DR Program or the National Health Service Diabetic Eye Screening Program in the United Kingdom, can facilitate the adoption of AI by providing the necessary IT infrastructure, referral workflow, and regulatory framework. However, in the absence of widespread, well-coordinated screening programs, the alignment of financial incentives is of paramount importance. For example, as previously mentioned, a new CPT code 92229 was established in the United States in 2021 specifically for the billing and reimbursement of autonomous AI exams for DED. This code is for “point of care autonomous analysis that uses innovative autonomous AI technology to perform the interpretation of the eye exam, without requiring that an ophthalmologist interpret the results.” The Centers for Medicare & Medicaid Services considers CPT code 92229 to be a diagnostic service reimbursable for Medicare, and it provides a definitive way for primary care providers to perform and be reimbursed for autonomous AI testing (19,38).

Measuring Your Impact

The ultimate goal of deploying autonomous AI screening for DED is to improve compliance with diabetic eye exams because regular screenings can lead to early diagnosis, timely intervention, and prevention of irreversible vision loss. In our first deployment in pediatrics, we measured the baseline screening rate at 49%, and this rate improved to 95% after implementation of autonomous AI diabetic eye exams (20). At Johns Hopkins Medicine, we initially deployed autonomous AI for diabetic eye exams at four adult primary care clinics in 2020. We compared the change in the DED screening adherence rate from 2019 (pre-AI deployment) to 2021 (post-AI deployment) between the sites with autonomous AI screening (AI sites) and sites with standard-of-care screening (non-AI sites). From 2019 to 2021, the odds of compliance increased by 87% in the AI sites (odds ratio [OR] 1.87, 95% CI 1.74–2.01) compared with 10% in the non-AI sites (OR 1.10, 95% CI 1.05–1.15) (P <0.001) (39). Future analysis of patient-level data will use multivariable logistic regression models to identify at-risk groups and streamline screening and referrals to optimize compliance and early identification of DED.
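For readers reproducing this kind of analysis, an unadjusted odds ratio with a Wald 95% CI can be computed from a 2×2 table of compliant versus noncompliant patients before and after deployment. The counts below are invented to illustrate the calculation and are not the study's data:

```python
import math

# Unadjusted odds ratio with a Wald 95% confidence interval from a 2x2
# table. Counts are invented for illustration; the published ORs came
# from the study's own patient-level data.

def odds_ratio_ci(a, b, c, d):
    """a/b: compliant/noncompliant post; c/d: compliant/noncompliant pre."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(a=600, b=400, c=445, d=555)
```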

Additionally, acceptability by both providers and patients as key stakeholders in this process is important for the success of implementation and dissemination. A limited body of literature on this topic suggests that ∼55% of patients believe that AI will improve their health care, while voicing concerns related to patient choice, potentially increased health care costs, bias in the data source, and data security (40,41). Although there is general enthusiasm in the medical community for AI in health care, there is focused attention on the explainability of AI, as well as ensuring ethics and equity in the AI algorithm and its implementation (13,14,42).

Many studies have demonstrated that underserved, low-income, and racial/ethnic minority populations negatively affected by social determinants of health are less likely to access eye care and complete diabetic eye exams, yet are more likely to have DED and worse diabetes-related health outcomes (43–47). Implementation of accessible programs for diabetic eye exams using autonomous AI in underserved and under-resourced communities may improve early detection of DR, allow health systems to focus resources required to provide specialist eye care for patients with referable disease, and thereby improve visual outcomes and promote health equity in these populations.

We hope this review will serve as a practical guide, providing useful “nuts-and-bolts” and “how-to” implementation recommendations for anyone planning to deploy autonomous AI testing for DED in a real-world clinical setting. In addition to the practical considerations raised herein, there are research and ethical considerations regarding deep learning, machine learning, and big data. These issues, including data shift over time, bias, and health equity, are important but beyond the scope of this review.

Article Information

Funding

This work was supported by the National Eye Institute (NEI) of the National Institutes of Health under award number R01EY033233 to R.M.W. R.C. receives research support in part from an unrestricted grant from Research to Prevent Blindness, Inc., to the University of Wisconsin–Madison Department of Ophthalmology and Visual Sciences, and from the NEI under award number 5K23EY030911-03. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies.

Duality of Interest

M.D.A. is an investor in, director of, and consultant to Digital Diagnostics, Inc.; has patents and patent applications assigned to the University of Iowa and Digital Diagnostics that are relevant to the subject matter of this article; is chair of the Healthcare AI Coalition; is a member of the American Academy of Ophthalmology AI Committee, the AI Workgroup Digital Medicine Payment Advisory Group, and the Collaborative Community for Ophthalmic Imaging (CCOI); and is chair of the Foundational Principles of AI CCOI Workgroup. No other potential conflicts of interest relevant to this article were reported.

Author Contributions

R.M.W., R.C., M.D.A., and T.Y.A.L. wrote the manuscript. All of the authors critically revised the manuscript and read and approved the final manuscript for submission. R.M.W. is the guarantor of this work and, as such, has full access to all the data presented and takes responsibility for the integrity of the data and the accuracy of the review.

Funding Statement

This work was supported by the National Eye Institute (NEI) of the National Institutes of Health under award number R01EY033233 to R.M.W. R.C. receives research support in part from an unrestricted grant from Research to Prevent Blindness, Inc., to the University of Wisconsin–Madison Department of Ophthalmology and Visual Sciences, and from the NEI under award number 5K23EY030911-03. The content is solely the responsibility of the authors and does not necessarily represent the official views of the funding agencies.

References

1. Lynch SK, Abràmoff MD. Diabetic retinopathy is a neurodegenerative disorder. Vision Res 2017;139:101–107
2. Yau JW, Rogers SL, Kawasaki R, et al.; Meta-Analysis for Eye Disease (META-EYE) Study Group. Global prevalence and major risk factors of diabetic retinopathy. Diabetes Care 2012;35:556–564
3. Zhang X, Saaddine JB, Chou CF, et al. Prevalence of diabetic retinopathy in the United States, 2005–2008. JAMA 2010;304:649–656
4. Early Treatment Diabetic Retinopathy Study Research Group. Grading diabetic retinopathy from stereoscopic color fundus photographs: an extension of the modified Airlie House classification. ETDRS report number 10. Ophthalmology 1991;98(Suppl.):786–806
5. Early Treatment Diabetic Retinopathy Study Research Group. Fundus photographic risk factors for progression of diabetic retinopathy. ETDRS report number 12. Ophthalmology 1991;98(Suppl.):823–833
6. Flaxel CJ, Adelman RA, Bailey ST, et al. Diabetic retinopathy preferred practice pattern. Ophthalmology 2020;127:P66–P145
7. Will JC, German RR, Schuman E, Michael S, Kurth DM, Deeb L. Patient adherence to guidelines for diabetes eye care: results from the diabetic eye disease follow-up study. Am J Public Health 1994;84:1669–1671
8. Liu J, Gibson E, Ramchal S, et al. Diabetic retinopathy screening with automated retinal image analysis in a primary care setting improves adherence to ophthalmic care. Ophthalmol Retina 2021;5:71–77
9. Benoit SR, Swenor B, Geiss LS, Gregg EW, Saaddine JB. Eye care utilization among insured people with diabetes in the U.S., 2010–2014. Diabetes Care 2019;42:427–433
10. Abràmoff MD, Lavin PT, Birch M, Shah N, Folk JC. Pivotal trial of an autonomous AI-based diagnostic system for detection of diabetic retinopathy in primary care offices. NPJ Digit Med 2018;1:39
11. Ipp E, Liljenquist D, Bode B, et al.; EyeArt Study Group. Pivotal evaluation of an artificial intelligence system for autonomous detection of referrable and vision-threatening diabetic retinopathy. JAMA Netw Open 2021;4:e2134254
12. Rom Y, Aviv R, Ianchulev T, Dvey-Aharon Z. Predicting the future development of diabetic retinopathy using a deep learning algorithm for the analysis of non-invasive retinal imaging. BMJ Open Ophthalmol 2022;7
13. Abràmoff MD, Cunningham B, Patel B, et al.; Collaborative Community on Ophthalmic Imaging Executive Committee and Foundational Principles of Ophthalmic Imaging and Algorithmic Interpretation Working Group. Foundational considerations for artificial intelligence using ophthalmic images. Ophthalmology 2022;129:e14–e32
14. Abràmoff MD, Tobey D, Char DS. Lessons learned about autonomous AI: finding a safe, efficacious, and ethical path through the development process. Am J Ophthalmol 2020;214:134–142
15. American Diabetes Association. 11. Microvascular complications and foot care: Standards of Medical Care in Diabetes—2019. Diabetes Care 2019;42(Suppl. 1):S124–S138
16. Centers for Medicare & Medicaid Services. Quality ID #117 (NQF 0055): Diabetes: eye exam. Available from https://qpp.cms.gov/docs/QPP_quality_measure_specifications/Claims-Registry-Measures/2019_Measure_117_MedicarePartBClaims.pdf. Accessed 28 August 2023
17. National Committee for Quality Assurance. HEDIS measurement year 2020 and measurement year 2021: technical specifications for health plans. Available from https://www.ncqa.org/hedis/measures. Accessed 28 August 2023
18. Frank RA, Jarrin R, Pritzker J, et al. Developing current procedural terminology codes that describe the work performed by machines. NPJ Digit Med 2022;5:177
19. Abràmoff MD, Roehrenbeck C, Trujillo S, et al. A reimbursement framework for artificial intelligence in healthcare. NPJ Digit Med 2022;5:72
20. Wolf RM, Liu TYA, Thomas C, et al. The SEE Study: safety, efficacy, and equity of implementing autonomous artificial intelligence for diagnosing diabetic retinopathy in youth. Diabetes Care 2021;44:781–787
21. Nguyen HV, Tan GS, Tapp RJ, et al. Cost-effectiveness of a national telemedicine diabetic retinopathy screening program in Singapore. Ophthalmology 2016;123:2571–2580
22. Scotland GS, McNamee P, Philip S, et al. Cost-effectiveness of implementing automated grading within the national screening programme for diabetic retinopathy in Scotland. Br J Ophthalmol 2007;91:1518–1523
23. Wolf RM, Channa R, Abramoff MD, Lehmann HP. Cost-effectiveness of autonomous point-of-care diabetic retinopathy screening for pediatric patients with diabetes. JAMA Ophthalmol 2020;138:1063–1069
24. Xie Y, Nguyen QD, Hamzah H, et al. Artificial intelligence for teleophthalmology-based diabetic retinopathy screening in a national programme: an economic analysis modelling study. Lancet Digit Health 2020;2:e240–e249
25. Tufail A, Rudisill C, Egan C, et al. Automated diabetic retinopathy image assessment software: diagnostic accuracy and cost-effectiveness compared with human graders. Ophthalmology 2017;124:343–351
26. Channa R, Wolf RM, Abràmoff MD, Lehmann HP. Effectiveness of artificial intelligence screening in preventing vision loss from diabetes: a policy model. NPJ Digit Med 2023;6:53
27. Wolf RM, Abramoff MD, Channa R, Tava C, Clarida W, Lehmann HP. Potential reduction in healthcare carbon footprint by autonomous artificial intelligence. NPJ Digit Med 2022;5:62
28. Wu JH, Liu TYA, Hsu WT, Ho JH, Lee CC. Performance and limitation of machine learning algorithms for diabetic retinopathy screening: meta-analysis. J Med Internet Res 2021;23:e23863
29. Widner K, Virmani S, Krause J, et al. Lessons learned from translating AI from development to deployment in healthcare. Nat Med 2023;29:1304–1306
30. Rogers TW, Gonzalez-Bueno J, Garcia Franco R, et al. Evaluation of an AI system for the detection of diabetic retinopathy from images captured with a handheld portable fundus camera: the MAILOR AI study. Eye (Lond) 2021;35:632–638
31. Shou BL, Venkatesh K, Chen C, et al. Predictors for non-diagnostic images in real world deployment of artificial intelligence assisted diabetic retinopathy screening [Abstract]. Invest Ophthalmol Vis Sci 2022;63:1157
32. Ruan S, Liu Y, Hu WT, et al. A new handheld fundus camera combined with visual artificial intelligence facilitates diabetic retinopathy screening. Int J Ophthalmol 2022;15:620–627
33. Wang S, Jin K, Lu H, Cheng C, Ye J, Qian D. Human visual system-based fundus image quality assessment of portable fundus camera photographs. IEEE Trans Med Imaging 2016;35:1046–1055
34. Kanagasingam Y, Xiao D, Vignarajan J, Preetham A, Tay-Kearney ML, Mehrotra A. Evaluation of artificial intelligence-based grading of diabetic retinopathy in primary care. JAMA Netw Open 2018;1:e182665
35. Mehra AA, Softing A, Guner MK, Hodge DO, Barkmeier AJ. Diabetic retinopathy telemedicine outcomes with artificial intelligence-based image analysis, reflex dilation, and image overread. Am J Ophthalmol 2022;244:125–132
36. Watane A, Kalavar M, Vanner EA, Cavuoto K, Sridhar J. Follow-up adherence in patients with nonproliferative diabetic retinopathy presenting to an ophthalmic emergency department. Retina 2021;41:1293–1301
37. Mathenge W, Whitestone N, Nkurikiye J, et al. Impact of artificial intelligence assessment of diabetic retinopathy on referral service uptake in a low-resource setting: the RAIDERS randomized trial. Ophthalmol Sci 2022;2:100168
  • 38. Centers for Medicare & Medicaid Services . CY 2021 payment policies under the physician fee schedule and other changes to Part B payment policies. Available from https://www.federalregister.gov/documents/2020/08/17/2020-17127/medicare-program-cy-2021-payment-policies-under-the-physician-fee-schedule-and-other-changes-to-part. Accessed 28 August 2023
  • 39. Liu TYA, Huang J, Lehmann H, Wolf RM, Channa R, Abramoff MD. Autonomous artificial intelligence (AI) testing for diabetic eye disease (DED) closes care gap and improves health equity on a systems level [Abstract]. Diabetes 2023;72(Suppl. 1):261-OR
  • 40. Richardson JP, Smith C, Curtis S, et al. Patient apprehensions about the use of artificial intelligence in healthcare. NPJ Digit Med 2021;4:140
  • 41. Khullar D, Casalino LP, Qian Y, Lu Y, Krumholz HM, Aneja S. Perspectives of patients about artificial intelligence in health care. JAMA Netw Open 2022;5:e2210309
  • 42. Amann J, Blasimme A, Vayena E, Frey D; Precise4Q consortium. Explainability for artificial intelligence in healthcare: a multidisciplinary perspective. BMC Med Inform Decis Mak 2020;20:310
  • 43. Patel D, Ananthakrishnan A, Lin T, Channa R, Liu TYA, Wolf RM. Social determinants of health and impact on screening, prevalence, and management of diabetic retinopathy in adults: a narrative review. J Clin Med 2022;11:7120
  • 44. Shi Q, Zhao Y, Fonseca V, Krousel-Wood M, Shi L. Racial disparity of eye examinations among the U.S. working-age population with diabetes: 2002–2009. Diabetes Care 2014;37:1321–1328
  • 45. Thomas CG, Channa R, Prichett L, Liu TYA, Abramoff MD, Wolf RM. Racial/ethnic disparities and barriers to diabetic retinopathy screening in youths. JAMA Ophthalmol 2021;139:791–795
  • 46. Wang SY, Andrews CA, Gardner TW, Wood M, Singer K, Stein JD. Ophthalmic screening patterns among youths with diabetes enrolled in a large US managed care network. JAMA Ophthalmol 2017;135:432–438
  • 47. Zhang X, Cotch MF, Ryskulova A, et al. Vision health disparities in the United States by race/ethnicity, education, and economic status: findings from two nationally representative surveys. Am J Ophthalmol 2012;154(Suppl. 6):S53–S62.e1

Articles from Clinical Diabetes : A Publication of the American Diabetes Association are provided here courtesy of American Diabetes Association