doi: 10.1186/s40662-024-00376-3

Table 1.

A summary of artificial intelligence (AI) applications in the diagnosis of keratoconus and dry eye diseases, in reverse chronological order

Year Authors Imaging modality Sample size (eyes) Study population Outcome measures AI algorithms Diagnostic performance Validation method
Keratoconus
 2023 Lu et al. [15] Pentacam, SD-OCT, APT 599 Healthy, FF, early, advanced KC eyes KC detection RF/CNN AUC: 0.801–0.902 Hold-out validation
 2023 Kundu et al. [111] AS-OCT 1125 Healthy, VAE and KC eyes KC detection RF AUC: 0.994–0.976, Acc: 95.5%–95.6%, Sens: 71.5%–98.5%, Precis: 91.2%–92.7% Hold-out validation
 2022 Cohen et al. [112] Galilei 8526 Healthy, suspect and KC eyes KC detection RF AUC: 0.964–0.969, Acc: 90.2%–91.5%, Sens: 94.2%–94.7%, Spec: 89.6%–89.8% Hold-out validation
 2022 Almeida Jr et al. [113] Pentacam 2893 Healthy, VAE and KC eyes KC detection BESTi MLRA AUC: 0.91, Sens: 86.02%, Spec: 83.97% Hold-out validation
 2022 Reddy et al. [114] Oculyzer 1331 Healthy and KC eyes Prediction of latent progression of KC CNN Progression detected 11.1 months earlier than KP (P < 0.001) Hold-out validation
 2022 Gao et al. [115] Pentacam 208 Healthy, subclinical and KC eyes Subclinical and KC detection KeratoScreen ANN Sens: 93.9%–97.6%, Precis: 95.1%–96.1% Hold-out validation
 2022 Xu et al. [116] Pentacam 1108 Healthy, VAE and KC eyes Detection of healthy eye in VAE KerNet CNN Acc: 94.67%, AUC: 0.985 Hold-out validation
 2022 Gairola et al. [117] SmartKC 57 Healthy and KC eyes KC detection CNN Sens: 91.3%, Spec: 94.2% Hold-out validation
 2022 Lu et al. [65] SD-OCT, APT 622 Healthy, FF, early, advanced KC eyes KC detection RF/CNN AUC: 0.99, Sens: 75%, Spec: 94.74% Hold-out validation
 2022 Subramaniam et al. [118] Pentacam 900 Healthy, subclinical and KC eyes KC detection and grading PSO, GoogLeNet CNN Acc: 95.9%, Spec: 97.0%, Sens: 94.1% Hold-out validation
 2022 Mohammadpour et al. [12] Pentacam, Sirius, OPD-Scan III Corneal Navigator 200 Healthy, subclinical and KC eyes KC detection RF Subclinical KC – Acc: 88.7%, Sens: 84.6%, Spec: 90.0%; KC – Acc: 91.2%, Sens: 80.0%, Spec: 96.6% (based on Sirius Phoenix) N.A.
Dry eye diseases
 2023 Shimizu et al. [54] ASV 158 Healthy and DED eyes DED grading based on TBUT ImageNet-22k CNN Acc: 78.9%, AUC: 0.877, Sens: 77.8%, Spec: 85.7% Hold-out validation
 2023 Abdelmotaal et al. [52] ASV 244 Healthy and DED eyes DED detection CNN AUC: 0.98 Hold-out validation
 2022 Fineide et al. [51] ASV 431 Patients with DED DED grading based on TBUT RF Sens: 99.8%, Precis: 99.8%, Acc: 99.8% Cross validation
 2022 Edorh et al. [119] AS-OCT 118 Healthy and DED eyes Epithelial changes as a marker of DED RF Sens: 86.4%, Spec: 91.7% N.A.
 2021 Chase et al. [44] AS-OCT 151 Healthy and DED eyes DED detection VGG19 CNN Acc: 84.62%, Sens: 86.36%, Spec: 82.35% Hold-out validation
 2021 Elsawy et al. [120] AS-OCT 879 Healthy and various anterior segment eye diseases DED detection VGG19 CNN AUC: 0.90–0.99 Hold-out validation
 2020 Maruoka et al. [62] HRT-3 confocal microscopy 221 Healthy and obstructive MGD eyes Obstructive MGD detection Multiple CNNs AUC: 0.96, Sens: 94.2%, Spec: 82.1% Hold-out validation
 2020 da Cruz et al. [56] Doane interferometer 106 VOPTICAL_GCU database of tear film images Classification of tear film lipid layer SVM, RF, NBC, MLP, RBFNetwork, random tree Acc: 97.54%, AUC: 0.99, κ: 0.96 Cross validation
 2020 Stegmann et al. [121] AS-OCT 6658 Healthy eye images Tear meniscus segmentation TBSA CNN Sens: 96.4%, Spec: 99.9%, Jaccard index: 93.2% Cross validation
 2019 Wang et al. [122] Keratograph 5M 209 Healthy and DED eyes Segmentation of meibomian gland, grading of meiboscore CNN Acc: 95.4%–97.6%, IoU: 66.7%–95.5% Hold-out validation
 2018 Arita et al. [55] DR-1α tear interferometer 100 Healthy and DED eyes DED detection and grading Interferometric movies κ: 0.76, Acc: 76.2%–95.4% N.A.

Acc = accuracy; ANN = artificial neural network; APT = air puff tonometry; AS-OCT = anterior-segment optical coherence tomography; ASV = anterior segment videography; AUC = area under the curve; CNN = convolutional neural network; DED = dry eye disease; FF = forme fruste keratoconus; IoU = intersection over union; IVCM = in vivo confocal microscopy; κ = kappa index; KC = keratoconus; KP = keratometric progression; MGD = meibomian gland disease; MLP = multilayer perceptron; N.A. = not available; NBC = naïve Bayes classifier; Precis = precision; RF = random forest; SD-OCT = spectral-domain optical coherence tomography; Sens = sensitivity; Spec = specificity; SVM = support vector machines; TBSA = threshold-based algorithm; TBUT = tear breakup time; VAE = very asymmetric eyes (fellow to KC eyes)

The Jaccard index is a statistical measure of the similarity between two sample sets
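The Jaccard index is the same quantity as the intersection over union (IoU) reported for the segmentation studies above. As a minimal illustrative sketch (not taken from any of the reviewed studies, with made-up toy masks and a hypothetical function name), the following Python snippet shows how it is computed for two binary segmentation masks, for example a predicted tear-meniscus mask and a manually annotated reference mask:

import numpy as np

def jaccard_index(pred, truth):
    """Jaccard index / IoU: |A ∩ B| / |A ∪ B| for two same-shape binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return np.logical_and(pred, truth).sum() / union

# Toy 4 x 4 example: the prediction covers one extra pixel relative to the reference
predicted = np.array([[0, 1, 1, 0],
                      [0, 1, 1, 0],
                      [0, 0, 0, 0],
                      [0, 0, 0, 0]])
reference = np.array([[0, 1, 1, 0],
                      [0, 1, 0, 0],
                      [0, 0, 0, 0],
                      [0, 0, 0, 0]])
print(f"Jaccard index: {jaccard_index(predicted, reference):.3f}")  # prints 0.750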