Table 3.
Studies | Population | Objective | Device | Intervention | Results |
---|---|---|---|---|---|
Kudva et al., 2018 [44] | India, >24 y, n = 102 | Develop a decision support system for cervical cancer screening with an inbuilt image-processing algorithm. | Android device with a 13-Mpx camera. | 102 images. Reference = expert evaluation. | Accuracy 97.9%, Se 99.0%, Sp 97.1%, AUC NR. |
Bae et al., 2020 [45] | South Korea, >20 y, n = 20 | Develop a new cervical cancer screening technique and implement a machine-learning algorithm using images taken during VIA with a smartphone-based endoscope. | Smartphone-based endoscope. | 40 images (2 per patient). Expert evaluation vs. AI. Reference = histopathology. | AI accuracy 78.3%, Se 75.8%, Sp 80.3%, AUC 0.805. Clinicians’ mean accuracy 77.5%, Se 62.5%, Sp 100%, AUC NR. |
Xue Z. et al., 2020 [43] | Various countries, >18 y, n = 3221 | Evaluate the accuracy of automated visual evaluation (AVE) on smartphone images. | MobileODT system (smartphone with lens). | 7587 images. Reference = expert evaluation. | Accuracy NR, Se NR, Sp NR, AUC 0.87 (95% CI 0.81–0.92). |
Viñals et al., 2021 [46] | Cameroon, Switzerland, 30–49 y, n = 44 | Develop a smartphone-based algorithm to detect cervical precancer from the dynamics of aceto-whitening. | Samsung Galaxy S5. | 44 dynamic images. Expert evaluation vs. AI. Reference = histology. | AI accuracy 89%, Se 90%, Sp 87%, AUC NR. Clinicians’ mean accuracy 71%, Se 68%, Sp 78%, AUC NR. |
Abbreviations: AI (artificial intelligence), AUC (area under the curve), CI (confidence interval), Mpx (megapixels), NR (not reported), Se (sensitivity), Sp (specificity), VIA (visual inspection with acetic acid).
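For orientation, the diagnostic metrics reported in Table 3 follow the standard confusion-matrix definitions; these are general definitions, not taken from the methods of the cited studies:

$$
\text{Se} = \frac{TP}{TP + FN}, \qquad
\text{Sp} = \frac{TN}{TN + FP}, \qquad
\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN}
$$

where TP, TN, FP, and FN denote true positives, true negatives, false positives, and false negatives with respect to each study’s reference standard, and AUC is the area under the receiver operating characteristic curve obtained by varying the classification threshold.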