Table 2. Sensitivity (Sen), specificity (Spec), and area under the curve (AUC) of computerized cognitive tests for detecting MCI, dementia, and cognitive impairment (CI), with head-to-head comparisons against paper-and-pencil tests.
| Test type | Sen (%) | Spec (%) | AUC | Computerized Tests vs. Paper-and-Pencil Tests | Whether Computerized Test Is Better |
|---|---|---|---|---|---|
| MCI | | | | | |
| Memory test | 42.0–85.8 | 66.0–93.3 | 0.53–0.93 | CANTAB-PAL vs. CERAD wordlist learning delayed recall | Inferior |
| | | | | MemTrax vs. MoCA-BJ | Better |
| | | | | Digital VSM vs. cube-copying test | Much better |
| | | | | Digital TPT vs. paper-and-pencil TPT | Better |
| Test battery | 41.4–100.0 | 64.0–100.0 | 0.65–0.97 | CANS-MCI vs. MoCA and ACE-R | Comparable |
| | | | | Subsets in NeuroTrax MindStreams vs. subsets in WMS-III, RAVLT, CDT, TMT-A, Boston Naming Test, and COWA | Comparable; some subsets even better |
| | | | | Memory factor in tablet-based cognitive assessments vs. MMSE | Inferior |
| | | | | BHA vs. MoCA | Better |
| | | | | CAMCI vs. MMSE | Better |
| | | | | COMCOG-CAT vs. CAMCOG | Comparable |
| Handwriting/drawing test | 71.4–100.0 | 56.0–100.0 | 0.77–0.89 | Machine learning on dCDT features vs. CERAD | Comparable |
| Daily living task and serious game | 76.9–84.4 | 58.0–88.9 | 0.77–0.90 | SASG vs. MoCA | Comparable |
| | | | | SIMBAC vs. MMSE and composite score of RAVLT delayed recall, Boston Naming Test, Digit Span, Digit Symbol Coding, and TMT-B | Comparable |
| Other single/multiple cognitive tests | 56.3–84.7 | 53.6–90.5 | 0.67–0.91 | e-CT vs. K-T CT | Comparable |
| Dementia | | | | | |
| Memory test | 88.9 | 92.9 | - | Digital TPT vs. paper-and-pencil TPT | Comparable |
| Test battery | 52.9–100.0 | 56.0–100.0 | 0.54–0.99 | CST vs. MMSE | Better |
| | | | | CCS vs. MoCA | Inferior |
| | | | | BHA vs. MoCA | Comparable |
| Handwriting/drawing test | 82.0–97.7 | 71.4–86.0 | 0.90–0.92 | dCDT parameters vs. CERAD | Comparable |
| Daily living task and serious game | 86.0 | 75.0 | 0.97 | SIMBAC vs. MMSE and composite score of RAVLT delayed recall, Boston Naming Test, Digit Span, Digit Symbol Coding, and TMT-B | Comparable |
| Other single/multiple cognitive tests | 62.7–86.1 | 75.0–95.3 | 0.76–0.95 | e-CT vs. K-T CT | Comparable |
| CI | | | | | |
| Memory test | 91.8 | 72.0 | 0.89 | - | - |
| Test battery | 70.7–91.0 | 69.0–94.2 | 0.78–0.95 | BHA vs. MoCA | Better |
| | | | | eSAGE vs. paper version of SAGE | Better |
| Handwriting/drawing test | 74.0–89.7 | 70.0–100.0 | 0.84–0.92 | Machine learning on dCDT vs. MMSE | Better |
| Daily living task and serious game | 70.0 | 82.0 | 0.84 | - | - |
| Other single/multiple cognitive tests | 77.0–97.0 | 80.6–92.6 | 0.77–0.97 | TMT vs. MMSE | Comparable |
| | | | | e-CT vs. K-T CT | Comparable |
Abbreviations. ACE-R: Addenbrooke’s Cognitive Examination-Revised; AUC: area under the receiver operating characteristic curve; BHA: Brain Health Assessment; CAMCI: Computer Assessment of Memory and Cognitive Impairment; CANTAB: Cambridge Neuropsychological Test Automated Battery; CDT: Clock Drawing Test; CERAD: Consortium to Establish a Registry for Alzheimer’s Disease; CI: cognitive impairment; COMCOG: Computer-assisted Cognitive Rehabilitation; COMCOG-CAT: Computer-assisted Cognitive Rehabilitation administered by Computerized Adaptive Testing; COWA: Controlled Oral Word Association Test; dCDT: digital Clock Drawing Test; e-CT: electronic version of the Cancellation Test; eSAGE: electronic version of the Self-Administered Gerocognitive Examination; MCI: mild cognitive impairment; MMSE: Mini-Mental State Examination; MoCA: Montreal Cognitive Assessment; MoCA-BJ: Beijing version of the Montreal Cognitive Assessment; PAL: Paired Associate Learning; RAVLT: Rey Auditory Verbal Learning Test; SAGE: Self-Administered Gerocognitive Examination; SASG: Smart Aging Smart Game; Sen: sensitivity; SIMBAC: SIMulation-Based Assessment of Cognition; Spec: specificity; TMT: Trail-Making Test; TMT-A: Trail-Making Test, Part A; TMT-B: Trail-Making Test, Part B; VSM: Visuo-spatial Memory task; WMS-III: Wechsler Memory Scale, 3rd edition.
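For readers less familiar with the metrics reported in Table 2, the short Python sketch below shows how sensitivity, specificity (at a single cutoff), and AUC are conventionally computed from individual test scores against a reference-standard diagnosis. The data, cutoff, and score direction are hypothetical illustrations only and are not drawn from any study in this review; the sketch assumes scikit-learn is available.

```python
# Illustrative sketch: deriving Sen, Spec, and AUC from hypothetical screening data.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

# 1 = impaired (e.g., MCI/dementia/CI) by the reference standard, 0 = cognitively normal
y_true = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 1])
# Continuous test score, here oriented so that higher values indicate greater impairment
score = np.array([0.9, 0.8, 0.4, 0.7, 0.2, 0.3, 0.6, 0.1, 0.2, 0.65])

# AUC: threshold-independent probability that a randomly chosen impaired case
# scores higher than a randomly chosen unimpaired case
auc = roc_auc_score(y_true, score)

# Sensitivity and specificity at a single (hypothetical) cutoff
cutoff = 0.5
y_pred = (score >= cutoff).astype(int)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # true-positive rate among impaired participants
specificity = tn / (tn + fp)   # true-negative rate among unimpaired participants

print(f"AUC = {auc:.2f}, Sen = {sensitivity:.1%}, Spec = {specificity:.1%}")
```

Because sensitivity and specificity depend on the chosen cutoff while AUC does not, the ranges in Table 2 partly reflect differences in the cutoffs adopted by the individual studies as well as differences between tests and samples.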