Table 2. Interreader Agreement of Manual Annotations and Reader–Algorithm Agreement for Object Detection-Based Algorithm.
| Metric | Interreader, ‘gland’ | Interreader, ‘eyelid’ | Reader 1: Algorithm, ‘gland’ | Reader 1: Algorithm, ‘eyelid’ | Reader 2: Algorithm, ‘gland’ | Reader 2: Algorithm, ‘eyelid’ |
|---|---|---|---|---|---|---|
| aAJI | 0.5487 (0.5310, 0.5620) | 0.8774 (0.8688, 0.8834) | 0.4294 (0.4132, 0.4545) | — | 0.4102 (0.3912, 0.4372) | — |
| DSC | 0.6230 (0.6033, 0.6404) | 0.9340 (0.9302, 0.9384) | 0.5860 (0.5694, 0.6041) | — | 0.5516 (0.5268, 0.5687) | — |
| Kappa | 0.6224 (0.6033, 0.6404) | 0.9228 (0.9182, 0.9286) | 0.5848 (0.5662, 0.5998) | — | 0.5503 (0.5309, 0.5716) | — |
| ICC | 0.7264 (0.7123, 0.7343) | 0.9240 (0.9195, 0.9291) | 0.5943 (0.5793, 0.6154) | — | 0.5613 (0.5368, 0.5787) | — |
Each value represents the mean with a two-sided 95% confidence interval.
aAJI = average aggregated Jaccard index; DSC = Dice similarity coefficient; ICC = intraclass correlation coefficient; Kappa = Cohen’s Kappa coefficient.
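For reference, the overlap metrics in the table can be computed per binary mask as in this minimal sketch (the aAJI additionally matches and aggregates the Jaccard index over individual object instances, which is not shown here; function names and array inputs are illustrative, not taken from the study's code):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient (DSC) between two binary masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    # DSC = 2|A ∩ B| / (|A| + |B|); define empty-vs-empty as perfect agreement
    return 2.0 * intersection / total if total else 1.0

def jaccard_index(mask_a, mask_b):
    """Jaccard index (IoU) between two binary masks."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0
```

For example, for masks `[1, 1, 0]` and `[1, 0, 0]`, the intersection is 1 pixel and the union is 2, giving a Jaccard index of 0.5 and a DSC of 2/3; the DSC is always at least as large as the Jaccard index for the same pair of masks.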