Table 3.
Inter-rater agreement for single images and for the full examination, for both readers and the dCNN compared with ground truth
| Image set | Comparison | Accuracy [%] | κ (95% CI) | Agreement |
|---|---|---|---|---|
| Single images | dCNN/actuals | 79.7 | 0.57 (0.50–0.64) | Moderate |
| | Reader 1/actuals | 80.5 | 0.61 (0.53–0.68) | Substantial |
| | Reader 2/actuals | 72.7 | 0.44 (0.36–0.51) | Moderate |
| | Reader 1/Reader 2 | 71.9 | 0.43 (0.28–0.58) | Moderate |
| Full imageset | dCNN/actuals | 90.9 | 0.82 (0.69–0.95) | Almost perfect |
| | Reader 1/actuals | 81.8 | 0.63 (0.47–0.81) | Substantial |
| | Reader 2/actuals | 90.9 | 0.82 (0.69–0.95) | Almost perfect |
| | Reader 1/Reader 2 | 72.7 | 0.46 (0.09–0.82) | Moderate |
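The κ column reports Cohen's kappa, and the agreement labels follow the conventional Landis–Koch bands (0.41–0.60 moderate, 0.61–0.80 substantial, 0.81–1.00 almost perfect). A minimal sketch of the computation, using hypothetical rating labels rather than the study's data:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_observed - p_expected) / (1 - p_expected),
    where p_expected comes from each rater's marginal label frequencies.
    """
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labeled identically
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected chance agreement from the marginal frequencies of each rater
    counts_a = Counter(ratings_a)
    counts_b = Counter(ratings_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical labels for illustration only (not the study's ratings)
reader = ["pos", "pos", "neg", "neg"]
truth  = ["pos", "pos", "neg", "pos"]
print(cohens_kappa(reader, truth))  # → 0.5
```

Accuracy alone can look high simply because one class dominates; κ discounts that chance agreement, which is why it is reported alongside accuracy in the table.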