Table 4. Observer agreement on vertebral fracture diagnosis
Intraobserver Agreement

| Level | Statistic | Radiologist 1 | Radiologist 2 | Radiologist 3 |
|---|---|---|---|---|
| Vertebral level | Agreement, % (95% CI) | 98 (97, 99) | 97 (96, 98) | 96 (95, 97) |
| | κ (95% CI) | 0.726 (0.620, 0.833) | 0.668 (0.566, 0.771) | 0.529 (0.402, 0.655) |
| | Weighted κ (95% CI) | 0.731 (0.637, 0.826) | 0.678 (0.597, 0.758) | 0.632 (0.531, 0.733) |
| Patient level | Agreement, % (95% CI) | 92 (87, 97) | 89 (83, 95) | 84 (77, 91) |
| | κ (95% CI) | 0.767 (0.612, 0.922) | 0.642 (0.443, 0.842) | 0.528 (0.316, 0.740) |
| | Weighted κ (95% CI) | 0.766 (0.637, 0.895) | 0.778 (0.639, 0.917) | 0.678 (0.515, 0.841) |

Interobserver Agreement

| Level | Statistic | Radiologist 1 vs. 2 | Radiologist 1 vs. 3 | Radiologist 2 vs. 3 |
|---|---|---|---|---|
| Vertebral level | Agreement, % (95% CI) | 97 (96, 97) | 96 (95, 97) | 96 (95, 96) |
| | κ (95% CI) | 0.532 (0.430, 0.633) | 0.455 (0.348, 0.562) | 0.548 (0.461, 0.635) |
| | Weighted κ (95% CI) | 0.531 (0.452, 0.610) | 0.481 (0.393, 0.570) | 0.568 (0.496, 0.641) |
| Patient level | Agreement, % (95% CI) | 85 (80, 90) | 82 (76, 87) | 82 (76, 87) |
| | κ (95% CI) | 0.486 (0.310, 0.661) | 0.433 (0.261, 0.605) | 0.470 (0.308, 0.631) |
| | Weighted κ (95% CI) | 0.573 (0.434, 0.713) | 0.525 (0.384, 0.666) | 0.592 (0.457, 0.728) |
CI = confidence interval; κ = Cohen's kappa. Vertebral level indicates determination of whether individual vertebrae were fractured; patient level indicates determination of whether a patient had one or more vertebral fractures.
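The table does not specify how these statistics were calculated. As a minimal, hypothetical sketch (not the study's actual analysis code), raw percent agreement, Cohen's κ, and weighted κ can be computed with scikit-learn's `cohen_kappa_score`. The rating arrays below are simulated placeholders, the linear weighting scheme is an assumption, and the bootstrap CI shown is one common choice; the CI method used in the study is not stated.

```python
# Sketch of the agreement statistics reported in Table 4, on simulated data.
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Hypothetical paired ordinal fracture grades (0 = none ... 3 = severe)
# from two readings; rater_b agrees with rater_a ~80% of the time.
rater_a = rng.integers(0, 4, size=200)
rater_b = np.where(rng.random(200) < 0.8, rater_a, rng.integers(0, 4, size=200))

def kappa_with_ci(y1, y2, weights=None, n_boot=2000, alpha=0.05, seed=0):
    """Cohen's kappa (optionally weighted) with a bootstrap percentile CI."""
    point = cohen_kappa_score(y1, y2, weights=weights)
    boot_rng = np.random.default_rng(seed)
    n = len(y1)
    boots = []
    for _ in range(n_boot):
        idx = boot_rng.integers(0, n, size=n)  # resample rated items with replacement
        boots.append(cohen_kappa_score(y1[idx], y2[idx], weights=weights))
    lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return point, lo, hi

# The three rows reported for each level in Table 4: percent agreement,
# unweighted kappa, and (here, linearly) weighted kappa.
agreement = 100 * np.mean(rater_a == rater_b)
k, k_lo, k_hi = kappa_with_ci(rater_a, rater_b)
wk, wk_lo, wk_hi = kappa_with_ci(rater_a, rater_b, weights="linear")

print(f"Agreement: {agreement:.0f}%")
print(f"kappa: {k:.3f} ({k_lo:.3f}, {k_hi:.3f})")
print(f"Weighted kappa: {wk:.3f} ({wk_lo:.3f}, {wk_hi:.3f})")
```

Weighted κ credits partial agreement between adjacent ordinal grades (e.g., mild vs. moderate), which is why the weighted values in the table tend to exceed the unweighted ones when disagreements are mostly one grade apart.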