JSES Int. 2020 Dec 15;5(2):198–204. doi: 10.1016/j.jseint.2020.10.019

Table III. Intraobserver agreement.

|  | X-ray | 2D CT | 3D CT | 3D models |
| --- | --- | --- | --- | --- |
| **Consultants and registrars (n = 7)** |  |  |  |  |
| Mean κ | 0.45 | 0.41 | 0.43 | 0.60 |
| Range of κ | 0.32-0.63 | 0.10-0.64 | 0.36-0.53 | 0.45-0.89 |
| 95% CI of κ | 0.36-0.54 | 0.25-0.57 | 0.37-0.48 | 0.46-0.73 |
| Agreement | Moderate | Moderate | Moderate | Moderate |
| Range of P values | P = .016 to P < .0001 | P = .201 to P < .0001 | P = .004 to P < .0001 | P = .0005 to P < .0001 |
| Mean % agreement | 70.5 | 65.6 | 65.5 | 75.0 |
| 95% CI of % agreement | 62.8-78.1 | 53.9-77.3 | 61.2-69.9 | 66.4-83.6 |
| Number of images | 30 | 30 | 30 | 30§ |
| **Consultants (n = 3)** |  |  |  |  |
| Mean κ | 0.46 | 0.45 | 0.39 | 0.69 |
| Range of κ | 0.42-0.48 | 0.33-0.56 | 0.36-0.44 | 0.51-0.89 |
| Agreement | Moderate | Moderate | Moderate | Substantial |
| Mean % agreement | 74.4 | 70.8 | 65.6 | 82.0 |
| **Registrars (n = 4)** |  |  |  |  |
| Mean κ | 0.44 | 0.37 | 0.45 | 0.52 |
| Range of κ | 0.32-0.63 | 0.10-0.64 | 0.40-0.53 | 0.45-0.60 |
| Agreement | Moderate | Fair | Moderate | Moderate |
| Mean % agreement | 67.5 | 61.7 | 65.5 | 69.8 |

2D, two-dimensional; 3D, three-dimensional; CI, confidence interval; CT, computed tomography.

Agreement was categorized using the Landis and Koch criteria (Table I).20

A P value of less than .05 indicates that kappa differed significantly from 0; the range of P values spans the kappa tests of the individual observers. A brief sketch of how κ and the agreement categories are computed follows the footnotes.

29 images for one observer.

§ 29 images for two observers.
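For readers who want to reproduce the agreement labels, the following is a minimal Python sketch, not code from the study. It assumes, as is typical for intraobserver studies, that κ is Cohen's kappa comparing an observer's first and second readings of the same images, and it uses the standard Landis and Koch cut-points (Table I); the function names are illustrative.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for paired categorical ratings,
    e.g. one observer's first and second classification of the same images."""
    n = len(ratings_a)
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n        # observed agreement
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    categories = set(count_a) | set(count_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)  # chance-expected agreement
    return (p_o - p_e) / (1 - p_e)

def landis_koch_label(kappa):
    """Landis and Koch (1977) descriptive categories for kappa."""
    if kappa < 0.00:
        return "Poor"
    if kappa <= 0.20:
        return "Slight"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Substantial"
    return "Almost perfect"

# The table's mean kappa values map to its agreement labels:
print(landis_koch_label(0.60))  # "Moderate"    (all observers, 3D models)
print(landis_koch_label(0.69))  # "Substantial" (consultants, 3D models)
print(landis_koch_label(0.37))  # "Fair"        (registrars, 2D CT)
```

Note that a mean κ of 0.60 sits exactly on the moderate/substantial boundary; under the Landis and Koch cut-points it falls in the moderate band, which matches the label reported for 3D models across all seven observers.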