
Table II. Interobserver agreement

                      X-ray       2D CT       3D CT       3D models
Overall (consultants and registrars, n = 14)
 Kappa
  κ                   0.29        0.30        0.35        0.47
  95% CI              0.26-0.31   0.27-0.33   0.33-0.38   0.44-0.50
  Agreement           Fair        Fair        Fair        Moderate
  P value             <.001       <.001       <.001       <.001
 % agreement
  %                   57.2        57.8        58.8        66.5
  95% CI              55.1-59.3   55.5-60.2   56.7-60.9   64.6-68.4
 Number of images     30          29          28          26
Consultants (n = 7)
 Kappa
  κ                   0.26        0.37        0.30        0.48
  95% CI              0.20-0.32   0.31-0.43   0.24-0.36   0.42-0.55
  Agreement           Fair        Fair        Fair        Moderate
  P value             <.001       <.001       <.001       <.001
 % agreement
  %                   56.0        62.9        57.1        66.8
  95% CI              50.2-61.9   56.3-69.4   52.0-62.3   62.3-71.3
 Number of images     30          29          30          27
Registrars (n = 7)
 Kappa
  κ                   0.35        0.25        0.39        0.51
  95% CI              0.29-0.40   0.19-0.31   0.34-0.45   0.44-0.57
  Agreement           Fair        Fair        Fair        Moderate
  P value             <.001       <.001       <.001       <.001
 % agreement
  %                   60.3        54.1        60.1        68.3
  95% CI              57.3-63.4   49.0-59.3   55.1-65.0   64.4-72.3
 Number of images     30          30          28          29

2D, two-dimensional; 3D, three-dimensional; CI, confidence interval; CT, computed tomography.

Agreement has been defined using the Landis and Koch criteria (Table I).20

A P value of less than .05 indicates that the kappa value is statistically significantly different from 0.
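For readers who want to reproduce this kind of analysis, the following Python sketch shows how a multirater kappa and the mean pairwise percent agreement can be computed and mapped onto the Landis and Koch bands. It is an illustration only: the synthetic rating matrix, the four-category coding, and the use of Fleiss' kappa from statsmodels are assumptions rather than the authors' actual data or statistical pipeline, and the confidence intervals and P values reported in the table are not computed here.

```python
# Illustrative sketch only: synthetic ratings, not the study data.
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa


def landis_koch(kappa: float) -> str:
    """Map a kappa value onto the Landis and Koch agreement categories."""
    if kappa < 0.00:
        return "Poor"
    if kappa <= 0.20:
        return "Slight"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    if kappa <= 0.80:
        return "Substantial"
    return "Almost perfect"


# Rows = images, columns = observers, values = integer-coded classification.
# 30 images and 14 observers mirror the "Overall" group; the 4 categories
# and the random assignments are assumptions for illustration.
rng = np.random.default_rng(0)
ratings = rng.integers(0, 4, size=(30, 14))

# Aggregate per-image category counts, then compute Fleiss' multirater kappa.
counts, _ = aggregate_raters(ratings)
kappa = fleiss_kappa(counts, method="fleiss")

# Mean pairwise percent agreement across all observer pairs.
n_images, n_raters = ratings.shape
pair_agree = np.mean([
    np.mean(ratings[:, i] == ratings[:, j])
    for i in range(n_raters) for j in range(i + 1, n_raters)
])

print(f"kappa = {kappa:.2f} ({landis_koch(kappa)}), "
      f"% agreement = {100 * pair_agree:.1f}")
```

Bootstrapping over images would be one way to approximate confidence intervals of the kind reported in the table.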