BMJ Paediatr Open. 2018 Aug 10;2(1):e000304. doi: 10.1136/bmjpo-2018-000304

Table 3.

Level of agreement in predicting ultimate management

| Predictors | Kappa coefficient (95% CI) | Interpretation of level of agreement |
|---|---|---|
| Final radiographic report | 0.62 (0.30 to 0.94) | Good |
| Clinicians’ predictions of the presence of fracture | 0.67 (0.36 to 0.98) | Good |
| Clinicians’ predictions of ultimate management | 0.88 (0.64 to 1) | Excellent |

Guidelines for interpreting agreement from κ coefficient values rely on arbitrary divisions. Given the subjective nature of our study, we chose stricter thresholds: poor (κ≤0.40), fair (κ=0.41–0.60), good (κ=0.61–0.75) and excellent (κ>0.75).14
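
As a minimal sketch, the study's κ interpretation thresholds can be written as a simple classification rule and applied to the point estimates reported in table 3 (the function name `interpret_kappa` is illustrative and not part of the paper):

```python
def interpret_kappa(kappa: float) -> str:
    """Map a kappa coefficient to the agreement category used in this study."""
    if kappa <= 0.40:
        return "poor"
    elif kappa <= 0.60:
        return "fair"
    elif kappa <= 0.75:
        return "good"
    else:
        return "excellent"

# Point estimates from table 3
for predictor, kappa in [
    ("Final radiographic report", 0.62),
    ("Clinicians' predictions of the presence of fracture", 0.67),
    ("Clinicians' predictions of ultimate management", 0.88),
]:
    print(f"{predictor}: kappa = {kappa:.2f} -> {interpret_kappa(kappa)}")
```

Note that only the point estimates are classified here; the 95% CIs in table 3 span more than one category for each predictor.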