Table 3.

| | PA (95% CI) | K (95% CI) | AC1 | K_w (95% CI) | AC1_w | ICC (95% CI)^g) |
|---|---|---|---|---|---|---|
| Interrater-reliability | | | | | | |
| 2 categories^a) | 0.96 (0.87-1.0) | 0.88 (0.65-1.0)^c) | 0.93 (0.80-1.0) | NA | NA | NA |
| 3 categories^b) | 0.96 (0.87-1.0) | 0.89 (0.66-1.0)^d) | NA | 0.93 (0.76-1.0)^d) | 0.98 (0.95-1.0) | 0.93 (0.84-0.97) |
| Total score | 0.67 (0.47-0.87) | 0.53 (0.31-0.75) | NA | 0.91 (0.84-0.99)^c) | 0.98 (0.96-1.0) | 0.92 (0.80-0.96) |
| Intrarater-reliability | | | | | | |
| 2 categories^a) | 0.88 (0.74-1.0) | 0.65 (0.26-1.0)^e) | 0.81 (0.57-1.0) | NA | NA | NA |
| 3 categories^b) | 0.83 (0.68-0.99) | 0.57 (0.21-0.92)^f) | NA | 0.62 (0.19-1.0)^f) | 0.88 (0.72-1.0) | 0.63 (0.30-0.82) |
| Total score | 0.67 (0.47-0.87) | 0.56 (0.33-0.78) | NA | 0.82 (0.66-0.98) | 0.95 (0.91-1.0) | 0.84 (0.67-0.93) |
a) 2-category classification = no/low vs moderate/high risk.
b) 3-category classification = no/low vs moderate vs high risk.
c) Kmax = 0.88, prevalence-adjusted bias-adjusted kappa (PABAK) = 0.92.
d) Kmax (weighted or unweighted) = 0.89, PABAK = 0.98.
e) Kmax = 0.88, PABAK = 0.75.
f) Kmax (weighted or unweighted) = 0.89, PABAK = 0.85.
g) Two-way mixed model for single measures, absolute agreement criterion.
PA, proportion exact agreement; K, kappa; K_w, weighted kappa (quadratic weights); ICC, intraclass correlation coefficient; AC1, Gwet's first-order agreement coefficient; AC1_w, Gwet's first-order agreement coefficient (quadratic weights); NA, not applicable.
Data in boldface indicate the most appropriate statistic in relation to the respective MEONF-II outcomes.
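To illustrate how the statistics in Table 3 relate, the sketch below computes PA, Cohen's kappa, and PABAK for two raters' categorical ratings. The rating data are hypothetical (not the study sample); they are chosen only so that the resulting values fall near the 2-category interrater row (PA 0.96, PABAK 0.92). PABAK replaces the marginal-based chance agreement of kappa with a fixed 1/k for k categories, which is why it can exceed kappa when category prevalence is skewed.

```python
# Illustrative sketch with hypothetical data; not the article's dataset.

def agreement_stats(r1, r2):
    """Return (PA, Cohen's kappa, PABAK) for two raters' categorical ratings."""
    n = len(r1)
    # PA: proportion of exact agreement
    pa = sum(a == b for a, b in zip(r1, r2)) / n
    cats = sorted(set(r1) | set(r2))
    # Chance agreement from each rater's marginal proportions (Cohen's kappa)
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    kappa = (pa - pe) / (1 - pe)
    # PABAK: kappa with chance agreement fixed at 1/k for k categories
    k = len(cats)
    pabak = (k * pa - 1) / (k - 1)
    return pa, kappa, pabak

# Hypothetical example: 25 patients rated no/low (0) vs moderate/high (1) risk,
# with one disagreement.
rater1 = [0] * 18 + [1] * 6 + [0]
rater2 = [0] * 18 + [1] * 6 + [1]
pa, kappa, pabak = agreement_stats(rater1, rater2)
print(f"PA={pa:.2f}, kappa={kappa:.2f}, PABAK={pabak:.2f}")
# → PA=0.96, kappa=0.90, PABAK=0.92
```

With skewed marginals (19 vs 6 and 18 vs 7), kappa sits below PA while PABAK, which assumes 50/50 chance agreement for two categories, lands between them.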