2017 Sep 18;17:309. doi: 10.1186/s12884-017-1503-5

Table 3.

Accuracy and inter-rater reliability

| Measure | Percentage of agreement | ICC (95% CI) | Unweighted IRR (95% CI) c |
|---------|------------------------|--------------|---------------------------|
| Green   | 98.33%  | — a                | 0.94 (0.93–1.00) |
| Yellow  | 68.33%  | 0.50 (0.15–0.99) b | 0.70 (0.67–0.85) |
| Orange  | 96.67%  | — a                | 0.83 (0.82–0.88) |
| Red     | 100%    | — a                | 0.97 (0.96–1.00) |
| Overall | 90.83%  | 0.961 (0.91–0.99)  | 0.85 (0.85–0.89) |

a Unable to compute ICC due to low variance; scores are too highly consistent

b Number should be interpreted with caution due to low variance and large CIs

c Light's kappa statistic used to compute unweighted IRR

d Bootstrap-t utilised to adjust confidence intervals [1]
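As footnote c notes, the unweighted IRR was computed with Light's kappa, which is simply the mean of Cohen's kappa over every pair of raters. The sketch below illustrates the idea in pure Python; the function names and the toy rating sequences are illustrative assumptions, not the study's data or code.

```python
from itertools import combinations

def cohen_kappa(a, b):
    """Unweighted Cohen's kappa for two raters' label sequences."""
    labels = sorted(set(a) | set(b))
    n = len(a)
    # Observed proportion of agreement
    po = sum(x == y for x, y in zip(a, b)) / n
    # Expected agreement by chance, from each rater's marginal proportions
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)
    if pe == 1.0:
        # Degenerate case: no variance in either rater's scores
        # (the table's footnote a reflects the analogous ICC problem)
        return 1.0
    return (po - pe) / (1 - pe)

def lights_kappa(ratings):
    """Light's kappa: average pairwise Cohen's kappa across all raters."""
    pairs = list(combinations(ratings, 2))
    return sum(cohen_kappa(a, b) for a, b in pairs) / len(pairs)

# Toy example with three raters classifying four cases
r1 = ["green", "yellow", "red", "green"]
r2 = ["green", "yellow", "red", "green"]
r3 = ["green", "orange", "red", "green"]
print(round(lights_kappa([r1, r2, r3]), 3))
```

Percentage of agreement (the table's second column) is just `po` above, averaged over rater pairs, which is why it can stay high even when low variance makes kappa or the ICC unstable.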