
Table 1.

Inter-scorer agreement between scorers A, B, and C and intra-scorer agreement (after a 4-week interval) for general movements (GM) categories (Kappa values) and for the checklist (ICC).

| | Kappa (normal vs. abnormal) | Kappa (categories) | ICC (95% confidence interval) |
|---|---|---|---|
| **Inter-scorer agreement** | | | |
| Scorer A – Scorer B | 0.70 (good^a) | 0.78 (good^a) | 0.77 (0.51–0.90) (good^b) |
| Scorer A – Scorer C | 0.80 (good^a) | 0.69 (good^a) | 0.85 (0.66–0.94) (good^b) |
| Scorer B – Scorer C | 0.68 (good^a) | 0.56 (considerable^a) | 0.78 (0.52–0.91) (good^b) |
| All scorers | 0.68–0.80 (good^a) | 0.56–0.78 (considerable to good^a) | 0.80 (0.63–0.91) (good^b) |
| **Intra-scorer agreement** | | | |
| Scorer A | 0.90 (excellent^a) | 0.85 (excellent^a) | 0.89 (0.76–0.96) (good^b) |
| Scorer B | 1.00 (excellent^a) | 0.93 (excellent^a) | 0.96 (0.91–0.99) (excellent^b) |
| Scorer C | 0.89 (excellent^a) | 0.69 (good^a) | 0.89 (0.75–0.96) (good^b) |

ICC, intraclass correlation coefficient; K, Cohen's Kappa.

^a According to Landis et al.^22
^b According to Portney et al.^21
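
The agreement statistics reported above can be reproduced with standard libraries. The sketch below uses hypothetical ratings (the study's raw scores are not given in this table); scikit-learn's `cohen_kappa_score` and pingouin's `intraclass_corr` are generic implementations standing in for whatever software the authors actually used.

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score
import pingouin as pg

# Hypothetical GM classifications from two scorers over the same infants
# (illustrative only; not the study's data)
scorer_a = ["normal", "abnormal", "normal", "abnormal", "normal", "normal"]
scorer_b = ["normal", "abnormal", "normal", "normal",   "normal", "normal"]

# Cohen's Kappa for categorical agreement, as in the Kappa columns
kappa = cohen_kappa_score(scorer_a, scorer_b)
print(f"Cohen's Kappa: {kappa:.2f}")

# ICC for the checklist scores, as in the ICC column; pingouin returns
# all six ICC forms with 95% confidence intervals in one DataFrame
checklist = pd.DataFrame({
    "infant": [1, 2, 3, 4] * 3,
    "scorer": ["A"] * 4 + ["B"] * 4 + ["C"] * 4,
    "score":  [10, 14, 8, 12, 11, 13, 9, 12, 10, 15, 8, 11],
})
icc = pg.intraclass_corr(data=checklist, targets="infant",
                         raters="scorer", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])
```

Note that `intraclass_corr` reports several ICC forms; which form and model the authors used is not stated in the table, so the choice among them here is left to the reader.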