Table 1. Inter-scorer and intra-scorer agreement (Cohen's kappa and intraclass correlation coefficient, ICC).
| | Kappa (normal vs. abnormal) | Kappa (categories) | ICC (95% confidence interval) |
|---|---|---|---|
| Inter-scorer agreement | | | |
| Scorer A – Scorer B | K = 0.70 (good^a) | K = 0.78 (good^a) | ICC = 0.77 (0.51–0.90) (good^b) |
| Scorer A – Scorer C | K = 0.80 (good^a) | K = 0.69 (good^a) | ICC = 0.85 (0.66–0.94) (good^b) |
| Scorer B – Scorer C | K = 0.68 (good^a) | K = 0.56 (considerable^a) | ICC = 0.78 (0.52–0.91) (good^b) |
| All scorers | K = 0.68–0.80 (good^a) | K = 0.56–0.78 (considerable–good^a) | ICC = 0.80 (0.63–0.91) (good^b) |
| Intra-scorer agreement | | | |
| Scorer A | K = 0.90 (excellent^a) | K = 0.85 (excellent^a) | ICC = 0.89 (0.76–0.96) (good^b) |
| Scorer B | K = 1.00 (excellent^a) | K = 0.93 (excellent^a) | ICC = 0.96 (0.91–0.99) (excellent^b) |
| Scorer C | K = 0.89 (excellent^a) | K = 0.69 (good^a) | ICC = 0.89 (0.75–0.96) (good^b) |
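The agreement statistics reported in Table 1 (Cohen's kappa for pairwise categorical agreement and ICC for overall consistency) can be computed with standard tools. The sketch below is illustrative only: the ratings, variable names, and choice of ICC model are assumptions for demonstration, not the study's data or analysis code.

```python
import pandas as pd
from sklearn.metrics import cohen_kappa_score
import pingouin as pg

# Hypothetical category labels per recording (illustrative only, not study data)
scorer_a = [0, 1, 2, 0, 3, 1, 0, 2, 1, 3]
scorer_b = [0, 1, 1, 0, 3, 1, 0, 2, 2, 3]

# Pairwise inter-scorer agreement on categories: unweighted Cohen's kappa
kappa_ab = cohen_kappa_score(scorer_a, scorer_b)
print(f"Kappa A-B (categories): {kappa_ab:.2f}")

# Dichotomised agreement (category 0 = normal vs. anything else = abnormal)
kappa_dich = cohen_kappa_score([int(x > 0) for x in scorer_a],
                               [int(x > 0) for x in scorer_b])
print(f"Kappa A-B (normal vs. abnormal): {kappa_dich:.2f}")

# ICC across scorers: long-format table with one row per (recording, scorer)
ratings = pd.DataFrame({
    "recording": list(range(len(scorer_a))) * 2,
    "scorer":    ["A"] * len(scorer_a) + ["B"] * len(scorer_b),
    "score":     scorer_a + scorer_b,
})
icc = pg.intraclass_corr(data=ratings, targets="recording",
                         raters="scorer", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])
```

The `intraclass_corr` call returns estimates for several ICC models (ICC1–ICC3 and their averaged-rater variants); which row corresponds to the values in Table 1 depends on the ICC model the scorers' study actually used, which is not specified in this section.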