Table IV. Kappa coefficients for interrater reliability.
Average Kappa coefficients for interrater agreement comparing one experienced evaluator with four inexperienced evaluators. The Kappa analysis was repeated, progressively relaxing the criterion for concordance between evaluators from 0 to 4. The bolded Kappa values are those considered to indicate good or excellent interrater agreement (Rosner, 1995).
| | Criterion for interrater agreement (Δ SNAP) |||||
|---|---|---|---|---|---|
| | 0 | 1 | 2 | 3 | 4 |
| κ | 0.13 | 0.36 | **0.56** | **0.71** | **0.83** |
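As a minimal sketch of the analysis described in the caption, the snippet below computes a Cohen's kappa in which two ratings count as concordant when they differ by at most Δ points, and repeats the calculation for Δ = 0 to 4. The scoring scale, the rater data, and the function name `kappa_within_delta` are assumptions for illustration only; they are not taken from the original study.

```python
# Sketch only: kappa with a relaxed concordance criterion (|a - b| <= delta).
# The SNAP scores below are hypothetical, not the study's data.
from collections import Counter

def kappa_within_delta(rater_a, rater_b, delta=0):
    """Cohen's kappa where ratings are concordant if they differ by <= delta."""
    n = len(rater_a)
    assert n == len(rater_b) and n > 0

    # Observed proportion of concordant rating pairs.
    p_obs = sum(abs(a - b) <= delta for a, b in zip(rater_a, rater_b)) / n

    # Chance-expected concordance from the two raters' marginal distributions.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_exp = sum(
        (freq_a[x] / n) * (freq_b[y] / n)
        for x in freq_a
        for y in freq_b
        if abs(x - y) <= delta
    )

    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical scores from one experienced and one inexperienced evaluator.
experienced   = [10, 12, 8, 15, 9, 11]
inexperienced = [12, 12, 6, 14, 10, 14]
for delta in range(5):
    print(delta, round(kappa_within_delta(experienced, inexperienced, delta), 2))
```

Relaxing Δ increases both the observed and the chance-expected concordance, so kappa does not rise automatically; the increasing values in the table reflect genuinely closer agreement once small scoring differences are tolerated.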