
Table IV. Kappa coefficients for interrater reliability.

Average Kappa coefficients for interrater agreement comparing one experienced evaluator with four inexperienced evaluators. The Kappa analysis was repeated, progressively relaxing the criterion for concordance between evaluators from 0 to 4. The bolded Kappa values are those considered to indicate good or excellent interrater agreement (Rosner, 1995).

Criterion for interrater agreement (Δ SNAP):    0       1       2       3       4
Kappa (κ):                                      0.13    0.36    0.56    0.71    0.83
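For readers who wish to reproduce this type of analysis, the sketch below shows one way to compute a Kappa coefficient with a relaxed concordance criterion: two evaluators' SNAP counts are treated as concordant when they differ by no more than Δ, which amounts to a weighted Cohen's Kappa with a binary (tolerance) disagreement matrix. This is a minimal illustration under that assumption, not the authors' analysis code; the `tolerance_kappa` function, the example counts, and the single evaluator pairing (the table reports averages over four experienced/inexperienced pairings) are all hypothetical.

```python
import numpy as np

def tolerance_kappa(rater_a, rater_b, delta=0):
    """Weighted Cohen's kappa in which two ratings that differ by no more
    than `delta` count as agreement (delta=0 gives ordinary Cohen's kappa)."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    categories = np.union1d(a, b)            # all counts given by either rater
    index = {c: i for i, c in enumerate(categories)}

    # Observed joint proportions
    obs = np.zeros((len(categories), len(categories)))
    for x, y in zip(a, b):
        obs[index[x], index[y]] += 1
    obs /= obs.sum()

    # Expected proportions under chance agreement (product of the marginals)
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))

    # Binary disagreement weights: 0 if the counts differ by <= delta, else 1
    weights = (np.abs(categories[:, None] - categories[None, :]) > delta).astype(float)

    return 1.0 - (weights * obs).sum() / (weights * exp).sum()

# Hypothetical SNAP counts from one experienced and one inexperienced evaluator
experienced   = [12, 8, 15, 10, 9, 14, 11, 7]
inexperienced = [13, 8, 10, 10, 11, 14, 9, 7]

for delta in range(5):
    kappa = tolerance_kappa(experienced, inexperienced, delta)
    print(f"Delta SNAP = {delta}: kappa = {kappa:.2f}")
```

With `delta=0` the weight matrix is simply one minus the identity, so the function reduces to the ordinary unweighted Cohen's Kappa; larger values of `delta` correspond to the progressively relaxed concordance criteria shown in the table.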