
Table 1. Interpretation of Kappa statistic and Williams' index.

Statistic   Value       Interpretation
Kappa       1.00        Total agreement
            0.75-1.00   Excellent level of agreement
            0.40-0.75   Fairly good to good level of agreement
            0.00-0.40   Poor level of agreement, which could be considered due to chance
            0.00        Agreement entirely due to chance
            < 0.00      Agreement even lower than that expected by chance
Williams'   > 1.00      Agreement between the isolated expert and the group of experts is greater than the agreement among members of the group
            1.00        Agreement between the isolated expert and the group of experts is equal to the agreement among members of the group
            0.00-1.00   Agreement between the isolated expert and the group of experts is less than the agreement among members of the group
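
To make the interpretation bands in Table 1 concrete, the following is a minimal Python sketch (not taken from the paper) of one common formulation: Cohen's kappa between two raters, a Williams'-index-style ratio comparing an isolated rater against a group, and a helper that maps a kappa value onto the verbal bands above. The choice of simple proportion agreement as the pairwise measure, the function names, and the example data are assumptions made purely for illustration.

```python
from collections import Counter

def cohen_kappa(a, b):
    """Cohen's kappa between two raters' label sequences of equal length."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    cats = set(ca) | set(cb)
    p_e = sum((ca[c] / n) * (cb[c] / n) for c in cats)   # chance agreement
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

def interpret_kappa(k):
    """Map a kappa value onto the verbal bands of Table 1."""
    if k < 0.0:
        return "Agreement even lower than that expected by chance"
    if k < 0.40:
        return "Poor level of agreement (could be due to chance)"
    if k < 0.75:
        return "Fairly good to good level of agreement"
    if k < 1.0:
        return "Excellent level of agreement"
    return "Total agreement"

def proportion_agreement(a, b):
    """Simple pairwise agreement; an assumed stand-in for any pairwise measure."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def williams_index(isolated, group):
    """Williams' index of one rater against a group of raters.

    Numerator: mean agreement of the isolated rater with each group member.
    Denominator: mean pairwise agreement among the group members themselves.
    A value above 1.00 means the isolated rater agrees with the group more
    than the group members agree with one another (see Table 1).
    """
    n = len(group)
    num = sum(proportion_agreement(isolated, g) for g in group) / n
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    den = sum(proportion_agreement(group[i], group[j]) for i, j in pairs) / len(pairs)
    return num / den

# Hypothetical example: three group raters and one isolated rater on ten binary labels
group = [
    [1, 0, 1, 1, 0, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 0, 1, 0, 0, 1, 1],
    [1, 1, 1, 1, 0, 1, 0, 0, 1, 0],
]
isolated = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]

k = cohen_kappa(group[0], group[1])
print(f"kappa = {k:.2f}: {interpret_kappa(k)}")           # kappa = 0.80: Excellent ...
print(f"Williams' index = {williams_index(isolated, group):.2f}")  # > 1.00 here
```

In this sketch the isolated rater's Williams' index comes out above 1.00, which, per Table 1, would indicate that the isolated expert agrees with the group more than the group members agree among themselves.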