
Table 3.

Interrater Agreement: Kappa Statistics

Comparison     Kappa      Agreement   Expected agreement

Panel A: Unweighted kappa statistic
SAH vs. IAH    0.128**    33.0%       23.2%
SAH vs. PAH    0.095**    37.3%       30.7%
PAH vs. IAH    0.085**    34.8%       28.8%

Panel B: Weighted kappa statistic
SAH vs. IAH    0.271**    77.9%       69.7%
SAH vs. PAH    0.176**    80.3%       76.1%
PAH vs. IAH    0.154**    78.7%       74.9%

Notes: The weighted kappa is based on the weights 1 - |i - j|/(k - 1), where i and j index the rows and columns of the two raters' ratings and k is the number of possible ratings. The following interpretation of kappa statistics is commonly used (Landis and Koch, 1977): 0.00-0.20 = slight; 0.21-0.40 = fair; 0.41-0.60 = moderate; 0.61-0.80 = substantial; and 0.81-1.00 = almost perfect.

** p < 0.01
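
For reference, kappa relates the table's two agreement columns as kappa = (observed - expected)/(1 - expected); for example, Panel A's SAH vs. IAH entry is (0.330 - 0.232)/(1 - 0.232) ≈ 0.128, and Panel B's is (0.779 - 0.697)/(1 - 0.697) ≈ 0.271. The sketch below (not from the article; function names and the use of NumPy are illustrative) reproduces this relation and implements a linearly weighted kappa using the weights given in the notes:

```python
import numpy as np

def kappa_from_agreement(p_obs: float, p_exp: float) -> float:
    """Cohen's kappa from observed and expected agreement proportions."""
    return (p_obs - p_exp) / (1.0 - p_exp)

def weighted_kappa(counts: np.ndarray) -> float:
    """Linearly weighted kappa from a k x k contingency table of rating
    counts, with weights w_ij = 1 - |i - j|/(k - 1) as in the table notes."""
    k = counts.shape[0]
    p = counts / counts.sum()                 # joint rating proportions
    i, j = np.indices((k, k))
    w = 1.0 - np.abs(i - j) / (k - 1)         # linear agreement weights
    row, col = p.sum(axis=1), p.sum(axis=0)   # marginal distributions
    p_obs = (w * p).sum()                     # weighted observed agreement
    p_exp = (w * np.outer(row, col)).sum()    # weighted agreement by chance
    return (p_obs - p_exp) / (1.0 - p_exp)

# Recovering the table's SAH vs. IAH kappas from its agreement columns:
print(round(kappa_from_agreement(0.330, 0.232), 3))  # Panel A: 0.128
print(round(kappa_from_agreement(0.779, 0.697), 3))  # Panel B: 0.271
```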