PLoS ONE. 2014 Jun 3;9(6):e98654. doi: 10.1371/journal.pone.0098654

Table 2. Exact agreement and rank correlation for all raters and for raters according to specialty.

Value                All raters   Surgeons   Geriatricians   p value
                     (n = 41)     (n = 32)   (n = 9)
Fleiss' kappa (κ)    0.471        0.491      0.390           0.0452
Kendall's W          0.847        0.850      0.847           0.909
Kendall's tau (τ)    0.797        0.801      0.782           0.333

Fleiss' kappa (κ): exact inter-rater agreement; Kendall's W: relative agreement among raters; Kendall's tau (τ): relative agreement with a standard value.
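
The three statistics can be computed from raw ratings with standard libraries. The following is a minimal Python sketch using hypothetical ratings and a hypothetical reference standard (not the study's data); Fleiss' kappa and Kendall's tau come from statsmodels and scipy, while Kendall's W is taken from its textbook formula without a tie correction.

import numpy as np
from scipy.stats import kendalltau, rankdata
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical ratings: rows = cases, columns = raters, values = ordinal category
ratings = np.array([
    [2, 2, 3, 2],
    [1, 1, 1, 2],
    [3, 3, 3, 3],
    [2, 3, 2, 2],
    [1, 2, 1, 1],
])

# Fleiss' kappa: exact agreement on category assignments
counts, _ = aggregate_raters(ratings)            # cases x categories count table
kappa = fleiss_kappa(counts, method='fleiss')

# Kendall's W: concordance of the raters' rankings of the cases (ties not corrected)
n_cases, n_raters = ratings.shape
ranks = np.apply_along_axis(rankdata, 0, ratings)  # rank cases within each rater
rank_sums = ranks.sum(axis=1)
s = ((rank_sums - rank_sums.mean()) ** 2).sum()
w = 12 * s / (n_raters ** 2 * (n_cases ** 3 - n_cases))

# Kendall's tau: rank correlation of each rater with a reference ("standard") value
standard = np.array([2, 1, 3, 2, 1])             # hypothetical reference standard
taus = []
for j in range(n_raters):
    tau, _ = kendalltau(ratings[:, j], standard)
    taus.append(tau)

print(f"Fleiss' kappa = {kappa:.3f}, Kendall's W = {w:.3f}, mean tau = {np.mean(taus):.3f}")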