J Med Internet Res. 2015 Jun 2;17(6):e135. doi: 10.2196/jmir.3831

Table 5. Interrater agreement: percent agreement versus Fleiss' kappa (κ).

Criteria                 Percent agreement (%)   Fleiss' κ   Interpretation
Authority                92.59                    .745       Substantial agreement
Complementarity          79.63                   –.113       Poor agreement
Privacy                  85.19                    .614       Substantial agreement
Reference (attribution)  88.89                    .756       Substantial agreement
Justifiability           74.07                    .463       Moderate agreement
Contact details          95.37                    .471       Moderate agreement
Financial disclosure     87.04                    .716       Substantial agreement
Advertising policy       85.19                    .691       Substantial agreement
Date (attribution)       79.63                    .492       Moderate agreement
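The table illustrates why percent agreement and Fleiss' κ can diverge: κ corrects for chance agreement, so a criterion such as Complementarity can show ~80% raw agreement yet a negative κ when raters' marginal distributions are skewed. A minimal sketch of the standard Fleiss' κ computation is below; the input matrix shape and helper name are illustrative, not taken from the paper's analysis code.

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a subjects-by-categories count matrix.

    ratings[i][j] = number of raters who assigned subject i to category j;
    every row must sum to the same number of raters n.
    """
    N = len(ratings)                 # number of subjects
    n = sum(ratings[0])              # raters per subject (constant across rows)
    k = len(ratings[0])              # number of categories
    # Marginal proportion of all assignments falling in each category.
    p = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    # Mean per-subject observed agreement.
    P_bar = sum(
        (sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings
    ) / N
    # Expected chance agreement from the category marginals.
    P_e = sum(pj * pj for pj in p)
    return (P_bar - P_e) / (1 - P_e)


# Perfect agreement between 2 raters on 3 items -> kappa = 1.0
print(fleiss_kappa([[2, 0], [0, 2], [2, 0]]))
# Complete disagreement on every item -> kappa = -1.0
print(fleiss_kappa([[1, 1], [1, 1]]))
```

Because κ is normalized against the chance-agreement term `P_e`, heavily imbalanced categories inflate `P_e` and can push κ near zero or below even when raw agreement looks high (the "kappa paradox"), which is consistent with the Complementarity row above.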