PLOS ONE. 2024 Feb 29;19(2):e0280105. doi: 10.1371/journal.pone.0280105

Table 2. Kappa coefficient’s agreement interpretation.

Kappa coefficient*   Interpretation
0                    No agreement
0.01–0.20            Slight agreement
0.21–0.40            Fair agreement
0.41–0.60            Moderate agreement
0.61–0.80            Substantial agreement
0.81–0.99            Near-perfect agreement
1                    Perfect agreement

* Kappa value calculation: κ = (p_o − p_e) / (1 − p_e)

p_o: Relative observed agreement between self-testing and testing by a professional user

p_e: Hypothetical probability of chance agreement
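
To illustrate the footnote formula, the minimal sketch below computes Cohen's kappa from two sets of binary test results; the function name and the example ratings are hypothetical, invented only to show how p_o and p_e enter the calculation.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: k = (p_o - p_e) / (1 - p_e)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # p_o: relative observed agreement (fraction of identical ratings)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # p_e: hypothetical probability of chance agreement, computed from
    # each rater's marginal label frequencies
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[label] * freq_b[label] for label in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical results: 1 = positive, 0 = negative
self_test    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
professional = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1]
print(round(cohens_kappa(self_test, professional), 2))
# -> 0.8, "substantial agreement" per Table 2
```

In this toy example the raters agree on 9 of 10 results (p_o = 0.9) and the chance-agreement term works out to p_e = 0.5, giving κ = (0.9 − 0.5) / (1 − 0.5) = 0.8.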