Database (Oxford). 2022 Mar 6;2022:baac009. doi: 10.1093/database/baac009

Table 2.

Summary of inter-curator agreement results

Number of testers: 3
Number of cases: 100
Number of decisions: 300

Percent agreement
  Testers 1&3 pairwise agreement: 81%
  Testers 1&2 pairwise agreement: 84%
  Testers 2&3 pairwise agreement: 97%
  Average pairwise percent agreement: 87.333%

Fleiss’ Kappa
  Observed agreement: 0.873
  Expected agreement: 0.367
  Fleiss’ Kappa: 0.8

Cohen’s Kappa (CK)
  Testers 1&3 pairwise CK: 0.703
  Testers 1&2 pairwise CK: 0.749
  Testers 2&3 pairwise CK: 0.952
  Average pairwise CK: 0.801
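
As a check on the derived figures in Table 2, the short sketch below recomputes them from the reported pairwise values. This is an illustrative verification only, not code from the paper: the raw per-case decisions are not given, so Fleiss’ Kappa is derived from the reported observed and expected agreement, and all variable names are ours.

```python
# Recompute the derived statistics in Table 2 from the reported summary values.

# Pairwise percent agreement for the three tester pairs (from Table 2).
pairwise_pct = {"1&3": 0.81, "1&2": 0.84, "2&3": 0.97}
avg_pct = sum(pairwise_pct.values()) / len(pairwise_pct)
print(f"Average pairwise percent agreement: {avg_pct:.3%}")  # 87.333%

# Fleiss' Kappa = (P_obs - P_exp) / (1 - P_exp),
# using the observed and expected agreement reported in Table 2.
p_obs, p_exp = 0.873, 0.367
fleiss_kappa = (p_obs - p_exp) / (1 - p_exp)
print(f"Fleiss' Kappa: {fleiss_kappa:.2f}")  # ~0.80

# Average of the three pairwise Cohen's Kappa values.
pairwise_ck = {"1&3": 0.703, "1&2": 0.749, "2&3": 0.952}
avg_ck = sum(pairwise_ck.values()) / len(pairwise_ck)
print(f"Average pairwise Cohen's Kappa: {avg_ck:.3f}")  # 0.801
```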