Table 7. Inter-reviewer agreement (kappa) for actual and fabricated explanations.
Explanations | Reviewer 1 vs. reviewer 2 | Reviewer 1 vs. reviewer 3 | Reviewer 2 vs. reviewer 3 | All reviewers |
---|---|---|---|---|
All | 0.87 | 0.49 | 0.33 | 0.57 |
Actual | 0.82 | 0.24 | 0.01 | 0.39 |
Fabricated | 0.93 | 0.70 | 0.63 | 0.75 |
Note: Agreement between pairs of reviewers is reported as Cohen's kappa; agreement across all reviewers is reported as Fleiss' kappa.
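For reference, both statistics share the same chance-corrected form; the standard definitions (stated here only as a reminder, not as this study's exact computation) are:

```latex
% Cohen's kappa for a pair of reviewers: observed agreement p_o
% corrected for the agreement p_e expected by chance.
\kappa_{\mathrm{Cohen}} = \frac{p_o - p_e}{1 - p_e}

% Fleiss' kappa for all reviewers: mean per-item pairwise agreement \bar{P}
% corrected for the chance agreement \bar{P}_e across categories.
\kappa_{\mathrm{Fleiss}} = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e}
```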