
Table 7.

Interreviewer agreement scores

Explanations   Reviewer 1 vs. reviewer 2   Reviewer 1 vs. reviewer 3   Reviewer 2 vs. reviewer 3   All reviewers
All            0.87                        0.49                        0.33                        0.57
Actual         0.82                        0.24                        0.01                        0.39
Fabricated     0.93                        0.70                        0.63                        0.75

Note: Agreement between pairs of reviewers is reported as Cohen's kappa coefficient; agreement across all reviewers is reported as Fleiss' kappa statistic.
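
For readers who want to reproduce this kind of analysis, the sketch below shows one way to compute pairwise Cohen's kappa and overall Fleiss' kappa for three reviewers using scikit-learn and statsmodels; it is not the authors' code, and the ratings array is a hypothetical stand-in rather than the study data.

```python
# Minimal sketch (not the authors' code): pairwise Cohen's kappa and
# overall Fleiss' kappa for three reviewers' binary judgments.
import numpy as np
from sklearn.metrics import cohen_kappa_score
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# One row per rated item, one column per reviewer (illustrative labels only).
ratings = np.array([
    [1, 1, 1],
    [0, 0, 1],
    [1, 1, 0],
    [0, 0, 0],
    [1, 0, 1],
])

# Pairwise agreement: Cohen's kappa for each pair of reviewers.
for i, j in [(0, 1), (0, 2), (1, 2)]:
    kappa = cohen_kappa_score(ratings[:, i], ratings[:, j])
    print(f"Reviewer {i + 1} vs. reviewer {j + 1}: kappa = {kappa:.2f}")

# Agreement across all reviewers: Fleiss' kappa on category counts per item.
table, _ = aggregate_raters(ratings)
print(f"All reviewers: Fleiss' kappa = {fleiss_kappa(table):.2f}")
```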