Table 3. Cohen’s kappa coefficients for inter-rater reliability between the reviewers during the selection process.
Domain | Cohen’s kappa | Agreement (%) | No. of responses |
---|---|---|---|
Reproducibility | 0.912 | 97.4 | 152 |
Accuracy | 0.889 | 94.4 | 126 |
Readability | 1.000 | 100.0 | 126 |
Relevancy | 0.868 | 94.4 | 126 |
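For reference, the reported values follow the standard definition of Cohen’s kappa, which compares the observed agreement between the two raters, $p_o$, with the agreement expected by chance, $p_e$ (symbols here are the conventional ones, not notation introduced elsewhere in this paper):

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

A kappa of 1 therefore indicates perfect agreement (as for Readability), while values above roughly 0.8 are generally interpreted as almost perfect agreement.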