The Oncologist. 2024 Feb 3;29(5):407–414. doi: 10.1093/oncolo/oyae009

Table 3.

Cohen’s kappa coefficient for inter-rater reliability between the reviewers during the selection process.

| Domain | Cohen’s kappa | Agreement (%) | No. of responses |
|---|---|---|---|
| Reproducibility | 0.912 | 97.4 | 152 |
| Accuracy | 0.889 | 94.4 | 126 |
| Readability | 1.000 | 100 | 126 |
| Relevancy | 0.868 | 94.4 | 126 |
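For readers reproducing this type of reliability analysis, a minimal sketch follows, assuming Python with scikit-learn; the two reviewers' ratings shown are hypothetical placeholders, not the study data. Cohen's kappa corrects the observed percent agreement for the agreement expected by chance, which is why a domain can show high raw agreement but a lower kappa.

```python
# Illustrative sketch (not from the article): Cohen's kappa and raw percent
# agreement for two reviewers' binary ratings on the same set of responses.
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings (1 = criterion met, 0 = not met); placeholder data only.
reviewer_a = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
reviewer_b = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0]

# Chance-corrected agreement between the two reviewers.
kappa = cohen_kappa_score(reviewer_a, reviewer_b)

# Raw percent agreement, as reported in the table's Agreement (%) column.
agreement = 100 * sum(a == b for a, b in zip(reviewer_a, reviewer_b)) / len(reviewer_a)

print(f"Cohen's kappa: {kappa:.3f}")
print(f"Agreement: {agreement:.1f}%")
```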