2023 Dec 14;16:369. doi: 10.1186/s13104-023-06637-z

Table 4.

Rater reliability

                                     ICC                      Kappa
Intra-rater reliability (n = 14)     0.97                     0.85
Inter-rater reliability (n = 18)     0.99 (CI 0.988–0.998)    0.77
p-value                              p < 0.001                p < 0.001

ICC, intraclass correlation coefficient; CI, confidence interval. Weighted kappa was used to calculate intra-rater reliability. Inter-rater reliability was calculated as the mean of all weighted kappas across patient cases.
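
The averaging of weighted kappas described in the footnote can be sketched as follows. This is a minimal illustration with hypothetical data (the table does not report per-case scores) and assumes linear weighting, since the weighting scheme is not stated here.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical example data: ordinal scores from two raters for each patient case.
# Each inner list holds the scores one rater assigned to the items of that case.
cases = [
    {"rater_a": [2, 3, 1, 4], "rater_b": [2, 3, 2, 4]},
    {"rater_a": [1, 1, 3, 2], "rater_b": [1, 2, 3, 2]},
]

# Weighted kappa per patient case (linear weights assumed; the table does not specify).
kappas = [
    cohen_kappa_score(c["rater_a"], c["rater_b"], weights="linear")
    for c in cases
]

# Inter-rater reliability reported as the mean of all weighted kappas across cases.
mean_kappa = np.mean(kappas)
print(f"Mean weighted kappa across cases: {mean_kappa:.2f}")
```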