2021 Apr 7;1:100013. doi: 10.1016/j.nbas.2021.100013

Table 3.

Quality assessment: inter-rater reliability.

Fleiss' kappa        0.76893
Error                0.091287
Confidence interval  0.72238–0.81549
Agreement            Substantial
z                    8.4233
p-value              <0.001

Reject null hypothesis: the observed agreement is not accidental.
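For context, a Fleiss' kappa of 0.76893 falls in the "substantial agreement" band (0.61–0.80) of the Landis and Koch scale. The statistic itself can be computed directly from a subjects-by-categories count table; the sketch below is a minimal pure-Python illustration, not the authors' analysis code, and the example count table is hypothetical.

```python
def fleiss_kappa(table):
    """Fleiss' kappa for a subjects-by-categories count table.

    table[i][j] = number of raters who assigned subject i to category j;
    every row must sum to the same number of raters n.
    """
    N = len(table)                       # number of subjects
    n = sum(table[0])                    # raters per subject
    k = len(table[0])                    # number of categories

    # Per-subject observed agreement P_i
    P_i = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in table]
    P_bar = sum(P_i) / N                 # mean observed agreement

    # Chance agreement P_e from the marginal category proportions
    p_j = [sum(row[j] for row in table) / (N * n) for j in range(k)]
    P_e = sum(p * p for p in p_j)

    return (P_bar - P_e) / (1 - P_e)

# Perfect agreement across 3 raters and 2 categories yields kappa = 1.0
print(fleiss_kappa([[3, 0], [0, 3]]))  # → 1.0
```

The z statistic reported in the table is kappa divided by its standard error (0.76893 / 0.091287 ≈ 8.4233), which is then compared against the standard normal distribution to obtain the p-value.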