Table 3.
Quality assessment: inter-rater reliability.
| Fleiss' kappa | Standard error | Confidence Interval | Agreement | z | p-value |
|---|---|---|---|---|---|
| 0.76893 | 0.091287 | 0.72238–0.81549 | Substantial | 8.4233 | <0.001 |

Note: Null hypothesis rejected; the observed agreement is not due to chance.
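For reference, a kappa value like the one in Table 3 is obtained from the standard Fleiss formula over a subjects-by-categories matrix of rating counts. A minimal sketch follows; the example matrices are illustrative only, not the study's actual rating data:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a subjects x categories count matrix.

    ratings[i][j] = number of raters who assigned subject i to
    category j; every row must sum to the same rater count n.
    """
    N = len(ratings)          # number of subjects
    n = sum(ratings[0])       # raters per subject
    k = len(ratings[0])       # number of categories

    # Per-subject observed agreement P_i
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings]
    P_bar = sum(P) / N

    # Marginal category proportions and chance agreement P_e
    p = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    P_e = sum(pj * pj for pj in p)

    return (P_bar - P_e) / (1 - P_e)


# Illustrative matrices (hypothetical data):
# perfect agreement across 2 subjects, 3 raters, 2 categories -> kappa = 1
print(fleiss_kappa([[3, 0], [0, 3]]))
```

A kappa of 0.76893 falls in the 0.61–0.80 band conventionally labeled "substantial" agreement.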