Appendix 1—table 2. Reviewer agreement on simulated data.
Reviewers were highly consistent with one another on the simulated data: interrater reliability was extremely high, with a vanishingly small p-value (Fleiss' kappa, three reviewers; p < 4.9 × 10⁻³²⁴).
| | Unanimous | 2/3 agree | No consensus | P value |
|---|---|---|---|---|
| Onset (N = 240) | 192 | 44 | 4 | p < 4.9 × 10⁻³²⁴ |
| Offset (N = 240) | 189 | 51 | 0 | p < 4.9 × 10⁻³²⁴ |
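Fleiss' kappa, the statistic behind the p-values above, can be sketched in a few lines of pure Python. This is a minimal illustration of the formula only; the `ratings` matrix below is made-up toy data, not the study's reviewer ratings.

```python
def fleiss_kappa(counts):
    """Fleiss' kappa for counts[i][j] = number of raters assigning
    item i to category j (every item rated by the same number of raters)."""
    n_items = len(counts)
    n_raters = sum(counts[0])  # raters per item (three reviewers here)
    total = n_items * n_raters

    # Mean per-item observed agreement P_i.
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ) / n_items

    # Chance agreement P_e from the marginal category proportions.
    n_cats = len(counts[0])
    p_e = sum(
        (sum(row[j] for row in counts) / total) ** 2
        for j in range(n_cats)
    )
    return (p_bar - p_e) / (1 - p_e)

# Toy example: 3 raters, 4 items, 2 categories (e.g. "onset present"/"absent").
ratings = [
    [3, 0],  # unanimous
    [3, 0],  # unanimous
    [2, 1],  # 2/3 agree
    [0, 3],  # unanimous
]
print(round(fleiss_kappa(ratings), 3))  # → 0.625
```

A kappa near 1 indicates agreement far above chance; the associated significance test asks whether kappa differs from 0, which with near-unanimous ratings over 240 items drives the p-value down to the limit of double-precision representation.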