2020 Oct 28;11(5):714–724. doi: 10.1055/s-0040-1717084

Table 1. Interrater reliability and agreement for singular taxonomy.

Data source                   Kappa   95% CI (lower)   95% CI (upper)   Agreement (%)
ERS reports
  Calibration subset
    Inclusion (n = 80)        0.702   0.531            0.874            87.5
    Classification (n = 19)   0.432   0.192            0.672            52.6
  Evaluation subset
    Inclusion (n = 230)       0.606   0.501            0.710            82.1
    Classification (n = 47)   0.688   0.581            0.795            87.8
HD tickets
  Calibration subset
    Inclusion (n = 79)        0.413   0.142            0.685            84.8
    Classification (n = 61)   0.277   0.138            0.417            44.3
  Evaluation subset
    Inclusion (n = 244)       0.688   0.535            0.841            94.3
    Classification (n = 212)  0.556   0.462            0.649            73.6

Abbreviations: ERS, event reporting system; HD, help desk.
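
The kappa values in Table 1 are presumably Cohen's kappa for two raters; the article's methods are not reproduced in this excerpt, so treat that as an assumption. As a minimal illustrative sketch of how the table's two statistics relate, the snippet below computes raw percent agreement and Cohen's kappa for a pair of raters' inclusion judgments. The rater data are invented for illustration, not drawn from the study.

```python
from collections import Counter

def percent_agreement(a, b):
    """Raw agreement: share of items both raters labeled identically."""
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: agreement corrected for chance, for two raters."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    counts_a, counts_b = Counter(a), Counter(b)
    # Chance agreement: probability both raters pick the same label
    # if each labeled independently at their own marginal rates.
    p_e = sum((counts_a[k] / n) * (counts_b[k] / n) for k in set(a) | set(b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical inclusion judgments for two raters (not the study's data).
rater1 = ["include", "include", "exclude", "include", "exclude", "include"]
rater2 = ["include", "exclude", "exclude", "include", "exclude", "include"]
print(f"Agreement = {percent_agreement(rater1, rater2):.1f}%")   # 83.3%
print(f"Kappa     = {cohens_kappa(rater1, rater2):.3f}")         # 0.667
```

The chance correction explains why raw agreement and kappa can diverge in the table: for example, HD ticket inclusion in the calibration subset shows high agreement (84.8%) but only moderate kappa (0.413), because with skewed label marginals much of the raw agreement is expected by chance alone.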