West J Emerg Med. 2016 Jul 26;17(5):574–584. doi: 10.5811/westjem.2016.6.30825

Table 5.

Inter-rater agreement on the quality indicator subscore components. ICCs (intraclass correlation coefficients) were calculated using a two-way random-effects model for consistency.

| Question item number | Single-measure ICC*** (95% CI) | Average-measure ICC*** (95% CI) | Number of missing data points | % Missing |
|---|---|---|---|---|
| Q1* | 0.04 (0.02–0.08) | 0.64 (0.47–0.79) | 202 | 13% |
| Q2* | 0.03 (0.01–0.07) | 0.56 (0.35–0.74) | 193 | 12% |
| Q3 | 0.17 (0.12–0.26) | 0.89 (0.84–0.94) | 206 | 13% |
| Q4* | 0.12 (0.07–0.19) | 0.84 (0.76–0.90) | 208 | 13% |
| Q5* | 0.10 (0.06–0.16) | 0.81 (0.71–0.89) | 713 | 45% |
| Q6** | 0.28 (0.20–0.39) | 0.94 (0.91–0.96) | 476 | 30% |
| Q7 | 0.38 (0.28–0.50) | 0.96 (0.94–0.98) | 216 | 14% |
| Q8** | 0.22 (0.15–0.32) | 0.92 (0.89–0.95) | 773 | 48% |
| Q9** | 0.16 (0.11–0.25) | 0.88 (0.82–0.93) | 465 | 29% |
| Q10 | 0.22 (0.14–0.32) | 0.92 (0.87–0.95) | 287 | 18% |
| Q11 | 0.17 (0.11–0.26) | 0.89 (0.83–0.93) | 290 | 18% |
| Q12 | 0.29 (0.21–0.41) | 0.95 (0.92–0.97) | 319 | 20% |
| Q13* | 0.14 (0.09–0.22) | 0.87 (0.80–0.92) | 285 | 18% |
* Eliminated in Score Models 1 and 2 due to alpha <0.85 or single-measure ICC <0.15.

** Eliminated in Score Model 2 because trainees were unsure too often (>25% missing data).

*** p < 0.001 for all ICCs calculated.
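For reference, the single- and average-measure consistency ICCs reported in this table are conventionally derived from two-way ANOVA mean squares, i.e., McGraw and Wong's ICC(C,1) and ICC(C,k). The sketch below illustrates that calculation; the ratings matrix, rater count, and function name are illustrative assumptions, not the study's data or analysis code, and it assumes a complete ratings matrix rather than the missing-data handling used in the study.

```python
# Minimal sketch: single- and average-measure consistency ICCs from a
# two-way ANOVA decomposition (McGraw & Wong, 1996). Synthetic data only.
import numpy as np

def consistency_icc(ratings: np.ndarray) -> tuple[float, float]:
    """ratings: n_subjects x k_raters matrix with no missing cells."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1, keepdims=True)   # per-subject means
    col_means = ratings.mean(axis=0, keepdims=True)   # per-rater means

    # Between-subjects mean square
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    # Residual (subject x rater interaction) mean square
    resid = ratings - row_means - col_means + grand
    ms_error = np.sum(resid ** 2) / ((n - 1) * (k - 1))

    icc_single = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)  # ICC(C,1)
    icc_average = (ms_rows - ms_error) / ms_rows                        # ICC(C,k)
    return icc_single, icc_average

# Toy example: 6 subjects rated by 4 raters (hypothetical values)
rng = np.random.default_rng(0)
subject_effect = rng.normal(0, 1, size=(6, 1))
ratings = subject_effect + rng.normal(0, 1, size=(6, 4))
print(consistency_icc(ratings))
```

As in the table, the average-measure ICC is always at least as high as the single-measure ICC for the same item, since averaging across raters reduces measurement error.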