BMC Med Educ. 2019 Jun 13;19:205. doi: 10.1186/s12909-019-1609-8

Table 6. Interrater reliability of the six pairs of raters and of all data (98 rated simulation scenarios)

Interrater reliability of the six pairs of raters (67 emergency and anaesthesiology simulation scenarios)

| Raters | ICC D.1 | ICC D.2 | ICC D.3 | ICC Overall | Cohen's Kappa D.1 | Cohen's Kappa D.2 | Cohen's Kappa D.3 | Cohen's Kappa Overall |
|--------|---------|---------|---------|-------------|-------------------|-------------------|-------------------|-----------------------|
| R1/R2  | 0.925   | 0.955   | 0.882   | 0.922       | 0.92              | 0.95              | 0.90              | 0.92                  |
| R1/R3  | 0.897   | 0.856   | 0.945   | 0.886       | 0.89              | 0.87              | 0.78              | 0.88                  |
| R1/R4  | 0.855   | 0.825   | 0.874   | 0.837       | 0.833             | 0.853             | 0.854             | 0.837                 |
| R1/R5  | 0.943   | 0.949   | 1       | 0.976       | 1                 | 0.87              | 0.78              | 0.88                  |
| R1/R6  | 0.905   | 0.978   | 0.943   | 0.945       | 0.893             | 0.98              | 0.94              | 0.94                  |
| R7/R8  | 0.706   | 0.706   | 0.706   | 0.681       | 0.67              | 0.67              | 0.67              | 0.56                  |
Interrater reliability of the 67 emergency simulation scenarios after data aggregation by raters' year of anaesthesiology training

| Raters       | ICC D.1 | ICC D.2 | ICC D.3 | ICC Overall | Cohen's Kappa D.1 | Cohen's Kappa D.2 | Cohen's Kappa D.3 | Cohen's Kappa Overall |
|--------------|---------|---------|---------|-------------|-------------------|-------------------|-------------------|-----------------------|
| A./2nd year  | 0.880   | 0.893   | 0.830   | 0.868       | 0.87              | 0.8               | 0.82              | 0.87                  |
| A./3rd year  | 0.915   | 0.900   | 0.950   | 0.917       | 0.90              | 0.89              | 0.95              | 0.92                  |
| A./4th year  | 0.871   | 0.937   | 0.884   | 0.897       | 0.91              | 0.96              | 0.88              | 0.86                  |
| A./5th year  | 0.811   | 0.787   | 0.834   | 0.805       | 0.85              | 0.77              | 0.87              | 0.78                  |
| 4th/3rd year | 0.729   | 0.767   | 0.722   | 0.733       | 0.77              | 0.75              | 0.71              | 0.74                  |
| 5th/3rd year | 0.650   | 0.500   | 0.680   | 0.624       | 0.73              | 0.48              | 0.68              | 0.63                  |
Interrater reliability of the 31 OR simulation scenarios with three raters per simulation scenario

| ICC D.1 | ICC D.2 | ICC D.3 | ICC Overall |
|---------|---------|---------|-------------|
| 0.737   | 0.780   | 0.684   | 0.738       |

Abbreviations: A. = attending; R = rater; D.1–D.3 = dimension 1–3; Year = year of anaesthesiology training; OR = operating room
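
For readers who want to compute metrics of this kind themselves, below is a minimal sketch, not the authors' analysis code, using scikit-learn for Cohen's kappa and pingouin for the ICC. All rating data in it are hypothetical placeholders, and neither the ICC model nor any kappa weighting used in the original analysis is specified by this table alone.

```python
# Minimal sketch of pairwise Cohen's kappa and multi-rater ICC,
# analogous to the metrics reported in Table 6.
# All rating data are hypothetical placeholders.
import pandas as pd
from sklearn.metrics import cohen_kappa_score
import pingouin as pg

# Hypothetical scores assigned by two raters to the same eight scenarios
r1 = [3, 4, 2, 5, 4, 3, 2, 4]
r2 = [3, 4, 3, 5, 4, 3, 2, 5]

# Unweighted Cohen's kappa for one pair of raters (the table does not
# state whether a weighted variant was used)
print(f"Cohen's kappa (R1/R2): {cohen_kappa_score(r1, r2):.2f}")

# ICC across three raters, as in the OR sub-table above: pingouin
# expects long-format data with one row per (scenario, rater) pair
r3 = [3, 5, 2, 5, 3, 3, 2, 4]
n = len(r1)
long = pd.DataFrame({
    "scenario": list(range(n)) * 3,
    "rater": ["R1"] * n + ["R2"] * n + ["R3"] * n,
    "score": r1 + r2 + r3,
})
icc = pg.intraclass_corr(data=long, targets="scenario",
                         raters="rater", ratings="score")
# pingouin reports six ICC variants; which one matches the published
# values depends on the study design, which Table 6 alone does not fix
print(icc[["Type", "Description", "ICC"]])
```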