BMC Med Res Methodol. 2018 Nov 20;18:144. doi: 10.1186/s12874-018-0598-3

Table 4.

Inter- and intra-expert agreement analysis

| Cohen's kappa | E2R1 | E3R1 | E4R1 | E5R1 | E1R2 | E2R2 | E3R2 | E4R2 | E5R2 | VR1 | VR2 | VR1R2 | Rc |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| E1R1 | 0.39 | 0.53 | 0.57 | 0.61 | **0.35** | 0.39 | 0.22 | 0.20 | 0.66 | 0.92 | 0.35 | 0.57 | 0.53 |
| E2R1 |  | 0.50 | 0.53 | 0.42 | 0.43 | **0.56** | 0.34 | 0.35 | 0.38 | 0.44 | 0.43 | 0.53 | 0.42 |
| E3R1 |  |  | 0.43 | 0.48 | 0.44 | 0.50 | **0.50** | 0.33 | 0.52 | 0.59 | 0.51 | 0.70 | 0.74 |
| E4R1 |  |  |  | 0.51 | 0.55 | 0.61 | 0.39 | **0.43** | 0.45 | 0.63 | 0.55 | 0.64 | 0.70 |
| E5R1 |  |  |  |  | 0.37 | 0.51 | 0.31 | 0.25 | **0.83** | 0.69 | 0.45 | 0.71 | 0.57 |
| E1R2 |  |  |  |  |  | 0.50 | 0.76 | 0.48 | 0.23 | 0.40 | 0.86 | 0.62 | 0.66 |
| E2R2 |  |  |  |  |  |  | 0.48 | 0.48 | 0.46 | 0.44 | 0.64 | 0.69 | 0.73 |
| E3R2 |  |  |  |  |  |  |  | 0.57 | 0.34 | 0.26 | 0.83 | 0.54 | 0.57 |
| E4R2 |  |  |  |  |  |  |  |  | 0.28 | 0.22 | 0.61 | 0.43 | 0.46 |
| E5R2 |  |  |  |  |  |  |  |  |  | 0.74 | 0.49 | 0.65 | 0.52 |
| VR1 |  |  |  |  |  |  |  |  |  |  | 0.40 | 0.63 | 0.59 |
| VR2 |  |  |  |  |  |  |  |  |  |  |  | 0.70 | 0.74 |
| VR1R2 |  |  |  |  |  |  |  |  |  |  |  |  | 0.87 |

Cohen's kappa indices for intra- and inter-rater agreement, as well as between each expert and VR1, VR2, VR1R2, and Rc, in rounds R1 and R2 for the asymptomatic/symptomatic classification. The indices highlighted in bold correspond to the intra-rater agreement between rounds R1 and R2.
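Each entry in Table 4 is a Cohen's kappa, kappa = (po − pe) / (1 − pe), where po is the observed agreement between two label sequences and pe is the agreement expected by chance given each rater's label frequencies. The following is a minimal sketch of how one such pairwise index could be computed; the toy label vectors and the use of scikit-learn's cohen_kappa_score are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (not the authors' implementation): Cohen's kappa for one
# rater pair on a binary asymptomatic/symptomatic classification.
from sklearn.metrics import cohen_kappa_score

# Hypothetical labels assigned by expert E1 to the same cases in rounds
# R1 and R2 (0 = asymptomatic, 1 = symptomatic).
e1_r1 = [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
e1_r2 = [0, 1, 0, 0, 0, 1, 0, 1, 1, 1]

# Intra-rater agreement between rounds R1 and R2 for this expert; each
# bold diagonal entry in Table 4 is a score of this form, while the
# off-diagonal entries compare different experts (or an expert against
# VR1, VR2, VR1R2, or Rc).
kappa = cohen_kappa_score(e1_r1, e1_r2)
print(f"Cohen's kappa (E1R1 vs E1R2): {kappa:.2f}")
```

For these toy vectors the observed agreement is 0.8 and the chance agreement is 0.5, giving kappa = 0.6; kappa is 1 for perfect agreement and 0 for agreement no better than chance.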