Adv Ther. 2019 May 28;36(8):2122–2136. doi: 10.1007/s12325-019-00970-1

Table 5.

Inter-rater agreement reliability in experiment 2

| Approach | Agreement level | N | Exact, % (95% CI) | 15-day window, % (95% CI) | 30-day window, % (95% CI) |
|---|---|---|---|---|---|
| Radiology-anchored | Event | 55 | 98 (94–100) | | |
| | Date | 49 | 61 (47–75) | 67 (54–80) | 69 (56–82) |
| | Overall | 55 | 64 (51–77) | 69 (57–81) | 71 (59–83) |
| Clinician-anchored | Event | 55 | 96 (91–100) | | |
| | Date | 48 | 60 (46–74) | 67 (54–80) | 71 (58–84) |
| | Overall | 55 | 62 (49–75) | 67 (55–79) | 71 (59–83) |
| Combined | Event | 55 | 98 (94–100) | | |
| | Date | 49 | 61 (47–75) | 69 (56–82) | 71 (58–84) |
| | Overall | 55 | 64 (51–77) | 71 (59–83) | 73 (61–85) |

Patient charts were abstracted in duplicate by different abstractors, and agreement (95% CI) is reported. Event agreement is based on the presence or absence of at least one cancer progression event. Date agreement is based on when the progression occurred and is calculated only in cases where both abstractors recorded a cancer progression. Overall agreement is based on a combined approach in which both the presence or absence of a progression event and, if an event was found, its date contribute toward agreement.
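
The agreement definitions in the table note can be made concrete with a small sketch. The snippet below is a minimal illustration, not code from the study: it assumes each abstractor's result for a chart is reduced to either None (no progression found) or the earliest recorded progression date, and it computes event, date, and overall agreement for a single chart under a chosen day window (0 for exact, 15, or 30).

```python
from datetime import date

def chart_agreement(record_a, record_b, window_days=0):
    """Return (event_match, date_match, overall_match) for one chart
    abstracted in duplicate. Each record is None (no progression) or a
    datetime.date for the progression event; window_days is the date
    tolerance (0 = exact, 15, or 30)."""
    # Event agreement: both abstractors agree on presence/absence of an event.
    event_match = (record_a is None) == (record_b is None)

    # Date agreement is defined only when both abstractors recorded an event.
    date_match = None
    if record_a is not None and record_b is not None:
        date_match = abs((record_a - record_b).days) <= window_days

    # Overall agreement: agree on absence, or agree on presence with dates
    # falling within the window.
    if record_a is None and record_b is None:
        overall_match = True
    elif event_match:
        overall_match = bool(date_match)
    else:
        overall_match = False

    return event_match, date_match, overall_match


# Hypothetical example: both abstractors found progression, 20 days apart.
a, b = date(2018, 3, 1), date(2018, 3, 21)
print(chart_agreement(a, b, window_days=0))   # (True, False, False)
print(chart_agreement(a, b, window_days=30))  # (True, True, True)
```

Aggregating these per-chart results across all charts (dividing matches by the number of eligible charts, N) would yield percentages analogous to those in the table, with date agreement using the smaller N of charts where both abstractors found an event.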