
Table 3.

Inter-rater agreement between the three topic coders, measured by Cohen's kappa. Coder 1 annotated all posts, while coders 2 and 3 each annotated one of two complementary parts of the data; therefore, no agreement is calculated between coders 2 and 3.

Label       Coders 1 and 2 (κ)   Coders 1 and 3 (κ)
Average κ   0.50                 0.62
ALTR        0.36                 0.29
DAIL        0.30                 0.50
DIAG        0.50                 0.71
FIND        0.56                 0.61
HSYS        0.56                 0.68
MISC        0.38                 0.76
NUTR        0.70                 0.69
PERS        0.13                 0.61
RSRC        0.63                 0.58
TEST        0.69                 0.70
TREA        0.67                 0.71
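
For reference (this note is not part of the original table), the values above follow the standard definition of Cohen's kappa, which contrasts the observed proportion of agreement p_o between two coders with the proportion of agreement p_e expected by chance:

    κ = (p_o − p_e) / (1 − p_e)

A value of 1 indicates perfect agreement and 0 indicates agreement no better than chance.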