Front Res Metr Anal. 2021 Mar 25;6:654438. doi: 10.3389/frma.2021.654438

Table 5.

Summary of inter-annotator agreement scores.

Metric            anno1 vs. anno2   anno1 vs. anno3   anno2 vs. anno3
Named entities
  Cohen's Kappa   0.9070            0.9515            0.9539
  F1 score        0.9505            0.9747            0.9760
Events
  Cohen's Kappa   0.6513            0.8035            0.8068
  F1 score        0.8985            0.9496            0.9506
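The table reports pairwise agreement between annotators, so each score compares one annotator's labels against another's. As a rough illustration only (the paper's annotation data and exact protocol are not shown here), a minimal sketch of computing such pairwise Cohen's kappa and micro-averaged F1 over token-level labels might look like the following; the label sequences are invented placeholders.

    from sklearn.metrics import cohen_kappa_score, f1_score

    # Hypothetical token-level entity labels from two annotators
    # (illustrative only; not the paper's actual data).
    anno1 = ["B-ENT", "I-ENT", "O", "O", "B-ENT", "O"]
    anno2 = ["B-ENT", "I-ENT", "O", "B-ENT", "B-ENT", "O"]

    # Cohen's kappa: observed agreement corrected for chance agreement,
    # kappa = (p_o - p_e) / (1 - p_e).
    kappa = cohen_kappa_score(anno1, anno2)

    # F1 treats one annotator as the reference and the other as the
    # prediction; micro-averaging aggregates over all label types.
    f1 = f1_score(anno1, anno2, average="micro")

    print(f"Cohen's kappa: {kappa:.4f}")
    print(f"F1 score:      {f1:.4f}")

Kappa is typically lower than F1 on the same pair, as seen in the table, because it discounts the agreement expected by chance.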