
Table 3.

Inter-rater agreement scores for Task 2, measured with both Fleiss’ kappa and Krippendorff’s alpha, for three rater groups: the Amazon workers only, the Amazon workers plus the extra (on-site) rater, and the extra (on-site) rater plus the domain expert

Inter-rater agreement metric   Amazon workers   Amazon workers + extra rater (on-site)   Extra rater (on-site) + expert
Fleiss’ kappa                  0.2028           0.2050                                   0.6549
Krippendorff’s alpha           0.2029           0.2051                                   0.6550
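
The table reports two chance-corrected agreement statistics over the same annotations. As a minimal sketch of how such values can be computed, the Python snippet below uses statsmodels for Fleiss’ kappa and the krippendorff package for Krippendorff’s alpha; neither library nor the toy rating matrix comes from the article and both are assumptions for illustration only.

```python
# Illustrative computation of Fleiss' kappa and Krippendorff's alpha for
# categorical annotations. The rating matrix is hypothetical, not the
# paper's data; statsmodels and the `krippendorff` package are assumed
# to be installed (the article does not name its implementation).
import numpy as np
import krippendorff
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Hypothetical annotations: rows = items, columns = raters,
# values = category labels assigned by each rater.
ratings = np.array([
    [0, 0, 1],
    [1, 1, 1],
    [0, 1, 0],
    [2, 2, 2],
    [1, 2, 1],
])

# Fleiss' kappa expects an items x categories count table.
counts, _ = aggregate_raters(ratings)
print("Fleiss' kappa:", fleiss_kappa(counts, method="fleiss"))

# Krippendorff's alpha expects a raters x items matrix (the transpose).
print("Krippendorff's alpha:",
      krippendorff.alpha(reliability_data=ratings.T,
                         level_of_measurement="nominal"))
```

For near-complete nominal data with the same raters on every item, the two metrics typically differ only slightly, which is consistent with the closely matching values in each column of the table.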