2015 Jul 28;76(4):609–637. doi: 10.1177/0013164415596420

Table 6.

Coverage Rates of 95% Confidence Intervals When There Is Systematic Disagreement Among Raters.

                                       Sample size (n)
(q)^a  Agreement coefficient     10     20     30     40     50     80     100
3      Cohen’s kappa           0.966  0.956  0.956  0.952  0.950  0.952  0.948
       Scott’s pi              0.903  0.929  0.940  0.941  0.940  0.946  0.944
       Gwet’s AC1              0.913  0.931  0.942  0.938  0.942  0.945  0.947
       Brennan–Prediger        0.882  0.926  0.936  0.940  0.944  0.946  0.947
       Krippendorff’s alpha    0.925  0.940  0.949  0.946  0.948  0.952  0.952
4      Cohen’s kappa           0.930  0.938  0.944  0.946  0.949  0.948  0.946
       Scott’s pi              0.912  0.934  0.939  0.941  0.944  0.945  0.943
       Gwet’s AC1              0.926  0.933  0.940  0.944  0.946  0.946  0.945
       Brennan–Prediger        0.885  0.928  0.931  0.941  0.947  0.947  0.945
       Krippendorff’s alpha    0.922  0.939  0.943  0.945  0.948  0.950  0.946
5      Cohen’s kappa           0.953  0.948  0.949  0.944  0.944  0.949  0.947
       Scott’s pi              0.914  0.928  0.937  0.941  0.944  0.944  0.948
       Gwet’s AC1              0.936  0.934  0.942  0.940  0.944  0.942  0.943
       Brennan–Prediger        0.858  0.931  0.930  0.937  0.947  0.942  0.942
       Krippendorff’s alpha    0.914  0.928  0.936  0.942  0.944  0.944  0.948
^a q = number of categories.
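For orientation, the two most common coefficients in the table can be sketched for the two-rater case as follows. This is a minimal illustration, not the article's simulation code; the function names are hypothetical. Both coefficients share the observed-agreement term and differ only in how chance agreement is estimated: Cohen's kappa uses each rater's own marginal proportions, while Scott's pi pools the two raters' marginals.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters rating the same n subjects."""
    n = len(r1)
    # Observed agreement: fraction of subjects both raters classify identically.
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement: product of each rater's own marginal proportions,
    # summed over categories.
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

def scotts_pi(r1, r2):
    """Scott's pi: same p_o, but chance agreement uses pooled marginals."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    # Pool both raters' category counts, then square the pooled proportions.
    pooled = Counter(r1) + Counter(r2)
    p_e = sum((c / (2 * n)) ** 2 for c in pooled.values())
    return (p_o - p_e) / (1 - p_e)
```

With identical rating vectors both coefficients equal 1; when observed agreement matches chance agreement, both equal 0. The difference in the chance term is exactly why the two rows of the table diverge when the raters' marginal distributions disagree systematically.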