Table 4. Intraobserver reliability across all 12 observers and by individual observer
Classification | Coefficient | 95% confidence interval | Percentage agreement |
---|---|---|---|
Primary classifications (A, B and C) | kF | | |
All raters | 0.91 | 0.83 to 0.99 | 94 |
Rater 1 | 0.87 | 0.59 to 1 | 92 |
Rater 2 | 0.93 | 0.64 to 1 | 96 |
Rater 3 | 0.94 | 0.65 to 1 | 96 |
Rater 4 | 0.94 | 0.66 to 1 | 96 |
Rater 5 | 0.93 | 0.64 to 1 | 96 |
Rater 6 | 0.94 | 0.65 to 1 | 96 |
Rater 7 | 1.00 | 0.71 to 1 | 100 |
Rater 8 | 0.81 | 0.53 to 1 | 88 |
Rater 9 | 0.87 | 0.59 to 1 | 92 |
Rater 10 | 0.87 | 0.59 to 1 | 92 |
Rater 11 | 0.82 | 0.54 to 1 | 88 |
Rater 12 | 1.00 | 0.71 to 1 | 100 |
Sub-classifications (A1-A5, B1-B2, C) | αk | | |
All raters | 0.88 | 0.83 to 0.93 | 81 |
Rater 1 | 0.81 | 0.50 to 1.00 | 88 |
Rater 2 | 0.85 | 0.55 to 1.00 | 80 |
Rater 3 | 0.93 | 0.79 to 0.99 | 80 |
Rater 4 | 0.88 | 0.60 to 0.99 | 80 |
Rater 5 | 0.90 | 0.69 to 0.99 | 80 |
Rater 6 | 0.94 | 0.82 to 1.00 | 84 |
Rater 7 | 0.94 | 0.84 to 0.99 | 80 |
Rater 8 | 0.77 | 0.43 to 0.97 | 76 |
Rater 9 | 0.93 | 0.80 to 1.00 | 84 |
Rater 10 | 0.87 | 0.62 to 0.99 | 80 |
Rater 11 | 0.83 | 0.60 to 0.98 | 76 |
Rater 12 | 0.95 | 0.83 to 1.00 | 88 |
αk, Krippendorff’s alpha coefficient; kF, Fleiss’s kappa coefficient
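The Fleiss's kappa values above correct raw percentage agreement for the agreement expected by chance. A minimal sketch of that computation is shown below; the rating counts are hypothetical and not drawn from this study, and the function name `fleiss_kappa` is an illustrative choice rather than a library API.

```python
def fleiss_kappa(counts):
    """Fleiss's kappa for a subjects x categories matrix of rating counts.

    counts[i][j] = number of raters who assigned subject i to category j.
    Every subject is assumed to be rated by the same number of raters.
    """
    n_subjects = len(counts)
    n_raters = sum(counts[0])  # raters per subject (constant by assumption)
    n_categories = len(counts[0])

    # Observed agreement per subject: proportion of agreeing rater pairs.
    p_i = [
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in counts
    ]
    p_bar = sum(p_i) / n_subjects  # mean observed agreement

    # Chance agreement: sum of squared marginal category proportions.
    p_j = [
        sum(row[j] for row in counts) / (n_subjects * n_raters)
        for j in range(n_categories)
    ]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 3 fractures, 4 raters, 3 classification categories.
ratings = [
    [4, 0, 0],  # all 4 raters chose category A
    [0, 4, 0],  # all 4 raters chose category B
    [2, 2, 0],  # raters split between A and B
]
kappa = fleiss_kappa(ratings)  # ≈ 0.556
```

With perfect agreement on every subject the statistic reaches 1.00, as seen for Raters 7 and 12 in the primary classification.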