Table 4. Inter‐rater Reliability, Categorical Scoring (Trial 1)

| Rater Pair | Kappa Coefficient | Absolute Agreement |
|---|---|---|
| Rater A/Rater B | .76 | .88 |
| Rater A/Rater C | .30 | .74 |
| Rater B/Rater C | .20 | .62 |
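As a point of reference for how the two statistics in Table 4 relate, the following is a minimal sketch (not taken from the study) of how a kappa coefficient and absolute agreement could be computed for one rater pair. It assumes scikit-learn's `cohen_kappa_score`; the rating vectors are hypothetical placeholders, not the study's data.

```python
# Sketch: pairwise reliability for two raters' categorical scores.
# The ratings below are hypothetical; only the computation is illustrated.
from sklearn.metrics import cohen_kappa_score


def pairwise_reliability(ratings_1, ratings_2):
    """Return (kappa, absolute agreement) for two raters' categorical scores."""
    kappa = cohen_kappa_score(ratings_1, ratings_2)
    # Absolute agreement = proportion of items scored identically by both raters.
    agreement = sum(a == b for a, b in zip(ratings_1, ratings_2)) / len(ratings_1)
    return kappa, agreement


# Hypothetical example: 10 items scored on a 3-point categorical scale.
rater_a = [1, 2, 2, 3, 1, 2, 3, 3, 1, 2]
rater_b = [1, 2, 2, 3, 1, 2, 3, 2, 1, 2]
kappa, agreement = pairwise_reliability(rater_a, rater_b)
print(f"Kappa = {kappa:.2f}, Absolute agreement = {agreement:.2f}")
```

Because kappa corrects for chance agreement, it can be substantially lower than absolute agreement for the same rater pair, as the Rater B/Rater C row illustrates (.20 vs. .62).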