Can J Hosp Pharm. 2012 Mar-Apr;65(2):111–118. doi: 10.4212/cjhp.v65i2.1118

Table 3.

Inter-rater Reliability of Checklists

Statistic           MDI      MDI with Spacer   Turbuhaler   Diskus   HandiHaler
Overall agreement   0.867    0.933             0.867        0.8      1
Cohen’s kappa       0.42     0.86*             −0.07        0.60*    1.00*
Spearman’s rho      0.839*   0.673*            0.683*       0.324    0.564*

MDI = metered-dose inhaler.

* p < 0.05.

Cohen’s kappa is a measure of agreement that adjusts for chance agreement between observers and is used with nominal (categorical) data. Kappa values at or near 0 indicate agreement that is no better, or only slightly better, than chance, and values below 0 indicate agreement worse than chance. Values above 0 indicate better-than-chance agreement, and a value of exactly 1 indicates perfect agreement.
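For reference, the usual definition of Cohen’s kappa (not stated in the original footnote) is

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement and p_e is the proportion of agreement expected by chance. Treating the reported figures as exact, the MDI-with-spacer column (p_o = 0.933, kappa = 0.86) implies a chance-agreement rate of roughly p_e = (0.933 − 0.86)/(1 − 0.86) ≈ 0.52, whereas the Turbuhaler column (p_o = 0.867, kappa = −0.07) implies p_e ≈ 0.88. This illustrates how high raw agreement can still yield a near-zero or negative kappa when chance agreement is also high.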

Spearman’s rho measures the strength of association between the two observers’ scores and is used with ordinal or continuous data. A value of 0 indicates no association, and a value of 1 indicates a perfect monotonic association.
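For completeness, the standard formula for Spearman’s rho when there are no tied ranks is

\rho = 1 - \frac{6 \sum_{i=1}^{n} d_i^2}{n(n^2 - 1)}

where d_i is the difference between the two observers’ ranks for item i and n is the number of items; when ties are present, rho is equivalently the Pearson correlation computed on the ranks. Rho can also take negative values (down to −1), indicating an inverse association.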