Int Wound J. 2014 Aug 14;13(5):619–624. doi: 10.1111/iwj.12330

Table 3.

(A) Inter‐rater reliability profiles. (B) Test–retest reliability profiles

(A)*
Observer      1       2       3       4       5       6       7       8
2           0·33
3           0·07    0·54
4           0·12    0·12    0·26
5           0·09    0·18    0·18    0·39
6           0·04    0·06   −0·06    0·04    0·14
7           0·14    0·26    0·37    0·08    0·00   −0·04
8           0·14    0·32    0·13    0·45    0·33    0·19    0·14
9          −0·01   −0·08    0·05    0·36    0·46    0·07   −0·06    0·09
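For readers wishing to reproduce pairwise values like those in (A), the sketch below computes an inter-rater kappa for two observers and applies the bands from the table footnote. It is a minimal illustration, not the authors' analysis: the per-photograph ratings behind (A) are not published in the table, so the two rating lists are hypothetical, and "Cohen's kappa" is assumed as the conventional reading of a pairwise kappa statistic.

```python
# Minimal sketch: pairwise Cohen's kappa between two observers.
# Rating data below is hypothetical; the table does not reproduce
# the underlying per-photograph ratings.
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e) for two raters."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labelled the same.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)  # undefined if p_e == 1

def band(kappa):
    # Bands from the table footnote; exact boundary handling is our
    # choice, as the footnote does not say which side cut-offs fall on.
    if kappa > 0.80:
        return "very good"
    if kappa > 0.60:
        return "good"
    if kappa > 0.40:
        return "moderate"
    return "poor"

# Hypothetical example: two observers grading ten photographs.
obs_a = ["Yes", "Yes", "No", "Yes", "No", "No", "Yes", "Yes", "No", "Yes"]
obs_b = ["Yes", "No", "No", "Yes", "No", "Yes", "Yes", "Yes", "No", "Yes"]
k = cohens_kappa(obs_a, obs_b)
print(f"kappa = {k:.2f} ({band(k)})")  # kappa = 0.58 (moderate)
```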
(B)
Photograph      Observer 1   Observer 2   Observer 3   Observer 4   Observer 5   Agreement (%)
1               Yes          Yes          Yes          Yes          No           80
2               Yes          Yes          Yes          Yes          No           80
3               Yes          No           Yes          Yes          Yes          80
4               Yes          Yes          No           No           No           40
5               Yes          Yes          No           Yes          No           60
6               No           Yes          No           No           Yes          40
7               No           Yes          No           No           Yes          60
8               No           Yes          Yes          Yes          Yes          80
9               No           No           No           Yes          Yes          40
10              Yes          Yes          No           No           Yes          60
11              No           Yes          No           No           Yes          40
12              Yes          Yes          Yes          Yes          Yes          100
13              Yes          Yes          No           Yes          No           60
14              Yes          Yes          Yes          Yes          Yes          100
15              No           Yes          Yes          Yes          Yes          80
16              No           Yes          Yes          Yes          No           60
17              Yes          Yes          Yes          Yes          No           80
18              Yes          Yes          Yes          Yes          Yes          100
19              Yes          No           No           Yes          Yes          40
20              No           Yes          No           Yes          No           40
Agreement (%)   60           85           50           75           60
* Kappa statistics; bold values indicate moderate agreement. κ value above 0·80, ‘very good’; between 0·80 and 0·60, ‘good’; between 0·60 and 0·40, ‘moderate’; below 0·40, ‘poor’.
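The agreement percentages in (B) track the share of ‘Yes’ entries: each row value is the proportion of observers marked ‘Yes’ for that photograph (e.g. 4/5 = 80% for photograph 1), and each column value is the proportion of photographs marked ‘Yes’ for that observer. A minimal sketch of that calculation, under the assumptions noted in the comments:

```python
# Minimal sketch of the percent-agreement figures in (B). Rows are
# photographs, columns are observers; "Yes" is read here as "the
# observer's retest rating matched the first rating" -- an assumption
# about the coding, not stated in the table itself. Only the first
# three rows of (B) are reproduced, so the column percentages below
# differ from the full-table values.
ROWS = [
    ["Yes", "Yes", "Yes", "Yes", "No"],   # photograph 1 -> 80%
    ["Yes", "Yes", "Yes", "Yes", "No"],   # photograph 2 -> 80%
    ["Yes", "No",  "Yes", "Yes", "Yes"],  # photograph 3 -> 80%
]

def pct_yes(cells):
    """Percentage of 'Yes' entries in a row or column."""
    return 100 * cells.count("Yes") / len(cells)

# Per-photograph agreement (row-wise).
for i, row in enumerate(ROWS, start=1):
    print(f"photograph {i}: {pct_yes(row):.0f}% agreement")

# Per-observer agreement (column-wise).
for j in range(len(ROWS[0])):
    column = [row[j] for row in ROWS]
    print(f"observer {j + 1}: {pct_yes(column):.0f}% agreement")
```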