Table 4.
| Domain | Total # of items^d | Inter-rater^b (direct observation): street-level, n (%) | Inter-rater^b (direct observation): parcel-level, n (%) | Inter-rater^b (virtual observation, n=84): street-level, n (%) | Inter-rater^b (virtual observation, n=84): parcel-level, n (%) | Intra-rater^c (street- vs. parcel-level, n=84): direct observation, n (%) | Intra-rater^c (street- vs. parcel-level, n=84): virtual observation, n (%) |
|---|---|---|---|---|---|---|---|
| Physical disorder | 3 | 1 (34) | 2 (67) | 2 (67) | 2 (67) | 3 (100) | 3 (100) |
| Physical decay | 6 | 5 (83) | 5 (83) | 4 (67) | 4 (67) | 5 (83) | 5 (83) |
| Safety | 3 | 1 (34) | 1 (34) | 2 (67) | 1 (34) | 3 (100) | 3 (100) |
| Street safety | 3 | 2 (67) | 2 (67) | 2 (67) | 2 (67) | 3 (100) | 3 (100) |
| Land use | 6 | 6 (100) | 6 (100) | 6 (100) | 6 (100) | 6 (100) | 6 (100) |
| Total | 21 | 15 (71) | 16 (76) | 16 (76) | 15 (71) | 20 (95) | 20 (95) |
Substantial agreement is defined as greater than 75% item-level agreement, as reported in Table 3.^19,20

^b Inter-rater agreement was measured as observed agreement and a simple kappa coefficient between two raters.

^c Intra-rater agreement was measured as observed agreement and a simple kappa coefficient between virtual vs. direct ratings performed by the same rater.

^d The total number of items is the number of survey questions measured within each domain; it serves as the denominator for the percentage of items within each domain reaching substantial agreement.
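The agreement statistics above can be illustrated with a short sketch. This is not the study's analysis code or data; the two rating vectors are hypothetical, and the functions simply show how observed agreement, a simple (Cohen's) kappa between two raters, and a within-domain percentage like those tabulated would be computed.

```python
# Sketch only: hypothetical ratings, not the study's data.
from collections import Counter

def observed_agreement(r1, r2):
    # Proportion of items on which the two raters gave the same rating.
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    # Simple (Cohen's) kappa: chance-corrected agreement between two raters.
    n = len(r1)
    po = observed_agreement(r1, r2)                     # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical binary item ratings (e.g. feature present = 1 / absent = 0).
rater1 = [1, 0, 1, 1, 0, 1, 0, 0]
rater2 = [1, 0, 1, 0, 0, 1, 1, 0]
print(observed_agreement(rater1, rater2))        # 0.75
print(cohens_kappa(rater1, rater2))              # 0.5

# Domain percentage as in the table: items with substantial agreement / total items.
substantial, total = 15, 21
print(round(100 * substantial / total))          # 71
```

The kappa denominator `1 - pe` corrects for the agreement two raters would reach by chance given their marginal rating frequencies, which is why kappa is lower than raw observed agreement here.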