Int J Environ Res Public Health. 2019 Nov 2;16(21):4258. doi: 10.3390/ijerph16214258

Table 4.

Testing the inter-rater reliability of advertisement coding using a novel tool.

Feature                                         Kappa   % Agreement

Ad Type: Billboards/Transit Shelters (n = 93 ads)
Price                                           0.910   96.8
Food/Beverage Image                             1.000   100.0
Slogan/Description                              0.891   95.7
Logo/Company Name                               1.000   100.0
Location/Directions                             0.950   97.8
Sale/Deal/Special Offer                         0.731   92.5
Loyalty/Rewards                                 1.000   100.0
Gamification                                    0.903   98.9
Taste Description                               0.678   91.4
Characters, Celebrities, TV, or Sports Tie-Ins  0.863   94.6
Average                                         0.893   96.8

Ad Type: Outdoor Vendor Signage (n = 999 ads)
Price                                           0.923   96.6
Food/Beverage Image                             0.936   96.4
Slogan/Description                              0.732   82.6
Logo/Company Name                               0.884   92.6
Location/Directions                             0.923   99.2
Sale/Deal/Special Offer                         0.969   98.7
Loyalty/Rewards                                 0.998   99.8
Gamification                                    0.984   99.9
Taste Description                               0.769   96.7
Characters, Celebrities, TV, or Sports Tie-Ins  0.980   99.7
Average                                         0.910   96.2

Note: For all inter-rater comparisons, p < 0.001.
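The two statistics in the table can be reproduced with a short sketch, assuming the kappa reported is Cohen's kappa for two raters coding each ad feature as present/absent. The function names and the sample ratings below are illustrative, not data from the study:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Share of items the two raters coded identically, as a percentage."""
    matches = sum(a == b for a, b in zip(r1, r2))
    return 100.0 * matches / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e is the agreement expected by chance, computed from
    each rater's marginal category frequencies."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[c] / n) * (c2[c] / n) for c in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters coding ten ads for the presence (1)
# or absence (0) of one feature, e.g. "Price".
rater1 = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
rater2 = [1, 0, 1, 0, 0, 1, 0, 0, 1, 1]
print(round(cohens_kappa(rater1, rater2), 3))   # → 0.8
print(percent_agreement(rater1, rater2))        # → 90.0
```

Kappa discounts chance agreement, which is why it runs below raw % agreement in every row of the table; the gap widens when one code dominates (as with the low-prevalence features such as Gamification).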