Can J Gastroenterol. 2011 May;25(5):261–264. doi: 10.1155/2011/302382

TABLE 3.

Summary of interobserver agreement between conventional criteria and Rosemont criteria

Weighted kappa (for 4-category assessment)
  Conventional criteria: 0.50 (P<0.001 for agreement beyond chance); 80.0% overall agreement
  Rosemont criteria: 0.27 (P=0.01 for agreement beyond chance); 68.1% overall agreement

Kappa for a positive test*
  Conventional criteria: 0.47 (P=0.002 for agreement beyond chance); 86.1% overall agreement
  Rosemont criteria: 0.38 (P=0.002 for agreement beyond chance); 80.6% overall agreement

Kappa for a normal test
  Conventional criteria: 0.50 (P=0.01 for agreement beyond chance); 75.0% overall agreement
  Rosemont criteria: 0.18 (P=0.13 for agreement beyond chance); 58.3% overall agreement
*Positive test refers to any test that was not ‘normal’/‘low probability’ and not ‘indeterminate’
McNemar’s exact test P<0.05 (significant)
No significant agreement beyond what is expected by chance
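
For readers unfamiliar with the statistics reported above, the following are the standard textbook definitions of Cohen’s kappa and the weighted kappa (they are not reproduced from the paper, and the linear weighting shown is one common choice; the authors’ exact weighting scheme is an assumption):

```latex
% Unweighted Cohen's kappa: observed agreement p_o corrected for the
% agreement p_e expected by chance alone.
\kappa = \frac{p_o - p_e}{1 - p_e}

% Weighted kappa for an ordinal k-category scale: o_{ij} and e_{ij} are the
% observed and chance-expected proportions in cell (i,j) of the two raters'
% k x k cross-table, and w_{ij} penalizes disagreement by distance between
% categories, e.g. the common linear weights w_{ij} = |i - j| / (k - 1).
\kappa_w = 1 - \frac{\sum_{i,j} w_{ij}\, o_{ij}}{\sum_{i,j} w_{ij}\, e_{ij}}
```

Under these definitions, kappa equals 1 for perfect agreement and 0 for agreement no better than chance, which is why the P values in the table test for “agreement beyond chance” rather than raw percent agreement.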