
Table 3.

Assessment of Interobserver Variability. Cohen's and Fleiss' kappa coefficients are presented in Table 3 as an assessment of the interobserver variability in determining radiographic progression or response for each method. The novel volumetric method had the strongest agreement between the two neuroradiologists (Cohen's kappa = 0.96) and also showed strong agreement among all examiners (Fleiss' kappa = 0.79). Cohen's kappa statistics for agreement between the two neuroradiologists were 0.54 for the 2D method and 0.46 for the 1D method.

                                                         Neuroradiologist 1
                            1D method                        2D method                    Volumetric method
Neuroradiologist 2    CR/R  Stable  Prog  Total      CR/R  Stable  Prog  Total      CR/R  Stable  Prog  Total
CR/R                    4      3      4     11         5      2      4     11         5      0      0      5
Stable                  2     17      1     20         1     15      2     18         0     18      0     18
Progression             2      3      8     13         2      2     11     15         0      1     20     21
Concordance           65.9% (29 of 44)               70.5% (31 of 44)               97.7% (43 of 44)
Cohen's Kappa (a)     0.46                           0.54                           0.96

Abbreviations: 1D, one-dimensional; 2D, two-dimensional; CR/R, complete response/regression; Prog, progression.

(a) Interobserver agreement on radiographic calls of examiners for the 1D, 2D, and volumetric methods.
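
For readers who want to verify the arithmetic, the short Python sketch below recomputes the concordance (observed agreement, i.e., the diagonal of each confusion matrix divided by the 44 rated studies) and Cohen's kappa for each method from the counts in Table 3. This is a minimal illustration using NumPy, not the analysis code used in the study; the matrix entries are copied directly from the table above.

```python
# Minimal sketch: recompute concordance and Cohen's kappa from the 3x3
# confusion matrices in Table 3 (rows = Neuroradiologist 2, columns =
# Neuroradiologist 1; category order: complete response/regression,
# stable, progression). Counts are copied from the table.
import numpy as np

matrices = {
    "1D":         np.array([[4,  3,  4], [2, 17,  1], [2, 3,  8]]),
    "2D":         np.array([[5,  2,  4], [1, 15,  2], [2, 2, 11]]),
    "Volumetric": np.array([[5,  0,  0], [0, 18,  0], [0, 1, 20]]),
}

for method, m in matrices.items():
    n = m.sum()                                          # total rated studies (44)
    p_o = np.trace(m) / n                                # observed agreement (concordance)
    p_e = (m.sum(axis=1) * m.sum(axis=0)).sum() / n**2   # expected chance agreement
    kappa = (p_o - p_e) / (1 - p_e)                      # Cohen's kappa
    print(f"{method}: concordance = {np.trace(m)}/{n} = {p_o:.1%}, kappa = {kappa:.2f}")
```

Run as-is, this reproduces the values reported in the table: concordance of 65.9%, 70.5%, and 97.7%, and Cohen's kappa of 0.46, 0.54, and 0.96 for the 1D, 2D, and volumetric methods, respectively.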