
Table 3.

Inter-rater reliability

Examiner              Fleiss' kappa   Expected agreement   Observed agreement
Experts (n = 3)       0.730           0.506                0.867
Non-experts (n = 3)   0.814           0.507                0.909

Inter-rater reliability of experts (subjective evaluation) and non-experts (using DOC-score for diagnosis) in CLE assessment
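The reported kappa values follow from the standard relation between chance-corrected agreement and the observed and expected agreement columns, kappa = (P_obs - P_exp) / (1 - P_exp). The short sketch below (not from the original article; values are simply re-read from Table 3) recomputes both kappas as a sanity check; small deviations from the reported figures are rounding effects.

```python
# Sanity check of Table 3: Fleiss' kappa from mean observed and expected agreement.
# kappa = (P_obs - P_exp) / (1 - P_exp)

def fleiss_kappa(p_obs: float, p_exp: float) -> float:
    """Chance-corrected agreement given mean observed and expected agreement."""
    return (p_obs - p_exp) / (1.0 - p_exp)

print(f"Experts:     {fleiss_kappa(0.867, 0.506):.3f}")  # ~0.731 (reported: 0.730)
print(f"Non-experts: {fleiss_kappa(0.909, 0.507):.3f}")  # ~0.815 (reported: 0.814)
```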