Med J Islam Repub Iran. 2018 Jun 3;32:45. doi: 10.14196/mjiri.32.45

Table 4. Validity, reliability, and feasibility of the DOPS assessment method.

| Author, year | Country | Participants | Validity | Reliability | Feasibility |
|---|---|---|---|---|---|
| Asadi K, et al., 2012 (23) | Iran | 70 orthopedic interns | CVI: 0.90 | 0.80 | - |
| Wilkinson J, et al., 2008 (30) | UK | 177 medical specialists | DOPS had low validity. | DOPS reliability compares favorably with the mini-CEX and MSF. | Mean observation time in DOPS varied by procedure. |
| Watson MJ, et al., 2014 (43) | Australia | Trainees in ultrasound-guided regional anesthesia (30 video-recorded) | Total score correlated with trainee experience (r = 0.51, p = 0.004) | Inter-rater: ICC = 0.25; internal consistency: r = 0.68, p < 0.001 | Mean time to complete an assessment: 6 min 35 s |
| Hengameh H, et al., 2015 (40) | Iran | Nursing students | CVR: 0.62; CVI: 0.79 | Kappa coefficient: 0.6; ICC: 0.5 | - |
| Barton JP, et al., 2012 (44) | UK | 157 senior endoscopists (111 candidates and 42 assessors) | Most candidates (73.6%) and assessors (88.1%) rated the DOPS assessment method as valid or very valid. | G: 0.81 | DOPS scores correlated highly with the assessment scores of global experts. |
| Amini A, et al., 2015 (33) | Iran | Seven orthopedic residents and 9 faculty members | CVI: 0.95 | ICC: 0.85 | - |
| Delfino AE, et al., 2013 | UK | Six anesthesia staff for interviews, 10 anesthesiologists for a consensus survey, and 31 anesthesia residents | CVI: 0.9 | Kappa: 0.8; G coefficient: 0.90 | - |
| Sahebalzamani M, et al., 2012 | Iran | 55 nursing students | Correlation with theoretical score: 0.117; with clinical score: 0.376 | Cronbach's alpha coefficient: 0.94 | - |
| Kuhpayehzade J, et al., 2014 (45) | Iran | 44 midwifery students | CVR: 0.75; CVI: 0.50 | Alpha coefficient: 0.81 | - |