PLoS One. 2021 Jan 14;16(1):e0245439. doi: 10.1371/journal.pone.0245439

Table 4. Previous studies from the literature investigating correlations between OSCE and other academic assessment methods.

Smith et al (1984), United Kingdom (n = 229)
Assessment methods compared to OSCE: Viva voce examination, in-course assessment (clinical aptitude and written project), MCQ examination, and comparable traditional clinical examinations.
Statistical evaluation: Significant correlations between OSCE marks and MCQ (r = 0.34, P < 0.001), the comparable clinical examination (r = 0.26, P < 0.001), and previous in-course assessment (r = 0.24, P < 0.001); no correlation between OSCE and the viva voce examination (r = 0.08, P > 0.05).
Conclusions: In contrast to the viva voce examination, OSCE results correlated well with an overall assessment of the student's ability. The clinical component of the OSCE did not correlate well with the MCQ.

Probert et al (2003), United Kingdom (n = 30)
Assessment methods compared to OSCE: Long and short case-based viva voce examinations; independent consultant evaluations were added to assess students' clinical performance.
Statistical evaluation: Overall performance at traditional finals correlated with the total OSCE mark (r = 0.5, P = 0.005). When traditional examinations were split into surgery and medicine components, OSCE correlated significantly with surgery marks (r = 0.71, P < 0.001) but not with medicine marks (r = 0.15, P = 0.4).
Conclusions: This was a pilot study for OSCE implementation, and the sample of students who took both examinations was representative of the whole population. The OSCE assesses different clinical domains than traditional finals and improved the prediction of future clinical performance.

Dennehy et al (2008), USA (n = 62)
Assessment methods compared to OSCE: National Board Dental Examination (NBDE; computerized assessment of theoretical knowledge in Part I and clinical knowledge in Part II) and MCQ examinations.
Statistical evaluation: NBDE scores were statistically associated with OSCE scores (P ranging from <0.001 to 0.04); there was no significant association between OSCE and MCQ scores. In multiple regression models, none of the didactic predictors was significantly associated with overall OSCE performance.
Conclusions: Didactic predictors (NBDE, MCQ examinations) explained around 20% of the variability in OSCE scores. The OSCE may allow educators to assess student capabilities that are not evaluated by typical standardized written examinations.

Sandoval et al (2010), Chile (n = 697)
Assessment methods compared to OSCE: Written examination and daily clinical practice observation guidelines.
Statistical evaluation: Positive correlation between the percentage of success on all three evaluation methods and the OSCE (P < 0.001). Pearson's correlation coefficients between assessment methods were higher after seven years of OSCE implementation.
Conclusions: These evaluations are complementary.

Kirton et al (2011), United Kingdom (n = 39 per year over a 3-year evaluation)
Assessment methods compared to OSCE: Medicine and pharmacy practice (MPP) written examination combining MCQs and essays expected to relate to clinical practice.
Statistical evaluation: Moderate positive correlation between OSCE and MPP marks (r = 0.6, P < 0.001). For 20% of students, previous OSCE experience did not improve marks or individual performance.
Conclusions: The two examinations assess different areas of expertise according to Miller's pyramid of competence, and both should be performed.

Kamarudin et al (2012), Malaysia (n = 152)
Assessment methods compared to OSCE: Students' clinical competence evaluated during the final professional long-case examination.
Statistical evaluation: Positive correlation between OSCE and the long case for diagnostic ability (r = 0.165, P = 0.042) and total examination score (r = 0.168, P = 0.039).
Conclusions: The correlation between the OSCE and the long-case format is weak; the two assessment methods test different clinical competences.

Tijani et al (2017), Nigeria (n = 612)
Assessment methods compared to OSCE: Long-case examination (at the end of postings in the 4th and 6th years of medical school), final MCQ, and total written papers (TWP; the sum of MCQ examinations and essays).
Statistical evaluation: Positive correlations between OSCE and MCQ (r = 0.408), TWP (r = 0.523), and long case (r = 0.374), all P < 0.001.
Conclusions: A total clinical score combining OSCE and long-case marks predicted student clinical performance better than either assessment method alone; the two evaluations could be complementary. Previous OSCE experience was not taken into account in the analysis.

Lacy et al (2019), Mexico (n = 83)
Assessment methods compared to OSCE: Communication skills evaluated by direct observation of a clinical encounter (DOCE) using the New Mexico Clinical Communication Scale.
Statistical evaluation: Students' matched OSCE and DOCE scores were not correlated. Mean scores for individual communication skills did not differ significantly between faculty evaluators (P = 0.2).
Conclusions: The discordance between OSCE and DOCE suggests that the OSCE may not be an optimal method for identifying students who require additional communication training.

n = number of students; OSCE = objective structured clinical examination; MCQ = multiple-choice question; NBDE = National Board Dental Examination; MPP = medicine and pharmacy practice; TWP = total written papers; DOCE = direct observation of a clinical encounter.