PLOS ONE. 2020 Sep 2;15(9):e0236484. doi: 10.1371/journal.pone.0236484

Factors predicting students’ performance in the final pediatrics OSCE

Maysoun Al Rushood 1,*, Amal Al-Eisa 1
Editor: Oathokwa Nkomazana
PMCID: PMC7467284  PMID: 32877419

Abstract

Background

Objective Structured Clinical Examinations (OSCEs) have been used to assess the clinical competence of medical students for decades. Limited data are available on the factors that predict students’ performance on the OSCEs. The aim of our study was to evaluate the factors predicting performance on the pediatrics final OSCE, including the timing of students’ clerkship and their performance on the in-training OSCE and written examinations.

Methods

Grades in pediatrics for 3 consecutive academic years (2013–2016) were included. The average scores of the in-training and final OSCEs and written exams were compared among the three years using analysis of variance (ANOVA). The correlations between performance on the final OSCEs and the in-training OSCEs, in-training written exams and final written exams were studied using Spearman’s rho correlation test. The effect of the timing of the clerkship on final OSCE performance was evaluated.

Results

A total of 286 students’ records were included. There were 115 male students and 171 female students (M:F 1:1.5). There were strong positive correlations between students’ performance on the in-training examinations (OSCE and written) and the final OSCE (correlation coefficients of 0.508 and 0.473, respectively). The final written exam scores were positively correlated with the final OSCEs (r = 0.448). There was no significant effect of the timing of the clerkship.

Conclusions

Students’ performance on in-training examinations might predict their final OSCE scores. Thus, it is important to provide students with the necessary intervention at an early stage to reduce failure rates. The final OSCE performance does not seem to be affected by the timing of the clerkship.

Introduction

The Objective Structured Clinical Examination (OSCE) has been used to assess the clinical competence of health care professional students for decades [1, 2]. OSCEs involve a series of encounters objectively testing clinical skills including history taking, physical examination, communication and data interpretation [3]. The OSCE is a feasible approach to assessing clinical competency due to its inherent flexibility [4, 5]. The reliability and utility of this examination have been widely studied [4–11].

Studying the student-related factors that affect performance is highly important in order to provide learners with the optimal educational conditions for the best outcomes in terms of knowledge retention, application and exam performance. Correlations between OSCE performance and measures such as performance on future OSCEs, USMLEs and medical school grade point average have been reported [6–9]. However, limited data are available on other important student-related factors affecting OSCE performance, such as the timing of one’s clerkship during the academic year, performance on the written examination components and performance on the in-training OSCEs [1, 3, 12, 13]. Such correlations are crucial for evaluating whether students improve their performance throughout the academic year. We believe that students should demonstrate improved skills with each OSCE undertaken. This has important implications for medical schools in providing students with poor in-training performance with the necessary intervention at an early stage to reduce failure rates and improve students’ outcomes.

The aim of our study is to evaluate the factors affecting students’ performance on the Pediatrics Final Year OSCE, including the timing of their clerkship, their performance on the in-training OSCE and their written examination score.

Materials and methods

This descriptive study was conducted with 6th year clinical students at the Faculty of Medicine, Kuwait University, and it analyzed the grades in the pediatrics rotations for 3 consecutive classes of students for three academic years (2013–2014, 2014–2015, and 2015–2016).

The School of Medicine, Kuwait University, admits an average of 100 students every year and offers a 7-year undergraduate teaching program. The faculty curricula, for both preclinical and clinical programs, were reformed in 2005–2006, and consequently, problem-based learning was introduced [14].

Study setting

In their 6th year, students are divided into 3 groups. They rotate through pediatrics, obstetrics and gynecology, and 4 other subspecialties in medicine and surgery. Students complete a single pediatrics rotation, spending 12 weeks in pediatrics. They complete clinical rotations in the general pediatric wards, along with 2 weeks in the NICU, in the 4 main teaching hospitals in the country; each student rotates through 2 hospitals. Clinical teaching comprises bedside teaching and the teaching of procedural and communication skills. In addition, the students join the medical team in the ward, where they are assigned patients to follow and present during ward rounds.

In addition to the clinical teaching, there are seminars, medical school days and paper-based problem-based learning (PBL) sessions delivered at the faculty campus.

At the end of the pediatrics rotation, there is an end-of-block examination with both written (computer-based MCQs) and OSCE components. The final exam, carried out at the end of the year, is composed of similar components. Six external examiners from Europe and North America are invited every year to assess our students; they participate only in the final OSCE.

The OSCE is performed on real pediatric patients, healthy children and simulated parents in the hospital. We run 2 successive cycles with identical stations. The end-of-block OSCE is run in 2 teaching hospitals in one day, while the final OSCE is conducted over 2 days across the 4 main teaching hospitals. To maintain consistency, examiners are given protected time immediately before the exam to agree on the physical signs and on what the student must demonstrate to achieve the score, thereby standardizing the marking process.

The exam consists of 9 clinical stations: neurology, pulmonology, gastroenterology, cardiology, communication skills, history taking, clinical procedure demonstration, normal development assessment and one other station that could be genetics or dermatology. Except for the history-taking station, which lasts 20 minutes, the stations are 10 minutes in duration. One to two minutes are allowed between stations for students to move and for examiners to complete their marks. In addition to a checklist, a global mark is awarded based on fluency and mastery. The department of pediatrics holds an orientation session for the external examiners to explain the guidelines of the exam and the standards expected from the students. The grading sheets and the global assessment sheets are discussed, and examiners are briefed on each day of the OSCE on the standards expected. The pass mark is set before the exam, and no changes to the marks are made after the exam. All grades are expressed as percentages, with letter grades of A (≥ 90%), B (80–89%), C (60–79%) and F (<60%). The in-training OSCE is graded, and feedback is provided to failing students at a later date. Students also complete Mini-CEX sessions during their training, which are not graded but come with immediate feedback.
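For clarity, the letter-grade bands described above can be sketched as a small function. This is an illustrative sketch only: the source lists B as 80–89% and C as 60–79%, so how fractional scores at band edges are rounded is an assumption here.

```python
def letter_grade(score: float) -> str:
    """Map a percentage score to the letter grades described in the text.

    Thresholds follow the source: A >= 90, B 80-89, C 60-79, F < 60.
    Treatment of fractional scores at band boundaries is an assumption.
    """
    if score >= 90:
        return "A"
    if score >= 80:
        return "B"
    if score >= 60:
        return "C"
    return "F"
```

For example, the overall mean final OSCE score reported below (77.95) would map to a C under this scheme.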

Ethical approval was obtained from the Health Sciences Center Ethical Committee, Faculty of Medicine, Kuwait University. The Board members are the Vice Dean of Research at the Faculty of Medicine, who is the head of the committee, and five other members who are full professors in different specialties. The use of students’ grades was approved by the Vice-Dean Academic Office at the Faculty of Medicine. The confidentiality of the identity of the students included was ensured.

Statistical analysis

Data management, analysis and graphical presentation were carried out using SPSS version 25.0 (IBM Corp, Armonk, NY, USA). The descriptive statistics for students’ overall academic grade performance during the three years were presented as numbers and percentages. The average scores were computed and presented as the means ± standard deviations (SD) with a range for each examination component, and the three years were compared using analysis of variance (ANOVA) with the Bonferroni test for multiple comparisons. The Shapiro-Wilk test was used to check all the data studied for normality. Spearman’s rho correlation test was used to assess the correlations between the final OSCE performance and the in-training OSCE, in-training written and final written performances. One-way ANOVA was used to evaluate the effect of the order of the clerkship on the final OSCE performance. A two-tailed probability value p < 0.05 was considered statistically significant.
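The study ran this pipeline in SPSS; an equivalent sketch of the two key tests (Spearman’s rho for the correlations, one-way ANOVA for the clerkship-timing comparison) in Python with SciPy might look as follows. All numbers here are synthetic stand-ins, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 286  # number of student score records in the study

# Synthetic stand-in scores (illustrative only; the real data are in Tables 1-3).
in_training_osce = rng.normal(79.3, 6.4, n)
final_osce = 0.5 * in_training_osce + rng.normal(38.3, 5.0, n)

# Spearman's rho between in-training OSCE and final OSCE performance
rho, p_rho = stats.spearmanr(in_training_osce, final_osce)
print(f"Spearman rho = {rho:.3f}, p = {p_rho:.3g}")

# One-way ANOVA on final OSCE scores grouped by clerkship rotation (1st/2nd/3rd)
first, second, third = np.array_split(final_osce, 3)
f_stat, p_anova = stats.f_oneway(first, second, third)
print(f"ANOVA F = {f_stat:.2f}, p = {p_anova:.3g}")
```

Spearman’s rho is appropriate here because exam scores need not be normally distributed (hence the Shapiro-Wilk check); ANOVA compares the rotation-group means directly.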

Results

A total of 286 students’ score records were included in the study. The distribution of students across the three consecutive years is presented in Table 1; 115 students were male and 171 were female (M:F ratio 1:1.5).

Table 1. Mean score for the end of block, final OSCE, end of block written and final written by year.

Year of examination  Number of students  Male, No (%)  Female, No (%)  End of block OSCE, Mean±SD  End of block written, Mean±SD  Final OSCE, Mean±SD  Final written, Mean±SD
2013–14 92 37 (40.2) 55 (59.8) 77.96±7.2 67.6±7.6 77.71±6.2 74.3±6.5
2014–15 90 30 (33.3) 60 (66.7) 80.7±5.5 69.3±8.1 78.4±6.3 74.5±7.5
2015–16 104 48 (46.2) 56 (53.8) 79.3±6.2 66.2±8.5 77.8±6.4 75.8±6.9
Total 286 115 (40.2) 171(59.8) 79.31±6.4 67.66±8.2 77.95±6.3 74.9±7.0

Students’ grades in each exam component in each year are presented as the means ± SDs in Table 1. No significant difference was noticed in the final grades during the three years (P = 0.701).

The final grades for all the students in individual years and collectively are displayed in Fig 1. Overall, the most frequently awarded grade was C (68.9%), followed by B (25.5%).

Fig 1. Distribution of the final grades for the students in individual years and collectively.


Performance on the OSCE (Table 2)

Table 2. Spearman’s rho correlations between the final OSCE and the end of block OSCE, final written and end of block written.

Academic year Examination Correlation Coefficient P value
2013–14 End of Block OSCE 0.539 <0.001
Final written 0.402 <0.001
End of Block written 0.428 <0.001
2014–15 End of Block OSCE 0.434 <0.001
Final written 0.570 <0.001
End of Block written 0.574 <0.001
2015–16 End of Block OSCE 0.536 <0.001
Final written 0.356 <0.001
End of Block written 0.428 <0.001
Total End of Block OSCE 0.508 <0.001
Final written 0.448 <0.001
End of Block written 0.473 <0.001

Over the whole sample, there was a positive correlation between performance on the end-of-block OSCE and the final OSCE, with a correlation coefficient of 0.508. This held true for each academic year as well, with correlation coefficients of 0.539 (2013–14), 0.434 (2014–15) and 0.536 (2015–16) (Table 2).

Performance on the written exam (Table 2)

The results showed a positive correlation (r = 0.448) between the overall scores on the final written exam and the final OSCE (Table 2). A similar finding was noted when the end-of-block written exam was compared to the final OSCE, both in each year and over the whole sample.

Timing of clerkship (Table 3)

Table 3. Mean final OSCE scores by timing of clerkship.

Year of Examination  First rotation, Mean ± SD  Second rotation, Mean ± SD  Third rotation, Mean ± SD  Total, Mean ± SD
2013–14 78.23±5.59 76.15±7.66 78.62±4.97 77.71±6.16
(n = 32) (n = 29) (n = 31) (n = 92)
2014–15 77.76±6.09 77.23±5.93 80.10±6.55 78.35±6.26
(n = 29) (n = 31) (n = 30) (n = 90)
2015–16 76.54±7.20 78.03±5.93 78.85±5.97 77.81±6.41
(n = 35) (n = 34) (n = 35) (n = 104)
Total 77.47±6.34 77.19±6.49 79.17±5.84* 77.95±6.26
(n = 96) (n = 94) (n = 96) (n = 286)

* Statistically significant difference between third and second, P = 0.029.

Within individual academic years, the timing of the clerkship had no significant effect on final OSCE performance (Table 3).

The effect of the timing of the clerkship was studied for each academic year and for the overall sample. When the mean final OSCE performances of all students across the 3 academic years (2013–2016) were compared, students who did their rotation last performed significantly better than those who did it in the middle of the year (mean score 79.17 in rotation 3 vs 77.19 in rotation 2, P = 0.029), but not better than those who did it first (Table 3).

Discussion

Efforts are continuously being made to improve the assessment of medical students. The OSCE has been widely implemented due to its validity and reliability. It has added greater objectivity to students’ evaluation. Therefore, it is important to understand the factors that might affect students’ performance and the ways to improve such performance to ensure the acquisition of the clinical skills and ability to apply medical knowledge.

In our study, we have demonstrated that there are strong correlations between students’ performance on prior examinations in pediatrics rotations, both OSCE and written, and the final OSCE, as well as with the final written exam component.

Studies on correlations between formative and summative OSCEs have reported conflicting findings [1, 12–14]. Chisnall et al. found that formative OSCEs had a positive predictive value of 92.5% for passing the summative OSCE. Although formative OSCEs were associated with improved performance only on identical stations in subsequent summative OSCEs, pass rates improved on the other summative stations as well [13].

Previous reports have shown that repetitive testing enhances knowledge retention [12, 15]. This is extremely crucial in medical education, where the aim is to ensure the attainment of clinical skills for safe practice by the medical graduates. In fact, many previous studies supported the idea of assessment driving learning [16, 17].

Our finding of strong correlations between the final OSCE and the written exam components is consistent with previous studies [3, 8, 18–20]. This correlation emphasizes the importance of having a good knowledge base when taking the OSCE.

Couto et al. reported positive correlations between the summative OSCE and formative assessments, including assessments done during PBL sessions [19]. In addition, a study evaluating the predictors of OSCE performance concluded that students who took a practice National Board of Medical Examiners (NBME) shelf exam did better on both their shelf exam and their OSCE [20].

In our study, this correlation might be attributed to the fact that our OSCE is based on real pediatric patients with clinical findings; students therefore learn to perform in a way that is relevant to the clinical findings, which should be sought as the student progresses through patient encounters. Students also need to give explanations and answer a few questions at the end of each station. The ultimate objective of evaluating students for a medical certificate is not only to assess the depth and breadth of their knowledge but, more importantly, to assess their ability to apply that knowledge and relate it to clinical findings. Our OSCE is concentrated on clinical performance.

Previous studies have demonstrated that the closer students’ rotations were to their final exam, the better their scores [1, 3, 8]. In our study, the timing of the clerkship did not affect performance on the final OSCE within individual years. In the overall sample, students who did their rotation at the end of the academic year performed slightly better than those who did it in the middle of the year, but not better than those who completed their clerkship at the beginning of the year. This might be attributed to the fact that a clerkship in the middle of the year is interrupted by many holidays. Nonetheless, the difference did not hold when comparing the third rotation to the first or the second to the first. We may therefore conclude that the timing of the rotation had no consistent effect on final OSCE performance. Accordingly, distributing students across different rotations throughout the year, according to the capacity of the training centers, staff availability and the applied curriculum, should not affect their performance on final exams. Students appear to retain their clinical skills over the long term. This is crucial in medicine, as our aim is for students to be able to retrieve information when needed and to apply it later in their practice.

This is the first study in the Faculty of Medicine at Kuwait University to correlate in-training exam scores with final OSCE scores. Our data were comparable across the years, and we were able to include the complete score records of entire cohorts for the study period.

The limitations of the study include the lack of a comparison group. It would be informative to study students’ performance in the absence of in-training assessments under the same educational settings; however, this was not feasible, since there is only one school of medicine in the country, and comparing grades between different specialties would not be conclusive. The OSCEs also differed in their contents over the years: although we kept similar components (systems to be examined), the exact cases differed. This is a potential limitation, but the exams are designed this way to ensure that students do not know the exact content, which might otherwise affect their success in the examinations.

Conclusion

In conclusion, there are positive correlations between students’ performance on the final OSCE and their performance on both the in-training OSCE and the written examinations. Such correlations might be used as predictors of final OSCE grades. Use of these predictors would offer a better opportunity to provide students with the necessary intervention at an early stage to reduce failure rates. The positive correlation between the written exam components and the OSCE supports that adequate performance on OSCEs requires a good knowledge base. Final OSCE performance may not be affected by the timing of the clerkship.

Supporting information

S1 Dataset

(DOCX)

S2 Dataset. Edited-final overall grading 13–14.

(XLS)

S3 Dataset. Edited-final overall grading 14–15.

(XLS)

S4 Dataset. Edited-final overall grading 15–16.

(XLS)

Acknowledgments

The authors would like to gratefully acknowledge Ms. Asiya Tasneem Ibrahim and Dr. Prem Sharma for their assistance with the statistical analysis. We would like to acknowledge Mr. Thomas M. De Souza for his great help in extracting the students’ grades.

Data Availability

All relevant data are within the manuscript and its Supporting Information files.

Funding Statement

The authors received no specific funding for this work.

References

  • 1. Chima M, Dallangam GB. Does student performance on preclinical OSCEs relate to clerkship grades? Med Educ Online. 2016;21:31724. doi: 10.3402/meo.v21.31724. eCollection 2016.
  • 2. Bartfay WJ, Rombough R, Howse E, Leblanc R. Evaluation: The OSCE approach in nursing education. Can Nurse. 2004;100:18–23.
  • 3. Lukas RV, Adesoye T, Smith S, Blood A, Brorson J. Student assessment by objective structured examination in a neurology clerkship. Neurology. 2012;79:681–85. doi: 10.1212/WNL.0b013e3182648ba1.
  • 4. Patricio MF, Juliano M, Fareleira F, Carneiro AV. Is the OSCE a feasible tool to assess competencies in undergraduate medical education? Med Teach. 2013;35(6):503–14. doi: 10.3109/0142159X.2013.774330.
  • 5. Joshi MK, Srivastava AK, Ranjan M, Dhar M, Chumer S, et al. OSCE as a Summative Assessment Tool for Undergraduate Students of Surgery: Our Experience. Indian J Surg. 2017;79(6):534–538. doi: 10.1007/s12262-016-1521-y.
  • 6. Bodger O, Byrne A, Evans PA, Rees S, Jones G, Cowell C, et al. Graduate Entry Medicine: Selection Criteria and Student Performance. PLoS One. 2011;6:e27161. doi: 10.1371/journal.pone.0027161.
  • 7. Dong T, Swygert KA, Durning SJ, Saguil A, Gilliland WR, Cruess D, et al. Validity Evidence for Medical School OSCEs: Association with USMLE Step Assessments. Teach Learn Med. 2014;26:379–386. doi: 10.1080/10401334.2014.960294.
  • 8. Simon SR, Bui A, Berti D, Volkan K. The Relationship between Second-Year Medical Students’ OSCE Scores and USMLE Step 2 Scores. J Eval Clin Pract. 2007;13:901–905. doi: 10.1111/j.1365-2753.2006.00768.x.
  • 9. Dong T, Saguil A, Artino AR Jr, Gilliland WR, Waechter DM, Lopreiato J, et al. Relationship Between OSCE Scores and Other Typical Medical School Performance Indicators: a 5-year Cohort Study. Mil Med. 2012;177:44–46. doi: 10.7205/milmed-d-12-00237.
  • 10. Kreptul D, Thomas RE. Family Medicine Resident OSCEs: A Systematic Review. Educ Prim Care. 2016;13:1–7.
  • 11. Sobh AH, Austin Z, Izham MIM, Diab MI, Wilby KJ. Application of a Systematic Approach to Evaluating Psychometric Properties of a Cumulative Exit-From-Degree Objective Structured Clinical Examination (OSCE). Curr Pharm Teach Learn. 2017;9(6):1091–1098. doi: 10.1016/j.cptl.2017.07.011.
  • 12. Nackman GB, Sutyak J, Lowry SF, Rettie C. Predictors of Educational Outcome: Factors Impacting Performance on a Standardized Clinical Evaluation. J Surg Res. 2002;106(2):314–318. doi: 10.1006/jsre.2002.6477.
  • 13. Chisnall B, Vince T, Hall S, Tribe R. Evaluation of Outcomes of a Formative Objective Structured Clinical Examination for Second-Year UK Medical Students. Int J Med Educ. 2015;6:76–83. doi: 10.5116/ijme.5572.a534.
  • 14. Alkhateeb NE, Al-Dabbagh A, Ibrahim M, Al-Tawil NG. Effect of a Formative Objective Structured Clinical Examination on the Clinical Performance of Undergraduate Medical Students in a Summative Examination: A Randomized Controlled Trial. Indian Pediatr. 2019;56(9):745–748.
  • 15. Zahid MA, Vaegese R, Mohammed AM, Ayed AK. Comparison of Problem-Based Learning-Driven with Traditional Didactic-Lecture-Based Curricula. Int J Med Educ. 2016;7:181–187. doi: 10.5116/ijme.5749.80f5.
  • 16. Newble D. Revisiting ‘The Effect of Assessments and Examinations on the Learning of Medical Students’. Med Educ. 2016;50:498–501. doi: 10.1111/medu.12796.
  • 17. Stormann S, Stankiewicz M, Raes P, Berchtold C, Kosanke Y, Illes G, et al. How Well Do Final Year Undergraduate Medical Students Master Clinical Skills? GMS J Med Educ. 2016;33(4):Doc58.
  • 18. Gillette C, Stanton RB, Anderson HG Jr. Student Performance on a Knowledge-Based Exam May Predict Student Ability to Communicate Effectively with a Standardized Patient During an Objective Structured Clinical Examination. Curr Pharm Teach Learn. 2017;9(2):201–207. doi: 10.1016/j.cptl.2016.11.004.
  • 19. Couto LB, Durand MT, Wolff ACD, Restini CBA, Faria M Jr, et al. Formative assessment scores in tutorial sessions correlate with OSCE and progress testing scores in a PBL medical curriculum. Med Educ Online. 2019;24(1):1560862. doi: 10.1080/10872981.2018.1560862.
  • 20. Sampat A, Rouleau G, O’Brien C, Zadikoff C. Neurology Clerkship Predictors of Objective Structured Clinical Exam and Shelf Performance. J Med Educ Curric Dev. 2019;6:1–7.

Decision Letter 0

Oathokwa Nkomazana

26 Jun 2020

PONE-D-19-27962

Factors Predicting Students’ Performance in the Final Pediatrics OSCE

PLOS ONE

Dear Dr. Al Rushood,

Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

We would appreciate receiving your revised manuscript by Jul 03 2020 11:59PM. When you are ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter.

To enhance the reproducibility of your results, we recommend that if applicable you deposit your laboratory protocols in protocols.io, where a protocol can be assigned its own identifier (DOI) such that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

Please include the following items when submitting your revised manuscript:

  • A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). This letter should be uploaded as separate file and labeled 'Response to Reviewers'.

  • A marked-up copy of your manuscript that highlights changes made to the original version. This file should be uploaded as separate file and labeled 'Revised Manuscript with Track Changes'.

  • An unmarked version of your revised paper without tracked changes. This file should be uploaded as separate file and labeled 'Manuscript'.

Please note while forming your response, if your article is accepted, you may have the opportunity to make the peer review history publicly available. The record will include editor decision letters (with reviews) and your responses to reviewer comments. If eligible, we will contact you to opt in or out.

We look forward to receiving your revised manuscript.

Kind regards,

Oathokwa Nkomazana, MD MSC PhD

Academic Editor

PLOS ONE

Additional Editor Comments:

Thank you for an interesting article.

Please address the comments by the reviewer. In addition please address the following:

1. In the Methods section, it will be helpful to have a section titled 'Study setting' where you describe the setting of the medical school and where the clinical training is done, focusing on paediatrics: How is the training done? How many paediatric rotations do the students do in the course of training? Are the PBL sessions paper-based or are they based on real patients? What is entailed in clinical teaching?

How is the quality of the examinations assured? You only mention external examiners for the final OSCE. Is there blueprinting of the examination? Is there moderation? Is it internal, or are there aspects of external moderation? What does it mean that "examiners are provided protected time just before exams"? What are the examiners briefed on?

In the data collection, make it clear whether you followed the same cohort of students over three years, which is what I think you are reporting, but it's not clear. You say that you enrol 100 students a year but do not explain why you do not have 300 students in your study. Whom have you excluded?

In the results section, you say that students who did their rotation last performed better than those who did it second, but not better than those who did it first. How do you explain the difference? Could this be attributed to differences in the assessments?

In the discussion, the 7th paragraph, which addresses differences in performance based on the timing of the rotation, the inferences are not very clear, and it is not clear how they are supported by the reported findings. You conclude the paragraph by applauding the recall of information; in medical education, however, recall is not considered of very high value, but rather information literacy and lifelong learning capacity, as the half-life of medical information is very short.

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and

https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2. We suggest you thoroughly copyedit your manuscript for language usage, spelling, and grammar. If you do not know anyone who can help you do this, you may wish to consider employing a professional scientific editing service.  

Whilst you may use any professional scientific editing service of your choice, PLOS has partnered with both American Journal Experts (AJE) and Editage to provide discounted services to PLOS authors. Both organizations have experience helping authors meet PLOS guidelines and can provide language editing, translation, manuscript formatting, and figure formatting to ensure your manuscript meets our submission guidelines. To take advantage of our partnership with AJE, visit the AJE website (http://learn.aje.com/plos/) for a 15% discount off AJE services. To take advantage of our partnership with Editage, visit the Editage website (www.editage.com) and enter referral code PLOSEDIT for a 15% discount off Editage services.  If the PLOS editorial team finds any language issues in text that either AJE or Editage has edited, the service provider will re-edit the text for free.

Upon resubmission, please provide the following:

  • The name of the colleague or the details of the professional service that edited your manuscript

  • A copy of your manuscript showing your changes by either highlighting them or using track changes (uploaded as a *supporting information* file)

  • A clean copy of the edited manuscript (uploaded as the new *manuscript* file)

3. Thank you for stating in the manuscript Methods section:

'Ethical approval was obtained from The Health Sciences Ethical Committee. The use of students’ grades was approved by the Vice-Dean Academic Office at the Faculty of Medicine. Confidentiality of identity of students included was secured.' 

a. Please amend your current ethics statement to include the full name of the ethics committee/institutional review board(s) that approved your specific study.  

b. Once you have amended this/these statement(s) in the Methods section of the manuscript, please add the same text to the “Ethics Statement” field of the submission form (via “Edit Submission”).

For additional information about PLOS ONE ethical requirements for human subjects research, please refer to http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research

4. PLOS requires an ORCID iD for the corresponding author in Editorial Manager on papers submitted after December 6th, 2016. Please ensure that you have an ORCID iD and that it is validated in Editorial Manager. To do this, go to ‘Update my Information’ (in the upper left-hand corner of the main menu), and click on the Fetch/Validate link next to the ORCID field. This will take you to the ORCID site and allow you to create a new iD or authenticate a pre-existing iD in Editorial Manager. Please see the following video for instructions on linking an ORCID iD to your Editorial Manager account: https://www.youtube.com/watch?v=_xcclfuvtxQ

5. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially identifying or sensitive patient information) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. Please see http://www.bmj.com/content/340/bmj.c181.long for guidelines on how to de-identify and prepare clinical data for publication. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

6. Please include your tables as part of your main manuscript and remove the individual files.

Please note that supplementary tables should be uploaded as separate "supporting information" files.


Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions?

The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Partly

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #1: No

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English?

PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes

**********

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: I read the manuscript with great interest. I suggest the following comments to make it publishable:

• There is a need to address other factors that may affect the final OSCE, such as gender and the day of the exam (the authors mention that they have around 30 students per group and that there were 9 stations run in 2 cycles; this means that the exam was either repeated over three days or held in three different places). Studying these factors will add to the quality of the paper.

• Was the in-training OSCE formative (without marks)? Did you provide any feedback to students in the in-training OSCE?

• The term "error bar" is mentioned in the methods section, but no error bars are shown in the figure.

• In the results section ('End of block OSCE' and 'End of block written' showed some improvement, especially in the year 2014-15 compared to 2013-14 and 2015-16 (p=0.016 & p=0.031), respectively): could you clarify this sentence further, explain what it means, and refer to the table number? The word "improvement" does not fit the intended meaning here (improvement means that students' scores improved in this test compared to a previous test).

• In the discussion section, the authors should shed light on the low r value between the written exam and the final OSCE. The OSCE should usually be a test of clinical competence, not knowledge; try to discuss this point when comparing the final OSCE and the written test. Could the good correlation between the written test and the OSCE be due to OSCE stations that are more knowledge-focused rather than performance-focused?

• To claim the reliability of a test, a reliability index is needed.

• In the conclusion, the first sentence should be removed, as it is not supported by the results.

• Try to use some newer references that address a similar problem.

Alkhateeb, N. E., Al-Dabbagh, A., Ibrahim, M., & Al-Tawil, N. G. (2019). Effect of a Formative Objective Structured Clinical Examination on the Clinical Performance of Undergraduate Medical Students in a Summative Examination: A Randomized Controlled Trial. Indian pediatrics, 56(9), 745–748.

English editing is advisable.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.

If you choose “no”, your identity will remain anonymous but your review may still be made public.

Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files to be viewed.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email us at figures@plos.org. Please note that Supporting Information files do not need this step.

PLoS One. 2020 Sep 2;15(9):e0236484. doi: 10.1371/journal.pone.0236484.r002

Author response to Decision Letter 0


1 Jul 2020

1. In the Methods section, it will be helpful to have a section titled "Study setting", where you describe the setting of the medical school and where the clinical training is done, focusing on paediatrics: How is the training done? How many paediatric rotations do the students do in the course of training? Are the PBL sessions paper based or are they based on real patients? What is entailed in clinical teaching?

Our response: A section entitled "Study Setting" was added with the required details (lines 77-87, pages 4-5).

2. How is the examination quality assured? You only talk about external examiners in the final OSCE. Is there blueprinting of the examination? Is there moderation? Is it internal, or are there aspects of external moderation? What does it mean that "Examiners are provided protected time just before exams"? What are the examiners briefed on?

Our response:

The pediatric exams are prepared by the staff in the department, and each member contributes to each exam. There is a blueprint for each exam, which is followed. A departmental assessment committee, comprising academic staff in the department, evaluates each exam thoroughly. The exam is then sent to the Faculty of Medicine exam committee for further evaluation and approval. Exam moderation is done internally. External examiners serve as examiners in the final OSCE; in addition, they evaluate the final written exam (line 91, page 5).

“Examiners are provided protected time just before exams.” Our response: Before the start of the final OSCE, examiners are assigned time to check the patients' physical signs in the different exam stations and to agree on what students should demonstrate (lines 96-98, page 5).

What are the examiners briefed on? Our response: A few days prior to the final exam, the department of pediatrics, represented by the head of the department, the chair of the assessment committee and other members, holds an orientation session for the external examiners to explain the guidelines and layout of the exam and the standards expected from the students, so as to define the pass/fail marks. The grading sheets and the global assessment sheets are discussed (lines 105-108, page 6).

3. In the data collection, make it clear if you followed the same cohort of students over three years, which is what I think you are reporting, but it's not clear. You say that you enroll 100 students a year but do not explain why you do not have 300 students in your study. Who have you excluded?

Our response: We did not follow the same cohort for 3 years. We included the students who rotated in pediatrics in the academic years 2013-14, 2014-15 and 2015-16.

The total number of students in the study was 286 because, although the faculty admits, on average, 100 students every year, 14 students dropped out early for different reasons (lines 67-69, page 4).

4. In the results section, you say that students who did their rotation last performed better than those who did it second, but not better than those who did it first. How do you explain the difference? Could this be attributed to differences in the assessments?

Our response: There are no differences in the assessments. We expected students' performance in the final OSCE to be best for those who did their pediatrics rotation towards the end of the year, closer to the final exam. However, this was not consistent, especially when compared with those who did their rotation at the beginning of the academic year. Students doing their pediatrics rotation second (i.e., in the middle of the academic year) performed worse in the final OSCE. An explanation for this may be the fact that the rotation in the middle of the year is interrupted by many holidays (New Year, Christmas and our National Days), so the students might be distracted. Of note, this difference was statistically significant only when we compared the whole study population; it was not demonstrated among students in each year, or when comparing the last rotation to the first or the middle to the first (lines 225-228, page 12).

5. In the discussion, the 7th paragraph that talks to differences in performance based on the timing of the rotation, the inferences are not very clear and it's not clear how this is supported by the reported findings.

Our response: We reported no significant differences in final OSCE performance among the 3 groups in each year. The only significant difference appeared when we looked at the whole study sample: there was a slight improvement in the final OSCE performance of the students rotating last compared to those rotating in the middle of the year (means = 79.17 vs. 77.19, p = 0.028); however, this did not hold when comparing the third rotation to the first, or the second to the first. Therefore, there might be no effect of the timing of the rotation on final OSCE performance, and dividing students into different rotations throughout the year, according to the capacity of the training centers, staff availability and the applied curriculum, should not affect their performance in the final exams (lines 220-228, page 12).

6. You conclude the paragraph by applauding the recall of information; in medical education, however, recall is not considered of very high value. Rather, information literacy and life-long learning capacity are valued, as the half-life of medical information is very short.

Our response:

We agree. Recall of information was not what we meant; rather, it is the life-long learning capacity and the ability to perform the learned skills fluently and efficiently. This was clarified (line 232, page 12).

Journal Requirements:

When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at

The format was modified accordingly.

2. We suggest you thoroughly copyedit your manuscript for language usage, spelling, and grammar. If you do not know anyone who can help you do this, you may wish to consider employing a professional scientific editing service.

The manuscript was edited by AJE.

3. Thank you for stating in the manuscript Methods section:

'Ethical approval was obtained from The Health Sciences Ethical Committee. The use of students’ grades was approved by the Vice-Dean Academic Office at the Faculty of Medicine. Confidentiality of identity of students included was secured.'

a. Please amend your current ethics statement to include the full name of the ethics committee/institutional review board(s) that approved your specific study.

b. Once you have amended this/these statement(s) in the Methods section of the manuscript, please add the same text to the “Ethics Statement” field of the submission form (via “Edit Submission”).

For additional information about PLOS ONE ethical requirements for human subjects research, please refer to http://journals.plos.org/plosone/s/submission-guidelines#loc-human-subjects-research

Ethical approval was obtained from the Health Sciences Center Ethical Committee, Faculty of Medicine, Kuwait University. The Board members are the Vice Dean of Research at the Faculty of Medicine as the head of the committee and five other members who are full professors in different specialties.

4. PLOS requires an ORCID iD for the corresponding author in Editorial Manager on papers submitted after December 6th, 2016.

The ORCID iD of the corresponding author was updated in the Editorial Manager system.

ORCID ID: 0000-0001-6148-1707.

5. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

In your revised cover letter, please address the following prompts:

a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially identifying or sensitive patient information) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. Please see http://www.bmj.com/content/340/bmj.c181.long for guidelines on how to de-identify and prepare clinical data for publication. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

We will update your Data Availability statement on your behalf to reflect the information you provide.

The data will be available and shared. To share the data, we need to apply for permission from the administration at the Faculty of Medicine, which is a lengthy procedure; because of the COVID-19 pandemic and the lockdown in the country, we will not be able to obtain the permission in a timely manner. We will share the data when the situation in the country returns to normal.

6. Please include your tables as part of your main manuscript and remove the individual files. Please note that supplementary tables should be uploaded as separate "supporting information" files.

The tables were included as part of the main manuscript file. The supplementary files were removed.

Response to reviewer #1 comments:

Thank you for your valuable comments. We have responded and made the necessary changes as follows:

5. Review Comments to the Author

Reviewer #1: I read the manuscript with great interest. I suggest the following comments to make it publishable:

• There is a need to address other factors that may affect the final OSCE, such as gender and the day of the exam (the authors mention that they have around 30 students per group and that there were 9 stations run in 2 cycles; this means that the exam was either repeated over three days or held in three different places). Studying these factors will add to the quality of the paper.

Our response: We agree. Gender and the day of the examination, if it is conducted over more than one day, are important factors that might affect exam performance. In our medical school, the end-of-block OSCE is conducted in one day in 2 hospitals. The final OSCE is run over 2 days, in 4 hospitals per day (lines 94-96, page 5). A study on the effect of students' demographic characteristics, including gender, age, type of housing, order among siblings and other factors, is being carried out.

• Was the in-training OSCE formative (without marks)? Did you provide any feedback to students in the in-training OSCE?

Our response: The in-training OSCE is graded. Feedback is provided only to the students who failed an exam station. During their 3 months of clerkship, the students perform Mini-CEX assessments, where immediate feedback is given (lines 110-112, page 6).

• The term "error bar" is mentioned in the methods section, but no error bars are shown in the figure.

Our response: This was a mistake. There was an error bar, but we opted to remove it in the final version of the manuscript, as it did not add much to the data analysis. This sentence should have been removed from the text prior to submission. Thank you for pointing it out.

The sentence (lines 125-126, page 7) was removed.

• In the results section ('End of block OSCE' and 'End of block written' showed some improvement, especially in the year 2014-15 compared to 2013-14 and 2015-16 (p=0.016 & p=0.031), respectively): could you clarify this sentence further, explain what it means, and refer to the table number? The word "improvement" does not fit the intended meaning here (improvement means that students' scores improved in this test compared to a previous test).

Our response: We agree that the word "improvement" is misleading, as it implies comparing the performance of the same students over time. In this paragraph, we refer to Table 1, which shows the stability of our exam grades over the years; this makes the study data comparable, so conclusions can be drawn. There are no significant differences between the grades of the different exam components over the years. However, since we think the sentence is confusing, it was removed from the text (lines 144-146, page 8).

• In the discussion section, the authors should shed light on the low r value between the written exam and the final OSCE. The OSCE should usually be a test of clinical competence, not knowledge; try to discuss this point when comparing the final OSCE and the written test. Could the good correlation between the written test and the OSCE be due to OSCE stations that are more knowledge-focused rather than performance-focused?

Our response: We agree that the OSCE should measure clinical competence. Many studies have reported correlations between written exam performance and the OSCE (references 3, 8, 18-20). As we have real patients in the OSCE run at our institution, we think that students need a good knowledge base in order to perform well. Clinical findings should be sought as the student progresses through the patient encounter, and students need to give explanations and answer a few questions at the end of the station. The OSCE is conducted to ensure mastery of clinical skills. Further clarification, with new references, was added to the paragraph (lines 203-219, pages 11-12).
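(For readers unfamiliar with the statistic underlying this exchange: the study's reported r values are Spearman rank correlations between exam components. The following is a minimal stdlib-only sketch of how such a coefficient is computed, using made-up scores for a hypothetical class of eight students, not the study's actual data.)

```python
# Illustrative only: Spearman's rho between hypothetical written-exam and
# final OSCE scores. Spearman's rho is the Pearson correlation of the ranks.
from statistics import mean

def ranks(xs):
    # Assign 1-based ranks, averaging ranks over tied values.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of sorted positions i..j, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    # Pearson correlation computed on the rank vectors.
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

written = [62, 75, 81, 68, 90, 73, 85, 58]  # hypothetical written-exam scores
osce    = [65, 72, 84, 70, 88, 69, 80, 61]  # hypothetical final OSCE scores
print(round(spearman_rho(written, osce), 3))  # → 0.952
```

In practice one would use `scipy.stats.spearmanr`, which also returns the p-value; the hand-rolled version above just makes the rank-then-correlate idea explicit.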

• To claim the reliability of a test, a reliability index is needed.

Our response: True. The sentence was removed from the text, as it might be confusing (line 205, page 11).

• In the conclusion, the first sentence should be removed, as it is not supported by the results.

Our response: We agree. The sentence was removed (lines 250-251, page 13).

• Try to use some newer references that address a similar problem.

Alkhateeb, N. E., Al-Dabbagh, A., Ibrahim, M., & Al-Tawil, N. G. (2019). Effect of a Formative Objective Structured Clinical Examination on the Clinical Performance of Undergraduate Medical Students in a Summative Examination: A Randomized Controlled Trial. Indian pediatrics, 56(9), 745–748.

Our response: Thank you for the reference. This article and 2 other new references were added (references 14, 19 and 20).

English editing is advisable.

Our response: The manuscript was edited by AJE.

Attachment

Submitted filename: response to Reviewers.docx

Decision Letter 1

Oathokwa Nkomazana

9 Jul 2020

Factors Predicting Students’ Performance in the Final Pediatrics OSCE

PONE-D-19-27962R1

Dear Dr. Maysoun Al Rushood,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.

Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication.

An invoice for payment will follow shortly after the formal acceptance. To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they’ll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,

Oathokwa Nkomazana, MD MSC PhD

Academic Editor

PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Acceptance letter

Oathokwa Nkomazana

21 Aug 2020

PONE-D-19-27962R1

Factors Predicting Students’ Performance in the Final Pediatrics OSCE

Dear Dr. Al Rushood:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org.

Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,

PLOS ONE Editorial Office Staff

on behalf of

Dr. Oathokwa Nkomazana

Academic Editor

PLOS ONE

Associated Data

    This section collects any data citations, data availability statements, or supplementary materials included in this article.

    Supplementary Materials

    S1 Dataset

    (DOCX)

    S2 Dataset. Edited-final overall grading 13–14.

    (XLS)

    S3 Dataset. Edited-final overall grading 14–15.

    (XLS)

    S4 Dataset. Edited-final overall grading 15–16.

    (XLS)

    Attachment

    Submitted filename: response to Reviewers.docx

    Data Availability Statement

    All relevant data are within the manuscript and its Supporting Information files.


    Articles from PLoS ONE are provided here courtesy of PLOS
