Medical Science Educator. 2021 Oct 6;31(6):1803–1812. doi: 10.1007/s40670-021-01421-9

Tutor–Student Partnership in Practice OSCE to Enhance Medical Education

Eve Cosker 1,2, Valentin Favier 3, Patrice Gallet 3,4,5, Francis Raphael 5, Emmanuelle Moussier 5, Louise Tyvaert 5,6, Marc Braun 4,5, Eva Feigerlova 4,5,7,8
PMCID: PMC8651844  PMID: 34956698

Abstract

Background

Training of examiners is essential to ensure the quality of the objective structured clinical examination (OSCE). We aimed to study the perceived effectiveness of a tutor–student partnership in a practice OSCE module, as reported by novice OSCE tutors and medical students.

Method

We implemented a practice OSCE at a medical faculty in France with novice tutors and third-year medical students as partners. Each tutor (n = 44) served as a partner for a group of 5 students in the conception of the scenario and as an evaluator of the tutored station. Students (n = 303) were involved in the conception of a case and in the roles of physician, evaluator, and simulated patient. Data were obtained through self-assessment questionnaires. Descriptive statistics were used to analyze the questionnaire items. Free-form answers were coded and analyzed thematically.

Results

A total of 36 tutors (82%) and 185 students (61%) responded to the questionnaires. The intervention was well perceived. Thirty-two percent of the tutors reported some difficulties in the assessment of student performance and were willing to receive further training. Fifty-five percent of the students considered participation in the OSCE case development appropriate to their level of knowledge, and 70% perceived it as beneficial in allowing them to set their learning goals.

Conclusion

This initiative provides a relevant method that benefits OSCE tutors, medical students, and the faculty. Tutors learn how to assess student performance according to expected achievement levels, and students are engaged as partners in the co-creation of learning and teaching.

Supplementary Information

The online version contains supplementary material available at 10.1007/s40670-021-01421-9.

Keywords: Objective structured clinical examination, Tutor–student partnership, Co-creation in learning and teaching, Novice tutors, Medical students

Introduction

The objective structured clinical examination (OSCE) is used as a standard method for assessing clinical performance levels in undergraduate medical education in many countries. Training of examiners is essential to ensure the quality of the OSCE, and this is particularly challenging for high-stakes examinations. Orientation sessions or video-based training methods are usually provided to familiarize OSCE examiners with scoring instruments and standards for acceptable performance [1–3]. However, conventional training is time-consuming and resource-demanding for faculty staff. Despite established quality assurance procedures, a variety of studies have reported that inaccurate assessment by examiners contributes significantly to variability in the evaluation of student performance in both clinical and communication skills [4–8].

The challenge in writing OSCE stations is to integrate the assessment of relevant knowledge within a clinical context, which is why the creation of stations is usually entrusted to experienced faculty staff. However, the involvement of examiners in the writing and construction of OSCE stations, rather than their experience in teaching or clinical medicine, has been positively associated with increased reliability of evaluations [4].

The literature supports that pedagogical approaches involving students as partners help tutors to increase their understanding of how students learn and to identify students in difficulty, and enable students to increase their confidence and self-efficacy [9]. According to Cook-Sather, involving students as partners is “…a collaborative, reciprocal process through which all participants have the opportunity to contribute equally, although not necessarily in the same ways, to curricular or pedagogical conceptualization, decision-making, implementation, investigation, or analysis” [10]. Most importantly, involving students as partners in the learning process promotes student–tutor relationships, enhances students’ self-evaluation, and strengthens their sense of being part of the community [10–13]. Involving students in the creation of OSCE cases, with the guidance and feedback of a clinically experienced tutor, could be beneficial for students’ engagement in teaching and learning activities. Reciprocally, elaborating an OSCE case with students as partners might help novice OSCE tutors to acquire the skills necessary for a summative OSCE, to better understand how students perceive their engagement in learning activities, how to address students’ needs, and how to assess student performance according to learning outcomes.

In France, the undergraduate medical education program lasts 6 years of full-time study: the first 2 years are dedicated to pre-clerkship learning, and the last 4 years, which integrate theoretical learning and clinical rotations, are dedicated to the acquisition of clinical competencies in clinical settings. A practice OSCE module was developed at the Medical Faculty of the University of Lorraine in 2019 as part of the 3rd-year medical curriculum. Novice OSCE tutors, with varied clinical and teaching experience, and third-year medical students, who had never undertaken an OSCE before, collaborate as partners. The objectives of the practice OSCE module are (i) to train tutors to create and evaluate OSCE stations and (ii) to allow students to experience all facets of the OSCE, including the co-creation of the case and the roles of physician candidate, evaluator, and simulated patient. This initiative targets both tutors and medical students. Tutors receive training in the assessment of student performance according to expected achievement levels; they further learn how to identify students in difficulty and how to address students’ learning goals. Students are actively engaged in learning activities and build relationships with tutors who provide them with learning support.

The aim of the present article is to describe the design of this practice OSCE module and to explore its effectiveness as perceived by both OSCE tutors and medical students.

Methods

This is a single-center initiative at the Medical Faculty of the University of Lorraine, France, proposed to all students as part of the third-year medical education curriculum. There were no penalties for non-attendance and no grades. Participation was voluntary for OSCE tutors. During the practice OSCE module, tutors and third-year medical students collaborated as partners: each tutor served as a partner for a group of 5 students in the co-creation of the OSCE scenario and as an evaluator of the tutored station.

Description of the Practice OSCE Module

The practice OSCE module is illustrated in Fig. 1. The first phase consisted of initial half-day training sessions, organized separately for tutors and students by members of the OSCE teaching staff (three permanent medical teachers: EF, PG, and LT) and one pedagogical engineer (EM). Online material for self-paced learning was provided to all participants: (i) an OSCE orientation video (developed by the investigators with the help of the Department of Digital Media of the University of Lorraine and of the University Centre for Education by Medical Simulation, CUESiM, Virtual Hospital of Lorraine); (ii) steps for the creation of OSCE stations, with templates; and (iii) a checklist with binary items (containing a maximum of 5 items with 25 required answers) and global rating scales (an adapted version of the Maas-Global Manual 2000 [14]).
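To make the checklist-plus-global-rating scoring sheet concrete, here is a minimal sketch in Python; the class names, fields, and the assumed 1–6 rating range are our own illustrative choices, not the authors' instrument.

```python
# Illustrative sketch (not the authors' implementation) of a station scoring
# sheet combining binary checklist items with a global rating scale.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ChecklistItem:
    text: str
    done: bool = False  # binary item: performed / not performed

@dataclass
class StationScoreSheet:
    station: str
    checklist: list[ChecklistItem] = field(default_factory=list)
    global_rating: Optional[int] = None  # assumed 1-6 scale (Maas-Global-style)

    def checklist_score(self) -> float:
        """Fraction of binary checklist items marked as performed."""
        if not self.checklist:
            return 0.0
        return sum(item.done for item in self.checklist) / len(self.checklist)
```

Keeping the binary checklist separate from the global rating mirrors the two instruments named above and leaves the weighting of the two components as a separate design decision.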

Fig. 1.

Fig. 1

Formative OSCE module. a General description of formative OSCE module. 1 — creation of the OSCE case by students with tutors as partners; 2 — revision (patient script, checklist items and rating scales) (OSCE staff); 3 — practice OSCE sessions with post-encounter feedback by tutors; 4 — final debriefing of the OSCE station (tutor with his/her group of students). b Illustration of the OSCE circuit. The OSCE circuit is illustrated for one tutor (red color) and one student (green color), both co-creators of the OSCE station 1. The OSCE tutors served as evaluators on the station they had tutored. Each student experienced the roles of a simulated patient (once), a candidate physician (once), and an evaluator (three times). The student who created the station did not play the role of the physician or that of the patient on the same station, but experienced these roles on other stations. The entire circuit took 110 min for each student. For each station, four evaluators were assigned: the OSCE tutor in charge of the station, one of the students who had co-created the station, and three students who were not involved in the station development

The second phase consisted of the development and rehearsal of OSCE stations by tutor-guided groups of five students. Tutors chose the themes for OSCE stations following the learning objectives of the 3rd-year medical curriculum. Scenarios were based on one chief clinical complaint to allow the assessment of focused history-taking, physical examination, or diagnostic approach, together with communication skills. The OSCE teaching staff was responsible for reviewing and final online editing of each OSCE scenario.

The third phase consisted of the practice OSCE sessions. Before the rotations started, each tutor received a guide on how to provide structured feedback to students. The practice OSCE sessions comprised 4 circuits scheduled over 7 half-days. Four groups of students simultaneously circulated through a 5-station circuit in a seminar room. Students had 7 min at each station to complete the task, 10 min for post-encounter feedback with the tutor, 2 min to switch between stations, and 15 min to familiarize themselves with the script of a simulated patient. OSCE tutors served as evaluators on the station they had tutored. Students experienced the roles of a simulated patient (once), a candidate physician (once), and an evaluator (three times). The entire circuit took 110 min for each student. For each station, four evaluators were assigned: the OSCE tutor in charge of the station, one of the students who had co-created the station, and three students not involved in the station development. After completion of each station, 10 min of performance feedback was provided by the tutor to the candidate physician. Students serving as evaluators were also encouraged to provide comments if they wished to do so. Members of the OSCE teaching staff were close enough to intervene if necessary.
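The 110-min total is consistent with the per-station timings; the sketch below reproduces the arithmetic under one reading of the schedule (a 2-min switch counted after each of the 5 stations).

```python
# Sanity check of the circuit timing described above; durations are taken
# from the text, variable names are our own.
TASK_MIN = 7          # time at each station to complete the task
FEEDBACK_MIN = 10     # post-encounter feedback with the tutor
SWITCH_MIN = 2        # transition between stations
SCRIPT_PREP_MIN = 15  # familiarization with the simulated-patient script
N_STATIONS = 5

per_station = TASK_MIN + FEEDBACK_MIN + SWITCH_MIN   # 19 min per station
total = SCRIPT_PREP_MIN + N_STATIONS * per_station   # 15 + 5 * 19 = 110 min
print(f"Total circuit time per student: {total} min")
```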

After completion of one OSCE circuit, each tutor and his or her group of 5 students held a 25-min debriefing session aiming to improve the OSCE station based on observations from the rotations and to discuss perspectives for future work. The scoring sheets, final synthesis, and revised scenario were submitted to the OSCE teaching staff online using the Google Docs platform. The same day, all participants were invited by e-mail to complete an anonymous online survey regarding their experience as tutors, evaluators, standardized patients, and examinees.

Students’ and Tutors’ Self-Reports

The self-assessment questionnaires for tutors and students were developed by the investigators after a literature review, with responses on Likert-type scales and open-response questions, to explore participants’ objectives for taking part in the practice OSCE module, their views of the practice OSCE, and its perceived benefits. The final version of the questionnaire for tutors encompassed 2 domains (Supplemental online material 1): the first concerned the preparation of the OSCE case with students (4 items), and the second focused on tutors’ perceptions of their role as tutors on the OSCE stations (5 items). The questionnaire for students comprised 3 domains (Supplemental online material 2): the first explored the objectives for participating in the practice OSCE module (5 items), the second concerned the creation of the OSCE case (5 items), and the third analyzed students’ perceptions and experience of the practice OSCE sessions (9 items). Six-point Likert-type scales ranged from 1 (strongly disagree) to 6 (strongly agree). Both questionnaires included a voluntary section for respondents to submit in writing any free comments about their experience of the practice OSCE module. The questionnaires were created as an online survey using the Google Docs platform (Questionnaire for Tutors: https://docs.google.com/forms/d/e/1FAIpQLSd9KJxVQR4jPddVhKc7uxRaOWquHTZYhEg0uhsYPyVfHTxF5g/viewform, Questionnaire for Students: https://docs.google.com/forms/d/e/1FAIpQLSeqtfeY0QIcr1ok-BRt3V8Jd_05uPjpsNnrQcBVy1XmLxkAkg/viewform) and took 10 min to complete. A web link to the anonymous online survey was made available to the study participants after the practice OSCE module as a post-assessment survey. No pre-assessment survey was administered before the practice OSCE module, to avoid the response-shift bias that might occur, for example, as a consequence of going through the different phases of the learning module [15]. All responses were kept confidential, and the questionnaires did not allow the research materials to be linked to individual study participants. The study was approved by the Institutional Review Board of the University of Lorraine and registered with the French National Commission for Data Protection and Liberties (n° 2020/118).

Data Analysis

Items of the questionnaires were analyzed as continuous variables and reported as means with standard deviations (SD) and medians (minimum–maximum). Additional analyses were performed with the items dichotomized into two categories (Strongly Agree/Agree/Slightly Agree vs. Slightly Disagree/Disagree/Strongly Disagree) [16]. The free-form answers were coded and analyzed thematically by two investigators, and any discrepancies were resolved by consensus with two other investigators [17].
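As an illustration of this analysis, the sketch below (a minimal example, not the authors' code; item names and responses are invented) summarizes a 6-point item and dichotomizes it into positive versus negative answers.

```python
# Descriptive summary of 6-point Likert items, as described above.
# Example data are invented; 1 = Strongly Disagree ... 6 = Strongly Agree.
import pandas as pd

responses = pd.DataFrame({
    "instructions_were_clear": [5, 6, 4, 5, 3, 6],
    "cases_adapted_to_level": [3, 2, 4, 3, 5, 2],
})

for item in responses.columns:
    s = responses[item].dropna()
    pct_positive = (s >= 4).mean() * 100  # Slightly Agree/Agree/Strongly Agree
    print(
        f"{item}: {pct_positive:.0f}% positive, "
        f"mean {s.mean():.1f} (SD {s.std():.1f}), "
        f"median {s.median():.0f} (min {s.min()}, max {s.max()})"
    )
```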

Results

During the OSCE case development, 29 tutors served as partners for one group of students, 14 tutors for 2 groups, and one tutor for 3 groups. Sixty-one OSCE stations were developed by the tutor-guided groups, encompassing 16 clinical situations as chief complaints and 20 medical specialties (Table 1). All OSCE scenarios described a simulated patient encounter with history taking or assessment of clinical skills, together with evaluation of communication skills. All stations listed in Table 1 were created by students with tutors as their partners; the OSCE teaching staff was responsible for reviewing and final online editing of each OSCE scenario. Each medical student was thus a co-creator of one OSCE station and participated in its revision after the OSCE session. A total of 303 students (90% of the third-year medical students) and 44 OSCE tutors (mean teaching experience 2.8 ± 4 SD years) participated in the practice OSCE module.

Table 1.

Description of OSCE cases

| Chief complaints/symptoms | Organ system/medical condition | Number of stations (n = 61) |
| --- | --- | --- |
| Abdominal pain | Surgical emergency (2) | 2 |
| Cough | Pulmonary infection (1), neoplasia (1) | 2 |
| Chest pain | Pulmonary (2), cardiac (3) pathology | 5 |
| Dyspnea | Pulmonary pathology (6) | 6 |
| Difficulty in swallowing | Head and neck neoplasia (1) | 1 |
| Fever, chills, hyperthermia | Pulmonary (1), cardiac (1) pathology, infectious diseases (3) | 5 |
| Jaundice | Hepatology (2) | 2 |
| Macules, papules, erythema | Skin manifestation of systemic diseases (2), infection (1) | 3 |
| Melena, anal bleeding | Gastroenterology (1) | 1 |
| Memory disturbance, cognitive impairment | Functional impairment related to age (2), confusion (1) | 3 |
| Syncope | Endocrine emergencies (2), neurological emergency (1) | 3 |
| Pain, burning, cramps of the extremities | Rheumatic disease (2), vascular disease (1), infectious diseases (1), injury of extremities (3), burn (1) | 8 |
| Swelling, edema | Urogenital (2), cardiac (1) pathology (2), allergy (2), swollen eye (1) | 8 |
| Swollen or painful joints | Injury (2), infection (1), degenerative joint disease (1), pulmonary (1), and cardiac pathology (1) | 6 |
| Vaginal bleeding | Hormonal dysregulation and gynecological pathology (3) | 3 |
| Weight gain | Cirrhosis (1) | 1 |
| Abnormal findings | Proteinuria (1), abnormal abdominal CT scan findings (1) | 2 |

One hundred eighty-five students (61%) and 36 tutors (82%) responded to the self-assessment questionnaires. Students who responded to the survey reported the following objectives for participating in the OSCE module: to receive feedback on performance (96%), to progress in learning (96%), to become familiar with the OSCE examination (95%), to receive feedback on the created OSCE case (89%), and to know their level of performance before taking a real exam (86%).

OSCE Case Development

Students’ Perceptions

When the answers were categorized as positive or negative, 55% of the students perceived the task of active participation in the co-creation of the OSCE case as appropriate to their level of knowledge (Table 2). The majority of the students reported no difficulty in preparing the OSCE scenario and considered that the elaboration of the assessment checklist was easy. They further reported no difficulties with the communication skills items. Nearly 80% of the respondents considered the supervision by tutors sufficient.

Table 2.

OSCE case development

| Writing OSCE scenario | % positive answers | Mean | SD | Median | Min | Max |
| --- | --- | --- | --- | --- | --- | --- |
| Students’ perspectives (n = 185) | | | | | | |
| The exercise was appropriate to my level of knowledge | 55 | 3.6 | 1.3 | 4 | 1 | 6 |
| I had no difficulty in writing the OSCE case | 62 | 3.7 | 1.1 | 4 | 1 | 6 |
| I had no difficulty elaborating checklist items | 62 | 3.7 | 1.1 | 4 | 1 | 6 |
| I had no difficulty selecting communication skills items | 74 | 4.0 | 1.1 | 4 | 1 | 6 |
| Tutor’s supervision was sufficient | 76 | 4.5 | 1.4 | 5 | 1 | 6 |
| Tutors’ perspectives (n = 36) | | | | | | |
| The OSCE case reproduced an authentic clinical situation | 94 | 4.9 | 0.8 | 5 | 3 | 6 |
| Checklist items were appropriate | 87 | 4.5 | 1.0 | 5 | 2 | 6 |
| Items assessing communication skills were appropriate | 79 | 4.3 | 0.9 | 4 | 2 | 6 |
| Pre-set objectives of the OSCE case were met | 90 | 4.6 | 1.1 | 5 | 2 | 6 |

Median and mean scores on a 6-point scale (1 = Strongly Disagree; 2 = Disagree; 3 = Slightly Disagree; 4 = Slightly Agree; 5 = Agree; 6 = Strongly Agree) and percentage of positive answers (Strongly Agree/Agree/Slightly Agree)

SD standard deviation

Tutors’ Perceptions

Among the tutors who responded to the survey, the majority considered that the checklist items written by the students were appropriate to the expected level of knowledge of 3rd-year medical students and that the items assessing communication skills were in concordance with the objectives of the scenario. According to nearly all respondents, the pre-set objectives established with the students before creating the OSCE case were met (Table 2).

Moving Through the OSCE Stations

Students’ Perceptions

When the answers were categorized as positive or negative, most students (Table 3) found the instructions clear and considered that the pre-set learning objectives were met. Only 40% of the students considered that the OSCE cases were adapted to their level of knowledge. Interestingly, the majority of students reported no difficulty in evaluating the clinical and communication skills of their peers. More than 80% of the students considered that the feedback received from tutors on their performance and on the OSCE case content helped them to progress and to identify gaps in their knowledge. Finally, two-thirds found that the exercise allowed them to set learning goals.

Table 3.

Moving through the OSCE stations

| Item | % positive answers | Mean | SD | Median | Min | Max |
| --- | --- | --- | --- | --- | --- | --- |
| Students’ perspectives (n = 185) | | | | | | |
| The instructions were clear | 91 | 4.9 | 1.1 | 5 | 1 | 6 |
| The announced learning objectives were met | 87 | 4.6 | 1.1 | 5 | 1 | 6 |
| The OSCE cases were adapted to my level of knowledge | 40 | 3.2 | 1.2 | 3 | 1 | 6 |
| I had no difficulty evaluating clinical skills | 71 | 4.0 | 1.2 | 4 | 1 | 6 |
| I had no difficulty evaluating communication skills | 76 | 4.1 | 1.6 | 4 | 1 | 6 |
| Tutor’s feedback on my performance helped me to progress | 90 | 5.0 | 1.2 | 5 | 1 | 6 |
| Tutor’s feedback on the OSCE case helped me to progress | 87 | 4.8 | 1.2 | 5 | 1 | 6 |
| It allowed me to identify the gaps in my knowledge | 82 | 4.5 | 1.3 | 5 | 1 | 6 |
| This exercise allowed me to set my learning goals | 73 | 4.3 | 1.4 | 4 | 1 | 6 |
| Tutors’ perspectives (n = 36) | | | | | | |
| The instructions were clear | 95 | 5.0 | 1.0 | 5 | 2 | 6 |
| The “simulated patient” respected the scenario | 97 | 4.8 | 0.9 | 5 | 3 | 6 |
| I had no difficulty evaluating clinical skills | 82 | 4.6 | 1.1 | 5 | 2 | 6 |
| I had no difficulty evaluating communication skills | 68 | 3.9 | 1.1 | 4 | 2 | 6 |
| I had no difficulty giving feedback | 97 | 5.0 | 0.7 | 5 | 3 | 6 |

Median and mean scores on a 6-point scale (1 = Strongly Disagree; 2 = Disagree; 3 = Slightly Disagree; 4 = Slightly Agree; 5 = Agree; 6 = Strongly Agree) and percentage of positive answers (Strongly Agree/Agree/Slightly Agree)

SD standard deviation

Tutors’ Perceptions

The majority of tutors found the instructions clear and reported no difficulty in evaluating the clinical and communication skills of the candidate physicians. Nearly all respondents perceived no difficulty in providing feedback to the students on their performance (Table 3).

Free Text Comments of Participants

Major themes and representative quotations emerging from the free-comment part of the questionnaires, based on answers from 30% of students and 24% of tutors, are summarized in Table 4. The students found it challenging to create the OSCE case and to evaluate the performance of their peers, mainly because of a lack of appropriate knowledge; at the same time, however, they found the experience rewarding. Self-reflection on their own performance allowed the students to set specific personal goals: “It was very instructive and allowed me to get used to clinical reasoning. Feedback received from the tutor helps me to progress.” (Student 6) … “…it was a great way to test my level of knowledge.” (Student 8).

Table 4.

Major themes and illustrative responses emerging from tutors’ and students’ free-text comments

Areas of concern

Students’ perspectives (representative comments): creation of the OSCE case

“A lack of knowledge (sometimes due to forgetting the lessons learned) and the need to search the Internet for more information.” (S1)

“I had to do some additional research to prepare a relevant clinical case. This exercise helped me to acquire some new knowledge and consolidate several clinical concepts. It is a real “plus” for learning.” (S2)

Students’ perspectives: role of the evaluator

“It was difficult for me to decide whether the task was well conducted and structured.” (S3)

“It was difficult for me to evaluate clinical and communication skills at the same time.” (S4)

Tutors’ perspectives (representative comments): creation of the OSCE case

“It would be useful to know beforehand the level of knowledge of the students.” (T1)

“It was difficult for me to adapt the content of the OSCE case to the actual level of knowledge of the students.” (T2)

“Favorable discussions with these young students… it helps me to create cases in my specialty (neuro-anatomy).” (T3)

Self-reflection on learning

Students’ perspectives: role of the candidate physician

“It was stressful for me and I forgot some essential points.” (S5)

“It was very instructive and allowed me to get used to clinical reasoning. Feedback received from the tutor helps me to progress.” (S6)

Students’ perspectives: role of the patient

“Playing the role of a patient was very instructive. It helped me to realize that the questions I’m usually asking my patient as a “doctor” are different from those that the patient would ask his/her health provider.” (S7)

Tutors’ perspectives: role of the evaluator

“To my question whether the case was difficult, the students answered that it was not difficult, but that the subject had been taught last year and that they had already forgotten the lessons learned.” (T4)

“It was difficult for me to evaluate communication skills.” (T5)

“I did not have enough time to fill out the checklist and the rating scale and to give feedback to the student.” (T6)

“I need more training on the use of scoring instruments.” (T7)

“More information on the expected level of student performance would be helpful to facilitate an equal evaluation.” (T8)

Experience

Students’ perspectives

“A very pleasant exercise, it was a great way to test my level of knowledge.” (S8)

“The supervision by tutors was very good. Most of them were quite young and they still remember concerns of medical students just starting their clinical clerkship. I appreciate this kind of exercise.” (S9)

“I enjoyed the OSCE experience, in particular the exchange with the tutors and the situational exercises.” (S10)

Tutors’ perspectives

“Very interesting and convivial.” (T9)

“It was an enriching experience.” (T10)

“The experience, in my opinion, enriching and to be renewed.” (T11)

“Willingness of tutors and students to become involved.” (T12)

Several tutors emphasized the need for more information on the expected level of student performance: “It was difficult for me to adapt the content of the OSCE case to the actual level of knowledge of the students.” (Tutor 2) … as well as for further training in the use of scoring instruments: “It was difficult for me to evaluate communication skills.” (Tutor 5) … “I did not have enough time to fill out the checklist and the rating scale and to give feedback to the student.” (Tutor 6).

Overall, participants reported perceived benefits of participating in the OSCE module: “A very pleasant exercise, …” (Student 8) … “It was an enriching experience.” (Tutor 10) … “The experience, in my opinion, enriching and to be renewed.” (Tutor 11) … “Favorable discussions with these young students… it helps me to create cases in my specialty (neuro-anatomy).” (Tutor 3) … “Willingness of tutors and students to become involved.” (Tutor 12) … “I enjoyed the OSCE experience, in particular the exchange with tutors and situational exercises.” (Student 10).

Discussion

The intervention was perceived positively by both OSCE tutors and 3rd-year medical students. Tutors and students were engaged as meaningful partners in the pedagogical approach, which made it possible to integrate their insights from the very beginning of the practice OSCE module. The students could experience different facets of the OSCE process while producing work for the benefit of their peers. Students perceived the collaborative multi-level learning environment, the post-encounter feedback, and the final debriefing of the OSCE case with their tutor as valuable.

The free comments of both students and tutors enabled us to explore participants’ attitudes toward the intervention. Tutors were keen to provide fair and consistent evaluations and were interested in receiving further training. Similarly, the students indicated that the exercise allowed them to identify gaps in their knowledge (S1, S2) and to determine their learning goals (S6). Furthermore, it enabled them to experience the role of evaluator in assessing the clinical skills of their peers. Playing the role of the patient helped them to better understand the patient’s expectations (S7) and the importance of the physician–patient relationship as a pivotal link in the delivery of health care. Students highlighted other benefits, such as the feedback received from the tutors helping them to improve their clinical reasoning skills (S6). Finally, the presence of young tutors was perceived by students as facilitating the development of the tutor–student relationship (S9).

The literature shows that formative OSCEs offer unparalleled opportunities for direct observation of, and feedback on, students’ actual performance [18, 19]. Consequently, various formats of formative OSCE have been proposed to guide learning activities [18–21], such as modules where students act as standardized patients or near-peer assessors [22–27], or modules where medical students are involved in the creation of OSCE stations and the design of assessment checklists [28–30]. Several authors have proposed peer-led multi-role practice [23] or near-peer teaching programs [31] to develop formative OSCEs with optimized resources. However, despite self-reported improvements in learning and in gaining self-confidence [22, 32, 33], the available studies did not show a clear impact on students’ performance scores in subsequent summative OSCEs [29, 30, 34]. Nowadays, teaching activities mainly focus on providing feedback as a form of support to enhance learning after an assessment; however, other factors also enhance students’ learning, such as reflection-in-action while actively involved in a task [35] or reflection-on-action after the active engagement [36]. We believe that active participation of students in the whole OSCE process will motivate them and enhance their sense of professional responsibility, and that throughout their work, students will develop professional skills and engage in near-peer mentoring.

In the present study, only 55% of the students who responded to the questionnaire stated that the OSCE cases were appropriate to their level of knowledge. This suggests that students involved in OSCE case development were not aware of the degree of difficulty of their case, nor was this perceived by the tutor. This is partly explained by the lack of experience of tutors and students, owing to a weak OSCE culture in France. Similarly, some tutors reported difficulties with the evaluation of generic criteria (such as communication skills or empathy). This underlines the importance of the partnership between students and tutors for a better understanding of the real level of students’ skills and competencies.

The literature indicates a need for faculty training and development in competency-based medical education [37]. As far as we know, no research has examined the benefits of a practice OSCE for the training of faculty staff with students as partners. The role of students as partners in pedagogic activities and curriculum adaptation has been highlighted as one of the principal elements of the medical school of the future [38], and the involvement of students in the education process is considered in the ASPIRE-to-Excellence assessment of medical schools’ teaching programs [39]. Previously, Black and Wiliam suggested several principal activities that should be continuously integrated into formative evaluation, in which a formative OSCE is considered only a tool helping to evaluate the attainment of pre-defined goals [40]. These activities include defining learning objectives and exchanging on success criteria between tutors and learners, active discussion of learning tasks [41], feedback to students adapted to their performance, and involving students as co-creators and initiators of their learning activities.

The present study was specifically designed to obtain students’ and tutors’ survey responses after they participated in the OSCE module and to evaluate the feasibility of running such a program for large numbers of students and novice OSCE tutors. In the future, it would be interesting to explore the experiences of participants before, during, just after, and at a later point in the intervention. As the answers of participants were not coded, we could not determine whether responses differed for tutors who participated in writing OSCE cases for more than one group of students. It would also be relevant to examine the future educational outcomes of students participating in the program. The calculation of OSCE scores and validity measures was beyond the scope of this study. This work reports a single-center experience, which limits generalizability; however, replication of the experience in subsequent years and at different institutions is possible.

Conclusion

The tutor–student partnership provides a well-perceived pedagogical approach benefiting OSCE tutors, medical students, and the medical faculty. The approach is feasible and can be offered to large numbers of participants. Students considered it beneficial, as it allowed them to reflect on their own performance; we believe it might help them to see goal setting as an ongoing process in learning. It enabled tutors to better understand how students perceive their engagement in learning activities and how to address students’ needs.

Based on the results of the present experience, several activities are being implemented to create a stimulating learning environment respecting the needs of learners, both tutors and students, and to enable students to be actively involved in teaching activities: (i) online training for tutors on how to provide specific and constructive feedback, with self-assessment questions; (ii) training on the use of checklists and rating scales, with practical exercises and comparison with peers; (iii) near-peer teaching activities in which student-tutors are trained using the revised OSCE stations adapted to students’ needs; and (iv) the introduction of peer feedback in formative assessment. Future research might explore the impact of the OSCE module on students’ performance and on examiners’ inter-rater reliability in subsequent summative OSCEs and clinical workplace evaluations.

Supplementary Information

Below is the link to the electronic supplementary material.

Acknowledgements

The authors wish to thank all the novice tutors and students for their active involvement in the project and Laurent Colinet for the technical assistance.

Availability of Data and Material

De-identified data and materials available on request.

Declarations

Ethics Approval

Determined exempt. The study was approved by the Institutional Review Board of the University of Lorraine and registered at French National Commission for Data Protection and Liberties (n° 2020/118).

Consent to Participate

Consent obtained from all participants consistent with IRB review.

Consent for Publication

Consent obtained from all participants consistent with IRB review.

Conflict of Interest

The authors declare no competing interests.

Footnotes

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

1. Khan KZ, Gaunt K, Ramachandran S, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration. Med Teach. 2013;35:e1447–1463. doi: 10.3109/0142159X.2013.818635.
2. Yeates P, Cope N, Hawarden A, Bradshaw H, McCray G, Homer M. Developing a video-based method to compare and adjust examiner effects in fully nested OSCEs. Med Educ. 2019;53:250–263. doi: 10.1111/medu.13783.
3. Gormley GJ, Johnston J, Thomson C, McGlade K. Awarding global grades in OSCEs: evaluation of a novel eLearning resource for OSCE examiners. Med Teach. 2012;34:587–589. doi: 10.3109/0142159X.2012.682745.
4. Wilkinson TJ, Frampton CM, Thompson-Fawcett M, Egan T. Objectivity in objective structured clinical examinations: checklists are no substitute for examiner commitment. Acad Med. 2003;78:219–223. doi: 10.1097/00001888-200302000-00021.
5. Schleicher I, Leitner K, Juenger J, Moeltner A, Ruesseler M, Bender B, et al. Examiner effect on the objective structured clinical exam - a study at five medical schools. BMC Med Educ. 2017;17:71. doi: 10.1186/s12909-017-0908-1.
6. Chong L, Taylor S, Haywood M, Adelstein B-A, Shulruf B. Examiner seniority and experience are associated with bias when scoring communication, but not examination, skills in objective structured clinical examinations in Australia. J Educ Eval Health Prof. 2018;15:17. doi: 10.3352/jeehp.2018.15.17.
7. Schleicher I, Leitner K, Juenger J, Moeltner A, Ruesseler M, Bender B, et al. Does quantity ensure quality? Standardized OSCE-stations for outcome-oriented evaluation of practical skills at different medical faculties. Ann Anat. 2017;212:55–60. doi: 10.1016/j.aanat.2017.03.006.
8. Chong L, Taylor S, Haywood M, Adelstein B-A, Shulruf B. The sights and insights of examiners in objective structured clinical examinations. J Educ Eval Health Prof. 2017;14:34. doi: 10.3352/jeehp.2017.14.34.
9. Gravett K, Kinchin IM, Winstone NE. ‘More than customers’: conceptions of students as partners held by students, staff, and institutional leaders. Stud High Educ. 2020;45:2574–2587. doi: 10.1080/03075079.2019.1623769.
10. Cook-Sather A, Bovill C, Felten P. Engaging students as partners in learning and teaching: a guide for faculty. 1st ed. San Francisco: Jossey-Bass; 2014.
11. Becker LM. Students as partners in academic placements. IJSaP. 2019;3:149–155. doi: 10.15173/ijsap.v3i2.3377.
12. Reynolds AK. Academic coaching for learners in medical education: twelve tips for the learning specialist. Med Teach. 2020;42:616–621. doi: 10.1080/0142159X.2019.1607271.
13. Matthews K, Cook-Sather A, Acai A, Dvorakova S, Felten P, Marquis E, et al. Toward theories of partnership praxis: an analysis of interpretive framing in literature on students as partners in teaching and learning. High Educ Res Dev. 2019;38:280–293. doi: 10.1080/07294360.2018.1530199.
14. Thiel J, Ram P, Dalen J. Maas-Global Manual 2000. 2000.
15. Howard GS. Response-shift bias: a problem in evaluating interventions with pre/post self-reports. Eval Rev. 1980;4:93–106. doi: 10.1177/0193841X8000400105.
16. Harpe SE, Phipps LB, Alowayesh MS. Effects of a learning-centered approach to assessment on students’ attitudes towards and knowledge of statistics. Curr Pharm Teach Learn. 2012;4:247–255. doi: 10.1016/j.cptl.2012.05.002.
17. Fitzpatrick R, Boulton M. Qualitative methods for assessing health care. Qual Saf Health Care. 1994;3:107–113. doi: 10.1136/qshc.3.2.107.
18. Archer JC. State of the science in health professional education: effective feedback. Med Educ. 2010;44:101–108. doi: 10.1111/j.1365-2923.2009.03546.x.
19. Furmedge DS, Smith L-J, Sturrock A. Developing doctors: what are the attitudes and perceptions of year 1 and 2 medical students towards a new integrated formative objective structured clinical examination? BMC Med Educ. 2016;16:32. doi: 10.1186/s12909-016-0542-3.
20. Townsend AH, McIlvenny S, Miller CJ, Dunn EV. The use of an objective structured clinical examination (OSCE) for formative and summative assessment in a general practice clinical attachment and its relationship to final medical school examination performance. Med Educ. 2001;35:841–846. doi: 10.1046/j.1365-2923.2001.00957.x.
21. Van Der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996;1:41–67. doi: 10.1007/BF00596229.
22. Lee CB, Madrazo L, Khan U, Thangarasa T, McConnell M, Khamisa K. A student-initiated objective structured clinical examination as a sustainable cost-effective learning experience. Med Educ Online. 2018;23:1440111. doi: 10.1080/10872981.2018.1440111.
23. Bevan J, Russell B, Marshall B. A new approach to OSCE preparation - PrOSCEs. BMC Med Educ. 2019;19:126. doi: 10.1186/s12909-019-1571-5.
24. Bosse HM, Nickel M, Huwendiek S, Schultz JH, Nikendei C. Cost-effectiveness of peer role play and standardized patients in undergraduate communication training. BMC Med Educ. 2015;15:183. doi: 10.1186/s12909-015-0468-1.
25. Schwill S, Fahrbach-Veeser J, Moeltner A, Eicher C, Kurczyk S, Pfisterer D, et al. Peers as OSCE assessors for junior medical students - a review of routine use: a mixed methods study. BMC Med Educ. 2020;20:17. doi: 10.1186/s12909-019-1898-y.
26. Kim K-J, Kim G. The efficacy of peer assessment in objective structured clinical examinations for formative feedback: a preliminary study. Korean J Med Educ. 2020;32:59–65. doi: 10.3946/kjme.2020.153.
27. Möltner A, Lehmann M, Wachter C, Kurczyk S, Schwill S, Loukanova S. Formative assessment of practical skills with peer-assessors: quality features of an OSCE in general medicine at the Heidelberg Medical Faculty. GMS J Med Educ. 2020;37:Doc42.
28. Heinke W, Rotzoll D, Hempel G, Zupanic M, Stumpp P, Kaisers UX, et al. Students benefit from developing their own emergency medicine OSCE stations: a comparative study using the matched-pair method. BMC Med Educ. 2013;13:138. doi: 10.1186/1472-6920-13-138.
29. Taylor D, Quick S. Students’ perceptions of a near-peer Objective Structured Clinical Examination (OSCE) in medical imaging. Radiography (Lond). 2020;26:42–48. doi: 10.1016/j.radi.2019.06.009.
30. Madrazo L, Lee CB, McConnell M, Khamisa K, Pugh D. No observed effect of a student-led mock objective structured clinical examination on subsequent performance scores in medical students in Canada. J Educ Eval Health Prof. 2019;16. Available from: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6609294/.
31. Rashid MS, Sobowale O, Gore D. A near-peer teaching program designed, developed and delivered exclusively by recent medical graduates for final year medical students sitting the final objective structured clinical examination (OSCE). BMC Med Educ. 2011;11:11. doi: 10.1186/1472-6920-11-11.
32. Pugh D, Desjardins I, Eva K. How do formative objective structured clinical examinations drive learning? Analysis of residents’ perceptions. Med Teach. 2018;40:45–52. doi: 10.1080/0142159X.2017.1388502.
33. Khan R, Payne MWC, Chahine S. Peer assessment in the objective structured clinical examination: a scoping review. Med Teach. 2017;39:745–756. doi: 10.1080/0142159X.2017.1309375.
34. Burgess A, Clark T, Chapman R, Mellis C. Senior medical students as peer examiners in an OSCE. Med Teach. 2013;35:58–62. doi: 10.3109/0142159X.2012.731101.
35. Sargeant J, Armson H, Chesluk B, Dornan T, Eva K, Holmboe E, et al. The processes and dimensions of informed self-assessment: a conceptual model. Acad Med. 2010;85:1212–1220. doi: 10.1097/ACM.0b013e3181d85a4e.
36. Schön DA. The reflective practitioner: how professionals think in action. New York: Basic Books; 1983.
37. Holmboe ES, Ward DS, Reznick RK, Katsufrakis PJ, Leslie KM, Patel VL, et al. Faculty development in assessment: the missing link in competency-based medical education. Acad Med. 2011;86:460–467. doi: 10.1097/ACM.0b013e31820cb2a7.
38. Harden RM. Ten key features of the future medical school-not an impossible dream. Med Teach. 2018;40:1010–1015. doi: 10.1080/0142159X.2018.1498613.
39. Hunt D, Klamen D, Harden RM, Ali F. The ASPIRE-to-Excellence Program: a global effort to improve the quality of medical education. Acad Med. 2018;93:1117–1119. doi: 10.1097/ACM.0000000000002099.
40. Black P, Wiliam D. Developing the theory of formative assessment. Educ Asse Eval Acc. 2009;21:5–31. doi: 10.1007/s11092-008-9068-5.
41. Bernard AW, Ceccolini G, Feinn R, Rockfeld J, Rosenberg I, Thomas L, et al. Medical students review of formative OSCE scores, checklists, and videos improves with student-faculty debriefing meetings. Med Educ Online. 2017;22:1324718. doi: 10.1080/10872981.2017.1324718.
