Abstract
Introduction
Teaching is an important competency in graduate medical education (GME). Many residency programs have implemented curricula to develop residents’ teaching skills, and observed structured teaching experiences (OSTEs) have been used to assess these skills. There is an increasing focus on building teaching skills earlier in the medical education continuum; however, there is limited literature on assessing medical students’ teaching skills. The authors developed an OSTE for medical students enrolled in a students-as-teachers course to address this gap and provide formative feedback on teaching skills.
Materials and Methods
OSTEs were conducted for fourth-year medical students (M4s) enrolled in a Students as Teachers Advanced Elective at a US medical school. An M4 observed a first-year medical student (M1) during a simulated encounter with a standardized patient. The M4 then gave feedback and a chalk talk. A physician observer assessed the M4’s teaching using the modified Stanford Faculty Development Program (SFDP) questionnaire. The M1s and M4s also completed the SFDP questionnaire. The M4 completed pre- and post-OSTE self-efficacy surveys (score range 6–30) and a post-OSTE acceptability survey.
Results
All (30/30) M4s completed the OSTE. The SFDP identified common teaching strengths and areas for growth. ANOVA tests demonstrated significant differences among the mean (SD) scores from physician assessors, M1s, and M4s [4.56 (0.63) vs. 4.87 (0.35) vs. 4.08 (0.74), p<0.001]. There was a statistically significant difference in mean (SD) self-efficacy scores pre- and post-OSTE [18.72 (3.39) vs. 23.83 (3.26), p<0.001]. All M4s (30/30) somewhat or strongly agreed with all three OSTE acceptability questions.
Lessons Learned
The authors successfully conducted an OSTE in an M4 advanced elective. The OSTE was highly acceptable to participants, and M4s demonstrated improved teaching self-efficacy. Further research should explore the validity of the OSTE to measure medical students’ teaching skills and the long-term impact of developing teaching skills in medical school.
Supplementary Information
The online version contains supplementary material available at 10.1007/s40670-023-01952-3.
Keywords: Undergraduate medical education, Teaching skills, Simulation, Observed structured teaching experiences (OSTEs)
Introduction
Within graduate medical education (GME), teaching is an important competency expected of residents [1, 2]. Many institutions have implemented curricula for residents to develop teaching skills, and there have been increased efforts to measure competencies – including teaching – using objective data [1, 3]. Observed structured teaching experiences (OSTEs) have been one way to measure this competency [1].
OSTEs were first described in the early 1990s to assess teaching performance, similar to how objective structured clinical examinations (OSCEs) assess clinical performance [4]. Generally, OSTEs include a teacher giving feedback to a standardized learner on their clinical performance, while a third party evaluates the teaching skills. OSTEs have been used successfully in GME across many specialties to assess skills ranging from procedural teaching to interpersonal conflict resolution [5, 6]. Additionally, there is some evidence that OSTEs are a valid measurement of teaching skills in residents-as-teachers curricula [4].
In many ways, the goal of undergraduate medical education (UME) is to prepare medical students for the demands of residency. In line with the expectation of teaching in residency, there is an increasing emphasis on developing teaching skills during UME; almost half of U.S. allopathic medical schools offer training in teaching [7]. However, there is limited literature about assessing the teaching skills of medical students, and no published study to our knowledge has described the use of an OSTE to assess medical students’ teaching skills. Given this gap and the previously demonstrated success of OSTEs in GME, we implemented an OSTE in a students-as-teachers advanced elective for post-clerkship medical students.
Materials and Methods
Students as Teachers Advanced Elective
Students as Teachers (SaT) is a one-year advanced elective (AE) at Vanderbilt University School of Medicine (VUSM) for post-clerkship students (years 3 and 4). The SaT AE runs from March of students’ third year until February of students’ fourth year. There are both student and faculty course directors, and enrollment is limited to 30 students. The goals of the course are to prepare students to be effective teachers in residency, to provide a foundation in educational theory, and to educate pre-clinical and clerkship students (years 1 and 2). Throughout the year, SaT students engage in discourse about adult learning theory, create learning materials, and offer teaching sessions to other medical students. The course requirements include completing an OSTE, which is designed as a formative teaching assessment at a single time point during the AE.
OSTE
For the 2022–2023 SaT AE, OSTEs were conducted in May 2022 at Vanderbilt’s Center for Experiential Learning and Assessment (CELA). A single case was developed specifically for the OSTE (Supplemental Digital Appendix 1). The OSTE was intentionally timed to M1 curricular needs: it was scheduled just prior to the M1s’ Physical Diagnosis final exam so that it could be a mutually beneficial learning experience for the SaT and M1 students. OSTE participants included first-year medical students (M1s) as learners, standardized patients (SPs), and physician assessors. M1s were recruited by SaT AE student leadership to participate as the learner in the OSTE. M1 participation was voluntary, and no specific preparation for the OSTE was required. The session was designed to complement the M1s’ learning in their Physical Diagnosis course. SPs were recruited and trained by CELA staff in the usual manner for CELA simulation events. Physicians were nominated by SaT students for having excellent teaching skills and recruited by the SaT AE student leadership to assess the SaT students.
The OSTE was 50 min in duration and consisted of the SaT student observing the M1 perform a history and physical examination on a standardized patient with abdominal pain (Fig. 1). The SaT then provided feedback to the M1 and conducted a chalk talk, a 15-minute dynamic teaching session using a whiteboard. Prior to the OSTE, SaT students and physician assessors were provided with the case, and SaT students received the rubric used to grade the M1s during their Physical Diagnosis final exam (Supplemental Digital Appendix 2). Before the OSTE, SaT students had completed three course sessions covering fundamentals of teaching and feedback skills to help them prepare. They also had variable prior teaching experience in the clinical learning environment as senior students on their clinical rotations.
Fig. 1.
Observed Structured Teaching Experience (OSTE) design and timing. “SaT” refers to post-clerkship students enrolled in a Students as Teachers Advanced Elective. “M1” refers to a first-year medical student participating as a learner in the OSTE.
OSTE Assessment and Evaluation
The SaTs’ teaching skills were assessed after the OSTE by physician assessors and M1s using a modified version of the Stanford Faculty Development Program (SFDP) questionnaire (Supplemental Digital Appendix 3) [8]. To support the development of self-regulated learning skills, particularly informed self-assessment regarding teaching skills, the SaT student also completed this survey as a self-assessment and subsequently used their results as stimulus for a teaching reflection. The survey included seven items on a 5-point Likert scale as well as narrative comments about strengths and areas of growth.
Given that the OSTE occurred early in the AE due to M1 curricular needs, the SaT students completed a pre- and post-OSTE self-efficacy survey (Supplemental Digital Appendix 4) to ensure that the OSTE was supportive of their learning and encouraged future teaching. The authors developed a six-item survey measuring confidence in three teaching domains (giving feedback, giving a chalk talk, and near-peer teaching) using a 5-point Likert scale where higher scores indicate higher self-efficacy (total score range 6–30). These domains were identified by SaT students as key areas where they had prior informal teaching experiences and desired more formal feedback. Finally, the SaT students completed a three-item acceptability survey that used a 5-point Likert scale (1 = strongly disagree, 2 = somewhat disagree, 3 = neutral, 4 = somewhat agree, 5 = strongly agree) (Supplemental Digital Appendix 5).
All surveys were completed online using REDCap, a secure online database [9]. Data were analyzed using Microsoft Excel. Qualitative comments were coded for themes by two authors (KPP, MRM), using an inductive-deductive method.
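For readers who wish to reproduce the group comparisons reported in the Results, the following is a minimal sketch of the same analyses in Python; the authors performed their analyses in Microsoft Excel, and the file names and column names here are hypothetical placeholders rather than the study’s actual data files.

```python
# Minimal illustrative sketch (not the authors' workflow, which used Microsoft Excel)
# of the comparisons reported in the Results: a one-way ANOVA of SFDP teaching scores
# across the three assessor groups and a paired t-test of pre- vs. post-OSTE
# self-efficacy totals. File and column names are hypothetical.
import pandas as pd
from scipy import stats

# Hypothetical long-format SFDP data: one row per completed assessment, with
# 'assessor' in {"physician", "M1", "SaT"} and 'mean_score' as the mean of the
# seven 5-point SFDP items for that assessment.
sfdp = pd.read_csv("sfdp_scores.csv")
groups = [g["mean_score"].dropna() for _, g in sfdp.groupby("assessor")]
f_stat, p_anova = stats.f_oneway(*groups)
print(f"ANOVA across assessor groups: F = {f_stat:.2f}, p = {p_anova:.3g}")

# Hypothetical wide-format self-efficacy data: one row per SaT student with
# 'pre_total' and 'post_total' (each the sum of six 5-point items, range 6-30).
eff = pd.read_csv("self_efficacy.csv").dropna(subset=["pre_total", "post_total"])
t_stat, p_paired = stats.ttest_rel(eff["pre_total"], eff["post_total"])
print(f"Paired t-test, pre vs. post self-efficacy: t = {t_stat:.2f}, p = {p_paired:.3g}")
```

A per-domain or per-item comparison, as reported in Fig. 2 and Table 1, would repeat the same tests within each SFDP domain or self-efficacy item.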
IRB Approval
This study was reviewed by the Vanderbilt IRB (#210874) and deemed to be exempt.
Results
OSTE Participants
All SaT students (30/30, 100%) participated in the OSTE and were assessed by a physician. Of the 30 physicians evaluating the SaT students, 3 (10%) were faculty, 2 (7%) were fellows, and 25 (83%) were residents. Of the learners, 27 (90%) were M1 students. Due to COVID-19 and unexpected absences, 3 (10%) learners were CELA staff or course leaders portraying M1 students.
SaT Assessment
All physician assessors (30/30, 100%) and M1s (30/30, 100%) completed assessments of the SaT student’s teaching performance, and all SaT students (30/30, 100%) completed the self-assessment.
The learning environment was ranked within the top three teaching domains by all assessor groups. Both the physician assessors and SaT students had professionalism and chalk talk in the top three domains, while the M1s rated feedback and responsiveness most highly. Across all domains on the SaT assessment, ANOVA tests demonstrated statistically significant differences among the mean scores from physician assessors, M1s, and SaT students (Fig. 2). In each domain, the M1 mean score was the highest, and the SaT mean score was the lowest.
Fig. 2.
Mean scores on the SaT assessment from each group of assessors. Asterisks indicate statistically significant differences among groups by ANOVA (* p = 0.001, ** p < 0.001).
All physician assessors (30/30, 100%) and most M1 learners (27/30, 90%) provided optional narrative feedback about the SaT’s teaching skills. Half of SaT students (15/30, 50%) provided optional narrative self-assessment comments. Main themes concerning areas of strength and growth were consistent across the physician assessors, M1s, and SaT students. Strengths included creating a good learning environment, using visual aids, and providing specific feedback. Areas for growth included encouraging learner self-reflection and improving time management. Select quotations can be found in Supplemental Digital Appendix 6.
SaT Self-Efficacy
Twenty-nine SaT students (96.7%) completed pre- and post-OSTE self-efficacy surveys. A two-tailed, paired t-test demonstrated a statistically significant difference between total pre- and post-OSTE mean (SD) self-efficacy scores [pre-OSTE: 18.72 (3.39); post-OSTE: 23.83 (3.26); p < 0.001]. There was also increased self-efficacy across each individual item on the survey (Table 1). The lowest domain of self-efficacy both pre- and post-OSTE was giving a chalk talk to a medical team. The highest domain of self-efficacy at both timepoints was overall confidence in the near-peer teaching role.
Table 1.
Comparison of pre- and post-Observed Structured Teaching Experience (OSTE) self-efficacy scores (1 = “not at all confident” to 5 = “extremely confident”)
| Item | Mean (SD) pre-OSTE | Mean (SD) post-OSTE | p-value |
|---|---|---|---|
| Confidence giving a medical student feedback on: | | | |
| 1. History of Present Illness | 3.27 (0.49) | 4.13 (0.34) | < 0.001 |
| 2. Focused Physical Exam | 3.24 (0.55) | 4.03 (0.39) | < 0.001 |
| 3. Assessment and Plan | 2.97 (0.46) | 3.97 (0.39) | < 0.001 |
| Confidence giving a chalk talk to: | | | |
| 4. Medical students | 3.21 (0.60) | 4.03 (0.32) | < 0.001 |
| 5. A medical team | 2.66 (0.66) | 3.48 (0.76) | < 0.001 |
| Overall confidence in near-peer teaching role: | | | |
| 6. Overall confidence | 3.38 (0.46) | 4.14 (0.34) | < 0.001 |
OSTE Acceptability
All SaT students (30/30, 100%) completed the OSTE acceptability survey, and all (30/30, 100%) agreed with each statement. When asked whether the OSTE prepared them for clinical teaching, 21 students (70%) strongly agreed and 9 (30%) somewhat agreed. When asked whether they received helpful feedback about their teaching, 27 students (90%) strongly agreed and 3 (10%) somewhat agreed. Finally, when asked whether they would recommend an OSTE to medical students interested in improving their teaching skills, 28 students (93%) strongly agreed and 2 (7%) somewhat agreed.
Discussion
We successfully conducted an OSTE in a students-as-teachers advanced elective for post-clerkship medical students. Our findings demonstrated that the OSTE was feasible and acceptable among participating medical students. Overall, the SaT students were rated highly by both the physician assessors and the M1 learners. We identified the learning environment as a particular area of strength across all assessor groups, reflected in both the survey data and the narrative comments. We found differences in mean score on each teaching skill (e.g., communication of goals) across physician assessors, M1 learners, and SaT students. The lower scores on the SaT students’ self-assessments compared with those from the physician assessors and M1 learners are consistent with literature demonstrating that high performers tend to underestimate their own skills [10]. Themes that emerged from the narrative feedback identified common gaps in teaching skills and will be used to guide future SaT AE course sessions and as a nidus for discussion.
Additionally, participating medical students demonstrated improved self-efficacy after the OSTE for each teaching skill assessed. Students’ lowest self-efficacy score was for giving a chalk talk to a medical team. This is not surprising, as the OSTE, timed to align with M1 curricular needs, was conducted before the SaT session on preparing and giving a chalk talk. Finally, SaT students unanimously agreed that the OSTE prepared them for clinical teaching and provided valuable feedback.
We believe this model is generalizable to other institutions, as many have programs or tracks in place to expose medical students to teaching opportunities [7]. Challenges to dissemination of this model may include limited access to simulation space and difficulty recruiting physician evaluators, although the OSTE can be modified to meet resource limitations at various institutions. Our OSTE was unique in using M1 students as learners. While it is more typical to use a standardized learner, using other medical students as the learners in an OSTE may both reduce resource burdens and provide mutually beneficial learning opportunities for all participating students. It also provides a more naturalistic environment for formative, low-stakes feedback: learners’ authentic responses to teaching behaviors can prompt more meaningful feedback and discussion.
There are several limitations to this innovation project. Because enrollment is limited to 30 SaT students per year, the sample size is small. Additionally, there is selection bias toward post-clerkship medical students who are highly motivated to teach. Most of our physician assessors were residents; although the course leadership recruited residents and fellows regarded as excellent teachers, these assessors likely had variable levels of teaching expertise. Furthermore, there may be bias inherent in the SaT student-physician assessor pairs, particularly if the assessor was from the student’s intended specialty.
Future research should explore the validity of the OSTE as a formative teaching assessment and the durability of its effects. For example, is improved teaching self-efficacy as a medical student associated with increased comfort with teaching as a resident? Further, do students who begin developing teaching skills in medical school maintain these skills into residency? To our knowledge, this is the first publication describing the use of an OSTE to assess medical students’ teaching skills, and it appears to be an acceptable strategy for formative assessment of those skills.
Supplementary Information
Below is the link to the electronic supplementary material.
Acknowledgements
The case used in the OSTE was originally written and developed by Hannah G. Kay, MD and Jill Braddock-Watson, MA-CCC-SLP. The authors wish to thank Jill Braddock-Watson and the Center for Experiential Learning and Assessment (CELA) at Vanderbilt University Medical Center for assistance with case simulation design and implementation.
Funding
No funds, grants, or other support was received.
Data Availability
Deidentified data is available from the authors on request.
Declarations
Competing Interests
The authors have no relevant financial or non-financial interests to disclose.
Ethical Approval
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. This study was deemed to be exempt by the Vanderbilt IRB (#210874).
Consent
Informed consent was obtained from all individual participants included in the study.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. McKeon BA, Ricciotti HA, Sandora TJ, et al. A consensus guideline to support resident-as-teacher programs and enhance the culture of teaching and learning. J Grad Med Educ. 2019;11(3):313–8. doi: 10.4300/JGME-D-18-00612.1.
2. Smith CC, Newman LR, Huang GC. Those who teach, can do: characterizing the relationship between teaching and clinical skills in a residency program. J Grad Med Educ. 2018;10(4):459–63. doi: 10.4300/JGME-D-18-00039.1.
3. Zackoff MW, Real FJ, DeBlasio D, et al. Objective assessment of resident teaching competency through a longitudinal, clinically integrated, resident-as-teacher curriculum. Acad Pediatr. 2019;19(6):698–702. doi: 10.1016/j.acap.2019.01.011.
4. Trowbridge RL, Snydman LK, Skolfield J, Hafler J, Bing-You RG. A systematic review of the use and effectiveness of the Objective Structured Teaching Encounter. Med Teach. 2011;33(11):893–903. doi: 10.3109/0142159X.2011.577463.
5. Jhaveri VV, Currier PF, Johnson JH. Bridging the gap between do one and teach one: impact of a procedural objective structured teaching encounter on resident procedural teaching proficiency. Med Sci Educ. 2020;30(2):905–10. doi: 10.1007/s40670-020-00972-7.
6. Jones N, Milanes L, Banales V, Price I, Gomez I, Hughes S. Difficult interpersonal encounters with medical students and residents: two Objective Standardized Teaching Encounters. MedEdPORTAL. 2017;13:10640. doi: 10.15766/mep_2374-8265.10640.
7. Bahar RC, O’Shea AW, Li ES, et al. The pipeline starts in medical school: characterizing clinician-educator training programs for U.S. medical students. Med Educ Online. 2022;27(1):2096841. doi: 10.1080/10872981.2022.2096841.
8. Mintz M, Southern DA, Ghali WA, Ma IW. Validation of the 25-item Stanford Faculty Development Program Tool on clinical teaching effectiveness. Teach Learn Med. 2015;27(2):174–81. doi: 10.1080/10401334.2015.1011645.
9. Harris PA, Taylor R, Minor BL, Elliott V, Fernandez M, O’Neal L, McLeod L, Delacqua G, Delacqua F, Kirby J, Duda SN; REDCap Consortium. The REDCap consortium: building an international community of software partners. J Biomed Inform. 2019;95:103208. doi: 10.1016/j.jbi.2019.103208.
10. Rosenzveig A, Raiche I, Fung BSC, Gawad N. Self-assessment in general surgery applicants: an insight into interview performance. J Surg Res. 2022;273:155–60. doi: 10.1016/j.jss.2021.12.031.