Author manuscript; available in PMC: 2022 Mar 17.
Published in final edited form as: Med Teach. 2020 Nov 26;43(3):314–319. doi: 10.1080/0142159X.2020.1841891

Student Curriculum Review Team, 8 Years Later: Where We Stand and Opportunities for Growth

Priyanka Kumar 1,*, Christina M Pickering 1,*, Lyla Atta 1, Austin G Burns 1, Robert F Chu 1, Thomas Gracie 1, Caroline X Qin 1, Katherine A Whang 1, Harry R Goldberg 1
PMCID: PMC8929683  NIHMSID: NIHMS1779022  PMID: 33242263

Abstract

Background:

The Student Curriculum Review Team (SCRT) was founded at the Johns Hopkins University School of Medicine (JHUSOM) in 2012 to refine pre-clinical courses. Since then, SCRT has provided a voice for student feedback—offering forums for discussion through ‘Town Hall meetings’ and confidential avenues for peer-to-peer comments. Here, we assess the perceived efficacy and utility of SCRT among the student body and faculty course directors.

Methods:

A cross-sectional analysis was conducted in 2019 using an anonymous survey distributed to second- (MS2) and third-year (MS3) medical students as well as faculty course directors at JHUSOM.

Results:

A total of 113 student surveys and 13 faculty surveys were returned. The majority of students (97%) endorsed SCRT as effective in enabling them to express their concerns. Most faculty (69%) reported SCRT’s impact on their respective course as positive and found SCRT suggestions to be “realistic and actionable”. Students (84%) and faculty (62%) alike considered SCRT to meet needs not met by other curricular organizations at JHUSOM.

Conclusion:

Students and faculty find that SCRT fills an otherwise unmet role in the landscape of curricular feedback at JHUSOM. This study may be beneficial for other academic institutions considering ways to better engage students in curricular reform.

Keywords: medical education, curriculum, basic science years

INTRODUCTION

Over the last decade, medical education has witnessed significant changes, evolving from a primarily lecture-based orientation to a more active learning environment (McCoy et al. 2018). Currently, team-based learning, case-based learning, peer instruction, e-lectures and other online resources have changed the way students and faculty interact. Student learning has been further influenced by shorter preclinical experiences, more academically diverse medical school classes, and an increasingly competitive residency match process. With these changes, curricular assessment has become correspondingly more essential. Traditionally, course evaluations were the primary mechanism by which student opinions were solicited; however, the development of supplemental approaches, including focus groups and student representatives, has been shown to increase student participation in curricular development (Fetterman et al. 2010; Hendry et al. 2001; Wilson et al. 2013).

The incorporation of direct student feedback has become increasingly recognized as vital to improving preclinical curricula (Kogan & Shea 2007). At McGill University, a student feedback committee for medical education identified the transition from a “student as consumer” to a “student as partner” model as critical for involving medical students in successfully contributing to their own curriculum (Bilodeau et al. 2019). The implementation of student-requested changes to the academic environment has been found to improve student satisfaction and mental health (Fleming et al. 2015; Slavin et al. 2014). Furthermore, efforts to solicit student feedback for course quality improvement have also been accompanied by higher achievement on standardized metrics such as National Board of Medical Examiners (NBME) assessments (Richman et al. 2019).

In 2012, members of the Johns Hopkins University School of Medicine (JHUSOM) founded the Student Curriculum Review Team (SCRT) to establish a student-led curricular improvement process with direct student-to-faculty collaboration (Hsih et al. 2015). Over the past eight years, SCRT has provided an alternative route for student involvement in the course feedback process beyond course evaluations. These avenues include class-wide Town Hall meetings that focus on course strengths and areas for future improvement, formal and informal student-to-SCRT correspondence, anonymous course surveys, and a confidential channel for comments. For each course, SCRT collects, reviews, and synthesizes the feedback from these various sources. Using this information, recommendations are generated and then discussed among members of the SCRT team, faculty, and academic deans (Figure 1).

Figure 1. SCRT 2019-2020 Workflow.

*SCRT only reviews courses that generate an overall rating ≤4.0/5.0.

SCRT has served a unique role within the JHUSOM curricular review process for the past eight years and uses novel strategies to engage both students and faculty. The purpose of this study was to assess the perceived effectiveness and utility of the SCRT program by both students and faculty. Furthermore, we evaluated how SCRT recommendations aid the continued development and improvement of preclinical teaching at JHUSOM. This work may provide a template that other academic medical institutions could adopt to integrate student feedback in their curricular review process.

METHODS

The study was approved by the Johns Hopkins University Institutional Review Board (IRB00221102). Informed consent was obtained from all study participants.

Survey Instruments

Second- and third-year medical students (MS-2, MS-3) were surveyed anonymously on their perception of SCRT between October 21 and November 11, 2019, via email listservs maintained by the Johns Hopkins University School of Medicine Office of the Registrar. Surveys were administered using Qualtrics™ and included a series of statements rated on a five-point Likert scale ranging from strongly disagree to strongly agree. A five-dollar incentive in the form of an Amazon™ gift card was provided to all student respondents to encourage participation.

Faculty who had served as course directors for a pre-clinical course in the past three years were also anonymously surveyed; this cohort was identified using course director lists maintained by the Johns Hopkins University School of Medicine Office of Curriculum. Faculty were surveyed on the impact of SCRT using statements scored with a Likert rating scale. All questions were both generated and pilot-tested by study team members.

Qualitative Coding

Two study team members (PK, CP) conducted conventional content analysis on all free-text responses (Hsieh & Shannon 2005). Themes were defined and iteratively refined throughout the data analysis by both team members.

Statistical Analysis

Descriptive statistical analyses were performed with R statistical software (R Core Team, 2018). Wilcoxon-Mann-Whitney testing was used to compare the likelihood of students using particular modalities at Hopkins to provide course feedback. The threshold of significance was set at p < 0.05.
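The comparisons above were run in R; as a purely illustrative sketch (not the authors' analysis code, and using made-up Likert ratings rather than study data), the Wilcoxon-Mann-Whitney test can be computed from first principles with a tie-corrected normal approximation:

```python
import math
from collections import Counter

def mann_whitney_u(x, y):
    """Two-sided Wilcoxon-Mann-Whitney test using a tie-corrected
    normal approximation. Returns (U statistic for x, p-value)."""
    n1, n2 = len(x), len(y)
    pooled = list(x) + list(y)
    n = n1 + n2
    order = sorted(range(n), key=lambda i: pooled[i])
    ranks = [0.0] * n
    i = 0
    while i < n:                        # assign average ranks to tied values
        j = i
        while j + 1 < n and pooled[order[j + 1]] == pooled[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1      # ranks are 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    r1 = sum(ranks[:n1])                # rank sum of the first sample
    u1 = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    tie_term = sum(t**3 - t for t in Counter(pooled).values())
    sigma = math.sqrt(n1 * n2 / 12 * ((n + 1) - tie_term / (n * (n - 1))))
    z = (u1 - mu) / sigma               # assumes sigma > 0 (not all values tied)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u1, p

# Hypothetical 5-point Likert ratings from two class years (invented data)
ms2 = [5, 4, 4, 5, 3, 4, 5, 4]
ms3 = [3, 4, 2, 3, 4, 3, 2, 3]
u, p = mann_whitney_u(ms2, ms3)
print(f"U = {u}, p = {p:.4f}")
```

A p-value below the 0.05 threshold would mirror the paper's significance criterion; in practice R's `wilcox.test` (or `scipy.stats.mannwhitneyu`) performs this computation, including exact p-values for small samples.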

RESULTS

Survey Demographics

A total of 240 second- and third-year medical students and 31 course directors were invited to participate in the study; 113 students (47% response rate) and 13 faculty (42% response rate) completed surveys (Table 1). Of the student survey respondents, 64 (57%) were MS-2s and 49 (43%) were MS-3s, reflecting response rates of 53% and 41%, respectively. Faculty respondents included course directors from the curricular blocks “Scientific Foundations of Medicine” (4), “Genes to Society” (5), “Foundations of Public Health and Ethics” (2), “Topics in Interdisciplinary Medicine” (1), and “Integrative Medicine” (1). On average, course directors met with SCRT 3.2 +/− 1.8 times during the academic years 2015-2019. More than half of faculty respondents met with SCRT in the 2018-2019 academic year.

TABLE 1.

Survey Responses

                                             Counts, n   Response rate
Completed Surveys
  Medical Students                           113         47%
    MS-2                                     64          53%
    MS-3                                     49          41%
  Faculty                                    13          42%
    Scientific Foundations of Medicine       4           ?
    Genes to Society                         5           ?
    Foundations of Public Health and Ethics  2           ?
    Topics in Interdisciplinary Medicine     1           ?
    Integrative Medicine                     1           ?
Open Response
  Medical Students                           48, 53      20%, 22%
  Faculty                                    12, 11      ?

Abbreviations: MS – medical student

MS-2s were defined as students with an entering class year of 2018 and MS-3s were defined as students with an entering class year of 2017.

Perceived SCRT Impact and Effectiveness by Student Respondents

Most students (97%) endorsed SCRT as an effective forum that enables them to express their concerns. Additionally, a majority of students (60%) reported SCRT’s impact on their medical education as “positive,” with all others selecting “neutral” (38%) or “negative” (2%) (Figure 2). Notably, MS-2s and MS-3s diverged in their opinions on the impact of the organization; MS-2s were more likely than MS-3s to report the impact of SCRT as “positive” (p=0.006).

Figure 2. Overall impact of SCRT as rated by student (A) and faculty (B) respondents.

There are three primary modalities by which students may interact with SCRT: (1) course evaluations, (2) Town Hall meetings, and (3) direct, in-person communication. Students most frequently interacted with SCRT through online course evaluations (80%), followed by Town Halls (68%) and, lastly, informal in-person interactions with an SCRT team member or director (42%) (Table 2). The majority of students surveyed (63%) attended at least one SCRT Town Hall within the last academic year; between 2019 and 2020, eight total Town Halls were held for MS-2s and MS-3s.

TABLE 2.

Student Interaction and Opinion of SCRT

                           Student          Highly        Moderately    Not
                           Interaction,     Effective,    Effective,    Effective,
                           n (%)            n (%)         n (%)         n (%)
SCRT Overall               --               37 (43.0)     46 (53.5)     3 (3.5)
Online Course Evaluations  90 (79.6)        26 (29.2)     54 (60.7)     9 (10.1)
Town Hall                  77 (68.1)        23 (30.3)     49 (64.5)     4 (5.3)
Informal Interaction       47 (41.6)        20 (42.6)     22 (46.8)     5 (10.6)
Other                      12 (10.6)        2 (40.0)      2 (40.0)      1 (0.0)

Town Halls are a distinguishing feature of SCRT. A majority of students agreed (43% strongly, 46% somewhat) with the statement, “Town Halls allow students to provide additional feedback that cannot be obtained from online course evaluations” (Figure 3). The perception of SCRT’s impact on a student’s medical education did not significantly differ between students who attended at least one Town Hall in the past year versus those who did not (p=0.109). When asked to rank the features they valued about SCRT Town Halls, students ranked the student-led aspect of SCRT as its most valuable feature (64% “highly valuable”), directed peer-to-peer communication second (55% “highly valuable”), and collaborative feedback third (5% “highly valuable”).

Figure 3. Role of SCRT as compared to other feedback modalities as rated by student (A-B) and faculty (C-D) respondents.

Perceived SCRT Impact and Value by Faculty Respondents

Most faculty (69%) reported SCRT’s impact on their respective course as positive, with the rest of faculty selecting “neutral” (31%) (Figure 2). The three aspects of SCRT that faculty indicated they valued most were the summarized suggestions provided in the reports (69% “highly valuable”), that SCRT is a student-led avenue of feedback (54% “highly valuable”), and the collaborative nature of the process (39% “highly valuable”).

SCRT suggestions were largely found to be reasonable. A majority of faculty (69%) agreed with the statement “SCRT suggestions are realistic and actionable” (Figure 3). The types of changes faculty reported making in response to SCRT feedback included exam/quiz content (46%), curricular content (54%), lecture format (46%), small group content and structure (54%), and scheduling (31%). Fifteen percent of faculty reported that they have not made any changes influenced by SCRT feedback. The two most common challenges to implementing SCRT-influenced changes cited by faculty were logistics/time constraints (54%) and differences in pedagogical philosophy (38%).

Themes Used to Describe SCRT

A total of 48 (43%) students and 11 (85%) faculty responded to the open-ended question at the end of the survey asking respondents to describe an experience with SCRT (Table 1). Content analysis of student replies revealed three emergent themes: feeling heard and validated when communicating with peers, collaborating and drawing on other student experiences to identify areas for course refinement, and engaging in self-reflection during these conversations (Table 3). Feedback from course directors highlighted the role of direct student-faculty engagement as a critical part of the SCRT process. Two faculty-identified strengths of SCRT included the opportunity to problem-solve with stakeholders (42%) and the provision of a neutral forum to discuss a balanced report on the course (50%) (Table 3).

TABLE 3.

Predominant themes mentioned in student- and faculty-identified experiences (n=56) with SCRT including selected quotations

Student (n=56)

Feeling heard and validated when communicating with peers, 14 (25%) of responses
  • “Attended a session, felt like my opinion was heard. It’s reassuring when multiple students feel the same way, something you don’t get out of the online surveys.”
  • “I attended a town hall where everyone echoed each other's thoughts about a course - makes you realize a lot of people share similar concerns.”

Collaborating and drawing on other student experiences to identify areas for course refinement, 36 (64%) of responses
  • “I enjoy the townhalls because they give the ability to talk in an open discussion about things that worked well that should continue to be implemented and things that could be changed.”
  • “Town hall-- being able to jump off of other student's ideas/comments to think about issues I didn't know about/couldn't verbalize”

Engaging in self-reflection during these conversations, 3 (5%) of responses
  • “In my SCRT discussion it was good to hear other student's feedback as well, so that I could critically evaluate my thoughts and see whether they were justified, or not. Getting that other perspective was helpful.”

Faculty (n=12)

The opportunity to problem-solve with stakeholders, 5 (42%) of responses
  • “In general, when there are consistent concerns students have with the course it is our only opportunity to discuss their significance and to problem-solve ways to improve things.”
  • “Students noted that a particular anatomical region (the neck) was not specifically included in any lecture (rather, particular aspects were included in multiple lectures), and that a single lecture devoted to it would be useful. We incorporated that suggestion successfully the following year.”

Providing a neutral forum to discuss a balanced report of the course, 4 (33%) of responses
  • “We had one meeting with 2 students on the SCRT Committee regarding the [name redacted] course. They were extremely pleasant, respectful and cooperative. The value for us was more about process than content. We were able to listen to their comments and suggestions more openly because their criticisms were constructive, thoughtful and balanced. Sometimes the written evaluations just sound like a bunch of complaints.”

SCRT as Compared to Other Curricular Organizations

Students and faculty expressed a high degree of enthusiasm for the added value of a program like SCRT. Of student respondents, 84% agreed with the statement “SCRT meets needs that are not met by other curricular organizations at Hopkins” (Figure 3). Across the various avenues in which students may provide course feedback, respondents were significantly more likely to rely on SCRT than any other feedback method (e.g. course directors, medical student senate, academic deans) (p=0.036). Most faculty (69%) supported the consistency of SCRT reviews relative to other feedback sources, and 62% agreed that SCRT provides value not captured by other curricular organizations (Figure 3). Of those, a balanced, class-wide perspective and face-to-face interaction were the most commonly cited points (Table 3).

SCRT in 2020, and Beyond

In all, 53 (47%) students and 10 (77%) faculty responded to the open-ended question at the end of the survey asking respondents to suggest how SCRT could be more effective (Table 1). The most common suggestion cited by both students and faculty was tracking curricular changes made in response to SCRT requests for the SCRT-reviewed courses (Table 4). Student responses emphasized relaying course director feedback at future Town Halls or in class-wide communications. Faculty responses underscored the need to preserve institutional memory of course evolution in response to SCRT and other feedback mechanisms. Several students also supported allocating more time to discuss courses in SCRT Town Halls, identifying that time was a constraint in their ability to fully explore the structure, content, and teaching styles within a given course.

TABLE 4.

Student- and faculty-identified strategies for improving SCRT

Major Themes, n (%) of responses, and SCRT response*

Student and Faculty
  Tracking curricular changes made in response to SCRT requests (Student: 19 (32%); Faculty: 4 (36%))
    SCRT response: Notes taken at faculty meetings distributed to the class that ran the town hall (e.g. MS-1).

Student
  Allocating more time for discussion in SCRT forums (9 (15%))
    SCRT response: No more than two courses will be reviewed at a given town hall.

Faculty
  Consolidate institutional memory of SCRT reviews (3 (27%))
    SCRT response: All SCRT-generated reports and notes from faculty meetings will be uploaded onto a Google Folder for future reference.

* Response defined as actions taken by SCRT for the 2019-2020 academic year.

DISCUSSION

The results of our survey demonstrated that SCRT is well-regarded among the JHUSOM student body as an agent of curricular change. Additionally, most faculty respondents (62%) found that SCRT added value beyond both course evaluations and other extant mechanisms of curricular review. Although solicitation of student feedback is required of all medical schools by the Liaison Committee on Medical Education (LCME), conventional feedback surveys may be insufficient to truly capture overall educational effectiveness (Hsih et al. 2015; Amrein-Beardsley & Haladyna 2009). Focus groups, as exemplified by SCRT’s Town Halls, can be more informative than surveys or one-on-one interviews, as they allow for dynamic interaction in describing a course’s shortcomings and suggesting avenues for improvement (Kogan & Shea 2007; Shea et al. 2004; Frasier et al. 1997). Thus, even though other mechanisms of collecting student feedback exist, these data suggest that SCRT is uniquely effective in gathering this information.

The two aspects of SCRT that students ranked as most important were being ‘student-led’ and fostering ‘peer-to-peer communication.’ In free-text responses, students reported feeling heard and validated in Town Halls, suggesting that direct communication with peers during the course-review process legitimizes their concerns in a way that other extant course review mechanisms do not. One potential contributor to this preference is fear of repercussions following traditional student-to-instructor feedback. A study comparing open and anonymous medical student evaluation of clinical preceptors found that although no significant difference existed in assigned scores, students cited fear of subsequent interaction with the evaluated attending as a barrier to effective feedback (Afonso et al. 2005). In contrast, SCRT preserves student anonymity by discussing concerns collated from both course evaluation comments and opinions voiced during Town Halls. These characteristics likely create a more detailed and valuable course evaluation process.

A second rationale for students’ valuation of SCRT is that the organization not only validated their concerns but also created space for self-reflection. This finding has important ramifications in the context of imposter syndrome, a widespread phenomenon characterized by the feeling of not belonging in a space, documented in academic settings (Bravata et al. 2019), including medical school (Villwock et al. 2016), in which the difficulty of the academic or professional program can lead students to doubt their fitness for their role. Preliminary studies have shown that group reflection can help alleviate the effects of imposter syndrome (Gold et al. 2019). In the same fashion, the ‘self-reflection’ described by student participants attending SCRT Town Halls may improve mental health and well-being by leading students to recognize their shared struggles with certain aspects of their coursework; course design is thereby highlighted as a potential source of difficulty rather than a lack of individual aptitude. Further, given that medical students are at higher risk of burnout and suicidal ideation than age-matched peers (Dyrbye et al. 2008; Dyrbye et al. 2014), forums such as SCRT may serve an important role in student well-being in addition to promoting innovative course reform.

While this investigation has highlighted several positive impacts of gathering student feedback in a student-led forum, several studies have raised questions about the reliability of learner feedback in medical school. Specifically, several factors that do not directly relate to the quality of didactic teaching, such as the gender of student evaluators, pre-course interest in the subject material, and satisfaction with final examinations, have been identified as contributing to more favorable course evaluations (Schiekirka & Raupach 2015; Woloschuk et al. 2011). Studies of college students have also identified instructor attractiveness and perceived ease of a course as being positively associated with evaluation scores (Felton et al. 2008). However, analysis of these characteristics has found that although they do impact evaluation scores, their impact (<7%) is relatively small compared to that of actual quality of instruction (Hessler et al. 2018; Beran & Violato 2005). Elucidation of the true impact of these factors on student evaluation is needed to determine the impartiality and ultimate usefulness of student feedback, especially in the medical student population, which may differ in significant ways from the general college-age population (e.g., age, interest in academics).

Beyond concerns of learner feedback validity, a number of other potential limitations exist for these data as well as for implementation of the SCRT model itself. First, this survey design, as with any, may be predisposed to sampling bias, potentially resulting in increased response rate among students or faculty with polarized views of SCRT. A second limitation is related to the generalizability of this study. Other medical education institutions may have different grading systems, student and faculty culture, and curricular design logistics, all of which may restrict the direct translation of this model. The third limitation of the current SCRT model is that the beneficiaries of course reform are the students of the subsequent class, not the students proposing the current changes. This drawback was recapitulated in our data; third year students were less likely to report the impact of SCRT on their education as “positive” than second year students currently taking courses with new implementations. Given that neither the rising class nor the former class will experience both the ‘new’ course changes and the ‘old’ course prior to SCRT review, the only mechanisms of continuity between student impressions before-and-after course changes are previous SCRT reviews and informal communication between students of different classes.

To address some of the SCRT model shortcomings and better inform curricular innovation, several future directions exist. First, to better “close the loop,” SCRT has begun distributing faculty- and student-approved notes from SCRT-faculty meetings via class-wide communications (Table 4). Further, all SCRT-generated material from a given academic year is uploaded onto a secure, online repository for future reference by the organization (Table 4). Second, to mitigate the loss of student feedback due to the delay between a course and its evaluation, there may be a role for real-time student feedback. Of note, the delivery and implementation of such a mechanism could be challenging.

In conclusion, our study demonstrates considerable medical student and course director satisfaction with a medical student-led committee for evaluation and improvement of preclinical courses. By creating a student-led forum for peer-to-peer curricular discussion, SCRT solicited constructive criticism in a way that was both anonymizing and validating to students, who may benefit from collaboratively voicing their academic struggles with peers. In doing so, SCRT fostered a richer discussion of potential course improvements and delivered that information to faculty in a way that provided context as well as added value beyond traditional course metrics.

Given the positive impact the current SCRT model has had on JHUSOM students and faculty, application of similar course evaluation strategies may be beneficial to other medical schools seeking to enhance their curricular review process.

Practice Points.

  • Student feedback is a core component of curriculum design and refinement.

  • Student-led, collaborative, peer-to-peer communication effectively captures student opinion.

  • Student forums generate realistic and actionable suggestions to inform productive dialogue between students and course directors.

  • Student-led feedback mechanisms may satisfy an unmet need in curricular improvement.

ACKNOWLEDGEMENTS

We would like to thank course directors at the Johns Hopkins University School of Medicine, past members of the SCRT team, the Office of Curriculum, and the Office of Academic Computing for their commitment to fostering and supporting student feedback.

Footnotes

DECLARATION OF INTEREST

The authors report no declarations of interest.

Notes on Contributors

All authors, except Harry Goldberg, are third- and fourth-year medical students at the Johns Hopkins University School of Medicine.

Harry Goldberg, PhD, is the Faculty Sponsor of SCRT and an Assistant Dean at the Johns Hopkins University School of Medicine.

REFERENCES

  1. Afonso NM, Cardozo LJ, Mascarenhas OA, Aranha AN and Shah C, 2005. Are anonymous evaluations a better assessment of faculty teaching performance? A comparative analysis of open and anonymous evaluation processes. Family Medicine, 37(1), pp.43–7. [PubMed] [Google Scholar]
  2. Amrein-Beardsley A and Haladyna T, 2009. Tinkering with the traditional to assess and promote quality instruction: Learning from a new and unimproved instructor evaluation instrument. Journal of College Teaching & Learning (TLC), 6(4). [Google Scholar]
  3. Beran T and Violato C, 2005. Ratings of university teacher instruction: How much do student and course characteristics really matter?. Assessment & Evaluation in Higher Education, 30(6), pp.593–601. [Google Scholar]
  4. Bilodeau PA, Liu XM and Cummings BA, 2019. Partnered Educational Governance: Rethinking Student Agency in Undergraduate Medical Education. Academic Medicine, 94(10), pp.1443–1447. [DOI] [PubMed] [Google Scholar]
  5. Bravata DM, Watts SA, Keefer AL, Madhusudhan DK, Taylor KT, Clark DM, Nelson RS, Cokley KO and Hagg HK, 2019. Prevalence, Predictors, and Treatment of Impostor Syndrome: a Systematic Review. Journal of General Internal Medicine, pp.1–24. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Dyrbye LN, Thomas MR, Massie FS, Power DV, Eacker A, Harper W, Durning S, Moutier C, Szydlo DW, Novotny PJ and Sloan JA, 2008. Burnout and suicidal ideation among US medical students. Annals of Internal Medicine, 149(5), pp.334–341. [DOI] [PubMed] [Google Scholar]
  7. Dyrbye LN, West CP, Satele D, Boone S, Tan L, Sloan J and Shanafelt TD, 2014. Burnout among US medical students, residents, and early career physicians relative to the general US population. Academic Medicine, 89(3), pp.443–451. [DOI] [PubMed] [Google Scholar]
  8. Felton J, Koper PT, Mitchell J and Stinson M, 2008. Attractiveness, easiness and other issues: Student evaluations of professors on ratemyprofessors. com. Assessment & Evaluation in Higher Education, 33(1), pp.45–61. [Google Scholar]
  9. Fetterman DM, Deitz J and Gesundheit N, 2010. Empowerment evaluation: A collaborative approach to evaluating and transforming a medical school curriculum. Academic Medicine, 85(5), pp.813–820. [DOI] [PubMed] [Google Scholar]
  10. Fleming P, Heath O, Goodridge A and Curran V, 2015. Making medical student course evaluations meaningful: implementation of an intensive course review protocol. BMC Medical Education, 15(1), p.99. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Frasier PY, Slatt L, Kowlowitz V, Kollisch DO and Mintzer M, 1997. Focus groups: a useful tool for curriculum evaluation. Family Medicine, 29(7), pp.500–507. [PubMed] [Google Scholar]
  12. Gold JA, Bentzley JP, Franciscus AM, Forte C and De Golia SG, 2019. An intervention in social connection: medical student reflection groups. Academic Psychiatry, 43(4), pp.375–380. [DOI] [PubMed] [Google Scholar]
  13. Hendry GD, Cumming RG, Lyon PM and Gordon J, 2001. Student-centred course evaluation in a four-year, problem based medical programme: Issues in collection and management of feedback. Assessment & Evaluation in Higher Education, 26(4), pp.327–339. [Google Scholar]
  14. Hessler M, Pöpping DM, Hollstein H, Ohlenburg H, Arnemann PH, Massoth C, Seidel LM, Zarbock A and Wenk M, 2018. Availability of cookies during an academic course session affects evaluation of teaching. Medical Education, 52(10), pp.1064–1072. [DOI] [PubMed] [Google Scholar]
  15. Hsih KW, Iscoe MS, Lupton JR, Mains TE, Nayar SK, Orlando MS, Parzuchowski AS, Sabbagh MF, Schulz JC, Shenderov K and Simkin DJ, 2015. The Student Curriculum Review Team: how we catalyze curricular changes through a student-centered approach. Medical teacher, 37(11), pp.1008–1012. [DOI] [PubMed] [Google Scholar]
  16. Hsieh HF and Shannon SE, 2005. Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), pp.1277–1288. [DOI] [PubMed] [Google Scholar]
  17. McCoy L, Pettit RK, Kellar C, and Morgan C, 2018. Tracking active learning in the medical school curriculum: a learning-centered approach. Journal of Medical Education and Curricular Development, 5, pp.1–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  18. Kogan JR and Shea JA, 2007. Course evaluation in medical education. Teaching and Teacher Education, 23(3), pp.251–264. [Google Scholar]
  19. Richman PS, Olvet DM, Ahmad S and Chandran L, 2019. Use of student feedback to drive quality improvement (QI) in a preclinical US medical school course. Medical Education Online, 24(1), p.1583968. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Schiekirka S and Raupach T, 2015. A systematic review of factors influencing student ratings in undergraduate medical education course evaluations. BMC Medical Education, 15(1), p.30. [DOI] [PMC free article] [PubMed] [Google Scholar]
  21. Shea JA, Bridge PD, Gould BE and Harris IB, 2004. UME-21 local evaluation initiatives: contributions and challenges. Family Medicine, 36(1; SUPP), pp.S133–S137. [PubMed] [Google Scholar]
  22. Slavin SJ, Schindler DL and Chibnall JT, 2014. Medical student mental health 3.0: improving student wellness through curricular changes. Academic Medicine, 89(4), p.573. [DOI] [PMC free article] [PubMed] [Google Scholar]
  23. R Core Team, 2018. R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. [Google Scholar]
  24. Villwock JA, Sobin LB, Koester LA and Harris TM, 2016. Impostor syndrome and burnout among American medical students: a pilot study. International Journal of Medical Education, 7, p.364. [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Wilson MW, Morreale MK, Waineo E and Balon R, 2013. The focus group: a method for curricular review. Academic Psychiatry: the Journal of the American Association of Directors of Psychiatric Residency Training and the Association for Academic Psychiatry, 37(4), p.281. [DOI] [PubMed] [Google Scholar]
  26. Woloschuk W, Coderre S, Wright B and McLaughlin K, 2011. What factors affect students' overall ratings of a course? Academic Medicine, 86(5), pp.640–643. [DOI] [PubMed] [Google Scholar]
