Medical Education Online. 2011 May 16;16. doi: 10.3402/meo.v16i0.6354

Faculty verbal evaluations reveal strategies used to promote medical student performance

Karen E Hauer 1,*, Lindsay Mazotti 2, Bridget O'Brien 1, Paul A Hemmer 3, Lowell Tong 4
PMCID: PMC3102538  PMID: 21629669

Abstract

Background

Preceptors rarely follow medical students' developing clinical performance over time and across disciplines. This study analyzes preceptors' descriptions of longitudinal integrated clerkship (LIC) students' clinical development and their identification of strategies to guide students' progress.

Methods

We used a common evaluation framework, reporter-interpreter-manager-educator, to guide multidisciplinary LIC preceptors' discussions of students' progress. We conducted thematic analysis of transcripts from preceptors' (seven longitudinal ambulatory preceptors per student) quarterly group discussions of 15 students' performance over one year.

Results

All students' clinical development progressed, although most experienced obstacles. Lack of structure in the history and physical exam commonly obstructed progression. Preceptors used templates for data gathering, and modeling or experiences in the inpatient setting to provide time and solidify structure. To advance students' knowledge acquisition, many preceptors identified focused learning topics with their students; to promote application of knowledge, preceptors used reasoning strategies to teach the steps involved in synthesizing clinical data. Preceptors shared accountability for helping students advance as the LIC allowed them to follow students' response to teaching strategies.

Discussion

These results depict preceptors' perceptions of LIC students' developmental continuum and illustrate how multidisciplinary preceptors can use a common evaluation framework to identify strategies to improve performance and follow students' performance longitudinally.

Keywords: education, medical, undergraduate, clinical clerkship, clinical competence, faculty


The principal clerkship year is foundational for students' clinical development, and yet faculty rarely follow students' clinical performance longitudinally over the year. In most curricula, students progress through discipline-based clerkships in serial four- to eight-week blocks. Within and across these blocks, students experience discontinuity with preceptors, who may supervise them only briefly (1). While the apprenticeship model of clinical training was designed to promote longitudinal mentorship and guidance, current realities typically fragment preceptors' opportunities to witness a student's progressive development (2). Such discontinuity also limits the formation of meaningful relationships between preceptors and students that could support shared commitment and accountability for advancing an individual student's clinical development.

Given the complexity of synthesizing the knowledge, skills, and attitudes needed to conduct clinical encounters, students depend on observation of their interactions and feedback regarding areas for improvement (3). The lack of opportunity for preceptors to observe their students regularly over a substantial period limits their ability to provide feedback and to assess students' responses to it. Not surprisingly, faculty written evaluations of students usually contain generalities without nuanced information about strengths, weaknesses, changes over time, and areas for development (4).

Understanding students' developmental trajectories would help preceptors target teaching strategies to student needs, facilitate appropriate clinical experiences, and provide useful feedback. However, most studies of the development of clinical ability examine students at a single timepoint or within single disciplines and do not describe individual students' progress over time (5-7). One framework for descriptive evaluation of students' clinical performance is the reporter-interpreter-manager-educator (RIME) framework, which is used by over 40% of US internal medicine and other single-discipline clerkships and has been used to guide feedback in longitudinal clerkships (8-11). RIME is a synthetic framework, in that each step along the developmental continuum requires a student to demonstrate the requisite knowledge, skills, and attitudes to succeed (Appendix 1). This synthetic framework captures the integration of individual domains on the path toward competence as a physician (12). While RIME provides the frame of reference, formal evaluation sessions (real-time, regular, face-to-face meetings among teachers during clinical clerkships to discuss student performance and generate feedback) allow a clerkship director and faculty to collaboratively produce detailed descriptions of student performance while simultaneously training faculty to use the evaluation framework (8). The evaluation sessions capitalize on the social-cognitive nature of clinical training (13) and on faculty's willingness to say what they may not be prepared to write in evaluations. While prior research has focused on the feasibility, acceptability, and usefulness of the RIME vocabulary and evaluation sessions in single clerkships, it is unclear how this synthetic framework may enable preceptors from multiple disciplines to select strategies to help learners at different stages of development.

In our longitudinal setting, we identified opportunities to capture preceptors' characterization of students' performance over time using the RIME framework during evaluation sessions. Year-long relationships between individual students and preceptors that develop in a longitudinal integrated clerkship (LIC) (14) allowed us to enhance the usefulness of the RIME framework by associating descriptions of performance with specific strategies selected by preceptors to advance students' abilities, and those same preceptors' insights into students' subsequent performance—something not previously described. This nuanced understanding can inform clinical teachers about learner-centered approaches to precepting and providing individualized feedback.

The purpose of this study was to analyze faculty's descriptive evaluations of students' clinical performance in an LIC, with an emphasis on identifying the strategies they recommended to advance their students' performance.

Methods

Design

This was a qualitative study piloting the first-ever implementation of the RIME framework coupled with formal evaluation sessions of students' performance in multiple simultaneous core clerkships within an LIC at a single US medical school.

Setting

LIC students completed all core clerkships concurrently as 12-month outpatient preceptorships. Each student had a single faculty preceptor for each core discipline (family medicine, medicine, neurology, obstetrics/gynecology, pediatrics, psychiatry, surgery). Thus each student had the same seven preceptors all year, one per discipline, and met with each preceptor for a half-day clinic approximately once every one to two weeks, as shown in Fig. 1. Students acquired a panel of patients whom they followed longitudinally across inpatient and outpatient visits with any provider.

Fig. 1. Longitudinal integrated clerkship student schedule.

Participants

Study participants were all 87 preceptors for all 15 LIC students over the 2008–2009 academic year. Sixteen preceptors had two students; one had three. Preceptors were recruited by each department's clerkship director based on willingness to precept a student longitudinally, prior teaching ratings and experience, and a clinical practice conducive to student continuity with some patients. Preceptors participated in a two-hour orientation to the program with a 16-page information pack addressing program goals, guidelines for students' interactions with longitudinal patients, feedback, and evaluation. The evaluation component included orientation to the RIME framework.

The 15 LIC students ranked the clerkship in their top two preferences during the clerkship scheduling process, in which students choose among traditional block clerkships at several sites during the academic year, the LIC, or three programs featuring traditional block clerkships at a single site. There were no baseline differences in demographics or pre-clerkship academic performance between LIC and non-LIC students.

Data Analysis

Study data were obtained from recordings of preceptors' discussions of students' performance at quarterly evaluation sessions (clerkship months 3, 6, 9, 12). The RIME framework and the evaluation session format were introduced to preceptors, with examples of students' performance, at the LIC orientation described above at the beginning of the year, and preceptors were reminded by email and verbally before each session. At each evaluation session, 20 minutes were allocated to discuss each student, during which all available preceptors discussed that student's progress in person or by conference call. Each session was facilitated by one of the two LIC directors. Written comments submitted by absent preceptors were read aloud. Each preceptor had approximately two minutes to present the RIME descriptor that best characterized the student's performance on most occasions, behavioral examples of the student's performance, and 'next steps' for improvement. The RIME descriptors are shown in Appendix 1 (15). Preceptors listened to others describe the same student and discussed questions and suggestions among themselves and with the directors. In the tradition of frame-of-reference training, which defines expected performance at certain levels for comparison with observed performance (16), the LIC director guided the preceptors' use of the RIME framework, asking for clarifications or examples when necessary and correcting preceptors if their RIME level was not supported by their comments. The 'next steps' described by preceptors included recommendations for how students could advance their clinical performance and strategies the preceptors had used or planned to use to promote that advancement. Students received written summaries, with specialty and preceptor identifiers removed, to review individually within 30 days of the evaluation session with a year-long program advisor who helped set learning goals. Advisors were also asked to attend the evaluation sessions; some, but not all, were preceptors for their advisees.

Evaluation sessions were audio-recorded and transcribed verbatim. Preceptors' assignments of RIME descriptors were averaged across preceptors for a given student. Each of the 60 transcripts (four per student for 15 students) represented the discussion of one student at one evaluation session and was de-identified prior to analysis. This study focused on data describing clinical performance development. Review of initial transcripts showed that faculty discussed performance using language similar to that on the school's written evaluation form for clerkship students. Four investigators (KEH, LM, BCO, LT) generated an initial codebook based on evaluation form domains and used it to code two transcripts. The investigators discussed and revised the codebook based on transcript content. Codes addressed data gathering, written and oral presentation, and communication; codes were added for program structure, clinical settings, and students' work habits. The four investigators then applied the codes to three more transcripts, refined the codebook as needed, and coded all remaining transcripts in pairs. Differences were reconciled through discussion. Because all transcripts were double coded and differences reconciled, we did not calculate inter-rater reliability. We used the constant comparative method to identify themes and strategies within evaluation sessions that could then be examined in subsequent session transcripts, and to examine themes and use of strategies within individual students' transcripts over time and across different students (17, 18). All four coders discussed the coded data to generate larger themes. We used NVivo8® to organize and retrieve coded data.
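As an illustration of the averaging step described above, the sketch below shows one way preceptor-assigned RIME descriptors could be mapped to an ordinal scale, averaged per student per session, and tabulated by timepoint into counts of the kind reported in Table 1. The numeric mapping, the half-step labels (e.g., 'R-I'), and the rounding rule are illustrative assumptions, not the authors' procedure.

    # Minimal sketch (not the authors' analysis code): averaging preceptor-assigned
    # RIME descriptors per student and tabulating counts by timepoint.
    # The 1-4 ordinal mapping, half-step labels, and rounding rule are assumptions.
    from collections import Counter

    RIME_SCALE = {"Reporter": 1, "Interpreter": 2, "Manager": 3, "Educator": 4}
    LABELS = {1.0: "Reporter", 1.5: "R-I", 2.0: "Interpreter", 2.5: "I-M",
              3.0: "Manager", 3.5: "M-E", 4.0: "Educator"}

    def average_rime(descriptors):
        """Average one student's descriptors from one session, rounded to the
        nearest half step so intermediate levels (e.g., R-I) can appear."""
        avg = sum(RIME_SCALE[d] for d in descriptors) / len(descriptors)
        return LABELS[round(avg * 2) / 2]

    def table_counts(assignments):
        """assignments[timepoint][student] is a list of descriptors from that
        student's preceptors; returns counts of averaged adjectives per timepoint."""
        return {tp: Counter(average_rime(d) for d in per_student.values())
                for tp, per_student in assignments.items()}

    # Hypothetical example: two preceptors rate one student at one session.
    print(average_rime(["Reporter", "Interpreter"]))  # -> "R-I"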

The University of California San Francisco Institutional Review Board approved the study.

Results

Over the four evaluation sessions, 72 of 87 preceptors participated. The 15 preceptors who never participated in person or by phone were from family medicine (two), neurology (two), obstetrics/gynecology (five), pediatrics (three), psychiatry (one), and surgery (two). Preceptor in-person and conference-call participation ranged from 43% to 60% at each session. All students had multiple preceptors participate at every session. Faculty reported anecdotally that busy schedules explained non-participation. RIME descriptors for each student were averaged across preceptors at each timepoint (Table 1). Over the academic year, all students progressed within the RIME framework; a few progressed faster and farther than others, and some stalled at certain timepoints.

Table 1. Reporter-interpreter-manager-educator (RIME) framework adjective assigned in evaluation sessions by preceptors for 15 students in a longitudinal integrated clerkship over one academic year*

                                      Timepoint during the clerkship year
RIME adjective (no. of students)        1       2       3       4
Reporter                               15       7       1       0
R-I                                     0       3       0       0
Interpreter                             0       5       9       1
I-M                                     0       0       3       3
Manager                                 0       0       2       9
M-E                                     0       0       0       2
Educator                                0       0       0       0

*The adjective assigned represents the average assignment across preceptors for a student.

Results from the qualitative analysis are organized below into the two performance domains that emerged in the transcripts: data gathering and reporting, and knowledge and clinical reasoning. For each, we present what the faculty described as general precepting strategies, strategies for students with rapid developmental progression, and strategies for students manifesting obstacles to development.

Data gathering and reporting

General precepting strategies to advance students' development

Initially, preceptors described their efforts to teach students to take the history of present illness with structure and focus relevant to their particular settings. They provided feedback to students about how to gather the right amount of data, organize information, and generate accurate, clear oral presentations. One preceptor commented at the first timepoint, ‘She doesn't really understand the neurological exam [or] the different parts and what they show yet, and how to direct an exam based on the history.’ At the third timepoint, the same preceptor praised the student's improved data gathering: ‘She has refined history-taking abilities. She's fluent with a neuro exam.’

When preceptors wanted to identify areas for improvement and provide specific feedback, they recognized the need for focused observations of the student. One preceptor said at the first timepoint, "I need to do more observation of her, going over some of the physical exam skills." At the third timepoint, the preceptor followed up with praise that the student 'has diagnosed a couple of heart murmurs accurately that weren't documented.' To focus their teaching and students' bedside learning, some preceptors addressed one aspect of the physical exam per clinic session for in-depth teaching and practice. A generalist preceptor enlisted the student's other preceptors, who saw more patients with abnormal physical examination findings, so the student could practice physical exam technique and interpretation in their settings.

Preceptors frequently reviewed expectations for the standard organization and content of reporting through oral presentations and notes with their students. As students advanced their data-gathering and reporting skills, preceptors consistently identified next steps as synthesizing, prioritizing, and focusing on pertinent findings. One preceptor reported:

At the beginning, he was just writing whatever he was hearing, like it was flow of consciousness. Now, he's making logical sense. He's showing the evolution of the illness and what brought the patient [in].

Strategies for students demonstrating rapid, progressive development

The four highest-performing students progressed at every timepoint. They were quick, flexible thinkers who adjusted their data gathering and reporting of assessments in real time based on information elicited. At the first timepoint, one preceptor described a student's early formulation of differential diagnoses:

He actually directs his exams more and more towards solving his differential question. So no longer just randomly asking questions but saying, “Ah-hah. So you have this pain. What about this and this?” Really beginning to get at questions and physical exam findings that would direct him to a diagnosis.

As these students readily mastered basic clinical presentations, preceptors described their role as providing opportunities for evaluating more unusual, complex patients.

Strategies for students with obstacles to performance

Ten students struggled to solidify data-gathering and reporting skills through the second timepoint. Preceptors commonly identified lack of structure in the history and physical exam as an impediment. Clinics with algorithmic approaches to data gathering provided helpful structure for students lacking organizational skills:

We have a very rote approach to our patients, which has probably helped in that it's a very specific template, so that the skill is in deciding how many questions to ask in that template until you're satisfied that you know what the pertinent positives and negatives are.

Conflicting preceptor impressions of students in clinics with and without these templated approaches helped diagnose underlying organizational problems. In the absence of pre-existing templates, preceptors promoted organized data gathering by providing structure or modeling. One preceptor was ‘trying to get him to have sort of a checklist in his mind of what he needs to go through for every H&P [history and physical examination].’

To encourage data gathering targeted to the patient's complaint, some preceptors taught students how to read patient charts or textbooks in advance to anticipate relevant questions to ask and systems to examine. For some students with highly structured approaches, however, preceptors observed lack of responsiveness to new information during history taking. In response, preceptors explained how to adapt questions based on information gathered.

One preceptor modeled an oral presentation to help a student struggling to organize and synthesize information. Another observed that a student's language and information synthesis improved after spending time with an inpatient team; the preceptor attributed the improvement to the structure of the inpatient setting.

The fast pace of the ambulatory setting challenged students. Some preceptors addressed this problem by directing students' time management during encounters—either assigning a certain amount of time to encourage efficiency, or in one case encouraging the student to time the visit to promote the student's own awareness. One preceptor took such a student with persistent problems gathering and synthesizing information efficiently to practice seeing patients on the inpatient unit, free of the ambulatory setting's time constraints.

Knowledge and clinical reasoning

General precepting strategies to advance students' development

To advance students' knowledge acquisition, many preceptors and students identified a learning topic after each clinic session for the student to read about and report to the preceptor. Preceptors praised students who self-identified appropriate reading sources without reliance on preceptors' guidance.

Over the year, preceptors transitioned their expectations from knowledge accumulation to application to specific patients. They described all students as progressing in their clinical reasoning, although at different rates. Preceptors' expectations shifted from generation of problem lists to attempts at differential diagnoses and then to prioritized lists of likely possibilities.

At the second timepoint, one preceptor said, ‘I want her to make the leap into coming up with a differential in terms of, especially, common gynecologic problems.’

Preceptors with subspecialty practices often reflected that students saw patients with similar issues or pre-established diagnoses. Several recognized the need to identify general teaching points even in the subspecialty setting. For example, a pediatric subspecialist encouraged students to assess and discuss general pediatric issues with each patient. Some students proactively scheduled extra sessions with a generalist or acute care provider to supplement subspecialty sessions.

Strategies for students demonstrating rapid, progressive development

The four highest-performing students differed from the majority of their peers by attempting, from early on, differential diagnoses and clinical reasoning, and they steadily improved. These students applied newly acquired knowledge to patients across specialties and settings:

She applied knowledge she had gained in the adult world to one of our pediatric patients, teaching me in the process… she impressed me with her maturity and how she was thinking through what she wanted to get out of the third or fourth year. Although I know she will not be going into my field, she continued to show interest in learning and figuring out how to apply knowledge of my field to hers.

From the preceptors' perspective, these students benefited from a range of clinical opportunities, but preceptors emphasized the students' role in their own advancement through their cognitive abilities, motivation, and self-monitoring of learning. One preceptor described:

She's doing a very good job of taking advantage of what she's seeing clinically to tell her what she needs to learn next. I think she's just going to naturally evolve into a manager role.

Strategies for students with obstacles to performance

Eleven students had problems across at least two timepoints accessing knowledge and employing clinical reasoning to apply it to patients. For two, tunnel vision prompted premature declaration of narrow differential diagnoses. More commonly, students were overly thorough without focusing on relevant information. For example, at the second timepoint one preceptor stated: ‘He is definitely not at the point where he is able to really synthesize a set of data and studies and then represent it with any sort of plan.’ By the next timepoint, after focused coaching, the student was generating differential diagnoses, although they were still described as too thorough or too brief.

Preceptors were not always clear whether these students lacked foundational pathophysiologic knowledge or failed to incorporate it clinically. Students without a framework for remembering features of basic clinical problems failed to apply knowledge learned from one patient to another. Preceptors used highly structured strategies to engage these students in efforts to gain knowledge. Although many preceptors and students jointly selected basic learning topics early in the year, most preceptors later assigned students with knowledge deficiencies structured reading topics based on clinical cases or common topics in the discipline. These students required guidance about exactly how to read textbooks and the literature, and integrate that information clinically:

[He] seems still at a loss for even what to ask patients. So we got the textbook that the medical students use, and one of our fellows, on a weekly basis, has just been going over some very basic things with him.

To reinforce basic knowledge, preceptors steered these students toward patients with common problems.

Preceptors guided these students to generate problem lists and to read before presenting cases. For one student who resisted this instruction, the preceptor modeled how to look up information during clinic and apply it to a patient. A common strategy to promote application of information from reading to patients was to assign all students one write-up from a clinic session to complete at home; for students struggling to access and apply knowledge, this strategy remained essential because it allowed more time for reading and synthesis.

Preceptors felt responsible for advancing students' ability to approach clinical problems. One preceptor explained:

I don't know if I'm just not effectively helping him to know how to begin to walk through basic things… [I'm] helping him to target his reading, to incorporate his reading into how he thinks about the patients.

The other preceptors also noted at the second and third timepoints that this student required guidance about exactly how to read textbooks and the literature and how to integrate that information clinically. They observed a benefit from seeing common clinical presentations multiple times to solidify understanding. With these strategies, by year's end the student had 'made very substantial strides' and showed confidence in identifying patient problems and developing diagnostic and treatment plans.

Discussion

To our knowledge, this is the first study describing multidisciplinary preceptors' use of the RIME framework in formal evaluation sessions in a longitudinal setting. In addition, we describe the strategies used by preceptors to guide students' clinical development over the principal clinical year. This ‘in vivo’ view of LIC students' development illustrates how preceptors addressed various obstacles to progress in data gathering and the acquisition and application of knowledge.

Based on the qualitative analysis, our preceptors' longitudinal relationships with their students seemed to facilitate insights into students' progressive development, and, perhaps more importantly, the preceptors had sustained opportunities for observation, implementation of learning strategies, and monitoring of progress. This continuity allowed preceptors to select student tasks that were appropriately challenging to advance their learning and useful for the preceptor to determine whether learning goals were achieved (1921). The manner in which our preceptors identified strategies for accomplishing next steps in students' learning and observed progress toward those goals over time is consistent with recommended conceptualization of feedback as part of an ongoing dialogue to support learners' advancement (22). We also heard from preceptors how they attempted to balance providing challenge for students with providing support in the fast-paced ambulatory setting. These preceptor efforts align with a social-constructivist model of learning, in which teachers respond to learners' needs in addition to challenging them to higher levels of performance (23).

Consistent with early learners' performance according to the Dreyfus scheme for development of expertise, our students' progress occurred at the interface of novice and advanced beginner (24). In that perspective, preceptors appropriately recognized the importance of students learning to make connections between knowledge and illness presentations, or between different but similar presentations (25). Some articulated the teacher's role in providing exposure to general case examples and highlighting key features and underlying principles (26, 27). Instructional strategies used by preceptors were often consistent with those recommended in the literature on clinical reasoning, such as adjusting expectations based on students' performance, imparting reading strategies, and identifying opportunities for comparing presentations across settings (27).

Most students manifested some difficulties in data gathering or clinical knowledge and reasoning at some point in the year. The high prevalence of stalled progress suggests these occurrences may be normal aspects of development, at least for LIC students, that all preceptors should be able to identify and address. For instance, some students who had learned the organization of a history lacked flexibility in their questioning, a hallmark of the novice learner (24). Our preceptors recognized that they needed to help students progress from simply performing data-gathering maneuvers to connecting their findings to a differential diagnosis (21). Preceptors' focus on effective data-gathering technique has similarly been reported in prior studies of clinical teaching (28, 29). Our preceptors also used modeling to emphasize basic clinical skills, a strategy that has been recommended for excellent teaching (19). Future research to clarify how to distinguish normal developmental struggles from more worrisome deficits would be helpful. The highest-performing students manifested strategies typical of self-regulated learners that seemed to facilitate their advancement with minimal corrective intervention by preceptors; facilitating adoption of these strategies by other students might help those facing obstacles (30).

The longitudinal clerkship structure enhanced accountability for preceptors to ensure their students' learning. Because preceptors worked individually with their students over a year, they could identify deficits, prepare feedback with action plans, and monitor subsequent performance (31). This situation contrasts with most block clerkships, in which shorter periods of interaction may lead faculty to report deficits only on final evaluations or not at all, depriving students of the opportunity to improve with those preceptors (3). In fact, concern about ‘forward feeding’ information regarding struggling students' performance to subsequent instructors is obviated in the LIC model (32). The serial evaluation sessions created an environment that allowed faculty to communicate honest opinions and concerns, anticipate developmental progress, and generate collaborative learning plans, in contrast to clerkships that rely solely on written evaluations completed in isolation. Thus use of the RIME framework along with evaluation sessions in this LIC model provides an example of using an assessment strategy in part for the purpose of advancing learning (33). In fact, the progression of the student group from the level of reporter toward interpreter and manager over the year suggests evidence for the construct validity of the RIME terminology for assessment of students' performance (the construct being ‘growing independence’ that is not dependent on the clerkship discipline). Admittedly, this evidence of validity is difficult to discern given that we asked the faculty to use RIME; comparison of these descriptive data with performance data from other objective assessments of performance could strengthen the evidence of validity.

Our findings may have implications for the design of clinical experiences for students. Our results show how educators attempt to tailor core clinical experiences to LIC students' learning needs within the context of faculty practices (34). Our preceptors defined and assigned students level-appropriate patients (e.g., basic versus complex presentations) or tasks (e.g., assessment before plan). Observing students' performance level over time and being able to consult with other preceptors seemed to enhance preceptors' ability to customize experiences using readily available clinical resources. Our use of RIME and serial evaluation sessions provided some faculty development on the RIME framework, although there is room to capitalize on the model even more with additional training on how to use evaluation sessions strategically to monitor students' progress in specific domains and intervene accordingly. For instance, faculty development on observation of trainees' clinical skills could augment preceptors' ability to characterize students' strengths and deficits (35). Group faculty development on collaborative interventions could also engage preceptors to address common obstacles for individual students across settings. Further research could explore whether these efforts could facilitate a feedback cycle of information sharing, skills improvement, and subsequent observation of performance (36).

This study has limitations. The data derive from a single medical school in one academic year with a limited number of students who chose to enroll in the LIC, although there were no baseline differences between these students and their peers in other clerkship tracks. Other students in other clerkship models might progress differently; however, the RIME framework also applies for students in traditional clerkship settings (8, 9, 11). We cannot determine how preceptor comments might have been reported differently had the evaluation sessions not used the RIME vocabulary. We did not observe students' clinical performance to verify preceptors' reports of performance, or their changes in performance after preceptors planned, and ideally implemented, strategies to help them; nor did we calculate inter-rater reliability of preceptor RIME adjectives, because students might perform in different ways in different settings. However, involvement of a large number of preceptors from multiple specialties at four timepoints captures longitudinal aspects of students' performance. Participation bias is possible, as not all preceptors attended the evaluation sessions, although our participation rate was high among busy clinical faculty. Faculty were recruited to precept, in part, based on their teaching skill. However, we believe their ability to discern student progress and identify next steps was not unique, but was facilitated by the format of collective sharing and problem solving at the evaluation sessions. We chose to use RIME as a commonly employed and easily understood framework for faculty to describe medical students' development; other frameworks might have produced different faculty discussions, but, reassuringly, our faculty addressed core skills necessary for all clinical students.

This study illustrates how preceptors intervene with the goal of promoting LIC students' clinical development over the core clerkship year. In the context of a developmental perspective, preceptors used templates and modeling to promote data-gathering and reporting skills, and aimed to impart understanding of reading strategies and application of knowledge. Our findings show how preceptors plan to intervene to guide students' development with specific performance feedback and instruction in a setting that facilitates follow-up of students' progress.

Acknowledgements

Joanne Batt for administrative support; Ann Poncelet MD, PISCES director; Glenn Regehr PhD and Patricia O'Sullivan EdD for expert review; the PISCES students.

Appendix 1. RIME vocabulary

'Reporter': The student can accurately gather and clearly communicate the clinical facts on his/her own patients. Mastery of this step requires the basic skill to do a history and physical examination and the basic knowledge to know what to look for. It emphasizes day-to-day reliability, for instance being on time or following up on a patient's test results. Implicit in this step is the ability to distinguish normal from abnormal and the confidence to identify and label a new problem. This step requires a sense of responsibility and consistency in 'bedside' skills in dealing directly with patients. These skills are often introduced to students in their pre-clinical years, but now they must be mastered as a 'pass' criterion. Students must be complete, accurate, reliable, and honest. They must consistently be able to answer accurately the 'what' kinds of questions about their patients.

'Interpreter': Making the transition from 'reporter' to 'interpreter' is an essential step in the growth of a third-year student, and often the most difficult. At a basic level, the student must prioritize among the problems identified in their time with the patient. The next step is to offer a differential diagnosis. Because a public forum can be intimidating to beginners, and third-year students cannot be expected to have the 'right answer' all the time, we define success as offering at least three reasonable diagnostic possibilities for new problems. Follow-up of tests provides another opportunity to 'interpret' the data (especially in the clinic setting). This step requires a higher level of knowledge and more skill in selecting the clinical findings that support possible diagnoses and applying those findings to specific patients. The student has to make the emotional transition from 'bystander' to seeing himself/herself as an active participant in patient care. Students at this level consistently have reasonable answers to the 'why' questions about their patients.

‘Manager’: This step takes even more knowledge, more confidence, and more judgment in deciding when action needs to be taken, and to propose and select among options for patients. Once again we cannot require students to be ‘right’ with each suggestion, so we ask them to include at least three options in their diagnostic and therapeutic plan. A key element is to tailor the plan to the particular patient's circumstances and preferences.

‘Educator’: Success in each prior step depends on self-directed learning and a mastery of basics. To be an ‘educator’ in our framework means to go beyond the required basics, to read deeply, and to share new learning with others. Defining important questions to research in more depth takes insight. Having the drive to look for hard evidence on which clinical practice can be based and having the skill to know whether the evidence will stand up to scrutiny are qualities of an advanced trainee; to share leadership in educating the team (and even the faculty) takes maturity and confidence. At the manager/educator level, students can consistently answer and address the ‘how’ questions (how things work, how they will help my patient, etc.).

Conflict of interest and funding

Disclaimer: The views expressed in this paper are those of the authors and do not reflect official policy of the US government or other federal agencies.

This study was funded in part by the University of California San Francisco Office of Medical Education and Academy of Medical Educators, and the Western Group on Educational Affairs.

References

  1. Dolmans DH, Wolfhagen IH, Heineman E, Scherpbier AJ. Factors adversely affecting student learning in the clinical learning environment: a student perspective. Educ Health (Abingdon). 2008;21:32.
  2. Kilminster SM, Jolly BC. Effective supervision in clinical practice settings: a literature review. Med Educ. 2000;34:827–40. doi: 10.1046/j.1365-2923.2000.00758.x.
  3. Branch WT Jr, Paranjape A. Feedback and reflection: teaching methods for clinical settings. Acad Med. 2002;77:1185–8. doi: 10.1097/00001888-200212000-00005.
  4. Lye PS, Biernat KA, Bragg DS, Simpson DE. A pleasure to work with—an analysis of written comments on student evaluations. Ambul Pediatr. 2001;1:128–31. doi: 10.1367/1539-4409(2001)001<0128:aptwwa>2.0.co;2.
  5. Coderre S, Wright B, McLaughlin K. To think is good: querying an initial hypothesis reduces diagnostic error in medical students. Acad Med. 2010;85:1125–9. doi: 10.1097/ACM.0b013e3181e1b229.
  6. Fuks A, Boudreau JD, Cassell EJ. Teaching clinical thinking to first-year medical students. Med Teach. 2009;31:105–11. doi: 10.1080/01421590802512979.
  7. Smucny J, Epling JW. A web-based approach to teaching students about diagnostic reasoning. Fam Med. 2004;36:622–4.
  8. Hemmer PA, Pangaro L. Using formal evaluation sessions for case-based faculty development during clinical clerkships. Acad Med. 2000;75:1216–21. doi: 10.1097/00001888-200012000-00021.
  9. Battistone MJ, Milne C, Sande MA, Pangaro LN, Hemmer PA, Shomaker TS. The feasibility and acceptability of implementing formal evaluation sessions and using descriptive vocabulary to assess student performance on a clinical clerkship. Teach Learn Med. 2002;14:5–10. doi: 10.1207/S15328015TLM1401_3.
  10. DeWitt D, Carline J, Paauw D, Pangaro L. Pilot study of a 'RIME'-based tool for giving feedback in a multi-specialty longitudinal clerkship. Med Educ. 2008;42:1205–9. doi: 10.1111/j.1365-2923.2008.03229.x.
  11. Hemmer PA, Papp KK, Mechaber AJ, Durning SJ. Evaluation, grading, and use of the RIME vocabulary on internal medicine clerkships: results of a national survey and comparison to other clinical clerkships. Teach Learn Med. 2008;20:118–26. doi: 10.1080/10401330801991287.
  12. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226–35. doi: 10.1001/jama.287.2.226.
  13. Dornan T, Boshuizen H, King N, Scherpbier A. Experience-based learning: a model linking the processes and outcomes of medical students' workplace learning. Med Educ. 2007;41:84–91. doi: 10.1111/j.1365-2929.2006.02652.x.
  14. Ogur B, Hirsh D, Krupat E, Bor D. The Harvard Medical School-Cambridge integrated clerkship: an innovative model of clinical education. Acad Med. 2007;82:397–404. doi: 10.1097/ACM.0b013e31803338f0.
  15. Pangaro L. A new vocabulary and other innovations for improving descriptive in-training evaluations. Acad Med. 1999;74:1203–7. doi: 10.1097/00001888-199911000-00012.
  16. Holmboe ES, Hawkins RE, editors. Practical guide to the evaluation of clinical competence. Philadelphia, PA: Mosby; 2008.
  17. Strauss AL, Corbin J. Basics of qualitative research: grounded theory procedures and techniques. Newbury Park, CA: Sage; 1990.
  18. Miles MB, Huberman AM. Qualitative data analysis. 2nd ed. Thousand Oaks, CA: Sage; 1994.
  19. Irby DM, Ramsey PG, Gillmore GM, Schaad D. Characteristics of effective clinical teachers of ambulatory care medicine. Acad Med. 1991;66:54–5. doi: 10.1097/00001888-199101000-00017.
  20. Black P, Wiliam D. Assessment and classroom learning. Assess Educ Princ Policy Pract. 1998;5:7–74.
  21. Duvivier RJ, van Dalen J, van der Vleuten CP, Scherpbier AJ. Teacher perceptions of desired qualities, competencies and strategies for clinical skills teachers. Med Teach. 2009;31:634–41. doi: 10.1080/01421590802578228.
  22. Archer JC. State of the science in health professional education: effective feedback. Med Educ. 2010;44:101–8. doi: 10.1111/j.1365-2923.2009.03546.x.
  23. Brydges R, Dubrowski A, Regehr G. A new concept of unsupervised learning: directed self-guided learning in the health professions. Acad Med. 2010;85:S49–55. doi: 10.1097/ACM.0b013e3181ed4c96.
  24. Batalden P, Leach D, Swing S, Dreyfus H, Dreyfus S. General competencies and accreditation in graduate medical education: an antidote to overspecification in the education of medical specialists. Health Aff (Millwood). 2002;21:103–11. doi: 10.1377/hlthaff.21.5.103.
  25. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39:753. doi: 10.1111/j.1365-2929.2004.01972.x.
  26. Irby DM. Three exemplary models of case-based teaching. Acad Med. 1994;69:947–53. doi: 10.1097/00001888-199412000-00003.
  27. Bowen JL. Educational strategies to promote clinical diagnostic reasoning. N Engl J Med. 2006;355:2217–25. doi: 10.1056/NEJMra054782.
  28. Alweshahi Y, Cook D. Domains of effective teaching process: students' perspectives in two medical schools. Med Teach. 2009;31:125–30. doi: 10.1080/01421590802572742.
  29. Hauer KE, Teherani A, Irby DM, Kerr KM, O'Sullivan PS. Approaches to medical student remediation after a comprehensive clinical skills examination. Med Educ. 2008;42:104–12. doi: 10.1111/j.1365-2923.2007.02937.x.
  30. Li ST, Paterniti DA, Co JP, West DC. Successful self-directed lifelong learning in medicine: a conceptual model derived from qualitative analysis of a national survey of pediatric residents. Acad Med. 2010;85:1229–36. doi: 10.1097/ACM.0b013e3181e1931c.
  31. Teherani A, O'Brien BC, Masters DE, Poncelet AN, Robertson PA, Hauer KE. Burden, responsibility, and reward: preceptor experiences with the continuity of teaching in a longitudinal integrated clerkship. Acad Med. 2009;84:S50–3. doi: 10.1097/ACM.0b013e3181b38b01.
  32. Cox SM. "Forward feeding" about students' progress: information on struggling medical students should not be shared among clerkship directors or with students' current teachers. Acad Med. 2008;83:801. doi: 10.1097/ACM.0b013e318181cfe6.
  33. Shepard L. The role of assessment in a learning culture. Educ Res. 2000;29:4–14.
  34. Cooke M, Irby DM, O'Brien BC. Educating physicians: a call for reform of medical school and residency. San Francisco, CA: Jossey-Bass/Carnegie Foundation for the Advancement of Teaching; 2010.
  35. Holmboe ES, Hawkins RE, Huot SJ. Effects of training in direct observation of medical residents' clinical competence: a randomized trial. Ann Intern Med. 2004;140:874–81. doi: 10.7326/0003-4819-140-11-200406010-00008.
  36. van de Ridder JM, Stokking KM, McGaghie WC, ten Cate OT. What is feedback in clinical education? Med Educ. 2008;42:189–97. doi: 10.1111/j.1365-2923.2007.02973.x.
