MedEdPORTAL: The Journal of Teaching and Learning Resources
. 2016 Aug 26;12:10445. doi: 10.15766/mep_2374-8265.10445

Teaching Clinical Reasoning to Medical Students: A Case-Based Illness Script Worksheet Approach

Michael Levin 1,*, David Cennimo 2, Sophia Chen 3, Sangeeta Lamba 4,5
PMCID: PMC6464440  PMID: 31008223

Abstract

Introduction

Clinical reasoning is a fundamental part of a physician's daily workflow, yet it remains a challenging skill to teach formally, especially to early, preclerkship-level learners. Traditionally, medical students learn clinical reasoning informally through experiential opportunities during their clerkship years, in contrast to the more structured, explicit learning of the basic sciences and physical diagnosis during the preclerkship years. To address this gap, we present a flipped classroom, case-based approach for developing clinical reasoning skills based on problem representation and the use of a structured illness script worksheet as a model.

Methods

Students were given a short introduction via screencast to introduce clinical reasoning and related terminology such as problem representation and semantic qualifiers. They also received a case vignette and an illness script worksheet to prepare them for in-class discussion. Students used this worksheet to practice clinical reasoning in a small-group session that was held in our last organ system–based second-year course, prior to the start of the clerkships.

Results

In comparison to the traditional facilitator-led small-group sessions, in which students sequentially answer a set of defined content-based questions to explore a clinical case, 70% of students preferred the new framework incorporating problem representation and the illness script worksheets. Faculty facilitators found the structure of the illness script worksheet helpful in leading a clinical reasoning small-group session.

Discussion

Based on the results of this pilot, we plan to systematically implement this clinical reasoning framework in our preclerkship curriculum.

Keywords: Decision Making, Clinical Reasoning, Illness Script, Problem Representation

Educational Objectives

By the end of this session, students should be able to:

  1. Understand and use terminology related to clinical reasoning such as problem list, summary statement, and semantic qualifiers.

  2. Synthesize problem representations to generate more specific differential diagnoses using semantic qualifiers (such as acute/chronic, constant/recurrent, young/old, etc.).

  3. Complete an illness script worksheet for a case-based medical scenario.

  4. Practice clinical reasoning skills in a small-group setting.

Introduction

According to the Liaison Committee on Medical Education, clinical reasoning is defined as “the integration, organization, and interpretation of information gathered as a part of medical problem-solving.”1 Recent literature on the topic, along with the Core Entrustable Professional Activities (EPAs) for Entering Residency project, highlights the importance of teaching clinical reasoning.2–15 The second Core EPA proposes that a graduating student be able to prioritize a differential diagnosis following a clinical encounter.2 Furthermore, clinical reasoning is a key component of patient care, a skill employed by physicians on a daily basis. Many educational techniques have been implemented to teach medical students these skills, including problem-based learning, team-based learning, and clinical presentations, among others. Despite the clear need to teach medical students clinical reasoning, no gold standard for developing these skills has emerged. Our goal was to provide students with a structured framework to formalize clinical reasoning skills when discussing clinical cases in the preclerkship phase of medical education.

Problem representation is the process of distilling critical information from a broader clinical scenario using semantic qualifiers, for example, young/old, constant/recurrent, diffuse/localized, mild/severe, and acute/chronic.14,15 Illness scripts are a physician's organization of disease schemas into working memory that may contain elements such as the epidemiology, time course, and pathophysiology of medical conditions.15 The process of explicitly formulating a problem representation and then generating illness scripts through a worksheet may be learned and may help improve clinical reasoning. Literature on the use of such frameworks to teach clinical reasoning using a case-based approach, and on how to prepare facilitators to teach these skills to students, is expanding.5–13 Curricular materials to teach these sessions are emerging but limited, and we hope to add to this library of resources.6,7,10 The case-based approach presented here may also be useful to those who wish to teach components of clinical reasoning, letting learners integrate them later within a broader clinical reasoning curriculum.

We developed a short screencast to introduce clinical reasoning (for both students and faculty facilitators), a framework for analyzing paper-based clinical cases using problem representation and illness script worksheets, and a small-group case for students to practice their clinical reasoning skills. A pilot of this curriculum was implemented at Rutgers New Jersey Medical School as a small-group exercise during the end of the second year. This framework can be implemented at any point during medical school, for either paper-based or simulated patient scenarios, although some knowledge of relevant pathophysiology and physical diagnosis is required.

Methods

The clinical reasoning pilot was conducted among second-year medical students. It was used as a part of the last organ system–based unit (renal) in the spring of the second year, just prior to students entering their clerkship rotations.

In our existing organ system–based cases, the goal was to integrate the basic sciences into a clinical context using a problem-solving approach. This traditional setup provides an organ system–specific clinical case vignette to students with a set pattern. A short, clinical stem is followed by open-ended questions, and the case proceeds until all listed questions are answered. Students receive the case with questions ahead of time and are expected to come to class prepared. In-class discussions consist of a student reading the clinical stem and question set and then the other students building upon the answers. Each case takes between 90 and 120 minutes. A sample clinical case vignette in this traditional format is available as an attachment in our prior MedEdPORTAL publication,16 where we described its setup using an online small-group discussion for e-learning.

Students are assigned to groups of 15. The groups remain unchanged, with the same faculty facilitator, for all six sessions across the organ system units. The facilitator's role is to direct the flow of conversation, guide content-based queries, and ensure all questions are answered. We have traditionally not run these small groups to explicitly teach components of clinical reasoning. Feedback from students and clerkship faculty identified that when students enter their first clerkships, they show wide variability in their clinical reasoning abilities. This curriculum was designed to target this need and to build clinical reasoning skills in the preclerkship years.9 We therefore selected the last unit, the renal system, in the spring of the second year for the pilot.

Clinical Reasoning Pilot

Out of a class of 180 second-year students, approximately two-thirds completed the small-group session using the traditional format. Around one-third of the class (59 students) used the worksheet from this resource. The clinical reasoning case outline was adapted using parameters directly from the traditional case. Three faculty facilitators helped develop the clinical reasoning curriculum. These three faculty members also routinely led the traditional small-group sessions as described above. Therefore, four small groups of students already assigned to these three faculty facilitators were included in the pilot (one faculty member led two small groups on consecutive days).

We allotted 2 hours for this pilot, the same as the traditional session. We spent approximately 90 minutes on completing the clinical reasoning worksheet and the remaining 30 minutes ensuring that we covered any remaining content from the traditional clinical case questions. This was done to ensure that no students were negatively impacted or missed key content by participating in the pilot.

Before the Small-Group Session

Students were emailed a short screencast (Appendix A) 1 week before each small-group clinical reasoning session. A slide-set version (Appendix B) was also available. The screencast introduced the students to clinical reasoning and associated terminology such as problem representation and semantic qualifiers. Additionally, it familiarized the students with the illness script worksheet. They also received a copy of the traditional student handout for the edema case (Appendix D). The email clearly stated that this exercise could not be completed without viewing the screencast first. However, we did not track whether students actually watched the screencast; use of an educational platform may allow for this tracking. Anecdotally, each group's members reported they had watched the screencast and were familiar with the terms prior to the start of the session, so in our experience, a majority of students came prepared.

Facilitators also watched the same clinical reasoning screencast 1 week prior to the clinical reasoning exercise to familiarize themselves with the terminology and format. Four days prior to the exercise, facilitators met to work through the edema faculty guide (Appendix C) and to answer any questions about implementing the exercise (15–20 minutes). We recommend the use of clinical faculty to lead this exercise. The facilitators in our pilot were seasoned faculty who lead the traditional sessions regularly. They were also familiar with the case as they had helped develop the clinical reasoning material. We feel that if this exercise is to be delivered by novice facilitators, more time should be allotted to cover responsibilities. We recommend working through a sample worksheet to demonstrate flow (recommended time: 30–45 minutes).

During the Small-Group Session

During the session, students completed the illness script worksheet. Flow of the session included the following:

  • Introduction (5 minutes).

  • Exploring relevant history and review of systems (20 minutes).

  • Physical examination (15 minutes).

  • Generating summary statement and initial differential diagnoses (20 minutes).

  • Diagnostic studies, rationale, reevaluation, and rank ordering the differential diagnoses (20 minutes).

  • Debrief and remaining queries (10 minutes).

We used a group-discussion format in which the students worked as a team to complete the illness script worksheet, with a student volunteer acting as scribe to record responses on the whiteboard. We found that this group-discussion format worked well. It was interesting to note that in one group, the facilitator asked students to rank order the list of differential diagnoses for relevance. This naturally generated smaller teams, which led to healthy debate as each team wanted to move its differential choice to the top of the list. We plan to use this strategy for future sessions. Alternatively, this exercise may also be done in a stepwise approach, where each student first individually fills in a section, refines it by sharing with a partner, and then discusses it with the group. This may offer the advantage of engaging every student. Finally, we asked students to propose further imaging or laboratory studies that might be necessary to confirm or refute the most likely diagnosis.

Since these sessions are held in the preclerkship years, there will be times when the differential diagnoses cover topics that students may not yet have been exposed to. We suggest that learners in such cases be encouraged to use other information resources, such as internet-based literature searches, while in session to complete this work (with faculty guidance). This may help reinforce the value of evidence-based medicine to support lifelong learning.

Feedback was provided throughout the session by the facilitator. Facilitators added information missing from the problem representation or student-generated illness script worksheets, highlighting key and differentiating features of the possible diagnoses. The main role of the facilitator was to guide the student discussion and engage all students, as well as debrief after the session for remaining questions or concerns.

Following the clinical reasoning exercise, all of the small groups then proceeded through the traditional question-and-answer content-based exercise for the renal unit for reasons mentioned above.

Results

A total of 59 students participated in the pilot clinical reasoning exercise. We asked participants to assess the resource via a survey using a 5-point Likert scale, where 1 = strongly disagree and 5 = strongly agree (Appendix E). Overall, 96% of respondents agreed or strongly agreed that the illness script worksheet format was valuable for working through clinical reasoning, with a mean response of 4.5 out of 5 (see the Table). A majority (80%) also agreed or strongly agreed that they found the clinical reasoning format useful. Similarly, 70% agreed or strongly agreed that they preferred this illness script worksheet format over traditional small groups. Most students (88%-89%) also agreed or strongly agreed with the statements regarding their understanding of problem representation and illness script concepts.

Table. Student Survey Responses (N = 59).

| Item | Strongly Disagree (%) | Disagree (%) | Neutral (%) | Agree (%) | Strongly Agree (%) | M^a | SD |
|---|---|---|---|---|---|---|---|
| This format was valuable for working through clinical reasoning. | 0.00 | 1.69 | 1.69 | 37.29 | 59.32 | 4.54 | 0.62 |
| The facilitators effectively conducted this exercise. | 0.00 | 0.00 | 6.78 | 22.03 | 71.19 | 4.64 | 0.61 |
| I understand the illness script framework. | 0.00 | 1.72 | 8.62 | 65.52 | 24.14 | 4.12 | 0.62 |
| I understand the concept of problem representation. | 0.00 | 1.69 | 10.17 | 66.10 | 22.03 | 4.08 | 0.62 |
| I feel comfortable solving clinical problems. | 0.00 | 1.69 | 15.25 | 64.41 | 18.64 | 4.00 | 0.64 |
| I found this small-group format useful. | 0.00 | 1.69 | 18.64 | 33.90 | 45.76 | 4.24 | 0.82 |
| I prefer this format over traditional small groups. | 0.00 | 5.08 | 25.42 | 22.03 | 47.46 | 4.12 | 0.97 |
| I have the skills to solve clinical problems. | 0.00 | 5.08 | 28.81 | 59.32 | 6.78 | 3.68 | 0.68 |
| I feel comfortable developing a differential diagnosis. | 1.69 | 3.39 | 28.81 | 54.24 | 11.86 | 3.71 | 0.79 |
| I have the skills to summarize a clinical situation. | 0.00 | 5.08 | 30.51 | 52.54 | 11.86 | 3.71 | 0.74 |
| I have the skills to develop a differential diagnosis. | 0.00 | 1.69 | 33.90 | 54.24 | 10.17 | 3.73 | 0.67 |
| I feel comfortable summarizing a clinical situation. | 0.00 | 6.78 | 33.90 | 50.85 | 8.47 | 3.61 | 0.74 |

^a On a scale of 1 (strongly disagree) to 5 (strongly agree).
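As a quick arithmetic check, the means and standard deviations reported in the Table can be recomputed from the printed percentage distributions. The sketch below does this for the first item; it assumes the percentages are as printed (normalizing for rounding drift) and a population-style SD, which matches the published values after rounding.

```python
# Recompute a Likert mean and SD from a percentage distribution.
# Percentages are as printed in the Table for "This format was valuable
# for working through clinical reasoning" (strongly disagree ... strongly agree).
pcts = [0.00, 1.69, 1.69, 37.29, 59.32]
scores = [1, 2, 3, 4, 5]

total = sum(pcts)  # ~100; normalize to absorb rounding of the printed percentages
mean = sum(p * s for p, s in zip(pcts, scores)) / total
var = sum(p * (s - mean) ** 2 for p, s in zip(pcts, scores)) / total
sd = var ** 0.5

print(round(mean, 2), round(sd, 2))  # matches the Table's 4.54 and 0.62
```

The same computation applied to the other rows reproduces each reported mean and SD, confirming the Table's internal consistency.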

Students’ open-ended comments about the clinical reasoning exercise were also positive. Common strengths highlighted by students included increased use of logic, encouragement to think like a doctor, feelings of realism, and increased relevance to clinical medicine. Here are representative comments about strengths of the exercise:

  • “This format was more in line with problem solving and diagnosis in real situations. I felt that it pushes us to think like physicians.”

  • “More intuitive, more logical, more realistic.”

  • “Makes you logically think through potential differential diagnoses and reason without being given answers.”

  • “Learning to think through a case as if we were presented with a real patient.”

  • “Teaches you how to reason through a differential diagnosis.”

Students also noted some weaknesses of the clinical reasoning format. Following the exercise, the mean response to “I feel comfortable summarizing a clinical situation” was 3.61 out of 5 (see the Table). That result may reflect the fact that this was the first time students were explicitly required to do so. Student comments mentioned that the exercise covered a broader range of information in less detail than our traditional small-group exercise, where the focus is on content, with more detail on the organ system being covered. For example, the traditional case in the renal block would focus on edema from renal causes and expand on questions related to nephrotic syndrome (its causes, diagnosis, tests, and so on), whereas the clinical reasoning case requires integration of many other organ systems, such as considering congestive heart failure, cor pulmonale, cirrhosis, and lymphedema. Students also expressed concern about knowledge and confidence gaps. This was probably because students who had mastered basic science content (and who performed very well in the traditional content-based small groups) may have felt less confident when the focus shifted to clinical reasoning. Finally, some students who preferred the format suggested it be implemented earlier in the medical school curriculum to better guide their reasoning as they learn each subject.

We debriefed with the faculty facilitators after sessions. Faculty felt that the exercise ran smoothly. They felt that the open-ended format of the reasoning exercise facilitated increased student participation, with active problem solving and discussion throughout the session.

Discussion

We developed a pilot curriculum for teaching clinical reasoning to medical students using problem representation and illness script worksheets. This framework was easy to implement and well received by preclerkship students.

There are a few advantages of this format in comparison to the traditional small group described above. The reasoning exercise requires minimal preparation on the part of students outside of their traditional coursework. The open-ended format relies on student input for moving the discussion forward, which encourages active participation, whereas the traditional small group proceeds along a preset course of discussion. The reasoning framework implemented during this pilot is also universal and can be applied in any clinical situation throughout the preclerkship and clerkship years.

Despite the strong positive feedback from students, this study has some limitations. Only a portion of the second-year class participated in this exercise. In the future, we plan to expand this format to the entire class. Although students enjoyed learning and practicing clinical reasoning, we did not formally assess these skills before and after the clinical reasoning exercise. Though students reported they understood the concepts of problem representation and illness scripts, we cannot definitively conclude that the exercise was successful in actually developing these skills. Similarly, whether providing students with clinical reasoning skills early in medical school will actually improve their clinical bedside reasoning and performance during clerkship experiences remains unclear.

To address students’ concern that the clinical reasoning exercise does not provide the same level of content detail as the traditional content-based small groups, we propose a few solutions. One possibility would be to run the clinical reasoning exercise followed by the traditional question-based session, as we did in this pilot. This approach may be time consuming, but it ensures that students develop both reasoning and knowledge. Another possibility would be to incorporate the traditional small-group information into another part of the curriculum. Finally, the clinical reasoning framework could be adapted to incorporate more detail into the clinical vignettes, and facilitators could spend more time exploring each component of the possible diagnoses. To address concerns about students’ knowledge and confidence gaps in small-group situations, we propose implementing this framework earlier in medical school and providing frequent practice. Earlier exposure to this framework may help students understand the level of knowledge required for sessions, as well as give them practice working on clinical reasoning skills together in small groups.

In addition to the edema case that we piloted, we have developed cardiovascular, pulmonary, and neurologic cases using the process described in the Methods section above. Given the positive feedback from this pilot, we plan to further develop and implement these cases throughout the preclerkship organ system–based curriculum.

Appendices

A. Clinical Reasoning Screencast.mp4

B. Clinical Reasoning Powerpoint.pptx

C. Edema - Facilitator.docx

D. Edema - Student.docx

E. Student Evaluation.docx

mep-12-10445-s001.zip (10.5MB, zip)

All appendices are peer reviewed as integral parts of the Original Publication.

Disclosures

None to report.

Funding/Support

None to report.

Ethical Approval

Reported as not applicable.

References



Articles from MedEdPORTAL : the Journal of Teaching and Learning Resources are provided here courtesy of Association of American Medical Colleges
