Journal of Graduate Medical Education. 2017 Jun;9(3):381–382. doi: 10.4300/JGME-D-16-00752.1

Creating Provider-Level Quality Reports for Residents to Improve the Clinical Learning Environment

Rebecca Jaffe 1, Gretchen Diemer 2, John Caruso 3, Matthew Metzinger 4
PMCID: PMC5476400  PMID: 28638529

Background

Access to personal performance data is hypothesized to drive engagement in quality improvement and lead to improved quality of care. In the Common Program Requirements, the Accreditation Council for Graduate Medical Education (ACGME) sets an expectation that residents receive specialty-specific data on quality metrics. During the Clinical Learning Environment Review, the level of specificity of such data is assessed, with the goal that residents receive feedback on their individual performance. Individual performance data are often easier to collect in the outpatient setting, where patient care is more closely attributed to individual providers; individual inpatient quality data are more difficult to acquire because attribution is complicated and data collection strategies focus on units and departments rather than providers and teams. On our institution's most recent ACGME survey, a mean of 64% of residents in core programs reported access to their own performance data, indicating room for improvement.

Intervention

Through a partnership with our performance improvement department, we aimed to design a resident quality and safety report for all core programs, with the following criteria for success:

  1. each residency program must be able to define its own attribution strategy so that subsequent review will be meaningful to learners;
  2. reports must include individual performance for each metric, as well as patient identifiers, so that learners can review their performance at the case level;
  3. existing performance analysis infrastructure should be used to keep the process resource neutral; and
  4. metrics must align institutional and educational priorities.

We created a mandatory field in our electronic health record's discharge order in which the discharging resident is specified by his or her unique institutional identifier. This field ties the selected individual to the administrative data for a given patient case. Each program director defined an attribution strategy, and residents were educated at the program level. Using the same process that our institution employs to prepare faculty Ongoing Professional Practice Evaluations (OPPEs), a Joint Commission requirement, our performance improvement department abstracted inpatient provider-level quality data for residents.

Outcomes

Resident OPPE reports have been prepared for 16 core programs and include average length of stay, 30-day readmission rate, complications of care, and average variable cost. Patient identifiers are included for all 30-day readmissions. Compliance in completing the “discharge resident” order field ranged from 17% to 99% (mean = 75%). Programs with low program director buy-in demonstrated poor compliance, illustrating the importance of engaging educational stakeholders. Program director concerns included whether OPPE reports can be modified to reflect team-based care models, whether attribution strategies are reliable enough for residents to accept and use their results, and whether residents have the knowledge and skills to apply these reports to personal practice improvement.

Next steps in the implementation of resident OPPE reports will be to define learning objectives that address the knowledge, skills, and attitudes needed to understand and use personal performance data, and to build curricula around those objectives. On a national scale, practicing physicians are asked to reflect on and improve their performance based on similar feedback from employers and payers. We believe that regular, structured access to personal performance data during training is a vital first step toward succeeding in this environment.

Because milestones differ across residency programs, curricula and learning objectives will likely vary somewhat by program. Our continued partnership with performance improvement will allow modification of OPPE reports to meet program needs.

This intervention demonstrates the importance of creating infrastructure for quality improvement and patient safety education across training programs. Partnership with institutional stakeholders can lead to shared resources, collaborative curriculum design, and improved alignment between educators and administrators.

