Journal of Graduate Medical Education. 2015 Jun;7(2):275–276. doi: 10.4300/JGME-D-14-00709.1

Mapping Quality Improvement and Safety Education to Drive Change Across Training Programs

Anjala V. Tess, Carlo Rosen, Carrie Tibbles
PMCID: PMC4512808  PMID: 26221453

Setting and Problem

The Accreditation Council for Graduate Medical Education sets expectations for institutions to provide oversight of residency and fellowship programs in quality improvement (QI) and patient safety (PS) through the Clinical Learning Environment Review (CLER) program. Meeting these expectations can prove challenging for graduate medical education leaders as they strive to build momentum for curricular change across large institutions.

Beth Israel Deaconess Medical Center is an academic center in Boston, Massachusetts, with more than 40 accredited programs. Our QI/PS offerings vary: Many programs are high performers, others are strong in 1 area, and a few lack curricula. We wanted high performers to continue to flourish while also supporting those that needed help. We created a standardized mapping tool to guide this work. Our goal was to provide a snapshot of the training programs at Beth Israel Deaconess Medical Center and to use the tool both to remedy gaps and to keep our institutional leadership informed.

Intervention

In 2012, we began interviewing program directors to catalog activities that integrated trainees into our quality and safety mission: didactic training, QI/PS programs, and root cause analysis/QI project work. Because we were interested in trainee engagement, our tool aimed to map active work in both QI and PS. We interviewed each program director to learn how many trainees participated in the design and/or analysis of a QI/PS problem or intervention at least once during training. Programs were placed in 1 of 3 tiers: no hands-on work, hands-on work for some trainees, or 100% of trainees engaged. We used neither number of hours nor project success as criteria; instead we looked for residents actively engaged in analyzing QI/PS issues and designing solutions, not just in implementing institutional remedies. For safety, we explored whether trainees analyzed an adverse event at least once during training and again placed programs in 1 of 3 tiers. We also assessed whether educational training was offered as outlined by CLER expectations.
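For readers who wish to operationalize the tiering, the sketch below shows one way to capture it as a small data model. This is a minimal illustration in Python; the record structure and names (ProgramRecord, tier_from_fraction, the fraction fields) are our assumptions for exposition, not part of the published tool.

    # A minimal sketch of the 3-tier classification described above.
    # ProgramRecord and tier_from_fraction are hypothetical names,
    # not part of the published mapping tool.
    from dataclasses import dataclass

    def tier_from_fraction(fraction_engaged: float) -> int:
        """0 = no hands-on work, 1 = some trainees, 2 = 100% of trainees."""
        if fraction_engaged <= 0.0:
            return 0
        return 2 if fraction_engaged >= 1.0 else 1

    @dataclass
    class ProgramRecord:
        name: str               # letter for a core program, number for a fellowship
        is_core: bool
        n_trainees: int         # drives box size on the grid
        qi_fraction: float      # trainees who designed/analyzed a QI intervention
        ps_fraction: float      # trainees who analyzed an adverse event
        offers_didactics: bool  # basic QI/PS teaching per CLER expectations

        @property
        def qi_tier(self) -> int:
            return tier_from_fraction(self.qi_fraction)

        @property
        def ps_tier(self) -> int:
            return tier_from_fraction(self.ps_fraction)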

We mapped every program onto a 3 × 3 block grid (figure). To visually incorporate the need for educational training, we placed programs without basic teaching below an arc line within their box. Our map included core programs (figure, boxed letters) and fellowships (figure, boxed numbers), with box sizes adjusted for number of trainees. Numbers alone represented programs with fewer than 5 trainees. For example, in figure A, program B is shown as a mid-sized core program with hands-on quality and safety work and didactics for all trainees, whereas program 16 is a small fellowship that offers hands-on safety work for all fellows, active QI for some, and no training in basic principles. Each initial interview took approximately 1 hour and was conducted by the associate designated institutional official. We later incorporated the questions into our annual review process to allow for regular updates, spending approximately 1 additional hour per quarter updating the grid.
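Continuing the sketch above, the grid itself can be rendered programmatically. The matplotlib layout below is a minimal illustration under our own assumptions (within-cell jitter, the sizing constant, and omission of the arc-line didactics marker); it is not a reconstruction of the published figure.

    # A minimal sketch of plotting ProgramRecords on the 3 x 3 grid.
    # Jitter and sizing constants are arbitrary; the arc-line marker for
    # programs lacking didactics is omitted for brevity.
    import math
    import matplotlib.pyplot as plt
    from matplotlib.patches import Rectangle

    def plot_grid(programs: list) -> None:
        fig, ax = plt.subplots(figsize=(6, 6))
        for i, prog in enumerate(programs):
            # Each cell is indexed by (QI tier, safety tier); offset
            # programs slightly so they do not overlap within a cell.
            x = prog.qi_tier + 0.25 + 0.2 * (i % 3)
            y = prog.ps_tier + 0.25 + 0.2 * ((i // 3) % 3)
            if prog.n_trainees >= 5:
                side = 0.05 * math.sqrt(prog.n_trainees)  # box tracks cohort size
                ax.add_patch(Rectangle((x, y), side, side, fill=False))
                ax.text(x + side / 2, y + side / 2, prog.name,
                        ha="center", va="center")
            else:
                ax.text(x, y, prog.name)  # number alone: fewer than 5 trainees
        ax.set_xticks([0.5, 1.5, 2.5])
        ax.set_xticklabels(["No hands-on QI", "Some trainees", "All trainees"])
        ax.set_yticks([0.5, 1.5, 2.5])
        ax.set_yticklabels(["No hands-on\nsafety", "Some trainees", "All trainees"])
        ax.set_xlim(0, 3)
        ax.set_ylim(0, 3)
        ax.grid(True)
        plt.show()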

FIGURE

Mapping of Curricular Activity in Quality Improvement and Safety

Abbreviation: QI, quality improvement.

A) Spring 2013. B) Fall 2014. Core programs are shown as letters, fellowships as numbers. Squares are sized according to the number of trainees; numbers alone represent programs with fewer than 5 trainees. Arc lines within each section separate programs that do not offer didactic training within that level of hands-on work.

Outcomes to Date

We began using our grid in 2013 (figure A). Consistent with our culture, many programs offered superior experiences in patient safety, but experiences in quality appeared to lag. Sixteen percent of programs (encompassing 32% of trainees) mapped to the top right box. We have shared the deidentified grid at graduate medical education, hospital QI leadership, and senior hospital leadership meetings. We revealed identifiers to individual program directors both to recognize successes and to help identify gaps. We offered coaching to support programs and repeated this process every 6 months. As figure B shows, we were able to move several core programs closer to the ideal. Examples include 1 program starting 4 project streams with its QI director and another gathering data across all residents for use in QI projects. Several added didactic training. Response from program directors has been positive: They see their program compared with those of peers, and the grid helps identify concrete tasks needing improvement. The follow-up map shows 31% of all programs (representing 54% of trainees) in our top box. Smaller fellowships have continued to struggle, so we are piloting a centralized fellow training program in 1 department.
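The top-box summary reported above is straightforward to recompute at each update. A minimal sketch, reusing the hypothetical ProgramRecord from earlier:

    # Share of programs, and of trainees, mapping to the top right box
    # (tier 2 in both QI and safety). Reuses the hypothetical ProgramRecord.
    def top_box_summary(programs: list) -> tuple:
        top = [p for p in programs if p.qi_tier == 2 and p.ps_tier == 2]
        pct_programs = 100.0 * len(top) / len(programs)
        pct_trainees = (100.0 * sum(p.n_trainees for p in top)
                        / sum(p.n_trainees for p in programs))
        return pct_programs, pct_trainees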

As a simple tool to stimulate change in our QI and safety efforts, our grid provides our leaders with a snapshot of the current state and a targeted needs assessment; it continues to help us set goals as an institution.

