BMJ Open Qual. 2017 Oct 31;6(2):e000096. doi: 10.1136/bmjoq-2017-000096

Implementation of a mock root cause analysis to provide simulated patient safety training

Martina Murphy 1, Jennifer Duff 1, Julie Whitney 2, Benjamin Canales 3, Merry-Jennifer Markham 4, Julia Close 1
PMCID: PMC5699135  PMID: 29450282

Abstract

Background

The proposed revision to the Accreditation Council for Graduate Medical Education (ACGME) Common Program Requirements includes participation in real or simulated patient safety activities, such as root cause analysis (RCA).1 Because exposure to RCA may occur with low frequency, a mock RCA was developed and piloted for feasibility with haematology/oncology fellows.

Objective

To improve trainee knowledge of the goals and application of RCA in patient safety and quality improvement through a simulated experience.

Methods

A mock RCA was implemented with haematology/oncology fellows in two consecutive years. In small groups, fellows reviewed a case involving an adverse event and identified sources of harm. Additional details, in the form of provider interviews, were available on request. Trainees identified the root cause(s) and proposed measurable changes. Teams presented their proposals to peers and to a panel representing hospital leadership, which provided feedback. Trainees completed evaluations and were surveyed regarding their perceptions.

Results

Thirteen of 15 (87%) fellows completed the survey. Twelve of 13 (92%) felt the mock RCA improved their comfort level for participation in a real RCA. Ten of 13 (77%) reported increased awareness of, and likelihood of reporting, near misses and/or adverse events following participation. Eight of 13 (62%) reported more thorough patient care documentation following the session.

Conclusion

A pilot trial of a mock RCA with haematology/oncology fellows achieved high trainee satisfaction. Postsession surveys and informal interviews suggest that trainees may have reduced anxiety when faced with participation in a real RCA and greater interest in the process after participation.

Keywords: continuous quality improvement, graduate medical education, medical education, near miss, quality improvement


The proposed revision to the Accreditation Council for Graduate Medical Education (ACGME) Common Program Requirements includes ‘participation in real and/or simulated interprofessional clinical site-sponsored patient safety activities, such as root cause analysis (RCA)’.1 RCA is a focused review process that identifies the system-related and process-related factors that allowed an adverse event or near miss to occur, and uses this information to improve quality and safety. While this approach has been used for decades in industrial fields such as manufacturing, its potential is increasingly recognised in the healthcare setting.

In some medical specialties, RCAs may occur with low frequency, limiting trainee exposure. Further, while promoting trainee participation in patient safety and quality improvement activities, such as RCA, is important, there are few data available on optimal educational strategies for teaching these topics. In recent years, the ACGME has placed increasing emphasis on simulation-based training.2 Simulation has traditionally been explored in procedure-based specialties such as surgery as a means of improving technical skills. However, there are data to suggest that simulation may also be an effective method for teaching skills such as working in multidisciplinary care teams or physician–patient interaction.3 4

Trainees completing our haematology/oncology fellowship programme have had inconsistent opportunities to participate in institutional RCAs. To address this need, we developed a mock RCA to simulate this quality improvement and error analysis experience and, in doing so, improve trainee familiarity with and participation in the RCA process.

Objective

To improve trainee knowledge of the goals and application of RCA in patient safety and quality improvement through a simulated experience. While our primary intent was to implement this mock RCA with our haematology/oncology fellowship, the curriculum was developed to be readily exportable to learners at varying levels of training.

Methods

A mock RCA was developed and implemented in two consecutive years (2015 and 2016) with haematology/oncology fellows. The mock RCA curriculum met institutional quality improvement criteria and was therefore exempt from institutional review board review. The implementation occurred over two 1-hour sessions.

Prior to the first session, fellows were provided with a case involving a postoperative bleeding complication in a patient with a missed diagnosis of coagulopathy. To simulate a real-world experience, the case was provided as a series of charted progress notes from the electronic medical record. Fellows were also given an informational handout detailing the steps and criteria for conducting an RCA (table 1) and a PowerPoint slide deck prepared by the VA National Center for Patient Safety on the topic of RCA; they were asked specifically to review the slides describing and giving examples of flow mapping (see online supplementary appendix A). Fellows were instructed to review this material in advance of the session.

During the first session, a 15 min didactic lesson further oriented the learners to the utility and logistics of the RCA process. In particular, they learnt how safety assessment codes apply to the sentinel adverse event necessitating RCA and discussed the grading for the case provided. Working in three groups of four to five fellows, they then identified potential sources of patient harm and practised flow mapping, a method by which learners create a visual explanation of why an event occurred by connecting effects with their individual cause(s). Additional case-related information, specifically subspecialty physician and nursing opinions (‘interviews’), was available on request. These ‘interviews’ were provided to the teams as brief notes preselected to include the information from the specialties or ancillary staff that would typically be interviewed in a true RCA. For example, many groups requested additional information from the surgical team and were provided an ‘interview’ from the attending surgeon with details regarding the patient’s surgical care.
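For readers who want to see flow mapping made concrete, the following is a minimal sketch, in Python, of a flow map represented as a directed graph, with candidate root causes surfacing as nodes that have no recorded upstream cause. The class, method names and event chain are our own illustrative inventions that loosely mirror the pilot case; they are not the actual curriculum materials.

    from collections import defaultdict

    class FlowMap:
        """Directed cause-and-effect graph: each effect is linked to its cause(s)."""

        def __init__(self):
            self.causes_of = defaultdict(set)  # effect -> set of contributing causes
            self.nodes = set()

        def link(self, cause, effect):
            """Record that `cause` contributed to `effect`."""
            self.causes_of[effect].add(cause)
            self.nodes.update({cause, effect})

        def candidate_root_causes(self):
            """Nodes with no recorded upstream cause: starting points for discussion."""
            return {n for n in self.nodes if not self.causes_of[n]}

    # Hypothetical cause-effect chain for a postoperative bleed (illustrative only).
    fm = FlowMap()
    fm.link("no preoperative lab review checklist", "abnormal coagulation labs not flagged")
    fm.link("abnormal coagulation labs not flagged", "coagulopathy missed preoperatively")
    fm.link("coagulopathy missed preoperatively", "postoperative bleeding event")

    print(fm.candidate_root_causes())  # -> {'no preoperative lab review checklist'}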

Table 1.

Main steps in root cause analysis (RCA)*

Root cause analysis step | Process description
Identify the event | Institutions should have in place a process for selecting which events undergo RCA
Select team members | Include people with personal knowledge of the processes and systems involved in the event to be investigated
Describe what happened | Collect facts surrounding the event to understand what happened
Identify contributing factors | Identify circumstances that increased the likelihood of the event
Identify root causes | Analyse contributing factors to identify the underlying process and system issues (root causes) of the event
Design and implement changes to eliminate the root causes | The team determines how best to change processes and systems to reduce the likelihood of another similar event
Measure success of changes | Measure whether the implemented changes succeeded

* Adapted from: Guidance for Performing Root Cause Analysis. https://www.cms.gov/medicare/provider-enrollment-and-certification/qapi/downloads/guidanceforrca.pdf (accessed 15 November 2016).
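The table reads as an ordered pipeline, which a short sketch can make concrete. The Python fragment below (our own hypothetical illustration, not part of the CMS guidance or the curriculum) encodes the steps as a checklist and returns the next step a team should tackle.

    # The RCA steps from table 1 as an ordered checklist (illustrative only).
    RCA_STEPS = [
        "Identify the event",
        "Select team members",
        "Describe what happened",
        "Identify contributing factors",
        "Identify root causes",
        "Design and implement changes to eliminate the root causes",
        "Measure success of changes",
    ]

    def next_step(completed):
        """Return the first step not yet completed, or None when the RCA is done."""
        return next((s for s in RCA_STEPS if s not in completed), None)

    print(next_step({"Identify the event"}))  # -> 'Select team members'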

Supplementary file 1

bmjoq-2017-000096supp001.pptx (3.2MB, pptx)

For the second 1-hour session, teams reconvened to define the root cause(s), identify appropriate actions and suggest measurable outcomes (see online supplementary appendix B). After determining a root cause, each team presented its proposed actions to peers and to a panel of educational faculty (authors MM, JD, JC) representing hospital leadership. A discussion followed between the teams and the leadership panel regarding the resources needed to implement the suggested changes and the potential impact of those changes. The panel provided feedback on resource utilisation and the anticipated efficacy of the presented action plans. Evaluations were obtained from the fellows following the session, and fellows were anonymously surveyed via Qualtrics (Qualtrics, Provo, Utah, USA) using Likert scale questions about their perceptions.
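To make the expected output of this second session concrete, the sketch below shows one hypothetical way to record each root cause alongside its proposed action and outcome measure. The field names are our own, and the example entry loosely mirrors the preoperative lab checklist proposal noted in the Results; none of this is the teams' actual work product.

    from dataclasses import dataclass

    @dataclass
    class ActionPlanItem:
        root_cause: str       # underlying system or process issue
        action: str           # proposed change addressing the root cause
        outcome_measure: str  # how success of the change will be measured

    plan = [
        ActionPlanItem(
            root_cause="No standard preoperative review of coagulation labs",
            action="Introduce a preoperative lab checklist",
            outcome_measure="Percentage of surgical patients with documented lab review",
        ),
    ]

    for item in plan:
        print(f"{item.root_cause} -> {item.action} (measured by: {item.outcome_measure})")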

Supplementary file 2

bmjoq-2017-000096supp002.pdf (53KB, pdf)

Results

Teams arrived at RCA solutions with similar themes, centred on the appropriate ordering and follow-up of laboratory tests in the preoperative setting. The facilitators felt that each team developed appropriate plans and outcome measures that would be reasonable in a real-time RCA (eg, a preoperative lab checklist). Thirteen of 15 (87%) fellows completed the postsession survey. Eleven (85%) fellows reported feeling comfortable participating in a real RCA after attending the session, and only one (8%) fellow felt that the mock RCA did not improve their comfort level with regard to participation in a real RCA. Ten (77%) fellows reported that, as a result of participation in the mock RCA, they were more aware of near misses and/or adverse events; a similar number indicated they were more likely to report adverse events after participation. Eight (62%) fellows reported that the mock RCA led them to document patient interactions more thoroughly. Eleven (85%) fellows felt that all haematology/oncology fellows should participate in a mock RCA. The RCA simulation sessions received overall evaluation scores similar to those of the other conferences in the general fellowship curriculum.

Shortly after the mock RCA, several of the fellows had an opportunity to participate in a real RCA. When informally queried, these fellows reported decreased anxiety about the process and increased interest in participating, which they attributed to their prior mock RCA training.

Discussion

Development and implementation of a mock RCA as an educational tool was well received by our trainees. While our case was based on a true patient event, a mock RCA gives the instructor the flexibility to craft a scenario tailored to their educational objectives and field of study. Although the primary objective of this pilot was to educate trainees on concepts of quality improvement through simulation, we were also able to teach specialty-specific material relevant to our learners in a novel format. Implementation of a mock RCA curriculum was highly feasible and required minimal resources aside from time and a conference room. Based on our initial experience and the feasibility demonstrated, we are currently developing additional mock RCA sessions.

One major limitation of this pilot is the small sample size, as our fellowship programme contains 15 fellows. However, following the success of our mock RCA, other departments at our institution have adopted this teaching format: it is now used in a fourth-year medical student course as well as in two residency programmes. More widespread use of the mock RCA will allow more rigorous study of its utility as an educational tool.

Additionally, the sessions occurred at the beginning of the academic year, during a 2.5-month orientation block. Fellows were surveyed following the block series and thus may have been subject to recall bias.

Finally, assessment of our mock RCA curriculum was limited to learner self-confidence and comfort; RCA-specific knowledge and the curriculum’s impact on behaviour were not assessed.

Conclusion

A pilot trial of a mock RCA with haematology/oncology fellows achieved high trainee satisfaction. A postsession survey and informal interviews suggest that trainees may have reduced anxiety when faced with participation in a real RCA and greater interest in the process after participation. Using a simulated experience through a mock RCA offers a feasible way for learners at varying training levels to learn important aspects of quality-related patient care while fulfilling ACGME Common Program Requirements.

Footnotes

Contributors: MM, JD, JC and JW planned and delivered the curriculum. BC provided urology-specific content. MM and JC conducted the survey. MJM provided editorial review. MM submitted the final manuscript.

Competing interests: None declared.

Provenance and peer review: Not commissioned; externally peer reviewed.

References
