Abstract
There is a pressing need to provide health professional learners with experiential learning opportunities in health systems science and quality improvement. Fortunately, there are several published tools for diagnosing and treating health system vulnerabilities and hazards. The Health Care Failure Mode and Effect Analysis™ (HFMEA) is a systems-engineering tool that the military and aerospace industries developed to proactively identify potential errors. While this technique has been used in a range of healthcare settings, there are few reports of health professional educators using it with learners to teach quality improvement and systems engineering methods. Here, we describe an application of HFMEA in a medical informatics rotation for health professional students. In this manuscript, we briefly review HFMEA theory and methods, illustrate its application to a quality improvement initiative, and reflect upon its value – and limitations – when used in an educational context.
Introduction and Background
There is a movement in medical education to teach health systems science – sometimes referred to as the third pillar of medicine – along with the basic and clinical sciences1. Clinical workflow analysis, process redesign, and healthcare quality improvement are foundational topics in health systems science that crosswalk with applied clinical informatics2. Consequently, there is a pressing need for academic informatics departments to provide an educational program for learners to study and apply health systems science. This program should include an experiential component that enriches learning by allowing learners to apply lessons and develop new skills.
At the University of Oklahoma-University of Tulsa School of Community Medicine (OU-TU SCM), the Department of Medical Informatics offers applied informatics and data science rotations to medical residents and health professional students. These rotations include didactics, practicums, and mentored scholarship in the form of a mini-capstone addressing a health systems science topic3. The mini-capstone projects focus on enterprise-level problems. This creates opportunities to teach core informatics topics like computerized decision support (CDS), health information technology (HIT) management, interprofessional collaboration, and quality improvement methodology.
Knowing how to “diagnose” health system failures and “prescribe” implementation science solutions is an important cross-cutting competency within these educational domains. Two systems diagnosis tools useful for evaluating healthcare workflow are root cause analysis (RCA) and the Health Care Failure Mode and Effect Analysis™ (HFMEA)4. We teach these concepts when students embark on quality improvement or systems engineering projects. In this manuscript, we illustrate the use of HFMEA on a CDS problem, review the benefits and limitations of HFMEA, and explain how we used HFMEA to teach health systems science and quality improvement.
Problem Definition
The OU-TU SCM maintains academic relationships with several community hospitals. One of our affiliate hospitals is part of a hospital network governed by a centralized corporate authority. Corporate leaders partnered with clinical staff to create an electronic medical record (EMR) pain management bundle. The bundle includes new electronic orders and an interdisciplinary workflow. They designed the bundle with the intent to reduce inpatient opioid use and opioid-related adverse events. Encouraged by initial piloting success at two network hospitals, corporate leaders approved a “big-bang” implementation across the remaining hospitals in the network. At the time of our project, leadership had not yet implemented the bundle at our affiliate hospital. Anticipating a range of implementation challenges (e.g., staff education, workflow re-engineering) and unintended consequences (e.g., under- or over-treating patient pain), the clinical champions consulted our medical informatics team to conduct an HFMEA. Our informatics faculty required students to participate in this project as part of the rotation practicum.
Project and Manuscript Goals and Objectives
We identified project goals and objectives based upon stakeholder expectations. Corporate quality and safety officers set a goal to improve opiate prescribing safety. Corporate management set timeline objectives for implementation. Clinical champions (i.e., our customers) set goals to limit systems-based errors, secure clinician buy-in, and increase patient satisfaction. Our medical informatics department sought to provide a prospective risk analysis and actionable recommendations. We knew at the outset that our project had to address clinicians’ goals within management’s timeline. We also needed to provide an educational experience that met course learning objectives.
The focus of this paper is to illustrate how to use the HFMEA within an informatics educational rotation to teach health systems science techniques, satisfy professional school educational program objectives, and provide consultative support to clinicians. We have several objectives with this manuscript. We intend to (1) give a brief review of HFMEA theory and methods; (2) demonstrate use of the HFMEA in a real-world situation; (3) illustrate how to integrate these methods into a professional school rotation; and (4) highlight early lessons learned and limitations. This paper should be of interest to applied informaticians, educators, and quality improvement specialists.
Theoretical Framework and Focused Literature Review
In 2001, the Department of Veterans Affairs (VA) adapted HFMEA from methods used by the aerospace and military sectors to identify risks in manufacturing processes5. The VA method combined concepts from Failure Mode and Effect Analysis (FMEA) and Hazard Analysis and Critical Control Point (HACCP) to proactively identify and address health system vulnerabilities4. The VA also incorporated the Safety Assessment Code (SAC) Matrix from RCA and a novel decision algorithm to prioritize corrective actions.
The HFMEA is most effective during product design, but practitioners may also use it to analyze systems in a mature healthcare enterprise. It consists of five main steps: (1) selecting a process for inspection; (2) recruiting an interdisciplinary team; (3) creating a flow process map; (4) conducting a hazard analysis; and (5) formulating an action plan to address failure modes (Figure 1)6. Because HFMEA is a proactive “diagnostic tool” to predict failures, it is crucial to assemble an interdisciplinary team of subject matter experts (SMEs) who can draw upon their collective experience when brainstorming7.
Figure 1.
Sample section of an HFMEA flow process map illustrating processes, sub-processes, and the hazard analysis.
After identifying all failure modes, the team assigns each one a SAC score using a 16-point scale that reflects the probability and clinical severity of an event7. Once scored, the team uses the HFMEA Decision Tree™ to prioritize action based upon hazard criticality, absence of effective control measures, and lack of detectability8. The interconnectedness and complexity of healthcare systems can make a comprehensive HFMEA cost prohibitive; therefore, one approach is to focus upon a small section of the workflow.
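To make this scoring and triage logic concrete, the following sketch (ours, not the published instrument) computes a hazard score as the product of 4-point severity and probability ratings, yielding the 16-point scale, and then walks a simplified version of the decision tree; the category anchors, the criticality threshold of 8, and all names are illustrative assumptions.

```python
# A minimal sketch of HFMEA hazard scoring and decision-tree triage,
# assuming 4-point severity and probability anchors and a criticality
# threshold of 8; the anchors, threshold, and names are illustrative.

SEVERITY = {"minor": 1, "moderate": 2, "major": 3, "catastrophic": 4}
PROBABILITY = {"remote": 1, "uncommon": 2, "occasional": 3, "frequent": 4}

def hazard_score(severity: str, probability: str) -> int:
    """Hazard score = severity x probability, yielding the 16-point scale."""
    return SEVERITY[severity] * PROBABILITY[probability]

def requires_action(score: int, single_point_weakness: bool,
                    effective_control: bool, detectable: bool) -> bool:
    """Simplified decision tree: pursue critical (or single-point-weakness)
    failure modes that lack an effective control and are hard to detect."""
    if score < 8 and not single_point_weakness:
        return False  # not critical enough to pursue further
    if effective_control:
        return False  # an existing control measure already mitigates it
    if detectable:
        return False  # the failure is likely caught before harm occurs
    return True       # prioritize this failure mode for corrective action

# Example: a major, occasional, silent failure mode scores 9 and is actionable.
score = hazard_score("major", "occasional")
print(score, requires_action(score, False, False, False))  # -> 9 True
```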
There are numerous studies demonstrating the value of HFMEA across a range of clinical settings including oncology, surgery, and general inpatient care9-13. Notably, researchers have used HFMEA to diagnose and manage health-systems pharmacy hazards leading to adverse events9,14,15. For example, Vélez-Díaz-Pallarés and colleagues used HFMEA to analyze medication management on inpatient wards using computerized physician order entry and unit dose dispensing9. They found that HFMEA helped the quality improvement team reduce inpatient prescribing errors. Anjalee and colleagues conducted a systematic review of published HFMEAs and found the method to be an effective tool for reducing medication errors10. Similarly, Faiella and colleagues concluded that HFMEAs are effective for streamlining the evaluation of complex systems, schematizing risk assessment, and selecting safety interventions6.
Setting
Our affiliate hospital is an urban tertiary care facility with intensive care, surgical subspecialty services, pediatrics, and obstetrics. The clinical champions charged with implementation included a palliative care physician and two hospitalists with training in pain management and healthcare quality improvement.
The proposed pain management implementation bundle included (1) EMR order menus; (2) CDS; (3) a new interdisciplinary workflow; (4) a staff education campaign; and (5) new hospital policies. The bundle required prescribers to select therapies from a standardized list of orders for non-opioid medications, opioid medications, and non-pharmacologic pain-management alternatives. The orders direct nurses to regularly compare patient functional status to pre-defined therapeutic goals and administer therapies in an escalating fashion when required. For example, the care team may set a therapeutic activity goal requiring a post-operative patient with a new hip arthroplasty to transfer from bed to bedside commode by the second post-operative day. If the patient cannot transfer due to pain, the nurse will begin by administering non-pharmacologic therapies and non-opioid medications. If, upon reassessment, the patient fails to reach this goal, the nurse may administer oral or intravenous opioids.
Developers piloted the bundle at two hospitals and gathered preliminary data showing a reduction in total morphine equivalents (ME) administered, a decrease in the use of opioid reversal agents, and a reduction in opioid prescriptions at discharge. There was no change in patient satisfaction scores related to pain relief. Encouraged by these findings, corporate leadership authorized bundle implementation as part of a national pain management campaign. Management expected our hospital to submit for consideration any local CDS configuration requests, implement the bundle, and remove personalized clinician order sets within 60 days.
The bundle called for a seismic change in prescribing behavior and clinical workflow. Like many real-world health system implementations, the plan had several project management constraints, including fixed implementation resources, an ambitious roll-out timeline, and top-down corporate messaging. Therefore, we anticipated numerous challenges and sought to identify as many failure modes as possible. We needed to prioritize failure modes as a function of risk and propose mitigating strategies that could feasibly be implemented within 30 days.
Methods
The OU-TU SCM Medical Informatics rotation is a two- or four-week rotation for medical students, physician assistant students, and residents3. Both formats include didactics, readings, participation in departmental meetings, and a mentored practicum. Typically, the practicum requires learners either to join “in-flight” projects or to design a novel project with a focused research question. The rotation culminates with the students delivering a “grand rounds” style presentation on their project to staff and faculty.
For this project, we asked students to conduct an HFMEA of the multi-modal pain order set. Informatics faculty gave several lectures on HFMEA methods and furnished students with readings detailing the theory and steps for analysis. The students then gathered and reviewed data and artifacts related to the protocol. This included preliminary reports from the pilot sites, wireframes of the order menus, written specifications for decision support, and training materials. The students met with local champions to better understand the protocol, the climate of implementation, and how they could apply HFMEA to identify potential implementation barriers.
Faculty supervised students as they conducted semi-structured interviews with SMEs. Interview topics included the current-state workflow and workflow compatibility concerns, perceived usability of the new order sets, perceived usefulness of the new protocol, appropriateness for the patient population, challenges with interdisciplinary communication, staff training needs, known implementation challenges at other sites, patient or specialty-specific implementation barriers, and the plan for patient education.
We selected a convenience sample of SMEs based upon their clinical domain knowledge and anticipated role in the future-state workflow. The sample included physicians, nurses, physical therapists, and nurse educators employed by the hospital. It was crucial to interview SMEs familiar with inpatient pain management, order entry, and bedside care. We interviewed nurse educators in the hopes of identifying training best-practices.
The students completed a modified HFMEA using the data collected from literature, semi-structured interviews, artifact analysis, and non-participant observation. Given the time constraints, we modified and simplified the HFMEA process so students could complete a preliminary analysis within the student rotation timeline. Rather than focusing on all potential failure modes, the students focused on major themes that emerged during the interviews. Using an apprenticeship model, faculty helped students identify sub-processes, failure modes, and recommendations for corrective action. The students presented their findings to department leadership and project sponsors as part of their rotation and received feedback on their work and presentation.
Results
Graphically describing the process
From the outset, the steps of our HFMEA deviated from the classic approach described by DeRosier4. Typically, the project leader assembles a multidisciplinary team with SMEs and one or more advisors. The SMEs provide insight on how a process works, whereas the advisor helps the leader scope the project and complete tasks. In our case, we had a future-state workflow in hand and needed SMEs to forecast potential problems. Therefore, instead of assembling the team to map workflow, we sought out and interviewed SMEs using the process map as a guide.
We documented the future-state workflow using a swim lane diagram labeled with stakeholder roles rather than a flow process map (Figure 3) because the workflow was multidisciplinary and included branching logic and parallel activities. We assigned numbers to processes chronologically, creating a flow map with 22 processes and 22 sub-processes. We found it helpful to cluster related steps and label them according to high-level goals: (1) initial assessment; (2) order entry; (3) order processing; (4) goal assessment; and (5) administration.
Figure 3.
Simplified flow process map of the future-state workflow. We elected to adapt a swim lane diagram to reflect the non-linear and iterative workflow that crosses multiple stakeholders and clinical settings.
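To illustrate the numbering convention, one could encode the swim-lane map as simple records; in the sketch below, the identifiers mirror Table 1 (process 8, sub-process 8B, failure mode 8B1), but the classes, fields, and example record are our own illustrative choices, not part of the HFMEA method.

```python
# A sketch of how we might encode the swim-lane map for downstream analysis.
# Identifiers mirror the paper's numbering (process 8, sub-process 8B,
# failure mode 8B1); the classes and example record are illustrative.

from dataclasses import dataclass, field

@dataclass
class SubProcess:
    sub_id: str                       # e.g., "8B"
    description: str
    failure_modes: list = field(default_factory=list)  # e.g., ["8B1"]

@dataclass
class Process:
    proc_id: int                      # chronological number, 1..22
    lane: str                         # stakeholder role (swim lane)
    goal: str                         # high-level goal cluster
    description: str
    sub_processes: list = field(default_factory=list)

flow_map = [
    Process(8, "Physician", "Order entry",
            "Select the therapeutic activity goal order",
            [SubProcess("8B",
                        "Choose ADL, psychosocial, and mobility goals",
                        ["8B1"])]),
]
```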
Completing the hazard analysis
In one week, the team interviewed two internal medicine residents, five acute care nurses, four maternity care nurses, and two physical therapists. Working iteratively, the faculty and students compared interview notes with the workflow diagram and conducted brainstorming sessions with local champions. The team identified 33 failure modes and four overarching themes (Table 1).
Table 1.
Excerpt of our HFMEA findings including failure mode themes, failure modes, and corrective actions.
Goal | Process and sub-process | Affected stakeholder | Failure mode and theme | Corrective action
---|---|---|---|---
Initial assessment | 1. Patient needs pain medication; 1B. Nurse evaluates patient | Nurse | 1B1. Patient arrives to ward on high-dose parenteral opiate – clinical appropriateness | Include pharmaceutical de-escalation protocol in nursing orders
Order entry | 8. Select the therapeutic activity goal order; 8B. Choose activity of daily living goal, psychosocial element goal, and mobility goal | Physician | 8B1. The order choices are too complex with too many goals – staff efficiency | Include pre-set goals for a limited number of common patient use-cases
Goal assessment | 18. Re-assess patient’s pain control; 18A. Determine if patient is meeting current goals | Nurse | 18A1. Time elapsed between first assessment and administration of parenteral medication could be 2.5 h – clinical delay | Remove third assessment step from future-state workflow
Administration | 13. First-tier pharmaceutical analgesic administration; 13B. Nurse reviews initial pharmaceutical analgesic order | Nurse | 13B1. The patient is post-caesarean delivery and still on the anesthesia protocol – workflow compatibility mismatch | Identify and post patient cohorts that meet exclusion criteria
The first theme related to the clinical appropriateness of the order sets. Many clinicians believed the protocol might be inappropriate for some patients’ pain management needs. For example, patients with acute severe post-operative pain, chronic opiate prescriptions, the inability to swallow, or medication allergies may need a bespoke plan.
The second theme related to the delay between clinical assessment and medication administration. Clinicians were concerned that the elapsed time dictated by the protocol to assess analgesic effectiveness before the next medication administration would be unacceptably long. They feared this delay would negatively affect care quality and erode trust between patient and care team.
The third theme related to efficiency. Clinicians were apprehensive about the additional time required to complete tasks. Physicians believed order set complexity would increase order entry times. Nurses were equally concerned about the additional time invested in patient education, pain management counselling, and activity assessments.
The fourth theme highlighted workflow compatibility mismatches between the current-state and future-state. For example, med/surg ward nurses did not know how to reconcile discordant patient-reported pain scales with objectively observed functional performance. If a patient rated their pain 10 on a 10-point pain scale, but met a priori activity goals, should nurses administer or withhold the next analgesic dose? In a separate example, some surgical specialties, including obstetrics, use standardized pain management strategies that did not align with the new workflow. Anesthesiologists oversee patient pain management requirements following caesarean delivery and favor ketorolac, a parenteral drug used to treat moderately severe pain. The EMR order bundle did not include ketorolac.
For several reasons, the team did not use the HFMEA Hazard Scoring Matrix or the HFMEA Decision Tree™4. These steps are time consuming and resource intensive, and the clinical champions requested a fast turnaround. Furthermore, new clinical management policies restricted the range of potential corrective actions. Per customer request, we prioritized corrective actions based upon logistics and feasibility. We did not prioritize recommendations requiring major technology modifications, major informatics resource investments, or hospital policy revisions.
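The sketch below shows what this feasibility-first triage might look like in code; the attributes, the 60-day cutoff drawn from management's timeline, and the example records are illustrative assumptions, not a published scoring scheme.

```python
# A hedged sketch of the feasibility-first triage we used in place of the
# Hazard Scoring Matrix; the attributes, 60-day cutoff, and records are
# illustrative assumptions rather than a published scoring scheme.

from dataclasses import dataclass

@dataclass
class Recommendation:
    failure_mode: str
    action: str
    needs_major_tech_change: bool
    needs_policy_revision: bool
    est_days_to_implement: int

def feasible(rec: Recommendation, deadline_days: int = 60) -> bool:
    """Keep only actions local champions could implement themselves."""
    return (not rec.needs_major_tech_change
            and not rec.needs_policy_revision
            and rec.est_days_to_implement <= deadline_days)

recs = [
    Recommendation("8B1", "Pre-set goals for common use-cases", False, False, 30),
    Recommendation("18A1", "Major order set re-engineering", True, False, 90),
]
shortlist = [r for r in recs if feasible(r)]  # retains only the 8B1 action
```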
Corrective Actions and Recommendations
Given the project constraints and customer request, we assembled recommendations that clinical champions or local informaticians could implement within 60 days (Table 1). Our recommendations took one of the following forms: (1) order set configurations to improve usability; (2) patient communication and education materials; (3) local executive messaging; (4) patient inclusion/exclusion decision support; and (5) workflow modifications.
Placing orders for the pain management protocol entails choosing from a lengthy list of activity goals and selecting non-pharmacologic, non-narcotic, and narcotic medication orders. The prescriber must also place several corollary orders including nursing instructions and allied care consults. While the number of options affords a high degree of flexibility, this flexibility carries both learnability and complexity costs. Therefore, we proposed offering some pre-selected options to satisfy the most commonly encountered use-cases.
Nurses expressed concerns about patient reactions to the new pain protocols, hypothesizing that patients will become frustrated if new practices deviate from prior experience or fail to meet expectations. To defuse tension, improve health literacy, and direct culpability away from nurses, we suggested developing institution-branded resources for nurses to furnish to patients describing the pain management goals, program, and rationale.
The SMEs we spoke with expressed dismay over the corporate implementation strategy, arguing that the approach disenfranchised front-line clinicians. We believe the inability to participate in decision-making created a problematic climate of implementation. We recommended that local management and executive leadership conduct a series of “safety rounds” to support and reward adoption and identify and remove barriers to use16.
Recognizing that some patients may not be suitable for the multi-modal pain orders, we recommended defining patient inclusion and exclusion criteria. The organization could communicate these criteria through in-person and online trainings, published materials, and point-of-care decision support. We also recommended developing alternative order pathways to accommodate patients that were not suitable for the standard orders.
Finally, we outlined several solutions to handle common workflow exceptions. For example, it may be necessary to include a protocol for patients arriving on high-dose narcotic analgesia. These patients may need a different nursing assessment strategy and an analgesic de-escalation protocol. Also, providers may need to quickly enter pain management orders for patients at hospital admission, before the inter-disciplinary team can assess the patient’s functional status. We recommended including order sets that “release” when allied team members complete the functional assessment. We also recommended using temporary pain management holding orders as a bridge until the provider can enter the multi-modal pain protocol.
Educational Impact
Students shared with faculty several valuable insights about their educational experience. The students received formal instruction and practical experience on many core informatics topics including CDS, workflow analysis, process redesign, quality improvement, HIT, interdisciplinary teamwork, and change management17. Cited strengths included our strong emphasis on applied informatics, the opportunity for hands-on learning, and the ability to work with real-world interdisciplinary teams.
One weakness the students reported was their lack of prior informatics training, which created a steep learning curve. They also noted that the short rotation timeline made it challenging to complete larger projects. Because most operational informatics projects continue longer than an educational rotation, it is crucial that faculty build in mechanisms to teach and support project handoffs. The lack of an established handoff format in medical informatics posed a formidable challenge.
We pragmatically adapted the situation-background-assessment-recommendation (SBAR) framework used in healthcare for patient care handoffs18. The students and faculty integrated SBAR information into the final presentation. For “Situation,” students provided a concise summary of the project and the relevant informatics domains. For “Background,” the students described our adaptation of the HFMEA and briefly summarized the pain management protocol. For “Assessment,” the students reviewed the flow diagram, failure modes, and preliminary recommendations to the customer. For “Recommendations,” outgoing students outlined future strategies for incoming students. Students then exchanged all materials, artifacts, and data.
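For illustration, the handoff packet could be captured in a simple key-value structure, as in this sketch; the field contents paraphrase the four sections above, and all names are hypothetical.

```python
# A minimal sketch of the SBAR-style handoff packet, assuming a simple
# key-value layout; the field contents paraphrase the four sections
# described above, and all names are illustrative.

sbar_handoff = {
    "Situation": "Concise project summary and relevant informatics domains",
    "Background": "Our HFMEA adaptation and the pain management protocol",
    "Assessment": "Flow diagram, failure modes, preliminary recommendations",
    "Recommendation": "Next-step strategies for the incoming student team",
}

def render_handoff(handoff: dict) -> str:
    """Format the packet for the end-of-rotation presentation slide."""
    return "\n".join(f"{section}: {summary}"
                     for section, summary in handoff.items())

print(render_handoff(sbar_handoff))
```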
Discussion
Principal findings
The HFMEA is a robust systems analysis method that is ideally suited for healthcare settings where interdisciplinary teams need a structured approach to unpack, understand, and predict the behavior of complex adaptive systems19. It is an effective tool for targeting workflow vulnerabilities, estimating patient safety risks, and prioritizing solutions in resource constrained settings. We found that applying the HFMEA framework to an implementation initiative offered several practical advantages. It increased awareness among stakeholders and fostered interdisciplinary engagement. It generated recommendations for order sets, educational materials, communication strategies, and special-case workflows. The HFMEA also provided a framework for stakeholder discussions and a way to track and finalize recommendations.
While healthcare organizations need competent clinical professionals applying these methods, training programs rarely teach HFMEA. We believe HFMEA is a practical and teachable method that educators can incorporate into undergraduate and graduate medical education. The informatics teaching faculty found it provided a novel and dynamic way to teach systems-based practice to health professional students. By applying HFMEA concepts to a real-world setting, students gained practical experience with quality improvement methods and adapted methods to suit the clinical and business context. Moreover, the experiential nature of the program enabled the faculty to observe and evaluate entrustable professional activities.
It is critical to point out that HFMEA can be time consuming and resource intensive. DeRosier and colleagues noted that a single HFMEA can require large interdisciplinary teams and 10 or more meetings4. For this reason, they suggested only examining one facet of a process so as not to overwhelm the participants. Our project was characterized by (1) a rigid, prescriptive future-state workflow; (2) top-down implementation without stakeholder input; and (3) significant time and resource constraints. Therefore, we modified the HFMEA steps and streamlined the approach to meet leadership’s deadline, identify nimble solutions, and engage novice learners. We still believe the standard HFMEA framework is an excellent method for analyzing a system; practitioners should complete each step if time and resources permit. However, healthcare executives often demand quick and decisive action. In our experience, it is important to teach students how to keep pace with business operations by adapting to the use-case and available resources.
Relevance to current literature and future steps
There are many published descriptions of HFMEA use; our recent literature search of PubMed and Google Scholar identified 131 monographs describing applications in specific disciplines (e.g., radiation oncology) or processes (e.g., inpatient supply chain management)9,10,13,19. However, we found only one report describing the use of HFMEA to teach health professionals. Schuller and colleagues sent department faculty to a continuing medical education conference offered by the American Association of Physicists in Medicine (AAPM) and then gave a condensed version to department personnel in a series of lunch seminars12. The seminars used a combination of didactics and use-cases to teach flow process mapping, failure mode analysis, and fault tree analysis. They did not, however, describe the training methods, strengths, or limitations.
For several reasons, we believe HFMEA is a teachable, feasible, and generalizable systems diagnosis strategy that academic programs should include in their health systems science curriculum for health professional students, residents, and fellows. First, HFMEA is a useful technique in quality improvement work to analyze systems. Teaching this method to health professional students provides future practitioners with a practical and adaptable skill they can use to diagnose system stress points and explore a range of solutions. Second, HFMEA provides a framework for students from different programs to leverage their unique skills and knowledge on interprofessional teams. Third, professional programs can use curricular modules incorporating HFMEA to meet the health systems science learning objectives required by accreditation bodies20.
Strengths and limitations
Despite the role of HFMEA in health systems science and patient safety, researchers have highlighted important methodologic limitations that could bias findings and outputs. First, identification of failure modes relies heavily upon facilitated brainstorming sessions with interdisciplinary groups; participants are therefore vulnerable to anchoring bias, availability bias, and groupthink21,22. For this reason, it may be useful to combine HFMEA methods with human factors research methods such as user simulations, cognitive walkthroughs, or non-participant ethnography. Second, risk scoring and decision analysis have validity issues23; scoring risk based upon perceived probability and severity requires a considerable amount of guesswork. Finally, decision analysis methods in the HFMEA process do not quantify the reliability or effectiveness of system controls, backups, or fail-safe measures.
Faiella and colleagues theorized that HFMEA may miss certain classes of failure modes and recommended examining human-computer interaction errors using Systematic Human Error Reduction and Prediction Analysis (SHERPA)6,24. They also recommended analyzing the interconnectedness of complex adaptive systems using Systems Theoretic Accident Model and Processes and System Theoretic Process Analysis (STAMP-STPA)6,25. Similarly, Abrahamsen and colleagues suggested combining HFMEA with other systems engineering methods such as incident learning and Structured What If Technique (SWIFT)8. Kricke and colleagues recommended using EMR data and big-data analytic methods to identify sub-processes and workflow variations overlooked during process mapping26.
This background literature provides insights into limitations in our work. First, the rotation schedule and management objectives created an aggressive project timeline; students had very little time to brainstorm with frontline workers. This created an inherent selection bias. We could address this risk in the future by surging resources and assigning more students to the process workflow. Second, to improve and measure the validity and completeness of HFMEAs, we need to concurrently assign several groups of learners to independently complete an HFMEA on the same system and then compare outputs. Third, we hope to add human factors methods to our analysis protocol. Fourth and finally, adding a health data science module using EMR data to identify systems issues and errors might provide an important dimension to an overall safety appraisal.
Conclusions
In summary, we believe that the HFMEA is an important tool for health systems diagnosis and a powerful educational lever to improve the informatics and quality improvement competencies of health professional learners. However, traditional HFMEA methods can be quite time intensive and often demand full engagement of an interdisciplinary clinical team. This can erode stakeholder enthusiasm and limit practicality. Through this use-case, we demonstrated how to adapt methods to align with the pace of business operations and offered strategies to teach HFMEA to learners.
References
1. Skochelak SE. Health systems science e-book. Elsevier; 2020.
2. Gardner RM, Overhage JM, Steen EB, et al. Core content for the subspecialty of clinical informatics. Journal of the American Medical Informatics Association. 2009;16:153–7. doi:10.1197/jamia.M3045.
3. Homco J, Kendrick D, Paungpetch S, Lesselroth B. From fledgling to facile: Teaching medical students to be clinician data scientists in less than four weeks. In: AMIA Educators' Forum 2019. St. Louis, MO: American Medical Informatics Association; 2019.
4. DeRosier J, Stalhandske E, Bagian JP, Nudell T. Using health care failure mode and effect analysis™: the VA National Center for Patient Safety's prospective risk analysis system. The Joint Commission Journal on Quality Improvement. 2002;28:248–67. doi:10.1016/s1070-3241(02)28025-6.
5. Seidl KL, Newhouse RP. The intersection of evidence-based practice with 5 quality improvement methodologies. JONA: The Journal of Nursing Administration. 2012;42:299–304. doi:10.1097/NNA.0b013e31824ccdc9.
6. Faiella G, Parand A, Franklin BD, et al. Expanding healthcare failure mode and effect analysis: a composite proactive risk analysis approach. Reliability Engineering & System Safety. 2018;169:117–26.
7. Stalhandske E, DeRosier J, Patail B, Gosbee J. How to make the most of failure mode and effect analysis. Biomedical Instrumentation & Technology. 2003;37:96–102. doi:10.2345/0899-8205(2003)37[96:HTMTMO]2.0.CO;2.
8. Abrahamsen HB, Abrahamsen EB, Høyland S. On the need for revising healthcare failure mode and effect analysis for assessing potential for patient harm in healthcare processes. Reliability Engineering & System Safety. 2016;155:160–8.
9. Vélez-Díaz-Pallarés M, Delgado-Silveira E, Carretero-Accame ME, Bermejo-Vicedo T. Using Healthcare Failure Mode and Effect Analysis to reduce medication errors in the process of drug prescription, validation and dispensing in hospitalised patients. BMJ Quality & Safety. 2013;22:42–52. doi:10.1136/bmjqs-2012-000983.
10. Anjalee JAL, Rutter V, Samaranayake NR. Application of Failure Mode and Effect Analysis (FMEA) to improve medication safety: a systematic review. Postgraduate Medical Journal. 2021;97:168–74. doi:10.1136/postgradmedj-2019-137484.
11. Sorrentino P. Use of failure mode and effects analysis to improve emergency department handoff processes. Clinical Nurse Specialist. 2016;30:28–37. doi:10.1097/NUR.0000000000000169.
12. Schuller BW, Burns A, Ceilley EA, et al. Failure mode and effects analysis: A community practice perspective. Journal of Applied Clinical Medical Physics. 2017;18:258–67. doi:10.1002/acm2.12190.
13. Wetterneck TB, Skibinski K, Schroeder M, Roberts TL, Carayon P. Challenges with the performance of failure mode and effects analysis in healthcare organizations: an IV medication administration HFMEA™. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Los Angeles, CA: SAGE Publications; 2004. p. 1708–12.
14. McElroy LM, Khorzad R, Nannicelli AP, Brown AR, Ladner DP, Holl JL. Failure mode and effects analysis: a comparison of two common risk prioritisation methods. BMJ Quality & Safety. 2016;25:329–36. doi:10.1136/bmjqs-2015-004130.
15. de Vries M, Fan M, Tscheng D, Hamilton M, Trbovich P. Clinical observations and a Healthcare Failure Mode and Effect Analysis to identify vulnerabilities in the security and accounting of medications in Ontario hospitals: a study protocol. BMJ Open. 2019;9:e027629. doi:10.1136/bmjopen-2018-027629.
16. Holahan PJ, Lesselroth BJ, Adams K, Wang K, Church V. Beyond technology acceptance to effective technology use: a parsimonious and actionable model. Journal of the American Medical Informatics Association. 2015;22:718–29. doi:10.1093/jamia/ocu043.
17. Gardner RM, Overhage JM, Steen EB, et al. Core content for the subspecialty of clinical informatics. Journal of the American Medical Informatics Association. 2009;16:153–7. doi:10.1197/jamia.M3045.
18. Shahid S, Thomas S. Situation, background, assessment, recommendation (SBAR) communication tool for handoff in health care: a narrative review. Safety in Health. 2018;4:1–9.
19. Taleghani YM, Vejdani M, Vahidi S, Ghorat F, Raeisi AR. Application of prospective approach of healthcare failure mode and effect analysis in the risk assessment of healthcare systems. EurAsian Journal of BioSciences. 2018;12:95–104.
20. Liaison Committee on Medical Education (LCME). Standards, publications, & notification forms. LCME; 2021. Accessed March 10, 2021, at https://lcme.org/publications/.
21. Simsekler MCE, Kaya GK, Ward JR, Clarkson PJ. Evaluating inputs of failure modes and effects analysis in identifying patient safety risks. International Journal of Health Care Quality Assurance. 2019.
22. Shebl NA, Franklin BD, Barber N. Is failure mode and effect analysis reliable? Journal of Patient Safety. 2009;5:86–94. doi:10.1097/PTS.0b013e3181a6f040.
23. Abraham J, Nguyen V, Almoosa KF, Patel B, Patel VL. Falling through the cracks: information breakdowns in critical care handoff communication. In: AMIA Annual Symposium Proceedings. American Medical Informatics Association; 2011. p. 28.
24. Embrey D. SHERPA: A systematic human error reduction and prediction approach. In: Proceedings of the International Topical Meeting on Advances in Human Factors in Nuclear Power Systems; 1986.
25. Leveson N. A new accident model for engineering safer systems. Safety Science. 2004;42:237–70.
26. Kricke GS, Carson MB, Lee YJ, et al. Leveraging electronic health record documentation for Failure Mode and Effects Analysis team identification. Journal of the American Medical Informatics Association. 2017;24:288–94. doi:10.1093/jamia/ocw083.