Journal of the American Medical Informatics Association (JAMIA). 2019 Mar 22;26(6):553–560. doi: 10.1093/jamia/ocz002

Systems engineering and human factors support of a system of novel EHR-integrated tools to prevent harm in the hospital

Anuj K Dalal 1,2, Theresa Fuller 1, Pam Garabedian 3, Awatef Ergai 4, Corey Balint 4, David W Bates 1,2, James Benneyan 4
PMCID: PMC7647327  PMID: 30903660

Abstract

We established a Patient Safety Learning Laboratory comprising 2 core and 3 individual project teams to introduce a suite of digital health tools integrated with our electronic health record to identify, assess, and mitigate threats to patient safety in real time. One of the core teams employed systems engineering (SE) and human factors (HF) methods to analyze problems, design and develop improvements to intervention components, support implementation, and evaluate the system of systems as an integrated whole. Of the 29 participants, 19 and 16 participated in surveys and focus groups, respectively, about their perception of SE and HF. We identified 7 themes regarding use of the 12 SE and HF methods over the 4-year project. Qualitative methods (interviews, focus groups, observations, usability testing) were most frequently used, typically by individual project teams, and generated the most insight. Quantitative methods (failure mode and effects analysis, simulation modeling) typically were used by the SE and HF core team but generated variable insight. A decentralized project structure led to challenges using these SE and HF methods at the project and systems level. We offer recommendations and insights for using SE and HF to support digital health patient safety initiatives.

Keywords: patient safety, health information technology, quality improvement, innovation science

INTRODUCTION AND BACKGROUND

Improving patient safety by minimizing preventable harm is crucial to our healthcare system, but remains an ongoing challenge.1 In the acute care setting, adverse events such as hospital-acquired infections, medication errors, and falls affect a significant number of patients and are associated with substantial costs.2 Although advancements in institutional culture, clinical care processes, medication administration, decision support, and health information technology (IT) appear to be improving patient safety, their overall impact has been variable.3–7

The safety failings of healthcare systems often are attributed to difficulty introducing new initiatives, many of which leverage health IT, in sociotechnically complex settings.8,9 For hospital safety and quality improvement initiatives to achieve their intended goals, interdisciplinary teams must assess and measure the organizational factors and clinical processes that influence whether health IT will be successfully implemented and adopted.10,11 For example, improvement in the process of medication reconciliation associated with health IT implementation has been shown to reduce medication errors and potential adverse events.12,13 To maximize success, implementation teams must define problems, understand workflows, analyze user requirements, identify potential barriers, predict failures, and continuously improve.14,15 However, predicting the impact of disruptions to workflow, acceptance and adoption by users, unintended consequences, and external factors (eg, vendor constraints) is still difficult.16,17

Given the complexity of healthcare processes, organizations are increasingly partnering with experts in systems engineering (SE) and human factors (HF) to redesign care processes and improve implementation of new health IT tools.17–21 Methods from these disciplines can be applied to a broad range of processes to create practical, reliable, and generalizable health IT–enabled interventions that serve to engage patients and clinicians in identifying, assessing, and mitigating threats to safety, thereby decreasing adverse events related to hospital-acquired conditions. While the value of SE and HF is increasingly recognized in health care, few reports have described and assessed the experience of deploying SE and HF methods by interdisciplinary teams of varied backgrounds to support large-scale, health IT implementations.17,22,23

In 2014, we formed a Patient Safety Learning Laboratory (PSLL), a cross-institutional collaboration between the Center for Patient Safety, Research, and Practice at Brigham and Women’s Hospital and Northeastern University’s Healthcare Systems Engineering Institute, supported by a grant from the Agency for Healthcare Research and Quality (AHRQ). As part of our overarching structure, we established a core team with SE and HF expertise to support individual project teams in introducing a suite of novel digital health tools integrated with the electronic health record (EHR) across the 5 phases of AHRQ’s SE lifecycle: problem analysis, design, development, implementation, and evaluation. The objective of this case report is to describe the experience of our SE and HF core team in supporting individual projects during each phase.

CASE REPORT

Overview of patient safety learning laboratory

Our PSLL consisted of 3 individual project teams, an administrative core, and the SE and HF core. The 3 projects focused on introducing an electronic toolkit for patients and clinicians to engage in fall-prevention (Fall TIPS),24,25 a safety dashboard for clinicians to identify patients at risk for harm and implement corrective action at the point of care,26 and a portal for patients to report safety concerns directly to clinical unit leadership (MySafeCare).27,28 Our vision (Figure 1) was to employ a continuous quality and safety improvement process that included surveillance to identify “at-risk” patients, risk assessment, mitigation strategies to reduce likelihood of harm, and systematic analysis of the actual threats and harms to iteratively refine these 3 tools individually and as an integrated whole over time. Importantly, an overarching goal was to implement and evaluate the 3 tools as a single combined intervention integrated with our vendor EHR (Epic Systems, Inc) to proactively address threats to patient safety in real time.

Figure 1.

Figure 1.

Patient Safety Learning Laboratory: electronic health record (EHR)–integrated health information technology tools and vision. The overall vision of the Patient Safety Learning Laboratory was to employ a continuous quality and safety improvement process consisting of (1) surveillance to identify “at risk” patients, (2) risk assessment, (3) mitigation strategies to reduce likelihood of harm, and (4) systematic analysis of the actual threats and harms to iteratively refine the individual patient- and clinician-facing tools over time.

SE and HF core team

The SE and HF core team included a physician investigator with expertise in hospital medicine and clinical informatics (A.K.D.), a healthcare SE expert (J.B.), 1-2 rotating graduate SE students and staff engineers (A.E., C.B.), a HF and usability expert (P.G.), and a research assistant/project coordinator (T.F.). We frequently interacted with hospital quality and safety leaders, hospital-based clinicians (nurses, hospitalists, residents, physician assistants), clinical unit leaders, patients and patient advocates, and institutional stakeholders (eg, medical and nursing leadership). Appropriate methods (Figure 2) for each project were identified and applied by the SE and HF core across AHRQ’s SE lifecycle.

Figure 2.

Figure 2.

Systems engineering and human factors methods used for continuous quality and safety improvement across the Agency for Healthcare Research and Quality’s systems engineering lifecycle. A subset of systems engineering and human factors methods was used across each phase of the Agency for Healthcare Research and Quality’s 5-phase systems engineering project lifecycle: problem analysis, design, development, implementation, and evaluation.

Data collection and analysis: surveys and focus groups

Our study was approved by the Partners HealthCare Institutional Review Board. In year 4 of the award, we surveyed all PSLL members via REDCap who had participated for at least 6 months over the 4-year project to assess their perceptions of the SE and HF methods that were used, projects for which each method was used, and lifecycle phase(s) in which they were most useful. Participants were asked to rate the frequency of use and insight gained for each method on a 5-point Likert-type scale. Two focus groups were conducted by SE and HF core team members (P.G., T.F.) to review survey responses; understand participant perceptions of SE and HF methods, how those methods were applied, and their impact; and assess participants’ overall experience with the SE and HF core team. Quantitative data from the surveys were analyzed and presented descriptively. Focus groups were recorded and transcribed verbatim for qualitative analysis; in a series of meetings, SE and HF core team members (T.F., P.G., A.K.D., J.B.) summarized, identified, and confirmed key themes, challenges, and recommendations using a group consensus approach.
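The descriptive treatment of the Likert-type ratings can be sketched as follows. This is a minimal Python illustration with hypothetical responses; only the 4-or-5 threshold mirrors the "insight" definition used in Table 1, and the rating values themselves are made up, not the study's data.

```python
# Hypothetical 5-point Likert ratings of insight gained for one method;
# the study's actual survey responses are not reproduced here.
ratings = [5, 4, 2, 5, 3, 4, 4, 1, 5, 4]

n = len(ratings)
mean_rating = sum(ratings) / n

# Table 1 reports "insight" as the fraction of responses rated 4 or 5.
high_insight = sum(1 for r in ratings if r >= 4)
proportion_high = high_insight / n

print(f"n={n}, mean={mean_rating:.1f}, "
      f"insight={high_insight}/{n} ({100 * proportion_high:.0f}%)")
```

A summary of this form, computed per method, yields the n/n (%) insight figures reported in Table 1.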

RESULTS

A total of 29 individuals (44.8% were 35 years of age or older; 51.7% were women; 51.7% had a doctoral degree) participated in the PSLL for 6 or more months. Of these, 19 individuals (52.6% were 35 years of age or older; 63.2% were women; 47.4% had a doctoral degree) completed the survey (response rate 65.5%) and reported participating in the project for an average of 2.3 ± 1.1 years. Sixteen individuals participated in 1 of the focus groups. Interviews (21.6%), focus groups (17.0%), usability testing (15.1%), and workflow observations (11.0%) were the most frequently used methods across all phases (Figure 3). Seven themes (Box 1) in using SE and HF methods were identified.

Figure 3.

Figure 3.

Reported use and overall frequency of system engineering and human factors methods across project phases.

Box 1.

Key themes about system engineering and human factors methods

Theme 1: Access to institutional data. Addressing concerns of, and building trust with, institutional stakeholders to obtain access to key institutional data required for process and outcome measure analyses.

Theme 2: Timing and resource constraints. Stretching limited research personnel (often with competing priorities) across multiple projects, manifesting as inconsistent application of methods.

Theme 3: Adaptability, ease of use. Determining which methods could be successfully adapted and quickly applied to individual projects and across project phases.

Theme 4: Stakeholder buy-in. Building and establishing relationships by engaging stakeholders at each phase of the project via consistent communication through application of key methods.

Theme 5: Informal vs. rigorous use. Balancing project requirements and timeline for introducing EHR-integrated health-IT tools while applying rigorous research methods.

Theme 6: Exploratory but instructive. Attempting to use a variety of methods during each phase of the project to generate new knowledge and insight.

Theme 7: Variable project team member understanding. Navigating how specific methods could be applied by individual project team members based on familiarity and experience with that method.

Of the 12 SE and HF methods employed (Table 1), 3 (25%) were dependent on access to institutional data (eg, workflow analysis, simulation modeling); 6 (50%) were constrained by timing or availability of resources (eg, participatory design sessions, root cause analysis); 2 (16.7%) were highly adaptable and easy to use (eg, interviews, usability testing); 3 (25%) were useful for generating stakeholder buy-in (eg, use and compliance reporting); and 4 (33.3%) were variably understood by project team members. For 5 (41.7%) methods (eg, interviews, workflow observations), respondents reported challenges with informal versus rigorous use; however, for 2 (16.7%) methods that were employed rigorously (eg, cognitive load and task analysis), respondents thought that the knowledge gained was instructive despite their exploratory purpose. Survey respondents reported gaining a high degree of insight from usability testing, interviews, and cognitive load and task analysis. They reported gaining the least insight from root cause analyses and simulation modeling; however, only a few survey respondents reported using these methods.

Table 1.

Systems engineering and human factors methods used across the PSLL: insight, representative quotes, and themes

Method Insighta Representative Quotesb Themesb
Usability Testing 14/16 (87.5)
  • “Always do it, wouldn’t have considered NOT doing it”

  • “Helped significantly”

  • “Informally conducted”

  • “HF expert provided test plans and minimal guidance on how to conduct”

2, 3, 5
Interviews 12/14 (85.7)
  • “Worked well with patients, families, providers”

  • “Method used in many projects”

  • “Many [logistical] barriers…time [constraints] with clinicians, not getting the right information from the interview”

  • “We need to learn how to do this better”

3, 5
Focus Groups 9/13 (69.2)
  • “Questions relating to the design of the app in a group setting allowed us to directly compare different ways people might approach using it”

5, 6
Workflow Observations 6/11 (54.5)
  • “This was one of the most helpful methods. We should have done this more throughout implementation.”

  • “I think we needed more rigor around this but we never had the resources given the nature of people coming and going, research assistant workload, among other things. Also, the planning of this came too late”

2, 5
Participatory Design Sessions 6/8 (75.0)
  • “We got reactions to mock ups of [patient tools]”

  • “it was not the right stage of the project [due to development timeline]”

2, 4, 7
Use and Compliance Reporting 5/8 (62.5)
  • “I thought these were perhaps the most effective and useful way of engaging end users”

  • “This was one of the most impactful methods. Partially because it allowed real time feedback and got us in better touch with the data.”

4, 7
Process Mapping and Workflow Analysis 4/7 (57.1)
  • “For many of those time frames [nursing] reports were not completed due to other priorities”

1, 5
Simulation Modeling 0/4 (0)
  • “The simulation modeling is [a very] new [technique]”

  • “Access to data was a limiting factor”

1, 7
Failure Mode and Effects Analysis (FMEA) 3/8 (37.5)
  • “Challenging to make FMEA insights actionable.”

  • “[FMEAs] are a useful way to engage users. It would have been more helpful with longer sessions and more follow up from the team on the identified problems.”

  • “Served a separate purpose for continued [discussion] about the [PSLL] system as an integrated whole, how the system is working, and how it could fail.”

2, 4, 7
Cognitive Load and Task Analysis 4/5 (80.0)
  • “We got a lot of good input and feedback for a specific type of ‘high-cognitive’ load task”

  • “Helpful but will only be useful later on”

2, 6
Root Cause Analysis “Huddles” 1/3 (33.3)
  • “These were difficult to arrange due to hearing about safety events, getting everyone in the same room, and ownership of the methodology or subproject.”

  • “…happened infrequently”

2
Statistical Process Control Charts 2/3 (66.7)
  • “Proved difficult due to data availability”

1

Values are n/n (%).

FMEA: failure mode and effects analysis; HF: human factors; PSLL: Patient Safety Learning Laboratory.

a Survey responses with a rating of 4 or 5.

b Identified from surveys and focus groups.

DISCUSSION

As part of our PSLL, a core team with SE and HF expertise supported 3 project teams in using 12 quantitative and qualitative methods to introduce a suite of EHR-integrated digital health tools into a sociotechnically complex setting for engaging patients and clinicians in the real-time identification, assessment, and mitigation of safety threats. Of those methods used, qualitative methods typically were employed by individual project teams alone (interviews, focus groups, usability testing, workflow observations), or with informal assistance from the SE and HF core (participatory design sessions, usability testing). Quantitative methods primarily were used by the SE and HF core team (failure mode and effects analysis, cognitive load and task analysis, root cause analysis, statistical process control). We identified 7 key themes across these 12 SE and HF methods. Survey responses suggested that usability testing generated the greatest insight, whereas simulation modeling provided the least insight.

Individual project team members, many of whom were already familiar with qualitative methods because of their quality improvement and informatics backgrounds, employed these without much guidance from the SE and HF core, sometimes leading to their variable, nonstandardized, and “on-the-fly” application. In contrast, SE and HF core team members—more familiar than others with the quantitative methods—led their use at the systems level; however, variable understanding of these less familiar methods led to a lack of perceived usefulness by individual project teams. Interestingly, several methods (usability testing, interviews, cognitive analyses) seemed to generate a high level of insight, regardless of by whom (project vs core team) or how frequently they were used. For example, usability testing was easily adaptable, leading to its broad albeit variable use across the 3 projects. Cognitive load and task analysis, which was conducted on a specific intervention subcomponent (the safety dashboard’s “opioid management” domain) in which clinician participants were asked to complete a complex yet high-profile task, generated rich insight. Tool use and compliance reporting was perceived as highly effective because it was one of the few methods that unified all projects by offering a real-time, data-driven approach to promote end user adoption during implementation. Finally, simulation modeling seemingly generated little insight; however, we attribute this largely to its use being noncentral to these projects as well as variable understanding of its utility by project teams.

Regarding lessons learned, we encountered challenges and offer recommendations for using these 12 SE and HF methods based on our experience (Table 2). In general, we struggled with consistency and standardization across projects, variable degrees of participant experience, insufficient resources and time constraints, competing project priorities and development schedules, trade-offs at individual project and system-of-systems levels, variable degrees of project team member understanding, access to sensitive administrative data, and inconsistent levels of rigor for use.29 Still, we gained rich insights from application of each method (Table 2). For example, workflow observations empowered individual project teams to identify the pros (transparency for patients and clinicians) and cons (privacy concerns) of externalizing patient-specific EHR data on bedside displays. Simulation modeling helped the SE and HF core team to develop a shared understanding of the 3 projects as a unified whole and identify markers of unit “riskiness” (eg, turnover, staffing ratios, patient acuity) thought to affect key process and outcome measures for preventable harms.
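The simulation modeling described above can be illustrated with a toy Monte Carlo sketch. The risk markers follow those named in the text (turnover, staffing ratios, patient acuity), but every coefficient, rate, and distribution below is an illustrative assumption, not a parameter estimated in the study.

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

BASE_RATE = 0.02  # assumed baseline probability of harm per patient-week

def weekly_harms(census, turnover, patients_per_nurse, acuity):
    """Simulate one week of preventable harms on a unit whose risk is
    scaled by assumed markers of unit "riskiness"."""
    rate = (BASE_RATE
            * (1 + 0.5 * turnover)        # staff turnover raises risk
            * (patients_per_nurse / 4.0)  # relative to a 4:1 ratio
            * acuity)                     # unit-level acuity multiplier
    return sum(1 for _ in range(census) if random.random() < rate)

# Compare a lower-risk and a higher-risk unit over a simulated year.
low_risk = sum(weekly_harms(24, turnover=0.1, patients_per_nurse=4, acuity=1.0)
               for _ in range(52))
high_risk = sum(weekly_harms(24, turnover=0.4, patients_per_nurse=6, acuity=1.3)
                for _ in range(52))
print(low_risk, high_risk)
```

Even a crude model of this kind can help a cross-project team reason about how unit-level risk markers might affect the process and outcome measures shared by all 3 tools.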

Table 2.

Challenges and recommendations from using systems engineering and human factors methods during large-scale health information technology implementation

Method Challenges Recommendations Examples of Insights
Methods are ordered from most qualitative (top) to most quantitative (bottom).

Workflow observations Informally conducted, “on-the-fly,” prone to misinterpretation if data not well quantified Quantify number of observations and types of clinicians (RN, MD) to minimize bias in conclusions Workflows varied by clinician (RN, MD), influencing the configuration of tools (unit-level vs team-level dashboard views)
Focus groups Many qualitative data, but emphasis often on a single user group or individual project tool, not system Ensure all types of users (patients, clinicians) are invited, encourage discussion about the system Bedside displays could be used on clinical units to deliver patient-specific information to patients and clinicians
Interviews Data often analyzed at project instead of systems level; prone to bias by specific participants Provide systems level context in interview guides; ensure broad representation of users Dashboards preferable to checklists for assessing safety threats during rounds on non-ICU units and services
Participatory user design sessions Session facilitators often had preconceived ideas of how tools could influence end users Train an impartial expert to conduct sessions and translate user input into specific design requirements In addition to a green-yellow-red color scheme, dashboard flags should accommodate color blind users
Usability testing Testing tools in context of competing institutional priorities Start in design phase; create test scenarios in development environment that are relevant Grouped safety dashboard columns to direct attention to most relevant domains by clinician type (RN vs MD)
Balancing institutional safety and quality priorities with end user requirements Use development environment to demonstrate how application logic addresses both institutional and end user requirements Partnering with clinical unit leads and quality/safety leaders ensured optimal alignment of priorities, generating buy-in during implementation
Managing end user expectations for desired functionality and bug-fixes with forthcoming application releases Reinforce that only true bugs in application logic will be fixed, NOT data entry errors (improperly entered orders; absent documentation) End users learned value of using structured fields for documentation in vendor EHR
Process mapping and workflow analysis Applying method to different settings; analyzing interaction of all tools individually and at the systems level Adapt use for different settings (ICU vs non-ICU), individual tools, and the integrated system Swim lane diagrams were most useful in a contained setting (closed ICU) for specific types of tool (safety dashboard)
Root cause analysis “Huddles” Identifying near misses and actual adverse events when they occur Align with organizational structures and hospital teams already in place to evaluate events Joined an infection control team conducting root cause analyses for a publicly reportable harm (CLABSI)
Use and compliance reporting Reconciling operational and research requirements for real-time feedback about tool use and process measures for units and clinical staff Align individual tool use data with hospital-defined process measures for key categories of harm (CAUTI, VTE, CLABSI, falls, etc.) Weekly use reports accelerated adoption of tools by creating end user competition; quarterly compliance reports may help demonstrate impact on process measures
Obtaining approvals to access sensitive administrative data owned by various stakeholders (MD vs RN) Reassure hospital administrative and clinical staff that data will be used transparently for clearly defined and approved purposes Exploring how administrative measures of unit risk (turnover, census, staffing ratios) impact tool use at the systems level could generate operational buy-in
Cognitive load and task analysis Determining appropriate measures of cognitive load for a complex system; applying appropriate methodologic rigor to minimize bias Administer a validated instrument (NASA-TLX) in a controlled environment for a complex task unique to a specific tool Comparing a highly complex, high-profile task (pain assessment) in safety dashboard vs EHR resonated in current climate (opioid crisis)
Failure mode and effects analysis (FMEA) Determining highest value of focus for FMEA (project vs. systems level); ensuring that each tool is equally represented Identify, prioritize, and address failure modes during implementation; determine whether failure mode require institutional policy changes Code status not routinely addressed by certain clinical services using safety dashboard, potentially leading to high-risk events (intubation, cardiac arrest)
Statistical process control charts Identifying potential uses based on anticipated availability of data Apply control charts to performance data during implementation and evaluation phases to improve intervention fidelity and outcomes Avoided misinterpretation of natural variation as improvements or deteriorations in tool use
Simulation modeling Determining types of questions that assess performance of all tools as an integrated whole Form cross-project team to focus on longitudinal experience of patients and clinicians who use the tools Tools may be used less when unit workload and net acuity is high, necessitating workload analysis to minimize time and cognitive burden

CAUTI: catheter-associated urinary tract infection; CLABSI: central line–associated bloodstream infection; EHR: electronic health record; ICU: intensive care unit; MD: doctor of medicine; RN: registered nurse; VTE: venous thromboembolic disease.
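The statistical process control approach noted in Table 2 (distinguishing natural variation in tool use from real improvements or deteriorations) can be sketched as a p-chart over weekly compliance data. The counts below are hypothetical, chosen only to show how a single out-of-limit week would be flagged.

```python
import math

# Hypothetical weekly counts of tool use among eligible patients;
# these numbers are illustrative, not data from the study.
used     = [38, 41, 36, 44, 40, 39, 58, 42]
eligible = [50, 52, 48, 55, 51, 50, 60, 53]

# Center line of the p-chart: pooled proportion across all weeks.
p_bar = sum(used) / sum(eligible)

def control_limits(n, p=p_bar, sigmas=3):
    """3-sigma control limits for a p-chart subgroup of size n."""
    half_width = sigmas * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - half_width), min(1.0, p + half_width)

signals = []  # weeks whose proportion falls outside the limits
for week, (x, n) in enumerate(zip(used, eligible), start=1):
    p = x / n
    lcl, ucl = control_limits(n)
    if p < lcl or p > ucl:
        signals.append(week)
    print(f"week {week}: p={p:.2f}, limits=[{lcl:.2f}, {ucl:.2f}]")

print("special-cause signals in weeks:", signals)
```

Points inside the limits are treated as common-cause variation; only out-of-limit weeks warrant investigation, which is how control charts avoided misinterpretation of natural variation in tool use.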

We believe the decentralized structure of our PSLL limited the ability of the SE and HF core team to immerse itself in individual project details (less so in early project phases, when SE personnel were embedded directly into each project), thereby contributing to how methods were ultimately used and perceived. Consequently, although the original intent was for the 3 tools to function as an integrated whole, their early design did not fully consider the larger system-of-systems. For example, patients could view their personalized safety plan on a bedside display and complete a fall-risk assessment via the patient portal, but this information was not linked to the safety dashboard for clinicians to view on rounds. In later phases, the SE and HF core applied failure mode and effects analysis, use and compliance reporting, and cognitive load and task analysis across projects to examine and improve the cohesiveness between each tool and foster more systems-level thinking. These experiences underscore both the value of and the barriers to centralized SE and HF expertise: the value lies in facilitating a consistent understanding of each method (eg, by conducting learning sessions), synchronizing component development, and applying appropriate integration oversight across projects.

More broadly, this case report illustrates how SE and HF approaches can assist with the introduction of digital health tools integrated with vendor EHRs to meet end user needs within sociotechnically complex healthcare environments.23 By engaging patients, clinicians, and institutional stakeholders, we believe our core SE and HF team was successful at contributing to patient-centered interventions to identify, assess, and mitigate preventable harm in the acute care setting, although certainly not without challenges.17 National organizations such as the President’s Council of Advisors on Science and Technology, National Academy of Engineering, Institute of Medicine, and AHRQ all have called for increased use of SE and HF methods to analyze, design, and improve healthcare processes including those impacting patient safety. Our experience adds to the evidence base of how this is accomplished in practice, which should be instructive for other institutions.

Our study has several limitations. First, the PSLL intervention was introduced at a single institution that recently had transitioned to a vendor-based EHR. Second, we evaluated only the subset of SE and HF methods considered applicable to the 3 projects; not all survey and focus group participants from each project were aware of the methods used to support the other projects. Third, the focus groups and surveys may have been subject to recall bias; however, our survey response rate was adequate even for members who participated during early years of the project.

To our knowledge, few efforts have adequately described the perceptions of using SE and HF methods to support introduction of novel, EHR-integrated digital health tools as a system-of-systems to mitigate preventable harm to hospitalized patients. We currently are completing a formal evaluation of the impact of our PSLL intervention on patient activation and hospital-acquired conditions for approximately 10 700 patients. We plan to enhance this infrastructure to proactively address other types of safety and quality risks, such as readmissions. Finally, our team is also exploring how to leverage this collaboration to address emerging safety threats to patients, such as diagnostic and therapeutic errors.30

FUNDING

This work was supported by a grant from AHRQ (P30-HS023535). Northeastern University researchers also were supported by a grant from Centers for Medicare & Medicaid Services (1C1CMS331050). The content is solely the responsibility of the authors and does not necessarily represent the official views of AHRQ or CMS. These funding agencies were not involved in the design and conduct of the study; collection, management, analysis, and interpretation of the data; or preparation, review, or approval of the manuscript. All authors had full access to the data. Dr Dalal takes responsibility for the integrity and accuracy of all data analysis.

AUTHOR CONTRIBUTIONS

All authors have contributed sufficiently and meaningfully to the conception, design, and conduct of the study; data acquisition, analysis, and interpretation; and/or drafting, editing, and revising the manuscript.

ACKNOWLEDGMENTS

We appreciate the efforts of numerous PSLL subproject investigators and research staff including: Jeffrey L Schnipper, Patricia C Dykes, Sarah A Collins, Kumiko O Schnock, Ronen Rozenblum, Lisa S Lehmann, Stuart R Lipsitz, Alexandra Businger, Eli Mlaver, Brittany Couture, Megan Duckworth, Jenzel Espares, Zachary Katsulis, Dominic Breuer, Demetri Lemonias, Liam Synan, and Jennifer Bajorek.

Conflict of interest statement. None declared.

REFERENCES

  • 1. Vincent C, Aylin P, Franklin BD, et al. Is health care getting safer? BMJ 2008; 337: a2426.. [DOI] [PubMed] [Google Scholar]
  • 2.Estimating the additional hospital inpatient cost and mortality associated with selected hospital-acquired conditions; 2017. https://www.ahrq.gov/professionals/quality-patient-safety/pfp/haccost2017-results.html. Accessed July 4, 2018.
  • 3. Alotaibi YK, Federico F.. The impact of health information technology on patient safety. SMJ 2017; 38(12): 1173–80. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4. Meeks DW, Smith MW, Taylor L, Sittig DF, Scott JM, Singh H. An analysis of electronic health record-related patient safety concerns. J Am Med Inform Assoc 2014; 21(6): 1053–9.
  • 5. Magrabi F, Liaw ST, Arachi D, Runciman W, Coiera E, Kidd MR. Identifying patient safety problems associated with information technology in general practice: an analysis of incident reports. BMJ Qual Saf 2016; 25(11): 870–80.
  • 6. Wright A, Hickman TT, McEvoy D, et al. Analysis of clinical decision support system malfunctions: a case series and survey. J Am Med Inform Assoc 2016; 23(6): 1068–76.
  • 7. Poon EG, Keohane CA, Yoon CS, et al. Effect of bar-code technology on the safety of medication administration. N Engl J Med 2010; 362(18): 1698–707.
  • 8. Meeks DW, Takian A, Sittig DF, Singh H, Barber N. Exploring the sociotechnical intersection of patient safety and electronic health record implementation. J Am Med Inform Assoc 2014; 21(e1): e28–34.
  • 9. Sittig DF, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care 2010; 19(Suppl 3): i68–74.
  • 10. Sittig DF, Ash JS, Singh H. The SAFER guides: empowering organizations to improve the safety and effectiveness of electronic health records. Am J Manag Care 2014; 20(5): 418–23.
  • 11. Yen PY, McAlearney AS, Sieck CJ, Hefner JL, Huerta TR. Health information technology (HIT) adaptation: refocusing on the journey to successful HIT implementation. JMIR Med Inform 2017; 5(3): e28.
  • 12. Schnipper JL, Hamann C, Ndumele CD, et al. Effect of an electronic medication reconciliation application and process redesign on potential adverse drug events: a cluster-randomized trial. Arch Intern Med 2009; 169(8): 771–80.
  • 13. Salanitro A, Kripalani S, Resnic J, et al. Rationale and design of the Multicenter Medication Reconciliation Quality Improvement Study (MARQUIS). BMC Health Serv Res 2013; 13(1): 230.
  • 14. Cresswell KM, Bates DW, Sheikh A. Ten key considerations for the successful optimization of large-scale health information technology. J Am Med Inform Assoc 2017; 24(1): 182–7.
  • 15. Cresswell KM, Bates DW, Sheikh A. Ten key considerations for the successful implementation and adoption of large-scale health information technology. J Am Med Inform Assoc 2013; 20(e1): e9–13.
  • 16. Carayon P, Wetterneck TB, Cartmill R, et al. Medication safety in two intensive care units of a community teaching hospital after electronic health record implementation: sociotechnical and human factors engineering considerations. J Patient Saf 2017 Feb 28 [E-pub ahead of print].
  • 17. Carayon P, Wood KE. Patient safety: the role of human factors and systems engineering. Stud Health Technol Inform 2010; 153: 23–46.
  • 18. Holden RJ, Carayon P, Gurses AP, et al. SEIPS 2.0: a human factors framework for studying and improving the work of healthcare professionals and patients. Ergonomics 2013; 56(11): 1669–86.
  • 19. Acher AW, LeCaire TJ, Hundt AS, et al. Using human factors and systems engineering to evaluate readmission after complex surgery. J Am Coll Surg 2015; 221(4): 810–20.
  • 20. Xie A, Carayon P. A systematic review of human factors and ergonomics (HFE)-based healthcare system redesign for quality of care and patient safety. Ergonomics 2015; 58(1): 33–49.
  • 21. Reid PP, Compton WD, Grossman JH, Fanjiang G, eds. Building a Better Delivery System: A New Engineering/Health Care Partnership. Washington, DC: National Academies Press; 2005.
  • 22. Saleem JJ, Russ AL, Sanderson P, Johnson TR, Zhang J, Sittig DF. Current challenges and opportunities for better integration of human factors research with development of clinical information systems. Yearb Med Inform 2009; 18(1): 48–58.
  • 23. Reid PP, Compton WD, Grossman JH, et al, eds. Building a Better Delivery System: A New Engineering/Health Care Partnership: Systems Engineering: Opportunities for Health Care. Washington, DC: National Academies Press; 2005. https://www.ncbi.nlm.nih.gov/books/NBK22864/. Accessed July 2018.
  • 24. Dykes PC, Duckworth M, Cunningham S, et al. Pilot testing Fall TIPS (Tailoring Interventions for Patient Safety): a patient-centered fall prevention toolkit. Jt Comm J Qual Patient Saf 2017; 43(8): 403–13.
  • 25. Leung WY, Adelman J, Bates DW, et al. Validating fall prevention icons to support patient-centered education. J Patient Saf 2017 Feb 22 [E-pub ahead of print].
  • 26. Mlaver E, Schnipper JL, Boxer RB, et al. User-centered collaborative design and development of an inpatient safety dashboard. Jt Comm J Qual Patient Saf 2017; 43(12): 676–85.
  • 27. Couture B, Lilley E, Chang F, et al. Applying user-centered design methods to the development of an mHealth application for use in the hospital setting by patients and care partners. Appl Clin Inform 2018; 9(2): 302–12.
  • 28. Collins SA, Couture B, Smith AD, et al. Mixed-methods evaluation of real-time safety reporting by hospitalized patients and their care partners: the MySafeCare application. J Patient Saf 2018 Apr 27 [E-pub ahead of print].
  • 29. Brown AR, Sorensen AC. Integrating creative practice and research in the digital media arts. In: Smith H, Dean R, eds. Practice-Led Research, Research-Led Practice in the Creative Arts. Edinburgh, UK: Edinburgh University Press; 2009: 153–65.
  • 30. Singh H, Graber ML. Improving diagnosis in health care–the next imperative for patient safety. N Engl J Med 2015; 373(26): 2493–5.

