Abstract
Conventional wisdom suggests that the “human factor” in critical care environments is the reason for inadequate medication and patient safety. Human factors (or human factors engineering, HFE) is the science and practice of improving human performance. Drawing on decades of HFE research, this paper evaluates a range of common beliefs about patient safety through a human factors lens.
This evaluation demonstrates that HFE provides a framework for understanding safety failures in critical care settings, offers insights into how to improve medication and patient safety, and reminds us that the “human factor” in critical care units is what allows these time-pressured, information-intense, mentally challenging, interruption-laden, and life-or-death environments to function so safely so much of the time.
Keywords: Medication errors, Human engineering, Intensive care, Safety
To improve medication and patient safety, we need to deal with the ‘human factor’
That quote – and similar ones from the critical care world, such as “We invested in this new medication safety technology for the ICU. Now if only we could fix the human factor so they'd use it correctly” – are all too common. In these quotes, the term “human factor” is used as a synonym for the “baggage” that people such as clinical staff bring with them to work. This baggage includes memory limitations, less than 100% compliance with safety rules, misuse of or resistance to technology, and generally less-than-perfect reliability and accuracy in actions and decisions. Unfortunately, this use of “human factor” puts the blame for patient safety problems squarely on the clinicians, which results in patient safety interventions that implore staff to pay more attention, try harder, stop screwing up, and follow all of the rules, when the focus should be on redesigning the system. That use of the phrase “human factors” is inconsistent with evidence from a scientific discipline and practice known as human factors (also human factors engineering, ergonomics, or HFE).1, 2
HFE is the science and practice of improving human performance3-6. HFE scientists and practitioners discover and apply information about human cognitive and physical abilities and limitations to the design of tools, machines, systems, tasks and environments for productive, accurate, safe, and effective human use5-9. HFE designs and interventions have led to better understandings of, and designs for, operator performance and safety in aviation, manufacturing, nuclear power, process control, surface transportation, rail, air traffic control, service, construction, agriculture, and even healthcare3, 10-31. The Institute of Medicine also called for more application of HFE science as a way to improve patient safety,32, 33 though HFE research in healthcare is actually decades old.34, 35 HFE can be thought of as providing the evidence-based design guidance to support human performance, especially in complex systems such as critical care units.
HFE research has demonstrated that performance, efficiency, quality, and safety are the result of the interaction between people and the system in which they work.4, 36 We say, therefore, that phenomena like patient safety or quality are emergent properties of a system. That is, patient safety problems, including human error and violations of safety protocols, are rarely the fault of the clinicians; rather, they emerge from the clinicians working with technologies (which may or may not be well designed), in a particular environment (which may be rushed or dark or filled with interruptions), doing particular tasks (which may require intense concentration), in a particular organization (whose culture may reward shortcuts). This understanding that system interactions produce safety or quality is also known as systems thinking, and solutions that seek to redesign systems are known as systems engineering solutions or systems engineering designs. According to HFE research evidence, improving human performance is not an issue of telling or forcing people to work harder, smarter, or with fewer errors. Instead, performance is improved by designing systems to support the physical and cognitive work of the clinicians. That may involve better designed tools, technologies, policies, tasks, environments, layouts, or teams. HFE scientists and professionals have developed tools, standards, guidelines, and principles for improving human performance (e.g. safety and quality)5, 37-40 that are increasingly being applied in healthcare. We argue that trend must continue for there to be patient safety improvements in critical care.
Decades of HFE research have yielded insights into how to improve human performance, and thus safety and productivity. These insights are often at odds with “common sense” beliefs that are not evidence based. With this in mind, the goal of this paper is to evaluate a range of common beliefs about patient safety through a human factors lens.
We would not have medication safety problems if people stopped making errors
This belief, if taken to its logical conclusion, suggests that to improve safety, one needs to get rid of errors. People, though, make errors. Therefore, as long as there are people (patients and providers) in critical care units, there will be errors. From an HFE perspective, efforts to improve patient safety that depend on requiring people to be infallible are misguided and wasteful. The reason goes beyond simply that “people make errors” – HFE research demonstrates that errors are often caused by poorly designed systems, which some have referred to as “design-induced” errors41 (see also the report by Fairbanks et al.15 for an excellent example of how poor design led experienced Emergency Medical Technicians to err). Therefore, if an error occurs, one should not ask “why did the person make the mistake,” but rather “what caused the mistake to occur?”
Consider an ICU that uses a computerized provider order entry (CPOE) system. A typical healthcare approach would focus on education to assure that physicians remember how to correctly order medications. And if the system allowed a physician to order an inappropriate medication without any feedback about what had just happened, and an error occurred, the physician would be blamed. An HFE approach would instead seek to design systems that make it harder to err in the first place and make errors that do happen more visible so that they can be quickly corrected.39, 42, 43 An example of this is a CPOE system that uses software logic (a forcing function) to prevent physicians from ordering out-of-range doses and warns the physician if an order for a particular drug might create an allergic response or a drug-drug interaction. Rather than depending on a potentially tired and distracted intensivist to not make a mistake, the forcing function prevents certain orders while the warning makes other potential errors visible so the physician can correct them immediately.
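The contrast between preventing an order and merely warning about it can be sketched in a few lines of simplified logic. The following Python fragment is a minimal illustration only; the drug names, dose limit, allergy record, and function names are hypothetical and do not represent any particular CPOE product.

```python
# Minimal sketch of a forcing function plus a warning in an order-entry workflow.
# Drug names, dose limits, allergy records, and function names are hypothetical.

DOSE_LIMITS_MG_PER_KG = {"gentamicin": 7.5}      # illustrative per-dose upper bound
PATIENT_ALLERGIES = {"pt-001": {"penicillin"}}   # illustrative allergy record

def review_order(patient_id, drug, dose_mg_per_kg):
    """Return (accepted, messages): block out-of-range doses, warn on a documented allergy."""
    messages = []

    # Forcing function: an out-of-range dose cannot be submitted at all.
    limit = DOSE_LIMITS_MG_PER_KG.get(drug)
    if limit is not None and dose_mg_per_kg > limit:
        messages.append(f"BLOCKED: {drug} {dose_mg_per_kg} mg/kg exceeds the {limit} mg/kg limit")
        return False, messages

    # Warning: make a potential error visible so the physician can correct it immediately.
    if drug in PATIENT_ALLERGIES.get(patient_id, set()):
        messages.append(f"WARNING: documented allergy to {drug}; confirm or change the order")

    return True, messages

print(review_order("pt-001", "gentamicin", 12))  # blocked by the forcing function
print(review_order("pt-001", "penicillin", 1))   # accepted, but with an allergy warning
```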
Another clinical example is the restriction of the availability of high-risk medications. One source of medication errors is the storage of look-alike vials, or medications of differing concentrations, in the same area. An incorrect look-alike medication can be given in error and potentially cause harm. A typical “person”-based approach to solving this problem would be to educate the staff to pay more attention to look-alike vials and to make the labels on the medications appear more distinct. The HFE or system-based approach would involve removing the high-risk medications to a different area, making it impossible to confuse medications or concentrations. In other words, by redesigning the system of medication storage and access, the opportunity for error ceases to exist. The HFE solutions in both examples exhibit the characteristic of using system redesign to reduce the risk of errors. This risk-reduction approach is consistent with epidemiological approaches that seek to identify and reduce risk factors for disease, and is a standard HFE and safety engineering approach.4, 44, 45
Other examples of solutions that focus on people instead of system design can be seen in the following “solutions”: warning labels; signs and posters exhorting staff to “be careful” and to “do this” or “do not do that”; repeated educational campaigns; and reliance on policies and procedures. None of these activities are bad per se and, in fact, they can be part of an effective safety effort. However, they are not sufficient to provide safety. In each of those “solutions”, the goal is to error-proof the people, despite the inherent propensity of people to err. No amount of rules, education, or warnings will prevent errors from occurring. In contrast, human factors or system-based improvement efforts apply principles of analysis and redesign to make it harder for errors that do occur to reach patients and, when errors do reach a patient, mitigate their harm.
One clinical illustration of applying analysis and redesign to eliminate errors and harm is found in the use of concentrated electrolytes, such as potassium.83 If a nurse inadvertently draws up potassium rather than another medication because of shared storage and “look-alike” labels, then he or she will be viewed as committing an error. Traditional solutions include warning labels or signs. However, identification and analysis of risk reveals that the problem is not the nurse but instead the storage of potassium alongside other medications, creating the potential for error. Redesign would then result in removal of the potassium to a separate, perhaps locked, storage area, eliminating the possibility of error completely. Clinical examples of forcing functions might involve the redesign of intravenous tubing connectors to prevent inadvertent connection of enteral feeds or non-invasive blood pressure tubing to intravenous lines.84 Similarly, the principle of analysis and redesign is readily seen in newer “smart infusion pump” technology, which essentially introduces constraints or forced limits on the magnitude of programming errors that can reach a patient. In each of these examples, there is an understanding of the work involved that is associated with a perceived risk, which then leads to a careful redesign to eliminate the risk underlying errors and harm.
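The “forced limits” behind smart infusion pumps can be sketched the same way. The fragment below assumes a common two-tier drug-library design – a soft limit that requires an explicit override and a hard limit that cannot be programmed at all; the drug, numeric limits, and function names are hypothetical illustrations, not any vendor's actual library.

```python
# Minimal sketch of constraints on infusion programming, assuming a two-tier
# soft/hard limit design. All names and numbers are hypothetical.

DRUG_LIBRARY = {
    "dopamine": {"soft_max": 20.0, "hard_max": 50.0},  # mcg/kg/min, illustrative values
}

def program_infusion(drug, rate_mcg_kg_min, override_confirmed=False):
    limits = DRUG_LIBRARY[drug]
    if rate_mcg_kg_min > limits["hard_max"]:
        # Hard limit: the magnitude of a programming error is capped outright.
        return f"REJECTED: {rate_mcg_kg_min} exceeds hard limit {limits['hard_max']}"
    if rate_mcg_kg_min > limits["soft_max"] and not override_confirmed:
        # Soft limit: the potential error is made visible and needs deliberate confirmation.
        return f"PAUSED: {rate_mcg_kg_min} exceeds soft limit {limits['soft_max']}; confirm to override"
    return f"RUNNING: {drug} at {rate_mcg_kg_min} mcg/kg/min"

print(program_infusion("dopamine", 35))                           # paused at the soft limit
print(program_infusion("dopamine", 35, override_confirmed=True))  # deliberate override
print(program_infusion("dopamine", 80))                           # rejected at the hard limit
```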
We would not have medication safety problems if people stopped violating the safety rules
There is a common belief that risk and harm to patients occur from violations of rules and policies. This perspective stems from the fact that policies are often created to direct care and prevent harm. As a result, it is thought that any violation must decrease safety. HFE research demonstrates, however, that violations are to some extent inevitable in a complex domain like critical care, where the exception is the rule46, 47. Rules are critical for safety, but rules assume a one-size (or few-sizes)-fits-all mentality. This logic may apply much of the time; however, critical care is a complex domain where multiple rules and goals may be in conflict at any time. As such, following one rule may mean violating another, where the rule followed enhances safety while the rule broken detracts from safety48, 49.
One example of enhancing safety through violations occurs when a patient requires an emergent medication that is not readily located in the patient's bedside medication box or the Intensive Care Unit (ICU) medication dispensing cabinet. The expected action is to wait for the medication to be sent from the central pharmacy. However, the nurse recognizes that the patient needs the medication immediately and that her other patient has the ordered medication in the correct dose in their bedside medication box. Giving a patient another patient's medication is a clear violation of most hospitals' medication policies. Despite this, she obtains the medication from the second patient and administers it, treating the critically ill patient. Then, when the medication finally comes from the pharmacy, the nurse replaces the “borrowed” medication.
Central to the issue of safety and rule violation is the question of whether the rule fits the actual work of clinical care. In the illustration, the intent of the rule is to prevent one patient from receiving another patient's medication, which may result in harm from incorrect dosing, drug-drug interaction, or an allergic response. Strictly speaking, the nurse has violated an important safety rule. At the same time, the nurse is faced with competing rules and goals: follow the rule not to borrow medications on the one hand and treat a patient with an emergent need on the other. The nurse in such a situation must decide which rule to follow based on patient need and safety considerations. In this case, the risk of violating one rule was weighed against the immediate need for medication. Was it the nurse's fault she did not have the needed medication at the time it was needed? No. But it was the nurse who was forced to adapt to this system design problem. This is another important lesson from HFE: when systems or technologies or rules or policies do not fit the situation encountered by the provider, as was the case in the example, it is the person who is forced to adapt and respond50. Therefore, if an error or accident happens in this situation, it cannot be said to be the person's fault; after all, they were just doing what needed to be done in the face of a system that did not support their needs. That is, HFE research demonstrates that more often than not, it is the people in these complex systems who provide resilience,51, 52 or the ability of the system to function safely despite the inherent complexity and the risks. It is the people after all, or the “human factor,” who provide judgment, creativity, problem solving, and context-sensitive solutions. In the case of critical care environments, where providers are always expected to go the extra distance to cope with technologies and systems that do not reliably deliver, efforts should be made to train providers how to respond and react when systems do not work as planned50.
From an HFE perspective, the concept of rules and violations has three important implications. First, before considering punishment of ICU physicians and providers for violating rules, it is important to understand whether the rule fit the clinical scenario, or whether the violation added to safety47. After all, a “violation” that everyone agrees is an improvement will be labeled a “best practice” and not a “violation.” Second, the creation of rules and policies must involve front-line staff who truly understand how care is provided under real circumstances. Finally, instead of devoting resources to refining rules that poorly fit complex clinical care, safety might better be achieved through thoughtfully redesigning the systems of care delivery.
The first step in redesigning systems of care delivery (as an alternative to focusing on rules) is to study the nature of work through the application of system analysis methods. We recommend in-depth observations, interviews, and other analytical techniques such as process mapping. But the key here is the expertise of the person doing the data collection and analysis. A trained human factors engineer observing the exact same clinical encounter as an industrial engineer or physician will see very different things and therefore record very different data. A team approach is highly recommended. The goal is not simply to understand the system of care, but also specifically to understand the nature of the work for the clinician – how can redesign help them perform better in light of the true complexities of their work? From those data, safety can be improved along the lines below, but we need to think beyond safety to improved performance. The hierarchies in Table 1 answer the question, “How do we enhance safety?” Human factors engineering also addresses others: “How do we improve performance?” “How do we make it easier to make the right decisions?” “How do we reduce memory burden?” “How do we ease data integration?” Those are also topics we need to be very concerned about in critical care.
Table 1.
Two Examples of Hierarchies of Hazard Control
Strength of Prevention | Actions to Improve Medication Safety85 | Occupational Health and Safety Management Systems86
---|---|---
Lower | Education/Training | Personal Protective Equipment
 | Policies/Rules | Administrative Controls
Moderate | Double Checks | Warnings
 | Standardization | Engineering Controls
Higher | Automation and Computerization | Substitute Less Hazardous Material/Process/Operation/Equipment
 | Forcing Functions/Constraints | Eliminate Hazard
Our ICU would be safer if I had a team who would do as I say
A recurring theme in patient safety is the need for improved teamwork and this belief applies to the ICU.53 However, Thomas et al. found that critical care physicians and nurses have different perceptions of what is meant by teamwork and how well their team functions.54 Whereas 73% of physicians rated collaboration and communication high or very high with team members, only 33% of nurses rated collaboration and communication high or very high. While the authors concluded that these differences might relate to training, gender and role-related culture, another explanation may relate to a poor understanding of the science of teams and team performance55-60. Team performance, like individual performance, is a research topic studied in HFE.
In critical care environments there are a variety of teams. There is the patient's care team that is comprised of the nurses, respiratory therapists, pharmacists, attending physicians, and perhaps trainees such as residents and fellows. There are also the within-discipline clinical teams, such as the nursing or physician team that cares for a given patient over shifts, days, and weeks. A nurse and his/her nursing assistant might also be a team, as might all of the nurses on a given shift in a given ICU. However, despite the fact that we might call each of those “teams,” HFE evidence demonstrates that this does not mean they function as teams.
Teams are “two or more individuals with specialized roles and responsibilities who must interact dynamically and interdependently and are organized hierarchically to achieve common goals and objectives.”61 But more than that, according to HFE evidence, high performing teams are those that have been trained to have, and have demonstrated proficiency in, specialized knowledge, skills, and attitudes that support teamwork62. For example, in high performing teams, all team members have the following knowledge: they share the same mental model of what needs to get done, they all know the team mission, and they all know each other's roles and expectations. Similarly, in high performing teams all team members have been trained and have demonstrated proficiency in the following skills: back-up behavior, team leadership, conflict resolution, and closed-loop communication, among others. However, few health care organizations train their staff to have that knowledge or those skills. In part, the lack of meaningful team training may reflect that there is no “one size fits all” primer for training teams. As with other processes discussed in this paper, to determine the specific steps or recommendations for training a team in healthcare, the first step should be to define and then understand the purpose of the team and the work that the team must perform to achieve that purpose. Then team composition must be considered. Only then can specifics of training be entertained, and these specifics will be contingent on the results of these initial steps. Until clinicians and administrators routinely apply this thoughtful and robust approach, HFE research suggests that there will not be high functioning teams in healthcare.63
Based on these concepts, it is reasonable to ask, “What is the ideal means to communicate within a team?” However, it is imperative that providers understand there is no “ideal” means of communication. There may be relative ideals that exist for specific types of work in specific contexts but even then there are multiple dependencies. These include the nature of the work, the available technology, the available team members and their respective training. In the same vein, questions regarding ideal team size or team composition do not have a standard answer; instead, the correct answer depends on the work being performed by the team and the context in which this work is being done.
Another twist on team composition is whether families or even patients should be included as part of the team. Continuing the theme identified above, asking whether teams should include families should be supplanted by asking, “what is the purpose of the team?” If the purpose of the team is providing care and the involvement of the family helps achieve this purpose through provision of important information or participating in critical decision-making, then involvement of the family would enhance team performance. That said, the next question is “how to effectively include a family member?” Drawing from the points on team communication, if the clinical team strictly speaks in medical language and the family member cannot understand the discussions, then the family member is essentially excluded from the team by virtue of the performance of the team.
For critical care providers who are faced with managing multidisciplinary rounds, resuscitating patients together, or managing mass-casualty situations, there is a rich HFE literature on team design and training that can improve performance. Applying the science of team performance will yield improved safety, not the expectation that a “team” blindly follows orders.59
Our ICU would be safer if there were less focus on workload/number of hours worked
The topic of work hours and work hour restrictions in healthcare has gained increasing attention over the last decade, particularly in light of the standards implemented by the Accreditation Council for Graduate Medical Education (ACGME).87 Rather than attempt to summarize the literature related to the effects of sleep deprivation and prolonged work hours, we highlight some additional lessons from the HFE literature that are germane to issues of workload. First, workload is multidimensional.88, 89 Unit-level workload can be viewed as staffing ratios on a nursing unit (or duty assignments for trainees). However, two other important types of workload exist. Job-level workload “refers to general and specific demands of the job, including the general amount of work to be done in the day, the difficulty of the work, and the amount of concentration or attention required to do it.”88 The third type of workload can be thought of as task-level: those resources and demands associated with a specific task. In the ICU, this can be understood in the context of placing a central venous line. Task-level workload encompasses the concentration required by the clinician to place the line in the face of competing demands for concentration, as well as the training of the clinician, their cognitive capacity, and the available resources such as monitoring technology and staff to assist with the line placement.
The concept of multidimensional workload is essential to understanding the issues of work hour and workload restriction. First, efforts to reduce one type of workload will likely have implications for the other types of workload. That is, reducing the unit-level workload by shortening shifts without increasing the number of staff may unintentionally (but predictably) increase the job-level and task-level workload for the clinicians remaining in the work environment. While the number of hours worked definitely is an issue and can lead to fatigue and then mistakes, even within a reasonable number of hours the workload can be excessive, exceeding the capacity of well-intentioned clinicians and leading to errors and workarounds. Workload is a matter of design and a choice by an organization. Sadly, if someone is identified as having made a mistake because of excessive workload, it is unlikely that “workload” will be blamed.
Medication delivery would be much safer if we only had [blank] technology
ICUs are technology-rich environments. Not surprisingly, there is a perception that additional technologies may enhance safety. Specific technologies credited with improving safety include electronic health records (EHRs), clinical decision support (CDS), computerized provider order entry (CPOE), bar-coded medication administration (BCMA), and “smart” infusion pumps. These technologies have been linked to reductions in errors, even though there is little evidence that they reduce harm to patients. There is also evidence that these technologies can introduce new types of errors, violations, and harm64-69.
While it might seem paradoxical that technologies designed to improve safety can actually lead to errors and violations, it is not paradoxical when seen through an HFE lens. For instance, CPOE does not exist in a vacuum within the ICU. Instead, people (physicians, nurses, pharmacists) must use the CPOE system to perform tasks (ordering, modifying and managing medications) within a busy and often distracting ICU environment. Independent of whether the CPOE system works as intended, the interactions between the technology and the people, tasks, and environment, not to mention how the technology was implemented and supported, will ultimately determine whether the CPOE improves or sometimes worsens medication safety70. Health information technologies intended to improve safety may have usability problems71-77 that increase the likelihood of user errors, provide misleading feedback, generate high rates of false alarms, or create difficulties in interpreting data. Such usability problems can lead to “design-induced” errors.41
As with team training, there is a rich body of non-healthcare literature and a growing body of healthcare-specific literature that can guide the design, selection and implementation of technologies to yield the best results3, 70, 78-80. Without leveraging this knowledge, ICU providers risk the unintended but foreseeable consequences of suboptimal technology adoption.
Things would be safer if we could standardize everything in the ICU
Standardization, like technology, has been identified as a potential solution for both medication and patient safety. However, HFE research suggests that this perspective is overly simplistic and potentially hazardous, especially in complex environments like ICUs. It is clear that standardization of processes, such as the central line insertion bundle, reduces unwanted and potentially dangerous variation. What is not as obvious is the need to standardize the right processes to the right standard.
For core processes such as preparing and dispensing a medication, handoff communications or placing a central venous line, standardization will reduce unwanted variation and potentially reduce waste while improving quality and safety. At the same time, the practice of standardization can be overused. Standardizing the ordering, dispensing and administration processes of aminoglycosides in septic ICU patients would be beneficial; standardizing to a single dose of aminoglycosides regardless of patient gender, age, weight or renal function would be potentially dangerous. The distinction between standardizing processes of care and “one size fits all” will likely be more important with the emerging science of pharmacogenetics and individualized medicine. The key HFE point on standardization is this: if the standardization of a process will support the needs of the ICU providers in all or nearly all cases, then use standardization, but allow exceptions for the few cases that do not apply. If on the other hand standardization will only support the needs of the providers some of the time, then standardization may be problematic. After all, if a standardized process does not fit many typical situations, then standardizing will simply create more “violators.”
I never make that mistake so I don't need a checklist
A specific type of standardization that has gained greater visibility in the ICU setting is the use of checklists. Physicians may be asked to use a checklist for placing a central venous line, ICU nurses use checklists to assure that the resuscitation cart is prepared for the next emergency, and ICU pharmacists may use a checklist while preparing a group of medications necessary for cannulation for extra-corporeal life support. Despite this, it is possible to hear resistance to checklists voiced through comments such as “We would never make that mistake so we don't need a checklist” and “We don't want to be forced into cookbook practice or cookbook medicine.” These sentiments stem from a misunderstanding of checklists. From an HFE perspective, checklists are not an attempt to force “cookbook” care. Instead, they are an effective solution to the limitations of human memory and the time-pressured and interruptive nature of critical care environments. For instance, research has identified that omissions in a sequence of events occur at a rate of 1 per 100.81 This failure rate is multiplied 3 times by poor procedures, 6 times by information overload, 10 times by poor communication, and 11 times by shortage of time.82 These data suggest that reliance on memory for safety is fraught with risk.
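To put those figures in perspective, the short calculation below scales the quoted baseline omission rate by each reported multiplier. Treating the factors independently, one at a time, is a simplifying assumption made purely for illustration.

```python
# Back-of-the-envelope illustration of the figures quoted above: a baseline omission
# rate of 1 per 100 steps, scaled by the cited performance-shaping factors.

BASE_OMISSION_RATE = 1 / 100

MULTIPLIERS = {
    "poor procedures": 3,
    "information overload": 6,
    "poor communication": 10,
    "shortage of time": 11,
}

for factor, multiplier in MULTIPLIERS.items():
    rate = BASE_OMISSION_RATE * multiplier
    print(f"{factor}: about {rate:.0%} of steps omitted (roughly 1 in {round(1 / rate)})")
```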
Checklists, whether for placing a central venous line, decreasing risk factors for ventilator-associated pneumonia, communicating transfers of patient care, or initiating hemodialysis, are simply tools to minimize overlooking essential information or process steps. Rather than being viewed as an unnecessary crutch, they should be viewed as tools to assure good outcomes. They free clinicians to focus on care instead of trying to remember what was done or what still needs to be done.
Conclusions
Human factors, or human factors engineering, is a science that is invaluable to improving the safety of critically ill patients. First, it provides a framework to understand why things do not go as planned or desired without resorting to laying blame with the many providers working in ICUs. Second, HFE offers insights into how both medication safety and the larger issue of patient safety might be improved. By understanding that critical care providers are people with strengths and limitations who interact with a system composed of tools and technology, tasks, an environment, and organizations, efforts can be redirected to redesign systems to reduce unnecessary risk and harm in the ICU. Finally, the science of human factors reminds us that the “human factor” in critical care units is what allows these time-pressured, information-intense, mentally challenging, interruption-laden, and life-or-death environments to function so safely so much of the time.
Acknowledgments
Special thanks to Richard Holden, PhD, for his assistance in preparing this manuscript.
Financial support for the preparation of this manuscript came from AHRQ 1 R01 HS013610 and NIH-NLM 1R01LM008923-01A1.
Footnotes
The authors have not disclosed any potential conflicts of interest.
No reprints to be ordered.
Contributor Information
Matthew C. Scanlon, Department of Pediatrics, Critical Care, Medical College of Wisconsin.
Ben-Tzion Karsh, Department of Industrial and Systems Engineering, University of Wisconsin.
References
- 1.Human Factors and Ergonomics Society. www.hfes.org.
- 2.International Ergonomics Association. www.iea.cc.
- 3.Carayon P, editor. Handbook of Human Factors and Ergonomics in Healthcare and Patient Safety. Lawrence Erlbaum Associates; 2006. [Google Scholar]
- 4.Karsh BT, Holden RJ, Alper SJ, Or CKL. A human factors engineering paradigm for patient safety: designing to support the performance of the healthcare professional. Quality & Safety in Health Care. 2006 Dec;15:I59–I65. doi: 10.1136/qshc.2005.015974. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5.Salvendy G, editor. Handbook of Human Factors and Ergonomics. 2nd. John Wiley and Sons; 1997. [Google Scholar]
- 6.Wickens C, Lee J, Liu Y, Becker S. An Introduction to Human Factors Engineering. Second. Upper Saddle River: Pearson Prentice Hall; 2004. [Google Scholar]
- 7.Eastman Kodak Company. Ergonomic Design for People at Work. 2nd. Hoboken, NJ: John Wiley and Sons; 2004. [Google Scholar]
- 8.Kroemer KHE, Kroemer HB, Kroemer-Elbert KE. Ergonomics: How to Design for Ease and Efficiency. 2nd. Prentice Hall; 2001. [Google Scholar]
- 9.Sanders MS, McCormick EJ. Human Factors in Engineering and Design. 7th. New York: McGraw-Hill Science; 1993. [Google Scholar]
- 10.Carayon P, Gurses AP. A human factors engineering conceptual framework of nursing workload and patient safety in intensive care units. Intensive & Critical Care Nursing. 2005;21:284–301. doi: 10.1016/j.iccn.2004.12.003. [DOI] [PubMed] [Google Scholar]
- 11.Carayon P, Schultz K, Hundt AS. Wrong site surgery in outpatient settings: The case for a human factors system analysis of the outpatient surgery process. Joint Commission Journal on Quality and Safety. 2004;20(7):405–410. doi: 10.1016/s1549-3741(04)30046-8. [DOI] [PubMed] [Google Scholar]
- 12.Carayon P, Wetterneck TB, Hundt AS, et al. Evaluation of nurse interaction with bar code medication administration technology in the work environment. Journal of Patient Safety. 2007;3(1):34–42. [Google Scholar]
- 13.Cook R, Rasmussen J. “Going solid”: a model of system dynamics and consequences for patient safety. Quality and Safety in Health Care. 2005;14:130–134. doi: 10.1136/qshc.2003.009530. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Cook RI. Resilience and resilience engineering for health care. Annals of Clinical and Laboratory Science. 2006 Spr;36(2):232–232. [Google Scholar]
- 15.Fairbanks RJ, Caplan SH, Bishop PA, Marks AM, Shah MN. Usability study of two common defibrillators reveal hazards. Annals of Emergency Medicine. 2007;50(4):424–432. doi: 10.1016/j.annemergmed.2007.03.029. [DOI] [PubMed] [Google Scholar]
- 16.Gurses AP, Carayon P. Performance obstacles of intensive care nurses. Nursing Research. 2007;56:185–194. doi: 10.1097/01.NNR.0000270028.75112.00. [DOI] [PubMed] [Google Scholar]
- 17.Patterson ES, Cook RI, Render ML. Improving patient safety by identifying side effects from introducing bar coding in medication administration. Journal of the American Medical Informatics Association. 2002 Sep-Oct;9(5):540–553. doi: 10.1197/jamia.M1061. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Saleem JJ, Patterson ES, Militello L, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. Journal of the American Medical Informatics Association. 2007;14:632–640. doi: 10.1197/jamia.M2163. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Scanlon M. Computer Physician Order Entry and the Real World: We're Just Humans. Joint Commission Journal on Quality and Safety. 2004;30(6):342–346. doi: 10.1016/s1549-3741(04)30039-0. [DOI] [PubMed] [Google Scholar]
- 20.Scanlon MC, Karsh B, Densmore E. Human Factors and Pediatric Patient Safety. Pediatric Clinics of North America. 2006;53:1105–1119. doi: 10.1016/j.pcl.2006.09.012. [DOI] [PubMed] [Google Scholar]
- 21.Scanlon MC, Weigle C, Karsh B, Alper SJ. Misperceptions of pediatric nursing actions as violations rather than compensation for bar coding technology. Paper presented at: 2nd International Conference on Healthcare Systems Ergonomics and Patient Safety; 2008; Strasbourg, France. [Google Scholar]
- 22.Wears RL, Cook RI, Perry SJ. Automation, interaction, complexity, and failure: A case study. Reliability Engineering & System Safety. 2006;91(12):1494–1501. [Google Scholar]
- 23.Wears RL, Perry SJ. Human factors and ergonomics in the emergency department. Annals of Emergency Medicine. 2002;40(2):206–212. doi: 10.1067/mem.2002.124900. [DOI] [PubMed] [Google Scholar]
- 24.Zhang JJ. Human-centered computing in health information systems Part 2: Evaluation. Journal of Biomedical Informatics. 2005 Jun;38(3):173–175. doi: 10.1016/j.jbi.2004.12.005. [DOI] [PubMed] [Google Scholar]
- 25.Zhang JJ. Human-centered computing in health information systems Part 1: Analysis and design. Journal of Biomedical Informatics. 2005 Feb;38(1):1–3. doi: 10.1016/j.jbi.2004.12.002. [DOI] [PubMed] [Google Scholar]
- 26.Koppel R, Wetterneck T, Telles JL, Karsh BT. Workarounds to barcode medication administration systems: Their occurrences, causes, and threats to patient safety. Journal of the American Medical Informatics Association. 2008 Jul-Aug;15(4):408–423. doi: 10.1197/jamia.M2616. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Karsh B, Scanlon M. When is a defibrillator not a defibrillator? When it is like a clock radio…. The challenge of usability and patient safety in the real world. Annals of Emergency Medicine. 2007;50:433–435. doi: 10.1016/j.annemergmed.2007.06.481. [DOI] [PubMed] [Google Scholar]
- 28.Hallock ML, Alper SJ, Karsh B. A macroergonomic work system analysis of the diagnostic testing process in an outpatient health care facility for process improvement and patient safety. Ergonomics. 2006;49(5-6):544–566. doi: 10.1080/00140130600568832. [DOI] [PubMed] [Google Scholar]
- 29.Or KL, Valdez RS, Casper GR, Brennan PF, Carayon P, Karsh B. Human factors, ergonomics, and health information technology in home care – a perspective from work system analysis. WORK. doi: 10.3233/WOR-2009-0867. In Press. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 30.Holden RJ, Scanlon MC, Patel NR, et al. A human factors framework and study of the effect of nursing workload on patient safety and employee quality of working life. Quality and Safety in Healthcare. doi: 10.1136/bmjqs.2008.028381. In Press. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.Rivera AJ, Karsh B. Interruptions and Distractions in Healthcare: Review and Reappraisal. Quality and Safety in Healthcare. doi: 10.1136/qshc.2009.033282. In Press. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Institute of Medicine. To Err is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000. [Google Scholar]
- 33.National Academy of Engineering and Institute of Medicine. Building a Better Delivery System: A New Engineering/Health Care Partnership. Washington DC: National Academy Press; 2005. [PubMed] [Google Scholar]
- 34.Safren M, Chapanis A. A critical incident study of hospital medication errors. Hospitals. 1960;34:53–68. [PubMed] [Google Scholar]
- 35.Chapanis A, Safren M. Of misses and medicines. Journal of Chronic Diseases. 1960;12(4):403–408. doi: 10.1016/0021-9681(60)90065-5. [DOI] [PubMed] [Google Scholar]
- 36.Carayon P, Hundt AS, Karsh BT, et al. Work system design for patient safety: the SEIPS model. Quality & Safety in Health Care. 2006 Dec;15:I50–I58. doi: 10.1136/qshc.2005.015842. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Federal Aviation Administration. Human Factors Workbench. [March 9, 2009]. http://www.hf.faa.gov/Portal/default.aspx.
- 38.Karwowski W. Handbook of Standards and Guidelines in Ergonomics and Human Factors. Lawrence Erlbaum Associates; 2005. [Google Scholar]
- 39.Nielsen J. Usability Engineering. Boston: Academic Press; 1993. [Google Scholar]
- 40.Stanton NA, Salmon PM, Walker GH, Baber C, Jenkins DP. Human Factors Methods. Ashgate; 2005. [Google Scholar]
- 41.Sawyer D. Do it by design: an introduction to human factors in medical devices. [May 17, 2007]. http://www.fda.gov/cdrh/humanfactors/
- 42.Norman DA. The Design of Everyday Things. New York: Basic Books; 1988. [Google Scholar]
- 43.Norman DA, Draper SW, editors. User Centered System Design: New Perspectives on Human-computer Interaction. CRC; 1986. [Google Scholar]
- 44.Smith MJ, Carayon P, Karsh B. Design for occupational health and safety. In: Salvendy G, editor. Handbook of Industrial Engineering: Technology and Operations Management. 3rd. New York: John Wiley and Sons; 2001. pp. 1156–1191. [Google Scholar]
- 45.Smith MJ, Karsh B, Carayon P, Conway FT. Controlling occupational safety and health hazards. In: Quick JC, Tetrick LE, editors. Handbook of Occupational Health Psychology. Washington DC: American Psychological Association; 2003. pp. 35–68. [Google Scholar]
- 46.Amalberti R, Vincent C, Auroy Y, Maurice GdS. Violations and migrations in health care: a framework for understanding and management. Quality and Safety in Health Care. 2006;15:66–71. doi: 10.1136/qshc.2005.015982. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 47.Alper S, Karsh B. A systematic review of the causes of safety violations in industry. Accident Analysis and Prevention. 2009;41(4):739–754. doi: 10.1016/j.aap.2009.03.013. [DOI] [PubMed] [Google Scholar]
- 48.Alper SJ, Karsh B, Holden RJ, Scanlon MC, Patel N, Kaushal R. Protocol violations during medication administration in pediatrics. Paper presented at: Human Factors and Ergonomics Society 50th Annual Meeting; 2006; San Francisco. [Google Scholar]
- 49.Alper SJ, Scanlon MC, Murkowski K, Patel N, Kaushal R, Karsh B. Routine and situational violations during medication administration. Paper presented at: 9th International Symposium on Human Factors in Organizational Design and Management; 2008; Guarujá, São Paulo, Brazil. [Google Scholar]
- 50.Woods DD, Cook RI. Nine Steps to Move Forward from Error. Cognition, Technology & Work. 2002;4:137–144. [Google Scholar]
- 51.Hollnagel E, Woods DD. Joint Cognitive Systems: Foundations of Cognitive Systems Engineering. New York: CRC Press; 2005. [Google Scholar]
- 52.Hollnagel E, Woods DD, Leveson N, editors. Resilience Engineering: Concepts and Precepts. Ashgate; 2006. [Google Scholar]
- 53.Despins LA. Patient Safety and Collaboration of the Intensive Care Unit Team. Critical Care Nurse. 2009 Apr;29(2):85–91. doi: 10.4037/ccn2009281. [DOI] [PubMed] [Google Scholar]
- 54.Thomas EJ, Sexton JB, Helmreich RL. Discrepant attitudes about teamwork among critical care nurses and physicians. Critical Care Medicine. 2003 Mar;31(3):956–959. doi: 10.1097/01.CCM.0000056183.89175.76. [DOI] [PubMed] [Google Scholar]
- 55.Burke CS, Salas E, Wilson-Donnelly K, Priest H. How to turn a team of experts into an expert medical team: guidance from the aviation and military communities. Quality & Safety in Health Care. 2004 Oct;13:I96–I104. doi: 10.1136/qshc.2004.009829. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56.Burke CS, Stagl KC, Klein C, Goodwin GF, Salas E, Halpin SA. What type of leadership behaviors are functional in teams? A meta-analysis. Leadership Quarterly. 2006 Jun;17(3):288–307. [Google Scholar]
- 57.Burke CS, Stagl KC, Salas E, Pierce L, Kendall D. Understanding team adaptation: A conceptual analysis and model. Journal of Applied Psychology. 2006 Nov;91(6):1189–1207. doi: 10.1037/0021-9010.91.6.1189. [DOI] [PubMed] [Google Scholar]
- 58.Salas E, Baker D, King H, Battles J, Barach P. On teams, organizations and safety: Of course…. Joint Commission Journal on Quality and Patient Safety. 2006;32:112–113. [Google Scholar]
- 59.Salas E, Cannon-Bowers JA. The science of training: A decade of progress. Annual Review of Psychology. 2001;52:471–499. doi: 10.1146/annurev.psych.52.1.471. [DOI] [PubMed] [Google Scholar]
- 60.Salas E, Rosen MA, Burke CS, Nicholson D, Howse WR. Markers for enhancing team cognition in complex environments: The power of team performance diagnosis. Aviation Space and Environmental Medicine. 2007 May;78(5):B77–B85. [PubMed] [Google Scholar]
- 61.Salas E, Dickenson TL, Converse SA, Tannebaum SI. Toward an understanding of team performance and training. In: Swezey RJ, Salas E, editors. Teams: their training and performance. Norwood NJ: Ablex; 1992. pp. 3–29. [Google Scholar]
- 62.Salas E, Almeida SA, Salisbury M, et al. What are the critical success factors for team training in health care? The Joint Commission Journal on Quality and Patient Safety. 2009;35(8):398–405. doi: 10.1016/s1553-7250(09)35056-4. [DOI] [PubMed] [Google Scholar]
- 63.Salas E, Wilson KA, Murphy CE, King H, Baker D. What crew resource management training will not do for patient safety unless…. Journal of Patient Safety. 2007;3(2):62–64. [Google Scholar]
- 64.Koppel R, Metlay JP, Cohen A, et al. Role of computerized physician order entry systems in facilitating medication errors. Journal of the American Medical Association. 2005;293(10):1197–1203. doi: 10.1001/jama.293.10.1197. [DOI] [PubMed] [Google Scholar]
- 65.Nebeker JR, Hoffman JM, Weir CR, Bennett CL, Hurdle JF. High rates of adverse drug events in a highly computerized hospital. Archives of Internal Medicine. 2005 May;165(10):1111–1116. doi: 10.1001/archinte.165.10.1111. [DOI] [PubMed] [Google Scholar]
- 66.Han YY, Carcillo JA, Venkataraman ST, et al. Unexpected increased mortality after implementation of a commercially sold computerized physician order entry system. Pediatrics. 2005 Dec;116(6):1506–1512. doi: 10.1542/peds.2005-1287. [DOI] [PubMed] [Google Scholar]
- 67.Thompson DA, Duling L, Holzmueller CG, et al. Computerized physician order entry, a factor in medication errors: descriptive analysis of events in the intensive care unit safety reporting system. Journal of Clinical Outcomes Management. 2005;12(8):407–412. [Google Scholar]
- 68.van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of drug safety alerts in computerized physician order entry. Journal of the American Medical Informatics Association. 2006;13(2):138–147. doi: 10.1197/jamia.M1809. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 69.van Onzenoort HA, van de Plas A, Kessels AG, Veldhorst-Janssen NM, van der Kuy PHM, Neef C. Factors influencing bar-code verification by nurses during medication administration in a Dutch hospital. American Journal of Health-System Pharmacy. 2008;65(7):644–648. doi: 10.2146/ajhp070368. [DOI] [PubMed] [Google Scholar]
- 70.Karsh B. Clinical practice improvement and redesign: how change in workflow can be supported by clinical decision support. Rockville, Maryland: Agency for Healthcare Research and Quality; 2009. AHRQ Publication No. 09-0054-EF. [Google Scholar]
- 71.Kushniruk A. Evaluation in the design of health information systems: application of approaches emerging from usability engineering. Computers in Biology and Medicine. 2002 May;32(3):141–149. doi: 10.1016/s0010-4825(02)00011-2. [DOI] [PubMed] [Google Scholar]
- 72.Kushniruk AW, Patel VL, Cimino JJ. Usability testing in medical informatics: Cognitive approaches to evaluation of information systems and user interfaces. Journal of the American Medical Informatics Association. 1997:218–222. [PMC free article] [PubMed] [Google Scholar]
- 73.Nielsen J. Usability Engineering. San Francisco: Morgan Kaufmann; 1994. [Google Scholar]
- 74.Kaufman DR, Patel VL, Hilliman C, et al. Usability in the real world: assessing medical information technologies in patients' homes. Journal of Biomedical Informatics. 2003 Feb-Apr;36(1-2):45–60. doi: 10.1016/s1532-0464(03)00056-x. [DOI] [PubMed] [Google Scholar]
- 75.Beuscart-Zephir MC, Menu H, Evrard F, Guerlinger S, Watbled L, Anceaux F. Multidimensional evaluation of a Clinical Information System for anaesthesiology: quality management, usability, and performances. Stud Health Technol Inform. 2003;95:649–654. [PubMed] [Google Scholar]
- 76.Cimino JJ, Patel VL, Kushniruk AW. Studying the human-computer-terminology interface. Journal of the American Medical Informatics Association. 2001;8(2):163–173. doi: 10.1136/jamia.2001.0080163. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 77.Patel VL, Kaufman DR, Arocha JA, Kushniruk AW. Bridging theory and practice: cognitive science and medical informatics. Medinfo. 1995;8(Pt 2):1278–1282. [PubMed] [Google Scholar]
- 78.Weinger M, Gardner-Bonneau D, Wiklund ME, editors. Handbook of Human Factors in Medical Device Design. CRC Press; 2009. [Google Scholar]
- 79.Karsh BT. Beyond usability: designing effective technology implementation systems to promote patient safety. Quality & Safety in Health Care. 2004 Oct;13(5):388–394. doi: 10.1136/qshc.2004.010322. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 80.Gosbee J. Human factors engineering and patient safety. Quality & Safety in Health Care. 2002 Dec;11(4):352–354. doi: 10.1136/qhc.11.4.352. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 81.Park KC. Human error. In: Salvendy G, editor. Handbook of human factors and ergonomics. New York: Wiley; 1997. pp. 150–173. [Google Scholar]
- 82.Reason JT. Understanding adverse events: The human factor. In: Vincent C, editor. Clinical risk management. 2nd. London, UK: BMJ Books; 2001. [Google Scholar]
- 83.Scanlon MC, Karsh B, Saran KA. Risk-based safety metrics. In: Henricksen K, Battles JB, Keyes MA, Grady ML, editors. Agency for Healthcare Research and Quality's Advances in Patient Safety: New Directions and Alternative Approaches. 1. Assessment. Rockville, MD: Agency for Research and Quality; 2008. Aug, AHRQ Publication No. 08-0034-1. 2008. [Google Scholar]
- 84.The Joint Commission. Sentinel Event Alert: Tubing misconnections- a persistent and potentially deadly occurrence. Apr 3, 2006. http://www.jointcommission.org/SentinelEvents/SentinelEventAlert/sea_36.htm. [PubMed]
- 85.Institute for Safe Medication Practice. Medication Safety Alert! Jun 2, 1999. http://www.ismp.org/newsletters/acutecare/archives/Jun99.asp.
- 86.Manuele FA. ANSI/AIHA Z10-2005-The new benchmark for safety management systems. Professional Safety. 2006 Feb;:25–33. [Google Scholar]
- 87.Accreditation Council for Graduate Medical Education. ACGME Duty Hours Standards Fact Sheet. http://www.acgme.org/acWebsite/newsRoom/newsRm_dutyHours.asp.
- 88.Holden RJ, Scanlon MC, Patel NR, Kaushal R, Escoto KH, et al. A human factors framework and study of the effect of nursing workload on patient safety and employee quality of working life. Quality & Safety in Health Care. doi: 10.1136/bmjqs.2008.028381. In Press. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 89.Holden RJ, Patel NR, Scanlon MC, Shalaby TM, Arnold JM, Karsh B. Effects of mental demands during dispensing on perceived medication safety and employee well being: A study of workload in pediatric hospital pharmacies. Research in Social & Administrative Pharmacy. doi: 10.1016/j.sapharm.2009.10.001. In Press. [DOI] [PMC free article] [PubMed] [Google Scholar]