Abstract
Poor design of elements in a healthcare system produces the latent conditions which result in patient safety incidents. A better understanding of these elements and of specific healthcare design challenges will lead to improved patient safety.
Keywords: design, patient safety, human error
What can you tell about a family whose video recorder clock is not flashing? They have a teenage child.
An old joke, but one with a serious point: research shows that 95% of people do not use 90% of the features on their video recorders because they are too complicated.1 Similarly, a recent thesis by Elke den Ouden at the Technical University of Eindhoven found that half of all “faulty” goods returned to stores by consumers are in full working order; the customers simply did not know how to use them. Such product complaints and returns are often caused by poor design, yet companies tend to dismiss them as “nuisance calls”.
You do not have to be a design guru to spot that this is down to bad human/machine interface design: design that fails to understand the end users and how they intend to use the product, and that, instead of providing the desired service, provides only an infuriating barrier.
If it is this difficult to operate consumer products, what must it be like to operate a series of complex pieces of medical equipment? The risk of a badly operated video recorder is missing your favourite television programme, which is bad but not exactly life threatening, unlike the risks associated with operating a medical device incorrectly. This paper discusses some of the issues surrounding patient safety and how, by viewing the problems as systemic rather than putting them down to user error, poor design can be revealed as a latent condition contributing to many patient safety incidents.
Design in health care
Because of the obvious risks associated with healthcare provision, you would assume that medical devices, whether low tech such as tongue depressors or high tech such as life support equipment, are created by the top design minds in the country. You would assume that they go through rigorous ergonomic and usability studies, that the end users and patients are heavily involved in their creation, development and testing, and that information architects and human factors experts scrutinise every detail, every button, switch and dial to ensure they are well designed, well laid out, accident proof, intuitive and compliant with a set of nationally adopted regulatory standards. Unfortunately, this does not appear to be the case. You do not have to spend much time in a laboratory, clinic, ward, or operating theatre to form the opinion that the design and usability of the medical devices therein are, by and large, pretty poor. That is not to say that they do not function correctly and within set and agreed tolerances, or that they are poorly constructed; rather, the human/machine interface is poorly considered for the intended end user.
A joint publication by the Department of Health and the Design Council in Britain highlighted this issue when it concluded that “the NHS [National Health Service] is seriously out of step with modern thinking and practice with regard to design … a direct consequence of this has been a significant incidence of avoidable risk and error”.2 These findings will not come as a shock to many working in health care, especially at the Medicines and Healthcare products Regulatory Agency (MHRA), an amalgamation of the Medical Devices Agency (MDA) and the Medicines Control Agency (MCA). It recently carried out an extensive review of infusion devices, which are used widely across NHS acute and community settings by nurses, medical staff, and specific patient groups. Of the 6770 infusion/transfusion related incident reports for the period 1990–2000, 19% were attributed to user error and in 53% of cases “no cause” was established; in other words, the device was found to be in full working order and the error was assumed to be some combination of user and system error (MDA accounts for the year ended 31 March 2003).
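To put those percentages in context, the short sketch below converts them into approximate report counts. It is purely illustrative arithmetic: the 6770 total and the 19%/53% proportions are those quoted above, while the rounding to whole reports and the residual “other” category are assumptions of this sketch.

```python
# Illustrative arithmetic only: converts the MDA review percentages quoted
# above into approximate report counts. The 6770 total and the 19%/53%
# proportions come from the text; the rounding and the residual "other"
# category are assumptions of this sketch.
TOTAL_REPORTS = 6770  # infusion/transfusion incident reports, 1990-2000

user_error = round(0.19 * TOTAL_REPORTS)  # attributed to user error
no_cause = round(0.53 * TOTAL_REPORTS)    # device in working order, "no cause" established
other = TOTAL_REPORTS - user_error - no_cause

print(f"user error: {user_error}, no cause: {no_cause}, other: {other}")
# -> user error: 1286, no cause: 3588, other: 1896
```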
Another study, carried out in the United States by the Food and Drug Administration (FDA), found that at least 27% of all medical devices were designed without adequately addressing human factors issues.3 As a result, medical devices are generally difficult to use and therefore prone to misuse. This misuse has historically been attributed solely to the end user, with a tendency to blame, punish, and retrain them without looking at the root cause of the problem, a practice that has been roundly and rightly discredited by Dr Lucian Leape of the Harvard School of Public Health, who dispels the dual myths of punishment and perfection.4
As Grandjean5 points out, we need to focus on “designing machines, equipment and installations so that they can be operated with great efficiency, accuracy and safety”. This is why latent conditions are important: “We cannot change the human condition, but we can change the conditions under which humans work”.6
Latent conditions
If we accept that there is little we can do about the inevitability of human error, we are left with two choices: (1) accept the current levels of patient harm as an unfortunate but unavoidable side effect of providing health care; or (2) start viewing current errors as the result of latent conditions in the system, of which healthcare providers are merely the inheritors, not the instigators. I believe that continuing to focus efforts on training, education, and awareness programmes will not significantly reduce the level of harm we see at present. We need to focus more effort on the design of elements in the healthcare system in an attempt to reduce the number of latent conditions that can contribute to user error.
Systems are made up of a combination of interrelated or interacting elements forming a whole. Any one of these elements can contribute to the likelihood of an error occurring, and in most of them design can play a significant role, either in reducing the likelihood of such an error or in mitigating its consequences. In an acute care setting there are generally six discernible elements in the system, the “6Ps” (fig 1):
Figure 1 Elements of the healthcare system.
Providers
Procedures
Products
Peripherals
Patients
Policy
(1) Providers
It is important to recognise fully the range of individuals who might be required to operate in a clinical environment or interact with a single piece of equipment. Some devices will be operated only by specialists, but most will be operated by generalists across a range of settings and situations, from doctors, nurses and paramedics to the patients themselves, working anywhere from hospitals and clinics to roadsides and people's homes. Another factor is the mobile nature of working practices, especially at the beginning of a health professional's career. This can result in a single individual working in a variety of unfamiliar hospitals and with a variety of unfamiliar equipment, with little or no time for appropriate device specific training. All these individuals will be working under pressure and will have all the positive and negative attributes of humans: they are capable of error, and their performance can be affected by external factors.
(2) Procedures
The procedure itself will also have risks attached, whether administering potassium chloride or removing a limb. There are thousands of different drugs on the market, which come from a variety of manufacturers in a variety of packages with a variety of doses and a variety of names. Over the years there appears to have been little effort to standardize or clarify these drugs, doses, or packages, resulting in an obvious potential for wrong dose or wrong site administration. Until March 2005, when the National Patient Safety Agency and the Royal College of Surgeons jointly issued guidance, there was also no nationally agreed procedure for marking surgery sites (do you mark the leg to be amputated or the leg to remain?).
(3) Products
There are thousands of mechanical and electronic medical devices currently in operation in British hospitals, all designed to deliver a certain service or medical procedure. However, some devices are overly complex and the range is needlessly wide: an average hospital trust in England will stock over 30 different types of infusion pump. This, combined with the escalation in infusion pump features from four in 1980 to over 20 today,7 leaves the operator contending with a wholly unnecessary level of complexity. According to the FDA, “extensive functional capability may well impose an unreasonable cognitive load on the user”, resulting in a heightened risk of user error.8
(4) Peripherals
In addition to specific medical devices, the healthcare setting contains much other equipment and many other factors that might contribute to harm, including beds, lights, floors, sinks, ambient temperature and humidity, noise levels, ward layout, and the location of equipment store rooms. This might be further compounded by care being given in environments over which there is little or no control, for example roadside emergencies or procedures carried out in the patient's home. Any or all of these are latent conditions that can contribute to the potential for accidental patient harm by not functioning effectively or in a manner conducive to patient safety.
(5) Patients
The patients themselves are becoming an increasingly significant factor in their own safety: they have the potential to prevent accidental treatment and are often the experts in their own condition. In some cases, particularly in mental health and learning disability settings, they are also capable of unwittingly contributing to their own harm.
(6) Policy
The previous five elements of healthcare provision are tangible aspects of the system. Healthcare policies, however, are just as critical: they can involve a variety of protocols and regulations governing behaviour that must be learned and adhered to. These aim to provide walls of defence in an attempt to make clinical procedures mistake proof, but they ultimately add yet more layers of complexity to an already complex system.
The “6Ps” of the healthcare system can come together at a single instant in time and are perhaps repeated in some combination or other throughout a cycle of treatment along the patient journey. Any single element, or combination of elements, can provide the latent condition which results in error and patient harm. While the importance of staff training, competence, and culture is recognised and rightly promoted as an avenue to improve patient safety, the design of the non-human elements of the system may not have received the same level of recognition as an error barrier and, as such, continues to contribute to the current level of harm to patients.
The National Patient Safety Agency
According to preliminary research carried out in England by Vincent et al,9 as many as 10% of patients treated in acute hospitals experience some kind of safety incident; 6% of these incidents result in permanent impairment and 8% in death. The study was limited to two acute hospitals in London, and only 48% of the fatalities were considered to be avoidable, but this still indicates a high level of patient harm.
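To illustrate the scale these rates imply, the sketch below applies them to a hypothetical cohort of 1000 acute admissions. The cohort size is an assumption made purely for illustration; the rates (10% incidents, 6% permanent impairment, 8% fatal, 48% of fatalities avoidable) are those quoted above.

```python
# A minimal sketch of the scale implied by the figures above, assuming a
# hypothetical cohort of 1000 acute admissions (the cohort size is an
# assumption; the rates are those reported in the text).
admissions = 1000

incidents = 0.10 * admissions              # ~100 safety incidents
permanent_impairment = 0.06 * incidents    # ~6 patients permanently impaired
fatalities = 0.08 * incidents              # ~8 deaths
avoidable_fatalities = 0.48 * fatalities   # ~4 deaths judged avoidable

print(incidents, permanent_impairment, fatalities, avoidable_fatalities)
# -> 100.0 6.0 8.0 3.84
```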
In response to this research and the publication of two Department of Health reports on patient safety in the NHS,10,11 the National Patient Safety Agency (NPSA) was formed. The role of the NPSA is to improve the safety of patients by promoting a culture of reporting and learning from patient safety incidents and by developing solutions to common or catastrophic cases of harm. By “incidents” we mean “times when things go wrong in the NHS that either did or could cause harm to a patient”. To help in this we have created the National Reporting and Learning System (NRLS), which will help to identify trends and patterns of incidents and then establish their root causes. The Root Cause Analysis (RCA) programme of work at the NPSA involves a number of integrated strands, including local RCA training, an RCA eLearning toolkit (available at www.npsa.nhs.uk), and advice from our remote working patient safety managers, who can help with aggregate RCA, quality control, writing RCA reports, and the RCA process for serious incidents. Once the root causes have been established, work is prioritized and solutions are developed at a national level to reduce the recurrence of these errors in the future.
The nature of the solution will obviously dictate the manner of its dissemination to the broader NHS but, as the NPSA is not a regulatory body, this will typically involve influencing other healthcare and industry organizations and issuing safer practice notices and alerts. The solutions might be cultural, process, or training led, but may equally concern the design of information, objects, and environments which are confusing or dangerous, or which were not created with patient safety as part of the formal design process.
Design
Design has proved itself to be a very powerful tool, one the private sector has been successfully exploiting for several decades as demonstrated by the Design Business Association's Design Effectiveness awards (www.dba.org.uk) and the Design Council's “Design in Britain 2003–4” publication (designbritain@designcouncil.org.uk). Both make compelling links between good design and gain. In the private sector this generally means profitability, but it can mean anything you want it to—even design for patient safety.
One of the main reasons design is not used fully by all those who might benefit from it is the misconception as to what design is. It is not, contrary to popular belief, a discipline based on form or the way something looks. Form is only a factor in design if it affects the way an object functions as a deliberate method of communication; anything beyond this is purely styling and more akin to art or decoration than design. And while the word “design” can be used to describe, for example, the planning and layout of circuit boards or the creation of a new drug, if there is no direct link to a consumer, customer or end user—if there is no “human interface”—then the process is not design but engineering or science.
Of course these are rather polarised views, and it would probably be more accurate to think of design as a scale with “art” at one end and “science” at the other, and “design” being everything in the middle. It can subsequently be very artistic or very scientific, but it is still design providing there is a human interface and more to it than pure aesthetics.
However, to fully understand design it is important to realise that, above all, it is a problem solving process and not a one‐off activity (fig 2). As such, the quality of the end results will be a direct reflection of the quality of the information provided at the start of the process, the skills of the team carrying out the work (both designers and clients), and the management structures put in place to deliver the project.
Figure 2 The design wheel.
A series of joint publications by Cambridge Consultants and the University of Cambridge entitled Good design practice for medical devices and equipment12,13,14 detail three aspects of the design process which they believe are paramount to the successful outcome of a project:
requirements capture: knowing what success looks like;
verification: ensuring things are done correctly; and
validation: ensuring you have achieved what you set out to achieve.
These safety nets go some way to ensuring that the project delivers the desired product or service and that it will not fall foul of legislation, but they do not shed any light on how to design for patient safety.
Safer by design
At one level, designing for patient safety is no different from just good design; to suggest otherwise would be to insinuate that good design can somehow produce unsafe products. However, when designing for health care and, in particular, when designing for patient safety, there are five aspects of design that should be considered with more weight than some traditional measures concerned with branding and aesthetics:
Usability
Reliability
Accident proof
Standardization
Systemic awareness
Usability
Put simply, the design process for medical devices should include human factors considerations at every stage so that the end product is easy to use and understand, functionally intuitive or, as Donald Norman puts it, “the appearance of the device must provide the critical clues required for its proper operation”.15 Put even more simply, the right thing should be easy to do and the wrong thing difficult. The US FDA is clear that “good user interface design is critical to safe and effective equipment operation” and that poor user interface design “greatly increases the likelihood of error in equipment operation”.8
Reliability
At a basic level, reliability means that devices should operate consistently within set tolerances. They must also warn of any malfunctions or inaccuracies in performance, thus ensuring that patient harm cannot result from a fault in the device. This should also include consideration of tamper proofing.
Accident proof
Accident, error, or mistake proofing is a vital element of the design process in safety critical industries. As previously discussed, there is no point pretending that human beings can be trained to act in a perfect manner or that punishing them when they do not will get them close to this ideal.16 We must accept that “to err is human” and design accordingly so that potentially dangerous devices, equipment, and procedures cannot, wherever possible, unintentionally harm either patients or staff. This rather basic concept has been understood by the aviation industry for decades and has resulted in the current levels of aircraft safety that we all take for granted today.
Standardization
Standardization is a concept well understood by other safety critical industries, which value the benefit of lightening the mental burden on staff and users so that they can concentrate on the job at hand. Healthcare designers and manufacturers should embrace voluntary standardization of safety critical features: there is ample evidence17 that certain elements of medical devices, especially the human/device interface, would benefit from further standardization.
Systemic awareness
In order to design safely for health care, it is imperative that manufacturers fully comprehend the system within which their products will operate. For example, the manufacturer of an infusion device should be aware that, even if there is no scientific evidence dictating the safest layout for numeric keypads, if every other device on the market runs the keypad from one to nine it is dangerous to design one running from nine to one; the risk of error should be obvious.
These five factors are the cornerstones of safe design in health care and should form part of the requirements capture, verification, and validation stages of any healthcare design project. To ensure they are fully incorporated, rigorous testing and risk assessment must be conducted throughout the entire design process so that the end result is fit for purpose, testing not just with the end users of the device in the true context of use but also with all its stakeholders, including patients and the public.
Conclusion
It is clear that, if we want better and safer health care for patients, we must design out the latent conditions that contribute to error and create better and safer systems of healthcare delivery. Designing for patient safety must be part of our corporate culture, not just in the NPSA but in our hospitals, purchasing and regulatory bodies, and other NHS agencies. We must create a virtuous circle among our manufacturers and suppliers in which design for patient safety is not a cost but a means of differentiation and selection, a tool for competitive advantage.
Historically, we have had no centralized database of patient safety incidents, so the same errors have been allowed to happen time and time again with little pressure to change the system that allowed them to occur in the first place. With the introduction of the NPSA's National Reporting and Learning System this will no longer be the case: there will be much greater transparency of incidents and a greater focus on their root causes. For the first time we will start to see patterns that relate not purely to users but also to the underlying system of healthcare provision, including processes, equipment, environments, and information. There will then be an overwhelming imperative for all parties who contribute, whether in whole or in part, to patient harm to react and to design out these faults.
Footnotes
Competing interests: none declared.
References
1. De Bono E. Simplicity. UK: Penguin Group, 1998.
2. Robens Centre for Health Ergonomics at the University of Surrey, Helen Hamlyn Research Centre at the Royal College of Art, and Cambridge Engineering Design Centre at the University of Cambridge. Designing for patient safety. UK: Department of Health, 2002.
3. US Food and Drug Administration. Design control inspection results: June 1 1997 to June 1 1998. Rockville, MD: Food and Drug Administration, 1999.
4. Leape LL. Striving for perfection. Clin Chem 2002;48:1871–1872.
5. Grandjean E. Fitting the task to the man. London and Philadelphia: Taylor and Francis, 1986.
6. Reason J. Human error: models and management. BMJ 2000;320:768–770.
7. National Patient Safety Agency. Standardising and centralising infusion devices: a project to develop safety solutions for NHS trusts. Executive summary, 2004.
8. Sawyer D. Do it by design. Rockville, MD: US Department of Health and Human Services, Food and Drug Administration, 1996.
9. Vincent C, Neale G, Woloshynowych M. Adverse events in British hospitals: preliminary retrospective record review. BMJ 2001;322:517–519.
10. Department of Health. An organisation with a memory. London: Department of Health, 2000.
11. Department of Health. Building a safer NHS for patients. London: Department of Health, 2001.
12. Cambridge Consultants and University of Cambridge. Good design practice for medical devices and equipment: a framework. 2001.
13. Cambridge Consultants and University of Cambridge. Good design practice for medical devices and equipment: requirements capture. 2002.
14. Cambridge Consultants and University of Cambridge. Good design practice for medical devices and equipment: design verification. 2002.
15. Norman D. The design of everyday things. New York: Basic Books, 1988.
16. Leape L. Error in medicine. JAMA 1994;272:1851–1857.
17. Rozich JD, Howard RJ, Justeson JM, et al. Standardization as a mechanism to improve safety in health care. Jt Comm J Qual Saf 2004;30:5–14.


