Journal of the American Medical Informatics Association (JAMIA). 2004 Mar–Apr;11(2):104–112. doi:10.1197/jamia.M1471

Some Unintended Consequences of Information Technology in Health Care: The Nature of Patient Care Information System-related Errors

Joan S. Ash, Marc Berg, Enrico Coiera
PMCID: PMC353015  PMID: 14633936

Abstract

Medical error reduction is an international issue, as is the implementation of patient care information systems (PCISs) as a potential means to achieving it. As researchers conducting separate studies in the United States, The Netherlands, and Australia, using similar qualitative methods to investigate implementing PCISs, the authors have encountered many instances in which PCIS applications seem to foster errors rather than reduce their likelihood. The authors describe the kinds of silent errors they have witnessed and, from their different social science perspectives (information science, sociology, and cognitive science), they interpret the nature of these errors. The errors fall into two main categories: those in the process of entering and retrieving information, and those in the communication and coordination process that the PCIS is supposed to support. The authors believe that with a heightened awareness of these issues, informaticians can educate, design and implement systems, and conduct research in such a way that they might be able to avoid the unintended consequences of these subtle, silent errors.


Medical error reduction is an international issue. The Institute of Medicine's report on medical errors1 dramatically called attention to dangers inherent in the U.S. medical care system that might cause up to 98,000 deaths in hospitals and cost approximately $38 billion per year. In the United Kingdom, the chief medical officer of the newly established National Patient Safety Agency estimates that “850,000 incidents and errors occur in the NHS each year.”2 In The Netherlands, the exact implications of the U.S. figures for the Dutch health care scene are much debated. There as well, however, patient safety is on its way to becoming a political priority. Medication errors alone have been estimated to cause 80,000 hospital admissions per year in Australia, costing $350 million.3

In much of the literature on patient safety, patient care information systems (PCISs) are lauded as one of the core building blocks for a safer health care system.4 PCISs are broadly defined here as applications that support the health care process by allowing health care professionals or patients direct access to order entry systems, medical record systems, radiology information systems, patient information systems, and so on. With fully accessible and integrated electronic patient records, and with instant access to up-to-date medical knowledge, faulty decision making resulting from a lack of information can be significantly reduced.5 Likewise, computerized provider order entry (CPOE) systems and automated reminder systems can reduce errors by eliminating illegible orders, improving communication, improving the tracking of orders, checking for inappropriate orders, and reminding professionals of actions to be undertaken. In this way, these systems can contribute to preventing under-, over-, or misuse of diagnostic or therapeutic interventions.6,7,8 Among the broad array of health informatics applications, CPOE systems, and especially medication systems, have received the most attention.9,10,11,12

PCISs are complicated technologies, often encompassing millions of lines of code written by many different individuals. The interaction space13 within which clinicians carry out their work can also be immensely complex, because individuals can execute their tasks by communicating across rich social networks. When such technologies become an integral part of health care work practices, we are confronted with a large sociotechnical system in which many behaviors emerge out of the sociotechnical coupling, and the behavior of the overall system in any new situation can never be fully predicted from the individual social or technical components.13,14,15,16,17

It is not surprising, therefore, that authors have started to describe some of the unintended consequences that the implementation of PCISs can trigger.18 For instance, professionals could trust the decision support suggested by the seemingly objective computer more than is actually called for.15,19 Also, PCISs could impose additional work tasks on already heavily burdened professionals,20,21 and the tasks are often clerical and therefore economically inefficient.17 They can upset smooth working relations and communication routines.13,22 Also, given their complexity, PCISs could themselves contain design flaws “that generate specific hazards and require vigilance to detect.”23,24 As a consequence, PCISs might not be as successful in preventing errors as is generally hoped. Worse still, PCISs could actually generate new errors.25(p.511),26,27

It is obvious that PCISs will ultimately be a necessary component of any high-quality health care delivery system. Yet, in our research in three different countries, we have each encountered many instances in which PCIS applications seemed to foster errors rather than reduce their likelihood. In health care practices in the United States, Europe, and Australia alike, we have seen situations in which the system of people, technologies, organizational routines, and regulations that constitutes any health care practice seemed to be weakened rather than strengthened by the introduction of the PCIS application. In other words, we frequently observed instances in which the intended strengthening of one link in the chain of care unwittingly led to the deletion or weakening of others.

We argue that many of these errors are the result of highly specific failures in PCIS design and/or implementation. We do not focus on errors that are the result of faulty programming or other technical dysfunctions. Hardware problems and software bugs are more common than they should be, especially in a high-risk field such as medicine. However, these problems are well known and can theoretically be dealt with through testing before implementation. Similarly, we do not discuss errors that are the result of obvious individual or organizational dysfunction, such as a physician refusing to seek information in the computer system "because that is not his task," or a health care delivery organization cutting training programs for a new PCIS for budgetary reasons.

We do focus on those often latent or silent errors that are the result of a mismatch between the functioning of the PCIS and the real-life demands of health care work. Such errors are not easily found by a technical analysis of the PCIS design, or even suspected after the first encounter with the system in use. They can only emerge when the technical system is embedded into a working organization and can vary from one organization to the next. Yet, in failing to take seriously some by now well-recognized features of health care work, some PCISs are designed or implemented in such a way that error can arguably be expected to result. Only when thoughtful consideration is given to these issues, we argue, will PCISs be able to fulfill their promise.

Background and Methods

This study draws on a literature review and a series of qualitative research studies in the United States, The Netherlands, and Australia. These studies are based on standard qualitative methods such as ethnographic observation in health care delivery settings and semistructured interviews with professionals.28 All of the studies focused on the impact of PCISs in health care, yet none of the researchers set out to study error prevention. We did not focus especially on problematic PCIS implementations; on the contrary, most of our studies were done at sites that were recognized as highly successful. While discussing our different research projects, however, we realized that we had all gathered data that indicated the possibility of, fear about, or awareness of PCIS-related errors. In our view, the importance of this topic, and the relevance of the lessons these data can teach us, warrants a blended international treatment of this issue. Briefly, our reflections are based on U.S. data about CPOE from four hospitals, including 340 hours of observation and 59 formal interviews, Australian data about CPOE from 18 semistructured interviews with stakeholders at several public hospital sites, and Dutch data from electronic medical records, CPOE, and medication system studies involving participant observations and interviews from two hospitals and other settings in The Netherlands. All of the sites had patient care information systems in place; the four U.S. hospitals and one each of the Australian and Dutch sites also had CPOE. Interview transcripts and all field notes in the U.S. and Australian studies were analyzed with the assistance of qualitative data analysis software. (For detailed descriptions of these studies, see references.29,30,31,32,33,34,35)

Because these were qualitative studies, they do not offer estimates of how often certain errors occurred or whether PCISs overall result in more or fewer medication errors, for example. The power of qualitative work is in the richness of its detailed descriptions.28 We include results from the underlying studies, from diverse fields in diverse contexts, to emphasize the ubiquity of the issues we are addressing here. We offer an interpretation of the nature of health care work, the role of information and information technology, and the risks of an improper interrelation or fit between PCISs and health care work. Our goal is to present an argument, supported by extensive literature, and illustrated by prototypical examples from our studies, that will lead to more quantitative work that can track the “epidemiology” of these information system pathologies, as well as convince decision makers to be cautiously realistic about the benefits of PCISs.

The following discussion includes verbatim quotes from interviewees or field notes that illustrate patterns seen across the studies. The quotes were selected because they are both representative and well stated. Words in brackets are ours and have been added to clarify meanings. All studies received appropriate human subject approval from our universities. Finally, it is important to emphasize that, as far as we are aware, the examples given here never led to actual harm of patients.

Results: Categorization of Errors

The complex nature of health care work both creates and hides errors, which can be nearly invisible or silent. Health care work can be characterized as the managing of patients' trajectories; under continuous time pressure, and in constant interaction with colleagues and the patient, health care professionals have to try to keep a patient's problem on track. This implies simultaneously acting on a whole range of dimensions, including interpreting physical signs and diagnostic tests, and dealing with organizational policies and the patient's individual needs. However standardized the diagnostic and/or therapeutic care paths, individual trajectories always follow their own, unique course. Contingencies are the rule; smoothly molding such continuous lapses of order into events to be handled with "standard operating procedures" is the true skill of experienced health care professionals.13,33,36 Computer applications are best when they automate routine work, but the complexities of the health care process often make it anything but routine.

In outpatient settings, the interaction with colleagues may be less pronounced than in inpatient settings, although there as well, the professional is connected to other health care professionals' opinions and needs through progress reports and referrals. This social organization of medical work, as the sociologist Strauss called it, is now widely recognized as an important feature to consider in the design of health care information technologies.36,37,38,39,40

In this section, we discuss two main categories of errors that occur at the interface of the information system and work practice that are the result of a failure to grasp this nature of health care work. First, we discuss errors in the process of entering and retrieving information in or from the system. Second, we discuss errors in the communication and coordination processes that the PCIS is supposed to support. As the examples will illustrate, such failures are the result of mistaken assumptions about health care work that are built into PCIS applications, creating dysfunctional interactions with users and, sometimes, leading to actual errors in the delivery of health care.

Errors in the Process of Entering and Retrieving Information

Increasingly, the entry and retrieval of information into and from a PCIS are core activities in health care work. Given the characteristics of this work, these PCIS applications have to fulfill specific demands. Many of these are well known; PCIS applications have to have fast response times, negligible downtime, easy accessibility, and interfaces that are easy to understand and navigate.5,30 Also, the software and hardware have to be designed to optimally fit the ecology of the work practice: mobile when necessary, robust, small but ergonomically suitable.13,41 Although such requirements are widely known and accepted, they are often not met. Many system interfaces are still so impractical that using the systems takes a great deal of costly time on the part of busy professionals. Some systems in use in medical work practices today have interfaces that are outdated, with no windows, no intuitive graphic navigation aids, and endless lines of identical-looking text. In such cases, even when the information is there, it can be exceedingly hard to find. We discuss two problems in detail: (1) PCISs that have human–computer interfaces that are not suitable for this highly interruptive use context, and (2) PCISs that cause cognitive overload by overemphasizing structured and complete information entry or retrieval.

A Human–Computer Interface That Is Not Suitable for a Highly Interruptive Use Context

Working on the computer is rarely an isolated task; health care professionals are always communicating with others, including patients in outpatient settings, but primarily with other health care professionals. More often than not, different tasks are executed simultaneously, and interruptions by beepers, telephones, and colleagues are endless.42,43 Many human–computer interfaces, however, seem to have been designed for workers doing their work by themselves, fully and extensively concentrating on the computer screens. This single-task assumption is aggravated by the fact that so many existing screen designs are already suboptimal by current office standards.

This mismatch between interface and use context often results in a juxtaposition error: the intended option lies next to another on the screen, and the wrong one is all too easily clicked. The following are typical quotations from physicians; note the allusions to the "interruptive" use context: "I have ordered the test that was right next to the one I thought I ordered, you know, right below it that my little thingie had come down and I clicked and I'm lookin' at this one but I in fact clicked on the thing before. By that time I turned my head and I'm hitting return and typing my signature and not seeing it" [physician, U.S. hospital]. "I was ordering Cortisporin, and Cortisporin solution and suspension comes up. The patient was talking to me, I accidentally put down solution, realized that's not what I wanted …. I would not have made that mistake, or potential mistake, if I had been writing it out because I would have put down what I wanted" [physician, U.S. outpatient setting].
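
A minimal sketch of one defense against juxtaposition errors, ours rather than drawn from any of the systems studied: before a signature takes effect, the system echoes back the item that was actually selected, not the one the user believes was selected. All names here are illustrative.

```python
# Hedged sketch: echo the actual selection back before accepting a signature,
# so a mis-click on an adjacent list entry is caught before the order is signed.
# Nothing here comes from a real PCIS; the formulary entries are illustrative.
def sign_order(options, clicked_index):
    chosen = options[clicked_index]
    # Force the user to re-read what was actually clicked, not what they
    # remember clicking, before the signature takes effect.
    answer = input(f"You are signing an order for: {chosen}. Confirm? (y/n) ")
    if answer.strip().lower() != "y":
        raise RuntimeError("Signature aborted; selection was not confirmed.")
    return chosen

# Two adjacent entries of the kind that invite a juxtaposition error:
formulary = ["Cortisporin solution", "Cortisporin suspension"]
```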

Likewise, there were many instances of patient or physician confusion when orders were entered for or on behalf of the wrong person. Again, in a context of many co-occurring activities and interruptions, a suboptimal interface rapidly becomes unforgiving: "Patients were getting the wrong orders for medications. You would order it on one patient and it would, cause of the vagaries of the light pen system, you thought you were ordering it on one, and it was really ordered on somebody else and somebody got the wrong medication and that sort of thing" [physician, U.S. hospital]. "She looked up the patient's diet and was trying to order a regular diet. At the fifth screen she saw that the patient was getting tube feeding. This clued her that this was the wrong patient" [field notes, observation of a nurse, U.S. hospital].

Causing Cognitive Overload by Overemphasizing Structured and “Complete” Information Entry or Retrieval

Professionals need fast access to data that are relevant to the case at hand. Simultaneously, they need to be able to record a maximum amount of information in a minimum amount of time and in such a way that it is most useful to other health care professionals involved in the handling of this patient's trajectory. Psychologic and sociologic studies have shown that in a shared context, concise, unconstrained, free-text communication is most effective for coordinating work around a complex task.44,45,46 Requiring professionals to encode data, or to enter data in more structured formats, can be fruitful and is necessary for research or managerial purposes, but it does not come without a cost. Such formats are generally more time-consuming to complete and read. When structuring lessens the information's relevance to the primary task, and/or when the time spent writing or reading this information increases significantly, the information ends up being less useful for the primary task at hand.20

Structure

Some PCISs require data entry that is so elaborate that the time spent recording patient data is significantly greater than it was with their paper predecessors. What is worse, on several occasions during our studies, overly structured data entry led to a loss of cognitive focus by the clinician. Having to go to many different fields, often using many different screens to enter many details, physicians reported a loss of overview. When professionals are working through a case, determining a differential diagnosis, for example, the act of writing the information is integral to the cognitive processing of the case.32,47 This act of writing-as-thinking can be aided greatly by some structure, such as the grouping of similar types of information or sequencing to guide the elicitation of a history, but it is inevitably hampered by an excess of structure. Rather than helping the physician build a cognitive pattern to understand the complexities of the case, such systems overload the user with details at odds with the cognitive model the user is trying to develop.

Fragmentation

Similarly, the need to switch between different screens can result in a loss of overview. Physicians and nurses in an intensive care unit, for example, reported that the large paper day-sheets they used to work with would include an order list, problem list, vital signs graphs, and medication lists, all on a single large sheet of paper. The graphic user interface software they used allowed all of these functions and more, but the user had to switch among multiple windows to get all of the information. Doing so, several professionals argued, worked against their ability to acquire, maintain, and refine a mental overview of the case. Some reported that they felt insecure about identifying emerging problems because the activity of clicking through the different screens inevitably fragmented the cognitive “images” they were constructing.

Likewise, records might overly separate the information flows according to work task or responsibility. In everyday practice, doctors can gather information from nurses' notes, or those of other specialists, that relate to the problem. Information systems could limit this easy access to other people's notes or other parts of the record, and thereby severely hamper the professional's ability to be optimally informed.48 “… [R]egarding interpretation of results, currently this is often in the notes so [you] can see the results and the interpretation. On an order entry, results reporting may only get the raw data and not the interpretation, which could affect clinical work. This separation may also lead to clinicians being too specialty focused [and] not seeing what others have written—now [we] have to flick through notes so we see other information. On this system, if [we] only go to [our] own information, this may not happen and information may be missed” [allied health professional, Australia].

Overcompleteness

Results reporting systems can also mistake completeness for efficacy. In several instances, physicians stated that systems that produced standard, “complete” reports actually reduced the usability and the transparency of these reports or discharge letters. The physicians explained: “There are so many standard phrases in the ordinary reports, I don't think that's good … you have to really search for the usable information …. Many others use the [standard templates] and then you often see a discussion with standard phrases, one or two added phrases, and then more standard phrases. You then have to really search what the considerations were…. In my reports the text is mine, it doesn't come from the computer, I make it up myself…. Everyone should do that. If you have so much standard text, it becomes too easy to just push that button and add some more” [insurance physician, The Netherlands]. “You'll have to write the largest part yourself. You can standardize only so much, since otherwise you get an empty report with only standard phrases that could be true for anyone” [insurance physician, The Netherlands].

Too many standard phrases, these physicians argued, actually decreased the readability and information value of the reports. From the point of view of the professional, overly “complete” reports could end up becoming “empty” and stand in the way of actual communication. The similarity of the phrases, and the impossibility of judging whether a sentence is part of the template or a result of a thoughtful weighing of words, threatens to obscure the transparency that such systems attempt to introduce.
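
One way to restore the transparency these physicians found lacking would be to mark which sentences came verbatim from the template and which were authored. The sketch below is ours, under the assumption that reports are available as lists of sentences and that the standard template phrases are known; the phrases themselves are invented for the example.

```python
# Illustrative sketch: tag each sentence of a report as boilerplate or authored
# text, so a reader can find "what the considerations were" at a glance.
TEMPLATE_PHRASES = {
    "Findings were discussed with the patient.",
    "No abnormalities were observed.",
}

def annotate_report(sentences):
    """Pair each sentence with its origin: 'template' or 'authored'."""
    return [(s, "template" if s in TEMPLATE_PHRASES else "authored")
            for s in sentences]

for sentence, origin in annotate_report([
    "No abnormalities were observed.",
    "Given the tremor, I doubt the earlier diagnosis.",
]):
    print(f"[{origin:>8}] {sentence}")
```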

Here, of course, ease of use can also lure users into learning new but poor recording practices. The ability to cut and paste or, more often, copy and paste, affords users the opportunity to exacerbate the data overload problem. As an attending physician stated: “Just before I came up here, I looked at a discharge summary that was an absolute disaster, because not only had she cut and pasted the progress note, but she had cut and pasted the whole thing, so the intern's signature and the whole thing was on it. [The system] is inherently error prone…. people have the tendency to cut and paste…. and instead of taking the pertinent facts from a laboratory report or from another clinician's progress note, they will cut and paste a whole laboratory report, cut and paste somebody else's thinking process into their own note and sign it” [physician, U.S. hospital].

Errors in the Communication and Coordination Process

In the previous section, we discussed errors related to the processes of entering and retrieving information in PCISs. In this section, we focus on the way computers can undermine communication about and coordination of events and activities. Here we encounter the truly interactive and contingent nature of health care work and the consequences of not taking these characteristics into account. Although the issues discussed here are highly interrelated, we have subdivided them in two overarching problems: (1) misrepresenting collective, interactive work as a linear, clear-cut, and predictable workflow; and (2) misrepresenting communication as information transfer.

Misrepresenting Collective, Interactive Work as a Linear, Clear-cut, and Predictable Workflow

PCISs often appear to be imbued with a formal, stepwise notion of health care work: a doctor orders an intervention, a nurse subsequently arranges for or carries out the intervention, and then the doctor obtains the information about the result. As a chain of independent actions, an order is executed and reported on, or a piece of information is generated, processed, and stored.49,50,51 Yet it has become common knowledge that it is inherently difficult for formal systems to accurately handle or anticipate the highly flexible and fluid ways in which professional work is executed in real life.13,52,53 Carepath or workflow systems are plagued by the ubiquity of exceptions.54 Similarly, decision support systems are in constant need of "supervision" to determine whether their suggestions fit a given case.18 Systems cannot handle all potential exceptions; very soon, the number of branching points becomes too great, and the system becomes impossible to maintain and to use.55

Support of work processes is one of the main benefits of PCISs, yet it has its problems. Finding the proper balance between formalizing work activities so that the information technology application can fulfill its promise and respecting the fluid, contingency-driven nature of health care work is no easy task for system designers.34 It is necessary, however, if PCISs are to contribute to the overall quality improvement required in Western health care.

Inflexibility

These systems often fail to reflect some of the basic real-life exigencies of care work, resulting in problems for the user and potentially faulty reporting and/or actions. Treatment protocols that seem easy and clear-cut on paper can, in their real-time intricacies, baffle the system's preconceptions of these processes. In one instance, a drug ordered three times a day had been discontinued, but one dose had already been given. The computer system would not allow the nurse to chart the one dose, because the system considered it an incomplete execution of the task [as told by a pharmacist, U.S. hospital, recorded in field notes].
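
A sketch of the underlying data-model point, under our own assumptions rather than any vendor's: if the administration record captures what actually happened instead of enforcing complete execution of the scheduled task, the nurse can chart the single dose of a discontinued drug. All names are illustrative.

```python
# Sketch: a medication task model that accepts a partial administration record.
# A dose given before discontinuation stays chartable even though the full
# three-times-a-day schedule will never be completed.
from dataclasses import dataclass, field

@dataclass
class MedicationOrder:
    drug: str
    doses_per_day: int
    discontinued: bool = False
    doses_charted: list = field(default_factory=list)

    def chart_dose(self, given_at: str) -> None:
        # Record what actually happened; completeness of the schedule is a
        # reporting question, not a precondition for charting.
        self.doses_charted.append(given_at)

order = MedicationOrder("drug X", doses_per_day=3)
order.discontinued = True      # order stopped after one dose was given
order.chart_dose("08:00")      # allowed: the dose was really given
assert order.doses_charted == ["08:00"]
```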

Urgency

In the case of urgent medication orders, nurses may give a medication before the physician formally activates the order. There is a familiar category of errors here that has to do with the informal realities of medication handling in health care. In everyday health care work, experienced nurses often have more practical knowledge about what medications to give when, and what contraindications could be relevant, than many of the junior physicians who populate the wards.56 For example, during nightly routine medication administration, nurses could initiate distribution without waking the junior doctor who is formally responsible for signing the order. There is a rather large gray zone of informal management of these responsibilities and tasks, which can be entirely rational given the everyday organization and exigencies of health care work. Within this same gray zone, there could lie many practices that contribute to unsafe medication routines, such as doctors actively discouraging nurses from calling them for medication requests or nurses taking too many liberties with dosing. All of these practices exist within the current paper medication systems, but many computerized medication systems all too radically cut off such practices. Many medication systems have been rejected by their users because they strictly demanded a physician's authorization before any drug could be distributed or because they made any alternative route (such as the nurse ordering the medication through an "agent-for" procedure) much too cumbersome. In the latter example, nurses had to bear the consequences of physicians' not wanting to enter every medication order before anything could be given or changed. Understandably, both professional groups refused to fulfill these demands.57

Workarounds

When such systems do remain in practice, users artfully develop workarounds: clever alternative approaches that allow them to live with the system while avoiding some of the demands they deem unrealistic or harmful.58,59 Such workarounds can themselves undermine patient safety, however. In urgent situations, physicians could enter medication orders after the medication has already been administered, for example. Alternatively, the order might have been entered by the nurse but would have to be activated by the physician post hoc. A nurse remarked that in such situations, near the change of her shift, she often worries "that the [urgently given] medication could be given again when the order is 'activated'" [critical care nurse, U.S. hospital].
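
The double-administration worry in this quote suggests a simple guard, sketched here under our own assumptions rather than any studied system's logic: doses recorded before post-hoc activation count against the schedule, so activating the order cannot trigger the dose again.

```python
# Hypothetical guard for post-hoc activation: a dose the nurse already gave in
# the urgent situation is subtracted before the order goes live, so activation
# near the shift change cannot schedule it a second time.
from dataclasses import dataclass, field

@dataclass
class UrgentOrder:
    drug: str
    doses_ordered: int
    given_before_activation: list = field(default_factory=list)
    active: bool = False

    def activate(self) -> int:
        """Activate the order; return how many doses actually remain."""
        self.active = True
        return self.doses_ordered - len(self.given_before_activation)

order = UrgentOrder("drug Y", doses_ordered=1, given_before_activation=["02:10"])
assert order.activate() == 0   # nothing left to give; no repeat dose
```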

Transfers

Similar problems abound when transferring patients between wards or when admitting new patients. Here again, the real patient flow does not always match the clear-cut, formal model in which one starts by completing the required administrative data, after which the clinical content can be accessed and entered. This sequence ensures that the patient record is not accidentally fragmented over different electronic patient identities. In real-life health work, however, information can be required, or activities will have to be started or planned, before the proper administrative details are entered or even known. Problems such as this are familiar to everyone with some clinical experience, yet there are still systems that support this very poorly, as we have witnessed in all three countries. For example, during transfers between the emergency department and a patient ward, orders would not be transferred or new orders could not be entered in the system because the patient was not yet "in the system." "If they don't remember or know their social security number, it's tough," a U.S. hospital nurse remarked. In another example, we were told that once an order had been entered by a physician, that person expected it to be carried out; but if the administrative data had not yet been entered, the physician's orders might never be executed. "The doctors liked to be able to write orders and hold them pending an admission and the software was dropping off the orders you know … that was just incredible" [nurse, U.S. hospital].
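
One way to keep orders from being "dropped off" pending an admission, sketched here while assuming nothing about real admission–discharge–transfer interfaces: hold orders against a provisional identity and replay them when the administrative record is completed. All identifiers are invented.

```python
# Hypothetical pending-order queue: orders written before the administrative
# data exist are held, not silently dropped, and merged into the record when
# the admission completes.
from collections import defaultdict

pending = defaultdict(list)        # provisional id -> orders held for it
active_orders = defaultdict(list)  # medical record number -> live orders

def write_order(provisional_id, order):
    pending[provisional_id].append(order)     # accepted even pre-admission

def complete_admission(provisional_id, mrn):
    # Replay held orders once the patient is formally "in the system".
    active_orders[mrn].extend(pending.pop(provisional_id, []))

write_order("ER-temp-7", "chest x-ray")
complete_admission("ER-temp-7", "MRN-001234")
assert active_orders["MRN-001234"] == ["chest x-ray"]
```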

A similar issue is “the midnight problem.” It does not make a large difference for ongoing practical work or for a patient's health whether it is just before or just after midnight, but some systems create a difference. This could make sense from a purely administrative perspective, but not from a clinical one. “If the patient has a while to wait in the ER [Wednesday night] for a bed, or some other delay and doesn't get on the floor until 12:01 am [Thursday], the order [for tomorrow's medications] effectively means Friday morning. [This is a] big problem from his perspective and I heard this from two other docs as well.” This would cause there to be no orders in the system for Thursday [field notes, observation of U.S. physician].
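
The arithmetic behind the midnight problem is easy to reproduce. In this sketch (ours; the date values are invented), "tomorrow" computed from the processing time drifts a day later than "tomorrow" anchored to the moment the clinician wrote the order.

```python
# Sketch of the midnight problem: "tomorrow's medications" scheduled relative
# to when the order is processed, versus relative to when it was written.
from datetime import datetime, timedelta

def next_dose_date(anchor: datetime):
    return (anchor + timedelta(days=1)).date()

written   = datetime(2004, 3, 3, 23, 55)  # Wednesday night: "for tomorrow"
processed = datetime(2004, 3, 4, 0, 1)    # patient reaches the floor at 12:01 am

# Anchoring to processing time yields Friday; the clinician meant Thursday,
# leaving Thursday with no orders in the system.
assert next_dose_date(processed) > next_dose_date(written)
```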

Misrepresenting Communication as Information Transfer

Loss of Communication

In a work practice such as health care, which is characterized by contingencies and constantly developing definitions of the situation, proper communication among the involved professionals is crucial. However, “physicians may assume that ‘entry’ into the computer system replaces their previous means of initiating and communicating their plans, and that orders will be carried out without further action on their part. The result is reduced direct interaction among physicians, nurses, and pharmacy, and increased overall reliance on the computer system.”60 The entry of information into the system, in other words, is not the same as completing a successful communication act. When a U.K. hospital supplanted the telephoning of results by laboratory staff with installation of a results-reporting system in an emergency department and on the medical admissions ward, the results were devastating: “The results from 1,443/3,228 (45%) of urgent requests from accident and emergency and 529/1836 (29%) from the admissions ward were never accessed via the ward terminal.… In up to 43/1,443 (3%) of the accident and emergency test results that were never looked at, the findings might have led to an immediate change in patient management.”61

In this case, the designers had overlooked the fact that in the previous work process, laboratory personnel called doctors when the results were in. In the new situation, doctors would have to actively log into the system to see whether the results were already available. In the hectic environment of these wards, this is a highly inefficient mode of communication for these professionals.13
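
A sketch of the missing "push" step, with an invented pager hook standing in for the laboratory's old telephone call; none of these names come from the audited system.

```python
# Sketch: urgent results trigger an active notification instead of waiting
# passively for someone to log in. The Pager class is an invented stand-in for
# whatever channel (phone, pager, inbox alert) a real ward would use.
from dataclasses import dataclass

@dataclass
class Result:
    test_name: str
    ordering_clinician: str
    urgent: bool

class Pager:
    def page(self, who, message):
        print(f"PAGE {who}: {message}")

def deliver(result: Result, pager: Pager):
    # Posting to the ward terminal alone reproduces the failure mode above.
    if result.urgent:
        pager.page(result.ordering_clinician,
                   f"Urgent result ready: {result.test_name}")

deliver(Result("serum potassium", "Dr. A", urgent=True), Pager())
```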

Loss of Feedback

We encountered many variations on this theme. Nurses are often alerted to new orders by the printer, but this assumes the nurse is nearby and that the printer functions correctly: "There is a printer problem, for example, you know, something prints out or that piece of paper that gets printed out at the nurses' station somehow gets lost or not seen. I've seen a couple of antibiotics get missed" [physician, U.S. hospital]. Likewise, a typical complaint is that "he was totally unaware of this new order—he had heard no mention of it previously and there had not been a notification of the order by the ordering physician" [field notes, observation of a nurse, U.S. hospital]. Here again, the sender of the information mistakenly assumes that the computer will take care of notifying the receiver, the nurse. Similarly, a common problem is that physicians cannot tell, without explicit feedback, whether an order has been carried out or whether someone else has entered a similar order. In one U.S. hospital, we discovered that nurses put their initials into the computer when they take the order off rather than, for example, when they have completed the order. The latter might be more correct, but it would require yet another separate computer session. Although logical from the nurses' point of view, this meant the system did not distinguish between an order that was accepted and an order that was executed. This was problematic, because doctors then often did not know the true status of orders [field notes, observation of nurses and physicians, U.S. hospital].
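
The status confusion described here is at bottom a modeling gap: one flag standing in for two distinct events. A sketch of the distinction, using invented state and field names:

```python
# Sketch: separate "taken off" (acknowledged) from "carried out" (executed),
# with a timestamped trail so the ordering physician gets feedback on the true
# status of the order.
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum, auto

class OrderStatus(Enum):
    ENTERED = auto()
    ACKNOWLEDGED = auto()   # a nurse has taken the order off
    EXECUTED = auto()       # the ordered action has actually been done

@dataclass
class Order:
    description: str
    status: OrderStatus = OrderStatus.ENTERED
    history: list = field(default_factory=list)

    def transition(self, new_status: OrderStatus, who: str):
        # Record who moved the order and when; this trail is the feedback the
        # sender otherwise assumes the computer provides.
        self.history.append((datetime.now(), who, new_status))
        self.status = new_status

order = Order("vancomycin 1 g IV q12h")          # illustrative order text
order.transition(OrderStatus.ACKNOWLEDGED, "RN Smith")
order.transition(OrderStatus.EXECUTED, "RN Smith")
```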

As a result of miscommunication, orders or appointments are missed, diagnostic tests are delayed, and medication is not given. Communication involves more than transferring information. Communication is about generating effect—the laboratory personnel wanted to make sure that the doctors would act on their data. Similarly, communication is about testing out assumptions regarding the other person's understanding of the situation and willingness to act on your information.62,63,64 In addition, communication is always also about establishing, testing, or maintaining relationships.65

Decision Support Overload

Decision support systems suffer from the same problem. They could trigger an overdose of reminders, alerts, or warning messages. These messages can be sent to the computer user even if the message is not relevant for that user at that moment, or if the intended recipient of the message is not even the one entering the data. From a communication perspective, it is crucial to realize that it is not just a simple data overload that such messages could generate.18 Even worse, the user could feel supervised, treated as “stupid,” distrusted, or resentful of being constantly interrupted. As a result, health care professionals disregard the messages, click them away, or turn the warning systems off when they have an opportunity. It is common to blame these professionals for such seemingly irresponsible behavior. However, in too many systems, too little attention is paid to ensuring the judicious use of alerts and to working on the problem of contextual relevancy for the alerts the system generates during actual use. When time is a scarce resource, and too many of the warnings or reminders are either irrelevant or overly predictable, irritated physicians who disregard these alerts are quite rational.
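
One reading of "judicious use of alerts" is a relevance gate evaluated at display time. The sketch below is ours; the fields it assumes (an intended role, a severity, a per-session memory of dismissed alerts) are illustrative and not taken from any decision-support product.

```python
# Sketch of a contextual relevance gate: fire an alert only when it matters to
# this user, at this severity, right now, and has not just been dismissed.
from dataclasses import dataclass, field

@dataclass
class Alert:
    key: str
    intended_role: str       # who should act on this, e.g., "pharmacist"
    severity: int            # higher = more serious

@dataclass
class SessionContext:
    user_role: str
    severity_floor: int      # suppress anything below this threshold
    recently_shown: set = field(default_factory=set)

def should_fire(alert: Alert, ctx: SessionContext) -> bool:
    if alert.intended_role != ctx.user_role:
        return False          # route to the intended recipient instead
    if alert.severity < ctx.severity_floor:
        return False          # judicious use: drop low-value interruptions
    if alert.key in ctx.recently_shown:
        return False          # do not repeat what was just dismissed
    ctx.recently_shown.add(alert.key)
    return True
```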

Catching Errors

Appropriate and well-supported communication is also part and parcel of a safe work practice. In this sense, the systems we describe in this subsection could actually hamper safer working practices rather than stimulate them. In the hierarchies and task divisions of manual ordering, for example, many error prevention mechanisms are built in, often informally. For example, pharmacists routinely correct the medication orders given by physicians. Restructuring the medication ordering process might unwittingly eliminate these important mechanisms. “POE systems founded on notions of individual cognition are likely to be constrained by this model and be unable to take advantage of the distributed processing, fault tolerance, and resilience that obtains in settings characterized by distributed cooperative problem solving.”60 Errors are caught constantly, and not necessarily by those formally responsible for them.66

The redundancy that is built into the system of people and technologies constituting the medication management chain is partly responsible for the fact that of the many prescription mistakes, only a minute fraction results in actual medication administration mistakes. Similarly, in practice, orders often come into being during patient rounds, during discussions among senior and junior physicians and nurses. A case is discussed, a suggestion is made and elaborated on, and it becomes an order. It can also be transformed, renegotiated, or ignored. When details remain unclear, those involved can ask for elaboration, or smoothly “repair” interpretations of junior members of the team. In most clinical order-entry systems, however, the entering of orders is the task of the junior resident, who only does ordering after the patient rounds. This is because systems are rarely mobile, so they are not available during rounds. Alone at a computer, the resident enters a series of orders on a series of patients, copying from the notes made during rounds. In such a setting, outside of the actual context in which the patient was discussed, and away from those who could correct his misinterpretations, order entry can be prone to errors.

Discussion and Conclusion

We have outlined a number of issues within a framework describing two major kinds of silent errors caused by health care information systems: those related to entering and retrieving information and those related to communication and coordination. Because the potential causes of these errors are subtle but insidious, the problems need to be addressed in a variety of ways through improvements in education, systems design, implementation, and research.

Education

Health professionals need to be educated with a critical perspective toward what PCISs can do for them. People tend to project "intelligence" and "objectivity" onto computers,15,67,68 and physicians and nurses are no exception. In the classic case of the Therac-25 system, a computer-controlled radiation therapy machine that caused radiation overdoses in six patients, the operators trusted the "all is normal" messages the machine was delivering. They disregarded disturbing clinical signs because they had faith in the machine.69 In a study of computer decision support in health care, users were unduly influenced by incorrect advice.70 Medical education, and indeed the education of all health care professionals, should involve consideration of both the positives and negatives of using information systems. The outcome of these educational efforts should be a workforce that practices appropriate diligence when using a PCIS. Informatics education has a role to play in preventing these errors by educating individuals who can make sure that clinical systems are designed, implemented, and evaluated with unintended consequences in mind. It is imperative that we educate an increasing number of clinical informaticians: people who can bridge the gap between the clinical and technologic worlds, who can speak the language of both and therefore act as translators.

Systems Design

Systems developers and vendors should be clearer about the limitations of their technologies. When speaking of “order entry” and “intelligent” systems, and building on overly rationalist models of health care work, they can too readily lure users into expecting much more from a computer system than it can actually deliver. Systems should be designed to support communication13 and provide the flexibility that is needed for systems to better fit real work practices. There are many lessons to be learned from proponents of good systems design, and although the technology is rapidly improving, known design principles are still not evident in today's systems. Increasing involvement of experienced clinicians who know what the work is truly like should improve designs in the future. The hiring of more clinical informaticians by vendors and health care organizations to design and customize systems is a positive trend. In addition, even systems designers with no clinical experience should seek to spend some time simply observing clinical activities so that the nature of these activities can be experienced first hand.

Systems should be able to help clinicians manage interruptions, perhaps by reminding them about what they were doing when last using the system. The systems also need more effective feedback mechanisms so clinicians know if and when the orders are being received and carried out. Mobile systems hold promise for assisting with overcoming problems related to both interruptions and lack of feedback, and further development efforts should focus on them.

Prevention of silent errors is preferable to fixing them after the fact. Repairing these errors by adding safety features that are not thoroughly designed could very well make things worse. Introducing safety devices is an artful process in its own right, requiring thorough insight into the communication space. For example, an observer wrote in his field notes: “We were told that the answer to this problem was then they inserted a safety level which is yet another screen so that when you press on the patient then there's five lines of information about this person and you have to verify each one… at what point are safety levels (more screens to make sure it's the right patient) more disruptive than helpful—similar problem to having too many alerts or too much information to take in” [field notes, observation of house officers, U.S. hospital].

Systems designers are not to be blamed for silent errors. Some problems could have been anticipated, but others are so subtle that they can be found only by closely monitoring practice. Constant vigilance is crucial. Information systems are not, on their own, a sufficient fix for the safety problem. A rush toward implementing systems might ultimately endanger the quality of care more than help it.

Implementation

During the implementation process, clinical informaticians need to ensure not only that clinicians are heavily involved so that the implementation goes more smoothly, but also that clinicians are able to continue the social processes that the system could supplant. For example, luncheon meetings for the purpose of discussing new functions of the system might replace some of the communication loss caused by a CPOE system. During and following the implementation process, organizational systems should be in place to provide ongoing monitoring of the safety of clinical systems. As recommended by a consortium of health information technology organizations, clinical systems software oversight committees should be formed at the local level.71,72

Research

In practice, then, the flow of health care work activities is often much less linear than it is in other arenas, with roles much more flexibly defined and overlapping, and distinctions between steps much more fuzzy than the formalized PCIS models would have it.60,73,74 Because of this complexity, standard quantitative research methods such as surveys fail to expose the subtle problems. Qualitative research techniques, on the other hand, can provide deep insight and can both identify problems and answer the “why” and “how” questions that quantitative studies cannot answer.75 This research needs to be multidisciplinary and must consider the multiple perspectives of all stakeholder groups.

Finally, all of us involved in information technology in health care need to practice heightened vigilance. We must be aware of the issues described in this article through education and training, be alert to the problems identified through further research, be cautious when making major changes that might have unintended consequences, and be prepared to deal with the inevitability of such consequences. We should also be optimistic; if we can identify the presence of unintended negative consequences early enough, we can do something about them. If we can reach a high enough level of vigilance, we might be able to completely avoid many of the subtle silent errors described here.

This work was supported in part by grant LM06942-02 from the National Library of Medicine. The authors appreciate the valuable contributions of Sophie Gosling and Johanna Westbrook from the Centre for Health Informatics, University of New South Wales, who shared their Australian data, and Richard Dykstra, Lara Fournier, and Veena Seshadri of Oregon Health & Science University for analysis of U.S. data.

References

1. Committee on Quality of Health Care in America. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press, 2000.
2. NHS Magazine. Available at: http://www.nhs.uk/nhsmagazine/story316.asp. Accessed Sept 16, 2003.
3. Roughead E. The nature and extent of drug-related hospitalizations in Australia. J Qual Clin Pract. 1999;19:19–22.
4. Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press, 2001.
5. Dick RS, Steen EB, Detmer DE (eds). The Computer-Based Patient Record: An Essential Technology for Health Care. Washington, DC: National Academy Press, 1997.
6. Bates DW, Pappius E, Kuperman GJ, et al. Using information systems to measure and improve quality. Int J Med Inf. 1999;53:115–24.
7. McDonald CJ, Hui SL, Smith DM, et al. Reminders to physicians from an introspective computer medical record: a two-year randomized trial. Ann Intern Med. 1984;100:130–8.
8. van Wijk M, van der Lei J, Mosseveld M, Bohnen A, van Bemmel JH. Assessment of decision support for blood test ordering in primary care: a randomized trial. Ann Intern Med. 2001;134:274–81.
9. Sittig DF, Stead WW. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994;1:108–23.
10. Bates DW, Leape LL, Cullen DJ, et al. Effect of computerized physician order entry and a team intervention on prevention of serious medication errors. JAMA. 1998;280:1311–6.
11. Dexter PR, Perkins S, Overhage JM, Maharry K, Kohler RB, McDonald CJ. A computerized reminder system to increase the use of preventive care for hospitalized patients. N Engl J Med. 2001;345:965–70.
12. Teich JM, Merchia PR, Schmiz JL, Kuperman GJ, Spurr CD, Bates DW. Effects of computerized physician order entry on prescribing practices. Arch Intern Med. 2000;160:2741–7.
13. Coiera E. When conversation is better than computation. J Am Med Inform Assoc. 2000;7:277–86.
14. Perrow C. Normal Accidents: Living with High-Risk Technologies. New York, NY: Basic Books, 1984.
15. Weizenbaum J. Computer Power and Human Reason: From Judgment to Calculation. San Francisco, CA: W.H. Freeman, 1976.
16. Berg M. Implementing information systems in health care organizations: myths and challenges. Int J Med Inf. 2001;64:143–56.
17. Tenner E. Why Things Bite Back: Technology and the Revenge of Unintended Consequences. New York, NY: Vintage Books, 1996.
18. Goldstein MK, Hoffman BB, Coleman RW, et al. Patient safety in guideline-based decision support for hypertension management: ATHENA DSS. J Am Med Inform Assoc. 2002;9(Nov–Dec suppl):S11–6.
19. Burnum JF. The misinformation era: the fall of the medical record. Ann Intern Med. 1989;110:482–4.
20. Berg M, Goorman E. The contextual nature of medical information. Int J Med Inf. 1999;56:51–60.
21. Massaro TA. Introducing physician order entry at a major academic medical center: I. Impact on organizational culture and behavior. Acad Med. 1993;68:20–5.
22. Dykstra R. Computerized physician order entry and communication: reciprocal impacts. Proc AMIA Symp. 2002:230–4.
23. Shojania KG, Duncan BW, McDonald KM, Wachter RM. Safe but sound: patient safety meets evidence-based medicine. JAMA. 2002;288:508–13.
24. Effken JA, Carty B. The era of patient safety: implications for nursing informatics curricula. J Am Med Inform Assoc. 2002;9(suppl):S120–3.
25. Weiner M, Gress T, Thiemann DR, et al. Contrasting views of physicians and nurses about an inpatient computer-based provider order-entry system. J Am Med Inform Assoc. 1999;6:234–44.
26. Bates DW, Cohen MS, Leape LL, Overhage JM, Shabot MM, Sheridan T. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc. 2001;8:299–308.
27. McNutt RA, Abrams R, Arons DC. Patient safety efforts should focus on medical errors. JAMA. 2002;287:1997–2001.
28. Strauss AL. Qualitative Analysis for Social Scientists. Cambridge, UK: Cambridge University Press, 1987.
29. Ash JS, Stavri PZ, Dykstra R, Fournier L. Implementing computerized physician order entry: the importance of special people. Int J Med Inf. 2003;69:235–50.
30. Ash JS, Gorman PN, Lavelle M, et al. A cross-site qualitative study of physician order entry. J Am Med Inform Assoc. 2003;10:188–200.
31. Ash JS, Gorman PN, Lavelle M, et al. Perceptions of physician order entry: results of a cross-site qualitative study. Methods Inf Med. 2003;42:313–23.
32. Berg M. Practices of reading and writing: the constitutive role of the patient record in medical work. Sociol Health Illness. 1996;18:499–524.
33. Berg M. Medical work and the computer-based patient record: a sociological perspective. Methods Inf Med. 1998;37:294–301.
34. Berg M. Search for synergy: interrelating medical work and patient care information systems. Methods Inf Med. 2003;42:337–44.
35. Gosling S, Westbrook JI, Coiera EW. Variation in the use of online clinical evidence: a qualitative analysis. Int J Med Inf. 2003;69:1–16.
36. Strauss A, Fagerhaugh S, Suczek B, Wiener C. Social Organization of Medical Work. Chicago, IL: University of Chicago Press, 1985.
37. Drazen EL, Metzger JB, Ritter JL, Schneider MK. Patient Care Information Systems: Successful Design and Implementation. New York, NY: Springer, 1995.
38. Kaplan B. Objectification and negotiation in interpreting clinical images: implications for computer-based patient records. Artif Intell Med. 1995;7:439–54.
39. Kuhn KA, Giuse DA. From hospital information systems to health information systems: problems, challenges, perspectives. Methods Inf Med. 2001;40:275–87.
40. Forsythe DE. Studying Those Who Study Us: An Anthropologist in the World of Artificial Intelligence. Stanford, CA: Stanford University Press, 2001.
41. Luff P, Heath C, Greatbatch D. Tasks-in-interaction: paper and screen-based documentation in collaborative activity. In: Turner J, Kraus R (eds). Proceedings of the Conference on Computer Supported Cooperative Work. New York, NY: ACM Press, 1992:163–70.
42. Coiera E, Jayasuriya R, Hardy J, Bannan A, Thorpe MEC. Communication loads on clinical staff in the emergency department. Med J Aust. 2002;176:415–8.
43. Tellioglu H, Wagner I. Work practices surrounding PACS: the politics of space in hospitals. Computer Supported Cooperative Work. 2001;10:163–88.
44. Patel VL, Kushniruk AW. Understanding, navigating and communicating knowledge: issues and challenges. Methods Inf Med. 1998;37:460–70.
45. Garfinkel H. Studies in Ethnomethodology. Englewood Cliffs, NJ: Prentice-Hall, 1967.
46. Garrod S. How groups co-ordinate their concepts and terminology: implications for medical informatics. Methods Inf Med. 1998;37:471–6.
47. Hutchins E. Cognition in the Wild. Cambridge, MA: MIT Press, 1995.
48. Faber MG. Design and introduction of an electronic patient record: how to involve users? Methods Inf Med. 2003;42:371–5.
49. Suchman L. Working relations of technology production and use. Computer Supported Cooperative Work. 1994;2:21–39.
50. Siddiqi J, Shekaran MC. Requirements engineering: the emerging wisdom. IEEE Software. 1996;13:15–9.
51. Reddy M, Pratt W, Dourish P, Shabot MM. Sociotechnical requirements analysis for clinical systems. Methods Inf Med. 2003;42:437–44.
52. Star SL (ed). The Cultures of Computing. Oxford, UK: Blackwell, 1995.
53. Zuboff S. In the Age of the Smart Machine: The Future of Work and Power. New York, NY: Basic Books, 1988.
54. Panzarasa S, Madde S, Quaglini S, Pistarini C, Stefanelli M. Evidence-based careflow management systems: the case of post-stroke rehabilitation. J Biomed Inform. 2002;35:123–39.
55. Collins HM. Artificial Experts: Social Knowledge and Intelligent Machines. Cambridge, MA: MIT Press, 1990.
56. Hughes D. When nurse knows best: some aspects of nurse/doctor interaction in a casualty department. Sociol Health Illness. 1988;10:1–22.
57. Goorman E, Berg M. Modelling nursing activities: electronic patient records and their discontents. Nurs Inquiry. 2000;7:3–9.
58. Gasser L. The integration of computing and routine work. ACM Transactions on Office Information Systems. 1986;4:205–25.
59. Schmidt K, Bannon L. Taking CSCW seriously: supporting articulation work. Computer Supported Cooperative Work. 1992;1:7–40.
60. Gorman PN, Lavelle M, Ash JS. Order creation and communication in healthcare. Methods Inf Med. 2003;42:376–84.
61. Kilpatrick ES, Holding S. Use of computer terminals on wards to access emergency test results: a retrospective audit. BMJ. 2001;322:1101–3.
62. Suchman L. Plans and Situated Actions: The Problem of Human–Machine Communication. Cambridge, UK: Cambridge University Press, 1987.
63. Kay S, Purves IN. Medical records and other stories: a narratological framework. Methods Inf Med. 1996;35:72–87.
64. Bardram J. Temporal coordination: on time and coordination of collaborative activities at a surgical department. Computer Supported Cooperative Work. 2000;9:157–87.
65. Hartswood M, Procter R, Rouncefield M, Sharpe M. Making a case in medical work: implications for the electronic medical record. Computer Supported Cooperative Work. 2004. In press.
66. Svenningsen S. Electronic Patient Records and Medical Practice: Reorganization of Roles, Responsibilities, and Risks [thesis]. Copenhagen: Copenhagen Business School, 2002.
67. Kling R (ed). Computerization and Controversy: Value Conflicts and Social Choices. San Diego, CA: Academic Press, 1996.
68. Turkle S. The Second Self: The Human Spirit in a Computer Culture. New York, NY: Simon & Schuster, 1984.
69. Leveson NG, Turner CS. An investigation of the Therac-25 accidents. Computer. 1993;26(7):18–41.
70. Tsai TL, Fridsma DB, Gatti G. Computer decision support as a source of interpretation error: the case of electrocardiograms. J Am Med Inform Assoc. 2003;10:478–83.
71. Miller RA, Gardner RM. Summary recommendations for responsible monitoring and regulation of clinical software systems. Ann Intern Med. 1997;127:842–5.
72. Miller RA, Gardner RM. Recommendations for responsible monitoring and regulation of clinical software systems. J Am Med Inform Assoc. 1997;4:442–57.
73. Brown JS, Duguid P. The Social Life of Information. Boston, MA: Harvard Business School Press, 2000.
74. Lave J. Cognition in Practice. Cambridge, UK: Cambridge University Press, 1988.
75. Ash J, Berg M. Report of conference track 4: socio-technical issues of HIS. Int J Med Inf. 2003;69:305–6.
