Pharmacy and Therapeutics. 2019 Jun;44(6):320-321, 375.

Understanding Human Over-Reliance on Technology

Matthew Grissinger
PMCID: PMC6534180  PMID: 31160864

Abstract

Implementing information technology in medication-use systems can reduce adverse drug events by decreasing human error, but over-reliance on technology can lead to automation bias and automation complacency.



The implementation of information technology in medication-use systems is widely accepted as a way to reduce adverse drug events by decreasing human error.1 Technology examples include computerized order-entry systems, clinical decision support systems, robotic dispensing, profiled automated dispensing cabinets (ADCs), smart infusion pumps, and barcode scanning of medications during compounding, dispensing, ADC restocking, and administration. These technologies are meant to support human cognitive processes and, thus, have great potential to combat the shortcomings of manual medication systems and improve clinical decisions and patient outcomes. This is accomplished through precise controls, automatically generated cues and recommendations to help users respond appropriately, prompts promoting the correct sequence of work or ensuring the collection of critical information, and alerts to make users aware of potential errors.

Information technology that supports clinical decision-making doesn’t replace human activity but changes it, often in unintended or unanticipated ways.2 Instances of misuse and disuse (often to work around technology issues) and new sources of errors after technology implementation have been well documented. Errors can also be caused by over-reliance on and trust in technology’s proper function.3 Technology can occasionally malfunction, misdirect users, or give incorrect information or recommendations that result in users changing a previously correct decision or following a pathway that leads to error. Over-reliance on technology can result in serious consequences for patients. In its recent Safety Bulletin,4 our sister organization, ISMP Canada, highlighted this issue based on its analysis of an event reported to a Canadian national reporting system. The bulletin discussed two related cognitive limitations: automation bias and automation complacency.

Incident Description

An elderly patient was admitted to the hospital with new-onset seizures. Admission orders included the anticonvulsant phenytoin (handwritten with the brand name DILANTIN) 300 mg orally every evening. Before the pharmacy closed, a staff member entered the order into the computer so the medication could be obtained overnight from an ADC in the patient care unit. Medication selection for order entry was performed by typing the first three letters of the drug’s name (“dil,” in this case) and then choosing the desired name from a drop-down list comprising both generic and brand names. The staff member was interrupted while entering the order and, upon resuming the task, selected dilTIAZem 300 mg instead of Dilantin 300 mg.

On the patient-care unit, the order for Dilantin had been correctly transcribed by hand onto a daily computer-generated medication administration record (MAR), which was verified against the prescriber’s order and co-signed by a nurse. The nurse who obtained the medication from the unit’s ADC noticed the discrepancy between the MAR and the ADC display but accepted the information on the ADC screen as correct. Thus, the patient received a dose of long-acting dilTIAZem 300 mg instead of the Dilantin 300 mg that had been ordered. The error was caught the next morning, when the patient exhibited significant hypotension and bradycardia.

Automation Bias and Automation Complacency

Automation bias is the tendency to favor or give greater credence to information supplied by technology (e.g., an ADC display) and to ignore a manual source that provides contradictory information (e.g., a handwritten entry on the computer-generated MAR), even when the manual source is correct.3 Automation complacency is a closely linked, overlapping concept that refers to monitoring technology less frequently or less vigilantly because of a lower suspicion of error and a stronger belief in its accuracy.2 End-users of a technology (e.g., a nurse who relies on the ADC display that lists the medications to be administered) tend to forget or ignore that information from the device may depend on data entered by a person. In other words, processes that may appear to be wholly automated are often dependent upon human input at critical points and thus require the same degree of monitoring and attention as manual processes. These two phenomena can affect individual as well as team decision-making and offset the benefits of technology.2

Automation bias and complacency can lead to decisions that are not based on a thorough analysis of all available information but are strongly biased toward the presumed accuracy of the technology.2 While the effects are inconsequential if the technology is correct, errors are possible if the technology output is misleading. Automation-bias errors of omission happen when users rely on the technology to alert them to a problem (e.g., an excessive dose) but it fails to do so; users then do not respond to a potentially critical situation because they were never prompted. Automation-bias errors of commission occur when users make choices based on incorrect suggestions or information provided by technology.3 In the Dilantin incident, automation bias resulted in two errors: the first was the pharmacy staff member accepting dilTIAZem as the correct drug in the order-entry system; the second was the nurse identifying the discrepancy between the information displayed on the ADC and the information in the MAR but trusting the ADC display over the handwritten entry in the computer-generated MAR.

In recent analyses of health-related studies on automation bias and complacency, clinicians overrode their own correct decisions in favor of erroneous advice from technology between 6% and 11% of the time,3 and the risk of an incorrect decision increased by 26% if the technology output was in error.5 The technology-failure detection rate is also low—in one study, half of all users didn’t detect any of the failures introduced during the course of a typical work day (e.g., non-issue of an important alert, or presentation of the wrong information or recommendation).2,6

Causes of Automation Bias and Complacency

Automation bias and complacency are thought to result from three basic human factors:2,3

  • When making decisions, people tend to select the pathway requiring the least cognitive effort, which often results in letting technology dictate the path. This factor is likely to play a greater role as people are faced with more complex tasks, multitasking, heavier workloads, or increasing time pressures—common phenomena in health care.

  • People often believe that technology’s analytic capability is superior to humans’, which can lead to overestimating its performance.

  • People may reduce their effort or shed responsibility while carrying out a task if an automated system performs the same function. It has been suggested that using technology convinces the human mind to hand over tasks and associated responsibilities to the system.7,8 This mental handover can reduce the vigilance people would typically demonstrate if carrying out those tasks independently.

Other conditions linked to bias and complacency include the following:

User experience

There is conflicting evidence about the effect of experience on automation bias and complacency. Although there is evidence that reliance on technology decreases as people’s experience and confidence in their own decisions increase, it has also been shown that increased familiarity with technology can lead to desensitization, which may cause clinicians to doubt their instincts and accept inaccurate technology-derived information.3 Automation bias and complacency have been found in both naïve and expert users.2

Perceived reliability of and trust in technology

Where once there may have been a general tendency to trust all technology, today automation bias and complacency are believed to be influenced by users’ perceived reliability of a specific technology based on their prior experience with the system.2 When they perceive automation as reliable at least 70% of the time, people are less likely to question its accuracy.9

Confidence in decisions

While trust in technology increases bias and complacency, users who are confident in their own decisions are less likely to be biased.3,10,11

Safe Practice Recommendations

The use of technology is considered a high-leverage strategy to optimize clinical decision making—but only if user trust in the technology closely matches the reliability of the technology itself. Therefore, the following strategies to address errors related to automation bias and complacency focus on:

  • Improving the reliability of the technology; and

  • Encouraging clinicians to more accurately assess its reliability so that appropriate monitoring and verification strategies can be employed.

Analyze and address vulnerabilities

Conduct a proactive risk assessment (e.g., failure mode and effects analysis [FMEA]) for new technologies to identify unanticipated vulnerabilities and address them before undertaking facility-wide implementation. Also, encourage the reporting of technology-associated risks, issues, and errors.

Limit human-computer interfaces

Organizations should continue to enable seamless communication among all technologies, thereby limiting the need for human interaction with the technology, because each manual touchpoint is an opportunity to introduce errors.

Design technology to reduce over-reliance

Technology design can affect users’ attention and how they regard its value and reliability. For example, the “auto-complete” function for drug names after entering the first few letters is a design strategy that has often led to selection of the first, but incorrect, choice provided by the system. Requiring four letters to generate a list of potential drug names could reduce this type of error. To cite another example, studies have found that providing too much on-screen detail can decrease users’ attention and care, thereby increasing automation bias.3
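To make the design point concrete, the sketch below shows one way a minimum-prefix rule might be enforced in a drug-name lookup. It is a minimal illustration only, written in Python: the formulary entries, the four-character threshold, and the suggest_drugs function are assumptions for the example, not features of any particular order-entry product.

    # Minimal sketch of a drug-name lookup that enforces a minimum prefix length.
    # FORMULARY, MIN_PREFIX_LENGTH, and suggest_drugs are illustrative assumptions,
    # not the configuration of any specific order-entry system.

    MIN_PREFIX_LENGTH = 4

    FORMULARY = [
        "DILANTIN 300 mg capsule",
        "dilTIAZem 300 mg extended-release capsule",
        "dilTIAZem 120 mg tablet",
    ]

    def suggest_drugs(prefix: str) -> list[str]:
        # Return matching drug names only when the typed prefix is long enough;
        # a short prefix returns nothing, forcing the user to keep typing rather
        # than pick from a broad list of look-alike names.
        if len(prefix) < MIN_PREFIX_LENGTH:
            return []
        prefix = prefix.lower()
        return [name for name in FORMULARY if name.lower().startswith(prefix)]

    print(suggest_drugs("dil"))   # [] -- too short to search
    print(suggest_drugs("dila"))  # ['DILANTIN 300 mg capsule']
    print(suggest_drugs("dilt"))  # both dilTIAZem products, DILANTIN excluded

With the three-letter entry from the incident (“dil”), such a system would offer no candidates; only after a fourth letter does the list narrow to names that actually match the intended drug.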

Provide training

Provide training on the technology involved in the medication-use system to all staff who use that technology. Include information about its limitations, as well as previously identified gaps and opportunities for error. Allow trainees to experience automation failures during training (e.g., non-issue of an important alert; discrepancies between technology and handwritten entries in which the handwritten ones are correct; “auto-fill” or “auto-correct” errors; incorrect calculation of body surface area because a weight was entered in pounds instead of kilograms). Experiencing technology failures during training can help reduce errors caused by complacency and automation bias by encouraging critical thinking when using automated systems.3 Allowing trainees to experience such failures may increase the likelihood that they will recognize them during daily work.
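The body-surface-area scenario mentioned above can be turned into a simple worked example for such training. The following Python sketch uses the Mosteller formula; the patient values and the mosteller_bsa helper are illustrative assumptions, not part of the original bulletin. It shows how entering a weight in pounds rather than kilograms inflates the calculated BSA, and therefore any BSA-based dose, by nearly 50%.

    import math

    def mosteller_bsa(height_cm: float, weight_kg: float) -> float:
        # Mosteller formula: BSA (m^2) = sqrt(height_cm * weight_kg / 3600)
        return math.sqrt(height_cm * weight_kg / 3600)

    # Illustrative patient: 170 cm, 70 kg (values assumed for the example)
    height_cm = 170.0
    weight_kg = 70.0
    weight_lb = weight_kg * 2.2046  # the same weight, expressed in pounds

    correct_bsa = mosteller_bsa(height_cm, weight_kg)   # ~1.82 m^2
    inflated_bsa = mosteller_bsa(height_cm, weight_lb)  # ~2.70 m^2 if pounds are keyed in as kilograms

    print(f"Correct BSA:  {correct_bsa:.2f} m^2")
    print(f"Inflated BSA: {inflated_bsa:.2f} m^2 ({inflated_bsa / correct_bsa:.0%} of the correct value)")

Seeing a failure of this kind during training, rather than reading about it, is what helps users remember that the output is only as good as the data entered.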

Reduce task distraction

Although easier said than done, leaders should attempt to ensure that staff using technology can do so uninterrupted and that they are not simultaneously responsible for other tasks. Automation failures are less likely to be identified if users have to multitask or are otherwise distracted or rushed.2

Conclusion

Technology plays an important role in the design and improvement of medication systems; however, it must be viewed as supplementary to clinical judgement. Although it can make many aspects of the medication-use system safer, health care professionals must continue to apply their clinical knowledge and critical thinking skills while using technology to provide optimal patient care.

Footnotes

ISMP thanks ISMP Canada for its generous contribution to the content of this article.

In 2019, ISMP is celebrating its 25th anniversary of helping health care practitioners keep patients safe and leading efforts to improve the medication-use process. For more information, visit www.ismp.org.

References

1. Mahoney CD, Berard-Collins CM, Coleman R, et al. Effects of an integrated clinical information system on medication safety in a multi-hospital setting. Am J Health Syst Pharm. 2007;64(18):1969–1977. doi:10.2146/ajhp060617.
2. Parasuraman R, Manzey DH. Complacency and bias in human use of automation: an attentional integration. Hum Factors. 2010;52(3):381–410. doi:10.1177/0018720810376055.
3. Goddard K, Roudsari A, Wyatt JC. Automation bias: a systematic review of frequency, effect mediators, and mitigators. J Am Med Inform Assoc. 2012;19(1):121–127. doi:10.1136/amiajnl-2011-000089.
4. ISMP Canada. Understanding human over-reliance on technology. ISMP Canada Safety Bulletin. 2016;16(5):1–4.
5. Goddard K, Roudsari A, Wyatt JC. Automation bias: empirical results assessing influencing factors. Int J Med Inform. 2014;83(5):368–375. doi:10.1016/j.ijmedinf.2014.01.001.
6. Parasuraman R, Molloy R, Singh IL. Performance consequences of automation-induced “complacency”. Int J Aviat Psychol. 1993;3(1):1–23.
7. Coiera E. Technology, cognition and error. BMJ Qual Saf. 2015;24(7):417–422. doi:10.1136/bmjqs-2014-003484.
8. Mosier KL, Skitka LJ. Human decision makers and automated decision aid: made for each other? In: Parasuraman R, Mouloua M, editors. Automation and Human Performance: Theory and Applications. Mahwah, NJ: Lawrence Erlbaum Associates; 1996. pp. 201–220.
9. Campbell EM, Sittig DF, Guappone KP, et al. Overdependence on technology: an unintended adverse consequence of computerized provider order entry. AMIA Annu Symp Proc. 2007;2007:94–98.
10. Lee J, Moray N. Trust, control strategies and allocation of function in human-machine systems. Ergonomics. 1992;35(10):1243–1270. doi:10.1080/00140139208967392.
11. Yeh M, Wickens CD. Display signaling in augmented reality: effects of cue reliability and image realism on attention allocation and trust calibration. Hum Factors. 2001;43(3):355–365. doi:10.1518/001872001775898269.
