Journal of the American Medical Informatics Association (JAMIA)
Letter. 2012 Nov-Dec;19(6):1119. doi: 10.1136/amiajnl-2012-001193

When ‘technically preventable’ alerts occur, the design—not the prescriber—has failed

Alissa L Russ,1,2,3,4 Michael Weiner,1,2,3 Jason J Saleem,1,2,3 Robert L Wears5,6
PMCID: PMC3534473  PMID: 22847307

The JAMIA article entitled ‘Failure to utilize functions of an electronic prescribing system and the subsequent generation of “technically preventable” computerized alerts,’ by Baysari et al,1 identifies an intriguing phenomenon that may contribute to alert fatigue. In that study, an audit of electronic inpatient medication charts at a teaching hospital quantified the frequency of alerts and assessed how many could have been prevented. More than 2000 active orders were reviewed, and 20.2% of alerts were categorized as ‘technically preventable.’ For example, box 1 of the article describes a new order for ‘Paracetamol (500 mg) Tablet: 1 g oral Four Times Daily’ placed despite an existing order for ‘Paracetamol (500 mg) Tablet: 1 g oral PRN: minimum dosage interval 4 h: up to 4 doses per day’; modifying the existing order instead would have prevented a duplicate alert. We agree that many alerts of this type are likely preventable.

In several instances in the article, however, preventable alerts are unfortunately attributed to the ‘failings’ of prescribers to use the system as intended. Prescriber training is emphasized as a potential solution, with some acknowledgment that changes to the system design may also be warranted. Here, we interpret the findings by drawing upon human factors science.

A basic tenet of human factors is that systems, technologies, and tools should be designed and adapted to support human cognition and performance. Efforts to alter human behavior through training on how to use technologies ‘appropriately’ have repeatedly been identified as a weak method of fostering sustainable change.2 Karsh et al 3 noted that expecting healthcare professionals to use technologies as the designer intended, and then attributing subsequent problems to them when they do not, is a common fallacy in assessing health information technologies. Training is essential when we want to increase our knowledge (eg, in science and medicine), but attempting to alter intrinsic cognitive processes (eg, memory, information processing) has limited effectiveness.2 The paper's methods section provides a key insight: at the study hospital, all doctors are already required to attend a 2 h training session and to complete several case examples using the computerized prescribing functions that reduce alerts. Notably, preventable alerts represent a sizeable portion of all alerts and continue to occur despite this mandatory computer training. As human beings, we have inherent capabilities and limitations, and we rely on innate cognitive processes, which are not readily altered, to interact with the world around us. Technologies should be adapted and redesigned to support human use and to facilitate effective outcomes in terms of workflow, efficiency, workload, satisfaction, and patient safety.

Baysari et al suggest that preventable alerts indicate discordance between the prescribing task and the system's design. This important point directly relates to what is known from human factors science about mental model mismatches: discordance or mismatches between system designers and end-users can lead to errors, inefficient work processes (eg, ‘preventable alerts’), and potential unintended consequences for patient safety.4 5 This mismatch is the crux of the problem. The most effective and sustainable solution, however, is not ongoing training, but innovation in system design to align it with prescribers' cognitive processes and medication ordering tasks.

In their discussion, the authors indicate that modifications to the system design may be necessary. Reportedly, more than half of the duplication alerts could have been prevented if the system were redesigned so that alerts were triggered only when both medication orders were active. This was a valuable contribution of the paper and could inform the design of other alert systems. More examples of this type would strengthen the impact of the study findings. For instance, we are uncertain that the function actions ‘AND’, ‘OR’, and ‘THEN’ are aligned with the cognitive processes used by prescribers; this design may contribute to their lack of use. We acknowledge that generating effective design solutions is not a simple task and would require additional work. For example, a follow-on human factors study could specifically examine the human–computer interaction. Such a study could identify design weaknesses and demonstrate how functions might be improved to aid prescribers and reduce the occurrence of preventable alerts.
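As an illustration only (not the system studied, whose internals the article does not describe), the redesign described above can be sketched in a few lines: a duplicate-therapy check that fires only when both orders are currently active, so that orders already ceased or superseded no longer generate alerts. The `Order` type and field names here are hypothetical.

```python
# Hypothetical sketch of the redesign discussed by Baysari et al:
# suppress duplicate-therapy alerts unless BOTH orders are active.
from dataclasses import dataclass

@dataclass
class Order:
    drug: str      # drug name, assumed already normalized
    active: bool   # False once the order is ceased or superseded

def duplicate_alert(new: Order, existing: Order) -> bool:
    """Fire a duplicate-therapy alert only if both orders are active."""
    return new.drug == existing.drug and new.active and existing.active

# A new paracetamol order checked against a ceased order raises no
# alert; checked against a still-active order, it does.
ceased = Order("paracetamol", active=False)
current = Order("paracetamol", active=True)
new_order = Order("paracetamol", active=True)
assert duplicate_alert(new_order, ceased) is False
assert duplicate_alert(new_order, current) is True
```

The design choice is the point: the alert logic, not prescriber retraining, absorbs the distinction between active and inactive orders.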

In summary, training is often proposed as a solution to issues surrounding health information technologies, perhaps because it seems simple and inexpensive. In the long run, training on ‘appropriate use’ of health information technology in an attempt to alter intrinsic cognitive processes is largely ineffective, and its benefits are transient. Baysari et al have correctly diagnosed the discordance between prescribers and system design, but have not sufficiently highlighted its best management.

Footnotes

Funding: This work was supported in part by the VA HSR&D Center of Excellence on Implementing Evidence-Based Practice, Center grant #HFP 04-148, VA PPO# 09-298 and AHRQ grant R18 HS017902. Dr Saleem was supported by a VA HSR&D Research Career Development Award CDA 09-024-1.

Competing interests: None.

Provenance and peer review: Not commissioned; externally peer reviewed.

References

1. Baysari MT, Reckmann MH, Li L, et al. Failure to utilize functions of an electronic prescribing system and the subsequent generation of 'technically preventable' computerized alerts. J Am Med Inform Assoc 2012;19:1003–10.
2. Reason J. Human Error. Cambridge, UK: Cambridge University Press, 1990.
3. Karsh BT, Weinger MB, Abbott PA, et al. Health information technology: fallacies and sober realities. J Am Med Inform Assoc 2010;17:617–23.
4. Norman DA. The Design of Everyday Things. 1st edn. New York, NY: Doubleday, 1990.
5. Russ AL, Zillich AJ, McManus MS, et al. Prescribers' interactions with medication alerts at the point of prescribing: a multi-method, in situ investigation of the human–computer interaction. Int J Med Inform 2012;81:232–43.

