AMIA Annual Symposium Proceedings. 2010 Nov 13;2010:672–676.

Multiple Perspectives on the Meaning of Clinical Decision Support

Joshua E Richardson 1, Joan S Ash 1, Dean F Sittig 2, Arwen Bunce 1, James Carpenter 3, Richard H Dykstra 1, Ken Guappone 3, Carmit K McMullen 5, Michael Shapiro 1, Adam Wright 4
PMCID: PMC3041408  PMID: 21347063

Abstract

Clinical decision support (CDS) is viewed as a means to improve safety and efficiency in health care. Yet the lack of consensus about what is meant by CDS represents a barrier to the effective design and use of CDS tools. We conducted a multi-site qualitative inquiry to understand how different people define and describe CDS. Drawing on subjects’ multiple perspectives, we gained new insights into what stakeholders want CDS to achieve and how to achieve it, even when those perspectives compete or conflict.

Introduction

In 1969, Goertzel introduced the concept of a clinical decision support (CDS) system as “a tool to aid the physician in patient care, in data acquisition, and in decision making.” (1) Subsequently, Greenes described CDS as an action: “the use of the computer to bring relevant knowledge to bear on the health care and well-being of the patient.” (2) The authors of Crossing the Quality Chasm framed their definition by the types of decisions CDS is meant to support: “preventive and monitoring tasks, prescribing of drugs, and diagnosis and management.” (3) Shortliffe defined CDS as a “function” of both a system, “any computer program designed to help health professionals make clinical decisions,” and its tools: “tools for information management, tools for focusing attention, and tools for patient-specific consultation.” (4) Lastly, Berner’s definition of CDS includes types of potential users: “clinicians, staff, patients, and other individuals.” (5)

No one of the above definitions is necessarily better than another; however, a definition is a useful “soft technology” that can convey meaning and understanding across constituencies. According to organizational communication theory, each definition is 1) partial, in that it “only tells one part of a story”; 2) partisan, in that it reflects the viewpoint of its author(s); and 3) problematic, in that it generates more questions than answers, and any answers are based on what is known rather than on “all that could be known.” (6) How stakeholders interpret the meaning of CDS may affect the ways CDS is discussed, designed, and disseminated across research and clinical settings.

The lack of consensus about what is meant by CDS may represent a barrier to the effective design and use of these clinical support tools. An illustrative example focuses on one type of CDS: clinical practice guidelines (CPGs). Hysong et al. (7) conducted interviews and observations of administrators, middle managers, and primary care providers across 15 Veterans Administration (VA) hospitals to determine whether staffs exhibited shared understandings, or “mental models,” of CPGs. Subjects were asked to describe how they interpreted the meaning of CPGs. The authors concluded that staffs within “high-performing” VA hospitals communicated “clear” shared mental models of CPGs, whereas “low-performing” VA hospitals were associated with staffs that “lacked clear, dominant mental model[s].” In short, a shared understanding of what CPGs mean may have facilitated their adoption and use to improve practice.

Our team explored how multiple health IT constituencies (users, developers, administrators, “bridgers,” and vendors) compare in what they mean by “clinical decision support.” Drawing on subjects’ multiple perspectives, we gained new insights into what stakeholders want CDS to achieve and how to achieve it, even when those perspectives compete or conflict.

Methods

A multidisciplinary team of qualitative researchers used an ethnographic method called the Rapid Assessment Process (RAP) (8). RAP relies on a team approach to expedite the collection of interview and observation data. Audio-recorded interviews, naturalistic observations, and questionnaires were collected from a purposive sample of academic medical centers, community practices, community hospitals, and CDS vendors between December 2007 and December 2009. Nine sites (Table 1) were selected based on their reputations as leaders in the development and use of CDS and computerized provider order entry systems.

Table 1.

Types and locations of organizations visited.

Site visited                              Location
Regenstrief Institute                     Indianapolis, IN
UMDNJ                                     Newark, NJ
Partners HealthCare                       Boston, MA
Roudebush VA Medical Center               Indianapolis, IN
Mid-Valley IPA                            Salem, OR
Providence Portland Medical Center        Portland, OR
El Camino Hospital                        Mountain View, CA
2 clinical content vendors (anonymous)    United States

Initial subject recruitment relied on the assistance of organizational sponsors and liaisons, but researchers often took opportunities for impromptu interviews and observations. Subjects were classified into one of five categories: 1) Administrators (CIOs, directors, etc.); 2) Technical staff (analysts, IT support, etc.); 3) Clinicians (physicians, nurses, etc.); 4) Bridgers (informaticians, content developers, etc.); and 5) Vendors (clinical content vendors). Distinctions among these categories were sometimes difficult to delineate because job titles, roles, and responsibilities overlapped across organizations; in these instances the research team identified the “best fit” category. Ninety subjects were interviewed and/or observed, 80 of whom provided 1) an explicit definition of CDS, 2) examples of things that did or did not qualify as CDS, or 3) both a definition and a description. Statements that met these criteria were analyzed using grounded theory, the process by which data are iteratively reviewed, labels (“codes”) are attributed to significant concepts, and codes are then organized into themes. (9) Codes and themes were organized using NVivo qualitative analysis software (QSR International, v.8).
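To make the coding step concrete, the sketch below shows in Python how coded interview excerpts might be grouped into themes. It is a minimal illustration only; the excerpts, code labels, and theme names are drawn loosely from this paper’s results, and the study itself used NVivo rather than custom scripts.

```python
# Minimal sketch of the coding-to-themes step described above (hypothetical
# excerpts and labels; the study itself used NVivo v.8, not this script).
from collections import defaultdict

# Each tuple: (subject category, interview excerpt, code assigned by a reviewer)
coded_excerpts = [
    ("Vendor", "an alert is actionable...fires when certain conditions are met", "rule-based"),
    ("Clinician", "I don't want to have to open up another window", "workflow-based"),
    ("Clinician", "that form pops right up on the desktop...it's process support", "process-based"),
    ("Administrator", "we believe in guard rails, not straightjackets", "guardrails"),
]

# Group codes into higher-level themes, mirroring the iterative review step.
theme_map = {
    "rule-based": "What CDS supports",
    "workflow-based": "What CDS supports",
    "process-based": "What CDS supports",
    "guardrails": "CDS philosophy",
}

themes = defaultdict(list)
for role, excerpt, code in coded_excerpts:
    themes[theme_map[code]].append((code, role, excerpt))

for theme, items in themes.items():
    print(theme)
    for code, role, excerpt in items:
        print(f"  [{code}] {role}: {excerpt}")
```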

Results

Uncertainty

A number of subjects found “clinical decision support” to be an ambiguous term. One bridger stated, “We’ve wrestled with [CDS definitions],” and a vendor explained, “I know personally we’re struggling with our definition of decision support.” Some subjects who were responsible for working with CDS asked interviewers for their definition of CDS before they could provide their own. A technical subject said, “I’d like to ask you to define [CDS] a little bit better...[the definition] depends on what you’re using the system for.” A vendor representative specifically responsible for selling CDS was caught off guard when asked to provide a definition: “I don’t know that I’ve given it a moment’s thought.”

What CDS supports: Rules, Workflow, and Process

Subjects defined and described three types of CDS: 1) rule-based CDS, such as alerts and reminders that fire to deliver information and interrupt workflow; 2) workflow-based CDS, meant to ease data entry, documentation, and resource location; and 3) process-based CDS, which provides a patient management and planning overview.

Rule-based CDS was often described as alerts and reminders presented at the point of care. A vendor explained, “I think alerts, and to me an alert is actionable…that fires when certain conditions are met.” Clinical practice guidelines, protocols, and order sets were consistently left out of initial considerations and were discussed only after being prompted by a researcher. One administrator laid out her thinking: “order [sets]…wouldn’t necessarily [qualify] as decision support…they do guide you but they don’t give you alerts and reminders.”
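The rule-based view, in which an alert “fires when certain conditions are met,” can be illustrated with a minimal sketch. The patient fields, rule, and threshold below are hypothetical examples, not rules drawn from any study site.

```python
# Minimal sketch of rule-based CDS as subjects describe it: an alert "fires"
# when predefined conditions are met. The patient fields, rule, and threshold
# below are hypothetical, not drawn from any site's actual rule base.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # evaluated against the patient record
    message: str                        # shown to the clinician if the rule fires

rules: List[Rule] = [
    Rule(
        name="renal-dosing",
        condition=lambda pt: pt["creatinine"] > 2.0 and "metformin" in pt["orders"],
        message="Reduced renal function: review metformin order.",
    ),
]

def evaluate(patient: dict) -> List[str]:
    """Return the messages of every rule whose condition is met."""
    return [r.message for r in rules if r.condition(patient)]

patient = {"creatinine": 2.4, "orders": ["metformin", "lisinopril"]}
print(evaluate(patient))  # -> ['Reduced renal function: review metformin order.']
```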

Some subjects explained that the challenge of rule-based CDS is meeting specific user needs: “over-alerting is a huge problem for us…depending on the practice setting [and] the level of knowledge that [a] practitioner has, they want different levels of information,” and, “in an ideal world [CDS] would be a system tailored to…individual skills.” Other subjects noted challenges in fitting rule-based CDS within specific environments: “if you’re in an oncology clinic, the level of expertise and the doses that are going to be used [is very different than in] a general population.”

Solutions included recognizing how and when it is best to apply rule-based CDS, “it’s that balance between…redundant [alerts] that take time and staying time efficient so that providers will actually use…and value [CDS],” as well as developing further technical sophistication: “Decision support needs to become smarter.”
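One way to read the call for “smarter” decision support is tailoring the same rule to its practice setting, so an oncology clinic is not flooded with alerts calibrated for general practice. The sketch below is a hypothetical illustration of that idea; the dose limits and setting names are invented for the example.

```python
# Hypothetical sketch of "smarter" rule-based CDS: the same dose-range check
# uses different thresholds depending on practice setting, reducing
# over-alerting for specialists. Values are illustrative only.
from typing import Optional

DOSE_LIMITS_MG = {
    "general": 100,
    "oncology": 400,
}

def dose_alert(ordered_dose_mg: float, setting: str) -> Optional[str]:
    """Return an alert message only when the dose exceeds the limit for this setting."""
    limit = DOSE_LIMITS_MG.get(setting, DOSE_LIMITS_MG["general"])
    if ordered_dose_mg > limit:
        return (f"Dose {ordered_dose_mg} mg exceeds the usual maximum "
                f"({limit} mg) for a {setting} setting.")
    return None

print(dose_alert(250, "general"))   # fires: over the general-practice limit
print(dose_alert(250, "oncology"))  # suppressed: prints None
```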

In contrast to rule-based CDS, other subjects described a workflow-based view of CDS that enhances clinical work as well as decisions. A technical subject emphasized the importance physicians place on workflow: “workflow is more of a concern [to physicians] than is CDS.” Bridgers were aware that CDS can be meant to enhance clinical workflow and built software tools accordingly; for example, “once the [physician] decisions are being made, we actually make it very easy to write patients a letter describing the test results in a patient friendly format.”
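A minimal sketch of that kind of workflow support appears below: once the clinician has decided how to interpret a result, the system drafts a patient-friendly letter for review. The template wording, field names, and example values are hypothetical, not the bridger’s actual tool.

```python
# Hedged sketch of the workflow support quoted above: once the clinician has
# reviewed a result, the system drafts a patient-friendly letter automatically.
# The template wording and result fields are hypothetical.
from string import Template

LETTER = Template(
    "Dear $patient_name,\n\n"
    "Your recent $test_name result was $interpretation. "
    "$next_step\n\nSincerely,\n$clinician_name"
)

def draft_results_letter(patient_name, clinician_name, test_name,
                         interpretation, next_step):
    """Fill the letter template so the clinician only has to review and send it."""
    return LETTER.substitute(
        patient_name=patient_name, clinician_name=clinician_name,
        test_name=test_name, interpretation=interpretation, next_step=next_step,
    )

print(draft_results_letter("Ms. Jones", "Dr. Smith", "lipid panel",
                           "within the normal range", "No follow-up is needed."))
```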

Indeed, clinicians did not emphasize alerts and reminders in their descriptions and definitions but rather emphasized ease of use and workflow. A physician explained, “…if a patient needs something, I don’t want to have to open up another window. As it is now, I have to open up and check the weight; I have to look at their last note to see when that was done. I have to look at the vitals and look at the labs to see when the last lab was done. It takes a long time.” A pediatrician lamented the emphasis placed on inputting data, “Everything is so focused about putting [data] in; nobody talks about what you can get out.”

Types of CDS that appeared to facilitate workflow included templates and order sets: “templates and order sets are a ‘memory prosthesis’ for her…It forces [her] to be clear,” and a physician noted, “templates limit what you can put in.”

There were also common workflow pitfalls when using reference materials: “[The system] lacks a link button to external resources,” and, “It is easier for her to use Google,” and this example: “[the physician walked] across the room to get a copy of ‘Facts and Comparisons’…looked up the dose, scribbled a bit on a Post-it note and used a calculator to figure out the volume of elixir that had the same dosage of antibiotic. [When asked] he felt the book was much faster and easier [than Micromedex].”

A third form of CDS was described as “process-based CDS.” A physician recalled, “the best tool that I’ve ever had [was] for follow-up on ordering tests…it create[d] a patient notification form in our system at the same time so if the patient [didn’t] get the test…that form pops right up on the desktop and I take whatever action…it’s process support.” A technical subject distinguished process-based CDS from rule-based CDS without having the vocabulary to explain the difference: “some of the practices are taking advantage of [automated] recall letters [and] notifications…That’s not really decision support.”
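The test follow-up example can be sketched as a simple tracking structure: each ordered test gets a follow-up record, and anything not completed by its due date surfaces for clinician action. The field names, dates, and tests below are hypothetical and only illustrate the general pattern the physician describes.

```python
# Hedged sketch of the "process support" example above: each ordered test gets
# a follow-up record, and anything not completed by its due date surfaces on
# the clinician's worklist. Field names, tests, and dates are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class TestFollowUp:
    patient: str
    test: str
    due: date
    completed: bool = False

def overdue_followups(followups, today):
    """Return the follow-up items that should 'pop up' for clinician action."""
    return [f for f in followups if not f.completed and f.due <= today]

worklist = [
    TestFollowUp("pt-001", "colonoscopy", date(2010, 3, 1)),
    TestFollowUp("pt-002", "HbA1c", date(2010, 6, 1), completed=True),
]
for item in overdue_followups(worklist, today=date(2010, 4, 15)):
    print(f"Follow up: {item.patient} has not completed {item.test} (due {item.due}).")
```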

Other process-based supports included functions that facilitate communication. An observer noted, “his first example was contact with other doctors – this was a form of CDS.” A technical subject included messaging when asked about CDS: “some clinical staff will go into the system just to do phone notes.” Another technical subject explained how software supported the processes involved with team-based care: “[Groupware]…not only provide[s]…a focal point for interaction that we can use across time and space…[it] also provides a historical record…”

Subjects consistently attempted to define and describe distinctions among types of CDS. One vendor described a categorical view of CDS as either “actionable” CDS or “impact” CDS. Actionable CDS is “added into the workflow in such a way that the clinician does not need to stop the basic processes of…assessing, coming to decisions…it’s just THERE,” while “impact” CDS is “available, but it does require interruption of the workflow.” Another vendor differentiated between a lower level of “management” CDS, “a decision may have been already made to give a medication. And so having…dose checking will tell you maybe you’re outside of the normal dose range,” and a higher level of “leadership” CDS: “Are you doing the right thing? Should you have even given that medication to begin with?”

Where CDS operates: Foreground vs. Background

Subjects described system designs and clinical scenarios that at times called for CDS to operate in the foreground and at other times for CDS to operate in the background. One clinician provided an example by making a distinction between “implicit and explicit” CDS: “clinical decision support to me means that there [is] some automated process in the background that helps direct me to do a clinically relevant—safe—appropriate task…”

Foreground CDS played an active role by reacting to a user’s entry. A technical person described CDS as, “something that looks at the electronic medical record…then based on rules determine[s] the answer to certain questions,” and “[CDS] assist[s] the provider based on information [he or she] collects and provides them with a list of rule outs, [a] list of possibilities for diagnosis, or treatment plans.”

Background CDS is triggered when data entered into a database by one user or program is used to alert and/or remind another user. Background CDS was also used to inform business decisions: “I…started getting information from their decision support system to help them make funding decisions for the next fiscal year,” and, “sometimes you do clinical decision support on how you bill.” Some administrators offered that “financial decision support” is a form of CDS that takes into account aggregate information and is reviewed long after the patient interactions have been completed.
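The foreground/background contrast can be sketched as two small functions: a synchronous check that runs inside the ordering screen in response to the user’s entry, and a listener that reacts when data written by someone (or something) else arrives and notifies a different user. The drug names, lab values, and in-memory notification list are hypothetical stand-ins.

```python
# Hedged sketch contrasting the two modes described above. Foreground CDS is a
# synchronous check at order entry; background CDS is a listener that reacts to
# data written by another user or program and notifies a different user.
# All names, values, and the notification list are hypothetical.
notifications = []  # stands in for an inbox or paging system

def foreground_check(order: dict) -> list:
    """Runs inside the ordering screen, in direct response to the user's entry."""
    warnings = []
    if order["drug"] == "warfarin" and "aspirin" in order["active_meds"]:
        warnings.append("Possible interaction: warfarin + aspirin.")
    return warnings

def background_on_result(result: dict) -> None:
    """Runs when a lab result lands in the database, alerting the covering clinician."""
    if result["test"] == "potassium" and result["value"] > 6.0:
        notifications.append((result["ordering_clinician"],
                              f"Critical potassium {result['value']} for {result['patient']}"))

print(foreground_check({"drug": "warfarin", "active_meds": ["aspirin"]}))
background_on_result({"test": "potassium", "value": 6.4,
                      "patient": "pt-003", "ordering_clinician": "Dr. Lee"})
print(notifications)
```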

CDS philosophy: Straightjackets vs. Guardrails

Subjects explained meanings of CDS that revealed different philosophies behind the design and development process. An administrator aptly laid out that difference when he described his organization’s philosophy, “[we] believe in [CDS] guard rails, not straightjackets.”

“Straightjackets” represent a view that CDS operates within an alert-reminder paradigm and therefore solutions are focused on continually improving alerts and reminders. One administrator explained, “you have to get to the point where [clinicians] see real value in [alerts].” And value was explained as, “when [CDS] stops you from doing something or it points out something to you that you hadn’t thought of…teachable moments.”

Straightjackets can be fastened through external mandates that are meant to improve patient care: “[the] critical lab alert with Joint Commission [is] driving everybody crazy…you hav[e] to send out alerts to physicians on critical labs that…are outside of normal but not unexpectedly outside of normal.” A vendor noted that the threat of litigation for not following manufacturer guidelines can also bias design toward straightjackets: “when [an alert] is not right, then it’s an over-message. But all we need is one patient goin’ south…[and when that] happens then we get into court.”

Yet other subjects offered a philosophy that went beyond the alert-reminder paradigm and saw a “third option” in which CDS provides “guardrails.” As one subject put it, “CDS [is] neither a carrot nor a stick but a guide for doing the right thing…[we] believe in [CDS] guard rails, not straightjackets.”

These subjects described a view that CDS should guide users: “I wouldn’t say it’s necessarily changing their decision, it’s helping them making the right decision at the right time,” and “a lot of the decision support that we have is very subtle… they see and they act on it but they don’t really acknowledge it as decision support.” Guardrails may require approaching CDS differently, “you could give [users] an option to write…an additional dose or you cannot give that option…what you’re doing is pushing people toward [a decision]…by making it much more difficult to order a dose….”
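The two philosophies can be contrasted in a short hypothetical sketch: a “straightjacket” validates a free-text dose after the clinician enters it and blocks anything over a hard ceiling, while a “guardrail” shapes the choice up front by offering only sensible dose options with the usual dose as the default. The dose values below are invented for illustration.

```python
# Hedged sketch of the two philosophies. A "straightjacket" checks the dose
# after the clinician types it and blocks it; a "guardrail" shapes the choice
# up front by offering only sensible options, with the usual dose as default.
# Doses are hypothetical.
STANDARD_DOSES_MG = [250, 500, 750]   # guardrail: the pick-list offered to users
DEFAULT_DOSE_MG = 500
MAX_DOSE_MG = 1000                    # straightjacket: hard ceiling checked afterwards

def guardrail_order_screen():
    """Return the constrained dose options shown to the user (default listed first)."""
    return [DEFAULT_DOSE_MG] + [d for d in STANDARD_DOSES_MG if d != DEFAULT_DOSE_MG]

def straightjacket_check(entered_dose_mg: float):
    """Reject free-text doses above the ceiling after the fact."""
    if entered_dose_mg > MAX_DOSE_MG:
        return f"Blocked: {entered_dose_mg} mg exceeds the {MAX_DOSE_MG} mg maximum."
    return None

print(guardrail_order_screen())       # -> [500, 250, 750]
print(straightjacket_check(1500))     # -> blocked message
```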

This is not to say that guardrails are useful while straightjackets are not. On the contrary, the two philosophies may complement one another. In fact, a physician described CDS as a combination of the two in that it both, “guides me [and] restrains me.”

Discussion

From their multiple perspectives, subjects conveyed broad definitions and descriptions of clinical decision support that reflect the variety of goals people ascribe to CDS. Paradoxically, subjects appeared constrained by the term itself, which was not precise enough to describe the variety of goals users wished to achieve.

We discovered that subjects in different organizations had been holding internal discussions to define CDS. In our interviews, we encountered subjects who first had to ask us to clarify what we meant by the term. The variety of meanings attached to a single term could make it difficult for people within organizations, within research teams, and across industry to speak the same language. It is important to clarify what CDS means so that people do not fall prey to competing or conflicting assumptions that may affect CDS acceptability, assessment, and use.

Our findings reveal that subjects across disciplines and organizations share a need to distinguish different types of CDS: rule-based, workflow-based, and process-based support. Rule-based support can be considered “traditional” CDS that is most familiar to informaticians and industry experts. Yet a number of subjects, particularly clinicians, described a need for workflow-based CDS that helps them achieve their day-to-day tasks more easily, efficiently, and safely. Process-based CDS describes yet another approach, one that gives a provider a snapshot of a patient’s status in order to support patient management. The field needs to explicitly recognize that different aspects of clinical work need to be supported by different types of CDS. The three types work hand-in-hand: the data that drive rule-based CDS will not be collected if clinical workflow is significantly impeded; workflow will be slowed if users lack the tools to develop patient plans; and process-based CDS cannot operate without the data that inform it. We also found that distinctions are to be made between CDS that runs in the “foreground” and CDS that runs in the “background”; subjects described decision support functions that are “implicit” or occur in the background.

A number of subjects, particularly administrators and technicians, described a CDS philosophy from within the bounds of alerts and reminders, yet they acknowledged that these mechanisms were not ideal forms for delivering decision support. The subjects were aware that alerting and reminding could burden users and that care needed to be taken to ensure each alert provided perceived value. Administrators expressed the opinion that “value” would be gained with “smarter” CDS that accounts for the abilities and experience of users, and that the mark of a beneficial alert or reminder is whether users find it valuable. For example, an alert that could be considered valuable is one that provides a “teachable moment” to a clinician. Relying on subjective “value,” however, raises unanswered questions about how best to measure it empirically.

External and internal standards were described as factors that drove the continued use of alerts, even to the point of over-alerting. The Joint Commission requirement to alert for abnormal labs, even expectedly abnormal labs, was cited as one such example. Vendors noted the difficulty of managing alerts so as not to cause alert fatigue, yet observed that the presence of alerts provided protection from malpractice suits that could be brought against physicians who were never alerted. Of note, malpractice did not arise in interviews or observations with any of the other groups.

Other subjects seemed to favor a philosophy of decision support that was oriented toward guiding clinicians to “make the right decision at the right time.” Designing subtle CDS was held out as a “third way,” beyond the common “carrots and sticks” that organizations often use to increase CDS adherence. The philosophy of guidance seeks behavior changes through the use of default options, templates, and order sets.

This study’s limitations include coding by a single researcher and a purposive sample that was determined in part by the research team’s availability and resources.

Conclusion

A multiple-perspective approach provided valuable insight into stakeholders’ varying definitions and descriptions of CDS. Our research shows that the term “clinical decision support” may not adequately describe the range of clinical activities, practiced in clinical environments and health care organizations, that could benefit from computer-based support. Furthermore, through multiple perspectives we describe alternate meanings of CDS that have not been expressed in previous informatics definitions. Further research is needed to understand how people attribute meanings to CDS and the impact those meanings may have on CDS acceptance and use.

Acknowledgments

NLM Grants LM006942-07, 2-T15-LM007088, AHRQ contract HHSA290200810010

References

1. Goertzel G. Clinical decision support system. Ann N Y Acad Sci. 1969;161(2):689–693. doi:10.1111/j.1749-6632.1969.tb34100.x.
2. Greenes RA. Clinical Decision Support: The Road Ahead. Academic Press; 2006.
3. Corrigan JM. Crossing the Quality Chasm. Baltimore, MD: Johns Hopkins Bloomberg School of Public Health; 2004.
4. Shortliffe EH. Computer programs to support clinical decision making. JAMA. 1987;258(1):61–66.
5. Berner ES. Clinical Decision Support Systems: State of the Art [Internet]. 2009 Jun [cited 2009 Sep 27]. Available from: http://healthit.ahrq.gov/images/jun09cdsreview/09_0069_ef.html
6. Eisenberg E. Organizational Communication: Balancing Creativity and Constraint. 5th ed. Boston: Bedford/St. Martin’s; 2007.
7. Hysong SJ, Best RG, Pugh JA, Moore FI. Not of one mind: mental models of clinical practice guidelines in the Veterans Health Administration. Health Serv Res. 2005;40(3):829–847. doi:10.1111/j.1475-6773.2005.00387.x.
8. Ash JS, Sittig DF, McMullen CK, Guappone K, Dykstra R, Carpenter J. A rapid assessment process for clinical informatics interventions. AMIA Annu Symp Proc. 2008:26.
9. Strauss AL, Glaser B. The Discovery of Grounded Theory: Strategies for Qualitative Research. Weidenfeld and Nicolson; 1968.
