AMIA Annual Symposium Proceedings. 2010 Nov 13;2010:1427–1431.

Multiple Perspectives on the Meaning of Clinical Decision Support

Joshua E Richardson 1, Joan S Ash 1, Dean F Sittig 2, Arwen Bunce 1, James Carpenter 3, Richard H Dykstra 1, Ken Guappone 3, James McCormack 1, Carmit K McMullen 5, Michael Shapiro 1, Adam Wright 4, Blackford Middleton 4
PMCID: PMC3041401  PMID: 21347119

Abstract

Clinical Decision Support (CDS) is viewed as a means to improve safety and efficiency in health care. Yet the lack of consensus about what is meant by CDS represents a barrier to effective design, implementation, and utilization of CDS tools. We conducted a multi-site qualitative inquiry to understand how different people define and describe CDS. Using subjects’ multiple perspectives, we gained new insight into what stakeholders want CDS to achieve and how to achieve it, even when those perspectives compete and conflict.

Introduction

In 1969, Goertzel introduced the concept of a clinical decision support (CDS) system as “a tool to aid the physician in patient care, in data acquisition, and in decision making.” (1) Greenes described CDS as an action: “the use of the computer to bring relevant knowledge to bear on the health care and well-being of the patient.” (2) Authors of Crossing the Quality Chasm framed their definition by the types of decisions CDS is meant to support: “preventive and monitoring tasks, prescribing of drugs, and diagnosis and management.” (3) Shortliffe defined CDS as a “function” of both a system, “any computer program designed to help health professionals make clinical decisions,” and its tools: “tools for information management, tools for focusing attention, and tools for patient-specific consultation.” (4) Finally, Berner’s CDS definition includes types of potential users: “clinicians, staff, patients, and other individuals.” (5)

Any one of the above definitions is not necessarily better than another. However, definitions convey meaning and understanding across constituencies. Based on organizational communication theory, each definition is 1) partial, in that it “only tells one part of a story”; 2) partisan, reflecting the viewpoint of the author(s); and 3) problematic, generating more questions than answers, with any answers based on what is known rather than on “all that could be known.” (6) How stakeholders interpret the meaning of CDS could affect the way CDS is discussed, designed, and disseminated across research and clinical settings.

The lack of consensus as to what is meant by CDS may represent a barrier to effective design, implementation, and utilization of these clinical support tools. An illustrative example focuses on one type of CDS: clinical practice guidelines (CPGs). Hysong et al. (7) conducted interviews and observations of administrators, middle managers, and primary care providers across 15 Veterans Administration (VA) hospitals to find out whether staff at each site exhibited shared understandings, or “mental models,” of CPGs. Subjects were asked to describe how they interpreted the meaning of CPGs. The authors concluded that staff within “high-performing” VA hospitals communicated “clear” shared mental models of CPGs, whereas “low-performing” VA hospitals were associated with staff that “lacked clear, dominant mental model[s].” In short, shared understanding of CPGs’ meanings may have facilitated adoption and use of CPGs to improve practice.

Our team explored what CDS means to multiple health IT constituencies, including users, developers, administrators, “bridgers”, and vendors. Using subjects’ multiple perspectives, we were able to gain new insight into what stakeholders want CDS to achieve and how to achieve it, even when those perspectives are competing and conflicting.

Methods

A multi-disciplinary team of qualitative researchers used an ethnographic method called the Rapid Assessment Process (RAP) (8). RAP relies on a team approach to expedite interview and observation data collection. Audio-recorded interviews, naturalistic observations, and questionnaires were collected from a purposive sample of academic medical centers, community practices, community hospitals, and CDS vendors from December 2007 to December 2009. Nine sites (Table 1) were selected based on their reputations as leaders in the development and/or use of CDS.

Table 1.

Types and locations of organizations visited.

Site | Location
Regenstrief Institute | Indianapolis, IN
UMDNJ | Newark, NJ
Partners HealthCare | Boston, MA
Roudebush VA Medical Center | Indianapolis, IN
Mid-Valley IPA | Salem, OR
Providence Portland Medical Center | Portland, OR
El Camino Hospital | Mt. View, CA
2 clinical content vendors (anonymous) | United States

The research team conducted 183 interviews and observations from December 2007 to October 2009. Forty-six subjects provided 1) an explicit definition of CDS, 2) descriptions of CDS, or 3) both a definition and a description. Subjects were classified into one of five roles: 1) Administrators (CIOs, directors, etc.); 2) Technical staff (analysts, IT support, etc.); 3) Clinicians (physicians, nurses, etc.); 4) Bridgers (informaticians, content developers, etc.); and 5) Vendors (clinical content vendors). A “best fit” role was selected for subjects whose job titles, roles, and responsibilities overlapped across organizations. Definitions and descriptions were analyzed using a grounded theory approach, in which data are iteratively reviewed, labels (“codes”) are attributed to significant concepts, and codes are then organized into themes. (9) Codes and themes were organized using NVivo qualitative analysis software (QSR International, v.8). We provided written results of our findings to each organization and gathered their feedback. We also conducted a theme analysis to further understand how CDS types may differ according to subjects’ roles (Table 2).

Table 2:

Number of respondents sorted by role who mentioned each type of decision support (DS).

Role | Workflow DS | Alerting DS | Cognitive DS
Admin (N=12) | 8 | 10 | 8
Bridger (N=9) | 7 | 7 | 5
Technical (N=5) | 3 | 3 | 5
Clinician (N=11) | 8 | 3 | 6
Vendor (N=5) | 3 | 4 | 3
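
The counts in Table 2 are the kind of cross-tabulation that falls out once transcripts have been coded for role and CDS type. As a minimal illustration only (the study itself organized codes in NVivo; the record structure and function below are hypothetical), such a role-by-type tally could be computed as follows:

```python
from collections import defaultdict

# Hypothetical coded records: one entry per subject, with the subject's role
# and the set of CDS-type codes applied to that subject's transcript.
coded_subjects = [
    {"role": "Admin", "codes": {"workflow", "alerting", "cognitive"}},
    {"role": "Clinician", "codes": {"workflow", "cognitive"}},
    {"role": "Vendor", "codes": {"alerting"}},
    # ... one record per coded subject
]

def tally_by_role(subjects, cds_types=("workflow", "alerting", "cognitive")):
    """Count, for each role, how many subjects mentioned each CDS type."""
    counts = defaultdict(lambda: {t: 0 for t in cds_types})
    for subject in subjects:
        for cds_type in cds_types:
            if cds_type in subject["codes"]:
                counts[subject["role"]][cds_type] += 1
    return dict(counts)

if __name__ == "__main__":
    for role, row in tally_by_role(coded_subjects).items():
        print(role, row)
```

Each subject is counted at most once per CDS type, which matches how Table 2 reports numbers of respondents rather than total mentions.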

Results

We identify the following major themes and issues:

The Ambiguous Meaning of CDS

A number of subjects found that “clinical decision support” is an ambiguous term with an ambiguous definition. One Bridger stated, “We’ve wrestled with [CDS definitions],” and a vendor explained, “I know personally we’re struggling with our definition of decision support.” Some subjects who were responsible for working with CDS asked interviewers for their definition of CDS before they would provide their own. A technical subject said, “I’d like to ask you to define [CDS] a little bit better...[the definition] depends on what you’re using the system for.” A vendor representative specifically responsible for selling CDS was caught off guard when asked to provide a definition: “I don’t know that I’ve given it a moment’s thought.” The ambiguity that often surrounded the definitions encouraged our team to try to understand how people conceptualize and operationalize CDS within and across organizations.

Decision Support: Alerts, Workflow, Cognition

Subjects defined and described three types of CDS: 1) Alerting CDS, such as alerts and reminders that fire to deliver information and interrupt workflow; 2) Workflow CDS, meant to ease data entry, documentation, and resource location; and 3) Cognitive CDS, which provides a patient management and planning overview.

Alerting CDS was often described as alerts and reminders that are presented at the point of care. A vendor explained, “I think…an alert is actionable…that fires when certain conditions are met.” Clinical practice guidelines, protocols, or order sets were consistently left out of initial considerations and were discussed only after prompting by a researcher. One administrator told us: “order [sets]…wouldn’t necessarily [qualify] as decision support…they do guide you but they don’t give you alerts and reminders.”

Some explained that the challenge of alerting CDS is meeting specific user needs: “over-alerting is a huge problem for us…depending on the practice setting [and] the level of knowledge that [a] practitioner has, they want different levels of information,” and, “in an ideal world [CDS] would be a system tailored to…individual skills.” Other subjects noted challenges to fitting alerting CDS within specific environments: “if you’re in an oncology clinic, the level of expertise and the doses that are going to be used [is very different than in] a general population.”

Solutions included recognizing how and when it is best to apply alerting CDS: “it’s that balance between…redundant [alerts] that take time and staying time efficient so that providers will actually use…and value [CDS],” as well as developing further technical sophistication: “Decision support needs to become smarter.”

Other subjects viewed CDS less as acute alerting and more as workflow-based, in that it provides help for clinical work as well as for decisions. Bridgers were aware that CDS can be meant to enhance clinical workflow, and they build software tools accordingly. For example, “once the [physician’s] decisions are being made, we actually make it very easy to write patients a letter describing the test results in a patient friendly format.” However, one technical subject described workflow and CDS as if they were different phenomena: “Workflow is more of a concern [to physicians] than is CDS.”

Clinicians provided descriptions and definitions that emphasized ease of use and workflow. A physician explained, “…if a patient needs something, I don’t want to have to open up another window. As it is now, I have to open up and check the weight; I have to look at their last note to see when that was done. I have to look at the vitals and look at the labs to see when the last lab was done. It takes a long time.” A pediatrician lamented the emphasis placed on data input: “Everything is so focused about putting [data] in; nobody talks about what you can get out.”

Types of CDS that appeared to facilitate workflow included templates and order sets: “templates and order sets are ‘memory prosthes[e]s’ for her…It forces [her] to be clear,” but a physician noted, “templates limit what you can put in.”

There were also common workflow pitfalls when using reference materials: “[The system] lacks a link button to external resources,” and, “It is easier for her to use Google.” One observation illustrates this: “[the physician walked] across the room to get a copy of ‘Facts and Comparisons’…looked up the dose, scribbled a bit on a Post-it note and used a calculator to figure out the volume of elixir that had the same dosage of antibiotic. [When asked] he felt the book was much faster and easier [than Micromedex].”

A third form of CDS, which we term cognitive CDS, provides users with new insights into the patient’s disease state that they might not otherwise have. A physician recalled, “the best tool that I’ve ever had [was] for follow-up on ordering tests…it create[d] a patient notification form in our system at the same time so if the patient [didn’t] get the test…that form pops right up on the desktop and I take whatever action…” A technical subject provided a counter example, “some of the practices are taking advantage of [automated] recall letters [and] notifications…That’s not really decision support.” The examples illustrate the different perspectives we encountered in our study.

Other process-based support includes functions that facilitate communication. An observer noted, “his first example was contact with other doctors – this was a form of CDS.” A technical subject included messaging when asked about CDS, “some clinical staff will go into the system just to do phone notes.” Another technical subject explained how software supported the processes involved with team-based care: “[Groupware]…not only provide[s]…a focal point for interaction that we can use across time and space…[it] also provides a historical record…”

Subjects consistently attempted to define and describe distinctions among different types of CDS. One vendor described a categorical view of CDS as either “actionable” CDS or “impact” CDS. Actionable CDS is “added into the workflow in such a way that the clinician does not need to stop the basic processes of…assessing, coming to decisions…it’s just THERE,” whereas “impact” CDS is “available, but it does require interruption of the workflow.” Another vendor differentiated between a lower-level “management” CDS: “a decision may have been already made to give a medication. And so having…dose checking will tell you maybe you’re outside of the normal dose range,” and a higher-level “leadership” CDS: “Are you doing the right thing? Should you have even given that medication to begin with?”

How CDS operates: Explicit vs. Implicit

Subjects described system designs and clinical scenarios that at times called for “explicit” CDS that made clinician users aware that the system was offering support. For example, a technical subject described explicit CDS that reacts to a user’s order entry: “something that looks at the electronic medical record…then based on rules determine[s] the answer to certain questions,” and “[CDS] assist[s] the provider based on information [he or she] collects and provides them with a list of rule outs, [a] list of possibilities for diagnosis, or treatment plans.” In contrast, subjects described system designs and clinical scenarios that called for “implicit” CDS which subtly supports clinical decision-making. One clinician provided an example of implicit CDS: “clinical decision support to me means that there [is] some automated process in the background that helps direct me to do a clinically relevant safe appropriate task…”

Some subjects included administrative reporting as implicit CDS. These reporting systems gather data from different clinical systems (e.g., numbers of patients seen, treatments provided, and hospital outcomes), away from clinician users, and present them to healthcare administrators in ways that inform resource allocation. “I…started getting information from their decision support system to help them make funding decisions for the next fiscal year,” and, “sometimes you do clinical decision support on how you bill.” Some administrators felt that “financial decision support” is a form of CDS that takes into account aggregate clinical information and is reviewed long after the patient interactions have been completed.

CDS Philosophies: Straightjackets vs. Guardrails

Subjects within individual organizations explained shared meanings of CDS that revealed different philosophies behind the design and development process. An administrator described his organization’s philosophy: “[we] believe in [CDS] guard rails, not straightjackets.” Furthermore, his organization holds the philosophy that “CDS [is] neither a carrot nor a stick but a guide for doing the right thing.”

“Straightjackets” represent a view that CDS can restrict clinicians in their decision making, often for the purpose of standardizing care: “when [CDS] stops you from doing something or it points out something to you that you hadn’t thought of…teachable moments.” Straightjackets can represent external mandates that are meant to improve patient care: “[the] critical lab alert with [JCAHO is] driving everybody crazy…you hav[e] to send out alerts to physicians on critical labs that…are outside of normal but not unexpectedly outside of normal.” And a vendor noted that legal threats result from not following manufacturers’ guidelines: “when [an alert] is not right, then it’s an over-message. But all we need is one patient goin’ south…[and then] we get into court.”

The “guardrails” metaphor represents a view that, whenever possible, CDS design should place bounds around potential decisions rather than alert users to incorrect decisions. One subject stated, “I wouldn’t say it’s necessarily changing [users’] decision[s], it’s helping them mak[e] the right decision at the right time,” and “decision support that we have is very subtle…[users] see and act on it but they don’t really acknowledge it as decision support.” Guardrails may require approaching CDS differently: “you could give [users] an option to write…an additional dose or you cannot give that option…what you’re doing is pushing people toward [a decision]…by making it much more difficult to order a dose….”

This is not to say that guardrails are useful while straightjackets are not. On the contrary, the two philosophies may complement one another. In fact, a physician described CDS as a combination of the two in that CDS both “guides me [and] restrains me.”

Discussion

From their multiple perspectives, study subjects conveyed broad definitions and descriptions of clinical decision support which reflect the variety of goals people ascribe to CDS. Paradoxically, subjects appeared constrained by the term itself in that it was not precise enough to describe the variety of goals users wished to achieve.

We discovered that subjects in different organizations had been having internal discussions to define CDS. In our interviews we encountered subjects who first asked us to clarify what we meant by CDS before they would answer. The variety of meanings attached to a single term could make it difficult for people within organizations, within research teams, and across industry to speak the same language. It is important to clarify what CDS means so that people do not fall prey to competing or conflicting assumptions that may affect CDS acceptability, assessment, and use.

Our findings reveal that subjects from across disciplines and organizations share a need to distinguish different types of CDS: alerting, workflow, and cognitive. Alerting CDS can be considered “traditional” CDS and is the type most familiar to informaticians and industry experts. Yet a number of subjects described a need for workflow CDS that helps them accomplish day-to-day tasks more easily, efficiently, and safely. Cognitive CDS describes yet another approach, one that enables a provider to get a snapshot of a patient’s disease state in order to support patient management. The field needs to explicitly recognize that different aspects of clinical work need to be supported by different types of CDS. The three types work hand-in-hand: the data that drive alerting CDS will not be collected if clinical workflow is significantly impeded; workflow CDS will not be optimized if users lack the tools to develop patient plans in a timely manner; and cognitive CDS operates poorly, if at all, without the data that inform it. We also found that distinctions are to be made between CDS that is implicit and CDS that is explicit.

A number of subjects described a CDS philosophy framed within the bounds of alerts and reminders. Yet they acknowledged that these mechanisms were not always ideal forms for delivering decision support. The subjects were aware that acute alerting and reminding could be burdensome to users and that care needed to be taken to ensure each alert provided value. Administrators expressed the opinion that “value” would be gained with “smarter” CDS that accounts for the abilities and experience of users, and that the mark of beneficial alerts and reminders is whether or not users find them valuable. For example, an alert that could be considered valuable is one that provides a “teachable moment” to a clinician. Relying on subjective “value,” however, raises unanswered questions about how best to measure it empirically.

External and internal standards were described as factors that drove the continued use of alerts, even to the point of over-alerting. The Joint Commission requirement to alert for abnormal labs, even expectedly abnormal labs, was cited as such an example. Vendors noted the difficulty of managing alerts so as not to cause alert fatigue, yet also noted that the presence of alerts provided protection from malpractice suits that could be brought against physicians who were not alerted. Of note, the mention of malpractice did not arise in interviews or observations with any of the other groups.

Other subjects seemed to favor a philosophy of decision support that was oriented toward guiding clinicians to “make the right decision at the right time.” Designing subtle CDS was held out as a “third way,” beyond the common “carrots and sticks” that organizations often use to increase CDS adherence. The philosophy of guidance seeks behavior changes through the use of default options, templates, and order sets.

A limitation of this study is that although all transcripts and fieldnotes were coded by multiple researchers, the subcoding of the “meaning” theme was accomplished by the first author only.

Conclusion

A multiple perspectives approach provided valuable insight into how stakeholders hold varying definitions and descriptions of CDS. Our research shows that the term “clinical decision support” may not adequately describe the range of clinical activities, practiced in clinical environments and health care organizations, that could benefit from computer-based support. Furthermore, through multiple perspectives we describe alternate meanings of CDS that have not been expressed in previous informatics definitions. Further research is needed to understand how people attribute meanings to CDS and the impact those meanings may have on CDS acceptance and use.

Acknowledgments

This work was supported by NLM Grants LM006942-07, 2-T15-LM007088, and AHRQ contract HHSA290200810010. Special thanks go to Joseph Wasserman, NLM summer intern.

References

1. Goertzel G. Clinical decision support system. Ann N Y Acad Sci. 1969 Sep 30;161(2):689–693. doi: 10.1111/j.1749-6632.1969.tb34100.x.
2. Greenes RA. Clinical decision support: the road ahead. Academic Press; 2006.
3. Corrigan JM. Crossing the quality chasm. Baltimore, MD: Johns Hopkins Bloomberg School of Public Health; 2004.
4. Shortliffe EH. Computer programs to support clinical decision making. JAMA. 1987;258(1):61–66.
5. Berner ES. Clinical Decision Support Systems: State of the Art [Internet]. 2009 Jun [cited 2009 Sep 27]. Available from: http://healthit.ahrq.gov/images/jun09cdsreview/09_0069_ef.html.
6. Eisenberg E. Organizational communication: balancing creativity and constraint. 5th ed. Boston: Bedford/St. Martin’s; 2007.
7. Hysong SJ, Best RG, Pugh JA, Moore FI. Not of one mind: mental models of clinical practice guidelines in the Veterans Health Administration. Health Serv Res. 2005 Jun;40(3):829–847. doi: 10.1111/j.1475-6773.2005.00387.x.
8. Ash JS, Sittig DF, McMullen CK, Guappone K, Dykstra R, Carpenter J. A rapid assessment process for clinical informatics interventions. AMIA Annu Symp Proc. 2008. p. 26.
9. Strauss AL, Glaser B. The discovery of grounded theory: Strategies for qualitative research. Weidenfeld and Nicolson; 1968.
