The Yale Journal of Biology and Medicine
2014 Jun 6;87(2):187–197.

Clinical Decision Support: Effectiveness in Improving Quality Processes and Clinical Outcomes and Factors That May Influence Success

Elizabeth V Murphy 1
PMCID: PMC4031792  PMID: 24910564

Abstract

The use of electronic health records has skyrocketed following the 2009 HITECH Act, which provides financial incentives to health care providers for the “meaningful use” of electronic medical record systems. An important component of the “Meaningful Use” legislation is the integration of Clinical Decision Support Systems (CDSS) into the computerized record, providing up-to-date medical knowledge and evidence-based guidance to the physician at the point of care. As reimbursement is increasingly tied to process and clinical outcomes, CDSS will be integral to future medical practice. Studies of CDSS indicate improvement in preventive services, appropriate care, and clinical and cost outcomes with strong evidence for CDSS effectiveness in process measures. Increasing provider adherence to CDSS recommendations is essential in improving CDSS effectiveness, and factors that influence adherence are currently under study.

Keywords: electronic health record, electronic medical records, clinical decision support systems, computerized provider order entry, quality process measure, quality outcomes, meta-analysis of randomized controlled trials, clinical practice guidelines, preventive services guidelines

Introduction

The Health Information Technology for Economic and Clinical Health (HITECH) Act, part of the 2009 American Recovery and Reinvestment Act (ARRA), provides financial incentives to hospitals and physician practices to adopt and make “meaningful use” of electronic health records (EHR) to improve the quality of patient care. EHR use has expanded rapidly since the enactment of HITECH, rising from 48 percent of office-based practices in 2009 to 72 percent by 2012 [1]. An essential component of “meaningful use” is the development of EHRs that are capable of computerized physician order entry (CPOE) with clinical decision support systems (CDSS) that integrate into workflow and facilitate clinical outcome objectives. Clinical decision support systems are still not widespread in the United States, and multiple strategies have been proposed to facilitate their expansion, either through local EHR systems or through a scalable, standards-based model that can be adapted to diverse EHR systems [2,3].

Although the recent HITECH Act has helped move us toward more widespread use of clinical decision support systems, the potential importance of such systems has been recognized for decades. In his 1968 editorial “Medical Records that Guide and Teach,” Weed asserted that “when large amounts of demographic data are developed, by means of a computer, a system could be developed whereby input of vital statistics on any patient would automatically result in an immediate print-out of his main demographic problems along with current approaches to their management” [4]. In Greenes’ groundbreaking work on the development of the Massachusetts General Hospital Utility Multi-Programming System (MUMPS) EHR system in 1969, he recognized that “the medical record … provides the primary means by which quality control, auditing of the medical care process, and research into the diagnosis and treatment of disease can be achieved” [5]. In 1970, “Primer for Writing Medical Data Base for the Clinical Decision Support System” was published by IBM’s elite Advanced Systems Development Division [6]. By 1976, McDonald was using treatment suggestions with reasons, such as “add or increase antihypertensives because last diastolic blood pressure > 100 mmHg,” in a controlled crossover study in which computer-based record systems printed out suggestions for providers. McDonald argued that computerized decision support is needed to prevent errors of oversight in providing appropriate medical care due to “man’s inefficiency as a data processor” [7].
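Early decision support of the kind McDonald describes amounted to simple condition-action rules evaluated against the patient record. The following is an illustrative sketch only (the record fields and threshold are hypothetical examples, not drawn from the original studies):

```python
# Minimal rule-based reminder in the spirit of McDonald's 1976 protocol
# suggestions: print a treatment suggestion with its reason when a stored
# value crosses a threshold. Field names here are hypothetical.

def hypertension_reminder(record):
    """Return a suggestion string if the last diastolic BP exceeds 100 mmHg."""
    last_dbp = record.get("last_diastolic_bp")
    if last_dbp is not None and last_dbp > 100:
        return ("Suggest: add or increase antihypertensives "
                f"(last diastolic blood pressure {last_dbp} mmHg > 100 mmHg)")
    return None  # no reminder fires for this patient

print(hypertension_reminder({"last_diastolic_bp": 112}))
```

The key design point, already present in these early systems, is that the suggestion carries its reason, so the provider can judge whether the rule applies to the patient at hand.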

Since the 1980s, there has been a steady increase in scientific studies examining the use of computerized clinical decision support tools in quality of care outcomes [8]. Most of these studies have evaluated CDSS process outcomes, including the ability to facilitate providers’ ordering of appropriate medications and preventive services and their adherence to appropriate care/practice guidelines in the treatment of disease. A smaller number of studies have also attempted to evaluate the effect of CDSS on clinical outcomes, including morbidity and mortality as well as cost outcomes.

CDSS Effectiveness in Preventive Services Processes

Several higher-quality studies, including multiple randomized controlled trials and well-designed quasi-experimental studies, have demonstrated the effectiveness of clinical decision support systems in increasing the appropriate use of preventive services. A 1993 randomized controlled trial compared computer-generated reminders for providers to order preventive services against an intervention using the same reminders with a required response of “done today,” “not applicable to this patient,” “patient refused,” or “next visit.” Intervention physicians complied at 61 percent vs. 49 percent in controls (p = .0007) for fecal occult blood testing and at 54 percent vs. 47 percent (p = .036) for ordering mammograms [9]. A VA cooperative study used computerized reminders for 13 standards of care, including smoking cessation counseling, diabetic foot and eye exams, hypertension counseling, lipid level measurement, and HbA1c and proteinuria testing in diabetics. The intervention group had a 5.5 percent increase in all standards of care (p = .002) over the control group and a 6 to 10 percent increase in diabetic foot, eye, and proteinuria exams, smoking cessation counseling, and pneumococcal vaccination (p = .04 to p < .001) compared with controls [10].

In a randomized controlled trial using computerized reminders for providers caring for hospitalized patients, Dexter and colleagues assessed four appropriate preventive services that had not been ordered on admission. Computerized reminders resulted in a 35.8 percent order rate for pneumococcal vaccine vs. 0.8 percent in controls (p < .001), a 51.4 percent order rate for influenza vaccine vs. 1.0 percent in controls (p < .001), a 32.2 percent order rate for prophylactic heparin vs. 18.9 percent in controls (p < .001), and a 36.4 percent order rate for prophylactic aspirin prescription at discharge vs. 27.6 percent in controls (p < .001) [11]. Electronic health record reminders were also found to be effective in increasing guideline-recommended osteoporosis diagnostic testing and treatment in older women who suffered fractures. At 6 months post fracture, reminders resulted in 51.5 percent of patients receiving recommended osteoporosis care vs. 5.9 percent in controls (p < .001) [12].

The effect of clinical decision support systems on prevention processes as well as clinical outcomes was evaluated in two studies that focused on the prevention of deep venous thrombosis in hospitalized and trauma patients. In a randomized controlled trial using computer alerts that required a response, Kucher found that 10 percent of intervention patients vs. 1.5 percent of controls were prescribed mechanical prophylaxis (p < .001) and that 23.6 percent of intervention patients vs. 13.0 percent of controls were prescribed pharmacologic prophylaxis (p < .001). The intervention group also showed a sizable decrease in the incidence of deep vein thrombosis and pulmonary embolism, with risk for these outcomes reduced by 41 percent (hazard ratio .59, p < .001), though mortality was not significantly different between the intervention and control groups by the end of the study period [13]. In a Johns Hopkins retrospective cohort study of mandatory clinical decision support for deep vein thrombosis prophylaxis in trauma patients, providers were required to fill in an electronic medical record-based questionnaire about the patient’s risk factors, and prophylaxis was initiated when indicated. Practice guideline compliance for prophylaxis increased from 62.2 percent at baseline to 99.5 percent by the end of the 3-year study period (p < .001), and there was an 83 percent decrease (p < .001) in preventable harm venous thromboembolism events (including mortality) over the study period [14].
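The 41 percent figure follows directly from the reported hazard ratio: a hazard ratio below 1 can be read as a relative risk reduction of (1 − HR) × 100 percent. A one-line check using the Kucher trial’s reported value:

```python
# Relative risk reduction implied by a hazard ratio below 1:
# reduction = (1 - HR) * 100 percent.
hazard_ratio = 0.59  # reported in the Kucher trial [13]
risk_reduction_pct = round((1 - hazard_ratio) * 100)
print(risk_reduction_pct)  # 41, matching the reported 41 percent reduction
```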

Although the evidence for CDSS effectiveness in preventive services processes is strong, some studies did not produce significant improvement in process outcomes. In a 1994 randomized clinical trial at two inner-city HMO sites, Burack and colleagues investigated the effectiveness of computerized reminders, mailed to patients and/or placed in patients’ medical records, advising physicians of the need for mammography referral. Patient reminders had no effect at either site, and physician reminders, alone or combined with patient reminders, had a significant effect only at the second site, where 59 percent vs. 43 percent of women completed mammograms after physician reminders (p < .001) [15]. This study highlights some of the difficulties in assessing CDSS response: some of the CDSS recommendations were issued months before a mammogram was due, and many factors, including financial ability and the patient’s ability to negotiate with providers, influenced whether mammograms were performed. In a 2009 RCT conducted by Fiks and colleagues, computerized reminders to physicians for administration of flu vaccine to pediatric asthma patients had a small, statistically insignificant effect. No reason was recorded for failing to administer flu vaccine, however, and it is possible that patients had already received the vaccine from another source or that vaccination was postponed for a specific reason [16].

CDSS Effectiveness in Appropriate Care Processes

Clinical decision support systems have also been effective in increasing providers’ adherence to appropriate medical care/practice guidelines, including prescribing guidelines. β-blockers have demonstrated effectiveness in improving survival for patients with left heart failure. In an attempt to increase the appropriate use of β-blockers, Heidenreich et al. performed a randomized controlled trial attaching an electronic reminder statement regarding decreased mortality with the use of β-blockers to the echocardiogram report for left heart failure patients in the intervention group, with no reminder in the control group. Seventy-four percent of patients in the reminder group vs. 66 percent in the control group were given β-blocker prescriptions (p < .002) [17]. In a prospective cohort study followed by a cluster randomized trial, a computerized decision support system (TREAT) targeted outcomes of appropriate antibiotic use. TREAT is a software program based on a causal probabilistic network (CPN) that predicts the most likely pathogen given the patient’s setting and condition. TREAT recommended appropriate antibiotics in 70 percent of cases vs. 57 percent for physicians (p < .001). When patients were treated based on TREAT advice on hospital wards, the odds ratio for receiving appropriate treatment was 3.40 compared with controls (95% CI = 2.25-5.14) [18].

In a randomized controlled trial using corollary order suggestions, Overhage and colleagues wrote computer prompts for corollary orders for 87 different tests and treatments. Examples included blood drug levels if certain antibiotics or anti-seizure medications were ordered, liver or kidney function testing if drugs with potential toxicity to those organs were ordered, blood gases after ventilator setting changes, prothrombin time after Coumadin was prescribed, and electrolytes after IV furosemide. Intervention providers ordered the recommended tests 46.3 percent of the time vs. 21.9 percent in the control physician group (p < .0001) [19]. An integrative, immediate-response clinical decision support system was tested in nursing home residents to support safe medication use and dosing in elderly patients with renal insufficiency. Sixty-two medications were entered into the computer, and intervention physicians were alerted about the patient’s creatinine clearance, the appropriate maximum medication dose and frequency, and medications to avoid given the patient’s abnormal renal function; control physicians were alerted only about the patient’s most recent creatinine level. Intervention and control physicians prescribed appropriate doses at similar rates in renal failure patients, but intervention physicians had a 2.4 relative risk (95% CI = 1.4, 4.4) of prescribing an appropriate frequency of medications, a 2.6 relative risk (95% CI = 1.4, 5.0) of avoiding drugs that should not be used in renal failure, and a 1.8 relative risk of ordering appropriate renal function tests when indicated [20].

Chronic disease management is another potential area for use of clinical decision support. McCowan et al. developed a software program for managing asthma with input from a software engineer, a statistician, a general practitioner, an asthma nurse, and a pulmonologist. The software was then tested at local and national conferences by a large number of general practitioners and practice nurses. Randomly assigned intervention practices used the software with their patients over a 6-month period. Fewer intervention patients initiated practice consultations with their providers during this period (22 percent vs. 34 percent in controls; OR .59, CI = .37-.95), and fewer intervention patients suffered acute asthma exacerbations (8 percent vs. 17 percent; OR .43, CI = .21-.85) [21]. Bell and colleagues evaluated adherence to National Asthma Education Prevention Guidelines among physicians caring for pediatric asthma patients and found that computerized reminders and alerts led to a 6 percent increase in prescriptions for controller medications in the intervention group (p = .006) and a 3 percent increase in the use of spirometry in urban practices (p = .04). In the intervention suburban practices, the proportion of patients with an up-to-date asthma care plan increased 14 percent (p = .03) and use of spirometry increased by 6 percent (p = .003) [22].

A study of the Vermont Diabetes Information System, a diabetes registry and decision support system, used computer systems to track laboratory testing, including labs that monitor glucose control, kidney function, proteinuria, and cholesterol, in order to monitor diabetes control and prevent potential complications. Providers were given summaries of patients’ results with decision support, and patients were sent alerts for out-of-range tests and reminders of appointments. This study is important because it resulted in a decreased probability of hospitalization, 0.17 vs. 0.20 in controls (p = .01), and fewer emergency room visits in the intervention group compared with controls, 0.27 vs. 0.36 (p < .0001), a 25 percent reduction in ER visits. A statistically significant cost savings of 11 percent for hospitalizations and 27 percent for ER visits was also realized [23]. The Mobile Diabetes Intervention Study combined clinical decision support for community (non-academic) providers with mobile tracking of diabetic patients in the community and a physician-patient communication portal. The maximum intervention included provider clinical decision support and patient monitoring using a mobile device with patient input about finger stick results, diet, and other issues. It also included a patient-based decision support portal with a computer “coach” that texted the patient with information and feedback related to the input, along with encouragement, and a patient-provider portal. The mean HbA1c level in the intervention group was 9.9 percent prior to the program and decreased by 1.9 percent (to 8.0 percent) over a year, vs. a .7 percent decrease in controls (p < .001), a difference that is clinically as well as statistically significant [24].

Meta-Analysis in Evaluation of Effectiveness

In a 2012 AHRQ Evidence Report/Technology Assessment on clinical decision support and knowledge management, Lobach et al. identified 15,176 citations, including 1,407 full-text articles [8]. After the studies were evaluated for quality, 323 articles were abstracted for evaluation. One hundred forty-eight randomized controlled trials were used in a meta-analysis to evaluate evidence of process or clinical outcome improvement and/or cost reduction with clinical decision support. These findings were summarized in table form (abstracted here in Table 1) in another publication [25].

Table 1. Summary of Evidence, by Outcome (abstracted from “Table. Summary of Evidence, by Outcome,” Bright TJ, et al., 2012 [25]).

Outcome | Evidence Strength | Studies (Quality Rating), n | Meta-Analysis Result for Outcome (95% CI) | Studies in Meta-Analysis, n | Other Substantial Findings
Length of stay | Low | 6 (6 good) | RR, 0.96 (0.88–1.05) favoring CDSS | 5 | Limited evidence that CDSSs that automatically delivered system-initiated recommendations to providers were effective or demonstrated a trend toward reducing length of stay
Morbidity | Moderate | 22 (13 good, 7 fair, 2 poor) | RR, 0.88 (0.80–0.96) favoring CDSS | 16 | Modest evidence from academic and community inpatient and ambulatory settings that locally developed CDSSs that automatically delivered system-initiated recommendations to providers synchronously at the point of care were effective or demonstrated a trend toward reducing patient morbidity
Mortality | Low | 7 (6 good, 1 fair) | OR, 0.79 (0.54–1.15) favoring CDSS | 6 | Limited evidence that CDSSs integrated in CPOE or EHR systems that automatically delivered system-initiated recommendations to providers were effective or demonstrated a trend toward reducing patient mortality
Adverse events | Low | 5 (3 good, 1 fair, 1 poor) | RR, 1.01 (0.90–1.14) favoring control | 5 | Limited evidence from academic settings that CDSSs that delivered recommendations to providers synchronously at the point of care demonstrated an effect on reducing or preventing adverse events
Health care process measures: recommended preventive care service ordered or completed | High | 43 (20 good, 16 fair, 7 poor) | OR, 1.42 (1.27–1.58) favoring CDSS | 25 | Strong evidence from studies conducted in academic, VA, and community inpatient and ambulatory settings that locally and commercially developed CDSSs that automatically delivered system-initiated recommendations to providers synchronously at the point of care and did not require a mandatory clinician response were effective at improving the appropriate ordering of preventive care procedures
Recommended clinical study ordered or completed | Moderate | 29 (16 good, 9 fair, 4 poor) | OR, 1.72 (1.47–2.00) favoring CDSS | 20 | Modest evidence from studies conducted in academic and community inpatient and ambulatory settings that CDSSs integrated in CPOE or EHR systems and locally and commercially developed CDSSs that automatically delivered system-initiated recommendations to providers synchronously at the point of care and did not require a mandatory clinician response were effective at improving the appropriate ordering of clinical studies
Recommended treatment ordered or prescribed | High | 67 (35 good, 24 fair, 8 poor) | OR, 1.57 (1.35–1.82) favoring CDSS | 46 | Strong evidence from academic, community, and VA inpatient and ambulatory settings that locally and commercially developed CDSSs integrated in CPOE or EHR systems that automatically delivered system-initiated recommendations to providers synchronously at the point of care and did not require a mandatory clinician response were effective at improving appropriate treatment ordering or prescribing

The meta-analysis revealed strong evidence that clinical decision support can improve process outcomes, including increased preventive services, with an odds ratio of 1.42 (95% CI = 1.27, 1.58), and increased ordering of appropriate medical treatment, odds ratio 1.57 (95% CI = 1.35, 1.82). There is moderate evidence that CDSS improves the ordering and completion of appropriate clinical studies, odds ratio 1.72 (95% CI = 1.47, 2.00), and moderate evidence that CDSS can decrease morbidity, RR 0.88 (95% CI = 0.80, 0.96). The strength of evidence that CDSS lowers mortality, costs, or adverse events is poor, but far fewer studies addressed these outcomes.
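For readers less familiar with these summary statistics, an odds ratio and its 95% confidence interval can be computed from a 2×2 table of counts. The counts below are hypothetical, for illustration only (they are not data from the review), and the CI uses the standard Woolf log-odds approximation:

```python
import math

# Hypothetical 2x2 counts: rows = CDSS vs. control,
# columns = service ordered vs. not ordered.
a, b = 120, 80   # CDSS group: ordered, not ordered
c, d = 90, 110   # control group: ordered, not ordered

# Odds ratio: odds of ordering with CDSS divided by odds without.
odds_ratio = (a * d) / (b * c)

# Approximate 95% CI on the log-odds scale (Woolf method).
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

An odds ratio above 1 with a confidence interval excluding 1, as in the preventive services and treatment results above, indicates a statistically significant benefit of CDSS for that process measure.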

It is also less likely that randomized controlled trials will demonstrate evidence of decreased mortality from chronic diseases or cancer, since this outcome would usually occur only after several years. RCTs rarely continue for such long periods because they are costly and labor intensive. There is evidence, however, that many of the preventive services increased in the CDSS studies are correlated with decreased mortality.

What Features of CDSS Will Make It More Effective?

Another question addressed in the AHRQ Technology Assessment is which features of CDSS contribute to making it more effective [8]. Kawamoto and Lobach attempted to address this question in a 2005 study that included a meta-regression analysis of 70 studies, examining 15 factors identified in prior studies as possibly relevant to CDSS success [26]. Features determined to be significant in this study are highlighted in Table 2. They included “support presented at the time of the decision, computer based support, support that included a recommendation rather than just an assessment and automatic provision of decision support as part of workflow.” It should be noted that the confidence intervals for some of these features were very broad, with one confidence interval including infinity.

Table 2. Features that Contribute to CDSS Recommendation Adherence, by Author.

Citation [26]: Systematic review of randomized controlled trials (n = 70 studies)
Features evaluated:
- Integration with charting or order entry
- Computer-based generation of decision support
- Local user involvement in development
- Clinician-system interactive features
- Automatic provision of decision support as part of clinician workflow
- Provision at time and location of decision making
- Request documentation of reason for not following system recommendations
- Provision of a recommendation, not just an assessment
- Promotion of action rather than inaction
- Justification via provision of research evidence/reasoning
- Provision of decision support results to both clinician and patient
- CDSS accompanied by periodic performance feedback
- CDSS accompanied by conventional education

Successful features (OR, 95% CI):
- Automatic provision of decision support as part of clinician workflow; OR 112 (12.9, infinity)
- Provision at time and location of decision making; OR 15.4 (1.3, 300.6)
- Provision of a recommendation, not just an assessment; OR 7.1 (1.3, 49.0)
- Computer-based generation of decision support; OR 6.3 (1.2, 45)

Citation [8]: Meta-analysis of 91 randomized controlled trials
Features evaluated:
- Integration with charting or order entry
- Computer-based generation of decision support
- Local user involvement in development
- Clinician-system interactive features
- Automatic provision of decision support as part of clinician workflow
- Provision at time and location of decision making
- Request documentation of reason for not following system recommendations
- Provision of a recommendation, not just an assessment
- Promotion of action rather than inaction
- Justification via provision of research evidence/reasoning
- Provision of decision support results to both clinician and patient
- CDSS accompanied by periodic performance feedback
- CDSS accompanied by conventional education
- No need for additional clinician data entry

Successful features (OR):
- Automatic provision of decision support as part of clinician workflow; OR 1.45 to 1.85*
- Provision at time and location of decision making; OR 1.35 to 1.78*
- Provision of a recommendation, not just an assessment; OR 1.5 to 2.01*
- Integration with charting or order entry; OR 1.47 to 1.67*
- No need for additional clinician data entry; OR 1.43 to 1.78*
- Promotion of action rather than inaction; OR 1.28 to 1.71*
- Provision of decision support results to both clinician and patient; OR 1.18 to 1.97*
- Local user involvement in the development process; OR 1.45 to 1.90

Citation [27]: Meta-regression analysis of 162 randomized controlled trials
Primary factor set evaluated:
- Some of the study’s authors are also the system’s developers
- System provides advice automatically within practitioner’s workflow
- System provides advice at time of care
- Advice presented in electronic charting or order entry systems
- Provides advice for patients
- Requires reason for over-ride

Successful features (OR, 95% CI):
- Systems providing advice for patients in addition to practitioners; OR 2.77 (1.07, 7.17)
- Required practitioners to provide a reason for over-ride; OR 11.23 (1.98, 63.72)
- Were evaluated by their developers; OR 4.35 (1.66, 11.44)

*Depends on type of care intervention.

In the AHRQ Technology Assessment, Lobach performed a meta-analysis on 91 studies that resulted in nine features associated with increased CDSS effectiveness [8]. The study features and outcome features are presented in Table 2. The successful features overlapped with those in the Kawamoto study, but added other features such as “no need for additional clinician data entry,” “provision of decision support to patients as well as providers,” and “local user involvement in the development process.” Odds ratios for effective features were between 1.18 and 3.00, with confidence intervals that were much narrower than in the Kawamoto study [26].

In 2013, Roshanov performed a meta-regression analysis on 162 randomized controlled trials to evaluate which features made CDSS effective, and his results were quite different from those of the Kawamoto [26] and Lobach [8] studies [27]. In the prior two meta-analyses, a factor was assumed absent unless it was explicitly stated to be present; in Roshanov’s study, authors were contacted and asked whether missing features were present. Roshanov also found problems with the design of previous studies, citing spuriously favorable results from testing “more factors than their study sample size could reliably support.” He opines that in order to test 15 primary factors, “Kawamoto would have required 460 studies to reliably test (the factors).” Roshanov limited his study to six primary factors for 162 RCTs, and only three of these factors were favorable (Table 2). As in Lobach’s 2012 study, systems that provided advice to patients as well as practitioners were more likely to be effective, OR 2.77 (CI = 1.07, 7.17). Systems that required practitioners to supply reasons for overriding advice were also likely to be effective, OR 11.23 (CI = 1.98, 63.72). He also notes that systems that were evaluated by their own developers were more likely to be effective, OR 4.35 (CI = 1.66, 11.44); Roshanov suggests this may reflect bias by the system developers, and he encourages third-party evaluation of these systems. Surprisingly, systems that presented advice in the order entry system interface were less likely to be effective, OR 0.37 (CI = .17, .80).

Lobach responded to this paradoxical result in Roshanov’s study in an editorial titled, “The Road to Effective Clinical Decision Support: Are We There Yet?” His answer was “no,” and he expressed particular concern that there was an appearance of an adverse effect from electronic reminders at the point of care. He asks whether the negative association is due to “alert fatigue,” “integrated systems (being) too distractive,” or the “systems being user hostile in some other way” [29].

Conclusions and Outlook

While it is becoming increasingly clear that clinical decision support is effective in improving clinical processes, it may take some time to fully realize clinical decision support’s potential in improving health care quality and outcomes. Though most of the processes that are improved through CDSS, such as adherence to mammography or colonoscopy guidelines, have known effectiveness in improving clinical outcomes, further research is needed to assure that CDSS is applying guidelines appropriately and that better clinical outcomes are realized. Research on clinical outcomes and cost is much less mature than the research on process outcomes, but clinical outcome studies have increased in recent years. Long-term prospective cohort studies or well-designed retrospective cohort studies may give more information on CDSS effects on morbidity and mortality in the future, as most of the current studies do not assess long-term outcomes.

It will likely take more trial and error in order to optimize the penetration of clinical decision support recommendations into actual provider practices. CDSS recommendations are most often derived from evidence-based practice guidelines, and increasing provider adherence to practice guidelines has been an ongoing challenge [28]. There is continuing uncertainty regarding the most effective means for advancing evidence through decision support systems, and this is an important area for future research.

In the “Ten Commandments for Effective Clinical Decision Support,” Bates argues that making CDS systems user-friendly and well-integrated into the work flow, with ongoing knowledge updates, will lead to great advances in evidence-based medicine [30]. Sittig’s “five rights” for clinical decision support include the “right information, to the right person, in the right format, through the right channel, at the right point in workflow” [31]. Kawamoto and Lobach [26] proposed many variables supporting clinical decision support effectiveness, and these were later refined to nine factors for success [8], but these theories have all been brought into question by Roshanov’s findings that success is most strongly correlated to requiring providers to provide an explanation for failing to comply with recommendations and giving the support advice directly to patients [27].

None of these studies considered financial incentives or the possible impact of the mandates within the Meaningful Use Clinical Quality Measures on adherence to clinical decision support recommendations. Many of the Meaningful Use measures are processes that were improved by clinical decision support systems in randomized controlled trials, so implementation and reporting on these measures has good potential to facilitate compliance with decision support recommendations. Roshanov’s findings suggest that physician compliance with decision support is better when there is an outside incentive, such as saving time (and avoiding a possible audit) by accepting a recommendation rather than explaining why it was declined, or maintaining the respect and trust of patients by complying with their knowledge-based requests regarding their care. Perhaps the monetary incentive in Meaningful Use will also be effective in changing provider behavior. Will the “right incentive” be added to the five rights?

Finally, the evidence from numerous well-designed studies is starting to confirm the importance of patient-centered care and the patient as a partner in improving health care quality through clinical decision support. Following meta-analyses of dozens of randomized controlled trials, both Lobach [8] and Roshanov [27] found that providing clinical decision support advice to patients in addition to providers resulted in increased adherence to CDS recommendations (OR 1.78 to 2.77). Perhaps we have underestimated our patients’ ability to understand the underlying information and evidence in practice guidelines and to advocate for the quality of their care. Patient-directed clinical decision support may well be an important frontier in improving health care quality and is at least worthy of more focused exploration.

Acknowledgments

The author acknowledges Professor Bill Hersh for help and encouragement in understanding important concepts related to this review.

Abbreviations

EHR: electronic health record

CDSS: clinical decision support system

CPOE: computerized physician order entry

RCT: randomized controlled trial

OR: odds ratio

RR: risk ratio

CI: confidence interval

HITECH: Health Information Technology for Economic and Clinical Health Act

ARRA: American Recovery and Reinvestment Act

AHRQ: Agency for Healthcare Research and Quality

References

  1. Hsiao CJ, Hing E. Use and Characteristics of Electronic Health Record Systems Among Office-based Physician Practices: United States, 2001-2012. NCHS Data Brief. 2012 Dec;(111). Available from: http://www.cdc.gov/nchs/data/databriefs/db111.htm [accessed 15 Feb 2014].
  2. Osheroff JA, Teich JM, Middleton B, Steen EB, Wright A, Detmer DE. A Roadmap for National Action on Clinical Decision Support. J Am Med Inform Assoc. 2007;14(2):141–145. doi:10.1197/jamia.M2334.
  3. Kawamoto K, Jacobs J, Welch BM, Huser V, Paterno MD, Del Fiol G, et al. Clinical Information System Services and Capabilities Desired for Scalable, Standards-Based, Service-oriented Decision Support: Consensus Assessment of the Health Level 7 Clinical Decision Support Work Group. AMIA Annu Symp Proc. 2012:446–455.
  4. Weed LL. Medical Records that Guide and Teach. N Engl J Med. 1968;278(12):652–657. doi:10.1056/NEJM196803212781204.
  5. Greenes RA, Pappalardo AN, Marble CW, Barnett GO. Design and Implementation of a Clinical Data Management System. Comput Biomed Res. 1969;2(5):469–485. doi:10.1016/0010-4809(69)90012-3.
  6. Fallon H, Goertzel G, Marler GE, Pulver RW. A Primer for Writing Medical Data Base for the Clinical Decision Support System. Prog Brain Res. 1970;33:155–175. doi:10.1016/s0079-6123(08)62449-8.
  7. McDonald CJ. Protocol-Based Computer Reminders, the Quality of Care and the Non-Perfectability of Man. N Engl J Med. 1976;295(24):1351–1355. doi:10.1056/NEJM197612092952405.
  8. Lobach D, Sanders GD, Bright TJ, Wong A, Dhurjati R, Bristow E, et al. Enabling Health Care Decisionmaking Through Clinical Decision Support and Knowledge Management. Evid Rep Technol Assess (Full Rep). 2012;(203):1–784.
  9. Litzelman DK, Dittus RS, Miller ME, Tierney WM. Requiring Physicians to Respond to Computerized Reminders Improves Their Compliance with Preventive Care Protocols. J Gen Intern Med. 1993;8:311–317. doi:10.1007/BF02600144.
  10. Demakis JG, Beauchamp C, Cull WL, Denwood R, Eisen S, Lofgren R, et al. Improving residents’ compliance with standards of ambulatory care: results from the VA Cooperative Study on Computerized Reminders. JAMA. 2000;284(11):1411–1416. doi:10.1001/jama.284.11.1411.
  11. Dexter PR, Perkins S, Overhage JM, Maharry K, Kohler RB, McDonald CJ. A Computerized Reminder System to Increase the Use of Preventive Care in Hospitalized Patients. N Engl J Med. 2001;345(13):965–970. doi:10.1056/NEJMsa010181.
  12. Feldstein A, Elmer PJ, Smith DH, Herson M, Orwoll E, Chen C, et al. Electronic Medical Record Reminder Improves Osteoporosis Management after a Fracture: A Randomized Controlled Trial. J Am Geriatr Soc. 2006;54(3):450–457. doi:10.1111/j.1532-5415.2005.00618.x.
  13. Kucher N, Koo S, Quiroz R, Cooper JM, Paterno MD, Soukonnikov B, et al. Electronic Alerts to Prevent Venous Thromboembolism in Hospitalized Patients. N Engl J Med. 2005;352(10):969–977. doi:10.1056/NEJMoa041533.
  14. Haut ER, Lau BD, Kraenzlin FS, Hobson DB, Kraus PS, Carolan HT, et al. Improved Prophylaxis and Decreased Rates of Preventable Harm With the Use of a Mandatory Computerized Clinical Decision Support Tool for Prophylaxis for Venous Thromboembolism in Trauma. Arch Surg. 2012;147(10):901–907. doi:10.1001/archsurg.2012.2024.
  15. Burack RC, Gimotty PA, George R, Simon MS, Dews P, Moncrease A. The Effect of Patient and Physician Reminders on the Use of Screening Mammography in a Health Maintenance Organization. Cancer. 1994;78(8):1708–1721.
  16. Fiks AG, Hunter KS, Localio AR, Grundmeier RW, Bryant-Stephens T, Luberti AA, et al. Impact of Electronic Health Record-Based Alerts on Influenza Vaccination in Children with Asthma. Pediatrics. 2009;124(1):159–169. doi:10.1542/peds.2008-2823.
  17. Heidenreich PA, Gholami P, Sahay A, Massie B, Goldstein MK. Clinical Reminders Attached to Echocardiography Reports of Patients with Reduced Left Ventricular Ejection Fraction Increase Use of β-Blockers: a Randomized Trial. Circulation. 2007;115:2829–2834. doi:10.1161/CIRCULATIONAHA.106.684753.
  18. Paul M, Andreassen S, Tacconelli E, Nielsen AD, Almanasreh N, Frank U, et al. Improving empirical antibiotic treatment using TREAT, a computerized decision support system: cluster randomized trial. J Antimicrob Chemother. 2006;58:1238–1245. doi:10.1093/jac/dkl372.
  19. Overhage JM, Tierney WM, Zhou XH, McDonald CJ. A Randomized Trial of “Corollary Orders” to Prevent Errors of Omission. J Am Med Inform Assoc. 1997;4:364–375. doi:10.1136/jamia.1997.0040364.
  20. Field TS, Rochon P, Lee M, Gavendo L, Baril JL, Gurwitz JH. Computerized Clinical Decision Support During Medication Ordering for Long-term Care Residents with Renal Insufficiency. J Am Med Inform Assoc. 2009;16(4):480–485. doi:10.1197/jamia.M2981.
  21. McCowan C, Neville RG, Ricketts IW, Warner FC, Hoskins G, Thomas GE. Lessons from a randomized controlled trial designed to evaluate computer decision support software to improve the management of asthma. Med Inform. 2001;26(3):191–201. doi:10.1080/14639230110067890.
  22. Bell LM, Grundmeier R, Localio R, Zorc J, Fiks AG, Zhang X, et al. Electronic Health Record-Based Decision Support to Improve Asthma Care: A Cluster-Randomized Trial. Pediatrics. 2010;125:e770–e777. doi:10.1542/peds.2009-1385.
  23. Khan S, Maclean CD, Littenberg B. The effect of the Vermont Diabetes Information System on inpatient and emergency room use: results from a randomized trial. Health Outcomes Res Med. 2010;1(1):e61–e66. doi:10.1016/j.ehrm.2010.03.002.
  24. Quinn CC, Shardell MD, Terrin ML, Barr EA, Ballew SH, Gruber-Baldini AL. Cluster-Randomized Trial of a Mobile Phone Personalized Behavioral Intervention for Blood Glucose Control. Diabetes Care. 2011;34:1934–1942. doi:10.2337/dc11-0366.
  25. Bright TJ, Wong A, Dhurjati R, Bristow E, Bastian L, Coeytaux RR, et al. Effect of Clinical Decision Support Systems: A Systematic Review. Ann Intern Med. 2012;157(1):29–43. doi:10.7326/0003-4819-157-1-201207030-00450.
  26. Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765. doi:10.1136/bmj.38398.500764.8F.
  27. Roshanov PS, Fernandes N, Wilczynski JM, Hemens BJ, You JJ, Handler SM, et al. Features of effective computerized clinical decision support systems: meta-regression of 162 randomised trials. BMJ. 2013;346:f657. doi:10.1136/bmj.f657.
  28. Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud PC, et al. Why Don’t Physicians Follow Clinical Practice Guidelines? A Framework for Improvement. JAMA. 1999;282(15):1458–1465. doi:10.1001/jama.282.15.1458.
  29. Lobach DF. The road to effective clinical decision support: are we there yet? BMJ. 2013;346:f1616. doi:10.1136/bmj.f1616.
  30. Bates DW, Kuperman GJ, Wang S, Gandhi T, Kittler A, Volk L, et al. Ten Commandments for Effective Clinical Decision Support: Making the Practice of Evidence-Based Medicine a Reality. J Am Med Inform Assoc. 2003;10(6):523–530. doi:10.1197/jamia.M1370.
  31. Sirajuddin AM, Osheroff JA, Sittig DF, Chuo J, Velasco F, Collins DA, et al. Implementation Pearls from a New Guidebook on Improving Medication Use and Outcomes with Clinical Decision Support. J Healthc Inf Manag. 2009;23(4):38–45.
