Author manuscript; available in PMC: 2019 Oct 25.
Published in final edited form as: Am J Manag Care. 2018 Aug;24(8):361–366.

Choosing Wisely Clinical Decision Support Adherence and Associated Inpatient Outcomes

Andrew M Heekin 1, John Kontor 1, Harry C Sax 1, Michelle S Keller 1, Anne Wellington 1, Scott Weingarten 1
PMCID: PMC6813785  NIHMSID: NIHMS1051815  PMID: 30130028

Abstract

Objectives:

To determine whether utilization of clinical decision support (CDS) is correlated with improved patient clinical and financial outcomes.

Study Design:

Observational study of 26,424 patient encounters. In the treatment group, the provider adhered to all CDS recommendations. In the control group, the provider did not adhere to CDS recommendations.

Methods:

An observational study of provider adherence to a CDS system was conducted using inpatient encounters spanning 3 years. Data comprised alert status (adherence), provider type (resident, attending), patient demographics, clinical outcomes, Medicare status, and diagnosis information. We assessed the association between alert adherence and 4 outcome measures: encounter length of stay, odds of 30-day readmission, odds of complications of care, and total direct costs. The association between alert adherence and the outcome measures was estimated using 4 generalized linear models that adjusted for potential confounders, such as illness severity and case complexity.

Results:

The total encounter cost increased 7.3% (95% CI, 3.5%–11%) for nonadherent versus adherent encounters. We found a 6.2% (95% CI, 3.0%–9.4%) increase in length of stay for nonadherent versus adherent encounters. The odds of readmission within 30 days were 1.14 (95% CI, 0.998–1.31) times higher for nonadherent versus adherent encounters, and the odds of complications were 1.29 (95% CI, 1.04–1.61) times higher.

Conclusions:

Consistent improvements in measured outcomes were seen in the treatment group versus the control group. We recommend that provider organizations consider introducing real-time CDS to support adherence to evidence-based guidelines; however, because we cannot determine the cause of the association between CDS adherence and improved clinical and financial outcomes, further study is required.

Précis:

This analysis examines the association between adherence to Choosing Wisely recommendations embedded into clinical decision support alerts and 4 measures of resource use and quality.


The Health Information Technology for Economic and Clinical Health Act, an important component of the American Recovery and Reinvestment Act, enabled the federal government to subsidize hospitals, health systems, and physicians $40 billion1 to implement electronic health records (EHRs) and imposed significant penalties on nonadopters. This investment was expected to result in up to $470 billion in inpatient cost savings alone2 through reduced patient length of stay,3 reduced utilization of services, and other outcomes.4 Today, certified EHRs are operational in 96% of nonfederal acute care hospitals and health systems in the United States,5 but the expected cost savings have not yet been realized.6 The evidence that EHRs improve quality and patient outcomes has been mixed; studies have found improved quality of care in the ambulatory care setting,7 higher guideline adherence, fewer medication errors, and decreased adverse drug effects.8 However, other studies have found that EHR use is not associated with decreased readmissions9,10 or lower rates of mortality.8

In 2012, the ABIM Foundation introduced the Choosing Wisely (CW) initiative, a voluntary effort by more than 70 physician subspecialty societies to identify commonly used low-value services.11 The intent of this publicly promoted initiative was to stimulate provider–patient discussions about appropriate care and thereby reduce low-value tests and treatments.7 Although the primary aim of CW is not to lower costs, reducing inappropriate care could lead to lower costs for both patients and payers. To date, CW may not have achieved clinically significant changes in reducing low-value care.12-14 Public promotion alone does not appear to be sufficient to achieve widespread adoption.10 A 2015 claims-based analysis of 7 CW recommendations found that use of 2 low-value services declined, but the decreases were not clinically significant.10 In their recommendations, the authors called for innovative methods to disseminate CW recommendations.10 Provider difficulty interpreting guidelines and evaluating patient risk,15,16 patient need for reassurance,13 and provider fear of malpractice litigation17 pose additional obstacles.

Ideally, an EHR infrastructure could overcome these obstacles and provide real-time computerized clinical decision support (CDS) to inform healthcare providers when their care deviates from evidence-based guidelines. CDS comprises a variety of tools, including computerized alerts and reminders with information such as diagnostic support, clinical guidelines, relevant patient information, diagnosis-specific order sets, documentation templates, and drug–drug interactions.18 CDS provides the ability to modify tests and treatments based on context-specific and patient-specific information presented at the point of care. Utilizing CDS can help providers avoid ordering a low-value test or intervention that could lead to additional nontherapeutic interventions or even harm. CDS has been shown to improve a variety of processes, including prescribing practices,19 appropriate use of diagnostic radiology,20 adherence to quality measures,21 and conformance to evidence-based care.19 Systems that automate CDS, provide tailored recommendations based on patient characteristics, and prompt clinicians to provide a reason for overriding recommendations have been shown to be significantly more likely to succeed than systems that provide only patient assessments.19

We implemented select CW recommendations in the EHR at a large academic health system in the form of 92 alert-based CDS interventions, both inpatient and ambulatory. Inpatient alerts selected for study were those deemed the most technically feasible to deploy accurately and with a sufficient number of relevant orders that would trigger an alert to fire, thus providing a sufficient volume of alerted encounters to evaluate. When initiating a potentially inappropriate order, a provider received real-time notification of deviation from a CW recommendation. That provider then had the option to cancel, change, or justify the order, depending on whether he or she agreed with the alert’s recommendation in the context of the individual patient. The objective of this study was to evaluate the relationship between provider adherence to CW alerts and measurable clinical and financial outcomes.

METHODS

Study Setting

We conducted an observational study of provider adherence to the 18 highest-volume CW alerts utilizing a commercially available EHR-embedded CDS system at Cedars-Sinai Health System, a nonprofit, 886-bed tertiary hospital and multispecialty academic health science center located in Los Angeles, California. The medical staff is pluralistic and includes employed physicians, independent physicians in private practice, physician extenders, and residents.

This study included inpatient encounters from October 22, 2013, to July 31, 2016. The study protocol was approved by the Cedars-Sinai Medical Center (CSMC) Institutional Review Board.

Study Population and Data Sources

Data for the study were collected from 3 sources: data sent from the EHR to the CDS analytics platform, which included the category of the provider triggering the alert (eg, resident, attending) and clinical data allowing for the assessment of adherence or nonadherence to the alert during the encounter; claims data, which included patient demographics (eg, age, gender), diagnoses, services provided, admission and discharge dates, Medicare Severity-Diagnosis Related Group codes, and costs; and direct cost data associated with the patient care department, which we describe below. The unit of analysis is the patient encounter, which covers the entire inpatient visit; there is only 1 encounter per visit. Data were matched using a common encounter identifier. The adherent group comprised all encounters in which providers received CW alerts and adhered to every alert; an alert was considered adhered to when the order flagged by the CDS as potentially conflicting with CW was not signed within 1 hour after the alert was shown to the provider. The nonadherent group comprised encounters in which providers received CW alerts and adhered to none of them (complete nonadherence). Approximately 1400 encounters meeting neither criterion were excluded because providers adhered to some, but not all, of the alerts. The Elixhauser index was computed as an unweighted sum of comorbidities present during all encounters for a given patient to estimate the morbidity burden.22
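The grouping logic described above can be sketched as follows. This is a minimal illustration of the study's stated definitions, not the actual CDS platform's implementation (which the paper does not describe), and all field names are hypothetical; the study's own analysis was performed in R.

```python
# Sketch of the encounter-grouping rule: an alert counts as "adhered to"
# when the flagged order was NOT signed within 1 hour after the alert was
# shown; encounters with mixed adherence were excluded from the study.
from datetime import datetime, timedelta

def alert_adhered(alert_shown_at, order_signed_at):
    """True if the flagged order was never signed (None) or was signed
    more than 1 hour after the alert was shown."""
    if order_signed_at is None:
        return True
    return order_signed_at - alert_shown_at > timedelta(hours=1)

def classify_encounter(alerts):
    """alerts: list of (alert_shown_at, order_signed_at) pairs for one
    encounter. Returns 'adherent', 'nonadherent', or 'mixed'
    (mixed encounters were dropped from the analysis)."""
    results = [alert_adhered(shown, signed) for shown, signed in alerts]
    if all(results):
        return "adherent"
    if not any(results):
        return "nonadherent"
    return "mixed"

# Example: one alert ignored (order signed in 30 min) -> nonadherent.
t0 = datetime(2015, 1, 1, 9, 0)
print(classify_encounter([(t0, t0 + timedelta(minutes=30))]))  # nonadherent
```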

Alert Selection and Development

In 2013, we integrated CW recommendations as CDS alerts into the Epic EHR at CSMC. A clinical informatics team enabled the translation of the CW recommendations through a standardized process. First, clinicians reviewed the primary sources cited in each recommendation to define inclusion and exclusion criteria for the CDS rule. Once defined, the clinical logic was deployed in the EHR using standardly available alert tools. Finally, the team reviewed patient charts from encounters in which alerts were triggered and identified opportunities to refine the logic and reduce false positives.

To define the alerts for inclusion in the study, we initially reviewed all inpatient CW alerts that were active in the CSMC EHR at any point during the study period. For this analysis, we eliminated low-volume alerts that fired, on average, less than once per month. In general, an alert is adhered to when a provider who is advised against a particular action complies with that advice. Because our adherence criteria rely on EHR data to determine whether a particular order was signed within an hour of the alert being shown, we cannot accurately categorize adherence to alerts that make recommendations about the appropriateness of individual orders within a series of identical orders (ie, repeat or standing laboratory testing) or that do not flag a particular order as inappropriate and instead serve as reminders unrelated to avoiding unnecessary care (eg, “Don’t delay palliative care for patients with advanced gynecological cancer.”). All remaining alerts were included in the data set (eAppendix Table 1 [eAppendix available at ajmc.com]).

Outcomes

We assessed the association between alert adherence and 4 outcome measures: encounter length of stay, 30-day readmissions, complications of care, and total direct costs. We defined 30-day readmission as an unplanned, unavoidable inpatient readmission to the same facility for any cause occurring within 30 days of discharge. Complications of care are defined using the Agency for Healthcare Research and Quality (AHRQ) Healthcare Cost and Utilization Project (HCUP) classification system for complication codes.23 Total direct costs are defined as expenses directly associated with patient care, such as labor (wages, salaries, agency, employee benefits), supplies (medical, implant, nonmedical), professional fees, contracted services, equipment, and equipment depreciation.24 We selected these 4 outcome measures due to their relevance to patients, health systems, and payers. As the industry shifts from fee-for-service to value-based contracts, cost containment and quality have become critical priorities for healthcare providers. Length of stay, readmission rates, and complication rates also merit evaluation, given their potential impact on patient outcomes and hospital value-based payment programs.25,26 Given that many low-value tests and procedures can trigger a chain of additional tests and procedures, we also theorized that reducing inappropriate and low-value services may lead to shorter lengths of stay, lower 30-day readmission rates, and lower complication rates.

Statistical Analysis

The adherent and nonadherent encounter groups were compared based on demographic characteristics, number of diagnoses, and case severity. The χ2 test was used for categorical variables and the Wilcoxon rank sum test was used for continuous variables.

We estimated the association between alert adherence and the outcome measures using 4 generalized linear models. Alert adherence was measured as a dichotomous predictor. We adjusted for potential confounders, such as illness severity and case complexity, using demographic and clinical variables, such as gender, age, All Patients Refined-Diagnosis Related Groups (APR-DRG) severity level, number of diagnoses, expected length of stay, Elixhauser comorbidity index, Medicare status, and case mix index. A subset of all independent variables was used in each regression model to maximize the quality of fit of the model. Variable selection was performed using a backward stepwise method while minimizing the Akaike information criterion. In addition to alert adherence, variables were included in all models to adjust for the differences between the characteristics of the 2 groups. The continuous covariates generally had skewed distributions and were transformed prior to inclusion in the models. All statistical analyses were performed using R version 3.3.127 and the following packages: glm2,28 caret,29 and sqldf.30

Multivariate logistic regression was used to estimate the odds of patient outcomes that are dichotomous (ie, 30-day readmissions and complications of care). The 2 continuous outcomes, length of stay and total cost, were estimated using multivariate linear regression models with the dependent variable log-transformed to correct for significant right skew in the distribution of each outcome. The outcome variables also appeared as independent variables in other models. Statistical tests were 2-sided, with P <.05 considered statistically significant. More detailed discussion of the regression models is included in the eAppendix.
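Because the cost and length-of-stay models are log-linear, a coefficient β on the nonadherence indicator corresponds to a multiplicative effect exp(β), ie, a percent change of exp(β) − 1; in the logistic models, exponentiating a coefficient yields the odds ratio. A short sketch of that arithmetic, using the coefficients this study reports in its Results (the analysis itself was done in R; this is illustrative only):

```python
import math

def pct_change_from_log_coef(beta):
    """In a log-linear model, a dummy-variable coefficient beta implies a
    multiplicative effect exp(beta) on the outcome, i.e., a
    (exp(beta) - 1) x 100 percent change."""
    return (math.exp(beta) - 1) * 100

def odds_ratio(beta):
    """In a logistic model, exp(beta) is the odds ratio for the predictor."""
    return math.exp(beta)

# Coefficients on CW alert nonadherence reported in Tables 2 and 3:
cost_increase = pct_change_from_log_coef(0.07)  # ~7.3% higher total direct cost
los_increase = pct_change_from_log_coef(0.06)   # ~6.2% longer length of stay
print(round(cost_increase, 1), round(los_increase, 1))  # 7.3 6.2
```

This is why the abstract's 7.3% cost and 6.2% length-of-stay figures correspond to the 0.07 and 0.06 coefficients in the regression tables.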

RESULTS

A total of 26,424 encounters were included in the analysis out of a total of approximately 100,000 encounters. In 1591 (6%) of these encounters, providers adhered to all alerts (an “adherent encounter”); in the remaining 24,833 (94%) encounters, no alerts were adhered to (a “nonadherent encounter”) (Table 1). Patients in the adherent and nonadherent encounter groups were similar with respect to age (P = .32) and total number of diagnoses (P = .26). Additionally, both encounter groups were comparable with respect to the proportion of patients whose primary payer was Medicare (P = .94). There were significant differences in APR-DRG severity levels (P = .01), with sicker patients in the nonadherent group (a greater proportion of nonadherent patients classified at level 4, extreme). Additionally, there were differences with respect to Elixhauser index scores (P = .04), case mix index values (P = .02), gender (P = .05), and expected length of stay (P <.001) (Table 1).

Table 1.

Characteristics of Patients Discharged From Inpatient Visit

Characteristic    Alert Adherence (n = 1591)a    Alert Nonadherence (n = 24,833)    Pb
Women, n (%) 880 (55.3%) 13,112 (52.8%)  .05
Age, years, mean (SD) 65.9 (18.7) 65.7 (18.3)  .32
APR-DRG severity level  .01
 Level 1 (minor), n (%) 254 (15.96%) 3648 (14.69%)
 Level 2 (major), n (%) 510 (32.06%) 7574 (30.50%)
 Level 3 (severe), n (%) 562 (35.32%) 8945 (36.02%)
 Level 4 (extreme), n (%) 280 (17.60%) 4848 (19.52%)
Number of diagnoses, median (IQR) 15 (7.0) 15 (6.0)  .26
Expected length of stay, days, median (IQR) 3.9 (2.2) 4.1 (2.8) <.001
Elixhauser index, median (IQR) 2.6 (2.7) 2.7 (2.7)  .04
Case mix index, median (IQR) 1.6 (1.4) 1.7 (1.8)  .02
Medicare status 63.0% 63.0%  .94
30-day readmissions rate 17.8% 20.0%  .02
Complications rate 6.7% 10.0% <.001
Length of stay, days, median (IQR) 4.0 (5.0) 5.0 (6.0) <.001

APR-DRG indicates All Patients Refined Diagnosis Related Groups; IQR, interquartile range.

a Number of encounters.

b P values are from bivariate analyses; the χ2 test was used for categorical variables and the Wilcoxon rank sum test for continuous variables.

With respect to outcomes, bivariate analyses indicated that patient encounters in the group in which providers did not adhere to CW recommendations had longer unadjusted actual lengths of stay (P <.001), higher complication rates (P <.001), higher 30-day readmission rates (P = .02), and higher direct costs (P <.001).

Overall, adherent encounters had significantly lower total costs, shorter lengths of stay, and lower odds of complications compared with nonadherent encounters. The adjusted odds of 30-day readmission were also lower for adherent encounters, but this difference did not reach statistical significance. After adjusting for patient characteristics, nonadherent encounters showed a 7.3% (95% CI, 3.5%–11%; P <.001) increase in total direct costs versus adherent encounters, an increase of $944 per nonadherent encounter (Table 2).

Table 2.

Association Between CW Alert Adherence and Log Total Costs (in $1000s)a

Coefficient (95% CI)    P
CW alert nonadherence 0.07 (0.04–0.11) <.001
APR-DRG severity level
 Level 1 (minor) Reference
 Level 2 (major) −0.01 (−0.04 to 0.01)  .31
 Level 3 (severe) 0.20 (0.17–0.23) <.001
 Level 4 (extreme) 0.94 (0.91–0.97) <.001
Log length of stay / expected length of stay 0.54 (0.52–0.55) <.001
Male 0.16 (0.14–0.17) <.001
Complications, excluding POA 0.89 (0.86–0.92) <.001

APR-DRG indicates All Patients Refined Diagnosis Related Groups; CW, Choosing Wisely; POA, present on admission.

a The adjusted total costs based on alert adherence were estimated using a log-linear model fit with ordinary least squares regression, with the dependent variable log-transformed to correct for significant right skew in its distribution; the error distribution is assumed to be Gaussian. Adjusted R2 = 0.47. One variable was log-transformed: the interaction between length of stay and expected length of stay for the given APR-DRG.

We found a 6.2% (95% CI, 3.0%–9.4%; P <.001) increase in length of stay for nonadherent versus adherent encounters (Table 3). We found that the odds of a patient having a readmission within 30 days were 1.14 (95% CI, 0.998–1.31; P = .0503) times higher in nonadherent encounters (Table 4). The odds of a patient having complications were 1.29 (95% CI, 1.04–1.61; P = .02) times higher in nonadherent encounters (Table 5).

Table 3.

Association Between CW Alert Adherence and Log Length of Staya

Coefficients (95% CI) P
CW alert nonadherence 0.06 (0.03–0.09)  .02
APR-DRG severity level
 Level 1 (minor) Reference
 Level 2 (major) 0.07 (0.04–0.09) <.001
 Level 3 (severe) 0.25 (0.22–0.28) <.001
 Level 4 (extreme) 0.57 (0.54–0.61) <.001
Log number of diagnoses 0.13 (0.10–0.16) <.001
Log expected length of stay based on DRG 0.46 (0.39–0.54) <.001
Complications, excluding POA 0.22 (0.20–0.25) <.001
30-day readmission 0.09 (0.07–0.11) <.001
Interaction: number of diagnoses and log expected length of stay based on DRG 0.05 (0.03–0.07) <.001

APR-DRG indicates All Patients Refined Diagnosis Related Groups; CW, Choosing Wisely; DRG, Diagnosis Related Group; POA, present on admission.

a The adjusted length of stay based on alert adherence was estimated using a log-linear model fit with ordinary least squares regression, with the dependent variable log-transformed to correct for significant right skew in its distribution; the error distribution is assumed to be Gaussian. Adjusted R2 = 0.50. Three variables were log-transformed: number of diagnoses, expected length of stay based on the given APR-DRG, and length of stay.

Table 4.

Association Between CW Alert Adherence and 30-Day Readmissiona

Multivariate OR (95% CI)    P
CW alert nonadherence 1.14 (0.998–1.31) .0503
APR-DRG severity level
 Level 1 (minor) 1 (reference)
 Level 2 (major) 1.85 (1.61–2.12) <.001
 Level 3 (severe) 3.12 (2.71–3.59) <.001
 Level 4 (extreme) 2.71 (2.32–3.17) <.001
Log number of diagnoses 1.19 (1.12–1.26) <.001
Medicare 1.14 (1.06–1.22) .05
Interaction: length of stay and log number of diagnoses 1.23 (1.18–1.29) <.001

APR-DRG indicates All Patients Refined Diagnosis Related Groups; CW, Choosing Wisely; OR, odds ratio.

a Logistic regression was used to estimate the probability of a readmission within 30 days. The readmissions model had a C statistic of 0.65, and the Hosmer-Lemeshow goodness-of-fit test using 6 groups was not significant (P = .81), indicating no evidence of poor fit. The link function was the logit; the error distribution is assumed to be Bernoulli. One variable was log-transformed: number of diagnoses.

Table 5.

Association Between CW Alert Adherence and Presence of At Least 1 Complication Not Present on Admissiona

Multivariate OR (95% CI)    P
CW alert nonadherence 1.29 (1.04–1.61)     .021
APR-DRG severity level
 Level 1 (minor) 1 (reference)
 Level 2 (major) 1.28 (1.06–1.54)  .01
 Level 3 (severe) 1.00 (0.83–1.21)  .99
 Level 4 (extreme) 0.79 (0.64–0.97)  .03
Log number of diagnoses 1.24 (1.14–1.35) <.001
Log total costs 1.08 (1.03–1.14)     .003
Log length of stay 1.74 (1.62–1.86) <.001
Case mix index 2.88 (2.76–3.01) <.001

APR-DRG indicates All Patients Refined Diagnosis Related Groups; CW, Choosing Wisely; OR, odds ratio.

a Logistic regression was used to estimate the probability of having at least 1 complication not present on admission. The complications model had a C statistic of 0.64, and the Hosmer-Lemeshow goodness-of-fit test using 7 groups was not significant (P = .30). The link function was the logit; the error distribution is assumed to be Bernoulli. Three variables were log-transformed: number of diagnoses, total costs, and length of stay.

DISCUSSION

To our knowledge, this is the first study to evaluate the association between adherence to multiple CW guidelines delivered via CDS and changes in clinical and financial outcomes. Previous studies have established that effective CDS can influence provider behavior and contribute to improved patient outcomes for specific CDS interventions.31 This study contributes to the established body of research indicating that adherence to effective CDS alerts is associated with improved outcomes, such as length of stay,32,33 complication rates,34,35 and overall cost.36,37 Our analysis provides new evidence of the effect that a more comprehensive collection of alerts has on high-level patient and financial outcomes, including shorter length of stay (6.2%), lower odds of complications (odds ratio for nonadherence, 1.29), and a 7.3% lower cost per adherent patient episode.

Our results suggest that the difference in cost savings is both statistically and clinically significant. Adherent encounters resulted in approximately $944 in savings from the median encounter cost of $12,940. A previous study examined the prevalence of 28 low-value services in a large population of commercially insured adults and identified an average potential cost savings of approximately $300 for each patient who received a single low-value service.10 Our findings surpass this estimate and imply significant cost-savings opportunities through improved and broader utilization of CDS.

Our results also confirm the association between alert adherence and odds of complications as defined by AHRQ’s HCUP.18 The majority of studies do not specifically analyze the effect of CDS interventions on complication rates for patients; rather, they identify undesired outcomes, such as adverse drug events26 and mortality rates.25 Although it is plausible that the reduction in the utilization of low-value services resulting in lower inpatient lengths of stay may lead to reduced complication rates, we did not evaluate potential causation between specific complications and avoided interventions. Additional research to confirm these findings and, more specifically, to delineate the causal pathway is indicated.

Previous studies have reported associations between CDS implementation and improvements in patient length of stay.23,24 However, to our knowledge, no analyses have established a correlation between CDS content targeting unnecessary care and improved lengths of stay. Our findings demonstrate an association between adherence to guideline-based alerts targeting the reduction of unnecessary care and shortened inpatient length of stay.

Limitations

One limitation is our strict definition of “alert compliance”: to be included in the adherent encounter group, providers had to adhere to all of the CW CDS-related alerts. Patient episodes in which clinicians followed some but not all of the CW alerts that fired were considered “mixed-adherence” episodes and were excluded from analysis. This strict inclusion criterion limits our understanding of the clinical and financial impact experienced by patients with partially adherent episodes. Similarly, we were unable to differentiate the impact of specific alerts on our studied outcomes; although there appear to be some differences between individual alerts, our study did not have enough power to make inferences due to small per-alert sample sizes. Another key limitation of this study is the lack of control for provider effects. The analysis did not include provider characteristics and thus could not examine confounding at the provider level; it is possible that some providers are more likely to trigger alerts or to be nonadherent to alerts, even though we found no overall correlation between provider acceptance rate and provider outcomes. Additionally, providers who are more likely to adhere to evidence-based guidelines, including CW, may be more likely to subscribe to other system-based approaches and practices consistent with value-based patient care. We need to better understand the differences in characteristics and practice patterns of providers who adhere to CW recommendations compared with those who do not.

Future analyses should examine the role of specific physician and alert characteristics in adherence to CDS and their effects on outcomes. The Elixhauser index was computed from all relevant diagnoses recorded across all encounters for a given patient within our data set. It is possible, however, that a patient received additional relevant diagnoses outside the time frame or hospital system applicable to this study.
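The unweighted Elixhauser computation described here (and in the Methods) amounts to pooling comorbidity flags across all of a patient's in-dataset encounters, which is also why out-of-window diagnoses are invisible to it. A minimal sketch, with an illustrative subset of the 30-plus Elixhauser comorbidity categories (the study's actual computation used the published coding algorithms in R):

```python
def elixhauser_unweighted(encounter_comorbidities):
    """encounter_comorbidities: one set per encounter for a patient, each
    containing the Elixhauser comorbidity categories flagged on that
    encounter. Returns the unweighted index: the count of distinct
    comorbidities present on any encounter. Diagnoses made outside the
    study window or health system never appear in the input, which is the
    undercounting limitation noted in the text."""
    if not encounter_comorbidities:
        return 0
    pooled = set().union(*encounter_comorbidities)
    return len(pooled)

# Illustrative patient with 2 encounters; category names are a small
# hypothetical subset of the Elixhauser taxonomy.
patient_encounters = [
    {"CHF", "diabetes_uncomplicated"},
    {"diabetes_uncomplicated", "renal_failure"},
]
print(elixhauser_unweighted(patient_encounters))  # 3 distinct comorbidities
```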

Although our regression models adjusted for severity of illness, it is possible that the model did not control for all differences in patient severity or characteristics. Moreover, this study did not seek to establish causation between CW adherence and improved patient and financial outcomes. Many factors determine whether a single alert is adhered to or ignored, including alert fatigue,38 provider familiarity with the guideline presented,39 fear of malpractice,13 or need to reassure one’s patient through further diagnostic tests.16 We were unable to capture some relevant data, including professional billing fees and cost and readmissions data from other facilities, limiting our outcomes analyses. Finally, our demonstrated correlations between adherence and outcomes cannot necessarily be generalized to all CDS interventions, as the alerts evaluated in this study were implemented in the inpatient setting, were deemed the most technically feasible to deploy accurately, and had sufficient volume to evaluate.

Conclusions

We recommend that health systems consider real-time CDS interventions as a method to encourage improved adoption of CW and other evidence-based guidelines. A meta-analysis of CDS systems concluded that by providing context-specific information at the point of care, the odds of providers adopting guideline recommendations are 112 times higher.19 CDS enables the provision of context-specific information at the point of care and could help to overcome several known barriers to CW guideline adoption.

Our findings contribute to the evidence base surrounding the use of CDS and improvements in patient clinical and financial outcomes. Formal prospective cohort studies and randomized CDS intervention trials, perhaps randomizing providers assigned to receive CDS interventions, should be prioritized to help guide future provider strategies in regard to reducing low-value care.

Supplementary Material


Takeaway Points.

  • Encounters in which providers adhered to all alerts had significantly lower total costs, shorter lengths of stay, a lower probability of 30-day readmissions, and a lower probability of complications compared with nonadherent encounters.

  • Full adherence to Choosing Wisely alerts was associated with savings of $944 from a median encounter cost of $12,940.

  • Health systems should consider real-time clinical decision support interventions as a method to encourage improved adoption of evidence-based guidelines.

Acknowledgments

The authors gratefully acknowledge the administrative and material support of Georgia Hoyler and Marin Lopci.

Source of Funding: Dr. Keller was supported by NIH/National Center for Advancing Translational Science (NCATS) UCLA CTSI Grant Number TL1TR000121. The other authors do not report any funding for this work.

REFERENCES

1. Kayyali B, Knott D, Van Kuiken S. The big-data revolution in US health care: accelerating value and innovation. McKinsey & Company website. mckinsey.com/industries/healthcare-systems-and-services/our-insights/the-big-data-revolution-in-us-health-care. Published April 2013. Accessed December 2, 2016.
2. Hillestad R, Bigelow J, Bower A, et al. Can electronic medical record systems transform health care? potential health benefits, savings, and costs. Health Aff (Millwood). 2005;24(5):1103–1117. doi:10.1377/hlthaff.24.5.1103.
3. Tierney WM, Miller ME, Overhage JM, McDonald CJ. Physician inpatient order writing on microcomputer workstations: effects on resource utilization. JAMA. 1993;269(3):379–383. doi:10.1001/jama.1993.03500030077036.
4. Wang SJ, Middleton B, Prosser LA, et al. A cost-benefit analysis of electronic medical records in primary care. Am J Med. 2003;114(5):397–403. doi:10.1016/S0002-9343(03)00057-3.
5. Henry J, Pylypchuk Y, Searcy T, Patel V. Adoption of electronic health record systems among U.S. non-federal acute care hospitals: 2008–2015 [ONC data brief no. 35]. Office of the National Coordinator for Health Information Technology website. dashboard.healthit.gov/evaluations/data-briefs/non-federal-acute-care-hospital-ehr-adoption-2008-2015.php. Published May 2016. Accessed October 27, 2016.
6. Adler-Milstein J, Everson J, Lee S-YD. EHR adoption and hospital performance: time-related effects. Health Serv Res. 2015;50(6):1751–1771. doi:10.1111/1475-6773.12406.
7. Kern LM, Barrón Y, Dhopeshwarkar RV, Edwards A, Kaushal R; HITEC Investigators. Electronic health records and ambulatory quality of care. J Gen Intern Med. 2013;28(4):496–503. doi:10.1007/s11606-012-2237-8.
8. Campanella P, Lovato E, Marone C, et al. The impact of electronic health records on healthcare quality: a systematic review and meta-analysis. Eur J Public Health. 2016;26(1):60–64. doi:10.1093/eurpub/ckv122.
9. Lammers EJ, McLaughlin CG, Barna M. Physician EHR adoption and potentially preventable hospital admissions among Medicare beneficiaries: panel data evidence, 2010–2013. Health Serv Res. 2016;51(6):2056–2075. doi:10.1111/1475-6773.12586.
10. Patterson ME, Marken P, Zhong Y, Simon SD, Ketcherside W. Comprehensive electronic medical record implementation levels not associated with 30-day all-cause readmissions within Medicare beneficiaries with heart failure. Appl Clin Inform. 2014;5(3):670–684. doi:10.4338/aci-2014-01-ra-0008.
11. Wolfson D, Santa J, Slass L. Engaging physicians and consumers in conversations about treatment overuse and waste: a short history of the Choosing Wisely campaign. Acad Med. 2014;89(7):990–995. doi:10.1097/ACM.0000000000000270.
12. Berwick DM, Hackbarth AD. Eliminating waste in US health care. JAMA. 2012;307(14):1513–1516. doi:10.1001/jama.2012.362.
13. Rosenberg A, Agiro A, Gottlieb M, et al. Early trends among seven recommendations from the Choosing Wisely campaign. JAMA Intern Med. 2015;175(12):1913–1920. doi:10.1001/jamainternmed.2015.5441.
14. Reid RO, Rabideau B, Sood N. Low-value health care services in a commercially insured population. JAMA Intern Med. 2016;176(10):1567–1571. doi:10.1001/jamainternmed.2016.5031.
15. Zikmund-Fisher BJ, Kullgren JT, Fagerlin A, Klamerus ML, Bernstein SJ, Kerr EA. Perceived barriers to implementing individual Choosing Wisely recommendations in two national surveys of primary care providers. J Gen Intern Med. 2017;32(2):210–217. doi:10.1007/s11606-016-3853-5.
16. Krouss M, Croft L, Morgan DJ. Physician understanding and ability to communicate harms and benefits of common medical treatments. JAMA Intern Med. 2016;176(10):1565–1567. doi:10.1001/jamainternmed.2016.5027.
  • 17.Colla CH, Kinsella EA, Morden NE, Meyers DJ, Rosenthal MB, Sequist TD. Physician perceptions of Choosing Wisely and drivers of overuse. Am J Manag Care. 2016;22(5):337–343. [PubMed] [Google Scholar]
  • 18.Clinical decision support: more than just ‘alerts’ tipsheet. CMS website. cms.gov/regulations-and-guidance/legislation/EHRincentiveprograms/downloads/clinicaldecisionsupport_tipsheet-.pdf. Published 2014. Accessed October 25, 2016.
  • 19.Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765. doi: 10.1136/bmj.38398.500764.8F. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Goldzweig CL, Orshansky G, Paige NM, et al. Electronic health record-based interventions for improving appropriate diagnostic imaging: a systematic review and meta-analysis. Ann Intern Med. 2015;162(8):557–565. doi: 10.7326/m14-2600. [DOI] [PubMed] [Google Scholar]
  • 21.Raja AS, Gupta A, Ip IK, Mills AM, Khorasani R. The use of decision support to measure adherence to a national imaging quality measure. Acad Radiol. 2014;21(3):378–383. doi: 10.1016/j.acra.2013.10.017. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22.Elixhauser A, Steiner C, Harris DR, Coffey RM. Comorbidity measures for use with administrative data. Med Care. 1998;36(1):8–27. [DOI] [PubMed] [Google Scholar]
  • 23.Clinical Classifications Software (CCS) for ICD-9-CM. Healthcare Cost and Utilization Project website. hcup-us.ahrq.gov/toolssoftware/ccs/ccs.jsp Published October 2016. Accessed December 2, 2016. [Google Scholar]
  • 24.Lawthers AG, McCarthy EP, Davis RB, Peterson LE, Palmer RH, Iezzoni LI. Identification of in-hospital complications from claims data: is it valid? Med Care. 2000;38(8):785–795. [DOI] [PubMed] [Google Scholar]
  • 25.Readmissions Reduction Program (HRRP). CMS website. cms.gov/medicare/medicare-fee-for-service-payment/acuteinpatientpps/readmissions-reduction-program.html. Published April 2016. Accessed December 2, 2016.
  • 26.Hospital-acquired conditions. CMS website. cms.gov/medicare/medicare-fee-for-service-payment/hospitalacqcond/hospital-acquired_conditions.html. Published August 2015. Accessed December 2, 2016.
  • 27.The R project for statistical computing. R project website. r-project.org; Published 2016. Accessed August 16, 2016. [Google Scholar]
  • 28.glm2: fitting generalized linear models. R project website. CRAN.R-project.org/package=glm2; Published 2014. Accessed August 16, 2016. [Google Scholar]
  • 29.caret: classification and regression training. R project website. CRAN.R-project.org/package=caret; Published 2016. Accessed August 16, 2016. [Google Scholar]
  • 30.sqldf: manipulate R data frames using SQL. R project website. CRAN.R-project.org/package=sqldf; Published 2014. Accessed August 16, 2016. [Google Scholar]
  • 31.Schedlbauer A, Prasad V, Mulvaney C, et al. What evidence supports the use of computerized alerts and prompts to improve clinicians’ prescribing behavior? J Am Med Inform Assoc. 2009;16(4):531–538. doi: 10.1197/jamia.M2910. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Dimagno MJ, Wamsteker EJ, Rizk RS, et al. A combined paging alert and web-based instrument alters clinician behavior and shortens hospital length of stay in acute pancreatitis. Am J Gastroenterol. 2014;109(3):306–315. doi: 10.1038/ajg.2013.282. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Vicente V, Svensson L, Wireklint Sundström B, Sjöstrand F, Castren M. Randomized controlled trial of a prehospital decision system by emergency medical services to ensure optimal treatment for older adults in Sweden. J Am Geriatr Soc. 2014;62(7):1281–1287. doi: 10.1111/jgs.12888. [DOI] [PubMed] [Google Scholar]
  • 34.Panella M, Marchisio S, Di Stanislao F. Reducing clinical variations with clinical pathways: do pathways work? Int J Qual Health Care. 2003;15(6):509–521. doi: 10.1093/intqhc/mzg057. [DOI] [PubMed] [Google Scholar]
  • 35.Wolfstadt JI, Gurwitz JH, Field TS, et al. The effect of computerized physician order entry with clinical decision support on the rates of adverse drug events: a systematic review. J Gen Intern Med. 2008;23(4):451–458. doi: 10.1007/s11606-008-0504-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Lin Y-C, Chang C-S, Yeh C-J, Wu Y-C. The appropriateness and physician compliance of platelet usage by a computerized transfusion decision support system in a medical center. Transfusion. 2010;50(12):2565–2570. doi: 10.1111/j.1537-2995.2010.02757.x. [DOI] [PubMed] [Google Scholar]
  • 37.Bayati M, Braverman M, Gillam M, et al. Data-driven decisions for reducing readmissions for heart failure: general methodology and case study. PLoS One. 2014;9(10):e109264. doi: 10.1371/journal.pone.0109264. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of drug safety alerts in computerized physician order entry. J Am Med Inform Assoc. 2006;13(2):138–147. doi: 10.1197/jamia.M1809. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Halm EA, Atlas SJ, Borowsky LH, et al. Understanding physician adherence with a pneumonia practice guideline: effects of patient, system, and physician factors. Arch Intern Med. 2000;160(1):98–104. doi: 10.1001/archinte.160.1.98. [DOI] [PubMed] [Google Scholar]
