Abstract
Objective
To examine the impact of electronic health record (EHR) deployment on Surgical Care Improvement Project (SCIP) measures in a tertiary-care teaching hospital.
Data Sources
SCIP Core Measure dataset from the CMS Hospital Inpatient Quality Reporting Program (March 2010 to February 2012).
Study Design
One-group pre- and post-EHR logistic regression and difference-in-differences analyses.
Principal Findings
Statistically significant short-term declines in scores were observed for the composite measure, postoperative urinary catheter removal, and post–cardiac surgery glucose control. A statistically insignificant improvement in scores for these measures was noted 3 months after EHR deployment.
Conclusion
The transition to an EHR appears to be associated with a short-term decline in quality. Implementation strategies should be developed to preempt or minimize this initial decline.
Keywords: Quality of care/patient safety (measurement), observational data/quasi-experiments, surgery
Several landmark reports have underscored the importance of health information technology (HIT) in promoting the quality of care (IOM 2001, 2012). The Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009 provides resources to implement many of the reports' recommendations, including the adoption and “meaningful use” of electronic health records (EHRs)¹ (Blumenthal 2011). Ease of access to clinical information, increased adherence to guidelines, improved caregiver communication, and clinical decision support are some of the many benefits that have been associated with EHR use (Chaudhry et al. 2006; Goldzweig et al. 2009; Buntin et al. 2011). However, deploying an EHR and transitioning from an existing paper-based or fragmented electronic system to an integrated EHR may be complicated by unexpected events that could adversely affect care.
To study the impact of EHR deployment (“go-live” stage) and use on the quality of care, we chose the Surgical Care Improvement Project (SCIP) measures (see Table 1) of the Centers for Medicare and Medicaid Services (CMS) (CMS 2012) as the metrics of interest. These 10 evidence-based process measures and their composite measure aim to increase adherence to processes that reduce postoperative complications, namely, surgical site infections, urinary tract infections, cardiovascular events, and venous thromboembolism (Bratzler and Hunt 2006).
Table 1.
Surgical Care Improvement Project (SCIP) Measures and Potential Complications Prevented (CMS 2012)
| SCIP ID | Measure | Complication Prevented |
|---|---|---|
| SCIP-Inf-1 | Prophylactic antibiotic received within 1 hour prior to surgical incision | Surgical site infection |
| SCIP-Inf-2 | Prophylactic antibiotic selection for surgical patients | Surgical site infection |
| SCIP-Inf-3 | Prophylactic antibiotics discontinued within 24 hours after surgery end time | Surgical site infection |
| SCIP-Inf-4 | Cardiac surgery patients with controlled 6 a.m. postoperative blood glucose | Surgical site infection |
| SCIP-Inf-6 | Surgery patients with appropriate hair removal | Surgical site infection |
| SCIP-Inf-9 | Urinary catheter removed on postoperative day 1 or 2 | Urinary tract infection |
| SCIP-Inf-10 | Surgery patients with perioperative temperature management | Surgical site infection |
| SCIP-Card-2 | Surgery patients on beta-blocker therapy prior to arrival who received a beta-blocker during the perioperative period | Cardiac event |
| SCIP-Vte-1 | Surgical patients with recommended venous thromboembolism prophylaxis ordered any time from hospital arrival to 24 hours after anesthesia end time | Venous thromboembolism |
| SCIP-Vte-2 | Surgery patients who received appropriate venous thromboembolism prophylaxis within 24 hours prior to anesthesia start time to 24 hours after anesthesia end time | Venous thromboembolism |
Literature Review
When we began working on this project, we were unable to identify studies that examined the impact of EHR implementation resulting from the HITECH Act incentives on process or outcome measures. One study that examined the impact of EHR use on the composite SCIP measure (Appari, Johnson, and Anthony 2013) found a decline in the composite score when hospitals transitioned to comprehensive EHRs. Studies that examined individual process measures relevant to SCIP have shown mixed results. Clinical decision support functionalities have increased compliance with the timely administration of prophylactic antibiotics (Wax et al. 2007; Nair et al. 2010; Schwann et al. 2011) and have favorably influenced the prevention of venous thromboembolism (Durieux et al. 2000; Kucher et al. 2005; Lecumberri et al. 2008; Galanter et al. 2010; Beeler, Kucher, and Blaser 2011; Haut et al. 2012). EHRs have been associated with increased adherence to recommended diabetes management guidelines (O'Connor et al. 2005; Sequist et al. 2005; Cebul et al. 2011) but variable control of blood glucose and other diabetes-related metabolic parameters (O'Connor et al. 2005; Crosson et al. 2007; Lee et al. 2008; Guerra et al. 2010; Cebul et al. 2011).
Alongside these positive findings, EHRs can also adversely impact the quality of care (Ash, Berg, and Coiera 2004; Campbell et al. 2006; Ash et al. 2007; Sittig and Singh 2010). Research has demonstrated an unfavorable association between the use of Computerized Physician Order Entry and clinical outcomes such as mortality (Han et al. 2005) and medication errors (Koppel et al. 2005; Nebeker et al. 2005).
Although EHR deployment may not adversely impact medication processes and outcomes such as length of stay, costs, and mortality (Mekhjian et al. 2002; Del Beccaro et al. 2006), the introduction of new technology has the potential to change the existing social system (Harrison, Koppel, and Bar-Lev 2007). Thus, the disruption in the workflows associated with EHR deployment can provide an opportunity for errors to occur (Aarts, Doorewaard, and Berg 2004; Ludwick and Doucette 2009; Agarwal et al. 2010).
Study Objective
Our study objective was to examine the impact of EHR deployment on SCIP measure compliance at a tertiary-care teaching hospital. We hypothesized that EHR deployment would be associated with a short-term unintended decline (worsening) in the SCIP score, followed by an increased probability of achieving a higher (better) SCIP score.
Methods
Setting
The main setting for the study was Strong Memorial Hospital (SMH), a 792-bed tertiary-care teaching hospital located in Rochester, New York. The hospital deployed an ONC-ATCB-certified EHR² (ONC HIT 2012) across most of its inpatient areas on March 5, 2011.
Highland Hospital (HH), a 261-bed teaching hospital located 1.3 miles from SMH, was selected as the comparison hospital. HH was chosen because of its similarities to SMH, especially in ownership and management, medical culture, and organization of care delivery; shared medical staff physicians and residents in several specialties; a common geographic service area; comparable SCIP scores prior to EHR deployment at SMH; and collaborative efforts in addressing SCIP measures. EHRs were deployed at HH on June 11, 2011.
Study Patients
The dataset included patients who were admitted for surgery to SMH and HH and whose discharge took place between March 1, 2010, and February 29, 2012. The patients and their surgeries met CMS's Initial Patient Population Criteria (CMS 2012). The unit of analysis was the inpatient episode. For each chart-abstracted episode, CMS algorithms ascertain whether the episode qualifies for inclusion in a particular SCIP measure population and whether appropriate care was rendered, and assign a grade accordingly (see Table 2). We downloaded the pooled cross-sectional datasets from the University HealthSystem Consortium website (UHC 2012).
Table 2.
Grades Assigned Using CMS Algorithms (CMS 2012)
| Grade | Description |
|---|---|
| B | Episode not included in measure population |
| D | Episode included in measure population, but appropriate care not delivered |
| E | Episode included in measure population, and appropriate care delivered |
| X | Missing data |
| Y | Unable to decide |
Study Duration and Design
For the main (short-term) analysis, the pre-EHR (before) phase extended from October 1, 2010, to March 4, 2011, and the post-EHR (after) phase extended from March 5, 2011, to June 10, 2011. Three sensitivity analyses were conducted using different long-term study periods (see Table 3).
Table 3.
Duration of Before- and After-Phases for EHR Implementation at Strong Memorial Hospital (SMH) on March 5, 2011
| Analysis | Start Date for Before-Phase | Months in Before-Phase | End Date for After-Phase | Months in After-Phase | Sample Size |
|---|---|---|---|---|---|
| Main analysis | October 1, 2010 | 5 | June 10, 2011 | 3.5 | 1,816 |
| Sensitivity analysis 1 | March 1, 2010 | 12 | February 29, 2012 | 12 | 5,251 |
| Sensitivity analysis 2 | March 1, 2010 | 12 | June 10, 2011 | 3.5 | 3,457 |
| Sensitivity analysis 3 | October 1, 2010 | 5 | February 29, 2012 | 12 | 3,610 |
The study adopted two statistical methods for the main analysis and each of the sensitivity analyses: (1) a one-group pretest–posttest design (prepost) for SMH patients and (2) difference-in-differences (DID) estimation with pre- and post-EHR samples from SMH, using HH as the control group.
Variables
Statistical models were created for each SCIP measure. The dependent variable in each model was dichotomous, coded 1 for episodes that received grade E and 0 for episodes that received grade D (see Table 2). The impact of the EHR was quantified as the change in the relative odds of achieving success on a particular measure with EHR use. The composite measure represented episodes that received appropriate care on all qualifying measures.
For the prepost analysis, the independent variable of interest was a dichotomous EHR variable representing the presence (1) or absence (0) of the EHR. For the DID estimation, the independent variable of interest was the interaction term between the dichotomous hospital variable (1 = SMH and 0 = HH) and the dichotomous phase variable (1 = after-phase and 0 = before-phase).
Patient and treatment characteristics that were available in the SCIP dataset were used as covariates (see Table S1).
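As a minimal sketch, the two model forms implied by this description can be written as follows; the notation is ours rather than the authors' (Y_i = 1 denotes grade E for episode i, and X_i is the vector of patient and treatment covariates). In the reported results the after-phase is further broken out by month (March–June 2011), which would replace the single indicator variables below with month-specific ones.

```latex
% One-group pre- and post-EHR logistic regression (SMH episodes only)
\operatorname{logit} \Pr(Y_i = 1) = \alpha + \beta\,\mathrm{EHR}_i + \gamma' X_i

% Difference-in-differences logistic regression (SMH and HH episodes)
\operatorname{logit} \Pr(Y_i = 1) = \alpha + \beta_1\,\mathrm{SMH}_i + \beta_2\,\mathrm{After}_i
    + \delta\,(\mathrm{SMH}_i \times \mathrm{After}_i) + \gamma' X_i
```

Here exp(β) is the prepost odds ratio and exp(δ) is the DID estimate of interest; values below 1 indicate reduced odds of achieving success after EHR deployment.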
Statistical Analysis
The prepost design employed Chi-square tests, Fisher's exact tests, and logistic regression to analyze the change in the odds of achieving success on a particular measure after EHR deployment at SMH. For each SCIP measure, an unadjusted analysis was followed by multivariable logistic regression adjusting for patient and treatment characteristics.
For the DID estimation, HH was the control group. The absence of differences in SCIP scores between SMH and HH in the before-phase was demonstrated using Chi-square tests, Fisher's exact tests, and logistic regression. A key assumption for DID estimation is that the performance of the two hospitals would have followed a parallel trend in the absence of EHRs (Abadie 2005). We believe that the similarities between SMH and HH, the absence of statistically significant differences in the two hospitals' SCIP scores in the before-phase, and the absence of significant events that could disturb the time trend after deployment support this assumption.
Statistical analyses were conducted using SAS 9.3 (SAS Institute Inc. 2010) and Stata 12 (StataCorp. 2011). The study was approved by the University of Rochester Research Subject Review Board and the HH Administrative Research Review Committee.
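For illustration only, the sketch below shows how the two estimations could be fit in Python with statsmodels; it is not the authors' code (the study used SAS and Stata), and the file name, column names (success, smh, after, ehr), and covariates are hypothetical placeholders.

```python
# Minimal sketch of the prepost and DID logistic regressions described above.
# Assumptions: one row per episode in a measure population; success = 1 (grade E)
# or 0 (grade D); smh = 1 for Strong Memorial, 0 for Highland; after = 1 for the
# post-EHR phase; ehr = 1 when the episode occurred under the EHR at SMH.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("scip_episodes.csv")  # hypothetical episode-level dataset

# (1) One-group pre- and post-EHR logistic regression, SMH episodes only.
prepost = smf.logit(
    "success ~ ehr + age + female + elective",  # placeholder covariates
    data=df[df["smh"] == 1],
).fit(disp=False)

# (2) Difference-in-differences estimation; the smh:after interaction is the
# estimate of interest, with Highland Hospital serving as the control group.
did = smf.logit(
    "success ~ smh + after + smh:after + age + female + elective",
    data=df,
).fit(disp=False)

# Adjusted odds ratios with 95 percent confidence limits.
for label, model in [("prepost", prepost), ("DID", did)]:
    table = np.exp(model.conf_int()).assign(OR=np.exp(model.params))
    print(label, "\n", table.round(2))
```

The parallel-trends assumption discussed above is not testable from this model alone; as the authors note, it has to be argued from the comparability of the two hospitals in the before-phase.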
Results
Study Sample
The average patient age (N = 1,816) was 59 years, 58 percent of the sample was female, and 86 percent was white. Medicare was the primary payor for 48 percent of the sample, and 79 percent of the procedures were elective (see Table S1).
Validation of Comparator
There were no statistically significant differences between the scores of the two hospitals in the before-phase. However, sensitivity analysis 1 demonstrated significant differences in the composite (p = .05) and SCIP-Inf-1 (p = .02) measures (see Figure 1 and Table S2).
Figure 1.
Time Trends for Composite,α SCIP-Inf-4,β SCIP-Inf-9,γ and SCIP-Card-2δ Measures
x-axis: study duration; y-axis: raw percentage score.
αReflective of performance on all the Surgical Care Improvement Project measures.
βCardiac surgery patients with controlled 6 a.m. postoperative blood glucose.
γUrinary catheter removed on postoperative day 1 or 2.
δSurgery patients on beta-blocker therapy prior to arrival who received a beta-blocker during the perioperative period.
p-values are from Chi-square or Fisher's exact tests for the difference in SCIP scores at Strong Memorial Hospital and Highland Hospital prior to EHR deployment on March 5, 2011.
Change in Raw SCIP Scores
At SMH, the composite measure and 6 of the 10 individual measures had a reduction (worsening) of varying magnitude in the SCIP score after EHR deployment. Three individual measures had an increase in the SCIP score (see Table S2).
Because of high baseline scores for several measures (97 percent or greater), a likely ceiling effect would limit our ability to detect statistically meaningful associations. We therefore limited the scope of subsequent analyses to measures with scores of 97 percent or less during at least one study phase, namely the composite measure, SCIP-Inf-4, SCIP-Inf-9, and SCIP-Card-2.
Main Analysis: Pre–Post and DID Estimation
For the composite measure, the prepost estimation showed a 51 percent decline in the adjusted odds ratio (AOR) of achieving success during the month the EHR was deployed at SMH (March 2011) relative to the before-phase (AOR = 0.49, p = .08; 95 percent confidence interval [CI] = 0.22–1.08; raw score absolute decline [RSAD] from 95.4 percent in the before-phase to 88.7 percent in March 2011), and a 55 percent decline in the AOR of achieving success in April 2011 (AOR = 0.45, p = .04; 95 percent CI = 0.22–0.95; RSAD from 95.4 percent in the before-phase to 91.2 percent in April 2011). Using DID estimation, there was a 91 percent decline in the AOR of achieving success in March 2011 (AOR = 0.09, p = .09; 95 percent CI = 0.01–1.48; RSAD from 95.4 percent in the before-phase to 88.7 percent in March 2011), followed by a 3.49 times greater AOR of achieving success in May 2011 (AOR = 3.49, p = .06; 95 percent CI = 0.95–12.85; raw score increase from 95.4 percent in the before-phase to 99.1 percent in May 2011) (see Table 4).
Table 4.
Main Analysis of One-Group Pre- and Post-EHR Logistic Regression and Difference-in-Differences Estimation for Specific Surgical Care Improvement Project (SCIP) Measures†
Columns report odds ratios (OR) with lower and upper confidence limits (LCL, UCL) from the one-group pre- and post-EHR logistic regression (unadjusted and adjusted) and from the difference-in-differences (DID) estimation (unadjusted and adjusted).

| Measure | Prepost OR (Unadj.) | LCL | UCL | Prepost OR (Adj.) | LCL | UCL | DID OR (Unadj.) | LCL | UCL | DID OR (Adj.) | LCL | UCL |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Composite‡ | | | | | | | | | | | | |
| Mar-11 | 0.46* | 0.21 | 1.00 | 0.49* | 0.22 | 1.08 | 0.09 | 0.00 | 1.62 | 0.09* | 0.01 | 1.48 |
| Apr-11 | 0.46** | 0.22 | 0.96 | 0.45** | 0.22 | 0.95 | 0.85 | 0.24 | 2.99 | 0.97 | 0.27 | 3.43 |
| May-11 | 1.01 | 0.40 | 2.59 | 1.10 | 0.44 | 2.72 | 2.76 | 0.74 | 10.27 | 3.49* | 0.95 | 12.85 |
| Jun-11 | 1.33 | 0.24 | 7.20 | 1.11 | 0.22 | 5.45 | 2.28 | 0.20 | 26.11 | 2.93 | 0.27 | 31.43 |
| SCIP-Inf-4§ | | | | | | | | | | | | |
| Mar-11 | 0.30** | 0.10 | 0.91 | 0.45 | 0.13 | 1.56 | Not applicable‡‡ | | | Not applicable‡‡ | | |
| Apr-11 | 0.49 | 0.13 | 1.86 | 0.55 | 0.13 | 2.33 | | | | | | |
| May-11 | 1.51 | 0.25 | 9.17 | 2.01 | 0.32 | 12.49 | | | | | | |
| Jun-11 | 1.62 | 0.08 | 32.81 | 2.19 | 0.11 | 45.35 | | | | | | |
| SCIP-Inf-9¶ | | | | | | | | | | | | |
| Mar-11 | 0.94 | 0.16 | 5.56 | 1.58 | 0.14 | 18.25 | 0.22 | 0.01 | 6.60 | 0.76 | 0.02 | 34.32 |
| Apr-11 | 0.25** | 0.09 | 0.73 | 0.21** | 0.07 | 0.66 | 0.07* | 0.00 | 1.42 | 0.03* | 0.00 | 1.08 |
| May-11 | 1.23 | 0.21 | 7.23 | 1.58 | 0.21 | 12.13 | 3.75 | 0.47 | 30.21 | 12.96 | 0.60 | 278.52 |
| Jun-11 | 1.10 | 0.06 | 21.57 | 0.13 | 0.01 | 1.81 | 1.41 | 0.02 | 99.68 | 0.30 | 0.00 | 19.12 |
| SCIP-Card-2†† | | | | | | | | | | | | |
| Mar-11 | 0.65 | 0.10 | 4.21 | 0.71 | 0.12 | 4.24 | 0.60 | 0.02 | 22.73 | 0.55 | 0.02 | 15.94 |
| Apr-11 | 0.69 | 0.11 | 4.44 | 0.43 | 0.08 | 2.21 | 0.36 | 0.01 | 13.15 | 0.18 | 0.01 | 4.20 |
| May-11 | 0.61 | 0.13 | 2.85 | 0.36 | 0.08 | 1.50 | 0.52 | 0.02 | 16.81 | 0.23 | 0.01 | 6.26 |
| Jun-11 | 0.79 | 0.04 | 16.64 | 0.67 | 0.05 | 8.89 | 1.86 | 0.02 | 161.00 | 1.27 | 0.02 | 95.91 |
†This is the main analysis, where the before-phase extends from October 1, 2010, to March 4, 2011, and the after-phase extends from March 5, 2011, to June 10, 2011.
‡Composite measure. Fit statistics for adjusted one-group pre- and post-EHR logistic regression: Hosmer–Lemeshow (HL) test statistic: 8.97, p = .34; C statistic: 0.72; link test: square term p = .29. Fit statistics for adjusted DID estimation: HL test statistic: 5.82, p = .67; C statistic: 0.75; link test: square term p = .64.
§SCIP-Inf-4. Fit statistics for adjusted one-group pre- and post-EHR logistic regression: HL test statistic: 5.33, p = .72; C statistic: 0.87; link test: square term p = .87.
¶SCIP-Inf-9. Fit statistics for adjusted one-group pre- and post-EHR logistic regression: HL test statistic: 4.55, p = .80; C statistic: 0.92; link test: square term p = .10. Fit statistics for adjusted DID estimation: HL test statistic: 7.83, p = .45; C statistic: 0.92; link test: square term p = .22.
††SCIP-Card-2. Fit statistics for adjusted one-group pre- and post-EHR logistic regression: HL test statistic: 8.81, p = .36; C statistic: 0.89; link test: square term p = .37. Fit statistics for adjusted DID estimation: HL test statistic: 10.25, p = .25; C statistic: 0.92; link test: square term p = .25.
‡‡Cardiac surgeries are not performed at Highland Hospital (HH); hence HH cannot be used as a comparator group.
OR, odds ratio; LCL, lower confidence limit; UCL, upper confidence limit.
**p ≤ .05; *p ≤ .10.
For SCIP-Inf-9, in the prepost estimation we noted a 79 percent decline in the AOR of achieving success in April 2011 (AOR = 0.21, p = .01; 95 percent CI = 0.07–0.66) as compared to the before-phase. Using DID estimation, the decline was 97 percent (AOR = 0.03, p = .06; 95 percent CI = 0.00–1.08).
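The percent declines reported here and in the preceding paragraphs are simply the complement of the corresponding odds ratio; as a worked check against the figures quoted above:

```latex
(1 - \mathrm{AOR}) \times 100\% :\qquad
(1 - 0.21) \times 100\% = 79\% \ \text{(prepost)};\qquad
(1 - 0.03) \times 100\% = 97\% \ \text{(DID)}
```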
Sensitivity Analyses
Sensitivity analyses (see Table S3) demonstrated that the duration, magnitude, and statistical significance of the declines in the main analysis remained consistent. An exception was SCIP-Inf-4, for which significant reductions (61–65 percent; p < .10) in the AOR of success were noted in March 2011 in all sensitivity analyses.
Time Trends
For the composite, SCIP-Inf-4, and SCIP-Inf-9 measures, the statistically significant decline in the AOR of success at SMH bottomed out a month after EHR deployment and was followed by a statistically insignificant increase (see Table 4 and Figure 1).
Discussion
Our findings demonstrate a decline in SCIP scores in the months immediately following EHR deployment. The sociotechnical model (STM) for HIT-related errors (Sittig and Singh 2010) provides a framework to understand such challenges encountered while implementing EHRs in hospitals. The clinical content dimension of the STM relates to all data and clinical information that is stored in the EHR, the people dimension to human factors that are involved in the implementation and use of technology, and the workflow and communication dimension to the interaction between technology and the workforce.
For the composite measure, EHR use was associated with a lower likelihood of success in the first 2 months after deployment. This decline mainly reflected failures on three measures: SCIP-Inf-4, SCIP-Inf-9, and SCIP-Card-2. Beginning in May 2011, a statistically significant increase in the probability of success on the composite measure is indicative of the remedial measures instituted by the hospital upon identifying the errors.
For SCIP-Inf-4, EHR use was associated with a lower probability of success in the month of deployment in all three sensitivity analyses. The failure to control blood glucose following cardiac surgery increases the risk of developing infections. Post hoc anecdotal evidence suggests that the decline in the score occurred because of a failure to transfer relevant order sets into the new EHR (reflecting the clinical content, and workflow and communication, dimensions of the STM). This was identified and rectified through quality improvement efforts.
For SCIP-Inf-9, EHR use was associated with a lower likelihood of success in the second month of use. Failure to remove the urinary catheter within the specified time range increases the risk of catheter-associated urinary tract infection. Post hoc anecdotal evidence suggests that the decline in the score coincided with the gradual change from on-site end-user support to an off-site support system, along with increased pressure to keep up with the EHR documentation requirements (reflecting the people dimension of the STM).
Since the landmark report “To Err Is Human,” several nationwide initiatives have been implemented to reduce medical errors (Leape and Berwick 2005). While HIT applications have the potential to prevent medical errors (Bates et al. 2001; Bates and Gawande 2003), they can also facilitate errors (IOM 2012), especially because of the disruption of work processes in the deployment phase. This phase is also critical because errors can be rectified early on.
Limitations
Our study has several limitations. First, the choice of HH as the comparison hospital for the DID estimation could bias our findings. Notable differences between the two hospitals relevant to our study are the designation of SMH as a Level 1 Regional Trauma Center, the primarily emergency nature of orthopedic surgeries at SMH as compared to the elective nature of orthopedic surgeries at HH, the performance of cardiac surgeries at SMH but not at HH, and a concurrent abstraction process for the SCIP measures at SMH as compared to retrospective abstraction at HH. We acknowledge that differences in services, culture, and organization between the two hospitals may differentially influence their response to EHR deployment. However, similarities in the direction of risk-adjusted estimates in the one-group analysis, as well as commonality of ownership, medical culture, and collaborative efforts in improving SCIP measures, support the DID findings. Importantly, the DID assumption is not that the two hospitals should be similar but rather that they would have followed a similar time trend in the absence of the intervention. Second, it is difficult to tell from the data whether the decline in quality followed by the rebound that we observed is an effect directly linked to the EHR or simply random variation. However, the three measures involved reflect distinct care processes that use different EHR functionalities (Jha et al. 2009), and it seems improbable that a consistent pattern of change across three distinct measures occurred by chance alone. Furthermore, the increase for three of the four measures can be explained by the steps the hospital took to rectify the errors. Third, the generalizability of our findings is limited because our study was conducted at a single hospital and was contingent upon local state and market conditions. Fourth, SCIP scores are intended to reflect the performance of clinical activities as well as their documentation; it is possible that changes in scores were due to changes in documentation.
Conclusion
Our study identified statistically significant, temporary reductions in compliance with surgical process measures associated with EHR deployment. While EHRs have the potential to improve the quality of care, their deployment may lead to a temporary reduction in quality. Incorporating this awareness into the design of the implementation process should help preempt or minimize that decline.
Acknowledgments
Joint Acknowledgment/Disclosure Statement: While there was no sponsor for the study, Strong Memorial Hospital and Highland Hospital provided the data that were analyzed. The study investigators greatly appreciate the contributions of Brenda L. Carlson, Rosemarie Kolker, and Keith Skelton.
Disclosures: None.
Disclaimers: None.
Notes
1. An EHR is defined as “a real-time patient health record with access to evidence-based decision support tools that can be used to aid clinicians in decision making” (ONC HIT 2009).
2. The Office of the National Coordinator for Health Information Technology – Authorized Testing and Certification Bodies (ONC-ATCB).
Supporting Information
Appendix SA1: Author Matrix.
Table S1. Characteristics of Each Study Cohort.
Table S2. Raw Performance Scores for Each of the SCIP Measures at Strong Memorial Hospital (SMH) and Highland Hospital (HH).
Table S3. Excerpts of Main and Sensitivity Analyses for One-Group Pre- and Post-EHR Logistic Regression and Difference-in-Differences Estimation for Specific Surgical Care Improvement Project (SCIP) Measures.
References
- Aarts J, Doorewaard H, Berg M. Understanding Implementation: The Case of a Computerized Physician Order Entry System in a Large Dutch University Medical Center. Journal of the American Medical Informatics Association. 2004;11(3):207–16. doi: 10.1197/jamia.M1372.
- Abadie A. Semiparametric Difference-in-Differences Estimators. Review of Economic Studies. 2005;72(1):1–19.
- Agarwal R, Gao GG, DesRoches C, Jha AK. Research Commentary—The Digital Transformation of Healthcare: Current Status and the Road Ahead. Information Systems Research. 2010;21(4):796–809.
- Appari A, Johnson ME, Anthony DL. Meaningful Use of Electronic Health Record Systems and Process Quality of Care: Evidence from a Panel Data Analysis of US Acute-Care Hospitals. Health Services Research. 2013;48(2 Pt 1):354–75. doi: 10.1111/j.1475-6773.2012.01448.x.
- Ash JS, Berg M, Coiera E. Some Unintended Consequences of Information Technology in Health Care: The Nature of Patient Care Information System-Related Errors. Journal of the American Medical Informatics Association. 2004;11(2):104–12. doi: 10.1197/jamia.M1471.
- Ash JS, Sittig DF, Poon EG, Guappone K, Campbell E, Dykstra RH. The Extent and Importance of Unintended Consequences Related to Computerized Provider Order Entry. Journal of the American Medical Informatics Association. 2007;14(4):415–23. doi: 10.1197/jamia.M2373.
- Bates DW, Gawande AA. Improving Safety with Information Technology. New England Journal of Medicine. 2003;348(25):2526–34. doi: 10.1056/NEJMsa020847.
- Bates D, Cohen M, Leape L, Overhage JM, Shabot MM, Sheridan T. Reducing the Frequency of Errors in Medicine Using Information Technology. Journal of the American Medical Informatics Association. 2001;8(4):299–308. doi: 10.1136/jamia.2001.0080299.
- Beeler P, Kucher N, Blaser J. Sustained Impact of Electronic Alerts on Rate of Prophylaxis against Venous Thromboembolism. Thrombosis and Haemostasis. 2011;106(4):734–38. doi: 10.1160/TH11-04-0220.
- Blumenthal D. Wiring the Health System—Origins and Provisions of a New Federal Program. New England Journal of Medicine. 2011;365(24):2323–9. doi: 10.1056/NEJMsr1110507.
- Bratzler DW, Hunt DR. The Surgical Infection Prevention and Surgical Care Improvement Projects: National Initiatives to Improve Outcomes for Patients Having Surgery. Clinical Infectious Diseases. 2006;43(3):322–30. doi: 10.1086/505220.
- Buntin MB, Burke MF, Hoaglin MC, Blumenthal D. The Benefits of Health Information Technology: A Review of the Recent Literature Shows Predominantly Positive Results. Health Affairs. 2011;30(3):464–71. doi: 10.1377/hlthaff.2011.0178.
- Campbell EM, Sittig DF, Ash JS, Guappone KP, Dykstra RH. Types of Unintended Consequences Related to Computerized Provider Order Entry. Journal of the American Medical Informatics Association. 2006;13(5):547–56. doi: 10.1197/jamia.M2042.
- Cebul RD, Love TE, Jain AK, Hebert CJ. Electronic Health Records and Quality of Diabetes Care. New England Journal of Medicine. 2011;365(9):825–33. doi: 10.1056/NEJMsa1102519.
- Chaudhry B, Wang J, Wu S, Maglione M, Mojica W, Roth E, Morton SC, Shekelle PG. Systematic Review: Impact of Health Information Technology on Quality, Efficiency, and Costs of Medical Care. Annals of Internal Medicine. 2006;144(10):742–52. doi: 10.7326/0003-4819-144-10-200605160-00125.
- CMS. 2012. “Specifications Manual for National Hospital Quality Measures” [accessed on October 30, 2013]. Available at http://qualitynet.org/dcs/ContentServer?cid=1141662756099&pagename=QnetPublic%2FPage%2FQnetTier2&c=page.
- Crosson JC, Ohman-Strickland PA, Hahn KA, DiCicco-Bloom B, Shaw E, Orzano AJ, Crabtree BF. Electronic Medical Records and Diabetes Quality of Care: Results from a Sample of Family Medicine Practices. Annals of Family Medicine. 2007;5(3):209–15. doi: 10.1370/afm.696.
- Del Beccaro MA, Jeffries HE, Eisenberg MA, Harry ED. Computerized Provider Order Entry Implementation: No Association with Increased Mortality Rates in an Intensive Care Unit. Pediatrics. 2006;118(1):290–5. doi: 10.1542/peds.2006-0367.
- Durieux P, Nizard R, Ravaud P, Mounier N, Lepage E. A Clinical Decision Support System for Prevention of Venous Thromboembolism. Journal of the American Medical Association. 2000;283(21):2816–21. doi: 10.1001/jama.283.21.2816.
- Galanter WL, Thambi M, Rosencranz H, Shah B, Falck S, Lin FJ, Nutescu E, Lambert B. Effects of Clinical Decision Support on Venous Thromboembolism Risk Assessment, Prophylaxis, and Prevention at a University Teaching Hospital. American Journal of Health-System Pharmacy. 2010;67(15):1265–73. doi: 10.2146/ajhp090575.
- Goldzweig CL, Towfigh A, Maglione M, Shekelle PG. Costs and Benefits of Health Information Technology: New Trends from the Literature. Health Affairs. 2009;28(2):w282–93. doi: 10.1377/hlthaff.28.2.w282.
- Guerra YS, Das K, Antonopoulos P, Borkowsky S, Fogelfeld L, Gordon MJ, Palal BM, Witsil JC, Lacuesta EA. Computerized Physician Order Entry-Based Hyperglycemia Inpatient Protocol and Glycemic Outcomes: The CPOE-HIP Study. Endocrine Practice. 2010;16(3):389–97. doi: 10.4158/EP09223.OR.
- Han YY, Carcillo JA, Venkataraman ST, Clark RSB, Watson RS, Nguyen TC, Bayir H, Orr RA. Unexpected Increased Mortality after Implementation of a Commercially Sold Computerized Physician Order Entry System. Pediatrics. 2005;116(6):1506–12. doi: 10.1542/peds.2005-1287.
- Harrison MI, Koppel R, Bar-Lev S. Unintended Consequences of Information Technologies in Health Care—An Interactive Sociotechnical Analysis. Journal of the American Medical Informatics Association. 2007;14(5):542–9. doi: 10.1197/jamia.M2384.
- Haut ER, Lau BD, Kraenzlin FS, Hobson DB, Kraus PS, Carolan HT, Haider AH, Holzmueller CG, Efron DT, Pronovost PJ. Improved Prophylaxis and Decreased Rates of Preventable Harm with the Use of a Mandatory Computerized Clinical Decision Support Tool for Prophylaxis for Venous Thromboembolism in Trauma. Archives of Surgery. 2012;147(10):901–7. doi: 10.1001/archsurg.2012.2024.
- IOM. Crossing the Quality Chasm: A New Health System for the 21st Century (Committee on Quality of Health Care in America). Washington, DC: The National Academies Press; 2001.
- IOM. Health IT and Patient Safety: Building Safer Systems for Better Care (Committee on Patient Safety and Health Information Technology). Washington, DC: The National Academies Press; 2012.
- Jha AK, DesRoches CM, Campbell EG, Donelan K, Rao SR, Ferris TG, Shields A, Rosenbaum S, Blumenthal D. Use of Electronic Health Records in US Hospitals. New England Journal of Medicine. 2009;360(16):1628–38. doi: 10.1056/NEJMsa0900592.
- Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, Strom BL. Role of Computerized Physician Order Entry Systems in Facilitating Medication Errors. Journal of the American Medical Association. 2005;293(10):1197–203. doi: 10.1001/jama.293.10.1197.
- Kucher N, Koo S, Quiroz R, Cooper JM, Paterno MD, Soukonnikov B, Goldhaber SZ. Electronic Alerts to Prevent Venous Thromboembolism among Hospitalized Patients. New England Journal of Medicine. 2005;352(10):969–77. doi: 10.1056/NEJMoa041533.
- Leape LL, Berwick DM. Five Years after To Err Is Human. Journal of the American Medical Association. 2005;293(19):2384–90. doi: 10.1001/jama.293.19.2384.
- Lecumberri R, Marqués M, Díaz-Navarlaz MT, Panizo E, Toledo J, García-Mouriz A, Páramo JA. Maintained Effectiveness of an Electronic Alert System to Prevent Venous Thromboembolism among Hospitalized Patients. Thrombosis and Haemostasis. 2008;100(4):699–704. doi: 10.1160/th08-05-0337.
- Lee J, Clay B, Zelazny Z, Maynard G. Indication-Based Ordering: A New Paradigm for Glycemic Control in Hospitalized Inpatients. Journal of Diabetes Science and Technology. 2008;2(3):349–56. doi: 10.1177/193229680800200303.
- Ludwick D, Doucette J. Adopting Electronic Medical Records in Primary Care: Lessons Learned from Health Information Systems Implementation Experience in Seven Countries. International Journal of Medical Informatics. 2009;78(1):22–31. doi: 10.1016/j.ijmedinf.2008.06.005.
- Mekhjian HS, Kumar RR, Kuehn L, Bentley TD, Teater P, Thomas A, Payne B, Ahmad A. Immediate Benefits Realized Following Implementation of Physician Order Entry at an Academic Medical Center. Journal of the American Medical Informatics Association. 2002;9(5):529–39. doi: 10.1197/jamia.M1038.
- Nair BG, Newman SF, Peterson GN, Wu WY, Schwid HA. Feedback Mechanisms Including Real-Time Electronic Alerts to Achieve Near 100% Timely Prophylactic Antibiotic Administration in Surgical Cases. Anesthesia & Analgesia. 2010;111(5):1293–300. doi: 10.1213/ANE.0b013e3181f46d89.
- Nebeker JR, Hoffman JM, Weir CR, Bennett CL, Hurdle JF. High Rates of Adverse Drug Events in a Highly Computerized Hospital. Archives of Internal Medicine. 2005;165(10):1111–6. doi: 10.1001/archinte.165.10.1111.
- O'Connor PJ, Crain AL, Rush WA, Sperl-Hillen JAM, Gutenkauf JJ, Duncan JE. Impact of an Electronic Medical Record on Diabetes Quality of Care. Annals of Family Medicine. 2005;3(4):300–6. doi: 10.1370/afm.327.
- ONC HIT. 2009. “Health IT Terms” [accessed on October 30, 2013]. Available at http://healthit.hhs.gov/portal/server.pt?open=512&mode=2&cached=true&objID=1256&PageID=15726.
- ONC HIT. 2012. “Certified Health IT Product List” [accessed on October 30, 2013]. Available at http://oncchpl.force.com/ehrcert?q=CHPL.
- SAS Institute Inc. SAS System for Windows – Version 9.3. Cary, NC: SAS Institute Inc; 2010.
- Schwann NM, Bretz KA, Eid S, Burger T, Fry D, Ackler F, Evans P, Romancheck D, Beck M, Ardire AJ. Point-of-Care Electronic Prompts: An Effective Means of Increasing Compliance, Demonstrating Quality, and Improving Outcome. Anesthesia & Analgesia. 2011;113(4):869–76. doi: 10.1213/ANE.0b013e318227b511.
- Sequist TD, Gandhi TK, Karson AS, Fiskio JM, Bugbee D, Sperling M, Cook EF, Orav EJ, Fairchild DG, Bates DW. A Randomized Trial of Electronic Clinical Reminders to Improve Quality of Care for Diabetes and Coronary Artery Disease. Journal of the American Medical Informatics Association. 2005;12(4):431–7. doi: 10.1197/jamia.M1788.
- Sittig DF, Singh H. A New Sociotechnical Model for Studying Health Information Technology in Complex Adaptive Healthcare Systems. Quality and Safety in Health Care. 2010;19(Suppl 3):i68–74. doi: 10.1136/qshc.2010.042085.
- StataCorp. Stata Statistical Software: Release 12. College Station, TX: StataCorp LP; 2011.
- UHC. 2012. “University HealthSystem Consortium” [accessed on October 30, 2013]. Available at http://www.uhc.edu.
- Wax DB, Beilin Y, Levin M, Chadha N, Krol M, Reich DL. The Effect of an Interactive Visual Reminder in an Anesthesia Information Management System on Timeliness of Prophylactic Antibiotic Administration. Anesthesia & Analgesia. 2007;104(6):1462–6. doi: 10.1213/01.ane.0000263043.56372.5f.