eGEMs. 2015 Jul 9;3(2):1150. doi: 10.13063/2327-9214.1150

A Framework for Usable and Effective Clinical Decision Support: Experience from the iCPR Randomized Clinical Trial

Joseph Kannry i, Lauren McCullagh ii, Andre Kushniruk iii, Devin Mann iv, Daniel Edonyabo v, Thomas McGinn ii
PMCID: PMC4537146  PMID: 26290888

Abstract

Introduction:

The promise of Clinical Decision Support (CDS) has always been to transform patient care and improve patient outcomes through the delivery of timely and appropriate recommendations that are patient specific and, where appropriate, actionable. However, the users of CDS—providers—are frequently bombarded with inappropriate and inapplicable CDS that is often not informational, not integrated into the workflow, not patient specific, and that may present out-of-date and irrelevant recommendations.

Methods:

The integrated clinical prediction rule (iCPR) project was a randomized clinical trial (RCT) conducted to determine if a novel form of CDS, i.e., clinical prediction rules (CPRs), could be efficiently integrated into workflow and result in changes in outcomes (e.g., antibiotic ordering) when embedded within a commercial electronic health record (EHR).

We use the lessons learned from the iCPR project to illustrate a framework for constructing usable, useful, and effective actionable CDS while employing off-the-shelf functionality in a production system. Innovations that make up the framework combine the following: (1) active and actionable decision support, (2) multiple rounds of usability testing with iterative development for user acceptance, (3) numerous context sensitive triggers, (4) dedicated training and support for users of the CDS tool for user adoption, and (5) support from clinical and administrative leadership. We define “context sensitive triggers” as being workflow events (i.e., context) that result in a CDS intervention.

Discussion:

Success of the framework can be measured by CDS adoption (i.e., intervention is being used), acceptance (compliance with recommendations), and clinical outcomes (where appropriate). This framework may have broader implications for the deployment of Health Information Technology (HIT).

Results and Conclusion:

iCPR was well adopted (57.4% of users) and accepted (42.7% of users). Usability testing identified and fixed many issues before the iCPR RCT. The level of leadership support and clinical guidance for iCPR was key in establishing a culture of acceptance for both the tool and its recommendations, contributing to adoption and acceptance. The dedicated training and support led to the majority of the residents reporting a high level of comfort with both iCPR tools, strep pharyngitis (64.4 percent) and pneumonia (62.7 percent), as well as a high likelihood of using the tools in the future. A surprising framework addition resulted from usability testing: context sensitive triggers.

Keywords: Health Information Technology, Human Computer Interaction (HCI)

Introduction

The promise of Clinical Decision Support (CDS) has always been to transform patient care and improve patient outcomes through the delivery of timely and appropriate recommendations.1–3 CDS is defined as anything that directly aids in clinical decision-making about individual patients. Decision support can include collegial advice, text references, Web sites, and computer systems.4 A Clinical Decision Support System (CDSS) is a computerization of CDS, frequently integrated into a clinical information system such as an electronic health record (EHR), that directly aids in clinical decision-making about individual patients. Specifically, a CDSS incorporates individual patient data, a rules engine, and a medical knowledge base to produce a patient-specific assessment or recommendation for clinicians.5,6 In a sense, CDS is the content and the CDSS is the delivery system. For the purposes of this paper, CDS, unless otherwise specified, is delivered through a CDSS.

However, users of CDS, i.e., providers, are frequently bombarded with inappropriate and inapplicable CDS that is often not informational, not integrated into the workflow, not patient specific, and that may present out-of-date and irrelevant recommendations. Not surprisingly, multiple recent studies and analyses have raised questions about the effectiveness of CDS and ambulatory EHRs and the ability of EHRs to have an impact on care quality.7–11 A 2012 study determined that while CDS can lead to changes, it is not clear whether those include changes in clinical outcomes or improved efficiency.12 Dexheimer et al. (2005) examined 23 studies of preventive health care reminders and concluded that paper tools were superior to CDS.13

Successful implementation of Health Information Technology (HIT) is affected by multiple factors, many of which have been studied with regard to implementation of clinical information systems such as inpatient Computerized Provider Order Entry (CPOE), ambulatory EHRs, etc. These factors include the following: leadership, integration with health care and workflow process, value to users, and training and support.14,15 However, these success factors have not been similarly studied in CDS.

The success of CDS interventions can be measured by the adoption rate and the acceptance rate of the CDS intervention (Table 1): Is the CDS intervention being used (adoption rate), and are the CDS-provided recommendations being accepted (acceptance rate)?16

Table 1.

Measures of Success for Usable and Effective Clinical Decision Support (CDS)

MEASURE               DEFINITION
Adoption              CDS intervention is used.
Acceptance            Compliance with CDS recommendations.
Changes in Behavior   Changes in process or care (e.g., reviewing medications, ordering more of a medication).
Clinical Outcomes     Demonstrable (statistically significant) changes in care.

The integrated clinical prediction rule (iCPR) project was a novel form of a CDS intervention, i.e., clinical prediction rules (CPRs), which was efficiently integrated into workflow. The CDS intervention resulted in both high adoption (57.5 percent of intervention users) and acceptance (42.4 percent) rates. In contrast, the peer reviewed literature cites rates of 10–20 percent for both adoption and acceptance (see Table 2).12,16

Table 2.

Summary of iCPR Results by CDS Measure of Success

                 iCPR                                       PEER REVIEWED LITERATURE
Adoption         57.5%17                                    10–20%12,16
Acceptance       42.4%17                                    10–20%12,16
Outcomes         9.2% reduction in ordering antibiotics     Wide variability, not easily summarized
                 for strep pharyngitis (P = .008)17

The framework for usable and effective CDS is derived in part from the peer reviewed literature (see above), which significantly informed the iCPR project design, and in part from the authors’ experience conducting a randomized clinical trial (RCT) of iCPR. The iCPR project objective was to determine whether iCPR—a novel form of CDS in an EHR—could be efficiently integrated into workflow, resulting in changes to patient outcomes (e.g., antibiotic use). Only off-the-shelf EHR functionality was employed, to ensure portability. We implemented two well-validated CPRs, namely, the Walsh rule for streptococcal pharyngitis and the Heckerling rule for pneumonia. The iCPR RCT was conducted at the Internal Medicine Associates at Mount Sinai Medical Center, New York City, with both attending physicians and housestaff providers. At the end of the study, the intervention group completed the iCPR tool in 57.5 percent of visits, and providers in the intervention group were significantly less likely to order antibiotics than was the control group.17

This paper uniquely uses the lessons learned from the iCPR project17–21 relative to successful CDS implementation to illustrate a framework for constructing usable, useful, and effective CDS that is both highly adopted and accepted. The framework was developed in part because multiple studies have questioned the effectiveness and efficacy of CDS delivered by ambulatory EHRs.7–13 Additionally, the framework was developed to address the fact that while leadership, integration with health care and workflow process, value to users, and training and support14,15 have been studied as success factors in implementation of clinical information systems (see above), these same success factors have not been similarly studied in CDS.

This framework combines the following: (1) active and actionable decision support, (2) multiple rounds of usability testing with iterative development for user acceptance, (3) numerous context sensitive triggers, (4) dedicated training and support for CDS tool users to encourage user adoption, and (5) support from clinical and administrative leadership for both successful adoption and acceptance of the CDS intervention.

iCPR Framework for Constructing Usable, Useful, and Effective CDS

A literature review identified four factors that can have a significant impact on CDS adoption and acceptance: active and actionable CDS; usability and clinical workflow integration; training; and clinical leadership. The fifth factor, “context sensitive triggers,” was identified through the iCPR work. The review of the CDS literature was never intended to be a comprehensive or systematic review. Search terms in both PubMed and Google Scholar included clinical decision support; acceptance rates of clinical decision support; clinical decision support and usability; and clinical decision support and implementation. To understand the role of these factors, it is necessary to first review the types of CDS and how they are triggered. For the role each of these factors plays in the framework, please see Table 3.

Table 3.

Relationship of Framework Criteria to Measures of Success for Clinical Decision Support (CDS)

FRAMEWORK CRITERIA                                                                 IMPACTED MEASURES OF SUCCESS
Actionable and active                                                              Adoption, Acceptance, Outcomes
Multiple rounds of usability testing with iterative development                    Adoption, Acceptance
Numerous context sensitive triggers                                                Adoption
Dedicated training and support for users of the CDS tool for user adoption         Adoption, Acceptance
Support from clinical and administrative leadership for adoption and acceptance    Adoption, Acceptance
of the CDS intervention

Active and Actionable Clinical Decision Support (CDS)

Background

There are two types of CDS. The first type is passive, in which the user has to input data and then request help from an online source like the National Library of Medicine’s Medline (http://www.pubmed.org). In contrast, the second type—active decision support—is triggered by an event and delivers information to the physician that was not requested by the physician, but that is relevant and of interest.22,23 The physician may or may not have taken a related action to initiate the trigger. For example, a physician may order a nephrotoxic medication and trigger a CDS-generated alert that recommends a dose appropriate to the patient’s kidney function and enables ordering a corollary order for blood levels of the medication. Corollary orders refer to “orders required to detect or ameliorate adverse reactions that may result from the trigger order.”24

To be accepted and adopted, the active type of CDS must be delivered at the point of care, be patient specific, be in clinical context (i.e., clinically relevant and logical, timely, and delivered to the right providers), be automated as much as possible, allow explanation for override, and be both tested and validated.25 However, CDS that results in frequent overrides is far from desirable, as overrides can become a source of error. In a study on the effect of overrides on Adverse Drug Events (ADEs), 1 out of 30 overrides resulted in an ADE.26,27

CDS can generate active decision support in several ways, which have been extensively studied: alerts, reminders, corollary orders, and guidelines.5,6,24 An alert is a suggestion requiring immediate response or action. For example, an inappropriately high dose of a medication or a dangerous interaction between a medication and the value from a lab test would trigger an alert (e.g., Digoxin and low potassium). A reminder is a suggestion not requiring immediate response or action. For example, a reminder may caution that the increased risk of heart disease associated with Celebrex must be balanced with the pain relieving benefit of Celebrex.
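The digoxin/low-potassium example above can be sketched as a simple rule. The following Python fragment is purely illustrative: the threshold, function name, and data shapes are assumptions made for this sketch, not the logic of any production drug-lab interaction checker.

```python
# Illustrative sketch of a drug-lab interaction alert (digoxin with low
# potassium). Names and the 3.5 mEq/L threshold are assumptions, not taken
# from the iCPR project or any real EHR rules engine.

def digoxin_hypokalemia_alert(new_order, latest_labs):
    """Return an interruptive alert dict if digoxin is ordered while the
    most recent potassium is low; otherwise return None (no alert)."""
    LOW_POTASSIUM_MEQ_L = 3.5  # illustrative threshold
    if new_order.lower() != "digoxin":
        return None
    potassium = latest_labs.get("potassium")
    if potassium is not None and potassium < LOW_POTASSIUM_MEQ_L:
        return {
            # an "alert" requires immediate response, unlike a "reminder"
            "type": "alert",
            "message": f"Digoxin ordered with potassium {potassium} mEq/L "
                       "(< 3.5). Risk of digoxin toxicity.",
        }
    return None
```

Note that the rule fires on an event (placing the order) rather than on a user request, which is what makes it active rather than passive decision support.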

Guidelines are a series of instructions on how to care for the patient based on information about the patient’s clinical status. In contrast to alerts and reminders, there are several pieces of information required for a guideline to fire. If a physician orders Metformin, a medication used to control blood sugar in diabetes, a guideline might fire that prompts the optional ordering of hemoglobin A1C and calculates the estimated creatinine clearance. However, implementing guidelines for complex chronic disease has proven to be challenging.25,28–31

The success of actionable decision support can be measured by changes in behavior and/or outcome. The term “actionable CDS” is not new,32–37 and implies at the very least that advice is being provided that the user can then take action on and thus influence behavior and outcome. For example, the alert might state “this patient has diabetes and we recommend starting the medication metformin.” In this example, the user then needs to stop whatever he or she was doing at the moment of alert and go to a separate section of the EHR to place an order for metformin. We define actionable CDS as active CDS interventions (e.g., alerts, reminders, etc.) that contain everything (i.e., orders, documentation, patient instructions, prescriptions, etc.) the user needs to take the desired action.25 By our definition using the same example, the alert would still state “this patient has diabetes and we recommend starting the medication metformin” but would also contain orders for metformin. In this example the user simply has to “accept” in order to acknowledge and place the orders.
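The distinction drawn above can be sketched in code: an actionable alert carries the orders needed to follow its advice, so a single "accept" both acknowledges the alert and places the orders. The data structures and order text below are hypothetical, invented for illustration, not drawn from the iCPR implementation or any real EHR.

```python
# Minimal sketch of "actionable" CDS: the alert bundles its own orders,
# so accepting it places them in one step instead of forcing the user to
# navigate to a separate ordering section. All structures are illustrative.

def make_actionable_alert(message, bundled_orders):
    """Build an alert that carries everything needed to act on it."""
    return {"message": message, "orders": bundled_orders, "accepted": False}

def accept_alert(alert, order_queue):
    """Accepting the alert both acknowledges it and places its orders."""
    alert["accepted"] = True
    order_queue.extend(alert["orders"])
    return order_queue

queue = []
alert = make_actionable_alert(
    "This patient has diabetes; we recommend starting metformin.",
    ["metformin 500 mg PO BID"],  # hypothetical order text
)
accept_alert(alert, queue)
# one "accept" action placed the order without the user leaving the alert
```

A merely advisory alert, by contrast, would carry only the message, leaving the user to locate and enter the order separately.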

Successful CDS design as measured by adoption is dependent on triggers.38 For purposes of this paper we introduce a new term “context sensitive triggers” and define these triggers as being workflow events (i.e., context) that result in a CDS intervention. These workflow events are actions taken by the user such as data entry in structured fields like problem lists, billing diagnoses, or migrating to the order entry section to place an order. For example, the user begins seeing the patient (i.e., workflow) and enters a chief complaint. The entry of data in the chief complaint then triggers the alert.
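The idea of context sensitive triggers described above can be sketched as a mapping from workflow events to firing conditions, so the same CDS tool can launch from whichever point the user's workflow reaches. The event names, conditions, and dispatch scheme below are assumptions made for this sketch and do not reflect the actual EHR rule engine used in iCPR.

```python
# Illustrative sketch of context sensitive triggers: workflow events
# (chief complaint entry, billing diagnosis entry, order entry) are mapped
# to conditions that decide whether the CDS intervention should fire.
# Event names and conditions are hypothetical.

TRIGGERS = {
    "chief_complaint_entered":
        lambda ctx: "sore throat" in ctx.get("chief_complaint", "").lower(),
    "billing_diagnosis_entered":
        lambda ctx: ctx.get("diagnosis") == "pharyngitis",
    "order_placed":
        lambda ctx: ctx.get("order") == "rapid strep test",
}

def fire_cds(event, context):
    """Return True if this workflow event should launch the CDS tool."""
    condition = TRIGGERS.get(event)
    return bool(condition and condition(context))
```

Registering several triggers rather than one reflects the paper's finding that no single trigger point would have presented all suitable candidates for intervention.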

iCPR and CDS Framework

Clinical prediction rules (CPRs) are a form of active and actionable decision support in the EHR. CPRs can be clearly identified as a type of CDS in that these rules aid in clinical decision-making. CPRs possess distinct characteristics in contrast to other forms of CDS integrated into EHRs. Unlike the content of alerts and reminders, the content of CPRs must meet methodological standards that are designed to subject the rules themselves to validation and assessment of clinical outcomes. Specifically, CPRs must include the following: an outcome that is both clearly defined and clinically important; a well-described patient population to account for the effect of the population on rule performance; validation in other patient populations; measurements of clinical use (e.g., is it being used clinically?) and outcome; reproducibility; and clinical common sense (i.e., does the rule make sense?).39–41 CPRs, unlike clinical guidelines, are designed to answer one clinical question, such as “does this patient have strep pharyngitis?” or “does this patient need a chest x-ray to diagnose pneumonia?” Since iCPR was built as actionable CDS and CPRs are designed to influence outcome, a very applicable measure of success for iCPR would be clinical outcomes. There was a 9.2 percent reduction in ordering antibiotics for strep pharyngitis, with P = .008 (Table 2).17
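To illustrate the general shape of a CPR-based risk calculator, the sketch below sums points for clinical findings and maps the total to a risk stratum with an attached recommendation. The point values, findings, and thresholds are placeholders invented for this sketch; they are NOT the validated coefficients of the Walsh rule or any other published CPR.

```python
# Illustrative point-based risk calculator in the style of a CPR.
# Every weight and cutoff here is a made-up placeholder, not the Walsh rule.

ILLUSTRATIVE_POINTS = {
    "fever": 1,
    "tonsillar_exudate": 1,
    "no_cough": 1,
}

def strep_risk(findings):
    """Sum illustrative points for the given findings and return
    (score, recommendation) for the resulting risk stratum."""
    score = sum(ILLUSTRATIVE_POINTS[f] for f in findings
                if f in ILLUSTRATIVE_POINTS)
    if score <= 1:
        return score, "low risk: no testing or antibiotics suggested"
    if score == 2:
        return score, "intermediate risk: consider rapid strep test"
    return score, "higher risk: test and treat per local guidance"
```

Embedding such a calculator in the EHR, pre-populated from patient data and paired with bundled orders for each stratum, is what makes the CPR both active and actionable in the sense defined above.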

As noted above, successful active CDS is delivered at the point of care, is patient specific, is in a clinical context, is automated as much as possible, allows an explanation for override, and is both tested and validated.25 CPRs are delivered at the point of care, are specific to patients with certain complaints and diagnoses, are automated as much as possible, and allow override with explanations.17 The success of active CDS can be measured by adoption, which was 57.5 percent (Table 2).

Usability and workflow integration

Background

Usability is a very significant factor in the successful adoption and acceptance of CDS. Usability can best be defined as being both usable and useful. Usefulness, along with relevance, has been noted as a key determinant of CDS success by both Bates and Rousseau.25,42 Poor user-interface design and redundant data entry lead to poor use of CDS,43 further highlighting the need for iterative design and assessment of usability. The need for structured user feedback to improve CDS (e.g., usability testing) was also noted in multiple other studies.43–48 A nursing study indicated the need for redesign emphasizing usability and learnability, as well as workflow integration, to improve use of reminders.44,45 Shah et al. found that taking usability and workflow into account helped in the redesign of alerting in prescribing systems.47 Fung et al. (2008) found that improving integration into workflow, along with continuous iterative feedback from users, was a necessary ingredient for well designed and accepted alerting.48 Workflow integration as a determinant of CDS success was stressed in studies by Maviglia28 and Krall.49 This need cannot be overstated, as failure to integrate with existing processes has led to significant difficulty and user dissatisfaction.50–59

Not only is usability affected by the user population; the different types of users—light, moderate, and heavy—may require different designs.60 Poor usability and user-interface design can contribute to medical errors in the clinical information system.27,61–63 Clinical information systems need to be intuitive and easy to use, and to integrate well into the workflow.64 Ultimately, this is true of CDS as well.45 The conflicting results in the literature on ambulatory CDS effectiveness noted earlier may reflect differences in system design, workflow, usability, and content.65

Ultimately, usability is linked to and affects user satisfaction,66 and user satisfaction is an important predictor of a system’s success.67 The bottom line is that physicians are looking for systems that are easy to use, improve daily efficiency, and provide perceived improvements in patient care.68 Perceived ease of use determines perceived usefulness and has the highest correlation with positive attitudes about EHR use.69 Consistent with this bottom line is that user satisfaction seems to correlate with the ability to perform tasks efficiently.70 This need for task efficiency has been noted for CDS as well,49 in particular for reminders.5,6,71–74 CDS interventions provide some of the perceived quality and efficiency that physicians look for by helping physicians with “their” patient.14,68

iCPR and CDS Framework

Prior to our “go live,” investigators evaluated the usability of iCPR integration at each of the possible trigger points and tested the level of disruption to clinical workflow induced by triggering at that point in the workflow. They also evaluated the perceived usefulness of iCPR as a decision aid in each of the two clinical scenarios. Clinical scenarios were constructed for providers, who were asked, while interacting with the program, to “think aloud” about what they were doing and what they wanted to achieve, and also to verbalize their experience using the software. The computer screens were recorded, and subjects’ verbalizations were audio recorded. A professional transcriptionist transcribed the audio recordings for the expert panel’s review. We paid particular attention to how easily and successfully providers were able to navigate the program and how satisfied they were with the interface and the information. We also recorded the number of errors or problems in understanding different concepts and completing different tasks, as well as the number of requests for assistance. This first round of usability testing employed eight subjects.18

Over the development period, prototypes were tested in increasingly realistic scenarios, with final versions being tested in our simulation lab. The clinical care sessions reproduced and simulated the issues of time pressures and patient-case complexity. This process allowed the developers to ascertain characteristics of the iCPR experience that are functional, need improvement, fit user expectations, miss expectations, fail to function, or are opportunities for further development.18 This second round of usability testing employed eight subjects.

Refinements based on the results of usability testing were incorporated into subsequent prototypes. For example, the iCPR prototype included a calculator for generating strep and pneumonia risk estimates. As a result of usability testing and iterative design, this calculator minimized “clicks” and manual data entry.18,20

In summary, usability testing assessed usability and usefulness before going live. Time was built in to make iterative changes based on the usability testing. In part by evaluating usability and usefulness, we achieved an adoption rate of 57.5 percent (useable) and an acceptance rate of 42.7 percent (useful).17

Dedicated Training

There is universal agreement on training as a prerequisite for a successful implementation15,53,56,75–78 and thus adoption. Poor training can result in inappropriate use or underutilization of functionality79–81 and poor adoption. Users frequently want to learn more or receive advanced training and may need to learn about new features added to the system.14,15

iCPR and CDS Framework

Consenting residents who were randomized to the intervention arm received approximately 45 minutes of training. These training sessions were led by at least one study investigator with the support of a study staff member. The content of these training sessions was divided into three basic sections: (1) background discussion of the definition of CPRs and the evidence for their use, specifically for strep pharyngitis and pneumonia; (2) an on-screen walk-through of three common clinic patient scenarios employing the iCPR tool; and (3) presentation of a simulated patient-physician encounter demonstrating how the tool is incorporated into the office encounter workflow. Residents who were unable to attend this group session were trained individually in separate sessions. The control group was invited to the training, their consent was obtained, and they were then provided with two articles on CPRs. They did not participate in the on-screen walk-throughs or the video presentation.

A clinical champion for the iCPR tool was available on site during most clinical sessions to promptly address questions or problems that arose during the use of the iCPR tool. In addition, technical support was available on site and the pager number for the support personnel was posted throughout the practice area. Recurrent or significant technical problems were promptly communicated to the EHR programmer and subsequently discussed with the study team during weekly meetings.

All 59 residents assigned to the intervention group received the training session and completed the post-training survey.18 The majority of the residents reported a high level of comfort with both the strep pharyngitis (64.4 percent) and pneumonia (62.7 percent) iCPR tools following the training session. In addition, they reported a high likelihood of using the tools in the future, and nearly 95 percent of the residents gave a favorable rating for the overall quality of the presentation. The high level of comfort with the tool and content in part anticipated the high adoption rate (57.5 percent) and acceptance rate (42.4 percent). It is difficult to determine whether it was the user-friendly design of the tool or the quality of the training and technical support that facilitated the widespread adoption and use of the iCPR tool among residents.

Support from Clinical and Administrative Leadership

Leadership15,82 is critical for both clinical adoption and acceptance of CDS. There is some evidence in the peer reviewed literature to suggest that leadership is an independent risk factor that correlates with project success or failure.83 The broad umbrella of leadership has to include clinical, administrative, and IT leadership,84 especially clinicians.64,85–87 Recently the peer reviewed literature has begun to clearly state the importance of clinical leadership in particular as a factor in implementation success or failure.15,88,89 Perhaps most importantly, on-site leadership needs to identify the raison d’être for doing the intervention (i.e., acceptance).

iCPR and CDS Framework

Prior to the rollout, the team met with all of the clinical leadership for the practice: the medical director for the practice, the residency director, the chief residents for internal medicine, and the attending physicians who precept (i.e., teach and supervise the residents) in ambulatory care. Although all of the clinical leadership became strong advocates for the use of the iCPR tool, we identified the need for two clinical champions who were often present in the clinic. One champion was a frequent preceptor who could troubleshoot any questions or concerns. The other champion was a senior clinical leader who frequently precepted and provided strong on-site advocacy. This is consistent with the important role clinical leadership can play in project success or failure.15,88,89 Overall, the level of leadership support and clinical guidance for iCPR was key in establishing a culture of acceptance for both the tool and its recommendations, contributing to adoption and acceptance.

As noted earlier, the role of people in implementations cannot be overstated.82,83,90 After unrelated changes in clinical, administrative, and IT roles occurred, utilization of the tool declined.

Context Sensitive Triggers

iCPR and CDS Framework

Usability testing and workflow integration analysis in iCPR identified a surprising candidate for the framework of successful CDS: context sensitive triggers. Specifically, usability testing and workflow integration identified two distinct workflows: charting first and orders first. In the orders-first workflow, frequently used by housestaff, orders were placed first and charting, which included recording of the chief complaint, occurred later. In the charting-first workflow, charting was done first, including the recording of the chief complaint, with orders occurring later. These two workflows suggested early on to the research team that CDS triggers need to be thought of in the context of workflow. Context sensitive triggers are workflow events that launch the CDS intervention at the appropriate point in the user’s workflow. Initially the placement of triggers was not identified as part of the framework for successful CDS, as the framework was based on analysis of the peer reviewed literature available at the time of study design in 2010. The available peer reviewed literature focused heavily on internally developed systems and a limited number of sites.91,92 What iCPR unearthed was a potential limitation of a commercial EHR system. The triggers in the EHR could be placed at the beginning of the visit, when the chief complaint was being recorded, or at the end of the visit, when billing diagnoses and orders were being placed. In short, triggers could be placed at the beginning or the end of the visit; there were no identifiable trigger points in the middle of the workflow, such as entering observations in a progress note (i.e., charting). The resulting alerts included noninterrupting alerts in the chief complaint section and interrupting alerts in the diagnoses, order combination, and point-of-care testing sections. Adoption in iCPR was defined as responding to the alert and opening the risk score calculator.

The iCPR tool provided three potential trigger points, as it was clear from an earlier workflow analysis that three very different workflows were possible. Usability testing confirmed this workflow analysis. One such trigger point was the chief complaint, which accounted for 30 percent of all triggered CDS.19 Another was the entering of a relevant billing diagnosis, which accounted for 57 percent of triggered events, and a third was placing orders, accounting for 13 percent of triggered events. When orders were placed, any existing diagnosis was factored in as well to determine whether an event should be triggered. The contribution of each of the context sensitive triggers highlights the importance of flexible triggering in CDS adoption, as no one trigger would have presented all suitable candidates for intervention.19 In summary, multiple context sensitive triggers were required for successful adoption (i.e., a 57.5 percent adoption rate) of the CDS intervention.

Limitations

The iCPR study was not designed to measure the relative contributions of each of the framework success factors. Further study would be needed to delineate relative contributions of framework factors.

While there was a decline in usage of iCPR due to changes in leadership, there was a decline in iCPR usage prior to the changes in leadership as well.21 The decline in usage prior to leadership change was carefully measured. Data regarding the observed decline after clinical leadership changes were not available.

Finally, while the framework is based on a thorough review of the literature, proof of its applicability and success is provided in the context of only one study at one large practice at one academic medical center. Further study would be needed to examine the applicability and success of the framework itself.

Conclusion

We used the lessons learned from the iCPR project to illustrate and support a framework for constructing usable, useful, and effective actionable CDS as measured by adoption (use) and acceptance (compliance) rates, and by clinical outcomes (where appropriate). For example, clinical outcomes would be an appropriate measure for an actionable CDS intervention that suggested additional medications. In contrast, measuring clinical outcomes for a process measure, such as the percentage of patients with their medication list reviewed (a measure required for Meaningful Use), would not be appropriate. iCPR was designed to be active and actionable CDS and was both well adopted and accepted when compared to baseline rates in the literature (Table 2). Usability testing with iterative design identified and fixed many correctable issues before the clinical trial of iCPR and had a positive impact on adoption and acceptance rates.

A surprising addition to the framework came as a result of the usability testing: context sensitive triggers. Overall we identified three triggers, all of which contributed to identifying suitable intervention subjects, and our data support the need for more than one trigger. Since our EHR implementation provided only cursory CDS training, and exposure to CDS can occur long after training, we provided dedicated training and support for the use of the tool. This dedicated training and support had a positive impact on adoption and acceptance. The involvement and visibility of clinical leadership resulted in a buy-in that was transmitted to housestaff, ensuring adoption and acceptance. The iCPR findings are consistent with prior studies suggesting that usability,43–48 user training,14,15,53,56,75–81 and clinical leadership15,64,84–89 are each necessary for successful system implementation, if not for CDS implementation alone. However, based on our review of the literature, we are unaware of any study that combined all of these interventions.

Future work will measure the effectiveness of the CDS framework in similar research studies that employ CDS to change behavior and affect outcomes. Additionally, more attention will be paid to measuring the effect of clinical leadership on acceptance and adoption. While it is beyond the scope of this paper to fully explore the applicability of this framework to contexts beyond CDS, there is clear applicability to the broader context of HIT. Three of the framework factors have individually been identified as critical to the successful implementation and use of clinical information systems in HIT: multiple rounds of usability testing with iterative development for user acceptance,43–48 dedicated training and support for users as critical for adoption,43–48 and support from clinical and administrative leadership.15,64,84–89 Further study might investigate the effect of consistently employing all three factors together in clinical implementation, optimization, and overall daily use.

The concepts behind the two remaining factors may be quite important for those working with data. Context sensitivity may apply not only to CDS triggers but also to the capture and use of clinical data: at what point in the workflow can the user be prompted to capture high quality, reliable data? Capturing byproducts of clinical data generation, such as billing or claims, has raised questions about the validity of the data in the past.93–95 Similarly, the concepts of actionable and active may apply not only to decision support but also to the presentation of data in clinical information systems. For example, should being actionable (i.e., something the user can act on) be a criterion for the presentation of data? Should data be displayed actively (i.e., clinically relevant and logical, timely, delivered to the right providers)? Should data display be triggered by an event, delivering information that the physician did not request but that is relevant and of interest?

In conclusion, we have provided a framework for developing usable, useful, and effective actionable CDS as measured by adoption and acceptance rates,13 as well as clinical outcomes (where appropriate). We recommend applying this framework to produce highly adopted and accepted CDS interventions.

Acknowledgments

Agency for Health Care Quality and Research (AHRQ) - 7R18HS018491-03. Special thanks to Kristin Myers, Vice President - IT Epic Clinical Transformation Group; Ken Koppenhaver, RN, MS, Senior Director, Epic Applications IT, Epic Clinical Transformation Group; the Sinai Epic team; Sumit Rana, Epic Development, Epic Corporation; and Nancy Smider, PhD, Epic Research, Epic Corporation.

Footnotes

Disciplines

Health Information Technology | Primary Care

References

  • 1.Greenes RA. Clinical decision support : the road to broad adoption. 2nd ed. Amsterdam Boston: Academic Press; 2014. [Google Scholar]
  • 2.Wager KA, Lee FW, Glaser JP. Health care information systems : a practical approach for health care management. 3rd ed. 2013. [Google Scholar]
  • 3.Shortliffe EH, Cimino JJ. Biomedical informatics : computer applications in health care and biomedicine. 4th ed. New York, NY: Springer; 2014. [Google Scholar]
  • 4.Connelly DP, Rich EC, Curley SP, Kelly JT. Knowledge resource preferences of family physicians. J Fam Pract. 1990 Mar;30(3):353–9. [Comparative Study] [PubMed] [Google Scholar]
  • 5.Hunt DL, Haynes RB, Hanna SE, Smith K. Effects of computer-based clinical decision support systems on physician performance and patient outcomes: a systematic review [see comments] Jama. 1998;280(15):1339–46. doi: 10.1001/jama.280.15.1339. [DOI] [PubMed] [Google Scholar]
  • 6.Randolph AG, Haynes RB, Wyatt JC, Cook DJ, Guyatt GH. Users’ Guides to the Medical Literature: XVIII. How to use an article evaluating the clinical impact of a computer-based clinical decision support system. Jama. 1999 Jul 7;282(1):67–74. doi: 10.1001/jama.282.1.67. [DOI] [PubMed] [Google Scholar]
  • 7.Eslami S, Abu-Hanna A, de Keizer NF. Evaluation of Outpatient Computerized Physician Medication Order Entry Systems: A Systematic Review. J Am Med Inform Assoc. 2007 Jul 1;14(4):400–6. doi: 10.1197/jamia.M2238. 2007. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Lo HG, Newmark LP, Yoon C, Volk LA, Carlson VL, Kittler AF, et al. Electronic Health Records in Specialty Care: A Time-Motion Study. J Am Med Inform Assoc. 2007 Jun 28; doi: 10.1197/jamia.M2318. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Welch WP, Bazarko D, Ritten K, Burgess Y, Harmon R, Sandy LG. Electronic health records in four community physician practices: impact on quality and cost of care. J Am Med Inform Assoc. 2007 May-Jun;14(3):320–8. doi: 10.1197/jamia.M2125. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Jaspers MWM, Smeulers M, Vermeulen H, Peute LW. Effects of clinical decision-support systems on practitioner performance and patient outcomes: a synthesis of high-quality systematic review findings. Journal of the American Medical Informatics Association. 2011;18(3):327–34. doi: 10.1136/amiajnl-2011-000094. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11.Romano MJ, Stafford RS. Electronic Health Records and Clinical Decision Support Systems: Impact on National Ambulatory Care Quality. Arch Intern Med. 2011 Jan 24; doi: 10.1001/archinternmed.2010.527. 2011:archinternmed.2010.527. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Bright TJ, Wong A, Dhurjati R, Bristow E, Bastian L, Coeytaux RR, et al. Effect of Clinical Decision-Support Systems: A Systematic Review. Annals of internal medicine. 2012. Apr 23, [DOI] [PubMed]
  • 13.Dexheimer JW, Sanders DL, Rosenbloom ST, Aronsky D. Prompting clinicians: a systematic review of preventive care reminders. AMIA Annu Symp Proc. 2005;938 [PMC free article] [PubMed] [Google Scholar]
  • 14.Kannry J. Computerized Physician Order Entry and Patient Safety: Panacea or Pandora’s Box? In: Ong KR, editor. Medical informatics : an executive primer. Chicago, IL: HIMSS; 2007. p. xviii.p. 316. [Google Scholar]
  • 15.Lorenzi NM, Kouroubali A, Detmer DE, Bloomrosen M. BMC Med Inform Decis Mak. England: 2009. How to successfully select and implement electronic health records (EHR) in small ambulatory practice settings; p. 15. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Sittig D, Krall M, Dykstra R, Russell A, Chin H. A survey of factors affecting clinician acceptance of clinical decision support. BMC Medical Informatics and Decision Making. 2006;6(1):6. doi: 10.1186/1472-6947-6-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.McGinn TG, McCullagh L, Kannry J, Knaus M, Sofianou A, Wisnivesky JP, et al. Efficacy of an evidence-based clinical decision support in primary care practices: a randomized clinical trial. JAMA Intern Med. 2013 Sep 23;173(17):1584–91. doi: 10.1001/jamainternmed.2013.8980. [Comparative Study Multicenter Study Randomized Controlled Trial Research Support, U.S. Gov’t, P.H.S.] [DOI] [PubMed] [Google Scholar]
  • 18.Li AC, Kannry JL, Kushniruk A, Chrimes D, McGinn TG, Edonyabo D, et al. Integrating usability testing and think-aloud protocol analysis with “near-live” clinical simulations in evaluating clinical decision support. Int J Med Inform. 2012 Nov;81(11):761–72. doi: 10.1016/j.ijmedinf.2012.02.009. [Evaluation Studies Research Support, U.S. Gov’t, P.H.S.] [DOI] [PubMed] [Google Scholar]
  • 19.Mann D, Knauss M, McCullagh L, Sofianou A, Rosen L, McGinn T, et al. Measures of User experience in a Streptococcal pharyngitis and Pneumonia Clinical Decision Support Tools. Appl Clin Inform. 2014;5(3):824–35. doi: 10.4338/ACI-2014-04-RA-0043. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Mann DM, Kannry JL, Edonyabo D, Li AC, Arciniega J, Stulman J, et al. Rationale, design, and implementation protocol of an electronic health record integrated clinical prediction rule (iCPR) randomized trial in primary care. Implement Sci. 2011;6:109. doi: 10.1186/1748-5908-6-109. [Randomized Controlled Trial Research Support, U.S. Gov’t, P.H.S.] [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.McCullagh L, Mann D, Rosen L, Kannry J, McGinn T. Longitudinal adoption rates of complex decision support tools in primary care. Evidence Based Medicine. 2014 doi: 10.1136/ebmed-2014-110054. [DOI] [PubMed] [Google Scholar]
  • 22.Broverman CA. Standards for Clinical Decision Support Systems. Journal of Healthcare Information Management. 1999 Summer;13(2):23–31. [Google Scholar]
  • 23.Kuperman GJ, Sittig DF, Shabot MM. Clinical Decision Support for Hospital and Critical Care. Journal of Healthcare Information Management. 1999 Summer;13(2):81–96. [Google Scholar]
  • 24.Overhage JM, Tierney WM, Zhou XH, McDonald CJ. A randomized trial of “corollary orders” to prevent errors of omission. J Am Med Inform Assoc. 1997;4(5):364–75. doi: 10.1136/jamia.1997.0040364. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Bates DW, Kuperman GJ, Wang S, Gandhi T, Kittler A, Volk L, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2003 Nov-Dec;10(6):523–30. doi: 10.1197/jamia.M1370. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 26.Hsieh TC, Kuperman GJ, Jaggi T, Hojnowski-Diaz P, Fiskio J, Williams DH, et al. Characteristics and consequences of drug allergy alert overrides in a computerized physician order entry system. J Am Med Inform Assoc. 2004 Nov-Dec;11(6):482–91. doi: 10.1197/jamia.M1556. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 27.van der Sijs H, Aarts J, Vulto A, Berg M. Overriding of Drug Safety Alerts in Computerized Physician Order Entry. J Am Med Inform Assoc. 2006 Mar 1;13(2):138–47. doi: 10.1197/jamia.M1809. 2006. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Maviglia SM, Zielstorff RD, Paterno M, Teich JM, Bates DW, Kuperman GJ. Automating complex guidelines for chronic disease: lessons learned. J Am Med Inform Assoc. 2003 Mar-Apr;10(2):154–65. doi: 10.1197/jamia.M1181. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Tierney WM, Overhage JM, Murray MD, Harris LE, Zhou XH, Eckert GJ, et al. Effects of computerized guidelines for managing heart disease in primary care. J Gen Intern Med. 2003 Dec;18(12):967–76. doi: 10.1111/j.1525-1497.2003.30635.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Eccles MP, Grimshaw JM. Selecting, presenting and delivering clinical guidelines: are there any “magic bullets”? Med J Aust. 2004 Mar 15;180(6 Suppl):S52–4. doi: 10.5694/j.1326-5377.2004.tb05946.x. [DOI] [PubMed] [Google Scholar]
  • 31.Eccles M, McColl E, Steen N, Rousseau N, Grimshaw J, Parkin D, et al. Effect of computerised evidence based guidelines on management of asthma and angina in adults in primary care: cluster randomised controlled trial. Bmj. 2002 Oct 26;325(7370):941. doi: 10.1136/bmj.325.7370.941. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 32.Middleton B, editor. The clinical decision support consortium. MIE; 2009. [PubMed] [Google Scholar]
  • 33.Ghitza UE, Gore-Langton RE, Lindblad R, Shide D, Subramaniam G, Tai B. Common data elements for substance use disorders in electronic health records: the NIDA Clinical Trials Network experience. Addiction. 2013;108(1):3–8. doi: 10.1111/j.1360-0443.2012.03876.x. [DOI] [PubMed] [Google Scholar]
  • 34.Kim GR, Zurhellen W. Health information technology and the medical home. Pediatrics. 2011;127(5):978–82. doi: 10.1542/peds.2011-0454. [DOI] [PubMed] [Google Scholar]
  • 35.Richardson JE, Ash JS, Sittig DF, Bunce A, Carpenter J, Dykstra RH, et al., editors. Multiple perspectives on the meaning of clinical decision support.. AMIA Annual Symposium Proceedings; American Medical Informatics Association; 2010. [PMC free article] [PubMed] [Google Scholar]
  • 36.Wu RR, Orlando LA, Himmel TL, Buchanan AH, Powell KP, Hauser ER, et al. Patient and primary care provider experience using a family health history collection, risk stratification, and clinical decision support tool: a type 2 hybrid controlled implementation-effectiveness trial. BMC family practice. 2013;14(1):1–8. doi: 10.1186/1471-2296-14-111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Kantor M, Wright A, Burton M, Fraser G, Krall M, Maviglia S, et al. Comparison of computer-based clinical decision support systems and content for diabetes mellitus. Applied clinical informatics. 2011;2(3):284. doi: 10.4338/ACI-2011-02-RA-0012. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 38.Wright A, Sittig DF. A framework and model for evaluating clinical decision support architectures. Journal of Biomedical Informatics. 2008;41(6):982–90. doi: 10.1016/j.jbi.2008.03.009. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 39.Laupacis A, Sekar N, Stiell IG. Clinical prediction rules. A review and suggested modifications of methodological standards. Jama. 1997 Feb 12;277(6):488–94. [PubMed] [Google Scholar]
  • 40.McGinn TG, Guyatt GH, Wyer PC, Naylor CD, Stiell IG, Richardson WS. Users’ guides to the medical literature: XXII: how to use articles about clinical decision rules. Evidence-Based Medicine Working Group. Jama. 2000 Jul 5;284(1):79–84. doi: 10.1001/jama.284.1.79. [DOI] [PubMed] [Google Scholar]
  • 41.Wasson JH, Sox HC, Neff RK, Goldman L. Clinical prediction rules. Applications and methodological standards. The New England journal of medicine. 1985 Sep 26;313(13):793–9. doi: 10.1056/NEJM198509263131306. [DOI] [PubMed] [Google Scholar]
  • 42.Rousseau N, McColl E, Newton J, Grimshaw J, Eccles M. Practice based, longitudinal, qualitative interview study of computerised evidence based guidelines in primary care. Bmj. 2003 Feb 8;326(7384):314. doi: 10.1136/bmj.326.7384.314. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 43.Bennett JW, Glasziou PP. Computerised reminders and feedback in medication management: a systematic review of randomised controlled trials. Med J Aust. 2003 Mar 3;178(5):217–22. doi: 10.5694/j.1326-5377.2003.tb05166.x. [DOI] [PubMed] [Google Scholar]
  • 44.Saleem JJ, Patterson ES, Militello L, Anders S, Falciglia M, Wissman JA, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. J Am Med Inform Assoc. 2007 Sep-Oct;14(5):632–40. doi: 10.1197/jamia.M2163. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45.Saleem JJ, Patterson ES, Militello L, Render ML, Orshansky G, Asch SM. Exploring barriers and facilitators to the use of computerized clinical reminders. J Am Med Inform Assoc. 2005 Jul-Aug;12(4):438–47. doi: 10.1197/jamia.M1777. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.Shah NR, Seger AC, Seger DL, Fiskio JM, Kuperman GJ, Blumenfeld B, et al. Improving override rates for computerized prescribing alerts in ambulatory care. AMIA Annu Symp Proc. 2005;1110 [PMC free article] [PubMed] [Google Scholar]
  • 47.Shah NR, Seger AC, Seger DL, Fiskio JM, Kuperman GJ, Blumenfeld B, et al. Improving acceptance of computerized prescribing alerts in ambulatory care. J Am Med Inform Assoc. 2006 Jan-Feb;13(1):5–11. doi: 10.1197/jamia.M1868. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Fung CH, Tsai JS, Lulejian A, Glassman P, Patterson E, Doebbeling BN, et al. An evaluation of the Veterans Health Administration’s clinical reminders system: a national survey of generalists. J Gen Intern Med. 2008 Apr;23(4):392–8. doi: 10.1007/s11606-007-0417-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Krall MA, Sittig DF. Subjective assessment of usefulness and appropriate presentation mode of alerts and reminders in the outpatient setting. Proc AMIA Symp. 2001:334–8. [PMC free article] [PubMed] [Google Scholar]
  • 50.Ali NA, Mekhjian HS, Kuehn PL, Bentley TD, Kumar R, Ferketich AK, et al. Specificity of computerized physician order entry has a significant effect on the efficiency of workflow for critically ill patients. Crit Care Med. 2005 Jan;33(1):110–4. doi: 10.1097/01.ccm.0000150266.58668.f9. [DOI] [PubMed] [Google Scholar]
  • 51.Ash JS, Gorman PN, Hersh WR, Lavelle M, Poulsen SB. Perceptions of house officers who use physician order entry. Proc AMIA Symp. 1999:471–5. [PMC free article] [PubMed] [Google Scholar]
  • 52.Bowens FM, Frye PA, Jones WA. Health information technology: integration of clinical workflow into meaningful use of electronic health records. Vol. 7. Perspectives in health information management / AHIMA, American Health Information Management Association; 2010. p. 1d. [PMC free article] [PubMed] [Google Scholar]
  • 53.Kuperman GJ, Gibson RF. Computer physician order entry: benefits, costs, and issues. Ann Intern Med. 2003 Jul 1;139(1):31–9. doi: 10.7326/0003-4819-139-1-200307010-00010. [DOI] [PubMed] [Google Scholar]
  • 54.Kuperman GJ, Teich JM, Gandhi TK, Bates DW. Patient safety and computerized medication ordering at Brigham and Women’s Hospital. Jt Comm J Qual Improv. 2001;27(10):509–21. doi: 10.1016/s1070-3241(01)27045-x. [DOI] [PubMed] [Google Scholar]
  • 55.Mekhjian HS, Kumar RR, Kuehn L, Bentley TD, Teater P, Thomas A, et al. Immediate benefits realized following implementation of physician order entry at an academic medical center. J Am Med Inform Assoc. 2002 Sep-Oct;9(5):529–39. doi: 10.1197/jamia.M1038. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 56.Overhage JM, Middleton B, Miller RA, Zielstorff RD, Hersh WR. Does national regulatory mandate of provider order entry portend greater benefit than risk for health care delivery? The 2001 ACMI debate. The American College of Medical Informatics. J Am Med Inform Assoc. 2002 May-Jun;9(3):199–208. doi: 10.1197/jamia.M1081. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 57.Teich JM, Spurr CD, Schmiz JL, O’Connell EM, Thomas D. Enhancement of clinician workflow with computer order entry. Proc Annu Symp Comput Appl Med Care. 1995:459–63. [PMC free article] [PubMed] [Google Scholar]
  • 58.Durieux P, Trinquart L, Colombet I, Nies J, Walton R, Rajeswaran A, et al. Computerized advice on drug dosage to improve prescribing practice. Cochrane database of systematic reviews (Online) 2008;(3):CD002894. doi: 10.1002/14651858.CD002894.pub2. [DOI] [PubMed] [Google Scholar]
  • 59.Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005 Apr 2;330(7494):765. doi: 10.1136/bmj.38398.500764.8F. 2005. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Zheng K, Padman R, Johnson MP, Diamond HS. Understanding technology adoption in clinical care: clinician adoption behavior of a point-of-care reminder system. Int J Med Inform. 2005 Aug;74(7–8):535–43. doi: 10.1016/j.ijmedinf.2005.03.007. [DOI] [PubMed] [Google Scholar]
  • 61.Kannry J, Kushniruk A, Koppel R. Meaningful Usability: Health Care Information Technology for the Rest of Us. In: Ong K, editor. Medical Informatics: An Executive Primer. 2nd ed. Chicago: HIMSS; 2011. [Google Scholar]
  • 62.Horsky J, Kuperman GJ, Patel VL. Comprehensive analysis of a medication dosing error related to CPOE. J Am Med Inform Assoc. 2005 Jul-Aug;12(4):377–82. doi: 10.1197/jamia.M1740. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 63.Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, et al. Role of computerized physician order entry systems in facilitating medication errors. Jama. 2005 Mar 9;293(10):1197–203. doi: 10.1001/jama.293.10.1197. [DOI] [PubMed] [Google Scholar]
  • 64.Ovretveit J, Scott T, Rundall TG, Shortell SM, Brommels M. Improving quality through effective implementation of information technology in healthcare. Int J Qual Health Care. 2007 Aug 23; doi: 10.1093/intqhc/mzm031. [DOI] [PubMed] [Google Scholar]
  • 65.Krall MA, Sittig DF. Clinician’s assessments of outpatient electronic medical record alert and reminder usability and usefulness requirements. Proc AMIA Symp. 2002:400–4. [PMC free article] [PubMed] [Google Scholar]
  • 66.Nielsen J. Usability engineering. Boston: Academic Press; 1993. [Google Scholar]
  • 67.Bailey JE. Development of an instrument for the management of computer user attitudes in hospitals. Methods Inf Med. 1990 Jan;29(1):51–6. [PubMed] [Google Scholar]
  • 68.Weiner M, Gress T, Thiemann DR, Jenckes M, Reel SL, Mandell SF, et al. Contrasting views of physicians and nurses about an inpatient computer- based provider order-entry system. J Am Med Inform Assoc. 1999;6(3):234–44. doi: 10.1136/jamia.1999.0060234. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Morton ME, Wiedenbeck S. EHR acceptance factors in ambulatory care: a survey of physician perceptions. Vol. 7. Perspectives in health information management / AHIMA, American Health Information Management Association; 2010. p. 1c. [PMC free article] [PubMed] [Google Scholar]
  • 70.Murff HJ, Kannry J. Physician satisfaction with two order entry systems. J Am Med Inform Assoc. 2001 Sep-Oct;8(5):499–509. doi: 10.1136/jamia.2001.0080499. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71.Dexheimer JW, Talbot TR, Sanders DL, Rosenbloom ST, Aronsky D. Prompting Clinicians about Preventive Care Measures: A Systematic Review of Randomized Controlled Trials. J Am Med Inform Assoc. 2008 May 1;15(3):311–20. doi: 10.1197/jamia.M2555. 2008. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Sittig DF, Krall MA, Dykstra RH, Russell A, Chin HL. A survey of factors affecting clinician acceptance of clinical decision support. BMC Med Inform Decis Mak. 2006;6:6. doi: 10.1186/1472-6947-6-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.Wu SJ, Lehto M, Yih Y, Saleem JJ, Doebbeling BN. Relationship of estimated resolution time and computerized clinical reminder adherence. AMIA Annu Symp Proc. 2007:334–8. [PMC free article] [PubMed] [Google Scholar]
  • 74.Shea S, DuMouchel W, Bahamonde L. A meta-analysis of 16 randomized controlled trials to evaluate computer-based clinical reminder systems for preventive care in the ambulatory setting. J Am Med Inform Assoc. 1996 Nov-Dec;3(6):399–409. doi: 10.1136/jamia.1996.97084513. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75.Sittig DF, Stead WW. Computer-based physician order entry: the state of the art. J Am Med Inform Assoc. 1994;1(2):108–23. doi: 10.1136/jamia.1994.95236142. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Wu RC, Abrams H, Baker M, Rossos PG. Implementation of a computerized physician order entry system of medications at the University Health Network--physicians’ perspectives on the critical issues. Healthc Q. 2006;9(1):106–9. [PubMed] [Google Scholar]
  • 77.McAlearney AS, Song PH, Robbins J, Hirsch A, Jorina M, Kowalczyk N, et al. Moving from good to great in ambulatory electronic health record implementation. Journal for healthcare quality : official publication of the National Association for Healthcare Quality. 2010 Sep-Oct;32(5):41–50. doi: 10.1111/j.1945-1474.2010.00107.x. [DOI] [PubMed] [Google Scholar]
  • 78.Egleson N, Kang JH, Collymore D, Esmond W, Gonzalez L, Pong P, et al. A health center controlled network’s experience in ambulatory care EHR implementation. J Healthc Inf Manag. 2010 Spring;24(2):28–33. [PubMed] [Google Scholar]
  • 79.HIMSS. Transforming Healthcare with a Patient-Centric Electronic Health Record System. 2004. [cited 2006 April 16]; Available from: http://www.himss.org/content/files/davies2004_evanston.pdf.
  • 80.Schectman JM, Schorling JB, Nadkarni MM, Voss JD. Determinants of physician use of an ambulatory prescription expert system. Int J Med Inform. 2005 Sep;74(9):711–7. doi: 10.1016/j.ijmedinf.2005.05.011. [DOI] [PubMed] [Google Scholar]
  • 81.Patterson ES, Doebbeling BN, Fung CH, Militello L, Anders S, Asch SM. Identifying barriers to the effective use of clinical reminders: Bootstrapping multiple methods. Journal of Biomedical Informatics. 2005;38(3):189–99. doi: 10.1016/j.jbi.2004.11.015. [DOI] [PubMed] [Google Scholar]
  • 82.Kannry J, Emro S, Blount M, Elbing M, editors. Small-scale Testing of RFID in a Hospital Setting: RFID as Bed Trigger; AMIA Fall Symposium; Chicago, Ill.: 2007. 2007. [PMC free article] [PubMed] [Google Scholar]
  • 83.O’Connell RT, Cho C, Shah N, Brown K, Shiffman RN. Take note(s): differential EHR satisfaction with two implementations under one roof. J Am Med Inform Assoc. 2004 Jan-Feb;11(1):43–9. doi: 10.1197/jamia.M1409. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84.Ash JS, Fournier L, Stavri PZ, Dykstra R. Principles for a successful computerized physician order entry implementation. AMIA Annu Symp Proc. 2003:36–40. [PMC free article] [PubMed] [Google Scholar]
  • 85.Dunnigan A, John K, Scott A, Von Bibra L, Walling J. An implementation case study. Implementation of the Indian Health Service’s Resource and Patient Management System Electronic Health Record in the ambulatory care setting at the Phoenix Indian Medical Center. J Healthc Inf Manag. 2010 Spring;24(2):23–7. [PubMed] [Google Scholar]
  • 86.McAlearney AS, Song PH, Robbins J, Hirsch A, Jorina M, Kowalczyk N, et al. Moving from good to great in ambulatory electronic health record implementation. Journal for healthcare quality : official publication of the National Association for Healthcare Quality. 2010 Sep-Oct;32(5):41–50. doi: 10.1111/j.1945-1474.2010.00107.x. [Research Support, Non-U.S. Gov’t]. [DOI] [PubMed] [Google Scholar]
  • 87.Nikula RE, Elberg PB, Svedberg HB. Informed decisions by clinicians are fundamental for EPR implementation. International Journal of Medical Informatics. 2000;(58–59):141–6. doi: 10.1016/s1386-5056(00)00082-4. [DOI] [PubMed] [Google Scholar]
  • 88.Craven CK, Sievert MC, Hicks LL, Alexander GL, Hearne LB, Holmes JH. CAH to CAH: EHR implementation advice to critical access hospitals from peer experts and other key informants. Appl Clin Inform. 2014;5(1):92–117. doi: 10.4338/ACI-2013-08-RA-0066. [Research Support, N.I.H., Extramural] [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.Silow-Carroll S, Edwards JN, Rodin D. Using electronic health records to improve quality and efficiency: the experiences of leading hospitals. Issue Brief (Commonw Fund) 2012 Jul;17:1–40. [PubMed] [Google Scholar]
  • 90.Ash JS, Stavri PZ, Dykstra R, Fournier L. Implementing computerized physician order entry: the importance of special people. International Journal of Medical Informatics. 2003;69(2–3):235–50. doi: 10.1016/s1386-5056(02)00107-7. [DOI] [PubMed] [Google Scholar]
  • 91.Chaudhry B, Wang J, Wu S, Maglione M, Mojica W, Roth E, et al. Systematic Review: Impact of Health Information Technology on Quality, Efficiency, and Costs of Medical Care. Ann Intern Med. 2006 Apr 11; doi: 10.7326/0003-4819-144-10-200605160-00125. 2006:0000605-200605160-00125. [DOI] [PubMed] [Google Scholar]
  • 92.Wright A, Sittig DF. A Four-Phase Model of the Evolution of Clinical Decision Support Architectures. Int J Med Inform. 2008 Oct;77(10):641–9. doi: 10.1016/j.ijmedinf.2008.01.004. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 93.Quan H, Parsons GA, Ghali WA. Validity of information on comorbidity derived from ICD-9-CCM administrative data. Medical care. 2002;40(8):675–85. doi: 10.1097/00005650-200208000-00007. [DOI] [PubMed] [Google Scholar]
  • 94.Hebert PL, Geiss LS, Tierney EF, Engelgau MM, Yawn BP, McBean AM. Identifying persons with diabetes using Medicare claims data. American Journal of Medical Quality. 1999;14(6):270–7. doi: 10.1177/106286069901400607. [DOI] [PubMed] [Google Scholar]
  • 95.Holt G, Feurer ID, Schwartz HS. An analysis of concordance among hospital databases and physician records. Ann Surg Oncol. 1998;5(6):553–6. doi: 10.1007/BF02303650. [DOI] [PubMed] [Google Scholar]

Articles from eGEMs are provided here courtesy of Ubiquity Press
