Summary
Background
Dissemination and adoption of clinical decision support (CDS) tools is a major initiative of the federal Meaningful Use program established under the HITECH Act. Adoption of CDS tools is multifactorial, with personal, organizational, and clinical-setting factors shaping utilization rates. In particular, the diffusion of innovation theory implies that ‘early adopters’ are more inclined to use CDS tools, and younger physicians tend to fall into this category.
Objective
This study examined differences in adoption of CDS tools across providers’ training levels.
Participants
From November 2010 to 2011, 168 residents and attendings at an academic medical institution were enrolled in a randomized controlled trial.
Intervention
The intervention arm had access to the CDS tool through the electronic health record (EHR) system during patient visits for suspected strep pharyngitis and pneumonia.
Main Measures
The EHR system recorded how the intervention arm interacted with the CDS tool, including acceptance of the initial CDS alert, completion of risk-score calculators, and signing of medication order sets. Using the EHR data, the study performed bivariate tests and generalized estimating equation (GEE) modeling to examine differences in adoption of the CDS tool between residents and attendings.
Key Results
The completion rates of the CDS calculator and medication order sets were higher among first-year residents than among all other training levels. Attendings were the least likely to accept the initial step of the CDS tool (29.3%) or to open the medication order sets (22.4%) that guided their prescription decisions; correspondingly, attendings ordered antibiotics in more of their CDS encounters (37.1%) than residents did.
Conclusion
There is variation in adoption of CDS tools across training levels: attendings tended to accept the tool less often but ordered more medications. CDS tools should be tailored to clinicians’ training levels.
Keywords: Implementation science, evidence-based medicine, clinical decision support, clinical prediction rules, primary care
Background
The Health Information Technology for Economic and Clinical Health (HITECH) Act, enacted in 2009, ignited health information technology (HIT) growth, including mobile health devices and applications, electronic health records (EHRs), and electronic clinical decision support (CDS) tools [1–3]. CDS tools are most useful when decision-making is complex, the clinical stakes are high, or cost savings can be achieved without compromising patient care [4]. CDS blends individual patient data, a rules engine, and provider intuition or “clinical gestalt” to guide clinicians through complex decisions (prognosis, diagnosis, or management) at the point of care. Healthcare experts, policymakers, and informaticists predict that the use of CDS and health IT solutions will significantly standardize care, improve resource allocation, reduce overutilization, and bring evidence-based guidelines into clinical practice [5]. In support of HITECH, the federal government’s meaningful-use initiative has pledged over 6 billion dollars in incentives for hospitals and practices to adopt EHR and CDS tools [6–9].
However, several recent studies and analyses have raised questions about the effectiveness and the design parameters of CDS tools embedded in ambulatory EHRs [10–15]. CDS tools have led to: reductions in prescribing brand-name antibiotics; improved lipid management in renal transplant patients; improved compliance with guidelines for treating HIV; reduced ordering of tests when costs were displayed; and age-specific alerts that reduced inappropriate prescribing in the elderly [15–27]. However, barriers to CDS utilization and compliance persist, and rates of adoption remain low [28–30]. A systematic review of ambulatory order entry with CDS found studies reporting significant reductions in medication costs and increased adherence to guidelines, but also negative effects, including increased time demands and a high frequency of ignored alerts [31]. Another study found no change in influenza immunization and no improvement from CDS for laboratory medication monitoring [32]. These inconsistent findings on CDS effectiveness have been associated with barriers in workflow, usability, and integration. Such adverse unintended consequences have left these interventions ineffective at changing provider behavior and with little impact on patient outcomes [5, 29, 33–35].
Health IT conceptual models have been applied to CDS tool design and implementation to improve usability and thereby increase adoption rates [36, 37]. Similar to the conceptual models and theories used for behavioral interventions (for example, stages of change and health behavior change) and the concerns-based adoption model used to improve implementation of education programs, models for health IT design have been drawn from academia, private industry, psychology, and health informatics to address the complexity of health IT implementation and usability. Most models address the structural level (implementation of technology, design, efficiency), the clinical level (workflow integration), and the physician level (perception of usefulness) [38]. Using such conceptual models can allow researchers and designers to address the facilitators of and barriers to CDS tool integration [36, 37]. Yet few studies and conceptual models include individual characteristics such as users’ age, knowledge, and individual workflow.
Structural barriers to health IT tools include the dissemination and implementation of health IT programs (EHRs) and CDS software, integration into workflow, coordination within complex health systems, and customization of programs to organizational needs [39]. With federal mandates pushing the health IT agenda, many of these structural barriers are being addressed, and organizations are highly receptive to taking on new innovations. Clinical workflow barriers, such as overtriggering of the tool (too-frequent CDS reminders) or misplaced triggers (CDS tools appearing at inopportune moments), tend to make the tools inefficient and ineffective and to lengthen clinical visits [40]. Usability testing has had varying success in integrating health innovations at the macro (system workflow) and micro (user workflow) levels, suggesting that structural and clinical workflow barriers are part of a larger model of adoption [33, 34, 41–45].
In contrast to examining the structural or clinical factors related to adoption, the diffusion-of-innovation (DOI) theory posits that personal characteristics of providers predict the feasibility of diffusion and adoption of innovative tools across social systems (i.e., organizations) [46, 47]. This model suggests that individuals and systems adopt innovation at varying paces and places them on a continuum, an “S” curve, that reflects their status in that process [41, 47, 48]. The theory considers constructs that influence adoption: perception of relative advantage, compatibility, complexity, observability, and trialability. The rate of adoption of innovative tools is deeply rooted in these constructs, which “account[s] for 49–87% of the variance in whether or not they adopt” [49]. This theory suggests that innovative tools must be tailored not only to workflow processes and macro environments but also to a broad range of users’ perceptions of innovation.
Similar models suggest that technology design and development should focus on human-computer interaction (HCI), which seeks to minimize the barriers and maximize the benefits of technology use. HCI designers and researchers aim to improve the interactions between users and computer interfaces, making HIT design more usable and responsive to users’ needs through iterative design [45]. This design concept suggests that tools should be tailored to the unique experiences and resources of the user: their practice, specialty, and comfort level with technology. Designers and programmers who apply HCI theory draw on communication theory, graphic and industrial design, linguistics, cognitive psychology, social psychology, and human factors, such as computer user satisfaction, when developing their tools [45].
Although the DOI and HCI frameworks incorporate user characteristics (perception of and readiness for adoption) into HIT design, they do not account for innate personal characteristics such as experience with HIT, clinical experience, age, or personal preferences in HIT. In this analysis we sought to compare the adoption of CDS components across providers’ levels of clinical training. We believe the findings of this study add to current theoretical frameworks: understanding how user characteristics relate to adoption will allow CDS developers to tailor CDS tools for maximum impact and truly meaningful use.
Methods
This is a secondary analysis of data collected in our primary study, a randomized controlled trial (RCT) of a tool for integrating clinical prediction rules (iCPR) into a commercial EHR platform. The RCT was conducted in 2010–2011; a brief description follows [50, 51]. Data on iCPR patient encounters were drawn from an academic institution in New York City.
Design of the Intervention
The primary study examined the impact of a well-designed clinical decision support tool on patient outcomes and provider adoption. The tool was based on two clinical prediction rules that have been validated in a variety of settings and are now considered standard of care but are not consistently applied in clinical practice: the Walsh rule for strep pharyngitis (Strep) and the Heckerling rule for pneumonia (PNA) [52–56]. The study team collaborated with clinicians and informaticists to develop a complex CDS tool that incorporated multiple patient data points, passive and active triggers, and the option of dismissing the tool. A more detailed description of iCPR’s design can be found in previous publications [50, 51, 56].
The basic components of iCPR include:
keyword triggering,
risk calculators that provide stratified scores reflecting probability of disease risk,
bundled order sets for tests and medications (SmartSets), and
automated, personalized risk information and patient instructions in the clinical discharge summary.
Providers could select from the bundled order sets, choosing only the medications they wished to order. Through numerous iterations of usability testing with residents, fellows, and attendings, the tool was refined to eliminate workflow barriers and then implemented in an RCT.
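To illustrate how these components chain together, the following is a minimal sketch of a keyword-triggered CDS flow. It is not the iCPR implementation: the keywords, scoring, and cut points are hypothetical, and the score here follows only the generic clinical-prediction-rule pattern of one point per positive finding.

```python
# Minimal sketch of a keyword-triggered CDS flow. Not the iCPR code:
# keywords, scoring, and cut points below are illustrative only.
from dataclasses import dataclass

TRIGGER_KEYWORDS = {"sore throat", "pharyngitis", "cough", "pneumonia"}

@dataclass
class Encounter:
    note_text: str             # clinician documentation scanned for triggers
    findings: dict[str, bool]  # risk-calculator inputs entered by the provider

def alert_triggered(enc: Encounter) -> bool:
    """Step 1: fire the initial CDS alert on a keyword match."""
    text = enc.note_text.lower()
    return any(kw in text for kw in TRIGGER_KEYWORDS)

def risk_score(enc: Encounter) -> int:
    """Step 2: generic clinical-prediction-rule pattern --
    one point per positive finding."""
    return sum(enc.findings.values())

def order_set_for(score: int) -> str:
    """Step 3: map the stratified score to a bundled order set
    (hypothetical cut points)."""
    if score <= 1:
        return "supportive-care order set"
    if score <= 3:
        return "diagnostic-testing order set"
    return "empiric-treatment order set"

enc = Encounter("45yo with sore throat and fever", {"fever": True, "exudate": True})
if alert_triggered(enc):                   # provider may accept or dismiss here
    print(order_set_for(risk_score(enc)))  # -> "diagnostic-testing order set"
```

The key design point this sketch captures is that each step is a separate, dismissible gate: a provider can accept the alert but abandon the calculator, or complete the calculator but never sign the order set, which is why the study tracked each step as its own benchmark.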
Randomized Controlled Trial
The clinician was the unit of randomization. Providers were recruited and consented into the study, then randomly assigned to the intervention group (access to iCPR) or control group (no access to iCPR). The iCPR tool was activated in 87 clinicians’ EHR profiles after they attended an hour-long training session. A clinical champion of iCPR was available on site during most clinical sessions to promptly address questions or problems that arose during its use.
Data Collection
Data were collected from iCPR clinical encounters in the intervention group only, since the primary outcome was acceptance of iCPR. An iCPR clinical encounter was defined as a clinical visit in which the iCPR tool was triggered for suspected strep pharyngitis or pneumonia. Provider demographic data, including age, gender, self-rated comfort with the EHR, and previous exposure to and comfort with CDS tools, were collected via surveys. Clinicians randomized to the intervention arm were asked to complete an additional post-training survey.
For iCPR encounters, EHR data on the intervention group’s use of the tool were collected in addition to the survey data. These data included elements on iCPR tool use and medications ordered. The benchmarks identified for measuring iCPR use were:
initial acceptance of iCPR tool alert,
completing the iCPR risk score calculator,
opening the medication bundled-order set (SmartSet),
signing the SmartSet, and
orders placed during an iCPR encounter but not through the SmartSet.
We also measured the medications prescribed as a result of using the iCPR calculator.
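For concreteness, the five benchmarks plus the medication outcome can be thought of as one record per iCPR encounter, which is the analytic unit used in the models below. This is a sketch with assumed field names, not the EHR’s actual schema.

```python
from dataclasses import dataclass

@dataclass
class ICPREncounter:
    """One iCPR-triggered visit; field names are illustrative."""
    provider_id: int
    training_level: str            # "PGY1" | "PGY2" | "PGY3" | "ATTENDING"
    alert_accepted: bool           # benchmark 1: initial iCPR alert accepted
    calculator_completed: bool     # benchmark 2: risk-score calculator done
    smartset_opened: bool          # benchmark 3
    smartset_signed: bool          # benchmark 4
    orders_outside_smartset: bool  # benchmark 5
    antibiotic_ordered: bool       # medication outcome
```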
Statistical Analysis
Usability and demographic data on the 87 intervention-arm providers were analyzed. Providers who did not complete pre-training surveys (n=8) and nurse practitioners (n=2) were excluded, leaving a final analytic sample of 78 intervention-arm providers who contributed a total of 556 iCPR encounters. Providers’ training level was defined as post-graduate year (PGY) 1, 2, or 3, or attending. Basic demographics and previous use of medication bundle sets, best practice alerts (BPAs), and DocFlow (patient progress note) forms were compared across provider training levels using Student’s t and chi-squared tests as appropriate (▶Table 1 and ▶Table 2).
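As an illustration of the bivariate comparisons, a chi-squared test on a contingency table of prior SmartSet use by training level might look like the sketch below in Python (the study itself used SAS). The counts loosely follow Table 1, and the layout is an assumption.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: used SmartSet before (yes / no); columns: PGY1, PGY2, PGY3, attending.
# Counts approximate Table 1 (one missing response each for PGY1 and PGY2).
counts = np.array([
    [24, 17, 17, 8],   # yes
    [ 9,  0,  1,  0],  # no
])
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```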
Table 1.

| | Total | PGY1 | PGY2 | PGY3 | ATTENDING | p-value |
|---|---|---|---|---|---|---|
| Number of providers | 78 | 34 | 18 | 18 | 8 | |
| Age – median (IQR**) | 28 (4.0) | 29 (3.0) | 28.0 (3.0) | 29.0 (3.0) | 41.5 (6.5) | <0.0001 |
| Female | 44 (56.4) | 20 (58.8) | 9 (50.0) | 10 (55.6) | 5 (62.5) | 0.92 |
| Used SmartSet before | 66 (87) | 24 (72.7)* | 17 (100)* | 17 (94.4) | 8 (100) | 0.02 |
| Used BPA alerts before | 44 (56.4) | 16 (47.1) | 10 (55.6) | 10 (55.6) | 8 (100) | 0.05 |
| Used DocFlow before | 20 (26) | 8 (24.2) | 5 (27.8) | 5 (27.8) | 2 (25.0) | 0.99 |

* Response missing for one participant
** IQR = interquartile range
Table 2.

| | Total+ | PGY1 | PGY2 | PGY3 | ATTENDING | p-value |
|---|---|---|---|---|---|---|
| iCPR Encounters | 556 | 182 (32.7) | 140 (25.2) | 118 (21.2) | 116 (20.8) | |
| iCPR Accepted | 353 (63.5) | 146 (80.2) | 101 (72.1) | 72 (61.0) | 34 (29.3) | 0.02 |
| SmartSet Opened | 308 (55.4) | 127 (69.8) | 90 (64.3) | 65 (55.1) | 26 (22.4) | 0.07 |
| SmartSet Signed | 238 (42.8) | 92 (50.6) | 70 (50.0) | 57 (48.3) | 19 (16.4) | 0.39 |
| Antibiotic Ordered++ | 164 (29.5) | 48 (26.4) | 34 (24.3) | 39 (33.1) | 43 (37.1) | 0.52 |
| Supportive med orders | 166 (29.5) | 55 (30.2) | 53 (37.9) | 43 (36.4) | 15 (12.9) | 0.10 |

+ Tallies of each variable indicate at least one occurrence in a given encounter
++ All antibiotics related to strep and pneumonia ordered during the encounter, with the exception of antivirals
Differences in completion of each of the five benchmarks for iCPR tool use (during iCPR encounters) were compared across training levels, adjusting for provider demographics to isolate the independent association between training level and tool acceptance. Because each provider could trigger iCPR many times during the study period, generalized estimating equation (GEE) models were used to compare each benchmark across training levels while accounting for repeated measurements by provider (▶Table 3). Analyses were conducted in SAS 9.2 (SAS Institute, Cary, NC).
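The analyses were run in SAS; as a language-agnostic sketch of the same model, a logistic GEE with an exchangeable working correlation and attendings as the reference level could be fit in Python with statsmodels, as below. The data frame and its column names are synthetic stand-ins for the study data, not the actual dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic data shaped like the analysis: one row per iCPR encounter,
# clustered within 78 providers. All values here are made up.
rng = np.random.default_rng(0)
levels = np.array(["PGY1", "PGY2", "PGY3", "ATTENDING"])
prov_level = levels[np.arange(78) % 4]
prov_age = np.where(prov_level == "ATTENDING", 42, 29) + rng.integers(-3, 4, size=78)
prov = rng.integers(0, 78, size=556)
df = pd.DataFrame({
    "provider_id": prov,
    "training": prov_level[prov],
    "age": prov_age[prov],
    "past_cds_use": rng.integers(0, 2, size=556),
    "accepted": rng.integers(0, 2, size=556),  # benchmark outcome (0/1)
})

# Model 3: training level + past CDS use + age, accounting for repeated
# encounters per provider via an exchangeable working correlation.
model = smf.gee(
    "accepted ~ C(training, Treatment(reference='ATTENDING')) + past_cds_use + age",
    groups="provider_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
result = model.fit()
print(np.exp(result.params))      # odds ratios, as reported in Table 3
print(np.exp(result.conf_int()))  # 95% confidence intervals
```

The exchangeable structure assumes any two encounters by the same provider are equally correlated, which is the standard choice when encounters have no natural ordering within a provider.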
Table 3.

| | Model 1: unadjusted, OR (95% CI), p-value** | Model 2: Model 1 + past use of CDS, OR (95% CI), p-value | Model 3: Model 2 + age, OR (95% CI), p-value |
|---|---|---|---|
| iCPR Accepted | | | |
| PGY1 | 8.4 (4.7–15.2), p<0.0001 | 11.9 (5.2–27.5), p<0.0001 | 7.1 (0.85–59.2), p=0.07 |
| PGY2 | 4.9 (2.3–10.6), p<0.0001 | 7.0 (3.2–15.0), p<0.0001 | 4.3 (0.70–26.5), p=0.12 |
| PGY3 | 3.5 (1.6–7.6), p=0.002 | 4.4 (1.8–10.8), p=0.001 | 2.7 (0.46–16.4), p=0.27 |
| global test | p=0.02 | p=0.01 | p=0.22 |
| SmartSet Opened | | | |
| PGY1 | 7.3 (3.6–14.7), p<0.0001 | 8.6 (3.5–21.2), p<0.0001 | 2.3 (0.33–16.8), p=0.40 |
| PGY2 | 5.2 (2.2–12.3), p=0.0001 | 6.6 (2.7–15.8), p<0.0001 | 2.0 (0.36–11.0), p=0.43 |
| PGY3 | 4.2 (1.7–10.3), p=0.002 | 4.7 (1.7–12.9), p=0.003 | 1.5 (0.27–8.0), p=0.7 |
| global test | p=0.07 | p=0.07 | p=0.77 |
| SmartSets Signed | | | |
| PGY1 | 3.6 (1.4–9.6), p=0.01 | 3.9 (1.2–12.4), p=0.02 | 1.0 (0.2–7.2), p=0.97 |
| PGY2 | 3.2 (1.2–8.6), p=0.02 | 3.5 (1.1–10.8), p=0.03 | 1.0 (0.2–1.8), p=0.98 |
| PGY3 | 3.6 (1.2–10.8), p=0.02 | 3.7 (1.1–12.7), p=0.04 | 1.1 (0.2–6.6), p=0.88 |
| global test | p=0.39 | p=0.42 | p=0.99 |
| Antibiotic Ordered | | | |
| PGY1 | 0.66 (0.4–1.1), p=0.11 | 0.72 (0.4–1.3), p=0.25 | 1.2 (0.2–7.0), p=0.9 |
| PGY2 | 0.60 (0.3–0.6), p=0.12 | 0.59 (0.30–1.1), p=0.11 | 0.9 (0.2–4.4), p=0.9 |
| PGY3 | 0.91 (0.5–1.8), p=0.78 | 1.1 (0.54–2.1), p=0.87 | 1.6 (0.3–7.7), p=0.5 |
| global test | p=0.52 | p=0.41 | p=0.52 |
| Supportive Medication Ordered | | | |
| PGY1 | 3.0 (1.1–8.8), p=0.04 | 3.3 (1.0–10.9), p=0.05 | 2.6 (0.43–15.9), p=0.29 |
| PGY2 | 4.2 (1.5–12.1), p=0.01 | 1.6 (0.4–2.8), p=0.01 | 4.0 (0.63–25.1), p=0.14 |
| PGY3 | 4.4 (1.4–12.5), p=0.01 | 4.8 (1.4–16.6), p=0.01 | 3.9 (0.7–22.6), p=0.13 |
| global test | p=0.10 | p=0.08 | p=0.30 |

* Tallies of each variable indicate at least one occurrence in a given encounter
** p-values represent the comparison of each PGY level to the attending training level (reference)
Results
Participants’ experience with CDS tools varied by training level. A greater proportion of attendings and PGY3s had prior experience with SmartSets and BPAs (elements of CDS tools) (p=0.02; p=0.05) (▶Table 1). Age was correlated with training level; the median age of attendings was more than 10 years greater than that of residents (p<0.0001).
There were a total of 556 iCPR encounters across all training levels. PGY1s saw the most encounters (32.7%), followed by PGY2s (25.2%), PGY3s (21.2%), and attendings (20.8%) (▶Table 2). Providers at lower training levels had higher acceptance rates of iCPR: PGY1s accepted the tool in 80% of their encounters, PGY2s in 72%, PGY3s in 61%, and attendings in 29% (p=0.02). Although not statistically significant, attendings were the least likely to sign the SmartSet order set (16.4%) and were more likely to order antibiotics (37.1% of their encounters) compared with PGY1s (26.4%), PGY2s (24.3%), and PGY3s (33.1%) (▶Table 2). The lack of statistical significance may reflect the small number of attendings (n=8).
Using the attending level as the reference, unadjusted models showed higher odds among residents of accepting the iCPR tool (global p=0.02); differences for opening (p=0.07) and signing (p=0.39) the SmartSet did not reach significance (▶Table 3, model 1). After controlling for provider experience with CDS tools and age (▶Table 3, model 3), differences across provider training levels were no longer significant for any of the iCPR benchmarks, suggesting that age, together with training level, is a critical factor in the adoption of the iCPR tool and its components.
Discussion
Several HIT conceptual models are used to guide design and improve integration and usability, yet adoption of CDS tools continues to be problematic and patient outcomes remain unchanged. While the diffusion of innovation theory has been applied to the development and evaluation phases of HIT, the individual factors influencing CDS adoption remain poorly understood. This study sought to determine whether provider training level is one of the personal characteristics that play a significant role in adopting a CDS tool. The results indicated that acceptance of iCPR was significantly lower among clinicians with higher levels of training, adding to our knowledge of CDS and suggesting that CDS tools may be more widely accepted by younger physicians. This secondary analysis demonstrates that individual characteristics affect adoption. However, this study did not examine other intrinsic factors related to adoption, such as providers’ knowledge of the evidence behind the CPR, patient workload or time available for CDS completion, and individual workflow (i.e., interaction with the EHR); this is a limitation that future studies should address.
The fact that the proportion of participants completing the subsequent benchmarked steps (SmartSet opened and signed, antibiotic ordered) does not differ by training level indicates that the first, and perhaps only, hurdle to iCPR usability is initial acceptance of the tool. Differences across training levels in the acceptance step persisted after adjusting for experience using CDS, suggesting that CDS experience is not predictive of iCPR acceptance. However, these differences became insignificant after adjusting for age, suggesting that age may have an effect on iCPR acceptance that is not explained by either training level or CDS experience. In other words, tailoring the engagement of CDS tools to training level and age may be a critical element of CDS usability.
To date, most efforts to increase consistent or sustained uptake of CDS tools have emphasized workflow design. Usability testing (iterative pilot testing and refinement) has sought to address overtriggering, misplacement of triggers, and ineffective delivery of guidelines and recommendations [51]. This analysis indicates that individual clinician characteristics such as training level, experience with CDS tools, age, and experience with health technology may also play a role in adoption. An extensive systematic review compared business adoption models with healthcare adoption models [58]. The overarching factor affecting adoption was that decision support systems need to be dynamic, able to accommodate “multiple assumptions, and incorporation of new information in response to changing circumstances” [57]. This concept of dynamic, adaptive CDS design should be extended to the variability among individual users.
The study results indicate that, compared with residents at all PGY levels, attendings had the most experience with CDS tools yet were less likely to use iCPR and ordered more antibiotics. This suggests that a single, one-size-fits-all CDS model was not effective for all providers and that tools should be tailored to physicians’ specific needs [57, 58]. It is not clear from this analysis why attendings did not adopt the tool as readily as residents. Whether the reason is their perceived knowledge of the rule, their perceived value of the rule, or some other factor, their failure to adopt the rule was accompanied by a tendency to prescribe more antibiotics. Such tendencies can contribute to the public health problem of antibiotic resistance.
This study has limitations. It was implemented in a single clinic setting using a single EHR system, and it is unclear whether the same pattern would be seen in a different clinical practice. In addition, the tool studied included CPRs for only two medical conditions, pneumonia and pharyngitis; a different clinical prediction rule might yield different adoption patterns across training levels. Lastly, it is possible that more experienced practitioners were responding to clinical complexities not captured by the algorithms. Nevertheless, the investigators believe these findings show that achieving the full promise of clinical decision support requires clarifying the individual provider characteristics that influence adoption. Further research should investigate the influence of individual factors on adoption rates and longitudinal usage patterns to guide the design of CDS tools with sustained, tailored use.
Footnotes
Conflict of interest
None of the listed authors has any financial or personal relationship with other people or organizations that may inappropriately influence or bias the objectivity of submitted content and/or its acceptance for publication in this journal.
Protection of Human Subjects and Animals in Research
The procedures used have been reviewed in compliance with ethical standards of the responsible committee on human experimentation at the home institution of the authors. All research activities are in compliance with the World Medical Association Declaration of Helsinki on Ethical Principles for Medical Research Involving Human Subjects.
Funding Support
Agency for Healthcare Research and Quality (AHRQ) – 7R18HS018491–03.
References
- 1. King J, Patel V, Furukawa M. Physician Adoption of Electronic Health Record Technology to Meet Meaningful Use Objectives: 2009–2012. Washington, DC: Office of the National Coordinator for Health Information Technology; 2012.
- 2. Charles D, King J, Furukawa M, Patel V. Hospital Adoption of Electronic Health Record Technology to Meet Meaningful Use Objectives: 2008–2012. Washington, DC: Office of the National Coordinator for Health Information Technology; 2013.
- 3. ONC. ONC analysis of data from the 2011 American Hospital Association Survey Information Technology Supplement. 2011.
- 4. McGinn TG, Guyatt GH, Wyer PC, Naylor CD, Stiell IG, Richardson WS. Users’ guides to the medical literature: XXII: how to use articles about clinical decision rules. Evidence-Based Medicine Working Group. JAMA 2000; 284(1): 79–84.
- 5. Linder JA, Schnipper JL, Tsurikova R, Yu DT, Volk LA, Melnikas AJ, et al. Electronic health record feedback to improve antibiotic prescribing for acute respiratory infections. Am J Manag Care 2010; 16(12 Suppl. HIT): e311–e319.
- 6. CMS. EHR Incentive Program: Data and Reports 2012 [cited 2012 July 27]. Available from: http://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentivePrograms/DataAndReports.html.
- 7. Caldis T. Composite health plan quality scales. Health Care Financ Rev 2007; 28(3): 95–107.
- 8. Center for Medicare and Medicaid Services (CMS). Federal Register | Electronic Health Record (EHR) Incentive Program (CMS-0033-F) [cited 2010 Sept. 5]. Available from: http://www.federalregister.gov/regulations/0938-AP78/electronic-health-record-ehr-incentive-program-cms-0033-f-.
- 9. Medicare and Medicaid Programs; Electronic Health Record Incentive Program – Stage 2. Proposed Rule. Fed Regist 2012; 77(45): 13698–13827.
- 10. Eslami S, Abu-Hanna A, Schultz MJ, de Jonge E, de Keizer NF. Evaluation of consulting and critiquing decision support systems: effect on adherence to a lower tidal volume mechanical ventilation strategy. J Crit Care 2011.
- 11. Lo HG, Newmark LP, Yoon C, Volk LA, Carlson VL, Kittler AF, et al. Electronic health records in specialty care: a time-motion study. J Am Med Inform Assoc 2007; 14(5): 609–615.
- 12. Welch WP, Bazarko D, Ritten K, Burgess Y, Harmon R, Sandy LG. Electronic health records in four community physician practices: impact on quality and cost of care. J Am Med Inform Assoc 2007; 14(3): 320–328.
- 13. Jaspers MWM, Smeulers M, Vermeulen H, Peute LW. Effects of clinical decision-support systems on practitioner performance and patient outcomes: a synthesis of high-quality systematic review findings. J Am Med Inform Assoc 2011; 18(3): 327–334.
- 14. Romano MJ, Stafford RS. Electronic health records and clinical decision support systems: impact on national ambulatory care quality. Arch Intern Med 2011; 171(10): 897–903.
- 15. Bernstein SL, Whitaker D, Winograd J, Brennan JA. An electronic chart prompt to decrease proprietary antibiotic prescription to self-pay patients. Acad Emerg Med 2005; 12(3): 225–231.
- 16. Garthwaite EA, Will EJ, Bartlett C, Richardson D, Newstead CG. Patient-specific prompts in the cholesterol management of renal transplant outpatients: results and analysis of underperformance. Transplantation 2004; 78(7): 1042–1047.
- 17. Safran C, Rind DM, Davis RM, Currier J, Ives D, Sands DZ, et al. An electronic medical record that helps care for patients with HIV infection. Proc Annu Symp Comput Appl Med Care 1993: 224–228.
- 18. Safran C, Rind DM, Davis RB, Ives D, Sands DZ, Currier J, et al. Guidelines for management of HIV infection with computer-based patient’s record. Lancet 1995; 346(8971): 341–346.
- 19. Safran C, Rind DM, Sands DZ, Davis RB, Wald J, Slack WV. Development of a knowledge-based electronic patient record. MD Comput 1996; 13(1): 46–54, 63.
- 20. Tierney WM, Miller ME, McDonald CJ. The effect on test ordering of informing physicians of the charges for outpatient diagnostic tests. N Engl J Med 1990; 322(21): 1499–1504.
- 21. Simon SR, Smith DH, Feldstein AC, Perrin N, Yang X, Zhou Y, et al. Computerized prescribing alerts and group academic detailing to reduce the use of potentially inappropriate medications in older people. J Am Geriatr Soc 2006; 54(6): 963–968.
- 22. Shah NR, Seger AC, Seger DL, Fiskio JM, Kuperman GJ, Blumenfeld B, et al. Improving acceptance of computerized prescribing alerts in ambulatory care. J Am Med Inform Assoc 2006; 13(1): 5–11.
- 23. Tamblyn R, Huang A, Perreault R, Jacques A, Roy D, Hanley J, et al. The medical office of the 21st century (MOXXI): effectiveness of computerized decision-making support in reducing inappropriate prescribing in primary care. CMAJ 2003; 169(6): 549–556.
- 24. Tamblyn R, Huang A, Kawasumi Y, Bartlett G, Grad R, Jacques A, et al. The development and evaluation of an integrated electronic prescribing and drug management system for primary care. J Am Med Inform Assoc 2006; 13(2): 148–159.
- 25. Gaikwad R, Sketris I, Shepherd M, Duffy J. Evaluation of accuracy of drug interaction alerts triggered by two electronic medical record systems in primary healthcare. Health Informatics J 2007; 13(3): 163–177.
- 26. Smith DH, Perrin N, Feldstein A, Yang X, Kuang D, Simon SR, et al. The impact of prescribing safety alerts for elderly persons in an electronic medical record: an interrupted time series evaluation. Arch Intern Med 2006; 166(10): 1098–1104.
- 27. Seidling HM, Schmitt SP, Bruckner T, Kaltschmidt J, Pruszydlo MG, Senger C, et al. Patient-specific electronic decision support reduces prescription of excessive doses. Qual Saf Health Care 2010; 19: e15.
- 28. Hemens BJ, Holbrook A, Tonkin M, Mackay JA, Weise-Kelly L, Navarro T, et al. Computerized clinical decision support systems for drug prescribing and management: a decision-maker-researcher partnership systematic review. Implement Sci 2011; 6: 89.
- 29. Eccles M, McColl E, Steen N, Rousseau N, Grimshaw J, Parkin D, et al. Effect of computerised evidence based guidelines on management of asthma and angina in adults in primary care: cluster randomised controlled trial. BMJ 2002; 325(7370): 941.
- 30. Cresswell K, Majeed A, Bates DW, Sheikh A. Computerised decision support systems for healthcare professionals: an interpretative review. Inform Prim Care 2012; 20(2): 115–128.
- 31. Eslami S, Abu-Hanna A, de Keizer NF. Evaluation of outpatient computerized physician medication order entry systems: a systematic review. J Am Med Inform Assoc 2007; 14(4): 400–406.
- 32. Steele AW, Eisert S, Witter J, Lyons P, Jones MA, Gabow P, et al. The effect of automated alerts on provider ordering behavior in an outpatient setting. PLoS Med 2005; 2(9): e255.
- 33. Garg AX, Adhikari NK, McDonald H, Rosas-Arellano MP, Devereaux PJ, Beyene J, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA 2005; 293(10): 1223–1238.
- 34. Krall MA, Sittig DF. Clinician’s assessments of outpatient electronic medical record alert and reminder usability and usefulness requirements. Proc AMIA Symp 2002: 400–404.
- 35. Sequist TD, Morong SM, Marston A, Keohane CA, Cook EF, Orav EJ, et al. Electronic risk alerts to improve primary care management of chest pain: a randomized, controlled trial. J Gen Intern Med 2012; 27(4): 438–444.
- 36. Jones JB, Stewart WF, Darer JD, Sittig DF. Beyond the threshold: real-time use of evidence in practice. BMC Med Inform Decis Mak 2013; 13: 47.
- 37. Sittig DF, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care 2010; 19(Suppl. 3): i68–i74.
- 38. Capra F. The Hidden Connections. New York: Anchor Books; 2002.
- 39. Pettigrew AM, Ferlie E, McKee L. Shaping Strategic Change: Making Change in Large Organisations – The Case of the National Health Service. London: Sage Publications; 1992.
- 40. Bright TJ, Wong A, Dhurjati R, Bristow E, Bastian L, Coeytaux RR, et al. Effect of clinical decision-support systems: a systematic review. Ann Intern Med 2012.
- 41. Plsek P. Complexity and the adoption of innovation in health care. In: Accelerating Quality Improvement in Health Care: Strategies to Accelerate the Diffusion of Evidence-Based Innovations. Washington, DC: National Institute for Health Care Management Foundation and National Committee for Quality in Health Care; 2003.
- 42. Saleem JJ, Patterson ES, Militello L, Render ML, Orshansky G, Asch SM. Exploring barriers and facilitators to the use of computerized clinical reminders. J Am Med Inform Assoc 2005; 12(4): 438–447.
- 43. Saleem JJ, Patterson ES, Militello L, Anders S, Falciglia M, Wissman JA, et al. Impact of clinical reminder redesign on learnability, efficiency, usability, and workload for ambulatory clinic nurses. J Am Med Inform Assoc 2007; 14(5): 632–640.
- 44. Zheng K, Padman R, Johnson MP, Diamond HS. Understanding technology adoption in clinical care: clinician adoption behavior of a point-of-care reminder system. Int J Med Inform 2005; 74(7–8): 535–543.
- 45. Horsky J, Schiff GD, Johnston D, Mercincavage L, Bell D, Middleton B. Interface design principles for usable decision support: a targeted review of best practices for clinical prescribing interventions. J Biomed Inform 2012; 45(6): 1202–1216.
- 46. Geibert RC. Using diffusion of innovation concepts to enhance implementation of an electronic health record to support evidence-based practice. Nurs Adm Q 2006; 30(3): 203–210.
- 47. Horbar JD, Rogowski J, Plsek PE, Delmore P, Edwards WH, Hocker J, et al. Collaborative quality improvement for neonatal intensive care. NIC/Q Project Investigators of the Vermont Oxford Network. Pediatrics 2001; 107(1): 14–22.
- 48. Plsek P. Innovative thinking for the improvement of medical systems. Ann Intern Med 1999; 131(6): 438–444.
- 49. Wu HW, Davis PK, Bell DS. Advancing clinical decision support using lessons from outside of healthcare: an interdisciplinary systematic review. BMC Med Inform Decis Mak 2012; 12: 90.
- 50. Li AC, Kannry JL, Kushniruk A, Chrimes D, McGinn TG, Edonyabo D, et al. Integrating usability testing and think-aloud protocol analysis with “near-live” clinical simulations in evaluating clinical decision support. Int J Med Inform 2012.
- 51. McGinn TG, McCullagh L, Kannry J, Knaus M, Sofianou A, Wisnivesky JP, et al. Efficacy of an evidence-based clinical decision support in primary care practices: a randomized clinical trial. JAMA Intern Med 2013; 173(17): 1584–1591. doi: 10.1001/jamainternmed.2013.8980.
- 52. Centor RM, Witherspoon JM, Dalton HP, Brody CE, Link K. The diagnosis of strep throat in adults in the emergency room. Med Decis Making 1981; 1(3): 239–246.
- 53. Walsh BT, Bookheim WW, Johnson RC, Tompkins RK. Recognition of streptococcal pharyngitis in adults. Arch Intern Med 1975; 135(11): 1493–1497.
- 54. McGinn TG, Deluca J, Ahlawat SK, Mobo BH Jr, Wisnivesky JP. Validation and modification of streptococcal pharyngitis clinical prediction rules. Mayo Clin Proc 2003; 78(3): 289–293.
- 55. Heckerling PS, Tape TG, Wigton RS, Hissong KK, Leikin JB, Ornato JP, et al. Clinical prediction rule for pulmonary infiltrates. Ann Intern Med 1990; 113(9): 664–670.
- 56. Mann DM, Kannry JL, Edonyabo D, Li AC, Arciniega J, Stulman J, et al. Rationale, design, and implementation protocol of an electronic health record integrated clinical prediction rule (iCPR) randomized trial in primary care. Implement Sci 2011; 6(1): 109.
- 57. Wu RC, Abrams H, Baker M, Rossos PG. Implementation of a computerized physician order entry system of medications at the University Health Network – physicians’ perspectives on the critical issues. Healthc Q 2006; 9(1): 106–109.
- 58. Wilkinson SA, Hinchliffe F, Hough J, Chang A. Baseline evidence-based practice use, knowledge, and attitudes of allied health professionals: a survey to inform staff training and organisational change. J Allied Health 2012; 41(4): 177–184.