Journal of the American Medical Informatics Association (JAMIA). 2011 Mar 17;18(3):232–242. doi: 10.1136/amiajnl-2011-000113

Development and evaluation of a comprehensive clinical decision support taxonomy: comparison of front-end tools in commercial and internally developed electronic health record systems

Adam Wright 1,2,3, Dean F Sittig 4, Joan S Ash 5, Joshua Feblowitz 1,2, Seth Meltzer 2, Carmit McMullen 6, Ken Guappone 7, Jim Carpenter 7, Joshua Richardson 8, Linas Simonaitis 9, R Scott Evans 10, W Paul Nichol 11, Blackford Middleton 1,2,3
PMCID: PMC3078666  PMID: 21415065

Abstract

Background

Clinical decision support (CDS) is a valuable tool for improving healthcare quality and lowering costs. However, there is no comprehensive taxonomy of types of CDS and there has been limited research on the availability of various CDS tools across current electronic health record (EHR) systems.

Objective

To develop and validate a taxonomy of front-end CDS tools and to assess support for these tools in major commercial and internally developed EHRs.

Study design and methods

We used a modified Delphi approach with a panel of 11 decision support experts to develop a taxonomy of 53 front-end CDS tools. Based on this taxonomy, a survey on CDS tools was sent to a purposive sample of commercial EHR vendors (n=9) and leading healthcare institutions with internally developed state-of-the-art EHRs (n=4).

Results

Responses were received from all healthcare institutions and 7 of 9 EHR vendors (response rate: 85%). All 53 types of CDS tools identified in the taxonomy were found in at least one surveyed EHR system, but only 8 functions were present in all EHRs. Medication dosing support and order facilitators were the most commonly available classes of decision support, while expert systems (eg, diagnostic decision support, ventilator management suggestions) were the least common.

Conclusion

We developed and validated a comprehensive taxonomy of front-end CDS tools. A subsequent survey of commercial EHR vendors and leading healthcare institutions revealed a small core set of common CDS tools, but identified significant variability in the remainder of clinical decision support content.

Keywords: Developing/using computerized provider order entry, knowledge representations, classical experimental and quasi-experimental study methods (lab and field), designing usable (responsive) resources and systems, statistical analysis of large datasets, discovery, text and data mining methods, automated learning, human-computer interaction and human-centered computing, qualitative/ethnographic field study, clinical decision support, decision support, biomedical informatics, developing and refining EHR data standards (including image standards), controlled terminologies and vocabularies, measuring/improving patient safety and reducing medical errors, machine learning, electronic health records, meaningful use

Introduction

Much of the potential value of electronic health record (EHR) systems comes from clinical decision support (CDS) tools that can help make care safer, more efficient, and more cost effective.1 2 CDS systems are designed to improve clinician decision-making at the point of care. Examples include health maintenance reminders,3 drug–drug interaction checking,4 dose adjustment,5 and order sets.6 When well designed and implemented, these interventions can help improve care quality and reduce medical errors.1 2 7–10

Although extensive research on ‘internally developed’ CDS has demonstrated the power of CDS to accomplish these goals, much of this research comes from four sites with internally developed EHRs.11 For the most part, the decision support in commercial EHR systems remains understudied. In addition, commercial EHRs have previously been found to be variable in their clinical decision support capabilities.12 This is concerning given that most hospitals and physician practices are likely to purchase a commercial EHR rather than invest the substantial time and resources required to develop a custom EHR system.

Federal meaningful use requirements mandate that hospitals and eligible providers utilize certified EHRs that implement clinical decision support in order to qualify for federal incentives.13 Specifically, the stage 1 objective for achieving meaningful use, as defined by the Centers for Medicare and Medicaid Services, is to “implement one clinical decision support rule relevant to specialty or high clinical priority along with the ability to track compliance with the rule.”14 This benchmark is expected to expand dramatically in stage 2 (2013) and stage 3 (2015) requirements as EHR use becomes more widespread.

Given the limited availability of CDS in routine clinical use,15 the impending deadlines for increased CDS use outlined in ‘meaningful use’ guidelines, and the fact that many institutions will likely purchase commercially developed CDS systems, it is imperative to develop a nuanced understanding of existing CDS tools and to determine the extent to which they have been incorporated into currently available commercially developed EHR systems. The goal of this project was to develop a comprehensive taxonomy of front-end CDS tools. We used this taxonomy to create a survey to study the availability of these CDS tools as designed at a purposive sample of leading healthcare institutions with internally developed EHRs and in commercially available EHR products.

Background

Front-end tools versus back-end system capabilities

The front-end CDS tools available to EHR users depend on the EHR's back-end system capabilities. We define back-end system capabilities as discrete system capabilities such as alert triggers, available data input elements, and end-user notification methods,16 while front-end CDS tools are the intervention types available to end-users, created by combining these capabilities with specific clinical knowledge bases and application logic. Consider, for example, the domain of medication-related decision support. Examples of front-end CDS tools might include drug–drug interaction checking, weight-based dosing, or renal dose adjustment. Back-end system capabilities that would support such tools might include a trigger in the information system that fires when a new medication is ordered; the ability to access the medication being ordered, the patient's current medications, weight, and glomerular filtration rate; the ability to perform mathematical calculations; and the ability to display an alert with actionable choices to the end-user.

As a specific example, consider the case of weight-based dosing, a type of front-end CDS tool, as defined above, which allows providers to calculate appropriate drug dosages based on patient weight. In order to implement this front-end tool, several back-end system capabilities must be present, including triggers, input data elements, interventions, and offered choices.16 First, a trigger (in this case, the ordering of a medication) is necessary. After the tool is initiated by the trigger, the information system retrieves necessary input data elements including patient weight, medication, and weight-based dosage guideline information. An intervention is then displayed in the form of text guidelines, a weight-based dosage calculator, or an automated dose recommendation. Finally, depending on the system, the user may be offered the choice to adjust the dose as needed and place the order or may be limited to certain default dose choices. Thus, a wide range of back-end system capabilities may act to support a unique front-end tool.
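To make this decomposition concrete, the following minimal Python sketch (not from the paper) shows how a weight-based dosing front-end tool could be assembled from the four back-end capability types: a trigger, input data elements, an intervention, and offered choices. The function names, the stubbed weight value, and the 15 mg/kg guideline figure are illustrative assumptions, not details of any surveyed system.

```python
from dataclasses import dataclass


@dataclass
class DoseSuggestion:
    """Intervention returned to the ordering provider."""
    message: str
    offered_choices: list


def get_patient_weight_kg(patient_id: str) -> float:
    """Back-end capability: input data element (patient weight).
    Stubbed here; a real EHR would query its clinical data repository."""
    return 70.0  # illustrative value


def weight_based_dosing(patient_id: str, drug: str,
                        mg_per_kg: float = 15.0) -> DoseSuggestion:
    """Front-end tool built from back-end capabilities: invoked by a trigger
    (a new medication order), it retrieves input data (weight and a dosing
    guideline) and returns an intervention with offered choices."""
    weight = get_patient_weight_kg(patient_id)      # input data element
    suggested_dose = round(weight * mg_per_kg)      # calculation capability
    return DoseSuggestion(                          # intervention
        message=f"Suggested {drug} dose: {suggested_dose} mg "
                f"({mg_per_kg} mg/kg x {weight} kg)",
        offered_choices=[                           # offered choices
            {"action": "accept", "dose_mg": suggested_dose},
            {"action": "adjust"},
            {"action": "cancel_order"},
        ],
    )


# Trigger: the order-entry system would call this when a medication is ordered.
print(weight_based_dosing("pt-001", "vancomycin"))
```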

Review of taxonomies

A number of taxonomies have been proposed to describe CDS systems; these classification systems are summarized in table 1.1 2 16–21 Most, with the exception of those of Wang et al and Garg et al, describe the back-end system capabilities of CDS systems (eg, triggers, data input elements) rather than front-end tools.

Table 1.

Clinical decision support (CDS) taxonomies

Taxonomy Type Major taxa
Wang et al17 Front-end tools
  • Benefits: process improvement, policy implementation, error prevention, decision support

  • Domains: laboratory (process improvement), pharmacy (error prevention/decision support), Joint Commission (policy implementation)

  • Classes: logically organize clinical rules by content type (eg, drug–drug interaction checking, automated orders, guided dosing)

Miller et al18 Back-end system capabilities
  • Type of intervention (eg, optimal ordering, patient-specific decision support, optimal care, just-in-time (JIT) education)

  • When in the workflow to introduce the intervention (eg, initiating a session, selecting an order)

  • How disruptive the intervention should be (eg, incidental display, pop-up, complex protocol)

Garg et al1 Front-end tools (general)
  • Systems for diagnosis

  • Reminder systems for prevention

  • Systems for disease management

  • Systems for drug dosing and drug prescribing

Kawamoto et al2 Back-end system capabilities (general)
  • General system features (eg, integration with charting, computerized physician order entry)

  • Clinician–system interaction features (eg, automatic provision of CDS, provision at point-of-care, documentation of override reasons)

  • Communication content features (eg, provision of a recommendation vs assessment, justification with reasoning and/or research evidence)

  • Auxiliary features (eg, local user involvement in development, CDS provided to patients, periodic performance feedback)

Osheroff et al19 Back-end system capabilities
  • Documentation forms/templates

  • Relevant data display

  • Order creation facilitators

  • Time-based checking and protocol/pathway support

  • Reference information and guidance

  • Reactive alerts and reminders

Berlin et al20 Back-end system capabilities
  • Context: setting, objectives, and other contextual factors (eg, clinical setting, clinical task)

  • Knowledge and data source: sources of clinical knowledge (eg, guidelines) and patient data source (eg, electronic health record, direct entry)

  • Decision support: type of inference being made and complexity of recommendations

  • Information delivery: delivery format and mode

  • Workflow: user of the system (eg, clinicians, patients), system–workflow integration

Wright et al16 21 Back-end system capabilities
  • Triggers: events causing a CDS rule to be invoked (eg, prescribing a drug, ordering a laboratory test, entering a new problem on the problem list)

  • Input data: data elements used by the rule to make inferences (eg, laboratory results, patient demographics, problem list)

  • Interventions: possible actions a CDS tool can take (eg, send message, show guidance, log event)

  • Offered choices: choices offered to the user (eg, cancel order, change order, override)

Previously, we developed a taxonomy of clinical decision support that could be used to categorize discrete back-end system capabilities of clinical information systems and CDS systems.16 In a separate study, we examined the availability of these capabilities within several major commercial EHR systems.12 This study was limited to the back-end system capabilities present in the information system and explicitly excluded the front-end tools available for use by providers. We found that the back-end system capabilities of nine commercial systems were highly variable—the most comprehensive system had 41 of 42 back-end system capabilities, while the least comprehensive had only 24 of 42.

Although we believe this characterization was useful, we have found that, in practice, many healthcare organizations do not directly work with the back-end system capabilities of their EHR to implement CDS de novo, but rather use front-end CDS tools and content which they purchase ‘off-the-shelf’ from their EHR vendor or a clinical decision support content vendor. Therefore, we expanded upon our previous research on back-end system capabilities with the goal of fully characterizing available front-end decision support tools across a wide range of clinical information systems, including both commercially available and internally developed EHR systems.

Design versus implementation

In addition to assessing both back-end CDS system capabilities and front-end CDS tools, it is also valuable to differentiate between EHR system features as designed and the tools actually available as implemented or used. Although a particular type of clinical decision support may be possible in a given system, whether it is actually available to end-users can vary widely depending on how the system is implemented. Organizations can decide not to buy certain CDS modules from their EHR vendor when those modules are optional purchases or are available elsewhere, or they can turn off features that do come with their system. In addition, research has shown that the same commercial systems can be used with variable results. For example, the Leapfrog Group conducted a test of computerized physician order entry (CPOE) systems, as implemented, and found that each commercial system evaluated failed the test as implemented in at least one institution and passed in at least one other, a testament to the variability of the configuration and implementation process.22

A robust understanding of CDS systems on both the back-end/front-end and design/implementation dimensions is thus valuable for future research and development (table 2).

Table 2.

Taxonomic assessment of decision support content and function as designed and as implemented

Front-end tools Back-end system capabilities
As designed Current project Wright et al12
As implemented Classen et al23 NA*
* Function as implemented is a less significant category given that clinical decision support functions are a necessary prerequisite for implementing content, and because the functions available (although not necessarily used) as implemented are generally the same as functions as designed.

Current systems have yet to be fully characterized along both of these dimensions. We first assessed back-end capabilities as implemented within one internally developed EHR to develop the taxonomy of back-end capabilities required to create useful front-end tools.16 A subsequent study on back-end system capabilities as designed assessed their availability across multiple commercially available EHR systems.12 In addition, Classen et al investigated front-end tools as implemented at various sites.23 The area that remains uninvestigated is front-end CDS tools as designed. Thus, the goal of the current study is to characterize this fourth and final quadrant: front-end tools as designed.

As reflected in table 1, although a variety of CDS taxonomies exist, rigorous taxonomies of front-end tools are lacking. Therefore, we began our project by developing a taxonomy of front-end CDS tools using a Delphi method, with a large expert panel. Our goal in developing the taxonomy was to assess the CDS tools available in various systems as designed. We then developed and administered a survey to two groups: commercial EHR vendors and ‘internal’ EHR developers. For the purposes of this paper, EHRs are referred to as either ‘commercial,’ created by a vendor and sold to a hospital or other healthcare organization, or ‘internally developed,’ built by a hospital or other healthcare organization for their own use.

Methods

Clinical decision support taxonomy

A preliminary list of 46 CDS tools was developed by the authors based on examination of systematic literature reviews of clinical decision support, extensive experience in the field of CDS, and previously conducted qualitative research.12 16 24 25 The authors, through their research group, then organized and facilitated an in-person conference which included a group of 11 national experts in healthcare IT and clinical decision support in addition to the researchers themselves (supplementary online appendix A includes a complete list of participants and organizing members of the multidisciplinary Provider Order Entry Team—POET).

The meeting took place over 2 days outside of Portland, Oregon in the spring of 2009. The complete list of 46 CDS tools was debated among all participants with meeting facilitation provided by POET team members. On the basis of this debate, several types of clinical decision support were added and some were modified or removed. In addition, the CDS types were divided into six categories based on this discussion (and in part on the taxa laid out in Osheroff et al8 and other clinical decision support taxonomies): medication dosing support, order facilitators, point-of-care alerts/reminders, relevant information display, expert systems, and workflow support. Although based on the assessment of experts at the conference and modifications of existing CDS taxonomies, the six over-arching categories were created primarily for the purpose of organizing and analyzing the CDS survey responses. The final taxonomy contains a list of 53 CDS tools meant to provide a comprehensive framework for describing all front-end tools currently in use. The complete taxonomy, including CDS types and sub-categories, descriptions and examples, is shown on the left-hand side of tables 3–8.
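As a rough illustration of the taxonomy's two-level structure (six categories containing 53 tools), the sketch below encodes the categories and their tool counts from tables 3–8 as a simple data structure. Only a few representative tool names per category are listed, and the variable name is our own; the complete lists appear in the tables.

```python
# Two-level structure of the front-end CDS taxonomy: six categories, 53 tools.
CDS_TAXONOMY = {
    "Medication dosing support": {        # 7 tools (table 3)
        "examples": ["Medication dose adjustment", "Formulary checking",
                     "Default doses/pick lists"],
        "tool_count": 7,
    },
    "Order facilitators": {               # 9 tools (table 4)
        "examples": ["Medication order sentences",
                     "Condition-specific order sets"],
        "tool_count": 9,
    },
    "Point-of-care alerts/reminders": {   # 14 tools (table 5)
        "examples": ["Drug-drug interaction checking", "Care reminders"],
        "tool_count": 14,
    },
    "Relevant information display": {     # 5 tools (table 6)
        "examples": ["Context-sensitive information retrieval",
                     "Tall man lettering"],
        "tool_count": 5,
    },
    "Expert systems": {                   # 11 tools (table 7)
        "examples": ["Diagnostic support", "Ventilator support"],
        "tool_count": 11,
    },
    "Workflow support": {                 # 7 tools (table 8)
        "examples": ["Medication reconciliation", "Documentation aids"],
        "tool_count": 7,
    },
}

# The category counts sum to the 53 tools in the final taxonomy.
assert sum(c["tool_count"] for c in CDS_TAXONOMY.values()) == 53
```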

Table 3.

Taxonomy of clinical decision support (CDS) tools and survey results: medication dosing support

CDS type CDS description Example Vendor Total Institution Total Grand total
1 2 3 4 5 6 7 1 2 3 4
Medication dosing support
 Medication dose adjustment26 Assistance with adjusting or calculating medication doses based on patient characteristics such as age, weight, or renal or hepatic function. An algorithm that automatically suggests that if CrCl <50 mL/min, reduce frequency of administration of a particular medication to every 24 h. 7 3 10
 Formulary checking27 Check medication orders against hospital or payer formularies and suggest more cost-effective therapies. Suggest omeprazole as a more cost effective alternative to pantoprazole. 6 4 10
 Single dose range checking28 Checking to see whether a single dose of a medication falls outside of an allowable dose range Alert on a single dose of acetaminophen ≥1 g. 6 3 9
 Maximum daily dose checking29 Checking to see whether the combined daily dose of a medication exceeds a specified maximum daily dose. In the case of combination products (such as hydrocodone/acetaminophen), systems should check each ingredient for maximum daily dose, in combination with other medications the patient is receiving. Alert on a total daily dose of acetaminophen ≥4 g. 7 2 9
 Maximum lifetime dose checking29 Checking to see whether the combined lifetime dose of a medication exceeds a specified maximum lifetime dose. Alert if the total cumulative dose of doxorubicin over a patient's lifetime exceeds 550 mg/m2. 3 1 4
 Default doses/pick lists27 Providing common doses of a medication for a provider to choose from. Providing a list of 100 mg, 200 mg, 300 mg, 400 mg, 600 mg, and 800 mg doses for ibuprofen with a default of 400 mg. 7 4 11
 Indication-based dosing30 Adjusting default medication doses based on indications entered by ordering provider. Order 7.5 mg methotrexate once weekly for rheumatoid arthritis, but 1500 mg/m2 every 4 weeks (with leucovorin rescue) for gastric cancer. 6 3 9
Totals (absence of ‘•’ indicates response of N (no), NA (not applicable), or (blank)) 7 6 6 5 7 6 5 42 7 5 2 6 20 62

Table 4.

Taxonomy of clinical decision support (CDS) tools and survey results: order facilitators

CDS type CDS description Example Vendor Total Institution Total Grand total
1 2 3 4 5 6 7 1 2 3 4
Order facilitators
 Medication order sentences6 Complete statements of orders which a provider can order as a single unit. Allowing the provider to order ‘Digoxin 0.25 mg PO QD’ as a single unit. 7 4 11
 Subsequent or corollary orders30 Suggesting or automatically ordering something based on or in response to another order. Order liver function tests after starting a statin. 5 3 8
 Indication-based ordering31 Suggesting orders based on the indication entered by the ordering provider. Suggesting a low-dose thiazide diuretic for a patient with hypertension. 5 3 8
 Service-specific order sets6 Order sets (collections of common orders) based on the service a patient is being admitted to. Intensive care unit (ICU) admission order set 4 4 8
 Condition-specific order sets6 Order sets (collections of common orders) based on a disease or problem that a patient has. Rule out myocardial infarction order set 7 4 11
 Procedure-specific order sets6 Order sets (collections of common orders) based on a procedure or clinical state (post-operative, post-partum, post-procedure, etc) of a patient. Post total knee replacement order set 7 4 11
 Condition-specific treatment protocol32 A treatment protocol for a specific condition. Protocols are characterized by complex or temporal logic, in comparison to order sets which are usually simpler. Hypothermia treatment protocol 6 2 8
 Transfer order set6 Order sets (collections of common orders) based on the services a patient is being transferred from and to. ICU-to-medicine transfer order set 3 4 7
 Non-medication order sentences6 Complete statements of non-medication orders which a provider can order as a single unit. Allowing the provider to order ‘Call HO for T >101, SBP >180, SBP <90, HR >120, HR <50, RR >30, RR <10, O2 sats <92%’ as a single unit. 6 3 9
Totals (absence of ‘•’ indicates response of N (no), NA (not applicable), or (blank)) 9 7 7 9 9 6 3 50 9 7 8 7 31 81

Table 5.

Taxonomy of clinical decision support (CDS) tools and survey results: point of care alerts/reminders

CDS type CDS description Example Vendor Total Institution Total Grand total
1 2 3 4 5 6 7 1 2 3 4
Point of care alerts/reminders
 Drug–condition interaction checking33 Checking medication orders against the patient problem list for possible contraindications. Alert when a provider orders propranolol for a patient with asthma. 7 2 9
 Drug–drug interaction checking34 Checking medication orders and the medication list for possible contraindications. Alert when a provider orders sildenafil for a patient with nitroglycerin on the medication list. 7 4 11
 Drug–allergy interaction checking35 Checking medication orders against the allergy list for possible contraindications, including both direct allergies, allergies to drug classes or ingredients, and cross-sensitivities. Alert when a provider orders amoxicillin for a patient with a documented penicillin allergy. 7 4 11
 Plan of care alerts36 Time-based alerts relating to plans of care. Reminders to reassess the need for restraints and reorder if necessary at least every 24 h. 4 3 7
 Critical laboratory value checking37 Comparing laboratory results to reference ranges and alerting providers to critical (panic) values. Page the covering provider when pH>7.60. 5 4 9
 Duplicate order checking38 Checking active medication orders and the medication list for possible duplication. Alert when a provider orders metoprolol in a patient with an active order for atenolol or when it is already on the medication list. 6 2 8
 Care reminders39 Reminders to order a diagnostic or therapeutic procedure based on patient parameters, including preventive care reminders, chronic disease reminders, or palliative care reminders. Order an HbA1c every 6 months for patient with diabetes. 7 4 11
 Look-alike/sound-alike medication warnings40 Warn providers when they order a medication whose name looks or sounds like another drug. Warn providers ordering Zyrtec (cetirizine) or Zyprexa (olanzapine) to ensure that they have chosen the drug they intended. 2 1 3
 Ticklers41 Time-based alerts that an order has not been fully carried out. Alert a provider when a mammogram has been ordered but not scheduled or performed after 14 days. 4 1 5
 Problem list management42 Alerts, reminders, and automated documentation tools that help providers maintain an accurate problem list. When ordering ritonavir, ask the provider if he/she would like to add HIV to the problem list if not already documented. 4 1 5
 Radiology ordering support43 Assistance in selecting appropriate radiology studies based on patient conditions. Order a foot (rather than an ankle) x-ray if there is any pain in the midfoot zone and the patient is unable to weight bear both immediately and in the emergency department. 5 3 8
 Intravenous (IV)/per os (PO) conversion44 Conversion of patients from IV agents to PO agents when clinically appropriate and cost-effective. Convert patient from IV metronidazole to PO metronidazole when patient is no longer NPO (nil per os). 2 2 4
 High-risk state monitoring45 Alerting the provider to high-risk states. Alert the provider to order contact precautions for patients with known MRSA colonization. 3 4 7
 Polypharmacy alerts46 Alerting the provider when patients are on a high number of medications. Alert the provider that a patient is on >8 medications and suggest consult pharmacy. 2 1 3
Totals: (absence of ‘•’ indicates response of N (no), NA (not applicable), or (blank)) 14 7 9 8 13 9 5 65 13 9 8 6 36 101

Table 6.

Taxonomy of clinical decision support (CDS) tools and survey results: relevant information display

CDS type CDS description Example Vendor Total Institution Total Grand total
1 2 3 4 5 6 7 1 2 3 4
Relevant information display
 Context-sensitive information retrieval47 Information retrieval based on patient characteristics and clinical context (sometimes called infobuttons). Allow the provider to link directly to prescribing information for a medication at the time of ordering. 7 2 9
 Patient-specific relevant data displays48 Show relevant patient-specific information at appropriate times within information system workflows. Display recent potassium levels when ordering digoxin. 5 2 7
 Medication/test cost display49 Show the cost of a medication or test at the time of ordering. Indicate that a complete blood count costs $66 at the time of ordering. 3 4 7
 Tall man lettering50 Vary the case of look-alike medication names to show critical differences. Show hydralazine and hydroxyzine as HydrALAZINE and HydrOXYzine in a pick list. 3 4 7
 Context-sensitive user interface51 Provide special user interfaces for particular clinical scenarios. Provide a special interface for chemotherapy order entry, which might include relevant data display, special facilities for ordering complex or time-based protocols, and reference information. 4 1 5
Totals: (absence of ‘•’ indicates response of N (no), NA (not applicable), or (blank)) 5 3 3 1 5 4 1 22 5 3 2 3 13 35

Table 7.

Taxonomy of clinical decision support (CDS) tools and survey results: expert systems

CDS type CDS description Example Vendor Total Institution Total Grand total
1 2 3 4 5 6 7 1 2 3 4
Expert systems
 Antibiotic ordering support52 Antibiotic suggestions based on patient history, hospital antibiogram, culture results, and patient characteristics. Suggest vancomycin for empiric antibiotic therapy for patients with suspected MRSA. 2 3 5
 Ventilator support53 Ventilator suggestions based on patient-specific blood gas readings and current condition. Unless the FiO2 is already at 1.0, suggest increasing the FiO2 by 0.1 if the PaO2 is >50 but <60 mm Hg in patients with acute respiratory distress syndrome. 2 1 3
 Diagnostic support54 55 Differential diagnosis suggestions based on patient signs and symptoms (eg, Isabel, DxPlain, QMR) Suggest a differential diagnosis of appendicitis, diverticulitis/osis, or kidney stones in patients with lower abdominal pain. 2 1 3
 Risk assessment tools56 Tools and calculators to estimate disease risks based on patient characteristics. Calculate 10-year cardiovascular disease risk for a patient based on the Framingham risk score. 5 3 8
 Prognostic tools57 Tools to estimate the survival of patients with cancer or other potentially life-limiting conditions based on diagnostic criteria and procedures performed. Estimate survival for cancer patients based on tumor type, location, staging, and procedures performed. 2 1 3
 Transfusion support58 Recommendations regarding the appropriateness of transfusions and suggested products and dosing based on clinical indications. Suggest fresh frozen plasma for patients with a high INR and taking warfarin. 3 2 5
 Nutrition ordering tools59 Tools, calculators, guidelines, and protocols for ordering total parenteral nutrition (TPN), enteral nutrition or other alimentation procedures. Suggest increased protein in TPN for patients with active infection. 2 2 4
 Laboratory test interpretation60 Interpretative information for laboratory results. This may include reference range information, correlation among several results, or calculations (such as the anion gap). Based on ABG values, report that a patient has high anion gap metabolic acidosis. 3 3 6
 Treatment planning61 Computer tools to assist in the planning of interventional procedures (ie, surgery or radiation therapy). An image-guided treatment planning system used for radiation oncology. 1 2 3
 Triage tools62 Tools for determining urgency of clinical problems and sorting patients on the basis of need and available resources. A computer prompt that recommends that a patient with facial numbness and slurred speech, as documented by a triage nurse, be seen immediately to rule out stroke. 4 2 6
 Syndromic surveillance63 Direct or surrogate monitoring of disease conditions over a geographic area. City-wide reporting and monitoring of emergency department chief complaints in order to detect norovirus outbreaks. 2 2 4
Totals: (absence of ‘•’ indicates response of N (no), NA (not applicable), or (blank)) 9 3 1 2 9 4 0 28 10 2 3 7 22 50

Table 8.

Taxonomy of clinical decision support (CDS) tools and survey results: workflow support

CDS type CDS description Example Vendor Total Institution Total Grand total
1 2 3 4 5 6 7 1 2 3 4
Workflow support
 Order routing64 Rule-based routing of orders to various functional areas. Route order for albuterol nebulizer to pharmacy and respiratory therapy. 4 3 7
 Registry functions65 Actionable interventions on multiple patients. Send a letter to all patients with diabetes who are overdue for an HbA1c. 4 3 7
 Medication reconciliation66 Tools for reconciling medication lists across transitions in care (admissions, discharges, and transfers). Upon admission, automatically generate a pre-admission medication list based on outpatient medication orders and pharmacy dispensing data. 5 4 9
 Automatic order termination67 Automatic termination of orders after a set period of time. Automatically terminate antibiotic orders after the conclusion of the order duration. 6 3 9
 Order approvals68 Apply logic and route orders for special approval based on order type, ordering provider, or patient characteristics. Send all human growth hormone (HGH) orders to endocrinology for review/approval. 4 3 7
 Free-text order parsing69 Parsing tools to translate free-text orders into structured representations. Allow the user to enter the text ‘amox 500 mg QID 10d’ and translate that to a complete, structured amoxicillin order that can be automatically processed by the pharmacy system. 2 1 3
 Documentation aids70 Templates and tools for documenting care in structured or unstructured forms. Structured documentation template for a primary care asthma visit that has checkboxes for common symptoms, etc. 7 4 11
Totals (absence of ‘•’ indicates response of N (no), NA (not applicable), or (blank)) 5 6 4 3 7 6 1 32 7 5 6 3 21 53

Survey

Once the clinical decision support taxonomy had been reviewed and revised by the expert panel, and following IRB approval, surveys were sent to a purposive sample of nine major CCHIT-certified commercial EHR vendors providing a broad array of ambulatory and inpatient EHR systems: Allscripts, Chicago, Illinois, USA; Eclipsys, Atlanta, Georgia, USA (recently merged with Allscripts); NextGen, Horsham, Pennsylvania, USA; e-MDs, Austin, Texas, USA; Epic Systems, Verona, Wisconsin, USA; Cerner, Kansas City, Missouri, USA; GE, Fairfield, Connecticut, USA; Greenway Medical Technologies, Carrollton, Georgia, USA; and SpringCharts, Houston, Texas, USA; and to four healthcare institutions with internally developed EHRs: Partners HealthCare, Boston, Massachusetts, USA; the Regenstrief Institute, Indianapolis, Indiana, USA; Intermountain Healthcare, Salt Lake City, Utah, USA; and the national Veterans Health Administration, Washington, DC, USA (see table 9 for locations and other information).

Table 9.

Vendors and institutions surveyed

Vendor/institution Product/system Version CCHIT certification
Allscripts, Chicago, IL, USA Allscripts EHR 10 2011
Cerner, Kansas City, MO, USA PowerChart/PowerWorks 2007 2007
Eclipsys, Atlanta, GA, USA Sunrise Clinical Manager Suite 5.5 2011
e-MDs, Austin, TX, USA Solution Series 6.3 2008
Epic, Madison, WI, USA EpicCare Inpatient Summer 2009 2011
NextGen, Horsham, PA, USA Inpatient Clinicals 2.3 2008
GE, Fairfield, CT, USA Centricity EMR 9.2 2008
GMT, Carrollton, GA, USA PrimeSuite 2008 2008
SpringCharts, Houston, TX, USA SpringCharts EHR 9.5 2006
Partners HealthCare, Boston, MA, USA LMR NA NA
Veterans' Affairs Health System, Washington, DC, USA VistA NA NA
Regenstrief Institute, Indianapolis, IN, USA RMRS NA NA
Intermountain Healthcare, Salt Lake City, UT, USA HELP-2 NA NA

Commercial vendors were selected on the basis of (1) CCHIT certification and (2) EHR products in widespread use at multiple sites. The internally developed EHRs surveyed were in use at healthcare institutions identified by Chaudhry et al as having the largest number of high quality, peer-reviewed articles describing their research and development activities.11 All surveys were conducted via email and were sent to knowledgeable leaders and/or informatics staff within each organization (eg, CMIO, CEO, CMO).

For each type of clinical decision support, respondents were provided with a brief definition and a representative example (identical to the types listed in tables 3–8) and were asked to indicate whether each tool was present (‘Y’) or absent (‘N’) as the system was designed. Respondents were asked whether the current release of their “EMR supports this type of CDS.” Respondents were asked to answer according to the capabilities of the current version of their EHR system only, not on any planned capabilities or theoretical extensions, and were also asked to focus on the capabilities of their systems as designed, rather than as typically implemented (appreciating that some features may be used more than others). Respondents were also given the opportunity to provide comments to clarify each response, and were encouraged to contact the investigators with any questions—several vendors requested meetings to discuss their capabilities or ask questions, and these requests were accommodated.

Data analysis

Results were compiled in Microsoft Excel and analyzed using Excel and SAS. Based on the data collected, various descriptive statistics were recorded. Given our small sample and purposive sampling strategy, it was not possible to infer broad quantitative characteristics of the CDS developers' community at large.
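The descriptive statistics reported below (per-system and per-category availability) can be derived directly from the yes/no survey matrix. The following sketch illustrates that kind of tally with an invented two-system, three-tool matrix standing in for the real responses; it shows the calculation only and is not the authors' actual analysis code.

```python
# Toy presence matrix: keys are systems, values map CDS tools to the survey's
# Y/N responses. The systems and responses here are invented for illustration.
responses = {
    "System A": {"Drug-drug interaction checking": "Y",
                 "Ventilator support": "N",
                 "Documentation aids": "Y"},
    "System B": {"Drug-drug interaction checking": "Y",
                 "Ventilator support": "Y",
                 "Documentation aids": "Y"},
}

tools = sorted({tool for answers in responses.values() for tool in answers})

# Per-system availability: share of surveyed tools the system supports
# (the analogue of the '% Features available' row in table 10).
for system, answers in responses.items():
    available = sum(1 for tool in tools if answers.get(tool) == "Y")
    print(f"{system}: {available}/{len(tools)} "
          f"({100 * available / len(tools):.1f}%)")

# Per-tool availability: number of systems supporting each tool
# (the analogue of the 'Grand total' column in tables 3-8).
for tool in tools:
    count = sum(1 for answers in responses.values() if answers.get(tool) == "Y")
    print(f"{tool}: present in {count} of {len(responses)} systems")
```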

Results

Surveys were sent to nine commercial EHR vendors and four healthcare institutions. We received responses from seven of nine vendors (77%) and four of four institutions (100%) for an overall response rate of 85%. Details about the systems surveyed, including vendor/institution name, location, system name, system version, and CCHIT certification year are presented in table 9. From this point forward, we present anonymized results in accordance with the preference of surveyed vendors and institutions.

The complete results of the survey for each of the 53 types of front-end CDS tools are shown on the right-hand side of tables 3–8 and summarized by category in table 10.

Table 10.

Summary of capabilities of commercial and internally developed systems by category

Decision support capabilities Vendor % Content available Internally developed % Content available Overall % available
1 2 3 4 5 6 7 1 2 3 4
Medication dosing support (7 features) 7 6 6 5 7 6 5 85.7 7 5 2 6 71.4 80.5
Order facilitators (9 features) 9 7 7 9 9 6 3 79.4 9 7 8 7 86.1 81.8
Point-of-care alerts/reminders (14 features) 14 7 9 8 13 9 5 63.2 13 9 8 6 64.2 65.6
Relevant information display (5 features) 5 3 3 1 5 4 1 62.9 5 3 2 3 65.0 63.6
Expert systems (11 features) 9 3 1 2 9 4 0 36.3 10 2 3 7 47.7 41.3
Workflow support (7 features) 5 6 4 3 7 6 1 65.3 7 5 6 3 75.0 68.8
Grand totals
% Features available 92.5 60.4 56.6 52.8 94.3 66.0 28.3 64.4 96.2 58.5 54.7 60.4 67.4 65.5

The proportion of surveyed CDS tools available in each EHR ranged from 28.3% to 96.2% (median 60.4%). Eight of the 53 types (15%) of clinical decision support were present in all surveyed systems: default doses/pick lists, medication order sentences, condition-specific and procedure-specific order sets, drug–drug and drug–allergy interaction checking, health maintenance reminders, and clinical documentation (charting) aids. Twelve of the 53 types (23%) of clinical decision support were present in all commercial EHRs and 16 (30%) were present in all internally developed EHRs. All 53 types of decision support were present in at least one of the 11 systems surveyed. Although no single system was capable of all surveyed types of clinical decision support, two commercial systems and one internally developed system had more than 90% of all surveyed CDS tools.

Overall, certain classes of decision support features, including order facilitators (81.8% availability) and dosing support (80.5%), were more common, with most of these types of decision support present in the majority of systems. Workflow support (68.8%), point-of-care alerts/reminders (65.6%), and relevant information displays (63.6%) were less common but still prevalent in the majority of systems. Finally, the expert systems class (41.3%), which includes tools such as diagnostic decision support, treatment planning, laboratory data interpretation, and ventilator support, was the least common class of CDS tools available.

Discussion

Among both internally developed and commercial systems, there was significant variability in the available front-end CDS tools as designed. While more than one system had over 90% of the surveyed CDS tools, others had less than 60%, and one commercial system had only 28.3%. Several CDS tools were present in all 11 systems, while others (including polypharmacy alerts, treatment planning, look-alike/sound-alike medication alerts, diagnostic support, prognostic tools, ventilator support, and free-text order parsing) were present in as few as three of the systems surveyed. Not surprisingly, the most common CDS tools were generally the simplest, such as drug–drug interaction checking, while the least common were advanced expert systems such as treatment planning and diagnostic support. In general, ambulatory EHRs had a lower proportion of surveyed CDS functions when compared with inpatient EHRs.

Our findings also show that certain classes of CDS tools are more commonly available. Dosing support (eg, default doses/pick lists) and order facilitators (eg, condition-specific order sets) were the most common classes of CDS tools available, while expert systems (eg, ventilator support) were the least common. The variation in availability across CDS categories is not surprising given that each requires different knowledge bases and varying expertise. While all forms necessitate significant investments (both financial and otherwise), vendors and healthcare institutions may preferentially avoid incorporating the most resource-intensive content into their systems.

Overall, the results of our survey indicate that although a diverse range of CDS tools exists in both vendor and internally developed EHR systems, there remains significant room for improvement in making these tools more widely and consistently available. Given that our sample of commercial and internally developed systems represents some of the most advanced and most widely used systems and assesses their optimum CDS capabilities, our results indicate that the general availability of decision support tools remains limited even in the best of cases.

It is important to consider that these results are based on each system as it is designed, not as it is actually implemented and used at real-world sites. The gap between the tools available as a system is designed and those actually implemented and used in clinical practice can be substantial, particularly in the case of commercially developed EHR systems. While vendors may incorporate a certain CDS tool into their system, whether that tool is ultimately available to the end-user depends heavily on institutional priorities, governance practices, and implementation procedures.71 In this project, we examined the off-the-shelf CDS tools as designed in a purposive sample of leading EHRs. In evaluating a commercial EHR for possible adoption, it is important to consider both the tools that are available as designed, or ‘out-of-the-box,’ and the tools that will actually be implemented based on the priorities and needs of the institution. Each institution, whether developing a ‘home-grown’ system or purchasing one from an outside vendor, needs to consider the specific decision support tools that are right for it and prioritize different types of CDS based on institutional needs.

Consideration of both back-end system capabilities and front-end tools is vitally important for the evaluation and development of EHR systems. Off-the-shelf systems may offer ready-to-use tools but may limit the ability to customize these tools through different combinations of CDS system capabilities. In contrast, a home-grown system with robust CDS system capabilities may offer a great deal of flexibility but may also require a greater investment of time, resources, and expertise to create front-end tools. In general, as long as a system includes enough basic system capabilities, the end-user can create any type of CDS tool. Realistically, however, the end-user may lack the time, resources, expertise, or creativity to create tools by combining available system capabilities.

There are a variety of ways to promote broader availability of CDS tools for the system end-user. One solution is simply for vendors and institutional developers to expand the variety of CDS tools available in their systems, which we hope they will continue to do in light of these results. However, given that this might not be feasible in all cases, additional means are necessary for increasing the availability of a range of CDS tools. One such solution is the use of external CDS tools (including web or software-based tools) that can add third-party content by ‘talking’ to the EHR via an application programming interface. Another option is the use of general purpose rule engines, which allow end-users to more easily customize tools based on available system capabilities. Service-oriented architectures such as SANDS also provide a means of making more CDS tools available.72 73 In general, it will be important to better understand end-user preferences and workflow habits in order to optimally improve these systems.
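As an illustration of the external, service-based approach mentioned above, the sketch below posts minimal patient context to a hypothetical decision support web service and collects whatever recommendations come back. The endpoint URL, payload fields, and response format are all assumptions made for illustration; they do not describe SANDS or any specific vendor's application programming interface.

```python
import json
import urllib.request


def fetch_external_cds(patient_context: dict,
                       service_url: str = "https://cds.example.org/advice") -> list:
    """Send patient context to a hypothetical third-party CDS service and
    return its recommendations. The URL and message format are invented."""
    request = urllib.request.Request(
        service_url,
        data=json.dumps(patient_context).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        return json.loads(response.read()).get("recommendations", [])


# Example call: the EHR supplies context it already holds about the patient.
context = {"age": 67, "problems": ["diabetes mellitus"], "last_hba1c_months": 8}
# recommendations = fetch_external_cds(context)  # requires a live service
```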

The taxonomy of front-end CDS tools described in this paper provides a novel means of assessing currently available decision support tools. It is our hope that this comprehensive taxonomy will also serve as a roadmap for vendors and institutional developers working to expand both the back-end CDS system capabilities and the front-end tools in their systems. In addition, our taxonomy may be of value for informing future certification criteria and stage 2 and 3 meaningful use requirements. Together, this taxonomy and the results of our survey also provide healthcare institutions with a framework for evaluating the capabilities of clinical information systems, which may be useful as they weigh the purchase or development of such systems. As meaningful use requirements continue to expand, more decision support tools will be necessary, and it is imperative that healthcare institutions and commercial vendors continue to extend the range of CDS tools available to increase the quality and efficiency of care.

Our method of analyzing commercial and internally developed EHR systems has several potential limitations. First, we surveyed a very small sample of the commercial and home-grown systems currently in use. We employed a purposive sampling strategy in order to capture information about leading vendor-based and internally developed EHRs. However, this strategy limits the conclusions that can be drawn from survey results and their generalizability. Second, the use of a survey to evaluate these systems is a potential source of error due to the possibility that respondents may have inadvertently (or optimistically) misrepresented features of their system. One particular potential concern is highly extensible systems that support add-ons by customers (eg, via medical logic modules or an application programming interface). When asked, we instructed vendors to answer based on decision support types that are made available to customers and not to include types that could conceivably be developed through extension or additional programming. However, it is possible that some vendors still answered affirmatively for decision support types that could theoretically be implemented in their systems, but which have not actually been developed. Third, the survey analyzed systems and their front-end CDS tools as they were designed, rather than how they might be implemented and used in a real-world setting. For vendor systems, there may be a significant gap between the tools that are possible in a given system and those that are actually implemented at a given site. Finally, this project assesses only the presence or absence of each type of CDS tool delineated in the taxonomy, but does not attempt to measure or weight the importance of the tools. Indeed, some tools might be significantly more important than others, so it is not necessarily the case that the system with the highest proportion of CDS types offers the ‘best’ CDS. A system for prioritizing and weighting CDS types would be a useful future research direction. It would also be valuable to repeat the survey of decision support content at customer sites using our taxonomy in order to gauge the validity of vendor responses and to assess the potential gap between systems as they are designed and as they are implemented in the clinical setting.

Conclusion

To assess the clinical decision support capabilities of leading commercial and internally developed EHRs, we developed a comprehensive taxonomy and survey of the types of the front-end CDS tools currently in use. We found wide variability in the decision support tools available in commercial and internally developed EHRs. As pressure to perform more advanced CDS increases, EHR developers will need to incorporate a broader range of CDS tools into their systems.

Acknowledgments

This project was made possible by the hard work and dedication of the Provider Order Entry Team (POET) at Oregon Health & Science University. We would also like to thank all participants at the Menucha Conference who were instrumental in shaping the final list of clinical decision support types: DW Bates, B Churchill, J Dulcey, R Gibson, N Greengold, R Jenders, T Payne, E Poon, and SL Pestotnik.

Footnotes

Funding: This project was supported by NLM Grant R56-LM006942.

Competing interests: None.

Provenance and peer review: Not commissioned; externally peer reviewed.

References

1. Garg AX, Adhikari NK, McDonald H, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA 2005;293:1223–38
2. Kawamoto K, Houlihan CA, Balas EA, et al. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ 2005;330:765
3. Saleem JJ, Patterson ES, Militello L, et al. Exploring barriers and facilitators to the use of computerized clinical reminders. J Am Med Inform Assoc 2005;12:438–47
4. Paterno MD, Maviglia SM, Gorman PN, et al. Tiering drug-drug interaction alerts by severity increases compliance rates. J Am Med Inform Assoc 2009;16:40–6
5. Roberts GW, Farmer CJ, Cheney PC, et al. Clinical decision support implemented with academic detailing improves prescribing of key renally cleared drugs in the hospital setting. J Am Med Inform Assoc 2010;17:308–12
6. Payne TH, Hoey PJ, Nichol P, et al. Preparation and use of preconstructed orders, order sets, and order menus in a computerized provider order entry system. J Am Med Inform Assoc 2003;10:322–9
7. Asch SM, McGlynn EA, Hogan MM, et al. Comparison of quality of care for patients in the Veterans Health Administration and patients in a national sample. Ann Intern Med 2004;141:938–45
8. Osheroff JA, Teich JM, Middleton B, et al. A roadmap for national action on clinical decision support. J Am Med Inform Assoc 2007;14:141–5
9. Bates DW, Cohen M, Leape LL, et al. Reducing the frequency of errors in medicine using information technology. J Am Med Inform Assoc 2001;8:299–308
10. McDonald CJ. Protocol-based computer reminders, the quality of care and the non-perfectability of man. N Engl J Med 1976;295:1351–5
11. Chaudhry B, Wang J, Wu S, et al. Systematic review: impact of health information technology on quality, efficiency, and costs of medical care. Ann Intern Med 2006;144:742–52
12. Wright A, Sittig DF, Ash JS, et al. Clinical decision support capabilities of commercially-available clinical information systems. J Am Med Inform Assoc 2009;16:637–44
13. Blumenthal D, Tavenner M. The "meaningful use" regulation for electronic health records. N Engl J Med 2010;363:501–4
14. Comparison of Meaningful Use Objectives Between the Proposed Rule to the Final Rule. Washington, DC: Centers for Medicare & Medicaid Services, 2010
15. Simon SR, Kaushal R, Cleary PD, et al. Physicians and electronic health records: a statewide survey. Arch Intern Med 2007;167:507–12
16. Wright A, Goldberg H, Hongsermeier T, et al. A description and functional taxonomy of rule-based decision support content at a large integrated delivery network. J Am Med Inform Assoc 2007;14:489–96
17. Wang JK, Shabot MM, Duncan RG, et al. A clinical rules taxonomy for the implementation of a computerized physician order entry (CPOE) system. Proc AMIA Symp 2002:860–3
18. Miller RA, Waitman LR, Chen S, et al. The anatomy of decision support during inpatient care provider order entry (CPOE): empirical observations from a decade of CPOE experience at Vanderbilt. J Biomed Inform 2005;38:469–85
19. Osheroff J, Pifer E, Teich J, et al. Improving Outcomes with Clinical Decision Support. Chicago, IL: HIMSS, 2005
20. Berlin A, Sorani M, Sim I. A taxonomic description of computer-based clinical decision support systems. J Biomed Inform 2006;39:656–67
21. Driving Quality and Performance Measurement: A Foundation for Clinical Decision Support. Washington, DC: National Quality Forum, 2010
22. Metzger J, Welebob E, Bates DW, et al. Mixed results in the safety performance of computerized physician order entry. Health Aff (Millwood) 2010;29:655–63
23. Classen DC, Avery AJ, Bates DW. Evaluation and certification of computerized provider order entry systems. J Am Med Inform Assoc 2007;14:48–55
24. Sittig DF, Wright A, Osheroff JA, et al. Grand challenges in clinical decision support. J Biomed Inform 2008;41:387–92
25. Wright A, Sittig DF. A framework and model for evaluating clinical decision support architectures. J Biomed Inform 2008;41:982–90
26. Chertow GM, Lee J, Kuperman GJ, et al. Guided medication dosing for inpatients with renal insufficiency. JAMA 2001;286:2839–44
27. Teich JM, Merchia PR, Schmiz JL, et al. Effects of computerized physician order entry on prescribing practices. Arch Intern Med 2000;160:2741–7
28. Kuperman GJ, Bobb A, Payne TH, et al. Medication-related clinical decision support in computerized provider order entry systems: a review. J Am Med Inform Assoc 2007;14:29–40
29. Teich JM, Schmiz JL, O'Connell EM, et al. An information system to improve the safety and efficiency of chemotherapy ordering. Proc AMIA Annu Fall Symp 1996:498–502
30. Overhage JM, Tierney WM, Zhou XH, et al. A randomized trial of "corollary orders" to prevent errors of omission. J Am Med Inform Assoc 1997;4:364–75
31. Lee J, Clay B, Zelazny Z, et al. Indication-based ordering: a new paradigm for glycemic control in hospitalized inpatients. J Diabetes Sci Technol 2008;2:349–56
32. Morris AH, Wallace CJ, Menlove RL, et al. Randomized clinical trial of pressure-controlled inverse ratio ventilation and extracorporeal CO2 removal for adult respiratory distress syndrome. Am J Respir Crit Care Med 1994;149:295–305
33. Tamblyn R, Huang A, Taylor L, et al. A randomized trial of the effectiveness of on-demand versus computer-triggered drug decision support in primary care. J Am Med Inform Assoc 2008;15:430–8
34. Hulse RK, Clark SJ, Jackson JC, et al. Computerized medication monitoring system. Am J Hosp Pharm 1976;33:1061–4
35. Isaac T, Weissman JS, Davis RB, et al. Overrides of medication alerts in ambulatory care. Arch Intern Med 2009;169:305–11
36. Haug PJ, Gardner RM, Tate KE, et al. Decision support in medicine: examples from the HELP system. Comput Biomed Res 1994;27:396–418
37. Bradshaw KE, Gardner RM, Pryor TA. Development of a computerized laboratory alerting system. Comput Biomed Res 1989;22:575–87
38. van der Sijs H, Mulder A, van Gelder T, et al. Drug safety alert generation and overriding in a large Dutch university medical centre. Pharmacoepidemiol Drug Saf 2009;18:941–7
39. Shea S, DuMouchel W, Bahamonde L. A meta-analysis of 16 randomized controlled trials to evaluate computer-based clinical reminder systems for preventive care in the ambulatory setting. J Am Med Inform Assoc 1996;3:399–409
40. Schulmeister L. Look-alike, sound-alike oncology medications. Clin J Oncol Nurs 2006;10:35–41
41. Félix L, Gebremariam C. New Care Management Event Tracking Module in iCare. Rockville, MD: IHS OIT Newsletter, 2010:7–8
42. Wright A, Chen ES, Maloney FL. An automated technique for identifying associations between medications, laboratory results and problems. J Biomed Inform 2010;43:891–901
43. Harpole LH, Khorasani R, Fiskio J, et al. Automated evidence-based critiquing of orders for abdominal radiographs: impact on utilization and appropriateness. J Am Med Inform Assoc 1997;4:511–21
44. Teich JM, Petronzio AM, Gerner JR, et al. An information system to promote intravenous-to-oral medication conversion. Proc AMIA Symp 1999:415–19
45. Evans RS, Wallace CJ, Lloyd JF, et al; CDC Prevention Epicenter Program. Rapid identification of hospitalized patients at high risk for MRSA carriage. J Am Med Inform Assoc 2008;15:506–12
46. Trygstad TK, Christensen D, Garmise J, et al. Pharmacist response to alerts generated from Medicaid pharmacy claims in a long-term care setting: results from the North Carolina polypharmacy initiative. J Manag Care Pharm 2005;11:575–83
47. Del Fiol G, Haug PJ, Cimino JJ, et al. Effectiveness of topic-specific infobuttons: a randomized controlled trial. J Am Med Inform Assoc 2008;15:752–9
48. Bates DW, Kuperman GJ, Wang S, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc 2003;10:523–30
49. Bates DW, Kuperman GJ, Jha A, et al. Does the computerized display of charges affect inpatient ancillary test utilization? Arch Intern Med 1997;157:2501–8
50. Filik R, Purdy K, Gale A, et al. Labeling of medicines and patient safety: evaluating methods of reducing drug name confusion. Hum Factors 2006;48:39–47
51. Linder JA, Schnipper JL, Tsurikova R, et al. Documentation-based clinical decision support to improve antibiotic prescribing for acute respiratory infections in primary care: a cluster randomised controlled trial. Inform Prim Care 2009;17:231–40
52. Evans RS, Larsen RA, Burke JP, et al. Computer surveillance of hospital-acquired infections and antibiotic use. JAMA 1986;256:1007–11
53. Sittig DF, Gardner RM, Morris AH, et al. Clinical evaluation of computer-based respiratory care algorithms. Int J Clin Monit Comput 1990;7:177–85
54. Barnett GO, Cimino JJ, Hupp JA, et al. DXplain. An evolving diagnostic decision-support system. JAMA 1987;258:67–74
55. Miller RA, Pople HE Jr, Myers JD. Internist-1, an experimental computer-based diagnostic consultant for general internal medicine. N Engl J Med 1982;307:468–76
56. Peiris DP, Joshi R, Webster RJ, et al. An electronic clinical decision support tool to assist primary care providers in cardiovascular disease risk management: development and mixed methods evaluation. J Med Internet Res 2009;11:e51
57. Ravdin PM, Siminoff LA, Davis GJ, et al. Computer program to assist in making decisions about adjuvant therapy for women with early breast cancer. J Clin Oncol 2001;19:980–91
58. Rothschild JM, McGurk S, Honour M, et al. Assessment of education and computerized decision support interventions for improving transfusion practice. Transfusion 2007;47:228–39
59. Lehmann CU, Conner KG, Cox JM. Preventing provider errors: online total parenteral nutrition calculator. Pediatrics 2004;113:748–53
60. Bleich HL. Computer evaluation of acid-base disorders. J Clin Invest 1969;48:1689–96
61. Mohan R, Barest G, Brewster LJ, et al. A comprehensive three-dimensional radiation treatment planning system. Int J Radiat Oncol Biol Phys 1988;15:481–95
62. North F, Varkey P. Use of the prioritization matrix to enhance triage algorithms in clinical decision support software. Am J Med Qual 2010;25:468–73
63. Mandl KD, Overhage JM, Wagner MM, et al. Implementing syndromic surveillance: a practical guide informed by the early experience. J Am Med Inform Assoc 2004;11:141–50
64. Jacobs B, Crotty E, Conway E, et al. Computerized provider order entry with pager notification improves efficiency in STAT radiographic studies and respiratory treatments. Appl Clin Inform 2010;1:19–31
65. McGlinchey EA, Wright A, Poon EG, et al. Ability to perform registry functions among practices with and without electronic health records. AMIA Annu Symp Proc 2008:1052
66. Hamann C, Poon E, Smith S, et al. Designing an electronic medication reconciliation system. AMIA Annu Symp Proc 2005:976
67. Topal J, Conklin S, Camp K, et al. Prevention of nosocomial catheter-associated urinary tract infections through computerized feedback to physicians and a nurse-directed protocol. Am J Med Qual 2005;20:121–6
68. Buising KL, Thursky KA, Robertson MB, et al. Electronic antibiotic stewardship–reduced consumption of broad-spectrum antibiotics using a computerized antimicrobial approval system in a hospital setting. J Antimicrob Chemother 2008;62:608–16
69. Levin MA, Krol M, Doshi AM, et al. Extraction and mapping of drug names from free text to a standardized nomenclature. AMIA Annu Symp Proc 2007:438–42
70. Rosenbloom ST, Denny JC, Xu H, et al. Data from clinical notes: a perspective on the tension between structure and flexible documentation. J Am Med Inform Assoc 2011;18:181–6
71. Wright A, Sittig DF, Ash JS, et al. Governance for clinical decision support: case studies and recommended practices from leading institutions. J Am Med Inform Assoc 2011;18:187–94
72. Wright A, Sittig DF. SANDS: an architecture for clinical decision support in a National Health Information Network. AMIA Annu Symp Proc 2007:816–20
73. CDS Consortium. http://www.partners.org/cird/cdsc/default.asp (accessed 2 Jan 2011).
