Author manuscript; available in PMC: 2019 May 24.
Published in final edited form as: Infect Control Hosp Epidemiol. 2016 Jul 15;37(10):1201–1211. doi: 10.1017/ice.2016.115

A Concise Set of Structure and Process Indicators to Assess and Compare Antimicrobial Stewardship Programs Among EU and US Hospitals: Results From a Multinational Expert Panel

Lori A Pollack 1, Diamantis Plachouras 2, Ronda Sinkowitz-Cochran 1, Heidi Gruhler 1, Dominique L Monnet 2, J Todd Weber 1; Transatlantic Taskforce on Antimicrobial Resistance (TATFAR) Expert Panel on Stewardship Structure and Process Indicators
PMCID: PMC6533629  NIHMSID: NIHMS1027574  PMID: 27418168

Abstract

OBJECTIVES.

To develop common indicators, relevant to both EU member states and the United States, that characterize and allow for meaningful comparison of antimicrobial stewardship programs among different countries and healthcare systems.

DESIGN.

Modified Delphi process.

PARTICIPANTS.

A multinational panel of 20 experts in antimicrobial stewardship.

METHODS.

Potential indicators were rated on the perceived feasibility of implementing and measuring each indicator and on its clinical importance for optimizing appropriate antimicrobial prescribing.

RESULTS.

The outcome was a set of 33 indicators developed to characterize the infrastructure and activities of hospital antimicrobial stewardship programs. Of these, 17 indicators were considered essential to characterize an antimicrobial stewardship program and were therefore included in a core set of indicators. The remaining 16 indicators were considered optional and included in a supplemental set.

CONCLUSIONS.

The integration of these indicators in public health surveillance and special studies will lead to a better understanding of best practices in antimicrobial stewardship. Additionally, future studies can explore the association of hospital antimicrobial stewardship programs with antimicrobial use and resistance.


Appropriate antimicrobial use is associated with improved patient outcomes and decreased risk of adverse events, including development of antimicrobial resistance and Clostridium difficile infection.1–3 To this end, many healthcare and professional organizations advocate for coordinated programs that implement activities to ensure appropriate antimicrobial prescribing.4–7 Such programs are referred to as antimicrobial stewardship programs (ASPs), or simply antibiotic stewardship. Antimicrobial stewardship is a key prevention strategy to address the global concern of increasing antimicrobial resistance.8,9 Preservation of effective antimicrobial agents is an international public health issue; therefore, implementation of ASPs in hospitals is a focus of healthcare systems and governments in many countries.5,10,11

The Transatlantic Taskforce on Antimicrobial Resistance (TATFAR) was established by a 2009 US-EU summit declaration to enhance cooperation to address antimicrobial resistance.12,13 One key focus area of this collaboration is the appropriate therapeutic use of antimicrobial drugs. Recognizing that a common way to assess ASPs is needed to understand and promote effective antimicrobial stewardship, in a 2011 report, TATFAR recommended the development of common structure and process indicators for hospital ASPs.14 Using common indicators, EU member states and the United States could meaningfully characterize and compare antimicrobial stewardship efforts among different countries and healthcare systems.

Assessments of antimicrobial stewardship practices have been conducted in EU member states15–18 and in the United States19–21 but vary in terms of focus, length, and applicability to other health systems. The objective of this effort was to learn from and build upon these previous assessments in order to create a manageable number of ASP indicators that are relevant and feasible for comparison. To meet this objective, experts in hospital antimicrobial stewardship developed a set of indicators to describe the structure and functions of ASPs across healthcare systems.

METHODS

Participants

Antimicrobial stewardship experts from the European Union and the United States were recruited to participate in a modified Delphi process. Purposive sampling was used to ensure participants (experts) had the necessary expertise and experience for the development of the indicators, knowledge about clinical and public health practice, and diversity in geography and healthcare systems. All 20 invited experts participated in the rating of indicators, including the project coordinators from the European Centre for Disease Prevention and Control (ECDC) and the US Centers for Disease Control and Prevention (CDC). The panel had multidisciplinary representation from 9 EU member states and 6 US states (Table 1 and list of members at end of text).

table 1.

Characteristics of 20 Experts Who Participated in the Modified Delphi Process to Develop Antimicrobial Stewardship Program Structure and Process Indicators

Characteristic N (%)
Current country of residence
 European Union^a 13 (65)
 United States 7 (35)
Professional training
 Medicine 15 (75)
 Pharmacy 3 (15)
 Microbiology 1 (5)
 Psychology and business administration 1 (5)
Current profession
 Clinical medicine 11 (55)
 Pharmacy 3 (15)
 Public health 6 (30)

NOTE. The experts had a mean of 17 years of experience in antimicrobial stewardship.

^a Austria, Belgium, England, France, Germany, Greece, Italy, Netherlands, Scotland (2), Slovenia, Sweden (2).

Consensus Process

A modified Delphi process using the RAND/UCLA appropriateness method was followed to build consensus using self-administered questionnaires with email exchange among the experts.22 An initial list of 53 indicators was developed from the Cochrane systematic review of interventions to improve antibiotic prescribing practices for hospital inpatients by Davey et al1 as well as previously developed structure and process indicators, antimicrobial stewardship surveys, and antimicrobial stewardship guidelines in the European Union and United States.15–21 In addition, the project coordinators conducted a comprehensive, nonsystematic review of antimicrobial stewardship literature published from 2006 to 2013.

In the first rating round, experts were asked to mark each indicator “retain” or “remove” based on its ability and feasibility to characterize ASPs. Indicators for which more than 60% of experts marked “retain” continued on to a second round. Next, the experts rated each of the structure and process indicators on a Likert scale from 1 to 9, where 1 indicated “Strongly disagree” and 9 indicated “Strongly agree,” according to 3 criteria: feasibility (ie, it would be possible to implement and measure this indicator at the facility level); clinical importance (ie, this indicator is important for optimizing the appropriateness of antimicrobial prescribing); and relevance to minimizing antimicrobial resistance (ie, this indicator is relevant to reducing the development of antimicrobial resistance). The median values of responses for feasibility and clinical importance were summed, and indicators with a combined median score of 14 or greater were retained. Based on the RAND/UCLA appropriateness method, agreement in ratings among experts was measured using an agreement score defined as the interpercentile range adjusted for symmetry minus the interpercentile range.22 A 90/10 interpercentile range was used for greater discriminating power, and agreement was defined as an agreement score greater than 0. The results were analyzed using an electronic spreadsheet (Excel; Microsoft). After the first and second rounds, members of the expert panel received feedback on the rating and agreement for each indicator, then participated in a conference call with a trained facilitator who moderated a structured discussion focusing on indicators with low agreement scores among the ratings, borderline combined median scores (12–14), and low median ratings for clinical importance. In a third and final round, experts re-rated the structure and process indicators.
The changes in median and agreement score from the second to the third round were determined for each indicator. In addition, agreement scores after the second and third rounds were compared by the Wilcoxon signed-rank test to assess consensus. The experts were also asked to indicate whether each indicator should be classified as a “core” indicator (necessary to characterize ASPs), a “supplemental” indicator (not necessary but desirable), or removed.
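The retention rule and agreement score described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the panel's actual analysis (which was performed in Excel): the IPRAS constants 2.35 and 1.5 are taken from the classic RAND/UCLA appropriateness-method parameterization and are an assumption here, since the article specifies only that a 90/10 interpercentile range was used.

```python
from statistics import median

def _percentile(values, q):
    """Linear-interpolation percentile (q in [0, 100]) on a small sample."""
    s = sorted(values)
    k = (len(s) - 1) * q / 100.0
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def retained(feasibility, importance, threshold=14):
    """Second-round retention rule: keep an indicator when the sum of its
    median feasibility and median clinical-importance ratings (each on a
    1-9 Likert scale) is 14 or greater."""
    return median(feasibility) + median(importance) >= threshold

def agreement_score(ratings):
    """Agreement score: interpercentile range adjusted for symmetry (IPRAS)
    minus the 90/10 interpercentile range (IPR); agreement is declared when
    the score is greater than 0. The IPRAS constants (2.35, 1.5) are the
    classic RAND/UCLA values and an assumption here."""
    p10 = _percentile(ratings, 10)
    p90 = _percentile(ratings, 90)
    ipr = p90 - p10
    # Asymmetry: distance of the IPR midpoint from the centre of the 1-9 scale.
    asymmetry = abs(5.0 - (p10 + p90) / 2.0)
    ipras = 2.35 + 1.5 * asymmetry
    return ipras - ipr
```

Ratings clustered at one end of the scale yield a positive score (agreement), whereas widely dispersed ratings push the interpercentile range above IPRAS and the score below zero (disagreement).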

An in-person expert consensus meeting moderated by the project coordinators took place following the third round in Stockholm, Sweden, on June 18, 2014. At this meeting, experts reviewed and discussed the indicators that at least 20% of experts marked for removal in the third round. If consensus about removal was not achieved, the indicator would be carried over to the supplemental indicators group. Next, experts examined the supplemental indicators and discussed whether any should be reclassified as core or be removed. Finally, the indicators that emerged as core and supplemental during the meeting were reexamined with attention on their utility to comprehensively describe hospital ASPs and be understood in a multinational context. Final decisions regarding classification and wording of indicators were made by consensus.

RESULTS

In March 2014, 20 invited experts (100%) responded to the first round. Of the 53 proposed structure and process indicators, 36 were retained (10 with revision) and 17 were removed (Table 2). In response to feedback that the proposed indicators focused on ASP staffing and activities (structure), but did not capture the extent of activity performance (process), 8 process indicators were added to the second round of rating (Table 3). Two separate indicators, having an infection preventionist and a hospital epidemiologist on the antimicrobial stewardship team, were merged.

table 2.

Summary of Results From the Modified Delphi Process

Status of indicators N (%)
Indicators rated in the first round 53
 Retained without revision 26 (49.0)
 Retained and revised 10 (18.9)
 Removed 17 (32.1)
 Added 8
Indicators rated in the second round 44
 Retained without revision 26 (59.1)
 Retained and revised 11 (25.0)
 Removed 6 (13.6)
 Added 1
Indicators rated in the third round and examined at the in-person expert consensus meeting^a 38
 Retained as “core” indicator without revision 13 (34.2)
 Retained as “core” indicator and revised 4 (10.5)
 Retained as “supplemental” indicator without revision 12 (31.6)
 Retained as “supplemental” indicator and revised 4 (10.5)
 Removed 6 (15.8)
 Added 1
Final indicators 33
 Core indicator 17 (51.5)
 Supplemental indicator 16 (48.5)
^a In the third round, experts were asked whether indicators should be considered “core” (ie, the indicator should be included in a set of indicators asked of ALL hospitals) or “supplemental” (ie, hospitals can include the indicator if it is of interest).

table 3.

Results From the Modified Delphi Process to Develop Structure and Process Indicators for Hospital Antimicrobial Stewardship Programs: Rating and Agreement^a on Feasibility and Clinical Importance With Final Decision on Inclusion as Core or Supplemental Indicator

Indicators, by category^b | Second round: Feasibility, Median (range), Agreement; Clinical importance, Median (range), Agreement | Third round: Feasibility, Median (range), Agreement; Clinical importance, Median (range), Agreement | Rated as Core, % | Final decision on status after in-person expert consensus meeting
Governance and management
1. Does your facility have a formally defined antimicrobial stewardship program for ensuring appropriate antimicrobial use? 9 (2–9) 1.0 8 (5–9) 4.9 9 (5–9) 5.9 8 (5–9) 6.2 79 Core
2. Does your facility have a formal reporting structure responsible for antimicrobial stewardship (eg, a multidisciplinary committee focused on appropriate antimicrobial use, pharmacy committee, patient safety committee, or other relevant structure)? 9 (4–9) 1.4 7 (5–8) 3.0 8 (5–9) 2.7 7 (5–9) 1.5 56 Core
3. Does your facility have a named senior executive officer with accountability for antimicrobial leadership? 9 (5–9) 4.2 6 (5–9) 1.4 8 (5–9) 4.8 6 (4–8) 1.6 21 Suppl.
4. Has an annual report focused on antimicrobial stewardship (summary antimicrobial use and/or practices improvement initiatives) been produced for your facility in the past year? 9 (4–9) 3.1 7 (5–9) 1.8 8 (5–9) 4.8 7 (5–9) 1.8 42 Core
5. Does your facility provide any salary support for dedicated time for antimicrobial stewardship activities (eg, percentage of full-time equivalent [FTE] staff for ensuring appropriate antimicrobial use)? 8 (1–9) 0.8 7 (5–9) 1.6 7 (3–9) 2.0 7 (5–9) 2.7 42 Core
Human resources
6. Is an antimicrobial stewardship team available at your facility? 8 (4–9) 2.8 9 (4–9) 2.8 8 (5–9) 3.1 9 (5–9) 6.6 83 Core
7. Is clinical infectious disease (ID) consultation available at your facility? 7 (3–9) 1.0 8 (5–9) 4.5 7 (3–9) 1.0 8 (5–9) 1.3 29 Suppl.
8. Is there a physician identified as a leader for stewardship activities at your facility? 8 (4–9) 4.2 7 (5–9) 2.7 8 (6–9) 4.8 8 (7–9) 4.8 89 Core
9. If YES, are stewardship duties included in the job description and/or annual review? 9 (5–9) 2.8 7 (4–9) 1.6 8 (3–9) 2.6 7 (5–9) 1.3 18 Suppl.
10. If YES, is this physician trained in infectious diseases, clinical microbiology, and/or antimicrobial stewardship? 7 (2–9) 1.4 8 (7–9) 4.9 7 (3–9) 3.3 8 (5–9) 3.1 17 Suppl.
11. Is there a pharmacist responsible for working to improve antimicrobial use at your facility? 8 (4–9) 1.4 8 (5–9) 3.1 8 (5–9) 2.6 8 (5–9) 4.8 77 Core
12. If YES, has this pharmacist had specialized training in infectious disease management or stewardship? 7 (2–9) 1.4 8 (5–9) 4.5 6 (4–9) 1.3 7 (5–9) 2.6 12 Suppl.
 Are any of the staff members below involved in stewardship activities at your facility?
13. Microbiologist (laboratory) 8 (2–9) 1.0 8 (5–9) 1.4 8 (5–9) 2.6 8 (5–9) 1.3 12 Suppl.
14. Infection preventionist or hospital epidemiologist 7 (2–9) 1.4 7 (4–9) 1.6 7 (5–9) 2.7 7 (3–9) 1.6 0 Suppl.
15. Information technology (IT) staff member 6 (2–9) 1.0 6 (4–9) 1.6 Removed in second round
16. Quality improvement staff member 7 (2–9) 0.9 6 (3–7) 1.3 Removed in second round
Laboratory
17. Does your facility produce a cumulative antimicrobial susceptibility report at least annually? 9 (6–9) 4.9 8 (5–9) 3.1 9 (5–9) 4.8 8 (5–9) 4.5 74 Core
18. If YES, has a current susceptibility report been distributed to prescribers at your facility? 9 (5–9) 4.3 8 (4–9) 1.4 8 (2–9) 1.3 7 (5–9) 2.9 23 Remove
Information technology
 Which of the following information technology (IT) systems are currently available and used in your facility:
19. IT system for prescribing (computerized order entry)? 7 (2–9) 0.7 7 (2–9) 1.4 Removed in second round
20. If YES, does the computer order entry system support clinical decision making for prescribing antimicrobial agents? 5 (2–9) 0.3 7 (2–9) 1.6 Removed in second round
21. Does your facility have the IT capability to support the needs of the antimicrobial stewardship activities? 6 (3–9) 1.4 8 (5–9) 3.3 6 (4–9) 1.4 8 (6–9) 4.8 26 Core
Policies for appropriate use
22. Does your facility have a defined formulary of antimicrobial agents? 9 (7–9) 6.6 7 (2–9) 2.8 9 (5–9) 6.2 7 (5–9) 2.7 44 Remove
23. Does your facility have a written policy that requires prescribers to document in the medical record or during order entry a dose, duration, and indication for all antimicrobial prescriptions? 8 (4–9) 1.35 8 (5–9) 1.35 7 (2–9) 1.3 8 (5–9) 4.8 50 Core
Guidelines
 Does your facility have treatment recommendations, based on national guidelines and local susceptibility, to assist with antimicrobial selection for the following common clinical conditions: Core^c
24. Surgical prophylaxis 9 (6–9) 4.9 9 (5–9) 4.9 9 (7–9) 6.2 9 (5–9) 4.8 67 Suppl.
25. Community-acquired pneumonia 9 (6–9) 4.9 8 (5–9) 4.9 8 (7–9) 4.8 8 (5–9) 4.8 50 Suppl.
26. Urinary tract infection 9 (6–9) 4.5 8 (5–9) 4.9 8 (6–9) 4.8 8 (5–9) 4.8 44 Suppl.
27. Are these treatment recommendations easily accessible to prescribers on all wards (printed “pocket guide” or electronic summaries at workstations)? 9 (5–9) 3.1 8 (5–9) 4.9 8 (2–9) 2.6 8 (5–9) 4.8 22 Suppl.
Protocols
Are any of the following actions implemented in your facility to improve antibiotic prescribing:
28. Standardized criteria for changing from intravenous to oral antimicrobial therapy in appropriate situations? 8 (5–9) 2.8 7 (4–9) 2.8 8 (5–9) 4.1 7 (5–9) 1.5 39 Suppl.
29. Dose optimization (pharmacokinetics/pharmacodynamics) to optimize the treatment of organisms with reduced susceptibility? 7 (3–9) 1.6 8 (5–9) 2.75 7 (2–9) 1.5 8 (3–9) 4.3 26 Suppl.
30. Discontinuation of specified antimicrobial prescriptions after a predefined duration? 7 (5–9) 1.4 8 (1–9) 1.35 7 (5–9) 1.5 8 (1–9) 1.3 39 Suppl.
Activities and interventions
31. Do prescribers in your facility routinely use antimicrobial ordering forms (printed or electronic)? 7 (4–9) 3.1 6 (3–8) 0.1 Removed in second round
32. Is it routine practice for specified antimicrobial agents to be approved by a physician or pharmacist prior to dispensing (preauthorization) in your facility? 7 (5–9) 3.5 8 (5–9) 3.1 7 (5–9) 3.1 8 (5–9) 4.5 39 Core
33. Is there dedicated time during which the clinical team reviews antimicrobial orders for their assigned patients (antimicrobial ward rounds)? 6 (1–9) 0.5 8 (3–9) 4.2 Removed in second round
34. Is there a formal procedure for a physician, pharmacist, or other staff member to review the appropriateness of an antimicrobial after 48 hours from the initial order (postprescription review)? 7 (3–9) 1.0 8 (5–9) 4.2 7 (3–9) 3.0 8 (5–9) 4.8 73 Core
35. Are results of antimicrobial audits or reviews provided directly to prescribers through in-person, telephone, or electronic communication? 7 (3–9) 1.3 8 (6–9) 4.9 7 (3–9) 1.8 8 (7–9) 4.8 58 Core
36. Do prescribers ever receive education about how they can improve their antimicrobial prescribing? 6 (3–9) 0.6 8 (5–9) 4.2 7 (3–9) 1.3 8 (4–9) 3.8 65 Remove
Monitoring appropriate use
37. Does your facility monitor antimicrobial use by counts of antimicrobial(s) administered to patients per day (Days of Therapy; DOT)? 6 (1–9) 0.9 7 (2–9) 1.6 8 (6–9) 4.8 8 (5–9) 3.1 90 Core
38. Does your facility monitor antimicrobial use by number of grams of antimicrobials used (Defined Daily Dose; DDD)? 8 (4–9) 4.5 7 (5–9) 3.4
39. Does your facility monitor whether the indication for treatment is recorded in the medical record? 6 (3–9) 1 7 (5–9) 1.4 6 (4–9) 1.5 7 (5–9) 1.3 28 Core
40. If YES, is the indication for treatment recorded in clinical case notes in >80% of sampled cases in your facility? 6 (3–9) 1.4 8 (5–9) 2.4 6 (1–8) 1.1 8 (5–9) 2.6 6 Remove
41. Does your facility measure the number of antimicrobial prescriptions that are consistent with the local treatment recommendations for either urinary tract infection (UTI) or community-acquired pneumonia (CAP)? 6 (3–9) 0.1 8 (5–9) 4.5 6 (4–9) 1.5 8 (5–9) 2.7 39 Suppl.
42. If YES, are antimicrobial prescriptions for UTI compliant with facility-specific guideline in >80% of sampled cases in your facility? 6 (1–9) 3.3 8 (1–9) 4.9 7 (3–8) 1.1 8 (5–9) 2.6 0 Remove
If YES, are antimicrobial prescriptions for CAP compliant with facility-specific guideline in >80% of sampled cases in your facility? Added to third round 6 (3–9) 0.2 8 (5–9) 2.6 6 Remove
43. Does your facility review surgical antimicrobial prophylaxis? 7 (3–9) 1.0 8 (5–9) 2.8 7 (3–9) 2.7 8 (5–9) 4.1 50 Core
44. If YES, are antimicrobial prescriptions for surgical antimicrobial prophylaxis compliant with facility-specific guideline in >80% (revised from >95%) of sampled cases in your facility? 7 (3–9) 1.4 8 (5–9) 2.8 7 (3–9) 0.8 8 (5–9) 2.6 29 Suppl.
^a A higher number indicates higher agreement in the ratings among the experts.

^b The numbers correspond to the number of the indicator at the start of the second round. Bolded numbers indicate revisions between the second and third rounds (strikethrough for deleted text; added text is italicized).

^c The wording of many core indicators was revised in accordance with feedback from the in-person meeting. For this indicator, 3 separate indicators on condition-specific treatment guidelines were merged into a more general phrase, “…for common clinical conditions.”

Nineteen experts (95%) rated the 44 indicators carried forward to the second round. The mean (range) scores for feasibility and clinical importance for all indicators combined were 7.5 (5–9) and 7.6 (6–9), respectively. The experts rated indicators related to having a cumulative antimicrobial susceptibility report and guidelines high for both feasibility and clinical importance; whereas those related to governance and management and policies were generally rated higher for feasibility than for clinical importance; and indicators related to activities and interventions were generally rated higher for clinical importance than for feasibility (Table 3). For relevance to minimizing antimicrobial resistance, no indicator had a median score higher than 7 and there was low agreement on the scoring of the indicators among the experts (data not shown), so rating of relevance to minimizing antimicrobial resistance was not repeated in the third round. Following the second round, 14 experts (70%) participated in a group call during which many remarked that agreement or disagreement in feasibility ratings might be related to differences in healthcare settings and systems more than to discordant expert opinion. Variation in the information technology (IT) capacity among healthcare systems (eg, technical equipment, electronic systems) led to discussion on whether to remove the IT indicators or refocus the domain in more general terms. Some experts expressed that IT indicators are drivers for improvement and should be retained to track growth in the future, even if some countries or systems may not yet have advanced IT capacity for ASPs.

After analysis of the second-round ratings and input from the expert panel call, 37 of the 44 proposed structure and process indicators were retained. An indicator that combined assessment of compliance with community-acquired pneumonia and urinary tract infection guidelines was divided into 2 separate indicators, making a total of 38 indicators advancing to the third round (Tables 2 and 3). Six indicators were removed: 2 human resources indicators (involvement of IT staff and quality improvement staff); 2 indicators related to IT (presence of an IT system for prescribing and its application to clinical decision support for antimicrobial prescribing); the indicator on routine use of antimicrobial order forms; and the indicator about dedicated time for clinical teams to review antimicrobial orders. For the proposed process indicators, the experts expressed that 95% compliance with guidelines was too stringent and recommended that the threshold change to 80%.

The response rate for the third round was 95% (19/20 experts). The mean (range) scores for feasibility and clinical importance were 7.5 (6–9) and 7.8 (6–9), respectively. The indicators with highest agreement for feasibility among experts were the identification of a defined ASP, formulary, and surgical prophylaxis guidelines; whereas the indicators with lowest agreement on feasibility were process indicators that assessed whether “>80% of sampled cases” had a documented indication or followed facility-specific guidelines. For clinical importance, the indicators on physician and pharmacist leadership, IT capability, facility-specific treatment guidelines, and postprescription review and feedback had high agreement in scoring among experts; whereas the human resources indicators not related to leadership, the discontinuation of specified antimicrobial prescriptions after a predefined duration, and the capture of indication for treatment in the medical record had the lowest agreement scores (Table 3). The mean agreement score increased from 1.8 to 2.9 for feasibility (P = .007) and from 2.9 to 3.3 for clinical importance (P = .087).

Thirteen experts (65%), with balanced representation from the European Union and United States, attended the in-person expert consensus meeting. There was consensus that indicators that at least 70% of experts rated as “core” would be core indicators, with minor revisions to clarify definitions. Most experts at the meeting recommended removal of the indicators that assessed compliance with facility-specific community-acquired pneumonia and urinary tract infection guidelines in at least 80% of sampled cases for the following reasons: there were concerns about feasibility; collection of such data would increase workload; accurate quantification would be challenging; and the indicator may not reflect appropriateness of non–guideline-concordant clinical decisions. Similarly, documentation of an indication for treatment in at least 80% of sampled cases was rejected. Some experts noted that monitoring other aspects of antimicrobial prescribing rather than attempting to quantify compliance may be a more effective use of the time of those responsible for antimicrobial stewardship. Experts also recommended removal of “a current susceptibility report has been distributed to prescribers” because this indicator does not assess the application of this information to patient care nor the ability of prescribers to interpret it; and “does your facility have a defined formulary of antimicrobial agents?” because the term formulary was found to have different interpretations, and diversity in prescribing may be considered as an approach to prevention of antimicrobial resistance.23,24 In contrast to recommending removing the assessment of compliance with community-acquired pneumonia and urinary tract infection guidelines, EU experts strongly advocated for the monitoring of surgical prophylaxis as a core indicator. 
The feasibility of this indicator is supported by current measurement in the United States through the Surgical Care Improvement Project and in the European Union by the Healthcare-Associated Infections Network, as well as longstanding experience at the national level.25–28 Therefore, review of surgical prophylaxis was reclassified from a supplemental to a core indicator, and the quantification of guideline concordance became a supplemental indicator. Experts who were not able to participate in the in-person meeting were provided all materials before the meeting and a summary. Each expert had the opportunity to give input into the final indicators and report developed after the meeting. At the conclusion, there were 17 core indicators (Tables 3 and 4) and 16 supplemental indicators (Table 3).

table 4.

Core Structure and Process Indicators for Hospital Antimicrobial Stewardship Programs (Final Set)

Core indicators, by category^a No. in Table 3^b
Infrastructure
1. Does your facility have a formal antimicrobial stewardship program accountable for ensuring appropriate antimicrobial use? 1
2. Does your facility have a formal organizational structure responsible for antimicrobial stewardship (eg, a multidisciplinary committee focused on appropriate antimicrobial use, pharmacy committee, patient safety committee, or other relevant structure)? 2
3. Is an antimicrobial stewardship team available at your facility (eg, greater than one staff member supporting clinical decisions to ensure appropriate antimicrobial use)? 6
4. Is there a physician identified as a leader for antimicrobial stewardship activities at your facility? 8
5. Is there a pharmacist responsible for ensuring appropriate antimicrobial use at your facility? 11
6. Does your facility provide any salary support for dedicated time for antimicrobial stewardship activities (eg, percentage of full-time equivalent [FTE] staff for ensuring appropriate antimicrobial use)? 5
7. Does your facility have the information technology (IT) capability to support the needs of the antimicrobial stewardship activities? 21
Policy and practice
8. Does your facility have facility-specific treatment recommendations based on local antimicrobial susceptibility to assist with antimicrobial selection for common clinical conditions? 24–26
9. Does your facility have a written policy that requires prescribers to document an indication in the medical record or during order entry for all antimicrobial prescriptions? 23
10. Is it routine practice for specified antimicrobial agents to be approved by a physician or pharmacist in your facility (eg, preauthorization)? 32
11. Is there a formal procedure for a physician, pharmacist, or other staff member to review the appropriateness of an antimicrobial at or after 48 hours from the initial order (post-prescription review)? 34
Monitoring and feedback
12. Has your facility produced a cumulative antimicrobial susceptibility report in the past year? 17
13. Does your facility monitor if the indication is captured in the medical record for all antimicrobial prescriptions? 39
14. Does your facility audit or review surgical antimicrobial prophylaxis choice and duration? 43
15. Are results of antimicrobial audits or reviews communicated directly with prescribers? 35
16. Does your facility monitor antimicrobial use by grams of antimicrobial(s) used (Defined Daily Dose [DDD]) or counts of antimicrobial(s) administered to patients per day (Days of Therapy [DOT])? 37, 38
17. Has an annual report focused on antimicrobial stewardship (summary antimicrobial use and/or practices improvement initiatives) been produced for your facility in the past year? 4
^a The wording of core indicators may have been revised in accordance with feedback from the in-person meeting.

^b These numbers are provided as a reference to better understand the changes that occurred during the Delphi process.

DISCUSSION

A multinational panel of experts in antimicrobial stewardship developed a set of indicators to characterize the infrastructure and activities of hospital ASPs using standardized methods to address the TATFAR recommendation to develop common structure and process indicators for hospital ASPs. The modified Delphi process reflected the input of experienced clinical and public health professionals with diverse perspectives and fostered an exchange of best practices to address antimicrobial resistance and appropriate use of antimicrobial agents among diverse health systems. The high ratings of feasibility and clinical importance of each indicator along with convergence of ratings between multiple rounds demonstrate that the final indicators are likely to be practical in diverse settings and meaningful to quality of care.

Themes that emerged during the expert consensus meeting were the necessity of support and accountability for antimicrobial stewardship activities. Salary support was specified to differentiate a higher level of institutional commitment than mere inclusion of antimicrobial stewardship responsibilities in job duties. Participants acknowledged the importance of multidisciplinary involvement, but given that the specific composition of teams was highly variable among healthcare systems and facilities within the same system, indicators that asked about specific staff roles were retained as supplemental rather than core indicators. Participants noted that active feedback was more effective in changing prescribing practices than passive education of prescribers; therefore, direct communication of antimicrobial audits or reviews to prescribers was deemed core and education on improving prescribing was removed. An annual report on antimicrobial stewardship and an indicator related to IT capability remained as core indicators because they were seen as “reach” goals—that is, indicators that may be ahead of the current state of ASPs in most facilities but could, in the future, differentiate ASPs and set a target for achievement.

The proposed indicators build upon similar efforts to assess hospital ASPs in France, Germany, the United Kingdom, the United States, and the European Commission–sponsored Antibiotic Strategy International.5,15–18,29,30 Experts involved in the development, implementation, and analysis of many of these ASP assessment efforts participated in this process. The TATFAR ASP indicators align with, and could be compared with, many questions in these previous assessments, provided that an acceptable balance between flexibility and consistency in translation is achieved. By design, the TATFAR core indicators comprise a smaller number of items suitable for public health surveillance or for integration with special studies, such as the European Surveillance of Antimicrobial Consumption Network coordinated by the ECDC, or point prevalence surveys of antimicrobial use conducted by the ECDC, the CDC, or large health systems. The CDC has developed a guidance document, Core Elements of Hospital Antibiotic Stewardship Programs, that aligns with the TATFAR ASP indicators, and it has incorporated antimicrobial stewardship questions based on the indicators into the 2015 National Healthcare Safety Network Annual Hospital Survey of more than 4,000 US facilities.31,32 Similarly, the ECDC plans to include questions based on the TATFAR ASP indicators in its second point prevalence survey of healthcare-associated infections and antimicrobial use in European acute care hospitals, which will take place in 2016–2017.

Most of the final indicators are structure rather than process indicators, reflecting the critical importance of staffing, baseline capacity, and support for ASPs, as well as the variability of ASP implementation across healthcare settings. A well-supported, multidisciplinary ASP infrastructure ensures that an ASP is sustainably integrated into facility practices rather than dependent on a single person. The evidence base for any individual structure or process indicator is limited and challenging to establish because antimicrobial stewardship activities are multifaceted and rely on a combination of efforts to attain results. As guidelines for ASPs are developed and adopted, we hope that metrics of appropriate antimicrobial use will also be developed as accurate process indicators of ASPs.33 Although minimizing antimicrobial resistance is a primary goal of TATFAR and was initially included as a rating criterion in this process, the ratings of indicators on this criterion were low and highly divergent. This result reflects the challenge of establishing the impact of ASPs on antimicrobial resistance.34 Other metrics of ASP success, such as antimicrobial use and patient-focused outcomes, are critical to demonstrating the impact of ASPs. Outcome indicators for ASPs were not evaluated as part of this modified Delphi process and expert consensus because standards for measuring antimicrobial use in hospital settings are the objective of a separate TATFAR recommendation.35

The modified Delphi process is a widely used, standardized method for developing healthcare quality indicators that ensures equal representation among participants and allows for input and collective consensus across diverse geographic locations. The method provides multiple opportunities for clarification and revision throughout the iterative process of rating and soliciting comments. Additionally, the expert panel group call and in-person consensus meeting contributed greatly to developing a common understanding and reaching final consensus, and they incorporated input from the 7 experts who were unable to travel to the final consensus meeting. One limitation is that individual ratings by the experts were self-reported and subject to personal perspectives and experiences. A formal systematic review was not within the scope of this project.36 The project did not include representation from countries outside Europe and the United States, owing to the scope established in the 2009 US-EU Summit declaration. The entire process was conducted in English, which may have introduced cultural bias or limited the participation of those less comfortable exchanging views in English. Although the final indicators could offer a starting point for developing international ASP indicators, their relevance and applicability to Latin American, African, Asian, and other countries are unknown.

In conclusion, a multinational group of experts, through a modified Delphi process and consensus meeting, developed core and supplemental ASP indicators deemed essential to ensuring appropriate use of antimicrobial agents in the hospital setting. The collaborative development of these structure and process indicators for TATFAR contributed to a mutual understanding of the capacity and vision for hospital ASPs in EU member states and in the United States. Implementation of these TATFAR-developed core indicators in multiple countries could contribute to a comprehensive, comparative description of the infrastructure, policies, and practices of antimicrobial stewardship internationally. These findings could, in turn, lead to an understanding of best practices of ASPs through further investigation into the relationship of different antimicrobial stewardship approaches to antimicrobial use and resistance. We believe that these indicators define clear expectations for hospital ASPs and, through piloting, implementation, and evaluation, will contribute to the understanding of ASP practices in hospitals and improve antimicrobial prescribing. A full report with a more detailed summary of the process and the results of the ratings by round is available from the TATFAR Secretariat (TATFAR_Secretariat@cdc.gov).

ACKNOWLEDGMENTS

Financial support. This work was performed for TATFAR as part of the routine work of the ECDC and the CDC. ECDC provided funding for participants to attend the Expert Consensus Meeting that took place on June 18, 2014, in Stockholm, Sweden.

Disclaimer. The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the European Centre for Disease Prevention and Control, US Centers for Disease Control and Prevention, or US Department of Veterans Affairs.

Footnotes

Potential conflicts of interest. All authors report no conflicts of interest relevant to this article.

Presented in part: European Congress of Clinical Microbiology and Infectious Diseases; April 2015; Copenhagen; and International Conference on Prevention & Infection Control; June 2015; Geneva. Also posted as part of the Update Summary of TATFAR Recommendations on the CDC TATFAR website in July 2015.

REFERENCES

1. Davey P, Brown E, Charani E, et al. Interventions to improve antibiotic prescribing practices for hospital inpatients. Cochrane Database Syst Rev 2013;4:CD003543.
2. Feazel LM, Malhotra A, Perencevich EN, Kaboli P, Diekema DJ, Schweizer ML. Effect of antibiotic stewardship programmes on Clostridium difficile incidence: a systematic review and meta-analysis. J Antimicrob Chemother 2014;69:1748–1754.
3. Kaki R, Elligsen M, Walker S, Simor A, Palmay L, Daneman N. Impact of antimicrobial stewardship in critical care: a systematic review. J Antimicrob Chemother 2011;66:1223–1230.
4. ASHP statement on the pharmacist’s role in antimicrobial stewardship and infection prevention and control. Am J Health Syst Pharm 2010;67:575–577.
5. Dellit TH, Owens RC, McGowan JE Jr, et al. Infectious Diseases Society of America and the Society for Healthcare Epidemiology of America guidelines for developing an institutional program to enhance antimicrobial stewardship. Clin Infect Dis 2007;44:159–177.
6. Fridkin S, Baggs J, Fagan R, et al. Vital signs: improving antibiotic use among hospitalized patients. MMWR Morb Mortal Wkly Rep 2014;63:194–200.
7. Ashiru-Oredope D, Sharland M, Charani E, McNulty C, Cooke J. Improving the quality of antibiotic prescribing in the NHS by developing a new antimicrobial stewardship programme: Start Smart—Then Focus. J Antimicrob Chemother 2012;67:i51–i63.
8. Canton R, Bryan J. Global antimicrobial resistance: from surveillance to stewardship. Part 2: stewardship initiatives. Expert Rev Anti Infect Ther 2012;10:1375–1377.
9. Agarwal R, Schwartz DN. Procalcitonin to guide duration of antimicrobial therapy in intensive care units: a systematic review. Clin Infect Dis 2011;53:379–387.
10. Huttner A, Harbarth S, Carlet J, et al. Antimicrobial resistance: a global view from the 2013 World Healthcare-Associated Infections Forum. Antimicrob Resist Infect Control 2013;2:31.
11. Tacconelli E, Cataldo MA, Dancer SJ, et al. ESCMID guidelines for the management of the infection control measures to reduce transmission of multidrug-resistant gram-negative bacteria in hospitalized patients. Clin Microbiol Infect 2014;20:1–55.
12. Transatlantic Taskforce on Antimicrobial Resistance (TATFAR). About TATFAR. CDC website. http://www.cdc.gov/drugresistance/tatfar/about.html. Accessed July 2, 2015.
13. The White House. US-EU Joint Declaration and Annexes. White House website. https://www.whitehouse.gov/the-press-office/us-eu-joint-declaration-and-annexes. Published November 3, 2009. Accessed August 27, 2015.
14. Transatlantic Taskforce on Antimicrobial Resistance. Recommendations for future collaboration between the US and EU. CDC website. http://www.cdc.gov/drugresistance/pdf/210911_tatfar_report_508.pdf. Published 2011. Accessed May 1, 2016.
15. Amadeo B, Dumartin C, Parneix P, Fourrier-Reglat A, Rogues AM. Relationship between antibiotic consumption and antibiotic policy: an adjusted analysis in the French healthcare system. J Antimicrob Chemother 2011;66:434–442.
16. Buyle FM, Metz-Gercek S, Mechtler R, et al. Development and validation of potential structure indicators for evaluating antimicrobial stewardship programmes in European hospitals. Eur J Clin Microbiol Infect Dis 2013;32:1161–1170.
17. Cooke J, Alexander K, Charani E, et al. Antimicrobial stewardship: an evidence-based, antimicrobial self-assessment toolkit (ASAT) for acute hospitals. J Antimicrob Chemother 2010;65:2669–2673.
18. Morris AM, Brener S, Dresser L, et al. Use of a structured panel process to define quality metrics for antimicrobial stewardship programs. Infect Control Hosp Epidemiol 2012;33:500–506.
19. Abbo L, Lo K, Sinkowitz-Cochran R, et al. Antimicrobial stewardship programs in Florida’s acute care facilities. Infect Control Hosp Epidemiol 2013;34:634–637.
20. Doron S, Nadkarni L, Lyn Price L, et al. A nationwide survey of antimicrobial stewardship practices. Clin Ther 2013;35:758–765.
21. Trivedi KK, Rosenberg J. The state of antimicrobial stewardship programs in California. Infect Control Hosp Epidemiol 2013;34:379–384.
22. Fitch K, Bernstein SJ, Aguilar MD, et al. The RAND/UCLA Appropriateness Method User’s Manual. Santa Monica, CA: RAND; 2001.
23. van Duijn PJ, Bonten MJ. Antibiotic rotation strategies to reduce antimicrobial resistance in gram-negative bacteria in European intensive care units: study protocol for a cluster-randomized crossover controlled trial. Trials 2014;15:277.
24. Martinez JA, Nicolas JM, Marco F, et al. Comparison of antimicrobial cycling and mixing strategies in two medical intensive care units. Crit Care Med 2006;34:329–336.
25. Bratzler DW, Hunt DR. The surgical infection prevention and surgical care improvement projects: national initiatives to improve outcomes for patients having surgery. Clin Infect Dis 2006;43:322–330.
26. Fry DE. Surgical site infections and the Surgical Care Improvement Project (SCIP): evolution of national quality measures. Surg Infect (Larchmt) 2008;9:579–584.
27. Zarb P, Coignard B, Griskeviciene J, et al. The European Centre for Disease Prevention and Control (ECDC) pilot point prevalence survey of healthcare-associated infections and antimicrobial use. Euro Surveill 2012;17:pii:20316.
28. Kurz X, Mertens R, Ronveaux O. Antimicrobial prophylaxis in surgery in Belgian hospitals: room for improvement. Eur J Surg 1996;162:15–21.
29. Buyle FM, Metz-Gercek S, Mechtler R, et al. Prospective multicentre feasibility study of a quality of care indicator for intravenous to oral switch therapy with highly bioavailable antibiotics. J Antimicrob Chemother 2012;67:2043–2046.
30. Thern J, de With K, Strauss R, Steib-Bauert M, Weber N, Kern WV. Selection of hospital antimicrobial prescribing quality indicators: a consensus among German antibiotic stewardship (ABS) networkers. Infection 2014;42:351–362.
31. Centers for Disease Control and Prevention. National Healthcare Safety Network. Patient safety component—annual hospital survey. CDC website. http://www.cdc.gov/nhsn/forms/57.103_PSHospSurv_BLANK.pdf. Published 2014. Accessed May 14, 2015.
32. Centers for Disease Control and Prevention. Core elements of hospital antibiotic stewardship programs. CDC website. http://www.cdc.gov/getsmart/healthcare/implementation/core-elements.html. Published 2014. Accessed March 25, 2014.
33. van den Bosch CM, Geerlings SE, Natsch S, Prins JM, Hulscher ME. Quality indicators to measure appropriate antibiotic use in hospitalized adults. Clin Infect Dis 2015;60:281–291.
34. Schulz LT, Fox BC, Polk RE. Can the antibiogram be used to assess microbiologic outcomes after antimicrobial stewardship interventions? A critical review of the literature. Pharmacotherapy 2012;32:668–676.
35. Transatlantic Taskforce on Antimicrobial Resistance. Recommendations for future collaboration between the US and EU: progress report. CDC website. http://www.cdc.gov/drugresistance/tatfar/report.html. Published 2014. Accessed May 1, 2016.
36. Jackson N, Waters E. Criteria for the systematic review of health promotion and public health interventions. Health Promot Int 2005;20:367–374.