Abstract
Background:
Inappropriate C. difficile testing has adverse consequences for the patient, hospital, and public health. Computerized Clinical Decision Support (CCDS) in the Electronic Health Record (EHR) may reduce C. difficile test ordering; however, the effectiveness of different approaches, their ease of use, and how well they fit into healthcare providers’ (HCP) workflow are not well understood.
Methods
Nine academic and six community U.S. hospitals participated in this 2-year cohort study. CCDS (hard or soft stop) were triggered when a duplicate C. difficile test order was attempted or when laxatives had recently been received. The primary outcome was the difference in testing rates pre- and post-CCDS intervention, assessed using incidence rate ratios (IRR) from mixed-effects Poisson regression models. We performed a qualitative evaluation (contextual inquiry, interviews, focus groups) based on a human factors model. We identified themes using a codebook with primary- and sub-nodes.
Results
In the 9 hospitals implementing hard-stop CCDS and 4 hospitals implementing soft-stop CCDS, C. difficile testing incidence rates were reduced by 33% (95% CI, 30–36%) and 23% (95% CI, 21–25%), respectively. Two hospitals implemented a non-EHR-based human intervention, with an incidence rate reduction of 21% (95% CI, 15–28%). HCPs reported generally favorable experiences and highlighted time efficiencies such as inclusion of the patient’s most recent laxative administration on the CCDS. Organizational factors, including hierarchical cultures and communication between HCPs caring for the same patient, affected CCDS acceptance and integration.
Conclusions
CCDS reduced unnecessary C. difficile testing and were perceived positively by HCPs when integrated into their workflow and when they displayed the relevant patient-specific information needed for decision-making.
Summary:
This 15-hospital, 2-year, mixed-methods cohort study found reduced C. difficile test incidence rates after Computerized Clinical Decision Support (CCDS) implementation. Healthcare providers’ perceptions were positive when the CCDS was integrated into their workflow and provided relevant patient-specific information.
Background
Clostridioides difficile infection (CDI), the most common preventable hospital-acquired infection (HAI), affected 223,900 people in 2017, with at least 12,800 deaths and approximately $1 billion in attributable health care costs. [1]
The necessity of C. difficile testing stewardship is increasingly recognized because incorrect identification of CDI has adverse consequences for the patient, hospital, and public health. [2–4] In the U.S., highly sensitive testing methods such as nucleic acid amplification testing (NAAT) have likely contributed to overestimation of CDI rates. NAAT detects a gene encoding the toxin; thus, a positive test may indicate the presence of C. difficile in which the toxin gene is not being expressed. Over-diagnosis of CDI can occur in patients who are colonized with C. difficile and undergo NAAT. This typically occurs when patients with a low pre-test probability of CDI, such as those without significant diarrhea or with loose stools due to concomitant laxative use, are tested for CDI and have a positive NAAT reflecting a colonized state rather than infection. [5–7]
Computerized clinical decision support (CCDS) systems generate evidence-based guideline prompts within the EHR at the point of care and are known to significantly reduce inappropriate testing for C. difficile, with little described to date regarding negative consequences of CCDS. [8] Best Practice Advisories (BPA) are common prompts; with a BPA “hard stop” the HCP is required to obtain approval to continue the order, whereas with a BPA “soft stop” the HCP can continue the order without seeking approval. [9] A previous study at three Johns Hopkins Health System hospitals found substantial improvement in CDI testing after CCDS introduction; however, there were differences in CCDS uptake by provider and alert type. [10] We performed a multi-method study using a human factors framework to assess C. difficile testing frequency, user experience, and suggested improvements after introduction of CCDS diagnostic stewardship interventions.
METHODS
Setting and Design
During this 2-year study, we recruited 15 hospitals (9 academic, 6 community) affiliated with the CDC Prevention Epicenter program sites to participate in a CCDS-based diagnostic stewardship intervention and evaluation study. [11] We assembled an interdisciplinary study team including infectious disease physician researchers (hospital epidemiologists, antimicrobial stewards), human factors and biomedical engineers, and clinical informaticians, who participated in monthly calls with the sites to share experiences and lessons learned. We shared EHR-based tools (a booklet and a webinar (Epic XGM)) detailing the hard stop build and implementation, with technical assistance as needed. Hospital teams built their own tools incorporating these discussions and guidance. The study was approved by the Johns Hopkins Medicine Institutional Review Board.
Quantitative Data Collection and Analysis
The primary outcome was the monthly number of C. difficile test orders processed. Secondary outcomes included the quarterly CDC National Healthcare Safety Network (NHSN)-defined hospital-onset C. difficile infection rate and prescription of oral vancomycin or fidaxomicin, antibiotics specifically used to treat CDI. [12] Each site shared data via participation in a Prevention Epicenter NHSN user group, through which the central study team’s NHSN administrator extracted data for each site, including hospital characteristics, C. difficile test type used, monthly/quarterly in-patient days, and number of hospital-onset C. difficile events. Sites submitted days of oral antibiotic therapy.
For each hospital, we compared outcome measures pre- and post-CCDS intervention. Hospitals had different intervention start dates; therefore, pre- and post-CCDS periods were based on relative, rather than absolute, time. We reported rate differences and rate ratios and tested the differences using Student’s t tests and Poisson regression models, respectively. To evaluate the overall impact of the CCDS across multiple Epicenters, we constructed mixed-effects Poisson regression models that regressed monthly or quarterly outcome measures on intervention group (i.e., no alert, soft-stop BPA, hard-stop BPA, or human interaction BPA) and covariates of NHSN-predicted C. difficile counts and C. difficile testing strategies.
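To make the model structure explicit, the mixed-effects Poisson regression described above can be written in the following general form (a sketch only; the exact covariate coding and parameterization may have differed):

```latex
\log \mathbb{E}[Y_{it}] = \log(\mathrm{PatientDays}_{it}) + \beta_0
  + \beta_1\,\mathrm{SoftStop}_{it} + \beta_2\,\mathrm{HardStop}_{it} + \beta_3\,\mathrm{HumanBPA}_{it}
  + \boldsymbol{\gamma}^{\top}\mathbf{x}_{it} + u_i, \qquad u_i \sim \mathcal{N}(0,\sigma_u^2)
```

where $Y_{it}$ is the monthly C. difficile test count for hospital $i$ in period $t$, $\mathbf{x}_{it}$ contains the NHSN-predicted C. difficile count and testing-strategy covariates, $u_i$ is a hospital-level random intercept, and $e^{\beta_k}$ is the incidence rate ratio for each intervention relative to no alert.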
Qualitative Data Collection and Analysis
We used the Systems Engineering Initiative for Patient Safety (SEIPS) model as an overarching human factors conceptual framework. [13] This framework captures key factors across five “work system” domains: 1) people; 2) tools and technologies; 3) environment; 4) organization; and 5) tasks, to elucidate intended and unintended consequences of CCDS introduction.
We used three complementary qualitative methods to assess CCDS user experience and interaction: 1) contextual inquiries (CIs), which informed 2) focus groups (FGs); and 3) semi-structured key informant interviews (KIIs). CIs and FGs were conducted at one academic and two community hospitals. KIIs were conducted at five academic hospitals. For the CIs, when a BPA was bypassed, a human factors engineer discussed, in real time, the decision-making process and influencing circumstances with the HCP. The FGs included between two and ten multidisciplinary HCPs (physicians, nurses, and advanced practitioners). KIIs were conducted virtually with convenience samples of HCPs; the hospital epidemiologist typically asked clinical directors or nurse managers to identify participants who might be willing to be interviewed. FG and KII guides were adapted for each site’s specific CCDS intervention, informed by findings from earlier CIs. The goals of the FGs and KIIs were to 1) evaluate HCPs’ perceptions and opinions regarding CCDS utility, ease of use, and workflow fit; and 2) identify and rank possible CCDS enhancements to improve the end-user experience. FGs and KIIs were recorded, transcribed, and imported into NVivo 12. Using the SEIPS model, key themes were identified, and a codebook with primary- and sub-nodes was developed. Disagreements on themes and coding inconsistencies were adjudicated through discussion and team consensus. Three authors independently double-coded three transcripts until achieving a kappa score of 0.8, indicating excellent inter-rater reliability. [14]
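For reference, the kappa statistic adjusts raw coder agreement for agreement expected by chance:

```latex
\kappa = \frac{p_o - p_e}{1 - p_e}
```

where $p_o$ is the observed proportion of agreement between coders and $p_e$ is the proportion of agreement expected by chance; values of approximately 0.8 and above are conventionally interpreted as excellent agreement.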
RESULTS
Nine academic and six community hospitals participated in the study; 9 implemented hard stops, 4 implemented soft stops, and 2 hospitals, due to pending EHR transitions, employed a non-CCDS approach in which the stewardship team discussed testing directly with the primary team, termed the human interaction best practice advisory (HBPA) (Table 1). CCDS alerts were activated by at least one of the following criteria: 1) laxatives administered in the past 24 or 48 hours; 2) a negative or positive C. difficile test result reported in the past 7 or 14–28 days, respectively. A sketch of this trigger logic is shown after Table 1.
Table 1.
CCDS type and Operational Details for Each Participating Hospital
| CCDS type | Approver/Interactor Role | Operational Details | Hospital type, # |
|---|---|---|---|
| Hard stop | Pathology resident | Provider must call the pathology resident (phone # available on alert) or the order will auto-expire after 5 days, as outlined in an “acknowledgement” field. | 1 Academic |
| Hard stop | Microbiology technician | Provider calls the microbiology laboratory for a code which must be entered into the EHR. | 1 Academic, 2 Community |
| Hard stop | Attending physician | Order must be co-signed by an attending physician. If the ordering provider is an attending, co-signature of another attending is required. | 2 Academic, 3 Community |
| Soft stop | Not applicable | Provider can continue the order and may choose to provide a reason when overriding the alert. | 1 Academic |
| Soft stop | Not applicable | Provider can continue the order, but if a previous test was performed within the past 7 days the order is canceled unless the microbiology lab is contacted. | 3 Academic |
| Human interaction | Antimicrobial stewardship (AMS) team | Inappropriate C. difficile testing indication triggers AMS to encourage the provider to cancel the test. If, after discussion, the test is deemed appropriate, or consensus is not reached, AMS contacts the microbiology lab to process the order. | 1 Academic, 1 Community |
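As an illustration of the alert criteria described above, the trigger logic can be sketched as follows. This is hypothetical code, not taken from any site’s EHR build; the lookback windows were site-configurable (24 vs. 48 hours for laxatives, 14–28 days for a prior positive test), and the values below are illustrative.

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative, site-configurable lookback windows (see criteria in the text).
LAXATIVE_LOOKBACK_HOURS = 48        # 24 or 48 hours, depending on site
NEGATIVE_TEST_LOOKBACK_DAYS = 7     # prior negative C. difficile test
POSITIVE_TEST_LOOKBACK_DAYS = 28    # 14-28 days, depending on site


def cdiff_alert_triggered(
    order_time: datetime,
    last_laxative: Optional[datetime] = None,
    last_negative_test: Optional[datetime] = None,
    last_positive_test: Optional[datetime] = None,
) -> bool:
    """Return True if a C. difficile test order should fire the CCDS alert."""
    if last_laxative and order_time - last_laxative <= timedelta(hours=LAXATIVE_LOOKBACK_HOURS):
        return True  # laxatives administered within the lookback window
    if last_negative_test and order_time - last_negative_test <= timedelta(days=NEGATIVE_TEST_LOOKBACK_DAYS):
        return True  # recent negative test (potential duplicate order)
    if last_positive_test and order_time - last_positive_test <= timedelta(days=POSITIVE_TEST_LOOKBACK_DAYS):
        return True  # recent positive test (repeat testing discouraged)
    return False


# Example: an order placed 12 hours after a laxative dose would trigger the alert.
now = datetime(2020, 1, 2, 12, 0)
print(cdiff_alert_triggered(now, last_laxative=datetime(2020, 1, 2, 0, 0)))  # True
```

Whether a triggered alert behaved as a hard stop, soft stop, or human interaction BPA depended on the site-specific operational details in Table 1.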
The C. difficile testing order rate (# tests ordered per 1,000 patient days) decreased from 13.1 to 9.9 at the hard stop sites, 10.5 to 7.6 at soft stop sites, and 12.6 to 9.9 at human interaction BPA sites (Table 2). After adjustment for NHSN-predicted C. difficile counts and C. difficile testing strategies, compared to no CCDS at baseline, hard stop sites had a C. difficile testing IR reduction of 33% (IRR=0.67, 95% CI=0.64–0.70), soft stop sites had an IR reduction of 23% (IRR=0.77, 95% CI=0.75–0.79), and HBPA sites had an IR reduction of 21% (IRR=0.79, 95% CI=0.72–0.86) (Table 3).
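For clarity, the percent reductions reported here follow directly from the incidence rate ratios; an unadjusted illustration using the soft stop rates above:

```latex
\mathrm{IRR}_{\text{crude}} = \frac{\text{rate}_{\text{post}}}{\text{rate}_{\text{pre}}} = \frac{7.6}{10.5} \approx 0.72,
\qquad \text{reduction} = (1-\mathrm{IRR}) \times 100\% \approx 28\%
```

The adjusted estimate for soft stops (IRR 0.77, a 23% reduction) differs from this crude value because the model accounts for NHSN-predicted C. difficile counts and testing strategy.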
Table 2.
C. difficile Testing Rates Pre- and Post-CCDS (Hard stop, Soft stop and Human interaction) Best Practice Advisories.
| Prevention Epicenter, Hospital Type | Pre-intervention C. diff Test Order Rate (tests per 1,000 patient days) | Post-intervention C. diff Test Order Rate (tests per 1,000 patient days) | Incidence Rate Ratio (95% CI), p value^a | Incidence Rate Reduction (95% CI), p value^b |
|---|---|---|---|---|
| All hard-stops^c | 13.1 | 9.1 | | |
| A, Acad | 13.2 | 9.2 | 0.70, (0.55, 0.88), p=0.003 | −4.01, (−4.91, −3.11), P <0.001 |
| B, Acad #1 | 12.8 | 9.4 | 0.73, (0.59, 0.90), p=0.003 | −3.48, (−4.27, −2.68), P<0.001 |
| B, Acad #2 | 12.7 | 7.2 | 0.55, (0.44, 0.69), p<0.001 | −5.81, (−7.83, −3.80), P <0.001 |
| B, Comm #1 | | 29.3 | | |
| B, Comm #2 | | 13.3 | | |
| B, Comm #3 | | 17.1 | | |
| All soft stops | 10.5 | 7.6 | ||
| C, Acad #1 | 12.0 | 9.0 | 0.75, (0.62, 0.91), p=0.003 | −3.00, (−4.04, −1.97), P<0.001 |
| C, Acad #2 | 7.5 | 4.9 | 0.67, (0.53, 0.86), p=0.001 | −2.46, (−3.85, −1.08), P=0.001 |
| C, Acad # 3 | 9.0 | 7.7 | 0.85, (0.66, 1.09), p=0.194 | −1.35, (−2.35, −0.34), P=0.010 |
| D, Acad | 14.0 | 8.1 | 0.59, (0.48, 0.72), p<0.001 | −5.75, (−7.70, −3.81), P<0.001 |
| All human-based intervention | 12.6 | 9.9 | ||
| F, Acad | 14.2 | 11.2 | 0.79, (0.59, 1.05), p=0.105 | −3.00, (−4.33, −1.67), P<0.001 |
| F, Comm | 4.8 | 3.8 | 0.78, (0.48, 1.27), p=0.315 | −1.10, (−2.95, 0.76), P=0.233 |
^a Poisson regression.
^b Student’s t test.
^c Does not include 3 community hospitals with no available baseline data.
Table 3.
Effects of Interventions on C. difficile Testing Rates^a
| Comparison | IRR | 95% CI | P value |
|---|---|---|---|
| Effect of each intervention compared to no alert | |||
| Soft stop vs. no alert | 0.77 | (0.75 − 0.79) | <0.001 |
| Hard stop vs. no alert | 0.67 | (0.64 − 0.70) | <0.001 |
| Human BPA vs. no alert | 0.79 | (0.72 − 0.86) | <0.001 |
| Comparison of interventions to each other | |||
| Hard stop vs. soft stop | 0.88 | (0.84 − 0.91) | <0.001 |
| Human BPA vs. soft stop | 1.03 | (0.94 − 1.13) | 0.559 |
| Hard stop vs. human BPA | 0.85 | (0.77 − 0.94) | 0.002 |
^a Mixed Poisson regression with random effects at the hospital level and an offset of patient days, adjusted for C. difficile test type.
Post-intervention, C. difficile NHSN Lab ID event rates (C. difficile infection rates, defined by a positive test) decreased relative to the pre-intervention periods for all alert types when compared with no alert. Comparisons of the interventions showed that the human BPA had outcomes similar to the soft stop, and the hard stop reduced C. difficile Lab ID events by a further 44% when compared with the soft stop (IRR 0.56, 95% CI 0.48–0.66) (Table 4). C. difficile treatment antibiotic use did not increase in any post-intervention group, and the human BPA was associated with the greatest reduction in antibiotic use.
Table 4.
Effects of Interventions on NHSN C. difficile Lab ID Events and Antibiotic Use Rates
| | NHSN C. difficile infection^a | | | C. difficile treatment antibiotic use^b | | |
|---|---|---|---|---|---|---|
| | IRR | 95% CI | P value | IRR | 95% CI | P value |
| Effect of each intervention compared to no alert | ||||||
| Soft stop vs. no alert | 0.69 | (0.63 − 0.76) | <0.001 | 0.85 | (0.83 − 0.87) | <0.001 |
| Hard stop vs. no alert | 0.42 | (0.35 − 0.50) | <0.001 | 0.77 | (0.74 − 0.80) | <0.001 |
| Human BPA vs. no alert | 0.69 | (0.55 − 0.87) | 0.002 | 0.73 | (0.70 − 0.77) | <0.001 |
| Comparison of interventions to each other | ||||||
| Hard stop vs. soft stop | 0.56 | (0.48 − 0.66) | <0.001 | 0.90 | (0.86 − 0.94) | <0.001 |
| Human BPA vs. soft stop | 0.98 | (0.77 − 1.26) | 0.903 | 0.83 | (0.79 − 0.88) | <0.001 |
| Hard stop vs. human BPA | 0.57 | (0.43 − 0.76) | <0.001 | 0.97 | (0.92 − 1.03) | 0.351 |
^a Mixed Poisson regression with random effects at the hospital level and an offset of patient days, adjusted for the NHSN-predicted C. difficile count (for C. difficile infection) and for C. difficile test type.
^b Oral vancomycin or fidaxomicin.
Qualitative Findings
To evaluate user experiences, 47 HCPs, including attending physicians/hospitalists (n=24), residents (n=5), nurse practitioners (NP, n=12), physician assistants (PA, n=5), and infection prevention practitioners (n=1), participated in six FGs and 15 KIIs across five academic hospitals and two community hospitals. Common themes that emerged related to tools and technologies, tasks, people, and organization are shown in Table 5. Participant-identified opportunities for CCDS improvement included EHR alert modifications to remove unnecessary text (Table 6).
Table 5.
Key Systems Engineering Initiative of Patient Safety (SEIPS) Model Work System Themes and Illustrative Examples
| Work System Element | Theme | Illustrative Quotation |
|---|---|---|
| Tools and Technology | Alerts: positive perceptions | “It’s become part of the normal process. I actually appreciate that it’s there because in the past, we just send C Diff, you know just not paying attention and over ordering.” —Nurse Practitioner, Hospital B |
| | Alert type: perception of hard stops | “..alerts are great. They remind you to look at things that you don’t always look up.” —Physician Assistant, Hospital C |
| | Human Computer Interface (Ease of use of Alert) | “I don’t find the alert to be disruptive because I like the idea of practicing appropriately. I don’t want to be that provider that’s doing inappropriate lab testing …. I think that it’s really valuable.” —Nurse Practitioner, Hospital D |
| Tasks | Complex Clinical Decision Making | “Clinical judgment of patient appearance is almost always the driver for me because otherwise, what am I doing, we can’t have an algorithm do everything in the hospital.” —Nurse Practitioner, Hospital D |
| | | “..If I find out that there’s a clear reason like laxative use, I’m going to say no, the BPA is appropriate. Let’s hold off.” —Resident, Hospital A |
| | Workflow and communications | “I’ve got to say we don’t love alerts because they do interrupt, you’re already clicking a lot… but you feel, this is probably a good opportunity to make sure this is appropriate.” —Attending Physician, Hospital D |
| | | “I don’t think it’s a significant stall in my workflow... I think that the BPA fits nicely into the workflow. It might delay my testing or… completing the workup for the patient a little bit. But I think it’s appropriately so.” —Physician Assistant, Hospital D |
| People | Attitudes regarding ordering autonomy | “… I should be able to order that if I think it’s indicated without needing further approval.” —Nurse Practitioner, Hospital D |
| | | [If providers were presented with a hard stop] … “they would really think hard and fast, even hard and long before they order it.” —Nurse Practitioner, Hospital C |
| Organization | Different models of patient care continuity | “In an academic center you are more likely to have continuity of care. Here, when you go off at 5:00 PM, the overnight provider knows nothing about the patient...if you’ve already tested for C. diff or if they’ve been on laxatives.” —Attending Physician, Hospital E |
| | Organizational cultures may be hierarchical | “This sort of thing can creep up in academic medicine where you’ve got the thought process of an intern not wanting to disappoint the attending, not thinking that this is an important enough issue to bug the attending about,….” —Attending Physician, Hospital E |
| | | “Usually someone more senior on the team says, ‘Let’s send a C. diff’, and then we say, ‘Okay’. Yeah, I feel like if my attending told me to order it, then regardless I would order it.” —Resident, Hospital E |
Table 6.
Commonly Suggested Improvements to CCDS From Focus Groups and Key Informant Interviews
| 1) Modifying the EHR alert to remove unnecessary text |
| 2) Adding an option to discontinue the laxative directly from the alert screen |
| 3) Adding different pathways for complex cases or specific patient populations |
| 4) Including additional relevant information such as date and time of last laxative administration or date of last positive or negative C. difficile test |
| 5) Integration of a clinical probability tool such as a visual aid to assess the likelihood that the patient does/does not have CDI |
| 6) Modifying the alert to include reason or justification for overriding the alert |
| 7) Additional HCP training on C. difficile testing best practices |
Tools and Technologies.
Participants from all sites reported positive user experiences and perceptions of their respective CCDS. Participants appreciated time-efficient supports, such as the display of patient-specific laxative administration information, which eliminated the need to search the EHR for that information and occasionally provided information of which the HCPs were previously unaware.
Several participants raised considerations regarding CCDS ease of use, including the volume of information in the CCDS, and cautioned that the alert may be disregarded entirely if it contains too much information. Many HCPs were generally familiar with other CCDS, which facilitated the use of the C. difficile hard and soft stop CCDS alerts. HCPs noted being more receptive to following CCDS when educated on appropriate testing recommendations. HCPs at sites with similar alerts (hard stops or soft stops) had similar impressions of user experience and perceived effectiveness. HCPs with hard stops found their own institutions’ CCDS helpful but perceived that soft stops would be ineffective. HCPs at soft stop sites perceived hard stops, such as those requiring another provider’s co-signature, as potentially intrusive and disruptive to workflow, and possibly deterring or delaying necessary tests, but they recognized hard stops could be effective in preventing inappropriate C. difficile test ordering.
Tasks.
HCPs noted that the CCDS did not always relay the complete clinical information deemed necessary to inform the decision to test, including 1) the patient’s clinical presentation (e.g., higher stool frequency than typically caused by laxatives, fever, abdominal pain, leukocytosis); and 2) patient-specific CDI risk factors such as inflammatory bowel disease, recent surgery, and antibiotic use. However, the HCPs highlighted that the CCDS signaled them to “stop and think” about test appropriateness. Most HCPs did not find this disruptive to workflow, and those who did still valued the evidence-based practice guidance the CCDS provided.
People.
We did not find a consistent CCDS reaction by provider type across different hospitals. At soft stop sites, attending physicians, but not nurse practitioners, viewed the CCDS as effectively encouraging appropriate testing. Some HCPs at soft stop sites viewed testing as an autonomous decision by the ordering provider. At hard stop sites, physicians expressed concern regarding soft stop alert effectiveness, highlighting the risk of lack of HCP accountability. However, nurses at a hard stop site thought soft stops would prevent C. difficile order approval delays, which occurred when an attending co-signature was required to place the order.
Organization.
Organizational factors, such as hierarchical culture, may impede or facilitate CCDS alert acceptance. HCPs in community hospitals, where patient care is provided by one attending physician, perceived that CCDS-prompted test cancellation was common and decided autonomously by that attending. Physicians at academic hospitals described a hierarchical culture where, if a C. difficile test was recommended by an attending or fellow, the resident placing the order may not feel empowered to reassess the need for the test following a CCDS alert, even if the CCDS information was not known to the attending or fellow at the time of the ordering decision. Community hospital attending physicians perceived that the hard stop (particularly the attending co-signature requirement) did not impede workflow, because if they thought the test was still indicated they were comfortable reaching out to a peer attending for a co-signature.
DISCUSSION
This 15-hospital study found an overall 25% reduction in C. difficile test ordering rates after implementation of CCDS in the electronic health record. Hard stops (the HCP is required to obtain approval to continue the order) were associated with a significantly larger reduction in testing frequency compared with soft stops (the HCP can continue the order without seeking approval). We also found that post-intervention C. difficile NHSN Lab ID event rates (C. difficile infection rates, defined by a positive test) decreased relative to the pre-intervention periods for all alert types when compared with no alert, with hard stops reducing C. difficile Lab ID events by a further 44% when compared with soft stops (IRR 0.56, 95% CI 0.48–0.66). During this period there was a national reduction in NHSN Lab ID event rates, with the standardized infection ratio decreasing from 0.71 in 2018 to 0.58 in 2019, but the reduction seen in our participating hospitals was greater than that seen nationwide.
HCPs’ experiences with the C. difficile diagnostic stewardship CCDS were generally favorable. HCPs found the alerts helpful and informative, prompting reassessment of test necessity and thereby reducing unnecessary tests. Most HCPs did not report alert fatigue or workflow disruption, suggesting that the CCDS design added value and supported decision-making. Our results demonstrate that CCDS are an effective tool for promoting responsible C. difficile testing practices, consistent with a 2020 meta-analysis of 122 trials of CCDS interventions which reported a 6% increase in patients receiving the desired care [15].
In this study, CCDS were similarly accepted at sites with hard or soft stops. However, HCPs at sites with soft stops perceived that hard stops would engender HCP resistance and delay necessary testing. Interestingly, sites with existing hard stops did not report associated fatigue or frustration. Our finding of high acceptability of hard stops may reflect the design effectiveness of the specific hard stops we evaluated. Other studies have found an association of hard stops with HCP fatigue and frustration, and this possibility may deter sites with soft stops, or with no CCDS, from implementing a hard stop. However, our real-world findings of HCP acceptance of hard stops, coupled with a greater reduction in C. difficile test orders, support hospitals pursuing hard stop implementation.
Our findings suggest CCDS are positively perceived by HCPs when integrated into their workflow and when they provide the right amount of relevant patient-specific information needed for decision-making [16]. Alert fatigue describes how, due to a high frequency of often inconsequential alerts, HCPs become desensitized and, as a result, ignore or fail to respond appropriately to a critical warning. [17,18] Alert fatigue is recognized as a major unintended consequence of the EHR, which may contribute to HCP dissatisfaction and feelings of burnout. [15] The potential for alert fatigue may be avoided by focusing on workflow integration and not including unnecessary information in the CCDS. There is a need for further research on optimal CCDS design to guide patient care improvements while balancing the threat of alert fatigue. Participants in this study responded favorably to incorporation into the CCDS of simple notations such as the date, dose, and type of laxative last given, and the date of the patient’s last positive or negative C. difficile test. [19] CCDS interventions that consider the interacting work system elements, such as people, tools and technologies, and tasks, may avoid unintended negative consequences and increase the odds of practice behavior change.
We explored with HCPs CCDS modifications that could improve the process and identified several recommendations. One suggestion was to allow inclusion of a reason or justification for not following the alert; there is evidence that requiring this when an alert is not followed may be associated with greater improvements in targeted processes of care [15]. Multiple KII and FG participants highlighted that CCDS with capabilities to provide specific relevant patient information, such as the date and time of last laxative administration or the date of the last positive or negative C. difficile test, could significantly improve the process. CCDS reminders to HCPs of existing orders, when ordering a potentially duplicative or redundant test, have significantly reduced the rates of redundant test orders placed [9,20]. A frequent suggestion for workflow improvement was easy facilitation of laxative discontinuation by integrating the option to cancel into the CCDS, allowing HCPs to observe the patient’s stool frequency and re-evaluate the need for C. difficile testing. Providing prompts, or facilitating HCP action, during the ordering process may improve CCDS acceptance and effectiveness. This premise is supported by a recent meta-analysis which found that the ability to execute the desired action directly through the CCDS was associated with greater improvements in processes of care, compared with CCDS without this feature [15].
Our findings suggest organizational culture is important in HCPs’ acceptance of CCDS alerts in supporting appropriate clinical decision-making [21]. All HCPs reported that responsible testing practices were encouraged at their hospitals, and that the CCDS had already influenced, or had the potential to influence, their testing practices by increasing awareness of laxative administration or prior C. difficile testing and preventing an inappropriate order from being placed. FG participants perceived two important differences between academic and community hospitals: academic hospitals had better structures and processes for continuity of care (e.g., HCP sign-outs and hand-offs), but a hierarchical culture between HCPs at different ranks. This implies that community hospitals may need to encourage more care coordination and HCP-to-HCP information transfer, whereas academic hospitals may need training to educate and empower HCPs of different ranks. These topics warrant deeper exploration in studies with more hospitals represented.
LIMITATIONS
Our study has some limitations. First, due to the SARS-CoV-2 global pandemic, the study team was unable to travel for in-person site visits. However, we conducted in-person CIs and FGs pre-pandemic, and our subsequent virtual interviews allowed for significant interaction with the interviewees. Second, recruitment was voluntary, which could have introduced bias, and participants were identified indirectly by the hospital epidemiologist, who may have preferentially approached HCPs who supported or liked the CCDS. We conducted the qualitative evaluation at five academic hospitals, and these findings are likely generalizable to other academic hospitals with similar CCDS. However, we quantitatively and qualitatively evaluated the CCDS at only two community hospitals, and there may be geographical or cultural influences we did not capture. Lastly, although there were no reported adverse events related to the intervention, we did not review each case and therefore do not know whether all of the testing reduction was appropriate.
CONCLUSION
This study provides novel insight into CCDS for diagnostic stewardship interventions and how to maximize their effectiveness. A systems approach based on human factors engineering provides a framework of work system elements to support the development of effective CCDS interventions.
FUNDING
This work was supported by the Centers for Disease Control and Prevention’s Prevention Epicenter Program [grant number 6 U01CK000554-02-02].
POTENTIAL CONFLICTS OF INTEREST
DJD reports grant to institution for clinical trial of new susceptibility test systems from bioMerieux, Inc. outside of the conduct of the study, payment for consulting on novel diagnostics from OpGen, Inc., and payment for consulting on antimicrobial resistance surveillance studies from JMI Laboratories. AG reports grants or contracts from AHRQ, CDC, and NIH outside of the conduct of the study; payment for lecture from North Carolina Health Association; and Human Factors and Ergonomics Society Executive Council. JJ reports royalties from UpToDate, Inc. DM reports grant funding to support infection prevention and medical decision making research from CDC, NIH, AHRQ, and VA HSRD and reimbursement for travel related to meeting planning on speaking at meetings from IDSA and SHEA.
REFERENCES
- 1. Clostridioides difficile Infection | HAI | CDC. 2020. Available at: https://www.cdc.gov/hai/organisms/cdiff/cdiff_infect.html. Accessed 25 October 2021.
- 2. Isaac S, Scher JU, Djukovic A, et al. Short- and long-term effects of oral vancomycin on the human intestinal microbiota. J Antimicrob Chemother 2017; 72:128–136.
- 3. Ubeda C, Djukovic A, Isaac S. Roles of the intestinal microbiota in pathogen protection. Clin Transl Immunol 2017; 6:e128.
- 4. Lewis BB, Buffie CG, Carter RA, et al. Loss of Microbiota-Mediated Colonization Resistance to Clostridium difficile Infection With Oral Vancomycin Compared With Metronidazole. J Infect Dis 2015; 212:1656–1665.
- 5. Rock C, Pana Z, Leekha S, et al. National Healthcare Safety Network laboratory-identified Clostridium difficile event reporting: A need for diagnostic stewardship. Am J Infect Control 2018; 46:456–458.
- 6. Buckel WR, Avdic E, Carroll KC, Gunaseelan V, Hadhazy E, Cosgrove SE. Gut check: Clostridium difficile testing and treatment in the molecular testing era. Infect Control Hosp Epidemiol 2015; 36:217–221.
- 7. Koo HL, Van JN, Zhao M, et al. Real-Time Polymerase Chain Reaction Detection of Asymptomatic Clostridium difficile Colonization and Rising C. difficile-Associated Disease Rates. Infect Control Hosp Epidemiol 2014; 35:667–673.
- 8. Dunn AN, Radakovich N, Ancker JS, Donskey CJ, Deshpande A. The Impact of Clinical Decision Support Alerts on Clostridioides difficile Testing: A Systematic Review. Clin Infect Dis 2021; 72:987–994.
- 9. Main C, Moxham T, Wyatt JC, Kay J, Anderson R, Stein K. Computerised decision support systems in order communication for diagnostic, screening or monitoring test ordering: systematic reviews of the effects and cost-effectiveness of systems. Health Technol Assess 2010; 14:1–227.
- 10. Mizusawa M, Small BA, Hsu Y-J, et al. Prescriber Behavior in Clostridioides difficile Testing: A 3-Hospital Diagnostic Stewardship Intervention. Clin Infect Dis 2019; 69:2019–2021.
- 11. Prevention Epicenters Program | CDC. 2019. Available at: https://www.cdc.gov/hai/epicenters/index.html. Accessed 25 October 2021.
- 12. Johnson S, Lavergne V, Skinner AM, et al. Clinical Practice Guideline by the Infectious Diseases Society of America (IDSA) and Society for Healthcare Epidemiology of America (SHEA): 2021 Focused Update Guidelines on Management of Clostridioides difficile Infection in Adults. Clin Infect Dis 2021; 73:e1029–e1044.
- 13. Rock C, Cosgrove SE, Keller SC, et al. Using a Human Factors Engineering Approach to Improve Patient Room Cleaning and Disinfection. Infect Control Hosp Epidemiol 2016; 37:1502–1506.
- 14. Bakeman R, McArthur D, Quera V, Robinson BF. Detecting sequential patterns and determining their reliability with fallible observers. Psychol Methods 1997; 2:357–370.
- 15. Kwan JL, Lo L, Ferguson J, et al. Computerised clinical decision support systems and absolute improvements in care: meta-analysis of controlled clinical trials. BMJ 2020; 370:m3216.
- 16. White DR, Hamilton KW, Pegues DA, Hanish A, Umscheid CA. The Impact of a Computerized Clinical Decision Support Tool on Inappropriate Clostridium difficile Testing. Infect Control Hosp Epidemiol 2017; 38:1204–1208.
- 17. Embi PJ, Leonard AC. Evaluating alert fatigue over time to EHR-based clinical trial alerts: findings from a randomized controlled study. J Am Med Inform Assoc 2012; 19:e145–148.
- 18. Slight SP, Seger DL, Nanji KC, et al. Are we heeding the warning signs? Examining providers’ overrides of computerized drug-drug interaction alerts in primary care. PLoS One 2013; 8:e85071.
- 19. Osheroff J, Teich J, Levick D, et al. Improving Outcomes with Clinical Decision Support: An Implementer’s Guide. 2nd ed. HIMSS, the Scottsdale Institute, AMIA, AMDIS and SHM, 2012.
- 20. Bates DW, Kuperman GJ, Rittenberg E, et al. A randomized trial of a computer-based intervention to reduce utilization of redundant laboratory tests. Am J Med 1999; 106:144–150.
- 21. Baker DW, Hyun D, Neuhauser MM, Bhatt J, Srinivasan A. Leading Practices in Antimicrobial Stewardship: Conference Summary. Jt Comm J Qual Patient Saf 2019; 45:517–523.
