
Translating evidence into practice is a long and arduous journey: 17 years, as the refrain goes (1). A traditional view sees this journey as linear: first, efficacy studies prove benefit; then, effectiveness research creates knowledge in real-world settings; and finally, implementation science focuses on understanding and promoting uptake. With any one study in any phase potentially taking years to complete, the long period between the start of this journey and its end is unsurprising but, unfortunately, untenable in a modern clinical care environment that needs treatment answers today. To address this evidence-to-implementation gap, hybrid effectiveness–implementation study designs have emerged (2), blending elements of effectiveness and implementation research to improve efficiency and speed knowledge translation. In this issue of AnnalsATS, Peltan and colleagues (pp. 424–432) give us an opportunity to consider the value and challenges of implementation research and hybrid implementation–effectiveness study designs. They report the results of a pilot hybrid trial of a complex, multifaceted program of several implementation strategies that together aimed to increase the use of lung protective ventilation directly (through audit and feedback, education, and championing of lung protective ventilation) and indirectly (by promoting increased use of clinical decision support tools already in place to facilitate clinical decision-making) (3). Not only was the implementation itself multimodal, but the intended mechanism of practice change was also multipronged. They found that adherence to low tidal volume ventilation and other implementation outcomes indeed improved after the intervention; however, no improvement was noted in several effectiveness outcomes, including mortality and ventilator-free days.
The study’s strengths reveal the promise of implementation studies to shorten that journey from evidence generation to improved healthcare delivery. The investigators increased the efficiency of a complex, costly study by leveraging a hybrid trial design to create knowledge about implementation outcomes and clinical outcomes in tandem. Furthermore, this design acknowledges that the phases from proving efficacy to clinical practice change need not be entirely sequential. Intentionally combining elements of effectiveness and implementation recognizes that these parts cannot, and indeed should not, be separated and that evaluating one without the other would be a missed opportunity. For example, capturing data on implementation factors could and perhaps should be standard for an effectiveness study to enable the exportation of successful strategies to other settings. Another strength of this study is the investigators’ approach to incorporating the possibility of secular trends in the analyses. By adding a secondary analysis using segmented regression, the investigators uncover that the findings of their primary analysis (that the period after intervention was associated with higher rates of adherence to lung protective ventilation) may have an alternative explanation and thereby potentially avoid incorrect conclusions about the impact of the specified implementation strategies. That finding also prompted them to recognize the silent or spillover implementation strategies that crept into the study, such as training local implementation teams before the study started.
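The logic of that secondary analysis can be illustrated with a small sketch. The code below is a hypothetical, simplified segmented regression (interrupted time series) fit to simulated monthly adherence data; the variable names, the simulated numbers, and the trend magnitude are our own illustrative assumptions, not the study's data. It shows how a pure secular trend can masquerade as an intervention effect in a naive before/after comparison, whereas the segmented model attributes the improvement to the underlying trend and estimates a level change near zero.

```python
import numpy as np

rng = np.random.default_rng(0)
months = np.arange(24, dtype=float)   # 24 monthly adherence measurements
intervention = 12                     # strategies deployed at month 12
post = (months >= intervention).astype(float)
time_since = np.where(post == 1, months - intervention, 0.0)

# Simulated adherence (%) driven ONLY by a secular trend of +0.8%/month;
# there is no true intervention effect in these data.
adherence = 60 + 0.8 * months + rng.normal(0, 0.5, months.size)

# A naive pre/post comparison credits the "intervention" with a large jump.
naive_change = adherence[post == 1].mean() - adherence[post == 0].mean()

# Segmented regression: intercept, secular trend, level change at the
# intervention, and slope change after the intervention.
X = np.column_stack([np.ones_like(months), months, post, time_since])
beta, *_ = np.linalg.lstsq(X, adherence, rcond=None)
trend, level_change, slope_change = beta[1], beta[2], beta[3]

print(f"naive pre/post change: {naive_change:.1f} percentage points")
print(f"segmented model: trend {trend:.2f}/month, "
      f"level change {level_change:.2f}, slope change {slope_change:.2f}")
```

In this toy example the naive comparison shows an improvement of several percentage points, while the segmented model correctly assigns the change to the secular trend, which is exactly the kind of alternative explanation the investigators' secondary analysis surfaced.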
Implementation science embraces complexity and messiness. But a tradeoff of encouraging complex and multifaceted implementation efforts is the difficulty of defining, studying, and reporting on all the parts that could drive change. For example, in this study (and to their credit), the authors acknowledge that the institution’s pretrial prioritization of adherence to lung protective ventilation may have been an implementation strategy in and of itself, even before the deployment of the key strategies under study. Therefore, identifying and naming all implementation strategies a priori is critical. To guide researchers in doing so, implementation scientists have generated long lists of implementation strategies (4), many of which significantly overlap with each other and are almost by definition present concurrently in any implementation effort. Additional tools that can support clinicians and researchers in the identification and reporting of implementation strategies (5, 6) should be a minimum standard for the publication of implementation studies.
The study by Peltan and colleagues also reveals another important challenge of implementation research: the difficulty of operationalizing definitions of implementation outcomes. Lung protective ventilation is an effective intervention that has been shown to reduce mortality among patients with acute respiratory distress syndrome and likely has beneficial effects for many patients, even in the absence of acute respiratory distress syndrome. Yet, the intervention itself is not a simple treatment with a binary definition. The most cited efficacy study for lung protective ventilation tested a complex intervention involving 1) administration of low tidal volumes; 2) limiting inspiratory pressure; and 3) providing parameters for titration of inspired oxygen and positive end-expiratory pressure (7). For over a decade after this publication, numerous studies have tried to parse out which specific components of the multifaceted intervention were the drivers of the mortality benefit. Studies have explored the roles of initial tidal volume, duration and percentage of time with low tidal volumes, duration of exposure to injurious tidal volumes, and driving pressure as potential mechanisms by which lung protective ventilation improves mortality, to name just a few (8–10). It is infeasible to create an operational definition of adherence that encompasses all elements of the original intervention (tidal volume, inspiratory pressures, inspired oxygen, and positive end-expiratory pressure), and even more so to incorporate all of these over a period of days of mechanical ventilation. Of the many possibilities, Peltan and colleagues selected a primary outcome of the percentage of time on mechanical ventilation during which low tidal volumes were administered and secondary outcomes of other operational definitions of adherence to low tidal volume ventilation as well as to other elements of the original intervention. Nonetheless, many elements of the original intervention went unmeasured.
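To make the measurement problem concrete, here is a minimal sketch of one operational definition of adherence, similar in spirit to the study's primary outcome: the percentage of recorded ventilation time at or below a tidal-volume threshold. The predicted body weight formula is the one used by the ARDS Network trial; the 6.5 ml/kg threshold, the function names, and the example values are illustrative assumptions, not the study's actual specification.

```python
def predicted_body_weight_kg(height_cm: float, male: bool) -> float:
    """ARDSNet predicted body weight: 50 kg (men) or 45.5 kg (women)
    plus 0.91 kg per cm of height above 152.4 cm."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)

def percent_time_low_vt(tidal_volumes_ml, height_cm, male,
                        threshold_ml_per_kg=6.5):
    """Percentage of recorded ventilation time with tidal volume at or
    below the threshold (ml per kg of predicted body weight). Assumes
    evenly spaced (e.g., hourly) recordings of equal weight."""
    pbw = predicted_body_weight_kg(height_cm, male)
    low = [vt for vt in tidal_volumes_ml if vt / pbw <= threshold_ml_per_kg]
    return 100.0 * len(low) / len(tidal_volumes_ml)

# Example: a 175 cm man has a predicted body weight of about 70.6 kg,
# so 420 ml is roughly 6.0 ml/kg (adherent) and 500 ml is not.
pct = percent_time_low_vt([420, 420, 500, 600], height_cm=175, male=True)
print(f"{pct:.0f}% of recorded time at low tidal volume")
```

Even this single metric forces choices (the threshold, the sampling interval, which recordings count) that a full definition of adherence to the original intervention would multiply across inspiratory pressure, inspired oxygen, and positive end-expiratory pressure targets.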
This last challenge feeds into a key dilemma of implementation research: how do we interpret a hybrid implementation–effectiveness study when the improvement in implementation outcomes does not correspond to an improvement in clinical outcomes? The debate over whether to prioritize process or outcomes is age-old in health services research, and this study provides a case study for further consideration. There are three key points we wish to make. First, the relationship between the operational definitions of the implementation outcomes and the clinical outcomes must be well understood to inform the study design. In this example, as mentioned above, the operational definition of adherence to the proven-effective intervention is challenging, so one explanation for the mismatch between implementation and effectiveness outcomes could be that the mechanisms for clinical effectiveness are not captured by these implementation definitions. Second, the study design helps us with the interpretation of the results. The hybrid type 3 design delineates that implementation outcomes are primary and clinical outcomes are secondary. Such a trial may, appropriately, not have sufficient power to detect differences in clinical outcomes. The nature of hybrid implementation–effectiveness studies requires that investigators be clear and upfront about the state of the evidence and carefully specify how outcomes are prioritized. Finally, we must remember that in the real-world settings of implementation and effectiveness studies, populations are more heterogeneous than in efficacy studies, and cointerventions are the norm. This study promoted low tidal volume ventilation among all mechanically ventilated patients; we would not expect an effect like that seen in a clinical trial of carefully selected patients with acute respiratory distress syndrome. And as the authors themselves point out, simply preparing for the implementation may have served as an implementation strategy itself.
All of these issues will bias the results of effectiveness analyses toward the null. Therefore, the lack of evidence for effectiveness in a hybrid implementation–effectiveness study should not lead us to discard the implementation strategies or the clinical intervention altogether.
The well-executed study by Peltan and colleagues highlights the benefits of hybrid implementation–effectiveness trials, as well as their complexities. Understanding when these trials are appropriate and deploying them well, through thoughtful study design, careful statistical analysis, and a thorough understanding of the clinical environment, is essential to moving from evidence to practice at the faster pace demanded by modern critical care.
Footnotes
Author disclosures are available with the text of this article at www.atsjournals.org.
References
- 1. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academies Press; 2001.
- 2. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50:217–226. doi: 10.1097/MLR.0b013e3182408812.
- 3. Peltan ID, Knighton AJ, Barney BJ, Wolfe D, Jacobs JR, Klippel C, et al. Delivery of lung-protective ventilation for acute respiratory distress syndrome: a hybrid implementation-effectiveness trial. Ann Am Thorac Soc. 2023;20:424–432. doi: 10.1513/AnnalsATS.202207-626OC.
- 4. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21. doi: 10.1186/s13012-015-0209-1.
- 5. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139. doi: 10.1186/1748-5908-8-139.
- 6. Rudd BN, Davis M, Beidas RS. Integrating implementation science in clinical research to maximize public health impact: a call for the reporting and alignment of implementation strategy use with implementation outcomes in clinical research. Implement Sci. 2020;15:103. doi: 10.1186/s13012-020-01060-5.
- 7. Brower RG, Matthay MA, Morris A, Schoenfeld D, Thompson BT, Wheeler A; Acute Respiratory Distress Syndrome Network. Ventilation with lower tidal volumes as compared with traditional tidal volumes for acute lung injury and the acute respiratory distress syndrome. N Engl J Med. 2000;342:1301–1308. doi: 10.1056/NEJM200005043421801.
- 8. Needham DM, Yang T, Dinglas VD, Mendez-Tellez PA, Shanholtz C, Sevransky JE, et al. Timing of low tidal volume ventilation and intensive care unit mortality in acute respiratory distress syndrome: a prospective cohort study. Am J Respir Crit Care Med. 2015;191:177–185. doi: 10.1164/rccm.201409-1598OC.
- 9. Sjoding MW, Gong MN, Haas CF, Iwashyna TJ. Evaluating delivery of low tidal volume ventilation in six ICUs using electronic health record data. Crit Care Med. 2019;47:56–61. doi: 10.1097/CCM.0000000000003469.
- 10. Amato MB, Meade MO, Slutsky AS, Brochard L, Costa EL, Schoenfeld DA, et al. Driving pressure and survival in the acute respiratory distress syndrome. N Engl J Med. 2015;372:747–755. doi: 10.1056/NEJMsa1410639.
