Abstract
Background:
Contextual inquiry, or in-depth, mixed-methods work to study the implementation context, is critical for understanding the setting in which a behavioral health evidence-based practice (EBP) will be implemented. However, current efforts to identify potential barriers and facilitators to implementation are typically conducted in a single setting and/or for a single EBP per study, and often take 1–2 years to complete. To maximize generalizability and efficiently reduce the research-to-practice gap, it is important to move toward cross-sector and/or cross-EBP contextual inquiry.
Methods:
In this viewpoint, we argue for (a) collaborative research studies that seek to identify determinants of implementation that are similar and unique across different settings and EBPs, using rapid approaches when possible; (b) enhanced synthesis of existing research on implementation determinants to minimize duplication of contextual inquiry efforts; and (c) clear rationale for why additional in-depth or rapid contextual inquiry is needed before it is conducted. Throughout this viewpoint, the need to balance scientific rigor and speed is considered.
Conclusions:
Overall, this viewpoint seeks to encourage researchers to consolidate and share knowledge on barriers and facilitators to implementation to prepare for the scaling out of much needed implementation strategies and interventions for improving health.
Plain language summary:
Significant time and resources are often devoted to understanding what makes it easier or harder to use best practices for behavioral health concerns in health care settings. The goal of the current viewpoint is to offer ways to streamline this process so that high-quality behavioral health services can more quickly reach the patients who need them. In particular, we advocate for ways to share knowledge among researchers and learn from prior findings to more efficiently identify what makes it easier or harder to use best practices for addressing behavioral health problems in a given setting (e.g., primary care, schools, specialty mental health).
Keywords: Contextual inquiry, determinants, generalizability
The first step in implementation research is contextual inquiry, whereby researchers collect data from surveys, interviews, and observations to understand barriers and facilitators (i.e., determinants) of implementation for an evidence-based practice (EBP) in a setting (Lane-Fall et al., 2019). Contextual inquiry can and should guide implementation strategy selection and design. However, there are challenges to our current approach. First, it requires considerable time and resources. Second, research has often focused on determinants of one EBP in a single setting, meaning we are likely duplicating efforts each time we enter a new setting or prepare to implement a different EBP in that setting. For instance, there are commonalities in barriers to implementing screening for behavioral health domains such as substance abuse and depression (Bhatta et al., 2018; Van Hook et al., 2007) in primary care, suggesting that future implementations of behavioral health screening programs in that setting can use knowledge of those commonalities to design implementation strategies. To efficiently produce generalizable knowledge and expedite the implementation process, we argue for (a) a new research agenda of collaborative studies that use rapid approaches when possible to identify implementation determinants that are similar and unique across different settings and EBPs, (b) enhanced synthesis of existing research on implementation determinants to minimize duplication of efforts, and (c) clear rationale for why additional contextual inquiry is needed before it is conducted. This viewpoint aims to foster the streamlining of contextual inquiry to jumpstart the implementation and scale out of EBPs for improving behavioral and physical health.
A call for collaboration
Implementation laboratories can benefit a range of stakeholders and reduce research waste (Grimshaw et al., 2019). Grimshaw and colleagues (2019) highlight the utility of the “meta-laboratory,” defined as “a coordinated set of implementation laboratories, in which findings from interventions tested in one context could inform decisions for another context, to facilitate a cumulative science in the field” (p. 420). While they discuss meta-laboratories for advancing research on audit and feedback, the same principles can be applied to advance other aspects of implementation science, including the identification of implementation determinants. For instance, our research group is engaging in a collaborative project involving two health systems to identify common and unique determinants of suicide prevention practices across primary care and specialty mental health (Davis et al., 2020). This kind of work can produce generalizable knowledge that may accelerate the deployment of EBPs across settings.
Acknowledging the tension between relevance and rigor in implementation science (Geng et al., 2017), it is essential to leverage insights from rapid implementation science (Smith et al., 2020) to efficiently bridge the research-to-practice gap. For instance, rapid ethnography, which involves gathering qualitative data on a brief and clearly delineated timeline (Reeves et al., 2013), could help truncate the time spent on contextual inquiry. Similarly, rapid analysis of qualitative data (e.g., summarizing interview transcripts using a structured template) can be used to glean the major themes that are uncovered via in-depth analysis (Gale et al., 2019). In particular, Gale et al. (2019) demonstrated consistency across these methods by mapping rapid analysis findings to the Consolidated Framework for Implementation Research (CFIR; Damschroder et al., 2009) domains discovered via in-depth analysis and determined there were no significant information gaps (e.g., themes related to Outer Setting were absent from both analytic methods). Additional studies directly comparing rapid and in-depth approaches for gathering and analyzing data will be important for confirming these methods yield similar results. Nonetheless, it is likely that rapid methods can allow us to collect and analyze contextual inquiry data in ways that provide sufficient detail to move upstream in the implementation process without requiring us to tread water in the pre-implementation phase longer than is necessary.
Streamlining via study synthesis
Early on, implementation science rarely included contextual inquiry. Over time, the importance of understanding context has been realized and the field has adopted these techniques to promote successful implementation. Now, in-depth contextual inquiry has become so commonplace that entire study aims and manuscripts have been devoted to the topic. Given the extensive contextual inquiry that has been conducted to date for many EBPs and settings, it is time to aggregate that work so that we can best learn from it and avoid redundancy. One way to do so is through systematic review and meta-analytic articles that synthesize existing research on implementation determinants. This is a cost-effective way to eliminate or reduce the need for lengthy contextual inquiry. For instance, if a systematic review were to reveal that the barriers to implementing behavioral health interventions across diagnostic categories (e.g., attention-deficit/hyperactivity disorder, anxiety) in schools are well-established, it would behoove researchers to quickly verify those barriers are relevant to current stakeholders and then move to thoughtful implementation strategy design and testing rather than spending finite resources on additional contextual inquiry.
Requiring rationale to be reported
There may be cases, such as with novel EBPs and settings (including countries where an EBP is newly introduced), in which more extensive contextual inquiry is required than the extant literature can provide. Similarly, there may be situations when multi-site, collaborative research that spans sectors and/or EBPs is not feasible. It will be important in these cases for researchers to provide clear rationale in their grant applications and manuscripts for decisions to undertake in-depth contextual inquiry for one context and/or EBP.
The next phase of contextual inquiry
To catalyze changes in contextual inquiry practices within implementation science, we lay out actionable items for the field and hope others will build upon this thinking. First, calls for submissions to special journal issues and grant Requests for Applications can encourage engagement in cross-setting/cross-EBP research. Similarly, journals and granting agencies can require researchers to clearly outline their rationale for the scope of their contextual inquiry; this information could also be requested via additions to existing reporting checklists. Finally, cross-site networking, such as at the Society for Implementation Research Collaboration (SIRC) and other conferences, could foster additional collaborations regarding the identification of implementation determinants.
Alternative viewpoints
Collaborations that span settings and EBPs and require the involvement of multiple research groups will be time- and resource-intensive
The time and resources needed to conduct collaborative contextual inquiry that spans settings and/or EBPs should not be overlooked. The hope is that conducting generalizable work now will lead to cost-savings later on, though this is an empirical question that warrants economic evaluation. Building economic evaluations into the beginning phases of our implementation research will be critical for ensuring that in our attempt to streamline contextual inquiry, we do not unintentionally shift to even more costly procedures.
Using rapid contextual inquiry approaches and prior research to identify barriers and facilitators may lead us to miss important determinants
A deep dive into the factors that enable or hinder implementation of a specific EBP in a specific setting can produce a wealth of information. There remains space for such work, especially when seeking to understand EBPs and/or settings that have not been the subject of much prior contextual inquiry. Similarly, it is important to recognize that determinants of an EBP in one country may not translate to other parts of the world, thus necessitating further research on barriers and facilitators to support implementation globally. However, questions about the pragmatism and incremental utility of deep contextual inquiry must be contemplated before such work is carried out. It is likely that rapid approaches to collecting and analyzing data can yield sufficient information to achieve successful implementation and sustainment. Just as we have struggled to define “enough evidence” for what constitutes an EBP across disciplines, we need to grapple with what is “enough” contextual inquiry. We eagerly invite recommendations and further discussion on this topic so that we may come to consensus as a field.
Streamlining contextual inquiry will detract from the relationship-building activities that are instrumental for conducting implementation research
Refining contextual inquiry should not equate to the eradication of the relationship-building with stakeholders that is often core to the contextual inquiry process. Instead, the money and time spent conducting surveys and interviews as part of contextual inquiry should be re-allocated to enhance and hasten other portions of the pre-implementation and implementation processes (e.g., investing in rapid prototyping methods and engaging stakeholders by asking them to provide iterative feedback throughout).
Conclusion
As we round the second decade of the implementation science field, we are at an important juncture to evaluate the field’s past and present and provide a roadmap for its future. Moreover, given the scarcity of research funds, particularly in the context of the COVID-19 pandemic, it is especially important to consider how to allocate finite resources to maximize impact. We hope that the current viewpoint will allow us to alter our contextual inquiry approaches while still preserving their core purpose. In turn, we will be able to increasingly focus our attention and resources on implementing and scaling out the implementation strategies and interventions that are critical to improving health.
Footnotes
The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: Rinad Beidas receives royalties from Oxford University Press and has provided consultation to the Camden Coalition of Health Care Providers. She currently provides consultation to United Behavioral Health and serves on the Clinical and Scientific Advisory Board for Optum Behavioral Health. Rinad Beidas serves as an Associate Editor for Implementation Research and Practice but was not involved in editorial decision-making for this article.
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Molly Davis is supported by a National Institute of Mental Health Training Fellowship (T32 MH109433; Mandell/Beidas MPIs).
References
- Bhatta S., Champion J. D., Young C., Loika E. (2018). Outcomes of depression screening among adolescents accessing school-based pediatric primary care clinic services. Journal of Pediatric Nursing, 38, 8–14. 10.1016/j.pedn.2017.10.001
- Damschroder L. J., Aron D. C., Keith R. E., Kirsh S. R., Alexander J. A., Lowery J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, 1–15. 10.1186/1748-5908-4-50
- Davis M., Wolk C. B., Jager-Hyman S., Beidas R. S., Young J. F., Mautone J. A., . . . Becker-Haimes E. M. (2020). Implementing nudges for suicide prevention in real-world environments: Project INSPIRE study protocol. Pilot and Feasibility Studies, 6, 1–10. 10.1186/s40814-020-00686-y
- Gale R. C., Wu J., Erhardt T., Bounthavong M., Reardon C. M., Damschroder L. J., Midboe A. M. (2019). Comparison of rapid vs in-depth qualitative analytic methods from a process evaluation of academic detailing in the Veterans Health Administration. Implementation Science, 14, Article 11. 10.1186/s13012-019-0853-y
- Geng E. H., Peiris D., Kruk M. E. (2017). Implementation science: Relevance in the real world without sacrificing rigor. PLOS Medicine, 14, Article e1002288. 10.1371/journal.pmed.1002288
- Grimshaw J. M., Ivers N., Linklater S., Foy R., Francis J. J., Gude W. T., Hysong S. J. (2019). Reinvigorating stagnant science: Implementation laboratories and a meta-laboratory to efficiently advance the science of audit and feedback. British Medical Journal Quality & Safety, 28, 416–423. 10.1136/bmjqs-2018-008355
- Lane-Fall M. B., Curran G. M., Beidas R. S. (2019). Scoping implementation science for the beginner: Locating yourself on the “subway line” of translational research. BMC Medical Research Methodology, 19, Article 133. 10.1186/s12874-019-0783-z
- Reeves S., Peller J., Goldman J., Kitto S. (2013). Ethnography in qualitative educational research: AMEE guide no. 80. Medical Teacher, 35, e1365–e1379. 10.3109/0142159X.2013.804977
- Smith J., Rapport F., O’Brien T. A., Smith S., Tyrrell V. J., Mould E. V., . . . Braithwaite J. (2020). The rise of rapid implementation: A worked example of solving an existing problem with a new method by combining concept analysis with a systematic integrative review. BMC Health Services Research, 20, Article 449. 10.1186/s12913-020-05289-0
- Van Hook S., Harris S. K., Brooks T., Carey P., Kossack R., Kulig J., . . . New England Partnership for Substance Abuse Research. (2007). The “six T’s”: Barriers to screening teens for substance abuse in primary care. Journal of Adolescent Health, 40, 456–461. 10.1016/j.jadohealth.2006.12.007
