Abstract
Background
Attention is being placed on the “ironic gap” or “secondary” research-to-practice gap in the field of implementation science. Among several challenges posited to exacerbate this gap, we call attention to one in particular: the relative dearth of implementation research that is tethered intimately to the lived experiences of implementation support practitioners (ISPs). The purpose of this study is to feature a qualitative approach to engaging with highly experienced ISPs to inform the development of a practice-driven research agenda in implementation science. In general, we aim to encourage ongoing empirical inquiry that foregrounds practice-driven implementation research questions.
Method
Our analytic sample comprised 17 professionals in different child and family service systems, each with long-term experience using implementation science frameworks to support change efforts. Data were collected via in-depth, semi-structured interviews. Our analysis followed a qualitative content analysis approach. Our focal conceptual category centered on the desired areas of future research highlighted by respondents, with subcategories reflecting subsets of related research question ideas.
Results
Interviews yielded a wide range of responses that could help shape a practice-driven research agenda for the field of implementation science. The following subcategories regarding desired areas for future research were identified in respondents’ answers: (a) stakeholder engagement and developing trusting relationships, (b) evidence use, (c) workforce development, and (d) cost-effective implementation.
Conclusions
There is significant promise in bringing implementation research and implementation practice together more closely and building a practice-informed research agenda to shape implementation science. Our findings point not only to valuable practice-informed gaps in the literature that could be filled by implementation researchers, but also topics for which dissemination and translation efforts may not have yielded optimal reach. We also highlight the value in ISPs bolstering their own capacity for engaging with the implementation science literature to the fullest extent possible.
Keywords: implementation, implementation practice, implementation research, implementation science, research-to-practice gap
Plain Language Summary
In the field of implementation science, increasing attention is being placed on the “ironic gap” or “secondary” research-to-practice gap. This gap reflects a general lag or disconnect between implementation research and implementation practice, often stemming from knowledge generated by implementation research not being accessible to or applied by professionals who support implementation efforts in various service-delivery systems. Several explanations for the research-to-practice gap in implementation science have been offered in recent years; the authors highlight one notable challenge that may be exacerbating the research-to-practice gap in this field, namely that implementation research often remains disconnected from the lived experiences of implementation support practitioners. In this paper, the authors demonstrate the promise of developing a practice-driven research agenda in implementation science, with specific research question ideas offered by highly experienced implementation support practitioners. The paper concludes by expressing enthusiasm for future efforts to bring implementation research and implementation practice together more closely, empirically foreground practice-driven implementation research questions, translate and disseminate existing implementation research findings more widely, and build the capacity of implementation support practitioners to fully engage with the implementation science literature.
Introduction
In the field of implementation science, increasing attention is being placed on the “ironic gap” or “secondary” research-to-practice gap (Beidas et al., 2022; Juckett et al., 2022; Westerlund et al., 2019; Wilson & Kislov, 2022). This gap reflects a general lag or disconnect between implementation research and implementation practice, often stemming from knowledge generated by implementation research not being accessible to or applied by implementers or implementation support practitioners (ISPs)—professionals who are on the frontlines supporting implementation efforts in various service-delivery systems. Implementation researchers, in this context, can be defined as individuals who aim to generate knowledge regarding the processes, strategies, and methods that promote the uptake of research-supported practices, policies, and programs in various service-delivery systems (Juckett et al., 2022). The general intent of implementation research is to “understand what, why, and how interventions work in ‘real-world’ settings and to test approaches to improve them” (Peters et al., 2013).
On the other hand, ISPs strive to leverage the best available insights generated from implementation research and frameworks to “help systems and service providers implement research-supported practices, policies, and programs, and sustain and scale research evidence for population impact” (Metz et al., 2021, p. 239). ISPs can be housed outside the service-delivery systems they support but may also operate within a service-delivery system, particularly those with an internal work unit designed to support innovation, implementation, improvement, and scaling efforts (Albers et al., 2020; Metz et al., 2021). Implementation support is often delivered via partnerships between professionals housed inside and outside a service-delivery system.
Several challenges plausibly undergird the research-to-practice gap in implementation science (Beidas et al., 2022; Juckett et al., 2022). First, implementation science is a rapidly evolving field, dampening the extent to which knowledge dissemination efforts can incorporate timely discoveries and advancements in the field. Second, the supply of implementation-focused training and dissemination mechanisms is dwarfed by the demand. Implementation training programs might be inaccessible for other reasons as well, such as limited time and resources among teams and organizations (Davis & D’Lima, 2020). Third, common and well-established implementation capacity-building interventions have targeted implementation researchers rather than ISPs. Thøgersen (2021) highlights the need to increase the applicability of knowledge generated from implementation research for use in real-world settings, noting that implementation frameworks, strategies, and measures developed under the best possible circumstances are not feasible for day-to-day use in service settings. Implementation science may not be producing knowledge that practitioners actually need to do their work.
There is a growing call to action in the field of implementation science to deeply integrate the perspectives of practitioners to co-produce the science of implementation (Supplee et al., 2023). We call attention to one challenge that may be exacerbating the research-to-practice gap in implementation science: the relative dearth of implementation research that is tethered intimately to the lived experiences of ISPs (Metz, Jensen, Farley, & Boaz, 2022). Practice-driven research agendas can emerge from ongoing and intentional dialogue between implementation researchers and ISPs, with the goal of unearthing critical questions that, if answered and disseminated well, could impact the real-world efforts of ISPs to implement programs, policies, or practices intended to yield equitable and desirable population impacts. Efforts on this front align with recent calls to enrich the epistemological tools used to build the science of implementation, particularly from a phenomenological perspective (Wilson & Kislov, 2022). Such efforts also align with the emerging prioritization of knowledge co-creation processes (Jensen & Kainz, 2019; Metz et al., 2019). Consequently, the purpose of the current short report is to feature a qualitative approach to engaging with highly experienced ISPs to identify and synthesize key research questions that are perceived to have a high probability of benefiting their implementation support work. In addition to summarizing our findings, we aim to encourage ongoing empirical inquiry that foregrounds practice-driven implementation research questions.
Methods
Study Setting and Sample
The current study leveraged qualitative data collected from individuals in the United States via a hybrid purposive-convenience sampling approach, with priority placed on recruiting participants with numerous years of experience using implementation science frameworks to support change efforts in various child and family service systems (Albers et al., 2020, 2021; Bührmann et al., 2022; Curtis et al., 2000; Koerber & McMichael, 2008; Metz et al., 2021). Our analytic sample comprised 17 professionals from different organizations, each with long-term experience using implementation science frameworks to support change efforts. The implementation support role has been described as part of the “service-delivery support system” (Wandersman et al., 2008) and has been expanded conceptually in recent years to include the roles of implementation facilitators (Kirchner et al., 2014, 2016; Parker et al., 2014) and ISPs (Albers et al., 2020, 2021; Metz et al., 2021). Table 1 provides a summary of self-reported respondent characteristics.
Table 1.
Respondent Characteristics
| Characteristic | n | % |
|---|---|---|
| Gender | | |
| Cisgender woman | 14 | 82% |
| Cisgender man | 3 | 18% |
| Racial/ethnic identity | | |
| Non-Hispanic White | 14 | 82% |
| African American/African Descent | 2 | 12% |
| Hispanic White and Asian | 1 | 6% |
| Years of professional experience | | |
| 15+ | 14 | 82% |
| 6–10 | 2 | 12% |
| 11–15 | 1 | 6% |
| Focus of work (check all that apply) | | |
| Child Welfare | 12 | 71% |
| Mental and Behavioral Health | 9 | 53% |
| Implementation Science | 8 | 47% |
| Criminal Justice | 5 | 29% |
| Public Health | 4 | 24% |
| Health | 3 | 18% |
| Other | 3 | 18% |
| K-12 Education | 2 | 12% |
| Work setting (check all that apply) | | |
| Non-profit | 6 | 35% |
| Higher Education | 5 | 29% |
| Local Government | 5 | 29% |
| State Government | 4 | 24% |
| Other | 4 | 24% |
| Federal Government | 1 | 6% |
| For-Profit | 1 | 6% |
Data Collection Procedures
Data were collected via in-depth, semi-structured interviews conducted using the Zoom platform. Interviews were 60 minutes in duration, on average. Core interview prompts addressed the implementation support strategies used to promote the use of research evidence (and the conditions under which those strategies are most impactful) and the role of stakeholder engagement in implementation and evidence use. The following question—and primary focus of the current study—was also posed at the end of the interviews: “If you were to develop a research agenda on stakeholder engagement and evidence use/implementation, what questions would you ask and why?” Although the prompt was seemingly narrow in focus, respondents provided a wide range of responses, reflecting ideas more broadly connected to the field of implementation science and stemming from the rich conversation that took place during each interview. One member of the research team (i.e., the second author, who has over 17 years of experience in the field of implementation science and currently leads a university-based implementation science center) led the interviews, and two other members of the research team (i.e., the first and fourth authors) attended the interviews to observe and take general notes. Participants provided their informed consent verbally; the Institutional Review Board at the authors’ university reviewed all study protocols and assigned the project exempt status. Audio recordings from each interview were transcribed verbatim and fully de-identified in preparation for analysis.
Data Analysis
Our analysis followed a qualitative content analysis approach (Schreier, 2014), emphasizing the development and use of a coding frame in which one or more main categories are specified and multiple subcategories are identified to organize what is said in the material with respect to each main category. Our focal conceptual category centered on the desired areas of future research highlighted by respondents, with subcategories reflecting subsets of related research question ideas. The first and third authors engaged in coding activities independently using Dedoose software, then met to compare emergent subcategories and associated codes. Rater agreement was 93%, and only one additional code was added following discussion and consensus.
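To make the agreement check concrete, the following is a minimal sketch of how simple percent agreement between two independent coders can be computed. This is an illustration only, not the study’s actual analysis pipeline (Dedoose reports comparable statistics natively), and the excerpt identifiers and subcategory labels are hypothetical.

```python
# Minimal, hypothetical sketch of percent rater agreement between two
# independent coders. Excerpt IDs and subcategory labels are illustrative,
# not data from the study.

def percent_agreement(codes_a: dict[str, str], codes_b: dict[str, str]) -> float:
    """Return the share of jointly coded excerpts labeled identically."""
    shared = codes_a.keys() & codes_b.keys()  # excerpts coded by both raters
    if not shared:
        raise ValueError("No excerpts were coded by both raters.")
    matches = sum(codes_a[e] == codes_b[e] for e in shared)
    return matches / len(shared)

# Hypothetical excerpt-level codes assigned by two raters.
coder_1 = {
    "P2-1": "stakeholder engagement",
    "P9-3": "evidence use",
    "P10-2": "workforce development",
}
coder_2 = {
    "P2-1": "stakeholder engagement",
    "P9-3": "evidence use",
    "P10-2": "cost-effective implementation",  # disagreement to resolve by consensus
}

print(f"Percent agreement: {percent_agreement(coder_1, coder_2):.0%}")
# Percent agreement: 67%
```

Because simple percent agreement does not adjust for chance, analyses of this kind sometimes report a chance-corrected statistic such as Cohen’s kappa alongside it.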
Results
Interviews yielded a wide range of responses that could help shape a practice-driven research agenda for the field of implementation science. The following subcategories regarding desired areas for future research were identified in respondents’ answers: (a) stakeholder engagement and developing trusting relationships, (b) evidence use, (c) workforce development, and (d) cost-effective implementation. These subcategories, along with related codes and research questions, are highlighted in Table 2.
Table 2.
Findings from Qualitative Content Analysis
| Conceptual subcategories | Specific code(s) | Research question(s) |
|---|---|---|
| Stakeholder Engagement and Developing Trusting Relationships | Understanding how to bridge the science and art of implementation | How does one balance the use of technical and relational skills in providing implementation support? |
| | Understanding the origins and roles of champions | How does one, specifically, develop and cultivate “champions”? Are “champions” defined the same way across the literature and contexts? How many champions are needed? At what level of the system are champions typically found? What are these individuals explicitly doing to champion the work? What support do these individuals need to become a champion? |
| | How to optimize relationships with stakeholders at different systems levels | How does an implementation support practitioner navigate the relationship dynamic with multi-tiered, interlocking teams? At which level of the system does one enter the relationship? How does one work with these systems in a way that promotes connection among different levels? How do you serve as an “honest broker” when working in these systems (i.e., maintaining integrity across all relationships)? |
| | How to cultivate/train for relational skills in implementation | How does one establish and cultivate trust in different settings (e.g., on-site/in-person vs. off-site/virtual)? |
| | How to move toward collaboration (authentically collaborate, flexibility) | How can implementation support practitioners foster collaboration with stakeholders (e.g., families) in the areas of program selection, implementation, and accountability in public systems? |
| | How to tailor communication (shared understanding) | How does one navigate communication with different stakeholder groups and bring everyone to consensus on the facilitators and barriers to systems improvement? How do you convey information differently to various stakeholder groups to efficiently meet everyone’s needs? |
| | How to culturally adapt implementation support | How does one adapt their implementation support practice when working with communities of color and other minoritized communities? |
| | How to preserve proximity to end users when scaling up implementation systems and structures | How does one preserve proximity to the end user(s) when creating implementation support systems and structures? |
| | Understanding common barriers to stakeholder engagement | What are common barriers to engagement among various stakeholders? |
| Evidence Use^a | Understanding different ways of knowing | How do different stakeholder groups define “evidence”? How has the definition of “evidence” been socially constructed in the context of Western society? How do implementation support practitioners determine what is considered meaningful evidence to the stakeholder groups they are working with? What other forms of evidence and ways of knowing exist, and how can we uplift and prioritize them in implementation practice? |
| | Identifying the problem (rather than leading with solutions)/factors that influence decisions to make system changes | What informs problem identification and adoption of a specific evidence-based practice(s) (e.g., beliefs, values)? |
| | Factors that shape beliefs about taking a system change to scale | What informs one’s understanding of the specific strategies for getting to scale? |
| Workforce Development | How to develop/train the implementation support workforce | What is the pathway for teaching and developing the next generation of implementation support practitioners? How do you teach practitioners the skills and competencies required for effective implementation practice? Are collective spaces helpful in fostering resilience among practitioners? How important is the practice of self-reflection to the work? |
| Cost-Effective Implementation | How can implementation be maximally cost-effective? | Are there ways for implementation support to be administered in a less costly, less resource-intensive, yet still effective manner? |

Notes. ^a The concept of evidence was discussed broadly to encompass both (a) the evidence of the intervention (or “the what”) being implemented and (b) evidence related to the efficacy of specific implementation strategies/approaches (or “the how”).
Respondents emphasized the critical roles that stakeholders and the development of trusting relationships serve in implementation efforts and the need for research centered on best practices for navigating these engagements. One respondent’s professional history in the child welfare system prompted them to wonder how public systems could better promote collaboration with key stakeholders in the areas of program selection, implementation, and accountability. Another respondent reflected on the importance of tailoring communication to different stakeholders in practice, which led them to consider how information could be conveyed in an effective and mutually beneficial manner for all groups involved.
I really have been thinking a lot about the messengers of information and how different people and different stakeholders…might need to hear information differently. But how do you convey that information efficiently that meets everyone's needs? (P2)
Similarly, respondents indicated a need to understand common barriers to engagement among various stakeholders and how those barriers may impede implementation. Respondents also expressed a desire to explore how professionals in the field have adapted their implementation practice when working with communities of color and other minoritized communities; in particular, respondents were interested in knowing whether specific models existed to help guide implementation practice when working alongside these communities. One respondent described an explicit shift in implementation support systems and structures and the growing need to examine how to preserve proximity to the “end user” as an ISP.
It feels like we are doing more implementation support for the sake of doing implementation support…but now we’ve got these big systems that are not as intimate, and not as intimately connected to the end user…I think it's important to try and preserve that proximity when you’re creating implementation support systems and structures. (P10)
Further, respondents reflected on the need for ISPs to bridge the science (i.e., technical skills) and art (i.e., relational skills) of implementation and suggested that how this balance is executed in practice could be a focus of future study. Respondents also expressed curiosity concerning the origins and roles of champions in implementation practice. One respondent felt it would be relevant to define the characteristics of champions within the implementation site and the processes whereby champions are cultivated.
I see it happen very slowly where one person gets another person and then another person gets another person, and it becomes like a widespread understanding that this [intervention] is a good thing for our agency. I would like to understand a little bit better about how that happens. Because it's not just what I was saying about somebody says it, it's a good thing…that's not enough to make somebody a champion. (P9)
Another respondent reflected on the difficulty navigating stakeholder relationships in the context of a multi-tiered, interlocking system. They acknowledged a desire to understand how to promote collaboration and maintain integrity across these systems’ levels, knowing that each relational dynamic presents varying demands and requires some form of adaptation as an ISP.
I struggle with that concept of tiered and interlocking teams… like how and where one enters which relationship and in a way that makes all levels, how many there are, meaningful, interactive, and connected. How do you enter and how do you have the right-way relationship? How do you be an honest broker at every level over time? That would be an interesting thing [to study]. (P3)
In alignment with recent theorizing (Metz, Jensen, Farley, Boaz, Bartley, et al., 2022), respondents highlighted the need to explore other relational aspects of implementation in research; for example, how ISPs establish and cultivate trust with individuals in different settings (i.e., on-site/in-person versus off-site/virtual).
Relating to evidence use, respondents highlighted the usefulness of prioritizing research focused on understanding different types of evidence. The concept of evidence was discussed broadly to encompass both (a) the evidence of the intervention (or “the what”) being implemented and (b) evidence related to the efficacy of specific implementation strategies/approaches (or “the how”). One respondent discussed how the westernization of evidence has favored “traditional ways of knowing” in implementation efforts over other ways of knowing. Research could investigate what the term “evidence” means for different stakeholder groups in order to intentionally uplift various ways of knowing in practice.
Other respondents highlighted the need to uncover the implicit and explicit factors that inform problem identification, adoption of a specific evidence-based practice, and the strategies utilized for getting to scale. One respondent hypothesized that professionals in the field are not concretely thinking about and discussing the strategies that will longitudinally impact whether an implementation effort alters standard practice.
Respondents also demonstrated interest in the pathways for developing the next generation of ISPs and, specifically, how collaborative spaces and self-reflection components could prove helpful in cultivating professional competence and resilience. Further, other respondents indicated a desire to understand ways to be maximally cost-effective in implementation efforts, while still upholding fidelity.
Discussion
In this short report, we featured a qualitative approach to engaging with highly experienced ISPs in an effort to begin shaping a practice-driven research agenda for the field of implementation science. Notwithstanding the bounded wording of the interview prompts, respondents offered a wide-ranging and rich set of ideas about promising future research that would connect to their everyday implementation practice, largely centered around (a) stakeholder engagement and developing trusting relationships, (b) evidence use, (c) workforce development, and (d) cost-effective implementation. Given our general aim to encourage ongoing exploration of practice-driven research questions from the perspective of experienced ISPs, it is worth considering how rich future investigations on this front could be if interview prompts were fully unbounded and explicitly open to all aspects of implementation science. We are enthusiastic about such future studies, which could also seek to overcome some of our study limitations by recruiting samples that are more diverse with respect to racial/ethnic identity, professional experience, and geographic location.
Here we also want to acknowledge the possibility that respondents highlighted ideas for future research that are already addressed in existing studies (e.g., the role and functions of champions in implementation; Bonawitz et al., 2020). Thus, our findings point not only to valuable practice-informed gaps in the literature that could be filled by implementation researchers, but also to topics for which dissemination and translation efforts may not have yielded optimal reach, particularly among experienced ISPs actively engaged in implementation efforts. Connecting back to Juckett et al.’s (2022) point about implementation science rapidly evolving, this issue around knowledge translation prompts ongoing reflection in terms of how implementation research can be made visible and available to the professionals who might benefit most from cutting-edge and relevant empirical insights.
We also see value in ISPs bolstering their own capacity for engaging with the implementation science literature, perhaps by drawing from high-quality open-access journals or otherwise freely available articles in relevant outlets. The onus of productively shaping and utilizing implementation research in practice should not be placed solely on implementation researchers; ISPs should also work diligently to seek out available implementation research to the fullest extent possible. Doing so may enhance the insights ISPs can offer about developing a robust practice-driven research agenda in implementation science. Moreover, implementation research and implementation practice can be brought closer together by (a) creating visible, inclusive spaces—physical and virtual—for perspectives to be shared (e.g., communities of practice); (b) developing more opportunities for implementation practitioners to publish and present at implementation science conferences, including keynotes; (c) inviting implementation researchers into practice spaces to co-produce and pursue joint research agendas; (d) building a workforce that can make implementation science practical and usable for service systems and communities; and (e) incentivizing practice-research partnerships (e.g., via grant funding, infrastructure support, and professional recognition).
Footnotes
Author Contributions: Todd M. Jensen, PhD, is research assistant professor in the School of Social Work, Associate Director for Research in the Collaborative for Implementation Practice, and Family Research and Engagement Specialist in the Jordan Institute for Families at the University of North Carolina at Chapel Hill; Allison J. Metz, PhD, is professor of practice, Director of Implementation Practice, and Director of the Collaborative for Implementation Practice in the School of Social Work at the University of North Carolina at Chapel Hill; Mackensie E. Disbennett, MSW, is research specialist in the Collaborative for Implementation Practice in the School of Social Work at the University of North Carolina at Chapel Hill; Amanda B. Farley, BA, is implementation associate in the Collaborative for Implementation Practice in the School of Social Work at the University of North Carolina at Chapel Hill.
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Generous support for this study was provided by the William T. Grant Foundation (Award #188777).
ORCID iDs: Todd M. Jensen https://orcid.org/0000-0002-6930-899X
Allison J. Metz https://orcid.org/0000-0002-0369-7021
References
- Albers B., Metz A., Burke K. (2020). Implementation support practitioners—A proposal for consolidating a diverse evidence base. BMC Health Services Research, 20(1), 1–10. 10.1186/s12913-020-05145-1
- Albers B., Metz A., Burke K., Bührmann L., Bartley L., Driessen P., Varsi C. (2021). Implementation support skills: Findings from a systematic integrative review. Research on Social Work Practice, 31(2), 147–170. 10.1177/1049731520967419
- Beidas R. S., Dorsey S., Lewis C. C., Lyon A. R., Powell B. J., Purtle J., Saldana L., Shelton R. C., Stirman S. W., Lane-Fall M. B. (2022). Promises and pitfalls in implementation science from the perspective of US-based researchers: Learning from a pre-mortem. Implementation Science, 17(1), 55. 10.1186/s13012-022-01226-3
- Bonawitz K., Wetmore M., Heisler M., Dalton V. K., Damschroder L. J., Forman J., Allan K. R., Moniz M. H. (2020). Champions in context: Which attributes matter for change efforts in healthcare? Implementation Science, 15(1), 1–10. 10.1186/s13012-020-01024-9
- Bührmann L., Driessen P., Metz A., Burke K., Bartley L., Varsi C., Albers B. (2022). Knowledge and attitudes of implementation support practitioners—Findings from a systematic integrative review. PLoS One, 17(5), e0267533. 10.1371/journal.pone.0267533
- Curtis S., Gesler W., Smith G., Washburn S. (2000). Approaches to sampling and case selection in qualitative research: Examples in the geography of health. Social Science & Medicine, 50(7–8), 1001–1014. 10.1016/S0277-9536(99)00350-0
- Davis R., D’Lima D. (2020). Building capacity in dissemination and implementation science: A systematic review of the academic literature on teaching and training initiatives. Implementation Science, 15(1), 1–26. 10.1186/s13012-020-01051-6
- Jensen T., Kainz K. (2019). Positioning social work researchers for engaged scholarship to promote public impact. Journal of the Society for Social Work and Research, 10(4), 591–609. 10.1086/706266
- Juckett L. A., Bunger A. C., McNett M. M., Robinson M. L., Tucker S. J. (2022). Leveraging academic initiatives to advance implementation practice: A scoping review of capacity building interventions. Implementation Science, 17(1), 1–14. 10.1186/s13012-022-01216-5
- Kirchner J. E., Ritchie M. J., Pitcock J. A., Parker L. E., Curran G. M., Fortney J. C. (2014). Outcomes of a partnered facilitation strategy to implement primary care–mental health. Journal of General Internal Medicine, 29(4), 904–912. 10.1007/s11606-014-3027-2
- Kirchner J. E., Woodward E. N., Smith J. L., Curran G. M., Kilbourne A., Owen R. R., Bauer M. S. (2016). Implementation science supports core clinical competencies: An overview and clinical example. Primary Care Companion for CNS Disorders, 18(6), e1–e7. 10.4088/pcc.16m02004
- Koerber A., McMichael L. (2008). Qualitative sampling methods: A primer for technical communicators. Journal of Business and Technical Communication, 22(4), 454–473. 10.1177/1050651908320362
- Metz A., Albers B., Burke K., Bartley L., Louison L., Ward C., Farley A. (2021). Implementation practice in human service systems: Understanding the principles and competencies of professionals who support implementation. Human Service Organizations: Management, Leadership & Governance, 45(3), 238–259. 10.1080/23303131.2021.1895401
- Metz A., Boaz A., Robert G. (2019). Co-creative approaches to knowledge production: What next for bridging the research to practice gap? Evidence & Policy, 15(3), 331–337. 10.1332/174426419X15623193264226
- Metz A., Jensen T., Farley A., Boaz A. (2022). Is implementation research out of step with implementation practice? Pathways to effective implementation support over the last decade. Implementation Research and Practice. Advance online publication. 10.1177/26334895221105585
- Metz A., Jensen T., Farley A., Boaz A., Bartley L., Villodas M. (2022). Building trusting relationships to support implementation: A proposed theoretical model. Frontiers in Health Services, 2, 894599. 10.3389/frhs.2022.894599
- Parker L. E., Ritchie M. J., Bonner L. M., Kirchner J. E. (2014). Examining inside the black box of implementation facilitation: Process and effects on program quality. Paper presented at the National Institutes of Health/AcademyHealth 7th Annual Conference on the Science of Dissemination and Implementation, Bethesda, MD.
- Peters D. H., Adam T., Alonge O., Agyepong I. A., Tran N. (2013). Implementation research: What it is and how to do it. BMJ, 347, f6753. 10.1136/bmj.f6753
- Schreier M. (2014). Qualitative content analysis. In U. Flick (Ed.), The SAGE handbook of qualitative data analysis (pp. 170–183). Sage.
- Supplee L., Metz A., Boaz A. (2023). Learning across contexts: Bringing together research on research use and implementation science. William T. Grant Foundation.
- Thøgersen D. (2021, October 19). Beware of the new gap—between implementation science and implementation practice. European Implementation Collaborative. https://implementation.eu/beware-of-the-new-gap-between-implementation-science-and-implementation-practice/
- Wandersman A., Duffy J., Flaspohler P., Noonan R., Lubell K., Stillman L., Blachman M., Dunville R., Saul J. (2008). Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology, 41(3), 171–181. 10.1007/s10464-008-9174-z
- Westerlund A., Nilsen P., Sundberg L. (2019). Implementation of implementation science knowledge: The research-practice gap paradox. Worldviews on Evidence-Based Nursing, 16(5), 332–334. 10.1111/wvn.12403
- Wilson P., Kislov R. (2022). Implementation science (Elements of Improving Quality and Safety in Healthcare). Cambridge University Press.
