Worldviews on Evidence‐Based Nursing. 2019 Oct 11;16(5):332–334. doi: 10.1111/wvn.12403

Implementation of Implementation Science Knowledge: The Research‐Practice Gap Paradox

Anna Westerlund, Per Nilsen, Linda Sundberg

A person who wants to find a solution to a public health problem has a different task than someone who wants to create or test a theory. (Eldredge, Markham, Ruiter, Kok, & Parcel, 2016, p. 8)

The challenges in improving health care are considerable, as are the efforts made to develop and deliver best practice (Grol, Wensing, Eccles, & Davis, 2013). Different interventions with evidence of effectiveness are continuously made available for potential improvement of health care. However, the difficulties in implementing and using such evidence are well known (Greenhalgh, Robert, Macfarlane, Bate, & Kyriakidou, 2004). The knowledge‐practice gap in health care refers to the gap between scientific knowledge and its application in routine healthcare practice.

Implementation science developed in the 2000s in response to this gap, with the ambition to generate knowledge that promotes better uptake of evidence for improvements in the quality and safety of health care. The body of implementation knowledge comprises a rapidly growing number of empirical studies as well as countless theories, frameworks, and models, contributing to an understanding of factors associated with successful implementation of evidence‐based interventions within a variety of settings (Tabak, Khoong, Chambers, & Brownson, 2012).

The multitude of empirical implementation studies, as well as the theories, models, and frameworks developed in implementation science, reflects a growing evidence base concerning implementation (Brownson, Colditz, & Proctor, 2018). However, despite the rapid progress of implementation science, the knowledge‐practice gap in health care remains substantial, as shown in studies describing difficulties in achieving desirable change in healthcare practice. Low rates of adoption and limited use of evidence‐based interventions are persistent problems. Thus, the challenge of reducing the knowledge‐practice gap remains after more than two decades of research.

The aim of this editorial is to address the knowledge‐practice gap by raising awareness of a parallel gap: the somewhat paradoxical gap between scientific knowledge concerning implementation and the actual, real‐life use of this knowledge in healthcare practice.

This editorial is based on findings and conclusions presented in a doctoral thesis by the first author, which investigated the correspondence between available scientific knowledge on implementation and the implementation strategies used in three large healthcare improvement efforts in Sweden (Westerlund, 2018). An overall conclusion of the thesis was that there exists a parallel knowledge‐practice gap between scientific knowledge on implementation and the use of this knowledge in implementation efforts in healthcare practice (Westerlund, 2018; Westerlund et al., 2017). The findings showed that implementation knowledge was not transferred to healthcare practice (and practitioners) to a sufficient extent, thus restricting the systematic use of implementation knowledge in practice.

Implementation science has a twofold aim: to produce knowledge sufficiently generalizable to contribute to scientific knowledge accumulation and to produce knowledge applicable to improving practice (Fixsen, Blase, & Van Dyke, 2019). The use, applicability, and impact of implementation science have been highlighted previously, and the need to make implementation science knowledge more relevant and more widely disseminated has been called for in the literature (Armson, Roder, Elmslie, Khan, & Straus, 2018; McIsaac et al., 2018). Implementation knowledge is not taught in healthcare practitioners’ basic training and is only seldom addressed in continuing professional education. Although the literature on evidence‐based implementation is expanding and courses are increasingly being made available, these resources do not focus on practical issues or offer guidance on how to actually use implementation science knowledge in implementation endeavors (Nilsen, Neher, Ellström, & Gardner, 2017). Ovretveit, Mittman, Rubenstein, and Ganz (2017) have noted that healthcare practitioners are not expected to be knowledgeable about implementation science.

Although implementation science is widely considered an applied science, the extent to which knowledge produced in this field is actually used by practitioners is not known. There are few empirical studies concerning whether or how scientific knowledge on implementation is being used in healthcare practice (Armson et al., 2018). As implementation researchers, we need to ask ourselves whether our research findings and evidence on implementation have reached the world of practice to a sufficient degree.

There are many analytical tools aimed at supporting researchers’ use of implementation science in their research endeavors (Simpson et al., 2013). When approaching the implementation knowledge field, phrases such as the following are frequently encountered: “Theories and frameworks enhance implementation research” and “inform study design and execution” (Tabak et al., 2012, p. 6) or “Scholars seeking to study implementation have over 60 conceptual frameworks to guide their work” (Birken et al., 2017, p. 2). The impression is that models and frameworks are developed to “help advance implementation science” (Damschroder et al., 2009, p. 2). Recently, the ImpRes tool was developed with the stated purpose to “support research teams in the process of designing implementation research” (King's Improvement Science, 2018, p. 1). These observations raise the questions of whether other researchers are the primary target audience of implementation science knowledge and the extent to which the knowledge produced in the field actually reaches beyond academia. To a large extent, knowledge produced in implementation science still seems to belong to the scientific community rather than to the practitioners seeking to improve outcomes in health care (Armson et al., 2018; Ovretveit et al., 2017; Westerlund, 2018).

Considering the vast number and variety of empirical studies of implementation efforts across healthcare settings, there is no question that implementation science has produced knowledge of great potential relevance for use in health care. It seems highly plausible that conscious and systematic use of scientific knowledge on implementation would benefit change efforts in health care and would likely increase the adoption and use of research‐informed interventions to improve the quality of care. Hence, applying scientific knowledge on implementation in healthcare practice may help bridge the knowledge‐practice gap in health care.

So‐called “action models” such as Knowledge‐to‐Action (Graham et al., 2006) and the Quality Implementation Framework (QIF; Meyers, Durlak, & Wandersman, 2012) have been developed to guide the translation of research into practice. The originators of the QIF introduced the concept of “practical implementation science,” which refers not only to the translation of implementation science knowledge into user‐friendly resources but also to research and actions based on this translation. Meyers and colleagues stated that one of their goals was to “outline practical implications for improving future implementation efforts in the world of practice” (Meyers, Durlak, et al., 2012, p. 464). Building on the QIF, Meyers and colleagues developed what they referred to as a “practical implementation tool,” the Quality Implementation Tool, with the aim of assisting practitioners, and those providing support to practitioners, in implementing interventions with higher quality (Meyers, Durlak, et al., 2012; Meyers, Katz, et al., 2012). However, efforts like these, with the explicit goal of narrowing the gap between the science and practice of implementation, may not be sufficiently practice‐friendly or ready to use; we simply do not know, because studies of their utility and usability do not exist.

In many ways, making use of implementation science knowledge could be viewed as an important implementation strategy with the potential to reduce the knowledge‐practice gap in health care. However, studies are needed to explore and assess this assumption. We strongly recommend research efforts focusing on further development of the concept of “practical implementation science.” There is a need for research on the applicability and use of models and frameworks, as well as additional focus on the question of how to develop and evaluate more user‐friendly tools.

The rapidly growing body of evidence on implementation has the potential to bridge the knowledge‐practice gap in health care. However, implementation science knowledge is still predominantly in the domain of researchers. For knowledge on implementation to help bridge the knowledge‐practice gap, it needs to be translated into user‐friendly tools that are actually used by healthcare practitioners.

With this editorial, we hope to have raised awareness of the need for the implementation science community to reflect on how we can support the systematic use of implementation science knowledge among leaders and other practitioners in healthcare settings.

Implementation science was born out of a desire to bridge the knowing‐doing gap (i.e., the gap between what is known and what is actually done in health care). It is a paradox if the knowledge produced in this field fails to reach the world of practice. For the practice of implementation to be furthered, we as researchers have an obligation to contribute to improved utilization and translation of the knowledge produced in the implementation science field.


References

  1. Armson, H., Roder, S., Elmslie, T., Khan, S., & Straus, S. E. (2018). How do clinicians use implementation tools to apply breast cancer screening guidelines to practice? Implementation Science, 13, Article 79. Retrieved from https://implementationscience.biomedcentral.com/articles/10.1186/s13012-018-0765-2
  2. Birken, S. A., Powell, B. J., Presseau, J., Kirk, M. A., Lorencatto, F., Gould, N. J., … Damschroder, L. J. (2017). Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): A systematic review. Implementation Science, 12, Article 2. Retrieved from https://implementationscience.biomedcentral.com/articles/10.1186/s13012-016-0534-z
  3. Brownson, R. C., Colditz, G. A., & Proctor, E. K. (2018). Dissemination and implementation research in health: Translating science to practice (2nd ed.). New York, NY: Oxford University Press.
  4. Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4, Article 50. Retrieved from https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-4-50
  5. Eldredge, L. K. B., Markham, C. M., Ruiter, R. A., Kok, G., & Parcel, G. S. (2016). Planning health promotion programs: An intervention mapping approach. San Francisco, CA: Jossey‐Bass.
  6. Fixsen, D. L., Blase, K. A., & Van Dyke, M. K. (2019). Implementation practice & science. Active Implementation Research Network.
  7. Graham, I. D., Logan, J., Harrison, M. B., Straus, S. E., Tetroe, J., Caswell, W., & Robinson, N. (2006). Lost in knowledge translation: Time for a map? Journal of Continuing Education in the Health Professions, 26(1), 13–24.
  8. Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. Milbank Quarterly, 82(4), 581–629.
  9. Grol, R., Wensing, M., Eccles, M., & Davis, D. (2013). Improving patient care: The implementation of change in health care. Chichester, UK: Wiley Blackwell.
  10. King's Improvement Science. (2018). Implementation Science Research Development (ImpRes) tool and guide. London, UK: King's College London. Retrieved from https://impsci.tracs.unc.edu/wp-content/uploads/ImpRes-Guide.pdf
  11. McIsaac, J. L., Warner, G., Lawrence, L., Urquhart, R., Price, S., Gahagan, J., … Jackson, L. A. (2018). The application of implementation science theories for population health: A critical interpretive synthesis. AIMS Public Health, 5(1), 13–30.
  12. Meyers, D. C., Durlak, J. A., & Wandersman, A. (2012). The quality implementation framework: A synthesis of critical steps in the implementation process. American Journal of Community Psychology, 50(3–4), 462–480.
  13. Meyers, D. C., Katz, J., Chien, V., Wandersman, A., Scaccia, J. P., & Wright, A. (2012). Practical implementation science: Developing and piloting the quality implementation tool. American Journal of Community Psychology, 50(1), 481–496.
  14. Nilsen, P., Neher, M., Ellström, P. E., & Gardner, B. (2017). Implementation of evidence‐based practice from a learning perspective. Worldviews on Evidence‐Based Nursing, 14(3), 192–199.
  15. Ovretveit, J., Mittman, B., Rubenstein, L., & Ganz, D. A. (2017). Using implementation tools to design and conduct quality improvement projects for faster and more effective implementation. International Journal of Health Care Quality Assurance, 30(8), 755–768.
  16. Simpson, K. M., Porter, K., McConnell, E. S., Colon‐Emeric, C., Daily, K. A., Stalzer, A., & Anderson, R. A. (2013). Tool for evaluating research implementation challenges. Implementation Science, 8, Article 2. Retrieved from https://implementationscience.biomedcentral.com/articles/10.1186/1748-5908-8-2
  17. Tabak, R. G., Khoong, E. C., Chambers, D. A., & Brownson, R. C. (2012). Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine, 43(3), 337–350.
  18. Weinert, C. R., & Mann, H. J. (2008). The science of implementation: Changing the practice of critical care. Current Opinion in Critical Care, 14(4), 460–465.
  19. Westerlund, A. (2018). The role of implementation science in healthcare improvement efforts: Investigating three complex interventions (Doctoral thesis). Umeå University, Sweden.
  20. Westerlund, A., Garvare, R., Nyström, M. E., Eurenius, E., Lindkvist, M., & Ivarsson, A. (2017). Managing the initiation and early implementation of health promotion interventions: A study of a parental support programme in primary care. Scandinavian Journal of Caring Sciences, 31(1), 128–138.
