Clinical and Translational Science. 2015 Sep 9;8(6):820–823. doi: 10.1111/cts.12319

Interdisciplinary Priorities for Dissemination, Implementation, and Improvement Science: Frameworks, Mechanics, and Measures

Julian W Brunner 1, Ibrahima C Sankaré 2, Katherine L Kahn 2,3
PMCID: PMC4905745  NIHMSID: NIHMS709619  PMID: 26349456

Abstract

Much of dissemination, implementation, and improvement (DII) science is conducted by social scientists, healthcare practitioners, and biomedical researchers. While each of these groups has its own venues for sharing methods and findings, forums that bring together the diverse DII science workforce provide important opportunities for cross‐disciplinary collaboration and learning. In particular, such forums are uniquely positioned to foster the sharing of three important components of research. First: they allow the sharing of conceptual frameworks for DII science that focus on the use and spread of innovations. Second: they provide an opportunity to share strategies for initiating and governing DII research, including approaches for eliciting and incorporating the research priorities of patients, study participants, healthcare practitioners, and decision‐makers. Third: they allow the sharing of outcome measures well suited to the goals of DII science, thereby helping to validate these outcomes in diverse contexts, improving the comparability of findings across settings, and elevating the study of the implementation process itself.

Keywords: dissemination, implementation, improvement, outcomes, collaboration, research, diffusion, framework, governance, workforce

Introduction

Although the individual fields of dissemination, implementation, and improvement (DII) each have a different emphasis,1, 2 together they share a great deal: most notably, an unwavering focus on meeting the evidence needs of healthcare and policy decision makers. Here, we will consider these areas of inquiry together as a single science of DII—both to reflect their common goals and methods and to mirror the scope of the symposium that gave rise to this set of papers.

In the vision for a learning healthcare system, operational questions regularly undergo iterative investigation to enhance the effectiveness of care.3 In this context, the distinctions across stakeholders diminish, with practitioners continually contributing to learning and improvement. Today, however, the roles of practice and research tend to remain separate—even when they are performed by the same person. So, as a waypoint toward that vision, it is important that those conducting DII science share conceptual frameworks, strategies for initiating and governing DII research, and outcome measures well suited to its goals.

DII Science Symposium

The work presented here was based on and guided by preliminary findings, community engagement efforts, and forum discussions presented at the 2014 Southern California DII Science Symposium, sponsored by the University of California Los Angeles (UCLA) Clinical and Translational Science Institute (CTSI), the University of Southern California (USC) CTSI, and Kaiser Permanente. The goal of the day‐long symposium was to increase the quantity and quality of DII science programs and activities by (a) sharing knowledge and information about current DII science‐related activity in the greater Los Angeles area, and (b) fostering networking and collaboration among experienced DII science researchers, academics who are new to the field, and community partners, to increase their participation in the Initiative's mission, goals, strategies, and operational plans. A diverse group of 129 participants included senior and junior researchers, research fellows, leaders of local healthcare delivery systems and public health agencies, and research partners from community‐based organizations in Los Angeles, as well as individuals representing each of the groups described below. Keynote speakers representing funding agencies, public and private delivery systems, and medical associations addressed the importance of, and opportunities for, implementation and improvement science research. Participants discussed the challenges of designing and executing implementation and improvement science research that meets the needs of stakeholders in healthcare, public health, and communities. Breakout sessions were moderated by a faculty member and a fellow selected for their expertise and experience in the discussion topic. Attendees shared their own work relevant to DII science—both ongoing and in development—and brainstormed challenges and potential solutions. Rather than seeking consensus, these group sessions sought to describe both barriers and potential solutions for achieving common priorities in the conduct of DII science. A separate article in this issue4 provides more information on the background, mission, and goals of the symposium.

Who Does DII Science

There is a growing community of specialists in DII science, fostered in part by a small number of dedicated training programs funded by the NIH.5 However, much of the work in DII science is conducted by researchers or healthcare professionals for whom DII science is a secondary area of interest, namely social scientists, healthcare practitioners, and biomedical researchers. This is, of course, not an exhaustive list, but it represents the overlapping groups of individuals who design and carry out DII science alongside the community of specialists in the field.

Social sciences and other traditional academic disciplines

DII science is by design a broad area of inquiry that encompasses diverse intellectual roots emanating from the social sciences and public health,6 and it draws especially on fields that combine approaches from multiple disciplines—for example, health services research, behavioral economics, and management sciences—each of which is itself a “hybrid field” drawing on economics, sociology, psychology, epidemiology, and several other fields in different proportions. These disciplines offer methods, theories, and empirical findings related to the behavior of individuals (relevant to healthcare professionals) and institutions (relevant to organizations that provide, fund, or otherwise facilitate health and healthcare). Together, these behaviors constitute the fundamental focus of DII science.

Healthcare practice

DII science is notable for its inclusion of professionals whose primary role is to care for or support the health of a specific community. A defining characteristic of these practitioners in DII science is a desire to infuse the rigor of research into operational activities and to produce generalizable knowledge in the process.2 Likewise, these individuals are ideally situated to synthesize theory‐ and research‐based insights with practical insights, and to use this synthesis to inform implementation activities.7 One attendee asserted that academic medical centers have often failed to change their practice in response to their own research findings, and that organizations are increasingly attempting to “practice what they publish.”

Biomedical research

Researchers in medicine—from basic science to clinical trials of medical intervention efficacy and effectiveness—all depend in some way on DII to improve the impact of their research. For this reason, funders are increasingly encouraging medical researchers to go beyond hypothetical statements of possible impact, and to incorporate aspects of DII science in their work.1, 8 Attendees also provided examples of researchers who have worked primarily in the early phases of translation (i.e., T0, T1, and T2) turning their attention to the later phases of translation that characterize DII science. For example, one investigator with experience in randomized efficacy trials described the necessity of diverging from the methods and conceptual frameworks typically used, and learning about approaches best suited to DII.

Several researchers have called for universities to play a more prominent role in implementation science, instead of leaving “postdiscovery” activities to be handled chiefly by governmental agencies and nongovernmental organizations. For example, HIV researchers have proposed that HIV is exactly the type of health challenge that requires academia—and in particular health‐related academic departments—to take on a leadership role in both conducting implementation research and training students to conduct this research.9 Establishing “practice tracks” and rewarding faculty for community impact as well as publication productivity are two mechanisms that some universities have used to support and reward leadership in DII science.10

Clinical trials that are increasingly targeted at “postregulatory” decision makers such as payers and patients11 can also bring biomedical researchers into DII science. Much of what we think of as DII science most closely resembles health services research, e.g., in its emphasis on complex, system‐level interventions, and in its use of mixed methods and natural experiments. However, patient‐centered outcomes research (PCOR), as well as the closely related fields of comparative effectiveness research and pragmatic clinical trials, can represent facets of DII science in their focus on study designs with high external validity and interventions that are highly adaptable to diverse settings and patient populations.12 Similarly, the central importance of engaging patients and end‐users of research in identifying meaningful outcomes and study designs represents another substantial overlap between PCOR and DII science.13 Conducting trials that embody principles of patient‐centered outcomes research can be an entry point to DII science for biomedical researchers, as those trials include more and more of the delivery system factors that determine treatments' effectiveness in “real world” settings.14

Priorities for Sharing

Each of these stakeholder groups has its own discipline‐specific journals and conferences—often several within a single group—but DII science offers an opportunity for these groups to share strategies that cut across segments of the research and practice workforce.

At the Southern California DII Science Symposium held in March 2014, we convened a discussion about research topics and priorities in DII science. Attendees included specialists in DII science as well as individuals representing each of the groups described above. Attendees were invited to share their own work relevant to DII science—both ongoing and in development—and to propose common priorities for advancing the conduct of DII science. From this conversation, three topics emerged as particularly important to share across the groups that make up the DII science workforce. These three topics, discussed at greater length below, are: (1) conceptual frameworks for DII, (2) strategies for initiating and governing DII research, and (3) outcome measures well suited to the goals of DII science.

Frameworks

Conceptual frameworks in DII science tend to be built primarily to guide either research or implementation, though several can serve both functions. However, this variety of purposes can make the process of identifying an appropriate framework challenging: some frameworks are best suited to expressing hypothesized relationships among variables in research, while others are designed expressly to guide practice—i.e., to provide a bridge between theories, empirical findings, and specific implementation strategies.15

Conceptual frameworks in a health planning context have been defined as “strategic or action‐planning models that provide a systematic way to develop, manage, and evaluate interventions,”16 and thus sit closer to the implementation side of the spectrum. Evaluation frameworks span the continuum: formative evaluation emphasizes improvements in the design and delivery of interventions while summative evaluation emphasizes generalizable findings about interventions' effectiveness. Likewise, while the majority of frameworks for DII concern the implementation of interventions, some can guide the ongoing maintenance and improvement of capabilities; these frameworks also span the continuum—for example, one such framework quite intentionally represents the iterative and bidirectional process linking research with practice.17

Health services, biomedical, and clinical researchers in the group all expressed an interest in learning about existing and emerging conceptual frameworks suitable for DII science. Researchers whose expertise lies outside DII science may be most familiar with conceptual frameworks that do not fully reflect the unique challenges and context of DII science—for example, theories of individual behavior change or healthcare access. Forums for DII science offer an excellent opportunity for researchers to become acquainted with frameworks and theories that have already been developed and adapted for individual and organizational behavior in healthcare settings and that focus on the use and spread of innovations. Some examples of such conceptual frameworks are provided below.

Additional resources

A systematic review by Greenhalgh et al. serves as a key resource for identifying frameworks intended to guide practice—i.e., to aid in the dissemination and implementation of innovations.18 This review also led to the development of a widely used overarching framework summarizing relevant constructs from many research traditions.19 Another widely used framework is the Interactive Systems Framework (ISF), which aims to support not only practitioners and researchers but also funders and providers of technical assistance.17

An AHRQ analysis of quality improvement strategies is another resource for frameworks that help translate knowledge into practice: it outlines the theoretical basis for quality improvement interventions and introduces a taxonomy of quality improvement strategies.20

Likewise, a 2012 inventory of theories and frameworks for dissemination and implementation research21 identified 61 models and grouped them according to their flexibility, their orientation toward dissemination or implementation, and the level(s) at which they operate (e.g., individual, organizational, community, and system). Among the frequently used models highlighted in the inventory are the Consolidated Framework for Implementation Research (CFIR),22 which is particularly useful for representing and studying factors above the individual level, and the Theoretical Domains Framework (TDF), which is well suited to representing individual behavior change.23

The frameworks noted here are recent, frequently cited examples, but they illustrate the DII‐specific models that enable separate research activities to form a coherent body of literature; further convergence around these models could allow empirical findings to more easily inform and advance theories in DII science.

Mechanics: strategies for initiating and governing DII research

Scientific disciplines outside of DII science acknowledge the importance of linking research with practice and policy, and many researchers and organizations in these disciplines work diligently to make research relevant to potential users. For DII science, this link is a core, defining goal. Strategies for creating and nurturing this link are of particular interest to DII science and are highly relevant to the healthcare practitioners, social scientists, and biomedical researchers who practice it. We discussed a promising model for initiating research with an inherent link to healthcare practice: “embedded research,” in which a researcher joins an implementation team to improve opportunities to learn from implementation. The embedded researcher helps the team balance rigorous data collection with speed and efficiency. The Department of Veterans Affairs (VA) has used this model since the 1980s and attributes a great deal of its significant quality improvements in the 1990s to this approach, describing it as a necessary complement to “top‐down” strategies.24 Kaiser Permanente of Southern California recently established a research‐operations partnership (ROP) to enact this model and, with the HMO Research Network, hosted a conference to allow organizations to share lessons from embedded research.25

A recurring challenge for these endeavors, however, is this question: When does the careful and systematically studied implementation of an intervention become research? When the central purpose of an effort is (local) quality improvement, what can be added to improve the usefulness and generalizability of the information generated in the course of implementation? What kind of organizational learning turns a patient or consumer into a research participant?

Additional resources

In a seminal report, bioethicists at Johns Hopkins proposed a framework for the ethical oversight of the learning healthcare system.26, 27 The report suggests that ethical oversight should be proportional to the potential risks to participants, rather than determined by the generalizability of the learning. The DII science community is well positioned to lead in the ethical practice of learning across the research/QI continuum. The Common Rule still treats the generalizability of knowledge as a key factor in determining the degree of ethical oversight required.28 However, DII scientists can advocate for ethical oversight—including requirements for various forms of informed consent or IRB approval—that more closely considers indicators of potential risk to patients, rather than relying chiefly on whether local lessons can inform activities outside of a single organization.29

Another critical approach for tying research to practice is community‐partnered participatory research (CPPR), which focuses on an equitable partnership between communities and academics in all phases of the research process.30 CPPR emphasizes the community beyond the clinic and helps ensure that research participants and their communities—at both the individual and organizational levels—are part of the decision‐making process in prioritizing research questions and in study design. This bidirectional exchange can enhance the ability of DII projects to provide real‐world solutions while simultaneously improving community and academic research capacity.21

Outcome measures

Every field has its own set of commonly used outcome measures for studies of intervention efficacy or effectiveness. Outcomes related to DII, however, tend to be more proximate to the implementation process. For example, outcomes in implementation have been defined as “the effects of deliberate and purposive actions to implement new treatments, practices, and services.”31 Similarly, a taxonomy of implementation outcomes identified acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, coverage, and sustainability as emblematic of implementation science.6

A cohesive community of DII scientists can elevate implementation outcomes by elevating the study of the implementation process itself as opposed to the study of the intervention being implemented. Furthermore, collaboration and sharing among those conducting implementation and improvement activities can lead to outcome measures that are validated in diverse contexts, and it can improve the comparability of DII findings across settings—particularly useful for meta‐analysis. Even more promising is the potential for this communication to foster empirical testing of relevant theories, thereby advancing the clarity and validity of DII science outcome measures.

Conclusion

These “sharing priorities” are important for all groups doing DII science, and they are probably the ideas that the groups described above (social scientists, healthcare practitioners, and biomedical researchers) are least likely to have in common. Therefore, forums focusing on DII have a potentially important role to play in developing common tools that can be deployed by practitioners, biomedical researchers, social scientists, and specialists in DII. Moreover, the importance of sharing conceptual frameworks, outcomes, and research strategies across settings and stakeholder perspectives speaks to the value of DII as a discipline and as a community.

Conflict of Interests

The authors declare that they have no conflicts of interest, financial or otherwise.

Previous Presentations

The work is based on preliminary findings, stakeholder engagement efforts, and forum discussions presented at the 2014 Southern California Regional Dissemination, Implementation and Improvement Science Symposium sponsored, in part, by the National Institutes of Health/National Center for Advancing Translational Sciences through UCLA CTSI Grant Number UL1TR000124, the Southern California CTSI Grant Number UL1TR000130, and Kaiser Permanente Southern California.

Acknowledgments

The research described was supported, in part, by the National Institutes of Health/National Center for Advancing Translational Sciences through UCLA CTSI Grant Number UL1TR000124, the Southern California CTSI Grant Number UL1TR000130, and Kaiser Permanente Southern California. Work on the manuscript was also supported by UCLA CTSI Grant Number TL1TR000121. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

References

  • 1. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am J Public Health. 2012; 102(7): 1274–1281. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2. Marshall M, Pronovost P, Dixon‐Woods M. Promotion of improvement as a science. Lancet. 2013; 381(9864): 419–421. [DOI] [PubMed] [Google Scholar]
  • 3. Etheredge LM. A rapid‐learning health system. Health Aff (Millwood). 2007; 26(2): w107–w118. [DOI] [PubMed] [Google Scholar]
  • 4. Inkelas M, Brown AF, Vassar SD, Sankaré IC, Martinez AB, Kubicek K, Kuo T, Mahajan A, Gould M, Mittman BS. Enhancing dissemination, implementation and improvement science in CTSAs through regional partnerships. Clin Transl Sci. In press. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 5. Meissner HI, Glasgow RE, Vinson CA, Chambers D, Brownson RC, Green LW, Ammerman AS, Weiner BJ, Mittman B. The U.S. training institute for dissemination and implementation research in health. Implement Sci. 2013; 8: 12. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 6. Peters DH, Adam T, Alonge O. Implementation research: what it is and how to do it. BMJ. 2013; 347: f6753. [DOI] [PubMed] [Google Scholar]
  • 7. Gonzales R, Handley MA, Ackerman S, O'Sullivan PS. A framework for training health professionals in implementation and dissemination science. Acad Med. 2012; 29(6): 1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Selby JV, Beal AC, Frank L. The Patient‐Centered Outcomes Research Institute (PCORI) national priorities for research and initial research agenda. JAMA. 2012; 307(15): 1583–1584. [DOI] [PubMed] [Google Scholar]
  • 9. El‐Sadr WM, Philip NM, Justman J. Letting HIV transform academia—embracing implementation science. N Engl J Med. 2014; 370(18): 1679–1681. [DOI] [PubMed] [Google Scholar]
  • 10. Brownson RC, Kreuter MW, Arrington BA, True WR. Translating scientific discoveries into public health action: how can schools of public health move us forward? Public Health Rep. 2006; 121(1): 97–103. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 11. Tunis SR, Stryer DB, Clancy CM. Practical clinical trials. JAMA. 2003; 290(12): 1624–1632. [DOI] [PubMed] [Google Scholar]
  • 12. Kahn KL, Adams JL, Weeks JC, Chrischilles EA, Schrag D, Ayanian JZ, Kiefe CI, Ganz PA, Bhoopalam N, Potosky AL, et al. Adjuvant chemotherapy use and adverse events among older patients with stage III colon cancer. JAMA. 2010; 303(11): 1037–1045. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13. Frank L, Basch E, Selby JV; For the Patient‐Centered Outcomes Research Institute . The PCORI perspective on patient‐centered outcomes research. JAMA. 2014; 312(15): 1513–1514. [DOI] [PubMed] [Google Scholar]
  • 14. DeMets DL, Califf RM. A historical perspective on clinical trials innovation and leadership: where have the academics gone? JAMA. 2011; 305(7): 713–714. [DOI] [PubMed] [Google Scholar]
  • 15. Davies P, Walker AE, Grimshaw JM. A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implement Sci. 2010; 5: 14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16. Green L, Kreuter M. Health program planning: An educational and ecological approach. 2005. Available at: http://library.wur.nl/WebQuery/clc/1769237. Accessed April 3, 2015.
  • 17. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, Blachman M, Dunville R, Saul J. Bridging the gap between prevention research and practice: the interactive systems framework for dissemination and implementation. Am J Commun Psychol. 2008; 41(3–4): 171–181. [DOI] [PubMed] [Google Scholar]
  • 18. Greenhalgh T, Robert G, Bate P, Macfarlane F, Kyriakidou O. Diffusion of Innovations in Health Service Organisations: A Systematic Literature Review; 2005. doi:10.1002/9780470987407. [DOI] [PMC free article] [PubMed]
  • 19. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004; 82(4): 581–629. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20. Shojania KG, McDonald KM, Wachter RM, Owens DK, Markowitz AJ. Closing the quality gap: a critical analysis of quality improvement strategies. Agency Health Care Res Qual. 2004; (9), AHRQ Publication No. 04‐0051‐1. [Google Scholar]
  • 21. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012; 43(3): 337–350. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 22. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009; 4: 50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23. Cane J, O'Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012; 7: 37. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24. Rubenstein LV, Pugh J. Strategies for promoting organizational and practice change by advancing implementation research. J Gen Intern Med. 2006; 21(Suppl. 2): S58–S64. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25. HMO Research Network: “Embedded Research to Improve Health.” 2014. Available at: http://www.hmornmeeting.org/index.html.
  • 26. Faden RR, Kass NE, Goodman SN, Pronovost P, Tunis S, Beauchamp TL. An ethics framework for a learning health care system: a departure from traditional research ethics and clinical ethics. Hastings Cent Rep. 2013; Spec No(February): S16–S27. [DOI] [PubMed] [Google Scholar]
  • 27. Kass NE, Faden RR, Goodman SN, Pronovost P, Tunis S, Beauchamp TL. The research‐treatment distinction: a problematic approach for determining which activities should have ethical oversight. Hastings Cent Rep. 2013; Spec No(February): S4–S15. [DOI] [PubMed] [Google Scholar]
  • 28. Silberman G, Kahn KL. Burdens on research imposed by institutional review boards: the state of the evidence and its implications for regulatory reform. Milbank Q. 2011; 89(4): 599–627. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29. Platt R, Grossmann C, Selker HP. Evaluation as part of operations: reconciling the common rule and continuous improvement. Hastings Cent Rep. 2013; Spec No: S37–S39. [DOI] [PubMed] [Google Scholar]
  • 30. Jones L, Wells K. Strategies for academic and clinician engagement in community‐participatory partnered research. JAMA. 2007; 297(4): 407–410. [DOI] [PubMed] [Google Scholar]
  • 31. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011; 38: 65–76. [DOI] [PMC free article] [PubMed] [Google Scholar]
