Implementation Science Communications. 2025 Nov 27;7:2. doi: 10.1186/s43058-025-00831-9

So what? Elevating the impact of implementation science

Ross C Brownson 1,2,, Juliet Iwelunmor 3, Thomas A Odeny 2,4, Enola K Proctor 5, Elvin H Geng 6
PMCID: PMC12763859  PMID: 41310907

Abstract

Background

Given the substantial public funding of health-related research, tangible benefits of this support must be demonstrated. Implementation science provides actionable methods to enhance population health, reduce health inequities, and guide effective public health and clinical practices and policies. We must elevate the notion of impact (the “so-what gap”) and the role of implementation science, particularly in university settings.

Main text

We distinguish between scientific outputs and impacts. Impacts in implementation science are commonly defined as improvements in health outcomes, quality of life, quality of services, or policy change. In contrast, traditional academic outputs, such as citation counts and grant awards, hold minimal direct societal relevance. Principles of audience segmentation (partitioning the target audience for dissemination and implementation into smaller groups by meaningful distinctions), which are increasingly applied in implementation science, can enhance impact. We highlight trade-offs in enhancing the focus on impact across multiple categories (e.g., accountability, evaluation). We describe four essential domains of implementation impact: speed of research translation, sustainability, de-implementation, and equity. Multiple examples, across diverse topics, illustrate these domains (e.g., HIV treatment, use of community health workers). To boost impact via more active dissemination and implementation of research findings, we provide ideas within five categories: (1) co-production of knowledge, (2) tailored dissemination, (3) organizational support, (4) capacity building, and (5) implementation metrics.

Conclusions

Generating new research knowledge does not guarantee societal impact. For implementation science to become more relevant to societal needs, enhancing and evaluating its impacts matter; otherwise, systemic changes required in institutions will continue to evolve slowly. We argue that impactful implementation science involves developing new skill sets and uncovering meaningful work that changes the field while adopting a collaborative working approach with individual researchers, their organizations, funders, and the communities they aim to benefit. Navigating the hurdles and translating research into practice and policy can amplify societal impact, making implementation science more applicable, accessible, and equitable for all.

Keywords: Clinical practice, Equity, Implementation science, Population health, Scale-up


Contributions to the literature.

  • Impact has different meanings for different audiences, and university-based researchers mainly focus on academic outputs (e.g., journal impact factors, grants) that represent knowledge generation itself and may have attenuated relevance to the rest of society.

  • Actions to enhance implementation science impact should consider the trade-offs in shifting the research enterprise.

  • Impactful implementation science emphasizes four essential implementation domains (speed of research translation, sustainment, de-implementation, and equity).

  • Implementation science impacts can be improved by both incremental and transformational change.

“What you do makes a difference, and you have to decide what kind of difference you want to make.”

– Jane Goodall

Introduction

The substantial public investment in health-related research must ultimately yield tangible benefits to population health and society [1]. Globally, funders place a high priority on research; they also increasingly want more information on the effectiveness of approaches and on societal benefits, including tangible impacts on population health [2–6]. In a survey of 31 health funders across the globe, van der Linden and colleagues found a high priority for dissemination and implementation (D&I) of research to increase value and impact [4]. The team also identified challenges in identifying and applying successful approaches for D&I, along with metrics for assessing impact [4].

Much clinical and public health research yields tangible benefits for practice, policy, and society. For example, one of the significant achievements of public health and medical science is reducing the burden of vaccine-preventable diseases (e.g., polio eradication; control of cervical and liver cancer) [7, 8]. Despite such evidence, the gaps that remain between what could work to improve health and what is applied in public health practice, clinical practice, and policy change are the focus of growing attention and dissatisfaction [9–11]. The substantial benefits of bridging this chasm remain unrealized and are even greater in many parts of the world. For example, scaling up just 10 community-level interventions in five East African countries (Burundi, Kenya, Rwanda, Uganda, and the United Republic of Tanzania) could avert approximately 75,000 child deaths [12].

The science of implementation focuses on increasing the adoption, implementation, and sustainment of evidence-based programs and policies (EBPPs), ensuring that implementation strategies align with the context of the setting or population, and improving equity across settings. To date, the implementation science literature has focused primarily on “what” and “how”—defining the field’s key concepts and answering questions about how strategies work. In brief, “knowing what to do” does not ensure “doing what we know” [13]. This so-called “know-do gap” is partially due to ineffective implementation and sustainment of EBPPs [14]. Now is the time for implementation science and the related research community to ask questions of “so what?”—emphasizing and crafting paths to impact that demonstrate a more tangible return on investment.

Therefore, in this article, we describe (1) implementation science and potential impacts; (2) definitions of impact, trade-offs, and audience differences; (3) the distinctive role of selected implementation science domains to enhance impact; and (4) recommendations for achieving greater and more equitable impacts.

Implementation science and its impacts

Implementation science focuses on advancing impact, which is crucial for transforming scientific discoveries into actionable strategies that improve population health, reduce health inequities, and inform effective public health, clinical practice, and policy [15, 16]. The growth of implementation science has been a critical shift in the biomedical sciences [17]. The focus on impact coincides with a movement toward consequentialism in research [17], which aligns with the utilitarian ethic of maximizing positive impacts and relies in part on implementation science [18]. The shorter-term impacts of implementation science may include intentions to use evidence in program or policy development, as well as increased self-efficacy among practitioners when using or finding evidence. In the medium term, indicators of impact may include greater use of EBPPs in public health settings or improvements in clinical practice [19]. Long-term impacts include well-known outcomes such as disease burden (e.g., years of life lost) or economic value (e.g., cost-effectiveness) [20].

A focus on policy implementation is crucial because policy has a profound impact on population health [21]. However, policy implementation is understudied [22], and the policy process has distinctive implementation properties (less control over timing, the need for local data, the role of ideology, and special interests) [23]. The scale-up of the US President’s Emergency Plan for AIDS Relief (PEPFAR) is an example of a significant advance in addressing the global HIV/AIDS epidemic. In PEPFAR, high incidence and mortality from HIV/AIDS (the problem), funding for life-saving antiretroviral therapy (the policy), and effective advocacy coupled with bipartisan support (the politics) converged [24].

Defining impacts, trade-offs, and audiences

Many scholars and organizations call for a broader, more real-world view of research impact [25, 26]. A standard definition of impact is “a significant or major effect” [27], which has a general meaning across all of society. Selected large-scale efforts and reviews have developed definitions to better document research impact outside of academe [28–33]. The most common elements across these definitions include quality of services, policy change, health outcomes, quality of life, economic benefits, environmental changes, and social/cultural impacts [32, 33]. Other elements include avenues to change (attitudes, awareness, funding) across multiple levels (local, regional, national, and international) [33]. While not explicit in most definitions, impact can be positive or negative. For example, community development to enhance the built environment can result in benefits (e.g., greater walkability, economic growth) and unintended consequences (e.g., gentrification, displacement of long-time residents) [34]. In Africa, traditional indoor cooking with biomass fuels produces smoke, which, though detrimental to respiratory health, has an observed repellent effect on mosquitoes [35]. Consequently, the adoption of smokeless cooking devices, while significantly reducing indoor air pollution and associated respiratory ailments, could potentially increase exposure to mosquito-borne diseases.

Trade-offs in increasing a focus on impact

Increasing the focus on impact in implementation science involves changes in thinking and practice. Every decision in implementation science involves one or more trade-offs (comparative gains and losses). While not covering these topics in depth, we offer a series of trade-offs that are often driven by available resources, organizational priorities, skill sets, policy, and politics (Table 1). External accountability pressures researchers to demonstrate medium- to long-term societal impact, sometimes in conflict with shorter-term academic incentives. Definitions of impact vary across disciplines and community segments. Projects should focus on long-term societal outcomes, but these outcomes often extend beyond the scope of research timelines. High-impact, scalable strategies may neglect marginalized groups who need additional resources and tailored approaches. Enhanced co-production and storytelling may improve impact documentation but require skills beyond traditional research competencies, risking increased bias and burdens on academic freedom. In practice, these trade-offs are seldom either/or considerations; rather, as researchers strive to enhance impact outside of academic settings, they should keep these and other trade-offs in mind and seek to maximize the return on investment for their scholarship.

Table 1.

Trade-offs of increasing the focus on impact in implementation science

External Accountability

 Societal impact
  Pro: Research should more fully document societal impacts; stakeholders (funders, partners, residents) expect a return on investment from research
  Con: A focus on societal impacts does not comport with academic incentives for promotion and tenure; research may be subject to political review as social forces act through politics

 Social contract
  Pro: Publicly and privately funded researchers have a social contract and an obligation to show societal benefits
  Con: Every academic discipline has a different definition of impact, making it challenging to fulfill a social contract; impact is defined differently across community segments with distinct social and political values

Focus and Scope

 Project focus
  Pro: Impact beyond standard academic metrics (grants, publications) should be a focus of every research project
  Con: The time horizon for societal impacts is long, much longer than the duration of a typical research project

 Equity focus
  Pro: More attention is needed on high-impact, scalable strategies in implementation science to address inequities; lessons about equitable implementation are available across countries, settings, and populations
  Con: High-impact approaches may leave out marginalized groups who need more tailored and resource-intensive strategies; groups marginal to science may not hold social power or privileged access to public discourse (e.g., news media, social media)

Engagement and Communication

 Co-production
  Pro: Impact is enhanced by co-producing implementation science with practitioners, policymakers, and/or payers
  Con: Participatory research is time-intensive and beyond the scope of many research projects

 Storytelling
  Pro: Stories backed up by data (and data backed up by stories) can better document the impact of implementation science
  Con: Researchers are not well-qualified to tell stories; storytelling requires audience segmentation and knowledge, which is a profession in and of itself (e.g., journalism)

Evaluation and Reporting

 Standardization
  Pro: Standardized frameworks help to show the value of implementation science
  Con: Standardized approaches to measuring impact may stifle innovation, impede academic freedom, and increase the burden of science

 Measurement
  Pro: There are multiple methods for measuring the societal impacts of implementation science
  Con: There is a high potential for bias in the assessment of impact, in part because we measure the elements that are easy to measure (publications)

It is also important to consider impact trade-offs in light of the social context for research. Socially responsive research is essential, yet it also carries risks [36]. Social forces—particularly when amplified through political mechanisms—can shape science in ways that undermine its integrity. In the current climate, scientific discourse is subject to anti-vaccination ideologies or partisan agendas at the highest levels of government. If science is expected to be accountable to society, and government is presumed to represent society, then the rise of “cancel science” movements illustrates how social influence can produce instability and whiplash in the scientific community.

Audiences and audience segmentation

Impact has different meanings for different audiences. University-based researchers often prioritize academic outputs (e.g., citation rates, journal impact factors, grants) that represent knowledge generation itself, which may have an attenuated direct relevance to the rest of society. As appropriate for specific areas of scholarship, researchers often prioritize discovering new knowledge (rather than applying it). Several factors drive the focus on academic metrics among researchers: academic incentives (promotion/tenure guidelines), ease in measuring outputs (grants, publications), difficulty showing the causal attribution of research impacts to practice and policy change, and lack of skills among researchers in non-academic dissemination. It is also essential to focus on the impact of scholarship rather than only on published research. White papers or testimony to policy bodies may be highly influential even if the relevant scholarship is not published in journal articles [37].

Audience segmentation helps enhance impact. Audience segmentation, which originated in business and the social sciences, involves partitioning the target audience for D&I into smaller groups by meaningful distinctions (e.g., demographic, psychographic) [38–41]. Segmentation helps to ensure that research products meet the unique requirements of market segments, in part by delivering tailored messages to enhance the timeliness, relevance, and usefulness of research findings. Audience characteristics (e.g., motivations, social influences, time urgency) are considered to fine-tune messages and dissemination strategies [42]. Table 2 summarizes key audience differences regarding the impact of research. Practitioners and policymakers value practical ways to apply knowledge to their settings, often with adaptation for local relevance [43–45]. Given the many audiences for research uptake, we do not suggest that researchers alone should fill the dissemination void. Researchers can and should often partner with the audiences in Table 2 to form a bridge between the generators of knowledge and those most likely to effect change and impact.

Table 2.

Audience segmentation for more impactful implementation science

For each segment, we list relevant characteristics, messages and impacts, and channels (the route of message delivery to reach the audience segment).

Health researchers
 Relevant characteristics: deep knowledge of specialized issues; high commitment to health; long horizons in the research process; main incentives to publish research and obtain grant funding; few incentives to disseminate research outside the scientific world
 Messages and impacts: make a difference in society; improve health equity; enhance resources for research; evidence being used to guide practice; enhanced capacity among research partners
 Channels: journal articles, especially in leading journals; professional associations/meetings; leading scholars (opinion leaders); study participant feedback

Public health practitioners
 Relevant characteristics: high commitment to health; wide range of professional backgrounds; access to summaries of evidence but often not the original research; mid- to long-term horizon for impacts
 Messages and impacts: make a difference in society; improve health equity/social justice; prevention saves lives; enhance resources
 Channels: leadership meetings, preferably face-to-face; national government agencies; professional associations; brief summaries of evidence; social media

Clinical practitioners
 Relevant characteristics: high commitment to health; narrow range of professional backgrounds; time urgency; short-term horizon for impacts
 Messages and impacts: improve patient care; improve quality and safety of care; improve health and well-being; improve health equity; enhance efficiency of care
 Channels: journal articles; professional associations; professional conferences; brief summaries of evidence; social media; in-service training

Policymakers (Big P policy, elected officials)
 Relevant characteristics: variable commitment to health (often limited knowledge across many issues); wide range of professional backgrounds; influence of political parties and ideology; some rely heavily on staff; trust and credibility built on personal relationships; short-term horizon for impacts
 Messages and impacts: serve constituents; improve people’s lives; create return on investment; pressure of re-election
 Channels: real-world stories; brief summaries of evidence; delivery of messages by opinion leaders

Staff members for elected officials
 Relevant characteristics: variable commitment to health (often limited knowledge across many issues); wide range of professional backgrounds; will read longer reports; trust and credibility built on personal relationships; short-term horizon for impacts
 Messages and impacts: serve elected official(s); serve party priorities; create return on investment
 Channels: longer summaries of evidence; experiences from similar jurisdictions; delivery of messages by opinion leaders

Agency and system administrators, directors, and leaders (small p policymakers)
 Relevant characteristics: high commitment to health (if in a health agency/system); narrow range of professional backgrounds; time urgency; short to medium horizon for impacts; among those in government, there are limits on advocacy
 Messages and impacts: improve care/population health; seek congruence of outcomes with strategic plans/agency aims; manage budgets effectively and efficiently
 Channels: longer summaries of evidence for selected issues; delivery of messages by opinion (other system) leaders; professional associations; feedback from embedded researchers

Community members and community partners
 Relevant characteristics: variable commitment to health; value different types of ‘knowledge’ and ‘evidence’ regarding health; impacted by personal or familial experiences as patients; wide range of professional backgrounds; short-term horizon for impacts
 Messages and impacts: provide tangible benefits or relevance to self, family, and community; value as a service or resource to the community; seek not to incur high costs or financial burden
 Channels: local media channels (including social media); real-world stories; culturally appropriate media (local papers, radio); local community leaders (opinion leaders); economic impacts of interventions

Research payers
 Relevant characteristics: high commitment to public health; priority on evidence; impacted by advocacy groups and advisory boards; typically former researchers or policymakers; intermediate and long-range horizons
 Messages and impacts: return on investment of research dollars; balance a portfolio of research types; accountability to elected officials, ministers of health, and boards; seek congruence of outcomes with strategic plans/agency aims; manage budgets effectively and efficiently
 Channels: researchers’ annual and final reports; peer-reviewed publications; brief summaries of key findings

Adapted and expanded from Brownson et al. [43], Shato et al. [44], and Shelton and Brownson [45]

The impact of translational research on policy and practice can be defined in ways beyond individual-level health outcomes. Nearly 50 years ago, Weiss encouraged using more flexible and exact concepts about the policy environment [46]. She argued that research can change policy and practice by prompting a new conceptualization of a problem in a policy community. For example, Farmer’s work in Haiti used a single-arm case series of HIV treatment success without sophisticated laboratory support [47]. These efforts helped the global public health community reimagine HIV treatment as a public health challenge rather than a medical care issue. In some ways, Weiss’s model requires a paradigm shift in what counts as impactful research. Research on HIV treatment changed the perception that HIV had to be treated by highly specialized doctors with sophisticated monitoring. While no one has adopted the original Farmer model at a larger scale (e.g., no laboratory monitoring), the research changed mindsets and led to global investments that would otherwise have been impossible.

Implementation science domains to enhance impact

When research addresses real-world challenges, is conducted with rigorous methods, and is disseminated to key leaders and partners, beneficial impacts occur [48, 49]. Implementation science methods underscore the impact of research with efforts to close the so-what gap [50].

While not an exhaustive list, to illustrate documented and potential impacts of implementation science, we describe four exemplar implementation topics (speed of research translation, scale-up and sustainment, de-implementation, and equity) along with brief examples (Table 3) [51–55, 57, 58]. We chose these topics because population health impact is maximized when EBPPs are implemented with speed [59], sustained over time [62], with unnecessary or harmful practices de-implemented [63], and equity prioritized to ensure all populations benefit [64]. These four domains are also priority action areas for the broad field of implementation science [59–61]. Each example illustrates how implementation science concepts and methods contributed to impact, including core principles, the context for implementation, and how approaches were tailored to various audiences.

Table 3.

Selected implementation science topics

Speed
 Definition: Time of research translation is measured in multiple ways: time in a translational stage; time to achieve an implementation milestone; and time to achieve a predefined outcome (service system, health outcome)
 Examples of impact:
 • Collaborative care for depression: The DIAMOND Initiative (Depression Improvement Across Minnesota–Offering a New Direction) was implemented in 75 primary care clinics in two years [51]
 • COVID-19 vaccines: The rapid development and distribution of COVID-19 vaccines significantly mitigated the impact of the pandemic

Sustainability
 Definition: Sustainability is the extent to which an evidence-based practice can deliver benefits over an extended time
 Examples of impact:
 • School nutrition in Australia: A multi-strategy intervention has been scaled to over 2,000 schools and sustained for nearly 20 years [52]
 • Tobacco control in California: A 30-year experience in scaling up tobacco control policies has shown a return on investment of 231 to 1 in direct medical expenditures [53]

De-implementation
 Definition: The process of stopping practices that are ineffective or harmful, not the most effective or efficient (including cost-effective), or no longer necessary; maintaining ineffective practices wastes precious resources
 Examples of impact:
 • Chest X-rays to screen for tuberculosis: Routine X-ray screening in low-risk populations has been discontinued due to low effectiveness and potential for harm [54]
 • Hormone replacement therapy (HRT) for postmenopausal women: HRT was widely prescribed for the prevention of heart disease and osteoporosis, but large studies subsequently showed more risks than benefits [55]

Equity
 Definition: Everyone has a fair and just opportunity to be as healthy as possible [56]. Equity in implementation involves social justice and requires greater attention to the needs, cultures, and histories of communities being served
 Examples of impact:
 • Housing programs for low-income families: Initiatives to provide safe and affordable housing have improved health outcomes (respiratory health, mental health) [57]
 • School meal programs in the US: School meal programs improve nutritional intake and academic performance among children from low-income families [58]

Note: This is not a comprehensive list of topics, but rather examples where progress has been made to varying degrees and that are priorities for implementation science [59–61]

Speed of research translation

The speed at which actionable evidence is implemented has sizable implications for impact [59]. Over two decades ago, it was estimated that EBPPs take an average of 17 years to affect practice across multiple diseases and risk factors [65, 66]. More recently, Khan and colleagues studied five EBPPs in cancer control: mammography, clinicians’ advice to quit smoking, colorectal cancer screening, HPV co-testing, and HPV vaccination [67]. They found that the time from publication to implementation ranged from 13 years (advice to quit smoking) to 21 years (mammography), averaging 15 years. The challenge is conceptually simple: getting what works to the people who need it with the greatest speed and efficiency [68]. The framework to assess the speed of translation is a comprehensive model for describing and addressing the determinants of the pace of implementation [59]. We acknowledge that in some cases, speed may not be warranted (e.g., the evidence base is still developing, the safety of a medical treatment has not been thoroughly tested) [59].

Example: HIV treatment

Context: Historically, it was believed that starting HIV antiretrovirals should be a cautious and sometimes slow process, often involving weeks or months of counseling to prepare patients. However, emerging evidence in the early 2010s demonstrated that rapid initiation of treatment had clinical benefits previously unrecognized by the scientific community (e.g., fewer AIDS-related events even among patients not exhibiting symptoms at the time of assessment, reduced risk of HIV transmission) [69]. Implementation science demonstrated that rapid initiation was possible and showed how to achieve it. Rapid antiretroviral therapy (ART) initiation required re-thinking assumptions and standard practices such as multiple adherence counseling sessions (the norm in Africa) and the perceived need for pre-treatment laboratory studies (the norm in the US). Several studies published in 2015 and 2016 showed not only favorable outcomes with rapid initiation of treatment, but also the changes in workflows, clinical assessments, training, and incentives in the health system that enabled rapid ART initiation [70–72].

Impact: These studies, in part, led the World Health Organization to make a “rapid policy recommendation” on July 1, 2017 [73]. These policies were developed through a guideline review committee and input from opinion leaders, program leaders, and scientists worldwide. In many regions where HIV prevalence is high, this policy was adopted quickly and comprehensively, and by 2018, the vast majority of patients in high-prevalence settings in Africa, as well as most settings in the United States, were starting HIV medication on the day of or within a few days of diagnosis [74].

Key lessons: The shift toward rapid HIV treatment initiation challenged prior, more cautious approaches. While science often moves at a deliberate pace, this case study shows that shifting norms about treatment, the urgency of a health issue, and the opening of a policy window [23] can quickly impact clinical treatment protocols. Policy change is often most effective when a diverse group of stakeholders is involved in the process.

Sustainability

Sustaining interventions ensures that they reach a larger portion of the population over time, which is essential for achieving widespread, equitable public health impacts. Sustainability refers to the lasting use of an EBPP whose benefits persist over an extended time, often after external support from a funder ends. Scheirer and Dearing suggest that measures for sustainability should also include a broad set of determinants, including the presence of community- or organizational-level partnerships, continued attention to the health issue being addressed, and replication in other sites [75]. Sustainability has been operationalized according to: 1) maintenance of a program’s initial health benefits, 2) institutionalization of the program in a setting or community, 3) capacity building in the recipient setting or community, and 4) sustainability capacity [76].

Example: Community health workers as a sustainable implementation strategy

Context: Community health workers (CHWs) are public health workers who work either for pay or as volunteers in a local health care system. CHWs often share language, culture, and lived experiences with their clients. Sustainment of CHW programs usually requires some combination of resources, integration within healthcare systems, and support from supervisors, peers, and providers [77]. In Brazil, the Family Health Strategy relies on CHWs to provide basic primary care to families in their homes. The program has been a core element in Brazil’s primary care system since 1994. The CHWs are part of a core, multidisciplinary team of physicians and nurses [78]. They also have access to other specialists (e.g., psychologists, physiotherapists). The program includes 265,000 CHWs and covers 67% of the population [78]. Over 20 years of research, initially conducted in Ceará, has demonstrated impacts that give health systems confidence in continuing expansion to scale [79, 80].

Impact: The Family Health Strategy and its use of CHWs have shifted care in Brazil from more expensive hospital-based care toward less costly, effective preventive care. The program has been associated with such impacts as increased rates of immunization, reduced healthcare inequities, increased breastfeeding rates, and fewer avoidable hospitalizations [78].

Key lessons: From the Brazilian experience, effective care can be delivered and sustained when CHWs are integrated into a broader multidisciplinary health team (including physicians and nurses). When CHWs share cultural and linguistic backgrounds with the communities they serve, trust, accessibility, and the effectiveness of care delivery can continue.

De-implementation

Another impact is de-implementation, or stopping practices that are ineffective or harmful, not the most effective or efficient (including cost-effective), or no longer necessary [81]. Nascent evidence indicates that, like implementation efforts, de-implementation requires active approaches and local champions for success. Intervention de-implementation is a process within a complex system of organizations and individual actors, with bi-directional influences between the intervention and the context from which it is being removed [63]. Extant theories, models, and frameworks (hereafter, “frameworks”) help to conceptualize de-implementation [82]. A related concept, mis-implementation, involves both the continuation of ineffective interventions (the need for de-implementation) and the premature ending of EBPPs [83]. Almost 40% of evidence-based programs within US state health departments are discontinued when they should be continued [83], i.e., mis-implemented.

Example: Lead in gasoline

Context: Millions of tons of lead were added to gasoline worldwide beginning in the 1920s to improve engine performance. The health effects of low-level exposure are increasingly clear and well-documented [84], including impaired neurodevelopment (e.g., lower IQ, decreased attention span), cardiovascular risks, kidney disease, and premature death [85]. In the 1970s, policies and regulations were introduced in many countries to eliminate lead from gasoline. The phase-out was completed in 2021. Following classic Diffusion Theory [86], there were early (e.g., Japan), middle (e.g., Russia), and late adopters (e.g., Algeria) in the lead elimination process.

Impact: Although it took decades, removing lead from gasoline has significantly reduced lead exposure. Eliminating lead in gasoline has increased children’s intelligence, saved lives, and created economic benefits (e.g., reduced health care costs, improved productivity) [85]. Lead elimination worldwide is now complete, but it was slower and more sporadic than necessary, as is often the case with policy de-implementation (also known as policy termination).

Key lessons: Even with clear scientific evidence of the health risks of lead exposure, de-implementation may lag due to political, economic, and industry resistance. Several properties from Diffusion of Innovations [86] were illustrated in this case study: 1) the lead phase-out occurred in stages, with early, middle, and late adopters; 2) adoption was influenced by the relative advantage of new policy standards over existing standards; and 3) change was driven in part by economic benefits through reduced health costs and improved productivity.

Equity

Equity in public health and clinical settings means that all segments of the population, especially the most marginalized, have a fair and just opportunity to be healthy [56]. Inequities can lead to disparities in health outcomes and undermine the overall effectiveness of EBPPs. Public Health 3.0 emphasizes engagement and actions directly affecting the social, environmental, and economic conditions driving health inequities [87]. This comprehensive approach considers the impact of where people live, learn, work, and play, involving intersecting systems such as housing, safety, physical environments, education, and economic stability [88]. Recently, the widespread recognition of racism as a public health crisis and a focus for implementation science has further mobilized action [89–91]. Equity should be a centerpiece for implementation science impacts [64], with participatory approaches yielding tangible, equity-related benefits (e.g., more effective study execution) [92].

Example: Participatory budgeting

Context: Participatory budgeting is a promising method for distributing public funds more equitably, in which community members directly contribute to the allocation of those funds. The process originated in Brazil in 1989 and is increasingly used across multiple governmental sectors in the United States and globally [93]. The Tacoma-Pierce County Health Department in Washington state (USA) uses a participatory budgeting process [94]. In Tacoma-Pierce County, this process was implemented to support the health department’s focus on equity, which seeks to improve health outcomes in neighborhoods with reduced life expectancy.

Impact: In Brazil, participatory budgeting is associated with multiple positive outcomes, including better access to public services, a reduction in extreme poverty, and decreases in child and infant mortality [95]. Participatory budgeting increases transparency in funding decisions, provides accountability for decisions made by public health agencies, and contributes to more responsive governance and greater community impact overall.

Key lessons: Empowering the community to change funding priorities can lead to increased confidence in the democratic process among community members and greater legitimacy and trust between the community and public health officials. Participatory budgeting can surface overlooked health needs, build capacity to address these needs, and, when implemented effectively, promote sustained commitment to stakeholder engagement. This is a version of highly impactful participatory implementation science [96].

Enhancing impact: a path forward

Research with impacts beyond the academic requires trust in research and the scholarly process [97]. Recent data show that researchers are trusted more than most other professions. In a 2022 survey in 28 countries, scientists were the second most trusted profession, with 57% of adults surveyed expressing trust in researchers [98]. Yet, trust in scientists varies significantly worldwide. A 2020 survey found that trust in scientists was highest in Australia and New Zealand (62% of respondents expressed high trust) and lowest in Africa (19% expressed high trust) [99]. In some regions, trust has dropped significantly over time. For example, while nearly 80% of Americans support government investments in scientific research [100], trust in scientists declined by 12% from January 2019 through October 2022 [101], and scholars warn of a “deadly rise of anti-science” [102]. Given this decline, the demand for greater research accountability, and pressures on research funding in many countries, researchers must maximize and document the impacts of implementation science beyond academia.

Frameworks provide a systematic roadmap for guiding implementation science study design, measures, data collection, data analysis, and outputs (e.g., Reach, Effectiveness, Adoption, Implementation, and Maintenance [RE-AIM]; Consolidated Framework for Implementation Research [CFIR]) [103]. While over 100 frameworks are used in implementation science [104], few explicitly focus on impacts beyond those typical in a research study (e.g., health outcomes). A notable exception is the framework proposed by Proctor and colleagues (2009), which hypothesizes that successful implementation has downstream impacts on service system outcomes and clinical/population health [105].

Several frameworks and models identify research impacts beyond typical academic metrics [28–32, 106]. For example, the Translational Science Benefits Model provides a framework and benchmarks to measure the impact of scientific discoveries beyond traditional metrics, including 1) clinical and medical benefits; 2) community and public health benefits; 3) economic benefits; and 4) policy and legislative benefits [106]. Most impact frameworks are based on a logic model, commonly used in evaluation to depict how programs or research activities link with shorter-term outputs and longer-term outcomes [107]. Frameworks seek to enhance some combination of accountability, transparency, advocacy, return on investment, and translation speed [108].

We suggest three interconnected pathways to greater impact of implementation science, involving funders, research organizations, and individual researchers. The first set of activities involves funders making impact a more explicit requirement in funding announcements on implementation science, knowledge mobilization, and knowledge translation [3, 6, 109]. While investment has grown slightly in recent years, implementation science remains underfunded. For example, the US NIH invested about 1.75% of its total budget in the D&I research category in fiscal year 2023 [110]. This US funding for D&I research is likely to decrease in the coming years due to significant reductions in the budget of the NIH, the world’s largest public funder of research [111]. In the UK and other European countries, knowledge mobilization recommendations have shifted funding priorities toward more D&I science. The second pathway is for research organizations, particularly universities, to build a “culture of impact” that makes the use of impact frameworks central, places greater weight on societal impact in hiring and promotion, and designs studies for impact in the early stages of the research process [112, 113]. The third pathway involves individual researchers, who can more effectively engage with partners, design for dissemination, and actively disseminate research findings. Wolfenden and colleagues have reported that health policy or practice impact increases when intervention trialists undertake knowledge translation strategies (e.g., involving end-users, adapting knowledge to the local context, tailoring interventions) [114].

Given these challenges and opportunities, what will elevate the “so what” of implementation science? Causal pathways between the strategies to promote uptake of an EBPP and a range of impacts (health and otherwise) are complex. This complexity makes it difficult, if not impossible, to attribute a given impact solely to implementation science. However, through our four implementation domains (speed of research translation, scale-up and sustainment, de-implementation, and equity) and the associated examples, it is clear that implementation science methods contribute to a set of impacts of high societal relevance.

We need to actively disseminate research and fundamentally change how knowledge is disseminated and utilized. Greenhalgh distinguished between “letting it happen” and “making it happen” in translating research into practice and policy [115]. Too often, research dissemination has been relegated to the “letting it happen” category, despite a substantial body of literature demonstrating the ineffectiveness of passive dissemination [116, 117]. To improve the dissemination of evidence in practice settings, the push–pull–capacity model posits that for science to affect practice, there must be a combination of the push (a basis in science and technology), the pull (a market demand from practitioners or policymakers) [118], and the capacity (the delivery ability of public health and health care systems) [119]. The push–pull–capacity model helps highlight capacity-related factors that enhance the translation of research to practice (e.g., multipronged approaches, dedicated staff for dissemination) [120–122].

While not exhaustive, we provide ideas to accelerate the impact of research for implementation scientists (Table 4). These recommendations draw upon the broad literature in implementation science (including diffusion theory [86]), lessons from the domains and examples in Table 3, and the combined experience of the authors in implementation research and practice (within clinical care, public health practice, and social work practice). We acknowledge that any actions to enhance research impact are highly contextual—what works will differ across settings or regions of the world. When considering our ideas, the roles of incremental and transformational change are worth noting [123, 124]. Impact based on incremental change involves minor, gradual adjustments to existing practices, carried out more efficiently and effectively. Transformational change is a significant overhaul of an organization, with shifts to systems, culture, and structures (e.g., a substantial change in mission).

Table 4.

Recommendations to enhance the impact of implementation science, with more transformational changes in bold

Implementation science principle | Recommendation | Segments/actorsᵃ
Co-production and stakeholder engagement

Recommendations:

• Develop and implement a partnership model that focuses on equity and places much of the power with end-users and implementation partners rather than with universities

• Partner with diverse individuals and groups to ensure the research is valuable and timely, beginning with the research question(s) being answered

• Engage with end users of implementation science throughout the research process

• Build collaborations across fields, particularly those outside of the health sector that may have impacts on equity (e.g., economic development, housing, transportation, the arts)

• Work closely with policymakers to ensure research findings inform policy decisions

Segments/actors: Researchers; state and local practitioners; policymakers; advocates; funders; health system leaders

Tailored dissemination and knowledge translation

Recommendations:

• Establish dedicated entities to rapidly respond to communication priorities, including countering disinformation

• Conduct active, multilevel dissemination of research findings

• Learn the media landscape (cultivate local, state, or national contacts; be available quickly; deliver pithy summaries of research)

• When disseminating evidence, account for the potential for implicit biases, negative attitudes toward groups that have been historically marginalized, and structural/historical barriers among marginalized groups

• For more extensive research projects, develop robust and comprehensive dissemination plans

• Follow principles of effective dissemination that take into account audience preferences, message framing, and appropriate channels (including relevant information in Table 2)

• Be aware of, and proactively develop strategies for, the impact of social media in rapidly spreading information or disinformation (e.g., on COVID-19 vaccination)

• Share data with partners in formats that support community-level use and decision-making

• Ensure products reflect the images, stories, and outcomes of interest to populations experiencing inequities

Segments/actors: Researchers; state and local practitioners; advocates

Organizational readiness and supportive infrastructure

Recommendations:

• Overhaul academic incentive structures (e.g., promotion and tenure criteria) to reward and place high value on active dissemination, partner engagement, and impactful research

• When working with research partners, avoid arriving with pre-determined research plans

• Enhance funding for implementation science, in part by demonstrating its value to policymakers

• In funding announcements, require rigorous dissemination plans targeting one or more non-research audiences

• In academic institutions, make dissemination to non-research audiences an expectation and build units dedicated to active dissemination (beyond developing press releases)

Segments/actors: Academic institutions; funders; researchers

Implementation capacity building

Recommendations:

• Radically shift the academic training model to focus strongly on on-the-job training for workers in health care, public health, and service delivery

• Hire and support diverse teams with skills in dissemination and implementation science

• Hire and support faculty and staff whose work promises impact, particularly in marginalized communities

• Develop and scale up ongoing training for researchers and partners to improve dissemination skills

• In ongoing training programs (e.g., graduate training in public health), conduct bi-directional learning with community partners to illustrate real-world impact

Segments/actors: Academic institutions; researchers

Use of implementation impact metrics

Recommendations:

• In academic institutions, adopt an impact framework to develop metrics that track research uptake and sustainment outside of academe; realign the organization based on these metrics

• Evaluate the process of de-implementation

• Make evaluating the impact of research a dedicated function (with resources) of scientific organizations

• Use systems such as Altmetrics to begin to bridge academic products (publications) with societal uptake (media, social media, policy)

Segments/actors: Researchers; evaluators

ᵃ Individuals, groups, and partners who are most likely to take action to address the recommendation

The increased impact of research begins with how we engage with partners, stakeholders, and end users of evidence. The inequitable power distribution between researchers, clinical partners, public health and service agencies, and the communities served is a significant issue [125]. “Expert” voices of professionals are often emphasized over the lived experiences of community members [125]. Lessons from community-engaged research show that a participatory process, with a commitment to mutual goal-setting, implementation of strategies, and continuous improvement, enhances the odds of impact [126]. Stakeholder engagement poses challenges, namely the time-intensiveness of the process, potential misalignment between partners’ and academics’ goals, and the privileging of researchers’ voices over those of partners [127, 128]. There is also a need to shift greater influence in implementation science to practitioners and researchers in low- and middle-income countries [129].

In a second category, dissemination practices matter. The dissemination of research evidence to non-scientists is enhanced when messages are framed in ways that evoke emotion and interest and demonstrate usefulness [130]. Third, organizational readiness is needed to create environments that address the motivations and needs of individuals and organizations seeking to enhance impact. These actions can come in the form of incentive structures (e.g., promotion and tenure guidelines) and organizational capacity (e.g., organizations with expectations for dissemination to non-research audiences are more effective at dissemination [131]). Fourth, in research institutions, how we build implementation capacity among faculty and staff influences our ability to better translate science to practice and policy. And finally, “what gets measured, gets done”: we need to expand metrics and evaluation approaches to map and improve the uptake of implementation science outside research settings. Metrics need to include both easier-to-measure indicators (e.g., changes in clinical practice) and those more difficult to quantify (e.g., changes in social determinants of health).

As we apply these and other ideas more systematically, we must sharpen the focus on equitable impacts for all populations [45], including narratives and metrics around benefits and unintended consequences [132]. As noted in the participatory budgeting example, democratizing health-related research decision-making can provide a greater community voice. To address health equity in implementation science more fully, research organizations and partner agencies should make health equity a core value, develop better tracking systems, build skills among staff, and develop new partnerships [133].

We noted that policy and practice environments are complex, and outcomes are the product of various actors, each motivated by different objectives, as highlighted in Table 2. The chain of events leading from a particular research study to a change in policy or practice may be unexpected, with pathways that are difficult to discern. It is often impossible to draw a direct relationship between any specific research finding and a change in policy or practice, mainly because policymaking is an open system where multi-causality is the norm. One of the challenges for our increasing orientation toward impact is mapping these pathways. This mapping can be enhanced by systems science methods that aid in studying complex systems with heterogeneous, interacting forces [134].

While our essay focuses mainly on the promise and impacts of implementation science, it is essential to remember that academia is about much more than research, and the research function intersects with other parts of a university’s mission. Universities can be engines of innovation and cultural enrichment [1], with significant foci on education and teaching, outreach, economic development, and dissemination of knowledge. Outreach and dissemination are part of the fabric of US land-grant universities through the agricultural extension service. Diffusion theory, the earliest theory in implementation science, originated in research on the diffusion of farming practices, including the use of agricultural extension agents (“change agents” in diffusion theory) [135].

Any discussion of research translation warrants caution, namely that not all research can or should have immediate impacts [136]. Several reasons underlie this assertion: 1) research is iterative and often takes time to build an evidence base ready for action; 2) research is sometimes methods-oriented, providing an architecture for the research process but not information that is actionable outside of academe; 3) single studies are seldom definitive, and significant impacts often come from aggregated research; and 4) documenting the impact of research translation is challenging, making it difficult, if not impossible, to show causal attribution (rather than contribution to a change). While there is guidance on when evidence is sufficient for D&I (e.g., the quality and quantity of evidence, priority among stakeholders, and availability of resources) [43], these decisions are matters of judgment, not formula.

Conclusion

We are at an opportune time to enhance the impact of implementation science on public health, clinical practice, and society. Designing implementation science studies for impact can amplify the goal of achieving greater benefits. This requires new skill sets and collaborative approaches that include individual researchers, their organizations, funders, and the communities they aim to benefit. To ensure implementation science is more impactful in filling societal needs, systemic changes are required in academic institutions, which often evolve slowly. Navigating the hurdles and the hope of translating research into practice and policy can amplify societal impact, making implementation science more applicable, accessible, and equitable for all.

Acknowledgements

The authors are grateful for the supportive organizational climate, spirit of team science, and intellectual input from the scientific community at Washington University in St. Louis, particularly Dr. William Powderly, Ashley Sturm, and the Washington University Network for Dissemination and Implementation Research (WUNDIR).

Disclaimer

The findings and conclusions in this article are those of the authors and do not necessarily represent the official positions of the National Institutes of Health or the Centers for Disease Control and Prevention.

Abbreviations

ART

Antiretroviral therapy

CHW

Community health worker

D&I

Dissemination and implementation

EBPP

Evidence-based programs and policies

HPV

Human papillomavirus

NIH

National Institutes of Health

PEPFAR

President’s Emergency Plan for AIDS Relief

WUNDIR

The Washington University Network for Dissemination and Implementation Research

Authors’ contributions

RCB conceptualized the original article and wrote the draft of the paper. JI, TAO, EKP, and EHG provided input on the original outline, contributed text to the draft manuscript, and provided intellectual content. All authors provided edits on article drafts and approved the final version of the manuscript.

Funding

This work was supported in part by the National Cancer Institute (P50CA244431), the National Center for Advancing Translational Sciences (UL1TR002345), the National Institute of Child Health and Human Development (UG1HD113156), the National Institute of Diabetes and Digestive and Kidney Diseases (P30DK092950, R25DK123008), the National Institute of Mental Health (R25MH080916), and the Centers for Disease Control and Prevention (U48DP006395), and the Foundation for Barnes-Jewish Hospital.

Data availability

Not applicable.

Declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

EHG is a Co-Editor-in-Chief, and RCB is on the Editorial Board at Implementation Science Communications. All other authors declare they have no conflicting interests.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

1. Thorp H, Goldstein B. Our Higher Calling: Rebuilding the Partnership between America and Its Colleges and Universities. Chapel Hill, NC: The University of North Carolina Press; 2018.
2. Cardoso-Weinberg A, Alley C, Kupfer LE, Aslanyan G, Makanga M, Zicker F, et al. Funders’ perspectives on supporting implementation research in low- and middle-income countries. Glob Health Sci Pract. 2022;10(2).
3. McLean RKD, Graham ID, Tetroe JM, Volmink JA. Translating research into action: an international study of the role of research funders. Health Res Policy Syst. 2018;16(1):44.
4. van der Linden B, Dunham KM, Siegel J, Lazowick E, Bowdery M, Lamont T, et al. Health funders’ dissemination and implementation practices: results from a survey of the Ensuring Value in Research (EViR) funders’ forum. Implement Sci Commun. 2022;3(1):36.
5. Beidas RS, Dorsey S, Lewis CC, Lyon AR, Powell BJ, Purtle J, et al. Promises and pitfalls in implementation science from the perspective of US-based researchers: learning from a pre-mortem. Implement Sci. 2022;17(1):55.
6. National Institutes of Health. Dissemination and Implementation Research in Health (R01). Bethesda, MD: National Institutes of Health; 2024.
7. Centers for Disease Control and Prevention. Ten great public health achievements–worldwide, 2001–2010. MMWR Morb Mortal Wkly Rep. 2011;60(24):814–8.
8. Centers for Disease Control and Prevention. Ten great public health achievements–United States, 2001–2010. MMWR Morb Mortal Wkly Rep. 2011;60(19):619–23.
9. Green LW, Ottoson JM, Garcia C, Hiatt RA. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu Rev Public Health. 2009;30:151–74.
10. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3(1):32.
11. Wang T, Tan JB, Liu XL, Zhao I. Barriers and enablers to implementing clinical practice guidelines in primary care: an overview of systematic reviews. BMJ Open. 2023;13(1):e062158.
12. Hategeka C, Tuyisenge G, Bayingana C, Tuyisenge L. Effects of scaling up various community-level interventions on child mortality in Burundi, Kenya, Rwanda, Uganda and Tanzania: a modeling study. Glob Health Res Policy. 2019;4:1.
13. Proctor EK, Geng E. A new lane for science. Science. 2021;374(6568):659.
14. Donohue JF, Elborn JS, Lansberg P, Javed A, Tesfaye S, Rugo H, et al. Bridging the “know-do” gaps in five non-communicable diseases using a common framework driven by implementation science. J Healthc Leadersh. 2023;15:103–19.
15. Woolf SH. The meaning of translational research and why it matters. JAMA. 2008;299(2):211–3.
16. Kilbourne AM, Glasgow RE, Chambers DA. What can implementation science do for you? Key success stories from the field. J Gen Intern Med. 2020;35(Suppl 2):783–7.
17. Galea S. An argument for a consequentialist epidemiology. Am J Epidemiol. 2013;178(8):1185–91.
18. Kim D. Bridging the epidemiology-policy divide: a consequential and evidence-based framework to optimize population health. Prev Med. 2019;129:105781.
19. Brownson RC, Eyler AA, Harris JK, Moore JB, Tabak RG. Getting the word out: new approaches for disseminating public health science. J Public Health Manag Pract. 2018;24(2):102–11.
20. Fielding JE, Teutsch SM. So what? A framework for assessing the potential impact of intervention research. Prev Chronic Dis. 2013;10:120160.
21. Pollack Porter KM, Rutkow L, McGinty EE. The importance of policy change for addressing public health problems. Public Health Rep. 2018;133(1_suppl):9S–14S.
22. Purtle J, Peters R, Brownson RC. A review of policy dissemination and implementation research funded by the National Institutes of Health, 2007–2014. Implement Sci. 2016;11:1.
23. Brownson RC, Royer C, Ewing R, McBride TD. Researchers and policymakers: travelers in parallel universes. Am J Prev Med. 2006;30(2):164–72.
24. Dybul M. Lessons learned from PEPFAR. J Acquir Immune Defic Syndr. 2009;52(Suppl 1):S12–3.
25. Greenhalgh T, Raftery J, Hanney S, Glover M. Research impact: a narrative review. BMC Med. 2016;14:78.
26. Maddox BB, Phan ML, Byeon YV, Wolk CB, Stewart RE, Powell BJ, et al. Metrics to evaluate implementation scientists in the USA: what matters most? Implement Sci Commun. 2022;3(1):75.
27. Merriam-Webster. Dictionary. Springfield, MA: Merriam-Webster, Incorporated; 2024. Available from: https://www.merriam-webster.com/dictionary/impact.
28. Buxton M, Hanney S. How can payback from health services research be assessed? J Health Serv Res Policy. 1996;1(1):35–43.
29. Panel on the Return on Investments in Health Research. Making an Impact: A Preferred Framework and Indicators to Measure Returns on Investment in Health Research. Ottawa, Ontario, Canada: Canadian Academy of Health Sciences; 2009.
30. Australian Research Council. Excellence in Research for Australia. Canberra, Australia: ARC; 2022. Available from: https://www.arc.gov.au/evaluating-research/excellence-research-australia.
31. Higher Education Funding Council for England. Research Excellence Framework 2014: Overview report by Main Panel A and Sub-panels 1 to 6. London, UK: HEFCE; 2014. Available from: https://2014.ref.ac.uk/.
32. Searles A, Doran C, Attia J, Knight D, Wiggers J, Deeming S, et al. An approach to measuring and encouraging research translation and research impact. Health Res Policy Syst. 2016;14(1):60.
33. Alla K, Hall WD, Whiteford HA, Head BW, Meurk CS. How do we define the policy impact of public health research? A systematic review. Health Res Policy Syst. 2017;15(1):84.
34. Serrano N, Schmidt L, Eyler AA, Brownson RC. Perspectives from public health practitioners and advocates on community development for active living: what are the lasting impacts? Am J Health Promot. 2024;38(1):80–9.
35. Hennessee I, Kirby MA, Misago X, Mupfasoni J, Clasen T, Kitron U, et al. Assessing the effects of cooking fuels on Anopheles mosquito behavior: an experimental study in rural Rwanda. Am J Trop Med Hyg. 2022;106(4):1196–208.
36. Resnik DB, Elliott KC. The ethical challenges of socially responsible science. Account Res. 2016;23(1):31–46.
37. Chapin H, Deneau D. Citizen involvement in public policy-making: access and the policy-making process. Ottawa: Canadian Council on Social Development; 1978.
38. Chong JL, Lim KK, Matchar DB. Population segmentation based on healthcare needs: a systematic review. Syst Rev. 2019;8(1):202.
39. Kreuter MW, McClure SM. The role of culture in health communication. Annu Rev Public Health. 2004;25:439–55.
40. Slater MD. Theory and method in health audience segmentation. J Health Commun. 1996;1(3):267–83.
41. Slater MD, Kelly KJ, Thackeray R. Segmentation on a shoestring: health audience segmentation in limited-budget and local social marketing interventions. Health Promot Pract. 2006;7(2):170–3.
42. Niederdeppe J, Boyd A, King A, Rimal R. Core concepts of effective public health communication. Annu Rev Public Health. 2025; in press.
43. Brownson RC, Shelton RC, Geng EH, Glasgow RE. Revisiting concepts of evidence in implementation science. Implement Sci. 2022;17(1):26.
44. Shato T, Kepper MM, McLoughlin GM, Tabak RG, Glasgow RE, Brownson RC. Designing for dissemination among public health and clinical practitioners in the USA. J Clin Transl Sci. 2024;8(1):e8.
45. Shelton RC, Brownson RC. Enhancing impact: a call to action for equitable implementation science. Prev Sci. 2024;25(Suppl 1):174–89.
46. Weiss C. Research for policy’s sake: the enlightenment function of social research. Policy Anal. 1977;3(4):531–45.
47. Farmer P, Leandre F, Mukherjee J, Gupta R, Tarter L, Kim JY. Community-based treatment of advanced HIV disease: introducing DOT-HAART (directly observed therapy with highly active antiretroviral therapy). Bull World Health Organ. 2001;79(12):1145–51.
48. Greenhalgh T, Papoutsi C. Studying complexity in health services research: desperately seeking an overdue paradigm shift. BMC Med. 2018;16(1):95.
49. Rycroft-Malone J, Bucknall T. Models and Frameworks for Implementing Evidence-Based Practice: Linking Evidence to Action. 2010.
50. Brownson R, Colditz G, Proctor E, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. 3rd ed. New York: Oxford University Press; 2023.
  • 51.Solberg LI, Crain AL, Jaeckels N, Ohnsorg KA, Margolis KL, Beck A, et al. The DIAMOND initiative: implementing collaborative care for depression in 75 primary care clinics. Implement Sci. 2013;8:135. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 52.Nathan N, Wolfenden L, Bell AC, Wyse R, Morgan PJ, Butler M, et al. Effectiveness of a multi-strategy intervention in increasing the implementation of vegetable and fruit breaks by Australian primary schools: a non-randomized controlled trial. BMC Public Health. 2012;12:651. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 53.Lightwood JM, Anderson S, Glantz SA. Smoking and healthcare expenditure reductions associated with the California Tobacco Control Program, 1989 to 2019: a predictive validation. PLoS ONE. 2023;18(3):e0263579. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 54.World Health Organization. WHO operational handbook on tuberculosis. Module 2: screening - systematic screening for tuberculosis disease. Geneva: World Health Organization; 2021. [PubMed] [Google Scholar]
  • 55.Cagnacci A, Venier M. The Controversial History of Hormone Replacement Therapy. Medicina (Kaunas). 2019;55(9). [DOI] [PMC free article] [PubMed]
  • 56.Braveman P. Defining health equity. J Natl Med Assoc. 2022;114(6):593–600. [DOI] [PubMed] [Google Scholar]
  • 57.Thomson H, Thomas S, Sellstrom E, Petticrew M. Housing improvements for health and associated socio-economic outcomes. Cochrane Database Syst Rev. 2013(2):CD008657. [DOI] [PMC free article] [PubMed]
  • 58.Cohen JFW, Hecht AA, McLoughlin GM, Turner L, Schwartz MB. Universal School Meals and Associations with Student Participation, Attendance, Academic Performance, Diet Quality, Food Security, and Body Mass Index: A Systematic Review. Nutrients. 2021;13(3). [DOI] [PMC free article] [PubMed]
  • 59.Proctor E, Ramsey AT, Saldana L, Maddox TM, Chambers DA, Brownson RC. FAST: a framework to assess speed of translation of health innovations to practice and policy. Glob Implement Res Appl. 2022;2(2):107–19. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 60.Brownson R, Colditz G, Proctor E. Future issues in dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. 2nd ed. New York: Oxford University Press; 2018. p. 481–90. [Google Scholar]
  • 61.Brownson RC, Cabassa LJ, Drake BF, Shelton RC. Closing the gap: advancing implementation science through training and capacity building. Implement Sci. 2024;19(1):46. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 62.Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39:55–76. [DOI] [PubMed] [Google Scholar]
  • 63.McKay VR, Morshed AB, Brownson RC, Proctor EK, Prusaczyk B. Letting go: conceptualizing intervention de-implementation in public health and social service settings. Am J Community Psychol. 2018;62(1–2):189–202. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 64.Brownson RC, Kumanyika SK, Kreuter MW, Haire-Joshu D. Implementation science should give higher priority to health equity. Implement Sci. 2021;16(1):28. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 65.Balas E, Boren S. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray A, editors. Yearbook of Medical Informatics 2000: Patient-Centered Systems. Stuttgart, Germany: Schattauer; 2000. p. 65–70. [PubMed] [Google Scholar]
  • 66.Grant J, Green L, Mason B. Basic research and health: a reassessment of the scientific basis for the support of biomedical science. Res Eval. 2003;12:217–24. [Google Scholar]
  • 67.Khan S, Chambers D, Neta G. Revisiting time to translation: implementation of evidence-based practices (EBPs) in cancer control. Cancer Causes Control. 2021;32(3):221–30. [DOI] [PubMed] [Google Scholar]
  • 68.Smith J, Rapport F, O’Brien TA, Smith S, Tyrrell VJ, Mould EVA, et al. The rise of rapid implementation: a worked example of solving an existing problem with a new method by combining concept analysis with a systematic integrative review. BMC Health Serv Res. 2020;20(1):449. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 69.Cohen MS, Chen YQ, McCauley M, Gamble T, Hosseinipour MC, Kumarasamy N, et al. Prevention of HIV-1 infection with early antiretroviral therapy. N Engl J Med. 2011;365(6):493–505. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 70.Amanyire G, Semitala FC, Namusobya J, Katuramu R, Kampiire L, Wallenta J, et al. Effects of a multicomponent intervention to streamline initiation of antiretroviral therapy in Africa: a stepped-wedge cluster-randomised trial. Lancet HIV. 2016;3(11):e539–48. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 71.Koenig SP, Dorvil N, Devieux JG, Hedt-Gauthier BL, Riviere C, Faustin M, et al. Same-day HIV testing with initiation of antiretroviral therapy versus standard care for persons living with HIV: a randomized unblinded trial. PLoS Med. 2017;14(7):e1002357. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 72.Pilcher CD, Ospina-Norvell C, Dasgupta A, Jones D, Hartogensis W, Torres S, et al. The effect of same-day observed initiation of antiretroviral therapy on HIV viral load and treatment outcomes in a US public health setting. J Acquir Immune Defic Syndr. 2017;74(1):44–51. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 73.World Health Organization. Guidelines for managing advanced HIV disease and rapid initiation of antiretroviral therapy. Geneva: World Health Organization; 2017. [PubMed] [Google Scholar]
  • 74.Kerschberger B, Boulle A, Kuwengwa R, Ciglenecki I, Schomaker M. The impact of same-day antiretroviral therapy initiation under the World Health Organization treat-all policy. Am J Epidemiol. 2021;190(8):1519–32. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 75.Scheirer MA, Dearing JW. An agenda for research on the sustainability of public health programs. Am J Public Health. 2011;101(11):2059–67. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 76.Shediac-Rizkallah MC, Bone LR. Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res. 1998;13(1):87–108. [DOI] [PubMed] [Google Scholar]
  • 77.Mehra R, Boyd LM, Lewis JB, Cunningham SD. Considerations for building sustainable community health worker programs to improve maternal health. J Prim Care Community Health. 2020;11:2150132720953673. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 78.Wadge H, Bhatti Y, Carter A, Harris M, Parston G, Darzi A. Brazil’s Family Health Strategy: Using Community Health Care Workers to Provide Primary Care. New York, NY: The Commonwealth Fund; 2016.
  • 79.Macinko J, de Marinho Souza MF, Guanais FC, da Silva Simoes CC. Going to scale with community-based primary care: an analysis of the family health program and infant mortality in Brazil, 1999–2004. Soc Sci Med. 2007;65(10):2070–80. [DOI] [PubMed] [Google Scholar]
  • 80.Rice-Marquez N, Baker T, Fischer C. The community health worker: forty years of experience in an integrated primary rural health care system in Brazil. J Rural Health. 1998;4:87–100. [Google Scholar]
  • 81.Rabin B, Viglione C, Brownson R. Terminology for dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. 3rd ed. New York: Oxford University Press; 2023. p. 27–68. [Google Scholar]
  • 82.Walsh-Bailey C, Tsai E, Tabak RG, Morshed AB, Norton WE, McKay VR, et al. A scoping review of de-implementation frameworks and models. Implement Sci. 2021;16(1):100. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 83.Brownson RC, Allen P, Jacob RR, Harris JK, Duggan K, Hipp PR, et al. Understanding mis-implementation in public health practice. Am J Prev Med. 2015;48(5):543–51. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 84.Walsh M. The global experience with lead in gasoline and the lessons we should apply to the use of MMT. Am J Ind Med. 2007;50:853–60. [DOI] [PubMed] [Google Scholar]
  • 85.Angrand RC, Collins G, Landrigan PJ, Thomas VM. Relation of blood lead levels and lead in gasoline: an updated systematic review. Environ Health. 2022;21(1):138. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 86.Rogers EM. Diffusion of Innovations. 5th ed. New York: Free Press; 2003. [Google Scholar]
  • 87.DeSalvo KB, Wang YC, Harris A, Auerbach J, Koo D, O’Carroll P. Public health 3.0: a call to action for public health to meet the challenges of the 21st century. Prev Chronic Dis. 2017;14:E78. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 88.Williams DR, Costa MV, Odunlami AO, Mohammed SA. Moving upstream: how interventions that address the social determinants of health can improve health and reduce disparities. J Public Health Manag Pract. 2008;14(Suppl):S8-17. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 89.Benjamin GC, Jones CP, Davis Moss R. Editorial: Racism as a public health crisis: from declaration to action. Front Public Health. 2022;10:893804. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 90.World Health Organization. Tackling structural racism and ethnicity-based discrimination in health Geneva: World Health Organization; 2024 [Available from: https://www.who.int/activities/tackling-structural-racism-and-ethnicity-based-discrimination-in-health.
  • 91.Adsul P, Shelton RC, Oh A, Moise N, Iwelunmor J, Griffith DM. Challenges and opportunities for paving the road to global health equity through implementation science. Annu Rev Public Health. 2024;45(1):27–45. [DOI] [PubMed] [Google Scholar]
  • 92.Ramanadhan S, Davis MM, Armstrong R, Baquero B, Ko LK, Leng JC, et al. Participatory implementation science to increase the impact of evidence-based cancer prevention and control. Cancer Causes Control. 2018;29(3):363–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 93.Bartocci L, Grossi G, Mauro S, Ebdon C. The journey of participatory budgeting: a systematic literature review and future research directions. Int Rev Adm Sci. 2023;89(3):757–74. [Google Scholar]
  • 94.Bittle BB. Sharing power to improve population health: participatory budgeting and policy making. J Public Health Manag Pract. 2022;28(4 Suppl 4):S143–50. [DOI] [PubMed] [Google Scholar]
  • 95.Hagelskamp C, Schleifer D, Rinehart C, Silliman R. Participatory budgeting: could it diminish health disparities in the United States? J Urban Health. 2018;95(5):766–71. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 96.Ramanadhan S, Davis M, Donaldson T, Miller E, Minkler M. Participatory Approaches in Dissemination and Implementation Science. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. 3rd ed. New York: Oxford University Press; 2023.
  • 97.Cvitanovic C, Shellock R, Mackay M, van Putten E, Karcher D, Dickey-Collas M, et al. Strategies for building and managing ‘trust’ to enable knowledge exchange at the interface of environmental science and policy. Environ Sci Policy. 2021;123:179–89. [Google Scholar]
  • 98.Ipsos. Doctors and scientists are seen as the world’s most trustworthy professions New York, NY: Ipsos; 2022 [Available from: https://www.ipsos.com/en-us/news-polls/global-trustworthiness-index-2022.
  • 99.Wellcome Trust. Trust in science London: Wellcome Global Monitor; 2021 [Available from: https://wellcome.org/news/public-trust-scientists-rose-during-covid-19-pandemic-0.
  • 100.Pew Research Center. Government investments in scientific research and the importance of the U.S. being a world leader in science Washington, DC: Pew; 2023 [Available from: https://www.pewresearch.org/science/2023/11/14/government-investments-in-scientific-research-and-the-importance-of-the-u-s-being-a-world-leader-in-science/.
  • 101.Pew Research Center. Americans’ Trust in Scientists, Positive Views of Science Continue to Decline Washington, DC: Pew; 2023 [Available from: https://www.pewresearch.org/science/2023/11/14/americans-trust-in-scientists-positive-views-of-science-continue-to-decline/.
  • 102.Hotez P. The deadly rise of anti-science: a scientist’s warning. Baltimore, MD: Johns Hopkins University Press; 2023. [Google Scholar]
  • 103.Tabak R, Nilsen P, Woodward E, Chambers D. The conceptual basis for dissemination and implementation research: Lessons from existing theories, models and frameworks. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and Implementation Research in Health: Translating Science to Practice. 3rd ed. New York: Oxford University Press; 2023. p. 86–105. [Google Scholar]
  • 104.University of Colorado Denver. Dissemination & Implementation Models in Health Research & Practice Denver, CO: ACCORDS Dissemination & Implementation Science Program; 2024 [Available from: http://dissemination-implementation.org/index.aspx.
  • 105.Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009;36(1):24–34. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 106.Luke DA, Sarli CC, Suiter AM, Carothers BJ, Combs TB, Allen JL, et al. The translational science benefits model: a new framework for assessing the health and societal benefits of clinical and translational sciences. Clin Transl Sci. 2018;11(1):77–84. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 107.Easterling DV, Jacob RR, Brownson RC, Haire-Joshu D, Gundersen DA, Angier H, et al. Participatory logic modeling in a multi-site initiative to advance implementation science. Implement Sci Commun. 2023;4(1):106. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 108.Deeming S, Searles A, Reeves P, Nilsson M. Measuring research impact in Australia’s medical research institutes: a scoping literature review of the objectives for and an assessment of the capabilities of research impact assessment frameworks. Health Res Policy Syst. 2017;15(1):22. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 109.Davies H, Powell A, Nutley S. Mobilising knowledge to improve UK health care: learning from other countries and other sectors – a multimethod mapping study. Health Serv Deliv Res. 2015. Southampton, UK: University of Southampton Science Park. [PubMed]
  • 110.National Institutes of Health. Estimates of Funding for Various Research, Condition, and Disease Categories (RCDC) Bethesda, MD: NIH RePORT; 2024 [Available from: https://report.nih.gov/funding/categorical-spending#/.
  • 111.Kozlov M. NIH chief stands by funding cuts to ‘politicized science’ at tense hearing. Nature. 2025 June 11. [DOI] [PubMed]
  • 112.Kepper M, L’Hotta A, Thembekile Shato T, Kwan B, Glasgow R, Luke D, et al. Supporting Teams with Designing for Dissemination and Sustainability: the Design, Development, and Usability of a Digital Interactive Platform. Implement Sci. 2024 in review. [DOI] [PMC free article] [PubMed]
  • 113.Luke D, Malone S, Galea S. The Road from Science to Health: The Importance of Designing For, Measuring, and Communicating Impact in Public Health. Annu Rev Public Health. 2026 in press. [DOI] [PubMed]
  • 114.Wolfenden L, Mooney K, Gonzalez S, Hall A, Hodder R, Nathan N, et al. Increased use of knowledge translation strategies is associated with greater research impact on public health policy and practice: an analysis of trials of nutrition, physical activity, sexual health, tobacco, alcohol and substance use interventions. Health Res Policy Syst. 2022;20(1):15. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 115.Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82(4):581–629. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 116.Glasgow RE, Marcus AC, Bull SS, Wilson KM. Disseminating effective cancer screening interventions. Cancer. 2004;101(5 Suppl):1239–50. [DOI] [PubMed] [Google Scholar]
  • 117.Lehoux P, Denis JL, Tailliez S, Hivon M. Dissemination of health technology assessments: identifying the visions guiding an evolving policy innovation in Canada. J Health Polit Policy Law. 2005;30(4):603–41. [DOI] [PubMed] [Google Scholar]
  • 118.Proctor EK, Toker E, Tabak R, McKay VR, Hooley C, Evanoff B. Market viability: a neglected concept in implementation science. Implement Sci. 2021;16(1):98. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 119.Curry SJ. Organizational interventions to encourage guideline implementation. Chest. 2000;118(2 Suppl):40S-S46. [DOI] [PubMed] [Google Scholar]
  • 120.Brownson RC, Jacobs JA, Tabak RG, Hoehner CM, Stamatakis KA. Designing for dissemination among public health researchers: findings from a national survey in the United States. Am J Public Health. 2013;103(9):1693–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 121.Knoepke CE, Ingle MP, Matlock DD, Brownson RC, Glasgow RE. Dissemination and stakeholder engagement practices among dissemination & implementation scientists: results from an online survey. PLoS ONE. 2019;14(11):e0216971. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 122.Wilson PM, Petticrew M, Calnan MW, Nazareth I. Does dissemination extend beyond publication: a survey of a cross section of public funded research in the UK. Implement Sci. 2010;5:61. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 123.Burnes B. Kurt Lewin and the planned approach to change: a reappraisal. J Manage Stud. 2004;41(6):977–1002. [Google Scholar]
  • 124.Weick KE, Quinn RE. Organizational change and development. Annu Rev Psychol. 1999;50:361–86. [DOI] [PubMed] [Google Scholar]
  • 125.Simon-Ortiz S, Bilick S, Frey M, Gould S, Long C, Waugh E, et al. Community power-building groups and public health NGOs: reimagining public health advocacy. Health Aff Millwood. 2024;43(6):798–804. [DOI] [PubMed] [Google Scholar]
  • 126.Beckman M, Penney N, Cockburn B. Maximizing the impact of community-based research. J High Educ Outreach Engagem. 2011;15(2):83–103. [Google Scholar]
  • 127.Cargo M, Mercer SL. The value and challenges of participatory research: strengthening its practice. Annu Rev Public Health. 2008;29:325–50. [DOI] [PubMed] [Google Scholar]
  • 128.Wallerstein N, Duran B, Oetzel J, Minker M. Community-Based Participatory Research for Health: Advancing Social and Health Equity. 3rd ed. San Francisco, CA: Jossey-Bass; 2018. [Google Scholar]
  • 129.Bartels SM, Haider S, Williams CR, Mazumder Y, Ibisomi L, Alonge O, et al. Diversifying Implementation Science: A Global Perspective. Glob Health Sci Pract. 2022;10(4). [DOI] [PMC free article] [PubMed]
  • 130.Milkman KL, Berger J. The science of sharing and the sharing of science. Proc Natl Acad Sci U S A. 2014;111(Suppl 4):13642–9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 131.Tabak RG, Stamatakis KA, Jacobs JA, Brownson RC. What predicts dissemination efforts among public health researchers in the United States? Public Health Rep. 2014;129(4):361–8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 132.Thomson K, Hillier-Brown F, Todd A, McNamara C, Huijts T, Bambra C. The effects of public health policies on health inequalities in high-income countries: an umbrella review. BMC Public Health. 2018;18(1):869. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 133.Samet J, Brownson RC. Reimagining public health: mapping a path forward. Health Aff Millwood. 2024;43(6):750–8. [DOI] [PubMed] [Google Scholar]
  • 134.Luke DA, Stamatakis KA. Systems science methods in public health: dynamics, networks, and agents. Annu Rev Public Health. 2012;33:357–76. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 135.Dearing JW. Evolution of diffusion and dissemination theory. J Public Health Manag Pract. 2008;14(2):99–108. [DOI] [PubMed] [Google Scholar]
  • 136.Lavis J, Ross S, McLeod C, Gildiner A. Measuring the impact of health research. J Health Serv Res Policy. 2003;8(3):165–70. [DOI] [PubMed] [Google Scholar]

Data Availability Statement

Not applicable.
