Editorial. 2022 Jun 2;2(2):107–119. doi: 10.1007/s43477-022-00045-4

FAST: A Framework to Assess Speed of Translation of Health Innovations to Practice and Policy

Enola Proctor 1,#, Alex T Ramsey 2,✉,#, Lisa Saldana 3, Thomas M Maddox 4,5, David A Chambers 6, Ross C Brownson 7,8,9
PMCID: PMC9161655  PMID: 35669171

Abstract

The 17-year time span between discovery and application of evidence in practice has become a unifying challenge for implementation science and translational science more broadly. Further, global pandemics and social crises demand timely implementation of rapidly accruing evidence to reduce morbidity and mortality. Yet speed remains an understudied metric in implementation science. Prevailing evaluations of implementation lack a temporal aspect, and current approaches have not yielded rapid implementation. In this paper, we address speed as an important conceptual and methodological gap in implementation science. We aim to untangle the complexities of studying implementation speed, offer a Framework to Assess Speed of Translation (FAST), and provide guidance to measure speed in evaluating implementation. To facilitate specification and reporting on metrics of speed, we encourage consideration of stakeholder perspectives (e.g., comparison of varying priorities), referents (e.g., speed in attaining outcomes, transitioning between implementation phases), and observation windows (e.g., time from intervention development to first patient treated) in its measurement. The FAST framework identifies factors that may influence speed of implementation and potential effects of implementation speed. We propose a research agenda to advance understanding of the pace of implementation, including identifying accelerators and inhibitors to speed.

Keywords: Implementation science, Translational science, Speed, Rapid cycle research, Metrics

Introduction

The protracted time from discovery to practical application (Lenfant, 2003), known as the 17-year gap, is a unifying challenge for translational research and the “raison d’être” of implementation science. This gap persists: a 2021 study of cancer control research found time to translation averaging 15 years (Khan et al., 2021). The span between generating and applying evidence prolongs human suffering, thwarts the public health benefit of scientific discoveries (Sung et al., 2003), and prompts stakeholders to question implementation success (Smith et al., 2020).

The discovery-to-practice gap is starkly evident, and untenable, during times of public health crisis, as in the COVID-19 pandemic, when speed is all-important. Coordinated effort and funding led to rapid, population-scale implementation of vaccine development, ventilation protocols, personal protective equipment production, and home monitoring programs (Ball, 2021). Yet science-based public health measures such as mandates for physical distancing, masking, and vaccine uptake continue to be debated vigorously, thwarting the timeliness of policy implementation (Chernozhukov et al., 2021). As global pandemics illustrate, public health crises demand timely implementation of rapidly accruing evidence to reduce morbidity and mortality. Other social interventions, such as those for the opioid crisis, child maltreatment, and refugee relief, require fast implementation as well; the COVID-19 pandemic illustrates that speed can be achieved when multiple stakeholders agree on its necessity. Proctor and Geng call for a lane of science that studies rapid uptake of proven interventions (Proctor & Geng, 2021). There is now national recognition of the need to move faster: the National Academies have recently formed a Standing Committee for the CDC Center for Preparedness and Response, a forum to accelerate research translation in public health emergencies (The National Academies of Sciences, Engineering, and Medicine, n.d.).

Despite increased prioritization, speed remains an understudied metric in implementation science. Prevailing definitions of implementation lack a temporal aspect (Smith et al., 2020), and, increased use of hybrid effectiveness-implementation trials notwithstanding, current research approaches have not accelerated implementation (Glasgow & Chambers, 2012; Kilbourne et al., 2016; Riley et al., 2013). The Dissemination and Implementation Research in Health funding opportunity announcements (National Institutes of Health, 2019a, b, c) highlight potential “to speed the translation of research into practice” as an “innovation” review criterion. National Cancer Institute-supported Implementation Science Centers for Cancer Control emphasize the need for studying “rapid cycle” implementation (Oh et al., 2021), and funding opportunity announcements for the Clinical and Translational Science Award (CTSA) Program call for “clear and meaningful metrics and outcomes” to assess the speed of research translation and training (National Institutes of Health, 2018). Yet the implementation science literature reflects scant attention to speed (Kessler & Glasgow, 2011), leaving us ill-equipped to answer the question: What variables and conditions most affect the speed of implementation? With dedicated investigation and measurement of speed across a range of innovations and contexts, we could, in time, begin to understand how long implementation should take, from discovery to incorporation into routine public health practice or clinical care.

Implementation speed is a thorny issue. Its investigation could involve (1) speed of translational processes (e.g., from landmark paper to guideline, guideline to implementation), (2) pace of a given implementation effort (e.g., progression through implementation stages), (3) timeliness of prevention efforts and health care delivery (e.g., to stem more severe illness), or (4) speed of achieving clinical outcomes (e.g., improved functioning after receiving evidence-based care). These orientations range from a largely individual focus (e.g., identifying a critical care need and intervening faster) to a population focus (e.g., translating and scaling up evidence to inform assessment and management practices in critical care). This paper merges the former two orientations as follows: speed of moving from synthesized recommendations based on actionable evidence (e.g., a guideline) warranting implementation to the point at which that evidence is identifiably in use in standard practice, where contextually appropriate.

Cognizant of potential trade-offs in prioritizing implementation speed, we present both sides of an argument about the optimal pace of the implementation process—implementation occurs too slowly, versus implementation should not be rushed. We discuss the balance of risks and benefits of slow or quick implementation. Of course, certain interventions should not be implemented given uncertain evidence or lack of evidence of clinical or population health benefit (Norton et al., 2017). Our work is based on an assumption that the interventions to be accelerated are those with established benefit for the general population or identified subpopulation.

The Debate: Fast or Slow?

The Case for Speed

Several factors compel expeditious roll-out of new discoveries, three of which are among the most frequently debated issues in research translation: (1) rapidly emerging health and social crises, (2) the typically reactive nature of systems, and (3) healthcare and social inequities.

First, evident globally, new epidemics and rapid disease spread require fast response. Traditional research paradigms accommodate slow-moving, incremental improvements (Sverdlov, 2018), but crippling pandemics, epidemics, and other pressing societal needs demand rapid advances. Adoption of new innovations carries perceived and actual risk, but failure to act, or acting too slowly, risks significant harm. Research shows that governmental delays in implementing evidence-based virus mitigation policies are linked to disease spread, with attendant massive health, economic, and social tolls (Chernozhukov et al., 2021; Walker et al., 2020). In contrast, coordinated efforts, ingenuity, and “supercharged” funding yield clear payoffs, as with the development of novel messenger RNA-based vaccines and their unprecedented rate of distribution (Ball, 2021). Importantly, though, speed of vaccine distribution varies dramatically across communities, highlighting that health equity must be a priority in accelerated translation (Jean-Jacques & Bauchner, 2021). And a mutual desire across systems to reach a solution, along with provision of funding, is not sufficient for rapid change, as evidenced by the ongoing efforts to address the opioid epidemic and its consequences across multiple service sectors, including criminal justice, child welfare, mental health, and addiction (Morrow et al., 2019).

Second, as reflected in the challenge of making prevention common practice, social services, healthcare, and public health remain largely reactive. Optimally, societies would have armamentaria of new innovations, practice guidelines, and solutions in design and testing queues, ready for quick deployment. However, new solutions tend to be prioritized only in crises, such as the opioid epidemic or dementia among a rapidly aging population (Khanassov et al., 2014). Moreover, efforts to understand implementation often wait until innovations have a strong evidence base (Colditz & Emmons, 2018; Proctor et al., 2013). A more anticipatory approach, designing solutions for future implementation, may substantially narrow the research-to-practice gap (Brown et al., 2017).

Third, long-persisting racial, economic, and geographical inequities in prevention and care exact an unnecessary toll on human health and well-being. Notably, the “time to implementation” of public health programs and clinical discoveries is prolonged among populations from disadvantaged communities. Many individuals from these groups lag in receipt of COVID-19 vaccines, clinician advice against smoking, breast and colorectal cancer care, and lung cancer screening eligibility, utilization, and follow-up care (Khan et al., 2021; Levinson, 2017; Ndugga et al., 2021; Sosa et al., 2021; Ward et al., 2004). Similarly, although the adverse social determinants of health associated with child maltreatment and subsequent involvement in the child welfare system have been documented for decades, receipt of evidence-based practice among families demonstrating these risk factors is lacking (Hunter & Flores, 2021). Designing innovations for implementation in communities where the need is greatest, rather than adapting innovations originally designed for higher-resourced contexts and settings, may accelerate scale-up and the reach of prompt diagnosis and treatment among minoritized populations (Mohr et al., 2017, 2018).

The Case for Pause

Despite recognized urgency to accelerate research translation, several questions signal caution when weighing fast-versus-slow. Does haste make waste? Is fast inherently risky? Can rapid research yield strong evidence, absent replication? Can moving fast co-exist with sustainment, or does this represent a direct trade-off? Does rapid research increase inequities? Does the “life lesson” of Tony Award winner Andre De Shields (2019)—slowly is the fastest way to get where you are going—apply to implementation in health and other global domains?

The health mandate to “do no harm” cautions against risk, prompting the question, “What is the minimum level of evidence needed for implementation?” (Ramsey et al., 2019). Once again, lessons from the COVID-19 pandemic are useful: rapid development, testing, and approval of COVID-19 vaccines triggered an onslaught of questions about safety and effectiveness (Khuroo et al., 2020). For implementation of health innovations to be successful, safety and effectiveness need to pass the scrutiny of both researchers and the population to be served.

Yet, generally speaking, “evidence-in-progress” is all we have, as reflected in the U.S. Food and Drug Administration’s system of evidence determination: diffusion proceeds while evidence continues to accrue. Researchers generally believe that better evidence is on the horizon, awaiting the next trial. But real-world problems demand action. The adage “More research is needed…”, while true, can slow science’s capacity to improve the human condition. How soon should we act on evidence, even as it continues to evolve (Ramsey et al., 2019)? And how can we speed the process of translating evidence to practice, fostering change amid complex health and human service settings that are not “built for speed”?

Speed Limit Guidance

Innovative designs promise research efficiencies and faster implementation (Glasgow et al., 2012). Frameworks such as Designing for Accelerated Translation (DART) (Ramsey et al., 2019) and methodological advances such as hybrid designs, user-centered designs, rapid ethnography, and market viability assessment (Curran et al., 2012; Hamilton & Finley, 2019; Proctor, McKay, et al., 2021; Proctor, Toker, et al., 2021) can shorten time from idea to implementation (Vindrola-Padros et al., 2021). We view iterative and concurrent approaches to evidence generation and implementation as practical, representative of real-world needs, and useful in accelerating time-to-benefit.

Even when faster translation is neither prudent nor safe, we argue for the importance of systematically measuring the pace of research translation and understanding the influences on and impact of implementation speed. Such research will inform, if not resolve, the debate about fast-versus-slow. Accordingly, we aim to untangle complexities in the study of implementation speed and offer a Framework to Assess Speed of Translation (FAST) as a key metric in implementation science. FAST offers potential referents and observation windows to structure assessments of speed and identifies potential influences on and effects of implementation pace. Finally, we propose a research agenda to advance understanding of the pace of implementation, including research to identify accelerators and inhibitors of speed. As noted throughout the paper, the concept of speed cuts across the care continuum (prevention to palliation) and settings. Given its visibility and global urgency, several of our illustrations highlight the COVID-19 pandemic response, although we anticipate that the key concepts and proposed research agenda for studying implementation speed will apply broadly across many other domains and service settings. As a general principle, innovations with sufficient evidence of effectiveness and community or end-user demand, combined with mitigated costs and risks of acting versus not acting on the evidence—as theorized in the DART framework (Ramsey et al., 2019)—are appropriate for prioritizing implementation speed, regardless of the domain.

Complexities of Speed: Who, What, and How?

Several complexities beset studying speed of research translation. We discuss three that are essential for conceptual and methodological advancement: stakeholders’ priorities for speed, referents for measurement, and metrics.

Who Cares About Speed and Why Do They Care?

To whom is the pace of research translation relevant? To whom is it a priority? Table 1 contrasts several stakeholder groups.

Table 1.

Stakeholder perspectives and selected priorities on the speed of research translation

Stakeholders | Perspectives and priorities (sample questions)
Intervention developers, trainers, and purveyors | How long until the innovation is adopted?
Clinicians | How long will the innovation take to learn? How long to reach competence? When can the innovation be used?
Clients and patients | How long until the innovation is available? How long until improvement is seen?
Administrators | How long is the change process? How quickly will the new innovation become routine?
Payers | How long until return on investment?
Policy makers | Is the innovation ready at the time when decisions are being made? Can we implement the innovation and demonstrate improvement in time for the re-election cycle?
Communities | How long until users of the innovation are reached? How long until coverage rates are adequate?
Advocates | Does rapid research affect health equity? How long until equity is realized?
Researchers (current) | How long does it take to translate evidence to practice?
Researchers (proposed) | How long will each stage of research translation take for this innovation? How can we optimize the speed of intervention delivery upon identifying effectiveness? How can we better measure the speed of change? What factors will impact speed? What strategies will enhance speed? How do we increase speed for disadvantaged groups? What effects did speed at both the translational research and applied implementation levels have on overall impact of the innovation?

Sample questions are not necessarily mutually exclusive to a stakeholder group

Stakeholders’ perspectives, preferences, and priorities for speed are likely to differ, perhaps even within a given stakeholder group. For instance, community-based stakeholders often move “at the speed of trust” (Covey & Merrill, 2006), justifiably hesitant to partner with unfamiliar research teams to adopt innovative programs that may not meet an identified need in the community. At the same time, community health providers often cite the slow pace of research agendas and grant timelines as a key barrier to collaboration (Carter-Edwards et al., 2021). When meaningful research-practice partnerships coalesce around community needs, an increased priority on implementation speed may enhance engagement and the relevance of research for community partners. As another example, healthcare consumers and advocates may view drug approval as too slow, while regulators focused on safety may work to slow the pace. Timing and timeliness may also be viewed in unique ways by policy makers, who are often most concerned about research being available at the time when decisions are being made (Smith et al., 2022). Indeed, Bullock et al. (2021) identified “timing/sequencing” as one of eight categories of policy-related determinants of implementation. Researchers should explicate the stakeholder perspectives captured and interpret data in the context of those perspectives.

On What Referent is Speed Being Measured and Analyzed?

The referent of speed—what is being measured—also varies, as demonstrated in Table 2. One referent is phase of implementation. The plethora of phase models within implementation science reflects a prevailing assumption that practice change occurs in sequenced, dynamic patterns. Hence, studies may focus on the pace of completing stages of implementation (e.g., from Engagement through Competency in the Stages of Implementation Completion model) (Saldana et al., 2020); progressing through various phases of an implementation process (e.g., from Exploration through Sustainment in the EPIS framework) (Moullin et al., 2019); or moving through cyclical models such as Plan-Do-Study-Act (PDSA) phases (Taylor et al., 2014).

Table 2.

Potential referents of speed, to be measured per intervention

Speed of what? | Examples
Completing phases of the implementation process | How long should the planning-for-change phase last? What are the optimal lengths of the exploration phases? How long does it take to attain organizational readiness to adopt new interventions or programs?
Attainment of implementation outcomes | How quickly is provider or system adoption of innovations attained? How much training time is needed to attain fidelity to intervention protocols? How quickly can innovations penetrate groups of health care users or community populations to achieve targeted reach? How quickly does the innovation become sustained or institutionalized?

Speed may capture “time-to-outcome attainment”. Time to achieving implementation outcomes (e.g., adoption), the most proximal outcomes (Proctor et al., 2009), may function as rate-determining factors in the ultimate speed of implementation. Thus, studies may measure time from an intervention’s availability to its acceptance by providers and patients, time required to train providers to deliver interventions with fidelity, or time to scale-up for penetration within a health system or a community. Some implementation outcomes take longer to attain, and some may be necessary pre-conditions for other outcomes; e.g., implementation feasibility may depend on funding acquisition or new pricing policies. We have scant data on the typical length of time from an intervention’s availability to the achievement of specific implementation outcomes, such as providers’ awareness and decision to adopt, but recent unpublished analyses suggest a high degree of variation depending on the program being implemented (Alley et al., 2022).
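
As a concrete illustration, the minimal Python sketch below (hypothetical sites, dates, and field names, not data from any study) computes time from an intervention’s availability to each site’s adoption, treating sites that have not yet adopted as censored at the end of the observation window:

```python
from datetime import date

# Hypothetical site-level records (illustrative only): when the intervention
# became available to each site and when, if ever, the site adopted it.
sites = [
    {"site": "A", "available": date(2021, 1, 4), "adopted": date(2021, 3, 1)},
    {"site": "B", "available": date(2021, 1, 4), "adopted": date(2021, 9, 15)},
    {"site": "C", "available": date(2021, 1, 4), "adopted": None},  # not yet adopted
]

study_end = date(2022, 1, 4)  # close of the observation window

for s in sites:
    end = s["adopted"] or study_end
    days = (end - s["available"]).days
    status = "adopted" if s["adopted"] else "censored at study end"
    print(f"Site {s['site']}: {days} days ({status})")

# Crude summary: median days to adoption among adopters only. Survival
# methods (e.g., Kaplan-Meier) would also make use of the censored sites.
adopt_times = sorted(
    (s["adopted"] - s["available"]).days for s in sites if s["adopted"]
)
print("Median days to adoption (adopters only):",
      adopt_times[len(adopt_times) // 2])
```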

In early implementation (e.g., engagement and planning), we need to assess which implementation outcomes take longer to achieve (e.g., obtaining resources to offset the cost barriers to implementation) than others (e.g., reaching acceptability through use of a champion or communication campaign). Akin to intervention mapping, an implementation team can assess speed, anticipate timing, and prioritize earlier efforts on slower-moving challenges. A common example from healthcare settings is the adoption of a new innovation that requires modifications to the electronic health record system. Because of the effort needed to program these changes, this implementation activity might be prioritized up front, before other seemingly more essential implementation activities such as hiring and training. Similarly, because funding can be challenging to obtain, implementers often wait to learn whether they have obtained funding before connecting with key stakeholders for implementation, recognizing that delays between engaging with stakeholders and moving to action often undercut engagement and motivation.
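
The back-scheduling logic described above can be sketched in a few lines. The example below is hypothetical (invented activities and durations): given a target go-live date, it computes the latest allowable start date for each activity, so that long-lead work such as EHR modifications surfaces at the top of the plan:

```python
from datetime import date, timedelta

go_live = date(2023, 6, 1)  # hypothetical target date for first patient treated

# Hypothetical implementation activities with estimated durations in days.
activities = {
    "EHR modifications": 180,
    "Secure funding": 150,
    "Hire staff": 90,
    "Train providers": 45,
    "Acceptability campaign (champion/communication)": 30,
}

# Latest date each activity can start and still finish by go-live.
latest_start = {name: go_live - timedelta(days=days)
                for name, days in activities.items()}

# Sorting by latest start date puts the slowest-moving work first in the plan.
for name, start in sorted(latest_start.items(), key=lambda item: item[1]):
    print(f"{start}: start {name!r} ({activities[name]} days)")
```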

Observation Periods: From When-to-When?

Explicating observation periods for data collection is a standard expectation in reporting research methods. Similarly, in reporting the rate or time of research translation, studies need to explicate observation windows. Khan et al. (2021) used the observation period “time from publication to implementation of practice guidelines,” derived from Balas and Boren (2000). We also encourage reporting of more “downstream” observation periods, including “time from actionable evidence-based recommendation to completion of an implementation effort making use of that evidence.” Hence, implementation pace could be reported as time from intervention development to time of first provider training, time of first patient treated, or time to reaching agency sustainment. Brookman-Frazee et al. (2016) measured the slope of adoption quarterly, within pre-determined time increments. Quantitative measurement of adoption rates over time, particularly in relation to other sites or prior adoption efforts, can prove helpful. Table 3 below highlights various domains of measurement, with sample metrics; a toy calculation follows the table.

Table 3.

Measurement of speed

Domains for measuring speed | Example metrics

Speed in the implementation process
Time elapsed to achieve a predefined implementation milestone | Number of days from starting provider training to the first person receiving the intervention
Time elapsed to attain a predefined outcome (implementation, service system, or clinical outcome) | Number of months to attain 60% of eligible providers delivering the intervention following clinic adoption
Implementation progress between predefined time periods | Number of implementation steps completed or outcomes attained in 6 months
Rate of progress (or changes in slope) over time or between milestones | % increase in sites adopting in the first 6-month period vs. the second 6-month period; visual depiction (i.e., curve) of % increase in providers engaged in the 6 months before vs. the 6 months after a readiness assessment
Pace of iterative development or improvement | Time elapsed (in days) from start to end of the 1st PDSA cycle, 2nd PDSA cycle, etc.

Speed in the translation of research
Time spent within a translational stage (and time saved in subsequent iterations within that stage) | Number of months to develop the first versus second iteration of an intervention
Time to advance from one translational stage to another | Number of months from intervention development to efficacy testing in real-world settings (e.g., from Stage I to Stage III in the NIH Stage Model for Behavioral Intervention Development)
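
As a toy illustration of the “rate of progress” metric in Table 3, the sketch below (hypothetical counts) computes the percentage increase in adopting sites across two successive 6-month observation windows:

```python
# Hypothetical cumulative counts of adopting sites at 6-month checkpoints.
checkpoints = ["baseline", "6 months", "12 months"]
sites_adopting = [4, 10, 13]

# Rate of progress within each 6-month window, and the change in slope.
for i in range(1, len(sites_adopting)):
    gained = sites_adopting[i] - sites_adopting[i - 1]
    pct = 100 * gained / sites_adopting[i - 1]
    print(f"{checkpoints[i - 1]} -> {checkpoints[i]}: "
          f"+{gained} sites ({pct:.0f}% increase)")

# Here adoption grew 150% in the first window but only 30% in the second,
# i.e., the pace of adoption slowed between the two observation periods.
```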

A Proposed Framework to Assess Speed of Translation (FAST)

Implementation science enjoys a wealth of frameworks and conceptual models (Tabak et al., 2012), some focusing on implementation processes as key to sustainment (Shelton & Lee, 2019). Yet very few (a notable exception being the Stages of Implementation Completion; Chamberlain et al., 2011) explicitly guide assessment of sequential progress toward implementation. Moreover, extant taxonomies of implementation outcomes do not address speed of attainment. The challenges and uncertainties associated with accelerating implementation efforts warrant a guiding framework.

Figure 1 below depicts our proposed Framework to Assess Speed of Translation (FAST). FAST can guide research to inform: (1) an initial set of parameters and metrics for capturing and reporting speed, including identification of the referent (i.e., speed of what?), endpoints (i.e., speed from when to when?), and outcomes to assess speed of implementation and its change over time; (2) factors that affect speed, including those of the innovation, context (e.g., demand, need), adopters, and implementation strategies; and (3) the effects of speeding implementation.

Fig. 1 Depiction of the determinants of implementation pace in the Framework to Assess Speed of Translation (FAST)

Factors in our proposed FAST framework align well with other widely used frameworks that do not focus as explicitly on speed, including the Interactive Systems Framework (Wandersman et al., 2008), the Consolidated Framework for Implementation Research (Damschroder et al., 2009), and Active Implementation Frameworks (Fixsen et al., 2005). The FAST framework is influenced by these existing frameworks as evidenced by a focus on speed with reference to multiple stakeholder groups (i.e., varying perspectives on speed), the specific innovation and the context in which it is to be implemented (i.e., innovation and adopter factors manifesting as accelerators and inhibitors to speed), and the importance of implementation capacity and other implementation drivers (i.e., determinants and strategies for building speed). The FAST framework aims to build on these and other implementation science frameworks to promote and facilitate the study of factors that influence speed of implementation and the effects of speed on important outcomes across diverse contexts.

A Research Agenda on Speed

We propose a five-pronged research agenda to advance understanding of implementation speed: (1) describe speed and develop metrics, (2) examine innovation, adopter, and contextual influences on speed, (3) identify and test how specific implementation strategies can accelerate speed, (4) assess the effect of implementation speed on key outcomes, and (5) develop designs for testing speed.

Describe Speed and Develop Metrics

The literature currently provides no systematic review of the speed of implementation (Smith et al., 2020). Debates about the merits of speed notwithstanding, the field would benefit, at minimum, from better specification and reporting of implementation speed. A first priority is detailing how quickly implementation phases, outcomes (at intermediate and late stages), and processes are achieved. This requires the field to better specify and report the perspectives (i.e., speed from whose viewpoint), referents (i.e., speed in terms of what), and measurement (i.e., speed defined how) of the speed of translation. This descriptive research will yield valuable information but, more importantly, will set the stage for the exploratory and analytic studies required to test influences on speed and effects of speed. An excellent example is provided by Huebner and colleagues, who calculated time to achieving fidelity in delivering Sobriety Treatment and Recovery Team (START) programs for families with co-occurring child maltreatment and substance use disorders (Huebner et al., 2015).

Examine Influences on Speed

What factors influence implementation speed? Similar to a prior call for sustainment research (Proctor et al., 2015), studies should explore such correlates of speed as features of the innovation, characteristics of adopters, and features of the implementation context, as follows.

Features of the Innovation

Diffusion theory has long suggested that properties of innovations affect their rate of uptake, with Rogers (2003) positing that compatibility, relative advantage, complexity, and cost affect adoption. Thus, we can form testable hypotheses that some interventions, policies, and programs may be inherently faster to adopt and deliver. Questions to be addressed about innovation features include: Are interventions with certain features (e.g., designed and packaged with the end user in mind) adopted more quickly than others? Would modifying certain features of the intervention affect the pace of implementation?

Emerging evidence suggests that patterns of adoption and penetration vary across evidence-based interventions (Brookman-Frazee et al., 2016). Moreover, an innovation’s developmental state appears to influence the pace of implementation, with formative implementations being less efficient, though not necessarily less effective (Saldana et al., 2020). Researchers should map and report innovation-specific adoption curves, reflecting rollout pace across various phases, benchmarks, and iterations. User-centered design and stakeholder engagement may produce innovations that are faster to implement. There are also likely to be differences in speed of implementation for highly incentivized or mandatory versus voluntary innovations. For example, if a service is covered by Medicare and is an indicator of quality for a health system (e.g., mammography screening), it is more likely to be rapidly adopted than a voluntary innovation (e.g., encouragement of employees to follow new healthy lifestyle guidelines).

Adopter Characteristics: Organizational, Provider, and Patient

Studies can identify the speed with which specific systems, organizations, or communities implement innovations, applying characterizations from Rogers (2003) of early and late adopters, or quick versus slow implementers. Risk tolerance is dynamic and likely varies across stakeholders of any human service system undergoing change. While personality traits of providers and system leaders can remain stable, an entrepreneurial lens suggests that risk tolerance can increase with experience and support (Proctor, McKay, et al., 2021; Proctor, Toker, et al., 2021). Likewise, organizational climate, culture, infrastructure, and experience with innovation adoption vary and are malleable (Glisson, 2007), although individuals may adopt change faster than systems (Klein & Kozlowski, 2000). Systems with embedded implementation staff and resources likely adopt new interventions more quickly than systems without such internal resources or those that rely on external consultation. For example, the U.S. Department of Veterans Affairs initiated the Quality Enhancement Research Initiative (QUERI) to support implementation and later adopted implementation facilitators to expedite and enhance implementation of evidence-based practices (Ritchie et al., 2021).

Research should capture the risk-reward balance in organizations’ willingness to adopt versus maintain the status quo. The effects of implementation capacity and capital (Neal & Neal, 2019; Wandersman et al., 2008), along with a range of other key implementation drivers (Kaye et al., 2012; Torrey et al., 2012), also warrant further study to understand how individuals and organizations leverage prior experience to improve the efficiency of subsequent implementation efforts.

Further, as highlighted in the Dynamic Sustainability Framework (Chambers et al., 2013) and, more specifically, in the concept of bridging factors (Lengnick-Hall et al., 2021), the strength of connections across delivery system levels underscores the importance of specific relational structures across levels and the role of formal arrangements. Implementation supports via bridging factors may include financial investment, contract requirements, or legislative mandates. The unit of implementation is also likely to influence the speed of translation. For example, a set of evidence-based interventions to promote vaccination uptake is likely to be implemented more quickly in a small number of clinics than across a larger, more diverse set of clinics statewide (Dang et al., 2020; Dearing & Cox, 2018).

Does prior success in adopting an innovation lead to a more successful second attempt? Conversely, does prior implementation failure lead to later caution? By assessing adopter characteristics as they relate to implementation speed, investigators can compare the rates at which different provider groups or agencies accrue sufficient capability, opportunity, and motivation for change.

Features of Context: Pull, Capacity, and Urgency

Contexts and circumstances for innovation implementation also may affect pace. Demand for innovations, and for their rapid implementation, intensifies in public health crises. Accordingly, necessary resources and support may become available to speed intervention deployment. In contrast, implementation may be slower when demand is not apparent or when implementing an innovation carries clinical or financial risk (Ramsey et al., 2019). Implementation researchers must better understand the demand for interventions. Treatment and program developers are strong in “pushing out” discoveries but weaker in cultivating “pull” (i.e., the market demand for an innovation) (Proctor, McKay, et al., 2021; Proctor, Toker, et al., 2021) or in recognizing existing pull (Orleans, 2007). Push–pull tension may be related to “capacity”, or systems’ potential to deliver value (Curry, 2000; Orleans et al., 1999). Quality improvement efforts often seek to build capacity, with lessons for implementation science to prioritize end-users’ needs, motives, and preferences. Consistent with resource allocation theory (Bower, 2018), capacity is influenced by the number of other interventions and projects already being implemented within a given context, which may prompt de-implementation considerations.

Fiscal resources are another influential feature of context. Importantly, the total costs of implementation include both delivering the intervention itself (e.g., services billable to insurance, equipment, materials) and maintaining an effective implementation infrastructure (e.g., ongoing training and coaching costs, data system costs, contracts with developers). Costs of intervention implementation vary and often are obscure (Proctor et al., 2019; Saldana et al., 2022). Adoption pace may be unaffected when costs are low, may slow when implementation costs are high or funding streams are sporadic, and may accelerate when intervention delivery is incentivized. Brookman-Frazee et al. (2016) found accelerated ramp-up in response to fiscal mandates to deliver evidence-based interventions. Understanding the demand for and costs of an innovation may lay the groundwork for long-term implementation success without compromising quality or sustainability (Chambers et al., 2013; Luke et al., 2014).

Finally, policy mandates provide context for implementation but often do not come with the infrastructure necessary for rapid on-the-ground implementation. For example, the Family First Prevention Services Act (FFPSA; P.L. 115-123), enacted in 2018, authorized federal funding for prevention services, including mental health, substance abuse, and in-home parent skills programs, in an effort to reduce the number of entries into foster care. Yet three years into the process, only a limited number of states have approved FFPSA plans (Healthy Families America, 2021), and the number of evidence-based practices reviewed and approved for funding, while growing, is still too small to meet the scale and scope of a federal mandate.

Active Efforts to Generate Speed: Can Implementation Strategies Accelerate Implementation?

Implementation strategies are the “how to” for moving guidelines, policies, and practices into adoption and use (Curran, 2020; Proctor et al., 2009). While published compilations (Powell et al., 2012, 2015) list arrays of implementation strategies—ranging from provider training and data dashboards to checklists, protocols, policy initiatives, and organizational change methods—few studies describe the time required to use such strategies or the time between starting to use particular strategies and ultimate implementation success.

Calculating time to impact of implementation strategies is a research priority. Studies should examine pace as a function of specific implementation strategies such as various training approaches, stakeholder engagement, task shifting, and champion messages. Clear implementation outcomes need to be defined to identify the duration between implementation strategies and these outcomes (e.g., the time from training to serving the first client versus the time from training to achieving fidelity in program delivery). Although partnerships have been shown to be essential facilitators of implementation efforts, partnerships require time to shape and solidify (Proctor, McKay, et al., 2021; Proctor, Toker, et al., 2021), raising important questions: Does collaboration slow things down? Or does stakeholder engagement enhance relevance, thus preventing downstream stagnancy and lag? Answers to these questions may shift the goal from “going faster” to “going farther, faster”.

Does Speed Matter and If So, What is Its Impact?

Finally, this proposed research agenda includes understanding the outcomes and impacts of faster-versus-slower research translation. Can a faster pace of implementation produce sustainable change? Can accelerated implementation yield stakeholder satisfaction and quality outcomes? Or might we learn, akin to Goldilocks, that there is “too fast, too slow, and just right”? More nuanced questions and data may reveal the optimal pace of implementation for different innovations, with different stakeholder groups, at different implementation phases, and under different resource contexts. Answering these and other questions will advance the growing line of research on precision implementation and inform decisions about when, how, and under what circumstances faster implementation is warranted.

Designs for Testing Speed

Understanding speed of research translation, its determinants, and outcomes requires distinct measurement and analytical tools (Guastaferro & Collins, 2019). Data must be gathered over time and analyzed for trends and non-linear patterns. Such data may be captured retrospectively or concurrently from ongoing studies. As in measuring sustainability, capturing speed requires establishing optimal observation periods (Proctor et al., 2015), defined checkpoints, and short- and long-term outcomes that indicate whether speed increases or decreases. These measurements should be relevant to the level of implementation being evaluated for speed.
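
As one minimal sketch of such an analysis (hypothetical checkpoints and milestone counts), the code below estimates the pace of implementation between predefined checkpoints and flags windows where implementation accelerated or slowed:

```python
# Hypothetical checkpoint data: months since launch and cumulative
# implementation milestones completed at each checkpoint.
months = [0, 3, 6, 9, 12]
milestones_done = [1, 2, 6, 9, 10]

# Pace (milestones per month) within each inter-checkpoint window.
slopes = []
for i in range(1, len(months)):
    rate = (milestones_done[i] - milestones_done[i - 1]) / (months[i] - months[i - 1])
    slopes.append(rate)
    print(f"Months {months[i - 1]}-{months[i]}: {rate:.2f} milestones/month")

# A simple non-linearity check: did the pace change between adjacent windows?
for i in range(1, len(slopes)):
    trend = "accelerating" if slopes[i] > slopes[i - 1] else "slowing"
    print(f"Window {i} -> window {i + 1}: {trend}")
```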

Adoption speed may be most efficiently studied within ongoing implementation projects via natural experiments (Glasgow et al., 2012; Petticrew et al., 2005). Practice-based research can yield “tacit knowledge” or “colloquial evidence”—pragmatic information based on direct experience and action in practice (Kothari et al., 2012; Sharma et al., 2015). Researchers can readily leverage observational studies of implementation processes within their natural context, capturing time-related data along with features of context.

Hybrid effectiveness-implementation designs (Curran et al., 2012) are intended to speed knowledge development about interventions and their implementation, either by incorporating questions about implementation into studies of effectiveness (i.e., Type 1) or by simultaneously testing the impact of one or more implementation strategies alongside the effectiveness of one or more interventions (i.e., Type 2). Other methods for accelerating research translation include rapid ethnographic assessment (Hamilton & Finley, 2019; Reisinger, 2019; Sangaramoorthy & Kroeger, 2020), rapid qualitative syntheses (Brown-Johnson et al., 2020), and rapid-cycle research (National Cancer Institute, 2019; Sangaramoorthy & Kroeger, 2020).

Conclusion

This paper addresses the arguments for and against accelerating the translation of evidence into practice and offers a conceptual framework for assessing the speed of translation. Research yields many public health and clinical interventions with solid evidence, low risk, and strong adoption potential that have yet to be implemented at scale. Interventions still on the shelf reflect unrealized value and provide the ultimate argument for speed. “Fast-tracking” such innovations from research evidence to use in practice is critical. Conversely, some interventions currently in use have weak evidence of effectiveness and carry high costs, signaling misuse of resources (e.g., financial, effort). “Flat-lining” such innovations—that is, halting their implementation pending more evidence or de-implementing them altogether—may be necessary. Such interventions provide a cautionary tale and a counterpoint to the argument for speed. A key goal is to distinguish more quickly between these typologies so as to make efficient decisions regarding implementation and de-implementation. Still other interventions require complex implementation strategies, training capacity, and infrastructure for their delivery. Here, efficiencies can be found to improve the pace of achieving these pre-conditions and, in turn, population health benefit.

We propose a framework for assessing speed of implementation along with an agenda for research on speed. The FAST framework identifies multiple determinants that affect the speed of translation, thereby informing a research agenda for the study of speed in implementation science. Executing this agenda requires careful conceptualization and measurement of speed indicators, intervals, and correlates, as well as rigorous testing of how long various implementation strategies take.

Researchers have long advocated for reducing the persistent 17-year evidence-to-practice gap (Lenfant, 2003). While the field of implementation science has grown in capacity, quality, and quantity of work, explicit attention to speed has scarcely moved from a conceptual goal to a target for improvement. We have lacked careful, explicit study of the determinants of implementation speed, as well as an agenda to build a more robust knowledge base that reduces the time from discovery to widespread use.

In a global pandemic, research cannot ignore the need for rapid response. Systems must move nimbly within dynamic contexts and the changing course of disease, and research must provide guidance on this process. This article is a first step toward that important endeavor. Accelerating speed requires careful analysis and rigorous research. Need implementation take 15 or 17 years? Can and should we move faster? How can research translation and implementation of evidence-based interventions be accelerated? It is time to accelerate research on speed.

Abbreviations

CTSA

Clinical and Translational Science Award

DART

Designing for Accelerated Translation

EPIS

Exploration, Preparation, Implementation, Sustainment

FAST

Framework to Assess Speed of Translation

NCI

National Cancer Institute

NIH

National Institutes of Health

PDSA

Plan-Do-Study-Act

QUERI

Quality Enhancement Research Initiative

Author Contributions

EKP conceived the paper; as joint first authors, EKP and ATR equally contributed to the paper content and format; ATR developed the FAST framework; all authors contributed to writing and editing the paper and reviewed the final manuscript.

Funding

EKP was supported by the Implementation Science-Entrepreneurship Unit of the Washington University Institute of Clinical and Translational Sciences grant UL1TR002345 from the National Center for Advancing Translational Sciences (NCATS) of the National Institutes of Health (NIH); the National Institute of Mental Health Grant R25 MH080916; the National Cancer Institute Grant P50CA244431; the National Institute of Mental Health grant 5P50MH113662; and Patient-Centered Outcomes Research Institute (PCORI) Award (TRD-1511-33321). ATR was supported by the National Institute on Drug Abuse under Award Numbers K12DA041449 and R34DA052928 and the National Cancer Institute under Award Number P50CA244431. LS was supported by R01DA044745. RCB was supported by National Cancer Institute (Grant Number P50CA244431), the National Institute of Diabetes and Digestive and Kidney Diseases (Grant Numbers P30DK092950 and P30DK056341), the Centers for Disease Control and Prevention (Cooperative Agreement number U48DP006395), and the Foundation for Barnes-Jewish Hospital. The findings and conclusions in this paper are those of the authors and do not necessarily represent the official positions of the National Institutes of Health or the Centers for Disease Control and Prevention.

Declarations

Competing interests

The authors declare no competing interests. Views expressed in this paper are those of the authors and are not official positions of the National Cancer Institute.

Footnotes

Enola Proctor and Alex T. Ramsey are joint first authors.

References

  1. Balas EA, Boren SA. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray A, editors. Yearbook of medical informatics 2000: Patient-centered systems. Schattauer; 2000. [PubMed] [Google Scholar]
  2. Ball P. The lightning-fast quest for COVID vaccines—And what it means for other diseases. Nature. 2021;589(7840):16–18. doi: 10.1038/d41586-020-03626-1. [DOI] [PubMed] [Google Scholar]
  3. Bower J. Resource allocation theory. In: Augier M, Teece DJ, editors. The Palgrave Encyclopedia of strategic management. Palgrave Macmillan; 2018. pp. 1445–1448. [Google Scholar]
  4. Brookman-Frazee L, Stadnick N, Roesch S, Regan J, Barnett M, Bando L, Innes-Gomberg D, Lau A. Measuring sustainment of multiple practices fiscally mandated in children’s mental health services. Administration and Policy in Mental Health and Mental Health Services Research. 2016;43(6):1009–1022. doi: 10.1007/s10488-016-0731-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Brown-Johnson C, Safaeinili N, Zionts D, Holdsworth LM, Shaw JG, Asch SM, Mahoney M, Winget M. The Stanford lightning report method: A comparison of rapid qualitative synthesis results across four implementation evaluations. Learning Health Systems. 2020;4(2):e10210. doi: 10.1002/lrh2.10210. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, Collins LM, Duan N, Mittman BS, Wallace A, Tabak RG, Ducharme L, Chambers DA, Neta G, Wiley T, Landsverk J, Cheung K, Cruden G. An overview of research and evaluation designs for dissemination and implementation. Annual Review of Public Health. 2017;38:1–22. doi: 10.1146/annurev-publhealth-031816-044215. [DOI] [PMC free article] [PubMed] [Google Scholar]
  7. Bullock HL, Lavis JN, Wilson MG, Mulvale G, Miatello A. Understanding the implementation of evidence-informed policies and practices from a policy perspective: A critical interpretive synthesis. Implementation Science. 2021;16(1):18. doi: 10.1186/s13012-021-01082-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  8. Carter-Edwards L, Grewe ME, Fair AM, Jenkins C, Ray NJ, Bilheimer A, Dave G, Nunez-Smith M, Richmond A, Wilkins CH. Recognizing cross-institutional fiscal and administrative barriers and facilitators to conducting community-engaged clinical and translational research. Academic Medicine. 2021;96(4):1–8. doi: 10.1097/ACM.0000000000003893. [DOI] [PMC free article] [PubMed] [Google Scholar]
  9. Chamberlain P, Brown CH, Saldana L. Observational measure of implementation progress in community based settings: The Stages of implementation completion (SIC) Implementation Science. 2011;6(1):116. doi: 10.1186/1748-5908-6-116. [DOI] [PMC free article] [PubMed] [Google Scholar]
  10. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: Addressing the paradox of sustainment amid ongoing change. Implementation Science. 2013;8(1):117. doi: 10.1186/1748-5908-8-117. [DOI] [PMC free article] [PubMed] [Google Scholar]
  11. Chernozhukov V, Kasahara H, Schrimpf P. Causal impact of masks, policies, behavior on early covid-19 pandemic in the US. Journal of Econometrics. 2021;220(1):23–62. doi: 10.1016/j.jeconom.2020.09.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  12. Colditz GA, Emmons K. The promise and challenges of dissemination and implementation research. In: Brownson R, Colditz G, Proctor E, editors. Dissemination and implementation research in health: Translating science to practice. 2. Oxford University Press; 2018. pp. 1–17. [Google Scholar]
  13. Covey SMR, Merrill RR. The speed of trust: The one thing that changes everything. Simon & Schuster; 2006. [Google Scholar]
  14. Curran GM. Implementation science made too simple: A teaching tool. Implementation Science Communications. 2020;1(1):27. doi: 10.1186/s43058-020-00001-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care. 2012;50(3):217–226. doi: 10.1097/MLR.0b013e3182408812. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Curry SJ. Organizational interventions to encourage guideline implementation. Chest. 2000;118:40S–46S. doi: 10.1378/chest.118.2_suppl.40S. [DOI] [PubMed] [Google Scholar]
  17. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science. 2009;4(1):50. doi: 10.1186/1748-5908-4-50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  18. Dang H, Dao S, Carnahan E, Kawakyu N, Duong H, Nguyen T, Nguyen D, Nguyen L, Rivera M, Ngo T, Werner L, Nguyen N. Determinants of scale-up from a small pilot to a national electronic immunization registry in Vietnam: Qualitative evaluation. Journal of Medical Internet Research. 2020;22(9):e19923. doi: 10.2196/19923. [DOI] [PMC free article] [PubMed] [Google Scholar]
  19. Dearing JW, Cox JG. Diffusion of innovations theory, principles and practice. Health Affairs. 2018;37(2):183–190. doi: 10.1377/hlthaff.2017.1104. [DOI] [PubMed] [Google Scholar]
  20. De Shields, A. (2019). Quotes | Andre De Shields. https://www.andredeshields.com/quotes
  21. Fixsen D, Naoom S, Blase K, Friedman R, Wallace F. Implementation research: A synthesis of the literature. University of South Florida; 2005. [Google Scholar]
  22. Glasgow RE, Chambers D. Developing robust, sustainable, implementation systems using rigorous, rapid and relevant science. Clinical and Translational Science. 2012;5(1):48–55. doi: 10.1111/j.1752-8062.2011.00383.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  23. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: Current and future directions. American Journal of Public Health. 2012;102(7):1274–1281. doi: 10.2105/AJPH.2012.300755. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Glisson C. Assessing and changing organizational culture and climate for effective services. Research on Social Work Practice. 2007;17(6):736–747. doi: 10.1177/1049731507301659. [DOI] [Google Scholar]
  25. Guastaferro K, Collins LM. Achieving the goals of translational science in public health intervention research: The multiphase optimization strategy (MOST) American Journal of Public Health. 2019;109(S2):S128–S129. doi: 10.2105/AJPH.2018.304874. [DOI] [PMC free article] [PubMed] [Google Scholar]
  26. Hamilton AB, Finley EP. Qualitative methods in implementation research: An introduction. Psychiatry Research. 2019;280:112516. doi: 10.1016/j.psychres.2019.112516. [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Healthy Families America. (2021). Family First Prevention Services Act : Questions and Answers. Retrieved from https://preventchildabuse.org/wp-content/uploads/2020/08/FFPSA-Q1-2.pdf
  28. Huebner RA, Posze L, Willauer TM, Hall MT. Sobriety treatment and recovery teams: Implementation fidelity and related outcomes. Substance Use & Misuse. 2015;50(10):1341–1350. doi: 10.3109/10826084.2015.1013131. [DOI] [PubMed] [Google Scholar]
  29. Hunter AA, Flores G. Social determinants of health and child maltreatment: A systematic review. Pediatric Research. 2021;89(2):269–274. doi: 10.1038/s41390-020-01175-x. [DOI] [PubMed] [Google Scholar]
  30. Jean-Jacques M, Bauchner H. Vaccine distribution—Equity left behind? JAMA - Journal of the American Medical Association. 2021;325(9):829–830. doi: 10.1001/jama.2021.1205. [DOI] [PubMed] [Google Scholar]
  31. Kaye S, DePanfilis D, Bright CL, Fisher C. Applying implementation drivers to child welfare systems change: Examples from the field. Journal of Public Child Welfare. 2012;6(4):512–530. doi: 10.1080/15548732.2012.701841. [DOI] [Google Scholar]
  32. Kessler R, Glasgow RE. A proposal to speed translation of healthcare research into practice: Dramatic change is needed. American Journal of Preventive Medicine. 2011;40:637–644. doi: 10.1016/j.amepre.2011.02.023. [DOI] [PubMed] [Google Scholar]
  33. Khan S, Chambers D, Neta G. Revisiting time to translation: Implementation of evidence-based practices (EBPs) in cancer control. Cancer Causes and Control. 2021;32(3):221–230. doi: 10.1007/s10552-020-01376-z. [DOI] [PubMed] [Google Scholar]
  34. Khanassov V, Vedel I, Pluye P. Case management for dementia in primary health care: A systematic mixed studies review based on the diffusion of innovation model. Clinical Interventions in Aging. 2014;9:915–928. doi: 10.2147/cia.s64723. [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Khuroo MS, Khuroo M, Khuroo MS, Sofi AA, Khuroo NS. COVID-19 vaccines: A race against time in the middle of death and devastation! Journal of Clinical and Experimental Hepatology. 2020;10:610–621. doi: 10.1016/j.jceh.2020.06.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
  36. Kilbourne AM, Rani Elwy A, Sales AE, Atkins D. Accelerating research impact in a learning health care system VA’s quality enhancement research initiative in the choice act era. Medical Care. 2016;55(7):S4–S12. doi: 10.1097/MLR.0000000000000683. [DOI] [PMC free article] [PubMed] [Google Scholar]
37. Klein KJ, Kozlowski SWJ, editors. Multilevel theory, research, and methods in organizations: Foundations, extensions, and new directions. San Francisco: Jossey-Bass; 2000.
38. Kothari A, Rudman D, Dobbins M, Rouse M, Sibbald S, Edwards N. The use of tacit and explicit knowledge in public health: A qualitative study. Implementation Science. 2012;7(1):20. doi: 10.1186/1748-5908-7-20.
39. Lenfant C. Clinical research to clinical practice—Lost in translation? New England Journal of Medicine. 2003;349(9):868–874. doi: 10.1056/nejmsa035507.
40. Lengnick-Hall R, Stadnick NA, Dickson KS, Moullin JC, Aarons GA. Forms and functions of bridging factors: Specifying the dynamic links between outer and inner contexts during implementation and sustainment. Implementation Science. 2021;16(1):34. doi: 10.1186/s13012-021-01099-y.
41. Levinson AH. Where the U.S. tobacco epidemic still rages: Most remaining smokers have lower socioeconomic status. Journal of Health Care for the Poor and Underserved. 2017;28(1):100–107. doi: 10.1353/hpu.2017.0012.
42. Luke DA, Calhoun A, Robichaux CB, Elliott MB, Moreland-Russell S. The Program Sustainability Assessment Tool: A new instrument for public health programs. Preventing Chronic Disease. 2014;11:130184. doi: 10.5888/pcd11.130184.
43. Mohr DC, Lyon AR, Lattie EG, Reddy M, Schueller SM. Accelerating digital mental health research from early design and creation to successful implementation and sustainment. Journal of Medical Internet Research. 2017;19:e7725. doi: 10.2196/jmir.7725.
44. Mohr DC, Riper H, Schueller SM. A solution-focused research approach to achieve an implementable revolution in digital mental health. JAMA Psychiatry. 2018;75(2):113–114. doi: 10.1001/jamapsychiatry.2017.3838.
45. Morrow JB, Ropero-Miller JD, Catlin ML, Winokur AD, Cadwallader AB, Staymates JL, Williams SR, McGrath JG, Logan BK, McCormick MM, Nolte KB, Gilson TP, Menendez MJ, Goldberger BA. The opioid epidemic: Moving toward an integrated, holistic analytical response. Journal of Analytical Toxicology. 2019;43(1):1–9. doi: 10.1093/jat/bky049.
46. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implementation Science. 2019;14(1):1. doi: 10.1186/s13012-018-0842-6.
47. National Cancer Institute. (2019). Implementation Science Consortium in Cancer (ISCC): Meeting summary report, Wednesday, July 10–Friday, July 12, 2019. Division of Cancer Control and Population Sciences (DCCPS). Retrieved from https://cancercontrol.cancer.gov/IS/initiatives/ccis/2019-iscc
48. National Institutes of Health. (2018). Limited Competition: Clinical and Translational Science Award (CTSA) Program: Collaborative Innovation Award (U01 Clinical Trial Optional), PAR-19-099. Retrieved from https://grants.nih.gov/grants/guide/pa-files/par-19-099.html
49. National Institutes of Health. (2019a). Dissemination and Implementation Research in Health (R01 Clinical Trial Optional), PAR-19-274. Retrieved from https://grants.nih.gov/grants/guide/pa-files/PAR-19-274.html#_Part_1._Overview
50. National Institutes of Health. (2019b). Dissemination and Implementation Research in Health (R03 Clinical Trial Not Allowed), PAR-19-276. Retrieved from https://grants.nih.gov/grants/guide/pa-files/par-19-276.html
51. National Institutes of Health. (2019c). Dissemination and Implementation Research in Health (R21 Clinical Trial Optional), PAR-19-275. Retrieved from https://grants.nih.gov/grants/guide/pa-files/PAR-19-275.html
52. Ndugga, N., Hill, L., Artiga, S., & Parker, N. (2021). Latest data on COVID-19 vaccinations by race/ethnicity. Kaiser Family Foundation. Retrieved from https://www.kff.org/coronavirus-covid-19/issue-brief/latest-data-on-covid-19-vaccinations-by-race-ethnicity/
53. Neal JW, Neal ZP. Implementation capital: Merging frameworks of implementation outcomes and social capital to support the use of evidence-based practices. Implementation Science. 2019. doi: 10.1186/s13012-019-0860-z.
54. Norton WE, Kennedy AE, Chambers DA. Studying de-implementation in health: An analysis of funded research grants. Implementation Science. 2017;12(1):1–13. doi: 10.1186/s13012-017-0655-z.
55. Oh A, Vinson CA, Chambers DA. Future directions for implementation science at the National Cancer Institute: Implementation Science Centers in Cancer Control. Translational Behavioral Medicine. 2021;11(2):669–675. doi: 10.1093/tbm/ibaa018.
56. Orleans, C. T., Gruman, J., & Anderson, N. (1999). Roadmap for the next frontier: Getting evidence-based behavioral medicine into practice. Proceedings of the 20th Annual Meeting of the Society of Behavioral Medicine; March 3–6; San Diego, California.
57. Orleans CT. Increasing the demand for and use of effective smoking-cessation treatments: Reaping the full health benefits of tobacco-control science and policy gains—in our lifetime. American Journal of Preventive Medicine. 2007;33(6):S340–S348. doi: 10.1016/j.amepre.2007.09.003.
58. Petticrew M, Cummins S, Ferrell C, Findlay A, Higgins C, Hoy C, Kearns A, Sparks L. Natural experiments: An underused tool for public health? Public Health. 2005;119(9):751–757. doi: 10.1016/j.puhe.2004.11.008.
59. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, Glass JE, York JL. A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review. 2012;69(2):123–157. doi: 10.1177/1077558711430690.
60. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Proctor EK, Kirchner JE. A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science. 2015;10(1):21. doi: 10.1186/s13012-015-0209-1.
61. Proctor E, Hooley C, Morse A, McCrary S, Kim H, Kohl PL. Intermediary/purveyor organizations for evidence-based interventions in the US child mental health: Characteristics and implementation strategies. Implementation Science. 2019;14(1):3. doi: 10.1186/s13012-018-0845-3.
62. Proctor EK, Geng E. A new lane for science. Science. 2021;374(6568):659. doi: 10.1126/science.abn0184.
63. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36(1):24–34. doi: 10.1007/s10488-008-0197-4.
64. Proctor EK, McKay VR, Toker E, Maddox TM, Hooley C, Lengnick-Hall R, MacGibbon S, Evanoff B. Partnered innovation to implement timely and personalized care: A case study. Journal of Clinical and Translational Science. 2021;5(1):e121. doi: 10.1017/cts.2021.778.
65. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: Recommendations for specifying and reporting. Implementation Science. 2013;8(1):1–11. doi: 10.1186/1748-5908-8-139.
66. Proctor EK, Toker E, Tabak R, McKay VR, Hooley C, Evanoff B. Market viability: A neglected concept in implementation science. Implementation Science. 2021;16(1):1–8. doi: 10.1186/s13012-021-01168-2.
67. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, Padek M. Sustainability of evidence-based healthcare: Research agenda, methodological advances, and infrastructure support. Implementation Science. 2015;10(1):88. doi: 10.1186/s13012-015-0274-5.
68. Ramsey AT, Proctor EK, Chambers DA, Garbutt JM, Malone S, Powderly WG, Bierut LJ. Designing for Accelerated Translation (DART) of emerging innovations in health. Journal of Clinical and Translational Science. 2019;3(2–3):53–58. doi: 10.1017/cts.2019.386.
69. Reisinger H. Assembling consensus through clinical workflows: Using rapid ethnographic assessment for successful external facilitation. Implementation Science. 2019;14:1–7. doi: 10.1186/s13012-018-0842-6.
70. Riley WT, Glasgow RE, Etheredge L, Abernethy AP. Rapid, responsive, relevant (R3) research: A call for a rapid learning health research enterprise. Clinical and Translational Medicine. 2013;2(1):10. doi: 10.1186/2001-1326-2-10.
71. Ritchie MJ, Parker LE, Kirchner JE. From novice to expert: Methods for transferring implementation facilitation skills to improve healthcare delivery. Implementation Science Communications. 2021;2(1):39. doi: 10.1186/s43058-021-00138-5.
72. Rogers EM. Diffusion of innovations. 5th ed. New York: Free Press; 2003.
73. Saldana L, Bennett I, Powers D, Vredevoogd M, Grover T, Schaper H, Campbell M. Scaling implementation of collaborative care for depression: Adaptation of the Stages of Implementation Completion (SIC). Administration and Policy in Mental Health. 2020;47(2):188–196. doi: 10.1007/s10488-019-00944-z.
74. Saldana L, Ritzwoller DP, Campbell M, Block EP. Using economic evaluations in implementation science to increase transparency in costs and outcomes for organizational decision-makers. Implementation Science Communications. 2022;3(1):40. doi: 10.1186/s43058-022-00295-1.
75. Sangaramoorthy T, Kroeger KA. Rapid ethnographic assessments. Routledge; 2020.
76. Sharma T, Choudhury M, Kaur B, Naidoo B, Garner S, Littlejohns P, Staniszewska S. Evidence informed decision making: The use of “colloquial evidence” at NICE. International Journal of Technology Assessment in Health Care. 2015;31(3):138–146. doi: 10.1017/S0266462314000749.
77. Shelton RC, Lee M. Sustaining evidence-based interventions and policies: Recent innovations and future directions in implementation science. American Journal of Public Health. 2019;109(S2):S132–S134. doi: 10.2105/AJPH.2018.304913.
78. Smith J, Rapport F, O’Brien TA, Smith S, Tyrrell VJ, Mould EVA, Long JC, Gul H, Cullis J, Braithwaite J. The rise of rapid implementation: A worked example of solving an existing problem with a new method by combining concept analysis with a systematic integrative review. BMC Health Services Research. 2020;20(1):1–14. doi: 10.1186/s12913-020-05289-0.
79. Smith N, Mazzucca S, Hall M, Lich K, Brownson R, Frerichs L. Opportunities to improve policy dissemination by tailoring communication materials to the research priorities of legislators. Implementation Science Communications. 2022. doi: 10.1186/s43058-022-00274-6.
80. Sosa E, D’Souza G, Akhtar A, Sur M, Love K, Duffels J, Raz DJ, Kim JY, Sun V, Erhunmwunsee L. Racial and socioeconomic disparities in lung cancer screening in the United States: A systematic review. CA: A Cancer Journal for Clinicians. 2021;71(4):299–314. doi: 10.3322/caac.21671.
81. Sung NS, Crowley WF, Genel M, Salber P, Sandy L, Sherwood LM, Johnson SB, Catanese V, Tilson H, Getz K, Larson EL, Scheinberg D, Reece EA, Slavkin H, Dobs A, Grebb J, Martinez RA, Korn A, Rimoin D. Central challenges facing the national clinical research enterprise. JAMA. 2003;289(10):1278–1287. doi: 10.1001/jama.289.10.1278.
82. Sverdlov ED. Incremental science: Papers and grants, yes; discoveries, no. Molecular Genetics, Microbiology and Virology. 2018;33(4):207–216. doi: 10.3103/S0891416818040079.
83. Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine. 2012;43(3):337–350. doi: 10.1016/j.amepre.2012.05.024.
84. Taylor MJ, McNicholas C, Nicolay C, Darzi A, Bell D, Reed JE. Systematic review of the application of the plan–do–study–act method to improve quality in healthcare. BMJ Quality & Safety. 2014;23(4):290–298. doi: 10.1136/bmjqs-2013-001862.
85. The National Academies of Sciences, Engineering, and Medicine. (n.d.). Standing Committee for CDC Center for Preparedness and Response (SCPR). Retrieved January 6, 2022, from https://www.nationalacademies.org/our-work/standing-committee-for-cdc-center-for-preparedness-and-response-scpr
86. Torrey WC, Bond GR, McHugo GJ, Swain K. Evidence-based practice implementation in community mental health settings: The relative importance of key domains of implementation activity. Administration and Policy in Mental Health and Mental Health Services Research. 2012;39(5):353–364. doi: 10.1007/s10488-011-0357-9.
87. Vindrola-Padros C, Brage E, Johnson GA. Rapid, responsive, and relevant?: A systematic review of rapid evaluations in health care. American Journal of Evaluation. 2021;42(1):13–27. doi: 10.1177/1098214019886914.
88. Walker PGT, Whittaker C, Watson OJ, Baguelin M, Winskill P, Hamlet A, Djafaara BA, Cucunubá Z, Mesa DO, Green W, Thompson H, Nayagam S, Ainslie KEC, Bhatia S, Bhatt S, Boonyasiri A, Boyd O, Brazeau NF, Cattarino L, Ghani AC. The impact of COVID-19 and strategies for mitigation and suppression in low- and middle-income countries. Science. 2020;369(6502):413–422. doi: 10.1126/science.abc0035.
89. Wandersman A, Duffy J, Flaspohler P, Noonan R, Lubell K, Stillman L, Blachman M, Dunville R, Saul J. Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology. 2008;41(3–4):171–181. doi: 10.1007/s10464-008-9174-z.
90. Ward E, Jemal A, Cokkinides V, Singh GK, Cardinez C, Ghafoor A, Thun M. Cancer disparities by race/ethnicity and socioeconomic status. CA: A Cancer Journal for Clinicians. 2004;54(2):78–93. doi: 10.3322/canjclin.54.2.78.
