Author manuscript; available in PMC 2023 Oct 18. Published in final edited form as: Saf Sci. 2022 Apr 13;152:105763. doi: 10.1016/j.ssci.2022.105763

Methods to improve the translation of evidence-based interventions: A primer on dissemination and implementation science for occupational safety and health researchers and practitioners

RJ Guerin a,*, RE Glasgow b,c, A Tyler b,d, BA Rabin e,f, AG Huebschmann b,g,h
PMCID: PMC10583726  NIHMSID: NIHMS1805208  PMID: 37854304

Abstract

Objective:

A limited focus on dissemination and implementation (D&I) science has hindered the uptake of evidence-based interventions (EBIs) that reduce workplace morbidity and mortality. D&I science methods can be used in the occupational safety and health (OSH) field to advance the adoption, implementation, and sustainment of EBIs for complex workplaces. These approaches should be responsive to contextual factors, including the needs of partners and beneficiaries (such as employers, employees, and intermediaries).

Methods:

By synthesizing seminal literature and texts and leveraging our collective knowledge as D&I science and/or OSH researchers, we developed a D&I science primer for OSH. First, we provide an overview of common D&I terminology and concepts. Second, we describe several key and evolving issues in D&I science: balancing adaptation with intervention fidelity and specifying implementation outcomes and strategies. Next, we review D&I theories, models, and frameworks and offer examples for applying these to OSH research. We also discuss widely used D&I research designs, methods, and measures. Finally, we discuss future directions for D&I science application to OSH and provide resources for further exploration.

Results:

We compiled a D&I science primer for OSH appropriate for practitioners and evaluators, especially those newer to the field.

Conclusion:

This article fills a gap in OSH research by providing an overview of D&I science to enhance understanding of key concepts, issues, models, designs, methods, and measures for translating effective OSH interventions into practice to advance the safety, health, and well-being of workers.

Keywords: Dissemination and implementation science, Translational research, Occupational safety and health, Workplace safety and health, Evidence-based interventions, Research-to-practice

1. Introduction

Many occupational safety and health (OSH) interventions have been demonstrated to improve worker safety and health. Examples of positive effects range from preventing occupational injuries and hearing loss to reducing musculoskeletal, skin, and lung diseases (Keefe et al., 2020; Teufer et al., 2019) to reducing work-related stress (Richardson and Rothstein, 2008). However, effective OSH research programs are not broadly translated to other settings (Cunningham et al., 2020; Dugan and Punnett, 2017; Guerin et al., 2021; Rondinone et al., 2010; Schulte et al., 2017). This research-to-practice lag has substantial implications for the health and well-being of the global workforce (Schulte et al., 2017). For example, according to Lucas et al. (2014), only 17% of fishing safety research has been adopted in workplaces to the benefit of workers. Similar results were reported in a review by Tinc and colleagues (2018). More adaptive, innovative, accelerated and transdisciplinary OSH research has been called for (Schulte et al., 2019; Tamers et al., 2018). This includes scientific approaches that speed translation for addressing the multi-level and interconnected real-world challenges of a rapidly changing global economy and workforce (Schulte et al., 2019), and global public health crises such as the COVID-19 pandemic. These approaches should include dissemination and implementation (D&I) science—a growing field that examines the complex processes by which scientific evidence is adopted, implemented, and maintained/sustained in clinical and community-based settings, bridging the gap between research and everyday practice (Estabrooks et al., 2018; National Institutes of Health [NIH], 2021). D&I science is variously referred to as implementation science, (T3-T4) translational science, knowledge translation, and knowledge transfer and exchange (Cunningham et al., 2020; Guerin et al., 2021; Rabin and Brownson, 2018; Schulte et al., 2017). The application of D&I science methods has been shown to shorten the research-to-practice pipeline (e.g., Fixsen et al., 2007; Harden et al., 2021; Khan et al., 2021), increasing the speed of translation to benefit the public.

In the United States, the National Institute for Occupational Safety and Health (NIOSH), within the Centers for Disease Control and Prevention (CDC), advanced the use of D&I methods in response to reviews by the National Academies of Sciences (NAS) highlighting the research-to-practice gap in OSH (and at NIOSH) (NAS & NRC, 2009). These efforts were also aimed at answering calls within the OSH community to promote the study of factors that facilitate or limit the development, transfer, use, and sustainability of OSH interventions (Schulte et al., 2003; NAS, 2009). D&I science approaches have been integrated into strategic NIOSH initiatives and funding opportunities (Dugan and Punnett, 2017; Guerin et al., 2021), and D&I is a topic of scholarly interest at national and international OSH conferences and workshops. Despite increased awareness and some early progress in integrating D&I research methods into OSH initiatives, critical gaps persist (Cunningham et al., 2020; Guerin et al., 2021; Lucas et al., 2014; Redeker et al., 2019; Tinc et al., 2018). The reasons for this lag are numerous. As is the case with many scientific fields, D&I science suffers from inconsistency in its terminology that may confuse those seeking to learn its methods (Cunningham et al., 2020; Dugan and Punnett, 2017; Guerin et al., 2021; Rabin and Brownson, 2018). OSH and other researchers new to the D&I field may have difficulty knowing where and how to get started. D&I approaches may appear as burdensome “add-ons” to existing studies that are primarily focused on providing efficacy or effectiveness evidence and have little funding or other support for fostering additional evaluation or dissemination activities. Finally, the complexity and diversity of workplaces and difficulty accessing some worker populations (e.g., immigrant and contingent workers and small business employees) can make conducting D&I research in OSH settings challenging (Cunningham et al., 2020). Notwithstanding these difficulties, substantial opportunities exist for further integrating D&I approaches into OSH research and practice in the U.S. context and internationally. Such a move will fill important knowledge gaps related to moving OSH research into sustained practice to protect workers.

The purpose of this primer is to provide OSH researchers and practitioners who are newer to D&I with resources and an overview of D&I science methods and approaches that hold promise for translating effective OSH interventions into practice to improve safety, health, and well-being outcomes for working people. To do this, we first provide an overview of terminology, topics, and concepts in D&I relevant to OSH. Second, we discuss some of the most important foci of D&I science, including how D&I fits into the typical translational research pipeline conceptualization; the importance of balancing adaptation of interventions to fit context with fidelity to core program components; and the difference between implementation outcomes, effectiveness outcomes, and implementation strategies. Third, we provide guidance and resources for applying D&I theories, models, and frameworks (TMFs) to OSH research; give examples of D&I science in OSH; and develop an extended application of the Practical, Robust, Implementation and Sustainability Model (PRISM) (Feldstein and Glasgow, 2008; Glasgow et al., 1999; Glasgow et al., 2019). We also introduce appropriate research designs, methods, and measures for D&I science. Finally, we discuss future directions for D&I science generally, and specific key applications to OSH research and practice efforts.

2. What is dissemination and implementation science? A brief overview

2.1. D&I science defined

In OSH, where D&I science is still gaining a foothold, there is variation and inconsistency in terminology, similar to what is found in the still young D&I field (Lobb and Colditz, 2013). For example, the term “translation research” is promoted by NIOSH and used among its grantees (Cunningham et al., 2020; Schulte et al., 2017), while “knowledge transfer” (Crawford et al., 2016; Duryan et al., 2020; Rondinone et al., 2010) and “knowledge transfer and exchange” (Van Eerd and Saunders, 2017) are terms commonly used among OSH scientists in Europe and Canada to conceptualize similar overlapping (but not synonymous) areas of investigation. Increasingly, OSH researchers in the United States are adopting the terminology of mainstream D&I science (Dugan and Punnett, 2017). Rabin and colleagues (2008, 2018a) have assembled extensive glossaries to capture and synthesize the multitude of terms and definitions used by D&I researchers, practitioners, and funders (Colquhoun et al., 2014; McKibbon et al., 2010; Powell et al., 2015). With the proliferation of D&I science TMFs—more than 150 have been identified to date (e.g., Birken et al., 2017a; Strifler et al., 2018; Tabak et al., 2013)—harmonizing terminology in the field is an ongoing challenge.

For purposes of this paper, we use (slightly revised) NIH (2019) definitions: implementation research is the systematic investigation of the use of strategies to enhance adoption, integration, and sustainment of evidence-based health interventions in clinical and community settings to improve individual and population health; dissemination research is the scientific study of targeted distribution of information and intervention materials to a specific public health audience. Dissemination strategies are often informed by Rogers’ widely-used theory, Diffusion of Innovations (Dearing, 2008; Dearing et al., 2018; Rogers, 2003), and are concerned with promoting the use of evidence-based practices by important decision-makers through communication of information tailored to their specific needs (Brownson et al., 2018a). D&I science also addresses “designing for dissemination, implementation and sustainment” in the development of new, evidence-based programs, by grounding their development and evaluation in key collaborator/recipient perspectives (Brownson et al., 2013; Rabin and Brownson, 2018; Rabin et al., 2018). Within the D&I field, research methods do not typically foster only implementation or only dissemination, but it is recognized that certain D&I approaches and models are better suited to “I” than “D,” and vice versa (Lobb and Colditz, 2013). Because more scientific knowledge has been generated to date about methods that successfully promote implementation than about those that promote dissemination, the approaches discussed in this article are most relevant to the successful implementation of interventions. The D&I field overall continues to wrestle with how best to disseminate successful programs (Brownson et al., 2013).

Dissemination, “scale-up” and “scale-out” are distinct concepts within D&I science. Whereas dissemination research is the study of how best to spread and sustain knowledge of an evidence-based intervention through the systematic distribution of information to a specific audience (NIH, 2019), scale-up refers to the planned spreading of an evidence-based intervention to additional units in the same or a similar context, often focusing on the same population for which the program/practice was originally shown to be effective (Aarons et al., 2017). Scale-out refers to the use of strategies to implement, assess, adapt, and sustain an evidence-based policy/practice/program as it is delivered to new populations and/or in new settings differing from those tested in effectiveness trials (Aarons et al., 2017). The potential promise of “scale-out” is that this approach may “borrow strength” from evidence in a prior effectiveness trial, thereby speeding up the translational pipeline (Aarons et al., 2017; Smith et al., 2018). For an OSH example, Buller and colleagues (2020) are conducting a randomized trial (guided by multiple D&I models and theories) to compare two methods of national scale-up of an effective occupational sun protection program for outdoor workers experiencing an elevated risk of skin cancer.

2.2. Characteristics of D&I science

2.2.1. The translational pipeline

The D&I field has a long history with roots in agriculture, sociology, education, communication, marketing, and management (Colditz and Emmons, 2018; Dearing et al., 2018; Estabrooks et al., 2018; Rabin and Brownson, 2018; Rogers, 2003), and has expanded to other disciplines. Health-related fields leading D&I efforts currently include health services, HIV prevention, school health, mental health, nursing, cancer control, cardiovascular risk reduction, and violence prevention, among others (Rabin and Brownson, 2018). D&I science relies on transdisciplinary approaches that cross fields of inquiry, integrating multiple perspectives and methods (Bauer et al., 2015; Estabrooks et al., 2018) to address the “leaky pipeline” that hinders the transfer of scientific knowledge to practice (Green, 2009). This pipeline is characterized by the 17 years it takes to turn 14 percent of original research to the benefit of program recipients (Balas and Boren, 2000; Brownson et al., 2018b; Green, 2009). Several interacting factors—including characteristics of the intervention (e.g., high cost, not developed considering user needs), the setting/context (e.g., competing demands, limited time and resources), inadequate/inappropriate research designs (e.g., not relevant or representative of the population of interest), and interactions among these factors—have limited the uptake of evidence-based practices and programs (Colditz and Emmons, 2018; Glasgow and Emmons, 2007). D&I science systematically addresses the gaps in the research-to-practice pipeline by engaging partners and beneficiaries to tailor the ways that an intervention will be implemented to fit the context and setting of end-users (Aarons et al., 2012; Chambers et al., 2013). The application of D&I methods has been shown to shorten the time needed to advance research to practice (e.g., Fixsen et al., 2007; Harden et al., 2021; Khan et al., 2021).

To explain the movement of scientific innovations from discovery to widespread adoption and population impact, clinical and public health researchers have used pipeline or “T” (translational) stage models (Khoury et al., 2010; Brown et al., 2017; Westfall et al., 2007). Fig. 1 presents an adapted “T” stage model for OSH that emphasizes the interactions and iterations of the research translation process. As illustrated in Fig. 1, for implementation success, it is important to: a) plan for dissemination and sustainability from the outset (Brownson et al., 2013) and b) engage stakeholders/beneficiaries on an ongoing basis across all stages of the research continuum (Glasgow et al., 2012). T0 translation focuses on the “pre-intervention” scientific discovery stage, identifying research challenges and opportunities and asking the question, could an intervention work? The T1 phase emphasizes internal validity (efficacy) and results in knowledge creation about “does an intervention work?” under optimal conditions (Fort et al., 2017; Khoury et al., 2007; Rabin and Brownson, 2018). T2 translation involves effectiveness research that investigates, through randomized trials or other methods simulating “real world” conditions, whether an intervention improves health and safety outcomes (Fort et al., 2017; Glasgow et al., 2012; Rabin and Brownson, 2018). T3 translational research continues to assess effectiveness, but also systematically explores pragmatic (Loudon et al., 2015), realist questions (Pawson, 2013) such as: what works, for whom, how, in what contexts, and how is it sustained over time? (Gaglio and Glasgow, 2018). T3 questions may focus on how to make an effective intervention work in diverse, multi-level settings (Khoury et al., 2007; Rabin and Brownson, 2018) and how it can be adapted to fit various contexts and resource constraints, such as in large vs. small workplaces. Hybrid effectiveness studies, as indicated in Fig. 1, promote examination of both effectiveness and implementation outcomes within the same study, with the aim of accelerating the research-to-practice process by combining aspects of T2 and T3 research (Curran et al., 2012). Finally, T4 translational research is focused on producing public health impact and developing the most generalizable knowledge about the positive and sustained health and safety outcomes at the population level that result from disseminating and implementing interventions known to be effective (Agency for Healthcare Research and Quality [AHRQ], 2014; Khoury et al., 2010; Rabin and Brownson, 2018). As stated previously, the translational research cycle is recursive: information generated at a later stage informs research at earlier stages, and, depending on the outcomes at any given timepoint, it may be necessary to return to an earlier stage. The need to consider partner/beneficiary engagement in the early T1-T2 phases is important yet often overlooked, resulting in the unintended consequence of developing an intervention that is highly efficacious but fundamentally ill-suited to the needs of the partners who would adopt it in the T3-T4 phases (Brownson et al., 2013).

Fig. 1. The Translational Research Cycle. Sources adapted from: AHRQ, 2014; Brown et al., 2017; Khoury et al., 2010; PAR-19-274 Dissemination and Implementation Research in Health; Westfall et al., 2007.

2.3. Defining Evidence

Within D&I science, evidence-based interventions are defined broadly and may include programs, practices, policies, recommendations and guidelines (Rabin et al., 2010). Brown and colleagues (2017) refer to seven types (the “7 Ps”) of public health and health services interventions relevant to D&I efforts that can be delivered in different contexts, and which have varying degrees of applicability to OSH: programs, practices, principles, procedures, products, pills, and policies. Brownson and colleagues (2009) delineate three types of evidence generated through and from public health interventions: 1) Type 1 evidence defines the etiology of diseases and the magnitude, severity, and extent to which the risk factors for these conditions, and the conditions themselves, can be prevented; 2) Type 2 evidence describes the relative effectiveness of specific interventions to improve people’s safety and health; and 3) Type 3 evidence, which is the most scarce in OSH, demonstrates how and under which contextual conditions interventions are (successfully) implemented and sustained. In its most basic terms, research in D&I science seeks to take programs that already have sufficient Type 2 evidence, typically framed as consistent effectiveness demonstrated through meta-analyses and/or Cochrane reviews or strong recommendations in public health guidelines (Proctor et al., 2012), and to study them in order to generate Type 3 evidence. To provide a simple comparison between these types of evidence and the translational research cycle (Fig. 1), Type 2 evidence is typically derived from T2 effectiveness research studies, whereas Type 3 evidence is derived from T3 and/or T4 research studies that evaluate implementation strategies and outcomes and consider how an intervention’s effects relate to context.

It has been argued that requiring definitive evidence at a given stage of the translational pipeline before moving to the next has resulted in a lack of “rapid and relevant” movement of research to practice (Kessler and Glasgow, 2011). In public health, not all types of evidence (e.g., qualitative research) are equally represented in systematic reviews, and gray literature—such as government reports, book chapters, conference proceedings, and other materials—may provide useful information (Jacobs et al., 2012). In addition to the traditional terminology of evidence-based interventions, some contexts also define promising interventions as valid targets of dissemination and implementation activities. For example, the U.S. Department of Veterans Affairs (VA, 2021) maintains a warehouse of promising practices that meet a set of criteria regarding potential for impact. These criteria for the Office of Rural Health within the VA include increased access, strong partnerships, clinical impact, return on investment, operational feasibility, and customer satisfaction.

In the OSH field, Type 2 evidence is more typically represented by “strong recommendations in public health guidelines” (Proctor et al., 2012) because there is often limited evidence from randomized controlled trials (Anger et al., 2015; Hempel et al., 2016; Howard et al., 2017; Nold and Bochmann, 2010; Robson et al., 2012). OSH evidence is generated from and through diverse and multidisciplinary sources, including human and animal studies and observational, epidemiological and worker case studies (Hempel et al., 2016). Examples of factors relevant to OSH recommendations and guidelines that may or may not be included in an evidence synthesis include projected costs of the preventive action or policy, current industry standards, context-dependent values and practices, technical feasibility (Hempel et al., 2016; Nold and Bochmann, 2010), partner and program recipient concerns (e.g., needs of employers, employees, and intermediaries such as labor or professional organizations; Sinclair et al., 2013), and occupational health equity issues (Ahonen et al., 2018).

2.4. Defining “Context”

The D&I field uses the term “context” to capture the complex web of factors to be considered when implementing interventions (Huebschmann et al., 2019; NIH, 2019). More specifically, Movsisyan et al. (2019) define context as the “set of characteristics and circumstances that consist of active and unique factors within which the implementation of an intervention is embedded” (p. 2). Context is multilevel, and cuts across economic, social, political, and temporal domains (Neta et al., 2015). D&I research seeks to address barriers to adoption of evidence-based interventions arising from multiple, interacting influences (Burke et al., 2015) crossing socioecological (e.g., policy, community, organizational, interpersonal, and individual) levels (Bronfenbrenner, 1979; Sallis et al., 2008). Context is not a fixed organizational structure but a process that is dynamic, iterative and negotiated (May et al., 2016; Chambers et al., 2013). A key premise of D&I science is that it is critical to package and convey the evidence necessary to improve health in ways that fit local settings and meet the needs of end-users (including workers, employers, intermediary groups such as professional organizations, and policy makers). This is because even the most robust evidence-based program can fail if context is not explicitly considered. Partners/collaborators find it challenging to implement programs with fidelity when those programs are unacceptable or not feasible in their setting, or when they prefer alternative approaches.

An example of OSH research incorporating contextual factors is Tenney and colleagues’ (2019) study of the adoption of the NIOSH Total Worker Health® (TWH) approach among 382 businesses. TWH programs are designed to integrate protection from work-related safety and health hazards with promotion of injury and illness prevention efforts to advance worker well-being (NIOSH, 2020). The authors found that larger businesses (>200 employees) implemented more comprehensive health and safety strategies in their workplaces compared to smaller businesses (≤50 employees), highlighting contextual factors related to size of business that were associated with TWH implementation. At the organizational level, contextual factors including business structure, age, organization of work/workflows, characteristics of the workforce including employee demographics (such as age), use of contingent labor, management and leadership characteristics, financial resources, and organizational climate were identified as key in understanding the differential adoption of TWH initiatives by smaller versus larger businesses (Tenney et al., 2019). In another example from Carlan and colleagues (2012), the authors explore how the organization of work in the Canadian construction sector (decentralized, non-linear, and nonhierarchical) requires moving away from top-down approaches for disseminating workplace safety and health information to identifying effective networks and intermediaries (such as unions) through which OSH knowledge may be communicated and transferred in these complex, dynamic contexts.

Overall, as shown in Fig. 1, research in D&I science should generate new knowledge on the feasibility and acceptability of specific implementation strategies (discussed in further detail below) to deliver an evidence-based program in a given context, leading to a better understanding of how and why the program works, for whom it works and in what settings. Research that is more relevant, actionable, tailored, responsive and iterative holds the promise of creating a “pull” not just for evidence-based practice, but for practice-based evidence (Lobb and Colditz, 2013; Green, 2007).

3. Key D&I science concepts

The following section provides a brief overview of several key concepts in D&I science including fidelity and adaptation, implementation strategies, and implementation outcomes.

3.1. Fidelity and adaptation

Systematically monitoring and documenting adaptations to an evidence-based program is critical for understanding how these modifications influence intervention outcomes (Rabin and Brownson, 2018), and is closely linked to the concept of fidelity, or the extent to which a program is implemented as intended by its designers (Backer, 2001). Balancing fidelity and adaptation has been a topic of scholarly interest and debate for many years (see e.g., Bopp et al., 2013; Carvalho et al., 2013; Castro et al., 2004; Chambers and Norton, 2016; Dearing, 2008; Rohrbach et al., 2007), including in OSH (von Thiele Schwarz et al., 2015). There is a growing recognition among D&I researchers that adaptation is inevitable, and even desirable, to meet the local needs and constraints of program providers and recipients (Allen et al., 2018). The value of adaptation and fidelity may be different for various partners and program recipients, and recommendations have been proposed for reconciling their respective roles in the research-to-practice translational pathway (von Thiele Schwarz et al., 2019). How to identify what is essential to an evidence-based intervention—its core components or functions (or “the intervention’s basic purpose”; Perez Jolles et al., 2019, p. 1032)—is an important challenge in the successful implementation of evidence-based programs (Backer, 2001; Durlak and DuPre, 2008) and is critical to the measurement and assessment of implementation fidelity. Core functions or components are directly related to an intervention’s theory of change, which delineates the mechanisms by which the intervention works (Blasé and Fixsen, 2013). They are, in other words, the “special sauce” that characterizes an effective program. A recent study (Nykänen et al., 2021) sought to identify the “active ingredients” of a safety training intervention for young workers delivered in Finnish vocational schools. Grounded in social cognitive theory (Bandura, 1986), the intervention tested the association between the core intervention components (safety skills training, safety inoculation training, a positive atmosphere for safety learning, and active learning techniques) and study outcomes (safety preparedness, internal safety locus of control, risk attitudes and safety motivation). The study team found, for example, that quality of program delivery was associated with student motivational outcomes. Identifying the core components of the safety training intervention will facilitate efforts to replicate or adapt it to other settings while keeping the key elements intact (Nykänen et al., 2021).

A current area of exploration in D&I science is the use of systematic frameworks (e.g., Stirman et al., 2019; Stirman et al., 2013) to plan for, capture, and characterize adaptations of implementation strategies and/or interventions during the implementation process (Escoffery et al., 2019; Finley et al., 2018; Rabin et al., 2018), with particular attention to the rationale for each adaptation and to the preservation of core components. Several recommendations for categorizing and understanding adaptations have been advanced in the D&I field (Escoffery et al., 2019; Glasgow et al., 2020; Kirk et al., 2020; Miller et al., 2021; Perez Jolles et al., 2019; Stirman et al., 2013, 2019). Investigations related to measuring and monitoring intervention fidelity and adaptations in OSH are limited, and more research is needed in this area.

3.2. Implementation strategies

Colloquially referred to as “how” an evidence-based program is implemented (Proctor et al., 2013), implementation strategies are the methods used to enhance program implementation outcomes (e.g., adoption, fidelity, and sustainability, see section 3.3 below) (Proctor et al., 2011). Powell and colleagues (2015) identified 73 discrete implementation strategies that can be grouped into 9 categories (Waltz et al., 2015). For example, the category “Train and educate stakeholders” consists of multiple strategies, including using train-the-trainer methods and making training interactive, while the category “Develop stakeholder interrelationships” includes discrete strategies such as “identify and prepare champions” and “identify early adopters.” It is common to use multiple strategies or ‘strategy bundles’ during implementation to address multiple determinants (barriers and facilitators) of intervention implementation. The selection of strategies may vary depending on the phase of implementation and may require a variety of techniques to ensure that the strategies fit the local context (NCI, 2019; Powell et al., 2017). Implementation Mapping can be used to identify barriers and facilitators to program implementation and specific strategies, such as those delineated by Powell et al. (2015), to address these determinants and optimally deliver an intervention (Fernandez et al., 2019). However, it should be noted that there is systematic evidence of effectiveness for only a minority of implementation strategies (Grimshaw et al., 2012; Wolfenden et al., 2018) and that implementation strategies do not always lead to improved implementation or sustainment. Research is needed in OSH on tailoring implementation strategies to context using robust and systematic methods (such as those described above) and leveraging what is already known about fitting strategies to other settings to achieve program impact.
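To make the idea of pairing determinants with strategies concrete, the following minimal sketch (in Python) encodes a barrier-to-strategy mapping in the spirit of Implementation Mapping. The strategy names are drawn from the Powell et al. (2015) compilation and Waltz et al. (2015) categories cited above; the barriers and the pairings themselves are invented for illustration and are not validated recommendations.

```python
# Illustrative sketch: pairing hypothetical implementation barriers with
# candidate discrete strategies, in the spirit of Implementation Mapping
# (Fernandez et al., 2019). Pairings are examples, not validated guidance.

STRATEGY_CATALOG = {
    "Train and educate stakeholders": [
        "Use train-the-trainer strategies",
        "Make training dynamic/interactive",
    ],
    "Develop stakeholder interrelationships": [
        "Identify and prepare champions",
        "Identify early adopters",
    ],
}

# Hypothetical barriers surfaced during planning at a worksite, each
# tagged with the strategy category judged most relevant to it.
barriers_to_categories = {
    "Supervisors lack knowledge of the new safety protocol":
        ["Train and educate stakeholders"],
    "No visible internal advocate for the program":
        ["Develop stakeholder interrelationships"],
}

def propose_strategy_bundle(barriers):
    """Collect candidate discrete strategies for the identified barriers."""
    bundle = []
    for categories in barriers.values():
        for category in categories:
            for strategy in STRATEGY_CATALOG.get(category, []):
                if strategy not in bundle:
                    bundle.append(strategy)
    return bundle

print(propose_strategy_bundle(barriers_to_categories))
```

In practice, such a bundle would be refined with partners and beneficiaries and tailored to the phase of implementation, as noted above.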

3.3. Implementation outcomes

As previously mentioned, the outcomes assessed in D&I research are related to but distinct from those assessed in intervention studies. Generally speaking, implementation outcomes are associated with the effects of implementation strategies, while effectiveness outcomes are intended to analyze the impact of the program, policy or practice on specific health outcomes, such as a reduction in work-related injuries or improvement in work-related fatigue (Table 1). Proctor and colleagues (2011) developed a frequently cited taxonomy of eight conceptually distinct implementation outcomes. These include acceptability, adoption, appropriateness, feasibility, fidelity, implementation cost, penetration, and sustainability. Implementation outcomes are typically assessed at the organizational, community, or policy level (Rabin et al., 2008), but several, such as feasibility and acceptability, are also examined at the individual (program recipient) level. Approaches to examine these eight implementation outcomes should be adapted to account for health equity (e.g., assessing feasibility in a low-resource setting) (Brownson et al., 2021). In sum, implementation outcomes are key, intermediate outcomes that are critical for monitoring the successful implementation of evidence-based programs (NCI, 2019).

Table 1.

Examples of key implementation outcomes and OSH effectiveness outcomes

Implementation outcomes*
• Acceptability: Perception among key partners/beneficiaries that the OSH program or practice is agreeable or satisfactory.
• Adoption: Intention among key partners/beneficiaries to employ an OSH intervention (i.e., “uptake”).
• Appropriateness: Perceived fit of the OSH innovation or intervention for a given context/population/health and safety problem.
• Costs: Costs of an OSH implementation effort.
• Feasibility: Extent to which the OSH intervention can be successfully used within a given setting.
• Fidelity: Degree to which an OSH intervention was implemented as intended by the program developers.
• Penetration: Extent of integration of an OSH intervention within a community, organization, or system.
• Sustainability: Extent to which a newly implemented program/intervention is maintained or institutionalized within an organization/workplace.

Organizational effectiveness outcomes
• Safety culture/climate
• Supervisory support
• Absenteeism
• Presenteeism
• Turnover
• Occupational health equity
• Occupational injuries, illnesses and fatalities

Individual (worker/employer) effectiveness outcomes
• Well-being
• Physical health
• Mental health
• Changes in attitude, intention and behavior
• Occupational injuries, illnesses and fatalities
• Occupational health equity
• Fatigue
• Stress
• Depression
• Burnout
• Social connectedness
• Job performance
• Job satisfaction
• Job commitment
• Intent to leave
• Work-life balance
• Positive self-concept

Source. Adapted from: Friedland & Price, 2003; Lewis et al., 2015; NCI, 2019; NIOSH, 2013; *Proctor et al., 2011.

4. Commonly used D&I science theories, models and frameworks

The terms “theory,” “model,” and “framework” (TMFs) in D&I science publications have distinct technical meanings, but they are often used interchangeably (Bauer et al., 2015). TMFs generally describe approaches or systematic ways to plan, implement, and evaluate the implementation of evidence-based interventions and can help researchers understand context, D&I processes, and outcomes (Damschroder, 2020; Nilsen, 2015; Nilsen and Bernhardsson, 2019). TMFs can be used to assess why an intervention works (or fails to) and increase the interpretability of study findings (Tabak et al., 2018). As stated previously, more than 150 D&I TMFs have been identified (e.g., Birken et al., 2017a; Strifler et al., 2018; Tabak et al., 2013) and applied to varying degrees (Skolarus et al., 2017). Prior reviews (Damschroder, 2020; Nilsen, 2015; Nilsen and Bernhardsson, 2019) have categorized D&I TMFs, depending upon their purpose, as process models, evaluation frameworks, and determinant theories/frameworks. Many TMFs are hybrid in the sense that they fall into more than one of the process, determinants, or evaluation categories (Damschroder, 2020). An overview of some commonly used D&I TMFs is provided below. (For a link to an interactive webtool to help researchers and practitioners select D&I science TMFs for their research, see Appendix A.)

The earliest and still one of the most widely used theories in D&I science is Rogers’ Diffusion of Innovations (Rogers, 2003), which seeks to explain the processes influencing the spread and adoption of innovations through certain channels over time. It considers factors such as adopters’ perceptions of the innovation (such as its cost, effectiveness, compatibility, complexity, trialability, and observability); the innovativeness of the adopter; and the characteristics of social system(s), individual adoption processes, and the diffusion system, including the important roles of “opinion leaders” and “champions” (Damschroder, 2020; Dearing et al., 2018; Nilsen, 2015; Rogers, 2003). The Theoretical Domains Framework (TDF; Michie et al., 2005; Michie et al., 2011), which resulted from a systematic review of 19 published D&I frameworks, provides guidance for studying behavior change in terms of implementation activities and outcomes (Michie, 2014). For example, the TDF was recently used by OSH researchers to develop and psychometrically test a questionnaire to identify determinants of safety behaviors among workers in critical industries, such as transportation (Morgan et al., 2021). Organizational change theories in D&I science (Weiner et al., 2008, 2009, 2020) hold promise for exploring implementation processes and outcomes within multilevel systems and complex workplaces (Carlan et al., 2012; Hofmann et al., 2017; Robertson et al., 2021).

Process models provide general guiding principles and “phases” of research planning and implementation rather than explicitly indicating a set of specific steps needed within each phase of implementation (Estabrooks et al., 2018; Nilsen, 2015; Tabak et al., 2018). An example of a process model used in public health research is the Knowledge to Action (K2A) framework advanced by the CDC, which includes three (non-linear) components—research, translation, and institutionalization—and the implementer, deliverer, and recipient interactions, support structures, and evaluation capacity needed to move knowledge to sustainable practice (Wilson and Brady, 2011). Many other process models are used in public health research and practice (see e.g., Birken et al., 2017a).

Consisting of four phases that describe the implementation process, the Exploration, Preparation, Implementation, Sustainment (EPIS) framework (Aarons et al., 2011) is considered a hybrid process and determinant model (Damschroder, 2020). Within EPIS, diverse factors from the inner and outer context that may hinder or facilitate the implementation of programs are considered in each phase (Brown et al., 2017; Moullin et al., 2019). This framework also articulates outer system contextual factors, such as the regulatory/policy environment, and inner context factors, such as organizational leadership, considered key to implementation processes. These factors may apply across many or all implementation stages. Another component of EPIS is the factors that relate to the intervention, such as its fit and cost. Finally, “bridging factors,” such as interagency collaboration, intermediaries (e.g., unions and professional organizations) and community-academic partnerships, create linkages between inner and outer contextual factors (Moullin et al., 2019). EPIS also provides some guidance regarding the temporal relations of D&I outcomes (Becan et al., 2018; Lewis et al., 2018a). For example, perceived fit of the intervention would primarily be assessed in the preparation phase, fidelity would be measured while the program is being implemented, and institutionalization of the intervention would be assessed in the sustainment phase (Lewis et al., 2018a). While EPIS has had limited uptake in OSH to date, it may be useful for assessing the multilevel factors that hinder/facilitate implementation of new innovations in workplaces.
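As a simple illustration of this temporal guidance, the sketch below (Python) pairs each EPIS phase with the outcome the text above suggests assessing in that phase; the exploration-phase entry is our own illustrative assumption, not drawn from the cited sources.

```python
# Minimal sketch: scheduling outcome assessments by EPIS phase
# (Aarons et al., 2011), following the temporal guidance summarized
# above (Lewis et al., 2018a): perceived fit in preparation, fidelity
# during implementation, institutionalization in sustainment.
EPIS_PHASE_OUTCOMES = {
    "Exploration": ["perceived need for change"],          # illustrative assumption
    "Preparation": ["perceived fit of the intervention"],
    "Implementation": ["fidelity"],
    "Sustainment": ["institutionalization of the intervention"],
}

for phase, outcomes in EPIS_PHASE_OUTCOMES.items():
    print(f"{phase}: assess {', '.join(outcomes)}")
```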

Another hybrid model that is most often used as an evaluation framework is RE-AIM (Glasgow et al., 1999), with its five specific dimensions (Reach, Effectiveness, Adoption, Implementation, and Maintenance) and its contextually expanded version, PRISM (Practical Robust Implementation and Sustainability Model, described later in this paper; Feldstein and Glasgow, 2008; Glasgow et al., 2019). Used frequently in research and grant applications to the CDC and the NIH (Glasgow et al., 1999; Glasgow et al., 2019; Vinson et al., 2018), RE-AIM was designed to enhance the quality, efficiency, and public health impact of efforts to translate research into practice. Cutting across all five of the RE-AIM implementation outcomes are equity concerns related to the representativeness of those who participate in or benefit from the evidence-based program (Glasgow et al., 2019; Gaglio et al., 2013; Woodward et al., 2021). Although RE-AIM is most widely applied as an evaluation framework, it has also been used to guide initial intervention planning with partners and beneficiaries (Holtrop et al., 2018). RE-AIM can also be used iteratively during program implementation to guide adaptations to implementation strategies if interim RE-AIM outcomes are not being met to the extent expected or intended (Glasgow et al., 2019).

Recognizing that organizations may have limited capacity and resources, the intended goal of RE-AIM is to improve intervention monitoring and reporting across the dimensions, while not necessarily requiring comprehensive assessment of the intervention across all five dimensions (Glasgow and Estabrooks, 2018; Glasgow et al., 2019). Such “pragmatic” uses of the framework suggest the importance of engaging partners and beneficiaries early on and throughout the design, implementation, and evaluation of interventions, to establish a priori the dimensions and research questions that are most suitable to the setting, audience, needs, and resources, and the stage of research (Glasgow et al., 2019). Recent reconsiderations of RE-AIM promote an enhanced focus on sustainability (Glasgow et al., 2018; Shelton et al., 2020a) by addressing dynamic context, focusing on multi-faceted cost and economic issues, and promoting health equity (Shelton et al., 2020a).

A scan of the past decade of OSH literature for use of implementation and evaluation models and frameworks suggests only modest uptake of RE-AIM, which has been proposed as a useful tool for the evaluation of OSH interventions (Schelvis et al., 2015). However, some examples of the use of the framework in OSH as an evaluation and planning tool are available (e.g., Cocker et al., 2018; Jenny et al., 2015; Schwatka et al., 2018; Storm et al., 2016; Viester et al., 2014). Issues of sustainability and dynamic context, including impacts on health equity (Baumann and Cabassa, 2020; Shelton et al., 2020; Woodward et al., 2019, 2021), are topics of current interest in OSH (Ahonen et al., 2018) that may be systematically investigated through an application of RE-AIM. Appendix A contains a link to RE-AIM resources, which include guidance on using the framework, measures, checklists, and a planning tool. An applied example of the use of RE-AIM to conduct a process evaluation for an RCT of a worksite behavior change intervention to prevent musculoskeletal disorders (MSDs) among construction workers in the Netherlands is summarized in Appendix B (Viester et al., 2014). Through their use of RE-AIM, the research team was able to demonstrate that the study achieved satisfactory adoption and representative reach among workers while also identifying challenge areas, including difficulties delivering the intervention with fidelity.
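For readers who want to operationalize such an evaluation, the following minimal sketch (Python) shows how two RE-AIM dimensions could be quantified for a worksite program. All counts and characteristics are hypothetical and are not data from Viester et al. (2014).

```python
# Minimal sketch: quantifying two RE-AIM dimensions for a hypothetical
# worksite program. All values are invented for illustration.

eligible_workers = 600        # workers meeting inclusion criteria
participating_workers = 282   # workers who enrolled

worksites_approached = 20     # settings invited to deliver the program
worksites_adopting = 13       # settings that agreed to deliver it

reach = participating_workers / eligible_workers
adoption = worksites_adopting / worksites_approached

print(f"Reach: {reach:.1%} of eligible workers enrolled")
print(f"Adoption: {adoption:.1%} of approached worksites took up the program")

# Representativeness check for Reach: compare a characteristic of
# participants vs. non-participants; a large gap flags selective reach.
mean_age_participants = 42.1      # hypothetical
mean_age_nonparticipants = 38.7   # hypothetical
print(f"Age gap (participants minus non-participants): "
      f"{mean_age_participants - mean_age_nonparticipants:.1f} years")
```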

Determinant models/frameworks specify barriers and facilitators to implementation processes or outcomes and include the Consolidated Framework for Implementation Research (CFIR; Damschroder et al., 2009) — one of the most widely used TMFs in D&I science (Birken et al., 2017b). CFIR was developed based on multiple, published implementation theories to identify and categorize known determinants of implementation outcomes. The framework consists of five domains known to influence implementation outcomes and interact with each other: Intervention Characteristics, Outer Setting, Inner Setting, Characteristics of Individuals, and Process (by which implementation is achieved). Within each of these domains are multiple constructs reflecting determinants (i.e., barriers and facilitators) of implementation (Damschroder et al., 2009). Determinants can act as independent variables with a direct effect on implementation outcomes (the dependent variable); as moderators (“effect modifiers”) of the effectiveness of D&I interventions; or as mediators that are links in a causal chain of a D&I mechanism (Flottorp et al., 2013; Lewis et al., 2018b; Nilsen, 2015). Developed to advance health services research by consolidating many existing implementation theories, the CFIR has been cited extensively across disciplines and is currently being updated (“CFIR 2.0”) to reflect more diverse settings, provide clarification on and elaboration of key constructs, include additional constructs of interest and relevance (e.g., “mass disruptions” such as global pandemics in the outer setting domain), and to expand the focus on health equity and implementation outcomes (Damschroder, 2021).

CFIR has received limited attention in OSH, but one interesting application is in research by Tinc et al. (2020). Tractor overturns have been a leading cause of death on American farms, and rollover protection structures (ROPS), first introduced in the 1960s, are highly effective in preventing death and serious injury when used with a seatbelt. While ROPS have been standard issue for decades on new tractors, many farmers use older equipment that needs to be retrofitted. Research was conducted using CFIR to gain an understanding of the barriers to farmers’ uptake of ROPS (Tinc et al., 2020). Participants in the National ROPS Rebate Program (NRRP) were surveyed at four time points. The surveys measured 14 CFIR constructs and correlations with three intervention outcomes (intakes, i.e., the number of people who signed up for the program; funding progress; and tractor retrofits with ROPS). Findings revealed that eight CFIR survey items covering four constructs in two domains (access to knowledge and information [inner setting], leadership engagement [inner setting], engaging [process], and reflecting and evaluating [process]) were correlated (rho ≥ 0.50) with at least one of the three outcome measures. Items correlated with all three outcome measures included those related to access to knowledge and information (inner setting) and engaging (process), indicating that these constructs may be the most important for expanding the NRRP (Tinc et al., 2020). In terms of the utility of applying the CFIR in OSH to scale-up initiatives, research indicates challenges when using this D&I framework in agricultural settings versus single-site implementation studies (Tinc et al., 2018). More research is needed to understand CFIR’s utility for other OSH interventions and settings.
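The correlational screen described above can be approximated with standard tools; the sketch below (Python) computes Spearman correlations between CFIR construct scores and one outcome, applying the rho ≥ 0.50 threshold reported by Tinc et al. (2020). All values are fabricated for illustration; only the construct names follow the text above.

```python
# Illustrative sketch: relating CFIR construct scores to a program
# outcome via Spearman's rho, mirroring the screening approach in
# Tinc et al. (2020). All data below are fabricated.
from scipy.stats import spearmanr

# Hypothetical mean construct scores across ten program regions.
construct_scores = {
    "access to knowledge and information (inner setting)":
        [3.1, 4.0, 2.5, 3.8, 4.2, 2.9, 3.5, 4.1, 2.7, 3.9],
    "leadership engagement (inner setting)":
        [2.8, 3.9, 3.0, 3.2, 4.0, 2.6, 3.1, 3.8, 2.9, 3.6],
}
# Hypothetical outcome: ROPS retrofit counts in the same ten regions.
retrofits = [12, 30, 9, 25, 34, 10, 18, 29, 11, 27]

for construct, scores in construct_scores.items():
    rho, p_value = spearmanr(scores, retrofits)
    verdict = "candidate driver" if rho >= 0.50 else "below threshold"
    print(f"{construct}: rho = {rho:.2f} (p = {p_value:.3f}) -> {verdict}")
```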

Another determinant TMF (Nilsen and Bernhardsson, 2019) is the Practical Robust Implementation and Sustainability Model (PRISM; Feldstein and Glasgow, 2008; Glasgow et al., 2019), an extension of RE-AIM that addresses key multilevel contextual factors influencing RE-AIM outcomes and considers the perspectives of multiple partners and beneficiaries (as depicted in Fig. 2, adapted to reflect the OSH context).

Fig. 2. The Practical, Robust, Implementation and Sustainability Model (PRISM) for OSH. Adapted from: Feldstein and Glasgow, 2008.

The PRISM contextual factors include: the program characteristics from the perspective of organizational and individual recipients; the characteristics of diverse, multilevel recipients of the program; the implementation and sustainability infrastructure; and the external environment (Feldstein and Glasgow, 2008; Glasgow et al., 2019; McCreight et al., 2019). PRISM contextual factors may be used to guide researchers during the program planning, implementation, evaluation and dissemination phases (Glasgow et al., 2019).

For the OSH practitioner, PRISM could be used to consider employers’, managers’, and workers’ perspectives and to identify factors influencing successful implementation of programs, policies and guidelines in the workplace. For instance, in the program planning phase, PRISM could guide the selection of appropriate evidence-based interventions to address context-specific needs and priorities, as well as engage workers and managers in the identification of barriers/facilitators to successful implementation and sustainability. During implementation, PRISM could be used to improve the fit between the evidence-based practice and the workplace by systematically assessing organizational, employer and worker characteristics and perspectives, and the existing implementation and sustainability infrastructure. Using iterative, qualitative approaches, such as focus groups and interviews, OSH researchers and practitioners could tailor implementation strategies or adapt the evidence-based intervention to provide a better match to the workplace, and to improve RE-AIM outcomes (such as Maintenance; Fig. 2). In general, understanding factors that influence the program end-users, whether employers, managers/supervisors, or workers/employees, will likely improve Reach and Effectiveness; addressing organizational characteristics (such as industry, business size, geography, and firm structure; Schwatka et al., 2018; Tenney et al., 2019) should lead to improved Adoption, Implementation and Maintenance, at both the individual and organizational levels.

Table 2 provides examples of key questions and probes for OSH practitioners and researchers when applying the PRISM/RE-AIM framework (as illustrated in Fig. 2) to the multilevel implementation of an evidence-based intervention within a worksite, business or organization.

Table 2.

Key PRISM* questions and probes for OSH practitioners and researchers.

PRISM domain Key questions Probes
PROGRAM INTERVENTION How does the intervention design influence implementation?
Organizational perspective What organizational-level factors impede/facilitate successful intervention implementation? Is the intervention:

• Aligned with the organization’s mission and readiness for change?
• In line with employers’/managers’ preferences, beliefs, and priorities?
• Supported by strong evidence?
• Addressing a gap or need within the organization?
• Observed to be beneficial?
• Easy to use and cost-effective?
Employee/worker perspective What individual, employee-level factors impede/facilitate successful intervention implementation? Is the intervention:

• Addressing key employee concerns?
• In line with employee preferences, beliefs, and priorities?
• Accessible to workers with diverse backgrounds?
• Easy to use and cost-effective?
PROGRAM RECIPIENTS How do characteristics of (multilevel) recipients influence implementation?
Organizational level (leadership, management) What characteristics of the organization (e.g., financial health and resources, tendency to take risks or be an early adopter, and morale) can impact the successful implementation of an intervention? • Is management supportive?
• Are the goals realistic and clearly communicated?
• Is input provided across all organizational levels?
• Who has knowledge, ideas and opinions on the program/problem? Is there previous experience with similar programs?
• Who will be naysayers and what will they say?
• Who are the key players (champions) to get on board?
Employee/worker level What characteristics of workers/employees, including socioeconomic and demographic factors, can impact the success of an intervention? • Who has knowledge, ideas and opinions on the program/problem? Is there previous experience with similar programs?
• Who are the key players (champions) to get on board?
• Are there demographic or baseline health and/or social determinants that facilitate or hinder participation?
IMPLEMENTATION AND SUSTAINABILITY INFRASTRUCTURE How can the implementation plan be developed with partner/beneficiary input that adequately considers dissemination and sustainability from the beginning?
Are there established procedures or personnel whose responsibilities will include performance related to this program? (e.g., audit and feedback; allocated budget) • Can the intervention be paired with an already institutionalized process or collective understanding?
• How likely is the intervention to produce lasting effects for participants?
• How can the intervention be monitored for an extended period?
• How will success be tracked and reported?
• How will lessons learned be communicated/disseminated?
• What are likely modifications or adaptations that will need to be made to sustain the initiative over time (e.g., lower cost, different staff/expertise, reduced intensity, different settings)?
EXTERNAL ENVIRONMENT How do outside factors influence the organization?
What external influences potentially hinder/facilitate program implementation? • Are there regulations and/or policies or guidelines that may hinder/facilitate program implementation?
• What is the economic climate? Are there exogenous shocks (e.g., an economic downturn) or sociodemographic shifts/changes (e.g., an aging workforce) that could affect program adoption, implementation and maintenance?
* Practical, Robust, Implementation and Sustainability Model (PRISM).

TMFs are foundational to D&I science, in that they inform the design, evaluation, and outcomes assessed, and they may be used in combination with each other. Combining frameworks may help researchers to address multiple study purposes and multiple conceptual levels (Birken et al., 2017b). For example, CFIR (Damschroder et al., 2009) did not originally specify implementation outcomes and has been used in combination with the Proctor Implementation Outcomes Model (Proctor et al., 2011) or RE-AIM (Glasgow et al., 2019). Relatively recent studies (e.g., Damschroder et al., 2017; King et al., 2020) provide examples of the integration of CFIR and RE-AIM, where CFIR is used to assess context and RE-AIM to describe diverse implementation and effectiveness outcomes. Birken and colleagues (2017b) conducted a systematic review and provide an in-depth analysis of the combination of CFIR and TDF in D&I science studies, where CFIR served as the overarching D&I TMF and the TDF allowed for more focused assessment of provider-level behavior change. Bowser and colleagues (2019) combined EPIS with multiple models and theories to guide an assessment of the environmental, organizational, and economic factors affecting delivery of behavioral health services for justice-involved youth. D&I TMFs can also be combined with other approaches, such as the Pragmatic Explanatory Continuum Indicator Summary (PRECIS-2) model (Loudon et al., 2015), to determine how pragmatic and generalizable a study is (Gaglio et al., 2014; Luoma et al., 2017). In contrast to explanatory studies that rely on highly controlled methods to establish efficacy, pragmatic studies address the effectiveness of an intervention in real-world settings, are conducted under ‘usual care’ conditions, and produce data that are directly relevant to real-world beneficiaries. In designing for dissemination, implementation, sustainment, and impact in OSH, low-burden, cost-effective, pragmatic approaches should be considered (Zohar and Polachek, 2014). The PRECIS-2 (Loudon et al., 2015) and the newer PRECIS-2 PS (Norton et al., 2021) are particularly relevant to pragmatic implementation research and include key domains of importance to multiple relevant beneficiaries (Huebschmann et al., 2019), scored from 1 (very explanatory, i.e., with a focus on internal validity) to 5 (very pragmatic, i.e., with a focus on external validity). In OSH research, PRECIS-2 PS could be combined with another model (e.g., EPIS or CFIR) to design a study, to evaluate the level of pragmatism of existing study designs in the OSH literature in a review article, or to inform a funding or grant application. It should be noted that, while combining D&I models in a single study can be useful for exploring multiple study purposes and conceptual levels and domains, this approach may result in unnecessary complexity and redundancy (Birken et al., 2017b).
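As an illustration of how PRECIS-2 ratings might be summarized when gauging a study's pragmatism, the sketch below (Python) scores the nine PRECIS-2 domains on the 1 (very explanatory) to 5 (very pragmatic) scale described above; the ratings themselves are hypothetical.

```python
# Minimal sketch: summarizing PRECIS-2 ratings (Loudon et al., 2015)
# for a hypothetical OSH trial. Each of the nine PRECIS-2 domains is
# rated 1 (very explanatory) to 5 (very pragmatic); ratings are invented.
from statistics import mean

precis2_ratings = {
    "eligibility": 4,
    "recruitment": 3,
    "setting": 5,
    "organization": 3,
    "flexibility (delivery)": 4,
    "flexibility (adherence)": 4,
    "follow-up": 3,
    "primary outcome": 5,
    "primary analysis": 4,
}

print(f"Mean pragmatism score: {mean(precis2_ratings.values()):.1f} / 5")

# Flag explanatory-leaning domains that may merit redesign if the goal
# is real-world (T3/T4) relevance.
for domain, score in precis2_ratings.items():
    if score <= 3:
        print(f"Explanatory-leaning domain: {domain} (score {score})")
```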

Although the examples above highlight different TMFs, it is also important to recognize that there are many more commonalities than differences across D&I science TMFs and that many key issues—such as the importance of context and multi-level perspectives, understanding and tracking implementation and adaptations, and engaging partners and beneficiaries—are addressed to a greater or lesser extent in all of them. More guidance and practical tools are needed, such as those provided free of charge on the Dissemination and Implementation Science Models in Health webtool [https://dissemination-implementation.org], which also allows for the comparison of multiple models (Birken et al., 2017b).

5. D&I science study designs and methods

With their strong focus on internal validity (and limited emphasis on external validity or pragmatism), traditional randomized controlled trials (RCTs) are not always desirable or feasible for investigating D&I questions, including in workplace settings, and several alternative approaches have been advanced (Brown et al., 2017). As mentioned previously, hybrid implementation-effectiveness trial designs (Curran et al., 2012; Kemp et al., 2019; Landes et al., 2020) may be of particular interest and value to OSH researchers, as these approaches combine aspects of effectiveness trials and implementation research, allowing for a timelier translation of results into public health impact (Wolfenden et al., 2016). However, hybrid designs are typically more complex to execute than traditional RCTs (Curran et al., 2012) and may require additional resources and expertise to deploy successfully.

Other D&I study designs include the Multiphase Optimization Strategy (MOST), a framework for developing multicomponent interventions through a three-stage process (preparation, optimization, evaluation) in which the most effective intervention can be identified within key constraints, such as program cost (Collins et al., 2007; Collins et al., 2011; Guastaferro and Collins, 2019). Other pragmatic approaches include iterative designs [e.g., Sequential Multiple Assignment Randomized Trials (SMART); Nahum-Shani et al., 2012], user-centered designs (Lyon and Koerner, 2016), and cluster randomized and stepped wedge designs (Brown and Lilford, 2006; Handley et al., 2018), the latter of which ensures that all settings eventually receive the intervention. While these approaches are promising in terms of their ability to address key D&I science issues such as dynamic context and adaptation, their application can be challenging and is beyond the scope of this primer; interested readers are referred to the citations above.
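To illustrate the structure of one of these designs, the following Python sketch generates the rollout schedule for a simple stepped wedge trial; the cluster count, number of steps, and random seed are illustrative assumptions. Every cluster begins in the control condition (0) and crosses over to the intervention (1) at its randomly assigned step, so all settings receive the intervention by the final period.

    import random

    random.seed(1)
    n_clusters, n_steps = 8, 4        # e.g., 8 work sites, 4 crossover waves
    clusters = list(range(n_clusters))
    random.shuffle(clusters)          # randomize the order of crossover
    per_wave = n_clusters // n_steps  # 2 clusters cross over per wave
    crossover = {c: 1 + i // per_wave for i, c in enumerate(clusters)}

    # Period 0 is the all-control baseline; by period n_steps, all rows are 1.
    for c in sorted(crossover):
        exposure = [int(period >= crossover[c]) for period in range(n_steps + 1)]
        print(f"cluster {c}: {exposure}")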

Statistical power is also a critical design issue in implementation studies (Brown et al., 2017; Landsverk et al., 2018). D&I science research often tests system-level interventions in which power is influenced most strongly by the number of units at the highest (group) level (e.g., work units/teams or organizations) rather than at the individual (e.g., worker) level (Brown et al., 2017; Landsverk et al., 2018). Older, simpler techniques for calculating statistical power and sample size are typically not appropriate for implementation studies because of the multilevel clustering and longitudinal nature of D&I data (Landsverk et al., 2018). Newer tools exist for adequately planning and powering these complex study designs (e.g., Optimal Design from the W.T. Grant Foundation; http://wtgrantfoundation.org/resource/optimal-design-with-empirical-information-od0).
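For intuition about why the number of clusters matters so much, the sketch below applies the standard design effect for cluster randomized comparisons, DEFF = 1 + (m - 1) * ICC, to a textbook two-arm, normal-approximation sample size formula. The effect size, ICC, and cluster size are illustrative assumptions only; real multilevel or longitudinal D&I designs should be powered with dedicated tools such as Optimal Design.

    from math import ceil
    from statistics import NormalDist

    d = 0.30          # assumed standardized effect size
    alpha, power = 0.05, 0.80
    m = 25            # assumed workers per cluster (e.g., per work unit)
    icc = 0.05        # assumed intraclass correlation within clusters

    z = NormalDist().inv_cdf
    # Per-arm sample size ignoring clustering (two-sample comparison of means).
    n_flat = 2 * (z(1 - alpha / 2) + z(power)) ** 2 / d ** 2
    deff = 1 + (m - 1) * icc          # inflation due to clustering
    k = ceil(n_flat * deff / m)       # clusters needed per arm

    print(f"Design effect: {deff:.2f}; clusters per arm: {k} (~{k * m} workers)")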

D&I science methods are varied, with an increasing focus on pragmatic, participatory approaches, including community-based participatory research (CBPR) (Minkler, 2010). CBPR emphasizes equitable representation and engagement of multilevel and multisectoral partners and beneficiaries throughout the research process. Building strong community linkages is integral to participatory research approaches, as it is to successful D&I efforts. Moreover, improving the relevance of evidence-based interventions through participatory research approaches may help to expedite the use of new practices and programs by relevant collaborators and program recipients (Lobb and Colditz, 2013). Mixed methods research (Creswell et al., 2011) is also frequently used in D&I scholarship to appropriately evaluate complex, multilevel research translation challenges (Rabin and Brownson, 2018). In mixed methods studies, researchers intentionally integrate (i.e., combine in a meaningful and systematic way) quantitative and qualitative data to maximize the strengths and minimize the weaknesses of each type of data (Creswell et al., 2011). Researchers most commonly use mixed methods approaches in D&I science to identify barriers and facilitators to successful intervention dissemination and implementation, but these techniques can also be used to plan and monitor all stages of the implementation process (Palinkas and Cooper, 2018; Palinkas et al., 2011).

6. D&I measures

In D&I science, qualitative assessments, such as interviews and focus groups (e.g., Aarons et al., 2012; Hamilton and Finley, 2019; McCreight et al., 2019), are the predominant assessment approach (Weiner, 2021). Because D&I science is a newer field, quantitative assessment is challenged by measurement issues, and work in this area is underdeveloped but rapidly expanding (Lewis et al., 2015; Lewis et al., 2018a, 2018b; Martinez et al., 2014; Weiner, 2021). Advancements in D&I science necessitate the development, and widespread use, of reliable, valid, and pragmatic measures (Glasgow, 2013; Glasgow and Riley, 2013; Stanick et al., 2021; Weiner, 2021) to assess the effects of context, implementation strategies, and adaptations on outcome variables and constructs (Lewis et al., 2018a). Glasgow and Riley (2013) describe pragmatic measures as those that are important to collaborators, have low burden for respondents and staff, have broad applicability, are sensitive to change over time, and are actionable (e.g., easy to score and interpret in real-world settings).

Tools available to researchers wishing to qualitatively assess D&I constructs include, for example, a customizable interview guide based on the CFIR constructs that are the focus of an evaluation (cfirguide.org). Free templates of focus group and one-on-one interview guides are also available for assessing RE-AIM constructs before, during, and after program implementation (re-aim.org). Examples of quantitative D&I measures that have been shown to be reliable, have validity data, and meet criteria for being pragmatic include:

  • Measures by Weiner and colleagues (2017) to assess intervention acceptability, appropriateness and feasibility (12 items, four for each construct).

  • A measure by Jacobs et al. (2014) with three dimensions and two items per subscale to assess implementation climate—or the extent to which organizational members perceive that innovation use is expected, supported, and rewarded by their organization.

  • An 18-item, pragmatic measure by Ehrhart et al. (2014) of strategic climate for implementation of evidence-based interventions, that assesses six dimensions of organizational context.

  • A 12-item measure of implementation leadership (with four subscales, 3-items each) by Aarons and colleagues (2014).

  • A 12-item measure of organizational readiness for implementing change from Shea and colleagues (2014).

  • A brief and pragmatic measure from Moullin et al. (2018) to assess providers’ intentions to use a specific innovation or new practice.

  • The Program Sustainability Assessment Tool (PSAT) from Luke and colleagues (2014), a reliable, 40-item instrument with eight domains (5 items per domain) that can be used to assess the sustainability capacity of public health programs. The newer Clinical Sustainability Assessment Tool (CSAT) has seven domains (35 items) and is a self-assessment, used by both clinical staff and recipients, to evaluate the sustainability capacity of a clinical practice [https://sustaintool.org/csat/assess/].

As demonstrated by these examples, progress has been made in developing pragmatic rating criteria for D&I science to inform measure development and evaluation (Lewis et al., 2015; Stanick et al., 2021). See Appendix A for links to some commonly used D&I science measures.
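As a simple illustration of how brief instruments like these are often summarized for analysis, the Python sketch below computes subscale and overall means for a hypothetical 12-item measure with four 3-item subscales (the same shape as the implementation leadership measure above). The item responses, the 0–4 response scale, and the mean-scoring convention are assumptions for illustration; analysts should follow the scoring instructions in each measure's source publication.

    from statistics import mean

    # Invented responses to a hypothetical 12-item instrument (0-4 scale),
    # structured as four 3-item subscales.
    responses = [3, 4, 3,   # subscale 1
                 2, 3, 3,   # subscale 2
                 4, 4, 3,   # subscale 3
                 1, 2, 2]   # subscale 4
    subscales = [responses[i:i + 3] for i in range(0, len(responses), 3)]

    for k, items in enumerate(subscales, start=1):
        print(f"Subscale {k} mean: {mean(items):.2f}")
    print(f"Overall mean: {mean(responses):.2f}")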

7. Additional topics and emerging issues

Although it is impossible in this relatively brief overview to capture the range, depth, and complexity of issues being addressed in the D&I field today, this primer presents some main themes, concepts and methods of investigation and provides guidance for OSH researcher engagement in D&I. Additional and emerging topics of interest and areas for future study are briefly described below.

7.1. Designing for dissemination, implementation and sustainment (D4DIS)

Although this primer refers to both “D” and “I” research, it is generally acknowledged that dissemination research, and what is known about the active process of spreading evidence-based information to key audiences through defined channels and strategies (Rabin and Brownson, 2018), lags behind implementation research. To address this gap, and to improve the spread of evidence-based interventions across public health domains, Brownson and colleagues (2013) have advanced strategies for Designing for Dissemination, Implementation and Sustainment (D4DIS) (Rabin and Brownson, 2018). This entails ensuring that the products of research (including technologies and messages) are developed to align with the needs of the audience and the characteristics of the context (Brownson et al., 2018a). Practical tools exist to help researchers plan D4DIS efforts (see, for example, Designing for Dissemination, 2018 and the Stakeholder Engagement Selection Tool, https://dicemethods.com/tool). Similarly, until relatively recently, the sustainability of intervention programs and results did not receive adequate attention. The inter-related issues of sustainability, cost and other economic considerations, and health equity are the focus of emerging D&I science research initiatives (Brownson et al., 2021; Chambers et al., 2013; Proctor et al., 2015; Shelton et al., 2018; Woodward et al., 2019, 2021) and have obvious and timely relevance to OSH.

7.2. D&I mechanisms

As mentioned previously, mechanisms of change/action describe the process by which an implementation strategy brings about specified implementation outcomes. However, implementation strategies are frequently misaligned with the contexts in which programs are being implemented (Lewis et al., 2018a). As a hypothetical OSH example, workers may receive training on the proper donning and doffing of personal protective equipment (PPE) to protect them from workplace exposures (an intrapersonal-level strategy), but if the underlying problem is the lack of PPE provision at the workplace (an organizational- and/or societal-level determinant), this intervention will be ineffective at achieving the desired health and safety outcomes. Without an understanding of how implementation strategies work, they will likely fail to achieve positive impact (Grimshaw et al., 2012). D&I science research on mechanisms of action sheds light on the “how and why” questions of health interventions and programs. In these studies, causal pathway models and analyses illustrate how mechanisms can act as mediators (although not all mediators are mechanisms) or as moderators of effects on implementation outcomes. These models can delineate how an implementation strategy operates by illustrating the specific actions that lead from the deployment of the strategy to the desired implementation outcomes (Lewis et al., 2018b). Research on mechanisms of action is an emerging topic of interest among D&I researchers and funders (Brownson et al., 2018b; Lewis et al., 2018b) and may be an area for future exploration in the OSH field.
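To illustrate one common analytic approach to such causal pathways, the Python sketch below (using numpy and statsmodels) simulates a training strategy that improves an outcome only through a hypothesized mechanism and estimates the mediated effect with the standard product-of-coefficients method. The variables, effect sizes, and single-mediator structure are illustrative assumptions, not a prescribed D&I analysis; in practice, mechanism studies require careful design and stronger causal identification.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 500
    # Hypothetical data: a training strategy (0/1) raises supervisor support
    # (the mechanism), which in turn raises workers' PPE use (the outcome).
    strategy = rng.integers(0, 2, n)
    support = 0.5 * strategy + rng.normal(0, 1, n)   # "a" path
    ppe_use = 0.6 * support + rng.normal(0, 1, n)    # "b" path

    a_fit = sm.OLS(support, sm.add_constant(strategy)).fit()
    b_exog = sm.add_constant(np.column_stack([support, strategy]))
    b_fit = sm.OLS(ppe_use, b_exog).fit()

    a, b = a_fit.params[1], b_fit.params[1]
    print(f"Estimated indirect (mediated) effect a*b = {a * b:.2f}")  # ~0.30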

7.3. Systems science

Systems science involves methods for simulating and modeling complex systems to inform practice and policy (Luke and Stamatakis, 2012). Complex OSH challenges—including global public health crises such as the COVID-19 pandemic, the changing nature of work and the workforce, and the interaction of work and nonwork factors—will increasingly require the application of systems science or similar approaches (Guerin et al., 2021; Schulte et al., 2019). Complex systems consist of heterogeneous components that are nonlinear, interact with one another, have collective properties that are not explained by studying the individual elements of the system, persist over time, and are dynamic and adaptive to changing circumstances (Luke and Stamatakis, 2012). Lags between cause and effect, nonlinear relationships between variables, and unplanned system behavior at various socioecological levels are hallmarks of complexity in D&I (Burke et al., 2015; Neta et al., 2018). Systems science investigations use methods (e.g., social network analysis, system dynamics modeling, and agent-based modeling) developed in other disciplines, including sociology, business, political science, organizational behavior, computer science, and engineering (Luke et al., 2018; Neta et al., 2018). For example, in D&I science, systems science has been used to model the impact of alternative implementation approaches over time to “test not guess” the expected outcomes before proceeding with pilot testing (Zimmerman et al., 2016) and to be responsive to the needs of practitioners and decision makers (Chambers, 2020; Estabrooks et al., 2018). As Estabrooks and colleagues (2018) note, systems-based approaches, by their very nature, cannot succeed without representation and sustained engagement from the systems that these activities are intended to change.
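As a minimal taste of system dynamics modeling, the Python sketch below simulates the spread of an evidence-based intervention across a workforce with a Bass-style diffusion model and compares two hypothetical implementation approaches before any pilot testing, in the “test not guess” spirit noted above. The workforce size, adoption coefficients, and scenario labels are all illustrative assumptions.

    # Bass-style diffusion: dA/dt = (p + q * A / N) * (N - A),
    # simulated with simple monthly Euler steps. Values are invented.
    N = 1000  # workers in the organization

    def simulate(p, q, months=24):
        """Adopter counts over time; p = external push, q = peer influence."""
        adopters, series = 1.0, []
        for _ in range(months):
            adopters += (p + q * adopters / N) * (N - adopters)
            series.append(adopters)
        return series

    scenarios = {
        "training only (top-down outreach)": simulate(p=0.010, q=0.05),
        "peer champions (peer-to-peer spread)": simulate(p=0.005, q=0.25),
    }
    for label, series in scenarios.items():
        print(f"{label}: {series[11]:.0f} adopters by month 12, "
              f"{series[-1]:.0f} by month 24")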

8. Conclusion

The limited focus on D&I science within the OSH field (Dugan and Punnett, 2017; Guerin et al., 2021; Schulte et al., 2017) has real implications for the timely and relevant translation of research knowledge into practice for preventing workplace injuries and illnesses and improving worker well-being. It is both concerning and a missed opportunity that the extensive evidence base of OSH research and accumulated knowledge is not often applied in real-world settings, including in areas related to prevalent, and generally well-understood, occupational exposures and health effects (Schulte et al., 2017). Even less is known about translating research on newer, psychosocial, and other emergent hazards related to work, workplaces, and nonwork factors to improve public health. D&I science approaches are uniquely suited to addressing the complex challenges faced by OSH researchers as they scan the horizon for emerging threats and risks to today’s and tomorrow’s workers (Guerin et al., 2021; Schulte et al., 2019; Tamers et al., 2020). To seize the promise of D&I science, there is a need to build expertise and capacity (Colditz and Emmons, 2018; Proctor and Chambers, 2017); to explore pragmatic and cost-effective practices and programs that emphasize external validity, representative reach, and health equity (Colditz and Emmons, 2018; Green and Nasser, 2018; Glasgow et al., 2012, 2019); and to build active community partnerships (Chambers and Azrin, 2013). A key premise of D&I science is packaging and conveying the evidence necessary to improve public health in ways that are relevant to local communities, settings, and end-users (Brownson et al., 2018a; Dearing and Kreuter, 2010; Huebschmann et al., 2019) and that reduce OSH inequities (Ahonen et al., 2018).

In conclusion, this primer offers an overview of the promise, opportunities, and challenges of integrating D&I science into OSH, as well as examples, guidance, and resources for exploring these approaches to enhance the impact of OSH efforts. In light of the global COVID-19 pandemic and other emergent and dynamic OSH risks, these challenges have never been more salient, or more urgent, than they are today.

Acknowledgements

Special thanks to Drs. Samantha Harden, Lauren Menger-Ogle, Andrea Okun, Paul Schulte, Christina Studts, Liliana Tenney, and Pamela Tinc for their thoughtful input to and feedback on earlier versions of this manuscript, and to Samantha Newman, NIOSH, for graphic design expertise and assistance.

Funding

This research was primarily supported by internal CDC/NIOSH funding. Research reported in this publication was also supported by the National Cancer Institute of the National Institutes of Health (NIH) under Center P50 grant award number 5P50CA244688. Dr. Tyler is supported by grant number K08HS026512 from the Agency for Healthcare Research and Quality.

Appendix A

Building dissemination & implementation (D&I) science capacity in OSH

More widespread subject matter expertise in D&I science is needed in the OSH field, as it is across scientific disciplines more broadly (Proctor and Chambers, 2017). Training programs in D&I science include the Implementation Science 2 (IS2) and the Training Institute in Dissemination Research in Health (TIDRH) programs in Australia and Ireland, which follow the former NIH model. Other professional development opportunities in D&I science are listed on the Society for Implementation Research Collaboration (SIRC) website, and several websites provide regularly updated D&I resources, interactive tools, example applications, and information about conferences and upcoming events.

Select D&I science resource websites (models, frameworks, and tools).

Appendix B

Application of RE-AIM to a worksite intervention to prevent musculoskeletal disorders (MSDs) among construction workers in the Netherlands (adapted from Viester et al., 2015).

For each RE-AIM dimension, the entries below list the key questions, the level at which the dimension operates, its operationalization, and its application to the Viester et al. (2015) intervention.

REACH: How many people are exposed or served, and are they representative?
Key questions: (1) What proportion of those who would ideally be exposed or served are actually served? (2) Are those exposed representative of the population of interest? Does the intervention reach at-risk groups?
Level: Individual.
Operationalization: • Assess the number of people actually exposed or served and the number ideally exposed or served (the population of interest). • Compare characteristics of those actually served vs. the population of interest.
Application: • A total of 314 workers were randomized to an intervention group (n = 162) or control group (n = 152); 31% (314 of 1,021) of all workers were reached. • Participants were slightly older than nonparticipants (46% of participants vs. 37% of all company workers were aged ≥ 50 years). • BMI levels were comparable between participants and all company workers.
How measured? Participant baseline data and company data.

EFFECTIVENESS: What is the impact of the intervention on intended outcomes?
Key questions: (1) Will the intervention achieve the intended outcomes? (2) Are the outcomes consistent across population subgroups? (3) Are there any unanticipated consequences of the intervention? (4) Do the benefits outweigh any adverse consequences?
Level: Individual and setting/sector.
Operationalization: • Assess existing evidence. • Be clear about outcomes. • Develop a logic model. • Examine impact across subgroups. • Look at unanticipated (positive and negative) consequences. • Examine benefits vs. adverse consequences.
Application: • Short-term (6-month) intervention effects on (determinants of) diet and physical activity behavior changes (stage of change, self-efficacy, and decisional balance) targeted to reduce MSDs. • Significantly more intervention group participants than control group participants improved (i.e., moved toward action and maintenance) from baseline to follow-up for both dietary behavior and physical activity.
How measured? Participant baseline and 6-month follow-up questionnaires.

ADOPTION: How many settings/sectors are involved and are they representative?
Key questions: (1) What proportion of eligible workplaces/sectors could actually participate in the intervention? (2) Are there differences between the workplaces/sectors that do or do not participate?
Level: Setting/sector.
Operationalization: • Assess the number of settings/sectors that actually participate and the number that could participate. • Compare characteristics of participating vs. non-participating settings/sectors.
Application: • The program was developed and implemented in one large company. • Participation rates did not differ between the two main company units (general construction and infrastructure); within infrastructure, participation rates varied between subunits.
How measured? Direct observation.

IMPLEMENTATION: Were the required activities of the intervention successfully implemented?
Key questions: (1) What activities are required to implement the intervention? (2) Are those activities occurring as intended? (3) What is the cost (time and money) of the intervention? (4) What is the acceptability of the intervention to the population of interest?
Level: Mainly setting/sector.
Operationalization: • Define the activities required to implement. • Determine process measures (e.g., fidelity) that capture data on activities. • Assess the time and costs to implement. • Assess acceptability of the initiative to key partners/recipients.
Application:
Dose delivered = the number of workers who received coaching appointments (98.4% were provided).
Fidelity = the extent to which the coaching program was delivered as intended (i.e., timing and content of the sessions). Fidelity was moderate and needed adjustments; for example, the protocol was not always followed, and the sessions were not sufficiently long to cover required content.
Dose received = actual exposure to the intervention coaching sessions, rated as “high”: roughly 84% (n = 126) of workers in the intervention group attended at least one coaching session, and 61.1% of participants completed all coaching sessions, with up to 2 h of total contact. Reasons given for noncompletion were lack of interest, lack of time, or conflicting expectations of the program.
Costs = costs of the multiphase program implementation were not directly reported; it was noted that, given limited resources, a cost evaluation would be essential to making a business case to management.
Satisfaction = perceptions of the coaching, number of sessions, and program materials, rated as “high” overall (mean rating 7.6, SD = 1.0, on a scale from 0 to 10). The intervention implementers (personal health coaches) also viewed the program as satisfactory and usable.
How measured? Questionnaires, interviews, and coaching registrations.

MAINTENANCE: What are the long-term effects of the program, and are they sustainable?
Key questions: (1) Does the intervention produce lasting effects? (2) Is there consistent support from the organizations involved? (3) Is the funding adequate for program maintenance?
Level: Individual and setting/sector.
Operationalization: • Examine outcomes of interest. • Plan for long-term maintenance. • Engage partners to help with sustainability. • Examine strategies to ensure funding.
Application: • Organizational intention for long-term implementation. • Organizational decision makers were interested in continuing the program if it reduced sick leave or improved other health outcomes; lost work time due to program participation is a barrier.
How measured? Interviews.

Footnotes

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Disclaimer

The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the National Institute for Occupational Safety and Health, Centers for Disease Control and Prevention.

CRediT authorship contribution statement

R.J. Guerin: Conceptualization, Writing — original draft, Methodology, Project administration. R.E. Glasgow: Writing — review & editing. A. Tyler: Writing — review & editing. B.A. Rabin: Writing — review & editing. A.G. Huebschmann: Conceptualization, Methodology, Supervision, Writing — review & editing.

References

  1. Aarons GA, Ehrhart MG, Farahnak LR, 2014. The implementation leadership scale (ILS): development of a brief measure of unit level implementation leadership. Implem. Sci. 9, 45. 10.1186/1748-5908-9-45. [DOI] [PMC free article] [PubMed] [Google Scholar]
  2. Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, Silovsky JF, Hecht DB, Chaffin MJ, 2012. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implem. Sci. 7 (1) 10.1186/1748-5908-7-32. [DOI] [PMC free article] [PubMed] [Google Scholar]
  3. Aarons GA, Hurlburt M, Horwitz SM, 2011. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Admin. Policy Mental Health Mental Health Serv. Res. 38 (1), 4–23. 10.1007/s10488-010-0327-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH, 2017. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implem. Sci. 12 (1), 111. 10.1186/s13012-017-0640-6. [DOI] [PMC free article] [PubMed] [Google Scholar]
  5. Ahonen EQ, Fujishiro K, Cunningham T, Flynn M, 2018. Work as an inclusive part of population health inequities research and prevention. Am. J. Public Health 108 (3), 306–311. 10.2105/AJPH.2017.304214. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. AHRQ, 2014. Dissemination Planning Tool: Exhibit A from Volume 4. Rockville, MD. Available at https://www.ahrq.gov/patient-safety/resources/advances/vol4/planning.html. Accessed January 9, 2021. [Google Scholar]
  7. Allen JD, Linnan LA, Emmons KM, Brownson R, Colditz G, Proctor E, 2018. Fidelity and its relationship to implementation effectiveness, adaptation, and dissemination. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 281–304. [Google Scholar]
  8. Anger WK, Elliot DL, Bodner T, Olson R, Rohlman DS, Truxillo DM, Kuehl KS, Hammer LB, Montgomery D, 2015. Effectiveness of Total Worker Health interventions. J. Occup. Health Psychol. 20 (2), 226–247. 10.1037/a0038340. [DOI] [PubMed] [Google Scholar]
  9. Backer TE, 2001. Finding the Balance: Program Fidelity and Adaptation in Substance Abuse Prevention: A State-of-the-Art Review. Rockville, MD: Center for Substance Abuse Prevention, Substance Abuse and Mental Health Services Administration, 1–82. Available at https://www.csun.edu/sites/default/files/FindingBalance1.pdf. Accessed January 9, 2021. [Google Scholar]
  10. Balas EA, Boren SA, 2000. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray AT (Eds.), Yearbook of Medical Informatics. Schattauer, Stuttgart, Germany, pp. 65–70. [PubMed] [Google Scholar]
  11. Bandura A, 1986. Social foundations of thought and action: a social cognitive theory. Prentice-Hall, Englewood Cliffs, N.J. [Google Scholar]
  12. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM, 2015. An introduction to implementation science for the non-specialist. BMC Psychol. 3 (1), 32. 10.1186/s40359-015-0089-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  13. Baumann AA, Cabassa LJ, 2020. Reframing implementation science to address inequities in healthcare delivery. BMC Health Serv. Res. 20 (1), 1–9. 10.1186/s12913-020-4975-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  14. Becan JE, Bartkowski JP, Knight DK, Wiley TRA, DiClemente R, Ducharme L, Welsh WN, Bowser D, McCollister K, Hiller M, Spaulding AC, Flynn PM, Swartzendruber A, Dickson MF, Fisher JH, Aarons GA, 2018. A model for rigorously applying the Exploration, Preparation, Implementation, Sustainment (EPIS) framework in the design and measurement of a large scale collaborative multi-site study. Health Justice 6 (1). 10.1186/s40352-018-0068-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  15. Birken SA, Powell BJ, Shea CM, Haines ER, Alexis Kirk M, Leeman J, Rohweder C, Damschroder L, Presseau J, 2017a. Criteria for selecting implementation science theories and frameworks: results from an international survey. Implem. Sci. 12 (1) 10.1186/s13012-017-0656-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  16. Birken SA, Powell BJ, Presseau J, Kirk MA, Lorencatto F, Gould NJ, Shea CM, Weiner BJ, Francis JJ, Yu Y, Haines E, Damschroder LJ, 2017b. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): a systematic review. Implem. Sci. 12 (1) 10.1186/s13012-016-0534-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Blase K, Fixsen D, 2013. Core intervention components: Identifying and operationalizing what makes programs work. ASPE Research Brief. U.S. Department of Health and Human Services; https://aspe.hhs.gov/report/core-intervention-components-identifying-and-operationalizing-what-makes-programs-work. [Google Scholar]
  18. Bopp M, Saunders RP, Lattimore D, 2013. The tug-of-war: fidelity versus adaptation throughout the health promotion program life cycle. J. Primary Prevent. 34 (3), 193–207. 10.1007/s10935-013-0299-y. [DOI] [PubMed] [Google Scholar]
  19. Bowser D, Henry BF, McCollister KE, 2019. An overlapping systems conceptual framework to evaluate implementation of a behavioral health intervention for justice-involved youth. Health Serv. Insights 12, 10. 10.1177/1178632919855037. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Bronfenbrenner U, 1979. The ecology of human development. Harvard University Press, Cambridge, MA. [Google Scholar]
  21. Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, Collins LM, Duan N, Mittman BS, Wallace A, Tabak RG, Ducharme L, Chambers DA, Neta G, Wiley T, Landsverk J, Cheung K, Cruden G, 2017. An overview of research and evaluation designs for dissemination and implementation. Annu. Rev. Public Health 38 (1), 1–22. [DOI] [PMC free article] [PubMed] [Google Scholar]
  22. Brown CA, Lilford RJ, 2006. The stepped wedge trial design: a systematic review. BMC Med. Res. Method. 6 (1), 54. 10.1186/1471-2288-6-54. [DOI] [PMC free article] [PubMed] [Google Scholar]
  23. Brownson RC, Eyler AA, Harris JK, Moore JB, Tabak RG, 2018a. Getting the word out: New approaches for disseminating public health science. J. Public Health Manage. Pract. 24 (2), 102–111. [DOI] [PMC free article] [PubMed] [Google Scholar]
  24. Brownson RC, Colditz GA, Proctor EK, 2018b. Future issues in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 481–490. [Google Scholar]
  25. Brownson RC, Fielding JE, Maylahn CM, 2009. Evidence-based public health: a fundamental concept for public health practice. Annu. Rev. Public Health 30 (1), 175–201. [DOI] [PubMed] [Google Scholar]
  26. Brownson RC, Jacobs JA, Tabak RG, Hoehner CM, Stamatakis KA, 2013. Designing for dissemination among public health researchers: findings from a national survey in the United States. Am. J. Public Health 103 (9), 1693–1699. 10.2105/AJPH.2012.301165. [DOI] [PMC free article] [PubMed] [Google Scholar]
  27. Brownson RC, Kumanyika SK, Kreuter MW, Haire-Joshu D, 2021. Implementation science should give higher priority to health equity. Implem. Sci. 16 (1), 28. 10.1186/s13012-021-01097-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  28. Buller DB, Buller MK, Meenan R, Cutter GR, Berteletti J, Eye R, Pagoto S, 2020. Design and baseline data of a randomized trial comparing two methods for scaling-up an occupational sun protection intervention. Contemp. Clin. Trials 97, 106147. 10.1016/j.cct.2020.106147. [DOI] [PMC free article] [PubMed] [Google Scholar]
  29. Burke JG, Lich KH, Neal JW, Meissner HI, Yonas M, Mabry PL, 2015. Enhancing dissemination and implementation research using systems science methods. Int. J. Behav. Med. 22 (3), 283–291. 10.1007/s12529-014. [DOI] [PMC free article] [PubMed] [Google Scholar]
  30. Carlan NA, Kramer DM, Bigelow P, Wells R, Garritano E, Vi P, 2012. Digging into construction: Social networks and their potential impact on knowledge transfer. Work 42 (2), 223–232. 10.3233/WOR-2012-1345. [DOI] [PubMed] [Google Scholar]
  31. Carvalho ML, Honeycutt S, Escoffery C, Glanz K, Sabbs D, Kegler MC, 2013. Balancing fidelity and adaptation: implementing evidence-based chronic disease prevention programs. J. Public Health Manage. Pract. 19 (4), 348–356. 10.1097/PHH.0b013e31826d80eb. [DOI] [PubMed] [Google Scholar]
  32. Castro FG, Barrera M, Martinez CR, 2004. The cultural adaptation of prevention interventions: Resolving tensions between fidelity and fit. Prev. Sci. 5 (1), 41–45. 10.1023/B:PREV.0000013980.12412.cd. [DOI] [PubMed] [Google Scholar]
  33. Center for Training and Research Translation, University of North Carolina Center for Health Promotion and Disease Prevention (n.d.). RE-AIM dimensions, associated questions, level, measurement, influencing factors & improvement strategies. Available at https://snapedtoolkit.org/app/uploads/RE-AIMTable.pdf. Accessed January 10, 2021.
  34. Chambers DA, 2020. Considering the intersection between implementation science and COVID-19. Implem. Res. Pract. 10.1177/0020764020925994. [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Chambers DA, Azrin ST, 2013. Research and services partnerships: partnership: a fundamental component of dissemination and implementation research. Psychiatric Serv. 64 (6), 509–511. 10.1176/appi.ps.201300032. [DOI] [PubMed] [Google Scholar]
  36. Chambers DA, Glasgow RE, Stange KC, 2013. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implem. Sci. 8 (1), 117. 10.1186/1748-5908-8-117. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Chambers DA, Norton WE, 2016. The adaptome: advancing the science of intervention adaptation. Am. J. Prev. Med. 51 (4), S124–S131. 10.1016/j.amepre.2016.05.011. [DOI] [PMC free article] [PubMed] [Google Scholar]
  38. Cocker KD, Cardon G, Bennie JA, Kolbe-Alexander T, Meester FD, Vandelanotte C, 2018. From evidence-based research to practice-based evidence: Disseminating a web-based computer-tailored workplace sitting intervention through a health promotion organisation. Int. J. Environ. Res. Public Health 15 (5), 1049. 10.3390/ijerph15051049. [DOI] [PMC free article] [PubMed] [Google Scholar]
  39. Colditz GA, Emmons KM, 2018. The promise and challenges of dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 1–18. [Google Scholar]
  40. Collins LM, Murphy SA, Strecher V, 2007. The multiphase optimization strategy (MOST) and the sequential multiple assignment randomized trial (SMART): new methods for more potent eHealth interventions. Am. J. Prev. Med. 32 (5), S112–S118. [DOI] [PMC free article] [PubMed] [Google Scholar]
  41. Collins LM, Baker TB, Mermelstein RJ, Piper ME, Jorenby DE, Smith SS, Christiansen BA, Schlam TR, Cook JW, Fiore MC, 2011. The multiphase optimization strategy for engineering effective tobacco use interventions. Ann. Behav. Med. 41 (2), 208–226. [DOI] [PMC free article] [PubMed] [Google Scholar]
  42. Colquhoun H, Leeman J, Michie S, Lokker C, Bragge P, Hempel S, McKibbon KA, Peters G-J, Stevens KR, Wilson MG, Grimshaw J, 2014. Towards a common terminology: a simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implem. Sci. 9 (1) 10.1186/1748-5908-9-51. [DOI] [PMC free article] [PubMed] [Google Scholar]
  43. Crawford JO, Davis A, Walker G, Cowie H, Ritchie P, 2016. Evaluation of knowledge transfer for occupational safety and health in an organizational context: Development of an evaluation framework. Policy Pract. Health Saf. 14 (1), 7–21. 10.1080/14773996.2016.1231864. [DOI] [Google Scholar]
  44. Creswell JW, Klassen AC, Plano Clark VL, Smith KC, 2011. Best practices for mixed methods research in the health sciences. National Institutes of Health, Bethesda (Maryland). Available at https://www.csun.edu/sites/default/files/best_prac_mixed_methods.pdf. Accessed January 9, 2021. [Google Scholar]
  45. Cunningham TR, Tinc P, Guerin RJ, Schulte PA, 2020. Translation research in occupational health and safety settings: common ground and future directions. J. Saf. Res. 74, 161–167. 10.1016/j.jsr.2020.06.015. [DOI] [PMC free article] [PubMed] [Google Scholar]
  46. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C, 2012. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med. Care 50 (3), 217. 10.1097/MLR.0b013e3182408812. [DOI] [PMC free article] [PubMed] [Google Scholar]
  47. Damschroder LJ, 2021. Introduction & Application of an Updated Consolidated Framework for Implementation Research: CFIR 2.0 [Conference presentation]. The 14th Annual Conference on the Science of Dissemination and Implementation in Health (D&I). Academy Health. [Google Scholar]
  48. Damschroder LJ, 2020. Clarity out of chaos: use of theory in implementation research. Psychiatry Res. 283, 112461 10.1016/j.psychres.2019.06.036. [DOI] [PubMed] [Google Scholar]
  49. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC, 2009. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implem. Sci. 4 (1), 1–15. 10.1186/1748-5908-4-50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  50. Damschroder LJ, Reardon CM, AuYoung M, Moin T, Datta SK, Sparks JB, Maciejewski ML, Steinle NI, Weinreb JE, Hughes M, Pinault LF, Xiang XM, Billington C, Richardson CR, 2017. Implementation findings from a hybrid III implementation-effectiveness trial of the Diabetes Prevention Program (DPP) in the Veterans Health Administration (VHA). Implem. Sci. 12 (1) 10.1186/s13012-017-0619-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  51. Dearing JW, 2008. Evolution of diffusion and dissemination theory. J. Public Health Manage. Pract. 14 (2), 99–108. 10.1097/01.PHH.0000311886.98627.b7. [DOI] [PubMed] [Google Scholar]
  52. Dearing JW, Kee KF, Peng TQ, 2018. Historical roots of dissemination and implementation science. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 48–61. [Google Scholar]
  53. Dearing JW, Kreuter MW, 2010. Designing for diffusion: how can we increase uptake of cancer communication innovations? Patient Educ. Couns. 81, S100–S110. 10.1016/j.pec.2010.10.013. [DOI] [PMC free article] [PubMed] [Google Scholar]
  54. Designing for Dissemination, 2018. Bridging the Science and Practice of Designing for Dissemination: Going from Unicorns to Workhorses (Workshop). Available at https://www1.ucdenver.edu/docs/librariesprovider94/di-docs/guides-and-tools/2018-d4d-workbook_revised2.pdf?sfvrsn=463c06b9_2. Accessed January 9, 2021.
  55. Dugan AG, Punnett L, 2017. Dissemination and implementation research for occupational safety and health. Occup. Health Sci. 1 (1–2), 29–45. 10.1007/s41542-017-0006-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  56. Durlak JA, DuPre EP, 2008. Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am. J. Community Psychol. 41 (3–4), 327–350. 10.1007/s10464-008-9165-0. [DOI] [PubMed] [Google Scholar]
  57. Duryan M, Smyth H, Roberts A, Rowlinson S, Sherratt F, 2020. Knowledge transfer for occupational health and safety: cultivating health and safety learning culture in construction firms. Accid. Anal. Prev. 139, 105496 10.1016/j.aap.2020.105496. [DOI] [PubMed] [Google Scholar]
  58. Ehrhart MG, Aarons GA, Farahnak LR, 2014. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implem. Sci. 9, 157. 10.1186/s13012-014-0157-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  59. Escoffery C, Lebow-Skelley E, Udelson H, Böing EA, Wood R, Fernandez ME, Mullen PD, 2019. A scoping study of frameworks for adapting public health evidence-based interventions. Transl. Behav. Med. 9 (1), 1–10. 10.1093/tbm/ibx067. [DOI] [PMC free article] [PubMed] [Google Scholar]
  60. Estabrooks PA, Brownson RC, Pronk NP, 2018. Dissemination and implementation science for public health professionals: an overview and call to action. Prevent. Chronic Dis. 15, 180525. 10.5888/pcd15.180525. [DOI] [PMC free article] [PubMed] [Google Scholar]
  61. Feldstein AC, Glasgow RE, 2008. A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. Joint Commission J. Qual. Patient Saf. 34 (4), 228–243. 10.1016/S1553-7250(08)34030-6. [DOI] [PubMed] [Google Scholar]
  62. Fernandez ME, ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, Ruiter RAC, Markham CM, Kok G, 2019. Implementation mapping: using intervention mapping to develop implementation strategies. Front. Public Health 7. 10.3389/fpubh.2019.00158. [DOI] [PMC free article] [PubMed] [Google Scholar]
  63. Finley EP, Huynh AK, Farmer MM, Bean-Mayberry B, Moin T, Oishi SM, Moreau JL, Dyer KE, Lanham HJ, Leykum L, Hamilton AB, 2018. Periodic reflections: a method of guided discussions for documenting implementation phenomena. BMC Med. Res. Method. 18 (1) 10.1186/s12874-018-0610-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  64. Fixsen DL, Blasé KA, Timbers GD, Wolf MM, 2007. In search of program implementation: 792 replications of the Teaching-Family Model. Behav. Anal. Today 8 (1), 96–110. 10.1037/h0100104. [DOI] [Google Scholar]
  65. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, Eccles MP, 2013. A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implem. Sci. 8 (1), 1–11. 10.1186/1748-5908-8-35. [DOI] [PMC free article] [PubMed] [Google Scholar]
  66. Fort DG, Herr TM, Shaw PL, Gutzman KE, Starren JB, 2017. Mapping the evolving definitions of translational research. J. Clin. Transl. Sci. 1 (1), 60–66. 10.1017/cts.2016.10. [DOI] [PMC free article] [PubMed] [Google Scholar]
  67. Gaglio B, Glasgow RE, 2018. Evaluation approaches for dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 317–334. [Google Scholar]
  68. Gaglio B, Phillips SM, Heurtin-Roberts S, Sanchez MA, Glasgow RE, 2014. How pragmatic is it? Lessons learned using PRECIS and RE-AIM for determining pragmatic characteristics of research. Implem. Sci. 9, 96. 10.1186/s13012-014-0096-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  69. Gaglio B, Shoup JA, Glasgow RE, 2013. The RE-AIM framework: A systematic review of use over time. Am. J. Public Health 103 (6), e38–e46. 10.2105/AJPH.2013.301299. [DOI] [PMC free article] [PubMed] [Google Scholar]
  70. Glasgow RE, 2013. What does it mean to be pragmatic? Pragmatic methods, measures, and models to facilitate research translation. Health Educ. Behav. 40 (3), 257–265. 10.1177/1090198113486805. [DOI] [PubMed] [Google Scholar]
  71. Glasgow RE, Battaglia C, McCreight M, Ayele RA, Rabin BA, 2020. Making implementation science more rapid: Use of the RE-AIM framework for mid-course adaptations across five health services research projects in the veterans health administration. Front. Public Health 8, 194. 10.3389/fpubh.2020.00194. [DOI] [PMC free article] [PubMed] [Google Scholar]
  72. Glasgow RE, Emmons KM, 2007. How can we increase translation of research into practice? Types of evidence needed. Annu. Rev. Public Health 28, 413–433. 10.1146/annurev.publhealth.28.021406.144145. [DOI] [PubMed] [Google Scholar]
  73. Glasgow RE, Estabrooks PE, 2018. Pragmatic applications of RE-AIM for health care initiatives in community and clinical settings. Prevent. Chronic Dis. 15, 170271 10.5888/pcd15.170271. [DOI] [PMC free article] [PubMed] [Google Scholar]
  74. Glasgow RE, Harden SM, Gaglio B, Rabin BA, Smith ML, Porter GC, Estabrooks PA, 2019. RE-AIM planning and evaluation framework: adapting to new science and practice with a twenty-year review. Front. Public Health 7, 64. 10.3389/fpubh.2019.00064. [DOI] [PMC free article] [PubMed] [Google Scholar]
  75. Glasgow RE, Huebschmann AG, Brownson RC, 2018. Expanding the CONSORT figure: Increasing transparency in reporting on external validity. Am. J. Prev. Med. 55 (3), 422–430. 10.1016/j.amepre.2018.04.044. [DOI] [PubMed] [Google Scholar]
  76. Glasgow RE, Riley WT, 2013. Pragmatic measures: what they are and why we need them. Am. J. Prev. Med. 45 (2), 237–243. 10.1016/j.amepre.2013.03.010. [DOI] [PubMed] [Google Scholar]
  77. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C, 2012. National Institutes of Health approaches to dissemination and implementation science: current and future directions. Am. J. Public Health 102 (7), 1274–1281. 10.2105/AJPH.2012.300755. [DOI] [PMC free article] [PubMed] [Google Scholar]
  78. Glasgow RE, Vogt TM, Boles SM, 1999. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am. J. Public Health 89 (9), 1322–1327. 10.2105/AJPH.89.9.1322. [DOI] [PMC free article] [PubMed] [Google Scholar]
  79. Green LW, Nasser M, 2018. Furthering dissemination and implementation research: The need for more attention to external validity. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 301–316. [Google Scholar]
  80. Green LW, Ottoson JM, Garcia C, Hiatt RA, 2009. Diffusion theory and knowledge dissemination, utilization, and integration in public health. Annu. Rev. Public Health 30, 151–174. 10.1146/annurev.publhealth.031308.100049. [DOI] [PubMed] [Google Scholar]
  81. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE, 2012. Knowledge translation of research findings. Implem. Sci. 7 (1), 1–17. 10.1186/1748-5908-7-50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  82. Guastaferro K, Collins LM, 2019. Achieving the goals of translational science in public health intervention research: the Multiphase Optimization Strategy (MOST). Am. J. Public Health 109 (S2), S128–S129. [DOI] [PMC free article] [PubMed] [Google Scholar]
  83. Guerin RJ, Harden SM, Rabin BA, Rohlman DS, Cunningham TR, TePoel MR, Parish M, Glasgow RE, 2021. Dissemination and implementation science approaches for occupational safety and health research: implications for advancing total worker health. Int. J. Environ. Res. Public Health 18 (21), 11050. 10.3390/ijerph182111050. [DOI] [PMC free article] [PubMed] [Google Scholar]
  84. Hamilton AB, Finley EP, 2019. Qualitative methods in implementation research: An introduction. Psychiatry Res. 280, 112516 10.1016/j.psychres.2019.112516. [DOI] [PMC free article] [PubMed] [Google Scholar]
  85. Handley MA, Lyles CR, McCulloch C, Cattamanchi A, 2018. Selecting and improving quasi-experimental designs in effectiveness and implementation research. Annu. Rev. Public Health 39, 5–25. 10.1146/annurev-publhealth-040617-014128. [DOI] [PMC free article] [PubMed] [Google Scholar]
  86. Harden SM, Balis LE, Strayer III T, Wilson ML, 2021. Assess, plan, do, evaluate, and report: iterative cycle to remove academic control of a community-based physical activity program. Prevent. Chronic Dis. 18, 200513 10.5888/pcd18.200513. [DOI] [PMC free article] [PubMed] [Google Scholar]
  87. NIH, 2021. Dissemination and Implementation Research in Health. PAR-19–274. Available at: https://grants.nih.gov/grants/guide/pa-files/PAR-19-274.html. Accessed January 9, 2021.
  88. Hempel S, Xenakis L, Danz M, 2016. Systematic reviews for occupational safety and health questions: resources for evidence synthesis. RAND Corporation, Santa Monica, CA. Available at: http://www.rand.org/pubs/research_reports/RR1463.html. Accessed January 10, 2021. [Google Scholar]
  89. Hofmann DA, Burke MJ, Zohar D, 2017. 100 years of occupational safety research: From basic protections and work analysis to a multilevel view of workplace safety and risk. J. Appl. Psychol. 102 (3), 375–388. 10.1037/apl0000114. [DOI] [PubMed] [Google Scholar]
  90. Holtrop JS, Rabin BA, Glasgow RE, 2018. Qualitative approaches to use of the RE-AIM framework: rationale and methods. BMC Health Serv. Res. 18 (1), 1–10. 10.1186/s12913-018-2938-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  91. Howard J, Piacentino J, MacMahon K, Schulte P, 2017. Using systematic review in occupational safety and health. Am. J. Ind. Med. 60 (11), 921–929. 10.1002/ajim.22771. [DOI] [PubMed] [Google Scholar]
  92. Huebschmann AG, Leavitt IM, Glasgow RE, 2019. Making health research matter: a call to increase attention to external validity. Annu. Rev. Public Health 40, 45–63. 10.1146/annurev-publhealth-040218-043945. [DOI] [PubMed] [Google Scholar]
  93. Jacobs JA, Jones E, Gabella BA, Spring B, Brownson RC, 2012. Tools for implementing an evidence-based approach in public health practice. Prevent. Chronic Dis. 9, 110324 10.5888/pcd9.110324. [DOI] [PMC free article] [PubMed] [Google Scholar]
  94. Jacobs SR, Weiner BJ, Bunger AC, 2014. Context matters: measuring implementation climate among individuals and groups. Implem. Sci. 9, 46. 10.1186/1748-5908-9-46. [DOI] [PMC free article] [PubMed] [Google Scholar]
  95. Jenny GJ, Brauchli R, Inauen A, Füllemann D, Fridrich A, Bauer GF, 2015. Process and outcome evaluation of an organizational-level stress management intervention in Switzerland. Health Promot. Int. 30 (3), 573–585. 10.1093/heapro/dat091. [DOI] [PubMed] [Google Scholar]
  96. Keefe AR, Demers PA, Neis B, Arrandale VH, Davies HW, Gao Z, Hedges K, Holness DL, Koehoorn M, Stock SR, Bornstein S, 2020. A scoping review to identify strategies that work to prevent four important occupational diseases. Am. J. Indust. Med. 63 (6), 490–516. 10.1002/ajim.23107. [DOI] [PubMed] [Google Scholar]
  97. Kemp CG, Wagenaar BH, Haroz EE, 2019. Expanding hybrid studies for implementation research: Intervention, implementation strategy, and context. Front. Public Health 7, 325. 10.3389/fpubh.2019.00325. [DOI] [PMC free article] [PubMed] [Google Scholar]
  98. Kessler R, Glasgow RE, 2011. A proposal to speed translation of healthcare research into practice: dramatic change is needed. Am. J. Prev. Med. 40 (6), 637–644. 10.1016/j.amepre.2011.02.023. [DOI] [PubMed] [Google Scholar]
  99. Khan S, Chambers D, Neta G, 2021. Revisiting time to translation: implementation of evidence-based practices (EBPs) in cancer control. Cancer Causes Control 32 (3), 221–230. [DOI] [PubMed] [Google Scholar]
  100. Khoury MJ, Gwinn M, Yoon PW, Dowling N, Moore CA, Bradley L, 2007. The continuum of translation research in genomic medicine: how can we accelerate the appropriate integration of human genome discoveries into health care and disease prevention? Genet. Med. 9 (10), 665–674. 10.1097/GIM.0b013e31815699d0. [DOI] [PubMed] [Google Scholar]
  101. Khoury MJ, Gwinn M, Ioannidis JP, 2010. The emergence of translational epidemiology: from scientific discovery to population health impact. Am. J. Epidemiol. 172 (5), 517–524. 10.1093/aje/kwq211. [DOI] [PMC free article] [PubMed] [Google Scholar]
  102. King DK, Shoup JA, Raebel MA, Anderson CB, Wagner NM, Ritzwoller DP, Bender BG, 2020. Planning for implementation success using RE-AIM and CFIR frameworks: a qualitative study. Front. Public Health 8, 59. 10.3389/fpubh.2020.00059. [DOI] [PMC free article] [PubMed] [Google Scholar]
  103. Kirk MA, Moore JE, Stirman SW, Birken SA, 2020. Towards a comprehensive model for understanding adaptations’ impact: the model for adaptation design and impact (MADI). Implem. Sci. 15 (1), 1–15. 10.1186/s13012-020-01021-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
  104. Landes SJ, McBain SA, Curran GM, 2020. Reprint of: an introduction to effectiveness-implementation hybrid designs. Psychiatry Res. 283, 112630 10.1016/j.psychres.2019.112513. [DOI] [PubMed] [Google Scholar]
  105. Landsverk J, Brown CH, Smith JD, Chamberlain P, Curran GM, Palinkas L, Horwitz SM, 2018. Design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 201–227. [Google Scholar]
  106. Lewis CC, Fischer S, Weiner BJ, Stanick C, Kim M, Martinez RG, 2015. Outcomes for implementation science: An enhanced systematic review of instruments using evidence-based rating criteria. Implem. Sci. 10 (1), 155. 10.1186/s13012-015-0342-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  107. Lewis CC, Proctor EK, Brownson RC, 2018a. Measurement issues in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 229–244. [Google Scholar]
  108. Lewis CC, Klasnja P, Powell BJ, Lyon AR, Tuzzio L, Jones S, Weiner B, 2018b. From classification to causality: advancing understanding of mechanisms of change in implementation science. Front. Public Health 6, 136. 10.3389/fpubh.2018.00136. [DOI] [PMC free article] [PubMed] [Google Scholar]
  109. Lobb R, Colditz GA, 2013. Implementation science and its application to population health. Annu. Rev. Public Health 34, 235–251. 10.1146/annurev-publhealth-031912-114444. [DOI] [PMC free article] [PubMed] [Google Scholar]
  110. Loudon K, Treweek S, Sullivan F, Donnan P, Thorpe KE, Zwarenstein M, 2015. The PRECIS-2 tool: designing trials that are fit for purpose. BMJ 350, h2147. 10.1136/bmj.h2147. [DOI] [PubMed] [Google Scholar]
  111. Lucas DL, Kincl LD, Bovbjerg VE, Lincoln JM, 2014. Application of a translational research model to assess the progress of occupational safety research in the international commercial fishing industry. Saf. Sci. 64, 71–81. 10.1016/j.ssci.2013.11.023. [DOI] [Google Scholar]
  112. Luke DA, Calhoun A, Robichaux CB, Elliott MB, Moreland-Russell S, 2014. The program sustainability assessment tool: A new instrument for public health programs. Prevent. Chronic Dis. 11, 130184 10.5888/pcd11.130184. [DOI] [PMC free article] [PubMed] [Google Scholar]
  113. Luke DA, Stamatakis KA, 2012. Systems science methods in public health: Dynamics, networks, and agents. Annu. Rev. Public Health 33, 357–376. 10.1146/annurev-publhealth-031210-101222. [DOI] [PMC free article] [PubMed] [Google Scholar]
  114. Luke DA, Morshed AB, McKay VR, Combs TB, 2018. Systems science methods in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 157–174. [Google Scholar]
  115. Luoma KA, Leavitt IM, Marrs JC, Nederveld AL, Regensteiner JG, Dunn AL, Glasgow RE, Huebschmann AG, 2017. How can clinical practices pragmatically increase physical activity for patients with type 2 diabetes? A systematic review. Transl. Behav. Med. 7 (4), 751–772.
  116. Lyon AR, Koerner K, 2016. User-centered design for psychosocial intervention development and implementation. Clin. Psychol. 23 (2), 180–200. 10.1111/cpsp.12154.
  117. Martinez RG, Lewis CC, Weiner BJ, 2014. Instrumentation issues in implementation science. Implem. Sci. 9, 118. 10.1186/s13012-014-0118-8.
  118. May CR, Johnson M, Finch T, 2016. Implementation, context and complexity. Implem. Sci. 11 (1), 1–12. 10.1186/s13012-016-0506-3.
  119. McKibbon KA, Lokker C, Wilczynski NL, Ciliska D, Dobbins M, Davis DA, Straus SE, 2010. A cross-sectional study of the number and frequency of terms used to refer to knowledge translation in a body of health literature in 2006: a Tower of Babel? Implem. Sci. 5 (1), 16. 10.1186/1748-5908-5-16.
  120. McCreight MS, Rabin BA, Glasgow RE, Ayele RA, Leonard CA, Gilmartin HM, Battaglia CT, 2019. Using the Practical, Robust Implementation and Sustainability Model (PRISM) to qualitatively assess multilevel contextual factors to help plan, implement, evaluate, and disseminate health services programs. Transl. Behav. Med. 9 (6), 1002–1011. 10.1093/tbm/ibz085.
  121. Michie S, 2014. Implementation science: understanding behavior change and maintenance. BMC Health Serv. Res. 14 (Suppl 2), O9. 10.1186/1472-6963-14-S2-O9.
  122. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A, 2005. Making psychological theory useful for implementing evidence-based practice: a consensus approach. BMJ Qual. Saf. 14 (1), 26–33. 10.1136/qshc.2004.011155.
  123. Michie S, Van Stralen MM, West R, 2011. The behavior change wheel: a new method for characterizing and designing behavior change interventions. Implem. Sci. 6 (1), 42.
  124. Miller CJ, Barnett ML, Baumann AA, Gutner CA, Wiltsey-Stirman S, 2021. The FRAME-IS: a framework for documenting modifications to implementation strategies in healthcare. Implem. Sci. 16 (1), 1–12. 10.1186/s13012-021.
  125. Minkler M, 2010. Linking science and policy through community-based participatory research to study and address health disparities. Am. J. Public Health 100 (S1), S81–S87. 10.2105/AJPH.2009.165720.
  126. Morgan JI, Curcuruto M, Steer M, Bazzoli A, 2021. Implementing the theoretical domains framework in occupational safety: development of the safety behavior change questionnaire. Saf. Sci. 136, 105135. 10.1016/j.ssci.2020.105135.
  127. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA, 2019. Systematic review of the exploration, preparation, implementation, sustainment (EPIS) framework. Implem. Sci. 14 (1), 1–16. 10.1186/s13012-018-0842-6.
  128. Moullin JC, Ehrhart MG, Aarons GA, 2018. Development and testing of the Measure of Innovation-Specific Implementation Intentions (MISII) using Rasch measurement theory. Implem. Sci. 13 (1).
  129. Movsisyan A, Arnold L, Evans R, Hallingberg B, Moore G, O’Cathain A, Pfadenhauer LM, Segrott J, Rehfuess E, 2019. Adapting evidence-informed complex population health interventions for new contexts: a systematic review of guidance. Implem. Sci. 14 (1). 10.1186/s13012-019-0956-5.
  130. Nahum-Shani I, Qian M, Almirall D, Pelham WE, Gnagy B, Fabiano GA, Waxmonsky JG, Yu J, Murphy SA, 2012. Experimental design and primary data analysis methods for comparing adaptive interventions. Psychol. Methods 17 (4), 457–477. 10.1037/a0029372.
  131. NAS, 2009. Evaluating Occupational Health and Safety Research Programs: Framework and Next Steps. National Academies Press, Washington, DC. Available at https://www.ncbi.nlm.nih.gov/books/NBK219530. Accessed January 9, 2021.
  132. NCI, 2019. Implementation science at a glance: a guide for cancer control practitioners. U.S. Department of Health and Human Services, National Institutes of Health. Available at https://cancercontrol.cancer.gov/IS/docs/NCi-ISaaG-Workbook.pdf. Accessed January 9, 2021.
  133. Neta G, Glasgow RE, Carpenter CR, Grimshaw JM, Rabin BA, Fernandez ME, Brownson RC, 2015. A framework for enhancing the value of research for dissemination and implementation. Am. J. Public Health 105 (1), 49–57. 10.2105/AJPH.2014.302206.
  134. Neta G, Brownson RC, Chambers DA, 2018. Opportunities for epidemiologists in implementation science: A primer. Am. J. Epidemiol. 187 (5), 899–910. 10.1093/aje/kwx323.
  135. Nilsen P, 2015. Making sense of implementation theories, models and frameworks. Implem. Sci. 10 (1), 53. 10.1186/s13012-015-0242-0.
  136. Nilsen P, Bernhardsson S, 2019. Context matters in implementation science: A scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Serv. Res. 19 (1), 1–21. 10.1186/s12913-019-4015-3.
  137. NIOSH, 2013. Quality of Worklife Questionnaire. Available at https://www.cdc.gov/niosh/topics/stress/qwlquest.html. Accessed January 9, 2021.
  138. NIOSH, 2020. What is Total Worker Health? Available at: https://www.cdc.gov/niosh/twh/default.html. Accessed January 9, 2021.
  139. Nold A, Bochmann F, 2010. Examples of evidence-based approaches in accident prevention. Saf. Sci. 48 (8), 1044–1049. 10.1016/j.ssci.2010.02.009.
  140. Norton WE, Loudon K, Chambers DA, Zwarenstein M, 2021. Designing provider-focused implementation trials with purpose and intent: introducing the PRECIS-2-PS tool. Implem. Sci. 16 (1), 1–11.
  141. Nykänen M, Guerin RJ, Vuori J, 2021. Identifying the “active ingredients” of a school-based, workplace safety and health training intervention. Prev. Sci. 22 (7), 1001–1011.
  142. Palinkas LA, Cooper BR, 2018. Mixed methods evaluation in dissemination and implementation science. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 336–353.
  143. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J, 2011. Mixed method designs in implementation research. Admin. Policy Mental Health Mental Health Serv. Res. 38 (1), 44–53. 10.1007/s10488-010-0314-z.
  144. Pawson R, 2013. The Science of Evaluation: A Realist Manifesto. Sage, Thousand Oaks, CA.
  145. Perez Jolles M, Lengnick-Hall R, Mittman BS, 2019. Core functions and forms of complex health interventions: a patient-centered medical home illustration. J. Gen. Intern. Med. 34 (6), 1032–1038. 10.1007/s11606-018-4818-7.
  146. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, Kirchner JE, 2015. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implem. Sci. 10 (1), 21. 10.1186/s13012-015-0209-1.
  147. Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, Mandell DS, 2017. Methods to improve the selection and tailoring of implementation strategies. J. Behav. Health Serv. Res. 44 (2), 177–194. 10.1007/s11414-015-9475-6.
  148. Proctor EK, Chambers DA, 2017. Training in dissemination and implementation research: a field-wide perspective. Transl. Behav. Med. 7 (3), 624–635. 10.1007/s13142-016-0406-8.
  149. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, Padek M, 2015. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implem. Sci. 10 (1), 88. 10.1186/s13012-015-0274-5.
  150. Proctor EK, Powell BJ, McMillen JC, 2013. Implementation strategies: recommendations for specifying and reporting. Implem. Sci. 8 (1), 1–11. 10.1186/1748-5908-8-139.
  151. Proctor EK, Powell BJ, Baumann AA, Hamilton AM, Santens RL, 2012. Writing implementation research grant proposals: ten key ingredients. Implem. Sci. 7 (1), 96–108. 10.1186/1748-5908-7-96.
  152. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M, 2011. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin. Policy Mental Health Mental Health Serv. Res. 38 (2), 65–76.
  153. Rabin BA, Brownson RC, 2018. Terminology for dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 19–45.
  154. Rabin BA, Brownson RC, Haire-Joshu D, Kreuter MW, Weaver NL, 2008. A glossary for dissemination and implementation research in health. J. Public Health Manage. Pract. 14 (2), 117–123. 10.1097/01.PHH.0000311888.06252.bb.
  155. Rabin BA, Glasgow RE, Kerner JF, Klump MP, Brownson RC, 2010. Dissemination and implementation research on community-based cancer prevention: a systematic review. Am. J. Prev. Med. 38 (4), 443–456. 10.1016/j.amepre.2009.12.035.
  156. Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, Glasgow RE, 2018. Systematic, multimethod assessment of adaptations across four diverse health systems interventions. Front. Public Health 6, 102. 10.3389/fpubh.2018.00102.
  157. RE-AIM.org, n.d. Planning and Evaluation Questions for Initiatives Intended to Produce Public Health Impact. Available at https://www.re-aim.org/resources-and-tools/self-rating-quiz/. Accessed January 9, 2021.
  158. Redeker NS, Caruso CC, Hashmi SD, Mullington JM, Grandner M, Morgenthaler TI, 2019. Workplace interventions to promote sleep health and an alert, healthy workforce. J. Clin. Sleep Med. 15 (4), 649–657. 10.5664/jcsm.7734.
  159. Richardson KM, Rothstein HR, 2008. Effects of occupational stress management intervention programs: a meta-analysis. J. Occupat. Health Psychol. 13 (1), 69–93. 10.1037/1076-8998.13.1.69.
  160. Robertson MM, Tubbs D, Henning RA, Nobrega S, Calvo A, Murphy LA, 2021. Assessment of organizational readiness for participatory occupational safety, health and well-being programs. Work 69 (4), 1317–1342. 10.3233/WOR-213552.
  161. Robson LS, Stephenson CM, Schulte PA, Amick BCI, Irvin EL, Eggerth DE, Chan S, Bielecky AR, Wang AM, Heidotting TL, Peters RH, Clarke JA, Cullen K, Rotunda CJ, Grubb PL, 2012. A systematic review of the effectiveness of occupational health and safety training. Scand. J. Work Environ. Health 38 (3), 193–208.
  162. Rogers EM, 2003. Diffusion of Innovations, fifth ed. Free Press, New York.
  163. Rohrbach LA, Dent CW, Skara S, Sun P, Sussman S, 2007. Fidelity of implementation in Project Towards No Drug Abuse (TND): a comparison of classroom teachers and program specialists. Prev. Sci. 8 (2), 125. 10.1007/s11121-006-0056-z.
  164. Rondinone BM, Boccuni F, Iavicoli S, 2010. Trends and priorities in occupational health research and knowledge transfer in Italy. Scand. J. Work Environ. Health 36 (4), 339–348.
  165. Sallis JF, Owen N, Fisher EB, 2008. Ecological models of health behavior. In: Glanz K, Rimer BK, Viswanath K (Eds.), Health Behavior and Health Education: Theory, Research, and Practice, fourth ed. Jossey-Bass, San Francisco, CA, pp. 466–485.
  166. Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ, 2014. Organizational readiness for implementing change: A psychometric assessment of a new measure. Implem. Sci. 9 (1), 1–15. 10.1186/1748-5908-9-7.
  167. Shelton RC, Chambers DA, Glasgow RE, 2020a. An extension of RE-AIM to enhance sustainability: addressing dynamic context and promoting health equity over time. Front. Public Health 8, 134. 10.3389/fpubh.2020.00134.
  168. Shelton RC, Lee M, Brotzman LE, Wolfenden L, Nathan N, Wainberg ML, 2020b. What is dissemination and implementation science?: an introduction and opportunities to advance behavioral medicine and public health globally. Int. J. Behav. Med. 27 (1), 3–20. 10.1007/s12529-020-09848-x.
  169. Shelton RC, Cooper BR, Stirman SW, 2018. The sustainability of evidence-based interventions and practices in public health and health care. Annu. Rev. Public Health 39, 55–76. 10.1146/annurev-publhealth-040617-014731.
  170. Schelvis RMC, Oude Hengel KM, Burdorf A, Blatter BM, Strijk JE, van der Beek AJ, 2015. Evaluation of occupational health interventions using a randomized controlled trial: challenges and alternative research designs. Scand. J. Work Environ. Health 41 (5), 491–503. 10.5271/sjweh.3505.
  171. Schulte PA, Cunningham TR, Nickels L, Felknor S, Guerin R, Blosser F, Chang C-C, Check P, Eggerth D, Flynn M, Forrester C, Hard D, Hudson H, Lincoln J, McKernan LT, Pratap P, Stephenson CM, Van Bogaert D, Menger-Ogle L, 2017. Translation research in occupational safety and health: A proposed framework. Am. J. Ind. Med. 60 (12), 1011–1022.
  172. Schulte PA, Okun A, Stephenson CM, Colligan M, Ahlers H, Gjessing C, Loos G, Niemeier RW, Sweeney MH, 2003. Information dissemination and use: critical components in occupational safety and health. Am. J. Ind. Med. 44 (5), 515–531.
  173. Schulte PA, Delclos G, Felknor SA, Chosewood LC, 2019. Toward an expanded focus for occupational safety and health: a commentary. Int. J. Environ. Res. Public Health 16 (24), 4946. 10.3390/ijerph16244946.
  174. Schwatka NV, Tenney L, Dally MJ, Scott J, Brown CE, Weitzenkamp D, Shore E, Newman LS, 2018. Small business Total Worker Health: A conceptual and methodological approach to facilitating organizational change. Occup. Health Sci. 2 (1), 25–41.
  175. Sinclair R, Cunningham TR, Schulte P, 2013. A model for occupational safety and health intervention in small businesses. Am. J. Ind. Med. 56 (12), 1442–1451.
  176. Skolarus TA, Lehmann T, Tabak RG, Harris J, Lecy J, Sales AE, 2017. Assessing citation networks for dissemination and implementation research frameworks. Implem. Sci. 12 (1), 1–17. 10.1186/s13012-017-0628-2.
  177. Smith JD, Berkel C, Rudo-Stern J, Montaño Z, St. George SM, Prado G, Mauricio AM, Chiapa A, Bruening MM, Dishion TJ, 2018. The Family Check-Up 4 Health (FCU4Health): Applying implementation science frameworks to the process of adapting an evidence-based parenting program for prevention of pediatric obesity and excess weight gain in primary care. Front. Public Health 6. 10.3389/fpubh.2018.00293.
  178. Stanick CF, Halko HM, Nolen EA, Powell BJ, Dorsey CN, Mettert KD, Weiner BJ, Barwick M, Wolfenden L, Damschroder LJ, Lewis CC, 2021. Pragmatic measures for implementation research: development of the Psychometric and Pragmatic Evidence Rating Scale (PAPERS). Transl. Behav. Med. 11 (1), 11–20. 10.1093/tbm/ibz164.
  179. Stirman SW, Baumann AA, Miller CJ, 2019. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implem. Sci. 14, 58. 10.1186/s13012-019-0898-y.
  180. Stirman SW, Miller CJ, Toder K, Calloway A, 2013. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implem. Sci. 8 (1), 65. 10.1186/1748-5908-8-65.
  181. Strifler L, Cardoso R, McGowan J, Cogo E, Nincic V, Khan PA, Scott A, Ghassemi M, MacDonald H, Lai Y, Treister V, Tricco AC, Straus SE, 2018. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J. Clin. Epidemiol. 100, 92–102.
  182. Storm JF, LePrevost CE, Tutor-Marcom R, Cope WG, 2016. Adapting certified safe farm to North Carolina agriculture: an implementation study. J. Agromed. 21 (3), 269–283. 10.1080/1059924X.2016.1180273.
  183. Tabak RG, Hook M, Chambers DA, Brownson RC, 2018. The conceptual basis for dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 73–88.
  184. Tabak RG, Khoong EC, Chambers D, Brownson RC, 2013. Models in dissemination and implementation research: useful tools in public health services and systems research. Front. Public Health Serv. Syst. Res. 2 (1), 8. 10.13023/FPHSSR.0201.08.
  185. Tamers SL, Goetzel R, Kelly KM, Luckhaupt S, Nigam J, Pronk NP, Rohlman DS, Baron S, Brosseau LM, Bushnell T, Campo S, Chang C-C, Childress A, Chosewood LC, Cunningham T, Goldenhar LM, Huang T-K, Hudson H, Linnan L, Newman LS, Olson R, Ozminkowski RJ, Punnett L, Schill A, Scholl J, Sorensen G, 2018. Research methodologies for Total Worker Health®: Proceedings from a workshop. J. Occup. Environ. Med. 60 (11), 968–978.
  186. Tamers SL, Streit J, Pana-Cryan R, Ray T, Syron L, Flynn MA, Castillo D, Roth G, Geraci C, Guerin R, Schulte P, Henn S, Chang CC, Felknor S, Howard J, 2020. Envisioning the future of work to safeguard the safety, health, and well-being of the workforce: A perspective from the CDC’s National Institute for Occupational Safety and Health. Am. J. Ind. Med. 63 (12), 1065–1084. 10.1002/ajim.23183.
  187. Tenney L, Fan W, Dally M, Scott J, Haan M, Rivera K, Newman LS, 2019. Health links™ assessment of total worker health® practices as indicators of organizational behavior in small business. J. Occup. Environ. Med. 61 (8), 623–634. 10.1097/JOM.0000000000001623.
  188. Teufer B, Ebenberger A, Affengruber L, Kien C, Klerings I, Szelag M, Griebler U, 2019. Evidence-based occupational health and safety interventions: a comprehensive overview of reviews. BMJ Open 9 (12), e032528. 10.1136/bmjopen-2019-032528.
  189. Tinc PJ, Gadomski A, Sorensen JA, Weinehall L, Jenkins P, Lindvall K, 2018. Adapting the T0–T4 implementation science model to occupational health and safety in agriculture, forestry, and fishing: A scoping review. Am. J. Ind. Med. 61 (1), 51–62. 10.1002/ajim.22787.
  190. Tinc PJ, Jenkins P, Sorensen JA, Weinehall L, Gadomski A, Lindvall K, 2020. Key factors for successful implementation of the National Rollover Protection Structure Rebate Program: A correlation analysis using the consolidated framework for implementation research. Scand. J. Work Environ. Health 46 (1), 85–95.
  191. Trinkley KE, Kahn MG, Bennett TD, Glasgow RE, Haugen H, Kao DP, Matlock DD, 2020. Integrating the practical robust implementation and sustainability model with best practices in clinical decision support design: implementation science approach. J. Med. Int. Res. 22 (10), e19676. 10.2196/19676.
  192. VA Office of Rural Health, n.d. Rural Promising Practices. Available at https://www.ruralhealth.va.gov/providers/promising_practices.asp.
  193. Van Eerd D, Saunders R, 2017. Integrated knowledge transfer and exchange: An organizational approach for stakeholder engagement and communications. Scholarly Res. Commun. 8 (1). 10.22230/src.2017v8n1a274.
  194. Viester L, Verhagen EA, Bongers PM, van der Beek AJ, 2014. Process evaluation of a multifaceted health program aiming to improve physical activity levels and dietary patterns among construction workers. J. Occup. Environ. Med. 56 (11), 1210–1217. 10.1097/JOM.0000000000000250.
  195. Vinson CA, Stamatakis KA, Kerner JF, 2018. Dissemination and implementation research in community and public health settings. In: Brownson RC, Colditz GA, Proctor EK (Eds.), Dissemination and Implementation Research in Health: Translating Science to Practice, second ed. Oxford University Press, New York, NY, pp. 355–370.
  196. von Thiele Schwarz U, Aarons GA, Hasson H, 2019. The Value Equation: Three complementary propositions for reconciling fidelity and adaptation in evidence-based practice implementation. BMC Health Serv. Res. 19 (1), 1–10. 10.1186/s12913-019-4668-y.
  197. von Thiele Schwarz U, Hasson H, Lindfors P, 2015. Applying a fidelity framework to understand adaptations in an occupational health intervention. Work 51 (2), 195–203.
  198. Waltz TJ, Powell BJ, Matthieu MM, Damschroder LJ, Chinman MJ, Smith JL, Kirchner JE, 2015. Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: results from the Expert Recommendations for Implementing Change (ERIC) study. Implem. Sci. 10 (1), 109. 10.1186/s13012-015-0295-0.
  199. Weiner BJ, 2021. Assessing Multilevel Contexts. Presentation at the 2021 Colorado Pragmatic Research in Health Conference, virtual.
  200. Weiner BJ, 2009. A theory of organizational readiness for change. Implem. Sci. 4, 67. 10.1186/1748-5908-4-67.
  201. Weiner BJ, Amick H, Lee SYD, 2008. Review: Conceptualization and measurement of organizational readiness for change. A review of the literature in health services research and other fields. Med. Care Res. Rev. 65 (4), 379–436. 10.1177/1077558708317802.
  202. Weiner BJ, Clary AS, Klaman SL, Turner K, Alishahi-Tabriz A, 2020. Organizational readiness for change: What we know, what we think we know, and what we need to know. In: Implementation Science 3.0. Springer, Cham, Switzerland, pp. 101–144.
  203. Weiner BJ, Lewis CC, Stanick C, Powell BJ, Dorsey CN, Clary AS, Halko H, 2017. Psychometric assessment of three newly developed implementation outcome measures. Implem. Sci. 12 (1), 1–12. 10.1186/s13012-017-0635-3.
  204. Westfall JM, Mold J, Fagnan L, 2007. Practice-based research—“Blue Highways” on the NIH roadmap. JAMA 297 (4), 403–406. 10.1001/jama.297.4.403.
  205. Wilson KM, Brady TJ, Lesesne C, on behalf of the NCCDPHP Work Group on Translation, 2011. An organizing framework for translation in public health: the Knowledge to Action Framework. Prevent. Chronic Dis. 8 (2), A46. Available at http://www.cdc.gov/pcd/issues/2011/mar/10_0012.htm. Accessed July 3, 2020.
  206. Wolfenden L, Williams CM, Wiggers J, Nathan N, Sze LY, 2016. Improving the translation of health promotion interventions using effectiveness-implementation hybrid designs in program evaluations. Health Promot. J. Australia 27 (3), 204–207. 10.1071/HE16056.
  207. Wolfenden L, Goldman S, Stacey FG, Grady A, Kingsland M, Williams CM, Yoong SL, 2018. Strategies to improve the implementation of workplace-based policies or practices targeting tobacco, alcohol, diet, physical activity and obesity. Cochrane Database Syst. Rev. 11. 10.1002/14651858.CD012439.pub2.
  208. Woodward EN, Singh RS, Ndebele-Ngwenya P, Melgar Castillo A, Dickson KS, Kirchner JE, 2021. A more practical guide to incorporating health equity domains in implementation determinant frameworks. Implem. Sci. Commun. 2 (1), 61. 10.1186/s43058-021-00146-5.
  209. Woodward EN, Matthieu MM, Uchendu US, Rogal S, Kirchner JE, 2019. The health equity implementation framework: proposal and preliminary study of hepatitis C virus treatment. Implem. Sci. 14 (1), 1–18. 10.1186/s13012-019-0861-y.
  210. Zimmerman L, Lounsbury DW, Rosen CS, Kimerling R, Trafton JA, Lindley SE, 2016. Participatory system dynamics modeling: increasing stakeholder engagement and precision to improve implementation planning in systems. Admin. Policy Mental Health Mental Health Serv. Res. 43 (6), 834–849. 10.1007/s10488-016-0754-1.
  211. Zohar D, Polachek T, 2014. Discourse-based intervention for modifying supervisory communication as leverage for safety climate and performance improvement: A randomized field study. J. Appl. Psychol. 99 (1), 113.
