Abstract
Background
Effective implementation of evidence-based practices (EBPs) remains a significant challenge. Numerous existing models and frameworks identify key factors and processes to facilitate implementation. However, there is a need to better understand how individual models and frameworks are applied in research projects, how they can support the implementation process, and how they might advance implementation science. This systematic review examines and describes the research application of a widely used implementation framework, the Exploration, Preparation, Implementation, Sustainment (EPIS) framework.
Methods
A systematic literature review was performed to identify and evaluate the use of the EPIS framework in implementation efforts. Citation searches in PubMed, Scopus, PsycINFO, ERIC, Web of Science, Social Sciences Index, and Google Scholar databases were undertaken. Data extraction included the objective, language, country, setting, sector, EBP, study design, methodology, level(s) of data collection, unit(s) of analysis, use of EPIS (i.e., purpose), implementation factors and processes, EPIS stages, implementation strategy, implementation outcomes, and overall depth of EPIS use (rated on a 1–5 scale).
Results
In total, 762 full-text articles were screened by four reviewers, resulting in inclusion of 67 articles, representing 49 unique research projects. All included projects were conducted in public sector settings. The majority of projects (73%) investigated the implementation of a specific EBP. Most projects (90%) examined inner context factors, 57% examined outer context factors, 37% examined innovation factors, and 31% examined bridging factors (i.e., factors that cross or link the outer system and inner organizational context). On average, projects measured EPIS factors across two of the EPIS phases (M = 2.02), with the most frequent phase being Implementation (73%). On average, the overall depth of EPIS inclusion was moderate (2.8 out of 5).
Conclusion
This systematic review enumerated multiple settings and ways the EPIS framework has been applied in implementation research projects, and summarized promising characteristics and strengths of the framework, illustrated with examples. Recommendations for future use include more precise operationalization of factors, increased depth and breadth of application, development of aligned measures, and broadening of user networks. Additional resources supporting the operationalization of EPIS are available.
Electronic supplementary material
The online version of this article (10.1186/s13012-018-0842-6) contains supplementary material, which is available to authorized users.
Keywords: Implementation, Framework, Model, Theory, Outer context, Inner context, Process, Systematic review, Diffusion of innovations
Background
Effective implementation of evidence-based interventions, treatments, or innovations (hereafter referred to as evidence-based practices [EBPs]) to address complex and widespread public health issues remains a significant challenge. Our ability to effectively implement an EBP is as important as treatment effectiveness because failed implementation efforts are often the underlying reason for lack of EBP effectiveness or impact in health and social care systems and organizations [1–3]. There are numerous frameworks, models, and theories that identify key factors, and sometimes processes, to facilitate EBP implementation [4–6]. Such implementation frameworks are commonly used to help select and structure research questions, methods, strategies, measures, and results. While an increasing number of studies use implementation frameworks, the ways in which these frameworks are used or operationalized are not well described, and their theoretical and practical utility is often left unexamined [7].
The present study is a systematic review of one highly cited and widely used implementation framework, the Exploration, Preparation, Implementation, Sustainment (EPIS) framework [8]. Until recently, this comprehensive framework has had limited prescriptive guidance for its use. The EPIS framework was developed from an examination of the literature on implementation in public sector social and allied health service systems (e.g., mental health, substance use disorder treatment, social care, child welfare) in the USA, but has applicability in other countries and other settings. This study determines how EPIS has been applied and how widely the framework has been disseminated, adopted, and implemented in diverse health, allied health, and social care sectors, and further afield.
The EPIS framework
As shown in Fig. 1, EPIS has four key components: well-defined phases that describe the implementation process; identification of the outer system and inner organizational contexts and their associated factors; innovation factors that relate to the characteristics of the innovation/EBP being implemented; and bridging factors that capture the dynamics, complexity, and interplay of the outer and inner contexts [8].
The first key component of EPIS is the four phases of the implementation process: Exploration, Preparation, Implementation, and Sustainment (EPIS). In the Exploration phase, a service system, organization, research group, or other stakeholder(s) consider the emergent or existing health needs of patients, clients, or communities; work to identify the best EBP(s) to address those needs; and subsequently decide whether to adopt the identified EBP. In addition, consideration is given to what might need to be adapted at the system, organization, and/or individual level(s) and to the EBP itself. The Exploration phase begins when implementers and relevant stakeholders are aware of a clinical or public health need and are considering ways to address that need. The implementers move into the next phase, Preparation, upon deciding to adopt one or more EBPs or innovations. In the Preparation phase, the primary objectives are to identify potential barriers and facilitators of implementation, further assess needs for adaptation, and develop a detailed implementation plan that capitalizes on implementation facilitators and addresses potential barriers. Critical within the Preparation phase is planning of implementation supports (e.g., training, coaching, audit and feedback) to facilitate use of the EBP in the next two phases (Implementation and Sustainment) and to develop an implementation climate that indicates that EBP use is expected, supported, and rewarded [9]. In the Implementation phase, guided by the planned implementation supports from the Preparation phase, EBP use is initiated and instantiated in the system and/or organization(s). It is essential that ongoing monitoring of the implementation process is incorporated to assess how implementation is proceeding and to adjust implementation strategies accordingly. In the Sustainment phase, the outer and inner context structures, processes, and supports continue to operate so that the EBP continues to be delivered, with adaptation as necessary, to realize the intended public health impact of the implemented EBP.
The second key component of the EPIS framework is the articulation of contextual levels and factors comprising the outer system context and the inner organizational context. Within each phase, outer and inner context factors that can be instrumental to the implementation process are highlighted, many of which apply across multiple implementation phases. The outer context describes the environment external to the organization and can include the service and policy environment and characteristics of the individuals who are the targets of the EBP (e.g., patients, consumers). The outer context also includes inter-organizational relationships between entities, including governments, funders, managed care organizations, professional societies, and advocacy groups, that influence and make the outer context dynamic. For example, child welfare and mental health systems may collaborate on the development and implementation of a coordinated care program for youth served in both sectors. The inner context refers to characteristics within an organization such as leadership, organizational structures and resources, internal policies, staffing, practices, and characteristics of individual adopters (e.g., clinicians or practitioners). Within the inner context, there are multiple levels that vary by organization or discipline and may include executive management, middle management, team leaders, or direct service providers (e.g., clinicians, practitioners). Together, the inner and outer contexts reflect the complex, multilayered, and highly interactive nature of the socioecological context of health and allied healthcare that is noted in many implementation frameworks [8, 10–12].
The third key component of EPIS is the set of factors that relate to the EBP or innovation itself. There is an emphasis on the fit of the EBP to be implemented with the system and patient/client population (outer context), as well as with the organization and provider (inner context). This implies that some adaptation of the EBP will likely be necessary. The aim is to maintain the core components of an EBP and adapt the periphery.
The fourth and final key component of EPIS is the recognition of the interconnectedness of and relationships between outer and inner context entities, which we refer to as bridging factors. Bridging factors influence the implementation process because the inner context of an organization is shaped by the outer system in which the organization operates, and those influences are reciprocal (e.g., industry lobbyists affecting pharmacy legislation, or direct-to-consumer marketing). For example, hospitals are subject to federal, state, and local policies that define certification and reporting requirements.
EPIS considers that adaptation (often involving implementation strategies) will likely be necessary in regard to the outer and inner contexts as well as to the EBP. This is supported by recent work identifying the need for a dynamic approach to adaptation that involves all relevant stakeholders through the four EPIS phases in order to capitalize on the knowledge and experience of the implementation team and maximize the ability to find solutions that are acceptable to all stakeholders [13]. Furthermore, this is consistent with calls for consideration of the need for adaptation in EBP sustainment [14, 15]. This emphasis on adaptation to improve fit within the EPIS framework is akin to what others have identified as fostering values-innovation fit [16, 17]. The values-innovation fit concept proposes that innovation implementation will be more successful if there is a high degree of fit between the values and needs of implementers and the characteristics of the innovation to be implemented [16]. For example, one implementation strategy may be to develop system and organizational climates that support such a values-innovation fit [18]. EPIS also explicitly identifies the importance of EBP characteristics and the role of EBP developers and purveyors/intermediaries (i.e., those who support the implementation process) throughout the process of implementation and demonstration of effectiveness. This is especially important when considering values-innovation fit and identifying potential adaptations to increase EBP fit within a specific setting while preserving fidelity to the EBP core elements that are responsible for clinical or service outcomes.
The degree to which these components of EPIS are identified, operationalized, and studied in the published literature is unclear. To address this gap, we conducted a systematic review of the literature to describe how the EPIS implementation framework has been used in peer-reviewed, published studies. This review (1) describes EPIS use in implementation research to date and (2) makes recommendations for using the EPIS framework to advance implementation science theory, research, and practice.
Methods
A multi-step process was used to identify, review, and analyze the use of the EPIS framework.
Search strategy
A systematic search of the literature was executed in May 2017 to locate studies published in academic journals that utilized the EPIS framework. The search strategy was based on cited reference searching of the original EPIS article titled “Advancing a Conceptual Model of Evidence-Based Practice Implementation in Public Service Sectors” [8]. The title was used as the TITLE search term in each database. The following seven databases and search criteria were used: (1) PubMed single citation matcher (TITLE in ALL FIELDS CITED by), (2) Scopus: (TITLE in REFERENCES), (3) PsycINFO: (TITLE in REFERENCE and PEER REVIEWED), (4) ERIC: (TITLE in ANYWHERE), (5) Web of Science: (TITLE AND THE “TIMES CITED” LINK FOR FULL LIST OF CITATIONS OF THE ORIGINAL EPIS PAPER), (6) Social Sciences Index (TITLE in All Text Fields), and (7) Google Scholar: (TITLE Cited By).
Our initial search criteria were anchored to the original EPIS citation [8] to ensure complete identification of articles that had used the EPIS framework. We utilized the title of the original article rather than the more recently accepted acronym that follows from the EPIS phases. As such, all records were published between 2011 (when the EPIS framework was published) and 2017 (when the search was conducted).
Prior to assessment against the inclusion and exclusion criteria, duplicate records were removed.
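For illustration, the PubMed arm of such a cited-reference search can be automated. The following is a minimal sketch using Biopython's Entrez E-utilities wrapper; the PMID, contact address, and variable names are assumptions for illustration and were not part of the review's actual protocol.

```python
# A minimal sketch of automating the PubMed arm of a cited-reference search.
from Bio import Entrez

Entrez.email = "reviewer@example.org"  # NCBI requires a contact email
EPIS_PMID = "21197565"  # assumed PMID of Aarons et al. (2011); verify before use

# elink with linkname "pubmed_pubmed_citedin" returns PubMed records citing the ID
handle = Entrez.elink(dbfrom="pubmed", db="pubmed",
                      linkname="pubmed_pubmed_citedin", id=EPIS_PMID)
record = Entrez.read(handle)
handle.close()

citing_pmids = {
    link["Id"]
    for linkset in record
    for linksetdb in linkset.get("LinkSetDb", [])
    for link in linksetdb["Link"]
}

# Records from the other databases (Scopus, PsycINFO, etc.) would be merged and
# de-duplicated here, mirroring the duplicate-removal step described above.
print(f"{len(citing_pmids)} unique PubMed records citing the EPIS paper")
```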
Inclusion and exclusion criteria
Inclusion criteria were as follows:
Report on an original (empirical) research study
Published in a peer-reviewed journal
Study design or implementation project focused on dissemination, implementation, or sustainment, including hybrid designs
Utilized the EPIS framework to guide study theory, design, data collection, measurement, coding, analysis, and/or reporting
Papers were excluded if they were conceptual (e.g., commentary, debate) rather than empirical research or a synthesis, or if EPIS was cited as one of a list of frameworks, theories, or models but was not used in a meaningful way.
Data collection
Four reviewers (JM, KD, NS, BR) assessed titles, abstracts, and full articles for inclusion. Each article was independently assessed by two reviewers. Papers for which there was a difference of opinion regarding inclusion (n = 74; 9.7%) were assessed by a third reviewer (GA).
Data extraction
Each article was critically appraised by two reviewers independently. Reviewers extracted the data summarized in Table 1 from each included article; a minimal data-structure sketch of this coding scheme follows the table. Refer to Additional file 1 for the extracted data for each article.
Table 1.
Author | List of authors |
---|---|
Year | Year of publication |
Objective | Summary of publication’s objective(s) |
Country | Country where implementation efforts were conducted |
Setting | Physical setting where implementation took place (e.g., mental health clinic, church, community center, primary care) |
Sector | Sector (e.g., psychology, social work, mental health, behavioral health, public health) |
EBP, Innovation or Intervention | Specific EBP (i.e., the innovation or intervention) implemented |
Health focus | Whether a health focus was reported (yes/no) |
Study design | Study design as reported in the paper (e.g., prospective, retrospective, hybrid implementation, case study, participant observation) |
Study methodology | Study methodology (i.e., qualitative, quantitative, or mixed) |
Larger study design | Methodology of the larger study, if the effort was part of a larger study |
Type of EPIS use | How EPIS was used (e.g., study design, data collection, measurement, analysis, coding, and/or reporting/interpretation) |
Level of data collection | Level(s) of data collection (e.g., outer context, inner context, multilevel) |
Level of analysis | Level(s) of analysis (e.g., provider, team, supervisor, organization, system) |
Outer context | Whether outer context factors were assessed (yes/no) |
Inner context | Whether inner context factors were assessed (yes/no) |
Innovation factors | Whether innovation factors were assessed (yes/no) |
Bridging factors | Whether other bridging factors within EPIS were assessed (yes/no) |
Implementation strategy | Whether there was a researcher (co)initiated implementation strategy (yes/no) and reported (yes/no) |
Implementation outcomes | Implementation outcomes (e.g., feasibility, adoption, fidelity) |
Stages | EPIS phase(s) in which implementation factors were assessed: Exploration, Preparation, Implementation, Sustainment. Phase(s) were rated for the degree to which the authors were explicit in their use, where 0 = phase not included, 1 = implicit inclusion of phase, 2 = explicit inclusion of phase. Explicit inclusion means the authors overtly named the EPIS phase(s) included in their study, while implicit inclusion was judged by the reviewers based on the EPIS phase definitions |
Depth | Overall depth of inclusion of EPIS, from 1 = conceptual (e.g., inner and outer context factors were applied to study design but not carried through the study and evaluation phases) to 5 = operationalized (e.g., examined factors that were incorporated throughout the paper [introduction, design, measurement, conclusions] or included all phases) |
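To make the coding scheme in Table 1 concrete, the following is a minimal sketch of how one extraction record might be represented in code. The field names are hypothetical illustrations, not the review team's actual instrument.

```python
# A minimal sketch of one extraction record, based on Table 1.
from dataclasses import dataclass, field
from enum import IntEnum


class PhaseUse(IntEnum):
    NOT_INCLUDED = 0  # phase not included
    IMPLICIT = 1      # inclusion judged by reviewers from phase definitions
    EXPLICIT = 2      # authors overtly named the phase


@dataclass
class ExtractionRecord:
    authors: list[str]
    year: int
    objective: str
    country: str
    setting: str
    sector: str
    methodology: str         # "qualitative", "quantitative", or "mixed"
    outer_context: bool      # outer context factors assessed?
    inner_context: bool      # inner context factors assessed?
    innovation_factors: bool
    bridging_factors: bool
    implementation_outcomes: list[str] = field(default_factory=list)
    phases: dict[str, PhaseUse] = field(default_factory=lambda: {
        p: PhaseUse.NOT_INCLUDED
        for p in ("Exploration", "Preparation", "Implementation", "Sustainment")
    })
    depth: int = 1           # 1 = conceptual ... 5 = operationalized
```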
Additional classifications of implementation factors from the original 2011 paper were added to the data extraction and to the EPIS framework figure. Factors concerning the innovation itself, and factors connecting the innovation and the two contexts, appeared frequently in the reviewed projects but received limited emphasis in the original 2011 paper, which highlighted the need for additional terms to better represent factors that fall outside the inner and outer contexts. As such, we refer to these as innovation factors (for example, innovation/EBP characteristics and innovation/EBP fit) and bridging factors, or factors that span the inner and outer contexts (for example, interagency collaboration and community-academic partnerships).
Synthesis of results
Data were recorded in an extraction table (Additional file 1). Following data extraction, the two reviewers assigned to each article met to compare their results, reach consensus on areas that required further discussion or review, and combine extracted data into a single entry in the data extraction table. Finally, articles belonging to the same parent project were grouped together; a sketch of this grouping step follows. In Additional file 1, projects that contain more than one article have a summary of the project reported first, followed by the articles within each parent project listed in italics.
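As an illustration of the grouping step, the sketch below collapses hypothetical article-level rows into one entry per parent project; the column names are assumptions, not the review's actual extraction table.

```python
# A minimal sketch, with hypothetical column names, of collapsing article-level
# extraction rows into one entry per parent project.
import pandas as pd

articles = pd.DataFrame({
    "project_id":  ["P01", "P01", "P02", "P03"],
    "article_id":  ["A1", "A2", "A3", "A4"],
    "methodology": ["qualitative", "quantitative", "mixed", "mixed"],
    "n_phases":    [2, 3, 1, 4],  # EPIS phases assessed in each article
})

projects = (
    articles.groupby("project_id")
            .agg(n_articles=("article_id", "nunique"),
                 n_phases=("n_phases", "max"))
            .reset_index()
)
print(projects)  # one row per parent project, as in Additional file 1
```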
Results
Our search identified 2122 records (Fig. 2). After removal of duplicates, 762 articles were screened (title, abstract, and full-text review as needed), of which 67 met the inclusion criteria. Articles were then grouped by parent project (a project could comprise one or multiple articles), resulting in a total of 49 unique research projects (see Additional file 1). Further results are reported by project.
Projects utilizing EPIS were conducted in 11 countries that span high-, low-, and middle-income contexts: USA (39 projects, plus 2 in collaboration with other countries), Canada (1 project in collaboration with USA), Mexico (1 project), Sweden (1 project), Norway (1 project in collaboration with USA), Belgium (1 project), Australia (2 projects), UK (1 project), Brazil (1 project), the Bahamas (1 project), and South Africa (1 project). In 47% of projects, one or more authors had a professional connection (e.g., sponsor or mentor on a training grant) or research collaboration (e.g., co-investigator on a grant, co-author on a scholarly product) with the lead author of the original EPIS framework paper, Dr. Gregory Aarons; 6% and 4% had a professional connection or collaboration with the second (Hurlburt) and third (Horwitz) authors, respectively.
The included projects spanned a variety of public sectors: public health, child welfare, mental or behavioral health, substance use, rehabilitation, juvenile justice, education, and school-based nursing. The child welfare and mental health sectors had the highest representation across the projects (13 and 19 projects, respectively). The physical setting for the projects varied from whole service systems (e.g., Mental Health Department) to organizational level settings (e.g., schools, child welfare agencies, substance use treatment programs, community-based organizations, health clinics). The scale of the projects ranged from case studies of a few organizations to very large studies across hundreds of organizations or several states and counties. All projects were in public sector settings.
The majority of projects (n = 36, 73%) investigated the implementation of one specific EBP. Two projects (4%) offered sites a selection of EBPs, while 11 projects (22%) implemented the concept of EBP rather than a specific practice. A health focus was reported in 8 projects (16%). These foci included (but were not limited to) maltreatment, behavioral problems, mental health or substance use, human immunodeficiency virus (HIV), Parkinson's disease, teen pregnancy, and workplace disability.
Methodologies were fairly evenly divided across the projects. Eleven projects (22%) used quantitative methods, 13 (27%) qualitative methods, and 26 (53%) mixed methods. Projects often produced separate articles focusing on qualitative and quantitative methodology, with 22 (32%) quantitative articles, 21 (31%) qualitative articles, and 25 (37%) mixed-methods articles. Data collection in the projects most often included assessment of multiple inner context levels (20 projects, 41%), followed by assessment of one inner context level (15 projects, 31%) and assessment of multiple outer and inner levels (14 projects, 29%). Data analyses conducted in the projects were usually multilevel; in total, only 12 projects (24%) were analyzed at a single level. Seven study protocols were included in the review; two used EPIS only in the study design, while three had subsequent results articles included. Seven projects employed EPIS only in analysis, coding, and/or reporting. Four projects used EPIS to frame the study and then in reporting, but not in study design, data collection, or measurement. Six projects (12%) used EPIS only as background to frame the study. The role of EPIS in the projects is summarized in Table 2.
Table 2.
Purpose | Projects | Articles
---|---|---
Study design | 37 | 28 |
Reporting | 37 | 28 |
Framing | 34 | 23 |
Measurement | 24 | 18 |
Data collection | 23 | 18 |
Analysis | 18 | 14 |
Coding | 13 | 10 |
Note: Categories are not mutually exclusive. Included projects and articles may have used EPIS for multiple purposes
In terms of the outer and inner context, innovation, and bridging factors, we found some variation in reporting. Outer and inner context factors were reported in 57% (n = 28) and 90% (n = 44) of projects, respectively. Just over a third of projects included innovation factors (37%), and just under a third included bridging factors that spanned the inner and outer contexts (31%).
Regarding the EPIS phases, we noted variation in how explicitly projects identified the phases (i.e., whether authors overtly named them) and differences in which phases were included. Table 3 summarizes the distribution of the EPIS phase(s) examined and whether each phase was explicitly versus implicitly used. The majority of the included projects (78%) and articles (73%) explicitly focused on the Implementation phase, whereas a minority of projects (29%) and articles (23%) explicitly focused on the Exploration phase. On average, projects included 2.02 of the four phases (SD = 1.14). Table 4 reports the frequency with which EPIS factors were measured at each phase in the included articles. The most frequently measured factors across phases were organizational characteristics and individual adopter characteristics.
Table 3.
Phase | Projects explicit | Projects implicit | Articles explicit | Articles implicit
---|---|---|---|---
Exploration | 14 | 4 | 14 | 5 |
Preparation | 17 | 5 | 18 | 9 |
Implementation | 38 | 4 | 49 | 9 |
Sustainment | 18 | 5 | 23 | 6 |
Note: Phases are not mutually exclusive. A specific project or article may have focused on multiple phases
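The tallies in Table 3, and the per-project phase counts reported above, follow directly from the 0/1/2 phase coding described in Table 1. A minimal sketch of that tallying logic, run on hypothetical ratings, is shown below.

```python
# A minimal sketch of tallying explicit vs. implicit phase use per project,
# following the 0/1/2 coding in Table 1 (hypothetical ratings shown here).
from collections import Counter

PHASES = ("Exploration", "Preparation", "Implementation", "Sustainment")

project_ratings = [  # 0 = not included, 1 = implicit, 2 = explicit
    {"Exploration": 2, "Preparation": 2, "Implementation": 2, "Sustainment": 0},
    {"Exploration": 0, "Preparation": 1, "Implementation": 2, "Sustainment": 2},
]

explicit, implicit = Counter(), Counter()
for ratings in project_ratings:
    for phase in PHASES:
        if ratings[phase] == 2:
            explicit[phase] += 1
        elif ratings[phase] == 1:
            implicit[phase] += 1

# Mean number of phases included per project (rated 1 or 2), as in "M = 2.02"
mean_phases = sum(sum(1 for p in PHASES if r[p] > 0)
                  for r in project_ratings) / len(project_ratings)
print(dict(explicit), dict(implicit), round(mean_phases, 2))
```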
Table 4.
Constructs | | E | P | I | S
---|---|---|---|---|---
Outer context | Service environment | 6 | 9 | 17 | 10 |
Policies | 2 | 6 | 12 | 6 | |
Funding/contracting | 8 | 12 | 20 | 14 | |
Leadership | 2 | 2 | 7 | 6 | |
Inter-organizational environment and networks | 9 | 8 | 15 | 12 | |
Client characteristics | 2 | 2 | 4 | 3 | |
Patient/client advocacy | 1 | 0 | 2 | 1 | |
Innovation factors* | Innovation/EBP developers | 0 | 0 | 0 | 0 |
Innovation/EBP characteristics | 1 | 1 | 1 | 1 | |
Innovation/EBP fit* | 5 | 6 | 14 | 9 | |
Bridging factors* | Community-academic partnerships | 1 | 1 | 2 | 2 |
Purveyors/intermediaries | 0 | 0 | 0 | 0 | |
Inner context | Organizational characteristics | 13 | 18 | 32 | 22 |
Culture | 4 | 6 | 11 | 5 | |
Climate | 7 | 9 | 13 | 6 | |
Leadership | 6 | 9 | 22 | 9 | |
Quality and fidelity monitoring/support | 2 | 4 | 8 | 3 | |
Organizational staffing processes | 7 | 9 | 27 | 15 | |
Individual characteristics | 16 | 24 | 34 | 19 | |
Attitudes towards EBPs | 9 | 10 | 17 | 9 | |
Demographics | 4 | 5 | 7 | 4 |
*Represents factors that are new and not defined in original EPIS 2011 paper
Frequency counts represent unique articles
Finally, to quantify the overall coverage of the EPIS framework, a rating from 1 (conceptual use of EPIS) to 5 (operationalized use of EPIS) was assigned to each project. The average depth rating was 2.8 out of 5, indicating a moderate depth of EPIS application.
Discussion
This review describes how one of the most commonly used D&I frameworks [7], the EPIS framework, has been used in practice and reported in the literature from its first publication in 2011 until mid-2017. A total of 49 unique research projects using EPIS, published in 67 peer-reviewed articles, were identified. Projects were conducted in 11 countries, across a range of public sector settings. While the EPIS framework was developed based on the literature on implementation in public social and allied health service systems in the USA, it appears to have broad applicability in other countries and other health and/or allied health settings.
The promise of implementation science models and frameworks is that they may allow for cross-setting and cross-country comparison of factors associated with implementation. Such comparisons can contribute to our understanding of optimal implementation strategies and the generalizability of concepts and constructs, support the harmonization of measures and evaluation practices, and help advance the field of implementation science and implementation practice. This review shows the promise and utility of EPIS to guide studies in various settings, topic areas, geopolitical locations, and regions of differing economic resources. For example, our results demonstrate that EPIS has been used in high-, middle-, and low-income countries, including Sweden [19], South Africa [20], and Mexico [21]. EPIS has also been used in diverse settings including public health [22], schools [23], and community health centers [24]. We encourage adaptation and use of EPIS beyond the settings tested to date.
There may be some tendency for frameworks, models, and theories to be used by those in aligned information and professional networks [25]; however, it is likely that a given framework may have broad applicability across settings. The EPIS framework is a relatively young model compared to some other implementation models and frameworks in the field. It is natural that after initial introduction and application, the network of users will broaden [26]. This has already been observed with EPIS, as more than 50% of research projects included in this review had no direct affiliation with the first author of the EPIS framework (Aarons). We expect that this natural diffusion of EPIS will continue and will be enhanced as more diverse examples of its use emerge. Moreover, we anticipate that more comprehensive use of EPIS, including such aspects as inter-organizational networks, innovation fit at system, organization, provider, and patient levels, may be enhanced through the examples, recommendations, and resources described in this review [27–30].
EPIS was developed as a comprehensive, stand-alone implementation framework containing the core components of implementation [4]: the implementation process is divided into four phases, and the potential influencing factors enumerated for each phase across the multilevel context can be evaluated quantitatively and qualitatively, allowing related implementation strategies to be tested. We reviewed the completeness and frequency with which the key components of EPIS have been used across research projects. The depth of EPIS inclusion was moderate, and we recommend more in-depth use in the articulation, application, and measurement of factors included in implementation frameworks. On the other hand, use of all components of a framework is not always feasible, practical, desirable, or necessary for a given implementation study or project [31], and many implementation frameworks do not include all the core components of implementation [4].
In terms of the process-related characteristics of the EPIS phases (i.e., moving through Exploration, Preparation, Implementation, and Sustainment), we found variability in the use of process-related aspects of EPIS, with the most frequent phase being Implementation. The majority of the research projects had the Implementation phase as their main focus, with much less emphasis on the Exploration, Preparation, and Sustainment phases. This finding is consistent with other literature suggesting that thoughtful planning for implementation and sustainment, which can occur in the Exploration and Preparation phases, is infrequent although critical [4]. It is also documented in the literature that attention to sustainment is sparse but imperative for ongoing and meaningful change, improvement in outcomes, and public health impact [32, 33]. We suggest that implementation researchers begin with sustainment in mind, as the ultimate goal of implementation efforts. This perspective does not preclude, and even embraces, the need for careful navigation through the Exploration, Preparation, and Implementation phases and for adaptations of the outer or inner contexts, implementation strategies, and the EBP(s). Examples of the use of EPIS in the Exploration and Preparation phases include projects that examined service providers' and supervisors'/administrators' attitudes towards and knowledge of an EBP to inform implementation strategies and adaptation efforts [34, 35].
Projects in this review varied in the depth with which EPIS was described and operationalized, with only some cases of EPIS being applied throughout the implementation phases. For the most benefit, it is desirable that implementation models and frameworks be used to inform all phases of the research process, from early development of implementation research questions through to presentation and dissemination of research findings. It is also true that frameworks have diverse strengths and might be more appropriate for certain purposes than others. Frameworks have been classified into five broad categories based on their primary purpose: process models, determinant frameworks, classic theories, implementation theories, and evaluation frameworks [5, 36]. For example, the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework has historically been used as a planning and evaluation framework [37, 38], and the Consolidated Framework for Implementation Research (CFIR) is frequently used as a determinant framework to guide qualitative methods and analyses [39]. EPIS can be classified in several of these categories, as it may be used to understand process, determinants, implementation, and evaluation. By guiding multiple components of implementation, the EPIS framework may serve several purposes, reducing the need to combine multiple frameworks.
It is critical to go beyond mentioning a framework in the introduction of a research grant or paper, or applying it only retrospectively during data analysis, without sufficient operationalization of the framework in the research process. A content review of U.S. National Institutes of Health grant proposal abstracts, funded and not funded, showed that one key characteristic of funded proposals was that the selected implementation framework(s) was better described and operationalized [40]. We recommend careful consideration and operationalization of framework components, and the use of theory to test and advance knowledge of which aspects of implementation frameworks are more or less useful in driving the implementation process and advancing implementation science. Greater depth and breadth of EPIS use would include providing descriptions of the implementation plan, the factors included in the project, and how and when the specified factors and processes are being assessed.
There was variability in the specific factors examined at each phase, although organizational and individual adopter characteristics were the most frequent factors across all phases. It is not surprising that inner context factors were most commonly assessed. The relatively higher frequency of measuring organizational and individual adopter characteristics may be influenced by the greater availability of quantitative measures of these factors in comparison to system-level factors (refer to Table 5 for a list of associated measures). A recent publication in the journal Implementation Science highlighted the need to better define and develop well-operationalized and pragmatic measures for the assessment of the external implementation context and bridging factors [41]. Access to existing measures is provided through a number of resources and publications [42–44]. More specifically, measures for various EPIS factors have been developed and tested through a number of studies; examples of these measures are provided in Table 5. Development and use of additional measures meeting these criteria is a high-priority area for implementation science.
Table 5.
Sample of EPIS factors | | Example quantitative measures
---|---|---
Outer context | Service environment | Sustainability assessment tool |
Policies | EBP-specific document review (i.e., speeches, regulations, annual reports; documented system-wide policy) | |
Funding/contracting | – | |
Leadership | Leadership Competence Scale of Program Sustainability Index [63] | |
Inter-organizational environment and networks | – | |
Patient/client characteristics* | Demographics, Administrative Claims Data | |
Patient/client advocacy* | – | |
Innovation factors* | Innovation/EBP developers | – |
Innovation/EBP characteristics | – | |
Innovation/EBP fit* | – | |
Bridging factors* | Community-academic partnerships* | – |
Purveyors/intermediaries | – | |
Inner context | Organizational characteristics | Group Innovation Inventory [64]; Implementation Climate Scale [65]; Level of Institutionalization Scale [66]; Organizational Climate Measure [67]; Organizational Culture and Climate via Children’s Services Survey [68]; Organizational Readiness for Change [69]; Organizational Social Context Survey [70]; Organizational Size; Program Sustainability Index [71]; Siegel Scale of Support of Innovation [72]
Culture | Organizational Culture and Climate via Children’s Services Survey [68]; Organizational Social Context Survey [70] | |
Climate | Implementation Climate Assessment [66]; Implementation Climate Scale [65, 72]; Organizational Climate Measure [67]; Organizational Social Context Survey [64]; Team Climate Inventory [73] | |
Readiness for change | Organizational Readiness for Change [69]; Readiness for Organizational Change [74] | |
Leadership | Implementation Leadership Scale [75]; Multifactor Leadership Questionnaire [76] | |
Quality and fidelity monitoring/support* | Adherence and Skill Checklist [77]; Assessment of Climate Embedding Mechanisms [18]; Examination of Common Dimensions of EBI(s); performance-based role-plays [78]; Therapist Procedures Checklist-Revised [79]; Therapist Procedures Observational Coding System [80] | |
Supportive coaching | Coaching records | |
Organizational staffing processes | Data regarding turnover rates and reasons | |
Individual characteristics | Demographics; Emotional Competency Inventory [81]; Evidence-Based Practice Attitudes Scale [82]; Knowledge of Evidence-Based Services Questionnaire [83]; Organizational Readiness for Change [69] | |
Attitudes towards EBPs | Evidence-Based Practice Attitudes Scale [82]; Perceived Characteristics of Intervention Scale; Barriers to Research Practice Scale [84] | |
Implementation citizenship* | Implementation Citizenship Behavior Scale [61] | |
Burnout* | Maslach Burnout Inventory [85] |
*Represents factors that are new and not defined in original EPIS 2011 paper
Examples are provided only for those factors that were measured in the review; “–” indicates that there were no quantitative measures in the included articles of this review
It is important to note that the role and relevance of factors within the inner and outer context might vary across phases. Some factors might be important throughout all phases (e.g., funding, organizational characteristics), while others might have heightened relevance during one or two phases (e.g., quality and fidelity monitoring/support during the Implementation and Sustainment phases). We also emphasize the importance of attending to bridging factors and the dynamic interplay between inner and outer context factors. We encourage those using the EPIS framework to use theory or a logic model of their particular implementation strategy and context to decide which factors are likely to be critical and relevant in their study [45, 46]. Detailed and deep use of implementation models and frameworks to identify specific implementation determinants, targets, and processes can help to address these concerns. The model developed from the EPIS framework for the Interagency Collaborative Team (ICT) project provides an example of interconnectedness and relationships between and within outer and inner context entities [47]. In the ICT project, a community-academic partnership was formed to bridge the outer and inner contexts. Furthermore, interagency collaborative relationships were formed within and across the contextual levels, including between outer context policy makers, advocacy groups, and community-based organizations contracted to provide home-based services to clients and families [48]. Outer context policies were instantiated through collaborative processes such as community stakeholder meetings, negotiations, and procurement and contracting. Contracts that clearly specify the expectation to use EBPs communicate strong system-level support (outer context) for a climate (inner context) in which EBPs are expected, supported, and rewarded [49].
As discussed, EPIS includes levels across the socioecological context [12], touching on factors at the individual, organizational, and system levels. A multilevel conceptualization of implementation processes, and the understanding that interactions across levels need to be considered, has been an increasing discourse in the implementation science literature [10]. A strength of EPIS is that its perspective draws attention to the complexities of this multilevel conceptualization, including in data collection and data analysis. For example, when collecting qualitative data, the interviewer may ask about the respondent's experience at their own unit level (e.g., the experience of supervisors in their team) or at other levels (e.g., the larger agency or system-level policies). It is important to specify hypotheses both within and across levels. As an example, interventions to improve leadership and organizational implementation climate may be intended to improve clinician attitudes towards EBP and EBP adoption, use, and fidelity [18]. In this case, leadership and climate are at the higher unit level, while attitudes, adoption, and fidelity are at the individual clinician level.
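As a concrete illustration of such a cross-level hypothesis, the sketch below fits a mixed-effects model with clinicians nested in organizations. The variable names and simulated data are assumptions, and a random-intercept model is one common analytic choice rather than the reviewed projects' actual models.

```python
# A minimal sketch of a cross-level hypothesis: organization-level leadership
# and implementation climate predicting clinician-level EBP attitudes.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_orgs, n_clinicians = 20, 15

org_leadership = rng.normal(size=n_orgs)
org_climate = rng.normal(size=n_orgs)
org_effect = rng.normal(scale=0.5, size=n_orgs)  # unobserved org-level variation

df = pd.DataFrame({
    "org_id": np.repeat(np.arange(n_orgs), n_clinicians),
    "org_leadership": np.repeat(org_leadership, n_clinicians),
    "implementation_climate": np.repeat(org_climate, n_clinicians),
})
df["ebp_attitudes"] = (0.4 * df["org_leadership"]
                       + 0.3 * df["implementation_climate"]
                       + np.repeat(org_effect, n_clinicians)
                       + rng.normal(size=len(df)))

# Random intercept for organization accounts for clinicians nested in orgs
model = smf.mixedlm("ebp_attitudes ~ org_leadership + implementation_climate",
                    data=df, groups=df["org_id"])
print(model.fit().summary())
```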
The multilevel contextual nature of EPIS lends itself to a variety and integration of methodologies using quantitative-only, qualitative-only, or mixed-method approaches. There is an increasing appreciation in implementation science of the need to use a combination of quantitative and qualitative methods, which allows for a more complete assessment of our often context-dependent, pragmatic research questions (i.e., for whom, under what circumstances, why, and how does an intervention or implementation strategy work) [50, 51]. In our review, we found a number of examples where mixed-methods approaches guided by the EPIS framework provided a more comprehensive evaluation of an implementation research problem. For example, Gleacher and colleagues [52] used qualitative interview data from clinicians to augment quantitative utilization and implementation data to examine multilevel factors associated with adoption and implementation of measurement feedback systems in community mental health clinics. A critical challenge in the field is to find ways to publish findings from mixed-method studies; we found that two thirds of the mixed-method projects in this review published their qualitative and quantitative findings in separate papers. Space limitations and the orientation of various journals (i.e., more qualitative or quantitative focus) may be barriers to publishing mixed-methods findings in an integrated manner. There are resources on applying mixed methods to implementation science research that give guidance and examples of integrating qualitative and quantitative conceptualization, data collection and analysis, and reporting [53–55].
Future directions of EPIS
The results from this systematic review have informed our recommendations for future use of EPIS: (1) more precise operationalization of EPIS factors, (2) consideration of the interplay between inner and outer context through bridging factors, and (3) consistent incorporation of EPIS with greater depth and throughout the lifespan of an implementation project (breadth).
Recommendation no. 1: Precise operationalization of EPIS factors
The use of precise and operationalized definitions of EPIS factors is key to facilitate the successful application of this framework and guide appropriate measurement of factors. In this vein, we have refined definitions of the EPIS factors (see Table 6). The definitions are flexible to ensure applicability of EPIS factors across phases and multiple levels. For example, the inner context factor organizational characteristics is defined as “structures or processes that take place or exist in organizations that may influence the process of implementation.” Inherent within this definition is that this construct may be an important consideration within any of the four EPIS phases and at multiple levels (e.g., provider, team, supervisor). Moving forward, we encourage implementation scientists to utilize these definitions to inform their application and measurement of EPIS factors, as well as using the EPIS factors and relationships between factors to develop theoretical models for testing in implementation studies.
Table 6.
EPIS constructs | | Definition | Examples
---|---|---|---
Outer context | Service environment/policies* | State and federal sociopolitical and economic contexts that influence the process of implementation and delivery/use of the innovation | Policies; legislation; monitoring and review; auditing; mandates |
Funding/contracting | Fiscal support provided by the system in which implementation occurs. Fiscal support can target multiple levels (e.g., staff training, fidelity monitoring, provision of the innovation/EBP) involved in implementation and delivery/use of the innovation | Contracting arrangements; grants; fee-for-service; addition to formulary; capitation fees; incentives | |
Leadership | Characteristics and behaviors of key decision-makers pertinent at all levels who are necessary but not sufficient to facilitate or promote the implementation process and delivery/use of the innovation | Transformational leadership; Implementation leadership | |
Inter-organizational environment and networks | Relationships of professional organizations through which knowledge of the innovation/EBP is shared and/or goals related to the innovation/EBP implementation are developed/established | Inter-organizational collaboration, commitment, competition, co-opetition | |
Patient/client characteristics* | Demographics and individual characteristics of the target population/end user | Socioeconomic status, health condition, comorbidities, age, gender, motivation | |
Patient/client advocacy* | Support or marketing for system change based on consumer needs, priorities and/or demographics | Client advocacy; class-action lawsuits, consumer organizations | |
Innovation factors* | Innovation/EBP developers | Characteristics of the individuals or team(s) responsible for the creation of the EBP/innovation that may be the subject of implementation efforts | Engagement in implementation, continuous quality improvement, rapid-cycle testing, prototyping |
Innovation/EBP characteristics | Features or qualities of innovations to be implemented | Complexity, ease of learning, cost, burden, reporting requirements | |
Innovation/EBP fit* | The extent to which the innovation/EBP fits the needs of the population served or context in which it is implemented | Innovation/EBP structural and process fit with system, organizations, providers, patients/clients | |
Bridging factors* | Community-academic partnerships* | Active partnerships between researchers and key community stakeholders, who can represent multiple levels involved in implementation (e.g., system representatives, organizational leaders, providers, consumers), that can facilitate successful implementation and delivery/use of the innovation | Community participation; partnerships; ongoing positive relationships; valuing multiple perspectives |
Purveyors/intermediaries | Organizations or individuals providing support or consultation for implementation and/or training in the innovation | Implementation readiness assessment, strategy development, training support | |
Inner context | Organizational characteristics | Structures or processes that take place and/or exist in organizations that may influence the process of implementation | Culture; climate; readiness for change; structure; leadership; receptive context; absorptive capacity; social network support
Leadership | Characteristics and behaviors of individuals involved in oversight and/or decision-making related to EBP implementation within an organization | Competing priorities; use of climate/culture embedding mechanisms; transformational leadership; implementation leadership | |
Quality and fidelity monitoring/support* | Processes or procedures undertaken to ensure adherence to active delivery of the innovation/EBP and/or an implementation strategy | Fidelity support system; quality assurance evaluation; continuous quality improvement | |
Organizational staffing processes | The processes or procedures in place at an organization related to the hiring, review, and retention of staff involved in the active delivery of the innovation/EBP and/or its implementation | Professional training and qualification related to EBI delivery; staff turnover | |
Individual characteristics | Shared or unique characteristics of individuals (e.g., provider, supervisor, director) that influence the process of implementation | Attitudes towards EBP; demographics and/or background; client characteristics; job demands |
*Represents factors that are new or adaptations based on the original EPIS 2011 paper
Recommendation no. 2: Consideration of the dynamic interplay between inner and outer context factors
In addition to inner and outer context factors, we now also explicitly highlight and define the integral role of bridging factors. These factors were previously conceptualized as those that interlace the inner and outer contexts but were not formally classified within the EPIS framework (see Fig. 1 of the Aarons et al. [2011] paper) [8]. In our current conceptualization, these factors include community-academic partnerships and purveyors/intermediaries. For example, the Dynamic Adaptation Process [13] incorporates an explicit emphasis on these bridging factors to inform EBP adaptation in a planned, systematic way that increases its feasibility for implementation and sustainment. As our results suggest, these bridging factors are active ingredients that aid in understanding the interaction between outer and inner context factors and thus represent a key area of consideration in future work.
Recommendation no. 3: Increase EPIS depth and breadth
Our results show that more than one phase and level of EPIS have been considered in many implementation studies, highlighting the breadth of the EPIS framework. While this is encouraging, we recommend that future implementation efforts consider how EPIS can be applied longitudinally throughout all phases (i.e., Exploration, Preparation, Implementation and Sustainment) and levels (e.g., system, organization, provider) of the implementation process. We suggest that implementation efforts “begin with sustainment in mind.” This reflects the increasing emphasis within implementation science on explicit incorporation or acknowledgement of the sustainment phase from the outset of study planning and execution [56, 57]. Further, our results suggest that EPIS was most commonly used to inform the study design, report results, and frame the research endeavor. We recommend that EPIS, as a theoretical framework, be thoughtfully applied throughout a project from study framing to explicit identification of how EPIS was used within various levels of data collection and analysis and through reporting and interpretation of results. In a longitudinal study design, factors may be evaluated across multiple EPIS phases. Examples of quantitative measures are provided in Table 5 and definitions for qualitative analyses in Table 6.
Finally, the phases of the implementation process may be operationalized by defining and measuring movement through the phases, as sketched below. For example, when an organization becomes aware of or shows interest in using an EBP, it enters the Exploration phase. Subsequently, if it decides to adopt the EBP, it moves into the Preparation phase. First use of the EBP signifies transition into the Implementation phase. Lastly, continued use over a designated period of time may be defined as being in Sustainment. These types of movements have been flagged for incorporation into guidelines such as PRECIS-2 [58].
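A minimal sketch of this operationalization follows, classifying an organization's current phase from dated milestone events. The event names and the "designated period" threshold are hypothetical, and elapsed time since first use stands in here for a fuller measure of continued use.

```python
# A minimal sketch of classifying an organization's current EPIS phase from
# dated milestone events, following the transitions described above.
from datetime import date, timedelta

SUSTAINMENT_WINDOW = timedelta(days=365)  # assumed "designated period" of use


def current_phase(events: dict[str, date], today: date) -> str:
    """events maps milestones ('aware', 'adopted', 'first_use') to dates."""
    first_use = events.get("first_use")
    if first_use is not None:
        if today - first_use >= SUSTAINMENT_WINDOW:
            return "Sustainment"    # continued use over the designated period
        return "Implementation"     # first use marks entry to Implementation
    if "adopted" in events:
        return "Preparation"        # adoption decision moves into Preparation
    if "aware" in events:
        return "Exploration"        # awareness or interest starts Exploration
    return "Pre-exploration"


example = {"aware": date(2016, 1, 4), "adopted": date(2016, 6, 1)}
print(current_phase(example, today=date(2017, 5, 1)))  # -> "Preparation"
```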
Exemplar of comprehensive use of EPIS framework: JJ-TRIALS
One example of meticulous and comprehensive use of EPIS is the US National Institute on Drug Abuse (NIDA) Juvenile Justice Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS) project [59, 60]. In this major multi-center (six centers, 34 study sites) cluster-randomized implementation trial, EPIS was used throughout the implementation phases and across contextual levels to stage the implementation process, select quantitative and qualitative measures, and identify important outcomes. JJ-TRIALS is perhaps the most explicit example of application of the EPIS framework, and one of the best examples of a rigorous, deep, and thoughtful approach to applying an implementation science framework to a large-scale implementation initiative. For example, JJ-TRIALS is testing a bundled strategy that supports the implementation of data-driven decision-making using two different facilitation approaches (core and enhanced). JJ-TRIALS moves beyond implementation of a single EBP to allow for implementation of evidence-based process improvement efforts. Activities to move through the EPIS phases were mapped out, along with implementation factors and appropriate measures and strategies to assess and address the key multilevel system and organizational issues. Ways to document and evaluate the implementation factors, implementation strategies, and movement through all of the EPIS phases were determined. In addition, there was a conceptual adaptation of EPIS itself based on the input and perspectives of community partners, investigators, and NIH staff, wherein the framework was represented in a more recursive and cyclical manner consistent with improvement processes; this resulted in the development of the EPIS Wheel [59, 60]. As shown in Fig. 1, based on our current systematic review, we have also provided a depiction of the EPIS framework from a more cyclical perspective that captures the key features of the outer context and inner context, the nature of the practice(s) to be implemented (innovation factors), and the interaction with intervention developers and purveyors that may foster appropriate adaptations of context and practice (bridging factors).
EPIS resources
EPIS website: episframework.com
The website https://EPISFramework.com provides a number of tools for planning and using EPIS throughout the implementation process. It is a living resource that will be continually updated and enhanced.
Limitations
There are several limitations of this systematic review. We limited the review to peer-reviewed, empirical articles citing Aarons et al. 2011. Ongoing or completed grant-funded studies or contracts that applied EPIS are not included. In addition, unpublished applications of EPIS would not have been included, nor would articles that do not directly cite Aarons et al. 2011 or articles without searchable reference citations. As such, our results likely do not reflect all implementation efforts that used EPIS; in particular, the search strategy may have limited the inclusion of practitioners' applications of the framework for implementation practice. Our rating of the depth to which EPIS was used was based on one item developed by the study team. Although operationalized and internally consistent as used in this study, it was not a standardized measure of EPIS use.
Conclusion
The EPIS framework has great promise to serve as a multilevel, context-sensitive, broadly applicable framework for implementation science research and practice. Our review described the patterns of use to date, summarized promising characteristics and strengths of the EPIS framework, and illustrated those through examples. We also provide recommendations for future use, including more precise operationalization, increased depth and breadth of EPIS application, improved measures for a number of factors, and ongoing broadening of networks of users, topics, and settings. Additional resources supporting the operationalization of EPIS are available and under development [61].
Acknowledgements
Dr. Aarons is core faculty and Dr. Stadnick is a fellow with the Implementation Research Institute (IRI), at the George Warren Brown School of Social Work, Washington University in St. Louis, through an award from the National Institute of Mental Health (R25MH08091607) with additional support from the National Institute on Drug Abuse and the U.S. Department of Veterans Affairs.
Funding
This project was supported in part by the US National Institute of Mental Health R01MH072961 (Aarons), K23 MH115100 (Dickson), K23 MH110602 (Stadnick) and the National Institute on Drug Abuse R01DA038466 (Aarons) and U01DA036233 (R. DiClemente). The opinions expressed herein are the views of the authors and do not necessarily reflect the official policy or position of the NIMH, NIDA, or any other part of the U.S. Department of Health and Human Services.
Availability of data and materials
Not applicable.
Authors’ contributions
GAA and JCM conceptualized this study. JCM, NAS, KSD, BAR, and GAA identified, reviewed, and evaluated articles, and drafted and edited the manuscript. All authors read and approved the final manuscript.
Ethics approval and consent to participate
Ethics approval was not required for this systematic review.
Consent for publication
Not applicable.
Competing interests
GAA is an Associate Editor of Implementation Science; all decisions on this paper were made by another editor. The authors declare that they have no other competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Contributor Information
Joanna C. Moullin, Email: jcmoullin@gmail.com
Kelsey S. Dickson, Email: kdickson@sdsu.edu
Nicole A. Stadnick, Email: nstadnic@ucsd.edu
Borsika Rabin, Email: borsika.a.rabin@gmail.com
Gregory A. Aarons, Email: gaarons@ucsd.edu
References
- 1.Griffith TL, Zammuto RF, Aiman-Smith L. Why new technologies fail: overcoming the invisibility of implementation. Ind Manage. 1999;41:29–34. [Google Scholar]
- 2.Klein KJ, Knight AP. Innovation implementation: overcoming the challenge. Curr Dir Psychol Sci. 2005;14:243–246. doi: 10.1111/j.0963-7214.2005.00373.x. [DOI] [Google Scholar]
- 3.Rizzuto TE, Reeves J. A multidisciplinary meta-analysis of human barriers to technology implementation. Consult Psychol J: Pract Res. 2007;59:226–240. doi: 10.1037/1065-9293.59.3.226. [DOI] [Google Scholar]
- 4.Moullin JC, Sabater-Hernández D, Fernandez-Llimos F, Benrimoj SI. A systematic review of implementation frameworks of innovations in healthcare and resulting generic implementation framework. Health Res Policy Syst. 2015;13:16. doi: 10.1186/s12961-015-0005-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5.Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53. doi: 10.1186/s13012-015-0242-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43:337–350. doi: 10.1016/j.amepre.2012.05.024. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Mazzucca S, Tabak RG, Pilar M, Ramsey AT, Baumann AA, Kryzer E, Lewis EM, Padek M, Powell BJ, Brownson RC. Variation in research designs used to test the effectiveness of dissemination and implementation strategies: a review. Front Public Health. 2018;6:32. doi: 10.3389/fpubh.2018.00032. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Hlth. 2011;38:4–23. doi: 10.1007/s10488-010-0327-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–274. doi: 10.1146/annurev-publhealth-032013-182447. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Brownson RC, Colditz GA, Proctor EK. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2012. [Google Scholar]
- 11.Chambers DA. Commentary: increasing the connectivity between implementation science and public health: advancing methodology, evidence integration, and sustainability. Annu Rev Public Health. 2018;39:1–4. doi: 10.1146/annurev-publhealth-110717-045850. [DOI] [PubMed] [Google Scholar]
- 12.Green L, Kreuter M. Health program planning: an educational and ecological approach. Boston: McGraw Hill; 2005. [Google Scholar]
- 13.Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, Silovsky JF, Hecht DB, Chaffin MJ. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012;7:1–9. doi: 10.1186/1748-5908-7-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117. doi: 10.1186/1748-5908-8-117. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Stirman SW, Miller CJ, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8:65. doi: 10.1186/1748-5908-8-65. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.Klein KJ, Sorra JS. The challenge of innovation implementation. Acad Manag Rev. 1996;21:1055–1080. doi: 10.5465/amr.1996.9704071863. [DOI] [Google Scholar]
- 17.Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O, Peacock R. Storylines of research in diffusion of innovation: a meta-narrative approach to systematic review. Soc Sci Med. 2005;61:417–430. doi: 10.1016/j.socscimed.2004.12.001. [DOI] [PubMed] [Google Scholar]
- 18.Aarons GA, Ehrhart MG, Moullin JC, Torres EM, Green AE. Testing the Leadership and Organizational Change for Implementation (LOCI) intervention in substance abuse treatment: a cluster randomized trial study protocol. Implement Sci. 2017;12:29. doi: 10.1186/s13012-017-0562-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Leavy B, Kwak L, Hagströmer M, Franzén E. Evaluation and implementation of highly challenging balance training in clinical practice for people with Parkinson’s disease: protocol for the HiBalance effectiveness-implementation trial. BMC Neurol. 2017;17:27. doi: 10.1186/s12883-017-0809-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20.Peltzer K, Prado G, Horigian V, Weiss S, Cook R, Sifunda S, Jones D. Prevention of mother-to-child transmission (PMTCT) implementation in rural community health centres in Mpumalanga province, South Africa. J Psychol Afr. 2016;26:415–418. doi: 10.1080/14330237.2016.1219537. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Patterson TL, Semple SJ, Chavarin CV, Mendoza DV, Santos LE, Chaffin M, Palinkas L, Strathdee SA, Aarons GA. Implementation of an efficacious intervention for high risk women in Mexico: protocol for a multi-site randomized trial with a parallel study of organizational factors. Implement Sci. 2012;7:105. doi: 10.1186/1748-5908-7-105. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 22.de Melo MNT, de Sá RMPF, de Melo Filho DA. Sustainability of an innovative school food program: a case study in the northeast of Brazil. Cien Saude Colet. 2016;21:1899–1908. doi: 10.1590/1413-81232015216.08752016. [DOI] [PubMed] [Google Scholar]
- 23.Willging CE, Green AE, Ramos MM. Implementing school nursing strategies to reduce LGBTQ adolescent suicide: a randomized cluster trial study protocol. Implement Sci. 2016;11:145. doi: 10.1186/s13012-016-0507-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Williams JR, Blais MP, Banks D, Dusablon T, Williams WO, Hennessy KD. Predictors of the decision to adopt motivational interviewing in community health settings. J Behav Health Serv Res. 2014;41:294–307. [DOI] [PubMed]
- 25.Skolarus TA, Lehmann T, Tabak RG, Harris J, Lecy J, Sales AE. Assessing citation networks for dissemination and implementation research frameworks. Implement Sci. 2017;12:97. doi: 10.1186/s13012-017-0628-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Shoup JA, Gaglio B, Varda D, Glasgow RE. Network analysis of RE-AIM framework: chronology of the field and the connectivity of its contributors. Transl Behav Med. 2014;5:216–232. doi: 10.1007/s13142-014-0300-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Atkins L, Francis J, Islam R, O’Connor D, Patey A, Ivers N, Foy R, Duncan EM, Colquhoun H, Grimshaw JM. A guide to using the theoretical domains framework of behaviour change to investigate implementation problems. Implement Sci. 2017;12:77. doi: 10.1186/s13012-017-0605-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Stetler CB, Damschroder LJ, Helfrich CD, Hagedorn HJ. A guide for applying a revised version of the PARIHS framework for implementation. Implement Sci. 2011;6:99. doi: 10.1186/1748-5908-6-99. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Consolidated Framework for Implementation Research (CFIR) [http://www.cfirguide.org/].
- 30.Reach Effectiveness Adoption Implementation Maintenance (RE-AIM) [http://www.re-aim.org/].
- 31.Birken SA, Powell BJ, Presseau J, et al. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): a systematic review. Implement Sci. 2017;12:2. [DOI] [PMC free article] [PubMed]
- 32.Tricco AC, Ashoor HM, Cardoso R, MacDonald H, Cogo E, Kastner M, McKibbon A, Grimshaw JM, Straus SE. Sustainability of knowledge translation interventions in healthcare decision-making: a scoping review. Implement Sci. 2016;11:55. doi: 10.1186/s13012-016-0421-7. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39(1):55–76. [DOI] [PubMed]
- 34.Gates LB, Hughes A, Kim DH. Influence of staff attitudes and capacity on the readiness to adopt a career development and employment approach to services in child welfare systems. J Publ Child Welfare. 2015;9:323–340. doi: 10.1080/15548732.2015.1060917. [DOI] [Google Scholar]
- 35.Moore LA, Aarons GA, Davis JH, Novins DK. How do providers serving American Indians and Alaska Natives with substance abuse problems define evidence-based treatment? Psychol Serv. 2015;12:92–100. doi: 10.1037/ser0000022. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Birken SA, Rohweder CL, Powell BJ. T-CaST: an implementation theory comparison and selection tool. Implement Sci. 2018;13:143. [DOI] [PMC free article] [PubMed]
- 37.Glasgow RE, Vogt T, Boles S. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89:1322–1327. doi: 10.2105/AJPH.89.9.1322. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38.Harden SM, Smith ML, Ory MG, Smith-Ray RL, Estabrooks PA, Glasgow RE. RE-AIM in clinical, community, and corporate settings: perspectives, strategies, and recommendations to enhance public health impact. Front Public Health. 2018;6:71. doi: 10.3389/fpubh.2018.00071. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 39.Kirk MA, Kelley C, Yankey N, Birken SA, Abadie B, Damschroder L. A systematic review of the use of the consolidated framework for implementation research. Implement Sci. 2016;11:72. doi: 10.1186/s13012-016-0437-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Proctor EK, Powell BJ, Baumann AA, Hamilton AM, Santens RL. Writing implementation research grant proposals: ten key ingredients. Implement Sci. 2012;7:96. doi: 10.1186/1748-5908-7-96. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41.Watson DP, Adams EL, Shue S, Coates H, McGuire A, Chesher J, Jackson J, Omenka OI. Defining the external implementation context: an integrative systematic literature review. BMC Health Serv Res. 2018;18:209. doi: 10.1186/s12913-018-3046-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Lewis CC, Stanick CF, Martinez RG, Weiner BJ, Kim M, Barwick M, Comtois KA. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation. Implement Sci. 2015;10:2. doi: 10.1186/s13012-014-0193-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Rabin BA, Purcell P, Naveed S, Moser RP, Henton MD, Proctor EK, Brownson RC, Glasgow RE. Advancing the application, quality and harmonization of implementation science measures. Implement Sci. 2012;7:119. doi: 10.1186/1748-5908-7-119. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Rabin BA, Lewis CC, Norton WE, Neta G, Chambers D, Tobin JN, Brownson RC, Glasgow RE. Measurement resources for dissemination and implementation research in health. Implement Sci. 2016;11:42. doi: 10.1186/s13012-016-0401-y. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Lewis CC, Stanick C, Lyon A, Darnell D, Locke J, Puspitasari A, Marriott BR, Dorsey CN, Larson M, Jackson C, et al. Proceedings of the fourth biennial conference of the Society for Implementation Research Collaboration (SIRC) 2017: implementation mechanisms: what makes implementation work and why? Part 1. Implement Sci. 2018;13(Suppl 2). [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Proceedings from the 10th Annual Conference on the Science of Dissemination and Implementation; 2017; Arlington, VA. Implement Sci. 2018:728.
- 47.Hurlburt M, Aarons GA, Fettes D, Willging C, Gunderson L, Chaffin MJ. Interagency collaborative team model for capacity building to scale-up evidence-based practice. Child Youth Serv Rev. 2014;39:160–168. doi: 10.1016/j.childyouth.2013.10.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48.Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, Chaffin MJ. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adolesc Psychol. 2014;43:915–928. doi: 10.1080/15374416.2013.876642. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Aarons GA, Farahnak LR, Ehrhart MG, Sklar M. Aligning leadership across systems and organizations to develop strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–274. doi: 10.1146/annurev-publhealth-032013-182447. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 50.Palinkas LA, Cooper BR. Mixed methods evaluation in dissemination and implementation science. In: Brownson RC, Colditz G, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. New York: Oxford University Press; 2017. [Google Scholar]
- 51.Holtrop JS, Rabin BA, Glasgow RE. Qualitative approaches to use of the RE-AIM framework: rationale and methods. BMC Health Serv Res. 2018;18:177. doi: 10.1186/s12913-018-2938-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.Gleacher AA, Olin SS, Nadeem E, Pollock M, Ringle V, Bickman L, Hoagwood K. Implementing a measurement feedback system in community mental health clinics: a case study of multilevel barriers and facilitators. Adm Policy Ment Hlth. 2016;43:1–15. doi: 10.1007/s10488-014-0614-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.QUALRIS workgroup. Qualitative research in implementation science. National Cancer Institute; 2017. [Google Scholar]
- 54.Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Adm Policy Ment Hlth. 2011;38:44–53. doi: 10.1007/s10488-010-0314-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 55.Aarons GA, Fettes DL, Sommerfeld DH, Palinkas LA. Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services. Child Maltreatment. 2012;17:67–79. doi: 10.1177/1077559511426908. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56.Wiley T, Belenko S, Knight D, Bartkowski J, Robertson A, Aarons G, Wasserman G, Leukefeld C, DiClemente R, Jones D. Juvenile Justice-Translating Research Interventions for Adolescents in the Legal System (JJ-TRIALS): a multi-site implementation science cooperative. Implement Sci. 2015;10(Suppl 1):A43. doi: 10.1186/1748-5908-10-S1-A43. [DOI] [Google Scholar]
- 57.Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7:17. doi: 10.1186/1748-5908-7-17. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58.Loudon K, Treweek S, Sullivan F, Donnan P, Thorpe KE, Zwarenstein M. The PRECIS-2 tool: designing trials that are fit for purpose. BMJ. 2015;350:h2147. doi: 10.1136/bmj.h2147. [DOI] [PubMed] [Google Scholar]
- 59.Knight DK, Belenko S, Wiley T, Robertson AA, Arrigona N, Dennis M, Bartkowski JP, McReynolds LS, Becan JE, Knudsen HK, et al. Juvenile Justice—Translational Research on Interventions for Adolescents in the Legal System (JJ-TRIALS): a cluster randomized trial targeting system-wide improvement in substance use services. Implement Sci. 2016;11:57. doi: 10.1186/s13012-016-0423-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Becan JE, Bartkowski JP, Knight DK, Wiley TRA, DiClemente R, Ducharme L, Welsh WN, Bowser D, McCollister K, Hiller N, Spaulding AC, Flynn PM, Swartzendruber DMF, Fisher JH, Aarons GA. A model for rigorously applying the exploration, preparation, implementation, sustainment (EPIS) framework in the design and measurement of a large scale collaborative multi-site study. Health Justice. 2018;6:9. doi: 10.1186/s40352-018-0068-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Ehrhart MG, Aarons GA, Farahnak LR. Going above and beyond for implementation: the development and validity testing of the implementation citizenship behavior scale (ICBS). Implement Sci. 2015;10:65. [DOI] [PMC free article] [PubMed]
- 62.Moher D, Liberati A, Tetzlaff J, Altman DG. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6:e1000097. doi: 10.1371/journal.pmed.1000097. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 63.Mancini JA, Marek LI. Sustaining community-based programs for families: conceptualization and measurement. Fam Relat. 2004;53:339–347. doi: 10.1111/j.0197-6664.2004.00040.x. [DOI] [Google Scholar]
- 64.Caldwell DF, O’Reilly CA., III The determinants of team-based innovation in organizations: the role of social influence. Small Gr Res. 2003;34:497–517. doi: 10.1177/1046496403254395. [DOI] [Google Scholar]
- 65.Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS) Implement Sci. 2014;9:157. doi: 10.1186/s13012-014-0157-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 66.Steckler A, Goodman RM, McLeroy KR, Davis S, Koch G. Measuring the diffusion of innovative health promotion programs. Am J Health Promot. 1992;6:214–225. doi: 10.4278/0890-1171-6.3.214. [DOI] [PubMed] [Google Scholar]
- 67.Patterson MG, West MA, Shackleton VJ, Dawson JF, Lawthom R, Maitlis S, Robinson DL, Wallace AM. Validating the organizational climate measure: links to managerial practices, productivity and innovation. J Organ Behav. 2005;26:379–408. doi: 10.1002/job.312. [DOI] [Google Scholar]
- 68.Glisson C. The organizational context of children's mental health services. Clin Child Fam Psych. 2002;5:233–253. doi: 10.1023/A:1020972906177. [DOI] [PubMed] [Google Scholar]
- 69.Lehman WEK, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abus Treat. 2002;22:197–209. doi: 10.1016/S0740-5472(02)00233-7. [DOI] [PubMed] [Google Scholar]
- 70.Glisson C, Landsverk J, Schoenwald S, Kelleher K, Hoagwood K, Mayberg S, Green P. Research network on youth mental health: assessing the organizational social context (OSC) of mental health services: implications for research and practice. Adm Policy Ment Hlth. 2008;35:98–113. doi: 10.1007/s10488-007-0148-5. [DOI] [PubMed] [Google Scholar]
- 71.Siegel SM, Kaemmerer WF. Measuring the perceived support for innovation in organizations. J Appl Psychol. 1978;63:553–562. doi: 10.1037/0021-9010.63.5.553. [DOI] [Google Scholar]
- 72.Aarons G. Organizational climate for evidence-based practice implementation: development of a new scale. In: Proceedings of the Annual Association for Behavioral and Cognitive Therapies; November 2011; Toronto; 2011.
- 73.Anderson NR, West MA. Measuring climate for work group innovation: development and validation of the team climate inventory. J Organ Behav. 1998;19:235–258. doi: 10.1002/(SICI)1099-1379(199805)19:3<235::AID-JOB837>3.0.CO;2-C. [DOI] [Google Scholar]
- 74.Holt DT, Armenakis AA, Feild HS, Harris SG. Readiness for organizational change: the systematic development of a scale. J Appl Behav Sci. 2007;43:232–255. doi: 10.1177/0021886306295295. [DOI] [Google Scholar]
- 75.Aarons GA, Ehrhart MG, Farahnak LR. The Implementation Leadership Scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. 2014;9:45. doi: 10.1186/1748-5908-9-45. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 76.Bass BM, Avolio BJ. The multifactor leadership questionnaire. Palo Alto, CA: Consulting Psychologists Press; 1989. [Google Scholar]
- 77.Beidas RS, Barmish AJ, Kendall PC. Training as usual: can therapist behavior change after reading a manual and attending a brief workshop on cognitive behavioral therapy for youth anxiety? Behavior Therapist. 2009;32:97–101. [Google Scholar]
- 78.Dimeff LA, Koerner K, Woodcock EA, Beadnell B, Brown MZ, Skutch JM, Paves AP, Bazinet A, Harned MS. Which training method works best? A randomized controlled trial comparing three methods of training clinicians in dialectical behavior therapy skills. Behav Res Ther. 2009;47:921–930. doi: 10.1016/j.brat.2009.07.011. [DOI] [PubMed] [Google Scholar]
- 79.Weersing VR, Weisz JR, Donenberg GR. Development of the therapy procedures checklist: a therapist-report measure of technique use in child and adolescent treatment. J Clin Child Adolesc. 2002;31:168–180. doi: 10.1207/S15374424JCCP3102_03. [DOI] [PubMed] [Google Scholar]
- 80.McLeod BD, Weisz JR. The therapy process observational coding system for child psychotherapy-strategies scale. J Clin Child Adolesc Psychol. 2010;39:436–443. doi: 10.1080/15374411003691750. [DOI] [PubMed] [Google Scholar]
- 81.Boyatzis RE, Goleman D, Rhee KS. Clustering competence in emotional intelligence: insights from the emotional competence inventory. In: Bar-On R, Parker JDA, editors. Handbook of emotional intelligence. San Francisco: Jossey-Bass; 2000. pp. 343–362. [Google Scholar]
- 82.Aarons GA. Mental health provider attitudes toward adoption of evidence-based practice: the evidence-based practice attitude scale (EBPAS) Ment Health Serv Res. 2004;6:61–74. doi: 10.1023/B:MHSR.0000024351.12294.65. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 83.Stumpf RE, Higa-McMillan CK, Chorpita BF. Implementation of evidence-based services for youth: assessing provider knowledge. Behav Modif. 2009;33:48–65. doi: 10.1177/0145445508322625. [DOI] [PubMed] [Google Scholar]
- 84.Funk SG, Champagne MT, Wiese RA, Tornquist EM. Barriers: the barriers to research utilization scale. Appl Nurs Res. 1991;4:39–45. doi: 10.1016/S0897-1897(05)80052-7. [DOI] [PubMed] [Google Scholar]
- 85.Schaufeli WB, Leiter MP. Maslach burnout inventory–general survey. In: The Maslach burnout inventory—test manual. 1996. pp. 19–26. [Google Scholar]