Abstract
Background
External implementation support (EIS) is a well-recognized feature of implementation science and practice, often discussed under related terms such as technical assistance and implementation facilitation. Existing models of EIS have gaps in addressing practice outcomes at both individual and organizational levels, in connecting practice activities to intended outcomes, and in grounding support in well-established theories of behavior and organizational change. Moreover, there have been calls to clarify the mechanisms of change through which EIS influences related outcomes.
Method
In this article, we theorize about mechanisms of change within EIS. Our theorizing process aligns with the approach advocated by Kislov et al. (2019). We aim to consolidate prior EIS literature, combining related constructs from previous empirical and conceptual work while drawing on our extensive EIS experience to develop a higher-order, midrange theory of change.
Results
Our theory of change is empirically and practically informed, conceptually situated within an established grand theory of change, and guided by eight practice principles and social cognitive theory. The theory of change proposes 10 core practice components as mechanisms of change within EIS. When used according to underlying theory and principles, they are believed to contribute to favorable practice outcomes at individual, team, organizational, and system levels. The model offers flexibility by recognizing the need for sequential support processes and the demand to practice in dynamic and responsive ways. Case examples are presented to illustrate major themes and patterns of the model in action.
Conclusions
The proposed model is intended to support prospective EIS studies by conceptualizing discernible practice components with hypothesized relationships to proximal and distal practice outcomes. The model can be behaviorally operationalized to complement and extend competency-based approaches to implementation support practitioner (ISP) training and coaching. Over time, the model should be refined based on new empirical findings and contributions from ISPs across the field.
Keywords: implementation practice, implementation support practitioner, technical assistance, facilitation, intermediaries, implementation mechanisms
Plain Language Summary
There are few models that help us understand how external support providers work with organizational, system, and community partners to improve their efforts to implement innovative programs and practices. Existing models typically describe characteristics and features of the process but lack grounding in well-established theories of behavior and organizational change. In this paper, we theorize about mechanisms of change within the support process, which we label core practice components, and explain how their use might improve implementation efforts through shorter- and longer-term practice outcomes. We believe that our model holds promise for informing future advancements in both research and practice. Foremost, the core practice components lend themselves to behavioral definitions and thus to being observed and reported in action. In research, this will allow the relationships we propose in our model to be tested and refined over time, resulting in an incremental accumulation of knowledge. In practice, a greater understanding of core practice components and their relationships to key practice outcomes offers ways to enhance training and coaching activities for external support providers. The model may also help support providers navigate the support process more effectively and plan more timely and effective support strategies.
Introduction
External implementation support (EIS) provided to organizations, systems, or communities is a well-recognized feature of the adoption and scale-up of innovative practices. Meyers et al. (2012), in their comprehensive review of implementation processes, found that 80% of frameworks included some form of EIS. More recent taxonomies of implementation strategies identify categories related to EIS, such as interactive assistance, capacity-building, and scale-up (Leeman, Birken et al., 2017; Waltz et al., 2015). Leeman, Birken et al. (2017) note that these strategies are often employed by support system actors, such as purveyor and intermediary organizations. Several related terms have been used in the literature to describe the role of support system actors providing EIS, including “technical assistance providers” (e.g., Blase, 2009; Katz & Wandersman, 2016), “facilitators” (e.g., Baskerville et al., 2012; Berta et al., 2015; Lessard et al., 2016; Ritchie et al., 2020; Stetler et al., 2006), and “change agents” (e.g., Blase et al., 2015; Guldbrandsson, 2008; McCormack et al., 2013; Rogers, 2003), among others (Albers et al., 2020). Rushovich et al. (2015) call for clarification of these roles. We use the term EIS and the related term implementation support practitioners (ISPs) in this paper to reinforce unification of these related concepts (Albers et al., 2020; Metz, Albers et al., 2021).
A recent consensus study report from the National Academies of Sciences, Engineering, and Medicine (NASEM, 2019) characterizes EIS as both proactive and responsive, combining implementation science and skill training, facilitation, and supportive behavioral coaching for individuals, groups, and organizations. Albers et al. (2020) conceptualize ISPs as working closely with leaders and staff in provider organizations to improve implementation outcomes by building implementation capacities at individual and organizational levels. The NASEM consensus study report concludes that blended implementation strategies that incorporate EIS play a key role in optimizing local implementation outcomes. Ray et al. (2012) suggest that implementation-focused technical assistance “is an active ingredient in high quality implementation of evidence-based programs” (p. 417). Several authors reinforce, however, that participants in EIS must themselves play a ready and active role in the change process (Albers et al., 2020; Berta et al., 2015; Chinman et al., 2016; Katz & Wandersman, 2016; Ritchie et al., 2020; Rushovich et al., 2015; Yazejian et al., 2019). Despite the potential contributions that EIS may offer during implementation and scale-up, authors note that research on EIS, to date, suffers from a lack of rigor and often demonstrates limited effectiveness when considered alone (Albers et al., 2020; Berta et al., 2015; Feinberg et al., 2008; Katz & Wandersman, 2016; McCormack et al., 2013; West et al., 2012).
Contributing to this lack of rigor are the absence of organized EIS approaches and inconsistent quality in the provision of EIS strategies (Albers et al., 2021; Katz & Wandersman, 2016). Most of the handful of published, peer-reviewed articles refer only to broad action steps, tasks or strategies, competencies, themes, and roles and their relationships with other key actors (e.g., Albers et al., 2021; Guldbrandsson, 2008; Katz & Wandersman, 2016; Lessard et al., 2016; Meyers et al., 2012; Ritchie et al., 2020). Katz and Wandersman (2016) offer a systematic approach to technical assistance based on the Getting to Outcomes model, but without articulating the relationships between technical assistance steps and proximal and distal outcomes of support. Leeman et al. (2015) and Leeman, Calancie et al. (2017) provide a well-conceptualized model of support but focus only on practitioners’ implementation capacities and behaviors. Berta et al. (2015) present an insightful discussion of the possible mechanisms of facilitation, but their model primarily focuses on organizational learning outcomes, which may be only one feature of EIS. Of note, Albers et al. (2020) recently proposed a preliminary logic model for implementation support that articulates key ISP resources (e.g., position, background, attitudes, and competencies), proximal outcomes (e.g., relationships, participant readiness, and participant competencies), and conditions that may influence the change process (e.g., context factors). However, the authors advocate for more in-depth study of how ISPs might use their competencies (e.g., Metz, Albers et al., 2021) to affect proximal and distal outcomes. Ritchie et al. (2020) recently made a similar call to explore core components of implementation facilitation.
Theorizing about mechanisms of change in EIS
In this paper, our aim is to present a theory of change within EIS that incorporates proposed mechanisms of change, which we refer to as core practice components (CPCs), and their relationships to proximal and distal practice outcomes. Our approach to developing this theory of change reflects Kislov et al. (2019), who advocated for theorizing in implementation science rather than adherence to finished theoretical products. They recognized the need to better accommodate the inherently iterative and fluid interplay between empirical, practice, and theoretical work. Therefore, like other products of theorizing, our theory of change is open to empirical and practical refinement while informing subsequent work and incremental accumulation of knowledge. Our intent has been to develop a midrange, mechanism-based theory that focuses on a limited number of elements and explores the complex relationships and interdependencies between them.
The development of this theory of change itself has been incremental. We have spent over 6 years reviewing relevant EIS literature, reflecting on our EIS engagements, and drawing from our professional backgrounds (e.g., cognitive–behavioral, organizational, systems, and improvement competencies). Our aim has been to build from common descriptive themes, findings, and approaches to model the essential practice functions that ISPs may use to influence intended practice outcomes and how that might happen. We have also developed eight practice principles to guide the use of our proposed theory of change (see Table 1). Because our model reflects a consolidation of prior EIS literature, combining related constructs from prior studies and pre-existing models, we refer to it as a higher-order midrange theory (Kislov et al., 2019).
Table 1.
Practice Principles Underlying the Proposed Model of External Implementation Support
| Principle | Description |
|---|---|
| Co-creation | The development of implementation capacity is recognized as an outcome of co-creation (Metz & Bartley, 2015). Authentic, equitable partnerships must be developed and nurtured among all partners for success and sustainability. Implementation support practitioners fully participate in and seek to reinforce the co-creation process, including equitable power sharing and the inclusion of historically marginalized voices (Yazejian et al., 2019). |
| Implementation scientist–practitioner model | Implementation support practitioners are grounded by the transdisciplinary science of implementation, including the dual roles of implementation research and practice (Ramaswamy et al., 2019). Implementation research informs the exchange of effective ideas among all co-creation partners engaged in implementation practice. Lessons and learning from implementation practice inform the advancement of implementation research. |
| Proactive support | Proactive implementation support anticipates needs and incorporates strategic approaches to bring new knowledge, skills, and opportunities for participants to apply and test new learning, with reinforcement and supportive feedback, in their own environments. |
| Contextualized and responsive support | Implementation strategies need to be considered and tailored according to key features of local context, such as history (including historical trauma and inequities), size, resources, culture, population density, and political and social complexities (Yazejian et al., 2019). Ongoing implementation support needs to be responsive to local progress, setbacks, feedback, and key events. |
| Adaptive leadership | Implementation and scale-up typically present adaptive challenges (see Heifetz et al., 2009). Implementation support practitioners must develop an appreciation for, and comfort with, the diverse perspectives held within local environments and begin to recognize these as clues to the presence of adaptive challenges embedded within the context and its people. |
| Stage-based approach | Implementation and scale-up require iterative series of inquiries, actions, and adjustments, often across longer-term engagements. Dynamic stage-based approaches to implementation have been widely utilized to address these demands, and implementation support practitioners must be mindful to pace and modify support activities across such stages. |
| Data-driven progress monitoring and improvement | Implementation support practitioners collect and use mixed-methods data to identify local needs and plan responsive support strategies, monitor the progress and outcomes of implementation efforts, monitor the effectiveness of their own support, and make data-driven quality improvements. Improvement science methods such as the Model for Improvement (Courtlandt et al., 2009) may greatly benefit implementation support practitioners’ activities. |
| Local ownership of progress | Implementation support practitioners promote local partners’ ownership of implementation processes and successes. Ongoing success should not be perceived to be due to, or dependent on, external implementation support practitioners. This principle can be demonstrated by developing and continually reinforcing self-regulation of effective implementation processes within local contexts. |
Three of the current authors, including the first author, have simultaneously been leading parallel programmatic efforts to fully operationalize the model and empirically study its application over the past 6 years in practice-focused projects across two states. The details and results of these efforts are published separately (Aldridge et al., 2023), but have contributed to the theory of change we propose in this paper.
Theoretical grounding
Heeding the call for theory to drive implementation, our practice model is grounded in social cognitive theory (Bandura, 1986). Social cognitive theory frames learning and performance within a social context in which humans are active agents who can influence and be influenced by their environment. The theory has been well applied to organizational contexts and collective social endeavors (e.g., Bandura, 1988, 1991, 2000; Wood & Bandura, 1989) and used to underpin other implementation concepts and practice interventions (Avery et al., 2016; Roppolo et al., 2019). Social cognitive theory offers a constructive theoretical framework from which ISPs may conceptualize implementation practice environments and their influences on both individual/team and organizational/system improvements.
To complement social cognitive theory, we integrate key methods for supporting individual and group behavior change (e.g., adult learning best practices and supportive behavioral coaching) and organizational learning and improvement (e.g., facilitation and improvement cycles). Similar to related conceptualizations of EIS (Albers et al., 2020; Metz, Albers et al., 2021), we envision EIS as best deployed by individuals or small teams of ISPs working within partnerships across multiple system levels.
Based on our experience and the literature on readiness for implementing innovation (e.g., Chilenski et al., 2015; Dymnicki et al., 2014; Prochaska et al., 2001; Romney et al., 2014), we assume a handful of conditions are necessary to fully activate our model. First, support participants must be willing and able to explore changes in their implementation practice behaviors and their organizational policies, structures, and procedures. Second, the involvement of willing and capable organizational leaders must be continuous to enable organizational learning and improvement. Finally, broader system partners (e.g., funders, policymakers, community partners) must be supportive of change processes and able to engage in related co-creation processes (Metz & Bartley, 2015). Although our model may be partially activated without these conditions fully in place, we believe doing so limits the model’s potential.
Our belief is that there is no generalizable correct solution or strategy in the highly adaptive contexts of implementation and scale-up. Rather, success and sustainment are shaped iteratively within local contexts through equitable interactions among several co-creation partners, of which ISPs are but one (Metz & Bartley, 2015; Metz, Albers et al., 2021; Stetler et al., 2006; Yazejian et al., 2019). To that end, and to connect our model to an established grand theory of change with relevant outcomes, we anchor our model within “Co-Creation Partners” and “Local Implementation & Scale-up” in the Integrated Theory of Change for Successful, Sustainable Scale-up detailed in the previously referenced NASEM consensus study report (NASEM, 2019; see Figure 1). Following presentation of our proposed theory of change, we reflect on case examples from our prior experiences providing EIS to demonstrate elements of the model in action.
Figure 1.
Integrated Theory of Change for the Successful, Sustainable Scale-Up of Evidence-Based Interventions
Note. Republished with permission of The National Academies Press from Fostering Healthy Mental, Emotional, and Behavioral Development in Children and Youth: A National Agenda, National Academies of Sciences, Engineering, and Medicine, 2019; permission conveyed through Copyright Clearance Center, Inc.
Proximal and Distal Outcomes of EIS
EIS ultimately aims to support optimization of implementation outcomes (Proctor et al., 2011), an expected precursor to the optimization of intervention outcomes (NASEM, 2019). The attainment of these outcomes is most directly and sustainably influenced by those individuals, groups, and organizations implementing a program or practice, rather than by ISPs. Here, we describe outcomes more immediately influenced by ISPs: working alliance with local leaders and implementation teams; implementation practice knowledge, skills, and abilities among local leaders and implementation teams; local system capacity and performance for implementation and scale-up; and the abilities of local leaders and teams to self-regulate effective implementation processes over time and without dependence on EIS. We refer to these outcomes as practice outcomes (see Figure 2). Like other models (Albers et al., 2020; Chilenski et al., 2016; Foster-Fishman et al., 2001; Yazejian et al., 2019), we expect that relationship factors, which we represent as working alliance, are foundational to the achievement of all practice outcomes. We also expect that implementation practice knowledge, skills, and abilities among support participants are a precursor to their intentional and systematic development of local system capacity and performance.
Figure 2.
Conceptual Model of External Implementation Support Practice Outcomes
Note. Hypothesized primary influences between outcomes are represented by dark arrows and secondary influences by gray arrows.
We conceptualize an additional practice outcome, the ability of local leaders and teams to self-regulate the implementation process according to best practices (Roppolo et al., 2019), as a key moderator of sustained capacity and performance. Self-regulation of implementation by organizational teams is a relatively new concept in the field, though self-regulation itself is a well-established construct from social cognitive theory (e.g., Bandura, 1986; Karoly, 1993). Roppolo et al. (2019) anchor the concept in implementation teams’ (1) confidence in their abilities to use implementation best practices, (2) responsibility for and influence over the changes needed for successful implementation, (3) planning and evaluation tools needed to improve implementation, (4) appropriate identification and response to implementation challenges, and (5) ability to manage implementation without dependency on long-term external support. As teams’ abilities to regulate implementation processes grow, they may be better equipped to manage the challenges to develop, improve, and sustain the use of innovative programs and practices.
The relationships between these practice outcomes are visualized in Figure 2, with hypothesized primary influences between outcomes represented by dark arrows and secondary influences by gray arrows. For example, implementation best practice knowledge, skills, and abilities enable local leaders and implementation teams to self-regulate the implementation process effectively. At the same time, a secondary benefit of being able to self-regulate implementation capacity and performance is exposure to ongoing learning about what works and what does not work in local context, thus helping leaders and teams to further develop their knowledge, skills, and abilities.
CPCs of EIS
To influence these practice outcomes, we propose 10 CPCs of EIS (see Table 2). We conceptualize these CPCs as essential functions of the support process for ISPs; if used as intended and according to the theory and principles behind them, they are likely to contribute to favorable practice outcomes. They may also help describe how ISPs leverage their position, background, attitudes, and competencies (Albers et al., 2020; Ritchie et al., 2020) to formulate practice activities within a co-creative environment that contribute to change at individual, team, organizational, and system levels. Table 2 provides the following information about each CPC: whether the practice component is thought to work mainly at the individual/team or organization/system level of change, an example practice activity, the proximal practice outcome, and examples of descriptive literature to find more narrative or empirical information.
Table 2.
Core Practice Components, Focus of Change, Example Activities, Proximal Outcomes, and Descriptive Literature
| Practice component | Primary focus of change | Example practice activity | Proximal outcomes | Descriptive literature |
|---|---|---|---|---|
| 1. Build collaborative relationships | Individuals, teams | Support emotional and practical readiness for next action steps | Working alliance | Chilenski et al., 2016; Katz & Wandersman, 2016; Loper et al., 2021; Metz, Albers et al., 2021; Palinkas et al., 2009; Ritchie et al., 2020; Rushovich et al., 2015; Stetler et al., 2006 |
| 2. Reinforce leaders’ and teams’ self-regulation of effective implementation performance | Individuals, teams | Reinforce leaders’ and teams’ personal agency in their local implementation processes | Self-regulation of effective implementation | McWilliam et al., 2016; Roppolo et al., 2019; Sanders & Mazzucchelli, 2013 |
| 3. Assess implementation capacity, implementation performance, and progress towards intended outcomes | Organization/system | Conduct quantitative assessments of implementation capacity or performance | (setting up core practice component 4) | Amodeo et al., 2006; Blase, 2009; Feinberg et al., 2008; Katz & Wandersman, 2016; Loper et al., 2021; Meyers et al., 2012; Ritchie et al., 2020; Stetler et al., 2006; Wandersman et al., 2012 |
| 4. Facilitate collaborative agreements about implementation performance goals on which to focus support | Organization/system | Set jointly conferred goals for improving implementation performance | (setting up core practice components 5–6) | Lessard et al., 2016; Metz, Albers et al., 2021; Metz, Woo et al., 2021; Prochaska et al., 2001; Stetler et al., 2006; Wandersman et al., 2012 |
| 5. Provide adult learning on implementation science and best practices to leaders and teams | Individuals, teams | Use incidental or “just in time” learning to strengthen connections between implementation best practices and real-time applications | Leaders’ and teams’ implementation knowledge, skills, and abilities | Blase, 2009; Cook & Dupras, 2004; Cucciare et al., 2008; Dirksen, 2016; Dunst & Trivette, 2012; Metz, Albers et al., 2021; Wouters et al., 2013 |
| 6. Facilitate the development of implementation capacity | Organization/system | Facilitate the development or refinement of organizational/system structures, resources, or practices | Implementation capacity | Aldridge et al., 2016; Berta et al., 2015; Blase, 2009; Courtlandt et al., 2009; Metz, Albers et al., 2021; Metz et al., 2015; Ritchie et al., 2020; Schell et al., 2013 |
| 7. Facilitate leaders’ and teams’ application of skills, resources, and abilities within their context | Individuals, teams; organization/system | Facilitate experiential learning activities where implementation knowledge, skills, and abilities are applied | Leaders’ and teams’ implementation knowledge, skills, and abilities; self-regulation of effective implementation | Berta et al., 2015; Blase et al., 2015; Dunst & Trivette, 2012; Hayes et al., 2012 |
| 8. Provide supportive behavioral coaching to leaders and teams | Individuals, teams | Provide specific behavioral praise | Leaders’ and teams’ implementation knowledge, skills, and abilities; self-regulation of effective implementation | Alagoz et al., 2018; Chilenski et al., 2016; Eiraldi et al., 2014; Joyce & Showers, 2002; Miller & Rollnick, 2012; Nadeem et al., 2013; Ray et al., 2012; Stetler et al., 2006 |
| 9. Facilitate collective learning and adaptive problem solving | Organization/system | Facilitate identification of technical and adaptive elements of challenges | Leaders’ and teams’ implementation knowledge, skills, and abilities; self-regulation of effective implementation | Akin et al., 2013; Amodeo et al., 2006; Berta et al., 2015; Blase et al., 2015; Heifetz et al., 2009; Julian, 2017; Lessard et al., 2016; Ritchie et al., 2020; Spoth & Greenberg, 2011; Stetler et al., 2006 |
| 10. Transition out of intensive implementation support | Individuals, teams | Facilitate understanding of when and to whom to reach out for support in the future | Sustained implementation performance | Chilenski et al., 2016; Guldbrandsson, 2008; Lessard et al., 2016; Roppolo et al., 2019 |
Theory- and principle-driven use of CPCs
Using these practice components in accordance with social cognitive theory and the principles behind them is a critical part of our proposed practice model. For example, activities intended to build collaborative relationships (CPC 1) should demonstrate co-creation principles and expectations, including expectations for power sharing and how other co-creation partners may be expected to interact with the support relationship (Yazejian et al., 2019). Activities designed to assess local implementation capacities, performance, and progress (CPC 3) should incorporate mixed methods and be sensitive to the presence of disparities and system inequities that may be differentially impacting historically marginalized community members. Facilitation activities (CPCs 4, 6, 7, and 9) should integrate social learning exercises (Bandura, 1977) and be tailored to key facets of local context, including resources, strengths, needs, and historical events (Yazejian et al., 2019). Additionally, the pacing of support activities should acknowledge the stage-based nature of implementation, pursuing long-term improvement goals through short-term action steps (Wood & Bandura, 1989) that are responsive to progress and setbacks and generally shaped around iterative inquiry and adjustment. Consequently, longer-term engagements may often be required to enable sustainable improvements and system changes.
Multiple pathways for influencing change
Our full theory of change tying together CPCs and practice outcomes is presented in Figure 3. The model offers potential paths for influencing practice outcomes through individual and combinations of CPCs. For example, ISP behaviors and activities designed to build collaborative working relationships with support participants (CPC 1) may strengthen working alliance, which benefits the entire practice environment. ISP activities designed to provide tailored learning on the use of effective implementation practices (CPC 5) may contribute to support participants’ acquisition of knowledge, skills, and subsequent abilities to improve local capacity and performance. Individual practice outcomes might also be influenced by multiple practice components. For example, support participants’ acquisition of knowledge, skills, and abilities and, separately, their self-regulation of effective implementation are each influenced by a cohort of closely aligned practice components (CPCs 7, 8, and 9). These multiple pathways and interdependencies afford ISPs options to tailor their use of practice components to influence practice outcomes over time and in response to situational factors that may make some practice activities more appropriate, acceptable, or feasible in the moment.
Figure 3.
Proposed External Implementation Support Theory of Change
Note. Each white box indicates a core practice component with colored numbers to indicate thematic grouping (see Figure 4). Shaded boxes indicate external implementation support practice outcomes (see Figure 2). Bold black arrows indicate primary proposed influences. Nonbolded black arrows indicate secondary proposed influences.
Influencing change at multiple levels
As indicated in Table 2, the proposed practice model lends itself to an integration of individual/team behavior change and organization/system learning and improvement. The necessity of providing support at each level is a well-established feature of EIS (e.g., Albers et al., 2020; Amodeo et al., 2006; Berta et al., 2015; Blase, 2009; Foster-Fishman et al., 2001; Ritchie et al., 2020; Rushovich et al., 2015). At the organization/system level, the proposed practice model allows for the integration of organizational improvement methods, such as the Model for Improvement (Courtlandt et al., 2009). For example, during ISP facilitation of collaborative agreements about goals on which to focus support (CPC 4), partners might focus on the Model for Improvement questions “What are we trying to accomplish?” and “How will we know that a change is an improvement?” Partners may benefit from developing detailed aim statements and identifying useful measures to monitor change, if not already identified from initial assessment activities (CPC 3). As goals are further detailed (CPC 4) and plans are outlined for facilitating capacity development (CPC 6), partners might benefit from focusing on the question “What changes can we make that will result in an improvement?” ISPs might align their facilitation of application (CPC 7), follow-up assessment activities (CPC 3), and facilitation of collective learning and problem solving (CPC 9) with the “Do,” “Study,” and “Act” phases of the plan–do–study–act cycle. Using this framing can enable ISPs to facilitate support participants through iterative learning and improvement activities to optimize implementation strategies and structures while retaining a highly tailored and contextually driven approach to EIS.
Supporting individual/team behavior change towards effective implementation is a feature of EIS that we see as essential to scale-up and sustainment. In alignment with social cognitive theory, it is important for ISPs to recognize that the people inside a system shape and are shaped by that system; these bidirectional influences perpetuate the status quo or drive innovation and improvement (Bandura, 2006). The proposed practice model embeds several CPCs to support behavior change for effective implementation. For example, an example activity related to building collaborative relationships (CPC 1) is supporting individual and team readiness for next action steps. As discussed by Prochaska et al. (2001), a key challenge of collective change in support of innovation is that only about 20% of individuals might be preparing to participate in organized action in the near term. The other 80% of individuals may only be contemplating change, if at all, and must be intentionally supported to participate in the collective actions required for success.
To support change, ISPs may use adult learning best practices to provide instruction to support participants on effective implementation strategies (CPC 5; Dunst & Trivette, 2012). ISPs can then facilitate support participants’ application of this learning to effect intended changes within their system environments (CPC 7) and provide supportive behavioral coaching following application to further refine, sustain, and generalize the behavior change processes (CPC 8). The system changes that are created through this iterative process further shape and reinforce effective behavior changes among other individuals and teams in that system. Finally, by continually reinforcing support participants’ self-regulation of effective implementation processes (CPC 2) and ultimately tapering intensive support (CPC 10), ISPs empower support participants to effectively self-regulate ongoing system improvement without dependency on EIS.
Thematic groupings among CPCs
Building on these patterns for intentional use, the CPCs lend themselves to thematic groupings and possible sequential dependencies. For example, assessing implementation capacity, performance, and progress (CPC 3) may typically precede the development of collaborative agreements about goals on which to focus support (CPC 4). Regardless of order, the intended use of both CPCs is to co-design support by establishing a shared understanding of context and purpose to tailor support activities. Similarly, providing learning on effective implementation practices and structures (CPC 5) may precede facilitation of support participants’ development of implementation capacity (CPC 6), though both activities focus on capacity development across individual, team, organizational, and system levels. Further, such capacity development may typically precede application, coaching, and collective learning and problem solving (CPCs 7, 8, and 9), when the role of the ISP is to facilitate support participants’ iterative steps to act on their new capacities and drive desired system improvements. The full thematic grouping of CPCs is shown in Figure 4. Of note, building collaborative relationships and reinforcing self-regulation (CPCs 1 and 2) are considered foundational and thus may lend themselves to high frequencies of use across successful ISP engagements.
Figure 4.
Thematic Grouping of Core Practice Components
Dynamic use of CPCs
Notwithstanding these possible sequential dependencies, EIS is often anything but typical in the highly contextual and adaptive environments of implementation and scale-up. Neither the process of implementation nor the provision of EIS follows a linear trajectory. ISPs and support participants must demonstrate a combination of rigor and agility in the work they do together. ISPs must be alert to capacity-building moments and learning opportunities presented by support participants and their organizations. This precludes a manualized approach and requires a dynamic, responsive approach in which multiple practice functions in various orders can be supported at the same time while ensuring forward movement and integrity to the science of effective implementation. Achieving this rests on ISPs’ ability to facilitate a dynamic process that guides support participants and their organizations through all required steps of effective implementation.
Relatedly, the tension between theory and practice is significant when applying implementation science in complex, multifaceted practice environments. Although implementation science has provided evidence of what works to enable effective implementation, the demands of applying these concepts in real-world practice frequently exceed what is feasible. The principle of co-creation must allow for a level of flexibility in use of the science, with neither the science nor the practice dominating but rather the development of common ground through authentic collaboration and equitable power sharing. ISPs must use considerable skill to continually develop common ground in a manner that does not compromise the integrity of the knowledge implementation science provides. Likewise, because effective implementation practice can create demands on the service setting that are difficult to accommodate, ISPs must allow their EIS to be continually shaped by the realities and partners in the settings in which they engage. Discovering common ground and, likewise, the minimally sufficient dosage for effective EIS is a shared responsibility informed by all perspectives.
Case Examples
To illustrate elements of this practice model in action, we share our reflections on several case examples from our experiences providing EIS. These case examples are drawn from our individual and collective practice, which combined represent several decades of experience both in the United States and globally. Two case examples are presented here, and two additional case examples are available in Appendices A and B. Though we had not fully developed our EIS theory of change at the time of these case examples, we have found great utility and value in retroactively applying the model to describe, understand, and learn from prior practice work. Additionally, our retroactive application has helped to test and refine the functional elements of our model. Although all CPCs were likely used in various ways during these practice engagements, we highlight only a few in these case examples to illustrate major themes and patterns. A more precise way to operationalize the CPCs and a discussion of their dynamic use in discrete practice interactions are available in a separate paper (Aldridge et al., 2023). Descriptive contexts for the practice engagements from which these case examples are drawn are available in Table 3.
Table 3.
Descriptive Context for External Implementation Support Case Examples
| Title, website, location, and dates | Type of funding | Implementation support practitioners | Setting | Program or practice implemented/scaled |
|---|---|---|---|---|
| School Mental Health Ontario (SMH-ON) https://smho-smso.ca Ontario, Canada 2011–present | Provincial ministry | Implementation Support Coaches from a provincial intermediary organization | Publicly funded provincial school system including 72 school boards | System-wide integration of school mental health strategies including health promotion, prevention, and early intervention practices |
| Evidence-Based Prevention & Intervention Support (EPIS) Center www.episcenter.psu.edu Pennsylvania, United States 2008–present | State government | Implementation specialists from a state university–based intermediary organization | Schools, service provider organizations, and city/county government agencies | A diverse menu of prevention and intervention EBPs |
| Center For Evidence to Practice (E2P) laevidencetopractice.com Louisiana, United States 2019–present (see APPENDIX A) | State government | Implementation specialists from a state university–based intermediary organization | Children's behavioral health practitioners and provider organizations serving Medicaid-eligible children and families | A menu of children's behavioral health treatment models |
| California Partners for Permanency (CAPP) https://cfpic.org/the-child-family-practice-model/ California, United States 2010–2016 (see APPENDIX B) | Federal government | Implementation specialists from a university and practice facilitators from a statewide child welfare and human services nonprofit organization | Publicly funded county child welfare agencies | Child and family practice model in child welfare with core components and related behaviors for both family-level practice and leadership/system change |
School Mental Health Ontario (SMH-ON) in Ontario, Canada (https://smho-smso.ca)
Implementation science has been integral to the model the Ontario Ministry of Education adopted in 2011 for a wide-ranging provincial school mental health initiative. However, provincial school boards, responsible for the integration of school mental health strategies in their schools and territories, did not originally have familiarity with implementation science or its application. Moreover, school boards operate autonomously within the broad policy and curriculum expectations of the Ministry of Education, placing an emphasis on local context and agency to ensure successful implementation and programmatic adaptation while retaining integrity to core initiative components.
To support these critical needs, SMH-ON, a provincial intermediary organization also funded by the Ontario Ministry of Education, hired and trained a team of Implementation Support Coaches (IS Coaches), who were assigned to each provincial school board. The anchor interaction within the support process has been regular (at least monthly) meetings with each board, during which boards identify priorities for their students’ mental health needs and address them through planned actions. These board meetings afford ongoing opportunities to build implementation knowledge and skill at the leadership level through practical application that is specific to each board, their mental health leaders, and their resources.
The foundation of IS Coaches’ approach is developing a trusting relationship with the mental health leadership team on the school board (CPC 1). Each IS Coach tailors their facilitation to individual boards’ unique needs, resources, and circumstances through contextual learning and assessment (CPC 3) and co-designing the support process (CPC 4). To accomplish implementation science learning and application objectives with mental health leadership, IS Coaches iteratively provide adult learning on effective implementation practices (CPC 5); facilitate leaders’ application of related skills, resources, and abilities within their school or territory (CPC 7); and provide supportive behavioral coaching following application (CPC 8).
Using a self-regulatory framework throughout the support process (CPC 2) enables more sustainable application of effective implementation strategies. Where boards have maintained continuity of leadership, IS Coaches have been able to reduce their teaching and facilitation role to match boards’ increased self-efficacy for effective implementation. Where there has been significant change in leadership among boards, IS Coaches have facilitated continuity and momentum in the ongoing use of implementation science. The school mental health initiative continues to date with funding that has increased approximately tenfold since 2011. To help scale the initiative through individual schools within a board's region or territory, a cascade approach has been used wherein school mental health leaders support implementation learning down through all levels of their system, creating a parallel facilitation process.
The Evidence-based Prevention and Intervention Support (EPIS) Center in Pennsylvania, United States (www.episcenter.psu.edu)
Since 1999, the Pennsylvania Commission on Crime and Delinquency (PCCD) has awarded grants to local providers, schools, and county agencies to implement a diverse menu of up to 20 different evidence-based prevention and intervention programs (EBPs). To maximize the impact and return on investment from these grants, PCCD partnered with the Prevention Research Center at Penn State University beginning in 2008 to create the EPISCenter, an EIS intermediary (Bumbarger & Campbell, 2012; Rhoades et al., 2012). For each of the PCCD-endorsed EBP models, the EPISCenter worked with the model developer and experienced practitioners to develop a readiness assessment tool, FAQs, and implementation monitoring resources. These resources were intended to help organizations carefully consider the fit and feasibility of adopting a given EBP, including their own capacity, even before applying for grant funding. Once PCCD awarded a grant for the adoption of an EBP, the grantee would sign a memorandum of understanding agreeing to work with the EPISCenter to develop an implementation plan, monitor implementation quality and fidelity, report quarterly on performance and progress toward intended outcomes, and accept technical assistance.
Within the first quarter of the grant period, EPISCenter implementation specialists would meet with the grantee to identify potential challenges (CPC 3) and implementation support priorities (CPC 4) and to collaboratively develop a plan for the development of implementation capacity within the grantee organization (CPC 6). To create shared space for these activities, the EPISCenter has prioritized building strong collaborative relationships across partner groups (CPC 1) and has recognized the importance of focusing concurrently on individual practitioners, provider organizations, and the broader systems (policy/regulatory and financial) within which they operate.
Two resources have been particularly useful to aid EPISCenter implementation specialists and grantees to identify challenges (CPC 3) and inform support priorities (CPC 4). First, the Annual Survey of Evidence-Based Programs, administered across all grantees, was used to identify and monitor trends in adoption, implementation, outcome assessment, and sustainment barriers and facilitators. The results of this annual survey also facilitated the continuous improvement of statewide funding and oversight mechanisms. Second, although each EBP model had a small number of required performance measures, the EPISCenter worked with grantees to co-develop additional, locally identified implementation measures that would serve as benchmarks to assess progress. These measures were based on each program's theory of change, considered within the specific local implementation context. This co-development process was an opportunity for each grantee to think more intentionally about what would be required for successful implementation.
Using the results from these and other assessments, EPISCenter implementation specialists were able to facilitate grantees’ development of implementation capacity (CPC 6) according to their existing capacities, needs, and priorities. This put in motion continuous cycles of performance application (CPC 7), team and organizational learning (CPCs 3, 8, and 9), and improvement (returning to CPCs 4 and 6) throughout the grant period. The EPISCenter also supported model-specific learning communities for each EBP model, bringing sites together throughout the year to collectively discuss challenges, share successes, and problem-solve innovative approaches to high quality implementation. These learning communities enabled collective learning and adaptive problem solving (CPC 9) and helped the state take each EBP to scale.
Discussion
Our proposed EIS theory of change attends to several identified needs in the field. First, it fills a critical gap by proposing mechanisms of change, which we have referred to as CPCs, by which ISPs may utilize their competencies and other resources to affect both proximal and distal practice outcomes. We have augmented other models of practice outcomes by integrating the concept of local leaders’ and implementation teams’ self-regulation of effective implementation processes (Roppolo et al., 2019), which we believe to be a critical factor in sustained implementation performance at any level. Our conceptual model integrates both individual/team behavior change and organization/system learning and improvement through well-established methods. The model is grounded by social cognitive theory, which has been widely applied to behavioral change in individuals, groups, organizations, and social systems (e.g., Bandura, 1986, 1988, 1991, 2000; Wood & Bandura, 1989). It is backed by eight practice principles that are consistent with recent models of ISP competencies (e.g., Metz, Albers et al., 2021). Finally, it consolidates and combines several related concepts and bodies of literature in implementation research and practice, including technical assistance, facilitation, and change agency.
We believe this theory of change informs future research and practice activities in several ways. In research, we believe our theory of change can be operationalized, tested, and refined at several levels, including both individual CPCs and combinations of CPCs and outcomes. Relatedly, the model offers CPCs centered on behavioral practice functions, which we believe helps navigate concerns that mechanisms of change are often unobservable, too context-sensitive, and require considerable effort to operationalize (Albers et al., 2020). This may allow for ongoing development and refinement of discrete practice behaviors and activities, better ability to track such behaviors and activities in practice, the ability to observe contextually relevant patterns of EIS, and the ability to better identify and test hypotheses related to EIS. Many of these assumptions are supported within a separate paper (Aldridge et al., 2023), which details parallel programmatic efforts by several of the current authors to fully operationalize the model and empirically monitor its more than 5-year application by ISPs supporting the scale-up of the Triple P - Positive Parenting Program system of interventions in North Carolina and South Carolina. Results from that descriptive study detail 48 discrete practice activities across the 10 CPCs and empirically demonstrate elements of both the thematic groupings of CPCs and their dynamic use by ISPs.
In practice, we believe our model can offer new ways to train and coach ISPs, complementing and extending competency-based approaches. Clearly articulated CPCs and outcomes enable tailored operationalization and application. They can be used to train ISPs on practice-relevant behaviors and activities in a variety of practice environments and situations, offer flexible methods for ISPs to conceptualize the change process, and can be used for reflective coaching and learning to improve practice outcomes. By leaning into the practice principles, particularly co-creation, while remaining open to more dynamic use of the CPCs, the practice model also offers methods for ISPs to manage the tensions inherent in organizational and system change processes related to implementation and scale-up. Finally, as we have demonstrated in our case examples, the model can be used to retroactively describe and understand historical EIS activities, affording new opportunities to learn from prior practice engagements.
Next steps
We believe the immediate next steps related to the proposed model are twofold. First, to ensure that the proposed 10 CPCs can be well-trained, monitored, coached, and used by ISPs in practice, we recommend that each component be elaborated into more discrete behaviors or activities essential to its intended use in practice. To that end, three of the current authors have worked with their team on one way to operationalize each CPC at such a level. Our interest in this paper is to provide an overview of the theory of change and descriptive case examples in a way that might be accessible to a variety of ISPs from different practice approaches. To learn more about efforts to operationalize the CPCs for programmatic use, we refer the reader to the separate paper describing that work (Aldridge et al., 2023).
Second, descriptive studies of these 10 CPCs as used in implementation practice would help establish an initial empirical foundation to further investigate and refine the proposed theory of change. As discussed above, three of the current authors and their teammates offer an initial descriptive study of the CPCs in use in practice-focused projects across two states in a separate paper (Aldridge et al., 2023). However, several such descriptive reports may benefit the refinement of this model. While there is certainly interest in testing the array of hypotheses embedded in our theory of change, we believe that more basic questions focusing on the use and monitoring of the CPCs across a variety of contexts may need to be answered first.
In addition to these two immediate next steps, we invite those in the implementation practice space, particularly those providing EIS, to reflect on and respond to our proposed practice model. We wonder if and how this practice model may illuminate challenges, successes, and other experiences in EIS, particularly across diverse ISPs operating from a variety of practice approaches. We also wonder how others’ prior and current EIS challenges, successes, and experiences may help refine, evolve, and expand this practice model. Even if aspirational, we believe there would be benefit in reaching a shared evolution of this model. We welcome conversations of this nature in the effort to advance both implementation research and practice.
Acknowledgements
The authors would like to acknowledge both current and former members of The Impact Center at Frank Porter Graham (FPG) team for their contributions to refining the CPC nomenclature used in this paper. We would like to acknowledge Rohit Ramaswamy for his mentoring around theorizing in implementation science. We would like to acknowledge Wendy Morgan for her help identifying key literature related to contemporary adult learning methods. Finally, we would like to acknowledge Jennifer Robinette for her formatting and copyediting support.
Appendix
Appendix A
Case example 3: The Center for Evidence to Practice (E2P) in Louisiana, United States (laevidencetopractice.com)
The Louisiana Department of Health, Office of Behavioral Health (OBH), is promoting the statewide scale-up of a diverse menu of evidence-based intervention programs (EBPs), with a focus on children's behavioral health treatment models to serve low-income (i.e., Medicaid-eligible) children and families. Beginning in 2019, OBH partnered with Louisiana State University (LSU) Health Sciences Center, School of Public Health, to create the LSU Center for Evidence to Practice (E2P Center), an EIS intermediary for building implementation capacity among local children's behavioral health treatment providers, Medicaid managed care organizations (MCOs), and the state. The E2P Center supports providers’ capacity for the adoption and implementation of 12 different EBP treatment models through the development and facilitation of training, written treatment model descriptions, guides for referral and service matching, business models, and guidance on navigating Medicaid reimbursement. As the Louisiana initiative is in its early stages, the current focus of the E2P Center is on identifying the specific practitioner and provider capacities OBH will need to support to effectively scale these treatment models.
The E2P Center regularly analyzes Medicaid claims data to provide strategic feedback to OBH, MCOs, and providers (CPC 3) and to facilitate continuous quality improvement cycles at the system level (iterations of CPCs 4, 6, 7, and 9). The Center also facilitates collective learning and adaptive problem solving (CPC 9) through learning communities of both provider organizations and MCOs. The E2P Center continually assesses the needs of partners through interviews, focus groups, and surveys (CPC 3) and adapts its implementation support and trainings (CPC 5) to meet pressing implementation and sustainability challenges. For example, in response to the COVID pandemic, the E2P Center pivoted its focus to support the transition of these EBP children's treatment models to telehealth delivery (Phillippi & Bumbarger, 2021; Singh et al., 2022). This support required working closely with model developers, OBH, MCOs, and implementing clinicians and provider organizations to identify and address the emerging and urgent telehealth capacity needs of each stakeholder group (CPCs 3, 4, 5, and 6).
As in the other case examples, the LSU E2P Center has prioritized building strong collaborative relationships across many partner groups (CPC 1) to identify the barriers and facilitators of high-quality implementation (CPC 3) and to support partners’ implementation activities (CPC 4). The Center recognizes the importance of focusing concurrently on individual practitioners, the provider organizations that employ and support the practitioners, and the broader systems (policy/regulatory and financial) within which they operate.
Appendix B
Case example 4: California Partners for Permanency (CAPP) in California, United States (https://cfpic.org/the-child-family-practice-model/)
In the early-to-mid 2010s, as part of a 5-year, federally funded project in California to reduce disparate permanency outcomes among African American and Native American children in foster care, a team of implementation specialists from FPG Child Development Institute, University of North Carolina at Chapel Hill, provided EIS to several county child welfare agencies implementing an innovative child and family practice model. The practice model was evidence-informed; co-developed by state and local child welfare agency partners, African American community partners, and Native American tribal partners; and integrated intervention components at both frontline and organizational leadership levels to reinforce community-engaged and trauma-informed racial equity and inclusion practices (Children's Bureau, 2020).
After practice model development and during the middle-to-late stages of the project, FPG implementation specialists and their state practice facilitator counterparts became particularly engaged with one county agency. What had originally begun as a predominantly technical process to support practice model implementation (e.g., implementation science training, brokering additional resources, and consultative support) needed to quickly evolve. County child welfare partners were experiencing multiple challenges and frustrations: difficulty translating implementation science concepts into agency-informed structures and processes; distrust between agency divisions and levels of staff; and the ongoing legacy of institutional racism and system-inflicted trauma in local communities, particularly the African American community and communities of Latin ancestry (the latter of which were not an original focus within the project but were also experiencing disparate permanency outcomes and collectively represented a larger proportion of system-involved children and youth in the county). Within this emergent situation, FPG implementation specialists repeatedly focused on building collaborative relationships with their state practice facilitator counterparts, agency staff, and community partners (CPC 1). These intensive relational efforts did not come together at once. Instead, FPG implementation specialists initially focused on agency implementation team members and state facilitators, exploring the intricacies of sharing power in implementation practice. They then relied on co-creation principles and practices to jointly reconnect relationships to key community partners. Once re-engaged, these community partners were uniquely able to re-engage agency leaders in support of the agency implementation team, which had become overexposed and under-supported in the agency's efforts to implement authentic changes that would support full and effective use of the new practice model.
Once working relationships were sufficiently re-engaged, FPG implementation specialists and state facilitators relied heavily on the principles of contextualized and responsive support and stage-based approaches. They shifted the language and ways in which implementation science concepts were taught (CPC 5; for more information about these shifts, see Aldridge et al., 2016 and Boothroyd et al., 2017); co-developed interactive action steps with agency leaders and implementation team members to facilitate the development of implementation capacity (CPC 6); and worked with these leaders and team members to iteratively test and adjust new learning and organizational capacities through small tests of change, reflective coaching, and collective learning and problem solving with all partners (CPCs 7, 8, and 9). These shifts further reinforced working alliances and fostered a collaborative learning culture for the duration of the project. Because several of the elements that originally frayed working alliances were long-standing and continued to re-emerge in this county child welfare agency, FPG implementation specialists, state facilitators, and agency implementation team members frequently returned to the principle of adaptive leadership by creating space for collective learning and adaptive problem solving with individuals and groups of agency staff and community partners (CPC 9). As authentic change became embedded over multiple years within this agency, FPG implementation specialists used coaching strategies to reinforce self-regulation of effective practices (CPC 2) and leaned heavily on the principle of local ownership of progress. These factors enabled FPG implementation specialists to successfully transition from the intensive support role (CPC 10) with sustainable change in place.
Footnotes
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: this work was supported by The Duke Endowment (grant numbers 1945-SP, 2037-SP, 2081-SP) and the North Carolina Department of Health and Human Services (contract numbers 00034755, 00034805, 00035954, 00036619, 00037333, 00039054, 00040617, 00042356, 00044072).
ORCID iDs: William A. Aldridge https://orcid.org/0000-0002-8267-2753
Rebecca H. Roppolo https://orcid.org/0000-0003-3564-4773
References
- Akin B. A., Bryson S. A., Testa M. F., Blase K. A., McDonald T., Melz H. (2013). Usability testing, initial implementation, and formative evaluation of an evidence-based intervention: Lessons from a demonstration project to reduce long-term foster care. Evaluation and Program Planning, 41, 19–30. 10.1016/j.evalprogplan.2013.06.003
- Alagoz E., Ming-Yuan C., Hitchcock M., Brown R., Quanbeck A. (2018). The use of external change agents to promote quality improvement and organizational change in healthcare organizations: A systematic review. BMC Health Services Research, 18, 42. 10.1186/s12913-018-2856-9
- Albers B., Metz A., Burke K. (2020). Implementation support practitioners – A proposal for consolidating a diverse evidence base. BMC Health Services Research, 20, 368. 10.1186/s12913-020-05145-1
- Albers B., Metz A., Burke K., Bührmann L., Bartley L., Driessen P., Varsi C. (2021). Implementation support skills: Findings from a systematic integrative review. Research on Social Work Practice, 31(2), 147–170. 10.1177/1049731520967419
- Aldridge W. A., II, Boothroyd R. I., Fleming W. O., Lofts Jarboe K., Morrow J., Ritchie G. F., Sebian J. (2016). Transforming community prevention systems for sustained impact: Embedding active implementation and scaling functions. Translational Behavioral Medicine, 6(1), 135–144. 10.1007/s13142-015-0351-y
- Aldridge W. A., II, Roppolo R. H., Chaplo S. D., Everett A. B., Lawrence S. N., DiSalvo C. I., Minch D. R., Reed J. J., Boothroyd R. I. (2023). Trajectory of external implementation support activities across two states in the United States: A descriptive study. Implementation Research and Practice, 1–24. 10.1177/26334895231154285
- Amodeo M., Ellis M. A., Samet J. H. (2006). Introducing evidence-based practices into substance abuse treatment using organization development methods. The American Journal of Drug and Alcohol Abuse, 32(4), 555–560. 10.1080/00952990600920250
- Avery L., Charman S. J., Taylor L., Flynn D., Mosely K., Speight J., Lievesley M., Taylor R., Sniehotta F. F., Trenell M. I. (2016). Systematic development of a theory-informed multifaceted behavioural intervention to increase physical activity of adults with type 2 diabetes in routine primary care: Movement as Medicine for Type 2 Diabetes. Implementation Science, 11, 99. 10.1186/s13012-016-0459-6
- Bandura A. (1977). Social learning theory. Prentice Hall.
- Bandura A. (1986). Social foundations of thought and action: A social cognitive theory. Prentice-Hall, Inc.
- Bandura A. (1988). Organisational applications of social cognitive theory. Australian Journal of Management, 13(2), 275–302. 10.1177/031289628801300210
- Bandura A. (1991). Social cognitive theory of self-regulation. Organizational Behavior and Human Decision Processes, 50(2), 248–287. 10.1016/0749-5978(91)90022-L
- Bandura A. (2000). Exercise of human agency through collective efficacy. Current Directions in Psychological Science, 9, 75–78. 10.1111/1467-8721.00064
- Bandura A. (2006). Toward a psychology of human agency. Perspectives on Psychological Science, 1(2), 164–180. 10.1111/j.1745-6916.2006.00011.x
- Baskerville N. B., Liddy C., Hogg W. (2012). Systematic review and meta-analysis of practice facilitation within primary care settings. Annals of Family Medicine, 10(1), 63–74. 10.1370/afm.1312
- Berta W., Cranley L., Dearing J. W., Dogherty E. J., Squires J. E., Estabrooks C. A. (2015). Why (we think) facilitation works: Insights from organizational learning theory. Implementation Science, 10, 141. 10.1186/s13012-015-0323-0
- Blase K. (2009). Technical assistance to promote service and system change (Roadmap to Effective Intervention Practices #4). University of South Florida, Technical Assistance Center on Social Emotional Intervention for Young Children. https://files.eric.ed.gov/fulltext/ED577840.pdf
- Blase K. A., Fixsen D. L., Sims B. J., Ward C. S. (2015). Implementation science: Changing hearts, minds, behavior, and systems to improve educational outcomes. The Wing Institute. https://fpg.unc.edu/publications/implementation-science-changing-hearts-minds-behavior-and-systems-improve-educational
- Boothroyd R. I., Flint A. Y., Lapiz A. M., Lyons S., Jarboe K. L., Aldridge W. A., II. (2017). Active involved community partnerships: Co-creating implementation infrastructure for getting to and sustaining social impact. Translational Behavioral Medicine, 7(3), 467–477. 10.1007/s13142-017-0503-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bumbarger B. K., Campbell E. M. (2012). A state agency-university partnership for translational research and the dissemination of evidence-based prevention and intervention. Administration and Policy in Mental Health and Mental Health Services Research, 39(4), 268–277. 10.1007/s10488-011-0372-x [DOI] [PubMed] [Google Scholar]
- Children’s Bureau. (2020, June 11). California Partners for Permanency (CAPP). Administration for Children & Families, U.S. Department of Health and Human Services. Retrieved February 2, 2022, from http://www.acf.hhs.gov/programs/cb/resource/pii-capp
- Chilenski S. M., Olson J. R., Schulte J. A., Perkins D. F., Spoth R. (2015). A multi-level examination of how the organizational context relates to readiness to implement prevention and evidence-based programming in community settings. Evaluation and Program Planning, 48, 63–74. 10.1016/j.evalprogplan.2014.10.004 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chilenski S. M., Perkins D. F., Olson J., Hoffman L., Feinberg M. E., Greenberg M., Welsh J., Crowley D. M., Spoth R. (2016). The power of a collaborative relationship between technical assistance providers and community prevention teams: A correlational and longitudinal study. Evaluation and Program Planning, 54, 19–29. 10.1016/j.evalprogplan.2015.10.002 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Chinman M., Acosta J., Ebener P., Malone P. S., Slaughter M. E. (2016). Can implementation support help community-based settings better deliver evidence-based sexual health promotion programs? A randomized trial of Getting To Outcomes®. Implementation Science, 11, 78. 10.1186/s13012-016-0446-y [DOI] [PMC free article] [PubMed] [Google Scholar]
- Cook D. A., Dupras D. M. (2004). A practical guide to developing effective web-based learning. Journal of General Internal Medicine, 19(6), 698–707. 10.1111/j.1525-1497.2004.30029.x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Courtlandt C. D., Noonan L., Feld L. G. (2009). Model for improvement – part 1: A framework for health care quality. Pediatric Clinics of North America, 56(4), 757–778. 10.1016/j.pcl.2009.06.002 [DOI] [PubMed] [Google Scholar]
- Cucciare M. A., Weingardt K. R., Villafranca S. (2008). Using blended learning to implement evidence-based psychotherapies. Clinical Psychology: Science and Practice, 15(4), 299–307. 10.1111/j.1468-2850.2008.00141.x [DOI] [Google Scholar]
- Dirksen J. (2016). Design for how people learn (2nd ed). New Riders. [Google Scholar]
- Dunst C. J., Trivette C. M. (2012). Moderators of the effectiveness of adult learning method practices. Journal of Social Sciences, 8(2), 143–148. 10.3844/jssp.2012.143.148 [DOI] [Google Scholar]
- Dymnicki A., Wandersman A., Osher D., Grigorescu V., Huang L. (2014, September). Willing, able → ready: Basics and policy implications of readiness as a key component for scaling up implementation of evidence-based interventions (ASPE Issue Brief). U.S. Department of Health & Human Services. https://aspe.hhs.gov/reports/willing-able-ready-basics-policy-implications-readiness-key-component-scaling-implementation
- Eiraldi R., McCurdy B., Khanna M., Mautone J., Jawad A. F., Power T., Cidav Z., Cacia J., Sugai G. (2014). A cluster randomized trial to evaluate external support for the implementation of positive behavioral interventions and supports by school personnel. Implementation Science, 9, 12. 10.1186/1748-5908-9-12 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Feinberg M. E., Ridenour T. A., Greenberg M. T. (2008). The longitudinal effect of technical assistance dosage on the functioning of communities that care prevention boards in Pennsylvania. The Journal of Primary Prevention, 29(2), 145–165. 10.1007/s10935-008-0130-3 [DOI] [PubMed] [Google Scholar]
- Foster-Fishman P. G., Berkowitz S. L., Lounsbury D. W., Jacobson S., Allen N. A. (2001). Building collaborative capacity in community coalitions: A review and integrative framework. American Journal of Community Psychology, 29(2), 241–261. 10.1023/a:1010378613583 [DOI] [PubMed] [Google Scholar]
- Guldbrandsson K. (2008). From news to everyday use: The difficult art of implementation. https://www.folkhalsomyndigheten.se/contentassets/f816e017ddac4628b9bfca8150164b04/from-news-to-everyday-use.pdf
- Hayes S. C., Pistorello J., Levin M. E. (2012). Acceptance and commitment therapy as a unified model of behavior change. The Counseling Psychologist, 40(7), 976–1002. 10.1177/0011000012460836 [DOI] [Google Scholar]
- Heifetz R. A., Grashow A., Linsky M. (2009). The practice of adaptive leadership: Tools and tactics for changing your organization and the world. Harvard Business Press. [Google Scholar]
- Joyce B., Showers B. (2002). Student achievement through staff development (3rd ed). Association for Supervision and Curriculum Development. [Google Scholar]
- Julian D. A. (2017). Toward a definition of the management function as it relates to collaborative community problem-solving. Global Journal of Community Psychology Practice, 8(2), 1–12. Retrieved from http://www.gjcpp.org/. [Google Scholar]
- Karoly P. (1993). Mechanisms of self-regulation: A systems view. Annual Review of Psychology, 44, 23–52. 10.1146/annurev.psych.44.1.23 [DOI] [Google Scholar]
- Katz J., Wandersman A. (2016). Technical assistance to enhance prevention capacity: A research synthesis of the evidence base. Prevention Science, 17(4), 417–428. 10.1007/s11121-016-0636-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kislov R., Pope C., Martin G. P., Wilson P. M. (2019). Harnessing the power of theorising in implementation science. Implementation Science, 14, 103. 10.1186/s13012-019-0957-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Leeman J., Birken S. A., Powell B. J., Rohweder C., Shea C. M. (2017). Beyond “implementation strategies”: Classifying the full range of strategies used in implementation science and practice. Implementation Science, 12, 125. 10.1186/s13012-017-0657-x [DOI] [PMC free article] [PubMed] [Google Scholar]
- Leeman J., Calancie L., Hartman M. A., Escoffery C. T., Herrmann A. K., Tague L. E., Moore A. A., Wilson K. M., Schreiner M., Samuel-Hodge C. (2015). What strategies are used to build practitioners’ capacity to implement community-based interventions and are they effective?: A systematic review. Implementation Science, 10, 80. 10.1186/s13012-015-0272-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Leeman J., Calancie L., Kegler M. C., Escoffery C. T., Herrmann A. K., Thatcher E., Hartman M. A., Fernandez M. E. (2017). Developing theory to guide building practitioners’ capacity to implement evidence-based interventions. Health Education & Behavior, 44(1), 59–69. 10.1177/1090198115610572 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lessard S., Bareil C., Lalonde L., Duhamel F., Hudon E., Goudreau J., Lévesque L. (2016). External facilitators and interprofessional facilitation teams: A qualitative study of their roles in supporting practice change. Implementation Science, 11, 97. 10.1186/s13012-016-0458-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Loper A., Woo B., Metz A. (2021). Equity is fundamental to implementation science. Stanford Social Innovation Review, 19(3), A3–A5. 10.48558/QNGV-KG05 [DOI] [PMC free article] [PubMed] [Google Scholar]
- McCormack B., Rycroft-Malone J., DeCorby K., Hutchinson A. M., Bucknall T., Kent B., Schultz A., Snelgrove-Clarke E., Stetler C., Titler M., Wallin L., Wilson V. (2013). A realist review of interventions and strategies to promote evidence-informed healthcare: A focus on change agency. Implementation Science, 8, 107. 10.1186/1748-5908-8-107 [DOI] [PMC free article] [PubMed] [Google Scholar]
- McWilliam J., Brown J., Sanders M. R., Jones L. (2016). The Triple P Implementation Framework: The role of purveyors in the implementation and sustainability of evidence-based programs. Prevention Science, 17(5), 636–645. 10.1007/s11121-016-0661-4 [DOI] [PubMed] [Google Scholar]
- Metz A., Albers B., Burke K., Bartley L., Louison L., Ward C., Farley A. (2021). Implementation practice in human service systems: Understanding the principles and competencies of professionals who support implementation. Human Service Organizations: Management, Leadership & Governance, 45(3), 238–259. 10.1080/23303131.2021.1895401 [DOI] [Google Scholar]
- Metz A., Bartley L. (2015). Co-creating the conditions to sustain the use of research evidence in public child welfare. Child Welfare, 94(2), 115–140. https://www.jstor.org/stable/48623521 [Google Scholar]
- Metz A., Naoom S., Halle T., Bartley L. (2015). An integrated stage-based framework for implementation of early childhood programs and systems (OPRE Research Brief OPRE 2015-48). Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services. https://nirn.fpg.unc.edu/sites/nirn.fpg.unc.edu/files/resources/OPRE-stage_based_framework_brief_508.pdf
- Metz A., Woo B., Loper A. (2021). Equitable implementation at work. Stanford Social Innovation Review, 19(3), A29–A31. 10.48558/R793-6704 [DOI] [Google Scholar]
- Meyers D. C., Durlak J. A., Wandersman A. (2012). The Quality Implementation Framework: A synthesis of critical steps in the implementation process. American Journal of Community Psychology, 50(3-4), 462–480. 10.1007/s10464-012-9522-x [DOI] [PubMed] [Google Scholar]
- Miller W. R., Rollnick S. (2012). Motivational interviewing: Helping people change (3rd ed). Guilford Publications. [Google Scholar]
- Nadeem E., Gleacher A., Beidas R. S. (2013). Consultation as an implementation strategy for evidence-based practices across multiple contexts: Unpacking the black box. Administration and Policy in Mental Health and Mental Health Services Research, 40(6), 439–450. 10.1007/s10488-013-0502-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- National Academies of Sciences, Engineering, and Medicine. (2019). Fostering healthy mental, emotional, and behavioral development in children and youth: A national agenda. The National Academies Press. 10.17226/25201 [DOI] [PubMed] [Google Scholar]
- Palinkas L. A., Aarons G. A., Chorpita B. F., Hoagwood K., Landsverk J., Weisz J. R. (2009). Cultural exchange and the implementation of evidence-based practices: Two case studies. Research on Social Work Practice, 19(5), 602–612. 10.1177/1049731509335529 [DOI] [Google Scholar]
- Phillippi S., Bumbarger B. K. (2021). Technology’s acceleration in behavioral health: COVID-19, 988, social media, treatment and more (Technical Assistance Collaborative Paper No. 9). National Association of State Mental Health Program Directors. https://www.nasmhpd.org/sites/default/files/9_AcceleratingTechnology_508.pdf
- Prochaska J. M., Prochaska J. O., Levesque D. A. (2001). A transtheoretical approach to changing organizations. Administration and Policy in Mental Health and Mental Health Services Research, 28(4), 247–261. 10.1023/A:1011155212811 [DOI] [PubMed] [Google Scholar]
- Proctor E., Silmere H., Raghavan R., Hovmand P., Aarons G., Bunger A., Griffey R., Hensley M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65–76. 10.1007/s10488-010-0319-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ramaswamy R., Mosnier J., Reed K., Powell B. J., Schenck A. P. (2019). Building capacity for Public Health 3.0: Introducing implementation science into an MPH curriculum. Implementation Science, 14, 18. 10.1186/s13012-019-0866-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ray M. L., Wilson M. M., Wandersman A., Meyers D. C., Katz J. (2012). Using a training-of-trainers approach and proactive technical assistance to bring evidence based programs to scale: An operationalization of the Interactive Systems Framework’s Support System. American Journal of Community Psychology, 50(3-4), 415–427. 10.1007/s10464-012-9526-6 [DOI] [PubMed] [Google Scholar]
- Rhoades B. L., Bumbarger B. K., Moore J. E. (2012). The role of a state-level prevention support system in promoting high-quality implementation and sustainability of evidence-based programs. American Journal of Community Psychology, 50(3-4), 386–401. 10.1007/s10464-012-9502-1 [DOI] [PubMed] [Google Scholar]
- Ritchie M. J., Parker L. E., Kirchner J. E. (2020). From novice to expert: A qualitative study of implementation facilitation skills. Implementation Science Communications, 1, 25. 10.1186/s43058-020-00006-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rogers E. M. (2003). Diffusion of innovations (5th ed). Free Press. [Google Scholar]
- Romney S., Israel N., Zlatevski D. (2014). Exploration-stage implementation variation: Its effect on the cost-effectiveness of an evidence-based parenting program. Zeitschrift für Psychologie, 222(1), 37–48. 10.1027/2151-2604/a000164 [DOI] [Google Scholar]
- Roppolo R. H., McWilliam J., Aldridge W. A., II, Jenkins R. H., Boothroyd R. I., Moore R. I. (2019). Is the concept of self-regulation useful for supporting effective implementation in community settings? Clinical Child and Family Psychology Review, 22(1), 118–128. 10.1007/s10567-019-00286-0 [DOI] [PubMed] [Google Scholar]
- Rushovich B. R., Bartley L. H., Steward R. K., Bright C. L. (2015). Technical assistance: A comparison between providers and recipients. Human Service Organizations: Management, Leadership & Governance, 39(4), 362–379. 10.1080/23303131.2015.1052124 [DOI] [Google Scholar]
- Sanders M. R., Mazzucchelli T. G. (2013). The promotion of self-regulation through parenting interventions. Clinical Child and Family Psychology Review, 16(1), 1–17. 10.1007/s10567-013-0129-z [DOI] [PubMed] [Google Scholar]
- Schell S. F., Luke D. A., Schooley M. W., Elliott M. B., Herbers S. H., Mueller N. B., Bunger A. C. (2013). Public health program capacity for sustainability: A new framework. Implementation Science, 8, 15. 10.1186/1748-5908-8-15 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Singh S., Fenton A., Bumbarger B. K., Casano K., Beiter K., Simpson L., Phillippi S. (2022). Transitioning behavioral healthcare in Louisiana through the COVID-19 pandemic: Policy and practice innovations to sustain telehealth expansion. Journal of Technology in Behavioral Science, 7(3), 296–306. 10.1007/s41347-022-00248-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Spoth R., Greenberg M. (2011). Impact challenges in community science-with-practice: Lessons from PROSPER on transformative practitioner-scientist partnerships and prevention infrastructure development. American Journal of Community Psychology, 48(1-2), 106–119. 10.1007/s10464-010-9417-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stetler C. B., Legro M. W., Rycroft-Malone J., Bowman C., Curran G., Guihan M., Hagedorn H., Pineros S., Wallace C. M. (2006). Role of “external facilitation” in implementation of research findings: A qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implementation Science, 1, 23. 10.1186/1748-5908-1-23 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Waltz T. J., Powell B. J., Matthieu M. M., Damschroder L. J., Chinman M. J., Smith J. L., Proctor E. K., Kirchner J. E. (2015). Use of concept mapping to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10, 109. 10.1186/s13012-015-0295-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wandersman A., Chien V. H., Katz J. (2012). Toward an evidence-based system for innovation support for implementing innovations with quality: Tools, training, technical assistance, and quality assurance/quality improvement. American Journal of Community Psychology, 50(3-4), 445–459. 10.1007/s10464-012-9509-7 [DOI] [PubMed] [Google Scholar]
- West G. R., Clapp S. P., Averill E. M. D., Cates W., Jr. (2012). Defining and assessing evidence for the effectiveness of technical assistance in furthering global health. Global Public Health, 7(9), 915–930. 10.1080/17441692.2012.682075 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wood R., Bandura A. (1989). Social cognitive theory of organizational management. The Academy of Management Review, 14(3), 361–384. 10.2307/258173 [DOI] [Google Scholar]
- Wouters P., van Nimwegen C., van Oostendorp H., van der Spek E. D. (2013). A meta-analysis of the cognitive and motivational effects of serious games. Journal of Educational Psychology, 105(2), 249–265. 10.1037/a0031311 [DOI] [Google Scholar]
- Yazejian N., Metz A., Morgan J., Louison L., Bartley L., Fleming W. O., Haidar L., Schroeder J. (2019). Co-creative technical assistance: Essential functions and interim outcomes. Evidence & Policy: A Journal of Research, Debate and Practice, 15(3), 339–352. 10.1332/174426419X15468578679853 [DOI] [Google Scholar]