Implementation Research and Practice. 2021 Jan 28;2:2633489520982903. doi: 10.1177/2633489520982903

What educational strategies and mechanisms facilitate EBP use? A mixed methods examination of therapist perceptions within a system-driven implementation of multiple EBPs

Mojdeh Motamedi 1,2, Anna S Lau 3, Teresa Lind 1,2,4, Joyce HL Lui 3,5, Adriana Rodriguez 3, Ashley Smith 3, Lauren Brookman-Frazee 1,2
PMCID: PMC9978618  PMID: 37089987

Abstract

Background:

Many strategies may be used by external consultants (such as treatment developers and trainers) and internal program leaders to support evidence-based practice (EBP) implementation. The goal of this study was to identify which educational implementation strategies are considered by therapists to be most helpful, through which mechanisms, and whether these strategies are linked to EBP use.

Methods:

Semi-structured interviews were conducted with 60 therapists, and 826 therapists completed surveys regarding their perceptions of educational implementation strategies and reported delivery of EBPs within a system-driven, multiple-EBP implementation effort. Using sequential QUAL → QUAN mixed methods, we first identified qualitative themes. Next, we conducted a multilevel logistic regression to examine how quantitative survey items corresponding with qualitative themes predicted EBP use.

Results:

Initial qualitative thematic analyses revealed four implementation strategies perceived as essential for EBP delivery: connections to a community of trained therapists, ongoing consultation/supervision, availability of internal supervisors trained in the EBP, and access to EBP materials and resources. Quantitative results showed that strategies related to connections with a community of trained therapists (i.e., the percentage of other therapists at an agency with EBP training and delivery experience, as opposed to those who were only trained in the EBP), ongoing consultation/supervision, and having an internal supervisor trained in the EBP (receiving EBP-specific in-house supervision) were significantly associated with EBP use, whereas receiving EBP boosters was not. The closest quantitative indicator corresponding to access to EBP resources, EBP web-based training, was not associated with EBP use. Therapists reported these strategies supported EBP delivery through exposure to other therapists’ cases, guidance/feedback, emotional support, and removing logistic barriers to EBP use.

Conclusions:

These findings demonstrate how considering therapist perspectives and creating a network of EBP support via supervisors, consultants, and a community of therapists experienced in the EBP may be particularly critical to EBP delivery.

Plain language abstract:

Public mental health systems are increasingly implementing multiple evidence-based practices (EBPs). There are many strategies that may be used by external consultants (such as treatment developers and trainers) and internal program leaders to support EBP implementation. The goal of this study was to identify which of these internal and external implementation strategies are considered by therapists to be most helpful and how these strategies are linked with continued use of EBPs. First, qualitative interviews with therapists revealed the following strategies are key for supporting their delivery of EBPs: (1) connections to a community of trained therapists, (2) ongoing consultation/supervision, (3) having an internal supervisor at their program who was trained in the EBP, and (4) access to EBP materials and logistic resources. Next, quantitative analyses of survey data examined whether any of the strategies therapists identified as most helpful predicted the continued delivery of EBPs by therapists after initial training. Results confirmed that strategies involving connections with a community of therapists trained in and experienced with the EBP, ongoing consultation/supervision, and having an internal supervisor trained in the EBP were each significantly associated with EBP use. Therapists reported these strategies supported EBP delivery through exposure to other therapists’ cases, guidance/feedback, emotional support, and removing logistic barriers to EBP use. These findings can assist systems and programs in prioritizing implementation strategies to support the sustained delivery of EBPs.

Keywords: Evidence-based practices, implementation strategies, sustainment, therapists, sequential QUAL → QUAN mixed methods


Therapists working in community mental health settings are increasingly expected to deliver multiple evidence-based practices (EBPs), often in response to system- or policy-driven EBP implementation initiatives (Bruns et al., 2015; Lau & Brookman-Frazee, 2016). However, system-wide sustainment of EBP delivery following initial implementation is challenging (Bruns et al., 2015; Edmunds et al., 2013). Implementation strategies, or the specific methods used to adopt and sustain EBPs (Proctor et al., 2013), are necessary for realizing the continued benefits of evidence-based care. The Exploration, Preparation, Implementation, and Sustainment (EPIS) framework guiding this study proposes certain implementation strategies may be particularly critical during the sustainment phase of EBP implementation and that mixed methods can help examine these strategies (Aarons et al., 2011; Moullin et al., 2019).

Implementation strategies that focus on training and educating providers are considered among the most important and feasible approaches to facilitating sustained EBP use among therapists (Powell et al., 2015; Waltz et al., 2015). Training, supervision, and consultation are key education and training strategies as they provide continued support after initial training, which by itself is generally insufficient to promote EBP use (Nadeem et al., 2013; Valenstein-Mah et al., 2020; Waltz et al., 2015). However, less is known about how therapists perceive and differentiate among these training strategies and how they mechanistically impact continued use within the sustainment phase of EBP implementation (Moullin et al., 2019).

It is a priority to understand the mechanisms or the processes through which an implementation strategy operates to affect desired implementation outcomes (Lewis et al., 2018, 2020; Williams, 2016). For training, consultation, and supervision, mechanisms thought to improve therapists’ practice include problem-solving, modeling/rehearsing skill acquisition, enhancing motivation and engagement, accountability, sustainability planning, discussing learnings from other cases, and adapting to improve fit (Bearman et al., 2013; Bennett-Levy et al., 2009; Nadeem et al., 2013). However, these mechanisms may not be consistently engaged across educational implementation strategies; thus, it is important to identify mechanisms that therapists identify as associated with particular strategies (Nadeem et al., 2013).

Therapist perspectives on educational implementation strategies

Further understanding therapists’ perceptions of specific implementation strategies is required to refine educational strategies, particularly since therapists may only engage in strategies they perceive to be beneficial (Beidas et al., 2019; Lyon et al., 2013). In addition, multiple methods (e.g., concept mapping, quantitative and qualitative data, and Delphi methods) are needed to generate a better understanding of stakeholders’ perceptions of implementation strategies since they often hold different perspectives (Powell et al., 2015; Stadnick et al., 2018). For example, using mixed methods can triangulate and expand findings to help refine our understanding of implementation strategies (Palinkas et al., 2011).

As targeted consumers of educational implementation strategies, therapists are best positioned to answer questions to help researchers qualitatively and quantitatively identify which aspects of training, consultation, and supervision are most or least helpful. Previous studies have identified a range of educational strategies that improve EBP implementation, including access to ongoing consultation, learning collaboratives, and distributing educational materials (Waltz et al., 2015). However, it remains unclear which implementation strategies therapists perceive as most beneficial, and how these perceptions influence their EBP delivery (Cucciare et al., 2008; Reese et al., 2016). In addition, there is often variability in how these strategies are enacted and by whom (e.g., by therapists’ agency versus outside consultants; Aarons et al., 2011), resulting in different mechanistic processes used in supervision and consultation (e.g., guidance on adaptation versus accountability for providers adhering to plans; Nadeem et al., 2013). Community mental health supervisors predominantly rely on passive training techniques such as supportive listening and information gathering, whereas more active techniques such as didactic instruction, coaching, and performance feedback are less common (Dorsey et al., 2018; Schriger et al., 2020). Understanding therapists’ perceptions can offer insight into optimal approaches to supplying educational strategies, such as frequency and timing of consultation, who provides supervision or consultation, and what processes to focus on to support EBP use (Edmunds et al., 2013; Herschell et al., 2014; Nadeem et al., 2013).

Study context: sustainment of system-driven implementation of multiple EBPs

The current study was part of the “4KEEPS” Study, which applies the EPIS framework to study sustainment of multiple EBPs within the Los Angeles County Department of Mental Health (LACDMH; Aarons et al., 2011; Lau & Brookman-Frazee, 2016). In 2010, LACDMH began reimbursing therapists for providing specific EBPs and financially supporting and coordinating initial training and consultation for these EBPs (LACDMH, 2016). In response, LACDMH agencies employed various implementation strategies to support sustained EBP delivery, many of which focused on training and education, including increasing EBP-specific communication and supervision/consultation, providing guidance on adaptation, building internal capacity through train-the-trainer models, deterring turnover of trained therapists with incentives, using EBP champions/leaders, and providing additional resources such as administrative and workflow support (Regan et al., 2017; Rodriguez et al., 2018).

Although multiple educational strategies were used by LACDMH, some therapists felt the support and training were insufficient, which contributed to discontinued EBP use (Lau et al., 2020). Previous research in this context has focused on therapist characteristics or general implementation climate and organizational context as predictors of EBP sustainment (e.g., Lau et al., 2020; Regan et al., 2017; Rodriguez et al., 2018). The current study expanded this work by differentiating specific educational implementation strategies therapists found useful, exploring how these strategies were useful, and testing associations with therapists’ continued EBP use.

Current study

This study used a sequential QUAL → QUAN mixed methods approach to (1) qualitatively identify educational implementation strategies perceived by therapists as most helpful and the mechanisms by which these strategies support continued EBP delivery, (2) conduct quantitative multilevel analyses assessing the association of identified strategies with therapists’ EBP use, and (3) triangulate findings and deepen interpretation by integrating quantitative and qualitative results.

Methods

Study context

The 4KEEPS study (see Lau & Brookman-Frazee, 2016 for details) examined the implementation and sustainment of the following six EBPs: Cognitive Behavioral Intervention for Trauma in Schools (CBITS), Child-Parent Psychotherapy (CPP), Managing and Adapting Practice (MAP), Seeking Safety (SS), Trauma-Focused Cognitive Behavior Therapy (TF-CBT), and Positive Parenting Program (Triple P). Data were collected 5 years after the initial system-driven roll-out of EBPs, corresponding to the sustainment phase of the LACDMH initiative. This research was conducted in full compliance with APA standards for ethical practice and under the review of the LACDMH and the University of California Los Angeles Institutional Review Boards.

Participants

Therapists were emailed and asked to participate if they had trained in or delivered at least one of the six EBPs. Therapists received $40 for participating in the semi-structured interview, $20 for the survey, and $20 more for completing the survey within 2 weeks of distribution. Written informed consent was obtained from all participants.

“In-depth” interview sample (QUAL)

Through purposeful sampling, 60 therapist interviews across 12 agencies were selected from a larger sample of 126 therapist interviews (Palinkas, 2014; see Table 1 for therapist characteristics). Therapists discussed one to three EBPs based on how many target EBPs they were implementing. On average, therapists discussed two EBPs in their interview. Since the rate of utilization varied by EBP, some EBPs were discussed more than others (see Table 2). Thus, random sampling stratifying by EBP was used to ensure at least 15 interviews were selected per EBP, starting with selecting interviews capturing the least utilized EBPs. Through this process, we exceeded the typically recommended number of interviews and ensured we had sufficient interviews to examine therapist perceptions of educational implementation strategies for each EBP except CBITS (Mason, 2010).
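For illustration only, the stratified random selection described above could be sketched as follows. The data layout, function name, and the exact order of processing are assumptions for this sketch, not the study’s actual procedure or code.

```python
import random

# Illustrative sketch only: "interviews" is assumed to be a list of dicts such as
# {"id": 17, "ebps": ["CPP", "TF-CBT"]}, one per therapist interview, tagged with
# the EBPs discussed. Names and structure are hypothetical, not the study's data.
MIN_PER_EBP = 15

def select_stratified_interviews(interviews, ebps_by_utilization,
                                 min_per_ebp=MIN_PER_EBP, seed=0):
    """Randomly select interviews, stratifying by EBP and starting with the
    least-utilized EBPs, so that each EBP is covered by at least `min_per_ebp`
    selected interviews (when enough interviews exist for that EBP)."""
    rng = random.Random(seed)
    selected, selected_ids = [], set()
    for ebp in ebps_by_utilization:  # ordered from least- to most-utilized EBP
        already_covered = sum(1 for s in selected if ebp in s["ebps"])
        candidates = [i for i in interviews
                      if ebp in i["ebps"] and i["id"] not in selected_ids]
        rng.shuffle(candidates)
        for interview in candidates[:max(0, min_per_ebp - already_covered)]:
            selected.append(interview)
            selected_ids.add(interview["id"])
    return selected
```

Because therapists discussed up to three EBPs per interview, an interview selected for a less-utilized EBP can also count toward the quota of a more-utilized EBP, which is why processing from least- to most-utilized keeps the total number of selected interviews small.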

Table 1.

Characteristics of participants in the qualitative interview and quantitative survey sample.

Characteristic QUAL sample: Therapists (N = 60) QUAN sample: Therapists (N = 826) χ2 or t test
Female 88.30% 88.10% 0.01
Age M = 36.12 (SD = 9.38) M = 36.62 (SD = 9.19) 0.03
Race/ethnicity 5.23
 Latinx 61.70% 43.80%
 Non-Hispanic White 18.30% 34.30%
 Black 10.00% 6.90%
 Asian/Pacific Islander 8.30% 11.70%
 Other 1.70% 3.30%
Level of education 0.09
 Bachelors and under 3.30% 2.40%
 Master’s 86.70% 85.40%
 Doctorate 10.00% 12.20%
Licensed 20.00% 42.60% 10.45**
Discipline 0.15
 Marriage and family therapist 56.70% 55.40%
 Psychologist 13.3% 13.90%
 Social worker 28.30% 28.90%
 Other (e.g., case manager, behavior, or rehabilitation specialist) 1.70% 1.70%
Average # EBPs trained in M = 2.60 (SD = 0.87) M = 2.41 (SD = 1.03) –1.90

M: mean; SD: standard deviation; EBPs: evidence-based practices.

** p < .01.

Table 2.

Description of EBPs and implementation strategy characteristics.

Evidence-based practices Problem target Requires consultation Train the trainer modela Required implementation support characteristicsb Therapists reporting on EBP during interviews
Cognitive Behavioral Intervention for Trauma in Schools (CBITS) Trauma Yes Yes 2-day initial training, 10 weekly consultation calls 0
Child-Parent Psychotherapy (CPP) Trauma and attachment Yes No 2.5-day initial training, 2-day booster training at 6 and 12 months; 18 months of bi-weekly group consultation 18
Managing and Adapting Practice (MAP) Anxiety, trauma, depression, conduct Yes Yes 5-day initial training, 6 months of bi-weekly consultation, portfolio submission 35
Positive Parenting Program (Triple P) Conduct No No 3-day initial training, consultation calls available upon request 17
Seeking Safety (SS) Trauma, substance use No Yes 1-day initial training, optional consultation calls recommended 26
Trauma-Focused Cognitive Behavior Therapy (TF-CBT) Trauma Yes No 2-day initial training, 1-day online training, 12 consultation calls, 1 booster training, audio tape review by certified trainer 35

EBP: evidence-based practices; CBITS: Cognitive Behavioral Intervention for Trauma in Schools; CPP: Child-Parent Psychotherapy; MAP: Managing and Adapting Practice; SS: Seeking Safety; TF-CBT: Trauma-Focused Cognitive Behavior Therapy.

a Train-the-trainer models allowed for clinical supervisors to be certified as trainers for that EBP.

b Based on requirements described by the LACDMH (2016).

Survey sample

A total of 826 therapists from 67 agencies, including the 60 therapists in the “in-depth” study, were included in the quantitative analyses (see Table 1 for characteristics). On average, therapists were trained in 2.4 EBPs. It was rare (0.20%) for therapists to be trained in all 6 EBPs, with almost all therapists (98%) trained in 1 to 4 EBPs.

Measures

Qualitative measures

Semi-structured interviews

Three female postdoctoral fellows and four bachelor-level research assistants (one male, three female) conducted one-time, 1-hr interviews. Interviewers, who did not have pre-existing relationships with the therapists, informed therapists that they were not affiliated with EBP developers or LACDMH and that the interview purpose was to learn about therapists’ EBP perceptions. Interviews were audio-recorded and conducted at therapists’ agencies with only the interviewer and up to one other research team member present.

Interviews followed a funnel approach where therapists were first asked broad questions about training followed by more specific probes about ongoing consultation, booster training, and supervision and what was and was not helpful (Spradley, 1979). See Appendix 1 for specific questions.

Quantitative measures

EBP characteristics

Based on the LACDMH (2016) Implementation Handbook, EBPs were characterized as requiring consultation (yes/no). As described in Table 2, consultation requirements varied in frequency.

Therapist characteristics

Therapists reported on their gender, race, licensure status, and educational degree.

EBP training and use

For each EBP, therapists reported whether they were ever trained in or used the EBP and whether they used the EBP in the past 2 months (see Supplemental Table 1 for prevalence).

EBP implementation strategies

Therapists indicated (yes/no) if they received any of the following beyond the initial training specific to each EBP: booster training/workshop, ongoing consultation with EBP developers, in-house supervision, and web-based training (see Supplemental Table 1 for prevalence).

EBP self-efficacy

Using a 5-point Likert-type scale (1 = not at all, 5 = to a very great extent), therapists rated themselves on: “I am well prepared to deliver this practice even with challenging clients” and “I am confident in my ability to implement this practice.” These items were averaged for each EBP (Cronbach’s α = .89).
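As a minimal sketch only (the array layout and variable names are illustrative, not the study’s data structures), the per-EBP self-efficacy score and the internal consistency of the two items could be computed as follows.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_respondents x n_items) array of item scores."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1)
    total_variance = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical 5-point ratings on the two self-efficacy items for one EBP
# (rows = therapists, columns = the two items quoted above).
ratings = np.array([[4, 5], [3, 3], [5, 4], [2, 3], [4, 4]])
self_efficacy = ratings.mean(axis=1)   # per-therapist self-efficacy score for that EBP
alpha = cronbach_alpha(ratings)        # the study reports alpha = .89
```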

Data analysis plan

Qualitative data analysis

Therapist interviews were transcribed and entered into QSR International’s NVivo 11 Software. Based on recommendations for qualitative analyses in mental health and implementation science, a codebook was developed using a “coding, consensus, and comparison” method (Hill et al., 1997; Palinkas, 2014; Willms et al., 1990). In this iterative process, the coding team reviewed a subset of interviews to inform the development of a preliminary coding scheme. Two clinical psychology postdoctoral fellows then applied this initial coding scheme to four interviews. Once a coding scheme was finalized (see Appendix 1), the codebook was applied to all 60 transcripts. Consistent with the EPIS framework, the coding scheme included two codes distinguishing inner versus outer context implementation support (e.g., agency versus developer/outside supports; Aarons et al., 2011) and two codes delineating when these strategies are used (e.g., initial training versus ongoing consultation). The coding team consisted of the two master coders/trainers (postdoctoral fellows), a doctoral student, and two post-baccalaureate research staff. Master coders met regularly with research assistant coders to discuss procedures and codes and resolve differences in codes through one-on-one discussions. To avoid coder drift and inconsistency, 50% of coded interviews were randomly selected and reviewed by the master coders.

We used a postpositivist paradigm with a mix of inductive and deductive approaches for our thematic analysis, since our codes were based on literature-driven semi-structured interview questions, while themes were inductively identified based on responses to the questions (Palinkas & Cooper, 2017; Ponterotto, 2005). Thematic analysis involved an independent review of codes by authors (M.M., A.R., & A.S.), all clinical psychology postdoctoral fellows, to identify salient themes. This was followed by consensus meetings to review and agree upon broader themes pertaining to educational implementation strategies and mechanisms therapists found most useful in supporting EBP delivery. Identified themes were classified as either (1) educational implementation strategies, which were defined as methods involved in training, supervision, and consultation that are “used to promote the implementation of an EBP,” including the features of these methods (e.g., frequency, source; Lewis et al., 2020); or (2) mechanisms, the processes by “which an implementation strategy operates to affect” EBP implementation (Lewis et al., 2020). Four salient themes about implementation strategies and four themes related to mechanisms were identified (see Appendix 1 subcodes). An implementation strategy or its mechanism was considered helpful if a quote was coded as being appealing or a facilitator, and unhelpful if it was coded as a challenge or barrier. For example, the following quote about the mechanism of “guidance/feedback on EBP delivery” was coded as a barrier: “I wish like in the trainings they would really focus on how to do that, or what is an alternative for that age group, since TF-CBT’s supposed to be good for a three year old.”

Salience was determined based on (1) how many therapists brought up the theme across interviews and/or (2) how much a therapist emphasized a point within an interview. For example, nearly half of the therapist interviews included the theme of valuing a community of trained therapists. Although emotional support emerged as a theme much less often, when it was present, therapists used terms emphasizing its importance such as superlatives or making explicit the essential nature of this mechanism.

Quantitative data analysis

Analyses were restricted to the EBPs therapists were trained in or used. An unconditional logistic model indicated some variance attributable to each level of the data (e.g., therapist-level ICC = .16, agency-level ICC = .02). Thus, multilevel modeling was conducted using Stata/IC 15. Level 1 was specified as EBP (n = 2,064), Level 2 as therapist (n = 826), and Level 3 as agency (n = 67). A single multilevel logistic regression model was conducted to predict therapists’ EBP use in the past 2 months. Level 1 predictors included (1) EBP requiring consultation (0 = SS and Triple P; 1 = CBITS, CPP, MAP, and TF-CBT); therapist report of receiving EBP-specific (2) boosters, (3) consultation, (4) in-house supervision, and (5) web-based training; (6) the percentage of other therapists across an agency that reported ever being trained in and using each EBP; and (7) the percentage of other therapists across an agency that reported only ever being trained in but never having used each EBP. Level 1 also included therapist EBP use in the past 2 months as the outcome and EBP self-efficacy as a covariate, since it was a significant predictor of EBP use in past studies (e.g., Brookman-Frazee et al., 2018; Lau et al., 2020). Level 2 included therapist-level covariates (i.e., gender, race/ethnicity, licensure status, education, and number of EBPs the therapist was trained in).
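To make the model specification concrete, the three-level random-intercept logistic model described above can be written as follows. The notation is ours, and the ICC expressions assume the standard latent-response scaling with a Level-1 residual variance of π²/3.

\[
\operatorname{logit}\Pr(\text{Use}_{ijk} = 1) \;=\; \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ijk} + u_{jk} + v_{k}, \qquad u_{jk} \sim N(0, \sigma^2_u), \quad v_{k} \sim N(0, \sigma^2_v),
\]
\[
\text{ICC}_{\text{therapist}} \;=\; \frac{\sigma^2_u + \sigma^2_v}{\sigma^2_u + \sigma^2_v + \pi^2/3}, \qquad \text{ICC}_{\text{agency}} \;=\; \frac{\sigma^2_v}{\sigma^2_u + \sigma^2_v + \pi^2/3},
\]

where \(i\) indexes EBPs (Level 1), \(j\) therapists (Level 2), and \(k\) agencies (Level 3), \(\mathbf{x}_{ijk}\) collects the EBP- and therapist-level predictors, and the reported odds ratios are \(\exp(\beta)\).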

Integrating qualitative and quantitative analyses

A sequential QUAL → QUAN mixed methods approach was used to triangulate findings and provide further insight into the results. Quantitative findings were organized by qualitative themes (see Table 3). Qualitative and quantitative findings were merged to identify points of convergence and ways the themes and quantitative data expanded on each other, resulting in the implications summarized in Table 4 (Palinkas et al., 2011).

Table 3.

Multilevel logistic regression predicting therapist EBP use in the past 2 months.

Variable by themes of implementation strategies OR 95% CI
Connections to a community of trained therapists
 Percentage of other therapists at agency ever used and were trained in the EBP 2.91** [1.75, 4.82]
 Percentage of other therapists at agency that have only trained in but not used the EBP 0.07* [0.01, 0.31]
Ongoing consultation/supervision
 Therapist has received booster training workshop 1.25 [0.94, 1.70]
 Therapist has received ongoing consultation from EBP developers/trainers 2.45** [1.79, 3.34]
 EBP requiring consultation 2.05** [1.54, 2.73]
Internal supervisor trained in EBP
 Therapist has received EBP-specific in-house supervision 2.60** [1.90, 3.48]
Access to EBP materials and logistic resources
 Therapist has received EBP web-based training 0.80 [0.59, 1.09]
Covariates
 Therapist EBP self-efficacy 2.03** [1.73, 2.38]
 Number of EBPs therapists trained in 0.93 [0.79, 1.09]
 Licensed 0.43** [0.32, 0.60]
 Education (Reference: Bachelor’s degree)
  Master’s degree 0.63 [0.19, 2.07]
  Doctoral degree 0.62 [0.18, 2.21]
 Female 1.08 [0.68, 1.71]
 Race (Reference: Non-Hispanic White)
  Hispanic 1.02 [0.71, 1.45]
  Other minority 0.93 [0.62, 1.41]

OR: odds ratio; CI: confidence interval; EBP: evidence-based practices.

* p < .05; ** p < .01.

Table 4.

Summary of qualitative themes integrated with quantitative findings.

Strategy (qualitative theme): Connections to a community of trained therapists
 Mechanisms: Guidance and feedback on EBP delivery; exposure to other therapists’ cases; emotional support
 Quantitative predictors of EBP use in the past 2 months: More therapists at an agency ever used and were trained in the EBP; fewer therapists at an agency were only trained in an EBP but had not used it
 QUAL → QUAN implication (expansion): What is valued among the connections to a community of trained therapists may be their hands-on experience using the EBP, not their experience from only receiving training in the EBP.

Strategy (qualitative theme): Ongoing consultation/supervision
 Mechanisms: Guidance and feedback on EBP delivery; exposure to other therapists’ cases
 Quantitative predictors of EBP use in the past 2 months: Ongoing consultation from EBP developers/trainers; EBP requiring consultation
 QUAL → QUAN implication (expansion): Booster sessions may not be enough to impact EBP implementation relative to continuous ongoing consultation, which is linked to EBP use.

Strategy (qualitative theme): Internal supervisor trained in EBP
 Mechanism: Guidance and feedback on EBP delivery
 Quantitative predictor of EBP use in the past 2 months: EBP-specific in-house supervision
 QUAL → QUAN implication (convergence): Confirms that having an internal supervisor trained in the EBP is beneficial in addition to other support from within and outside of the agency.

Strategy (qualitative theme): Access to EBP materials/logistic resources, ranging from materials translated into additional languages; additional staff to support EBP implementation (e.g., case managers and parent partners); EBP-specific resources (e.g., a laptop for the MAP dashboard, toys for trauma work in CPP); workbooks; referrals; email updates; websites; extra time; and laptops
 Mechanism: Remove logistic barriers to EBP use
 Quantitative predictor of EBP use in the past 2 months: EBP web-based training
 QUAL → QUAN implication (expansion): There are many ways agencies or EBP developers can remove logistic barriers to EBP use. However, it may be necessary for these additional EBP materials and logistic resources to target removing logistic barriers specific to the situation (e.g., linguistic needs, target problem, EBP-specific needs, location, and therapist preference). These resources need to be evaluated further as their utility may vary by situation.

EBP: evidence-based practices; MAP: Managing and Adapting Practice; CPP: Child-Parent Psychotherapy.

Results

Aim 1: Qualitatively identify salient educational implementation strategies and their mechanisms

Educational implementation strategies

Connections with a community of trained therapists

Over half of the therapists mentioned valuing access to other therapists implementing the same EBP at their own agency and other agencies. Therapists voiced “having other people trained in the agency has been very helpful, because whenever we have a question or a concern, we go to those people to guide us.” Therapists also expressed appreciation for connections to therapists at other programs through group consultation calls organized by the EBP developer, which helped compensate for when therapists “felt really alone” in having difficulty delivering an EBP. Although a few individuals mentioned specific preferences (e.g., smaller more intimate group consultation for closer connections), this theme emphasized the broad range of individuals whom therapists felt they could benefit from connecting with regarding experiences delivering the same EBP rather than specifying who that person was.

Ongoing consultation/supervision

This theme emphasized ongoing support after training from sources within or outside the therapist’s agency, such as through the EBP developer. Importantly, therapists highlighted that it was the ongoing nature of this support, beyond the content of the consultation and supervision itself, that was helpful. Therapists noted, “when I first got my client, it was really hard to understand what I was doing. I felt lost, but then we ended up going to some boosters, so that helped—it reminded us what to do.” Therapists also noted the challenge of not having ongoing EBP-specific consultation/supervision after initial training: “the most challenging thing here was for a long time . . . we went for a good eight months with no supervision.”

Internal supervisor trained in EBP

Therapists valued having an in-house supervisor who “knows her stuff” in contrast to consultation organized by the EBP developer, which is often from outside the therapist’s agency. One therapist reported, “I get weekly supervision by someone who’s actually at my agency, and that’s nice because it’s in-person. . . I like how they kind of plug people in rather than maybe phone calls with the trainers.”

Access to EBP materials and logistic resources

Therapists commented on a diverse range of resources EBP developers and agencies provided (see Table 4). For resources from outside their agency, therapists said, “I like that everything is available on the website, and you can go and grab the information . . . They’re constantly updating them” and within their agency: “we do have case managers [and] rehab specialists available to us. So that helps because then they’re another, they add an extra, so they’re reinforcing certain skills we’re working on.”

Mechanisms of educational implementation strategies

As summarized in Table 4, therapists’ responses revealed four potential mechanisms explaining how the educational implementation strategies impacted EBP use.

For the implementation strategies of connections to a community of trained therapists and ongoing consultation/supervision, therapists reported benefiting from being exposed to other therapists’ cases and receiving guidance/feedback on EBP delivery. Therapists described these activities as helping them feel validated when experiencing challenges and helping them understand what to do in their sessions. One therapist said,

We had the ability to present cases, and we got the team feedback on our cases so that we could understand where we needed to adjust the treatment or to reconsider another approach or intervention. So, that was all very helpful.

Therapists also noted internal supervisors trained in the EBP helped by providing guidance/feedback on EBP delivery. When discussing connections to a community of trained therapists, some therapists also commented on the importance of emotional support, especially for cases involving trauma. This included normalizing and validating challenges therapists faced delivering EBPs. One therapist mentioned,

My supervisor’s door is open whenever I need it, too, so sometimes it’d be like I’m in my own crisis, so—because these cases are hard. You see—like, say sometimes I would have trauma narratives like five in a row and I would want to cry, because it was just too intense. Yeah. My supervision is necessary.

The identified mechanism for access to EBP materials/logistic resources was removing logistic barriers to EBP use. Barriers mentioned included time, translation, and client engagement. One therapist noted, “I think it really depends on where you’re working and how—if there’s support staff to support you in being able to collect all those measures, store them. I think it makes a huge difference.”

Aim 2: Examine associations between EBP delivery and quantitatively measured implementation strategies

Need for colleagues who are trained in and have used the EBP

Each of the quantitative indicators of educational implementation strategies was mapped onto the identified themes. Supplemental Table 2 displays the correlations among quantitative predictors and outcomes. Table 3 displays odds ratios organized by how they corresponded to the qualitative themes across each EBP. For the strategy of having connections to a community of trained therapists, we calculated the percentage of other therapists at an agency that were only trained in an EBP and the percentage that were trained in and used an EBP. A higher percentage of other therapists that were trained in and used an EBP was associated with a greater likelihood of EBP use (odds ratio [OR] = 2.91, p < .01), while the percentage of other therapists at an agency that were only trained in an EBP was associated with a lower likelihood of EBP use (OR = 0.07, p < .05).
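As an illustrative sketch only (the data layout and column names are assumptions, and the study’s exact operationalization of “other therapists across an agency” may differ), the two peer-community predictors could be derived from a therapist-by-EBP table as follows.

```python
import numpy as np
import pandas as pd

def add_other_therapist_pcts(df):
    """Illustrative sketch: df has one row per therapist x EBP with hypothetical
    columns agency_id, therapist_id, ebp, trained (0/1), ever_used (0/1).
    Adds the two 'other therapists at the agency' predictors, excluding the
    focal therapist from both numerator and denominator."""
    df = df.copy()
    df["trained_and_used"] = ((df["trained"] == 1) & (df["ever_used"] == 1)).astype(int)
    df["trained_only"] = ((df["trained"] == 1) & (df["ever_used"] == 0)).astype(int)
    grp = df.groupby(["agency_id", "ebp"])
    n_other = grp["therapist_id"].transform("count") - 1
    n_other = n_other.replace(0, np.nan)  # undefined when a therapist has no agency colleagues
    df["pct_other_trained_used"] = (
        grp["trained_and_used"].transform("sum") - df["trained_and_used"]) / n_other
    df["pct_other_trained_only"] = (
        grp["trained_only"].transform("sum") - df["trained_only"]) / n_other
    return df
```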

Need for ongoing consultation and in-house supervision

The strategy of ongoing consultation/supervision corresponded to three items: the EBP requiring ongoing consultation, and therapist report of receiving booster trainings and ongoing consultation from EBP developers/trainers. EBPs requiring consultation and therapist report of receiving ongoing consultation from EBP developers/trainers were both associated with EBP use (OR = 2.05 and OR = 2.45, respectively, p < .01). However, receiving booster trainings was not (OR = 1.25, p = .13), despite being correlated with EBP use (r = 0.22, p < .01). Having an internal supervisor trained in the EBP quantitatively corresponded to therapist report of receiving EBP-specific in-house supervision, which predicted EBP use (OR = 2.60, p < .01). Finally, receiving web-based training, partially corresponding to the strategy of having access to EBP materials and resources, was marginally associated with a lower likelihood of EBP use (OR = 0.80, p < .10) but was associated with EBP use in bivariate correlations (r = .14, p < .01).

Aim 3: Integrate quantitative–qualitative results

Table 4 provides a summary of integrated qualitative and quantitative findings. This integration clarified that the community of trained therapists needs to have used the EBP and that consultation should be ongoing beyond a few booster sessions to facilitate ongoing EBP use. Qualitative and quantitative results converged regarding the importance of having an in-house supervisor trained in the EBP, while findings regarding access to EBP materials/logistic resources partially diverged. This divergence may be because the mechanism for this strategy suggests it is important for the specific resource to map onto addressing the unique barriers to delivering the specific EBP to a specific population in a specific situation. Further examination of the qualitative data reflects therapist frustration with a lack of access to needed materials: “We go back again to the resources, which are not in Spanish. . . . I’m waiting for that video in Spanish for the teenagers. I’m like, ‘Where is that video?’” Such experiences exemplify the importance of additional resources fitting the need for overcoming specific barriers, such as linguistic challenges.

Discussion

Recent systematic reviews have called for using mixed methods approaches to understand which implementation strategies are considered helpful by different stakeholders, through which mechanisms, and how these strategies are associated with EBP implementation (Lewis et al., 2020; Moullin et al., 2019). Based on a review of studies using the EPIS framework, there is also a lack of studies examining the sustainment phase of system-wide initiatives to increase EBP use (Moullin et al., 2019). To this end, this study used sequential QUAL → QUAN mixed methods to identify educational implementation strategies and the mechanisms that therapists perceive as most helpful in facilitating ongoing EBP delivery during the LACDMH multiple EBP sustainment phase. This resulted in a variety of implications to be considered by both the inner context of mental health agencies and the outer context of EBP purveyors to better support EBP implementation.

Which strategies do therapists find helpful (qualitative findings)

Consistent with prior studies examining perspectives of LACDMH program leaders (Regan et al., 2017; Rodriguez et al., 2018) and stakeholders in other settings (Aarons et al., 2009; Powell et al., 2015), therapists in the current study reported ongoing consultation and in-house supervision were helpful for EBP use. The current study highlighted that therapists especially valued connections to other therapists trained in the EBP. Consequently, it may benefit agencies to have multiple therapists trained in the same EBP. These findings complement Rodriguez and colleagues’ (2018) finding that larger agencies report greater EBP sustainment than smaller agencies, since there is a greater likelihood of peers being trained in the same EBP within a larger agency. However, program leaders should be cautious to not train individual therapists in too many EBPs, which is associated with EBP discontinuation (Lau et al., 2020).

Mental health programs can also consider leveraging connections with other agencies since therapists valued connecting and consulting with therapists within and outside their agency who are using the same EBP. A recent review of the EPIS framework encourages putting greater attention on how to bridge across inner and outer settings (Moullin et al., 2019). For example, smaller programs can facilitate access to other trained therapists via connections through EBP consultation calls organized by EBP purveyors. Since external consultation by purveyors can become expensive, our findings suggest agencies may share consultation groups across programs and agencies, and utilize in-house EBP supervision, which can be more cost-effective (Nadeem et al., 2013; Olmstead et al., 2011; Triplett et al., 2020). Online learning collaboratives may be another way to facilitate connections among trained therapists in different agencies, but more research is necessary in this area (Mehta et al., 2018).

Therapists reported that connections to other trained therapists were especially valuable because they provided modeling of EBP application via other therapists’ cases, guidance on EBP delivery, and emotional support. For the implementation strategy of ongoing consultation, exposure to other cases and EBP guidance/feedback were also identified as mechanisms, while therapist comments indicated that EBP guidance/feedback was the primary beneficial mechanism for in-house supervision. Other studies have similarly identified EBP guidance/feedback (e.g., on rehearsing a skill, overcoming challenges, or on how to adapt an EBP) and exposure to other therapists’ cases as mechanisms important for EBP delivery (e.g., Herschell et al., 2014; Nadeem et al., 2013). However, other core processes identified in a review by Nadeem and colleagues (2013), such as accountability and sustainability planning, were not mechanisms described by therapists in this study. These mechanisms may not be as relevant for therapists in a context where they are employed by agencies contracted to deliver EBPs for reimbursement. Given that accountability mechanisms are considered important for quality assurance and long-term EBP use, especially after therapists no longer receive expert EBP guidance, it is important for agencies and EBP trainers to prime expectations for addressing these important processes and to help therapists see their value (Nadeem et al., 2013). Furthermore, agencies should consider how to increase the use of active supervision techniques such as EBP-specific guidance/feedback. This may be important given that passive techniques are more common in community settings and up to one third of supervision time may be devoted to addressing non-clinical functions, including administrative tasks (Dorsey et al., 2017; Schriger et al., 2020).

Therapists in this study also shed light on the importance of emotional support, which was not identified as a core mechanism for supervision and consultation by Nadeem and colleagues’ (2013) review. Emotional support was a mechanism unique to having connections to a community of trained therapists. Informal consultation and social support through a community of trained therapists can help therapists feel validated, reduce therapist burnout, and, therefore, improve sustained delivery of EBPs (Carson et al., 1999). It is notable that therapists mentioned emotional support since it is not as explicitly or routinely emphasized in many implementation models. Certain EBP implementation models emphasize this mechanism to a greater extent, such as Dialectical Behavior Therapy, which encourages formal peer consultation (Prada et al., 2018). Support from other therapists may be especially important for therapists seeing clients with significant clinical needs since this mechanism was emphasized among therapists treating trauma.

Which implementation strategies predict EBP use (quantitative findings)

Quantitative findings clarified that a community of therapists only trained in the EBP (but who never used it) was relatively unhelpful: it was a community of colleagues with training in and experience using the EBP that predicted continued EBP delivery. Receiving EBP consultation and in-house supervision also predicted EBP use. However, EBP consultation/training boosters may be too brief in dose to add value when a therapist is already receiving consultation and/or in-house supervision. This is consistent with therapists emphasizing the ongoing nature of consultation during the qualitative interviews, the EPIS framework stressing continued inner and outer context supports, and multiple studies showing initial trainings are insufficient for continued EBP use (Moullin et al., 2019; Owens et al., 2014).

Implications of integrated qualitative and quantitative findings

In addition to the aforementioned ways that quantitative findings expanded on the qualitative findings, both quantitative and qualitative results converged in emphasizing the importance of some in-house EBP-specific support for therapists. Outside support or consultation may not be able to replace other factors associated with having an in-house supervisor that knows the EBP well, such as having a local EBP champion or trainer and greater access to EBP guidance/feedback within the program (Aarons et al., 2016; Rodriguez et al., 2018). Train-the-trainer models are one way to ensure there is a trained local supervisor within a therapist’s agency for EBP-specific guidance. This is consistent with findings that therapists receiving training and consultation from both an outside master trainer and an in-house agency trainer had better outcomes in comparison to therapists receiving training and consultation from only an outside master trainer (Triplett et al., 2020). These results suggest that when applying the EPIS framework, some inner setting implementation support is prudent (Moullin et al., 2019).

Comparing the qualitative and quantitative results can also help explain why EBP web-based training did not predict therapist EBP use. This may partially be due to the quantitative indicator not capturing the additional agency-furnished EBP resources therapists described as helpful during qualitative interviews, some of which have come up in other studies (e.g., workbooks, translations, toys for CPP, online monitoring systems, additional staff support; Regan et al., 2017). Furthermore, since the mechanism by which extra resources helped implementation was removing logistic barriers to EBP use, the specific resource of web-based training may not have helped reduce salient barriers. Thus, the integration of the qualitative and quantitative data emphasizes the need for extra resources to address the barriers therapists face in specific situations. For example, therapists discussed how referrals and extra staff support were helpful, which may help overcome barriers such as EBP fit for caseload and extra burden associated with translation and documentation (Lau et al., 2020). Consistent with the EPIS framework’s emphasis on EBP fit across levels, other studies with LACDMH and other mental health systems have linked implementation outcomes to extra support via staff time and outreach for appropriate referrals (Aarons et al., 2011; Lau et al., 2020; Moullin et al., 2019; Rodriguez et al., 2018). Further research assessing additional EBP resources that help therapists overcome barriers to EBP use, and research on how this varies (e.g., by EBP, population, setting), is necessary for understanding the range of resources that contribute to EBP sustainment under different conditions.

Limitations

This study focused exclusively on therapist responses to questions about their training, supervision, and consultation; therefore, findings are limited to educational implementation strategies. Consequently, we did not examine therapist perspectives on other EBP-specific implementation strategies and whether other strategies, such as incentives to reward EBP use, predicted EBP use. Additional research examining such counterfactuals would help interrogate the relative effectiveness of distinctive categories of implementation strategies on EBP implementation.

Qualitative results may have been impacted by recall bias since therapists received training and consultation in EBPs at different times, but this is unlikely since many interviews suggested therapists were still receiving these implementation strategies and consultation can last up to 18 months for some EBPs. Notably, the qualitative sample had more unlicensed therapists than the quantitative sample possibly due to licensed therapists being less available for interviews. Although a sample with more licensed therapists may have had different perspectives on educational implementation strategies, this is unlikely since the quantitative results mostly confirmed the qualitative findings.

The quantitative findings were limited by single item indicators that were measured concurrently. Future studies can examine whether established summary scores corresponding to identified strategies prospectively predict therapists’ EBP sustainment. In addition, the quantitative indicator of receiving EBP web-based training did not completely capture the broader theme of access to EBP materials and resources. Unfortunately, our quantitative data did not include items fully capturing all the resources mentioned in therapist interviews. Future studies can examine whether other EBP materials and resources facilitate EBP implementation.

Recommendations for implementation strategy refinement

This study advances understanding of therapist perceptions of the beneficial aspects of EBP-specific educational implementation strategies within a system-wide sustainment phase of multiple EBPs (Lau & Brookman-Frazee, 2016; Moullin et al., 2019). Our results highlight how quantitative findings can elaborate on qualitative results and how findings can differ when analyzing the real-world experience of therapists receiving multiple EBP strategies for multiple EBPs simultaneously. Findings suggest mental health programs and EBP purveyors can refine their approach to providing educational strategies for continued EBP implementation by ensuring therapists receive

  1. Connections to other therapists who were trained in and used the EBP within the same agency or at another agency. This can provide EBP guidance and emotional support, a less emphasized strategy in previous research with this population (e.g., Regan et al., 2017; Rodriguez et al., 2018);

  2. Ongoing consultation that includes exposure to other therapists’ cases and EBP guidance/feedback;

  3. In-house EBP-trained supervisor providing EBP guidance/feedback; and

  4. Access to other materials and logistic resources necessary for overcoming logistic barriers to EBP implementation.

Taken together, these educational implementation strategies are likely to increase therapists’ use of an EBP beyond initial training.

Supplemental Material

sj-pdf-1-irp-10.1177_2633489520982903: Supplemental material for “What educational strategies and mechanisms facilitate EBP use? A mixed methods examination of therapist perceptions within a system-driven implementation of multiple EBPs” by Mojdeh Motamedi, Anna S Lau, Teresa Lind, Joyce HL Lui, Adriana Rodriguez, Ashley Smith, and Lauren Brookman-Frazee, in Implementation Research and Practice.

Appendix 1

Therapist in-depth interview codebook for codes used in this study

Relevant Interview Questions:

Now I’ll ask about your training experiences and then move on to how you feel about using _________ now.

  • 3. How do you feel about the training you have received in _____?

  • 4. What other training experiences have you had since the initial training?

  •  a. What was your experience with any ongoing consultation from an outside trainer?

  •  b. What was your experience with any booster training?

  •  c. What was your experience with any [EBP] supervision offered at your agency?

Attitudes

  • 5. Appeal: What do you like most about using ____?

  • 6. Implementation Support: What types of supports have made this practice easier for you to deliver?

(reminder for staff: supports can include those offered by the developer, outside trainers, LACDMH resources, or agency resources/procedures)

  • 7. Limitations: What did you find most challenging about using ___?

Prompt for each challenge raised:

  • i. What has helped you move past these challenges?

Note: Specifier nodes are not required. If the therapist response is too general to be specified, text can be coded only to parent node or sub-node.
Construct: Perceptions of Training
Specifiers: Appeal [A]; Challenges/Limitations [C]
Questions/notes: Q: How do you feel about the training you have received in ___? Q: What other training experiences have you had since the initial training? (Probes: ongoing consultation, booster training, practice-specific supervision.) Note: There may be common overlap across both sub-codes. Use keywords and exemplars to help distinguish the proper node. Co-coding is appropriate.

 Subcode: Initial training
 Definition: Any comments or descriptions about initial training experiences. This should only include content wherein the first training experiences are described. The moment a therapist begins describing non-initial training such as ongoing supervision/consultation (e.g., phone calls, in-person), only/also code the Ongoing consultation node. Appropriate to code in/formal training.
 Exemplar: “The trainings were overwhelming because it was jam packed with information.” [Challenges/Limitations]

 Subcode: Ongoing consultation/training
 Definition: Any comments about experiences with ongoing consultation or training, including the re/certification process. Key words: refresher, booster trainings. This might be developer specific but can also be ongoing consultation/training received from the agency. In the latter case, you might consider co-coding with IMPLEMENTATION SUPPORTS. Appropriate to code in/formal training.
 Exemplar: “But it was good to have those refresher boosters because I think that’s what really helped us if we had questions, we could go back and things like that.” [Appeal]

Construct: Other Implementation Supports
Specifiers: Barriers [B]; Facilitators [F]
Questions/notes: Q: What has helped you move past these challenges?

 Subcode: Developer/outside supports
 Definition: Any comments about implementation supports from the developer or outside supports unrelated to actual training/booster sessions; those should be coded under “Perceptions of Training.” Trainer/developer supports, such as messages relayed about the practice or additional resources outside of the context of formal training and boosters, may be coded here. Comments about lacking resources or challenges with resources may be coded as Barriers.
 Exemplars: “The website resources for that practice are very easy to navigate” [Facilitator]; “I mean they always say to adjust the practice with home-based delivery settings, but I don’t think you’ve ever been given examples, right, and so it’s like a trial error kind of basis.” [Barrier]

 Subcode: Agency resources and procedures
 Definition: Any comments about agency resource and procedure implementation supports, including clinical supports delivered within the agency (e.g., practice-specific supervision) or administrative supports above and beyond those expected from the developer (supports expected from the developer, such as supervision, should instead be coded Developer/Outside Supports). Include comments about peer supervision/support. Agency resources might include referrals to agency-based parent partner programs or in-house psychiatry services. If unclear whether Outside or Agency-based support, code under parent node only. Comments about lacking resources or challenges with resources may be coded as Barriers.
 Exemplar: “Our agency hired someone to help with outcome data entry.” [Facilitator]

Footnotes

Authors’ note: This research was conducted in full compliance with APA standards for ethical practice in research, under the review of the LACDMH and the University of California Los Angeles Institutional Review Boards, which were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Informed consent in writing was obtained from all participants included in the study.

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by National Institute of Mental Health (NIMH) Grant R01 MH100134 to the second and last author.

Supplemental material: Supplemental material for this article is available online.

