Abstract
Background
System-driven scale-up of multiple evidence-based practices (EBPs) is an increasingly common method used in public mental health to improve care. However, there are few data on the long-term sustained delivery of EBPs within these efforts, and previous studies have relied on retrospective self-report within cross-sectional designs. This study identified prospective predictors of sustained EBP delivery at the EBP, therapist, and organizational levels using survey and administrative claims data within a large-scale system-driven implementation effort.
Methods
A total of 777 therapists and 162 program leaders delivering at least one of six EBPs of interest completed surveys assessing perceptions of EBPs and organizational context. These surveys were linked to administrative data to examine prospective predictors of therapists’ EBP delivery over 33 months.
Results
Five of the six EBPs implemented showed sustained delivery in the system, with volume varying by EBP. Although total EBP claim volume per therapist decreased over time, the volume ratio (ratio of EBP-specific claims to total EBP and non-EBP claims) remained relatively stable. Multilevel models revealed that required consultation, unstructured content, higher therapist self-efficacy with the EBP, and more positive program leader perceptions of the EBP were associated with greater sustained volume and volume ratio of the EBP. Therapists who were trained in fewer EBPs, who were unlicensed, and who worked in agencies rated by program leaders as lower on staff autonomy and organizational stress showed greater sustained EBP volume and volume ratio. Finally, more direct service hours per week per therapist predicted greater sustained EBP volume, but lower volume ratio.
Conclusions
The results point to the importance of EBP, therapist, and organizational factors that may be targeted in implementation strategies to promote the sustainment of EBPs.
Keywords: Evidence-based practice, sustainment, administrative claims, community mental health, children’s mental health
Prospective Predictors of Sustainment of Multiple EBPs in a System-Driven Implementation Context: Examining Sustained Delivery Based on Administrative Claims
Large-scale evidence-based practice (EBP) implementation efforts are increasingly common as systems address the research-to-practice gap in children’s mental health services (Hoagwood et al., 2014). Despite the steep investments in these efforts (Roundfield & Lang, 2017), there are few data regarding the sustainment, or continued use, of EBPs in routine care following initial scale-up (Stirman et al., 2012). Within one system-driven multiple EBP implementation in Los Angeles County, therapists were found to sustain delivery of at least one EBP for an average of 22 months (Brookman-Frazee et al., 2018). The current analysis expands this work by examining how therapist and program leader perceptions of individual EBPs and organizational climate prospectively predicted therapists’ sustained delivery of EBPs, as measured by administrative claims data, over 33 months within this initiative.
Conceptual Framework of EBP Sustainment
The Exploration, Preparation, Implementation, Sustainment (EPIS) Framework contends that the fit of the EBP within the system (outer context), as well as the fit within the organization (inner context), shape the likelihood of EBP sustainment (Aarons, Hurlburt, et al., 2011a; Moullin et al., 2019). Within the inner context, EPIS delineates multiple levels of variation (e.g., EBP, therapist, and organization) that can impact EBP sustainment (Aarons, Hurlburt, et al., 2011a). The current study examined how variation in inner context factors, including therapist, organizational/agency, and EBP-level factors, predicted the continued delivery of multiple EBPs during the sustainment phase of a large-scale implementation.
EBP Factors Potentially Related to Sustainment
The flexibility or potential for adaptation of an EBP has been identified as a factor that can promote sustainment (Han & Weiss, 2005; Palinkas et al., 2013). However, community therapists have also rated structured EBPs that provide session-by-session guidance on activities as relatively more appealing (Barnett et al., 2017). The conditions of EBP training can also influence sustainment, with ongoing consultation beyond the initial workshop identified as a key driver of EBP use (Herschell et al., 2010), and greater participation in ongoing consultation associated with longer EBP sustainment (Cooper et al., 2015; Hodge et al., 2017). A lack of available consultation support following initial EBP training has been cited as a contributing factor to program-level discontinuation of an EBP (Rodriguez et al., 2018).
Therapist perceptions of specific EBPs are also related to implementation outcomes (Cook et al., 2015; Reding et al., 2014); however, most studies have relied on cross-sectional designs. One prospective study on the implementation of a school-based violence prevention program found that positive teacher attitudes toward the program predicted continuation and growth of implementation (Gregory et al., 2007). Negative therapist perceptions of an EBP have been linked to therapists’ retrospective reports of discontinuing that EBP (Lau et al., 2020).
Therapist self-efficacy, or confidence in delivering an EBP, has been hypothesized to influence sustainment (Scheirer, 2005). Poorer sustainment of interventions has been linked to therapists’ perceptions of the difficulty and complexity of an EBP (Hunter et al., 2015). Although data have largely been limited to cross-sectional analyses of therapist self-efficacy and EBP sustainment (Lau et al., 2020), teachers have been found to be more likely to continue implementing a school-based intervention when they felt more confident delivering it (Rohrbach et al., 2006).
Therapist Factors Related to Sustainment
Therapist factors, including training background, professional identity, and workload conditions, can also impact sustained EBP delivery (Blasinsky et al., 2006; Novins et al., 2013). Background characteristics such as theoretical orientation, years of experience, and professional licensure have not been consistently associated with therapist EBP adoption, implementation, or sustainment outcomes (e.g., Beidas et al., 2017; Lau et al., 2020; Wolk et al., 2016). Workload factors, including high caseloads, productivity demands, and burnout, appear to play a role in EBP sustainment. Although sufficient direct service hours with clients may support continued use of EBPs (Brookman-Frazee et al., 2018; Lau et al., 2020), higher caseloads, feeling overworked, and emotional exhaustion are related to poor EBP sustainment (Brookman-Frazee et al., 2018; Palinkas et al., 2013). These same conditions relate to turnover of therapists trained in EBPs, which contributes to sustainment failures at the agency and therapist levels (Beidas et al., 2016; Massatti et al., 2008; Scanlan & Still, 2019; Woltmann et al., 2008).
A review of studies across care sectors found that more positive provider attitudes towards EBPs were associated with sustainment (Stirman et al., 2012). Therapist views of EBPs in general may influence sustained use, beyond their perceptions of a specific EBP. A stance of openness to practice innovations and the feeling that EBPs diverge from one’s usual approach to care are two facets of general attitudes toward EBP (Aarons, 2004). Perceptions that EBPs diverge from one’s usual approach have been linked to discontinuing an EBP in a cross-sectional analysis, but prospective data are lacking (Lau et al., 2020).
Organizational/Agency Factors Related to Sustainment
Supportive leadership is a determinant of EBP sustainment (Aarons & Sommerfeld, 2012). Program leaders make critical decisions that support sustainment, including selection of implementation strategies (Aarons, Sommerfeld, et al., 2011b). Organizational leaders’ positive perceptions of adopted EBPs have been found to predict sustainment in schools (Langley et al., 2010) and community mental health agencies (Rodriguez et al., 2018). Data indicate the importance of leadership behaviors in promoting EBP sustainment, such as being knowledgeable about EBPs, supporting staff in the implementation process, proactive problem-solving, and perseverance through the difficulties of implementation (Aarons et al., 2014, 2016).
Organizational climate, or shared staff perceptions of the work environment, is posited as a central influence on implementation outcomes (Aarons, Hurlburt, et al., 2011a). Climates characterized by manageable levels of staff stress, higher perceived job autonomy, and staff cohesion have been linked to positive EBP perceptions (Aarons & Sommerfeld, 2012) and reduced turnover (Aarons et al., 2009), which can support EBP sustainment. However, findings regarding the direct link between general aspects of organizational climate and EBP sustainment are mixed, with some data showing greater sustainment in higher functioning organizational contexts (Rohrbach et al., 2006) and other data showing no link (Rodriguez et al., 2018).
The Current Study
Study and Outer Fiscal Context
In 2010, the Los Angeles County Department of Mental Health (LACDMH) launched the Prevention and Early Intervention (PEI) initiative with permanent funding from the California Mental Health Services Act. Through the PEI program, LACDMH aimed to deliver EBPs to a population of youth for whom a short-term (usually less than one year), relatively low-intensity intervention would measurably improve symptoms early in their manifestation. PEI expanded care to populations who might otherwise not meet medical necessity criteria for Medicaid-funded services. To be eligible for PEI, youth must be early in the course of an emotional disturbance. LACDMH furnished large-scale training for therapists in an initial set of six EBPs: Cognitive Behavioral Intervention for Trauma in Schools (CBITS), Child-Parent Psychotherapy (CPP), Trauma-Focused Cognitive Behavior Therapy (TF-CBT), Triple P - Positive Parenting Program (Triple P), Seeking Safety (SS), as well as Managing and Adapting Practice (MAP; see Appendix A). Training for these EBPs began in March 2010 and services began in July 2010.
The timing of the PEI initiative coincided with a state budget crisis, which substantially reduced other sources of funding for mental health services. PEI provided an opportunity for agencies to continue to deliver services (Regan et al., 2017) through a new funding stream, and the launch of PEI in 2010 resulted in an unprecedented scale-up of EBPs (Brookman-Frazee et al., 2016). Although non-PEI (e.g., County General Funds) funding sources increased with the restoration of general funds beginning in 2014, the PEI program continued to be included in agency contracts for clients determined to be eligible, and agencies were expected to utilize their PEI allocations. In the current sample, PEI claims comprised 51% of all billing in fiscal year (FY) 2014–15, 43% in FY 2015–16, 37% in FY 2016–17, and 36% in FY 2017–18.
Study Objectives
First, we sought to describe levels of therapist PEI EBP sustainment 6–8 years after the initial PEI implementation by using administrative claims data to index volume and the ratio of EBP-specific claims to all psychotherapy claims over 33 months. Second, we examined prospective predictors of sustained PEI EBP volume and volume ratio across EBP, therapist, and agency levels. We hypothesized that EBP factors (e.g., required ongoing consultation, positive therapist and program leader perceptions of EBPs), therapist factors (e.g., lower workload and exhaustion, and more openness and less divergence with EBPs in general), and agency factors (e.g., higher staff autonomy, lower organizational stress, and higher organizational cohesion) would be associated with greater sustainment. Previous cross-sectional studies of the PEI initiative have examined correlates of program leader-reported sustainment of EBPs at the agency level (Rodriguez et al., 2018) and therapist-reported discontinuation of EBPs at the therapist level (Lau et al., 2020). This study extends the research by: (1) indexing EBP sustainment with administrative claims data, (2) linking therapist- and program leader-reports of inner context factors with claims data, and (3) prospectively assessing associations between inner context conditions (at the EBP, therapist, and organizational levels) and EBP sustainment.
Methods
Procedure
Data were drawn from the Knowledge Exchange on Evidence-Based Practice Sustainment (4KEEPS) study, which examined predictors of sustained delivery of multiple EBPs within a system-driven implementation effort in Los Angeles County (see Lau & Brookman-Frazee, 2016 for details). Two data sources included: (1) survey data collected in 2015 from therapists and program leaders delivering PEI EBPs, and (2) administrative claims data for participating therapists extracted for the 33 months following survey completion (April 2015 through December 2017). Ninety-eight LACDMH agencies that delivered at least one of the EBPs of interest were eligible. In agency-level recruitment, leaders from 69 agencies provided contact information for eligible therapists who had been trained in and delivered at least one of the six EBPs during fiscal year 2013–2014. Program leaders were eligible if they provided clinical or administrative oversight for one or more of the EBPs in the same period.
Survey Data
Eligible therapists and program leaders were emailed a link to provide consent and complete online surveys. Survey data collection occurred between March and July 2015, with a response rate of 44.2% for therapists and 60.7% for program leaders, similar to rates in previous surveys of community therapists (Cashel, 2002; Hawley et al., 2009).
Administrative Claims Data
Psychotherapy claims included information about each therapist-client encounter (e.g., procedure code, youth diagnosis). In addition, each psychotherapy claim included an “EBP code” field, in which LACDMH required therapists to report if they used an EBP, regardless of the funding source. For PEI reimbursement in Los Angeles County, therapists were required to deliver an EBP and indicate which EBP was delivered in this EBP code field (Los Angeles County Department of Mental Health, 2016a). In the current sample, 95% of PEI claims reported a specific EBP in the EBP code field (e.g., TF-CBT), and 5% of non-PEI claims reported a specific EBP in this field.
All psychotherapy claims for therapists who completed the survey were extracted for the 33 months following survey completion (between April 2015 and December 2017). These included claims for any funding source including Medicaid or PEI. Therapist survey data were matched to administrative claims data by therapists’ National Provider Identification numbers. A total of 606,374 psychotherapy claims for children 21 years old and younger at the time of service were available for analysis. Study procedures were approved by Institutional Review Boards at University of California Los Angeles and Los Angeles County Department of Mental Health.
Participants
See Table 1 for demographic, training, and service delivery characteristics of therapists and program leaders. Survey data from the 777 therapists were matched with administrative claims data. Survey data from 162 program leaders were collected to assess their EBP perceptions and the organizational climate of the agencies where the therapists were employed. Program leader data were aggregated to the agency level to capture organizational characteristics.
Table 1.
Descriptive statistics for therapist (n = 777) and program leader (n = 162) characteristics.
| Characteristics | Therapists (n = 777) | Program leaders (n = 162) |
|---|---|---|
| Gender, No. (%) | ||
| Female | 685 (88.2) | 136 (84.0) |
| Male | 92 (11.8) | 26 (16.0) |
| Race/Ethnicity, No. (%) | ||
| African American | 52 (6.7) | 13 (8.0) |
| Asian American/Pacific Islander | 91 (11.7) | 21 (13.0) |
| Hispanic | 332 (42.7) | 43 (26.5) |
| Non-Hispanic White | 274 (35.3) | 80 (49.4) |
| Multiracial/Other | 28 (3.6) | 5 (3.1) |
| Education, No. (%) | ||
| Less than master’s degree | 18 (2.3) | 0 (0) |
| Master’s degree | 668 (86.0) | 136 (84.0) |
| Doctoral degree | 91 (11.7) | 26 (16.0) |
| Licensure, No. (%) | ||
| Licensed | 351 (45.2) | 153 (94.4) |
| Not licensed | 336 (54.8) | 9 (5.6) |
| Discipline, No. (%) | ||
| Marriage and family therapy | 431 (55.5) | 85 (52.5) |
| Psychology/counseling | 101 (13.0) | 20 (12.3) |
| Social work | 231 (29.7) | 57 (35.2) |
| Other | 14 (1.8) | 0 (0) |
| Language, No. (%) | ||
| Can deliver services in English only | 359 (44.9) | 99 (61.1) |
| Can deliver services in a non-English language | 428 (55.1) | 63 (38.9) |
| Primary setting,a No. (%) | ||
| Outpatient clinic | 615 (79.2) | 134 (82.7) |
| School | 81 (10.4) | 52 (32.1) |
| Community/Other | 81 (10.4) | 72 (44.4) |
| Theoretical Orientation, No. (%) | ||
| Cognitive behavioral or behavior | 446 (57.4) | 67 (41.4) |
| Not cognitive behavioral or behavior | 331 (42.6) | 95 (58.6) |
| Years of professional experience | ||
| Mean (SD) | 6.8 (6.1) | 12.7 (6.9) |
| Range | 0–45 | 0–35 |
| Number of the six EBPs trained in | ||
| Mean (SD) | 2.4 (1.1) | — |
| Range | 1–6 | — |
Note. No. = number; SD = standard deviation; EBP = evidence-based practice.
a Program leaders could select more than one setting for their program.
Measures
Outcome Variables
Volume of claims for specific PEI EBP per active month
The total number of psychotherapy claims per therapist during the study period for each of the six PEI EBPs was divided by the number of months the therapist made any claims (to PEI and non-PEI sources) during the 33 months. An example of how this would be calculated for the specific PEI EBP of MAP is:
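The original equation did not survive extraction; a rendering reconstructed from the definition above (using MAP as the example) is:

$$\text{MAP volume per active month} = \frac{\text{total MAP claims over the 33 months}}{\text{number of months with any claim (PEI or non-PEI)}}$$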
Therapist ratio of PEI EBP-specific claims to all claims (PEI EBP volume ratio)
PEI EBP volume ratio assesses the extent to which therapists were using a given PEI EBP as a function of all their psychotherapy claims over the study period. The number of a therapist’s claims for a specific PEI EBP was divided by the therapist’s total number of psychotherapy claims (PEI and non-PEI sources) billed over the study period. Volume ratios could range from 0 to 100%. An example of how this would be calculated for the specific PEI EBP of MAP is:
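The original equation did not survive extraction; a rendering reconstructed from the definition above (using MAP as the example) is:

$$\text{MAP volume ratio} = \frac{\text{total MAP claims}}{\text{total psychotherapy claims (PEI and non-PEI)}} \times 100\%$$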
Predictor Variables: EBP-Level Factors
EBP structure: prescribed session content/order
EBPs were characterized based on whether or not they had prescribed session content and an intended order of sessions. Following Barnett et al. (2017), CBITS, Triple P, and TF-CBT were classified as having prescribed session content and order, and Seeking Safety, CPP, and MAP were classified as not having prescribed session content/order. See Appendix B.
EBP training requirements: consultation
EBPs were also characterized on whether or not there was mandatory follow-up consultation with an EBP expert trainer after the initial training. Consistent with prior studies, CBITS, CPP, MAP, and TF-CBT were classified as having required consultation, and Seeking Safety and Triple P were coded as not requiring consultation (Barnett et al., 2017). See Appendix C.
Therapist perceived self-efficacy delivering a specific EBP
On the survey, therapists reported on feelings of self-efficacy, or their belief in their capacity to deliver an EBP, for each of the EBPs they had delivered (Bandura, 1977). Two items were used for each EBP: “I am well prepared to deliver [EBP] even with challenging clients,” and “I am confident in my ability to implement [EBP].” Therapists responded on a five-point Likert scale (1 = not at all, 5 = a very great extent). A mean of both items was used as a composite (α = .83 to .94 across EBPs).
Therapist EBP perceptions
An adapted version of the Perceived Characteristics of Intervention Scale (PCIS; Cook et al., 2015) was used to assess therapists’ views of specific EBPs. Therapists rated each item on a five-point Likert scale (1 = not at all, 5 = a very great extent). We used a composite mean score to measure therapists’ perceptions of each EBP. Internal consistencies were high (α = .93 to .94 across EBPs).
Predictor Variables: Therapist Factors
Therapist demographic, professional, and practice characteristics
In the survey data, therapists reported their gender, race/ethnicity, language(s) of service delivery, average number of direct service hours per week, and years of professional practice. For each EBP, therapists reported whether they had received training in the EBP, had ever delivered the EBP to a client, and whether they had delivered the EBP to a client within the last two months.
Therapist emotional exhaustion
The Organizational Social Context Questionnaire (OSC; Glisson et al., 2008) was used to measure therapist emotional exhaustion, or the extent to which therapists felt that their emotional resources had been depleted (Maslach & Jackson, 1981). Therapists completed five items on a seven-point Likert scale (1 = strongly disagree, 7 = strongly agree), with higher ratings indicating more emotional exhaustion (α = .81).
Therapist general attitudes toward EBPs
Therapist attitudes toward adoption of EBPs in general were measured using the Evidence-Based Practice Attitude Scale (EBPAS; Aarons, 2004). Therapists completed the four-item Openness subscale, which assessed a therapist’s openness to new practices (α = .79), and the four-item Divergence subscale, which assessed the degree to which a therapist’s usual practice diverges from research-based/academically developed interventions (α = .71). Items were rated on a five-point Likert scale (0 = not at all, 4 = very great extent).
Predictor Variables: Agency-Level
Program leader EBP perceptions
Program leaders completed the same adapted version of the PCIS as therapists for each of the EBPs whose implementation they had supervised in their agency (see therapist measure for additional details; α = .91 to .97 across EBPs). Program leaders’ PCIS scores were aggregated at the agency level for the EBP corresponding to each therapist’s claims.
Organizational climate
Therapist and program leader perceptions of their agency’s climate were measured with the Organizational Climate Measure (OCM; Forgatch et al., 2005). Therapists and program leaders completed the Autonomy subscale, which measured the extent to which they perceived employees were given latitude to enact their work. Items were answered using a four-point scale (1 = definitely false, 4 = definitely true). Internal consistencies were acceptable (α = .72 for therapists; α = .80 for program leaders). Therapist and program leader reports of autonomy were not significantly correlated (r = .18, p = .18).
Organizational readiness for change
Program leaders completed two subscales of the Organizational Readiness for Change questionnaire (ORC; Lehman et al., 2002), which measured leader perspectives on organizational climate and resources supporting EBP implementation. Program leader (vs. therapist) report was selected for these constructs, as it was believed that they would have the most knowledge of overall organizational context. The Cohesion subscale assessed perceptions of group cohesion at the agency, and the Stress subscale measured perceptions of stress, workplace burden, and staff strain. Program leaders responded using a five-point scale (1 = disagree strongly, 5 = agree strongly). Internal consistencies were: Cohesion (α = .76) and Stress (α = .84).
Data Analytic Plan
This study sought to: (1) describe EBP specific volume and volume ratio using administrative claims data, and (2) examine prospective predictors of EBP volume and volume ratio at EBP-, therapist-, and agency-levels. To examine Aim 1, descriptive statistics were run on: (a) the volume of claims per active month for the six EBPs per therapist, and (b) the volume ratio of claims for each EBP per therapist. The number of therapists claiming for each EBP was also explored. To examine Aim 2, two multilevel negative binomial regression models were run in Stata/SE (version 15.1) to predict: (a) the volume of claims per active month for the six EBPs per therapist, and (b) the volume ratio of claims for each EBP per therapist. Multilevel negative binomial regression models were used due to the skewed distribution of both outcome variables (volume: skewness = 2.54, kurtosis = 8.88; volume ratio: skewness = 1.79, kurtosis = 3.44). Null models indicated a significant amount of variance at the therapist and agency level for both outcomes (therapist level ICC = .06 to .07; agency level ICC = .05 to .07). Therefore, all multilevel models were run with three levels, with EBP at level 1 (n = 2,298), therapists at level 2 (n = 777), and agencies at level 3 (n = 64). The number of months a therapist was actively claiming was controlled for in the second model predicting volume ratio, and the proportion of PEI claims per therapist was controlled for in both models.
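The construction of the two outcome variables from claims records can be illustrated with a small sketch. This is a hypothetical example only: the toy claims table and its column names (`therapist_id`, `ebp`, `claim_month`) are assumptions for illustration, not the study’s actual data schema.

```python
import pandas as pd

# Toy claims table: one row per psychotherapy claim. A missing "ebp"
# value represents a non-EBP (or non-PEI) claim.
claims = pd.DataFrame({
    "therapist_id": [1, 1, 1, 1, 2, 2],
    "ebp": ["MAP", "MAP", "TF-CBT", None, "MAP", None],
    "claim_month": ["2015-04", "2015-05", "2015-04",
                    "2015-06", "2015-04", "2015-04"],
})

def ebp_outcomes(df: pd.DataFrame, ebp: str) -> pd.DataFrame:
    """Per-therapist volume per active month and volume ratio for one EBP."""
    # Active months = months in which the therapist made ANY claim.
    active_months = df.groupby("therapist_id")["claim_month"].nunique()
    # Denominator for the ratio = all psychotherapy claims.
    total_claims = df.groupby("therapist_id").size()
    # Numerator = claims tagged with the specific EBP.
    ebp_claims = (
        df[df["ebp"] == ebp]
        .groupby("therapist_id")
        .size()
        .reindex(total_claims.index, fill_value=0)
    )
    return pd.DataFrame({
        "volume_per_active_month": ebp_claims / active_months,
        "volume_ratio_pct": 100 * ebp_claims / total_claims,
    })

map_outcomes = ebp_outcomes(claims, "MAP")
print(map_outcomes)
```

Here therapist 1 has 2 MAP claims across 3 active months (volume ≈ 0.67 per active month) and 4 total claims (volume ratio 50%), matching the definitions in the Measures section. The multilevel negative binomial models themselves were fit in Stata/SE 15.1 and are not reproduced here.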
Results
Aim 1: Describe Therapist-Level PEI EBP Volume and Volume Ratio
Descriptive statistics for the six EBPs are presented in Table 2. Data consisted of 606,374 claims from April 2015 to December 2017, with 219,827 claims for the six EBPs. A total of 33,366 children were served across 64 agencies. PEI EBP claims accounted for 41.2% of all psychotherapy claims made by therapists during the study period. The mean number of EBPs claimed per therapist was 2.3 (SD = 1.0). Therapists on average served 56.2 children (SD = 47.8). Of the 777 therapists, 404 (52.0%) were still submitting claims (PEI or non-PEI) within the system 33 months later. On average, therapists made claims for 20.7 months (SD = 11.1). Table 2 shows that 140 therapists claimed for CPP through PEI at some point during the study period, 1 for CBITS, 504 for MAP, 417 for TF-CBT, 153 for Triple P, and 283 for SS. The average number of claims per therapist across the study period was 1,012.3 (SD = 744.2).
Table 2.
Claiming patterns for participating therapists from April 2015 to December 2017.
| PEI Claims | Non-PEI Claims | Total Claims (All PEI and non-PEI) | |||||||
|---|---|---|---|---|---|---|---|---|---|
| CPP | CBITS | MAP | TF-CBT | Triple P | SS | Other PEI | |||
| Claims | |||||||||
| Volume: Number of claims for each PEI EBP per therapist, M (SD) | 76.8 (123.5) | 0.3 (2.9) | 197.4 (275.6) | 86.7 (156.4) | 53.1 (128.8) | 29.3 (83.1) | 52.6 (105.7) | 431.9 (487.1) | 752.3 (741.0) |
| Volume ratio: Percentage of claims for each PEI EBP per therapist, M (SD) (Number of claims for each PEI EBP per therapist/Total claims per therapist) | 11.9% (17.1) | 0.03% (0.2) | 23.6% (21.7) | 12.2% (16.9) | 4.8% (8.1) | 6.2% (15.2) | 7.1% (11.8) | 56.4% (26.1) | — |
| Therapists | |||||||||
| Number of therapists trained in each PEI EBP, No. (% of therapists) | 119 (15.3%) | 48 (6.2%) | 507 (65.3%) | 548 (70.5%) | 168 (21.6%) | 483 (62.2%) | — | — | 777 |
| Number of therapists claimed for each PEI EBP, No. (% of therapists) | 140 (18.8%) | 1 (0.1%) | 504 (67.8%) | 417 (56.1%) | 153 (20.6%) | 283 (38.1%) | 457 (61.5%) | 722 (97.2%) | 743 |
Note. PEI = Prevention and Early Intervention; EBP = evidence-based practice; CPP = Child-Parent Psychotherapy; CBITS = Cognitive Behavioral Intervention for Trauma in Schools; MAP = Managing and Adapting Practice; TF-CBT = Trauma Focused Cognitive Behavior Therapy; Triple P = Positive Parenting Program; SS = Seeking Safety; No. = number; M = mean; SD = standard deviation.
Volume of Claims Per Active Month
Descriptive information regarding the volume of claims for each PEI EBP is provided in Table 2. As shown in Figure 1, MAP had the highest volume of claims, with an average of 197.4 claims per therapist (SD = 275.6), followed by TF-CBT (M = 86.7, SD = 156.4), CPP (M = 76.8, SD = 123.5), Triple P (M = 53.1, SD = 128.8), SS (M = 29.3, SD = 83.1), and CBITS (M = 0.3, SD = 2.9). The total number of claims for the six PEI EBPs declined over time. Seeking Safety showed the greatest percent decline (85.7%), followed by CPP (73.4%), Triple P (70.9%), MAP (70.4%), and TF-CBT (66.5%).
Figure 1.
Volume of claims per active month for each EBP. EBP = evidence-based practice; FY = fiscal year; Q = quarter; CPP = Child-Parent Psychotherapy; CBITS = Cognitive Behavioral Intervention for Trauma in Schools; MAP = Managing and Adapting Practice; TF-CBT = Trauma Focused Cognitive Behavior Therapy; Triple P = Positive Parenting Program; SS = Seeking Safety.
Volume Ratio
Table 2 and Figure 2 show the average volume ratio of each EBP per therapist. MAP had the highest volume ratio (M = 23.6%, SD = 21.7), followed by TF-CBT (M = 12.2%, SD = 16.9), CPP (M = 11.9%, SD = 17.1), SS (M = 6.2%, SD = 15.2), and Triple P (M = 4.8%, SD = 8.1). CBITS showed negligible penetration (M = 0.03%, SD = 0.2).
Figure 2.
Volume ratio of claims for each EBP per therapist. EBP = evidence-based practice; FY = fiscal year; Q = quarter; CPP = Child-Parent Psychotherapy; CBITS = Cognitive Behavioral Intervention for Trauma in Schools; MAP = Managing and Adapting Practice; TF-CBT = Trauma Focused Cognitive Behavior Therapy; Triple P = Positive Parenting Program; SS = Seeking Safety.
Aim 2: Explore Prospective Predictors of PEI EBP Volume and Volume Ratio
Given the low initial adoption and delivery of CBITS, this practice was excluded from Aim 2 analyses. Results are shown in Table 3.
Table 3.
EBP, therapist, and agency factors prospectively predicting PEI EBP volume and PEI volume ratio per therapist.
| Predictor Variables | Volume of Claims for Each PEI EBPa | Volume Ratio of Claims for Each PEI EBPb | ||
|---|---|---|---|---|
| B (SE) | 95% CI | B (SE) | 95% CI | |
| EBP Factors: | ||||
| EBP requires consultationc | .85 (.08)** | .71, 1.00 | .79 (.06)** | .76, .92 |
| EBP has structured content/orderd | −.40 (.06)** | −.53, −.28 | −.46 (.06)** | −.57, −.34 |
| Therapist self-efficacy for specific EBP | .14 (.04)** | .05, .23 | .15 (.04)** | .07, .22 |
| Therapist perceptions of specific EBP (PCIS) | −.02 (.05) | −.10, .07 | .03 (.04) | −.05, .11 |
| Program leader perceptions of specific EBP (PCIS) | .31 (.08)** | .16, .46 | .22 (.07)** | .08, .36 |
| Therapist Factors: | ||||
| Therapist licensede | −.36 (.07)** | −.50, −.21 | −.12 (.06) | −.24, .01 |
| Therapist cognitive behavioral/behavioral orientationf | .01 (.06) | −.25, .08 | .003 (.05) | −.10, .10 |
| Therapist years practiced | −.01 (.01) | −.10, .13 | .002 (.01) | −.01, .01 |
| Therapist direct service hours per week | .03 (.01)** | .02, .04 | −.01 (.004)* | −.02, −.002 |
| Therapist number of EBPs trained in | −.20 (.03)** | −.27, −.14 | −.20 (.03)** | −.25, −.14 |
| Therapist general attitudes towards EBPs (EBPAS) | ||||
| Openness | −.05 (.06) | −.16, .07 | −.06 (.05) | −.16, .04 |
| Divergence | −.05 (.05) | −.15, .05 | −.05 (.05) | −.14, .04 |
| Therapist emotional exhaustion (OSC) | −.04 (.03) | −.10, .02 | −.03 (.03) | −.08, .02 |
| Agency Factors: | ||||
| Therapist perceptions of organizational climate (OCM) | ||||
| Autonomy | .01 (.20) | −.38, .39 | −.17 (.17) | −.51, .18 |
| Program leader perceptions of organizational climate (OCM) | ||||
| Autonomy | −.31 (.15)* | −.60, −.03 | −.14 (.13) | −.39, .11 |
| Program leader perceptions of organizational context (ORC) | ||||
| Cohesion | −.12 (.14) | −.40, .16 | −.05 (.12) | −.29, .19 |
| Stress | −.16 (.06)* | −.28, −.03 | −.07 (.05) | −.18, .04 |
Note. PEI = Prevention and Early Intervention; EBP = evidence-based practice; No. = number; M = mean; SD = standard deviation; PCIS = Perceived Characteristics of Intervention Scale; EBPAS = Evidence-Based Practice Attitudes Scale; OSC = Organizational Social Context questionnaire; OCM = Organizational Climate Measure; ORC = Organizational Readiness for Change questionnaire. The number of months a therapist was actively claiming was controlled for in the model predicting volume ratio, and the proportion of PEI claims per therapist was controlled for in both models.
aVolume of claims for each PEI EBP per therapist was calculated by summing the total number of claims for a specific EBP and dividing by the number of months a therapist was active (i.e., made any claims); bVolume ratio of claims for each PEI EBP per therapist was calculated by dividing the total number of claims for a specific EBP per therapist by the total number of claims per therapist (including other PEI EBPs and Non-PEI claims); cEBP required consultation was coded as 0 = EBP did not have required consultation, 1 = EBP had required consultation; dEBP prescribed session content/order was coded as 0 = EBP does not have prescribed session content/order, 1 = EBP has prescribed session content/order; eTherapist licensed was coded as 0 = not licensed, 1 = licensed; fTherapist cognitive behavioral/behavioral orientation was coded as 0 = orientation other than cognitive behavioral or behavioral, 1 = cognitive behavioral or behavioral orientation.
* p < .05, ** p < .01.
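The two outcome metrics defined in the table note can be sketched in a few lines of code. This is a minimal illustration of the arithmetic only; the claim records, field layout, and therapist/EBP labels below are hypothetical, not drawn from the study data.

```python
# Illustrative sketch (hypothetical data): per-therapist volume and volume
# ratio for one EBP, following the definitions in the table note above.

# Each hypothetical record: (therapist_id, month, practice), where practice
# is an EBP name or "Non-PEI" for any other psychotherapy claim.
claims = [
    ("t1", "2017-01", "MAP"), ("t1", "2017-01", "Non-PEI"),
    ("t1", "2017-02", "MAP"), ("t1", "2017-03", "Non-PEI"),
]

def volume_and_ratio(claims, therapist, ebp):
    mine = [c for c in claims if c[0] == therapist]
    # Months active = months with any claim by this therapist.
    months_active = len({month for _, month, _ in mine})
    ebp_claims = sum(1 for _, _, p in mine if p == ebp)
    volume = ebp_claims / months_active  # EBP claims per active month
    ratio = ebp_claims / len(mine)       # EBP share of all claims
    return volume, ratio

vol, ratio = volume_and_ratio(claims, "t1", "MAP")
# vol = 2/3 (2 MAP claims over 3 active months); ratio = 2/4 = 0.5
```

Note that the denominator of the ratio includes both PEI and Non-PEI claims, which is why a therapist with high productivity can show high volume but a low volume ratio.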
EBP-Level Factors
EBPs requiring ongoing consultation were associated with higher volume (B = .85, SE = .08, p < .01) and volume ratio (B = .79, SE = .06, p < .01). EBPs with structured content/order demonstrated lower volume (B = −.40, SE = .06, p < .01) and volume ratio (B = −.46, SE = .06, p < .01). In addition, higher therapist ratings of self-efficacy in delivering a specific EBP were linked to higher subsequent EBP-specific volume (B = .14, SE = .04, p < .01) and volume ratio (B = .15, SE = .04, p < .01). In contrast, positive therapist perceptions of the specific EBP were not significantly associated with subsequent volume (B = −.02, SE = .05, p = .73) or volume ratio (B = .03, SE = .04, p = .47) for that EBP. However, more positive program leader perceptions of a specific EBP were associated with higher volume and volume ratio for that EBP (volume: B = .31, SE = .08, p < .01; volume ratio: B = .22, SE = .07, p < .01).
Therapist-Level Factors
Therapists with a license had significantly lower subsequent claim volume (B = −.36, SE = .07, p < .01) than non-licensed therapists. However, there was no significant difference in EBP claim volume ratio between licensed and non-licensed therapists (B = −.12, SE = .06, p = .07). Therapists who had been trained in more EBPs had lower subsequent claim volume and volume ratio for each given EBP (volume: B = −.20, SE = .03, p < .01; volume ratio: B = −.15, SE = .03, p < .01). Findings for direct service hours per week varied by outcome: more direct service hours per week were associated with higher claiming volume (B = .03, SE = .01, p < .01), but lower volume ratio (B = −.01, SE = .004, p < .01). Therapist years of professional experience, attitudes toward EBPs in general, theoretical orientation, and emotional exhaustion were not associated with EBP claim volume or volume ratio.
Agency-Level Factors
Program leader perceptions of organizational staff autonomy predicted lower therapist EBP claim volume (B = −.31, SE = .15, p < .05), but were not significantly associated with volume ratio (B = −.14, SE = .12, p = .27). In contrast, therapist perceptions of staff autonomy were not significantly associated with therapist EBP claim volume or volume ratio. Program leader perceptions of higher staff stress significantly predicted lower EBP claim volume (B = −.16, SE = .06, p < .05) but not volume ratio (B = −.07, SE = .05, p = .21). Finally, program leader perceptions of higher staff cohesion were not associated with EBP claim volume or volume ratio.
Discussion
The first aim of this study was to measure the volume and volume ratio of therapists’ claims for six EBPs six to eight years after an initial system-driven roll-out of multiple EBPs in Los Angeles County. Over the 33-month study period, the total volume of therapists’ claims for the six EBPs declined by nearly 75%. Such a sample-based decline is to be expected given therapist turnover and attrition; therapist survival time in this public sector system is typically under three years (Brookman-Frazee et al., 2018). In contrast, the ratio of EBP-specific volume relative to all psychotherapy claims remained relatively consistent across the time period and ranged from .03% (CBITS) to 23.6% (MAP). This suggests that when therapists continued to claim within the system, their use of specific EBPs was relatively stable.
Examination of the specific EBPs initially supported by LACDMH revealed that five of the EBPs were sustained in the system, but that the volume and volume ratio of claims varied by EBP. The EBP with the highest volume and volume ratio, MAP, addressed the broadest range of target problems (trauma, conduct, anxiety, and depression) and ages (0–21 years old). This suggests that EBPs with wide applications may be more easily sustained on a larger scale. The EBP with the lowest volume, CBITS, was never meaningfully implemented in the system, and thus there was no opportunity for sustainment. This may be explained by CBITS’ unique modality and service setting characteristics. CBITS was the only EBP of the six initially supported by LACDMH intended to be delivered in groups at schools. Only 10% of the current therapist sample reported that they primarily worked in schools. Therefore, their opportunity to deliver CBITS was limited (Regan et al., 2017; Rodriguez et al., 2018). This limited adoption is consistent with previous findings emphasizing the importance of fit between an EBP and the organizational setting and therapist workflow for long-term sustainment (Aarons & Palinkas, 2007; Green & Aarons, 2011).
The second aim of this study was to explore how inner context factors across EBP-, therapist-, and organizational/agency-levels prospectively predicted EBP volume and volume ratio. Findings highlight factors that may be critical to target in efforts to improve EBP sustainment. Specifically, EBPs with ongoing consultation had higher sustained volume, consistent with other literature highlighting the importance of ongoing support of EBP delivery following initial training (Cooper et al., 2015; Hodge et al., 2017). In the current study we were only able to assess the role of expert-led consultation. However, there is evidence that other forms of ongoing consultation, such as peer-led consultation, may also be effective (Stirman et al., 2017; Weisz et al., 2020).
Therapist self-efficacy delivering a particular EBP was also linked to higher volume. Other research has linked EBP knowledge to greater EBP implementation (Beidas et al., 2015, 2017), and has shown that therapists tend to sustain EBPs they perceive as easier to deliver (Hunter et al., 2015; Massatti et al., 2008). These findings point to the importance of educational and training implementation strategies geared toward increasing therapist self-confidence and mastery. In particular, supervisor support that emphasizes the use of active learning techniques (e.g., role play) may bolster therapist self-efficacy (Gibson et al., 2009); however, these supervision strategies have been found to be rare in community implementation (Lucid et al., 2018).
Therapist workload and perceptions of agency organizational stress also emerged as significant predictors of EBP volume during this sustainment phase. Therapists with a higher number of direct service hours per week had a higher volume of PEI EBP-specific claims, but a lower volume ratio of the same EBPs relative to all of their psychotherapy claims. This suggests that therapists with greater productivity and higher caseloads may have more opportunity to implement the EBPs in which they are trained, but there is not a parallel effect on increasing the proportion of care delivered that is EBP. Relatedly, program leader perceptions of agency organizational stress were also associated with lower therapist EBP volume. Poorer therapist working conditions and agency organizational stress are linked to therapist turnover, a key threat to EBP sustainment (Glisson et al., 2008). Other studies have found that agencies that are better prepared for change and EBP implementation, and that have lower levels of organizational stress, are more likely to sustain EBP delivery (Glisson & Schoenwald, 2005; Lang et al., 2017). These results suggest the need for leaders to consider therapists’ overall training history and workload in EBP implementation planning. For example, limiting the number of EBPs a therapist is trained in and selecting an EBP that best fits with their caseload and professional goals may benefit sustainment. In addition, within a fee-for-service implementation strategy such as PEI, permitting reimbursement for EBP-required activities that can add to workload outside of direct service (e.g., EBP-related documentation, outcome monitoring) may promote sustainment. Future studies would benefit from including additional measures related to these therapist stress and workforce issues, such as psychological safety, work engagement, and turnover intention.
Finally, results indicated that program leader perceptions of an EBP may play a particularly important role in therapists’ sustained delivery of these EBPs, whereas therapist perceptions of an EBP were not associated with their future claiming. Several other studies have found that program leader support of an EBP contributes to its long-term sustainment (Rodriguez et al., 2018; Willging et al., 2015). Program leader EBP perceptions may set the tone, expectations, and organizational support for therapist EBP use, all contributing to the organizational climate for sustainment. This highlights the importance of program leaders as influential champions and critical stakeholders in implementation planning.
At the agency level, program leader perceptions of more therapist autonomy in the workplace predicted lower subsequent therapist PEI EBP claim volume and volume ratio. Interestingly, this held true even when therapist perceptions of autonomy were also entered into the model, with therapist perceptions of autonomy not significantly linked to subsequent claiming. At the agency level, program leader perceptions of autonomy were not significantly correlated with therapist perceptions of autonomy (r = 0.18, p = 0.18), and program leaders reported higher levels of perceived autonomy than therapists (program leaders: M = 2.7, SD = 0.5; therapists: M = 2.4, SD = 0.4). The level of within-agency variation of perceptions was similar for program leaders (ICC = .13) and therapists (ICC = .14). Within a system-driven implementation, organizations in which leaders believe that therapists can exercise more autonomy may allow therapists more latitude to desist from implementing adopted EBPs, even when implementation strategies such as fiscal reimbursement mechanisms remain in place.
Limitations and Future Directions
In interpreting these results, it is important to consider both the strengths and limitations of administrative claims data, which were used to index EBP sustainment. Use of administrative claims has become increasingly common due to several advantages of this data format (Hoagwood et al., 2015), including the fact that claims are typically made at the time of service, which eliminates the need for retrospective reporting. In addition, administrative claims data are a byproduct of everyday service provision and require no extra action by therapists or administrators for data capture. However, limitations of this type of data must also be considered. Administrative claims only document which EBP the therapist reported using, not levels of fidelity, adherence, or quality indicators. In addition, multiple factors may influence whether therapists report delivering an EBP in claims for a given client, including requirements for reimbursement.
An additional limitation was the focus on the six PEI EBPs that were initially supported with training and consultation furnished by LACDMH, though PEI offered reimbursement for many more approved EBPs. Five of these initially supported EBPs were sustained at a level that could be studied in this prospective analysis and accounted for 83.1% of therapist PEI claims in this sample. Future studies might examine how the EBPs that were adopted by individual agencies outside the county-wide roll-out might show different sustainment trajectories. Inclusion of more EBPs would also increase variance in the identified EBP characteristics examined in this study.
Relatedly, the unique context of the PEI initiative in Los Angeles County must be considered when drawing inferences. The changing fiscal landscape likely explains some decrease in overall PEI claims since the peak in 2010 (Brookman-Frazee et al., 2016, 2018). Yet, the current findings show that therapists were still delivering these EBPs 6–8 years following scale-up. Results suggest that unlicensed therapists had higher subsequent EBP claim volume, and that therapists sustained a particular EBP for 22 months on average (Brookman-Frazee et al., 2018). This suggests that early career therapists continue to enter the LACDMH system, receive training in EBPs, and deliver those services. A critical next step in sustainment would be to consider strategies to retain these therapists within the LACDMH workforce to retain EBP capacity.
This study did not include information about client-level outcomes, though data from evaluations of PEI suggest that children who received PEI EBPs showed clinically significant improvement (Ashwood et al., 2018). In addition, therapists who self-selected to complete the survey may have tended to have more positive perceptions of EBPs. Such a bias would have restricted the range on some of the predictor variables (e.g., perceptions of EBPs), yet associations with the sustainment outcomes may still warrant attention in informing implementation strategies.
Conclusions
This study offers a unique prospective examination of how inner context factors influence sustained claiming for PEI EBPs. Although overall volume and volume ratio rates decreased over the study period, PEI EBP claiming still comprised a significant portion of therapists’ caseloads 6–8 years following initial implementation. EBP characteristics, therapist self-efficacy, therapist workload and organizational stress, and program leader perceptions of an EBP all predicted sustainment. Ongoing consultation that bolsters self-efficacy and program leader investment seemed particularly central in sustaining EBPs in this context where reimbursement remained available. Behind the scenes at the system level, it was also essential that EBP training remained accessible, EBP champions were available to address implementation issues, and regular meetings with system leaders provided updates, outcomes, and technical support. The observed sustainment of EBPs may reflect a larger cultural shift marked by commitment to a children’s mental health service system guided by evidence.
Appendix A: MAP
We use the term “EBPs” throughout, inclusive of MAP, for ease of communication. We acknowledge that MAP differs from the other EBPs as it is an evidence management system, with a direct service component that coordinates multiple evidence sources, employs practice components drawn from over 700 evidence-based treatments, and offers structured process management supports to guide clinical care for anxiety, trauma, depression, and conduct problems in children 0–21 years old. In the LACDMH system, the MAP direct service model complements a diverse array of EBPs by allowing therapists to select, review, adapt, or construct promising treatments as needed to match particular child characteristics based on the latest scientific findings (see Chorpita & Daleiden, 2018).
Appendix B: EBP Structure
EBPs were characterized based on whether or not they had prescribed session content and an intended order of sessions. EBPs with prescribed session content and order were defined as those with explicit guidance in their manuals as to what content should be covered in which order. Based on guidance in their manuals, three of the EBPs, CBITS (Jaycox, 2003), TF-CBT (Cohen et al., 2017), and Triple P (Turner et al., 2002), were classified as having this structure. In contrast, Seeking Safety (Najavits, 2001), CPP (Lieberman et al., 2015), and MAP (Chorpita et al., 2014) were classified as having more flexibility in the selection and ordering of therapy content and activities across an episode of treatment. Following Barnett et al. (2017), the EBPs with prescribed session content and order (CBITS, Triple P, and TF-CBT) were coded as 1, and the practices without prescribed session content/order (Seeking Safety, CPP, and MAP) were coded as 0.
Appendix C: EBP Training Requirements
EBPs were also characterized on whether or not there was mandatory follow-up consultation with an EBP expert trainer after the initial training. These implementation requirements were established by LACDMH in consultation with the EBP developers (Los Angeles County Department of Mental Health, 2016b). Seeking Safety and Triple P did not require ongoing consultation, whereas CBITS, CPP, MAP, and TF-CBT did, though the length and format of the required consultation varied. CBITS and TF-CBT required 13 and 16 consultation calls with external trainers, respectively. TF-CBT also required review of two session recordings. CPP required 18 months of consultation calls with external trainers for both therapists and supervisors in the agency. Similarly, MAP required six months of ongoing consultation and submission of case-based portfolios. Following Barnett et al. (2017), the EBPs classified as having required consultation (CBITS, CPP, MAP, and TF-CBT) were coded as 1, and those without required consultation (Seeking Safety and Triple P) were coded as 0.
Footnotes
Author note: Anna S. Lau ORCID ID 0000-0002-1142-2175, Teresa Lind ORCID ID 0000-0003-4524-9638, Mojdeh Motamedi ORCID ID 0000-0003-2239-3610, and Lauren Brookman-Frazee ORCID ID 0000-0001-7582-7529. Teresa Lind changed affiliation to San Diego State University and Joyce H. L. Lui changed affiliation to University of Maryland.
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the National Institute of Health (grant number R01 MH100134).
ORCID iDs: Teresa Lind https://orcid.org/0000-0003-4524-9638
Mojdeh Motamedi https://orcid.org/0000-0003-2239-3610
References
- Aarons G. A. (2004). Mental health provider attitudes toward adoption of evidence-based practice: The evidence-based practice attitude scale (EBPAS). Mental Health Services Research, 6, 61–74. 10.1023/B:MHSR.0000024351.12294.65
- Aarons G. A., Palinkas L. A. (2007). Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34, 411–419. 10.1007/s10488-007-0121-3
- Aarons G. A., Sommerfeld D. H. (2012). Leadership, innovation climate, and attitudes toward evidence-based practice during a statewide implementation. Journal of the American Academy of Child & Adolescent Psychiatry, 51, 423–431. 10.1016/j.jaac.2012.01.018
- Aarons G. A., Ehrhart M. G., Farahnak L. R. (2014). The implementation leadership scale (ILS): Development of a brief measure of unit level implementation leadership. Implementation Science, 9, 45–54. 10.1186/1748-5908-9-45
- Aarons G. A., Green A. E., Trott E., Willging C. E., Torres E. M., Ehrhart M. G., Roesch S. C. (2016). The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: A mixed-method study. Administration and Policy in Mental Health and Mental Health Services Research, 43, 991–1008. 10.1007/s10488-016-0751-4
- Aarons G. A., Hurlburt M., Horwitz S. M. (2011a). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38, 4–23. 10.1007/s10488-010-0327-7
- Aarons G. A., Sommerfeld D. H., Willging C. E. (2011b). The soft underbelly of system change: The role of leadership and organizational climate in turnover during statewide behavioral health reform. Psychological Services, 8, 269–281. 10.1037/a0026196
- Aarons G. A., Sommerfeld D. H., Hecht D. B., Silovsky J. F., Chaffin M. J. (2009). The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: Evidence for a protective effect. Journal of Consulting and Clinical Psychology, 77, 270–280. 10.1037/a0013223
- Ashwood J., Kataoka S., Eberhart N., Bromley E., Zima B., Baseman L., Marti F., Kofner A., Tang L., Azhar G., Chamberlin M., Erickson B., Choi K., Zhang L., Miranda J., Burnam M. (2018). Evaluation of the mental health services Act in Los Angeles county: Implementation and outcomes for key programs. RAND Corporation. 10.7249/RR2327
- Bandura A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84, 191–215. 10.1037/0033-295X.84.2.191
- Barnett M., Brookman-Frazee L., Regan J., Saifan D., Stadnick N., Lau A. S. (2017). How intervention and implementation characteristics relate to community therapists’ attitudes toward evidence-based practices: A mixed methods study. Administration and Policy in Mental Health and Mental Health Services Research, 44, 824–837. 10.1007/s10488-017-0795-0
- Beidas R. S., Marcus S., Aarons G. A., Hoagwood K. E., Schoenwald S., Evans A. C., Hurford M. O., Hadley T., Barg F. K., Walsh L. M., Adams D. R., Mandell D. S. (2015). Predictors of community therapists’ use of therapy techniques in a large public mental health system. JAMA Pediatrics, 169, 374–382. 10.1001/jamapediatrics.2014.3736
- Beidas R. S., Marcus S., Wolk C. B., Powell B., Aarons G. A., Evans A. C., Hurford M. O., Hadley T., Adams D. R., Walsh L. M., Babbar S., Barg F., Mandell D. S. (2016). A prospective examination of clinician and supervisor turnover within the context of implementation of evidence-based practices in a publicly-funded mental health system. Administration and Policy in Mental Health and Mental Health Services Research, 43, 640–649. 10.1007/s10488-015-0673-6
- Beidas R. S., Skriner L., Adams D., Wolk C. B., Stewart R. E., Becker-Haimes E., Williams N., Maddox B., Rubin R., Weaver S., Evans A., Mandell D., Marcus S. C. (2017). The relationship between consumer, clinician, and organizational characteristics and use of evidence-based and non-evidence-based therapy strategies in a public mental health system. Behaviour Research and Therapy, 99, 1–10. 10.1016/j.brat.2017.08.011
- Blasinsky M., Goldman H. H., Unützer J. (2006). Project IMPACT: A report on barriers and facilitators to sustainability. Administration and Policy in Mental Health and Mental Health Services Research, 33, 718–729. 10.1007/s10488-006-0086-7
- Brookman-Frazee L., Stadnick N., Roesch S., Regan J., Barnett M., Bando L., Innes-Gomberg D., Lau A. S. (2016). Measuring sustainment of multiple practices fiscally mandated in children’s mental health services. Administration and Policy in Mental Health and Mental Health Services Research, 43(6), 1009–1022. 10.1007/s10488-016-0731-8
- Brookman-Frazee L., Zhan C., Stadnick N., Sommerfeld D., Roesch S., Aarons G. A., Innes-Gomberg D., Bando L., Lau A. S. (2018). Using survival analysis to understand patterns of sustainment within a system-driven implementation of multiple evidence-based practices for children’s mental health services. Frontiers in Public Health, 6, 15–12. 10.3389/fpubh.2018.00054
- Cashel M. L. (2002). Child and adolescent psychological assessment: Current clinical practices and the impact of managed care. Professional Psychology: Research and Practice, 33, 446–453. 10.1037/0735-7028.33.5.446
- Chorpita B. F., Daleiden E. L. (2018). Coordinated strategic action: Aspiring to wisdom in mental health service systems. Clinical Psychology: Science and Practice, 25, e12264. 10.1111/cpsp.12264
- Chorpita B. F., Daleiden E. L., Collins K. S. (2014). Managing and adapting practice: A system for applying evidence in clinical care with youth and families. Clinical Social Work Journal, 42, 134–142. 10.1007/s10615-013-0460-3
- Cohen J. A., Mannarino A. P., Deblinger E. (2017). Treating trauma and traumatic grief in children and adolescents (2nd ed.). The Guilford Press.
- Cook J. M., Thompson R., Schnurr P. P. (2015). Perceived characteristics of intervention scale: Development and psychometric properties. Assessment, 22, 704–714. 10.1177/1073191114561254
- Cooper B. R., Bumbarger B. K., Moore J. E. (2015). Sustaining evidence-based prevention programs: Correlates in a large-scale dissemination initiative. Prevention Science, 16, 145–157. 10.1007/s11121-013-0427-1
- Forgatch M. S., Patterson G. R., DeGarmo D. S. (2005). Evaluating fidelity: Predictive validity for a measure of competent adherence to the Oregon model of parent management training. Behavior Therapy, 36, 3–13. 10.1016/S0005-7894(05)80049-8
- Gibson J. A., Grey I. M., Hastings R. P. (2009). Supervisor support as a predictor of burnout and therapeutic self-efficacy in therapists working in ABA schools. Journal of Autism and Developmental Disorders, 39, 1024–1030. 10.1007/s10803-009-0709-4
- Glisson C., Schoenwald S. K. (2005). The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Mental Health Services Research, 7, 243–259. 10.1007/s11020-005-7456-1
- Glisson C., Schoenwald S. K., Kelleher K., Landsverk J., Hoagwood K. E., Mayberg S., Green P., & the Research Network on Youth Mental Health (2008). Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Administration and Policy in Mental Health and Mental Health Services Research, 35, 124–133. 10.1007/s10488-007-0152-9
- Green A. E., Aarons G. A. (2011). A comparison of policy and direct practice stakeholder perceptions of factors affecting evidence-based practice implementation using concept mapping. Implementation Science, 6, 104. 10.1186/1748-5908-6-104
- Gregory A., Henry D. B., Schoeny M. E., & the Metropolitan Area Child Study Research Group (2007). School climate and implementation of a preventive intervention. American Journal of Community Psychology, 40, 250–260. 10.1007/s10464-007-9142-z
- Han S. S., Weiss B. (2005). Sustainability of teacher implementation of school-based mental health programs. Journal of Abnormal Child Psychology, 33, 665–679. 10.1007/s10802-005-7646-2
- Hawley K. M., Cook J. R., Jensen-Doss A. (2009). Do noncontingent incentives increase survey response rates among mental health providers? A randomized trial comparison. Administration and Policy in Mental Health and Mental Health Services Research, 36, 343–348. 10.1007/s10488-009-0225-z
- Herschell A. D., Kolko D. J., Baumann B. L., Davis A. C. (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30, 448–466. 10.1016/j.cpr.2010.02.005
- Hoagwood K. E., Essock S., Morrissey J., Libby A., Donahue S., Druss B., Finnerty M., Frisman L., Narasimhan M., Stein B. D., Wisdom J., Zerzan J. (2015). Use of pooled state administrative data for mental health services research. Administration and Policy in Mental Health and Mental Health Services Research, 43, 67–78. 10.1007/s10488-014-0620-y
- Hoagwood K. E., Olin S. S., Horwitz S., McKay M., Cleek A., Gleacher A., Lewandowski E., Nadeem E., Acri M., Chor K. H. B., Kuppinger A., Burton G., Weiss D., Frank S., Finnerty M., Bradbury D. M., Woodlock K. M., Hogan M. (2014). Scaling up evidence-based practices for children and families in New York state: Toward evidence-based policies on implementation for state mental health systems. Journal of Clinical Child & Adolescent Psychology, 43, 145–157. 10.1080/15374416.2013.869749
- Hodge L. M., Turner K. M. T., Sanders M. R., Forster M. (2017). Factors that influence evidence-based program sustainment for family support providers in child protection services in disadvantaged communities. Child Abuse & Neglect, 70, 134–145. 10.1016/j.chiabu.2017.05.017
- Hunter S. B., Han B., Slaughter M. E., Godley S. H., Garner B. R. (2015). Associations between implementation characteristics and evidence-based practice sustainment: A study of the adolescent community reinforcement approach. Implementation Science, 10, 173. 10.1186/s13012-015-0364-4
- Jaycox L. H. (2003). CBITS: Cognitive behavioral intervention for trauma in schools. Sopris West.
- Lang J. M., Randall K. G., Delaney M., Vanderploeg J. J. (2017). A model for sustaining evidence-based practices in a statewide system. Families in Society: The Journal of Contemporary Social Services, 98, 18–26. 10.1606/1044-3894.2017.5
- Langley A. K., Nadeem E., Kataoka S. H., Stein B. D., Jaycox L. H. (2010). Evidence-based mental health programs in schools: Barriers and facilitators of successful implementation. School Mental Health, 2, 105–113. 10.1007/s12310-010-9038-1
- Lau A. S., Brookman-Frazee L. (2016). The 4KEEPS study: Identifying predictors of sustainment of multiple practices fiscally mandated in children’s mental health services. Implementation Science, 11(1), 31–38. 10.1186/s13012-016-0388-4
- Lau A. S., Lind T., Crawley M., Rodriguez A., Smith A., Brookman-Frazee L. (2020). When do therapists stop using evidence-based practices? Findings from a mixed method study on system-driven implementation of multiple EBPs for children. Administration and Policy in Mental Health and Mental Health Services Research, 47, 323–337. 10.1007/s10488-019-00987-2
- Lehman W. E. K., Greener J. M., Simpson D. D. (2002). Assessing organizational readiness for change. Journal of Substance Abuse Treatment, 22, 197–209. 10.1016/S0740-5472(02)00233-7
- Lieberman A. F., Ippen C. G., Van Horn P. (2015). Don’t hit my mommy! A manual for child-parent psychotherapy with young children exposed to violence and other trauma (2nd ed.). Zero To Three.
- Los Angeles County Department of Mental Health (2016a). Prevention and early intervention implementation handbook. Revised July 2016. Los Angeles County Department of Mental Health. http://file.lacounty.gov/SDSInter/dmh/247145_PEIImplementationHandbook-PDFforWebsiterev.7-27-16.pdf
- Los Angeles County Department of Mental Health (2016b). Training protocols for prevention and early intervention practices. Los Angeles County Department of Mental Health. http://file.lacounty.gov/SDSInter/dmh/201947_PEITrainingProtocolsrevised4-1-16.pdf
- Lucid L., Meza R., Pullmann M. D., Jungbluth N., Deblinger E., Dorsey S. (2018). Supervision in community mental health: Understanding intensity of EBT focus. Behavior Therapy, 49, 481–493. 10.1016/j.beth.2017.12.007
- Maslach C., Jackson S. E. (1981). The measurement of experienced burnout. Journal of Organizational Behavior, 2, 99–113. 10.1002/job.4030020205
- Massatti R. R., Sweeney H. A., Panzano P. C., Roth D. (2008). The de-adoption of innovative mental health practices (IMHP): Why organizations choose not to sustain an IMHP. Administration and Policy in Mental Health and Mental Health Services Research, 35, 50–65. 10.1007/s10488-007-0141-z
- Moullin J. C., Dickson K. S., Stadnick N. A., Rabin B., Aarons G. A. (2019). Systematic review of the exploration, preparation, implementation, sustainment (EPIS) framework. Implementation Science, 14, 1. 10.1186/s13012-018-0842-6
- Najavits L. M. (2001). Seeking safety: A treatment manual for PTSD and substance abuse. Guilford Press.
- Novins D. K., Green A. E., Legha R. K., Aarons G. A. (2013). Dissemination and implementation of evidence-based practices for child and adolescent mental health: A systematic review. Journal of the American Academy of Child & Adolescent Psychiatry, 52(10), 1009–1025. 10.1016/j.jaac.2013.07.012
- Palinkas L. A., Weisz J. R., Chorpita B. F., Levine B., Garland A. F., Hoagwood K. E., Landsverk J. (2013). Continued use of evidence-based treatments after a randomized controlled effectiveness trial: A qualitative study. Psychiatric Services, 64(11), 1110–1118. 10.1176/appi.ps.004682012 [DOI] [PubMed] [Google Scholar]
- Reding M. E. J., Chorpita B. F., Lau A. S., Innes-Gomberg D. (2014). Providers’ attitudes toward evidence-based practices: Is it just about providers, or do practices matter, too? Administration and Policy in Mental Health and Mental Health Services Research, 41, 767–776. 10.1007/s10488-013-0525-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Regan J., Lau A. S., Barnett M., Stadnick N., Hamilton A., Pesanti K., Bando L., Brookman-Frazee L. (2017). Agency responses to a system-driven implementation of multiple evidence-based practices in children’s mental health services. BMC Health Services Research, 17, 671–684. 10.1186/s12913-017-2613-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rodriguez A., Lau A. S., Wright B., Regan J., Brookman-Frazee L. (2018). Mixed-method analysis of program leader perspectives on the sustainment of multiple child evidence-based practices in a system-driven implementation. Implementation Science, 13, 44. 10.1186/s13012-018-0737-6 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Rohrbach L. A., Grana R., Sussman S., Valente T. W. (2006). Type II translation: Transporting prevention interventions from research to real-world settings. Evaluation & the Health Professions, 29, 302–333. 10.1177/0163278706290408 [DOI] [PubMed] [Google Scholar]
- Roundfield K. D., Lang J. M. (2017). Costs to community mental health agencies to sustain an evidence-based practice. Psychiatric Services, 68, 876–882. 10.1176/appi.ps.201600193 [DOI] [PubMed] [Google Scholar]
- Scanlan J. N., Still M. (2019). Relationships between burnout, turnover intention, job satisfaction, job demands and job resources for mental health personnel in an Australian mental health service. BMC Health Services Research, 19, 62. 10.1186/s12913-018-3841-z [DOI] [PMC free article] [PubMed] [Google Scholar]
- Scheirer M. A. (2005). Is sustainability possible? A review and commentary on empirical studies of program sustainability. American Journal of Evaluation, 26, 320–347. 10.1177/1098214005278752 [DOI] [Google Scholar]
- Stirman S. W., Kimberly J., Cook N., Calloway A., Castro F., Charns M. (2012). The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science, 7, 17–36. 10.1186/1748-5908-7-17 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stirman S. W., Pontoski K., Creed T., Xhezo R., Evans A. C., Beck A. T., Crits-Christoph P. (2017). A non-randomized comparison of strategies for consultation in a community-academic training program to implement an evidence-based psychotherapy. Administration and Policy in Mental Health and Mental Health Services Research, 44, 55–66. 10.1007/s10488-015-0700-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Turner K. M., Markie-Dadds C., Sanders M. R. (2002). Facilitator’s manual for group triple P (2nd ed.). University of Queensland/Triple P International. [Google Scholar]
- Weisz J. R., Thomassin K., Hersh J., Santucci L. C., MacPherson H. A., Rodriguez G. M., Bearman S. K., Lang J. M., Vanderploeg J. J., Marshall T. M., Lu J. J., Jensen-Doss A., Evans S. C. (2020). Clinician training, then what? Randomized clinical trial of child STEPs psychotherapy using lower-cost implementation supports with versus without expert consultation. Journal of Consulting and Clinical Psychology, 88(12), 1065–1078. 10.1037/ccp0000536 [DOI] [PubMed] [Google Scholar]
- Willging C. E., Green A. E., Gunderson L., Chaffin M., Aarons G. A. (2015). From a “perfect storm” to “smooth sailing”: Policymaker perspectives on implementation and sustainment of an evidence-based practice in two states. Child Maltreatment, 20, 24–36. 10.1177/1077559514547384 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Wolk C., Marcus S. C., Weersing V. R., Hawley K. M., Evans A. C., Hurford M. O., Beidas R. S. (2016). Therapist- and client-level predictors of use of therapy techniques during implementation in a large public mental health system. Psychiatric Services, 67(5), 551–557. 10.1176/appi.ps.201500022 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Woltmann E. M., Whitley R., McHugo G. J., Brunette M., Torrey W. C., Coots M. S., Lynde D., Drake R. E. (2008). The role of staff turnover in the implementation of evidence-based practices in mental health care. Psychiatric Services, 59(7), 732–737. 10.1176/ps.2008.59.7.732 [DOI] [PubMed] [Google Scholar]