Author manuscript; available in PMC: 2015 Feb 20.
Published in final edited form as: J Clin Child Adolesc Psychol. 2014 Feb 20;43(2):201–215. doi: 10.1080/15374416.2013.869750

Searching for Elements of Evidence-based Practices in Children’s Usual Care and Examining their Impact

Ann F. Garland, Erin C. Accurso, Rachel Haine-Schlagel, Lauren Brookman-Frazee, Scott Roesch, Jin Jin Zhang
PMCID: PMC4041606  NIHMSID: NIHMS546527  PMID: 24555882

Abstract

Objective

Most of the knowledge generated to bridge the research - practice gap has been derived from experimental studies implementing specific treatment models. Alternatively, this study uses observational methods to generate knowledge about community-based treatment processes and outcomes. Aims are to (1) describe outcome trajectories for children with disruptive behavior problems (DBPs), and (2) test how observed delivery of a benchmark set of practice elements common in evidence-based (EB) treatments may be associated with outcome change, while accounting for potential confounding variables.

Method

Participants included 190 children ages 4–13 with DBPs and their caregivers, plus 85 psychotherapists, recruited from six clinics. All treatment sessions were video-taped and a random sample of four sessions in the first four months of treatment was reliably coded for intensity on 27 practice elements (benchmark set and others). Three outcomes (child symptom severity, parent discipline, and family functioning) were assessed by parent report at intake, four, and eight months. Data were collected on several potential covariates including child, parent, therapist, and service use characteristics. Multi-level modeling was used to assess relationships between observed practice and outcome slopes, while accounting for covariates.

Results

Children and families demonstrated improvements in all three outcomes, but few significant associations between treatment processes and outcome change were identified. Families receiving greater intensity on the benchmark practice elements did demonstrate greater improvement in the parental discipline outcome.

Conclusion

Observed changes in outcomes for families in community care were generally not strongly associated with the type or amount of treatment received.

Keywords: treatment process-outcome research, childhood disruptive behavior problems, community-based care, evidence-based practice


Most of the growing knowledge about delivery and outcomes of evidence-based (EB) practices has come from experimental research testing the impact of specific treatment models across a wide array of efficacy, effectiveness, and implementation studies. While these studies are significantly advancing knowledge of effective treatments and implementation methods, observational research examining “usual care” (UC) in community-based settings can contribute complementary knowledge. Descriptive data on UC allow for identification of treatment processes and child/family/therapist characteristics associated with outcomes. This “practice-based evidence” can then be used to support quality improvement efforts and to facilitate the implementation of EB practices in the UC context.

Multiple studies and reviews report minimal average impact of UC treatment on child and family outcomes but acknowledge the significant variability in UC treatment practices and outcomes (Bickman, Noser, & Summerfelt, 1999; Kazak et al., 2010; Warren, Nelson, Mondragon, Baldwin, & Burlingame, 2010; Weiss, Catron, Harris, & Phung, 1999; Weisz, Donenberg, Han, & Weiss, 1995; Weisz, Jensen-Doss, & Hawley, 2006). However, given the paucity of research characterizing UC practices for children and families, little is known about how variability in UC practices may be associated with differential outcomes. The assumption that delivery of care more consistent with EB treatment results in better patient outcomes has guided current efforts to disseminate and implement EB practices in community practice settings, and emerging successful efforts to do so support this notion (e.g., Farmer, Burns, Wagner, Murray, & Southerland, 2010; Hogue, Dauber, Liddle & Samuolis, 2004; Kolko et al., 2009; Weisz et al., 2012).

Results of a comprehensive system-wide Quality Improvement initiative for children’s mental health care in the State of Hawaii also provide preliminary uncontrolled practice-based support for the link between delivery of care more consistent with EB practice and improved clinical outcomes within a UC system (Daleiden, Chorpita, et al., 2006). A promising specific effect was demonstrated in a subset of youths with ADHD, with a faster average rate of improvement for those whose services included greater therapist-reported delivery of practice elements more consistent with EB treatments for ADHD (e.g., problem-solving, time out; Mueller, Daleiden, Chorpita, Tolman, & Higa-McMillan, 2009).

Experts report that providers in UC service settings rarely deliver EB treatment protocols (Kazdin & Weisz, 2003), but until recently, there were limited data on the extent to which UC practice even resembles elements of EB practice (Garland, Brookman-Frazee, et al., 2010). Deconstructing EB treatment protocols into practice elements (Chorpita, Daleiden, & Weisz, 2005), defined as discrete therapeutic strategies including therapists’ techniques and therapy content (such as psychoeducation and affect education, respectively), allows for comparison between UC and EB practices and assessment of the extent to which UC practice incorporates elements common to EB practices (Garland, Brookman-Frazee, et al., 2010; Garland, Hurlburt, & Hawley, 2006). Of course, this approach requires identifying “benchmarks” representing common elements of EB practices for specific clinical populations. In recent years, several research groups have used a variety of methods to identify core competencies and/or practice elements common across multiple individual treatment protocols for a variety of clinical problem areas (e.g., Kaminski, Valle, Filene, & Boyle, 2008, parent-training interventions; Rogers & Vismara, 2008, autism interventions; Roth & Pilling, 2008, depression and anxiety; Rotheram-Borus, Ingram, Swendeman, & Flannery, 2009, HIV prevention; Sburlati, Schniering, Lyneham, & Rapee, 2011, childhood anxiety and depression; St. Amand, Bard, & Silovsky, 2008, childhood sexual behavior).

Using a Delphi method adapted from Quality of Care research, our research group identified benchmark practice elements common across eight individual EB treatment models for children ages 4 to 13 with disruptive behavior problems (DBPs; Garland, Hawley, Brookman-Frazee, & Hurlburt, 2008). This target patient population was selected because children with DBPs represent the vast majority of youths presenting to UC (Garland et al., 2001), and there are more EB treatment models for this clinical problem area than for other childhood mental health problems (Eyberg, Nelson, & Boggs, 2008). Disruptive behavior problems are also common across disorders (Lilienfeld, 2003).

In a previous report (Garland, Brookman-Frazee, et al., 2010), we used this set of benchmark practice elements common to EB treatments to examine the extent to which a large sample of observed UC practice (over 1200 treatment sessions) reflected elements of EB treatments for this patient population. Findings suggested that some of the benchmark practice elements (e.g., positive reinforcement, psychoeducation) were observed in the majority of sessions, whereas others (e.g., assignment or review of homework; role-playing and modeling for skill-building) were observed relatively infrequently. Overall, each of the benchmark practice elements was observed at low average intensity, with high-intensity delivery of benchmark elements observed in fewer than 15% of sessions (Garland, Brookman-Frazee, et al., 2010).

Certainly, the therapeutic process is a two-way interaction, with both therapist and patient factors contributing to therapists’ use of particular practices. However, examination of child, family, and therapist characteristics associated with delivery of these benchmark practice elements in this study revealed very few significant effects, suggesting that the UC practice we observed was relatively diffuse (i.e., non-specific) (Brookman-Frazee, Haine, Baker-Ericzen, Zoffness, & Garland, 2010). Observed practice intensity on the benchmark elements was positively associated with child age but unrelated to primary diagnosis. Greater intensity on the benchmark indicators was also associated with therapists’ self-report of a primary cognitive behavioral or behavioral theoretical orientation vs. eclectic/other, but was unrelated to other therapist characteristics (e.g., discipline).

As the first study to utilize observational measures to characterize UC treatment, this study provides a general description of UC practice as broad and eclectic (i.e., many practice elements were observed in each session), but lacking in depth and specificity. We found that benchmark practice element intensity was positively associated with parents’ perceptions of treatment effectiveness (Haine-Schlagel et al., 2013) and marginally associated with a greater number of sessions attended in this UC sample (Garland, Haine-Schlagel, Accurso, Baker-Ericzen, & Brookman-Frazee, 2012); therefore, the next logical step is to test how variability in observed practice may be associated with differential outcomes. The primary aim of this current report is to examine the relation between observed intensity on the benchmark practice elements common to EB treatments and prospectively assessed clinical outcomes, while accounting for potential confounding variables.

Prior efforts to examine treatment process – outcome relations have primarily assessed fidelity to a specific intervention, but there are a few studies using different methods to examine practice elements more broadly. For example, St. Amand and colleagues (2008) used meta-analytic methods to identify practice elements associated with outcomes and found that parenting/behavior management strategies were associated with better outcomes for treatment of childhood sexual behavior. Using somewhat similar techniques to identify elements most strongly associated with outcome variability, Kaminski and colleagues (2008) highlighted the importance of active skill-building techniques, such as behavioral rehearsal, in parent-training interventions. Hogue and colleagues (2004) conducted process – outcome analyses using observational data on treatment sessions and found that a family focus in treatment predicted improvement in drug use outcomes among adolescents. Finally, emerging research from Denneny and Mueller (2012) is focused on testing the influence of EB practice elements on outcomes in a large system of care.

Any investigation examining the relationship between treatment processes and outcomes needs to address critical methodological decisions regarding measurement of treatment processes and outcomes, as well as selection of potential covariates that may be confounds (e.g., client and therapist characteristics, relationship factors, etc.; Beauchaine, Webster-Stratton, & Reid, 2005; Hogue et al., 2008). To maximize the rigor and relevance of the treatment process measurement used in our study, reliable observer ratings of many videotaped psychotherapy sessions were utilized (as opposed to therapist report). In addition, the array of practice elements assessed within those sessions included the benchmark set of elements predetermined to be common in EB practice, as well as other practice elements identified by providers as common in UC (Garland, Plemmons, & Koontz, 2006). Assessing for observed intensity on these “other” practice elements provides an important control for a potentially generic practice “intensity” effect on outcome change, as opposed to a more specific effect of benchmark practice elements.

Psychotherapy studies have also demonstrated the importance of relationship factors (e.g., therapeutic alliance) when examining specific practice element effects (Hogue et al., 2008). Therapeutic alliance has been associated with clinical outcomes for child and family treatment in a variety of individual and meta-analytic studies (e.g., McLeod, 2011; Karver et al., 2006; Shirk & Karver, 2003); thus we included parent and youth perspectives on therapeutic alliance in our models testing predictors of outcome trajectories in UC.

There are many additional child, family, therapist, and service use factors that could be associated with treatment processes and outcomes and, likewise, could confound the relationship between process and outcome (Beauchaine et al., 2005). Much of the research attempting to identify variables associated with treatment outcomes comes from controlled efficacy and effectiveness trials, with the notable exception of Warren and colleagues’ (2009, 2010) studies of community care, and findings have been somewhat inconclusive. Thus, there are few empirically supported guidelines for selecting predictors to test in UC. We selected potential predictors that had demonstrated an association with intensity of observed delivery of EB elements in UC (Brookman-Frazee et al., 2010), and/or variables that might have a confounding association with outcome trajectories (e.g., number of sessions attended, intensity on non-benchmark elements, and therapeutic alliance). The final set of potential covariate predictors in this study includes basic child demographic and clinical factors (e.g., age, gender, race/ethnicity, diagnosis, comorbidity), parental factors (e.g., parent education level and psychopathology), therapist factors (e.g., experience and primary theoretical orientation), and service use factors (e.g., number of sessions attended and intensity of “other” non-benchmark practice elements). Review of each of these is beyond the scope of this paper, but a few of particular interest are highlighted below.

One potential covariate predictor of specific interest is number of sessions attended. There is controversy regarding the extent to which amount (i.e., “dose”) of services received (often assessed as number of sessions attended) in UC may be associated with differential outcome trajectories. Andrade and colleagues (2000) reported no significant difference on outcome trajectories for children receiving negligible treatment (defined as fewer than eight sessions) compared to those receiving more treatment. More recently, Warren and colleagues (2010) found no relationship between number of sessions attended and outcome trajectories for a large sample of over 4000 youths in UC. Lindhiem & Kolko (2011) similarly found no “dose” effect. In contrast, two studies have found a positive relationship between number of UC sessions and child and adolescent outcomes (Angold et al., 2000; Shapiro et al., 1997).

There is also inconsistency regarding the impact of parent and therapist characteristics on child and family treatment outcomes. For example, some studies have indicated that family stressors, such as low socio-economic status (SES) and parental psychopathology, may be associated with outcomes (Beauchaine et al., 2005). Therefore, these variables were included as potential predictors in this study (parental education was used as a proxy for SES). In addition, there is considerable interest, but minimal clarity regarding the effects of therapist characteristics, such as extent of experience and/or primary theoretical orientation (Beutler et al., 2004), so these variables were also included as potential predictors.

One of the other primary decisions in a study of treatment processes and outcomes is selection of outcome domains, given that there are many potential outcome domains in which to assess the impact of children’s mental health care (Hoagwood et al., 2012). Change across outcome domains is not necessarily highly concordant (Brookman-Frazee et al., 2006); therefore, assessment of multiple outcome constructs is recommended (Kazdin, 2000). Child symptom severity is the most commonly assessed outcome for children with disruptive behavior problems (Eyberg et al., 2008), but relying exclusively on measurement of child symptom change may underestimate treatment impact (Kazdin, 2000). Given that many EB treatments for children incorporate parent-training, assessment of change in parenting practices as a key outcome is warranted. Likewise, improved quality of family relationships is one of the most important desired treatment goals for youths (across diagnostic groups), and it is also highly prioritized by UC providers (Garland, Lewczyk-Boxmeyer, Gabayan, & Hawley, 2004).

Given the limited yet discouraging data on the impact of UC treatment, and lack of clarity regarding factors (both treatment process and other factors) associated with outcomes in UC, there have been many calls for research on treatment processes and outcomes in UC contexts (e.g., Bickman, 2000; Weisz et al., 2006; Warren et al., 2010). This study addresses these calls with detailed observational data on UC treatment processes, child, family, therapist, and service use characteristics, and outcome trajectories. The aims of this study are: 1) to describe change in three outcome domains: (a) overall symptom severity, (b) parental disciplinary practices, and (c) family functioning across eight months for children with DBPs entering publicly-funded community-based out-patient care; and 2) to determine whether variability in outcome trajectories is associated with variability in treatment processes, specifically with observed intensity on benchmark practice elements common in EB treatments, and secondarily to identify other factors (i.e., child, family, therapist, and service use characteristics) associated with outcome trajectories.

Methods

Data from this study were collected as part of a larger observational study examining treatment processes and outcomes for children with DBPs in UC (refer to Garland, Brookman-Frazee, et al., 2010 and Garland, Hurlburt, Brookman-Frazee, Taylor, & Accurso, 2010 for more details on clinic settings, client recruitment, and data collection procedures).

Participants

Participating Clinics. Six clinics, representing the largest contractors for publicly-funded out-patient care for children in one of the largest counties in the Southwestern U.S., were selected for participation. The clinics serve ethnically and diagnostically diverse children and their families. There was no manipulation in services offered or therapist training associated with this study, and none of the clinics specialized in delivery or incorporation of EB treatments before or during the study period.

Child and Caregiver (hereinafter referred to as parent) Participants. A total of 218 children participated in the larger study, representing 84% of the 258 potential participants identified by the six clinics as meeting eligibility criteria (listed below); the remaining 40 parents declined to participate when recruited by research staff. Inclusion criteria for child participants were (a) presenting problems included a disruptive behavior problem (DBP: aggression, defiance, delinquency, oppositional, or impulsive behavior by parent report), (b) age 4–13 years, (c) primary language for child and parent was English or Spanish, and (d) child was entering a new episode of psychotherapy (defined as no therapy for the previous three months) with a participating therapist. This current study includes the 190 child participants for whom practice data (i.e., session videotapes) and outcome data were collected.

Table 1 presents a summary of the participant characteristics (children, parents, and therapists). As indicated, the average child age was 9 years, the majority were male (67%), and half of the sample was Caucasian. All children had DBPs as presenting target problems, but their clinician-assigned primary diagnoses varied at intake. Almost half of the sample had more than one diagnosis; 75% of the children had a disruptive behavior or ADHD diagnosis. Most participating parents were biological mothers (n=166, 87%) of the children. Average annual household income was $20,739 (SD=34,454).

Table 1.

Descriptives on study participants (n=190).

Participant characteristics   n   Mean (SD) or %
Child Demographic Characteristics

  Child age 190 9.0 (2.7)
  Child gender (male) 128 67.4%
  Child race/ethnicity
    Caucasian 95 50.0%
    Latino/Hispanic 54 28.4%
    African American 16 8.4%
    Other/Mixed 25 13.2%
  Parents’ highest level of education
    Some high school or less 32 17.3%
    High school diploma to some college 123 66.5%
    College graduate to some graduate school 30 16.2%

Child Clinical Factors at Entry

  Primary diagnosis assigned by clinician
    DBD 39 20.5%
    ADHD 73 38.4%
    Mood Disorder 45 23.7%
    Anxiety 17 9.0%
    Autism Spectrum Disorder/Other 16 8.4%
  Comorbidity (more than one diagnosis) 90 47.4%

Parent Clinical Factors at Entry

  Parent psychopathology (BSI mean score) 188 57.1 (11.5)

Therapist and Service Use Characteristics

  Therapist months practiced 85 28.6 (39.4)
  Therapist status: staff (vs. trainee) 36 42.4%
  Therapist discipline
    MFT 50 58.8%
    Psychology 18 21.2%
    Social Work 17 20.0%
  Therapist primary theoretical orientation
    CBT 23 27.1%
    Family Systems 32 37.6%
    Nondirective 7 8.2%
    Eclectic/Other 23 27.1%
Service use
  Number of treatment visits attended in 8 months 190 16.0 (8.9)

Therapist Participants. Eighty percent (n=105 of 131) of the therapists recruited into the study agreed to participate. Of those, 85 were designated as the primary therapist for a participant child, with data available for the current study on treatment processes and outcomes; primary therapist was defined by having the most recorded sessions with the client over the study period. Participating therapists did not differ from all available therapists on age, gender, or race/ethnicity, but participation rate was higher among unlicensed therapists compared to licensed therapists (86% compared to 72%). The majority of the final 85 therapists were female (n=72, 84.7%) and Caucasian (n=57, 67.1%) with an average of 2.8 years of post-graduate clinical experience. Therapists came from different mental health disciplines, including marriage and family therapy (n=50, 58.8%), psychology (n=18, 21.2%), and social work (n=17, 20.0%). Consistent with national UC samples (e.g., Glisson et al., 2008), the majority (n=52, 61.2%) were master’s level clinicians.

Procedures

Data sources for the current study included: 1) baseline in-person interviews with children, parents, and therapists to collect socio-demographic and other clinically relevant baseline data; 2) follow-up telephone interviews at 4 and 8 months post-baseline to collect repeated measures of outcome indicators; 3) administrative billing records on treatment session attendance and clinician-assigned diagnoses; and 4) videotapes of treatment sessions. All treatment sessions were recorded, and a random sample of four sessions within the first four months of treatment was selected for coding (mean number of tapes per child = 3.4, SD = 1.0, given that some children dropped out of care prior to completing four sessions). Although coded videotaped sessions are available up to 16 months after service entry in the larger study (Garland, Brookman-Frazee, et al., 2010), only practice data from the first four months were selected for these analyses. Examining treatment processes in the first four months allowed us to 1) maximize the representativeness of practice for the largest sample (given that many children/families ended care after 4 months), 2) minimize the confounds of multiple therapists (given that most therapist transfers occurred after the first four months of therapy), and 3) most importantly, examine the true predictive impact of early treatment processes on later treatment outcomes (i.e., trajectories assessed over eight months). Furthermore, research supports the importance of treatment processes in early sessions as potent predictors of clinical outcomes (Feeley, DeRubeis, & Gelfand, 1999). While treatment did continue past the first four months for some families, average intensity of benchmark practice elements did not differ across the four 4-month time intervals; this is consistent with a report on adult psychotherapy which documented relative stability in process variables across time (Carroll et al., 2000).

Outcome data were collected every four months regardless of time in treatment. At baseline, all participants had data on child symptom severity, all but one had data on family functioning, and all but five had data on parental disciplinary practices. Follow-up rates at all time-points were well over 80% for all measures. Participants with complete outcome data did not differ significantly from participants with missing data on key variables such as child symptom severity at baseline (p > .10).

Measures

Measures are described below within the following categories: (a) outcome measures and (b) potential predictor variables, including (b.i) within-session benchmark practice element predictor variables and (b.ii) other potential covariate predictors, such as child, family, therapist, and service characteristics.

a) Outcome Measures

Child symptom severity

The Eyberg Child Behavior Inventory (ECBI; Eyberg & Pincus, 1999) is a widely used 36-item parent-report measure designed to assess behavior problems in children ages 2 to 16. Only the intensity scale is used in the current study, which has a clinical cutoff of 132 (Eyberg & Pincus, 1999). The inventory exhibits good convergent validity (Boggs, Eyberg, & Reynolds, 1990), excellent internal consistency in this study (Cronbach’s alpha = .92) and other studies (Cronbach’s alpha =.98; Eyberg & Robinson, 1983), and excellent discriminative power (Eyberg & Robinson, 1983).

Parenting practices

The Inconsistent Discipline scale (Frick, 1991) contains six parent-report items measuring parent inconsistency in disciplinary methods (e.g., not enforcing stated consequence) taken from the Alabama Parenting Questionnaire. The internal reliability in this sample was acceptable (alpha = .73) and reliability and validity have been supported in other studies (e.g., Frick, Christian, & Wootton, 1999). Higher scores reflect more inconsistency.

Family functioning

The Family Relationship Index (FRI; Holahan & Moos, 1983) is a 27-item true-false parent-report measure that assesses the quality of family relationships. The measure has demonstrated good construct validity (Holahan & Moos, 1983) and sensitivity, and it has a suggested clinical cutoff score of less than nine (Edwards & Clarke, 2005). In this sample, the Cronbach’s alpha for the FRI was .78. Higher scores reflect higher quality family relationships.
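The internal consistency values reported for these outcome measures (e.g., alpha = .92 for the ECBI intensity scale, .73 for the Inconsistent Discipline scale, and .78 for the FRI) follow the standard Cronbach’s alpha formula. The sketch below illustrates that computation on a hypothetical item-response matrix; the function and data are ours for illustration and are not part of the study’s materials.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) response matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 190 respondents answering 36 ECBI-style items scored 1-7.
rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(190, 36))
print(round(cronbach_alpha(responses), 2))  # random data, so alpha will be near zero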

b.i) Potential predictor variables (within-session practice elements)

PRAC Therapeutic Process Observational Coding System for Child Psychotherapy – Strategies scale (PRAC TPOCS-S)

The PRAC TPOCS-S (Garland, Brookman-Frazee, & McLeod, 2008) is an adapted version of the original TPOCS-S (McLeod & Weisz, 2010). It assesses a wide array of 27 practice elements through coding of treatment session videotapes, including both therapeutic techniques (e.g., modeling) and content (e.g., affect management). Intensity of each practice element is coded as directed to children or parents (or both). Intensity is conceptually the same as “extensiveness” and reflects both the time spent on the strategy and the thoroughness with which it was pursued. Intensity is rated at the end of the session for each observed element on a Likert scale of 1 to 6 (1–2 = low, 3–4 = medium, 5–6 = high); a score of 0 indicates the element was not observed. Inter-rater reliability on intensity coding was adequate (Cicchetti, 1994), as assessed by double-coding of 31% (n = 379) of the coded sessions, resulting in an intra-class correlation coefficient of .78 across all codes. Details on the coders (non-clinician research assistants), coder training (didactics, manual review, and criterion testing), and reliability analyses are found in Garland, Hurlburt, and colleagues (2010).

As noted earlier, a subset of the PRAC TPOCS-S practice elements was identified as benchmark elements common to EB treatments for children with DBPs (Garland, Hawley, et al., 2008). These benchmark elements include therapeutic techniques (e.g., using positive reinforcement and limit-setting, psychoeducation, establishing/ reviewing goals, modeling, role-play/practice, and assigning/reviewing homework) and therapeutic content (e.g., affect education, parent-child relationship, principles of positive reinforcement, principles of limit-setting/punishment, problem-solving skills, and affect/anger management).

Four composite scores from the PRAC TPOCS-S were used in this study. Two composites represented the benchmark elements, specifically the average intensity of observed practice elements common in EB treatments, calculated separately for elements directed to (a) children and (b) parents. The possible range was 0–6; mean intensity for strategies targeting children was 1.31 (SD = .62) and for strategies targeting parents, .97 (SD = .60). Two additional PRAC TPOCS-S composites reflected the average intensity of other practice elements not identified as common to EB treatments for our target population, again calculated separately for elements directed to (a) children (mean intensity = .97, SD = .34) and (b) parents (mean intensity = 1.01, SD = .49). These “other” non-benchmark codes included strategies such as exploring the past, using play, and coordinating the child’s external care, and were included as potential covariates to control for a more generic “intensity” effect on outcomes.
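To make the composite construction concrete, here is a minimal sketch of how the four average-intensity composites could be computed from session-level intensity codes. The long-format layout and the column names (child_id, element, target, is_benchmark, intensity) are hypothetical; the study’s actual coding database was not published in this form.

import pandas as pd

# Hypothetical long-format coding data: one row per session x practice element x target.
codes = pd.DataFrame({
    "child_id":     [1, 1, 1, 1, 2, 2],
    "session":      [1, 1, 2, 2, 1, 1],
    "element":      ["psychoeducation", "play", "role_play", "explore_past",
                     "limit_setting", "play"],
    "target":       ["parent", "child", "child", "parent", "parent", "child"],
    "is_benchmark": [True, False, True, False, True, False],
    "intensity":    [3, 2, 1, 0, 4, 1],  # 0 = not observed, 1-6 = low to high intensity
})

# Average observed intensity per child, computed separately for benchmark vs. "other"
# elements and for child- vs. parent-directed strategies (the four composites).
composites = (codes
              .groupby(["child_id", "is_benchmark", "target"])["intensity"]
              .mean()
              .unstack(["is_benchmark", "target"]))
print(composites)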

b.ii) Potential predictor variables (other)

Child demographics

Child age, gender, and race/ethnicity were collected by parent report.

Child clinical factors at service entry

Therapist-assigned psychiatric diagnoses, including number of diagnoses to represent comorbidity, were collected from billing records.

Parent factors

Severity of parent psychopathology was assessed at baseline using the Brief Symptom Inventory (BSI; Derogatis, 1993). The BSI is a 53-item instrument that is one of the most widely accepted screening tools of general adult psychopathology (Derogatis, 1993; Derogatis & Melisaratos, 1993). The global severity index of the BSI was used as a gross representation of parent psychopathology. The BSI demonstrates excellent reliability and good convergent validity (Derogatis & Melisaratos, 1993); the internal reliability (Cronbach’s alpha) was .96 for this sample.

Parent education level was collected by parent report and is used here as a proxy indicator for SES.

Therapist characteristics

Therapist factors included duration of practice experience (self-reported as months of practice post-graduate school) and primary theoretical orientation (self-reported and coded into the following three categories: (1) Behavioral/Cognitive Behavioral, (2) Family Systems, and (3) Eclectic/Nondirective/Other).

Other service factors: attendance and alliance

The total number of treatment sessions attended during the 8 month study period was collected from billing records to account for treatment attendance. Quality of the therapeutic alliance was assessed from youth (age 9 and older) and parents using the Therapeutic Alliance Scales for Children, Revised (TASC-r; Shirk, Gudmundsen, Kaplinski, & McMakin, 2008; Shirk & Saiz, 1992) and Therapeutic Alliance Scale for Caregivers and Parents (TASCP; Accurso, Hawley, & Garland, 2013), respectively. The TASC scales assess (a) the affective bond (e.g., “I feel like my therapist/my child’s therapist is on my side and tries to help me”) and (b) client-therapist collaboration on therapeutic tasks (e.g., “I think my therapist/my child’s therapist and I work well together on dealing with my/our problems”). The scales include 12 items that are rated from 1 (not true) to 4 (very much true). The TASC-r has demonstrated good reliability and validity (Creed & Kendall, 2005) and had a Cronbach’s alpha of .91 in this study. The TASCP has also demonstrated strong reliability, temporal stability, and convergent/divergent validity, with good internal consistency in this sample (Cronbach’s alphas = .85–.88; Accurso et al., 2013). The alliance variables were paired with the benchmark practice elements in our models, as described below.

Analyses

SAS (version 9.2) was used to calculate sample descriptives and the mean scores on process and outcome measures at each time-point. Cohen’s d effect sizes were calculated as the difference between baseline and 8-month outcome scores divided by the standard deviation of the change score. Given the interest in rate of change and the nested data structure (i.e., time nested within children, nested within therapists), slopes-as-outcomes models were used to examine growth trajectories for the three outcomes: child symptom severity, inconsistent parenting, and family functioning. Clinic-level variance was not modeled because mean outcome scores did not differ significantly by clinic. Predictors were entered at both level 2 (e.g., child age, diagnosis, practice element intensity received) and level 3 (e.g., therapist experience and theoretical orientation). The parameter of greatest interest in these models is the slope coefficient, representing an interaction of the level-2 or level-3 predictors with time (level 1). As a first step, SuperMix Version 1.1 (Hedeker, Gibbons, du Toit, & Patterson, 2008) was used to calculate intraclass correlation coefficients (ICCs) for each outcome variable (8-month score) to estimate the percentage of variability in each outcome attributable to the therapist and child levels and thereby determine whether to retain those levels in subsequent analyses; a .05 cutoff was used (Reise, Ventura, Nuechterlein, & Kim, 2005). SAS (version 9.2) was then used for multilevel modeling to test predictors of outcome slope; these models appropriately handle nested data and can accommodate missing data. Potential covariates (child, parent, therapist, and service characteristics) were first screened for inclusion in the final multivariate models for each outcome by examining the bivariate relationship between each variable and the slope on each outcome. Given the relative heterogeneity in this sample, interaction effects (i.e., DBD diagnosis by benchmark practice intensity) were examined in addition to main effects for diagnosis.
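To make the effect-size and ICC screening steps concrete, the sketch below computes a change-score Cohen’s d and a child-level ICC from an unconditional random-intercept model. It is a simplified stand-in (two levels rather than three, and statsmodels rather than SAS/SuperMix), and the variable names (child_id, baseline, month8, and the outcome column) are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

def cohens_d_change(baseline, month8):
    """Effect size: mean change (baseline minus 8-month score) divided by the
    standard deviation of the change score, as described in the Analyses section."""
    change = baseline - month8
    return change.mean() / change.std(ddof=1)

def child_level_icc(long_df, outcome):
    """Proportion of outcome variance attributable to children, estimated from an
    unconditional random-intercept model (a two-level approximation of the
    three-level screening step)."""
    model = smf.mixedlm(f"{outcome} ~ 1", data=long_df,
                        groups=long_df["child_id"]).fit()
    between = model.cov_re.iloc[0, 0]   # between-child variance
    within = model.scale                # residual (within-child) variance
    return between / (between + within)

# Hypothetical usage with made-up baseline and 8-month ECBI scores:
scores = pd.DataFrame({"baseline": [150.0, 140.0, 160.0, 130.0],
                       "month8":   [120.0, 135.0, 140.0, 125.0]})
print(round(cohens_d_change(scores["baseline"], scores["month8"]), 2))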

To rigorously test the potential effect of observed intensity of the benchmark practice elements on outcome slopes, the next step was to enter any covariates at least marginally associated with an outcome slope (p < .10) into a multilevel model with the benchmark practice element composite variables and therapeutic alliance variables (separate models were run for each outcome slope, one using the benchmark composite for elements directed to children paired with child alliance and one using the benchmark composite for elements directed to parents paired with parent alliance). Studies examining therapy process-outcome relationships often examine specific practice elements alongside therapeutic alliance (e.g., Webb et al., 2012). This approach was used to control for potential confounds when testing the impact of observed intensity of benchmark practice elements on outcome slope, and to identify robust predictors of outcome slopes. When there was a significant effect of benchmark practice elements on an outcome slope, follow-up analyses were conducted to illuminate the direction of the effect.
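As an illustration of the slopes-as-outcomes logic, the sketch below fits a growth model in which the time-by-predictor interaction terms test whether child-level predictors (here the child-directed benchmark composite, child-reported alliance, and parental psychopathology) are associated with the outcome slope. The data are simulated and all variable names are hypothetical; the published analysis used three-level SAS models, so this two-level statsmodels version only approximates the approach.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data: 190 children, three assessments
# (time = 0 intake, 1 = four months, 2 = eight months).
rng = np.random.default_rng(1)
n_children = 190
children = pd.DataFrame({
    "child_id": np.arange(n_children),
    "benchmark_child": rng.normal(1.3, 0.6, n_children).clip(0, 6),
    "alliance_child": rng.normal(3.2, 0.5, n_children),
    "parent_psychopathology": rng.normal(57, 11.5, n_children),
})
long_df = children.loc[children.index.repeat(3)].reset_index(drop=True)
long_df["time"] = np.tile([0, 1, 2], n_children)
long_df["ecbi"] = (146 - 11 * long_df["time"]
                   + 0.5 * long_df["parent_psychopathology"]
                   + rng.normal(0, 25, len(long_df)))

# Growth model with a random intercept and slope for each child; the
# time-by-predictor interactions are the slope effects of interest.
formula = ("ecbi ~ time * benchmark_child + time * alliance_child "
           "+ time * parent_psychopathology")
result = smf.mixedlm(formula, data=long_df,
                     groups=long_df["child_id"], re_formula="~time").fit()
print(result.params.filter(like="time:"))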

Results

Child Symptom Severity Outcome

On average, child symptom severity (ECBI intensity score) decreased from baseline (M = 146.50; SD = 36.32) to four months (M = 129.64; SD = 35.95) and then to eight months (M = 125.64; SD = 36.78). This change in problem intensity scores is both statistically significant (B = −11.44, SE = 1.09, p < .00001) and clinically significant (i.e., the 8-month mean is below the clinical cutoff of 132; Eyberg & Pincus, 1999); it also represents a moderate effect over time (Cohen’s d = .72).

The ICC values from the unconditional growth model (i.e., time nested within child, nested within therapist) reflected substantial variance in the eight-month ECBI score at the child level (ICC = .704) but not at the therapist level (ICC = −.059). The negative variability estimate at the therapist level indicated that this level should not be included in multivariate analyses for this outcome. Child-level factors accounted for 70.4% of the variance in this outcome, and thus this level was included in the multivariate model.

Initial bivariate screening for potential covariates to include in multivariate model

Of all the potential covariates assessed, the only one associated with the child symptom severity slope was parental psychopathology at baseline (p < .0001).

Multivariate analyses

The multivariate models, which included parental psychopathology, the benchmark practice element intensity composite, and quality of therapeutic alliance, are presented in Table 2 for parent and child directed practice elements. As indicated, parental psychopathology was significantly associated with change on child symptom severity, but benchmark practice element intensity and alliance were not.

Table 2.

Multivariate model predicting child symptom severity slope.

Model with parent-directed benchmark composite and parent alliance
Predictors   B   SE   t value   p-value
Intercept 75.41 21.66 3.48 .0009

Time −11.55 1.18 −9.76 <.0001

Parental psychopathology .90 .22 4.01 <.0001

EB practice element composite - Parent 3.32 4.70 0.71 .48
Therapeutic alliance - Parent .39 .40 .98 .33
Model with child-directed benchmark composite and child alliance
Predictors   B   SE   t value   p-value
Intercept 57.94 24.93 2.32 .02

Time −10.35 1.66 −6.24 <.0001

Parental psychopathology 1.0 0.28 3.57 .0005

EB practice element composite - Child 2.87 5.14 0.56 0.58
Therapeutic alliance - Child .63 .39 1.61 0.11

Caregiver Disciplinary Practices Outcome

Reports of inconsistent parent disciplinary practices were significantly reduced from baseline (M = 14.70; SD = 4.54) to four months (M = 14.14; SD = 4.61) and then to eight months (M = 13.96; SD = 4.34). This change over time represents statistically significant improvement (B = −0.52, SE = 0.17, p = .003) with a small effect size (Cohen’s d = .17). The ICC values reflected substantial variance in the eight month outcome score at the child level (.507) and the therapist level (.079). Given that therapists accounted for approximately 8% of the variance in caregiver disciplinary practices and children accounted for approximately 50% of the variance in this outcome, both levels of the nested data structure were accounted for in subsequent analyses.

Initial bivariate screening for potential covariates to include in multivariate model

Marginally significant to significant predictors of change in caregiver disciplinary practices included child race, parental education, parental psychopathology, and total number of treatment visits.

Multivariate analyses

All of the variables listed above were entered into the multivariate models with the benchmark practice element intensity composites and therapeutic alliance. As shown in Table 3, variables significantly related to change in disciplinary practices included mixed/other race, parental psychopathology, total number of visits attended, youth reported therapeutic alliance, and average intensity on benchmark practice elements targeting children.

Table 3.

Multivariate model predicting inconsistent parenting practices slope.

Model with parent-directed benchmark composite and parent alliance
Predictors   B   SE   t value   p-value
Intercept 8.46 2.91 2.91 .005
Time −0.50 0.20 −2.53 .01

Child race/ethnicitya: Black 1.52 1.16 1.30 .19
Child race/ethnicitya: Latino −.32 .78 −.41 .68
Child race/ethnicitya: Other/Mixed −.79 .96 −.82 .41
Parent educationb: High school or college −1.09 .93 −1.17 .24
Parent educationb: College + −1.82 1.14 −1.60 .11
Parental psychopathology .10 .03 3.55 .0004
Number of sessions attended −0.08 .04 −2.15 .03

EB practice element composite - Parent .11 .57 .19 .85
Therapeutic alliance - Parent .08 .05 1.49 .14
Model with child-directed benchmark composite and child alliance
Predictors   B   SE   t value   p-value
Intercept 4.07 3.13 1.30 .20
Time −.71 .29 −2.49 .01

Child race/ethnicitya: Black .70 1.64 .43 .67
Child race/ethnicitya: Latino .55 .89 .62 .54
Child race/ethnicitya: Other/Mixed −3.68 1.38 −2.67 .008
Parent educationb: High school or college −.29 1.08 −.27 .79
Parent educationb: College + .33 1.39 .24 .81
Parental psychopathology .16 .03 4.60 <.0001
Number of sessions attended −.13 .05 −2.39 .02

EB practice element composite - Child 1.41 .70 2.03 .04
Therapeutic alliance - Child .11 .05 2.4 .02
a = Reference group is White.

b = Reference group is less than high school diploma.

Given this study’s primary aim of examining any relationships between observed benchmark practice elements and outcome change, this specific effect was probed further. Figure 1 presents a graphic illustration of steeper improvement in disciplinary practices (reduction in inconsistent discipline) in families where children received greater average intensity of benchmark practice elements.

Figure 1.

Rate of decrease in inconsistent parenting practices as predicted by intensity of benchmark practice elements (consistent with evidence-based practice) directed to children.

Family Functioning Outcome

Reports of the quality of family functioning improved from baseline (M = 9.22; SD = 4.53) to four months (M = 9.91; SD = 4.25) and then to eight months (M = 10.35; SD = 4.10). This improvement in scores is statistically significant (B = 0.49, SE = 0.15, p = .001) and represents a small effect size (Cohen’s d = .28). The baseline mean score is just above the suggested clinical cut-off of nine (Edwards & Clarke, 2005), and the 8-month score is well above. In the unconditional growth model, there was significant variability in family functioning at the child level (ICC = .587) but not the therapist level (ICC =.049). Given that child-level factors accounted for approximately 60% of the variance in family functioning, this level was accounted for in multivariate analyses for this outcome.

Initial bivariate screening for potential covariates to include in multivariate model

Variables even marginally associated with change in family relationship functioning included child age, child gender, parental psychopathology, and average intensity of “other” (non-benchmark) practice elements directed towards parents.

Multivariate analyses

The final multivariate models (Table 4) included the above variables in addition to the EB practice element composites and therapeutic alliance. As noted in Table 4, predictors that retained significance in the multivariate models included child gender, parental psychopathology, and average intensity of “other” practice elements. The benchmark practice elements and alliance variables were not significantly associated with change in family functioning.

Table 4.

Model predicting family functioning slope.

Model with parent-directed benchmark composite and parent alliance
Predictors   B   SE   t value   p-value
Intercept 23.56 2.73 8.64 <.0001
Time .48 .19 2.51 0.01

Child age −0.12 .11 −1.15 .25
Child gender −1.88 .59 −3.16 .002
Parental psychopathology −.133 0.02 −5.56 <.0001
Other practice elements −2.32 .79 −2.95 .003

EB practice element composite - Parent .88 .67 1.32 .19
Therapeutic alliance - Parent −0.02 .04 −0.46 .65
Model with child-directed benchmark composite and child alliance
Predictors   B   SE   t value   p-value
Intercept 24.75 4.59 5.39 <.0001
Time .52 .23 2.24 0.03

Child age −.52 .29 −1.75 .08
Child gender −1.85 .83 −2.22 .03
Parental psychopathology −0.09 .04 −2.45 <.0001
Other practice elements −1.80 .90 −2.00 .05

EB practice element composite - Child .23 .72 .31 .75
Therapeutic alliance - Child −.03 .05 −.57 .57

Discussion

Our results indicate that children with disruptive behavior problems and their parents who enter treatment in publicly-funded community-based mental health clinics demonstrate improvements in outcomes over eight months. Change over time was small to moderate and statistically significant for all three outcomes (i.e., child symptom severity, inconsistent parent disciplinary practices, and general family functioning), as well as clinically significant for one of the three outcomes (i.e., child symptom severity). The effect size for child symptom severity was similar to that reported for child functional impairment in a large, diverse sample of children and youth from a different public mental health system (e.g., Cohen’s d of .72, compared to .71 in Mueller, Tolman, Higa-McMillan, & Daleiden, 2010). Change was greater during the first four months compared to the second four months across all outcomes. This pattern of greater change early in treatment is consistent with other research examining outcome trajectories for children in treatment for DBPs (e.g., Lindhiem & Kolko, 2011). Although improvements were observed over time, it is important to emphasize that change cannot necessarily be attributed to treatment effects in this uncontrolled observational study.

Beyond describing change in outcome trajectories for this “usual care” sample, the aim of this study was to examine potential associations between treatment processes and outcomes, most specifically observed intensity on benchmark practice elements common in EB treatments. After accounting for other potential confounding variables across three outcome domains, only two significant effects were found. Specifically, families whose children received treatment reflecting more intense delivery of practice elements common to EB practice (e.g., using positive reinforcement and limit setting, modeling of skills, role-playing) demonstrated a greater (i.e., steeper) reduction in parental inconsistent discipline. Additionally, greater intensity of parent-directed “other” non-benchmark practice elements (e.g., improved communication, family member roles) actually predicted lesser improvements in family functioning. These findings provide unique, but very preliminary “practice-based evidence” for the potential positive impact of common elements of EB practice in UC settings, as well as the potential negative impact of non-common “other” practice elements. The findings should be interpreted with caution given the number of models examined and the few significant findings.

Parental inconsistent discipline is a particularly important outcome indicator for families with children with DBPs given its strong association with disruptive behaviors in children (Stormshak, Bierman, McMahon, & Lengua, 2000). Further, improvements in parents’ disciplinary practices may have long term impacts on the children’s behavior and general family functioning (Kazdin & Wassell, 2000). The fact that parent disciplinary practices may have been indirectly impacted by interventions delivered to the child seems somewhat counterintuitive. It is possible that a change in child behavior mediated a change in parenting. Since parents were present in the majority of sessions, it is also possible that parents were indirectly influenced by interventions delivered to children in their presence—therapists were modeling skills (e.g., limit setting) for parents regardless of whether this was their explicit intention.

The lack of more significant findings linking the benchmark practice elements to outcome change for the other two outcome domains (child symptom severity and family functioning) is somewhat disappointing, but not necessarily surprising. Very few studies have identified treatment characteristics (outside of fidelity to a specified treatment model) associated with outcome trajectories. Finding an association between outcome change and UC treatment processes in the current study was challenging due to complexities in measuring treatment processes (Garland, Hurlburt, et al., 2010) and the fact that the observed average intensity of delivery of EB practice elements was very low overall (Garland, Brookman-Frazee, et al., 2010). The restricted variability in intensity of practice elements at the low end of the scale may reflect a potential “floor effect” for observed intensity of benchmark practice elements. These results nevertheless suggest that diffuse, low intensity delivery of benchmark practice elements is not strongly linked to improvements in treatment outcomes. Additional analyses not reported here indicated that benchmark practice elements were no more effective for children with a primary diagnosis of DBD compared to other diagnoses, suggesting that diagnostic variability in the sample did not account for lack of effects.

It is important to note that while the benchmark practice elements were common in EB treatments for children with DBPs, they cannot be considered as consistent with (i.e., with fidelity to) EB treatments when delivered at such low intensity. Likewise, multiple potentially important factors related to effective delivery of EB practices were not assessed in this study, including (but not limited to) sequencing and coordination of practice elements within and across sessions, individual tailoring of elements based on diagnosis and/or targeted problems, and therapist competence. The present study provides an initial gross survey of the extent to which UC reflects practice elements conceptually consistent with EB treatments. It was not designed to assess true fidelity to EB treatments, which would have yielded no variance since none of the observed practice was consistent with any EB treatment.

The results illustrate that far more of the variance in 8-month outcome scores was attributable to patient factors, which accounted for between 51% and 70% of the variance, compared to less than 10% attributable to therapists. These results are relatively consistent with Warren and colleagues’ (2010) findings that 53% of the variability in their inclusive youth outcome measure was attributed to between-child variance and 2.7% to between-therapist variance. Despite the large amount of variance accounted for at the patient level, we were only able to identify a few significant effects for specific patient or parent demographic or clinical characteristics associated with outcome change, including child gender, child race, and parental psychopathology. Parental psychopathology was particularly robust as the only factor that significantly predicted change in all three outcomes. Previous research has demonstrated significant associations between parental psychopathology (e.g., maternal depression) and outcomes in treatment for child behavior problems (Baydar, Reid, & Webster-Stratton, 2003). This association was particularly salient in our diverse UC treatment sample, with families challenged by multiple life stressors, including low incomes and elevated rates of parental psychopathology.

Previous research has demonstrated that it is challenging to identify significant, consistent predictors or correlates of outcome change in child and youth mental health treatment across different predictor categories (e.g., child, family, or service characteristics) in UC settings, even with large samples and rigorous research methods (e.g., Denneny & Mueller, 2012; Shapiro et al., 1997; Warren et al., 2010). Even in more controlled treatment trial research, few consistent findings regarding basic demographic and clinical child and family characteristics as predictors of treatment outcome have been identified (e.g., Kazdin & Wassell, 2000; Phillips et al., 2000). We assessed a variety of variables across these categories and found few to be even marginally related to outcome slopes. Difficulty in identifying significant predictors of outcome slope in UC may be due to the heterogeneity of routine care samples and treatment (including treatment format, duration, frequency, and intensity), the limited magnitude of overall change, and the multitude of potentially interacting factors likely to be associated with change on these outcomes, including each family’s intervening life events, which could be related, or unrelated, to treatment.

Some of the non-significant findings in this study offer relevant contributions to the literature on UC. For example, we found no positive relationship between number of visits attended and outcome trajectories. We did find one effect in the inverse direction such that the number of visits attended over 8 months was associated with weaker change in inconsistent discipline practices. Most studies have found no relationship between number of sessions attended (treatment dose) and outcome trajectories, including a methodologically rigorous study with over 4000 youths served in public and private community-based care (Warren et al., 2010). Several child, family, and treatment process factors have been found to predict session attendance (see Garland et al., 2012; Gopalan et al., 2010), but attendance has rarely been identified as a significant predictor of positive outcomes in UC.

Given the lack of association between session attendance and outcome trajectories, as well as the observation that clinical change was steepest during the first four months of treatment, what beyond treatment impact might explain the observed change in clinical outcomes? Lambert and Bickman (2004) have proposed that observed improvement in clinical outcomes assessed relatively early in treatment likely reflects natural improvement in functioning, rather than treatment effects. They offer compelling simulation data to support this “clock-setting” cure hypothesis, suggesting that if we assume that children usually enter care at their personal “worst” functioning, functioning is likely to improve naturally shortly after service entry (i.e., regression to the mean). Alternative explanations include a phenomenon conceptually similar to a placebo effect, whereby patients demonstrate improvement in functioning as a result of entering care; this may reflect expectations of benefit and relief at having successfully sought care.

The results of this study should be interpreted in the context of study strengths and limitations. This is the first study we are aware of that collected detailed observational data on within-session UC practice to allow for a test of the relationship between observed UC treatment processes (practice elements delivered by community-based therapists) and outcome trajectories outside the context of an intervention trial. Inclusion of three different outcome domains is a strength of the study, but the fact that all outcomes were based on parent self-report is a significant limitation. Future research that includes other outcome indicators (e.g., functional outcomes) and other informants is needed. Additionally, the practice element data are limited to what could be observed behaviorally within the randomly selected sessions (thus excluding ancillary services, phone calls, and other collateral care). The extent to which observation itself may have influenced practice is unknown, but we attempted to minimize this effect by routinizing videotaping and using small, unobtrusive cameras. Our assessment of within-session practice elements was also limited to a focus on provider behavior, as opposed to more complex measurement of the interaction between therapist and client behavior. Given that psychotherapy is a dynamic, interactive endeavor, examining therapist behaviors alone provides just one perspective on therapeutic processes (Krause & Lutz, 2009). In an effort to complement this single perspective, we included youth and parent ratings of therapeutic alliance in the models, but only child alliance, and only for one outcome (i.e., inconsistent parenting), was associated with change. This is perhaps not surprising given that few studies have concurrently examined both UC treatment processes and alliance within a multilevel framework, and restricted variability in alliance may reflect a “ceiling effect” for this sample.

Reliable and complex multilevel analytic methods were used in this study to account for the nested data structure. However, significant variability in practice could not be controlled, which ultimately limited power to detect effects given the observational nature of this study. For example, many of the participating therapists had only one patient in the study, which limited analytic examination of within- and between-therapist effects. Patient participants varied in their session attendance; thus, the number of assessed sessions also varied. Regardless of how long patients stayed in treatment, outcome data were collected every four months, except in cases of attrition.

Although we do not know how well these findings generalize to other service settings in other geographic areas, the therapist sample is similar to a national sample of providers of public mental health care in terms of educational level and basic demographics (gender, race/ethnicity) (Glisson et al., 2008). The majority were trainees, consistent with high representation of trainees in public service systems (Hawley & Weisz, 2005). The patient participant sample is also similar to other clinical samples in terms of common diagnoses, high comorbidity, race/ethnic diversity, and gender distribution (Rosenblatt & Rosenblatt, 2000; Zima et al., 2005).

Implications

The articles in this Special Issue advance efforts to bridge the persistent gap between research and practice (Weisz et al., 1995). Decades of research on intervention development, efficacy, and effectiveness, as well as emerging research on implementation methods, have generated solid “building blocks” with which to construct this bridge from empirically derived knowledge. We hope that the practice-based research reported here contributes a few additional building blocks to support the practice “side” of the bridge by characterizing UC practices, identifying the extent to which they resemble EB practices, characterizing outcome trajectories, and identifying predictors of those trajectories. In this report and others, we have shown that UC for children with DBPs includes several of the practice elements common in EB treatment models for this patient population, but that these elements are delivered at low average intensity and are therefore not consistent with EB models; “other” elements are generally delivered at low average intensity as well. In the current report, we demonstrated a very preliminary effect whereby parent disciplinary practices improved faster when practice elements common to EB treatments were delivered at greater intensity. Experimental research is needed to test the impact of interventions designed to intensify the delivery of EB practice elements, with attention to the coordination of these elements (e.g., selection, sequencing, evaluating) and to tailoring treatment for specific patients (Chorpita et al., 2005).

In conclusion, we extend the research–practice bridge metaphor to illustrate a central theme of this Special Issue. An impressive bridge constructed of building blocks of essential knowledge about effective interventions, training and implementation methods, organizational contexts, and outcome monitoring now spans the research–practice gap, but if we do not improve the bidirectional flow of traffic across that bridge, we risk having built a “bridge to nowhere.” The articles in this series highlight promising, innovative strategies for improving the flow of traffic (i.e., knowledge exchange) between research and practice, emphasizing the application of research-derived knowledge to practice improvement. We hope that our contribution provides solid grounding on the practice “side” of the gap by illuminating treatment processes and outcomes in a usual care context.

Acknowledgements

The authors thank Robin Taylor and William Ganger, M.S. for study and data management. Preparation of this manuscript was supported by NIMH grants R01-MH-66070 (Garland), F31-MH083399 (Accurso), K23-MH-077584 (Brookman-Frazee), K23-MH080149 (Haine-Schlagel), and P30-MH-074778 (Landsverk).

References

1. Accurso EC, Hawley KM, Garland AF. Psychometric properties of the Therapeutic Alliance Scale for Caregivers and Parents. Psychological Assessment. 2013;25:244–252. doi: 10.1037/a0030551.
2. Andrade AR, Lambert EW, Bickman L. Dose effect in child psychotherapy: Outcomes associated with negligible treatment. Journal of the American Academy of Child and Adolescent Psychiatry. 2000;39:161–168. doi: 10.1097/00004583-200002000-00014.
3. Angold A, Costello EJ, Burns BJ, Erkanli A, Farmer EMZ. Effectiveness of nonresidential specialty mental health services for children and adolescents in the “real world.” Journal of the American Academy of Child and Adolescent Psychiatry. 2000;39:154–160. doi: 10.1097/00004583-200002000-00013.
4. Baydar N, Reid MJ, Webster-Stratton C. The role of mental health factors and program engagement in the effectiveness of a preventive parenting program for Head Start mothers. Child Development. 2003;74:1433–1453. doi: 10.1111/1467-8624.00616.
5. Beauchaine TP, Webster-Stratton C, Reid MJ. Mediators, moderators, and predictors of 1-year outcomes among children treated for early-onset conduct problems: A latent growth curve analysis. Journal of Consulting and Clinical Psychology. 2005;73:371–388. doi: 10.1037/0022-006X.73.3.371.
6. Beutler LE, Malik M, Alimohamed S, Harwood TM, Talebi H, Noble S, Wong E. Therapist effects. In: Lambert MJ, editor. Bergin and Garfield’s Handbook of Psychotherapy and Behavior Change. 5th ed. New York: Wiley; 2004. pp. 227–306.
7. Bickman L. The most dangerous and difficult question in mental health services research. Mental Health Services Research. 2000;2:71–72.
8. Bickman L, Noser K, Summerfelt WT. Long-term effects of a system of care on children and adolescents. Journal of Behavioral Health Services and Research. 1999;26:185–202. doi: 10.1007/BF02287490.
9. Boggs SR, Eyberg S, Reynolds LA. Concurrent validity of the Eyberg Child Behavior Inventory. Journal of Clinical Child Psychology. 1990;19:75–78.
10. Brookman-Frazee L, Haine RA, Baker-Ericzen M, Zoffness R, Garland AF. Factors associated with use of evidence-based practice strategies in usual care youth psychotherapy. Administration and Policy in Mental Health and Mental Health Services. 2010;37(3):205–207. doi: 10.1007/s10488-009-0244-9.
11. Brookman-Frazee L, Haine RL, Garland AF. Measuring outcomes of real-world youth psychotherapy: Who and what to ask? Psychiatric Services. 2006;57:1373–1375. doi: 10.1176/ps.2006.57.10.1373.
12. Carroll KM, Nich C, Sifry RL, Nuro KF, Frankforter TL, Ball SA, Fenton L, Rounsaville BJ. A general system for evaluating therapist adherence and competence in psychotherapy research in the addictions. Drug and Alcohol Dependence. 2000;57:225–238. doi: 10.1016/s0376-8716(99)00049-6.
13. Chorpita BF, Daleiden EL, Weisz JR. Identifying and selecting the common elements of evidence based interventions: A distillation and matching model. Mental Health Services Research. 2005;7:5–20. doi: 10.1007/s11020-005-1962-6.
14. Chorpita BF, Daleiden EL. Mapping evidence-based treatments for children and adolescents: Application of the distillation and matching model to 615 treatments from 322 randomized trials. Journal of Consulting and Clinical Psychology. 2009;77:566–579. doi: 10.1037/a0014565.
15. Creed TA, Kendall PC. Therapist alliance-building behavior within a cognitive-behavioral treatment for anxiety in youth. Journal of Consulting and Clinical Psychology. 2005;73:498–505. doi: 10.1037/0022-006X.73.3.498.
16. Daleiden EL, Chorpita BF, Donkervoet CM, Arensdorf AA, Brogan M. Getting better at getting them better: Health outcomes and evidence-based practice within a system of care. Journal of the American Academy of Child and Adolescent Psychiatry. 2006;45:749–756. doi: 10.1097/01.chi.0000215154.07142.63.
17. Denneny D, Mueller C. Do empirically supported packages or their practices predict superior therapy outcomes for youth with conduct disorders? Poster presented at the Forty-sixth Annual Convention of the Association of Behavioral and Cognitive Therapies; National Harbor, MD. 2012.
18. Derogatis LR. Brief Symptom Inventory: Administration, scoring and procedures manual. 4th ed. Minneapolis, MN: NCS Pearson, Inc.; 1993.
19. Derogatis LR, Melisaratos N. The Brief Symptom Inventory: An introductory report. Psychological Medicine. 1983;13:595–605.
20. Edwards B, Clarke V. The validity of the Family Relationship Index as a screening tool for psychological risk in families of cancer patients. Psycho-oncology. 2005;15:546–554. doi: 10.1002/pon.876.
21. Eyberg SM, Nelson MM, Boggs SR. Evidence-based psychosocial treatments for children and adolescents with disruptive behavior. Journal of Clinical Child and Adolescent Psychology. 2008;37:215–237. doi: 10.1080/15374410701820117.
22. Eyberg S, Pincus D. Eyberg Child Behavior Inventory and Sutter-Eyberg Student Behavior Inventory – Revised. Odessa, FL: Psychological Assessment Resources; 1999.
23. Eyberg SM, Robinson EA. Conduct problem behavior: Standardization of a behavioral rating scale with adolescents. Journal of Clinical Child Psychology. 1983;12:347–354.
24. Feeley M, DeRubeis RJ, Gelfand LA. The temporal relation of adherence and alliance to symptom change in cognitive therapy for depression. Journal of Consulting and Clinical Psychology. 1999;67:578–582. doi: 10.1037//0022-006x.67.4.578.
25. Farmer EMZ, Burns BJ, Wagner HR, Murray M, Southerland DG. Enhancing “usual practice” treatment foster care: Findings from a randomized trial on improving youths’ outcomes. Psychiatric Services. 2010;61:555–561. doi: 10.1176/appi.ps.61.6.555.
26. Frick PJ. The Alabama Parenting Questionnaire. Unpublished instrument, University of Alabama; 1991.
27. Frick PJ, Christian RC, Wootton JM. Age trends in the association between parenting practices and conduct problems. Behavior Modification. 1999;23:106–128.
28. Gardner F, Hutchings J, Bywater T, Whitaker C. Who benefits and how does it work? Moderators and mediators of outcome in an effectiveness trial of a parenting intervention. Journal of Clinical Child & Adolescent Psychology. 2010;39:568–580. doi: 10.1080/15374416.2010.486315.
29. Garland AF, Brookman-Frazee L, Hurlburt MS, Accurso EC, Zoffness R, Haine-Schlagel RA, Ganger W. Mental health care for children with disruptive behavior problems: A view inside therapists’ offices. Psychiatric Services. 2010;61:788–795. doi: 10.1176/appi.ps.61.8.788.
30. Garland AF, Brookman-Frazee L, McLeod BD. Scoring manual for the PRAC study Therapy Process Observational Coding System for Child Psychotherapy: Strategies Scale. Unpublished coding manual. San Diego: Child and Adolescent Services Research Center; 2008.
31. Garland AF, Haine-Schlagel RA, Accurso EC, Baker-Ericzen MJ, Brookman-Frazee L. Exploring the effect of therapists’ treatment practices on client attendance in community-based care for children. Psychological Services. 2012;9:74–88. doi: 10.1037/a0027098.
32. Garland AF, Hawley KM, Brookman-Frazee L, Hurlburt M. Identifying common elements of evidence-based psychosocial treatments for children’s disruptive behavior problems. Journal of the American Academy of Child and Adolescent Psychiatry. 2008;47:505–514. doi: 10.1097/CHI.0b013e31816765c2.
33. Garland AF, Hough R, McCabe K, Yeh M, Wood P, Aarons G. Prevalence of psychiatric disorders for youths in public sectors of care. Journal of the American Academy of Child and Adolescent Psychiatry. 2001;40:409–418. doi: 10.1097/00004583-200104000-00009.
34. Garland AF, Hurlburt MS, Hawley KM. Examining psychotherapy processes in a services research context. Clinical Psychology: Science and Practice. 2006;13:30–46.
35. Garland AF, Hurlburt MS, Brookman-Frazee L, Taylor RM, Accurso EC. Methodological challenges of characterizing usual care psychotherapeutic practice. Administration and Policy in Mental Health and Mental Health Services. 2010;37(3):208–220. doi: 10.1007/s10488-009-0237-8.
36. Garland AF, Lewczyk-Boxmeyer CM, Gabayan E, Hawley K. Multiple stakeholder agreement on desired outcomes for adolescents’ mental health services. Psychiatric Services. 2004;55:671–676. doi: 10.1176/appi.ps.55.6.671.
37. Garland AF, Plemmons D, Koontz L. Research-practice partnerships in mental health: Lessons from participants. Administration and Policy in Mental Health and Mental Health Services. 2006;33:517–528. doi: 10.1007/s10488-006-0062-2.
38. Glisson C, Landsverk J, Schoenwald S, Kelleher K, Hoagwood KE, Mayberg S, Green P. Assessing the organizational social context (OSC) of mental health services: Implications for research and practice. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35:98–113. doi: 10.1007/s10488-007-0148-5.
39. Gopalan G, Goldstein L, Klingenstein K, Sicher C, Blake C, McKay MM. Engaging families into child mental health treatment: Updates and special considerations. Journal of the Canadian Academy of Child & Adolescent Psychiatry. 2010;19:182–196.
40. Hawley KM, Weisz JR. Youth versus parent working alliance in usual clinical care: Distinctive associations with retention, satisfaction and treatment outcome. Journal of Clinical Child and Adolescent Psychology. 2005;34:117–128. doi: 10.1207/s15374424jccp3401_11.
41. Haine-Schlagel R, Fettes DL, Garcia AR, Brookman-Frazee L, Garland AF. Consistency with evidence-based treatments and perceived effectiveness of children’s community-based care. Community Mental Health Journal. 2013:1–6. doi: 10.1007/s10597-012-9583-1.
42. Hedeker D, Gibbons RD, Du Toit SHC, Patterson D. SuperMix – A program for mixed effects regression models. Chicago: Scientific Software International; 2008.
43. Hoagwood KE, Jensen P, Acri MC, Olin SS, Lewandowski RE, Herman RJ. Outcome domains in child mental health research since 1996: Have they changed and why does it matter? Journal of the American Academy of Child and Adolescent Psychiatry. 2012;51(12):1241–1260. doi: 10.1016/j.jaac.2012.09.004.
44. Hogue A, Dauber S, Liddle HA, Samuolis J. Linking session focus to treatment outcome in evidence-based treatments for adolescent substance abuse. Psychotherapy. 2004;41:83–96. doi: 10.1037/0033-3204.41.2.83.
45. Hogue A, Henderson CE, Dauber S, Barajas PC, Fried A, Liddle HA. Treatment adherence, competence, and outcome in individual and family therapy for adolescent behavior problems. Journal of Consulting and Clinical Psychology. 2008;76:544–555. doi: 10.1037/0022-006X.76.4.544.
46. Holahan CJ, Moos RH. The quality of social support: Measures of family and work relationships. British Journal of Clinical Psychology. 1983;22:157–162.
47. Kaminski JW, Valle LA, Filene JH, Boyle CL. A meta-analytic review of components associated with parent training program effectiveness. Journal of Abnormal Child Psychology. 2008;36:567–589. doi: 10.1007/s10802-007-9201-9.
48. Karver MS, Handelsman JB, Fields S, Bickman L. Meta-analysis of therapeutic relationship variables in youth and family therapy: The evidence for different relationship variables in the child and adolescent treatment outcome literature. Clinical Psychology Review. 2006;26:50–65. doi: 10.1016/j.cpr.2005.09.001.
49. Kazak AE, Hoagwood K, Weisz JR, Hood K, Kratochwill TR, Vargas LA, Banez GA. A meta-systems approach to evidence-based practice for children and adolescents. American Psychologist. 2010;65(2):85–97. doi: 10.1037/a0017784.
50. Kazdin AE. Psychotherapy for children and adolescents: Directions for research and practice. New York: Oxford University Press; 2000.
51. Kazdin AE, Wassell G. Predictors of barriers to treatment and therapeutic change in outpatient therapy for antisocial children and their families. Mental Health Services Research. 2000;2(1):27–40. doi: 10.1023/a:1010191807861.
52. Kazdin AE, Weisz JR. Evidence-Based Psychotherapies for Children and Adolescents. New York: Guilford; 2003.
53. Kolko DJ, Dorn LD, Bukstein OG, Pardini D, Holden EA, Hart J. Community vs. clinic-based modular treatment of children with early-onset ODD or CD: A clinical trial with 3-year follow up. Journal of Abnormal Child Psychology. 2009;37:591–609. doi: 10.1007/s10802-009-9303-7.
54. Krause MS, Lutz W. Process transforms inputs to determine outcomes: Therapists are responsible for managing process. Clinical Psychology: Science and Practice. 2009;16:73–81.
55. Lambert W, Bickman L. The “clock-setting” cure: How children’s symptoms might improve after ineffective treatment. Psychiatric Services. 2004;55:381–382. doi: 10.1176/appi.ps.55.4.381.
56. Lilienfeld SO. Comorbidity between and within childhood externalizing and internalizing disorders: Reflections and directions. Journal of Abnormal Child Psychology. 2003;31:285–291. doi: 10.1023/a:1023229529866.
57. Lindhiem O, Kolko DJ. Trajectories of symptom reduction during treatment for behavior problems in pediatric primary-care settings. Administration and Policy in Mental Health. 2011;38(6):486–494. doi: 10.1007/s10488-011-0335-2.
58. McLeod BD, Weisz JR. The therapy process observational coding system alliance scale: Measure characteristics and prediction of outcome in usual clinical practice. Journal of Consulting and Clinical Psychology. 2005;73:323–333. doi: 10.1037/0022-006X.73.2.323.
59. McLeod BD, Weisz JR. The Therapy Process Observational Coding System for Child Psychotherapy Strategies Scale. Journal of Clinical Child & Adolescent Psychology. 2010;39(3):436–443. doi: 10.1080/15374411003691750.
60. McLeod BD. Relation of the alliance with outcomes in youth psychotherapy: A meta-analysis. Clinical Psychology Review. 2011;31:603–616. doi: 10.1016/j.cpr.2011.02.001.
61. Mueller CW, Tolman R, Higa-McMillan CK, Daleiden EL. Longitudinal predictors of youth functional improvement in a public mental health system. Journal of Behavioral Health Services and Research. 2010;37:350–362. doi: 10.1007/s11414-009-9172-4.
62. Mueller CW, Daleiden EL, Chorpita BF, Tolman RT, Higa-McMillan CK. EBS practice elements and youth outcomes in a state-wide system. Paper presented at the annual meeting of the American Psychological Association; Toronto, Canada. 2009.
63. Noser K, Bickman L. Quality indicators of children’s mental health services: Do they predict improved client outcomes? Journal of Emotional and Behavioral Disorders. 2000;8(1):9–18.
64. Patterson GR, Chamberlain P. A functional analysis of resistance during parent training therapy. Clinical Psychology: Science and Practice. 1994;1(1):53–70.
65. Phillips SD, Hargis MB, Kramer TL, Lensing SY, Taylor JL, Burns BJ, Robbins JM. Toward a level playing field: Predictive factors for the outcomes of mental health treatment for adolescents. Journal of the American Academy of Child and Adolescent Psychiatry. 2000;39:1485–1495. doi: 10.1097/00004583-200012000-00008.
66. Reise SP, Ventura J, Nuechterlein KH, Kim KH. An illustration of multilevel factor analysis. Journal of Personality Assessment. 2005;84:126–136. doi: 10.1207/s15327752jpa8402_02.
67. Rogers SJ, Vismara LA. Evidence-based comprehensive treatments for early autism. Journal of Clinical Child and Adolescent Psychology. 2008;37(1):8–38. doi: 10.1080/15374410701817808.
68. Rosenblatt A, Rosenblatt J. Demographic, clinical, and functional characteristics of youth enrolled in six California systems of care. Journal of Child and Family Studies. 2000;9:51–66.
69. Roth AD, Pilling S. Using an evidence-based methodology to identify the competencies required to deliver effective cognitive and behavioural therapy for depression and anxiety disorders. Behavioural and Cognitive Psychotherapy. 2008;36:129–147.
70. Rotheram-Borus MJ, Ingram BL, Swendeman D, Flannery D. Common principles embedded in effective adolescent HIV prevention programs. AIDS and Behavior. 2009;13:387–398. doi: 10.1007/s10461-009-9531-4.
71. Sburlati ES, Schniering CA, Lyneham HJ, Rapee RM. A model of therapist competencies for the empirically supported cognitive behavioral treatment of child and adolescent anxiety and depressive disorders. Clinical Child and Family Psychology Review. 2011;14(1):89–109. doi: 10.1007/s10567-011-0083-6.
72. Shapiro JP, Welker CJ, Jacobson BJ. A naturalistic study of psychotherapeutic methods and client in-therapy functioning in a child community setting. Journal of Clinical Child Psychology. 1997;26:385–396. doi: 10.1207/s15374424jccp2604_7.
73. Shirk SR, Gudmundsen G, Kaplinski HC, McMakin DL. Alliance and outcome in cognitive-behavioral therapy for adolescent depression. Journal of Clinical Child and Adolescent Psychology. 2008;37:631–639. doi: 10.1080/15374410802148061.
74. Shirk S, Karver M. Prediction of treatment outcome from relationship variables in child and adolescent therapy: A meta-analytic review. Journal of Consulting and Clinical Psychology. 2003;71:452–464. doi: 10.1037/0022-006x.71.3.452.
75. Shirk SR, Saiz CC. Clinical, empirical, and developmental perspectives on the therapeutic relationship in child psychotherapy. Development and Psychopathology. 1992;4:713–728.
76. St. Amand A, Bard DE, Silovsky JF. Meta-analysis of treatment for child sexual behavior problems: Practice elements and outcomes. Child Maltreatment. 2008;13:145–166. doi: 10.1177/1077559508315353.
77. Warren JS, Nelson PL, Burlingame GM. Identifying youth at risk for treatment failure in outpatient community mental health services. Journal of Child and Family Studies. 2009;18:690–701.
78. Warren JS, Nelson PL, Mondragon SA, Baldwin SA, Burlingame GM. Youth psychotherapy change trajectories and outcomes in usual care: Community mental health versus managed care settings. Journal of Consulting and Clinical Psychology. 2010;78:144–155. doi: 10.1037/a0018544.
79. Webb CA, DeRubeis RJ, Dimidjian S, Hollon SD, Amsterdam JD, Shelton RC. Predictors of patient cognitive therapy skills and symptom change in two randomized clinical trials: The role of therapist adherence and the therapeutic alliance. Journal of Consulting and Clinical Psychology. 2012;80:373–381. doi: 10.1037/a0027663.
80. Weiss B, Catron T, Harris V, Phung TM. The effectiveness of traditional child psychotherapy. Journal of Consulting and Clinical Psychology. 1999;67:82–94. doi: 10.1037//0022-006x.67.1.82.
81. Weisz JR, Donenberg GR, Han SS, Weiss B. Bridging the gap between lab and clinic in child and adolescent psychotherapy. Journal of Consulting and Clinical Psychology. 1995;63:688–701. doi: 10.1037//0022-006x.63.5.688.
82. Weisz JR, Jensen-Doss A, Hawley KM. Evidence-based youth psychotherapies versus usual clinical care: A meta-analysis of direct comparisons. American Psychologist. 2006;61:671–689. doi: 10.1037/0003-066X.61.7.671.
83. Weisz JR, Chorpita BF, Palinkas L, Schoenwald SK, Miranda J, Bearman SK, Gibbons RD. Testing standard and modular designs for psychotherapy treatment for depression, anxiety, and conduct problems in youth: A randomized effectiveness trial. Archives of General Psychiatry. 2012;69:274–282. doi: 10.1001/archgenpsychiatry.2011.147.
84. Zima BT, Hurlburt MS, Knapp P, Ladd H, Tang L, Duan N, Wells KB. Quality of publicly-funded outpatient specialty mental health care for common childhood psychiatric disorders in California. Journal of the American Academy of Child and Adolescent Psychiatry. 2005;44:130–144. doi: 10.1097/00004583-200502000-00005.