Author manuscript; available in PMC: 2021 Apr 1.
Published in final edited form as: J Child Fam Stud. 2019 Dec 23;29(4):1008–1020. doi: 10.1007/s10826-019-01667-3

Providers’ Perspectives on Implementing a Multiple Family Group for Children with Disruptive Behavior

Emily K Hamovitch 1, Mary Acri 1, Lindsay A Bornheimer 2, Idan Falek 1, Kate Lambert 1, Madeline Galler 1
PMCID: PMC7747879  NIHMSID: NIHMS1547366  PMID: 33343177

Abstract

Objectives

The adoption of research-supported treatments is contingent upon multiple interactional levels, including provider-level factors, which have been shown to be critical to uptake. The purpose of this study is to examine the relationship between sociodemographic factors, attitudes, and perceived barriers/facilitators to implementation through a comparative approach involving practitioners trained to facilitate a multiple family group intervention for children with disruptive behavior.

Methods

Participants included 91 practitioners who participated in an intervention study regarding barriers to adopting an evidence-based practice. Demographic characteristics were collected via a socio-demographic questionnaire. Barriers and facilitators were assessed via open-ended questions as well as a scale developed by the authors and guided by the Consolidated Framework for Implementation Research, which explored provider views regarding the intervention; the systemic and organizational context; experience facilitating groups and involving families in treatment; and feelings toward involving families in treatment. Between-group analyses were conducted to examine demographic and characteristic differences of providers by implementation status. Independent samples t-tests for continuous characteristics and chi-square tests for categorical characteristics were used. Responses to open-ended questions were compiled, reviewed, and coded, and frequencies and percentages were calculated.

Results

Results demonstrated that providers who implemented the intervention were significantly more likely to have favorable attitudes toward the intervention compared to those who did not implement it. Prior experience facilitating groups was significantly associated with implementation. Common barriers to implementation included ineligible caseloads and feeling unqualified to deliver the intervention.

Conclusions

Further attention on improving recruitment rates and promoting adequate training and supervision is needed.

Keywords: Research-supported treatment, behavioral disorder, multiple family group, implementation, child mental health


Disruptive behavior disorders (DBD) are chronic and impairing mental health conditions characterized by delinquency, aggression, and defiance towards authority figures (Agency for Healthcare Research and Quality (AHRQ), 2015; Fernandez & Eyberg, 2009). Between 3.2% and 15% of children and adolescents are estimated to have a disruptive behavior disorder, and some evidence suggests that males, children between six and 11 years of age, and children who are impacted by poverty are at the greatest risk for DBDs (Acri et al., 2018; CAHMI, n.d.; CDC, n.d.; Nock et al., 2006), although these findings, and particularly the association between disruptive behavior disorders and gender, vary across studies (AHRQ, 2015).

Behavior parent training (BPT) interventions are among the most effective treatments for child DBDs (Acri et al., 2018; Brestan & Eyberg, 1998; Piquero et al., 2016; Serketich & Dumas, 1996). Originating in the 1960s (Acri et al., 2018), BPT interventions are based upon the theory that parenting practices such as harsh and inconsistent discipline, lack of consistent supervision and monitoring, and the quality of the parent/child relationship (including low parental involvement and warmth towards the child) have a powerful role in the onset and continuance of child DBDs (Chacko et al., 2015; Frick & Muñoz, 2006; Jones et al., 2013). Behavior parent training interventions are designed to enhance parenting practices and the parent/child relationship through didactic instruction, modeling, and role play (Chacko et al., 2015; Jones et al., 2013; Kazdin, 1997).

Despite their effectiveness, however, few families receive BPT interventions (Chacko et al., 2016). In addition to access barriers (e.g., extensive waitlists for services, cost), logistical obstacles including lack of transportation and childcare, and stigma, mistrust, and misperceptions about the mental health system (e.g., Hamovitch et al., 2018; McKay & Bannon, 2004; Staudt, 2007), the behavioral healthcare system has struggled to implement research-supported treatments (RSTs) like BPT as part of standard care. It is estimated that it takes up to two decades for research-supported treatments to be implemented in real-world settings (Amodeo et al., 2011; Atkins et al., 2016; Glisson et al., 2016; Proctor et al., 2009), if at all; as noted by Chaudoir et al. (2013), only 15% of research-supported treatments are eventually delivered as part of practice. Compounding this issue, only a small percentage of RSTs that are initially implemented are sustained over time (Chor et al., 2014).

Concretizing research-supported treatments as standard practice is a longstanding concern within the behavioral health field (Aarons et al., 2007; Chaudoir et al., 2013), and major funders such as the National Institutes of Health and the Agency for Healthcare Research and Quality have dedicated millions of dollars to examine barriers to implementing RSTs (Battaglia, 2018). The field of implementation science theorizes that the uptake of RSTs is contingent upon the larger fiscal and legislative landscape in which an agency or clinic is embedded, as well as factors internal to an organization, including its social context and leadership support for the RST (Aarons, 2004; Aarons et al., 2014; Damschroder et al., 2009; Harvey & Gumport, 2015; Willging et al., 2015). A synthesis of this literature shows that provider-level factors are among the most powerful influences in the implementation of RSTs, and include sociodemographic characteristics such as age, level of education, and years in their role as a provider; the provider’s previous experience with and knowledge about the intervention (Sanders et al., 2009); and their perception of the RST, including its relevance to a given population, ease of implementation, and flexibility (Aarons et al., 2007; Acri et al., in press; Borntrager et al., 2009; Chaudoir et al., 2013; Haug et al., 2008). These factors interact with each other (e.g., provider attitudes vary by demographic characteristics; for instance, younger age is associated with more favorable views about RSTs), and with leadership factors, the organizational context, and the larger legislative and fiscal environment. As a case in point, Aarons et al. (2009) found providers who worked for private organizations held more favorable opinions of RSTs than those who worked in public institutions, which they surmised was due to differences in the setting’s context and support for adopting innovative practices.

Collectively, this literature suggests that there are multiple, interactional aspects of the workforce that influence the implementation of RSTs. However, it is not known how provider demographic characteristics, attitudes towards RSTs, and perceived barriers and facilitators vary by implementation status among the same sample of providers and in reference to the same RST. For example, did providers who implemented the intervention perceive a different set of barriers than those who did not, holding the RST constant? Although the existing literature is informative for building knowledge about provider-level factors, there has not yet been, to our knowledge, a side-by-side comparison of the same RST. Given evidence that aspects of the intervention itself are associated with implementation (Aarons, 2004; Aarons et al., 2014; Damschroder et al., 2009; Harvey & Gumport, 2015; Willging et al., 2015), holding this variable constant is intended to more clearly illuminate how provider-level factors influence uptake. Accordingly, the purpose of the current study was to investigate the relationship between provider-level factors and implementation status (defined as whether the provider implemented the intervention) among a sample of providers who were included in a larger NIMH-funded study that examined the impact and implementation processes underlying uptake of a group behavior parent training program for families of youth with disruptive behavior disorders. Based upon the previous literature, we hypothesized that providers who implemented the RST were significantly more likely to: 1) have prior practice experience with an RST, 2) have prior experience facilitating treatment groups, and 3) report more favorable beliefs about the intervention and confidence in facilitating groups and involving families in the group process.
Additionally, we explored perceived barriers and facilitators to implementation in order to better understand implementation processes by triangulating comparisons between providers based upon implementation status.

Methods

Participants

Providers were eligible for inclusion if they received training in the 4 Rs and 2 Ss model as part of the larger NIMH-funded study. Consent was obtained prior to participating in the training. Institutional Review Board approval was obtained.

Two hundred seventy-six (n = 276) providers met eligibility criteria and 91 (33.0%) provided consent to participate in this study. Providers were 38 years of age on average (SD = 11.4), more than half identified as White (n = 48, 59.3%), and over three-quarters identified as being of non-Hispanic/Latino ethnicity (n = 60, 80.0%). More than two-thirds of the sample (n = 61, 69.3%) reported completion of a graduate degree, and 41 (47.1%) held a social work license (i.e., LMSW or LCSW). Providers reported having worked in the mental health field for almost 8 years on average (SD = 7.7) and had been employed for over 3.5 years (SD = 5.3) in their current setting. Three-quarters (n = 64, 75.3%) of providers had received previous training in a research-supported treatment (RST), 62 (75.6%) currently used an RST in their clinical work, and 11 (12.8%) reported prior experience with the 4 Rs and 2 Ss model. Table 1 presents demographic and clinical experience characteristics of the sample.

Table 1.

Demographic Characteristics and Clinical Experience of Providers

All Participants (n=91) Implemented Intervention (n=27) Did Not Implement Intervention (n=64)

Characteristic n % n % n %
Age (M ± SD) 80 38.09 ± 11.41 25 35.84 ± 10.76 55 39.11 ± 11.64
Race
 American Indian or Alaska Native 2 2.47 0 0.00 2 3.13
 Asian 14 17.28 4 14.81 10 15.63
 Black/African American 17 20.99 4 14.81 13 20.31
 White 48 59.26 18 66.67 30 46.88
Ethnicity
 Non-Hispanic 60 80.00 23 85.19 37 57.81
 Hispanic/Latino 15 20.00 3 11.11 12 18.75
Education
 Some college 4 4.55 1 3.70 3 4.69
 College graduate 13 14.77 2 7.41 11 17.19
 Masters, graduate or nursing school 61 69.32 18 66.67 43 67.19
 PhD or MD 7 7.95 3 11.11 4 6.25
 Other 3 3.41 3 11.11 0 0.00
License/Credentials
 Family Development Credentials 4 4.60 1 3.70 3 4.69
 PEP certification 4 4.60 3 11.11 1 1.56
 Social work licensing (LMSW, LCSW) 41 47.13 11 40.74 30 46.88
 Psychology licensing 5 5.75 3 11.11 2 3.13
 Board certified MD 2 2.30 0 0.00 2 3.13
 Other 14 16.09 6 22.22 8 12.50
 None 17 19.54 2 7.41 15 23.44
Years worked in the mental health field (M ± SD) 56 7.73 ± 7.72 17 6.47 ± 6.35 39 8.28 ± 8.26
Years worked in clinic (M ± SD) 55 3.57 ± 5.27 19 2.10 ± 1.90 36 4.34 ± 6.26
Position
 Part time 12 13.64 2 7.41 10 15.63
 Full time 76 86.36 25 92.59 51 79.69
Caseload (M ± SD) 85 28.04 ±72.95 26 29.42 ± 34.12 59 27.42 ± 84.88
Trained in Research Supported Treatment (RST)?
 No 21 24.71 7 25.93 14 21.88
 Yes 64 75.29 20 74.07 44 68.75
Currently using RST?
 No 20 24.39 6 22.22 14 21.88
 Yes 62 75.61 20 74.07 42 65.63
Prior training in the 4Rs and 2Ss?
 No 75 87.21 22 81.48 53 82.81
 Yes 11 12.79 5 18.52 6 9.38

Procedure

Eligible providers were informed of this study by a member of the research team via email correspondence. If providers were interested in participating in the study, the email contained a link that allowed them access to an online survey. Providers completed a questionnaire consisting of 63 questions that assessed clinical experience, organizational climate, beliefs about inclusion of families in treatment, and perceptions of the 4 Rs and 2 Ss model. Providers who reported having facilitated a 4 Rs and 2 Ss group with clients were asked additional questions about the process of implementing the model. Participation in the survey took between 20 and 30 minutes.

This is a sub-study of a larger National Institute of Mental Health (NIMH)-funded study examining a multiple family group model entitled the 4 Rs and 2 Ss for Strengthening Families (4 Rs and 2 Ss). Briefly, the 4 Rs and 2 Ss is a manualized, curriculum-based group that draws from common elements of research-supported treatments designed to address conduct problems in children, and incorporates those elements into a coordinated set of practices in order to decrease problem behaviors, strengthen families, and increase engagement in treatment (Acri et al., 2017; Chacko et al., 2015; McKay et al., 2002). This sub-study investigated perceptions and experiences among 91 providers who were trained in the 4 Rs and 2 Ss for Strengthening Families model between September 2017 and December 2018.

The 4 Rs and 2 Ss for Strengthening Families is a multiple family group model for children between seven and 11 years of age who meet diagnostic criteria for a disruptive behavior disorder (DBD) and their families. In this model, six to eight families, including adult caregivers and siblings over six years of age, meet for weekly group sessions of one hour in length. A family is defined as at least one child and one primary caregiver. The targeted skills and processes are referred to in the curriculum as the 4 Rs and 2 Ss. The 4 Rs (Rules, Responsibility, Relationships, and Respectful Communication) were chosen as core content because these topics have been empirically linked to childhood behavioral disorders, and the 2 Ss (Stress and Social Support) were added because they are two factors known to hinder treatment attendance (Acri et al., 2017; Chacko et al., 2015; Gopalan et al., 2015; McKay et al., 2002). The 4 Rs and 2 Ss is co-delivered by two mental health providers, or by a provider and a parent advocate, a trained caregiver with prior experience navigating the mental health service system. Facilitators of the model underwent a training designed by the developers of the 4 Rs and 2 Ss model with the assistance of the research team. The training consisted of a series of online modules covering an introduction to the model; engagement and barriers; recruitment and orientation; key facilitation skills; group structure; supervision and fidelity; and common challenges. Once the modules were complete, facilitators participated in an in-person, three-hour follow-up training in which they reviewed the material covered online and engaged in a series of role plays to practice group facilitation skills.
Supervision was provided to the 4 Rs and 2 Ss facilitators in monthly, one-hour sessions by clinical social workers who were involved in the development of the model. Additionally, fidelity checks were conducted every four sessions to ensure that group facilitators adhered to the model. Aside from free training and supervision, providers were not given any additional incentives for participating in the training or implementing the intervention.

Measures

Demographic characteristics were collected via a general socio-demographic questionnaire used in prior studies (e.g., Chacko et al., 2015; Gopalan et al., 2015) that gathered information including provider age, race/ethnicity, education, and credentials. Previous experience facilitating groups was collected through three questions. The first question asked participants whether they had facilitated groups with families of children with mental health problems in the past, the second whether they had facilitated groups with parents, and the third whether they had facilitated groups with children. All three questions included a binary yes/no response. Internal and external facilitators and barriers were assessed via a scale developed by the authors and guided by the Consolidated Framework for Implementation Research (CFIR), which aims to establish a single framework to guide implementation research by integrating the constructs that most commonly overlap across theoretical frameworks in the implementation science literature (Kirk et al., 2016). Briefly, the CFIR consists of five domains: the intervention, the inner setting, the outer setting, the individual characteristics of those involved, and the process by which implementation of the intervention is executed (Damschroder et al., 2009).

The scale developed for this study contains 9 questions about provider views regarding the intervention (“I believed that the 4 Rs and 2 Ss was an effective intervention for children with behavior disorders and their families”); provider beliefs about the relevance of the 4 Rs and 2 Ss to their own work (“I felt that it would be a service my clients would want to participate in”; “I felt that the intervention addressed many of my clients’ needs”); and the perceived burden of using the 4 Rs and 2 Ss (“I felt that the intervention was too burdensome for me to offer as a service to my clients”). Scores were anchored along a 5-point Likert scale ranging from Strongly Disagree (1) to Strongly Agree (5), with higher scores indicating more favorable beliefs about the intervention.

The internal and external settings were captured through 5 questions which addressed the systemic and organizational context from the provider’s perspective, specifically: pressure from leadership, transparency regarding the decision to implement, and provider agency over treatment models. Scores ranged from Strongly Disagree (5) to Strongly Agree (1), with higher scores indicating a greater sense of personal agency over treatment decisions.

Individual provider characteristics were sub-categorized into 3 questions exploring experience and confidence facilitating groups (e.g., “I feel anxious before attending a group that I am facilitating”) and 13 questions exploring experiences with involving families in treatment (e.g., “I feel I can best help children with conduct problems by involving their families”). Scores ranged from Strongly Disagree (1) to Strongly Agree (5), and reverse coding was used so that higher scores on the confidence facilitating groups subscale indicate more confidence in facilitating groups, and higher scores on the experiences involving families in treatment subscale indicate more positive feelings towards involvement of parents in treatment.
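To make the scoring direction concrete, reverse coding of negatively keyed 5-point Likert items can be sketched as follows; the function and sample responses are hypothetical illustrations, not the study's actual scoring procedure:

```python
# Hypothetical sketch of 5-point Likert reverse coding, so that higher
# scores uniformly indicate more confidence / more positive feelings.
# The responses below are illustrative, not study data.

def reverse_code(score, scale_min=1, scale_max=5):
    """Mirror a Likert response within its scale (1<->5, 2<->4, 3->3)."""
    return scale_max + scale_min - score

# A negatively keyed item such as "I feel anxious before attending a
# group that I am facilitating" is reversed before summing, so that
# strong agreement (5) counts as low confidence (1).
responses = [5, 4, 3, 2, 1]
recoded = [reverse_code(r) for r in responses]
print(recoded)  # [1, 2, 3, 4, 5]
```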

Attributions were also included as an additional construct to investigate providers’ prejudiced feelings toward involving families in mental health treatment, as family involvement is a key component of the 4 Rs and 2 Ss intervention. This scale consisted of 5 questions, ranging from Strongly Disagree (1) to Strongly Agree (5), with higher scores indicating greater prejudiced feelings toward the involvement of families in treatment.

Reliability was evaluated for each of the 5 generated multi-item scales (intervention, systemic/organizational context, individual experience with groups, individual experience involving families, and attributions). Cronbach’s alphas were 0.86 for intervention, 0.77 for the systemic/organizational context, 0.70 for individual experience with groups, 0.88 for individual experience involving families, and 0.82 for attributions. Next, Confirmatory Factor Analysis (CFA) was performed to assess the factor structure of the 5 generated scales. Each model demonstrated good fit per global (Chi-square p-value > .05; Comparative Fit Index > 0.95, Root Mean Square Error of Approximation < 0.08, p-value for close fit > 0.05, Standardized Root Mean Square Residual < 0.05) and focused fit indices (standardized residuals below the absolute value of 2 and modification indices below 4). All items loaded significantly onto their respective factor, with loadings ranging from 0.76 to 1.33 for attributions, 0.59 to 0.71 for individual experience with groups, 0.32 to 1.17 for individual experience involving families, 0.24 to 1.08 for intervention, and 0.35 to 1.27 for systems. Given the CFA results and the absence of cross-loadings, treating each scale’s items as unidimensional was determined to be reasonable.
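As an illustration of the reliability step, Cronbach's alpha for a respondents-by-items score matrix can be computed with the standard formula shown below; this is a generic sketch with made-up data, not the study's analysis code or data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    sum_item_vars = items.var(axis=0, ddof=1).sum()  # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of scale totals
    return k / (k - 1) * (1 - sum_item_vars / total_var)

# Made-up Likert responses (4 respondents x 3 items), for illustration only
scores = [[4, 5, 4],
          [3, 3, 4],
          [5, 5, 5],
          [2, 3, 2]]
alpha = cronbach_alpha(scores)
```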

Barriers to implementation were captured differently based upon whether the provider implemented the intervention in a group. Providers who did not implement the intervention (those who were unable to begin facilitating a group) were asked a single, open-ended question regarding the reasons that they did not use the intervention. Providers who did implement the intervention in a group were asked to complete three questions regarding perceived barriers in planning for the group, barriers to attendance for families, and general barriers (e.g., “What other difficulties arose when running the group?”).

Data Analysis

Quantitative data were analyzed using SPSS 24. Univariate explorations of demographic and clinical characteristics of all providers were examined to describe and better understand the sample. Second, between-group analyses were conducted to examine demographic and clinical experience differences of providers by implementation status (those who implemented the 4 Rs and 2 Ss treatment versus those who did not). Independent samples t-tests were used for continuous characteristics (e.g., provider age or scores on the subscales) and chi-square tests for categorical characteristics (e.g., race or prior experience in facilitating groups). Lastly, univariate provider experiences and perceptions of the treatment process and the 4 Rs and 2 Ss intervention in general were explored among those who implemented the treatment.
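The two between-group tests described above can be sketched in a stand-alone form as below. The study ran these analyses in SPSS 24; this is a hedged re-implementation of the same test statistics (pooled-variance t and Pearson chi-square for a 2x2 table), and all numbers in it are hypothetical, not the study's data:

```python
import math

def independent_t(x, y):
    """Pooled-variance independent-samples t statistic (equal variances assumed)."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    sx2 = sum((v - mx) ** 2 for v in x) / (nx - 1)  # sample variance of x
    sy2 = sum((v - my) ** 2 for v in y) / (ny - 1)  # sample variance of y
    sp2 = ((nx - 1) * sx2 + (ny - 1) * sy2) / (nx + ny - 2)  # pooled variance
    return (mx - my) / math.sqrt(sp2 * (1 / nx + 1 / ny))

def chi_square_2x2(table):
    """Pearson chi-square statistic (df = 1) for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical continuous characteristic (e.g., a subscale score) by group
implemented = [28, 31, 27, 30, 29]
not_implemented = [26, 24, 27, 25, 28, 26]
t_stat = independent_t(implemented, not_implemented)

# Hypothetical 2x2 table: implementation status (rows) x prior group
# facilitation experience yes/no (columns)
chi2 = chi_square_2x2([[20, 7], [30, 34]])
```

The statistic would then be compared against the t or chi-square distribution for a p-value, which SPSS reports automatically.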

Responses to the open-ended questions were compiled, reviewed, and coded by two members of the research team (M. G. and I. F.) with oversight from the contributing authors, who have had prior experience with coding qualitative data (Acri et al., 2019). Codes were compared and any disagreements in categorization were discussed amongst the coders and team members until a consensus was reached. After three rounds of coding, frequencies and percentages were calculated.

Results

Implementation Status by Demographic and Clinical Experience Characteristics

Twenty-seven (n = 27, 29.7%) providers implemented the 4 Rs and 2 Ss intervention. There were no significant differences in demographic characteristics by implementation status, suggesting that the two implementation status groups were demographically similar; hypothesis 1 was therefore not supported. There were, however, significant differences in prior clinical experience by implementation status (Figure 1). Specifically, in line with hypothesis 2, a significantly larger proportion of providers who implemented the intervention had prior experience facilitating groups with families of children with mental health problems (χ2(1) = 5.11, p < .01), with children and teens (χ2(1) = 6.82, p < .01), and with parents (χ2(1) = 2.45, p < .05), each compared independently with those who did not implement the intervention.

Figure 1.

Figure 1

Provider experiences by implementation status

Note. Differences by implementation status examined using Chi-square tests; asterisk(s) identify significant differences

* p< .05, ** p< .01, *** p< .001

Perceptions of Multi-level Factors that Influence Uptake

As demonstrated in Table 2, there were significant differences between groups regarding intervention characteristics; specifically, hypothesis 3 was supported: an independent samples t-test indicated that providers who implemented the intervention reported more favorable attitudes towards the 4 Rs and 2 Ss intervention (M = 28.7, SD = 3.5) than those who did not implement the intervention (M = 26.0, SD = 4.4; t(76) = −2.66, p < .01). While significant, the mean difference was small (2.68 scale points), so the meaningfulness of this difference must be considered in light of the otherwise similar scores. No additional significant differences were found between groups on the remaining four subscales, further supporting the observation that the two implementation status groups were similar.

Table 2.

Means and SD of Factors that Influence Uptake by Implementation status

Implemented Intervention (n=27)
Did Not Implement Intervention (n=64)
Siga
Characteristic n M ± SD n M ± SD
Intervention characteristics 25 28.72 ± 3.46 53 26.04 ± 4.44 **
Systemic context 24 11.96 ± 1.88 52 11.90 ± 2.12
Individual Characteristics
 Confidence facilitating groups 21 10.14 ± 2.35 36 9.42 ± 2.20
 Experience involving families 23 42.35 ± 5.37 51 42.22 ± 6.30
Attributions 23 14.00 ± 3.79 51 13.96 ± 3.19
a Significance examined using independent samples t-tests

* p < .05, ** p < .01, *** p < .001

Perceived Barriers among Providers Who Did Not Implement the Intervention

Sixty-four (n = 64) practitioners provided 49 unique responses regarding their perceptions of barriers that impeded implementation of the 4 Rs and 2 Ss model. Almost two-thirds (n = 30, 61.2%) of responses cited population-level barriers as impediments to implementation. Within this category, over a third (n = 11, 36.7%) indicated that the provider’s client caseload was ineligible to participate, followed by seven (n = 7, 23.3%) reporting that the caseload was too low to fill a group. Other population-level barriers included caregivers being unavailable for participation (n = 5, 16.7%), services being inappropriate for the caregiver’s primary concern (n = 5, 16.7%), and recruitment issues (n = 2, 6.7%).

Fourteen (n = 14, 28.6%) of the responses pertained to individual (provider)-level barriers, including not feeling qualified to implement the 4 Rs and 2 Ss intervention (n = 7, 50.0%), followed by scheduling and time constraints (n = 3, 21.4%); as one provider explained, “It was difficult to devote the necessary time to organization a group, conducting it consistently and then devoting time to supervision.” Three (n = 3, 6.1%) of the responses cited agency-level barriers, including lacking resources (n = 2, 66.7%) and administrative resistance (n = 1, 33.3%). As one provider noted, they worked in a “small clinic and did not have resources to participate as we would have liked.” Finally, two (n = 2, 4.1%) responses cited intervention-level factors, specifically unfavorable attitudes toward a group modality of treatment as opposed to an individual modality.

Perceived Barriers among Providers who Implemented the Intervention

The 27 providers who implemented the 4 Rs and 2 Ss responded to open-ended questions about barriers to planning and implementing a multiple family group, as well as general barriers encountered when facilitating the group. Regarding planning for the group, twenty-four (n = 24, 88.9%) providers contributed 24 unique responses about barriers encountered while planning their groups. Three providers reported not having encountered any barriers. Population-level barriers were most commonly cited (n = 17, 70.8%), including perceiving that there were no eligible participants on the provider’s caseload (n = 10, 58.8%), caregivers being unavailable due to time and scheduling (n = 2, 11.8%), and recruitment difficulties (e.g., difficulty getting families to commit to attending the group; n = 2, 11.8%).

Agency–level barriers were cited by four (n = 4, 16.7%) providers and included lack of space (n = 3, 75.0%) and administrative resistance (n = 1, 25.0%). As reported by one provider, the intervention was “Not prioritized in our clinic by administrators.” Three providers cited individual-level barriers (n = 3, 12.5%), including being unavailable due to time and scheduling restraints (n = 2, 66.7%) and not feeling qualified to implement the intervention (n = 1, 33.3%). Finally, one provider noted the intervention-level barrier of lacking “Time!!! Time to recruit, prepare, organize, deliver.”

Regarding executing the group, twenty-four (n = 24, 88.9%) providers provided 33 unique responses about barriers to attendance for families, and all (n = 33, 100.0%) were categorized as population-level factors, including families being unavailable to participate due to scheduling and timing conflicts (n = 15, 45.5%), transportation difficulties (n = 4, 12.1%), caregiver resistance (n = 4, 12.1%), family illness (n = 3, 9.1%), family crisis (n = 2, 6.1%), and attrition (n = 1, 3.0%).

Regarding general barriers, twenty-two (n = 22, 81.5%) providers contributed 25 unique responses about other difficulties that arose while implementing the group. Nineteen providers (n = 19, 76.0%) cited barriers on the population level, including the caregiver being unavailable to participate due to issues with timing and scheduling (n = 4, 21.1%), disruption to receiving services (n = 3, 15.8%), attendance issues (n = 2, 10.5%), recruitment issues (n = 2, 10.5%), family crisis (n = 2, 10.5%), and lack of childcare (n = 1, 5.3%). Five providers (n = 5, 26.3%) reported disruptive behaviors during the intervention; as one noted, “It was difficult for many of the children to remain seated for one hour.”

Three (n = 3, 12.0%) responses cited intervention-level factors, specifically noting an interest in covering more skills. Two (n = 2, 8.0%) providers reported the individual-level barrier of not feeling qualified to implement the intervention. Finally, one practitioner (n = 1, 4.0%) reported the agency-level barrier of not having resources, stating “our clinic doesn’t have any extra funds. Some of the activities required materials…[which] made it difficult as I had to improvise or spend my own money.”

Discussion

The purpose of this study was to examine the association between provider-level factors and the implementation of a family group intervention for child DBDs. Studying provider-level factors in the implementation of the same RST allowed a deeper understanding of how such factors influence uptake. Holding the RST constant enabled a close examination of how provider demographic characteristics, attitudes toward RSTs, and perceived barriers and facilitators vary by implementation status. Several findings are notable. First, none of the demographic factors collected, including age, working experience, and current use of an RST in practice (hypothesis 1), were related to implementation or lack thereof. This finding suggests that the implementation status groups were similar and that providers at any age or stage of their working trajectory may be equally likely to adopt (or not adopt) research-supported treatments; accordingly, efforts to promote uptake should not be limited to particular groups based on demographic characteristics.

However, there was a significant difference between groups regarding provider attitudes about the intervention (hypothesis 3), in that providers who implemented the intervention were significantly more likely to have favorable attitudes toward the 4 Rs and 2 Ss intervention than those who did not implement it. This finding, while it must be considered within the context of the relatively small mean difference (see Results), is consistent with research that points to the critical importance of individual attitudes in driving RST adoption by therapists. A study by Nelson and Steele (2007) of 214 therapists from several states found that attitudes toward the treatment predicted 21.3% of RST use after controlling for theoretical orientation and clinical setting. Similarly, Jensen’s (2009) study of children’s service providers’ attitudes toward research-supported treatments found that policy changes and access to RSTs alone were not sufficient to promote their implementation; rather, therapists’ attitudes toward RSTs were significantly related to the uptake of these practices. In light of this finding, actions to improve attitudes toward RSTs are recommended. Specifically, research shows that practices perceived to be relatively simple are adopted more easily and in less time than those that are more complex, and “reinvention” of the RST to fit the local context may promote adoption (Titler, 2008). Titler (2008) recommends the use of quick reference guides, decision aids, and clinical reminders.

Second, and relevant to hypothesis 2, prior experience facilitating groups with parents and teens was significantly associated with implementation, which suggests that beyond the RST itself, mode of delivery might be a decisive factor influencing uptake. Experience specifically with group work appears to be a driving factor in implementation status. It may therefore be important for providers to build group facilitation skills during the formative years of their education. However, concerns about the diminished role of group work education within the field of social work have been documented (Simon et al., 2017). A study by Simon and Kilbane (2014), which analyzed information about group work within accredited MSW programs in the United States, found that since the early 1990s there has been a decline in focused group-work concentrations, in the number of students enrolled in such concentrations, and in the percentage of programs offering elective group-work courses. Furthermore, a study by LaPorte and Sweifach (2011), which examined group-work-based field work experiences of 1360 first-year MSW students, found that more than half of respondents indicated that their field instructors provided very little or no information about group work theory and practice. The literature demonstrates that, unfortunately, this decline in group-work education has produced practitioners with limited or no group-work training, who may lack the knowledge and skills required for effective group facilitation (Muskat, 2013). Berghart and Simon (2005) state that many agencies tend to employ few trained group workers, and that agency training and supervision in group work have become increasingly scarce.
This may be because, as funding has diminished, agencies have cut their internal staff development activities and practitioners are expected to attend trainings outside their places of employment (Berghart & Simon, 2005). It is therefore imperative that educational curricula incorporate further exposure to group facilitation skills, as well as to other research-supported treatment models. Group supervision, which has been shown to be a useful approach to teaching group work to students (Geller, 1995), may be one way of improving comfort and skill in this area. Muskat (2013) proposes a model of group supervision in which 8 to 12 members who are interested in group facilitation meet at preset intervals, at a regular time, and in a comfortable, convenient, and private location. Muskat (2013) speaks to the importance of establishing the group's purpose, of recruitment being carried out by the agency sponsor, and of the group being led by a practitioner with significant experience in group facilitation, recruited from either inside or outside the agency. Regarding the structure of the group, Muskat (2013) recommends that the supervisory session begin with a statement of purpose and introductions, followed by a review of key issues related to facilitating groups and engagement of participants in exercises and activities. Once members begin to feel engaged with one another, they should be encouraged to raise issues emerging from the groups they facilitate.

An important finding from the qualitative data was the consistency of population-level barriers across the sample, regardless of whether providers implemented the model; chief among these was that providers' client caseloads were ineligible to participate or too small to fill a group. If providers are unable to find enough families to recruit into a group intervention, further attention should be paid to recruitment methods. Participation in mental health programs has been found to be related to the accessibility of services, the stigma associated with treatment, and the relationship between clients and staff; therefore, strategies to address these factors must be heavily weighted in improving recruitment rates (Derr et al., 2001). A review of the literature by Watson (2005) discusses active engagement strategies aimed at increasing recruitment, suggesting attention at the level of the casework (e.g., prompt responses, frequent contact, assertive community outreach) as well as at the level of the agency (e.g., allowing recruitment time, using non-stigmatizing program names, using other agencies as ambassadors). It is also important for providers to focus recruitment efforts outside their own caseloads and to expand opportunities to participate in research-supported treatments across the agency; regular cross-agency meetings may be imperative in doing so. Additionally, maintaining families in treatment can be problematic. Although the 4Rs and 2Ss intervention was designed in partnership with caregivers to address the logistical barriers associated with participating in treatment, it is important to be mindful that premature termination and lack of engagement in mental health services are prevalent problems confronting the mental health system. Future research is needed to attend to and address this issue.

It is also important to note that agency-level barriers were found to prevent providers from planning and executing the group. These barriers included lack of resources, administrative resistance (lack of prioritization), lack of space, and lack of funding. Such barriers have also been reported in the literature. One study that examined factors associated with the sustainability of evidence-based practices across 49 sites found that the most common factors preventing implementation were inadequate financial support, lack of prioritization, and workforce issues (Bond et al., 2014). The authors explain the critical importance of funding, citing the fact that incentives led to expansions of programs, whereas a reduction in reimbursements led to a significant reduction in such programs. They add, however, that funding alone will not suffice to promote the sustainability and uptake of programs; prioritization of programs is also critical.

The finding that providers’ caseloads were ineligible for participation in this intervention is somewhat surprising, in light of the high number of children with disruptive behavior referred to mental health services (National Collaborating Centre for Mental Health, 2013). Again, this result may reflect greater provider comfort with delivering an individual modality than with groups. The time and energy needed to overhaul existing practices may stand in the way of providing opportunities to include children in new, tested interventions, in this case a group intervention. Although future research is warranted before drawing any firm conclusions, the fact that clinicians felt unqualified to deliver the intervention despite being trained in the model and being offered ongoing supervision reinforces the importance of incorporating exposure to group facilitation skills within educational curricula as well as on the job.

Taken together, these findings point to the importance of providers’ previous experience and attitudes toward delivering the intervention. It may be that previous experience in related intervention activities (separate from the RST itself) provides a foundation on which providers can draw in taking up the RST; in this case, experience facilitating groups seems to have provided that foundation. Training, supervision, and greater exposure to the knowledge and skills that are foundational to particular RSTs are therefore needed. Fixsen et al. (2005) indicate that dissemination of information and training alone are inadequate for ensuring that evidence-based practices are implemented and sustained; rather, interprofessional collaboration is needed in the form of communication and information exchange within and across agencies. Communities of practice (CoPs), which are “groups of people who share knowledge, learn together and create common practices” (Wenger et al., 2002), have been documented in the literature as a means of enhancing implementation. Barwick et al. (2009) examined the benefits of a community of practice for children’s mental health organizations that were mandated to adopt a standardized outcome measure, and found that CoP participants demonstrated greater use of the tool in practice, better content knowledge, and more satisfaction with implementation supports than providers who did not participate in the CoP. In fact, a learning collaborative approach was used as a vehicle for supporting training and the implementation of a version of the 4Rs and 2Ss program (implemented in a prior study). A study evaluating this approach found that the services provided by staff of the learning collaborative appeared to positively impact the implementation of the intervention.
Strategies included providing a forum to express concerns and brainstorm solutions, training for technical and procedural questions about running the group, periodic check-ins to secure guidance from training staff, and availability to problem-solve practical questions (Stephens et al., 2014). The authors conclude that CoPs are a promising model for translating RST knowledge and promoting practice change in children’s mental health, and we encourage more research on this model.

Limitations

There are several limitations to this study, including the fact that the majority of providers did not implement the intervention, and therefore the sample size of those who did implement the 4Rs and 2Ss was low. Additionally, the sample of clinics recruited into the study was quite homogeneous. For example, all clinics recruited into the study were OMH-funded clinics within New York City serving children with behavioral difficulties, and providers within these clinics all willingly agreed to participate in the study and were offered free training to adopt a research-supported treatment. Therefore, it may be difficult to account for significant differences among the organizations and providers surveyed within these clinics. The investigation of hypothesis 1 in particular supports the observation that the sample was homogeneous. Furthermore, it is worth noting that the mean group difference, whereby providers who implemented the intervention reported more favorable attitudes toward the 4Rs and 2Ss than those who did not, was significant albeit small, which should be considered when interpreting the results and suggests the need for future study. Despite these limitations, this study suggests that attention to creating positive provider attitudes, improving comfort and familiarity with the intervention, and improving recruitment to these research-supported treatments is warranted. Future research at the provider and organizational levels is needed to identify the most effective methods of addressing these issues.

To conclude, this study set out to identify provider-level factors that may influence the adoption of a research-supported group family practice for child DBDs. Results suggest that familiarity with the intervention, attitudes toward it, and agency-level factors may be primary factors to consider. These findings have implications for the field: ensuring that interventions are viewed favorably by the providers trained to implement them, familiarizing providers with interventions through training and exposure to a range of modalities and populations, and focusing recruitment strategies on engaging participants are imperative to improving the uptake of RSTs.

Research involving Human Participants and/or Animals

Ethical approval: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. This article does not contain any studies with animals performed by any of the authors. This study was approved by New York University’s Institutional Review Board.

Supplementary Material

10826_2019_1667_MOESM1_ESM

Table 3.

Open-Ended Responses to Barriers in Implementing the 4Rs and 2Ss

Question/Respondents Total Responses Codes n %
What are the reasons you chose not to use the 4Rs and 2Ss model with your clients? (n=42) 49
Agency Level 3 6.12
 Lack of resources 2 66.67
 Administrative resistance 1 33.33
Individual Level 14 28.57
 Provider not qualified to implement intervention 7 50.00
 Provider unavailable (time/scheduling) 3 21.43
 Other provider implementing intervention 2 14.29
 Provider no longer working in clinic 2 14.29
Population Level 30 61.22
 Caseload ineligible 11 36.67
 Low caseload 7 23.33
 Caregiver unavailable (time/scheduling) 5 16.67
 Services inappropriate for caregiver’s primary concern 5 16.67
 Recruitment issues 2 6.67
Intervention Level 2 4.08
 Provider unaware of different modality 2 100.00
What are some barriers you encountered in planning your group? (n=24) 24
Agency Level 4 16.67
 Lack of space 3 75.00
 Administrative resistance 1 25.00
Individual Level 3 12.50
 Provider unavailable (time/scheduling) 2 66.67
 Provider not qualified to implement intervention 1 33.33
Population Level 17 70.83
 Caseload ineligible 10 58.82
 Caregiver unavailable (time/scheduling) 2 11.76
 Recruitment issues 2 11.76
 Attendance issues 1 5.88
 Attrition 1 5.88
 Language barriers 1 5.88
What were the most common barriers to attendance for families? (n=24) 33
Population Level 33 100.00
 Caregiver unavailable (time/scheduling) 15 45.45
 Transportation 4 12.12
 Caregiver resistance 4 12.12
 Family illness 3 9.09
 Family crisis 2 6.06
 Attrition 1 3.03
What other difficulties arose when running the group? (n=22) 25
Agency Level 1 4.00
 Lack of funding 1 100.00
Individual Level 2 8.00
 Provider not qualified to implement intervention 2 100.00
Population Level 19 76.00
 Disruptive behavior during intervention 5 26.32
 Caregiver unavailable (time/scheduling) 4 21.05
 Disruption to receiving services 3 15.79
 Attendance issues 2 10.53
 Recruitment issues 2 10.53
 Family crisis 2 10.53
 Lack of childcare 1 5.26
Intervention Level 3 12.00
 Intervention modality unfavorable 3 100.00

Acknowledgments

Funding

This study was funded by NIMH (R01- MH-106771).

Footnotes

Disclosure of potential conflicts of interest

Conflict of Interest: The authors declare that they have no conflict of interest.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Compliance with Ethical Standards

Publisher's Disclaimer: This Author Accepted Manuscript is a PDF file of an unedited peer-reviewed manuscript that has been accepted for publication but has not been copyedited or corrected. The official version of record that is published in the journal is kept up to date and so may therefore differ from this version.

References

  1. Aarons GA (2004). Mental health provider attitudes toward adoption of evidence-based practice: The Evidence-Based Practice Attitude Scale (EBPAS). Mental Health Services Research, 6, 61–74.
  2. Aarons GA, Green AE, Willging CE, Ehrhart MG, Roesch SC, Hecht DB, & Chaffin MJ (2014). Mixed-method study of a conceptual model of evidence-based intervention sustainment across multiple public-sector service settings. Implementation Science, 9, 183–194. doi: 10.1186/s13012-014-0183-z
  3. Aarons GA, & Palinkas LA (2007). Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research, 34, 411–419.
  4. Aarons GA, Sommerfeld DH, & Walrath-Greene CM (2009). Evidence-based practice implementation: The impact of public versus private sector organization type on organizational support, provider attitudes, and adoption of evidence-based practice. Implementation Science, 4(83). doi: 10.1186/1748-5908-4-83
  5. Acri MC, Bornheimer LA, Hamovitch EK, & Lambert K (in press). Outcomes associated with adapting a research-supported treatment for children with behavioral disorders. Research on Social Work Practice.
  6. Acri M, Gopalan G, Chacko A, & McKay M (2018). Engaging families into treatment for child behavior disorders: A synthesis of the literature. In Lochman J & Mathys W (Eds.), Wiley Handbook of Disruptive and Impulse-Control Disorders. Hoboken, NJ: Wiley.
  7. Acri MC, Hamovitch E, Garay E, & McKay M (2017). Testing the 4Rs and 2Ss multiple family group intervention: Study protocol for a randomized controlled trial. Trials, 18, 588. doi: 10.1186/s13063-017-2331-7
  8. Acri MC, Hamovitch EK, Lambert K, Galler M, Parchment TM, & Bornheimer LA (2019). Perceived benefits of a multiple family group for children with behavior problems and their families. Social Work with Groups, 1–16. doi: 10.1080/01609513.2019.1567437
  9. Agency for Healthcare Research and Quality (AHRQ). (2015). Treating Disruptive Behavior Disorders in Children and Teens. Retrieved from https://effectivehealthcare.ahrq.gov/topics/disruptive-behavior-disorder/consumer
  10. Amodeo M, Lundgren L, Cohen A, Rose D, Chassler D, Beltrame C, & D’Ippolito M (2011). Barriers to implementing evidence-based practices in addiction treatment programs: Comparing staff reports on motivational interviewing, adolescent community reinforcement approach, assertive community treatment, and cognitive-behavioral therapy. Evaluation and Program Planning, 34, 382–389. doi: 10.1016/j.evalprogplan.2011.02.005
  11. Atkins MS, Rusch D, Mehta TG, & Lakind D (2016). Future directions for dissemination and implementation science: Aligning ecological theory and public health to close the research to practice gap. Journal of Clinical Child & Adolescent Psychology, 45, 215–226. doi: 10.1080/15374416.2015.1050724
  12. Barwick MA, Peters J, & Boydell K (2009). Getting to uptake: Do communities of practice support the implementation of evidence-based practice? Journal of the Canadian Academy of Child and Adolescent Psychiatry, 18, 16–29.
  13. Battaglia C, & Glasgow RE (2018). Pragmatic dissemination and implementation research models, methods and measures and their relevance for nursing research. Nursing Outlook, 66, 430–445.
  14. Berghart AM, & Simon SS (2005). Practicing what we preach: Creating groups for ourselves. Social Work With Groups, 27(4), 17–30.
  15. Bond GR, Drake RE, McHugo GJ, Peterson AE, Jones AM, & Williams J (2014). Long-term sustainability of evidence-based practices in community mental health agencies. Administration and Policy in Mental Health and Mental Health Services, 41(2), 228–236.
  16. Borntrager CF, Chorpita BF, Higa-McMillan C, & Weisz J (2009). Provider attitudes toward evidence-based practices: Are the concerns with the evidence or with the manuals? Psychiatric Services, 60(5), 677–681. doi: 10.1176/appi.ps.60.5.677
  17. Brestan EV, & Eyberg SM (1998). Effective psychosocial treatments of conduct-disordered children and adolescents: 29 years, 82 studies, and 5,272 kids. Journal of Clinical Child Psychology, 27, 180–189.
  18. Centers for Disease Control and Prevention. (n.d.). Data and Statistics on Children’s Mental Health. Retrieved December 20, 2018, from https://www.cdc.gov/childrensmentalhealth/data.html
  19. Chacko A, Alan C, Uderman J, Cornwell M, Anderson L, & Chimiklis A (2015). Training parents of children with ADHD. In Barkley R (Ed.), Attention Deficit Hyperactivity Disorder: A Handbook for Diagnosis and Treatment (4th ed., pp. 513–536). New York, NY: Guilford Press.
  20. Chacko A, Jensen SA, Lowry LS, Cornwell M, Chimklis A, Chan E, ... & Pulgarin B (2016). Engagement in behavioral parent training: Review of the literature and implications for practice. Clinical Child and Family Psychology Review, 19, 204–215. doi: 10.1007/s10567-016-0205-2
  21. Chaudoir SR, Dugan AG, & Barr CH (2013). Measuring factors affecting implementation of health innovations: A systematic review of structural, organizational, provider, patient, and innovation level measures. Implementation Science, 8(22).
  22. Child and Adolescent Health Measurement Initiative (CAHMI). (n.d.). Data Resource Center for Child & Adolescent Health. Retrieved from http://childhealthdata.org/
  23. Chor KHB, Olin S-CS, Weaver J, Cleek AF, McKay MM, Hoagwood KE, & Horwitz SM (2014). Adoption of clinical and business trainings by child mental health clinics in New York State. Psychiatric Services, 65, 1439–1444. doi: 10.1176/appi.ps.201300535
  24. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(50). doi: 10.1186/1748-5908-4-50
  25. Derr MK, Douglas S, & Pavetti L (2001). Providing mental health services to TANF recipients: Program design choices and implementation challenges in four states. Washington, DC: Department of Health and Human Services.
  26. Fernandez MA, & Eyberg SM (2009). Predicting treatment and follow-up attrition in parent–child interaction therapy. Journal of Abnormal Child Psychology, 37, 431–441.
  27. Frick PJ, & Muñoz L (2006). Oppositional defiant disorder and conduct disorder. In Essau CA (Ed.), (pp. 26–51). New York, NY: Routledge/Taylor & Francis Group.
  28. Geller C (1995). Group supervision as a vehicle for teaching group work to students. The Clinical Supervisor, 12(1), 199–214.
  29. Glisson C, Williams NJ, Hemmelgarn A, Proctor E, & Green P (2016). Aligning organizational priorities with ARC to improve youth mental health service outcomes. Journal of Consulting and Clinical Psychology, 84, 713–725. doi: 10.1037/ccp0000107
  30. Gopalan G, Chacko A, Franco L, Dean-Assael KM, Rotko LE, Marcus SM, ... & McKay MM (2015). Multiple family groups for children with disruptive behavior disorders: Child outcomes at 6-month follow-up. Journal of Child and Family Studies, 24, 2721–2733. doi: 10.1007/s10826-014-0074-6
  31. Hamovitch E, Acri M, & Bornheimer L (2018). Who is accessing family mental health programs? Demographic differences before and after system reform. Children and Youth Services Review, 85, 239–244.
  32. Harvey AG, & Gumport NB (2015). Evidence-based psychological treatments for mental disorders: Modifiable barriers to access and possible solutions. Behaviour Research and Therapy, 68, 1–12.
  33. Haug NA, Shopshire M, Tajima B, Gruber V, & Guydish J (2008). Adoption of evidence-based practices among substance abuse treatment providers. Journal of Drug Education, 38, 181–192. doi: 10.2190/DE.38.2.f
  34. Jensen-Doss A, Hawley KM, Lopez M, & Osterberg LD (2009). Using evidence-based treatments: The experiences of youth providers working under a mandate. Professional Psychology: Research and Practice, 40, 417–424. doi: 10.1037/a0014690
  35. Jones DJ, Forehand R, Guellar J, Kincaid C, Parent J, Fenton N, & Goodrum N (2013). Harnessing innovative technologies to advance children’s mental health: Behavioral parent training as an example. Clinical Psychology Review, 33, 241–252. doi: 10.1016/j.cpr.2012.11.003
  36. Kazdin AE (1997). Parent management training: Evidence, outcomes, and issues. Journal of the American Academy of Child & Adolescent Psychiatry, 36, 1349–1356.
  37. Kirk MA, Kelley C, Yankey N, Birken SA, Abadie B, & Damschroder L (2016). A systematic review of the use of the Consolidated Framework for Implementation Research. Implementation Science, 11(72). doi: 10.1186/s13012-016-0437-z
  38. LaPorte HH, & Sweifach J (2011). MSW foundation students in the field: Reflections on the nature and quality of group work assignments and supervision. Journal of Teaching in Social Work, 31, 239–249.
  39. McKay MM, & Bannon WM Jr. (2004). Engaging families in child mental health services. Child and Adolescent Psychiatric Clinics of North America, 13, 905–921.
  40. McKay MM, Harrison ME, Gonzales J, Kim L, & Quintana E (2002). Multiple-family groups for urban children with conduct difficulties and their families. Psychiatric Services, 53, 1467–1468. doi: 10.1176/appi.ps.53.11.1467
  41. Muskat B (2013). The use of IASWG standards for social work practice with groups in supervision of group work practitioners. Social Work With Groups, 36, 208–221.
  42. National Collaborating Centre for Mental Health (Great Britain). (2013). Antisocial behaviour and conduct disorders in children and young people: Recognition, intervention and management (Vol. 158). London: RCPsych Publications.
  43. Nelson TD, & Steele RG (2007). Predictors of practitioner self-reported use of evidence-based practices: Practitioner training, clinical setting, and attitudes toward research. Administration and Policy in Mental Health and Mental Health Services Research, 34, 319–330.
  44. Nock MK, Kazdin AE, Hiripi E, & Kessler RC (2006). Prevalence, subtypes, and correlates of DSM-IV conduct disorder in the national comorbidity survey replication. Psychological Medicine, 36, 699–710.
  45. Piquero AR, Jennings WG, Diamond B, Farrington DP, Tremblay RE, Welsh BC, & Gonzalez JMR (2016). A meta-analysis update on the effects of early family/parent training programs on antisocial behavior and delinquency. Journal of Experimental Criminology, 12, 229–248.
  46. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, & Mittman B (2009). Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research, 36, 24–34. doi: 10.1007/s10488-008-0197-4
  47. Sanders MR, Prinz RJ, & Shapiro CJ (2009). Predicting utilization of evidence-based parenting interventions with organizational, service provider and client variables. Administration and Policy in Mental Health and Mental Health Services Research, 36, 133–143. doi: 10.1007/s10488-009-0205-3
  48. Serketich WJ, & Dumas JE (1996). The effectiveness of behavioral parent training to modify antisocial behavior in children: A meta-analysis. Behavior Therapy, 27, 171–186.
  49. Simon SR, & Kilbane T (2014). The state of group work education in U.S. graduate schools of social work. Social Work with Groups, 37, 243–256.
  50. Simon SR, Kilbane TL, & Stoltenberg EB (2017). Underexplored aspects of group-work education in MSW programs. Social Work With Groups, 42, 56–71.
  51. Staudt M (2007). Treatment engagement with caregivers of at-risk children: Gaps in research and conceptualization. Journal of Child and Family Studies, 16(2), 183–196.
  52. Stephens TN, McGuire-Schwartz M, Rotko K, Fuss A, & McKay MN (2014). A learning collaborative supporting the implementation of an evidence-informed program, the “4Rs and 2Ss for children with conduct difficulties and their families.” Journal of Evidence Based Social Work, 11(5), 511–523.
  53. Titler MG (2008). The evidence for evidence-based practice implementation. In Hughes RG (Ed.), Patient safety and quality: An evidence-based handbook for nurses. Rockville, MD: Agency for Healthcare Research and Quality.
  54. Watson J (2005). Active engagement: Strategies to increase service participation by vulnerable families. Ashfield, NSW: NSW Centre for Parenting & Research, Department of Community Services.
  55. Wenger E, McDermott R, & Snyder W (2002). Cultivating communities of practice: A guide to managing knowledge. Boston, MA: Harvard Business School Press.
  56. Willging CE, Green AE, Gunderson L, Chaffin M, & Aarons GA (2015). From a “perfect storm” to “smooth sailing”: Policymaker perspectives on implementation and sustainment of an evidence-based practice in two states. Child Maltreatment, 20, 24–36.