Author manuscript; available in PMC: 2015 Mar 23.
Published in final edited form as: J Evid Based Soc Work. 2014;11(5):511–523. doi: 10.1080/15433714.2013.831007

A Learning Collaborative Supporting the Implementation of an Evidence-Informed Program, the “4Rs and 2Ss for Children with Conduct Difficulties and their Families”

Tricia N Stephens 1, Mandy McGuire-Schwartz 2, Lauren Rotko 3, Ashley Fuss 4, Mary M McKay 5
PMCID: PMC4369766  NIHMSID: NIHMS669279  PMID: 25491005

Abstract

In this qualitative study the authors examine factors associated with the successful implementation and plans for continued use of an evidence-informed intervention, the 4Rs and 2Ss Program for Strengthening Families, in a sample of 29 New York State Office of Mental Health licensed child mental health clinics. A learning collaborative (LC) approach was used as a vehicle for supporting training and implementation of the program. The PRISM theoretical framework (Feldstein & Glasgow, 2008) was used to guide the data analysis. Data were analyzed using a multi-phase iterative process, identifying influences on implementation at multiple levels: the program (intervention), the external environment, implementation and sustainability infrastructure, and recipient characteristics. Clinics that were more proactive and whose staff evidenced advanced organizational skills were better able to take advantage of the trainings and supports offered by the LC and fared better in their ability to adopt the intervention. The ability to adapt the intervention to the specific constraints of the clinics was a strong influence on continued use following the end of the LC. These preliminary results suggest that the supports provided by the LC are useful in consolidating information about the process of implementing evidence-informed interventions in community mental health settings. The impact of these supports also depends on their interactions with specific clinic contextual factors.

Keywords: Learning collaborative, evidence-informed practices, PRISM, children’s mental health, uptake, sustainability


Evidence-based practice (EBP) refers to a body of scientific knowledge about the most effective service procedures available, including all areas of service delivery, referral, assessment, case management, treatment, and clinical outcomes (Hoagwood & Johnson, 2003). Despite the documented benefits of EBPs, adoption of these practices has been limited in community mental health settings where significant barriers exist (Aarons, Wells, Zagursky, Fettes, & Palinkas, 2009; Hoagwood & Olin, 2002; Jensen, 2003; Rotheram-Borus & Duan, 2003). Child and family-serving community-based mental health clinics face specific challenges in implementing EBPs (Burns, 1999; Lyons, 2009). Workforce preparation to deliver EBPs is a serious obstacle. Too often, clinicians depend on non-EBPs when providing services to children and families (Baumann, Kolko, Collins, & Herschell, 2006; Bickman, Lambert, Summerfelt, & Heflinger, 1996; Leslie, Hurlburt, Landsverk, Barth, & Slymen, 2004).

Failure to support clinical staff in adopting EBPs is a major gap in the translation of research findings to practice. An estimated 90% of publicly funded child-serving agencies do not use EBPs (Hoagwood & Olin, 2002). In a study by Walrath, Sheehan, Holden, Hernandez, and Blau (2006), which surveyed child mental health providers working in community settings, clinicians who reported not using EBPs identified a number of reasons, including knowledge barriers, attitude barriers, and contextual/practical barriers. A majority of providers also reported that although agencies did not require them to use EBPs, at least one support mechanism was available to promote the use of EBPs. In order to improve treatment quality and the sustainability of EBPs in children’s services, barriers to dissemination and implementation need to be addressed (Burns, 1999; Hoagwood, Burns, Kiser, Ringeisen, & Schoenwald, 2001).

There are data suggesting that EBPs, when adopted in child and family-serving community mental health centers, are most sustainable when they are amended from their original models in order to meet the particular needs of the settings. In a study aimed at describing the use of EBPs by providers working in a child community mental health clinic, Sheehan, Walrath, and Holden (2007) found that 80% of providers surveyed reported using at least one EBP, in addition to medication. Although providers reported using EBPs, the study also found a lack of full treatment implementation, suggesting that treatments were being used without fidelity to the model (Sheehan, Walrath, & Holden, 2007). While conformity to the designed treatment may be important for outcomes, the ability of treatments to be adapted by practitioners as needed may be essential to increasing the usage of EBPs.

The LC Model as an Implementation Support

The dissemination of information and training are two of the most common strategies used in supporting the adoption of new practices, but these methods are inadequate for ensuring that evidence-based mental health treatments are implemented and sustained (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005; Stirman, Crits-Christoph, & DeRubeis, 2004). LCs were created in an attempt to bridge the gap between simply distributing information and facilitating the adoption of evidence-based and evidence-informed practices. Pioneered by the Institute for Healthcare Improvement (IHI), Breakthrough Series Collaboratives, as they were called, sought to address the high rates of costly errors and lack of use of scientifically supported interventions, while reducing exorbitant costs in healthcare.

As described by IHI, LCs are short-term “learning systems,” including teams of representatives from various hospitals and clinics to “seek improvement in a focused topic area” (IHI, 2003). LC models currently in use were inspired by the IHI model and focus on streamlining the use of best practices. LC teams typically consist of key agency leadership (including administrators, supervisors, and clinicians) from different organizations who work together to identify and disseminate a best practice (Institute for Healthcare Improvement, 2003).

Since their creation in 1995, LCs have been widely used in the healthcare system to support change, implement medical interventions, and improve quality of care (Schouten, Hulscher, Van Everdingen, Huijsman, & Grol, 2008; Wilson, Berwick, & Cleary, 2003). The evidence suggests that participation in LCs targeted at a particular best practice leads to improvements in care. This finding has been demonstrated in a number of fields, including pediatric care (Young, Glade, Stoddard, & Nolin, 2006), nursing (Cronenwett, Sherwood, & Gelmon, 2009), diabetes care (Landis, Schwarz, & Curran, 2006), and newborn hearing screening follow-up (Russ, Hanna, DesGeorges, & Forsman, 2010).

Despite their popularity and established benefits in the medical field, LCs have not been widely applied to child mental health settings (Cavaleri, Gopalan, McKay, Messam, & Elwyn, 2010; Ebert, Amaya-Jackson, Markiewicz, Kisiel, & Fairbank, 2012), with only two studies having been conducted. Cavaleri et al. (2006) used a LC methodology to increase engagement in mental health services for children and families and found support for the use of this model. Ebert and colleagues (2012) also conducted an observational study, evaluating the use of an LC to support the implementation and sustainability of Trauma-Focused Cognitive Behavioral Therapy (TF-CBT) in community settings. A majority of participants (67%) in this study reported feeling very or extremely ready to sustain and continue their TF-CBT efforts (Ebert et al., 2012). Additionally, at the one-year follow-up, all participating agencies (100%, n = 11) continued to offer the intervention (Ebert et al., 2012). These studies support the idea that the LC model may be an effective vehicle for dissemination of best practices in child mental health settings.

A Conceptual Framework—PRISM

Feldstein and Glasgow (2008) identified the challenges inherent in creating conceptual frameworks to guide implementation research on strategies such as LC models. The Practical, Robust Implementation and Sustainability Model (PRISM) was proposed to assist organizations in identifying factors important in the implementation of a practice innovation, as well as ways of measuring outcomes.

Borrowing heavily from four existing models in implementation and diffusion research (see Feldstein & Glasgow, 2008 for a detailed description), PRISM focuses on the following key areas. The first area is focused on understanding the characteristics of the proposed practice, including the evidence base supporting it and organizational and patient perspectives of the practice. The second concerns identifying factors in the external environment, such as regulations or reimbursement patterns that may affect adoption of the practice. The third is focused on the implementation and sustainability infrastructure within an organization, including the alignment of the practice with the organization’s mission and the organization’s readiness for change. Finally, the fourth involves the characteristics of the organizational and patient recipients, emphasizing that organizations that conduct needs assessments to identify barriers are the most successful.

Organizational Characteristics and Perspectives

In work done by Breitkreuz, McConnell, Savage, and Hamilton (2011), it was noted that clinicians were more invested in a practice when the rationale for implementation was clearly presented, and when trainers demonstrated expertise, were responsive to their concerns, and acknowledged their professional experience. Additionally, the organizational structure and context can determine to what extent EBPs will be perceived as useful (Aarons & Palinkas, 2007; Rosenberg, 2009; Weisz, Jensen-Doss, & Hawley, 2006) and utilized (Breitkreuz et al., 2011; Riemer, Rosof-Williams, & Bickman, 2005), particularly in the field of children’s mental health (Dulcan, 2005).

Multi-level organizational support of EBPs is viewed as a positive and important influence on implementation. Aarons and Palinkas (2007) found that if practitioners felt that the EBP enhanced their professional competence, fit within their usual tasks and experience, and matched with the mission of the organization, they were more likely to use it in their day-to-day practice. On the other hand, any increased oversight and reduction of practice autonomy were viewed as negative aspects of EBPs and contributed to decreased usage (Aarons & Palinkas 2007).

Implementation and Sustainability Infrastructure

Although adaptation is often necessary, there is little systematic observation and understanding of the changes that practices must undergo during implementation in order to be successfully adopted. The most important factor identified as relating to the sustainability of an EBP is the adaptability of the practice (Addis & Krasnow, 2000). Although the adaptation of an EBP is controversial due to issues of fidelity, research has suggested that insistence on strict adherence may create unnecessary barriers in real world settings (Berwick, 2003). Modifications and reinvention to fit varying local conditions and contexts are common when EBPs are implemented in practice settings (Druss, 2005). According to practitioners, being able to adapt and tailor interventions to meet the needs of families is an important deciding factor related to continued usage (Aarons & Palinkas, 2007). Thus, the ability to understand the implementation process and its resulting effect on the system, providers, and consumers is critical in determining the likelihood of achieving favorable outcomes in community settings (Aarons & Palinkas, 2007).

Aims

Through this qualitative study the authors aim to advance knowledge about the process of implementing and sustaining the use of the 4Rs and 2Ss for Strengthening Families (4Rs) Program across child and family mental health clinics in New York State. The data collected were analyzed to answer a core set of questions, guided by the PRISM conceptual framework, related to the following areas:

  1. Program
    1. How did clinical staff use the practice protocol?
    2. How did staff respond to the multiple family group format?
  2. External Environment
    1. What were the factors that motivated clinic leadership to volunteer to be involved in the LC?
    2. What challenges did clinics anticipate at the outset of the LC?
  3. Implementation and Sustainability Infrastructure
    1. What were the major challenges faced by the clinics once they were involved in the LC?
    2. What constructs were associated with clinic performance in implementation?
  4. Recipient Characteristics
    1. What constructs were identified by participating clinics as contributing to their decisions about whether or not to continue their use of this evidence-informed intervention?

A LC was initiated by the New York State Children’s Technical Assistance Center (CTAC) to support the implementation of the 4Rs and 2Ss Program intervention across clinic sites. CTAC is a training, consultation, and educational resource center available to all child-serving clinics in New York State, funded by the New York State Office of Mental Health. CTAC assists child-serving providers in addressing the challenges associated with the recent changes in clinic regulations, financing, and overall healthcare reforms.

METHODS

The authors present an analysis of qualitative data collected from providers and program directors as part of a LC process supporting the implementation of the 4Rs and 2Ss Program. The 4Rs and 2Ss program is an evidence-informed clinical intervention for groups of families with children (aged 7–11 years) who meet the diagnostic criteria for oppositional defiant disorder or conduct disorder. Following the initial implementation of the program by 13 community child mental health clinics in the New York City metropolitan area, CTAC offered trainings and technical assistance to more than 300 clinics across New York State to aid in the statewide implementation of the program. The data presented here are based on provider feedback about the subsequent implementation efforts of the 29 clinics that opted to be involved in this rollout of the 4Rs program across New York State (see Table 1 for clinic characteristics).

TABLE 1.

Clinic Characteristics

Clinic Characteristic | Total Number of Clinics | Downstate | Upstate
Number of clinics in the Learning Collaborative | 29 | 20 | 9
Number of clinics who ran groups | 26 | 17 | 9
Number of groups started | 29 | 20 | 9
Number of groups that completed curriculum content | 21 | 14 | 7
Number of groups that ended prematurely | 8 | 6 | 2
Average number of families in group | 5 | 5 | 3
Average number of families who dropped out | 2 | 2 | 3
Clinics that used a parent partner | 10 | 5 | 5

Data Collection and Data Capture Tool

Before, during, and after the implementation of the 4Rs and 2Ss Program at each child mental health clinic site, the research team asked program facilitators and directors to provide feedback through a series of structured meetings and phone calls. Twenty-nine agencies across New York State participated in this process. (See Table 2 for a breakdown of the LC activities, sample content and related PRISM elements.)

TABLE 2.

Breakdown of the Learning Collaborative Activities and Related PRISM Elements

Date | Event | Sample Content | PRISM Elements
October 2011 | Introductory webinar | Description of the 4Rs and 2Ss; information on Learning Collaborative model | Program (intervention)
October–November 2011 | In-person meetings (round 1) | Discussion of how clinics became involved; current organizational practices; work plans | External environment; implementation and sustainability infrastructure
November–December 2011 | Facilitator training | 4Rs and 2Ss history & outcomes; review of manuals, practice, engagement, facilitation, parent advocate’s role, performance indicators, and fidelity | Program (intervention)
December 2011–January 2012 | Individual calls (round 1) | Progress updates; staff feedback; plans to involve family advocates | Program (intervention); recipients (organizational)
January 2012 | Group calls (round 1) | Challenges; webinars and data collection; family survey; fidelity | Program (intervention); recipients (service users)
February 2012 | In-person meetings (round 2) | Progress and challenges; problem-solving approaches; performance indicators; family surveys; fidelity | Program (intervention); external environment; recipients (service users & organizational)
March 2012 | Special interest call with group facilitators | Experiences with the program; questions and concerns | Program (intervention); recipients (service users & organizational)
April 2012 | Individual calls (round 2) | Progress & challenges; data submission | Program (intervention)
June 2012 | Group calls (round 2) | Updates; data submission | Program (intervention)
July 2012 | In-person meetings (round 3) | Experience with the project; performance indicator data; lessons learned; future plans; parent partners | Program (intervention); recipients (service users & organizational); implementation and sustainability infrastructure

Scripts for the meetings and phone calls were developed by CTAC staff to collect structured feedback about the real world implementation of the program from facilitators and directors (see Table 2 for the sample content covered in each meeting). Extensive notes were taken at each in-person and telephone-based meeting by a team that included doctoral level students and trained CTAC research staff. A table was developed, with each question from the CTAC script given a dedicated recording space. This allowed the note takers to capture responses from the various clinic representatives to the questions, which were asked in an open format. Wherever possible, there were multiple recorders at in-person meetings, allowing for the cross-referencing and, if necessary, correction of information.

Data Analysis

Data from the provider meetings and group and individual calls were analyzed using a multi-phase, iterative process. Coders worked in pairs during the data analysis phase of the study. Notes from each round of meetings or calls were read aloud by one of the authors. The author identified preliminary codes for that set of notes. The same author then reviewed all notes and adjusted codes as necessary, to more fully capture prevalent themes across respondents.

For example, codes fell into seven categories: Reasons for Initially Joining, Overall Experience with the Project, Performance Indicator Data, Lessons Learned, Future Plans with the Model, Parent Partners, and Additional Comments. Within each category, there were then several subcategories. Under Reasons for Joining, the options were Expanding the Scope of Practice (increasing EBPs into clinic/school, community marked by low socioeconomic status, or chaotic families and clinic), Child Behavior Problems (mentions children/kids struggling with behavior, ODD, or kids with ADHD), Improve Clinic Attendance (attendance or “looking for a way to keep them in the clinic”), and Finances (interest in cost savings).
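Purely as an illustration of how such a codebook can be organized, the sketch below represents the category/subcategory structure as nested data and tags a note by simple phrase matching. The category names and indicative phrases follow the examples given above; the actual coding was done manually by trained raters, so the helper function, phrase lists, and matching logic here are hypothetical.

```python
# Illustrative sketch only: the coding described above was done by paired human
# raters. This shows one way the codebook's structure (categories ->
# subcategories -> indicative phrases) could be represented; phrases are drawn
# from the examples in the text, and the remaining categories are elided.

CODEBOOK = {
    "Reasons for Initially Joining": {
        "Expanding the Scope of Practice": ["EBPs", "school", "low socioeconomic", "chaotic"],
        "Child Behavior Problems": ["behavior", "ODD", "ADHD"],
        "Improve Clinic Attendance": ["attendance", "keep them in the clinic"],
        "Finances": ["cost savings"],
    },
    # ... remaining six categories (Overall Experience with the Project,
    # Performance Indicator Data, Lessons Learned, Future Plans with the Model,
    # Parent Partners, Additional Comments) would follow the same pattern.
}

def tag_note(note_text: str) -> list[tuple[str, str]]:
    """Return every (category, subcategory) whose indicative phrases appear in a note."""
    note = note_text.lower()
    return [
        (category, subcategory)
        for category, subcategories in CODEBOOK.items()
        for subcategory, phrases in subcategories.items()
        if any(phrase.lower() in note for phrase in phrases)
    ]

print(tag_note("Joined because kids with ADHD need more support and attendance is low."))
# -> [('Reasons for Initially Joining', 'Child Behavior Problems'),
#     ('Reasons for Initially Joining', 'Improve Clinic Attendance')]
```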

After the first rater completed coding for each set of notes, a second rater, using the codes developed by the primary coder as a guide, coded the same set of notes. All notes were labeled with as many codes as appropriate. Inter-rater reliability was then determined based on a sample of 25% of the agencies in the group, selected using a random number sequence obtained from www.random.org/sequences/. It was decided ahead of time that every fourth number from the random number chart would be chosen for comparison. Using the sample of agencies labeled sequentially with the chosen numbers, the percentage of codes accurately and concurrently identified by both raters was calculated. Rates of reliability from each round of coding are detailed in Table 3.
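As a rough sketch of that calculation (not the study's actual code), the function below assumes each rater's output is stored as a mapping from agency ID to the set of codes that rater applied, that the sequence from www.random.org/sequences/ is a random permutation of the agency labels 1..N, and that "percentage of codes concurrently identified" means codes applied by both raters as a share of codes applied by either rater; all of these are assumptions made for illustration.

```python
# Minimal sketch of the agreement check described above, under assumed data
# structures. rater_a and rater_b map agency ID -> set of applied codes;
# random_sequence is assumed to be a random permutation of 1..N agency labels.

import math

def percent_agreement(rater_a, rater_b, random_sequence, sample_fraction=0.25):
    agencies = sorted(rater_a)                          # agencies coded by both raters
    n_sample = math.ceil(len(agencies) * sample_fraction)
    chosen_labels = random_sequence[3::4][:n_sample]    # every fourth number, chosen a priori
    sampled = [agencies[label - 1] for label in chosen_labels]

    agreed = total = 0
    for agency in sampled:
        codes_a, codes_b = rater_a[agency], rater_b[agency]
        agreed += len(codes_a & codes_b)                # codes both raters applied
        total += len(codes_a | codes_b)                 # codes either rater applied
    return 100.0 * agreed / total if total else 100.0
```

For instance, if the primary rater applied the codes {Child Behavior Problems, Finances, Parent Partners} to a sampled agency's notes and the second rater applied {Child Behavior Problems, Finances}, that agency would contribute two agreed codes out of three under this reading of the procedure.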

TABLE 3.

Inter-rater Reliability Scores for LC Coding

Round | Format | Inter-rater Reliability
Round 1 | Face to face | 85%
Round 1 | Calls (group and individual) | 87%
Round 2 | Face to face | 90%
Round 2 | Calls (group and individual) | 92%
Round 3 | Face to face | 84.6% (upstate) & 91.9% (downstate)

RESULTS

The clinics that participated in the LC supporting the roll out of the 4Rs 2Ss Program experienced varying degrees of success according to: (1) their use of the 4Rs 2Ss Program manual and group format; (2) their management of the external environment, as evidenced by their proactivity in addressing logistical challenges such as transportation and space and their resourcefulness in securing reimbursement from payers like Medicaid and private insurance companies; (3) their ability to harness an implementation and sustainability infrastructure, both internally and through supports offered by CTAC; and (4) their assessment and management of recipients, as evidenced by their ability to recruit and retain participants, manage staff workload and commitment to the program, and maximize benefit to service users.

The themes developed from the data collected during the LC are detailed below, organized according to the following facets of the PRISM model: program, external environment, and implementation and sustainability infrastructure. This is followed by a discussion of the factors that influenced clinics’ decisions about whether continued use of the 4Rs Program was feasible, including recipient characteristics, characteristics associated with strong implementation, and service users’ perceptions of the 4Rs 2Ss Program.

Program

How did clinical staff use the practice protocol?

How did staff respond to the multiple family group format?

In-person LC supports included training of clinicians in program delivery and provision of program materials in multiple languages. Phone and in-person supports were made available in a structured format. These trainings and the availability of CTAC staff to answer questions and offer follow-up support were thought to enhance the experience of participating clinics. Initially, there were varied levels of receptiveness across clinic administration and staff to the 4Rs 2Ss Program. Some participants expressed open skepticism, viewing the 4Rs 2Ss Program as potentially another top-down directive from the state office of mental health. Comments included:

We’ve gone through a lot of EBP before … seems out of sync between what clinic work needs and what is taught.—Administrator

We have been involved in a number of programs in NY and … know what fundamentally doesn’t work, so am here so that even if the program doesn’t work (I) will identify the forces both external and internal that get in the way of delivery of services. Am hoping that some of these things can be fleshed out.—Administrator

However, as the LC progressed, group facilitators and other clinical staff overwhelmingly described positive feelings about the 4Rs 2Ss Program intervention. Clinicians pointed to the strengths of the 4Rs 2Ss Program, noting helpful training and preparation for delivering the group, ease of use, low preparation time (clinicians expressed relief that they did not need to dedicate hours to preparing for their group), and positive participant response to the intervention. The manual was described as ready to be used without hours of preparation ahead of group time, easing its use within a practice day. Clinic staff described satisfaction with the support provided by CTAC staff, as well as opportunities to learn from other LC participants about ways in which they solved problems and adapted the program to meet their needs.

Though participation levels at the initial face-to-face LC meeting and the trainings for group facilitators were strong across clinics, those clinics that struggled and did not implement a group had faltering participation on group calls and follow-up meetings. CTAC staff did outreach to remain connected to these clinics, yet there was attrition. Though information and support were available through CTAC via the Internet, phone calls, and in-person meetings, it appeared that some clinics were not positioned to take advantage of these supports.

External Environment

What were the factors that motivated clinics to volunteer to be involved in the LC?

The LC participants’ decisions to join were motivated both internally, by the needs of clients, and externally, by policy and financial considerations. The following themes were identified as reasons for joining the LC: both clinicians and administrators expressed a desire to learn about an intervention that might have clinical relevance to their population; there was interest, particularly among administrators, in a group format that might provide a financial advantage in the clinic setting; and administrators were interested in the opportunity to diversify programming in the face of clinic restructuring.

What challenges did clinics anticipate at the outset of the LC?

Clinic administrators and clinical staff involved in the initial face-to-face meeting expressed concerns about logistical barriers they faced, such as finding sufficient space, facilitating transportation for clients, and scheduling trainings. Additionally, some concerns were expressed that the severity of the service users’ clinical symptoms would affect the feasibility of the intervention. Other concerns noted by clinic staff and administrators included views that attendance would be low, that parents and parent partners would be difficult to engage, and that the intervention would prove to be too costly or would not be reimbursed by Medicaid and other insurance companies. Administrators asked practice-informed questions such as “What does family mean? Individual child and caregiver or ‘whole family’?” in an effort to understand how the intervention would fit into their current billing and productivity practices. Financial concerns were discussed and were seen as a potentially critical barrier to group implementation.

Implementation and Sustainability Infrastructure

What were the major challenges faced by the clinics once they were involved in the LC?

Many clinicians expressed eagerness to begin running groups at the outset of the LC. They identified their first steps as focusing on recruitment through the assessment of current caseloads and making presentations at staff and other agency meetings to generate interest. They also discussed possible roadblocks to starting the groups, and solutions to them, related to time management, staff efficiency, and anticipating what clinical issues could arise in the clinic. Start-up steps also included identifying staff who could be trained as group facilitators and recruiting parent partners who could serve as co-facilitators.

Recruitment and Engagement

Recruitment and engagement of families proved to be a major challenge in the implementation and sustainability of the intervention. One New York City administrator described attempts at outreach in a school-based setting:

We got buy in from the principal who presented the program at a PTA meeting and also did a [back-pack] mailing inviting about 40 families to come (to an informational meeting) and no one came.

Clinics described no-shows, lateness to group meetings, and families who showed interest and signed up for the groups but never showed up for the meetings. All clinics, regardless of their level of success in running the groups, reported challenges with attendance. Across clinics, the modal number of families attending each group was 3.

Parent Partners

Clinics were mixed in their ability to recruit, hire, and retain parent partners who could assist in the recruitment of families and co-facilitate the groups. Ten of the twenty-six clinics were successful in engaging a parent partner as a group facilitator. Some of the clinics were familiar with parent partners and were able to integrate them fairly easily into their recruitment and training of group facilitators. These clinics described their parent partners as “amazing” and “having an empowering experience.” Another clinic staff member, however, described struggles with getting the parent partner to view herself as an equal participant in the facilitation process. Where parent partners were hired, clinics generally described their contribution as very positive, particularly in the recruitment of families and in breakout sessions where families were separated into adults and children.

Other clinics struggled because using parent partners represented a cultural shift for their programs. Still other clinics were excited about the prospect of incorporating a parent partner but faced bureaucratic challenges in hiring and using an unlicensed group leader. External funding was available to clinics to compensate the parent partners, but bureaucratic obstacles such as hiring restrictions and concerns about licensing and billing were occasionally too difficult to overcome.

Logistical Issues

Logistical issues like finding appropriate space, assisting clients with transportation costs, ironing out the details of child-care, and providing meals were described by some clinics as time consuming and problematic in terms of worker productivity. In addition, behavioral issues frequently arose during group sessions, presenting challenges to the facilitators. One Upstate New York clinician described her frustration with their experience.

Space—meal in one (room) the session in the other; behavioral issues during session—week 2 (tried to implement the program with the family individually thinking it was a better fit, but didn’t work either—family not ready) …

Those clinics that were proactive with planning for these logistical issues were the best equipped to manage these issues. One clinician advocated for funding to secure metrocards for her group and was successful in getting this approved. She was then able to let parents know that transportation would be provided, thus alleviating this worry at the recruitment stage. Other clinics noted that providing meals, offering child care for younger children, and holding the group during after-work hours were essential to making the group accessible to parents.

What constructs were identified by participating clinics as contributing to their decision about whether or not to continue their use of this evidence-informed intervention?

Flexibility

When the LC participants were asked whether they would run another group, flexibility proved to be the major concern. Clinicians in particular wanted to know how much they could amend the length of the intervention while maintaining fidelity. Several clinics already had groups set up for the summer, a few even running multiple groups across various agency sites. About half of the clinics expressed a willingness to run another group as is, while the remainder expressed an interest in running a variation of the group. The most frequent suggestion for amending the group to the clinic’s needs was to reduce the number of sessions from 16 to 8. This was followed by a desire to use the manual with individual clients rather than groups and to perhaps offer a hybrid version in which the manual would be used in both individual and group sessions.

Recipient Characteristics

The clinics that were observed to be the most successful in implementing and sustaining groups had several characteristics in common. They had early administrative buy-in and support, as well as previous experience running groups. They used co-facilitators to run the groups. They also had sufficient agency staffing support, which made it possible for them to conduct practical outreach, such as making reminder phone calls to participants.

What characteristics were associated with strong performance in implementation?

The clinics most successful in implementing the intervention could be described as proactive in their approach to establishing administrative and clinical supports for the 4Rs 2Ss Program. They were able to identify families interested in the program early, quickly secure space, resolve transportation funding, and effectively plan for group activities such as mealtime. Clinics recruited service users through internal referrals, posters in clinic waiting areas, and opening the group up to all family members.

One clinic, described as making significant progress in the start-up phase, attributed its success to clinic administrators being active in directing staff to support the new initiative: “Leadership told clinicians to give T and M (facilitators) one family and then they would have a pool to choose from.” This clinic also provided food at the groups, as the meeting time was around dinner time. Clinic interns made phone calls to remind families of their group meeting the following day. Yet another clinic recognized early on that transportation costs were prohibitive for families, particularly those that intended to have multiple family members attend. This clinic’s supervisor approached her administrators and secured funds to assist families in defraying the cost of transportation to the clinic.

The clinics that showed the greatest progress were also persistent in their follow-up with reimbursing agencies in order to make the implementation of the 4Rs 2Ss Program feasible. Clinics described working actively with the LC staff to obtain documentation that satisfied requests from insurance companies requiring an explanation of the intervention before making payments for it. Practical financial concerns arose, such as who could be billed when multiple family members were participating in the program; clarification obtained from Medicaid and insurance companies was shared during the LC face-to-face meetings and phone calls.

Service User Perception of the 4Rs 2Ss Program

A benefit of the LC was that it allowed group facilitators to share the reactions of the families to the 4Rs 2Ss Program. The families that were included in the groups were largely drawn from the caseloads of clinicians within the clinics, at times even from the caseload of the group facilitator. At least one child in each family had a diagnosis of a disruptive behavior disorder. The parents in the groups responded favorably to the group format, with many reporting that they would appreciate having a group to themselves where they could discuss issues they did not feel comfortable sharing in front of their children. Parents also expressed their reluctance to end the group as the program came to a close, noting their enjoyment of dedicated time and the ability to share with other families.

The families described another positive response to the 4Rs 2Ss Program, noting that it allowed them to have dedicated family time that was not spent in front of the television. They enjoyed the activities that required family engagement, and some parents noted that it allowed them to see their child with the disruptive behavior diagnosis beyond their label of “problem child.”

DISCUSSION

In this study the authors attempted to document the process of implementation of the 4Rs 2Ss Program in child and family-serving community mental health settings. In doing so, they have contributed to the field of implementation research by affording insight into the factors that promoted success in the real world adoption of an evidence-informed program. The PRISM model was a useful framework for organizing the data from the LC, highlighting the aspects of the program, the external environment, the implementation and sustainability infrastructure, and the recipients that either promoted or hindered adoption of the 4Rs Program beyond the initial period of active support afforded by the LC environment. The findings also reinforce existing knowledge indicating that the flexibility and adaptability of an intervention are the most important factors supporting sustained usage.

The dedicated service provided by the staff of the learning collaborative appeared to positively impact the roll out of the 4Rs Program. Clinics were provided with a forum to air their concerns and brainstorm solutions prior to the roll out. A standardized training format and support were available to all participating clinics, allowing for the resolution of both technical and procedural questions about running the groups. Periodic check-ins in person and by phone, along with an on-line support site, meant that there were ample opportunities to secure guidance both from the training staff and other clinics on how to overcome real world obstacles to successful implementation.

Problem solving was a key component of the LC process. This translated into practical questions being answered in the sessions about what could reasonably be amended while maintaining fidelity. Questions were fielded and answered on topics such as what constitutes a family member, what can be done with the younger siblings of participating children, how services could be billed, and what happens when a family has to exit the group prematurely. While success varied across clinics, a number of characteristics were identified as being linked to successful adoption. These included factors related to the clinics’ use of the program, their response to external environmental pressures, their mobilization of implementation and sustainability infrastructure, and recipient characteristics.

ACKNOWLEDGMENTS

A special thank you to the researchers, trainers, and staff of the Children’s Technical Assistance Center: Kara Dean-Assael, Lydia Franco, Priscilla Shorter, Andrew Cleek, and Anthony Salerno, without whom this work would not be possible.

FUNDING

This research was supported in part by grants from the National Institutes of Health (R01MH072649).


Contributor Information

Tricia N. Stephens, Silver School of Social Work, New York University, New York, New York, USA

Mandy McGuire-Schwartz, Silver School of Social Work, New York University, New York, New York, USA.

Lauren Rotko, Silver School of Social Work, New York University, New York, New York, USA.

Ashley Fuss, McSilver Institute for Poverty Policy & Research, New York University, New York, New York, USA.

Mary M. McKay, Silver School of Social Work, McSilver Institute for Poverty Policy & Research, New York University, New York, New York, USA

REFERENCES

  1. Aarons GA, Palinkas LA. Implementation of evidence based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34:411–419. doi: 10.1007/s10488-007-0121-3.
  2. Aarons GA, Wells RS, Zagursky K, Fettes DL, Palinkas LA. Implementing evidence-based practice in community mental health agencies: A multiple stakeholder analysis. American Journal of Public Health. 2009;99:2087–2095. doi: 10.2105/AJPH.2009.161711.
  3. Addis ME. Methods for disseminating research products and increasing evidence-based practice: Promises, obstacles, and future directions. Clinical Psychology: Science and Practice. 2002;9:367–378.
  4. Baumann BL, Kolko DJ, Collins K, Herschell AD. Understanding practitioners’ characteristics and perspectives prior to the dissemination of an evidence-based intervention. Child Abuse and Neglect. 2006;30:771–787. doi: 10.1016/j.chiabu.2006.01.002.
  5. Berwick DM. Disseminating innovations in health care. JAMA. 2003;289:1969–1975. doi: 10.1001/jama.289.15.1969.
  6. Bickman L, Lambert EW, Summerfelt WT, Heflinger CA. Rejoinder to questions about the Fort Bragg evaluation. Journal of Child and Family Studies. 1996;5(2):197–206.
  7. Breitkreuz B, McConnell D, Savage A, Hamilton A. Integrating Triple P into existing family support services: A case study on program implementation. Prevention Science. 2011;12:411–422. doi: 10.1007/s11121-011-0233-6.
  8. Burns BJ. A call for a mental health services research agenda for youth with serious emotional disturbance. Mental Health Services Research. 1999;1:5–20.
  9. Cavaleri MA, Gopalan G, McKay MM, Appel A, Bannon WM, Bigley MF, Thaler S. Impact of a learning collaborative to improve child mental health service use among low-income urban youth and families. Best Practices in Mental Health. 2006;2:67–79.
  10. Cavaleri MA, Gopalan G, McKay MM, Messam T, Elwyn L. The effect of a learning collaborative to improve engagement in child mental health services. Children and Youth Services Review. 2010;32:281–285.
  11. Cronenwett L, Sherwood G, Gelmon SB. Improving quality and safety education: The QSEN learning collaborative. Nursing Outlook. 2009;57:304–312. doi: 10.1016/j.outlook.2009.09.004.
  12. Druss BG. Medicine-based evidence in mental health. Psychiatric Services. 2005;56:543. doi: 10.1176/appi.ps.56.5.543.
  13. Dulcan MK. Practitioner perspectives on evidence-based practice. Child and Adolescent Psychiatric Clinics of North America. 2005;14(2):225–240. doi: 10.1016/j.chc.2004.04.005.
  14. Ebert L, Amaya-Jackson L, Markiewicz JM, Kisiel C, Fairbank JA. Use of the breakthrough series collaborative to support broad and sustained use of evidence-based trauma treatment for children in community practice settings. Administration & Policy in Mental Health & Mental Health Services Research. 2012;39:187–199. doi: 10.1007/s10488-011-0347-y.
  15. Feldstein AC, Glasgow RE. A practical, robust implementation and sustainability model (PRISM). Joint Commission Journal on Quality and Patient Safety. 2008;34(4):228–243. doi: 10.1016/s1553-7250(08)34030-6.
  16. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F. Implementation research: A synthesis of the literature. Tampa, FL: National Implementation Research Network; 2005.
  17. Hoagwood K, Burns BJ, Kiser L, Ringeisen H, Schoenwald SK. Evidence-based practice in child and adolescent mental health services. Psychiatric Services. 2001;52:1179–1189. doi: 10.1176/appi.ps.52.9.1179.
  18. Hoagwood K, Olin SS. The NIMH blueprint for change report: Research priorities in child and adolescent mental health. Journal of the American Academy of Child and Adolescent Psychiatry. 2002;41:760–767. doi: 10.1097/00004583-200207000-00006.
  19. Hoagwood K, Johnson J. School psychology: A public health framework I. From evidence-based practices to evidence-based policies. Journal of School Psychology. 2003;41:3–21.
  20. Horn SD. Performance measures and clinical outcomes. Journal of the American Medical Association. 2006;296:2731–2732. doi: 10.1001/jama.296.22.2731.
  21. Institute for Healthcare Improvement. The breakthrough series: IHI’s collaborative model for achieving breakthrough improvement. IHI Innovation Series White Paper. Cambridge, MA: Author; 2003. Retrieved from http://www.ihi.org/IHI/Results/WhitePapers/
  22. Jensen PS. Commentary: The next generation is overdue. Journal of the American Academy of Child and Adolescent Psychiatry. 2003;42:527–530. doi: 10.1097/01.CHI.0000046837.90931.A0.
  23. Landis SE, Schwarz M, Curran DR. North Carolina family medicine residency’s diabetes learning collaborative. Family Medicine. 2006;38:190–194.
  24. Leslie LK, Hurlburt MS, Landsverk J, Barth RP, Slymen DJ. Outpatient mental health services for children in foster care: A national perspective. Child Abuse and Neglect. 2004;28:697–712. doi: 10.1016/j.chiabu.2004.01.004.
  25. Lyons JS. Knowledge creation through total clinical outcomes management: A practice-based evidence solution to address some of the challenges of knowledge translation. Journal of Canadian Academic Child and Adolescent Psychiatry. 2009;18:38–45.
  26. Riemer M, Rosof-Williams J, Bickman L. Theories related to changing clinician practice. Child and Adolescent Psychiatric Clinics of North America. 2005;14:241–254. doi: 10.1016/j.chc.2004.05.002.
  27. Rosenberg L. The reality of implementing evidence-based practices. Journal of Behavioral Health Services and Research. 2009;37:1–3. doi: 10.1007/s11414-009-9195-x.
  28. Rotheram-Borus MJ, Duan N. Next generation of preventive interventions. Journal of the American Academy of Child & Adolescent Psychiatry. 2003;42:518–526. doi: 10.1097/01.CHI.0000046836.90931.E9.
  29. Russ SA, Hanna D, DesGeorges J, Forsman I. Improving follow-up to newborn hearing screening: A learning-collaborative experience. Pediatrics. 2010;126(Suppl. 1):S59–S69. doi: 10.1542/peds.2010-0354K.
  30. Sheehan AK, Walrath CM, Holden EW. Evidence-based practice use, training and implementation in the community-based service setting: A survey of children’s mental health service providers. Journal of Child and Family Studies. 2007;16:169–182.
  31. Schouten LM, Hulscher ME, Van Everdingen JJ, Huijsman R, Grol RP. Evidence for the impact of quality improvement collaboratives: Systematic review. British Medical Journal. 2008;336:1491–1494. doi: 10.1136/bmj.39570.749884.BE.
  32. Stirman SW, Crits-Christoph P, DeRubeis RJ. Achieving successful dissemination of empirically supported psychotherapies: A synthesis of dissemination theory. Clinical Psychology: Science and Practice. 2004;11:343–359.
  33. Walrath C, Sheehan A, Holden E, Hernandez M, Blau G. Evidence-based treatments in the field: A brief report on provider knowledge, implementation, and practice. Journal of Behavioral Health Services and Research. 2006;33:244–253. doi: 10.1007/s11414-005-9008-9.
  34. Weisz JR, Jensen-Doss A, Hawley KM. Evidence-based youth psychotherapies versus usual clinical care: A meta-analysis of direct comparisons. American Psychologist. 2006;61:671–689. doi: 10.1037/0003-066X.61.7.671.
  35. Wilson T, Berwick DM, Cleary PD. Performance improvement: What do collaborative improvement projects do? Experience from seven countries. Joint Commission Journal on Quality and Safety. 2003;29:85–93. doi: 10.1016/s1549-3741(03)29011-0.
  36. Young PC, Glade GB, Stoddard GJ, Nolin C. Evaluation of a learning collaborative to improve the delivery of preventive services by pediatric practices. Pediatrics. 2006;117:1469–1476. doi: 10.1542/peds.2005-2210.
