Author manuscript; available in PMC: 2020 Sep 12.
Published in final edited form as: Adm Policy Ment Health. 2020 Mar;47(2):227–243. doi: 10.1007/s10488-019-00930-5

Skills for Developing and Maintaining Community-Partnerships for Dissemination and Implementation Research in Children’s Behavioral Health: Implications for Research Infrastructure and Training of Early Career Investigators

Geetha Gopalan 1, Alicia C Bunger 1, Byron J Powell 1
PMCID: PMC6742583  NIHMSID: NIHMS1523784  PMID: 30863918

Abstract

By engaging with community partners, dissemination and implementation scholars can enhance research relevance and translation. We illustrate the skills needed for developing and maintaining community partnerships by presenting two case studies of partnerships between early-career investigators and child welfare systems to implement mental health interventions. The cases represent two models of partnership (investigator-led and agency-led), highlighting the value and difficulty of conducting community-engaged implementation research. The experiences described feature strategies for building and managing relationships, navigating rules and regulations, adaptation, and securing resources. We offer suggestions for improving training and research infrastructures to support community-engaged implementation scholars.

Keywords: community partnerships, implementation research, children’s mental health, child welfare

Introduction

Children and youth with mental health challenges often receive substandard mental health and child welfare services (Garland et al., 2010). Evidence-based treatments (EBTs) are underutilized, and problems with implementation can diminish their impact (Durlak & DuPre, 2008). Emerging efforts to advance implementation science (Institute of Medicine, 2015; National Institutes of Health, 2016) will require that researchers partner closely with a wide range of community stakeholders to support implementation success (Chambers & Azrin, 2013). This paper identifies the skills needed to develop and maintain community partnerships within the context of implementation research. We present two case studies of efforts to partner with child welfare systems toward improving the quality of behavioral health services for children, youth, and families. These case studies were chosen because of the authors’ personal involvement, as well as their shared lived experience as early-career scholars and trainees in the Implementation Research Institute (IRI) training program for emerging implementation research scientists (http://iristl.org/about/). The IRI was developed in response to the paucity of researchers trained to address the conceptual and methodological challenges inherent in Dissemination and Implementation (D&I) research. While we acknowledge that the two case studies represent only two of many types of community-based partnerships (Palinkas, Short, & Wong, 2015), we utilized our experiences to provide suggestions for improving training and research infrastructures for new D&I investigators.

Community-Engaged Research and Potential for Advancing Implementation Research

Partnerships with community stakeholders provide the basis for community-engaged research approaches, bringing researchers and stakeholders together to share power by jointly identifying problems and questions, designing methods, and using results (Baker, Homan, Schonhoff, & Kreuter, 1999). These research models fall along a continuum ranging from minimal to maximal stakeholder control (Barkin, Schlundt, & Smith, 2013). Studies include those led by academics with stakeholder input as well as community-based participatory research (CBPR) models, where community stakeholders share equal control over all elements of the research process (Israel, Schulz, Parker, & Becker, 1998). By emphasizing research “with” (rather than “on”) communities, practitioners, agencies, or systems (Christopher, Watts, McCormick, & Young, 2008), community-engaged research can facilitate shared learning, enhance the practice-relevancy of research, and build community capacity (Israel et al., 1998).

Such approaches are well-suited for D&I research (Chambers & Azrin, 2013; Mendel, Meredith, Schoenbaum, Sherbourne, & Wells, 2008; Shea et al., 2017; Wallerstein & Duran, 2010). By engaging with community partners, D&I scholars can enhance the external validity and saliency of D&I studies (Wallerstein & Duran, 2010). Given the importance of context for explaining implementation success and informing when and how adaptations need to be made (e.g., Aarons, Hurlburt, & Horwitz, 2011; Damschroder et al., 2009; Cabassa et al., 2014; Mendel et al., 2008), stakeholders’ first-hand knowledge can shape understandings of a problem by identifying relevant contextual, organizational, system, and policy features. Moreover, such knowledge can aid in the development of sensitive measures (Mendel et al., 2008; Minkler, 2005), specify cross-level influences (Baker et al., 1999), and build stronger connections between implementation theory and practice (Israel et al., 1998). Community-engaged approaches might also be useful for identifying local innovations in implementation strategies (Powell et al., 2015; Powell et al., 2017), and investigating cutting-edge issues, such as those related to sustainment. Engaging stakeholders in research builds agency capacity for maintaining EBT use well after grant funding ends, while also providing a context for research examining conditions and mechanisms underlying sustainment (Chambers & Azrin, 2013; Wallerstein & Duran, 2010).

Partnership Challenges and Skills

Although community-engaged research holds promise for making great gains in D&I, identifying, developing, and maintaining research-practice partnerships is challenging. Substantial tension can arise between researchers and community stakeholders because of different language, agendas, priorities, and professional incentives (Baker et al., 1999; Israel et al., 1998; Mikesell, Bromley, & Khodyakov, 2013). The development of these partnerships can take years, and evolve as stakeholders join and leave the group (Christopher et al., 2008). Clear communication, in-person meetings, and written expectations could help build trust among partners, which is especially important in the early stages of a partnership (Christopher et al., 2008). Over the long term, however, maintaining trusting partnerships requires researchers and community stakeholders to balance differing needs. Implementation or data collection might need to be put on hold because of competing demands within an organization, which can frustrate researchers beholden to project or grant deadlines (Kilbourne et al., 2012). As a result, this type of work can be prohibitive for researchers (especially early-career scholars), as well as for community stakeholders who may not have the resources or time to invest in project development (Kilbourne et al., 2012; Mendel et al., 2008; Mikesell et al., 2013). Rigorous, tightly controlled designs and methods might not be appropriate in every implementation setting, requiring research partners to consider alternatives (Baker et al., 1999; Israel et al., 1998). Close collaborations that develop over long periods of time might also compromise researchers’ objectivity (Mendel et al., 2008).

By sharing experiences, challenges, strategies, and helpful supports for conducting community-engaged D&I research, scholars can learn from one another and advance the field (Dunbar et al., 2012). The two case studies presented below showcase efforts of early-career investigators to partner with child welfare systems to improve the quality of behavioral health services for children, youth, and families. Case 1 focuses on a National Institute of Mental Health (NIMH)-funded exploratory/developmental study that utilizes task-shifting strategies to implement the 4Rs and 2Ss Strengthening Families Program (4R2S; Gopalan, 2016) – a multiple family group-delivered intervention originally provided by advanced mental health practitioners to reduce child disruptive behavior difficulties – so that it could be delivered by child welfare caseworkers providing placement prevention services. Case 2 involves a Children’s Bureau-funded demonstration where behavioral health screening, assessment, and referral practices are implemented within a public child welfare agency.

Both case studies were based upon the first and second authors’ experiences conducting implementation research in partnership with community stakeholders, and drew upon primary data collection (e.g., notes, qualitative interviews, archival data) and participant observation. To ensure consistency of reporting and a thorough description of the partnership process in both case studies, we developed each case narrative based on a common framework that included seven main areas of focus: 1) project features (context, location, sector, intervention, study purpose, investigator training, and study origins), 2) study features (implementation study type, design methods, implementation strategy, client outcomes, implementation outcomes, other relevant implementation constructs, and target population), 3) mental health integration (integration model, observations, and conflicts of culture), 4) investigator relationships with partners (initial formation/history, development and maintenance, and liaisons/brokers), 5) institutional review board and/or ethical challenges, 6) balancing researcher and agency needs (control over research question, design, methods, dissemination; and competing demands), and 7) potential for advancing dissemination and implementation. After initial drafts of each case were completed, we used two approaches to ensure objectivity and accuracy of reporting. First, the third author reviewed each case for clarity and consistency of reporting and to highlight common themes between the two cases. Clarifying questions and potential themes were discussed via web conference between all authors. Second, we ensured the rigor of reporting by using member checking (Padgett, 2012). Each case was shared with community partners who verified the accuracy of the case descriptions contained in this article.

Case Studies

Case Study 1: Improving Child Behavior Using Task Shifting to Implement the 4Rs and 2Ss Strengthening Families Program (4R2S) in Child Welfare (R21 MH102544; PI: Gopalan)

This NIMH-funded study (Gopalan, 2016) tested the feasibility of implementing the 4R2S intervention, which targets child behavioral difficulties and is typically offered in mental health settings, within child welfare organizations that serve families to prevent out-of-home placement. This exploratory/developmental study focused on the needs of children with disruptive behavior difficulties reared by child welfare (CW)-involved families, who have an increased risk of future maltreatment and out-of-home placement (Barth, Wildfire, & Green, 2006; Barth, 2009; Videka, Gopalan, & Bauta, 2014). However, the lack of available providers (Asen, 2002) and difficulties engaging families in mental health agencies (Gopalan et al., 2010) frequently limit treatment access. CW organizations providing services to prevent out-of-home placement may be ideal locations to deliver child mental health EBTs in conjunction with existing placement prevention services, so that families’ needs can be met comprehensively.

Based on a common elements approach (Chorpita & Daleiden, 2009; Garland, Hawley, Brookman-Frazee, & Hurlburt, 2008), core components of 4R2S represent essential effective practice components from the literature on behavioral parent training (Chorpita & Daleiden, 2009; Garland et al., 2008) and family therapy (e.g., Alexander, Pugh, Parsons, & Sexton, 2000; Keiley, 2002) for treating children’s behavioral difficulties. Sessions involve discussion and in vivo skill development related to “4Rs” (i.e., Rules, Responsibility, Relationships, Respectful Communication). In addition, factors impacting family engagement in mental health services (Kazdin, 1995; Kazdin & Whitley, 2003; Wahler & Dumas, 1989) are represented as the “2Ss” (Stress and Social Support), where session content and activities help families develop coping skills and shore up social supports. Over the course of four months, weekly sessions are convened with six to eight families per group (including caregivers, identified children with behavioral difficulties, siblings, and additional primary caregivers). Sessions are facilitated by at least two group leaders. Engagement is promoted through extensive phone outreach, and the provision of transportation expenses, meals, and childcare at each session.

4R2S was evaluated in an effectiveness study (n = 320) utilizing a block comparison design (Chacko et al., 2015; Gopalan et al., 2015; McKay et al., 2010). Findings demonstrated that 4R2S was superior to services as usual (SAU) in reducing child behavioral difficulties and parental depression and stress; increasing child social skills; and engaging low-income, predominantly minority families. 4R2S exemplifies many behavioral parent training programs that train caregivers in child behavior management skills and are provided within the U.S. mental health service system (Chacko et al., 2015). Such services are also relevant to the child welfare service systems’ goals of promoting child safety, permanency, and well-being (Pecora et al., 2000), as child behavioral difficulties increase the risk of future maltreatment, out-of-home placement, and other psychosocial difficulties (Barth et al., 2006; Barth, 2009; Cairns, Cairns, & Neckerman, 1989; Loeber, Burke, & Pardini, 2009; Videka et al., 2014). Adding behavioral parent training services to existing child welfare system service offerings could avert future out-of-home placement, as well as reduce the risk of future maltreatment and maladaptive outcomes. However, child welfare agencies are staffed by caseworkers who typically do not possess advanced mental health training and experience. Caseworkers’ existing casework responsibilities and heavy workloads may further hinder implementation of child mental health EBTs.

To address this challenge, we drew upon task-shifting strategies, which involved modifying the intervention (and, potentially, the new setting context) for provision by non-mental health providers, training non-mental health providers in the modified intervention, and establishing regular supervision and monitoring by mental health specialists (Patel et al., 2010; Rahman, 2007; Verdeli et al., 2003; WHO, 2008). Such strategies have increased access to mental health EBTs for adults and children in Africa and Asia, but had yet to be implemented in a high-income country and within child welfare services. Consequently, key components of this study involved (1) utilizing task-shifting strategies to address limitations in staffing capacity and increase service access to vulnerable populations; (2) employing the Practical, Robust, Implementation and Sustainability Model (PRISM; Feldstein & Glasgow, 2008) to guide intervention and agency-level changes needed to support successful implementation; (3) modifying content, training, and supervision of 4R2S so that it could be delivered in child welfare settings; and (4) implementing the modified intervention and obtaining subsequent feedback from child welfare staff (caseworkers, supervisors, administrators) and caregivers to assess initial feasibility and acceptability.

This exploratory/developmental study was designed to take place over two phases. In Phase I, a collaborative advisory board (CAB) was convened to (1) modify the intervention to be delivered by caseworkers in placement prevention service settings and (2) develop training and supervision protocols for caseworkers. Relevant stakeholders (e.g., caregivers with a history of receiving CW services, CW caseworkers, CW Supervisors, and CW administrators) were invited to consult on the CAB, in addition to research staff with extensive clinical experience conducting the 4R2S intervention (Gopalan, 2016). Seven CAB meetings took place over the course of six months, where members participated in team-building exercises, review of study background, procedures, and materials, as well as an overview of the 4R2S sessions. CAB meetings encouraged discussion among all members in order to obtain feedback on task-shifting strategies; PRISM domains; 4R2S session components, structure, and processes; caseworker training and supervision; and any changes to the agency context to promote implementation success. At each meeting, research staff took detailed written field notes of CAB members’ feedback. Field notes were subsequently summarized into structured observation guides to aid in making rapid modifications to the 4R2S intervention, training, and supervision.

Phase II focused on pilot-testing the modified 4R2S intervention for feasibility and acceptability outcomes. The study recruited purposive samples of caseworkers (n = 6), supervisors (n = 3), administrators (n = 2), and families (n = 15) to participate in three cohorts of the modified 4R2S intervention. Eligibility criteria included that caseworkers should not have any advanced mental health training (e.g., clinical license, post-graduate employment/training as a mental health provider), and that families had at least one identified child between the ages of six and 13 who met diagnostic criteria for Conduct Disorder or Oppositional Defiant Disorder utilizing the Disruptive Behavior Disorders Checklist (American Psychiatric Association, 2000; Pelham, Gnagy, Greenslade, & Milich, 1992). Data collected in the study focused on demographics at baseline, fidelity and session attendance during intervention delivery, and post-test quantitative surveys and qualitative interviews/focus groups to assess participants’ perceptions of feasibility and acceptability. Due to the small sample size, quantitative analysis focused on measuring outcomes against project-defined high-feasibility and high-acceptability markers. Qualitative data would be analyzed separately to identify a priori (e.g., feasibility, acceptability) and emergent themes. Subsequent mixed methods analysis would compare and contrast quantitative and qualitative data to determine if results confirm each other (i.e., convergence), or generate additional information (i.e., expansion) about factors impacting feasibility and acceptability.

The study was designed to advance implementation science by pilot testing this task-shifting approach. By using the CW system as a non-specialty service sector platform to launch targeted mental health services, this study would also be able to provide generalizable knowledge on ways to facilitate cross-setting implementation for similar EBTs, increase EBT access, and potentially reduce costs within transforming child-serving systems. Finally, methods (e.g., manual, training, supervision, enrollment) from this study would be solidified to support a larger-scale study (R01) testing both the effectiveness and implementation success of the modified 4R2S for child welfare services. Specifically, the intended R01 study would involve a hybrid effectiveness-implementation design (Curran, Bauer, Mittman, Pyne, & Stetler, 2012) involving a randomized controlled trial testing the impact of the task-shifted 4R2S in child welfare services versus services as usual (e.g., referral to child mental health clinics) on child and family outcomes, as well as uptake, integration, and sustainment.

Initial relationship development and engagement model.

As this study was investigator-initiated, the research team engaged community-based organizations (CBOs) providing placement prevention services to families identified by the child welfare system. Prior to submitting a grant proposal to NIMH, the research team recruited two CBOs utilizing existing contacts within the team’s professional network (e.g., prior employers, research sites). Initial engagement activities involved in-person meetings to introduce the project, with one-page written material succinctly summarizing relevant information. Meeting content highlighted the potential intervention benefits to children and families, and how the overall project aligned with child welfare relevant goals (e.g., reduced risk of out-of-home placement), overcame service provision barriers (e.g., lack of available mental health providers to meet child mental health needs), and addressed CBO capacity-building goals (e.g., increasing clinical skills among caseworkers). Such efforts were concluded with clearly articulated “asks,” in this case, to provide a letter of support (initially drafted by the research team) for the grant application.

Upon gaining funding for the project, the research team was informed that final approval to implement any procedures involving human research subjects would have to be obtained from the state authority. Unfortunately, the state authority determined that the study would violate state law by shifting therapeutic activities that only master’s-level practitioners could legally provide. The state authority did, however, authorize the research team to complete Phase I activities, which were not considered human subjects research, and to re-submit the protocol with modifications. As a result, the research team was able to complete Phase I by recruiting caseworkers, supervisors, and administrators from one of the participating sites (one CBO dropped out of the project at this time, having become involved in a competing EBT implementation project), as well as caregivers with prior personal child welfare histories, to create a CAB. However, when the study protocol with the modified intervention was re-submitted, the state authority again refused to authorize implementation of the modified intervention within the state.

Consequently, the research team was forced to move the study to another state following the completion of Phase I. This time, the research team initiated engagement at the state level first to determine interest, identify relevant regulatory constraints, and obtain buy-in. Through the research team’s professional networks, an in-person meeting with the director of the state agency and the research review chair was conducted to describe the study, obtain buy-in at the state level, and secure the state authority’s assistance in disseminating information about the study to county-run agencies. Similar to previous agency engagement efforts, state officials were provided with a brief one-page written summary and a request for assistance in engaging CBOs at the county level. The state agency director emailed the written summary to the county-level CBOs, and, as a result, one county-run CBO agreed to partner with the research team.

Partnership benefits.

The county-run agency director and the in-home services program manager became the study “champions,” providing access as well as supporting overall study implementation. Upon receiving information about the study from the agency director, the program manager initiated contact with the research team to implement the study in her unit. Specifically, the program manager went out of her way to promote and support this study among unit personnel. She spearheaded decreases to caseworker caseloads in order to facilitate implementation and training opportunities, provided useful real-time feedback on implementation challenges that allowed course-correction when needed, and assisted with recruitment efforts by ensuring agency staff informed all potentially eligible families about the study. Moreover, she found creative ways to supplement the study when the research funding expired, and will continue to work with the research team on future research projects with her agency. Finally, the program manager helped facilitate administrative procedures (including problem-solving with the agency director) to ensure the study ran smoothly.

Throughout the implementation of Phase II, research staff also sought to ensure that tangible benefits accrued to CBO staff. Due to state ethics laws prohibiting state employees from receiving research funds while working with their clients, the research team was unable to provide CBO participants (caseworkers, supervisors, administrators) with financial stipends for their involvement in the study. Instead, the research team provided a number of in-person trainings for all CBO staff on the 4R2S model, ensuring that all trainees received continuing education unit (CEU) credits. Additionally, research funds were utilized to provide a full-day training to all CBO staff on secondary trauma.

Unique challenges.

As discussed previously, state-level regulations around shifting tasks from master’s-level practitioners to those with fewer formal professional qualifications proved to be a formidable barrier in the first state, resulting in the project being moved to another state to complete Phase II. Through this process, the research team invariably had to become knowledgeable about state scope-of-practice laws governing both mental health clinicians and child welfare caseworkers. In the second state, scope-of-practice regulations for child welfare caseworkers allowed for greater overlap of skills with clinicians, which likely facilitated the study’s ultimate approval.

As this was an implementation-focused study, the research team prioritized addressing any implementation challenges perceived by CBO staff in real time in order to promote the success of Phase II. Research procedures (training, consenting, scheduling groups, mode of facilitation) had to account for existing work schedules and workload among caseworkers, as well as feedback from administration. When home visiting was discussed as a service delivery option, CBO staff informed the research team that front-line workers would balk at adding additional work to their already harried agenda during home visits, potentially leading to greater rates of turnover. Caseworkers themselves expressed reluctance to add 4R2S content to their in-home meetings, as clients’ homes often did not offer enough space or opportunity to concentrate on intervention content. As a result, initial suggestions to deliver 4R2S during home visiting were discarded. However, the research team provided the modified 4R2S manual to all casework staff, allowing them to utilize specific content or strategies during home visits at their discretion. In this way, research staff learned about the exigencies of child welfare practice from front-line and supervisory perspectives, which then informed implementation procedures within Phase II. Such issues also highlighted the challenges of adding new tasks to overburdened caseworkers.

An additional challenge involved conducting research within child welfare services. As mandated under human subjects research ethical guidelines, the research team was responsible for ensuring that all research participants in Phase II felt free to volunteer or leave the study without repercussions. Unfortunately, the child welfare system culture frequently values compliance over engagement (Lalayants, 2013), where families may often agree to services for fear of negative consequences should they fail to comply. Consequently, research staff took particular care to ensure all potential family participants (e.g., caregivers, children) were informed about their rights and voluntary status, and made multiple efforts to inform all CBO staff that potential family participants could not be penalized for refusing to take part in the study or withdrawing their participation.

Because this study involved research with publicly funded service providers, human subjects ethics approval for Phase II was required from two institutions: (1) the academic institution administering the study and (2) the state child welfare authority. Institutional Review Boards (IRBs) are the committees formally designated within U.S. institutions to examine, approve, and monitor behavioral and biomedical research involving human subjects. The IRB review process itself posed a challenge for conducting implementation research, which requires a high degree of flexibility and the ability to revise procedures in “real time” in order to avert implementation failures. If specific procedures were not originally written into the initial IRB protocol, adding new procedures and documents required a formal modification process through both IRB bodies. Depending on the nature of the modifications, formal IRB approval could take up to several weeks within one institution alone; having to receive permission for a set of modifications from two institutions consecutively resulted in a delay of over five months for this study. The academic IRB was well-staffed, efficiently run, and had accumulated decades of institutional knowledge on a variety of research projects. As a result, initial approval and modification procedures occurred fairly quickly. The counterpart IRB at the state child welfare agency, however, which conducted its own independent review of study procedures, struggled with staffing shortages, turnover, high workloads, and some loss of institutional knowledge as administrations changed and/or re-organized at the state level. Specifically, the two individuals at the state child welfare authority with whom the research team had originally met and received initial buy-in for Phase II of the project were no longer employed in these same positions by the time modifications were submitted for approval.
As a result, individuals reviewing the modifications were likely not as familiar with the initial protocol, resulting in additional questions and layers of scrutiny. Managing multiple IRBs for one project required strong organizational skills from the research team, with careful documentation of changes and approvals to ensure that protocols and materials remained consistent across IRBs. Given staff turnover at both institutions, research staff learned that it was necessary to re-orient reviewers to study procedures, as well as to allow for some delays due to bureaucratic exigencies. Ultimately, when writing IRB protocols for multiple institutions, it is best when procedures are initially conceptualized as flexible and iterative, potential changes are anticipated in the initial protocol to avoid the need for subsequent modification, and IRB staff are consulted for appropriate language when drafting the initial submission.

Recommendations.

The following recommendations highlight important “lessons learned” relevant to all implementation researchers, and to early-career scholars in particular.

  1. Be tenacious. Despite the various setbacks and pitfalls the research team experienced throughout this study, we were determined to implement Phase II and, as a result, completed final data collection in April 2017. Oftentimes, this meant identifying and acknowledging mistakes made, such as being insufficiently informed in the first state about the regulations around task-shifting and the additional layer of review required at the state level before beginning human subjects research. Tenacity in this context also entailed using these “lessons learned” to inform our subsequent approach to engaging stakeholders and implementing the remainder of the study in the second state. Each challenge provided an opportunity for the research team and community stakeholders to creatively problem-solve together.

  2. Understand the regulatory landscape of the service setting in which research will take place. In this study, the relative inexperience of the investigator led to unexpected demands to gain knowledge of state-specific scope-of-practice regulations as well as how to navigate two IRBs. In hindsight, having this information at the start of the enterprise could have enabled a more informed initial selection of jurisdictions and community partners, better ensuring successful completion of the research project.

  3. Cultivate political capital. Engaging and working with public organizations is inherently a political process. The research team benefitted from efforts to identify stakeholders and partners within team members’ existing professional networks. When individuals within these networks were unable to provide direct support, they expanded the team’s network by identifying new potential partners. Although the IRI provided some political leverage for this study through connections already developed by senior IRI faculty, many beginning implementation researchers may not have such resources at their disposal, nor a sufficient reputation among community partners with which to engage them. Even in the current case study, the PI made multiple unsuccessful attempts to engage other CBOs before successfully connecting with the final research site. Early-career researchers may benefit from learning skills to develop and cultivate political capital. These skills may include attending activities and events within communities of interest to engage stakeholders, practicing the interpersonal “soft” skills needed for effective networking and relationship-building through role-play activities and mock events, and advocating within entrenched public bureaucracies (e.g., frequent email and phone outreach, in-person advocacy).

  4. Partner with a public entity for initial development and submission of grant proposals. Although the current study did not have a public partner, such an individual at the state level could have served as a co-investigator on this project, preferably someone whose position is relatively permanent and unlikely to change. Ideally, the study would have benefitted from public partners employed at the state institution responsible for final approval and oversight of the study, as well as stakeholders from local (e.g., city/county) child welfare authorities and individual CBOs. Public administration stakeholders serving as co-investigators could help shepherd research projects through bureaucratic procedures within the agency/administration. In this case example, a public partner co-investigator might have been able to expedite the review of IRB modifications that, instead, took five months to approve. In this type of research-community partnership, researchers and community partners would share responsibility for leadership (Palinkas, Short, & Wong, 2015). Moreover, such a partnership would be designed to be mutually beneficial: researchers could utilize community settings as natural laboratories for developing, testing, and implementing EBTs, while community partners could shape research questions based on their needs to deliver high-quality services to families. However, it must be acknowledged that permanent public administration positions with which to share leadership may not exist, as staffing in such settings is subject to larger political winds of change. As each new political administration takes office, previous partners may be replaced by new appointees. Moreover, engaging such partners initially may be challenging, as implementation research studies are often highly unpredictable in both process and payoff. Further examination of specific strategies and considerations for engaging such partners is warranted.

Case Study 2: Gateway CALL: Initiative to Improve Access to Needs Driven, Evidence-Based Behavioral Health Services in Child Welfare (US Department of Health and Human Services, Administration for Children and Families, Children’s Bureau, Grant #90CO1104; Grantee: Franklin County Children Services, Evaluator: Bunger)

The second case, the Gateway CALL project, was a five-year (2012–2017) system demonstration project set within a large, county-based public child welfare agency in the Midwest, intended to improve coordination of and access to behavioral health services for youth served by the child welfare agency. Gateway CALL was one of 20 local and state demonstrations funded through the U.S. Children’s Bureau to address the substantial unmet behavioral health service needs among system-involved children (Burns et al., 2004; Horwitz et al., 2012). Child welfare systems and front-line workers serve as gateways to behavioral health services (Leslie et al., 2005; Stiffman, Pescosolido, & Cabassa, 2004). However, children’s access to quality services is often shaped by a variety of non-clinical factors, including front-line workers’ knowledge and ability to recognize behavioral health needs (Stiffman et al., 2000), the availability of evidence-based behavioral health services in the community (Kolko et al., 2009), and the linkages, contracts, and partnerships that connect the two systems (Bunger, Stiffman, Foster, & Shi, 2009; Bunger, Cao, Girth, Hoffman, & Robertson, 2016; Bai et al., 2009; Hurlburt et al., 2004). To address these issues, the Gateway CALL project involved: 1) implementing a universal and electronic behavioral health screening process within child welfare intake units to quickly identify children with trauma exposure and behavioral health concerns; 2) expanding a co-located behavioral health assessment team (staffed by a local behavioral health agency partner) and implementing validated assessment tools; 3) shifting behavioral health referral responsibilities to the co-located behavioral health team; 4) instituting ongoing case monitoring and service plan adjustments within ongoing child welfare units; and 5) reconfiguring the local service array to ensure availability of needed, evidence-based services (if possible).

The grant proposal was initiated by the county child welfare agency, which also served as the primary grantee. Since the project was first funded in 2012, the child welfare agency has contracted with the College of Social Work at The Ohio State University to design and conduct the evaluation. As a condition of funding, all grantees were required to evaluate the effectiveness of the project for improving key child welfare outcomes, including safety, permanence, and well-being. In addition, the Children’s Bureau required cost and implementation studies that examined adoption, fidelity, and sustainment of the practice changes, the strategies used to implement changes, and barriers and facilitators (US Department of Health and Human Services, 2012). To meet these requirements, the lead evaluator (second author) designed a Hybrid Type 1 study that combined intervention effectiveness testing with an examination of implementation (Curran et al., 2012).

To implement these new changes in a feasible but rigorous way, a dynamic wait-listed design was planned (a staged roll-out, or a stepped wedge design). New screening procedures were rolled out in stages across organizational units, which were matched with comparison units (based on size, worker experience, etc.) and assigned to cohorts based on the supervisor’s administrative load (Brown et al., 2017). This approach allowed us to compare changes in key indicators over time (before and after the intervention), and across experimental and comparison groups while refining and testing the effectiveness of the strategies used to implement the intervention. To implement practice changes within the structure of the staged roll-out, we used a multifaceted approach that included planning, work-plan tailoring, training, coaching, small tests of change, extensive quality management system and tool development, and audit/feedback (Bunger, Powell, Robertson, MacDowell, Birken & Shea, 2017). The evaluation studies leveraged this staged roll-out design, and drew on mixed methods and data sources (primary survey data, child welfare case records, Medicaid billing records, and focus group transcripts).
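The cohort-by-cohort logic of such a staged roll-out can be sketched in a few lines of code. The sketch below is our own illustration of a generic stepped-wedge schedule, not the study's actual assignment procedure; unit names, cohort counts, and crossover timing are hypothetical.

```python
# Illustrative sketch (ours, not the project's actual procedure) of a
# stepped-wedge / dynamic wait-listed roll-out: units are assigned to
# cohorts, and each cohort crosses from comparison to intervention at a
# successive time period. Unit names and parameters are hypothetical.
from itertools import cycle

def wedge_schedule(units, n_cohorts, n_periods):
    """Return, for each period, the sorted list of units that have
    crossed over into the intervention condition."""
    cohorts = {c: [] for c in range(n_cohorts)}
    # Round-robin assignment stands in for matching on size, experience, etc.
    for unit, c in zip(units, cycle(range(n_cohorts))):
        cohorts[c].append(unit)
    # Cohort c crosses over at period c + 1; period 0 is all-baseline.
    return [
        sorted(u for c, members in cohorts.items() if t >= c + 1 for u in members)
        for t in range(n_periods)
    ]

schedule = wedge_schedule(["unit_A", "unit_B", "unit_C", "unit_D"],
                          n_cohorts=2, n_periods=3)
# schedule[0] is empty (all units at baseline); by the final period,
# every unit has crossed into the intervention condition.
```

Because every unit contributes both pre- and post-crossover observations, this structure supports the before/after and experimental/comparison contrasts described above.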

Initial relationship development and engagement model.

The relationship between the lead evaluator (second author) and the child welfare and project management partners was brokered by the dean of the College of Social Work in the summer of 2012. In need of an external evaluator to assist in the preparation of the project proposal, the partners approached the dean, who connected them to the second author (who was slated to begin a tenure-track faculty appointment in Fall 2012). Initial project and evaluation planning (before proposal submission) took place over the phone and by email but shifted to in-person meetings immediately after the funding notification in October 2012.

As the grantee, the child welfare agency partner assumed project leadership and designated a project director from within the agency. The project director convened a core leadership group comprised of staff, supervisors, and a representative from the behavioral health partner organization. This leadership team met one to two times per month and was responsible for project design, implementation, and logistics. Notably, the project director emphasized inclusion of worker perspectives; therefore, additional implementation teams were formed for each intervention component. These teams were comprised of the supervisors of the units responsible for implementation, and they played a direct role in tailoring the project’s design and its implementation blueprint. The lead evaluator also participated in these teams, offering advice and information to guide decision making, articulating the main evaluation questions, and designing the evaluation studies. Thus, the partnership was community-driven, with the purpose of making sustained changes to the organization and practice.

Partnership benefits.

One of the benefits of working so closely with child welfare and behavioral health partners was the potential for developing programs and implementation plans that optimized fit with the local context. The agency partner led the design of the project and its implementation with the unique contexts of everyday child welfare and behavioral health practice in mind. As a result, the new screening, assessment, and reassessment procedures were designed to fit existing case flows and timelines, and implementation leveraged naturally existing strengths such as formal and informal agency champions and leaders, training times and infrastructure, and in-house quality management capacities. Perhaps because of this close consideration of context, there was initially minimal front-line resistance to change and high uptake of the new screening and assessment process.

Second, the partnership with a university-based external evaluation team lent credibility to the child welfare agency partner and connected the agency with additional capacities, resources, and expertise (e.g., informatics, behavioral health services and implementation research, organizational change). For instance, to inform selection of the new screening and assessment tools, the evaluation team prepared a manual that compiled and presented available instruments. The team’s expertise in clinical practice and instrumentation research facilitated a balanced comparison of the tools based on their psychometric performance and practice utility. The lead evaluator also recommended the staged roll-out approach in response to agency concerns about exceeding organizational capacity and resources if the project were implemented agency-wide from the beginning. Drawing on training in implementation and organizational management research, the evaluation team helped structure the project’s implementation in a way that balanced the practical needs of the child welfare agency (gradual ramp-up, small tests of change, and iterative refinements to the project) with evaluation design rigor (comparison groups).

Unique challenges.

Despite the benefits of working so closely with child welfare and behavioral health partners, several challenges compromised rigor and the ability to assess effectiveness. Staff and leadership turnover is inevitable in a long-term project, especially in child welfare settings (United States Government Accountability Office, 2003). However, turnover can shift project direction and introduce delays as a newly configured team establishes a common understanding and vision of the project and its management. With the exception of the lead evaluator and the contracted project managers, there was turnover among all key stakeholders and partners, including the federal program officer, supervisors at the child welfare agency, and staff within the evaluation team. The lead project director from the child welfare agency, who initiated the project, also left the agency for a new position nine months after the project started. The second project director had been part of the leadership team from the beginning, which facilitated continuity. Even so, the partnership was in a continual period of relationship formation and building. Building rapport and shared understanding entailed regular communication, as well as continually revisiting and justifying project goals and design. Each new partner brought new ideas and suggestions, or highlighted constraints requiring adjustments to the project plans. These transitions early in the project led to delays in project development, approval from the funder, and launch.

The most significant stakeholder shift occurred three years into the project. The project began in partnership with an assessment team from a local behavioral health organization that had been co-located within the child welfare agency for the prior 10 years. However, to comply with federal procurement regulations, the child welfare agency partner had to initiate a new competitive request-for-proposals (RFP) process to contract out the assessment services. The original behavioral health assessment partner lost its bid in this process, leading to a disruption in the existing contract. Thus, half-way through implementation, the contracting change introduced a new provider. This shift offered an opportunity to examine and compare the strategies needed to implement the assessment component by two different providers (and to investigate the effect of the provider change on uptake and fidelity using a naturally occurring interrupted time series design). Since little is known empirically about the role of federal regulations and contracting in implementation (see Willging et al., 2016 for an exception), this situation highlighted a valuable lesson for other child welfare agencies and funders interested in pursuing similar initiatives. However, introducing a new provider to the intervention also presented a potential confound to the outcome study and required a reassessment of study design and power. Analyses would have to control for which contracted provider conducted the assessment, which could generate concerns about public comparisons of provider quality.

While the new partner was dedicated to a seamless transition, replicating an established co-located program in less than two months was a challenge. The transition required additional planning meetings, trainings, and coaching to ensure that the new partner replicated the program. During these meetings, the leadership team began to realize the extensive number of decisions made previously. Many of those decisions and their rationales were documented (e.g., the rationale for selecting the screening and assessment tools), although some were not (e.g., the process for submitting and scoring screens completed on paper) and may have been lost with the departure of the former behavioral health partner, suggesting the need for extensive and coordinated documentation efforts.

The evaluation team also needed to amend the IRB protocol to include the new behavioral health partner, and therefore all data collection activities were placed on hold while the IRB reviewed the modification request. Evaluation team members talked frequently with IRB staff to walk through the modifications, and received guidance for communicating the implications of these changes for human subjects protection. As a result, there were no requests for clarification after review, reducing further delay. Once evaluation activities resumed, the evaluation team proceeded with planned focus groups with front-line staff and supervisors to explore barriers and facilitators to ongoing project implementation and sustainment. These focus groups highlighted how the provider transition generated substantial stress for both the child welfare workers and new behavioral health clinicians, as they built relationships and adapted their practice to one another. Focus groups also elicited information about assessment delays, frustration around contract expectations, confusion due to new formats for reporting assessment results, and several other emergent and unanticipated implementation problems that could ultimately influence the project’s sustainability. The external evaluators were in a unique position to share that feedback quickly (without disclosing the participants’ identity) with the child welfare and behavioral health partners so they could respond.

Continual adjustments made to enhance the project’s feasibility chipped away at the rigor of the evaluation design. For instance, the original plan included an agency-wide scale-up, and the target population included all children involved in the system. This scope would have included about 300 front-line workers and over 2,500 children (generating sufficient power for detecting differences between the experimental and comparison groups over time for several outcome measures, as well as for testing other relationships within the data). However, when the new project director was assigned nine months into planning, the leadership team scaled back the population to include only youth brought into legal custody, because of concerns about the legal authority of the child welfare system to suggest services or intervene with families over whom the agency had no jurisdiction (per the agency’s legal counsel). As a result, implementation was targeted toward a smaller population, which in turn included a smaller number of units and workers than originally intended. A full agency-wide scale-up was still possible with the staged roll-out design. However, after the second wave, the child welfare agency decided not to continue scaling to the remaining units because of a backlog of assessments inundating the new assessment team and interest in pursuing a different social and emotional assessment system. In fact, the screening component was discontinued earlier, in September 2016. As a result, the project ended in September 2017 after training eight units (about 60 front-line workers), who screened only about 200 children. The project generated comprehensive information from multiple data sources about the implementation context, workers’ practices, and children’s cases. However, both the implementation and outcome analyses are underpowered to detect anything but a large effect.
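A back-of-the-envelope calculation illustrates why roughly 200 screened children limits detection to large effects. The sketch below is our own illustration using the standard normal approximation for a two-sample comparison of means; it is not the study's actual power analysis, and it ignores clustering within units, which a full analysis would need to address.

```python
# Illustrative (not the study's actual) sample-size approximation for a
# two-sample comparison: n per group ≈ 2 * ((z_alpha/2 + z_beta) / d)^2,
# where d is the standardized effect size (Cohen's d).
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided two-sample test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for alpha
    z_b = NormalDist().inv_cdf(power)          # critical value for power
    return ceil(2 * ((z_a + z_b) / effect_size) ** 2)

for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    print(f"{label} effect (d={d}): ~{n_per_group(d)} per group")
```

With about 100 children per group available, only the large-effect comparison (roughly 25 per group under these assumptions) is comfortably within reach, whereas small and medium effects would require several times the achieved sample.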

Generally, these adjustments were necessary to carry out the project within the ever-changing context of a real-world public child welfare agency. However, managing the time-intensive work associated with a closely partnered project and flexing the study design, while balancing the productivity demands of an early-career investigator, was also challenging. Hundreds of hours were invested in face-to-face planning meetings, and approximately half of the lead evaluator’s work time over the five-year period was dedicated to this project. Yet, because of the lengthy start-up process and delays, there were no publications or scholarly presentations from this project until the third year. These considerations could be serious barriers to conducting community-engaged implementation research for those in tenure-track positions and for doctoral students facing publication pressures. To ensure that the evaluation team had publications, some of the initial needs assessment results were reframed and published in mental health, children’s services, and human service administration journals. However, those data were not originally intended for that purpose and, therefore, the findings likely have limited generalizability.

Recommendations.

Based on these experiences, we offer the following recommendations for early-career implementation researchers.

  1. Articulate mutual expectations. Especially in new relationships, community and academic partners may be unaware of the specific demands and expectations that the other encounters. While all partners were mutually invested in promoting children’s service access, the researcher was unfamiliar with the child welfare agency’s competing political and funding demands, and the child welfare agency was unaware of the researcher’s productivity pressures while on the tenure track. The Gateway CALL project began, continued, and ended without an explicit conversation about partner goals, expectations, and needs from the collaboration. Open and honest communication (at the beginning, throughout the project, and as stakeholders transition) is critical for strong research-practice partnerships (Palinkas, Short, & Wong, 2015) and could have built a more productive relationship between the evaluation team and agency partners.

  2. Obtain support from the home institution. Although there were benefits from specialized IRI mentorship, especially from senior implementation scholars who understood child welfare systems, support from the home institution (in this case, College of Social Work at Ohio State University) was necessary. The College of Social Work values partnerships with community agencies, and therefore key leadership (the dean) was instrumental in brokering the relationship between the child welfare agency and lead evaluator. However, institutional support continued to be necessary as the project progressed. Considering all the project and evaluation design adjustments, mentorship was needed for revising IRB protocols, budgets, and contracts. Additionally, internal advocacy for the project was needed since compliance and financial units within a large university may not be as flexible as needed. Finally, given the time needed for building relationships, the ability to use research funding for course-release without penalty is absolutely essential.

  3. Sequence community partnerships within a research trajectory. Intensive community partnerships can be slow to generate data and scholarly publications due to potentially lengthy ramp-up periods. Scholars, especially early-career scholars, should consider how a community-engaged implementation study is sequenced within their research trajectory so that their productivity is not compromised during the early years on the tenure track. For instance, investigators might consider securing data from which to publish before, or while, planning a new project with a community partner.

Discussion

The preceding case studies present two different models of community partnership: Case 1 describes an investigator-led partnership, and Case 2 describes an agency-led partnership. Despite the unique features of these partnership structures, both cases demonstrate the value and difficulty of conducting community-partnered research. The “real world” nature of both studies increases the relevance of their findings, and the community partners in both cases allowed investigators to accomplish far more than they could have without these committed relationships. At the same time, these partnerships and studies are not easy to manage. While some of the challenges described in these two cases have been acknowledged in other implementation studies, they may be especially difficult for early-career researchers to work through. This discussion focuses on cross-cutting challenges that early-career researchers are likely to face when they conduct community-partnered research, as well as some suggested responses and skills that may smooth the path to developing and maintaining community partnerships (see Table 1 for a summary of the challenges identified in the case studies and corresponding recommendations). We focus specifically on challenges related to building and managing relationships; managing rules and regulations; adapting interventions, implementation approaches, and study designs; and securing resources.

Table 1.

Common challenges and recommendations

Challenge Recommended Response and Skills
Managing Relationships
Building Buy-In with key stakeholders
  • Identify opportunities to network with potential agency champions and decision makers

  • Collaborate with agency stakeholders during grant writing and include them as project directors, PIs, or Co-PIs

  • Communicate with key stakeholders about effective approaches to advocating for the project within the system, and about potential political ramifications

Turnover
  • Plan for orienting new stakeholders on a continual basis

Identifying Mutual benefits
  • Clear conversations about expected benefits at the beginning of the project, and over time especially as stakeholders transition

  • Highlighting how research procedures align with agency and staff goals

Transparency and Communication
  • Regular face to face communication; open and honest communication about expectations, roles, and needs

  • Use of one-page written summaries with clear “asks” of agency staff

Managing Rules and Regulations
Regulatory Constraints
  • Identify regulations that influence your study upfront, and continually manage them

  • Work with universities to manage them

Managing IRBs
  • Identify all layers of review outside of the academic institution (e.g., city, county, state)

  • Facilitate a meeting or phone call between staff at IRBs to negotiate timelines

  • Consult with IRB staff to allow for flexibility within protocol and avoid future modification requests

  • Confirm a mutually acceptable approach for protecting human subjects.

  • Carefully document all procedures and changes

Adaptation - Intervention, Implementation, and Study Design
Delays and Plan Changes
  • Adaptation of project and research design

  • Consider “salvage” strategies, including alternative approaches/designs/analyses for answering core research questions, and alternative publication topics

Shared control of research design
  • Planning meetings

  • Present and consider feasibility, acceptance, and rigor of alternatives

Culture and climate within child welfare agencies
  • Emphasize voluntary nature of the study with front-line workers through multiple venues (e.g., staff meetings, emails, written summaries of project)

Securing Resources
Other unexpected challenges
  • Build a support team and have support structures in place comprised of senior scholars (who can provide expert mentorship), as well as research team members (with local expertise)

  • Inclusive leadership approach as a PI (gather information from multiple perspectives within the research team)

Time and Resources
  • Budget for course release

  • Connect with other sources of expertise (e.g. big data), and mentorship within and across institutions.

Initiating and managing relationships with community partners is one challenge for early-career researchers, who may be new to their communities, unfamiliar with the service landscape, and unconnected to key stakeholders. The experiences described in the two cases demonstrate the role of professional networks in helping investigators develop relationships with community partners, including direct professional connections (Case 1) and those of their mentors (Case 2). In contrast, investigators with more experience and an established reputation within the community might have had less difficulty establishing partnerships, or might have required less assistance from professional networks and mentors. These relationships were initiated during different phases of the research process, demonstrating that while it may be ideal to engage community partners in the early stages of grant writing to ensure buy-in prior to the implementation effort (Proctor, Powell, Baumann, Hamilton, & Santens, 2012), it may not always be possible to do so or to sustain these relationships over time. Indeed, early engagement comes with its own problems, since obtaining research funding is often arduous and community partners may lose interest, lose patience, or get drawn into competing efforts (as happened in Case 1).

Regardless of when a community partnership is initiated, there is an immediate and continued need for all investigators to establish trust and a shared understanding of mutual goals. This begins by acknowledging that the partners will accomplish far more together than either stakeholder group could accomplish alone. The importance of fostering mutual respect during these initial phases (and throughout the research process) cannot be emphasized enough. Berwick (2008) notes, “Academicians and frontline caregivers best serve patients and communities when they engage with each other on mutually respectful terms. Practitioners show respect for academic work when they put formal scientific findings into practice rapidly and appropriately. Academics show respect for clinical work when they want to find out what practitioners know” (p. 1184). Fostering this type of respect requires introspection, openness, and humility as one considers potential differences between academic and community partners and how both stakeholder groups can work together to pursue mutually beneficial aims (Shea et al., 2017). When this type of relationship is established, both stakeholder groups can become champions for the other, and it becomes much easier to identify “win-wins.” Indeed, other child welfare scholars have attributed partnership success to strong relationships, continued engagement for sustaining buy-in, and working through challenges as they emerge (Akin, Strolin-Goltzman, & Collins-Camargo, 2017).

Developing relationships requires a set of strategies common to all investigators regardless of experience: regular face-to-face communication; open and honest dialogue regarding expectations, goals, and needs; and a commitment to working through differences and conflicts. Other implementation scholars have highlighted how partners come to the table with different expectations, needs, and goals. For instance, Kilbourne and colleagues (2012) acknowledge how open communication and collaborative problem solving helped academic and community partners work through disagreements about the timing and scope of implementation efforts that were driven by different professional demands. Shea et al. (2017) recommend using plain language that is culturally sensitive, engaging in active listening and productive conflict resolution techniques, and clarifying misunderstandings respectfully. Frank discussion of goals, expectations, needs, and even mistakes may have further strengthened the working relationship with partners in Case 1’s second site, while the lack of open communication may explain some of the partnership challenges in Case 2. These habits of communication become particularly important when staff turnover disrupts implementation processes (Beidas et al., 2016; Woltmann et al., 2008), as building buy-in with new staff is necessary (as in Case 2). Turnover and other unexpected changes throughout the implementation process demand that stakeholders exhibit patience, flexibility, persistence (i.e., tenacity), and a healthy sense of humor.

Managing rules and regulations is a challenge inherent to any research involving human subjects; however, regulatory complexity increases as researchers partner with additional stakeholder groups and organizations. There is, moreover, an additional hurdle to overcome when considering the relative experience of the investigators themselves, who may not be acquainted with a jurisdiction’s regulatory landscape. Case 1 provides an example of how the intervention was incompatible with the first state’s labor laws. In Case 2, a key implementing partner was replaced due to contracting regulations. These issues are common challenges for partnered research, especially in public child welfare settings, which as governmental entities are subject to a variety of labor, accountability, and procurement regulations. For instance, in a major study of SafeCare implementation (a child maltreatment prevention intervention), a state-wide shift in procurement and reimbursement processes threatened sustainment of the intervention among contracted providers (Willging et al., 2016; Willging et al., 2015). Data sharing policies (especially when involving HIPAA-covered entities) are also likely key considerations in partnerships (Akin, Strolin-Goltzman, & Collins-Camargo, 2017), although they did not emerge as significant challenges in these two cases. Although policy and regulatory changes are outside the researcher’s control, ideally, one would identify regulatory constraints upfront and work with relevant entities to find workable compromises. This requires an intimate knowledge of the characteristics of partnering community agencies and service settings (Shea et al., 2017), suggesting the utility of deep and prolonged engagement with community partners over time. In fact, Hurlburt and colleagues (2014) emphasize engaging stakeholders in a thorough assessment of the intervention’s fit within existing policies, practices, and contracts.
Since early-career scholars may have limited familiarity with their partners’ practice settings, mentorship from other experienced researchers and ongoing conversations with partners are critical for understanding how interventions, implementation, and research designs fit within the regulatory environment.

Researchers must also identify all layers of review outside of the academic institution (e.g., city, county, state), facilitate meetings between IRB personnel to negotiate timelines, and confirm a mutually acceptable approach for protecting human subjects. The experiences described in this paper suggest that it may be particularly helpful for researchers to work closely with their IRBs to establish a flexible protocol that allows them to avoid frequent requests for modifications and the attendant work delays. This particular “lesson learned” can differentiate new investigators from experienced ones, who may already possess this level of institutional knowledge. Relatedly, Goodyear-Smith and colleagues (2015) underscore the importance of IRBs embracing a much more flexible approach for implementation research, which often involves the co-design of interventions and implementation strategies. Specifically, they note that IRBs should be encouraged to 1) receive broad training in the wide range of research methods and study designs pertinent to implementation research; 2) establish ground rules for participatory research applications; 3) acknowledge the benefits of power-sharing in implementation research; and 4) demonstrate a commitment to the principle of emergence and continued learning, as implementation science is rapidly developing, requiring both IRB members and investigators to stay abreast of cutting-edge approaches to partnered research.

A third cross-cutting challenge (and opportunity) for researchers of all levels relates to the potential need to adapt interventions, implementation strategies, and study designs. While it is prudent to anticipate challenges with community partners, unexpected challenges that demand shifts in approach will inevitably emerge. As investigators from three child welfare demonstration sites note, rigid adherence to original protocols is likely to lead to implementation and study failure (Akin, Strolin-Goltzman, & Collins-Camargo, 2017), and in both cases described in this paper, researchers had to adapt. In the best of circumstances, changes to interventions, implementation strategies, research methods, and designs would be in the service of maximizing rigor and relevance; however, they can also pose threats to rigor (as in Case 2). In addition to selecting a research design that maximizes internal and external validity (Brown et al., 2017; Mercer, DeVinney, Fine, Green, & Dougherty, 2007), investigators can enhance rigor by carefully documenting changes in the study protocol. This process is aided by increasingly sophisticated typologies and methods that can be used to document specific adaptations to interventions or implementation strategies (Barrera, Berkel, & Castro, 2017; Stirman, Miller, Toder, & Calloway, 2013), ensure that interventions and strategies are reported in sufficient detail to be replicated (Albrecht, Archibald, Arseneau, & Scott, 2013; Proctor, Powell, & McMillen, 2013), and track implementation strategies prospectively (Bunger et al., 2017). Using these guidelines and methods provides room for flexibility and adaptation based upon community partner input, while also increasing the ability to understand how implementation processes and strategies influence implementation and clinical outcomes.
Sometimes implementation challenges at multiple levels will threaten the viability and successful execution of a study as planned, necessitating the use of implementation “salvage” strategies to save a study from being terminated (Dunbar et al., 2012; Hoagwood, Chaffin, Chamberlain, Bickman, & Mittman, 2011). Relatedly, the inherent time-intensiveness of community partnered research coupled with unexpected delays can stymie the production of publications and grant submissions (as in the second case). We recommend that new investigators, in particular, mitigate these challenges by employing a twofold approach in which they creatively pursue (1) a wide range of potential publications focusing on the community partnered study (e.g., study protocols, methodology papers, theoretical papers, systematic reviews), as well as (2) publications based upon data from other sources. A practical example is a methodology paper demonstrating a novel approach to tracking implementation strategies prospectively (Bunger et al., 2017), which was developed as part of the Gateway CALL study. That article was pragmatically useful to the evaluation, but may also be important to the field as an exemplar of a new methodological approach.

A final shared challenge pertains to securing resources. This includes building a supportive, interdisciplinary team of investigators composed of senior colleagues and community members who can lend their expertise and knowledge of local service systems to early-career scholars as they build their own capacity. Building interdisciplinary teams is central to mental health services, implementation, and community-partnered research (Landsverk, 2009). In the two presented cases, it was particularly important to ensure that teams included more senior and politically connected investigators who could open doors and provide critical insight and support during difficult times. In the absence of mentors internal to the investigators’ institutions, linkages to external mentors, such as those provided through the IRI, become even more critical. Investigators also found it helpful to embrace an inclusive leadership approach, taking every opportunity to learn from community members and academics on teams. Ultimately, community-based implementation research requires an adaptive problem-solving approach that is facilitated by the input and expertise of multiple stakeholders. In addition to ensuring broad participation of stakeholders, support from the home institution and practical resources such as budgeting for course releases were also critical to the success of these projects. Additional supports to ensure time and resources for community-based research can be drawn from universities’ Clinical and Translational Science Awards through Community Engagement Cores and pilot funding (Zerhouni & Alving, 2006).

Conclusion

Community partnered research represents an outstanding opportunity to increase the relevance and rapid translation of research findings into real world practice. However, the experiences described in this paper also suggest that these partnerships should not be entered into lightly. Both community and academic stakeholders should choose partners wisely, communicate their needs and expectations clearly and formally from the outset, and commit to adaptive problem-solving throughout the research process. Shea and colleagues (2017) recently proposed a framework that outlines researchers’ readiness to engage in community-engaged (CE) dissemination and implementation research. They outline 40 competencies organized into nine domains: 1) perceived value of CE in D&I research; 2) introspection and openness; 3) knowledge of community characteristics; 4) appreciation for stakeholders’ experience with and attitudes toward research; 5) preparing the partnership for collaborative decision-making; 6) collaborative planning for the research design and goals; 7) communication effectiveness; 8) equitable distribution of resources and credit; and 9) sustaining the partnership. These domains and competencies represent goals to strive toward, and points to consider prior to engaging in community partnered research. We recommend that early stage investigators engage in community partnerships “with eyes wide open,” fully recognizing the potential benefits and the myriad challenges. Yet, we fully acknowledge that one cannot anticipate all the challenges and opportunities that will be generated from these types of partnerships. As illustrated in the history of Oregon’s behavioral parent training model (Patterson, Reid, & Eddy, 2002), research trajectories often do not move in a linear, neat fashion. Rather, setbacks, twists and turns, and “cul de sacs” are common, ultimately providing the investigator with new opportunities to learn and make substantial contributions.
This reality speaks to the need to be flexible, tenacious, and creative in solving problems collaboratively as they emerge.

Funding

This work was supported in part by grants and contracts from the National Institutes of Health. The Implementation Research Institute has provided training to GG, AB, and BP (NIMH R25 MH080916; Proctor, PI). GG acknowledges support from the National Institute of Mental Health (R21 MH102544; Gopalan, PI). AB was supported by the Department of Health and Human Services, Administration for Children, Youth and Families, Children’s Bureau (grant number 90CO1104) and the AB InBev Foundation. The findings and discussion do not represent the official view of the Children’s Bureau. BJP was supported in part by the National Institute of Mental Health (L30 MH108060; Powell, PI; R01 MH106510; Lewis, PI; K01 MH113806; Powell, PI) and the National Center for Advancing Translational Sciences (UL1 TR001111; Buse, PI).

Footnotes

Conflict of interest

The authors declare that they have no conflict of interest.

Research Involving Human and Animal Participants

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

References

  1. Aarons GA, Hurlburt M, & Horwitz SM (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23. 10.1007/s10488-010-0327-7
  2. Akin BA, Strolin-Goltzman J, & Collins-Camargo C (2017). Successes and challenges in developing trauma-informed child welfare systems: A real-world case study of exploration and initial implementation. Children and Youth Services Review, 82, 42–52. 10.1016/j.childyouth.2017.09.007
  3. Albrecht L, Archibald M, Arseneau D, & Scott SD (2013). Development of a checklist to assess the quality of reporting of knowledge translation interventions using the workgroup for intervention development and evaluation research (WIDER) recommendations. Implementation Science, 8(1), 52. 10.1186/1748-5908-8-52
  4. Alexander J, Pugh C, Parsons B, & Sexton T (2000). Functional family therapy. In Elliott DS (Ed.), Blueprints for violence prevention: Vol. 8. Functional family therapy (pp. 185–205). Boulder, CO: Institute of Behavioral Science.
  5. American Psychiatric Association. (2000). Diagnostic and statistical manual of mental disorders - text revision (4th ed.). Washington, DC: American Psychiatric Association.
  6. Asen E (2002). Multiple family therapy: An overview. Journal of Family Therapy, 24(1), 3–16. 10.1111/1467-6427.00197
  7. Bai Y, Wells R, & Hillemeier MM (2009). Coordination between child welfare agencies and mental health service providers, children’s service use, and outcomes. Child Abuse and Neglect, 33(6), 372–381. 10.1016/j.chiabu.2008.10.004
  8. Baker EA, Homan S, Schonhoff SR, & Kreuter M (1999). Principles of practice for academic/practice/community research partnerships. American Journal of Preventive Medicine, 16(3), 86–93. 10.1016/S0749-3797(98)00149-4
  9. Barkin S, Schlundt D, & Smith P (2013). Community-engaged research perspectives: Then and now. Academic Pediatrics, 13(2), 93–97. 10.1016/j.acap.2012.12.006
  10. Barrera M, Berkel C, & Castro F (2017). Directions for the advancement of culturally adapted preventive interventions: Local adaptations, engagement, and sustainability. Prevention Science, 18(6), 640–648. 10.1007/s11121-016-0705-9
  11. Barth RP (2009). Preventing child abuse and neglect with parent training: Evidence and opportunities. The Future of Children, (2), 95–118.
  12. Barth RP, Wildfire J, & Green RL (2006). Placement into foster care and the interplay of urbanicity, child behavior problems, and poverty. American Journal of Orthopsychiatry, 76(3), 358–366. 10.1037/0002-9432.76.3.358
  13. Beidas RS, Marcus S, Wolk CB, Powell B, Aarons GA, Evans AC, … Mandell DS (2016). A prospective examination of clinician and supervisor turnover within the context of implementation of evidence-based practices in a publicly-funded mental health system. Administration and Policy in Mental Health and Mental Health Services Research, 43(5), 640–649. 10.1007/s10488-015-0673-6
  14. Berwick DM (2008). The science of improvement. Journal of the American Medical Association, 299(10), 1182–1184. 10.1001/jama.299.10.1182
  15. Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, … Cruden GH (2017). An overview of research and evaluation designs for dissemination and implementation. Annual Review of Public Health, 38, 1–22. 10.18131/G35W23
  16. Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, & Shea C (2017). Tracking implementation strategies: A description of a practical approach and early findings. Health Research Policy and Systems, 15(1), 15. 10.1186/s12961-017-0175-y
  17. Bunger AC, Stiffman AR, Foster KA, & Shi P (2009). Child welfare workers’ connectivity to resources and youth’s receipt of services. Advances in Social Work, 10(1), 19–38.
  18. Bunger A, Cao Y, Girth A, Hoffman J, & Robertson H (2016). Constraints and benefits of child welfare contracts with behavioral health providers: Conditions that shape service access. Administration and Policy in Mental Health and Mental Health Services Research, 43(5), 728–739. 10.1007/s10488-015-0686-1
  19. Burns BJ, Phillips SD, Wagner HR, Barth RP, Kolko DJ, Campbell Y, & Landsverk JA (2004). Mental health need and access to mental health services by youths involved with child welfare: A national survey. Journal of the American Academy of Child and Adolescent Psychiatry, 43(8), 960–970. 10.1097/01.chi.0000127590.95585.65
  20. Cabassa LJ, Gomes A, Meyreles Q, Capitelli L, Younge RG, Dragatsi D, … Lewis-Fernández R (2014). Using the collaborative intervention planning framework to adapt a health-care manager intervention to a new population and provider group to improve the health of people with serious mental illness. Implementation Science, 9(1), 178. 10.1186/s13012-014-0178-9
  21. Cairns RB, Cairns BD, & Neckerman HJ (1989). Early school dropout: Configurations and determinants. Child Development, 60(6), 1437–1452.
  22. Chacko A, Gopalan G, Franco LM, Dean-Assael KM, Jackson JM, Marcus S, … McKay MM (2015). Multiple family group service model for children with disruptive behavior disorders: Child outcomes at post-treatment. Journal of Emotional and Behavioral Disorders, 23(2), 67–77. 10.1177/1063426614532690
  23. Chambers DA, & Azrin ST (2013). Research and services partnerships: Partnership: A fundamental component of dissemination and implementation research. Psychiatric Services, 64(6), 509–511. 10.1176/appi.ps.201300032
  24. Chorpita BF, & Daleiden EL (2009). Mapping evidence-based treatments for children and adolescents: Application of the distillation and matching model to 615 treatments from 322 randomized trials. Journal of Consulting and Clinical Psychology, 77(3), 566–579. 10.1037/a0014565
  25. Christopher S, Watts V, McCormick Alma Knows His Gun, & Young S (2008). Building and maintaining trust in a community-based participatory research partnership. American Journal of Public Health, 98(8), 1398–1406. 10.2105/AJPH.2007.125757
  26. Curran GM, Bauer M, Mittman B, Pyne JM, & Stetler C (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50(3), 217–226. 10.1097/MLR.0b013e3182408812
  27. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, & Lowery JC (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50. 10.1186/1748-5908-4-50
  28. Durlak JA, & DuPre EP (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350. 10.1007/s10464-008-9165-0
  29. Dunbar J, Hernan A, Janus E, Davis-Lameloise N, Asproloupos D, O’Reilly S, … Carter R (2012). Implementation salvage experiences from the Melbourne diabetes prevention study. BMC Public Health, 12(1), 806. 10.1186/1471-2458-12-806
  30. Feldstein AC, & Glasgow RE (2008). A practical, robust implementation and sustainability model (PRISM) for integrating research findings into practice. The Joint Commission Journal on Quality and Patient Safety, 34(4), 228–243. 10.1016/S1553-7250(08)34030-6
  31. Garland AF, Brookman-Frazee L, Hurlburt MS, Accurso EC, Zoffness RJ, Haine-Schlagel R, & Ganger W (2010). Mental health care for children with disruptive behavior problems: A view inside therapists’ offices. Psychiatric Services, 61(8), 788–795. 10.1176/ps.2010.61.8.788
  32. Garland AF, Hawley KM, Brookman-Frazee L, & Hurlburt MS (2008). Identifying common elements of evidence-based psychosocial treatments for children’s disruptive behavior problems. Journal of the American Academy of Child & Adolescent Psychiatry, 47(5), 505–514. 10.1097/CHI.0b013e31816765c2
  33. Goodyear-Smith F, Jackson C, & Greenhalgh T (2015). Co-design and implementation research: Challenges and solutions for ethics committees. BMC Medical Ethics, 16(1), 78. 10.1186/s12910-015-0072-2
  34. Gopalan G (2016). Feasibility of improving child behavioral health using task-shifting to implement the 4Rs and 2Ss program for strengthening families in child welfare. Pilot and Feasibility Studies, 2(1), 21. 10.1186/s40814-016-0062-2
  35. Gopalan G, Chacko A, Franco LM, Rotko L, Marcus S, & McKay MM (2015). Multiple family groups service delivery model to reduce childhood disruptive behavioral disorders: Outcomes at 6-months follow-up. Journal of Child and Family Studies, 24(9), 2721–2733. 10.1007/s10826-014-0074-6
  36. Gopalan G, Goldstein L, Klingenstein K, Sicher C, Blake C, & McKay MM (2010). Engaging families into child mental health treatment: Updates and special considerations. Journal of the Canadian Academy of Child and Adolescent Psychiatry, 19(3), 182–196.
  37. Hoagwood KE, Chaffin M, Chamberlain P, Bickman L, & Mittman B (2011, March). Implementation salvage strategies: Maximizing methodological flexibility in children’s mental health research. Paper presented at the 4th Annual NIH Conference on the Science of Dissemination and Implementation, Washington, DC.
  38. Horwitz SM, Hurlburt MS, Goldhaber-Fiebert JD, Heneghan AM, Zhang J, Rolls-Reutz J, … Stein REK (2012). Mental health services use by children investigated by child welfare agencies. Pediatrics, 130(5), 861–869. 10.1542/peds.2012-1330
  39. Hurlburt MS, Leslie LK, Landsverk JA, Barth RP, Burns BJ, Gibbons RD, … Zhang J (2004). Contextual predictors of mental health service use among children open to child welfare. Archives of General Psychiatry, 61(12), 1217–1224.
  40. Hurlburt M, Aarons GA, Fettes D, Willging C, Gunderson L, & Chaffin MJ (2014). Interagency collaborative team model for capacity building to scale-up evidence-based practice. Children & Youth Services Review, 39, 160–168. 10.1016/j.childyouth.2013.10.005
  41. Institute of Medicine. (2015). Psychosocial interventions for mental and substance use disorders: A framework for establishing evidence-based standards. Washington, DC: The National Academies Press.
  42. Israel BA, Schulz AJ, Parker EA, & Becker AB (1998). Review of community-based research: Assessing partnership approaches to improve public health. Annual Review of Public Health, 19, 173–202. 10.1146/annurev.publhealth.19.1.173
  43. Kazdin AE, & Whitley MK (2003). Treatment of parental stress to enhance therapeutic change among children referred for aggressive and antisocial behavior. Journal of Consulting and Clinical Psychology, 71(3), 504–515.
  44. Kazdin AE (1995). Conduct disorders in childhood and adolescence (2nd ed.). Thousand Oaks, CA: Sage Publications.
  45. Keiley MK (2002). Attachment and affect regulation: A framework for family treatment of conduct disorder. Family Process, 41(3), 477–493. 10.1111/j.1545-5300.2002.41312.x
  46. Kilbourne AM, Neumann MS, Waxmonsky J, Bauer MS, Kim HM, Pincus HA, & Thomas M (2012). Public-academic partnerships: Evidence-based implementation: The role of sustained community-based practice and research partnerships. Psychiatric Services, 63(3), 205–207. 10.1176/appi.ps.201200032
  47. Kolko D, Herschell A, Costello A, & Kolko R (2009). Child welfare recommendations to improve mental health services for children who have experienced abuse and neglect: A national perspective. Administration and Policy in Mental Health and Mental Health Services Research, 36(1), 50–62. 10.1007/s10488-008-0202-y
  48. Lalayants M (2013). Parent engagement in child safety conferences: The role of parent representatives. Child Welfare, 92(6), 9–42.
  49. Landsverk J (2009). Creating interdisciplinary research teams and using consultants. In Stiffman AR (Ed.), The field research survivors guide (pp. 127–145). New York, NY: Oxford University Press.
  50. Leslie LK, Gordon JN, Lambros K, Premji K, Peoples J, & Gist K (2005). Addressing the developmental and mental health needs of young children in foster care. Journal of Developmental and Behavioral Pediatrics, 26(2), 140–151. 10.1097/00004703-200504000-00011
  51. Loeber R, Burke J, & Pardini DA (2009). Perspectives on oppositional defiant disorder, conduct disorder, and psychopathic features. Journal of Child Psychology and Psychiatry, 50(1–2), 133–142. 10.1111/j.1469-7610.2008.02011.x
  52. McKay MM, Gopalan G, Franco L, Dean-Assael K, Chacko A, Jackson JM, & Fuss A (2011). A collaboratively designed child mental health service model: Multiple family groups for urban children with conduct difficulties. Research on Social Work Practice, 21(6), 664–674. 10.1177/1049731511406740
  53. Mendel P, Meredith L, Schoenbaum M, Sherbourne C, & Wells K (2008). Interventions in organizational and community context: A framework for building evidence on dissemination and implementation in health services research. Administration and Policy in Mental Health and Mental Health Services Research, 35(1), 21–37. 10.1007/s10488-007-0144-9
  54. Mercer SL, DeVinney BJ, Fine LJ, Green LW, & Dougherty D (2007). Study designs for effectiveness and translation research: Identifying trade-offs. American Journal of Preventive Medicine, 33(2), 139–154. 10.1016/j.amepre.2007.04.005
  55. Mikesell L, Bromley E, & Khodyakov D (2013). Ethical community-engaged research: A literature review. American Journal of Public Health, 103(12), e7–e14. 10.2105/AJPH.2013.301605
  56. Minkler M (2005). Community-based research partnerships: Challenges and opportunities. Journal of Urban Health, 82(Suppl 2), ii3–ii12. 10.1093/jurban/jti034
  57. National Institutes of Health. (2016). Dissemination and implementation research in health (R01). Retrieved from https://grants.nih.gov/grants/guide/pa-files/PAR-16-238.html
  58. Patterson GR, Reid JB, & Eddy JM (2002). A brief history of the Oregon model. In Reid JB, Patterson GR, & Snyder J (Eds.), Antisocial behavior in children and adolescents: A developmental analysis (pp. 3–21). Washington, DC: American Psychological Association.
  59. Padgett D (2012). Qualitative and mixed methods in public health. New York, NY: Sage.
  60. Palinkas LA, Short C, & Wong M (2015). Research-practice-policy partnerships for implementation of evidence-based practices in child welfare and child mental health. New York, NY: William T. Grant Foundation.
  61. Pecora PJ, Whittaker JK, Maluccio AN, Barth RP, & Plotnick RD (2000). The child welfare challenge: Policy, practice, and research (2nd ed.). New York: Aldine de Gruyter.
  62. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, … Kirchner JE (2015). A refined compilation of implementation strategies: Results from the expert recommendations for implementing change (ERIC) project. Implementation Science, 10(1), 21. 10.1186/s13012-015-0209-1
  63. Powell B, Beidas R, Lewis C, Aarons G, McMillen J, Proctor E, & Mandell D (2017). Methods to improve the selection and tailoring of implementation strategies. The Journal of Behavioral Health Services & Research, 44(2), 177–194. 10.1007/s11414-015-9475-6
  64. Proctor EK, Powell BJ, Baumann AA, Hamilton AM, & Santens RL (2012). Writing implementation research grant proposals: Ten key ingredients. Implementation Science, 7(1), 96. 10.1186/1748-5908-7-96
  65. Proctor EK, Powell BJ, & McMillen JC (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8(1), 139. 10.1186/1748-5908-8-139
  66. Shea CM, Young TL, Powell BJ, Rohweder C, Enga ZK, Scott JE, … Corbie-Smith G (2017). Researcher readiness for participating in community-engaged dissemination and implementation research: A conceptual framework of core competencies. Translational Behavioral Medicine, 7(3), 393–404. 10.1007/s13142-017-0486-0
  67. Stiffman AR, Hadley-Ives E, Doré P, Polgar M, Horvath VE, Striley C, & Elze D (2000). Youths’ access to mental health services: The role of providers’ training, resource connectivity and assessment of need. Mental Health Services Research, 2(3), 141–154.
  68. Stiffman AR, Pescosolido B, & Cabassa LJ (2004). Building a model to understand youth service access: The gateway provider model. Mental Health Services Research, 6(4), 189–198.
  69. Stirman SW, Miller CJ, Toder K, & Calloway A (2013). Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implementation Science, 8(1), 65. 10.1186/1748-5908-8-65
  70. United States Department of Health and Human Services. (2012). Initiative to improve access to needs-driven, evidence-based/evidence-informed mental and behavioral health services in child welfare: Funding opportunity announcement HHS-2012-ACF-ACYF-CO-0279. Rockville, MD: US Department of Health and Human Services.
  71. United States Government Accountability Office [GAO]. (2003). HHS could play a greater role in helping child welfare agencies recruit and retain staff. Retrieved from http://www.gao.gov/new.items/d03357.pdf
  72. Videka L, Gopalan G, & Bauta B (2014). Child abuse and neglect. In Gitterman A (Ed.), Handbook of social work practice with vulnerable populations (3rd ed., pp. 248–268). New York: Columbia University Press.
  73. Wahler RG, & Dumas JE (1989). Attentional problems in dysfunctional mother-child interactions: An interbehavioral model. Psychological Bulletin, 105(1), 116–130. 10.1037/0033-2909.105.1.116
  74. Wallerstein N, & Duran B (2010). Community-based participatory research contributions to intervention research: The intersection of science and practice to improve health equity. American Journal of Public Health, 100(S1), S40–S46. 10.2105/AJPH.2009.184036
  75. Willging CE, Aarons GA, Trott EM, Green AE, Finn N, Ehrhart MG, & Hecht DB (2016). Contracting and procurement for evidence-based interventions in public-sector human services: A case study. Administration and Policy in Mental Health and Mental Health Services Research, 43(5), 675–692. 10.1007/s10488-015-0681-6
  76. Willging CE, Green AE, Gunderson L, Chaffin M, & Aarons GA (2015). From a ‘perfect storm’ to ‘smooth sailing’: Policymaker perspectives on implementation and sustainment of an evidence-based practice in two states. Child Maltreatment, 20(1), 24–36. 10.1177/1077559514547384
  77. Woltmann EM, Whitley R, McHugo GJ, Brunette M, Torrey WC, Coots L, … Drake RE (2008). The role of staff turnover in the implementation of evidence-based practices in mental health care. Psychiatric Services, 59(7), 732–737. 10.1176/ps.2008.59.7.732
