Author manuscript; available in PMC: 2015 May 1.
Published in final edited form as: Cogn Behav Pract. 2014 Jan 9;21(2):127–133. doi: 10.1016/j.cbpra.2013.12.007

Acting Locally and Globally: Dissemination and Implementation Around the World and Next Door

Michael A. Southam-Gerow, Cassidy C. Arnold, Adriana Rodriguez, Julia R. Cox
PMCID: PMC4298830  NIHMSID: NIHMS640044  PMID: 25620868

Abstract

Murray et al. (this issue) present a fascinating account of their international dissemination and implementation (D&I) research focused on training therapists in Thailand and Iraq to provide youth with a modular treatment called the Common Elements Treatment Approach. In this commentary, we use Murray et al. as a springboard to discuss a few general conclusions about the current direction of D&I research. Specifically, we reflect on current D&I models, highlighting their ecological focus and their emphasis on stakeholder involvement. Next, we discuss the central importance of implementation supports such as treatment programs, training approaches, assessment and outcome monitoring tools, and organizational interventions. We conclude with a consideration of how D&I work that aims to adapt implementation supports for local needs represents a key path to our goal of sustainability.

Keywords: dissemination, implementation, treatment, child/adolescent


In our city, as in many cities around the United States, there has been a strong emphasis on locally sourcing products. For example, many area restaurants proudly list the local farms from which they obtain the ingredients for the items on the menu. This emphasis on thinking locally has a long tradition in the United States. It is thus not surprising that an emphasis on thinking locally has become fashionable in the field of children’s mental health treatment research. Earlier in the history of the field, a strong emphasis was placed on the development of an evidence base to help ameliorate the mental health problems facing many individuals. These early efforts established a wealth of evidence-based treatments and represent a critical achievement for our field. In the last dozen or so years, however, there has been a realization that localization of these evidence-based treatments (EBTs) had been largely neglected. This realization led to the rise of translational and dissemination/implementation science, a burgeoning area of our field. Localizing—that is, adapting to fit specific (local) contexts—has been a key theme in dissemination and implementation science. Consequently, there has been a strong emphasis on understanding stakeholder perspectives and adapting EBTs for specific contexts.

Thus, it was fascinating to read the article by Murray et al. (this issue) because the themes of “thinking globally” and “acting locally” are both strongly emphasized. Their excellent and detailed description of their projects in Iraq and Southeast Asia demonstrates how far we have come in terms of our dissemination and implementation science. The paper also points to some emerging themes for our field to focus on moving forward. In this brief commentary, we use the Murray et al. paper as a launching point to discuss several issues related to the broader goal of going global by staying local—that is, disseminating what works best by learning how to tailor our implementation efforts to local needs and preferences. We start by providing a quick overview of some of the frameworks currently guiding dissemination and implementation science. Next, we introduce the notion that one important aspect of implementation efforts concerns the how and what of supports provided to the “end-users” of the evidence-based treatment, specifically focusing on characteristics of the supports. We conclude by discussing how best to move toward sustainability of our implementation efforts.

We start by briefly reviewing how we got here: how it is that, after so many years and so much effort focused on identifying “universal” EBTs, we have moved toward a renewed emphasis on the importance of local needs. Others have trod this ground before us, so our discussion here will be brief (e.g., Aarons, Hurlburt, & Horwitz, 2011; Proctor et al., 2009; Schoenwald & Hoagwood, 2001; Southam-Gerow, Rodriguez, Chorpita, & Daleiden, 2012). From the 1950s to the 1990s, the field emphasized the development of generalizable knowledge about treatments (Chorpita et al., 2011; Southam-Gerow & Prinstein, in press; Strupp & Howard, 1992). Given emerging epidemiological data suggesting high rates of psychopathology among children and adolescents in the United States and other countries (e.g., Merikangas et al., 2010; Rescorla et al., 2012), scientists focused their efforts on developing and testing psychosocial, pharmacological, and combined treatments for these problems. As most readers know, this work led to a highly influential body of evidence that has had a profound and critically important public health impact (e.g., Chorpita et al., 2011). We now have a large number of EBT programs that address many of the mental health problems children and adolescents face. However, the field quickly discovered that the “if you build it, they will come” (or, more appropriately, if you research it, therapists will deliver it) approach to dissemination of EBTs was not going to be sufficient. Instead, the emergence of dissemination and implementation (D&I) science helped the field find a way forward: promoting greater public health by identifying barriers to D&I and then devising interventions to overcome them.

One early emphasis of D&I science has been the elaboration of frameworks through which to conceptualize the challenges facing the field and to guide efforts to overcome those challenges. Although a thorough review of the models that have been proposed is beyond the scope of this commentary, it is worth noting that, by and large, the various frameworks share many similarities (see, e.g., Meyers, Durlak, & Wandersman, 2012; Southam-Gerow, Arnold, Bair, & Cox, under review). First, many models acknowledge and address the complex nature of the forces acting on dissemination and implementation by accounting for the influence of variables at multiple levels. For example, both the Mental Health Services Ecological model (e.g., Schoenwald & Hoagwood, 2001; Southam-Gerow et al., 2012; Southam-Gerow, Ringeisen, & Sherrill, 2006) and Proctor et al.’s (2009) Implementation Research Model highlight the different levels of the ecology to consider when planning D&I science. Specifically, the models describe how child, family, therapist, team, organization, and/or system variables may be important in D&I efforts. For instance, therapist attitudes about the use of EBTs, levels of family stress, and organizational culture may each influence the success of an EBT implemented in a community setting.

Aarons and colleagues (2011) emphasize similar notions with their concepts of “inner” and “outer” contexts as influences on implementation in public service sectors (cf. Damschroder & Hagedorn, 2011). By inner context, they refer primarily to factors within an agency or organization, such as characteristics of the organization or of its employees. By outer context, they refer to a broader set of variables, including the service system setting and the interrelations among different organizations in that setting. The notion that one must appreciate these various levels of influence on the implementation of an innovation (like an EBT) applies to both localized and global D&I science. Indeed, the context of low- and middle-income countries (LMIC), which often have limited mental health infrastructures (e.g., organizations, workforce, policies, funding), underscores the relevance of these D&I models.

Another characteristic shared across many D&I frameworks is the idea that the process of implementation may involve several stages or phases. Aarons et al. (2011) provide a comprehensive example in their well-written review of the mental health, public health, organizational development, and business research literatures. In that paper, they identified four thematic phases relevant to D&I work: Exploration (e.g., understanding the organizational issues at hand, such as how funding contexts or organizational culture influence EBP adoption), Adoption Decision/Preparation (e.g., factors that contribute to the adoption of EBPs, such as academic-public partnerships), Implementation (e.g., EBP structural fit), and Sustainment (e.g., fidelity support or staffing). At each phase, the authors describe how one needs to consider factors within the outer and inner contexts, as well as the interconnections between the two. In addition to this phase-based commonality across D&I models, there is also broad recognition that implementation processes require flexibility, with movement through the phases often occurring in a nonlinear fashion.

A final commonality across frameworks relates to their emphasis on the importance of identifying a process to involve stakeholders (or adopters) in, and integrate their feedback into, D&I efforts. There is increasing recognition in D&I science that the adopters’ perspectives on relevance, advantages, clarity, and replicability of the innovation are critical. Indeed, there are numerous D&I efforts that use partnership and participatory action research approaches to engage with stakeholders (e.g., Baptiste et al., 2006; Fox, Mattek, & Gresl, in press; Lyon et al., in press; Southam-Gerow, Hourigan, & Allin, 2009). Murray et al. (this issue) include excellent examples of involving stakeholders in the process of implementation, emphasizing particularly the bidirectionality of EBT implementation. Although Murray et al. brought with them important and scarce (in the settings studied) knowledge (e.g., TF-CBT), they also strove to build in feedback and feed-forward processes to help maintain stakeholder involvement, thereby improving chances of the endeavor’s success.

Their experience and method may be a fruitful guide for future research. Learning about and from perceptions of stakeholders across the ecology of the context before implementation may be a crucial step within LMIC. Such preparatory work could focus on identifying what barriers exist for clients and families, therapists, agencies, etc. As they found, traditional, US-developed training and supervision models may not always be feasible.

Implementation Supports and Their Characteristics

According to D&I scientists, thinking and planning across the ecology helps to maximize our chances for success in dissemination and implementation efforts in LMIC (as well as in Western countries). The Murray et al. paper (this issue) highlighted the importance of carefully assembling a strong armamentarium of implementation supports. As noted at the outset, there was great emphasis in the middle and late 20th century on developing treatment programs: step-by-step guides to performing specific treatment approaches. And certainly, treatment programs represent a critical implementation support. However, as D&I researchers have realized, there are a number of other supports that may be relevant but that remain understudied. In this section, we discuss a few of the implementation supports present in Murray et al.—and offer a few others that represent areas for future work.

Understandably, Murray et al. (this issue) employed the implementation support most studied to date: a specific treatment program, in their case the Common Elements Treatment Approach (CETA). CETA includes treatment elements or modules derived from evidence-based treatments for depression, traumatic stress, and anxiety. One valuable characteristic of CETA and other module-based treatment packages (e.g., Weisz et al., 2012) is flexibility, a feature of particular importance when dissemination to a variety of settings is desirable. To demonstrate the flexibility of CETA and the collaborative nature of D&I work using feedback and feed-forward, Murray et al. added a substance-use module because of the high rates of alcohol use identified during qualitative phases of their project. By selecting a treatment program that was flexible, multi-problem focused, and responsive to local needs, Murray et al. created treatment content that was both high quality and relevant to the target population.

A general principle to consider for future D&I efforts concerns characteristics of the treatment program, such as its level of structure, flexibility, and complexity. Some implementation situations may require a highly adaptable approach, like CETA. As Weisz et al. (2012) found, the more flexible modular approach they tested was superior to the more structured approach in community mental health settings, where cases have high complexity (e.g., Southam-Gerow, Chorpita, Miller, & Gleacher, 2008). Other implementation settings may benefit from a more structured treatment, for example, settings where many clients present with a single focal problem (e.g., depression or anxiety).

A final note about treatment programs is warranted as it relates to implementation supports. As treatment protocols have evolved (and become more commercial), a variety of training materials have been developed that themselves represent supports. For example, some treatment programs provide worksheets or parent handouts that are designed to increase the acceptability and utility of the program. These supports have the potential to be useful in some settings, and future work could examine whether their use confers benefits.

Another key implementation support is training and supervision—an area that has only recently become the focus of scientific inquiry. Recent reviews of the effectiveness of different training modalities highlight that, although self-reported knowledge may increase, traditional workshop trainings result in minimal behavior change (Beidas & Kendall, 2010; Herschell, Kolko, Baumann, & Davis, 2010). The inclusion of more active training techniques (e.g., small group practice, behavioral rehearsal with feedback, ongoing coaching) has resulted in improved trainee skill (e.g., Cross et al., 2011). Beyond training, there has been an increased emphasis on the importance of ongoing consultation and/or supervision after a training is complete (e.g., Dorsey et al., 2013; Schoenwald, Sheidow, & Chapman, 2009); such support has been shown to improve the uptake of EBPs within a trainee’s clinical practice (e.g., Beidas, Edmunds, Marcus, & Kendall, 2012). With these new lines of research, the field is moving toward a more nuanced understanding of therapy as it is delivered in the real world.

Assessment represents a third category of implementation support and one that was highlighted by Murray et al. (this issue). Specifically, one obstacle to implementation was the lack of training related to assessment and case identification among mental health staff in Iraq and Thailand. Astutely, the authors recognized the need to address this obstacle, reasoning that identifying cases appropriate for CETA was critical to their mission. Accordingly, they developed a package of locally normed assessment measures and provided training in how to use them. The measures, with different versions for each location, included checklists of locally relevant symptoms associated with the various treatment targets of CETA. Murray et al. should be commended for their use of a local sample to collect normative and validity data. The training for the lay counselors and local supervisors included vignettes complete with assessment data so that counselors could practice identifying an appropriate sequence of treatment modules, along with times when the dose of each module should be modified. Again, the implementation support needed here was flexible and the training required was comprehensive. In other settings, with higher numbers of already trained mental health professionals, less training may be required. It is worth noting that our evidence base for treatments relies on high-quality and accurate case identification. However, to date, very few implementation efforts have trained “end-users” to conduct the assessment needed for case identification.

Another measurement-related implementation tool warrants attention: monitoring of treatment progress. Ostensibly, the goal of psychological treatment is to improve the functioning and/or reduce symptoms in an individual. For the most part, good tools are lacking to help therapists monitor treatment progress. Even in efficacy studies, therapists are in general “flying without instruments” insofar as they are not involved in measuring, nor are they privy to others’ measurement of, client progress on standardized or idiographic measures. In this sense, ongoing monitoring of treatment progress remains stuck in the oral tradition, with therapists left to ask clients how treatment is going and clients offering a nonstandardized reply.

In treatment studies, of course, outcome assessment is paramount. That is, data (often lots of data!) are available but are not shared with therapists. Recently, there has been an emphasis on the importance of providing feedback to therapists on the outcomes of their clients during and after treatment. For example, Lambert and colleagues have spent more than a decade developing and testing measures that can be used in therapy practice with adults and youth to provide useful feedback on client progress, demonstrating in some studies that providing that information alone (i.e., with no other training) improves therapist effectiveness (e.g., Lambert, 2005; Lambert, Harmon, Slade, Whipple, & Hawkins, 2005). Similarly, Southam-Gerow, Daleiden, et al. (in press) describe the Managing and Adapting Practice (MAP) approach, an evidence-informed system of care provision that includes a therapist-maintained clinical dashboard that monitors client progress. They report strong preliminary effects observed in a county-wide implementation of MAP, with effect sizes ranging from .59 to .80.

There are other important supports to consider, especially at the organizational level. Murray and colleagues (this issue) identify two organizational challenges they encountered: (a) the unavailability of a skilled mental health workforce and, relatedly, (b) the lack of higher-level mental health professionals able to make well-informed clinical decisions (i.e., determining treatment focus, choosing specific interventions, and selecting the appropriate dose). Thus, one additional support would be interventions to improve the readiness of a setting for the implementation of a particular intervention. Developers of other interventions have emphasized this sort of organization-level implementation support. One example is multisystemic therapy (MST; Henggeler, Melton, Brondino, Scherer, & Hanley, 1997; Henggeler, Schoenwald, Borduin, Rowland, & Cunningham, 2009), an intensive, family-based intervention originally developed to address serious behavior problems in adolescents. The MST model includes a formal “site assessment” to determine the feasibility of program start-up and sustainability. Based on the initial assessment, stakeholders can begin arranging and developing the infrastructure that will support MST, including a set of “required” practices (e.g., full-time master’s-level therapists who hold a limited caseload, tracking of therapist fidelity and family-level outcomes). These pre-implementation supports are time and resource intensive, but evidence suggests that they pave the way for effective and efficient treatment.

Whereas MST attends to specific organizational variables to promote the implementation and sustainability of the program itself, the Availability, Responsiveness and Continuity (ARC) organizational intervention was developed as a stand-alone program to provide community agencies with the structure, tools, and procedures to support the delivery of effective services more broadly (Glisson & Schoenwald, 2005). The goal of the 18-month ARC intervention is to improve mental health service delivery by targeting the organizational social context (e.g., increasing teamwork, goal setting, feedback). Clinicians and management staff alike are given access to specific tools to promote efficient and effective service delivery. Clinicians’ attitudes and behaviors are also targeted, in an effort to promote commitment to the agency, flexibility, and openness to change. In a recent trial, youths served by agencies in which ARC was implemented had significantly better clinical outcomes than youths served by control agencies; indeed, the youths served by the agencies with the most improved organizational social context demonstrated the most clinical improvement (Glisson et al., 2013). The success of an organizational intervention such as ARC, without the additional implementation of a specific EBP, dramatically highlights the importance of agency structure and functioning, a topic that is becoming a more common focus of research (e.g., Lewis & Simons, 2011). Embedding such supports within the fabric of an agency may also help make improved outcomes sustainable.

Toward Sustainability

Murray et al. provide an excellent survey of relevant implementation supports for our field. Moving forward, it is clear that development of these therapist-, organization-, and community-level implementation supports is important to ensure that an implemented program endures. Indeed, one major reason for the current local/global synergy in mental health research is the critical importance of sustainability. There is an understanding that funding and human effort are finite. To solve some of our biggest problems, like widespread dissemination and implementation of state-of-the-science interventions for childhood mental health problems, we need to structure our supports for the long term—so that any changes that occur will be long-lasting. We conclude this commentary with a few observations, inspired by the work of Murray et al., on how to foster sustainability.

First, consistent with Murray et al.’s (this issue) approach, localizing expertise is a crucial step toward sustainability. We need implementation models that quickly move away from the need for outside expert guidance and toward localized expertise or competence with the intervention. The supports included in Murray et al.’s studies are excellent models for future research. Further, we find the recent emphasis on training, supervision, and consultation models encouraging (e.g., Beidas & Kendall, 2010; Fritz et al., 2013; Herschell et al., 2010; Leffler, Yo, West, McCarty, & Atkins, 2013), as it is through these processes that the seeds of sustainability are sown. Identifying optimal training and consultation approaches to maximize sustainability represents a key focus for future research. As one example, Southam-Gerow et al. (in press) compared two training methods: training by experts versus training by local trainers (themselves trained via a train-the-trainer approach). They found that therapists trained by either method achieved the credentialing standard for the treatment at comparably high rates.

A second set of relevant foci for maintaining sustainability concerns broader systems-level influences. Arguably the most important of these is funding. EBTs are often initially developed and funded with federal grant dollars, a funding stream that is not sustainable. The implementation of an EBT will often require ongoing funding to provide access to the supports needed to maintain the expertise of current therapists and/or develop expertise in new therapists. Thus, part of implementing a program involves identifying how to maintain funding moving forward. The developers of MST have been among the most proficient in this regard. As noted, identification of the funding stream is built into MST implementation (Henggeler & Schoenwald, 2011). Specifically, prior to implementation of MST in a community, a funding stream is identified by the agency (or agencies) seeking to bring MST into the community. Relevant stakeholders (e.g., municipal fiscal officers, municipal administrators, juvenile justice personnel) are consulted and involved to ensure that the funding is sustained.

Another example is the recent work documented by Southam-Gerow and colleagues (in press) in Los Angeles County. There, the developers of MAP partnered with the local mental health authority and the state government training and quality assurance organization to implement an innovative approach to evidence-informed care. The endeavor relied on a stream of funding identified by the local mental health authority, to which the developers tailored their approach. Specifically, as part of a transformation of the county mental health system, the LA County Department of Mental Health identified a set of EBTs that would be the only services reimbursed via one particular and large funding source, a stream of dollars funded through the Mental Health Services Act (MHSA). The MHSA was designed to help the state transform mental health services by levying a 1% tax on each dollar of income earned over $1M. By the end of 2011, the MHSA had generated more than $6.5B in additional revenue, with projected revenues of more than $750M per year for the next several years (California Department of Mental Health, 2011). As one of the EBTs selected for this funding source, MAP was well positioned for sustainability given the ongoing source of funding.

In their paper, Murray and colleagues (this issue) briefly discuss the challenges of providing mental health services to individuals in low- and middle-income countries and articulate the long-term financial benefit of CETA relative to multiple single-focus treatments, but they do not discuss long-term funding of CETA for the populations with whom they are working. Without funding from local sources (governmental, nongovernmental, and private), the benefits of training local supervisors and counselors will likely fade away.

Like our local restaurateurs, we D&I scientists aspire to “think globally and act locally.” Our work aims to improve local circumstances through application of (hopefully) generalizable science through a process of partnership and adaptation. We seek to improve local circumstances because we recognize that, in doing so, we improve global circumstances. Murray et al. remind us to be at once proud of the knowledge we have to share with the world and modest enough to recognize our need for the local expertise of those with whom we partner internationally. That same spirit of sharing expertise with modesty, through partnership, has been a successful model here in the United States as well (e.g., Aarons & Chaffin, 2013; Saldana & Chamberlain, 2013; Southam-Gerow et al., in press). We hope that together, we can identify the key implementation supports—and the best ingredients for those supports—to continue the effort to reduce the burden of mental health problems among children and families all over the world.


References

  1. Aarons G, Chaffin M. Scaling-up evidence-based practices in child welfare services systems. CYF News. 2013. Retrieved from http://www.apa.org/pi/families/resources/newsletter/2013/04/child-welfare.aspx
  2. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health. 2011;38(1):4–23. doi: 10.1007/s10488-010-0327-7
  3. Baptiste DR, Bhana A, Petersen I, McKay M, Voisin D, Bell C, Martinez DD. Community collaborative youth-focused HIV/AIDS prevention in South Africa and Trinidad: Preliminary findings. Journal of Pediatric Psychology. 2006;31(9):905–916. doi: 10.1093/jpepsy/jsj100
  4. Beidas RS, Edmunds JM, Marcus SC, Kendall PC. Training and consultation to promote implementation of an empirically supported treatment: A randomized trial. Psychiatric Services. 2012;63:660–665. doi: 10.1176/appi.ps.201100401
  5. Beidas RS, Kendall PC. Training therapists in evidence-based practice: A critical review of studies from a systems-contextual perspective. Clinical Psychology: Science and Practice. 2010;17(1):1–30. doi: 10.1111/j.1468-2850.2009.01187.x
  6. Chorpita BF, Daleiden EL, Ebesutani C, Young J, Becker KD, Nakamura BJ, Starace N. Evidence-based treatments for children and adolescents: An updated review of indicators of efficacy and effectiveness. Clinical Psychology: Science and Practice. 2011;18:154–172. doi: 10.1111/j.1468-2850.2011.01247.x
  7. Cross WF, Seaburn D, Gibbs D, Schmeelk-Cone K, White AM, Caine ED. Does practice make perfect? A randomized control trial of behavioral rehearsal on suicide prevention gatekeeper skills. Journal of Primary Prevention. 2011;32:195–211. doi: 10.1007/s10935-011-0250-z
  8. Damschroder LJ, Hagedorn HJ. A guiding framework and approach for implementation research in substance use disorders treatment. Psychology of Addictive Behaviors. 2011;25(2):194–205. doi: 10.1037/a0022284
  9. Dorsey S, Pullmann MD, Deblinger E, Berliner L, Kerns SE, Thompson K, Garland AF. Improving practice in community-based settings: A randomized trial of supervision-study protocol. Implementation Science. 2013;8(89):1–12. doi: 10.1186/1748-5908-8-89
  10. Fox RA, Mattek RJ, Gresl BL. Evaluation of a university-community partnership to provide home-based mental health services for children from families living in poverty. Community Mental Health Journal. (in press). doi: 10.1007/s10597-012-9545-7
  11. Fritz RM, Tempel AB, Sigel BA, Conners-Burrow NA, Worley KB, Kramer TL. Improving the dissemination of evidence-based treatments: Facilitators and barriers to participating in case consultation. Professional Psychology: Research and Practice. 2013;44(4):225–230. doi: 10.1037/a0033102
  12. Glisson C, Hemmelgarn A, Green P, Williams NJ. Randomized trial of the Availability, Responsiveness and Continuity (ARC) organizational intervention for improving youth outcomes in community mental health programs. Journal of the American Academy of Child & Adolescent Psychiatry. 2013;52:493–500. doi: 10.1016/j.jaac.2013.02.005
  13. Henggeler SW, Melton GB, Brondino MJ, Scherer DG, Hanley JH. Multisystemic therapy with violent and chronic juvenile offenders and their families: The role of treatment fidelity in successful dissemination. Journal of Consulting and Clinical Psychology. 1997;65:821–833. doi: 10.1037//0022-006x.65.5.821
  14. Henggeler S, Schoenwald SJ. Evidence-based interventions for juvenile offenders and juvenile justice policies that support them. Social Policy Report. 2011;25(1):1–20.
  15. Henggeler SW, Schoenwald SK, Borduin CM, Rowland MS, Cunningham PB. Multisystemic therapy for antisocial behavior in children and adolescents. 2nd ed. New York, NY: The Guilford Press; 2009.
  16. Herschell AD, Kolko DJ, Baumann BL, Davis AC. The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review. 2010;30:448–466. doi: 10.1016/j.cpr.2010.02.005
  17. Lambert MJ. Enhancing psychotherapy through feedback to clinicians. The Register Report: National Health Service Providers in Psychology. 2005;31(2):15–19.
  18. Lambert MJ, Harmon C, Slade K, Whipple JL, Hawkins EJ. Providing feedback to psychotherapists on their patients’ progress: Clinical results and practice suggestions. Journal of Clinical Psychology: In Session. 2005;61(2):165–174. doi: 10.1002/jclp.20113
  19. Leffler JM, Yo J, West AE, McCarty CA, Atkins MS. Training in evidence-based practice across the professional continuum. Professional Psychology: Research and Practice. 2013;44(1):20–28. doi: 10.1037/a0029241
  20. Lewis C, Simons A. A pilot study disseminating cognitive behavioral therapy for depression: Therapist factors and perceptions of barriers to implementation. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38:324–334. doi: 10.1007/s10488-011-0348
  21. Lyon AR, Ludwig K, Romano E, Koltracht J, Vander Stroup A, McCauley E. Using modular psychotherapy in school mental health: Provider perspectives on intervention-setting fit. Journal of Clinical Child & Adolescent Psychology. (in press). doi: 10.1080/15374416.2013.843460
  22. Merikangas KR, He JP, Burstein M, Swanson SA, Avenevoli S, Cui L, Swendsen J. Lifetime prevalence of mental disorders in US adolescents: Results from the National Comorbidity Survey Replication–Adolescent Supplement (NCS-A). Journal of the American Academy of Child & Adolescent Psychiatry. 2010;49(10):980–989. doi: 10.1016/j.jaac.2010.05.017
  23. Meyers DC, Durlak JA, Wandersman A. The quality implementation framework: A synthesis of critical steps in the implementation process. American Journal of Community Psychology. 2012;50(3–4):462–480. doi: 10.1007/s10464-012-9522-x
  24. Murray LK, Dorsey S, Haroz E, Lee C, Alsiary M, Haydary A, Weiss WM, Bolton P. A common elements treatment approach for adult mental health problems in low- and middle-income countries. Cognitive and Behavioral Practice. (this issue). doi: 10.1016/j.cbpra.2013.06.005
  25. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36:24–34. doi: 10.1007/s10488-008-0197-4
  26. Rescorla L, Ivanova MY, Achenbach TM, Begovac I, Chahed M, Drugli MB, Zhang EY. International epidemiology of child and adolescent psychopathology II: Integration and applications of dimensional findings from 44 societies. Journal of the American Academy of Child and Adolescent Psychiatry. 2012;51:1273–1283.e8. doi: 10.1016/j.jaac.2012.09.012
  27. Saldana L, Chamberlain P. Scaling up two evidence-based practices for children’s mental health. CYF News. 2013 Apr. Retrieved from http://www.apa.org/pi/families/resources/newsletter/2013/04/child-mental-health.aspx
  28. Schoenwald SK, Hoagwood K. Effectiveness, transportability, and dissemination of interventions: What matters when? Psychiatric Services. 2001;52:1190–1197. doi: 10.1176/appi.ps.52.9.1190
  29. Schoenwald SK, Sheidow AJ, Chapman JE. Clinical supervision in treatment transport: Effects on adherence and outcomes. Journal of Consulting and Clinical Psychology. 2009;77:410–421. doi: 10.1080/15374410802575388
  30. Southam-Gerow MA, Arnold CC, Tully CB, Cox JR. Dissemination of evidence-based practices. In: Alfano C, Beidel D, editors. Comprehensive evidence-based interventions for school-aged children and adolescents. Hoboken, NJ: Wiley & Sons, Inc; (under review).
  31. Southam-Gerow MA, Chorpita BF, Miller LM, Gleacher AA. Are children with anxiety disorders privately-referred to a university clinic like those referred from the public mental health system? Administration and Policy in Mental Health and Mental Health Services Research. 2008;35:168–180. doi: 10.1007/s10488-007-0154-7
  32. Southam-Gerow MA, Daleiden EL, Chorpita BF, Bae C, Mitchell C, Faye M, Alba M. MAPping Los Angeles County: Taking an evidence-informed model of mental health care to scale. Journal of Clinical Child & Adolescent Psychology. (in press). doi: 10.1080/15374416.2013.833098
  33. Southam-Gerow MA, Hourigan SE, Allin RB. Adapting evidence-based mental health treatments in community settings: Preliminary results from a partnership approach. Behavior Modification. 2009;33(1):82–103. doi: 10.1177/0145445508322624
  34. Southam-Gerow MA, Prinstein MJ. Evidence-based treatment updates: The evolution of the evaluation of psychological treatments for children and adolescents. Journal of Clinical Child & Adolescent Psychology. (in press). doi: 10.1080/15374416.2013.855128
  35. Southam-Gerow MA, Ringeisen HL, Sherrill JT. Integrating interventions and services research: Progress and prospects. Clinical Psychology: Science and Practice. 2006;13(1):1–8. doi: 10.1111/j.1468-2850.2006.00001.x
  36. Southam-Gerow MA, Rodríguez A, Chorpita BF, Daleiden EL. Dissemination and implementation of evidence based treatments for youth: Challenges and recommendations. Professional Psychology: Research and Practice. 2012;43:527–534. doi: 10.1037/a0029101
  37. Strupp HH, Howard KI. A brief history of psychotherapy research. In: Freedheim DK, editor. A history of psychotherapy. Washington, DC: American Psychological Association; 1992. pp. 309–334.
  38. Weisz JR, Chorpita BF, Palinkas LA, Schoenwald SK, Miranda J, Bearman SK, & the Research Network on Youth Mental Health. Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth: A randomized effectiveness trial. Archives of General Psychiatry. 2012;69:274–282. doi: 10.1001/archgenpsychiatry.2011.147
