Editorial. 2013 Aug 14; 40(6): 451–455. doi: 10.1007/s10488-013-0512-6

Unpacking the Black Box of Implementation: The Next Generation for Policy, Research and Practice

Kimberly Hoagwood, Marc Atkins, Nicholas Ialongo
PMCID: PMC3824224; PMID: 23942647

It was, as best we can recall, a late spring afternoon, and the three of us were completing another of our semi-regular phone calls. Because each of us directs an NIMH-funded Services Research Center related to children’s mental health research, we often sought each other’s advice and feedback on the direction of our work. We saw the complementarity of our work: we all focused on low-income and largely inner-city populations; we all focused on schools and community agencies; and we all focused on improving effective services via training, consultation, and fitting effective practices into natural community ecologies. We each brought different strengths: Nick’s work on specific evidence-based preventive services and their installation with fidelity; Marc’s work on building onto existing naturalistic supports in schools and agencies, rather than superimposing packaged practices; and Kimberly’s work on studying feasible and practical implementation strategies that can be adopted widely by states and healthcare systems.

On one of these calls, we were discussing the need for synergy among our Centers, and after batting around several ideas, one of us called the question: great ideas, but who is going to actually do this work? It was clear that the three of us, in our roles directing these Centers, had core research tasks to accomplish and little room for new projects. Yet we saw an opportunity to push the research agenda beyond each of our Centers’ mandates, using the gestalt of our combined work to launch new projects. What if, one of us suggested, we brought our colleagues from each of our Centers together? And what if we focused not on our senior colleagues, who, though brilliant, were similarly preoccupied with their own core tasks? What if we brought together the extraordinarily talented early-career faculty, postdocs, and graduate students from each of our Centers?

We agreed to host a series of cross-center meetings. The first was convened in Chicago in June 2011, the second in New York in September 2011, the third in Baltimore in October 2011, and the fourth in New York in December 2012. The format was simple: the hosting center would present the key questions and data from its projects, and the rest of the time would be spent brainstorming and networking among our early-career colleagues. Over time the long-sought synergies became apparent; drafts of grants and papers were exchanged, and ideas for new projects emerged. By the time of the second meeting, the group had formed working groups, and from these emerged enough data-based papers to develop special issues, including the one in this volume.

As we review these papers, we reflect back on our initial discussions and think about the new directions these papers represent. We want to note especially what we see as implications for policy, new research directions, and practice.

As has been said before, the ethical and scientific challenge for our field inheres in the sluggish movement of effective practices for children and families into the systems that are tasked to serve them (see Bickman and Hoagwood 2010). And, as has been documented many times, knowledge about how to improve the lives of children is available now, and increasingly so, in the form of packaged programs, widely available trainings, apps, and the like. The problem is not a lack of knowing what to do; it is the lack of a systematic and evidence-informed way of doing it. An overriding obstacle has been the intractable fragmentation of the service systems.

Consider a recent meta-analysis of treatment effectiveness studies for children’s mental health problems (Lee et al. 2013). The authors conducted a comprehensive literature search and identified 20 studies since 2007 that examined the effectiveness of interventions for anxiety, depression, and disruptive behavior problems in practice settings. They then compared results from these effectiveness studies to benchmarks from two meta-analyses of efficacy trials. For internalizing problems, improvement rates in the effectiveness studies matched the two efficacy benchmarks. For disruptive behavior, results were more variable but generally favorable, with most studies outperforming the benchmark and a few underperforming. The authors note: “It is particularly noteworthy that the majority of the studies we reviewed addressed the transportability of interventions developed in North America to other countries, and for many, it included translation of materials into another language. Only two of the studies were based in North America” (Lee et al. 2013, p. 85, italics added). Thus, we in the U.S. have developed a research enterprise that has led to impressive improvements in the treatment of children’s mental health problems everywhere but here!

Why is that? Undoubtedly there are many reasons, but one is likely the excessive complexity of the child service systems to which these interventions are transported. Consider the implementation challenges increasingly documented in studies that examine the installation of evidence-based practices: regulatory constraints, lack of workforce training, inadequate supervisory structures, inability to bill for new practices, and more (Bickman and Hoagwood 2010; Hoagwood et al. 2013; McHugh and Barlow 2010).

However, the traditional model for the mental health service system is uniquely poised for change. The political and social will now exists to make system changes on a massive scale, changes that have the potential to create a more integrated and data-driven health and mental health system. The 2010 Patient Protection and Affordable Care Act is creating a set of incentives, payment mechanisms, and attention to quality metrics that are restructuring the healthcare systems through which services are delivered. Because the umbrella of the healthcare act includes mental health and addiction services, along with a broad range of other health services, the potential for integrated services, informed by shared data about outcomes and quality, exists for the first time in this country (Berenson et al. 2013; Conway et al. 2013; Koh and Sebelius 2010).

Measuring and tracking quality indicators, for example, has been endorsed in the National Quality Strategy (NQS) of the Affordable Care Act, and the development of child health care quality measures for use in Medicaid and the Children’s Health Insurance Program (CHIP) has been mandated by the Children’s Health Insurance Program Reauthorization Act of 2009 (CHIPRA). Significant funding supports these initiatives. The Pediatric Quality Measures Program was allocated $40 million in 2010 to support seven Centers of Excellence and to develop new measures and refine existing ones for a core set. The Centers for Medicare & Medicaid Services (CMS) also funded 10 five-year demonstration projects in 2010, at an estimated total cost of $100 million, of which seven propose to develop, test, evaluate, and/or report adherence to quality measures. The use of these quality indicators will be sustained via financial incentives to collect and report on adherence rates through a federal match that is part of the American Recovery and Reinvestment Act of 2009 (ARRA). Eligible providers will receive these payments for demonstrating “meaningful use” of quality measures under the Electronic Health Records Incentive Program and will be able to benchmark their own performance against aggregated data (Conway et al. 2013; Zima et al. 2013). The point is that these massive healthcare policy changes are driving system-level changes. If we in the children’s mental health field use this opportunity to craft our research to inform these changes, then the possibility exists for a quality-driven, evidence-based national system of child health care.

The papers in this special issue focus on some of the important ingredients of quality change in children’s services, i.e., data-based strategies to improve the use of evidence-based prevention and clinical practices by teachers, counselors, families, and therapists after training. All of the papers focus on consultation, coaching, and other post-training strategies that can be delivered within the naturalistic settings of schools and community agencies. The papers pull apart the active ingredients that yield higher fidelity to effective practices and improve outcomes. They also illustrate the complexity of the social and organizational processes that need attention if the installation of effective practices is to be sustained.

For example, Becker et al. (2013) provide data from a large prevention trial that examined coaching visits to teachers who had been taught the Good Behavior Game, in order to identify specific coaching strategies. They found that coaches strategically varied their use of strategies (e.g., modeling, delivery) based on teacher implementation quality, and that coaching was associated with improved implementation quality. Similarly, Reinke et al. (2013) describe two understudied facets of fidelity: ratings of teacher engagement and differentiation of exposure to coaching. They show how these can be operationalized and measured so that they can be included in the professional development of teachers and potentially used to establish benchmarks or standards for evaluating fidelity in evidence-based interventions.

Beidas et al. (2013) describe three mechanisms through which consultation may affect adherence and skill: connectedness, authenticity, and responsiveness. Their mixed-methods analysis also suggests that active learning is not consistently the mechanism through which effective consultation operates, which leads to an important set of questions for further research.

Bearman et al. (2013) identify and test several specific predictors of evidence-based practice (EBP) use through a study of the components of effective supervision. They found that supervision involving modeling and role-playing predicted higher EBP use than discussion alone, but they also found age- and sex-related differences. Because there is some evidence that didactic trainings without behavioral rehearsal or ongoing support are not sufficient to change therapist behavior, this study is important in suggesting that modeling and role-play may be two key behaviors to include in the training and supervision of EBPs, and that therapists in community practice are able to implement these practices.

The other papers in the series also identify active ingredients of installing EBPs via consultation. For example, Edmunds et al. (2013) describe how behavioral rehearsal, as a form of active learning, may affect therapists’ use of skills; the degree of participation in the consultation process moderated the relationship between behavioral rehearsal and skill. In another study, using qualitative data, Lyon et al. (2013) examined agreement to participate in training and consultation in EBPs in schools and identified a set of relevant motivational factors based on social learning theory. These included expectations and attitudes, as well as practical issues such as time.

Masia-Warner et al. (2013) describe how specific consultation strategies can support school counselors’ implementation of an EBT for adolescent social anxiety. They developed measures of adherence and competence and showed that agreement between counselors and consultants was strong for adherence but less strong for competence. Interestingly, regarding competence, consultants observed counselors to be good implementers of exposure exercises but weaker implementers of the cognitive elements of the intervention. This provides a strong rationale for a multi-tiered intervention in which counselors work collaboratively with other mental health staff trained in more complex intervention strategies for those youth who require more intensive interventions.

Finally, Nadeem et al. (2013) describe the distribution of content and time in real-world supervision of therapists trained in EBTs. Importantly, about one-fourth of the time is spent on administrative and organizational barriers, and 50% on clinical content.

The implications of these papers for the new world order of healthcare policy reform are threefold. First, they demonstrate that the identification, measurement, and testing of specific consultation practices after EBP training are feasible in real-world settings such as schools and community agencies. This is critical for future benchmarking of service quality; much more work is needed, but these are important first steps. Second, they demonstrate the range of relevant consultation strategies and techniques that require further study to improve not just the processes of EBP service installation but, more importantly, the outcomes. Third, they show how a new generation of research, and of exceptionally promising junior researchers, can help mold the field of children’s services so that its yield is directly applicable to important mental health policy issues.

These papers also point to new directions for research and practice. With regard to research design, as Proctor and Rosen (2008) note, service system research should involve the perspective of clinicians, who make idiographic decisions regarding research evidence. Thus, an important next step in implementation and dissemination research is to match research designs with the intended use of the data to inform practice. Toward that end, innovative research designs that are both contextually relevant and methodologically rigorous are necessary to promote a clearer understanding of the contextual factors that impede or enhance implementation processes. The advantage of these design alternatives is that the false dichotomy between ivory-tower priorities for certainty and practice-setting priorities for relevance is replaced by designs that accommodate the goal of advancing evidence-informed practice.

For example, Glasgow et al. (2005) recommend expanding the CONSORT criteria (which focus primarily on enhancing internal validity) to include external validity criteria for “practical clinical designs.” They discuss the importance of representative sampling (including setting-specific factors), the use of clinically relevant alternative interventions in place of no-treatment controls, and the use of a broad range of relevant outcomes. Interestingly, included among the recommendations is the use of single-subject designs, which have all but disappeared from clinical research. Recently, Kratochwill and Levin (2010) described procedures for adapting single-subject designs to accommodate randomized controlled trials. Specifically, they presented a four-stage model of educational interventions, with the goal of informing classroom practice, that has relevance for mental health practice as well (see Fig. 4, p. 131). The model is paired with a series of randomization strategies across units, settings, behaviors, or phases of intervention. Taken together, these strategies have the advantage of enhancing scientific rigor without sacrificing relevance to practice settings.
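
To make the logic of phase randomization concrete, here is a minimal sketch of a multiple-baseline single-case design in which each case’s intervention start point is drawn at random from a set of admissible sessions and inference uses a randomization test. It is an illustration in the spirit of Kratochwill and Levin (2010), not their procedure: the simulated data, the candidate start points, and the mean-shift test statistic are all assumptions made for this example.

```python
"""Illustrative sketch of a randomized single-case (multiple-baseline) design.
All data, start points, and the test statistic are assumed for the example."""
import itertools
import numpy as np

rng = np.random.default_rng(42)

def phase_effect(series, start):
    """Mean shift from baseline phase to intervention phase for one case."""
    return series[start:].mean() - series[:start].mean()

def randomization_test(data, candidate_starts, observed_starts):
    """Compare the observed mean phase effect against the distribution over
    every admissible assignment of intervention start points."""
    observed = np.mean([phase_effect(s, t) for s, t in zip(data, observed_starts)])
    null = [np.mean([phase_effect(s, t) for s, t in zip(data, starts)])
            for starts in itertools.product(candidate_starts, repeat=len(data))]
    # One-sided p-value: proportion of assignments with an effect at least as large.
    return observed, np.mean([x >= observed for x in null])

# Three hypothetical cases, 20 sessions each; intervention raises the level by 1.0.
starts = [8, 10, 12]            # actual (randomly scheduled) start points
data = []
for t in starts:
    y = rng.normal(0.0, 1.0, 20)
    y[t:] += 1.0                # simulated intervention effect
    data.append(y)

obs, p = randomization_test(data, candidate_starts=range(6, 15), observed_starts=starts)
print(f"mean phase effect = {obs:.2f}, randomization p = {p:.3f}")
```

The rigor comes from the fact that the reference distribution is generated by the same randomization used to schedule the intervention, while the repeated-measures structure keeps the design feasible within a single classroom or clinic.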

Another design issue that is highly relevant to dissemination and implementation research is the need for alternatives to the randomized controlled design when random assignment is not feasible. West et al. (2008) describe models that approximate random assignment for these occasions. Two categories are described: randomized encouragement designs, which incorporate participant choice into the design (see also Freedman 1987 and Lavori et al. 2001 for discussions of clinical equipoise), and quantitative assignment designs, in which participants are assigned by preconceived criteria (e.g., risk or need). Finally, mixed-method or hybrid research designs are also highly relevant to dissemination and implementation research. These designs allow an iterative process of research and practice and can include both formative and generative research designs (Atkins et al. 2006). Other examples of hybrid designs include studies that incorporate aspects of both effectiveness and implementation, simultaneously testing the impact of interventions under real-world conditions (effectiveness) and the spread or disseminability of those interventions (implementation) (see Curran et al. 2012).
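
As one concrete illustration of a quantitative assignment design, the sketch below simulates assignment by a preconceived risk cutoff and recovers the treatment effect at the threshold with a local linear regression, in the general spirit of the designs West et al. (2008) describe. The variable names, cutoff, bandwidth, and effect size are illustrative assumptions, not values from that paper.

```python
"""Illustrative sketch of a quantitative assignment (regression-discontinuity)
design: treatment is assigned by a preconceived risk cutoff, not a coin flip.
All numbers and variable names are assumptions for the example."""
import numpy as np

rng = np.random.default_rng(0)
n, cutoff = 500, 0.0

risk = rng.normal(0.0, 1.0, n)            # assignment variable (e.g., a screening score)
treated = (risk >= cutoff).astype(float)  # deterministic assignment by need
# Outcome: smooth function of risk plus a treatment effect of 0.8 at the cutoff.
outcome = 0.5 * risk + 0.8 * treated + rng.normal(0.0, 1.0, n)

# Local linear regression within a bandwidth around the cutoff;
# the coefficient on `treated` estimates the effect at the threshold.
h = 0.75
win = np.abs(risk - cutoff) <= h
X = np.column_stack([np.ones(win.sum()),
                     treated[win],
                     risk[win] - cutoff,
                     treated[win] * (risk[win] - cutoff)])
beta, *_ = np.linalg.lstsq(X, outcome[win], rcond=None)
print(f"estimated effect at cutoff: {beta[1]:.2f} (true 0.80)")
```

Because assignment is a known function of measured risk, the design trades the bias protection of random assignment for an assignment rule that service systems can often defend ethically: those with the greatest need receive the intervention.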

The papers in this special issue also raise higher-order questions about practice improvement. This has been an under-researched area, and these papers fill a large hole. The issues raised are both micro- and macro-level. Beginning at the micro level: do some skills prove more trainable than others? For example, while training teachers in an urban school district to praise student behavior has proven to be no easy task (see Becker et al. 2013), an even more difficult task has been training them to deliver the praise in a sincere and enthusiastic manner. Can sincerity and enthusiasm be trained? Or are these personality traits that teachers or clinicians bring to the proverbial “table,” traits that no amount of behavioral rehearsal will alter? Can these traits and/or aptitudes be reliably assessed and used in selecting candidates for training as clinicians and teachers? These questions lead to a natural set of research questions for future studies.

Moving to the macro level, several of these papers suggest that training and mentoring practices may need to be tailored to reflect variation in learning styles and clinician characteristics, such as gender and age. The influence of school, agency, or organizational context on training and consultation is also a practice question with researchable potential. Much has been written about the impact of social organizational context on the uptake of new practices, on job satisfaction, and on child outcomes (see Glisson and Schoenwald 2005; Glisson et al. 2008, 2010, 2012). To make new practices stick in real-world contexts, the combined influences of learning styles, clinician characteristics, and characteristics of the workplace need to be disaggregated, and the modifiable core components need to be identified for the development of practice-based and targeted interventions.

While moving clinical science training programs toward the use of evidence-based training and mentoring practices is a formidable task, re-tooling community-based clinicians in such practices via training and mentoring is expensive, labor-intensive, and ultimately inefficient. An important question for improving practice concerns the kinds of institutional supports that will be needed to sustain these improvements. To this end, the common elements approach of Chorpita and Daleiden (2009) reflects some of the most original thinking about practical ways to advance practice improvement in children’s mental health. In the re-tooling of the workforce, web-based technologies are likely to provide valuable solutions, including web-based training and consultation models, the use of data to drive decision-making, and the development of practical and robust metrics and measures that are sensitive to change and individually focused (Bickman et al. 2012; Chorpita and Daleiden 2009).

The issue of embedding these web-based tools into real world practice settings raises its own set of implementation challenges (Bickman et al. 2012) and yet another research agenda. But it is important that the development and testing of these practical tools be done by people knowledgeable about mental health systems, fidelity to evidence-based practices, and meaningful child and family outcomes. If we don’t do it, someone else will.

In summary, the papers in this special issue advance the field of implementation science in children’s mental health by addressing real-world, practical, and down-to-earth issues about how best to train, coach, mentor, and provide consultation to front-line providers (teachers, counselors, clinicians, case workers) on alternative practices that are likely to improve child and family outcomes. The editors and all of the authors are to be commended for looking at the horizon and flying towards it with vision and hard work. The papers as a whole provide a picture of the future. Together they set a standard for linking policy, research, and practice as they relate to evidence-based training and consultation methods to improve children’s mental health outcomes.

References

  1. Atkins MS, Frazier SL, Cappella E. Hybrid research models: Natural opportunities for examining mental health in context. Clinical Psychology: Science and Practice. 2006;13(1):105–108. doi: 10.1111/j.1468-2850.2006.00012.x.
  2. Bearman SK, Weisz JR, Chorpita B, Hoagwood K, Marder A, Ugueto A, et al. More practice, less preach? The role of supervision processes and therapist characteristics in EBP implementation. Administration and Policy in Mental Health and Mental Health Services Research. 2013. doi: 10.1007/s10488-013-0485-5.
  3. Becker K, Domitrovich C, Ialongo N. Supporting school-based preventive interventions: The PATHS to PAX coaching model. Administration and Policy in Mental Health and Mental Health Services Research. 2013.
  4. Beidas R, Edmunds J, Cannuscio C, Gallagher M, Downey M, Kendall P. Therapists’ perspectives of consultation following training: A qualitative examination. Administration and Policy in Mental Health and Mental Health Services Research. 2013. doi: 10.1007/s10488-013-0475-7.
  5. Berenson RA, Pronovost PJ, Krumholz HM. Achieving the potential of health care performance measures: Timely analysis of immediate health policy issues. Robert Wood Johnson Foundation. 2013;1–6. http://www.rwjf.org/content/dam/farm/reports/reports/2013/rwjf406195.
  6. Bickman L, Hoagwood K. Special issue: Making the real world ideal. Administration and Policy in Mental Health and Mental Health Services Research. 2010;37(1–2):1–2. doi: 10.1007/s10488-010-0289-9.
  7. Bickman L, Kelley SD, Athay M. The technology of measurement feedback systems. Couple and Family Psychology: Research and Practice. 2012;1(4):274–284. doi: 10.1037/a0031022.
  8. Chorpita BF, Daleiden EL. Mapping evidence-based treatments for children and adolescents: Application of the distillation and matching model to 615 treatments from 322 randomized trials. Journal of Consulting and Clinical Psychology. 2009;77(3):566–579. doi: 10.1037/a0014565.
  9. Conway PH, Mostashari F, Clancy C. The future of quality measurement for improvement and accountability. JAMA. 2013;309(21):2215–2216. doi: 10.1001/jama.2013.4929.
  10. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care. 2012;50(3):217–226. doi: 10.1097/MLR.0b013e3182408812.
  11. Edmunds JM, Kendall PC, Ringle V, Read K, Brodman D, Pimentel SS, Beidas RS. An examination of behavioral rehearsal during consultation as a predictor of training outcomes. Administration and Policy in Mental Health and Mental Health Services Research. 2013. doi: 10.1007/s10488-013-0490-8.
  12. Freedman B. Equipoise and the ethics of clinical research. New England Journal of Medicine. 1987;317:141–145. doi: 10.1056/NEJM198707163170304.
  13. Glasgow RE, Magid DJ, Beck A, Ritzwoller D, Estabrooks PA. Practical clinical trials for translating research to practice: Design and measurement recommendations. Medical Care. 2005;43(6):551–557. doi: 10.1097/01.mlr.0000163645.41407.09.
  14. Glisson C, Hemmelgarn A, Green P, Dukes D, Atkinson S, Williams NJ. Randomized trial of the availability, responsiveness, and continuity (ARC) organizational intervention with community-based mental health programs and clinicians serving youth. Journal of the American Academy of Child and Adolescent Psychiatry. 2012;51:780–787. doi: 10.1016/j.jaac.2012.05.010.
  15. Glisson C, Schoenwald SK. The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Mental Health Services Research. 2005;7(4):243–259. doi: 10.1007/s11020-005-7456-1.
  16. Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. Journal of Consulting and Clinical Psychology. 2010;78(4):537. doi: 10.1037/a0019160.
  17. Glisson C, Schoenwald SK, Kelleher K, Landsverk J, Hoagwood KE, Mayberg S, et al. Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35(1):124–133. doi: 10.1007/s10488-007-0152-9.
  18. Hoagwood K, Olin S, Cleek A. Beyond context to the skyline: Thinking in 3D. Administration and Policy in Mental Health and Mental Health Services Research. 2013;40(1):23–28. doi: 10.1007/s10488-012-0451-7.
  19. Koh HK, Sebelius KG. Promoting prevention through the Affordable Care Act. New England Journal of Medicine. 2010;363(14):1296–1299. doi: 10.1056/NEJMp1008560.
  20. Kratochwill TR, Levin JR. Enhancing the scientific credibility of single-case intervention research: Randomization to the rescue. Psychological Methods. 2010;15(2):124. doi: 10.1037/a0017736.
  21. Lavori PW, Rush AJ, Wisniewski SR, Alpert J, Fava M, Kupfer DJ, Trivedi M. Strengthening clinical effectiveness trials: Equipoise-stratified randomization. Biological Psychiatry. 2001;50(10):792–801. doi: 10.1016/S0006-3223(01)01223-9.
  22. Lee CM, Horvath C, Hunsley J. Does it work in the real world? The effectiveness of treatments for psychological problems in children and adolescents. Professional Psychology: Research and Practice. 2013;44(2):81–88. doi: 10.1037/a0031133.
  23. Lyon AR, McCauley E, Ludwig K, Vander Stoep A, Cosgrove T. “If it’s worth my time I will make the time”: School-based providers’ decision-making about participating in an evidence-based psychotherapy consultation program. Administration and Policy in Mental Health and Mental Health Services Research. 2013. doi: 10.1007/s10488-013-0494-4.
  24. Masia-Warner C, Brice C, Esseling P, Stewart C, Mufson L, Herzig K. Supervisors’ perceptions of school counselors’ ability to deliver an empirically-based intervention for adolescent social anxiety disorder. Administration and Policy in Mental Health and Mental Health Services Research. 2013. doi: 10.1007/s10488-013-0498-0.
  25. McHugh RK, Barlow DH. The dissemination and implementation of evidence-based psychological treatments. American Psychologist. 2010;65(2):73–84. doi: 10.1037/a0018121.
  26. Nadeem E, Gleacher A, Pimentel S, Hill L, McHugh M, Hoagwood K. The role of consultation calls for clinic supervisors in supporting large-scale dissemination of evidence-based treatments for children. Administration and Policy in Mental Health and Mental Health Services Research. 2013. doi: 10.1007/s10488-013-0491-7.
  27. Proctor EK, Rosen A. From knowledge production to implementation: Research challenges and imperatives. Research on Social Work Practice. 2008;18(4):285–291. doi: 10.1177/1049731507302263.
  28. Reinke WM, Herman KC, Stormont M, Newcomer L, David K. Incredible Years Teacher Classroom Management program: Examining the relation of intervention support systems on teacher fidelity of implementation. Administration and Policy in Mental Health and Mental Health Services Research. 2013. doi: 10.1007/s10488-013-0496-2.
  29. West SG, Duan N, Pequegnat W, Gaist P, Des Jarlais DC, Holtgrave D, Mullen PD. Alternatives to the randomized controlled trial. American Journal of Public Health. 2008;98(8):1359–1366. doi: 10.2105/AJPH.2007.124446.
  30. Zima BT, Murphy JM, Scholle SH, Hoagwood KE, Sachdeva RC, Mangione-Smith R, Jellinek M. National quality measures for child mental health care: Background, progress, and next steps. Pediatrics. 2013;131(Supplement 1):S38–S49. doi: 10.1542/peds.2012-1427e.
