
Parenting Programs for Underserved Populations in Low- and Middle-Income Countries: Issues of Scientific Integrity and Social Justice

Ana A Baumann 1,1, Anilena Mejia 2, Jamie M Lachman 3, J Rubén Parra Cardona 4, Gabriela López-Zerón 5, Nancy G Amador Buenabad 6, Eunice Vargas 7, Melanie M Domenech Rodríguez 8

Abstract

Research suggests that parenting programs are effective in preventing behavioral and emotional difficulties in children, but far more attention needs to be paid to issues of context and culture during the development, testing, and implementation of these interventions. The views and needs of underserved and disenfranchised communities in the U.S. and the Global South are often not taken into account in the development and testing of interventions. The successful implementation of evidence-based interventions for vulnerable children and families in underserved and marginalized communities requires careful consideration of how existing paradigms of prevention, evaluation, and implementation science bear on issues of social justice and equity. This paper describes how a team of parenting program researchers has been collaborating with partners globally to generate local knowledge by balancing the need for rigorous scientific methods with issues of power. Authors from the U.S., Latin America, Africa, and Southeast Asia draw on their experiences of challenges and successes with study design and measurement, the transferability and adaptation of interventions, and the dissemination and implementation of different parenting interventions, while placing communities at the center of their efforts through participatory methods. We describe innovative approaches that span the continuum of intervention development, adaptation, optimization, evaluation, implementation, and scale-up of parenting programs for vulnerable children and families across the world. We conclude by offering specific, pragmatic recommendations to increase access to culturally relevant and effective parenting programs in these communities.

Keywords: Global mental health, parenting, scale up, scale out, implementation


Approximately 10 years ago, the Lancet Mental Health Group issued a call for action to scale up mental health services worldwide, particularly in low- and middle-income countries (LMICs; Lancet Mental Health Group, 2007), to respond to the glaring gap between those who needed mental health care and those who received it (Andrade et al., 2014; Kohn, Saxena, Levav, & Saraceno, 2004). In 2001, the World Health Organization (WHO) released the report titled “Mental Health: New Understanding, New Hope” (World Health Organization, 2001), and a number of collaborative efforts to improve global mental health have been established since then (Abdulmalik & Thornicroft, 2016).

Global mental health can be defined as a set of initiatives that promote the scale-up of evidence-based interventions to improve the quality of mental health service delivery, particularly in LMICs (Bayetti & Jain, 2017; Jain & Orr, 2016; White, Gregg, Batten, Hayes, & Kasujja, 2017). The initiatives to expand global mental health care delivery rest on the moral and ethical assumption that everyone should receive attention and care, regardless of their social determinants of health, including location (Kirmayer & Pedersen, 2014; Mason, Kerridge, & Lipworth, 2017; Patel, 2014), and much progress has been made toward improving mental health care delivery globally (Patel, Boyce, Collins, Saxena, & Horton, 2011). However, as the field of global mental health grows, it has also been met with a number of critiques and challenges (Bayetti & Jain, 2017; Whitley, 2015). Numerous scholars have expressed concern about the potential clash between scaling evidence-based interventions (EBIs) and the social injustice that may be inherent in such practice. Critics have been vocal about the proposed interventions’ inattention to the social and cultural conditions that give rise to mental distress (Mills & White, 2017; Whitley, 2015) and to the many “cultural variations in the experience of illness” (Fernando, 2011, p. 22). As Mason and colleagues state: “some of these efforts potentially obscure the social, economic, and political histories of the locations where projects are implemented, as well as the plurality of knowledge and values within and across communities” (Mason, Kerridge, & Lipworth, 2017).

One could propose that, instead of perceiving the scale-up of EBIs as a “black or white” phenomenon, it is more of a palette full of colors: depending on the pathway of implementation, on the context, and on how the collaboration is established, one can successfully balance shared knowledge between LMICs and high-income countries (HICs). This paper aims to share some of our lessons learned as we pursue implementation efforts of evidence-based parenting interventions (EBPIs) while maintaining social justice in our work. We focus on parenting interventions because this is our specific area of research, but we believe the principles outlined here apply equally to other preventive mental health interventions.

Before we outline our challenges, we clarify the terms used in this paper. Implementation of an intervention refers to the process of integrating EBIs into real-world settings. Implementation research aims to understand the factors and strategies that facilitate or hinder the adoption of these interventions in usual care (Proctor et al., 2011). Dissemination research, on the other hand, refers to the study of the targeted distribution of information and intervention materials to a clinical audience with the intent to spread and sustain knowledge associated with evidence-based interventions (Rabin & Brownson, 2017). Scaling out is a recent term coined by Aarons and colleagues (Aarons, Sklar, Mustanski, Benbow, & Brown, 2017) to refer to the process whereby EBIs are implemented with new populations, new delivery systems, or both. Scaling up refers to the expansion of the delivery of an EBI within the same or a very similar setting to the one in which the intervention was originally tested. A helpful metaphor to distinguish the two: scaling up is akin to watching a seed grow into a flower within one garden, whereas scaling out is planting seeds in different gardens. Such distinctions in terms are important, as they have consequences for the hypotheses, assumptions, and designs of studies; for more detail, we refer readers to Aarons et al. (2017). Because we implement EBIs in different settings, we conceptualize our work as “scaling out” EBPIs.

Experiences of Success and Innovative Approaches for the Scaling Out of Evidence-Based Parenting Interventions into LMICs

There are numerous implementation frameworks (Tabak, Khoong, Chambers, & Brownson, 2012), but very few explain the process of implementing evidence-based interventions with diverse populations (Yancey, Ortega, & Kumanyika, 2006). Several scale-up frameworks also exist (Brownson, Colditz, & Proctor, 2017), including one developed by our team that draws on components of the cultural adaptation and implementation science fields (Blinded for review3). In general, scale-up frameworks involve feedback loops across at least three phases: (1) learning from HICs and adapting to LMICs, which leads to (2) strong partnerships and involvement of local communities, which in turn support (3) the optimization of existing interventions. In the present section, we describe our experiences of successfully implementing EBPIs in LMICs, organizing our work around these three phases.

Learning from High Income Countries

When doing global work, there is a tension between how much to “take” from high-income countries to LMICs and how much to develop on the ground. A first step is to spend time gathering system-level information, including the resources (human, financial, and physical) needed to implement the EBPI. This phase corresponds to the “Exploration” phase of our model (Blinded for review3). In our experience, this first phase entails a somewhat top-down approach at the beginning, as the teams translate the manuals, examine which measures to use, and identify the potential stakeholders who will be trained to deliver the intervention. For example, our teams in Mexico City spent considerable time translating parenting practice measures from English to Spanish, back-translating them, and examining their fit to the Mexican context prior to using them in a randomized controlled trial (RCT). Similarly, because the available validated measures of child outcomes in Brazil are simply too expensive for our stakeholders, we are in the process of conducting cognitive interviewing of translated measures that are freely available. In other words, in this beginning phase our stakeholders are often still “consuming” knowledge from HICs rather than “creating new knowledge” that could be disseminated to similar countries in the region.

In Mexico, there is substantial interest in linking researchers with the clinics responsible for attending to the population. In Ensenada, Baja California, our colleagues are establishing collaborative networks between university researchers, governmental health institutions, and civil associations, as well as encouraging collaborative work with foreign researchers who are also interested in benefiting the Mexican population. The goal of such three-tiered relationships is to facilitate learning between academia and the clinicians who deliver care.

Adapting the interventions

The field of cultural adaptation defines adaptation as “the systematic modification of an evidence-based intervention (EBI) to consider language, culture, and context in such a way that it is compatible with the client’s cultural patterns, meanings, and values” (Bernal, Bonilla, & Bellido, 1995). In this paper, we expand the concept to incorporate adaptations that go beyond cultural elements at the client level, such as modifying interventions to fit provider characteristics, organizational contexts, and service settings (e.g., historical, political, and economic contexts; Blinded for review1, 2017). This broader conceptualization allows us to specify the components that have been adapted to fit a broader context, as part of increasing the fit of the intervention to LMIC settings.

For example, a group of us translated and adapted GenerationPMTO into Spanish for Latino families in the U.S. and in Mexico (REF). When scaling out the intervention to Mexico, our first iteration involved using the traditional training of GenerationPMTO: a series of five workshops across 18 months (Blinded for review3). This in-depth training, however, proved to be a challenge for low-resource settings, so we then adapted the training to use technology. A pilot study was conducted to test the feasibility of using blended learning, a mix of online and in vivo strategies, to train therapists in GenerationPMTO (Baumann et al., under review). In Panama, before trialing the Triple P Positive Parenting Program, a series of studies was conducted with local communities to ensure the program was culturally relevant and acceptable to their needs (Blinded for review 9, 10). Our work has shown that adaptation is inherent, and perhaps crucial, to the implementation process of EBPIs in LMICs (Blinded for review-6; Blinded for review-8–10). However, we have also faced challenges in that researchers tend not to report their adaptation process or the justification for the adaptations made (Blinded for review-5), which undermines the scientific integrity and replicability of adaptations and makes it harder to test how to adapt our interventions in a cost-effective and sustainable way (Blinded for review-8).

Partnering and involving local communities in research efforts

Partnering with and involving the community is a crucial component of this work. Accordingly, the field of implementation science has increasingly embraced the principles of community-based participatory research (CBPR; Blachman-Demner, Wiley, & Chambers, 2017; Holt & Chambers, 2017). Community engagement is multilevel (Brown et al., 2014; Mazzucca et al., 2018) and involves stakeholders in many roles. Particularly in global health, the concepts of community and stakeholder need to be broadened, as we often face a lack of trained professionals able to deliver interventions (Belfer, 2008). Because an intervention is often being delivered while its evidence is still being evaluated, we frequently need to train not only practitioners but also research staff to evaluate the work (Weiss et al., 2012).

Our team has adopted a train-the-trainer model, with the goal of fully transferring knowledge and skills to the new setting adopting the intervention (Baumann et al., 2016). In Mexico, the approach has been to train researchers (faculty as well as graduate and undergraduate students), in addition to clinicians, in the EBP to be implemented. The goal of such a tiered approach is to support the long-term sustainability of the interventions: the researchers maintain the quality of research and evaluation, and the clinicians and therapists support the work by delivering the intervention. Training clinicians in a train-the-trainer model also diminishes the risk that, when the country is ready to scale up, foreign researchers would train everyone in a solely top-down fashion. Accordingly, our research stakeholders in Mexico make a point of maintaining close relationships with clinicians and therapists from different agencies to ensure that the work being done is relevant to them. In this way, dissemination is not experienced as an order that descends hierarchically from the authorities (from top to bottom), but as a need that arises from the clinicians themselves. This makes it more likely that clinicians understand and have a voice when implementing EBPs.

Involving multiple stakeholders from other countries is not an easy task, however, as we often face challenges in funding training (Baumann et al., 2016). The level of interest from authorities and leaders in the countries where we have worked has varied, and in some places the in-depth training our interventions often require, together with the process evaluation of our work, may be seen by some leaders as more of a challenge than an opportunity. Part of our work, therefore, involves activism and lobbying to convince policy makers of the importance of collecting evidence that will improve the implementation process of their interventions. For this, collaborations among prevention researchers in LMICs, and between those in HICs and LMICs, are key.

The feedback loop: Giving results to local policy makers

Conducting studies in collaboration with local policy makers is key to successful implementation. In the case of Panama, a trial of the Strengthening Families Program 10–14 is currently under way. To ensure sustainability after the trial is over, the study was designed so that policy makers committed the time and resources of their health practitioners to delivering the program. If the program proves effective, the capacity will already be built within the Ministries of Health and Education, allowing implementation to proceed smoothly. This kind of buy-in from local policy makers supports sustainability and effective implementation. Although testing the efficacy of interventions is very important for establishing whether they have the potential to produce changes in children and their families, process evaluations tell us whether an intervention is appropriate and sustainable in a particular context. There is a need for comprehensive process evaluations in LMICs, using implementation frameworks that allow discussion of the sustainability of interventions in a particular context (WHO & ExpandNet, 2011). Sustainability is particularly relevant in LMICs, where governmental systems change often and corruption is a key factor affecting the provision of services (Blinded for review-11). Moreover, cost-effectiveness analyses are also important, as we need to ensure that EBPIs imported from HICs are economically sustainable in a different context (Duncan, MacGillivray, & Renfrew, 2017). These factors highlight the acute need for the field of implementation research to support global health researchers in preparing for the implementation of EBPIs in LMICs.
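
Cost-effectiveness analyses of this kind often hinge on a single summary statistic, the incremental cost-effectiveness ratio (ICER). The minimal sketch below shows the arithmetic; all costs, effects, and labels are hypothetical and are not results from the programs described in this paper.

```python
# Illustrative sketch with hypothetical numbers: the incremental
# cost-effectiveness ratio (ICER) a cost-effectiveness analysis of an
# imported EBPI would report against usual care.
cost_ebpi, cost_usual = 120.0, 40.0      # cost per family (hypothetical USD)
effect_ebpi, effect_usual = 0.45, 0.20   # e.g., share of children improved

# ICER = incremental cost / incremental effect.
icer = (cost_ebpi - cost_usual) / (effect_ebpi - effect_usual)
print(f"ICER = ${icer:.2f} per additional child improved")
# Policy makers can compare this figure against a local
# willingness-to-pay threshold when deciding whether to scale.
```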

Lessons Learned: Recommendations for the Implementation of Evidence-Based Parenting Interventions in LMICs

To discuss research on the scaling out of EBPIs, we first need to define an EBI. An intervention is considered evidence-based if (1) it is included in a federal registry of evidence-based interventions; or (2) it has produced positive effects on the primary targeted outcome and these findings have been reported in a peer-reviewed journal; or (3) the intervention has documented evidence of effectiveness, such as (a) documentation of a theory of change and (b) replication of findings (SAMHSA, 2016). One important caveat to this definition, however, is that much of the research establishing the effectiveness of these interventions has been conducted with relatively few disadvantaged persons in the trials, and with the majority of trials in the U.S. (Blinded for review-4; Yancey, Glenn, Ford, & Bell-Lewis, 2017; Yancey, Ortega, & Kumanyika, 2006). Implementing these interventions in LMICs is therefore challenging, because the effects of EBPIs are not the same for everyone (Gardner et al., 2017; Leijten et al., 2018; van Aar et al., 2017). Given these drawbacks, we caution against the assumption that if an intervention has evidence in an HIC, it can be implemented in an LMIC without first examining the context. As we described above, much work needs to be done, even at the measurement level, before we can examine whether an intervention has evidence in an LMIC.

Choosing how to measure an intervention being implemented in an LMIC, however, is not a trivial task. Many interventions use self-report measures of outcomes, such as the Alabama Parenting Questionnaire (APQ; Shelton, Frick, & Wootton, 1996) for parenting practices and the Child Behavior Checklist (CBCL; Achenbach, 1991) for child outcomes. Some interventions combine observational tasks with self-reports of child behavior to triangulate outcomes (Patterson, Forgatch, & DeGarmo, 2010). However, some instruments are very time-consuming to administer and/or are commercial and require payment per use, which makes them unsustainable for poorly funded projects in LMICs (Fernald, Prado, Kariger, & Raikes, 2017). How, then, can we examine the evidence for an intervention with such burdensome measures? How can we detect trends in child development to inform policy and intervention implementation if the measures are not comparable across settings that do not have the same resources?
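
One pragmatic response to these measurement constraints is to pilot free, translated instruments and check their basic psychometrics locally before a trial. The sketch below is a minimal illustration of one such check, internal consistency (Cronbach's alpha), computed on hypothetical pilot data; it is not an analysis from our studies, and the sample sizes and item structure are assumptions.

```python
# Minimal internal-consistency check for a translated parenting
# questionnaire, using only numpy. All data here are simulated.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, n_items) matrix of item scores."""
    n_items = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the scale total
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical pilot data: 50 caregivers x 10 Likert items (1-5),
# generated from a common latent trait plus noise.
rng = np.random.default_rng(0)
latent = rng.normal(size=(50, 1))
pilot = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(50, 10))), 1, 5)
print(f"Cronbach's alpha = {cronbach_alpha(pilot):.2f}")
```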

The World Bank Group provides a set of ideal characteristics of an assessment, which should, among other things, be appropriate, interpretable, easy to administer, and low cost (Fernald et al., 2017). To address these issues, scholars have advocated for evidence-based assessment (EBA; American Psychological Association, 2006; Hunsley & Mash, 2007), in which both the process of conducting the assessment and the instruments used for evaluation are carefully selected through a systematic, empirically based, research-driven approach (Beidas et al., 2015). While much has been published on EBA, including recommendations of measures for youth and adult outcomes (Hunsley, 2015; Hunsley & Mash, 2015; Mash & Hunsley, 2005; Roberts, Blossom, Evans, Amaro, & Kanine, 2017), much remains to be done on measures for low-resource settings (Beidas et al., 2015), particularly for international communities. Without good measures that are practical, free, valid, and translated into different languages, we will continue to struggle to provide patient-centered, equitable services to communities in LMICs.

Second, when choosing and implementing an intervention in diverse settings, attention needs to be given to the assumptions made regarding mechanisms of action. The underlying assumption of a parenting intervention is that it will help reduce conduct problems in children (Weisz & Kazdin, 2010). However, data have shown that a third of the families exposed to parenting interventions fail to show improvement (REF). While much remains to be learned about which intervention components help whom, some scholars have hypothesized that contextual factors may play a role (Gardner et al., 2017). For example, time out may be received differently by White American parents used to this technique in the U.S. than by immigrant parents or parents in LMICs (Blinded for review-7; Leijten et al., 2018). It may be that the source of the information matters: parents may see time out as an American way of teaching things (REF), or it may be that the strategy is simply not the best for everyone. Much research needs to be done to disentangle the specific components of parenting interventions that are effective for different populations. As Leijten et al. (2018) argue, questions about the clinical effectiveness of an intervention may be more fruitful if we move from asking “who benefits?” to asking “who benefits from what, when and how?” In fact, we argue that we can go even further with this question by adding components of implementation science to our work as we scale out EBPIs in LMICs. We advocate two ways to answer these questions: through study designs and through the clear engagement of stakeholders in our work.
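
As an illustration of how “who benefits from what, when and how?” can be examined empirically, the sketch below fits a treatment-by-context interaction with statsmodels. The variable names, the simulated effect, and the data are hypothetical assumptions for illustration, not findings from the trials discussed here.

```python
# Illustrative moderation test (not the authors' analysis): does the
# effect of an EBPI on child problems vary by setting?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),    # 1 = family received the EBPI
    "context": rng.integers(0, 2, n),  # 1 = second setting (hypothetical)
})
# Simulate a smaller treatment effect in the second context.
df["child_problems"] = (10 - 2.0 * df.treat + 1.2 * df.treat * df.context
                        + rng.normal(scale=2, size=n))

# The treat:context coefficient estimates how the effect differs by setting.
model = smf.ols("child_problems ~ treat * context", data=df).fit()
print(model.summary().tables[1])
```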

Implementation of EBIs in LMICs: designs. We can hypothesize that the well-cited finding that it takes 17 years for only 14 percent of research to benefit patient care (Balas & Boren, 2000) may be even worse in LMIC contexts. Part of the challenge behind this delay involves the designs used to test the efficacy of interventions. Traditionally, investigators are expected to conduct randomized controlled trials (RCTs) to test the efficacy of an intervention prior to scaling it out to usual care (Curran et al., 2012). However, RCTs present a host of challenges when it comes to conducting studies in low-resource settings and with minority populations: they raise the ethical and practical problems of randomizing members of small communities and of withholding beneficial interventions from communities that are often in dire need of basic services (Dixon, Salinas, & Marques, 2016). Additional risks, such as historical or political events that can affect vulnerable and low-resourced communities, may also make it challenging to conduct RCTs and to clearly delineate the effects of the intervention (Blinded for review-2).

Because of such challenges, researchers have recently advocated for other types of designs, such as regression discontinuity, interrupted time series, and roll-out randomization designs (Henry, Tolan, Gorman-Smith, & Schoeny, 2017). One design recently advocated for parenting interventions is the multiphase optimization strategy (MOST), a framework for evaluating behavioral interventions that also optimizes the intervention before its evaluation (Collins, 2018). The advantage of MOST designs is that they make it possible to evaluate which components of a given multicomponent EBPI contribute to the overall outcome, and which components produce an effect large enough to justify their implementation cost (Collins, 2014). Creative designs such as MOST and other adaptive designs may be a better option for identifying the EBPI components that can be implemented in LMICs given the realistic constraints of low-resource settings; as Collins (2014) puts it, “the best experimental design is the one that gathers the most, and most relevant, scientific information while making the most efficient use of available [resources].” We therefore encourage researchers to consider designs that can accommodate issues of internal and external validity (Landsverk et al., 2017), as well as the practical, pragmatic, and ethical considerations raised by RCTs, when implementing EBPIs in LMICs (Brown et al., 2017; Curran, Bauer, Mittman, Pyne, & Stetler, 2012; Landsverk et al., 2017; Mazzucca et al., 2018). An overview of the different designs proposed by implementation scientists to accelerate the reach of EBPIs is beyond the scope of this manuscript but can be found in other reviews (Landsverk et al., 2017; Mazzucca et al., 2018).
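
To make the logic of MOST concrete, the sketch below simulates the kind of factorial experiment on which MOST builds, in which each candidate component is crossed with the others so that every component's main effect can be estimated from a single trial. The component names, cell sizes, and effect sizes are hypothetical assumptions, not taken from any EBPI described in this paper.

```python
# Illustrative MOST-style component screening via a 2^3 full factorial.
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
components = ["praise", "timeout", "homework"]       # hypothetical EBPI components
cells = list(itertools.product([0, 1], repeat=3))    # all on/off combinations
rows = []
for praise, timeout, homework in cells:
    for _ in range(40):                              # 40 families per cell
        # Simulated truth: praise helps a lot, timeout a little, homework not at all.
        y = 10 - 1.5 * praise - 0.4 * timeout + rng.normal(scale=2)
        rows.append((praise, timeout, homework, y))
df = pd.DataFrame(rows, columns=components + ["problems"])

# Main effects indicate which components earn their implementation cost.
fit = smf.ols("problems ~ praise + timeout + homework", data=df).fit()
print(fit.params.round(2))
```

A design like this answers "which component produces an effect large enough to justify its cost?" directly, rather than testing the full package against a control and leaving component contributions unknown.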

Engaging communities is a key component of our work. While scale-up frameworks tend to include stakeholder engagement as a key component, there is limited empirical guidance on the key actions and best practices of stakeholder involvement (Goodman & Sanders Thompson, 2017). To address this gap, Goodman and Sanders Thompson (2017) propose three categories of stakeholder engagement: (a) non-participation, (b) symbolic participation, and (c) engaged participation. Engaged participation, according to the authors, encompasses collaboration (i.e., both researchers and community members are actively involved in the design and implementation of the project), patient-centered processes (i.e., community stakeholders are the main decision makers in the design and implementation process, as well as in publications, while researchers support the process but do not lead it), and community-based participatory research (i.e., where there is trust between community stakeholders and researchers, an equitable partnership, and shared decision making). The authors state that different members of the community can take on different roles, but that true stakeholder involvement entails giving voice to those who traditionally have limited power and input in the research design and implementation process. The follow-up question, then, is how to evaluate the engagement of one's stakeholders. Goodman and colleagues (2017) provide a survey measuring 11 principles of community involvement, ranging from acknowledging the community (e.g., showing appreciation for the community's time and effort) to disseminating findings and facilitating a collaborative and equitable partnership. While the survey still needs to be empirically tested in different settings as a potential predictor of community engagement, it could provide a useful guide for global researchers seeking to support their community work and social justice as they scale out evidence-based interventions.
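
As a simple illustration of how such a survey might be summarized in practice, the sketch below averages hypothetical item ratings within each engagement principle. The items, principle names, and scoring shown are illustrative assumptions only, not Goodman and colleagues' actual instrument or scoring rules.

```python
# Illustrative scoring of a stakeholder-engagement survey (hypothetical items).
import pandas as pd

# Hypothetical 1-5 ratings from three community partners on three of the
# eleven principles (each principle measured by two items here).
responses = pd.DataFrame({
    "acknowledge_1": [5, 4, 5], "acknowledge_2": [4, 4, 5],
    "disseminate_1": [3, 2, 4], "disseminate_2": [3, 3, 3],
    "equitable_1":   [4, 5, 4], "equitable_2":   [5, 5, 4],
})
principles = {
    "acknowledging_community": ["acknowledge_1", "acknowledge_2"],
    "disseminating_findings":  ["disseminate_1", "disseminate_2"],
    "equitable_partnership":   ["equitable_1", "equitable_2"],
}
# Mean over a principle's items per partner, then over partners.
scores = {name: responses[items].mean(axis=1).mean()
          for name, items in principles.items()}
print(scores)  # low-scoring principles flag where engagement needs attention
```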

Knowledge sharing as a two-way process

The advantage of positive partnership is, of course, related to the sustainability of any global work. In many ways, scale-up frameworks assume a one-way flow of information from HICs to LMICs. A bi-directional relationship is crucial if global work is not to impose colonialist practices; a unidirectional approach could only reinforce oppression (Parker et al., 2014). To avoid colonialism in our global work, Parker and colleagues (2014) advocate for: (1) clear agreement and shared goals between all parties; (2) equitable distribution of power, including opportunities to change the design and implementation of the intervention; (3) equal incorporation of local knowledge and perspectives when developing and recognizing skills and expertise; (4) ongoing communication based on honest exchanges and willingness to raise concerns; and (5) trust.

Bi-directional communication between HIC and LMIC partners is not only good research practice grounded in social justice, but also a way to “bring back” lessons learned from LMICs. The notion that knowledge gained in LMICs is relevant to HICs is not new and is well documented in different areas of the literature (Harris, Weisberger, Silver, Dadwal, & Macinko, 2016). Different names have been used to label this process of “bringing back” lessons learned to HICs, such as “reverse innovation” (Bhattacharyya et al., 2017; Immelt, Govindarajan, & Trimble, 2009; Trimble & Govindarajan, 2012), “innovation blowback” (Brown & Hagel, 2005), and “social innovation” (Chambon, David, & Devevey, 1982). However, the field of global health still has a lot to learn, as innovations and lessons learned from LMICs tend to be discounted and undervalued (Harris et al., 2016), including bias against the publication and sharing of information from LMIC researchers (Harris et al., 2017).

Conclusions

The implementation of EBIs in LMICs is a complex process that calls for creativity and adaptability while maintaining scientific rigor grounded in social justice principles. It is through bi-directional communication and the sharing of knowledge among scholars engaged in global work that access to culturally and contextually relevant parenting programs will be expanded. It will take a committed community of scholars to make these needed programs more accessible and sustainable. We urge global health researchers to collaborate with implementation science researchers to draw on frameworks and creative designs that could test the efficacy of EBPIs in LMICs while also accelerating the uptake of these interventions (Betancourt & Chambers, 2016). We also urge treatment developers to consider the transportability and sustainability of their interventions in different settings. Designing and testing interventions using methods such as user-centered design (Baumann et al., 2018; Lyon et al., 2014; Lyon & Koerner, 2016) could be a beneficial way to balance scientific integrity with social justice in LMICs.

Acknowledgments

Compliance with Ethical Standards:

Funding: This study was funded by grants 3U01HL133994-02S1, 1U24HL136790-01, 1R01HG009351-01A1, 3UL1RR024992-09, and K01-MH066297.

Footnotes

Conflict of Interest: All the authors declare that there is no conflict of interest.

Ethical approval: All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed consent: Informed consent was obtained from all individual participants included in the study.

Contributor Information

Ana A. Baumann, Washington University in St. Louis.

Anilena Mejia, Instituto de Investigaciones Científicas y Servicios de Alta Tecnología (INDICASAT).

Jamie M. Lachman, University of Oxford, University of Glasgow.

J. Rubén Parra Cardona, The University of Texas at Austin.

Gabriela López-Zerón, Michigan State University.

Nancy G. Amador Buenabad, Instituto Nacional de Psiquiatría.

Eunice Vargas, Universidad Autónoma de Baja California.

Melanie M. Domenech Rodríguez, Utah State University.

References

1. Aarons GA, Sklar M, Mustanski B, Benbow N, & Brown CH (2017). “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implementation Science, 12, 111. https://doi.org/10.1186/s13012-017-0640-6
2. Abdulmalik J, & Thornicroft G (2016). Community mental health: A brief, global perspective. Neurology, Psychiatry and Brain Research, 22(2), 101–104. https://doi.org/10.1016/j.npbr.2015.12.065
3. American Psychological Association. (2006). Evidence-based practice in psychology. American Psychologist, 61, 271–285. https://doi.org/10.1037/0003-066X.61.4.271
4. Balas EA, & Boren SA (2000). Managing clinical knowledge for health care improvement. Yearbook of Medical Informatics 2000: Patient-Centered Systems, 1, 65–70. https://doi.org/10.1055/s-0038-1637943
Blinded for review-1. Blinded for review-2. Blinded for review-3. Blinded for review-4. Blinded for review-5.
5. Bayetti C, & Jain S (2017). Problematising global mental health. In Cohen BMZ (Ed.), Routledge international handbook of critical mental health. London: Routledge.
6. Beidas RS, Stewart RE, Walsh L, Lucas S, Downey MM, Jackson K, … Mandell DS (2015). Free, brief, and validated: Standardized instruments for low-resource mental health settings. Cognitive and Behavioral Practice, 22, 5–19. https://doi.org/10.1016/j.cbpra.2014.02.002
7. Belfer ML (2008). Child and adolescent mental disorders: The magnitude of the problem across the globe. Journal of Child Psychology and Psychiatry, 49, 226–236. https://doi.org/10.1111/j.1469-7610.2007.01855.x
8. Bernal G, Bonilla J, & Bellido C (1995). Ecological validity and cultural sensitivity for outcome research: Issues for the cultural adaptation and development of psychosocial treatments with Hispanics. Journal of Abnormal Child Psychology, 23, 67–82. https://doi.org/10.1007/BF01447045
9. Betancourt TS, & Chambers DA (2016). Optimizing an era of global mental health implementation science. JAMA Psychiatry, 73(2), 99–100. https://doi.org/10.1001/jamapsychiatry.2015.2705
10. Bhattacharyya O, Wu D, Mossman K, Hayden L, Gill P, Cheng Y-L, … McGahan A (2017). Criteria to assess potential reverse innovations: Opportunities for shared learning between high- and low-income countries. Globalization and Health, 13, 4. https://doi.org/10.1186/s12992-016-0225-1
11. Blachman-Demner DR, Wiley TR, & Chambers DA (2017). Fostering integrated approaches to dissemination and implementation and community engaged research. Translational Behavioral Medicine, 7, 543–546. https://doi.org/10.1007/s13142-017-0527-8
12. Bowen DJ, Hyams T, Goodman M, West KM, Harris-Wai J, & Yu JH (2017). Systematic review of quantitative measures of stakeholder engagement. Clinical and Translational Science, 10, 314–336. https://doi.org/10.1111/cts.12474
13. Brown CH, Chamberlain P, Saldaña L, Padgett C, Wang W, & Cruden G (2014). Evaluation of two implementation strategies in 51 child county public service systems in two states: Results of a cluster randomized head-to-head implementation trial. Implementation Science, 9, 134. https://doi.org/10.1186/s13012-014-0134-8
14. Brown CH, Curran G, Palinkas LA, Aarons GA, Wells KB, Jones L, … Cruden G (2017). An overview of research and evaluation designs for dissemination and implementation. Annual Review of Public Health, 38, 1–22. https://doi.org/10.1146/annurev-publhealth-031816-044215
15. Brownson RC, Colditz GA, & Proctor EK (2017). Dissemination and implementation research in health: Translating science to practice (2nd ed.). New York: Oxford University Press. https://doi.org/10.1093/oso/9780190683214.001.0001
16. Brown JS, & Hagel J (2005). Innovation blowback: Disruptive management practices from Asia. McKinsey Quarterly, 1, 35–45. Retrieved from https://www.mckinsey.com/
Blinded for review-6.
17. Chambon JL, David A, & Devevey JM (1982). Les innovations sociales. Paris: Presses Universitaires de France.
18. Collins LM (2014). Optimizing family intervention programs: The multiphase optimization strategy (MOST). In McHale SM, Amato PR, & Booth A (Eds.), Emerging methods in family research (pp. 231–244). New York, NY: Springer.
19. Collins LM (2018). Optimization of behavioral, biobehavioral, and biomedical interventions: The multiphase optimization strategy (MOST). New York, NY: Springer.
20. Curran GM, Bauer M, Mittman B, Pyne JM, & Stetler C (2012). Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care, 50, 217–226. https://doi.org/10.1097/MLR.0b013e3182408812
21. Dixon L, Salinas M, & Marques L (2016). Advances and challenges in conducting research with diverse and vulnerable populations in a healthcare setting: Reducing stigma and increasing cultural sensitivity. In Parekh R, & Childs EW (Eds.), Stigma and prejudice: Touchstones in understanding diversity and healthcare (pp. 303–324). Switzerland: Humana Press.
Blinded for review-7.
22. Duncan KM, MacGillivray S, & Renfrew MJ (2017). Costs and savings of parenting interventions: Results of a systematic review. Child: Care, Health and Development, 43, 797–811. https://doi.org/10.1111/cch.12473
23. Fernald LCH, Prado E, Kariger P, & Raikes A (2017). A toolkit for measuring early childhood development in low- and middle-income countries. Washington, DC: World Bank. Retrieved from www.worldbank.org
24. Gardner F, Leijten P, Mann J, Landau S, Harris V, Beecham J, … Scott S (2017). Could scale-up of parenting programmes improve child disruptive behaviour and reduce social inequalities? Using individual participant data meta-analysis to establish for whom programmes are effective and cost-effective. Public Health Research, 5(10), 1–144. https://doi.org/10.3310/phr05100
25. Goodman MS, Thompson VLS, Arroyo Johnson C, Gennarelli R, Drake BF, Bajwa P, … Bowen D (2017). Evaluating community engagement in research: Quantitative measure development. Journal of Community Psychology, 45(1), 17–32. https://doi.org/10.1002/jcop.21828
26. Goodman MS, & Sanders Thompson VL (2017). The science of stakeholder engagement in research: Classification, implementation, and evaluation. Translational Behavioral Medicine, 7, 486–491. https://doi.org/10.1007/s13142-017-0495-z
27. Harris M, Weisberger E, Silver D, Dadwal V, & Macinko J (2016). “That’s not how the learning works” – the paradox of reverse innovation: A qualitative study. Globalization and Health, 12, 36. https://doi.org/10.1186/s12992-016-0175-7
28. Harris M, Marti J, Watt H, Bhatti Y, Macinko J, & Darzi AW (2017). Explicit bias toward high-income-country research: A randomized, blinded, crossover experiment of English clinicians. Health Affairs, 36, 1997–2004. https://doi.org/10.1377/hlthaff.2017.0773
29. Henry D, Tolan P, Gorman-Smith D, & Schoeny M (2017). Alternatives to randomized control trial designs for community-based prevention evaluation. Prevention Science, 18, 671–680. https://doi.org/10.1007/s11121-016-0706-8
30. Holt CL, & Chambers DA (2017). Opportunities and challenges in conducting community-engaged dissemination/implementation research. Translational Behavioral Medicine, 7, 389–392. https://doi.org/10.1007/s13142-017-0520-2
31. Hunsley J (2015). Translating evidence-based assessment principles and components into clinical practice settings. Cognitive and Behavioral Practice, 22, 101–109. https://doi.org/10.1016/j.cbpra.2014.10.001
32. Hunsley J, & Mash EJ (2007). Evidence-based assessment. Annual Review of Clinical Psychology, 3, 29–51. https://doi.org/10.1146/annurev.clinpsy.3.022806.091419
33. Hunsley J, & Mash E (2005). Introduction to the special section on developing guidelines for the evidence-based assessment of adult disorders. Psychological Assessment, 17, 251–255. https://doi.org/10.1037/1040-3590.17.3.251
34. Kirmayer LJ, & Pedersen D (2014). Toward a new architecture for global mental health. Transcultural Psychiatry, 51, 759–776. https://doi.org/10.1177/1363461514557202
35. Immelt JR, Govindarajan V, & Trimble C (2009). How GE is disrupting itself. Harvard Business Review, 87, 56–65. Retrieved from https://hbr.org/
36. Landsverk J, Brown CH, Smith JD, Chamberlain P, Curran GM, Palinkas L, … Horwitz SM (2017). Design and analysis in dissemination and implementation research. In Brownson RC, Colditz GA, & Proctor EK (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. 201–228). New York: Oxford University Press.
37. Leijten P, Melendez-Torres GJ, Gardner F, van Aar J, Schulz S, & Overbeek G (2018, online first). Are relationship enhancement and behavior management “the golden couple” for disruptive child behavior? Two meta-analyses. Child Development. https://doi.org/10.1111/cdev.13051
38. Leijten P, Raaijmakers M, Wijngaards L, Matthys W, Menting A, Hemink-van Putten M, & Orobio de Castro B (2018). Understanding who benefits from parenting interventions for children’s conduct problems: An integrative data analysis. Prevention Science, 19, 579–588. https://doi.org/10.1007/s11121-018-0864-y
39. Lyon AR, & Koerner K (2016). User-centered design for psychosocial intervention development and implementation. Clinical Psychology: Science and Practice, 23, 180–200. https://doi.org/10.1111/cpsp.12154
40. Lyon AR, Lau AS, McCauley E, Vander Stoep A, & Chorpita BF (2014). A case for modular design: Implications for implementing evidence-based interventions with culturally diverse youth. Professional Psychology: Research and Practice, 45, 57–66. https://doi.org/10.1037/a0035301
41. Mash E, & Hunsley J (2005). Evidence-based assessment of child and adolescent disorders: Issues and challenges. Journal of Clinical Child & Adolescent Psychology, 34, 362–379. https://doi.org/10.1207/s15374424jccp3403_1
42. Mason PH, Kerridge I, & Lipworth W (2017). The global in global health is not a given. The American Journal of Tropical Medicine and Hygiene, 96, 767–769. https://doi.org/10.4269/ajtmh.16-0791
43. Mazzucca S, Tabak RG, Pilar M, Ramsey AT, Baumann AA, Kryzer E, … Brownson RC (2018). Variation in research designs used to test the effectiveness of dissemination and implementation strategies: A review. Frontiers in Public Health, 6, 32. https://doi.org/10.3389/fpubh.2018.00032
Blinded for review-8. Blinded for review-9. Blinded for review-10.
44. Mills C, & White RG (2017). ‘Global mental health spreads like bush fire in the global south’: Efforts to scale up mental health services in low- and middle-income countries. In White RG, Jain S, Orr DMR, & Read UM (Eds.), The Palgrave handbook of sociocultural perspectives on global mental health (pp. 187–209). London, UK: Palgrave Macmillan.
Blinded for review-11.
45. Parker G, Ali S, Ringell K, & McKay M (2014). Bi-directional exchange: The cornerstone of globally focused social work. Global Social Welfare, 1(1), 1–8. https://doi.org/10.1007/s40609-014-0011-z
46. Patel V (2014). Why mental health matters to global health. Transcultural Psychiatry, 51, 777–789. https://doi.org/10.1177/1363461514524473
47. Patterson GR, Forgatch MS, & DeGarmo DS (2010). Cascading effects following intervention. Development and Psychopathology, 22, 949–970. https://doi.org/10.1017/S0954579410000568
48. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, … Hensley M (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health, 38, 65–76. https://doi.org/10.1007/s10488-010-0319-7
49. Rabin BA, & Brownson RC (2017). Terminology for dissemination and implementation research. In Brownson R, Colditz G, & Proctor E (Eds.), Dissemination and implementation research in health: Translating science to practice (2nd ed., pp. 19–45). New York: Oxford University Press.
50. Roberts MC, Blossom JB, Evans SC, Amaro CM, & Kanine RM (2017). Advancing the scientific foundation for evidence-based practice in clinical child and adolescent psychology. Journal of Clinical Child & Adolescent Psychology, 46, 915–928. https://doi.org/10.1080/15374416.2016.1152554
51. Substance Abuse and Mental Health Services Administration. (2016). Defining “evidence based.” Retrieved from https://www.samhsa.gov/capt/applying-strategic-prevention-framework/step3-plan/defining-evidence-based
52. Tabak RG, Khoong EC, Chambers DA, & Brownson RC (2012). Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine, 43, 337–350. https://doi.org/10.1016/j.amepre.2012.05.024
53. Trimble C, & Govindarajan V (2012). Reverse innovation: Create far from home, win everywhere. Boston, MA: Harvard Business Review Press.
54. van Aar J, Leijten P, de Castro BO, & Overbeek G (2017). Sustained, fade-out or sleeper effects? A systematic review and meta-analysis of parenting interventions for disruptive child behavior. Clinical Psychology Review, 51, 153–163. https://doi.org/10.1016/j.cpr.2016.11.006
55. Weiss B, Ngo VK, Dang HM, Pollack A, Trung LT, Tran CV, … Do KN (2012). A model for sustainable development of child mental health infrastructure in the LMIC world: Vietnam as a case example. International Perspectives in Psychology: Research, Practice, Consultation, 1, 63–77. https://doi.org/10.1037/a0027316
56. Weisz JR, & Kazdin AE (2010). Evidence-based psychotherapies for children and adolescents (2nd ed.). New York: Guilford Press.
57. White RG, Gregg J, Batten S, Hayes LL, & Kasujja R (2017). Contextual behavioral science and global mental health: Synergies and opportunities. Journal of Contextual Behavioral Science, 6, 245–251. https://doi.org/10.1016/j.jcbs.2017.07.001
58. World Health Organization, & ExpandNet. (2011). Beginning with the end in mind: Planning pilot projects and other programmatic research for successful scaling up. Retrieved from http://www.who.int/reproductivehealth/publications/strategic_approach/9789241502320/en/
59. Yancey AT, Glenn BA, Ford CL, & Bell-Lewis L (2017). Dissemination and implementation research among racial/ethnic minority and other vulnerable populations. In Brownson RC, Colditz GA, & Proctor EK (Eds.), Dissemination and implementation research in health: Translating science to practice (2nd ed., pp. 449–470). New York: Oxford University Press. https://doi.org/10.1093/oso/9780190683214.001.0001
60. Yancey A, Ortega A, & Kumanyika S (2006). Effective recruitment and retention of minority research participants. Annual Review of Public Health, 27, 1–28. https://doi.org/10.1146/annurev.publhealth.27.021405.102113
