Administration and Policy in Mental Health and Mental Health Services Research. 2010 Oct 22;38(1):44–53. doi: 10.1007/s10488-010-0314-z

Mixed Method Designs in Implementation Research

Lawrence A Palinkas 1,6, Gregory A Aarons 2,6, Sarah Horwitz 3, Patricia Chamberlain 4, Michael Hurlburt 5,6, John Landsverk 6
PMCID: PMC3025112  PMID: 20967495

Abstract

This paper describes the application of mixed method designs in implementation research in 22 mental health services research studies published in peer-reviewed journals over the last 5 years. Our analyses revealed 7 different structural arrangements of qualitative and quantitative methods, 5 different functions of mixed methods, and 3 different ways of linking quantitative and qualitative data together. Complexity of design was associated with number of aims or objectives, study context, and phase of implementation examined. The findings provide suggestions for the use of mixed method designs in implementation research.

Keywords: Methods, Implementation research, Mental health services


Despite the need for and existence of practices that effectively prevent or treat mental health problems in children and adolescents, such practices are rarely employed in child welfare systems (Usher and Wildfire 2003; Burns et al. 2004; Leslie et al. 2004). In fact, as much as 90% of public youth-service systems, including mental health, education, juvenile justice and child welfare, do not use evidence-based practices (EBPs) (Hoagwood and Olin 2002). Unfortunately, our understanding of the reasons for this apparent gap between science and practice is limited to a few empirical studies and conceptual models that may or may not be empirically grounded (Aarons et al., this issue). In implementation research, mixed method designs have increasingly been used to develop a science base for understanding and overcoming barriers to implementation. More recently, they have been used in the design and implementation of strategies to facilitate the implementation of EBPs (Proctor et al. 2009). Mixed method designs involve collecting, analyzing and merging both quantitative and qualitative data in one or more studies. The central premise of these designs is that the combined use of quantitative and qualitative approaches provides a better understanding of research issues than either approach alone (Robins et al. 2008). In such designs, qualitative methods are used to explore and obtain depth of understanding of the reasons for success or failure in implementing evidence-based practice, or to identify strategies for facilitating implementation, while quantitative methods are used to test and confirm hypotheses based on an existing conceptual model and to obtain breadth of understanding of predictors of successful implementation (Teddlie and Tashakkori 2003).

In this paper, we examine the application of mixed method designs in implementation research in a sample of mental health services research studies published in peer-reviewed journals over the last 5 years. Our aim was to determine how such methods were currently being used, whether this use was consistent with the conceptual framework outlined by Aarons et al. (this issue) for understanding the phases of implementation, and whether these strategies could offer any guidance for subsequent use of mixed methods in implementation research.

Methods

We conducted a literature review of mental health services research publications over a five-year period (Jan 2005–Dec 2009), using the PubMed Central database. Data were taken from the full text of the research article. Criteria for identification and selection of articles included reports of original research and one of the following: (1) studies that were specifically identified as using mixed methods, either through keywords or description in the title; (2) qualitative studies conducted as part of larger projects, including randomized controlled trials, which also included use of quantitative methods; or (3) studies that “quantitized” qualitative data (Miles and Huberman 1994) or “qualitized” quantitative data (Tashakkori and Teddlie 1998). Per criteria used by McKibbon and Gadd (2004), the analysis had to be fairly substantial—for example, a simple descriptive analysis of baseline demographics of the participants was not sufficient to be included as a mixed method article. Further, qualitative studies that were not clearly linked to quantitative studies or methods were excluded from our review.

We next assessed the use of mixed methods in each study to determine their structure, function, and process. A taxonomy of these elements of mixed method designs and definitions of terms are provided in Table 1 below. Procedures for assessing the reliability of the classification procedures are described elsewhere (Palinkas et al. 2010). Assessment of the structure of the research design was based on Morse’s (1991) taxonomy, which gives emphasis to timing (e.g., using methods in sequence [represented by a “→” symbol] versus using them simultaneously [represented by a “+” symbol]) and to weighting (e.g., the primary method is represented in capital letters, such as “QUAN”, and the secondary method in lowercase letters, such as “qual”). Assessment of the function of mixed methods was based on whether the two methods were being used to answer the same question or related questions, and whether the intention of using mixed methods corresponded to any of the five types of mixed method designs described by Greene et al. (1989): triangulation or convergence, complementarity, expansion, development, and initiation or sampling. Finally, the process or strategies for combining qualitative and quantitative data were assessed using the typology proposed by Creswell and Plano Clark (2007): merging or converging the two datasets by actually bringing them together, connecting the two datasets by having one build upon the other, or embedding one dataset within the other so that one type of data plays a supportive role for the other.

Table 1.

Taxonomy of mixed method designs

Element | Category | Definition
Structure | QUAL → quan | Sequential collection and analysis of qualitative and quantitative data, beginning with qualitative data, for the primary purpose of exploration/hypothesis generation
Structure | qual → QUAN | Sequential collection and analysis of qualitative and quantitative data, beginning with qualitative data, for the primary purpose of confirmation/hypothesis testing
Structure | quan → QUAL | Sequential collection and analysis of quantitative and qualitative data, beginning with quantitative data, for the primary purpose of exploration/hypothesis generation
Structure | QUAN → qual | Sequential collection and analysis of quantitative and qualitative data, beginning with quantitative data, for the primary purpose of confirmation/hypothesis testing
Structure | qual + QUAN | Simultaneous collection and analysis of quantitative and qualitative data for the primary purpose of confirmation/hypothesis testing
Structure | QUAL + quan | Simultaneous collection and analysis of quantitative and qualitative data for the primary purpose of exploration/hypothesis generation
Structure | QUAN + QUAL | Simultaneous collection and analysis of quantitative and qualitative data, giving equal weight to both types of data
Function | Convergence | Using both types of methods to answer the same question, either through comparison of results to see if they reach the same conclusion (triangulation) or by converting a dataset of one type into the other (e.g., quantifying qualitative data or qualifying quantitative data)
Function | Complementarity | Using each set of methods to answer a related question or series of questions for purposes of evaluation (e.g., using quantitative data to evaluate outcomes and qualitative data to evaluate process) or elaboration (e.g., using qualitative data to provide depth of understanding and quantitative data to provide breadth of understanding)
Function | Expansion | Using one type of method to answer questions raised by the other type of method (e.g., using a qualitative dataset to explain results of the analysis of a quantitative dataset)
Function | Development | Using one type of method to answer questions that will enable use of the other method to answer other questions (e.g., to develop data collection measures, conceptual models or interventions)
Function | Sampling | Using one type of method to define or identify the participant sample for collection and analysis of data representing the other type of method (e.g., selecting interview informants based on responses to a survey questionnaire)
Process | Merge | Merging or converging the two datasets by actually bringing them together (e.g., convergence: triangulation to validate one dataset using another type of dataset)
Process | Connect | Having one dataset build upon the other (e.g., complementarity: elaboration, transformation, expansion, initiation or sampling)
Process | Embed | Conducting one study within another so that one type of data plays a supportive role for the other dataset (e.g., complementarity: evaluation, such as a qualitative study of implementation process embedded within an RCT of implementation outcome)
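To make the coding scheme in Table 1 concrete, here is a minimal Python sketch of how each reviewed study could be coded on the three elements and how the codes could be tallied. The data structures, study identifiers, and example codings are illustrative assumptions rather than the authors' actual coding instrument; the point is simply that a study can carry more than one code per element, which is why the counts reported in the Results can exceed the number of studies.

```python
# A minimal sketch (not the authors' coding instrument) of the Table 1 taxonomy
# as a coding scheme: each study can carry several codes per element.
from collections import Counter
from dataclasses import dataclass, field

STRUCTURES = {"QUAL → quan", "qual → QUAN", "quan → QUAL", "QUAN → qual",
              "qual + QUAN", "QUAL + quan", "QUAN + QUAL"}
FUNCTIONS = {"convergence", "complementarity", "expansion", "development", "sampling"}
PROCESSES = {"merge", "connect", "embed"}

@dataclass
class StudyCoding:
    study_id: str                                   # hypothetical identifier
    structures: list = field(default_factory=list)  # one or more structure codes
    functions: list = field(default_factory=list)   # one or more function codes
    processes: list = field(default_factory=list)   # one or more process codes

    def validate(self) -> None:
        # Every code must come from the controlled vocabulary in Table 1.
        assert set(self.structures) <= STRUCTURES
        assert set(self.functions) <= FUNCTIONS
        assert set(self.processes) <= PROCESSES

# Hypothetical codings for two studies (identifiers and codes are invented).
codings = [
    StudyCoding("study_a", ["QUAN + qual"], ["complementarity", "expansion"], ["embed"]),
    StudyCoding("study_b", ["qual → QUAN", "QUAN + QUAL"], ["development"], ["connect"]),
]
for c in codings:
    c.validate()

# Because a study may carry more than one code per element, these tallies can
# exceed the number of studies reviewed, as noted in the Results.
print(Counter(s for c in codings for s in c.structures))
print(Counter(f for c in codings for f in c.functions))
```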

Results

Our search identified 22 articles published between 2005 and 2009 that met our criteria for analysis. Our analyses revealed 7 different structural arrangements, 5 different functions of mixed methods, and 3 different ways of linking quantitative and qualitative data together. Many studies included more than one structural arrangement, function or process; hence the raw numbers often added up to more than the total number of studies reviewed. Twelve of the 22 papers presented qualitative data only, but were part of larger studies that included the use of quantitative measures.

Mixed Method Structure

In 9 of the 22 studies reviewed, quantitative and qualitative methods were used in sequence, and in 19 studies they were used simultaneously; six studies used them in both sequential and simultaneous fashion. Sequential designs are dictated by the specific methodology, study objectives, or logistical issues in the collection and analysis of data. For instance, Proctor et al. (2007) conducted a qualitative pilot study to capture the perspective of agency directors on the challenge of implementing evidence-based practices in community mental health agencies prior to the development and testing of a specific implementation intervention, in the belief that incorporating this perspective in the development stage would lead to a more successful outcome that would then be assessed using quantitative methods (qual → QUAN). Using the technique of concept mapping (Trochim 1989), Aarons et al. (2009) solicited information on factors likely to impact implementation of EBPs in public sector mental health settings from 31 service providers and consumers organized into 6 focus groups. Each participant then sorted a series of 105 statements into piles and rated each statement according to importance and changeability. Data were then entered into a software program that uses multidimensional scaling and hierarchical cluster analysis to generate a visual display of how statements clustered across all participants. Finally, 22 of the original 31 participants assigned meaning to and identified an appropriate name for each of the clusters identified (Aarons et al. 2009).
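The quantitative core of this concept-mapping procedure can be sketched in a few lines of Python. The example below simulates pile-sort data and uses arbitrary numbers of statements, participants, and clusters; it is intended only to illustrate how sorted statements are converted into a dissimilarity matrix and then analyzed with multidimensional scaling and hierarchical clustering, not to reproduce the Aarons et al. (2009) analysis or the software they used.

```python
# A hedged sketch of the quantitative core of concept mapping (Trochim 1989):
# pile sorts are converted to a dissimilarity matrix and then "quantitized"
# with multidimensional scaling and hierarchical clustering. The pile-sort
# data are simulated; the numbers of statements, participants, and clusters
# are arbitrary placeholders, not values from Aarons et al. (2009).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform
from sklearn.manifold import MDS

n_statements, n_participants = 8, 4
rng = np.random.default_rng(0)
# Each row: one participant's pile assignment for every statement (simulated).
pile_assignments = rng.integers(0, 3, size=(n_participants, n_statements))

# Similarity = proportion of participants who sorted two statements together.
similarity = np.zeros((n_statements, n_statements))
for piles in pile_assignments:
    similarity += (piles[:, None] == piles[None, :]).astype(float)
similarity /= n_participants
dissimilarity = 1.0 - similarity

# Two-dimensional MDS map of the statements plus hierarchical cluster labels.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissimilarity)
clusters = fcluster(linkage(squareform(dissimilarity, checks=False), method="average"),
                    t=3, criterion="maxclust")
print(coords.round(2))   # x, y position of each statement on the concept map
print(clusters)          # cluster label for each statement
```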

As an example of simultaneous collection and analysis of qualitative and quantitative data, Sharkey et al. (2005) conducted a qualitative study of factors affecting the implementation of a randomized controlled trial in parallel with the trial’s quantitative assessment of the effectiveness of a transitional discharge model for people with a serious mental illness (QUAN + qual). Aarons and Palinkas (Aarons and Palinkas 2007; Palinkas and Aarons 2009) simultaneously collected qualitative data through annual interviews and focus groups and quantitative data through semi-annual web-based surveys to assess the process of implementation of SafeCare®, an intervention designed to reduce child neglect and out-of-home placements of neglected children. The study also assessed the intervention’s impact on agency organizational culture and climate and on the therapeutic relationship between home visitor and client family (QUAN + QUAL).

With respect to the weighting or prioritization of each method, all but one of the studies examined had unbalanced designs; of these, 19 studies used quantitative methods as the primary or dominant method and qualitative methods as the secondary or subordinate method. For instance, a qualitative assessment by Palinkas et al. (2008) of the process of implementation of evidence-based treatments for depression, anxiety and conduct disorders in children was secondary to the primary aim of evaluating the effectiveness of two different variations of the treatments, one based on the standardized use of manualized treatments and one based on a modular approach (QUAN + qual). In two studies (Aarons et al. 2009; Bachman et al. 2009) qualitative methods were primary and quantitative methods were secondary (quan + QUAL); in two other studies (Aarons and Palinkas 2007; Marty et al. 2008) both types of unbalanced designs were used.

Ten of the 22 studies included balanced designs in which quantitative and qualitative methods were given equal weight. In all 10 studies, the methods were used simultaneously (QUAN + QUAL). Whitley et al. (2009) documented the process of implementation of an illness management and recovery program for people with severe mental illness in community mental health settings using qualitative data to assess perceived barriers and facilitators of implementation and quantitative data to assess implementation performance based on assessments of fidelity to the practice model, with no overriding priority assigned to either aim. Some studies gave equal weight to qualitative and quantitative data for the purpose of evaluating fidelity and implementation barriers/facilitators even though the collection of qualitative data to assess implementation was viewed as secondary to the overall goal of evaluating the effectiveness of an intervention (e.g., Marshall et al. 2008; Marty et al. 2008; Rapp et al. 2009).

Mixed Method Function

Our review revealed five distinct functions of mixing methods. The first function was convergence, in which qualitative and quantitative methods were used sequentially or simultaneously to answer the same question. Eight (36%) of the studies included this function. We identified two specific forms of convergence, triangulation and transformation. Triangulation involves the use of one type of data to validate or confirm conclusions reached from analysis of the other type of data. For instance, in examining the sustainability of evidence-based practices in routine mental health agencies, Swain et al. (2009) used triangulation to identify commonalities and disparities between quantitative data obtained from closed-ended questions and qualitative data obtained from open-ended questions in a survey administered to 49 participants, each representing a distinct practice site. Transformation involves the sequential quantification of qualitative data (qual → QUAN) or the use of qualitative techniques to transform quantitative data. The technique of concept mapping used by Aarons et al. (2009), in which qualitative data elicited from focus groups are “quantitized” using multidimensional scaling and hierarchical cluster analysis, is an example of transformation.
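A hedged sketch of what triangulation of this kind might look like in practice is shown below. The site identifiers, ratings, qualitative codes, and the decision rule for the quantitative conclusion are all invented for illustration; the sketch simply compares, respondent by respondent, whether the two data sources point to the same conclusion.

```python
# A hedged sketch of triangulation: closed-ended (quantitative) ratings and
# coded open-ended (qualitative) responses from the same sites are compared
# to see whether they support the same conclusion. Site names, ratings,
# codes, and the 4-or-above decision rule are invented for illustration.
closed_ended = {"site_01": 4, "site_02": 2, "site_03": 5}  # 1-5 sustainability rating
open_ended_code = {"site_01": "sustained", "site_02": "discontinued", "site_03": "sustained"}

def quant_conclusion(score: int) -> str:
    """Hypothetical rule: ratings of 4 or 5 are read as 'sustained'."""
    return "sustained" if score >= 4 else "discontinued"

for site, score in closed_ended.items():
    quant, qual = quant_conclusion(score), open_ended_code[site]
    status = "convergent" if quant == qual else "divergent"
    print(f"{site}: quantitative={quant}, qualitative={qual} -> {status}")
```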

In 14 studies, quantitative and qualitative methods were used in complementary fashion to answer related questions for the purpose of evaluation. For instance, Hoagwood et al. (2007) used a case study of an individual child to describe the process of implementation of an evidence-based, trauma-focused, cognitive-behavioral therapy for treatment of symptoms of PTSD in children living in New York City in the aftermath of the World Trade Center attack on September 11, 2001. Although the article does provide information on the outcome of the child’s treatment, the case study method was intended more to illustrate the process of treatment, beginning with engagement and moving to assessment, treatment, and finally, to outcome. This technique also illustrates the use of an elaborative design in which qualitative methods are used to provide depth of understanding to complement the breadth of understanding afforded by quantitative methods. In this instance, the “thick description” of the child’s progress from symptom presentation to completion of treatment offers a degree of depth of understanding of the experience of this child and other study participants that is not possible from measures on standardized clinical assessment instruments alone.

In 13 of the studies, mixed method designs exhibited the function of expansion, in which qualitative data were used to explain findings from the analyses of quantitative data. For instance, Kramer and Burns (2008) used data from qualitative interviews with providers as part of a summative evaluation to understand the factors contributing to partial or full implementation of a cognitive-behavioral therapy for depressed adolescents in two publicly funded mental healthcare settings. Brunette et al. (2008) used qualitative data collected from interviews and ethnographic observations to elucidate barriers and facilitators to implementation of integrated dual disorders treatment and to explain differences in treatment fidelity across the study sites.

Mixed methods were also used in 6 studies for the purpose of developing new measures, conceptual models, or interventions. In one study (Blasinsky et al. 2006), development of a rating scale to construct predictors of program outcomes and sustainability of a collaborative care intervention to assist older adults suffering from major depression or dysthymia involved the sequential use of qualitative methods to identify the form and content of items (e.g., survey questions) to be used in a quantitative study (qual → QUAN). In a second study, qualitative data were sequentially collected and analyzed to develop a conceptual framework for generating hypotheses explaining the adoption and implementation of Functional Family Therapy in a sample of family and child mental health services organizations in New York State; these hypotheses were then to be tested using quantitative methods (qual → QUAN) (Zazzali et al. 2008). In two studies, intervention development or adaptation involved the use of qualitative methods to develop new interventions or adapt existing interventions to new populations (qual → QUAN). For instance, semi-structured interviews were conducted by Henke et al. (2008) to assess the feasibility of a primary care depression performance-based reward program.

Finally, one method was used to identify a sample of participants for data collection with the other method. This technique was used in 5 of the 22 studies (23%). One form of sampling was the sequential use of quantitative data to identify potential participants for a qualitative study (quan → QUAL). Aarons and Palinkas (2007), for example, selected the clinical case managers with the most positive and most negative views of an evidence-based practice for extended semi-structured interviews, based on the results of a web-based quantitative survey asking about the perceived value and usefulness of SafeCare®. The other form of sampling used qualitative data to identify samples of participants for quantitative analysis. A study of staff turnover in the implementation of evidence-based practices in mental health care by Woltmann et al. (2008) used qualitative data obtained through interviews with staff, clinic directors and consultant trainers to create categories of turnover and designations of positive, negative and mixed influence of turnover on outcomes. These categories were then quantitatively compared with implementation outcomes via simple tabulations of fidelity and penetration means for each category.
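The first, quan → QUAL form of sampling can be illustrated with a short sketch. The respondent identifiers, survey scores, and the choice of two respondents per extreme are assumptions made for the example; the logic is simply to rank respondents by a quantitative score and select those at both extremes for qualitative interviews.

```python
# A hedged sketch of quan → QUAL sampling: survey scores are used to select
# the most positive and most negative respondents for follow-up qualitative
# interviews. Respondent IDs, scores, and k=2 per extreme are assumptions.
survey_scores = {"cm_01": 4.6, "cm_02": 2.1, "cm_03": 3.8,
                 "cm_04": 1.7, "cm_05": 4.9, "cm_06": 3.0}

def extreme_case_sample(scores: dict[str, float], k: int = 2) -> list[str]:
    """Return the k lowest- and k highest-scoring respondents."""
    ranked = sorted(scores, key=scores.get)  # ascending by score
    return ranked[:k] + ranked[-k:]

interviewees = extreme_case_sample(survey_scores)
print(interviewees)  # ['cm_04', 'cm_02', 'cm_01', 'cm_05']
```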

Mixed Method Process

The integration of quantitative and qualitative data occurred in three forms: merging the data, connecting the data, and embedding the data. In 17 studies, the qualitative study was embedded within a larger quantitative effectiveness trial or implementation study. Slade et al. (2008) nested a qualitative study within a multi-site randomized controlled trial of a standardized assessment of mental health problem severity to determine whether the intervention improved agreement on referrals and to identify professional and organizational barriers to implementation. In 11 studies, the insights gained from one type of method were connected to a different type of method to answer related questions through complementarity, expansion, development or sampling. Thus, the qualitative assessment of agency director perspectives on implementation of evidence-based practices by Proctor et al. (2007) was designed as a pilot-stage step in a research agenda to develop and quantitatively test an implementation intervention. Zazzali et al. (2008) connected qualitative data collected from semi-structured interviews with 15 program administrators to the development of a conceptual model of implementation of Functional Family Therapy that could then be tested using quantitative methods. In 10 studies, qualitative and quantitative data were brought together in the analysis phase to answer the same question through triangulation or related questions through complementarity. Bachmann et al. (2009) merged qualitative data collected from semi-structured interviews with quantitative data collected from two surveys to describe and compare the experience of integrating children’s services in 35 children’s trusts in England.
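As a simple illustration of the merging process, the sketch below joins a quantitative dataset and a coded qualitative dataset on a shared site identifier so that the two can be examined side by side in the analysis phase. The variable names and values are invented and not drawn from any of the reviewed studies.

```python
# A hedged sketch of "merging": a quantitative dataset and a coded qualitative
# dataset are joined on a shared site identifier so both can be examined side
# by side in the analysis phase. Column names and values are invented.
import pandas as pd

quantitative = pd.DataFrame({"site": ["s1", "s2", "s3"],
                             "fidelity_score": [0.82, 0.55, 0.73]})
qualitative = pd.DataFrame({"site": ["s1", "s2", "s3"],
                            "dominant_theme": ["leadership support",
                                               "staff turnover",
                                               "supervision quality"]})

merged = quantitative.merge(qualitative, on="site")  # one row per site
print(merged)
```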

Mixed Methods and Phases of Implementation

Using the conceptual framework proposed by Aarons et al. (this issue), we also mapped the use of mixed methods in the 22 studies reviewed along two dimensions: phase of implementation and inner versus outer context. The results are presented in Table 2 below. Fifteen of the 22 studies focused on the implementation stage, and 13 studies focused on organizational characteristics that facilitated or impeded implementation. Only two studies focused on the exploration stage (Aarons et al. 2009; Proctor et al. 2007). Two studies focused on the adoption stage (Palinkas et al. 2008; Zazzali et al. 2008), and two studies focused on the sustainability stage (Blasinsky et al. 2006; Swain et al. 2009). One study (Bearsley-Smith et al. 2007) proposed to study the adoption, implementation and sustainability stages in a longitudinal fashion; however, the article provided few details on the elements of inner or outer context to be examined. The majority of studies that examined socio-political context and funding issues focused on the implementation or sustainability stages, while the majority of studies that examined organizational and individual adopter characteristics focused on the implementation stage. Only one study (Aarons et al. 2009) examined the role of client advocacy.

Table 2.

Studies using mixed methods to examine outer and inner context, by implementation stage

Outer context
Socio-political/funding: Exploration (Aarons et al. 2009); Implementation (Palinkas and Aarons 2009; Henke et al. 2008; Bachmann et al. 2009); Sustainability (Blasinsky et al. 2006; Swain et al. 2009)
Client advocacy: Exploration (Aarons et al. 2009)
Inner context
Inter-organizational environment: Exploration (Proctor et al. 2007); Adoption (Palinkas et al. 2008); Implementation (Palinkas and Aarons 2009; Bachmann et al. 2009)
Organizational characteristics: Exploration (Aarons et al. 2009; Proctor et al. 2007); Implementation (Aarons and Palinkas 2007; Bachmann et al. 2009; Brunette et al. 2008; Henke et al. 2008; Hoagwood et al. 2007; Kramer and Burns 2008; Marshall et al. 2008; Marty et al. 2008; Palinkas and Aarons 2009; Rapp et al. 2009; Sharkey et al. 2005; Whitley et al. 2009; Woltmann et al. 2008); Sustainability (Blasinsky et al. 2006)
Individual adopter characteristics: Exploration (Aarons et al. 2009; Proctor et al. 2007); Adoption (Palinkas et al. 2008; Zazzali et al. 2008); Implementation (Aarons and Palinkas 2007; Bachmann et al. 2009; Gioia and Dziadosz 2008; Henke et al. 2008; Hoagwood et al. 2007; Kramer and Burns 2008; Marshall et al. 2008; Rapp et al. 2009; Slade et al. 2008; Zazzali et al. 2008); Sustainability (Swain et al. 2009)
Unspecified: Adoption, Implementation and Sustainability (Bearsley-Smith et al. 2007)

Discussion

Our analysis of the 22 studies uncovered five major reasons for using mixed method designs in implementation research. The first reason was to use quantitative methods to measure intervention and/or implementation outcomes and qualitative methods to understand process. This aim was explicit in 11 of the 22 studies. Qualitative inquiry is highly appropriate for studying process because (1) depicting process requires detailed descriptions of how people engage with one another, (2) the experience of process typically varies for different people, so their experiences need to be captured in their own words, (3) process is fluid and dynamic, so it cannot be fairly summarized on a single rating scale at one point in time, and (4) participants’ perceptions are a key process consideration (Patton 2001).

The second reason was to conduct both exploratory and confirmatory research. In mixed method designs, qualitative methods are used to explore a phenomenon and generate a conceptual model along with testable hypotheses, while quantitative methods are used to confirm the validity of the model by testing the hypotheses (Teddlie and Tashakkori 2003). This combined focus is also consistent with the call by funding agencies (NIMH 2004) and others (Proctor et al. 2009) to develop new conceptual models and new measures to test these models. Several of the studies focused on the development of new measures (Blasinsky et al. 2006; Slade et al. 2008) or conceptual frameworks (Zazzali et al. 2008), or on the development of new interventions or the adaptation of existing ones (Proctor et al. 2007; Henke et al. 2008).

The third reason was to examine both intervention content and context. Many of the studies included in this review used mixed methods to examine the context of implementation of a specific intervention (e.g., Henke et al. 2008; Sharkey et al. 2005; Slade et al. 2008; Whitley et al. 2009). Unlike efficacy studies where context can be controlled, implementation research occurs in real world settings distinguished by their complexity and variation in context (Landsverk et al., this issue). Qualitative methods are especially suited to understanding context (Bernard 1988). In contrast, quantitative methods were used to measure aspects of the content of the intervention in addition to the intervention’s outcomes. A particularly important element of content was the degree of fidelity of application of the intervention. Schoenwald et al. (this issue) discuss different strategies for the quantitative measurement of fidelity to explain variation in intervention/implementation outcomes.

The fourth reason for using mixed methods was to incorporate the perspective of potential consumers of evidence-based practices (both practitioners and clients) (Proctor et al. 2009). As observed by Aarons et al. (this issue), some models that describe approaches to organizational change and innovation adoption highlight the importance of actively including and involving critical relevant stakeholders during the process of considering and preparing for innovation adoption. Use of qualitative methods gives voice to these stakeholders (Sofaer 1999) and allows partners an opportunity to express their own perspectives, values and opinions (Palinkas et al. 2009). Obtaining such a perspective was an explicit aim of studies by Henke et al. (2008), Proctor et al. (2007), Aarons et al. (2009), and Palinkas and Aarons (2009). A mixed method approach is also consistent with the need to understand patient and provider preferences in the use of Sequential Multiple Assignment Randomized Trial (SMART) designs when testing and evaluating the effectiveness of different strategies to improve implementation outcomes (Landsverk et al., this issue).

Finally, mixed methods were used to compensate for the limitations of one set of methods with the strengths of another. For instance, convergence or triangulation of quantitative and qualitative data was an explicit feature of the mixed method study of the implementation of SafeCare® in Oklahoma by Aarons and Palinkas (Aarons and Palinkas 2007; Palinkas and Aarons 2009) because of the limited statistical power of quantitative analyses in which participants were nested in teams of service providers, a common problem in implementation research (Proctor et al. 2009; Landsverk et al., this issue).

The studies examined in this review represent a continuum of mixed method designs ranging from the simple to the complex. Simple designs were observed in single studies with a limited objective or scope. For instance, in seeking to determine whether the experience of using an EBP accounted for possible changes in attitudes towards its use, Gioia and Dziadosz (2008) used semi-structured interview and focus group methods to obtain first-hand accounts of practitioners’ experiences of being trained to use an EBP, and a quantitative measure of attitudes towards the use of EBPs to identify changes in attitudes over time. In contrast, complex designs usually involve more than one study, linked by a set of related objectives. For instance, Bearsley-Smith et al. (2007) describe a protocol for a cluster randomized feasibility trial in which quantitative measures are used in studies designed to evaluate program outcomes (e.g., diagnostic status and clinical severity, client satisfaction) and to measure program fidelity, and qualitative methods (clinician focus groups and semi-structured client interviews) are used in studies designed to assess the process of implementation and explain quantitative findings.

In addition to study objectives, the complexity of mixed method designs is also related to the context in which the study or studies were conducted. For instance, six of the studies reviewed were embedded in a larger effort known as the National Evidence-Based Practice Implementation Project, which was designed to explore whether EBPs can be implemented in routine mental health service settings and to discover the facilitating conditions, barriers, and strategies that affected implementation (Brunette et al. 2008; Marshall et al. 2008; Marty et al. 2008; Rapp et al. 2009; Whitley et al. 2009). Two additional studies (Aarons and Palinkas 2007; Palinkas and Aarons 2009) were part of a mixed method study of implementation embedded in a statewide randomized controlled trial of the effectiveness of an evidence-based practice for reducing child neglect and out-of-home foster placements. In each instance, the rationale for the use of a mixed method design was determined by its role in the larger project (primary or secondary), resulting in an unbalanced structure and an emphasis on complementarity to understand the process of implementation and on expansion to explain outcomes of the larger project. However, the embedded mixed method study itself often reflected a balanced structure and the use of convergence, complementarity, expansion, and sampling to understand barriers and facilitators of implementation.

Complexity of mixed method designs is also related to the phase of implementation under examination. Mixed method studies of the exploration and adoption phases described by Aarons et al. (this issue) tended to utilize less complex designs characterized by a sequential unbalanced structure for the purpose of seeking convergence through transformation or developing new measures, conceptual frameworks or interventions, and a process of connecting the data. In contrast, studies of the implementation and sustainability phases tended to utilize more complex designs characterized by a simultaneous balanced or unbalanced structure for the purpose of seeking convergence through triangulation, complementarity, expansion and sampling, and a process of embedding the data. Nevertheless, as these studies illustrate, research on any of the four phases of implementation described by Aarons et al. may utilize and benefit from the application of any combination of elements of structure, function and process as long as this combination is consistent with study aims and context.

Our examination of these studies also revealed other noteworthy characteristics of mixed method designs in implementation research. First, the vast majority of studies reviewed utilized observational designs. As Landsverk et al. (this issue) and others (Proctor et al. 2009) have noted, most early research on implementation was observational in nature, relying upon naturalistic case study approaches. More recently, prospective, experimental designs have been used to develop, test and evaluate specific strategies designed to increase the likelihood of implementation (Chamberlain et al. 2008; Glisson and Schoenwald 2005). Second, all of the 22 studies reviewed focused on characteristics of organizations and individual adopters that facilitated or impeded the process of implementation; only seven studies included a focus on the outer context or the interorganizational component of the inner context of implementation (Aarons et al., this issue). Third, only 2 of the 22 studies (Aarons and Palinkas 2007; Palinkas and Aarons 2009) focused on implementation in child welfare settings. Given the issues in child welfare, such as the lack of professional education focused on evidence-based practices, and the richness of information solicited through mixed methods, the paucity of studies on implementation in child welfare is surprising.

However, there are ongoing efforts to incorporate mixed method designs in research on the implementation of evidence-based practices that include experimental designs to evaluate implementation strategies, examine the outer and interorganizational context, and are situated in child welfare settings. Two such efforts are Using Community Development Teams to Scale-up MTFC in California (Patricia Chamberlain, Principal Investigator) and Cascading Diffusion of an Evidence-Based Child Maltreatment Intervention (Mark Chaffin, Principal Investigator). The first is a randomized controlled trial designed to evaluate the effectiveness of a strategy for implementing Multidimensional Treatment Foster Care (MTFC; Chamberlain et al. 2007), an evidence-based program for youth aged 8–18 in out-of-home care with emotional or behavioral problems. Mixed methods are being used to examine the structure and operation of system leaders’ influence networks and their use of research evidence. The Cascading Diffusion Project is a demonstration grant examining whether a model of planned diffusion of an evidence-based practice can develop a network of services with self-sustaining levels of model fidelity and provider competency. A mixed method approach is being employed to describe the relationships between provider staff, system and organizational factors, and their impact on the implementation process. In both projects, qualitative and quantitative methods are being used in a simultaneous, unbalanced arrangement for the purpose of seeking complementarity, with quantitative methods used to achieve breadth of understanding (i.e., generalizability) of both content (i.e., fidelity) and outcomes (i.e., stage of implementation, number of children placed, recidivism), and qualitative methods used to achieve depth of understanding (i.e., thick description) of both process and the inner and outer context of implementation, all in an embedded design.

In recommending changes to the current approach to evidence in health care in order to accelerate the improvement of systems of care and practice, Berwick (2008) calls for embracing a wider range of scientific methodologies than the usual RCT experimental design. These methodologies include assessment techniques developed in engineering and used in quality improvement (e.g., statistical process control, time series analysis, simulations, and factorial experiments) as well as ethnography, anthropology, and other qualitative methods. Berwick argues that such methods are essential to understanding the mechanisms and context of implementation and quality improvement. Nevertheless, it is the combination of these methods through mixed method designs that is likely to hold the greatest promise for advancing our understanding of why evidence-based practices are not being used, what can be done to get them into routine use, and how to accelerate the improvement of systems of care and practice.

Acknowledgements

This study was funded through grants from the National Institute of Mental Health (P50 MH50313-07; P30-MH074678: J. Landsverk, PI).

Disclosures

None for any author.

Open Access

This article is distributed under the terms of the Creative Commons Attribution Noncommercial License which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

References

  1. Aarons, G. A., Hurlburt, M., & Horwitz, S. M., (2010). Advancing a conceptual model of evidence-based practice implementation in child welfare. Administration and Policy in Mental Health and Mental Health Services Research (submitted). [DOI] [PMC free article] [PubMed]
  2. Aarons GA, Palinkas LA. Implementation of evidence-based practice in child welfare: Service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34:411–419. doi: 10.1007/s10488-007-0121-3. [DOI] [PubMed] [Google Scholar]
  3. Aarons GA, Wells R, Zagursky K, Fettes DL, Palinkas LA. Implementing evidence-based practice in community mental health agencies: Multiple stakeholder perspectives. American Journal of Public Health. 2009;99(11):2087–2095. doi: 10.2105/AJPH.2009.161711. [DOI] [PMC free article] [PubMed] [Google Scholar]
  4. Bachmann MO, O’Brien M, Husbands C, Shreeve A, Jones N, Watson J, Reading R, Thoburn J, Mugford M, The National Evaluation of Children’s Trusts Team. Integrating children’s services in England: National evaluation of children’s trusts. Child: Care, Health and Development. 2009;35:257–265. doi: 10.1111/j.1365-2214.2008.00928.x. [DOI] [PubMed] [Google Scholar]
  5. Bearsley-Smith C, Brown MO, Sellick K, Villanueva EV, Chesters J, Francis K, Reddy P. Does interpersonal psychotherapy improve clinical care for adolescents with depression attending a rural child and adolescent mental health service? Study protocol for a cluster randomized feasibility trial. BMC Psychiatry. 2007;7:53. doi: 10.1186/1471-244X-7-53. [DOI] [PMC free article] [PubMed] [Google Scholar]
  6. Bernard HR. Qualitative methods in cultural anthropology. Newbury Park, CA: Sage; 1988. [Google Scholar]
  7. Berwick DM. The science of improvement. Journal of the American Medical Association. 2008;299(10):1182–1184. doi: 10.1001/jama.299.10.1182. [DOI] [PubMed] [Google Scholar]
  8. Blasinsky M, Goldman HH, Unützer J. Project IMPACT: A report on barriers and facilitators to sustainability. Administration and Policy in Mental Health and Mental Health Services. 2006;33:718–729. doi: 10.1007/s10488-006-0086-7. [DOI] [PubMed] [Google Scholar]
  9. Brunette MF, Asher D, Whitley R, Lutz WJ, Weider BL, Jones AM, McHugo GJ. Implementation of integrated dual disorders treatment: A qualitative analysis of facilitators and barriers. Psychiatric Services. 2008;59:989–995. doi: 10.1176/appi.ps.59.9.989. [DOI] [PubMed] [Google Scholar]
  10. Burns BJ, Phillips SD, Wagner HR, Barth RP, Kolko DJ, Campbell C, Landsverk J. Mental health need and access to mental health services by youths involved with child welfare: A national survey. Journal of the American Academy of Child and Adolescent Psychiatry. 2004;43:960–970. doi: 10.1097/01.chi.0000127590.95585.65. [DOI] [PubMed] [Google Scholar]
  11. Chamberlain P, Leve LD, DeGarmo DS. Multi-dimensional treatment foster care for girls in the juvenile justice system: 2-year follow-up of a randomized clinical trial. Journal of Consulting and Clinical Psychology. 2007;75(1):187–193. doi: 10.1037/0022-006X.75.1.187. [DOI] [PMC free article] [PubMed] [Google Scholar]
  12. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. Thousand Oaks, CA: Sage; 2007. [Google Scholar]
  13. Gioia D, Dziadosz G. Adoption of evidence-based practices in community mental health: A mixed method study of practitioner experience. Community Mental Health Journal. 2008;44:347–357. doi: 10.1007/s10597-008-9136-9. [DOI] [PubMed] [Google Scholar]
  14. Glisson C, Schoenwald SK. The ARC organizational and community intervention strategy for implementing evidence-based children’s mental health treatments. Mental Health Services Research. 2005;7(4):243–259. doi: 10.1007/s11020-005-7456-1. [DOI] [PubMed] [Google Scholar]
  15. Greene JC, Caracelli VJ, Graham WF. Toward a conceptual framework for mixed method evaluation designs. Educational Evaluation and Policy Analysis. 1989;11:255–274. [Google Scholar]
  16. Henke RM, Chou AF, Chanin JC, Zides AB, Scholle SH. Physician attitude toward depression care interventions: Implications for implementation of quality improvement initiatives. Implementation Science. 2008;3:40. doi: 10.1186/1748-5908-3-40. [DOI] [PMC free article] [PubMed] [Google Scholar]
  17. Hoagwood K, Olin S. The NIMH blueprint for change report: Research priorities in child and adolescent mental health. Journal of the American Academy of Child and Adolescent Psychiatry. 2002;41(7):760–767. doi: 10.1097/00004583-200207000-00006. [DOI] [PubMed] [Google Scholar]
  18. Hoagwood KE, Vogel JM, Levitt JM, D’Amico PJ, Paisner WI, Kaplan SJ. Implementing an evidence-based trauma treatment in a state system after September 11: The CATS Project. Journal of the American Academy of Child and Adolescent Psychiatry. 2007;46(6):773–779. doi: 10.1097/chi.0b013e3180413def. [DOI] [PubMed] [Google Scholar]
  19. Kramer TF, Burns BJ. Implementing cognitive behavioral therapy in the real world: A case study of two mental health centers. Implementation Science. 2008;3:14. doi: 10.1186/1748-5908-3-14. [DOI] [PMC free article] [PubMed] [Google Scholar]
  20. Landsverk, J., Brown, C. H., Rolls Reutz, J., Palinkas, L. A., & Horwitz, S. M. (2010). Design elements in implementation research: A structured review of child welfare and child mental health studies. Administration and Policy in Mental Health and Mental Health Services Research. doi:10.1007/s10488-010-0315-y. [DOI] [PMC free article] [PubMed]
  21. Leslie LK, Hurlburt MS, Landsverk J, Barth R, Slymen DJ. Outpatient mental health services for children in foster care: A national perspective. Child Abuse and Neglect. 2004;28:697–712. doi: 10.1016/j.chiabu.2004.01.004. [DOI] [PubMed] [Google Scholar]
  22. Marshall T, Rapp CA, Becker DR, Bond GR. Key factors for implementing supported employment. Psychiatric Services. 2008;59:886–892. doi: 10.1176/appi.ps.59.8.886. [DOI] [PubMed] [Google Scholar]
  23. Marty D, Rapp C, McHugo G, Whitley R. Factors influencing consumer outcome monitoring in implementation of evidence-based practices: Results from the National EBP Implementation Project. Administration and Policy In Mental Health. 2008;35:204–211. doi: 10.1007/s10488-007-0157-4. [DOI] [PubMed] [Google Scholar]
  24. McKibbon KA, Gadd CS. A quantitative analysis of qualitative studies in clinical journals for the 2000 publishing year. BMC Medical Informatics and Decision Making. 2004;4:11. doi: 10.1186/1472-6947-4-11. [DOI] [PMC free article] [PubMed] [Google Scholar]
  25. Miles MB, Huberman AM. Qualitative data analysis: An expanded sourcebook. 2. Thousand Oaks, CA: Sage; 1994. [Google Scholar]
  26. Morse JM. Approaches to qualitative–quantitative methodological triangulation. Nursing Research. 1991;40:120–123. doi: 10.1097/00006199-199103000-00014. [DOI] [PubMed] [Google Scholar]
  27. National Institute of Mental Health. (2004). Advancing the science of implementation: Improving the fit between mental health intervention development and service systems. Retrieved January 2010 from http://www.nimh.nih.gov/scientific meetings/scienceofimplementation.pdf.
  28. Palinkas LA, Aarons GA. A view from the top: Executive and management challenges in a statewide implementation of an evidence-based practice to reduce child neglect. International Journal of Child Health and Human Development. 2009;2:47–55. [Google Scholar]
  29. Palinkas LA, Aarons GA, Chorpita BF, Hoagwood K, Landsverk J, Weisz JR, The Research Network on Youth Mental Health. Cultural exchange and the implementation of evidence-based practice: Two case studies. Research on Social Work Practice. 2009;19:602–612. doi: 10.1177/1049731509335529. [DOI] [Google Scholar]
  30. Palinkas, L. A., Horwitz, S. M., Chamberlain, P., Hurlburt, M., & Landsverk, J. (2010). Mixed method designs in mental health services research. Psychiatric Services (in press). [DOI] [PubMed]
  31. Palinkas LA, Schoenwald SK, Hoagwood K, Landsverk J, Chorpita B, Weisz JR, The MacArthur Research Network on Youth Mental Health. An ethnographic study of implementation of evidence-based treatments in child mental health: First steps. Psychiatric Services. 2008;59(7):738–746. doi: 10.1176/appi.ps.59.7.738. [DOI] [PubMed] [Google Scholar]
  32. Patton MQ. Qualitative research and evaluation methods. 3. Thousand Oaks, CA: Sage Publications; 2001. [Google Scholar]
  33. Proctor EK, Knudsen KJ, Fedoracivius N, Hovmand P, Rosen A, Perron B. Implementation of evidence-based practice in community behavioral health: Agency director perspectives. Administration and Policy In Mental Health. 2007;34:479–488. doi: 10.1007/s10488-007-0129-8. [DOI] [PubMed] [Google Scholar]
  34. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman C. Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services. 2009;36:24–34. doi: 10.1007/s10488-008-0197-4. [DOI] [PMC free article] [PubMed] [Google Scholar]
  35. Rapp, C. A., Etzel-Wise, D., Marty, D., Coffman, M., Carlson, L., Asher, D., Callaghan, J., & Holter M. (2009). Barriers to evidence-based practice implementation: Results of a qualitative study. Community Mental Health Journal. doi:10.1007/s10597-009-9238-z. [DOI] [PubMed]
  36. Robins CS, Ware NC, dosReis S, Willging CE, Chung JY, Lewis-Fernández R. Dialogues on mixed methods and mental health services research: Anticipating challenges, building solutions. Psychiatric Services. 2008;59:727–731. doi: 10.1176/appi.ps.59.7.727. [DOI] [PMC free article] [PubMed] [Google Scholar]
  37. Schoenwald, S. K., Garland, A. F., Chapman, J. E., Frazier S. L., Sheidow, A. J., & Southam-Gerow, M. A. (2010). Toward an effective and efficient measurement of implementation fidelity. Administration and Policy in Mental Health and Mental Health Services Research (submitted). [DOI] [PMC free article] [PubMed]
  38. Sharkey S, MacIver S, Cameron D, Reynolds W, Lauder W, Veitch T. An exploration of factors affecting the implementation of a randomized controlled trial of a transitional discharge model for people with serious mental illness. Journal of Psychiatric and Mental Health Nursing. 2005;12:51–56. doi: 10.1111/j.1365-2850.2004.00792.x. [DOI] [PubMed] [Google Scholar]
  39. Slade M, Gask L, Leese M, McCrone P, Montana C, Powell R, Stewart M, Chew-Graham Cl. Failure to improve appropriateness of referrals to adult community mental health services–lessons from a multi-site cluster randomized controlled trial. Family Practice. 2008;25:181–190. doi: 10.1093/fampra/cmn025. [DOI] [PubMed] [Google Scholar]
  40. Sofaer S. Qualitative methods: What are they and why use them? Health Services Research. 1999;34:1101–1118. [PMC free article] [PubMed] [Google Scholar]
  41. Swain, K., Whitley, R., McHugo, G. J., & Drake, R. E. (2009). The sustainability of evidence-based practices in routine mental health agencies. Community Mental Health Journal. Epub ahead of print. doi:10.1007/s10597-009-9202-y. [DOI] [PubMed]
  42. Tashakkori A, Teddlie C. Mixed methodology: Combining the qualitative and quantitative approaches. Thousand Oaks: Sage; 1998. [Google Scholar]
  43. Teddlie C, Tashakkori A. Major issues and controversies in the use of mixed methods in the social and behavioral sciences. In: Tashakkori A, Teddlie C, editors. Handbook of mixed methods in the social and behavioral sciences. Thousand Oaks, CA: Sage; 2003. pp. 3–50. [Google Scholar]
  44. Trochim WM. An introduction to concept mapping for planning and evaluation. Evaluation and Program Planning. 1989;12:1–16. doi: 10.1016/0149-7189(89)90016-5. [DOI] [Google Scholar]
  45. Usher CL, Wildfire JB. Evidence-based practice in community-based child welfare systems. Child Welfare. 2003;82:597–614. [PubMed] [Google Scholar]
  46. Whitley R, Gingerich S, Lutz WJ, Mueser KT. Implementing the illness management and recovery program in community mental health settings: Facilitators and barriers. Psychiatric Services. 2009;60:202–209. doi: 10.1176/appi.ps.60.2.202. [DOI] [PubMed] [Google Scholar]
  47. Woltman EM, Whitley R, McHugo GJ, et al. The role of staff turnover in the implementation of evidence-based practices in health care. Psychiatric Services. 2008;59:732–737. doi: 10.1176/appi.ps.59.7.732. [DOI] [PubMed] [Google Scholar]
  48. Zazzali JL, Sherbourne C, Hoagwood KE, Greene D, Bigley MF, Sexton TL. The adoption and implementation of an evidence based practice in child and family mental health services organizations: A pilot study of functional family therapy in New York State. Administration and Policy In Mental Health. 2008;35:38–49. doi: 10.1007/s10488-007-0145-8. [DOI] [PubMed] [Google Scholar]
