Abstract
Qualitative methods are a valuable tool in implementation research because they help to answer complex questions such as how and why efforts to implement best practices may succeed or fail, and how patients and providers experience and make decisions in care. This article orients the novice implementation scientist to fundamentals of qualitative methods and their application in implementation research, describing: 1) implementation-related questions that can be addressed by qualitative methods; 2) qualitative methods commonly used in implementation research; 3) basic sampling and data collection procedures; and 4) recommended practices for data analysis and ensuring rigor. To illustrate qualitative methods decision-making, a case example is provided of a study examining implementation of a primary care-based collaborative care management model for women Veterans with anxiety, depression, and PTSD.
Keywords: knowledge translation, translational research, evidence-based practices, semi-structured interviews, focus groups, ethnography, evaluation
1. What are qualitative methods?
Qualitative research broadly refers to a category of research approaches that produce findings without reliance on quantitative measurement or statistical analysis (Strauss & Corbin, 1990). In the clinical context, these studies “help us understand why promising clinical interventions do not always work in the real world, how patients experience care, and how practitioners think. They also explore and explain the complex relations between the healthcare system and the outside world....” (Greenhalgh et al., 2016). Qualitative methods commonly include individual and focus group interviews, participant observation, ethnography, and a variety of other approaches. Traditionally, qualitative methods have been used across a variety of disciplines to describe how things are; as with, for example, participant observation in early cultural anthropology, which documented the beliefs and practices of specific cultural groups. Although still perhaps most common in the social sciences, qualitative methods are increasingly recognized for their utility in clinical and health research generally, with a recent call for “[t]rialists and other stakeholders…to recognize the benefits of using qualitative methods in surgical, device and drug trials” (Clement et al., 2018).
Qualitative methods are an integral component of implementation research—i.e., “the scientific study of the use of strategies to adopt and integrate evidence-based health interventions into clinical and community settings to improve individual outcomes and benefit population health” (National Institutes of Health PAR-19-274)—to the extent that the National Cancer Institute recently commissioned an expert group of social scientists to develop a White Paper on Qualitative Research in Implementation Science (QualRIS, 2019), intended primarily for an implementation science audience with some knowledge of qualitative methods. To complement the QualRIS effort, this paper orients the novice implementation scientist to fundamentals of qualitative methods and their application in implementation research, addressing: 1) implementation-related questions that can be addressed by qualitative methods (section 2.1); 2) qualitative methods commonly used in implementation research (section 3); 3) basic sampling and data collection procedures (sections 3.1–3.4); and 4) use of inductive and deductive approaches in qualitative analysis, including guidelines for ensuring rigor, validity and reliability of data (section 4). These fundamentals will be illustrated via a case study of collaborative care for women Veterans (Hamilton et al., 2017b), presented below.
2. Why are qualitative methods critical to implementation science?
To answer this question, we will first share a slice of a real-world implementation effort currently underway. Depression is the second most prevalent health condition among women Veteran users of Veterans Affairs (VA) healthcare, next to hypertension (Frayne et al., 2018). Among women Veterans aged 18–44, depression is the most prevalent health condition (28%), with anxiety (23%) and post-traumatic stress disorder (PTSD; 22%) also among the top five health conditions (Frayne et al., 2018). Depression and anxiety are substantially more prevalent among women Veterans than men (Maguen et al., 2010). The VA healthcare organization grapples with how to appropriately connect women to mental health (MH) providers who understand their distinctive MH needs, and to support their navigation of often fragmented MH care options offered across VA primary care (PC) and MH settings. Depending on the VA site, women’s access to MH care is variable. Women may have access to integrated MH care in general PC clinics or in a separate portion of a general PC clinic set aside for women. Others may receive care in stand-alone women-only clinics. Such highly variable models of MH care delivery in PC and MH make it difficult to engage women in services in a timely manner, and to support their retention in care.
Collaborative care models have a strong evidence base for enhancing patient engagement and retention in PC-based MH care for depression (Gilbody et al., 2006; Thota et al., 2012), and the VA has extended application of collaborative care to many other mild to moderate conditions, including anxiety disorders and alcohol misuse. The VA’s approach to integrated MH involves collaborative care with two components: co-located MH professionals who are integral components of the PC team and MH care management (VHA Handbook 1160.01, 2008). While co-location has been achieved at many sites, implementation of care management (i.e., “a set of activities intended to improve patient care and reduce the need for medical services by enhancing coordination of care, eliminating duplication, and helping patients and caregivers more effectively manage health conditions”; Goodell et al., 2009) has been limited, especially in stand-alone women’s clinics. Many have suggested moving toward blended models with local tailoring (VA Office of Patient Care Services, 2013). Toward this end, we have used an implementation strategy, Replicating Effective Programs (REP), to implement a tailored “Collaborative Care for Women Veterans” (CCWV; Hamilton et al., 2017b) intervention that encourages local blending of elements from VA-approved models with a PC-friendly evidence-based, computer-assisted cognitive behavioral treatment (CBT) platform for anxiety, depression, and PTSD (Roy-Byrne et al., 2010).
While we have several clinical outcomes of interest in this study, they are secondary to implementation outcomes of feasibility, acceptability, adoption, appropriateness, penetration, and sustainability (Proctor et al., 2011). Why are they secondary? Because we know that both collaborative care and the CBT platform we are offering work. What we do not know includes: how to make a gender-tailored collaborative care model “fit” into different PC configurations; how to engage women in this model; why some women will engage and others will not; how to foster buy-in among local stakeholders for a gender-specific model of care; how to encourage clinicians to refer women to CCWV; why some providers refer to CCWV while others do not, and so forth. Qualitative methods are essential in implementation because they provide a rigorous and efficient way to answer these kinds of “how” and “why” questions, and we need the answers in order to know how (and whether) to proceed with spreading this gender-tailored care model.
As Bauer and Kirchner note in the Introduction to this Special Issue [cite], the central tasks of implementation science require identifying barriers and facilitators to uptake of clinical innovations and developing strategies to overcome barriers and leverage facilitators toward establishing routine use of best practices in clinical care. Moreover, as described in this issue’s contribution from Damschroder [cite], implementation research is driven by theoretical and conceptual models that help in planning for, making sense of, and predicting change in use of innovations. Qualitative research is therefore a critical approach to discovering and documenting: the context(s) and environment(s) in which implementation occurs; the process that occurs during implementation; the effectiveness of implementation strategies (i.e., “methods or techniques used to enhance the adoption, implementation, and sustainability of a clinical practice or program”; Curran et al., 2012); and the relationship(s) between theorized and actual changes.
Qualitative methods add value to implementation science by helping to describe what is happening and why. Qualitative methods in implementation research are also increasingly oriented toward supporting practice and problem-solving [cite Elwy formative evaluation paper, this issue]. Mixed methods designs combining qualitative and quantitative approaches are also used commonly in implementation science, as both qualitative and quantitative findings can be integrated in order to generate “unique insights into multifaceted phenomena related to health care quality, access, and delivery” (Fetters et al., 2013; see also Aarons et al., 2012; Robins et al., 2008). Thus qualitative methods can be used to fulfill a variety of functions in implementation and implementation research.
2.1. What types of questions can qualitative methods address in implementation science?
Given that implementation research aims to support the uptake of evidence-supported practices in clinical care, recommended outcomes emphasize how an innovation is perceived—e.g., its feasibility, acceptability, and appropriateness—as well as practical elements of evaluating its spread (Proctor et al., 2011). In Table 1, we outline our implementation outcomes and how they align with research questions of interest in the CCWV study, using a tabular approach common in implementation science.
Table 1.
Implementation Outcome | Outcome Definition (Adapted from Proctor et al., 2011) | Aligning Research Questions with Implementation Outcomes: CCWV Example |
---|---|---|
Feasibility | Extent to which an innovation can practically be used in a given setting | How do we ensure a gender-tailored collaborative care model is feasible within different configurations of primary care? |
Acceptability | View among stakeholders that a given innovation is agreeable or satisfactory | How do we as researchers foster support among local stakeholders for a gender-specific model of care? |
Appropriateness | Perceived compatibility with needs and practices of a setting or population; perceived utility in addressing a given problem | Why are some women referred to the CCWV care manager while others are not? Why do some women engage with the care manager while others do not? |
Adoption | Intention or action to try an innovation | How do we encourage clinicians to refer women to CCWV? Why do some providers refer to CCWV while others do not? |
Penetration | Reach or integration of the innovation within a setting | How do we attract women to—and retain them appropriately in—this model? |
Sustainability | Extent to which innovation is maintained or routinized within setting over time | How do we ensure that the care model remains available after the implementation study (and funding) is over? |
Qualitative methods are invaluable in addressing the hows and whys of implementation, and are well-suited to both relatively straightforward and more nuanced aims, such as: discerning how effectively an intervention is adopted at each site (e.g., Hamilton et al., 2013); revealing organizational and interpersonal dynamics affecting the intervention (e.g., Harvey et al., 2018); explaining practice change (e.g., Lessard et al., 2016); discerning barriers and facilitators to uptake of the intervention (e.g., Gleacher et al., 2016); identifying what strategies are being used to foster organizational change, how successful they are perceived to be, and how they make a difference (e.g., Bokhour et al., 2018); and identifying contextual elements and provider perceptions that affect implementation and sustainability (e.g., Aarons et al., 2016). Not all of these questions can feasibly be asked in every study, and some questions are more salient at particular points in the course of implementation. For example, one might examine feasibility early in a study to ensure that adequate consideration is given to local barriers and facilitators before a new evidence-based practice is put into place. It can also be important to evaluate anticipated sustainability early in a study in order to identify foreseeable barriers to maintaining an intervention over time and to plan for solutions once funding for initial implementation support (e.g., training from intervention purveyors or an outside team) has ended.
3. What qualitative methods are most commonly used in implementation research?
As in any form of research, the appropriate study design and methods are dependent on the research questions. For the kinds of research questions outlined above, talking with people—in one-on-one interviews, small groups, or focus groups—is likely to be the most robust way to understand what they think about a new innovation and efforts to implement it. However, a variety of other strategies based in observation or analysis of texts or other media may also have value, as discussed further below. Table 2 lays out a basic checklist of elements to consider in designing a rigorous plan for qualitative implementation research. There are many excellent books and articles devoted to the use of qualitative methods (see, e.g., Denzin and Lincoln, 2011; Patton, 1990). Here, we will briefly share some general tips for conducting qualitative research in the context of implementation.
Table 2.
Key Elements to Consider in Designing Qualitative Implementation Research | Aligning Qualitative Methods with Implementation Research Questions: CCWV Example |
---|---|
Sample of participants • Who are the key stakeholders for implementing this innovation? • Whose support is necessary for implementation to be successful? | VA Leadership (local, regional); Women’s health leaders and managers; Primary Care Providers; Mental Health Providers; Social Workers; Nurses; Women Veterans |
Data collection instruments (e.g., interview guides; fieldnote templates) • What do you need to know? • How does your conceptual framework guide your data collection approach? | Individual interviews evaluating feasibility, acceptability, appropriateness, adoption, penetration, and sustainability of CCWV in VA |
Timing of data collection • When and how often will you talk with participants? | Interviews occurring pre-, mid-, and post-implementation |
Location of data collection • Where will the innovation be implemented? • Where is it practical to engage with stakeholders? | Interviews occurring on-site in clinics where CCWV is based and by telephone, as needed |
Qualitative team • Who will collect the data? • What level of training and experience do they require? | Interviews conducted by skilled team of masters- and PhD-level qualitative researchers |
Recording and transcription • Will conversations be recorded? If so, will they be transcribed? By whom? • What other forms of documentation will be maintained, e.g., fieldnotes of observations? | Interviews recorded and professionally transcribed; other forms of documentation include notes from site visits and periodic reflections (Finley et al., 2018) with implementation team members |
Data management • How will data be stored to ensure compliance with research ethics and organizational standards? | Recordings and de-identified transcripts stored behind VA firewall, accessible only to authorized team members |
Data analysis • How will data analysis be conducted to answer the research question(s) in a systematic, rigorous, yet pragmatic and timely manner? • How will your conceptual framework guide your data analysis approach? | Rapid qualitative analysis (Hamilton 2013) using a focused, team-based approach (Hamilton et al., 2017) |
Ethics of research • How will appropriate protections for participants be maintained at each stage of the research process? | Employee key stakeholders: verbal informed consent to participate and to be recorded; Patients: written documentation of consent; Highly secure data management and storage |
Often in research there is a tension between rigor and feasibility, and this may be particularly true for implementation research, in which timelines are typically rapid and there are frequently multiple “moving parts” that need to be understood. Feasibility is strongly linked to resources such as time, funding (e.g., the scale and scope of your study), and team members, as well as to intended products. If the research plan calls for conducting interviews, therefore, it may be necessary to take a targeted approach by narrowing the sample size and composition, limiting the number and type of interview questions asked, and streamlining data analysis techniques.
3.1. Individual interviews and sampling considerations
Briefly, with regard to individual qualitative interviews, implementation scientists typically focus on talking with individuals commonly referred to as “key stakeholders.” These are individuals who play a role in or are otherwise impacted by the implementation effort. This does not necessarily mean that they will be directly involved in the research, but they may have role(s) that touch the research. For example, in the CCWV study, key stakeholders include not only PC and MH providers, site staff, and patients, but also administrators of clinics, facilities, medical centers, and healthcare regions. These types of stakeholders are referred to as “multilevel” stakeholders, i.e., individuals at multiple levels of an organization who may have some decision-making authority and/or unique perspectives on what you are trying to achieve (Hamilton et al., 2017a; Kirchner et al., 2012). By interviewing multilevel stakeholders, the team obtains valuable vantage points on the research questions noted in Section 2.1.
For individual interviews, typical sample sizes in a medical facility in which one is conducting implementation research range from 5 to 10 individuals in key roles at each site. This is consistent with the recommendation of Guest et al. (2006), who found that, with a narrow research scope and a relatively homogenous target audience, a sample size of 6–12 is usually adequate to reach “saturation,” i.e., the juncture in data collection at which new insights are not being gleaned from interviews (though notably, many argue that saturation is a concept more appropriately applied to data analysis rather than sampling; see, e.g., Nelson 2017). Malterud et al. (2016) propose a useful concept of “information power” with regard to sampling: the more relevant information a sample holds for the actual study, the fewer participants are needed. There are often more individuals available to interview in these settings, but the key is to identify those most critical to your implementation research effort. This is referred to as purposive or purposeful sampling, i.e., identifying individuals whose perspectives will help to inform and guide research (Palinkas et al., 2015). In some cases, initial participants suggest additional key stakeholders who may not have been known at the outset of the study; this is referred to as respondent-driven or snowball sampling, wherein key stakeholders are asked (usually at the end of an interview) if they know of anyone else in the local, system, or national context who should be asked for an interview. With qualitative implementation research, however, depth of information across a smaller sample of key stakeholders is preferable to more superficial information across a larger sample of individuals, some of whom may be tangential to the implementation effort.
3.2. Focus group interviews and sampling considerations
Focus group interviewing—a qualitative method emphasizing group interaction (Morgan 1996)—is common in implementation studies and can be effective for gathering rich information about settings (such as normative care practices or workflow), experiences with health services and care models, as well as perspectives on interventions and implementation strategies (e.g., Huntink et al., 2014). Focus groups can also be an important way to engage stakeholders and invite their input on issues related to implementation (e.g., Burns et al., 2018), and to better understand the environment of care and contextual factors that may impact implementation. As Robins et al. (2008) point out, however, focus groups are not appropriate for “gaining an in-depth understanding of treatment processes” or “exploring sensitive topics,” particularly if there is likely to be meaningful variation in participants’ views, needs, or practices. Focus groups are not a quick and easy method to obtain information from a larger sample; they are challenging and time-consuming. Moreover, the unit of analysis with a focus group is the group (typically composed of five to eight individuals; see Krueger and Casey 2014), not the individual. In other words, in a study with eight focus groups with six individuals in each, the sample size is eight, not 48. Therefore, one focus group is rarely, if ever, sufficient, because focus groups are idiosyncratic; one cannot predict how a group of people will interact. Typically, 4–6 focus groups will prove adequate, depending on the diversity of participants (Guest et al., 2017). If the study is comparative in nature, involving different groups or categories of people (e.g., patients who selected different treatment options), more groups are needed to achieve a sufficient sample size for comparison purposes. It is also important to carefully consider focus group composition: mixing individuals at different levels of an organizational hierarchy, for example, should generally be avoided as discussion could be hampered by power differentials. Furthermore, focus groups can be quite difficult to arrange (particularly in busy healthcare settings) and to moderate (Kitzinger 1995), requiring both strong interviewing and observational skills, as well as the ability to control and guide discussion (McDonald, 1993). The selection of focus group methodology should be closely tied to a research question that is best answered by an approach that centers on group discussion.
3.3. Interview data collection instruments
Both individual and focus group interviews require interview guides. Typically in implementation research, these guides are semi-structured, meaning that the questions are specified but can be asked in a conversational style and do not necessarily need to be asked in the exact order in which they appear in the guide. Due to the rapid nature of implementation research, interview guides are generally targeted and focused on questions that participants will likely be able to answer due to their expertise and role within the healthcare system. Interview questions should be inviting (interesting for the participant), accessible (familiar, not opaque or multivalent), and analyzable (useful in meeting your project goals and answering your research questions) (Maietta & Hamilton, 2018). They should also be geared toward the time available for the interview. Often in implementation research, interview questions are shaped by the conceptual framework driving the study (Damschroder et al., 2017). Tables can be helpful in preparing guides by aligning interview questions with framework constructs or domains. In our experience, interview guides for individual interviews work best with approximately 6–8 primary questions, and guides for focus group interviews with roughly 4–6 primary questions depending on the estimated length of the group. Prioritizing questions—putting the most important toward the beginning, and putting optional questions toward the end—can help to ensure that at least a subset of questions is consistently asked across the sample. This is especially important when multiple interviewers, often with different disciplinary backgrounds, are in the field; they need to know which questions should be prioritized. Questions should always be tested before launching into intensive data collection efforts. Testing should occur internally in the team, with mock interviewing, to ensure that interviewers are consistent in how they are asking questions and soliciting information from participants. If possible, pilot testing the guide with individuals who are knowledgeable about the topic can also be helpful.
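To make the idea of aligning questions with framework constructs and prioritizing them more concrete, the brief sketch below shows one way a team might record such an alignment; it is a minimal illustration in Python, and the domain labels, question wording, priority scheme, and per-question time estimate are hypothetical rather than the CCWV study’s actual guide.

```python
# Minimal sketch of an interview guide aligned with framework domains.
# Domain names, question wording, and the priority scheme are hypothetical.
from dataclasses import dataclass


@dataclass
class GuideQuestion:
    domain: str    # framework construct or domain the question maps to
    text: str      # question as worded in the semi-structured guide
    priority: int  # 1 = must ask; larger numbers = ask only if time allows


GUIDE = [
    GuideQuestion("Inner setting", "Can you give me the lay of the land of how women get their mental health care at this facility?", 1),
    GuideQuestion("Intervention characteristics", "How well would a dedicated care manager fit into your clinic's workflow?", 1),
    GuideQuestion("Outer setting", "What regional or national policies shape how mental health care is delivered here?", 2),
    GuideQuestion("Ideal state", "If resources weren't an issue and you were in charge, what would be your ideal approach to delivering women's health care?", 3),
]


def questions_to_ask(minutes_available: int) -> list[str]:
    """Return the prioritized subset of questions to cover, assuming roughly
    seven minutes of discussion per primary question."""
    budget = max(1, minutes_available // 7)
    ordered = sorted(GUIDE, key=lambda q: q.priority)
    return [q.text for q in ordered[:budget]]


if __name__ == "__main__":
    for question in questions_to_ask(30):
        print("-", question)
```

A table or spreadsheet serves the same purpose; the point is simply that every question is traceable to a construct and carries an explicit priority that interviewers in the field can follow when time runs short.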
A thorough discussion of how to prepare guides and conduct individual and focus group interviews is beyond the scope of this paper, but it is important to note that the opening of any interview sets the tone and has a significant impact on the data that will be collected. The main goal at the beginning of an interview, even over the phone, is to establish rapport and put your participant(s) at ease. This can be facilitated by adopting a “learner role” posture (Lofland & Lofland 1995), and starting the interview with what Spradley (1979) coined the “grand tour” question, which gets participants talking about something they should know well, but in a focused way. For example, in our CCWV study, we ask an opening question at pre-implementation such as, “Can you please give me the lay of the land of how women get their mental health care at this facility?” This type of question elicits a “verbal tour” that then gives the interviewer several different strands of inquiry to follow, depending on the participant’s response. This type of question also works well in focus groups, for example, asking a group of patients, “Can we start by talking about what brought you to this facility for your health care?” Across several implementation studies, we have also found it useful to ask a closing question that encourages the participant to imagine the ideal; we call this the “queen/king for a day” question (Brunner et al., 2019), such as, “If resources weren’t an issue and you were in charge, what would be your ideal approach to delivering women’s health care?” We have also found it to be beneficial to ask toward the end, “Is there anything you thought we would be discussing that we haven’t touched on?”
3.4. Other qualitative methods
In addition to individual and focus group interviewing, other qualitative methods include, but are not limited to, observation and archival analysis. Although these methods are less common than interviewing in implementation and evaluation research, they may be of particular value for several reasons. Behavior change is central to implementation, as introducing a new evidence-based practice typically requires adding, removing, or replacing actions that occur as part of clinical care. It is a truism of behavioral research that people do not always do what they think they do or say they do, and observation can be critical in identifying the gap between reported and actual practice. Observation may also reveal features of the context or interpersonal dynamics that are taken for granted (perhaps even unseen) by participants, or not seen as socially desirable to discuss. Examination of archival or other textual materials can be critical in understanding the history, policy context, or operationalization of a particular initiative (e.g., Regan et al., 2017). Illustrating how these methods can be integrated, Murdoch (2016) observed how nurses used and adapted recommended phone scripts in implementation of a telephone-based triage intervention, and examined how nurses’ practice varied from written protocols. Along these lines, some qualitative methodologists, and anthropologists in particular, have begun embracing ethnographic approaches in implementation (e.g., Bunce et al., 2014; Greenhalgh & Swinglehurst, 2011; Morgan-Trimmer & Wood, 2016; Thirsk & Clark, 2017). Ethnography is characterized by “close engagement with a social group over time” and typically makes use of multiple methods, e.g., observation and interviews, to ensure triangulation (mixing) of data sources (Finley et al., 2018). Because implementation often requires long-term engagement with stakeholders in order to achieve changes in practices or processes, and because multiple methods may be required to adequately understand the kind of complex questions outlined above, ethnography can be highly compatible with the activities and goals of implementation research.
Along similar lines, implementation research has also increasingly used qualitative methods such as fieldnotes (e.g., Ilott et al., 2016), periodic reflections (Finley et al., 2018), and diaries or other annotated logs of observed processes during implementation (e.g., Cohen et al., 2008; Bunger et al., 2017; Rabin et al., 2018). For example, in our CCWV study, a senior research team member holds lightly structured discussions with key team members (investigators, local site key personnel) on a periodic (roughly monthly) basis to check in on and document implementation processes, intervention adaptation and tailoring, and any relevant local, regional, and national contextual shifts (Finley et al., 2018). All of these approaches benefit from occurring either at regular intervals (e.g., monthly or quarterly) or alongside regular events (e.g., intervention trainings) in order to observe and record implementation as it is occurring. Fieldnotes taken during training sessions might describe how the training was received by participants, what kinds of questions they asked, and whether suggestions for changes were made. Similarly, logs can be kept by the implementation team to describe why and how adaptations were made to the intervention, e.g., to make it better fit with the existing staffing and workflow. The periodicity of these approaches is well-suited to implementation precisely because implementation is often unpredictable and requires flexibility, tenacity, and iteration, all of which may be difficult to capture if data collection is only occurring at long intervals, e.g., immediately pre- and post-implementation. Furthermore, these more narrative approaches provide opportunities for attending to reflexivity, which entails acknowledging limits to claims of knowledge, and characterizing the position of the researcher in relation to the participants (see Marcus, 1998), an underemphasized yet important dimension of implementation science.
4.0. How are qualitative data analyzed?
The topic of qualitative data analysis has an extensive history and many varying philosophical and epistemological strands. Even so, there are a few features of analyzing qualitative data in the context of implementation research that we have found noteworthy, particularly the need for efficiency and applicability. Qualitative data are often used to inform the process of implementation, which means that at least preliminary results need to be turned around quickly. Rapid qualitative analysis can be a fruitful strategy for achieving this (Hamilton 2013; Taylor et al., 2018). In one study of supported employment for Veterans with serious mental illness, Hamilton et al. (2013) conducted pre-implementation interviews to inform the selection and tailoring of implementation strategies; interviews were summarized using a structured template aligned with the interview guide (Hamilton 2013), and matrices were developed to allow at-a-glance review of the key information gleaned from the interviews (Averill 2002). Site profiles were generated from the summaries and matrices, with highlights specific to the particular needs of each site. Interview summaries can serve as a resource not only to guide next steps in implementation, but also to inform subsequent waves of data collection, and to prepare for in-depth analysis, e.g., to draft a codebook. Rapid analysis requires a high level of organization and teamwork, and involves potential trade-offs in postponing more interpretive phases and types of analysis, which may come later in an implementation study after more immediate needs for preliminary results are met.
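As an illustration of the summary-template-and-matrix step, the sketch below (a minimal example in Python using pandas, with hypothetical sites, roles, domain labels, and summary text) reduces each interview to a few templated summary fields and then pivots the summaries into a site-by-domain matrix that a team could scan at a glance; a real template would mirror the study’s interview guide.

```python
# Minimal sketch of rapid analysis bookkeeping: templated interview summaries
# rolled up into a site-by-domain matrix. All sites, roles, and text are hypothetical.
import pandas as pd

# Each dict is one interview summary; keys after "site"/"role" mirror guide domains.
summaries = [
    {"site": "Site A", "role": "PCP",
     "Care pathways": "Warm handoffs work when MH is co-located",
     "Barriers": "No coverage for care manager on busiest clinic days",
     "Facilitators": "Strong women's health champion"},
    {"site": "Site A", "role": "Nurse",
     "Care pathways": "Referrals routed through e-consult",
     "Barriers": "Telephone tag with patients",
     "Facilitators": "Weekly team huddle"},
    {"site": "Site B", "role": "MH provider",
     "Care pathways": "Separate women's clinic handles most MH needs",
     "Barriers": "Space constraints",
     "Facilitators": "Leadership supportive of pilot"},
]

df = pd.DataFrame(summaries)

# Matrix view: one row per site, domains as columns, summary points concatenated
# so the team can review key findings across sites at a glance.
matrix = (
    df.melt(id_vars=["site", "role"], var_name="domain", value_name="summary")
      .groupby(["site", "domain"])["summary"]
      .apply(lambda s: "; ".join(s))
      .unstack("domain")
)
print(matrix)
```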
Similar to qualitative data collection in implementation research (see 3.3), qualitative analysis is typically driven or at least informed to some extent by the study’s conceptual or theoretical framework. This is considered a more deductive approach, representing the “top-down” end of the analytic spectrum, and framework constructs may be used to categorize segments of narrative data, e.g., to identify descriptions of external context factors that affect implementation (Hamilton et al., 2018). This approach has been used with frameworks such as Normalisation Process Theory (Pope et al., 2013), Theoretical Domains Framework (Atkins et al., 2017; Lawton et al., 2016), as well as with the Consolidated Framework for Implementation Research (CFIR; Damschroder et al., 2017; Damschroder & Lowery 2013; Keith et al., 2017), which offers CFIR-guided tools for data collection and analysis. An important consideration in using a top-down or framework-driven analytic approach is to remain open to findings that may not fit into the pre-set domains or constructs, reflecting the more inductive or emergent side of the analytic spectrum. Inductive and deductive need not be mutually exclusive approaches; they can be combined in “hybrid” fashion (Fereday & Muir-Cochrane 2006) or each can be used at different times and for different purposes during the course of analysis.
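The hybrid logic can be pictured with a toy example: segments are first checked against a deductive codebook of framework constructs, and anything that does not fit is set aside as a candidate for inductive (emergent) coding rather than forced into a pre-set domain. In practice this assignment is interpretive work done by trained analysts; the keyword matching below is only a hypothetical stand-in used to show the bookkeeping.

```python
# Toy illustration of hybrid deductive/inductive coding bookkeeping.
# Construct names are CFIR-style; keyword lists and excerpts are hypothetical.
CODEBOOK = {
    "Inner setting": ["staffing", "workflow", "leadership", "space"],
    "Outer setting": ["policy", "directive", "regional", "national"],
    "Characteristics of individuals": ["comfort", "confidence", "training"],
}


def code_segment(segment: str) -> list[str]:
    """Return deductive codes whose keywords appear in the segment (case-insensitive)."""
    text = segment.lower()
    return [construct for construct, keywords in CODEBOOK.items()
            if any(keyword in text for keyword in keywords)]


segments = [
    "Our workflow just can't absorb another warm handoff without more staffing.",
    "The new national directive changed who we can refer to the care manager.",
    "Honestly, some women just don't want to talk about trauma in primary care.",
]

emergent_candidates = []
for seg in segments:
    codes = code_segment(seg)
    if codes:
        print(f"DEDUCTIVE {codes}: {seg}")
    else:
        # Does not fit the pre-set constructs: flag for inductive/emergent coding.
        emergent_candidates.append(seg)
        print(f"EMERGENT CANDIDATE: {seg}")
```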
In the context of implementation research, which tends to be highly dynamic, it is essential to maintain methodologic rigor (QualRIS 2019). Techniques such as calculating interrater or intercoder reliability can be helpful to ensure reliability and validity of analyses, but are only appropriate under specific conditions, i.e., when “all participants are asked the same questions, in the same order” (Morse 1997), and when the analysis team is taking a highly formalized approach to coding data. To ensure rigor, several strategies can be used during the course of data collection and analysis, including prolonged engagement, persistent observation, member checking, triangulation, and thick, rich description (see Morse 2015; Morse et al., 2002 for details about these strategies).
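When those conditions are met—multiple coders applying a formalized codebook to the same data—agreement can be quantified with a statistic such as Cohen’s kappa. The sketch below, which assumes scikit-learn is available and uses hypothetical codes, computes kappa for two coders who each assigned one code per excerpt.

```python
# Minimal sketch of intercoder reliability via Cohen's kappa for two coders
# who each assigned a single code to the same set of excerpts. Labels are hypothetical.
from sklearn.metrics import cohen_kappa_score

coder_1 = ["barrier", "facilitator", "barrier", "context", "barrier", "facilitator"]
coder_2 = ["barrier", "facilitator", "context", "context", "barrier", "barrier"]

kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa: {kappa:.2f}")  # 1.0 = perfect agreement; ~0 = chance-level agreement
```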
The goal of any qualitative analysis is to produce work that is of value to the study and study team, multilevel stakeholders, and the broader scientific community. This work can take many directions beyond traditional academic products such as publications and presentations, especially in implementation research where products such as implementation guidelines, recommendations and toolkits are a common mode of supporting sustained and future spread of a given innovation (QualRIS 2019). In the CCWV study, for example, we are generating “playbooks” and “playbytes” (mini-playbooks focused on disseminating targeted information) to aid sites in implementing the care management model. Products featuring the qualitative data as the “star” (Chenail 1995) will orient team members and audiences, illuminating often complex processes and dynamics, and providing compelling, memorable examples from which to learn.
5.0. Conclusion
In this paper we have provided an introduction to the application of qualitative methods in implementation research. Designed to answer primarily how and why questions, qualitative methods are integral to investigating what happens in implementation, and what “surrounds” and interacts with implementation processes. As Kitson et al. (1998) noted two decades ago, “For implementation to be successful, there needs to be a clear understanding of the nature of evidence being used, the quality of context in terms of its ability to cope with change, and type of facilitation needed to ensure a successful change process.” The sophistication with which we address these phenomena in implementation has grown exponentially since that time, with a plethora of frameworks, models, measures, protocols, and empirical findings available to guide scientists and practitioners [cite papers in special issue]. Study methods and designs have become increasingly refined, with most implementation studies now utilizing mixed methods designs (i.e., designs that incorporate both quantitative and qualitative methods) (Mazzucca et al., 2018); excellent guidance is available on these designs (see, e.g., Aarons et al., 2011; Green et al., 2015; Hoagwood et al., 2015; Palinkas et al., 2011; Palinkas 2014; Palinkas et al., 2019).
Qualitative methods have a central place in the implementation research endeavor. Although efforts to expand and innovate with these methods are underway—with, for example, greater emphasis on ethnography and use of analytic techniques such as qualitative comparative analysis—their application tends to be pragmatic, and the findings they reveal can be an efficient, even transformative, way of increasing knowledge and strengthening capacity for implementation in real-world settings.
Highlights.
Qualitative methods are critical to implementation research.
Qualitative methods address the hows and whys of implementation.
Interviews and observation are key methods in implementation research.
Data collection and analysis are typically driven by an implementation framework.
Rapid turn-around of qualitative findings supports implementation and evaluation.
Acknowledgements
Funding: This work was supported by the VA Quality Enhancement Research Initiative [QUE 15–272], VA Health Services Research & Development [HSR&D; SDR 10–012], and National Institutes of Health National Heart, Lung, and Blood Institute [U01HL142109].
The authors gratefully acknowledge historical contributions to this work by Drs. Barbara Bokhour, Geoffrey Curran, Suzanne Heurtin-Roberts, Ray Maietta, Shannon Mitchell, Sarah Ono, Heather Reisinger, Samantha Solimeo, Cathleen Willging, Susan Zickmund, and all members of the NCI QualRIS working group. We would like to thank Annie Sumberg, MPH, for technical support.
Portions of this paper have been presented at Addiction Health Services Research, AcademyHealth, and Society for General Internal Medicine annual meetings; at the annual ResearchTalk Qualitative Research Summer Intensive; and on numerous VA HSR&D national cyberseminars.
References
- Aarons GA, Fettes DL, Sommerfeld DH, Palinkas LA, 2012. Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services. Child Maltreat. 17, 67–79. 10.1177/1077559511426908.
- Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, Roesch SC, 2016. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed-method study. Adm. Policy Ment. Health 43, 991–1008. 10.1007/s10488-016-0751-4.
- Atkins L, Francis J, Islam R, O’Connor D, Patey A, Ivers N, Foy R, Duncan EM, Colquhoun H, Grimshaw JM, Lawton R, Michie S, 2017. A guide to using the theoretical domains framework of behaviour change to investigate implementation problems. Implement. Sci. 12, 77. 10.1186/s13012-017-0605-9.
- Averill JB, 2002. Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qual. Health Res. 12, 855–866. 10.1177/104973230201200611.
- Brunner J, Cain CL, Yano EM, Hamilton AB, 2019. Local leaders’ perspectives on women veterans’ health care: what would ideal look like? Womens Health Issues 29, 64–71. 10.1016/j.whi.2018.10.005.
- Bunce AE, Gold R, Davis JV, McMullen CK, Jaworski V, Mercer M, Nelson C, 2014. Ethnographic process evaluation in primary care: explaining the complexity of implementation. BMC Health Serv. Res. 14, 607. 10.1186/s12913-014-0607-0.
- Bunger AC, Powell BJ, Robertson HA, MacDowell H, Birken SA, Shea C, 2017. Tracking implementation strategies: a description of a practical approach and early findings. Health Res. Policy Syst. 15, 15. 10.1186/s12961-017-0175-y.
- Bokhour BG, Fix GM, Mueller NM, Barker AM, Lavela SL, Hill JN, Solomon JL, Lukas CV, 2018. How can healthcare organizations implement patient-centered care? Examining a large-scale cultural transformation. BMC Health Serv. Res. 18, 168. 10.1186/s12913-018-2949-5.
- Burns A, Webb M, Stynes G, O’Brien T, Rohde D, Strawbridge J, Clancy L, Doyle F, 2018. Implementation of a quit smoking programme in community adult mental health services: a qualitative study. Front. Psychiatry 9, 670. 10.3389/fpsyt.2018.00670.
- Chenail RJ, 1995. Presenting Qualitative Data. The Qualitative Report 2, 1–9. https://nsuworks.nova.edu/tqr/vol2/iss3/5 (accessed 5 April 2019).
- Clement C, Edwards SL, Rapport F, Russell IT, Hutchings HA, 2018. Exploring qualitative methods reported in registered trials and their yields (EQUITY): systematic review. Trials 19, 589. 10.1186/s13063-018-2983-y.
- Cohen DJ, Crabtree BF, Etz RS, Balasubramanian BA, Donahue KE, Leviton LC, Clark EC, Isaacson NF, Stange KC, Green LW, 2008. Fidelity versus flexibility: translating evidence-based research into practice. Am. J. Prev. Med. 35, S381–389. 10.1016/j.amepre.2008.08.005.
- Corbin JM, Strauss AL, 2015. Basics of Qualitative Research, fourth ed. Sage Publications, California.
- Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C, 2012. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med. Care 50, 217–226. 10.1097/MLR.0b013e3182408812.
- Damschroder LJ, Lowery JC, 2013. Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR). Implement. Sci. 8, 51. 10.1186/1748-5908-8-51.
- Damschroder LJ, Reardon CM, Sperber N, Robinson CH, Fickel JJ, Oddone EZ, 2017. Implementation evaluation of the telephone lifestyle coaching (TLC) program: organizational factors associated with successful implementation. Transl. Behav. Med. 7, 233–241.
- Denzin NK, Lincoln YS (Eds.), 2011. The Sage Handbook of Qualitative Research. Sage Publications, Thousand Oaks, CA.
- Fereday J, Muir-Cochrane E, 2006. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int. J. Qual. Methods 5, 80–92. 10.1177/160940690600500107.
- Fetters MD, Curry LA, Creswell JW, 2013. Achieving integration in mixed methods designs: principles and practices. Health Serv. Res. 48, 2134–2156. 10.1111/1475-6773.12117.
- Finley EP, Huynh AK, Farmer MM, Bean-Mayberry B, Moin T, Oishi SM, Moreau JL, Dyer KE, Lanham HJ, Leykum L, Hamilton AB, 2018. Periodic reflections: a method of guided discussions for documenting implementation phenomena. BMC Med. Res. Methodol. 18, 153. 10.1186/s12874-018-0610-y.
- Frayne SM, Phibbs CS, Saechao F, Friedman SA, Shaw JG, Romodan Y, Berg E, Lee J, Ananth L, Iqbal S, Hayes PM, Haskell S, 2018. Sourcebook: Women Veterans in the Veterans Health Administration. Volume 4: Longitudinal Trends in Sociodemographics, Utilization, Health Profile, and Geographic Distribution. Women’s Health Evaluation Initiative, Women’s Health Services, Veterans Health Administration, Department of Veterans Affairs, Washington, DC.
- Gilbody S, Bower P, Fletcher J, Richards D, Sutton AJ, 2006. Collaborative care for depression. Arch. Intern. Med. 166, 2314. 10.1001/archinte.166.21.2314.
- Glasgow RE, Harden SM, Gaglio B, Rabin B, Smith ML, Porter GC, Ory MG, Estabrooks PA, 2019. RE-AIM planning and evaluation framework: adapting to new science and practice with a 20-year review. Front. Public Health 7. 10.3389/fpubh.2019.00064.
- Gleacher AA, Olin SS, Nadeem E, Pollock M, Ringle V, Bickman L, Douglas S, Hoagwood K, 2016. Implementing a measurement feedback system in community mental health clinics: a case study of multilevel barriers and facilitators. Adm. Policy Ment. Health 43, 426–440. 10.1007/s10488-015-0642-0.
- Goodell S, Bodenheimer T, Berry-Millett R, 2009. Care management of patients with complex health care needs. The Synthesis Project, Robert Wood Johnson Foundation. https://www.rwjf.org/en/library/research/2009/12/care-management-of-patients-with-complex-health-care-needs.html (accessed 9 August 2019).
- Guest G, Namey E, McKenna K, 2017. How many focus groups are enough? Building an evidence base for nonprobability sample sizes. Field Methods 29, 3–22. 10.1177/1525822X16639015.
- Green CA, Duan N, Gibbons RD, Hoagwood KE, Palinkas LA, Wisdom JP, 2015. Approaches to mixed methods dissemination and implementation research: methods, strengths, caveats, and opportunities. Adm. Policy Ment. Health 42, 508–523. 10.1007/s10488-014-0552-6.
- Greenhalgh T, Swinglehurst D, 2011. Studying technology use as social practice: the untapped potential of ethnography. BMC Med. 9, 45. 10.1186/1741-7015-9-45.
- Greenhalgh T, Annandale E, Ashcroft R, Barlow J, Black N, Bleakley A, Boaden R, Braithwaite J, Britten N, Carnevale F, Checkland K, Cheek J, Clark A, Cohn S, Coulehan J, Crabtree B, Cummins S, Davidoff F, Davies H, Dingwall R, Dixon-Woods M, et al., 2016. An open letter to The BMJ editors on qualitative research. BMJ 352, i563. 10.1136/bmj.i563.
- Guest G, Bunce A, Johnson L, 2006. How many interviews are enough? An experiment with data saturation and variability. Field Methods 18, 59. 10.1177/1525822X05279903.
- Hamilton A, 2013. Qualitative methods in rapid turn-around health services research. VA HSR&D National Cyberseminar Series: Spotlight on Women’s Health, December 2013. https://www.hsrd.research.va.gov/for_researchers/cyber_seminars/archives/video_archive.cfm?SessionID=780 (accessed 5 April 2019).
- Hamilton AB, Brunner J, Cain C, Chuang E, Luger TM, Canelo I, Rubenstein L, Yano EM, 2017a. Engaging multilevel stakeholders in an implementation trial of evidence-based quality improvement in VA women’s health primary care. Transl. Behav. Med. 7, 478–485. 10.1007/s13142-017-0501-5.
- Hamilton AB, Farmer MM, Moin T, Finley EP, Lang AJ, Oishi SM, Huynh AK, Zuchowski J, Haskell SG, Bean-Mayberry B, 2017b. Enhancing mental and physical health of women through engagement and retention (EMPOWER): a protocol for a program of research. Implement. Sci. 12, 127. 10.1186/s13012-017-0658-9.
- Hamilton AB, Oishi S, Yano EM, Gammage CE, Marshall NJ, Scheuner MT, 2014. Factors influencing organizational adoption and implementation of clinical genetic services. Genet. Med. 16, 238–245. 10.1038/gim.2013.101.
- Harvey G, McCormack B, Kitson A, Lynch E, Titchen A, 2018. Designing and implementing two facilitation interventions within the ‘Facilitating Implementation of Research Evidence (FIRE)’ study: a qualitative analysis from an external facilitators’ perspective. Implement. Sci. 13, 141. 10.1186/s13012-018-0812-z.
- Hoagwood K, Olin S, Horwitz S, 2015. Special issue overview: optimizing mixed methods for implementation research in large systems. Adm. Policy Ment. Health 42, 505–507. 10.1007/s10488-014-0616-7.
- Holtrop JS, Rabin BA, Glasgow RE, 2018. Qualitative approaches to use of the RE-AIM framework: rationale and methods. BMC Health Serv. Res. 18, 177. 10.1186/s12913-018-2938-8.
- Huntink E, van Lieshout J, Aakhus E, Baker R, Flottorp S, Godycki-Cwirko M, Jager C, Kowalczyk A, Szecsenyi J, Wensing M, 2014. Stakeholders’ contributions to tailored implementation programs: an observational study of group interview methods. Implement. Sci. 9, 185. 10.1186/s13012-014-0185-x.
- Ilott I, Gerrish K, Eltringham SA, Taylor C, Pownall S, 2016. Exploring factors that influence the spread and sustainability of a dysphagia innovation: an instrumental case study. BMC Health Serv. Res. 16, 406. 10.1186/s12913-016-1653-6.
- Keith RE, Crosson JC, O’Malley AS, Cromp D, Taylor EF, 2017. Using the Consolidated Framework for Implementation Research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation. Implement. Sci. 12, 15. 10.1186/s13012-017-0550-7.
- Kirchner JE, Parker LE, Bonner LM, Fickel JJ, Yano EM, Ritchie MJ, 2012. Roles of managers, frontline staff and local champions, in implementing quality improvement: stakeholders’ perspectives. J. Eval. Clin. Pract. 18, 63–69. 10.1111/j.1365-2753.2010.01518.x.
- Kitson A, Harvey G, McCormack B, 1998. Enabling the implementation of evidence based practice: a conceptual framework. Qual. Health Care 7, 149–158. 10.1136/qshc.7.3.149.
- Kitzinger J, 1995. Qualitative research. Introducing focus groups. BMJ 311, 299–302. 10.1136/bmj.311.7000.299.
- Krueger RA, Casey MA, 2014. Focus Groups: A Practical Guide for Applied Research. SAGE Publications.
- Lawton R, Heyhoe J, Louch G, Ingleson E, Glidewell L, Willis TA, McEachan RR, Foy R, 2016. Using the theoretical domains framework (TDF) to understand adherence to multiple evidence-based indicators in primary care: a qualitative study. Implement. Sci. 11, 113. 10.1186/s13012-016-0479-2.
- Lessard S, Bareil C, Lalonde L, Duhamel F, Hudon E, Goudreau J, Levesque L, 2016. External facilitators and interprofessional facilitation teams: a qualitative study of their roles in supporting practice change. Implement. Sci. 11, 97. 10.1186/s13012-016-0458-7.
- Lofland J, Lofland L, 1995. Analyzing Social Settings. Wadsworth Publishing Company.
- Maguen S, Ren L, Bosch JO, Marmar CR, Seal KH, 2010. Gender differences in mental health diagnoses among Iraq and Afghanistan veterans enrolled in Veterans Affairs health care. Am. J. Public Health 100, 2450–2456. 10.2105/AJPH.2009.166165.
- Maietta R, Hamilton A, 2018. Designing and Executing Qualitative Data Collection Projects. Presentation at the 15th Annual Qualitative Research Summer Intensive, Chapel Hill, NC.
- Malterud K, Siersma VD, Guassora AD, 2016. Sample size in qualitative interview studies: guided by information power. Qual. Health Res. 26, 1753–1760. 10.1177/1049732315617444.
- Marcus GE, 1998. Ethnography through Thick and Thin. Princeton University Press, Princeton, NJ.
- Mazzucca S, Tabak RG, Pilar M, Ramsey AT, Baumann AA, Kryzer E, Lewis EM, Padek M, Powell BJ, Brownson RC, 2018. Variation in research designs used to test the effectiveness of dissemination and implementation strategies: a review. Front. Public Health 6, 32. 10.3389/fpubh.2018.00032.
- McDonald WJ, 1993. Focus group research dynamics and reporting: an examination of research objectives and moderator influences. J. Acad. Market. Sci. 21, 2. 10.1007/BF02894427.
- Morgan D, 1996. Focus Groups as Qualitative Research, second ed. SAGE Publications, California.
- Morgan-Trimmer S, Wood F, 2016. Ethnographic methods for process evaluations of complex health behaviour interventions. Trials 17, 232. 10.1186/s13063-016-1340-2.
- Morse JM, 1997. “Perfectly healthy, but dead”: the myth of inter-rater reliability. Qual. Health Res. 7, 445–447. 10.1177/104973239700700401.
- Morse JM, Barrett M, Mayan M, Olson K, Spiers J, 2002. Verification strategies for establishing reliability and validity in qualitative research. Int. J. Qual. Methods 1, 13–22. 10.1177/160940690200100202.
- Morse JM, 2015. Critical analysis of strategies for determining rigor in qualitative inquiry. Qual. Health Res. 25, 1212–1222. 10.1177/1049732315588501.
- Murdoch J, 2016. Process evaluation for complex interventions in health services research: analysing context, text trajectories and disruptions. BMC Health Serv. Res. 16, 407. 10.1186/s12913-016-1651-8.
- Nelson J, 2017. Using conceptual depth criteria: addressing the challenge of reaching saturation in qualitative research. Qual. Res. 17, 554–570. 10.1177/1468794116679873.
- Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J, 2011. Mixed method designs in implementation research. Adm. Policy Ment. Health 38, 44–53. 10.1007/s10488-010-0314-z.
- Palinkas LA, 2014. Qualitative and mixed methods in mental health services and implementation research. J. Clin. Child Adolesc. Psychol. 43, 851–861. 10.1080/15374416.2014.910791.
- Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K, 2015. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm. Policy Ment. Health 42, 533–544. 10.1007/s10488-013-0528-y.
- Palinkas LA, Mendon SJ, Hamilton AB, 2019. Innovations in mixed methods evaluations. Annu. Rev. Public Health 40, 423–442. 10.1146/annurev-publhealth-040218-044215.
- Patton MQ, 1990. Qualitative Evaluation and Research Methods. SAGE Publications, Inc.
- Pope C, Halford S, Turnbull J, Prichard J, Calestani M, May C, 2013. Using computer decision support systems in NHS emergency and urgent care: ethnographic study using normalisation process theory. BMC Health Serv. Res. 13, 111. 10.1186/1472-6963-13-111.
- Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Griffey R, Hensley M, 2011. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm. Policy Ment. Health 38, 65–76. 10.1007/s10488-010-0319-7.
- QualRIS, 2019. Qualitative Methods in Implementation Science. Division of Cancer Control and Population Sciences, National Cancer Institute. https://cancercontrol.cancer.gov/IS/docs/NCI-DCCPS-ImplementationScience-WhitePaper.pdf (accessed 6 April 2019).
- Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, Frank JW, Glasgow RE, 2018. Systematic, multimethod assessment of adaptations across four diverse health systems interventions. Front. Public Health 6, 102. 10.3389/fpubh.2018.00102.
- Regan J, Lau AS, Barnett M, Stadnick N, Hamilton A, Pesanti K, Bando L, Brookman-Frazee L, 2017. Agency responses to a system-driven implementation of multiple evidence-based practices in children’s mental health services. BMC Health Serv. Res. 17, 671. 10.1186/s12913-017-2613-5.
- Robins CS, Ware NC, dosReis S, Willging CE, Chung JY, Lewis-Fernandez R, 2008. Dialogues on mixed-methods and mental health services research: anticipating challenges, building solutions. Psychiatr. Serv. 59, 727–731. 10.1176/ps.2008.59.7.727.
- Roy-Byrne P, Craske MG, Sullivan G, Rose RD, Edlund MJ, Lang AJ, Bystritsky A, Welch SS, Chavira DA, Golinelli D, Campbell-Sills L, Sherbourne CD, Stein MB, 2010. Delivery of evidence-based treatment for multiple anxiety disorders in primary care. JAMA 303, 1921. 10.1001/jama.2010.608.
- Spradley J, 1979. The Ethnographic Interview. Harcourt, Brace, Jovanovich (reissued 2016).
- Thirsk LM, Clark AM, 2017. Using qualitative research for complex interventions: the contributions of hermeneutics. Int. J. Qual. Methods 16, 1–10. 10.1177/1609406917721068.
- Taylor B, Henshall C, Kenyon S, Litchfield I, Greenfield S, 2018. Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis. BMJ Open 8, e019993. 10.1136/bmjopen-2017-019993.
- Thota AB, Sipe TA, Byard GJ, Zometa CS, Hahn RA, McKnight-Eily LR, Chapman DP, Abraido-Lanza AF, Pearson JL, Anderson CW, Gelenberg AJ, Hennessy KD, Duffy FF, Vernon-Smiley ME, Nease DE, Williams SP, Community Preventive Services Task Force, 2012. Collaborative care to improve the management of depressive disorders. Am. J. Prev. Med. 42, 525–538. 10.1016/j.amepre.2012.01.019.
- VHA Handbook 1160.01, 2008. Uniform Mental Health Services in VA Medical Centers and Clinics. Veterans Health Administration, Department of Veterans Affairs, Washington, DC.
- VA Office of Patient Care Services, 2013. Report on Integrating Mental Health into PACT (IMHIP) in the VA. https://www.hsrd.research.va.gov/publications/internal/imhip-report.pdf (accessed 4 April 2019).