Abstract
Objectives
Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles.
Data Sources
All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals.
Study Design
All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component.
Principal Findings
Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively).
Conclusion
Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports.
Keywords: Health services, research methodology, mixed methods, qualitative methods
As the health services research field continues to evolve, so too do its methods. Mixed methods research capitalizes on the strengths of both qualitative and quantitative methodologies by combining approaches in a single research study to increase the breadth and depth of understanding (Johnson, Onwuegbuzie, and Turner 2007). Mixed methods can be a better approach to research than either quantitative-only or qualitative-only methods when a single data source is not sufficient to understand the topic, when results need additional explanation, when exploratory findings need to be generalized, or when the complexity of the research objectives is best addressed with multiple phases or types of data (Brannen 1992; Creswell and Plano Clark 2011). Rigorous mixed methods approaches require that individual components (qualitative or quantitative) adhere to their respective established standards (Curry, Nembhard, and Bradley 2009; Creswell and Plano Clark 2011). Despite recent guidelines on frameworks for conducting mixed methods research (e.g., Curry, Nembhard, and Bradley 2009; Creswell and Plano Clark 2011), a critical challenge has been ensuring that reports from mixed methods studies transparently discuss the methodological components integral to the conduct of the studies. Health services researchers and reviewers need clear guidelines regarding research methodology, including the methodological components that should be expected in mixed methods papers to indicate that they are sufficiently rigorous.
Mixed Methods in Health Services Research
Health services research is the study of how social factors, financing systems, organizational structures and processes, health technologies, and personal behaviors affect access to health care, the quality and cost of health care, and ultimately, health and well-being (Lohr and Steinwachs 2002). As a result of the breadth of topics addressed, health services research draws upon methods and concepts from many fields, including medicine, epidemiological and economic studies, and the evaluation of services and interventions (Field, Tranquada, and Feasley 1995). Health services researchers increasingly work in interdisciplinary partnerships (e.g., Aboelela et al. 2007) and use innovative methods, including mixed methods, to more fully understand health services phenomena. Mixed methods approaches are also consistent with suggestions to extend scientific and contextual health knowledge beyond randomized trials (Berwick 2005).
Mixed methods research capitalizes on the strengths of both qualitative and quantitative methodology by combining both components in a single research study to increase breadth and depth of understanding (Johnson, Onwuegbuzie, and Turner 2007). Qualitative and quantitative methods can be integrated for different purposes to provide a more comprehensive picture of health services than either method can alone. Mixed methods are appropriate in the following situations: (1) when researchers would like to converge different methods or use one method to corroborate the findings from another about a single phenomenon (triangulation); (2) when researchers would like to use one method to elaborate, illustrate, enhance, or clarify the results from another method (complementarity); (3) when researchers would like to use results from one method to inform another method, such as in creating a measure (development); (4) when researchers would like to use one method to discover paradoxes and contradictions in findings from another method that can suggest reframing research questions (initiation); and (5) when researchers seek to expand the breadth and depth of the study by using different methods for different research components (expansion) (Greene, Caracelli, and Graham 1989). Bryman (2006) modified and expanded this list to add that mixed methods can also be useful in obtaining diversity of views, illustrating concepts, and developing instruments.
Quantitative and qualitative research can be distinguished by the philosophical assumptions brought to the study (e.g., deductive versus inductive), the types of research strategies (e.g., experiments versus case studies), and the specific research methods used in the study (e.g., structured survey versus observation) (Creswell 2008). Qualitative health services research, for example, is a method in which the researcher collects textual material derived from speech or observation and attempts to understand the phenomenon of interest in terms of the meanings people bring to it (Denzin and Lincoln 1994; Shortell 1999; Giacomini and Cook for the Evidence-Based Medicine Working Group 2000; Malterud 2001; Bradley, Curry, and Devers 2007). Certain characteristics are typical of qualitative research, including a naturalistic setting (as opposed to a laboratory), a focus on participants’ perspectives and their meaning, an emphasis on process rather than product, and data collected as words or images (Padgett 2008).
Guidelines for Conducting Mixed Methods Research
The National Institutes of Health noted the need for rigor in combining qualitative and quantitative methods to study complex health issues in their recent publication, Best Practices for Mixed Methods in Health Sciences (Creswell, Klassen, Plano Clark, and Smith for the Office of Behavioral and Social Sciences Research 2011). There are several frameworks to guide the rigorous conduct and evaluation of mixed methods research (Collins, Onwuegbuzie, and Sutton 2006; Curry, Nembhard, and Bradley 2009; Tashakkori and Teddlie 2010; Creswell and Plano Clark 2011). Collectively, these frameworks recommend that the conduct of mixed method studies—and reports of mixed method research, including peer-reviewed publications—demonstrate explicit rationales for all decisions regarding study design, including the purpose of including both qualitative and quantitative methods. They specifically advise that each component (qualitative or quantitative) should be conducted with a level of rigor in accordance with established principles in its field, and that researchers be transparent in methodological reporting. For example, sampling design should be specified as identical, parallel, nested, or mixed (Onwuegbuzie and Collins 2007); the level of mixing methods (fully versus partially) should be described, as should time orientation (sequential or concurrent components of research) and emphasis (equal importance of methodological approaches or one more dominant) (Leech and Onwuegbuzie 2009).
Conducting and evaluating mixed methods research presents unique methodological challenges, particularly related to rigor. Quantitative studies typically rely on quality criteria such as internal validity, generalizability, and reliability (Campbell 1957; Campbell and Stanley 1963; Messick 1989, 1995; Onwuegbuzie and Daniel 2002, 2004; Onwuegbuzie 2003), whereas qualitative studies have roughly comparable quality criteria of credibility, transferability, and dependability (Lincoln and Guba 1985; Guba and Lincoln 1989; Miles and Huberman 1994; Maxwell 2005; Pope and Mays 2006). For example, questions asked when evaluating a qualitative study might include the following: “Were participants relevant to the research question and was their selection well reasoned?” and “Was the data collection comprehensive enough to support rich and robust descriptions of the observed events?” (Giacomini and Cook for the Evidence-Based Medicine Working Group 2000). In addition to determining whether methodological approaches unique to qualitative or quantitative research were employed, an evaluation of a mixed methods study should assess aspects unique to mixed methods, such as how multiple components are integrated and how consistency and discrepancy between findings from each method are managed (Sale and Brazil 2004; O'Cathain, Murphy, and Nicholl 2007). Qualitative, quantitative, and mixed methodologists agree that study procedures should be reported transparently, with sufficient detail to allow the reader to make inferences about study quality (Lincoln and Guba 1985; Giacomini and Cook for the Evidence-Based Medicine Working Group 2000; O'Cathain, Murphy, and Nicholl 2007; Armstrong et al. 2008; Creswell 2008; Curry, Nembhard, and Bradley 2009; Leech et al. 2009; Teddlie and Tashakkori 2009).
Several researchers have proposed specific techniques to assess the overall methodology of mixed methods research and assess the methodological components of the qualitative, quantitative, and mixed portions of the studies (e.g., Pluye et al. 2009; O'Cathain 2010; Tashakkori and Teddlie 2010; Creswell and Plano Clark 2011; Leech, Onwuegbuzie, and Combs 2011). For example, O'Cathain (2010) assessed quality of mixed methods research by evaluating transparency and clarity in reporting planning, design, data, interpretive rigor, inference transferability, reporting quality, synthesizability, and utility. Others have suggested alternative methods for assessing quality, but criteria often are not elucidated or are vague. Further, those frameworks typically address quality of the study design as opposed to the characteristics provided in the published article. By contrast, Sale and Brazil (2004) proposed a structured framework for the evaluation of mixed methods publications by identifying key methodological components that should be included for both qualitative and quantitative portions of studies. Despite these advances, we found few published accounts of the rigor of published mixed methods research. Our article has three specific research questions: (1) How has the frequency of mixed methods studies published in health services journals changed over time? (2) How are mixed methods articles being used to elucidate health services? and (3) To what extent do mixed methods reports differ in methodological content compared to qualitative-only or quantitative-only articles?
Method
This systematic review assessed the frequency of mixed methods publications in top health services research journals and compared the frequency of key methodological components in qualitative, quantitative, and mixed method studies. We first reviewed articles in health services research journals to determine the prevalence of mixed methods designs and the presence of key methodological components. Then, we conducted statistical analyses of trends over time in the frequency of mixed methods articles and in the presence of key methodological components of those articles. Because this was an analysis of published data, no ethical oversight was required.
Identification of Mixed Methods Articles
We examined four journals: Health Affairs, Health Services Research, Medical Care, and Milbank Quarterly, which had 5-year impact factors of 2.94–4.71. Journals were selected by reviewing the Institute for Scientific Information (2007) rankings for the top 10 journals in health care sciences and services. Of these 10, we included all journals that focused generally on health services research and excluded journals with narrower foci (Value in Health, Journal of Health Economics, Journal of Pain and Symptom Management, Statistical Methods in Medical Research, Quality and Safety in Health Care, and Quality of Life Research). Although 2001 marked a turning point in the proliferation of mixed methods studies published in major electronic bibliographic databases such as PubMed (Collins, Onwuegbuzie, and Jiao 2007), we chose to examine articles from 2003 to 2007 because 2003 marks the publication of the first edition of Tashakkori and Teddlie's landmark Handbook of Mixed Methods in Social and Behavioral Research, which provided the first comprehensive collection of mixed method theory, methodology, and application. Five years represents a sufficient period of time to examine trends in published articles following the publication of a landmark methodological work.
We reviewed empirical articles to determine whether each represented a quantitative, qualitative, or mixed methods study. This entailed using all the information presented in the abstract and the body of the article to identify the research design either as stated or implied by the author(s). We excluded nonempirical articles (book reviews, literature reviews, commentaries and opinion articles, letters to the editor, policy statements) and articles from a special issue of Milbank Quarterly (Volume 83, Number 4) that included only articles published between 1932 and 1998.
We classified articles as quantitative if they included (1) a primary goal of testing theories or hypotheses about relationships between/among variables, or (2) quantitative data and methodology, such as hierarchical linear modeling, multiple regression, or Markov modeling. We classified articles as qualitative if they included either (1) a primary goal of exploring or understanding the meaning ascribed to a specific phenomenon or experience, or (2) qualitative data such as observations, unstructured or semi-structured interviews, or focus group interviews or methodologies such as thematic analysis. Although more complex definitions of mixed method studies exist (e.g., Johnson, Onwuegbuzie, and Turner 2007; Creswell and Plano Clark 2011), we classified articles as mixed methods if they integrated or combined both quantitative and qualitative methods in a single study (Sale and Brazil 2004). This definition reflects the general definitions of mixed methods and the lack of consensus on a specific definition across all multidisciplinary mixed methods researchers.
We used spreadsheets to track classifications, with cells containing articles’ abstracts and our field notes. Two authors read and classified articles in batches of 50 according to type, conferring as needed until agreement was achieved (n = 300 articles); the remaining articles (n = 1,351) were each coded by one author. For the few articles for which methodology was ambiguous (n = 58, 3.5 percent of all empirical articles), classification was resolved in consultation with a third author. Similar methods have been used in other evaluations of mixed methods articles (Powell et al. 2008).
Assessments of Articles
We identified all mixed methods articles (n = 47) and equal random samples (n = 47) of quantitative articles (from 1,502 articles) and qualitative articles (from 102 articles) (total n = 141) in the four journals. Random samples of qualitative and quantitative articles were selected using a random number generator and did not adjust for journal or year. We assessed the frequency of key methodological components reported across articles, then compared rates by article type. The methodological components we focused on were drawn from two conceptual frameworks. The first included Sale and Brazil's (2004) criteria: (1) internal validity for quantitative findings and credibility for qualitative findings, (2) external validity for quantitative findings and transferability or fittingness for qualitative findings, (3) reliability for quantitative findings and dependability for qualitative findings, and (4) objectivity for quantitative findings and confirmability for qualitative findings (specific criteria are listed in Table 3). The second was O'Cathain's transparency criteria for mixed methods studies (O'Cathain, Murphy, and Nicholl 2007; O'Cathain 2010), which specify that mixed methods studies should state the (1) priority of methods (primarily quantitative, primarily qualitative, or equal priority), (2) purpose of mixing methods (e.g., triangulation, complementarity, initiation, development, or expansion), (3) sequence of methods (qualitative first, quantitative first, or simultaneous), and (4) stage of integration of both types of data (e.g., data collection, analysis, interpretation). We assessed four additional components of mixed methods studies: (1) whether qualitative and quantitative components were integrated, (2) whether limitations of design were detailed, (3) whether areas of consistency between qualitative and quantitative components were elucidated, and (4) whether areas of inconsistency between components were described.
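The sampling step described above can be sketched as follows; the article identifiers and seed are hypothetical, since the paper states only that a random number generator was used without stratifying by journal or year.

```python
# Hypothetical sketch: draw simple random samples of 47 articles from each
# comparison pool, with no stratification by journal or year.
import random

random.seed(42)  # arbitrary seed for reproducibility; not from the study

quantitative_pool = [f"quant-{i}" for i in range(1, 1503)]  # 1,502 articles
qualitative_pool = [f"qual-{i}" for i in range(1, 103)]     # 102 articles

quant_sample = random.sample(quantitative_pool, 47)  # sampling without replacement
qual_sample = random.sample(qualitative_pool, 47)
```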
Table 3.
Key Methodological Components in Mixed Methods, Quantitative, and Qualitative Health Services Research Articles
Mixed Method Studies (n = 47) | Quantitative Studies (n = 47) | Qualitative Studies (n = 47) | ||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|
Yes | No | N/A | % with Component† | Yes | No | N/A | % with Component† | Yes | No | N/A | % with Component† | |
Key quantitative methodological components | ||||||||||||
Truth value (internal validity) | ||||||||||||
Ethical review undertaken | 9 | 37 | 1 | 19.57 | 9 | 37 | 1 | 19.57 | ||||
Informed consent stated | 5 | 21 | 21 | 19.23 | 5 | 38 | 4 | 11.63 | ||||
Identifying or controlling for extraneous/confounding variables*** | 7 | 40 | 0 | 14.89 | 33 | 14 | 0 | 70.21 | ||||
Confidentiality protected | 3 | 42 | 2 | 6.67 | 2 | 42 | 3 | 4.55 | ||||
Comparability of control to intervention groups at baseline | 0 | 0 | 47 | 0.00 | 8 | 36 | 3 | 18.18 | ||||
Control/comparison groups treated similarly | 0 | 0 | 47 | 0.00 | 3 | 40 | 4 | 6.98 | ||||
Applicability (external validity/generalizability) | ||||||||||||
Outcome measures defined | 7 | 0 | 40 | 100.00 | 43 | 3 | 1 | 93.48 | ||||
Control/comparison group described | 2 | 0 | 45 | 100.00 | 11 | 33 | 3 | 25.00 | ||||
Data collection instruments/source of data described*** | 29 | 18 | 0 | 61.70 | 46 | 1 | 0 | 97.87 | ||||
Statement of purpose/objective** | 28 | 19 | 0 | 59.57 | 40 | 7 | 0 | 85.11 | ||||
Source of subjects stated (sampling frame)** | 27 | 19 | 1 | 58.70 | 41 | 6 | 0 | 87.23 | ||||
Study population defined or described*** | 24 | 23 | 0 | 51.06 | 43 | 4 | 0 | 91.49 | ||||
Source of control/comparison group stated | 1 | 1 | 45 | 50.00 | 8 | 36 | 3 | 18.18 | ||||
Selection of control/comparison group described | 1 | 1 | 45 | 50.00 | 8 | 36 | 3 | 18.18 | ||||
Data gathering procedures described* | 23 | 24 | 0 | 48.94 | 33 | 14 | 0 | 70.21 | ||||
Description of setting/conditions under which data collected* | 22 | 24 | 1 | 47.83 | 32 | 15 | 0 | 68.09 | ||||
Statistical procedures referenced or described*** | 19 | 28 | 0 | 40.43 | 45 | 2 | 0 | 95.74 | ||||
Subject recruitment or sampling selection described*** | 17 | 30 | 0 | 36.17 | 35 | 12 | 0 | 74.47 | ||||
Statement about nonrespondents, dropouts, deaths | 16 | 31 | 0 | 34.04 | 21 | 25 | 1 | 45.65 | ||||
p-Values stated*** | 16 | 31 | 0 | 34.04 | 41 | 6 | 0 | 87.23 | ||||
Both statistical and clinical significance acknowledged*** | 13 | 34 | 0 | 27.66 | 41 | 6 | 0 | 87.23 | ||||
Study design stated explicitly** | 11 | 36 | 0 | 23.40 | 26 | 21 | 0 | 55.32 | ||||
Inclusion/exclusion criteria stated explicitly*** | 10 | 36 | 1 | 21.74 | 28 | 19 | 0 | 59.57 | ||||
Missing data addressed | 10 | 37 | 0 | 21.28 | 18 | 29 | 0 | 38.30 | ||||
At least one hypothesis stated* | 10 | 37 | 0 | 21.28 | 23 | 24 | 0 | 48.94 | ||||
Sample randomly selected | 6 | 39 | 2 | 13.33 | 12 | 35 | 0 | 25.53 | ||||
Confidence intervals given for main results*** | 5 | 42 | 0 | 10.64 | 26 | 21 | 0 | 55.32 | ||||
Power calculation provided | 1 | 46 | 0 | 2.13 | 7 | 40 | 0 | 14.89 | ||||
Description of intervention | 0 | 2 | 45 | 0.00 | 7 | 36 | 4 | 16.28 | ||||
Assessment of outcome blinded | 0 | 0 | 47 | 0.00 | 2 | 41 | 4 | 4.65 | ||||
Consistency (reliability) | ||||||||||||
Standardization of observers described | 3 | 44 | 0 | 6.38 | 7 | 40 | 0 | 14.89 | ||||
Neutrality (objectivity) | ||||||||||||
Statement of researcher's assumptions/perspective | 5 | 42 | 0 | 10.64 | 4 | 43 | 0 | 8.51 | ||||
Key qualitative methodological components | ||||||||||||
Truth value (credibility) | ||||||||||||
Triangulation of qualitative sources | 25 | 22 | 0 | 53.19 | 27 | 20 | 0 | 57.45 | ||||
Triangulation of qualitative methods | 16 | 31 | 0 | 34.04 | 13 | 34 | 0 | 27.66 | ||||
Use of exemplars | 13 | 34 | 0 | 27.66 | 14 | 33 | 0 | 29.79 | ||||
Ethical review undertaken | 10 | 37 | 0 | 21.28 | 8 | 30 | 9 | 21.05 | ||||
Triangulation of investigators | 7 | 40 | 0 | 14.89 | 3 | 44 | 0 | 6.38 | ||||
Informed consent stated | 6 | 41 | 0 | 12.77 | 3 | 35 | 9 | 7.89 | ||||
Member checks | 4 | 43 | 0 | 8.51 | 2 | 45 | 0 | 4.26 | ||||
Confidentiality protected | 4 | 43 | 0 | 8.51 | 3 | 35 | 9 | 7.89 | ||||
Consent procedures described | 3 | 44 | 0 | 6.38 | 2 | 36 | 9 | 5.26 | ||||
Peer debriefing | 2 | 45 | 0 | 4.26 | 0 | 47 | 0 | 0.00 | ||||
Negative case analysis (searching for disconfirming evidence) | 1 | 46 | 0 | 2.13 | 0 | 47 | 0 | 0.00 | ||||
Triangulation of theory/perspective | 0 | 47 | 0 | 0.00 | 4 | 43 | 0 | 8.51 | ||||
Applicability (transferability/fittingness) | ||||||||||||
Statement of purpose/objective | 34 | 13 | 0 | 72.34 | 36 | 11 | 0 | 76.60 | ||||
Data gathering procedures described* | 25 | 22 | 0 | 53.19 | 36 | 11 | 0 | 76.60 | ||||
Description of study context or setting*** | 20 | 27 | 0 | 42.55 | 38 | 9 | 0 | 80.43 | ||||
Phenomenon of study stated | 18 | 29 | 0 | 38.30 | 24 | 23 | 0 | 51.06 | ||||
Sampling procedure described | 18 | 29 | 0 | 38.30 | 22 | 23 | 2 | 48.89 | ||||
Rationale for qualitative methods | 17 | 30 | 0 | 36.17 | 12 | 35 | 0 | 25.53 | ||||
Description of participants/informants* | 16 | 31 | 0 | 34.04 | 25 | 20 | 2 | 55.56 | ||||
Statement of research questions | 15 | 32 | 0 | 31.91 | 21 | 26 | 0 | 44.68 | ||||
Statement of how setting was selected | 15 | 32 | 0 | 31.91 | 30 | 17 | 0 | 63.04 | ||||
Data analysis described | 15 | 32 | 0 | 31.91 | 20 | 27 | 0 | 42.55 | ||||
Transcription procedures described | 11 | 36 | 0 | 23.40 | 13 | 28 | 6 | 31.71 | ||||
Coding techniques described | 9 | 38 | 0 | 19.15 | 17 | 30 | 0 | 36.17 | ||||
Justification or rationale for sampling strategy* | 8 | 39 | 0 | 17.02 | 18 | 27 | 2 | 40.00 | ||||
Audiotaping procedures described | 8 | 39 | 0 | 17.02 | 12 | 29 | 6 | 29.27 | ||||
Statement about nonrespondents, dropouts, deaths | 6 | 41 | 0 | 12.77 | 4 | 34 | 9 | 10.53 | ||||
Description of raw data | 3 | 44 | 0 | 6.38 | 4 | 43 | 0 | 8.51 | ||||
Rationale for tradition within qualitative methods | 2 | 45 | 0 | 4.26 | 2 | 45 | 0 | 4.26 | ||||
Data collection to saturation specified | 2 | 45 | 0 | 4.26 | 1 | 44 | 2 | 2.22 | ||||
Statement that reflexive journals, logbooks, notes were kept | 2 | 45 | 0 | 4.26 | 3 | 44 | 0 | 6.38 | ||||
Consistency (dependability) | ||||||||||||
External audit of process | 0 | 47 | 0 | 0.00 | 0 | 47 | 0 | 0.00 | ||||
Neutrality (confirmability)
External audit of data | 2 | 45 | 0 | 4.26 | 0 | 47 | 0 | 0.00 | ||||
Bracketing or epoche | 0 | 47 | 0 | 0.00 | 0 | 47 | 0 | 0.00 | ||||
Statement of researcher's assumptions or perspective | 0 | 47 | 0 | 0.00 | 2 | 45 | 0 | 4.26 | ||||
Key mixed methods methodological components | ||||||||||||
Integration of qualitative and quantitative components | 40 | 7 | — | 85.11 | ||||||||
Sequence of methods specified | 10 | 37 | — | 27.03 | ||||||||
Areas of consistency between methods stated | 12 | 35 | — | 25.53 | ||||||||
Areas of inconsistency between methods stated | 6 | 41 | — | 12.77 | ||||||||
Stage of integration specified | 5 | 42 | — | 11.90 | ||||||||
Priority of methods specified | 2 | 45 | — | 4.44 | ||||||||
Purpose of mixing methods specified | 2 | 45 | — | 4.44 | ||||||||
Limitations of mixed methods stated | 2 | 45 | — | 4.26 |
Note.
*p < .05;
**p < .01;
***p < .001.
†Percent with component is calculated as n(yes)/(n − n(N/A)).
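The table's percentage formula, n(yes)/(n − n(N/A)), can be verified against any row of Table 3; the helper function name below is ours, not from the paper.

```python
# Verify the table's percentage formula, n(yes) / (n - n(N/A)), against
# rows of Table 3. The helper name is ours, not from the paper.
def percent_with_component(n_yes: int, n_total: int, n_na: int) -> float:
    """Percent of applicable studies (total minus N/A) reporting the component."""
    return 100.0 * n_yes / (n_total - n_na)

# "Informed consent stated", mixed methods column: 5 yes, 21 no, 21 N/A.
informed_consent = round(percent_with_component(5, 47, 21), 2)  # 19.23
# "Ethical review undertaken", mixed methods column: 9 yes, 37 no, 1 N/A.
ethical_review = round(percent_with_component(9, 47, 1), 2)     # 19.57
```

Both values match the corresponding entries in Table 3.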
We assessed components using categories of 0 (not described), 1 (described), or not applicable (e.g., for criteria referencing control groups in a study that had none, or ethical review for a study with no human subjects data) (O'Cathain, Murphy, and Nicholl 2007). We identified only whether the study contained or did not contain each methodological component and did not attempt to assess quality or appropriateness of each component within the context of the study. For example, we assessed whether the publication stated that missing data were addressed but not whether the methods to address missing data were the best methods for that particular research design. Similar to initial article classification, two authors read and coded articles to assess presence/absence of each criterion, with any ambiguity resolved in consultation with a third author.
Quantitative Analyses of Trends and Rigor
Once all articles were coded, we conducted statistical analyses to determine whether there were trends over time in the prevalence of mixed methods articles. We used linear regression to test the hypothesis that the prevalence of mixed methods articles would increase over time. We also conducted chi-square tests to assess differences between mixed methods, qualitative, and quantitative articles on both quantitative and qualitative criteria. We tested whether each criterion was present in the same proportion of quantitative studies as in the quantitative portion of the mixed methods studies and in the same proportion of qualitative studies as in the qualitative portion of the mixed methods studies.
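As a concrete sketch of one such chi-square comparison, consider the "Confidence intervals given for main results" row of Table 3 (5 of 47 mixed methods articles versus 26 of 47 quantitative articles). The authors' statistical software is not stated; scipy is used here purely for illustration.

```python
# Chi-square comparison sketch for one Table 3 row: "Confidence intervals
# given for main results" (mixed methods: 5 yes / 42 no; quantitative:
# 26 yes / 21 no). Illustrative only; the study's software is not stated.
import numpy as np
from scipy.stats import chi2_contingency

table = np.array([[5, 42],    # mixed methods articles: yes, no
                  [26, 21]])  # quantitative articles: yes, no

chi2, p, dof, expected = chi2_contingency(table, correction=False)

cramers_v = float(np.sqrt(chi2 / table.sum()))  # Cramer's V for a 2x2 table
# Odds of a quantitative article reporting CIs relative to a mixed methods one.
odds_ratio = (table[1, 0] * table[0, 1]) / (table[0, 0] * table[1, 1])
```

The resulting p-value falls below .001, consistent with the *** marker on that row in Table 3.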
Results
In general, coders could easily categorize the type of study. Challenges arose when transparency about methods was inadequate (n = 58, 3.5 percent of all empirical articles). For example, some papers indicated that data from interviews were included but did not provide details about who was interviewed, what was asked in the interviews, how the interview data were analyzed, or how the interview data were integrated into the overall study.
Research Question 1: How has the frequency of mixed methods studies published in health services journals changed over time?
Table 1 presents a summary of the types of articles published in four major health services research journals from 2003 through 2007. Only 2.85 percent (n = 47) of empirical articles were mixed methods studies; 6.18 percent (n = 102) of empirical studies represented qualitative research. Quantitative research represented 90.98 percent (n = 1,502) of empirical articles. The journal containing the highest proportion of empirical studies employing a mixed methods design was Milbank Quarterly (8.33 percent), followed by Health Affairs (5.60 percent), Health Services Research (3.61 percent), and Medical Care (0.78 percent). A chi-square test showed a statistically significant difference in these proportions (χ² = 34.67, df = 3, p < .0001).
Table 1.
Type and Design of Empirical Articles Published in Health Services Research Journals from 2003 to 2007, Data Presented by Journal
Journal | Quant | Qual | Mixed | Total |
---|---|---|---|---|
Health Affairs | 305 | 49 | 21 | 375 |
81.33% | 13.07% | 5.60% | ||
Health Services Research | 428 | 26 | 17 | 471 |
90.87% | 5.52% | 3.61% | ||
Medical Care | 751 | 12 | 6 | 769 |
97.66% | 1.56% | 0.78% | ||
Milbank Quarterly | 18 | 15 | 3 | 36 |
50.00% | 41.67% | 8.33% | ||
Total | 1,502 | 102 | 47 | 1,651 |
90.98% | 6.18% | 2.85% |
Note. Mixed, mixed method articles; Qual, qualitative articles; Quant, quantitative articles.
To detect temporal trends in the frequency of mixed methods research in the health services literature, articles were collapsed across journals and examined by publication year. Table 2 presents the frequency of each article type for each of the 5 years. All journals combined published an average of 9.4 mixed methods articles per year, or 2.85 percent of empirical articles annually. A quadratic trend was seen across the 5 years (R² = 0.65), indicating a slight increase in mixed methods articles in the first 2 years and then a decrease for the remaining years.
Table 2.
Type and Design of Empirical Articles Published in Four Health Services Research Journals from 2003 to 2007, Data Presented by Year
Year | Quant | Qual | Mixed | Total |
---|---|---|---|---|
2003 | 260 | 21 | 7 | 288 |
90.28% | 7.29% | 2.43% | ||
2004 | 295 | 18 | 13 | 326 |
90.49% | 5.52% | 3.99% | ||
2005 | 282 | 17 | 8 | 307 |
91.86% | 5.54% | 2.61% | ||
2006 | 321 | 25 | 10 | 356 |
90.17% | 7.02% | 2.81% | ||
2007 | 344 | 21 | 9 | 374 |
91.98% | 5.61% | 2.41% | ||
Total | 1,502 | 102 | 47 | 1,651 |
90.98% | 6.18% | 2.85% |
Note. Mixed, mixed method articles; Qual, qualitative articles; Quant, quantitative articles.
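The quadratic trend can be reconstructed from the Table 2 counts. The exact model specification behind the reported R² is not stated, so fitting the raw yearly counts, as below, is an assumption and need not reproduce that value; the shape of the fit is the point of interest.

```python
# Sketch of the trend analysis: fit a quadratic to the yearly counts of
# mixed methods articles from Table 2. The paper's exact model
# specification is assumed, so the R-squared here may differ from the
# reported value.
import numpy as np

years = np.array([2003, 2004, 2005, 2006, 2007])
mixed = np.array([7, 13, 8, 10, 9])  # mixed methods articles per year (Table 2)

x = years - years.min()                # recenter years to 0..4
a, b, c = np.polyfit(x, mixed, deg=2)  # quadratic trend coefficients
fitted = np.polyval([a, b, c], x)

ss_res = float(np.sum((mixed - fitted) ** 2))
ss_tot = float(np.sum((mixed - mixed.mean()) ** 2))
r_squared = 1 - ss_res / ss_tot

# A negative leading coefficient reflects the rise-then-fall pattern.
```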
Research Question 2: How are mixed methods articles being used to elucidate health services?
Mixed methods articles were categorized into four overlapping categories: Articles on organizational and individual decision-making processes (n = 18 studies) combined qualitative interviews with quantitative administrative data analyses to assess decision making about processes or impediments to processes. Examples include a study of formulary adoption decisions (Dranove, Hughes, and Shanley 2003) and states’ decisions to reduce Medicaid and other public program funding (Hoadley, Cunningham, and McHugh 2004).
Sixteen articles described outcomes or effects of policies or initiatives by combining administrative health record or performance data with interviews of health administrators, providers, or executives. Examples include papers describing outcomes of pay-for-performance changes to Medicaid (Felt-Lisk, Gimm, and Peterson 2007; Rosenthal et al. 2007) and hospital patient safety initiatives (Devers, Pham, and Liu 2004).
Thirteen measurement development articles employed mixed methods to create measurement tools assessing, for example, caregiver burden (Cousineau et al. 2003), patient activation (Hibbard et al. 2004), and a Healthcare Effectiveness Data and Information Set (HEDIS) smoking measure (Pbert et al. 2003). These studies typically examined qualitative data from individual or focus group interviews first to inform the creation and testing of a survey.
Articles on experiences and perceptions were the least common category (n = 8), typically combining surveys and interviews. These included family physicians’ perceptions of the effect of medication samples on their prescribing practices (Hall, Tett, and Nissen 2006); caregivers’ experiences of the termination of home health care for stroke patients (Levine et al. 2006); and consumer enrollment experiences in the Cash and Counseling program (Schore, Foster, and Phillips 2007).
Only five articles (10.64 percent) of the total mixed methods sample used the terms “mixed method” or “multimethod” in the abstract or text, although four articles (8.51 percent) referred to “qualitative and quantitative” data.
Research Question 3: Do mixed methods articles report qualitative and quantitative methodology differently than methodology is reported in qualitative-only or quantitative-only articles?
Table 3 presents a summary of the frequency of key methodological components present in quantitative articles, qualitative articles, and mixed methods articles (each n = 47). For quantitative methodological components (32 items), mixed methods articles (M = 7.02 [21.94 percent], SD = 6.24) averaged statistically significantly fewer components (t(92) = −4.50, p < .00001, Cohen's d effect size = 0.93) than did quantitative articles (M = 15.06 [47.07 percent], SD = 10.53). For qualitative methodological components (35 items), the proportion of components in mixed methods articles (M = 7.17 [21.34 percent], SD = 6.36) did not differ statistically significantly (t(92) = −1.10, p = .14, d = 0.23) from that in qualitative articles (M = 8.91 [25.47 percent], SD = 8.83). No article met all criteria, and no criterion was met by all articles. For comparative analyses at a statistical significance level of α = 0.05, power to detect a medium difference (Cohen's h = 0.50) and a large difference (Cohen's h = 0.80) was 78 and 99 percent, respectively.
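As a transparency check, the reported effect sizes and power figures can be reproduced from the summary statistics given above. The sketch below is illustrative only (it is not the authors' analysis code); it assumes pooled-SD Cohen's d, an independent-samples t statistic, the standard normal approximation for two-proportion power at arcsine effect size Cohen's h, and one-tailed tests, an assumption inferred from the reported values.

```python
from math import sqrt
from statistics import NormalDist


def cohens_d_and_t(m1, s1, m2, s2, n1, n2):
    """Pooled-SD Cohen's d and the independent-samples t statistic,
    computed from group means, SDs, and sizes."""
    sp = sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2))
    diff = m2 - m1
    return diff / sp, diff / (sp * sqrt(1 / n1 + 1 / n2))


def power_two_proportions(h, n_per_group, alpha=0.05, one_tailed=True):
    """Normal-approximation power for comparing two proportions with
    equal group sizes at arcsine effect size Cohen's h."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha) if one_tailed else z.inv_cdf(1 - alpha / 2)
    return z.cdf(h * sqrt(n_per_group / 2) - z_crit)


# Quantitative components: mixed methods (M=7.02, SD=6.24) vs.
# quantitative-only articles (M=15.06, SD=10.53), n=47 per group.
d_quant, t_quant = cohens_d_and_t(7.02, 6.24, 15.06, 10.53, 47, 47)

# Qualitative components: mixed methods (M=7.17, SD=6.36) vs.
# qualitative-only articles (M=8.91, SD=8.83), n=47 per group.
d_qual, t_qual = cohens_d_and_t(7.17, 6.36, 8.91, 8.83, 47, 47)

print(round(d_quant, 2), round(t_quant, 2))       # 0.93 4.5
print(round(d_qual, 2), round(t_qual, 2))         # 0.23 1.1
print(round(power_two_proportions(0.50, 47), 2))  # 0.78 (medium difference)
print(round(power_two_proportions(0.80, 47), 2))  # 0.99 (large difference)
```

Under these assumptions the computed values match the reported d = 0.93 and 0.23, t(92) = 4.50 and 1.10, and power of 78 and 99 percent for Cohen's h of 0.50 and 0.80.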
Of quantitative components, mixed methods studies were most likely to describe sources of data and data collection instruments (61.70 percent of studies), state the purpose/objective of the paper (59.57 percent), state the source of subjects (58.70 percent), and define/describe the study population (51.06 percent). Most mixed methods studies did not include control and intervention groups, so the criteria related to such groups did not apply. Quantitative studies tended to contain more key methodological components: more than 90 percent of studies defined outcome measures (93.48 percent), defined/described the study population (91.49 percent), described statistical procedures (95.74 percent), and stated hypotheses (97.87 percent). Quantitative studies were more likely than the quantitative portion of mixed methods studies to describe study characteristics (e.g., study design, subject recruitment), identify or control for confounding variables, provide probability values or confidence intervals, state hypotheses, or acknowledge both statistical and clinical significance (see Table 3).
For qualitative methodological components, mixed methods studies were most likely to state the purpose/objective of the paper (72.34 percent), triangulate qualitative sources (e.g., use both individual and focus group interviews; 53.19 percent), and describe data-gathering procedures (53.19 percent). More than 50 percent of qualitative studies triangulated qualitative sources (57.45 percent), stated the purpose/objective of the paper (57.45 percent), and described the study setting (80.43 percent), how the setting was selected (63.04 percent), the participants (55.56 percent), and data-gathering procedures (76.60 percent). Qualitative studies were more likely than the qualitative portions of mixed methods studies to describe the study setting, justify the sampling strategy, and describe the participants and data-gathering procedures.
For criteria regarding method integration, few authors justified the use of mixed methods or clearly described the priority, purpose, and sequence of methods, and the stage of integration. Most articles, however, integrated qualitative and quantitative components (85.11 percent); examination of articles indicated components were most frequently integrated in the interpretation phase. Across all studies, few articles stated that informed consent was obtained, ethical review was undertaken, or that subjects’ confidentiality was protected.
Discussion
Previous reports indicate mixed methods articles comprised <1 percent of empirical health articles examined in 2000 (McKibbon and Gadd 2004). Since then, however, the National Institutes of Health has increased funding for mixed methods research, with the proportion of funded research projects up to 5 percent of studies in some institutes (Plano Clark 2010). In the United Kingdom, the proportion of funded research that uses mixed methods is at 17 percent and continuing to increase (O'Cathain, Murphy, and Nicholl 2007). We found that the use of mixed methods in articles published in top health services research journals was generally consistent between 2003 and 2007 at approximately 3 percent of all empirical articles, lower than would be expected given the complexity and depth of health services research questions for which mixed methods would be appropriate. The presence of key methodological components was variable across type of article, but the quantitative portion of mixed methods articles included consistently fewer methodological components than quantitative-only studies and the qualitative portion of mixed methods articles included about the same proportion of methodological components as qualitative-only articles. Mixed methods articles also generally did not address the priority, purpose, and sequence of methods or the integration of methods as suggested by experts in mixed methods (e.g., Creswell and Tashakkori 2008; O'Cathain 2010; Creswell and Plano Clark 2011).
Key methodological components that cut across qualitative and quantitative methodologies were often missing from mixed methods publications. Descriptions of sample selection and sampling procedures, the study context, and data-gathering procedures are essential aspects of interpreting study findings, and mixed methods studies should not be exempt from these basic research requirements. Many mixed methods studies did not include the level of detail that would likely be required for a qualitative or quantitative paper to be accepted in these high-ranking journals. Further, the studies appeared not to follow available guidance on the structure and components of mixed methods studies that discusses basic quality criteria, data collection strategies, methods of data analysis, procedures for integration of methods, processes of making inferences from text, and recommendations for adequate reporting of results (e.g., Giacomini and Cook for the Evidence-based Medicine Working Group 2000; Curry, Nembhard, and Bradley 2009; O'Cathain 2010; Tashakkori and Teddlie 2010; Creswell and Plano Clark 2011). In some ways this finding is not surprising because guidance on mixed methods standards is still emerging. We expect that the National Institutes of Health publication, Best Practices for Mixed Methods Research in the Health Sciences (Creswell, Klassen, Plano Clark, and Smith for the Office of Behavioral and Social Sciences Research), will lead to increased standardization of mixed methods approaches.
Although they reported more key methodological components on average than the mixed methods articles, quantitative articles in this analysis had some surprising gaps as well, including low reporting of power analyses, how missing data were addressed, and descriptions of control/comparison groups. It should be noted, however, that quantitative articles with large sample sizes do not necessarily need power analyses. Among single-method qualitative articles, only low proportions described the study context, coding techniques, or data analysis procedures. Few articles involving human subjects included statements that the research was conducted with ethical oversight, promised confidentiality, or obtained consent. These findings suggest that the issue of poor transparency in reporting methodology is not limited to mixed methods studies.
Recommendations for Mixed Methods Reporting
The methodological components reported here are not optimal indicators of the quality of mixed methods publications; an article could conceivably have all of these components and yet still be a poor research study. These components are, however, a useful starting point for a systematic evaluation of the rigor of qualitative and quantitative portions of mixed methods studies. Some journals require inclusion of other criteria (e.g., Consolidated Standards of Reporting Trials 2010) to guide reporting of highly structured methodologies (e.g., randomized clinical trials); it would be useful to examine researchers’ and editors’ perspectives on the validity of the methodological components in this study for mixed method publications. It is difficult, however, to identify measurable criteria that capture the breadth of study designs in health services. Further, determination of what indicators of rigor would be appropriate might reasonably vary by study design, topic, scope, and even journal, and qualified judgment is required to determine which criteria are appropriate for each study. These findings suggest mixed methods researchers should provide enough detail on methodology and methodological decisions to allow reviewers to judge quality.
Researchers face challenges writing and publishing mixed methods articles, including communicating with diverse audiences who are familiar with only one methodological approach (i.e., quantitative research or qualitative research), determining the most appropriate language and terminology to use, complying with journal word counts, and finding appropriate publishing outlets with reviewers who have expertise in mixed methods research techniques and who are not biased against mixed methods studies (Leech and Onwuegbuzie 2010; Leech, Onwuegbuzie, and Combs 2011). Our findings suggest that Sale and Brazil's (2004) criteria and existing guidance on conducting mixed methods research (e.g., Collins, Onwuegbuzie, and Sutton 2006; Tashakkori and Teddlie 2010; Creswell and Plano Clark 2011) might be useful frameworks for health services researchers as they work to improve methodological rigor. Journal editors might also encourage the publication of mixed methods projects by (1) publishing guidelines for rigor in mixed methods articles (e.g., Sale and Brazil 2004), (2) identifying experienced reviewers who can provide competent and ethical reviews of mixed methods studies, and (3) requiring transparency of methods for all studies so that rigor and quality can be assessed to the same extent they are in quantitative studies. These modifications might require some flexibility in word count or allowance of online appendices so that mixed methods researchers can describe fully and concisely both qualitative and quantitative components, methods for integrating findings, and other appropriate details.
Limitations
In this study, assessment was limited to only published articles. We did not contact authors to determine specific study activities, and studies may have included methodological components (e.g., consenting) not reported in publications. We assessed only whether publications reported the methodological component, but we did not evaluate whether each component was fully and appropriately implemented in the research.
Conclusions
Mixed methods studies have utility in providing a more comprehensive picture of health services than either method can alone. Researchers who use mixed methods techniques should use rigorous methodologies in their mixed methods research designs and explicitly report key methodological components of those designs and methods in published articles. Similarly, journal editors who publish mixed methods research should provide guidance to reviewers of mixed methods articles to assess the quality of manuscripts, and they must be prepared to provide adequate space for authors to report the necessary methodological information. Frameworks are now available to guide both the design and evaluation of mixed methods research studies and published works. Whatever frameworks are used, it is essential that authors who engage in mixed methods research studies meet two primary goals (developed by the American Educational Research Association 2006): Mixed methods researchers should (1) conduct and report research that is warranted or defensible in terms of documenting evidence, substantiating results, and validating conclusions; and (2) ensure that the conduct of research is transparent in terms of clarifying the logic underpinning the inquiry.
Acknowledgments
Joint Acknowledgment/Disclosure Statement: The authors appreciate funding from the National Institute on Drug Abuse (K23 DA020487) and comments and feedback on an earlier draft from the anonymous reviewers, John Creswell, PhD, Alicia O'Cathain, PhD, Hilary Vidair, PhD, Susan Essock, PhD, and Sa Shen, PhD. Portions of this manuscript were presented at the International Mixed Methods Conference in July 2010 in Baltimore, Maryland.
Disclosures: None.
Disclaimers: None.
SUPPORTING INFORMATION
Additional supporting information may be found in the online version of this article:
Appendix SA1: Author Matrix.
Please note: Wiley-Blackwell is not responsible for the content or functionality of any supporting materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.
References
- Aboelela SW, Larson E, Bakken S, Carrasquillo O, Formicola A, Glied SA, Haas J, Gebbie KM. Defining Interdisciplinary Research: Conclusions from a Critical Review of the Literature. Health Services Research. 2007;42(1, part 1):329–46. doi: 10.1111/j.1475-6773.2006.00621.x.
- American Educational Research Association. Standards for Reporting on Empirical Social Science Research in AERA Publications. Educational Researcher. 2006;35(6):33–40.
- Armstrong R, Waters E, Moore L, Riggs E, Cuervo LG, Lumbiganon P, Hawe P. Improving the Reporting of Public Health Intervention Research: Advancing TREND and CONSORT. Journal of Public Health. 2008;30(1):103–9. doi: 10.1093/pubmed/fdm082.
- Berwick D. The John Eisenberg Lecture: Health Services Research as a Citizen in Improvement. Health Services Research. 2005;40(2):317–36. doi: 10.1111/j.1475-6773.2005.00358.x.
- Bradley EH, Curry LA, Devers KJ. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory. Health Services Research. 2007;42(4):1758–72. doi: 10.1111/j.1475-6773.2006.00684.x.
- Brannen J. Mixing Methods: Qualitative and Quantitative Research. Aldershot, England: Ashgate; 1992.
- Bryman A. Integrating Quantitative and Qualitative Research: How Is It Done? Qualitative Research. 2006;6:97–113.
- Campbell DT. Factors Relevant to the Validity of Experiments in Social Settings. Psychological Bulletin. 1957;54:297–312. doi: 10.1037/h0040950.
- Campbell DT, Stanley JC. Experimental and Quasi-Experimental Designs for Research. Chicago, IL: Rand McNally; 1963.
- Collins KMT, Onwuegbuzie AJ, Jiao QG. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research. Journal of Mixed Methods Research. 2007;1:267–94.
- Collins KMT, Onwuegbuzie AJ, Sutton IL. A Model Incorporating the Rationale and Purpose for Conducting Mixed Methods Research in Special Education and Beyond. Learning Disabilities: A Contemporary Journal. 2006;4:67–100.
- Consolidated Standards of Reporting Trials. 2010. Website [accessed on October 18, 2011]. Available at http://www.consort-statement.org/
- Cousineau N, McDowell I, Hotz S, Hébert P. Measuring Chronic Patients’ Feelings of Being a Burden to Their Caregivers. Medical Care. 2003;41:110–8. doi: 10.1097/00005650-200301000-00013.
- Creswell J. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches. Thousand Oaks, CA: Sage; 2008.
- Creswell JW, Klassen AC, Plano Clark VL, Smith KC for the Office of Behavioral and Social Sciences Research. Best Practices for Mixed Methods Research in the Health Sciences. National Institutes of Health; 2011 [accessed on October 18, 2011]. Available at http://obssr.od.nih.gov/mixed_methods_research.
- Creswell J, Plano Clark V. Designing and Conducting Mixed Methods Research. 2nd Edition. Thousand Oaks, CA: Sage; 2011.
- Creswell JW, Tashakkori A. Developing Publishable Mixed Methods Manuscripts. Journal of Mixed Methods Research. 2008;1:107–11.
- Curry LA, Nembhard IM, Bradley EH. Qualitative and Mixed Methods Provide Unique Contributions to Outcomes Research. Circulation. 2009;119:1442–52. doi: 10.1161/CIRCULATIONAHA.107.742775.
- Dranove D, Hughes FX, Shanley M. Determinants of HMO Formulary Adoption Decisions. Health Services Research. 2003;38(1):169–90. doi: 10.1111/1475-6773.t01-1-00111.
- Denzin NK, Lincoln YS. Handbook of Qualitative Research. Thousand Oaks, CA: Sage; 1994.
- Devers KJ, Pham HH, Liu G. What Is Driving Hospitals’ Patient-Safety Efforts? Health Affairs. 2004;23(2):103–15. doi: 10.1377/hlthaff.23.2.103.
- Felt-Lisk S, Gimm G, Peterson S. Making Pay-For-Performance Work in Medicaid. Health Affairs. 2007;26:w516–27. doi: 10.1377/hlthaff.26.4.w516.
- Field MJ, Tranquada RE, Feasley JC. Health Services Research: Workforce and Educational Issues. Washington, DC: National Academies Press; 1995.
- Giacomini MK, Cook DJ for the Evidence-based Medicine Working Group. Users’ Guides to the Medical Literature. XXIII. Qualitative Research in Health Care A. Are the Results of the Study Valid? Journal of the American Medical Association. 2000;284:357–62. doi: 10.1001/jama.284.3.357.
- Greene JC, Caracelli VJ, Graham WF. Toward a Conceptual Framework for Mixed-Method Evaluation. Educational Evaluation and Policy. 1989;11:255–74.
- Guba EG, Lincoln YS. Fourth Generation Evaluation. Newbury Park, CA: Sage; 1989.
- Hall KB, Tett SE, Nissen LM. Perceptions of the Influence of Prescription Medicine Samples on Prescribing by Family Physicians. Medical Care. 2006;44(4):383–7. doi: 10.1097/01.mlr.0000204017.71426.53.
- Hibbard JH, Stockard J, Mahoney ER, Tusler M. Development of the Patient Activation Measure (PAM): Conceptualizing and Measuring Activation in Patients and Consumers. Health Services Research. 2004;39(4, part 1):1005–26. doi: 10.1111/j.1475-6773.2004.00269.x.
- Hoadley JF, Cunningham P, McHugh M. Popular Medicaid Programs Do Battle with State Budget Pressures: Perspectives from Twelve States. Health Affairs. 2004;23(2):143–54. doi: 10.1377/hlthaff.23.2.143.
- Institute for Scientific Information. 2007. Web of Knowledge Journal Citation Report for Health Care Sciences and Services [accessed on October 18, 2011]. Available at http://admin-apps.isiknowledge.com/JCR/JCR?RQ=LIST_SUMMARY_JOURNAL.
- Johnson RB, Onwuegbuzie AJ, Turner LA. Toward a Definition of Mixed Methods Research. Journal of Mixed Methods Research. 2007;2:112–33.
- Leech NL, Dellinger AB, Brannigan KB, Tanaka H. Evaluating Mixed Research Studies: A Mixed Methods Approach. Journal of Mixed Methods Research. 2009;4(1):17–31.
- Leech NL, Onwuegbuzie AJ. A Typology of Mixed Methods Research Designs. Quality & Quantity: International Journal of Methodology. 2009;43:265–75.
- Leech NL, Onwuegbuzie AJ. Guidelines for Conducting and Reporting Mixed Research in the Field of Counseling and Beyond. Journal of Counseling and Development. 2010;89(1):61–70.
- Leech NL, Onwuegbuzie AJ, Combs JP. Writing Publishable Mixed Research Articles: Guidelines for Emerging Scholars in the Health Sciences and Beyond. Mixed Methods Research in the Health Sciences. 2011;5(1):7–24.
- Levine C, Albert SM, Hokenstad A, Halper DE, Hart AY, Gould DA. ‘This Case Is Closed’: Family Caregivers and the Termination of Home Health Care Services for Stroke Patients. Milbank Quarterly. 2006;84(2):305–31. doi: 10.1111/j.1468-0009.2006.00449.x.
- Lincoln YS, Guba EG. Naturalistic Inquiry. Beverly Hills, CA: Sage; 1985.
- Lohr KN, Steinwachs DM. Health Services Research: An Evolving Definition of the Field. Health Services Research. 2002;37:15–7.
- Malterud K. Qualitative Research: Standards, Challenges, and Guidelines. Lancet. 2001;358:483–8. doi: 10.1016/S0140-6736(01)05627-6.
- Maxwell JA. Qualitative Research Design: An Interactive Approach. 2nd Edition. Newbury Park, CA: Sage; 2005.
- McKibbon KA, Gadd CS. A Quantitative Analysis of Qualitative Studies in Clinical Journals for the 2000 Publishing Year. BioMed Central Medical Informatics and Decision Making. 2004;4(11). doi: 10.1186/1472-6947-4-11 [accessed on October 18, 2011]. Available at http://www.biomedcentral.com/1472-6947/4/11.
- Messick S. Validity. In: Linn RL, editor. Educational Measurement. 3rd Edition. Old Tappan, NJ: Macmillan; 1989. pp. 13–103.
- Messick S. Validity of Psychological Assessment: Validation of Inferences from Persons’ Responses and Performances as Scientific Inquiry into Score Meaning. American Psychologist. 1995;50:741–9.
- Miles M, Huberman AM. Qualitative Data Analysis: An Expanded Sourcebook. 2nd Edition. Thousand Oaks, CA: Sage; 1994.
- O'Cathain A. Assessing the Quality of Mixed Methods Research: Toward a Comprehensive Framework. In: Tashakkori A, Teddlie C, editors. Sage Handbook of Mixed Methods in Social and Behavioral Research. 2nd Edition. Thousand Oaks, CA: Sage; 2010. pp. 531–57.
- O'Cathain A, Murphy E, Nicholl J. Integration and Publications as Indicators of ‘Yield’ from Mixed Methods Studies. Journal of Mixed Methods Research. 2007;1(2):147–63.
- Onwuegbuzie AJ. Expanding the Framework of Internal and External Validity in Quantitative Research. Research in the Schools. 2003;10(1):71–90.
- Onwuegbuzie AJ, Collins KMT. A Typology of Mixed Methods Sampling Designs in Social Science Research. Qualitative Report. 2007;12:281–316.
- Onwuegbuzie AJ, Daniel LG. A Framework for Reporting and Interpreting Internal Consistency Reliability Estimates. Measurement and Evaluation in Counseling and Development. 2002;35:89–103.
- Onwuegbuzie AJ, Daniel LG. Reliability Generalization: The Importance of Considering Sample Specificity, Confidence Intervals, and Subgroup Differences. Research in the Schools. 2004;11(1):61–72.
- Padgett D. Qualitative Methods in Social Work Research. 2nd Edition. Thousand Oaks, CA: Sage; 2008.
- Pbert L, Vukovic N, Ockene JK, Hollis JF, Riedlinger K. Developing and Testing New Smoking Measures for the Health Plan Employer Data and Information Set. Medical Care. 2003;41(4):550–9. doi: 10.1097/01.MLR.0000053233.38183.77.
- Plano Clark VL. The Adoption and Practice of Mixed Methods: U.S. Trends in Federally Funded Health-Related Research. Qualitative Inquiry. 2010;16:428–40.
- Pluye P, Gagnon M, Griffiths F, Johnson-Lafleur J. A Scoring System for Appraising Mixed Methods Research and Concomitantly Appraising Qualitative, Quantitative and Mixed Methods Primary Studies in Mixed Studies Reviews. International Journal of Nursing Studies. 2009;46:529–46. doi: 10.1016/j.ijnurstu.2009.01.009.
- Pope C, Mays N. Qualitative Research in Health Care. 3rd Edition. Malden, MA: Blackwell Publishing; 2006.
- Powell H, Mihalas S, Onwuegbuzie AJ, Suldo S, Daley CE. Mixed Methods Research in School Psychology: A Mixed Methods Investigation of Trends in the Literature. Psychology in the Schools. 2008;45:291–309.
- Rosenthal MB, Landon BE, Howitt K, Song HR, Epstein AM. Climbing Up the Pay-For-Performance Learning Curve: Where Are The Early Adopters Now? Health Affairs. 2007;26(6):1674–82. doi: 10.1377/hlthaff.26.6.1674.
- Sale JEM, Brazil K. A Strategy to Identify Critical Appraisal Criteria for Primary Mixed-Method Studies. Quality and Quantity. 2004;38(4):351–65. doi: 10.1023/b:ququ.0000043126.25329.85.
- Schore J, Foster L, Phillips B. Consumer Enrollment and Experiences in the Cash and Counseling Program. Health Services Research. 2007;42(1, part 2):446–66. doi: 10.1111/j.1475-6773.2006.00679.x.
- Shortell S. The Emergence of Qualitative Methods in Health Services Research. Health Services Research. 1999;34:1083–90.
- Tashakkori A, Teddlie C. Sage Handbook of Mixed Methods in Social and Behavioral Research. 2nd Edition. Thousand Oaks, CA: Sage; 2010.
- Teddlie C, Tashakkori A. Foundations of Mixed Methods Research: Integrating Quantitative and Qualitative Approaches in the Social and Behavioral Sciences. Thousand Oaks, CA: Sage; 2009.