Abstract
Introduction
Demand for training in mixed methods is high, yet little research addresses faculty development or skills assessment in mixed methods. We describe the development of a Self-Rated Mixed Methods Skills Assessment and provide validity evidence. The instrument taps six research domains: “Research question,” “Design/approach,” “Sampling,” “Data collection,” “Analysis,” and “Dissemination.” Respondents rate their ability to define or explain mixed methods concepts under each domain, their ability to apply the concepts to practical problems, and the extent to which they need to improve their skills.
Methods
We administered the questionnaire to 145 faculty and students using an internet survey. We analyzed descriptive statistics and performance characteristics of the questionnaire, using Cronbach’s alpha to assess reliability and analysis of variance comparing assessment scores across levels of a mixed methods experience index to assess criterion-relatedness.
Results
Internal consistency reliability was high for the total set of items (0.95) and adequate (≥ 0.71) for all but one subscale. Consistent with criterion-related validity, respondents who had more professional experiences with mixed methods (e.g., had published a mixed methods paper) rated themselves as more skilled, a difference that was statistically significant across the research domains.
Discussion
This Self-Rated Mixed Methods Assessment instrument may be a useful tool to assess skills in mixed methods for training programs. It can be applied widely at the graduate and faculty level. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning.
Keywords: Professional development, Outcomes assessment, Mixed methods research, Research training, Faculty development, Outcomes/impact assessment, Self assessment, Survey methodology, Workforce development/issues
Mixed methods research has continued to develop across disciplines. It is an approach that involves the collection and analysis of quantitative and qualitative data and their integration.1 The use of mixed methods in the health sciences in particular has grown considerably in the U.S. and internationally, as seen in published studies2,3 and National Institutes of Health (NIH) funded grants.4 Newer funders, such as the Patient-Centered Outcomes Research Institute (PCORI), have encouraged the use of mixed methods for enhancing patient-centered outcomes.5 Mixed methods have been central in public health,6,7 trauma research,8 social work,9 primary care,10,11 and counseling.12 Researchers developing interventions in these fields have found value in mixed methods approaches for evaluating stakeholders’ responses to an intervention in the settings where it is to be delivered.13 Mixed methods data can augment a randomized clinical trial or intervention design by gathering exploratory data before, during, or after a trial14 to improve the development of an intervention or to explain the outcomes of a trial.15 Mixed methods approaches can also operate as core methodologies of implementation science across the translational continuum,16,17 contributing important new content for mixed methods training. Mixed methods research is needed in implementation science to understand the context of interventions, patient experiences, and nuances from multiple perspectives.18 Nevertheless, the majority of current faculty nationwide have not received formal mixed methods coursework and training.19,20
Scholars across the disciplines of medicine, public health, and the social sciences come to the research enterprise with different experiences and expertise, and disciplinary training and background can vary widely. The Mixed Methods Research Training Program (MMRTP) for the Health Sciences was established to address the need to provide research training in an interdisciplinary context, as suggested by Earley.21 Funded by the Office of Behavioral and Social Sciences Research (OBSSR) of the NIH through an R25 training grant, the MMRTP provides a mentorship-based, year-long program to train faculty-level scholars in the design and conduct of mixed methods research. Application to the program is competitive, with far more applications received than available positions. A committee composed of the program leaders (JG, CD, JC, and TG) rates each application using five criteria: 1) overall quality and quantity of scholarship and research relevant to stage in career, 2) quality and potential impact of the proposal, 3) capacity (or potential) for working effectively in transdisciplinary teams, 4) interest in mixed methods research and potential for national leadership in advancing the field of mixed methods, and 5) likelihood that participation in the mixed methods collaborative network would lead to a successful application for NIH, VA, or comparable funding in the arena of health sciences research.
The faculty scholars who have participated in this program are generally novice mixed methods researchers. They entered the program with a mixed methods research project concept paper, for which they received mentorship support toward developing it into a grant application to an NIH Institute or a university-administered NIH career development award. The scholar’s project thus became the vehicle for learning and applying mixed methods research methodologies and for faculty development.22
In developing the MMRTP, we needed an assessment instrument to tailor the program to the unique needs of each participating scholar. The same self-assessment instrument would also serve in program evaluation, administered pre and post training to examine changes in the mixed methods skills scholars acquired, with several additional open-ended questions at the post administration to gather feedback about the MMRTP’s strengths and weaknesses. The literature on training in mixed methods approaches and methodologies is sparse,19 and little of it addresses the assessment of mixed methods skills. Given that the MMRTP trains early career faculty, this assessment had to consider the unique needs of the faculty targeted for the training program. Specifically, these scholars had varying degrees of research experience and had been socialized to a certain disciplinary way of thinking within their individual disciplinary training programs. We thus needed to consider their previous exposure to research and disciplinary approaches within their respective academic environments.
The purpose of this paper is to describe how we developed a mixed methods skills assessment questionnaire. The MMRTP’s principal investigator Joseph Gallo, co-investigators Charles Deutsch and John Creswell, and evaluator Timothy Guetterman led the development of the assessment and sought input from consultants Marsha Wittink, Fran Barg, Felipe Castro, Britt Dahlberg, and Daphne Watkins (MW, FB, FC, BD, DW), who are co-authors of this article. For this assessment, we adopted the NIH Best Practices23 definition of mixed methods research as: focusing on research questions that call for contextual understanding, employing rigorous qualitative and quantitative research, intentionally integrating methods, and framing the research within philosophy and theory. That definition of mixed methods aligns most closely with the MMRTP. The intended use of the questionnaire is to assess mixed methods learners at varying stages of expertise and to identify needs and strengths to guide mixed methods skill development. We provide initial evidence of validity and reliability along with the instrument itself (Appendix) for potential use in other mixed methods training programs. A tested instrument to assess mixed methods skills can be applied to the MMRTP as well as to other mixed methods training programs. The assessment can be used formatively to tailor training or obtain a baseline assessment, summatively to measure change in skills, and as a research instrument to investigate mixed methods skills development.
As a check of criterion validity, we evaluated the responses obtained in light of respondents’ prior self-reported professional experiences with mixed methods (e.g., writing a mixed methods proposal; see Table 1). We hypothesized that a greater number of professional experiences involving mixed methods approaches would be positively associated with specific self-assessed skills and negatively associated with the avowed need for skills development. Finally, we examined consequential validity, which is concerned with the impact of the activity of assessment and the subsequent use of scores.24,25
Table 1. Research methods background and professional experiences with mixed methods reported by faculty and student respondents

| Faculty (n = 59) | Student (n = 86) | Total (n = 145) |
---|---|---|---|
Background in Research Methods | |||
I am primarily trained in qualitative research. | 12 (20.3%) | 22 (25.6%) | 34 (23.4%) |
I am primarily trained in quantitative research. | 47 (79.7%) | 60 (69.8%) | 107 (73.8%) |
I am primarily trained in mixed methods. | 12 (20.3%) | 27 (31.4%) | 39 (26.9%) |
Professional Experiences in Mixed Methods | |||
I wrote a mixed methods application that received funding.*** | 21 (35.6%) | 5 (5.8%) | 26 (17.9%) |
I wrote an application that did not receive funding.*** | 22 (37.3%) | 12 (14.0%) | 34 (23.4%) |
I participate in a mixed methods work group. | 24 (40.7%) | 26 (30.2%) | 50 (34.5%) |
I have presented mixed methods research at a local or institutional meeting.*** | 32 (54.2%) | 19 (22.1%) | 51 (35.2%) |
I have presented mixed methods research at a national meeting.*** | 28 (47.5%) | 7 (8.1%) | 35 (24.1%) |
I have taken a course in mixed methods.* | 23 (39.0%) | 51 (59.3%) | 74 (51.0%) |
I have published a paper using mixed methods.*** | 25 (42.4%) | 5 (5.8%) | 30 (20.7%) |
I wrote a dissertation involving mixed methods.* | 16 (27.1%) | 9 (10.5%) | 25 (17.2%) |
I mentor or advise others in mixed methods research.*** | 16 (27.1%) | 5 (5.8%) | 21 (14.5%) |
I have reviewed mixed methods applications on an NIH study section.*** | 8 (13.6%) | 0 (0.0%) | 8 (5.5%) |
I have reviewed mixed methods applications for a foundation or other organization.** | 10 (16.9%) | 3 (3.5%) | 13 (9.0%) |
I have reviewed mixed methods manuscripts as a peer reviewer for a journal.*** | 25 (42.4%) | 9 (10.5%) | 34 (23.4%) |
Summary | |||
Mean number (± standard deviation) of professional experiences*** | 4.2 ± 3.1 | 1.8 ± 1.7 | 2.8 ± 2.6 |
Percent with 3 or more professional experiences reported | 78% | 49% | 61% |
* p-value < 0.05,
** p-value < 0.01,
*** p-value < 0.001, based on χ2 test for counts of experiences and independent samples t-test for mean number of experiences
METHODS
Designing the Skills Assessment
We generated a pool of items that assess candidate skills under each of six research domains: (a) “Research question,” (b) “Design/approach,” (c) “Sampling,” (d) “Data collection,” (e) “Analysis,” and (f) “Dissemination.” The skills domains were identified based on our previous experiences in teaching and writing about mixed methods approaches.14,26 We also sought assistance from five consultants of the MMRTP to evaluate the content-relatedness of these items for assessing mixed methods skills. The consultants are faculty with demonstrated success in obtaining independent NIH or equivalent funding for research related to translation, patient-centered care, or program evaluation; a track record of and commitment to mentoring investigators; and expertise in mixed methods research. We circulated drafts of the instrument to these consultants for comments in an iterative fashion, incorporating feedback. Feedback from consultants helped us revise the wording of items to reduce participants’ cognitive fatigue. Consultants suggested more granular detail regarding experiences and background, particularly to distinguish writing funded versus unfunded mixed methods grants. Feedback also generated new items about mixed methods integration, such as mixed methods inferences. Our final list also included skills specific to quantitative and qualitative research, separately. However, for the development of the self-rated assessment of mixed methods, we administered only the component related to mixed methods to the participants, as reported in this article.
Dimensions for the ratings of skills
To develop this skills assessment, we drew from educational theory related to the epistemology of knowledge. The theory suggests that assessing skills involves different levels of application: the ability to define or explain the topic, the ability to apply it to practical issues, and the ability to give expert advice. These concepts have been applied to competency rating scales in the field of education27 and to mixed methods in particular through a typology that distinguishes novice from expert mixed methods researchers.28 In that educational competency scale, individuals rated technological competencies for four skills: “converse about the content in general ways,” “give explanations about critical concepts,” “apply knowledge to challenging practical problems,” and “give expert advice.” Each item was originally rated on a 7-point Likert-type dimension ranging from 1 = “Not at all” to 7 = “To a great extent.” Although we drew heavily from the structure of these assessments, we modified it in several ways. First, we reduced the 7-point response dimension to a 5-point response dimension because we did not believe that discriminating between 7 versus 5 points would be meaningful.29 Employing this 5-point dimension also reduced the burden for respondents. Next, we reduced the rating columns from four skills to three: (a) the ability to define/explain, (b) the ability to apply to practical problems, and (c) the extent to which “I need to improve my skills.” Including a rating for the need to improve one’s skills specifically incorporates the learner’s perspective on what they need to improve, and soliciting it provided further guidance to MMRTP planning by incorporating the learner’s self-identified needs. Furthermore, the assessment instrument concluded with open-ended items that asked about the particular new skills needed and the scholar’s learning goals for the program (see Appendix). The open-ended items were designed to solicit skills and goals not anticipated in the quantitative items and to understand goals in the scholars’ own voices.
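To illustrate the resulting structure, the following is a minimal sketch, not the published instrument file, of how the domains, rating dimensions, and response scale could be represented. The names DOMAINS, SKILL_DIMENSIONS, and RESPONSE_SCALE are illustrative assumptions; item wording within each domain is abbreviated to the domain labels used in this article.

```python
# Hypothetical representation of the assessment structure described above.
DOMAINS = [
    "Research question",
    "Design/approach",
    "Sampling",
    "Data collection",
    "Analysis",
    "Dissemination",
]

SKILL_DIMENSIONS = [
    "ability to define/explain",
    "ability to apply to practical problems",
    "extent to which I need to improve my skills",
]

# 5-point Likert-type response dimension with labeled endpoints.
RESPONSE_SCALE = {1: "Not at all", 5: "To a great extent"}

def rating_cells():
    """Every domain x skill-dimension cell a respondent rates on the 1-5 scale."""
    return [(domain, skill) for domain in DOMAINS for skill in SKILL_DIMENSIONS]
```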
Data collection
We sent an email containing a link to the online survey to the 25 consultants on the MMRTP and requested that they forward the email to 5 to 10 colleagues and students. The online survey software recorded the time from the start of the survey to completion of the final item. For the purpose of this study, we added an additional open-ended item to the skills assessment to solicit feedback from respondents about skills we might have missed. Respondents who were willing to provide their name and e-mail address confidentially were sent a $10 gift card as a token of appreciation; the names and email addresses were kept separate from the survey responses. The study procedures were reviewed and deemed educational research, and the project was approved as exempt by the Institutional Review Board of the Johns Hopkins University Bloomberg School of Public Health.
Data analysis
Our data analysis proceeded in three phases to gather evidence of validity and reliability of scores. First, in a descriptive analysis, we examined the distributional properties of each variable using frequency analyses to identify any responses that were out of range, missing, or in error. Second, we created an “experience index” consisting of the number of “yes” responses to the 12 items that assessed the level of experience in using mixed methods research approaches (see Appendix). We then computed Cronbach’s alpha coefficient for the total instrument and for each of the six subscales that assessed the skills categories (“Research question,” “Design,” etc.).30 Third, we sorted the sample into two reference groups based on the value of the experience index: above the median (more than two experiences) and below the median (two or fewer). To assess criterion-related validity, we compared individuals above the median of experiences with mixed methods with those having fewer experiences on the mean subscale scores for the three self-ratings: (a) the ability to define or explain the concept, (b) the ability to apply the concept to practical problems, and (c) the extent to which the respondent felt they needed to improve their mixed methods skills. We used analysis of variance to compare the skill means between the high and low experience groups, with an α of 0.05 for statistical significance. All analyses were conducted with SPSS version 22 (IBM Corp., Armonk, NY).
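As a concrete sketch of these phases (not the SPSS syntax actually used), the following Python code computes the experience index, Cronbach’s alpha, and the high/low group comparison. The column names (the experience and skill-rating columns passed as arguments) are assumptions for illustration, and for brevity the sketch compares a single overall skill mean, whereas the paper compared each domain and skill dimension separately.

```python
# Minimal sketch of the analysis described above; variable names are illustrative.
import pandas as pd
from scipy import stats


def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of item columns (listwise complete cases)."""
    items = items.dropna()
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return (k / (k - 1)) * (1 - item_var / total_var)


def analyze(df: pd.DataFrame, experience_cols: list, skill_cols: list):
    df = df.copy()
    # Experience index: count of "yes" (coded 1) responses to the 12 experience items.
    df["experience_index"] = df[experience_cols].sum(axis=1)

    # Internal consistency of the full set of skill-rating items.
    alpha_total = cronbach_alpha(df[skill_cols])

    # Split at the median of 2 experiences: high (>2) versus low (<=2).
    high = df[df["experience_index"] > 2]
    low = df[df["experience_index"] <= 2]

    # One-way ANOVA comparing mean skill ratings between the two groups.
    f_stat, p_value = stats.f_oneway(
        high[skill_cols].mean(axis=1), low[skill_cols].mean(axis=1)
    )
    return alpha_total, f_stat, p_value
```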
RESULTS
Sample characteristics
Of 192 individuals who started the questionnaire, 145 completed all the items (a 75% completion rate). Attrition occurred primarily after the professional experiences items, such as writing a mixed methods application for funding and mentoring others in mixed methods research (Table 1). Of the 47 respondents who did not complete all items, 44 stopped after the professional experiences items, and 3 completed several items in the skills domains but did not finish the entire questionnaire. Few differences were found between people who completed the questionnaire (n=145) and those who partially completed it (n=47). One notable exception was that a statistically significantly greater proportion of completers than partial completers reported being primarily trained in mixed methods research, as opposed to qualitative or quantitative inquiry (27% and 13%, respectively, χ2(1, N=192)=3.95, p<.05). There were no other statistically significant differences in reported experiences between completers and partial completers. The mean number of professional experiences endorsed was 3.0 for partial completers, compared to 2.8 for completers.
The sample of completed questionnaires consisted of 59% (n=86) students and 41% (n=59) faculty (Table 1). The assessment took an average of 12 minutes to complete. As expected, faculty reported a greater number of professional experiences (e.g., publishing an article) related to mixed methods (mean, 4.2±3.1) than the graduate students did (mean, 1.8±1.7). Faculty were also more likely to report having more than two professional experiences with mixed methods (78% of faculty compared to 49% of graduate students). More students than faculty reported that their research methods training was primarily in mixed methods approaches (31% of students, 20% of faculty), and the graduate students were more likely to report having taken a course in mixed methods (59% of students, 39% of faculty). Nevertheless, faculty were more likely than students to report experiences with mixed methods in applying for funding, obtaining funding, participating in a mixed methods work group, presenting and publishing mixed methods research, or serving as any type of mixed methods reviewer.
Internal consistency
We used Cronbach’s alpha coefficient to examine the internal consistency reliability of each of the six scales to assess participants’ skills in the conduct of mixed methods research. The overall Cronbach’s alpha coefficient for the total set of items was 0.95. Reliability for each domain subscale (combining ability to define/explain, ability to apply, and need to improve skill) was also adequate for all but one of the subscales (Research Questions, 0.71; Design/Approach, 0.90; Sampling, 0.60; Data Collection, 0.75; Analysis, 0.85; Dissemination, 0.70). In addition, we examined reliability of subscales within each of the six research domains (research questions, design, sampling, data collection, analysis, dissemination) and for each skill set (ability to define/explain, ability to apply, and need to improve skill). Cronbach’s alpha was above 0.70 for all subscales except sampling-ability to apply, which was 0.65.
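For reference, the coefficient reported here is the standard Cronbach’s alpha,30 which for a (sub)scale of k items is

\[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right), \]

where \(\sigma^{2}_{Y_i}\) is the variance of item i and \(\sigma^{2}_{X}\) is the variance of the total score formed by summing the items.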
Criterion-related validity
We calculated mean ratings for each domain and skill set and compared two respondent groups: (a) a High group that reported a number of professional experiences above the sample median and (b) a Low group that reported a number of professional experiences below the median (see Table 2). For each domain, mean ratings of “the ability to define or explain” and “the ability to apply to practical problems” were significantly higher for persons who reported a greater number of professional experiences with mixed methods approaches. This result indicates criterion-relatedness between tangible mixed methods experiences and skill ratings. All respondents rated their need to improve their skills very highly, with no statistically significant differences according to reported level of professional experiences with mixed methods.
Table 2. Mean (standard deviation) self-ratings by research domain for respondents above and below the median number (2) of professional experiences with mixed methods endorsed in Section 1

Professional experiences in mixed methods endorsed in Section 1 | Ability to define/explain: above the median of 2 | Ability to define/explain: below the median of 2 | F(1, 143) | Ability to apply to practical problems: above the median of 2 | Ability to apply to practical problems: below the median of 2 | F(1, 143) | Need to improve my skill: above the median of 2 | Need to improve my skill: below the median of 2 |
---|---|---|---|---|---|---|---|---|
Research question | ||||||||
Formulate question & aims that link modes of inquiry | 3.8 (0.7) | 3.0 (0.8) | 41.4** | 3.6 (0.6) | 2.9 (0.8) | 28.0** | 4.2 (0.8) | 4.3 (0.8) |
State underlying philosophical assumptions | ||||||||
Rationale for mixed methods study | ||||||||
Design/approach | ||||||||
Identifying integration points in a design | 3.2 (0.7) | 2.6 (.9) | 21.2** | 3.1 (0.7) | 2.4 (0.9) | 30.0* | 4.2 (0.7) | 4.2 (0.9) |
Explanatory sequential designs | ||||||||
Exploratory sequential designs | ||||||||
Convergent parallel designs | ||||||||
Intervention designs | ||||||||
Program evaluation designs | ||||||||
Case studies | ||||||||
Threats to internal validity in mixed methods | ||||||||
Threats to external validity in mixed methods | ||||||||
Diagram of the mixed methods design | ||||||||
Sampling | ||||||||
Sampling strategies that link qualitative and quantitative methods (e.g., random followed by purposive) | 3.9 (0.8) | 3.1 (1.1) | 24.9** | 3.7 (0.8) | 3.0 (1.0) | 24.7** | 4.0 (1.0) | 4.2 (1.0) |
Ethical principles of consent and recruitment | ||||||||
Data collection | ||||||||
Strategies for concurrent data collection | 3.6 (0.9) | 2.8 (1.0) | 27.3** | 3.5 (1.0) | 2.7 (1.2) | 16.6** | 4.2 (1.0) | 4.3 (1.0) |
Strategies for sequential data collection | ||||||||
Analysis | ||||||||
Combining qual and quan data (e.g., joint matrix) | 2.6 (1.1) | 1.8 (0.8) | 20.7** | 2.5 (1.0) | 1.8 (1.0) | 14.3** | 4.4 (0.8) | 4.3 (1.0) |
Cultural consensus analysis | ||||||||
Inference that links qualitative and quantitative (i.e., meta-inference) | ||||||||
Dissemination | ||||||||
Writing results incorporating both qualitative and quantitative methods in the same report | 3.5 (0.9) | 2.9 (1.1) | 16.0** | 3.4 (0.9) | 2.8 (1.2) | 10.4** | 4.3 (0.9) | 4.4 (0.9) |
Communicate results involving both qualitative and quantitative methods to non-academic audiences |
* p-value < 0.05,
** p-value < 0.01 for pairwise comparisons within “ability to define/explain,” “ability to apply to practical problems,” and “extent to which I need to improve.”
Additional topics mentioned
Respondents also suggested skills or themes that they thought were important but were not emphasized in the skills assessment, namely the need to learn about: focus groups, latent variable models, the design of instruments, collaborating with a team, writing for high-impact journals, identifying mixed methods designs, researcher reflexivity in qualitative inquiry, thinking “outside the box,” implementing research programs, and skills for planning to conduct mixed methods research.
DISCUSSION
To assess scholars’ skills acquisition within our MMRTP, we needed to develop a self-rated skills assessment instrument to guide and tailor our educational strategies. The instrument developed exhibited good internal consistency as assessed with Cronbach’s alpha coefficient and good criterion-related validity as assessed by comparisons between two criterion groups defined as high and low in the extent of their prior experiences with mixed methods (e.g., working on a mixed methods team, writing mixed methods grants). This Self-Rated Mixed Methods Assessment instrument is thus a tool that can be applied widely at the graduate and faculty level and complements student-based assessments such as tests and the survey developed by Poth.31
Before discussing the implications of the development of a self-rated assessment of skills in mixed methods, limitations should be mentioned. The instrument is a self-assessment, and self-perceptions may be inaccurate. Nevertheless, the instrument explicitly asks individuals to reflect on their own learning needs, and that reflection, along with applied experience and feedback from consultants and mentors, may help scholars improve their self-assessment ability. Our study is based on responses from a convenience sample of persons who were contacted by the consultants in the MMRTP supported by an R25 grant. Respondents may have been more interested in mixed methods than others who received an invitation but did not respond. Among persons who started the questionnaire, most finished, and differences in reported experiences with mixed methods between completers and partial completers were minimal. Nevertheless, the assessment instrument should be tested with other samples, including evaluation of its performance after respondents receive various types of educational or training experiences. Evidence from our first two cohorts of scholars indicates that the instrument can measure change after training, based on pre-post administration of the same instrument.32
In the sample studied here, internal reliability of the Self-Rated Mixed Methods Skills Assessment was excellent for the overall scale and adequate for most of the subscales assessing the domains of research question, design/approach, sampling, data collection, analysis, and dissemination. Cronbach’s alpha coefficient was not adequate for sampling, particularly sampling-ability to apply. The sampling domain consisted of only two items, one asking about “sampling strategies that link qualitative and quantitative methods” and one concerning “ethical principles of consent and recruitment.” In retrospect, these two items are not related, tapping different kinds of abilities, and we may wish to add more detailed tasks to tap each domain.
Additional work needs to be done to validate the instrument against external criteria, such as outcomes of training or performance measures of abilities. In this paper we made an initial evaluation of criterion validity by comparing scores between groups defined by level of experience with mixed methods approaches, finding that persons with more experience reported higher self-ratings of their skills. For our study, the experience index allowed a comparison of individuals with greater or fewer research experiences related to mixed methods, and a similar strategy may apply more broadly to other training topics when training outcomes are not easily obtainable. Consistent with the idea that a “first generation of faculty” have taught themselves mixed methods and now are teaching it,33 a smaller proportion of early career faculty than graduate students in our sample reported ever taking a mixed methods course. In our sample, only about one-third of the faculty reported having written a successful mixed methods proposal for funding.
The citations reviewed by Frels and colleagues on publications pertaining to the pedagogy of mixed methods research all refer to conceptual, theoretical statements or to descriptions of courses for graduate students.34 Very little in the way of assessment of faculty in advanced programs has been reported, and there is a need for an instrument that can serve as a guide for training in the health sciences. No similar assessment for mixed methods learning has been published. Developed with support from the NIH through the OBSSR, this Self-Rated Mixed Methods Skills Assessment may facilitate training and evaluation of investigators in the health sciences who use mixed methods. Based on our sample, the assessment has broad applicability to individuals across health sciences disciplines and training levels (e.g., graduate student, early career faculty, mid-career faculty). Intended uses of the assessment include evaluating mixed methods trainings, understanding learners’ baseline skills, guiding and tailoring educational strategies, and assessing mixed methods learning progress. Regarding consequential validity,24,25 which concerns the impact of assessing, interpreting, and applying scores of the Self-Rated Mixed Methods Skills Assessment, we speculate on consequences for learners and faculty. For the learner, assessment may lead to enhanced motivation to learn and training focused on self-identified needs. For faculty, the assessment may improve curriculum and course content planning. A goal of future research is to refine the instrument for use in workshops and training for advanced students as well as in the MMRTP. The availability of a brief evaluation tool can contribute to increasing the quality of training in mixed methods for the health sciences.
Lessons for Practice.
The Self-Rated Mixed Methods Skills Assessment has good initial evidence of content-related validity and reliability for the purpose of assessing mixed methods research skills.
Intended uses of the assessment include understanding baseline research skills as a formative assessment to guide training; evaluating courses, workshops, training, and other research education programs; and assessing learner progress.
Continuing education aimed at building faculty or graduate students’ mixed methods research skills might use the assessment to improve the quality of training.
Acknowledgments
Source of Funding
The Mixed Methods Research Training Program for the Health Sciences is supported by the Office of Behavioral and Social Sciences Research under Grant R25MH104660-01. Participating institutes for this project are the National Institute of Mental Health, National Heart, Lung, and Blood Institute, National Institute of Nursing Research, and the National Institute on Aging.
We are grateful to participants who filled out the survey. We also thank Lilly Pritula for her editorial assistance with the manuscript.
Footnotes
The list of participating national consultants can be found on our website: http://www.jhsph.edu/academics/training-programs/mixed-methods-training-program-for-the-health-sciences/
Conflicts of Interest
The authors have no conflicts of interest to declare.
Contributor Information
Timothy C. Guetterman, University of Michigan.
John W. Creswell, University of Michigan.
Marsha Wittink, University of Rochester.
Fran K. Barg, University of Pennsylvania.
Felipe G. Castro, Arizona State University.
Britt Dahlberg, Institute for Research, Chemical Heritage Foundation.
Daphne C. Watkins, University of Michigan.
Charles Deutsch, Harvard University.
Joseph J. Gallo, Johns Hopkins University.
References
1. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 2nd ed. Thousand Oaks, CA: Sage; 2011.
2. Ivankova NV, Kawamura Y. Emerging trends in the utilization of integrated designs in the social, behavioral, and health sciences. In: Tashakkori A, Teddlie C, editors. Sage handbook of mixed methods in social and behavioral research. 2nd ed. Thousand Oaks, CA: Sage; 2010. pp. 581–611.
3. Coyle CE, Schulman-Green D, Feder S, et al. Federal funding for mixed methods research in the health sciences in the United States: Recent trends. J Mix Methods Res. 2016.
4. Plano Clark VL. The adoption and practice of mixed methods: U.S. trends in federally funded health-related research. Qualitative Inquiry. 2010;16:428–440.
5. Nass P, Levine S, Yancy C. Methods for involving patients in topic generation for patient-centered comparative effectiveness research: An international perspective. Washington, DC: Patient-Centered Outcomes Research Institute; 2014.
6. Pluye P, Hong QN. Combining the power of stories and the power of numbers: Mixed methods research and mixed studies reviews. Annu Rev Public Health. 2014;35(1):29–45. doi: 10.1146/annurev-publhealth-032013-182440.
7. Padgett DK. Qualitative and mixed methods in public health. Thousand Oaks, CA: Sage; 2012.
8. Creswell JW, Zhang W. The application of mixed methods designs to trauma research. Journal of Traumatic Stress. 2009;22:612–621. doi: 10.1002/jts.20479.
9. Watkins DC, Gioia D. Mixed methods research. New York, NY: Oxford; 2015.
10. Barg FK, Huss-Ashmore R, Wittink MN, Murray GF, Bogner HR, Gallo JJ. A mixed methods approach to understand loneliness and depression in older adults. Journal of Gerontology: Social Sciences. 2006;61(6):S329–339. doi: 10.1093/geronb/61.6.s329.
11. Wittink MN, Barg FK, Gallo JJ. Unwritten rules of talking to doctors about depression: integrating qualitative and quantitative methods. Ann Fam Med. 2006;4(4):302–309. doi: 10.1370/afm.558.
12. Plano Clark VL, Wang SC. Adapting mixed methods research to multicultural counseling. In: Ponterotto JG, Casas JM, Suzuki LA, Alexander CM, editors. Handbook of multicultural counseling. 3rd ed. Thousand Oaks, CA: Sage; 2010. pp. 427–438.
13. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care. 2012;50:217–226. doi: 10.1097/MLR.0b013e3182408812.
14. Gallo JJ, Lee SY. Mixed methods in behavioral intervention research. In: Gitlin LN, Czaja SJ, editors. Behavioral Intervention Research. New York, NY: Springer Publishing Company; 2016. pp. 195–211.
15. Farquhar MC, Ewing G, Booth S. Using mixed methods to develop and evaluate complex interventions in palliative care research. Palliat Med. 2011;25(8):748–757. doi: 10.1177/0269216311417919.
16. Glasgow RE, Emmons KM. How can we increase translation of research into practice? Types of evidence needed. Annu Rev Public Health. 2007;28:413–433. doi: 10.1146/annurev.publhealth.28.021406.144145.
17. Glasgow RE, Vinson C, Chambers D, Khoury MJ, Kaplan RM, Hunter C. National Institutes of Health approaches to dissemination and implementation science: Current and future directions. Am J Public Health. 2012;102(7):1274–1281. doi: 10.2105/AJPH.2012.300755.
18. Palinkas LA. Qualitative and mixed methods in mental health services and implementation research. J Clin Child Adolesc Psychol. 2014;43(6):851–861. doi: 10.1080/15374416.2014.910791.
19. Hesse-Biber S. The problems and prospects in the teaching of mixed methods research. Int J Soc Res Methodol. 2015;18(5):463–477.
20. Frels RK, Onwuegbuzie AJ, Leech NL, Collins KM. Challenges to teaching mixed research courses. The Journal of Effective Teaching. 2012;12(2):23–44.
21. Earley M. A synthesis of the literature on research methods education. Teaching in Higher Education. 2014;19:242–253.
22. Gusic ME, Milner RJ, Tisdell EJ, Taylor EW, Quillen DA, Thorndyke LE. The essential value of projects in faculty development. Acad Med. 2010;85(9):1484–1491. doi: 10.1097/ACM.0b013e3181eb4d17.
23. Creswell JW, Klassen AC, Plano Clark VL, Smith KC. Best practices for mixed methods research in the health sciences. Washington, DC: National Institutes of Health; 2011.
24. Cook DA, Lineberry M. Consequences validity evidence: Evaluating the impact of educational assessments. Acad Med. 2016;91(6):785–795. doi: 10.1097/ACM.0000000000001114.
25. Kane MT. Explicating validity. Assessment in Education: Principles, Policy & Practice. 2016;23(2):198–211.
26. Creswell JW. A concise introduction to mixed methods research. Thousand Oaks, CA: Sage; 2015.
27. Harnisch D, Shope RJ. Developing technology competencies to enhance assessment literate teachers. Paper presented at: Society for Information Technology & Teacher Education International Conference; 2007; Chesapeake, VA.
28. Guetterman TC. What distinguishes a novice from an expert mixed methods researcher? Quality & Quantity. 2017;51:377–398.
29. DeVellis RF. Scale development: Theory and applications. 3rd ed. Thousand Oaks, CA: Sage; 2012.
30. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951;16:297–334.
31. Poth C. What constitutes effective learning experiences in a mixed methods research course? An examination from the student perspective. Int J Mult Res Approaches. 2014;8:74–86.
32. Guetterman TC, Creswell JW, Deutsch C, Gallo JJ. Skills development and academic productivity of scholars in the NIH mixed methods research training program for the health sciences. 2017. doi: 10.29034/ijmra.v10n1a25. Manuscript under review.
33. Creswell JW, Tashakkori A, Jensen KD, Shapley KL. Teaching mixed methods research: Practices, dilemmas, and challenges. In: Tashakkori A, Teddlie C, editors. Handbook of Mixed Methods in Social and Behavioral Research. Thousand Oaks, CA: Sage; 2003.
34. Frels RK, Onwuegbuzie AJ, Leech NL, Collins KMT. Pedagogical strategies used by selected leading mixed methodologists in mixed research courses. The Journal of Effective Teaching. 2014;14:5–34.