Journal of the American Medical Informatics Association (JAMIA). 2021 Sep 19;28(12):2743–2748. doi: 10.1093/jamia/ocab195

Guidance for publishing qualitative research in informatics

Jessica S Ancker,1 Natalie C Benda,2 Madhu Reddy,3 Kim M Unertl,1 Tiffany Veinot4
PMCID: PMC8633663  PMID: 34537840

Abstract

Qualitative research, the analysis of nonquantitative and nonquantifiable data through methods such as interviews and observation, is integral to the field of biomedical and health informatics. To demonstrate the integrity and quality of their qualitative research, authors should report important elements of their work. This perspective article offers guidance about reporting components of the research, including theory, the research question, sampling, data collection methods, data analysis, results, and discussion. Addressing these points in the paper assists peer reviewers and readers in assessing the rigor of the work and its contribution to the literature. Clearer and more detailed reporting will ensure that qualitative research will continue to be published in informatics, helping researchers disseminate their understanding of people, organizations, context, and sociotechnical relationships as they relate to biomedical and health data.

Keywords: qualitative research, qualitative methods, reliability, biases, data collection, data analysis

INTRODUCTION

Many informatics researchers, including those with training and expertise in quantitative or computational methods, come to recognize the value of applying qualitative methods. Qualitative research is the collection and analysis of nonquantitative and nonquantifiable data through methods such as interviews and observation to understand perspectives, beliefs, and experiences. Qualitative research is invaluable for understanding context, explaining phenomena and processes, understanding the rationale underlying behavior and decisions, generating hypotheses, and developing or extending theory about sociotechnical phenomena. In health and biomedical informatics, qualitative research is often used to gain insights into the patients and professionals who use informatics innovations, the contexts in which they live and work, and the life experiences that are the sources of the medical, technology, and digital trace data that informaticists analyze.

As a recent scoping review noted, qualitative research represents a small but important portion of published articles in JAMIA.1 In this perspective, we provide a brief outline of expectations for qualitative research published in informatics. Our goal is to explain how authors can demonstrate rigor and avoid common pitfalls by reporting how they have sought to reduce bias, improve reliability, and verify their findings (Box 1). Moreover, we aim to provide guidance that researchers can use at the early stages of studies to ensure that they are able to rigorously report methodological details in a subsequent manuscript.

BOX 1:

Guidance for reporting qualitative research in informatics

  1. Theory

    1. Cite theory appropriate to the topic being studied if applicable

  2. Research question and study design

    1. State the research question

    2. State the study design and methodological perspective of the research

  3. Sampling

    1. Describe the sampling approach

    2. Describe any approaches to ensure the inclusion of people from marginalized or underserved groups

    3. Report and justify the sample size

    4. If using saturation to determine sample size, report what type of saturation was used, and how saturation was assessed*

  4. Data collection

    1. Report how data were collected

    2. Report any methods for reducing bias in data collection and analysis*

  5. Data analysis

    1. Describe data analysis methods, with appropriate citations*

      1. For deductive analysis, report how the theory was used in the data collection and analysis*

      2. For inductive analysis, report how the steps of inductive analysis were done*

      3. For theory development, report how categories were developed*

    2. Describe any methods for improving the dependability of coding*

    3. Report any measures for improving the credibility of findings or verifying interpretations*

  6. Results

    1. Report sample size and characteristics of participants

    2. Support thematic findings with extracts, quotes, images, or observations

    3. Provide synthesis and interpretation

  7. Discussion

    1. Describe assumptions of the research and details of setting and context to illustrate transferability of findings

    2. Describe relationship of findings, or new theory developed in the study, to existing theory

    3. Report limitations

*Elements with an asterisk may need to be elaborated in an appendix to avoid lengthening the manuscript.

Theoretical rigor

All research, whether quantitative or qualitative, is strengthened by a firm foundation in relevant theory or an explanation of why new theory is needed. Authors should use their literature review to cite the relevant theory or theories that have grounded their work. For example, a study of the adoption of a novel technology would typically begin with a discussion of existing theories of technology diffusion and adoption. In some cases, existing theories are not adequate to describe the phenomenon being studied. For example, theories of technology adoption developed in advantaged populations may not be entirely relevant for understudied groups. In these cases, qualitative work may be needed to develop new concepts or theories or to extend existing ones. Grounded theory research, in particular, is designed to develop new theory. Authors using grounded theory should consider discussing current theories and explaining where they fall short, which will strengthen the rationale for theory-building work.

Rigor in stating the research question and clarifying the study design and methodological perspective

Like other research papers, qualitative research papers should contain a clear statement of the research question. Authors should describe the gap in knowledge, discuss why the question needs to be answered, and explain why qualitative or mixed-methods research is appropriate to answer it. Qualitative research questions should not be phrased as hypotheses.

Describing the study design and stating the methodological approach that guided the work is extremely helpful, especially in an inherently multidisciplinary field such as informatics. The methodological approach should be aligned with the research question.2–4 For example, phenomenological research is appropriate for developing a nuanced, sensitive understanding of the lived experience of a phenomenon and the meaning attributed to it by those who experience it. Grounded theory researchers seek to develop novel social theories emerging from data analysis, especially around social processes. Recently, grounded theory-based mixed-methods research has also been proposed.5 Ethnography is useful for describing groups and interpreting their cultures, contexts, and shared meanings; such studies may use a combination of qualitative and quantitative data. A case study approach can prove particularly useful for evaluating interventions or programs or investigating critical events; these studies may also use mixed methods. Participatory design, usability research, and user-centered design are examples of qualitative or mixed-methods approaches intended to produce new technologies adapted to the needs and capabilities of their users and stakeholders.

When using mixed methods, researchers should state their selected study design. Examples include sequential exploratory (qualitative then quantitative) to discover a new phenomenon and then determine its generalizability; sequential explanatory (quantitative then qualitative) to explain unexpected findings from quantitative work; measure development (quantitative, qualitative, then quantitative) in which qualitative methods are used to develop instruments for quantitative surveys; and parallel mixed designs such as evaluating the implementation of a technology intervention through qualitative research while simultaneously collecting quantitative usage data.6–8

The informatics literature includes many excellent examples of studies conducted from the perspective of grounded theory,9 phenomenology,10,11 user-centered design,12–15 case study,16,17 ethnography,18,19 and other qualitative and mixed-methods frameworks.

Rigor in sampling and justification of sample size

Whether researchers work directly with participants or conduct secondary analyses of existing data, they should demonstrate rigor in their sampling approach. What was the population, community, culture, or phenomenon of interest to the research question, and how did the researchers obtain a sample from it? Researchers should explain their sampling approach.3,20,21 Purposive (or purposeful) sampling describes a group of methods for recruiting nonprobability samples of individuals likely to have perspectives or experiences of interest. With purposive sampling, researchers should report how they identified the groups or perspectives for targeted recruiting (eg, whether it was on the basis of theory or empirical observation).22 If purposive sampling strategies such as quota sampling and typical case sampling were used to increase representativeness or oversample subgroups of interest, these methods should be described. In grounded theory, in which the goal is theory development, the sampling should be shown to be theoretically justified.23 For mixed-methods studies, the relationship between quantitative and qualitative samples and analyses should also be specified. Throughout, researchers should explain why their sampling method is appropriate to the research question, discuss whether sampling may be subject to biases, and discuss how they sought to address such biases.

Informatics journals have shown an increased interest in health equity, and authors are encouraged to describe any sampling approaches intended to maximize inclusion of historically marginalized and underserved populations. An informatics-relevant example is a study of recruitment strategies among men who have sex with men.21

Researchers must also provide a convincing rationale for their sample size. A few papers in the qualitative literature have suggested that small sample sizes are sufficient for interviews and focus groups with human subjects.24,25 However, these citations are better suited to the planning stage of a qualitative project (as when justifying sample sizes in a funding proposal) than to the execution and reporting stages of a study. A citation to previous research alone may not provide sufficient assurance that the sample size was adequate for a specific study. Instead, it is the responsibility of the researcher to demonstrate that the sample size was adequate to answer the research question.

One well-accepted sample size criterion is saturation. Researchers invoking saturation should explain which of the several definitions of saturation they used and how they determined that saturation was achieved.26–30 Researchers must also show how their criteria for sampling and sample size determination are harmonized with the research question and purpose. (For example, inductive thematic saturation can be achieved with small sample sizes when the population of interest is homogeneous or the research question is simple, but larger samples are usually needed when the population comprises multiple strata of interest.29)
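
As a concrete (and purely hypothetical) illustration of one saturation definition, code saturation, the following Python sketch tracks how many previously unseen codes each successive interview contributes. The interview codes and the stopping heuristic are illustrative assumptions, not a prescribed method:

# Hypothetical codes assigned to each transcript, in interview order.
coded_interviews = [
    {"cost", "privacy", "trust"},         # interview 1
    {"privacy", "workflow", "training"},  # interview 2
    {"trust", "workflow"},                # interview 3
    {"cost", "training"},                 # interview 4
]

seen = set()
for i, codes in enumerate(coded_interviews, start=1):
    new_codes = codes - seen  # codes not applied in any earlier interview
    seen |= codes
    print(f"Interview {i}: {len(new_codes)} new code(s): {sorted(new_codes)}")
# One common (and contested) heuristic treats code saturation as reached
# after several consecutive interviews contribute no new codes.

A tabulation like this can document how a saturation judgment was made, but it cannot replace analytic judgment: saturation decisions ultimately concern meaning and theory, not only counts.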

Usability researchers are encouraged to note the limitations of the Nielsen five-user and the “10 ± 2” sample size heuristics and to recognize that larger samples tend to uncover more usability problems and allow input from more diverse users.31–35
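
For context, this sample size debate often references a simple problem-discovery model: the expected share of usability problems found by n users is 1 − (1 − p)^n, where p is the per-user probability of detecting a given problem. The sketch below uses illustrative values of p (0.31 is the average often quoted alongside the five-user heuristic; the smaller values are assumptions representing rarer problems):

def share_found(n, p):
    """Expected proportion of usability problems detected by n test users,
    assuming each user independently detects a problem with probability p."""
    return 1 - (1 - p) ** n

for p in (0.31, 0.15, 0.05):
    row = ", ".join(f"{n} users: {share_found(n, p):.0%}" for n in (5, 10, 15))
    print(f"p = {p:.2f} -> {row}")

At p = 0.31, five users find roughly 84% of problems, but at p = 0.05 they find under a quarter; problems that affect only some user subgroups behave like low-p problems, which is one reason larger and more diverse samples are advisable.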

If the researchers sampled units of analysis other than the individual (eg, events, communities, organizations, or social media posts), this should also be justified according to the goals of the study. For example, in a case study, bellwether or ideal case sampling of hospitals might be justified for examining the effects of a clinical informatics intervention. Authors should draw upon any published methodological literature relevant to their sampling approach.36–40

Rigor in data collection and the relationship between researcher and participants

Researchers must describe their data collection methods (eg, interviews, focus groups, observation) and justify how their choice of data collection method was appropriate for the research question. For mixed-methods studies, the approach to integrating qualitative and quantitative data collection should be explained.4,41,42

The selection or development process for any instruments used in the study should be described. For example, for semistructured interviews, how and why were topic areas selected and how were interview questions developed? For observation, if a template was used to guide data collection, how and why were template components selected? Any pilot testing of the instruments should be discussed, including any use of mixed methods to design the instrument. Instruments developed for the study should be included as appendices.

Researchers should describe how data collection was carried out. It should be clear who conducted interviews, observations, and focus groups, whether they were conducted in person or by telephone or video conference, and how observers and interviewers were trained. When multiple persons collected data, as in team-based ethnographic research,43 researchers should describe how they ensured reliability across data collectors. Researchers who created field notes should describe their creation (eg, free text notes, observation template).44

The interpretivist nature of much qualitative research means that the relationship between the research participants and the researchers is of critical importance. In the research report, it can be helpful to provide a brief summary of the characteristics, training, and perspectives of the qualitative researchers to help readers assess the credibility of their work.45 Researchers should also be prepared to discuss how they addressed potential data collection biases.2 These include power differentials that may reduce patient candor when interviewed by physician-researchers2 and worries about employment security or professional repercussions that may make healthcare workers hesitant to answer questions about their work. They also include the known tendencies for people to provide overly positive assessments of an innovation when interviewed by its developer46 and to change their behavior when they know they are being observed.47 For focus groups, researchers should report how they addressed common limitations such as group composition issues (eg, role hierarchies that influence participant discussion) or the impact of dominant personalities.

For publication, researchers should describe how they addressed and sought to mitigate these and other potential biases, for example, through reflexivity, prolonged engagement, or persistent observation. If reflexivity practices were used to help researchers understand their own relationship to the research question and the research participants, and their evolving understanding of the data over the course of the project, they should be described.27 Readers and reviewers value descriptions of reflexivity practices, especially in explaining relationships between researchers and marginalized participants, where the perpetuation of bias is likely. In some research traditions, positionality statements are increasingly used to make reflexivity practices in research with marginalized groups more explicit.48

Rigor in data analysis methods

Informatics journals require qualitative researchers to report how they analyzed their qualitative data and provide methodological citations. Many approaches to data analysis are available.4,49,50 What they have in common is that each involves 4 stages: (1) a method for systematically identifying patterns or concepts in the data, (2) a method for reliably labeling these patterns or concepts across different transcripts, fieldnotes, or collected documents/images/artifacts, (3) a method of discovering or identifying relationships between these concepts to synthesize themes or groups of themes comprising theories, and (4) methods to verify and test developing analyses.

These 4 steps are accomplished differently by different analysis approaches. If researchers used deductive approaches such as directed content analysis (which collect and code data on the basis of an existing theoretical framework that predefines the set of applicable concepts and relationships between them), they should report the theory or framework they used.51 By contrast, researchers who use inductive approaches such as thematic analysis49 (analyses conducted in the absence of an existing theoretical framework) should explain how they followed the inductive approach of immersing themselves in the data and allowing concepts and relationships to emerge from reading, review, theorizing, and discussion. Given the large number of first-cycle and second-cycle coding approaches available, researchers should cite which was used.52 Combined inductive-deductive analysis may use components of each of these analysis approaches. For mixed-methods studies, these 4 steps will typically be followed by at least one approach to integrating qualitative and quantitative data (see excellent texts42 on the variety of approaches).

Although inductive data analysis is a component of grounded theory development, not all forms of inductive data analysis meet the definition of grounded theory. Qualitative authors are encouraged to reserve the term “grounded theory” for projects that seek to develop novel theories about social phenomena and conform to one of the main approaches to doing so.50,53–55

Researchers should describe any approaches used to improve the dependability of coding, for example, whether multiple coders worked on the transcripts, and if so, how they worked together (eg, consensus meetings or establishment of inter-rater reliability). Audit trails (eg, memoing in grounded theory) can also be used for single-authored projects.4,56
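
Where teams report inter-rater reliability, Cohen’s kappa is one widely used statistic for two coders assigning one code per excerpt. The following is a minimal sketch using hypothetical labels; in practice, teams typically rely on established qualitative analysis software or statistical packages:

from collections import Counter

# Hypothetical codes applied by two coders to the same six excerpts.
coder_a = ["barrier", "facilitator", "barrier", "workflow", "barrier", "workflow"]
coder_b = ["barrier", "facilitator", "workflow", "workflow", "barrier", "barrier"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Chance agreement expected from each coder's marginal label frequencies.
freq_a, freq_b = Counter(coder_a), Counter(coder_b)
expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement: {observed:.2f}; Cohen's kappa: {kappa:.2f}")

Note that many qualitative traditions favor consensus coding over reliability statistics; kappa is appropriate only when independent, structured coding fits the study’s methodological perspective.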

Researchers should also describe any methods for improving credibility or verifying their interpretation of the data. If they applied triangulation, they should describe which data sources, researchers, or data types were compared.57 If they applied negative case analysis58,59 to seek out and analyze data that appear to disconfirm a developing concept or theory, they should describe how they identified the negative cases and how the analysis revealed patterns that did or did not hold true. If they exposed their interpretation to critique and reinterpretation by participants in the community being studied, the methods for doing so should be described (eg, updating the interview guide to include emerging themes to be discussed with new participants). Alternatively, if they conducted formal, terminal member checking by inviting participant discussion and review of findings, they should describe the method, the feedback provided, and how it was addressed in the analysis.60–62

The informatics literature contains many good examples of studies that apply inductive analysis,9 deductive analysis,63 mixed inductive–deductive analysis,64 and mixed qualitative–quantitative methods.65

Rigor in reporting results

The results section should contain a description of the sociodemographic and other relevant characteristics of any human subjects in the sample.

When themes are reported as part of the results, researchers should recognize that a theme cannot be sufficiently described in a sentence or phrase. Themes must be supported with rich examples of actual extracts, quotes, or images gathered during data collection. Quotes not only give life and interest to the research report but also serve the critical function of connecting the source data to the researchers’ interpretation. In addition to the quotes, researchers should provide synthesis by explaining themes and categories, and the depth and range of findings represented by those categories. Labeling quotes and extracts with study-specific labels such as “Participant 1” is one helpful approach to demonstrate that representative quotes are drawn from the entire sample of participants.

The word limits of biomedical research journals can make it challenging to integrate rich, descriptive source quotations into the text of a results section. While quotes integrated into the results text are a powerful way to build a high-quality results description, additional tools such as thoughtfully integrated tables of quotations, boxes for longer-form quotations, and summary visualizations such as timelines and network diagrams can also prove helpful. In mixed-methods studies, joint displays of qualitative and quantitative data can also help show linkages between data.66

Rigor in discussion and conclusions

Qualitative researchers are invited to discuss the potential transferability of their findings to other settings or populations.26 To help readers determine whether the findings might be transferable to other settings and populations, authors should provide detail about the context and setting of the research and about their assumptions about the domain of interest.

As with any publication, qualitative reports should also include a limitations section. Common limitations in qualitative work may include known and unavoidable lack of representation of certain participant perspectives, unavoidable researcher biases, or issues with the transferability of findings. Approaches used to address limitations, such as the use of bracketing to address potential researcher biases, should be discussed in the limitations section.

CONCLUSION

Qualitative research is integral to health and biomedical informatics. High-quality qualitative research is conducted by many informatics researchers whose original backgrounds are in quantitative methods. Examples of excellent qualitative and social science research honored with AMIA’s Diana Forsythe Award are available from AMIA (https://www.amia.org/amia-awards/working-group-awards). With this summary of publication expectations, we hope to encourage the submission of high-quality qualitative research that advances the field of informatics. Researchers might find it helpful to refer to some of the formal reporting checklists to learn more about reporting expectations in qualitative research, even when publishing in journals that do not require them.45,67 However, in recognition of restrictive manuscript length limits, the present guidelines are intended to be more selective than prior formal reporting checklists. They are also crafted to emphasize unique issues arising in health informatics, such as the frequent use of usability testing methods.

Addressing all the elements described here is likely to make the qualitative manuscript very long. Authors are encouraged to consider writing an online-only methodological appendix that would allow them to describe their approach in detail without violating length limits. Using tables, boxes, and figures to illustrate methods and results can also help authors stay within page limits.

We hope that this summary will support more complete reporting of research studies, but we recognize that it is not a substitute for training in qualitative research—although this summary can be an instructional tool within broader training programs. We encourage interested researchers to consult the references shared below and the many excellent texts, courses, and mentors available in informatics and beyond. Following the literature to stay abreast of qualitative research can also alert researchers to innovations in theory and methods that can be applied to continue advancing our understanding of the patients and professionals who use informatics innovations, the contexts in which they live and work, and their beliefs, perspectives, and life experiences.

FUNDING

This work received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

AUTHOR CONTRIBUTIONS

JSA, NCB, MR, KMU, and TV made substantial contributions to the conception and design of this work; drafted the work and revised it critically for important intellectual content; and gave final approval of the version to be published. All agree to be accountable for all aspects of the work.

CONFLICT OF INTEREST STATEMENT

None declared.

DATA AVAILABILITY

No data were generated in the course of this study.

REFERENCES

  1. Hussain MI, Figueiredo MC, Tran BD, et al. A scoping review of qualitative research in JAMIA: past contributions and opportunities for future work. J Am Med Inform Assoc 2021; 28 (2): 402–13.
  2. Korstjens I, Moser A. Series: practical guidance to qualitative research. Part 2: context, research questions and designs. Eur J Gen Pract 2017; 23 (1): 274–9.
  3. Moser A, Korstjens I. Series: practical guidance to qualitative research. Part 3: sampling, data collection and analysis. Eur J Gen Pract 2018; 24 (1): 9–18.
  4. Creswell JW, Poth CN. Qualitative Inquiry and Research Design. 4th ed. Thousand Oaks, CA: Sage Publications, Inc.; 2018.
  5. Creamer EG. Advancing Grounded Theory with Mixed Methods. United Kingdom: Routledge; 2021.
  6. Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. Thousand Oaks, CA: Sage Publications; 2018.
  7. Watkins D, Gioia D. Mixed Methods Research. New York, NY: Oxford University Press; 2015.
  8. Curry L, Nunez-Smith M. Mixed Methods in Health Sciences Research: A Practical Primer. Thousand Oaks, CA: Sage; 2014.
  9. Winkelman WJ, Leonard KJ, Rossos PG. Patient-perceived usefulness of online electronic medical records: employing grounded theory in the development of information and communication technologies for use by patients living with chronic illness. J Am Med Inform Assoc 2005; 12 (3): 306–14.
  10. Ancker JS, Witteman HO, Hafeez B, Provencher T, Van de Graaf M, Wei E. The invisible work of personal health information management among people with multiple chronic conditions: qualitative interview study among patients and providers. J Med Internet Res 2015; 17 (6): e137.
  11. Ancker JS, Witteman HO, Hafeez B, Provencher T, Van de Graaf M, Wei E. “You get reminded you're a sick person”: personal data tracking and patients with multiple chronic conditions. J Med Internet Res 2015; 17 (8): e202.
  12. Willis MA, Hein LB, Hu Z, et al. Feeling better on hemodialysis: user-centered design requirements for promoting patient involvement in the prevention of treatment complications. J Am Med Inform Assoc 2021; 28 (8): 1612–31.
  13. Ancker JS, Stabile C, Carter J, et al. Informing, reassuring, or alarming? Balancing patient needs in the development of a postsurgical symptom reporting system in cancer. AMIA Annu Symp Proc 2018; 2018: 166–74.
  14. Wang X, Kim TC, Hegde S, et al. Design and evaluation of an integrated, patient-focused electronic health record display for emergency medicine. Appl Clin Inform 2019; 10 (4): 693–706.
  15. Benda NC, Alexopoulos GS, Marino P, Sirey JA, Kiosses D, Ancker JS. The age limit does not exist: a pilot usability assessment of a SMS-messaging and smartwatch-based intervention for older adults with depression. AMIA Annu Symp Proc 2021; 2020: 213–22.
  16. Cresswell KM, Bates DW, Williams R, et al. Evaluation of medium-term consequences of implementing commercial computerized physician order entry and clinical decision support prescribing systems in two ‘early adopter’ hospitals. J Am Med Inform Assoc 2014; 21 (e2): e194–202.
  17. Abraham J, Reddy MC. Challenges to inter-departmental coordination of patient transfers: a workflow perspective. Int J Med Inform 2010; 79 (2): 112–22.
  18. Novak L, Brooks J, Gadd C, Anders S, Lorenzi N. Mediating the intersections of organizational routines during the introduction of a health IT system. Eur J Inf Syst 2012; 21 (5). doi: 10.1057/ejis.2012.2.
  19. Greenhalgh T, Wherton J, Shaw S, Papoutsi C, Vijayaraghavan S, Stones R. Infrastructure revisited: an ethnographic case study of how health information infrastructure shapes and constrains technological innovation. J Med Internet Res 2019; 21 (12): e16093.
  20. Unertl KM, Schaefbauer CL, Campbell TR, et al. Integrating community-based participatory research and informatics approaches to improve the engagement and health of underserved populations. J Am Med Inform Assoc 2016; 23 (1): 60–73.
  21. Iott BE, Veinot TC, Loveluck J, Kahle E, Golson L, Benton A. Comparative analysis of recruitment strategies in a study of men who have sex with men (MSM) in Metropolitan Detroit. AIDS Behav 2018; 22 (7): 2296–311.
  22. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Adm Policy Ment Health 2015; 42 (5): 533–44.
  23. Draucker CB, Martsolf DS, Ross R, Rusk TB. Theoretical sampling and category development in grounded theory. Qual Health Res 2007; 17 (8): 1137–48.
  24. Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods 2006; 18 (1): 59–82.
  25. Guest G, Namey E, McKenna K. How many focus groups are enough? Building an evidence base for nonprobability sample sizes. Field Methods 2017; 29 (1): 3–22.
  26. Korstjens I, Moser A. Series: practical guidance to qualitative research. Part 4: trustworthiness and publishing. Eur J Gen Pract 2018; 24 (1): 120–4.
  27. Miller WL, Crabtree BF. Qualitative analysis: how to begin making sense. Fam Pract Res J 1994; 14 (3): 289–97.
  28. Morse J. The significance of saturation. Qual Health Res 1995; 5 (2): 147–9.
  29. Hennink MM, Kaiser BN, Weber MB. What influences saturation? Estimating sample sizes in focus group research. Qual Health Res 2019; 29 (10): 1483–96.
  30. Saunders B, Sim J, Kingstone T, et al. Saturation in qualitative research: exploring its conceptualization and operationalization. Qual Quant 2018; 52 (4): 1893–907.
  31. Nielsen J. Usability Engineering. Boston, MA: Academic Press; 1993.
  32. Nielsen J. The Usability Engineering Lifecycle. San Diego, CA: Academic Press; 1993: 71–114.
  33. Faulkner L. Beyond the five-user assumption: benefits of increased sample sizes in usability testing. Behav Res Methods Instrum Comput 2003; 35 (3): 379–83.
  34. Hwang W, Salvendy G. Number of people required for usability evaluation: the 10±2 rule. Commun ACM 2010; 53 (5): 130–3.
  35. Schmettow M. Sample size in usability studies. Commun ACM 2012; 55 (4): 64–70.
  36. Schensul JJ, LeCompte MD. Chapter 10: sampling in ethnographic research. In: Schensul JJ, LeCompte MD, eds. Essential Ethnographic Methods: A Mixed Methods Approach. Walnut Creek, CA: Altamira; 2013: 280–318.
  37. Kozinets RV. Chapter 4: netnography. In: Doing Ethnographic Research Online. Thousand Oaks, CA: Sage; 2010: 58–73.
  38. Morse JM. Sampling in grounded theory. In: Bryant A, Charmaz K, eds. The Sage Handbook of Grounded Theory. Thousand Oaks, CA: Sage; 2007: 229–44.
  39. Honigmann JJ. Chapter 12: sampling in ethnographic fieldwork. In: Burgess R, ed. Field Research: A Sourcebook and Manual. New York, NY: Routledge; 1982: 121–39.
  40. Yin RK. Chapter 2: designing case studies. In: Case Study Research: Design and Methods. Thousand Oaks, CA: Sage; 2014: 27–70.
  41. Fetters MD, Curry LA, Creswell JW. Achieving integration in mixed methods designs—principles and practices. Health Serv Res 2013; 48 (6 Pt 2): 2134–56.
  42. Tashakkori A, Teddlie C. Sage Handbook of Mixed Methods in Social and Behavioral Research. Thousand Oaks, CA: Sage; 2016.
  43. Jarzabkowski P, Bednarek R, Cabantous L. Conducting global team-based ethnography: methodological challenges and practical methods. Hum Relat 2015; 68 (1): 3–33.
  44. Emerson RM, Fretz RI, Shaw LL. Writing Ethnographic Fieldnotes. Chicago, IL: University of Chicago Press; 2011.
  45. O'Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med 2014; 89 (9): 1245–51.
  46. Dell N, Vaidyanathan V, Medhi I, Cutrell E, Thies W. “Yours is better!”: participant response bias in HCI. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. New York, NY: Association for Computing Machinery; 2012: 1321–30.
  47. McCambridge J, Witton J, Elbourne DR. Systematic review of the Hawthorne effect: new concepts are needed to study research participation effects. J Clin Epidemiol 2014; 67 (3): 267–77.
  48. Liang CA, Munson SA, Kientz JA. Embracing four tensions in human-computer interaction research with marginalized people. ACM Trans Comput-Hum Interact 2021; 28 (2): Article 14.
  49. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006; 3 (2): 77–101.
  50. Strauss A, Corbin J. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 2nd ed. Thousand Oaks, CA: Sage; 1998.
  51. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res 2005; 15 (9): 1277–88.
  52. Saldaña J. The Coding Manual for Qualitative Researchers. Thousand Oaks, CA: Sage Publications; 2021.
  53. Charmaz K. Constructing Grounded Theory. Thousand Oaks, CA: Sage; 2014.
  54. Glaser B, Strauss A. Discovery of Grounded Theory: Strategies for Qualitative Research. New York, NY: Routledge; 2017.
  55. Strauss A, Corbin J. Grounded Theory in Practice. Thousand Oaks, CA: Sage; 1997.
  56. Birks M. Chapter 3: quality processes in grounded theory research. In: Birks M, Mills J, eds. Grounded Theory: A Practical Guide. Thousand Oaks, CA: Sage; 2011.
  57. Flick U. Doing Triangulation and Mixed Methods. The Sage Qualitative Research Kit. Thousand Oaks, CA: Sage; 2018.
  58. Miles M, Huberman M. Qualitative Data Analysis: An Expanded Sourcebook. Thousand Oaks, CA: Sage Publications, Inc.; 1994.
  59. Miles MB, Huberman AM, Saldaña J. Qualitative Data Analysis: A Methods Sourcebook. Thousand Oaks, CA: Sage Publications; 2018.
  60. Morse JM, Barrett M, Mayan M, Olson K, Spiers J. Verification strategies for establishing reliability and validity in qualitative research. Int J Qual Methods 2002; 1 (2): 13–9.
  61. Brear M. Process and outcomes of a recursive, dialogic member checking approach: a project ethnography. Qual Health Res 2019; 29 (7): 944–57.
  62. Birt L, Scott S, Cavers D, Campbell C, Walter F. Member checking: a tool to enhance trustworthiness or merely a nod to validation? Qual Health Res 2016; 26 (13): 1802–11.
  63. Veinot TC, Campbell TR, Kruger DJ, Grodzinski A. A question of trust: user-centered design requirements for an informatics intervention to promote the sexual health of African-American youth. J Am Med Inform Assoc 2013; 20 (4): 758–65.
  64. Ancker JS, Miller MC, Patel VN, Kaushal R; HITEC Investigators. Sociotechnical challenges to developing technologies for patient access to health information exchange data. J Am Med Inform Assoc 2014; 21 (4): 664–70.
  65. Veinot TC, Meadowbrooke CC, Loveluck J, Hickok A, Bauermeister JA. How “community” matters for how people interact with information: mixed methods study of young men who have sex with other men. J Med Internet Res 2013; 15 (2): e33.
  66. Fetters MD, Guetterman TC. Development of a joint display. In: Onwuegbuzie AJ, Johnson RB, eds. The Routledge Reviewer's Guide to Mixed Method Analysis. Abingdon, United Kingdom: Routledge; 2020: 259.
  67. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007; 19 (6): 349–57.


