Journal of Medical Internet Research
. 2024 Nov 28;26:e62761. doi: 10.2196/62761

Exploring Participants’ Experiences of Digital Health Interventions With Qualitative Methods: Guidance for Researchers

Kristin Harrison Ginsberg, Katie Babbott, Anna Serlachius
Editor: Taiane de Azevedo Cardoso
Reviewed by: Deepak Singh, Yongjie Sha, Lina Weinert
PMCID: PMC11638693  PMID: 39607999

Abstract

Digital health interventions have gained prominence in recent years, offering innovative solutions to improve health care delivery and patient outcomes. Researchers are increasingly using qualitative approaches to explore patient experiences of using digital health interventions. Yet, the qualitative methods used in these studies can vary widely, and some methods are frequently misapplied. We highlight the methods we find most fit for purpose to explore user experiences of digital tools and propose 5 questions for researchers to use to help them select a qualitative method that best suits their research aims.

Keywords: qualitative methods, content analysis, thematic analysis, digital health evaluation, user engagement, user experience, digital health intervention, innovation, patient experience, health care, researcher, technology, mobile health, mHealth, telemedicine, digital health, behavior change, usability, tutorial, research methods, qualitative research, study design

Introduction

Digital health interventions encompass a wide range of technologies, such as mobile apps, websites, wearable devices, and telemedicine platforms. Their use in behavior change and mental health interventions is common; such interventions are increasingly used both as an adjunct to face-to-face care and as standalone interventions [1-4]. Although digital health interventions have ostensible benefits in terms of scalability and the potential to improve equity of access, their high dropout rates, which can reach up to 80% [5,6], raise questions of acceptability and usability. Understanding participants’ experiences of these interventions is therefore crucial for developing effective digital tools, improving and tailoring existing digital tools, and optimizing health outcomes.

Qualitative research exploring user experience of digital health tools has surged alongside the exponential increase in digital health interventions. Researchers frequently use qualitative approaches to explore engagement, usability, and uptake of digital health interventions. In this type of “applied research,” researchers often start with predefined research questions, such as how well an intervention works and under what circumstances [7]. Yet, despite these focused research questions, the qualitative methods (ie, the qualitative approaches used) and methodology (ie, the theoretical rationale and perspective that guide the research) used in these studies can vary widely and be misapplied. For example, a systematic review of 16 studies that used qualitative methods to assess user experiences of digital interventions for pediatric patients [8] found an eclectic range of data analysis methods used, including thematic analysis (37%), content analysis (31%), hermeneutic research analysis (6%), and the generically described “deductive” analysis (6%). Nearly 20% of the articles did not describe their qualitative analysis methodology (ie, their theoretically informed approach). These vague, incomplete, or absent descriptions of qualitative methodology are common across the health intervention literature [9].

As health psychology researchers who have conducted a range of studies exploring patient experiences of digital health interventions [10,11], we understand the challenge of selecting a qualitative approach for this type of research, as there are no existing guidelines to inform the selection of the right qualitative method (ie, the approach that is used) and/or methodology (ie, the overarching philosophical framework or lens). Although several guidelines exist for improving the reporting of qualitative research [12-15], they generally do not offer guidance on selecting appropriate qualitative approaches. Qualitative approaches are also more flexible and interpretative than quantitative methods, which can add to the confusion around selection. Therefore, in this article, we summarize the most common qualitative approaches used in digital health research when exploring user engagement and make recommendations based on our experiences. We also provide 5 questions and a decision tree that digital health intervention researchers can use to select a qualitative approach that best meets their research goals.

First Things First: Grounded in Theory, or Not?

As qualitative research has grown in popularity in behavioral research, so too has confusion over the range of approaches available and the theoretical or methodological frameworks involved. For example, frequent misapplication of thematic analysis, one of the most often used qualitative methods, has led Braun and Clarke [16], authors of the seminal paper on the method, to publish 4 recent papers clarifying its methodology and “flexible” theoretical approach [17-20].

Starting first with theory and its underlying epistemology (ie, underlying philosophy) can help you choose the qualitative approach that best fits your research question. Qualitative researchers exploring participant experiences in health-related research often take either a constructionist or realist approach. A constructionist approach, also referred to as interpretivist, critical, or “artfully interpretive” [21], assumes that reality is socially constructed and shaped through interactions, language, and meaning-making processes. It emphasizes that individuals and groups actively create their own realities through their interpretations and interactions with the world. Social constructionist qualitative approaches in health- and mental health–related research often analyze patterns in language, discourse, and narratives to explore the underlying meaning and the social constructs of reality [22].

In contrast, a realist approach, or a “scientifically descriptive” [21] approach, posits that there is a degree of objectivity that exists independently of human perception. It suggests that social phenomena have inherent structures and properties that exist regardless of human interpretation. Realist approaches are often used in mixed methods research and tend to analyze qualitative data with a focus on identifying a “consistency of meaning,” often through triangulation and the use of several co-coders [23].

Researchers also need to consider whether they are taking a deductive (top-down), inductive (data-driven), or abductive (combination of both) approach in their analysis and whether they will focus on semantic (explicit) or latent (implicit) meanings in the data. It is increasingly acknowledged that the coding and analysis process is rarely completely inductive or deductive [24] and is usually a combination of both [25,26]. Even if you choose a mainly deductive approach, coding with a framework in mind or with predetermined codes, unexpected inductive (data-driven) codes may still become apparent during the coding process. This “abductive” approach allows a more nuanced and contextual story to be told. For example, even if you are coding participants’ responses using a digital usability framework, you can also code inductively when participants raise important concepts, experiences, and thoughts that relate to the overall research question and aims.

Untangling Qualitative Methodologies

Overview

Some qualitative approaches are inherently linked to their broader methodological frameworks (eg, narrative analysis, interpretative phenomenological analysis, and grounded theory), while others are more flexible in terms of belonging to a certain methodological framework (eg, thematic analysis [16] and content analysis [27]). Thematic analysis, qualitative content analysis, grounded theory, and interpretative phenomenological analysis are some of the many forms of pattern-seeking qualitative approaches. In our opinion, grounded theory and interpretative phenomenological analysis, which are highly interpretive and explorative, are less likely to fit the needs of most digital health research exploring participants’ usability of and engagement with digital tools, so we do not discuss them here. (Good guidance on these methods can be found via Strauss and Corbin [28] and McLeod [29].) Instead, we highlight the 2 most commonly used approaches in digital health research exploring participant experience and user engagement.

Qualitative Content Analysis

Qualitative content analysis, including conventional, directed, and summative approaches [30], uses systematic methods to analyze patterns in text and explore meaning. Sometimes considered an “intermediary” approach between qualitative and quantitative methods [31], some types of content analysis, particularly summative, borrow methods from quantitative research, such as counting the frequency of words or phrases [30]. Researchers using content analysis may take a deductive, inductive, or abductive (combined) approach and often use it deductively given its utility for exploring predefined categories and/or incorporating an existing behavioral or theoretical framework [27,29]. Conventional content analysis can also be used more interpretively and shares some similarities with certain types of thematic analysis (such as the framework method, discussed below) [32].
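To make the quantitative flavor of summative content analysis concrete, the minimal sketch below counts how often predefined terms appear across free-text participant responses. The responses, terms, and function name are invented for illustration; a real analysis would also examine the context in which each term is used.

```python
from collections import Counter
import re

def term_frequencies(responses, terms):
    """Count occurrences of each predefined term across responses
    (a simple summative-style frequency count; terms are illustrative)."""
    counts = Counter({term: 0 for term in terms})
    for text in responses:
        tokens = re.findall(r"[a-z']+", text.lower())
        for term in terms:
            counts[term] += tokens.count(term)
    return dict(counts)

# Hypothetical open-ended survey responses about a health app
responses = [
    "The app was easy to use but the reminders felt intrusive.",
    "Reminders helped me, and the app was easy to navigate.",
]
print(term_frequencies(responses, ["easy", "reminders", "intrusive"]))
# → {'easy': 2, 'reminders': 2, 'intrusive': 1}
```

In practice such counts are a starting point for interpretation, not an end product: a summative analysis would go on to explore the meaning behind the frequencies.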

We find qualitative content analysis particularly useful in digital health intervention evaluations that build on existing frameworks or previous research (as most of our work does). We have also found directed content analysis to be an efficient and effective method to analyze qualitative data within mixed methods randomized controlled trials of digital health interventions (such as in Brenton-Peters et al [33] and Serlachius et al [34]).

Thematic Analysis

Similar in some aspects to content analysis, thematic analysis is a “family” of related methods [19] that involve labeling (“coding”) data and then organizing them into themes (patterns of meaning). Reflexive thematic analysis, which directly addresses the researcher’s role and process in the analysis, is Braun and Clarke’s [16] updated approach and seeks in part to differentiate itself from older styles of thematic analysis, such as codebook and framework analysis, which take a more structured, and often combined inductive/deductive, approach.

Due in part to Braun and Clarke’s [16] detailed, step-by-step description of thematic analysis published in 2006, this approach has become widely used. They have written at length on how this approach is frequently misunderstood and misapplied, as demonstrated by a recent study where they reviewed 20 health-related studies that used thematic analysis and found the most common problem was in the creation of themes [20]. In reflexive thematic analysis, a theme should highlight a broader meaning across the dataset, telling an interpretive story. However, many studies confuse themes with “topic summaries,” such as “helpfulness of the intervention” or “ease of use.” Braun and Clarke [20] note that if the theme could have been created before conducting the data analysis (ie, it could be mapped directly to an interview question), then it is a topic summary.

We feel that if your research question and aims align best with topic summaries and deductive analysis, qualitative content analysis or codebook or framework thematic analysis may be a better fit than reflexive thematic analysis or other more interpretative approaches.

Selecting a Qualitative Approach: 5 Questions to Consider

To use qualitative approaches more effectively in digital health user engagement research, we believe there needs to be more coherence and consistency in choosing qualitative methods. Therefore, we suggest researchers use the following 5 questions to help select the qualitative approach most appropriate for their research aims (Figure 1 shows a decision tree).

Figure 1. Questions to help select a qualitative method in digital health research evaluation.

Question 1: What Is Your Theoretical/Methodological Position: Are You a Realist or an Interpreter?

Most digital health intervention evaluation studies have pragmatic goals grounded in a realist (ie, “scientifically oriented”) view, building on previous research findings, for example, as part of mixed methods research. If your study fits within this orientation, strongly consider using a method like content analysis or thematic framework analysis, which can flexibly accommodate existing frameworks and build on previous quantitative findings.

More inductively aligned methods of thematic analysis, and other more interpretative approaches such as narrative analysis or grounded theory, can be excellent choices at the development stage of a digital health intervention, when there is more emphasis on exploring in-depth patient experiences. However, they are frequently less aligned with the aims of most digital health intervention evaluation research, which explores questions regarding user experience and user engagement through a realist epistemology. There are exceptions; for example, a study by Knox et al [35] explored stakeholder views on virtual pulmonary rehabilitation using a critical epistemology, and a paper by Bleyel et al [36] explored patient perceptions of mental health care through video consultations from a critical realist perspective. These examples demonstrate the importance of articulating your research aims, methods, and methodological perspective to justify your chosen approach, which is not always easily achievable within the tight word-count limits of medical journals.

Also carefully consider whether you want to take a deductive, inductive, or abductive approach in your analysis. While you can use any of these approaches across the different types of content and thematic analysis methods [27,37], you should select an analysis method that works within your research aims and state this orientation in your reported methodology.

Question 2: Are You Using an Existing Health or Behavioral Change Framework?

Many digital health interventions are built on existing intervention or behavior change frameworks, such as the capability, opportunity, motivation-behavior (COM-B) model [38] and the reach, effectiveness, adoption, implementation, and maintenance (RE-AIM) framework [39]. If you want to use an existing framework in your qualitative analysis, as many digital intervention studies do, choose an approach designed for this, such as directed content analysis or thematic framework analysis. Do not choose a highly interpretive approach, such as reflexive thematic analysis or interpretative phenomenological analysis, if you intend to map your qualitative findings onto an existing framework. A good example by Szinay and colleagues [40] used the framework method to explore engagement with well-being apps informed by the COM-B model. For further details regarding the framework method, refer to Gale and colleagues [32], who provide a comprehensive outline of the 7 stages of analysis using this method in health research.

Question 3: What Type of Research Question Are You Trying to Answer?

Qualitative research is well suited to answer “how” and “why” questions, such as why individuals did (or did not) use a digital health intervention and how they went about it. But different research questions may be better suited to one type of qualitative analysis than another.

For example, let us suppose your digital health intervention for weight loss had a high dropout rate, and you want to use qualitative methods to explore the reasons why. The most suitable qualitative approach depends on your underlying rationale and research strategy (including your methodological/theoretical position) and what you plan to do in response to your findings (ie, changes or iterations to the intervention or exploring a different approach altogether). If you want to explore topics such as participants’ beliefs about weight loss and/or perceived barriers to weight loss, narrative analysis or another more inductive approach may be more appropriate. However, if, as is more common in digital health intervention evaluation research, you have specific questions about the design, content, and activities within the intervention itself that caused participants to stop using it, then a more focused and deductive or abductive qualitative approach (eg, content analysis) may be more useful.

Question 4: How Are You Collecting Data?

According to a 2019 systematic review, the qualitative data collection tools most frequently used to evaluate and test digital health interventions and explore users’ perspectives include “think-aloud” protocols, focus groups, and interviews [41]. We often also use open-ended online survey questions in our mixed methods evaluation studies. It is important to consider the type of data you are collecting alongside your analysis plan.

Stemming from cognitive psychology, the think-aloud method asks participants to share what they are thinking while performing a task [42], which can lead to a focused data set. Because of this, the think-aloud method is popular in usability research and has been found to successfully generate intervention improvements [43]. Think-aloud data are most commonly analyzed using a deductive approach and with methods similar to content analysis, such as counting positive or negative sentiment [44], but these data can also be coded and organized into topics or themes [12].
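As an illustration of this kind of deductive, count-based handling of think-aloud data, the sketch below tallies sentiment codes that a researcher has already assigned to utterances. The utterances, codes, and function name are invented for illustration; assigning the codes themselves remains an interpretive step.

```python
def tally_sentiment(coded_utterances):
    """Tally deductively assigned sentiment codes from a think-aloud
    transcript (codes assumed to be 'positive', 'negative', or 'neutral')."""
    tally = {"positive": 0, "negative": 0, "neutral": 0}
    for utterance, code in coded_utterances:
        tally[code] += 1
    return tally

# Hypothetical (utterance, code) pairs from a usability session
coded = [
    ("I like how big the buttons are", "positive"),
    ("I can't find the back button", "negative"),
    ("Now I'm on the home screen", "neutral"),
    ("This loading screen is slow", "negative"),
]
print(tally_sentiment(coded))
# → {'positive': 1, 'negative': 2, 'neutral': 1}
```

Counts like these can then be linked back to specific interface elements to prioritize intervention improvements.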

Focus groups and interviews can generate large datasets that may cover a broad range of topics and experiences. While exploratory questions may be useful in digital health intervention development research, a more focused approach is usually necessary to meet the goals of evaluation research or exploring whether a digital tool met participants’ needs. Therefore, it is critical to develop an interview guide that considers your research questions, theoretical framework, and desired result type (ie, themes or frequencies). Otherwise, you may end up with a wide-ranging and meandering dataset that does not answer your research questions and leads to what Braun and Clarke [20] call a “mish-mash” of ineffective qualitative analysis.

Question 5: How Much Time Do You Have?

Digital health interventions often require rapid evolution to stay relevant, and researchers may have limited time, resources, and funding [7]. Therefore, it is important to consider how quickly you need to generate your results, as well as the resources needed to complete the analysis.

Of the approaches discussed here, reflexive thematic analysis, interpretive phenomenological analysis, grounded theory, and narrative analysis often require significant time to conduct: multiple rounds of detailed, in-depth coding and thematic review, plus ongoing exploration of reflexivity and how it influences the process and outcomes of the analysis. Content analysis or framework thematic analysis can be done in a more efficient and timely manner, with codebooks often developed in advance of analysis and the frequent use of co-coders with a focus on accuracy.

Increasingly, so-called rapid approaches have been developed and compared against more traditional qualitative approaches due to this need to reduce time and improve efficiency in health care research [45,46]. For example, Holdsworth et al [47] used a rapid form of framework analysis to summarize evaluation data from intensive care unit site visits while still on site. They found this approach delivered significant savings in analysis and transcription costs.
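The guidance in questions 1-5 can be condensed into a rough selection sketch. This is a deliberate simplification for illustration only, not a reproduction of the decision tree in Figure 1: the function name and boolean inputs are our invention, and real method selection always involves researcher judgment.

```python
def suggest_approach(realist, uses_framework, topic_summaries_suffice, time_limited):
    """Rough sketch of the selection logic in questions 1-5
    (illustrative only; real selection involves researcher judgment)."""
    if not realist:
        # Interpretivist/constructionist aims point toward interpretive methods
        return "consider reflexive thematic, narrative, or other interpretive analysis"
    if uses_framework:
        # Mapping findings onto an existing framework (eg, COM-B, RE-AIM)
        return "directed content analysis or framework thematic analysis"
    if topic_summaries_suffice or time_limited:
        # Focused, deductive aims or tight timelines favor structured methods
        return "qualitative content analysis or codebook thematic analysis"
    # Realist but open-ended aims with time for iterative coding
    return "reflexive thematic analysis (allow time for iterative coding)"

print(suggest_approach(realist=True, uses_framework=True,
                       topic_summaries_suffice=False, time_limited=True))
# → directed content analysis or framework thematic analysis
```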

Conclusion

After more than a decade of conducting qualitative research exploring patient experiences of and user engagement with digital health tools, we too have grappled with choosing appropriate qualitative methods. In this article, we draw on our experience, which has been largely in health psychology and digital health–related research, but acknowledge that similar qualitative design challenges exist in health informatics, health services research, and the broader usability research literature [14,15]. Ultimately, we believe that many of these ongoing interdisciplinary challenges and discussions enhance overall understanding of the benefits of qualitative and mixed methods research and lead to improvements not just in the conducting and reporting of qualitative research but also in the ongoing development and clarification of the methods themselves. This paper did not have the scope to explore questions regarding what constitutes “good” or rigorous qualitative research in digital health, but there are several articles providing useful summaries of key criteria in digital health research, health services research, and digital health assessment [48,49].

Digital health interventions offer innovative solutions to improve health care delivery and patient outcomes. Researchers now frequently use qualitative approaches to explore engagement, usability, and uptake of digital health interventions. However, some of the most popular qualitative approaches are frequently misapplied in this type of research and may not be the most appropriate choices for it. Therefore, we suggest researchers consider their research aims, theoretical orientation, and the time and resources available before selecting a qualitative approach to use when exploring participants’ experiences of digital tools.

Abbreviations

COM-B

capability, opportunity, motivation-behavior

RE-AIM

reach, effectiveness, adoption, implementation, and maintenance

Footnotes

Conflicts of Interest: None declared.

References

  1. Li J, Theng Y, Foo S. Game-based digital interventions for depression therapy: a systematic review and meta-analysis. Cyberpsychol Behav Soc Netw. 2014;17(8):519-27. doi: 10.1089/cyber.2013.0481.
  2. McLean G, Band R, Saunderson K, Hanlon P, Murray E, Little P, McManus RJ, Yardley L, Mair FS, DIPSS co-investigators. Digital interventions to promote self-management in adults with hypertension: systematic review and meta-analysis. J Hypertens. 2016;34(4):600-12. doi: 10.1097/HJH.0000000000000859.
  3. Philippe TJ, Sikder N, Jackson A, Koblanski ME, Liow E, Pilarinos A, Vasarhelyi K. Digital health interventions for delivery of mental health care: systematic and comprehensive meta-review. JMIR Ment Health. 2022;9(5):e35159. doi: 10.2196/35159.
  4. Rose T, Barker M, Maria Jacob C, Morrison L, Lawrence W, Strömmer S, Vogel C, Woods-Townsend K, Farrell D, Inskip H, Baird J. A systematic review of digital interventions for improving the diet and physical activity behaviors of adolescents. J Adolesc Health. 2017;61(6):669-677. doi: 10.1016/j.jadohealth.2017.05.024.
  5. Melville KM, Casey LM, Kavanagh DJ. Dropout from Internet-based treatment for psychological disorders. Br J Clin Psychol. 2010;49(Pt 4):455-71. doi: 10.1348/014466509X472138.
  6. Pedersen DH, Mansourvar M, Sortsø C, Schmidt T. Predicting dropouts from an electronic health platform for lifestyle interventions: analysis of methods and predictors. J Med Internet Res. 2019;21(9):e13617. doi: 10.2196/13617.
  7. Murray E, Hekler EB, Andersson G, Collins LM, Doherty A, Hollis C, Rivera DE, West R, Wyatt JC. Evaluating digital health interventions: key questions and approaches. Am J Prev Med. 2016;51(5):843-851. doi: 10.1016/j.amepre.2016.06.008.
  8. Cheng L, Liu F, Mao X, Peng W, Wang Y, Huang H, Duan M, Wang Y, Yuan C. The pediatric cancer survivors' user experiences with digital health interventions: a systematic review of qualitative data. Cancer Nurs. 2022;45(1):E68-E82. doi: 10.1097/NCC.0000000000000885.
  9. O'Cathain A, Hoddinott P, Lewin S, Thomas KJ, Young B, Adamson J, Jansen YJ, Mills N, Moore G, Donovan JL. Maximising the impact of qualitative research in feasibility studies for randomised controlled trials: guidance for researchers. Pilot Feasibility Stud. 2015;1(1):32. doi: 10.1186/s40814-015-0026-y.
  10. Garner K, Thabrew H, Lim D, Hofman P, Jefferies C, Serlachius A. Exploring the usability and acceptability of a well-being app for adolescents living with type 1 diabetes: qualitative study. JMIR Pediatr Parent. 2023;6:e52364. doi: 10.2196/52364.
  11. Wallace-Boyd K, Boggiss AL, Ellett S, Booth R, Slykerman R, Serlachius AS. ACT2COPE: a pilot randomised trial of a brief online acceptance and commitment therapy intervention for people living with chronic health conditions during the COVID-19 pandemic. Cogent Psychology. 2023;10(1):2208916. doi: 10.1080/23311908.2023.2208916.
  12. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19(6):349-57. doi: 10.1093/intqhc/mzm042.
  13. Ancker J, Benda N, Reddy M, Unertl K, Veinot T. Guidance for publishing qualitative research in informatics. J Am Med Inform Assoc. 2021;28(12):2743-2748. doi: 10.1093/jamia/ocab195.
  14. Mann DM, Chokshi SK, Kushniruk A. Bridging the gap between academic research and pragmatic needs in usability: a hybrid approach to usability evaluation of health care information systems. JMIR Hum Factors. 2018;5(4):e10721. doi: 10.2196/10721.
  15. Murphy E, Dingwall R, Greatbatch D, Parker S, Watson P. Qualitative research methods in health technology assessment: a review of the literature. Health Technol Assess. 1998;2(16):iii-ix, 1.
  16. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77-101. doi: 10.1191/1478088706qp063oa.
  17. Braun V, Clarke V, Hayfield N, Terry G. Answers to frequently asked questions about thematic analysis. University of Auckland. URL: https://cdn.auckland.ac.nz/assets/psych/about/our-research/documents/Answers%20to%20frequently%20asked%20questions%20about%20thematic%20analysis%20April%202019.pdf [accessed 2024-11-21].
  18. Braun V, Clarke V. Reflecting on reflexive thematic analysis. Qual Res Sport Exerc Health. 2019;11(4):589-597. doi: 10.1080/2159676X.2019.1628806.
  19. Braun V, Clarke V. Can I use TA? Should I use TA? Should I not use TA? Comparing reflexive thematic analysis and other pattern-based qualitative analytic approaches. Couns Psychother Res. 2020;21(1):37-47. doi: 10.1002/capr.12360.
  20. Braun V, Clarke V. Toward good practice in thematic analysis: avoiding common problems and be(com)ing a researcher. Int J Transgend Health. 2023;24(1):1-6. doi: 10.1080/26895269.2022.2129597.
  21. Finlay L. Thematic analysis: the 'good', the 'bad' and the 'ugly'. Eur J Qual Res Psychother. 2021;11:103-116. doi: 10.4337/9781847208552.00018.
  22. Harper D, Thompson AR, editors. Qualitative Research Methods in Mental Health and Psychotherapy: A Guide for Students and Practitioners. 1st ed. New York, NY: Wiley; 2011.
  23. Madill A, Jordan A, Shirley C. Objectivity and reliability in qualitative analysis: realist, contextualist and radical constructionist epistemologies. Br J Psychol. 2000;91(Pt 1):1-20. doi: 10.1348/000712600161646.
  24. Bingham AJ. From data management to actionable findings: a five-phase process of qualitative data analysis. Int J Qual Methods. 2023;22:3. doi: 10.1177/16094069231183620.
  25. Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods. 2006;5(1):80-92. doi: 10.1177/160940690600500107.
  26. Proudfoot K. Inductive/deductive hybrid thematic analysis in mixed methods research. J Mix Methods Res. 2022;17(3):308-326. doi: 10.1177/15586898221126816.
  27. Graneheim UH, Lindgren B, Lundman B. Methodological challenges in qualitative content analysis: a discussion paper. Nurse Educ Today. 2017;56:29-34. doi: 10.1016/j.nedt.2017.06.002.
  28. Strauss A, Corbin J. Basics of Qualitative Research: Grounded Theory Procedures and Techniques. Thousand Oaks, CA: Sage Publications; 1990.
  29. McLeod J. Qualitative Research in Counselling and Psychotherapy. Thousand Oaks, CA: Sage Publications; 2012.
  30. Hsieh H, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277-88. doi: 10.1177/1049732305276687.
  31. Mayring PAE. Qualitative content analysis. In: International Encyclopedia of Education. 4th ed. Amsterdam, Netherlands: Elsevier; 2023:314-322.
  32. Gale NK, Heath G, Cameron E, Rashid S, Redwood S. Using the framework method for the analysis of qualitative data in multi-disciplinary health research. BMC Med Res Methodol. 2013;13:117. doi: 10.1186/1471-2288-13-117.
  33. Brenton-Peters JM, Consedine NS, Cavadino A, Roy R, Ginsberg KH, Serlachius A. Finding kindness: a randomized controlled trial of an online self-compassion intervention for weight management (SC4WM). Br J Health Psychol. 2024;29(1):37-58. doi: 10.1111/bjhp.12686.
  34. Serlachius A, Schache K, Kieser A, Arroll B, Petrie K, Dalbeth N. Association between user engagement of a mobile health app for gout and improvements in self-care behaviors: randomized controlled trial. JMIR Mhealth Uhealth. 2019;7(8):e15021. doi: 10.2196/15021.
  35. Knox L, Gemine R, Dunning M, Lewis K. Reflexive thematic analysis exploring stakeholder experiences of virtual pulmonary rehabilitation (VIPAR). BMJ Open Respir Res. 2021;8(1):e000800. doi: 10.1136/bmjresp-2020-000800.
  36. Bleyel C, Hoffmann M, Wensing M, Hartmann M, Friederich H, Haun MW. Patients' perspective on mental health specialist video consultations in primary care: qualitative preimplementation study of anticipated benefits and barriers. J Med Internet Res. 2020;22(4):e17330. doi: 10.2196/17330.
  37. Roseveare C. Thematic Analysis: A Practical Guide, by Virginia Braun and Victoria Clarke. Can J Program Eval. 2023;38(1):143-145. doi: 10.3138/cjpe.76737.
  38. Keyworth C, Epton T, Goldthorpe J, Calam R, Armitage CJ. Acceptability, reliability, and validity of a brief measure of capabilities, opportunities, and motivations ("COM-B"). Br J Health Psychol. 2020;25(3):474-501. doi: 10.1111/bjhp.12417.
  39. Forman J, Heisler M, Damschroder LJ, Kaselitz E, Kerr EA. Development and application of the RE-AIM QuEST mixed methods framework for program evaluation. Prev Med Rep. 2017;6:322-328. doi: 10.1016/j.pmedr.2017.04.002.
  40. Szinay D, Perski O, Jones A, Chadborn T, Brown J, Naughton F. Perceptions of factors influencing engagement with health and well-being apps in the United Kingdom: qualitative interview study. JMIR Mhealth Uhealth. 2021;9(12):e29098. doi: 10.2196/29098.
  41. Maramba I, Chatterjee A, Newman C. Methods of usability testing in the development of eHealth applications: a scoping review. Int J Med Inform. 2019;126:95-104. doi: 10.1016/j.ijmedinf.2019.03.018.
  42. Ericsson KA, Simon HA. Verbal reports as data. Psychol Rev. 1980;87(3):215-251. doi: 10.1037/0033-295X.87.3.215.
  • 43.Yardley L, Spring BJ, Riper H, Morrison LG, Crane DH, Curtis K, Merchant GC, Naughton F, Blandford A. Understanding and promoting effective engagement with digital behavior change interventions. Am J Prev Med. 2016 Nov;51(5):833–842. doi: 10.1016/j.amepre.2016.06.015.S0749-3797(16)30243-4 [DOI] [PubMed] [Google Scholar]
  • 44.Yardley L, Bradbury K, Morrison L. Qualitative Research in Psychology: Expanding Perspectives in Methodology and Design (2nd Ed) Washington, DC: American Psychological Association; 2021. Using qualitative research for intervention development and evaluation. [Google Scholar]
  • 45.Taylor B, Henshall C, Kenyon S, Litchfield I, Greenfield S. Can rapid approaches to qualitative analysis deliver timely, valid findings to clinical leaders? A mixed methods study comparing rapid and thematic analysis. BMJ Open. 2018 Oct 08;8(10):e019993. doi: 10.1136/bmjopen-2017-019993. https://bmjopen.bmj.com/lookup/pmidlookup?view=long&pmid=30297341 .bmjopen-2017-019993 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 46.Vindrola-Padros C, Johnson GA. Rapid techniques in qualitative research: a critical review of the literature. Qual Health Res. 2020 Aug;30(10):1596–1604. doi: 10.1177/1049732320921835. [DOI] [PubMed] [Google Scholar]
  • 47.Holdsworth LM, Safaeinili N, Winget M, Lorenz KA, Lough M, Asch S, Malcolm E. Adapting rapid assessment procedures for implementation research using a team-based approach to analysis: a case example of patient quality and safety interventions in the ICU. Implement Sci. 2020 Feb 22;15(1):12. doi: 10.1186/s13012-020-0972-5. https://implementationscience.biomedcentral.com/articles/10.1186/s13012-020-0972-5 .10.1186/s13012-020-0972-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48.Mays N, Pope C. Qualitative research in health care. Assessing quality in qualitative research. BMJ. 2000 Jan 01;320(7226):50–2. doi: 10.1136/bmj.320.7226.50. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 49.Tracy SJ. Qualitative quality: eight “big-tent” criteria for excellent qualitative research. Qual Inq. 2010 Oct 01;16(10):837–851. doi: 10.1177/1077800410383121. [DOI] [Google Scholar]
