Author manuscript; available in PMC: 2025 Sep 10.
Published in final edited form as: Int J Qual Methods. 2025 Jan 16;24:10.1177/16094069251315395. doi: 10.1177/16094069251315395

Updating and Advancing Member-Checking Methods: Use of Video and Asynchronous Technology to Optimize Participant Engagement

Robyn Schafer 1,2, Julia C Phillippi 3
PMCID: PMC12419494  NIHMSID: NIHMS2107445  PMID: 40933861

Abstract

Member checking enhances the trustworthiness and transformative potential of qualitative research. There are a variety of aims and approaches to reengaging with participants in sharing data or preliminary analysis and soliciting feedback through member checking. Published studies often lack descriptions of member-checking methods or outcomes, and there is a lack of research on the use of technologies for this purpose. Asynchronous video and internet-based technologies can be valuable tools to improve the accessibility, equity, effectiveness, and acceptability of member checking and foster increased engagement. This publication presents a detailed description of member checking in an exemplar study that used readily available technologies to create a video synopsis of preliminary findings and embed that video in a multimedia, mixed methods web-based survey which was emailed to participants. This member-checking method was purposefully selected to advance the specific research aims of the study, reflect the epistemological stance of the researchers and unique considerations of the study population, and address relevant situational factors to optimize participant engagement. This strategy facilitated wide, cost-effective, and timely distribution and resulted in a good response rate with rich feedback. Asynchronous technologies were a useful alternative to in-person or synchronous meetings to facilitate voluntary participation, foster reflection that deepened analysis, and capture multiple voices and perspectives. Findings from this research support the use of video and electronic survey technologies to enhance study credibility, address ethical and methodological challenges related to member checking, and increase equity and engagement. Future studies are needed to expand and refine integration of technologies into member checking to address diverse research aims, contexts, and study populations.

Keywords: qualitative research, member checking, participant validation, trustworthiness, digital research methods, asynchronous engagement

Introduction

Although there are multiple strategies to increase trustworthiness in qualitative research, member checking has been called “the most crucial technique for establishing credibility” (Lincoln & Guba, 1985, p. 314). The process of reengaging with participants by sharing study data or preliminary analysis and soliciting feedback is recognized for its value in reducing researcher bias and increasing rigor. Member checking is widely considered the “gold standard for establishing trustworthiness” (Kornbluh, 2015, p. 397) in qualitative research. However, as technological advances in recording and transcription have improved data collection, the need for members to “check” data accuracy has decreased. At the same time, the loci of authority and accountability in qualitative research have shifted towards participatory research approaches, which emphasize collaborative research methods with participants having increased involvement in and ownership of the research process (Caretta & Pérez, 2019; Doyle, 2007; Motulsky, 2021). As such, research methods have evolved to appreciate the benefits of participant reengagement and feedback beyond verifying objective data (transactional validity) towards validating participant experiences and serving as a catalyst for change (transformational or catalytic validity) (Cho & Trent, 2006; Lather, 1986). Contemporary member-checking aims strive for ‘holistic validity’ that embraces an inclusive and evolving process to ensure that the research captures the richness and complexity of the phenomenon through participant collaboration (Cho & Trent, 2006). To achieve this goal, member checking may use a variety of approaches such as data verification, constructive reflection, co-construction of knowledge, collaborative analysis, emancipatory action, and direction of the research inquiry in alignment with research aims (Brear, 2019; Cho & Trent, 2006; Urry et al., 2024). Each approach has unique considerations and challenges. As Birt et al. (2016) have proposed, member checking should be viewed as a valuable intellectual process which requires thoughtful implementation, not merely as an item on a technical checklist to demonstrate that criteria for rigor have been met.

As the aims of member checking have evolved, so too should the means used to achieve those aims. Historical strategies of member checking based on sharing of written reports or in-person meetings pose barriers to accessibility, equity, and acceptability. Incorporation of modern technologies for asynchronous exchange of information can mitigate these barriers and increase participant engagement. Despite widespread use of technology in research, there is limited published evidence about using asynchronous technologies such as video recordings and internet-based surveys to facilitate member checking. No relevant studies were identified in a review of the literature conducted on PubMed and Google Scholar using a variety of search terms including (1) member checking, participant validation, or validity in qualitative research; and (2) technology, video, or electronic survey.

Given the evolution of member checking’s aims and the lack of evidence on asynchronous technologies to achieve them, examples of researchers’ experience using these media are needed to guide future studies. This paper addresses that critical gap by describing an innovative approach to member checking using simple asynchronous technologies. We begin by reviewing member-checking aims, approaches, and considerations and then present an example of member checking utilizing an asynchronous video presentation and multimedia electronic survey from a mixed methods study exploring decision making in perinatal care (Schafer et al., 2023, 2024). We conclude with suggestions for how future researchers might apply these lessons to address ethical and methodological concerns and optimize member-checking to increase validity, equity, and engagement in qualitative and mixed methods research.

Member Checking: Aims, Approaches, and Considerations

There are many combinations of terms used to describe the process of reengagement with research participants (who may be referred to as members, respondents, informants, stakeholders, or interviewees) and their contributions or involvement in the research effort in providing checks, validation, verification, feedback, reviews, follow-up interviews, reflections, or collaboration (Thomas, 2017). The commonly accepted term “member checking” will be used here, although it could easily be argued that modernization of approaches to participant reengagement merits an update in terminology.

Member checking is a long-standing and widely used strategy to enhance rigor in qualitative research (Lincoln & Guba, 1985; Tong et al., 2007). Despite its pervasiveness, member checking’s value and efficacy are not universally accepted (Barbour, 2001; Morse, 2015; Sandelowski, 1993; Thomas, 2017). Member checking is one of several tools proposed to increase trustworthiness, alongside persistent observation, prolonged engagement, negative case analysis, triangulation, reflexivity, rich description, external audits, and peer review or debriefing (Creswell & Plano Clark, 2018; Lincoln & Guba, 1985).

Member checking enhances trustworthiness in a variety of ways. For example, member checking may be used to verify the accuracy of collected data, identify potential researcher biases, confirm resonance with researchers’ representations, generate new data, co-construct interpretations based on preliminary findings, or provide opportunities for dialogue including participants’ critique, feedback, questions, or collaboration (Doyle, 2007; Goldblatt et al., 2011; Harvey, 2015; Kornbluh, 2015; Tracy, 2010). Beyond increasing transactional validity, member checking also has the potential to advance transformational validity by demonstrating respect and appreciation for participants’ involvement, increasing transparency and confidence in the research effort, providing opportunity for self-reflection, facilitating active participation, and ensuring that interpretations are meaningful to the participants (Cho & Trent, 2006; Doyle, 2007; Kornbluh, 2015; Locke & Ramakrishna Velamuri, 2009).

Although member checking is often used in qualitative research, there is a lack of consensus around its aims and methods. This problem is only intensified by norms of scholarly journal publication which frequently omit details regarding member-checking processes and outcomes (Caretta & Pérez, 2019; de Loyola González-Salgado et al., 2022; Kornbluh, 2015; Locke & Ramakrishna Velamuri, 2009; Thomas, 2017). Member checking can take many forms and occur at various (or multiple) stages throughout the research process. For example, participants may be asked to identify errors or omissions in transcripts after data collection, discuss emerging themes in focus groups during analysis, or reflect on interpretations presented as cases or written summaries (Creswell & Poth, 2018; Tong et al., 2007). The information provided to participants in member checking also varies widely. It may include interview transcripts or recordings, specific quotes, descriptive or interpretative narratives, dramatic storytelling, preliminary analysis of key concepts or themes, or drafts of reports, posters, or manuscripts. Similarly, information can be shared with participants through a variety of means. Most commonly, this takes the form of member-checking interviews or focus groups, although other means such as booklets, community meetings, and participatory workshops have been reported (Brear, 2019; Caretta & Pérez, 2019; de Loyola González-Salgado et al., 2022; Goldblatt et al., 2011; Koelsch, 2013).

There is also no consensus on how participants’ feedback should be collected. Typically, it is provided as written or verbal responses, but a variety of creative undertakings ranging from found poetry to diagrammatic elicitation to card sorting have been explored (Harry et al., 2005; Reilly, 2013; Sahakyan, 2023). Some researchers use a community-based, participatory approach, such as synthesized or dialogic member checking or collaborative reflection, in which they reengage with participants iteratively to optimize co-construction of knowledge and shape the research trajectory (Birt et al., 2016; Brear, 2019; Chase, 2017; Harvey, 2015; Urry et al., 2024). Other researchers recommend a flexible approach that engages participants in assessing their preferences and collaborating to determine the most appropriate method and format (Carlson, 2010; Erdmann & Potthoff, 2023).

There is ongoing conversation and controversy about best practices in member checking, its role and value in qualitative research, and epistemological, ethical, and methodological considerations and challenges in its implementation (Birt et al., 2016; Erdmann & Potthoff, 2023; Slettebø, 2021). From a positivist or postpositivist paradigm, member checking is an instrument to enhance accurate representation or approximation of an objective reality. In this approach, member checking is used to assess the accuracy of data and findings and explore ‘fit’ between researchers’ representations and participants’ experience (Varpio et al., 2017). Constructivist or interpretivist approaches do not agree that there is a fixed, unchanging truth that can be accounted for by a researcher and confirmed by a participant. Using these approaches, alignment with the participants’ viewpoint may not be sufficient or suitable for determining trustworthiness (Sandelowski, 1993). Instead, researchers strive for co-construction or interpretation of multiple, coexisting subjective realities, and use member checking to revisit information to stimulate new findings or gather feedback to inform the research process.

In a third approach, grounded in critical theory or participatory research, validity is not measured by a study’s correspondence to an approximated objective or subjective reality, but rather by the impact of the research on the participants (Doyle, 2007). The goal of the research effort in this transformative paradigm is to “know reality in order to better transform it” (Lather, 1986, p. 67). As such, member checking can be used to investigate how participation in the research effort influences members’ thoughts, feelings, or behaviors; impacts self-understanding or self-determination; or initiates change (Koelsch, 2013; Lather, 1986). In this approach, member checking provides an opportunity to affirm the value of participants’ knowledge and perspectives, ensure they have some control over how their experiences are represented, and direct the research agenda to reflect their priorities. In this way, member checking can modulate power imbalances between the researcher and participants, reduce invisibility, and lead to emancipatory action and improved social justice (Brear, 2019; Erdmann & Potthoff, 2023; Motulsky, 2021). Finally, as a fourth option, scholars may adopt a pragmatic approach, designing a hybrid methodology to address their research questions optimally based on what works best for the specific study aims, population, and context (Johnson & Onwuegbuzie, 2004).

It is important to acknowledge that member checking raises ethical concerns about epistemic injustice with regard to authoritative knowledge and the ways in which some forms of knowledge and perspectives are legitimized over others (Fricker, 2007; Jordan, 1997). Given differences in experiences and motivations, researchers and participants will inevitably have asymmetry of knowledge and different ways of seeing and reacting to data. As such, they should not be expected to arrive at the same interpretations (Rolfe, 2006). Consensus in member checking may not be desirable, in that it may indicate oversimplification or a lack of inclusion of diverse perspectives (Doyle, 2007).

Sandelowski (1993) has suggested that member checking may actually threaten validity, since the stories we tell ourselves often change, and the meaning of those stories evolves over time. As such, members may perceive their experience differently at the time of member checking than they did at initial data collection. Participants may also wish to shape their portrayal in a specific direction that is at odds with the researcher’s interpretation (Caretta & Pérez, 2019). Careful and critical consideration of how participant feedback gathered through member checking will influence the research effort is necessary to avoid feelings of disempowerment, embarrassment, or mistreatment among participants or negative effects on the research findings (Barbour, 2001; Carlson, 2010; Locke & Ramakrishna Velamuri, 2009; Sandelowski, 1993). Specifically, researchers should consider in advance how results of member checking will be used, including assimilation of disconfirming voices and ambivalence, as well as the limits of their interpretive authority (Birt et al., 2016; Madill & Sullivan, 2018; Urry et al., 2024).

Additional ethical considerations in member checking include participant burden in contributing additional time and effort to the research and issues of inequity based on differences in access, knowledge, and resources to support participation. Similar to all participant engagement, methods for member checking should resonate with participants’ needs and reduce barriers to accessibility. In addition, it is important to acknowledge that member checking has the potential to be emotionally or psychologically harmful. Being placed in a situation with potential conflict or disagreement can be uncomfortable or challenging, especially for people who have been historically marginalized or overburdened, or who have experienced trauma (Barbour, 1998; Motulsky, 2021).

Traditionally, when described in the literature, especially in early publications, member checking has been conducted in person, due, in part, to a lack of other reliable methods of obtaining responses from participants. In-person and synchronous online engagement have the benefits of real-time dialogue for clarification and building participant-researcher relationships. However, unless well planned and implemented, synchronous discussion can limit access and may bias engagement toward dominant societal groups (Candela, 2019; Motulsky, 2021).

Since member checking was initially described in the literature, communication methods and preferences have changed. For example, during the COVID-19 pandemic, in-person communication was limited, so methods relying on face-to-face interaction required adaptation. Even after health concerns abated, these modified research techniques have value in increasing participant engagement and study credibility. Although younger individuals still tend to spend more time online than older adults, the use of social media sites and digital video platforms has increased dramatically across all age demographics (Faverio, 2022), making these technologies more accessible for research communications and data sharing.

In-person meetings inevitably present challenges for participant engagement including time and availability, financial considerations, geographic barriers, and accessibility and physical mobility limitations. Even synchronous virtual meetings have barriers and burdens related to schedule coordination, reliability of technology, preparation of one’s appearance and environment, and disparities in digital literacy and caregiver responsibilities. Asynchronous engagement mitigates these challenges by making information exchange more accessible and convenient. Asynchronous delivery also avoids potential pitfalls from power dynamics that might lead to participants feeling coerced to agree with data or analysis or to continue engagement beyond their comfort level (Birt et al., 2016; Urry et al., 2024).

Effective implementation of technology can reduce inequities and increase inclusion to improve the reach of member checking. Just as video and graphical research abstracts are gaining popularity due to higher rates of viewers’ comprehension and satisfaction compared to traditional, written abstracts (Bredbenner & Simon, 2019), so too the use of media can improve the way that findings are shared when reengaging with participants in member checking. Simple and widely available technologies have the potential to reduce costs, increase time effectiveness, minimize participant burden, and improve the efficiency of sharing and collecting data in member checking.

In short, there is no singular, ideal approach to member checking. There are a multitude of strategies, each with unique strengths and limitations. The most appropriate approach for member checking will vary based on the nature of the research question, the epistemological stance of the researchers, and situational and contextual considerations related to the study population. Member checking should be approached as an intellectual process, with thoughtful consideration of the ways in which it will enhance a study and further its research aims. Ethical member-checking approaches should strive to minimize the potential for harm while maximizing benefits to participants and their communities. Video and internet-based technologies offer a promising opportunity to increase accessibility, equity, and engagement. However, methods for using these technologies in qualitative research have not been broadly disseminated in the literature, potentially decreasing their uptake.

Using Asynchronous Technology for Member Checking: An Exemplar

In this section, we present an exemplar study that used asynchronous technologies for member checking. As described in previous publications (Schafer et al., 2023, 2024), this sequential, mixed methods study explored 25 participants’ experience of decision-making in perinatal care via an online self-administered survey and subsequent in-depth, semi-structured interviews. The median age of participants was 36.4 years (range 22–42). Most (96%) had at least some college education, identified as White (76%), and had been born in the United States (94%). All major regions of the United States were represented in the sample, including 20% from rural areas. Participant demographics are described further in previous publications (Schafer et al., 2023). Ethical approval for this study was sought from Vanderbilt University’s Institutional Review Board; the study was deemed exempt based on the determination that it posed minimal risk. No participants had relationships with any of the researchers outside of the study context. All participants provided written informed consent prior to enrollment and granted both written and verbal permission for continued contact for member checking and follow-up communication.

Background: Situating the Research Effort and Researcher Positionality

The member-checking strategy employed in this study was designed to align with the study aims, researchers’ epistemological framework, and situational factors related to the research effort and study population. In this mixed methods study using interpretive description (Thorne, 2016), we took a pragmatic approach to develop an optimal member-checking strategy to best address the research question. Specifically, this study aimed to understand the experience of decision-making among pregnant people who left the hospital system to seek home birth due to the presence of fetal malpresentation, which limited the availability of planned vaginal birth in a hospital. Based on our professional experience as perinatal care providers (nurse-midwives in the United States), we anticipated that participants had likely encountered a lack of involvement and autonomy in their experience of decision-making. As such, we aimed for a member-checking approach that would encourage active participation, validate participants’ experience, make them feel heard and valued, and ensure they retained some control over the representation of their experiences.

We approached this study from a critical feminist perspective guided by our professional Code of Ethics (American College of Nurse-Midwives, 2005; Jefford et al., 2019). Our engagement with participants was informed by a trauma-informed, person-centered, relationship-centric model of care. As such, we aimed for research that would demonstrate respect for dignity, individuality, and diversity; foster autonomy and partnership; seek to disrupt systems of power and privilege that cause harm; and support transformation that benefitted participants and the communities and populations they represent (American College of Nurse-Midwives, 2005).

Given that individuals in this population had likely experienced coercive or traumatic interactions, the voluntary nature of participation was a priority. For the same reason, we opted not to present case studies or narratives of individual experiences, to avoid distress or possible retraumatization, and chose to share data with each participant individually rather than bringing participants together as a group. This approach was also chosen to give voice to all participants and avoid potential disempowerment or marginalization. Based on this framework, our member-checking strategy involved presenting all participants, as individuals, with a full preliminary, synthesized analysis using a relationship-centered approach that aimed to foster co-construction of recommendations to improve the quality of health care and ensure consensual participation.

Using an analytic approach grounded in situational analysis (Clarke et al., 2018), we considered situational factors related to the research process and population of interest. Due to the nature of the research question and eligibility criteria, all participants were parenting a young child, and we knew from challenges in scheduling and conducting interviews that participants had limited availability and many competing demands on their attention. They had already given a great deal of their time to the research effort, with initial data collection interviews lasting an average of 105 minutes (range 59–155 min, SD 25 min) and some interviews having to be continued over multiple sessions due to time constraints and unanticipated interruptions. In addition, although participants had been offered compensation for their time in the form of an electronic gift card following interviews, we did not have additional funds to provide compensation for participation in member checking. We were also facing research project deadlines and desired timely responses. Given all these considerations, our approach to member checking aimed to be quick and convenient to minimize participant burden and, correspondingly, maximize the quantity and quality of responses in the limited timeframe available.

Towards that end, we aimed to utilize technology with demonstrated efficacy in the population of interest. The technologies used for member checking mirrored those that had been used successfully during data collection. For example, surveys had been conducted through the Research Electronic Data Capture (REDCap) platform, a secure web-based data processing and management application (Harris et al., 2009). REDCap has proven efficacy for research among reproductive-aged women with varying levels of health literacy (Phillippi et al., 2018). This internet-based platform had been used effectively with participants to establish eligibility and collect quantitative and qualitative data using a self-administered online questionnaire with multiple-choice “radio buttons” and free-text boxes, so those same forms of data collection were replicated for member checking. In addition to engagement with the online REDCap platform, participants had demonstrated ease of access and facility with Internet technologies through use of the cloud-based video and audio-conferencing platform (Zoom) used for interviews. As such, we anticipated that use of internet-based technology and an electronic survey would not present barriers to participation in member checking.

Similarly, a video approach was deemed appropriate to share the study findings and implications. Because this study was conducted with parents during the COVID-19 pandemic, all participants had young children at home, resulting in time demands that made synchronous meetings or review of written content challenging. The COVID-19 pandemic spurred widespread use of online engagement and increasing familiarity with Zoom and social media platforms. The demographic of study participants (reproductive-aged women) is known for high rates of engagement with online video content including YouTube, Facebook, Instagram, and TikTok (Pew Research Center, 2024). Engagement with online video content has skyrocketed in recent years (Faverio, 2022), and video and graphical summaries of research have demonstrated potential to improve uptake and comprehension of findings in viewers (Bredbenner & Simon, 2019; Jeyaraman et al., 2023). However, online content transmission also presents potential for depersonalization, a possibility of which we were acutely aware as researchers in perinatal care, where overreliance on technology can lead to detached interactions (Davis-Floyd, 2001; Postmes et al., 2002). As such, we made an effort to retain a personal connection in our asynchronous communication by opting for video that superimposed the first author speaking over the background of the presentation slides. We felt that this approach enhanced the relationship-oriented aspect of the communication exchange, especially since most participants had extensive previous interaction with the first author during interviews conducted via videoconferencing.

Methods: Member-Checking Process

Following our data analysis, we created slides to present a brief synopsis of preliminary findings and implications using Microsoft PowerPoint. Slides contained simple graphics, illustrations, and minimal text. We then created a Zoom videoconferencing meeting with no audience and recorded the presentation within Zoom, using the slides as a “virtual background” displayed behind the speaker, so both the speaker and slides could be viewed simultaneously.

The video began with a brief introduction to set expectations for member checking, namely, to ensure that the content in the presentation resonated with participants’ experience and encompassed the most important aspects of that experience. Next, the video presentation provided a brief summary of core themes, exemplar quotes, a graphical illustration of a proposed storyline, and key implications for practice and health services reform. The recording concluded with comments thanking participants for sharing their stories and contributing to the research effort and sharing the researchers’ contact information. This presentation approach was selected to draw on a variety of learning styles, in that it included visual, auditory, linguistic, logical, and interpersonal elements by using pictures, spoken word, text, and a diagrammatic storyline. We used the familiar face and voice of the primary investigator to increase interpersonal aspects of communication and used an informative, professional tone to effectively convey findings. We intentionally kept the presentation short (approximately 10 minutes) to reduce cognitive overload and increase comprehension, based on adult learning theories regarding maximal attention span (Bradbury, 2016). The recording was then uploaded to a private (unpublished) YouTube channel. The video recording and transcript can be accessed at https://go.rutgers.edu/mcvideo.

Using REDCap’s media options, the YouTube video was embedded into an electronic survey. After the video presentation, the survey asked participants to respond to 4 questions to provide feedback about preliminary study findings and associated implications. The content of the survey is provided in Table 1. Participants could respond to survey questions using a variety of methods, including free-text (unlimited character) data entry or file upload, which could include audio or video files easily recorded on a smartphone device, as well as phone, text, and email. The link to the survey was emailed to all participants who had completed interviews, asking them to view the video and respond to a brief online survey within one week. If a participant had not yet responded, a single reminder email was sent 5 days later.

Table 1.

Member-Checking Survey.

Thank you to all the amazing people who contributed to this important research! We are so grateful you were willing to share your stories and support. This page provides our preliminary results for your review. We hope you will share your feedback to help us strengthen our final results
Please review this video and then answer the questions below
 1. Please rate the extent to which you agree with this statement: “These findings feel true to my experience.” (strongly agree, agree, neither agree nor disagree, disagree, strongly disagree)
 2. What about these preliminary results do you feel best encompasses your experience?
 3. What about these preliminary results do you feel most needs further development or improvement?
 4. What else do you think we should know? (Any and all feedback is welcome!)
 If you prefer to provide your feedback by uploading a video or audio recording, you can do so here or reach out by phone or text to ###-###-####

Results of Member Checking

Of the 25 participants who completed the initial survey, 23 participated in interviews, with one declining and one lost to follow-up. Those 23 participants were sent an email inviting them to participate in member checking with a link to the REDCap member-checking survey. Using YouTube analytics, we determined the number of unique viewers who accessed the video (n = 9, 39.1% of invited participants) and confirmed the duration of their viewing time during the one-week response timeframe. All 9 viewers completed the member-checking survey (100% viewer response rate) and submitted written responses through the survey’s free-text boxes; none opted for the file upload, phone, text, or email alternatives.
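The reported rates follow directly from these counts; as a quick arithmetic check (variable names here are ours, not the study’s):

```python
# Counts reported in the text above
invited = 23      # interview participants emailed the survey link
viewers = 9       # unique video viewers per YouTube analytics
respondents = 9   # viewers who completed the member-checking survey

print(f"viewer rate: {viewers / invited:.1%}")               # 39.1% of invited participants
print(f"viewer response rate: {respondents / viewers:.0%}")  # 100%
```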

Participant feedback (provided in Supplemental 1) addressed aspects of both transactional and transformational validity. On a 5-point Likert scale, all respondents strongly agreed (n = 7, 77.8%) or agreed (n = 2, 22.2%) that the preliminary findings felt true to their experience. In open-ended comments, participants shared positive feedback that reinforced their sense of resonance with the findings. They also considered commonalities and differences with other participants’ experiences, reflected on their experience of participating in the research process, provided suggestions for improvement, and expressed gratitude for the opportunity to share their experience. Additional qualitative responses offered recommendations to clarify or expand findings and suggestions for future research and clinical implications. For example, participants responded:

It’s all true and accurate. (Participant [P]1)

You have done an amazing job sharing my views and experience. (P2)

It’s interesting to hear that my experience lined up with others. I’m curious as to what gave us all the belief in … (P7)

Be sure to make it clear that, at least for me, and it seems like for a lot of the other women you interviewed, that we … (P4)

I would love to see more research on … (P6)

Could the results include a bit more about … (P5)

It’s been nice to talk about my experience for this study. I don’t talk about it much. (P7)

Thank you so much. I’m always here if you need more. (P9)

The research team reviewed member-checking responses in peer debriefing. Data were coded into five themes: resonance, self-reflection, reiteration of key concepts, opportunities to clarify or expand, and praise/gratitude. With these themes in mind, we explored opportunities to strengthen our analysis and clinical implications, incorporating participants’ suggestions and reflecting on what they found most meaningful and important. Based on participant responses, we concluded that our synthesis and interpretation resonated positively with participants and authentically represented key themes of their experience. Because no participants shared disconfirming views, we determined it was not necessary to eliminate or significantly revise our interpretations or to perform further rounds of member checking to gain additional clarity or insights.

Discussion

The asynchronous member-checking approach effectively engaged participants, as evidenced by a high viewer response rate and detailed feedback. Participants confirmed resonance between preliminary analysis and their experience, expressed feeling heard and validated, engaged in self-reflection and comparison, and offered additional insights that enhanced analysis and trustworthiness of study findings. The simple, asynchronous video and internet-based technology used for member checking proved to be successful for achieving cost- and time-effective engagement that resulted in valuable participant feedback.

Because member checking was conducted through a video presentation embedded in an electronic survey, participants were able to engage easily and asynchronously at a time and place of their choosing. As such, this approach may have mitigated inequities in access and challenging situational factors such as time and geographic constraints. It also likely reduced participant burden, an especially important consideration for a population of caregivers of young children. The video format continued the interpersonal interaction with the researcher initiated during data collection and relied on familiar technologies. Asynchronous engagement avoided direct interactions with the researcher that might otherwise have made participants uncomfortable challenging findings or discontinuing participation. In this way, the use of technology reduced the potential for coercion or distress.

Although research exploring the use of asynchronous technologies to facilitate member checking is very limited, existing studies have shown that surveys and video segments can be incorporated to align with research aims (Birt et al., 2016; Naidu & Prose, 2018). The rate and quality of member-checking responses in this study were high compared with the limited data reported elsewhere (Motulsky, 2021; Thomas, 2017). The high viewer response rate and rich feedback provided are evidence that asynchronous technologies can be used effectively for member checking in populations with access to, and familiarity with, video and online survey technologies.

Strengths and Limitations

Participant feedback and engagement with technology through member checking in this mixed methods study provide reassurance that asynchronous technologies can be useful for member checking. The video and internet-based asynchronous technologies used appeared to impose low participant burden, based on the 100% viewer response rate. For the researchers, the process was quick and straightforward; the use of asynchronous technologies had the added benefits of wide distribution, cost effectiveness, and timeliness in sharing data and gathering feedback. This approach also ensured voluntary participation and bolstered participants’ opportunity to share criticism or negative feedback, as they could disagree with the findings or withdraw consent for participation at any time without the uncomfortable communication with the researcher that would be inherent in synchronous interactions.

The primary limitation of this member-checking strategy was the lack of opportunity for interaction or discussion to deepen analysis or generate new interpretations, which synchronous methods could have provided had participants been available for such interaction. Similarly, although the short and simple survey design reduced participant burden, it also limited the opportunity for interactive data collection. The absence of responses through methods other than the free-text survey boxes suggests that the alternative communication avenues may have been unnecessary for this population. Lastly, the asynchronous approach relied on familiarity with technology and may not be optimal for persons with lower levels of technological literacy, access, or comfort.

Implications

This study offers valuable insights to guide researchers in using member checking to advance qualitative and mixed methods research. Researchers should recognize that member checking is one of many available techniques that can be used in combination to enhance validity, broadly conceived, and should consider the ways in which member checking is appropriate to advance the research aims. This study highlights the need for researchers to engage in reflexivity and consider their positionality, implicit biases, and epistemological stance. It is also essential to think critically about the specific context and attributes of the study population when designing member-checking strategies. Thoughtful consideration is necessary to develop an approach that acknowledges inequities and asymmetries of power, minimizes potential barriers or burden, and maximizes equitable, ethical, and respectful engagement to benefit participants and their communities.

This study provides an example of how increased transparency in member checking enhances qualitative rigor. When using member checking, researchers should provide sufficient detail to situate their selected strategy within the study and contextualize its contributions to the research effort. This description should include the goals of member checking within the study and its fit with the overall research aims and the researchers’ theoretical positions. Researchers should also provide details about member-checking processes and outcomes in publications, including how participants were selected for member checking, rates of participation, methods of sharing and gathering member-checking data, results of member checking, and the influence of participant feedback on the research effort. When researchers determine that member checking is not an optimal or appropriate strategy for a study, that rationale should be provided. Increased transparency in member checking is necessary to enhance methodological rigor and guide best practices to inform future research.

Additional research is needed to explore and refine the use of asynchronous technologies in member checking and to find an appropriate balance of convenience and accessibility with opportunities for meaningful engagement and co-construction of knowledge. Further studies are needed to investigate how to optimize member-checking strategies for different research contexts and populations, including those with varying levels of technological literacy and extent of marginalization.

Conclusion

Member checking is one of many techniques that, if implemented thoughtfully, can enhance trustworthiness in qualitative and mixed methods research, facilitate co-construction of knowledge, potentiate empowerment, and promote transformative change. It has the potential to validate not just the data and the research process but also participants themselves. A wide variety of approaches and techniques for member checking exist, each with unique considerations and strengths. However, the methods used for member checking are often poorly described in published research, limiting awareness and adoption of innovative approaches.

The optimal member-checking strategy in a given study should align with the research aims, the epistemological stance of the researchers, and the unique considerations of the study population, with attention to the ways in which member-checking methods can increase equity, reduce participant burden, and facilitate meaningful engagement. By prioritizing participant reengagement through optimal member-checking methods, researchers can advance work that is inclusive, equitable, and reflective of diverse experiences and perspectives. Asynchronous technologies such as video recordings and electronic surveys offer innovative solutions to make member checking more accessible, convenient, and engaging. While our experience was positive, the body of evidence on the use of technology and asynchronous engagement for member checking is limited, and further research is needed. As research methodologies continue to evolve, innovative member-checking approaches can strengthen the trustworthiness and transformative potential of qualitative research.

Supplementary Material

Supplement 1 Member-Checking Responses

Supplemental material for this article is available online.

Acknowledgements

The authors gratefully acknowledge Drs. Mary S. Dietrich, Holly Powell Kennedy, and Shelagh Mulvaney, who contributed to the conceptualization of the research project and supervision of the research effort in the qualitative study presented.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the A.C.N.M. Foundation, National Center for Advancing Translational Sciences (UL1 TR000445), National Institutes of Health (K08HS024733), National League for Nursing, Vanderbilt University, March of Dimes Foundation.

Footnotes

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Ethical Statement

Ethical Committee

Vanderbilt University IRB.

Data Availability Statement

Data are available and included as supplemental information with this article.

References

  1. American College of Nurse-Midwives. (2005). Code of ethics. American College of Nurse-Midwives. https://www.midwife.org/ACNM/files/ACNMLibraryData/UPLOADFILENAME/000000000048/Code-of-Ethics.pdf
  2. Barbour RS (1998). Engagement, representation and presentation in research practice. In Barbour RS & Huby G (Eds.), Meddling with mythology: AIDS and the social construction of knowledge (1st ed., pp. 180–197). Routledge. 10.4324/9780203976524-15
  3. Barbour RS (2001). Checklists for improving rigour in qualitative research: A case of the tail wagging the dog? BMJ, 322(7294), 1115–1117. 10.1136/bmj.322.7294.1115
  4. Birt L, Scott S, Cavers D, Campbell C, & Walter F (2016). Member checking: A tool to enhance trustworthiness or merely a nod to validation? Qualitative Health Research, 26(13), 1802–1811. 10.1177/1049732316654870
  5. Bradbury NA (2016). Attention span during lectures: 8 seconds, 10 minutes, or more? Advances in Physiology Education, 40(4), 509–513. 10.1152/advan.00109.2016
  6. Brear M (2019). Process and outcomes of a recursive, dialogic member checking approach: A project ethnography. Qualitative Health Research, 29(7), 944–957. 10.1177/1049732318812448
  7. Bredbenner K, & Simon SM (2019). Video abstracts and plain language summaries are more effective than graphical abstracts and published abstracts. PLoS One, 14(11), e0224697. 10.1371/journal.pone.0224697
  8. Candela AG (2019). Exploring the function of member checking. Qualitative Report, 24(3), 619–628. 10.46743/2160-3715/2019.3726
  9. Caretta MA, & Pérez MA (2019). When participants do not agree: Member checking and challenges to epistemic authority in participatory research. Field Methods, 31(4), 359–374. 10.1177/1525822X19866578
  10. Carlson JA (2010). Avoiding traps in member checking. Qualitative Report, 15(1), 1102–1113. 10.46743/2160-3715/2010.1332
  11. Chase E (2017). Enhanced member checks: Reflections and insights from a participant-researcher collaboration. Qualitative Report, 22(10), 2689–2703. 10.46743/2160-3715/2017.2957
  12. Cho J, & Trent A (2006). Validity in qualitative research revisited. Qualitative Research, 6(3), 319–340. 10.1177/1468794106065006
  13. Clarke AE, Friese C, & Washburn RS (2018). Situational analysis: Grounded theory after the interpretive turn (2nd ed.). Sage Publications.
  14. Creswell JW, & Plano Clark VL (2018). Designing and conducting mixed methods research (3rd ed.). Sage Publications.
  15. Creswell JW, & Poth CN (2018). Qualitative inquiry and research design: Choosing among five approaches (4th ed.). Sage Publications.
  16. Davis-Floyd R (2001). The technocratic, humanistic, and holistic paradigms of childbirth. International Journal of Gynecology & Obstetrics, 75(Suppl 1), S5–S23. https://www.ncbi.nlm.nih.gov/pubmed/11742639
  17. de Loyola González-Salgado I, Rivera-Navarro J, Gutiérrez-Sastre M, Conde P, & Franco M (2022). Conducting member checking within a qualitative case study on health-related behaviours in a large European city: Appraising interpretations and co-constructing findings. Health, 28(1), 3–21. 10.1177/13634593221109682
  18. Doyle S (2007). Member checking with older women: A framework for negotiating meaning. Health Care for Women International, 28(10), 888–908. 10.1080/07399330701615325
  19. Erdmann A, & Potthoff S (2023). Decision criteria for the ethically reflected choice of a member check method in qualitative research: A proposal for discussion. International Journal of Qualitative Methods, 22(1), 16094069231177664. 10.1177/16094069231177664
  20. Faverio M (2022). Share of those 65 and older who are tech users has grown in the past decade. Pew Research Center. https://pewrsr.ch/3HZd2ao
  21. Fricker M (2007). Epistemic injustice: Power and the ethics of knowing. Oxford University Press.
  22. Goldblatt H, Karnieli-Miller O, & Neumann M (2011). Sharing qualitative research findings with participants: Study experiences of methodological and ethical dilemmas. Patient Education and Counseling, 82(3), 389–395. 10.1016/j.pec.2010.12.016
  23. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, & Conde JG (2009). Research electronic data capture (REDCap): A metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics, 42(2), 377–381. 10.1016/j.jbi.2008.08.010
  24. Harry B, Sturges KM, & Klingner JK (2005). Mapping the process: An exemplar of process and challenge in grounded theory analysis. Educational Researcher, 34(2), 3–13. https://www.jstor.org/stable/3700040
  25. Harvey L (2015). Beyond member-checking: A dialogic approach to the research interview. International Journal of Research and Method in Education, 38(1), 23–38. 10.1080/1743727X.2014.914487
  26. Jefford E, Alonso C, & Stevens JR (2019). Call us midwives: Critical comparison of what is a midwife and what is midwifery. International Journal of Childbirth, 9(1), 39–50. 10.1891/2156-5287.9.1.39
  27. Jeyaraman M, Ratna HVK, Jeyaraman N, Maffulli N, Migliorini F, Nallakumarasamy A, & Yadav S (2023). Graphical abstract in scientific research. Cureus, 15(9), e45762. 10.7759/cureus.45762
  28. Johnson RB, & Onwuegbuzie AJ (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14–26.
  29. Jordan B (1997). In Davis-Floyd R & Sargent CF (Eds.), Childbirth and authoritative knowledge: Cross-cultural perspectives. University of California Press.
  30. Koelsch LE (2013). Reconceptualizing the member check interview. International Journal of Qualitative Methods, 12(1), 168–179. 10.1177/160940691301200105
  31. Kornbluh M (2015). Combatting challenges to establishing trustworthiness in qualitative research. Qualitative Research in Psychology, 12(4), 397–414. 10.1080/14780887.2015.1021941
  32. Lather P (1986). Issues of validity in openly ideological research: Between a rock and a soft place. Interchange, 17(1), 63–84. 10.1007/BF01807017
  33. Lincoln Y, & Guba E (1985). Naturalistic inquiry. Sage Publications.
  34. Locke K, & Ramakrishna Velamuri S (2009). The design of member review: Showing what to organization members and why. Organizational Research Methods, 12(3), 488–509. 10.1177/1094428108320235
  35. Madill A, & Sullivan P (2018). Mirrors, portraits, and member checking: Managing difficult moments of knowledge exchange in the social sciences. Qualitative Psychology, 5(3), 321–339. 10.1037/qup0000089
  36. Morse JM (2015). Critical analysis of strategies for determining rigor in qualitative inquiry. Qualitative Health Research, 25(9), 1212–1222. 10.1177/1049732315588501
  37. Motulsky SL (2021). Is member checking the gold standard of quality in qualitative research? Qualitative Psychology, 8(3), 389–406. 10.1037/qup0000215
  38. Naidu T, & Prose N (2018). Re-envisioning member checking and communicating results as accountability practice in qualitative research: A South African community-based organization example. Forum Qualitative Sozialforschung/Forum: Qualitative Social Research, 19(3), 1. 10.17169/fqs-19.3.3153
  39. Pew Research Center. (2024). Americans’ social media use. Pew Research Center. https://www.pewresearch.org/internet/wp-content/uploads/sites/9/2024/01/PI_2024.01.31_Social-Media-use_report.pdf
  40. Phillippi JC, Doersam JK, Neal JL, & Roumie CL (2018). Electronic informed consent to facilitate recruitment of pregnant women into research. Journal of Obstetric, Gynecologic, and Neonatal Nursing, 47(4), 529–534. 10.1016/j.jogn.2018.04.134
  41. Postmes T, Spears R, & Lea M (2002). Intergroup differentiation in computer-mediated communication: Effects of depersonalization. Group Dynamics: Theory, Research, and Practice, 6(1), 3–16. 10.1037/1089-2699.6.1.3
  42. Reilly RC (2013). Found poems, member checking and crises of representation. Qualitative Report, 18(30), 1–18. 10.46743/2160-3715/2013.1534
  43. Rolfe G (2006). Validity, trustworthiness and rigour: Quality and the idea of qualitative research. Journal of Advanced Nursing, 53(3), 304–310. 10.1111/j.1365-2648.2006.03727.x
  44. Sahakyan T (2023). Member-checking through diagrammatic elicitation: Constructing meaning with participants. TESOL Quarterly, 57(2), 686–701. 10.1002/tesq.3210
  45. Sandelowski M (1993). Rigor or rigor mortis: The problem of rigor in qualitative research revisited. Advances in Nursing Science, 16(2), 1–8. 10.1097/00012272-199312000-00002
  46. Schafer R, Dietrich M, Kennedy H, Mulvaney S, & Phillippi J (2023). “I had no choice”: A mixed-methods study on access to care for vaginal breech birth. Birth: Issues in Perinatal Care, 51(1), 413–423. 10.1111/birt.12797
  47. Schafer R, Kennedy HP, Mulvaney S, & Phillippi JC (2024). Experience of decision-making for home breech birth: An interpretive description. SSM-Qualitative Research in Health, 5(1), 100397. 10.1016/j.ssmqr.2024.100397
  48. Slettebø T (2021). Participant validation: Exploring a contested tool in qualitative research. Qualitative Social Work, 20(5), 1223–1238. 10.1177/1473325020968189
  49. Thomas DR (2017). Feedback from research participants: Are member checks useful in qualitative research? Qualitative Research in Psychology, 14(1), 23–41. 10.1080/14780887.2016.1219435
  50. Thorne S (2016). Interpretive description: Qualitative research for applied practice (2nd ed.). Routledge.
  51. Tong A, Sainsbury P, & Craig J (2007). Consolidated criteria for reporting qualitative research (COREQ): A 32-item checklist for interviews and focus groups. International Journal for Quality in Health Care, 19(6), 349–357. 10.1093/intqhc/mzm042
  52. Tracy SJ (2010). Qualitative quality: Eight “big-tent” criteria for excellent qualitative research. Qualitative Inquiry, 16(10), 837–851. 10.1177/1077800410383121
  53. Urry K, Chur-Hansen A, & Scholz B (2024). From member checking to collaborative reflection: A novel way to use a familiar method for engaging participants in qualitative research. Qualitative Research in Psychology, 21(3), 357–374. 10.1080/14780887.2024.2355972
  54. Varpio L, Ajjawi R, Monrouxe LV, O’Brien BC, & Rees CE (2017). Shedding the cobra effect: Problematising thematic emergence, triangulation, saturation and member checking. Medical Education, 51(1), 40–50. 10.1111/medu.13124
