Abstract
In-person interactions have traditionally been the gold standard for qualitative data collection. The COVID-19 pandemic required researchers to consider if remote data collection can meet research objectives, while retaining the same level of data quality and participant protections. We use four case studies from the Philippines, Zambia, India and Uganda to assess the challenges and opportunities of remote data collection during COVID-19. We present lessons learned that may inform practice in similar settings, as well as reflections for the field of qualitative inquiry in the post-COVID-19 era. Key challenges and strategies to overcome them included the need for adapted researcher training in the use of technologies and consent procedures, preparation for abbreviated interviews due to connectivity concerns, and the adoption of regular researcher debriefings. Participant outreach to allay suspicions ranged from communicating study information through multiple channels to highlighting associations with local institutions to boost credibility. Interviews were largely successful, and contained a meaningful level of depth, nuance and conviction that allowed teams to meet study objectives. Rapport still benefitted from conventional interviewer skills, including attentiveness and fluency with interview guides. While differently abled populations may encounter different barriers, the included case studies, which varied in geography and aims, all experienced more rapid recruitment and robust enrollment. Reduced in-person travel lowered interview costs and increased participation among groups who may not have otherwise attended. In our view, remote data collection is not a replacement for in-person endeavours, but a highly beneficial complement. It may increase accessibility and equity in participant contributions and lower costs, while maintaining rich data collection in multiple study target populations and settings.
Keywords: qualitative study, public health, health policies and all other topics, study design
Summary box.
Qualitative researchers have historically championed gathering respondents’ perspectives via face-to-face engagement, but the ongoing pandemic presents challenges to in-person research.
In shifting to remote data collection—via mobile phones or online formats—we identified challenges related to rapport building, fear of technology, privacy and confidentiality, and developed measures to address them.
Drawing from our qualitative data collection experiences in the Philippines, Zambia, India and Uganda, this paper outlines exemplars, mitigation techniques and lessons learned that could inform remote interviewing strategies beyond the COVID-19 context.
Introduction
As qualitative researchers, we champion the value and necessity of rapport building, empathy, open and honest dialogue, and a sense of closeness between research teams and interview respondents. Throughout our careers, we have adhered to a longstanding (if unstated) view that face-to-face engagement, in a location that is comfortable for and familiar to the respondent, is the gold standard in qualitative data collection—and anything else is second best.1 2 Face-to-face interviewing facilitates a qualitative researcher’s ability to observe non-verbal cues (eg, furtive glances, fidgeting, or an eye roll), use silence as an element of patient dialogue, and to record and probe about the artefacts or tools that reflect a person’s life (eg, the material objects that hold meaning or value for an individual).3 COVID-19 and associated lockdowns and social distancing have forced us to challenge these perceptions in pursuit of gathering trustworthy, rigorous and authentic qualitative data in low- and middle-income countries (LMICs).4–6
Several academics, often doctoral students, have highlighted the pros and cons of collecting data remotely.7–9 James and Busher described doctoral data collection using email, and noted disadvantages of the asynchronous approach, which could sometimes cause a loss of coherence and flow of thought, leaving the data feeling ‘dry’ due to an absence of visual and auditory cues.9 The authors also highlighted concerns about consent and anonymity given the nature of electronic messaging and data storage.9 Similarly, researchers using phone interviews to collect qualitative data described a lack of non-verbal data, which contributed to a limited understanding of context.10 Several others, however, detailed the benefits of phone interviews offering richer discussions on sensitive topics due to increased perceptions of anonymity,11 12 and improved access to hard-to-reach respondents13 and settings that may otherwise be considered unsafe for research.14
More recently, studies have examined video communication platforms such as Zoom, Skype or WhatsApp,8 15–18 and identified mixed, but largely positive experiences. Deakin and Wakefield highlighted tremendous potential for Skype to facilitate data collection across a wide range of geographical perspectives while operating on modest budgets.15 At least two studies directly compared in-person to online communication,8 16 and found relatively modest differences across the approaches in terms of participant satisfaction and data quality,8 although microphones, webcams and uneven internet reliability presented challenges. Most recently, studies have explored the use of mobile instant messaging applications to elicit respondents' daily experiences, feelings and thoughts.17 18 Kaufmann and Peil18 state that the use of WhatsApp messaging has proven useful in capturing participants’ daily experiences via multimedia options including pictures, videos, screenshots, emojis, filters and hashtags.
A majority of literature on the use of remote means (eg, internet or phone based) to gather qualitative data precedes the current COVID-19 pandemic, and comes from high-income countries (HICs). As noted above, researchers working in HICs have highlighted that remote data collection facilitates reaching people who are isolated, geographically dispersed, stigmatized, overlooked or ignored.19–22 They note the novelty of remote data collection, because it represents a substantive adaptation or pivot from the status quo. In contrast, there is little research on remote data collection in LMICs. As a counterpoint to expanded participation, remote data collection may create or foment selection bias because access to electricity, mobile phones, and the Internet, while expanding, is not nearly as universal in LMICs as in HICs.23–25 Though mobile phone ownership among women has been increasing, a gender gap persists: women are 10% less likely than men to own mobile phones across LMICs, with the largest gap observed in South Asia.23 Similarly, women in LMICs are 23% less likely than men to use ‘mobile internet’, a term that refers to accessing the internet via a smartphone or tablet using a wireless or cellular connection.25 26 Broadly speaking, rural populations in LMICs are also 40% less likely to use mobile internet than urban populations.25 Hence, while researchers in LMICs have had to adapt and pivot for decades in the interest of getting data amid major structural challenges (we have, for example, contended with natural calamities, political unrest, epidemics and resource shortages), we have rarely considered electronic or mobile data collection as a promising solution.
In relation to the current pandemic, we are aware of blog entries27 and Twitter discussions, though relatively little academic literature to guide the research community, particularly the qualitative community, on how to adapt amid the ongoing pandemic. In this practice paper, drawing from our experiences collecting data remotely via online and mobile phone-based interviews across four LMICs, we share methodological and practical adaptations and lessons learned to guide fellow qualitative researchers who are contending with the ongoing pandemic—and who may want to consider remote means of data collection well into the future. We do not emphasize general tenets of qualitative research, or tips for collecting high-quality qualitative data generally, but instead focus on remote qualitative research specifically.
Case studies
Our case studies stem from research underway in the Philippines, Zambia, India and Uganda. While comprehensively discussing comparative historical, cultural, structural and social differences is beyond the scope of this paper, we present a snapshot of demographics, COVID-19-related details, pertinent information regarding each country’s access to electricity, mobile phone subscriptions and internet connectivity, and information related to our ongoing research (table 1).
Table 1.
| Philippines | Zambia | India | Uganda |
COVID-19 related | | | | |
First confirmed case* | January 30, 2020 | March 18, 2020 | January 30, 2020 | March 21, 2020 |
Total confirmed cases as of October 5, 2020* | 322 497 | 15 052 | 6 623 815 | 8808 |
Total deaths as of October 5, 2020* | 5776 | 333 | 102 685 | 81 |
Deaths per 100 000 population† | 5.3 | 1.8 | 7.4 | 0.2 |
Electricity, internet, phone | | | | |
Access to electricity (% of population)‡ | 95 (total), 98 (urban), 93 (rural) | 40 (total), 77 (urban), 11 (rural) | 95 (total), 100 (urban), 93 (rural) | 43 (total), 58 (urban), 38 (rural) |
Mobile§ subscriptions per 100 people | 154 | 96 | 84 | 57 |
Secure internet servers per 1 million people¶ | 111 | 36 | 389 | 22 |
Individuals using the internet (%)** | 43 | 14 | 20 | 24 |
Population characteristics | | | | |
Population†† | 108 million | 18 million | 1.4 billion | 44 million |
Median age (years)‡‡ | 25.7 | 17.6 | 28.4 | 16.7 |
Descriptions of our qualitative research | | | | |
Topical focus | Vaccine hesitancy | TB care-seeking | COVID-19 health services | Mental health and HIV |
Target population | Parents of children <5 years, policy makers, healthcare workers, community leaders | Patients diagnosed with TB in the 2 weeks prior to interview | Private healthcare providers, including medical doctors and experience-based rural medical practitioners who provide services to low-income populations | People living with HIV, health workers, community members knowledgeable about mental health |
Geographical areas | Urban and rural | Urban | Urban and rural | Rural, trading and fishing communities |
Sampling technique | Purposive—criterion | Purposive—criterion | Purposive—criterion and snowball | Purposive—criterion and snowball |
Average length of interview | 1–1.5 hours | 45–60 min | 45–60 min | 1–2 hours |
Remote platforms used for recruitment and/or interviews | Zoom, Skype, Google Meet and Facebook Messenger | Mobile phone | Mobile phone, WhatsApp, Zoom | Telephone |
TB, tuberculosis.
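As a rough cross-check of the 'Deaths per 100 000 population' row, the short sketch below recomputes the rate as total deaths divided by population, multiplied by 100 000, using the rounded death counts and population figures reported in table 1; because the source cited in the table uses more precise denominators, small discrepancies (most visibly for India) are expected.

```python
# Cross-check of table 1's "Deaths per 100 000 population" row, using the
# death counts and the rounded populations reported elsewhere in the table.
# The cited source uses more precise denominators, so minor differences
# from the published figures are expected.
deaths = {"Philippines": 5776, "Zambia": 333, "India": 102_685, "Uganda": 81}
population = {
    "Philippines": 108e6,
    "Zambia": 18e6,
    "India": 1.4e9,
    "Uganda": 44e6,
}

for country, n_deaths in deaths.items():
    rate = n_deaths / population[country] * 100_000
    print(f"{country}: {rate:.2f} deaths per 100 000 population")

# Prints roughly 5.35, 1.85, 7.33 and 0.18, in line with the table's
# rounded values of 5.3, 1.8, 7.4 and 0.2.
```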
We begin by highlighting our experiences in the field and the challenges both prior to and during data collection, with special emphasis on an overarching theme or challenge that emerged within a given research team and the workaround pursued to mitigate it.
Case study 1: overcoming fear of online interviewing in the Philippines
Fear is perhaps the best word to describe our collective feeling upon realizing that an online shift was inevitable in order to collect data for ‘Project SALUBONG: Building Vaccine Confidence via Empathy and Narratives’ in the Philippines. We feared how review boards, fellow scientists and research participants would react, particularly because vaccines are a controversial topic, and we felt that controversial topics necessitate direct, in-person engagement. Fear also describes the perspective of our interview teams in terms of engaging with online platforms. Several of our younger data collectors are tech-savvy, and highly conversant with the nuances of tech and ‘tech speak’; they understand toggling, and amplify their communication styles with hashtags and emojis. Meanwhile, many of our older staff members are self-proclaimed ‘technophobes’ who felt overwhelmed by the number of buttons and navigation links on mobile devices and computers. We addressed these fears head-on. We modified trainings to include modules on computer applications, video calling platforms and online voice recorders, as well as data backup and protection procedures. To train interviewers, we used Zoom breakout rooms, which allowed interviewers to practice interviewing techniques in different groups, with and without supervision from trainers, but we always ensured that a tech-support person was on hand to resolve any tech-related snafus. We practiced recruiting, consenting and interviewing online, including modules on ‘tech disruptions’ so that research assistants would have to develop workarounds if a screen froze or a call dropped. We also developed a phone script to facilitate the recruitment process (see online supplemental file 1) and trained our techno-reticent researchers on multiple platforms that participants described preferring (eg, Facebook Messenger, Zoom, Google Meet or Skype). For consenting, in lieu of meeting participants in person and establishing informed consent by signature or fingerprint, participants signed consent forms remotely during a recorded video call, and shared a ‘selfie’ with the signed form. To ensure participants’ internet connectivity throughout the interview, we purchased mobile data packages and sent them to participants in advance, free of charge. Lastly, to bolster transnational collaboration amid travel restrictions, we conducted systematic debriefings via Zoom at the end of each day of data collection to share experiences and improve study procedures.28
Case study 2: allaying respondent suspicions and building mobile rapport in Zambia
Our study sought to understand care-seeking experiences and preferences among newly diagnosed (<3 weeks) adult patients with tuberculosis (TB) at three health facilities, identified through health facility registers. We transitioned from the planned in-person to mobile phone-based data collection. When calling potential participants, we first confirmed the identity of the person answering the phone by asking for details that we could verify via facility-based client records, such as their name and recent care-seeking behaviour. Persons called were often suspicious, questioning how and why they were contacted. Providing a clear and comfortable introduction was thus part of rapport building, requiring interviewers to allay concerns by quickly outlining our purpose and explaining how we obtained their phone number. Mentioning their health facility in the introduction ‘signaled’ the interview topic, leading some to immediately decline participation. For others, the association with the health facility built trust and credibility, including allowing participants to confirm the study’s aim with facility staff prior to participation. Additional rapport-building followed usual in-person techniques of answering participant questions, listening carefully, starting with comfortable topics, and using third-person examples for sensitive questions. We had thought phone interviews might be shorter, or that data gathered by phone might be less forthright or revealing. In fact, this was not the case. In comparison to in-person in-depth interviews (IDIs), participants’ tone of voice and the detailed narration of their experiences suggested that, for many respondents, it was easier to discuss sensitive topics and challenging life experiences while not in the physical presence of another person. Rapport extended beyond the initial interview, with several participants seeking TB or COVID-19 information from researchers during or after the call (in order to provide consistent information, we created COVID-19 interviewer scripts that included referral phone numbers). To prevent possible problems, early in the interview we discussed dates and/or times for a follow-up call in case of an abbreviated interview due to network or phone battery challenges, and we collected details required for mobile money reimbursement. Regular research team debriefs over Zoom and memos written within 24–48 hours post interview helped us to address challenges in real time.
Case study 3: rapid recruitment of respondents for remote interviews in India
Our study aims to provide immediate, actionable evidence to inform the government’s efforts to leverage the private health sector’s capacity to meet the health needs of poor and vulnerable populations, like migrants, who have been disproportionately affected by COVID-19 in Uttar Pradesh (UP), India. Given the diversity of private health providers who play a critical role in providing services to these populations—ranging from small nursing homes and single-doctor clinics to experience-based practitioners, such as rural medical practitioners (RMPs)—we have had to adopt different strategies to remotely recruit respondents for phone and online interviews during the pandemic. First, we identified professional networks of private health providers (eg, allopathic, Ayurveda, Yoga & Naturopathy, Unani, Siddha and Homoeopathy) and experience-based practitioners at state and district levels. Building rapport with the heads of health associations and district health leadership over multiple phone conversations and engaging them as key informants proved to be a useful strategy to recruit both providers from small hospitals and nursing homes and experience-based practitioners across the study sites in UP. We complemented this strategy by identifying other small and larger hospitals through UP’s Health Management Information System and cold calling them using a recruitment script designed to introduce the research objectives as well as establish researcher and institutional identity. We found our institutional affiliation with Johns Hopkins University brought legitimacy to our interactions with respondents whom we had directly approached. Lastly, we relied on snowball sampling as an important recruitment strategy and found it to be especially effective for identifying single-doctor clinicians, as well as gaining their trust in interviews. In addition, snowball sampling was particularly important for reaching RMPs, given our inability to conduct an in-person mapping exercise to identify them. Overall, conducting remote interviews has allowed for an unexpected level of speed and flexibility with scheduling. Often our respondents have been willing to participate in a phone interview on the same day or the next, and they have been willing to schedule interviews outside normal working hours, for example, during evenings and weekends. Furthermore, with data collectors based across time zones, we have had a unique opportunity to schedule interviews during early mornings, afternoons and late evenings, per the respondents’ convenience.
Case study 4: addressing interview fatigue in Uganda
‘Musawo [health care worker], these questions are many’. This statement was featured in one of our first in-person interviews, conducted prior to the national lockdown that halted data collection. Interviews were running well over an hour, and some participants seemed impatient by the end, with responses becoming thin. Our study uses a variety of qualitative methods to engage participants on the often difficult-to-discuss topic of mental health among people living with HIV in South-western Uganda. As we navigated shifting to telephone-based data collection, we were particularly concerned about fatigue and patience based on experiences in prior interviews. Surely participants would be more likely to get fatigued, impatient and distracted over the phone, and now we would not be able to see it. We shortened our guides, but wondered if it was enough. We had also lost our ability to use a timeline visual that we had developed. It had centred the interviews and worked well. It was now condensed into a script—more added time! To address these concerns, interviewers developed strategies for explaining the timeline by first summarising the points on the timeline and stating that they would walk through time points in chronological order. Interviewers continued to keep a hardcopy of the timeline in front of them during the interview, allowing the tool to guide questions. We discussed plans in case participants wanted to cut interviews short or seemed tired, such as having a pre-agreed back-up time, and considered whether we should split the interviews into two sessions. When recruiting participants, we stressed that they should find a comfortable and private place for the interview. To build rapport, we chatted briefly about the rainy season and the well-being of their family, and checked in verbally throughout interviews: ‘Are you still doing ok?’, ‘Is the time alright for you?’. To our surprise, interviews ran over an hour but participants were not fatigued, with rich responses continuing through to the end of the interviews. Only one person has refused participation to date.
Adapted qualitative components amid the pandemic
The continuing need for qualitative interviewing to personalize and adapt during the pandemic suggests unlearning and re-learning some of the traditional approaches that have shaped the discipline. In table 2, we break down the deceptively ‘simple’ act of remote interviewing across all of our case study settings and by study phases (from training data collection teams to conducting debriefings post-interviews), using succinct bullet points to guide qualitative research teams as they collect data remotely.
Table 2.
Challenges exacerbated by remote approaches, how we mitigated these in our studies, and lessons learned, presented for each research phase: data collector training; respondent recruitment; consent; preparing for the interview; conducting the interview; and debriefing teams post interview.
Notable challenge: accessing rural and remote populations
We note that in many settings, rural populations are less likely to have the mobile and/or internet access that facilitated enrollment in our case studies. In Uganda, participants (who are people living with HIV) were drawn from an open, population-based cohort study.29 Cohort study participants are asked to provide a telephone number, even if they themselves do not own the phone. Sampling from this existing study, with robust procedures in place to obtain contact information, increased our ability to reach participants, particularly those in rural areas. Our Uganda-based study is focused on eliciting local models of mental health, and although remote data collection may limit the range of perspectives, we feel we are still able to achieve our objectives despite being unable to enroll individuals who lack telephone access. Given the rapid proliferation of mobile technologies, even in rural settings,30 strategies beyond cohort designs to engage participants could include making multiple recruitment attempts at different times of day and over a period of days to reach someone when they are in signal range, supporting access through community healthcare workers and others in closer geographical proximity, and/or scheduling contact for a time when participants can share a mobile device. In India, identifying private providers located in rural areas was difficult in the absence of an existing roster of providers. Once we were able to establish contact with one or two providers through snowball sampling, however, the lack of access to mobile phones or internet connectivity was not a substantial barrier to conducting remote interviews.
Unanticipated benefits of remote data collection
Beyond challenges, remote data collection presents unforeseen benefits and opportunities. These range from direct study benefits (eg, faster recruitment) to broader impacts such as reduced carbon dioxide emissions (table 3).
Table 3.
Video interviewing | Oftentimes, we code transcripts. Verbatim transcripts are an excellent way to tease out verbalized features of a conversation. However, much of the depth of a conversation may be lost because ‘silent’ communication is not captured during transcription. Videos allow us to code not just the text, but also much of the body language and emotional texture of an interview in a manner that may not otherwise be possible. |
Recruitment | Recruitment was generally faster, particularly in urban settings or settings with strong internet or mobile access. Additionally, we did not experience a higher refusal rate compared with face-to-face data collection. |
Lower costs | Face-to-face interviewing requires travel across several locations, some of which may be in hard-to-reach rural areas, incurring high financial costs (ie, vehicle access, fuel, per diems and accommodation for the research team). Remote data collection can be done on a reduced budget. |
Minimization of environmental dilemmas | Online and phone-based approaches reduce ecological and carbon footprints because researchers and teams are not travelling to/within countries. |
Reduced awkwardness and embarrassment on sensitive topics | Our data suggest that, for some respondents (and possibly also for some research assistants), discomfort seems to be reduced when using remote formats. Discussion of sensitive information was often much easier remotely. |
Skills building | Research assistants appreciated learning how to use remote technologies, representing an added skill that is transferrable to several aspects of their professional and personal lives. |
Expanded data collection opportunities | Remote data collection may expand opportunities for participation to individuals who would not have been able to travel to enrol in a study. Additionally, investigators and students who are not based locally have an increased ability to participate in and lead data collection activities. |
Conclusion
We found that conducting qualitative research remotely can initially be daunting, as it requires diverging from common and familiar procedures both prior to and during data collection. Some of our researchers and participants were hesitant—and even technophobic—at the outset of the process. However, with new and adapted procedures, comprehensive training, continuous debriefings to address emerging issues, and increasing familiarity with processes, it was possible to collect high-quality data. Remote data collection allowed broad and rich participation in each of our case studies, proving effective for our populations of interest. We caution, however, that there may be challenges reaching participants in areas where telephone or internet access is poor, requiring inventive strategies to improve enrollment or requiring that researchers be forthright about recruitment limitations. In our view, remote data collection is not wholly a replacement for in-person endeavours, but it is a highly beneficial complement to such approaches. We plan to incorporate online and mobile data collection into our future research efforts, regardless of pandemic-related restrictions.
Acknowledgments
The authors wish to acknowledge all their data collection and project teams. In the Philippines: Mila Aligato, Jhoys Landicho-Guevarra, Jeniffer Landicho, Vivienne Endoma, Thea Andrea Bravo, Jonas Wachinger, Kate Bärnighausen, Marianette Inobaya, Jerric Rhazel Guevarra, and Nicanor de Claro III. In Zambia: Besa Chibwe, Chansa Chilambe, Esther Hamweemba, Herbert Nyirendra, Jenala Chipungu, Kabwe Mwamba, Kasapo Lumbo, Lloyd Chifunda, Mainza Syulikwa, Marksman Foloko, Mwati Chipungu, Njekwa Mukamba. In Uttar Pradesh, India: Sara Bennett and Priyanka Das. In Uganda: Caitlin Kennedy, Fred Nalugoda, and Neema Nakyanjo.
Footnotes
Handling editor: Seye Abimbola
Twitter: @shannonamcmahon
Contributors: MDCR and SAM conceived the study and wrote the first draft of the manuscript. CM, AM, SH, NSW, WD, AS and LKB contributed to the writing of the case studies and edited the manuscript. SAM supervised all the writing and editing of the manuscript. All authors have read, critically revised the paper, and approved the final version of the manuscript.
Funding: This work was supported, in whole or in part, by the Bill & Melinda Gates Foundation (OPP1217275). Under the grant conditions of the Foundation, a Creative Commons Attribution 4.0 Generic License has already been assigned to the Author Accepted Manuscript version that might arise from this submission. The authors further acknowledge the funding support received for the research projects. In the Philippines: Global Grand Challenges, Bill and Melinda Gates Foundation. In Zambia: UCSF Gladstone, Bill and Melinda Gates Foundation, Alliance for Health Policy and Systems Research, Vittol Foundation, NIH and the CDC. In India: Johns Hopkins Alliance for A Healthier World. In Uganda: The Johns Hopkins Catalyst Awards. Author NW was supported by training grant T32 MH103210 from the National Institute of Mental Health.
Disclaimer: The funders had no role in the decision to publish, or preparation of the manuscript. The content is the responsibility of the authors and does not necessarily represent the views of any funder.
Competing interests: None declared.
Provenance and peer review: Not commissioned; externally peer reviewed.
Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.
Data availability statement
No data are available from this practice paper.
Ethics statements
Patient consent for publication
Not required.
References
1. McCoyd JLM, Kerson TS. Conducting intensive interviews using email. Qualitative Social Work: Research and Practice 2006;5:389–406. 10.1177/1473325006067367
2. Green J, Thorogood N. Qualitative methods for health research. Sage, 2018.
3. Saldaña J, Omasta M. Qualitative research: analyzing life. Sage Publications, 2016.
4. Lincoln YS, Guba EG. Naturalistic inquiry. Newbury Park, CA: Sage Publications, 1985.
5. Lincoln YS, Guba EG. But is it rigorous? Trustworthiness and authenticity in naturalistic evaluation. New Directions for Program Evaluation 1986;1986:73–84. 10.1002/ev.1427
6. Tobin GA, Begley CM. Methodological rigour within a qualitative framework. J Adv Nurs 2004;48:388–96. 10.1111/j.1365-2648.2004.03207.x
7. Redlich-Amirav R, Higginbottom G. New emerging technologies in qualitative research. The Qualitative Report 2014;19:1–14.
8. Archibald MM, Ambagtsheer RC, Casey MG, et al. Using Zoom videoconferencing for qualitative data collection: perceptions and experiences of researchers and participants. Int J Qual Methods 2019;18. 10.1177/1609406919874596
9. James N, Busher H. Credibility, authenticity and voice: dilemmas in online interviewing. Qualitative Research 2006;6:403–20. 10.1177/1468794106065010
10. Creswell J, Poth C. Qualitative inquiry and research design: choosing among five approaches. SAGE Publications, 2016.
11. Carr ECJ, Worth A. The use of the telephone interview for research. NT Research 2001;6:511–24. 10.1177/136140960100600107
12. Greenfield TK, Midanik LT, Rogers JD. Effects of telephone versus face-to-face interview modes on reports of alcohol consumption. Addiction 2000;95:277–84. 10.1046/j.1360-0443.2000.95227714.x
13. Sweet L. Telephone interviewing: is it compatible with interpretive phenomenological research? Contemp Nurse 2002;12:58–63. 10.5172/conu.12.1.58
14. Sturges JE, Hanrahan KJ. Comparing telephone and face-to-face qualitative interviewing: a research note. Qualitative Research 2004;4:107–18. 10.1177/1468794104041110
15. Deakin H, Wakefield K. Skype interviewing: reflections of two PhD researchers. Qualitative Research 2014;14:603–16. 10.1177/1468794113488126
16. Krouwel M, Jolly K, Greenfield S. Comparing Skype (video calling) and in-person qualitative interview modes in a study of people with irritable bowel syndrome – an exploratory comparative analysis. BMC Med Res Methodol 2019;19:219. 10.1186/s12874-019-0867-9
17. Gibson K. Bridging the digital divide: reflections on using WhatsApp instant messenger interviews in youth research. Qual Res Psychol 2020;14:1–21. 10.1080/14780887.2020.1751902
18. Kaufmann K, Peil C. The mobile instant messaging interview (MIMI): using WhatsApp to enhance self-reporting and explore media usage in situ. Mob Media Commun 2020;8:229–46. 10.1177/2050157919852392
19. McCoyd J, Kerson T. Conducting intensive interviews using email: a serendipitous comparative opportunity. Qualitative Social Work 2006;5:389–406.
20. Drabble L, Trocki KF, Salcedo B, et al. Conducting qualitative interviews by telephone: lessons learned from a study of alcohol use among sexual minority and heterosexual women. Qual Soc Work 2016;15:118–33. 10.1177/1473325015585613
21. McInroy LB. Pitfalls, potentials, and ethics of online survey research: LGBTQ and other marginalized and hard-to-access youths. Soc Work Res 2016;40:83–94. 10.1093/swr/svw005
22. Beaton B, Perley D, George C, et al. Engaging remote marginalized communities using appropriate online research methods. In: The Sage handbook of online research methods, 2017: 563–77.
23. Rowntree O. The mobile gender gap report 2019. GSM Association, 2019.
24. Sambuli N. Challenges and opportunities for advancing Internet access in developing countries while upholding net neutrality. Journal of Cyber Policy 2016;1:61–74. 10.1080/23738871.2016.1165715
25. Bahia K, Suardi S. The state of mobile internet connectivity 2019. 2019.
26. Mobile internet. ZIFF Davis, LLC, PCMAG digital group. Available: https://www.pcmag.com/encyclopedia/term/mobile-internet#:~:text=0%2D9,IS%20FOR%20PERSONAL%20USE%20ONLY [Accessed 9 Oct 2020].
27. Samuels F. Tips for collecting primary data in a Covid-19 era. 2020.
28. McMahon SA, Winch PJ. Systematic debriefing after qualitative encounters: an essential analysis step in applied qualitative research. BMJ Glob Health 2018;3:e000837. 10.1136/bmjgh-2018-000837
29. Chang LW, Grabowski MK, Ssekubugu R, et al. Heterogeneity of the HIV epidemic in agrarian, trading, and fishing communities in Rakai, Uganda: an observational epidemiological study. Lancet HIV 2016;3:e388–96. 10.1016/S2352-3018(16)30034-0
30. World Bank. The transformational use of information and communication technologies in Africa. 2012.