Abstract
Background
Despite the increasing use of qualitative email interviews by nurse researchers, little is understood about the appropriateness and equivalence of email interviews to other qualitative data collection methods, especially in research on sensitive topics.
Purpose
The purpose is to describe our procedures for completing asynchronous email interviews and to evaluate the appropriateness and equivalency of email interviews to phone interviews in two qualitative research studies that examined reproductive decisions.
Methods
Content analysis guided the methodological appraisal of the appropriateness and equivalency of in-depth, asynchronous email interviews to single phone interviews. Appropriateness was determined by: (a) participants’ willingness to engage in email or phone interviews, (b) completion of data collection in a timely period, and (c) participants’ satisfaction with the interview. Equivalency was evaluated by: (a) completeness of the interview data, and (b) insight obtained from the data.
Results
Of the combined sample in the two studies (N = 71), 31% of participants chose to participate via an email interview rather than a phone interview. The email interviews required an average of 27 to 28 days and 4 to 5 investigator probe-participant response interchange cycles to complete. In contrast, the phone interviews averaged 59 to 61 minutes in duration. Most participants in both the email and phone interviews reported they were satisfied or very satisfied with their ability to express their true feelings throughout the interview. Regarding equivalence, 100% of the email and phone interviews provided insight into decision processes. Although insightful, two of the email interviews and one phone interview contained short or, at times, underdeveloped responses. Participants’ quotes and behaviors cited within four published articles, a novel evaluation of equivalency, revealed that 20% to 37.5% of the citations about decision processes were from email participants, which is consistent with the percentage of email participants.
Conclusions
In-depth, asynchronous email interviews were appropriate and garnered rich, insightful data that augmented the phone interviews. Awareness of the procedures, appropriateness, and nuances of carrying out email interviews on sensitive topics may provide nurse researchers with the ability to obtain thick, rich data that can best advance clinical practice and direct future research.
Keywords: data collection methods, decision research, electronic mail, Internet, qualitative methods
Nurse and social science investigators are increasingly using email interviews to collect qualitative data (Nehls, 2013). Yet, the literature on procedures for carrying out qualitative email interviews, and on comparing email interviews for appropriateness and equivalence with more established qualitative interview methods such as phone interviews, remains limited, especially in research on sensitive topics. To contribute to understanding of email interview intricacies and to prepare for future research, we examined procedures and data collected from our qualitative research studies to ascertain how our email interviews compared to our phone interviews in appropriateness and equivalency. Thus, the purpose of this paper is to describe our procedures for completing asynchronous email interviews and to evaluate the appropriateness and equivalence of email interviews to phone interviews in two qualitative studies that examined reproductive treatment decisions.
Background
As nurses and other investigators increasingly turn toward qualitative email interviews to examine a variety of phenomena and processes, the advantages (e.g., low cost, automatic transcription, increased access to geographically dispersed or hidden populations) and disadvantages (e.g., the effort and willingness to write required of participants, loss of sensory and emotional cues, increased possibility of dropout or discontinuous responses by participants) have begun to emerge (Bowden & Galindo-Gonzalez, 2015; Burns, 2010; Hamilton & Bowers, 2006; Hunt & McHale, 2007; James & Busher, 2006; Meho, 2006; Nehls, 2013). What remains an important consideration for nurses, and is yet to be fully understood, especially in sensitive research, is the quality of the data obtained and the procedures and contexts for using email interviews. Several researchers have sought to address these concerns and have carried out the sparse methodological analyses comparing qualitative email interviews with other methods of qualitative data collection, and their findings are inconsistent. For example, in a seminal paper, Curasi (2001) led a team of senior students who set out to examine Internet shopping behaviors and compared 24 in-depth interviews collected face-to-face with 24 interviews collected via email. During data collection, Curasi’s students sent follow-up emails to obtain further information from some participants after reviewing the initial email responses. Curasi found that data collected from the email interviews contained some very short and very precise responses to the questions posed and, at times, provided more in-depth data than some of the face-to-face interview responses. Cook (2012) described the merit of qualitative email interviews completed with 26 women to understand the meaning of sexually transmitted infections in women’s lives. In this methodological report, Cook found the quality of the email responses to be high, as participants provided rich data that included sensitive disclosures about sexual abuse, rape, abusive ex-partners, and problematic interactions with clinicians. However, Cook’s report could not provide information about comparative data quality because the participants completed all interviews by asynchronous email.
In a sensitive topics study that contained both face-to-face and email interviews, Ratislavová and Ratislav (2014) interviewed 18 Czech women (12 face-to-face and 6 via asynchronous email) to understand grieving processes following perinatal loss. The researchers reported the quality of the email interviews was “slightly poorer” than the face-to-face interviews because the women’s email responses were “more structured and did not involve as much [data] repetition” compared to the face-to-face interviews (p. 455). Adding to the concern that email interviews may provide less insight, Kazmer and Xie (2008) reported that asynchronous email interviews are limited because participants’ responses did not seem to provide adequate insight into detailed thought processes compared to synchronous (e.g., phone, face-to-face) interviews in research about Internet use. Other investigators found that email interviews provided fewer unique ideas than phone or instant messaging responses when examining electronic game-playing practices (Dimond, Fiesler, DiSalvo, Pelc, & Bruckman, 2012).
Our Two Qualitative Research Studies
After receiving Institutional Review Board approvals and obtaining informed consent from all participants, we completed two qualitative research projects using a grounded theory approach. In Study 1, we examined decision processes of 22 genetically at-risk, heterosexual couples (44 individual partners) who were deciding whether to use preimplantation genetic diagnosis (PGD) to prevent the transmission of known genetic disorders to their future children (Drazba, Kelley, & Hershberger, 2014; Hershberger et al., 2012). In Study 2, we examined decision processes of 27 young adult women with cancer who were deciding whether to undergo fertility preservation treatment (egg and embryo freezing) prior to their cancer therapy (Hershberger, Finnegan, Altfeld, Lake, & Hirshfeld-Cytron, 2013; Hershberger, Finnegan, Pierce, & Scoccia, 2013). In both studies, eligible participants were given the choice of completing the in-depth interview by phone (one interview, digitally recorded) or by email (asynchronous interchanges). Regardless of the interview method chosen (i.e., email or phone), all of the interviews were completed by the Principal Investigator (PI; first author) and used a semi-structured interview guide. The interview guides for Studies 1 and 2 contained only slight deviations in language between the email and phone interviews (e.g., requesting participants to “state” responses in phone interviews versus “write” responses in email interviews). Prior to the onset of the two studies, the PI had completed multiple qualitative face-to-face interviews, two qualitative phone interviews, and no email interviews. Of note, partners within the couple dyads in Study 1 were interviewed separately. Once the participant chose the interview method, rapport was established through an introductory email or a phone conversation. Participants were encouraged to ask the PI questions about the study and procedures, which were clarified accordingly. Then, depending on the interview method, the PI either spoke or emailed the primary research question to the participant and allowed the participant to respond. Once the participant responded, additional follow-up questions and probes were posed per the interview guide. For the email interviews, a series of asynchronous, investigator probe-participant response interchanges took place between the PI and the participant. Details of our multifaceted strategies and lessons learned were reported earlier (Hershberger et al., 2011; Ryan et al., 2013).
To obtain participants’ perceptions about satisfaction with the phone and email interviewing, we embedded open-ended methodological appraisal questions into the end of the interview guide, after the participant responded to all the decision-making process questions and probes. The methodological appraisal questions were: “What determined your choice to participate by phone or email?” and, “Describe your level of satisfaction with your ability to express your true thoughts and feelings by participating in the way you did.”
Methods
Content analysis as described by Elo and Kyngäs (2008) guided the methodological analyses. Appropriateness was determined by: (a) participants’ willingness to engage in email or phone interviews, (b) completion of data collection in a timely period, and (c) participants’ satisfaction with the interview. Equivalency was evaluated by: (a) completeness of the interview data, and (b) insight obtained from the data. For appropriateness, we compared the number of participants who chose to complete email interviews versus phone interviews and determined the time needed to complete data collection per interview method. Additionally, the participants’ responses to the methodological appraisal questions about choice of and satisfaction with the interview method were identified, coded, and categorized. For equivalency, the participants’ responses to interview questions about reproductive decisions were analyzed for completeness of the interview data (e.g., responses to interview questions) and for the insight they provided into decision-making processes. As an additional indicator of insight, we compared the number of participants’ quotes and behaviors cited in four of our published articles (two from each study) in which key decision process findings were reported. A simplified sketch of this tallying appears below.
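To illustrate the quantitative side of the appraisal, the following minimal sketch shows how the participant-choice and timeliness indicators can be tallied per interview method. The records and field names are hypothetical illustrations, not the studies’ actual data or analysis code.

```python
from statistics import mean

# Hypothetical interview records (not the studies' actual data): the method each
# participant chose, plus days of asynchronous interchange for email interviews
# and minutes of talk time for phone interviews.
interviews = [
    {"method": "email", "days": 28, "cycles": 5},
    {"method": "email", "days": 9, "cycles": 4},
    {"method": "phone", "minutes": 61},
    {"method": "phone", "minutes": 38},
    {"method": "phone", "minutes": 114},
]

emails = [i for i in interviews if i["method"] == "email"]
phones = [i for i in interviews if i["method"] == "phone"]

# Appropriateness indicator (a): willingness, read as the share choosing each method.
email_share = len(emails) / len(interviews)

# Appropriateness indicator (b): timeliness of data collection per method.
mean_days = mean(i["days"] for i in emails)
mean_cycles = mean(i["cycles"] for i in emails)
mean_minutes = mean(i["minutes"] for i in phones)

print(f"{email_share:.0%} chose email")
print(f"Email: {mean_days:.1f} days, {mean_cycles:.1f} interchange cycles on average")
print(f"Phone: {mean_minutes:.1f} minutes on average")
```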
Results
For both studies combined, the majority of the participants (69%) opted to complete the qualitative interviews by phone. However, the couples in Study 1 chose phone less often (64%) than the young women in Study 2 (78%). Within the couple dyads in Study 1, all but 4 of the couples chose the same interview method (i.e., both partners chose email or both partners chose phone). In three of these 4 couples, the male partner opted for an email interview whereas the female partner opted for a phone interview. In one of these couples, a male partner who chose email also completed a short phone interview to respond to the final round of research questions. See Table 1 for details about the participants’ choices of interview method.
Table 1.
Participant Choice of Interview Method
| Study | Email Interviews | Phone Interviews |
|---|---|---|
| Study 1: Couples’ PGD Study* (N = 22 couples/44 individuals) | 6 couple dyads and 4 individual partners (16 individuals; 9 men, 7 women) (36% of study sample) | 12 couple dyads and 4 individual partners (28 individuals; 13 men, 15 women) (64% of study sample) |
| Study 2: Young Women’s Cancer Study (N = 27 individuals) | 6 young women (22% of study sample) | 21 young women (78% of study sample) |
| Totals for both studies (N = 71 individuals) | 22 individuals (31% of combined sample) | 49 individuals (69% of combined sample) |

*In 4 couples, one partner within the couple dyad chose email and the other partner chose phone.
The amount of time needed to complete the email and phone interviews varied. In Study 1, the number of investigator probe-participant response interchanges needed for the email interviews ranged from 4 to 8 cycles (mean = 4.94); in Study 2, the range was 2 to 6 cycles (mean = 3.83). The number of days needed to complete the interchanges ranged from 9 to 96 (mean = 28) in Study 1 and from 5 to 43 (mean = 27) in Study 2. In comparison, the phone interviews lasted from 38 to 114 minutes (mean = 61 minutes) in Study 1 and from 34 to 114 minutes (mean = 59 minutes) in Study 2.
Regarding completeness of the interviews and insight obtained, all of the email and phone interviews in both studies contained insightful data that added to understanding about decision processes. However, regarding completeness, two of the participants in the email interviews (9% of email interviews) and one participant in the phone interviews (2% of phone interviews), all males, gave short or underdeveloped responses to specific questions or probes posed by the PI. Yet, analytic memos noted that one of the email participants who gave short responses could provide foundational data for a case analysis about decision processes, which indicates insightful, rich data. The number of participants’ quotes and behaviors cited within the four published articles revealed that 20% to 37.5% of the citations about decision processes were from email participants, which is comparable to the percentage of email participants in the studies. Table 2 provides details of the quotes cited in the published articles by interview method.
Table 2.
Quotes Cited in Published Research Articles by Interview Method
| Article, Study, & Total Number of Quotes | Quotes from Email Interviews | Quotes from Phone Interviews |
|---|---|---|
| Article 1: Couples’ PGD Study, Social Science & Medicine, 2012 (16 total quotes) | 6 (37.5% of total) | 10 (62.5% of total) |
| Article 2: Couples’ PGD Study, Journal of Genetic Counseling, 2014* (16 total quotes) | 4 (25% of total) | 12 (75% of total) |
| Article 3: Young Women’s Cancer Study, Journal of Obstetric, Gynecologic, & Neonatal Nursing, 2013 (23 total quotes) | 6 (26% of total) | 17 (74% of total) |
| Article 4: Young Women’s Cancer Study, Research & Theory for Nursing Practice, 2013 (35 total quotes) | 7 (20% of total) | 28 (80% of total) |

*Sample comprised 18 couples (36 individuals) who decided to undergo PGD.
The methodological appraisal questions revealed that all but one participant indicated they were satisfied or very satisfied with their ability to express their true thoughts and experiences through the email or phone interviews. In the negative case, an email participant, further probing revealed that he was not satisfied because of a perceived bias by the investigators toward accepting treatment, rather than dissatisfaction with the interview method or with his ability to express his true thoughts. He wrote, “Sometimes, it feels like they [researchers] are just wondering why we didn’t ‘conduct business’ or go thru with the [PGD] procedure.” Table 3 depicts quotations from participants about their satisfaction and reasons for participating in the interview by email or phone.
Table 3.
Representative Quotes about Satisfaction and Reason for Choosing Interview Method
| Email Participant & Study | Quotation |
|---|---|
| Male, Study 1 | “I was very satisfied with this [email] format. I’m a better writer than I am a speaker, so email gives me time to collect my thoughts and say what I mean rather than saying what comes to mind so that a telephone conversation doesn’t drag on while I’m thinking of exactly what to say.” |
| Female, Study 1 | “I feel I express myself best in writing.” |
| Male, Study 1 | “Responding by email allowed me the time to reflect and ensure that my thoughts and feelings were accurately conveyed.” And, “I found it very satisfying to compose my thoughts…. It was actually therapeutic as I had not allowed myself the time or emotional energy to look at our journey as comprehensively as this study provided.” |
| Female, Study 2 | “I chose to participate by email because, as a general rule, I feel that I can express myself better when I have time to think about what I want to say. If I had participated by phone, I felt that I might have forgotten some important things that I wanted to share.” |

| Phone Participant & Study | Quotation |
|---|---|
| Male, Study 1 | “It’s been great. I’d rather probably talk on the phone than type.” |
| Female, Study 1 | “I feel a high level…. and that’s more of, my thing. I prefer to talk to people on the phone.” |
| Female, Study 2 | “It’s very easy.” And, “I just think it’s more personable. To speak with someone over the phone.” |
Discussion
This methodological analysis found that asynchronous email interviewing is appropriate for use in sensitive topics research on reproductive decisions when used in conjunction with phone interviews. Our findings revealed that a subset of our samples (22% to 36%) preferred email interviews, felt email interviews were suitable or well suited for capturing decision processes, and may not have participated in our studies if email interviews had not been offered.
Based on our analysis, we found the email interviews equivalent to the phone interviews in providing insight into key decision processes, both in the content analysis and in our novel evaluation of the number of participants’ quotes and behaviors cited in our four published articles on key decision-making processes. When examining these data, we were surprised to discover that the percentage of email participants in the studies (36% in Study 1, 22% in Study 2) almost mirrored the percentage of citations drawn from email interviews (37.5% in Article 1, 26% in Article 3). This finding was particularly insightful because Articles 1 and 3 reported the main decision process findings from each of the two studies. These findings add support for email interview data providing rich insight (Curasi, 2001), yet contradict the findings of others such as Kazmer and Xie (2008), who found that although email interviews can provide in-depth data, they do not provide adequate insight into thought processes. This inconsistency may be explained by differences in study design, procedures, and/or instruments. For example, in our two studies, one interviewer (i.e., the PI), who is the research instrument, completed all interviews, whereas the research instrument varied in the Kazmer and Xie studies. Keeping the interview guides consistent across the two data collection methods (email and phone) likely contributed to the equivalency of the interviews. Our use of multiple (between 2 and 8) investigator probe-participant response interchanges was higher than other investigators’ reports of the number of follow-up emails used (Dimond et al., 2012; Kazmer & Xie, 2008). In addition, in Study 1, the dyadic couples’ design may have added to the richness of these data. The relatively low percentage of email interviews in our studies (22% to 36%) further confounds understanding but may provide a baseline for triangulating the appropriate percentage of email to phone interviews in studies examining decision processes on sensitive topics.
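For concreteness, the near-mirror correspondence can be computed directly from the counts reported in Tables 1 and 2 (a worked check, not an additional analysis):

```latex
\[
\frac{16}{44} \approx 36\% \ \text{(Study 1 email participants)}
\quad\text{vs.}\quad
\frac{6}{16} = 37.5\% \ \text{(Article 1 email quotes)}
\]
\[
\frac{6}{27} \approx 22\% \ \text{(Study 2 email participants)}
\quad\text{vs.}\quad
\frac{6}{23} \approx 26\% \ \text{(Article 3 email quotes)}
\]
```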
Other nuanced and noteworthy findings were that males chose email interviewing more frequently than females in our couples study (Study 1) and that the young women (Study 2) chose email interviews less frequently than our couples, despite the older mean age of the couples’ sample (mean age = 33 years) compared to the young women’s sample (mean age = 28.7 years). We would be remiss not to discuss the knowledge we gained from embedding the methodological appraisal questions into our studies. In addition to providing analytic data about the email and phone interviews, the insight we gained from the participant who reported a perceived bias by the investigators about PGD treatment decisions was invaluable. Because we purposely designed our studies to maintain neutrality about treatment decisions, this insight, obtained through an email interview, prompted us to reexamine our communication strategies with participants across studies. It also exemplifies a reported benefit of email interviews, whereby participants may feel more comfortable or safer about self-disclosure and more likely to express true feelings and experiences (Bowden & Galindo-Gonzalez, 2015; Cook, 2012; Meho, 2006).
Although a benefit of email interviews is the avoidance of transcription, the mean duration of our asynchronous email interviews (27 to 28 days) and the number of investigator probe-participant response interchanges (about 4 to 5 cycles on average) required a high level of organizational skill and effort. This effort, taken together with the finding that participants in two of the email interviews (9% of all email interviews) did not respond fully to specific probes, warrants discussion. The lack of full responses by 9% of email participants compared with 2% of phone participants may be related to the unique skills required of interviewers when managing the asynchronous interchanges. Other investigators have noted similar challenges of coordinating email interview tasks, scheduling, and the potential for information overload (Hunt & McHale, 2007; Meho, 2006), which may explain why other investigators have limited their use of email interchanges or follow-up questions and probes. When designing future studies, investigators may want to consider the time and effort needed to carry out interviews of this type and plan effective organizational strategies, such as the tracking sketch below, to maximize the completeness of the data obtained. Finally, it is worth noting that, in addition to the skills of the interviewer, participants themselves vary in their ability to provide in-depth responses regardless of interview method. Morse (2010) reminds investigators that not all participants “… who volunteer to participate in your study will have all of the characteristics of an excellent participant” (p. 231).
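As one illustration of such a strategy (a minimal sketch of our own devising, not a procedure drawn from the studies above; all identifiers and dates are hypothetical), each investigator probe-participant response cycle can be logged per participant so that outstanding probes and stalled interviews are easy to spot:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Interchange:
    """One investigator probe-participant response cycle."""
    probe_sent: date
    response_received: date | None = None  # None until the participant replies

@dataclass
class EmailInterview:
    participant_id: str
    interchanges: list[Interchange] = field(default_factory=list)

    def open_probes(self) -> int:
        """Count probes still awaiting a participant response."""
        return sum(1 for i in self.interchanges if i.response_received is None)

    def days_elapsed(self) -> int:
        """Days from the first probe to the most recent activity."""
        start = self.interchanges[0].probe_sent
        latest = max(i.response_received or i.probe_sent for i in self.interchanges)
        return (latest - start).days

# Hypothetical participant: one completed cycle, one probe outstanding.
interview = EmailInterview("P01", [
    Interchange(date(2024, 3, 1), date(2024, 3, 5)),
    Interchange(date(2024, 3, 6)),
])
print(interview.open_probes())   # 1 -> a follow-up reminder may be warranted
print(interview.days_elapsed())  # 5 days of data collection so far
```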
Limitations and Future Research
A limitation is the self-selective design that allowed participants to determine their own interview method (email or phone), which may have allowed personal characteristics or other confounding factors to influence the findings. Nevertheless, we were interested in knowing about participants’ preferences for interviews, and thus randomization to interview method was not possible. Another limitation is the varied perspectives among qualitative researchers about evaluating in-depth qualitative interviews for insight (Borer & Fontana, 2012). To minimize this limitation, we carried out previously reported and novel analyses to gain knowledge of the appropriateness and equivalency of our email and phone interviews. We encourage qualitative investigators to critique and add to our analytic approach and the framework we developed.
We acknowledge that the PI’s lack of experience in conducting email interviews may have negatively impacted data quality. However, we did not find evidence supporting this notion. One explanation could be that the PI’s extensive experience with face-to-face interviews, and relatively little experience with phone interviews, prior to launching the two studies mitigated any data discrepancies between the phone and email interviews. Another explanation could be related to a challenge in our research, which is accruing an appropriate sample (Hershberger et al., 2011); thus, we are typically thrilled to have participants engage in our research, whether by phone or email interviews. It is possible that participants reacted to the PI’s conscious or unconscious written cues and diligence about their participation during the email interviews, which influenced data quality. Bjerke (2010) has drawn attention to the importance of investigator biases and behaviors and the impact these can have on qualitative data collection and analysis, based on her experience with email and face-to-face interviews.
Our findings shed light on several areas for future research. Because we found subtle differences in how males and females participated in email interviews, future research examining gender differences in preferences and responses to email and other emerging technological interviews (e.g., Skype, instant messaging) would be beneficial. Another area for research builds on our findings regarding the appropriateness of email interviews. Nursing and other practice disciplines may want to consider whether health-focused research topics differ from other social science research and, if so, whether the email interviewing procedures, skills, or behaviors of the interviewer need to differ in these instances. Future research that investigates, addresses, or analyzes these questions will advance the science of qualitative research, likely contributing to nursing research and ultimately improving clinical practice.
Highlights.
Knowledge about the quality of qualitative email interviews is limited and inconsistent.
Procedures for completing asynchronous email interviews are described.
A novel framework for evaluating and comparing the appropriateness and equivalence of qualitative interviews is provided.
Findings demonstrated that email interviews, when used with phone interviews, are appropriate in sensitive topics research and garner rich, insightful data.
Acknowledgments
Support for this research was provided by the National Institutes of Health, National Institute of Child Health and Human Development and the Office of Research on Women’s Health (K12 HD055892), National Institute of Nursing Research (R03 NR010351 & P30 NR010680) and the University of Illinois at Chicago College of Nursing Dean’s Fund. We thank Rebekah Hamilton, PhD, RN, FAAN for her guidance on implementing the email interviews and Cecelia Roscigno, PhD, RN, CNRN for her comments on a preliminary draft of this paper.
Contributor Information
Patricia E. Hershberger, College of Nursing and College of Medicine, University of Illinois at Chicago, Chicago, Illinois, 60612, United States.
Karen Kavanaugh, Senior Nurse Scientist, Children’s Hospital of Wisconsin, Milwaukee, Wisconsin 53226, United States.
References
- Bjerke TN. When my eyes bring pain to my soul, and vice versa: Facing preconceptions in email and face-to-face interviews. Qualitative Health Research. 2010;20(12):1717–1724. doi: 10.1177/1049732310375967.
- Borer MI, Fontana A. Postmodern trends: Expanding the horizons of interviewing practices and epistemologies. In: Gubrium JF, Holstein JA, Marvasti AB, McKinney KD, editors. The SAGE Handbook of Interview Research: The Complexity of the Craft. 2nd ed. Thousand Oaks, CA: Sage; 2012. pp. 45–60.
- Bowden C, Galindo-Gonzalez S. Interviewing when you’re not face-to-face: The use of email interviews in a phenomenological study. International Journal of Doctoral Studies. 2015;10:79–92.
- Burns E. Developing email interview practices in qualitative research. Sociological Research Online. 2010;15(4). doi: 10.5153/sro.2232.
- Cook C. Email interviewing: Generating data with a vulnerable population. Journal of Advanced Nursing. 2012;68(6):1330–1339. doi: 10.1111/j.1365-2648.2011.05843.x.
- Curasi CF. A critical exploration of face-to-face interviewing vs. computer-mediated interviewing. International Journal of Market Research. 2001;43(4):361–375.
- Dimond JP, Fiesler C, DiSalvo B, Pelc J, Bruckman AS. Qualitative data collection technologies: A comparison of instant messaging, email, and phone. Proceedings of the 17th ACM International Conference on Supporting Group Work. 2012:277–280. doi: 10.1145/2389176.2389218.
- Drazba KT, Kelley MA, Hershberger PE. A qualitative inquiry of the financial concerns of couples opting to use preimplantation genetic diagnosis to prevent the transmission of known genetic disorders. Journal of Genetic Counseling. 2014;23(2):202–211. doi: 10.1007/s10897-013-9638-7.
- Elo S, Kyngäs H. The qualitative content analysis process. Journal of Advanced Nursing. 2008;62(1):107–115. doi: 10.1111/j.1365-2648.2007.04569.x.
- Hamilton RJ, Bowers BJ. Internet recruitment and e-mail interviews in qualitative studies. Qualitative Health Research. 2006;16(6):821–835. doi: 10.1177/1049732306287599.
- Hershberger PE, Finnegan L, Altfeld S, Lake S, Hirshfeld-Cytron J. Toward theoretical understanding of the fertility preservation decision-making process: Examining information processing among young women with cancer. Research and Theory for Nursing Practice. 2013;27(4):257–275. doi: 10.1891/1541-6577.27.4.257.
- Hershberger PE, Finnegan L, Pierce PF, Scoccia B. The decision-making process of young adult women with cancer who considered fertility cryopreservation. Journal of Obstetric, Gynecologic, and Neonatal Nursing. 2013;42(1):59–69. doi: 10.1111/j.1552-6909.2012.01426.x.
- Hershberger PE, Gallo AM, Kavanaugh K, Olshansky E, Schwartz A, Tur-Kaspa I. The decision-making process of genetically at-risk couples considering preimplantation genetic diagnosis: Initial findings from a grounded theory study. Social Science & Medicine. 2012;74(10):1536–1543. doi: 10.1016/j.socscimed.2012.02.003.
- Hershberger PE, Kavanaugh K, Hamilton R, Klock SC, Merry L, Olshansky E, Pierce PF. Development of an informational web site for recruiting research participants: Process, implementation, and evaluation. Computers, Informatics, Nursing. 2011;29(10):544–551. doi: 10.1097/NCN.0b013e318224b52f.
- Hunt N, McHale S. A practical guide to the e-mail interview. Qualitative Health Research. 2007;17(10):1415–1421. doi: 10.1177/1049732307308761.
- James N, Busher H. Credibility, authenticity and voice: Dilemmas in online interviewing. Qualitative Research. 2006;6(3):403–420. doi: 10.1177/1468794106065010.
- Kazmer MM, Xie B. Qualitative interviewing in Internet studies: Playing with the media, playing with the method. Information, Communication & Society. 2008;11(2):257–278. doi: 10.1080/13691180801946333.
- Meho LI. E-mail interviewing in qualitative research: A methodological discussion. Journal of the American Society for Information Science and Technology. 2006;57(10):1284–1295. doi: 10.1002/asi.20416.
- Morse JM. Sampling in grounded theory. In: Bryant A, Charmaz K, editors. The SAGE Handbook of Grounded Theory. Thousand Oaks, CA: Sage; 2010. pp. 229–244.
- Nehls K. Methodological considerations of qualitative email interviews. In: Sappleton N, editor. Advancing Research Methods with New Technologies. Hershey, PA: IGI Global; 2013. pp. 303–315.
- Ratislavová K, Ratislav J. Asynchronous email interview as a qualitative research method in the humanities. Human Affairs. 2014;24(4):452–460. doi: 10.2478/s13374-014-0240-y.
- Ryan CJ, Choi H, Fritschi C, Hershberger PE, Vincent CV, Hacker ED, Wilkie DJ. Challenges and solutions for using informatics in research. Western Journal of Nursing Research. 2013;35(6):722–741. doi: 10.1177/0193945913477245.
