Social Work Research. 2016 Apr 2;40(2):83–94. doi: 10.1093/swr/svw005

Pitfalls, Potentials, and Ethics of Online Survey Research: LGBTQ and Other Marginalized and Hard-to-Access Youths

Lauren B McInroy 1
PMCID: PMC4886272; PMID: 27257362

Abstract

Online research methodologies may serve as an important mechanism for population-focused data collection in social work research. Online surveys have become increasingly prevalent in research inquiries with young people and have been acknowledged for their potential in investigating understudied and marginalized populations and subpopulations, permitting increased access to communities that tend to be less visible—and thus often less studied—in offline contexts. Lesbian, gay, bisexual, transgender, and queer (LGBTQ) young people are a socially stigmatized, yet digitally active, youth population whose participation in online surveys has been previously addressed in the literature. Many of the opportunities and challenges of online survey research identified with LGBTQ youths may be highly relevant to other populations of marginalized and hard-to-access young people, who are likely present in significant numbers in the online environment (for example, ethnoracialized youths and low-income youths). In this article, the utility of online survey methods with marginalized young people is discussed, and recommendations for social work research are provided.

Keywords: Internet-based methods; LGBTQ; online methods; survey methods; youths


Online research methodologies are providing increased opportunities for population-focused data collection by enabling researchers to capture the unique and nuanced experiences of populations and subpopulations in new, innovative ways (Andrews, Nonnecke, & Preece, 2003; Buchanan & Hvizdak, 2009; Willis, 2011; Wright, 2005). This growth is timely, as online participation is increasingly ubiquitous (Bracken, Jeffres, Neuendorf, & Atkin, 2009). In the United States, approximately 92% of youths (ages 13 to 17) were online daily in 2014 and 2015 (Lenhart et al., 2015). Online methodologies may be particularly suitable for research inquiries with young people (Denissen, Neumann, & van Zalk, 2010; Hessler et al., 2003); the Internet is a critical “social domain and ... communication tool” (Fox, Morris, & Rumsey, 2007, p. 540) for youths, who are generally quick to incorporate and integrate technology into their lives. Online methodologies have also been found to be effective for investigating marginalized and hard-to-access populations, particularly the lesbian, gay, bisexual, transgender, and queer (LGBTQ) communities (Riggle, Rostosky, & Reedy, 2005), increasing access to populations that tend to be less visible, and thus often less studied, in offline contexts (Andrews et al., 2003; Best, Krueger, Hubbard, & Smith, 2001; Riggle et al., 2005).

Online technologies “reduce or temporarily remove barriers associated with geography, age” (Hillier & Harrison, 2007, p. 84), and marginalization and permit recruitment of previously understudied populations and subpopulations for research (Pascoe, 2011; Willis, 2011). For the purposes of this article, populations and subpopulations refer to communities or groups composed of individuals who share one or more unique social identity markers, geographic spaces, or both. Although numerous methodological approaches have been undertaken with various youth populations online (for example, e-mail-based interviews, content analysis of online accounts), including in social work research specifically (Willis, 2011), this article focuses on the potential of survey research as a mechanism for data collection with marginalized and hard-to-access youth populations. LGBTQ young people are a socially stigmatized, yet digitally active, population for which online surveys offer a critical source of data. The use of online methodologies with this population has been previously addressed in the literature (for example, McDermott & Roen, 2012; Riggle et al., 2005; Willis, 2011).

This article argues that many of the opportunities and challenges of online research with LGBTQ youths may be relevant to research with other populations of marginalized and hard-to-access young people. However, it is important to emphasize that the degree to which individuals and communities, including marginalized populations, are engaged as participants in a particular study is not solely determined by the method of data collection and related sampling strategy. The study context and specific population/subpopulation factors (for example, social exclusion, perceptions of research, legislation) are also critical. Yet, assuming effective sampling, online surveys offer considerable reach to recruit hard-to-access and stigmatized populations by offering accessible alternatives to offline research, potentially enhancing generalizability of study findings to the full population. Online surveys offer opportunities for youths who may experience barriers to offline research opportunities to convey their knowledge, perspectives, and lived experiences (Heath, Brooks, Cleaver, & Ireland, 2009; Willis, 2011; Wright, 2005).

LITERATURE REVIEW

Online Research with Marginalized and Hard-to-Access Youth Populations

Many marginalized and hard-to-access youth populations may be well suited to online research, including those who are geographically isolated (for example, living in rural areas, living abroad); are geographically dispersed or experience barriers to offline research (for example, disabilities, LGBTQ, ethnoracialized, homeless); have experienced institutionalized settings (for example, criminal justice, hospitalization, child welfare); or have participated in high-risk or illegal activities (for example, risky sexual activity, substance use, gang involvement) (Bender, Begun, DePrince, Haffejee, & Kaufmann, 2014; Heath et al., 2009). Yet the approach to data collection and generalizability of the results will necessarily differ depending on the research questions and sampling strategy. Whereas online sampling has been used effectively with smaller, individual studies of youths with disabilities and other hard-to-access populations (Fox et al., 2007; McDermott & Roen, 2012), and in conjunction with mobile devices to collect data from homeless youths (Bender et al., 2014), the use of technologies for sampling stigmatized populations of youths remains understudied.

Internet and Technology Use by Marginalized Youth Populations and Social Work Research

Internet and mobile technology use is substantial and significant for diverse populations of youths in the United States and Canada, though notable variations continue to exist in devices and platforms used, as well as frequency and rates of use (Bender et al., 2014; Lenhart et al., 2015; Steeves, 2014). A study conducted by the Pew Research Center on a nationally representative U.S. sample of 1,060 13- to 17-year-olds and their parents found that Hispanic and African American young people may access the Internet more frequently than their white peers. Furthermore, whereas white youths have the most access to computers (91%) and Hispanic youths have the most access to tablets (62%), African American youths are the most likely to own smartphones (85%) and to go online using mobile devices (100%) (Lenhart et al., 2015). Rural youths (91%) are more likely to access the Internet on a mobile device compared with their urban counterparts (89%), though access is highest among suburban youths (93%). Differences in mobile use are nearly negligible by socioeconomic status (SES), and even in the poorest families (less than $30,000 annually) 86% of youths go online daily. However, youths of lower SES and of higher SES do show a difference in preferred platforms. The former are more likely to use Facebook (49% versus 37%), whereas the latter use Instagram or Twitter most often (Lenhart et al., 2015). Homeless young people also have significant access to technology (including the Internet) (Bender et al., 2014); one recent study of one hundred 18- to 24-year-olds in the United States indicated that nearly half (46%) access it daily and most (93%) access it weekly (Pollio, Batey, Bender, Ferguson, & Thompson, 2013).

With the potential to effectively solicit the distinctive experiences of specific marginalized and hard-to-access populations, online surveys have salient implications for social work researchers. The Code of Ethics of the National Association of Social Workers (2008) calls for social workers to

prevent and eliminate ... discrimination against any person, group, or class on the basis of race, ethnicity, national origin, color, sex, sexual orientation, gender identity or expression, age, marital status, political belief, religion, immigration status, or mental or physical disability. (p. 27, section 6.04[d])

Thus, the social work profession has a commitment to fostering social justice, particularly for marginalized populations. Although benefits exist for online research with general populations, many of the advantages of online surveys (for example, enhanced recruitment, increased anonymity) may be particularly advantageous when engaging in research with marginalized populations whose access to offline research may be limited by the barriers and stigma they experience. Providing a practical discussion and recommendations specifically for social work researchers, this article demonstrates the utility of online survey methods with marginalized and hard-to-access young people.

TYPES OF ONLINE SURVEYS

Online survey methodologies are varied but typically take one of two forms with regard to how the survey is presented, which in turn affects distribution: (1) e-mail-based surveys, in which the survey is e-mailed directly to participants (either in the body of the e-mail or as an attachment) to be filled out and returned by e-mail to the research team, and (2) Web-based surveys, in which the survey is hosted on a Web site and participants' responses are submitted directly through the online platform (Andrews et al., 2003; Gunter, Nicholas, Huntington, & Williams, 2002; Hoonakker & Carayon, 2009). One notable advantage of Web-based surveys, which have become increasingly prevalent, is that data collected are immediately recorded by the online software, potentially permitting greater measurement of midsurvey attrition (which is difficult, if not impossible, to assess in e-mail-based surveys). However, the Web-based approach does not supply the survey directly to individuals as e-mail-based approaches do (Andrews et al., 2003). Researchers must solicit potential participants to navigate to the survey Web site, using methods such as recruiting advertisements. These procedures carry unique difficulties, such as the regulations of platforms where advertisements are posted (Alessi & Martin, 2010).

STRENGTHS OF ONLINE SURVEYS

Online survey methodologies generally permit convenient, timely, and cost-effective research (Bartell & Spyridakis, 2012; Denissen et al., 2010; Denscombe, 2009; Gunter et al., 2002; Wang & Doong, 2010). This is particularly relevant for marginalized, geographically dispersed, hard-to-access, or “socially distant groups” (Davis, Bolding, Hart, Sherr, & Elford, 2004) that may otherwise be difficult or costly to access (Andrews et al., 2003). In addition to the simultaneous collection of quantitative and qualitative data (Riggle et al., 2005), potential conveniences of online surveys, compared with paper (that is, offline) surveys, include the following: (a) easier and faster construction and administration using survey-building programs; (b) numerous approaches to sampling and recruitment, including potential random assignment; (c) easier and faster recruitment of larger samples; (d) potentially improved response rates; (e) abundant design options, including strategies tailoring surveys to individual participants and reducing participant burden (for example, skip patterns for nonpertinent questions); and (f) automated deployment of the survey and capture of responses, permitting improved accuracy and speed of survey completion and data entry. The ability to integrate audiovisual content, including different content for various completion conditions, is also unique (Alessi & Martin, 2010; Andrews et al., 2003; Bartell & Spyridakis, 2012; Best et al., 2001; Denissen et al., 2010; Denscombe, 2009; Hoonakker & Carayon, 2009; Mustanski, 2001; Wang & Doong, 2010; Wright, 2005).
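
Item (e) in the list above, skip patterns, is a design feature that survey-building programs typically automate. The following minimal Python sketch illustrates how such conditional display logic might work; the question set, IDs, and condition format are hypothetical, not drawn from any particular survey platform.

```python
# A minimal sketch of skip-pattern logic that reduces participant burden by
# hiding nonpertinent questions. Question IDs and the condition format are
# hypothetical, not any survey platform's actual API.
QUESTIONS = [
    {"id": "q1", "text": "Have you used social media in the past month?"},
    {"id": "q2", "text": "Which platforms do you use most often?",
     "show_if": ("q1", "yes")},  # only shown if q1 was answered "yes"
    {"id": "q3", "text": "What is your age?"},
]

def applicable_questions(answers):
    """Return the not-yet-answered questions that apply given answers so far."""
    pending = []
    for q in QUESTIONS:
        condition = q.get("show_if")
        applies = condition is None or answers.get(condition[0]) == condition[1]
        if applies and q["id"] not in answers:
            pending.append(q)
    return pending

# A respondent who answered "no" to q1 never sees q2.
print([q["id"] for q in applicable_questions({"q1": "no"})])  # ['q3']
```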

Paper and online survey methodologies are increasingly comparable with regard to reliability, validity, and results garnered (Bartell & Spyridakis, 2012; Denissen et al., 2010). Yet these two approaches may still not be equivalent—as potential differences in responses indicate. Some research has found that online surveys facilitate improved response rates, both for whole surveys and for individual items, including more detailed responses to qualitative questions (Gunter et al., 2002). This is an important consideration when working with underresearched populations. Other research suggests that whereas paper and online surveys have relatively similar response rates on individual items for closed-ended questions, responses to open-ended questions may be negatively affected by online formats (Bartell & Spyridakis, 2012; Denscombe, 2009; Gunter et al., 2002). This issue of response rates is contentious, and may be influenced by contextual elements such as question type and sample composition (Denscombe, 2009).

Anonymity

The anonymity of online surveys may be appealing to both participants and researchers. Many stigmatized youth populations might be more willing to participate online because of the relative anonymity and privacy of the context (Heath et al., 2009). Online research allows participants to feel increased comfort and autonomy and decreased inhibitions to participation as a result of knowing that their contributions will remain confidential and that they have the ability to complete the survey privately (McDermott & Roen, 2012; Willis, 2011). Participants, including youths (McDermott & Roen, 2012), may also be more likely to answer truthfully. People often answer “socially or emotionally sensitive questions” (Bartell & Spyridakis, 2012) more honestly online than when completing paper surveys—and this might be especially true for individuals who may experience stigma in offline research, such as marginalized young people. Potential effects such as social desirability bias caused by the presence of researchers may also be minimized (Denissen et al., 2010; Gunter et al., 2002).

RECRUITMENT AND SAMPLING

There are two overarching approaches to online recruitment: (1) passive and (2) active. In the passive version, potential participants view the research opportunity that has been posted or shared by the researchers on an online platform (for example, Web site, social media site, online group) and choose whether to seek more information or participate in the study (McDermott & Roen, 2012). This approach may foster an enhanced sense of control and ownership (Fox et al., 2007) among youth respondents, as the increased initiative required to opt in encourages active engagement. An important consideration for this type of approach is the cost of posting research opportunities, particularly on some popular identity-specific Web sites or platforms where doing so can be expensive (Alessi & Martin, 2010), which can undercut the low-cost advantage of online methodologies. In the active approach, researchers contact individuals directly with the opportunity to participate, either through their individual accounts on particular online platforms or by e-mail (Bortree, 2005). However, direct messaging is also potentially challenging, as it often requires having access to (or generating) a list of accounts, e-mail addresses, or both. At times, a combination of active and passive recruitment may be most appropriate.

Online recruitment for population-based studies often generates convenience samples, permitting recruitment through some incomplete sampling frames—such as e-mail lists or records of accounts on Web sites—and participant self-selection (Andrews et al., 2003). However, such recruitment strategies frequently produce a more sociodemographically variable sample composition in identity-specific research with youth populations (McDermott & Roen, 2012; Mustanski, 2001), incorporating and investigating subpopulations who may be excluded in offline research. Past difficulties with offline LGBTQ youth sampling led to overreliance on clinical and community-based convenience samples, which tended to be ethnoracially, geographically, and behaviorally homogenous (for example, urban, Caucasian, open about LGBTQ status, actively engaged with offline LGBTQ community) (McDermott & Roen, 2012). Researchers in offline studies also frequently used LGBTQ subsamples drawn from large-scale, general, population-based studies of student populations or retrospective studies with adults to inform LGBTQ youth research—both of which limited in-depth investigation of contemporary concerns (for example, current LGBTQ social campaigns, recent LGBTQ-related legislation) (McDermott & Roen, 2012). Many of these challenges with offline research could apply to other marginalized populations. Marginalized and hard-to-access youths are often excluded from offline research, offline samples may not be sufficiently inclusive of diverse individuals and subpopulations, or issues unique to the population are not adequately addressed (for example, legislation, discrimination/barriers, social/media representation).

The process and implications of selected recruitment approaches should be carefully considered, including minimizing disruptions to users of the online platforms selected (Alessi & Martin, 2010). Online communities and networks of peers have been used to distribute research opportunities and recruit LGBTQ youths. These networks may be effective with many marginalized groups, as young people often know others who share their identities. Yet biases, privacy concerns, or both may be created as certain subgroups of youths could be oversampled and participants may be asked to provide contact information for peers. Privacy may be maintained by asking participants to contact peers directly (Mustanski, 2001). It is essential that researchers use proper online etiquette, or netiquette, for the online populations and communities they are studying to encourage participation (Buchanan & Hvizdak, 2009). This includes not spamming, or excessively contacting, potential participants, either through actively soliciting individuals repeatedly or passively publicizing the study repetitively on the online platforms selected for recruitment (Riggle et al., 2005).

The terminology used in recruitment may affect sample composition. For example, the language and meaning of particular LGBTQ identity terms may vary significantly in different sociocultural contexts (McInroy & Craig, 2012; Riggle et al., 2005; Willis, 2011). Specifically targeting an LGBTQ population, for instance, may result in the exclusion of participants who do not label their same-sex attraction or behavior as explicitly LGBTQ. Factors such as age, race and ethnicity, SES, level of education, and geographic location may affect LGBTQ youths’ self-identification with a particular population (McInroy & Craig, 2012), as well as the self-identification of other marginalized young people (for example, ethnoracialized youths, youths with disabilities). Online surveys should include questions that screen participants, rather than relying on recruitment advertisements alone to ensure that respondents meet inclusion criteria (Riggle et al., 2005). Furthermore, researchers should limit their use of “technical language” (Alessi & Martin, 2010), particularly when recruiting young people. As potential online participant populations become more diverse, individuals are “less interested in ... surveys not salient to their interests” (Andrews et al., 2003, p. 191). Youth studies using online data collection should ensure that research opportunities and recruitment advertisements are as engaging as possible for the participants they hope to access.
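
To make the screening recommendation concrete, the following minimal Python sketch checks hypothetical inclusion criteria (age range, region, and self-identification) within the survey itself before a respondent proceeds; the specific criteria are invented for the example, not drawn from any study.

```python
# A minimal sketch of in-survey screening that verifies inclusion criteria
# rather than relying on recruitment advertisements alone. The criteria
# below are hypothetical examples.
def meets_inclusion_criteria(age, region, self_identifies):
    return 13 <= age <= 17 and region in {"United States", "Canada"} and self_identifies

# Respondents failing any screen are routed out before the main survey.
print(meets_inclusion_criteria(16, "United States", True))  # True
print(meets_inclusion_criteria(19, "United States", True))  # False: outside age range
```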

DISSEMINATION

Although not exclusive to online methodologies, online dissemination is quite compatible with online surveys—providing ways throughout the research process to lend credibility, support activities (for example, recruitment), and mobilize knowledge. The Internet may provide findings in multiple interactive formats (for example, Web sites, online discussion boards or forums, social media networks), incorporating audiovisual content and valuable material that do not meet academic publishing requirements (Duffy, 2000). In social work, online dissemination may also “extend community access to effective interventions” (Paxton, 2013, p. 525), potentially decreasing pressure on offline interventions by providing information directly to individuals. Whereas LGBTQ young people often do not pursue help in offline contexts as a result of discrimination, they frequently seek advice from peers and access resources online (Craig & McInroy, 2014; McDermott & Roen, 2012; Paxton, 2013). This may also be true of other socially marginalized youth populations (for example, ethnoracialized youths, youths engaging in risky or illegal behavior). Provision of online resources related to or derived from research may facilitate information seeking among youths, providing a strong motivation for online dissemination (Paxton, 2013).

CONCERNS REGARDING ONLINE SURVEYS

There are also challenges with online survey approaches, including concerns over methodological quality and equivalence, access issues for participants, and technological drawbacks. Online methodologies should be subject to the same rigorous methodological standards—such as validity and reliability—as offline data collection methods (Stafford & Gonier, 2007; Wang & Doong, 2010). Yet adaptation of offline methodologies to online contexts is continuing to present challenges (Wright, 2005). As mentioned, despite indications of increasing consistency, online and paper surveys may not be equivalent. Concerns remain over the comparability of measures in different formats, and research has indicated that differences remain in samples and outcomes (Bartell & Spyridakis, 2012; Denissen et al., 2010; Denscombe, 2009; Gunter et al., 2002). Variations in digital capabilities and access to technology may also affect the representativeness of results (Denissen et al., 2010). Although this issue is decreasingly relevant with contemporary youth populations in the United States and Canada, as their online participation is nearly universal (Lenhart et al., 2015; Steeves, 2014), some differences may persist that affect the ability to both access and use technology (Pascoe, 2011; Willis, 2011). Online opportunities may remain more limited for youths who have less-educated parents, come from lower SES backgrounds, live in rural areas, or are ethnoracial minorities (Pascoe, 2011; Roberts & Foehr, 2008), potentially influencing online sample composition. However, these indicators may be decreasingly significant as the technological immersion of youths in multiple settings (for example, home, school, community) continues to increase.

Other issues with online surveys include the following: (a) the challenge of developing a representative sampling frame for population-based studies; (b) difficulties in accurately measuring nonresponse and attrition rates (for example, those who view the research opportunity and choose not to participate or those who drop out before completing the survey); (c) issues around anonymity and data security (for example, secure transmission and storage of participant data online); and (d) challenges with digital delivery (for example, ensuring that the research opportunity reaches participants) (Andrews et al., 2003; Bartell & Spyridakis, 2012; Hoonakker & Carayon, 2009; Stafford & Gonier, 2007; Wang & Doong, 2010; Wright, 2005). Individuals changing or using multiple e-mail addresses, rerouting of research inquiries to junk mail folders, and technical difficulties with Web-based survey platforms may hinder research efforts (Andrews et al., 2003; Gunter et al., 2002). Online data collection also makes enforcing conditions for survey completion difficult, and surveys may be completed under nonoptimal conditions; this is also often true of paper surveys (Mustanski, 2001; Riggle et al., 2005; Wang & Doong, 2010). The lack of a comprehensive sampling frame may also not be prohibitive if recruitment focuses on spaces specific to the population (Wang & Doong, 2010), such as recruiting youths through identity-specific platforms.

Measurement of response rates in online research can be a particularly challenging issue, depending on the recruitment approach. There is often no means of tracking the number of viewers of a recruitment advertisement (Riggle et al., 2005), though current survey software may be able to address this, at least partially. For example, survey platforms now have the option to record presurvey and midsurvey attrition (for example, people who look at or partially complete the survey but do not submit it) and are offering increased options for outreach and deployment to potential participants. The greater potential for attrition in online research, as participant dropout may be more likely due to a lack of social pressure to continue participation (Denissen et al., 2010; Denscombe, 2009), may be reconceptualized as an ethical benefit, preventing participant coercion. In response to the challenges of generalizability (when examining a population broadly), some researchers have focused instead on generating diverse samples to increase methodological rigor (Best et al., 2001). Furthermore, online samples may be more representative in many ways (for example, geographic diversity) than offline samples (Mustanski, 2001). Representativeness and the ability to generalize results may also be less critical in targeted population studies (Gunter et al., 2002), such as those undertaken with many marginalized populations.
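
As a concrete illustration of measuring presurvey and midsurvey attrition, the sketch below computes rates from a hypothetical per-visit status log; real survey platforms export different column names and statuses, so this is an assumption-laden illustration rather than any vendor's format.

```python
# A sketch of attrition measurement from a survey platform's export,
# assuming the platform logs one status per visit ("viewed", "partial",
# or "complete"). The column name and statuses are hypothetical.
import pandas as pd

log = pd.DataFrame({"status": ["viewed", "partial", "complete",
                               "complete", "viewed", "partial"]})
counts = log["status"].value_counts()
started = counts.get("partial", 0) + counts.get("complete", 0)

print("Viewed but never started (presurvey attrition):", counts.get("viewed", 0))
print("Started but did not submit (midsurvey attrition):",
      counts.get("partial", 0) / started)
print("Overall completion rate:", counts.get("complete", 0) / len(log))
```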

KEY CONSIDERATIONS FOR ONLINE SURVEYS

Recruitment

Define Target Population

To encourage effective recruitment, an “operational definition” (Riggle et al., 2005) of the intended population and sample for each study should be developed. For example, in an investigation of ethnoracialized young people, (a) the racial or ethnic identities sought, (b) other participation criteria (for example, age range, limits on the geographic region of interest), and (c) the terminology likely to be effective in recruiting the desired sample should be explicated. Definitions also promote more effective evaluation of generalizability (Engel & Schutt, 2014).

Investigate Target Population’s Online Presence

Researchers should review current statistics (if available) on their population of interest’s online engagement and patterns of use before undertaking online research. This may provide important information for sampling and recruitment. For example, the majority of homeless youths access the Internet on a weekly basis, which suggests that online surveys may be an appropriate method of data collection from this population (Bender et al., 2014; Pollio et al., 2013).

Select Platforms Strategically

Online platforms where recruitment is undertaken should be selected for relevance to the research topic or popularity among potential participants (Seymour, 2001), such as recruitment advertisements for young people with disabilities being placed on Web sites and social media groups for that population. Researchers should also carefully weigh the cost of advertising on potential platforms against the potential visibility of the advertisement to the target population.

Reduce Biases

To decrease potential biases in sample composition, it may be most effective to recruit participants across several online platforms (general and identity specific) using a combination of approaches to attempt to generate a more diverse sample (Mustanski, 2001).

Define Meaning of Terms

Researchers should ensure that the terminology used in recruitment materials and the survey is clearly defined for participants, and that, wherever possible, terminology reflects that used by the specific population under consideration. Without clear guidance young people may misinterpret terminology used in recruitment materials and surveys, even if it is commonly used language (Ólafsson, Livingstone, & Haddon, 2013), compromising the sample composition and validity of the results. For example, rural youths may not conceptualize themselves as such and may need specific guidelines to help them determine their membership in the population sought.

Maintain a Web Site

A Web site can act as a hub, providing ongoing information about the study and researchers to interested individuals, encouraging recruitment, and allowing participants to track the research process. This facilitates the building of trust between researchers and participants, which is particularly critical in research with marginalized groups (Riggle et al., 2005). The site could also provide supportive resources, study results, and other relevant information for individuals both during and following completion of the study, making a crucial contribution to an ethical research process by sharing knowledge gained with the community.

Survey Design

Select Appropriate Visual Design

Some research suggests online surveys should be as similar as possible in appearance to paper surveys to prevent stimuli that could affect responses (Wang & Doong, 2010). Yet an advantage of online surveys is the ability to use features such as animation or graphics to enhance presentation and engagement (Andrews et al., 2003; Gunter et al., 2002). These features may also increase response rates, facilitate longer answers to qualitative questions, result in fewer mistakes, and encourage greater disclosure (Gunter et al., 2002)—perhaps especially for young people. However, such features may also affect usability of responses (Andrews et al., 2003), such as for youths with disabilities who require the use of assistive technologies like screen readers. Online surveys should be designed with an awareness of the potential for “selection and information bias” (Bracken et al., 2009) as a result of the chosen formatting.

Construct Questions Carefully

Question construction should be considered with attention to the intended sample population. Qualitative questions often require respondents who are capable of engaging in self-reflexivity and articulating experiences and perceptions (Riggle et al., 2005). In online surveys with youth populations, it may be better to request that respondents “ ‘describe’ rather than ‘explain’ ” (Riggle et al., 2005) their thoughts or observations given their stage of development and cognitive ability and the lack of researcher oversight of survey completion (Ólafsson et al., 2013). Surveys should also always provide an “other” or a “do not know” option to prevent invalid answers—ideally with a write-in option to allow youths to express their experiences (Ólafsson et al., 2013). Of utmost importance is that the survey be understandable (for example, using clear, age-appropriate language) and not of excessive length (Bartell & Spyridakis, 2012; Mustanski, 2001). Ólafsson et al. (2013) provided an excellent discussion of common questions regarding research online with young people.
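
A question definition reflecting these recommendations might look like the following sketch; the field names and the validation helper are illustrative assumptions, not any survey platform's actual schema.

```python
# A sketch of a closed-ended question that always offers "other" (with a
# write-in box) and "do not know" options, as recommended above. Field
# names are illustrative only.
question = {
    "id": "online_activity",
    "text": "Describe how you most often spend your time online.",
    "options": ["Social media", "Games", "Videos", "Schoolwork"],
    "allow_other_write_in": True,  # lets youths express experiences in their own words
    "allow_dont_know": True,       # prevents forced, invalid answers
    "required": False,             # participants may decline to answer
}

def is_valid_response(q, response):
    """Accept a listed option, any write-in, 'do not know', or a refusal."""
    if response is None:
        return not q["required"]
    if response == "do not know":
        return q["allow_dont_know"]
    return response in q["options"] or q["allow_other_write_in"]

print(is_valid_response(question, None))           # True: answering is optional
print(is_valid_response(question, "Fan fiction"))  # True: counted as a write-in
```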

Maximize Response Rates

Issues that might promote participant attrition should be attended to, such as an excessive number of qualitative questions, excessively long surveys with complex or multistep questions, questions without the ability to opt out or to refuse to answer, and lack of usability of the survey interface (Andrews et al., 2003; Hoonakker & Carayon, 2009). For example, ethnoracialized youths’ high use of mobile devices (Lenhart et al., 2015) indicates that online surveys with that population should be compatible with mobile devices. Andrews et al. (2003) discussed numerous design factors to improve response rates: (a) testing the survey’s usability in multiple Web browsers and on multiple devices, (b) prominently displaying a tool that estimates closeness of the survey to completion, (c) tailoring the survey and distribution to the population (for example, recruitment materials and locations, language level, terminology, length), (d) allowing participants to view the whole survey prior to completion, and (e) requesting demographic details at the beginning of the survey.
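
For item (b), a completion estimate needs to account for skip patterns, because the number of applicable questions changes as a respondent answers. The following is a minimal sketch under that assumption; most survey platforms provide such an indicator automatically.

```python
# A sketch of a percent-complete estimate for a progress indicator, where
# the denominator is recomputed as skip patterns resolve. Inputs are
# hypothetical counts supplied by the survey logic.
def completion_estimate(answered_count, applicable_total):
    pct = round(100 * answered_count / max(applicable_total, 1))
    return f"{pct}% complete"

print(completion_estimate(6, 20))  # "30% complete"
```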

Pilot Test

Pilot testing is essential, and both recruitment materials and surveys intended for marginalized youths should be pilot tested with members of that population whenever possible (Ólafsson et al., 2013; Riggle et al., 2005). Andrews et al. (2003) outlined a four-phase piloting process to undertake when developing an online survey prior to release: (1) review by experts, (2) piloting with individuals from the target population, (3) a pilot study, and (4) final proofing. Pilot testing allows researchers to evaluate the age appropriateness and validity of the survey, including suitability of the terminology selected for the population (Ólafsson et al., 2013).

Incentivize

Incentives should be considered, including the practicalities of distributing incentives online (Mustanski, 2001; Wang & Doong, 2010). Not providing incentives may actually produce selection bias, as fewer individuals are willing to participate. Incentives have been found to improve response rates of paper surveys (Andrews et al., 2003; Riggle et al., 2005) and may similarly improve online response rates (Hoonakker & Carayon, 2009). Incentives could also prevent participant attrition (Mustanski, 2001). Internet-based incentives may include individual or lottery-style rewards. Rewards may be monetary or in the form of electronic gift vouchers to online retailers (Mustanski, 2001; Riggle et al., 2005). Incentives often prevent or limit participant anonymity as an individual’s contact information (usually at least an e-mail address) is necessary to distribute the reward (Mustanski, 2001). Participants’ responses and information for incentives (for example, names, addresses) should be stored separately or even collected using two separate, sequential surveys so that responses remain anonymous.
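
One way to keep responses anonymous while still distributing incentives is to write responses and contact details to entirely separate stores with no linking key, as in this minimal sketch; the file names and fields are illustrative, and two separate, sequential surveys achieve the same separation.

```python
# A minimal sketch of storing anonymous responses apart from the contact
# details collected for incentive distribution. No key links the two
# files, so responses remain anonymous. File names are illustrative.
import csv
import uuid

def save_submission(answers, email_for_incentive=None):
    # The random ID below is never written to the incentive file.
    with open("responses.csv", "a", newline="") as f:
        csv.writer(f).writerow([uuid.uuid4().hex] + list(answers.values()))
    if email_for_incentive:
        with open("incentive_contacts.csv", "a", newline="") as f:
            csv.writer(f).writerow([email_for_incentive])

save_submission({"q1": "yes", "q2": "Instagram"}, "participant@example.com")
```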

Consider Technical Practicalities

Technical considerations should be taken into account, such as making the research Web site discoverable by major search engines and maximizing usability across multiple Web browsers and devices, including small screens for mobile viewing and completion (Mustanski, 2001). These steps matter because of youths’ rapidly increasing use of mobile technology (Lenhart et al., 2015) and because youths with privacy concerns, or who are socially marginalized, may feel more comfortable using their personal mobile devices rather than shared computers.

ETHICS

It is critical that the ease of online approaches does not permit complacency with regard to rigorously ethical research (Buchanan & Hvizdak, 2009). Debate exists over whether online research is uniquely risky as compared with offline research (Fox et al., 2007). Ethics implications are also unique to the particular population under consideration. Regardless, online research requires reassessment of the standard ethical approaches to research used in offline inquiries (Fox et al., 2007; Stern, 2003).

Consent

All standard elements of the consent process should be adhered to in online investigations, including provision of information about the research inquiry and clarification of the procedures in place to ensure confidentiality, anonymity, and privacy (Andrews et al., 2003; Flicker, Haans, & Skinner, 2004; Wang & Doong, 2010). One approach to acquiring consent in e-mail-based surveys is to provide forms by e-mail to be signed (or a typed statement provided) and returned (Fox et al., 2007; McDermott & Roen, 2012). With Web-based surveys, consent is frequently obtained by providing the consent document on the first page of the survey and requiring participants to click a button or type out a statement of agreement indicating consent to participation before proceeding. It has been stressed that paper-based methods of consent are no more secure or valid than online methods (McDermott & Roen, 2012).
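
For Web-based surveys, the click-through consent gate described above might be implemented along the following lines. This is a bare sketch assuming a Flask application; the consent text, route names, and survey page are placeholders.

```python
# A bare sketch of a click-through consent gate for a Web-based survey,
# assuming a Flask app. Consent text and routes are placeholders.
from flask import Flask, redirect, render_template_string, request

app = Flask(__name__)

CONSENT_PAGE = """
<h2>Study Information and Consent</h2>
<p>[Full consent document: purpose, procedures, and the measures in place
to ensure confidentiality, anonymity, and privacy.]</p>
<form method="post" action="/consent">
  <button name="agree" value="yes">I agree to participate</button>
</form>
"""

@app.route("/")
def show_consent():
    # Participants must see the consent document before any survey content.
    return render_template_string(CONSENT_PAGE)

@app.route("/consent", methods=["POST"])
def record_consent():
    # Only an explicit affirmative click advances to the survey itself.
    if request.form.get("agree") == "yes":
        return redirect("/survey")
    return redirect("/")

@app.route("/survey")
def survey():
    return "[Survey begins here.]"

if __name__ == "__main__":
    app.run()
```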

The issue of consent is particularly contentious with regard to youths, especially adolescents under 18 years of age. According to the Children’s Online Privacy Protection Act (1998), in the United States children under the age of 13 years are not permitted to participate in online research without active parental/guardian consent. The act also outlines material required in privacy policies when conducting research with young people (Denissen et al., 2010). Typically, active parental consent is also required by research ethics boards (REBs) at institutions for all participants under 18 years of age. However, exceptions may exist “when this conflicts with youths’ emerging desire for privacy and independence” (Denissen et al., 2010, p. 570), or when seeking parental consent may put the adolescent at unnecessary risk. For example, asking LGBTQ youths under 18 years of age to provide parental consent to their participation in research inquiries could put participants at significant risk if their parents are unaware or unsupportive of their LGBTQ status, and may result in less diverse samples (Elze, 2003; Mustanski, 2011; Tufford, Newman, Brennan, Craig, & Woodford, 2012). This issue of parental knowledge and support may also apply to other populations of youths (for example, youths who are homeless or youths engaging in illegal or risky behavior).

Privacy and Anonymity

Issues of privacy and anonymity in online research are complex and multilayered. For example, there is significant debate regarding participant dishonesty, particularly around youth age and individuals participating in a survey multiple times. Researchers typically rely on participant honesty, noting that these methodological issues are also present in offline research approaches (Flicker et al., 2004; McDermott & Roen, 2012; Mustanski, 2001; Riggle et al., 2005). However, the anonymity of online surveys may increase deception by participants, so it has been suggested that strategies for identifying this deception be considered (Mustanski, 2001). For LGBTQ populations (and likely other hard-to-access populations), concern over participant dishonesty is also offset by the significant satisfaction marginalized participants report in being able to provide honest information and experiences when they may not be able to do so offline (Denissen et al., 2010).

Online research also raises new questions regarding what constitutes identifiable information; e-mail addresses or Internet Protocol addresses may be considered recognizable (Buchanan & Hvizdak, 2009; Denissen et al., 2010). Furthermore, perceptions of privacy and anonymity by participants are also complicated by online approaches. Participants in previous online studies have indicated the desire for increased feedback from researchers as well as a more personal and interactive process (Hessler et al., 2003), suggesting they wanted a less anonymous experience. In one e-mail-based survey study of LGBTQ youths, researchers found that participants added unsolicited accounts of their online participation, and several referred researchers to their individual social media accounts for further details (Hillier & Harrison, 2007). Another online study with LGBTQ youths found that participants used e-mail addresses with identifiable information, which is consistent with research suggesting some participants show minimal concern over sharing such information (McDermott & Roen, 2012).

Distress

Though it depends on the research topic, online research may increase encounters with participants in mental, physical, or emotional distress. This is partly due to perceptions of anonymity online, and is also elevated when working with marginalized populations who may experience disproportionate risk of negative outcomes due to their stigmatized status (Stern, 2003). Although legal repercussions for researchers are unlikely, there remains a moral, ethical, and professional responsibility to respond. This issue is complicated for clinical researchers, such as social workers, who possess skills and knowledge regarding distress. Ultimately, responses may need to be determined on a case-by-case basis, depending on the particular situation and the ability to respond given technological limitations (Stern, 2003). One approach is to provide a personalized response (for example, individualized resources and referrals based on survey responses) when participant contact information exists. However, the ability to provide referrals is limited by the online context (Flicker et al., 2004; Willis, 2011). To proactively meet ethical responsibilities, researchers should consider making available relevant resources on the study’s Web site, throughout the survey, and at the end of the survey to provide support for participants (Willis, 2011). This may include placing resources alongside particularly difficult or distressing topics (for example, questions about self-harm, suicidality, and experiences of violence and discrimination).

Care of Data and Limits of Technology

Treatment, storage, and backup of participant data, as well as the privacy policy of online survey platforms, are concerns that should be addressed (Buchanan & Hvizdak, 2009; Hessler et al., 2003). Encryption of data is always recommended and, as mentioned earlier, identifying information and deidentified data should be stored separately (Denissen et al., 2010). Data transmission through certain technologies—such as e-mail—has also been critiqued, as these technologies were not designed for confidential information (Hessler et al., 2003). The risks and difficulties with maintaining data security should be clarified with participants, perhaps particularly for youths who may be especially vulnerable if data are compromised (for example, youths engaging in illegal activities). The limitations of constructing ethical online surveys should also be considered. Survey platforms have been criticized for not allowing participants to skip questions or end their participation without the data already collected being saved (Buchanan & Hvizdak, 2009). Every effort should be made to ensure these functions are available.
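
As a concrete illustration of encrypting data at rest, the sketch below uses the Fernet recipe from the widely used cryptography package. Key management (where the key lives and who can read it) is deliberately omitted here and matters at least as much as the encryption call itself.

```python
# A brief sketch of encrypting stored survey data using the cryptography
# package's Fernet recipe (pip install cryptography). The key must be
# stored securely and apart from the data; that step is omitted here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

record = b"deidentified participant responses"
encrypted = cipher.encrypt(record)  # safe to write to disk or backups
assert cipher.decrypt(encrypted) == record
```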

ETHICS RECOMMENDATIONS

Encourage Credibility

One good strategy to increase credibility and perceived legitimacy is to provide a “third-party guarantee” (Andrews et al., 2003) of the survey by linking to or prominently displaying the REB or department and institution logo (Mustanski, 2001). A professional, comprehensive study Web site may also facilitate a sense of trustworthiness.

Make Age/Population Appropriate

When undertaking research with young people, the ethics materials—such as consent forms—should be provided in a youth-appropriate format with easy-to-understand language, and all efforts should be made to minimize coercion (Flicker et al., 2004). For certain youth populations, additional considerations may be necessary. For example, in populations where youths’ first language is not English, translations of consent forms may be required. Similarly, formal or informal parental consent may not always be appropriate (for example, for LGBTQ youths and homeless youths).

Assess for Dishonesty

Ask questions in several ways to assess for participant deception or random responses to questions (Mustanski, 2001). For example, Flicker et al. (2004) suggested asking a participant’s year of birth and age separately to evaluate age deception.
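
A minimal sketch of that year-of-birth check follows. The one-year tolerance allows for birthdays falling before or after the survey date, and a mismatch flags a case for review rather than proving deception.

```python
# A sketch of the year-of-birth versus reported-age consistency check
# suggested by Flicker et al. (2004). A mismatch warrants review, not
# automatic exclusion.
def age_consistent(birth_year, reported_age, survey_year):
    expected = survey_year - birth_year
    return abs(expected - reported_age) <= 1

print(age_consistent(2001, 15, 2016))  # True: plausible
print(age_consistent(2001, 19, 2016))  # False: flag for review
```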

Outline Limitations

REB applications and informed consent forms should clearly detail potential risks and limitations to confidentiality (Hessler et al., 2003) as well as outline steps to facilitate confidentiality and online safety (for example, deleting browsing history, using private browsing).

CONCLUSION

The potential representativeness of online data should be considered in the context of the rapidly increasing online engagement and mobile technology use of U.S. and Canadian youth populations, while recognizing the unique online participation patterns of diverse youth populations and subpopulations (Bender et al., 2014; Bracken et al., 2009; Lenhart et al., 2015; Steeves, 2014). Online approaches to social work research offer opportunities to make visible the “silenced and invisible voices” (Willis, 2011) of LGBTQ populations and other marginalized and hard-to-access populations in a comparably safe, anonymous context. Although the circumstances of individual studies and populations of interest differ widely, the practical guidelines provided outline important considerations for social work researchers seeking to undertake ethical and methodologically sound survey research with these types of youth populations.

REFERENCES

1. Alessi E. J., Martin J. I. (2010). Conducting an Internet-based survey: Benefits, pitfalls, and lessons learned [Research Note]. Social Work Research, 34, 122–128.
2. Andrews D., Nonnecke B., Preece J. (2003). Electronic survey methodology: A case study in reaching hard-to-involve Internet users. International Journal of Human-Computer Interaction, 16, 185–210.
3. Bartell A. L., Spyridakis J. H. (2012). Managing risk in Internet-based survey research. In Professional Communication Conference (IPCC), 2012 IEEE International (pp. 1–6). Piscataway, NJ: IEEE International. doi:10.1109/IPCC.2012.6408600
4. Bender K., Begun S., DePrince A., Haffejee B., Kaufmann S. (2014). Utilizing technology for longitudinal data collection with homeless youth. Social Work and Health Care, 53, 865–882.
5. Best S. J., Krueger B., Hubbard C., Smith A. (2001). An assessment of the generalizability of Internet surveys. Social Science Computer Review, 19(2), 131–145.
6. Bortree D. S. (2005). Presentation of self on the Web: An ethnographic study of teenage girls’ Weblogs. Education, Communication & Information, 5(1), 25–39.
7. Bracken C. C., Jeffres L. W., Neuendorf K. A., Atkin D. (2009). Parameter estimation validity and relationship robustness: A comparison of telephone and Internet survey techniques. Telematics and Informatics, 26, 144–155.
8. Buchanan E. A., Hvizdak E. E. (2009). Online survey tools: Ethical and methodological concerns of human research ethics committees. Journal of Empirical Research on Human Research Ethics, 4(2), 37–48.
9. Children’s Online Privacy Protection Act of 1998, 15 U.S.C. §§ 6501–6506 (October 21, 1998).
10. Craig S. L., McInroy L. (2014). You can form a part of yourself online: The influence of new media on identity development and coming out for LGBTQ youth. Journal of Gay & Lesbian Mental Health, 18(1), 95–109.
11. Davis M., Bolding G., Hart G., Sherr L., Elford J. (2004). Reflecting on the experience of interviewing online: Perspectives from the Internet and HIV study in London. AIDS Care, 16, 944–952.
12. Denissen J.J.A., Neumann L., van Zalk M. (2010). How the Internet is changing the implementation of traditional research methods, people’s daily lives, and the way in which developmental scientists conduct research. International Journal of Behavioral Development, 34, 564–575.
13. Denscombe M. (2009). Item non-response rates: A comparison of online and paper questionnaires. International Journal of Social Research Methodology, 12(4), 281–291.
14. Duffy M. (2000). The Internet as a research and dissemination resource. Health Promotion International, 15, 349–353.
15. Elze D. E. (2003). 8,000 miles and still counting ... Reaching gay, lesbian and bisexual adolescents for research. Journal of Gay & Lesbian Social Services, 15(1–2), 127–145.
16. Engel R. J., Schutt R. K. (2014). Fundamentals of social work research (2nd ed.). Thousand Oaks, CA: Sage Publications.
17. Flicker S., Haans D., Skinner H. (2004). Ethical dilemmas in research on Internet communities. Qualitative Health Research, 14(1), 124–134.
18. Fox F. E., Morris M., Rumsey N. (2007). Doing synchronous online focus groups with young people: Methodological reflections. Qualitative Health Research, 17, 539–547.
19. Gunter B., Nicholas D., Huntington P., Williams P. (2002). Online versus offline research: Implications for evaluating digital media. Aslib Proceedings, 54, 229–239.
20. Heath S., Brooks R., Cleaver E., Ireland E. (2009). Researching young people’s lives. Thousand Oaks, CA: Sage Publications.
21. Hessler R., Downing L., Beltz C., Pelliccio A., Powell M., Vale W. (2003). Qualitative research on adolescent risk using e-mail: A methodological assessment. Qualitative Sociology, 26(1), 111–124.
22. Hillier L., Harrison L. (2007). Building realities less limited than their own: Young people practising same-sex attraction on the Internet. Sexualities, 10(1), 82–100.
23. Hoonakker P., Carayon P. (2009). Questionnaire survey nonresponse: A comparison of postal mail and Internet surveys. International Journal of Human-Computer Interaction, 25, 348–373.
24. Lenhart A., Duggan M., Perrin A., Stepler R., Rainie L., Parker K. (2015). Teens, social media & technology overview, 2015: Smartphones facilitate shifts in communication landscape for teens. Retrieved from http://www.pewinternet.org/files/2015/04/PI_TeensandTech_Update2015_0409151.pdf
25. McDermott E., Roen K. (2012). Youth on the virtual edge: Researching marginalized sexualities and genders online. Qualitative Health Research, 22, 560–570.
26. McInroy L., Craig S. L. (2012). Articulating identities: Language and practice with multiethnic sexual minority youth. Counselling Psychology Quarterly, 25, 137–149.
27. Mustanski B. S. (2001). Getting wired: Exploiting the Internet for the collection of valid sexuality data. Journal of Sex Research, 38, 292–301.
28. Mustanski B. (2011). Ethical and regulatory issues with conducting sexuality research with LGBT adolescents: A call to action for a scientifically informed approach. Archives of Sexual Behavior, 40, 673–686.
29. National Association of Social Workers. (2008). Code of ethics of the National Association of Social Workers. Retrieved from http://www.socialworkers.org/pubs/code/code.asp
30. Ólafsson K., Livingstone S., Haddon L. (Eds.). (2013). How to research children and online technologies? Frequently asked questions and best practice. Retrieved from http://eprints.lse.ac.uk/50437/1/_Libfile_repository_Content_Livingstone%2C%20S_EU%20Kids%20Online_How%20to%20research%20children%20and%20online%20technologies%28lsero%29.pdf
31. Pascoe C. J. (2011). Resource and risk: Youth sexuality and new media use. Sexuality Research and Social Policy, 8, 5–17.
32. Paxton S. J. (2013). Dissemination in the Internet age: Taming a wild thing. International Journal of Eating Disorders, 45, 525–528.
33. Pollio D. E., Batey D. S., Bender K., Ferguson K., Thompson S. (2013). Technology use among emerging adult homeless in two U.S. cities [Practice Update]. Social Work, 58, 173–175.
34. Riggle E.D.B., Rostosky S. S., Reedy C. S. (2005). Online surveys for BGLT research: Issues and techniques. Journal of Homosexuality, 49(2), 1–21.
35. Roberts D. F., Foehr U. G. (2008). Trends in media use. Future of Children, 18(1), 11–37.
36. Seymour W. S. (2001). In the flesh or online? Exploring qualitative research methodologies. Qualitative Research, 1(2), 147–168.
37. Stafford T. F., Gonier D. (2007). The online research “bubble”: Seeking to improve the commonly used online survey sampling approaches. Communications of the ACM, 50(9), 109–112.
38. Steeves V. (2014). Young Canadians in a wired world, phase III: Life online. Retrieved from http://mediasmarts.ca/ycww
39. Stern S. R. (2003). Encountering distressing information in online research: A consideration of legal and ethical responsibilities. New Media & Society, 5, 249–256.
40. Tufford L., Newman P. A., Brennan D. J., Craig S. L., Woodford M. R. (2012). Conducting research with lesbian, gay, and bisexual populations: Navigating research ethics board reviews. Journal of Gay & Lesbian Social Services, 24(3), 221–240.
41. Wang H., Doong H. (2010). Nine issues for Internet-based survey research in service industries. Service Industries Journal, 30, 2387–2399.
42. Willis P. (2011). Talking sexuality online: Technical, methodological and ethical considerations of online research with sexual minority youth. Qualitative Social Work, 11, 141–155.
43. Wright K. B. (2005). Researching Internet-based populations: Advantages and disadvantages of online survey research, online questionnaire authoring software packages, and Web survey services. Journal of Computer-Mediated Communication, 10(3). doi:10.1111/j.1083-6101.2005.tb00259.x
