Abstract
This paper discusses methodological issues that emerged during the design and implementation of a large-scale survey of licensed social workers in the USA. Benefits and challenges of survey research are identified. The paper provides recommendations for future workforce studies and surveys that assess behavioral health problems.
Keywords: Mental health, Alcohol and other drug problems, Workforce studies, Online research, Surveys
When conducted with attention to methodological rigor and ethical considerations, web-based studies can generate meaningful information about health and health behaviors (Alessi and Martin 2010; Morgan et al. 2013). This article contributes to ongoing discussions about the benefits and challenges of online research. Our insights about web-based investigations were acquired through the implementation of Social Workers’ Self-Reported Wellness: A National Study, a survey that explored a number of areas affecting social workers in the USA, including their mental health and alcohol and other drug (AOD) problems. While a full description of the study is currently being prepared for publication and can be viewed online at https://wp.nyu.edu/socialworkers/, this article highlights some of the benefits and challenges that were experienced during the research process.
The Social Workers’ Self-Reported Wellness study (Straussner et al. 2015) responded to calls to investigate challenges facing behavioral health professionals in the USA (U. S. Department of Health and Human Services 2013). The survey contained 75 closed- and open-ended questions and was hosted by Qualtrics. The sampling frame consisted of 69,661 social workers who were licensed in the 13 states that provided licensees’ email addresses. One-half of these individuals (n = 34,841) were randomly selected for the study, and 6112 of them completed the questionnaire. From September 2015 to December 2015, we sent three separate emails inviting individuals to participate in the study.
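The one-in-two random draw from the licensure frame can be sketched as follows. This is a minimal illustration, not the study's actual procedure: the placeholder email addresses are hypothetical, and the seed is chosen only so the draw is reproducible.

```python
import random

# Hypothetical sampling frame of licensee emails (illustration only;
# the real frame contained 69,661 licensed social workers).
frame = [f"licensee{i}@example.org" for i in range(69661)]

rng = random.Random(42)            # fixed seed keeps the draw reproducible/auditable
sample = rng.sample(frame, 34841)  # the study reports 34,841 licensees selected

# random.sample draws without replacement, so no licensee is invited twice.
assert len(sample) == len(set(sample))
```

Sampling without replacement matters here: each licensee should receive at most one invitation per mailing wave, and a reproducible draw allows the selection to be audited later.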
The administration of the study generated useful observations regarding benefits and challenges associated with online research, particularly when assessing behavioral health problems and conducting research on sensitive issues experienced by helping professionals.
Benefits
After exploring different research designs, an online survey that generated mixed methods data was seen as the best strategy for this study. This decision was based on the following factors:
Time and Cost
Due to the national scope of the study, an expeditious and inexpensive method for collecting a tremendous amount of information from a large sample was needed. Constraints imposed by time and funding supported the utilization of a web-based approach.
User-Friendly Technologies
We used several software packages that easily interfaced with Qualtrics, including IBM SPSS, file-sharing products by Google, and programs from the Microsoft Office suite. Readily available on- and off-campus, these programs made historically cumbersome tasks more convenient. Further, Qualtrics and other applications offered free telephone consultation.
Given the ubiquity of computer use by professionals, most respondents were familiar with the mechanics of online studies, requiring minimal support to navigate the questionnaire. Upon completing the study, one participant wrote, “The format was easy and enjoyable, the questions were straightforward, and it didn’t take a lot of time.”
Anonymity
Protection of respondents’ anonymity is an ethical imperative and is of critical importance when exploring sensitive issues, such as mental health issues and AOD problems. The survey was configured so that respondents’ identifying information was not collected, and data provided in the questionnaires were de-coupled from participants’ email addresses. Although the informed consent statement described these protections, several respondents sought reassurance that the information they provided about mental health and AOD problems would not be shared with employers or state licensing boards.
Improved Internal Validity
This study examined sensitive topics, including mental health issues, AOD problems, and the utilization of behavioral health services. Responding to these questions at a location of one’s choosing and with a personal computer, rather than at a research center or through face-to-face interviews, may have minimized concerns about stigma and social desirability, generating more honest responses and improving the validity of the data (Ong and Weiss 2000).
Expanded Participation
Convenience is another important feature of online studies that may increase participation. Respondents had the option to complete the survey when and where they chose and, importantly, at one or more sittings, allowing them to think about their responses.
Online studies may promote involvement of individuals considered hard-to-reach through traditional approaches. Our study seems to have garnered participation of professionals who are often omitted from study samples, such as those who were unemployed, disabled, or retired, yet maintained active licensure. This was reflected by one participant who wrote, “… Thank you for the opportunity to participate. So often when you are retired from a profession, people don’t include you in surveys or research projects.”
Communication and Data Collection
During the recruitment phase, we sent and received over 100,000 emails. Features associated with Qualtrics and other technologies allowed for the creation of automated messages to manage this influx of correspondence. The ease of email communication promoted prompt and individualized exchanges with study participants. Similarly, web-based technologies made it possible to collect, compile, and analyze data from 6112 respondents, a process that would have posed great difficulty using traditional methods.
Challenges
While there are benefits to online research, we encountered numerous anticipated and unexpected challenges. Although some of these difficulties resulted from the study’s design choices, many of these pitfalls may be experienced by others who adopt a web-based approach.
Obtaining and Using Email Addresses
As this was an online study, an electronic means of communication was needed to correspond with prospective participants. Since no national listing of licensed social workers included their email addresses, we acquired this information from states’ licensing boards. However, this approach posed numerous challenges. Only 13 states collected and made publicly available licensees’ email addresses. Upon emailing the sample, 5% of our messages were immediately returned as undeliverable. Further, 70% of the emails that we sent were never opened. A variety of factors could explain this situation: messages may have reached unused accounts, the emails may have been filtered to spam folders, or individuals might have deleted the messages upon receiving them. Consequently, sampling bias and coverage error need to be considered when interpreting results.
Representativeness and Response Rates
Licensed social workers may be significantly different from unlicensed social workers. Moreover, individuals who were selected for this study’s sample, and respondents who completed the questionnaire, may not reflect the characteristics of the larger population of licensed social workers. Research indicates that individuals who complete online surveys may be slightly different from those who participate in traditional studies (Curtin et al. 2000). This trend seems to be reflected in our study: compared to the rest of the profession (Whitaker et al. 2006), study participants were more likely to have been White and reported higher levels of education and income. Thus, sampling bias may have compromised external validity and generalizability of findings.
Some investigators speculate that “survey saturation”—receiving excessive invitations to participate in online studies—may compromise response rates and the validity of findings. Conveying these concerns, one respondent wrote, “My most cherished hope is that I will never again be surveyed.”
The estimation of response rates for online surveys poses additional challenges, as there are debates about the calculation of these figures. Some standards suggest that the rate should take into account the number of emails that were never received or opened, while others recommend adherence to a stricter formula that only includes the number of messages sent and questionnaires completed (Pew Research Center 2016). Depending upon which approach is used, our study’s response rate ranges from 18 to 28%. Other investigators should expect similar response rates and consider implementing creative strategies to promote study participation (Sappleton and Lourenço 2016).
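The competing formulas can be illustrated with the study's own figures. The sketch below is not the study's exact calculation: the strict formula (completes over invitations sent) reproduces the lower-bound figure, while the lenient variant shows only one illustrative way of adjusting the denominator, here removing the 5% of messages that bounced; standards differ on whether and how unopened mail should also be excluded, which is why reported rates can span such a wide range.

```python
def response_rate(completes, denominator):
    """Response rate as completes / denominator, expressed as a percentage."""
    return 100 * completes / denominator

invited = 34841    # unique licensees invited (emails sent)
completes = 6112   # questionnaires completed

# Strict formula: every invitation sent stays in the denominator.
strict = response_rate(completes, invited)      # ≈ 17.5%, the basis of the 18% figure

# Lenient variant (illustrative): drop invitations that demonstrably never
# reached anyone. The 5% bounce rate is from the study; further adjustments
# (e.g., for unopened mail) vary by standard and would raise the rate more.
delivered = invited * (1 - 0.05)
lenient = response_rate(completes, delivered)   # ≈ 18.5%

print(f"strict: {strict:.1f}%  lenient: {lenient:.1f}%")
```

The design choice that matters is the denominator: every category removed from it (bounced, unreceived, unopened) raises the reported rate, so any published figure should state which formula produced it.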
Technology-Related Concerns
While most participants completed the questionnaire with ease, a few contacted us because they were unfamiliar with the mechanics of web-based questionnaires. Although we provided technical support, it is likely that some individuals abandoned the survey if they were unable to navigate it on their own.
Technology being imperfect, other glitches emerged. After receiving a large number of messages from respondents, our email account was temporarily de-activated; the service provider interpreted the increase in activity as spam-related. When an emailed question about the incentivization process could not reach us during this momentary impasse, a participant wrote, “I have taken the time to complete the social work survey and had considerable difficulty submitting my email address for a chance to win the gift certificate to Amazon. I found this quite disappointing and funny that I was just commenting on how social work in general is devalued, I felt my efforts to support your research were somewhat diminished by the structure of this component of the online process.”
Use of Incentives
Providing incentives can promote involvement in a study. This feature of our survey generated great interest: 52% of respondents asked to be considered for one of the study’s five incentives, each a $200 Amazon gift card. However, technologies and processes related to incentivization can introduce additional challenges. Without proper planning, this feature could compromise respondents’ anonymity. To address this concern, we created a unique email address, displayed upon study completion, which individuals could contact to be included in the incentivization process. This de-coupled participants’ responses on the questionnaire from their identifying information, thereby protecting anonymity.
For some respondents, explanation of the random selection of recipients was unclear or unacceptable. Several individuals suggested that every respondent should receive a $200 gift card. Other participants wrote lengthy emails attempting to demonstrate that they were among the worthiest of candidates to receive an incentive. While the provision of incentives likely increased response rates—and served as an acknowledgement of participants’ contributions—the inclusion of this component required unanticipated effort.
Respondent-Researcher Interactions
Due to the anonymity of web-based studies, the respondent-researcher relationship may be subject to the kinds of reactions that have emerged in impersonal, online exchanges, including skepticism and frustration. After receiving an email about the study, some individuals were concerned about its legitimacy and sought reassurance that the study was real. Others expressed irritation about being included in the sample, indicating that the mailings were burdensome. On the more extreme side, several social workers replied with inappropriate remarks and vulgar language.
Conclusions
Our study collected essential information about the behavioral health of social workers in the USA, generating results that will inform workforce development. The administration of this study illuminated numerous benefits and challenges associated with online studies of licensed professionals. As investigators consider the feasibility of online surveys, careful consideration should be given to the types of technologies that could be utilized and their goodness of fit with the prospective sample. When administered with attentiveness to the complexities of survey technologies, online studies can contribute novel information about health and health behaviors, including mental health and AOD problems.
Footnotes
Compliance with Ethical Standards
Conflict of Interest The authors declare that they have no conflict of interest.
References
- Alessi EJ, & Martin JI (2010). Conducting an internet-based survey: benefits, pitfalls, and lessons learned. Social Work Research, 34(2), 122–128.
- Curtin R, Presser S, & Singer E (2000). The effects of response rate changes on the index of consumer sentiment. Public Opinion Quarterly, 64(4), 413–428.
- Morgan AJ, Jorm AF, & Mackinnon AJ (2013). Internet-based recruitment to a depression prevention intervention: lessons from the Mood Memos study. Journal of Medical Internet Research, 15(2), e31.
- Ong AD, & Weiss DJ (2000). The impact of anonymity on responses to sensitive questions. Journal of Applied Social Psychology, 30, 1691–1708.
- Pew Research Center (2016). Internet surveys. Retrieved from: http://www.people-press.org/methodology/collecting-survey-data/internet-surveys/.
- Sappleton N, & Lourenço F (2016). Email subject lines and response rates to invitations to participate in a web survey and a face-to-face interview: the sound of silence. International Journal of Social Research Methodology, 19(5), 611–622.
- Straussner SLA, Senreich E, & Steen JT (2015). Social Workers’ Self-Reported Wellness: A National Study. Unpublished survey.
- U. S. Department of Health and Human Services. (2013). Report to congress on the nation’s substance abuse and mental health workforce issues. Rockville: SAMHSA.
- Whitaker T, Weismiller T, & Clark E (2006). Assuring the sufficiency of a frontline workforce: executive summary. Washington: NASW.
