Abstract
Objective
This study aimed to examine the effect of varying incentives on acceptance to participate in an online survey on social media and to identify related demographic factors.
Methods
The study used Facebook and targeted its users aged 18 to 24 years in the United States. During recruitment, participants were randomized to one of three incentives for survey completion: (1) a $5 gift card, (2) a lottery for a $200 gift card, or (3) a $5 gift card plus a lottery for a $200 gift card. Acceptance rates for survey participation were compared across the three incentives using percentages, 95% logit-transformed confidence intervals, and Pearson's chi-squared tests. The survey asked about cognitions and behaviors around smoking and vaping.
Results
The ads had 1,782,931 impressions, a reach of 1,104,139 people, and 11,878 clicks. The average ad frequency was 1.615, and the click-through rate was 0.67%. Males clicked on the ads less often than females. The acceptance rates for the three incentives were 63.7%, 37.2%, and 64.6%, respectively. A chi-squared test confirmed that the lottery-only group had a lower acceptance rate than the groups guaranteed an incentive (the gift card group and the gift card plus lottery group). Further analyses indicated that, when given the lottery-only incentive option, males opted into the survey less often than females, and those who did not meet their financial expenses opted in more often than those who had more money than their expenses.
Conclusions
This study suggests that incentives guaranteed to all participants, even if the incentive's value is small, may lead to higher acceptance rates compared to a lottery for a greater incentive in social media-based surveys.
Keywords: Digital health, eHealth, digital media, social media, incentive, recruitment, behavior change, smoking, vaping
Introduction
The use of social media to recruit research participants for health research has significantly increased in the past decade.1–7 Researchers have successfully recruited general and targeted populations on various social media platforms, including Facebook, Instagram, Twitter, and others.2 Social media recruitment has also effectively accessed youth and young adults and hard-to-reach groups for research.2,4–7
Incentives play a positive role in increasing response to requests for research participation.8–15 The role of incentives is grounded in social exchange theory, which posits that the actions of individuals are motivated by the anticipated relationship between the associated rewards and costs.16–18 Individuals may respond to a request to participate in research when they expect and trust that the rewards of responding (e.g. monetary compensation, a social reward) outweigh the costs of participating (e.g. time and effort).19,20
Although there has been debate about the ethics of incentives, and some researchers point out that incentives may be coercive, past literature suggests that incentivized surveys are not coercive when participation is voluntary and informed consent is offered after a full explanation of the purpose of the research, procedures, risks and benefits, alternatives, and confidentiality.21–23 Explanation of the cost and compensation is also an essential part of informed consent, and the forms and amount of incentives should be clearly communicated to potential participants before the start of the survey.22,23 Careful examination of data quality is required for any survey, including online surveys, particularly when incentivized, as incentives can attract participants who are not genuinely interested in the survey. However, previous research suggests that incentivized surveys tend to have improved data quality compared to nonincentivized surveys in several aspects, such as decreased drop-out rates and increased complete responses, in addition to survey response rates.8–15 Regarding the content of survey responses, recent studies show that the risk of bots filling surveys with random responses has been increasing for incentivized online surveys, and methods to detect bots (e.g. collection of timestamps, examination of survey response patterns, and collection of participant email addresses) should be employed in the survey protocol.24,25 The nature of incentives can vary depending on the target population (e.g. vulnerable populations), the amount, and the research settings, and decisions should be made with careful consideration by each institutional review board.
A large body of experimental research demonstrates that higher incentives increase response, albeit with diminishing returns; that cash or cash equivalents are preferred to gifts; and that evidence for lotteries is mixed and inconclusive.8–10 There is also limited empirical evidence on incentive effects in different subpopulations, particularly for online surveys.10 However, the majority of these studies have been conducted in educational or university settings, professional societies, or ongoing online panels whose participants joined for survey opportunities and presumably had some prior contact with the surveying entity.9,26 The extent to which findings from these studies generalize to social media platforms is unclear. The perceived legitimacy of the entity contacting an individual for research is a significant factor in respondent trust, and this issue is especially acute on social media platforms.10 Incentives may play an important role in establishing the perceived validity of a survey opportunity.
A separate line of research has examined strategies for recruitment on different social media platforms or in comparison to traditional recruitment methods, and many studies have used incentives to recruit participants online.9,26 A systematic review of 110 unique studies conducted on Facebook found that incentives offered included a small guaranteed gift or reimbursement for participation, a chance to win a prize via lottery, or a combination of a small guaranteed reimbursement plus a chance to win a prize via lottery.5 However, none of these studies compared the effect of varying incentive strategies on recruitment success.
Researchers have noted a critical need for effective and novel methods for research and experimentation in social media.27 Given the substantial increase in social media research and the lack of a robust evidence base on methodological best practices, such research should include systematic approaches to experimentally test varying recruitment strategies, including incentives, on social media platforms. Incentives may increase participation and the likelihood of retention in longitudinal studies,11–15,21,26–28 and may also increase engagement with interventions, including those delivered through social media. One important objective of future research using social media platforms is to determine how to optimize study designs and responses to recruitment and intervention methods.29
This study aimed to (1) examine the effect of varying incentive strategies on acceptance to participate in an online research survey on social media, (2) evaluate whether incentive effectiveness varied by participant demographic characteristics, and (3) examine Facebook advertising statistics (e.g. click-through rate, or CTR) and their relationship to participants' demographics. The goal of the study was to provide evidence for optimal incentive strategies in future social media-based surveys, particularly around smoking and vaping.
Methods
Study design
This study was conducted as part of a larger, long-term study examining the effects of social media-based antitobacco advertising on young adult attitudes, beliefs, and behavior. The larger study conducted a baseline survey and two follow-up surveys to assess changes in smoking and vaping behaviors and cognitions. The surveys were pilot tested and updated by research staff before starting the study.
Eligibility criteria for the study were (1) being a Facebook user, (2) being aged 18 to 24 years, and (3) residing in the United States. Facebook was selected because it is simple to facilitate surveys on the platform and it offers detailed ad statistics on its dashboard. The sample was a convenience sample and was not representative of national demographics.
From the larger longitudinal study described above, this study used only the acceptance of survey participation and demographics from the baseline survey. The baseline survey had 39 questions in total, including 4 questions based on skip patterns that were shown to participants depending on their previous responses. Data were collected between December 29, 2021, and February 15, 2022.
The study was reviewed and approved by the institutional review board at The George Washington University (NCR202837) and was determined to be exempt. This article is informed by the CHERRIES framework for web-based surveys.30
Incentives
During recruitment, participants were randomized at initial recruitment to one of three incentive offers: a $5 Amazon gift card (incentive 1), a lottery for a $200 Amazon gift card (incentive 2), or a $5 Amazon gift card plus a lottery for a $200 Amazon gift card (incentive 3) (see Figure 1). All participants in the first and third groups were told in advance that they would receive a $5 Amazon gift card after completing the baseline survey and an additional gift card after each of the two follow-up surveys, for a total of $15 if they completed all three surveys. In these groups, the $5 incentive was guaranteed because the gift card was offered after each survey for all respondents. Participants in the second group, on the other hand, were offered a lottery ticket to win a $200 Amazon gift card only if they completed all three surveys. Each condition clearly explained the incentive offer before participants began the survey. The specific sentences used for each group were: "You will receive a $5 Amazon voucher after the completion of each survey, for a total of $15 if you complete all the 3 surveys"; "If you answer all the 3 surveys you will get a lottery ticket to enter a contest to win a $200 Amazon voucher"; and "You will receive a $5 Amazon voucher after the completion of each survey, for a total of $15 if you complete all the 3 surveys. Moreover, if you answer all the 3 surveys you will also get a lottery ticket to enter a contest to win a $200 Amazon voucher". Group allocation was automatically generated by the online platform using simple randomization.
Figure 1.
Flowchart of recruitment process.
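The simple randomization described above can be sketched in a few lines. The following Python sketch is illustrative only and is not the Virtual Lab platform's actual implementation:

```python
import random

def assign_incentive_group(rng: random.Random) -> int:
    """Simple randomization: each consenting participant is assigned to one of
    the three incentive groups with equal probability (1/3 each).
    1: $5 gift card, 2: $200 gift card lottery, 3: gift card plus lottery."""
    return rng.choice([1, 2, 3])

# Unlike blocked randomization, simple randomization yields only approximately
# equal group sizes, as in the allocation observed in this study
# (1072 / 999 / 1072 participants offered each incentive).
```

Because each assignment is an independent draw, small imbalances between group sizes are expected by chance.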
Recruitment
This study used a novel online platform for recruitment and data collection developed by Virtual Lab LLC (https://vlab.digital/).31 The platform enables researchers to reach, recruit, and interact with a target population and to complete all steps in a research study through social media platforms online. It is run independently of social media companies. It screens potential participants based on the proposed eligibility criteria, recruits study participants from the target population, starts a baseline survey right after study participation consent, and sends reminders and follow-ups until study completion based on retargeting technology. The eligibility criteria can be specific and detailed for behavioral interventions and stratified by individual-level factors including age, gender, race, and ethnicity, geographic locations where the intervention is conducted, and other relevant factors of interest. The platform currently uses Facebook Messenger and WhatsApp for surveys, and uses Facebook, Instagram, and Google search results for retargeting.
The process of recruitment is summarized in Figure 1. Virtual Lab posted paid ads with images on users' feeds to recruit study participants. Figure 2 shows the specific ad images used for this step. Ad images were intentionally broad, and no information regarding the topic or the incentive group was provided in this phase to avoid bias. Potential participants clicked on the ads, which redirected them to Facebook Messenger, where the survey took place. The survey was linked to the study's Facebook page using the Virtual Lab platform. This allowed the study's Facebook page to send automated messages on Facebook Messenger. Participants who clicked on the ads saw a welcome message and a question that asked permission to chat. If participants confirmed permission to chat, the experiment randomly assigned them to one of the three incentive groups. They then saw an introduction message that explained their incentive offer, an explanation of the study, the IRB information, a consent form, and an invitation to continue to the survey. If participants consented and accepted to participate, they began the baseline survey. Eligibility for the study was determined by screening questions at the beginning of the survey.
Figure 2.
Ad images.
Measures
Acceptance of survey participation was assessed with the question, "Do you accept to participate in the survey?", presented after an introduction message that explained the incentive offer, an explanation of the study, the IRB information, and a consent form.
Demographic questions included gender, age, race and ethnicity, education, and financial situation. Financial situation was assessed with the question, "Considering your own income and the income from any other people who help you, how would you describe your overall personal financial situation?" Response options were: live comfortably, meet needs with a little left, just meet basic expenses, and doesn't meet basic expenses.
Ad statistics included impressions, reach, clicks, cost, frequency, and CTR. Impressions refer to the number of times the ads were shown on screen. Reach is the number of people who saw the ads at least once, as estimated by the online platform; impressions may count multiple views by the same individual, whereas reach does not. Clicks correspond to the number of clicks on the ads. Cost was recorded as the total amount spent displaying each ad. Frequency refers to the average number of times each person saw the ad, calculated as the number of impressions divided by the reach. CTR is the percentage of impressions that resulted in a link click, calculated as the number of clicks divided by the number of impressions, multiplied by 100.
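As a concrete check of these definitions, frequency and CTR can be recomputed directly from the raw counts. A minimal Python sketch, using the totals reported in the Results section:

```python
def ad_frequency(impressions: int, reach: int) -> float:
    """Average number of times each person saw the ad (impressions / reach)."""
    return impressions / reach

def click_through_rate(clicks: int, impressions: int) -> float:
    """Percentage of impressions that resulted in a link click."""
    return clicks / impressions * 100

# Totals reported in the Results section:
impressions, reach, clicks = 1_782_931, 1_104_139, 11_878
print(round(ad_frequency(impressions, reach), 3))         # 1.615
print(round(click_through_rate(clicks, impressions), 2))  # 0.67
```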
Analysis
(Aim 1) The analyses compared acceptance rates for survey participation across the three incentive options. Subgroup percentages and 95% logit-transformed confidence intervals were calculated. A Pearson's chi-squared test evaluated whether acceptance of participation was independent of the incentive option offered.
(Aim 2) Responses to demographic questions were not available for the entire sample, since those who declined participation in the survey did not answer any questions. However, frequencies and percentages of subcategories were calculated for the survey participant group. Pearson's chi-squared tests evaluated whether participation was independent of the incentive option within each demographic group.
(Aim 3) Ad impressions, reach, clicks, cost, frequency, and CTR were compared across gender using the data recorded on the ad management website.
Analyses were performed in R (version 4.1) via RStudio with the packages tidyverse, ggplot2, gtsummary, rstatix, and freqtables.
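The core of the Aim 1 analysis can be reproduced from the counts in Table 1. The sketch below is an illustrative pure-Python reimplementation of the logit-transformed confidence interval and the Pearson chi-squared statistic; it is not the authors' R code:

```python
import math

def logit_ci_95(successes: int, n: int, z: float = 1.96):
    """95% CI for a proportion, computed on the logit scale and back-transformed."""
    p = successes / n
    logit = math.log(p / (1 - p))
    se = 1 / math.sqrt(n * p * (1 - p))  # delta-method SE on the logit scale
    inv = lambda x: 1 / (1 + math.exp(-x))
    return inv(logit - z * se), inv(logit + z * se)

def pearson_chi_squared(table):
    """Pearson chi-squared statistic for an r x c contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for row, rt in zip(table, row_totals):
        for obs, ct in zip(row, col_totals):
            expected = rt * ct / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Accepted vs. opted out, by incentive group (Table 1):
table = [[683, 389], [372, 627], [692, 380]]
print(round(pearson_chi_squared(table), 2))  # 199.82
lo, hi = logit_ci_95(683, 1072)              # CI around incentive 1's 63.7%
```

The statistic matches the value reported in the Results, X2 (2, N = 3143) = 199.82; in R, `chisq.test()` on the same table gives the same statistic (no continuity correction is applied for tables larger than 2 × 2).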
Results
Aim 1: Incentives and acceptance rates
A total of 3143 people clicked on the Facebook survey advertisement. The acceptance rates for the $5 gift card option (incentive group 1) and the $5 gift card plus lottery option (incentive group 3) were 63.7% and 64.6%, respectively. The acceptance rate for the lottery-only option (incentive group 2) was lower, at 37.2%. A chi-squared test confirmed the difference in proportions of acceptance between the incentive types, X2 (2, N = 3143) = 199.82, p = 4.07e−44. Figure 3 shows the probability of participation by incentive offered.
Figure 3.
Probability of survey participation by incentive option offered.
Aim 2: Incentives and acceptance rates by demographics
Table 1 shows the random assignment to each incentive group and the numbers who accepted and declined survey participation.
Table 1.
Acceptance of survey participation by incentive option offered.
| | Offered an incentive | Yes, took the survey | No, opted out | Chi-squared test |
|---|---|---|---|---|
| Incentive 1: $5 gift card | N = 1072 | 683 (63.7%) | 389 (36.3%) | X2 (2, N = 3143) = 199.8, p = 4.07e-44 |
| Incentive 2: $200 gift card lottery | N = 999 | 372 (37.2%) | 627 (62.8%) | |
| Incentive 3: $5 gift card & $200 gift card lottery | N = 1072 | 692 (64.6%) | 380 (35.4%) | |
| Overall | N = 3143 | 1747 (56%) | 1396 (44%) | |
The survey collected demographics for those who confirmed they would like to take the survey and were within the age range (N = 1609). Table 2 shows the sample characteristics. The convenience sample consisted of higher proportions of females (68%) and Asian American or Native Hawaiian/Pacific Islander (AANHPI, 21%) participants than the US national survey, at 48.7% and 5.5%, respectively.23 Other demographics were similar between our sample and the US national sample.
Table 2.
Survey participant sample description.
| Characteristic | Overall, N = 1609 | Incentive 1, N = 633 | Incentive 2, N = 343 | Incentive 3, N = 633 | Chi-squared test p-value |
|---|---|---|---|---|---|
| Gender | 0.023 | ||||
| Female | 1084 (68%) | 412 (66%) | 255 (75%) | 417 (67%) | |
| Male | 440 (28%) | 189 (30%) | 70 (21%) | 181 (29%) | |
| Neither | 66 (4.2%) | 25 (4.0%) | 13 (3.8%) | 28 (4.5%) | |
| Missing | 19 | 7 | 5 | 7 | |
| Age (mean, SD) | 21.06 (2.00) | 21.08 (1.97) | 21.11 (1.96) | 21.01 (2.06) | |
| Race and ethnicity | 0.53 | ||||
| White, non-Hispanic | 753 (49%) | 294 (49%) | 166 (51%) | 293 (48%) | |
| Hispanic | 276 (18%) | 110 (18%) | 60 (19%) | 106 (17%) | |
| Black, non-Hispanic | 125 (8.1%) | 42 (6.9%) | 29 (9.0%) | 54 (8.8%) | |
| AANHPI, non-Hispanic | 321 (21%) | 137 (23%) | 57 (18%) | 127 (21%) | |
| Other race | 64 (4.2%) | 22 (3.6%) | 11 (3.4%) | 31 (5.1%) | |
| Missing | 70 | 28 | 20 | 22 | |
| Financial situation | 0.030 | ||||
| Doesn’t meet expenses | 105 (6.7%) | 35 (6%) | 33 (10%) | 37 (6%) | |
| Meets or exceeds budget | 1464 (93%) | 580 (94%) | 300 (90%) | 584 (94%) | |
| Missing | 40 | 18 | 10 | 12 | |
| Level of education | 0.50 | ||||
| Less than high school graduate | 206 (15%) | 79 (14%) | 36 (12%) | 91 (16%) | |
| High school graduate | 711 (50%) | 280 (50%) | 159 (53%) | 272 (49%) | |
| College graduate | 501 (35%) | 200 (36%) | 106 (35%) | 195 (35%) | |
| Missing | 191 | 74 | 42 | 75 |
AANHPI: Asian American or Native Hawaiian/Pacific Islander; SD: standard deviation.
Pearson's chi-squared test confirmed the difference in proportions of males and females between the types of incentives offered, X2 (2, N = 1524) = 11.02, p = .004. While 30% and 29% of the survey participants were males in the incentive groups that offered $5 gift cards, only 21% were males in the lottery-only incentive group. On the other hand, 66% and 67% of the survey participants were females in the incentive groups that offered guaranteed gift cards, and 75% of the survey participants were females in the lottery-only incentive group. Since the incentive groups were assigned randomly, we infer that males did not opt into the survey as often as females when given the lottery-only incentive option. Figure 4 compares incentive responses by participants' gender.
Figure 4.
Survey participants’ gender.
A Pearson's chi-squared test confirmed the difference in proportions of those who did not meet their regular household expenses and those who did between the types of incentives offered, X2 (2, N = 1569) = 7.04, p = .030. While 6.0% of the survey participants did not meet their expenses in each incentive group that offered guaranteed gift cards, 10% of the survey participants did not meet their expenses in the lottery-only incentive group. On the other hand, 94% of the survey participants met or exceeded their expenses in the incentive groups that offered guaranteed gift cards, and 90% of the survey participants met or exceeded their expenses in the lottery-only incentive group. Since the incentive groups were assigned randomly, we infer that when given the lottery-only incentive option, those who did not meet their financial expenses opted into the survey more often than those who had more money than their expenses. No other demographic characteristics tested showed statistically significant differences in proportions on chi-squared tests. Figure 5 compares incentive responses by participants' self-reported financial situations.
Figure 5.
Survey participants’ financial status.
Aim 3: Ad statistics
The Facebook recruitment ads were displayed 1,782,931 times on the screen (ad impressions). A total of 1,104,139 people saw the ads at least once (reach). The ad images are shown in Figure 2. These ads were clicked 11,878 times. The average frequency was 1.615. The cost of displaying the ads was $8439.
The CTR was 0.67% on average and differed by gender for both ads. The CTRs among males for ad 1 and ad 2 were 0.50% and 0.53%, respectively, and those among females were 1.07% and 0.95%. For both ads, males clicked on the ads less often than females. Table 3 shows the ad statistics.
Table 3.
Ad statistics.
| | | Impressions | Reach | Clicks | Total cost (USD) |
|---|---|---|---|---|---|
| Male | Ad 1: Amazon girl | 639,894 | 362,019 | 3194 | 2892 |
| | Ad 2: Banner brown | 609,744 | 365,898 | 3232 | 2429 |
| Female | Ad 1: Amazon girl | 311,154 | 198,780 | 3334 | 2063 |
| | Ad 2: Banner brown | 222,139 | 177,442 | 2118 | 1056 |
| Total | | 1,782,931 | 1,104,139 | 11,878 | 8439 |

| | | Frequency | Click-through rate |
|---|---|---|---|
| Male | Ad 1: Amazon girl | 1.768 | 0.50% |
| | Ad 2: Banner brown | 1.666 | 0.53% |
| Female | Ad 1: Amazon girl | 1.565 | 1.07% |
| | Ad 2: Banner brown | 1.252 | 0.95% |
| Total | | 1.615 | 0.67% |
Discussion
Differences in recruitment rates by incentive type
The present study provides preliminary evidence on which types of incentives were associated with survey acceptance. This study found that acceptance rates were higher when incentives were small but guaranteed, compared to a lottery for a greater incentive. Acceptance of the small but guaranteed incentive was 1.7 times that of the lottery-only option. Adding a lottery to a small incentive did not increase survey participation. Although the survey asked about cognitions and behaviors around smoking and vaping, our findings on incentives are unlikely to be limited to these topics, as the topics were not shared with participants at the time of survey acceptance.
The findings suggest that potential participants decided whether to take the survey based on the expected probability of receiving the incentive. While the odds of winning the lottery were not directly provided to participants, since they depended on the number of participants, acceptance rates suggest that participants discounted the value of the lottery because the expected probability of winning it was unknown. The small, guaranteed incentive was valued more highly. This study suggests that a small, guaranteed incentive is more effective than a lottery in encouraging social media survey participation. Our findings also suggest that adding a lottery to a guaranteed incentive does not improve the likelihood of survey participation.
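This discounting can be made concrete with a back-of-the-envelope expected-value calculation. The calculation below is illustrative only: the eventual number of lottery entrants was unknown to participants, so the 372 acceptors in the lottery-only group are used purely as a hypothetical denominator.

```python
# Guaranteed incentive: $5 per completed survey ($15 across all three surveys).
guaranteed_per_survey = 5.0

# Lottery: a single $200 prize shared (in expectation) among all entrants.
# Assuming, hypothetically, that all 372 acceptors in the lottery-only group
# remained eligible, the expected value per entrant would be:
expected_lottery_value = 200 / 372
print(round(expected_lottery_value, 2))  # 0.54

# Even under this assumption, the expected lottery payout is roughly an order
# of magnitude below the guaranteed $5 per survey, consistent with
# participants discounting an uncertain reward of unknown probability.
```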
This finding is consistent with past studies, especially in psychology and behavioral economics.32,33 Whether explicitly or implicitly, people are affected by the expected probability of outcomes and prefer promised incentives rather than taking risks for an unpromised return, even if the incentive amount is larger. Rewards, costs, and trust are all key factors in promoting effective social exchange.34 Ensuring these three factors are present may be particularly important in studies with low response propensity, such as online surveys. One hypothesis is that trust in the lottery option (i.e. in the likelihood of a reward) was low, reducing the propensity to accept the survey.
In practice, the cost of a guaranteed incentive can be a fiscal challenge in research. This study compared a $5 gift card and a $200 lottery; from the participants' view, the reward amounts differed 40-fold. Yet the guaranteed-incentive group accepted the survey at 1.7 times the rate of the lottery group. However, for a guaranteed incentive, the total cost increases with the number of participants, and careful consideration of cost-effectiveness is required. Further research is needed to compare the cost-effectiveness of guaranteed incentives and lotteries to identify the most effective strategy. In examining these factors, it is also important to investigate the prevalence of bot responses across guaranteed-incentive and lottery conditions.24,25
This study also suggests gender differences in response to survey incentives. Since the incentive groups were assigned randomly, we infer that males did not opt into the survey as often as females when given the lottery-only incentive option.35 This finding has implications for studies that stratify by gender, as acceptance rates should be expected to be lower for male respondents.
Online recruitment advertising
This study used a novel, social media-based technology for survey recruitment. This methodology enables researchers to reach many potential participants at lower cost and with more efficient targeting by participant characteristics than conventional recruitment methods.31 We found that recruitment using this methodology was less costly than other recruitment methods previously used by the authors, such as online ad platforms and traditional data collection vendors such as large internet-based research firms,31 at $8439 to reach 1,104,139 people and enroll 1609 study participants.
We also found variations in recruitment ad CTR by gender. Since a lower percentage of males clicked the ads compared to females, we conclude that researchers may need to pay more to attract male participants. Further research is needed to understand this tendency and what other factors (e.g. survey content, format, etc.) may explain our findings. There is a large body of research finding that females are more likely to participate in research than males.36–38
Future research
This research is part of a larger, long-term study examining the effects of social media-based antitobacco advertising on young adult attitudes, beliefs, and behavior. The incentive evidence gathered here will be used in this long-term research. We will conduct additional incentive experiments to expand the evidence base on social media-based surveys and response probabilities, including cost and cost-effectiveness. Finally, we will compare retention and attrition by study and incentive conditions across groups using the follow-up surveys.
Limitations
First, the number of participants was not consistent across groups: the group offered the lottery-only option was 73 participants smaller than each of the other two groups, as shown in Table 1. Since each person who clicked on the ad was randomly assigned with equal probability to one of the three incentive groups, the groups should have had approximately equal sample sizes. We hypothesize that some people assigned to the lottery-only incentive may have blocked the Facebook survey account after seeing that we offered a lottery-only incentive rather than something guaranteed. Additionally, this study did not investigate different amounts of incentives, such as the possible difference between varying guaranteed incentive amounts either alone or combined with different lottery options. Also, the odds of winning the lottery were not presented, as they depended on how many people opted into the survey, which was not capped. Responses to further variations in the types of incentives may differ and yield additional evidence on the response to the offered social exchange. In addition, the present study only used Facebook and may not be generalizable to other social media platforms (e.g. Snapchat, TikTok). Since Facebook use among young adults is less common than among older generations, it may not be the best platform to reach the masses of this population. However, Facebook was a good option for this study due to the convenience of facilitating the survey through its messaging feature, which is not currently available on other social media platforms. Finally, the study used a convenience sample collected via social media and is not representative of the overall US population of young adults aged 18 to 24 years.
Conclusion
The survey acceptance rate of the small but guaranteed incentive (incentive 1) was 1.7 times that of the lottery-only option (incentive 2). Adding a lottery to a small incentive (incentive 3) did not increase survey participation. We hypothesize that potential participants decided on their acceptance based on the probability of receiving the incentive rather than its size. This study suggests that incentives guaranteed to all participants, even if the incentive's value is small, may lead to higher acceptance rates compared to a lottery for a greater incentive in social media-based surveys. Findings from this study offer insights into methods for social media-based survey recruitment and suggest avenues for future research.
Footnotes
Contributorship: The manuscript was prepared by MI, HMT, JC, and WDE. MI, HMT, and JBB worked on data analysis. The implementation of the study was done by DD, NR, and RG. JC, ECH, and WDE reviewed and supervised the study and the manuscript. All authors reviewed and approved the final version of the manuscript.
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Ethical approval: This study was reviewed and approved by the institutional review board at The George Washington University (NCR202837).
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the National Cancer Institute (grant number R01CA253013).
Guarantor: WDE.
ORCID iDs: Megumi Ichimiya https://orcid.org/0000-0002-2665-6513
Jennifer Cantrell https://orcid.org/0000-0003-4730-2535
References
- 1.Alshaikh F, Ramzan F, Rawaf S, et al. Social network sites as a mode to collect health data: a systematic review. J Med Internet Res 2014; 16: e171. Epub ahead of print 2014. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2. Darko EM, Kleib M, Olson J. Social media use for research participant recruitment: integrative literature review. J Med Internet Res 2022; 24: e38015.
- 3. Frampton GK, Shepherd J, Pickett K, et al. Digital tools for the recruitment and retention of participants in randomised controlled trials: a systematic map. Trials 2020; 21: 478.
- 4. Reagan L, Nowlin SY, Birdsall SB, et al. Integrative review of recruitment of research participants through Facebook. Nurs Res 2019; 68: 423–432.
- 5. Thornton L, Batterham PJ, Fassnacht DB, et al. Recruiting for health, medical or psychosocial research using Facebook: systematic review. Internet Interv 2016; 4: 72–81.
- 6. Whitaker C, Stevelink S, Fear N. The use of Facebook in recruiting participants for health research purposes: a systematic review. J Med Internet Res 2017; 19: e290.
- 7. Topolovec-Vranic J, Natarajan K. The use of social media in recruitment for medical research studies: a scoping review. J Med Internet Res 2016; 18: e286.
- 8. Nicolaas G, Corteen E, Davies B. The use of incentives to recruit and retain hard-to-get populations in longitudinal studies. NatCen Soc Res 2019: 7–15.
- 9. Singer E. The use and effects of incentives in surveys. In: Vannette DL, Krosnick JA (eds) The Palgrave handbook of survey research. Switzerland: Springer, 2018, pp.112–135.
- 10. Toepoel V. Effects of incentives in surveys. In: Gideon L (ed.) Handbook of survey methodology for the social sciences. New York: Springer, 2012, pp.209–223.
- 11. Yu S, Alper HE, Nguyen AM, et al. The effectiveness of a monetary incentive offer on survey response rates and response completeness in a longitudinal study. BMC Med Res Methodol 2017; 17: 77.
- 12. Abdelazeem B, Hamdallah A, Rizk MA, et al. Does usage of monetary incentive impact the involvement in surveys? A systematic review and meta-analysis of 46 randomized controlled trials. PLoS One 2023; 18: e0279128.
- 13. Göritz AS. Incentives in web studies: methodological issues and a review. 2006.
- 14. Shamon H, Berning CC. Attention check items and instructions in online surveys: boon or bane for data quality? Surv Res Methods 2020; 14: 55–77.
- 15. Shaw MJ, Beebe TJ, Jensen HL, et al. The use of monetary incentives in a community survey: impact on response rates, data quality, and cost. Health Serv Res 2001; 35: 1339–1346.
- 16. Blau PM. Exchange and power in social life. New York: Wiley, 1964.
- 17. Homans GC. Social behavior: its elementary forms. New York: Harcourt, Brace, & World, 1961.
- 18. Thibaut JW, Kelley HH. The social psychology of groups. New York: Wiley, 1959.
- 19. Dillman DA. Mail and telephone surveys: the total design method. New York: Wiley, 1978.
- 20. Dillman DA, Smyth JD, Christian LM. Internet, mail, and mixed-mode surveys: the tailored design method. 3rd ed. New York: Wiley, 2009.
- 21. Singer E, Bossarte RM. Incentives for survey participation: when are they “coercive”? Am J Prev Med 2006; 31: 411–418.
- 22. Joffe S, Cook EF, Cleary PD, et al. Quality of informed consent: a new measure of understanding among research subjects. JNCI J Natl Cancer Inst 2001; 93: 139–147.
- 23. FERCAP Multi-Country Research Team, Karbwang J, Koonrungsesomboon N, et al. What information and the extent of information research participants need in informed consent forms: a multi-country survey. BMC Med Ethics 2018; 19: 79.
- 24. Xu Y, Pace S, Kim J, et al. Threats to online surveys: recognizing, detecting, and preventing survey bots. Soc Work Res 2022; 46: 343–350.
- 25. Pozzar R, Hammer MJ, Underhill-Blazey M, et al. Threats of bots and other bad actors to data quality following research participant recruitment through social media: cross-sectional questionnaire. J Med Internet Res 2020; 22: e23021.
- 26. Wu MJ, Zhao K, Fils-Aime F. Response rates of online surveys in published research: a meta-analysis. Comput Hum Behav 2022; 7: 1–2, 7–10.
- 27. Evans WD, Abroms LC, Broniatowski D, et al. Digital media for behavior change: review of an emerging field of study. Int J Environ Res Public Health 2022; 19: 2–5, 11–12.
- 28. Reveilhac M, Steinmetz S, Morselli D. A systematic literature review of how and whether social media data can complement traditional survey data to study public opinion. Multimed Tools Appl 2022; 81: 10107–10142.
- 29. Tulsiani S, Ichimiya M, Gerard R, et al. Assessing the feasibility of studying awareness of a digital health campaign on Facebook: pilot study comparing young adult subsamples. JMIR Formative Res 2022; 6: e37856.
- 30. Eysenbach G. Improving the quality of web surveys: the checklist for reporting results of internet E-surveys (CHERRIES). J Med Internet Res 2004; 6: e34.
- 31. Rao MN, Donati D, Olvera O, et al. Conducting surveys and interventions entirely online: a virtual lab practitioner’s manual. Washington, DC: World Bank Group, 2020.
- 32. Kahneman D, Tversky A. Prospect theory: an analysis of decision under risk. Econometrica 1979; 47: 263–291.
- 33. Linnemayr S, MacCarthy S, Kim A, et al. Behavioral economics-based incentives supported by mobile technology on HIV knowledge and testing frequency among Latino/a men who have sex with men and transgender women: protocol for a randomized pilot study to test intervention feasibility and acceptability. Trials 2018; 19: 540.
- 34. Nord WR. Social exchange theory: an integrative approach to social conformity. Psychol Bull 1969; 71: 174–208.
- 35. Laguilles JS, Williams EA, Saunders DB. Can lottery incentives boost web survey response rates? Findings from four experiments. Res High Educ 2011; 52: 537–553.
- 36. Burg JAR, Allred SL, Sapp JH. The potential for bias due to attrition in the national exposure registry: an examination of reasons for nonresponse, nonrespondent characteristics, and the response rate. Toxicol Ind Health 1997; 13: 1–13.
- 37. Dunn KM, Jordan K, Lacey RJ, et al. Patterns of consent in epidemiologic research: evidence from over 25,000 responders. Am J Epidemiol 2004; 159: 1087–1094.
- 38. Galea S, Tracy M. Participation rates in epidemiologic studies. Ann Epidemiol 2007; 17: 643–653.