Abstract
Given persistent communication inequalities, it is important to develop interventions to improve Internet and health literacy among underserved populations. These goals drove the Click to Connect (C2C) project, a community-based eHealth intervention that provided novice computer users of low socioeconomic position (SEP) with broadband Internet access, training classes, a Web portal, and technical support. In this paper, we describe the strategies used to recruit and retain this population, the budgetary implications of such strategies, and the challenges and successes we encountered. Results suggest that personal contact between study staff and participants and provision of in-depth technical support were central to successful recruitment and retention. Such investments are essential to realize the promise of eHealth with underserved populations.
Keywords: Participant recruitment, participant retention, eHealth, communication inequalities, community-based interventions, underserved populations
In the United States, populations of low socioeconomic position (SEP) face large and continuing disparities in morbidity and mortality across a wide range of chronic health conditions, such as cancer and heart disease. These disparities can be attributed to myriad factors—including healthcare access, utilization, and quality; environmental conditions; and lifestyle behaviors—and have cumulative effects across the lifecourse (Adler et al., 2007; Berkman & Kawachi, 2000; Centers for Disease Control and Prevention, 2011). Some scholars have proposed that eHealth applications may be a promising avenue for narrowing such gaps. Improved access and ability to use eHealth offerings, such as health information websites, online social support networks, and mobile health communication devices, may empower individuals to take greater control over their health (Kreps & Neuhauser, 2010; Strecher, 2007). Greater exposure to health information in the media, whether through active seeking or passive acquisition, may influence health knowledge and encourage prevention behaviors (Kelly et al., 2010; Redmond, Baer, Clark, Lipsitz & Hicks, 2010; Shim, Kelly, & Hornik, 2006), and over time such information engagement may reach beyond health to other life domains.
Despite their promise, eHealth applications with low SEP populations may encounter several challenges. First, not all population groups have the same opportunities to access and use these applications. As Internet access has become more widespread in recent years, some researchers have suggested that the digital divide is narrowing. Yet at the same time, differences in the type of access cannot be overlooked. People with greater income and education, and those from urban areas, are much more likely to enjoy broadband connectivity than their poorer, less educated, and rural counterparts (U.S. National Telecommunications & Information Administration, 2010; Viswanath, 2011). Second, differences in Internet use and engagement remain a pressing concern and have been described as “the second-level digital divide” (Hargittai, 2002). Studies have shown that there are important differences in how low SEP populations use the Internet and how confident they feel in navigating the Web (Hargittai, 2010; Lee, 2009). Last, there is growing attention to eHealth literacy—which refers to individuals’ ability to identify, understand, evaluate, and apply health information from electronic sources (Norman & Skinner, 2006)—and low SEP users have expressed concerns about their ability to understand online health information and assess the quality of information obtained (Knapp, Madden, Wang, Sloyer, & Shenkman, 2011). Taken together, population subgroup differences in the type of Internet access, the nature of Internet use and engagement, and the level of eHealth literacy are all types of communication inequalities. Communication inequalities refer to differences in social groups’ ability to access, attend to, process, retain, and act on information, and these inequalities have the potential to exacerbate health disparities (Viswanath, 2005, 2006).
Importantly, we need to understand how low SEP and other underserved populations access, utilize, and benefit from eHealth offerings if we are to successfully design interventions that address health and communication divides. This was the fundamental premise guiding the Click to Connect (C2C) project, a community-based eHealth intervention whose goal was to improve Internet and health literacy among low SEP groups by providing home access to high-speed Internet, computer and Internet training classes, a Web portal that facilitates Internet navigation, and ongoing technical support. Engaging with low SEP populations through community settings was crucial for two reasons. First, we wanted to better understand the ways in which context limits or supports the use of new media, broadly, and eHealth offerings, specifically. Second, given the challenges of recruiting and retaining low SEP populations, we felt that it was worth partnering with established organizations in the communities that serve our population of interest. We did not, however, use Internet access sites in the community to deliver the intervention, as several barriers could deter low SEP groups from using these sites. For example, libraries, schools, and other sites might not be in convenient locations or open at convenient times, and physical or other disabilities might limit access to these locations. Sites also might have restrictions on search parameters and time spent on the Internet, thereby further constraining access. C2C thus allows us to examine how low SEP groups use the Internet once issues of access are resolved.
In the current study, we describe the strategies used to recruit and retain low SEP novice computer users for the C2C intervention, which took place over extended periods of time in community settings. We also describe the budgetary implications of such strategies, as well as some of the recruitment and retention challenges we encountered. Clinical and behavioral intervention research has found that recruiting and retaining low SEP populations can be difficult because of participants’ lack of time, lack of transportation, and greater mobility (Eakin et al., 2007; Nicholson et al., 2011; Webb et al., 2010). Although some of the challenges we encountered overlap with those documented in clinical interventions, some are particularly relevant to community-based eHealth interventions. Ultimately, we review ways to maximize participation, while also considering the associated costs of these strategies—data that are rarely reported (Rdesinski, Melnick, Creach, Cozzens, & Carney, 2008; UyBico, Pavel, & Gross, 2007), yet crucial as researchers and practitioners evaluate how to intervene with underserved populations in a constrained fiscal environment. Our hope is that by providing these data for a hard-to-reach population, we can contribute to future eHealth efforts with such groups.
Method
Data for this study come from “Click to Connect: Improving health literacy through computer literacy” (C2C), a randomized controlled intervention study funded through the U.S. National Cancer Institute (NCI). The goal of the project was to understand the ways in which low SEP populations with limited online experience access and use the Internet for health information. Primary outcomes included changes in media use and exposure to health information, Internet usage patterns, and health information seeking and efficacy. Secondary outcomes included health knowledge and health beliefs.
Participants were recruited from adult education centers in greater Boston, Massachusetts (further details regarding these efforts appear in the Results section). Individuals were eligible to participate if they met the following criteria: 1) did not have home broadband Internet access; 2) were enrolled in a General Educational Development (GED), pre-GED, or high-level English for Speakers of Other Languages (ESOL) class; 3) were between the ages of 25 and 60; 4) did not identify as computer savvy; and 5) had a working telephone number. Participants were randomized to the intervention or control condition; SAS Version 9.3 was used to generate the random assignment.
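Random assignment was generated in SAS; for readers who want to see the general logic, the Python sketch below illustrates one way to produce a reproducible 1:1 allocation. It is a minimal illustration only, not the study's actual randomization code, and the participant IDs and seed are hypothetical.

```python
import random

def assign_arms(participant_ids, seed=2007):
    """Shuffle participant IDs with a fixed seed and split them 1:1 into study arms."""
    rng = random.Random(seed)      # fixed seed makes the allocation reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)               # random order
    half = len(ids) // 2
    assignment = {pid: "intervention" for pid in ids[:half]}
    assignment.update({pid: "control" for pid in ids[half:]})
    return assignment

if __name__ == "__main__":
    demo_ids = [f"C2C-{i:03d}" for i in range(1, 21)]   # hypothetical IDs
    arms = assign_arms(demo_ids)
    print(sum(arm == "intervention" for arm in arms.values()),
          "of", len(arms), "assigned to intervention")
```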
The intervention provided classes to train participants to use computers and the Internet. Coursework included methods of searching for health information. The intervention was conducted in three waves, and during each wave participants received nine monthly training classes at community colleges in Boston. In addition to the classes, participants received a computer, broadband Internet access (for 9 to 18 months), access to a Web portal designed for low-literacy populations that facilitates Internet navigation, and ongoing technical support. At the end of each project period, the control condition received hard copies of the health information provided on the Web portal. Both groups received incentives for completing surveys throughout the project (details appear in the Results section). The Dana-Farber Cancer Institute Institutional Review Board granted human subjects approval for this project.
Participants were lower SEP novice computer and Internet users in greater Boston (Table 1). A majority of participants were under age 50, two-thirds were women, and more than three-quarters were non-Hispanic Black or Hispanic. Nearly 60% were living at or below the federal poverty level, and 70% had not completed high school or a GED.
Table 1.
Demographic Characteristics of Baseline Click to Connect Participants (N = 324a)
| Characteristic | Nb | %b |
|---|---|---|
| Age | ||
| ≤ 34 | 116 | 36 |
| 35–49 | 136 | 42 |
| ≥ 50 | 71 | 22 |
| Gender | ||
| Male | 110 | 34 |
| Female | 214 | 66 |
| Poverty level | ||
| ≤ FPL | 174 | 59 |
| > FPL | 123 | 41 |
| Education | ||
| 1st–8th grade | 46 | 14 |
| 9th–12th grade | 186 | 57 |
| GED or high school diploma from outside the U.S. | 69 | 21 |
| Some college outside the U.S. | 22 | 7 |
| Race/ethnicity | ||
| White, non-Hispanic | 21 | 6 |
| Black, non-Hispanic | 171 | 53 |
| Hispanic | 81 | 25 |
| Other | 51 | 16 |
| Employment status | ||
| Working | 160 | 49 |
| Not working < 6 months | 30 | 9 |
| Not working ≥ 6 months | 92 | 28 |
| Disabled | 31 | 10 |
| Other | 11 | 3 |
| Language acculturation | ||
| Low | 32 | 10 |
| Medium | 126 | 39 |
| High | 165 | 51 |
| Immigrant status | ||
| Born in U.S. | 137 | 42 |
| In U.S. ≥ 10 years | 106 | 33 |
| In U.S. < 10 years | 80 | 25 |
| Health status | ||
| Excellent/very good/good | 240 | 74 |
| Fair/poor | 84 | 26 |
a As evident in Figure 1, 336 participants were enrolled in C2C; however, 12 were deemed ineligible post-randomization (see Figure 3, footnote a). Removing these 12 ineligible participants yielded a final baseline N of 324.
b Across variables, sample sizes may differ slightly due to missing data, and percentages may not sum to 100 due to rounding.
This study focuses on process outcomes related to recruiting and retaining participants for an eHealth intervention. Process evaluation data were collected between May 2007 (start of recruitment) and May 2010 (end of follow-up).
Recruitment: Procedures, Data Sources, and Methods
We used a proactive rather than a reactive approach to recruit C2C study participants (Yancey, Ortega, & Kumanyika, 2006). A proactive approach brings project staff into direct contact with potential participants. This typically involves face-to-face contact with community leaders and organizations, as well as recruitment presentations and meetings in the community. In contrast, a reactive or passive approach relies on potential participants’ contacting project staff after coming across recruitment flyers, mailings, or other advertisements. We also utilized pre-screening procedures to identify potential participants from the larger population. Using the “most sensitive” screening procedures, or early delivery of questions that best discriminate between those who will or will not be eligible, increased the efficiency of the process (Berger, Begun, & Otto-Salaj, 2009).
To support three waves of the intervention, study recruitment occurred between May and November 2007, April and October 2008, and March and October 2009. For each wave, the recruitment process included outreach sessions in community settings, phone calls for pre-screening, and phone calls for pretest administration. Details of these activities are presented in the Results section. Data describing the outreach sessions and flow of participants through the recruitment process (including reasons for refusal or ineligibility) came from study tracking logs. Call logs provided data regarding staff effort for the pre-screening calls. Similar logs provided data describing staff effort and the number of contact attempts made to administer the pretest. Internal records describing staff salary were utilized to support economic assessments.
Recruitment analyses consisted mainly of descriptive statistics and expenditure estimation. Study records detailing participants' ineligibility and refusals were analyzed qualitatively to identify common themes. To understand the relationship between contact attempts and pretest survey completion, we analyzed utility cut points, which allow researchers to gauge the point at which additional contact attempts are likely to have diminishing returns. We based our analysis on Rdesinski and colleagues' (2008) assessment of the number of calls necessary to enroll participants and administer the pretest survey.
Retention: Procedures, Data Sources, and Methods
Given the anticipated challenges of retaining a low SEP population, we employed several strategies to maintain high retention rates. First, we built a number of contact points into the intervention and control group follow-up procedures. The intervention participants completed a monthly survey, either at the training session or on their own. If they did not complete the survey during class, they were subsequently contacted by phone as a reminder. Intervention participants did not receive incentives for completing the monthly surveys, as they already received free Internet access, training, and support through the intervention. In contrast, the control group received monthly reminder postcards asking them to update or confirm their contact information and received a $5 gift card for each returned postcard. Second, we encouraged communication between key study staff (the project director, research assistants, and technical specialist) and participants, as such relationships have been shown to be important drivers of retention (Yancey et al., 2006).
Retention data were collected between November 2007 (start of first wave classes) and May 2010 (end of last wave follow-up). The retention data focus on three areas: tracking participant mobility, documenting contacts between staff and participants (including calls for technical support), and estimating expenditures. Study tracking logs that detailed changes in participant contact information were created and analyzed. Call logs were analyzed to assess patterns of interaction; although users became increasingly comfortable with email, the telephone was the predominant method of communication between study staff and participants. Finally, invoices from external technical support vendors and internal accounting estimates provided data for the economic assessment. As with the recruitment data, we relied mainly on descriptive statistics, utility cut point assessment, and expenditure estimation to analyze our data.
Results
Recruitment Results
Recruitment strategies: challenges and successes
To achieve our recruitment goal of 312 participants (which was informed by a priori power calculations), we made 190 in-person presentations at 32 adult literacy centers in greater Boston. These outreach sessions were conducted across three study waves: 47 presentations between May and November 2007, 76 presentations between April and October 2008, and 67 presentations between March and October 2009. A project director led all of the presentations; each required 90 minutes (60 minutes of travel + 30 minutes of presentation/Q&A), totaling 285 hours of staff time across the three waves. Overall these sessions generated 1,767 potential participants (Figure 1).
Figure 1.
Click to Connect participant recruitment.
Although our proactive recruitment strategy yielded many potential participants, a substantial number were either ineligible or not interested in participating. During pre-screening—when study staff called potential participants identified during the 190 outreach sessions—over 1,000 people were deemed ineligible and more than 200 refused to participate (Figure 1). Some of the reasons for ineligibility or refusal may be more common among lower SEP populations. For example, several potential participants reported that they were leaving the Boston area during the next 12 months, underscoring the residential instability often experienced by lower SEP populations (Blumenthal, Sung, Coates, Williams, & Liff, 1995; Eakin et al., 2007; Kan, 1999; Schafft, 2006). In many cases, potential participants' eligibility could not be verified, typically because they did not have a working phone number. Some participants were ineligible because of a language barrier, while others were deemed ineligible because they could not attend the computer classes, often due to transportation, time, or schedule constraints. Still others refused to participate because of scheduling difficulties; for instance, one potential participant was working two jobs while attending school, and another had three children and worked full-time.
Despite these challenges in engaging potential participants, we found that pre-screening can be a time- and cost-saving strategy. It enables study staff, during an initial phone call, to identify many of those who are ineligible or not interested in participating. Here, we were able to refine the pool of potential participants from 1,767 to 529, identifying the 70% (n = 1,238) of individuals who were ineligible or declined to participate (Figure 1).
Given the large number of individuals who were ineligible or refused, it was important to be persistent in attempts to enroll eligible participants. We tracked the number of calls necessary to enroll participants and administer the pretest survey; Figure 2 presents the number of pretest scheduling attempts per participant enrolled in C2C. Two-thirds completed the pretest within four contact attempts, whereas almost 90% completed it within eight contact attempts. On the other hand, calling participants another four times (for a total of 12 contact attempts) only increased participation by 5%. These utility cut points (Rdesinski et al., 2008) allow researchers to assess the extent to which completion rates increase with successive contact attempts—and, in turn, whether resources might be better spent on fewer enrollment calls and additional recruitment activities (e.g., eight contact attempts and greater pre-screening).
Figure 2.
Number of pretest survey scheduling attempts per enrolled participant (N = 324).
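To make the utility cut point idea concrete, the sketch below computes a cumulative completion curve from per-participant attempt counts and flags the attempt number beyond which the marginal gain falls below a chosen threshold. The attempt counts are hypothetical stand-ins shaped roughly like Figure 2; the function names and the two-percentage-point threshold are our own illustrative choices, not part of the study protocol.

```python
from collections import Counter

def cumulative_completion(attempts_per_participant):
    """Map each attempt number k to the cumulative % of participants completed by attempt k."""
    counts = Counter(attempts_per_participant)
    total = len(attempts_per_participant)
    curve, completed = {}, 0
    for k in range(1, max(counts) + 1):
        completed += counts.get(k, 0)
        curve[k] = 100.0 * completed / total
    return curve

def utility_cut_point(curve, min_gain=2.0):
    """Last attempt number before the marginal gain (in percentage points) drops below min_gain."""
    previous = 0.0
    for k, pct in curve.items():
        if pct - previous < min_gain:
            return k - 1
        previous = pct
    return max(curve)

if __name__ == "__main__":
    # Hypothetical attempt counts, loosely shaped like Figure 2 (most completions early,
    # very little gain after roughly eight attempts).
    attempts = [1]*40 + [2]*15 + [3]*8 + [4]*5 + [5]*4 + [6]*3 + [7]*2 + [8]*2 + [9] + [10] + [12]
    curve = cumulative_completion(attempts)
    print({k: round(v, 1) for k, v in curve.items()})
    print("Suggested cut point:", utility_cut_point(curve))   # prints 8 for this example
```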
Despite the tremendous investments in recruitment, the study team ultimately had to extend recruiting for the final wave until after the Wave 3 training classes had started. Participants who were recruited late received a make-up session to ensure that the full intervention was delivered.
Recruitment economic assessment
Through our active recruitment efforts, we were able to exceed our goal of 312 enrolled participants. To assess the cost of these efforts, we estimated the staff time required for the three core recruitment activities: in-person outreach presentations, pre-screening (eligibility verification), and enrollment/pretest survey administration. A project director led the outreach sessions, while two research assistants conducted the pre-screening and enrollment/pretest administration calls. For each wave/year, approximately three months of project director time and a total of five months of research assistant time were dedicated to recruitment. The project director's salary was approximately $57,500, and the research assistants' salaries were approximately $35,000. Assuming annual salary increases of 2% due to inflation and an institutional salary fringe rate of 28%, the total staffing costs for three waves of recruitment were estimated at $92,014. The project director also required access to a car to make the outreach trips; travel reimbursement was estimated at $1,425 based on average distances to recruitment sites and the average institutional reimbursement rate for the study period. We also paid each enrolled participant $25 for completing the pretest. Thus, the total estimated recruitment cost across three waves was $101,538.52.
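For readers who want to trace the arithmetic, the short sketch below assembles the recruitment budget from the components reported above. The component figures come from the text; it assumes all 324 enrolled participants received the $25 pretest incentive, and the sub-dollar difference from the reported $101,538.52 reflects rounding of the staffing estimate.

```python
# Back-of-the-envelope reconstruction of the reported recruitment budget.
staffing = 92_014              # project director + research assistants, three waves
                               # (with 2%/year salary increases and a 28% fringe rate)
travel = 1_425                 # project director mileage for outreach trips
pretest_incentives = 324 * 25  # $25 per enrolled participant completing the pretest

total = staffing + travel + pretest_incentives
print(f"Estimated recruitment cost: ${total:,}")   # prints $101,539
```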
Retention Results
Retention strategies: challenges and successes
Our overarching retention strategy involved consistent and intensive contact with study participants and, in this way, was a natural extension of our proactive recruitment approach. This contact took place in several venues—for example, training classes and participants’ homes (during computer installation and technical support provision)—and via communication channels including traditional mail, email, and telephone. Although study staff communicated with participants via all three channels, the modality that perhaps best illustrates the extent of staff–participant interaction is the telephone. Across three waves, there were 7,473 retention-related phone calls, the reasons for which are detailed in Table 2. These calls were initiated or received by three core staff members, all of whom were with C2C from its inception to its conclusion: a project director, who led the recruitment outreach sessions; a research assistant, who conducted the recruitment pre-screening; and a technical support specialist, who handled all computer installations and coordinated technical support efforts.
Table 2.
Retention-Related Telephone Calls by Call Category (N = 7,473)
| Call category | N | % |
|---|---|---|
| Technical support | 2,150 | 28.8 |
| Includes calls about problems with Internet, computer, spyware/viruses, study software (i.e., tracking software), and printer | ||
| Intervention follow-up | 2,133 | 28.5 |
| Includes calls about monthly deliverables (e.g., postcard completion for control participants, Web survey completion for intervention participants) and tracing (e.g., changing address or phone number(s)) | ||
| Class scheduling | 1,690 | 22.6 |
| Includes calls about attendance (e.g., class reminder/confirmation, makeup class scheduling) and class content (e.g., general information, exercises/handouts) | ||
| Broadband setup | 599 | 8.0 |
| Includes calls about broadband installation (e.g., scheduling, confirmation, cancellation/rescheduling) | ||
| Other | 595 | 8.0 |
| Includes calls about scheduling home visits (for non-deployment purposes) and general questions/comments | ||
| Deployment | 306 | 4.1 |
| Includes calls about computer setup and installation (e.g., scheduling, confirmation, cancellation/rescheduling) |
Nearly 30% (n = 2,150) of retention-related calls were technical support calls (Table 2). Although most were between C2C staff and participants, there were occasional calls between study staff and technical support vendors, which were made on behalf of the participant. Technical support interactions pertained only to intervention participants. Of the 155 intervention participants not lost post-randomization (Figure 3), 154 were involved in at least one technical support interaction. The average number of interactions per participant was 14.0 (SD = 10.5; median = 12.0). The issues reported during technical support calls are provided in Table 3. Among the most commonly reported issues were difficulty accessing the Internet (e.g., due to service interruptions), higher-level computer system problems (e.g., software conflicts, participant modification of computer settings), and spyware or viruses. Although there were 2,150 calls logged, some calls involved multiple issues; in total, 2,459 issues were addressed during these calls. Call log notes also revealed that multiple calls often were needed to resolve a particular challenge faced by a given user.
Figure 3.
Click to Connect retention for intervention and control participants.
a 12 participants were deemed ineligible post-randomization. Once C2C staff began installing computers in intervention participants’ homes, they determined that 11 participants had a pre-existing broadband connection, and one was living with a Wave 1 control participant. Removing these 12 ineligible participants yielded a final baseline N of 324.
b An additional 11 participants (8 intervention, 3 control) were lost post-randomization; thus, a total of 313 participants remained in the trial.
Table 3.
Technical Support Issues Reported During Technical Support Calls (N = 2,459a)
| Technical support issue | N | % |
|---|---|---|
| Internet | 1,088 | 44.2 |
| Refers to inability to access the Internet (e.g., due to service interruptions, software issues, or participant modification of computer/browser setup) | ||
| Computer | 647 | 26.3 |
| Refers to higher-level system problems (e.g., software conflicts, participant modification of computer settings) | ||
| Spyware/viruses | 273 | 11.1 |
| Refers to a non-working or limited functionality computer due to the presence of spyware, malware, or viruses | ||
| Study software | 216 | 8.7 |
| Refers to issues related to the tracking software, which was installed on every computer as part of the evaluation process for the study | ||
| Other | 147 | 6.0 |
| Refers to all other technical support issues (e.g., forgotten passwords, contact info for technical support vendors) | |
| Printer | 88 | 3.6 |
| Refers to any printer-related issues |
a Total number of issues reported during 2,150 technical support telephone calls (multiple issues were reported during some calls).
In addition, nearly 30% (n = 2,133) of retention-related calls were follow-up calls (Table 2). Most of these involved following up with control participants to obtain monthly postcards and with intervention participants to obtain monthly survey responses. As described in the Method section, each month study staff mailed postcards to control participants asking for contact information updates/confirmations. After returning each postcard, control participants received a $5 gift card. Across three waves, 41.7% (n = 890) of follow-up calls were postcard-related, most of which (n = 569) were initiated by C2C staff. These interactions likely contributed to an average postcard completion rate of 94.5% among control participants. Intervention participants had a different monthly task: They were expected to complete a brief survey about their computer and Internet use experiences. The survey's primary purpose was to stay in touch with intervention participants outside of the classroom, but it also gave participants the opportunity to provide feedback on the C2C Web portal and report any difficulties they were experiencing. The survey link was emailed to intervention participants; they learned to access the survey during the first email training class, which was held during month two of the intervention. Nearly one-quarter (24.1%, n = 515) of follow-up calls were related to monthly surveys (e.g., reminder of survey ID, reminder to complete), and most (n = 429) were initiated by participants. Across waves, the average monthly survey completion rate for intervention participants was 71.9%.
One important call category included conversations about the computer and Internet training classes. Nearly one-quarter of all retention-related calls (22.6%, n = 1,690) pertained to training classes (Table 2), and of those, 56.3% (n = 952) were attendance reminder or confirmation calls initiated by C2C staff (n = 594) or participants (n = 358). Unlike clinic-based interventions, community-based interventions like C2C require participant engagement outside of the typical clinical encounter. Training classes were a key component of the C2C intervention, and thus it was essential to maximize participant attendance. There were a total of nine required classes; participants were called and reminded to attend and, if necessary, staff called participants to schedule a make-up class. Participants sometimes called to confirm a class time or to request a make-up class. The primary reasons participants missed classes included work conflicts, personal or family health issues, and other family issues (e.g., a death in the family); other reasons included transportation problems, weather, and inability to find childcare. Ultimately, by continually interacting with participants and accommodating their schedules, C2C staff achieved an overall class attendance rate of 84.3% across three waves.
Maintaining consistent contact with participants sometimes proved challenging, given the residential instability observed during the study. Across waves, there were 107 changes of address by 82 participants—or one-quarter of study participants. In addition, there were 118 changes of telephone number (76 cell phone, 42 home landline) by 85 participants. Although some participants changed their address or phone number more than once during the study, we observed a substantial amount of mobility among participants overall. Some of the reasons participants reported moving included eviction or homelessness, intimate partner violence or other relationship issues, and immigration issues—all of which may be more common among lower SEP populations.
Despite mobility and related challenges (e.g., disconnected phone service, unanswered phones), C2C had an 87.8% retention rate, which surpassed our a priori expectations. That said, compared with pretest administration, posttest administration required more contact attempts to achieve survey completion. Comparing Figures 2 and 4, we observe that among those with complete follow-up, only half (52.6%) completed the posttest within four contact attempts, whereas 66.0% completed the pretest within four attempts. Nearly three-quarters (74.1%) had completed the posttest within eight contact attempts, while almost 90% completed the pretest after the same number of attempts. As evident in Figure 4, subsequent calls yielded additional completed posttest surveys, but over time such contact produced comparatively fewer completions.
Figure 4.
Number of posttest survey scheduling attempts per study participant with complete follow-up (N = 275).
Retention economic assessment
To assess retention costs, we estimated the amount of staff time required for participant engagement and technical support, as well as the costs of retention materials and incentives. Major staffing costs were for the two research assistants (approximately six months of effort each per year). Using the same salary, inflation, and fringe benefit assumptions presented above, the total staffing costs for three years were $111,210.58. The intervention required three years of technical support, which was outsourced for the first two years and then brought in-house for the third year. Technical support costs included 0.8 FTE for a technical specialist and invoiced vendor costs for the first two years; the outsourced vendor's rate was $75/hour. The estimated cost for one year of technical support was $51,000, totaling $153,000 across three years. We also provided monthly incentives to 158 control participants to keep their contact information current for one year, for an approximate cost of $9,006. Intervention participants received a $10 incentive for completing a health literacy assessment; 143 individuals completed the assessment, for an additional cost of $1,430. All participants who completed the posttest (87.8%) received a $25 incentive. Thus, the total retention cost for three years was estimated at $283,031.58.
Discussion
The goal of this paper was to describe strategies used to recruit and retain low SEP novice computer users for the C2C project, a community-based eHealth intervention. We considered the estimated costs of these strategies—budgetary considerations that are not often discussed in the research literature—and we described the central recruitment and retention challenges we encountered during the study.
To recruit participants, we used a proactive approach, with an emphasis on in-person presentations and personal contact with community members and organizations. Although this strategy is resource-intensive, the literature suggests that it was appropriate for the C2C intervention, and thus it likely contributed to the study’s successful recruitment efforts. Proactive approaches are common in community-based interventions such as this one (UyBico et al., 2007), and personal contact has been described as a primary recruitment vehicle in studies with underserved populations (Graham, Lopez-Class, Mueller, Mota, & Mandelblatt, 2011; Nicholson et al., 2011; Yancey et al., 2006), perhaps due in part to lower levels of literacy among these populations. Additionally, there is evidence that “proactive strategies are associated with higher recruitment yields when eligibility is rare” (Yancey et al., 2006, p. 16). As noted in the Method section, C2C had numerous eligibility criteria; thus, active in-person recruitment was important because it enabled us to cast a wide net, generating a large pool of potential participants. The opportunity to engage with a large pool of target population members was greatly facilitated by our partnership with community-based institutions, in our case adult education centers. This strategy has been used successfully in other attempts to increase health literacy and lessen the digital divide for vulnerable populations (Kreps, 2005). Our hope was that by extensively pre-screening participants, we would minimize subsequent ineligibility and refusal during enrollment.
Yet this approach did not preclude the challenges associated with engaging a lower SEP population. Disconnected telephones, wrong numbers, and lack of answering machines have been identified as barriers to recruitment among minority and lower income populations (Eakin et al., 2007; Mendez-Luck et al., 2011; Osann et al., 2011), as have transportation, time, and schedule constraints (Ejiogu et al., 2011; Withall, Jago, & Fox, 2011). Consistent with prior studies, our results showed that these were the primary reasons potential participants were ineligible or refused to participate. In addition, researchers have documented residential instability among lower SEP populations (Blumenthal et al., 1995; Eakin et al., 2007; Kan, 1999; Schafft, 2006), and indeed several prospective participants indicated they would be leaving greater Boston within the next year. Researchers conducting eHealth interventions with underserved populations should ensure that there is a sufficiently large base of potential participants and then utilize efficient pre-screening mechanisms to identify eligible enrollees.
These challenges are relevant not only to recruitment efforts but also to retention. For example, some intervention participants routinely lost telephone service, as they could not afford their bill. Phone disconnection had two outcomes: First, it limited staff members’ ability to reach participants, and second, it led to Internet service interruptions. The second outcome was unanticipated but had an important impact on intervention delivery. Because the cable provider bundled broadband and phone service, Internet access was disrupted when phone bills went unpaid, even though the project was paying for participants’ Internet. Participant mobility was another challenge: One-quarter of participants reported changing their address and/or phone number during the study. Such residential instability surely can influence retention efforts, insofar as it becomes crucial to track participants and update contact information regularly (e.g., by collecting multiple telephone numbers; El-Khorazaty et al., 2007; Rdesinski et al., 2008).
Ultimately, retention seems to have benefited greatly from high levels of contact between study staff and participants. These points of contact often occurred via phone, and although call content may have revolved primarily around training class scheduling, postcard or monthly survey reminders, and technical support, the interactions also provided an opportunity for engagement and social support. Moreover, the same staff members were involved in the project from its inception to its conclusion, which enabled them to develop rapport with participants. The importance of both intensive contact with participants and staff consistency has been described in previous studies (e.g., Douyon, Chavez, Bunte, Horsburgh, & Strunin, 2010; Paskett et al., 2008; Yancey et al., 2006). These interactions worked alongside other established retention strategies, such as providing timely incentives and prioritizing participant convenience (e.g., offering multiple class locations, rescheduling classes; El-Khorazaty et al., 2007; Nicholson et al., 2011; Webb et al., 2010; Yancey et al., 2006). These resource-intensive efforts yielded a higher than expected retention rate (87.8%), which also surpassed the rates reported in clinical and behavioral intervention studies with low SEP populations (Eakin et al., 2007; Nicholson et al., 2011; Webb et al., 2010).
The use of community organizations as delivery channels afforded another important support for retention and engagement in this eHealth intervention. Using a physical site for training sessions allowed for in-person interaction between study staff and participants, as well as among participants, fostering a sense of community and connectedness. At the same time, the partnership with established community-based organizations decreased delivery costs.
In recruitment and retention studies with underserved populations, proactive in-person recruitment approaches and high-contact retention efforts are touted, yet the budgetary implications of such strategies are not always considered (UyBico et al., 2007). Some studies have attempted to quantify the costs of these efforts (Berger et al., 2009; Cotter, Burke, Stouthamer-Loeber, & Loeber, 2005; Graham et al., 2011; Gustafson et al., 2005; Keyzer et al., 2005; Rdesinski et al., 2008), but overall there has been less attention to where and how researchers should invest limited funds. In addition, to our knowledge there have been no such efforts in the context of eHealth interventions with underserved populations.
Our results suggest that it may be necessary to build interpersonal connections into the study—whether through telephone-based contact or other mechanisms—to be successful in recruitment and retention. Such interactions are essential even in eHealth interventions. Simply providing C2C intervention participants with computer and Internet access would not have been sufficient. Novice users needed support through the steep learning curve surrounding technology use, as evidenced by the large number of technical support interactions (e.g., spyware that led to disabled computers, participant modification of computer or browser setup that affected Internet access). In other words, eHealth cannot be equated with complete automation; rather, technical support will be another important point of contact for participants.
A pilot study that drove C2C's development also highlighted the importance of technical support for novice computer and Internet users, finding that the number of technical support calls decreased dramatically over the one-year intervention period (Kontos, Bennett, & Viswanath, 2007). Relying primarily on in-house technical support staff (rather than external vendors) may prove worthwhile, insofar as it will enable staff to maintain connections with participants and build their sense of obligation to the study. In-house technical support is also likely to be much less expensive than outsourced support. At a minimum, project staff will likely be able to manage a large portion of the support that is required and request high-level help as needed; in our experience, the level of complexity of problems did not always warrant the cost of a highly trained information technology professional.
Beyond technical support, results suggest that persistent call attempts—during both recruitment and retention stages of the project—are important. Such persistence has been described as an important recruitment (Eakin et al., 2007; Rdesinski et al., 2008) and retention (Cotter et al., 2005; Eakin et al., 2007; Yancey et al., 2006) strategy, but less is known about whether persistence has diminishing returns. In our analyses of pretest and posttest call attempts, we identified utility cut points that suggest that after a certain number of attempts, researchers might want to consider redirecting resources toward other recruitment or retention activities. For instance, for a highly transient study population, more recruitment call attempts might seem wise; however, our results suggest that eight attempts could achieve sufficient enrollment and that it may be more important to earmark additional funds for pre-screening.
Our estimated costs for recruitment and retention efforts totaled $384,570.10. Of this total, about one-quarter ($101,538.52) went toward recruitment, while almost three-quarters ($283,031.58) went toward retention. Importantly, our analyses show that engaging a lower SEP population in a community-based eHealth intervention can be quite successful with sufficient resource allocation. Researchers have recommended that federal grants allow for greater budgets to test different recruitment strategies that would ensure minority and underserved participation (Yancey et al., 2006). Still others have called for greater documentation of the full costs associated with recruitment and retention efforts so that adequate funding can be obtained for these activities (Rdesinski et al., 2008). In the same vein, researchers who have undertaken resource-intensive retention efforts have suggested that limiting follow-up would greatly increase attrition and that the costs to the study would be greater than any short-term savings (Cotter et al., 2005). The current study adds to this literature by estimating the costs of these efforts and, in turn, underscoring the need for large recruitment and retention budgets. It is vital to understand the resource requirements to recruit and retain low SEP populations in order to successfully reach, study, and ultimately improve the health of these vulnerable populations.
Several study limitations should be noted. First, we did not empirically evaluate different recruitment and retention strategies. Instead, we described the strategies we used, the outcomes we achieved, the challenges we encountered, and the estimated costs of our efforts. Second, we did not collect demographic data from eligible non-participants, and therefore we could not assess the representativeness of those enrolled. Third, participants who were deemed ineligible might have benefited from the intervention. Future eHealth efforts with low SEP populations should consider ways to overcome barriers to eligibility—for example, by providing training classes in languages other than English and offering classes in multiple locations at different times. Fourth, all participants were adult education students and thus were already activated and engaged in learning. Although this was a strength from the standpoint of recruitment and retention, generalizability to other low SEP populations may be constrained. Fifth, providing home broadband access and support is expensive, particularly when compared to using Internet access sites in the community. If we had used publicly available sites, it would have been less costly, and personal contact might have been less important. Crucially, though, using publicly available sites would have limited participants’ Internet access; a central C2C goal was to examine low SEP groups’ routine Internet use once most physical and financial barriers to access were eliminated. Last, the C2C intervention occurred between 2007 and 2010, and thus some of these data are two to five years old. However, while this intervention might look somewhat different if it were conducted today, differential Internet access, use, and engagement—as well as eHealth literacy—remain pressing concerns (Hargittai, 2010; Lee, 2009).
Although eHealth applications have the potential to reduce communication inequalities and, in turn, address health disparities, there is concern that these offerings could in fact widen existing gaps, due to issues of technology access and literacy (Ahern, Kreslake, & Phalen, 2006; Viswanath, 2011). The current study highlights the investments required if the promise of eHealth with underserved populations is to be realized. Researchers have underscored the importance of human support in eHealth (Glasgow, 2007), and our results suggest that personal contact was the central strategy in successfully recruiting and retaining C2C participants. By describing our overarching strategy, the concomitant challenges, and the budgetary implications, we hope to inform and drive future eHealth efforts within low SEP communities.
Acknowledgments
This study was supported by a grant from the National Cancer Institute (NCI) (5 R01-CA122894; Viswanath, PI). This publication's contents are the responsibility of the authors and do not necessarily represent the official views of NCI. Funding support for R.H.N. was provided through NCI by the Harvard Education Program in Cancer Prevention (5 R25-CA057711). We thank Neyha Sehgal, Martha Zorn, and Elaine Puleo for their help with data collection.
Contributor Information
Rebekah H. Nagler, Department of Society, Human Development, and Health, Harvard School of Public Health and Center for Community-Based Research, Dana-Farber Cancer Institute
Shoba Ramanadhan, Center for Community-Based Research, Dana-Farber Cancer Institute
Sara Minsky, Center for Community-Based Research, Dana-Farber Cancer Institute
K. Viswanath, Department of Society, Human Development, and Health, Harvard School of Public Health and Center for Community-Based Research, Dana-Farber Cancer Institute
References
- Adler NE, Stewart J, Cohen S, Cullen M, Diez Roux AV, Dow W, Williams D. Reaching for a healthier life. San Francisco, CA: The John D. and Catherine T. MacArthur Foundation Research Network on Socioeconomic Status and Health; 2007. Retrieved from www.macses.ucsf.edu/downloads/Reaching_for_a_Healthier_Life.pdf.
- Ahern DK, Kreslake JM, Phalen JM. What is eHealth (6): perspectives on the evolution of eHealth research. Journal of Medical Internet Research. 2006;8(1):e4. doi: 10.2196/jmir.8.1.e4.
- Berger LK, Begun AL, Otto-Salaj LL. Participant recruitment in intervention research: scientific integrity and cost-effective strategies. International Journal of Social Research Methodology. 2009;12(1):79–92. doi: 10.1080/13645570701606077.
- Berkman LF, Kawachi I. Social epidemiology. New York: Oxford University Press; 2000.
- Blumenthal DS, Sung J, Coates R, Williams J, Liff J. Recruitment and retention of subjects for a longitudinal cancer prevention study in an inner-city Black community. Health Services Research. 1995;30(1):197–205.
- Centers for Disease Control and Prevention. CDC health disparities & inequalities report – United States, 2011. MMWR. 2011;60(Suppl):1–116. Retrieved from http://www.cdc.gov/mmwr/pdf/other/su6001.pdf.
- Cotter RB, Burke JD, Stouthamer-Loeber M, Loeber R. Contacting participants for follow-up: how much effort is required to retain participants in longitudinal studies? Evaluation and Program Planning. 2005;28(1):15–21. doi: 10.1016/j.evalprogplan.2004.10.002.
- Douyon M, Chavez M, Bunte D, Horsburgh CR, Strunin L. The GirlStars program: challenges to recruitment and retention in a physical activity and health education program for adolescent girls living in public housing. Preventing Chronic Disease. 2010;7(2):A42. Retrieved from http://www.cdc.gov/pcd/issues/2010/mar/08_0248.htm.
- Eakin EG, Bull SS, Riley K, Reeves MM, Gutierrez S, McLaughlin P. Recruitment and retention of Latinos in a primary care-based physical activity and diet trial: The Resources for Health study. Health Education Research. 2007;22(3):361–371. doi: 10.1093/her/cyl095.
- Ejiogu N, Norbeck JH, Mason MA, Cromwell BC, Zonderman AB, Evans MK. Recruitment and retention strategies for minority or poor clinical research participants: lessons from the Healthy Aging in Neighborhoods of Diversity across the Life Span Study. Gerontologist. 2011;51(Suppl 1):S33–S45. doi: 10.1093/geront/gnr027.
- El-Khorazaty MN, Johnson AA, Kiely M, El-Mohandes AA, Subramanian S, Laryea HA, Joseph JG. Recruitment and retention of low-income minority women in a behavioral intervention to reduce smoking, depression, and intimate partner violence during pregnancy. BMC Public Health. 2007;7:233. doi: 10.1186/1471-2458-7-233.
- Glasgow RE. eHealth evaluation and dissemination research. American Journal of Preventive Medicine. 2007;32(5 Suppl):S119–S126. doi: 10.1016/j.amepre.2007.01.023.
- Graham AL, Lopez-Class M, Mueller NT, Mota G, Mandelblatt J. Efficiency and cost-effectiveness of recruitment methods for male Latino smokers. Health Education & Behavior. 2011;38(3):293–300. doi: 10.1177/1090198110372879.
- Gustafson DH, McTavish FM, Stengle W, Ballard D, Jones E, Julesberg K, Hawkins R. Reducing the digital divide for low-income women with breast cancer: a feasibility study of a population-based intervention. Journal of Health Communication. 2005;10(Suppl 1):173–193. doi: 10.1080/10810730500263281.
- Hargittai E. Second-level digital divide: differences in people’s online skills. First Monday. 2002;7. Retrieved from http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/942/864.
- Hargittai E. Digital na(t)ives? variation in Internet skills and uses among members of the “net generation”. Sociological Inquiry. 2010;80(1):92–113. doi: 10.1111/j.1475-682X.2009.00317.x.
- Kan K. Expected and unexpected residential mobility. Journal of Urban Economics. 1999;45(1):72–96. doi: 10.1006/juec.1998.2082.
- Kelly B, Hornik R, Romantan A, Schwartz JS, Armstrong K, DeMichele A, Wong N. Cancer information scanning and seeking in the general population. Journal of Health Communication. 2010;15(7):734–753. doi: 10.1080/10810730.2010.514029.
- Keyzer JF, Melnikow J, Kuppermann M, Birch S, Kuenneth C, Nuovo J, Rooney M. Recruitment strategies for minority participation: challenges and cost lessons from the power interview. Ethnicity & Disease. 2005;15(3):395–406.
- Knapp C, Madden V, Wang H, Sloyer P, Shenkman E. Internet use and eHealth literacy of low-income parents whose children have special health care needs. Journal of Medical Internet Research. 2011;13(3):e75. doi: 10.2196/jmir.1697.
- Kontos EZ, Bennett GG, Viswanath K. Barriers and facilitators to home computer and Internet use among urban novice computer users of low socioeconomic position. Journal of Medical Internet Research. 2007;9(4):e31. doi: 10.2196/jmir.9.4.e31.
- Kreps GL. Disseminating relevant health information to underserved audiences: implications of the Digital Divide Pilot Projects. Journal of the Medical Library Association. 2005;93(4 Suppl):S68–S73.
- Kreps GL, Neuhauser L. New directions in eHealth communication: opportunities and challenges. Patient Education and Counseling. 2010;78(3):329–336. doi: 10.1016/j.pec.2010.01.013.
- Lee CJ. The role of Internet engagement in the health-knowledge gap. Journal of Broadcasting & Electronic Media. 2009;53(3):365–382. doi: 10.1080/08838150903102758.
- Mendez-Luck CA, Trejo L, Miranda J, Jimenez E, Quiter ES, Mangione CM. Recruitment strategies and costs associated with community-based research in a Mexican-origin population. Gerontologist. 2011;51(Suppl 1):S94–S105. doi: 10.1093/geront/gnq076.
- Nicholson LM, Schwirian PM, Klein EG, Skybo T, Murray-Johnson L, Eneli I, Groner JA. Recruitment and retention strategies in longitudinal clinical studies with low-income populations. Contemporary Clinical Trials. 2011;32(3):353–362. doi: 10.1016/j.cct.2011.01.007.
- Norman CD, Skinner HA. eHealth literacy: essential skills for consumer health in a networked world. Journal of Medical Internet Research. 2006;8(2):e9. doi: 10.2196/jmir.8.2.e9.
- Osann K, Wenzel L, Dogan A, Hsieh S, Chase DM, Sappington S, Nelson EL. Recruitment and retention results for a population-based cervical cancer biobehavioral clinical trial. Gynecologic Oncology. 2011;121(3):558–564. doi: 10.1016/j.ygyno.2011.02.007.
- Paskett E, Reeves K, McLaughlin J, Katz M, McAlearney A, Ruffin M, Gehlert S. Recruitment of minority and underserved populations in the United States: the Centers for Population Health and Health Disparities experience. Contemporary Clinical Trials. 2008;29(6):847–861. doi: 10.1016/j.cct.2008.07.006.
- Rdesinski RE, Melnick AL, Creach ED, Cozzens J, Carney PA. The costs of recruitment and retention of women from community-based programs into a randomized controlled contraceptive study. Journal of Health Care for the Poor and Underserved. 2008;19(2):639–651. doi: 10.1353/hpu.0.0016.
- Redmond N, Baer HJ, Clark CR, Lipsitz S, Hicks LS. Sources of health information related to preventive health behaviors in a national study. American Journal of Preventive Medicine. 2010;38(6):620–627. doi: 10.1016/j.amepre.2010.03.001.
- Schafft KA. Poverty, residential mobility, and student transiency within a rural New York school district. Rural Sociology. 2006;71(2):212–231. doi: 10.1526/003601106777789710.
- Shim M, Kelly B, Hornik RC. Cancer information scanning and seeking behavior is associated with knowledge, lifestyle choices, and screening. Journal of Health Communication. 2006;11(Suppl 1):157–172. doi: 10.1080/10810730600637475.
- Strecher V. Internet methods for delivering behavioral and health-related interventions (eHealth). Annual Review of Clinical Psychology. 2007;3(1):53–76. doi: 10.1146/annurev.clinpsy.3.022806.091428.
- U.S. National Telecommunications & Information Administration. Exploring the digital nation: Home broadband Internet adoption in the United States. 2010. Retrieved from http://www.ntia.doc.gov/report/2010/exploring-digital-nation-home-broadband-internet-adoption-united-states.
- UyBico SJ, Pavel S, Gross CP. Recruiting vulnerable populations into research: a systematic review of recruitment interventions. Journal of General Internal Medicine. 2007;22(6):852–863. doi: 10.1007/s11606-007-0126-3.
- Viswanath K. Science and society: the communications revolution and cancer control. Nature Reviews Cancer. 2005;5(10):828–835. doi: 10.1038/nrc1718.
- Viswanath K. Public communications and its role in reducing and eliminating health disparities. In: Thomson GE, Mitchell F, Williams MB, editors. Examining the health disparities research plan of the National Institutes of Health: unfinished business. Washington, DC: Institute of Medicine; 2006. pp. 215–253.
- Viswanath K. Cyberinfrastructure: an extraordinary opportunity to bridge health and communication inequalities? American Journal of Preventive Medicine. 2011;40(5 Suppl 2):S245–S248. doi: 10.1016/j.amepre.2011.02.005.
- Webb DA, Coyne JC, Goldenberg RL, Hogan VK, Elo IT, Bloch JR, Culhane JF. Recruitment and retention of women in a large randomized control trial to reduce repeat preterm births: the Philadelphia Collaborative Preterm Prevention Project. BMC Medical Research Methodology. 2010;10:88. doi: 10.1186/1471-2288-10-88.
- Withall J, Jago R, Fox KR. Why some do but most don’t. Barriers and enablers to engaging low-income groups in physical activity programmes: a mixed methods study. BMC Public Health. 2011;11:507. doi: 10.1186/1471-2458-11-507.
- Yancey AK, Ortega AN, Kumanyika SK. Effective recruitment and retention of minority research participants. Annual Review of Public Health. 2006;27:1–28. doi: 10.1146/annurev.publhealth.27.021405.102113.



