Abstract
Three barriers investigators often encounter when conducting longitudinal work with homeless or other marginalized populations are difficulty tracking participants, high rates of no-shows for follow-up interviews, and high rates of loss to follow-up. Recent research has shown that homeless populations have substantial access to information technologies, including mobile devices and computers. These technologies have the potential both to make longitudinal data collection with homeless populations easier and to minimize some of these methodological challenges. This pilot study’s purpose was to test whether individuals who were homeless and sleeping on the streets—the “street homeless”—would answer questions remotely through a web-based data collection system at regular “follow-up” intervals. We attempted to simulate longitudinal data collection in a condensed time period. Participants (N = 21) completed an in-person baseline interview. Each participant was given a remotely reloadable gift card. Subsequently, weekly for 8 weeks, participants were sent an email with a link to a SurveyMonkey questionnaire. Participants were given 48 h to complete each questionnaire. Data were collected about life on the streets, service use, community inclusion, substance use, and high-risk sexual behaviors. Ten dollars was remotely loaded onto each participant’s gift card when they completed the questionnaire within the completion window. A substantial number of participants (67% of the total sample and 86% of the adjusted sample) completed at least seven out of the eight follow-up questionnaires. Most questionnaires were completed at public libraries, but several were completed at other types of locations (social service agencies, places of employment, relative/friend/acquaintance’s domiciles, or via mobile phone). Although some of the questions were quite sensitive, very few participants skipped any questions. 
The only variables associated with questionnaire completion were frequency of computer use and education—both positive associations. This pilot study suggests that collecting longitudinal data online may be feasible with a subpopulation of persons experiencing homelessness. We suspect that participant follow-up rates using web-based data collection methods have the potential to exceed follow-up rates using traditional in-person interviews. If this population of persons experiencing street homelessness can be successful with this method of data collection, perhaps other disenfranchised, difficult-to-track, or difficult-to-reach populations could be followed using web-based data collection methods. Local governments are striving to decrease the “digital divide,” providing free or greatly discounted wi-fi connectivity as well as mobile computer lab access to low-income geographic areas. These actions, in combination with increased smart phone ownership, may permit vulnerable populations to connect and communicate with investigators.
Keywords: Homeless, Longitudinal data collection, Information technology, Technology, Computers, Mobile phones, Tracking, No-show, Follow-up, Internet
Background
Homeless populations have disproportionately high rates of physical-, mental-, and substance use-related morbidities and comorbidities [1–3]. Many investigators have examined prevention and intervention strategies for use with this vulnerable population [4–9]. Three barriers investigators often encounter when conducting longitudinal work with homeless or other marginalized populations are difficulty tracking participants, high rates of no-shows for follow-up interviews, and high rates of loss to follow-up [10–13]. Persons experiencing homelessness have some additional context-specific barriers.
The “unsheltered homeless,” those who choose to sleep on the streets and in other areas not meant for human habitation, can be especially difficult to track. The daytime locations frequented by these individuals often change. These changes may be due to several factors: the days and times during which social services and health care appointments are offered, work and school schedules, volunteer work, social obligations, mental health problems, or substance seeking/using behaviors and their aftermath. For some of the unsheltered homeless, the nighttime hours are just as varied. Nevertheless, those who do sleep overnight in a typical location are sometimes protective of that information. Regardless of disclosure, these individuals may suddenly need to find another place to stay if, for instance, the Department of Sanitation conducts an unannounced “sweep” and disposes of their belongings or if the Police Department begins enforcing non-loitering ordinances in their area [14]. In addition to these considerations, research associates may not feel safe traveling to these locations at night to schedule or conduct follow-up interviews with participants. Individuals who recently became homeless can be just as difficult to track. They often “try out” the shelter system and various social services for a period of time until they find the right fit. Therefore, their schedules and associations with locations are often inconsistent [14]. The “sheltered homeless”—those who choose to use the emergency, transitional, and safe haven housing system—can be slightly easier to track. Their daytime activities may or may not be consistent, but they are linked to a nighttime location that research associates would likely feel safer visiting [14]. Additionally, those who have mobile phones often deplete their allotment of minutes/text messages, have difficulty keeping their phones charged, discontinue service for several days at a time, change phone numbers, or lose their phones [15, 16]. 
Those staying overnight in shelters may not have access to their phones in the evening and nighttime hours [17].
Investigators working with homeless populations often encounter high rates of no-shows for follow-up interviews. For instance, when investigators schedule follow-up interviews months in advance (as is often the case in longitudinal research), it may be difficult for the participant to remember the appointment date, time, and location. Delivering reminder notices to this population has been difficult in the past, but it has become somewhat easier with the increasing availability of mobile phones, email addresses, and social networking sites. For participants who remember appointments, competing priorities may hinder showing up for these appointments. The follow-up interview may be less important to a participant than a job interview, medical appointment, or desirable meal. Homeless individuals may be unable to travel to study locations for interviews due to lack of funds for travel expenses, inclement weather, or physical ailments that make it difficult for them to walk long distances [18]. Traveling with their belongings may pose another concern, as the unsheltered homeless are often wary of leaving their belongings most places due to risk of theft [14].
The difficulty of tracking participants coupled with high rates of no-shows for follow-up interviews can lead to permanent loss to follow-up. According to a meta-analysis [19], follow-up rates in randomized controlled trials assessing substance use outcomes of homeless adults range from 38 to 93%, with a mean weighted follow-up rate of 71%. Low follow-up rates diminish confidence in results. To increase follow-up rates, Ribisl and colleagues [20] recommend that researchers “make research involvement convenient and rewarding” for participants (p. 9). Newer technologies, such as computers, tablets, smartphones, and other mobile devices, may make this easier. Web-based self-administered data collection systems allow participants to complete a survey at a time and location of their choice [21]. When entering data into a computer, participants can stop and restart the survey as many times as they choose [22] and can read and re-read questions without feeling pressured to answer quickly [23]. This approach has the potential to increase follow-up rates among homeless populations, which would markedly improve their representativeness in study results, as well as the generalizability of findings. Improved generalizability would, in turn, assist in the dissemination and implementation of study findings for policy reform for persons experiencing homelessness.
Homeless populations have substantial access to information technologies, including mobile devices and computers [15, 24–30]. In adult homeless populations, mobile phone ownership ranges from 44 to 62%, recent computer use is reported to be 47%, and Internet use ranges from 19 to 84% [27]. Participants report using these technologies to remain in communication with friends and family members as well as for work-related activities, communication with health care providers, personal entertainment, social networking, and education [15, 24, 25, 31–33]. Investigators argue that information technology is integral to meeting survival needs and experiencing social inclusion among homeless populations [34].
Some investigators have published innovative efforts designed to track and provide remuneration to homeless and other mobile populations in longitudinal studies [35, 36]. Yet, to our knowledge, no one has investigated collecting data from the adult homeless population using a web-based data collection system. In this pilot study, we attempted to simulate longitudinal data collection in a condensed time period. More specifically, we tested whether individuals who were homeless and sleeping on the streets would answer questions remotely using technology at regular “follow-up” intervals over the course of 8 weeks.
Methods
This study was approved by Temple University’s Institutional Review Board. The research team consisted of the investigator, two full-time bachelor’s-level research associates, one full-time graduate student, and one individual who was formerly homeless and had maintained strong ties with the homeless community.
Recruitment and Study Enrollment
Research team members approached individuals who were “hanging out” in public spaces (e.g., train and subway stations, parks, stairs leading to the public library’s entrance) and told them we were “conducting a study on technology use.” Individuals expressing interest in the study were screened for eligibility. Eligibility criteria included (1) having slept on the streets for at least 8 of the past 14 nights, (2) not staying at a shelter during the past 14 nights, (3) feeling comfortable using a computer, (4) feeling comfortable reading text on a computer, (5) having an email address and knowing the password, and (6) having access to a computer or mobile device with Internet access. Because the pilot study was partially funded by a mental health grant, the first ten participants also had to report having either a mental health or substance use problem. Individuals who met eligibility criteria were given a thorough description of the study and were invited to participate. Those who agreed completed an informed consent process.
Baseline Interview
Immediately following the informed consent process, participants completed a 45- to 75-min, in-person, baseline interview. This interview began with an open-ended history of housing, marginal housing (“doubling up”), shelter use, and other homelessness. This was followed by the Homeless Supplement to the Diagnostic Interview Schedule [37], which served as a check on the data gathered during the open-ended residential history. Next completed were the Substance Use and Psychiatric Sections of the Addiction Severity Index [38], questions about technology use [15, 24], the Personal Empowerment Scale [39], the Empowerment Scale [40], and the Herth Hope Index [41]. The interview concluded with several demographic questions. Interviews were audio-recorded to ensure the open-ended residential history information was captured completely. At the close of the interview, participants were given a $10 gift card to Dunkin’ Donuts and a Temple University College of Health Professions identification card holder in which they could carry their gift card.
Follow-Up Interviews
Follow-up interviews began 7 days after baseline interview completion. Each week, for 8 weeks, participants were sent an email with a link to a SurveyMonkey questionnaire. Because the interval between surveys was so short, participants were given only 48 h to complete each questionnaire. Within 24 h of questionnaire completion, an additional $10 was loaded remotely onto the participant’s Dunkin’ Donuts gift card.
Each follow-up interview began with two questions: a logistical item: “How are you completing this survey right now?” (response options included “using a computer at a library,” “using a computer at a social services agency,” “using a computer at your place of employment,” “using a computer at a relative/friend/acquaintance’s place,” “using a cell phone,” and “other”) and a global quality of life item: “How do you feel about your life in general?” (response options included “terrible,” “unhappy,” “mostly dissatisfied,” “mixed,” “mostly satisfied,” “pleased,” and “delighted”) [42]. Additional questions were asked each week about varying topics—life on the streets (weeks 1 and 5), service use (weeks 2 and 6), community inclusion (weeks 3 and 7), and substance use and high-risk sexual behaviors (weeks 4 and 8). (See Table 1 for details about measures.) Because many of the questionnaire items could be considered sensitive, “I don’t want to answer this question.” was a response option for every item on every questionnaire. Questionnaire completion time was typically between 3 and 10 min.
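For replication purposes, the weekly cadence described above reduces to a simple date computation. The sketch below is illustrative only; the study's actual email-scheduling tooling is not described in the text, and the function and variable names are ours.

```python
from datetime import datetime, timedelta

def followup_schedule(baseline_completed, weeks=8, window_hours=48):
    """Return (send_time, deadline) pairs for each weekly follow-up.

    The first questionnaire link is emailed 7 days after the baseline
    interview, and each link stays open for a 48-hour completion window.
    """
    schedule = []
    for week in range(1, weeks + 1):
        send_time = baseline_completed + timedelta(days=7 * week)
        deadline = send_time + timedelta(hours=window_hours)
        schedule.append((send_time, deadline))
    return schedule

# Hypothetical baseline interview completed June 3, 2013, at 10:00.
schedule = followup_schedule(datetime(2013, 6, 3, 10, 0))
```

For this hypothetical baseline, the week 1 link goes out on June 10 at 10:00 and closes 48 h later, on June 12 at 10:00.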
Table 1.
Measures administered at various time points
Time point | Type of administration | Measures | Number of questions |
---|---|---|---|
Baseline | In-person interview | • Open-ended history of housing, marginal housing (doubling up), shelter use, and homelessness • Homeless Supplement to the Diagnostic Interview Schedule • Addiction Severity Index—Substance Use and Psychiatric Sections • Questions about technology use • Personal Empowerment Scale • Empowerment Scale • Herth Hope Index • Demographic questions | 139 |
Week 1 follow-up | Web-based SurveyMonkey questionnaire | • Place of questionnaire completion • Quality of life item • Questions about hunger, where participants stayed overnight during the past seven days, safety, personal hygiene, personal item storage, and technology use | 21 |
Week 2 follow-up | Web-based SurveyMonkey questionnaire | • Place of questionnaire completion • Quality of life item • Treatment services review—selected items measuring service use for physical health, substance use, mental health, education, social service, and criminal justice problems | 20 |
Week 3 follow-up | Web-based SurveyMonkey questionnaire | • Place of questionnaire completion • Quality of life item • Temple University Community Participation Measure (assesses how integrated participants were with the local community) | 33 |
Week 4 follow-up | Web-based SurveyMonkey questionnaire | • Place of questionnaire completion • Quality of life item • Risk Behavior Survey—selected items to measure substance use and high-risk sexual behaviors | 22 |
Week 5 follow-up | Web-based SurveyMonkey questionnaire | • Place of questionnaire completion • Quality of life item • Questions about hunger, where participants stayed overnight during the past seven days, safety, personal hygiene, personal item storage, and technology use | 21 |
Week 6 follow-up | Web-based SurveyMonkey questionnaire | • Place of questionnaire completion • Quality of life item • Treatment services review—selected items measuring service use for physical health, substance use, mental health, education, social service, and criminal justice problems | 20 |
Week 7 follow-up | Web-based SurveyMonkey questionnaire | • Place of questionnaire completion • Quality of life item • Temple University Community Participation Measure (assesses how integrated participants were with the local community) | 33 |
Week 8 follow-up | Web-based SurveyMonkey questionnaire | • Place of questionnaire completion • Quality of life item • Risk Behavior Survey—selected items to measure substance use and high-risk sexual behaviors • Questions about where participants stayed overnight during the past seven days | 32 |
Two Waves
In April 2013, five participants were recruited for the pilot study to test our procedures (e.g., sending out SurveyMonkey questionnaires, remotely reloading gift cards). Four of these participants provided us with feedback during a focus group in May 2013. The remaining participants were recruited for the pilot study in June (n = 10) and July (n = 6) 2013, and data collection was completed in September 2013.
Data Analysis
Data were analyzed using SAS 9.4. Frequencies and univariate statistics were run on all variables. Because of the small sample size, Fisher’s exact and t tests were used to detect relationships between the independent (demographics, mental health, substance use, and technology use) and dependent (follow-up SurveyMonkey questionnaire completion) variables, and the threshold for statistical significance was set at p ≤ .10.
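The analyses were run in SAS; as an illustrative equivalent only (not the study's code), the same two tests can be reproduced in Python with SciPy, here applied to the interviewer comparison reported in the Results (7 participants seen by one interviewer vs. 14 seen by the rest of the team).

```python
from scipy import stats

# Pooled-variance two-sample t test computed from the published summary
# statistics: total follow-up questionnaires completed by participants of
# the other interviewers (6.93 +/- 2.71, n = 14) vs. the suspect
# interviewer (2.71 +/- 3.45, n = 7). The paper reports t = 3.37 from the
# raw data; the rounded published summaries yield a nearby value.
t_res = stats.ttest_ind_from_stats(
    mean1=6.93, std1=2.71, nobs1=14,
    mean2=2.71, std2=3.45, nobs2=7,
    equal_var=True,
)

# Fisher's exact test on a 2x2 table: completed at least one follow-up
# questionnaire (yes/no) by interviewer (other/suspect). Cell counts are
# reconstructed from the reported 93% vs. 57% completion rates, so they
# are our assumption rather than published counts.
table = [[13, 1],  # other interviewers: 13 of 14 completed 1+
         [4, 3]]   # suspect interviewer: 4 of 7 completed 1+
odds_ratio, p_value = stats.fisher_exact(table)  # p ~= .0877, as reported
```

The reconstructed table reproduces the reported Fisher's exact p value of .0877, which supports the cell-count reconstruction.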
Results
Sample statistics are presented in Table 2. The sample was primarily male (95%), middle-aged (mean 45 ± 11 years), non-Veteran (81%), high school- or GED-educated (76%), and unemployed (86%). Participants self-identified as Caucasian (33%), African American (33%), and Latino (33%). A substantial number of participants reported current and lifetime mental health problems (67 and 86%, respectively). Far fewer participants (24%) reported current alcohol or drug problems. Participants reported experiencing homelessness for an average of almost 3 years (mean 2.87 ± 3.64 years) over the course of their lives. Lifetime homelessness split evenly into thirds: one-third of the sample reported less than 1 year, one-third reported at least 1 but less than 3 years, and one-third reported 3 or more years of lifetime homelessness.
Table 2.
Sample characteristics
Variable | Statistic for total sample (N = 21) | Statistic for adjusted sample (N = 14) |
---|---|---|
% Male | 95 | 93 |
Mean ± SD age (years) | 45 ± 11 | 44 ± 11 |
Race: | ||
% White/Caucasian | 33 | 43 |
% Black/African American | 33 | 29 |
% Multiracial | 10 | 7 |
% No racial identity | 24 | 21 |
Ethnicity: | ||
% Hispanic/Latino | 33 | 29 |
% High school diploma (or GED) | 76 | 71 |
Employment status: | ||
% Employed full time | 0 | 0 |
% Employed part time | 0 | 0 |
% Employed odd jobs | 14 | 14 |
% Unemployed | 86 | 86 |
% Veteran | 19 | 7 |
Marital status: | ||
% Married | 5 | 0 |
% Separated | 10 | 7 |
% Divorced | 24 | 21 |
% Never been married | 62 | 71 |
% Past 30-day mental health problem | 67 | 64 |
% Lifetime mental health problem | 86 | 93 |
% Past 30-day alcohol use | 48 | 57 |
% Past 30-day alcohol problem | 14 | 21 |
% Past 30-day drug use | 33 | 43 |
% Past 30-day drug problem | 14 | 21 |
Mean ± SD years of lifetime homelessness | 2.87 ± 3.64 | 2.67 ± 3.90 |
Frequency of computer use: | ||
Almost every day | 67 | 79 |
Three or four days a week | 19 | 7 |
One or two days a week | 5 | 0 |
One to three days a month | 10 | 14 |
Less than once a month | 0 | 0 |
Frequency of checking email: | ||
Almost every day | 71 | 79 |
Three or four days a week | 14 | 14 |
One or two days a week | 10 | 0 |
One to three days a month | 5 | 7 |
Less than once a month | 0 | 0 |
Questionnaire Completion
As can be seen in Table 3, two-thirds (67%) of the sample completed at least seven of the eight follow-up questionnaires, and almost half (48%) completed all eight. Four participants (19%) never completed a follow-up questionnaire, and three participants (14%) completed between one and four. Of the four participants who completed no follow-up questionnaires, one was hospitalized between the baseline interview and the first follow-up questionnaire, one had provided the research team with an email address registered under a name other than the participant’s, one had an email address that was incorrectly documented in the research team’s records (i.e., all emails “bounced back”), and one simply never responded to any of the follow-up questionnaires.
Table 3.
Completion of follow-up SurveyMonkey questionnaires
| Statistic for total sample | Statistic for adjusted sample |
---|---|---|
% of sample completed: | N = 21 | N = 14 |
Zero questionnaires | 19 (n = 4) | 7 (n = 1) |
1+ questionnaire | 81 (n = 17) | 93 (n = 13) |
7+ questionnaires | 67 (n = 14) | 86 (n = 12) |
All 8 questionnaires | 48 (n = 10) | 64 (n = 9) |
% of sample completed: | N = 21 | N = 14 |
Week 1 questionnaire | 76 (n = 16) | 86 (n = 12) |
Week 2 questionnaire | 71 (n = 15) | 86 (n = 12) |
Week 3 questionnaire | 71 (n = 15) | 93 (n = 13) |
Week 4 questionnaire | 71 (n = 15) | 93 (n = 13) |
Week 5 questionnaire | 71 (n = 15) | 93 (n = 13) |
Week 6 questionnaire | 67 (n = 14) | 86 (n = 12) |
Week 7 questionnaire | 67 (n = 14) | 86 (n = 12) |
Week 8 questionnaire | 57 (n = 12) | 71 (n = 10) |
% of total number of questionnaires completed: | N = 116 | N = 98 |
Using a computer at a library | 78 (n = 90) | 76 (n = 74) |
Using a computer at a social services agency | 5 (n = 6) | 6 (n = 6) |
Using a computer at place of employment | 4 (n = 5) | 6 (n = 6) |
Using a computer at a relative/friend/acquaintance’s place | 3 (n = 4) | 3 (n = 3) |
Using a mobile phone | 9 (n = 11) | 9 (n = 9) |
% of participants completing 1+ questionnaires: | N = 17 | N = 13 |
Using a computer at a library | 94 (n = 16) | 92 (n = 12) |
Using a computer at a social services agency | 29 (n = 5) | 38 (n = 5) |
Using a computer at place of employment | 6 (n = 1) | 8 (n = 1) |
Using a computer at a relative/friend/acquaintance’s place | 18 (n = 3) | 15 (n = 2) |
Using a cell phone | 18 (n = 3) | 15 (n = 2) |
Mean ± SD # of types of locations at which participants completed questionnaires | 1.65 ± 0.79 | 1.69 ± 0.75 |
Between 12 and 16 participants completed each of the follow-up questionnaires. The highest rate of participation occurred during week 1 (76%), and the lowest rate of participation occurred during week 8 (57%). The content of the follow-up questionnaires was not related to completion rates. Instead, it appears a slow drop-off occurred over time.
Location of Questionnaire Completion
Data on location of questionnaire completion are presented in Table 3. Out of a total of 168 follow-up questionnaires (21 participants with eight questionnaires each), 116 were completed. The overwhelming majority of follow-up questionnaires were completed using a computer at a library (78%), but several follow-up questionnaires were completed via mobile phone (9%) or by using a computer at a social services agency (5%), place of employment (4%), or a relative, friend, or acquaintance’s place (3%). Nine participants used one consistent type of location to complete their follow-up questionnaires (predominantly the library, but one participant used a mobile phone), while five participants used two types of locations, and three participants used three types of locations.
Missing Data
Each participant was given the opportunity to answer a total of 202 follow-up questions during the 8-week study. Only four participants ever selected the “I don’t want to answer this question” option, and their declinations resulted in a total of eight missing responses across seven unique items. These missing data were not related to subsequent study dropout: three of these participants completed all eight follow-up questionnaires, and the fourth completed seven. Notably, because every question was asked twice (once between weeks 1 and 4 and again between weeks 5 and 8), we could see that participants who declined to answer an item at one time point did answer the exact same question at the other time point.
Suspected Interviewer Error
During the data analyses, we discovered that one member of the research team conducted the informed consent and baseline interviews for three of the four participants who completed no follow-up questionnaires and for two of the three participants who completed few (between one and four) follow-up questionnaires. This interviewer was associated with participants completing fewer total follow-up questionnaires (2.71 ± 3.45 vs. 6.93 ± 2.71, t = 3.37, df = 19, p = .0032) and with participants being less likely to complete at least one follow-up questionnaire (57% vs. 93%; Fisher’s exact test = .0877), seven out of the eight follow-up questionnaires (29% vs. 86%; Fisher’s exact test = .0173), and all eight follow-up questionnaires (14% vs. 64%; Fisher’s exact test = .0635). We suspect this interviewer was not clear with participants about the purpose of the study and the importance of completing follow-up questionnaires. Therefore, we removed this interviewer’s participants from the dataset, and we reanalyzed the data.
In the adjusted dataset, 86% of participants completed at least seven of the eight follow-up questionnaires, and 64% completed all eight. One participant (who was hospitalized) completed zero follow-up questionnaires, and one participant completed four. Weekly completion was steady, with 12 or 13 participants completing each questionnaire until the final week, when participation dropped to 10. Out of a total of 112 follow-up questionnaires (14 participants with eight questionnaires each), 98 were completed. Slightly over half (54%) of these participants completed follow-up questionnaires at more than one type of location: six participants used one consistent type of location, five used two types, and two used three types.
Associations with Questionnaire Completion
In both the complete and adjusted datasets, sex, age, race, ethnicity, employment, Veteran status, marital status, mental health symptoms and problems, alcohol use and problems, other drug use and problems, and years of lifetime homelessness were not related to questionnaire completion or non-completion. Only two variables were related to questionnaire completion: frequency of computer use (in the complete dataset) and education (in the adjusted dataset). In the complete dataset, participants who reported using a computer every day were more likely than other participants to complete all eight follow-up questionnaires (64% vs. 14%, Fisher’s exact test = .0635). In the adjusted dataset, participants who reported having a high school diploma or GED were more likely than other participants to complete at least seven of the eight follow-up questionnaires (100% vs. 50%, Fisher’s exact test = .0659) as well as all eight follow-up questionnaires (80% vs. 25%, Fisher’s exact test = .0949).
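As a check on the computer-use association, the reported p value can be reproduced from a 2×2 table whose cell counts we reconstruct from the published percentages (14 of 21 participants used a computer almost every day; 64% of them vs. 14% of the remaining 7 completed all eight questionnaires). The reconstruction is our assumption, not a set of published cell counts.

```python
from scipy.stats import fisher_exact

# Rows: computer use (almost every day / less often).
# Columns: completed all eight follow-up questionnaires (yes / no).
# Counts reconstructed from the reported percentages (our assumption).
table = [[9, 5],   # daily users: 9 of 14 completed all eight (64%)
         [1, 6]]   # others: 1 of 7 completed all eight (14%)
odds_ratio, p_value = fisher_exact(table)
# Two-sided p ~= .0635, matching the reported value.
```

That the reconstructed table yields exactly the reported p value of .0635 suggests the cell counts were recovered correctly.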
Discussion
This study suggests that collecting longitudinal data online may be feasible with a subpopulation of persons experiencing homelessness. With very little prompting, 67% of the total sample and 86% of the adjusted sample completed at least seven out of eight follow-up surveys, each within a short 48-h window. Using a computer every day was related to increased questionnaire completion in the total sample, while having a high school diploma or GED was related to increased questionnaire completion in the adjusted sample. We hypothesize that, when given proper informed consent, participant follow-up rates using web-based data collection methods have the potential to exceed follow-up rates using traditional in-person interviews. Even though a substantial proportion of our participants received suboptimal instructions at baseline, follow-up in this study (67%) was close to the mean weighted follow-up rate (71%) in a meta-analysis of in-person follow-up interviews with similar individuals [19]. The inclusion criteria limited participation to those who felt comfortable using and reading text on a computer, had an email address and knew the password, and had access to a computer or mobile device with Internet access. This is a specific subpopulation of persons who are homeless; it is unknown how other subpopulations would respond.
Several participants took advantage of the mobility of the web-based system and completed the follow-up questionnaires in more than one location. It appears that having flexible access to technology, either static technology in multiple types of locations or mobile technology, permitted participants to complete the follow-up questionnaires within the 48-h window. Participants who were at work, at a social service agency, or visiting relatives, friends, or acquaintances were able to complete the follow-up questionnaires wherever they were. We suspect that as more low-income individuals acquire smart phones, more questionnaires will be completed on them.
We purposefully collected data on sensitive topics in the follow-up questionnaires (mental health symptoms and service use, alcohol and other drug problems and service use, and risky sexual behaviors) to test whether participants would feel comfortable answering such questions in public spaces. No pattern was detected between questionnaire content and questionnaire completion, and there were very few missing data. These findings suggest that participants felt comfortable providing data about sensitive topics in various settings. Indeed, the literature suggests that collecting sensitive data using Computer-Assisted Self-Interviewing systems can yield more valid data than in-person interviews [43–48].
If the completion window were extended from 48 h to 14, 30, 45, or 60 days, as would likely be done in research projects with 3-, 6-, 9-, and 12-month follow-up points [49], participants who use the computer once a week or once a month would have a greater opportunity to complete the surveys. Furthermore, an extended completion window would permit researchers to send reminder emails to participants who had not completed their questionnaires while their completion window was still open. Because this study had only a 48-h completion window, study participants were not pursued aggressively.
Proper documentation of email addresses is critical. Unfortunately, the research team recorded problematic email addresses for two of the four participants who completed no follow-up surveys. Screeners should have sent each potential participant a test email and required a reply before enrollment. This email verification step likely would have reduced the number of incorrectly documented email addresses.
Clearly, collecting data online does not obviate the need for implementing standard follow-up data collection procedures with persons experiencing homelessness. During the informed consent process, research teams should stress the importance of follow-up data and motivate potential participants to remain engaged from the beginning of the study through its end. Collecting comprehensive locator information and conducting timely email, text message, or voicemail check-ins help maintain ties with participants and keep research teams up-to-date on contact information. Building relationships with agencies that serve persons experiencing homelessness, and securing the proper release of information forms for participants, can enhance communication with providers when tracking difficult-to-find participants. Research team leaders can act as “refusal converters,” contacting participants who do not complete surveys to stress how important their follow-up data are to the research project. Remuneration can be offered in an increasing stepwise fashion, and a bonus can be offered for completing all surveys. Additionally, research teams may need to assist participants with locating computer and web access when participants are mobile. As always, research teams need to be patient, persistent, and creative to achieve high follow-up rates [49–51].
This project was a small pilot study to test the feasibility of collecting web-based data from persons experiencing “unsheltered” homelessness. This data collection method needs to be replicated with larger and more inclusive samples, with extended windows for follow-up data collection, and with more aggressive participant tracking. This study recruited participants and conducted the baseline interview in person. Recruiting homeless persons online and collecting baseline data online may present additional opportunities and challenges. Furthermore, many of the measures used in this study demonstrated strong psychometric properties when tested during in-person interviews; the psychometric properties of online data collection using these measures are currently unknown.
If web-based data collection efforts are, indeed, successful in larger studies of homeless populations with longer follow-up data collection points, then this approach could provide an additional feasible method of data collection for investigators working with this population. And, if this population of persons experiencing street homelessness can be successful with this method of data collection, perhaps other disenfranchised, difficult-to-track, and difficult-to-reach populations could be followed using web-based data collection methods. Local governments are striving to decrease the “digital divide,” providing free or greatly discounted wi-fi connectivity as well as mobile computer lab access to low-income geographic areas [52–55]. Our pilot study suggests that these actions, in combination with increased smart phone ownership, may permit more marginalized populations to connect and communicate with investigators.
Acknowledgements
This work was supported by a Temple University College of Public Health Dean’s Incentive Grant, a Temple University School of Social Work Research Assistantship, and NIH Grant 1 P20 MH085981. The authors would like to thank Dr. Mark Salzer, Dr. Petra Kottsieper, Mr. Jared Pryor, Mr. Justin Benner, and Mr. Andre Cureton for their assistance with planning and carrying out the research project. We would like to thank Dr. Miguel Munoz-Laboy for his advice on the manuscript and Dr. Nick Garg for his editing assistance.
Compliance with Ethical Standards
This study was approved by Temple University’s Institutional Review Board.
References
- 1.Castellow J, Kloos B, Townley G. Previous homelessness as a risk factor for recovery from serious mental illnesses. Community Ment Health J. 2015;51(6):674–84. doi: 10.1007/s10597-014-9805-9. [DOI] [PubMed] [Google Scholar]
- 2.Creech SK, Johnson E, Borgia M, Bourgault C, Redihan S, O’Toole TP. Identifying mental and physical health correlates of homelessness among first-time and chronically homeless veterans. J Community Psychol. 2015;43(5):619–27. doi: 10.1002/jcop.21707. [DOI] [Google Scholar]
- 3.Stringfellow EJ, Kim TW, Gordon AJ, Pollio DE, Grucza RA, Austin EL. Substance use among persons with homeless experience in primary care. Subst Abus. 2016. doi: 10.1080/08897077.2016.1145616. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Upshur C, Weinreb L, Bharel M, Reed G, Frisard C. A randomized control trial of a chronic care intervention for homeless women with alcohol use problems. J Subst Abuse Treat. 2015;51:19–29. doi: 10.1016/j.jsat.2014.11.001. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5.Whittaker E, Swift W, Flatau P, Dobbins T, Schollar-Root O, Burns L. A place to call home: study protocol for a longitudinal, mixed methods evaluation of two housing first adaptations in Sydney, Australia. BMC Public Health. 2015;15(1):1–9. doi: 10.1186/1471-2458-15-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Reback CJ, Peck JA, Fletcher JB, Nuno M, Dierst-Davies R. Lifetime substance use and HIV sexual risk behaviors predict treatment response to contingency management among homeless, substance-dependent MSM. J Psychoactive Drugs. 2012;44(2):166–72. doi: 10.1080/02791072.2012.684633. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Orwin R, Scott C, Arieira C. Transitions through homelessness and factors that predict them: three-year treatment outcomes. J Subst Abuse Treat. 2005;28:S23–39. doi: 10.1016/j.jsat.2004.10.011. [DOI] [PubMed] [Google Scholar]
- 8.Nyamathi A, Salem BE, Zhang S, Farabee D, Hall B, Khalilifard F, et al. Nursing case management, peer coaching, and hepatitis A and B vaccine completion among homeless men recently released on parole: randomized clinical trial. Nurs Res. 2015;64(3):177–89. doi: 10.1097/NNR.0000000000000083. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.Herman DB, Conover S, Gorroochurn P, Hinterland K, Hoepner L, Susser ES. Randomized trial of critical time intervention to prevent homelessness after hospital discharge. Psychiatr Serv. 2011;62(7):713–9. doi: 10.1176/ps.62.7.pss6207_0713. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.McKenzie M, Tulsky JP, Long HL, Chesney M, Moss A. Tracking and follow-up of marginalized populations: a review. J Health Care Poor Underserved. 1999;10(4):409–29. doi: 10.1353/hpu.2010.0697. [DOI] [PubMed] [Google Scholar]
- 11.Stefancic A, Schaefer-McDaniel N, Davis A, Tsemberis S. Maximizing follow-up of adults with histories of homelessness and psychiatric disabilities. Eval Program Plann. 2004;27(4):433–42. doi: 10.1016/j.evalprogplan.2004.07.006. [DOI] [Google Scholar]
- 12.Stein JA, Nyamathi AM. Completion and subject loss within an intensive hepatitis vaccination intervention among homeless adults: the role of risk factors, demographics, and psychosocial variables. Health Psychol. 2010;29(3):317. doi: 10.1037/a0019283. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Veldhuizen S, Adair CE, Methot C, Kopp BC, O’Campo P, Bourque J, et al. Patterns and predictors of attrition in a trial of a housing intervention for homeless people with mental illness. Soc Psychiatry Psychiatr Epidemiol. 2015;50(2):195–202. doi: 10.1007/s00127-014-0909-x. [DOI] [PubMed] [Google Scholar]
- 14.Eyrich-Garg KM. Social support networks of people staying overnight on the streets. 2009. Unpublished raw data.
- 15.Eyrich-Garg KM. Mobile phone technology: a new paradigm for the prevention, treatment, and research of the non-sheltered “street” homeless? J Urban Health. 2010;87(3):365–80. doi: 10.1007/s11524-010-9456-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 16.LeDantec CA, Edwards WK. Designs on dignity: perceptions of technology among the homeless. In: Proceedings of the CHI ’08 SIGCHI Conference on Human Factors in Computing Systems. 2008. Florence, Italy.
- 17.McInnes DK, Fix GM, Solomon JL, Petrakis BA, Sawh L, Smelson DA. Preliminary needs assessment of mobile technology use for healthcare among homeless veterans. PeerJ. 2015. doi: 10.7717/peerj.1096. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Chen B, Mitchell A, Tran D. “Step up for foot care” addressing podiatric care needs in a sample homeless population. J Am Podiatr Med Assoc. 2014;104(3):269–76. doi: 10.7547/0003-0538-104.3.269. [DOI] [PubMed] [Google Scholar]
- 19.Eyrich-Garg KM, Stahler GJ. The effectiveness of psychosocial treatment for homeless substance abusers: a meta-analytic review. Paper presented at: American Public Health Association 138th Annual Scientific Meeting. 2010. Denver, CO. https://apha.confex.com/apha/138am/webprogram/start.html. Accessed 15 July 2016.
- 20.Ribisl KM, Walton MA, Mowbray CT, Luke DA, Davidson WS, Bootsmiller BJ. Minimizing participant attrition in panel studies through the use of effective retention and tracking strategies: review and recommendations. Eval Program Plann. 1996;19(1):1–25. doi: 10.1016/0149-7189(95)00037-2. [DOI] [Google Scholar]
- 21.Baer A, Saroiu S, Koutsky LA. Obtaining sensitive data through the web: an example of design and methods. Epidemiology. 2002;13(6):640–5. doi: 10.1097/00001648-200211000-00007. [DOI] [PubMed] [Google Scholar]
- 22.Parks KA, Pardi AM, Bradizza CM. Collecting data on alcohol use and alcohol-related victimization: a comparison of telephone and web-based survey methods. J Stud Alcohol. 2006;67(2):318–23. doi: 10.15288/jsa.2006.67.318. [DOI] [PubMed] [Google Scholar]
- 23.Duffy JC, Waterton JJ. Under‐reporting of alcohol consumption in sample surveys: the effect of computer interviewing in fieldwork. Br J Addict. 1984;79(4):303–8. doi: 10.1111/j.1360-0443.1984.tb03871.x. [DOI] [PubMed] [Google Scholar]
- 24.Eyrich-Garg KM. Sheltered in cyberspace? Computer use among the unsheltered ‘street’ homeless. Comput Hum Behav. 2011;27(1):296–303. doi: 10.1016/j.chb.2010.08.007. [DOI] [Google Scholar]
- 25.Eyrich-Garg KM. Virtual addresses: can e-mail technology help providers communicate with the non-sheltered ‘street’ homeless? 2011. Unpublished manuscript, School of Social Work, Temple University, Philadelphia, PA.
- 26.Harpin S, Davis J, Low H, Gilroy C. Mobile phone and social media use of homeless youth in Denver, Colorado. J Community Health Nurs. 2016;33(2):90–7. doi: 10.1080/07370016.2016.1159440. [DOI] [PubMed] [Google Scholar]
- 27.McInnes DK, Li AE, Hogan TP. Opportunities for engaging low-income, vulnerable populations in health care: a systematic review of homeless persons’ access to and use of information technologies. Am J Public Health. 2013;103(S2):e11–24. doi: 10.2105/AJPH.2013.301623. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Rice E, Barman‐Adhikari A. Internet and social media use as a resource among homeless youth. J Comput-Mediat Commun. 2014;19(2):232–47. doi: 10.1111/jcc4.12038. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Pollio DE, Batey DS, Bender K, Ferguson K, Thompson S. Technology use among emerging adult homeless in two U.S. cities. Soc Work. 2013;58(2):173–5. doi: 10.1093/sw/swt006. [DOI] [PubMed] [Google Scholar]
- 30.Rice E, Lee A, Taitt S. Cell phone use among homeless youth: potential for new health interventions and research. J Urban Health. 2011;88(6):1175–82. doi: 10.1007/s11524-011-9624-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.McInnes DK, Sawh L, Petrakis BA, Rao S, Shimada SL, Eyrich-Garg KM, et al. The potential for health-related uses of mobile phones and internet with homeless veterans: results from a multisite survey. Telemed e-Health. 2014;20(9):801–9. doi: 10.1089/tmj.2013.0329. [DOI] [PubMed] [Google Scholar]
- 32.McInnes DK, Fix GM, Solomon JL, Petrakis BA, Sawh L, Smelson DA. Preliminary needs assessment of mobile technology use for healthcare among homeless veterans. PeerJ. 2015;3:e1096. doi: 10.7717/peerj.1096. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Neale J, Stevenson C. Homeless drug users and information technology: a qualitative study with potential implications for recovery from drug dependence. Subst Use Misuse. 2014;49(11):1465–72. doi: 10.3109/10826084.2014.912231. [DOI] [PubMed] [Google Scholar]
- 34.Roberson J, Nardi B. Survival needs and social inclusion: technology use among the homeless. In: Proceedings of the 2010 ACM Conference on Computer Supported Cooperative Work. Savannah, GA. 2010.
- 35.Bender K, Begun S, DePrince A, Haffejee B, Kauffman S. Utilizing technology for longitudinal communication with homeless youth. Soc Work Health Care. 2014;53(9):865–82. doi: 10.1080/00981389.2014.925532. [DOI] [PubMed] [Google Scholar]
- 36.DesJarlais DC, Perlis TE, Settembrino JM. The use of electronic debit cards in longitudinal data collection with geographically mobile drug users. Drug Alcohol Depend. 2005;77(1):1–5. doi: 10.1016/j.drugalcdep.2004.06.010. [DOI] [PubMed] [Google Scholar]
- 37.North CS, Eyrich KM, Pollio DE, Foster DA, Cottler LB, Spitznagel EL. The homeless supplement to the diagnostic interview schedule: test‐retest analyses. Int J Methods Psychiatr Res. 2004;13(3):184–91. doi: 10.1002/mpr.174. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 38.McLellan AT, Alterman AI, Cacciola J, Metzger D, O’Brien CP. A new measure of substance abuse treatment. Initial studies of the treatment services review. J Nerv Ment Dis. 1992;180(2):101–10. doi: 10.1097/00005053-199202000-00007. [DOI] [PubMed] [Google Scholar]
- 39.Segal SP, Silverman C, Temkin T. Measuring empowerment in client-run self-help agencies. Community Ment Health J. 1995;31(3):215–27. doi: 10.1007/BF02188748. [DOI] [PubMed] [Google Scholar]
- 40.Rogers ES, Chamberlin J, Ellison ML, Crean T. A consumer-constructed scale to measure empowerment among users of mental health services. Psychiatr Serv. 1997;48(8):1042–7. doi: 10.1176/ps.48.8.1042. [DOI] [PubMed] [Google Scholar]
- 41.Herth K. Abbreviated instrument to measure hope: development and psychometric evaluation. J Adv Nurs. 1992;17(10):1251–9. doi: 10.1111/j.1365-2648.1992.tb01843.x. [DOI] [PubMed] [Google Scholar]
- 42.Lehman AF, Ward NC, Linn LS. Chronic mental patients: the quality of life issue. Am J Psychiatry. 1982;139(10):1271–6. doi: 10.1176/ajp.139.10.1271. [DOI] [PubMed] [Google Scholar]
- 43.Estes LJ, Lloyd LE, Teti M, Raja S, Bowleg L, Allgood K. Perceptions of audio computer-assisted self-interviewing (ACASI) among women in an HIV-positive prevention program. PLoS One. 2010;5(2). doi: 10.1371/journal.pone.0009149. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 44.Gribble J, Miller H, Cooley P, Catania J, Pollack L, Turner C. The impact of T-ACASI interviewing on reported drug use among men who have sex with men. Subst Use Misuse. 2000;35(6–8):869–90. doi: 10.3109/10826080009148425. [DOI] [PubMed] [Google Scholar]
- 45.Harmon T, Turner CF, Rogers SM, Eggleston E, Roman AM, Villarroel MA, et al. Impact of T-ACASI on survey measurements of subjective phenomena. Public Opin Q. 2009;73(2):255–80. doi: 10.1093/poq/nfp020. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Islam MM, Topp L, Conigrave KM, van Beek I, Maher L, White A. The reliability of sensitive information provided by injecting drug users in a clinical setting: clinician-administered versus audio computer-assisted self-interviewing (ACASI). AIDS Care. 2012;24(12):1496–503. doi: 10.1080/09540121.2012.663886. [DOI] [PubMed] [Google Scholar]
- 47.McNeely J, Strauss SM, Rotrosen J, Ramautar A, Gourevitch MN. Validation of an audio computer-assisted self-interview (ACASI) version of the alcohol, smoking and substance involvement screening test (ASSIST) in primary care patients. Addiction. 2016;111(2):233–44. doi: 10.1111/add.13165. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48.Villarroel MA, Turner CF, Rogers SM, et al. T-ACASI reduces bias in STD measurements: the national STD and behavior measurement experiment. Sex Transm Dis. 2008;35(5):499–506. doi: 10.1097/OLQ.0b013e318165925a. [DOI] [PubMed] [Google Scholar]
- 49.Cottler LB, Compton WM, Ben-Abdallah A, Horne M, Claverie D. Achieving a 96.6 percent follow-up rate in a longitudinal study of drug abusers. Drug Alcohol Depend. 1996;41(3):209–17. doi: 10.1016/0376-8716(96)01254-9. [DOI] [PubMed] [Google Scholar]
- 50.Desmond DP, Maddux JF, Johnson TH, Confer BA. Obtaining follow-up interviews for treatment evaluation. J Subst Abuse Treat. 1995;12(2):95–102. doi: 10.1016/0740-5472(94)00076-4. [DOI] [PubMed] [Google Scholar]
- 51.Haggerty KP, Fleming CB, Catalano RF, Petrie RS, Rubin RJ, Grassley MH. Ten years later: locating and interviewing children of drug abusers. Eval Program Plann. 2008;31(1):1–9. doi: 10.1016/j.evalprogplan.2007.10.003. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.Fuentes-Bautista M, Inagaki N. Reconfiguring public internet access in Austin, TX: Wi-Fi’s promise and broadband divides. Gov Inf Q. 2006;23(3–4):404–34. doi: 10.1016/j.giq.2006.07.013. [DOI] [Google Scholar]
- 53.Hsieh JJP, Rai A, Keil M. Addressing digital inequality for the socioeconomically disadvantaged through government initiatives: forms of capital that affect ICT utilization. Inf Syst Res. 2011;22(2):233–53. doi: 10.1287/isre.1090.0256. [DOI] [Google Scholar]
- 54.Park N, Lee KM. Wireless cities: local governments’ involvement in the shaping of wi-fi networks. J Broadcast Electron Media. 2010;54(3):425–42. doi: 10.1080/08838151.2010.498849. [DOI] [Google Scholar]
- 55.Sipior JC, Ward BT, Connolly R. The digital divide and t-government in the United States: using the technology acceptance model to understand usage. Eur J Inf Syst. 2011;20(3):308–28. doi: 10.1057/ejis.2010.64. [DOI] [Google Scholar]