Abstract
Objective
To examine the impact of 3 data collection modes on the number of questions answered, data quality, and student preference.
Methods
275 urban seventh-grade students were recruited and randomly assigned to complete a paper survey (SAQ), PDA survey (PDA), or PDA survey with audio (APDA). Students completed a paper debriefing survey.
Results
APDA respondents completed significantly more questions compared to SAQ and PDA. PDA and APDA had significantly less missing data than did SAQ. No differences were found for student evaluation.
Conclusions
Strong benefits may be gained by the use of APDA for adolescent school-based data collection.
Keywords: adolescent health survey, survey data collection, handheld computer, audio-enhancement
Early adolescence is increasingly seen as an appropriate time to introduce interventions aimed at preventing risky behaviors such as drug and alcohol use, violence, and unprotected sexual activity.1,2 As adolescents spend a majority of their time at school, the classroom becomes a natural venue for implementing these interventions.3 Researchers often use survey methodology to assess the effectiveness of these interventions. Due to the highly sensitive nature of some targeted behaviors, researchers use questionnaires involving sophisticated branching patterns. Yet the level of reading competency and command of the English language required to navigate such questionnaires cannot be assumed among students in diverse urban school settings. Researchers have sought to address this issue, among others, in developing different data collection methods.
A common and economical method for collecting data from a large number of students is the paper-based self-administered questionnaire (SAQ). This technique is much more likely to yield increased reports of sensitive behaviors when compared with interviewer-administered methods.4–10 However, the SAQ requires moderate reading skills, often requires students to navigate sophisticated skip patterns, and necessitates large testing areas to guarantee privacy. Also, school administrators and parents often reject SAQs because exposure to detailed and sensitive questions (eg, types of sexual behavior) is not easily limited to only the students to whom they apply, potentially exposing other students to developmentally inappropriate questions.
Computer-assisted self-interview (CASI) systems address many of the limitations of the SAQ. CASI systems may be used with both desktop and laptop computer systems, allowing for transportability.4,11 These systems provide computer-controlled navigation of sophisticated branching patterns (to skip nonapplicable questions), programmed consistency checks, and automatic data entry.4,12 They also help reduce missing data by ensuring that respondents address all relevant questions.11,13,14 Audio-enhancement features may be added to CASI programs (ie, A-CASI), allowing survey respondents to listen to questions through headphones while concurrently reading questions on the computer, thereby potentially reducing issues related to literacy and comprehension.7,8 However, the cost, resource, and staffing requirements of this method reduce its feasibility in school-based research where space and resources are limited.
A third self-administered data collection option is the small, handheld personal digital assistant (PDA), which benefits from the advantages of the CASI approach, but is cheaper and more portable than the desktop- or laptop-based systems. Over the past several years, more studies have been published examining the use of PDAs and other handheld devices, particularly in school-based or adolescent health research.15–20 Although this line of research has developed similarly to the body of research related to CASI, little research has been conducted examining the effects of adding audio-enhancement to PDAs for survey-based data collection (similar to A-CASI).21,22
Trapl and colleagues demonstrated the feasibility of an audio-enhanced PDA-based data collection system and successfully used this method to collect baseline data on middle school adolescents.21,23 However, that study did not address questions of comparability or improvement over existing methods of data collection. To our knowledge, no comparative literature exists examining the potential benefits or limitations of audio-enhanced PDAs (APDA) in any survey research. Hence, this study aims to examine the differential effects of 3 data collection modes (SAQ, PDA, APDA) on the number of questions answered, data quality, and student survey mode preference.
METHODS
This study involves a cross-sectional survey of a diverse group of seventh-grade students using a stratified random sampling design. This study was approved by the authors’ institutional review board (IRB); active parental consent and student assent were required for participation in the study. Students were recruited from 7 K-8 schools in the Cleveland Municipal School District, a large, urban school district, based on their ethnic and cultural heterogeneity. Of the 521 students enrolled in seventh grade, 11 students with limited English proficiency and 2 students with cognitive deficiencies were excluded from participating in the study. Among the remaining 508 students, 385 (75.8%) provided “yes” consents; the remaining provided “no” consents (9.4%) or did not respond (14.8%). Of the 385 students with consent, 275 students (71%) were in attendance on the day of data collection and participated in the study.
To ensure an equal distribution of students with low reading ability across the data collection modes, a stratified randomization procedure was used, as sketched below. Prior to randomization to survey mode, students were first stratified based on their scores on sentence and listening comprehension assessments conducted within one month prior to the survey administration.23 Based on these scores, students were placed into one of 3 groups: (1) moderate/high scoring on both assessments, (2) low scoring on only one, (3) low scoring on both. Students were then individually randomized to complete a self-administered survey by one of 3 modes: SAQ, PDA, or APDA. Students completed the survey in small groups by mode, were placed at tables with controlled seating, and were allotted 25 minutes to complete the 178-item survey.
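The following is a minimal sketch of this kind of stratified random assignment, not the study's actual procedure; the function and variable names, the round-robin allocation within strata, and the fixed seed are illustrative assumptions.

```python
# A sketch of stratified randomization of students into 3 survey modes.
# Input: (student_id, stratum) pairs, where stratum 1 = moderate/high on
# both reading assessments, 2 = low on one, 3 = low on both.
import random

MODES = ["SAQ", "PDA", "APDA"]

def stratified_assign(students, seed=0):
    """Randomly assign students to a survey mode within each reading stratum."""
    rng = random.Random(seed)
    assignments = {}
    strata = {}
    for student_id, stratum in students:
        strata.setdefault(stratum, []).append(student_id)
    for stratum, ids in strata.items():
        rng.shuffle(ids)
        # Deal shuffled students round-robin across the 3 modes so each
        # stratum is spread roughly evenly over SAQ, PDA, and APDA.
        for i, student_id in enumerate(ids):
            assignments[student_id] = MODES[i % len(MODES)]
    return assignments

# Example: four students in two strata.
print(stratified_assign([("s1", 1), ("s2", 1), ("s3", 3), ("s4", 3)]))
```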
Although the study was primarily intended to assess modes of data collection, students were presented with a survey on general health beliefs and behavior, focusing on sexuality, physical activity, and nutrition. Item measures included demographics (eg, age, gender, self-identified race and ethnicity), attitudinal and belief-based measures, and behaviors. All data collection modes had identical survey questions. Cardiff Teleform (Cardiff Software Inc., Vista, CA), an optical character recognition software package, was used to design and manage data from the multipage, paper-based surveys (SAQ). Surveyor was used to design and execute the PDA-based surveys.21 The survey was programmed to allow students to skip questions without answering; however, an alert prompt notified students of unanswered questions and required the student to either choose to return to the question or skip to the next item. Development of the APDA surveys was identical to that of the PDA-based survey, except for the voice files created to match the survey items. Voice files were recorded in a professional recording studio using a female voice professional who was asked to affect an ‘unbiased health professional’ voice. Voice files were created for questions only and were not created for response options. Students used plug-in headphones with APDA. All students were asked to complete a modified, paper-based debriefer questionnaire11 immediately following the survey administration.
Measures
Number of answered questions
The number of questions validly answered out of the possible 178 questions during the 25-minute survey period was calculated for each student. Questions skipped as part of a skip pattern were not considered “answered” questions. However, among SAQ respondents, if a student provided a response to a question that should have been skipped, this response was considered an answered question.
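As a concrete illustration, the sketch below computes this count under an assumed encoding in which each student's record maps question numbers to responses, with None for blanks and a sentinel for questions legitimately bypassed by a skip pattern; the encoding and names are assumptions, not the study's actual data format.

```python
# Sketch of the answered-question count for one respondent record.
SKIPPED = object()        # sentinel: question legitimately bypassed by a skip
TOTAL_QUESTIONS = 178

def n_answered(responses):
    """Count validly answered questions out of 178.

    Questions bypassed by a skip pattern do not count as answered, but a
    response given to a question that *should* have been skipped (possible
    only on the SAQ) still counts, per the definition in the text.
    """
    return sum(
        1
        for q in range(1, TOTAL_QUESTIONS + 1)
        if responses.get(q) is not None and responses.get(q) is not SKIPPED
    )
```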
Missing data
Proportion of missing data by question and survey section (eg, sensitive vs non-sensitive questions; questions at the beginning vs the end of the survey) were assessed. Additionally, missing data were examined as a dichotomous variable, grouping students as either having no missing data or having any missing data among the first 70 questions.
Missing data were examined for respondents who answered a minimum of 70 questions in order to distinguish true missing data from data missing due to respondents’ having inadequate time to complete the survey. Next, missing data were examined among individual questions to better understand the characteristics of questions left unanswered. In this approach, questions were chosen randomly but with the intent to vary question type (ie, sensitive and nonsensitive) and placement (ie, beginning versus middle of survey). All behavior questions were examined to determine the extent of missing data.
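A minimal sketch of the dichotomous missing-data indicator follows, reusing the assumed encoding and the n_answered helper from the previous sketch; note that a question carrying the SKIPPED sentinel is not None, so legitimately skipped questions are not flagged as missing.

```python
# Sketch of the dichotomous missing-data flag among the first 70 questions,
# restricted to respondents who answered at least 70 questions overall.
def any_missing_first_70(responses):
    """Return True if any of questions 1-70 is blank without being skipped."""
    return any(responses.get(q) is None for q in range(1, 71))

def missing_data_flags(all_responses):
    """Map student id -> missing-data flag, among adequate completers only."""
    return {
        sid: any_missing_first_70(resp)
        for sid, resp in all_responses.items()
        if n_answered(resp) >= 70  # exclude students who simply ran out of time
    }
```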
Failed skip patterns and inconsistent answers
Failed skip patterns were primarily an issue with the SAQ: any instance in which students answered questions they should not have answered was deemed a failed skip pattern. Logically inconsistent responses (eg, a student answers, “Never drank alcohol” but reports having been drunk) were also assessed. The proportion of failed skip patterns and inconsistent answers was calculated for each of the 8 skip patterns examined.
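The sketch below illustrates these two checks for one skip pattern, using the alcohol example from the text; the question keys and answer codes are illustrative assumptions, not the survey's actual coding.

```python
# Sketch of the failed-skip and inconsistency checks for one skip pattern.
def check_alcohol_skip(responses):
    """Classify one SAQ record against the 'ever drank alcohol' skip pattern.

    Returns (failed_skip, inconsistent). A skip is failed when the student
    answers any follow-up despite reporting never drinking; the record is
    inconsistent only if a follow-up answer directly conflicts with the stem
    (eg, reporting having been drunk after 'never drank alcohol').
    """
    if responses.get("ever_drank") != "never":
        return (False, False)  # the skip pattern did not apply to this student
    follow_ups = ["ever_drunk", "drank_past_month"]
    answered = [q for q in follow_ups if responses.get(q) is not None]
    failed_skip = bool(answered)
    # A 'yes' to either follow-up contradicts the stem response.
    inconsistent = any(responses.get(q) == "yes" for q in answered)
    return (failed_skip, inconsistent)
```

In the PDA and APDA modes the equivalent logic was enforced by the programmed skips, so checks of this kind apply only to SAQ records.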
Exposure to sensitive questions
The proportion of students who were unnecessarily exposed to sensitive questions (eg, in-depth questions about sexual behavior) on the SAQ was calculated following the same logic pattern used in programming the PDA and APDA surveys (ie, responding “no” to ever having sex).
Behaviors
Reporting of both sensitive and nonsensitive behaviors was calculated, including fast food and breakfast consumption; alcohol, tobacco, and substance use; dishonest in-school behavior; stealing and property damage; precoital behavior; and sexual behavior. All behavioral measures were drawn from standardized national population-based surveys (eg, Youth Risk Behavior Survey, National Longitudinal Study of Adolescent Health). Prevalence of behaviors was calculated based on the proportion of “yes” responses by participants.
Student preference
Mode preference was assessed in the debriefer survey with a question that asked, “If you had the choice, how would you have preferred to answer the survey?” Responses included in person with an interviewer, by PDA, by PDA with the questions read to me through headphones (APDA), by a written questionnaire that I could fill out myself, on the phone with an interviewer, or no preference/doesn’t matter.
Descriptive characteristics
Demographic variables, including age, gender, self-identified race and ethnicity, and current living arrangement (eg, 2-parent home) were collected in the health survey. Students were also asked how frequently they spoke a language other than English with friends, how frequently they spoke a language other than English at home, and the number of years they had been living in the United States.
Data Analytic Plan
Both the SAQ and PDA-based survey responses were compiled into an SPSS data file and analyzed using SPSS 13.0 (SPSS Inc, Chicago, IL). Bivariate statistics, including analysis of variance and chi-square tests, established the presence of associations between survey mode and the outcomes. Analysis of covariance using general linear models in SPSS was run to adjust for demographic variables found to vary by mode despite the randomization scheme.
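For readers working outside SPSS, the sketch below reproduces the shape of this analytic plan with open-source Python tools (pandas, scipy, statsmodels); the data frame layout and the column names ('mode', 'race', 'n_answered', 'any_missing') are assumptions, not the study's actual variable names.

```python
# Sketch of the analytic plan: one-way ANOVA, chi-square test of association,
# and ANCOVA via a general linear model adjusting for race/ethnicity.
import pandas as pd
from scipy.stats import chi2_contingency, f_oneway
import statsmodels.api as sm
from statsmodels.formula.api import ols

def analyze(df: pd.DataFrame):
    # ANOVA: does the number of answered questions differ by survey mode?
    groups = [g["n_answered"].values for _, g in df.groupby("mode")]
    f_stat, p_anova = f_oneway(*groups)

    # Chi-square: association between mode and the any-missing-data flag.
    table = pd.crosstab(df["mode"], df["any_missing"])
    chi2, p_chi2, dof, _ = chi2_contingency(table)

    # ANCOVA: mode effect on questions answered, adjusted for race/ethnicity,
    # which differed across modes despite randomization.
    model = ols("n_answered ~ C(mode) + C(race)", data=df).fit()
    anova_table = sm.stats.anova_lm(model, typ=2)
    return f_stat, p_anova, chi2, p_chi2, anova_table
```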
RESULTS
As shown in Table 1, 48.4% of the sample was female, and mean age was 13.15 years (SD=0.75). These characteristics were similar across the 3 data collection modes. Racial/ethnic distribution across the 3 modes was significantly different, despite individual randomization of study participants.
Table 1. Sample Characteristics by Survey Mode

| | Total (N=275) | SAQ (N=90) | PDA (N=93) | APDA (N=92) | p-value |
|---|---|---|---|---|---|
| Gender (% female) | 48.4 | 47.8 | 47.3 | 50.0 | NS |
| Race/ethnicity | | | | | 0.021 |
| % White | 23.0 | 34.8 | 17.2 | 17.4 | |
| % African American | 29.6 | 25.8 | 38.7 | 23.9 | |
| % Hispanic | 37.6 | 30.3 | 37.6 | 44.6 | |
| % Other | 10.8 | 9.0 | 6.5 | 14.2 | |
| Age in years: mean (SD) | 13.15 (0.75) | 13.20 (0.78) | 13.14 (0.76) | 13.10 (0.71) | NS |
| Speak mostly/only English with family (%) | 60.8 | 62.9 | 64.1 | 55.4 | NS |
| Lived in US >6 years (%) | 89.1 | 88.9 | 93.5 | 84.8 | NS |
| Live with 2 parents (%) | 49.8 | 43.3 | 52.7 | 53.3 | NS |
The number of questions answered varied significantly by mode (Table 2), with students who took the survey on the APDA completing significantly more questions (149.7) than did both SAQ respondents (116.2) and PDA respondents (117.8). The significant differences were unchanged after adjusting for participants’ race (p < .001). Thirty-three students finished prior to the end of the allotted 25 minutes and were distributed by mode as follows: 5 SAQ, 6 PDA, 22 APDA (p = .001, data not shown).
Table 2. Number of Questions Answered by Survey Mode

| Mode | N | Unadjusted Mean (SD) | Adjusted Mean (SD)^a |
|---|---|---|---|
| SAQ | 90 | 116.2 (33.2)*** | 115.2 (30.1)*** |
| PDA | 93 | 117.8 (32.8)*** | 117.9 (29.9)*** |
| APDA | 92 | 149.7 (21.8) | 149.6 (29.8) |

*** Indicates significant difference when compared to APDA; p < .001
Note. ^a Adjusted for race/ethnicity.
As shown in Table 3, there was a significant difference in the prevalence of missing data by mode, with the APDA group (5.4%) and the PDA group (15.1%) having significantly fewer students with missing data than did the SAQ group (33.3%) (p < .001 for both comparisons). Although there appears to be a difference between the PDA and APDA groups, this difference is not statistically significant.
Table 3. Students With Any Missing Data Among the First 70 Questions, by Survey Mode

| Mode | % With Any Missing Data |
|---|---|
| SAQ | 33.3 |
| PDA | 15.1* |
| APDA | 5.4* |

* Significant difference when compared to SAQ; p < .001
Note. Adjusted for race/ethnicity.
Of the demographic-related questions examined for missingness, only self-reported grades yielded any missing data. Interestingly, this variable had the highest rate of missing data, and all missing cases were among those completing the survey via SAQ (N=7). Although missing data were significantly higher among SAQ respondents for self-reported grades (p < .001) and sexual intercourse (p < .05) (data not shown), the overall rates of missing data were still quite low: 7.8% of SAQ respondents were missing self-reported grades and 3.3% were missing the sexual intercourse question, compared to 0% among PDA and APDA respondents for both items.
Missed skip patterns and logically inconsistent responses were examined among only SAQ respondents owing to the programmed skips of the PDA and APDA. A skip pattern is missed when a student responds to a question that should have been skipped based on the response to a feeder question. A skipped response is considered logically inconsistent if the response to the skipped question is in direct conflict with the stem question. Table 4 first provides the number of students whose response to the stem question would have indicated a need to skip. Questions listed below each stem question indicate first the number of students who missed the skip, followed by the number of students who provided inconsistent responses. Although 7.1% to 32.8% of students missed the skip patterns, fewer provided inconsistent responses (0% to 8.9%). The greatest number of inconsistencies was reported among questions regarding sexual experience. It is important to note that age of initiation, number of partners, and use of a condom at last sex did not provide response options that would have allowed the student to be logically consistent (eg, “I’ve never had sex”).
Table 4. Missed Skip Patterns and Logically Inconsistent Responses Among SAQ Respondents

| Questions | N Eligible to Skip | Missed Skip N | Missed Skip % | Inconsistencies N | Inconsistencies % |
|---|---|---|---|---|---|
| **Never Had a Drink of Alcohol** | 32 | | | | |
| Ever got drunk | | 6 | 18.8 | 0 | 0.0 |
| Had a drink of alcohol in past month | | 7 | 21.9 | 1 | 3.1 |
| **Never Smoked Marijuana** | 69 | | | | |
| Smoked marijuana in past month | | 17 | 24.6 | 1 | 1.4 |
| **Never Kissed on the Lips** | 27 | | | | |
| Kissed on the lips in past 3 months | | 8 | 29.6 | 2 | 7.4 |
| **Never French Kissed** | 43 | | | | |
| French kissed in past 3 months | | 8 | 18.6 | 2 | 4.7 |
| **Never Touched Breasts/Breasts Touched** | 49 | | | | |
| Touched breasts/breasts touched in past 3 months | | 15 | 30.6 | 3 | 6.1 |
| **Never Touched Other’s Private Parts** | 64 | | | | |
| Touched private parts in past 3 months | | 21 | 32.8 | 3 | 4.7 |
| **Never Had Private Parts Touched** | 60 | | | | |
| Private parts touched in past 3 months | | 17 | 28.3 | 2 | 3.3 |
| **Never Had Sex** | 56 | | | | |
| Valid age of initiation | | 4 | 7.1 | 4 | 7.1 |
| Valid number of partners | | 4 | 7.1 | 4 | 7.1 |
| Valid gender of sexual partners | | 5 | 8.9 | 5 | 8.9 |
| Valid response to condom at last sex | | 3 | 5.4 | 3 | 5.4 |
| Sex in the past 3 months | | 8 | 14.3 | 0 | 0.0 |

Note. Percentages are calculated out of the number of students eligible to skip each pattern.
Overall, prevalence of behaviors was reported similarly and without any consistent pattern across modes, with the exception of sexual experience (Table 5). SAQ respondents were much more likely to report ever engaging in sexual activity compared to PDA and APDA respondents.
Table 5. Reported Prevalence of Behaviors by Survey Mode

| % Reported | Total | SAQ | PDA | APDA | p |
|---|---|---|---|---|---|
| Ate fast food | 77.4 | 71.1 | 84.7 | 76.3 | 0.094 |
| Ever drank alcohol | 60.7 | 62.9 | 63.7 | 55.6 | 0.480 |
| Ever used marijuana | 16.0 | 17.4 | 15.7 | 15.0 | 0.903 |
| Ever cheated in school | 51.2 | 46.8 | 52.7 | 54.2 | 0.591 |
| Ever shoplifted | 27.6 | 29.2 | 27.2 | 26.5 | 0.915 |
| Ever had sexual intercourse | 21.9 | 30.6 | 19.1 | 16.0 | 0.046 |

Note. Adjusted for race.
In the debriefer survey, students were asked to indicate the mode in which they would have preferred to complete the survey if offered several options. When examining student preference by mode, a plurality of students in the PDA and APDA groups chose the mode they had just used (46.7% and 47.3%, respectively), as shown in Table 6. This was followed by a large group within each mode choosing the remaining PDA mode, with 23.9% of PDA completers choosing APDA, and 20.9% of APDA completers choosing PDA. The trend was quite different for students completing the survey on SAQ. These students were approximately evenly distributed across SAQ (13.6%), PDA (14.8%), and APDA (15.9%); their most popular choice was an interviewer-administered survey (28.4%).
Table 6. Student Mode Preference (%) by Survey Mode Completed

| Preference | Total | SAQ | PDA | APDA |
|---|---|---|---|---|
| SAQ | 7.0 | 13.6 | 3.3 | 4.4 |
| PDA | 27.7 | 14.8 | 46.7 | 20.9 |
| APDA | 29.2 | 15.9 | 23.9 | 47.3 |
| Interview | 16.2 | 28.4 | 7.6 | 13.2 |
| Phone interview | 3.0 | 1.1 | 3.3 | 4.4 |
| No preference | 16.9 | 26.1 | 15.2 | 9.9 |
DISCUSSION
Students completing the survey on the APDA answered significantly more questions than did both PDA and SAQ respondents. These results require a small caveat with regard to how the data were collected. To assess the impact of the technology and audio enhancement on survey administration, 2 options were available: allow students to work through the full survey and record completion time relative to start time, or limit the time allowed and count the number of questions answered. To be respectful of the disruption to the school day already imposed by the scheduled data collection process, the latter was chosen, based on past experience showing that students vary widely in the amount of time needed to complete surveys. Results indicated that the number of questions answered during the 25-minute survey period was impacted by the addition of audio, but not by the PDA technology per se, as the number of questions answered by SAQ respondents was comparable to that of PDA respondents. Relieving the burden of reading survey questions allowed students to move through the survey more quickly, with APDA respondents answering on average over one question per minute more than PDA and SAQ respondents (149.7 vs 117.8 questions over 25 minutes, a difference of roughly 1.3 questions per minute).
The ability to program the survey via Surveyor and execute it on the PDA greatly reduced the prevalence of missing data, with both PDA and APDA respondents having significantly less missing data than SAQ respondents. This finding is consistent with other studies implementing computerized survey technology.4,5,11,12,14 Issues of failed skip patterns and logically inconsistent data were also moot for the PDA and APDA groups due to the programmable skip patterns inherent in the software.21 Among SAQ respondents, up to one third of students failed to follow a skip pattern. More interestingly, logically inconsistent data among responses to simple skip patterns were less problematic than failed skip patterns, with a maximum of 9% of students who should have followed the skip (5.6% of the entire SAQ sample) providing a logically inconsistent response. Thus, a majority of students who failed to follow a skip pattern still provided logically consistent responses to subsequent questions when a logically consistent option was available. Although technology can remove the symptoms of failed skip patterns and logically inconsistent data, we still have very little understanding of what leads a respondent to fail a skip pattern or to give a logically inconsistent response.
This study supported the findings of the school-based studies of Beebe and colleagues,24 who found increased reporting of sensitive behaviors by SAQ, and Hallfors and colleagues,14 who found no difference in reporting of drug and alcohol behaviors by mode, in contrast to a large body of research indicating increased reports of sensitive behaviors among CASI or A-CASI respondents.14,24 Reporting of sexual intercourse was the only behavior found to vary significantly by mode of data collection. Contrary to what was initially hypothesized based on the CASI and A-CASI literature, reporting of sexual intercourse was significantly higher among SAQ respondents than among both PDA and APDA respondents. In fact, the rate of reported sexual intercourse among SAQ respondents was almost twice that reported by APDA respondents (30.6% vs 16.0%, respectively).
In an attempt to validate at least one of these values, prevalence rates of sexual intercourse among this population were sought from 2 alternate sources. The SAQ prevalence rate was similar to the rate reported for seventh-grade students (31.7%) in the 2005 administration of the Youth Risk Behavior Survey in the Cleveland Municipal School District.25 The APDA prevalence rate, in turn, was similar to the rate reported for 2 cohorts of seventh-grade students (N=1331) participating in the Healthy Teens Building Healthy Schools baseline survey in the Cleveland Municipal School District in 2004 and 2005 (personal communication with Elaine Borawski, August 18, 2005). Thus, both the SAQ and APDA sexual intercourse prevalence rates were supported by local data.
The unexpected pattern of reported sexual intercourse by mode led the authors to think more about sensitive behaviors generally and to ask, “What is a sensitive behavior among this particular population?” Illegal behaviors typically considered sensitive, such as marijuana use, showed no impact of mode. Perhaps these findings indicate a change in what is sensitive, or, more specifically, in what students consider “cool” or desirable among their peers, an illustration of social norms at work. Students’ responses on the SAQ were much more visible to the other students seated at their table. This was supported in the data by PDA and APDA respondents reporting lower scores of survey visibility on the debriefing survey (ie, “Do you think the people sitting next to you could see your answers?”) compared to SAQ respondents (data not shown). Further, because a “no” response to sexual experience instructed the respondent to skip ahead in the survey, a student’s sexual experience may have been even more obvious to students at the table. Thus, to maintain an appearance of being “cool,” it is possible that some students in the SAQ group indicated sexual experience when, in fact, this was untrue.
As has been suggested in other research, it is also possible that the voice used for the audio enhancement humanized the APDA approach, giving it the feel of an interviewer-administered survey and thereby inviting socially desirable, downwardly biased reporting.26 However, the PDA report of sexual experience was not significantly different from the APDA report and, in fact, was much more similar to that of APDA respondents than to that of SAQ respondents, indicating that this explanation is unlikely.
Finally, there were no mode differences in students’ evaluations of the survey experience. However, it is interesting to note that although students completing the survey on the PDA or APDA preferred the mode they had just used (ie, PDA completers preferred PDA), the most popular choice among SAQ students was an interviewer-administered survey (28.4%), possibly due to the burden of reading. Although this was the finding for a one-time, 25-minute survey, it would be interesting to see whether students would have a different opinion when asked on the second, third, or even fourth administration of a similar survey, as often occurs in longitudinal studies. Further, it is possible that the 25-minute survey attenuated some findings related to student experience that might have become more apparent during a much longer survey. Also, because students were exposed to only one mode of data collection, they may not have had the exposure to, and subsequent understanding of, other modes needed to inform their mode preference.
There are several limitations to the current study. First, the participating students self-selected into the study and were likely very different from those who chose not to participate or who had consent but were absent on the day of data collection. Students not attending school, typically those with lower academic achievement, were also less likely to be engaged in the study. Thus, we do not know the abilities of the students who declined participation, nor do we know why they did not want to participate. Although students received a t-shirt as a token of appreciation for their participation and were removed from class (unfortunately seen as a benefit by many students), this was not incentive enough for students who were completely uninterested.
Second, students participating in the study completed the survey using only one mode of data collection, so intermode correspondence of respondent reports could not be calculated. Such a design could have provided additional insight into some of the discrepancies found across modes, such as the reporting of sexual experience. Also, by not being exposed to modes other than the one to which they were randomized, students may have had a reduced ability to make comparative judgments about future mode preference. Additionally, our comparison did not include CASI or A-CASI survey administration, limiting our conclusions to comparisons among SAQ, PDA, and APDA.
Understanding some of the most interesting patterns of response and student engagement in survey data collection would seem to require some level of face-to-face interviewing. When examining the data, it was not uncommon to discuss with colleagues what may have been going through an adolescent respondent’s head when responding to certain questions. However, the answers to these questions cannot be obtained through closed-ended responses to simple questions.
Future studies should be designed to implement the APDA with older adolescents and younger adolescents to further explore feasibility and superiority, inferiority, or equivalence of APDA when compared to alternative data-collection modes. Similarly, it would be helpful to better understand the limitations of the APDA system, for example, to know if the APDA system could be effectively implemented for larger data-collection sessions, such as data collection sessions sometimes held in cafeterias or gymnasiums, in order to reduce classroom interruption.
Given our findings related to the higher reporting of sexual intercourse among SAQ respondents compared to APDA respondents, there is clearly a need to better understand the cognitive processes experienced by an adolescent engaged in the survey process. Sexual intercourse and other sex-related measures are often the primary outcomes for adolescent prevention programs aimed at reducing sexual initiation or risky sexual behavior (eg, not using condoms). If student responses to these questions can be swayed so easily by factors such as who is sitting nearby in the survey environment and which data collection mode is used, then researchers in the area of adolescent sexual behavior should be very concerned about the validity of their data. Although much of the population is hesitant to engage young adolescents in conversations about sexual behavior, most would agree that it is appropriate to engage youth who are developmentally ready for such conversations (eg, those already having sexual intercourse). Researchers must therefore find ways to ethically identify these youth and engage them in processes that draw out the social and psychological components driving their responses to such sensitive questions.
Self-report questionnaires are oftentimes the mechanism by which data are collected to assess the efficacy and effectiveness of interventions designed to minimize high-risk behavior and prevent chronic diseases. Thus, researchers must acknowledge the important role of data collection and data quality on informing these programs.
This study indicates strong benefits to be gained by the use of audio-enhanced personal digital assistants for school-based data collection with adolescents. Specifically, the findings provide clear indication that the audio enhancement greatly reduced the burden of reading the survey questions, allowing students to move through the survey more quickly and answer more questions, in addition to reducing missing data. Beyond the support provided by the data-driven outcomes, the ease with which APDA data collection systems can be implemented in the school setting should not be overlooked.
Acknowledgments
The authors thank the students, staff, and administrators of the Cleveland Metropolitan School District, formerly the Cleveland Municipal School District, for their participation in this study. Moreover, this work was supported through funds from the National Institute of Child Health and Human Development: HD-R01-41364-S2. Some results in this paper were presented as posters at the American Public Health Association Annual Meeting in 2006 and American Academy of Health Behavior Annual Meeting in 2008.
Contributor Information
Erika S. Trapl, Department of Epidemiology and Biostatistics, Prevention Research Center for Healthy Neighborhoods, Case Western Reserve University, Cleveland, OH.
H. Gerry Taylor, Department of Pediatrics, Case Western Reserve University, Cleveland, OH.
Natalie Colabianchi, University of Michigan, Institute for Social Research, Ann Arbor, MI.
David Litaker, Department of Medicine, Case Western Reserve University, Cleveland, OH.
Elaine A. Borawski, Prevention Research Center for Healthy Neighborhoods, Case Western Reserve University, Cleveland, OH.
References
1. Petersen AC, Leffert N. Developmental issues influencing guidelines for adolescent health research: a review. J Adolesc Health. 1995;17(5):298–305. doi: 10.1016/1054-139x(95)00184-t.
2. Resnick MD, Bearman PS, Blum RW, et al. Protecting adolescents from harm. JAMA. 1997;278(10):823. doi: 10.1001/jama.278.10.823.
3. Gans JE, Brindis CD. Choice of research setting in understanding adolescent health problems. J Adolesc Health. 1995;17(5):306–313. doi: 10.1016/1054-139x(95)00182-r.
4. Beebe TJ, Mika T, Harrison PA, et al. Computerized school surveys. Social Science Computer Review. 1997;15(2):159.
5. Ellen JM, Gurvey JE, Pasch L, et al. A randomized comparison of A-CASI and phone interviews to assess STD/HIV-related risk behaviors in teens. J Adolesc Health. 2002;31(1):26–30. doi: 10.1016/s1054-139x(01)00404-9.
6. Gribble JN, Miller HG, Rogers SM, Turner CF. Interview mode and measurement of sexual behaviors: methodological issues. J Sex Res. 1999;36(1):16–24. doi: 10.1080/00224499909551963.
7. Romer D, Hornik R, Stanton B, et al. “Talking” computers: a reliable and private method to conduct interviews on sensitive topics with children. J Sex Res. 1997;34(1):3–9.
8. Turner CF, Forsyth BH, O’Reilly JM, et al. Automated self-interviewing and the survey measurement of sensitive behaviors. In: Couper MP, Baker RP, Bethlehem J, et al., editors. Computer-Assisted Survey Information Collection. New York: John Wiley and Sons; 1998. pp. 455–473.
9. Turner CF, Ku L, Rogers SM, et al. Adolescent sexual behavior, drug use, and violence: increased reporting with computer survey technology. Science. 1998;280(5365):867. doi: 10.1126/science.280.5365.867.
10. Turner CF, Lessler JT, Devore J. Effects of mode of administration and wording on reporting of drug use. In: Turner CF, Lessler JT, Gfroerer JD, editors. Survey Measurement of Drug Use: Methodological Studies. Washington, DC; 1992. pp. 177–220. DHHS Pub. no. 92-1929.
11. Couper M. Technology trends in survey data collection. Social Science Computer Review. 2005;23(4):486–501.
12. Ramos M, Sedivi BM, Sweet EM. Computerized self-administered questionnaires. In: Couper MP, Baker RP, Bethlehem J, et al., editors. Computer-Assisted Survey Information Collection. New York: John Wiley and Sons; 1998. pp. 389–408.
13. Couper MP, Singer E, Tourangeau R. Understanding the effects of audio-CASI on self-reports of sensitive behavior. Public Opinion Quarterly. 2003;67(3):385–395.
14. Hallfors D, Khatapoush S, Kadushin C, et al. A comparison of paper vs computer-assisted self interview for school alcohol, tobacco, and other drug surveys. Eval Program Plann. 2000;23(2):149–155.
15. Jaspan HB, Flisher AJ, Myer L, et al. Brief report: methods for collecting sexual behaviour information from South African adolescents: a comparison of paper versus personal digital assistant questionnaires. J Adolesc. 2007;30(2):353–359. doi: 10.1016/j.adolescence.2006.11.002.
16. Denny SJ, Milfont TL, Utter J, et al. Hand-held internet tablets for school-based data collection. BMC Research Notes. 2008;1:52. doi: 10.1186/1756-0500-1-52.
17. McClamroch KJ. Evaluating the usability of personal digital assistants to collect behavioral data on adolescents with paradata. Field Methods. 2011;23(3):219–242.
18. Olson AL, Gaffney CA, Hedberg VA, Gladstone GR. Use of inexpensive technology to enhance adolescent health screening and counseling. Arch Pediatr Adolesc Med. 2009;163(2):172–177. doi: 10.1001/archpediatrics.2008.533.
19. Hallfors D, Cho H, Rusakaniko S, et al. Supporting adolescent orphan girls to stay in school as HIV risk prevention: evidence from a randomized controlled trial in Zimbabwe. Am J Public Health. 2011;101(6):1082–1088. doi: 10.2105/AJPH.2010.300042.
20. Laska MN, Graham D, Moe SG, et al. Situational characteristics of young adults’ eating occasions: a real-time data collection using Personal Digital Assistants. Public Health Nutr. 2010;14(3):472–479. doi: 10.1017/S1368980010003186.
21. Trapl ES, Borawski EA, Stork PP, et al. Use of audio-enhanced personal digital assistants for school-based data collection. J Adolesc Health. 2005;37(4):296–305. doi: 10.1016/j.jadohealth.2005.03.025.
22. Kilanowski JF, Trapl ES. Evaluation of the use of audio-enhanced personal digital assistants to survey Latino migrant farmworkers. Res Nurs Health. 2010;33(2):156–163. doi: 10.1002/nur.20369.
23. Williams KT. GRADE: Group Reading Assessment and Diagnostic Evaluation. American Guidance Service; 2001. Accessed January 26, 2012. Available at: http://www.pearsondiagnostic.com/
24. Beebe TJ, Harrison PA, McRae JA, et al. An evaluation of computer-assisted self-interviews in a school setting. Public Opinion Quarterly. 1998;62(4):623–632.
25. 2005 Cleveland Municipal School District Middle School Youth Risk Behavior Survey. Case Western Reserve University, Division of Adolescent Health; 2005. Accessed January 26, 2012. Available online: http://prchn.org/reports/
26. Nass C, Moon Y, Green N. Are machines gender neutral? Gender-stereotypic responses to computers with voices. Journal of Applied Social Psychology. 1997;27(10):864–876.