Author manuscript; available in PMC 2012 Jul 8. Published in final edited form as: J Adolesc Health. 2005 Oct;37(4):296–305. doi: 10.1016/j.jadohealth.2005.03.025

Use of audio-enhanced personal digital assistants for school-based data collection

Erika S Trapl a, Elaine A Borawski a,*, Paul P Stork b, Loren D Lovegreen a, Natalie Colabianchi a, Maurice L Cole a, Jacqueline M Charvat c
PMCID: PMC3391606  NIHMSID: NIHMS300378  PMID: 16182140

Abstract

Purpose

To review the data collection options available to school-based researchers and to present preliminary findings on the use of audio-enhanced personal digital assistants (APDA) for school-based data collection.

Methods

A newly developed APDA system was used to collect baseline data from a sample of 645 seventh grade students enrolled in a school-based intervention study. Evaluative measures included student response, time to completion, and data quality (e.g., missingness, internal consistency of responses). Differences in data administration and data quality were examined among three groups of students: students newer to the United States speaking English as a second language; special education students; and students not newer to the United States receiving regular education.

Results

The APDA system was well received by students and was shown to offer improvements in data administration (increased portability, time to completion) and reduced missing data. Although time to completion and proportion of missing data were similar across the three groups of students, psychometric properties of the data varied considerably.

Conclusions

The APDA system offers a promising new method for collecting data in the middle school environment. Students with cognitive deficits and language barriers were able to complete the survey in a similar amount of time without additional help; however, differences in data quality suggest that limitations in comprehension of the questions remained even though the questions were read to the respondents. More research on the use of APDA is necessary to fully understand the effect of data collection mode with special populations.

Keywords: Data collection, Adolescents, Personal digital assistants, PDA, School surveys, Health behavior, Questionnaires, Sexual behavior


Researchers often turn to the school environment as the primary venue for collecting surveillance data on sensitive risk behaviors among adolescents [1–3]. Schools are also a common venue for health behavior interventions aimed at preventing or reducing risk behaviors such as drug and alcohol use, violence, or unprotected sexual activity. To collect surveillance data or to assess the effectiveness of behavioral interventions, researchers need a flexible and cost-effective method for collecting sensitive data from a large number of adolescents in a short period of time, in a way that is least intrusive to the school schedule and environment.

For many years, researchers relied on the self-administered, paper-based survey (SAQ) to collect data within the school environment. However, the SAQ has numerous limitations, most of which stem from its reliance on reading competency and command of the English language, neither of which is guaranteed in today’s diverse urban school setting. Computerized methods (i.e., audio-computer assisted self-interviewing, A-CASI) have been designed to address these limitations; however, these methods are not always conducive to the school environment.

The purpose of this article is to review the different data collection options currently available to school-based researchers and to present the development and preliminary findings on the extension of the A-CASI system to personal digital assistants (PDA), or small handheld computers, for use in school-based data collection. This data collection system was designed not only in response to the investigators’ desire to integrate technological advances into the data collection process, but also to create a more flexible and transportable computer-based data collection system to be used in schools. The newly developed system was implemented and evaluated as part of a school-based intervention study conducted among urban middle school students.

Background

School-based data collection methods

School-based research and surveillance typically rely on paper–pencil self-administered questionnaires (SAQ) for data collection. SAQs allow researchers to reach the largest number of respondents in the most economical way, requiring one survey administrator for a large number of students. However, the SAQ requires a moderate reading level and sufficient cognitive ability to accurately interpret the skip patterns typically found in risk behavior research. If these patterns are not navigated successfully, the amount of missing and inconsistent data increases. This, in turn, increases the chances that a student with cognitive or language difficulties will be excluded from the study, either as an outright exclusion or as an analytic exclusion (i.e., cases with missing data are dropped from the analyses by default). In addition, the inability to navigate branching patterns in the SAQ can expose students to developmentally inappropriate questions (e.g., exposure to condom-use questions when the respondent has never engaged in sexual intercourse).

Interviewer-administered surveys, including face-to-face interviews or telephone surveying, seemingly address many of the limitations of self-administered questionnaires; the interviewer reads the survey to the student and navigates the complicated branching patterns, reducing missing data and inconsistent responses. However, extensive research with adolescents has shown that interviewer-administered techniques are more likely to yield decreased reports of sensitive behaviors when compared with self-administered methods [4–10]. Further, this method can be cost-prohibitive when conducting a large-scale surveillance or intervention study.

The most desirable solution is a methodology that draws on the strengths of both personal interviewing and self-administered questionnaires, realized in audio-supported, computer-assisted self-interviewing (A-CASI). A-CASI reads the questions to the respondent, thereby reducing literacy demands. A-CASI allows for the programming of difficult branching patterns, and provides a level of privacy at least comparable to that of SAQ [10]. This method has the benefit of standardization across the study population because the same survey and voice files are used [10]. A-CASI also provides several data quality improvements over SAQ, including faster data entry, fewer nonresponses, and built-in data consistency checks [11]. Children are quick to pick up the technology and can complete surveys with minimal assistance [12]. With few exceptions [13], surveys completed by adolescents using A-CASI have elicited increased reporting of sexual behavior when compared with face-to-face interviews, phone interviews, and self-administered paper–pencil surveys [12,14,15]. In survey research, methods yielding higher reporting of sensitive risky behaviors are believed to be more accurate, as these methods presumably address issues of social desirability [16]. Thus, increased reports of sensitive behavior collected via A-CASI are believed to be more accurate than SAQ reports.

To facilitate A-CASI data collection in school-based research, researchers either rely on existing school resources, such as computer labs, or build their own portable data collection system using laptop computers. Using existing school resources requires less expenditure on the part of the researcher. However, the researcher is then restricted by the availability of those resources, eliminating some prospective study sites owing to lack of resources [13] and potentially increasing selection bias. Use of school resources may also be perceived as less private by students, who may fear that their responses could be accessible to school administrators and teachers [13]. Although creating a portable data collection system (i.e., with laptops) does require a substantial initial investment by the researcher, it reduces selection bias of schools and standardizes equipment across all study participants. However, researchers still need to find an adequate testing site with sufficient flat-top space and electrical outlets, neither of which is widely available in most urban schools. Equally important are the transportation and security demands of an expensive system that requires substantial research staff.

Personal digital assistants

Personal digital assistants (PDA) became widely available in the 1990s. PDAs fit easily into an individual’s hand and have a screen size approximately the size of one’s palm. A stylus, an instrument similar to an inkless pen, is used to navigate the touch-sensitive screen of the PDA. PDAs support multimedia, providing a color screen and volume control. A rechargeable battery can generally run several hours before requiring charging, although a lithium back-up battery is generally included to prevent loss of data.

PDAs offer several equipment-specific advantages over both desktop and laptop computers for data collection. First, PDAs cost substantially less than a laptop or desktop computer ($300 vs. $1000+, respectively). Second, owing to their size, PDAs are easier to transport than laptop computers, placing less physical burden on research staff.

Third, PDAs allow for greater flexibility in the testing venue compared with laptops. Laptops require flat surfaces and ample electric supply, whereas PDAs are charged before data collection and are able to run for six hours before needing to be recharged. PDAs can be used in a variety of settings with and without tabletop access, including classrooms, libraries, cafeterias, media centers, and auditoriums. Because there is little limitation in the venue, it is easier to create larger spaces between students that may contribute to a greater sense of privacy and increase the validity of reporting by the student [13]. However, owing to the small size of the PDA, security of the equipment is a larger issue.

Fourth, PDAs create a natural interface for the student as they fit easily into the palm of one’s hand, allowing students physical control over the privacy of their answers through moving the PDA or physically shifting positions. Further, the stylus is reminiscent of a pen, creating a familiar response mechanism for survey respondents.

PDA-based data collection has most commonly been used in marketing and clinical settings, where a single interviewer or clinician uses one PDA to collect data on a number of respondents or patients. Recently, there have been a number of published reports on the use of PDAs as electronic diaries in areas ranging from the assessment of pain [17–19] and mood [20–23] to health-related quality of life [24]. Palermo et al [18] compared the use of electronic PDA-based pain diaries to paper pain diaries, finding that the PDA-based diary was feasible to use with children and that children were more compliant and accurate in their PDA-based diary entries when compared with paper diaries. Whalen and colleagues [20–23] have used PDA-based diaries to collect behavior, social context, and mood data from adolescents in a longitudinal adolescent health study. However, to our knowledge, there are no published reports on the use of PDAs in school-based research.

One of the barriers to using PDAs in school-based research has been the availability of software needed to develop surveys for the PDA. Although there are currently a number of survey design packages available that contain PDA-based data collection modules [25–33], many options needed for self-administered surveys are not in place, such as the ability to link voice files or to “lock” the respondents into the survey and out of other programs on the PDA. To address these and other shortcomings of current PDA software packages, we developed a new software package, Surveyor, for designing and executing audio-enhanced, PDA-based (APDA) surveys with support from the National Institute of Child Health and Human Development (NICHD). Details regarding the development of Surveyor are available elsewhere [34].

One of the more significant contributions of Surveyor is the ability to link audio files (e.g., *.wav files) to each of the questions in the survey. These files can be recorded from a human or computer-generated voice. In addition, the program locks the student into the survey, deactivating the menu bars and hardware buttons (i.e., the student cannot access other programs on the PDA), and requires a password to exit the program. Students can scroll both forward and backward in the survey and are alerted when they have left a question unanswered, although Surveyor will allow students to leave questions unanswered. Students’ responses are temporarily saved at each programmer-defined section break in the survey, with a final *.txt file containing the completed results in comma-delimited format. In the case of hardware or software failure, the temporary results file can be reloaded and students restart at the beginning of the section where they left off. Similar to A-CASI, Surveyor also allows the programming of complex skip patterns, which minimizes unnecessary exposure of students to sensitive questions.
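To make these features concrete, the sketch below shows one way an audio-linked question with a conditional skip rule and comma-delimited output might be represented. It is purely illustrative: the data model, field names, and file layout are assumptions for this example and are not Surveyor’s actual format or API.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Question:
    """Hypothetical representation of one audio-enhanced survey item."""
    qid: str
    text: str
    audio_file: str          # e.g., a *.wav file played aloud to the student
    responses: list[str]
    # Optional skip rule: given the answers so far, return the next question id,
    # or None to fall through to the next question in order.
    skip_to: Optional[Callable[[dict], Optional[str]]] = None

questions = [
    Question("Q10", "Have you ever kissed someone on the lips?", "q10.wav", ["Yes", "No"],
             # Students answering "No" skip the contingent follow-up item.
             skip_to=lambda a: "Q15" if a.get("Q10") == "No" else None),
    Question("Q11", "Contingent follow-up item.", "q11.wav", ["Yes", "No"]),
    Question("Q15", "Next non-contingent item.", "q15.wav", ["Yes", "No"]),
]

def run_survey(questions, get_answer) -> dict:
    """Walk the question list, honoring skip rules; skipped items stay blank."""
    answers: dict = {}
    index = {q.qid: i for i, q in enumerate(questions)}
    i = 0
    while i < len(questions):
        q = questions[i]
        answers[q.qid] = get_answer(q)            # play q.audio_file, collect a tap
        target = q.skip_to(answers) if q.skip_to else None
        i = index[target] if target else i + 1
    return answers

def save_results(questions, answers, path) -> None:
    """Write one comma-delimited results line, mirroring the *.txt output described above."""
    with open(path, "w") as f:
        f.write(",".join(answers.get(q.qid, "") for q in questions))
```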

By eliminating one of the major barriers to PDA-based survey administration through the creation of a customized, audio-capable, survey design software program similar to those used with A-CASI, we have enabled an initial examination of the feasibility of the APDA system and the impact on data administration and data quality in school-based survey research.

Evaluation of APDA in the Middle School Environment

The second half of this article describes our observations of using APDA in the middle school environment. The study in which the APDA was used is a school-wide intervention study aimed at influencing adolescent sexual behavior, with the secondary aim of influencing the normative environment of the school, which in turn may affect behavior. Owing to this secondary aim, it was very important that we involve all students, including those who might have difficulty completing a survey such as those enrolled in ESL (English as Second Language) classes, special education classes, or even mainstreamed students with cognitive or behavioral challenges. We anticipated that the audio support of the APDA would allow students with limited reading skills and/or less familiarity with the English language to complete the survey; however, it was unclear whether data collected from these groups of students would be similar in quality to the data collected from students without these limitations.

Based on our understanding of A-CASI, there were a number of additional expectations that we had for the APDA. First, we presumed that although students would be mostly unfamiliar with APDA, they would quickly acclimate to the technology. We also expected that APDA would provide all the advantages of A-CASI (e.g., low rates of missing data, programmable skip patterns) with the additional benefits of being cheaper, more portable, and providing a strong sense of privacy to students. Finally, we were particularly interested in how APDA might affect data quantity (e.g., number of questions completed) and quality (e.g., missing data, internal consistency of responses).

Of greatest interest was whether there would be differences in these observations among three mutually exclusive groups: students newer to the United States speaking English as a second language (ESL); special education (SE) students; and all other students (i.e., students living in the United States most or all of their lives and receiving regular education [REG]). That is, we were interested in determining whether there would be group differences in the response to APDA, the ability to complete the survey, the missingness of data, and the internal consistency of survey item responses. Throughout the remainder of the article these three groups will be referred to as the REG (regular education, not newer to the U.S.), ESL (newer to the U.S., speaking English as a second language), and SE (special education) student groups.

Programming of questionnaire

Using Surveyor [35], gender-specific surveys were created, with the number of questions ranging between 203 and 243, depending on the branching paths navigated by the student. Voice files for each question were recorded in a professional studio using a female vocal expert who was asked to affect a “non-biased health professional” voice. Prior research has shown that the gender of the voice does not significantly affect participants’ responses [11], so the female voice was used with both the male and female surveys. It should be noted that the version of Surveyor used at the time of this study did not have the capability of linking audio files to the response categories; thus, this feature was not tested and evaluated in this study.

All surveys were executed on the Dell Axim X5 hand-held computer enhanced with a 256-MB SanDisk flash memory card used for storing voice files and completed survey results files.

Population and data collection procedures

The population comprised seventh grade students from three ethnically diverse, urban middle schools who were participants in a larger intervention trial aimed at postponing and/or reducing sexual activity. The schools were initially selected based on their heterogeneity with regard to ethnicity and culture (roughly a third each Hispanic, white, and African-American). Moreover, owing to the high influx of new immigrants to the neighborhoods surrounding these schools, many of the students speak a language other than English at home. With regard to reading ability, reading proficiency in these three schools is low, with less than 50% of students passing the sixth grade proficiency exam, compared with the Ohio statewide pass rate of 65%. However, these rates are similar to those of the overall school district [36].

This project, including the use of APDA, was approved by the authors’ institutional review board. Active informed consent was obtained from the parents or guardians of all students completing the survey. Immediately before completing the survey, each child completed an assent form. After administrative questions were completed, study staff escorted the student to a seat in the testing area and explained the use of APDA, while completing three example questions with the student. Students received disposable headphones for use with the PDA and were left to complete the survey. When finished, students were instructed by a screen at the end of the survey to raise their hands to notify study staff of their completion of the survey. Students kept their headphones as an incentive for completing the survey.

Sample characteristics

Of the 734 eligible seventh grade students, the parents of 88% (645/734) gave active consent for their child to participate in the study. On average, the students were 12.6 years old (SD = .8); the sample was evenly split by gender (50.5% male), and less than half (45.3%) self-reported grades of a B average or higher. Approximately 40% of students in the sample self-identified as Hispanic, 26% as white, 28% as African-American, 2% as Asian, and less than 1% as American Indian or Pacific Islander. Nearly four percent (3.6%) reported that they did not self-identify with any group. Twelve percent (n = 75) of the eligible study population was identified as “special education” by their respective schools, indicating cognitive or behavioral deficits, and was included in our SE student group. In addition, 9% of the sample (n = 59) was identified as less acculturated, defined as living in the United States for six years or less and mostly or only speaking a language other than English with their parents (the ESL group). Two ESL students also had special education status and therefore are not included as a subgroup in the comparisons below. No students refused to complete the survey.

Descriptive and evaluative measures

Demographic variables

Demographic variables included gender, age, self-reported ethnicity, self-reported grades in school, number of years living in the U.S., and use of a language other than English in the home.

Student response to APDA

Four questions were asked at the end of the survey to assess student response to APDA: (1) How much did you like or dislike completing the survey on the PDA?; (2) How honest were you when answering the questions in the survey?; (3) What did you like about completing the survey on the PDA?; and (4) What did you dislike about completing the survey on the PDA? The latter two questions offered a list of five to six different aspects of APDA (e.g., the questions being read to them, the headphones, privacy), and students were encouraged to choose all that applied. These responses were examined both in aggregate and as a comparison of the REG, ESL, and SE student groups. Crosstabs and corresponding chi-square statistics were calculated for each pair of groups (i.e., REG vs. ESL, REG vs. SE, ESL vs. SE).
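As a hedged illustration of these pairwise comparisons, the sketch below runs a 2 × 2 chi-square test for one “like” item between two groups at a time. The counts are invented for illustration only (they are not the study data), and scipy is assumed to be available.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of students who did / did not endorse one item
# ("liked having the questions read to me"); values are invented.
counts = {
    "REG": (253, 260),   # (endorsed, not endorsed)
    "ESL": (24, 33),
    "SE":  (30, 43),
}

def pairwise_chi_square(a: str, b: str):
    """2 x 2 chi-square test comparing endorsement of one item between two groups."""
    table = np.array([counts[a], counts[b]])
    chi2, p, _, _ = chi2_contingency(table)
    return chi2, p

for pair in [("REG", "ESL"), ("REG", "SE"), ("ESL", "SE")]:
    chi2, p = pairwise_chi_square(*pair)
    print(f"{pair[0]} vs {pair[1]}: chi2 = {chi2:.2f}, p = {p:.3f}")
```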

Time to completion

The time at which each student began and completed the survey was recorded on a log-in sheet by the survey administrator. To examine the impact of APDA on data quantity, we calculated the mean and median time to completion of the survey and compared the time to completion across a variety of demographic variables, including the three student groups (REG, ESL, and SE). T-tests and ANOVA with a Tukey post-hoc analysis were used to assess differences among the groups of interest.
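A minimal sketch of this completion-time analysis, assuming scipy and statsmodels are available; the simulated times below only roughly mimic the group means and standard deviations reported later in Table 2 and are not the study data.

```python
import numpy as np
from scipy.stats import f_oneway, ttest_ind
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
# Simulated completion times (minutes) for the three student groups.
reg = rng.normal(52.2, 12.5, 504)
se = rng.normal(50.4, 11.4, 70)
esl = rng.normal(55.9, 11.8, 56)

# t-test for a two-group comparison (e.g., by gender in the actual analysis).
t, p = ttest_ind(reg, esl)
print(f"t-test: t = {t:.2f}, p = {p:.3f}")

# One-way ANOVA across the three groups, followed by Tukey's post-hoc test.
F, p = f_oneway(reg, se, esl)
print(f"ANOVA: F = {F:.2f}, p = {p:.3f}")

times = np.concatenate([reg, se, esl])
groups = ["REG"] * len(reg) + ["SE"] * len(se) + ["ESL"] * len(esl)
print(pairwise_tukeyhsd(times, groups))
```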

Data quality

Two aspects of data quality, missingness of data and internal consistency of scale items, were examined in this study.

Missingness of data

Missing data often varies by the sensitivity of the topic, the placement in the survey (beginning vs. end), and whether the response is conditional upon a prior response. Thus, a missing data percentage was calculated for various substantive sections of the survey, including sections at the beginning, middle, and end of the survey, and sections on sensitive and nonsensitive topics. Five of the 13 sections were chosen: (a) demographics, which consisted of 15 questions (e.g., what grade are you in?) and was the first section of the survey; (b) a section on “knowledge,” comprised of 12 questions in true/false format (e.g., many persons with an HIV infection have been cured), which immediately followed the demographics; (c) a section on “diet/physical activity behavior,” which was asked in the middle of the survey and consisted of 19 questions (e.g., in the past two weeks, how many times did you eat at a fast food restaurant?); (d) a section on “sexual behavior,” which was also in the middle of the survey and consisted of 15 highly sensitive questions, many of which were contingency based (e.g., have you ever kissed someone on the lips? or has anyone ever touched you below the waist, under your clothing?); and (e) a final section on “parent practices,” which consisted of 19 questions (e.g., my parents pretty much let me do whatever I want) and was located at the very end of the survey. For each question, the percentage of students with a missing response (number missing/number of respondents) was calculated, and the range across the questions in each section is presented for the entire sample and for the three student groups (REG, ESL, and SE).
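A sketch of how these per-question missing-data percentages and section ranges might be computed with pandas, assuming a hypothetical data frame with one row per student, one column per item (NaN where the item was left unanswered), and a "group" column; all column names here are illustrative.

```python
import pandas as pd

def percent_missing_by_section(df: pd.DataFrame, sections: dict) -> pd.DataFrame:
    """Per-question percentage of missing responses, summarized as a range by group and section.

    `sections` maps a section name (e.g., "Knowledge") to a list of its item columns.
    """
    rows = []
    for group, sub in df.groupby("group"):
        for section, items in sections.items():
            pct = sub[items].isna().mean() * 100    # percent missing for each question
            rows.append({"group": group, "section": section,
                         "min_pct_missing": pct.min(), "max_pct_missing": pct.max()})
    return pd.DataFrame(rows)

# Example usage (column names are hypothetical):
# sections = {"Knowledge": [f"know_{i}" for i in range(1, 13)],
#             "Sexual behavior": [f"sex_{i}" for i in range(1, 16)]}
# print(percent_missing_by_section(survey_df, sections))
```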

Data consistency

The internal consistency of items for four separate unidimensional scales was also examined as an indicator of data quality. The scales, chosen for their evaluative nature, required students to either make evaluative judgments to statements (e.g., strongly agree to strongly disagree) or to recall and summarize the frequency of events (e.g., never to always) [37]. We assessed the following constructs: (a) consequences of early sexual initiation (4 items, e.g., having sexual intercourse as a teenager makes it harder for someone to study and stay in school); (b) condom-related prevention beliefs (3 items, e.g., when used properly, a condom prevents HIV); (c) sexual behavior beliefs (4 items, e.g., I believe it is okay for people my age to have sex with a steady boyfriend or girlfriend); and (d) peer social support for healthy behavior (4 items, e.g., how often have your friends suggested ways you could eat healthier or make better food choices?).

These scales were selected because they represented a range of abstract thinking, from specific object attitudes (e.g., condoms) to more abstract general attitudes (e.g., consequences) [37]. They were also selected because we anticipated that ESL and SE students might have more difficulty interpreting and assessing these types of questions, producing less consistency across the items of the scales. For this reason, we examined the internal consistency of the items comprising each scale. If there is less consistency across the items of a known unidimensional (latent) scale for which internal consistency is typically high, the respondent may not be interpreting the questions and responses appropriately [38]; this, in turn, may be owing to reading ability and/or comprehension difficulties that are masked by the PDA (i.e., students need only point and tap their answers after the questions are read to them).

We calculated a separate Cronbach coefficient alpha for each of the four scales within each of the three student groups (REG, ESL, SE). Owing to the difference in sample size between the REG group and the ESL and SE groups, and the impact of sample size on Cronbach alpha coefficients [38], a random sample of the regular education (REG) students was drawn to be similar in size to the other two groups. Feldt’s test of equal coefficients was used to compare the coefficients across the student groups (REG vs. ESL, and REG vs. SE) to determine whether the scales reliably measure the theorized constructs similarly across the groups [39].
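A minimal sketch of the two statistics used here: Cronbach’s coefficient alpha computed from a respondents-by-items matrix, and Feldt’s test of equal coefficients. The degrees of freedom (n1 − 1, n2 − 1) and the upper-tail p value are assumptions based on Feldt et al. [39] rather than a reproduction of the authors’ exact procedure; the example call reproduces the F statistic of roughly 2.72 reported in Table 4 for beliefs about sexual behavior (SE vs. REG).

```python
import numpy as np
from scipy.stats import f as f_dist

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach coefficient alpha for a (respondents x items) response matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

def feldt_test(alpha1: float, n1: int, alpha2: float, n2: int):
    """Feldt's test that two independent groups share the same alpha.

    W = (1 - alpha1) / (1 - alpha2) is referred to an F distribution;
    (n1 - 1, n2 - 1) degrees of freedom and an upper-tail p value are assumed here.
    """
    W = (1 - alpha1) / (1 - alpha2)
    return W, f_dist.sf(W, n1 - 1, n2 - 1)

# Published coefficients for "beliefs about sexual behavior": SE alpha = .545 (n = 73)
# vs. the REG subsample alpha = .833 (n = 75) gives W = .455 / .167, about 2.72.
print(feldt_test(0.545, 73, 0.833, 75))
```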

Results

Student response to APDA

The APDA system was well received by students. Over 94% of students reported that they liked completing the survey on the APDA, and an overwhelming 96% of students reported being completely or mostly honest when completing the survey. When given a list of potential reasons for liking the PDA, nearly half of all surveyed students indicated that they liked having the questions read to them, as illustrated in Table 1; this was similar across the three subgroups, but most pronounced among the REG students. Students also identified liking the privacy of the APDA (59%) and being able to move forward and backward in the survey to change their answers (41%). On the other hand, the most unpopular aspect of completing the survey on the PDA was tapping the “Next” button after each page of questions (20%). Only 5% of students reported that they did not like the PDA, and 6% did not like having the questions read to them. Of interest are the differences observed in the likes and dislikes across the three different student groups. Five characteristics (i.e., like having questions read, like wearing headphones, like something else, dislike tapping the “NEXT” button, dislike something else) were found to be similar across REG, ESL, and SE students (p > .05); many of the other aspects of PDA surveying were quite different among the subgroups.

Table 1.

Student likes and dislikes of APDA: all students and among REG, ESL and SE students

| | All students | Regular educ (REG) (A) | English as second language (ESL) (B) | Special educ (SE) (C) | Group differences (p < .05) |
|---|---|---|---|---|---|
| Likes (%) | n = 643 | n = 513 | n = 57 | n = 73 | |
| PDA itself | 64.6 | 67.6 | 47.4 | 57.4 | AB |
| Privacy | 59.2 | 63.0 | 56.1 | 32.4 | AC, BC |
| Having questions read | 47.9 | 49.4 | 42.1 | 41.2 | |
| Wearing headphones | 28.9 | 29.4 | 26.3 | 26.5 | |
| Ability to change answers | 41.2 | 44.3 | 35.1 | 23.5 | AC |
| Something else | 24.5 | 26.3 | 14.0 | 20.6 | AB |
| Dislikes (%) | | | | | |
| PDA itself | 5.2 | 3.7 | 5.6 | 15.9 | AC |
| Having questions read | 6.4 | 5.1 | 1.9 | 17.4 | AC, BC |
| Wearing headphones | 9.8 | 8.6 | 11.1 | 17.4 | AC |
| Tapping “NEXT” button | 19.4 | 19.5 | 16.7 | 21.7 | |
| Something else | 20.9 | 22.0 | 16.7 | 17.4 | |

Time to completion

The average time to complete the 203–243 question survey was 52 minutes. As shown in Table 2, the range was large, from 15 minutes to 123 minutes; however, less than 1% of the population finished in less than 30 minutes and only 3.2% took more than 80 minutes. The comparability of quartiles across the REG, ESL, and SE groups indicates that a majority of students finished in a relatively similar amount of time. In addition, there were no significant differences in time to completion by gender or age.

Table 2.

Time to Completion: Total and by subgroup

| Group | n | Time to completion, mean (SD) (minutes) | Range (minutes) | Median (minutes) | 25th/75th percentile (minutes) |
|---|---|---|---|---|---|
| Overall total | 632^a | 52.33 (12.35) | 15–123 | 50.00 | 44.0/58.0 |
| Gender | | | | | |
| Male | 318 | 52.12 (12.36) | 20–104 | 50.00 | 44.0/58.0 |
| Female | 314 | 52.54 (12.35) | 15–123 | 51.00 | 44.0/58.0 |
| Age | | | | | |
| ≤ 12 years | 349 | 52.75 (12.82) | 15–123 | 50.00 | 44.5/60.0 |
| ≥ 13 years | 283 | 51.82 (11.20) | 20–104 | 50.00 | 44.0/56.0 |
| Education and language group | | | | | |
| REG | 504 | 52.22 (12.50) | 15–123 | 50.00 | 44.0/58.0 |
| SE | 70 | 50.43 (11.35)^b | 28–78 | 49.00 | 42.5/58.0 |
| ESL | 56 | 55.88 (11.77)^b | 34–92 | 53.00 | 48.0/62.0 |
| Sexual experience^a | | | | | |
| No | 520 | 52.14 (12.09) | 15–123 | 50.00 | 44.0/58.0 |
| Yes | 104 | 52.67 (12.96) | 20–104 | 51.00 | 45.0/57.75 |
* p < .05; ** p < .01; *** p < .001.

a Sample sizes in the subgroup analyses differ slightly due to missing data (i.e., 11 students were missing time data and two students were removed from the group analysis because they were in both the ESL and SE groups).

b Times for the ESL and SE groups were significantly different from each other (p < .01).

The SE students were not only able to complete the survey, but did so in a shorter period of time than the REG students (50.43 vs. 52.22 minutes). This seemingly counterintuitive result may be an indication of poorer reading comprehension, as discussed further below. As expected, the ESL group did take longer to complete the survey than the REG or SE students. However, the post hoc analysis revealed that only the SE and ESL groups were significantly different from each other.

Programmed logic patterns allowed for up to 40 additional questions that could be answered by students, depending upon prior responses. The vast majority of these questions were related to sexual behavior, raising a concern that students answering these additional questions might be identified as sexually active owing to the extra time needed to complete the survey. Examination of the time to completion by sexual experience showed no significant difference.

Data quality

Missingness of data

Table 3 provides the ranges of missing data for questions by section, including sections at the beginning (i.e., Demographics, Knowledge), middle (Diet/Activity Behavior, Sexual Behavior), and end (Parenting) of the survey. For example, in the Demographics section, overall rates of missing data by question ranged from .2% (number of younger siblings) to .8% (attendance at religious services). Of additional note is the similarity between the rates of missing data within sections containing sensitive questions (e.g., sexual behavior) and less sensitive questions (e.g., demographics), and the notably low rates of missing data among the sexual behavior questions. Lastly, rates of missing data appeared to be similar between REG and ESL students; however, SE students had sections with far more missing data compared with the other groups, particularly in the “Knowledge” section.

Table 3.

Percentage missing data per section: total, by education and acculturation

| Section (no. questions) | All students (n = 643) | Regular education (REG) (n = 513) | English as second language (ESL) (n = 57) | Special educ (SE) (n = 73) |
|---|---|---|---|---|
| Demographics (Q = 15) | 0.2–0.8 | 0.2–0.6 | 0–1.8 | 0–2.7 |
| Knowledge (Q = 12) | 0.5–4.5 | 0–3.5 | 0–3.5 | 0–12.3 |
| Diet/activity behavior (Q = 19) | 0.3–1.6 | 0.2–1.8 | 0–1.8 | 0–4.1 |
| Sexual behavior (Q = 15) | 0.2–1.4 | 0.2–1.4 | 0–1.8 | 0–2.7 |
| Parenting (Q = 19) | 0.6–1.9 | 0.6–1.8 | 0–1.8 | 1.4–5.5 |

To further explore the quality of data among the subgroups, we examined the Cronbach coefficient alpha for each of the four different constructs within the survey: consequences of early sexual behavior, condom-related prevention beliefs, beliefs about sexual behavior in adolescents, and peer social support. Results show that Cronbach coefficient alphas were the highest among REG students, followed by the ESL students, with the SE students having the lowest coefficient alphas; this pattern held for all four scales. The alphas for REG students all fall in the respectable to very good range, the alphas of the ESL students fall in the minimally acceptable to respectable range, whereas the alphas of the SE students range from unacceptable (Sex Beliefs: .545) to respectable (Condom Prevention: .733) (Table 4) [38,40].

Table 4.

Internal consistencies of scaled items: REG vs. ESL vs. SE groups

| Construct (no. items) | REG α (n = 75) | ESL α (n = 57) | F statistic (REG vs. ESL) | p value | SE α (n = 73) | F statistic (REG vs. SE) | p value |
|---|---|---|---|---|---|---|---|
| Consequences of early sexual behavior (4) | .800 | .725 | 1.375 | .107 | .651 | 1.745 | .009 |
| Condom-related prevention beliefs (3) | .862 | .756 | 1.768 | .014 | .733 | 1.935 | .003 |
| Beliefs about sexual behavior (4) | .833 | .696 | 1.820 | .010 | .545 | 2.725 | <.001 |
| Peer social support (4) | .770 | .722 | 1.209 | .230 | .638 | 1.573 | .027 |

Results of Feldt’s test for equal coefficients in comparing REG and ESL students indicate that the coefficients for two of the scales were not statistically different from each other (consequences of early sexual behavior and peer social support), whereas two of the coefficients were different (condom-related prevention beliefs and beliefs about sexual behavior). All coefficients were found to be statistically different when comparing the responses of SE students to the REG students.

Observations of APDA “in the Field”

The compactness of this new data collection system allowed a single staff member to tote 25 PDAs, the necessary cords, and the administrative laptop in one bag, greatly reducing the physical burden on research staff. The system also allowed for great flexibility in the testing venue, a distinct benefit in the ever-changing environment of schools. The testing venue varied across the three schools based on availability of space and included locations such as the library, media center, and classrooms. At one of the study schools, the primary designated data collection location was the rarely used balcony of the school’s auditorium, typically used for storage. With poor lighting, minimal electrical outlets, and seating that did not offer fold-up desktops, this space, which would not have been conducive to SAQ or laptop-based data collection, proved to be one of our best venues. Furthermore, the minimalist nature of APDA enabled the field research staff to quickly set up and break down the system and move to new testing venues, or even another school, with little time and effort.

The APDA system proved reliable: no data were lost owing to software or hardware error or equipment loss, the confidentiality of students’ responses was maintained, and data were available for analysis within days of completion of the last survey.

The benefit of the audio files accompanying the survey extended beyond the ability to account for reading level. When students put on their headphones and began the survey, they appeared to become completely enveloped in the “world” created by APDA’s visual, audio, and interactive stimulation. Even the most disruptive student quickly quieted after beginning the survey and was not distracted by the movements or sounds of the research staff or students around him or her. This resolved a common problem with using the SAQ in the classroom setting, where students finish at different times or comment aloud on the survey questions.

As observers, we could tell from students’ body language when they reached sections containing sensitive questions. First, students would look around to see who was looking at them, and then shift their physical positions, seemingly to reassure themselves of their privacy. Students turned sideways in their seats, hunched over the APDA and brought it close to their faces, or even rested their foreheads on the edge of a table or desk to hide the APDA underneath. The APDA is small enough to allow students to feel that they have control over the privacy of their answers.

The small screen of the APDA also allowed students to sit within a few feet of one another without knowing what questions their neighbors were answering. When students complete paper-based surveys in close proximity, it is much easier for peers to recognize which question or section a student is answering, potentially risking confidentiality and reducing the student’s trust in the data collection process. Moreover, a student, teacher, or staff member can walk by and glance at a paper-based survey, possibly gaining knowledge of a student’s confidential responses. Owing to the small screen size of the APDA, exposure of responses to passers-by, such as teachers or other classmates, was minimized. Only one to three questions are displayed at a time using APDA, and it is virtually impossible for onlookers to know the programmed skip patterns and thus infer how a student answered previous questions. Lastly, students can ask study staff about a particular survey question on the APDA without exposing their other responses, maintaining their privacy.

Maintaining students’ confidentiality in the data collection process was also simplified with the APDA: a unique identifier was linked to each survey, rather than relying on the complicated tracking systems often required with paper-based surveys. Moreover, the confidentiality of results was maintained by saving the results as a text file that was not accessible on the APDA; these results files were downloaded to the study staff’s administrative laptop immediately after the student completed the survey and were then deleted from the APDA.

Implications

The results from our preliminary work with this APDA data collection system parallel the published reports on desktop- or laptop-based A-CASI systems in that APDA provided a completely standardized data collection system, with the added benefits of being highly compact and extremely portable. With the audio enhancement, students with a range of reading and language abilities were able to participate in data collection. APDA provided other benefits offered by A-CASI, such as sophisticated skip patterns and automatic production of clean data files. However, differences among regular education students, students newer to the U.S. who speak English as a second language and special education students in PDA preferences (Table 1), rates of missingness (Table 3), and Cronbach alpha coefficients (Table 4) indicate that this method must be explored further with these groups of students to determine the effect of the audio enhancement on question/response comprehension. In particular, as only the questions were read to the students, it is unclear whether the internal consistency could be improved if response categories were also read.

Our experience with the APDA mimics that of Romer et al [12] and Hallfors et al [41] with computers; although 66% of students had never used a PDA before the survey, they quickly learned how to use the technology, exhibited little fatigue during the survey, were very respectful of the equipment, and even asked when they could complete the survey again. Our field staff has rarely observed these characteristics when using paper–pencil surveys with middle school or high school students.

A common problem with the SAQ is that students often refuse or choose not to answer sensitive questions, yielding high proportions of missing data. For example, the Centers for Disease Control and Prevention reports that nonresponse rates on the 2003 Youth Risk Behavior Survey (YRBS) ranged from .4% on a question assessing respondents’ age to 15.5% for the question assessing injurious suicide attempts, with the higher rate likely owing to the sensitive nature of the question [3]. In survey research on sexual behavior conducted among adults, reported nonresponse rates range from 6% to 13% [42,43]. The authors found no published research that explicitly discussed nonresponse rates on questions of sexual behavior among adolescents. In the sexual behavior section completed using APDA, one of the most sensitive sections of the survey, the largest overall nonresponse rate was 1.4%, a potentially significant improvement for research in which sexual behavior is the primary outcome.

Beyond the benefits commonly associated with A-CASI, we found that APDA allowed us to ask more questions than the SAQ in a similar amount of time. Typically, school-based SAQs are designed to be completed within a 45-minute class period, with the maximum number of questions ranging from 30 (upper elementary school) to 80 (high school), depending on the age of the targeted student population. For example, the survey designed for the 2005 middle school YRBS contains 49 questions designed to be answered in a single, 45-minute class period [44]. At this SAQ completion rate (49 questions per 45 minutes), students would have been expected to answer approximately 57 questions in 52 minutes; using APDA, students completed over 200 questions in 52 minutes. Notably, no students asked to stop completing the survey.

One issue of concern was the shorter completion time of the SE students, who were identified as cognitively or behaviorally impaired. Although the difference in time to completion was not statistically significant, this seemingly inconsistent result may bring into question the students’ comprehension of the survey and their responses. This matter is further complicated by the increased proportion of missing data among this subgroup, as well as the lower coefficient alphas of the scales used in the study. It is possible that both missing data and the internal consistency of the four scales could be improved by adding audio enhancement for the response options in addition to the question text. Only through further exploration of this data collection technique among these special subgroups can this question be adequately answered.

The APDA data collection system is still in its development phase. The Surveyor software is currently being upgraded to include additional features, such as linking audio files to the response categories, a printable codebook, a trace log (i.e., tracking response path and time through the survey), a robust logic engine for internal consistency checks, and rich text formatting.

APDA offers great promise to school-based researchers and enhances our understanding of how adolescents with different cognitive and behavioral characteristics react to technology and its use in collecting self-reported data. However, far more research is needed to better understand the underlying mechanisms that contribute to differences in survey administration and data quality. Further, as this study included only urban middle school students, there is no evidence as to how this data collection method would be received by an older or younger population, or in a suburban, rural, or alternative school setting.

Conclusion

Based on the above findings, use of an APDA data collection system is feasible with middle school students in a controlled environment. The portable data collection system was easily implemented in a variety of venues with minimal survey administration staff. No data were lost owing to hardware or software malfunction; data files were immediately available for analysis. Students responded positively to the technologically enhanced survey and remained engaged during the 200+ question survey. Students with cognitive deficits and language barriers were able to complete the survey in a similar amount of time without additional help, although more research on the use of APDA is necessary to fully understand the effect of data collection mode with special populations.

Acknowledgments

This research was funded through a grant and supplemental funding from the National Institutes of Health, National Institute of Child Health & Human Development (R01-HD41364 & 3 R-01-HD41364-02S1). The authors thank other members of the research team and all of the participating schools and youth.

References

1. Resnick M, Bearman PS, Blum RW, et al. Protecting adolescents from harm. JAMA. 1997;278(10):823–32. doi: 10.1001/jama.278.10.823.
2. Gans JE, Brindis CD. Choice of research setting in understanding adolescent health problems. J Adolesc Health. 1995;17(5):306–13. doi: 10.1016/1054-139x(95)00182-r.
3. Brener ND, Kann L, Kinchen SA, et al. Methodology of the youth risk behavior surveillance system. MMWR Recomm Rep. 2004;53:1–13.
4. Schober SE, FeCaces M, Pergamit M, Branden L. Effect of mode of administration on reporting of drug use in the National Longitudinal Study. In: Turner CF, Lessler JT, Gfroerer JC, editors. Survey Measurement of Drug Use: Methodological Studies. Washington, DC: Government Printing Office; 1992. pp. 267–76.
5. Turner CF, Lessler JT, Devore JW. Effects of mode of administration and wording on reporting of drug use. In: Turner CF, Lessler JT, Gfroerer JC, editors. Survey Measurement of Drug Use: Methodological Studies. Washington, DC: Government Printing Office; 1992. pp. 177–220.
6. Brittingham A, Tourangeau R, Kay W. Reports of smoking in a national survey: data from screening and detailed interviews, and from self- and interviewer-administered questions. Ann Epidemiol. 1998;8(6):393–401. doi: 10.1016/s1047-2797(97)00237-8.
7. French SA, Peterson CB, Story M, et al. Agreement between survey and interview measures of weight control practices in adolescents. Int J Eat Disord. 1998;23(1):45–56. doi: 10.1002/(sici)1098-108x(199801)23:1<45::aid-eat6>3.0.co;2-1.
8. Davoli M, Perucci CA, Sangalli M, et al. Reliability of sexual behavior data among high school students in Rome. Epidemiology. 1992;3(6):531–5. doi: 10.1097/00001648-199211000-00013.
9. Millstein SG, Irwin CEJ. Acceptability of computer-acquired sexual histories in adolescent girls. J Pediatr. 1983;103(5):815–9. doi: 10.1016/s0022-3476(83)80493-4.
10. Gribble JN, Miller HG, Rogers SM, Turner CF. Interview mode and measurement of sexual behaviors: methodological issues. J Sex Res. 1999;36(1):16–24. doi: 10.1080/00224499909551963.
11. Turner CF, Forsyth BH, O’Reilly JM, et al. Automated self-interviewing and the survey measurement of sensitive behaviors. In: Couper MP, Baker RP, Bethlehem J, et al., editors. Computer Assisted Survey Information Collection. New York, NY: John Wiley & Sons, Inc; 1998. pp. 455–73.
12. Romer D, Hornik R, Stanton B, et al. “Talking” computers: a reliable and private method to conduct interviews on sensitive topics with children. J Sex Res. 1997;34(1):3–9.
13. Beebe TJ, Harrison PA, McRae JA Jr, et al. An evaluation of computer-assisted self-interviews in a school setting. Public Opin Q. 1998;62(4):623–32.
14. Turner CF, Ku L, Rogers SM, et al. Adolescent sexual behavior, drug use, and violence: increased reporting with computer survey technology. Science. 1998;280(5365):867–73. doi: 10.1126/science.280.5365.867.
15. Ellen JM, Gurvey JE, Pasch L, et al. A randomized comparison of A-CASI and phone interviews to assess STD/HIV-related risk behaviors in teens. J Adolesc Health. 2002;31(1):26–30. doi: 10.1016/s1054-139x(01)00404-9.
16. Schroder KEE, Carey MP, Vanable PA. Methodological challenges in research on sexual risk behavior: II. Accuracy of self-reports. Ann Behav Med. 2003;26(2):104–23. doi: 10.1207/s15324796abm2602_03.
17. Stone AA, Shiffman S, Schwartz JE, et al. Patient non-compliance with paper diaries. BMJ. 2002;324:1193–4. doi: 10.1136/bmj.324.7347.1193.
18. Palermo TM, Valenzuela D, Stork PP. A randomized trial of electronic versus paper pain diaries in children: impact on compliance, accuracy, and acceptability. Pain. 2004;107:213–9. doi: 10.1016/j.pain.2003.10.005.
19. Gaertner J, Elsner F, Pollmann-Dahmen K, et al. Electronic pain diary: a randomized crossover study. J Pain Symptom Manage. 2004;28(3):259–67. doi: 10.1016/j.jpainsymman.2003.12.017.
20. Henker B, Whalen CK, Jamner LD, Delfino RJ. Anxiety, affect and activity in teenagers: monitoring daily life with electronic diaries. J Am Acad Child Adolesc Psychiatry. 2002;41(6):660–70. doi: 10.1097/00004583-200206000-00005.
21. Whalen C, Jamner LD, Henker B, Delfino RJ. Smoking and moods in adolescents with depressive and aggressive dispositions: evidence from surveys and electronic diaries. Health Psychol. 2001;20(2):99–111.
22. Whalen C, Jamner LD, Henker B, et al. Is there a link between adolescent cigarette smoking and pharmacotherapy for ADHD? Psychol Addict Behav. 2003;17(4):332–5. doi: 10.1037/0893-164X.17.4.332.
23. Whalen C, Jamner LD, Henker B, et al. The ADHD spectrum and everyday life: experience sampling of adolescent moods, activities, smoking and drinking. Child Dev. 2002;73:209–27. doi: 10.1111/1467-8624.00401.
24. Saleh KJ, Radosevich DM, Kassim RA, et al. Comparison of commonly used orthopaedic outcome measures using palm-top computers and paper surveys. J Orthop Res. 2002;20:1146–51. doi: 10.1016/S0736-0266(02)00059-1.
25. SurveyView Survey Software. [cited 2004 Mar 9]. Available from: www.surveyview.com.
26. Mercator Research Group. Snap Surveys. [cited 2004 Mar 9]. Available from: www.snapsurveys.com.
27. PocketPC PocketSurvey. [cited 2004 Mar 9]. Available from: www.pocketsurvey.co.uk/
28. Adesso Systems. [cited 2004 Mar 9]. Available from: www.adessosystems.com.
29. Perseus MobileSurvey. [cited 2004 Mar 9]. Available from: www.perseusdevelopment.com/softwareprod/mobilesurveyfeatures.html.
30. Global Bay Mobile Technologies. [cited 2004 Mar 9]. Available from: www.globalbay.com.
31. Apian Software, Inc. SurveyHost. [cited 2004 Mar 9]. Available from: www.surveyhost.com.
32. DigSee SURE. [cited 2004 Mar 9]. Available from: www.mobilentry.com/eng/
33. Raosoft EZSurvey. [cited 2004 Mar 9]. Available from: www.raosoft.com/products/palm/ppc.html.
34. Trapl E. Use of audio-enhanced handheld computers for school-based data collection. Cleveland, OH: Department of Epidemiology & Biostatistics, Case Western Reserve University; 2004.
35. Surveyor (software). Cleveland, OH: Don’t Pa..panic Software; 2003.
36. 2002–2003 Annual Report on Educational Progress in Ohio. Columbus, OH: Ohio Department of Education; 2003.
37. Sudman S, Bradburn NM. Asking Questions. The Jossey-Bass Series in Social and Behavioral Sciences. San Francisco, CA: Jossey-Bass Inc., Publishers; 1982.
38. DeVellis RF. Scale Development. 2nd ed. Applied Social Research Methods Series, Vol. 26. Thousand Oaks, CA: Sage Publications Inc; 2003.
39. Feldt L, Woodruff D, Salih F. Statistical inference for coefficient alpha. Appl Psychol Meas. 1987;11(1):93–103.
40. Nunnally J, Bernstein IH. Psychometric Theory. 3rd ed. New York, NY: McGraw-Hill; 1994.
41. Hallfors D, Khatapoush S, Kadushin C, et al. A comparison of paper vs computer-assisted self interview for school alcohol, tobacco, and other drug surveys. Eval Program Plann. 2000;23(2):149–55.
42. Johnson W, DeLamater JD. Response effects in sex surveys. Public Opin Q. 1976;40:165–81.
43. Bradburn N, Sudman S, Blair E, Stocking C. Question threat and response bias. Public Opin Q. 1978;42(2):221–34.
44. Centers for Disease Control and Prevention. 2005 Youth Risk Behavior Survey: Middle School Questionnaire. 2004.
