Author manuscript; available in PMC: 2008 Jul 1.
Published in final edited form as: J Assoc Nurses AIDS Care. 2007;18(4):51–63. doi: 10.1016/j.jana.2007.05.002

Implementation of Audio Computer-Assisted Interviewing Software in HIV/AIDS Research

Erika Pluhar 1, Katherine A Yeager 1, Carol Corkran 1, Frances McCarty 1, Marcia McDonnell Holstad 1, Pamela Denzmore-Nwagbara 1, Bridget Fielder 1, Colleen DiIorio 1
PMCID: PMC2075082  NIHMSID: NIHMS28264  PMID: 17662924

Abstract

Computer-assisted interviewing (CAI) has begun to play a more prominent role in HIV/AIDS prevention research. Despite the increased popularity of CAI, particularly audio computer-assisted self-interviewing (ACASI), some research teams are still reluctant to implement ACASI technology due to a lack of familiarity with the practical issues related to using these software packages. The purpose of this paper is to describe the implementation of one particular ACASI software package, the Questionnaire Development System™ (QDS™), in several nursing and HIV/AIDS prevention research settings. We present acceptability and satisfaction data from two large-scale public health studies in which we have used QDS™ with diverse populations. We also address issues related to developing and programming a questionnaire; discuss practical strategies for planning for and implementing ACASI in the field, including selecting equipment, training staff, and collecting and transferring data; and summarize the advantages and disadvantages of computer-assisted research methods.


Within the past 20 years, computers have played a role in many phases of survey research, including instrument design, sampling, data entry, coding, cleaning, and analysis. In recent years, survey instruments and questionnaires have become more complex, demanding increasingly advanced skills from interviewers (Couper, 2000). Computer-assisted interviewing (CAI) provides a means of standardizing complex interviews that may be difficult for live interviewers to administer because of intricate questions and skip patterns. Researchers continue to transition to CAI in an effort to ease the burden of complex interviews on both interviewers and respondents and, later in the process, to facilitate data entry and management.

Researchers have employed a variety of CAI programs, particularly audio computer-assisted self-interviewing (ACASI) programs, including those written in Visual Basic and programmed in Microsoft Access, Snap Surveys v.6, Audio Data Systems, Sensus Q&A 2.0, and the Computer-Assisted Survey Execution System. We currently use a software program called the Questionnaire Development System™ (QDS™), developed and sold by Nova Research Company, to build and implement ACASI interviews. This article is based on our experience across several large-scale HIV/AIDS prevention and care studies in which we have used the QDS™ software.

In the ACASI format, questions are read to participants using either a digitally recorded human voice or a text-to-speech computerized voice. Each participant hears exactly the same voice with the same accent reading the same instructions and questionnaire items, presented with the same level of emotion and sensitivity. This feature of CAI provides the standardization required in interview administration, increasing the consistency of questionnaire delivery and decreasing measurement error: potential confounding factors such as the personality, tone, and accent of the interviewer are removed, and every participant hears identical instructions and items. Skip patterns are applied consistently and do not depend on the participant's or interviewer's interpretation. Because the QDS™ system can be programmed in any language, the need for an interpreter is eliminated.

Using the ACASI system allows the questionnaire to be customized to some extent. Items can be linked to the responses of previous items, and it is possible to set parameters for responses, thus eliminating out-of-range answers. ACASI also requires the participant to give a response (response options include "refuse to answer") to each question in the same order, which helps decrease the missing data often observed with paper-and-pencil questionnaires.
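The range checks and required-response logic described above can be illustrated with a small Python sketch. This is an illustrative analogy only, not QDS™ code; the function and the "REFUSED" sentinel are our own inventions.

```python
def validate_response(value, minimum, maximum, allow_refuse=True):
    """Accept a response only if it is within range or an explicit refusal.

    Mirrors, in spirit, the ACASI behavior described in the text:
    out-of-range answers are rejected, and a "refuse to answer"
    option still counts as a response, so no item is left blank.
    """
    if allow_refuse and value == "REFUSED":
        return True
    return isinstance(value, int) and minimum <= value <= maximum

# Example: a 0-7 "days used in the past week" item
assert validate_response(5, 0, 7)          # in range: accepted
assert not validate_response(9, 0, 7)      # out of range: rejected
assert validate_response("REFUSED", 0, 7)  # refusal is still a response
```

In a real ACASI session the interview simply re-asks the item until this kind of check passes, which is why out-of-range and missing values never reach the data file.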

In terms of data management, the ACASI system has distinct advantages over the face-to-face and paper-and-pencil formats. Participants’ responses are entered directly into a format that is easily transferred into a computer database. Direct entry of responses eliminates the need for data entry personnel and reduces the risk of data entry errors. Data are entered and stored in a “data warehouse.”

Despite the increased popularity of computer-assisted interviewing, some research teams are still reluctant to implement ACASI technology due to a lack of familiarity with the practical issues related to using these software packages. Thus, the purpose of this paper is to describe the implementation of one popular ACASI software package, the Questionnaire Development System™ (QDS™), in a variety of HIV/AIDS prevention and care research studies. We begin with a brief review of the literature on CAI and then present acceptability and satisfaction data from two large-scale public health studies in which we have used QDS™ with diverse populations. We then discuss the practical aspects of using QDS™ in the field and conclude with a summary of our experiences.

Background

Because CAI is a relatively new and evolving method of interview administration, much of the research on CAI/ACASI has focused on the reliability and validity of this method of data collection relative to more traditional methods. The reliability and validity of data collected using ACASI have been examined in several populations (men, men who have sex with men, women, adolescents, HIV-positive individuals, and substance users) (e.g., Gribble, Miller, Rogers, & Turner, 1999; Murphy, Durako, Muenz, & Wilson, 2000; van Griensven, Supawitkul, Kilmarx, Limpakarnjanarat, Young, Manopaiboon, et al., 2001). A number of researchers have compared computer-based interviewing techniques with face-to-face interviews (FTFI) or paper-and-pencil interviews (PAPI) (e.g., Perlis, Des Jarlais, Friedman, Arasteh, & Turner, 2004; Williams, Freeman, Bowen, Zhao, Elwood, Gordon, et al., 2000). Typically, the focus of these comparisons has been to determine whether there is an increase in the self-reporting of sensitive data when computer-assisted interviews are used.

Reliability and Validity

Several authors studying substance use have validated self-reported measures collected via ACASI against objective urine test results. Murphy et al. (2000) found significant correlations between self-reported marijuana use via ACASI and urine drug levels in a cohort of 182 HIV-positive and HIV-negative adolescents. Conditional Kappa statistics (Kc) for the total cohort were 0.57, 0.71, and 0.69 for 2-, 5-, and 7-day reports, indicating moderate agreement. Kc values were considerably higher among the HIV-positive adolescents across all reports: 0.66 vs. 0.42 (2-day), 0.92 vs. 0.36 (5-day), and 0.92 vs. 0.30 (7-day). Van Griensven et al. (2001) noted that among Thai youths with positive drug screens, 85% of adolescent males and 84% of adolescent females self-reported drug use via ACASI.

One study compared test-retest reliability using a four-group crossover design comparing FTFI and ACASI in surveys on drug use and sexual behavior among drug users in 10 cities (Williams et al., 2000). The authors found near-perfect agreement on demographic information between the two forms of self-report. Differences in Pearson correlation statistics occurred mainly with respect to the number of times versus the number of days an individual smoked crack, injected cocaine, used heroin, or used speedball. More days of drug use were reported with FTFI, possibly related to the skill of the interviewer in reviewing the details of days and times of drug use. When correlations with urine drug screens were compared between ACASI and FTFI, agreement between ACASI self-report and urine drug screens was significant (p < .001) for cocaine (83% agreement, K = 0.54) and opiates (86% agreement, K = 0.72). Agreement between FTFI self-report and urine drug screens was also significant (p < .001) for cocaine (91% agreement, K = 0.69) and opiates (95% agreement, K = 0.90) (Williams et al., 2000).

Krawczyk, Gardner, Wang, Sadek, Loughlin, Anderson-Mahoney, et al. (2003) examined the test-retest reliability of a baseline HIV questionnaire measuring demographics, sexual risk behavior, substance use, HIV knowledge, medical history, and social support, delivered by ACASI using the QDS™ system in both English and Spanish. All 69 participants were HIV positive. Pearson correlations and Kappa statistics were 0.73 and 0.77, respectively, which the authors considered to be within an acceptable range. Questions that requested dates had the weakest reliability, and this did not vary with language.

Response Patterns with ACASI

Additional research on the application of CAI/ACASI has focused on the response patterns associated with computer-assisted interviews. Several studies have shown that participants are more willing to admit to sensitive behaviors using ACASI than with paper-and-pencil or interviewer-administered formats (Jones, 2003; Metzger, Koblin, Turner, Navaline, Valenti, Holte, et al., 2000; Turner, Ku, Rogers, Lindberg, Pleck, & Sonenstein, 1998).

Several national surveys have used ACASI methodology to survey large populations about sensitive behaviors. Participants in the 1995 National Survey of Adolescent Men (N = 1,690) were randomized to either ACASI or PAPI. The authors found that those randomized to ACASI were over three times more likely to report sex with a prostitute, reported a four-fold higher prevalence of sex with men (p < .001), were almost three times more likely to report injection drug use and three times more likely to report sex with an injection drug user (IDU) (OR 13.8), and were more likely to report carrying a weapon (gun, knife, razor), threatening others or being threatened, and being drunk or high during heterosexual sex (OR 5.5) (Turner, Ku, Rogers, Lindberg, Pleck, & Sonenstein, 1998).

In the 1993 pretest of the 1995 National Survey of Family Growth, respondents who used ACASI reported more abortions than those who used interviewer-administered CAPI (Mosher, 1994). Similarly, participants in the ACASI reporting arm of the 1997 National Household Survey on Drug Abuse (NHSDA) field experiment (Lessler, Caspar, Penne, & Barker, 2000; SAMHSA, 2003) reported a higher prevalence of illicit drug use in their lifetime (OR 1.3, p = .08), in the past year (OR 1.57, p = .00), and in the past month (OR 1.1, p = .73) compared to PAPI. The higher prevalence of drug use was especially notable among adolescents ages 12–17 who used the ACASI format: their reported lifetime and past-year prevalence was 1.76 times greater (p = .00), and their past-month use was 1.57 times greater (p = .02), than those who responded via PAPI.

In a vaccine preparedness study, the responses of 946 gay men and IDUs to sensitive questions were compared between ACASI and FTFI modes. Gay men who responded through ACASI were 1.2 times more likely to report having an HIV sero-positive partner (95% CI 0.65, 2.2) and 1.3 times more likely to report having a partner of unknown HIV status (95% CI 0.91, 1.85), and IDUs were 2.4 times more likely to report sharing unclean needles (95% CI 1.34, 4.30). IDUs using ACASI, however, reported less frequent injection drug use (95% CI 0.51, 1.09) and less use of needle exchange programs (95% CI 0.62, 1.32) (Metzger et al., 2000). No differences were noted in either group in responses to less sensitive questions such as illness or hospitalization and, for gay men, having any male sex partners. Fewer IDU ACASI users reported having health insurance (95% CI 0.30, 0.54) and injecting drugs during the study follow-up period (95% CI 0.52, 0.90).

Researchers have suggested that the increased disclosure in computer-assisted interviewing may be attributable to the impersonal and anonymous interaction between the respondent and the computer, in which respondents are less concerned about how they may be viewed by others (Rosenfeld, Booth-Kewley, & Edwards, 1996). The use of headphones with the computer can help create a "private space" for the interview. Participants enter their responses directly into the computer and are less likely to be influenced by the interviewer (Jones, 2004).

Participant Receptiveness and Acceptability

In addition to response patterns, researchers have studied respondent receptiveness to computer-assisted interviews. In the study by Metzger et al. (2000), only 20% of the participants reported difficulties with ACASI, including the slow speed of the interview, making corrections, and understanding questions. Sixty percent of IDUs and 37% of gay men preferred ACASI, and 60% of IDUs and 59% of gay men felt ACASI encouraged honest answers. Respondents in the NHSDA (SAMHSA, 2003) study felt that ACASI increased privacy, and those with fair to poor reading ability found the audio component helpful in understanding the questions. Similarly, Jones (2003) reported that over 98% of 257 urban young women who were surveyed about HIV sexual risk behaviors felt ACASI was easy to use. In this section, we present data on participant acceptance of and satisfaction with the ACASI system from two of our public health studies. In addition, we share our experience with three other studies in which the ACASI system was used.

Our study group's experience has been broad, as we have used computer assessments with children as well as with both healthy and chronically ill adults. Specifically, we have collected data using ACASI in five different projects with several different populations: adolescent boys ages 11–14 years and their fathers or father figures, children ages 6–12 and their mothers or mother figures, adults with epilepsy, adults with HIV, and women with HIV. Our experience with ACASI encompasses a combined sample of 1,833 participants and approximately 4,276 interviews. See Table 1 for a description of the study samples.

Table 1.

Study samples and number of interviews

| Study | Sample size | Target population | Number of interviews | Number of time points | Length of data collection |
| --- | --- | --- | --- | --- | --- |
| Get Busy Living^ | 247 | HIV+ adult men and women | 607 | 4 | 1 year |
| KHARMA Pilot Study | 16 | HIV+ women | 31 | 2 | 4 months |
| Project EASE | 320 | Adult men and women with epilepsy | 895 | 3 | 6 months |
| Set the P.A.C.E.! (Phase 1)* | 680 | 300 mothers and 380 children ages 6–12 | 680 | 1 | Not applicable |
| REAL Men | 548 | Adolescent boys ages 11–14 and their fathers or father figures | 2020 | 4 | 1 year |
| Epilepsy Pilot | 22 | Adult men and women with epilepsy | 42 | 2 | 3 months |

^ Acceptability data on the use of ACASI were collected from 151 participants.

* Acceptability data on the use of ACASI were collected from 180 mothers.

The two main studies and two pilot studies from which we have gathered data evaluating participant satisfaction with ACASI are Get Busy Living, Set the P.A.C.E.! (Parents and Children Empowered), the KHARMA Project (Keeping Healthy and Active with Risk Reduction and Medication Adherence) pilot, and the Epilepsy pilot. Table 2 presents demographics for these four study samples. Set the P.A.C.E.! was an intervention study designed primarily for urban African American mothers and their 6–12 year old children. The study partnered with Boys & Girls Clubs of Metro Atlanta (BGCMA) to test an intervention that aimed to help mothers communicate with their children about sexuality, puberty, and growing up in order to reduce later sexual risk behaviors and increase resilience among children. Data were collected using ACASI at three time points over one year. However, data on participant satisfaction with ACASI were collected during Phase 1, a descriptive study with only one data collection time point. All interviews took place using laptop computers that were brought to BGCMA sites. In addition, some follow-up interviews were conducted at participants' homes, local churches and community centers, and local libraries. Get Busy Living was designed to evaluate an intervention to foster adherence to antiretroviral medications in HIV sero-positive adults. Data were collected on a laptop or desktop computer with a touchscreen at four time points at an urban HIV clinic. Participant acceptability data were collected from only a portion of the full samples for Set the P.A.C.E.! and Get Busy Living because collection of these specific data commenced midway through study enrollment.

Table 2.

Sample Demographics

| | Set the PACE | Get Busy Living | KHARMA Pilot Study | Epilepsy Pilot |
| --- | --- | --- | --- | --- |
| Mean age | 9.5 (children), 39 (mothers) | 41.6 | 40.5 | 43.5 |
| Median income | $30,000–39,000 annually | $582 per month | $584 per month | -- |
| Gender, n (%) | | | | |
| Male | 101 (56) children | 166 (66) | 0 | 15 (68) |
| Female | 79 (44) children | 80 (32) | 16 (100) | 7 (32) |
| Transgender | 0 | 4 (1.6) | 0 | 0 |
| Ethnic group, n (%) | | | | |
| African American | 158 (88) | 216 (87) | 16 (100) | 10 (46) |
| White | 20 (11) | 19 (8) | 0 | 10 (46) |
| Native American | -- | 3 (1) | 0 | |
| Hispanic | -- | 4 (2) | 0 | 1 (4) |
| Other | 2 (1) | 5 (2) | 0 | 1 (4) |
| Marital status, n (%) | | | | |
| Married | 65 (36) | 17 (7) | 1 (6.3) | 10 (46) |
| Divorced/Separated/Widowed | 54 (30) | 69 (30) | 5 (31.3) | 5 (23) |
| Never been married | 52 (29) | 129 (52) | 4 (25) | 6 (27) |
| Committed relationship | 9 (5) | 32 (13) | 6 (37.5) | 1 (4) |
| Education, n (%) | | | | |
| Less than high school | 9 (5) | 32 (13) | 2 (12.5) | 1 (4) |
| High school or GED | 36 (20) | 127 (51) | 9 (56.3) | 9 (41) |
| College/Technical or higher | 135 (75) | 88 (36) | 5 (31.3) | 12 (55) |

The KHARMA Project pilot study tested the feasibility of an intervention using group motivational interviewing with HIV sero-positive adult women to increase their adherence to antiretroviral medications and risk reduction behaviors. Data were collected using ACASI at two time points. Interviews were conducted on laptop computers in the study office at an urban HIV clinic. The Epilepsy pilot study evaluated the feasibility of conducting a study at the epilepsy clinic of a large urban public hospital; the intervention being tested is designed to increase adherence to medication and epilepsy self-management practices. Baseline and follow-up interviews were completed for each pilot participant on a laptop computer at the epilepsy clinic.

Table 3 provides the results of the participant evaluations of the ACASI interview for Set the P.A.C.E.!, Get Busy Living, the KHARMA Project pilot, and the Epilepsy pilot. The types of sensitive data collected in these studies included questions on sexuality communication, drug and alcohol use, sexual possibility situations, sexual history, HIV status, financial issues, sexual abuse, history of incarceration, depression, and stigma. Over 90% of the participants who completed anonymous evaluations felt comfortable using the computer to answer the questions in the interview and said the computer voice was easy to listen to and understand. With this automated but interactive method of interviewing, participants could focus on the questions; most found it easy to keep paying attention while answering them. Nearly all of the participants felt that using the computer helped keep their answers private, which was important given the sensitive nature of these data. Eighty-two percent disagreed that they would rather have a person ask them the interview questions and write down the answers. Only small numbers of participants evaluated the ACASI experience negatively. For example, across all studies, 11% said they would rather have a person ask them the interview questions, and 11% found the questions on the computer difficult to understand. Overall, these data suggest that the privacy offered by this type of interview outweighed the impersonal nature of the computer. For a small number of participants, however, a live interviewer may be preferable. Further research with individuals who prefer a live interviewer would help investigators understand factors that inhibit or promote participation in computer-assisted data collection.

Table 3.

Participant satisfaction with ACASI in two public health studies and two pilot studies, n (%)

| | Set the PACE (n = 171–172) | Get Busy Living (n = 146–151) | KHARMA Pilot Study (n = 14–15) | Epilepsy Pilot (n = 16–18) | Total |
| --- | --- | --- | --- | --- | --- |
| I felt comfortable using the computer to answer the interview questions. | | | | | |
| Strongly agree | 131 (76) | 133 (88) | 12 (80) | 12 (75) | 288 (81) |
| Agree | 29 (17) | 15 (10) | 3 (20) | 3 (19) | 50 (14) |
| Disagree | 2 (1) | | | 1 (6) | 3 (1) |
| Strongly disagree | 10 (6) | 3 (2) | | | 13 (4) |
| The computer voice was easy to listen to and understand. | | | | | |
| Strongly agree | 88 (51) | 122 (81) | 10 (67) | 10 (63) | 230 (65) |
| Agree | 66 (39) | 23 (15) | 3 (20) | 4 (25) | 96 (27) |
| Disagree | 11 (6) | 1 (1) | 1 (7) | 2 (13) | 15 (4) |
| Strongly disagree | 6 (3) | 5 (3) | 1 (7) | | 12 (3) |
| The questions were difficult for me to understand. | | | | | |
| Strongly agree | 4 (2) | 32 (21) | 1 (7) | 1 (6) | 38 (11) |
| Agree | 6 (4) | 11 (7) | | 3 (18) | 20 (6) |
| Disagree | 66 (38) | 30 (20) | 5 (33) | 5 (29) | 106 (30) |
| Strongly disagree | 96 (56) | 77 (51) | 9 (60) | 8 (47) | 190 (54) |
| It was easy to keep paying attention while answering the questions. | | | | | |
| Strongly agree | 83 (48) | 97 (66) | 10 (67) | 8 (56) | 198 (56) |
| Agree | 79 (46) | 40 (27) | 4 (27) | 10 (44) | 133 (38) |
| Disagree | 5 (3) | 3 (2) | | | 8 (2) |
| Strongly disagree | 5 (3) | 8 (5) | 1 (7) | | 14 (4) |
| I would rather have a person ask me the questions. | | | | | |
| Strongly agree | 4 (2) | 30 (21) | 1 (7) | 2 (13) | 37 (11) |
| Agree | 9 (5) | 10 (7) | 1 (7) | 4 (25) | 24 (9) |
| Disagree | 66 (38) | 47 (32) | 4 (29) | 5 (31) | 122 (35) |
| Strongly disagree | 93 (54) | 59 (40) | 8 (57) | 5 (31) | 165 (47) |
| Using the computer helped keep my answers private. | | | | | |
| Strongly agree | 84 (49) | 115 (78) | 11 (79) | 7 (44) | 217 (62) |
| Agree | 77 (45) | 22 (15) | 2 (14) | 9 (56) | 110 (31) |
| Disagree | 8 (5) | 5 (3) | | | 13 (4) |
| Strongly disagree | 3 (2) | 6 (4) | 1 (7) | | 10 (3) |

Sample size ranged from 347 to 356, taking into account missing responses.

In addition to the studies described above, we have implemented ACASI in other public health studies, which have informed our experience with this technology. The R.E.A.L. Men project is a two-component, gender- and culturally-specific HIV prevention intervention for African American fathers or father figures and their 11–14 year old adolescent sons. Data were collected using ACASI at four time points at local BGCMA sites, participants' homes, churches, and libraries. Project EASE is a study examining epilepsy self-management and was conducted at three centers in Boston and Atlanta. Participants were adults diagnosed with epilepsy who completed three assessments three months apart that addressed demographics, seizure experiences, and personal and environmental factors associated with self-management of epilepsy. The interviews took place in clinics and study offices. R.E.A.L. Men used the QDS™ system; Project EASE used a Visual Basic-based program designed by a co-investigator on the study.

Across all of our studies, we have found that factors such as participants' computer skills, the role of the interviewer, and the length of the computer interview were important to consider in light of the population studied. Depending on the study, participants had varying levels of computer skill, including some who had never used a computer. All of the computer interviews used a laptop or desktop monitor with a touch screen and an audio component. To introduce the computer to participants, we used a standardized script followed by an interviewer demonstration. Study staff demonstrated how the computer worked by going through a short series of preliminary demographic questions with the participant. Next, participants completed practice questions on the computer. For most of the studies, the interviewer assisted the participant at the beginning of the interview and then gave the participant privacy to complete the interview independently. For children ages 6–10, we used the audio computer-assisted personal interview (ACAPI) format, in which the interviewer asked the questions and recorded the child's answers on the computer. Older children (11–12) in the same study completed the computer interview independently using the ACASI format. The length of the interview varied based on the needs of the study and the attention span of the participants. For example, the interview for the epilepsy study lasted approximately two hours, whereas the interview for the 6–10 year old children was limited to 30–45 minutes. We have also used CAI for telephone interviews, in which the interviewer administers the items to participants who no longer reside in the study locale.

Using this technology in the epilepsy study required special caution. Concern had been expressed that using a computer might trigger seizures in individuals with epilepsy. Initially, individuals with a history of photosensitivity were excluded from the study. However, some of those excluded used computers on a daily basis without incident. We therefore received permission from the Institutional Review Board to modify the exclusion criteria, and individuals with photosensitivity were allowed to participate with the permission of a physician. Of the 892 assessments completed, only one person had a seizure that was considered possibly related to the computer interview, and that patient recovered without problems.

Literacy is a challenging issue with computer assessments (Lessler, Caspar, Penne, & Barker, 2000). Participants with low literacy seemed to have an easier time with the computer assessment than with a paper-and-pencil interview because the computer reads the interview aloud and offers the option to repeat a question. With 6–10 year old children, we have found that ACASI is workable in conjunction with interviewers reading the questions aloud to child participants and entering their responses into the computer. Further investigation of this area is needed.

Overall, the computer interview was well received by participants in all of our studies. Our experience has demonstrated that a diverse group of individuals can complete the computer assessment with ease. Furthermore, ACASI has reduced the need for data entry personnel, thereby decreasing the likelihood of data entry error and minimizing data cleaning.

Summary

In sum, investigators have found ACASI to be an acceptable, reliable, and valid format for survey administration, supporting accurate, honest, and confidential participant responses across a broad range of sensitive topics. FTFI does seem to elicit higher reported frequencies of drug use among IDUs; this could be related to the limited ability of early ACASI systems to probe the day-to-day details of behavior in the way an experienced interviewer can. As researchers become more familiar with the ACASI format, this difference may narrow. Further studies are needed in this area.

In reporting on these important reliability, validity, and acceptability issues, however, researchers have tended to overlook the more practical side of implementing ACASI in the field. We turn now to a discussion of implementation issues, including programming a questionnaire, staff training, equipment selection and security, and data collection and transfer.

Practical Application of ACASI

Before the fieldwork begins, a substantial amount of work is required to prepare an ACASI questionnaire using the QDS™ software. Although the remaining discussion focuses on issues related to the development and implementation of an ACASI interview, it is important to note that the first step really involves developing a survey or questionnaire based on sound design principles. Computer delivery will not save a poorly designed survey. In the following section, we address issues related to programming a questionnaire. We then discuss practical strategies for planning for and implementing ACASI in the field, including selecting equipment, training staff, collecting data, and transferring data. All of the recommendations are based on our experience using this technology across five large-scale public health intervention studies.

Programming an ACASI Questionnaire

The QDS™ software package has three main components. First, the questionnaire development component provides the workspace (called a specifications file) and the tools for programming a questionnaire. Second, the delivery component allows the user to transform the specifications file into a variety of end products, including a self-administered audio version, a self-administered written questionnaire, and interviewer-administered versions, as well as a codebook. Third, the data warehouse/management component stores interview data files and transfers them into statistical software programs for data cleaning and analysis. We focus here mostly on the questionnaire design and development component. The design feature is based on "elements," and different elements are added depending on the type of information being collected or conveyed. For example, the "data" element is used to ask a question and create response options, the "information" element is used to convey information to the respondent or interviewer, and the "automatic variable" element prompts the system to enter or calculate a variable such as the date or time.
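The three element types named above can be pictured as plain data structures. The following Python sketch is an analogy only, not QDS™ code: the class names and fields are our own, and QDS™ itself stores these elements in its specifications file.

```python
from dataclasses import dataclass, field

# Hypothetical stand-ins for the three QDS(TM) element types described
# in the text; names and structure are invented for illustration.

@dataclass
class DataElement:
    """Asks a question and defines its response options."""
    variable: str
    text: str
    responses: list = field(default_factory=list)

@dataclass
class InformationElement:
    """Conveys instructions; collects no data."""
    text: str

@dataclass
class AutomaticElement:
    """A system-supplied value, such as the current date or time."""
    variable: str
    source: str  # e.g. "current_date"

# A tiny "specifications file" as an ordered list of elements
spec = [
    InformationElement("The next questions ask about your health."),
    DataElement("A1", "In general, how is your health?",
                ["Excellent", "Good", "Fair", "Poor"]),
    AutomaticElement("IDATE", "current_date"),
]
```

Thinking of the specifications file as an ordered list of typed elements is what makes the later tips (copying items, markers, skip patterns) mechanical rather than mysterious.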

Training staff to use QDS

The first step in initiating use of the QDS™ software is to learn how to build a questionnaire. In our experience, it is most helpful to conduct internal training led by a project staff member knowledgeable in the program. Although it is possible to learn the program using the tutorial that comes with the software, training internally in a one-on-one or small-group format allows for a more interactive and practical learning process. In our case, one staff member learned the software on the job, with intermittent support from a colleague who was already familiar with the program. This staff member then taught many other individuals as new projects began, and the people she trained continued to teach others. Knowledge was thus passed along through informal training, which we believe is possible because the program is not especially complex, particularly if the questionnaire being created is straightforward. In addition, Nova Research, the company that produces the QDS™ software, offers formal training to research staff.

In our experience, it generally takes one initial training session of 2 to 3 hours to establish basic working knowledge and skills for QDS™ programming. Learning appears to be most effective if each trainee is able to work through the various techniques on his or her own computer during the training. Using an actual questionnaire that the trainees will use in the study also enhances practical application. Basic areas to cover in an introductory training include starting a new specifications file; an overview of the various types of elements and how to insert them; building response options and response cards; programming skip patterns and using markers; building different applications, including ACASI, self-administered questionnaires, and codebooks; and testing and troubleshooting.

Tips for questionnaire programming

We have also discovered a number of time-saving, organizational, and programming tips across various studies. First, when building an assessment from the ground up using the QDS design studio software, we suggest that individuals copy and paste similar items in the same instrument so that it is not necessary to type each individual item. For example, if entering a 20-item instrument into the QDS format, the first item can be typed manually and response options selected. If subsequent items use the same response options, the first item can be copied and pasted 19 more times and then the wording of each item can be changed. Copying and pasting (rather than typing each item) also reduces the chance of typographical or grammatical errors.
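The copy-and-paste tip has a programmatic analogue: start from one fully specified item and reuse its response options for the rest, changing only the wording. This Python sketch is illustrative only (the helper and item wordings are invented); in QDS™ itself the duplication is done by copying and pasting in the design studio.

```python
# One shared response set, defined once, mirroring the "type the first
# item, then copy it" workflow described in the text.
LIKERT = ["Strongly agree", "Agree", "Disagree", "Strongly disagree"]

def make_scale(prefix, wordings, responses):
    """Build (variable, wording, responses) triples that all share the
    same response options; only the item text differs."""
    return [(f"{prefix}{i}", text, responses)
            for i, text in enumerate(wordings, start=1)]

items = make_scale("B", ["I like school.", "I have many friends."], LIKERT)
assert items[0][0] == "B1" and items[1][0] == "B2"
assert items[0][2] is items[1][2]  # identical response options reused
```

Defining the response set once is the programmatic equivalent of the error-reduction benefit noted above: a typo in the options can only occur in one place.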

Second, when building complex instruments with many skip patterns, it is helpful to build each individual scale (or subscale) as a separate file first, rather than developing a single large tool that includes several components. This approach allows for efficient, separate testing of the component scales or subscales. Once a scale functions correctly in ACASI, it can be integrated (through copying and pasting) into the larger assessment.

Third, we have found it useful to name variables (i.e., elements) using a standardized format. For example, in a 29-item self-concept instrument that was the second instrument (labeled “B”) in our assessment, we labeled each item with the corresponding variable name B1, B2, B3, etc. For the variable label, we used the phrase “Self-Concept” for each item, followed by a truncated version of the item (e.g., “Self-Concept: classmates make fun of me”). This makes it easier to recognize distinct sections when the programmer is viewing the entire questionnaire specifications file in the design studio (a good organizational tool that applies to survey design, in general).
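This naming convention can be generated mechanically before the items are entered into the design studio. The sketch below is illustrative only (the function name and item wording are hypothetical; QDS™ itself takes these names through its graphical interface):

```python
# Generate standardized variable names and labels for a questionnaire
# section, following the "B1, B2, ..." convention described above.
def make_labels(section, items, scale_name):
    """Return (variable_name, variable_label) pairs,
    e.g. ('B1', 'Self-Concept: classmates make fun of me')."""
    return [
        (f"{section}{i}", f"{scale_name}: {text}")
        for i, text in enumerate(items, start=1)
    ]

# Example: the first two items of a hypothetical self-concept scale.
items = ["classmates make fun of me", "I am happy with who I am"]
pairs = make_labels("B", items, "Self-Concept")
```

A list like this can serve as a checklist while programming, ensuring that variable names and labels stay consistent between the specifications file and the codebook.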

Fourth, in working with youth, we have had a few instances where a child participant manipulated the computer after the interview had been completed in such a way that the data from the interview did not save properly. Thus, we advise building into the program an element that requests the interviewer to enter a close-out identification number. This number is pre-set by the programmer, known only to the interviewers, and is required to move out of the ACASI interview. Using a close-out code allows the interviewer more control over what happens to the data at the end of the interview, particularly when the participant is self-administering the interview.

Fifth, when programming instruments that involve skip patterns, we have found it better to use a feature called "markers" to program skip sequences that span more than two elements. A marker is simply an indicator in the program. For example, for instrument "B" discussed above, the end of the instrument might be marked with a line of code that reads "ENDB." If only some participants are supposed to answer the questions in instrument "B" (e.g., females but not males), a skip pattern is written into the code before the instrument begins so that women continue while men automatically skip to the end, with the marker "ENDB" indicating where the instrument stops. In general, we have found that attempting to skip more than two elements directly often results in programming errors, whereas markers provide a more distinct command.
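The marker logic above can be illustrated with a small sketch. QDS™ expresses this in its design studio rather than in code, so the element names, the skip condition, and the simple interview loop below are all hypothetical, used only to show why jumping to a named marker is more robust than counting elements:

```python
# Illustrative sketch of marker-based skipping. Each element is a
# (name, kind) pair; "ENDB" is a marker closing instrument B.
elements = [
    ("SEX", "question"),       # respondent's sex, answered earlier
    ("SKIP_IF_MALE", "skip"),  # males jump ahead to the ENDB marker
    ("B1", "question"),
    ("B2", "question"),
    ("ENDB", "marker"),        # end of instrument B
    ("C1", "question"),
]

def run(elements, is_male):
    """Return the names of the elements a respondent actually sees."""
    asked, i = [], 0
    while i < len(elements):
        name, kind = elements[i]
        if kind == "skip" and is_male:
            # Jump straight to the named marker instead of skipping
            # a fixed number of elements, so inserting or deleting
            # items inside instrument B cannot break the skip.
            i = next(j for j, (n, _) in enumerate(elements) if n == "ENDB")
            continue
        if kind == "question":
            asked.append(name)
        i += 1
    return asked
```

Because the jump targets a name rather than a count, editing the items between the skip and the marker does not invalidate the skip pattern, which is the practical advantage we observed with markers.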

Sixth, we attempted to record human voices for use in several of our studies but found that the time required to ensure proper programming and execution was generally not worth it compared with the ease of using the computer voice. In addition, any changes made to the questionnaire had to be re-recorded, adding more steps to the development process. A variety of computer voices are now available for use with QDS™, allowing researchers to tailor computer-based interviews more effectively to the gender and race of study participants. However, researchers have not found differences based on the type of voice in participants' willingness to finish the survey or to answer specific questions, or in the answers participants provide (Couper, Singer, & Tourangeau, 2004).

Use of ACASI Methods in the Field

As computer-assisted interviewing methods become more common in research, so does the need to address logistical issues related to using CAI equipment in the field. The main focus for researchers and project staff is on the safe transfer of data from the field to the base office for processing. Several key issues become pertinent in the selection of both hardware and software for successful completion of data collection tasks. We have found that purchasing the best equipment, developing a project protocol, developing an extensive staff training agenda, and transferring the data from the field to the base office are critical issues to consider when using computer technology to conduct data collection in the field.

Equipment

Because of the increased possibility of interviewer coding errors with paper-and-pencil interviews and interviewer keying errors with CAPI, the laptop computer has become a popular piece of equipment for collecting ACASI data. But which computer is best suited for this task? We tested two different laptops in the beginning phase of our research planning. Since then, our computer of choice for several projects has been one that is small and lightweight, capable of running several applications including QDS™, and equipped with voice and touchscreen features. The models we use have external floppy drives, which facilitate data transfer. If interviews are conducted in a set location, a desktop computer with touchscreen capabilities can also support ACASI. In addition to selecting a computer, other equipment to consider includes durable headphones that will remain intact through repeated use and computer bags with multiple pockets and zippers to store equipment and paperwork securely.

When assessing overall study costs, fewer interviewers are needed to conduct ACASI interviews, reducing the total cost of the study; however, the purchase of computers and software during the first year can be expensive. Laptop computers with touchscreen capability, which we have found to be the best, are often the most expensive. It can also be difficult to estimate the number of computers required for a project; on two projects, we later purchased additional computers to accommodate the number of interviews. Modifications in computer hardware and software over the course of a project can be problematic as well. Our first computers came with Windows 98, but those purchased later came with Windows 2000. This led to compatibility problems with our file transfer program (originally written in C++), and a new program had to be written in Visual Basic.

Protocol

After the laptop selection has been made, it is helpful for the project staff to develop a data collection protocol that includes tracking methods such as a log to check the computers in and out of the base office, a form to record computer problems, and a form for reporting protocol violations that may occur during data collection. The protocol also may include directions on how to handle participants who do not want to use the computer or who have difficulty reading, and what staff should do if a laptop is lost or stolen. The process of checking the laptop in and out may vary from project to project, but it has been critical for maintaining an inventory of equipment and the accountability of interviewers. If several laptops are used for data collection in the field, numbering the computers and labeling them with the project name helps keep track of the equipment. On the log, interviewers record the computer number assigned to them, along with the date, time, and condition of the assigned laptop. The same information is recorded when the laptop is returned to the base office after an assessment has been completed. We have asked interviewers to record laptop problems, power or battery failures, and missing equipment in a computer log stored in the carrying bags. All protocol violations related to the laptop are reported to the project director and recorded on the computer log. Although these checks and balances may seem like minor details, they have proven critical in monitoring field data collection activities using ACASI on laptop computers.

Training

The next step involves training the field staff. Training is crucial so that interviewers can educate potential participants on how to use the laptop computer in a standardized manner; it also builds the comfort level necessary to collect the data. In our experience, training project staff occurs before data collection begins and continues periodically as needed in the form of "refresher courses." Training is usually conducted by the project director or project coordinator and includes an introduction to the data collection equipment (basics such as setup and how to turn the computers on and off) and detailed training on the use of the QDS™/ACASI software. In addition, training for our field staff has included practice sessions with mock interviews and role-plays to work through specific interview scenarios. With training, most staff members can administer the questionnaire, reducing or eliminating the need for highly trained and skilled interviewers. Using ACASI, one staff member can administer more than one questionnaire at a time. In the REAL Men and Set the P.A.C.E.! projects, we used eight computers, which allowed us to be at multiple sites and to interview several participants at the same time.

Preparation

After the training and practice sessions, interviewers are ready to begin data collection. We have found that a checklist can help guide interviewers as they move into the field. Checking the laptop before leaving the base office is paramount. The computers should have been recharged in case of power failure at the research site, and the material needed for the interview should be included in the carrying case. A checklist helps to ensure that mandatory items are included such as a battery pack, power cord, participant’s resource book, a guide for answering participants’ questions in a standardized format, floppy disks, paper copies of the questionnaire (in case of computer failure), pencils, pens, required paper work, extra styli (for manipulating the touch screen), headphones, and a computer problem log book.

Data Collection

Setting up the laptops with ACASI at the research sites can be a smooth process. One of the most common snags for us has been negotiating space at our community-based organization sites, which can be facilitated by frequent dialogue, planning, and open communication with site staff members. Once space has been negotiated, the interviewers need to secure a power source. Some electrical outlets may be nonfunctional or located in a hard-to-reach place; an outlet tester or night light can be handy for testing outlets prior to computer setup. The same type of relationship building applies to interviews conducted in participants' homes. In this setting, problems occur when other family members manipulate the computers, when functional and secure outlets cannot be found, or when power failures occur. For any field data collection, interviewers should remember to keep the computer at a reasonable height, away from water, within sight at all times, and completely charged.

It is useful to build breaks into the longer interviews, and breaks may need to be used more frequently for young children and older adults. We found that it is helpful to offer participants the opportunity to take a break and to provide verbal encouragement from the interviewer as well as the voice on the computer.

Data Transfer

One of the important responsibilities of an interviewer is to transfer the data from the field to the base office. To transfer ACASI data, we train interviewers to copy the completed interview file from the laptop onto a floppy disk. We have developed a simple program in Visual Basic that guides the interviewer through the back-up process. First, a completed .QAD (data) file is saved in a default location. The interviewer then clicks on an icon named "Step 2," which initiates the copying process. The program prompts the interviewer to enter the participant identification number (which is also the title of the saved data file). Upon hitting "enter," the file is copied into two back-up folders on the hard drive of the computer and onto a floppy disk in the external drive. The disk is returned to the project office and given to the data manager with the accompanying paperwork. Other possibilities for data transfer include sending an encrypted data file over the Internet and using an external jump drive instead of floppy disks. We tried the former option in one of our pilot studies but had trouble negotiating the firewall at our institution; the latter may be particularly useful for data files that exceed the size limit of typical floppy disks. QDS™ also has a shipping option in its data warehouse component, but we have not yet used it in our work.

We have experienced several problems using the floppy disk system to transfer data. These include corrupted, full, unformatted, or missing floppy disks; computer malfunctions that prevent data from being saved to disk; and user error. Interviewers should be trained to locate the saved QDS™ data file on the computer hard drive so that it can be manually copied and pasted onto a floppy disk should the program malfunction.

Other possible approaches for data transfer are available, depending on the study, that may reduce or eliminate some of the problems we experienced. These include web-based data entry, in which the questionnaire is created and administered and data are collected via the Internet. Another option is to connect the laptop containing the questionnaire to a network over which data are automatically transferred to the study database. While both options eliminate the need to transfer data manually from the laptop to a central database, they also require a study location with Internet access and/or networking capability.

Equipment and Data Security

Last but very important is the need to secure data at all times. Interviewers are responsible for securing the laptop computers containing data stored on the hard drive. When transported by car, computers should be stored in the trunk rather than on the seat. When not in use, laptop computers should be kept in a locked cabinet in a secure room to prevent potential theft. In our studies that have used desktop computers, we secured the computer to the desk with a lock.

To address data security concerns, each laptop should be set up so that a password is required to enter the operating system. We also assign each study staff member a unique ID and password. Study databases are maintained in the home office and are password protected, and only the statistician and principal investigator have access to the study data. Identifying information is stored in a Microsoft Access database, separately from participants' responses, which are stored and analyzed in SPSS.

Conclusion

Overall, there are numerous advantages to the use of ACASI technology. The QDS™ system described here facilitates standardized, consistent administration of questionnaires in multiple languages. Across many studies, data obtained via ACASI have been shown to be accurate, valid, and reliable. As other authors have indicated, ACASI can be used to collect data from a broad range of individuals (de Leeuw, Hox, & Kef, 2003). We have used ACASI to collect data primarily from adult men and women and from adolescents. Our adolescent participants ranged in age from 11 to 16 years and our adults from 18 to 75 years, and most live in urban areas. Our participants have included men who have sex with men, elderly adults, drug users, and people with HIV, epilepsy, and other chronic illnesses such as hypertension and diabetes. They also spanned the socioeconomic continuum, from people receiving public assistance to those making over $100,000 per year. Some of our participants have low literacy skills, and many have either never used a computer or have limited computer skills; the audio component of ACASI is particularly effective for participants with low literacy. Across these diverse individuals and settings, the majority felt comfortable using the computer, liked the privacy it afforded, and encountered few difficulties with its use.

A major weakness of ACASI is the loss of the detailed probing that a skilled interviewer can accomplish face-to-face. In addition, cost may be prohibitive for projects with little or no funding. Data can be lost if staff are not properly trained and periodically retrained, or if a knowledgeable data manager is not available to troubleshoot. The computers also may be difficult to use for cognitively impaired individuals and for children with Attention Deficit/Hyperactivity Disorder. As ACASI technology and researchers' expertise evolve and improve, these issues may be resolved.

Acknowledgments

This research was supported by the following grants from the National Institutes of Health: NIH R01 HDMH 39541, NIH/NINR R01 NR08094-01, and NIH 5 R01 NR 04770.

Footnotes

Publisher's Disclaimer: This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final citable form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

References

  1. Couper MP. Usability evaluation of computer-assisted survey instruments. Social Science Computer Review. 2000;18(4):384–396.
  2. Couper MP, Singer E, Tourangeau R. Does voice matter? An interactive voice response (IVR) experiment. Journal of Official Statistics. 2004;20(3):551–570.
  3. de Leeuw E, Hox J, Kef S. Computer-assisted self-interviewing tailored for special populations and topics. Field Methods. 2003;15(3):223–251.
  4. Gribble J, Miller H, Rogers S, Turner C. Interview mode and measurement of sexual behaviors: Methodological issues. Journal of Sex Research. 1999;36(1):16–24. doi: 10.1080/00224499909551963.
  5. Jones R. Relationships of sexual imposition, dyadic trust, and sensation seeking with sexual risk behavior in young urban women. Research in Nursing & Health. 2004;27(3):185–197. doi: 10.1002/nur.20016.
  6. Jones R. Survey data collection using audio computer assisted self-interview. Western Journal of Nursing Research. 2003;25(3):349–358. doi: 10.1177/0193945902250423.
  7. Kiesler S, Sproull LS. Response effects in the electronic survey. Public Opinion Quarterly. 1986;50:402–413.
  8. Krawczyk CS, Gardner LI, Wang J, Sadek R, Loughlin AM, Anderson-Mahoney P, Metsch L, Green S. Test-retest reliability of a complex Human Immunodeficiency Virus research questionnaire administered by an audio computer-assisted self-interviewing system. Medical Care. 2003;41(7):853–858. doi: 10.1097/00005650-200307000-00009.
  9. Lessler J, Caspar R, Penne M, Barker P. Developing computer assisted interviewing (CAI) for the National Household Survey on Drug Abuse. Journal of Drug Issues. 2000;30(1):9–34.
  10. Metzger D, Koblin B, Turner C, Navaline H, Valenti F, Holte S, Gross M, Sheon A, Miller H, Cooley P, Seage G. Randomized controlled trial of audio computer-assisted self-interviewing: Utility and acceptability in longitudinal studies. American Journal of Epidemiology. 2000;152(2):99–106. doi: 10.1093/aje/152.2.99.
  11. Moon Y. Impression management in computer-based interviews: The effects of input modality, output modality, and distance. Public Opinion Quarterly. 1998;62:610–622.
  12. Mosher W. Design and operation of the 1995 National Survey of Family Growth. Family Planning Perspectives. 1998;30(1):43–46.
  13. Murphy DA, Durako S, Muenz LR, Wilson CM. Marijuana use among HIV-positive and high-risk adolescents: A comparison of self-report through audio computer-assisted self-administered interviewing and urinalysis. American Journal of Epidemiology. 2000;152(9):805–813. doi: 10.1093/aje/152.9.805.
  14. Perlis T, Des Jarlais D, Friedman S, Arasteh K, Turner C. Audio-computerized self-interviewing versus face-to-face interviewing for research data collection at drug abuse treatment programs. Addiction. 2004;99:885–896. doi: 10.1111/j.1360-0443.2004.00740.x.
  15. Rosenfeld P, Booth-Kewley S, Edwards JE, Thomas MD. Responses on computer surveys: Impression management, social desirability, and the big brother syndrome. Computers in Human Behavior. 1996;12(2):263–274.
  16. Substance Abuse and Mental Health Services Administration (SAMHSA). Executive summary: Development of computer-assisted interviewing procedures for the National Household Survey on Drug Abuse. 2003. Retrieved November 19, 2003, from www.samhsa.gov/oas/nhsda/CompAssistInterview/cover.htm.
  17. Turner CF, Ku L, Rogers SM, Lindberg LD, Pleck JH, Sonenstein FL. Adolescent sexual behavior, drug use, and violence: Increased reporting with computer survey technology. Science. 1998;280(5365):867–873. doi: 10.1126/science.280.5365.867.
  18. Van Griensven F, Supawitkul S, Kilmarx PH, Limpakarnjanarat K, Young NL, Manopaiboon C, Mock PA, Korattana S, Mastro TD. Rapid assessment of sexual behavior, drug use, Human Immunodeficiency Virus, and sexually transmitted diseases in Northern Thai youth using audio-computer self-interviewing and noninvasive specimen collection. Pediatrics. 2001;108(1):e13–19. doi: 10.1542/peds.108.1.e13.
  19. Williams ML, Freeman RC, Bowen AM, Zhao Z, Elwood WN, Gordon C, Young P, Rusek R, Signes CA. A comparison of the reliability of self-reported drug use and sexual behaviors using computer-assisted versus face-to-face interviewing. AIDS Education and Prevention. 2000;12(3):199–213.
