Author manuscript; available in PMC: 2014 Jan 31.
Published in final edited form as: Stud Fam Plann. 2010 Dec;41(4):251–262. doi: 10.1111/j.1728-4465.2010.00251.x

Survey Estimates of Non-Marital Sex and Condom Knowledge among Ethiopian Youth: Improved Estimates Using a Non-Verbal Response Card

David P Lindstrom 1, Tefera Belachew 2, Craig Hadley 3, Megan Hattori 1, Dennis Hogan 1, Fasil Tessema 2
PMCID: PMC3907937  NIHMSID: NIHMS529221  PMID: 21465726

Abstract

The accurate assessment of risky sexual behaviors and barriers to condom use is essential to reduce the spread of HIV/AIDS. This study tests a new non-verbal response card method for obtaining more accurate responses to sensitive questions in the context of face-to-face interviewer-administered questionnaires in a survey of 1,269 Ethiopian youth ages 13-24 years. Comparisons of responses between a control group that provided verbal responses and an experimental group that used the card indicate that the prevalence of non-marital sexual intercourse may be twice as high as typical survey methods suggest, and that knowledge of condom access may be 20 percent lower in the study population. These results suggest that conventional face-to-face interviewer-administered surveys may provide seriously biased estimates of risky adolescent sexual behavior and perceived access to condoms in illiterate populations, hampering the development of successful programs to limit HIV/AIDS.

Introduction

Successful programs to combat the spread of HIV/AIDS in a population require accurate information about the prevalence of risky sexual behaviors among men and women. Estimates of the at-risk population, by subgroup, permit programs to target certain groups in the population and are essential for the effective social marketing of condoms for disease prevention. Sample surveys are the primary source of information about risky sexual behaviors and condom use. Yet survey researchers have long known that in interviewer-administered surveys, respondents often intentionally misreport their behavior and attitudes in order to create a more favorable image of themselves in the eyes of the interviewer, or to avoid creating an awkward interaction. Response bias is an especially critical issue for obtaining information about sexual behaviors, particularly high-risk sexual behaviors.

A number of innovations in survey methodology have been developed to address response bias, including strategies to increase the level of respondent privacy and confidentiality while preserving the advantages of having an interviewer present. These innovations typically involve some level of respondent self-administration for the sensitive portion of the interview, and often require basic literacy. In this study we introduce a new non-verbal response card method for soliciting responses to sensitive questions that was developed and tested in a survey of adolescent sexual behavior and knowledge fielded in southwestern Ethiopia. We present reports of sexual behavior, knowledge of condoms, and acceptance of premarital sex for respondents who gave conventional verbal responses, and for respondents who used the non-verbal response card method. The results reveal significant differences in reported behavior, knowledge, and attitudes by response method. In particular, we find higher levels of non-marital sex and lower levels of condom knowledge reported by respondents who used the non-verbal response card method compared to the verbal response method.

Background

Demographic research in developing nations has long been concerned with survey measurement and analysis of a variety of potentially sensitive issues and behaviors; these include sex before marriage or outside of marriage, unprotected intercourse, substance use, abortion, family violence, and the autonomy of women in household decision-making. Although there is a general recognition that responses to survey questions on these topics may be inaccurately reported, nearly all research on reporting errors in developing country surveys has focused on issues dealing with non-response (Gibson, Hudes and Donovan 1999; Mishra et al. 2006), the temporal compression or telescoping of events (Gage 1995), and the consistency and reliability of responses (Eggleston, Leitch and Jackson 2000; Knodel and Piampiti 1977; Strickler et al. 1997; Williams, Sobieszczyk and Perez 2001; Nyitray et al. 2009). Less attention has been given to the accuracy of survey responses to sensitive items in interviewer-administered population surveys (for exceptions see Gregson et al. 2002; Gregson et al. 2004; Lara et al. 2004; Mensch, Hewett and Erulkar 2003; Nnko et al. 2004; Obermeyer 2005; Plummer et al. 2004; Weinhardt et al. 1998).

In face-to-face interviewer-administered surveys, non-response and intentional misreporting are common response effects with questions that address sensitive topics. The refusal to participate in a survey interview or to respond to individual questions can bias survey results. A potentially worse situation arises when subjects intentionally misreport their behavior or opinions rather than refuse to answer, because they feel socially obligated to cooperate, or because they wish to make a positive impression on the interviewer. This type of misreporting may be more problematic from a data quality perspective than non-responses because it is not easily detected and can bias sample estimates without the researcher's knowledge. In comparison, item-specific non-response can be assessed for the differences between those who respond to specific items and those who do not.

Systematic misreporting on sensitive topics generally takes the form of underreporting socially undesirable behaviors or attitudes and overreporting desirable ones. Tourangeau, Rips and Rasinski (2000) identify social desirability, invasion of privacy, and risk of disclosure as three dimensions of sensitive questions that generate response bias. Social desirability refers to the tendency of respondents to report behaviors or attitudes that project a favorable image of themselves and that do not offend the interviewer or elicit the interviewer's disapproval. Social desirability stems from an individual's need for social approval, as well as the desire to conform to perceived cultural norms of good behavior and cooperation and avoid embarrassment and shame. Johnson and van de Vijver (2003) find systematic cross-cultural differences in the response effects of social desirability, with lower levels of social desirability bias associated with higher levels of affluence and social power. In the United States, minority groups are more likely to underreport stigmatizing behaviors such as substance abuse and abortions than majority whites (Jones and Forrest 1992).

Because social desirability is based on the respondent's assessment of the degree of sensitivity of a question and how the interviewer will judge a particular response, the relative magnitude and direction of response effects in face-to-face interviews will vary, often in predictable ways, across questions, response modes, individuals, social groups, and cultures (Catania 1999). These issues are particularly salient for survey research on sexual behavior and reproductive health (see Axinn 1991; Bearinger et al. 2007; Marston and King 2006; Puri and Busza 2004; Zehner 1970). For example, the double standard for the sexual behavior of men and women produces a tendency for women to underreport the number of sexual partners and for men in some age or cultural subgroups to overreport the number of partners (Catania et al. 1990; Curtis and Sutherland 2004; Fenton et al. 2005; Marston and King 2006; Mensch, Hewett and Erulkar 2003; Nnko et al. 2004; Plummer et al. 2004; Smith 1992). Persons who are highly educated and those living in cities typically are less inhibited in reporting non-normative behaviors than poorly educated rural respondents. These differentials in response bias prevent the accurate description of sexual behaviors at the population level, and misrepresent the extent of social and economic differences in reported sexual behaviors.

Privacy issues are a second dimension of sensitive questions that generate response effects in interviewer-administered surveys. Sensitive questions, particularly those dealing with intimate sexual behaviors, may be viewed as intrusive. Investigators count on the impersonal and scientific nature of the survey interview to reduce the awkwardness associated with questions about private matters. However, in cultures that emphasize collectivism and cooperation in social interaction, the need to maintain positive and harmonious relations with the interviewer can contribute to biased results if respondents react to intrusive questions by providing inaccurate responses (Johnson and van de Vijver 2003; Jones 1983). For instance, in Ethiopia, refusal rates for surveys are exceptionally low compared to surveys in higher income countries, in part because of the strong cultural emphasis on politeness and conformity (Central Statistical Agency [Ethiopia] and ORC Macro 2006). These high response rates, however, may mask intentional misreporting by respondents who might otherwise refuse to participate.

A third dimension that generates response effects for sensitive questions is the risk of disclosure. Respondents may refuse to answer a sensitive question or intentionally misreport a behavior or attitude because of concerns that others will hear their responses during the course of the interview. There also may be a concern that interviewers who learn embarrassing responses will reveal those responses to others, especially when the interviewers are recruited locally from the same ethnic, linguistic, and religious group.

These response effects are often sensitive to the mode of data collection the interviewer uses for sensitive questions. In spite of the problems of non-response and misreporting, the advantages offered by the presence of an interviewer (i.e., higher overall participation rates, question clarification, fewer invalid responses, and direct observation) generally outweigh the potential drawbacks (Catania et al. 1990). A number of innovations in questionnaire administration and response modes have been introduced for use in face-to-face interviews to reduce the response effects produced by sensitive questions (Tourangeau et al. 1997). In computer-assisted self-interviewing (CASI), questions are displayed on a computer screen and responses are entered using the keyboard. Simultaneous verbal instructions may be provided by the interviewer or played through earphones (audio computer-assisted self-interviewing, ACASI) to guide the respondent. An alternative method for collecting sensitive survey data is to provide the respondent with a self-administered paper and pencil questionnaire that the respondent places in a sealed envelope upon completion. Studies conducted in developed countries have consistently shown that some form of self-administration in the sensitive section of a questionnaire reduces the level of misreporting (Couper and Stinson 1999). For example, illicit drug use is more likely to be reported in self-administered questionnaires than interviewer-administered questionnaires (Tourangeau, Rips and Rasinski 2000:270). Tourangeau and Smith (1996) found that the gap in the number of sexual partners reported by men and women in interviewer-administered questionnaires was sharply reduced when computer-assisted self-administration was used.
Jones and Forrest (1992) found that the reporting of abortions by American women in the National Survey of Family Growth (NSFG) increased significantly when respondents were given a self-administered questionnaire as compared to an interviewer-administered questionnaire. In the case of desirable behaviors, Gribble et al. (1999) report that normative behaviors such as consistent condom use are less likely to be overreported in a telephone audio computer-assisted self-interview (T-ACASI). Macalino et al. (2002) found that injecting drug users reported lower levels of preventive behavior with ACASIs than in face-to-face interviews. As expected, research also indicates that the impact of self-administration is negligible with non-sensitive questions (Tourangeau, Rips and Rasinski 2000; Turner et al. 1998).

The recent proliferation of methodological experiments in the developing world suggests that alternative methods for survey administration have potential for reducing social desirability bias, although the results are mixed. Gregson et al. (2002; 2004) used informal confidential voting interviews in which the interviewers read the questions and the respondents wrote the answers on voting strips before placing the strips in ballot boxes. This method produced higher rates of reported HIV risk behaviors than face-to-face interviews. In a study of induced abortion in Mexico, Lara et al. (2004) used a random response technique in which the respondent answered yes or no to one of two randomly assigned written questions: “Were you born in April?” or “Did you ever try to interrupt a pregnancy?” Although the reported rate of attempted induced abortion was higher with the random response technique compared with other methods, only one question was asked using this technique, and the rates of reporting successful abortions using other methods (face-to-face interviews, ACASI, and self-administered questionnaires) in subsequent questions differed by interview location. In a study conducted in rural Malawi, Mensch et al. (2008) found inconsistent response effects by interview method. While reports of multiple lifetime sexual partners and sex with a friend or acquaintance were higher among respondents who used ACASI, reports of ever having had sex and of having sex with a boyfriend were higher among respondents in face-to-face interviews. Mensch et al. (2008) also found that the association between having a positive biomarker for an STI and reporting risky sexual behavior was stronger in the face-to-face interviews than among respondents who used ACASI. In contrast, Hewett and colleagues (2008), in a study conducted in São Paulo, Brazil, found stronger correlations between risk behaviors and biomarkers for STIs when interviews were conducted with ACASI than face-to-face. Additionally, the STI-positive participants were more likely to underreport risky sexual behavior in the face-to-face interviews than when they used ACASI.

Although some form of self-administration of sensitive questions has great potential, there remain important barriers to successful self-administration in developing countries. Both the computer-assisted and paper and pencil methods place burdens on the respondent that make them less appropriate in populations where levels of educational attainment are low (Gribble et al. 1999). The paper and pencil method requires more than basic literacy, and computer-assisted methods, even when the questions are read to the respondent aloud or on audio, require basic familiarity with a keyboard and number recognition. In many developing country settings, literacy is limited and familiarity with computers outside of large urban areas is rare, which reduces the effectiveness of the paper and pencil and computer-assisted modes of self-administration (Cleland et al. 2005; van de Wijgert et al. 2000). For example, Mensch, Hewett and Erulkar (2003:266), in an experimental study of the relative effectiveness of ACASI and self-administered paper and pencil questionnaires, found that in certain settings the use of a computer in survey interviews produced anxiety, suspicion, and hostility in the study population. They also found there were technical problems in 20 percent of the interviews, largely due to issues with the keypad (Hewett, Erulkar, and Mensch 2004). The informal confidential voting interviews tested in Zimbabwe did not involve the use of computers, but required respondents to be literate; 8 percent of the respondents in Zimbabwe were not sufficiently literate (Gregson et al. 2002). The ballot method also had slightly higher rates of missing data than the face-to-face interviews, but these did not exceed 4.1 percent. The rate of inconsistent responses was also generally low, but higher than in face-to-face interviews.

Concerns about social desirability bias, invasion of privacy, and risk of disclosure are particularly salient when studying adolescent sexual and reproductive health as youth may conceal their romantic relationships from anyone perceived to be an elder (Bearinger et al. 2007; Haram 2005; Mensch, Hewett and Erulkar 2003; Plummer et al. 2004). In this study we present an alternative methodology that overcomes some of the limitations of self-administered questionnaires and computer-assisted methods in populations with high rates of illiteracy and little familiarity with computers. The non-verbal response card addresses the three dimensions of sensitive questions (social desirability, invasion of privacy, and risk of disclosure) that generate response effects in face-to-face interviewer-administered questionnaires. The non-verbal response card places minimal cognitive demands on the respondent; it is highly portable, can be used with any language, is very inexpensive, and is adaptable to a wide variety of subject matter and response options.

Study Site

The non-verbal response card was developed and tested in the Gilgel Gibe Social and Sexual Relationship History Survey, conducted in 2006. The survey collected information on the formation of romantic relationships and the transition into sexual activity for adolescents and young adults, ages 13-24 years. The sample for the survey was drawn from the Gilgel Gibe Demographic Surveillance System (DSS), which incorporates rural communities and small urban centers in the immediate areas surrounding the Gilgel Gibe dam, Jimma Zone. The area is approximately six hours driving time to the southwest of the capital city, Addis Ababa, and has a population of approximately 45,000. The Gilgel Gibe Social and Sexual Relationship History Survey randomly sampled 1,300 youth from the approximately 8,900 households in the Gilgel Gibe DSS.

The study population is predominantly Muslim and ethnically Oromo. The Oromos are the largest single ethnic group in Ethiopia and constitute approximately 40 percent of the national population. The median age at marriage in the Oromiya region is 18.7 years for women ages 20-24, and 24.4 years for men ages 25-59 (Central Statistical Agency and ORC Macro 2006:83). Premarital sexual intercourse is common among partners who are engaged to be married in Ethiopia, and generally occurs less than one year before marriage (Lindstrom, Kiros, and Hogan 2009). In the Oromiya region, reported recent sexual intercourse among never married youth ages 15-24 in the 2005 Ethiopia DHS is low. In that survey 3.7 percent of never married women ages 15-49 reported having sexual intercourse in the last 12 months, as did 9.4 percent of never married men ages 15-49. Reports of extra-marital sexual activity are even less common. Less than one percent (0.5%) of women in a union ages 15-49 reported sexual intercourse with a non-marital or non-cohabiting partner in the last 12 months, as did only 0.8 percent of men in unions (Central Statistical Agency [Ethiopia] and ORC Macro 2006:86-87,192-193).

The adolescent and young adult respondents in the Gilgel Gibe Social and Sexual Relationship History Survey were interviewed at home. Female interviewers were used with female respondents and male interviewers with male respondents. The questionnaire collected information on contact with health services, food insecurity, aspirations, attitudes regarding gender relations, HIV knowledge, and information about the last four romantic relationships, including information on the background characteristics of each partner and the nature of intimate physical and sexual contact between the partners.1 Respondents were also asked about the conditions under which first sexual intercourse occurred, knowledge and use of condoms, perceptions of HIV risk, and attitudes regarding the appropriateness of premarital sex. Sensitive questions regarding sexual behavior and knowledge were asked at the end of the interview.

Non-Verbal Response Method

A major concern of the investigators in launching this study was that sensitive questions about sexual behaviors would be subject to considerable response bias in this largely rural, Muslim population. To address the issue of response bias, the authors developed an innovative response method called the non-verbal response card. This new method uses a response card that allows the respondent to non-verbally and confidentially communicate responses to questions read by the interviewer.

The response card is an 8½ × 11 inch laminated sheet of heavy stock paper with a respondent side and an interviewer side. Each side is divided into 35 cells (5 rows and 7 columns) with a small hole punched through the center of each cell. On the respondent side of the card, the cells contain written and color coded responses (see Figure 1). The numeric responses range from 0 to 25 (for the number of sexual partners and age at first sex), and the non-numeric responses are Yes, No, and does not apply. The numeric responses are indicated by both a written number and vertical bars (for example, || for 2, and ||||| ||||| for 10). The non-numeric responses are written in the two local languages and are color coded, green for Yes, red for No and blue for does not apply. Each cell on the interviewer side of the card contains a unique three-digit number. The number of cells and response options provided on the card are survey specific, and can vary across questionnaires or question sets within questionnaires, permitting the use of the card for a variety of topics and study populations.

Figure 1. Non-Verbal Response Cards

The card is held by the respondent with the respondent side visible only to the respondent and the interviewer side visible only to the interviewer. The respondent indicates his/her response to a question by inserting the point of a stick that is provided through the hole in the appropriate response cell. The interviewer records the three-digit number in the cell on the interviewer side of the card through which the point of the stick is protruding. To ensure that the interviewer does not recognize a response based on the position of the response cell, a total of 10 response cards were prepared, in which the order of the responses on each card varies (but the response set remains the same), and the three-digit number assigned to each response is different. There are also multiple Yes, No, and does not apply response cells on each card so that the respondent is not repeatedly using the same cell for Yes or No on any single card. The three-digit numbers are randomly assigned to the 35 possible responses with a total of 10 unique numbers (corresponding to each of the 10 cards) assigned to each response. The three-digit numeric codes are recoded to their corresponding response after the data have been entered into computer readable data files.
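The card-and-code scheme described above can be sketched in code. The following is a simplified illustration only, not the authors' actual production procedure: the response labels match the card design in the text, but the random-seed, function names, and the rule of giving every cell its own unique code are assumptions of this sketch.

```python
import random

# One card's 35 cells (5 x 7 grid): numeric responses 0-25 plus
# repeated Yes / No / "does not apply" cells, as described in the text.
RESPONSES = [str(i) for i in range(26)] + ["Yes", "No", "does not apply"] * 3

def make_cards(n_cards=10, seed=0):
    """Build n_cards card layouts and a lookup table mapping each
    three-digit interviewer-side code back to its respondent-side response."""
    rng = random.Random(seed)
    # Draw unique three-digit codes so no code repeats across cards.
    codes = rng.sample(range(100, 1000), n_cards * len(RESPONSES))
    cards, code_to_response = [], {}
    for _ in range(n_cards):
        layout = RESPONSES[:]
        rng.shuffle(layout)                # cell order varies from card to card
        card = {}
        for cell, resp in enumerate(layout):
            code = codes.pop()
            card[cell] = (resp, code)      # (respondent side, interviewer side)
            code_to_response[code] = resp  # used to recode after data entry
        cards.append(card)
    return cards, code_to_response

cards, lookup = make_cards()
# The interviewer records only the three-digit code; recoding the entered
# data through the lookup table recovers the respondent's answer.
resp, code = cards[3][17]
assert lookup[code] == resp
```

The key property the sketch demonstrates is that the code the interviewer sees carries no positional information about the answer, while the recoding table restores the response exactly after data entry.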

At the start of the sensitive section of the questionnaire, the interviewer presents the respondent with an envelope with the 10 response cards inside. The respondent is instructed to pull out the cards and inspect them while the interviewer explains how to use the cards and how the cards are designed to preserve the confidentiality of the respondent's responses. The interviewer uses a demonstration card that has only two rows of cells with examples of the numeric and non-numeric response cells. The interviewer uses the demonstration card to show the respondent how the card is used, and to remind the respondent throughout the course of the interview that green is for Yes, red is for No, and blue is for does not apply. The respondent is instructed to hold onto any one of the cards and to set the other cards down. At any point during the interview the respondent can change cards if he/she wishes. At the end of the sensitive portion of the interview, the respondent is instructed to place all of the response cards back into the envelope in any order.

Application of the Response Card Method

The survey questionnaire and non-verbal response cards were first pre-tested with 202 randomly selected adolescents in an urban community in the Gilgel Gibe study area. The interviewers received one week of intensive training prior to the pre-test, and they received an additional week of training with the final version of the survey questionnaire and non-verbal response cards prior to beginning the actual survey interviews. The interviewers quickly grasped the concept and use of the cards, and reported that respondents easily understood the response procedures and were comfortable with the cards.

Following the pre-test, the non-verbal response cards were randomly assigned to one-half of the full study sample of 1,300 youth in advance of interviewing. Table 1 presents selected sample characteristics for the respondents who provided verbal responses and for respondents who used the non-verbal response cards. The distributions of sex, age, education, marital status, place of residence, religion, and ethnicity are virtually identical for the two groups. This comparison provides confirmation of the randomization of the response method: the two groups are comparable in size and indistinguishable from one another with respect to key social and demographic characteristics.

Table 1.

Selected Sample Characteristics by Response Method, Gilgel Gibe Social and Sexual Relationship History Survey 2006, Youth Ages 13-24, Southwestern Ethiopia.

Selected respondent characteristics; Verbal response (%); Card response (%)
Female 49.0 49.1
Age 13-16 52.4 51.7
    17-20 30.0 30.2
    21-24 17.5 18.1
No school 35.5 35.2
Some school (1+ years) 64.5 64.8
Never married 76.1 76.1
Married 23.4 23.4
Divorced/separated/widowed 0.5 0.5
Urban 23.9 25.2
Rural 76.1 74.8
Muslim 88.6 87.9
Orthodox Christian 10.3 11.5
Other Christian 1.1 0.6
Oromo 88.2 90.7
Amhara 3.3 2.5
Yem 3.3 3.5
Other ethnicity 5.2 3.3
Number of observations 633 636

Each interviewer conducted interviews using both methods to reduce the potential influence of interviewer effects on differences in reporting generated by the two methods. Interviewers were required to use the non-verbal response cards for the sensitive portion of the questionnaire with the youth who were assigned the non-verbal response card (experimental group), and they were required to use the conventional verbal response method with the other one-half of the sample (control group). The sensitive portion of the survey included 50 questions on sexual behavior, knowledge, and attitudes. Two separate questionnaires were prepared: one for those assigned to the card method and one for those assigned to the verbal method. The questionnaire for use with the cards included instructions to be read by the interviewer on how to use the card for each question. It did not include any skip instructions for the sensitive portion of the questionnaire because the interviewer did not know the respondent's responses to earlier questions. However, respondents were told to point to any of the solid blue squares if the question did not apply. For example, when asked how old they were at the time of first sexual intercourse, respondents were told “If you have never had sexual intercourse point to any of the blue squares.” The questionnaire used with the verbal responses included skip patterns for questions that were not applicable based on earlier responses. In all other respects, the two questionnaires were identical.

Invalid responses (a numeric response for a yes/no question or a yes/no response for a numeric question) ranged from approximately one to three percent of responses for respondents who used the card method compared to less than one percent of verbal respondents. There was a slight tendency among respondents who used the cards to use the blue squares (does not apply) to respond no.2

Results

Due to the low levels of reported sexual behavior in this population, many of the sensitive questions, such as the conditions of first sexual intercourse or condom use, were not applicable for most respondents. Of the 50 questions for which the card method was used, a total of 12 applied to all respondents and addressed sensitive topics regarding non-marital sexual behavior, condom knowledge, and sexual attitudes.3 Based on the social stigma attached to risky sex, the widespread social marketing of condoms, and recent exposure to more permissive models of courtship and sexual relationships in the study area, we expect non-marital sex and the acceptance of casual sex to be underreported, and condom knowledge and acceptance of pre-marital sex in committed relationships to be overreported by respondents using the verbal response method.

Table 2 presents the percentage of youth who reported a non-marital sexual partner in the last 12 months, by response method, stratified by sex, education, place of residence, and marital status. We used the question “Including your current relationship, with how many men [women] have you had sexual intercourse in the last 12 months?” to define a non-marital sexual partner. Never married respondents who reported having sexual intercourse with one or more partners in the last 12 months, and ever-married respondents who reported having sexual intercourse with more than one partner in the last 12 months, are treated as reporting a non-marital sexual partner.4 Table 2 also presents the percentage of youth reporting they were ever at risk of contracting HIV in the last 12 months, the percentage reporting two or more lifetime sexual partners, and among never married youth the percentage reporting ever had sexual intercourse.
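The classification rule just described can be written out explicitly. This is a sketch for clarity only; the function and field names are invented, not taken from the study's data files:

```python
def has_nonmarital_partner(ever_married: bool, partners_last_12mo: int) -> bool:
    """Classification rule from the text: never-married respondents with
    one or more sexual partners in the last 12 months, and ever-married
    respondents with more than one partner, are counted as reporting a
    non-marital sexual partner."""
    threshold = 2 if ever_married else 1
    return partners_last_12mo >= threshold

# A never-married respondent with one partner counts; a married
# respondent with only one partner (presumably the spouse) does not.
assert has_nonmarital_partner(False, 1) is True
assert has_nonmarital_partner(True, 1) is False
assert has_nonmarital_partner(True, 2) is True
```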

Table 2.

Reported Sexual Behavior and Perception of HIV Risk by Response Method and Selected Characteristics, Youth Ages 13-24, Gilgel Gibe Social and Sexual Relationship History Survey 2006, Southwest Ethiopia.

Percent of youth reporting
Non-marital sexual partner in last 12 monthsa; Ever at risk of contracting HIV in the last 12 months; Two or more lifetime sexual partners; Ever had sexual intercourse (never married youth); n (n = never married youth)
Total Sample
    Verbal response 2.8 0.2 1.4 3.7 633 (482)
    Card response 5.8 ** 3.8 *** 3.2 ** 6.9 ** 636 (484)
    Card(%)/Verbal(%) (2.07) (19.00) (2.29) (1.86)

Females
    Verbal response 2.6 0.3 2.3 3.8 310 (184)
    Card response 5.2 * 3.6 *** 2.3 6.8 305 (195)
    Card(%)/Verbal(%) (2.00) (12.00) (1.00) (1.79)
Males
    Verbal response 3.1 0.0 0.6 3.7 323 (298)
    Card response 6.2 * 4.1 *** 4.0 *** 7.0 * 321 (289)
    Card(%)/Verbal(%) (2.00) - (6.67) (1.89)
No school
    Verbal response 2.2 0.4 1.8 3.7 225 (107)
    Card response 5.0 4.6 *** 5.0 * 5.8 218 (106)
    Card(%)/Verbal(%) (2.27) (11.50) (2.78) (1.57)
Some school (1+ years)
    Verbal response 3.2 0.0 1.2 3.7 408 (375)
    Card response 6.1 ** 3.5 *** 2.2 7.2 ** 408 (378)
    Card(%)/Verbal(%) (1.91) - (1.83) (1.95)
Rural
    Verbal response 2.1 0.2 0.8 2.8 482 (361)
    Card response 5.8 *** 4.5 *** 3.6 *** 5.7 * 469 (356)
    Card(%)/Verbal(%) (2.76) (22.50) (4.50) (2.04)
Urban
    Verbal response 5.3 0.0 3.3 6.7 151 (121)
    Card response 5.7 1.9 * 1.9 10.3 157 (128)
    Card(%)/Verbal(%) (1.08) - (0.58) (1.54)
Never married
    Verbal response 3.3 0.0 0.4 482
    Card response 6.4 ** 3.3 ** 2.1 ** 481
    Card(%)/Verbal(%) (1.94) - (5.25)
Ever married
    Verbal response 1.3 0.7 4.6 151
    Card response 3.4 5.5 ** 6.8 145
    Card(%)/Verbal(%) (2.62) (7.86) (1.48)
a Respondents who are never married and report one or more sexual partners in the last 12 months, and ever married respondents who report two or more sexual partners in the last 12 months.

Note: *** p<0.01; ** p<0.05; * p<0.10. Significance levels for difference of proportions.

The overall prevalence of non-marital sex in the study population is low: 2.8 percent of respondents who gave a verbal response reported a non-marital sexual partner in the last 12 months, and 3.7 percent of never married youth who gave a verbal response reported ever having sexual intercourse. These low reported levels of non-marital sexual activity are consistent with the reports in the 2005 Ethiopia DHS described above. However, the reported levels of non-marital sexual experience in the sample are approximately twice as high among respondents who used the non-verbal response cards. The effect of the response method on the willingness of respondents to report that they were at risk of contracting HIV in the last twelve months is especially striking. Virtually no verbal respondents admitted to being at risk of contracting HIV compared to 3.8 percent of respondents who used the card method.
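The significance stars in Table 2 are based on tests for a difference of proportions between the verbal and card groups. As an illustrative check, the sketch below applies a standard two-sample z-test (pooled standard error) to the rounded percentages and sample sizes published in the table; this is an assumption-laden reconstruction, not the authors' exact computation, and because the published proportions are rounded the resulting p-value can land on a different side of a significance threshold than the table's stars.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-sample z-test for a difference of proportions, pooled SE."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Rounded Table 2 values: non-marital sexual partner in the last 12 months,
# verbal (2.8%, n=633) versus card (5.8%, n=636).
z, p = two_prop_z(0.028, 633, 0.058, 636)
print(round(z, 2), round(p, 4))
```

The card-versus-verbal difference is comfortably significant at conventional levels, consistent with the ** marker in the table.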

Although we might expect the response effect to be greatest among those subgroups for whom the reporting of non-marital sexual relations is most stigmatized, such as women, rural inhabitants, and married respondents, no such pattern emerges. On the contrary, the reported levels of sexual experience and HIV risk are higher among respondents who used the card method compared to the verbal method in 28 of the 30 subgroup comparisons made in Table 2, with 20 of these differences statistically significant. The response effect is weakest among urban respondents, which is consistent with the perception of urban populations as being more accepting of non-marital sexual activity than rural populations.

Table 3 presents the percentage of respondents who reported knowing where to obtain condoms and accepting premarital sex, by response method, stratified by sex, education, place of residence, marital status, and recent contact with health services. In contrast to non-marital sexual activity, which we expected to be underreported in the verbal responses, we expected knowledge of condoms and acceptance of premarital sex in committed relationships to be overreported in the verbal responses. Public health campaigns promoted by the government and non-governmental organizations in the study area have emphasized the importance of safe sex practices and condom use. In this context, we expect youth to overreport knowing where to obtain condoms because some may perceive a lack of knowledge as a sign of ignorance or backwardness. The results in Table 3 support this expectation and consistently show that youth in the sample overreported knowing where condoms can be obtained, and knowing a place where they would feel comfortable obtaining condoms, when they gave verbal responses. In the total sample, the percentage of youth who reported knowing where they could obtain a condom was 22 percent lower among youth who used the card method than among those who used the verbal response method. Youth who used the card method were also less likely to report that premarital sex is acceptable for young men and women who are going steady or engaged to be married.

Table 3.

Reported Knowledge of Access to Condoms and Acceptance of Premarital Sex by Response Method and Selected Characteristics, Youth Ages 13-24, Gilgel Gibe Social and Sexual Relationship History Survey 2006, Southwest Ethiopia.

Percent of youth reporting
Know where to obtain condoms Know a comfortable place to obtain condoms Is acceptable for a young woman to have sexual intercourse when she is: Is acceptable for a young man to have sexual intercourse when he is:

Casually sexually attracted Going steady Engaged Casually sexually attracted Going steady Engaged
Total Sample
    Verbal response 43.1 35.2 15.0 36.8 49.6 23.1 40.8 54.3
    Card response 33.7 *** 27.0 *** 19.7 ** 31.9 * 41.7 *** 26.2 34.4 ** 46.0 ***
    Card(%)/Verbal(%) (0.78) (0.77) (1.31) (0.87) (0.84) (1.13) (0.84) (0.84)

Females
    Verbal response 25.8 16.8 5.2 25.2 43.2 14.2 31.3 49.7
    Card response 21.7 17.7 8.1 25.0 41.1 19.0 30.0 45.5
    Card(%)/Verbal(%) (0.84) (1.05) (1.56) (0.99) (0.95) (1.34) (0.96) (0.92)
Males
    Verbal response 59.8 52.9 24.5 48.0 55.7 31.6 49.8 58.8
    Card response 45.3 *** 35.9 *** 30.9 * 38.5 ** 42.2 *** 33.1 38.8 *** 46.6 ***
    Card(%)/Verbal(%) (0.76) (0.68) (1.26) (0.80) (0.76) (1.05) (0.78) (0.79)
No school
    Verbal response 18.2 11.6 7.1 28.0 46.7 12.9 30.7 47.1
    Card response 17.2 12.7 11.3 20.9 * 42.1 16.8 24.7 44.5
    Card(%)/Verbal(%) (0.95) (1.09) (1.59) (0.75) (0.90) (1.30) (0.80) (0.94)
Some school (1+ years)
    Verbal response 56.9 48.3 19.4 41.7 48.8 28.7 46.3 58.3
    Card response 42.6 *** 34.7 *** 24.2 * 37.8 58.6 *** 31.2 39.7 * 46.8 ***
    Card(%)/Verbal(%) (0.75) (0.72) (1.25) (0.91) (1.20) (1.09) (0.86) (0.80)
Rural
    Verbal response 34.0 27.2 14.1 33.0 49.2 20.1 35.9 51.2
    Card response 25.8 *** 19.1 *** 18.3 * 29.4 40.3 *** 21.9 28.8 ** 42.6 ***
    Card(%)/Verbal(%) (0.76) (0.70) (1.30) (0.89) (0.82) (1.09) (0.80) (0.83)
Urban
    Verbal response 72.2 60.9 17.9 49.0 51.0 32.5 56.3 64.2
    Card response 56.9 *** 50.3 * 23.8 39.4 * 45.6 38.8 50.9 56.3
    Card(%)/Verbal(%) (0.79) (0.83) (1.33) (0.80) (0.89) (1.18) (0.90) (0.88)
Never married
    Verbal response 50.6 41.7 17.8 27.8 50.4 27.4 42.9 56.8
    Card response 36.7 *** 29.7 *** 22.4 * 21.8 40.1 *** 28.8 36.5 ** 44.7 ***
    Card(%)/Verbal(%) (0.73) (0.72) (1.26) (0.78) (0.80) (1.05) (0.85) (0.79)
Ever married
    Verbal response 24.2 14.6 6.0 39.6 47.0 9.3 33.8 46.4
    Card response 19.2 18.1 10.8 35.0 46.6 17.7 ** 27.9 50.3
    Card(%)/Verbal(%) (0.79) (1.24) (1.80) (0.88) (0.99) (1.90) (0.83) (1.08)
No recent contact with health services
    Verbal response 28.0 20.6
    Card response 18.9 *** 12.7 **
    Card(%)/Verbal(%) (0.68) (0.62)
Recent contact with health services
    Verbal response 51.1 42.9
    Card response 42.1 *** 35.1 **
    Card(%)/Verbal(%) (0.82) (0.82)

Note: *** p<0.01; ** p<0.05; * p<0.10. Significance levels for difference of proportions.

The response effect for condom knowledge varies across subgroups and is weakest among females and youth with no schooling. Critically important for sexual health interventions targeted at high-risk groups, the largest differences in response patterns are found among single respondents and respondents who report no recent contact with health services. Close to 51 percent of single respondents who used the verbal response method reported knowing where to obtain a condom, compared with only 37 percent of respondents in the same group who used the non-verbal response card. Similarly, among youth who have not had any recent contact with health services, 28 percent who used the verbal response method reported knowing where to obtain a condom, compared with only 19 percent who used the card method. This difference in reported condom knowledge suggests that conventional survey estimates may substantially overestimate condom knowledge, and most likely condom use as well, among the subgroups that are often the target of outreach programs.

On the other hand, the reports of condom knowledge, according to whether the respondent had recent contact with health services, suggest that the conventional verbal response method may underestimate the potential impact of contact with the health service sector on condom knowledge. Based on the verbal responses, youth who have had recent contact with health services are 1.8 (51.1/28.0) times more likely to report knowing where to obtain a condom than youth who have not had recent contact with health services. However, based on the non-verbal response cards, youth who have had recent contact with health services are 2.2 (42.1/18.9) times more likely to report knowing where to obtain a condom. The use of the non-verbal response card thus suggests that contact with health services is about 22 percent more effective (2.2/1.8=1.22) in providing youth in the study area with knowledge of where they can get a condom than is suggested by the verbal responses.
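The relative-impact arithmetic above can be reproduced directly from the Table 3 percentages; this is a sketch of the ratio-of-ratios calculation described in the text, using only the published figures:

```python
# "Know where to obtain condoms" percentages from Table 3,
# split by recent contact with health services.
verbal_contact, verbal_no_contact = 51.1, 28.0
card_contact, card_no_contact = 42.1, 18.9

ratio_verbal = verbal_contact / verbal_no_contact  # relative likelihood, verbal mode (~1.8)
ratio_card = card_contact / card_no_contact        # relative likelihood, card mode (~2.2)
relative = ratio_card / ratio_verbal               # apparent extra effectiveness (~1.22)

print(round(ratio_verbal, 1), round(ratio_card, 1), round(relative, 2))
```

The card-based estimate implies a roughly 22 percent larger association between health-service contact and condom knowledge than the verbal-based estimate.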

Discussion

The primary purpose of the Gilgel Gibe Social and Sexual Relationship History Survey was to provide a baseline assessment of the prevalence of potentially high-risk sexual behaviors. Because of the relative religious conservatism of the study population and the need to interview youth in their homes, we anticipated problems of response bias to questions on sexual behavior and knowledge. We developed the non-verbal response cards to provide a more private and confidential method for youth to respond to the sensitive questions, and we tested the effectiveness of the cards using a randomized controlled trial design in which one-half of the sample used the response cards and the other half provided verbal responses. Our findings from the Gilgel Gibe survey indicate that youth are more likely to report stigmatized behaviors and more likely to admit to a lack of sexual knowledge when they use the non-verbal response cards than when they provide verbal responses. While the prevalence of premarital and extramarital sexual behavior is still low in the study population, the non-verbal response card method produces estimates of these behaviors that are around twice as high as the estimates provided by the conventional verbal response method. We also found that estimates of the percentage of youth who knew where to obtain condoms were approximately 22 percent lower among youth who used the more private and confidential card method than among those who gave verbal responses. Most critically for public health programs, the over-reporting of condom knowledge was greatest among single youth and youth who had no recent contact with the formal health sector.

Despite its strengths, the non-verbal response card method is not without weaknesses. Because the interviewers do not know the respondents' answers to the sensitive questions, skip patterns cannot be built into the questionnaire. To address this problem, blue squares were included on the response card to indicate "does not apply." However, use of the "does not apply" square places a greater burden on the respondent than would the verbal response method with an interviewer-directed skip pattern. The pre-test version of the non-verbal response card included ordinal response categories, but respondents had difficulty using the card accurately for this purpose. Future research using the card method should explore the viability of incorporating ordinal and nominal response categories, perhaps with dedicated cards and interviewer guide cards. The non-verbal response card also introduced interviewer error when the 3-digit code recorded by the interviewer was invalid, and respondent error when the type of response (yes/no or numeric) was inconsistent with the question asked. A subsequent version of the card tested in the field uses a larger font for the 3-digit codes and separates the card into a yes/no panel and a numeric response panel to reduce these types of error.
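The two error types just described, an invalid recorded code and a response type that does not match the question, can be screened for mechanically during data entry. The sketch below is hypothetical: the survey's actual code assignments are not published, so the code book here is purely illustrative of the approach, mapping each valid 3-digit code to a response type and flagging records that fail either check.

```python
# Hypothetical code book: the actual survey's 3-digit code assignments
# are not published; these entries are illustrative only.
VALID_CODES = {
    "101": "yes_no",   # e.g., a square meaning "yes"
    "102": "yes_no",   # e.g., a square meaning "no"
    "103": "yes_no",   # e.g., the blue "does not apply" square
    "201": "numeric",  # e.g., a numeric-response square
}

def check_response(code, expected_type):
    """Return a list of problems with one recorded 3-digit code."""
    problems = []
    if code not in VALID_CODES:
        # Code was mistranscribed by the interviewer.
        problems.append("invalid code (interviewer error)")
    elif VALID_CODES[code] != expected_type:
        # Respondent pointed to the wrong kind of square.
        problems.append("response type mismatch (respondent error)")
    return problems

print(check_response("999", "yes_no"))   # unknown code
print(check_response("201", "yes_no"))   # numeric answer to a yes/no question
print(check_response("101", "yes_no"))   # clean record
```

A larger font and separate yes/no and numeric panels, as in the revised card, attack the same two error sources at the point of recording rather than at data entry.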

This study has broader implications for how researchers should solicit responses to sensitive questions in general, as well as questions that address types of knowledge and attitudes associated with less traditional or more modern lifestyles. While qualitative research methods have sometimes been used to address concerns about the validity of survey responses to sensitive questions and to better understand the dynamics of risky sexual behaviors and condom use (Nzioka 2004; Obermeyer 2005; Marston and King 2006), these methods are costly and are not appropriate for generalizing to entire populations. The non-verbal response card dramatically improves the reporting of risky sexual behaviors in situations in which such behaviors are sensitive and subject to response bias. Because the interviewer does not know the interviewee's response, the social desirability motive for misreporting is greatly reduced and is limited to those respondents who, regardless of the mode of question administration or response, do not believe their responses are confidential. The awkwardness created by intrusive questions is reduced because the respondent does not provide a verbal response. The response card also reduces respondent concerns about the risk of disclosure during the course of the interview: not only does the interviewer not know the interviewee's response, neither does anyone else within listening range. This feature of the card is especially important in interviews in crowded settings where privacy is difficult to achieve, and with youth, women, and girls in low-income settings where cultural norms prohibit young people, particularly girls, from being completely alone with strangers. The method is inexpensive, easily implemented among poor rural populations in which illiteracy is high, and adaptable to a variety of survey instruments. The non-verbal response card method is particularly useful in settings in which computer-assisted methodologies are impractical or not feasible.

Acknowledgements

This research was made possible with support from the David and Lucile Packard Foundation, the Mellon Foundation, the Compton Foundation, and NIH grant R03 AI078156. We thank Jimma University for access to the sampling frame, the field teams of the Gilgel Gibe Demographic Surveillance System and the Jimma Longitudinal Family Survey of Youth (JLFSY), and JLFSY project members Abebe Gebremariam, Challi Jira, and Kifle Woldemichael.

Footnotes

1. Romantic relationships were defined by the interviewers as “a relationship that lasted for at least one month in which you were boyfriend and girlfriend or husband and wife. A romantic relationship may have involved sexual relations or it may have involved nothing more than holding hands.”

2. In a series of seven yes/no questions regarding the conditions of first sexual intercourse, the percentage of “does not apply” responses (blue squares) among respondents who had responded yes to a prior question on ever having had sexual intercourse ranged from 8-10 percent of the card responses, compared with around 3 percent of the verbal responses. Among respondents who had responded no to the question on ever having had sexual intercourse, 2-3 percent of the card responses to the questions on the conditions of first sex were yes/no rather than “does not apply” (blue square).

3. The questions asked of all respondents include: (1) Have you ever had sexual intercourse?; (2) Including your current relationship, with how many men [women] have you ever had sexual intercourse?; (3) Including your current relationship, with how many men [women] have you had sexual intercourse in the last 12 months?; (4) In the last 12 months do you think you were ever at risk of contracting HIV?; (5) Do you know of a place where you could obtain condoms if you needed to?; (6) Do you know of a place where you would feel comfortable obtaining condoms?; (7) Is it acceptable for a young woman to have sexual intercourse when she is casually sexually attracted to a male?; (8) Is it acceptable for a young woman to have sexual intercourse when she is going steady with a male?; (9) Is it acceptable for a young woman to have sexual intercourse when she is engaged to be married to a male?; (10) Is it acceptable for a young man to have sexual intercourse when he is casually sexually attracted to a female?; (11) Is it acceptable for a young man to have sexual intercourse when he is going steady with a female?; (12) Is it acceptable for a young man to have sexual intercourse when he is engaged to be married to a female?

4. Male respondents in polygamous unions had to report more than two sexual partners in the last 12 months in order to be classified as having a non-marital sexual partner.

References

  1. Axinn W. The influence of interviewer sex on responses to sensitive questions in Nepal. Social Science Research. 1991;20:303–319.
  2. Bearinger LH, Sieving RE, Ferguson J, Sharma V. Global perspectives on the sexual and reproductive health of adolescents: patterns, prevention, and potential. Lancet. 2007;369:1220–1231. doi: 10.1016/S0140-6736(07)60367-5.
  3. Catania JA. A Framework for Conceptualizing Reporting Bias and Its Antecedents in Interviews Assessing Human Sexuality. The Journal of Sex Research. 1999;36(1):25–38.
  4. Catania JA, Gibson DR, Chitwood DD, Coates TJ. Methodological Problems in AIDS Behavioral Research: Influences on Measurement Error and Participation Bias in Studies of Sexual Behavior. Psychological Bulletin. 1990;108(3):339–362. doi: 10.1037/0033-2909.108.3.339.
  5. Central Statistical Agency [Ethiopia] and ORC Macro. Ethiopia Demographic and Health Survey 2005. Addis Ababa, Ethiopia and Calverton, Maryland, USA: Central Statistical Agency and ORC Macro; 2006.
  6. Cleland J, Boerma JT, Carael M, Weir SS. Monitoring Sexual Behaviour in General Populations: A Synthesis of Lessons of the Past Decade. Sexually Transmitted Infections. 2005;80(Suppl II):ii1–ii7. doi: 10.1136/sti.2004.013151.
  7. Couper M, Stinson LL. Completion of Self-Administered Questionnaires in a Sex Survey. The Journal of Sex Research. 1999;36(4):321–330.
  8. Curtis SL, Sutherland EG. Measuring sexual behaviour in the era of HIV/AIDS: the experience of Demographic and Health Surveys and similar enquiries. Sexually Transmitted Infections. 2004;80(Suppl II):ii22–ii27. doi: 10.1136/sti.2004.011650.
  9. Eggleston E, Leitch J, Jackson J. Consistency of Self-Reports of Sexual Activity Among Young Adolescents in Jamaica. International Family Planning Perspectives. 2000;26:79–83.
  10. Fenton KA, Johnson AM, McManus S, Erens B. Measuring Sexual Behaviour: Methodological Challenges in Survey Research. Sexually Transmitted Infections. 2001;77:84–92. doi: 10.1136/sti.77.2.84.
  11. Gage AJ. An Assessment of the Quality of Data on Age at First Union, First Birth, and First Sexual Intercourse for Phase II of the Demographic and Health Surveys Program. Calverton, MD: Macro International, Inc.; 1995.
  12. Gibson DR, Hudes ES, Donovan D. Estimating and Correcting for Response Bias in Self-Reported HIV Risk Behavior. The Journal of Sex Research. 1999;36:96–101.
  13. Gregson S, Mushati P, White PJ, Mlilo M, Mundandi C, Nyamukapa C. Informal confidential voting interview methods and temporal changes in reported sexual risk behaviour for HIV transmission in sub-Saharan Africa. Sexually Transmitted Infections. 2004;80(Suppl 2):ii36–ii42. doi: 10.1136/sti.2004.012088.
  14. Gregson S, Zhuwau T, Ndlovu J, Nyamukapa CA. Methods to reduce social desirability bias in sex surveys in low-development settings: Experience in Zimbabwe. Sexually Transmitted Diseases. 2002;29(10):568–575. doi: 10.1097/00007435-200210000-00002.
  15. Gribble JN, Miller HG, Rogers SM, Turner CF. Interview Mode and Measurement of Sexual Behaviors: Methodological Issues. The Journal of Sex Research. 1999;36(1):16–24. doi: 10.1080/00224499909551963.
  16. Hewett PC, Erulkar AS, Mensch BS. The feasibility of computer-assisted survey interviewing in Africa: Experience from two rural districts in Kenya. Social Science Computer Review. 2004;22(3):319–334.
  17. Hewett PC, Mensch BS, de A. Ribeiro MCS, Jones HE, Lippman SA, Montgomery MR, van de Wijgert JHHM. Using Sexually Transmitted Infection Biomarkers to Validate Reporting of Sexual Behavior within a Randomized, Experimental Evaluation of Interviewing Methods. American Journal of Epidemiology. 2008;168:202–211. doi: 10.1093/aje/kwn113.
  18. Haram L. ‘Eyes Have No Curtains’: the Moral Economy of Secrecy in Managing Love Affairs Among Adolescents in Northern Tanzania in the Time of AIDS. Africa Today. 2005;51(4):56–73.
  19. Johnson TP, van de Vijver FJR. Social Desirability in Cross-Cultural Research. In: Harkness JA, van de Vijver FJR, Mohler P, editors. Cross-Cultural Survey Methods. Hoboken, NJ: John Wiley & Sons; 2003. pp. 195–204.
  20. Jones EL. The Courtesy Bias in South-East Asian Surveys. In: Bulmer M, Warwick DP, editors. Social Research in Developing Countries. London: UCL Press; 1983.
  21. Jones EL, Forrest JD. Under-reporting of Abortion in Surveys of U.S. Women: 1976–1988. Demography. 1992;29:113–126.
  22. Knodel J, Piampiti S. Response Reliability in a Longitudinal Survey in Thailand. Studies in Family Planning. 1977;8:55–66.
  23. Lara D, Strickler J, Díaz Olavarrieta C, Ellertson C. Measuring induced abortion in Mexico: Comparison of four methodologies. Sociological Methods and Research. 2004;32(4):529–558.
  24. Lindstrom D, Kiros G, Hogan DP. Transitions into First Intercourse, Marriage, and Childbearing among Ethiopian Women. Genus. 2009;LXV(2):45–77.
  25. Macalino GE, Celentano DD, Latkin C, Strathdee SA, Vlahov D. Risk behaviors by audio computer-assisted self-interviews among HIV-seropositive and HIV-seronegative injection drug users. AIDS Education and Prevention. 2002;14(5):367–378. doi: 10.1521/aeap.14.6.367.24075.
  26. Marston C, King E. Factors that Shape Young People's Sexual Behavior: A Systematic Review. Lancet. 2006;368:1581–1586. doi: 10.1016/S0140-6736(06)69662-1.
  27. Mensch BS, Hewett PC, Erulkar AS. The Reporting of Sensitive Behavior by Adolescents: A Methodological Experiment in Kenya. Demography. 2003;40(2):247–268. doi: 10.1353/dem.2003.0017.
  28. Mensch BS, Hewett PC, Gregory R, Helleringer S. Sexual Behavior and STI/HIV Status Among Adolescents in Rural Malawi: An Evaluation of the Effect of Interview Mode on Reporting. Studies in Family Planning. 2008;39(4):321–334. doi: 10.1111/j.1728-4465.2008.00178.x.
  29. Mishra V, Vaessen M, Boerma JT, Arnold F, Way A, Barrere B, Cross A, Hong R, Sangh J. HIV Testing in National Population-Based Surveys: Experience from the Demographic and Health Surveys. Bulletin of the World Health Organization. 2006;84:537–545. doi: 10.2471/blt.05.029520.
  30. Nnko S, Boerma JT, Urassa M, Mwaluko G, Zaba B. Secretive females or swaggering males? An assessment of the quality of sexual partnership reporting in rural Tanzania. Social Science & Medicine. 2004;59(2):299–310. doi: 10.1016/j.socscimed.2003.10.031.
  31. Nyitray AG, Kim J, Hsu C-H, Papenfuss M, Villa L, Lazcano-Ponce E, Giuliano AR. Test-Retest Reliability of a Sexual Behavior Interview for Men Residing in Brazil, Mexico, and the United States. American Journal of Epidemiology. 2009:1–10. doi: 10.1093/aje/kwp225.
  32. Nzioka C. Unwanted Pregnancy and Sexually Transmitted Infection among Young Women in Rural Kenya. Culture, Health & Sexuality. 2004;6:31–44. doi: 10.1080/1369105031000106365.
  33. Obermeyer CM. Reframing Research on Sexual Behavior and HIV. Studies in Family Planning. 2005;36:1–12. doi: 10.1111/j.1728-4465.2005.00037.x.
  34. Puri MC, Busza J. In Forests and Factories: Sexual Behaviour among Young Migrant Workers in Nepal. Culture, Health & Sexuality. 2004;6:145–158.
  35. Plummer, et al. “A Bit More Truthful”: the Validity of Adolescent Sexual Behaviour Data Collected in Rural Northern Tanzania Using Five Methods. Sexually Transmitted Infections. 2004;80(Suppl 2):49–56. doi: 10.1136/sti.2004.011924.
  36. Smith TW. Discrepancies between men and women in reporting number of sexual partners: A summary from four countries. Social Biology. 1992;39:205–211. doi: 10.1080/19485565.1992.9988817.
  37. Strickler JA, Magnani RJ, McCann HG, Brown LF, Rice JC. The Reliability of Reporting of Contraceptive Behavior in DHS Calendar Data: Evidence from Morocco. Studies in Family Planning. 1997;28:44–53.
  38. Tourangeau R, Rasinski K, Jobe JB, Smith TW, Pratt WF. Sources of Error in a Survey on Sexual Behavior. Journal of Official Statistics. 1997;13:341–365.
  39. Tourangeau R, Rips LJ, Rasinski K. The Psychology of Survey Response. Cambridge, U.K.: Cambridge University Press; 2000.
  40. Tourangeau R, Smith TW. Asking Sensitive Questions: The Impact of Data Collection Mode, Question Format, and Question Context. Public Opinion Quarterly. 1996;60:275–304.
  41. Turner CF, Forsyth BH, O'Reilly J, Cooley PC, Smith TK, Rogers SM, Miller HG. Automated Self-interviewing and the Survey Measurement of Sensitive Behaviors. In: Couper MP, Baker RP, Bethlehem J, Clark CZ, Martin J, Nicholls W, O'Reilly JM, editors. Computer-assisted Survey Information Collection. New York: Wiley and Sons, Inc.; 1998.
  42. van de Wijgert J, Padian N, Shiboski S, Turner C. Is Audio Computer-Assisted Self-Interviewing a Feasible Method of Surveying Zimbabwe? International Journal of Epidemiology. 2000;29:885–890. doi: 10.1093/ije/29.5.885.
  43. Weinhardt LS, Forsyth AD, Carey MP, Jaworski BC, Durant LE. Reliability and Validity of Self-Report Measures of HIV-Related Sexual Behavior: Progress Since 1990 and Recommendations for Research and Practice. Archives of Sexual Behavior. 1998;27(2):155–180. doi: 10.1023/a:1018682530519.
  44. Williams L, Sobieszczyk T, Perez AE. Consistency Between Survey and Interview Data Concerning Pregnancy Wantedness in the Philippines. Studies in Family Planning. 2001;32:244–253. doi: 10.1111/j.1728-4465.2001.00244.x.
  45. Zehner RB. Sex Effect in the Interviewing of Young Adults. Sociological Focus. 1970;3:75–84.
