Public Health Reports
2019 May 6;134(1 Suppl):46S–56S. doi: 10.1177/0033354919826566

Asking Survey Questions About Criminal Justice Involvement

Ting Yan,1 David Cantor1
Editors: Juliet Bui, Minh Wendt, Alexis Bakos
PMCID: PMC6505312  PMID: 31059416

Abstract

Criminal justice involvement is a multifaceted construct encompassing various forms of contact with the criminal justice system. It is a sensitive topic both to ask about in surveys and for respondents to answer. This article provides guidance for writing survey questions on criminal justice involvement, starting with a review of potential causes of reporting error and nonresponse error associated with such questions. Questions about criminal justice involvement are subject to errors that are common to any survey (eg, misunderstanding questions, recall bias, telescoping). Responses to these questions are also subject to underreporting because of social desirability concerns. We also address strategies to reduce error for questions pertaining to criminal justice involvement (eg, self-administered data collection, forgiving question wording, indirect methods). We then discuss common design decisions associated with writing survey questions on criminal justice involvement (eg, type and frequency of criminal justice involvement, reference period) and provide examples of questions from current surveys.

Keywords: criminal justice involvement, sensitive questions, social desirability bias


Criminal justice involvement encompasses various forms of contact with the criminal justice system. Involvement in the criminal justice system starts after an individual is arrested and can include pretrial detention, conviction, incarceration, probation, and parole.1 Criminal justice involvement is an important life event that affects not only the persons involved but also others in their lives. Research shows that criminal justice involvement has a negative effect on people’s physical and mental health,2-4 employment and earnings,5 marriage and family life,5,6 and access to housing opportunities,7 among other areas. Parental involvement in the criminal justice system is found to have a detrimental effect on children’s health,8,9 behaviors,7,10,11 income and financial security,12,13 food security,13 receipt of health services,14 and education, including college access and completion.15-17

Given the detrimental effects criminal justice involvement can have on a person’s life, data are needed to systematically examine these effects on the general population. However, collecting such data is challenging, in part because asking questions about criminal justice involvement in general population surveys is often a sensitive issue for respondents. This article provides guidance and advice for writing survey questions on criminal justice involvement, with particular attention to the sensitive nature of the topic.18 Although guidance and advice are available for writing questions about a sensitive topic,19-21 they are not specifically designed for questions about criminal justice involvement. Therefore, we first review potential causes for reporting error and nonresponse error associated with survey questions on criminal justice involvement. Next, we discuss strategies to reduce error for such questions. We conclude by discussing common design decisions associated with writing survey questions on criminal justice involvement and providing examples of questions from current surveys.

Reporting and Nonresponse Error Related to Questions on Criminal Justice Involvement

As with any type of survey question, questions pertaining to criminal justice involvement are subject to error related to reporting and nonresponse. With regard to reporting error, a respondent can misunderstand the survey question, partially or inaccurately recall relevant information, use a cognitive shortcut to come up with a response, or fail to report an answer using one of the response options provided to them.19 For example, the National Survey of Family Growth (NSFG)—a general population survey introduced to sampled members as being about “family life, marriage and divorce, having and raising children, and health and health care”—asks all male respondents, “In the last 12 months, have you spent any time in a jail, prison, or a juvenile detention facility?”22 To respond, a respondent must first interpret what the question means (comprehension), determine what is considered a “jail, prison, or a juvenile detention facility,” and then interpret what it means to “spend any time” in such a place. Some respondents might interpret “spend any time” to mean “spending time as a result of formal sentencing,” whereas others might interpret it as a 1-night stay in a jail before appearing in court.

A respondent must then remember events that qualify under the given definition (recall). One type of recall error is external telescoping,23,24 which occurs when a respondent reports an event as happening within the reference period when it actually took place outside the period of interest. Respondents are more likely to telescope memorable events they can recall, even if the event occurred before the period of interest.25 A second type of recall error occurs when the respondent does not remember the event at all.25 For example, respondents might forget a 1-night stay in a jail that happened in the distant past.26

In addition to these common sources of reporting error, questions on criminal justice involvement are also subject to reporting error because they cover a sensitive topic. Topics can be sensitive because of (1) intrusiveness (criminal justice involvement is not generally discussed with others); (2) possible negative consequences, sometimes even legal concerns, from disclosing criminal justice involvement to another individual; and (3) the social stigma associated with criminal justice involvement.4,27 Because of this social stigma, respondents are prone not to report these events even when they are retrieved from memory, leading to underreporting.18 The tendency to underreport events that invoke social stigma is also known as social desirability bias. For example, on the NSFG, a respondent might recall an instance of criminal justice involvement in the correct reference period but, before reporting, edit his or her answer to “no” to avoid potential judgment.

Evidence of this editing process for criminal justice involvement can be found in studies comparing self-reported data with administrative data that found that arrests,26 delinquent activities that led to police contact,28 committing welfare and unemployment benefit fraud,29 and convictions29 were underreported. In addition, female respondents,27,29 older persons,27,29 persons with higher education,27,30 persons with higher levels of social desirability concerns,27,28,30 and persons with more severe offenses26,27,30 underreport criminal justice involvement more often than male respondents, younger respondents, persons with lower education, persons with lower levels of social desirability concerns, and persons with minor offenses, respectively.

With regard to nonresponse error, several studies show that persons tend not to participate in surveys that have sensitive topics,31 especially surveys that ask about undesirable behaviors and attitudes.32,33 However, including items about criminal justice involvement in general population surveys is unlikely to substantially alter response rates unless (1) the survey is introduced as a survey on criminal justice involvement or (2) respondents know beforehand that items on criminal justice involvement are part of the survey. As mentioned earlier, the NSFG is described as a survey about “family life, marriage and divorce, having and raising children, and health and health care” in the advance letter to sampled households.22 Sampled members do not know beforehand that the survey includes 4 questions on criminal justice involvement (Table 1) asked of men aged 15-44. The final weighted response rate for the 2006-2010 NSFG data collection was similar between men (75%) and women (78%), even though the 4 questions on criminal justice involvement are asked only of male respondents.34 During the 2011-2013 NSFG data collection, the response rate for the male questionnaire was 1 percentage point lower than the response rate for the female questionnaire,35 a difference that was negligible.

Table 1.

Item nonresponse rates to questions on criminal justice involvement from general population surveys, United States

Survey | Question on Criminal Justice Involvement | Years of Data Collection | Item Nonresponse Rate, No. (%)a
National Survey of Family Growth35 | In the last 12 months, have you spent any time in a jail, prison, or a juvenile detention facility? | 2011-2013 | 17/4815 (0.4)
| Have you ever spent time in a jail, prison, or juvenile detention center? | 2011-2013 | 20/4485 (0.4)
| Have you been in jail, prison, or a juvenile detention facility only one time or more than one time? | 2011-2013 | 10/1211 (0.8)
| The last time you were in jail, prison, or juvenile detention, how long were you in? | 2011-2013 | 17/1211 (1.4)
National Survey on Drug Use and Health36 | Not counting minor traffic violations, have you ever been arrested and booked for breaking the law? Being “booked” means that you were taken into custody and processed by the police or by someone connected with the courts, even if you were then released. | 2002-2014 | 2138/722 640 (0.3)
| Were you on probation at any time during the past 12 months? | 2002-2014 | 2254/722 627 (0.3)
| Were you on parole, supervised release, or other conditional release from prison at any time during the past 12 months? | 2002-2014 | 1651/722 627 (0.2)
Survey of Criminal Justice Experience37 | Nowadays, persons are often stopped by the police for many different reasons. Since age 18, have you ever been stopped by the police? | 2013 | 26/3260 (0.8)
| Since age 18, have you ever been arrested, booked, or charged for breaking a law? | 2013 | 42/3260 (1.3)
| Have you been arrested, booked, or charged for breaking a law in the past 12 months? | 2013 | 1/647 (0.2)
| Since age 18, have you ever been convicted of or pled guilty to any charges (other than a minor traffic violation)? | 2013 | 33/3260 (1.0)
| Have you been convicted of or pled guilty to any charges other than a minor traffic violation in the past 12 months? | 2013 | 2/345 (0.6)
| Since age 18, have you ever been under any form of criminal justice supervision, including on probation, in jail, or in prison? | 2013 | 27/3260 (0.8)
| Since age 18, have you ever been on probation? | 2013 | 1/316 (0.3)
| Are you currently on probation? | 2013 | 1/248 (0.4)
| Since age 18, have you ever been in jail? | 2013 | 5/216 (2.3)
| Since age 18, how many times have you been in jail? | 2013 | 3/259 (1.2)
| Since age 18, have you ever been in prison? | 2013 | 4/316 (1.3)
| Prior to age 18, did you ever spend time in a juvenile detention center? | 2013 | 34/3260 (1.0)
National Longitudinal Study of Adolescent to Adult Health38 | (Has/did) your biological mother ever (spent/spend) time in jail or prison? | 2008-2009 (Wave 4) | 49/5144 (1.0)
| Has your biological father ever served time in jail or prison? | 2001-2002 (Wave 3) | 292/4866 (6.0)
| (Has/did) your biological father ever (spent/spend) time in jail or prison? | 2008-2009 (Wave 4) | 294/5114 (5.7)
| Have you ever spent time in a jail, prison, juvenile detention center, or other correctional facility? | 2008-2009 (Wave 4) | 7/1457 (0.5)
| How many times have you been in a jail, prison, juvenile detention center, or other correctional facility? | 2008-2009 (Wave 4) | 4/815 (0.5)

a Item nonresponse rates were calculated by using public-use data. The rates are unweighted. Percentages were calculated as the number of respondents who agreed to participate in the survey but did not provide a substantive answer to a particular question about criminal justice involvement divided by the number of respondents who agreed to participate in the survey and were asked the question about criminal justice involvement.

Nonresponse to specific survey questions (also known as “item nonresponse”) also does not seem to be a problem with questions on criminal justice involvement. We computed item nonresponse rates to questions on criminal justice involvement from 4 surveys (the NSFG,35 National Survey on Drug Use and Health [NSDUH],36 Survey of Criminal Justice Experience,37 and National Longitudinal Study of Adolescent to Adult Health38; Table 1). Item nonresponse rates were 2.3% or less for all but 2 estimates from the National Longitudinal Study of Adolescent to Adult Health (ie, the question on “father in jail/prison” in wave 3 and wave 4). These low rates are within the typical range of missing data for nonsensitive questions.39 However, it may also be that refusing to answer these questions is perceived as an admission of criminal justice involvement.
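The item nonresponse rate defined in the Table 1 footnote can be sketched in a few lines of code. This is a minimal illustration, not the authors’ actual computation; the variable name and the reserved missing-value codes below are hypothetical, although public-use files commonly flag refusals and don’t-knows with codes of this kind.

```python
# Sketch of the unweighted item nonresponse rate from the Table 1
# footnote: respondents asked the item who gave no substantive answer,
# divided by all respondents asked the item. Codes are hypothetical.

REFUSED, DONT_KNOW = 97, 98   # hypothetical reserved codes
NOT_ASKED = None              # skipped by questionnaire routing

def item_nonresponse_rate(responses):
    asked = [r for r in responses if r is not NOT_ASKED]
    nonsubstantive = [r for r in asked if r in (REFUSED, DONT_KNOW)]
    return len(nonsubstantive) / len(asked)

# Toy data: 6 sampled persons, 5 routed to the item, 2 of whom gave
# no substantive answer, for a rate of 0.4.
ever_jail = [1, 0, REFUSED, DONT_KNOW, 1, NOT_ASKED]
rate = item_nonresponse_rate(ever_jail)
```

Respondents never routed to an item are excluded from the denominator, which mirrors how the Table 1 denominators vary by question within the same survey.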

Strategies to Reduce Reporting Error Common to All Survey Questions

As noted previously, questions about criminal justice involvement are subject to errors common to any survey (eg, misunderstanding survey questions, forgetting, telescoping). When crafting survey questions on criminal justice involvement, several principles minimize these sources of error. First, survey questions on criminal justice involvement should use simple language40 that can be understood by all respondents. Questions should avoid jargon, technical terms, and uncommon words. For example, the term criminal justice involvement may have various meanings across the general population. In addition, respondents might interpret jail in multiple ways: some may understand it as a specific type of incarceration facility (eg, jail vs prison), whereas others might interpret it as any form of incarceration. One way to increase comprehension is to provide a definition to the respondent when using a specific term.

Second, question writers should keep the task of answering manageable for the respondent.40 Even if a respondent understands the meaning of criminal justice involvement, recalling relevant events and using those retrieved memories to answer the question may become unmanageable, especially for respondents who must search their memory for events that happened a long time ago or who have had multiple experiences with criminal justice involvement.

One way to aid the recall task is to use examples. Examples make the search of memory more specific to the targeted behavior or situation. For example, if the intent of the question is to include any time the person was incarcerated, regardless of the type of facility (eg, jail, prison, or a juvenile facility), examples can illustrate a stay as the result of formal sentencing, a temporary stay before appearing in court, or being held in a secure facility as a juvenile. Although examples generally make the question easier for the respondent to answer, respondents might focus too much on the examples,41 which can unduly restrict the memory search to just the examples; adding examples also makes the question longer and more complex. The survey designer has to judiciously select examples that cover a variety of situations without substantially increasing the length and complexity of the question. For example, the National Longitudinal Survey of Youth—1997 Cohort (NLSY97) asks respondents if they have ever “been sentenced to spend time in a corrections institution.” The examples provided to respondents for this question include “a jail, prison, or a youth institution like a juvenile hall or reform school or training school.”42 The examples do not substantially increase the length and complexity of the question but are comprehensive enough to cover almost all situations.

Another way to aid recall is to lengthen survey questions by adding memory cues.20 Memory cues can take the form of examples, but they may also provide contexts that promote recall. For example, referencing a particular period of a person’s life (eg, a divorce, losing a job) may trigger memories associated with criminal justice involvement. Although textbooks on questionnaire design recommend writing short questions,21,40 long questions with memory cues have been found to increase the reporting of a targeted behavior.43 Question writers therefore have to balance question length, questionnaire space, and simplicity.

Strategies to Reduce Reporting Error for Sensitive Questions

The sensitivity of questions about criminal justice involvement leads to additional concerns about reporting error. Specifically, respondents tend to underreport criminal justice involvement because of fear of negative repercussions or presenting a negative image. Several survey procedures are used to reduce this type of underreporting.

Self-Administered Modes

A survey can be administered by an interviewer in person or completed by respondents on their own. Research shows that respondents are more likely to report sensitive behaviors when using self-administered modes than when using interviewer-administered modes.18,27,44-48 Self-administration eases respondents’ concerns about privacy and about looking good in front of an interviewer. Self-administered modes also reduce the likelihood that another person will hear or read respondents’ answers, thereby reducing the risk of disclosure. The effectiveness with which self-administration reduces underreporting does not differ across self-administered modes (eg, paper questionnaire, online survey).18 The NSFG follows this strategy by including the 4 questions on criminal justice involvement in the audio computer-assisted self-interviewing component of the survey.49

“Forgiving” Introduction

The survey literature suggests that sensitive questions be set up in a “forgiving” context to reduce social desirability concerns by relaxing the “norm” associated with the topic. For example, Holtgraves et al50 provided a forgiving introduction before asking respondents about vandalism: “Almost everyone has probably committed vandalism at one time or another.” They then compared answers obtained under this introduction with answers obtained without the forgiving introduction. The question with the forgiving introduction elicited 70% more admissions of vandalism than the question without a forgiving introduction. Examples of forgiving introductions can be found in 2 other studies.31,51 Several studies, however, did not find that a forgiving introduction improved reporting of socially undesirable behaviors or attitudes.45,52 The Survey of Criminal Justice Experience uses this strategy when asking about being stopped by the police; the survey includes a forgiving introduction—“Nowadays, people are often stopped by the police for many different reasons”—before asking if respondents have ever been stopped by the police since age 18.37

Question Presupposing the Behavior

Related to the forgiving introduction is the use of “loaded” survey questions; that is, survey questions that presuppose or assume that respondents have engaged in the socially undesirable behavior.20 Research has found that questions presupposing a behavior produce higher reporting of that behavior than questions that do not make such an assumption. For example, Knauper53 reported on a study that asked respondents, “In the past 10 years, how many times did you witness a crime?” This loaded question led to substantially more reports of witnessing crimes than an alternative question that did not carry such a presupposition (eg, “Did you witness a crime in the past 10 years? If yes, how many times?”).53 This strategy is recommended for questions measuring socially undesirable behaviors,20 although no empirical research has examined its usefulness when asking about criminal justice involvement.

Long Questions With Familiar Words

Bradburn and colleagues20 recommend using words or terms that are familiar to respondents to reduce the threat of the survey items. They found that survey respondents generally preferred the word booze to liquor and that survey items using the word booze produced 15% more reports of socially undesirable behavior than did survey items using the word liquor.20 Bradburn and colleagues20 also showed that longer items increased the reporting of socially undesirable behavior. Although this result has not been generalized to other types of sensitive behaviors such as criminal justice involvement, the NSDUH36 and the NLSY9742 use examples to make questions longer (NSDUH: “During the past 12 months, that is, since [DATE], did you stay overnight or longer in any type of juvenile detention center, sometimes called ‘juvie,’ prison, or jail?” NLSY97: “Have you ever been sentenced to spend time in a corrections institution, like a jail, prison, or a youth institution like a juvenile hall or reform school or training school or to perform community service?”).

“Ever” Only or Before “Current”

Bradburn et al20 recommend asking whether respondents have “ever” engaged in a sensitive behavior (eg, Did you ever, even once, take something from a store without paying for it?) before asking whether they currently engage in that behavior (eg, In the past 12 months, did you take something from a store without paying for it?). Their thinking is that if respondents report a behavior for their lifetime, they may be more willing to report it in the recent past than those who did not report a behavior for their lifetime. Bradburn et al did not present empirical evidence supporting this recommendation, but several surveys (eg, the NSDUH36 and the National Intimate Partner and Sexual Violence Survey54) use this recommendation when asking about sensitive behaviors.

High-Frequency List or Open-Ended Format

Closed-ended response categories communicate to the respondent what the investigator believes is the norm for the population, and respondents draw on these perceived norms when formulating their answers. A high-frequency list communicates a higher norm than a low-frequency list. For example, when asked to report the number of sexual partners using either a low-frequency list (0, 1, 2, 3, 4, ≥5) or a high-frequency list (0, 1-4, 5-9, 10-49, 50-99, ≥100), female respondents reported more sexual partners when presented with the high-frequency list than with the low-frequency list.45 An alternative to either list is an open-ended question that does not provide any response categories to the respondent.

Indirect Methods

All previously mentioned strategies aim to encourage the reporting of socially undesirable behaviors or attitudes by reducing the perceived threat or social undesirability of survey items. Another approach is to use indirect methods that maintain respondents’ anonymity.18,55 For instance, Holbrook and Krosnick56 used the forced alternative method, a version of the randomized response technique, to measure voter turnout. Specifically, they asked respondents to get a coin and flip it. Respondents whose coins came up tails were instructed to answer “no” (regardless of their true voting status), whereas respondents whose coins came up heads were asked to answer the survey item, “Did you vote in the elections held on November 5, 2002?” Under the randomized response technique, respondents’ answers do not directly reveal their status with respect to the sensitive behavior of interest, yet the prevalence of the behavior can still be estimated at the aggregate level. Because of this extra protection of individual status, indirect methods increase disclosure of socially undesirable behaviors.55,57 Examples of indirect methods include the randomized response technique (respondents are randomly assigned to answer 1 of 2 questions about a controversial issue)58 and its variations. One such variation is the unrelated question method, in which respondents are randomly assigned to answer the question of interest (eg, voter turnout) or an unrelated question (eg, whether the respondent was born in April).59 Another variation is the forced alternative method described previously.56,60 In the variation called the cross-wise model, respondents are presented with both a sensitive question of interest and an unrelated innocuous question and choose between 2 response options, “my answers are the same for both questions” and “my answers are different for both questions,” without revealing their answer to the sensitive question.61
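To make the aggregate estimation concrete, the sketch below (illustrative, not from the article) shows how prevalence could be recovered under the forced-alternative design described above, assuming a fair coin sends half of respondents to a forced “no”:

```python
# Sketch of aggregate estimation for the forced-"no" design: with a
# fair coin, only the truthful half of respondents can answer "yes",
# so the observed "yes" rate equals 0.5 * prevalence. The counts used
# below are illustrative, not real survey data.

def forced_no_prevalence(yes_count, n, p_truthful=0.5):
    """Estimate prevalence of the sensitive behavior.

    P(yes) = p_truthful * prevalence, because respondents directed to
    the forced alternative always answer "no". No individual answer
    reveals individual status.
    """
    p_yes = yes_count / n
    return min(1.0, p_yes / p_truthful)

# If 300 of 1000 respondents answer "yes", estimated prevalence is 0.6.
estimate = forced_no_prevalence(300, 1000)
```

The price of the anonymity protection is extra variance: only half the sample contributes information about the behavior, so standard errors are larger than for a direct question of the same sample size.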

The item count technique is another indirect method in which respondents are randomly assigned to receive a long list of questions including the sensitive question of interest or the same list without the sensitive question of interest. Respondents are then asked to provide the number of items from the list they endorse.62 In the 2-list method (a variation of the item count technique), all respondents receive 2 mutually exclusive lists—list A and list B; for a random half of the respondents, list A includes the sensitive question of interest, whereas for the other random half, list B includes the sensitive question.63 Lastly, the item sum technique for continuous variables randomly assigns the respondents to receive a long list with the sensitive question or a short list without the sensitive question and asks them to report, for example, a sum across the list.64
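The item count technique lends itself to a simple difference-in-means estimator. The sketch below is illustrative and not from the article; it assumes respondents were randomly assigned to the long and short lists:

```python
# Sketch of the item count (list) estimator: because the two randomly
# assigned groups should endorse the nonsensitive items at the same
# rate on average, the difference in mean endorsed counts estimates
# the prevalence of the sensitive item. Counts are illustrative.
from statistics import mean

def item_count_prevalence(long_list_counts, short_list_counts):
    """Difference-in-means estimator for the sensitive item."""
    return mean(long_list_counts) - mean(short_list_counts)

# Long-list group endorses 2.4 items on average, short-list group 2.2,
# implying an estimated prevalence of about 0.2.
estimate = item_count_prevalence([2, 3, 2, 3, 2], [2, 2, 2, 3, 2])
```

As with the randomized response technique, the estimate is meaningful only in aggregate; an individual count on the long list does not reveal whether that respondent endorsed the sensitive item.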

A major disadvantage common to all these indirect methods is that individual status with regard to the sensitive behavior of interest is not known, making it difficult, if not impossible, to analyze the data at the individual level. There are emerging methods to analyze the data obtained using the item count technique,65 but their application is not straightforward. Furthermore, these methods may not be applicable to other indirect methods.

Designing Questions on Criminal Justice Involvement for General Population Surveys

To apply the aforementioned principles for designing questions on criminal justice involvement, it is first necessary to make decisions on what information is most essential to collect. One such decision is what type of criminal justice involvement is of interest. Questions about criminal justice involvement are asked in several general population surveys (Table 2). The questions cover a range of criminal justice involvement, including committing crimes, being incarcerated, and being paroled. The questions differ in the type of criminal justice involvement. For example, the NSFG49 and NLSY9742 ask about any type of incarceration. The NSDUH36 asks about any type of incarceration among juveniles only. The Survey of Criminal Justice Experience (SCJE37) asks about incarceration in jail and in prison separately.

Table 2.

Reference period of questions on criminal justice involvement in general population surveys, United Statesa

Type of Involvement | Question | Reference Period (Timing) | Survey
Crime committed | Since the last interview on [DATE], have you stolen something from a store or something that did not belong to you worth less than $50? | Since last interview | National Longitudinal Survey of Youth—1997 Cohort (NLSY97)42
Crime committed | In your ENTIRE life, did you…use someone else’s credit card without their permission? | Lifetime | National Epidemiologic Survey on Alcohol and Related Conditions66
Stopped by police | Nowadays, persons are often stopped by the police for many different reasons. Since age 18, have you ever been stopped by the police? | Since age 18 | Survey of Criminal Justice Experience (SCJE)37
Ticketed by police | During this contact, were you ticketed or given a warning? | In past 12 months | Police–Public Contact Survey67
Arrest | Not counting minor traffic violations, have you ever been arrested and booked for breaking the law? Being “booked” means that you were taken into custody and processed by the police or by someone connected with the courts, even if you were then released. | Lifetime | National Survey on Drug Use and Health (NSDUH)36
Arrest for specific crime | In the past 12 months, were you arrested and booked for…? | In past 12 months | NSDUH
Charge | Do you currently have any charges pending against you? | Current | Fragile Families and Child Wellbeing Study (FFCWS)68
Charge (specific type) | What charges do you currently have pending? | Current | FFCWS
Juvenile court | As a result of these charges, did you ever go to juvenile court? | Lifetime | NLSY97
Adult court | As a result of these charges, did you ever go to adult court? | Lifetime | NLSY97
Conviction | Since age 18, have you ever been convicted of or pled guilty to any charges (other than a minor traffic violation)? | Since age 18 | SCJE
Conviction | Have you been convicted of or pled guilty to any charges other than a minor traffic violation in the past 12 months? | In past 12 months | SCJE
Conviction (specific crime) | Have you ever been convicted of a felony? | Lifetime | NLSY97
Probation | Were you on probation at any time during the past 12 months? | In past 12 months | NSDUH
Incarceration | Have you ever spent time in a jail, prison, or juvenile detention center? | Lifetime | National Survey of Family Growth (NSFG)49
Incarceration | In the past 12 months, have you spent any time in a jail, prison, or a juvenile detention facility? | In past 12 months | NSFG
Incarceration | During the past 12 months, that is, since [DATE], did you stay overnight or longer in any type of juvenile detention center, sometimes called “juvie,” prison, or jail?b | In past 12 months | NSDUH
Incarceration | Have you ever been sentenced to spend time in a corrections institution, like a jail, prison, or a youth institution like a juvenile hall or reform school or training school or to perform community service? | Lifetime | NLSY97
Jail | Since age 18, have you ever been in jail? | Since age 18 | SCJE
Prison | Since age 18, have you ever been in prison? | Since age 18 | SCJE
Parole | Were you on parole, supervised release, or other conditional release from prison at any time during the past 12 months? | In past 12 months | NSDUH
Parole | Are you still on parole for that sentence? | Current | NLSY97

a The table showcases that (1) different reference periods are used for different types of involvement and (2) different surveys use different reference periods even for a similar type of involvement. The table is organized by type of involvement, such as committing crimes, being stopped or ticketed by police, being arrested or charged, having to go to court, being convicted, being placed on probation, being incarcerated (including in jail or prison), and being paroled. Reference periods for recalling criminal justice involvement include since last interview, lifetime, in past 12 months, since age 18, and current. The survey items are chosen for illustrative purposes; the table is not intended to be exhaustive of all survey items asked in all surveys.

b This question is asked of respondents aged 12-17.

A second decision is choosing the appropriate reference period for criminal justice involvement. The surveys we examined included questions about lifetime experience (eg, NSDUH, NLSY97), involvement since age 18 (eg, SCJE), involvement in the past year (eg, NSDUH), involvement since the last interview (NLSY97), or current involvement (the Fragile Families and Child Wellbeing Study [FFCWS]; Table 2). Various reference periods can be found within the same survey. For example, NLSY97 asks about lifetime conviction for a felony but about the past 12 months for parole. Schaeffer and Presser69 advise setting the appropriate reference period based on the periodicity of the target events, the salience and regularity of the events, and the analytic goals of the survey. In general, rare and salient events can use longer reference periods, and mundane and more frequent events should use shorter reference periods. Criminal justice involvement is a rare, memorable, and salient event for most persons. According to the 2011-2013 NSFG, nearly 7 in 100 males aged 15-44 reported having spent any time in a jail, prison, or juvenile detention center in the past 12 months.49 Thus, asking about lifetime experience is appropriate for a survey of the general population. However, a shorter reference period (eg, in the past 12 months) might be more appropriate for some subgroups who have more frequent involvement with criminal justice (eg, current inmates or former prisoners). Researchers’ analytic use of the information should also be considered when setting the reference period. If the researcher is interested in studying the effect of current or recent incarceration on health, the reference period should be set to reflect that need.

In many cases, surveys on criminal justice involvement are interested not only in whether there was any involvement but also in how often involvement occurred. The survey designer should decide whether it is better to ask for the number of events or the rate of occurrence during a specified period (eg, once per month, once per week). Questions on the frequency of criminal justice involvement take various forms (Table 3). We suggest asking for counts of criminal justice involvement in surveys of the general population because criminal justice involvement does not occur regularly enough in the general population for a rate to be useful. In the surveys we reviewed, nearly all questions about the frequency of criminal justice involvement asked for a count. A closed-ended question format with a high-frequency list is recommended for this purpose.

Table 3.

Questions on the frequency and duration of criminal justice involvement in general population surveys, United Statesa

Type of Involvement | Question | Type of Information | Reference Period (Timing) | Survey
Crime committed | How many times have you stolen something from a store or something that did not belong to you worth less than $50 since the last interview on [DATE]? | Frequency | Since last interview | National Longitudinal Survey of Youth—1997 Cohort (NLSY97)42
Crime committed | In the past 12 months, how often did you steal something worth more than $50? | Frequency | In past 12 months | National Longitudinal Study of Adolescent to Adult Health38
Arrest | Not counting minor traffic violations, how many times during the past 12 months have you been arrested and booked for breaking a law? | Frequency | In past 12 months | National Survey on Drug Use and Health (NSDUH)36
Charge | How many charges do you currently have pending? | Frequency | Current | Fragile Families and Child Wellbeing Study68
Incarceration | Have you been in jail, prison, or a juvenile detention facility only 1 time or more than one time? | Frequency | Lifetime | National Survey of Family Growth (NSFG)49
Incarceration | The last time you were in jail, prison, or juvenile detention, how long were you in? | Length of time | Most recent | NSFG
Jail | Since age 18, how many times have you been in jail? | Frequency | Since age 18 | Survey of Criminal Justice Experience (SCJE)37
Probation | How long have you been on probation? | Length of time | Current | SCJE
Incarceration | Altogether, about how much time had you ever served in a prison, jail, or a juvenile facility before your confinement in [DATE]? | Length of time | Lifetime | National Former Prisoner Survey70
Incarceration | How long were you in jail, prison, or juvenile detention?/The last time you were in jail, prison, or juvenile detention, how long were you in? | Length of time | Most recent | NSFG
Incarceration | During the past 12 months, how many nights altogether did you stay in any type of juvenile detention center, prison, or jail?b | Length of time | In past 12 months | NSDUH
Jail | Thinking about your most recent time in jail, how long were you in jail? | Length of time | Most recent | SCJE
Community service | Since [DATE], what year did you first complete a sentence to perform community service? | Length of time | Since last interview | NLSY97

a The table illustrates that surveys on criminal justice involvement are interested not only in whether there was any involvement but also in how often involvement occurred. The table is organized by type of involvement, such as committing crimes, being arrested, being charged, being convicted, being placed on probation, being incarcerated (including in jail or prison), being paroled, or being required to perform community service. Types of information on how often criminal justice involvement occurred include frequency and length of time of criminal justice involvement. Reference periods for recalling criminal justice involvement include since last interview, lifetime, in past 12 months, since age 18, and current. The survey items are chosen for illustrative purposes; the table is not intended to be exhaustive of all survey items asked in all surveys.

b This question is asked of respondents aged 12-17.

Several surveys ask about the total time spent incarcerated. Questions on duration of criminal justice involvement ask for the total length of time during a respondent’s lifetime (National Former Prisoner Survey70), the length of time in the past 12 months (NSDUH), the duration of the most recent involvement (NSFG, SCJE), or the duration of current involvement (SCJE). Total length of time during the lifetime or in the past 12 months will require calculations and estimation on the part of the respondent if there is more than 1 spell in the reference period. Duration for the longest spell or the most recent spell is cognitively less demanding because it does not involve mental calculations. In addition, asking about the longest spell or the most recent spell allows respondents to anchor on a specific event, facilitating better recall and more accurate placement of the event in the correct reference period. We encourage researchers to provide anchors of this type when writing questions about frequency or duration of criminal justice involvement. For example, NLSY97 is a longitudinal survey, and respondents are interviewed multiple times. In the later waves of interviews, NLSY97 asks respondents to report what happened to them since the last interview. Using the last interview as an anchoring point for respondents has the potential to reduce external telescoping and encourage better recall.

Researchers studying criminal justice involvement may also want to ask about the type of crime or offense that led to the involvement. Some surveys do not ask for the type of crime or offense involved (eg, NSFG). Other surveys ask only for a broad category of crime or offense; NLSY97 has 1 question asking if respondents have ever been convicted of a felony. Still other surveys (eg, FFCWS, NSDUH) ask about the specific crimes respondents were charged with, convicted of, or committed. NSDUH asks if respondents have been arrested or booked for 17 types of crime, ranging from vandalism and driving under the influence of alcohol or other drugs to arson and murder. The specificity of crime type is driven by the analytic goals of the researchers and by practical concerns, such as questionnaire length. Respondents may not be able to map their knowledge about the crime to formal categories, such as those shown in NSDUH. Differences may also exist between what respondents were arrested for and what they were convicted of. To the extent possible, categories on the questionnaire should be developed around the analytic goals.

After the questions are designed, researchers are encouraged to pretest and evaluate their survey questions before taking them into the field. Pretesting could involve requesting feedback from potential respondents, having experts in criminal justice involvement and survey methodology review the questions, and administering the questions to a small number of eligible respondents.71 One common mistake of survey designers is not allowing enough time to pretest and revise the survey based on the feedback. It is best to build pretesting into the project schedule to ensure that results can be incorporated into the final survey questions.

Conclusion

As shown in the surveys we reviewed, it is possible to collect data on criminal justice involvement in general population surveys. Little evidence suggests that including questions on criminal justice involvement in surveys affects response rates34,35 or results in abnormally high item nonresponse (Table 1). Survey designers should be aware that these questions need to follow the same design principles as any other survey question, making the respondent’s cognitive tasks as simple as possible by facilitating comprehension and recall of the target events. In addition, researchers should design the procedures and questions to account for the sensitive nature of the topic, which may involve using a self-administered questionnaire (or module), writing the questions to promote disclosure (eg, using forgiving wording), or using indirect questioning methods.
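One of the indirect questioning methods discussed in this literature is Warner’s randomized response technique (reference 58), in which a private randomizing device directs each respondent to answer either the sensitive question or its negation, so no individual answer is revealing, yet prevalence can still be estimated in aggregate. The following is a minimal sketch of that estimator; the design probability (p = 0.7) and observed “yes” rate are hypothetical values for illustration, not figures from any survey cited here.

```python
def warner_estimate(yes_proportion, p):
    """Estimate the true prevalence (pi) of a sensitive attribute from
    the observed proportion of 'yes' answers under Warner's randomized
    response design. With probability p the respondent answers the
    sensitive question; with probability 1 - p, its negation.

    Expected 'yes' rate: lambda = p*pi + (1 - p)*(1 - pi),
    so pi = (lambda - (1 - p)) / (2p - 1), undefined at p = 0.5.
    """
    if p == 0.5:
        raise ValueError("p = 0.5 makes the design non-identifiable")
    return (yes_proportion - (1 - p)) / (2 * p - 1)


# Hypothetical example: if the true prevalence is 10% and p = 0.7,
# the expected 'yes' rate is 0.7*0.10 + 0.3*0.90 = 0.34, and the
# estimator recovers a prevalence of about 0.10.
estimate = warner_estimate(0.34, 0.7)
```

A design probability close to 0.5 gives respondents more privacy but makes the estimate noisier, which is one reason validation studies of randomized response (eg, reference 55) weigh privacy protection against statistical efficiency.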

Footnotes

Declaration of Conflicting Interests: The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This article was commissioned by the National Academies of Sciences, Engineering, and Medicine Standing Committee on Improving Collection of Indicators of Criminal Justice System Involvement in Population Health Data Programs on behalf of the US Department of Health and Human Services (HHS). Opinions and statements included in this article are not necessarily adopted, endorsed, or verified as accurate by the National Academies of Sciences, Engineering, and Medicine or any other organization or agency that provided support for the project. Support for the Standing Committee was provided by HHS through an interagency agreement with the National Science Foundation (No. SES-1024012).

References

  • 1. Wildeman C. What do we want to measure when we measure criminal justice contact? Some thoughts. Paper presented at: Improving Collection of Indicators of Criminal Justice System Involvement in Public Health Data Programs: A Workshop; March 29-30, 2016; Washington, DC. [Google Scholar]
  • 2. Massoglia M. Incarceration as exposure: the prison, infectious disease, and other stress-related illnesses. J Health Soc Behav. 2008;49(1):56–71. doi:10.1177/002214650804900105 [DOI] [PubMed] [Google Scholar]
  • 3. Massoglia M. Incarceration, health, and racial disparities in health. Law Soc Rev. 2008;42(2):275–306. doi:10.1111/j.1540-5893.2008.00342.x [Google Scholar]
  • 4. Schnittker J, John A. Enduring stigma: the long-term effects of incarceration on health. J Health Soc Behav. 2007;48(2):115–130. doi:10.1177/0022146507048000202 [DOI] [PubMed] [Google Scholar]
  • 5. Western B. Punishment and Inequality in America. New York, NY: Russell Sage Foundation; 2006. [Google Scholar]
  • 6. Massoglia M, Remster B, King RD. Stigma or separation? Understanding the incarceration–divorce relationship. Social Forces. 2011;90(1):133–155. doi:10.1093/sf/90.1.133 [Google Scholar]
  • 7. Geller AB, Curtis MA. A sort of homecoming: incarceration and the housing security of urban men. Soc Sci Res. 2011;40(4):1196–1213. doi:10.2139/ssrn.1632578 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8. Roettger ME, Boardman JD. Parental incarceration and gender-based risks for increased body mass index: evidence from the National Longitudinal Study of Adolescent Health in the United States. Am J Epidemiol. 2012;175(7):636–644. doi:10.1093/aje/kwr409 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9. Wakefield S, Wildeman C. Mass imprisonment and racial disparities in childhood behavioral problems. Criminology Public Policy. 2011;10(3):791–792. doi:10.1111/j.1745-9133.2011.00741.x [Google Scholar]
  • 10. Wildeman C. Imprisonment and infant mortality. Social Problems. 2012;59(2):228–257. doi:10.1525/sp.2012.59.2.228 [Google Scholar]
  • 11. Roettger ME, Swisher RR. Associations of fathers’ history of incarceration with sons’ delinquency and arrest among black, white, and Hispanic males in the United States. Criminology. 2011;49(4):1109–1147. doi:10.1111/j.1745-9125.2011.00253.x [Google Scholar]
  • 12. Schwartz-Soicher O, Geller A, Garfinkel I. The effect of paternal incarceration on material hardship. Soc Serv Rev. 2011;85(3):447–473. doi:10.1086/661925 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13. Foster H, Hagan J. Punishment regimes and the multilevel effects of parental incarceration: intergenerational, intersectional, and interinstitutional models of social inequality and systemic exclusion. Annu Rev Sociol. 2015;41:135–158. doi:10.1146/annurev-soc-073014-112437 [Google Scholar]
  • 14. Foster H, Hagan J. Maternal imprisonment, economic marginality, and unmet health needs in early adulthood. Prev Med. 2017;99:43–48. doi:10.1016/j.ypmed.2017.01.018 [DOI] [PubMed] [Google Scholar]
  • 15. Hagan J, Foster H. Children of the American prison generation: the paradoxical “spillover” school effects of incarcerating mothers. Law Soc Rev. 2012;46:37–69. doi:10.1111/j.1540-5893.2012.00472.x [Google Scholar]
  • 16. Hagan J, Foster H. Intergenerational educational effects of mass imprisonment in America. Sociol Educ. 2012;85(3):259–286. doi:10.1177/0038040711431587 [Google Scholar]
  • 17. Foster H, Hagan J. Maternal and paternal imprisonment and children’s social exclusion in young adulthood. J Crim Law Criminol. 2016;105(2):387–430. [Google Scholar]
  • 18. Tourangeau R, Yan T. Sensitive questions in surveys. Psychol Bull. 2007;133(5):859–883. doi:10.1037/0033-2909.133.5.859 [DOI] [PubMed] [Google Scholar]
  • 19. Tourangeau R, Rips LJ, Rasinski K. The Psychology of Survey Response. Cambridge, UK: Cambridge University Press; 2000. [Google Scholar]
  • 20. Bradburn N, Sudman S, Wansink B. Asking Questions: The Definitive Guide to Questionnaire Design—For Market Research, Political Polls, and Social and Health Questionnaires. New York, NY: Wiley and Sons; 2004. [Google Scholar]
  • 21. Fowler FJ., Jr Improving Survey Questions: Design and Evaluation. Applied Social Research Methods Series, Volume 38. Thousand Oaks, CA: Sage; 1995. [Google Scholar]
  • 22. National Center for Health Statistics. Advance letters for households: phase 1. National Survey of Family Growth; https://www.cdc.gov/nchs/data/nsfg/nsfg2013-2015_advancehouseholdletters.pdf. Accessed October 29, 2018. [Google Scholar]
  • 23. Neter J, Waksberg J. A study of response errors in expenditures data from household interviews. J Am Stat Assoc. 1964;59(30):18–55. doi:10.2307/2282857 [Google Scholar]
  • 24. Tourangeau R, Bradburn NM. The psychology of survey response. In: Marsden PV, Wright JD, eds. Handbook of Survey Research. Bingley, UK: Emerald; 2010:315–346. [Google Scholar]
  • 25. Gaskell GD, Wright DB, O’Muircheartaigh CA. Telescoping of landmark events: implications for survey research. Public Opin Q. 2000;64(1):77–89. [DOI] [PubMed] [Google Scholar]
  • 26. Wyner GA. Response errors in self-reported number of arrests. Sociol Methods Res. 1980;9(2):161–177. [Google Scholar]
  • 27. Preisendörfer P, Wolter F. Who is telling the truth? A validation study on determinants of response behavior in surveys. Public Opin Q. 2014;78(1):126–146. doi:10.1093/poq/nft079 [Google Scholar]
  • 28. Junger M. Discrepancies between police and self-report data for Dutch racial minorities. Br J Criminol. 1989;29(3):273–284. [Google Scholar]
  • 29. van der Heijden PGM, van Gils G, Bouts J, Hox JJ. A comparison of randomized response, computer-assisted self-interview, and face-to-face direct questioning: eliciting sensitive information in the context of welfare and unemployment benefit. Sociol Methods Res. 2000;28(4):505–537. doi:10.1177/0049124100028004005 [Google Scholar]
  • 30. De Jonge CPK. Who lies about electoral gifts? Experimental evidence from Latin America. Public Opin Q. 2015;79(3):710–739. doi:10.1093/poq/nfv024 [Google Scholar]
  • 31. Catania JA, Gibson DR, Chitwood DD, Coates TJ. Methodological problems in AIDS behavioral research: influences of measurement error and participation bias in studies of sexual behavior. Psychol Bull. 1990;108(3):339–362. [DOI] [PubMed] [Google Scholar]
  • 32. Tourangeau R, Groves RM, Redline CD. Sensitive topics and reluctant respondents: demonstrating a link between nonresponse bias and measurement error. Public Opin Q. 2010;74(3):413–432. doi:10.1093/poq/nfq004 [Google Scholar]
  • 33. Sakshaug JW, Yan T, Tourangeau R. Nonresponse error, measurement error, and mode of data collection: tradeoffs in a multi-mode survey of sensitive and non-sensitive items. Public Opin Q. 2010;74(5):907–933. doi:10.1093/poq/nfq057 [Google Scholar]
  • 34. National Center for Health Statistics. Public use data file documentation, 2006-2010: National Survey of Family Growth. https://www.cdc.gov/nchs/data/nsfg/NSFG_2006-2010_UserGuide_MainText.pdf. Accessed October 29, 2018.
  • 35. National Center for Health Statistics. 2011-2013 National Survey of Family Growth (NSFG): summary of design and data collection methods. https://www.cdc.gov/nchs/data/nsfg/NSFG_2011_2013_DesignandDataCollectionMethods.pdf. Accessed October 29, 2018.
  • 36. Substance Abuse and Mental Health Services Administration. National Survey on Drug Use and Health. https://www.samhsa.gov/data/data-we-collect/nsduh-national-survey-drug-use-and-health. Accessed November 19, 2018.
  • 37. Brown S, Manning W. The Survey of Criminal Justice Experience (SCJE), 2013. Ann Arbor, MI: Inter-university Consortium for Political and Social Research; 2014. 10.3886/ICPSR35080.v1. Accessed November 17, 2018. [DOI] [Google Scholar]
  • 38. Harris KM, Udry JR. National Longitudinal Study of Adolescent to Adult Health (Add Health), 1994-2008 (Public Use). Ann Arbor, MI: Carolina Population Center, University of North Carolina-Chapel Hill, Inter-university Consortium for Political and Social Research; 2018. 10.3886/ICPSR21600.v21. Accessed November 17, 2018. [DOI] [Google Scholar]
  • 39. de Leeuw ED. Data Quality in Mail, Telephone and Face to Face Surveys. Amsterdam, the Netherlands: T.T. Publikaties; 1992. [Google Scholar]
  • 40. Converse J, Presser S. Survey Questions: Handcrafting the Standardized Questionnaire. Thousand Oaks, CA: Sage; 1986. [Google Scholar]
  • 41. Tourangeau R, Conrad FG, Couper MP, Ye C. The effects of providing examples in survey questions. Public Opin Q. 2014;78(1):100–125. doi:10.1093/poq/nft083 [Google Scholar]
  • 42. US Bureau of Labor Statistics. National Longitudinal Survey of Youth–1997 Cohort. https://www.nlsinfo.org/sites/nlsinfo. Accessed November 17, 2018.
  • 43. Cannell C, Miller P, Oksenberg L. Research on interviewing techniques. In: Leinhardt S, ed. Sociological Methodology. San Francisco, CA: Jossey-Bass; 1981:389–437. [Google Scholar]
  • 44. Schober MF, Conrad FG, Antoun C, et al. Precision and disclosure in text and voice interviews on smartphones. PLoS One. 2015;10(6):e0128337 doi:10.1371/journal.pone.0128337 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 45. Tourangeau R, Smith TW. Asking sensitive questions: the impact of data collection mode, question format, and question context. Public Opin Q. 1996;60(2):275–304. [Google Scholar]
  • 46. Lind LH, Schober MF, Conrad FG, Reichert H. Why do survey respondents disclose more when computers ask the questions? Public Opin Q. 2013;77(4):888–935. doi:10.1093/poq/nft038 [Google Scholar]
  • 47. Villarroel MA, Turner CF, Eggleston E, et al. Same-gender sex in the United States: impact of T-ACASI on prevalence estimates. Public Opin Q. 2006;70(2):166–196. doi:10.1093/poq/nfj023 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 48. Corkrey R, Parkinson L. A comparison of four computer-based telephone interviewing methods: getting answers to sensitive questions. Behav Res Methods Instrum Comput. 2002;34(3):354–363. [DOI] [PubMed] [Google Scholar]
  • 49. Centers for Disease Control and Prevention. National Survey of Family Growth. https://www.icpsr.umich.edu/icpsradmin/nsfg/variableGroupChild/7359?studyNumber=9998. Accessed November 17, 2018.
  • 50. Holtgraves T, Eck J, Lasky B. Face management, question wording, and social desirability. J Appl Soc Psychol. 1997;27(18):1650–1671. doi:10.1111/j.1559-1816.1997.tb01618.x [Google Scholar]
  • 51. Peter J, Valkenburg PM. The impact of “forgiving” introductions on the reporting of sensitive behavior in surveys: the role of social desirability response style and developmental status. Public Opin Q. 2011;75(4):779–787. doi:10.2307/41288417 [Google Scholar]
  • 52. Persson M, Solevid M. Measuring political participation—testing social desirability bias in a Web-survey experiment. Int J Public Opin Res. 2014;26(1):98–112. doi:10.1093/ijpor/edt002 [Google Scholar]
  • 53. Knauper B. Filter questions and question interpretation: presuppositions at work. Public Opin Q. 1998;62(1):70–78. doi:10.1086/297832 [Google Scholar]
  • 54. US Department of Health and Human Services, Centers for Disease Control and Prevention, National Center for Injury Prevention and Control. National Intimate Partner and Sexual Violence Survey (NISVS): General Population Survey Raw Data, 2010. Ann Arbor, MI: Inter-university Consortium for Political and Social Research; 2016. 10.3886/ICPSR34305.v1. Accessed November 17, 2018. [DOI] [Google Scholar]
  • 55. Lensvelt-Mulders GJLM, Hox JJ, van der Heijden PGM, Maas CJM. Meta-analysis of randomized response research: thirty-five years of validation. Sociol Methods Res. 2005;33(3):319–348. doi:10.1177/0049124104268664. [Google Scholar]
  • 56. Holbrook AL, Krosnick JA. Measuring voter turnout by using the randomized response technique: evidence calling into question the method’s validity. Public Opin Q. 2010;74(2):328–343. doi:10.1093/poq/nfq012 [Google Scholar]
  • 57. Hoffmann A, Musch J. Assessing the validity of two indirect questioning techniques: a stochastic lie detector versus the crosswise model. Behav Res Methods. 2016;48(3):1032–1046. doi:10.3758/s13428-015-0628-6 [DOI] [PubMed] [Google Scholar]
  • 58. Warner SL. Randomized response: a survey technique for eliminating evasive answer bias. J Am Stat Assoc. 1965;60(309):63–66. [PubMed] [Google Scholar]
  • 59. Greenberg BG, Abul-Ela ALA, Simmons WR, Horvitz DG. The unrelated question randomized response model: theoretical framework. J Am Stat Assoc. 1969;64(32):520–539. [Google Scholar]
  • 60. Coutts E, Jann B. Sensitive questions in online surveys: experimental results for the randomized response technique (RRT) and the unmatched count technique (UCT). Sociol Methods Res. 2011;40(1):169–193. doi:10.1177/0049124110390768 [Google Scholar]
  • 61. Jann B, Jerke J, Krumpal I. Asking sensitive questions using the crosswise model: an experimental survey measuring plagiarism. Public Opin Q. 2012;76(1):32–49. doi:10.2307/41345966 [Google Scholar]
  • 62. Droitcour J, Caspar RA, Hubbard ML, Parsely TL, Visscher W, Ezzati TM. The item count technique as a method of indirect questioning: a review of its development and a case study application. In: Biemer PP, Groves RM, Lyberg LE, Mathiowetz NA, Sudman S, eds. Measurement Errors in Surveys. New York, NY: Wiley; 1991:185–210. [Google Scholar]
  • 63. Biemer P, Brown G. Model-based estimation of drug use prevalence using item count data. J Off Stat. 2005;21(2):287–308. [Google Scholar]
  • 64. Trappmann M, Krumpal I, Kirchner A, Jann B. Item sum: a new technique for asking quantitative sensitive questions. J Surv Stat Methodol. 2014;2(1):58–77. doi:10.1093/jssam/smt019 [Google Scholar]
  • 65. Imai K. Multivariate regression analysis for the item count technique. J Am Stat Assoc. 2011;106(494):407–416. doi:10.1198/jasa.2011.ap10415 [Google Scholar]
  • 66. National Institute on Alcohol Abuse and Alcoholism. National Epidemiologic Survey on Alcohol and Related Conditions. https://www.niaaa.nih.gov/research/nesarc-iii/questionnaire. Accessed November 17, 2018.
  • 67. Bureau of Justice Statistics. Police–Public Contact Survey. https://www.bjs.gov/index.cfm?ty=dcdetail&iid=251#Questionnaires. Accessed November 17, 2018.
  • 68. Princeton University. Fragile Families and Child Wellbeing Study. https://fragilefamilies.princeton.edu/documentation/general. Accessed November 17, 2018.
  • 69. Schaeffer NC, Presser S. The science of asking questions. Annu Rev Sociol. 2003;29:65–88. doi:10.1146/annurev.soc.29.110702.110112 [Google Scholar]
  • 70. Bureau of Justice Statistics. National Former Prisoner Survey. https://www.bjs.gov/index.cfm?ty=dcdetail&iid=322#Questionnaires. Accessed November 17, 2018.
  • 71. Yan T, Kreuter F, Tourangeau R. Evaluating survey questions: a comparison of methods. J Off Stat. 2012;28(4):503–529. [Google Scholar]
