Author manuscript; available in PMC 2010 Apr 1.
Published in final edited form as: J Am Acad Child Adolesc Psychiatry. 2009 Apr;48(4):380–385. doi: 10.1097/CHI.0b013e3181999705

The National Comorbidity Survey Adolescent Supplement (NCS-A): II. Overview and Design

Ronald C Kessler 1, Shelli Avenevoli 2, E Jane Costello 3, Jennifer Greif Green 1, Michael J Gruber 1, Steven Heeringa 4, Kathleen R Merikangas 5, Beth-Ellen Pennell 4, Nancy A Sampson 1, Alan M Zaslavsky 1
PMCID: PMC2718678  NIHMSID: NIHMS126915  PMID: 19242381

Abstract

OBJECTIVE

To present an overview of the design and field procedures of the National Comorbidity Survey Replication Adolescent Supplement (NCS-A).

METHOD

The NCS-A is a nationally representative face-to-face household survey of the prevalence and correlates of DSM-IV mental disorders among US adolescents (ages 13–17) that was carried out between February 2001 and January 2004 by the Survey Research Center of the Institute for Social Research at the University of Michigan. The sample was based on a dual-frame design that included 904 adolescent residents of the households that participated in the National Comorbidity Survey Replication (NCS-R; 85.9% response rate) and 9,244 adolescent students selected from a representative sample of 320 schools in the same nationally representative sample of counties as the NCS-R (74.7% response rate).

RESULTS

Comparisons of sample and population distributions on Census socio-demographic variables and, in the school sample, school characteristics documented only minor differences that were corrected with post-stratification weighting. Comparisons of DSM-IV disorder prevalence estimates among household vs. school sample respondents in counties that differed in the use of replacement schools for originally selected schools that refused to participate showed that the use of replacement schools did not introduce bias into prevalence estimates.

CONCLUSIONS

The NCS-A is a rich nationally representative dataset that will substantially increase understanding of the mental health and well-being of adolescents in the United States.

Keywords: National Comorbidity Survey Adolescent Supplement (NCS-A), mental disorders, epidemiology, survey design, survey sampling

OBJECTIVE

This paper presents an overview of the design and field procedures of the National Comorbidity Survey Replication Adolescent Supplement (NCS-A), a nationally representative survey of DSM-IV mental disorders among adolescents (ages 13–17) in the US. The NCS-A was designed to provide information on the prevalence and correlates of mental disorders among US youth by extending the lower age range of the National Comorbidity Survey Replication (NCS-R),1 a nationally representative survey of adult mental disorders that was fielded in 2001–2003. Companion papers discuss the background, rationale, and instruments used in the NCS-A2 and present data on the validity of the NCS-A diagnostic assessments.3

Although the original intent was to obtain the NCS-A sample from adolescents residing in NCS-R households, the number of such youth was too small to generate the target sample of 10,000 respondents. We consequently supplemented the sample with a school-based sample, which had lower recruitment costs than additional household screening.4 The final sample, then, was based on a dual-frame design4, 5 in which one sample was recruited from the NCS-R households and the other from a representative sample of schools in the same communities as the NCS-R households. All schools (public and private, schools for gifted children, therapeutic schools, etc.) were included in their true population proportions. A stratified probability sample of students was selected from each school to participate in the survey.

METHOD

Fieldwork organization and procedures

The NCS-A fieldwork was carried out by the same professional national field interview staff from the Survey Research Center (SRC) at the University of Michigan that carried out the NCS-R. There were 197 interviewers supervised by a team of 18 experienced regional supervisors. A study manager located at the central SRC facility in Michigan oversaw the work of the supervisors and their staff. Upon making in-person contact, the interviewer answered questions before obtaining written informed consent from the parent and then written informed assent from the adolescent. Interviews were never conducted with a non-emancipated adolescent unless at least one parent or guardian was present in the home during the interview. However, no parent consent was requested in the small number of cases where we interviewed an emancipated minor. Adolescents were paid $50 for participating in the survey. The Human Subjects Committees of both Harvard Medical School and the University of Michigan approved these recruitment, consent, and field procedures.

Interviewer training and field quality control

Interviews were administered using computer-assisted personal interview (CAPI) methods. Each SRC interviewer completes a two-day General Interviewer Training (GIT) course before working on any survey, and experienced interviewers also complete GIT refresher courses on a periodic basis. Each NCS-A interviewer additionally received a five-day training session specific to the NCS-A CAPI interview. Several steps were taken to ensure the quality of fieldwork. Sample households were selected centrally to avoid interviewers recruiting respondents from preferred neighborhoods. The computerized interviews had a built-in clock to record the speed of data entry, making it difficult for interviewers to shorten interviews by skipping sections or filling in sections quickly. Supervisors reviewed each interview within 24 hours of completion to check for a wide range of errors; completed CAPI interviews were sent electronically to supervisors every night for this purpose. Supervisors also contacted a random 10% of interviewed households to confirm address, enumeration, random selection procedures, interview length, and a random sample of question responses. In cases where problems were detected, interviewers were instructed to re-contact the respondent to obtain the missing data.

The sampling design

As noted above, the NCS-A household survey included adolescents who resided in households identified in the NCS-R. Selection of NCS-R households is described in detail elsewhere6 and will not be repeated here other than to note that the households were selected with a three-stage clustered area probability sampling design that was representative of households in the continental US. The age and sex of each household member were recorded, allowing us to target households with adolescents. The NCS-A school sample, in comparison, was selected from a comprehensive government list of all licensed schools in the country. Although school-based samples miss adolescents who have dropped out of school, such youth were captured in the NCS-A through the NCS-R household sample. A representative sample of middle schools, junior high schools, and high schools in the NCS-R counties was selected from the government list with probabilities proportional to the size of the student body in the classes relevant to the target sample of youth ages 13–17. All accredited schools were eligible, including private and residential schools. In some cases where there were several small schools in a geographic area, those schools were combined to form a cluster that was treated as a single school for purposes of sampling.
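As a rough illustration of the probability-proportional-to-size (PPS) selection described above, the sketch below shows a generic systematic PPS draw. The school identifiers, size measure, and systematic-selection details here are illustrative assumptions rather than the SRC's documented sampling routine.

```python
import random

def systematic_pps_sample(units, sizes, n_sample, seed=None):
    """Systematic probability-proportional-to-size (PPS) selection.

    units    : list of unit identifiers (e.g., school IDs)
    sizes    : matching measures of size (e.g., enrollment in eligible grades)
    n_sample : number of units to select
    Selection probabilities are proportional to size, assuming no single
    unit's size exceeds the sampling interval.
    """
    rng = random.Random(seed)
    total = float(sum(sizes))
    interval = total / n_sample          # sampling interval on the size scale
    start = rng.uniform(0, interval)     # random start within the first interval
    targets = [start + k * interval for k in range(n_sample)]

    selected, cumulative, idx = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cumulative += size
        # Select this unit for every target that falls in its cumulative-size range.
        while idx < n_sample and targets[idx] <= cumulative:
            selected.append(unit)
            idx += 1
    return selected

# Hypothetical usage with a toy frame of 20 schools.
rng = random.Random(1)
schools = [f"school_{i}" for i in range(20)]
enrollments = [rng.randint(100, 2000) for _ in range(20)]
print(systematic_pps_sample(schools, enrollments, n_sample=5, seed=42))
```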

School recruitment consisted of contacting individual school Principals, with the district's approval, to obtain rosters from which to contact student families for study participation. Schools were provided $200 as a token of appreciation for this cooperation; toward the end of the recruitment period, when more schools were needed to complete the study, this payment was increased to $300. Within each school, a random sample of 40–50 eligible students was selected. We began with an initial target sample of 289 schools, of which 81 agreed to participate; with the replacement schools described below, a total of 320 schools ultimately took part in the survey. The primary reason for refusal was reluctance to release student information, and some schools had explicit policies against doing so. An additional problem was that our recruitment took place shortly after the Columbine shooting incident, at which time schools around the country were inundated with requests from local colleges to carry out studies of students in area schools.

Districts that required formal research proposals usually granted our request eventually, but sometimes with the stipulation that they would release student information only after written parental consent had been obtained. We generally rejected schools with this stipulation because active parental consent has been shown in previous research to result in very low response rates. In cases where no replacement schools were readily available, though, we agreed to the requirement; this occurred in roughly 15% of schools. As shown below, the response rate was dramatically lower in this subsample, which we refer to as blinded schools because we were blinded to the identities of the sampled students until after signed consent had been obtained through the school Principals.

Given the low initial school-level response rate and the often protracted time frame of recruitment, we recruited multiple replacement schools for some refusing schools. Replacement schools were selected to match the initial refusal schools in terms of school size, geographic area, and demographic characteristics. The fact that we ended up with 320 schools rather than the original 289 reflects this expansion of recruitment. In cases where multiple replacement schools were included in the sample for one original school, the total number of interviews targeted in the replacement schools summed to the number targeted for the original school.

RESULTS

Sample disposition

The NCS-A sample disposition is reported in Table 1. The overall adolescent response rate was 75.6%, for a total of 10,148 completed interviews. This figure combines response rates of 85.9% (n = 904) in the household sample, 81.8% (n = 8,912) in the un-blinded school sample, and 22.3% (n = 332) in the blinded school sample. Non-response was largely due to refusal (21.3%), which in the household and un-blinded school samples came largely from parents rather than adolescents (72.3% and 81.0%, respectively). The refusals in the blinded school sample, in comparison, came almost entirely (98.1%) from parents failing to return the signed consent postcard.

Table 1.

NCS-A sample disposition

                              Household      Un-blinded School   Blinded School   Total
                              % (n)          % (n)               % (n)            % (n)
I. Adolescents
  Interview                   85.9 (904)a    81.8 (8,912)        22.3 (332)       75.6 (10,148)
  Refusalb                    11.3 (119)     14.7 (1,604)        76.4 (1,137)     21.3 (2,860)
  Circumstantial              2.4 (25)       1.9 (211)           2.9 (13)         1.9 (249)
  No contact                  0.4 (4)        1.5 (165)           0.4 (6)          1.3 (175)
II. Parents
  Full questionnaire          52.4 (551)c    52.4 (5,703)        15.9 (237)       48.3 (6,491)
  Short-form questionnaire    18.5 (195)c    16.0 (1,744)        3.7 (55)         14.8 (1,994)
  Either                      70.9 (746)     68.4 (7,447)        19.6 (292)       63.0 (8,485)
  (n)                         (1,052)        (10,892)            (1,488)          (13,432)
a. 25 of the household survey respondents were not students. The remaining 879 were students.

b. The much higher refusal rate in the blinded school sample than in the other samples was due to the fact that in blinded schools active written parental consent, in the form of a signed return postcard in response to a letter mailed by the school Principal, was required before the school would release the names and addresses of sample adolescents to the research team. Some 74.9% of parents in blinded schools failed to return these postcards, while another 1.5% of cases were lost because of refusal to participate on the part of either the parent (0.9%) or the adolescent (0.6%) after a parent had signed the informed consent postcard. As in the blinded school sample, the majority of refusals in both the household sample (72.3%) and the un-blinded school sample (81.0%) came from parents rather than adolescents.

c. 15 of the parents who completed a questionnaire (8 full questionnaire, 7 short-form questionnaire) were the parents of adolescents who were not students.

Consistent with parents being less cooperative than adolescents, the response rate to the parent self-administered questionnaire (SAQ) was considerably lower than the response rate to the adolescent interview: 63.0% compared to 75.6%. The parent SAQ response rate could not be higher than the adolescent response rate by design, though, as parent SAQs were collected only for adolescents who completed interviews. The conditional parent response rate given adolescent response did not differ substantially across the household sample (82.5%; 70.9/85.9), the un-blinded school sample (83.6%; 68.4/81.8), and the blinded school sample (87.9%; 19.6/22.3).
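For readers who want to reproduce these figures, the following back-of-the-envelope check (a simple illustration, not part of the NCS-A analysis code) recovers the overall adolescent response rate and the conditional parent response rates from the counts and percentages in Table 1.

```python
# Counts taken from the 'Interview' and '(n)' rows of Table 1.
interviews = {"household": 904, "un-blinded school": 8912, "blinded school": 332}
eligible   = {"household": 1052, "un-blinded school": 10892, "blinded school": 1488}

overall = 100 * sum(interviews.values()) / sum(eligible.values())
print(f"overall adolescent response rate: {overall:.1f}%")     # 75.6%

# Conditional parent SAQ response rates given an adolescent interview,
# from the 'Either' and 'Interview' rows of Table 1.
adolescent = {"household": 85.9, "un-blinded school": 81.8, "blinded school": 22.3}
parent     = {"household": 70.9, "un-blinded school": 68.4, "blinded school": 19.6}
for sample in adolescent:
    print(f"{sample}: {100 * parent[sample] / adolescent[sample]:.1f}%")
    # household 82.5%, un-blinded school 83.6%, blinded school 87.9%
```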

Weighting

According to the most recent Census data, approximately 96.6% of US adolescents in the age range 13–17 are students (www.census.gov). We would consequently have expected about 31 non-student respondents in the household sample (i.e., 3.4% of 904); the actual number was 25. This is too few to support extrapolation to the population of the roughly half million non-student adolescents ages 13–17 in the US. We consequently excluded the non-student respondents from the bulk of the analyses and concentrated on the 10,123 respondents who were students, and weighting focused on the student population. As the sample design involved a dual-frame approach, a distinct weighting scheme was used to make each of the two samples representative of adolescents in the US household population. The two weighted samples were then merged for purposes of analysis.
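One simple way to merge two samples that have each been weighted to represent the same student population is to downweight each frame by a compositing factor so that the combined weights sum to the population total only once. The sketch below is a minimal illustration under that assumption, with compositing proportional to sample size; the actual NCS-A compositing rule is described in reference 7, not here.

```python
# Minimal sketch of merging two independently weighted frames.
# Assumes each record dict carries a 'weight' that already sums, within its
# own frame, to the size of the US student population ages 13-17.
# The proportional-to-sample-size compositing factor is an illustrative
# choice, not the documented NCS-A rule.

def merge_frames(household_students, school_students):
    n_h, n_s = len(household_students), len(school_students)
    alpha = n_h / (n_h + n_s)          # share assigned to the household frame
    merged = []
    for rec in household_students:
        merged.append({**rec, "weight": rec["weight"] * alpha})
    for rec in school_students:
        merged.append({**rec, "weight": rec["weight"] * (1 - alpha)})
    return merged

# Hypothetical usage with toy records:
hh  = [{"id": 1, "weight": 1200.0}, {"id": 2, "weight": 900.0}]
sch = [{"id": 3, "weight": 650.0}, {"id": 4, "weight": 700.0}]
print(merge_frames(hh, sch))
```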

The household sample weighting was the simpler of the two in that weights had already been developed for the NCS-R household sample. The NCS-R weights are described elsewhere6 and will not be discussed here. We began by adding these weights to the adolescent data and adjusting them for the differential probability of selection of adolescents as a function of the number of other adolescents in the household. These doubly-weighted data were then compared with nationally representative Census data on basic socio-demographic characteristics for purposes of post-stratification. Two data files were used for this purpose. The first was the 2000 Census Public Use Microdata Sample (PUMS; www.census.gov/support/pumsdata.html), a 5% sample of the entire US population. Data were extracted from the PUMS for adolescents who were students at the time of the Census. The second was a small-area geo-code data file prepared by a commercial firm that aggregated 2000 Census data to the level of the Block Group (BG) for each of the 208,790 BGs (http://www.geolytics.com/resources/us-census-2000.html). These BG-level data were linked to the data record of each NCS-A respondent, while the national distributions for the population on these same BG-level variables were generated by weighting the BG-level data by the population of eligible adolescents in each BG.
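As a concrete but simplified illustration of the within-household selection adjustment: if one adolescent were sampled at random from each participating household, that adolescent's selection probability would be 1 divided by the number of eligible adolescents, so the household weight would be multiplied by that number. The one-per-household assumption is made only for this sketch; the actual NCS-A adjustment is documented in reference 7.

```python
# Illustrative within-household selection adjustment, assuming one adolescent
# is sampled at random per household (an assumption made for this sketch;
# see reference 7 for the documented NCS-A procedure).

def adolescent_weight(household_weight, n_eligible_adolescents):
    # The sampled adolescent's selection probability is 1 / n_eligible_adolescents,
    # so the inverse-probability adjustment multiplies the household weight
    # by n_eligible_adolescents.
    return household_weight * n_eligible_adolescents

print(adolescent_weight(1500.0, 2))   # household with two eligible teens -> 3000.0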

We selected a wide range of variables available both in the NCS-A and in either the PUMS or the BG-level data file to post-stratify the NCS-A data. (Details are available on request.) In addition, we had information on some variables not in the Census files for the NCS-A household sample, as the NCS-R had been completed in these households. In particular, we compared, and weighted for, discrepancies in the DSM-IV/CIDI disorders reported by the adult NCS-R respondents between the households of NCS-A respondents and those of non-respondents.

The sample data were weighted using a complex method described elsewhere7 to make the distributions of the post-stratification variables in the weighted sample identical to those in the population datasets while maintaining the associations among these variables found in the sample. Weighting for the school sample was based on the same approach, except that we also included a third set of variables extracted from the Quality Education Data (QED) database, a commercially produced database of the characteristics of all primary and secondary schools in the US (http://www.qeddata.com). Comparison of weighted and unweighted distributions of the weighting variables in the two samples (results available on request) showed that post-stratification did not have dramatic effects on these variables. For example, the proportion of adolescents who are Non-Hispanic White was estimated to be 55.5% before post-stratification compared to the actual population distribution of 65.6%, implying a relative increase of 18% (i.e., 65.6/55.5) in this proportion after post-stratification. This general pattern of relatively modest adjustments in proportions held for the vast majority of the post-stratification variables included in the analysis. A more detailed discussion of weighting is presented elsewhere.7
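The post-stratification approach described above, matching sample marginals to population marginals while preserving the associations observed in the sample, is in the same family as raking (iterative proportional fitting). The sketch below implements plain raking on two categorical margins as a stand-in; the NCS-A used the more elaborate method described in reference 7, so treat this purely as an illustration, and note that the variable names and targets are hypothetical.

```python
from collections import defaultdict

def rake(records, margins, weight_key="weight", n_iter=25):
    """Rake weights so weighted marginals match population targets.

    records : list of dicts holding covariates and a weight
    margins : {variable_name: {category: population_total}}
    A minimal illustration of iterative proportional fitting; the NCS-A
    post-stratification itself was more complex (see reference 7).
    """
    for _ in range(n_iter):
        for var, targets in margins.items():
            # Current weighted total in each category of this variable.
            current = defaultdict(float)
            for rec in records:
                current[rec[var]] += rec[weight_key]
            # Scale weights so this variable's marginals hit the targets.
            for rec in records:
                cat = rec[var]
                if current[cat] > 0:
                    rec[weight_key] *= targets[cat] / current[cat]
    return records

# Hypothetical toy example with two margins (sex and race/ethnicity, in percent).
sample = [
    {"sex": "F", "race": "NHW", "weight": 1.0},
    {"sex": "M", "race": "NHW", "weight": 1.0},
    {"sex": "F", "race": "Other", "weight": 1.0},
    {"sex": "M", "race": "Other", "weight": 1.0},
]
targets = {"sex": {"F": 49.0, "M": 51.0}, "race": {"NHW": 65.6, "Other": 34.4}}
rake(sample, targets)
```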

Was bias introduced by using replacement schools?

The low school-level response rate raises concerns about the possibility of bias. As noted above, replacement schools were carefully matched to original schools on data from the QED and Block Group databases. However, this does not guarantee that students in the replacement schools are comparable to students in the original schools in the prevalence of mental disorders or in rates of treatment of mental disorders. Fortunately, we were able to carry out an analysis of this issue using the household sample, which includes students from the schools that refused to be in the study. This analysis was carried out at the level of the 84 Primary Sampling Units (PSUs) in the original household sample. In each PSU we calculated the proportion of school-sample respondents who were from replacement schools rather than from the originally selected schools. We then compared the 12-month prevalence of DSM-IV mental disorders as assessed in our survey and the 12-month prevalence of treatment of mental disorders in the school sample versus the household sample as a function of the percent of school-sample respondents in the PSU from replacement schools (%RS). If prevalence differed between students in the replacement schools and students in the original schools, the difference in prevalence among respondents in the household sample compared to the school sample (H-S) would increase significantly with an increase in %RS. This did not occur: neither the H-S difference in disorder prevalence nor the H-S difference in prevalence of treatment varied systematically as a function of %RS (design-adjusted t = 0.3–0.5, p = .59–.78). Based on these results, we consider it very unlikely that substantial bias was introduced into the study by the high proportion of replacement schools used in the survey.
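A stripped-down version of this check, which ignores the design-based adjustments for weighting and clustering that the actual test accounted for, would regress the PSU-level household-minus-school prevalence difference on each PSU's percent of replacement-school respondents and examine the t statistic of the slope. The sketch below, with simulated inputs, illustrates only that logic and is not the NCS-A analysis code.

```python
import numpy as np

def slope_t_test(pct_replacement, hs_difference):
    """Simple OLS of the PSU-level H-S prevalence difference on %RS.

    Returns (slope, t statistic). Ignores survey weights and clustering,
    which the design-adjusted NCS-A test accounted for; this is only an
    illustration of the interaction logic.
    """
    x = np.asarray(pct_replacement, dtype=float)
    y = np.asarray(hs_difference, dtype=float)
    X = np.column_stack([np.ones_like(x), x])            # intercept + %RS
    beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    dof = len(y) - 2
    sigma2 = resid @ resid / dof
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_stat = beta[1] / np.sqrt(cov[1, 1])
    return beta[1], t_stat

# Hypothetical PSU-level inputs (84 PSUs in the household sample).
rng = np.random.default_rng(0)
pct_rs = rng.uniform(0, 97, size=84)      # percent of respondents from replacement schools
hs_diff = rng.normal(0, 5, size=84)       # simulated H-S prevalence differences
print(slope_t_test(pct_rs, hs_diff))
```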

CONCLUSIONS

This paper presented an overview of the NCS-A survey design and field procedures. The design allowed us to gather data from a nationally representative sample of adolescents and schools. An important limitation of the NCS-A is the relatively low response rate of schools in the school sample, but the analyses reported here suggest that this did not introduce bias into estimates of the prevalence of mental disorders or of treatment rates. Nonetheless, because the household and school samples were recruited in different ways, we plan to replicate key results separately in the two samples when we carry out substantive analyses. Consistency of results across the two samples will be taken as an indication of the robustness of findings.

The consolidated NCS-A data file will be released as a public use dataset as soon as a fully documented codebook and a cleaned dataset are created. The information in this rich dataset will allow many analyses to be carried out to increase our understanding of the health and well-being of adolescents in the United States. Information on the timing and method of accessing the dataset will be posted at [website deleted for blind review] as the data become available.

Table 2.

The association between proportion of replacement schools in Primary Sampling Units and household-versus-school sample differences in 12-month prevalence estimates of DSM-IV disorders and treatment

                              Percent of replacement schools in the Primary Sampling Unit            Interaction
                              Low             Medium          High            All
                              Est (se)        Est (se)        Est (se)        Est (se)                t-value
I. 12-month prevalence estimates
  Any DSM-IV disorder         43.6 (1.0)      38.4 (1.0)      45.9 (1.7)      44.9 (1.4)
  Any treatment               33.5 (0.8)      31.1 (1.0)      30.8 (1.5)      30.9 (1.0)
II. Difference in 12-month prevalence estimates between the household sample (coded 1) and school sample (coded 0)
  Any DSM-IV disorder         −0.1 (5.2)      5.2 (4.3)       −2.4 (2.8)      −1.4 (3.2)              0.5
  Any treatment               3.8 (3.6)       6.2 (3.1)       6.6* (3.1)      6.4 (3.4)               0.3
* The difference in the unadjusted prevalence estimates between the household sample and the school sample among respondents in the Primary Sampling Units defined by the column heading is significant at the .05 level, two-sided design-based test. The test took into consideration both the weighting and the clustering of the data.

The ranges of the proportion of school-sample respondents in the Primary Sampling Unit who came from replacement schools are as follows: Low (0.0–27.9%), Medium (28.0–54.2%), High (54.3–97.3%); the All column combines all Primary Sampling Units (100%).

The interaction term represents the extent to which the difference in the 12-month prevalence estimates between the household sample and the school sample varies as a function of the proportion of respondents in the school sample in the Primary Sampling Unit who came from replacement schools. Although linear interaction terms are shown here, the same clearly non-significant results were found when the interaction terms were estimated using a logistic link function.

Acknowledgments

The NCS-A is carried out in conjunction with the World Health Organization World Mental Health (WMH) Survey Initiative. We thank the staff of the WMH Data Collection and Data Analysis Coordination Centres for assistance with instrumentation, fieldwork, and consultation on data analysis. The WMH Data Coordination Centres have received support from NIMH (R01-MH070884, R13-MH066849, R01-MH069864, R01-MH077883), NIDA (R01-DA016558), the Fogarty International Center of the National Institutes of Health (FIRCA R03-TW006481), the John D. and Catherine T. MacArthur Foundation, the Pfizer Foundation, and the Pan American Health Organization. The WMH Data Coordination Centres have also received unrestricted educational grants from AstraZeneca, Bristol-Myers Squibb, Eli Lilly and Company, GlaxoSmithKline, Ortho-McNeil, Pfizer, Sanofi-Aventis, and Wyeth. A complete list of WMH publications can be found at http://www.hcp.med.harvard.edu/wmh/.

FUNDING

The National Comorbidity Survey Replication Adolescent Supplement (NCS-A) is supported by the National Institute of Mental Health (NIMH; U01-MH60220) with supplemental support from the National Institute on Drug Abuse (NIDA), the Substance Abuse and Mental Health Services Administration (SAMHSA), the Robert Wood Johnson Foundation (RWJF; Grant 044780), and the John W. Alden Trust. The work of Dr. Merikangas and her staff on the NCS-A is additionally supported by the NIMH Intramural Research Program, and the work of Dr. Zaslavsky and his staff is supported by NIMH grant R01-MH66627. The views and opinions expressed in this report are those of the authors and should not be construed to represent the views of any of the sponsoring organizations, agencies, or the U.S. Government. A complete list of NCS-A publications can be found at http://www.hcp.med.harvard.edu/ncs.

Footnotes

Statistical experts: Steven Heeringa, Ph.D., Alan Zaslavsky, Ph.D.

DISCLOSURE

Dr. Kessler has been a consultant for GlaxoSmithKline Inc., Kaiser Permanente, Pfizer Inc., Sanofi-Aventis, Shire Pharmaceuticals, and Wyeth-Ayerst; has served on advisory boards for Eli Lilly & Company and Wyeth-Ayerst; and has had research support for his epidemiological studies from Bristol-Myers Squibb, Eli Lilly & Company, GlaxoSmithKline, Johnson & Johnson Pharmaceuticals, Ortho-McNeil Pharmaceuticals Inc., Pfizer Inc., and Sanofi-Aventis. The remaining authors have nothing to disclose.

References

1. Kessler RC, Merikangas KR. The National Comorbidity Survey Replication (NCS-R): background and aims. Int J Methods Psychiatr Res. 2004;13:60–68. doi: 10.1002/mpr.166.
2. Merikangas KR, Avenevoli S, Costello EJ, Koretz D, Kessler RC. Background and measures in the National Comorbidity Survey Adolescent Supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. In press. doi: 10.1097/CHI.0b013e31819996f1.
3. Kessler RC, Green JG, Gruber M, et al. Concordance of DSM-IV diagnoses based on the WHO Composite International Diagnostic Interview (CIDI) Version 3.0 with blinded clinical reassessments in the National Comorbidity Survey Adolescent Supplement (NCS-A). J Am Acad Child Adolesc Psychiatry. In press. doi: 10.1097/CHI.0b013e31819a1cbc.
4. Johnston LD, O'Malley PM, Bachman JG, Schulenberg JE. Monitoring the Future National Results on Adolescent Drug Use: Overview of Key Findings, 2006. Bethesda, MD: National Institute on Drug Abuse; 2007. NIH Publication No. 07-6202.
5. Lepkowski JM, Groves RM. A mean square error model for dual frame, mixed mode survey design. J Am Stat Assoc. 1986;81:930–937.
6. Kessler RC, Berglund P, Chiu WT, et al. The US National Comorbidity Survey Replication (NCS-R): design and field procedures. Int J Methods Psychiatr Res. 2004;13:69–92. doi: 10.1002/mpr.167.
7. Kessler RC, Avenevoli S, Costello EJ, et al. Design and field procedures in the US National Comorbidity Survey Replication Adolescent Supplement (NCS-A). Int J Methods Psychiatr Res. In press. doi: 10.1002/mpr.279.
