Author manuscript; available in PMC: 2019 Aug 1.
Published in final edited form as: Matern Child Health J. 2018 Aug;22(8):1093–1102. doi: 10.1007/s10995-018-2526-x

The Design and Implementation of the 2016 National Survey of Children’s Health

Reem M Ghandour 1,5, Jessica R Jones 1, Lydie A Lebrun-Harris 1, Jessica Minnaert 1, Stephen J Blumberg 2, Jason Fields 3, Christina Bethell 4, Michael D Kogan 1
PMCID: PMC6372340  NIHMSID: NIHMS1003614  PMID: 29744710

Abstract

Introduction

Since 2001, the Health Resources and Services Administration’s Maternal and Child Health Bureau (HRSA MCHB) has funded and directed the National Survey of Children’s Health (NSCH) and the National Survey of Children with Special Health Care Needs (NS-CSHCN), unique sources of national and state-level data on child health and health care. Between 2012 and 2015, HRSA MCHB redesigned the surveys, combining content into a single survey, and shifting from a periodic interviewer-assisted telephone survey to an annual self-administered web/paper-based survey utilizing an address-based sampling frame.

Methods

The U.S. Census Bureau fielded the redesigned NSCH using a random sample of addresses drawn from the Census Master Address File, supplemented with a unique administrative flag to identify households most likely to include children. Data were collected June 2016–February 2017 using a multi-mode design, encouraging web-based responses while allowing for paper mail-in responses. A parent/caregiver knowledgeable about the child’s health completed an age-appropriate questionnaire. Experiments on incentives, branding, and contact strategies were conducted.

Results

Data were released in September 2017. The final sample size was 50,212 children; the overall weighted response rate was 40.7%. Comparisons of 2016 estimates to those from previous survey iterations are not appropriate due to sampling and mode changes.

Discussion

The NSCH remains an invaluable data source for key measures of child health and attendant health care system, family, and community factors. The redesigned survey extended the utility of this resource while seeking a balance between previous strengths and innovations now possible.

Keywords: Child health, National and state estimates, Children with special health care needs, National Survey of Children’s Health, National Survey of Children with Special Health Care Needs, Title V Maternal and Child Health Services

Introduction

Since 2001, the Health Resources and Services Administration’s Maternal and Child Health Bureau (HRSA MCHB) has provided funding and direction for two periodic surveys: the National Survey of Children’s Health (NSCH) and the National Survey of Children with Special Health Care Needs (NS-CSHCN). Together, these surveys provided national- and state-level data on key measures of child health, including the presence and impact of special health care needs; health care access, utilization, and quality; and related family and community factors (van Dyck et al. 2004). This content, as well as the commitment to producing reliable state-level data, was driven in large part by close ties to the Title V Maternal and Child Health Services Block Grant program (Title V) (US Department of Health and Human Services 2017). Data from both the NSCH and NS-CSHCN have served as the primary sources for state Title V needs assessments, program planning, and monitoring. In addition, these surveys have served as the data source for Healthy People objectives (US Department of Health and Human Services 2014) and the foundation for scientific studies (US Department of Health and Human Services Centers for Disease Control and Prevention 2017), on topics ranging from disease prevalence and treatment (Visser et al. 2014; Kogan et al. 2009) to underinsurance (Kogan et al. 2010) to adverse childhood experiences (Bethell et al. 2014).

In addition to the ability to produce both national- and state-level estimates, the surveys shared several characteristics. Both surveys were fielded three times as modules of the State and Local Area Integrated Telephone Survey (SLAITS) system by the Centers for Disease Control and Prevention’s National Center for Health Statistics (NCHS) (U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, and National Center for Health Statistics 2015). Both surveys collected data for children ages 0–17 years. Within households with multiple children, one child was randomly selected to be the subject of the interview. These telephone-based interviews were conducted with a parent or caregiver knowledgeable about the child’s health and health care, resulting in approximately 95,000 and 40,000 completed interviews per administration of the NSCH and NS-CSHCN, respectively.

Despite these strengths and the utility of the surveys, over time HRSA MCHB and stakeholders came to realize that a redesign was warranted. The impetus for this decision was threefold. First, the overall response rate for the NSCH declined from 55.3% in 2003 (Blumberg et al. 2005) to 23.0% in 2011–2012 (Centers for Disease Control and Prevention, National Center for Health Statistics, and State and Local Area Integrated Telephone Survey 2013); the response rate for the NS-CSHCN similarly declined between 2001 and 2009–2010, from 61.0% (Blumberg et al. 2003) to 25.5% (US Department of Health and Human Services 2011). These trends are consistent with an overall decline in response rates observed in federal surveys (Czajka and Beyler 2016) and other cross-sectional surveys (Brick and Williams 2013; Keeter et al. 2017) over the past decade or two, though the decline for telephone surveys has been sharper than for other modes. Second, in addition to declining response rates, telephone surveys such as the NSCH had to adapt to an overall increase in the proportion of U.S. households without landline telephones. Between 2003 and 2016, the proportion of households with children that lacked landline telephones grew from less than 5% to over 60% (Blumberg and Luke 2016). Third, although a cell-phone frame was added to the last administrations of both the NSCH and the NS-CSHCN to address this trend, the inclusion of cell-phone samples proved to be both costly and inefficient, consistent with industry-wide challenges (AAPOR Cell Phone Task Force 2010).

Together, these trends prompted HRSA MCHB to explore the feasibility and appropriateness of changing the underlying sampling frame from telephone numbers to household addresses. This decision was not made lightly as such a shift would be matched with a change in the mode of administration from an interviewer-assisted telephone survey to a self-administered web or paper-based survey. In addition to issues related to coverage and response rates, advancements in our collective understanding of children’s health were also considered. Specifically, we recognized that survey content developed for CSHCN, such as medical home access and health care transition planning (Hadland and Long 2014; Cooley and Sagerman 2011), could be relevant for all children. Finally, HRSA MCHB recognized the need to support stakeholders in data-driven decision making through the provision of more timely and frequent estimates.

The goals of this manuscript are threefold: (1) to provide an overview of the survey redesign process and attendant decisions regarding content and methodology; (2) to describe the implementation of the redesigned 2016 NSCH; and (3) to provide preliminary information for 2016 data users and considerations for the 2017 NSCH. The manuscript is divided into three sections corresponding to each of these goals: “Redesign Process and Key Decisions” presents the rationale for key decisions related to both survey design and content changes; “2016 NSCH Implementation” describes the sampling and data collection process as well as experiments conducted to inform future generations of the survey; and “2016 NSCH Data Release and Planning for 2017” describes basic features of the 2016 sample, considerations for data use, and improvements implemented for 2017.

Redesign Process and Key Decisions

In response to the three drivers discussed above, HRSA MCHB initiated a formal redesign process in 2012. The process included four major areas of activity: in-depth assessment of existing NSCH and NS-CSHCN content; consultation with experts in survey transformation and design; consultation with stakeholders on priorities for content retention, revision, and expansion; and quality assurance activities—specifically cognitive, usability, mode effects, and operational tests, including a national pretest. Several key decisions were reached based on the experiments conducted and input garnered from experts and stakeholders in aggregate, including:

  1. A single survey can be fielded annually, yielding new national estimates each year and new state estimates every 2–3 years, thereby addressing the need for more “real time” data to drive decision-making. One efficiency of more frequent administration is the ability to rely on smaller annual samples, since data can be pooled across years to produce reliable estimates when needed.

  2. Several major national surveys have successfully transitioned from random-digit-dial (RDD) telephone frames to address-based sampling (ABS) frames, including the National Household Education Survey, the Health Information National Trends Survey, and the Nielsen TV Ratings Diary. Their experiences demonstrate that, relative to RDD sampling and telephone interviewing, an ABS frame can be utilized to improve sampling frame coverage, reduce bias, support more efficient modes of survey administration, and improve response rates (Tourangeau and Plewes 2013).

  3. A web-based data collection instrument can be used to improve both the effectiveness and cost-efficiency of data collection, particularly when paired with paper-based data collection.

  4. Previous survey content can be effectively adapted for self-administration through either a web- or paper-based instrument.

  5. Content from the two previous data collection instruments (NSCH and NS-CSHCN) can be combined, despite some losses, in such a way as to allow for new content on emergent priorities as identified by stakeholders at the national, state, and local levels.

Survey Design and Procedures

Although the redesign reflected a significant departure from previous administration practices, the central purpose of the data collection effort did not shift: to provide state- and nationally representative estimates of key physical, emotional, and behavioral health indicators, health care access and quality, and related family, community, and systems factors, among children < 18 years of age (van Dyck et al. 2004), including the presence of special health care needs. The resultant design reflected both tried-and-true approaches utilized in previous iterations of both surveys as well as innovations achievable through the use of an ABS frame and self-administered questionnaires. Given the U.S. Census Bureau’s experience conducting the National Household Education Survey—a survey for households with children using an ABS frame and self-administered questionnaires—HRSA MCHB chose to shift the data collection platform from NCHS to the U.S. Census Bureau.

Content

The selection and refinement of content for the redesigned survey reflected the need to retain critical content that is uniquely available through the NSCH (e.g., medical home access), while creating room for emergent priorities. The 2016 NSCH capitalized on the extensive work done to select and refine survey items for previous administrations of both the NSCH (van Dyck et al. 2004) and the NS-CSHCN (van Dyck et al. 2002), and retained much of the content previously fielded in the eight core content areas illustrated in Fig. 1. As feasible and appropriate, age-specific content was also retained on selected topical questionnaires, such as breastfeeding questions for children ages 0–5 years and health care transition planning for children ages 12–17 years (Table 1). Every effort was made to retain survey items from previous iterations of the NSCH and NS-CSHCN within the redesigned questionnaire. When revisions, additions, and deletions were made, six policy-, program-, and science-based criteria, detailed in Table 2, were applied to guide the decision-making process.

Fig. 1.

Content areas assessed across all age-specific (0–5, 6–11, and 12–17 years) questionnaires for the 2016 National Survey of Children’s Health

Table 1.

Survey content for the redesigned 2016 National Survey of Children’s Health

(1) Family and household characteristics
- Number of children in household
- Primary language spoken
- Residential mobility
- Difficulty getting by on family’s income
- Food sufficiency (affordability, nutritional value, and adequacy)
- Federal benefits (food stamps, SSDI, SNAP, WIC)
- Adult relationship to child (Adults #1 and #2)
- Sex, age, nativity, marital status, educational attainment, and employment status (Adults #1 and #2)
- Physical and mental health (Adults #1 and #2)
- Family income (sources and dollar value)
- Number of people in household and number of family members in household
(2) Child demographics
- Sex
- Age
- Race and ethnicity
- Ability to speak English (5–17 years)
- Nativity and length of time in US if not native born
(3) Child health status
- Special health care need status (5-item CSHCN screener)
- General physical and oral health
- Flourishing
- Bullying (perpetration, victimization)
- Arguing
- Functional difficulties
- Disabilities (age-appropriate items from the American Community Survey)
- Health conditions (ever diagnosed; current; if current, mild, moderate, or severe)
  • Allergies
  • Arthritis
  • Asthma
  • Blood disorder
  • Brain injury
  • Cystic fibrosis
  • Diabetes
  • Down syndrome
  • Epilepsy
  • Genetic or inherited condition (not otherwise mentioned)
  • Heart condition
  • Headaches
  • Tourette syndrome
  • Anxiety problem
  • Behavior/conduct problem
  • Depression
  • Developmental delay
  • Intellectual disability
  • Speech or other language disorder
  • Learning disability
  • Other mental health condition
  • Substance use disorder (6–17 years)
  • Autism or Autism Spectrum Disorder
  • ADD/ADHD
- Activity limitations (frequency, extent)
(4) Child as Infant
- Birth weight
- Preterm birth
- Maternal age
- Breastfeeding (0–5 years)
(5) Health care services
- Doctor visit, past 12 months
- Number of doctor visits
- Length of time provider spent during last visit
- Private time with doctor alone (12–17 years)
- Current height and weight (Body Mass Index)
- Parental concern regarding child’s weight
- Usual place for sick and routine/preventive care and type of place (e.g., doctor’s office, clinic)
- Developmental screening (0–5 years)
- Vision testing
- Oral health care
- Mental health care
- Specialist care
- Alternative health care or treatment
- Unmet medical needs in past year (type of unmet need, reasons for unmet needs)
- Frustration getting services
- Number of ER/ED visits in past year
- Special education/early intervention plan (age at first receipt, current use)
- Special services (e.g., speech or occupational therapy, age at first receipt, current use)
(6) Experience with healthcare providers
- Personal doctor or nurse
- Referral access
- Family centered care
- Healthcare decision-making
- Care coordination
- Communication with other providers
- Transition planning (12–17 years)
(7) Health insurance and providing for health care
- Status and type
- Reason(s) for gaps in coverage
- Adequacy
- Medical/health-related expenses, past 12 months
- Problems paying for medical/health-related expenses, past 12 months
- Employment changes due to child’s health status, past 12 months
- Hours spent coordinating/providing care, average week
(8) Child health behaviors and family functioning
- Sleep
- Sleep position (<12 months)
- Screen time
- Physical activity
- Number of days family members read, tell stories, or sing songs to child, average week (0–5 years)
- Days missed of school due to illness/injury
- School contact regarding problems in school
- Repeated grades
- Extracurricular activities and parental attendance at events
- Difficulty making or keeping friends
- Parent-child ability to talk/share things that are meaningful
- Parental coping
- Emotional support for parents (presence, source of)
- Child care for > 10 h/week
- Employment changed due to child care problems, past 12 months
- Family meal sharing, past week
- Household tobacco use
- Family resiliency. When family faces problems, they…
- Adverse childhood experiences
- Child’s learning (3–5 years)
(9) Neighborhood characteristics
- Presence or absence of amenities and negative attributes
- Social support
- Neighborhood safety
- School safety (6–17 years)
- Presence of an adult (other than those in home) that child can rely on for advice (6–17 years)

Table 2.

Criteria for the revision, addition, and deletion of content from the redesigned National Survey of Children’s Health

1. Modification improves consistency with Federal policy/programs or harmonization with other federal surveys or data collection efforts (e.g., physical activity guidelines; ACS-6 disability measure);
2. Modification reflects changes that have occurred in the field of study or our understanding of a topic/question (e.g., transition, ADHD treatment, reasons for delayed care);
3. Modification helps to reduce the number of questions and conserve space, which is a requirement given the need to combine content from two surveys without significantly expanding the length of the interview;
4. Modification is needed because of the transition to self-administered questionnaires from interviewer-assisted telephone interviews;
5. Modification increases the survey’s focus on Maternal and Child Health Bureau priorities and tie to Title V Maternal and Child Health Services Block Grant program; and
6. Modification reflects the need for data on emergent priorities (e.g., readiness to learn among 3–5 year olds).

Consistent with the approach used previously, the content of the redesigned survey, as well as its design, materials, and plan for weighting, was informed by the input of a technical expert panel (TEP) composed of national leaders in child health and individuals familiar with the history, uses, and purpose of the NSCH. Much of the work to refine and revise selected content was addressed through small workgroups of TEP members, topical experts, and HRSA MCHB staff, convened between 2012 and 2016 to review specific sets of items, such as health care transition, insurance coverage and adequacy, and whether young children are healthy and ready to learn. Selected items on other topics were either sponsored or recommended by federal partners and stakeholders in the field. Additionally, content and format revisions were further informed by cognitive and usability testing and by expert qualitative reviews of the survey layout and content. The final screener and age-specific questionnaires, approved by the federal Office of Management and Budget, are available online at: https://mchb.hrsa.gov/data/national-surveys.

2016 NSCH Implementation

Sample

The 2016 NSCH utilized a sample of 364,153 household addresses drawn from the Census Master Address File (MAF), a complete listing of all known living quarters in the 50 states and the District of Columbia that is used to support the decennial census. The MAF was supplemented with an administrative records-based flag designed to indicate those addresses most likely to be households with children. This “child flag” was developed by the Census Bureau for the NSCH based on multiple sources of administrative data, such as the Census Numident,¹ the 2010 Census Unedited File, the Internal Revenue Service’s 1040 and 1099 files, the Medicare Enrollment Database, the Indian Health Service database, the Selective Service System, and Public and Indian Housing and Tenant Rental Assistance Certification System data from the Department of Housing and Urban Development. Of these, the Census Numident and the IRS 1040 files provided the most significant information. Addresses “flagged” as households with a child or likely to have a child were oversampled; approximately 60% of the sample for the 2016 NSCH was drawn from this stratum, designated as Stratum 1; the remaining addresses (those not expected to have children) were designated as Stratum 2. The proportion of households drawn from Stratum 1 and Stratum 2 was specified based on the relative size of each stratum and the expected efficiency of the “child flag” in each state (U.S. Census Bureau 2017). Similar to previous iterations of the NSCH and NS-CSHCN, state-level samples were allocated with a target of yielding a similar number of completed surveys in each state and the District of Columbia (approximately 1,500 per state, including 300 for CSHCN).
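To make the allocation logic concrete, the sketch below (Python) works through a back-of-the-envelope calculation for a single state under the design described above: roughly 60% of addresses drawn from the child-flag stratum (Stratum 1) and a target of approximately 1,500 completed surveys. The per-stratum yields are hypothetical placeholders, not published figures, and the function name and structure are ours rather than the Census Bureau's actual allocation procedure.

```python
def state_address_allocation(target_completes, yield_stratum1, yield_stratum2,
                             share_stratum1=0.60):
    """Rough sketch: how many sampled addresses a state might need, assuming
    ~60% of addresses come from the child-flag stratum (Stratum 1) and each
    stratum has an expected yield (probability that a sampled address returns
    a completed topical for a household with children). All yields passed in
    below are hypothetical, not published NSCH figures."""
    expected_yield = (share_stratum1 * yield_stratum1
                      + (1 - share_stratum1) * yield_stratum2)
    total_addresses = target_completes / expected_yield
    return {
        "total_addresses": round(total_addresses),
        "stratum1_addresses": round(total_addresses * share_stratum1),
        "stratum2_addresses": round(total_addresses * (1 - share_stratum1)),
    }

# Illustrative run for a single state targeting ~1,500 completed topicals
print(state_address_allocation(1500, yield_stratum1=0.30, yield_stratum2=0.05))
```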

Data Collection

The 2016 NSCH retained a two-phased data collection approach: (1) an initial household screener to assess the presence, demographic characteristics, and special health care need status of any children in the home; and (2) a substantive topical questionnaire to be completed by parents or caregivers of selected children (US Department of Health and Human Services 2016). Because some survey questions are more appropriate for children of certain ages, slightly different age-specific topical questionnaires were used for children ages 0–5, 6–11, and 12–17 years.

While the screener served as a census of all children in the household, only one child was randomly selected within each household to be the subject of the topical questionnaire. The subsampling selection of the subject child varied by mode of administration. For web responders, the subject child was randomly selected after completion of the screener and the respondent was immediately directed to the age-appropriate topical questionnaire; for paper responders, the screener was returned to the Census Bureau, where the subject child was then randomly sampled and an age-appropriate topical questionnaire was mailed back to the household for completion. In order to address programmatic and policy-related goals for the survey, two oversamples were applied in households with multiple children to ensure adequate representation of specific groups: CSHCN and children ages 0–5 years had an 80 and 60% greater probability of selection, respectively, than other children in the household. These oversamples were applied, respectively, to address HRSA MCHB’s long-standing programmatic responsibility for assessing the prevalence and impact of special health care needs and to compensate for potential undercoverage of very young children (ages 0–2 years) for whom administrative records may not yet exist.
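As an illustration of how the within-household selection probabilities described above could be operationalized, the following minimal Python sketch applies relative selection weights of 1.8 for CSHCN and 1.6 for children ages 0–5. The assumption that the two factors multiply when both apply, along with the function and field names, is ours for illustration; the Census Bureau's actual subsampling routine is not reproduced here.

```python
import random

def select_subject_child(children, seed=None):
    """Randomly select one subject child per household, giving CSHCN an 80%
    higher and children ages 0-5 a 60% higher probability of selection than
    other children in the household (weights assumed to multiply if both
    conditions apply)."""
    rng = random.Random(seed)
    weights = []
    for child in children:
        w = 1.0
        if child.get("cshcn"):      # screened positive on the CSHCN Screener
            w *= 1.8                # 80% greater probability of selection
        if child.get("age") <= 5:
            w *= 1.6                # 60% greater probability of selection
        weights.append(w)
    return rng.choices(children, weights=weights, k=1)[0]

# Example household with three children
household = [
    {"id": "A", "age": 3,  "cshcn": False},
    {"id": "B", "age": 9,  "cshcn": True},
    {"id": "C", "age": 15, "cshcn": False},
]
print(select_subject_child(household, seed=1)["id"])
```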

Data collection for the redesigned 2016 NSCH was conducted between June 10, 2016 and February 10, 2017. Based on the results of the national pretest, fielded in 2015, the 2016 NSCH utilized a multi-mode, “web push + mail” design wherein selected households were initially sent a letter inviting them to participate in the survey via the web. Nonresponders received multiple mailings, including a paper instrument that could be returned by mail. Results from the 2015 pretest indicated that telephone follow-up (TFU) as a means of converting nonresponders was neither effective nor efficient; of the 5,287 cases included in the pretest TFU operation, only 324 cases were ultimately resolved as a result of this process, yielding 38 completed topical questionnaires (U.S. Census Bureau 2016). As a result, TFU was not included as part of the 2016 production survey. However, a Telephone Questionnaire Assistance line was available for inbound calls from respondents with questions, those experiencing difficulty with the survey, those needing language assistance, or those wishing to complete the survey by phone, in order to maximize opportunities for participation; approximately 1,500 calls were made to the toll-free line, resulting in 653 completed “interviews” and 37 completed topical questionnaires (U.S. Census Bureau 2016).

The survey was available in both English and Spanish, with initial contact materials including instructions in English on one side of the invitation letter and instructions in Spanish on the other. Based on the experience fielding the 2011–2012 NSCH, wherein 0.2% of all completed interviews were conducted in a language other than English or Spanish (Bramlett et al. 2017), only Spanish translated materials were created for the 2016 NSCH. The estimated average survey length for households with children (the screener and topical combined) was about 35 min.

Experiments

The 2016 NSCH included three experiments designed to identify opportunities to increase the efficiency and effectiveness of the survey administration process. The experiments addressed specific questions related to the use of incentives, the branding of survey materials, and the use of varying data collection procedures based on internet response likelihood. The 2016 production survey sought to identify the relative benefit of a $5 versus a $2 incentive, when compared to a control group receiving no compensation ($0). Results from this test indicated that the $2 incentive significantly increased response over no incentive (by 2.9 percentage points), while response among the $5 incentive group was 5.0 percentage points higher than the control group. However, increases in response must be weighed against the additional costs associated with providing higher incentives, and ultimately the $2 incentive was determined to be most cost effective given limited resources (U.S. Census Bureau 2017).

Sampled households were also divided equally into two mail branding treatment groups to test whether a HRSA MCHB letterhead yielded a higher response rate than the Census Bureau’s standard branding. Though research has shown Census-branded survey materials to be associated with a 5% higher response rate over other branding (Carroll and Zuckerberg 2013), it was possible that given the survey’s focus, respondents may have been more likely to reply to branding that explicitly noted a maternal and child health affiliation. However, results from this second experiment indicated that there was no statistically significant difference in the return rates by branding (36.4 vs. 35.9%, respectively).

The third experiment may best be described as an adaptation of standard data collection procedures based on the predicted likelihood that a household would respond via the internet. Using information from the American Community Survey (ACS), the Census Bureau developed a tract-level assessment of internet response mode choice. Since 2012, ACS respondents have been able to submit survey forms over the internet, and paradata indicate whether a respondent chose the online option. An internet response likelihood measure was created, equal to the weighted proportion of respondents in each Census tract who chose to submit the ACS over the internet when given the option to do so. Based on this tract-level measure from the 2013–2014 ACS survey years, sampled households were ranked by tract. The lowest 30% of households by tract-level internet response were designated as “Low Web,” with the remaining 70% designated as “High Web.” Non-responding households in the former category were sent a paper screener questionnaire earlier in the follow-up process than those in the latter. The initial assumption was that addresses in Low Web tracts would tend to prefer paper relative to addresses in High Web tracts. Results from this third experiment showed that, indeed, High Web addresses generally preferred to respond over the internet, but the relationship with paper was more complicated. Specifically, addresses flagged as High Web were more likely to return a screener (+38%), report children (+15%), and return a topical questionnaire (+98%). Further, Low Web addresses were not only less likely to respond over the internet, they were also less likely to respond at all. Thus, the 2016 NSCH results showed that Low Web status was a better predictor of non-response than of a preference for paper surveys.
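A minimal sketch of how such a tract-level classification could be constructed is shown below, using toy inputs: ACS paradata with response weights and an internet-response indicator by tract, and the NSCH address sample keyed to tract. The column names, toy values, and the 30% cutoff applied at the address level are illustrative assumptions; the Census Bureau's actual variables and procedure may differ.

```python
import pandas as pd

# Hypothetical inputs: ACS paradata (one row per responding household) and the
# sampled NSCH addresses, both carrying a Census tract identifier.
acs = pd.DataFrame({
    "tract":    ["T1", "T1", "T2", "T2", "T3", "T3"],
    "weight":   [1.0, 2.0, 1.5, 1.5, 2.0, 1.0],
    "internet": [1, 0, 1, 1, 0, 0],   # 1 = responded to the ACS online
})
sample = pd.DataFrame({
    "address_id": [101, 102, 103, 104, 105, 106],
    "tract":      ["T1", "T1", "T2", "T2", "T3", "T3"],
})

# Weighted proportion of ACS respondents in each tract who responded online
num = (acs["weight"] * acs["internet"]).groupby(acs["tract"]).sum()
den = acs["weight"].groupby(acs["tract"]).sum()
tract_rate = (num / den).rename("internet_rate")

# Attach each sampled address to its tract's rate; the lowest ~30% of addresses
# by that rate are flagged "Low Web" and would receive the paper screener earlier.
sample = sample.join(tract_rate, on="tract")
cutoff = sample["internet_rate"].quantile(0.30)
sample["web_group"] = ["Low Web" if r <= cutoff else "High Web"
                       for r in sample["internet_rate"]]
print(sample)
```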

2016 NSCH Data Release and Planning for 2017

Data files, along with methodological reports and other data-user resources, were released September 5, 2017 and are available online at https://mchb.hrsa.gov/data/national-surveys. The final sample size was 50,212 children, and the overall weighted response rate was 40.7%² (U.S. Census Bureau 2017), reflecting a marked increase from the previous NSCH administration in 2011–2012 (23.0%). The interview completion rate, defined as the proportion of screened households known to include children that then completed the topical questionnaire, was also higher (69.7%) compared to the telephone interview completion rates in 2011–2012 (54.1% for landline and 41.2% for cell phones) (Centers for Disease Control and Prevention, National Center for Health Statistics, and State and Local Area Integrated Telephone Survey 2013). Both a child-level and a household-level data file are available through the Census Bureau’s website. Reports, user resources, and key estimates are also available through the websites of HRSA MCHB and the Child and Adolescent Health Measurement Initiative’s Data Resource Center (Child and Adolescent Health Measurement Initiative 2017). Consistent with previous NSCH and NS-CSHCN data releases, child-level weights are provided to allow for the calculation of estimates that generalize to the population of noninstitutionalized children nationally and in each state. Post-stratification adjustment, or raking, was also conducted to ensure that sociodemographic subgroups are appropriately represented in these estimates. However, it is important to note that due to changes in both the sampling frame and the mode of administration, comparisons of 2016 estimates to those from previous iterations of either the NSCH (2003, 2007, 2011–2012) or the NS-CSHCN (2001, 2005–2006, 2009–2010) are not appropriate; where content is consistent, comparisons will be possible on an annual basis beginning in 2016.
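To clarify what the post-stratification (raking) step refers to, the sketch below shows a generic iterative proportional fitting routine that adjusts base weights until weighted totals match control totals across several dimensions. It is a textbook illustration under toy inputs only; the dimensions, control totals, and convergence rules actually used for the NSCH weights are documented in the Census Bureau methodology report and are not reproduced here.

```python
import numpy as np

def rake(weights, categories, targets, n_iter=50, tol=1e-8):
    """Minimal raking (iterative proportional fitting) sketch: repeatedly scale
    weights within each category level so weighted totals match the control
    totals for every raking dimension (e.g., child age group, sex)."""
    w = np.asarray(weights, dtype=float).copy()
    for _ in range(n_iter):
        max_change = 0.0
        for dim, target in targets.items():
            codes = categories[dim]
            for level, total in target.items():
                mask = (codes == level)
                current = w[mask].sum()
                if current > 0:
                    factor = total / current
                    w[mask] *= factor
                    max_change = max(max_change, abs(factor - 1.0))
        if max_change < tol:
            break
    return w

# Toy example: 4 respondents raked to hypothetical control totals
base_w = [1.0, 1.0, 1.0, 1.0]
cats = {"age_group": np.array(["0-5", "6-11", "6-11", "12-17"]),
        "sex":       np.array(["M", "F", "M", "F"])}
ctrl = {"age_group": {"0-5": 2.0, "6-11": 1.5, "12-17": 0.5},
        "sex":       {"M": 2.0, "F": 2.0}}
print(rake(base_w, cats, ctrl))
```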

With the shift to an annual administration, the release of 2016 data coincided with the initiation of 2017 data collection. Based on preliminary evaluation of the 2016 survey design and implementation process, three adaptations were planned for the 2017 administration. First, the approach to classifying households based on their predicted likelihood of internet response was refined. Results from the 2016 data collection suggested that differences in the likelihood of response by web versus paper may be multidimensional, reflecting a mix of internet preference (separate from internet access), paper preference, and general response/non-response tendencies. Thus, for 2017, sample geography was still evaluated at the tract level for the likelihood of internet response, given the strong, positive relationship between this flag and observed web response. Response preference for paper was also evaluated and developed as a separate distribution from web preference. Creating two separate mode preference distributions allows the two to be combined and compared with overall expected response propensity for future improvements.

Second, the sampling strata were modified to more accurately identify households with children. For addresses where administrative records did not indicate children present, the sample was divided into two groups: (a) addresses with characteristics associated with a higher likelihood of having children present and (b) addresses that were very unlikely to have children present. Tests against the ACS indicated that this modified sampling frame covered 95% of households with children. Identifying the households most and least likely to contain children allowed limited resources to be targeted to households likely to include eligible children and a larger overall sample of children to be achieved with available funds.

Third, based on the response patterns observed in 2016 with respect to mode and timing, significant changes were made to the production schedule and contact strategies for 2017. These included decreased time between contacts, a reminder postcard in a pressure-sealed envelope with web login information sent 1 week after the initial invitation to participate in the survey, inclusion of a fact sheet detailing the benefits of the survey to respondents, and an earlier shift to paper instruments for non-responders.

Finally, results of the 2016 incentive and branding experiments were also reflected in the 2017 data collection. Based on the absence of a significant advantage to changing to the HRSA MCHB branding, 2017 mailings included Census Bureau branding only and the incentive level was capped at $2 with 90% of the sampled addresses receiving this amount. The remaining 10% of sampled addresses received no incentive in order to monitor the effectiveness of the cash incentive.

Due to the desire to have at least two comparable data points for national estimates and 2 years of consistent data that can be merged for estimates of population subgroups at the state level, limited content changes were entertained for 2017. These included minimal adjustments to the flow of the instrument and minor wording changes deemed necessary to improve clarity. A comprehensive review and assessment of content is currently underway, which will inform the selection of content for the 2018 NSCH.

Conclusion

The NSCH remains an invaluable data source for key measures of child health and well-being as well as attendant health care system, family, and community factors. The 4-year redesign was a thoughtful and ambitious process, which sought to maintain and extend the utility of this important resource while striking a balance between the successes and strengths of the past and the opportunities and innovations possible today. The resulting redesigned survey will yield both representative national and state-level estimates for a range of program-critical and policy-relevant child health indicators while assuring more timely data availability. The 2016 NSCH lays the groundwork for increased efficiencies and innovations in data collection and processing in the future.

Significance.

The National Survey of Children’s Health and the National Survey of Children with Special Health Care Needs have been unique sources of national- and state-level data on key measures of child health since 2001. In 2012–2015, the surveys underwent a significant redesign, shifting from a telephone-based to an address-based sampling frame and changing the mode of administration from an interviewer-assisted survey to a self-administered web or paper-based survey. Content revisions were also undertaken, resulting in a single survey that included the addition of new items on a range of topics. Data from the redesigned 2016 survey are now available.

What is already known about the subject: Data from the National Survey of Children’s Health and the National Survey of Children with Special Health Care Needs have served as primary data sources for the Title V Maternal and Child Health Services Block Grant program, as well as the data source for multiple Healthy People objectives and numerous scientific studies.

What this study adds: This is the first detailed description of the 4-year process to redesign and combine these surveys and describes the attendant decisions regarding methodology and content. The implementation of the redesigned 2016 National Survey of Children’s Health is described in detail, including the results of experiments. Preliminary information for 2016 data users and considerations for the 2017 NSCH are also noted.

Acknowledgements

We extend our deepest gratitude to members of the National Survey of Children’s Health Technical Expert Panel, staff and leadership at the National Center for Education Statistics, and staff and leadership at the Child and Adolescent Health Measurement Initiative.

Footnotes

1

The Numident is based on the collection of all individuals who have been assigned Social Security Numbers. Demographic data from the Numident is updated from federal tax data and various administrative records.

2

The overall weighted response rate is the product of the probability that the address is resolved, that a resolved address completes a screener questionnaire, and that a household identified as having children completes the topical questionnaire. An address is considered to be “resolved” when it has been determined that it is eligible to complete the screener or topical questionnaires.
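In notation of our own choosing, the footnote’s definition of the overall weighted response rate can be written compactly as a product of conditional probabilities:

```latex
\[
\mathrm{RR}_{\mathrm{overall}}
  = P(\text{address resolved})
  \times P(\text{screener completed} \mid \text{address resolved})
  \times P(\text{topical completed} \mid \text{children identified})
\]
```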

Publisher's Disclaimer: The views expressed in this article are those of the authors and do not necessarily reflect the official policies of the U.S. Department of Health and Human Services or the Health Resources and Services Administration or the National Center for Health Statistics, nor does mention of the department or agency names imply endorsement by the U.S. government. Dr. Stephen Blumberg co-authored this paper in his role as chair of the Technical Expert Panel. Further, the views expressed on statistical, methodological, technical, or operational issues are those of the author(s) and not necessarily those of the U.S. Census Bureau.

References

  1. AAPOR Cell Phone Task Force. (2010). New considerations for survey researchers when planning and conducting RDD telephone surveys in the U.S. with respondents reached via cell phone numbers. American Association for Public Opinion Research.
  2. Bethell CD, et al. (2014). Adverse childhood experiences: Assessing the impact on health and school engagement and the mitigating role of resilience. Health Affairs, 33(12), 2106–2115.
  3. Blumberg SJ, et al. (2003). Design and operation of the National Survey of Children with Special Health Care Needs, 2001. Vital and Health Statistics. Centers for Disease Control and Prevention, National Center for Health Statistics.
  4. Blumberg SJ, et al. (2005). Design and operation of the National Survey of Children’s Health, 2003. Vital and Health Statistics. National Center for Health Statistics.
  5. Blumberg SJ, & Luke JV (2016). Wireless substitution: Early release of estimates from the National Health Interview Survey, January–June 2016.
  6. Bramlett MD, et al. (2017). Design and operation of the National Survey of Children’s Health, 2011–2012. Vital and Health Statistics, 1(59), 1–256.
  7. Brick JM, & Williams D (2013). Explaining rising nonresponse rates in cross-sectional surveys. Annals of the American Academy of Political and Social Science, 645, 36–60.
  8. Carroll SH, & Zuckerberg A (2013). Do names matter? Experiments comparing different branding and levels of personally identifiable information in a mail questionnaire. Proceedings of the 2013 Federal Committee on Statistical Methodology (FCSM) Research Conference.
  9. Centers for Disease Control and Prevention, National Center for Health Statistics, and State and Local Area Integrated Telephone Survey. (2013). 2011–2012 National Survey of Children’s Health frequently asked questions. Retrieved March 14, 2017 from ftp://ftp.cdc.gov/pub/Health_Statistics/NCHS/slaits/nsch_2011_2012/01_Frequently_asked_questions/NSCH_2011_2012_FAQs.pdf.
  10. Child and Adolescent Health Measurement Initiative. (2017). Data Resource Center for Child and Adolescent Health. Retrieved May 12, 2017 from http://www.childhealthdata.org/.
  11. Cooley WC, & Sagerman PJ (2011). Supporting the health care transition from adolescence to adulthood in the medical home. Pediatrics, 128(1), 182–200.
  12. Czajka JL, & Beyler A (2016). Declining response rates in federal surveys: Trends and implications. Washington, DC: Mathematica Policy Research.
  13. Hadland SE, & Long WE (2014). A systematic review of the medical home for children without special health care needs. Maternal and Child Health Journal, 18(4), 891–898.
  14. Keeter S, et al. (2017). What low response rates mean for telephone surveys. Washington, DC: Pew Research Center.
  15. Kogan MD, et al. (2009). Prevalence of parent-reported diagnosis of autism spectrum disorder among children in the US, 2007. Pediatrics, 124(5), 1395–1403.
  16. Kogan MD, et al. (2010). Underinsurance among children in the United States. The New England Journal of Medicine, 363(9), 841–851.
  17. Tourangeau R, & Plewes TJ (Eds.). (2013). Nonresponse in social science surveys: A research agenda. National Research Council, Panel on a Research Agenda for the Future of Social Science Data Collection, Committee on National Statistics, Division of Behavioral and Social Sciences and Education. Washington, DC.
  18. U.S. Census Bureau. (2016). National Survey of Children’s Health, Pretest 2015: Methodology report.
  19. U.S. Census Bureau. (2017). 2016 National Survey of Children’s Health: Methodology report. Forthcoming.
  20. U.S. Department of Health and Human Services, Centers for Disease Control and Prevention, and National Center for Health Statistics. (2015). State and Local Area Integrated Telephone Survey (SLAITS). Retrieved August 16, 2017 from https://www.cdc.gov/nchs/slaits/index.htm.
  21. US Department of Health and Human Services. (2011). 2009–2010 National Survey of Children with Special Health Care Needs frequently asked questions. Retrieved April 7, 2017 from https://www.cdc.gov/nchs/data/slaits/nscshcnfaqs2009.pdf.
  22. US Department of Health and Human Services. (2014). Healthy People 2020. Retrieved March 13, 2017 from https://www.healthypeople.gov/.
  23. US Department of Health and Human Services. (2016). National Survey of Children’s Health. Retrieved March 31, 2017 from https://mchb.hrsa.gov/data/national-surveys.
  24. US Department of Health and Human Services. (2017). Title V Maternal and Child Health Services Block Grant Program. Retrieved April 7, 2017 from https://mchb.hrsa.gov/maternal-child-health-initiatives/title-v-maternal-and-child-health-services-block-grant-program.
  25. US Department of Health and Human Services, Centers for Disease Control and Prevention. (2017). State and Local Area Integrated Telephone Survey: Publications and presentations using SLAITS data. Retrieved March 13, 2017 from https://www.cdc.gov/nchs/slaits/slaits_products.htm.
  26. van Dyck P, et al. (2004). The National Survey of Children’s Health: A new data resource. Maternal and Child Health Journal, 8(3), 183–188.
  27. van Dyck PC, et al. (2002). The national survey of children with special health care needs. Ambulatory Pediatrics, 2(1), 29–37.
  28. Visser SN, et al. (2014). Trends in the parent-report of health care provider-diagnosed and medicated attention-deficit/hyperactivity disorder: United States, 2003–2011. Journal of the American Academy of Child and Adolescent Psychiatry, 53(1), 34–46.e2.
