Abstract
Objectives. Telephone survey data are widely used to describe population health, but some fear that people with disabilities cannot participate. We tested the hypotheses that a telephone survey would underrepresent adults with disabilities, and that the adults with disabilities who responded would report lower prevalences of sensory, mental, self-care, and multiple limitations than those observed in people with disabilities in the general population.
Methods. We compared characteristics of adults with disabilities identified by the 2001 Washington State Behavioral Risk Factor Surveillance Survey (BRFSS) to Washington adults with disabilities in the Census 2000 Supplementary Survey (C2SS), to 2 BRFSS Disability Supplements, and to the Washington State Population Survey. All except the C2SS are telephone surveys.
Results. Contrary to expectations, post hoc analyses of all telephone surveys found significantly higher prevalence of disability in the Washington adult population than did the C2SS. The hypothesis of lower prevalence of sensory, mental, and self-care limitation in telephone disability samples was supported in only 2 of 11 instances in which a disability sample was asked about 1 of these limitations. Findings were not explained by differences in disability definition or type of informant.
Conclusions. These results suggest that population telephone surveys do not underrepresent adults with disabilities. The counterintuitive finding of their higher survey participation raises further questions.
Healthy People 2010 directs public health agencies to identify and reduce health disparities between people with and without disabilities.1 Telephone surveys such as the Behavioral Risk Factor Surveillance Survey (BRFSS) are the most common way to collect population health data on people with disabilities to track these disparities. However, there are questions about the degree to which people with disabilities, especially those with limitations in hearing, speaking, and cognition, have difficulty participating in telephone surveys.2–4 Underrepresenting persons with disabilities might produce substantial bias in BRFSS-based estimates of health behaviors and affect conclusions drawn from them. To evaluate the validity of this concern, we compared demographic and disability characteristics of adults with disabilities identified by the 2001 Washington State BRFSS to Washington State adults in the Census 2000 Supplementary Survey (C2SS), and to statewide telephone survey disability samples from 2 BRFSS Disability Supplements (DSs) and the Washington State Population Survey (WSPS).
This analysis tested the hypotheses that (1) a telephone survey would underrepresent people with disabilities in Washington State, and (2) the people with disabilities who responded to a telephone survey would report lower prevalence of sensory, mental, self-care, and multiple limitations than those in the general population. We expected differentials for 2 reasons. First, socioeconomic characteristics are known to affect response rates to telephone surveys. Higher-income and better-educated persons (and households) are more likely to complete surveys.5,6 Second, people with disabilities might be less likely to respond for functional reasons. Those with cognitive, hearing, and self-care limitations could be particularly affected because of difficulty in answering the telephone and in completing interviews.2–4,7
METHODS
Census data on adults with disabilities most closely approach a full count of the population and avoid bias associated with nonresponse. However, a comparison between the most accurate census disability counts (the C2SS) and BRFSS disability samples requires consideration of 3 methodological differences between them, summarized in Table 1. (1) The 2 surveys used different disability definitions. (2) The BRFSS was a telephone survey; the C2SS, like the full census, used multimode data collection, which was reflected in different response rates. (3) Although both surveys sampled households, the C2SS asked a household informant for data on all its members and the BRFSS asked a single individual to self-report. No single data source resolved these differences. Comparisons of disability data from a number of different Washington population surveys, described later, provided a way to estimate the size and direction of their effects.
TABLE 1—
|  | C2SS | BRFSS | Disability Supplement | Disability Supplement | State Population Survey |
| --- | --- | --- | --- | --- | --- |
| Year | 2000 | 2001 | 2001 | 2003 | 2000 |
| Disability definition^a | census | BRFSS | BRFSS | BRFSS and census | census |
| Data collection mode | mail/phone/interview | RDD phone | RDD phone | RDD phone | RDD phone |
| Response rate, % | 95 | 48 | 43 | 44 | 48 |
| Respondent | household head | individual | individual | individual | household head |
| Households in sample, No. | 8127 | 4207 | 2117 | 2110 | 6726 |
| Persons aged ≥21 y in sample, No. | approximately 14 400 | 4029 | 2023 | 2007 | 12 138 |
Note. C2SS = Census 2000 Supplementary Survey; BRFSS = Behavioral Risk Factor Surveillance Survey; RDD = random-digit-dialed.
^a Census definition: 1 or more of 6 limitations. BRFSS definition: limited in any activities or use special equipment.
All disability prevalence comparisons used 1-tailed t tests at the .05 α level for differences in means and proportions for independent samples with heterogeneous variances. Comparisons for demographic characteristics used 2-tailed tests (α = .05). Standard errors for C2SS data were calculated manually.8,9 Standard errors for weighted BRFSS and other data were calculated by SUDAAN 7.5 (Research Triangle Institute, Research Triangle Park, NC).
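As an illustration of how such a comparison works, the sketch below applies a difference-of-proportions test to two published prevalence estimates using their standard errors; it treats the large-sample t statistic as approximately normal, and the function name and example values (taken from Table 2) are illustrative only, not the actual analysis code.

```python
from math import sqrt
from scipy.stats import norm  # large-sample normal approximation to the t test

def compare_prevalences(p1, se1, p2, se2):
    """Compare two independent, weighted prevalence estimates (in percent)
    using their published standard errors (heterogeneous variances):
    z = (p1 - p2) / sqrt(se1^2 + se2^2)."""
    z = (p1 - p2) / sqrt(se1 ** 2 + se2 ** 2)
    one_tailed_p = norm.sf(abs(z))
    return z, one_tailed_p, 2 * one_tailed_p

# Example with values from Table 2: census-defined disability prevalence in
# the 2003 Disability Supplement (29.3%, SE 1.11) vs the C2SS (18.2%, SE 0.04).
z, p_one, p_two = compare_prevalences(29.3, 1.11, 18.2, 0.04)
print(f"z = {z:.1f}, one-tailed P = {p_one:.2g}")
```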
Census Data Source
Because of data collection errors in interviews, the 2000 census long-form disability data overcounted adults with limitations in working and in going out and therefore overcounted adults with overall disabilities.10 Estimates from the C2SS are considered to be more reliable counts11 and were used in the analyses reported here. The C2SS is part of the transition from the decennial census long form, a detailed 1-in-6 sample of households, to ongoing data collection that gives more current estimates of social characteristics.11 Participation in the C2SS, like participation in the census, is required by law. Like the census, the C2SS asked the household head or another knowledgeable person questions about each individual in the 8127 sampled Washington households.12 Unlike the census, the C2SS did not include persons in institutions. Fifty-six percent of households returned mailed questionnaires, and those that did not respond within 1 month were telephoned for interviews (7% of responses). Census interviewers then visited the 32% of households that could not be reached by telephone, producing a response rate of 95.4%. The census identified a person as having a disability if the informant said he or she had 1 or more of the following conditions (labeled with census names).
16. Do you (does Person X) have any of the following long-lasting conditions:
Blindness, deafness, or a severe vision or hearing impairment? (sensory)
A condition that substantially limits 1 or more basic physical activities such as walking, climbing stairs, reaching, lifting, or carrying? (physical)
17. Because of a physical, mental, or emotional condition lasting 6 months or more, does this person have any difficulty in doing any of the following activities:
Learning, remembering, or concentrating? (mental)
Dressing, bathing, or getting around inside the home? (self-care)
(Answer if this person is 16 YEARS OLD OR OLDER.) Going outside the home alone to shop or visit a doctor’s office? (going out)
(Answer if this person is 16 YEARS OLD OR OLDER) Working at a job or business? (work)
C2SS data are available in tables and in public use microdata samples, which were used when tabular data were not available. Estimates from the C2SS had larger errors than the census long-form disability data because the C2SS sample was smaller and its public use microdata sample was a subset of the C2SS.13 However, these data are the most complete population-level estimates available, with very low sampling bias from unit nonresponse and lower rates of item nonresponse than the census.13,14 It is response bias with which this article is most concerned.
Telephone Survey Data Sources
The Washington State BRFSS is an ongoing, random-digit-dialed telephone survey of the civilian, noninstitutionalized population aged 18 years and older.15 The survey tracks the prevalence of key health- and safety-related behaviors and characteristics of the state population. In a household reached by telephone, a single male or female informant is randomly selected to report on his or her own behaviors. In 2001, the Washington State BRFSS interviewed 4207 persons aged 18 years and older with a response rate of 48%. Data were weighted to reflect the age and gender distribution of the state’s population during the survey year. The survey identified as people with disabilities anyone who answered affirmatively to 1 or both of 2 questions previously used in the National Health Interview Survey:
Are you limited in any way in any activities because of physical, mental, or emotional problems?
Do you now have any health problem that requires you to use special equipment, such as a cane, a wheelchair, a special bed, or a special telephone? Include occasional use or use in certain circumstances.
In 2001 and 2003, the Centers for Disease Control and Prevention provided funds for the Washington State BRFSS to use DSs to collect additional data on people with and without disabilities. In addition to the BRFSS disability definition, the 2001 DS included the census mental and self-care limitation questions, and the 2003 DS included all 6 census items. The DSs were conducted by the Washington State BRFSS contractor using the BRFSS sampling frame, procedures, and demographic questions.
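To make the two disability definitions concrete, the sketch below shows how a single respondent could be classified under each; the field names are hypothetical stand-ins for the survey items quoted above, not actual BRFSS, DS, or census variable names.

```python
from dataclasses import dataclass

@dataclass
class Respondent:
    # Census items (questions 16 and 17); True means the limitation was reported
    sensory: bool = False
    physical: bool = False
    mental: bool = False
    self_care: bool = False
    going_out: bool = False
    work: bool = False
    # BRFSS items
    activity_limited: bool = False
    uses_special_equipment: bool = False

def census_disability(r: Respondent) -> bool:
    """Census definition: 1 or more of the 6 limitations."""
    return any([r.sensory, r.physical, r.mental,
                r.self_care, r.going_out, r.work])

def brfss_disability(r: Respondent) -> bool:
    """BRFSS definition: limited in any activities or uses special equipment."""
    return r.activity_limited or r.uses_special_equipment

# A respondent reporting only special-equipment use meets the BRFSS definition
# but not the census definition, illustrating why the two definitions identify
# overlapping but not identical groups.
print(brfss_disability(Respondent(uses_special_equipment=True)))   # True
print(census_disability(Respondent(uses_special_equipment=True)))  # False
```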
The WSPS is a biennial random-digit-dialed telephone survey of Washington households modeled on the Census Bureau's Current Population Survey and conducted by the state's Office of Financial Management. The WSPS uses the census household informant method to ask about all household members and the census questions to determine disability status. It differs from the C2SS only in being a voluntary telephone survey. The 2000 WSPS intentionally oversampled non-Whites and was offered in Spanish. Interviews from 6726 households gave data on 17 967 individuals aged from birth to 100 years. The sample was weighted using 2000 census data on age, gender, and race to represent the population of the state.
RESULTS
Contrary to the initial hypothesis that people with disabilities would be underrepresented in telephone surveys, post hoc analyses of responses found significantly higher prevalence of disability in the Washington State population aged 21 years and older than did the C2SS (Table 2). This held true regardless of disability definition or informant used. The 2001 BRFSS did not ask about census limitations. The WSPS and 2003 DS, which did, found higher population rates of census disability and of physical, mental, sensory, and work limitation than the C2SS, and equal rates of limitation in self-care and going out alone.
TABLE 2—
| Characteristic^a | Census or C2SS | BRFSS | Disability Supplement | Disability Supplement | State Population Survey |
| --- | --- | --- | --- | --- | --- |
| Year | 2000 | 2001 | 2001 | 2003 | 2000 |
| BRFSS-defined disability prevalence, % (SE) | NA | 23.1 (0.72)* | 23.9 (1.01)* | 25.7 (1.06)* | NA |
| Census-defined disability prevalence, % (SE) | 18.2 (0.04)^b | NA | NA | 29.3 (1.11)* | 22.0 (0.53)* |
| Prevalence of census-defined limitations, % (SE) |  |  |  |  |  |
| Physical limitation | 10.5 (0.28) | NA | NA | 18.9 (0.92)* | 13.7 (0.44)* |
| Work limitation | 6.8 (0.29) | NA | NA | 11.5 (0.85)* | 8.5 (0.37)* |
| Mental limitation | 5.4 (0.22) | NA | 10.3 (0.74)* | 10.2 (0.76)* | 6.8 (0.33)* |
| Difficulty going out alone | 5.6 (0.21) | NA | NA | 6.8 (0.61) | 5.0 (0.30) |
| Sensory limitation | 5.5 (0.19) | NA | NA | 8.4 (0.68)* | 6.6 (0.32)* |
| Self-care limitation | 2.7 (0.30) | NA | 2.0 (0.35) | 4.9 (0.51)* | 3.2 (0.23) |
| Education (age ≥25 y), % (SE) |  |  |  |  |  |
| < High school education | 12.9 (0.10)^c | 6.3 (0.48)* | 7.2 (0.67)* | 7.4 (0.70)* | 7.7 (0.34)* |
| > High school education | 62.2 (0.44)^c | 68.8 (0.85)* | 69.2 (1.18)* | 65.9 (1.69) | 62.0 (0.64) |
| Annual household income, $, % (SE) |  |  |  |  |  |
| < 15 000 | 13.1 (0.05)^d | 7.1 (0.49)* | 7.6 (0.67)* | 6.3 (0.72)* | 6.0 (0.29)* |
| > 75 000 | 24.2 (0.06) | 20.6 (0.70)* | 23.0 (1.16) | 19.8 (1.03)* | 31.4 (0.61)* |
| Employed (aged 21–64 y) | 72.6 (0.05)^e | 75.7 (0.86) | 74.3 (1.20) | 71.0 (1.57) | 79.2 (0.57)* |
Note. NA = question not asked on this survey; BRFSS = Behavioral Risk Factor Surveillance Survey; C2SS = Census 2000 Supplementary Survey.
^a Age ≥21 years except where noted.
^b Washington State disability and limitation prevalence data are from C2SS online tabulated data at http://factfinder.census.gov.
^c Census education data are from Census 2000 Summary File 3 (SF3), Table QT-P20, Washington State.
^d Census income data are for families, not households; from Census 2000 Summary File 3 (SF3), Table DP-3, Washington State.
^e Census employment rate for persons aged 21 to 64 years is from Census 2000 Summary File 3 (SF3), Table QT-P24, Washington State.
* P < .05; post hoc t test for difference of proportions with C2SS/census data is significant.
The second hypothesis predicted that the disability samples from Washington telephone surveys would have lower prevalence of mental, sensory, and self-care limitation than the C2SS disability sample. Only 1 of the 4 telephone disability samples in which people were asked about self-care limitation and 1 of the 3 in which people were asked about sensory limitation reported a lower prevalence than that in the C2SS. Post hoc analyses showed that in all 4 telephone disability samples, prevalence of mental limitation was higher than or not significantly different from C2SS rates (Table 3). In the 2 DS disability samples that included data on number of census limitations, contrary to the second hypothesis, the prevalence of multiple limitations was not different from that in the C2SS (Table 3). A post hoc analysis found that in the WSPS disability sample the prevalence of multiple limitations was actually higher.
TABLE 3—
| Characteristic^a | C2SS | BRFSS | DS | DS | DS | WSPS |
| --- | --- | --- | --- | --- | --- | --- |
| Year | 2000 | 2001 | 2001 | 2003 | 2003 | 2000 |
| Disability definition^b | Census | BRFSS | BRFSS | BRFSS | Census | Census |
| Census-defined limitations, % (SE) |  |  |  |  |  |  |
| Mental limitation | 29.6 (1.31) | NA | 27.5 (2.19) | 25.7 (2.13) | 36.0 (2.15)* | 32.7 (1.34) |
| Sensory limitation | 29.7 (1.31) | NA | NA | 18.0 (1.84)* | 27.6 (1.96) | 31.3 (1.31) |
| Self-care limitation | 14.8 (1.01) | NA | 7.6 (1.33)* | 18.4 (1.82) | 16.5 (1.59) | 15.0 (1.04) |
| No. limitations, % (SE) |  |  |  |  |  |  |
| 1 | 50.9 (1.43) | NA | NA | 26.9 (1.86)* | 49.5 (3.79) | 45.2 (1.01)* |
| ≥4 | 12.8 (0.96) | NA | NA | 15.9 (1.53) | 14.6 (1.89) | 16.3 (0.74)* |
| Age and gender, % (SE) |  |  |  |  |  |  |
| ≥65 y | 35.5 (1.37) | 28.4 (1.56)* | 30.1 (2.10) | 28.4 (1.96) | 29.1 (1.84)* | 35.2 (1.34) |
| Male total | 48.1 (1.45) | 46.1 (1.77) | 43.6 (2.44) | 43.0 (2.38) | 42.3 (2.23) | 46.6 (1.39) |
| Male aged ≥65 y | 32.2 (1.80) | 23.8 (2.34)* | 25.9 (3.20) | 26.6 (3.11) | 25.6 (2.89) | 31.5 (1.80) |
| Female aged ≥65 y | 39.8 (1.50) | 32.3 (2.09)* | 33.3 (2.75) | 29.7 (2.50)* | 31.7 (2.37)* | 38.2 (1.94) |
| Race/ethnicity, % (SE) |  |  |  |  |  |  |
| White | 82.4 (1.08) | 86.3 (1.29) | 92.6 (1.39)* | 93.4 (1.18)* | 87.9 (1.27)* | 88.1 (1.22)* |
| Hispanic | 4.6 (0.44) | 2.7 (0.61) | 5.4 (1.23) | 3.2 (0.95) | 4.3 (1.07) | 4.6 (0.52) |
| Education (aged ≥25 y), % (SE) |  |  |  |  |  |  |
| < High school education | 12.9 (0.10) | 9.3 (1.06)* | 9.6 (1.52)* | 12.3 (2.02) | 15.7 (1.79) | 15.5 (1.05)* |
| > High school education | 62.2 (0.44) | 63.8 (1.74) | 62.8 (2.37) | 60.3 (3.06) | 55.9 (2.71)* | 47.3 (1.41) |
| Annual household income, $, % (SE) |  |  |  |  |  |  |
| < 15 000 | 25.7 (1.82) | 13.4 (1.30)* | 14.5 (1.75)* | 17.8 (2.06) | 17.0 (1.81)* | 13.6 (0.93)* |
| > 75 000 | 16.6 (1.55) | 15.2 (1.27) | 13.7 (3.78) | 11.3 (1.69) | 10.9 (1.48) | 18.6 (1.17) |
| Employed (aged 21–64 y) | 49.9 (1.94) | 63.5 (2.03)* | 61.3 (2.88)* | 45.9 (3.23) | 47.3 (3.04) | 56.7 (1.71) |
Note. NA = question not asked on this survey; BRFSS = Behavioral Risk Factor Surveillance Survey; C2SS = Census 2000 Supplementary Survey; DS = Disability Supplement; WSPS = Washington State Population Survey.
^a Age ≥21 years except where noted.
^b Census definition: 1 or more of 6 limitations. BRFSS definition: limited in any activities or use special equipment.
* P < .05; post hoc t test for difference of proportions with C2SS/census data is significant.
The adults with disabilities identified by the 2001 BRFSS differed demographically from people with disabilities described in the C2SS (Table 3).
Compared with those in the C2SS disability sample, the 2001 BRFSS adults with disabilities included fewer adults aged 65 years or older and more persons aged 21 to 64 years overall and for men and women separately. Respondents with disabilities aged 25 years or older in the 2001 BRFSS were less likely to report less than a high school education than those in the C2SS, and those aged 21 to 64 years were more likely to be employed. BRFSS respondents with disabilities were less likely to report household incomes below $15 000 than were those in the C2SS disability sample. The 2001 BRFSS and DS and the C2SS disability samples did not differ on proportions of people with high incomes (more than $75 000), on ethnicity, on proportion with postsecondary education, or on gender. Similar results were observed in the disability samples from the other telephone surveys, suggesting that findings were not attributable to vagaries of sampling in the 2001 BRFSS.
DISCUSSION
Contrary to expectation, the 2001 BRFSS found higher rates of disability in the Washington State adult population than did the C2SS, and equal degrees of limitation among those with disabilities. To understand why survey respondents might report more disability than those in the general population, it was necessary to consider whether the findings were attributable to the methodological differences in data collection. Although direct estimation of these methodological effects was not possible, comparison with the other surveys’ disability samples permitted indirect estimates.
Methodological Differences Among Data Sources
Disability definitions did not account for the higher prevalence of disability in the BRFSS. Regardless of the definition used, all telephone surveys found higher disability prevalence than did the C2SS. The 2003 DS used both BRFSS and census disability questions, so respondents could be classified as having “BRFSS-defined disability” or “census-defined disability.” The prevalence of census disability (29.3%) in the 2003 DS was not significantly different from the prevalence of BRFSS disability (25.7%, Table 2). Seventy-eight percent of persons in the 2003 DS BRFSS disability group also met the census disability criteria. This suggests that although the BRFSS disability definition included somewhat different people, as one would expect from its wording, it did not produce the higher prevalence of population disability observed in the 2001 BRFSS.
It also seems unlikely that the higher 2001 BRFSS disability prevalence was attributable to the use of self-informant rather than household or proxy informant. The WSPS, using a household informant and the census disability definition, also found a higher population disability prevalence than the C2SS.
The third methodological difference between the C2SS and the surveys was response rate. Given the modest response rates of these telephone surveys (43% to 48%) relative to the C2SS, nonresponse bias might explain the higher survey disability rates. The population samples for the 2001 Washington BRFSS and other surveys were weighted to match the census age and gender distribution of the 2000 population, and the WSPS was also race-adjusted, so the samples were not biased on these characteristics. However, relative to census population data from the 2000 long form (Summary File 3), the survey samples were nonrepresentative in education and income. All 4 telephone surveys included fewer people with less than a high school education and fewer with very low family income. In addition, respondents to 2 of the 4 surveys, including the 2001 BRFSS, had significantly higher prevalence of postsecondary education than the census predicted. The WSPS also estimated a higher employment rate (those aged 21 to 64 years) than the census, and a lower population poverty rate (the BRFSS income categories did not permit computation of the poverty rate). The 2001 BRFSS and the other telephone surveys therefore underrepresented demographic groups in the Washington population that typically experience higher prevalence of disability. This bias might have been expected to produce a low survey disability prevalence, but higher estimates resulted.
Poststratification weighting based on demographics, used in all these samples, is unlikely to improve the representativeness of data on attitudinal or behavioral measures unless these are highly correlated with the adjustment variables.16 Disability prevalence differs by age, gender, education, income, and employment but in combination these variables explain only a modest proportion of variation in disability.17 The loose association between disability and demographics may explain why the survey samples with more education and higher income still produced higher estimates of disability. Survey nonresponse is largely the result of 2 broad problems: some individuals in a sample are relatively inaccessible to the surveyor and some are unwilling to cooperate.4,6 If the nonrespondents are distinctively different from respondents, bias can result.18 The C2SS found that households with and without adults with disabilities were equally likely to be without a telephone (the prevalence of noncoverage was 1.1%), so this did not account for response differences. Perhaps the survey samples included more people with disabilities than the C2SS because people without disabilities were harder to reach, or because people with disabilities were disproportionately more likely to participate.4 Were people without disabilities more likely to filter calls with answering machines, or to use cell phones, which were not included in random-digit-dialed number banks? There are no data that address these questions for telephone surveys.
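As a rough sketch of what poststratification weighting does, and why it corrects bias only on traits tied to the adjustment cells, the example below scales hypothetical survey records to census age-by-gender totals; the column names and data frames are illustrative, not the actual BRFSS weighting procedure.

```python
import pandas as pd

def poststratify(sample: pd.DataFrame, census_totals: pd.DataFrame) -> pd.DataFrame:
    """Attach a weight to each respondent so the weighted sample reproduces
    census counts within each age-group-by-gender cell (columns 'age_group',
    'gender', and 'pop_count' are illustrative names)."""
    cell_n = (sample.groupby(["age_group", "gender"])
                    .size()
                    .rename("sample_n")
                    .reset_index())
    cells = census_totals.merge(cell_n, on=["age_group", "gender"])
    cells["weight"] = cells["pop_count"] / cells["sample_n"]
    return sample.merge(cells[["age_group", "gender", "weight"]],
                        on=["age_group", "gender"], how="left")

# The weighted estimate of any outcome (for example, disability prevalence)
# is then sum(weight * outcome) / sum(weight). Because disability is only
# loosely correlated with age and gender, this adjustment cannot remove
# selection on disability itself.
```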
Other Possible Explanations of Findings
Given that the obvious methodological differences did not explain the higher prevalence of disability found in the BRFSS and other surveys, what might? Geography? The C2SS sampled from a subset of Washington counties, which may have had lower disability prevalence and given an artificially low state disability estimate. However, a post hoc analysis of the C2SS microsample data, analyzed by county, did not show lower disability prevalence in the individual counties (data not shown). Did survey respondents with disabilities overreport their limitations? This explanation cannot be ruled out, although as noted earlier, higher disability prevalence was found in surveys using both self-report and household informant report of limitation, and with different disability questions and variable question order, so higher prevalence was probably not solely a function of self-report or order effects in surveys. Were the differences attributable to seasonal effects? All the data, including the C2SS, were collected continuously throughout the year, so this was not a plausible explanation.
The findings of overestimation of disability and equal representation of multiple limitations are puzzling and defy easy explanation. However, they do answer the original questions posed in this article. There was no evidence that telephone surveys differentially underrepresent persons with disabilities, as the BRFSS and census defined disability. Judging from DS-estimated prevalence of mental, sensory, self-care, and multiple limitations, the BRFSS sampling methodology and disability definition do not reduce participation by persons with these conditions in the disability sample.
Limitations of the Study
Caution is indicated in interpreting these results. Although the C2SS is the best standard currently available, it too is a sample, albeit a mandatory one, with multiple modes of response and a 95% response rate. In this analysis, use of the C2SS rather than the full, but flawed, long-form census 1-in-6 data resulted in less precise estimates of disability prevalence and demographic characteristics of adults with disabilities in the population. However, this should tend to reduce rather than inflate differences between the census and BRFSS data. Comparisons of census short-form, long-form, and C2SS estimates of accurately counted demographic and limitation variables (physical, mental, sensory, self-care, and work) confirmed the observed differences between the C2SS and the BRFSS and other Washington telephone survey samples, suggesting that the loss of precision did not obscure important differences. The C2SS was administered in 2000, and the BRFSS, DS, and WSPS data were collected between 2000 and 2003, which might have reflected increases in population disability prevalence that the C2SS could not capture. However, prevalence of the state’s census-defined disability estimated by the 2000 and 2001 Census Supplementary Surveys and the 2002 American Community Survey showed no change during this period, despite increases in the state population,19 so time differences were probably not a factor in the findings.
Disability, unlike gender or age, is a variable state that is difficult to measure precisely with any simple set of questions,20 and the census definition has been criticized as insufficiently sensitive.21 Therefore, the group identified as having a “disability” in either census or BRFSS terms may not have included the full range of people believed to have a disability. In addition, the proxy reporting of disability used in the census and C2SS has been shown to produce biased assessments depending on the proxy’s relationship to the subject.7 We do not know whether this proxy bias operated differently when the assessment was made on a written form or face-to-face interview in the C2SS rather than in a telephone interview for the WSPS, in which a rapid response was required. It is possible that C2SS proxy reporting or face-to-face interviews resulted in an undercount of persons with disabilities, which would have made the BRFSS and other estimates appear higher. However, Andresen’s work on proxy bias in some census measures suggested that bias operated in the opposite direction, to inflate rather than reduce disability counts.21 Face-to-face interviews were conducted in 32% of households in the C2SS and might have led informants to underreport their limitations.7 It was impossible to affirm whether this happened, but because the prevalence of disability in the household’s primary respondent (20.8%) was not lower than the C2SS state prevalence estimate, interview bias seems a somewhat unlikely cause of an undercount.
There are other limitations to this study. The data described only Washington State, which has a predominantly White, well-educated population, making it hard to evaluate racial representation and differences in education in the population and disability samples. The state had relatively low BRFSS, DS, and WSPS response rates, and a higher response rate might somehow change the relation between survey and census disability estimates. Lastly and perhaps most important, the C2SS, BRFSS, and DS surveys shared relatively few variables, limiting comparisons to explain the counterintuitive finding of higher disability prevalence in telephone surveys.
Conclusion
The question driving this analysis was whether policymakers can rely on BRFSS and other survey data to describe the health of people with disabilities and monitor their progress toward full inclusion and other goals of the Americans with Disabilities Act of 1990. The answer from the Washington State data was that people with disabilities participated in telephone surveys at rates that were higher than expected. The BRFSS surveys found significant differences in health, health behaviors, and well-being between adults with and without disabilities, and this study suggests that these disparities are not explained by survey methodology. Public health practitioners should undertake efforts to understand and reduce them.
Acknowledgments
This work was supported by a cooperative agreement (U59/CCU006992–10) with the Centers for Disease Control and Prevention.
Note. The contents of this article are solely the responsibility of the authors and do not necessarily represent the official views of the Centers for Disease Control and Prevention.
Human Participant Protection
The Washington State survey data reported in this article were collected with approval of the Washington State Department of Health institutional review board.
Contributors
S. Kinne originated the study, conducted the analyses, and wrote the article. T. Topolski assisted with analysis and writing.
References
- 1. Healthy People 2010: Understanding and Improving Health. Washington, DC: US Department of Health and Human Services; 2000:6–3.
- 2. Parsons JA, Baum S, Johnson TJ. Inclusion of Disabled Populations in Social Surveys: Review and Recommendations. Chicago, Ill: National Center for Health Statistics; Survey Research Laboratory, University of Illinois at Chicago; 2000.
- 3. Kirchner C. Improving research by assuring access. Footnotes. 1998;26(7):7.
- 4. Hendershot G, Colpe L, Hunt P. Persons with activity limitations: nonresponse and proxy response in the US National Health Interview Survey on Disability. In: Barnartt S, Altman B, Larson S, Hendershot G, eds. Using Survey Data to Study Disability: Results From the NHIS-D. Oxford, England: Elsevier; 2003:41–51. Research in Social Science and Disability; Vol 3.
- 5. Piazza T. Meeting the challenge of answering machines. Public Opinion Q. 1993;57:219–231.
- 6. Keeter S, Miller C, Kohut A, Groves RM, Presser S. Consequences of reducing nonresponse in a national telephone survey. Public Opinion Q. 2000;64:125–148.
- 7. Meyers AR, Andresen EM. Enabling our instruments: accommodation, universal design and access to participation in research. Arch Phys Med Rehabil. 2000;81(suppl 12):S5–S9.
- 8. US Census Bureau. Accuracy of the data, Census 2000 Supplementary Survey. Available at: http://www.census.gov/acs/www/Downloads/ACS/Accuracy00.pdf. Accessed December 5, 2003.
- 9. US Census Bureau. Census technical documentation, Census 2000 Supplementary Survey Public Use Microsample (PUMS) data. Available at: http://www.census.gov/acs/www/Downloads/C2SS/AccuracyPUMS.pdf. Accessed November 30, 2004.
- 10. Stern SM. Counting people with disabilities: how survey methodology influences estimates in Census 2000 and the Census 2000 Supplementary Survey. Prepared for the Annual Conference of the American Statistical Association, San Francisco, CA. Washington, DC: US Census Bureau. Available at: http://www.census.gov/acs/www/Downloads/ACS/finalstern.pdf. Accessed November 29, 2004.
- 11. US Census Bureau. American Community Survey. Survey basics: what is the community survey? Available at: http://www.census.gov/acs/www/SBasics/What/What1.htm. Accessed January 9, 2004.
- 12. US Census Bureau. American Community Survey. Survey basics: sample size. Available at: http://www.census.gov/acs/www/SBasics/SSizes/SSizes2.htm. Accessed January 7, 2004.
- 13. US Census Bureau. Meeting 21st century demographic data needs—implementing the American Community Survey. Report 2: demonstrating survey quality. May 2002. Available at: http://www.census.gov/acs/www/Downloads/Report02.pdf. Accessed January 13, 2004.
- 14. Starsinic M, Albright K. Coverage and completeness in the Census 2000 Supplementary Survey. Available at: http://www.census.gov/acs/www/Downloads/ACS/Paper41.pdf. Accessed January 21, 2004.
- 15. State-specific prevalence of disability among adults—11 states and the District of Columbia, 1998. MMWR Morb Mortal Wkly Rep. 2000;49:711–714.
- 16. Teitler JO, Reichman NE, Sprachman S. Costs and benefits of improving response rates for a hard-to-reach population. Public Opinion Q. 2003;67:126–139.
- 17. Kinne S, Patrick DL, Doyle DL. Population prevalence of “secondary conditions” among people with disabilities. Am J Public Health. 2004;94:11–14.
- 18. Curtin R, Presser S, Singer E. The effects of response rate changes on the Index of Consumer Sentiment. Public Opinion Q. 2000;64:413–429.
- 19. US Census Bureau. American Community Survey data tables: change profiles 2000–2002. Available at: http://www.census.gov/acs/www/Products/Profiles/Chg/2002/0002/Tabular/040/04000US532.htm. Accessed February 9, 2004.
- 20. Zola IK. Disability statistics: what we count and what it tells us. J Disability Policy Studies. 1993;4(2):9–37.
- 21. Andresen EM, Fitch CA, McLendon PM, Meyers AR. Reliability and validity of disability questions for US Census 2000. Am J Public Health. 2000;90:1297–1299.