Abstract
Objective:
Administrative health data are frequently used for large population-based studies. However, the validity of these data for identifying neurologic conditions is uncertain.
Methods:
This article systematically reviews the literature to assess the validity of administrative data for identifying patients with neurologic conditions. Two reviewers independently assessed for eligibility all abstracts and full-text articles identified through a systematic search of Medline and Embase. Study data were abstracted on a standardized abstraction form to identify ICD code–based case definitions and corresponding sensitivity, specificity, positive predictive values (PPVs), and negative predictive values (NPVs).
Results:
Thirty full-text articles met the eligibility criteria. These included 8 studies for Alzheimer disease/dementia (sensitivity: 8–86.5, specificity: 56.3–100, PPV: 60–97.9, NPV: 68.0–98.9), 2 for brain tumor (sensitivity: 54.0–100, specificity: 97.0–99.0, PPV: 91.0–98.0), 4 for epilepsy (sensitivity: 98.8, specificity: 69.6, PPV: 62.0–100, NPV: 89.5–99.1), 4 for motor neuron disease (sensitivity: 78.9–93.0, specificity: 99.0–99.9, PPV: 38.0–90.0, NPV: 99), 2 for multiple sclerosis (sensitivity: 85–92.4, specificity: 55.9–92.6, PPV: 74.5–92.7, NPV: 70.8–91.9), 4 for Parkinson disease/parkinsonism (sensitivity: 18.7–100, specificity: 0–99.9, PPV: 38.6–81.0, NPV: 46.0), 3 for spinal cord injury (sensitivity: 0.9–90.6, specificity: 31.9–100, PPV: 27.3–100), and 3 for traumatic brain injury (sensitivity: 45.9–78.0, specificity: 97.8, PPV: 23.7–98.0, NPV: 99.2). No studies met eligibility criteria for cerebral palsy, dystonia, Huntington disease, hydrocephalus, muscular dystrophy, spina bifida, or Tourette syndrome.
Conclusions:
To ensure accurate interpretation of population-based studies that use administrative health data, the accuracy of case definitions for neurologic conditions must be taken into consideration.
Administrative health databases contain rich, prospectively collected data on patients, clinical providers, diseases, procedures, and mortality. These data, in which medical diagnoses are coded according to the WHO International Classification of Diseases (ICD), are a valuable resource for studying incidence, prevalence, mortality, multi-morbidity, quality indicators, and health resource utilization because of their relatively low cost, complete ascertainment of population-based samples, and common international disease-coding framework.1 Medical leaders also rely heavily on administrative data for day-to-day health care management (e.g., to generate hospital report cards and physician performance reports or to make important health care resource allocation decisions).2 Because these data sources are so large and population-based, they can be used for epidemiologic and outcome studies of both common and rare neurologic conditions. These important population-based studies influence practice and patient care.3–5 However, these data were originally created for administrative purposes (e.g., reimbursements for medical services) rather than for research purposes. Therefore, it is imperative for administrative data users to examine the validity and accuracy of case ascertainment in their data sources before use.1
The overall aim of this study was to provide recommendations for researchers and stakeholders on the best ICD codes to use for various neurologic conditions. More specifically, our objectives were to systematically review the international literature to examine validated ICD-9 and ICD-10-based case definitions for neurologic conditions and to compare the validity of different case definitions across studies and countries.
METHODS
Search strategy.
We searched Medline (1948 to November 2010) and Embase (1980 to November 2010) for relevant articles. For 4 conditions (hydrocephalus, spina bifida, cerebral palsy, and brain tumors), searches were conducted in March 2011 and included articles up until the date they were run (table e-1 on the Neurology® Web site at www.neurology.org). Our search strategy consisted of the following terms: administrative data, hospital discharge data, ICD-9, ICD-10, medical record, health information, surveillance, physician claims, claims, hospital discharge, coding, codes, validity, validation, case definition, algorithm, agreement, accuracy, sensitivity, specificity, positive predictive value, negative predictive value, combined with the MeSH and EMTREE terms and key words for each neurologic condition. These 15 neurologic conditions were selected because they represent the priority conditions identified as part of a large 4-year nationally funded (Public Health Agency of Canada and Neurological Health Charities of Canada) Population Health Study of Neurological Conditions. Searches were limited to reports of human studies published in English.
Study selection.
For each neurologic condition, 2 reviewers independently assessed all abstracts for fulfillment of the predetermined eligibility criteria. To be eligible for inclusion, articles had to report on original studies that validated ICD-9 or ICD-10 codes for the neurologic disease of interest by comparing the codes against a reference standard. They also had to report at least 1 of the following measures of validity: sensitivity, specificity, positive predictive value (PPV), or negative predictive value (NPV). Articles that validated neurologic conditions in specialized populations (e.g., seizures in vaccine recipients or dementia in patients with heart failure) or in which the case definitions were based only on ICD-8 codes were excluded.
Full-text articles were pulled for all abstracts selected by any reviewer. Selected full-text articles were then reviewed by 2 reviewers for fulfillment of eligibility. Disagreements were resolved by consensus. Reference lists of selected full-text articles were hand-searched, and experts in the field of administrative data or neurologic disorders were consulted to ensure no additional studies were missed with use of the above search strategy.
Data extraction.
Study data were abstracted by 2 reviewers in duplicate, using a standardized data abstraction form. Validated case definitions were abstracted, with specific ICD codes used for the case definition from each publication, and sensitivity, specificity, PPVs, and NPVs were recorded. Additional study information was recorded, including study location, validation database, sample size, years of data collection, and the gold standard that was used to validate the condition of interest.
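For readers less familiar with these measures, all four derive from a 2 × 2 comparison of the administrative case definition against the reference standard. The sketch below uses purely hypothetical counts (not drawn from any included study) to illustrate how they are calculated.

```python
# Hypothetical 2 x 2 comparison of an administrative-data case definition
# against a reference standard; the counts are illustrative only.
tp = 90    # flagged by the ICD-based definition and confirmed by the reference standard
fp = 10    # flagged by the definition but not confirmed
fn = 30    # missed by the definition but confirmed by the reference standard
tn = 870   # neither flagged nor confirmed

sensitivity = tp / (tp + fn)   # proportion of true cases that the definition captures
specificity = tn / (tn + fp)   # proportion of non-cases correctly excluded
ppv = tp / (tp + fp)           # proportion of flagged patients who are true cases
npv = tn / (tn + fn)           # proportion of unflagged patients who are true non-cases

print(f"Sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, "
      f"PPV {ppv:.1%}, NPV {npv:.1%}")
```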
Quality assessment.
A quality assessment of each included validation study was performed by 2 reviewers in duplicate, using a standardized 40-item checklist.6 One point was assigned for fulfilling each item on the checklist. Some items were not applicable because of the nature of the study. When it was unclear if a certain item was fulfilled, it was marked as uncertain and no points were assigned. Any discrepancy was resolved by consensus.
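A minimal sketch of this scoring rule (with invented item labels, since the checklist items themselves are listed in reference 6) might look as follows: 1 point per fulfilled item, no points for "uncertain" items, and "not applicable" items excluded from the denominator.

```python
# Hypothetical ratings for a handful of checklist items; labels are invented
# for illustration and are not the actual checklist wording.
ratings = {
    "validation cohort age described": "yes",
    "disease severity described": "uncertain",      # scores 0
    "split-sample revalidation performed": "no",     # scores 0
    "registry linkage described": "not applicable",  # excluded from scoring
}

applicable = [r for r in ratings.values() if r != "not applicable"]
score = sum(1 for r in applicable if r == "yes")
print(f"Quality score: {score} of {len(applicable)} applicable items")  # 1 of 3
```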
Role of the funding source.
This study was in part supported by operating funds from the Public Health Agency of Canada and Alberta Innovates Health Solutions. The funders played no role in the study design, data collection, analysis, or interpretation.
Ethics.
This study was reviewed and approved by the Conjoint Health Research Ethics Board at the University of Calgary, Alberta, Canada.
RESULTS
A total of 2,309 independent abstracts were reviewed for all 15 conditions. Of these, 134 full-text articles were reviewed (including 4 from hand search and expert consultation), and 30 full-text articles met all eligibility criteria (table 1). See table e-2 for a list of excluded articles and reasons for exclusion. The figure shows a flow chart of study inclusion for the 15 neurologic conditions. Hand-searching references of full-text articles and consultation with experts identified 4 additional articles for Alzheimer disease (AD) and dementia but not for any other conditions. For 7 conditions—cerebral palsy, dystonia, Huntington disease, hydrocephalus, muscular dystrophy, spina bifida, and Tourette syndrome—no full-text articles met the eligibility criteria, and these conditions were therefore excluded from any further analysis. The number of studies, number of validations, and ranges of sensitivity, specificity, PPVs, and NPVs for each condition are reported in table 1. A detailed summary of all included articles can be found in tables e-3 to e-10.
Table 1.
Range of sensitivities, specificities, NPVs, and PPVs reported for each condition

Abbreviations: ICD = International Classification of Diseases; NPV = negative predictive value; PPV = positive predictive value.
Refers to the number of case definitions (sets of ICD codes) tested.
Includes amyotrophic lateral sclerosis and other anterior horn cell disease.
Figure. Flow diagram of article inclusion.
AD = Alzheimer disease and dementia; BT = brain tumor; CP = cerebral palsy; Dy = dystonia; Epil = epilepsy; HD = Huntington disease; Hy = hydrocephalus; MN = motor neuron disease; MS = multiple sclerosis; MD = muscular dystrophy; PD = Parkinson disease; Sb = spina bifida; SCI = spinal cord injury; TS = Tourette syndrome; TBI = traumatic brain injury. A listing of excluded articles and reasons for exclusion can be found in table e-2.
AD and dementia.
For AD and dementia, 581 independent abstracts were reviewed and 8 full-text articles met all eligibility criteria (4 were identified by hand search or expert consultation) (table e-3).7–14 A total of 21 case definitions were tested (table 1). Seven studies investigated ICD-9-based case definitions, whereas 3 studies also tested ICD-10-based case definitions. Studies that used inpatient and outpatient claims databases revealed higher sensitivity than those that used inpatient claims only,10,12–14 outpatient claims only,9 or death certificates.8 Specificity was consistently high (>84%), with the exception of 1 validation9 that included the ICD-9 code 298.9 for “unspecified psychosis” in the case definition. PPVs and NPVs were reported only for studies that used databases with inpatient or outpatient claims and were similar across databases.
Brain tumors.
Of 353 independent abstracts for brain tumors reviewed, 2 full-text articles met eligibility criteria (table e-4).15,16 Only 4 case definitions were tested. One study investigated primary intracranial tumors,15 whereas the other looked at brain metastases in lung cancer patients.16 In the study of primary intracranial tumors, the case definition consisted solely of ICD-9 codes, and only 1 sensitivity (54%) was reported.15 In the second study, only the ICD-9 code for secondary malignant neoplasm of the brain or spinal cord (ICD 198.3) was used to define the case definitions.16 Reported sensitivities, specificities, and PPVs were similar, regardless of whether the cases were defined by ≥1, ≥2, or ≥3 claims.
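To illustrate how such claim-count thresholds operate in practice, the sketch below applies ≥1, ≥2, and ≥3 claim rules for ICD-9 code 198.3 to toy claims records; the patient identifiers and claims are invented for illustration, and this is not the implementation used in the cited study.

```python
from collections import Counter

# Toy claims records: (patient_id, ICD-9 diagnosis code). Illustrative only.
claims = [
    ("A", "198.3"), ("A", "198.3"), ("A", "198.3"),
    ("B", "198.3"),
    ("C", "162.9"),  # lung cancer code without any brain-metastasis claim
]

# Count qualifying (198.3) claims per patient.
counts = Counter(pid for pid, code in claims if code == "198.3")

# Apply increasingly strict claim-count thresholds to define cases.
for threshold in (1, 2, 3):
    cases = sorted(pid for pid, n in counts.items() if n >= threshold)
    print(f">= {threshold} claim(s): {cases}")
# >= 1 claim(s): ['A', 'B']
# >= 2 claim(s): ['A']
# >= 3 claim(s): ['A']
```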
Epilepsy.
Of 285 abstracts reviewed, 4 full-text articles met eligibility criteria (table e-5).17–20 Across the selected full-text articles, 14 case definitions were tested. Of the 4 studies meeting all eligibility criteria, case definitions consisted of ICD-9 codes in 3 studies18–20 and ICD-10 codes in 2 studies.17,18 Sensitivity and specificity were reported in only 1 study, which assessed the ICD-10 codes G40–41 in inpatients admitted to a seizure monitoring unit.18 Lower PPVs were reported in the studies that included the ICD-9-CM convulsion code 780.3 in the case definition18,19 than in those that did not,17,18 except in 1 study in which patients also had to be on an antiepileptic drug to meet the case definition.
Motor neuron disease (including amyotrophic lateral sclerosis).
Of 140 independent abstracts reviewed, 4 full-text articles met all eligibility criteria (table e-6).21–24 A total of 6 case definitions were tested. All 4 studies validated case definitions consisting of the ICD-9 code 335 for anterior horn cell disease or 335.2 for motor neuron disease, and all showed good diagnostic accuracy, except for 1 study in which the reference standard was a population-based registry.23 No outpatient databases were validated for motor neuron disease.
Multiple sclerosis.
Of 119 independent abstracts reviewed, 2 full-text articles met all eligibility criteria (table e-7).25,26 Seven case definitions were tested. Both studies that met all eligibility criteria validated case definitions consisting of the ICD-9 code 340 or ICD-10 code G35 in databases containing inpatient, outpatient, and prescription claims. Adding prescription claims to the case definition resulted in negligible improvement in sensitivity, specificity, PPV, or NPV.26
Parkinson disease and parkinsonism.
One hundred independent abstracts were reviewed, and 4 full-text articles met all eligibility criteria (table e-8).27–30 In the selected full-text articles, 16 case definitions were tested. All 4 studies validated case definitions consisting of ICD-9 codes related to Parkinson disease (PD)/parkinsonism. Three studies used databases containing inpatient and outpatient claims27,28,30; 1 used inpatient, outpatient, and prescription drug claims28; and 1 used outpatient claims only.29 Adding prescription of a dopamine agonist or levodopa/carbidopa to the ICD-9 codes increased sensitivity but decreased the PPV.28 One study looked at only outpatient claims and revealed a high sensitivity (89.2% to 100%) but a low specificity (0% to 28.4%).29 Most of the poorly performing case definitions relied on self-report as the reference standard.
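One plausible way such a combined definition could be constructed is to accept either a qualifying diagnosis code or a qualifying drug claim; broadening the definition in this way tends to raise sensitivity (more true cases captured) while lowering PPV (the drugs are also prescribed for other indications). The sketch below uses invented patient flags and is not the algorithm from the cited studies.

```python
# Hypothetical patient-level flags; not data from the cited studies.
patients = [
    # (id, has PD/parkinsonism ICD-9 code, has levodopa/dopamine-agonist claim)
    ("P1", True,  True),
    ("P2", True,  False),
    ("P3", False, True),   # treated empirically, never coded
    ("P4", False, False),
]

# Case definition 1: diagnosis code alone.
codes_only = {pid for pid, dx, rx in patients if dx}

# Case definition 2: diagnosis code OR drug claim (broader capture).
codes_or_rx = {pid for pid, dx, rx in patients if dx or rx}

print(sorted(codes_only))   # ['P1', 'P2']
print(sorted(codes_or_rx))  # ['P1', 'P2', 'P3']
```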
Spinal cord injury.
A total of 119 independent abstracts were reviewed, and 3 full-text articles met all eligibility criteria (table e-9).31–33 In the selected full-text articles, 19 case definitions were tested; no NPVs were reported. All 3 studies validated case definitions consisting of ICD-9 codes, and 1 also looked at case definitions consisting of ICD-10 codes.31 The best performing case definitions included the ICD-9 codes 806 for "fracture of vertebral column with spinal cord injury" and 952 for "spinal cord injury without evidence of spinal bone injury."32,33 Conversely, less specific codes (e.g., ICD-9 "other paralytic syndromes") performed poorly. Adding the ICD-9 code for late effect of spinal cord injury (907.2) improved the sensitivity but at the cost of a lower specificity and PPV.31
Traumatic brain injury.
A total of 284 independent abstracts were reviewed, and 3 full-text articles met all eligibility criteria (table e-10).34–36 Eight case definitions were tested.34–36 One study looked at inpatient and registry databases and found that case definitions tested in the inpatient database alone and in combination with the registry had a slightly higher PPV than the registry database alone.36 Including less specific codes (e.g., ICD-9 800.0–800.9, which can include some skull fractures without mention of intracranial injury) was associated with lower PPV. Only 1 study used more than 1 test of diagnostic accuracy, comparing coding of an inpatient database to real-time assessment in the emergency department, and found low sensitivity (45.9%) and PPV (23.9%) but high specificity (97.8%) and NPV (99.2%).34
Quality assessment.
We assessed the quality of each validation study, using a published 40-item checklist of reporting criteria.6 For the 29 studies assessed, total quality scores ranged from 12 to 29 (mean, 20.5) (table e-11).
DISCUSSION
In this systematic review of 15 priority neurologic conditions, validation studies were identified for AD and dementia (n = 8), brain tumor (n = 2), epilepsy (n = 4), motor neuron disease (n = 4), multiple sclerosis (n = 2), PD/parkinsonism (n = 4), spinal cord injury (n = 3), and traumatic brain injury (n = 3). However, no validation studies were identified for cerebral palsy, dystonia, Huntington disease, hydrocephalus, muscular dystrophy, spina bifida, or Tourette syndrome. For many of these conditions, population-based studies have been published utilizing ICD codes and administrative data without obvious prior validation studies.37–39 This indicates an important gap in knowledge and suggests that validation studies are still needed for many neurologic conditions.
Even among neurologic conditions for which validation studies exist, heterogeneity in coding accuracy was identified. Several factors seem to be associated with this, including the number of codes used in the algorithm, the varied databases validated, and the nature of the conditions. For some conditions, fewer than 3 ICD codes were tested alone or in combination, such as epilepsy (ICD-9: 345, ICD-10: G40–41), motor neuron disease (including amyotrophic lateral sclerosis) (ICD-9: 335, 335.2), and multiple sclerosis (ICD-9: 340, ICD-10: G35), resulting in consistent reports of excellent coding accuracy. Therefore, these codes can be recommended for future studies using administrative data. Validations for AD and dementia, brain tumors, PD/parkinsonism, spinal cord injury, and traumatic brain injury were more varied, often including more than 3 ICD codes, with up to 72 ICD-9 codes for traumatic brain injury. With the exception of spinal cord injury, the ICD codes used and the diagnostic accuracy were too varied to allow recommendations regarding the best case definition. For spinal cord injury, ICD-9 codes 806 and 952 were validated in combination in 2 studies and were associated with high sensitivity and moderate PPV, suggesting they may be suitable for administrative health data research.
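For illustration only, the code sets that performed consistently well in this review could be collected into a simple lookup for screening administrative records. The sketch below is an assumption-laden convenience (hypothetical structure, simple prefix matching), and any such definition should still be validated in the target database, as recommended below.

```python
# Code sets named in this review as performing well; a sketch only -
# validate against the target database before use.
RECOMMENDED_CODES = {
    "epilepsy":             {"icd9": ["345"],          "icd10": ["G40", "G41"]},
    "motor neuron disease": {"icd9": ["335", "335.2"], "icd10": []},
    "multiple sclerosis":   {"icd9": ["340"],          "icd10": ["G35"]},
    "spinal cord injury":   {"icd9": ["806", "952"],   "icd10": []},
}

def matches(condition: str, code: str) -> bool:
    """Return True if a recorded ICD code falls under a recommended code prefix."""
    sets = RECOMMENDED_CODES[condition]
    return any(code.startswith(prefix) for prefix in sets["icd9"] + sets["icd10"])

print(matches("epilepsy", "345.10"))           # True
print(matches("spinal cord injury", "806.4"))  # True
print(matches("multiple sclerosis", "G35"))    # True
```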
The validation studies also used a variety of databases (inpatient, outpatient, mortality-based registries, or death certificates). No particular type of database clearly outperformed the others. The best database to use will vary, depending on the condition (e.g., acute conditions may be best captured in hospital data, vs chronic conditions in outpatient data), how data are collected, and who is responsible for coding the data. Ultimately, we recommend that case definitions used to identify neurologic conditions in administrative health data be validated in the database of interest prior to initiation of a study, to ensure the study results are accurate and meaningful.
Validation results may be heterogeneous because of variation in the ICD version used. In 28 of the 30 included studies, ICD-9-code-based case definitions were tested (23 ICD-9-CM, 5 ICD-9), but only 7 studies validated an ICD-10-code-based case definition (3 ICD-10-CA, 1 ICD-10-AM, 3 ICD-10). Although the underlying coding framework is the same internationally, many countries use different ICD versions. For example, although some countries are currently using ICD-10 (12,420 codes), some countries are still using ICD-9 (6,882 codes).1 Furthermore, some countries use ICD-10 clinical modifications, such as ICD-10-AM (Australia), ICD-10-CA (Canada), ICD-10-GM (Germany), ICD-10-KM (Korea), ICD-10-TM (Thailand), and ICD-10-CM (United States).1 These differ in their number of codes, chapters, and categories. Specific conditions are present in some but not all clinical modifications. Subcodes can also differ. For example, code S06.21 refers to diffuse brain injury with moderate loss of consciousness in ICD-10-CA, but to diffuse cerebral contusions in ICD-10-GM and ICD-10-AM. The definition of the “main condition” also differs between these international ICD-10 clinical modifications.1 These variations can influence case definitions and the validity of administrative data. Finally, as countries transition to using newer ICD versions, up-to-date validations of administrative data will be required.
Coding accuracy may also have been influenced by the ease with which certain conditions can be diagnosed. Conditions such as PD and AD, which rely solely on clinical history for diagnosis, may be more difficult to code than conditions for which diagnostic tests (e.g., neuroimaging for multiple sclerosis) are available to confirm the diagnosis. Furthermore, the reference standard used to validate the algorithms likely had an effect on data quality (e.g., conditions for which self-report was used performed poorly).
As a result of the quality assessment, we found that although most studies fulfilled a large proportion of the reporting criteria, some gaps remained. Only a portion of studies described the age (n = 11) and severity of disease (n = 3) in the validation cohort and outlined the exclusion criteria (n = 8). Details about the administrative data collection were also lacking, such as who identified patients and ensured that selection adhered to patient recruitment criteria (n = 2); who collected data (n = 3); and whether an a priori data collection form was used (n = 2). Only 1 study utilized a split sample design to revalidate the administrative data in a second cohort.13 Whereas most study reports described the number, training, and expertise of persons reading the reference standard (n = 20), only 4 reported κ if there was >1 person reading the reference standard and only 7 reported that investigators were blinded to the ICD codes while reading the reference standard. Furthermore, few studies showed a flow diagram (n = 6), the distribution of disease severity (n = 7), or a cross tab of index tests by results of reference standards (n = 12). Estimates of diagnostic accuracy varied among studies, with the most popular being sensitivity (n = 26) and PPV (n = 28). Thirteen studies reported 95% confidence intervals for these estimates. Only 8 studies reporting PPV and NPV indicated that the ratio of cases to controls of the validation cohort approximated the prevalence of the condition in the population.
This study has some limitations. The literature review included only studies published in the English language, and we did not search for validation studies published in the gray literature. Our search strategy targeted studies whose primary aim was to validate ICD coding of neurologic conditions. Therefore, it is possible that validation studies embedded within a larger study (e.g., of prevalence or health resource utilization) may have been missed. Furthermore, publication bias may be an issue in validation studies if authors report only the case definitions that perform well rather than all case definitions tested, or if studies are never published because the validations yielded low sensitivity, specificity, and positive and negative predictive values. However, several studies in this systematic review did report case definitions with low or very low accuracy measures. Therefore, we believe that publication bias is not a major concern.
Although validated case definitions for some neurologic conditions have been reported, with varying levels of accuracy, data are still limited for many. As population health researchers and decision makers continue to utilize administrative health data to answer important health research questions, it is critical to develop and test the accuracy of case definitions for all neurologic conditions prior to use. In order to compare validation studies, reporting guidelines have recently been developed.6 This framework will facilitate the development of optimal case definitions for neurologic conditions, to improve the quality of population-based research for all neurologic conditions.
GLOSSARY
AD = Alzheimer disease
ICD = International Classification of Diseases
NPV = negative predictive value
PD = Parkinson disease
PPV = positive predictive value
Footnotes
Supplemental data at www.neurology.org
AUTHOR CONTRIBUTIONS
All authors made a significant contribution to this study. N. Jette, H. Quan, A. Metcalfe, and C. St. Germaine-Smith conceived of and designed the study; N. Jette obtained the funding for the study; all authors (except B. Hemmelgarn and H. Quan) were involved in data collection; C. St. Germaine-Smith conducted the statistical analysis and drafted the manuscript; C. St. Germaine-Smith, A. Metcalfe, and N. Jette were involved in figure design; and all authors participated in the interpretation of the data and in editing of the manuscript.
DISCLOSURE
C. St. Germaine-Smith reports no disclosures. A. Metcalfe holds a CIHR doctoral award in Genetics (Ethics, Law and Society) and a studentship award from the CIHR Strategic Training program in Genetics, Child Development, and Health. T. Pringsheim has received travel funding from Teva Neuroscience Canada and speaker honoraria from Shire Canada. She currently receives research support from the Canadian Institute of Health Research, the Public Health Agency of Canada, and the Tourette Syndrome Foundation of Canada. J.I. Roberts reports no disclosures. C. Beck currently holds grants/research support from the Canadian Institutes of Health Research, Alberta Health Services, the University of Calgary Department of Psychiatry, and the Carlos Ogilvie Memorial Foundation. Dr. Beck's spouse has received honoraria and support for travel for 14 international and national workshops and presentations in the field of Family Therapy. B. Hemmelgarn receives salary/research support from Alberta Innovates Health Solutions, Alberta Health and Wellness, and the University of Calgary. J. McChesney holds a studentship award from the Western Regional Training Center for Health Services Research. H. Quan has received salary support from an Alberta Innovates Health Solutions Health Scholar Award. N. Jette holds a salary award from Alberta Innovates Health Solutions and a Canada Research Chair (CRC) Tier 2 in Neuroscience Health Services Research. She previously held a Canadian Institutes of Health Research New Investigator Award (declined after 2010 due to CRC). She has received or currently holds grants/research support from the Canadian Institutes of Health Research, the Public Health Agency of Canada, Alberta Innovates Health Solutions, Alberta Health Services, the University of Calgary Faculty of Medicine and Hotchkiss Brain Institute, and Alberta Health and Wellness. She has no commercial financial disclosures. All grants and research support are paid directly to the University of Calgary. Go to Neurology.org for full disclosures.
REFERENCES
- 1. Jette N, Quan H, Hemmelgarn B, et al. The development, evolution, and modifications of ICD-10: challenges to the international comparability of morbidity data. Med Care 2010; 48: 1105–1110
- 2. Bradley EH, Herrin J, Mattera JA, et al. Quality improvement efforts and hospital performance: rates of beta-blocker prescription after acute myocardial infarction. Med Care 2005; 43: 282–292
- 3. Nallamothu B, Gurm H, Ting H, et al. Operator experience and carotid stenting outcomes in Medicare beneficiaries. JAMA 2011; 306: 1338–1343
- 4. Bushnell C, Jamison M, James A. Migraines during pregnancy linked to stroke and vascular diseases: US population based case-control study. BMJ 2009; 338
- 5. Arana A, Wentworth CE, Ayuso-Mateos JL, Arellano FM. Suicide-related events in patients treated with antiepileptic drugs. N Engl J Med 2010; 363: 542–551
- 6. Benchimol EI, Manuel DG, To T, Griffiths AM, Rabeneck L, Guttmann A. Development and use of reporting guidelines for assessing the quality of validation studies of health administrative data. J Clin Epidemiol 2011; 64: 821–829
- 7. Bharmal MF, Weiner M, Sands LP, Xu H, Craig BA, Thomas J 3rd. Impact of patient selection criteria on prevalence estimates and prevalence of diagnosed dementia in a Medicaid population. Alzheimer Dis Assoc Disord 2007; 21: 92–100
- 8. Macera CA, Sun RK, Yeager KK, Brandes DA. Sensitivity and specificity of death certificate diagnoses for dementing illnesses, 1988–1990. J Am Geriatr Soc 1992; 40: 479–481
- 9. Pippenger M, Holloway RG, Vickrey BG. Neurologists' use of ICD-9CM codes for dementia. Neurology 2001; 56: 1206–1209
- 10. Quan H, Li B, Saunders LD, et al. Assessing validity of ICD-9-CM and ICD-10 administrative data in recording clinical conditions in a unique dually coded database. Health Serv Res 2008; 43: 1424–1441
- 11. Taylor DH Jr, Ostbye T, Langa KM, Weir D, Plassman BL. The accuracy of Medicare claims as an epidemiological tool: the case of dementia revisited. J Alzheimers Dis 2009; 17: 807–815
- 12. Fisher ES, Whaley FS, Krushat WM, et al. The accuracy of Medicare's hospital claims data: progress has been made, but problems remain. Am J Public Health 1992; 82: 243–248
- 13. Henderson T, Shepheard J, Sundararajan V. Quality of diagnosis and procedure coding in ICD-10 administrative data. Med Care 2006; 44: 1011–1019
- 14. Quan H, Parsons GA, Ghali WA. Validity of information on comorbidity derived from ICD-9-CCM administrative data. Med Care 2002; 40: 675–685
- 15. Counsell CE, Collie DA, Grant R. Limitations of using a cancer registry to identify incident primary intracranial tumours. J Neurol Neurosurg Psychiatry 1997; 63: 94–97
- 16. Eichler AF, Lamont EB. Utility of administrative claims data for the study of brain metastases: a validation study. J Neurooncol 2009; 95: 427–431
- 17. Christensen J, Vestergaard M, Olsen J, Sidenius P. Validation of epilepsy diagnoses in the Danish National Hospital Register. Epilepsy Res 2007; 75: 162–170
- 18. Jette N, Reid AY, Quan H, Hill MD, Wiebe S. How accurate is ICD coding for epilepsy? Epilepsia 2010; 51: 62–69
- 19. Parko K, Thurman DJ. Prevalence of epilepsy and seizures in the Navajo Nation 1998–2002. Epilepsia 2009; 50: 2180–2185
- 20. Pugh MJ, Van Cott AC, Cramer JA, et al. Trends in antiepileptic drug prescribing for older patients with new-onset epilepsy: 2000–2004. Neurology 2008; 70: 2171–2178
- 21. Beghi E, Logroscino G, Micheli A, et al. Validity of hospital discharge diagnoses for the assessment of the prevalence and incidence of amyotrophic lateral sclerosis. Amyotroph Lateral Scler Other Motor Neuron Disord 2001; 2: 99–104
- 22. Chancellor AM, Swingler RJ, Fraser H, Clarke JA, Warlow CP. Utility of Scottish morbidity and mortality data for epidemiological studies of motor neuron disease. J Epidemiol Community Health 1993; 47: 116–120
- 23. Chio A, Ciccone G, Calvo A, et al. Validity of hospital morbidity records for amyotrophic lateral sclerosis: a population-based study. J Clin Epidemiol 2002; 55: 723–727
- 24. Pisa FE, Verriello L, Deroma L, et al. The accuracy of discharge diagnosis coding for amyotrophic lateral sclerosis in a large teaching hospital. Eur J Epidemiol 2009; 24: 635–640
- 25. Culpepper WJ 2nd, Ehrmantraut M, Wallin MT, Flannery K, Bradham DD. Veterans Health Administration multiple sclerosis surveillance registry: the problem of case-finding from administrative databases. J Rehabil Res Dev 2006; 43: 17–24
- 26. Marrie RA, Yu N, Blanchard J, Leung S, Elliott L. The rising prevalence and changing age distribution of multiple sclerosis in Manitoba. Neurology 2010; 74: 465–471
- 27. Noyes K, Liu H, Holloway R, Dick AW. Accuracy of Medicare claims data in identifying parkinsonism cases: comparison with the Medicare current beneficiary survey. Mov Disord 2007; 22: 509–514
- 28. Swarztrauber K, Anau J, Peters D. Identifying and distinguishing cases of parkinsonism and Parkinson's disease using ICD-9 CM codes and pharmacy data. Mov Disord 2005; 20: 964–970
- 29. Szumski NR, Cheng EM. Optimizing algorithms to identify Parkinson's disease cases within an administrative database. Mov Disord 2009; 24: 51–56
- 30. White D, Moore S, Waring S, Cook K, Lai E. Identifying incident cases of parkinsonism among veterans using a tertiary medical center. Mov Disord 2007; 22: 915–923
- 31. Hagen EM, Rekand T, Gilhus NE, Gronning M. Diagnostic coding accuracy for traumatic spinal cord injuries. Spinal Cord 2009; 47: 367–371
- 32. Johnson RL, Gabella BA, Gerhart KA, McCray J, Menconi JC, Whiteneck GG. Evaluating sources of traumatic spinal cord injury surveillance data in Colorado. Am J Epidemiol 1997; 146: 266–272
- 33. Thurman DJ, Burnett CL, Jeppson L, Beaudoin DE, Sniezek JE. Surveillance of spinal cord injuries in Utah, USA. Paraplegia 1994; 32: 665–669
- 34. Bazarian JJ, Veazie P, Mookerjee S, Lerner EB. Accuracy of mild traumatic brain injury case ascertainment using ICD-9 codes. Acad Emerg Med 2006; 13: 31–38
- 35. Rodriguez SR, Mallonee S, Archer P, Gofton J. Evaluation of death certificate-based surveillance for traumatic brain injury: Oklahoma 2002. Public Health Rep 2006; 121: 282–289
- 36. Shore AD, McCarthy ML, Serpi T, Gertner M. Validity of administrative data for characterizing traumatic brain injury-related hospitalizations. Brain Inj 2005; 19: 613–621
- 37. Chen YY, Lai CH. Nationwide population-based epidemiologic study of Huntington's disease in Taiwan. Neuroepidemiology 2010; 35: 250–254
- 38. Hjern A, Thorngren-Jerneck K. Perinatal complications and socio-economic differences in cerebral palsy in Sweden: a national cohort study. BMC Pediatr 2008; 8: 49
- 39. Lie KK, Groholt EK, Eskild A. Association of cerebral palsy with Apgar score in low and normal birthweight infants: population based cohort study. BMJ 2010; 341: c4990