Author manuscript; available in PMC: 2009 Feb 25.
Published in final edited form as: Mov Disord. 2009 Jan 15;24(1):51–56. doi: 10.1002/mds.22283

Optimizing Algorithms to Identify Parkinson’s Disease Cases Within an Administrative Database

Nicholas R Szumski 1,2,3,*, Eric M Cheng 1,2,3
PMCID: PMC2647991  NIHMSID: NIHMS93759  PMID: 18816696

Abstract

Patients assigned the diagnostic ICD-9-CM code for Parkinson’s disease (PD) in an administrative database may not truly carry that diagnosis because of various error sources. Improved ability to identify PD cases within databases may facilitate specific research goals. Experienced chart reviewers abstracted the working diagnosis of all 577 patients assigned diagnostic code 332.0 (PD) during 1 year at a VA Healthcare System. We then tested the ability of various algorithms making use of PD and non-PD diagnostic codes, specialty of clinics visited, and medication prescription data to predict the abstracted working diagnosis. Chart review determined 436 (75.6%) patients to be PD or Possibly PD, and 141 (24.4%) to be Not PD. Our tiered consensus algorithm, which preferentially used data from specialists over nonspecialists, improved PPV to 83.2% (P = 0.003 vs. baseline). When presence of a PD prescription was an additional criterion, PPV increased further to 88.2% (P = 0.04 vs. without the medication criterion), but sensitivity decreased from 87.4 to 77.1% (P = 0.0001). We demonstrate that algorithms identify PD cases better than a single occurrence of the diagnostic code for PD, and that such algorithms can be tuned to maximize the parameters that best meet the goals of a particular database query.

Keywords: Parkinson’s disease, ICD-9-CM codes, administrative data, predictive value, sensitivity, specificity


Parkinson’s disease (PD) is a neurodegenerative disorder marked by bradykinesia, rigidity, resting tremor, and postural instability. To perform clinical studies of large populations of people with PD, use of administrative databases may be necessary. However, several features of PD present challenges to case ascertainment. PD may not be clinically distinct from its mimics at early stages, leading to use of administrative diagnostic ICD-9-CM codes of those PD mimics. Accuracy of the clinical diagnosis is likely to become more assured over time as clinical data accumulate. The lack of a definitive clinical test for PD (short of pathology) precludes diagnostic certainty, making verification of database diagnoses even more difficult. The accuracy of ICD-9-CM codes alone is demonstrably suboptimal,1–4 and combinatorial or algorithmic approaches adjusted for particular research goals may improve the utility of diagnostic codes.5–10

We created and analyzed algorithms for identifying a working diagnosis of PD, defined using a gold standard of expert chart review, within a large VA Healthcare System administrative database. Because movement disorders specialists accurately diagnose PD in the clinic (PPV of 98.6% compared with a pathological gold standard11), we developed an algorithm giving greater weight to codes assigned in movement disorders specialty clinic when available and sought to build consensus over time by considering frequency of code assignment versus assignment of codes for similar conditions.

PATIENTS AND METHODS

Database and Sample Characteristics

Our database included all veterans obtaining outpatient care at the Veterans Administration (VA) Greater Los Angeles Healthcare System (VAGLAHS), comprising one hospital and 14 outpatient clinics. Our sample consisted of all patients assigned an outpatient ICD-9-CM diagnosis code of 332.0 (PD) between October 1, 2001, and September 30, 2002. The VAGLAHS Institutional Review Board approved all study procedures.

Gold Standard for Determining Working Diagnosis

For each patient in our sample, one of two nurses with research experience in medical record abstraction determined the eventual working diagnosis of the patient through review of the electronic medical record from 1997 through September 30, 2004 (previously described elsewhere12). When the medical record provided conflicting information about diagnosis, the chart abstractors gave greater weight to notes by specialists, to more recently written notes (presumably with more clinical information available), and to more detailed notes documenting clinical reasoning for or against PD. They classified each patient as PD, Possibly PD, or Not PD. For instances of Not PD, the patient’s working diagnosis was recorded.

Administrative Data Collected for Building Algorithms

We collected data for analysis during the 3-year period between October 1, 2001, and September 30, 2004. For each patient in the sample, we obtained the following information from the administrative database: dates of all outpatient encounters; specialty of clinics visited (movement disorders clinic vs. general neurology clinic vs. non-neurology clinic); all instances of administrative ICD-9-CM codes assigned for conditions closely related to PD; and dates of prescriptions of medications related to PD [levodopa (L-dopa) in any formulation, pramipexole, ropinirole, pergolide, entacapone, and selegiline].

Development of Algorithms Tested

Initially, we tested the ability of individual variables to predict the working diagnosis as they were manipulated. We subsequently developed two algorithms to test the predictive value of combinations of these individual variables. The unanimity algorithm simply identified likely PD cases when code 332.0 was applied at least once, in any setting, without any instances of competing ICD-9-CM codes for PD mimics (332.1, 333.0, 333.1; see Appendix) elsewhere. The tiered consensus algorithm output a likely working diagnosis for each patient based on code frequency and clinic specialty. Instances in which code 332.0 was applied only once were output as Not PD. If a patient had been seen by a movement disorders specialist, then the code most commonly assigned by specialists became the algorithm output (with 332.0 yielding PD and any other code Not PD); patients who had seen a specialist but had not been assigned code 332.0 there were output as Not PD. If a patient had not seen a movement disorders specialist, then the code most frequently assigned by a general neurologist, if any, was output. For patients who had seen neither a specialist nor a general neurologist, the code most frequently assigned by non-neurologists became the algorithm’s final output. Each of the two algorithms was also tested using presence of PD medications (the medications modifier) as an additional criterion for an output of PD.
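The tiered decision logic described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code; the encounter records, tier labels, and function names are hypothetical, and ties in code frequency are resolved arbitrarily here.

```python
# Illustrative sketch of the tiered consensus algorithm described in the text.
# Each encounter is a hypothetical (clinic_tier, icd9_code) pair, where
# clinic_tier is "movement", "neurology", or "other". Codes 332.1, 333.0,
# and 333.1 are the competing PD-mimic codes tracked in the study.
from collections import Counter

TIERS = ["movement", "neurology", "other"]  # highest expertise first

def tiered_consensus(encounters, require_medication=False, has_pd_med=False):
    codes = [code for _, code in encounters]
    # Code count threshold: a single instance of 332.0 is output as Not PD.
    if codes.count("332.0") < 2:
        return "Not PD"
    for tier in TIERS:
        tier_codes = [code for t, code in encounters if t == tier]
        if tier_codes:
            # The most frequent code at the highest tier seen decides the
            # output; 332.0 yields PD, any other code yields Not PD.
            top_code, _ = Counter(tier_codes).most_common(1)[0]
            result = "PD" if top_code == "332.0" else "Not PD"
            break
    else:
        return "Not PD"  # no coded encounters at any tier
    # Optional medications modifier: require a PD prescription for a PD output.
    if require_medication and result == "PD" and not has_pd_med:
        return "Not PD"
    return result
```

For example, a patient coded 332.0 twice in the movement disorders clinic would be output as PD, while the same codes assigned only in non-neurology clinics alongside a specialist's competing code would not.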

Analysis

For each algorithm evaluated, the output (PD or Not PD) was compared with the working diagnosis determined by chart review. For this analysis, chart review results of PD or Possibly PD were all considered to demonstrate a likely working diagnosis of PD. Therefore, a true positive was an instance in which an algorithm output PD and the expert chart review returned PD or Possibly PD. True negatives were instances in which the algorithm output Not PD and the expert chart review also determined Not PD. Sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated for each algorithm. Comparisons of these parameters across algorithms were made using two-sample tests of proportion, with significance level α = 0.05. The overall agreement with the gold standard (number of true positives plus true negatives) was also calculated for each algorithm.
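These four parameters are standard functions of the 2×2 confusion table. As a brief illustration (a hypothetical helper, not code from the study), the baseline row of Table 3, in which every sampled patient is output as PD, can be reproduced directly:

```python
# Standard 2x2 confusion-table metrics as used in the Analysis section.
def metrics(tp, fp, fn, tn):
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn) if (tp + fn) else None,
        "specificity": tn / (tn + fp) if (tn + fp) else None,
        "ppv": tp / (tp + fp) if (tp + fp) else None,
        "npv": tn / (tn + fn) if (tn + fn) else None,  # undefined if nothing is output negative
        "agreement": (tp + tn) / total,
    }

# Baseline inclusion criterion: all 577 patients are flagged PD, of whom
# 436 are true PD by chart review, so TP = 436, FP = 141, FN = TN = 0.
baseline = metrics(tp=436, fp=141, fn=0, tn=0)
# sensitivity 1.0, specificity 0.0, PPV ~0.756, NPV undefined (None)
```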

RESULTS

Baseline and Demographic Data

We identified 577 patients who were given the diagnosis code for PD (332.0) at least once between October 1, 2001, and September 30, 2002. Demographic and clinical characteristics of these patients for the period from October 1, 2001, through September 30, 2004, are summarized in Table 1. In our sample, 31% had a visit to a movement disorders clinic during the 3-year period, 61% had a visit to a general neurology clinic, and 71% were seen in at least one of the two types of neurology clinic. Of the 577 patients, 436 (baseline PPV of 75.6%) were determined by expert chart review to have a working diagnosis of PD (n = 407, 70.5% of total) or Possibly PD (n = 29, 5.0% of total). The remaining 141 (24.4%) patients reviewed were determined to have a working diagnosis of Not PD (see Table 2).

TABLE 1.

Demographic and clinical characteristics of study population (those patients receiving code 332.0 between October 1, 2001 and September 30, 2002). Clinical data over 3-year period between October 1, 2001, and September 30, 2004

Characteristic: n (%)
Age in years, mean (SD): 75.6 (8.7)
Male: 568 (98.4)
No. of times receiving 332.0 code
 At least once: 577 (100)
 At least twice: 409 (70.9)
 At least three times: 293 (50.8)
 At least four times: 224 (38.8)
 At least five times: 166 (28.8)
Visited neurology clinic, by type (movement disorders specialty clinic / general neurology clinic)
 Yes / Yes: 117 (20.3)
 Yes / No: 60 (10.4)
 No / Yes: 235 (40.7)
 No / No: 165 (28.6)
Received one or more PD-related medications: 408 (70.7)

TABLE 2.

Results of experienced chart review: Categorization of patients determined by initial chart review and final working diagnosis assigned. The working diagnosis was used as the gold standard for algorithm analysis

Chart review determination | Working diagnosis assigned | n (% of 577)
PD | PD | 407 (70.5)
Possibly PDa | PD | 29 (5.0)
Total PD | | 436 (75.6)
Not PD
 Medication-induced Parkinsonisma | Not PD | 15 (2.9)
  (of these, 2 casesa) | PD | 2a
 Vascular Parkinsonism | Not PD | 39 (6.8)
 Dementia with Lewy bodies | Not PD | 10 (1.7)
 Multiple systems atrophy | Not PD | 10 (1.7)
 Progressive supranuclear palsy | Not PD | 7 (1.2)
 Normal pressure hydrocephalus | Not PD | 3 (0.5)
 Miscellaneous Parkinsonism | Not PD | 15 (2.6)
 Essential tremor | Not PD | 7 (1.2)
 Miscellaneous movement disorder | Not PD | 13 (2.3)
 No movement disorder | Not PD | 8 (1.4)
 Unable to determine from records | Not PD | 14 (2.4)
Total Not PD | | 141 (24.4)
aTwo cases of possible PD were also felt to have a contribution of medication-induced Parkinsonism; they are listed twice but are included in the totals and analysis under a working diagnosis of PD.

Identifying Cases by Frequency of Assigned PD Code (see Table 3)

TABLE 3.

Predictive values, sensitivity, and specificity of various case ascertainment variables (individually or in combination) and algorithms in identifying a working diagnosis of PD. Final column indicates degree of agreement with the gold standard (true positives plus true negatives). Expert chart review used as gold standard for all comparisons

Sensitivity (95% CI) | Specificity (95% CI) | PPV (95% CI) | NPV (95% CI) | Agreed with chart review, N (%)
Baseline inclusion criterion:
 In any clinic: 332.0 assigned ≥1 100% 0.0% 75.6% Undefined 436 (75.6%)
Codes assigned in any clinic:
 332.0 assigned ≥ 2 89.2% (85.9–92.0) 28.4% (21.1–36.6) 79.4% (75.5–82.9) 46.0% (35.2–57.0) 429 (74.4%)
 332.0 assigned ≥ 5 67.0% (62.3–71.4) 58.2% (49.6–66.4) 83.2% (78.9–87.0) 36.3% (30.0–42.9) 374 (64.8%)
Codes assigned in any neurology clinic:
 332.0 assigned ≥ 1 68.8% (64.2–73.1) 43.3% (35.0–51.9) 78.9% (74.5–82.9) 31.0% (24.6–37.9) 361 (62.6%)
 332.0 assigned ≥ 2 56.9% (52.1–61.6) 71.6% (63.4–78.9) 86.1% (81.6–89.9) 34.9% (29.5–40.8) 349 (60.5%)
 332.0 assigned ≥ 3 50.0% (45.2–54.8) 78.0% (78.1–90.5) 87.6% (86.9–94.5) 33.5% (28.4–38.9) 328 (56.8%)
Codes assigned in movement disorders specialty clinic:
 332.0 assigned ≥ 1 33.0% (28.6–37.7) 85.1% (78.1–90.5) 87.3% (81.2–92.0) 29.1% (24.8–33.8) 264 (45.8%)
 332.0 assigned ≥ 2 28.4% (24.3–32.9) 91.5% (85.6–95.5) 91.2% (85.1–95.4) 29.3% (25.0–33.7) 253 (43.8%)
 332.0 assigned ≥ 3 25.5% (21.4–29.8) 94.3% (89.1–97.5) 93.3% (87.2–97.1) 29.0% (24.9–33.4) 244 (42.3%)
PD-related medicationa prescribed:
 Any of abovea 80.0% (76.0–83.7) 58.2% (49.6–66.4) 85.5% (81.8–88.8) 48.5% (40.8–56.3) 431 (74.7%)
Algorithms:
 Unanimity algorithm 78.0% (73.8–81.8) 33.3% (25.6–41.8) 78.3% (74.2–82.1) 32.9% (25.3–41.2) 387 (67.1%)
 Unanimity algorithm with medications 60.3% (55.6–64.9) 74.5% (66.4–81.4) 88.0% (83.7–91.4) 37.8% (32.1–43.8) 368 (63.8%)
 Tiered consensus algorithm 87.4% (83.9–90.4) 45.4% (37.0–54.0) 83.2% (79.4–86.5) 53.8% (44.4–63.0) 445 (77.1%)
 Tiered consensus algorithm with medications 77.1% (72.8–80.9) 68.1% (59.7–75.7) 88.2% (84.5–91.3) 49.0% (41.8–56.2) 432 (74.9%)
aLevodopa (in any formulation), pramipexole, ropinirole, pergolide, entacapone, or selegiline.

Raising the model’s code count threshold (the number of times 332.0 was assigned to a patient) beyond one (i.e., simply meeting the study inclusion criterion) increased the PPV for identifying PD, from 75.6 to 79.4% at a threshold of two or more instances and to 83.2% at five or more. There was a concomitant loss of sensitivity at higher thresholds, with a corresponding drop in NPV.

Identifying Cases by Specialty of Clinic Assigning PD Code (see Table 3)

When the model was independently adjusted to seek out patients who had been given code 332.0 in any neurology clinic over a 3-year period, the PPV rose from baseline 75.6 to 78.9%. PPV was higher still when the model was narrowed to seek out patients given code 332.0 by a movement disorders specialist at least once, up to 87.3%. By increasing the code count threshold for 332.0 being applied by any neurologist, the PPV increased to 87.6% (≥3 instances), and for movement disorders specialists to 93.3% (≥3 instances). The NPV of these models was relatively insensitive to changes in code count threshold. However, sensitivity dropped off dramatically as code count thresholds and expertise level of the clinical setting were increased.

Identifying Cases by Prescription of Medications Related to PD

Presence of a filled prescription for any of the PD medications during the 3-year period had a PPV of 85.5% and NPV of 48.5% (see Table 3). Among individual medications, presence of L-dopa was the most sensitive (74.1%; other medications ranged from 1.8 to 30.0%) but least specific (62.4%; other medications ranged from 92.2 to 98.6%) for a diagnosis of PD. Presence of L-dopa had a PPV of 85.9%, the lowest of any medication. The highest PPV (96.2%) was associated with selegiline or pergolide.

Identifying Cases via Algorithmic Approach: Incorporating Frequency of PD Codes, Specialty of Clinic Assigning Codes, and PD-Related Medications (see Table 3)

The baseline PPV of presence of at least one instance of the diagnostic code for PD (332.0) was 75.6%. The unanimity algorithm, using 3-year input data, nonsignificantly improved PPV over baseline to 78.3%. The unanimity algorithm with medications significantly improved PPV to 88.0%. Of the 391 patients with PD given code 332.0 at least once without any instances of competing codes during the ascertainment year, 51 (13%) received codes 332.1, 333.0, or 333.1 during the subsequent 2 years.

The tiered consensus algorithm, with a code count threshold of ≥2 and also using 3-year data, improved PPV significantly over baseline to 83.2% (P = 0.003), with an NPV of 53.8%. The algorithm differed from simply using the inclusion criterion (presence of 332.0) in that it excluded 119 of the 577 patients by producing a final output of Not PD. Of those 119 patients, 64 were determined to be Not PD (i.e., appropriately excluded) and 55 PD (i.e., inappropriately excluded) according to our gold standard. Of the algorithms tested, only this algorithm agreed with the gold standard working diagnosis more often (445 patients) than our baseline criterion, presence of 332.0 alone (436 patients). The tiered consensus algorithm with medications improved PPV significantly over the version without medications to 88.2% (P = 0.04), with a significant loss of sensitivity, falling from 87.4 to 77.1% (P = 0.0001).

DISCUSSION

Summary of Results

Nearly 25% of the patients identified using a single ICD-9-CM code for idiopathic Parkinson’s disease (332.0) within our VA database were determined by expert chart review not to carry a working diagnosis of PD. In general, PPV was increased by assignment of 332.0 by clinics of higher expertise or with higher frequency and by presence of any PD-related medication. (Specificity also was increased by presence of any PD-related medication, with the exception of L-dopa, likely because of its frequent use in diagnostic trials.) NPV dropped with addition of a medication criterion to our algorithms. The tiered consensus algorithm, designed to emulate an approach taken by an expert chart reviewer, produced high PPV and NPV (NPV was the highest of any variable or algorithm tested), with relatively spared sensitivity when the medication modifier was not used. When the medication modifier was used with this algorithm, specificity improved dramatically with excellent PPV, but at the cost of NPV. The tiered consensus algorithm was unique among the models tested in agreeing with our gold standard more often than the inclusion criterion alone.

Sources of Error in Case Ascertainment from Administrative Databases

Case ascertainment from administrative databases presents many potential pitfalls. Basic models of error sources have been described and quantified, and have been shown to vary widely by diagnosis and site of care delivery.13 Key sources of potential error in database ICD-9-CM codes include:

  1. Inaccurate diagnosis (at odds with clinical facts).

  2. Premature or uncertain diagnosis (lacking important clinical information that will develop later).

  3. Missing diagnosis (clinical features of interest not evaluated or no attempt at diagnosis).

  4. Coding error (misapplication/misuse of codes, clerical error, or omission).

PD may be particularly susceptible to these errors in coding because no definitive diagnostic test exists and the disease’s natural history may lead to application of multiple diagnostic labels over time, influenced by new clinical data (e.g., imaging, medication response), patterns of clinical progression, or examiner experience with determining or excluding a diagnosis of PD.

Other database information, such as pharmacy prescription data, may also be used for case ascertainment. However, this may be misleading because medications may be used for conditions other than the target condition (e.g. dopaminergic agents for restless legs syndrome rather than PD), diagnostic trials, or for inappropriate indications. Prescription records also lack data to describe feedback from a diagnostic trial, such as when failure to respond to L-dopa suggests an atypical Parkinsonism other than PD.

An expert chart reviewer may potentially overcome all these sources of error, provided enough longitudinal clinical data of various types has been accumulated and recorded accurately, by looking at cumulative data compiled over time and by taking into account the narrative assessments and plans recorded by providers as they care for patients. However, rapid screening of large databases requires that these approaches be emulated in an automated fashion, with imperfect results.

In Perspective: Comparison with Previous Studies

No confirmatory test for PD, short of pathology, exists, and varying gold standards have been used in the administrative-database literature on PD for determining “true” cases. Two prior studies have reported accuracy of PD or Parkinsonism codes in an administrative database, but each used a different gold standard than ours. One study used a gold standard of patient self-report of PD diagnosis or use of PD medication14 and found that presence of code 332.0, or presence of any of multiple codes (332.0, 332.1, 333.0, 333.1), was associated with high sensitivity and PPVs comparable with our study. A second study examined the accuracy of the diagnosis of Parkinsonism in a VA database using a gold standard of chart review with explicit diagnostic criteria adapted from the London Brain Bank.4 They examined PD-related codes (332.0, 332.1, or 333.0) and PD-related medications (dopamine agonist or L-dopa), as well as a combinatorial approach (presence of codes or medications) to increase sensitivity (with concomitant loss of PPV). They reported similar predictive values and much higher sensitivity compared with our study, but their algorithms identified Parkinsonism, encompassing a broader set of patients than idiopathic PD. (In our study, our chart abstractors labeled patients with Parkinsonism but with a working diagnosis other than PD as Not PD.) Our reviewers also used implicit judgment weighing the documented clinical data, rather than explicit diagnostic criteria, to determine the working diagnosis.

Limitations of Present Study

Our study methods overestimate the true overall sensitivity of all the models tested, because any prevalent cases of true PD not given a code of 332.0 during the ascertainment year are excluded from analysis. Idiosyncratic coding practices (for example, use of 333.0 rather than 332.0 for PD in a movement disorders clinic, reported in one study4) or coding error during entry, which has been estimated at 3% in a VA database across multiple services,15 may mislead any algorithm. We tried to minimize the impact of spurious codes in our tiered consensus algorithm by requiring that 332.0 be assigned at least twice and by using the code most often assigned at the tier of highest expertise. Any algorithm may erroneously assign PD to a patient if it cannot build a case against PD by making use of codes representing similar diagnoses (PD mimics). We sought to minimize this by tracking use of competing ICD-9-CM codes 332.1, 333.0, and 333.1. However, other codes (such as 781.0, 781.2, or 781.3; see Appendix) may also describe symptoms labeled 332.0 elsewhere, and these were not included in this analysis. This study describes the characteristics of our particular clinical environment; the results may not be generalizable to other populations or clinical settings because of differences in prevalence or coding patterns. Similarly, the results cannot be generalized to other diagnoses, as the characteristics implicitly depend on the nature of the disease being coded.

Useful Application of Database Algorithms: PD and Beyond

Analysis of database interrogation algorithms of the sort described here may allow investigators to craft an approach to accentuate the qualities they seek. For example, an investigator seeking patterns of health care access and delivery may desire a very high sensitivity for case ascertainment; an investigator seeking patients eligible for a phase III drug trial may desire a very high PPV with higher specificity. The models and algorithms described here may help PD investigators approach these questions in large databases with more awareness of what characteristics will help include or exclude particular types of patients. Similar analyses, using the same kinds of principles to help develop an algorithm that mimics an expert chart review, may be developed for other conditions and diagnoses to aid researchers in other fields.

Acknowledgments

Research supported by the Southwest Parkinson’s Disease Research, Education, and Clinical Center (PADRECC). Dr. Cheng is supported by an NINDS career development award (K23NS058571). We thank Stefanie D. Vassar for her help in compiling data and performing statistical analyses.

APPENDIX

ICD-9-CM codes mentioned in the present discussion:

  • 332.0: Parkinson’s disease

  • 332.1: Secondary Parkinsonism

  • 333.0: Other degenerative diseases of the basal ganglia

  • 333.1: Essential and other specified forms of tremor

  • 781.0: Abnormal involuntary movements

  • 781.2: Abnormality of gait

  • 781.3: Lack of coordination

Footnotes

Potential conflict of interest: None reported.

REFERENCES

  • 1.Romano PS, Chan BK, Schembri ME, Rainwater JA. Can administrative data be used to compare postoperative complication rates across hospitals? Med Care. 2002;40:856–867. doi: 10.1097/00005650-200210000-00004. [DOI] [PubMed] [Google Scholar]
  • 2.Geraci JM, Ashton CM, Kuykendall DH, Johnson ML, Wu L. International classification of diseases, 9th revision, clinical modification codes in discharge abstracts are poor measures of complication occurrence in medical inpatients. Med Care. 1997;35:589–602. doi: 10.1097/00005650-199706000-00005. [DOI] [PubMed] [Google Scholar]
  • 3.Benesch C, Witter DM, Jr, Wilder AL, Duncan PW, Samsa GP, Matchar DB. Inaccuracy of the international classification of diseases (ICD-9-CM) in identifying the diagnosis of ischemic cerebrovascular disease. Neurology. 1997;49:660–664. doi: 10.1212/wnl.49.3.660. [DOI] [PubMed] [Google Scholar]
  • 4.Swarztrauber K, Anau J, Peters D. Identifying and distinguishing cases of parkinsonism and Parkinson’s disease using ICD-9 CM codes and pharmacy data. Mov Disord. 2005;20:964–970. doi: 10.1002/mds.20479. [DOI] [PubMed] [Google Scholar]
  • 5.Pippenger M, Holloway RG, Vickrey BG. Neurologists’ use of ICD-9CM codes for dementia. Neurology. 2001;56:1206–1209. doi: 10.1212/wnl.56.9.1206. [DOI] [PubMed] [Google Scholar]
  • 6.Guevara RE, Butler JC, Marston BJ, Plouffe JF, File TM, Jr., Breiman RF. Accuracy of ICD-9-CM codes in detecting community-acquired pneumococcal pneumonia for incidence and vaccine efficacy studies. Am J Epidemiol. 1999;149:282–289. doi: 10.1093/oxfordjournals.aje.a009804. [DOI] [PubMed] [Google Scholar]
  • 7.Szeto HC, Coleman RK, Gholami P, Hoffman BB, Goldstein MK. Accuracy of computerized outpatient diagnoses in a Veterans Affairs general medicine clinic. Am J Manag Care. 2002;8:37–43. [PubMed] [Google Scholar]
  • 8.Culpepper WJ, II, Ehrmantraut M, Wallin MT, Flannery K, Bradham DD. Veterans Health Administration multiple sclerosis surveillance registry: the problem of case-finding from administrative databases. J Rehabil Res Dev. 2006;43:17–24. doi: 10.1682/jrrd.2004.09.0122. [DOI] [PubMed] [Google Scholar]
  • 9.Reker DM, Hamilton BB, Duncan PW, Yeh SC, Rosen A. Stroke: who’s counting what? J Rehabil Res Dev. 2001;38:281–289. [PubMed] [Google Scholar]
  • 10.Singh JA, Holmgren AR, Noorbaloochi S. Accuracy of veterans administration databases for a diagnosis of rheumatoid arthritis. Arthritis Rheum. 2004;51:952–957. doi: 10.1002/art.20827. [DOI] [PubMed] [Google Scholar]
  • 11.Hughes AJ, Daniel SE, Ben-Shlomo Y, Lees AJ. The accuracy of diagnosis of parkinsonian syndromes in a specialist movement disorder service. Brain. 2002;125(Part 4):861–870. doi: 10.1093/brain/awf080. [DOI] [PubMed] [Google Scholar]
  • 12.Cheng EM, Swarztrauber K, Siderowf AD, et al. Association of specialist involvement and quality of care for Parkinson’s disease. Mov Disord. 2007;22:515–522. doi: 10.1002/mds.21311. [DOI] [PubMed] [Google Scholar]
  • 13.Peabody JW, Luck J, Jain S, Bertenthal D, Glassman P. Assessing the accuracy of administrative data in health information systems. Med Care. 2004;42:1066–1072. doi: 10.1097/00005650-200411000-00005. [DOI] [PubMed] [Google Scholar]
  • 14.Noyes K, Liu H, Holloway R, Dick AW. Accuracy of medicare claims data in identifying Parkinsonism cases: comparison with the Medicare current beneficiary survey. Mov Disord. 2007;22:509–514. doi: 10.1002/mds.21299. [DOI] [PubMed] [Google Scholar]
  • 15.Lloyd SS, Rissing JP. Physician and coding errors in patient records. JAMA. 1985;254:1330–1336. [PubMed] [Google Scholar]
