Abstract
OBJECTIVE
To determine the positive predictive value of ICD-9-CM coding of acute myocardial infarction and the sensitivity and specificity of coding of related cardiac procedures.
METHODS
Using chart-abstracted data as the standard, we examined administrative data from the Veterans Health Administration for a national random sample of 5,151 discharges.
MAIN RESULTS
The positive predictive value of acute myocardial infarction coding in the primary position was 96.9%. The sensitivity and specificity of coding were, respectively, 96% and 99% for catheterization, 95.7% and 100% for coronary artery bypass graft surgery, and 90.3% and 99.7% for percutaneous transluminal coronary angioplasty.
CONCLUSIONS
The positive predictive value of acute myocardial infarction coding and the accuracy of related procedure coding are comparable to or better than values previously reported for other administrative databases.
Keywords: DRG, information systems, medical records, databases, myocardial infarction
Databases containing the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM)1 coding of discharge diagnoses are used for a variety of purposes,2 including reimbursement, budgetary planning, monitoring of clinical care activities,3,4 health services research,5 and development of clinical guidelines.6 As the scope of utilization of these data broadens, the importance of ICD-9-CM coding accuracy increases.
Reimbursement policies and regulations have provided incentives to both improve coding of procedures and preferentially increase the coding of acute over chronic conditions (“DRG creep”),7–9 raising concerns about the uses of administrative data.10–15
Acute myocardial infarction (AMI) (ICD-9-CM 410.0–410.9) is a common discharge diagnosis and a frequent topic of study in health services and clinical epidemiologic research. Despite increasing use of discharge databases, there are no recent national studies of the accuracy of the coding of AMI.16–22 Using defined inclusion and exclusion criteria to enhance the accuracy of diagnostic coding, we studied coding for AMI within the Patient Treatment File (PTF), the national automated patient discharge database of the Veterans Health Administration.
METHODS
All hospital discharges of male veterans with a primary ICD-9-CM diagnosis of AMI (410) recorded in the PTF between January 1, 1994, and September 30, 1995, were eligible. Until 1995, the Department of Veterans Affairs (VA) used the “primary” diagnosis for each discharge. Currently, the VA uses the term “principal” diagnosis. In pilot data, we found that for 85% of AMI cases in the VA, the primary diagnosis was the reason for admission (in other words, was equivalent to the principal diagnosis). We therefore used the primary diagnosis in this study. The PTF contains a patient identifier, patient characteristics, discharge diagnoses, and ICD-9-CM procedure codes. For the purposes of our research, we used a sequentially applied algorithm of exclusions to refine our cohort and to focus on decision making for incident cases of AMI. Exclusions were a length of stay greater than 180 days, discharge alive with a length of stay of less than 3 days, transfer from a non-VA hospital, AMI occurring after noncardiac surgery, a cardiac procedure coded in the 90 days before admission, or an AMI coded at any time during the prior year. Cases with a fifth-digit ICD-9-CM code of 2 (indicating AMI in the prior 8 weeks) were excluded because the purpose was to identify the initial admission for AMI.
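To make the sequential exclusion algorithm concrete, the sketch below applies each exclusion in order to a single discharge record. The field names are hypothetical and do not correspond to the actual PTF record layout; the sketch only restates the logic described above.

```python
# Minimal sketch of the sequential exclusion algorithm described in the text.
# Field names are illustrative and do not reflect the actual PTF schema.

def is_eligible(d: dict) -> bool:
    """Return True if a discharge record survives every exclusion."""
    if d["length_of_stay_days"] > 180:
        return False                      # stay longer than 180 days
    if not d["died_in_hospital"] and d["length_of_stay_days"] < 3:
        return False                      # discharged alive in under 3 days
    if d["transferred_from_non_va_hospital"]:
        return False                      # transfer from a non-VA hospital
    if d["ami_after_noncardiac_surgery"]:
        return False                      # AMI following noncardiac surgery
    if d["cardiac_procedure_prior_90_days"]:
        return False                      # cardiac procedure in prior 90 days
    if d["ami_coded_prior_year"]:
        return False                      # AMI coded during the prior year
    if d["icd9_fifth_digit"] == 2:
        return False                      # AMI within the prior 8 weeks
    return True

example = {
    "length_of_stay_days": 7, "died_in_hospital": False,
    "transferred_from_non_va_hospital": False,
    "ami_after_noncardiac_surgery": False,
    "cardiac_procedure_prior_90_days": False,
    "ami_coded_prior_year": False, "icd9_fifth_digit": 1,
}
print(is_eligible(example))  # True: this discharge remains in the cohort
```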
A random sample of 5,151 patients was generated. Because of the differential rates of cardiac procedure use across Veterans Affairs Medical Centers (VAMCs), we sampled patients stratified by the on-site availability of cardiac procedure technology.23
We used the Cooperative Cardiovascular Project structured review instrument24 and specific criteria to confirm the diagnosis of AMI. Four registered nurses collected data from the medical record including date of birth, race, symptoms on presentation, laboratory values, and electrocardiography findings.
As in other studies of coding, we used three categories of data to evaluate the diagnosis of AMI: patient symptoms, electrocardiographic data, and cardiac enzyme values.16 Patient symptoms included chest pain, discomfort, pressure or heaviness, or epigastric discomfort; angina; discomfort or pain in arms, back, or jaw; nausea; vomiting; diaphoresis; sense of impending doom or anxiety; cardiac or respiratory arrest; sudden death; syncope; shortness of breath; or new-onset pedal edema. Criteria for this category were met if the record indicated at least one of these symptoms. Electrocardiographic criteria were evaluated by review of the admitting electrocardiogram: reviewers assessed the presence of new Q waves, progressive evolution of T-wave changes, or ST-segment elevation or depression accompanied by one of the preceding findings. Cardiac enzyme criteria were met when at least one of the patient's laboratory cardiac enzyme values (peak creatine phosphokinase [CK], CK-MB band greater than 5%, or peak lactate dehydrogenase level greater than normal with an isoenzyme fraction 1 greater than fraction 2) from the first 48 hours following the onset of symptoms was above normal for the institution. Patients meeting at least two of the three categories of clinical criteria were judged to have had an AMI, as were patients who died within 24 hours of admission and met the symptom criterion but neither the enzyme nor the electrocardiographic criterion.18
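The confirmation rule can be summarized as a simple decision function. The sketch below is an illustrative restatement of the two-of-three rule and the early-death exception, not the abstraction instrument itself; the boolean inputs are assumed to come from chart abstraction.

```python
# Sketch of the chart-review confirmation rule: an AMI is confirmed when at
# least two of the three clinical criteria are met, or when the patient died
# within 24 hours of admission with symptoms but without enzyme or
# electrocardiographic confirmation.

def ami_confirmed(symptoms: bool, ecg: bool, enzymes: bool,
                  died_within_24h: bool) -> bool:
    if sum([symptoms, ecg, enzymes]) >= 2:
        return True
    # Early death with symptoms only is also accepted as a confirmed AMI.
    if died_within_24h and symptoms and not ecg and not enzymes:
        return True
    return False

print(ami_confirmed(symptoms=True, ecg=True, enzymes=False,
                    died_within_24h=False))  # True: two of three criteria met
print(ami_confirmed(symptoms=True, ecg=False, enzymes=False,
                    died_within_24h=True))   # True: early death with symptoms
```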
All cardiac procedures documented on the discharge summary sheet or in the progress notes were recorded. Nurses were not blinded to the PTF coding of these procedures.
Analysis
We considered the medical record abstraction data to be the standard against which the accuracy of the ICD-9-CM codes in the PTF was assessed. The positive predictive value is the conditional probability that an AMI was present on admission given that it was coded in the PTF. Patient and hospital characteristics associated with the confirmation of a diagnosis of AMI were assessed using χ2 tests.
The sensitivity of procedure coding is the proportion of cases with the procedure documented on chart review that also had the corresponding ICD-9-CM procedure code in the PTF. The specificity of procedure coding is the proportion of cases without the procedure on chart review that had no corresponding procedure code in the PTF.
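As an illustration, the sketch below computes these measures from the cells of a two-by-two table in which chart abstraction is the reference standard; the cardiac catheterization counts from Table 3 are used as a worked example.

```python
# Sketch of the accuracy measures used in the analysis, computed from a
# 2x2 table with chart abstraction as the reference standard.

def sensitivity(coded_confirmed: int, not_coded_confirmed: int) -> float:
    # Proportion of chart-confirmed procedures that were coded in the PTF.
    return coded_confirmed / (coded_confirmed + not_coded_confirmed)

def specificity(not_coded_not_confirmed: int, coded_not_confirmed: int) -> float:
    # Proportion of cases without the procedure on chart review that had
    # no PTF procedure code.
    return not_coded_not_confirmed / (not_coded_not_confirmed + coded_not_confirmed)

def positive_predictive_value(coded_confirmed: int, coded_not_confirmed: int) -> float:
    # Proportion of PTF-coded cases confirmed by chart review (used for AMI).
    return coded_confirmed / (coded_confirmed + coded_not_confirmed)

# Cardiac catheterization counts from Table 3
print(round(100 * sensitivity(1293, 54), 1))   # 96.0
print(round(100 * specificity(3187, 31), 1))   # 99.0
```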
RESULTS
A sample of 5,151 medical charts was requested from 81 VAMCs, and 4,712 (92%) were received and reviewed. Abstractors confirmed a diagnosis of AMI in 96.9% of these. There was no significant difference by admitting hospital type in either the percentage of records received or the confirmed diagnosis of AMI.
Table 1 displays patient and hospital characteristics. Patients with confirmed AMI were younger (mean age ± SD of confirmed cases 65.8 ± 10.8 years vs 67.9 ± 10.5 years for unconfirmed cases; p < .05) and more likely to have one of the three cardiac procedures coded than those who did not meet study criteria for AMI (29.5% vs 16.3%; p < .001). There were no significant differences by race, number of coded secondary diagnoses, or initial admission to a VAMC that performs cardiac surgery.
Table 1. Patient and Hospital Characteristics

| Patient Characteristics | AMI Confirmed (n = 4,565) | AMI Unconfirmed (n = 147) | p Value |
|---|---|---|---|
| Mean age, years (±SD) | 65.8 (±10.8) | 67.9 (±10.5) | <.05 |
| Race, n (%) | | | .69 |
| White | 3,778 (82.8) | 120 (81.6) | |
| African American | 572 (12.5) | 17 (11.6) | |
| Hispanic | 104 (2.3) | 5 (3.4) | |
| Other | 111 (2.4) | 5 (3.4) | |
| More than 5 secondary diagnoses coded, n (%) | 2,817 (61.7) | 99 (67.4) | .17 |
| Presence of any cardiac procedure code, n (%) | 1,345 (29.5) | 24 (16.3) | <.001 |
| Admitting hospital type, n (%) | | | .24 |
| Low-volume basic service | 1,189 (26.0) | 48 (32.7) | |
| High-volume basic service | 965 (21.1) | 27 (18.4) | |
| Catheterization only | 1,249 (27.4) | 33 (22.5) | |
| Cardiac surgery | 1,162 (25.5) | 39 (26.5) | |
Table 2 displays the classification of the 147 cases of false-positive AMI coding. Diagnosis in the 8 weeks prior to admission accounted for 44% of cases of false-positive coding, and “rule-out” AMI that did not meet clinical criteria accounted for a total of 26% of false-positive coding.
Table 2. Classification of the 147 Cases of False-Positive AMI Coding

| Reason for Exclusion | n (%) |
|---|---|
| Diagnosis of AMI in the 8 weeks prior to admission | 65 (44) |
| Met symptom criterion only; AMI ruled out | 23 (16) |
| Met electrocardiographic criterion only; AMI ruled out | 8 (5) |
| Met enzyme criterion only; AMI ruled out | 7 (5) |
| AMI more than 5 days prior to admission; admission from home | 9 (6) |
| History of AMI only; not an acute presentation | 8 (5) |
| No mention of AMI in record | 7 (5) |
| Other conditions miscoded as AMI (stroke, n = 3; respiratory arrest, n = 5; pulmonary embolus, n = 1) | 9 (6) |
| Cardiac arrest associated with other conditions; AMI not clearly documented | 6 (4) |
| Missing laboratory and electrocardiographic data | 3 (2) |
| No laboratory tests or electrocardiogram performed | 2 (1) |
Table 3 displays cardiac procedure coding in the PTF among confirmed AMI cases. Sensitivity of PTF coding of cardiac catheterization was 96.0%; of coronary artery bypass graft surgery (CABG), 95.7%; and of percutaneous transluminal coronary angioplasty (PTCA), 90.3%. The specificity of coding of catheterization was 99.0%; of CABG, 100%; and of PTCA, 99.7%. We matched chart coding of date of procedure with PTF coding of date of procedure. There was an exact date match for 95.7% of catheterizations, 98.3% of CABG surgeries, and 94.0% of PTCA procedures.
Table 3. Cardiac Procedure Coding in the PTF Among Confirmed AMI Cases

| Patient Treatment File Coding* | Procedure Confirmed on Chart Review, n | Procedure Not Confirmed on Chart Review, n |
|---|---|---|
| Cardiac catheterization coded | 1,293 | 31 |
| Cardiac catheterization not coded | 54 | 3,187 |
| CABG coded | 88 | 0 |
| CABG not coded | 4 | 4,473 |
| PTCA coded | 289 | 12 |
| PTCA not coded | 31 | 4,233 |

*CABG indicates coronary artery bypass graft surgery; PTCA, percutaneous transluminal coronary angioplasty.
DISCUSSION
This study of the Veterans Health Administration's discharge database demonstrates that the positive predictive value of the diagnosis of AMI and cardiac procedures in the PTF is high when using an algorithm to refine a cohort of AMI cases. For cases with AMI coded in the primary coding position, 96.9% had an AMI on admission to the hospital. Absence of clinical criteria for AMI or myocardial infarction in the prior 8 weeks accounted for the majority of false-positive AMI codes. The large number of false-positive cases with an AMI in the prior 8 weeks draws attention to a potential problem with the accuracy of the fifth-digit code of 2.
In this study, we used an algorithm to exclude cases of patients hospitalized less than 3 days because “rule-out” myocardial infarction has been a source of miscoding in prior studies.16,17 Indeed, our findings document a higher positive predictive value for AMI coding than have other studies of coding accuracy of AMI in both non-VA data15–22,25 and VA data.26 Without the use of this algorithm, our conclusions might have been quite different. We cannot exclude the possibility that increasing incentives to code procedures and acute illness more accurately account for our findings.7–9
This study shares at least one of the limitations cited in previous coding validation efforts in that we were not able to assess the number of patients who had an AMI documented in the medical record but not in the administrative database.8,15
Discharge abstract databases provide a wealth of information on resource utilization and outcomes that is used for reimbursement, quality monitoring, research, and a growing list of other purposes. We believe that health services research studies that use administrative data should routinely incorporate validation of ICD-9-CM diagnostic accuracy into their projects, as well as the development of algorithms to increase the likelihood of identifying true-positive cases.
Acknowledgments
Dr. Petersen is an Associate in the Career Development Award Program of the VA Health Services Research and Development (HSR&D) Service. Dr. Daley was a Senior Research Associate in the same program at the time this research was conducted. This project was supported by grants IIR 94-054 and PPR 942-D001 from the VA HSR&D Service.
The authors thank Rebecca Lamkin, Judith Beard, Susan Kluiber, Caterina Brown, Kathleen Hickson, RN, Kathleen Sheehan, RN, Rita Krolak, RN, and Kathleen Greene, RN, for their help with this project.
REFERENCES
- 1. Public Health Service, US Department of Health and Human Services. International Classification of Diseases, Ninth Revision, Clinical Modification. Washington, DC: US Government Printing Office; 1980. pp. 366–7. Publication (PHS) 80-1260.
- 2. Laine C. Coming to grips with large databases. Ann Intern Med. 1997;127:645–7. doi: 10.7326/0003-4819-127-8_part_1-199710150-00012.
- 3. Roper WL, Winkenwerder W, Hackbarth GW, Krakauer H. Effectiveness in health care: an initiative to evaluate and improve medical practice. N Engl J Med. 1988;319:1197–1202. doi: 10.1056/NEJM198811033191805.
- 4. Tierney WM, Overhage JM, McDonald DJ. Toward electronic medical records that improve care. Ann Intern Med. 1995;122:725–6. doi: 10.7326/0003-4819-122-9-199505010-00011.
- 5. Lohr KN. Use of insurance claims data in measuring quality of care. Int J Technol Assess Health Care. 1990;6:262–71. doi: 10.1017/s0266462300000787.
- 6. Fine MJ, Auble TE, Yealy DM, et al. A prediction rule to identify low-risk patients with community-acquired pneumonia. N Engl J Med. 1997;336:243–50. doi: 10.1056/NEJM199701233360402.
- 7. Simborg DW. DRG creep: a new hospital-acquired disease. N Engl J Med. 1981;304:1602–4. doi: 10.1056/NEJM198106253042611.
- 8. Jollis JG, Ancukiewicz M, DeLong ER, Pryor DB, Muhlbaier LH, Mark DB. Discordance of databases designed for claims payment versus information systems: implications for outcomes research. Ann Intern Med. 1993;119:844–50. doi: 10.7326/0003-4819-119-8-199310150-00011.
- 9. Assaf AR, Lapane KL, McKenney JL, Carleton RA. Possible influence of the prospective payment system on the assignment of discharge diagnoses for coronary heart disease. N Engl J Med. 1993;329:931–5. doi: 10.1056/NEJM199309233291307.
- 10. Roos LL, Roos NP, Cageorge SM, Nicol JP. How good are the data? Reliability of one health care data bank. Med Care. 1982;20:266–76. doi: 10.1097/00005650-198203000-00003.
- 11. Iezzoni L. Using administrative diagnostic data to assess the quality of hospital care: pitfalls and potential of ICD-9-CM. Int J Technol Assess Health Care. 1990;6:272–81. doi: 10.1017/s0266462300000799.
- 12. Dans P. Looking for answers in all the wrong places. Ann Intern Med. 1993;119:855–6. doi: 10.7326/0003-4819-119-8-199310150-00014.
- 13. Romano PS, Mark DH. Bias in the coding of hospital discharge data and its implications for quality assessment. Med Care. 1994;32:81–90. doi: 10.1097/00005650-199401000-00006.
- 14. Demlo LK, Campbell PM, Brown SS. Reliability of information abstracted from patient's medical records. Med Care. 1978;16:995–1005. doi: 10.1097/00005650-197812000-00003.
- 15. Fisher ES, Whaley FS, Krushat WM, et al. The accuracy of Medicare's hospital claims data: progress has been made, but problems remain. Am J Public Health. 1992;82:243–8. doi: 10.2105/ajph.82.2.243.
- 16. Iezzoni LI, Burnside S, Sickles L, et al. Coding of acute myocardial infarction: clinical and policy implications. Ann Intern Med. 1988;109:745–51. doi: 10.7326/0003-4819-109-9-745.
- 17. Schiff GD, Yaacoub AS. The diagnostic coding of myocardial infarction. Ann Intern Med. 1988;109:745–51. doi: 10.7326/0003-4819-110-3-243_1. Letter.
- 18. Daley J, Jencks SF, Draper D, et al. Predicting hospital-associated mortality for Medicare patients: a method for patients with stroke, pneumonia, acute myocardial infarction, and congestive heart failure. JAMA. 1988;260:3617–24. doi: 10.1001/jama.260.24.3617.
- 19. Kennedy GT, Stern MP, Crawford MH. Miscoding of hospital discharges as acute myocardial infarction: implications for surveillance programs aimed at elucidating trends in coronary artery disease. Am J Cardiol. 1984;53:1000–2. doi: 10.1016/0002-9149(84)90625-8.
- 20. Reznik RB, Goldstein GB, Ring I, Berry G. The determination of the incidence of acute myocardial infarction from hospital morbidity records. J Chronic Dis. 1984;9/10:733–42. doi: 10.1016/0021-9681(84)90042-0.
- 21. Nova Scotia–Saskatchewan Cardiovascular Epidemiology Group. Estimation of the incidence of acute myocardial infarction using record linkage: a feasibility study in Nova Scotia and Saskatchewan. Can J Public Health. 1989;80:412–7.
- 22. van Walraven C, Wang B, Ugnat AM, Naylor CD. False-positive coding for acute myocardial infarction on hospital discharge records: chart audit results from a tertiary centre. Can J Cardiol. 1990;6:383–6.
- 23. Wright SM, Daley J, Peterson ED, Thibault GE. Outcome of acute myocardial infarction in the Department of Veterans Affairs: does regionalization of health care work? Med Care. 1997;35:128–41. doi: 10.1097/00005650-199702000-00004.
- 24. Jencks SF. HCFA's Health Care Quality Improvement Program and the Cooperative Cardiovascular Project. Ann Thorac Surg. 1994;58:1858–62. doi: 10.1016/0003-4975(94)91727-2.
- 25. Ellerbeck EF, Jencks SF, Radford MJ, et al. Quality of care for Medicare patients with acute myocardial infarction: report on a four-state pilot of the Cooperative Cardiovascular Project. JAMA. 1995;273:1509–14.
- 26. Hunte-Young NL, Hamann C, Cagan M, Daley J, Thibault GE. Validity and reliability of coding in the Veterans Health Administration Patient Treatment File for veterans with acute myocardial infarctions. Abstract presented at the VA HSR&D annual meeting; April 1994.