ABSTRACT
Purpose
Contemporary epidemiologic research on acute myocardial infarction (AMI) using electronic health records (EHR) relies on International Classification of Diseases, 10th Revision, Clinical Modification (ICD‐10‐CM) codes, but few studies have validated these codes in the United States. This study therefore aimed to validate AMI events identified by ICD‐10‐CM diagnosis codes.
Methods
The study was conducted as part of a hepatitis B vaccine safety study. Suspected cases of AMI were identified using ICD‐10‐CM codes (I21.* or I22.*) in any diagnosis position from August 7, 2018 to November 30, 2020. Cases were adjudicated independently by two cardiologists, with a third resolving disagreements. Positive predictive value (PPV) was calculated as the percentage of suspected cases that were confirmed as definite or probable AMI on review, and exact binomial 95% confidence intervals (CI) were estimated.
Results
Of 202 potential AMI events identified among 69 625 individuals, 162 (80.2% [95% CI: 74.0%–85.5%]) were confirmed. Encounters with AMI coded as the principal discharge diagnosis were more likely to be confirmed (86.8% [80.5%–91.6%]) than those with AMI in another diagnosis position (55.8% [39.9%–70.9%]), and patients with a history of congestive heart failure or peripheral vascular disease had lower PPVs (65.7% [47.8%–80.9%] and 65.6% [46.8%–81.4%], respectively) than those without (83.2% [76.7%–88.6%] and 82.9% [76.4%–88.3%]).
Conclusion
We found that over 80% of AMI cases identified with ICD‐10‐CM codes were confirmed upon cardiologist adjudication. Cases not coded in the principal diagnosis position were much less likely to be confirmed, and care should be taken when using them in EHR‐based research.
Keywords: acute myocardial infarction, ICD‐10 coding, validation
Summary
- Little work has been done to validate the accuracy of ICD‐10 coding of acute myocardial infarction (AMI).
- Our study found that 80.2% of hospitalized encounters coded as AMI were true AMI events.
- Encounters with AMI coded as the principal discharge diagnosis were more likely to be true events, at 86.8%.
- Encounters where AMI was not the principal discharge diagnosis were less likely to be true AMI, with 55.8% confirmed.
- Patients with congestive heart failure or peripheral vascular disease were less likely to have accurate coding of AMI events.
1. Introduction
Acute myocardial infarction (AMI) is one of the leading causes of death, with an annual estimate of more than 1 million AMI‐related deaths in the United States (US). Specialized hospitals are equipped to promptly treat patients exhibiting AMI signs and symptoms, which can be categorized into ST‐elevation myocardial infarction (STEMI) or non‐ST‐elevation myocardial infarction (NSTEMI); each type requires different clinical management [1]. Because of the acute nature of AMI, treatment within 6 h of symptom onset is essential for a favorable outcome, and providers must identify the appropriate course of AMI clinical management the moment a patient presents with AMI signs and symptoms [2, 3]. The first steps include running diagnostic tests to determine whether the presenting signs and symptoms are due to an AMI, other cardiac conditions such as myocarditis or pericarditis, or non‐cardiac conditions such as asthma and esophagitis [1, 2, 3, 4].
In 2015, the US transitioned standardized coding from International Classification of Diseases, Ninth Revision (ICD‐9) to International Classification of Diseases, Tenth Revision, Clinical Modification (ICD‐10‐CM) [5], which increased the potential specificity of coding of AMI [6]. In 2018, the Global Myocardial Infarction (MI) Task Force established the Fourth Universal Definition of MI, detailing criteria that would assist clinicians in diagnosing AMI [4].
Accurate ICD‐10‐CM coding for AMI is important for studies using electronic health record (EHR) data, because inaccurate coding that results in misclassification of study measures can bias study results. Previous studies have assessed the validity of ICD‐9 AMI codes. A systematic review published in 2015 estimated the positive predictive value (PPV) of AMI diagnoses identified by ICD‐9‐CM codes from studies conducted in European, North and South American, and Asian countries, finding PPVs generally greater than 70% [7]. Few studies have examined the PPV of ICD‐10‐CM codes for AMI, with three studies in Asia finding PPVs ranging from 71% to 85% [8, 9, 10] and one study in Europe finding PPVs of 74%–100% [11]. In one US study, an AMI ICD‐10‐CM‐based algorithm was developed and applied to a large US healthcare claims database; the authors characterized the AMI population and generated descriptive statistics but acknowledged that their efforts should not be viewed as a substitute for medical chart validation [12]. In the current study, we assessed the validity of AMI ICD‐10‐CM codes adjudicated as part of a hepatitis B vaccine safety study.
2. Materials and Methods
This validation study was conducted within a pragmatic, prospective cohort study examining the safety of a hepatitis B vaccine, in which the primary outcome was AMI. The overall study methods are described in detail elsewhere [13]. The study was conducted within Kaiser Permanente Southern California (KPSC), an integrated health care delivery system currently with over 4.8 million members. The study was approved by the KPSC Institutional Review Board, which waived the requirement for informed consent due to minimal risk to participants.
All adult KPSC members who met the eligibility criteria and received ≥ 1 dose of the hepatitis B vaccine between August 7, 2018 and October 31, 2019 at encounters in family medicine or internal medicine departments were followed for AMI for the subsequent 13 months.
Suspected cases of AMI were identified in the EHR using ICD‐10‐CM codes I21.* or I22.* in any diagnosis position (Table 1). Inpatient records were the primary source, supplemented by emergency department (ED) records for patients who were hospitalized or died within 24 h of the encounter, to ensure complete capture of patients who died before they could be admitted. Charts of patients with suspected AMI were reviewed independently by two cardiologists following the Fourth Universal Definition of MI [4]. The cardiologists determined whether suspected cases were definite AMI, probable AMI, not AMI, or had insufficient information. In case of disagreement, a third cardiologist reviewed the case, and agreement between two of the three cardiologists was considered the final adjudication result. If all three cardiologists disagreed, the case was considered indeterminate. Cases determined to be definite or probable AMI were considered confirmed, while cases determined to be not AMI, indeterminate, or to have insufficient information were considered not confirmed. If an individual had multiple encounters coded for AMI, only the first encounter was reviewed.
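The adjudication rules above (two independent reviews, a third tie-breaking review, majority agreement, otherwise indeterminate) can be sketched as follows; this is an illustrative reconstruction, not the study's actual software, and the function and category names are assumptions:

```python
from collections import Counter

CONFIRMED = {"definite AMI", "probable AMI"}


def adjudicate(review1, review2, review3=None):
    """Combine independent cardiologist reviews into a final adjudication.

    Each review is one of 'definite AMI', 'probable AMI', 'not AMI', or
    'insufficient information'. The third review is consulted only when the
    first two disagree; agreement of any two reviewers is final, and a
    three-way disagreement is indeterminate.
    """
    if review1 == review2:
        return review1
    votes = Counter([review1, review2, review3])
    category, count = votes.most_common(1)[0]
    return category if count >= 2 else "indeterminate"


def is_confirmed(final_category):
    """Definite or probable AMI counts as confirmed; everything else does not."""
    return final_category in CONFIRMED
```

Under these rules an 'insufficient information' or indeterminate result is treated the same as 'not AMI' for the PPV calculation, which makes the reported PPV a conservative estimate.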
TABLE 1.
ICD‐10 codes for acute myocardial infarction.
| ICD‐10 code | Description | STEMI/NSTEMI |
|---|---|---|
| I21.0 | ST elevation (STEMI) myocardial infarction of anterior wall | STEMI |
| I21.01 | ST elevation (STEMI) myocardial infarction involving left main coronary artery | STEMI |
| I21.02 | ST elevation (STEMI) myocardial infarction involving left anterior descending coronary artery | STEMI |
| I21.09 | ST elevation (STEMI) myocardial infarction involving other coronary artery of anterior wall | STEMI |
| I21.1 | ST elevation (STEMI) myocardial infarction of inferior wall | STEMI |
| I21.11 | ST elevation (STEMI) myocardial infarction involving right coronary artery | STEMI |
| I21.19 | ST elevation (STEMI) myocardial infarction involving other coronary artery of inferior wall | STEMI |
| I21.2 | ST elevation (STEMI) myocardial infarction of other sites | STEMI |
| I21.21 | ST elevation (STEMI) myocardial infarction involving left circumflex coronary artery | STEMI |
| I21.29 | ST elevation (STEMI) myocardial infarction involving other sites | STEMI |
| I21.3 | ST elevation (STEMI) myocardial infarction of unspecified site | STEMI |
| I21.4 | Non‐ST elevation (NSTEMI) myocardial infarction | NSTEMI |
| I21.9 | Acute myocardial infarction, unspecified | Unspecified |
| I21.A | Other type of myocardial infarction | Unspecified |
| I21.A1 | Myocardial infarction type 2 | Unspecified |
| I21.A9 | Other myocardial infarction type | Unspecified |
| I22.0 | Subsequent ST elevation (STEMI) myocardial infarction of anterior wall | STEMI |
| I22.1 | Subsequent ST elevation (STEMI) myocardial infarction of inferior wall | STEMI |
| I22.2 | Subsequent non‐ST elevation myocardial infarction (NSTEMI) | NSTEMI |
| I22.8 | Subsequent ST elevation (STEMI) myocardial infarction of other sites | STEMI |
| I22.9 | Subsequent ST elevation (STEMI) myocardial infarction of unspecified site | STEMI |
As patients could be seen in multiple care settings during the diagnosis and treatment of their suspected AMI, some patients had multiple diagnosis codes for AMI in their records. To identify the ultimate diagnosis from the encounter, the following prioritization was used: first, a code identified as the principal diagnosis was selected over others; second, inpatient codes were prioritized over ED codes; and third, codes from a KPSC facility (where inpatient coding is always done by professional coders after discharge) were selected over those from outside facilities (where coding methodology is unknown). For the few patients who still had multiple codes, the most specific code was selected, removing unspecified‐site codes (I21.9, AMI, unspecified, or I21.3, STEMI of unspecified site) in favor of a more specific code from the same setting.
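This prioritization is effectively a lexicographic sort. A minimal sketch follows; the record field names (`is_principal`, `setting`, `is_kpsc`, `code`) are illustrative assumptions, not the study's actual data model:

```python
# Codes deprioritized in the final tie-break (unspecified site).
UNSPECIFIED_SITE = {"I21.9", "I21.3"}


def priority_key(record):
    """Sort key implementing the prioritization above; False sorts first,
    so each clause is written as 'True when less preferred'."""
    return (
        not record["is_principal"],          # 1. principal diagnosis first
        record["setting"] != "inpatient",    # 2. inpatient over ED
        not record["is_kpsc"],               # 3. KPSC facility over outside
        record["code"] in UNSPECIFIED_SITE,  # 4. specific site over unspecified
    )


def select_ultimate_code(records):
    """Return the single highest-priority AMI code for an encounter."""
    return min(records, key=priority_key)


encounter_codes = [
    {"code": "I21.9", "is_principal": False, "setting": "ed", "is_kpsc": True},
    {"code": "I21.4", "is_principal": True, "setting": "inpatient", "is_kpsc": True},
]
```

Encoding the rules as one tuple-valued key keeps the hierarchy explicit: each later clause only breaks ties left by the earlier ones, mirroring the "first, second, finally" ordering in the text.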
Collected details of the AMI coding included whether it was in the principal diagnosis position; whether the AMI was coded as STEMI, NSTEMI, or unspecified; whether the encounter was in a KPSC or non‐KPSC facility; and length of stay. Patient characteristics were also collected, including demographics; comorbidities in the prior year, including congestive heart failure, peripheral vascular disease, diabetes, history of AMI, and smoking; Charlson Comorbidity Index; and the number of inpatient, ED, and outpatient healthcare encounters in the prior year.
2.1. Statistical Methods
Characteristics of the population were described using frequencies. The AMI confirmation rate (i.e., PPV) was calculated as the percentage of suspected cases found to be definite or probable AMI on review, and exact binomial 95% confidence intervals (CI) were calculated. Confirmation rates in subsets defined by coding and patient characteristics were compared using Fisher's exact test. We also examined alternative criteria for identifying AMI events: excluding non‐specific AMI codes and excluding codes not in the principal discharge diagnosis position. We present the PPV and the corresponding proportion of true AMI events that would be missed as a result of each exclusion.
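The PPV with its exact (Clopper–Pearson) binomial CI, and a subgroup comparison by Fisher's exact test, can be reproduced from the reported counts. The sketch below uses SciPy as an assumption; the article does not specify the analysis software:

```python
from scipy.stats import binomtest, fisher_exact

# Overall confirmation: 162 of 202 suspected AMI events (from the Results).
k, n = 162, 202
ppv = k / n  # ≈ 0.802

# Exact (Clopper-Pearson) 95% binomial CI, as used in the paper.
ci = binomtest(k=k, n=n).proportion_ci(confidence_level=0.95, method="exact")

# Fisher's exact test comparing confirmation by principal-diagnosis position:
# principal position 138/159 confirmed vs. other position 24/43 confirmed.
table = [[138, 159 - 138],
         [24, 43 - 24]]
odds_ratio, p_value = fisher_exact(table)
```

Running this reproduces the paper's overall PPV of 80.2% (95% CI 74.0%–85.5%) and a Fisher p value below 0.001 for the principal-diagnosis comparison.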
3. Results
Among 69 625 vaccine recipients (median age 49 years [IQR 38–56 years], 51.0% male, 58.8% with diabetes, 0.4% with history of AMI), there were 202 potential AMI events identified with 162 events (80.2% [95% CI: 74.0%–85.5%]) confirmed as definite or probable AMI (Table 2). Encounters where AMI was coded as the principal diagnosis code were more likely to be confirmed (86.8% [80.5%–91.6%]) than those with AMI entered in another diagnosis position (55.8% [39.9%–70.9%], p < 0.001). Encounters coded as STEMI (82.1% [69.6%–91.1%]) or NSTEMI (83.3% [75.9%–89.3%]) were more likely to be confirmed as AMI than those with codes not indicating presence or absence of ST elevation (42.9% [17.7%–71.1%], p = 0.004).
TABLE 2.
Positive predictive value of ICD‐10 AMI diagnosis codes and association with coding and patient characteristics.
| | Potential AMI reviewed, n | Confirmed AMI, n | Positive predictive value (95% CI), % | p |
|---|---|---|---|---|
| Overall | 202 | 162 | 80.2 (74.0–85.5) | |
| AMI in principal diagnosis position | | | | < 0.001 |
| No | 43 | 24 | 55.8 (39.9–70.9) | |
| Yes | 159 | 138 | 86.8 (80.5–91.6) | |
| ST elevation code a | | | | 0.004 |
| STEMI | 56 | 46 | 82.1 (69.6–91.1) | |
| NSTEMI | 132 | 110 | 83.3 (75.9–89.3) | |
| Unspecified | 14 | 6 | 42.9 (17.7–71.1) | |
| AMI diagnosis code used | | | | 0.001 |
| I21.0*, STEMI of anterior wall | 22 | 21 | 95.5 (77.2–99.9) | |
| I21.1*, STEMI of inferior wall | 16 | 11 | 68.8 (41.3–89.0) | |
| I21.2*, STEMI of other sites | 7 | 7 | 100.0 (59.0–100.0) | |
| I21.3, STEMI of unspecified site | 11 | 7 | 63.6 (30.8–89.1) | |
| I21.4, NSTEMI | 132 | 110 | 83.3 (75.9–89.3) | |
| I21.9, AMI, unspecified | 2 | 0 | 0.0 (0.0–84.2) | |
| I21.A*, other type of MI | 12 | 6 | 50.0 (21.1–78.9) | |
| Health system of coding | | | | 0.001 |
| KPSC | 108 | 96 | 88.9 (81.4–94.1) | |
| Non‐KPSC (claims data) | 94 | 66 | 70.2 (59.9–79.2) | |
| Age (years) | | | | 0.772 |
| 30–49 | 45 | 36 | 80.0 (65.4–90.4) | |
| 50–59 | 109 | 89 | 81.7 (73.1–88.4) | |
| ≥ 60 | 48 | 37 | 77.1 (62.7–88.0) | |
| Gender | | | | 0.170 |
| Female | 57 | 42 | 73.7 (60.3–84.5) | |
| Male | 145 | 120 | 82.8 (75.6–88.5) | |
| Race/ethnicity | | | | 0.226 |
| Hispanic | 94 | 78 | 83.0 (73.8–89.9) | |
| Non‐Hispanic white | 57 | 42 | 73.7 (60.3–84.5) | |
| Non‐Hispanic black | 23 | 16 | 69.6 (47.1–86.8) | |
| Non‐Hispanic Asian/Pacific Islander | 19 | 18 | 94.7 (74.0–99.9) | |
| Other/unknown | 9 | 8 | 88.9 (51.8–99.7) | |
| Congestive heart failure | | | | 0.033 |
| No | 167 | 139 | 83.2 (76.7–88.6) | |
| Yes | 35 | 23 | 65.7 (47.8–80.9) | |
| Peripheral vascular disease | | | | 0.031 |
| No | 170 | 141 | 82.9 (76.4–88.3) | |
| Yes | 32 | 21 | 65.6 (46.8–81.4) | |
| Diabetes | | | | 0.390 |
| No | 41 | 31 | 75.6 (59.7–87.6) | |
| Yes | 161 | 131 | 81.4 (74.5–87.1) | |
| History of AMI | | | | 0.374 |
| No | 184 | 149 | 81.0 (74.6–86.4) | |
| Yes | 18 | 13 | 72.2 (46.5–90.3) | |
| Smoker (current or former) | | | | 0.323 |
| No | 115 | 95 | 82.6 (74.4–89.0) | |
| Yes | 87 | 67 | 77.0 (66.8–85.4) | |
| Charlson Comorbidity Index | | | | 0.579 |
| 0 | 24 | 21 | 87.5 (67.6–97.3) | |
| 1 | 62 | 50 | 80.6 (68.6–89.6) | |
| 2–3 | 56 | 46 | 82.1 (69.6–91.1) | |
| 4+ | 60 | 45 | 75.0 (62.1–85.3) | |
| Length of hospital stay | | | | 0.310 |
| 1–2 days | 26 | 20 | 76.9 (56.4–91.0) | |
| 3–6 days | 122 | 102 | 83.6 (75.8–89.7) | |
| 7 days or longer | 54 | 40 | 74.1 (60.3–85.0) | |
| Number of inpatient visits in prior year | | | | 0.008 |
| 0 | 163 | 137 | 84.0 (77.5–89.3) | |
| ≥ 1 | 39 | 25 | 64.1 (47.2–78.8) | |
| Number of outpatient visits in prior year | | | | 0.004 |
| 0–1 | 37 | 36 | 97.3 (85.8–99.9) | |
| 2–9 | 87 | 69 | 79.3 (69.3–87.3) | |
| ≥ 10 | 78 | 57 | 73.1 (61.8–82.5) | |
| Number of emergency department visits in prior year | | | | 0.070 |
| 0 | 124 | 104 | 83.9 (76.2–89.9) | |
| 1 | 43 | 35 | 81.4 (66.6–91.6) | |
| ≥ 2 | 35 | 23 | 65.7 (47.8–80.9) | |

a STEMI codes were I21.0*, I21.1*, I21.2*, I21.3, I22.0, I22.1, I22.8, and I22.9. NSTEMI codes were I21.4 and I22.2. Unspecified codes were I21.9 and I21.A*.
Patients with a history of congestive heart failure (65.7% [47.8%–80.9%]) and peripheral vascular disease (65.6% [46.8%–81.4%]) had a lower PPV compared to those without (83.2% [76.7%–88.6%], p = 0.033, and 82.9% [76.4%–88.3%], p = 0.031, respectively). There was a trend toward lower PPV among patients with higher numbers of healthcare encounters in the prior year, with PPV of 73.1% (61.8%–82.5%) among those having at least 10 outpatient encounters compared to 97.3% (85.8%–99.9%) for those with zero or one outpatient encounter (p = 0.004). Likewise, PPV was somewhat lower among those with two or more ED encounters in the prior year (65.7% [47.8%–80.9%]), compared with 83.9% (76.2%–89.9%) among those with no ED encounters in the prior year (p = 0.070). There were no statistically significant differences in AMI confirmation rate by age, gender, race/ethnicity, length of stay, history of AMI, diabetes, smoking status, or Charlson Comorbidity Index.
Diagnosis codes from within the KPSC system were more likely to have a confirmed AMI (88.9% [81.4%–94.1%]) than those from a non‐KPSC facility (70.2% [59.9%–79.2%], p = 0.001). Those with a principal diagnosis code within the KPSC system were most likely to be confirmed (92.9% [81.4%–94.1%]), followed by principal diagnosis codes from non‐KPSC facilities (80.0% [69.2%–88.4%]). Events coded with non‐principal diagnosis codes were generally confirmed if they came from a KPSC facility (75.0% [53.3%–90.2%]), but non‐principal diagnosis codes from a non‐KPSC facility were unlikely to be confirmed (31.6% [12.6%–56.6%]).
When considering specific diagnosis codes, the most common code was I21.4, NSTEMI, which had a PPV of 83.3% (75.9%–89.3%). Other codes with high PPV were I21.2* (STEMI of other sites) at 100% (59.0%–100%) and I21.0* (STEMI of anterior wall) at 95.5% (77.2%–99.9%). Lower PPV was seen for I21.1* (STEMI of inferior wall, 68.8% [41.3%–89.0%]), I21.3 (STEMI of unspecified site, 63.6% [30.8%–89.1%]), I21.A* (Other type of MI, 50.0% [21.1%–78.9%]), and I21.9 (AMI, unspecified, 0% [0.0%–84.2%]). No events were identified solely based on I22.* codes (subsequent STEMI and NSTEMI). Three events included this code in the charts, but all had I21.4 codes as the principal diagnosis; all three were confirmed.
Alternative criteria for identifying suspected AMI events resulted in improved PPV at the expense of missing some true cases. First, excluding the 6.9% of encounters with only non‐specific AMI codes (I21.9 or I21.A) from consideration resulted in PPV of 83.0%, while missing only 3.7% of true cases. Second, excluding the 21.3% of encounters where AMI was not coded as the principal discharge diagnosis (which also resulted in excluding the non‐specific codes) increased PPV to 86.8%, while missing 14.8% of true AMI events.
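The trade-offs above follow directly from the counts in Table 2, and can be checked with a few lines of arithmetic (counts taken from the Results; nothing new is assumed):

```python
# Overall review counts from the Results section.
reviewed, confirmed = 202, 162

# Criterion 1: drop encounters with only non-specific codes.
# From Table 2: I21.9 had 2 reviewed / 0 confirmed; I21.A* had 12 reviewed / 6 confirmed.
nonspecific_reviewed = 2 + 12
nonspecific_confirmed = 0 + 6
excluded_fraction = nonspecific_reviewed / reviewed          # ≈ 6.9% of encounters
ppv_specific = (confirmed - nonspecific_confirmed) / (reviewed - nonspecific_reviewed)
missed_specific = nonspecific_confirmed / confirmed          # true cases lost

# Criterion 2: restrict to principal-diagnosis codes.
# From Table 2: 159 reviewed / 138 confirmed in the principal position.
ppv_principal = 138 / 159
missed_principal = (confirmed - 138) / confirmed
```

This reproduces the reported figures: PPV 83.0% with 3.7% of true cases missed under the first criterion, and PPV 86.8% with 14.8% missed under the second.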
4. Discussion
In our study of 202 potential AMI cases, we found that 80.2% overall were confirmed as true AMI events. Those with AMI coded in the principal diagnosis position were more likely to be confirmed as true AMI events compared to those with AMI not in the principal diagnosis position. Similar findings regarding the accuracy of ICD‐10 coding as the principal diagnosis for heart failure were observed in a recent study, with PPV of 98% seen for principal diagnosis codes and 66% for codes in other positions [13].
We observed that patients who had fewer healthcare encounters in the year prior to the event were more likely to have a confirmed AMI. This could be due to healthcare seeking behavior or challenges in accessing care; for example, those who had fewer prior healthcare encounters might have waited until it was more certain that they were experiencing an AMI before seeking care, while those with more prior healthcare encounters might have been more likely to seek care when having less clearly‐defined symptoms. Likewise, those with a history of congestive heart failure and peripheral vascular disease were less likely to have a confirmed AMI diagnosis, suggesting a lower threshold of seeking care or entering an AMI diagnosis for patients with pre‐existing cardiovascular disease, or initial presentation that could be mistaken for AMI, particularly in the case of congestive heart failure where patients can present with chest discomfort or shortness of breath, elevated troponin levels and abnormal electrocardiogram findings.
We saw a higher PPV for encounters that occurred entirely within the KPSC system, where inpatient coding is done by professional coders after discharge. Non‐KPSC settings could include fee‐for‐service settings and heterogeneity in coding practices and financial incentives. As we were not able to verify these practices at non‐KPSC encounters where these patients were seen, we could not directly estimate the impact of professional vs. provider coding or fee‐for‐service vs. managed care organization on PPV. Another possibility is that less complete documentation was available from non‐KPSC sites, though every effort was made to obtain complete notes for all cases regardless of care setting. Nevertheless, we found that a similar proportion of non‐KPSC cases (3.2%) and KPSC cases (3.7%) were deemed indeterminate upon adjudication, though a higher rate in non‐KPSC cases would have been expected if incomplete documentation were more common in those cases.
To our knowledge, this is the first study to validate ICD‐10‐CM coding of AMI in the US using chart review and cardiologist adjudication. Our results are comparable to findings from Japan (84.9% and 82.5% PPV) [8, 9], Korea (71.4%–73.1% PPV) [10], and Europe (74.0%–100% PPV) [11].
Alternative criteria for identifying potential AMI cases may be important to consider, depending on study objectives. Our study sought to identify all AMI events and therefore used broad search criteria, prioritizing sensitivity. We show that excluding non‐specific AMI codes results in only a few true cases being missed, while resulting in an improvement of PPV from 80.2% to 83.0%. Studies where sensitivity is prioritized, including those where chart review will be used to confirm true cases or studies where AMI will be used as a confounder, may benefit from using more inclusive criteria. Further limiting to codes in the principal discharge diagnosis position would yield increased PPV of 86.8%; however, excluding events using other diagnosis positions would result in missing 14.8% of confirmed AMI events. These criteria may be more appropriate in studies where specificity is prioritized. The use of specific codes in the principal diagnosis position may also be appropriate for incidence studies, as the total number of suspected cases identified in this way was within 2% of the total number of confirmed AMI cases in our study.
Our study had some potential limitations. First, as the parent study used EHR to identify potential AMI cases based on diagnosis codes, we could not identify which of the many encounters without AMI codes were true AMI (false negatives) or truly not AMI (true negatives). Thus, we were only able to evaluate the PPV of diagnosis codes and were not able to evaluate sensitivity, negative predictive value, or specificity of coding. Although most AMI seen in hospital settings are likely to be coded, additional research to ascertain true and false negatives is warranted. Other studies have suggested that AMI has been miscoded as unstable angina (I20) or atherosclerotic heart disease (I25) [9, 14]; these codes could be evaluated in future AMI algorithms to potentially increase sensitivity and reduce the likelihood of false negatives, but chart review and adjudication would be needed to confirm AMI in these miscoded events. Second, while every attempt was made to get complete records for review, some details from care provided outside of KPSC were ultimately unavailable for case review, which, along with the retrospective nature of the review, may have limited the ability of the reviewers to accurately determine the presence of AMI. However, only seven suspected cases (3.5%) were ultimately left as indeterminate after review, which would only have a small impact on our overall results. If sufficient information had been present and all of them had been found to be confirmed AMI events, the PPV would have risen only to 83.7% compared to the 80.2% we observed. Finally, as this study was conducted within an integrated health care delivery system, results may not be generalizable to other care settings, particularly fee‐for‐service systems. 
However, because our study included AMI from both KPSC (managed healthcare) and non‐KPSC (potentially fee‐for‐service) settings, and PPV is presented overall and by these settings, we expect that these results should have broad generalizability. Strengths of our study include rigorous chart review and adjudication by a team of cardiologists using the Fourth Universal Definition of MI and the comprehensive capture of care received both inside and outside of the KPSC health system.
5. Conclusions
We found that over 80% of AMI cases identified with ICD‐10‐CM codes were verified upon rigorous chart review and adjudication. Coding accuracy was even higher (over 86%) for those with AMI in the principal diagnosis position. Coding of AMI in another position had lower accuracy, and studies should be cautious in relying solely on AMI diagnosis codes in these positions.
5.1. Plain Language Summary
Our study looked at the accuracy of diagnosis codes for acute myocardial infarctions (AMI, heart attacks) in the electronic medical record of hospitalized patients. We had a team of three cardiologists review the charts of 202 patients coded as having had an AMI. We found that about 80%, or 4 out of 5 of them were actual AMIs. Medical coders designate a code as the principal reason for the hospitalization. When they indicated that the AMI was the principal reason, it was more likely to be a true AMI than when some other diagnosis was the principal reason. People with a pre‐existing heart condition like congestive heart failure or peripheral vascular disease were less likely to have accurately‐coded AMI events, possibly because some symptoms of those conditions can initially look like an AMI.
Ethics Statement
This study was approved by the KPSC IRB, IRB # 11601.
Consent
The IRB granted a waiver of informed consent.
Conflicts of Interest
The authors declare no conflicts of interest.
Acknowledgments
The authors thank the patients of Kaiser Permanente for helping to improve care through the use of information collected through its electronic health record systems.
Slezak J., Bruxvoort K. J., Sy L. S., et al., “Validation of ICD‐10 Diagnosis Codes for Identification of Acute Myocardial Infarction From a US Integrated Healthcare System,” Pharmacoepidemiology and Drug Safety 34, no. 7 (2025): e70179, 10.1002/pds.70179.
Funding: This work was supported by the Kaiser Permanente Department of Research & Evaluation.
References
- 1. Mechanic O. J., Gavin M., and Grossman S. A., Acute Myocardial Infarction (StatPearls, 2024). [PubMed] [Google Scholar]
- 2. O'Gara P. T., Kushner F. G., Ascheim D. D., et al., “2013 ACCF/AHA Guideline for the Management of ST‐Elevation Myocardial Infarction: Executive Summary: A Report of the American College of Cardiology Foundation/American Heart Association Task Force on Practice Guidelines,” Circulation 127, no. 4 (2013): 529–555, 10.1161/CIR.0b013e3182742c84. [DOI] [PubMed] [Google Scholar]
- 3. Amsterdam E. A., Wenger N. K., Brindis R. G., et al., “2014 AHA/ACC Guideline for the Management of Patients With non–ST‐Elevation Acute Coronary Syndromes,” Circulation 130, no. 25 (2014): e344–e426, 10.1161/CIR.0000000000000134. [DOI] [PubMed] [Google Scholar]
- 4. Thygesen K., Alpert J. S., Jaffe A. S., et al., “Fourth Universal Definition of Myocardial Infarction (2018),” Circulation 138, no. 20 (2018): e618–e651, 10.1161/CIR.0000000000000617. [DOI] [PubMed] [Google Scholar]
- 5. Khera R., Dorsey K. B., and Krumholz H. M., “Transition to the ICD‐10 in the United States: An Emerging Data Chasm,” JAMA 320, no. 2 (2018): 133–134, 10.1001/jama.2018.6823. [DOI] [PubMed] [Google Scholar]
- 6. Hernandez‐Ibarburu G., Perez‐Rey D., Alonso‐Oset E., et al., “ICD‐10‐CM Extension With ICD‐9 Diagnosis Codes to Support Integrated Access to Clinical Legacy Data,” International Journal of Medical Informatics 129 (2019): 189–197, 10.1016/j.ijmedinf.2019.06.010. [DOI] [PubMed] [Google Scholar]
- 7. Rubbo B., Fitzpatrick N. K., Denaxas S., et al., “Use of Electronic Health Records to Ascertain, Validate and Phenotype Acute Myocardial Infarction: A Systematic Review and Recommendations,” International Journal of Cardiology 187 (2015): 705–711, 10.1016/j.ijcard.2015.03.075. [DOI] [PubMed] [Google Scholar]
- 8. Ando T., Ooba N., Mochizuki M., et al., “Positive Predictive Value of ICD‐10 Codes for Acute Myocardial Infarction in Japan: A Validation Study at a Single Center,” BMC Health Services Research 18, no. 1 (2018): 895, 10.1186/s12913-018-3727-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9. Nakai M., Iwanaga Y., Sumita Y., et al., “Validation of Acute Myocardial Infarction and Heart Failure Diagnoses in Hospitalized Patients With the Nationwide Claim‐Based JROAD‐DPC Database,” Circulation Reports 3, no. 3 (2021): 131–136, 10.1253/circrep.CR-21-0004. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10. Kimm H., Yun J. E., Lee S. H., Jang Y., and Jee S. H., “Validity of the Diagnosis of Acute Myocardial Infarction in Korean National Medical Health Insurance Claims Data: The Korean Heart Study (1),” Korean Circulation Journal 42, no. 1 (2012): 10–15, 10.4070/kcj.2012.42.1.10. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11. Coloma P. M., Valkhoff V. E., Mazzaglia G., et al., “Identification of Acute Myocardial Infarction From Electronic Healthcare Records Using Different Disease Coding Systems: A Validation Study in Three European Countries,” BMJ Open 3, no. 6 (2013): e002862, 10.1136/bmjopen-2013-002862. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12. Bruxvoort K., Slezak J., Qian L., et al., “Association Between 2‐Dose vs 3‐Dose Hepatitis B Vaccine and Acute Myocardial Infarction,” JAMA 327, no. 13 (2022): 1260–1268, 10.1001/jama.2022.2540. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13. Beyrer J., Manjelievskaia J., Bonafede M., et al., “Validation of an International Classification of Disease, 10th Revision Coding Adaptation for the Charlson Comorbidity Index in United States Healthcare Claims Data,” Pharmacoepidemiology and Drug Safety 30, no. 5 (2021): 582–593, 10.1002/pds.5204. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14. Tsai T. Y., Lin J. F., Tu Y. K., et al., “Validation of ICD‐10‐CM Diagnostic Codes for Identifying Patients With ST‐Elevation and Non‐ST‐Elevation Myocardial Infarction in a National Health Insurance Claims Database,” Clinical Epidemiology 15 (2023): 1027–1039, 10.2147/CLEP.S431231. [DOI] [PMC free article] [PubMed] [Google Scholar]
