Abstract
Purpose
To describe the Acute Myocardial Infarction (AMI) Validation project, a test case for health outcome validation within the FDA-funded Mini-Sentinel pilot program.
Methods
The project consisted of four parts: (1) case identification: developing an ICD-9-based algorithm to identify hospitalized AMI patients within the Mini-Sentinel Distributed Database; (2) chart retrieval: establishing procedures that ensured patient privacy (collection and transfer of the minimum necessary amount of information, and redaction of direct identifiers) to validate potential cases of AMI; (3) abstraction and adjudication: trained nurse abstractors gathered key data using a standardized form, with cardiologist adjudication; and (4) calculation of the positive predictive value of the constructed algorithm.
Results
Key decision points included: (1) breadth of the AMI algorithm; (2) centralized vs. distributed abstraction; and (3) approaches to maintaining patient privacy and to obtaining charts for public health purposes. We used an algorithm limited to ICD-9 codes 410.x0-410.x1. Centralized data abstraction was performed due to the modest number of charts requested (<155). The project’s public health status accelerated chart retrieval in most instances.
Conclusions
We have established a process to validate AMI within Mini-Sentinel, which may be used for other health outcomes. Challenges include: (1) ensuring that only minimum necessary data is transmitted by Data Partners for centralized chart review; (2) establishing procedures to maintain data privacy while still allowing for timely access to medical charts; and (3) securing access to charts for public health uses that do not require IRB approval while maintaining patient privacy.
Keywords: Myocardial infarction, coronary artery disease, validation, administrative data
Introduction
In 2007, the U.S. Congress passed the FDA Amendments Act (FDAAA) mandating the FDA to establish a postmarket risk identification and analysis system to link and analyze safety data from multiple sources.1 In May 2008, in response to the Congressional mandate, the FDA launched the Sentinel Initiative, a long-term program designed to create a national electronic monitoring system for medical product safety (the Sentinel System). The Sentinel System is being developed and implemented in stages and, when fully functional, will complement FDA’s existing postmarket safety surveillance systems.
The Mini-Sentinel pilot, a contract awarded by FDA to Harvard Pilgrim Health Care Institute (HPHCI) to develop the scientific operations needed for the eventual Sentinel System, is being conducted as a collaborative effort between FDA and a consortium of institutions led by HPHCI.2, 3 Because accurate and timely identification of health outcomes is an essential component of active safety surveillance, Mini-Sentinel convened a workgroup to establish a process for identification and validation of a selected health outcome: acute myocardial infarction (AMI). This is the first health outcome to be validated under Mini-Sentinel. In addition to developing and validating an algorithm to identify hospitalized AMI cases within the Mini-Sentinel Distributed Database,4 another goal of the workgroup was to design an efficient validation process that could be used as a model for future validation efforts of other health outcomes of interest.
This paper describes how the AMI Validation Workgroup developed the Mini-Sentinel validation process for AMI and discusses the barriers encountered. Additional information can be found in our final report.5
Overview of Design for the AMI Validation Process
The Mini-Sentinel AMI Validation project was a collaboration among multiple entities. The FDA’s Sentinel Initiative team contributed input and oversight throughout the validation effort. The Mini-Sentinel Operations Center, composed of researchers and staff from HPHCI, provided a scientific, analytic, and administrative infrastructure. The Operations Center facilitated communication between collaborators, designed programs for chart retrieval, and coordinated the retrieval effort. Academic researchers from Meyers Primary Care Institute and University of Massachusetts Medical School were charged with designing the approach to chart identification, identifying necessary chart components, and performing abstraction and adjudication. Four Data Partners participated in this project: HealthCore, Inc.; Humana; three member health plans within the Kaiser Permanente Center for Effectiveness and Safety Research (CESR); and two health plans in the HMO Research Network (HMORN). Data Partners ran computer programs written in SAS by the Operations Center to identify likely AMI cases and then retrieved, copied, de-identified, and transmitted selected healthcare data to the lead team through the Operations Center via a secure web portal.
The AMI validation process consisted of four parts: (1) an approach to case identification with the goal of producing an algorithm that would reliably identify AMI cases; (2) a protocol for case retrieval from the Data Partners, which outlined necessary chart components to confirm the AMI diagnosis and established effective approaches to obtaining and de-identifying chart information; (3) a parsimonious data abstraction form including relevant elements derived from the medical chart components and completed by trained nurse abstractors; and (4) an adjudication protocol for confirmation of the AMI diagnosis by cardiologist adjudicators. The culmination of this effort will be a determination of the positive predictive value (PPV) of the algorithm. The following sections focus on the first three parts of this process.
Case Identification
The overarching goals of this project were to validate the diagnostic codes used to identify likely AMI cases and to design an efficient validation process that could be used for future validation of other health outcomes in Mini-Sentinel. It was determined that 100 charts would be sufficient to obtain a reasonable PPV and establish the validation process. To obtain more contemporary findings, we decided to include only patients who were hospitalized for AMI between January 1, 2009 and December 31, 2009 for whom there were records in the Mini-Sentinel Distributed Database, which currently comprises administrative and claims data formatted locally into a common data model.4 Hospital stays of less than 24 hours and observation stays were included. There were no restrictions on age, sex, other diagnoses, or other patient characteristics, but patients were required to be enrollees of the respective health plan for the entire duration of hospitalization.
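As a rough illustration of the precision this sample size affords (the numbers below are hypothetical, not study results), the width of a binomial confidence interval around a PPV estimated from roughly 100 adjudicated charts can be sketched as follows:

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion (95% by default)."""
    p_hat = successes / n
    denom = 1 + z ** 2 / n
    center = (p_hat + z ** 2 / (2 * n)) / denom
    half_width = (z / denom) * math.sqrt(p_hat * (1 - p_hat) / n + z ** 2 / (4 * n ** 2))
    return center - half_width, center + half_width

# Hypothetical example: if 90 of 100 adjudicated charts confirmed AMI, the PPV
# estimate would be 0.90 with a 95% CI of roughly 0.83-0.94 -- a level of
# precision consistent with judging ~100 charts sufficient for a reasonable PPV.
low, high = wilson_ci(successes=90, n=100)
print(f"PPV = 0.90, 95% CI ({low:.2f}, {high:.2f})")
```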
We had the opportunity to consult with a concurrent Mini-Sentinel workgroup that was charged with developing an active surveillance protocol for AMI,6 which informed decision-making in the early stages. The two workgroups began by reviewing the literature and examining previously completed reviews7, 8 to identify previously used algorithm components, with a focus on those yielding the highest PPVs (Table 1). We also consulted with cardiologists, cardiovascular researchers, and FDA review staff with expertise in cardiovascular disease. We considered the types of data that would likely be available from the medical records relating to the patient’s index hospitalization, as well as the likelihood of access to information both prior to the hospitalization and following hospital discharge for AMI survivors. We reviewed the pathophysiology of AMI and acute coronary syndrome, and discussed whether to create a strict definition of AMI or a definition that more broadly captured cardiovascular disease states as part of a continuum.
Table 1. Previously used AMI algorithms reviewed, summarized by data type, algorithm components, algorithm structure, and algorithm performance metrics.
In reviewing the literature, we found a wide range of ICD-9 codes in use, with a few studies assessing ICD-8 or ICD-10 codes and several studies combining ICD codes with other criteria.7-20 We identified ICD-9 code 410 as the code most frequently yielding PPVs in the mid to high 90% range, and we also identified the need to specify the ICD-9 code to two digits after the decimal point. Because a 2 in the second position after the decimal (i.e., 410.x2) indicates a past MI, we limited our sample to 410.x0 or 410.x1. Our algorithm therefore identified patients with ICD-9 hospital discharge codes (a principal or primary discharge code only9,15,17,19) of 410.x0 and 410.x1. If a Data Partner did not have a diagnosis designated as principal or primary, we used the first-listed discharge diagnosis. Although we reviewed previously studied algorithms that incorporated hospital length of stay, we did not find that this reliably increased the PPV and, therefore, did not include this requirement in the final algorithm.10, 12, 16, 17
The two workgroups held additional meetings with members of both teams in order to reach consensus on a common AMI definition for both studies. We discussed including in our definition deaths occurring within one day of an emergency department visit for acute ischemic heart disease (ICD-9 codes 411.1, 411.8, and 413.x), but decided against including these additional ICD rubrics due to concerns regarding the adequacy of information that would be available to adjudicate these cases. Codes such as 412 (old myocardial infarct) and 414 (chronic ischemic heart disease) were excluded in an effort to focus on acute events. Our final algorithm identified patients with ICD-9 principal (or first-listed) discharge codes 410.x0 and 410.x1.
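For illustration only, the final case-identification rule can be expressed as a small filter over coded discharge diagnoses. The sketch below is in Python with hypothetical field names; the actual Mini-Sentinel implementation was a SAS program run against the common data model, as described in the next paragraph.

```python
import re

# ICD-9-CM codes 410.x0 or 410.x1: fourth digit 0-9, fifth digit 0 or 1.
AMI_CODE = re.compile(r"^410\.[0-9][01]$")

def meets_ami_algorithm(discharge_diagnoses: list[dict]) -> bool:
    """Apply the final algorithm: a principal (or, if none is flagged,
    first-listed) discharge diagnosis of 410.x0 or 410.x1.

    `discharge_diagnoses` is a hypothetical structure with one dict per coded
    diagnosis, e.g. {"code": "410.71", "principal": True}, in listed order."""
    if not discharge_diagnoses:
        return False
    principal = next((d for d in discharge_diagnoses if d.get("principal")), None)
    index_dx = principal if principal is not None else discharge_diagnoses[0]
    return bool(AMI_CODE.match(index_dx["code"]))

# 410.71 (subendocardial infarction, initial episode of care) qualifies;
# 410.72 (subsequent episode of care) does not.
print(meets_ami_algorithm([{"code": "410.71", "principal": True}]))   # True
print(meets_ami_algorithm([{"code": "410.72", "principal": True}]))   # False
```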
Based on this algorithm, the Operations Center developed a SAS program, tested it with two Data Partners for accuracy, and then distributed it to all Data Partners participating in the project. To obtain 100 cases for adjudication, efforts were made to identify approximately 150 cases (assuming 67% retrieval) across all participating Data Partners, with each Data Partner pursuing an equal number of cases. To identify a random sample of likely AMI cases and the hospitals in which they received care, participating Data Partners executed the SAS program to query their own locally maintained administrative and claims data (see the following section).
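A minimal sketch of the per-site sampling step, assuming each Data Partner holds a list of eligible case identifiers produced by the query above (parameter names and the fixed seed are illustrative and not part of the distributed SAS program):

```python
import random

def sample_cases_for_partner(eligible_case_ids: list[str],
                             per_partner_target: int,
                             seed: int = 20090101) -> list[str]:
    """Draw one Data Partner's share of the random sample of likely AMI cases.
    With roughly 150 charts targeted overall (anticipating ~67% retrieval to
    yield ~100 adjudicated cases) and an equal allocation across participating
    Data Partners, per_partner_target is the overall target divided by the
    number of partners."""
    rng = random.Random(seed)
    if len(eligible_case_ids) <= per_partner_target:
        return list(eligible_case_ids)
    return rng.sample(eligible_case_ids, per_partner_target)
```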
Case Retrieval
Centralized vs. distributed chart abstraction
The workgroup developed a protocol for retrieving medical chart information. In order to proceed with chart retrieval, we needed to: 1) decide whether chart abstraction would take place centrally or in a distributed manner (i.e., each Data Partner abstracts its own charts); and 2) establish protocols for ensuring patient privacy and data security, and for helping Data Partners comply with their regulatory responsibilities when supplying data for public health surveillance, which does not require individual consent or privacy authorization.21 Because the abstraction process would have major implications in terms of the amount of information transferred, the workgroup held multiple meetings to address the question of centralized vs. distributed data abstraction. Before selecting an approach to pursue, the workgroup discussed why a centralized versus distributed approach might be preferred for the purpose of this validation activity and as a model for future Mini-Sentinel validations. Specifically, the workgroup discussed the following issues:
Capacity to maintain patient de-identification: Individually identifiable information other than dates of service in the charts was intended to be fully redacted prior to transmission from Data Partner sites to the Operations Center. Data Partners noted, however, that some individually identifiable information might be overlooked, especially in extremely large charts, and this might pose a risk to patient privacy.
Existing infrastructure within the Data Partners to perform medical chart abstraction by trained abstractors: Some Data Partners advocated for a distributed approach since they had available experienced abstractors who could be trained to perform the required abstraction tasks. Others did not have experienced abstractors on site.
Quality of data abstraction: Given the modest number of medical records to be reviewed and the relatively high number of sites, it would be challenging to train abstractors to perform only a handful of abstractions and still maintain adequate quality and reliability. Additionally, some Data Partners would not be using nurses or other individuals with relevant healthcare experience as abstractors, leading to an increased likelihood of variation in abstraction quality.
Short-term efficiency: A distributed approach would require abstractor training and evaluation at multiple sites, potentially delaying the timeline for the overall abstraction effort.
Long-term efficiency: Efficiency of future Mini-Sentinel validation projects was also a consideration. In the future, when another health outcome needs to be validated, a centralized approach would require training of a limited number of abstractors instead of periodically retraining numerous abstractors across multiple Data Partners. A centralized approach could maximize resources and minimize the amount of time required to abstract necessary information.
After careful consideration, a centralized approach was ultimately pursued, and only selected components of the medical records were extracted in order to limit the information transmitted to the minimum necessary to accomplish the public health purpose of the project. All charts were redacted of direct identifiers but retained dates of service. The redacted records were then securely transferred via the Mini-Sentinel Secure Portal to the lead team at Meyers Primary Care Institute for centralized abstraction.
Determination of chart components
Once a centralized abstraction approach was chosen, the lead team proposed a list of critical chart components and other information they considered important for the validation. This initial list was developed broadly and then narrowed down based on input from Data Partners, the Operations Center, the FDA, and individuals with clinical and epidemiologic expertise relevant to cardiovascular disease.
Abstraction tools from various validation studies were reviewed to inform decisions on the list of critical chart components and other information to be extracted.10, 20, 22 In response to Data Partners’ concerns over the amount of information to be extracted and transferred, the lead team further excluded several chart components (e.g., medications and patient vital signs) in order to lessen the amount of information to be transferred.
The Operations Center reviewed the revised list in relation to the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule’s minimum necessary standard and confirmed that the critical chart components and other information requested constituted the minimum amount of information for the activity. The chart components that were requested are summarized in Table 2.
Table 2. Requested chart components
○ Admission history and physical
○ Discharge summary
○ Transfer records
○ Cardiology consult notes
○ Autopsy reports/death notes
○ EMT/ambulance notes
○ Emergency Department notes
○ Copies of all 12-lead electrocardiograms
○ Laboratory reports
○ Cardiac catheterization reports
○ Percutaneous coronary intervention reports
○ Cardiac bypass surgery reports
○ Cardiac stress test/nuclear stress test reports
Certain requested items remained broad in scope. For example, copies of all laboratory results were requested in order to obtain cardiac biomarker information. Although cardiac biomarkers are among the critical items of interest for AMI validation, they represent only a subset of all laboratory results; requesting the full laboratory reports avoided the need for a highly trained individual at each site capable of determining which specific pages of the laboratory report section of the medical record were required.
Obtaining chart information
After the list of chart components was finalized, Data Partners proceeded to execute the SAS program locally, identifying a random sample of likely AMI cases whose medical records were to be located. Data Partners then asked source data holders (e.g., individual hospital medical records departments) for access to these patients’ records. Source data holders either sent the medical records to vendors commissioned to extract, copy, and redact the requested information, or allowed Data Partners direct access to records for extraction, copying and redaction. Redacted chart data were sent to the Operations Center via the Mini-Sentinel Secure Portal.
The Operations Center provided each Data Partner with a privacy packet prepared by the Mini-Sentinel Privacy Panel. This packet included: 1) the Mini-Sentinel Privacy Panel White Paper describing data privacy issues in Mini-Sentinel;23 2) a letter from the Department of Health and Human Services Office for Human Research Protections (OHRP) to the FDA stating that the regulations OHRP administers do not apply to the Sentinel Initiative (OHRP oversees all IRBs); and 3) a letter from the FDA to the Mini-Sentinel Principal Investigator explaining FDA’s legal authority to obtain data for use in its Sentinel and Mini-Sentinel activities. The privacy packet described the legal basis for determining that the work of the Mini-Sentinel pilot constitutes a public health activity not under the purview of IRBs. Data Partners were strongly encouraged to disseminate the privacy packet to their IRBs and Privacy Boards as well as any other relevant entities.
The Operations Center also supplied a customizable template for a letter to provider sites, an instructional flowchart, and a list of frequently asked questions. The letter (addressed to each provider site from the specific Data Partner) explained the purpose of the project and what was being requested. Letters also explained that the request was being carried out on behalf of Mini-Sentinel and the FDA. The flowchart outlined the array of possible scenarios for chart retrieval and detailed the steps for chart redaction and data transmission. The frequently asked questions included a list of anticipated questions, such as “How were provider sites selected to participate in this activity?”, “Where will the abstracted data go?”, and “How can I contact my Data Partner directly?”
Redaction of individually identifiable information was performed in accordance with HIPAA’s provisions for a ‘limited dataset,’ which is an alternative to using fully de-identified information. Under HIPAA, creation of a limited dataset requires removal of 16 direct identifiers but allows for the inclusion of dates, geographic location (not as specific as street address), and any other code or characteristic not explicitly excluded.21 Redaction was completed before chart components were transferred to the Operations Center. Each Data Partner assigned a new, de-identified ID unique to each redacted chart and maintained a crosswalk between the newly assigned IDs and the original IDs. The Operations Center did not receive this crosswalk; it was maintained locally by each Data Partner.
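The redaction and ID-assignment workflow can be sketched as follows; the field names, the 12-character study ID, and the CSV crosswalk format are assumptions for illustration and do not reflect the Data Partners’ actual procedures:

```python
import csv
import uuid

# The 16 direct identifiers removed for a HIPAA limited data set include, e.g.,
# names, street addresses, phone/fax numbers, email addresses, SSNs, medical
# record numbers, health plan and account numbers, certificate/license and
# vehicle/device identifiers, URLs, IP addresses, biometric identifiers, and
# full-face photographs.  Dates and broad geographic information may be kept.
DIRECT_IDENTIFIER_FIELDS = ["name", "street_address", "phone", "email",
                            "ssn", "medical_record_number", "health_plan_id"]

def build_limited_dataset(charts: list[dict], crosswalk_path: str) -> list[dict]:
    """Strip direct-identifier fields, assign a new de-identified chart ID, and
    write the original-ID/new-ID crosswalk locally (it is never sent to the
    Operations Center).  Field names are hypothetical."""
    redacted, crosswalk = [], []
    for chart in charts:
        new_id = uuid.uuid4().hex[:12]
        crosswalk.append({"original_id": chart["medical_record_number"],
                          "deidentified_id": new_id})
        clean = {k: v for k, v in chart.items() if k not in DIRECT_IDENTIFIER_FIELDS}
        clean["deidentified_id"] = new_id          # dates of service are retained
        redacted.append(clean)
    with open(crosswalk_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["original_id", "deidentified_id"])
        writer.writeheader()
        writer.writerows(crosswalk)
    return redacted
```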
Admission and discharge dates, as well as dates corresponding to EKGs, laboratory results, procedures, and tests, were not redacted. This information was considered crucial for determining whether available EKGs and test results corresponded to the hospital stay of interest and, therefore, whether an AMI occurred during the identified hospital stay. In addition, for certain tests (EKGs, cardiac biomarkers), the results needed to be assessed by cardiologist adjudicators in chronological order. We considered replacing actual dates with assigned reference values but ultimately opted not to pursue this approach, since we felt it would substantially increase workload and introduce multiple opportunities for error.
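A minimal sketch of how the retained dates support adjudication, i.e., restricting dated results to the index hospitalization and ordering them chronologically (the 'collected' field and ISO-formatted timestamps are assumptions):

```python
from datetime import datetime

def results_for_index_stay(results: list[dict], admit: str, discharge: str) -> list[dict]:
    """Keep only dated results (troponin draws, EKGs, etc.) that fall within the
    index hospitalization and return them in chronological order, the way
    adjudicators needed to review them."""
    lo, hi = datetime.fromisoformat(admit), datetime.fromisoformat(discharge)
    in_window = [r for r in results
                 if lo <= datetime.fromisoformat(r["collected"]) <= hi]
    return sorted(in_window, key=lambda r: datetime.fromisoformat(r["collected"]))
```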
Data Partners were provided with credentials to log in to the Mini-Sentinel Secure Portal for transferring, managing, or retrieving chart data. Security was managed within the folder structure of the site; the Secure Portal contains private folders accessible only to specified members within each Data Partner site and authorized Operations Center staff, as well as common folders defined for all users. Data Partners electronically uploaded redacted charts to their site-specific private folders. The Operations Center verified that all charts were redacted thoroughly and then moved all files to a separate private folder, allowing the lead team access to the data.
Retrieval of charts requested
A total of 153 charts were requested, of which 143 (93%) were successfully retrieved. Approximately 80% of the 143 charts were obtained within three months of the request. The total time required to receive all 143 charts was approximately five months.
Challenges encountered during the chart retrieval process
Regarding level of burden, Data Partners were initially concerned that they would be required to obtain information from multiple sources, including outpatient medical records. However, clinical information relevant to the present validation study was to be extracted from medical records relating to only a single hospitalization.
Regarding privacy issues and IRB concerns, Data Partners described several challenges encountered during the chart retrieval process. Although IRB review sometimes caused delays, most source data holder IRBs allowed charts to be located and retrieved after being provided with the privacy packet containing letters and documents that clarified the status of this validation project as a public health surveillance activity undertaken under the auspices of the FDA. However, despite the privacy packet, seven requested charts were not obtained because of IRB concerns and insistence on patient consent prior to releasing medical records.
Several other issues were brought to the workgroup’s attention by the Data Partners. Some redacted charts were sent by mail as opposed to electronic transmission, which led to delays in transferring data. One Data Partner found that including a list of frequently asked questions and answers along with each chart request led to improved turnaround times. Frequent inquiries concerning the disposition of charts and relationship building with hospital staff processing the request were also helpful in obtaining charts more quickly.
Abstraction and Adjudication
Development of abstraction and adjudication form
The lead team identified and reviewed a number of AMI abstraction forms and manuals used in past AMI validation activities. The team also consulted individuals with clinical and epidemiological expertise relevant to cardiovascular disease, reviewed the American Heart Association (AHA)’s Universal Definition of Myocardial Infarction24, 25 and the literature on troponin standardization,26 and communicated with directors of laboratories on percentile cutoffs for what were considered to be “positive” troponin values. Based on clinical consultation and literature review, the lead team created a 36-item abstraction form that included: 1) general demographic information; 2) brief medical history; 3) cardiac biomarker information; 4) copies of electrocardiograms; 5) cardiac testing, procedure, and intervention information; and 6) information on disposition at the time of hospital discharge. The lead team trained two nurse abstractors to enter abstracted information into a Microsoft Access database and provided an accompanying instruction manual. Both abstractors gathered data from the first 10 cases. These abstractions were reviewed together with both nurse abstractors to ensure high inter-rater reliability on items critical for the adjudications.
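The structure of the abstraction record can be sketched as follows; the individual fields shown are hypothetical placeholders for the 36 form items, organized under the six sections listed above:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AbstractionRecord:
    """Skeleton of the six sections captured on the 36-item abstraction form;
    the individual field names are illustrative, not the actual form items."""
    chart_id: str
    # 1) general demographic information
    age_years: Optional[int] = None
    sex: Optional[str] = None
    # 2) brief medical history
    prior_mi: Optional[bool] = None
    diabetes: Optional[bool] = None
    # 3) cardiac biomarker information (serial values with reference limits)
    troponin_results: list[dict] = field(default_factory=list)
    # 4) copies of electrocardiograms (stored as document references)
    ekg_documents: list[str] = field(default_factory=list)
    # 5) cardiac testing, procedure, and intervention information
    cath_performed: Optional[bool] = None
    pci_performed: Optional[bool] = None
    # 6) disposition at the time of hospital discharge
    discharge_disposition: Optional[str] = None
```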
In consultation with the FDA and individuals with clinical and epidemiologic expertise relevant to cardiovascular disease, the lead team created an adjudication protocol based on the AHA Universal Definition of Myocardial Infarction. In addition to abstracted data from the form described above, adjudicating cardiologists were provided with copies of electrocardiograms and copies of all cardiac test and procedure reports.
Access to redacted charts
Redacted components of the medical record were sent to the Operations Center via the Mini-Sentinel Secure Portal and then made available through this secure site to the lead team for data abstraction and case adjudication.
Challenges encountered during the chart abstraction and adjudication process
One of the more challenging issues that emerged in the design of the abstraction and adjudication forms related to differences in cardiac biomarker reference standards among different hospitals. It was essential to design abstraction materials that could adequately capture both biomarker results and reference standards, even when presented in a variety of ways from different hospital sources.
The workgroup was also challenged with reconciling the biomarker standards described in the published AHA definition of AMI24 with laboratory values likely to be available in hospital records. While the published definition we were using defined abnormal biomarker values as falling “above the 99th percentile of the upper reference limit”,24 preliminary reviews of several charts showed that hospital laboratories did not routinely report percentile cut-offs. Through communication with the director of one hospital laboratory, we also found that the reported reference values did not always correspond to this 99th percentile cut-off. We opted to capture any and all available information on reference standards from charts (i.e., from printed laboratory reports) but did not contact laboratory directors at each individual site for any further information.
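A simplified sketch of how lab-specific reference information could be applied when flagging troponin values; this rule is illustrative only, since actual case status was determined by cardiologist adjudication using the AHA universal definition:

```python
from typing import Optional

def troponin_flag(value: float,
                  url_99th_percentile: Optional[float],
                  reported_upper_limit: Optional[float]) -> Optional[bool]:
    """Flag a troponin value as elevated using the 99th-percentile upper
    reference limit (URL) when the laboratory reports one; otherwise fall back
    to whatever upper reference limit appears on the printed report.  Returns
    None when no reference standard is available, mirroring the decision to
    capture whatever reference information the chart contains."""
    if url_99th_percentile is not None:
        return value > url_99th_percentile
    if reported_upper_limit is not None:
        return value > reported_upper_limit
    return None
```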
Summary of lessons learned
We believe the following lessons learned will assist us in developing best practices when conducting similar Mini-Sentinel validation activities in the future:
Protocol Development: Engaging the FDA from the initial phase of the project facilitated development of the scope of work and setting of project goals.
Workgroup: A workgroup that includes the FDA, the Mini-Sentinel Operations Center, and the academic and Data Partners provided a platform for effective communication during protocol development and allowed us to identify potential challenges quickly. Regular workgroup meetings also allowed for achieving consensus among the workgroup members with regard to project timelines, requested chart components, and deliverables.
Chart Request: The Operations Center provided each Data Partner with a privacy packet to disseminate to source data holders. These documents outlined the activity as public health surveillance and detailed privacy and confidentiality standards used in the activity. Data Partners distributed these documents when making the initial chart request to source data holders, and we believe that this resource helped expedite the chart retrieval process.
Privacy Packet: One Data Partner mentioned that it was asked for contract information showing that it was part of Mini-Sentinel. In future work, consideration should be given to adding this information to the privacy packet.
Chart Retrieval: The workgroup planned for multiple chart component extraction scenarios (i.e., vendors vs. non-vendor processes) which provided Data Partners with additional options and increased flexibility when retrieving charts.
Some Data Partners preferred to employ third-party vendors to retrieve charts. This process did briefly delay the project timeline as Data Partners faced challenges in finding reasonably priced vendors. Although each chart retrieval scenario required additional resource planning, this hybrid approach made for an efficient overall chart retrieval process.
Transferring of information: The Operations Center provided Data Partners access to the Mini-Sentinel Secure Portal for transferring, managing, and retrieving chart data. The Portal provided a secure and immediate pathway for uploading data. The Operations Center was also able to track all data and provide abstractors access to charts through this environment.
Overall Timeline: Approximately seven to eight months were required to develop a protocol, request charts from Data Partner sites and complete abstraction and adjudication.
Considerations for future validation efforts: Future efforts to validate health outcomes on a national scale should pay particular attention to abstractor training and data capture for laboratory values (or other tests) whose reference ranges differ by hospital. Laboratory values or tests in which serial values must be captured can pose a challenge for patients who are transferred in from an outside hospital, due to the presence of multiple data formats and multiple reference ranges.
Conclusions
The AMI validation project has established a process for validating medical outcomes within Mini-Sentinel that can serve as a model for future surveillance validation activities. The project has provided important insights into the challenges inherent in conducting health outcome validation across public, academic, and private entities. Key issues identified include: 1) the need to determine the scope of health outcome definitions (broad vs. more focused); 2) the need for early assessment regarding centralized vs. distributed approaches to chart abstraction; and 3) the need for a policy and systematic approach for maintaining patient privacy and data security and for addressing regulatory compliance issues under HIPAA and the Common Rule. In addition, it will be important for future validation projects to anticipate between-hospital differences in laboratory reference standards and between-hospital variations in how these data will be presented to the adjudicators.
Key points.
Key decision points in the Mini-Sentinel acute myocardial infarction (AMI) validation process included: (1) determining the scope of the ICD-9 based AMI algorithm; (2) determining whether to pursue centralized vs. distributed abstraction; and (3) approaches to maintaining patient privacy and to addressing the project’s status as a public health activity that does not come under the purview of Institutional Review Boards (IRBs).
We used an algorithm limited to ICD-9 codes 410.x0-410.x1. Centralized data abstraction was performed due to the modest number of medical charts requested (<155 in total). The project’s public health status accelerated chart retrieval in most instances.
Acknowledgments
Dr Cutrona was supported in part by Award Number KL2RR031981 from the National Center for Research Resources (NCRR).
The authors would like to thank the following Data Partners for their input into the validation process and their tireless efforts to obtain and prepare charts:
HealthCore, Inc.[Gregory Daniel, Jenni (Jie) Li, Amanda Rodrigez]
HMO Research Network:
Group Health Research Institute [Denise Boudreau, Danelle Wallace]
Fallon Community Health Plan [Susan Andrade]
Humana [Vinit Nair, Mary Costantino]
Kaiser Permanente Center for Effectiveness and Safety Research: [Daniel Jaynes]
Kaiser Permanente Northern California [Daniel Ng]
Kaiser Permanente Georgia [Melissa Butler]
Kaiser Permanente Hawaii [Cynthia Nakasato, Yee Hwa Daida]
In addition, we would like to thank Dr. Karen Hicks for her assistance in developing the abstraction and adjudication tools; Dr. Jorge Yarzebski for assistance in obtaining and abstracting charts; Dr. Guillermo Talero, Michaela Richardson, and Catherine Emery for their careful chart abstraction work; and Dr. David McManus and Dr. Joel Gore for their work as adjudicators. Mini-Sentinel is funded by the Food and Drug Administration (FDA) through Department of Health and Human Services (HHS) Contract Number HHSF223200910006I.
Funding information: This study was supported through funding from contract HHSF223200910006I from the U.S. Food and Drug Administration (FDA).
Footnotes
COI statement: Authors have no conflicts of interest to report.
References
- 1. Behrman RE, Benner JS, Brown JS, et al. Developing the Sentinel System - A National Resource for Evidence Development. N Engl J Med. 2011. doi: 10.1056/NEJMp1014427.
- 2. The Sentinel Initiative: Access to Electronic Healthcare Data for More than 25 Million Lives. Achieving FDAAA Section 905 Goal One. July 2010. http://www.fda.gov/downloads/Safety/FDAsSentinelInitiative/UCM233360.pdf [accessed 31 January 2011].
- 3. PDS supplement paper describing Mini-Sentinel’s mission and policies.
- 4. PDS supplement paper describing CDM and MSDD.
- 5. AMI Validation Final Report: pending.
- 6. PDS supplement paper describing the AMI surveillance project.
- 7. Kachroo S, Jones N, Reynolds MW. Systematic literature review for evaluation of myocardial infarction. Final report prepared for the Foundation for the National Institutes of Health via the Observational Medical Outcomes Partnership (OMOP). United Biosource Corporation; Lexington, MA: 2009.
- 8. Lux L, Jarrett N, West S. Systematic evaluation of health outcome of interest definitions in observational studies and clinical definitions for the Observational Medical Outcomes Partnership: Myocardial infarction report. RTI International; Research Triangle Park, NC: 2009.
- 9. Austin PC, Daly PA, Tu JV. A multicenter study of the coding accuracy of hospital discharge administrative data for patients admitted to cardiac care units in Ontario. Am Heart J. 2002;144:290–296. doi: 10.1067/mhj.2002.123839.
- 10. Choma NN, Griffin MR, Huang RL, et al. An algorithm to identify incident myocardial infarction using Medicaid data. Pharmacoepidemiol Drug Saf. 2009;18:1064–1071. doi: 10.1002/pds.1821.
- 11. Hammar N, Alfredsson L, Rosen M, et al. A national record linkage to study acute myocardial infarction incidence and case fatality in Sweden. Int J Epidemiol. 2001;30(Suppl 1):S30–34. doi: 10.1093/ije/30.suppl_1.s30.
- 12. Kiyota Y, Schneeweiss S, Glynn RJ, et al. Accuracy of Medicare claims-based diagnosis of acute myocardial infarction: estimating positive predictive value on the basis of review of hospital records. Am Heart J. 2004;148:99–104. doi: 10.1016/j.ahj.2004.02.013.
- 13. Madsen M, Davidsen M, Rasmussen S, et al. The validity of the diagnosis of acute myocardial infarction in routine statistics: a comparison of mortality and hospital discharge data with the Danish MONICA registry. J Clin Epidemiol. 2003;56:124–130. doi: 10.1016/s0895-4356(02)00591-7.
- 14. Nguyen-Khoa BA, Goehring EL Jr, Werther W, et al. Hospitalized cardiovascular diseases in neovascular age-related macular degeneration. Arch Ophthalmol. 2008;126:1280–1286. doi: 10.1001/archopht.126.9.1280.
- 15. Pajunen P, Koukkunen H, Ketonen M, et al. The validity of the Finnish Hospital Discharge Register and Causes of Death Register data on coronary heart disease. Eur J Cardiovasc Prev Rehabil. 2005;12:132–137. doi: 10.1097/00149831-200504000-00007.
- 16. Petersen LA, Wright S, Normand SL, et al. Positive predictive value of the diagnosis of acute myocardial infarction in an administrative database. J Gen Intern Med. 1999;14:555–558. doi: 10.1046/j.1525-1497.1999.10198.x.
- 17. Solomon DH, Schneeweiss S, Glynn RJ, et al. Relationship between selective cyclooxygenase-2 inhibitors and acute myocardial infarction in older adults. Circulation. 2004;109:2068–2073. doi: 10.1161/01.CIR.0000127578.21885.3E.
- 18. Varas-Lorenzo C, Castellsague J, Stang MR, et al. Positive predictive value of ICD-9 codes 410 and 411 in the identification of cases of acute coronary syndromes in the Saskatchewan Hospital automated database. Pharmacoepidemiol Drug Saf. 2008;17:842–852. doi: 10.1002/pds.1619.
- 19. Wahl PM, Rodgers K, Schneeweiss S, et al. Validation of claims-based diagnostic and procedure codes for cardiovascular and gastrointestinal serious adverse events in a commercially-insured population. Pharmacoepidemiol Drug Saf. 2010;19:596–603. doi: 10.1002/pds.1924.
- 20. Yeh RW, Sidney S, Chandra M, et al. Population trends in the incidence and outcomes of acute myocardial infarction. N Engl J Med. 2010;362:2155–2165. doi: 10.1056/NEJMoa0908610.
- 21. Department of Health and Human Services, Office of the Secretary. Standards for Privacy of Individually Identifiable Health Information; Final Rule. 45 CFR Parts 160 and 164. Federal Register. 2002:53182–53273.
- 22. Floyd KC, Yarzebski J, Spencer FA, et al. A 30-year perspective (1975-2005) into the changing landscape of patients hospitalized with initial acute myocardial infarction: Worcester Heart Attack Study. Circ Cardiovasc Qual Outcomes. 2009;2:88–95. doi: 10.1161/CIRCOUTCOMES.108.811828.
- 23. Rosati K, Evans B, McGraw D. HIPAA and Common Rule Compliance in the Mini-Sentinel Pilot. http://mini-sentinel.org/work_products/About_Us/HIPAA_and_CommonRuleCompliance_in_the_Mini-SentinelPilot.pdf [accessed 31 January 2011].
- 24. Thygesen K, Alpert JS, White HD, et al. Universal definition of myocardial infarction. Circulation. 2007;116:2634–2653. doi: 10.1161/CIRCULATIONAHA.107.187397.
- 25. Alpert JS, Thygesen K, Jaffe A, et al. The universal definition of myocardial infarction: a consensus document: ischaemic heart disease. Heart. 2008;94:1335–1341. doi: 10.1136/hrt.2008.151233.
- 26. Tate JR, Bunk DM, Christenson RH, et al. Standardisation of cardiac troponin I measurement: past and present. Pathology. 2010;42:402–408. doi: 10.3109/00313025.2010.495246.