PURPOSE
The Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) requires eligible clinicians to report clinical quality measures (CQMs) in the Merit-Based Incentive Payment System (MIPS) to maximize reimbursement. To determine whether structured data in electronic health records (EHRs) were adequate to report MIPS CQMs, EHR data aggregated by ASCO's CancerLinQ platform were analyzed.
MATERIALS AND METHODS
Using the CancerLinQ health technology platform, 19 Oncology MIPS (oMIPS) CQMs were evaluated to determine the presence of the data elements (DEs) necessary to satisfy each CQM and the percentage of patient records populated with data for each DE (fill rate). At the time of this analysis, the CancerLinQ network comprised 63 active practices, representing eight different EHR vendors and containing records for more than 1.63 million unique patients with one or more malignant neoplasms (1.73 million cancer cases).
RESULTS
Fill rates for the 63 oMIPS-associated DEs varied widely among the practices. The average site had at least one filled DE for 52% of the DEs. Only 35% of the DEs were populated for at least one patient record in 95% of the practices. However, the average DE fill rate across all practices was 23%. No data were found at any practice for 22% of the DEs. Because any oMIPS CQM with an unpopulated component DE cannot be computed, only two (10.5%) of the 19 oMIPS CQMs were computable for more than 1% of the patients.
CONCLUSION
Although EHR systems had relatively high DE fill rates for some DEs, underfilling and inconsistency of DEs in EHRs render automated oncology MIPS CQM calculations impractical.
BACKGROUND
Reimbursement for quality of care in medicine has been key to Medicare since its inception. Recently, Congress enacted the Medicare Access and CHIP Reauthorization Act (MACRA)1,2 to replace the sustainable growth rate formula that determined the Centers for Medicare and Medicaid Services (CMS) reimbursement rates. Taking effect in 2017, MACRA set up several models for adjusting quality-based physician reimbursement, including the Merit-based Incentive Payment System (MIPS).3 CMS provides specialty-specific clinical quality measures (CQMs). The 19 cancer-related oncology MIPS (oMIPS) CQMs for 2018 are the focus of this analysis.4
CONTEXT
Key Objective
To determine if federal clinical quality measures for oncology can be automatically calculated from data elements (DEs) held in electronic health record (EHR) systems.
Knowledge Generated
The vast majority of oncology clinical quality measures could not be directly calculated from EHR DEs. None of the studied vendors adequately implemented the necessary clinical quality-related DEs across the full set; even when the DEs were available in an EHR, they were poorly filled.
Relevance
Federal quality initiatives for oncology have recently centered around the automated extraction of high-quality DEs from EHRs. Here, we demonstrate that automated clinical quality calculations are not feasible at present, with negative implications for public health reporting, research, and other secondary uses of EHR data. Solving this problem will require widespread creation and adoption of common DEs along with improvements in the routine capture and exchange of structured data.
Although MIPS provides monetary incentives or penalties for reporting measures, clinicians or their designees often must sift through multiple charts to extract the required data elements (DEs) and, in many cases, manually enter the measure statistics into a separate quality reporting system. Owing to the effort, time, and cost involved, the MIPS reporting requirement can discourage participation in voluntary quality reporting programs, thereby diluting the effectiveness of the quality initiatives.5,6 MIPS reporting, and its associated data validation and auditing, would be more efficient if these MIPS measures could be derived directly from the electronic health record (EHR) without manual abstraction.7,8 EHR data have been investigated as the input to payment-associated quality metrics.9,10 Although the EHR has been proposed as a possible aid to data for MIPS reporting,11 it is not clear whether current EHR implementations can support MIPS submissions through standardized and structured data fields (DFs).
To determine whether EHR DEs can be leveraged for oMIPS reporting, we analyzed patient records in the CancerLinQ health technology platform developed by ASCO. CancerLinQ extracts data from the EHRs implemented at multiple practices by importing data in structured DFs and by manual abstraction facilitated by text mining.12 For this study, we limited the use of unstructured text data to those values that could be readily matched to standard terminologies. At the time of this analysis, 63 health care institutions, representing eight different EHR vendors and containing records for more than 1.63 million unique patients with one or more malignant neoplasms (total of 1.73 million cancer cases), were actively contributing data as part of the CancerLinQ network. This study investigates the feasibility of deriving oMIPS CQMs directly from a variety of EHR systems through the CancerLinQ data set.
MATERIALS AND METHODS
Study Team
The data analysts comprised six oncology domain experts, including oncology informaticians from the CancerLinQ Oncology Informatics Taskforce (ASCO volunteer body under the CancerLinQ Physician Advisory Committee) and CancerLinQ staff from the medical and informatics teams. Regular meetings were held to jointly review data sets and analyses. All analyses are consistent with ASCO/CancerLinQ data privacy standards.
EHR Systems
CancerLinQ practices implemented general EHR and oncology-specific EHR products from commercial vendors with significant market share, including Allscripts (Chicago, IL), ARIA (Varian Medical Systems, Palo Alto, CA), Centricity (GE Healthcare, Chicago, IL), CureMD (New York, NY), Epic (Verona, WI), MOSAIQ (Elekta, Stockholm, Sweden), NextGen (Irvine, CA), and OncoEMR (Flatiron Health, New York, NY). EHR systems and oncology-specific modules were included in these analyses, but other specialty modules such as radiation oncology, pathology, and surgical systems were not examined. To maintain confidentiality, vendor names in all analyses and practice data have been aggregated and/or anonymized.
Oncology Practices
The CancerLinQ practices used for this analysis had completed a series of activation steps, including quality assessments and data completeness reviews. Data quality was confirmed by multiple methods including a review of the active cancer population count, quality measure scores, patient longitudinal record validation, and manual review.
Data Preparation
Structured data were extracted from predefined DFs in EHR databases and imported into the CancerLinQ database. Text strings were also converted into structured data in the canonical CancerLinQ data model as standard clinical terminologies representing diagnoses (ICD-9 or ICD-10),13 staging terms (SNOMED CT),14 medications (RxNorm),15 and laboratory tests (LOINC).16
Development of oMIPS-Associated DEs
Nineteen oMIPS CQMs from 2018 were studied.17 For each measure, the study team identified discrete DEs required to score patient records as meeting or not meeting the measure definition. Each DE represents a data definition for which data must be located to enable measure calculation. For example, the concept of whether a patient smokes may have an associated DE, smoking status. Review of the oMIPS measure components and the data found in each EHR system resulted in a consolidated list of DEs created in the CancerLinQ database, covering the structured data components required to calculate all 19 CQMs. DE definitions are provided in the Data Supplement.
Data Analysis
When a DE was present as a structured field in an EHR database, it was designated as a DF. A DF is a standard location or technical approach to store DE data in an EHR's database in a structured manner. For example, a DE may exist as a prespecified column in a database table or as a defined entry in a data dictionary that is available for storage of a patient's DE data in a more generic yet still-structured manner (eg, using an entity-attribute-value approach18). A given DF is capable of storing a patient's data for a DE. A DF may exist in multiple copies for a given patient, eg, to represent multiple successive values of a laboratory test. Thus, a DF may be filled with DE data one or more times, or it may remain empty (unfilled) for a given eligible patient in a time frame suitable for MIPS measure calculation. Within the measure-applicable time frame, the percentage of patients that had at least one filled DF for a given DE was calculated and is called the DE fill rate. If a DF for a given DE was not present in an EHR, then the DE fill rate for that DE is zero. DE fill rates were calculated separately for each vendor and for each practice.
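The fill-rate definition above can be expressed as a short sketch. The data structures here (patient identifiers and dated DF records) are illustrative assumptions, not CancerLinQ's actual schema:

```python
from datetime import date

def de_fill_rate(patient_ids, filled_records, window_start, window_end):
    """Percentage of eligible patients with at least one filled DF
    for a given DE within the measure-applicable time frame.

    patient_ids: iterable of eligible patient identifiers
    filled_records: iterable of (patient_id, record_date) tuples for the DE;
        a DF may appear multiple times per patient (eg, repeated lab tests)
    """
    eligible = set(patient_ids)
    if not eligible:
        return 0.0
    # Count each patient at most once, and only for in-window records.
    filled_patients = {
        pid for pid, d in filled_records
        if pid in eligible and window_start <= d <= window_end
    }
    # A DE with no DF in the EHR yields no records, hence a 0% fill rate.
    return 100.0 * len(filled_patients) / len(eligible)

rate = de_fill_rate(
    ["p1", "p2", "p3", "p4"],
    [("p1", date(2018, 3, 1)), ("p2", date(2009, 5, 2)), ("p1", date(2018, 6, 9))],
    date(2018, 1, 1), date(2018, 12, 31),
)
# p1 is the only patient with an in-window record: 1 of 4 eligible = 25.0
```

Per the study design, this rate would be computed separately for each practice and each vendor before any averaging.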
To highlight differences among EHR systems, data were analyzed on a per-EHR basis such that data from each EHR system had equal weight for all calculations. In particular, no weighting on the basis of practice size or total patients per vendor was performed for the analyses of DE fill rate percentages or for the calculation of any other statistic. This ensures a balanced comparison among the vendors studied and avoids skewing the results toward the vendors representing the most patients. Descriptive statistics were used to summarize our findings.
RESULTS
EHR Data Set Derivations and Definitions
Data from 2010 to 2020 were extracted from the 63 CancerLinQ practices in this analysis. Data were analyzed for oMIPS CQMs pertaining to all cancers and for subsets limited to breast, prostate, and colorectal cancers. The Data Supplement presents the 19 oMIPS measures and the measure text that was used to search DFs from the eight EHR system databases. Mapping oMIPS measures to component DFs was often complex. As an example of this complexity, the Data Supplement shows the breakdown of the electronic CQM (eCQM) definition for oMIPS measure 16 (bone scans in low- or very low-risk prostate cancer), followed by a discussion of the analysis process.
Table 1 presents the breakdown of cancer patient records among the eight EHR vendors. The top four vendors comprised over 90% of the total cases analyzed.
TABLE 1.
Demographics: The Cancer Case Composition of the Eight EHR Vendor Systems

oMIPS-Associated DEs Were Poorly Populated
For oMIPS-associated DEs, the mean fill rate across all evaluated practices was 23%. Seventy-eight percent of the DEs (49 of 63) were populated with data. Only 23 DEs were filled consistently (ie, for ≥ 90% of the practices). No data were found in any practice for 14 DEs. Figure 1 summarizes these fill rates. Note that, for figure clarity, Figure 1 averages the DE fill rates over the 63 practices, whereas all other figures and tables average data for each equally weighted EHR vendor.
FIG 1.
Fill rates for oMIPS-associated DEs, averaged over all practices. Column 1 shows a list of the 63 DEs analyzed in this study. The percentage of practices (among 63 practices) that had at least one filled DF for at least one patient is shown for each DE in column 2. The average fill rate for each DE across all practices is shown in column 3. Columns 1-3 are split into left and right panels to conserve space. The last row of the right panel computes the average values for columns 2 and 3. The rows are sorted first by column 2, then column 3, and then by DE alphabetical order. To enhance readability, all percentages are rounded to the closest integer value (0-100). The definition for each DE can be found in the Data Supplement. DEs, data elements; DF, data field; oMIPS, Oncology Merit-Based Incentive Payment System.
DE Analysis Case Study: Stage Groups
Recording the stage group in EHRs for all cancers is widely considered a best practice,19-22 and accordingly, these data are populated in almost 87% of the patients for cancer registry data imported into the CancerLinQ data set (data not shown). One hundred percent of the practices in the data set implemented the stage_group DE (Fig 1) as a DF (Fig 1, column 2), and all practices had at least one value (for at least one patient) in this DF. However, across all practices, only 38% of the stage group DFs were filled (Fig 1, column 3). The minimum fill rate for this DE was 0.1% at a single practice, and the maximum fill rate was 60.6% at a single practice, with a standard deviation over the 63 practices of 22%. Standard deviations among the 63 practices and among the eight EHRs were often large and are omitted from the figures for clarity. The complete EHR comparison data set, including a heat map demonstrating the range of DE fill rates, is found in the Data Supplement.
DE Fill Rates Varied Substantially Among EHR Vendors
Fill rates for the individual DFs varied widely among the sites and EHR systems. Figure 2 shows the 63 oMIPS-associated DEs and their respective fill rates filtered by the EHR system. The average fill rate for all DEs across all vendors was approximately 22%.
FIG 2.

Heat map of fill rates for each DE, grouped by the vendor. The fill rate was calculated for each DE, for patient data in each of the eight EHR systems. Each column in the heat map displays DE fill rates from one of the eight vendors. Columns are split into a left and right panel to conserve space. The rightmost columns of both panels show the mean fill rate for each DE, and the last row of the right panel shows the mean fill rate for each vendor. The rows are sorted by the mean DE fill rate (right column). A heat map is applied for the 63 DEs, with green indicating higher fill rates, red indicating lower fill rates, and yellow indicating fill rates of intermediate values. A separate recalibrated heat map is applied to the last row in the second panel (mean fill rate per vendor) to better highlight the small differences in overall fill rates between vendors. The definition for each DE can be found in the Data Supplement. DE, data element; EHR, electronic health record.
Three registration and reimbursement-related DFs (diagnosis_code, age_dob, and gender) were available in structured fields for all sites and all EHR vendors surveyed. Similarly, other heavily referenced fields (encounter_type, diagnosisdate, and tobacco) were filled most of the time (> 78%). Conversely, over half of the DEs (32 of 63) had a fill rate of < 10%, and only 21 of the 63 DEs (33%) exceeded a 25% fill rate. The average fill rate for the top 50% of the DEs was 42%; the average fill rate for the bottom 50% was only 0.7%. Figure 2 also presents a breakout of the above-mentioned results grouped by the eight EHR vendors. To preserve vendor anonymity, EHR vendors are listed neither in alphabetical order nor in the order in which they appear in Table 1. Figure 2 shows that, below the top six DEs listed above, some vendors began to show DE fill rates of zero, beginning with blood pressure measurements. A DE fill rate of zero indicates that no data were received from practices using that EHR vendor, because of either a lack of captured data or the absence of the DF in any of that vendor's implementations. With only a single exception (date_of_death), every subsequent DE had at least one vendor with a fill rate of zero. At the bottom of the DE list, 16 DEs had fill rates of zero for all vendors: alcohol_counseling, alcohol_reduction, brachytherapy, cause_of_death, consult_report, consulted_md, consulting_provider, cryotherapy, dietary_changes, er_visit, icu, lifestyle_recommend, pain_plan, physical_activity, primary_care_referral, and surgical_procedure.
No additional consistent patterns of DE fill rate failures among vendors were noted. However, a few fill rate outliers were detected. For example, vendors 6 and 7 had very low fill rates for diagnosisdate; vendor 8 had a 97% fill rate for med_reconciliation; vendor 2 had a 100% fill rate for advance_directive; and vendor 5 had a relatively high 81% fill rate for alcohol_use. Although individual DE fill rates varied across source systems, this variability did not correlate with measure calculability for any single vendor.
oMIPS Measure Calculability Was Uniformly Poor
The probability of finding data in a given DF equals the DE fill rate. Therefore, for most CQMs, the calculability of an oMIPS CQM may be approximated by the product of the fill rates of its component DEs. Because a 0% fill rate for any DE essential to a measure yields 0% calculability, 11 of the 19 oMIPS CQMs had a calculability of 0%, as presented in Table 2. Only two oMIPS CQMs (M9 and M18, as defined in the Data Supplement, both related to tobacco use) had calculability > 1%. These two measures were likely more successful because they had only three to four component DEs (three of which were shared between them) and because tobacco cessation is a common component of many quality initiatives.
TABLE 2.
Oncology MIPS Measure Calculability

The fraction of component DEs with a > 50% fill rate, presented in Table 2 (column 4), is only loosely correlated with calculability. Higher fill rates (and no zero fill rates) would be required to usefully automate the calculation of MIPS measures from EHR data. The concept of useful fill rates is explored in more detail in the Data Supplement. Table 2 also includes the results covering measures specific for colorectal, breast, and prostate cancer.
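The calculability approximation (the product of component DE fill rates, under an assumption of independence between DEs) can be sketched as follows. The fill-rate values below are illustrative, not the study's measured rates:

```python
from math import prod

def measure_calculability(component_fill_rates):
    """Approximate a CQM's calculability as the product of its component
    DE fill rates (given in percent), assuming DEs are filled
    independently of one another."""
    return prod(rate / 100.0 for rate in component_fill_rates)

# Hypothetical three-DE measure with relatively high fill rates.
well_filled = measure_calculability([95.0, 80.0, 60.0])  # 0.456

# Any component DE with a 0% fill rate forces calculability to zero,
# which is why 11 of the 19 oMIPS CQMs were entirely incomputable.
with_gap = measure_calculability([95.0, 80.0, 0.0, 60.0])  # 0.0
```

The product form also shows why measures with fewer component DEs (such as the tobacco-related M9 and M18) fare better: each additional DE multiplies in another fraction below 1.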
DISCUSSION
This study exposes significant and pervasive limitations of automated EHR data extraction for oMIPS CQMs because of sparse DF availability and low fill rates. Even where DFs were available, there was little standardization across EHR vendors and practices, as well as poor fill rates for most DFs. Our results highlight the challenges preventing meaningful exchange of oncology data, despite the recent enactment of the information-blocking provisions of the 21st Century Cures Act.23 One challenge is the complexity and frequent change of oncology DEs, which do not lend themselves easily to standardization and to the maintenance of interoperable, computer-readable, and up-to-date data dictionaries. Another challenge is the lack of a mandate to implement data capture standards within EHRs. Data in structured EHR fields vary widely among implementations because data capture standards have not been widely adopted by EHR systems and because practices do not routinely share data capture templates.
Even when structured DFs do exist, most are sparsely populated in practice, and many clinical concepts are captured only in unstructured text notes, which many clinicians still prefer to generate by dictation. Existing natural language processing (NLP) systems require significant implementation and tuning effort at each site to achieve acceptable performance.24 Additionally, narrative text lacks the standardized answer lists and predetermined value sets that guide a user entering data through an EHR user interface, which can skew NLP-generated data sets relative to structured data capture.
Similar challenges have plagued other medical specialties. Our results are consistent with previous studies that assessed the impact of EHR limitations and data gaps for reporting CQMs across multiple practice settings, including primary care, emergency care, postdischarge settings, and cardiology.9,10,25,26 In a 2016 study,25 the EHR extraction of nine cardiovascular CQMs took an average of almost 460 days of effort, with an average of 8.4 separate processing tasks required per measure. A 2014 study10 assessed the DE fill rates required to compute five National Quality Forum CQMs and found that most measures required DEs with very low fill rates. The authors concluded, in part, that none of the five measures could be readily computed and that manually intensive steps, including data mapping and text mining, would be required to extract the measure data. A 2020 study of eCQMs27 found inconsistencies in the definitions of measure concepts (eg, in Clinical Quality Language expressions27-29), logical phrases, value sets, terminologies, and the DEs required to satisfy the various measures. Our study of 19 oMIPS CQMs identified numerous examples of these problems, as exemplified by the case study described in the Data Supplement.
There are several ways to improve automated reporting from the EHR. Using a policy-based approach, CMS could retract measures that cannot be automatically extracted. CMS could also incentivize the community-based development of national standards for structured data capture of DEs. In addition, CMS could incentivize the routine automated exchange of standardized DEs between separate EHR systems and EHR modules. Data capture and transmission standards will also be required to automate the reporting of quality measures. One example of this approach is the Minimal Common Oncology Data Elements (mCODE) project, which produces oncology-specific data specifications under the auspices of ASCO and Health Level Seven International (HL7).30
Another example of data capture and transmission standards is the College of American Pathologists (CAP) Cancer Protocols (CCPs).31 The CCP documents contain cancer case summaries (checklists) that guide pathologists to create reports containing standard data structures, which are essentially DEs. Beginning in 2009, CAP began releasing the CCP's checklists in a computer-readable XML format so that pathology software vendors could make the standardized data-entry forms and storage mechanisms available inside the existing pathology systems and EHRs. These widely used pathology templates are known as electronic Cancer Checklists (eCCs).31-35
Although we completed a comprehensive search for required data to support the measure set calculations described in this analysis, we did not have access to EHR data dictionaries for each vendor and practice. Therefore, we cannot exclude the possibility that despite considerable manual effort, desired DE data values were present in EHRs but not found. EHR implementation also frequently involves significant customization to accommodate local clinical workflows and data capture preferences. Such customization accommodates clinical workflows, but it decreases standardization and interoperability and increases the difficulty of data extraction.
Additionally, poor DE fill rates can result from multiple causes: lack of standard DEs; lack of EHR DFs for DEs; lack of DE usage in EHR data-entry forms; lack of structured data entry by clinicians and other health care providers; inadequate vendor approaches to exchanging data between EHR modules, other EHRs, sites, or practices; vendor systems that do not permit incorporating externally sourced structured data; and vendor systems that exchange data in non-interoperable formats, such as narrative text, PDF, or fax. Each of these DE-related issues decreases the ability to enter, locate, and exchange standardized and structured data.
These results demonstrate that, in the oncology use case, the EHRs and oncology practices studied are incapable of satisfying oncology MIPS reporting requirements through retrieval of clinically recorded structured DEs. Crossing the chasm between quality measures and high-quality data will require widespread creation and adoption of common DEs along with improvements in the routine capture and exchange of structured data.
PRIOR PRESENTATION
Presented at ASCO Annual Meeting 2019 (abstr e18074), May 26, 2019, Chicago, IL.
SUPPORT
CDC Cooperative Agreement (award 5 NU58DP006457-03-00; R.M.). NIH NCATS UL1TR003167 (E.V.B.). NIH NCATS 5U01TR002393 (E.V.B.). Cancer Prevention Research Institute of Texas (CPRIT) Precision Oncology Decision Support Core RP150535 (E.V.B.). Data Science and Informatics Core for Cancer Research RP170668 (E.V.B.). Reynolds and Reynolds Professorship (E.V.B.).
A.E.S., R.M., and J.K. contributed equally to this work.
AUTHOR CONTRIBUTIONS
Conception and design: Anna E. Schorer, Richard Moldwin, Jacob Koskimaki, Elmer V. Bernstam, Neeta K. Venepalli, Robert S. Miller, James L. Chen
Collection and assembly of data: Anna E. Schorer, Richard Moldwin, Jacob Koskimaki, Neeta K. Venepalli, Robert S. Miller, James L. Chen
Data analysis and interpretation: Anna E. Schorer, Richard Moldwin, Jacob Koskimaki, Elmer V. Bernstam, Neeta K. Venepalli, Robert S. Miller, James L. Chen
Manuscript writing: All authors
Final approval of manuscript: All authors
Accountable for all aspects of the work: All authors
AUTHORS' DISCLOSURES OF POTENTIAL CONFLICTS OF INTEREST
The following represents disclosure information provided by authors of this manuscript. All relationships are considered compensated unless otherwise noted. Relationships are self-held unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject matter of this manuscript. For more information about ASCO's conflict of interest policy, please refer to www.asco.org/rwc or ascopubs.org/cci/author-center.
Open Payments is a public database containing information reported by companies about payments made to US-licensed physicians.
Anna E. Schorer
Employment: Oncology Analytics
Richard Moldwin
Other Relationship: College of American Pathologists
Jacob Koskimaki
This author is a member of the JCO Clinical Cancer Informatics Editorial Board. Journal policy recused the author from having any role in the peer review of this manuscript.
Elmer V. Bernstam
Stock and Other Ownership Interests: IBM
Consulting or Advisory Role: Samsung Bioepis, Xencor, Debiopharm Group, Silverback Therapeutics, IBM Watson Health, IBM Watson Health, F. Hoffmann LaRoche, PACT Pharma, eFFECTOR Therapeutics, Kolon Life Sciences, Tyra Biosciences, Zymeworks, Puma Biotechnology, Zentalis, Alkermes, Infinity Pharmaceuticals, AbbVie, Black Diamond Therapeutics, Eisai, OnCusp Therapeutics, Lengo Therapeutics, Tallac Therapeutics, Karyopharm Therapeutics, Biovica
Speakers' Bureau: Chugai Pharma
Research Funding: Taiho Pharmaceutical, Debiopharm Group, Bayer, Puma Biotechnology, CytomX Therapeutics, Jounce Therapeutics, Zymeworks, Curis, Boehringer Ingelheim, Novartis, AstraZeneca, Genentech, Calithera Biosciences, Aileron Therapeutics, Pfizer, eFFECTOR Therapeutics, AbbVie, Guardant Health, Daiichi Sankyo, GlaxoSmithKline, Seattle Genetics, Klus Pharma, Takeda
Travel, Accommodations, Expenses: Roche, Boehringer Ingelheim
Neeta K. Venepalli
Honoraria: Lilly
James L. Chen
Consulting or Advisory Role: Syapse
Speakers' Bureau: Foundation Medicine
Research Funding: Eisai
Patents, Royalties, Other Intellectual Property: MatchTX
No other potential conflicts of interest were reported.
REFERENCES
1. Findlay S, Berenson R, Lott R, et al: Health Policy Brief: Implementing MACRA. Physicians who treat Medicare beneficiaries are subject to a new law and regulations governing their payment. Health Affairs:1-5, 2017
2. H.R.2—114th Congress (2015-2016): Medicare Access and CHIP Reauthorization Act of 2015. Library of Congress, 2015. https://www.congress.gov/bill/114th-congress/house-bill/2?q=%7B%22search%22:[%22114publ10%222]%7D&s=2&r=1
3. Medicare Program: Merit-Based Incentive Payment System (MIPS) and Alternative Payment Model (APM) incentive under the physician fee schedule, and criteria for physician-focused payment models. Fed Regist 42:77008-77831, 2016
4. Eligible Professional/Eligible Clinician eCQMs. eCQI Resource Center. https://ecqi.healthit.gov/ep-ec?globalyearfilter=2018
5. Nabhan C, Smith Y, Ernst FR, et al: Barriers to MACRA among a cohort of community oncologists. J Clin Oncol 35, 2017 (suppl 15; abstr 6613)
6. Spivack SB, Laugesen MJ, Oberlander J: The politics and policy of health reform: No permanent fix: MACRA, MIPS, and the politics of physician payment reform. J Health Polit Policy Law 43:1025-1040, 2018
7. Hess CT: 2017 Merit-Based Incentive Payment System data validation and auditing. Adv Skin Wound Care 30:432, 2017
8. Merit-Based Incentive Program System (MIPS): Data Validation and Audit Factsheet. Centers for Medicare & Medicaid Services, Quality Payment Program, 2019. https://qpp-cm-prod-content.s3.amazonaws.com/uploads/590/MIPS Data Validation and Audit Fact Sheet.pdf
9. Ahmad FS, Rasmussen LV, Persell SD, et al: Challenges to electronic clinical quality measurement using third-party platforms in primary care practices: The healthy hearts in the heartland experience. JAMIA Open 2:423-428, 2019
10. Amster A, Jentzsch J, Pasupuleti H, et al: Completeness, accuracy, and computability of National Quality Forum-specified eMeasures. J Am Med Inform Assoc 22:409-416, 2015
11. Advancing Care Information Reporting. https://www.healthit.gov/topic/federal-incentive-programs/MACRA/MIPS/advancing-care-information-reporting
12. Potter D, Brothers R, Kolacevski A, et al: Development of CancerLinQ, a health information learning platform from multiple electronic health record systems to support improved quality of care. JCO Clin Cancer Inform 4:929-937, 2020
13. ICD-10 Online Versions. World Health Organization. http://www.who.int/classifications/icd/icdonlineversions/en/
14. SNOMED—5-Step Briefing. https://www.snomed.org/snomed-ct/five-step-briefing
15. RxNorm. U.S. National Library of Medicine. https://www.nlm.nih.gov/research/umls/rxnorm/
16. LOINC. https://loinc.org/
17. MIPS Explore Measures—QPP. https://qpp.cms.gov/mips/explore-measures?tab=qualityMeasures&py=2018#measures
18. Nadkarni PM, Marenco L, Chen R, et al: Organization of heterogeneous scientific data using the EAV/CR representation. J Am Med Inform Assoc 6:478-493, 1999
19. Shulman LN, Miller RS, Ambinder EP, et al: Principles of safe practice using an oncology EHR system for chemotherapy ordering, preparation, and administration, part 2 of 2. JCO Oncol Pract 4:254-257, 2008
20. Evans TL, Gabriel PE, Shulman LN: Cancer staging in electronic health records: Strategies to improve documentation of these critical data. JCO Oncol Pract 12:137-139, 2016
21. Sinaiko AD, Barnett ML, Gaye M, et al: Association of peer comparison emails with electronic health record documentation of cancer stage by oncologists. JAMA Netw Open 3:e2015935, 2020
22. Carr LL, Zelarney P, Meadows S, et al: Development of a cancer care summary through the electronic health record. JCO Oncol Pract 12:e231-e240, 2016
23. 21st Century Cures Act: Interoperability, Information Blocking, and the ONC Health IT Certification Program. https://www.federalregister.gov/documents/2020/05/01/2020-07419/21st-century-cures-act-interoperability-information-blocking-and-the-onc-health-it-certification
24. Carrell DS, Schoen RE, Leffler DA, et al: Challenges in adapting existing clinical natural language processing systems to multiple, diverse health care settings. J Am Med Inform Assoc 24:986-991, 2017
25. Weiskopf NG, Khan FJ, Woodcock D, et al: A mixed methods task analysis of the implementation and validation of EHR-based clinical quality measures. AMIA Annu Symp Proc 2016:1229-1237, 2017
26. Parsons A, McCullough C, Wang J, et al: Validity of electronic health record-derived quality measurement for performance monitoring. J Am Med Inform Assoc 19:604-609, 2012
27. McClure RC, Macumber CL, Skapik JL, et al: Igniting harmonized digital clinical quality measurement through terminology, CQL, and FHIR. Appl Clin Inform 11:23-33, 2020
28. CQL—Clinical Quality Language. eCQI Resource Center. https://ecqi.healthit.gov/cql
29. Rhodes B: Electronic Clinical Quality Measure (eCQM) Clinical Quality Language (CQL) Basics for Eligible Professionals and Eligible Clinicians. Centers for Medicare & Medicaid Services, 2018. https://ecqi.healthit.gov/system/files/EP-EC_CQL_Basics_Webinar.pdf
30. Osterman TJ, Terry M, Miller RS: Improving cancer data interoperability: The promise of the Minimal Common Oncology Data Elements (mCODE) initiative. JCO Clin Cancer Inform 4:993-1001, 2020
31. Torous VF, Simpson RW, Balani JP, et al: College of American Pathologists cancer protocols: From optimizing cancer patient care to facilitating interoperable reporting and downstream data use. JCO Clin Cancer Inform 5:47-55, 2021
32. Simpson RW, Berman MA, Foulis PR, et al: Cancer biomarkers: The role of structured data reporting. Arch Pathol Lab Med 139:587-593, 2015
33. Srigley J, Lankshear S, Brierley J, et al: Closing the quality loop: Facilitating improvement in oncology practice through timely access to clinical performance indicators. JCO Oncol Pract 9:e255-e261, 2013
34. Goel AK, Campbell WS, Moldwin R: Structured data capture for oncology. JCO Clin Cancer Inform 5:194-201, 2021
35. Idowu MO, Bekeris LG, Raab S, et al: Adequacy of surgical pathology reporting of cancer: A College of American Pathologists Q-probes study of 86 institutions. Arch Pathol Lab Med 134:969-974, 2010

