Abstract
A report on how accurately physicians used coding methodology in a nationwide demonstration by the Centers for Medicare & Medicaid Services designed to enhance the quality of cancer treatment and care and promote evidence-based practices.
Improving the quality of cancer care has been an important issue for both the cancer care and health policy communities. The Institute of Medicine noted the lack of knowledge about the quality of cancer care and the need for better measurement and surveillance of clinical effectiveness in community settings.1 Since then, several professional organizations have made progress in developing cancer-specific measures of clinical effectiveness,2 but others have noted the lack of widespread implementation of measurement, especially in the diagnosis and treatment of cancer.3
The challenges facing those developing, implementing, and evaluating programs that encourage the use of clinical guidelines have been discussed to varying degrees in the literature.4 However, evidence indicating which implementation methods are most effective remains limited.5 Barriers include lack of organizational support, clinicians' concerns over the quality of guidelines or the evidence on which they are based, patient preference, clinician reluctance to change, lack of familiarity, financial constraints, and the impracticality or complexity of the guidelines.4,6,7 According to Timmermans et al,8 “nonadherence to clinical practice guidelines remains the major barrier to the successful practice of evidence-based medicine.”
In January 2006, the Centers for Medicare & Medicaid Services (CMS) redesigned a 2005 demonstration developed to enhance the quality of cancer treatment and care and promote “evidence-based best practices that have been proven to lead to improved patient outcomes.”9 The new, year-long nationwide demonstration encouraged office-based oncologists and hematologists, for the first time to our knowledge, to report clinical information on cancer disease states through the Medicare billing system. The demonstration was limited to the following specialties: hematology (specialty code 82), hematology/oncology (specialty code 83), medical oncology (specialty code 90), and gynecological oncology (specialty code 98). For 13 cancer diagnoses (breast cancer, chronic myelogenous leukemia, colon cancer, esophageal cancer, gastric cancer, head and neck cancer, multiple myeloma, non-Hodgkin's lymphoma, non–small-cell and small-cell lung cancers, ovarian cancer, pancreatic cancer, prostate cancer, and rectal cancer), oncologists were encouraged to use clinical guidelines developed by the National Comprehensive Cancer Network (NCCN) and ASCO.
Eighty-one new G-codes were developed for the oncology demonstration in the following three categories: first, the current disease state as best understood clinically at the time of the visit; second, the primary reasons for a patient's evaluation and management visit; and third, physician self-reported assessment of whether the patient's management adhered to clinical guidelines. Physicians reporting at least one G-code for each of the three categories were eligible for an additional payment of $23 per visit. The demonstration started in January 2006 and concluded on December 31, 2006, and more than 5,600 physicians—approximately two thirds of eligible oncologists and hematologists nationwide—participated.
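To make the payment rule concrete, the following is a minimal sketch of the eligibility check described above: a visit qualifies for the additional payment only when at least one G-code from each of the three categories is reported. The category code ranges used here are illustrative assumptions (the article confirms only a handful of specific codes, such as G9050, G9051, and the disease-state codes in Table 1), not the official CMS code lists.

```python
# Minimal sketch of the demonstration's payment-eligibility rule.
# NOTE: the code ranges below are illustrative assumptions, not the
# official CMS category lists for the 81 demonstration G-codes.
VISIT_REASON_CODES = {f"G{n}" for n in range(9050, 9056)}    # assumed range
ADHERENCE_CODES = {f"G{n}" for n in range(9056, 9063)}       # assumed range
DISEASE_STATE_CODES = {f"G{n}" for n in range(9063, 9140)}   # assumed range


def visit_qualifies(reported_codes):
    """True when a visit reports at least one G-code from each category."""
    codes = set(reported_codes)
    return all(codes & category for category in
               (VISIT_REASON_CODES, ADHERENCE_CODES, DISEASE_STATE_CODES))


# Example: a visit reporting a reason code, an adherence code, and a
# disease-state code would earn the additional $23 payment.
print(visit_qualifies({"G9050", "G9056", "G9071"}))  # True
print(visit_qualifies({"G9050", "G9071"}))           # False: no adherence code
```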
This article focuses on the extent to which participating physicians used this methodology to report cancer staging properly and accurately. It describes the experience of oncologists and hematologists as they adapted their practice to report clinical information and guideline adherence on Medicare claims. It also presents implications for working on the front lines with physician practices, keeping in mind what was learned about their approach to coding and its potential impact on performance measurement.
Methods
Several methods were used between October 2006 and October 2007 to evaluate the 2006 Medicare Oncology Demonstration program. These included a national survey of 526 participating physicians; site visits to nine oncology practices in Pennsylvania, Virginia, Iowa, Nebraska, and Colorado; telephone interviews with oncologists or practice managers in Colorado, Georgia, Illinois, Nebraska, New York, Pennsylvania, Texas, and Washington; and an analysis of claims data. The details of these methods have been described elsewhere.10
Findings
Rapid Implementation of Demonstration Coding Presented Challenges for Many Oncology Practices
More than 40% of physicians participating in the survey reported the required coding, billing, data reporting, and documentation to be “very difficult” or “difficult.” Nonetheless, almost two thirds of physicians surveyed indicated that they “always” submitted G-codes for qualifying visits. In total, 2.9 million demonstration claims were submitted, representing 23% of the claims submitted by participating physicians for the 13 cancers.
Interviews with practice administrative staff revealed that they were involved in all aspects of the demonstration, including recommending participation in the demonstration, developing worksheets to summarize demonstration G-codes, and revising internal superbills to track and report claims payment. The physician survey revealed that 30% of oncologists reported that their nonphysician staff had to take on “a lot” of extra work to participate in the demonstration, compared with 11% of oncologists who said that the demonstration required “a lot” of extra effort on their own part.
Administrative staff quickly realized that the success of the demonstration depended on making it “as easy as possible for the docs.” As a result, they developed summary documentation and coding worksheets. However, this summary information often did not provide a sufficient level of precision to ensure that coding was consistent with demonstration guidelines. Table 1 provides examples from site visits of incorrectly summarized code descriptions. This lack of precision led some physicians to make incorrect assumptions about how to code new and existing patients. For example, one interviewed physician used the evaluation and management code G9050 for all work-ups, not just those performed at the time of diagnosis or staging, as indicated in the CMS instructions. Such misinterpretations could have implications for the accuracy of analyses when codes are used to determine whether physicians properly recommended adjuvant therapy (or followed clinical guidelines). To determine the extent of error, all claims with a G9050 code were checked to determine whether the same patient had visited an oncologist in the previous 90 days. Approximately 3% of eligible claims (ie, claims billed by participating physicians for any of the 13 cancers) that included a G9050 code were miscoded and likely should have been reported with G9051, the code for established patients. However, the demonstration data suggested that participating physicians appropriately reported disease state.
Table 1. Examples From Site Visits of Incorrectly Summarized Code Descriptions

Code | CMS Instructions | Selected Examples From Site Visits
---|---|---
G9071 (breast cancer, female) | Oncology; disease status; invasive female breast cancer (does not include ductal carcinoma in situ); adenocarcinoma as predominant cell type; stage I or stage IIA-IIB; or T3, N1, M0; and ER and/or PR positive; with no evidence of disease progression, recurrence, or metastases | Onc Dx brst Stg 1-2B no dx pr |
G9077 (prostate cancer) | Oncology; disease status; prostate cancer, limited to adenocarcinoma as predominant cell type; T1-T2C and Gleason 2-7 and PSA ≤ 20 at diagnosis with no evidence of disease progression, recurrence, or metastases | Onc Dx prostate T1 no progress |
G9065 (non-small cell lung cancer) | Oncology; disease status; limited to non-small cell lung cancer; extent of disease initially established as stage IIIA (prior to neo-adjuvant therapy, if any) with no evidence of disease progression, recurrence, or metastases | NSCLC Stage IIIA, stable |
Abbreviations: CMS, Centers for Medicare & Medicaid Services; ER, estrogen receptor; PR, progesterone receptor; PSA, prostate-specific antigen.
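The 90-day lookback used above to quantify G9050 miscoding can be illustrated with a short claims-analysis sketch. This is not the evaluation's actual code; the file and column names (claims.csv, patient_id, visit_date, g_code) are hypothetical stand-ins for a Medicare claims extract.

```python
import pandas as pd

# Hypothetical claims extract: one row per visit, with a patient identifier,
# the visit date, and the reported G-code (all column names are assumptions).
claims = pd.read_csv("claims.csv", parse_dates=["visit_date"])

# Put each patient's visits in chronological order.
claims = claims.sort_values(["patient_id", "visit_date"])

# Date of the patient's previous oncology visit, if any.
claims["prev_visit"] = claims.groupby("patient_id")["visit_date"].shift(1)

# A G9050 (work-up at diagnosis/staging) claim is flagged as likely miscoded
# when the same patient saw an oncologist within the previous 90 days;
# patients with no prior visit produce NaT, which correctly flags as False.
g9050 = claims[claims["g_code"] == "G9050"].copy()
days_since = (g9050["visit_date"] - g9050["prev_visit"]).dt.days
g9050["likely_miscoded"] = days_since <= 90

rate = g9050["likely_miscoded"].mean()
print(f"Share of G9050 claims with a prior visit within 90 days: {rate:.1%}")
```

In the evaluation's analysis, roughly 3% of eligible G9050 claims were flagged this way and likely belonged under G9051 for established patients.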
Oncologists' Perceptions of Clinical Guidelines and Adherence Varied Considerably
Interviews with oncologists revealed differing nomenclature used within the context of clinical guidelines. In describing clinical guidelines, none of the physicians used the Field et al11 definition of “systematically developed statements to assist practitioner and patient decisions about appropriate health care for specific clinical circumstances.” Instead, some physicians used terms interchangeably, such as “best practices,” “evidence-based medicine,” “evidence-based guidelines,” “standards of care,” and “clinical pathways.”
Interviewed physicians also interpreted the demonstration code for adherence to guidelines in various ways, from “very strict” to “very loose,” despite reporting adherence to established standards of care. Interviewed physicians reported that they often did not review the applicable guidelines to ensure that their treatment decision was within the recommendations. A few physicians interviewed said they would check “guideline adherence” on the coding form even if one of the applicable guidelines for that visit was not followed, because at least some aspect of the care provided was usually within the guidelines. Claims analyses indicated that nine of 10 physicians reported that their management of the patient adhered to guidelines.
Surveyed physicians were more likely to agree than disagree about the importance of using clinical guidelines, and the majority did not find them difficult to use. The majority of surveyed physicians reported that they looked up and/or followed clinical guidelines and identified the stage of cancer with the same frequency as before the demonstration. However, approximately one third of the physicians reported determining adherence to guidelines as “difficult.” Some interviewed physicians indicated that they referred to the clinical guidelines only for patients with diagnoses with which they were not familiar. Other physicians said that their fast-paced environment limited their ability to regularly look up the guidelines for each patient, only checking “management differs from guidelines” when they were consciously working outside their normal approach to treatment. Table 2 (Data Supplement, online only) shows how often physicians looked up a clinical guideline to confirm a G-code for a patient.
The demonstration identified varying perceptions of clinical guidelines and their use and points to the importance of recognizing common barriers to implementing the use of clinical guidelines on the front lines of medical practice. Previous research has shown that guideline characteristics affect the frequency of their use, with those that are easier to follow and those not requiring specific resources having a better chance of implementation.5 Multifaceted approaches to guideline implementation have generally proven more successful, especially when interactive educational strategies, clinical reminders and decision support systems, patient-specific interventions, and the production of practical guidelines of low complexity are used.4
Oncology Practices Adapted Differently to Implementation Challenges
The interviews and physician survey showed that practices developed vastly different approaches to helping physicians review the applicable clinical guidelines. Some printed copies of the guidelines and stored them in binders placed at practice stations. Others counted on physicians to look up the guidelines on their office computers. Some physicians used handheld devices to reference guidelines, as they had before the demonstration. Fewer than half of the participating practices responding to the physician survey reported having an electronic medical record (EMR) system in place at the time of the survey.
One physician practice was part of a regional oncology group refining a customized reference and reminder system embedded within its EMR. This system automatically provided prompts, comparing each chemotherapy regimen recommended for a patient with the relevant clinical guideline. Whenever oncologists using the EMR recommended adjuvant therapy outside the guidelines, they were required to document their rationale for this decision, and they received regular reports on the percentage of time that their chemotherapy treatment fell outside the approved guidelines as well as a comparison of how their practice performed within the overall network.
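As a rough illustration of how such a prompt might work, the sketch below compares a recommended regimen against a guideline list and requires a documented rationale for off-guideline choices. The guideline table and regimen names are invented for illustration; a production system would draw its content from the NCCN or ASCO guidelines and live inside the EMR workflow.

```python
# Illustrative sketch of an EMR guideline prompt; the guideline table and
# regimen names below are invented, not actual NCCN/ASCO content.
GUIDELINE_REGIMENS = {
    ("breast cancer", "adjuvant"): {"AC-T", "TC"},  # hypothetical entries
}


def review_recommendation(diagnosis, setting, regimen, rationale=None):
    """Flag off-guideline regimens and require a documented rationale."""
    approved = GUIDELINE_REGIMENS.get((diagnosis, setting), set())
    if regimen in approved:
        return {"within_guidelines": True, "regimen": regimen}
    if not rationale:
        # Mirrors the EMR behavior described above: the oncologist cannot
        # proceed without documenting why the regimen departs from guidelines.
        raise ValueError("Off-guideline regimen: rationale must be documented.")
    return {"within_guidelines": False, "regimen": regimen,
            "rationale": rationale}
```

Aggregating the within_guidelines flags over time would yield the kind of regular adherence reports, and network-wide comparisons, that the practice described.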
Age and experience of physicians have also been reported as key variables in the adoption of clinical guidelines, with younger and less experienced physicians more likely to refer to and adhere to guidelines.5 Some older physicians interviewed were more skeptical of the value of clinical guidelines, stating that they are “too broad and lack the needed specificity” to address the complexities of their treatment planning. Others indicated that guidelines fail to address the clinical steps after failed treatment plans or multiple comorbidities. Most physicians acknowledged the credibility and helpfulness of clinical guidelines for infrequent diagnoses (eg, rectal or head and neck cancer). However, many indicated that usefulness depended on which guidelines were used.
Promising approaches to multifaceted implementation include reliance on opinion leaders, concurrent feedback to clinicians, educational outreach, and recognition that one model does not fit all.4 These approaches also may result in physicians devoting attention and resources to understanding not only the guidelines themselves but also the importance of documenting how they have been used appropriately, to ensure that any coding used for purposes of monitoring and measurement is both accurate and complete.
Discussion
This demonstration yielded a number of valuable lessons. Despite efforts by CMS and affected medical specialty societies to help physicians transition into the demonstration, improved collaboration with medical and practice management societies and enhanced communication with practice managers, staff, and participating physicians before the implementation of a project like this would have been valuable. Pretesting coding instructions and practice tool kits would have ensured that the materials were more readily understandable. In future projects, consideration should be given to conducting training sessions with physicians and administrative staff to stress the importance of understanding guideline use, review the coding requirements, and emphasize the importance of consistent data collection. Finally, offering support through effective reminder systems, preferably embedded in physicians' decision-support systems or EMRs, could also improve the ongoing use of evidence-based medicine in day-to-day practice.
The physician interviews and the survey results identified significant challenges in correctly coding clinical information within a system that was designed for billing. Physicians unaccustomed to providing detailed clinical information on billing forms, especially when they do not recognize the implications of coding inconsistencies, are likely to make incorrect assumptions that could potentially result in serious errors in the computation of compliance rates. These potential inconsistencies in reporting are additionally exacerbated by other challenges in compiling data on complex episodes of care across settings and specialties and over long periods of time.
Oncologists are increasingly becoming only one part of a team of health providers caring for patients with cancer. The growing trend toward multidisciplinary practice in the treatment of many cancers makes it difficult to interpret practice patterns from claims data alone, such as determining which tests and procedures did or did not occur as a result of the involvement of a given specialty. Effective monitoring of clinical guideline compliance must encompass a broad set of providers and settings and be performed over a time period long enough to allow for all the care that could be provided.
Despite the seemingly limited impact of this demonstration, oncologists can learn valuable lessons from considering the implications of this experience on the future of their practice. First, oncologists as a community would benefit from taking a close look at their varied levels of understanding, knowledge, and attitudes about clinical guidelines; the meaning and value of guidelines; and the extent to which oncologists' attitudes affect their ability and willingness to consistently practice evidence-based medicine. Although those participating in the demonstration generally did a good job reporting cancer staging, they were far less consistent and careful in referring to widely accepted clinical guidelines and in coding the care, given the complex and specific instructions provided. Ensuring that accurate clinical information is summarized for complex treatment regimens cannot be delegated to administrative staff.
Given the increasing pressure to demonstrate improved outcomes, oncologists would benefit from identifying areas in which the active promotion of clinical guidelines and their adherence in day-to-day practice are most likely to have a positive impact on both cost and patient outcomes. Focusing on and understanding the barriers inherent in busy practices related to implementing complex and multiple clinical guidelines and identifying a strategy for encouraging consistent use of evidence-based medicine in these areas would benefit all involved.
Finally, the oncology community would benefit from developing and adopting tools and a multifaceted approach to validate and check treatment recommendations and ensure they are consistent with evidence-based medicine. These efforts would optimally include the development of automated reminder and reference systems as well as other means that more easily integrate checks and reminders about complex but important treatment guidelines for busy practitioners in everyday clinical decision making. The increasing support and pressure to adopt EMR systems provide an opportunity for oncologists to become actively involved in developing and refining decision support systems that encourage such activities. Oncologists must remain actively engaged in adopting evidence-based medicine. Inevitably, if they fail as a community to meet these challenges, complex and less helpful means of monitoring clinical guideline adherence will be imposed on them by payers and regulators.
Acknowledgment
We thank Pauline Karikari-Martin, MPH, MSN, RN, project officer for the Evaluation of the Second Phase of the Oncology Demonstration, Centers for Medicare & Medicaid Services. Special thanks also goes to our volunteer expert panel members, who provided valuable input throughout the evaluation of the 2006 demonstration. The analyses on which this report is based were performed under Contract No. HHSM-500-2006-00009I/Task Order #2, jointly funded by the Centers for Medicare & Medicaid Services and the National Cancer Institute, Department of Health and Human Services, which ended in May 2009. There was no policy determination at the time of award of the contract. The content of this publication does not necessarily reflect the views or policies of the department, nor does mention of trade names, commercial products, or organizations imply endorsement by the US Government. The authors assume full responsibility for the accuracy and completeness of the ideas presented.
Authors' Disclosures of Potential Conflicts of Interest
The authors indicated no potential conflicts of interest.
Author Contributions
Conception and design: Julia Doherty, Myra Tanamor
Administrative support: Julia Doherty
Collection and assembly of data: Julia Doherty, Myra Tanamor
Data analysis and interpretation: Julia Doherty, Myra Tanamor, John Feigert, Judy Goldberg-Dey
Manuscript writing: Julia Doherty, Myra Tanamor, John Feigert, Judy Goldberg-Dey
Final approval of manuscript: Julia Doherty, Myra Tanamor, John Feigert, Judy Goldberg-Dey
References
1. Hewitt M, Simone JV, editors. Ensuring Quality Cancer Care. Washington, DC: National Academies Press; 1999.
2. Eden J, Simone JV, editors. Assessing the Quality of Cancer Care: An Approach to Measurement in Georgia. Washington, DC: National Academies Press; 2005.
3. Clauser SB. Use of cancer performance measures in population health: A macro-level perspective. J Natl Cancer Inst Monogr. 2004;33:142–154. doi: 10.1093/jncimonographs/lgh020.
4. Prior M, Guerin M, Grimmer-Somers K. The effectiveness of clinical guideline implementation strategies: A synthesis of systematic review findings. J Eval Clin Pract. 2008;14:888–897. doi: 10.1111/j.1365-2753.2008.01014.x.
5. Francke AL, Smit MC, de Veer AJ, et al. Factors influencing the implementation of clinical guidelines for health care professionals: A systematic meta-review. BMC Med Inform Decis Mak. 2008;8:38. doi: 10.1186/1472-6947-8-38.
6. Mendelson D, Carino TV. Evidence-based medicine in the United States: De rigueur or dream deferred? Health Aff (Millwood). 2005;24:133–136. doi: 10.1377/hlthaff.24.1.133.
7. Cabana MD, Rand CS, Powe NR, et al. Why don't physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999;282:1458–1465. doi: 10.1001/jama.282.15.1458.
8. Timmermans S, Mauck A. The promises and pitfalls of evidence-based medicine. Health Aff (Millwood). 2005;24:18–28. doi: 10.1377/hlthaff.24.1.18.
9. Centers for Medicare & Medicaid Services. Fact Sheets: Details for 2006 Oncology Demonstration Program. www.cms.hhs.gov/apps/media/press/factsheet.asp?Counter=1717&intNumPerPage=10&checkDate=&checkKey=&srchType=1&numDays=3500&srchOpt=0&srchData=&keywordType=All&chkNewsType=6&intPage=&showAll=&pYear=&year=&desc=true&cboOrder=title.
10. Tanamor MA, Doherty J, Feigert J, et al. Evaluation of the Second Phase of the Oncology Demonstration: Contract No. HHSM-500-2006-00009I/Task Order #2. http://www.cms.hhs.gov/Reports/Downloads/Tanamor_2009.pdf.
11. Field MJ, Lohr KN, editors. Clinical Practice Guidelines: Directions for a New Program. Washington, DC: National Academies Press; 1990.