Abstract
PURPOSE:
Although the provision of a treatment summary (TS) is a quality indicator in oncology, routine delivery of TSs remains challenging. Automatic TS generation could facilitate use, but data on accuracy are lacking in complex cancers such as head and neck cancer (HNC). We developed and evaluated an electronic platform to automate TS generation for HNC.
METHODS:
The algorithms autopopulated TSs using data from billing records and an institutional cancer registry. A nurse practitioner used the medical record to verify the accuracy of the information and made corrections electronically. Inaccurate and missing data were considered errors. We described and investigated reasons for errors in the automatically generated TSs.
RESULTS:
We enrolled a heterogeneous population of 43 survivors of HNC. Using billing data, the information on primary site, lymph node status, radiation, and chemotherapy use was accurate in 93%, 95%, 93%, and 95% of patients, respectively. Billing data captured surgery accurately in 77% of patients; once an omitted billing code was identified, accuracy increased to 98%. Chemotherapies were captured in 90% of patients. Using the cancer registry, month and year of diagnosis were accurate in 91% of cases; stage was accurate in 28% of cases. Reprogramming the algorithm to ascertain clinical stage when pathologic stage was unavailable resulted in 100% accuracy. The algorithms inconsistently identified radiation receipt and treating physicians from billing data.
CONCLUSION:
It is feasible to automatically and accurately generate most components of TSs for HNC using billing and cancer registry data, although clinical review is necessary in some cases.
INTRODUCTION
The provision of a survivorship care plan, including a treatment summary (TS) and a follow-up plan, is an important oncology quality indicator for the ASCO Quality Oncology Practice Initiative and the Commission on Cancer.1,2 However, the resources needed to create TSs pose barriers to routine implementation.3-9 Studies leveraging electronic platforms to autopopulate TSs have been described for breast, endometrial, and ovarian cancers.10-13 We tested this approach for a complex cancer population by developing a program to automate TS generation for head and neck cancer (HNC), a heterogeneous diagnosis that encompasses multiple disease sites and multimodality therapy.14-20 We used data from the billing system and an institutional cancer registry to automate TS creation. If accurate, our Web-based program could offer a means to simplify TS generation across cancers. Our objective was to describe the algorithmic logic and examine the accuracy of the automatically generated TSs.
METHODS
This study is part of a feasibility study of an electronic program (Feasibility Study of the Head and Neck Survivorship Tool: Assessment and Recommendations [HN-STAR]) developed to ease the delivery of survivorship care plans for survivors of HNC. The feasibility study is detailed elsewhere.21 Any survivor who is at least 3 years out from treatment of HNC is eligible for the HNC survivorship clinic; survivors meeting this criterion were invited to participate in the feasibility study, which was approved by the institutional review board at Memorial Sloan Kettering Cancer Center.
Overview
A medical oncologist (S.S.B.) and a programmer analyst (R.S.) developed a database script that extracts data from hospital and physician billing records and an institutional tumor registry to autopopulate ASCO-recommended, salient elements of a TS.22 This process was external to the electronic health record (EHR) to ensure interoperability for future dissemination. A TS was generated for each study participant. A nurse practitioner (NP) reviewed each TS and confirmed or corrected the diagnosis and treatment details (Data Supplement). We modified the database script to address specific programming errors.
Treatment Summary Generation (Data Elements)
Data for the TS were compiled using International Classification of Diseases (ICD)-9 and -10 site codes, Current Procedural Terminology (CPT) codes, institutional billing codes, Healthcare Common Procedural Coding System J codes, and institutional tumor registry data. Primary sites were identified in the SEER manual for HNC, and corresponding CPT codes were identified in Encoder Pro software (Optum360, Eden Prairie, MN).23 We categorized related diagnoses that require identical follow-up care (Table 1).24
TABLE 1.
Head and Neck Cancer Site and Billing Codes From Hospital and Physician Billing History
HN-STAR ascertained ICD-9 and -10 codes for the primary sites of HNC; if billing codes for at least one primary site were listed, the most common diagnosis code was selected. Date of diagnosis (month and year) and staging information (ie, tumor, nodal, and comprehensive stage) were captured from the institutional tumor registry. Lymph node involvement was ascertained from billing codes. We prioritized pathologic over clinical stage information when both were available. Surgery, chemotherapy, and radiation therapy were assessed independently. Each primary site was mapped, via surgical CPT codes, to an exhaustive list of surgical procedures that could be performed to treat tumors at that site. HN-STAR used CPT codes to identify radiation and chemotherapy. A list of potential radiation fields (ie, primary site, left neck, or right neck) was presented to the clinician for manual entry during TS verification. Chemotherapy administration was captured with CPT codes, and routinely used chemotherapeutic agents were identified using Healthcare Common Procedural Coding System J codes.25 The primary surgeon was identified from the billing invoice for each surgical CPT code. All other providers were identified as any physician who submitted claims for relevant chemotherapy and radiation treatment.
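The site-selection and surgery-mapping logic described above can be sketched as follows. This is a minimal illustration, not the study's implementation: the ICD codes, CPT codes, and map contents shown here are invented assumptions (the actual codes are in Table 1), and only the selection rules (most common billed site code; intersection of site-mapped surgical CPT codes with billed codes) come from the text.

```python
from collections import Counter

# Illustrative ICD-to-site map; the study's actual codes appear in Table 1.
HNC_SITE_CODES = {
    "C01": "base of tongue",
    "C09.9": "tonsil",
    "C32.9": "larynx",
}

# Hypothetical site-to-surgical-CPT map (contents invented for illustration).
SITE_SURGERY_CPT = {
    "larynx": {"31360", "31365"},
}

def pick_primary_site(billed_icd_codes):
    """Select the most common billed HNC site code, per the rule described in the text."""
    hits = [code for code in billed_icd_codes if code in HNC_SITE_CODES]
    if not hits:
        return None
    most_common_code, _count = Counter(hits).most_common(1)[0]
    return HNC_SITE_CODES[most_common_code]

def surgeries_for_site(site, billed_cpt_codes):
    """Return billed surgical CPT codes that map to the identified primary site."""
    return sorted(SITE_SURGERY_CPT.get(site, set()) & set(billed_cpt_codes))
```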
Treatment Summary Verification
For each patient, HN-STAR generated a TS for the NP to review electronically. The NP verified its accuracy compared with clinical documentation within the EHR. The NP manually corrected incorrect or erroneously omitted fields. The NP also entered human papillomavirus status and radiation fields when applicable (Data Supplement). HN-STAR electronically captured all corrections.
Analysis
We analyzed the accuracy of the following fields: primary tumor site, lymph node status, receipt of treatment (ie, surgery, radiation, and chemotherapy), surgery type, chemotherapeutic agents, treating physicians, dates of diagnosis, and staging. We report the accuracy (calculated as the number of accurately autopopulated items divided by the total number of potential data captures) for each field. We categorized each error as a programming error (ie, a mistake in our algorithms) or nonprogramming error (ie, an error that could not be corrected algorithmically). We revised the database scripts to correct programming errors, when possible. We describe nonprogramming errors and their implications.
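As a minimal sketch of the accuracy measure defined above (accurately autopopulated items divided by the total number of potential data captures for that field):

```python
def field_accuracy(n_correct: int, n_total: int) -> float:
    """Accuracy for one TS field: accurately autopopulated items over potential captures."""
    if n_total == 0:
        raise ValueError("no potential data captures for this field")
    return n_correct / n_total
```

For example, stage was accurately autopopulated in 12 of the 43 cases, giving an accuracy of approximately 28%.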
RESULTS
Survivor Population
Forty-three survivors participated; they were a median of 60 years old (range, 29 to 83 years), predominantly male (86%), and a median of 79 months (range, 46 to 140 months) from diagnosis. There was a range of primary tumor sites, with oropharynx predominating (56%). Treatment included surgery, radiation, and chemotherapy in 21 (49%), 36 (84%), and 32 (74%) of cases, respectively. Thirty-six survivors (84%) received multimodality therapy.
Accuracy of TS
The primary site and lymph node status were correct in 93% and 95% of patients, respectively. Month and year of diagnosis were accurate in 91% of cases. Although the use of radiation or chemotherapy was correctly identified in 93% and 95% of cases, respectively, surgery was appropriately identified in 77% of cases. Treating physicians were accurately identified in 80% of cases. Stage was accurate in 12 cases (28%; Table 2).
TABLE 2.
Source of Error by Field
Sources of Errors
The most notable error was inaccuracy of staging. The original algorithm did not default to clinical stage when pathologic stage was unavailable. A second error led to only T stage, rather than comprehensive stage, being included. Once corrected, the platform accurately identified stage in all cases.
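The corrected staging logic amounts to a simple fallback rule, sketched below. The registry field names are assumptions for illustration; only the rules (report comprehensive stage, prefer pathologic, default to clinical when pathologic is missing) come from the text.

```python
def pick_comprehensive_stage(registry_record: dict):
    """Corrected logic: report comprehensive (overall) stage rather than T stage alone,
    preferring pathologic stage and defaulting to clinical stage when it is unavailable.
    Field names are illustrative, not the registry's actual schema."""
    return (registry_record.get("pathologic_comprehensive_stage")
            or registry_record.get("clinical_comprehensive_stage"))
```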
The most significant inaccuracy from billing codes was omission of a surgery from nine TSs. We inadvertently excluded one CPT code, resulting in eight omissions. A misclassification of a surgery to the wrong ICD-9 or -10 site code prevented capture of one surgery. Once corrected, the accuracy of the surgery algorithm was 98%, with the final missed surgery having occurred at a different institution.
Three patients had multiple cancers in the institutional registry, which led to inaccurate dates of diagnosis, because the algorithm pulled dates only from the first cancer diagnosis in the registry. Receiving inpatient treatment led to an omission of chemotherapy in one case. Treatment at a different institution was responsible for a single error in each of the following fields: diagnosis month and year, primary site, surgery date, and chemotherapy use. Specific chemotherapies were missed in four cases when the regimen included more than one drug.
DISCUSSION
We leveraged electronic billing and cancer registry data to generate a largely automated TS for survivors of HNC. We successfully identified primary site, lymph node status, diagnosis date, and receipt of radiation and chemotherapy in greater than or equal to 90% of cases. Once we corrected programming errors, we achieved similar accuracy in capturing surgeries and staging. The algorithms use diagnostic and billing information, which are increasingly available electronically.
Institutions differ in their ability to query electronic databases for treatment summary data. Many institutional registries capture basic clinical data electronically but likely differ in how data are stored and organized, which would require modification of our logic. For example, if a hospital does not have an electronic cancer registry, billing data could be used to ascertain diagnosis dates. In addition, the engineering effort to combine multiple data sources could vary between institutions. Our algorithm, which identified treating physicians as any physician who submitted a relevant claim for HNC treatment, failed to distinguish the primary responsible physicians from other physicians. A future iteration of this procedure will present a checklist of all providers, ordered by billing frequency, for verification. In addition, chemotherapy is not completely captured using our algorithms, because inpatient treatment is not reflected in the billing codes we queried. Like previous efforts, our algorithms failed to capture treatment that occurred outside the institution or before the adoption of our current EHR.11 Until there is more connectivity between EHRs, generating TSs will remain challenging for survivors treated at multiple institutions.
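The planned provider checklist, ordered by billing frequency, could look like the following sketch. The claim records and provider names are hypothetical; only the ordering rule comes from the text.

```python
from collections import Counter

def provider_checklist(claims):
    """Order candidate treating physicians by how often they billed for relevant
    treatment, most frequent first, for the clinician to confirm during TS
    verification. `claims` is a hypothetical list of (provider, cpt_code) tuples."""
    counts = Counter(provider for provider, _cpt in claims)
    return [provider for provider, _n in counts.most_common()]
```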
Although we included a heterogeneous population of survivors from our institution, a limitation is that there may be diagnostic and treatment codes we did not test because they are not routinely used at our institution. We also did not time either the verification process or the process of manually creating a treatment summary. Although these data could help describe the benefit of automation, the timing of the automated and the manual processes likely varies too widely between providers and institutions to reliably generalize beyond our study.
In summary, our algorithms successfully captured the vast majority of diagnostic and treatment data required to automate TS generation, but verification is still required for specific cases. For patients who received treatment at other institutions, had multiple HNCs, or had an unspecified site of disease, the TS requires verification. If chemotherapy is not identified from billing codes, the TS of patients who had inpatient visits should also be verified. These algorithms could be modified for use with other EHRs, for survivors earlier in their survivorship trajectory, and for survivors of other cancers.26 Although HN-STAR cannot fully automate TS generation, automation could greatly reduce the resource barriers to the delivery of survivorship care plans.
ACKNOWLEDGMENT
Supported by the National Cancer Institute (NCI; Grant No. R21 CA187441 [S.S.B. and T.S.]), a Cancer Center Support Grant from the NCI to Memorial Sloan Kettering Cancer Center (Grant No. P30 CA008748), and the Head and Neck Survivorship Fund. S.S.B. was supported by a Clinical Scholar Award from the Leukemia & Lymphoma Society.
APPENDIX
Fig A1.
Treatment summary.
AUTHOR CONTRIBUTIONS
Conception and design: Shrujal S. Baxi, Ranjit Sukhu, Stacie Corcoran, Andrew Salner, Andrew J. Vickers, Mary S. McCabe, Talya Salz
Administrative support: Elizabeth Fortier, Mary S. McCabe
Provision of study material or patients: Andrew Salner, Mary S. McCabe
Collection and assembly of data: Shrujal S. Baxi, Ranjit Sukhu, Elizabeth Fortier, Stacie Corcoran, Mary S. McCabe, Talya Salz
Data analysis and interpretation: Shrujal S. Baxi, Ranjit Sukhu, Andrew Salner, Talya Salz
Manuscript writing: All authors
Final approval of manuscript: All authors
Accountable for all aspects of the work: All authors
AUTHORS’ DISCLOSURES OF POTENTIAL CONFLICTS OF INTEREST
Automating Treatment Summary Development Using Electronic Billing Information: A Pilot Study of Survivors of Head and Neck Cancer
The following represents disclosure information provided by authors of this manuscript. All relationships are considered compensated. Relationships are self-held unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject matter of this manuscript. For more information about ASCO's conflict of interest policy, please refer to www.asco.org/rwc or ascopubs.org/jop/site/ifc/journal-policies.html.
Shrujal S. Baxi
Employment: Flatiron Health
Stock and Other Ownership Interests: Flatiron Health
Andrew Salner
Consulting or Advisory Role: IBM
Andrew J. Vickers
Stock and Other Ownership Interests: OPKO Health
Patents, Royalties, Other Intellectual Property: I am named on a patent for a statistical method to detect prostate cancer. This method has been commercialized by Opko as the “4Kscore.” I receive royalties from sales of the 4Kscore.
No other potential conflicts of interest were reported.
REFERENCES
- 1. American Society of Clinical Oncology: ASCO Treatment Summary and Survivorship Care Plans, 2013. http://www.cancer.net/survivorship/follow-care-after-cancer-treatment/asco-cancer-treatment-and-survivorship-care-plans
- 2. American College of Surgeons Commission on Cancer: Cancer program standards, version 1.2: Ensuring patient-centered care, 2016
- 3. Salz T, McCabe MS, Onstad EE, et al: Survivorship care plans: Is there buy-in from community oncology providers? Cancer 120:722-730, 2014
- 4. Salz T, Oeffinger KC, McCabe MS, et al: Survivorship care plans in research and practice. CA Cancer J Clin 62:101-117, 2012
- 5. Birken SA, Deal AM, Mayer DK, et al: Determinants of survivorship care plan use in US cancer programs. J Cancer Educ 29:720-727, 2014 [Erratum: J Cancer Educ 29:608-610, 2014]
- 6. Birken SA, Deal AM, Mayer DK, et al: Following through: The consistency of survivorship care plan use in United States cancer programs. J Cancer Educ 29:689-697, 2014
- 7. Birken SA, Mayer DK, Weiner BJ: Survivorship care plans: Prevalence and barriers to use. J Cancer Educ 28:290-296, 2013
- 8. Blanch-Hartigan D, Forsythe LP, Alfano CM, et al: Provision and discussion of survivorship care plans among cancer survivors: Results of a nationally representative survey of oncologists and primary care physicians. J Clin Oncol 32:1578-1585, 2014
- 9. Dulko D, Pace CM, Dittus KL, et al: Barriers and facilitators to implementing cancer survivorship care plans. Oncol Nurs Forum 40:575-580, 2013
- 10. Collie K, McCormick J, Waller A, et al: Qualitative evaluation of care plans for Canadian breast and head-and-neck cancer survivors. Curr Oncol 21:e18-e28, 2014
- 11. Tevaarwerk AJ, Wisinski KB, Buhr KA, et al: Leveraging electronic health record systems to create and provide electronic cancer survivorship care plans: A pilot study. J Oncol Pract 10:e150-e159, 2014
- 12. Donohue S, Sesto ME, Hahn DL, et al: Evaluating primary care providers’ views on survivorship care plans generated by an electronic health record system. J Oncol Pract 11:e329-e335, 2015
- 13. Nicolaije KAH, Ezendam NPM, Vos MC, et al: Oncology providers’ evaluation of the use of an automatically generated cancer survivorship care plan: Longitudinal results from the ROGY Care trial. J Cancer Surviv 8:248-259, 2014
- 14. Cohen EE, LaMonte SJ, Erb NL, et al: American Cancer Society Head and Neck Cancer Survivorship Care Guideline. CA Cancer J Clin 66:203-239, 2016
- 15. Adelstein DJ, Li Y, Adams GL, et al: An intergroup phase III comparison of standard radiation therapy and two schedules of concurrent chemoradiotherapy in patients with unresectable squamous cell head and neck cancer. J Clin Oncol 21:92-98, 2003
- 16. Denis F, Garaud P, Bardet E, et al: Final results of the 94-01 French Head and Neck Oncology and Radiotherapy Group randomized trial comparing radiotherapy alone with concomitant radiochemotherapy in advanced-stage oropharynx carcinoma. J Clin Oncol 22:69-76, 2004
- 17. Yabroff KR, Lawrence WF, Clauser S, et al: Burden of illness in cancer survivors: Findings from a population-based national sample. J Natl Cancer Inst 96:1322-1330, 2004
- 18. Harrington CB, Hansen JA, Moskowitz M, et al: It’s not over when it’s over: Long-term symptoms in cancer survivors--a systematic review. Int J Psychiatry Med 40:163-181, 2010
- 19. Stricker CT, Jacobs LA: Physical late effects in adult cancer survivors. Oncology (Williston Park) 22:33-41, 2008 (suppl 8, Nurse Ed)
- 20. Stein KD, Syrjala KL, Andrykowski MA: Physical and psychological long-term and late effects of cancer. Cancer 112:2577-2592, 2008 (suppl 11)
- 21. Salz T, McCabe MS, Oeffinger KC, et al: A head and neck cancer intervention for use in survivorship clinics: A protocol for a feasibility study. Pilot Feasibility Stud 2:23, 2016
- 22. Mayer DK, Nekhlyudov L, Snyder CF, et al: American Society of Clinical Oncology clinical expert statement on cancer survivorship care planning. J Oncol Pract 10:345-351, 2014
- 23. Optum360: Encoder Pro for Payers Professional. https://www.encoderprofp.com/epro4payers/
- 24. Pfister DG, Spencer S, Adelstein D, et al (eds): National Comprehensive Cancer Network Clinical Practice Guidelines in Oncology: Head and Neck Cancers (ed 1). Fort Washington, PA, National Comprehensive Cancer Network, 2018
- 25. Pfister DG, Spencer S, Brizel DM, et al: Head and Neck Cancers, Version 1.2015. J Natl Compr Canc Netw 13:847-855, 2015
- 26. American Society of Clinical Oncology: The state of cancer care in America, 2017: A report by the American Society of Clinical Oncology. J Oncol Pract 13:e353-e394, 2017