Abstract
Background
Health services researchers have studied how care from oncologists impacts treatment and outcomes for cancer patients. These studies frequently identify physician specialty using files from the Center for Medicare and Medicaid Services (CMS) or the American Medical Association (AMA). The completeness of the CMS data resources, individually or combined, to identify oncologists is unknown. This study assessed the sensitivity of CMS data to capture oncologists included in the AMA Physician Masterfile.
Methods
Oncologists were identified from three CMS data resources: physician claims, the National Plan and Provider Enumeration System Registry, and the Medicare Data on Provider Practice and Specialty file. CMS files and AMA data were linked using a unique physician identifier. Sensitivity to identify any oncologists, radiation oncologists (ROs), surgical oncologists (SOs), and medical oncologists (MOs) was calculated for individual and combined CMS files. For oncologists in the AMA data not identified as oncologists in the CMS data, their CMS specialty was assessed.
Results
Individual CMS files each captured approximately 83% of the 17 934 oncologists in the AMA Masterfile; combined CMS files captured 90.4%. By specialty, combined CMS data captured 98.2% of ROs, 89.3% of MOs, and 70.1% of SOs. For ROs and SOs in the AMA data not identified as oncologists in the CMS data, their CMS specialty was usually in a related field: ROs were most often listed as radiologists and SOs as surgeons.
Conclusion
Combined CMS files identified nearly all ROs and most MOs found in the AMA data but missed a sizeable portion of SOs. Determining whether to use the AMA data or CMS files for a particular research project will depend on the specific research question and the type of oncologist included in the study.
Researchers frequently use health data, such as Medicare claims or the linked Surveillance, Epidemiology, and End Results (SEER) cancer registry-Medicare files, to study treatment and outcomes for cancer patients. These data can be used to assess the role of physicians in population-based cancer care by determining how physician specialty impacts the type of treatment and outcomes for patients (1–11). For example, a prior SEER-Medicare study demonstrated that individual radiation oncologists had a statistically significant role in determining if women with breast cancer received hypofractionated radiation therapy (1). Another study reported that African American patients with pancreatic cancer were less likely to consult a cancer specialist and receive recommended treatment than white patients (5).
Individual physicians can be identified from Medicare claims by using the National Provider Identifier (NPI), a unique number assigned to each physician by the Center for Medicare and Medicaid Services (CMS) that is used as the required identifier on health-care claims. Some researchers analyzing the Medicare data will determine if a physician is an oncologist by linking the NPIs on the Medicare claims to the American Medical Association (AMA) Physician Masterfile, which includes physician specialty (12). However, data from the AMA Masterfile are expensive, limiting its accessibility to some researchers. To overcome the cost of the AMA data, some researchers have used files available from CMS to determine physician specialty. The availability of specialty information on CMS files varies by the data resource. Physician claims from CMS include the performing physician’s self-reported specialty, and they are readily accessible to researchers who have obtained claims. CMS also maintains two other data resources that have information about physician specialty, including oncology: the National Plan and Provider Enumeration System (NPPES) Registry (13), which can be downloaded for free, and the Medicare Data on Provider Practice and Specialty (MD-PPAS) file (14), which is available at a nominal annual cost.
The availability of different data resources that include information about physician specialty presents researchers with uncertainty about which files are the best to identify oncologists. Prior studies have found that the AMA Masterfile includes more oncologists than are identified from Medicare claims (15,16). However, these prior studies did not assess the other, aforementioned, CMS data resources to identify oncologists. The purpose of this study was to assess the utility of all data resources available from CMS—Medicare claims, the NPPES Registry, and the MD-PPAS file—to identify oncologists. The analysis used the AMA data specialty classification compared with individual CMS data resources, as well as combinations of the CMS data resources, to determine the sensitivity of the CMS data to identify oncologists reported in the AMA data. For physicians who were correctly identified as oncologists in the CMS data, we compared the listed oncology specialty (radiation, surgical, or medical oncologist) in the AMA and CMS data. For oncologists in the AMA data who were not identified as oncologists in any CMS data resource, we examined the specialty that was reported on the CMS files. The composite of this information will provide researchers with an in-depth understanding of each CMS data resource’s utility to identify oncologists for studies related to practice patterns and outcomes.
Methods
Data Resources
AMA Physician Masterfile.
The AMA Physician Masterfile (AMA data) is a database that maintains current and historical information for all physicians, medical residents, and medical students in the United States, including doctors of medicine and doctors of osteopathy (12). The AMA data contain more than 1.4 million US physicians, including more than 400 000 foreign medical graduates who are certified to practice in the United States. Information about each physician in the AMA data is compiled from multiple sources. The AMA assigns a unique number to every student entering a US medical school and verifies his or her graduation with the Association of American Medical Colleges. Physician specialty information is obtained from residency programs and American Board of Medical Specialties certification, which covers graduates of both US and international medical schools. Physicians may update their practice data through self-report to the AMA at any time. The AMA data include information such as the physician’s NPI, medical school and residency training, state where the physician’s practice is located, primary and, if reported, secondary specialty, and type of practice (ie, whether the primary physician activity is direct patient care, teaching, administration, etc.).
Physician Claims Included on the SEER-Medicare Data.
The SEER registries, funded by the National Cancer Institute, are population-based cancer registries that, at the time of data analysis, captured all incident cancers occurring in 28% of the US population. Each registry collects information on all newly diagnosed cancers occurring in a defined geographic region. The SEER-Medicare data link persons in the SEER data to their Medicare files, if they are eligible for and enrolled in Medicare (17). The Medicare data include information for each beneficiary about Medicare eligibility (parts A and B) and health maintenance organization or fee-for-service enrollment. For beneficiaries with fee-for-service coverage, the SEER-Medicare data include claims submitted by providers. Medicare claims from physicians include the NPI and self-reported specialty for the treating physician.
NPPES Registry.
The NPPES Registry, maintained by CMS, is a complete repository of NPIs for individual providers (physicians and other health-care providers) and for health-care organizations. NPIs have been the standard identifier for all health-care providers covered by the Health Insurance Portability and Accountability Act since 2007 (13). NPIs are assigned when a health-care provider enrolls in the NPPES Registry. Medical specialty is self-reported by physicians at the time they apply for an NPI. Physicians may report one primary specialty and up to 14 secondary specialties. Specialties are coded in the NPPES data using the standardized Healthcare Provider Taxonomy Codes established by the National Uniform Claim Committee (18). Data as of June 2017 were included in this analysis.
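As an illustration, the public NPPES dissemination file can be screened for oncology taxonomy codes (the codes listed in Table 1) with a few lines of code. The sketch below is a minimal example and not part of the study's methods; the file name and the NPI and taxonomy-code column labels reflect our reading of the NPPES file layout and should be verified against the data dictionary for the release being used.

```python
# Minimal sketch: flag NPIs with an oncology taxonomy code in the public
# NPPES dissemination file. File name and column labels are assumptions
# based on the NPPES layout; verify against the current data dictionary.
import pandas as pd

ONCOLOGY_TAXONOMY_CODES = {
    "2085R0001X",  # radiation oncology
    "2086X0206X",  # surgical oncology
    "207VX0201X",  # gynecological oncology
    "207RX0202X",  # medical oncology
    "207RH0000X",  # hematology
    "207RH0003X",  # hematology/oncology
}

# One primary plus up to 14 secondary taxonomy codes per provider.
taxonomy_cols = [f"Healthcare Provider Taxonomy Code_{i}" for i in range(1, 16)]

oncologist_npis = set()
# The full file is large, so read it in chunks.
for chunk in pd.read_csv("npidata_pfile.csv", usecols=["NPI"] + taxonomy_cols,
                         dtype=str, chunksize=500_000):
    has_oncology_code = chunk[taxonomy_cols].isin(ONCOLOGY_TAXONOMY_CODES).any(axis=1)
    oncologist_npis.update(chunk.loc[has_oncology_code, "NPI"])
```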
MD-PPAS File.
The MD-PPAS file has been produced annually by CMS since 2008. The goal of the file is to provide enhanced information about individual providers’ practice and specialty. The annual MD-PPAS files contain a record for any provider who had a valid NPI and submitted a part B noninstitutional claim for evaluation and management services, procedures, imaging, or nonlaboratory testing during the year (14). For this analysis, the MD-PPAS files from 2008 to 2015 were included. The annual files include each provider’s NPI, demographic information, and specialty. Specialty is self-reported when the provider enrolls in Medicare’s Provider Enrollment, Chain and Ownership System (PECOS), using the same specialty codes as those on the physician claims. Providers are required to enroll in PECOS to receive payment from Medicare, and they must revalidate their PECOS data annually. PECOS allows providers to report two specialties, but almost all physicians report only one. A small number of physicians do not report a specialty in PECOS; for these physicians, specialty is determined from the physician claims used to create the MD-PPAS file, and if more than one specialty is reported on the claims, the MD-PPAS file assigns the most frequently reported specialty.
Identification of Oncologists and Other Physician Specialties
Study Population.
Physicians were eligible for inclusion in the study if they were included in the AMA data and either their primary or secondary specialty was oncology per the AMA as of July 2017. If more than one oncology specialty was reported on the AMA data, the specialty was assigned in the following hierarchical order: radiation oncologist (RO), surgical oncologist (SO), and medical oncologist (MO). Furthermore, physicians were only included if their type of practice listed in the AMA data was direct patient care because these physicians were likely to submit claims to Medicare and, thus, be included in the other files being assessed. Of the AMA oncologists, 19.5% (n = 4338) were excluded because they did not provide direct patient care.
Classification of Physician Specialty
Oncologists and oncology specialties (ROs, SOs, and MOs) were identified as shown in Table 1. Physicians in the CMS data resources could also have nononcology specialties; these specialties were identified as shown in the Supplementary Appendix (available online). On all CMS files, physician specialty was self-reported. For the physician claims and the MD-PPAS file, we used the CMS-developed codes for the performing provider specialty variable and the primary specialty variable, respectively. On the NPPES file, specialty was identified from taxonomy codes.
Table 1. Codes used to identify oncologists and oncology specialties in the AMA data and CMS data resources

Category† | Oncology specialty | AMA data codes* | Medicare physician claims and MD-PPAS codes | NPPES codes
---|---|---|---|---
Any oncologist | Any of the specialties below | See below | See below | See below
Radiation oncologist | Radiation oncology | RO | 92 | 2085R0001X
Surgical oncologist | Surgical oncology | SO | 91 | 2086X0206X
Surgical oncologist | Advanced surgical oncology | ASO | N/A | N/A
Surgical oncologist | Gynecological oncology | GO | 98 | 207VX0201X
Medical oncologist | Medical oncology | MO, ON | 90 | 207RX0202X
Medical oncologist | Hematology | HEM, HMP | 82 | 207RH0000X
Medical oncologist | Hematology/Oncology | HO | 83 | 207RH0003X
*Only physicians whose type of practice was listed as direct patient care were included. AMA = American Medical Association; CMS = Center for Medicare and Medicaid Services; MD-PPAS = Medicare Data on Provider Practice and Specialty; NPPES = National Plan and Provider Enumeration System; N/A = no applicable code.
†If more than one oncology specialty was listed within each data resource, oncology specialty was assigned using a hierarchical approach: 1) radiation oncologist, 2) surgical oncologist, and 3) medical oncologist.
Some of the data resources allowed multiple specialties to be listed, creating the possibility for different specialties to be reported within a file or between files. To assign a physician specialty for the CMS data resources, we developed a hierarchical approach. First, within each data resource, if there was any indication that a physician was an oncologist, he or she was classified as such. Then, among the physicians identified as oncologists, oncology specialty was assigned in the following hierarchical order: RO, SO, and MO. For example, if two oncology specialties (RO and SO) were listed in a file for the same physician, the higher-ranked specialty (RO) was assigned. The same hierarchical approach was used to consolidate oncology specialties across CMS data resources. The CMS data resources also included physicians with nononcology specialties, whom we classified into groups. To consolidate physicians’ information within and across the CMS data resources, we extended our hierarchical approach to nononcology specialties in the following order: radiology, surgery, “other” medical specialty, hospital-based, and primary care (see Supplementary Appendix, available online). These nononcologist physicians were identified only from the CMS data resources.
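A minimal sketch of this hierarchical consolidation is shown below. The specialty codes are those in Table 1; the function and data structures are our own illustration, not the authors' analytic code.

```python
# Illustrative sketch of the hierarchical specialty assignment described above.
# Specialty codes are taken from Table 1; everything else is a simplified
# stand-in for the study's actual programs.

# Lower number = higher priority (RO > SO > MO).
ONCOLOGY_RANK = {"radiation": 1, "surgical": 2, "medical": 3}

# CMS performing-provider / MD-PPAS specialty codes (Table 1).
CLAIMS_CODE_TO_ONCOLOGY = {
    "92": "radiation",
    "91": "surgical", "98": "surgical",   # surgical and gynecological oncology
    "90": "medical", "82": "medical", "83": "medical",
}

# NPPES taxonomy codes (Table 1).
NPPES_CODE_TO_ONCOLOGY = {
    "2085R0001X": "radiation",
    "2086X0206X": "surgical", "207VX0201X": "surgical",
    "207RX0202X": "medical", "207RH0000X": "medical", "207RH0003X": "medical",
}

def consolidate_oncology_specialty(specialties):
    """Collapse the oncology specialties reported for one physician, within or
    across CMS files, to a single specialty using the RO > SO > MO hierarchy.
    Returns None if no oncology specialty was reported."""
    reported = [s for s in specialties if s in ONCOLOGY_RANK]
    if not reported:
        return None
    return min(reported, key=ONCOLOGY_RANK.get)

# Example: a physician reported as a surgical oncologist on claims and as a
# radiation oncologist in NPPES is assigned radiation oncology.
assert consolidate_oncology_specialty(["surgical", "radiation"]) == "radiation"
```

The same ranking idea extends to the nononcology hierarchy (radiology, surgery, other medical specialty, hospital-based, primary care) by adding those categories at lower priority.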
Analysis
The AMA data were considered the gold standard for identifying oncologists because of the extent to which they have been used in studies of cancer treatment and outcomes (19–25). We calculated the sensitivity of each of the CMS data resources to capture any oncologist, ROs, SOs, and MOs, respectively. In addition to assessing the sensitivity of the individual CMS data resources, we also assessed the sensitivity when these files were combined (ie, the physician claims and the NPPES Registry, the physician claims and the MD-PPAS file, the NPPES Registry and the MD-PPAS file, or all three CMS files).
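The sensitivity calculation reduces to a comparison of sets of physician identifiers. The brief sketch below shows the computation under the assumption that each data resource has been reduced to a set of NPIs classified as oncologists; the variable and function names are illustrative.

```python
# Minimal sketch of the sensitivity calculation, assuming each data resource
# has been reduced to a set of NPIs classified as oncologists.

def sensitivity(ama_npis: set, cms_npis: set) -> float:
    """Proportion of AMA-identified oncologists also identified in the CMS source."""
    return len(ama_npis & cms_npis) / len(ama_npis)

def combined_sensitivity(ama_npis: set, *cms_sources: set) -> float:
    """Combining CMS files is a union: a physician counts as captured if any
    of the selected resources classifies him or her as an oncologist."""
    return sensitivity(ama_npis, set().union(*cms_sources))

# Hypothetical usage, with NPI sets built beforehand:
# combined_sensitivity(ama_ro_npis, claims_onc_npis, nppes_onc_npis, mdppas_onc_npis)
```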
Results
There were 17 934 oncologists identified from the AMA data (Table 2). Of these, 23.0% were ROs, 8.8% were SOs, and 68.2% were MOs per the AMA data. Using only CMS physician claims, 84.0% of all AMA oncologists were identified, although the sensitivity of the claims to capture oncologists varied by specialty. Physician claims identified 95.4% of ROs, 53.0% of SOs, and 83.3% of MOs. Similar results were observed for the NPPES Registry and the MD-PPAS file, which captured 82.8% and 82.5% of all oncologists, respectively. Both the NPPES Registry and the MD-PPAS file captured more than 90% of ROs and 82% of MOs. Ascertainment of SOs varied between the NPPES Registry (59.8%) and the MD-PPAS file (46.0%). The utility of the CMS data resources to identify type of oncologist improved when files were combined. Using a combination of the three CMS data resources to identify any oncologists improved sensitivity to more than 90% and resulted in identification of 98.2% of ROs, 70.1% of SOs, and 89.3% of MOs. The ability to identify oncologists using the physician claims combined with the NPPES file was similar to the combination of all three CMS data resources.
Table 2. Oncologists in the AMA Physician Masterfile* identified in individual and combined CMS data resources, by type of oncology specialty†

CMS data resource(s) | Any oncologist (n = 17 934), n (%)‡ | Radiation oncologists (n = 4123), n (%)‡ | Surgical oncologists (n = 1575), n (%)‡ | Medical oncologists (n = 12 236), n (%)‡
---|---|---|---|---
Single CMS file | | | |
Physician claims only | 15 073 (84.0) | 3935 (95.4) | 834 (53.0) | 10 188 (83.3)
NPPES Registry only | 14 844 (82.8) | 3730 (90.5) | 942 (59.8) | 10 098 (82.5)
MD-PPAS file only | 14 802 (82.5) | 3958 (96.0) | 724 (46.0) | 10 041 (82.1)
Combined CMS files | | | |
Physician claims and NPPES Registry | 16 136 (90.0) | 4043 (98.1) | 1095 (69.5) | 10 872 (88.9)
Physician claims and MD-PPAS file | 15 470 (86.3) | 4017 (97.4) | 873 (55.4) | 10 458 (85.5)
NPPES Registry and MD-PPAS file | 16 061 (89.6) | 4041 (98.0) | 1059 (67.2) | 10 879 (88.9)
All CMS files | 16 204 (90.4) | 4048 (98.2) | 1104 (70.1) | 10 924 (89.3)

*The Physician Masterfile served as the gold standard for identifying oncologists. AMA = American Medical Association; CMS = Center for Medicare and Medicaid Services; MD-PPAS = Medicare Data on Provider Practice and Specialty; NPPES = National Plan and Provider Enumeration System.
†Oncologic specialties classified in hierarchical order (radiation, surgical, medical).
‡Number and percent of oncologists identified in the AMA Masterfile who were found to have concordant specialty in the CMS data.
We compared the AMA and CMS files for agreement about oncology specialty. There were 128 physicians who were classified as oncologists in both the AMA data and at least one CMS data resource but for whom the two data sources disagreed about the specific oncology specialty; this accounted for less than 1% of all physicians (data not shown). In addition, 84 physicians (<0.5%) had conflicting oncology specialties between the CMS files, independent of their AMA specialty. For the 1730 physicians (RO: 70; SO: 461; MO: 1199) who were identified as oncologists in the AMA data but not determined to be oncologists in any CMS data resource, we identified their specialties as reported in the CMS files (Table 3). For these 1730 AMA-classified oncologists, the most frequent specialties listed in the CMS data were hospital-based specialty, surgery, and primary care. In addition, 3.6% of these AMA oncologists had an NPPES taxonomy code of “specialist” (174400000X), which is outside the typical range of taxonomy codes for physicians (eg, codes that begin with 20).
Table 3. Specialty reported in the CMS data for the 1730 AMA oncologists* not identified as oncologists in any CMS data resource, by AMA oncology specialty: all oncologists (n = 1730), radiation oncologists (RO; n = 70), surgical oncologists (SO; n = 461), and medical oncologists (MO; n = 1199)

Specialty per the CMS data† | All: Physician claims, % | All: NPPES Registry, % | All: MD-PPAS file, % | RO: Physician claims, % | RO: NPPES Registry, % | RO: MD-PPAS file, % | SO: Physician claims, % | SO: NPPES Registry, % | SO: MD-PPAS file, % | MO: Physician claims, % | MO: NPPES Registry, % | MO: MD-PPAS file, %
---|---|---|---|---|---|---|---|---|---|---|---|---
Radiology | 2.6 | 2.3 | 2.5 | 52.9 | 44.3 | 52.9 | 0.2 | 0.2 | 0.0 | 0.6 | 0.6 | 0.6 |
Surgery | 29.5 | 28.3 | 29.8 | 10.0 | 10.0 | 10.0 | 89.6 | 85.2 | 89.8 | 7.5 | 7.5 | 7.8 |
Medical specialty | 4.2 | 4.6 | 4.0 | 8.6 | 10.0 | 8.6 | 0.9 | 1.1 | 0.9 | 5.2 | 5.6 | 4.9 |
Hospital-based | 31.1 | 31.7 | 30.7 | 7.1 | 10.0 | 7.1 | 0.9 | 0.9 | 0.9 | 44.1 | 44.8 | 43.5 |
Primary care | 28.5 | 27.7 | 33.0 | 20.0 | 17.1 | 21.4 | 6.5 | 6.3 | 8.5 | 37.4 | 36.5 | 43.1 |
Specialist‡ | — | 3.6 | — | — | 5.7 | — | — | 6.1 | — | — | 2.5 | — |
Missing/Other | 4.2 | 1.9 | 0.0 | 1.4 | 2.9 | 0.0 | 2.0 | 0.2 | 0.0 | 5.2 | 2.5 | 0.0 |
*Only AMA direct patient care physicians were included. AMA = American Medical Association; CMS = Center for Medicare and Medicaid Services; MD-PPAS = Medicare Data on Provider Practice and Specialty; NPPES = National Plan and Provider Enumeration System.
†Specialties reported in hierarchical order (radiation, surgical, medical).
‡Specialist category only available in the NPPES file, based on taxonomy code 174400000X.
The CMS-reported specialty varied when stratified by the AMA-defined oncology specialty. For physicians who were ROs in the AMA data, radiology was the most frequently reported specialty in the CMS data, although the frequency varied by CMS data resource. A similar pattern was observed for AMA-defined SOs, with more than 85% having a surgery specialty in each of the CMS files. For medical oncologists in the AMA data, the most common specialties in the CMS data were hospital-based and primary care. Of the doctors identified as providing primary care, 90% or more reported an internal medicine specialty in all CMS files (data not shown).
Discussion
Our analysis demonstrated that individual CMS data resources have more than 80% sensitivity to capture any oncologists as reported in the AMA data. The completeness of the CMS data to identify any oncologist is improved by combining files. Although using combined files from CMS can identify 90.4% of all oncologists, the files varied in sensitivity to identify specific oncology specialties. The CMS data, especially combined files, captured 98% of ROs and 89% of MOs. However, CMS data resources, individually or combined, did not identify a sizeable portion of the SOs. Therefore, researchers who want to use CMS data to study treatment and outcomes for patients receiving care from SOs should consider linkage to the AMA data. These findings are similar to a prior study that found that using Medicare claims with the CMS NPI Directory (precursor to the NPPES Registry) increased the percent of all oncologists identified in Medicare claims by 36.1% (16). The same study also found that the addition of AMA data markedly increased the identification of SOs.
Many physicians who were identified as oncologists in the AMA data but were not found to be oncologists in the CMS data had a CMS specialty related to their AMA oncology specialty (eg, ROs reported as radiologists, SOs reported as surgeons). A small percentage of physicians in the NPPES file had the taxonomy code for “specialist.” We manually reviewed a subset of 25 providers with this taxonomy code and searched for their type of practice on the internet. These providers were involved in a range of services and included oncologists, prosthetic designers, and counselors. The wide range of services captured under this taxonomy code means that it is not useful for imputing that a physician is an oncologist.
Some investigators using Medicare claims alone to study cancer care have assigned physician specialty through the CMS specialty codes or by looking at services that are specific to cancer (3,5). Using this approach, if there is a claim for chemotherapy or radiation therapy, the physician is considered a medical oncologist or radiation oncologist, respectively, regardless of the CMS specialty code. Such an approach may be an option for chemotherapy and radiation therapy; however, it is more complex for surgical treatment. Many surgeries are not specific to cancer (eg, hemicolectomy) and are commonly performed by both surgeons and surgical oncologists. For surgeries that are specific to cancer, it may be possible to identify from the claims how frequently a specific physician performs these procedures and to set a minimal threshold above which that physician is imputed to be a surgical oncologist. More evaluation is needed to determine whether this approach would improve the ability of Medicare claims to identify surgical oncologists.
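To make the threshold idea concrete, a sketch is shown below. The procedure-code set and the minimum annual volume are placeholders that an analyst would need to define and validate; as noted above, the approach itself has not been evaluated.

```python
# Illustrative sketch only: impute surgical oncologists from claims by counting
# cancer-specific surgical procedures per performing physician. The code set
# and threshold are placeholders, not validated values.
from collections import Counter

CANCER_SPECIFIC_SURGERY_CODES: set = set()  # to be filled with analyst-selected procedure codes
MIN_ANNUAL_VOLUME = 10                      # placeholder threshold

def impute_surgical_oncologists(claims):
    """claims: iterable of (performing_npi, procedure_code) pairs for one year."""
    volume = Counter(
        npi for npi, code in claims if code in CANCER_SPECIFIC_SURGERY_CODES
    )
    return {npi for npi, n in volume.items() if n >= MIN_ANNUAL_VOLUME}
```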
We used the AMA data as the standard against which to compare the CMS data. The AMA data appear to include the vast majority of oncologists. A prior study that used Medicare claims, the CMS NPI Directory, and the AMA data to identify oncologists found that only 6% of oncologists were in the Medicare claims and the CMS NPI Directory but not in the AMA data (16). The number of medical and radiation oncologists that we reported is consistent with prior studies that have used the AMA data to report on the oncology workforce (26,27). However, the number of SOs that we identified is nearly double that reported in these earlier studies. We attribute our higher number to the inclusion of both primary and secondary specialties reported on the AMA data; if we had limited the identification of SOs to those with a primary specialty code, our numbers would be similar to those in the prior reports. Because we used both primary and secondary specialty in our analysis, conflicting oncology specialties could be reported in the AMA data. This occurred for 101 oncologists in the AMA data (0.56% of physicians), 90 of whom were radiation oncologists who also listed medical oncology. ROs are required to complete an initial year of residency in a specialty other than RO before beginning their RO training.
In determining which data resource is most accurate and complete for identifying oncologists, it is important to consider how physician specialties are initially assigned. The specialty information for each physician in the AMA data is obtained from the Association of American Medical Colleges, the American Board of Medical Specialties, and self-report from a survey of individual physicians. Data on specialty from CMS are self-reported at the time a physician applies for an NPI or enrolls in the PECOS system, with regular updates to the PECOS system (14). Our analysis assessed only those physicians who were oncologists. We did not have access to the entirety of the AMA or CMS data and therefore were not able to assess the specificity of the CMS data to identify physicians who were not oncologists.
Our findings support several recommendations regarding which data resources are best suited for researchers who are using Medicare data to study the role of physician specialty in oncology care. We conclude that researchers who want to identify any oncologist can use a combination of the assessed CMS data resources to capture 90% of oncologists. If a researcher opts to use only CMS data resources to identify oncologists, it appears that using Medicare claims and the NPPES Registry will capture almost the same number of oncologists as the combination of the Medicare claims, the NPPES Registry, and the MD-PPAS data. The NPPES data are public use and can be downloaded without charge from the CMS website (13). The MD-PPAS data are classified by CMS as identifiable data and are released to researchers after a review process and payment for the file. In addition, our findings support the use of combined CMS data resources or even Medicare claims alone to identify ROs. If the focus of the analysis is on MOs, a researcher would need to consider whether the 89% sensitivity of the combined CMS data resources for identifying MOs was sufficient and whether the additional cost of obtaining the AMA data was worthwhile. Researchers involved in studies that require identification of SOs should use the AMA data to have more complete ascertainment of this specialty.
In conclusion, this study has shown that in some situations, researchers can use combined CMS data resources to identify the vast majority of oncologists included in the AMA data. In these situations, researchers can forego the expense and the administrative process of acquiring the AMA data. However, studies focusing on SOs, and perhaps MOs, should incorporate data from the AMA. The determination of which data are needed to identify oncologists will depend on the specific research question and the type of oncologist included in the study.
Notes
Affiliations of authors: National Cancer Institute, Division of Cancer Control and Population Sciences, Bethesda, MD (JLW, DPW, LE); Information Management Services, Calverton, MD (MJB, RB); Center for Medicare and Medicaid Services, Baltimore, MD (SC).
The authors have no conflicts of interest to declare.
The contents of this article are solely the responsibility of the authors and do not necessarily represent the official views of the US Department of Health and Human Services or any of its agencies.
References
- 1. Boero IJ, Gillespie EF, Hou J, et al. The impact of radiation oncologists on the early adoption of hypofractionated radiation therapy for early-stage breast cancer. Int J Radiat Oncol Biol Phys. 2017;97(3):571–580.
- 2. Boero IJ, Paravati AJ, Hou J, et al. The impact of surgeons on the likelihood of mastectomy in breast cancer. Ann Surg. 2019;269(5):951–958.
- 3. Davidoff AJ, Rapp T, Onukwugha E, et al. Trends in disparities in receipt of adjuvant therapy for elderly stage III colon cancer patients: the role of the medical oncologist evaluation. Med Care. 2009;47(12):1229–1236.
- 4. Gilligan MA, Neuner J, Sparapani R, Laud PW, Nattinger AB. Surgeon characteristics and variations in treatment for early-stage breast cancer. Arch Surg. 2007;142(1):17–22.
- 5. Murphy MM, Simons JP, Ng SC, et al. Racial differences in cancer specialist consultation, treatment, and outcomes for locoregional pancreatic adenocarcinoma. Ann Surg Oncol. 2009;16(11):2968–2977.
- 6. Neuman HB, Schumacher JR, Schneider DF, et al. Variation in the types of providers participating in breast cancer follow-up care: a SEER-Medicare analysis. Ann Surg Oncol. 2017;24(3):683–691.
- 7. Onukwugha E, Mullins CD, Hsu VD, Seal B, Hussain A. Effect of urologists and medical oncologists on treatment of elderly men with stage IV prostate cancer. Urology. 2011;77(5):1088–1095.
- 8. Quyyumi FF, Wright JD, Accordino MK, et al. Factors associated with follow-up care among women with early-stage breast cancer. J Oncol Pract. 2019;15(1):e1–e9.
- 9. Rim SH, Hirsch S, Thomas CC, et al. Gynecologic oncologists involvement on ovarian cancer standard of care receipt and survival. World J Obstet Gynecol. 2016;5(2):187–196.
- 10. Sheffield KM, Crowell KT, Lin YL, Djukom C, Goodwin JS, Riall TS. Surveillance of pancreatic cancer patients after surgical resection. Ann Surg Oncol. 2012;19(5):1670–1677.
- 11. Silber JH, Rosenbaum PR, Polsky D, et al. Does ovarian cancer treatment and survival differ by the specialty providing chemotherapy? J Clin Oncol. 2007;25(10):1169–1175.
- 12. American Medical Association. AMA Physician Masterfile. https://www.ama-assn.org/practice-management/masterfile/ama-physician-masterfile. Accessed January 18, 2019.
- 13. Center for Medicare & Medicaid Services. National Plan and Provider Enumeration System. https://npiregistry.cms.hhs.gov/. Accessed January 18, 2019.
- 14. Center for Medicare & Medicaid Services. Research Data Assistance Center (ResDAC). Medicare Data on Provider Practice and Specialty (MD-PPAS). Version 2.3, 2018. https://www.resdac.org/cms-data/files/md-ppas/data-documentation. Accessed January 18, 2019.
- 15. Baldwin LM, Adamache W, Klabunde CN, Kenward K, Dahlman C, Warren JL. Linking physician characteristics and Medicare claims data: issues in data availability, quality, and measurement. Med Care. 2002;40(suppl 8):IV-82–IV-95.
- 16. Pollack LA, Adamache W, Eheman CR, Ryerson AB, Richardson LC. Enhancement of identifying cancer specialists through the linkage of Medicare claims to additional sources of physician specialty. Health Serv Res. 2009;44(2 pt 1):562–576.
- 17. National Cancer Institute. SEER-Medicare Linked Database. https://healthcaredelivery.cancer.gov/seermedicare/. Accessed January 18, 2019.
- 18. National Uniform Claim Committee. Health care provider taxonomy. http://www.nucc.org/index.php/code-sets-mainmenu-41/provider-taxonomy-mainmenu-40. Accessed January 18, 2019.
- 19. Wilson LE, Pollack CE, Greiner MA, Dinan MA. Association between physician characteristics and the use of 21-gene recurrence score genomic testing among Medicare beneficiaries with early-stage breast cancer, 2008-2011. Breast Cancer Res Treat. 2018;170(2):361–371.
- 20. Ellis SD, Nielsen ME, Carpenter WR, et al. Gonadotropin-releasing hormone agonist overuse: urologists’ response to reimbursement and characteristics associated with persistent overuse. Prostate Cancer Prostatic Dis. 2015;18(2):173–181.
- 21. Quek RG, Master VA, Portier KM, et al. Association of reimbursement policy and urologists’ characteristics with the use of medical androgen deprivation therapy for clinically localized prostate cancer. Urol Oncol. 2014;32(6):748–760.
- 22. Hoffman KE, Niu J, Shen Y, et al. Physician variation in management of low-risk prostate cancer: a population-based cohort study. JAMA Intern Med. 2014;174(9):1450–1459.
- 23. Goulart BH, Reyes CM, Fedorenko CR, et al. Referral and treatment patterns among patients with stages III and IV non-small-cell lung cancer. J Oncol Pract. 2013;9(1):42–50.
- 24. Baldwin LM, Dobie SA, Billingsley K, et al. Explaining black-white differences in receipt of recommended colon cancer treatment. J Natl Cancer Inst. 2005;97(16):1211–1220.
- 25. Jang TL, Bekelman JE, Liu Y, et al. Physician visits prior to treatment for clinically localized prostate cancer. Arch Intern Med. 2010;170(5):440–450.
- 26. Kirkwood MK, Kosty MP, Bajorin DF, Bruinooge SS, Goldstein MA. Tracking the workforce: the American Society of Clinical Oncology workforce information system. J Oncol Pract. 2013;9(1):3–8.
- 27. American Society of Clinical Oncology. The State of Cancer Care in America, 2017: a report by the American Society of Clinical Oncology. J Oncol Pract. 2017;13(4):e353–e394.