Journal of the National Cancer Institute. Monographs
. 2020 May 15;2020(55):66–71. doi: 10.1093/jncimonographs/lgz031

Comparison of Physician Data in Two Data Files Available for Cancer Health Services Research

Dolly P White 1, Lindsey Enewold 1, Ann M Geiger 1, Robert Banks 2, Joan L Warren 1
PMCID: PMC7868034  PMID: 32412069

Abstract

Introduction

Physicians are vital to health-care delivery, but assessing their impact on care can be challenging given limited data. Historically, health services researchers have obtained physician characteristics data from the American Medical Association (AMA) Physician Masterfile. The Centers for Medicare & Medicaid Services' Medicare Data on Provider Practice and Specialty (MD-PPAS) file was assessed as an alternative source of physician data, particularly in the context of cancer health services research.

Methods

We used physician National Provider Identifiers in the MD-PPAS data (2008–2014) to identify physicians in the AMA data current as of July 18, 2016. Within each source, we grouped physicians into six broad specialty groups. Percent agreement and Cohen’s kappa coefficient (k) were calculated for age, sex, specialty, and practice state.

Results

Among the 698 202 included physicians, there was excellent agreement for age (percent agreement = 97.7%, k = 0.97) and sex (99.4%, k = 0.99) and good agreement for specialty (86.1%, k = 0.80). Within specialty, using AMA as the reference, agreement was lowest for oncologists (77%). Approximately 85.9% of physicians reported the same practice state in both data sets.

Conclusion

Although AMA data have been commonly used to account for physician-level factors in health services research, MD-PPAS data provide researchers with an alternative option depending on study needs. MD-PPAS data may be optimal if nonphysicians, provider utilization, practice characteristics, and/or temporal changes are of interest. In contrast, the AMA data may be optimal if more granular specialty, physician training, and/or a broader inclusion of physicians is of interest.


Physicians are vital members of health-care delivery teams and have a sizeable influence on patient treatment decisions and, by extension, outcomes (1). Using their knowledge of available local resources, such as cancer specialists and high-volume cancer treatment centers offering clinical trials, physicians can make recommendations that affect cancer outcomes. Patients with access to specialty care have better outcomes than patients lacking access (2,3). Therefore, to more completely understand care delivery, health services researchers often want to study characteristics of the treating physician(s), especially physician specialty, in addition to patient and organizational factors. However, information related to physicians and their treatment patterns is limited.

Analyses of the National Cancer Institute’s Surveillance, Epidemiology, and End Results (SEER)-Medicare data, which include detailed information about tumor characteristics, cancer treatments, patient specific factors, aggregate socioeconomic data, and descriptors of the care setting, have greatly contributed to the understanding of cancer health-care delivery in the United States (4). For cancer health services researchers interested in including physician specialty in their analyses, specialty codes are included on Medicare claims, but previous studies have indicated that this source of specialty information is suboptimal (5,6). As a result, researchers using SEER-Medicare typically have linked to the American Medical Association’s (AMA) Physician Masterfile (herein referred to as “AMA” data) when physician characteristics were of interest (7). Another resource for physician data, the Medicare Data on Provider Practice and Specialty (herein referred to as “MD-PPAS” data), is now available (8). The MD-PPAS data contain physician information collected by the Centers for Medicare & Medicaid Services (CMS). To date, there has been limited assessment of the utility of the MD-PPAS data for health services research and, to our knowledge, no assessment involving SEER-Medicare data (9).

The primary objective of this study was to provide a comparison of the MD-PPAS and AMA data sources, with a focus on the utility of these data sources for cancer health services research. The study aims included 1) identification of common and unique variables in each data source; 2) the quantification of agreement for the common variables, particularly physician specialty; and 3) an exploration of when each data source may be the optimal choice for an analysis. We used the results of the primary analysis to determine the merits of including the MD-PPAS data in the SEER-Medicare database.

Methods

Data Sources

MD-PPAS.

The MD-PPAS file was initially created in 2008 through a collaboration between CMS and the Office of the Assistant Secretary for Planning and Evaluation to assist researchers with assigning individual providers to tax identification number (TIN)-based group practices and determining physician specialty (8). Subsequently, annual files have been created. Three data sources are used to create the MD-PPAS files: the National Plan and Provider Enumeration System, which is the national directory through which CMS assigns each health-care provider a unique National Provider Identifier (NPI); the Provider Enrollment, Chain, and Ownership System (PECOS), which health-care providers use to enroll as a Medicare provider; and claims submitted to Medicare by providers. Physician and nonphysician providers are eligible for inclusion in the MD-PPAS file if they have a valid NPI and have submitted at least one Medicare fee-for-service (FFS) physician claim during a given calendar year. Specifically, only providers who submitted Medicare claims for evaluation and management visits, procedures, imaging, or nonlaboratory tests are included in the MD-PPAS data (8).

The MD-PPAS files include information pertaining to each provider’s birth date, sex, specialty, practice TIN, geographic location, and utilization summary measures. Specialty is determined from two sources: the self-validated PECOS data and, if specialty is missing in PECOS data, the most frequent specialty reported on the Medicare claims. The utilization measures are aggregate measures of the amount of Medicare services billed during the previous 2 years, including the number of line items, total allowed charges, and number of unique beneficiaries cared for (Table 1). For this analysis, we included MD-PPAS data (version 2.0) for the years 2008 through 2014; this was the latest version available at study commencement. Compared with the earlier version, MD-PPAS version 2.0 included additional individual providers (all nonphysician providers and all providers residing in Puerto Rico), annual provider-level measures (number of Part B line items, total Medicare allowed charges, and the number of unique patients overall and by TIN), and monthly TIN indicators to help distinguish between a change in provider practice location vs maintenance of concurrent positions in multiple practice locations (8). There was also a change in methodology used to assign providers to TINs, their primary geographic location, and to the hospitalist specialty group.

Table 1.

Comparison of variables included in the MD-PPAS file and AMA Physician Masterfile

| Variable | MD-PPAS: Mechanism | MD-PPAS: Update frequency | AMA: Mechanism | AMA: Update frequency |
|---|---|---|---|---|
| *Common variables* | | | | |
| Birth date | Physician report* | At least every 5 y | Physician survey§; medical and training institutions; state and federal agencies; ECFMG | Continuously |
| Sex | Physician report | Continuously | Medical school: AAMC | Continuously |
| Primary and secondary specialties | Physician report* | At least every 5 y | Physician survey§ | Continuously |
| Practice state | Derived from ZIP code reported by physician on Medicare claims | Continuously | Physician survey§; correspondence from hospitals, government agencies, medical societies, specialty boards, licensing agencies | Continuously |
| *Unique variables* | | | | |
| Utilization measures | Medicare claims | Annually, based on prior 2 y | N/A | N/A |
| Primary type of practice | N/A | N/A | Physician survey§ | Continuously or daily |
| Medical school ID** | N/A | N/A | Medical school: AAMC | As needed to correct mistakes |
| Medical school year of graduation | N/A | N/A | Medical school: AAMC | As needed to correct mistakes or if a physician has a delay in graduation |
| Medical training institution code†† | N/A | N/A | Training institutions: ACGME, National GME Census | As needed to correct mistakes or if a physician begins another residency program |
| US trained | N/A | N/A | Training institutions: ACGME, National GME Census, ECFMG | As needed to correct mistakes |
* PECOS.

NPPES.

If not enrolled in PECOS, based on the specialty most frequently (primary) or the second most frequently (secondary) reported in Medicare outpatient (Part B) claims.

§ Annual census of physicians.

Number of line items billed by an NPI, total allowed charges billed by NPI, and number of unique beneficiaries for whom the NPI billed.

Indicating whether the physician’s primary activity is direct patient care, teaching, administration, etc.

** The code for the medical school where the physician graduated.

†† The institution where the physician is or was in graduate training.

AAMC = American Association of Medical Colleges; ACGME = Accreditation Council for Graduate Medical Education; AMA = American Medical Association; ECFMG = Educational Commission for Foreign Medical Graduates; GME = Graduate Medical Education; MD-PPAS = Medicare Data on Provider Practice and Specialty; N/A = not applicable; NPI = National Provider Identifier; NPPES = National Plan and Provider Enumeration System; PECOS = Provider Enrollment, Chain, and Ownership System.

AMA Physician Masterfile.

The AMA Physician Masterfile was established in 1906 for AMA membership record-keeping (7). Over time it has become a database including information on all living and deceased medical doctors and doctors of osteopathy who are AMA members, are currently in or have completed an accredited residency training program, or have a valid US state license. The AMA also collects information for foreign and international medical graduates who live in the United States and have a valid US state license.

The AMA file includes information pertaining to each resident and attending physician’s birth date, sex, specialty, primary type of practice, geographic location, medical education, postgraduate training, and professional certification (Table 1). Physician data are obtained or verified through credentialing institutions and organizations. Specialty information is obtained from self-report on physician surveys and information provided to the AMA from Graduate Medical Education programs. Geographic information is provided at the time the physician is included in the AMA data, although individual physicians can update practice data including practice state and primary type of practice at any time through the AMA website. For this analysis, we used AMA data current as of July 18, 2016.

Study Population

We identified over 1.2 million unique providers, by their NPIs, in the MD-PPAS data (Figure 1). Given our interest in assessing physician characteristics, we excluded nonphysician providers (MD-PPAS broad specialty code “7”) and retained only physicians (n = 734 318) (8). We then requested AMA data for these physicians, of whom 36 043 (5%) were not found in the AMA data. Next, we excluded from the analytic data set physicians who had inconsistent specialty information across the annual MD-PPAS files (n = 73) because this would have precluded accurately assigning their specialty. The final sample included 698 202 physicians.

Figure 1.

Figure 1.

Process of linking the physician National Provider Identifiers (NPIs) included in the Medicare Data on Provider Practice and Specialty (MD-PPAS) files and the American Medical Association (AMA) Physician Masterfile. MD-PPAS files were from 2008–2014, and AMA data were current as of July 18, 2016. N = all observations available; n = subset of observations included/excluded.

Variables of Interest

The analysis focused on physician demographics, state where the physician practiced, and specialty. These variables were assessed because they were found in both data sources (“common” variables). Table 1 lists the common variables as well as other variables available in each data source that may be of interest to researchers.

Demographics.

The MD-PPAS and AMA data include two demographic variables that are frequently used in health services research: birth date, which is used to determine physician age, and sex. We calculated age as 2014 minus the year of birth in each data set because 2014 was the last year of data available in the version 2.0 MD-PPAS data. Then we created the following age groups: younger than 40 years, 40–49 years, 50–59 years, 60–69 years, age 70 years and older, and missing. Sex was categorized as male, female, or missing.
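The age derivation above (2014 minus year of birth, then grouping) can be sketched as follows. This is an illustrative Python sketch for exposition only; the study's actual calculations were done in SAS:

```python
# Illustrative sketch (not the authors' SAS code): derive the broad age
# group from birth year, computed as 2014 minus year of birth.
def age_group(birth_year, ref_year=2014):
    """Return the age-group label used in the comparison."""
    if birth_year is None:
        return "missing"
    age = ref_year - birth_year
    if age < 40:
        return "<40"
    elif age < 50:
        return "40-49"
    elif age < 60:
        return "50-59"
    elif age < 70:
        return "60-69"
    else:
        return "70+"
```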

Practice State.

Both data sources collect information on practice state. For the MD-PPAS data, state information may change across the annual data files. Therefore, we assigned state according to the most recent MD-PPAS data year available for each physician.

Physician Specialty.

Both the MD-PPAS and AMA data have primary and secondary specialty information. Only one specialty is reported per specialty field in the MD-PPAS data (eg, specialty 1: oncology; specialty 2: internal medicine). However, physicians can report one or more specialties per specialty field in the AMA data (eg, specialty 1: infectious diseases; specialty 2: internal medicine or emergency medicine or critical care medicine). Given the different specialty classification systems, for comparison, we created a broad specialty classification scheme that included oncology, surgery (not oncology), radiology (not oncology), primary care, other, and missing (Supplemental Table 1). Oncologists were singled out because they are a frequent focus in cancer health services research.

Physicians were assigned to a broad specialty within each data source based on the following hierarchical scheme that considered all primary and secondary specialty codes equally. Any physician reporting any type of oncology specialty was assigned to the oncology category. For example, a physician who reported the specialties medical oncology and internal medicine would be classified as an oncologist. For the remaining physicians, any physician with a surgery specialty was assigned to the surgery category. Next, of those who remained, any physician with a radiology specialty was assigned to the radiology category. Any remaining physicians who reported a primary care specialty were assigned to the primary care category. Finally, all remaining physicians with a specified specialty were assigned to the other category. Physicians who had an unspecified or undefined specialty were categorized as missing.
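The hierarchical assignment above can be sketched in Python; the specialty-name sets below are hypothetical placeholders, not the actual code lists used in the study (those appear in Supplemental Table 1):

```python
# Illustrative sketch of the hierarchical broad-specialty scheme:
# oncology > surgery > radiology > primary care > other, with all
# primary and secondary specialties considered equally.
# The specialty-name sets are hypothetical placeholders.
ONCOLOGY = {"medical oncology", "radiation oncology", "surgical oncology"}
SURGERY = {"general surgery", "orthopedic surgery"}
RADIOLOGY = {"diagnostic radiology"}
PRIMARY_CARE = {"internal medicine", "family medicine", "pediatrics"}

def broad_specialty(specialties):
    """Assign one broad category from all reported specialties."""
    reported = {s for s in specialties if s}  # drop missing entries
    if not reported:
        return "missing"
    if reported & ONCOLOGY:                   # any oncology specialty wins
        return "oncology"
    if reported & SURGERY:
        return "surgery"
    if reported & RADIOLOGY:
        return "radiology"
    if reported & PRIMARY_CARE:
        return "primary care"
    return "other"
```

For example, a physician reporting both medical oncology and internal medicine is classified as an oncologist, consistent with the hierarchy described above.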

Statistical Analyses

The percent agreement and Cohen’s kappa coefficient (k) were calculated for each of the common variables. We included missing values in the calculations for all variables except sex because there were no missing sex values in the AMA data. We performed a sensitivity analysis restricting the data to only physicians who were classified as providing direct patient care in the AMA data (type of practice code = 020; N = 596 547). All statistics were calculated using SAS version 9.4 (SAS Institute, Cary, NC).
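The two agreement statistics can be computed from paired classifications as sketched below. This is a generic Python illustration of the standard formulas, not the SAS 9.4 code used in the study:

```python
# Illustrative sketch: overall percent agreement and Cohen's kappa
# from two parallel lists of category labels (one entry per physician).
from collections import Counter

def percent_agreement(a, b):
    """Fraction of paired classifications that match exactly."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    p_o = percent_agreement(a, b)                              # observed
    count_a, count_b = Counter(a), Counter(b)
    p_e = sum(count_a[c] * count_b[c] for c in count_a) / n**2  # by chance
    return (p_o - p_e) / (1 - p_e)
```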

Results

Of the 698 202 physicians found in both MD-PPAS and AMA data, more than one-half were between 40 and 59 years old (MD-PPAS: 51.9%; AMA: 52.3%) and more than 70% were male (MD-PPAS: 70.4%; AMA: 70.6%) (Tables 2 and 3). There was excellent agreement between the two data sources for age group (percent agreement: 97.7%, k = 0.97) and sex (percent agreement: 99.4%, k = 0.99). In both data sources, primary care was the most common specialty (MD-PPAS: 39.3%; AMA: 42.6%) and oncology the least common specialty (MD-PPAS: 2.9%; AMA: 3.4%) (Table 4). Overall, specialty agreement between the two data sources was good (percent agreement: 86.1%, k = 0.80). Agreement within specialty category (eg, among physicians identified as having a given specialty in AMA, the percentage of physicians who were similarly classified in MD-PPAS) was lower for oncologists (77%) and primary care physicians (82%) than for surgeons (94%) and radiologists (98%). The same practice state was reported in MD-PPAS and AMA data for 85.9% of physicians (Table 5). Of note, 7.9% of physicians were missing practice state information in AMA data vs only 0.02% in MD-PPAS data. In the sensitivity analysis including only physicians involved in direct patient care, the level of agreement was similar (data not shown).

Table 2.

Agreement of physician age as classified using the Medicare Data on Provider Practice and Specialty (MD-PPAS) file compared to the American Medical Association (AMA) Physician Masterfile (n = 698 202)*

Columns: MD-PPAS age (years)

| AMA age (y) | <40 | 40–49 | 50–59 | 60–69 | 70+ | Missing | Total row n (%) |
|---|---|---|---|---|---|---|---|
| <40 | 135 578 | 106 | 23 | 12 | <11 | >253 | 135 983 (19.5) |
| 40–49 | 133 | 183 449 | 82 | >11 | <11 | 1231 | 184 916 (26.5) |
| 50–59 | 83 | 106 | 178 095 | >121 | <11 | 2127 | 180 543 (25.9) |
| 60–69 | 62 | 35 | 80 | 133 765 | 83 | 3760 | 137 785 (19.7) |
| 70+ | 86 | 101 | 81 | 154 | 51 567 | 5538 | 57 527 (8.2) |
| Missing | 945 | 311 | 113 | 51 | >17 | <11 | 1448 (0.20) |
| Total column n (%) | 136 887 (19.6) | 184 108 (26.4) | 178 474 (25.6) | 134 118 (19.2) | 51 695 (7.4) | 12 920 (1.9) | 698 202 (100) |

*MD-PPAS files from 2008–2014 and AMA data as of July 18, 2016. Cell sizes <11 masked for confidentiality concerns. Percent agreement: 97.7%; Kappa 0.97.

Table 3.

Agreement of physician sex as classified using the Medicare Data on Provider Practice and Specialty (MD-PPAS) file compared to the American Medical Association (AMA) Physician Masterfile (n = 698 202)*

Columns: MD-PPAS sex

| AMA sex | Female | Male | Missing | Total row n (%) |
|---|---|---|---|---|
| Female | 203 000 | 1826 | 111 | 204 937 (29.4) |
| Male | 2094 | 489 934 | 1237 | 493 265 (70.6) |
| Missing | 0 | 0 | 0 | 0 (0) |
| Total column n (%) | 205 094 (29.4) | 491 760 (70.4) | 1348 (0.2) | 698 202 (100) |

*MD-PPAS files from 2008–2014 and AMA data as of July 18, 2016. When physicians with missing sex classification in the MD-PPAS files were excluded, percent agreement: 99.4% and Kappa: 0.99.

Table 4.

Agreement of physician specialty as classified using the Medicare Data on Provider Practice and Specialty (MD-PPAS) file compared to the American Medical Association (AMA) Physician Masterfile (n = 698 202)*

Columns: MD-PPAS specialty

| AMA specialty | Oncology | Surgery | Radiology | Primary care | Other | Missing | Total row n (%) |
|---|---|---|---|---|---|---|---|
| Oncology | 18 394 | 1074 | 160 | 3297 | >938 | <11 | 23 874 (3.4) |
| Surgery | 542 | 106 706 | 236 | 2908 | 3 422 | 59 | 113 873 (16.3) |
| Radiology | 98 | 60 | 33 902 | 264 | <424 | >19 | 34 767 (5.0) |
| Primary care | 1003 | 1013 | 301 | 243 908 | 50 897 | 148 | 297 270 (42.6) |
| Other | 116 | 1086 | 442 | 18 746 | 197 921 | 127 | 218 438 (31.3) |
| Missing | 117 | 1597 | 431 | 5189 | >2 635 | <11 | 9980 (1.4) |
| Total column n (%) | 20 270 (2.9) | 111 536 (16.0) | 35 472 (5.1) | 274 312 (39.3) | 256 248 (36.7) | 364 (0.1) | 698 202 (100) |

*MD-PPAS files from 2008–2014 and AMA data as of July 18, 2016. Cell sizes <11 masked for confidentiality concerns. Percent agreement: 86.1%; Kappa 0.80.

Table 5.

Agreement of physician practice location as classified using the Medicare Data on Provider Practice and Specialty (MD-PPAS) file compared to the American Medical Association (AMA) Physician Masterfile (n = 698 202)*

| Same state | n (%) |
|---|---|
| Yes | 600 022 (85.9) |
| No | 42 848 (6.1) |
| Missing (MD-PPAS) | 133 (0.0) |
| Missing (AMA) | 55 163 (7.9) |
| Missing (both) | 36 (0.0) |

*MD-PPAS files from 2008–2014, location determined based on most recent information available for each physician; AMA classification as of July 18, 2016.

Discussion

Physician information from the two data sources was found to have excellent agreement with respect to physician age and sex. Despite the MD-PPAS data being limited to a select group of physicians (eg, those who submitted a Medicare FFS claim for direct patient care), the distribution of physician demographics did not appear to be exceptionally biased. A 2014 census of active physicians by the Federation of State Medical Boards showed 48% of responding physicians were 40–59 years old and 66% were male (10). There was also 85.9% agreement between the two data sources for state of practice. This was somewhat surprising given that updates to practice geographic information are obtained by voluntary physician report to the AMA (11). In a 2015 physician workforce study, less than one-half of physicians were reported to be practicing in the state where they completed their residency (12).

There was good agreement for overall specialty and within specialty category for surgery and radiology. However, the results indicate underascertainment of oncologists and primary care physicians in the MD-PPAS data compared with AMA data. Although the level of ascertainment of oncologists was lower in the MD-PPAS data, the inclusion of specialty data from PECOS appears to capture more oncologists than if ascertainment relied solely on physician Medicare claims data. A previous study indicated that only 60% of physicians classified as oncologists in the AMA data would be similarly classified based on Medicare claims data alone (6).

When deciding which data source would be best for an analysis, researchers should consider factors such as the completeness of common variables and data acquisition costs. Given the good agreement between the two sources on physician age and sex, if these are the only variables of interest, the MD-PPAS may be a more economical option, especially if investigators are requesting only a few years of data. The price structure for obtaining AMA data is based on the number of NPIs compared with the number of years for MD-PPAS data. If practice state is germane to the research question, MD-PPAS may also be the better choice because practice state is likely more current and missing data were less common than in the AMA data. However, AMA data may be a better option if the focus of the study is to ascertain a complete cohort of oncologists or primary care physicians given the lower potential to identify oncologists and primary care physicians in the MD-PPAS data.

In addition to common variables and data acquisition costs, unique provider information in the AMA and MD-PPAS data may influence a researcher’s choice of which data source to use. The MD-PPAS data capture both physician and nonphysician providers who submit claims to Medicare. Although we focused on physicians in this study, being able to assess characteristics of nonphysician providers may be key to a specific research question. This is of increasing importance as nonphysician providers are more intricately involved in patient care, especially primary care (13). Furthermore, the MD-PPAS data include provider utilization summary measures (eg, number of unique Medicare beneficiaries treated and total allowed Medicare charges), which could be used to characterize provider workload and/or treatment patterns (5). MD-PPAS data can also provide insights into practice characteristics. For example, providers can be assigned to group practices (eg, if two providers are affiliated with the same practice, they will have a common TIN). Practice size can then be calculated as the number of providers assigned the same TIN. Additional practice characteristics (eg, academic vs community) can then be determined through linkages to other data sources via the unencrypted TIN (14). Finally, the MD-PPAS files are created annually, thus allowing for assessments of temporal changes (eg, physicians who relocate and fluctuations in Medicare utilization measures). Therefore, if information on nonphysicians, provider utilization, practice characteristics, and/or temporal changes is of interest, then the MD-PPAS would appear to be a better data source.
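The TIN-based derivation of practice size described above can be sketched simply: count the distinct provider NPIs sharing each TIN. This is an illustrative Python sketch; the input field names are hypothetical:

```python
# Illustrative sketch: practice size as the number of distinct provider
# NPIs assigned the same TIN, per the description above.
from collections import defaultdict

def practice_sizes(assignments):
    """assignments: iterable of (npi, tin) pairs -> {tin: provider count}."""
    members = defaultdict(set)
    for npi, tin in assignments:
        members[tin].add(npi)        # set dedupes repeated NPI-TIN pairs
    return {tin: len(npis) for tin, npis in members.items()}
```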

Conversely, AMA data include physician medical education and postgraduate training information, which is not available in the MD-PPAS data (4). Compared with the MD-PPAS data, the AMA data also provide more granular specialty information, which could allow investigators to more extensively categorize and compare types of physicians. Additionally, the AMA data are less restrictive about which physicians are included; physicians are included regardless of insurance affiliation. In contrast, to be included in the MD-PPAS data, physicians must accept Medicare patients and submit select types of Medicare FFS claims (8). Therefore, if information on physician training, more granular specialty, and/or a broader inclusion of physicians is of interest, then the AMA would appear to be a better data source.

All of the above factors were considered when determining the merits of including the MD-PPAS data in the SEER-Medicare database. Physician specialty, particularly the ability to accurately identify oncologists, is of great interest to many SEER-Medicare researchers. Importantly, the observed underascertainment of oncologists in MD-PPAS data made us reconsider the benefit of releasing the data through SEER-Medicare. Additionally, although researchers might be unaware that relevant data exist, the demand for the unique variables (eg, nonphysician provider characteristics and provider utilization measures) in the MD-PPAS data is currently low in the SEER-Medicare research community. Furthermore, some of the possible insights gained from the MD-PPAS data can already be gleaned from the claims data. For example, TINs are available on the claims; therefore, if desired, SEER-Medicare researchers can already determine if providers included in their subset of the data belong to the same group practices. It should be noted that the TINs included in the SEER-Medicare data are encrypted, like all individual provider identifiers; therefore, it is not feasible to link data obtained through SEER-Medicare to an external data source via TINs. For these reasons, it was decided that there was not enough justification to release the MD-PPAS data with the SEER-Medicare database at this time. We are aware that the MD-PPAS file is evolving and physician specialty ascertainment, especially for oncologists, may improve in the future (eg, through the development of novel algorithms that account for diagnosis and/or procedure codes in claims data or utilization of additional data sources). We are also mindful that additional unique variables may be added to the MD-PPAS and/or that demand for the current unique variables may increase. As a result, we plan to stay abreast of such developments.

Although this study had strengths, namely that it compared two large resources of provider characteristics in a systematic way, there were limitations that should be discussed. Given the different specialty classification systems between the MD-PPAS and AMA data, we created a hierarchical broad specialty classification scheme to categorize physicians within each data source. It is possible that we differentially misclassified physicians, particularly given the variation in granularity between the two data sources. It is also possible that variations in specialty were oversimplified using the scheme. Comparison of categorical age and practice location, based on state alone, also may have meant that potential age and geographical variations on more granular levels were not identified. Although AMA data are commonly used in studies as the standard for physician specialty, we did not designate a gold standard between the AMA data and the MD-PPAS data to measure sensitivity, specificity, and positive predictive value. When we began our analysis, MD-PPAS version 2.0 was available. CMS continues to improve the MD-PPAS data, and version 2.3 is now available. Assessments of the most recent version may show greater agreement and concordance between the two data sources. Additionally, comparisons with other sources of physician data were not considered but may be of interest for future investigation.

In conclusion, this analysis compared variables available in the MD-PPAS and AMA data and the agreement of the common variables. We found comparable physician information on age, sex, overall specialty, and practice state is available in the MD-PPAS and AMA data. There was some indication that oncologists were underascertained in the MD-PPAS data compared with the AMA data. Each data source has unique physician information (eg, granular specialty information in AMA and practice characteristics in MD-PPAS) that may influence which data source to include in an analysis. Although AMA data have been commonly used to account for physician-level factors in health services research, the availability of MD-PPAS data provides researchers with an alternative option depending on study needs. Physicians are at the heart of cancer health-care delivery, and being able to describe and account for their involvement is critical to health services research.

Notes

Affiliations of authors: National Cancer Institute, Division of Cancer Control and Population Science, Healthcare Delivery Program, Bethesda, MD (DPW, LE, AMG, JLW); Information Management Services, Calverton, MD (RB).

The authors have disclosed that they have no financial interests, arrangements, affiliations, or commercial interests with the manufacturers of any products discussed in this article or their competitors. This article was produced by employees of the US government as part of their official duties and, as such, is in the public domain in the United States of America. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Cancer Institute, the National Institutes of Health, or the Centers for Medicare and Medicaid Services.

We thank the following individuals from the Centers for Medicare & Medicaid Services for their contributions to this study: David M. Bott, PhD; Jennifer Lloyd, PhD, MA, MS; and Arpit Misra, PhD, MA.

Supplementary Material

lgz031_Supplementary_Table

References

1. Klabunde CN, Ambs A, Keating NL, et al. The role of primary care physicians in cancer care. J Gen Intern Med. 2009;24(9):1029–1036.
2. Gage-Bouchard EA, Rodriguez LM, Saad-Harfouche FG, Miller A, Erwin DO. Factors influencing patient pathways for receipt of cancer care at an NCI-designated comprehensive cancer center. PLoS ONE. 2014;9(10):e110649.
3. Institute of Medicine (US) and National Research Council (US) National Cancer Policy Board; Hewitt M, Simone JV, eds. Ensuring Quality Cancer Care. Washington, DC: National Academies Press (US); 1999. Bookshelf ID: NBK230937.
4. National Cancer Institute, Division of Cancer Control & Population Sciences. SEER-Medicare: Brief Description of the SEER-Medicare Database. 2017. https://healthcaredelivery.cancer.gov/seermedicare/overview/. Accessed March 19, 2018.
5. Baldwin LM, Adamache W, Klabunde CN, Kenward K, Dahlman C, Warren JL. Linking physician characteristics and Medicare claims data: issues in data availability, quality, and measurement. Med Care. 2002;40(suppl 8):IV-82–IV-95.
6. Pollack LA, Adamache W, Eheman CR, Ryerson AB, Richardson LC. Enhancement of identifying cancer specialists through the linkage of Medicare claims to additional sources of physician specialty. Health Serv Res. 2009;44(2, pt 1):562–576.
7. American Medical Association. AMA Physician Masterfile. 1995–2017. https://www.ama-assn.org/life-career/ama-physician-masterfile. Accessed September 25, 2017.
8. Centers for Medicare & Medicaid Services, Research Data Assistance Center (ResDAC). Medicare Data on Provider Practice and Specialty (MD-PPAS). 2016. https://www.resdac.org/cms-data/files/md-ppas/data-documentation. Accessed September 25, 2017.
9. Welch PW, Stearns SC, Cuellar AE, Bindman AB. Use of hospitalists by Medicare beneficiaries: a national picture. Medicare Medicaid Res Rev. 2014;4(2):eCollection 2014.
10. Young A, Chaudhry HJ, Pei X, Halbesleben K, Polk DH, Dugan M. A census of actively licensed physicians in the United States, 2014. J Med Regul. 2015;101(2):7–23.
11. Bindman AB. Using the National Provider Identifier for health care workforce evaluation. Medicare Medicaid Res Rev. 2012;3(3):E1–E10.
12. Association of American Medical Colleges, Center for Workforce Studies. 2015 State Physician Workforce Data Book. 2015. https://www.aamc.org/data/workforce/reports/442830/statedataandreports.html. Accessed October 20, 2017.
13. Association of American Medical Colleges. The complexities of physician supply and demand: projections from 2013 to 2025. 2015. https://www.aamc.org/download/426242/data/ihsreportdownload.pdf. Accessed September 25, 2017.
14. Welch WP, Bindman AB. Town and gown differences among the 100 largest medical groups in the United States. Acad Med. 2016;91(7):1007–1014.
