Health Services Research. 2009 Apr;44(2 Pt 1):562–576. doi: 10.1111/j.1475-6773.2008.00935.x

Enhancement of Identifying Cancer Specialists through the Linkage of Medicare Claims to Additional Sources of Physician Specialty

Lori A Pollack, Walter Adamache, Christie R Eheman, A Blythe Ryerson, Lisa C Richardson
PMCID: PMC2677054  PMID: 19207588

Abstract

Objective

To examine the number of cancer specialists identified in three national datasets, the effect of combining these datasets, and the use of refinement rules to classify physicians as cancer specialists.

Data Sources

1992–2003 linked Surveillance, Epidemiology, and End Results (SEER)-Medicare data and a cancer-free comparison population of Medicare beneficiaries, Unique Physician Identification Number (UPIN) Registry, and the American Medical Association (AMA) Masterfile.

Study Design

We compared differences in counts of cancer specialists identified in Medicare claims only with the number obtained by combining data sources and after using rules to refine specialty identification.

Data Extraction

We analyzed physician specialty variables provided on Medicare claims, along with the specialties obtained by linkage of unencrypted UPINs on Medicare claims to the UPIN Registry, the AMA Masterfile, and all sources combined.

Principal Findings

Medicare claims identified the fewest cancer specialists (n=11,721), compared with 19,753 identified when we combined all three datasets. The percentage increase from combining datasets varied by subspecialty (from 187 percent for surgical oncologists to 50 percent for radiation oncologists). Rules created to refine identification most affected the count of radiation oncologists.

Conclusions

Depending on their study objectives, researchers should consider investing the additional effort and cost to refine specialty classification by using additional data sources.

Keywords: Specialties, medical, Medicare, classification, neoplasms


An understanding of variations in the delivery and outcomes of clinical care by physician specialty can be used to target education and interventions to the most relevant professional societies and training programs. The assessment of these variations, however, depends on the quality of the data used to measure physician specialty. The Surveillance, Epidemiology, and End Results (SEER)-Medicare dataset, which consists of cancer registry data linked to Medicare enrollment and claims files, contains physician specialty on the claims and is commonly used by health services researchers to examine specialty-related variations in care of people with cancer (Earle et al. 2002, 2006; Keating et al. 2003; National Cancer Institute 2007). Information about physician specialty available on Medicare claims can be supplemented by linking to sources such as the Unique Physician Identification Number (UPIN) Registry or the American Medical Association (AMA) Physician Masterfile (AMA 2004; NHIC 2006).

Baldwin et al. (2002) reviewed the availability and quality of data on physician characteristics in Medicare claims and found that while these data are useful, investigators must understand the limitations of data on physician specialty so that they might decide the best approach to a particular research question and how these limitations may affect findings. For example, Rosenblatt and colleagues found that the AMA Masterfile identified 97 hematologist/oncologists in Washington state in addition to the 58 hematologist/oncologists found in that state's Medicare claims, a 167 percent increase (Rosenblatt et al. 1998; Baldwin et al. 2002). Thus, researchers investigating the role of hematologist/oncologists potentially would underestimate the number of physicians with this specialty if using Medicare claims alone.

The extent to which there is agreement on cancer-specific physician specialty variables between Medicare claims, the UPIN Registry, and the AMA Masterfile for the entire SEER-Medicare dataset is unknown. Given the additional time and financial investment that is required to obtain and link the UPIN Registry and/or the AMA Masterfile to SEER-Medicare data, we believe a more complete understanding of the agreement on cancer-specific physician specialty between these data sources would be useful. Such information could be used by future investigators to determine whether seeking additional data on specialty will enhance the quality of their cancer-related research. The purpose of this paper is to compare the agreement and identification of physicians by cancer specialty between SEER-Medicare claims, the UPIN Registry, and the AMA Masterfile. We also demonstrate an approach to refine the determination of physicians as cancer specialists when using all three data sources, and we provide an overview of Medicare claims-based information on physician specialty, including the process of obtaining the data.

Methods

This study was conducted in the context of a larger study to identify how many long-term cancer survivors continue to receive care from cancer specialists more than 5 years after their cancer diagnosis. To evaluate the completeness of available data on cancer specialty, we compared the information on physician specialty contained in three data sources: (1) the Medicare claims obtained with the linked SEER-Medicare data, (2) the UPIN Registry, and (3) the AMA Masterfile.

Data Sources

Medicare Claims

We obtained the Medicare carrier (Physician Part B) and hospital outpatient claims for 1992–2003 for both cancer survivors and a comparison population of Medicare beneficiaries without cancer who were residing in SEER areas. The carrier claims contain a variable indicating the specialty of the performing physician. To obtain a Medicare billing (profiling) number, physicians must report their specialty to carriers and are requested to notify carriers of any subsequent changes in their specialty. The specialty code on Medicare carrier claims originates during claims processing when the carrier matches the performing physician's Medicare billing number to the corresponding record (practice setting) in its provider file. This practice setting record is the source of both the UPIN and the physician specialty available to researchers on Medicare claims. Hospital outpatient claims identify attending, operating, and other physicians but, unlike the carrier claims, do not contain a specialty variable. To protect confidentiality, SEER-Medicare carrier claims contain encrypted performing and referring physician UPINs. We requested and received approval from 10 individual SEER sites to use the unencrypted UPINs so that we could link the Medicare claims to both the UPIN Registry and the AMA Masterfile.

UPIN Registry

The UPIN is an identifier assigned to qualifying non-institutional Medicare providers. Physicians self-designate their medical specialty or specialties when enrolling with a Medicare carrier (NHIC 2006); they are required to indicate their primary specialty but do not have to indicate their secondary specialty, if any. Because specialties can vary by practice setting for individual providers, a single physician could theoretically have different primary or secondary specialty codes for each practice setting. We used the full UPIN Registry instead of the public-use version; the full UPIN Registry includes practice settings on physicians no longer practicing medicine as well as active physicians.

AMA Masterfile

The AMA Masterfile contains current and historical data on all physicians, regardless of membership in the AMA. A record for each physician is established upon entry to an accredited medical school or graduate medical education program in the United States or upon state licensure. The AMA Masterfile includes each physician's primary and secondary specialties, which are obtained from surveying the physicians directly or, in the event of nonresponse, from sources such as surveys of residency/fellowship programs (AMA 2004).

For each distinct UPIN found in the Medicare carrier and hospital outpatient claims, we created a record in our physician-level analytic file. Both UPIN Registry and AMA Masterfile data were added to each record, regardless of concordance.
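
As a concrete sketch of this linkage step (in Python with pandas for illustration; the analysis itself was done in SAS, as noted below), the file and column names here are hypothetical stand-ins rather than actual SEER-Medicare variable names:

    import pandas as pd

    # Hypothetical extracts of the three sources, keyed by unencrypted UPIN.
    claims = pd.read_csv("carrier_claims.csv")
    upin_registry = pd.read_csv("upin_registry.csv")
    ama = pd.read_csv("ama_masterfile.csv")

    # One record per distinct performing/referring UPIN found in the claims.
    physicians = claims[["upin", "claims_specialty"]].drop_duplicates("upin")

    # Left joins keep every claims UPIN even when a source has no match,
    # attaching UPIN Registry and AMA data regardless of concordance.
    physicians = (
        physicians
        .merge(upin_registry[["upin", "upin_primary_spec", "upin_secondary_spec"]],
               on="upin", how="left")
        .merge(ama[["upin", "ama_primary_spec", "ama_secondary_spec"]],
               on="upin", how="left")
    )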

Study Population

We identified 586,327 performing and referring physicians in the 1992–2003 SEER-Medicare claims for two groups: (1) men and women diagnosed with colorectal or bladder cancer, men diagnosed with prostate cancer, and women diagnosed with uterine or breast cancer during 1992–1997, and (2) the “noncancer” SEER-Medicare comparison population. Cancer specialists were defined as physicians practicing gynecologic oncology, hematology, hematology/oncology, medical oncology, musculoskeletal oncology, radiation oncology, or surgical oncology. Hematology, medical oncology, and hematology/oncology were grouped for analysis because the training programs are similar and most board-certified hematologists are also board certified in medical oncology. The Medicare claims identified 11,721 cancer specialists. The numbers of cancer specialists, and of physicians generally, were high because (1) 12 years of claims were used, and (2) if a Medicare beneficiary moved out of a SEER coverage area, the Medicare claims captured additional physicians who provided care in the new location.
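
A minimal sketch of how this specialty grouping might be encoded; the lowercase string labels below are illustrative assumptions standing in for the coded specialty values the three sources actually use:

    from typing import Optional

    # Hematology, medical oncology, and hematology/oncology are analyzed
    # as one category because of overlapping training and certification.
    HEME_ONC_GROUP = {"hematology", "medical oncology", "hematology/oncology"}

    CANCER_SPECIALTIES = HEME_ONC_GROUP | {
        "gynecologic oncology",
        "musculoskeletal oncology",
        "radiation oncology",
        "surgical oncology",
    }

    def analytic_category(specialty: str) -> Optional[str]:
        """Map a reported specialty to the analytic category used here."""
        s = specialty.strip().lower()
        if s in HEME_ONC_GROUP:
            return "hematology/oncology"
        return s if s in CANCER_SPECIALTIES else None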

Refinement Rules for Discordant Identification of Specialties

Because the specialties of individual physicians were not always identified consistently in the different databases, we developed a set of rules to identify potentially misclassified cancer specialists. If a physician was identified as a cancer specialist in the AMA Masterfile or in both the Medicare claims and the UPIN Registry, he or she was considered a cancer specialist. If the identified specialties were discordant, the specialty classification in the AMA Masterfile took precedence over Medicare claims and the UPIN Registry because the AMA does more active confirmation of specialty identification. If the cancer specialty was identified in only the Medicare claims or only the UPIN Registry but not in the AMA Masterfile, the physician was considered misclassified because of the discordance between the available sources.
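
The precedence rule reduces to a single boolean test. A minimal sketch, assuming each flag records whether the corresponding source lists the physician under a cancer specialty:

    def confirmed_cancer_specialist(in_claims: bool, in_upin: bool,
                                    in_ama: bool) -> bool:
        # The AMA Masterfile alone suffices; otherwise the claims and the
        # UPIN Registry must agree. A cancer specialty appearing only in
        # the claims or only in the UPIN Registry is treated as potentially
        # misclassified.
        return in_ama or (in_claims and in_upin)

For example, confirmed_cancer_specialist(True, False, False) returns False, reflecting a cancer specialty seen only in the claims.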

We sought to further restrict the identified cancer specialists to physicians who most likely made clinical decisions regarding patient care. We decided that cancer physicians who were also identified as specializing in pathology, blood banking/transfusion, dermatology, or anesthesiology likely had a supportive rather than a clinical role in the oncology care of patients with the study cancers, and we did not consider these physicians to be cancer specialists. We found that many radiation oncologists were also classified as diagnostic radiologists or specialists in nuclear medicine; we believed these physicians may have been misclassified as radiation oncologists because of the similarity of the specialties' names. Accordingly, we considered only physicians having Medicare claims with a radiation therapy Current Procedural Terminology code (CPT 77000–77799) to be practicing radiation oncology. In addition, the specialty of radiation oncology often occurred along with emergency medicine; we believed these physicians were misclassified radiologists and also treated them as potentially misclassified cancer specialists if no radiation therapy codes were found.
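
The radiation therapy screen can be sketched the same way, assuming a physician's billed CPT codes are available as strings; alphanumeric HCPCS entries are simply skipped in this simplified version:

    def has_radiation_therapy_claim(cpt_codes) -> bool:
        """True if any billed code falls in the radiation therapy
        CPT range 77000-77799 cited above."""
        for code in cpt_codes:
            try:
                numeric = int(code)
            except (TypeError, ValueError):
                continue  # skip alphanumeric HCPCS codes
            if 77000 <= numeric <= 77799:
                return True
        return False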

Analysis

Data from the three sources were combined into a single physician-level data file for each physician identified in Medicare claims. For each data source, counts of physicians who provided services or referred patients for services were obtained by cancer specialty. All analyses were conducted using SAS 9.1 (SAS Institute, Inc., Cary, NC). The percentage increase in the number of physicians identified when using all data sources combined, compared with use of Medicare claims alone, was calculated.
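
The percentage-increase measure used in the tables is the simple relative gain over claims alone; as a check, this short sketch reproduces the 68.5 percent figure reported below for any cancer specialty:

    def percent_increase(combined_count: int, claims_count: int) -> float:
        """Relative gain in identified specialists over Medicare claims alone."""
        return 100.0 * (combined_count - claims_count) / claims_count

    print(round(percent_increase(19753, 11721), 1))  # prints 68.5 (Table 2)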

Results

Our experience with the three data sources for physician specialty information in terms of effort to obtain the data, the time from request to receipt, the cost of purchasing the data, and the experience needed by the researchers to use it is summarized in Table 1. We obtained SEER-Medicare claims through the standard procedure of submitting a protocol to the National Cancer Institute (NCI), which took 7 weeks for approval. Obtaining UPIN Registry data for this study took longer because we needed to request permission from each of 10 SEER sites to receive unencrypted UPINs (the time to receive approval from each site ranged from 3 to 48 days) before the data request could be processed. The cost of purchasing the UPIN Registry was negligible, but the use of the UPIN Registry required experience to (1) ensure that retired and deceased physicians would be included in the file because study claims went as far back as 1992 and could have been from currently nonactive physicians, and (2) perform the linkage of the UPIN Registry data to Medicare claims. The AMA Physician Masterfile was easy to use given its specific focus and variables, but the effort to obtain the AMA data at the time of our request was hindered by an ongoing internal review of the AMA's policies and procedures for data release (which may have now been resolved). The cost for AMA data was significantly higher than that for the Medicare claims or UPIN Registry.

Table 1.

Summary of Data Sources Used to Obtain Information on Physician Specialty for a SEER-Medicare Analysis

Source: Medicare claims
  Description: The Medicare research dataset includes the encrypted UPIN and a provider specialty code.
  How the source obtains the data: During claims processing, carriers place the specialty code associated with the physician's practice setting on the claim before sending it to CMS for payment.
  Process for researchers to obtain the data: Proposals for data use are submitted to the National Cancer Institute and reviewed for protection of confidentiality.
  Effort to obtain: ++; Time to obtain: ++; Cost: ++; Experience needed to use: ++

Source: UPIN Registry‡
  Description: Database of Unique Physician Identification Numbers (UPINs) for providers who bill Medicare.
  How the source obtains the data: Providers self-identify and update their information when they enroll with a Medicare carrier.
  Process for researchers to obtain the data: The researcher must seek approval from each SEER site to obtain unencrypted UPINs, which the researcher then links to the UPIN Registry.
  Effort to obtain: +++; Time to obtain: +++; Cost: +; Experience needed to use: +++

Source: AMA Masterfile§
  Description: Demographic, education, specialty, and practice information on U.S. physicians.
  How the source obtains the data: A record is established upon entry to medical school or training and is updated from primary certification sources and direct surveys.
  Process for researchers to obtain the data: UPINs are usually unencrypted by a third party, sent to the AMA's programming contractor to attach selected variables, then re-encrypted for release to the researcher.
  Effort to obtain: +++; Time to obtain: +++; Cost: +++; Experience needed to use: +
Notes: Ratings reflect the experience of the authors and may be different for other researchers over time. Ratings are from least (+) to most (+++).

‡ Request for approval from SEER registries increases time and effort to obtain.

§ We did not need a third party to unencrypt the UPINs because we had received permission for use of the actual UPINs. Time to obtain was slow because the AMA's policies and procedures on releasing data were being reviewed when our request was submitted.

SEER, Surveillance, Epidemiology, and End Results; CMS, Centers for Medicare and Medicaid Services; UPIN, Unique Physician Identification Number; AMA, American Medical Association.

The counts of physicians identified in each of the data sources, individually and after the data sources were combined, are shown in Table 2. Medicare claims alone identified 11,721 cancer specialists. The UPIN Registry identified more surgical oncologists, radiation oncologists, and gynecologic oncologists than the Medicare claims or AMA data, and the AMA data identified more hematologist/oncologists than the Medicare claims or UPIN Registry. Both the UPIN Registry and the AMA Masterfile contributed to capturing more cancer specialists than use of Medicare claims alone. Combining the UPIN Registry with Medicare claims, we identified 4,236 more cancer specialists (an increase of 36.1 percent); combining the AMA Masterfile with Medicare claims identified 6,454 more (55.1 percent). Combining all three data sources identified an even greater number: 19,753 unique cancer specialists, a 68.5 percent increase over use of claims alone. Using the combined data, we identified 13,151 hematologist/oncologists, 1,450 surgical oncologists, 4,827 radiation oncologists, and 1,047 gynecologic oncologists. Musculoskeletal oncologists were identified only in the AMA Masterfile.

Table 2.

Counts of Cancer Specialists Identified in Each Data Source, and Effect of Combining UPIN Registry and AMA Masterfile Data with Medicare Claims on the Number of Physicians Identified

Cancer Specialty | Medicare Claims: Count | UPIN Registry: Count | AMA Masterfile: Count | Claims+UPIN: Count (% Increase) | Claims+AMA: Count (% Increase) | All Sources Combined: Count (% Increase)
Any cancer specialty | 11,721 | 15,076 | 16,368 | 15,597 (36.1) | 18,175 (55.1) | 19,753 (68.5)
Hematology/oncology† | 7,726 | 9,865 | 11,281 | 10,452 (35.3) | 12,214 (58.1) | 13,151 (70.2)
Surgical oncology | 506 | 936 | 689 | 1,003 (98.2) | 1,049 (107.3) | 1,450 (186.6)
Radiation oncology | 3,218 | 3,894 | 3,840 | 4,182 (30.0) | 4,471 (38.9) | 4,827 (50.0)
Gynecologic oncology | 432 | 699 | 650 | 744 (72.2) | 864 (100.0) | 1,047 (142.4)
Musculoskeletal oncology | n/a | n/a | 65 | 0 | 65 | 65

Notes: Physicians can be counted in more than one category; thus the sum of the individual categories exceeds the overall count of identified cancer specialists. % Increase reflects the percentage increase in specialists identified by combining data sources relative to use of claims alone. n/a denotes that specialty information is not available in the data source.

† Hematology/oncology refers to the following medical specialties: hematology, hematology–oncology, or medical oncology.

Source: UPIN Registry, AMA Masterfile, Medicare carrier claims 1992–2003.

UPIN, Unique Physician Identification Number; AMA, American Medical Association.

Table 3 shows a detailed account, overall and by specialty, of the cancer physicians identified when all data sources were combined, and how many were found in one, two, or all three data sources. The counts reflect the specialties as reported, before our refinement of discordant specialty identification. Of the 19,753 cancer specialists of any type identified by combining data sources, only 9,673 (line a) appeared in all three sources. Medicare claims alone identified 640 cancer specialists (line b) not found in either other data source. The UPIN Registry alone identified 1,578 (line d), and the AMA Masterfile alone identified 3,796 (line f). The remaining lines show the contributions of pairs of sources when the third source does not identify the physicians as cancer specialists; line c, for instance, shows that 1,167 specialists were identified in both the claims and the UPIN Registry but not in the AMA Masterfile. The relatively low counts for surgical oncology and gynecologic oncology identified in all three sources (line a) indicate how important it can be to use all three data sources.

Table 3.

Number of Cancer Specialists Identified by Individual and Combined Data Sources by Specialty Type

Total number identified by combining datasets: 19,753 all cancer specialists; 13,151 hematology/oncology; 1,450 surgical oncology; 4,827 radiation oncology; 1,047 gynecologic oncology.

Contribution from each data source or combination of sources:

Line† | Medicare Claims | UPIN Registry | AMA Masterfile | All Cancer Specialists | Hematology/Oncology* | Surgical Oncology | Radiation Oncology | Gynecologic Oncology
a | Yes | Yes | Yes | 9,673 | 6,629 | 139 | 2,536 | 201
b | Yes | No | No | 640 | 423 | 60 | 237 | 28
c | Yes | Yes | No | 1,167 | 510 | 300 | 394 | 186
d | No | Yes | No | 1,578 | 937 | 401 | 356 | 183
e | No | Yes | Yes | 2,658 | 1,789 | 96 | 608 | 129
f | No | No | Yes | 3,796 | 2,699 | 447 | 645 | 303
g | Yes | No | Yes | 241 | 164 | 7 | 51 | 17
* Hematology/oncology refers to the following medical specialties: hematology, hematology–oncology, or medical oncology.

† Line labels (a–g) are for reference in the Results text describing Table 3.

UPIN, Unique Physician Identification Number; AMA, American Medical Association.

The counts presented in Table 4 demonstrate the effect of applying rules to reduce potential misclassification of specialty. Of the 19,753 cancer specialists identified by combining the three datasets, 82.1 percent remained after our refinement rules were applied. The specialty most affected was radiation oncology, for which only 53.5 percent of identified physicians were confirmed by source concordance and the presence of radiation therapy claims; the rest were suspected to be misclassified radiologists or radiologic subspecialists. The effect of the rules on the combined hematology/oncology and the gynecologic oncology categories was smaller (91.5 and 84.8 percent confirmed, respectively). Compared with the UPIN Registry, a larger percentage of physicians identified in Medicare claims were confirmed as cancer specialists by our refinement rules, overall and by subspecialty.

Table 4.

Differences in the Number of Physicians Identified as Cancer Specialists after Applying Rules* to Limit Misclassification

Specialty | Medicare Claims: Original / Confirmed (%) | UPIN Registry: Original / Confirmed (%) | AMA Masterfile: Original / Confirmed (%) | All Sources: Original / Confirmed (%)
Any cancer specialty | 11,721 / 10,162 (86.7%) | 15,076 / 12,604 (83.6%) | 16,368 / 14,744 (90.1%) | 19,753 / 16,209 (82.1%)
Hematology/oncology† | 7,726 / 7,291 (94.4%) | 9,865 / 9,226 (93.5%) | 11,281 / 11,218 (99.4%) | 13,151 / 12,034 (91.5%)
Surgical oncology | 506 / 446 (88.1%) | 936 / 587 (62.7%) | 689 / 688 (99.9%) | 1,450 / 1,040 (71.7%)
Radiation oncology | 3,218 / 2,040 (63.4%) | 3,894 / 2,256 (57.9%) | 3,840 / 2,185 (56.9%) | 4,827 / 2,583 (53.5%)
Gynecologic oncology | 432 / 404 (93.5%) | 699 / 568 (81.3%) | 650 / 650 (100.0%) | 1,047 / 888 (84.8%)
* To be confirmed as a cancer specialist, a physician needed to be identified as such in the AMA Masterfile, or concurrently in both Medicare claims and the UPIN Registry. Further refinement required radiation oncologists to have filed claims for radiation therapy and eliminated physicians who likely play a supportive role in cancer management (e.g., pathologists, anesthesiologists).

† Hematology/oncology refers to the following medical specialties: hematology, hematology–oncology, or medical oncology.

UPIN, Unique Physician Identification Number; AMA, American Medical Association.

Discussion

The linkage of Medicare claims to additional data sources identified almost 70 percent more cancer physicians than were identified in the Medicare claims alone. The greatest increases were observed for gynecologic oncologists and surgical oncologists, whose numbers more than doubled when the combined data were used. These results show that researchers who conduct studies focused on cancer specialty and the delivery of care by specialty should be aware that use of Medicare claims data alone may lead to an underestimate of the number of cancer specialists and a misinterpretation of their role. Supplemental data sources, such as the UPIN Registry and AMA data, can be used to identify more cancer specialists, but adding these sources will add time and cost to the study.

We should note that the increased numbers and relative percentages of identified cancer specialists presented here are based on counts of specialists within each dataset and are not given within the context of a particular research study. In practice, the magnitude of the gain in identified specialists will differ with the research design and objectives. This study was conducted as preliminary work for our study of long-term cancer survivors receiving care from cancer specialists. Using the same data, we calculated the percentage increase in the number of long-term cancer survivors identified as receiving care from a cancer specialist using the combined data sources versus Medicare claims alone. With the combined data, 18.1 percent more long-term survivors (range 10.8–37.4 percent, depending on cancer type) were identified as receiving care from cancer specialists than with Medicare data alone (L.A. Pollack, unpublished data). This example illustrates that although combining other data sources with Medicare claims identifies more cancer specialists, the magnitude of the effect on study results varies with the research question and study design. One reason the percentage increase in identified cancer specialists in our larger study was lower than the increase shown in this paper may be that the larger study used only the UPINs from the claims of persons with a known cancer diagnosis, rather than from both the cancer and noncancer comparison populations.

The use of supplemental data sources to determine physician specialty in research studies is not consistent. Many published studies have relied on Medicare claims to identify physician specialty (Earle et al. 2003; Earle and Neville 2004; Pham et al. 2005), but more recent studies are linking the UPINs to AMA data (Baldwin et al. 2005; Earle et al. 2006; Keating et al. 2006; Ryerson et al. 2007). Another approach is to use additional data other than the specialty classification in the claims to capture specialists. For example, Keating et al. (2003) identified practicing oncologists by including physicians who billed Medicare for providing chemotherapy.

One advantage of linking AMA data to SEER-Medicare data is that it is done through a third party, and thus researchers avoid the burden of obtaining and protecting the confidentiality of the unencrypted UPINs, which can identify individual physicians (National Cancer Institute 2007). If we had used only AMA data to enhance the identification of specialists, we would have captured more cancer specialists than identified in the Medicare claims alone but not as many as identified through obtaining permission to use the unencrypted (not publicly available) UPINs and linking to the UPIN Registry. The order of the linkage in the analytic tables would change the number of additional cancer specialists identified in each step of analyzing the data but not the overall results.

To our knowledge, no studies have specifically examined the use of algorithms to refine specialties and exclude misclassified physicians. This paper demonstrates that particular physician specialties may be at risk of misclassification, particularly diagnostic radiologists and specialists in nuclear medicine, and that in this case refinement based on the type of service billed was useful. A recent study estimated the number of professionally active radiation oncologists in the United States (Lewis and Sunshine 2007). Although the years of that study did not align with ours, and our study included retired and nonpracticing physicians, we compared its estimates with the counts derived from our sources to assess whether the results were similar. Lewis and Sunshine estimated there were 2,900 radiation oncologists in 1995 and 3,500 in 2003; in our study, we identified 3,218 radiation oncologists through Medicare claims alone, 4,827 by combining datasets, and 2,583 after refinement rules were applied. The higher count from our combined dataset suggests that physicians other than practicing radiation oncologists were included. The lower count found after our refinement rules were applied may be closer to the actual number, given that our study captured mainly physicians practicing in SEER areas rather than all practicing physicians. Another recent study estimated the number of active oncologists in 2005, but it was not comparable to our results because of differences in study years and in the grouping of specialties considered oncologists (Erikson et al. 2007).

Previous work has described Medicare claims, the UPIN Registry, and AMA data in terms of availability and the percentage of missing variables (Baldwin et al. 2002), but a comparison of the agreement of these data sources has not been done using the entire SEER-Medicare dataset. A limitation of the present study is that none of the included data sources can be viewed as a gold standard, because there are few validation studies. Therefore, we cannot be certain that the listed specialties are accurate, nor can we know how well our algorithm refined the identification of specialty. Since the implementation of Medicare's Physician Fee Schedule in 1993, physicians have had no direct financial incentive to misreport their specialty, but researchers must recognize that subspecialists are not restricted to practicing their specialty and often provide more generalized care (Rosenblatt et al. 1998). Thus, the limitations of any refinement must be acknowledged.

In conclusion, although physician specialty is provided on Medicare claims, the identified specialty does not always concur when providers' actual UPINs are linked with additional data sources. Our study examined the agreement on cancer specialists among Medicare claims, the UPIN Registry, and the AMA Physician Masterfile and found that combining the three identified at least 50 percent more cancer specialists. The gain in identified physicians may not be as large, however, in the context of a specific research question or after algorithms are applied to refine the identification of specialties. In addition, enhancement by combining data sources or applying refinement rules does not amount to true validation. This study demonstrates that researchers should be aware that the specialty information in Medicare claims may not always be accurate, and that linkage to the UPIN Registry or AMA Physician Masterfile is one approach to enhance the number of identified cancer specialists.

Acknowledgments

Joint Acknowledgment/Disclosure Statement: This research was supported by the Centers for Disease Control and Prevention, Division of Cancer Prevention and Control (Contract 200-2002-00575 Task 17). This study used the linked SEER-Medicare database. The authors acknowledge the efforts of the Applied Research Program, NCI; the Office of Research, Development, and Information, CMS; Information Managements Services; and the SEER Program tumor registries in the creation of the SEER-Medicare database. We also acknowledge Ann Larsen of RTI International for programming and data management.

Disclosure: The authors have no conflicts of interest.

Disclaimers: The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the Centers for Disease Control and Prevention. The interpretation and reporting of these data are the sole responsibility of the authors.

Supporting Information

Additional supporting information may be found in the online version of this article:

Appendix SA1: Author Matrix.

hesr0044-0562-SD1.doc (248KB, doc)

Please note: Wiley-Blackwell is not responsible for the content or functionality of any supporting materials supplied by the authors. Any queries (other than missing material) should be directed to the corresponding author for the article.

References

  1. American Medical Association (AMA). AMA Physician Masterfile. 2004 [accessed November 13, 2007]. Available at http://www.ama-assn.org/ama/pub/category/2673.html.
  2. Baldwin LM, Adamache W, Klabunde CN, Kenward K, Dahlman CP, Warren J. Linking Physician Characteristics and Medicare Claims Data: Issues in Data Availability, Quality, and Measurement. Medical Care. 2002;40(8, Suppl.):IV-82–95. doi: 10.1097/00005650-200208001-00012.
  3. Baldwin LM, Dobie SA, Billingsley K, Cai Y, Wright GE, Dominitz JA, Barlow W, Warren JL, Taplin SH. Explaining Black-White Differences in Receipt of Recommended Colon Cancer Treatment. Journal of the National Cancer Institute. 2005;97(16):1211–20. doi: 10.1093/jnci/dji241.
  4. Earle CC, Burstein HJ, Winer EP, Weeks JC. Quality of Non-Breast Cancer Health Maintenance among Elderly Breast Cancer Survivors. Journal of Clinical Oncology. 2003;21(8):1447–51. doi: 10.1200/JCO.2003.03.060.
  5. Earle CC, Neumann PJ, Gelber RD, Weinstein MC, Weeks JC. Impact of Referral Patterns on the Use of Chemotherapy for Lung Cancer. Journal of Clinical Oncology. 2002;20(7):1786–92. doi: 10.1200/JCO.2002.07.142.
  6. Earle CC, Neville BA. Under Use of Necessary Care among Cancer Survivors. Cancer. 2004;101(8):1712–9. doi: 10.1002/cncr.20560.
  7. Earle CC, Schrag D, Neville BA, Yabroff KR, Topor M, Fahey A, Trimble EL, Bodurka DC, Bristow RE, Carney M, Warren JL. Effect of Surgeon Specialty on Processes of Care and Outcomes for Ovarian Cancer Patients. Journal of the National Cancer Institute. 2006;98(3):172–80. doi: 10.1093/jnci/djj019.
  8. Erikson C, Salsberg E, Forte G, Bruinooge S, Goldstein M. Future Supply and Demand for Oncologists: Challenges to Assuring Access to Oncology Services. Journal of Oncology Practice. 2007;3(2):79–86. doi: 10.1200/JOP.0723601.
  9. Keating NL, Landrum MB, Ayanian JZ, Winer EP, Guadagnoli E. Consultation with a Medical Oncologist Before Surgery and Type of Surgery among Elderly Women with Early-Stage Breast Cancer. Journal of Clinical Oncology. 2003;21(24):4532–9. doi: 10.1200/JCO.2003.05.131.
  10. Keating NL, Landrum MB, Guadagnoli E, Winer EP, Ayanian JZ. Factors Related to Underuse of Surveillance Mammography among Breast Cancer Survivors. Journal of Clinical Oncology. 2006;24(1):85–94. doi: 10.1200/JCO.2005.02.4174.
  11. Lewis RS, Sunshine JH. Radiation Oncologists in the United States. International Journal of Radiation Oncology Biology Physics. 2007;69(2):518–27. doi: 10.1016/j.ijrobp.2007.02.053.
  12. National Cancer Institute. SEER-Medicare Database. 2007 [accessed November 13, 2007]. Available at http://healthservices.cancer.gov/seermedicare/
  13. NHIC Corp. The UPIN Registry. 2006 [accessed November 13, 2007]. Available at http://www.upinregistry.com/faq.asp.
  14. Pham HH, Schrag D, Hargraves JL, Bach PB. Delivery of Preventive Services to Older Adults by Primary Care Physicians. Journal of the American Medical Association. 2005;294(4):473–81. doi: 10.1001/jama.294.4.473.
  15. Rosenblatt RA, Hart LG, Baldwin LM, Chan L, Schneeweiss R. The Generalist Role of Specialty Physicians: Is There a Hidden System of Primary Care? Journal of the American Medical Association. 1998;279(17):1364–70. doi: 10.1001/jama.279.17.1364.
  16. Ryerson AB, Eheman C, Burton J, McCall N, Blackman D, Subramanian S, Richardson LC. Symptoms, Diagnoses, and Time to Key Diagnostic Procedures among Older U.S. Women with Ovarian Cancer. Obstetrics and Gynecology. 2007;109(5):1053–61. doi: 10.1097/01.AOG.0000260392.70365.5e.
