Author manuscript; available in PMC: 2010 Jan 22.
Published in final edited form as: Breast J. 2008 Dec 12;15(1):17–25. doi: 10.1111/j.1524-4741.2008.00666.x

The Structural Landscape of the Health Care System for Breast Cancer Care: Results from The Los Angeles Women’s Health Study

Diana M Tisnado*, Jennifer L Malin, May L Tao, Patricia Ganz§, Danielle Rose-Ash, Ashlee F Hu, John Adams¶,**, Katherine L Kahn*,¶,**
PMCID: PMC2810106  NIHMSID: NIHMS168719  PMID: 19120382

Abstract

The structure of health care has been rapidly evolving in response to financial pressures and demands to improve quality. Little work has documented the structure of care and its impact in the context of breast cancer care. We conducted a cross-sectional survey to characterize Los Angeles physicians caring for breast cancer patients and the structural landscape of the health care system in which they practice. We surveyed 477 physicians, targeting all Los Angeles County medical oncologists, radiation oncologists, and surgeons reported by patients participating in the Los Angeles Women’s Health Study, a population-based cohort of breast cancer patients (77% response rate). Specialty-specific questionnaires were developed; items were based on the structure and quality of care literature, cognitive interviews with cancer care specialists, and existing physician survey instruments. Breast cancer care providers in Los Angeles are diverse, with one-third non-white and 46% speaking a non-English language. Group practice is most common (37% single specialty, 16% group-model HMO, 8% multi-specialty group). Minimal teaching involvement predominates. Mean new breast cancer patient volumes are relatively high (eight per month overall; six for surgeons), representing 46% of new cancer patients. Physicians reported high career satisfaction levels (83–92%), although they were least satisfied with the amount of time spent with patients (82%). Data from this study represent important building blocks for further analyses to determine the impact of structural characteristics on the quality of care that breast cancer patients experience.

Keywords: breast cancer, quality of care, structure of care


Quality of health care is generally conceptualized to include three dimensions: the structure of care (e.g., setting, organizational context), the process of care (e.g., diagnosis, treatment), and outcomes of care (e.g., survival, function, quality of life). Structure of care includes the human, physical, and financial resources necessary to provide care, and it is believed to influence outcomes (1). Increasingly, it appears that many mutable factors influencing the process of care and outcomes lie in the domain of structure (2–6). In the last decade, the structure of health care in the US has been rapidly evolving in response to increasing financial pressures and demands to improve quality. Some of these changes have been shown to negatively impact primary care delivery (7–10), yet little is known about how these new organizational and financial arrangements affect the delivery of care for serious diseases such as cancer.

The American College of Radiology’s Patterns of Care studies, conceived in the 1970s, described the structure of radiation therapy in the US and attempted to identify predictors of the quality of radiation care being delivered (11). Since then, several studies have examined how treatment patterns for cancer vary with the structure of care at a macro-organizational level, including such characteristics as hospital or setting type (12–14) and managed care (15–20). Other studies have examined structural variables including surgeon specialty (21,22), surgeon case volume (13,21–24), and hospital case volume (22,25,26).

Few studies have systematically evaluated the impact of the “micro-organizational” characteristics of care in hospitals and physician practices on processes and outcomes of cancer care. In a study of clinician beliefs about important structural characteristics, clinicians reported inadequate staffing and other organizational resources to be a barrier to patient enrollment in clinical trials (27). In another, structures believed to facilitate coordination of care included tracking of referrals, regular multidisciplinary meetings, feedback of performance data, use of guidelines, computerized systems, and a single physical location (28). Findings regarding audit and feedback of performance data have been mixed (29,30). The effects of most of these structures on the quality of breast cancer care have not been evaluated, although treatment patterns for breast cancer may vary significantly with the structure of the health care delivery system (15).

We conducted a physician survey to obtain information about the structure of breast cancer care and, ultimately, to evaluate the impact of structure on the quality of breast cancer care. This paper characterizes Los Angeles County physicians providing care to breast cancer patients and describes the system in which these physicians practice.

MATERIALS AND METHODS

We conducted a cross-sectional study of the structure and organization of care associated with physicians who treated a population-based sample of women with incident breast cancer participating in the Los Angeles Women’s Health Study (LAW). The LAW study includes a population-based sample of women ages ≥50 identified with a new diagnosis of breast cancer in Los Angeles County in 2000 by the Los Angeles County Cancer Surveillance Program Rapid Case Ascertainment system. We surveyed 1224 consenting women. A detailed computer-assisted telephone interview (CATI) queried women about health care providers who fulfilled roles including delivering, recommending, or discussing breast cancer treatments. For our physician survey, we targeted all Los Angeles County medical oncologists, radiation oncologists, and surgeons reported by patients in the CATI for whom contact information could be verified.

To develop our conceptual framework for survey development and analysis, we conducted (i) a review of clinical and health services literature regarding structural characteristics and quality of care; (ii) cognitive interviews with 14 key informants including surgeons, medical oncologists, and radiation oncologists selected from multiple locations in Los Angeles. In addition, we conducted interviews with community advisors from advocacy organizations and other groups serving women with breast cancer. Interviews lasted 1 hour and covered topics including practice characteristics (16,26,31,32), barriers to referrals (33), coordination of care (28), patient and provider support (30,34) and financial issues (35–37). Survey items drew upon the literature, with several items adapted from existing instruments such as the Community Tracking Study Physician Survey (http://www.hschange.com) and the Young Physicians Survey (38), as well as cognitive interviews and reviews offered by community advisors.

We organized survey topics into the following five conceptual domains: Facilities and Resources, Referrals and Coordination, Physician Support, Patient Support, and Financial Incentives. We developed three specialty-specific versions of the survey (medical oncology, radiation oncology, and surgeon) based on this conceptual framework. Survey items were revised based on a second level of review of the conceptual framework, accompanying hypotheses, and each conceptual domain to ensure that all domains were adequately represented.

Physicians were queried about their personal characteristics (e.g., age, gender, race/ethnicity, language, personal practice ownership interest, how time is spent) and characteristics of their main practice, defined as the one in which they see most of their patients (e.g., practice type, ownership, size). Physicians were also queried about levels of satisfaction with six dimensions of their career: current specialty, current work setting, amount of time spent with patients, extent to which practice has met professional expectations, decision to become a physician, and overall professional career. Response options were on a five-point Likert scale, ranging from very dissatisfied to very satisfied.

A run-in phase of the survey was fielded to 10 physicians of each specialty type, with a 70% response rate. No questions or problems were noted, and the survey instruments went to final production with no substantive changes.

We mailed the finalized self-administered survey instrument to 477 physicians between April and October of 2004, including 175 medical oncologists, 75 radiation oncologists, and 227 surgeons. Strategies to maximize response rate included a $50 incentive check enclosed in the first mailing, an announcement in a local radiation oncologist professional organization newsletter, and follow-up telephone calls to determine receipt and encourage participation. We conducted a second mailing and round of telephone calls to all nonresponders. Additional interventions included notes and telephone calls from clinicians on the research team, and $5 gift cards to office staff who facilitated physician participation in a practice.

Response weights were calculated as the inverse of the probability of response based on a logistic regression model including physician gender, specialty type, study patient volume, and sharing an office with another surveyed physician.
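
To illustrate this weighting approach, the sketch below shows one way such inverse-probability-of-response weights can be computed. It is not the study’s actual code (the analyses were performed in Stata), and the data file and variable names (responded, gender, specialty, study_patient_volume, shared_office) are hypothetical.

```python
# Illustrative sketch only: inverse-probability-of-response weights from a
# logistic model of survey response. File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

physicians = pd.read_csv("physician_frame.csv")  # one row per surveyed physician

# Model response (1 = returned survey) on gender, specialty, study patient volume,
# and sharing an office with another surveyed physician, as described above.
response_model = smf.logit(
    "responded ~ C(gender) + C(specialty) + study_patient_volume + C(shared_office)",
    data=physicians,
).fit()

physicians["p_response"] = response_model.predict(physicians)

# Nonresponse weight = inverse of the estimated response probability,
# carried forward for respondents in the weighted analyses.
respondents = physicians.loc[physicians["responded"] == 1].copy()
respondents["response_weight"] = 1.0 / respondents["p_response"]
```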

Descriptive analyses were performed, including comparisons of means using Stata’s “lincom” command for continuous variables and chi-square tests for categorical variables, with Stata survey commands used to weight all analyses for nonresponse.
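
The weighted descriptive comparisons could be approximated along the following lines; because the study itself used Stata survey commands, this Python sketch, with assumed column names, is illustrative only and does not reproduce the survey-design variance estimation behind the reported standard errors.

```python
# Illustrative sketch of nonresponse-weighted descriptive statistics.
# Column names and the data file are hypothetical.
import numpy as np
import pandas as pd

respondents = pd.read_csv("respondents_weighted.csv")  # includes response_weight

def weighted_mean(values: pd.Series, weights: pd.Series) -> float:
    """Nonresponse-weighted mean of a continuous survey item."""
    return float(np.average(values, weights=weights))

# Weighted mean new breast cancer patient volume, by specialty
volume_by_specialty = respondents.groupby("specialty").apply(
    lambda g: weighted_mean(g["new_breast_ca_per_month"], g["response_weight"])
)
print(volume_by_specialty)

# Weighted cross-tabulation of a categorical item (e.g., practice type) by specialty
weighted_counts = pd.crosstab(
    respondents["specialty"],
    respondents["practice_type"],
    values=respondents["response_weight"],
    aggfunc="sum",
)
print(weighted_counts)
```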

The study was approved by the UCLA Institutional Review Board.

RESULTS

We determined 19 of the original 477 physicians to be ineligible: 15 who were no longer practicing in Los Angeles County, and four who reported no longer treating breast cancer patients as of the fielding period. We received 348 surveys from physicians associated with 298 unique office addresses (185 physicians practiced out of offices shared by at least one other study participant). Our final response rate was 77% (67% for medical oncologists, 89% for radiation oncologists, and 80% for surgeons). Our final sample includes 111 medical oncologists, 66 radiation oncologists and 171 surgeons.

Significant predictors of response were specialty type (medical oncologist OR = 0.24, p = 0.0004; surgeon OR = 0.44, p = 0.04, each as compared with radiation oncologists) and sharing an office with another physician targeted for survey fielding (OR = 1.7, p = 0.02).

Physician and Practice Characteristics

Physician Demographics

Overall, physician mean age was 52.5 (SE = 0.5) and varied slightly among the specialty types, with a mean age of 53.2 for medical oncologists and 53.0 for surgeons as compared with 49.5 for radiation oncologists (p < 0.05) (Table 1). Respondents were predominantly male (81.7%) and of non-Hispanic white race/ethnicity (66.4%), followed by Asian (19.8%), Other (5.6%), Hispanic (any race) (5.2%), and African American (3.0%).

Table 1.

Demographics: Physician Characteristics, Mean or % (SE)

Medical oncologist (n = 111) Radiation oncologist (n = 66) Surgeon (n = 171) All (n = 348)
Mean Age* 53.2 (0.8) 49.5 (1.2) 53.0 (0.7) 52.5 (0.5)
Gender
 % Male 76.3 (3.9) 79.4 (4.9) 86.8 (2.5) 81.7 (2.1)
 % Female 23.7 (3.9) 20.6 (4.9) 13.2 (2.5) 18.3 (2.1)
Race/ethnicity
 % NH White 64.0 (4.8) 65.1 (5.9) 68.7 (3.6) 66.4 (2.6)
 % Asian 23.0 (4.0) 24.2 (5.3) 15.8 (2.8) 19.8 (2.2)
 % Other 7.1 (3.1) 4.5 (2.6) 4.7 (1.6) 5.6 (1.5)
 % Hispanic (any race) 2.8 (1.6) 3.2 (2.2) 7.8 (2.1) 5.2 (1.2)
 % Black 3.1 (1.7) 3.1 (2.1) 2.9 (1.3) 3.0 (1.2)
 % with additional languages 46.7 (4.8) 34.8 (5.9) 49.0 (3.8) 45.8 (2.7)
 Mean cancer volume (new patients/month)*** 28.3 (2.7) 30.7 (2.0) 9.8 (0.6) 20.0 (1.2)
 Mean breast cancer volume (new patients/month)*** 8.9 (0.8) 12.0 (1.2) 5.6 (0.5) 7.8 (0.5)
 Mean % of cancer patients with breast cancer* 37.0 (1.9) 37.7 (2.5) 54.8 (1.8) 45.5 (1.3)
 Mean number of offices for seeing patients*** 1.7 (0.1) 2.5 (0.3) 1.4 (0.1) 1.7 (0.1)
Respondent ownership interest in practice
 Full owner 32.4 (4.6) 20.0 (5.0) 52.1 (3.8) 39.6 (2.7)
 Part owner 42.9 (4.8) 37.9 (6.0) 30.8 (3.5) 36.4 (2.6)
 No ownership interest 24.7 (4.1) 42.1 (6.1) 17.2 (2.9) 24.0 (2.3)
*Lincom comparison of means: p < 0.05; ***p < 0.001.
Chi-squared test: p < 0.001.

Nearly 46% of respondents reported fluency in at least one language other than English. The most frequently reported non-English language was Spanish (42%), followed by Chinese (10%). Over 30 different non-English languages were reported.

Breast Cancer Patient Volume

With all specialties combined, respondents reported treating a mean of 20.0 new cancer patients per month and 7.8 new breast cancer patients per month (i.e., 45% of respondents’ new cancer cases were breast cancer). Radiation oncologists reported the highest cancer and breast cancer volumes. Surgeons reported the lowest cancer patient volumes but the highest proportion of breast cancer patients among their new cancer cases (54.8%).

Physician Ownership Interest in Practice

Overall, 39.6% reported full practice ownership and 36.4% reported partial ownership. Surgeons were most likely to have ownership interest in their practice, with 52.1% full owners and 30.8% part owners. Respondent ownership varied significantly by specialty type (p < 0.001).

Physician’s Number of Offices

On average, respondents reported practicing in more than one office (mean = 1.7 offices; SE = 0.1).

Practice Setting Type

Overall, the majority of physicians reported practicing in groups (Table 2). There were variations by specialty type, with 60.9% of radiation oncologists and 43.8% of medical oncologists in single-specialty groups as compared with 23.9% of surgeons, and 43.1% of surgeons reporting solo practice. Nearly 16% of physicians reported practicing in a group-model HMO.

Table 2.

Demographics: Practice Characteristics, % (SE)

Medical oncologist (n = 111) Radiation oncologist (n = 66) Surgeon (n = 171) All (n = 348)
Practice type***
 Single-specialty group 43.8 (4.8) 60.9 (6.0) 23.9 (3.2) 37.2 (2.6)
 Solo 25.5 (4.3) 7.7 (3.3) 43.1 (3.8) 30.9 (2.5)
 Group model HMO 15.0 (3.7) 11.9 (4.0) 17.2 (2.8) 15.5 (2.0)
 Multi-specialty group 4.5 (2.0) 10.5 (3.8) 9.5 (2.3) 7.8 (1.4)
 University-based 9.6 (2.8) 9.1 (3.5) 5.2 (1.7) 7.5 (1.4)
 Other 1.6 (1.2) 0 (0.0) 1.1 (0.8) 1.1 (0.6)
Practice best described as***
 Office-based 91.5 (2.6) 61.9 (6.0) 57.2 (3.8) 70.6 (2.4)
 Hospital-based 6.7 (2.3) 33.7 (5.8) 31.9 (3.6) 22.9 (2.2)
 Both 1.8 (1.3) 3.0 (2.1) 7.9 (2.0) 4.9 (1.1)
 Other 0 (0.0) 1.5 (1.5) 3.0 (1.3) 1.7 (0.7)
Practice ownership***
 One or more physicians, or a physician-owned corporation 74.5 (4.4) 56.2 (6.1) 73.1 (3.4) 70.9 (2.5)
 HMO, health plan, or insurance company 14.3 (3.7) 13.4 (4.2) 18.0 (2.9) 15.9 (2.0)
 Hospital 0.9 (0.9) 16.7 (4.6) 4.0 (1.5) 4.9 (1.1)
 Medical school/university 6.2 (2.3) 6.0 (2.9) 1.8 (1.0) 4.1 (1.1)
 County government 3.4 (1.7) 1.6 (1.6) 1.2 (0.8) 2.1 (0.8)
 Some other type of owner 0.9 (0.9) 4.7 (2.7) 1.9 (1.1) 2.0 (0.7)
 Don’t know 0 (0.0) 1.5 (1.4) 0 (0.0) 0.2 (0.2)
Practice size
 1 22.4 (4.1) 7.8 (3.3) 37.6 (3.7) 27.2 (2.5)
 2–5 43.3 (4.8) 47.2 (6.2) 24.4 (3.3) 35.0 (2.6)
 6–15 18.7 (3.6) 19.8 (4.9) 10.1 (2.3) 14.8 (1.9)
 16–49 0.9 (0.9) 20.8 (5.0) 4.1 (1.5) 5.6 (1.2)
 50–99 1.6 (1.1) 0 (0.0) 2.9 (1.3) 2.0 (0.7)
 100+ 13.2 (3.6) 4.5 (2.5) 21.0 (3.1) 15.5 (2.0)
***Chi-squared test: p < 0.001.

Practice Ownership Type

The large majority of respondents reported working in office-based practices (70.6%) and in physician-owned practices (70.9%).

Practice Size

Overall, respondents most frequently reported working in practice settings with 2–5 full or part-time physicians (35%), but 15.5% reported working in practices with over 100 physicians.

How Physicians Spend Their Time

On average, respondents spent 51.3 hours in direct patient care activities during their last complete week of work (Table 3). Surgeons reported the most patient care hours with a mean of 54.8, and radiation oncologists reported the fewest with a mean of 41.9 hours.

Table 3.

How Physicians Spend Their Time, Mean or % (SE)

Medical oncologist (n = 111) Radiation oncologist (n = 66) Surgeon (n = 171) All (n = 348)
Mean hours worked in direct patient care*** 51.0 (1.6) 41.9 (1.2) 54.8 (1.6) 51.3 (1.0)
Teaching involvement*
 0–1 day/month 56.7 (4.8) 68.3 (5.7) 53.9 (3.8) 56.9 (2.7)
 2–5 days/month 28.0 (4.3) 18.3 (4.8) 20.4 (3.1) 22.9 (2.3)
 6–15 days/month 8.7 (2.6) 5.9 (2.9) 7.4 (2.0) 7.6 (1.4)
 >15 days/month 7.6 (2.5) 7.5 (3.3) 18.4 (3.0) 12.7 (1.8)
Average number of minutes scheduled for
 Discussion of treatment options*** 56.0 (1.9) 59.7 (1.8) 39.6 (1.2) 48.9 (1.1)
 Routine follow-up for patient on chemotherapy*** 18.3 (0.6) 11.7 (0.6) 14.5 (0.5) 15.4 (0.3)
 Routine check-up with a patient after completion of chemotherapy and radiation therapy*** 17.3 (0.6) 20.3 (0.8) 14.7 (0.4) 16.5 (0.3)
Medical oncologists only: On average, how frequently per cycle do you schedule routine follow-up visits with breast cancer patients receiving chemotherapy?
 1 59.1 (4.8) n/a n/a n/a
 2 22.3 (4.0) n/a n/a n/a
 >2 5.4 (2.2) n/a n/a n/a
 As needed: No set routine 13.2 (3.3) n/a n/a n/a
Radiation oncologists only: On average, how frequently do you schedule routine, on-treatment check-ups with breast cancer patients during the course of radiation treatment?
 Less than once per week n/a 13.5 (4.2) n/a n/a
 Once per week n/a 85.0 (4.4) n/a n/a
 More than once per week n/a 1.6 (1.6) n/a n/a
 As needed: no set routine n/a 0 n/a n/a
Surgeons only: During the 12 months following definitive breast surgery, on average, how frequently do you schedule routine follow-up visits with breast cancer patients for post-operative care?
 0 n/a n/a 0.6 (0.6) n/a
 1 n/a n/a 10.1 (2.3) n/a
 2 n/a n/a 16.8 (2.9) n/a
 >2 n/a n/a 53.5 (3.8) n/a
 As needed: no set routine n/a n/a 19.0 (0.3) n/a
*Chi-squared test: p < 0.05; ***Lincom comparison of means: p < 0.001.

Overall, the majority (56.9%) reported minimal teaching involvement at 0–1 day per month, though 22.9% reported 2–5 teaching days per month.

Visit Duration

Time scheduled for different types of visits varied with specialty type. For visits involving an initial consultation and discussion of treatment options with a new patient, surgeons reported the briefest durations (mean = 39.6 minutes), while radiation oncologists reported the longest (mean = 59.7 minutes) (p < 0.001). Medical oncologists reported the longest routine, on-treatment follow-up visits (mean = 18.3 minutes) compared to radiation oncologists or surgeons (11.7 and 14.5 minutes, respectively; p < 0.0001), while radiation oncologists reported the longest follow-up visits after treatment completion, with a mean of 20.3 minutes compared to medical oncologists or surgeons (17.3 and 14.7 minutes, respectively; p < 0.0001).

Frequency of Follow-up Visits

Physicians were queried about their typical frequency of follow-up visits and whether such visits were scheduled on an as-needed basis with no set routine. Responses varied according to specialty type. No radiation oncologists reported scheduling these visits on an as-needed basis, while 13.2% of medical oncologists reported that they have no set routine. Surgeons were asked about postoperative follow-up visits during the 12 months following definitive breast surgery (as compared with on-treatment follow-up). Among surgeons, 19.0% reported scheduling these visits as needed with no set routine.

Career Satisfaction

Levels of reported satisfaction were high; responses were dichotomized as very or somewhat satisfied as compared with all other responses (Table 4). For all specialties combined, 91.8% reported being very or somewhat satisfied with their current specialty, 90.1% with their decision to become a physician, 89.0% with their current work setting, and 88.9% with their overall professional career. Physicians were least satisfied with the amount of time spent with patients, with 81.5% reporting being very or somewhat satisfied. These rates did not vary significantly by specialty type.

Table 4.

Physician Satisfaction: % Reporting Very or Somewhat Satisfied (SE)

Medical oncologist (n = 111) Radiation oncologist (n = 66) Surgeon (n = 171) All (n = 348)
Satisfaction with the following areas of medical practice
 Your current practice specialty 91.8 (2.6) 93.8 (3.0) 91.2 (2.2) 91.8 (1.5)
 Your decision to become a physician 91.5 (2.6) 92.3 (3.3) 88.2 (2.5) 90.1 (1.6)
 Your current work setting 90.3 (2.8) 86.1 (4.3) 88.9 (2.4) 89.0 (1.7)
 Your overall professional career 89.6 (2.9) 92.4 (3.3) 87.1 (2.6) 88.9 (1.7)
 Extent to which this practice has met your professional expectations 88.5 (3.0) 89.1 (3.9) 85.4 (2.7) 87.1 (1.8)
 The amount of time you can spend with a patient 79.6 (3.8) 89.4 (3.8) 80.3 (3.0) 81.5 (2.1)
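
For illustration only, the dichotomization used in Table 4 (very or somewhat satisfied versus all other responses) could be coded as in the sketch below; the column names and response labels are assumptions rather than items taken from the actual survey data set.

```python
# Illustrative sketch: collapse a five-point satisfaction item into a binary
# "very or somewhat satisfied" indicator, then compute a weighted percentage.
# File, column names, and response labels are hypothetical.
import pandas as pd

respondents = pd.read_csv("respondents_weighted.csv")

satisfied_levels = {"very satisfied", "somewhat satisfied"}
respondents["satisfied_time_with_patients"] = (
    respondents["satisfaction_time_with_patients"]
    .str.strip()
    .str.lower()
    .isin(satisfied_levels)
)

# Nonresponse-weighted percentage reporting very or somewhat satisfied
pct_satisfied = 100 * (
    (respondents["satisfied_time_with_patients"] * respondents["response_weight"]).sum()
    / respondents["response_weight"].sum()
)
print(round(pct_satisfied, 1))
```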

DISCUSSION

These results provide a much-needed framework for understanding the relationships between structural factors and the outcomes of breast cancer care.

Our study found a diverse population of surgeons and oncologists practicing in Los Angeles relative to reports from other regions, with one-third of physicians reporting a race/ethnicity other than white. We found higher proportions of female (13%) and non-white (32%) surgeons than a recent study in Pennsylvania (36), which reported 6% female and 16% nonwhite (1.9% black, 14% other) surgeons, and than another multi-regional study of surgeons (1% black, 9% other) (13). Our surgeon age and gender findings were similar to those of a previous study of surgeons practicing in Detroit and Los Angeles (23). Nearly half of the physicians providing breast cancer care in Los Angeles reported speaking a language in addition to English, most frequently Spanish.

Provider volume has important implications for quality of care. Surgeons with low cancer procedure volumes have been found to have worse patient outcomes (39,40). In the specific context of breast cancer, surgeon volumes of five or fewer and of fewer than 30 breast cancer operations annually have been associated with higher 5-year adjusted mortality (22,24). We are aware of no work that has assessed the impact of oncologist volume on outcomes.

We found a mean new breast cancer patient volume of 7.8 per month for all specialties combined, ranging from nearly six per month for surgeons to 8.9 for medical oncologists and 12.0 for radiation oncologists. In this study, physicians were asked about their number of new patient visits rather than the number of procedures performed. It is likely that some proportion of such visits were for second opinions and would not have led to surgery or treatment. Indeed, previous work has found that over 33% of breast cancer patients consulted with more than one surgeon prior to surgery (41). Although differing measurement methods make comparisons difficult, our results regarding surgeon volumes appear to contrast with the lower surgeon breast cancer procedure volumes noted in the existing literature. Bickell and colleagues found a median surgeon volume of fewer than five breast cancer surgeries annually in the New York metropolitan area (26). Other work has found that over 80% of surgeons performed five or fewer breast surgeries annually (22). In a multi-regional study (21), over three-fourths of surgeons who performed breast surgeries had one or fewer breast cancer cases per month. In contrast, less than 2% of surgeons in our study reported fewer than one new breast cancer patient per month, and 16% reported one or fewer per month, comparable to the findings of a study in Los Angeles and Detroit in which 11.5% of surgeons reported performing 10 or fewer breast surgeries annually (42).

A number of factors in addition to the counting of visits rather than procedures may have contributed to the higher volumes reported by surgeons in this study compared with others. Volumes in the literature have been measured among surgeons associated with differing patient samples, with variations in age range (21), inclusion of patients with metastatic disease and/or DCIS (22,26), and geographic region (24,26). Our physicians were identified by a population-based cohort of women ages 50 and over, including those with DCIS and metastatic disease. Physicians were asked to report average monthly volume rather than annual volume in order to minimize recall bias. It is also possible that survey nonresponse biased our volume results, as physicians with higher volumes of breast cancer study patients were more likely to respond to our survey. However, our high response rate may make our findings more robust compared with other cross-sectional physician surveys, and the impact of any nonresponse bias was at least partially addressed by using volume as one factor in weighting for nonresponse.

We found a high degree of group affiliation among cancer care providers in Los Angeles. Overall, the highest proportion of respondents reported practicing in single specialty groups. With multi-specialty groups and group-model HMOs combined, 61% of respondents reported being in group practice. Surgeons stood out as the specialists most likely to be in solo practice (43%), although 50% of surgeons reported being in some form of group practice. This finding is consistent with a study of surgeons in Pennsylvania, where nearly half were in private group practice, and 40% practiced solo (33).

We found that a substantial proportion of breast cancer-treating physicians were practicing in group-model HMOs: 17% of surgeons, 15% of medical oncologists, and 12% of radiation oncologists. In contrast, only 2% of ASCO members reported an HMO practice setting in a national survey (43). However, despite the growth of large consolidated medical organizations in California, the majority of breast cancer-treating physicians in this study reported at least some personal ownership interest in their practice.

This is the first study of which we are aware to examine career satisfaction specifically among physicians caring for breast cancer patients, and is one of the few to make comparisons among medical oncologists, radiation oncologists, and surgeons.

Levels of satisfaction with various aspects of respondents’ careers were generally very high at the time of this survey. In comparison, previous studies have found rates of career satisfaction of 79–80% among medical and surgical oncologists (43,44), and 80% among primary care physicians (45). These results may be viewed as reassuring given the high levels of stress associated with cancer patient care, as well as administrative, reimbursement, and management duties and growing regulatory and administrative burdens imposed by private and public payers (46). It is possible that physicians specializing in breast cancer face somewhat lower levels of stress associated with end-of-life issues, given the better prognosis for breast cancer overall as compared with many other cancers. It is also possible that physicians in large West Coast cities such as Los Angeles, with a high degree of managed care penetration in earlier years, may have adapted to many of these challenges earlier than physicians in other regions. However, changes in Medicare reimbursement for chemotherapy infusions implemented subsequent to the study period may have challenged satisfaction levels among medical oncologists.

CONCLUSION

To date, little has been documented in the literature about the distributions of cancer care provider characteristics. These data show that the settings in which Los Angeles medical oncologists practice, the services available in those settings, and the ways in which these cancer care providers are reimbursed are quite varied. Although many medical oncologists are in solo or single-specialty practice, many now work in integrated settings with other provider types. High proportions of patients with managed care coverage highlight the importance of learning what is inside the largely black box of managed care, and how the associated structural features and reimbursement systems impact processes of care and patient outcomes. These data represent essential building blocks for further analyses to determine the impact of the structural characteristics of care on the processes and outcomes of care that women with breast cancer ultimately experience. This work will have important implications for learning which structural arrangements support the best quality of cancer care and are therefore most worthy of the time and financial investments necessary to sustain and replicate them. Understanding the structure of care and the ways it influences the delivery of care and patient outcomes will allow us to isolate areas of excellent care that should be modeled, and to identify other areas where specific components of care can be improved or that need further well-focused research.

References

1. Donabedian A. Explorations in Quality Assessment and Monitoring: The Definition of Quality and Approaches to its Assessment. Ann Arbor, MI: Health Administration Press; 1972.
2. Landon BE, Reschovsky J, Reed M, et al. Personal, organizational, and market level influences on physicians’ practice patterns: results of a national survey of primary care physicians. Med Care. 2001;39:889–905. doi: 10.1097/00005650-200108000-00014.
3. Walton RT, Harvey E, Dovey S, et al. Computerised advice on drug dosage to improve prescribing practice. Cochrane Database Syst Rev. 2001:CD002894. doi: 10.1002/14651858.CD002894.
4. O’Brien MA, Freemantle N, Oxman AD, et al. Continuing education meetings and workshops: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2001:CD003030. doi: 10.1002/14651858.CD003030.
5. Renders CM, Valk GD, Griffin S, et al. Interventions to improve the management of diabetes mellitus in primary care, out-patient and community settings: a systematic review. Diabetes Care. 2001;24:1821–33. doi: 10.2337/diacare.24.10.1821.
6. Whelan T, Levine M, Willan A, et al. Effect of a decision aid on knowledge and treatment decision making for breast cancer surgery: a randomized trial. JAMA. 2004;292:435–41. doi: 10.1001/jama.292.4.435.
7. Kerr EA, Mittman BS, Hays RD. Quality assurance in capitated physician groups. Where is the emphasis? JAMA. 1996;276:1236–9.
8. Kerr EA, Hays RD, Mittman BS, et al. Primary care physicians’ satisfaction with quality of care in California capitated medical groups. JAMA. 1997;278:308–12.
9. Hargraves JL, Palmer RH, Orav EJ, et al. Practice characteristics and performance of primary care practitioners. Med Care. 1996;9(Suppl):SS67–76. doi: 10.1097/00005650-199609002-00007.
10. Stoddard JJ, Reed M, Hadley J. Financial incentives and physicians’ perceptions of conflict of interest and ability to arrange medically necessary services. J Ambul Care Manage. 2003;26:39–50. doi: 10.1097/00004479-200301000-00005.
11. Kramer S. The study of the patterns of cancer care in radiation therapy. Cancer. 1997;39:780–7. doi: 10.1002/1097-0142(197702)39:2+<780::aid-cncr2820390712>3.0.co;2-i.
12. Gillis CR, Hole DJ. Survival outcome of care by specialist surgeons in breast cancer: a study of 3786 patients in the west of Scotland. BMJ. 1996;312:145–8. doi: 10.1136/bmj.312.7024.145.
13. Thwin SS, Fink AK, Lash T, et al. Predictors and outcomes of surgeons’ referral of older breast cancer patients to medical oncologists. Cancer. 2005;104:936–42. doi: 10.1002/cncr.21256.
14. Laliberte L, Fennell M, Papadonatos G. The relationship of membership in research networks to compliance with treatment guidelines for early-stage breast cancer. Med Care. 2005;43:471–9. doi: 10.1097/01.mlr.0000160416.66188.f5.
15. Malin JL, Schuster M, Kahn KL, et al. Quality of breast cancer care: what do we know? J Clin Oncol. 2002;20:4381–93. doi: 10.1200/JCO.2002.04.020.
16. Lee-Feldstein A, Anton-Culver H, Lee-Feldstein PJ. Treatment differences and other prognostic factors related to breast cancer survival: delivery systems and medical outcomes. JAMA. 1994;271:485–90.
17. Lee-Feldstein A, Feldstein PJ, Buchmuller T, et al. The relationship of HMOs, health insurance, and delivery systems to breast cancer outcomes. Med Care. 2000;38:705–18. doi: 10.1097/00005650-200007000-00003.
18. Lee-Feldstein A, Feldstein PJ, Buchmueller T, et al. Breast cancer outcomes among older women: HMO, fee-for-service, and delivery system comparisons. J Gen Intern Med. 2001;16:189–99. doi: 10.1111/j.1525-1497.2001.91112.x.
19. Potosky AL, Merrill RM, Riley GF, et al. Breast cancer survival and treatment in health maintenance organization and fee-for-service settings. J Natl Cancer Inst. 1997;89:1654–5. doi: 10.1093/jnci/89.22.1683.
20. Riley GF, Potosky AL, Klabunde CN, et al. Stage at diagnosis and treatment patterns among older women with breast cancer: an HMO and fee-for-service comparison. JAMA. 1999;281:720–6. doi: 10.1001/jama.281.8.720.
21. Neuner JM, Gilligan MA, Sparapani R, et al. Decentralization of breast cancer surgery in the United States. Cancer. 2004;101:1323–9. doi: 10.1002/cncr.20490.
22. Skinner KA, Helsper JT, Deapen D, et al. Breast cancer: do specialists make a difference? Ann Surg Oncol. 2003;10:606–15. doi: 10.1245/aso.2003.06.017.
23. Hawley ST, Hofer T, Janz NK, et al. Correlates of between-surgeon variation in breast cancer treatments. Med Care. 2006;44:609–16. doi: 10.1097/01.mlr.0000215893.01968.f1.
24. Sainsbury R, Haward B, Rider L, et al. Influence of clinical workload and patterns of treatment on survival from breast cancer. Lancet. 1995;345:1251–2. doi: 10.1016/s0140-6736(95)90924-9.
25. Roohan P, Bickell NA, Baptiste MS, et al. Hospital volume differences and five-year survival from breast cancer. Am J Public Health. 1998;88:454–7. doi: 10.2105/ajph.88.3.454.
26. Bickell NA, Aufses AH, Chassin MR. The quality of early-stage breast cancer care. Ann Surg. 2000;232:220–4. doi: 10.1097/00000658-200008000-00012.
27. Somkin CP, Altschuler A, Ackerson L, et al. Organizational barriers to physician participation in cancer clinical trials. Am J Manag Care. 2005;11:413–21.
28. Bickell NA, Young GJ. Coordination of care for early-stage breast cancer patients. J Gen Intern Med. 2001;16:737–42. doi: 10.1111/j.1525-1497.2001.10130.x.
29. Mor V, Laliberte L, Petrsek AC, et al. Impact of breast cancer treatment guidelines on surgeon practice patterns: results of a hospital-based intervention. Surgery. 2000;128:847–61. doi: 10.1067/msy.2000.109530.
30. Oxman AD, Thomson MA, Davis C, et al. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995;153:1423–31.
31. Coia LR, Hanks GE. Quality assessment in the USA: how the patterns of care study has made a difference. Semin Radiat Oncol. 1997;7:146–56. doi: 10.1053/SRAO00700146.
32. Kanbour-Shakir A, Harris KM, Johnson RR, et al. Breast care consultation center: role of the pathologist in a multidisciplinary center. Diagn Cytopathol. 1997;7:191–6. doi: 10.1002/(sici)1097-0339(199709)17:3<191::aid-dc4>3.0.co;2-i.
33. Siminoff LA, Zhang A, Saunders CM, et al. Referral of breast cancer patients to medical oncologists after initial surgical management. Med Care. 2000;38:696–704. doi: 10.1097/00005650-200007000-00002.
34. Wagner EH, Glasgow RE, Davis C, et al. Quality improvement in chronic illness care: a collaborative approach. Jt Comm J Qual Improv. 2001;27:63–80. doi: 10.1016/s1070-3241(01)27007-2.
35. Gosden T. Impact of payment method on behaviour of primary care physicians: a systematic review. J Health Serv Res Policy. 2001;6:44–55. doi: 10.1258/1355819011927198.
36. Shortell SM, Zazzali JL, Burns LR, et al. Implementing evidence-based medicine: the role of market pressures, compensation incentives, and culture in physician organizations. Med Care. 2001;39(Suppl):I-62–I-78.
37. Conrad DA, Sales A, Liang SY, et al. The impact of financial incentives on physician productivity in medical groups. Health Serv Res. 2002;37:885–906. doi: 10.1034/j.1600-0560.2002.57.x.
38. American Medical Association Education and Research Foundation. Practical Patterns of Young Physicians, 1987a. Chicago, IL: American Medical Association Education and Research Foundation; 1987.
39. Begg CB, Cramer LD, Hoskins WJ, et al. Impact of hospital volume on operative mortality for major cancer surgery. JAMA. 1998;280:1747–51. doi: 10.1001/jama.280.20.1747.
40. Hillner BE, Smith TJ, Desch CE. Hospital and physician volume or specialization and outcomes in cancer treatment: importance in quality of cancer care. J Clin Oncol. 2000;18:2327–40. doi: 10.1200/JCO.2000.18.11.2327.
41. Katz SJ, Hofer TP, Hawley S, et al. Patterns and correlates of patient referral to surgeons for treatment of breast cancer. J Clin Oncol. 2007;25:271–6. doi: 10.1200/JCO.2006.06.1846.
42. Katz SJ, Lantz PM, Janz NK, et al. Surgeon perspectives about local therapy for breast carcinoma. Cancer. 2005;104:1854–61. doi: 10.1002/cncr.21396.
43. Whippen DA, Canellos GP. Burnout syndrome in the practice of oncology: results of a random survey of 1,000 oncologists. J Clin Oncol. 1991;9:1916–20. doi: 10.1200/JCO.1991.9.10.1916.
44. Kuerer HM, Eberlein TJ, Pollock RE, et al. Career satisfaction, practice patterns and burnout among surgical oncologists: report on the quality of life of members of the Society of Surgical Oncology. Ann Surg Oncol. 2007;14:3043–53. doi: 10.1245/s10434-007-9579-1.
45. Landon BE, Reschovsky J, Blumenthal D. Changes in career satisfaction among primary care and specialist physicians, 1997–2001. JAMA. 2003;289:442–9. doi: 10.1001/jama.289.4.442.
46. Einhorn LH, Levinson J, Li S, et al. American Society of Clinical Oncology 2001 Presidential Initiative: impact of regulatory burdens on quality cancer care. J Clin Oncol. 2002;20:4722–6. doi: 10.1200/JCO.2002.02.011.
