Abstract
Understanding how health care system structures, processes, and available resources facilitate and/or hinder the delivery of quality cancer care is imperative, especially given the rapidly changing health care landscape. The emerging field of cancer care delivery research (CCDR) focuses on how organizational structures and processes, care delivery models, financing and reimbursement, health technologies, and health care provider and patient knowledge, attitudes, and behaviors influence cancer care quality, cost, and access and ultimately the health outcomes and well-being of patients and survivors. In this article, we describe attributes of CCDR, present examples of studies that illustrate those attributes, and discuss the potential impact of CCDR in addressing disparities in care. We conclude by emphasizing the need for collaborative research that links academic and community-based settings and serves simultaneously to accelerate the translation of CCDR results into practice. The National Cancer Institute recently launched its Community Oncology Research Program, which includes a focus on this area of research.
INTRODUCTION
US health care systems are evolving rapidly in response to concerns about quality, cost, reimbursement, and access. Practices are merging or affiliating with health systems, and health systems are consolidating.1 New payment approaches that incorporate explicit cost and quality goals have spurred experimentation with service delivery models, such as accountable care organizations, shared savings approaches, expanded roles for interdisciplinary team members, and the patient-centered medical home. Cancer care has not been immune to these changes, and expert reviews have drawn attention to the various ways care delivery has fallen short of its goal of providing consistent, high-quality cancer care in diverse settings.2,3 In response, researchers and practitioners have sought to strengthen the evidence base that supports cancer care provision, enhance care coordination, and improve both patient outcomes and care experiences. Achieving these goals within the context of a changing health care system is challenging; the evolving incentive structures sometimes support but often impede these improvements.
This combination of a dynamic health care environment and the need to further improve the quality, consistency, and efficiency of care demands a better understanding of how cancer care is delivered across diverse settings and of the financial, organizational, and behavioral factors that interact with clinical knowledge to shape outcomes. Cancer care has distinctive characteristics; these include complex provider and patient decision making, an evolution of patient needs along the cancer control continuum, and the requirement to coordinate multiple treatment modalities and transitions in care within a multidisciplinary setting.4 These characteristics motivate a care delivery research effort in the oncology context that integrates system-wide factors and addresses the issues unique to cancer. The emerging field of cancer care delivery research (CCDR) offers an opportunity to examine these factors with greater attention to their multilevel influence on patient outcomes.
This article defines CCDR and identifies five core attributes using illustrative examples of practice-changing care delivery research conducted in oncology or general practice settings. We considered a study to be potentially practice changing if it generated evidence sufficiently robust and generalizable to warrant application by clinicians, organizations, and policy makers. Examples were assembled from seminal reviews, empirical evidence, and discussions with health services researchers and are provided to highlight key issues, research needs, and opportunities. We give special attention to the important contribution CCDR makes toward addressing health disparities. Considerations for expanding CCDR in the community setting are also addressed, including the ongoing challenges of capturing research-quality data from rapidly evolving electronic health records and related information systems.5 We conclude with an introduction to the recently launched National Cancer Institute (NCI) Community Oncology Research Program (NCORP), a network that builds on previous NCI investments in community-based clinical research,6–8 and includes a focus on community-based CCDR.
CONCEPTUALIZING CCDR
Building on the definition of health services research proposed by Lohr and Steinwachs,9 the NCI characterizes CCDR as “the multidisciplinary field of scientific investigation that studies how social factors, financing systems, organizational structures and processes, health technologies, and health care provider and patient behaviors affect access to cancer care, the quality and cost of cancer care, and ultimately the health and well-being of cancer patients and survivors.”10 A hallmark of CCDR is the generation of new knowledge to inform practice change, defined as clinically important and sustained modification of the structures and processes of cancer care delivery to improve clinical outcomes, enhance the patient experience, and optimize value.
The scope of CCDR encompasses individuals, families, providers, teams and health care organizations, payers, policy makers, communities, populations, and their interactions.9,11 CCDR is distinguished from quality improvement by its focus on developing new and generalizable knowledge about the effectiveness, acceptability, cost, optimal delivery mode, active ingredients, and causal mechanisms that influence outcomes and affect the value of cancer care, across diverse settings and populations.12 Quality improvement activities focus on systematic efforts by a health care organization or system to monitor and raise the standards of system performance and care delivery.13 CCDR findings can be applied by decision makers at the local level to inform quality improvement activities and reimbursement policies, as well as in the development of practice guidelines and standards.
CCDR complements cancer clinical trials, which typically focus on the efficacy and safety of new therapies and the testing of new approaches to prevention and supportive care within the confines of the controlled trial setting. Understanding how multilevel systems and contextual factors influence the structures, processes, and outcomes of cancer care delivery is essential to enacting practice change14 and is especially important when patient characteristics alone are insufficient to explain variations in outcomes. CCDR also provides an opportunity to examine how clinician and health system factors affect equity in cancer care for diverse racial/ethnic groups and medically underserved populations.
Although historically, most health care delivery research has been conducted in primary care, early examples of CCDR within community settings, such as the NCI Community Clinical Oncology Program8 and the NCI Community Cancer Centers Program,6 have informed changes in community oncology practice. CCDR study designs include observational studies based on analyses of existing or newly collected data on patients, clinicians, and/or organizations engaged in the practice of oncology; pragmatic trials; intervention studies testing new approaches and processes of care delivery; and dissemination and implementation studies comparing approaches to improve oncology practice.
ATTRIBUTES OF CCDR
We believe that CCDR studies are likely to have the greatest impact on practice change when they encompass one or more of five attributes: (i) address problems salient to patients and clinicians, (ii) engage clinicians as active collaborators in the design and conduct of studies, (iii) use standardized measures of health care quality, (iv) examine causal pathways and active ingredients of practice change, and (v) incorporate diverse settings and samples (Fig 1). These attributes were identified through literature review and extensive discussions with CCDR experts.
Fig 1. Attributes of cancer care delivery research (CCDR) that can lead to evidence-based practice change.
Saliency to Patients and Clinicians
CCDR studies that are most likely to improve practice are those focused on problems of importance to both patients and clinicians. Study aims that incorporate patient and clinician perspectives and reflect the constraints and incentives in real-world practice settings have the greatest likelihood of leading to meaningful and sustained practice changes.15 For example, addressing symptoms and well-being is an essential component of effective palliative care, and these outcomes are highly salient to patients, families, and clinicians. The Australian Palliative Care Outcomes Collaboration has identified clinician- and patient-reported measures that reflect the delivery of high-quality palliative care, systematically incorporating standardized data elements into a national system for audit, clinician feedback, and practice benchmarking designed to drive improvement in the delivery of palliative care. Importantly, the system also allows programs to compare and contrast models of service delivery and deployment of resources, motivating program enhancements that improve the value of care while remaining responsive to variations in patient needs and preferences based on local context. A recent study demonstrated that collecting information important to both patients and clinicians at the point of care and incorporating that information into a system that provides feedback and benchmarking result in improved clinical outcomes in symptom control and patient/family well-being.16
Clinician Collaboration in Design and Conduct of Studies
An important element of CCDR is that clinicians are actively engaged in the design and conduct of research aimed at improving care delivery processes and clinical outcomes. This is especially important in complex intervention studies, where research design and data collection occur at multiple levels. Involving front-line clinicians in study design and implementation strengthens external validity and yields important insights about intervention feasibility and acceptability.17
In collaboration with clinicians, investigators in the outpatient breast cancer chemotherapy clinic at Dana-Farber Cancer Institute developed an intervention to reduce errors and gaps in care through improved communication.18 The team developed tools and procedures to reduce risks in priority process areas by improving communication and interactions among various team members. Implementation of these interventions improved safety, efficiency, quality, and satisfaction for both patients and clinicians.18
The STEP-UP (Study to Enhance Prevention by Understanding Practice) trial provides an example of a collaboratively developed care delivery study in a primary care setting. This group-randomized trial engaged 79 practices to determine whether a set of tailored interventions would improve preventive services delivery in three areas: health-habit counseling, cancer screening, and immunization.19 Practices in the intervention group received a nurse-led assessment of their current performance in these three areas. They were then presented with tools and approaches for practice improvement, and teams chose which strategies to implement in their particular setting. The intervention group also received practice facilitation, as well as feedback and peer benchmarking. One and 2 years after implementation, the intervention group showed a significant and sustained increase in the frequency of both health-habit counseling and screening compared with the control group.20
Use of Standardized Measures of Care Quality
Standardized structure, process, and outcome measures of care quality are an essential feature of CCDR.21 These indicators provide a rich source of ideas for CCDR by identifying practice variation and targets for practice change. CCDR studies that incorporate well-accepted quality measures are also more likely to be feasible across diverse settings and to generate findings that can be directly applied in practice. Oncology quality measures have been identified by organizations including the American Society of Clinical Oncology (ASCO), the American Society of Radiation Oncology, the Oncology Nursing Society, the American College of Surgeons Commission on Cancer, the National Comprehensive Cancer Network, and the National Quality Forum. For example, the National Surgical Quality Improvement Program, a hospital-based data collection system sponsored by the American College of Surgeons, is considered the gold standard for evaluating surgical outcomes in cancer.22 Program data revealed that many breast surgery centers had higher-than-expected tumor re-excision rates resulting from positive surgical margins identified after initial tumor resection.23 McCahill et al24 subsequently conducted an observational study at four practice sites. They found that variability in re-excision rates was explained by inconsistencies among practices in the minimum distance required for clear margins rather than by patient factors or a practice's procedure volume. Variability in clear margin distances was found both among institutions and among surgeons within an institution. These findings motivated the development of guidelines addressing margin distances in breast-conserving surgery for women with early-stage breast cancer.25,26 Future CCDR studies might explore whether guideline-concordant care is being delivered and identify the factors that contribute to practice variation after guideline dissemination, which in some contexts may include weak evidence or patient preferences.
Providing feedback to clinicians in the form of quality performance data and benchmarking against peers can serve as a component of a practice-change intervention and has shown some efficacy in improving care delivery processes and outcomes.27 The ASCO Quality Oncology Practice Initiative (QOPI) is a practice-based quality monitoring program that collects data from participating practices on a wide range of process measures, including the timeliness of care delivery and adherence to national guidelines. QOPI provides practices with an analysis of their own data as well as comparisons benchmarked to other participating practices. Oncology practices with sustained participation in this program of routine quality monitoring and feedback reported better performance (eg, in quality measures of pain care and documentation of hospice and palliative care discussions) than practices participating in QOPI for the first time.28 A similar quality reporting system—the Rapid Quality Reporting System (RQRS)—is a cancer registry–based data system that provides continuous performance feedback and benchmarking to participating hospitals. An evaluation comparing performance before and after RQRS implementation at 41 community cancer centers demonstrated significant improvement in five breast and colorectal cancer (CRC) quality metrics after RQRS implementation.29 Guideline adherence also increased by 39% and 25%, respectively, in uninsured and Medicaid patients receiving radiation therapy after breast-conserving surgery.30
Examination of Causal Pathways and Active Ingredients of Practice Change
CCDR studies that yield the greatest insights include a focus on isolating the causal mechanisms and active ingredients of a practice-change intervention. Colquhoun et al12 recently specified a framework describing considerations for developing, targeting, and testing interventions for practice change. They propose that successful practice-change interventions are supported by empiric knowledge about the active ingredients in the intervention, the causal mechanisms (processes or mediators) by which an intervention effects change, the mode of delivery, and the intended targets of the intervention.
A recent demonstration project illustrates the importance of linking structure and process measures with outcomes. Clinical leaders in four federally qualified health centers were trained to deliver a practice facilitation intervention that incorporated a tracking system for cancer screening and engaged primary care teams in identifying and implementing strategies to improve uptake of effective screening tests in underserved populations. The centers monitored progress across several process measures: proportion of eligible patients screened for breast, cervical, and colorectal cancers; proportion screened who received timely results; and proportion of abnormal screens definitively evaluated within 90 days.31 From 2005 to 2007, CRC screening rates increased from 8.6% to 21.2%, and the program was scaled to more than 50 participating centers. Additional studies are needed to confirm that improvements are sustainable and to examine which aspects represent the active ingredients in this practice-change intervention.
The underlying mechanisms and active ingredients of an intervention are perhaps most apparent when process measures are linked to resultant health outcomes. A study in 1999 reported high prescription rates of hematopoietic colony-stimulating factors (CSFs) for indications outside of the ASCO guidelines for prophylaxis and management of neutropenia.32 A subsequent analysis similarly showed that CSF prescription in patients at risk for neutropenia was not supported by ASCO guidelines.33 ASCO identified reduction in guideline-discordant CSF prescription as one of its top-five strategies for improving care and decreasing cost.34,35 A recent study in 22 community oncology practices providing service to nearly 100,000 patients demonstrated that a peer-to-peer consultative intervention increased guideline concordance in CSF prescription and produced substantial cost savings, with no increase in febrile neutropenic events.36
Russell et al37 recently reported the findings of a multicenter randomized controlled trial to improve lung cancer treatment outcomes. The CCDR intervention consisted of a baseline audit of care quality and patient experience, peer review, and coaching of teams to develop individualized quality improvement plans. Although the intervention was effective in optimizing the interval from diagnosis to treatment, the active ingredients in the intervention were not disentangled. In another example, an ongoing study to improve prostate cancer outcomes using a multifaceted practice improvement intervention and a stepped-wedge, mixed-methods design will yield information about efficacy and isolate the mechanisms of change and active ingredients within the intervention.38 Mixed methods are an important consideration in the design of CCDR studies and can help to evaluate the acceptability of a practice-change intervention, gauge implementation fidelity, and identify the active ingredients and mechanisms of provider and organizational change.
Incorporation of Diverse Settings and Samples
For CCDR results to be reproducible and widely generalizable, samples and settings should be representative of the diverse contexts of care delivery. For example, a single-site study testing routine electronic symptom surveillance and alert-triggered symptom management in patients undergoing cancer surgery showed positive effects on symptom severity.39 However, a multisite study of a similar intervention in patients receiving cancer chemotherapy did not confirm these benefits.40 Thus, as intervention testing moves from efficacy to effectiveness, pragmatic trial designs and samples drawn from community-based settings and diverse populations become essential. Evidence developed in multisite, community-based contexts may also accelerate implementation of research-tested interventions in routine clinical care. As an example, in a quasi-experimental study conducted among underserved Latina women with a new breast cancer diagnosis, a higher proportion of women who received a navigation intervention initiated treatment within 30 days, and navigation also reduced the time from diagnosis to treatment initiation to 21 days (v 48 days in control group).41 These results were independent of cancer stage at diagnosis and other characteristics at the individual or clinic level. A cluster-randomized trial of nurse navigation also demonstrated that women with breast cancer who receive the intervention have fewer unmet psychosocial needs, have a more favorable patient experience, and report more favorable ratings of care coordination and information provision.42
On the basis of the body of evidence from trials showing the efficacy of community-based models of patient navigation, particularly for underserved populations, the Commission on Cancer identified the provision of navigation programs as one of its patient-centered standards for program accreditation.43 At the same time, implementation of navigation programs has varied widely, and comparative evidence about the cost effectiveness of different navigation models is still needed.44 One potential reason for uneven adoption may be that many navigation studies have been conducted in a small number of clinics or within regionally based health systems and therefore may not be viewed as having general applicability. CCDR studies that address the cost effectiveness of different navigation models and barriers to scaling across diverse community settings will provide important knowledge to support widespread adoption.
Another example of a study matched to its implementation setting is the evaluation conducted by Kaiser Permanente Southern California (KPSC) of the effectiveness of aromatase inhibitors (AIs) and tamoxifen in reducing contralateral breast cancer incidence. The study used a retrospective cohort design and sampled from 15 medical centers and more than 100 outpatient clinics.45 Results showed that risk reduction was greatest in the group with the highest adherence to AIs and tamoxifen, leading KPSC to incorporate AI and tamoxifen adherence measures into its electronic health record. KPSC then developed a program of systematic outreach to women who were overdue for prescription refills, resulting in more timely refill rates.46 This study also illustrates how data routinely collected in health records can highlight suboptimal practice patterns and be used simultaneously to drive enhancements to care delivery protocols. Understanding how to tailor this new care pathway for implementation in organizational settings beyond KPSC would enhance generalizability.
The importance of employing diverse settings and samples in CCDR is also exemplified by the Colorectal Cancer Care Collaborative established by the Department of Veterans Affairs (VA) to reduce the time from a positive CRC screening test to diagnostic colonoscopy.47 Building on the success of this collaborative, all VA facilities were required to submit a flow map of their local CRC screening and diagnostic process, describe their quality improvement activities to enhance fecal occult blood testing follow-up, and report quarterly the proportion of patients who screened fecal occult blood test positive and who received a diagnostic colonoscopy within 60 days. From these national data, Powell et al48 determined that two infrastructural improvements and three process improvements were predictive of higher proportions of patients receiving a timely diagnostic colonoscopy. Understanding how to generalize these process improvements beyond the VA could advance care delivery processes across diverse settings.
ADDRESSING HEALTH CARE DISPARITIES
In developing CCDR studies that address cancer care disparities, all of the attributes of CCDR highlighted remain relevant. However, because of the lingering effects of historical injustices on current perceptions and attitudes, as well as barriers related to language, culture, knowledge, and understanding, CCDR studies of cancer care disparities require careful attention to patient perspectives and to the often essential role of community-based organizations as mediators of practice improvement.7,49 In addition, the negative impact of cancer-related financial burden on well-being, access to care, and clinical outcomes is recognized increasingly as an issue for an even broader range of patients than those traditionally conceptualized as underserved,50 and is thus a crucial research topic for CCDR.
In one example of a CCDR intervention to address health care disparities, the state of Delaware implemented a program to reduce racial disparities in CRC incidence and mortality.51 The three-part program increased CRC screening among minority groups, improved access of these groups to CRC treatment, and implemented patient navigation to improve efficiency and continuity of care. Subsequently, state-level disparities between whites and blacks in rates of screening colonoscopy, CRC incidence, and CRC mortality were virtually eliminated. The program included statewide reimbursement for colonoscopy for uninsured residents with incomes up to 250% of the federal poverty level, deployed nurse navigators at each of the five acute care hospital sites in the state, and established the Delaware Cancer Treatment Program to cover the costs of cancer care for 2 years for uninsured residents with household incomes up to 650% of the federal poverty level. Future research is needed to understand the cost effectiveness of wide-scale implementation of this model program.
EXPANDING CCDR IN COMMUNITY SETTINGS
What we have learned to date about the capacity of CCDR to influence practice change is based largely on research conducted in integrated health care delivery settings (eg, VA, Kaiser Permanente) and academic medical centers. This is because such integrated settings are often better equipped to facilitate the rapid and widespread implementation of practice changes and typically have data resources that support outcomes evaluation. In addition, CCDR studies conducted outside these integrated systems are often performed at only one site or in one geographic region, thereby limiting their generalizability. There are also many barriers to integrating CCDR findings across settings that use different health information technology systems.52 Thus, implementing these practice changes beyond their original study settings remains a challenge.8
For CCDR to influence practice change across community settings, studies must involve participants drawn from a diverse range of community practice settings. Conducting CCDR within a broad network such as NCORP that includes independent community practices, system-affiliated practices, and safety-net institutions could substantially enhance the relevance and broaden the adoption of CCDR findings in routine practice.53 In addition, with some notable exceptions, we found few examples of multisite CCDR studies conducted in community settings that focused on reducing disparities for diverse and/or underserved populations. This gap might be addressed by CCDR networks that are more broadly community based and that specifically offer a network of care delivery sites that focus on improving cancer care services to underserved populations. In response to this need, in August 2014, NCI launched NCORP. This new program builds on 30 years of community-based clinical trials research experience.7,8 NCORP, which currently comprises 34 community sites, 12 minority/underserved sites, and seven research bases, provides a unique opportunity to conduct CCDR in settings that reflect real-world practice conditions and diverse patient populations. NCORP has the potential to contribute to much-needed progress in improving clinical outcomes and enhancing the efficiency, access, quality, and affordability of care across the full spectrum of oncology practice settings and the care continuum.
DELIVERING ON THE PROMISE OF CCDR
This article has described important characteristics of CCDR, which produces new knowledge that can be applied across diverse settings to improve cancer care quality and outcomes, and has provided examples of studies that support practice change. A key finding is that the impact of CCDR is greatest when it is conducted in diverse practice settings. Successful implementation of CCDR outside integrated health care systems will require multilevel partnerships across academic and community settings and a coordinated research infrastructure including integrated data systems. It will also require investment and commitment from health care providers and other key stakeholders to continuously improve quality and enact health policy that supports these efforts.
Acknowledgment
We thank our colleagues who provided valuable insights in the development of this article: Brenda Adjei, Ann Geiger, Maureen Johnson, Stephen Taplin, and Ashley Wilder Smith (National Cancer Institute); Terry Field (University of Massachusetts); Joanne Schottinger, James Dearing, and Debra Ritzwoller (Kaiser Permanente); Sara Greene, Diana Buist, and Ed Wagner (Group Health Cooperative); Martin Charns (Boston University); David Atkins and Brian Mittman (Department of Veterans Affairs); and Anne Brown Rodgers.
Footnotes
The findings and conclusions in this report are those of the authors and do not necessarily represent the official position of the National Cancer Institute or the Patient-Centered Outcomes Research Institute.
Authors' disclosures of potential conflicts of interest are found in the article online at www.jco.org. Author contributions are found at the end of this article.
AUTHORS' DISCLOSURES OF POTENTIAL CONFLICTS OF INTEREST
Disclosures provided by the authors are available with this article at www.jco.org.
AUTHOR CONTRIBUTIONS
Conception and design: All authors
Collection and assembly of data: Erin E. Kent, Sandra A. Mitchell, Steven B. Clauser
Data analysis and interpretation: Erin E. Kent, Sandra A. Mitchell, Kathleen M. Castro, Judith A. Hautala, Oren Grad, Worta J. McCaskill-Stevens
Manuscript writing: All authors
Final approval of manuscript: All authors
Cancer Care Delivery Research: Building the Evidence Base to Support Practice Change in Community Oncology
The following represents disclosure information provided by authors of this manuscript. All relationships are considered compensated. Relationships are self-held unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject matter of this manuscript. For more information about ASCO's conflict of interest policy, please refer to www.asco.org/rwc or jco.ascopubs.org/site/ifc.
Erin E. Kent
No relationship to disclose
Sandra A. Mitchell
No relationship to disclose
Kathleen M. Castro
No relationship to disclose
Darren A. DeWalt
Consulting or Advisory Role: Merck
Arnold D. Kaluzny
No relationship to disclose
Judith A. Hautala
No relationship to disclose
Oren Grad
Consulting or Advisory Role: AbbVie, Pfizer, Janssen, ForSight, Durata, Merck
Rachel M. Ballard
No relationship to disclose
Worta J. McCaskill-Stevens
No relationship to disclose
Barnett S. Kramer
No relationship to disclose
Steven B. Clauser
No relationship to disclose
REFERENCES
- 1. Dafny L. Hospital industry consolidation: Still more to come? N Engl J Med. 2014;370:198–199. doi: 10.1056/NEJMp1313948.
- 2. Fineberg HV. Foreword: Understanding and influencing multilevel factors across the cancer care continuum. J Natl Cancer Inst Monogr. 2012;2012:1. doi: 10.1093/jncimonographs/lgs009.
- 3. Spinks T, Albright HW, Feeley TW, et al. Ensuring quality cancer care: A follow-up review of the Institute of Medicine's 10 recommendations for improving the quality of cancer care in America. Cancer. 2012;118:2571–2582. doi: 10.1002/cncr.26536.
- 4. Taplin SH, Anhang Price R, Edwards HM, et al. Introduction: Understanding and influencing multilevel factors across the cancer care continuum. J Natl Cancer Inst Monogr. 2012;2012:2–10. doi: 10.1093/jncimonographs/lgs008.
- 5. Ross TR, Ng D, Brown JS, et al. The HMO Research Network Virtual Data Warehouse: A public data model to support collaboration. EGEMS (Wash DC). 2014;2:1–8. doi: 10.13063/2327-9214.1049.
- 6. Clauser SB, Johnson MR, O'Brien DM, et al. Improving clinical research and cancer care delivery in community settings: Evaluating the NCI community cancer centers program. Implement Sci. 2009;4:63. doi: 10.1186/1748-5908-4-63.
- 7. McCaskill-Stevens W, McKinney MM, Whitman CG, et al. Increasing minority participation in cancer clinical trials: The minority-based community clinical oncology program experience. J Clin Oncol. 2005;23:5247–5254. doi: 10.1200/JCO.2005.22.236.
- 8. Minasian LM, Carpenter WR, Weiner BJ, et al. Translating research into evidence-based practice: The National Cancer Institute Community Clinical Oncology Program. Cancer. 2010;116:4440–4449. doi: 10.1002/cncr.25248.
- 9. Lohr KN, Steinwachs DM. Health services research: An evolving definition of the field. Health Serv Res. 2002;37:7–9.
- 10. National Cancer Institute. NCI Community Oncology Research Program: Research areas. http://ncorp.cancer.gov/research.
- 11. National Cancer Institute. NCI Community Oncology Research Program. http://ncorp.cancer.gov.
- 12. Colquhoun H, Leeman J, Michie S, et al. Towards a common terminology: A simplified framework of interventions to promote and integrate evidence into health practices, systems, and policies. Implement Sci. 2014;9:51. doi: 10.1186/1748-5908-9-51.
- 13. Wang MC, Hyun JK, Harrison M, et al. Redesigning health systems for quality: Lessons from emerging practices. Jt Comm J Qual Patient Saf. 2006;32:599–611. doi: 10.1016/s1553-7250(06)32078-8.
- 14. McDonald K, Chang C, Schultz E. Through the Quality Kaleidoscope: Reflections on the Science and Practice of Improving Health Care Quality—Closing the Quality Gap: Revisiting the State of the Science. Rockville, MD: Agency for Healthcare Research and Quality; 2013.
- 15. Kahn K, Ryan G, Beckett M, et al. Bridging the gap between basic science and clinical practice: A role for community clinicians. Implement Sci. 2011;6:34. doi: 10.1186/1748-5908-6-34.
- 16. Currow DC, Allingham S, Yates P, et al. Improving national hospice/palliative care service symptom outcomes systematically through point-of-care data collection, structured feedback and benchmarking. Support Care Cancer. 2015;23:307–315. doi: 10.1007/s00520-014-2351-8.
- 17. Seifer SD, Michaels M, Collins S. Applying community-based participatory research principles and approaches in clinical trials: Forging a new model for cancer clinical research. Prog Community Health Partnersh. 2010;4:37–46. doi: 10.1353/cpr.0.0103.
- 18. Bunnell CA, Gross AH, Weingart SN, et al. High performance teamwork training and systems redesign in outpatient oncology. BMJ Qual Saf. 2013;22:405–413. doi: 10.1136/bmjqs-2012-000948.
- 19. Goodwin MA, Zyzanski SJ, Zronek S, et al. A clinical trial of tailored office systems for preventive service delivery: The Study to Enhance Prevention by Understanding Practice (STEP-UP). Am J Prev Med. 2001;21:20–28. doi: 10.1016/s0749-3797(01)00310-5.
- 20. Stange KC, Goodwin MA, Zyzanski SJ, et al. Sustainability of a practice-individualized preventive service delivery intervention. Am J Prev Med. 2003;25:296–300. doi: 10.1016/s0749-3797(03)00219-8.
- 21. Wright JD, Hershman DL. Measure for measure: The cost of improving quality. J Natl Cancer Inst. 2014;106:dju266. doi: 10.1093/jnci/dju266.
- 22. Cima RR, Lackore KA, Nehring SA, et al. How best to measure surgical quality? Comparison of the Agency for Healthcare Research and Quality Patient Safety Indicators (AHRQ-PSI) and the American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP) postoperative adverse events at a single institution. Surgery. 2011;150:943–949. doi: 10.1016/j.surg.2011.06.020.
- 23. Goldberg RF, Rosales-Velderrain A, Clarke TM, et al. Variability of NSQIP-assessed surgical quality based on age and disease process. J Surg Res. 2013;182:235–240. doi: 10.1016/j.jss.2012.10.925.
- 24. McCahill LE, Single RM, Aiello Bowles EJ, et al. Variability in reexcision following breast conservation surgery. JAMA. 2012;307:467–475. doi: 10.1001/jama.2012.43.
- 25. Buchholz TA, Somerfield MR, Griggs JJ, et al. Margins for breast-conserving surgery with whole-breast irradiation in stage I and II invasive breast cancer: American Society of Clinical Oncology endorsement of the Society of Surgical Oncology/American Society for Radiation Oncology consensus guideline. J Clin Oncol. 2014;32:1502–1506. doi: 10.1200/JCO.2014.55.1572.
- 26. Moran MS, Schnitt SJ, Giuliano AE, et al. Society of Surgical Oncology-American Society for Radiation Oncology consensus guideline on margins for breast-conserving surgery with whole-breast irradiation in stages I and II invasive breast cancer. Int J Radiat Oncol Biol Phys. 2014;88:553–564. doi: 10.1016/j.ijrobp.2013.11.012.
- 27. van Dam PA, Verkinderen L, Hauspy J, et al. Benchmarking and audit of breast units improves quality of care. Facts Views Vis Obgyn. 2013;5:26–32.
- 28. Campion FX, Larson LR, Kadlubek PJ, et al. Advancing performance measurement in oncology: Quality oncology practice initiative participation and quality outcomes. J Oncol Pract. 2011;7(suppl):31s–35s. doi: 10.1200/JOP.2011.000313.
- 29. Halpern MT, Spain P, Holden DJ, et al. Improving quality of cancer care at community hospitals: Impact of the National Cancer Institute Community Cancer Centers Program pilot. J Oncol Pract. 2013;9:e298–e304. doi: 10.1200/JOP.2013.000937.
- 30. Halpern MT, Spain P, Holden DJ, et al. Comparative health outcomes analysis for the NCCCP evaluation: Final report. http://ncccp.cancer.gov/files/Comparative-Health-Outcomes-Analysis-Report.pdf.
- 31. Taplin SH, Haggstrom D, Jacobs T, et al. Implementing colorectal cancer screening in community health centers: Addressing cancer health disparities through a regional cancer collaborative. Med Care. 2008;46(suppl):S74–S83. doi: 10.1097/MLR.0b013e31817fdf68.
- 32. Bennett CL, Weeks JA, Somerfield MR, et al. Use of hematopoietic colony-stimulating factors: Comparison of the 1994 and 1997 American Society of Clinical Oncology surveys regarding ASCO clinical practice guidelines—Health Services Research Committee of the American Society of Clinical Oncology. J Clin Oncol. 1999;17:3676–3681. doi: 10.1200/JCO.1999.17.11.3676.
- 33. Potosky AL, Malin JL, Kim B, et al. Use of colony-stimulating factors with chemotherapy: Opportunities for cost savings and improved outcomes. J Natl Cancer Inst. 2011;103:979–982. doi: 10.1093/jnci/djr152.
- 34. Schnipper LE, Smith TJ, Raghavan D, et al. American Society of Clinical Oncology identifies five key opportunities to improve care and reduce costs: The top five list for oncology. J Clin Oncol. 2012;30:1715–1724. doi: 10.1200/JCO.2012.42.8375.
- 35. Bennett CL, Djulbegovic B, Norris LB, et al. Colony-stimulating factors for febrile neutropenia during cancer therapy. N Engl J Med. 2013;368:1131–1139. doi: 10.1056/NEJMct1210890.
- 36. Fishman ML, Kumar A, Davis S, et al. Guideline-based peer-to-peer consultation optimizes pegfilgrastim use with no adverse clinical consequences. J Oncol Pract. 2012;8(suppl):e14s–e17s. doi: 10.1200/JOP.2012.000540.
- 37. Russell GK, Jimenez S, Martin L, et al. A multicentre randomised controlled trial of reciprocal lung cancer peer review and supported quality improvement: Results from the improving lung cancer outcomes project. Br J Cancer. 2014;110:1936–1942. doi: 10.1038/bjc.2014.146.
- 38. Brown BB, Young J, Smith DP, et al. Clinician-led improvement in cancer care (CLICC): Testing a multifaceted implementation strategy to increase evidence-based prostate cancer care—Phased randomised controlled trial: Study protocol. Implement Sci. 2014;9:64. doi: 10.1186/1748-5908-9-64.
- 39. Cleeland CS, Wang XS, Shi Q, et al. Automated symptom alerts reduce postoperative symptom severity after cancer surgery: A randomized controlled clinical trial. J Clin Oncol. 2011;29:994–1000. doi: 10.1200/JCO.2010.29.8315.
- 40. Mooney KH, Beck SL, Friedman RH, et al. Automated monitoring of symptoms during ambulatory chemotherapy and oncology providers' use of the information: A randomized controlled clinical trial. Support Care Cancer. 2014;22:2343–2350. doi: 10.1007/s00520-014-2216-1.
- 41. Ramirez A, Perez-Stable E, Penedo F, et al. Reducing time-to-treatment in underserved Latinas with breast cancer: The Six Cities Study. Cancer. 2014;120:752–760. doi: 10.1002/cncr.28450.
- 42. Wagner EH, Ludman EJ, Aiello Bowles EJ, et al. Nurse navigators in early cancer care: A randomized, controlled trial. J Clin Oncol. 2014;32:12–18. doi: 10.1200/JCO.2013.51.7359.
- 43. Fashoyin-Aje LA, Martinez KA, Dy SM. New patient-centered care standards from the Commission on Cancer: Opportunities and challenges. J Support Oncol. 2012;10:107–111. doi: 10.1016/j.suponc.2011.12.002.
- 44. Paskett ED, Harrop JP, Wells KJ. Patient navigation: An update on the state of the science. CA Cancer J Clin. 2011;61:237–249. doi: 10.3322/caac.20111.
- 45. Haque R, Ahmed SA, Fisher A, et al. Effectiveness of aromatase inhibitors and tamoxifen in reducing subsequent breast cancer. Cancer Med. 2012;1:318–327. doi: 10.1002/cam4.37.
- 46. Chlebowski RT, Kim J, Haque R. Adherence to endocrine therapy in breast cancer adjuvant and prevention settings. Cancer Prev Res (Phila). 2014;7:378–387. doi: 10.1158/1940-6207.CAPR-13-0389.
- 47. Jackson GL, Powell AA, Ordin DL, et al. Developing and sustaining quality improvement partnerships in the VA: The Colorectal Cancer Care Collaborative. J Gen Intern Med. 2010;25(suppl 1):38–43. doi: 10.1007/s11606-009-1155-x.
- 48. Powell AA, Gravely AA, Ordin DL, et al. Timely follow-up of positive fecal occult blood tests: Strategies associated with improvement. Am J Prev Med. 2009;37:87–93. doi: 10.1016/j.amepre.2009.05.013.
- 49. Wieder R, Teal R, Saunders T, et al. Establishing a minority-based community clinical oncology program: The University of Medicine and Dentistry of New Jersey, New Jersey Medical School–University Hospital Cancer Center experience. J Oncol Pract. 2013;9:e48–e54. doi: 10.1200/JOP.2012.000648.
- 50. Zafar SY, Peppercorn JM, Schrag D, et al. The financial toxicity of cancer treatment: A pilot study assessing out-of-pocket expenses and the insured cancer patient's experience. Oncologist. 2013;18:381–390. doi: 10.1634/theoncologist.2012-0279.
- 51. Grubbs SS, Polite BN, Carney J, Jr, et al. Eliminating racial disparities in colorectal cancer in the real world: It took a village. J Clin Oncol. 2013;31:1928–1930. doi: 10.1200/JCO.2012.47.8412.
- 52. Ross MK, Wei W, Ohno-Machado L. "Big data" and the electronic health record. Yearb Med Inform. 2014;9:97–104. doi: 10.15265/IY-2014-0003.
- 53. Margolis PA, DeWalt DA, Simon JE, et al. Designing a large-scale multilevel improvement initiative: The improving performance in practice program. J Contin Educ Health Prof. 2010;30:187–196. doi: 10.1002/chp.20080.

