Abstract
Objective:
To develop a measure of cancer services integration (CSI) that can inform clinical and administrative decision-makers in their efforts to monitor and improve cancer system performance.
Methods:
We employed a systematic approach to measurement development, including review of existing cancer/health services integration measures, key-informant interviews and focus groups with cancer system leaders. The research team constructed a Web-based survey that was field- and pilot-tested, refined and then formally conducted on a sample of cancer care providers and administrators in Ontario, Canada. We then conducted exploratory factor analysis to identify key dimensions of CSI.
Results:
A total of 1,769 physicians, other clinicians and administrators participated in the survey, responding to a 67-item questionnaire. The exploratory factor analysis identified 12 factors that were linked to three broader dimensions: clinical, functional and vertical system integration.
Conclusions:
The CSI Survey provides important insights on a range of typically unmeasured aspects of the coordination and integration of cancer services, representing a new tool to inform performance improvement efforts.
For more than a decade, health services researchers have focused on the integration of health services as a means to improve performance. Measures have been developed that assess both provider- and patient-derived aspects of the coordination and continuity of health services within and across sectors (Gillies et al. 1993; Burns et al. 2001; Alexander et al. 2001; Fairchild et al. 2002; Ware et al. 2003; Durbin et al. 2004; Dolovich et al. 2004). Cancer systems, representing a microcosm of broader healthcare systems (including health promotion, cancer prevention/screening, surgical interventions, radiation and systemic therapies, supportive and palliative care), present a particularly challenging context for service integration (Sullivan et al. 2008). Cancer patients are often cared for by multiple providers (e.g., surgeons, medical oncologists, radiation oncologists, nurses, radiation therapists, social workers, community healthcare providers, etc.) in multiple care settings (e.g., at specialized/comprehensive cancer centres, teaching and community hospitals, primary care settings and/or at home). In Ontario, Canada, a 2001 review of cancer services highlighted their fragmented nature and recommended “ways to improve the integration of cancer services at the local and regional levels, the quality of patient care, and the productivity and efficiency in the cancer service component of the Ontario health system” (Cancer Services Implementation Committee 2001). While this review led to major reorganization of the Ontario cancer system (Sullivan et al. 2004; Dobrow et al. 2006), a specific measure of cancer services integration (CSI) to guide restructuring and monitor performance improvement did not exist. This paper reports on efforts to develop a measure of integration specific to cancer services as part of a broader undertaking to monitor and improve cancer system performance in Ontario.
Survey Development
We employed a systematic approach to survey development, including a scan for existing models of “integrated” cancer services, a literature review of concepts and measures of health services integration, key-informant interviews and focus groups with cancer system clinicians and administrators. These were followed by item generation, testing and reduction, including pilot surveys and feedback interviews with cancer system decision-makers, before the launch of the CSI Survey in February 2007.
Scan for models of integrated cancer services
Through the mid- to late 1990s, the Veterans Health Administration in the United States went through a period of major restructuring, including the realignment of its cancer services (Wilson and Kizer 1998). Decision-making was decentralized and a system of integrated service networks was developed. This included primary, secondary and comprehensive cancer centres, local cancer registries, a research partnership with the National Cancer Institute and a standard electronic data infrastructure that supported a program of performance accountability and quality improvement (Wilson and Kizer 1998). Similarly, England and Wales recently went through a process of redesigning their cancer services (Department of Health 2000; Griffith and Turner 2004). Their ambitious reforms coincided with broader reforms in the National Health Service (Department of Health 1997, 1998), with a comprehensive cancer plan promoting collaborative partnerships and focused on improving the patient experience (Department of Health 2000). In Canada, British Columbia has developed an integrated cancer system based on central program/network infrastructure, a research centre, a comprehensive cancer registry and a network of service organizations and practice leaders to drive development of standardized processes of care (Carlow 2000).
While these illustrations of evolving cancer systems in different jurisdictions help to characterize important elements of an integrated cancer system, none provided specific definitions of, or tools for measuring, CSI. To augment the jurisdictional scan, we conducted a broader literature review.
Review of measures of cancer/health services integration
A search focusing specifically on published measures of CSI did not yield relevant findings. This was consistent with the findings of two recent reports, one a synthesis on health systems integration research (Suter et al. 2007) and another a systematic review of health system integration measures (Raina et al. 2006). Both identified a number of general and disease-/condition-specific measures of integration; however, none were specific to cancer services. Therefore, to inform our work, we first examined non-cancer measures and drew on the evolving body of research on health services integration to provide a conceptual basis for development of a measure of CSI.
Some of the best-known work comes from the Health Systems Integration Study (Shortell et al. 2000), which characterized health system performance as an output of integration, linking a system's vision, culture, strategy and leadership with three main dimensions of integration (Gillies et al. 1993):
Functional integration is defined as the extent to which key support functions and activities (such as financial management, strategic planning, human resource management, and information management) are coordinated across operating units of a system.
Physician–system integration is defined as the extent to which physicians are economically linked to a system, use its facilities and services, and actively participate in its planning, management and governance.
Clinical integration is defined as the extent to which patient care services are coordinated across the various functions, activities and operating units of a system.
In their extensive review of this work, Shortell and colleagues (2000) suggested that functional integration was most important for financial management and operating policies, information systems, resource allocation, quality improvement and strategic planning and less important for administrative support, human resources and marketing. Physician–system integration reflected physician remuneration, incentive, interdisciplinary care and accountability models, with physicians under pressure to contain costs, shift focus from individual to population levels and provide public accountability for performance. Shortell and colleagues (2000) described three levels of clinical integration, including a corporate level, where structural, systemic and cultural factors influence clinical integration; an intermediate/managerial level, where economies of scope or scale influence the standardization or duplication of clinical services; and a technical level that reflects the use of practice guidelines or protocols to influence care delivery (Shortell et al. 2000). These authors suggested that clinical integration is the most challenging and important component of an organized delivery system.
Leatt and colleagues (2000) described characteristics of integrated service delivery that reflect health system structures in Canada. They emphasized focus on the individual patient experience, starting with primary healthcare, sharing and utilizing information, creating virtual coordination networks at the local level, revising funding methods and developing performance monitoring capacity (Leatt et al. 2000). In a review of 41 studies, Leatt (2002) recommended that integrated service delivery should be characterized along three key dimensions: clinical, information and vertical integration. Clinical integration was linked to disease management programs, reflecting use of clinical protocols, pathways, guidelines and multidisciplinary teams, along with participatory structures and policies, and communication strategies to ensure stakeholder acceptance (Leatt 2002). Information integration focused on information management and technology that allows timely information sharing across traditional organizational and professional boundaries for all stakeholders (Leatt 2002). Vertical integration was linked to the patient experience, described as interorganizational arrangements across the continuum of care that allow improved coordination of patient care (Leatt 2002).
Leatt's patient-centred focus on integration differs somewhat from other views (Conrad and Dowling 1990; Hernandez 2000; Budetti et al. 2002; Burns and Pauly 2002), raising a fundamental conceptual question regarding the measurement of integration: Should measures of integration be derived from provider or patient perceptions? Interest in continuity of care dates back more than 30 years (Mindlin and Densen 1969; Bass and Windle 1972), yielding diverse patient-derived conceptions of what it is and how it can be measured (Reid et al. 2002; Freeman et al. 2001). In a multidisciplinary review, Haggerty and colleagues (2003) suggested that the concept of continuity of care should capture aspects of informational continuity (use of information on past events and personal circumstances to make current care appropriate for each individual), relational continuity (ongoing therapeutic relationship between a patient and one or more providers) and management continuity (a consistent, coherent approach to management of a health condition that is responsive to a patient's changing needs). More fundamentally, they suggested that “[c]ontinuity is not an attribute of providers or organisations … [it] is how individual patients experience integration of services and coordination” (Haggerty et al. 2003). Conrad (1993) cautioned, however, that focus ultimately needs to be at the level of the system:
[t]he essence of a system is the ability to aggregate up individual level care coordination and clinical processes into a system level capacity to plan, deliver, monitor, and adjust the structures and strategies for coordinating the care of populations over time. The coordination of care for individual patients is a necessary but not sufficient condition to realizing system level clinical integration.
Despite these apparent contradictions, provider-derived conceptions of health services integration and patient-derived conceptions of continuity of care are closely related. With a survey of ambulatory oncology patient satisfaction, which included questions on continuity and coordination of care, already under way in Ontario, our intent was to develop a provider-derived measure of CSI that would complement data and insights drawn from the patient-derived measure.
Interviews, Focus Groups and Survey-Item Generation
We next looked to local cancer system leaders to examine what aspects of existing health services integration measures were relevant to cancer services. Interviews were conducted with clinical program leaders (i.e., systemic therapy, radiation oncology, surgical oncology, nursing, health human resources, clinical guideline development, prevention/screening, palliative care, supportive care, pathology and social work) from Ontario's cancer system. Each informant was asked to describe key challenges or barriers to the integration of cancer services, and to formulate three potential survey items. Focus groups were conducted with members of Cancer Care Ontario's Clinical Council (including clinical program leaders) and Provincial Leadership Council (including regional administrative heads for each Regional Cancer Program and Cancer Care Ontario's executive team). In both cases, council members were asked to identify examples of effective and ineffective integration in the Ontario cancer system and desired features reflecting integrated cancer services.
Survey items were generated iteratively, initially drawing on the 54-item survey instrument produced through the Health Systems Integration Study (Gillies et al. 1993) and supplemented by items suggested by key informants. After field testing and a pilot survey, the survey instrument was further refined, resulting in a 67-item questionnaire (13 demographic and 54 Likert scale items) with specific versions of each item tailored for the three main participant groups (i.e., physicians, other clinicians, administrators) to improve relevance and comprehension (item descriptions provided in the Appendix; see http://www.longwoods.com/product.php?productid=20933).
Methods
Healthcare providers and administrators who had regular opportunities to interact with the cancer system were the primary focus of the survey (Table 1 describes the target population). Given cost considerations, an electronic survey was selected as the distribution mode, allowing a much larger sample of cancer care providers and administrators to be surveyed than would have been possible with more traditional paper- or telephone-based surveys. The electronic survey allowed real-time data collection and customized survey design, including use of conditional (skip/jump) logic to ensure that respondents were asked questions relevant to their position and region. However, this approach did require that members of the target population have Internet or e-mail access at work.
TABLE 1. Survey target population
Physicians | Other clinicians | Administrators |
---|---|---|
Medical Oncologist | Pharmacist | Corporate Leadership (e.g., CEO, Executive Director) |
Radiation Oncologist | Systemic Therapy Clinic Nurse | Cancer Services |
Paediatric Oncologist | Chemotherapy Nurse | Case Management |
Radiologist | Inpatient Oncology Nurse | Client/Patient Services |
Surgical Oncologist | Radiation Therapy Nurse | Clinical Programs |
Surgeon – General | Advanced Practice Nurse | Finance |
Surgeon – Gynaecologist | Clinical Trials Nurse | Human Resources |
Surgeon – Urologist | OBSP Nurse | Information Technology/Management |
Surgeon – Thoracic | Social Worker | Nursing |
Surgeon – Otolaryngologist | Dietician | Prevention/Screening |
Haematologist | Dosimetrist | |
Pathologist | Radiation Therapist | |
Gastroenterologist | Medical Physicist | |
Respirologist | Community Care Planners | |
Palliative Care Physician | | |
The sampling frame was constructed from a variety of sources, including the Canadian Medical Directory, Cancer Care Ontario's e-mail directories and direct contact with provider organizations, including hospitals and community care access centres (CCACs). In addition to the inclusion of all 14 CCACs in Ontario, 63 Ontario hospitals were selected based on the following criteria:
1. all Regional Cancer Program host hospitals;
2. all teaching hospitals;
3. all children's hospitals;
4. all Cancer Surgery Agreement (CSA)/Systemic Therapy Agreement (STA) hospitals1;
5. all hospitals performing over 100 cancer surgeries per year (2005/06)2; and
6. a minimum of three hospitals per geographically defined Local Health Integration Network (where criteria 1 through 5 did not provide this, up to two additional hospitals were selected in order of highest cancer surgery volume).
Because expanding the sample size had only minimal cost implications with the electronic survey, the sample included the entire target population of identifiable cancer care providers and administrators in Ontario who had Internet/e-mail access at work.
The survey was launched on February 26, 2007, with responses accepted at any time over a three-week period. An e-mail introduction to the survey was sent to all study subjects from the appropriate Regional Cancer Program leader. This mailing was followed by an automated e-mail invitation and three automated reminder e-mails, each with a link to the Web-based survey and co-signed by the appropriate Regional Cancer Program leader and two members of Cancer Care Ontario's executive team. These e-mail invitations described the study, provided contact details for further information and offered an explicit option for the study subject to decline participation and be removed from the reminder list. All respondents were offered a $5 electronic gift certificate for participating. Ultimately, the survey was received by 5,366 cancer care providers and administrators throughout Ontario.
Data were captured automatically through a Surveymonkey.com database and downloaded for analysis using SPSS (version 15). An exploratory factor analysis was the main analytical approach taken to guide identification of CSI dimensions (Harman 1976; Rummel 1970). The factor structure of the full 54-item scale was assessed through unweighted least squares analysis with varimax rotation (Jöreskog 1977). Resultant factors were then interpreted by examining item content and pattern of coefficients.
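As a concrete illustration of this analytical step, the sketch below reproduces the extraction-and-rotation procedure in Python using the open-source factor_analyzer package. This is not the SPSS routine the study used: the file name and column layout are assumptions, and factor_analyzer's "minres" (minimum residual) method stands in for unweighted least squares, to which it is closely related.

```python
import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical export: one row per respondent, one column per Likert item
# (1 = "strongly agree" ... 5 = "strongly disagree").
items = pd.read_csv("csi_survey_items.csv")
complete = items.dropna()                        # EFA requires complete cases

# Extraction comparable to unweighted least squares ("minres"), with varimax
# rotation, retaining 12 factors as in the paper.
fa = FactorAnalyzer(n_factors=12, method="minres", rotation="varimax")
fa.fit(complete)

eigenvalues, _ = fa.get_eigenvalues()            # basis for a scree plot
loadings = pd.DataFrame(fa.loadings_, index=complete.columns)
# Suppress loadings below the paper's 0.50 interpretation threshold.
print(loadings.where(loadings.abs() >= 0.50).dropna(how="all"))
```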
Ethics approval for the study was granted by the University of Toronto's Research Ethics Board.
Results
Participation rates and participant characteristics
Of the 5,366 e-mail invitations sent to valid e-mail addresses, there were 2,031 responses (i.e., the survey was accessed via the Web link). For the purposes of this study, we defined “participation” as those respondents who completed question 10, which required identification of the Regional Cancer Program most relevant to the respondent's clinical or professional work. According to this criterion, there were 1,769 participants, resulting in a participation rate of 33%. Provincially, 47% of administrators participated in the survey, while participation rates for physicians (25%) and other clinicians (32%) were considerably lower. A detailed analysis of participation rates has been reported elsewhere (Dobrow et al. 2008).
Of the 1,769 participants, 28% were physicians, 35% were other clinicians and 37% were administrators, with the majority female (69%) between the ages of 40 and 60 (71%) (Table 2). Participants represented all 13 Regional Cancer Programs in Ontario, identifying teaching hospitals (47%), community hospitals (37%), CCACs (13%) or other locations (3%) as their primary place of work. A Regional Cancer Program host hospital (teaching or community) was the main location of work for 50% of participants, suggesting that participants provided good representation for both cancer centre and non-cancer centre based individuals.
TABLE 2. Participant characteristics (n=1,769)

Characteristic | Category | n | % |
---|---|---|---|
Sex | Female | 1,212 | 68.5% |
Male | 549 | 31.0% | |
No Response | 8 | 0.5% | |
Age | <40 | 391 | 22.1% |
40–49 | 605 | 34.2% | |
50–59 | 650 | 36.7% | |
60+ | 114 | 6.4% | |
No response | 9 | 0.5% | |
Regional Cancer Program (RCP)*† | RCP A | 79 | 4.5% |
RCP B | 241 | 13.6% | |
RCP C | 67 | 3.8% | |
RCP D | 199 | 11.2% | |
RCP E | 86 | 4.9% | |
RCP F | 406 | 23.0% | |
RCP G | 56 | 3.2% | |
RCP H | 54 | 3.1% | |
RCP I | 119 | 6.7% | |
RCP J | 190 | 10.7% | |
RCP K | 61 | 3.4% | |
RCP L | 138 | 7.8% | |
RCP M | 73 | 4.1% | |
Location of work | Teaching Hospital | 835 | 47.2% |
Community Hospital (100 or more beds) | 613 | 34.7% | |
Community Hospital (less than 100 beds) | 43 | 2.4% | |
Community Care Access Centre | 230 | 13.0% | |
Other (e.g., Private Practice Clinic, Public Health Unit) | 39 | 2.2% | |
No response | 9 | 0.5% | |
Position* | Physician | 498 | 28.2% |
Other Clinician | 625 | 35.3% | |
Administrator | 646 | 36.5% | |
Distance from main RCP in region | At main RCP hospital | 878 | 49.6% |
Less than 10 km but not at main RCP hospital | 285 | 16.1% | |
Between 11 and 20 km from main RCP hospital | 132 | 7.5% | |
Between 21 and 100 km from main RCP hospital | 255 | 14.4% | |
More than 100 km from main RCP hospital | 195 | 11.0% | |
No response | 24 | 1.4% |
* Answer to item required.
† Sample size for each RCP varied.
It was possible to compare a few characteristics of the survey participants (n=1,769) and the full sample (N=5,366), with no major differences detected. Comparing regional response, 11 of the 13 regions had participation rates within 1% (with all 13 within 3%) of the regional breakdown for the full sample. Compared with the full sample, participants included relatively more administrators and fewer physicians.
Item response distribution and missing data
For the 54 Likert scale items, a five-point scale was used (“strongly agree” to “strongly disagree”), along with a “not applicable” option. Missing responses were relatively low for all items, with non-response rates not higher than 10% for any one item and combined missing and “not applicable” response rates not higher than 20% for any one item. Frequency distributions indicated a full range of responses for all items, with no floor or ceiling effects noted. Therefore, all 54 items were retained for further analysis.
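To make this screening rule concrete, a minimal pandas sketch along the following lines could flag items breaching either threshold; the file name, the "N/A" coding and the use of item codes as column names are assumptions.

```python
import pandas as pd

raw = pd.read_csv("csi_survey_raw.csv")          # hypothetical export
likert_cols = [c for c in raw.columns if c[:2] in ("14", "15", "16")]

missing_rate = raw[likert_cols].isna().mean()                # blank responses
na_rate = (raw[likert_cols] == "N/A").mean()                 # "not applicable"

# An item breaches the screen if non-response > 10% or missing + N/A > 20%.
flagged = (missing_rate > 0.10) | ((missing_rate + na_rate) > 0.20)
print(flagged[flagged].index.tolist())           # no items flagged in the study
```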
Exploratory factor analysis
Because the EFA required complete data (i.e., no missing or "not applicable" responses) across all 54 items, it was ultimately based on 722 valid responses. Following examination of eigenvalues, the scree plot and factor loadings, a 12-factor (36-item) solution was determined to provide the best fit. Eigenvalues for the 12 factors ranged from 11.6 to 1.1, accounting for 51% of the common variance. While factor loadings above 0.32 can be considered meaningful (Tabachnick and Fidell 2007), 49 of the 54 items had loadings greater than 0.32, complicating interpretation of the resultant factors. Therefore, a higher threshold of 0.5 was used to allow clearer interpretation of the resultant factors (Table 3). Internal consistency reliability for each of the resultant factors was estimated using Cronbach's coefficient alpha, with acceptable values ranging from 0.74 to 0.90 (Table 3).
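For readers unfamiliar with the reliability statistic, the following sketch computes Cronbach's coefficient alpha from first principles for the four items that load on Factor 1 (Table 3); the file name and the use of item codes as column names are assumptions.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

responses = pd.read_csv("csi_survey_items.csv").dropna()     # complete cases
print(round(cronbach_alpha(responses[["14Q", "14R", "14O", "14P"]]), 2))  # paper: 0.87
```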
TABLE 3. Exploratory factor analysis results: 12-factor solution

Factor | Items* loading to factor (≥0.50) | Factor loadings** | Cronbach's coefficient alpha | Interpreted theme | Interpreted dimension |
---|---|---|---|---|---|
Factor 1 | 14Q, 14R, 14O, 14P | 0.81, 0.77, 0.67, 0.65 | 0.87 | Clinical responsiveness to requests for advice (medical/radiation oncologists, surgeons and pathologists) | Clinical |
Factor 2 | 16B, 16C, 16A, 16D | 0.71, 0.66, 0.57, 0.52 | 0.75 | Support and effectiveness of multidisciplinary cancer conferences | Clinical |
Factor 3 | 16G, 16H, 16I, 16J | 0.71, 0.71, 0.58, 0.58 | 0.86 | Clinical leadership and guidance regarding best practices and innovations | Clinical |
Factor 4 | 15I, 15H, 15J, 15G | 0.74, 0.73, 0.65, 0.64 | 0.84 | Regional coordination of resources (staff/personnel, technology/equipment, financial) | Functional |
Factor 5 | 16O, 16P | 0.82, 0.78 | 0.90 | Support for Regional Cancer Program leadership role | Vertical System |
Factor 6 | 14A, 14C, 14B | 0.76, 0.71, 0.62 | 0.81 | Regional coordination of health promotion and cancer prevention/screening activities | Vertical System |
Factor 7 | 14J, 14I, 14L, 14K | 0.72, 0.70, 0.51, 0.50 | 0.75 | Awareness of whom to contact for advice (palliative/supportive care, public health, community-based service organizations) | Functional |
Factor 8 | 15K, 15L, 15M | 0.83, 0.82, 0.74 | 0.88 | Influence of Regional Cancer Program on the allocation of resources (staff/personnel, technology/equipment, financial) | Vertical System |
Factor 9 | 16L, 16M | 0.80, 0.56 | 0.74 | Regional Cancer Program awareness of practice variation within/among regions | Vertical System |
Factor 10 | 15O, 15N | 0.61, 0.58 | 0.76 | Existence of standardized technology use policies and professional training programs in region | Functional |
Factor 11 | 15D, 15E | 0.79, 0.63 | 0.75 | Access to computers/Internet for clinical/professional needs | Functional |
Factor 12 | 14N, 14M | 0.69, 0.67 | 0.83 | Clinical responsiveness to requests for advice (palliative/supportive care) | Clinical |

* Item descriptions provided in Appendix.
** All factor loadings below 0.50 suppressed.
Various methods of imputation were performed, including substitution and stochastic regression imputation, to assess the impact of missing data on the resultant factor structure (Little and Rubin 2002). This included recoding “not applicable” responses to “neither agree nor disagree” or extreme values (e.g., “strongly agree” or “strongly disagree”) and using regression residuals to impute values for missing data. This approach allowed data from all 1,769 responses to be analyzed. This sensitivity analysis showed that while imputing extreme values did, as expected, produce inconsistent factor structures, recoding of “not applicable” to “neither agree nor disagree” and stochastic substitution using regression residuals resulted in factor structures highly consistent with the initial approach taken.
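A minimal sketch of this sensitivity analysis, assuming a hypothetical data export with "N/A"-coded responses, might recode "not applicable" to the scale midpoint and then apply stochastic regression imputation (the model's predicted value plus a randomly drawn observed residual), after which the EFA can be re-run on the full set of responses:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
raw = pd.read_csv("csi_survey_raw.csv")                      # hypothetical export
likert_cols = [c for c in raw.columns if c[:2] in ("14", "15", "16")]

# Recode "not applicable" to the scale midpoint ("neither agree nor disagree").
data = raw[likert_cols].replace("N/A", 3).apply(pd.to_numeric)

# Stochastic regression imputation: predict each item from the others, then
# add a randomly drawn observed residual to preserve response variability.
for col in likert_cols:
    miss = data[col].isna()
    if not miss.any():
        continue
    X = data.drop(columns=col).fillna(data.mean())           # crude predictor fill
    model = LinearRegression().fit(X[~miss], data.loc[~miss, col])
    resid = data.loc[~miss, col] - model.predict(X[~miss])
    data.loc[miss, col] = (model.predict(X[miss])
                           + rng.choice(resid.to_numpy(), size=int(miss.sum())))

# `data` is now complete for all rows, so the EFA can be repeated on all
# 1,769 responses and the factor structures compared with the complete-case run.
```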
Overall, the EFA produced a consistent factor structure, with the interpretation of the 36 items loading to one of the 12 factors relatively clear and each of the inferred themes addressing important aspects of CSI (Table 3).
Discussion
Dimensions of CSI
Our intent was to develop a measure of CSI that could provide insights on typically unmeasured aspects of the coordination and integration of cancer services. The 12 factors were compared to the dimensions of integration identified in the literature review, with particular focus on the provider-derived dimensions of health services integration (Table 3). Four factors (factors 1, 2, 3 and 12) reflect key elements of clinical integration (i.e., responsiveness of medical/radiation oncologists, surgeons and pathologists to requests for advice; effectiveness of multidisciplinary clinical teams; and clinical leadership/guidance regarding best practices). Each of these factors directly influences patient care services and directs attention to different aspects of clinical integration, including informal clinical interactions (factors 1 and 12), formal multidisciplinary clinical conferences (factor 2) and the role of clinical leadership in facilitating best practice (factor 3). With clinical integration factors accounting for the top three factors in terms of common variance explained, these results are consistent with the findings of Shortell and colleagues (2000), who suggested clinical integration was the most challenging and important component of an organized delivery system. These findings suggest that efforts to improve clinical integration would have the greatest impact on overall service integration.
Four other factors (factors 4, 7, 10 and 11) reflect elements of functional integration (i.e., regional coordination of resources; awareness of whom to contact for advice regarding palliative/supportive care, public health and community-based services; existence of standardized policies and training programs; and access to computers/Internet). These functional integration factors reflect the potential to facilitate patient care activities, representing a mix of communication and information infrastructure and coordination or standardization of policies and programs. It should be noted that while some of these functional integration factors directly reflect Leatt's (2002) conceptualization of information integration, overall the study's findings suggest that information integration was relevant, and often essential, to most of the 12 identified factors, and therefore difficult to categorize exclusively. Therefore, our interpretation of functional integration is more consistent with that of Shortell and colleagues (2000), which focused on the coordination of key support functions and activities.
The four remaining factors (factors 5, 6, 8 and 9) constitute the final dimension of CSI. These factors primarily reflect elements of system leadership, including support for the role of a system leadership entity (i.e., the Regional Cancer Program in the Ontario context), with specific focus on its awareness of comparative performance (i.e., practice variation within and among regions) and its influence over key stakeholder relationships (i.e., resource allocation, regional coordination of promotion and prevention activities). Consistent with Leatt's (2002) conception of vertical integration, these four factors emphasize the importance of governance and accountability issues and extend Gillies and colleagues' (1993) conception of physician–system integration, which reflects individual and organizational roles and relationships within a broader system. These four factors also emphasize system-level capacity to coordinate services, reflecting Conrad's (1993) attention to aggregated rather than individual-level coordination processes. Therefore, considering these four factors together, we have characterized this third dimension as vertical system integration.
The CSI Survey tool
Improving service integration is a key component of performance improvement efforts in many areas of healthcare, and particularly important for cancer services given the challenges of multiple providers and multiple care settings (Sullivan et al. 2008). However, given the lack of a measure of CSI, an important gap exists for decision-makers interested in improving system performance. Our findings suggest that clinical, functional and vertical system integration represent the key elements of variation that influence CSI.
The CSI Survey provides decision-makers with the ability to measure 12 key components of service integration, representing an important tool to make informed performance improvement decisions. The 12 CSI factors and three dimensions provide direction for decision-makers, both in terms of targeting where efforts are needed to achieve performance improvements in CSI and in identifying appropriate levels of responsibility for cancer system leaders. Ultimately, the 36 Likert scale items contributing to the 12 factors can detect the majority of variation in CSI, representing a more concise tool for measuring service integration in cancer systems (Appendix).
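As a hypothetical illustration of how a decision-maker might operationalize the tool, the sketch below scores each of the 12 factors as the mean of its constituent items. The item-to-factor map follows Table 3 (only two factors are spelled out here), the file name is an assumption, and reverse-scoring of negatively worded items is omitted for brevity.

```python
import pandas as pd

# Partial item-to-factor map per Table 3; factor names are illustrative.
factor_items = {
    "clinical_responsiveness_oncology": ["14Q", "14R", "14O", "14P"],  # Factor 1
    "multidisciplinary_conferences": ["16B", "16C", "16A", "16D"],     # Factor 2
    # ... remaining 10 factors per Table 3 ...
}

responses = pd.read_csv("csi_survey_items.csv")              # hypothetical export
scores = pd.DataFrame({name: responses[cols].mean(axis=1)
                       for name, cols in factor_items.items()})
print(scores.describe())                                     # factor-level summary
```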
Preliminary work to disseminate findings from the CSI Survey with cancer system leaders in Ontario has been encouraging. However, to validate the tool further, application of the CSI Survey in other jurisdictions is needed. With most of the identified factors representing aspects of service integration relevant to other complex disease management areas, the CSI Survey may also have broader application beyond a specific focus on cancer services.
Limitations
Given the low clinician participation rate for the CSI Survey, a common problem with surveys of clinicians (Schoenman et al. 2003), caution should be exercised when extrapolating these results to broader populations of cancer care providers in Ontario or elsewhere. Similarly, while the requirement that participants have an e-mail address and Internet access may have introduced a selection bias, concerns that specific groups of providers or administrators were excluded were not raised in our numerous interactions with provider organizations.
It should also be noted that the sample did not include family physicians. While we acknowledge the contribution that family physicians make in the care of cancer patients, our survey development work suggested that most family physicians in Ontario typically care for only a limited number of cancer patients. Therefore, as the survey was designed and relevant for healthcare providers who routinely provide care to a large number of cancer patients, family physicians were excluded. However, despite their exclusion, the survey still produced several important factors related to the coordination of health promotion, cancer prevention/screening activities, the awareness of primary care contacts and the responsiveness of palliative and supportive care (factors 6, 7 and 12).
Although missing data also presented challenges, the EFA was analytically sound, producing consistent results using various imputation methods and assumptions. Finally, it should be noted that the CSI Survey was developed in the context of a large, publicly funded healthcare system. However, the integration dimensions are broadly relevant and should be largely transferable to other types of healthcare systems.
Conclusions
We set out to develop a measure of CSI that can inform clinical and administrative decision-makers in their efforts to monitor and improve cancer system performance. Through the development of the CSI Survey, we have created a provider-derived survey tool that provides insights on 12 key factors across three dimensions of integration (i.e., clinical, functional and vertical system). The CSI Survey provides an important starting point for measuring the coordination and integration of cancer services, establishing a tool to guide cancer system leaders on how to target efforts and resources in the ongoing pursuit of high performance.
Acknowledgements
The authors acknowledge exceptional support for the project from cancer care providers and administrators representing Regional Cancer Programs, hospitals, community care access centres and public health units across Ontario. Funding for this project was provided by a grant from the Canadian Health Services Research Foundation (#RC1-1071-06), with matching funds provided by Cancer Care Ontario.
Appendix: CSI Survey Item Wording
Items loading to one of the 12 factors

Item | CSI Survey Item Wording (Physician Version) |
---|---|
14A | Cancer prevention and screening activities are well coordinated in my region. |
14B | The ABC RCP actively engages primary care providers in the development of regional cancer screening initiatives. |
14C | Health promotion activities are well coordinated in my region. |
14I | I know whom to contact when seeking palliative care advice. |
14J | I know whom to contact when seeking supportive care advice. |
14K | I know whom to contact when seeking public health advice. |
14L | I do not have difficulty contacting individuals within community-based service organizations. |
14M | Palliative care experts in my region are not responsive to requests for advice regarding my cancer patients. |
14N | Supportive care experts in my region are not responsive to requests for advice regarding my cancer patients. |
14O | Medical oncologists in my region are not responsive to requests for advice regarding my cancer patients. |
14P | Radiation oncologists in my region are not responsive to requests for advice regarding my cancer patients. |
14Q | Surgeons in my region are not responsive to requests for advice regarding my cancer patients. |
14R | Pathologists in my region are not responsive to requests for information regarding my cancer patients. |
15D | In my normal practice setting, I do not have adequate access to a computer for my clinical/professional needs (e.g., access to Computerized Physician Order Entry). |
15E | In my normal practice setting, I do not have adequate Internet access for my clinical/professional needs (e.g., access to online services such as Medline). |
15G | My organization coordinates the use of its staff/personnel with other organizations in the region to better care for cancer patients. |
15H | My organization coordinates the use of its technology and equipment with other organizations in the region to better care for cancer patients. |
15I | My organization coordinates the use of its financial resources with other organizations in the region to better care for cancer patients. |
15J | My organization coordinates with other organizations in the region to eliminate unnecessary duplication of administrative services. |
15K | The allocation of staff/personnel in my practice region is influenced by the ABC RCP. |
15L | The allocation of technology and equipment in my practice region is influenced by the ABC RCP. |
15M | The allocation of financial resources in my practice region is influenced by the ABC RCP. |
15N | Standardized professional training programs exist throughout my practice region. |
15O | Standardized policies for the use of technology and equipment exist throughout my practice region. |
16A | The ABC RCP organizes multidisciplinary cancer conferences on a regular basis (e.g., at least once per month). |
16B | I regularly participate (e.g., at least once per month) in multidisciplinary cancer conferences. |
16C | Multidisciplinary cancer conferences in my practice region improve the sharing of information on best practices. |
16D | Interprofessional discussions and case conferences are effective ways to coordinate and improve individual care plans for my cancer patients. |
16G | Provincial clinical leaders do a good job of communicating best practice guidelines/standards to regional clinical leaders in my practice region. |
16H | Regional clinical leaders do a good job of communicating best practice guidelines/standards to clinicians in my practice region. |
16I | There is effective information sharing on clinical innovations between the ABC RCP and other RCPs in Ontario. |
16J | There is effective information sharing on clinical innovations between the ABC RCP and clinicians in my practice region. |
16L | The ABC RCP is aware of practice differences within my practice region. |
16M | The ABC RCP is aware of practice differences among RCPs in Ontario. |
16O | The ABC RCP has goals and objectives that are agreed on and widely shared by most health care organizations in my practice region. |
16P | The ABC RCP has values and norms that are agreed on and widely shared by most health care organizations in my practice region. |

Items NOT loading to one of the 12 factors

Item | CSI Survey Item Wording (Physician Version) |
---|---|
14D | The ongoing care of cancer patients in my region is normally coordinated by one provider/case manager. |
14E | Cancer patients in my region receive the most convenient care possible (e.g., limited patient travel, care provided close to home, etc.). |
14F | Cancer care providers in my region often perform clinical outreach. |
14G | I have access to multidisciplinary expertise within my practice region. |
14H | Access to multidisciplinary expertise within my practice region is timely. |
15A | I have access to my organization's electronic health records for the cancer patients that I am responsible for. |
15B | I have access to other organizations' electronic health records for the cancer patients that I am responsible for. |
15C | Health records (either paper or electronic) for the cancer patients I am responsible for are rarely complete. |
15F | I have access to video-conferencing technology to care for my cancer patients. |
15P | The ABC RCP has a comprehensive health human resources plan in place to recruit and retain cancer care providers in my practice region. |
15Q | Decisions to invest in new cancer services in my practice region are evaluated in relation to the systemwide priorities of the ABC RCP. |
15R | The broad continuum of cancer services (from health promotion and cancer prevention activities through to treatment/therapy, supportive and palliative care) is well coordinated in my region. |
16E | There is recognized provincial clinical leadership within my principal clinical area. |
16F | There is recognized regional clinical leadership within my principal clinical area. |
16K | I have access to data that compare my clinical performance to colleagues in my practice region. |
16N | The ABC RCP is the recognized regional leader of cancer services for my practice region. |
16Q | A broad range of healthcare organizations have sufficient input into strategic planning for the ABC RCP. |
16R | The ABC RCP is connected to my clinical practice. |
RCP = Regional Cancer Program
1. CSA and STA hospitals were identified through their contract status with Cancer Care Ontario to provide incremental service volumes for cancer surgery and/or systemic therapy.
2. Hospital-specific cancer surgery volumes were obtained from Cancer Care Ontario data sources.
Contributor Information
Mark J. Dobrow, Scientist, Cancer Services and Policy Research Unit, Cancer Care Ontario; Assistant Professor, Department of Health Policy Management and Evaluation, University of Toronto, Toronto, ON.
Lawrence Paszat, Senior Scientist, Institute for Clinical Evaluative Sciences; Associate Professor, Department of Health Policy, Management and Evaluation, University of Toronto, Toronto, ON.
Brian Golden, Sandra Rotman Chair in Health Sector Strategy, Rotman School of Management; University of Toronto and University Health Network, Toronto, ON.
Adalsteinn D. Brown, Assistant Deputy Minister, Health System Strategy, Ontario Ministry of Health and Long-Term Care; Assistant Professor, Department of Health Policy, Management and Evaluation, University of Toronto, Toronto, ON.
Eric Holowaty, Senior Consultant, Population Studies and Surveillance, Cancer Care Ontario; Associate Professor, Department of Public Health Sciences, University of Toronto, Toronto, ON.
Margo C. Orchard, Policy Advisor, Strategy, Canadian Partnership Against Cancer, Toronto, ON.
Neerav Monga, Biostatistician, Cancer Care Ontario, Toronto, ON.
Terrence Sullivan, President and Chief Executive Officer, Cancer Care Ontario; Associate Professor, Departments of Health Policy, Management and Evaluation and Public Health Sciences, University of Toronto, Toronto, ON.
References
- Alexander J.A., Waters T.M., Burns L.R., Shortell S.M., Gillies R.R., Budetti P.P., Zuckerman H.S. The Ties That Bind: Interorganizational Linkages and Physician–System Alignment. Medical Care. 2001;39(7):I30–I45.
- Bass R.D., Windle C. Continuity of Care: An Approach to Measurement. American Journal of Psychiatry. 1972;129(2):196–201. doi: 10.1176/ajp.129.2.196.
- Budetti P.P., Shortell S.M., Waters T.M., Alexander J.A., Burns L.R., Gillies R.R., Zuckerman H. Physician and Health System Integration. Health Affairs. 2002;21(1):203–10. doi: 10.1377/hlthaff.21.1.203.
- Burns L.R., Alexander J.A., Shortell S.M., Zuckerman H.S., Budetti P.P., Gillies R.R., Waters T.M. Physician Commitment to Organized Delivery Systems. Medical Care. 2001;39(7):I9–I29. doi: 10.1097/00005650-200107001-00002.
- Burns L.R., Pauly M.V. Integrated Delivery Networks: A Detour on the Road to Integrated Health Care? Health Affairs. 2002;21(4):128–43. doi: 10.1377/hlthaff.21.4.128.
- Cancer Services Implementation Committee. Report of the Cancer Services Implementation Committee. Toronto: Ontario Ministry of Health and Long-Term Care; 2001.
- Carlow D.R. The British Columbia Cancer Agency: A Comprehensive and Integrated System of Cancer Control. Healthcare Quarterly. 2000;3(3):31–45. doi: 10.12927/hcq..16755.
- Conrad D. Coordinating Patient Care Services in Regional Health Systems: The Challenge of Clinical Integration. Hospital and Health Services Administration. 1993;38(4):491–508.
- Conrad D.A., Dowling W.L. Vertical Integration in Health Services: Theory and Managerial Implications. Health Care Management Review. 1990;15(4):9–22. doi: 10.1097/00004010-199001540-00003.
- Department of Health. The New NHS: Modern, Dependable. London, UK: The Stationery Office; 1997.
- Department of Health. A First-Class Service: Quality in the New NHS. London, UK: The Stationery Office; 1998.
- Department of Health. The NHS Cancer Plan: A Plan for Investment, A Plan for Reform. London, UK: The Stationery Office; 2000.
- Dobrow M., Langer B., Angus H., Sullivan T. Quality Councils as Health System Performance and Accountability Mechanisms: The Cancer Quality Council of Ontario Experience. HealthcarePapers. 2006;6(3):8–21. doi: 10.12927/hcpap..18059.
- Dobrow M.J., Orchard M.C., Golden B., Holowaty E., Paszat L., Brown A.D., Sullivan T. Response Audit of an Internet Survey of Health Care Providers and Administrators: Implications for Determination of Response Rates. Journal of Medical Internet Research. 2008;10(4):e30. doi: 10.2196/jmir.1090.
- Dolovich L.R., Nair K.M., Ciliska D.K., Lee H.N., Birch S., Gafni A., Hunt D.L. The Diabetes Continuity of Care Scale: The Development and Initial Evaluation of a Questionnaire That Measures Continuity of Care from the Patient Perspective. Health and Social Care in the Community. 2004;12(6):475–87. doi: 10.1111/j.1365-2524.2004.00517.x.
- Durbin J., Goering P., Streiner D.L., Pink G. Continuity of Care: Validation of a New Self-Report Measure for Individuals Using Mental Health Services. Journal of Behavioral Health Services and Research. 2004;31(3):279. doi: 10.1007/BF02287291.
- Fairchild D.G., Hogan J., Smith R., Portnow M., Bates D.W. Survey of Primary Care Physicians and Home Care Clinicians: An Assessment of Communication and Collaboration. Journal of General Internal Medicine. 2002;17(4):253–61. doi: 10.1046/j.1525-1497.2002.10717.x.
- Freeman G., Shepperd S., Robinson I., Ehrich K., Richards S. Continuity of Care: Report of a Scoping Exercise for the National Co-ordinating Centre for NHS Service Delivery and Organisation R&D (NCCSDO). London, UK: National Co-ordinating Centre for NHS Service Delivery and Organisation R&D; 2001.
- Gillies R.R., Shortell S.M., Anderson D.A., Mitchell J.B., Morgan K.L. Conceptualizing and Measuring Integration: Findings from the Health Systems Integration Study. Hospital and Health Services Administration. 1993;38(4):467–89.
- Griffith C., Turner J. United Kingdom National Health Service Cancer Services Collaborative 'Improvement Partnership': Redesign of Cancer Services. A National Approach. European Journal of Surgical Oncology. 2004;30(Suppl. 1):1–86. doi: 10.1016/j.ejso.2004.07.010.
- Haggerty J.L., Reid R.J., Freeman G.K., Starfield B.H., Adair C.E., McKendry R. Continuity of Care: A Multidisciplinary Review. British Medical Journal. 2003;327(7425):1219–21. doi: 10.1136/bmj.327.7425.1219.
- Harman H.H. Modern Factor Analysis. Chicago: University of Chicago Press; 1976.
- Hernandez S.R. Horizontal and Vertical Healthcare Integration: Lessons Learned from the United States. HealthcarePapers. 2000;1(2):59–65. doi: 10.12927/hcpap.2000.17219.
- Jöreskog K.G. Factor Analysis by Least-Square and Maximum-Likelihood Method. In: Enslein K., Ralston A., Wilf H.S., editors. Statistical Methods for Digital Computers. New York: John Wiley and Sons; 1977.
- Leatt P. Integrated Service Delivery. Ottawa: Minister of Public Works and Government Services; 2002.
- Leatt P., Pink G.H., Guerriere M. Towards a Canadian Model of Integrated Healthcare. HealthcarePapers. 2000;1(2):13–35. doi: 10.12927/hcpap..17216.
- Little R.J.A., Rubin D.B. Statistical Analysis with Missing Data. Hoboken, NJ: John Wiley and Sons; 2002.
- Mindlin R.L., Densen P.M. Medical Care of Urban Infants: Continuity of Care. American Journal of Public Health and the Nation's Health. 1969;59(8):1294–301. doi: 10.2105/ajph.59.8.1294.
- Raina P., Santaguida P.L., Rice M., Gauld M., Smith S., Hader J. A Systematic Review of Health Care System Performance Indicators: Measures of Integration. Hamilton, ON: McMaster Evidence-Based Practice Centre; 2006.
- Reid R., Haggerty J., McKendry R. Defusing the Confusion: Concepts and Measures of Continuity of Healthcare. Ottawa: Canadian Health Services Research Foundation; 2002. Retrieved May 26, 2009. <http://www.chsrf.ca/final_research/commissioned_research/programs/pdf/cr_contcare_e.pdf>.
- Rummel R.J. Applied Factor Analysis. Evanston, IL: Northwestern University Press; 1970.
- Schoenman J.A., Berk M.L., Feldman J.J., Singer A. Impact of Differential Response Rates on the Quality of Data Collected in the CTS Physician Survey. Evaluation and the Health Professions. 2003;26(1):23–42. doi: 10.1177/0163278702250077.
- Shortell S.M., Gillies R.R., Anderson D.A., Erickson K.M., Mitchell J.B. Remaking Health Care in America: The Evolution of Organized Delivery Systems. San Francisco: Jossey-Bass; 2000.
- Sullivan T., Dobrow M., Thompson L., Hudson A. Reconstructing Cancer Services in Ontario. HealthcarePapers. 2004;5(1):69–80. doi: 10.12927/hcpap..16843.
- Sullivan T., Dobrow M.J., Schneider E., Newcomer L., Richards M., Wilkinson L., Borella L., Lepage C., Glossmann G.P., Walshe R. "Améliorer la responsabilité clinique et la performance en cancérologie" ["Improving Clinical Accountability and Performance in the Cancer Field"]. Pratiques et Organisation des Soins. 2008;39(3):207–15.
- Suter E., Oelke N.D., Adair C.E., Waddell C., Armitage G.D., Huebner L.-A. Health Systems Integration: Definitions, Processes and Impact. A Research Synthesis. Calgary: Health Systems and Workforce Research Unit, Calgary Health Region; 2007.
- Tabachnick B.G., Fidell L.S. Using Multivariate Statistics. Boston: Allyn and Bacon; 2007.
- Ware N.C., Dickey B., Tugenberg T., McHorney C.A. CONNECT: A Measure of Continuity of Care in Mental Health Services. Mental Health Services Research. 2003;5(4):209–21. doi: 10.1023/a:1026276918081.
- Wilson N.J., Kizer K.W. Oncology Management by the 'New' Veterans Health Administration. Cancer. 1998;82(10 Suppl.):2003–9. doi: 10.1002/(sici)1097-0142(19980515)82:10+<2003::aid-cncr5>3.3.co;2-u.