Healthcare Policy. 2015 Nov;11(2):102–118.

Health System-Level Factors Influence the Implementation of Complex Innovations in Cancer Care

Les facteurs au niveau du système de santé influencent la mise en œuvre d'innovations complexes pour le traitement du cancer

Robin Urquhart1, Lois Jackson2, Joan Sargeant3, Geoffrey A. Porter4, Eva Grunfeld5
PMCID: PMC4729286  PMID: 26742119

Abstract

Background:

The movement of new knowledge and tools into healthcare settings continues to be a slow, complex and poorly understood process. In this paper, we present the system-level factors important to the implementation of synoptic reporting tools in two initiatives (or cases) in Nova Scotia, Canada.

Methods:

This study used case study methodology. Data were collected through interviews with key informants, document analysis, non-participant observation and tool use/examination. Analysis involved production of case histories, analysis of each case and a cross-case analysis.

Results:

The healthcare system's delivery and support structure, information technology infrastructure, policy environment and history of collaboration and inter-organizational relationships influenced tool implementation in the two cases.

Conclusions:

The findings provide an in-depth, nuanced understanding of how healthcare system components can influence the implementation of a new tool in clinical practice.

Background

Despite a growing literature, the movement of innovations (i.e., new knowledge and tools) into healthcare settings continues to be a slow, complex and poorly understood process (ICEBeRG 2006; Stetler 2003; Ward et al. 2009). Though researchers in this area have focused predominantly on individual-level factors that affect the uptake of new knowledge (Eccles et al. 2005; Grimshaw et al. 2001, 2006; Grol and Grimshaw 2003), many organizational, socio-political and economic factors influence whether individuals in healthcare settings actually adopt and make use of innovations in their practice (Contandriopoulos et al. 2010; Grol et al. 2007; Kitson et al. 2008; Stetler 2003). Indeed, the process of moving innovations into healthcare practice is dynamic and highly contingent on contextual factors (Battista 1989; Denis et al. 2002; Dijkstra et al. 2006; Fraser 2004; Kitson et al. 2008; Litaker et al. 2006; Rycroft-Malone et al. 2004; Titler 2008; Van de Ven et al. 1999).

As the delivery of care becomes more multidisciplinary and technologically advanced, the introduction of innovations is increasingly a collective endeavour. That is, many new tools and practices introduced in healthcare organizations are complex and require coordinated use by many individuals and professional groups to achieve benefits (Helfrich et al. 2007). At the same time, healthcare settings are characterized by high levels of interdependency and interconnectedness among individuals in the system (Contandriopoulos et al. 2010; Iles and Sutherland 2001). Thus, for many practices, individuals working in healthcare organizations (e.g., clinicians, administrators) seldom have enough autonomy to apply new knowledge or to make use of new tools and technologies on their own (Contandriopoulos et al. 2010; Havelock 1969; Leviton 2003). In addition, many defining features of healthcare systems, including the range and diversity of stakeholders; the professional autonomy of many staff; and complex governance, resourcing and regulatory arrangements, may affect the implementation and use of innovations in healthcare settings (Iles and Sutherland 2001; Pollitt 1993). An improved understanding of system-level influences on the implementation of innovations in healthcare may therefore prove important to moving many innovations into practice more effectively.

Synoptic reporting tools (SRTs) are an evidence-based means of reporting findings from medical and surgical investigations and procedures that differs from the dominant method of reporting, known as narrative reporting. Although there is a spectrum of what is considered synoptic reporting (Srigley et al. 2009), contemporary SRTs generally differ from narrative reporting in at least two ways. First, SRTs normally require that the physician enter information about the patient, procedure and findings using a computer rather than dictate information into a voice recorder or telephone system. Second, the resulting synoptic report presents data items in a structured manner and contains only the information necessary for patient care, rather than providing a free-text descriptive account of the procedure and findings. More than two decades of research has demonstrated that, compared to narrative reporting, SRTs consistently improve the quality of pathology (Austin et al. 2009; Beattie et al. 2003; Branston et al. 2002; Chamberlain et al. 2000; Chapuis et al. 2007; Cross et al. 1998; Gill et al. 2009; Hammond and Flinner 1997; Karim et al. 2008; Messenger et al. 2011; Mohanty et al. 2007; Rigby et al. 1999; Srigley et al. 2009; Wilkinson et al. 2003; Zarbo 1992) and surgery (Chambers et al. 2009; Edhemovic et al. 2004; Park et al. 2010; Temple et al. 2010) reporting for a variety of cancers, as well as the reporting of various diagnostic investigations and procedures (Harvey et al. 2007; Laflamme et al. 2005).
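To make this structural difference concrete, the minimal sketch below contrasts a structured synoptic report with a free-text narrative one. It is purely illustrative: the field names and values are hypothetical and are not drawn from CORI, WebSMR or any actual reporting standard.

```python
# Purely illustrative sketch of the structural difference between synoptic
# and narrative reporting. Field names are hypothetical, NOT taken from
# CORI, WebSMR or any real reporting standard.
from dataclasses import dataclass, fields
from typing import Optional

@dataclass
class ColonoscopySynopticReport:
    """Each finding is a discrete, named data item rather than free text."""
    patient_id: str
    endoscopist: str
    indication: str               # e.g., "positive FIT"
    cecum_reached: bool           # completeness of the examination
    bowel_prep_quality: str       # e.g., "adequate" / "inadequate"
    polyps_found: int
    largest_polyp_mm: Optional[int] = None

    def render(self) -> str:
        # Every required item is present and labelled, which is what makes
        # downstream performance monitoring and quality improvement feasible.
        return "\n".join(f"{f.name}: {getattr(self, f.name)}" for f in fields(self))

# By contrast, a narrative report is one free-text block; whether key items
# (e.g., cecal intubation) were documented can only be determined by reading
# or text-mining the prose.
narrative_report = (
    "The scope was advanced without difficulty. The preparation was "
    "reasonable. Two small polyps were seen and removed..."
)

print(ColonoscopySynopticReport(
    patient_id="0001", endoscopist="Dr. X", indication="positive FIT",
    cecum_reached=True, bowel_prep_quality="adequate", polyps_found=2,
    largest_polyp_mm=6,
).render())
```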

In this paper, we present the system-level factors important to SRT implementation in two initiatives in Nova Scotia (NS), Canada.

Methods

In NS, we examined the key interpersonal-, organizational- and system-level factors that influenced the implementation and use of SRTs in two contemporary cases of cancer care:

  1. Synoptic reporting in the Colon Cancer Prevention Program (CCPP); and

  2. Synoptic reporting in the Surgical Synoptic Reporting Tools Project (SSRTP).

In this paper, we present the findings with respect to the system-level factors. We used case study methodology (Stake 2006; Yin 2009) to examine SRT implementation and use. A case study is “an empirical inquiry that investigates a contemporary phenomenon in depth and within its real-life context” (Yin 2009: 18). The cases were selected to allow cross-case analysis and comparison, and to permit the investigation of a particular phenomenon in different contexts (i.e., to study SRT implementation and use across different settings and individuals). The study was approved by the Research Ethics Boards at all applicable institutions. A detailed description of the methods, including sampling decisions, recruitment processes and analyses, is provided elsewhere (Urquhart et al. 2012).

Three theoretical perspectives largely informed the design of this study: the Promoting Action on Research Implementation in Health Services (PARiHS) framework (Kitson et al. 1998, 2008), the organizational framework of innovation implementation (Helfrich et al. 2007) and “systems” thinking/change (Kitson 2009). The PARiHS framework proposes that the successful implementation of research into practice is a function of the interaction between three core elements: the level and nature of the evidence; the context into which the research is implemented (where context is comprised of the sub-elements of culture, leadership and evaluation); and the method by which the process is facilitated. These elements are conceptualized as existing on a continuum, with high evidence, context and facilitation driving successful implementation. The organizational framework of innovation implementation comprises the following six elements and highlights relationships among them: management support, implementation policies and practices, financial resource availability, implementation climate, innovation champions and the “fit” between users' values and the innovation. These elements are posited to play important roles in achieving implementation effectiveness (i.e., consistent, committed and skilled innovation use). “Systems” thinking/change posits that the successful translation of knowledge into practice is a function of: the nature and characteristics of the new knowledge; individuals' levels of autonomy in making decisions about using the new knowledge; how individuals negotiate and renegotiate relationships with others in the system; and how individuals attract the resources needed to sustain changes in practice.

Case Descriptions

In 2009, the CCPP implemented an SRT for colonoscopy reporting as part of its new population-based screening program. The CCPP is administered by Cancer Care Nova Scotia, the provincial cancer agency. Within the program, individuals are advised to undergo a screening colonoscopy, arranged through the CCPP, if they have a positive fecal immunochemical test. The impetus for implementing an SRT was to enable performance monitoring and quality improvement for colonoscopy, support the appropriate follow-up of participants in the screening pathway and facilitate overall maintenance of the screening program. The CCPP implemented the endoscopy reporting software and database from the Clinical Outcomes Research Initiative (CORI), developed at Oregon Health and Science University in partnership with the National Institutes of Health, AstraZeneca and Novartis. CORI was rolled out across the province concurrently with the screening program over a two-year period (2009–2010). All screening colonoscopies in the province are coordinated through the screening program and must be reported using CORI (i.e., a policy of mandatory use for screening colonoscopies). Endoscopists (gastroenterologists and surgeons) who refused to use the tool would not be permitted to perform screening colonoscopies coordinated by the CCPP.

In 2010, synoptic reporting for cancer surgery commenced as part of a pan-Canadian collaboration, funded and led by the Canadian Partnership Against Cancer, to expand surgical synoptic reporting to numerous Canadian provinces. The intent was to capitalize on the successful adoption and implementation of synoptic reporting in Alberta, Canada. The SSRTP thus commenced as a pilot project for breast and colorectal cancer surgery. Nine surgeons performing breast and/or colorectal cancer surgeries were selected to participate at three hospitals (two academic/tertiary care centres and one community hospital). The SSRTP implemented the Web-based Surgical Medical Record (WebSMR), originally developed in Alberta and jointly owned by Alberta Health Services and Softworks Group Inc. WebSMR was implemented at the three hospitals over 2010–2011; its use was voluntary. The implementation team had neither the authority to mandate use nor the capacity to influence use through organizational or provincial policies.

Both initiatives planned to integrate their SRT with existing hospital information technology (IT) systems, allowing seamless transfer of information across patient registration and medical record systems. Detailed case records are provided elsewhere to describe each case's socio-political context, governance structure and implementation timeline (Urquhart 2013).
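The cases do not document the technical detail of these integrations. As a rough, hypothetical sketch of the pattern both initiatives were aiming for (all interface and method names below are invented for illustration and are not taken from either project), an SRT sits between a registration system and the medical record:

```python
# Hypothetical sketch of the integration pattern described above: the SRT
# pulls demographics from a patient registration system and pushes the
# completed report into the electronic medical record. All class and method
# names are invented; real integrations would use the hospital's actual
# interfaces (e.g., HL7 messaging).
from typing import Protocol

class RegistrationSystem(Protocol):
    def get_demographics(self, patient_id: str) -> dict: ...

class MedicalRecordSystem(Protocol):
    def file_report(self, patient_id: str, report_text: str) -> None: ...

def complete_synoptic_report(
    patient_id: str,
    findings: dict,
    registration: RegistrationSystem,
    emr: MedicalRecordSystem,
) -> None:
    # Demographics enter the SRT electronically rather than being re-keyed...
    demographics = registration.get_demographics(patient_id)
    report = "\n".join(
        f"{key}: {value}" for key, value in {**demographics, **findings}.items()
    )
    # ...and the finished report flows into the patient's record without a
    # manual work-around (faxing, scanning, transcription).
    emr.file_report(patient_id, report)
```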

Data Collection

Multiple data collection procedures were used to gain rich, detailed information about each case:

  1. One-on-one semi-structured interviews (Patton 2002) were conducted with key informants. One researcher (RU) conducted all interviews. Each interview was audio-recorded, transcribed verbatim by an experienced research coordinator and checked for accuracy. After each interview, the questions and responses were reviewed to determine whether the issues had been explored in sufficient depth, and the interview script was adapted when needed (Rubin and Rubin 1995).

  2. Documents (e.g., project plans/charters, formal/informal evaluations, communications materials) were reviewed to gain a historical and contextual perspective on each initiative and to corroborate and augment evidence from other sources (Yin 2009). Documents related to the structure, infrastructure and/or governance and regulatory frameworks of NS's healthcare system were also retrieved and reviewed.

  3. Non-participant observation was conducted for one case only (SSRTP) to observe training sessions (format, quality of training) and surgeons' initial reactions to viewing/using the SRT.

  4. Each SRT and the resulting synoptic reports were examined to gain insight into the technical operations of the tool and the end report.

Data Analysis

Data analysis began with the first data collected. Analysis involved conducting a separate thematic analysis (Braun and Clarke 2006) for each case, involving coding; collating codes; and generating, reviewing and refining themes. The coding framework was developed during pilot work (Urquhart et al. 2011). One researcher (RU) constructed the case descriptions and coded all interview transcripts and field notes line-by-line in their entirety. The documentary evidence was not coded line-by-line; rather, it was read and re-read to identify contextual and historical data, to record codes/concepts and link them to specific document excerpts and to triangulate findings from other sources (e.g., interviews and observation). Codes were collapsed into categories through an iterative process that included: critically analyzing each concept and category to identify similar and distinct concepts and categories; linking the same concepts and categories across all the data collected; reviewing the research questions and re-reading the study protocol; reviewing the theoretical perspectives and re-reading the publications associated with those frameworks; consulting several case study methodology and general qualitative research texts; and holding several research team meetings to review and question the analyses. An analogous iterative process was used to identify, review and refine themes. These processes were imperative to developing a deeper understanding of what occurred in each case and to considering and questioning alternative explanations.
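As a purely illustrative aid, the collapsing of codes into categories and themes described above can be pictured as a simple data structure. The codes, categories, themes and sources in this sketch are invented examples, not the study's actual coding framework.

```python
# Minimal sketch of collapsing codes into categories and then themes during
# a thematic analysis, while preserving a chain of evidence back to sources.
# All codes, categories, themes and sources are invented examples.
from collections import defaultdict

# Each coded excerpt links a code to its source (interview, document, field note).
coded_excerpts = [
    ("unclear ownership", "Interview: Team Member #1, SSRTP"),
    ("role confusion", "Interview: Team Member #2, CCPP"),
    ("multiple IT platforms", "Document: e-health report"),
    ("unclear ownership", "Field note: team meeting"),
]

# Mappings refined iteratively as the analysis proceeds.
code_to_category = {
    "unclear ownership": "governance",
    "role confusion": "governance",
    "multiple IT platforms": "IT infrastructure",
}
category_to_theme = {
    "governance": "care delivery and support structure",
    "IT infrastructure": "IT infrastructure",
}

# Collate evidence under each theme; the (code, source) pairs preserve the
# explicit trail from data to interpretation.
theme_evidence = defaultdict(list)
for code, source in coded_excerpts:
    theme = category_to_theme[code_to_category[code]]
    theme_evidence[theme].append((code, source))

for theme, evidence in theme_evidence.items():
    print(theme, "<-", evidence)
```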

A cross-case analysis was conducted to compare and contrast the themes across cases, and to understand their specific relevance to each case and its context. Emergent findings were discussed on multiple occasions with the research team to assist the analytic process and questioning of the data. Table 1 presents the numerous steps taken to maximize rigour.

Table 1.

Techniques to maximize rigour

Technique
Use of three theoretical perspectives to guide research design, analyses and interpretation, helping to build a wider explanation of SRT implementation and a means of exploring a range of plausible theoretical interpretations.
Strategic selection of cases to support greater confidence in findings.
Pilot work to refine data collection and analyses processes, and inform the final study design.
Interview guides that included questions/probes reflecting all constructs present in the three theoretical perspectives, with open-ended questions to minimize response bias and to elicit a variety of perspectives and viewpoints.
Key informants across four units of analysis (clinician user, implementation team, organization and larger system) and multiple data collection methods, permitting triangulation.
A single researcher to collect all data.
Audio-recording, verbatim transcription and auditing of all interviews.
Considering other plausible explanations for the findings and seeking out additional evidence where inconsistencies or contradictions existed.
Maintaining a case study database, or a complete set of all the data collected for each case and all records related to the treatment of the data during the analytic process.
Maintaining a chain of evidence throughout data analysis, or an explicit trail that identified the links between the data collected and the interpretations/conclusions.
Member checking to verify specific factual data and to ask participants for their responses/reactions to findings.
Multiple meetings/discussions of the research team to review the analytic procedures and to discuss and question the findings.

Results

A description of the healthcare system context is presented in an additional online file. Nineteen key informants were interviewed in the CCPP case; 21 in the SSRTP case (Table 2). Informants included implementation team members, SRT users (i.e., endoscopists, surgeons), organizational members (e.g., department heads/managers) and regional/provincial members (e.g., administrators at the provincial cancer agency, health districts or Department of Health and Wellness). Table 3 presents the number and nature of documents collected and analyzed for each case.

Table 2.

Number and nature of key informants, by unit of analysis

Unit of analysis      CCPP                                 SSRTP
Implementation team   Team members = 4                     Team members = 3
Clinician user        Physician, tertiary = 3              Physician, tertiary = 4
                      Physician, community = 2             Physician, community = 2
Organization          Department head, tertiary = 1        Manager, tertiary = 3
                      Manager, tertiary = 1                Manager, community = 1
                      Manager, community = 2               Report end user, tertiary = 3
                      Report end user, tertiary = 1
System                Executive, health district = 1       Executive, health district = 1
                      Executive, government = 1            Executive, government = 1
                      Executive, provincial program = 2    Executive, provincial program = 2
                      Manager, provincial program = 1      Manager, provincial program = 1
Total                 n = 19                               n = 21

Note: CCPP = Colon Cancer Prevention Program; SSRTP = Surgical Synoptic Reporting Tools Project.

Table 3.

Number, source and nature of documents collected and analyzed

CCPP (n = 19)
  Web search            Communications materials (6)
                        Governmental reports (4)
                        Practice recommendations/position statements (3)
                        Analysis of software applications (1)
  Implementation team   Implementation strategy (1)
                        Provincial evaluation (1)
                        Public presentation (1)
  Other key informants  Published consensus guidelines (1)
                        Media article (1)

SSRTP (n = 14)
  Web search            Communications materials (3)
                        Conference presentation (1)
  Implementation team   Project charter (1)
                        Provincial evaluation (3)
  Other key informants  National implementation strategy (4)
                        National evaluation (2)

System context (n = 16)
  Web search            Consultant's report on Nova Scotia's healthcare system (1)
                        Cancer Management Strategy for Nova Scotia (1)
                        Evaluation of Cancer Care Nova Scotia (1)
                        Reports on Nova Scotia's e-health system (2)
                        Reports/discussion papers on privacy and personal health information legislation (3)
                        Acts on privacy/personal health information, Nova Scotia (4)
                        Act on privacy/personal health information, federal (1)
                        Pan-Canadian framework on privacy/personal health information (1)
                        Hospital business plans (2)
Note: CCPP = Colon Cancer Prevention Program; SSRTP = Surgical Synoptic Reporting Tools Project.

The data demonstrated that specific characteristics of the provincial healthcare system, presented below, influenced SRT implementation. By the end of data collection for this study (Winter 2012), the CCPP had integrated its SRT with hospital IT systems in one of nine health districts, allowing patient demographic information to flow electronically into the SRT and the colonoscopy report to enter the patient's electronic medical record in that district. In the other eight districts, various “work-around” solutions were created to accommodate the SRT within each hospital's existing processes and capacities. By the end of the study, the SSRTP had integrated its SRT with hospital IT systems in each hospital in which it was implemented, allowing immediate transfer of information across systems.

Care Delivery and Support Structure

The existing healthcare system structure created challenges in terms of role clarity, governance and the sharing of patient information across organizations. Within this structure, health districts were responsible for managing and delivering care, while various other organizations were responsible for relevant policy or support activities. For instance, the provincial cancer agency's role in cancer care was largely related to policy development and standard setting (not service delivery), while a centralized IT organization provided operational support for most (but not all) provincial health IT systems yet was not responsible for implementing new IT systems. The lack of clarity around each organization's roles and mandates led to many issues and frustrations over the course of SRT implementation.

For example, the CCPP was situated in the provincial cancer agency, yet was responsible for implementing and delivering a population-based screening program. As described by one informant, “There is a big question of governance … As a provincial cancer agency, we introduced this tool to support cancer screening … It puts us in a difficult position because we really have no business being in the business, on the service side, right?” (Team Member #2, CCPP). This apparent discrepancy in roles (policy setting vs. service delivery) created many challenges related to governance and data sharing.

In the SSRTP, many key informants linked a lack of clarity around organizational roles and responsibilities to governance challenges: “The problem is … we have got a hybrid cancer system that is not totally clear on who does what and how. From my perspective, … that becomes problematic because you don't know … who owns it and who really wants it. There is no trouble finding people who support it, the trouble is finding the group that owns it” (Team Member #1, SSRTP). In fact, related to governance, key informants identified different organizations (sometimes incorrectly identifying their own organization) as the “owner” of SRT implementation. The lack of clarity meant the implementation team had to spend considerable effort understanding the core business of each organization: “It was not until I understood how that system worked, it was frustrating, but once I figured it out, [implementation] was a lot easier” (Team Member #2, SSRTP).

IT Infrastructure

Key informants described the legacy of health IT infrastructure in the province as a patchwork of disparate systems that were implemented and had evolved in a largely unplanned way. There was no single IT platform in the province, nor was there a provincial plan on how to best leverage information management/IT systems. Key informants across all levels of the healthcare system viewed this legacy as impeding the progress of SRT implementation as well as the user experience. One key informant described the challenge this way: “We have three different hospital systems, you know, [Hospital A] has their own Meditech Magic, then there is Meditech out in the districts, and then [District B] has the best of breed, a combination of a whole bunch of things. The lab systems are not all the same, the operating room systems are not all the same, nothing is the same. So it is a huge challenge, particularly as we seek to share information … and it takes an enormous amount of resources” (System Member #2, SSRTP). The challenge of multiple IT platforms and systems was compounded by differing processes and structures at each hospital. As one implementation team member expressed, “To have [CORI] sit on top of different business processes, different information systems, different staff structures in terms of where their IT person sits, is a challenge. … For me, that has been the biggest challenge” (Team Member #2, CCPP).

Policy Environment

Privacy legislation at the time of SRT implementation (in both cases) included more than 40 different pieces of provincial and federal legislation, all relevant in some way to the collection, sharing and/or use of personal health information. At the same time, each hospital had its own policies and procedures related to privacy, security of personal health information and data integrity. In the CCPP, this legislative and regulatory environment was viewed by implementation team members as especially prohibitive in the context of SRT implementation, wherein personal health information would be collected and shared: “There is a wall there and nobody is really willing to ask ‘why is that wall there and does that wall really need to apply in this case?' You know, the wall might be there for a very good reason. But, you know, should we put a door in for these guys? Maybe yes, maybe no, but I don't think those risk assessments are really ever done. It is the ‘just talk to the hand.' It is a no” (Team Member #4, CCPP). Despite implementing in the same policy context, implementation team members in the SSRTP did not view the legislative/regulatory requirements as barriers to implementation but rather tasks that had to be completed: “You know, from my perspective, I don't really, I am not all that invested in caring about [privacy impact assessments and related things]. I realize that has to be done and all that I care about is that it gets done” (Team Member #1, SSRTP).

Inter-organizational Relationships

The history and nature of inter-organizational relationships and interactions within the healthcare system were viewed as impacting SRT implementation. Underlying the structural, infrastructural and regulatory components, key informant interviews and documentary data suggested widespread resistance among health districts, and among the organizations tasked with supporting them, to working together and thinking beyond their individual organizations and programs. Key informants in the CCPP case described limited collaboration and existing relationships among organizations within the healthcare system (which were sometimes perceived as precarious) as obstacles that often “got in the way” of SRT implementation. As one system member put it, “there is going to be issues with the adoption of these systems until there is a change of culture within the environment. In particular, there is a lot of, a lot of, ‘this is the way it should be done and this is how we will do it' rather than collaboratively working together on a solution. That is true, I think, of the healthcare sector environment. That whole mentality has to change and until it does, implementing any system is going to be difficult” (System Member #5, CCPP). Similarly, in the SSRTP, key informants discussed historical relationships and interactions within the healthcare system as being critically important to SRT implementation: “organizational interactions are absolutely the number one [factor] … because there are so many players, so many organizations” (Organizational Member #3, SSRTP).

Discussion

This paper presents the system-level factors that influenced SRT implementation in one Canadian province. System-level components, such as the structural, infrastructural, regulatory, political and socio-historical context of the existing healthcare system, are largely absent (Berwick 2003; Davis et al. 2003; Graham et al. 2006; Grol and Grimshaw 1999; Helfrich et al. 2007; Jacobson et al. 2003; Kitson et al. 1998, 2008; Lavis et al. 2003; Stetler 2003; Ward et al. 2009), or recognized but given scant description (Dobbins et al. 2002; Greenhalgh et al. 2004), in much of the theoretical work in the knowledge-to-practice literature. Findings from this study demonstrated that certain features of the healthcare system – its delivery and support structure, IT infrastructure, policy environment and history of limited collaboration and weak working relationships across organizations – were problematic in the context of SRT implementation. In a study investigating the diffusion of eight innovations in acute and primary care in the UK, Fitzgerald and colleagues (2002) found that the capacity of an organization to innovate depended on the structural complexity of the organization and broader care delivery system, the history of the organization and the quality of intra- and inter-organizational relationships. While the nature of these components, and the degree of impact they will ultimately have, will almost certainly vary by setting, our findings support their importance to the implementation process. We advise others to consider these factors when planning implementation efforts. However, other factors influenced SRT implementation in the cases studied, in both facilitating and impeding ways. These are reported elsewhere (Urquhart et al. 2014).

The mandatory versus voluntary nature of the cases warrants further discussion. Indeed, the mandatory use policy in the CCPP was critically important to ensuring province-wide use of the SRT for screening colonoscopies. Nonetheless, the findings suggest that this policy neither lessened the importance of the system-level factors nor helped the implementation team navigate the socio-political context. In addition, the policy did not automatically make SRT implementation a higher-priority initiative among supporting departments (e.g., IT, medical records), nor did it increase departmental or organizational capacity for implementation. Despite the mandate, the end goals of implementation (i.e., integration with existing hospital IT systems and use of the SRT for diagnostic colonoscopies) had not been achieved in most health districts four years post-implementation. Thus, our findings suggest that a mandatory use policy, in and of itself, was insufficient to ensure effective implementation.

Interestingly, despite the cases occurring at the same time within the same provincial healthcare system, the legislative and regulatory environment was viewed as especially obstructive to SRT implementation in the CCPP but not in the SSRTP. While the SSRTP was able to integrate its SRT with hospital IT systems in a relatively timely manner, the issues that delayed IT integration in the CCPP case purportedly related to privacy, data ownership/sharing and technical work that had to be completed. The data (not shown) strongly suggested that one of the fundamental reasons for this difference between cases related to the interpersonal aspects of implementation: stakeholder involvement; the capacity to build, negotiate and leverage helpful relationships; and the management of the change process in each organization. See Urquhart et al. (2014) for a detailed presentation of these findings.

SRT implementation in this study occurred in a highly interdependent healthcare system, in which 34 hospitals were governed by nine health districts and supported, in either a policy or an operational capacity, by provincial programs and organizations. These interdependencies, however, created considerable challenges for SRT implementation in a system wherein roles, mandates and governance structures were not clearly defined; legislative and regulatory frameworks were inconsistent; and relationships among organizations were burdened by past conflict and tension. Given our findings, viewing innovation implementation in healthcare organizations through the lens of complex adaptive systems (Begun et al. 2003; Best and Holmes 2010; Plsek and Greenhalgh 2001; Zimmerman et al. 1998) might aid our understanding of implementation processes. This perspective focuses on the relationships embedded inside and outside the organization itself and emphasizes the need to analyze relationships across levels of the system (Begun et al. 2003). In such a social system, history matters: what is happening now is undoubtedly influenced by what happened earlier (Anderson and McDaniel Jr. 2000). The data from this study demonstrated that historical relationships and interactions within the healthcare system impacted SRT implementation. Similarly, Fitzgerald et al. (2002) found that diffusion processes in acute and primary care settings were “radically affected” by the nature of the prior relationships among the various players in each initiative, and that high-quality relationships were able to counterbalance many negative contextual factors (p. 1441).

The limited conceptual and empirical work on system-level factors in the literature on moving knowledge into healthcare practice may be partly owing to difficulties in investigating them (Contandriopoulos et al. 2010; Mitton et al. 2007; Zapka et al. 2012) or to the belief that “changing these factors is generally out of reach of those within the organization who are involved in improving health care” (p. 122) (Grol et al. 2007). Even if these factors are difficult to change, however, recognizing and understanding their potential influence is still important when designing strategies to affect practice change. Several authors have recently proposed conceptual models or frameworks (Damschroder et al. 2009; Satterfield et al. 2009) that take an ecological perspective and more fully account for system-level factors, such as the economic, regulatory and/or socio-political context, that need to be considered when moving knowledge into practice.

This study has a number of strengths, including the high participation rate and the numerous techniques used to enhance rigour. One limitation is that it was undertaken in one province only, potentially limiting its applicability to other jurisdictions. At the same time, healthcare systems share many features (e.g., complex governance and resourcing arrangements, diverse stakeholders), which should facilitate the applicability of these findings to other settings. A second limitation is that some of the system-level components influential in this study might be more germane to the implementation of health IT innovations, where implementation teams often have to navigate existing healthcare structures and develop relationships with individuals in different departments and organizations to integrate the innovation into existing IT infrastructure. However, the structure of the care delivery and support system, as well as its socio-historical context, could conceivably influence innovation implementation in many areas of healthcare, especially those characterized by high levels of interdependency, such as the implementation of care delivery models for persons with chronic disease or multi-morbidities, where appropriate care usually involves mutually supporting roles spanning healthcare providers, organizations and sectors. Moreover, recent Canadian studies (Look Hong et al. 2010; Wright et al. 2011) have suggested that targeted system-level strategies, particularly those related to policy changes, may facilitate the implementation of complex innovations in healthcare.

In summary, this study has provided an in-depth, nuanced understanding of how healthcare system components can influence the implementation of a complex innovation in clinical practice. Future research is needed to refine and expand our knowledge of how system-level factors affect implementation processes and of how to manage and/or leverage these factors to plan for and integrate innovations into healthcare settings more effectively.

Acknowledgements

The authors gratefully acknowledge all individuals who generously provided their time to participate in this study. They also thank Margaret Jorgensen for her assistance with this study and Cynthia Kendell for her helpful review of the manuscript. This study was funded by the CIHR/CCNS Team in Access to Colorectal Cancer Services in Nova Scotia (funders: Canadian Institutes of Health Research, Cancer Care Nova Scotia, Nova Scotia Department of Health and Wellness, Capital District Health Authority, Dalhousie University Faculty of Medicine and Dalhousie Medical Research Foundation). Robin Urquhart also received funding from the Nova Scotia Health Research Foundation to carry out this work.

Contributor Information

Robin Urquhart, Assistant Professor, Department of Surgery, Dalhousie University, Halifax, NS.

Lois Jackson, Professor, School of Health and Human Performance, Dalhousie University, Halifax, NS.

Joan Sargeant, Acting Head and Professor, Division of Medical Education, Dalhousie University, Halifax, NS.

Geoffrey A. Porter, Professor, Department of Surgery, Dalhousie University, Halifax, NS.

Eva Grunfeld, Giblon Professor and Vice Chair, Research, Department of Family and Community Medicine, University of Toronto, Toronto, ON.

References

  1. Anderson R.A., McDaniel R.R., Jr. 2000. “Managing Health Care Organizations: Where Professionalism Meets Complexity Science.” Health Care Management Review 25(1): 83–92.
  2. Austin R., Thompson B., Coory M., Walpole E., Francis G., Fritschi L. 2009. “Histopathology Reporting of Breast Cancer in Queensland: The Impact on the Quality of Reporting as a Result of the Introduction of Recommendations.” Pathology 41(4): 361–65.
  3. Battista R.N. 1989. “Innovation and Diffusion of Health-Related Technologies. A Conceptual Framework.” International Journal of Technology Assessment in Health Care 5(2): 227–48.
  4. Beattie G.C., McAdam T.K., Elliott S., Sloan J.M., Irwin S.T. 2003. “Improvement in Quality of Colorectal Cancer Pathology Reporting with a Standardized Proforma – A Comparative Study.” Colorectal Disease 5(6): 558–62.
  5. Begun J.W., Zimmerman B., Dooley K. 2003. “Health Care Organizations as Complex Adaptive Systems.” In Mick S.M., Wyttenbach M. (Eds.), Advances in Health Care Organization Theory (pp. 253–88). San Francisco, CA: Jossey-Bass.
  6. Berwick D.M. 2003. “Disseminating Innovations in Health Care.” JAMA 289(15): 1969–75.
  7. Best A., Holmes B. 2010. “Systems Thinking, Knowledge, and Action: Towards Better Models and Methods.” Evidence and Policy 6(2): 145–59.
  8. Branston L.K., Greening S., Newcombe R.G., Daoud R., Abraham J.M., Wood F. et al. 2002. “The Implementation of Guidelines and Computerised Forms Improves the Completeness of Cancer Pathology Reporting. The CROPS Project: A Randomised Controlled Trial in Pathology.” European Journal of Cancer 38(6): 764–72.
  9. Braun V., Clarke V. 2006. “Using Thematic Analysis in Psychology.” Qualitative Research in Psychology 3: 77–101.
  10. Chamberlain D.W., Wenckebach G.F., Alexander F., Fraser R.S., Kolin A., Newman T. 2000. “Pathological Examination and the Reporting of Lung Cancer Specimens.” Clinical Lung Cancer 1(4): 261–68.
  11. Chambers A.J., Pasieka J.L., Temple W.J. 2009. “Improvement in the Accuracy of Reporting Key Prognostic and Anatomic Findings During Thyroidectomy by Using a Novel Web-Based Synoptic Operative Reporting System.” Surgery 146(6): 1090–98.
  12. Chapuis P.H., Chan C., Lin B.P., Armstrong K., Armstrong B., Spigelman A.D. et al. 2007. “Pathology Reporting of Resected Colorectal Cancers in New South Wales in 2000.” ANZ Journal of Surgery 77(11): 963–69.
  13. Contandriopoulos D., Lemire M., Denis J.L., Tremblay E. 2010. “Knowledge Exchange Processes in Organizations and Policy Arenas: A Narrative Systematic Review of the Literature.” The Milbank Quarterly 88(4): 444–83.
  14. Cross S.S., Feeley K.M., Angel C.A. 1998. “The Effect of Four Interventions on the Informational Content of Histopathology Reports of Resected Colorectal Carcinomas.” Journal of Clinical Pathology 51(6): 481–82.
  15. Damschroder L.J., Aron D.C., Keith R.E., Kirsh S.R., Alexander J.A., Lowery J.C. 2009. “Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science.” Implementation Science 4: 50.
  16. Davis D., Evans M., Jadad A., Perrier L., Rath D., Ryan D. et al. 2003. “The Case for Knowledge Translation: Shortening the Journey from Evidence to Effect.” BMJ 327(7405): 33–35.
  17. Denis J.L., Hebert Y., Langley A., Lozeau D., Trottier L.H. 2002. “Explaining Diffusion Patterns for Complex Health Care Innovations.” Health Care Management Review 27(3): 60–73.
  18. Dijkstra R., Wensing M., Thomas R., Akkermans R., Braspenning J., Grimshaw J., Grol R. 2006. “The Relationship between Organisational Characteristics and the Effects of Clinical Guidelines on Medical Performance in Hospitals, a Meta-Analysis.” BMC Health Services Research 6: 53. doi:10.1186/1472-6963-6-53.
  19. Dobbins M., Ciliska D., Cockerill R., Barnsley J., DiCenso A. 2002. “A Framework for the Dissemination and Utilization of Research for Health-Care Policy and Practice.” Online Journal of Knowledge Synthesis for Nursing E9(1): 149–60. doi:10.1111/j.1524-475X.2002.00149.x.
  20. Eccles M., Grimshaw J., Walker A., Johnston M., Pitts N. 2005. “Changing the Behavior of Healthcare Professionals: The Use of Theory in Promoting the Uptake of Research Findings.” Journal of Clinical Epidemiology 58(2): 107–12.
  21. Edhemovic I., Temple W.J., de Gara C.J., Stuart G.C. 2004. “The Computer Synoptic Operative Report – A Leap Forward in the Science of Surgery.” Annals of Surgical Oncology 11(10): 941–47.
  22. Fitzgerald L., Ferlie E., Wood M., Hawkins C. 2002. “Interlocking Interactions, the Diffusion of Innovations in Health Care.” Human Relations 55(12): 1429–49.
  23. Fraser I. 2004. “Translation Research: Where Do We Go From Here?” Worldviews on Evidence-Based Nursing 1(Suppl 1): S78–83.
  24. Gill A.J., Johns A.L., Eckstein R., Samra J.S., Kaufman A., Chang D.K. et al. 2009. “Synoptic Reporting Improves Histopathological Assessment of Pancreatic Resection Specimens.” Pathology 41(2): 161–67.
  25. Graham I.D., Logan J., Harrison M.B., Straus S.E., Tetroe J., Caswell W., Robinson N. 2006. “Lost in Knowledge Translation: Time for a Map?” Journal of Continuing Education in the Health Professions 26(1): 13–24.
  26. Greenhalgh T., Robert G., MacFarlane F., Bate P., Kyriakidou O. 2004. “Diffusion of Innovations in Service Organizations: Systematic Review and Recommendations.” The Milbank Quarterly 82(4): 581–629.
  27. Grimshaw J., Eccles M., Thomas R., MacLennan G., Ramsay C., Fraser C., Vale L. 2006. “Toward Evidence-Based Quality Improvement. Evidence (and Its Limitations) of the Effectiveness of Guideline Dissemination and Implementation Strategies 1966–1998.” Journal of General Internal Medicine 21(Suppl 2): S14–20.
  28. Grimshaw J.M., Shirran L., Thomas R.E., Mowatt G., Fraser C., Bero L. et al. 2001. “Changing Provider Behaviour: An Overview of Systematic Reviews of Interventions.” Medical Care 39(8 Suppl 2): II2–45.
  29. Grol R., Bosch M.C., Hulscher M., Eccles M.P., Wensing M. 2007. “Planning and Studying Improvement in Patient Care: The Use of Theoretical Perspectives.” The Milbank Quarterly 85(1): 93–138.
  30. Grol R., Grimshaw J. 1999. “Evidence-Based Implementation of Evidence-Based Medicine.” The Joint Commission Journal on Quality Improvement 25(10): 503–13.
  31. Grol R., Grimshaw J. 2003. “From Best Evidence to Best Practice: Effective Implementation of Change in Patients' Care.” Lancet 362(9391): 1225–30.
  32. Hammond E.H., Flinner R.L. 1997. “Clinically Relevant Breast Cancer Reporting: Using Process Measures to Improve Anatomic Pathology Reporting.” Archives of Pathology and Laboratory Medicine 121(11): 1171–75.
  33. Harvey A., Zhang H., Nixon J., Brown C.J. 2007. “Comparison of Data Extraction from Standardized Versus Traditional Narrative Operative Reports for Database-Related Research and Quality Control.” Surgery 141(6): 708–14.
  34. Havelock R.G. 1969. Planning for Innovation through Dissemination and Utilization of Knowledge. Ann Arbor, MI: The University of Michigan Institute for Social Research.
  35. Helfrich C.D., Weiner B.J., McKinney M.M., Minasian L. 2007. “Determinants of Implementation Effectiveness: Adapting a Framework for Complex Innovations.” Medical Care Research and Review 64(3): 279–303.
  36. Iles V., Sutherland K. 2001. Organizational Change: A Review for Health Care Managers, Professionals and Researchers. London, UK: National Health Service. <http://www.sdo.nihr.ac.uk/files/adhoc/change-management-review.pdf>.
  37. Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG). 2006. “Designing Theoretically-Informed Implementation Interventions.” Implementation Science 1: 4.
  38. Jacobson N., Butterill D., Goering P. 2003. “Development of a Framework for Knowledge Translation: Understanding User Context.” Journal of Health Services Research & Policy 8(2): 94–99.
  39. Karim R.Z., van den Berg K.S., Colman M.H., McCarthy S.W., Thompson J.F., Scolyer R.A. 2008. “The Advantage of Using a Synoptic Pathology Report Format for Cutaneous Melanoma.” Histopathology 52(2): 130–38.
  40. Kitson A., Harvey G., McCormack B. 1998. “Enabling the Implementation of Evidence Based Practice: A Conceptual Framework.” Quality in Health Care 7(3): 149–58.
  41. Kitson A.L. 2009. “The Need for Systems Change: Reflections on Knowledge Translation and Organizational Change.” Journal of Advanced Nursing 65(1): 217–28.
  42. Kitson A.L., Rycroft-Malone J., Harvey G., McCormack B., Seers K., Titchen A. 2008. “Evaluating the Successful Implementation of Evidence into Practice Using the PARiHS Framework: Theoretical and Practical Challenges.” Implementation Science 3(1): 1.
  43. Laflamme M.R., Dexter P.R., Graham M.F., Hui S.L., McDonald C.J. 2005. “Efficiency, Comprehensiveness and Cost-Effectiveness When Comparing Dictation and Electronic Templates for Operative Reports.” AMIA Annual Symposium Proceedings 2005: 425–29.
  44. Lavis J.N., Robertson D., Woodside J.M., McLeod C.B., Abelson J.; Knowledge Transfer Study Group. 2003. “How Can Research Organizations More Effectively Transfer Research Knowledge to Decision Makers?” The Milbank Quarterly 81(2): 221–48.
  45. Leviton L.C. 2003. “Evaluation Use: Advances, Challenges and Applications.” American Journal of Evaluation 24: 525–35.
  46. Litaker D., Tomolo A., Liberatore V., Stange K.C., Aron D. 2006. “Using Complexity Theory to Build Interventions That Improve Health Care Delivery in Primary Care.” Journal of General Internal Medicine 21(Suppl 2): S30–34.
  47. Look Hong N.J., Gagliardi A.R., Bronskill S.E., Paszat L.F., Wright F.C. 2010. “Multidisciplinary Cancer Conferences: Exploring Obstacles and Facilitators to Their Implementation.” Journal of Oncology Practice 6(2): 61–68.
  48. Messenger D.E., McLeod R.S., Kirsch R. 2011. “What Impact Has the Introduction of a Synoptic Report for Rectal Cancer Had on Reporting Outcomes for Specialist Gastrointestinal and Nongastrointestinal Pathologists?” Archives of Pathology and Laboratory Medicine 135(11): 1471–75.
  49. Mitton C., Adair C.E., McKenzie E., Patten S.B., Perry Waye B. 2007. “Knowledge Transfer and Exchange: Review and Synthesis of the Literature.” The Milbank Quarterly 85(4): 729–68.
  50. Mohanty S.K., Piccoli A.L., Devine L.J., Patel A.A., William G.C., Winters S.B. et al. 2007. “Synoptic Tool for Reporting of Hematological and Lymphoid Neoplasms Based on World Health Organization Classification and College of American Pathologists Checklist.” BMC Cancer 7: 144.
  51. Park J., Pillarisetty V.G., Brennan M.F., Jarnagin W.R., D'Angelica M.I., Dematteo R.P. et al. 2010. “Electronic Synoptic Operative Reporting: Assessing the Reliability and Completeness of Synoptic Reports for Pancreatic Resection.” Journal of the American College of Surgeons 211(3): 308–15.
  52. Patton M.Q. 2002. Qualitative Research and Evaluation Methods (3rd ed.). Thousand Oaks, CA: Sage Publications.
  53. Plsek P.E., Greenhalgh T. 2001. “Complexity Science: The Challenge of Complexity in Health Care.” BMJ 323(7313): 625–28.
  54. Pollitt C. 1993. “The Struggle for Quality: The Case of the NHS.” Policy and Politics 21(3): 161–70.
  55. Rigby K., Brown S.R., Lakin G., Balsitis M., Hosie K.B. 1999. “The Use of a Proforma Improves Colorectal Cancer Pathology Reporting.” Annals of the Royal College of Surgeons of England 81(6): 401–03.
  56. Rubin H., Rubin I. 1995. Qualitative Interviewing: The Art of Hearing Data. Thousand Oaks, CA: Sage Publications.
  57. Rycroft-Malone J., Harvey G., Seers K., Kitson A., McCormack B., Titchen A. 2004. “An Exploration of the Factors That Influence the Implementation of Evidence into Practice.” Journal of Clinical Nursing 13(8): 913–24.
  58. Satterfield J.M., Spring B., Brownson R.C., Mullen E.J., Newhouse R.P., Walker B.B., Whitlock E.P. 2009. “Toward a Transdisciplinary Model of Evidence-Based Practice.” The Milbank Quarterly 87(2): 368–90.
  59. Srigley J.R., McGowan T., Maclean A., Raby M., Ross J., Kramer S., Sawka C. 2009. “Standardized Synoptic Cancer Pathology Reporting: A Population-Based Approach.” Journal of Surgical Oncology 99(8): 517–24.
  60. Stake R. 2006. Multiple Case Study Analysis. New York, NY: Guilford Press.
  61. Stetler C.B. 2003. “Role of the Organization in Translating Research into Evidence-Based Practice.” Outcomes Management 7(3): 97–103.
  62. Temple W.J., Francis W.P., Tamano E., Dabbs K., Mack L.A., Fields A. 2010. “Synoptic Surgical Reporting for Breast Cancer Surgery: An Innovation in Knowledge Translation.” American Journal of Surgery 199(6): 770–75.
  63. Titler M.G. 2008. “The Evidence for Evidence-Based Practice Implementation.” In Hughes R.G. (Ed.), Patient Safety and Quality: An Evidence-Based Handbook for Nurses. Rockville, MD: Agency for Healthcare Research and Quality.
  64. Urquhart R. 2013. Multi-Level Factors Influence the Implementation and Use of Complex Innovations in Cancer Care: A Multiple Case Study of Synoptic Reporting in Nova Scotia [PhD dissertation]. Halifax, NS: Dalhousie University.
  65. Urquhart R., Porter G.A., Grunfeld E., Sargeant J. 2012. “Exploring the Interpersonal-, Organization-, and System-Level Factors That Influence the Implementation and Use of an Innovation – Synoptic Reporting – in Cancer Care: Study Protocol.” Implementation Science 7: 12.
  66. Urquhart R., Porter G.A., Sargeant J., Jackson L., Grunfeld E. 2014. “Multi-Level Factors Influence the Implementation and Use of Complex Innovations in Cancer Care: A Multiple Case Study of Synoptic Reporting.” Implementation Science 9: 121.
  67. Urquhart R., Sargeant J., Porter G.A. 2011. “Factors Related to the Implementation and Use of an Innovation in Cancer Surgery.” Current Oncology 18(6): 271–79.
  68. Van de Ven A.H., Polley D.E., Garud R., Venkataraman S. 1999. The Innovation Journey. Oxford: Oxford University Press.
  69. Ward V., House A., Hamer S. 2009. “Developing a Framework for Transferring Knowledge into Action: A Thematic Analysis of the Literature.” Journal of Health Services Research & Policy 14(3): 156–64.
  70. Wilkinson N.W., Shahryarinejad A., Winston J.S., Watroba N., Edge S.B. 2003. “Concordance with Breast Cancer Pathology Reporting Practice Guidelines.” Journal of the American College of Surgeons 196(1): 38–43.
  71. Wright F.C., Gagliardi A.R., Fraser N., Quan M.L. 2011. “Adoption of Surgical Innovations: Factors Influencing Use of Sentinel Lymph Node Biopsy for Breast Cancer.” Surgical Innovation 18(4): 379–86.
  72. Yin R.K. 2009. Case Study Research: Design and Methods (4th ed.). Thousand Oaks, CA: Sage Publications.
  73. Zapka J., Taplin S.H., Ganz P., Grunfeld E., Sterba K. 2012. “Multilevel Factors Affecting Quality: Examples from the Cancer Care Continuum.” Journal of the National Cancer Institute Monographs 2012(44): 11–19.
  74. Zarbo R.J. 1992. “Interinstitutional Assessment of Colorectal Carcinoma Surgical Pathology Report Adequacy. A College of American Pathologists Q-Probes Study of Practice Patterns from 532 Laboratories and 15,940 Reports.” Archives of Pathology & Laboratory Medicine 116(11): 1113–19.
  75. Zimmerman B.J., Lindberg C., Plsek P.E. 1998. “A Complexity Science Primer: What Is Complexity Science and Why Should I Learn It?” In Edgeware: Insights from Complexity Science for Health Care Leaders. Irving, TX: VHA Publishing.
