International Journal of Health Policy and Management
Letter
2015 Dec 24;5(3):215–217. doi:10.15171/ijhpm.2015.214

Single Versus Multi-Faceted Implementation Strategies – Is There a Simple Answer to a Complex Question? A Response to Recent Commentaries and a Call to Action for Implementation Practitioners and Researchers

Gill Harvey 1,2,*, Alison Kitson 1
PMCID: PMC4770932  PMID: 26927596

We are increasingly aware of the need to be as effective and efficient as possible when designing and applying strategies to implement evidence-informed changes into policy and practice, particularly within our resource-constrained health systems. Implementation efforts consume a range of resources, including time, people, educational input, and communication systems, to name just a few. In many cases, the costs of implementation are not accurately described or evaluated,1 and hence it is difficult to make informed judgements about whether the resources invested in implementation provide relative value in terms of the improvements made in patient care, service delivery or population health. However, in a recent paper,2 we cautioned against a hasty response to a review of systematic reviews that suggested there was no evidence to indicate that multi-faceted implementation strategies were more effective than single strategies.3 Our argument was that implementation of changes in practice or policy is rarely simple or rational and that it is not particularly helpful to think in terms of a straightforward dichotomy between single and multi-faceted approaches to implementation. Instead, we proposed a more nuanced approach, using facilitators to assess and diagnose the barriers and enablers of implementation in a particular setting and then tailoring the facilitation approach accordingly.

Here we offer a response to four commentaries on our paper, ‘Translating Evidence into Healthcare Policy and Practice: Single Versus Multi-Faceted Implementation Strategies – Is There a Simple Answer to a Complex Question?’2 It has been interesting and insightful to read and reflect upon the commentaries offered by Rycroft-Malone,4 Wilkinson and Frost,5 Bucknall and Fossum,6 and Eldh and Wallin.7 We begin by summarising what seem to be the main messages identified within the commentaries, before offering some final thoughts on key actions to help both practitioners and researchers of implementation move the field forward.

Firstly, we reflect on the issues identified by the commentators. These highlight a number of important points relating to: the recognised complexity of implementation; the ambiguity of defining single versus multi-faceted interventions; the importance of matching implementation approaches to a detailed assessment of the situation; the valuable role of theory and conceptual frameworks; and the need for evaluation and review methodologies that are sophisticated enough to capture the inherent complexity of implementation. We will briefly address each of these in turn. On the first point, all of the commentaries reinforce our belief about implementation, namely that ‘it’s complicated’ and that, as yet, we do not fully understand the mechanisms by which new evidence does (or does not) find its way into changes in policy and practice. On a related theme, both Bucknall and Fossum6 and Eldh and Wallin7 raise an important point regarding the distinction made between single and multi-faceted interventions, noting the ambiguity that exists. What one person might define as a single (or composite) intervention, others might interpret as a multi-faceted intervention, and hence the distinction itself is unclear. A number of useful examples are provided in the commentaries that illustrate this point, including that of facilitation. Facilitation could potentially be defined as either a single or a multi-faceted intervention according to the terms applied in the review of reviews3 (where a single intervention is focused on addressing one of many potential barriers and a multi-faceted intervention targets several barriers). From our experience – and a view echoed by the commentators – facilitation is a complex intervention in its own right, comprising a person (the facilitator) applying a range of strategies appropriate to the context in which implementation is taking place, the people they are working with and the change they are trying to introduce.8

Building upon these previous points, there is general agreement amongst the commentaries about the need to match approaches to a detailed assessment of the context in which implementation is taking place. Rycroft-Malone (p. 482)4 offers a useful preface to the ‘transfer-translate-transform’9 framework that we proposed in the original paper, namely the need to ‘assess-particularise-design.’ This clearly requires appropriate knowledge and skills to make a detailed assessment of the particular setting in which implementation is planned, accurately diagnose the likely barriers and select appropriately from the range and complexity of implementation strategies that are available. In reflecting on this point, Wilkinson and Frost5 helpfully remind us of the need to think about those staff who are at the forefront of implementation and provide practical ‘toolkits’ to guide their decision-making. On the subject of what is known and what is as yet unknown about the mechanisms by which implementation occurs, Rycroft-Malone4 emphasises the useful role of theory in ‘plugging’ the evidence gaps that exist and offering fresh insights. Within the field of implementation science, this is an area where significant advances have been made with a range of theories and conceptual frameworks available to guide implementation design, application and evaluation. These include theories relating to behaviour change,10,11 organizational readiness12 and the process of embedding and sustaining change in practice,13 to name just a few. Similarly, conceptual frameworks such as PARIHS (Promoting Action on Research Implementation in Health Services),14-16 the Consolidated Framework for Implementation Research (CFIR)17 and the Knowledge-to-Action (K2A)18 framework offer heuristic and practical guidance for planning and evaluating implementation. As our paper attempted to highlight, there are also many potentially useful theories that we could draw on from outside of healthcare and implementation science. We presented a theory developed by Carlile9,19 from studies in the field of manufacturing and product development, which suggested a need to apply a knowledge transfer, translation or transformation strategy depending on the nature and complexity of the boundaries that had to be overcome. Interestingly, some of the commentaries applied and adapted this theory further to illustrate the points they were making, reinforcing the helpful role of theory in advancing analysis, understanding and debate.

Our final reflections relate specifically to the methodologies and methods that we apply to study implementation. Again, this is an issue that all of the commentaries highlighted as important, in relation to both primary and secondary research studies. On the subject of primary research, the commentators emphasise the need to ‘unpack’ implementation interventions by providing more extensive descriptions of what was done, what happened as a result and why, including attention to likely contextual influences and ‘learning effects’ over time. The authors universally call for more creative designs to study implementation and knowledge translation, including randomised controlled trials (RCTs) with embedded process evaluation and the use of alternative methodologies such as realist evaluation. Questions are also raised about how best to undertake secondary research, such as systematic reviews or reviews of reviews, in a way that captures the complexity of what is actually happening and why. As Rycroft-Malone (p. 481)4 notes, any review is "always going to be dependent on the characteristics of the evidence that feeds it." Thus, if we want to develop a deeper and more contextually grounded understanding of implementation, we will need to apply review methods that enable us to capture this level of detail. In turn, this requires researchers who report implementation studies to document the detail of the strategies they adopted to change practice and behaviour. Bucknall and Fossum6 helpfully use the analogy of drug therapy to illustrate this point, suggesting a need to think about issues relating to dose, frequency, timing and duration of the implementation strategy, alongside issues such as compliance (fidelity) and side effects. As Wilkinson and Frost5 point out, the development of new reporting standards such as TIDieR (Template for Intervention Description and Replication)20 and StaRI (Standards for Reporting Implementation Studies of Complex Interventions)21 should help in this endeavour.

In closing, we would like to thank the commentators for their thoughtful and thought-provoking reviews, which suggest a number of important areas in terms of ‘a call to action’ for future implementation practice and research. These include:

  • Developing practical guidance and toolkits that those responsible for implementation can use to assess, diagnose and tailor implementation strategies to the local context;

  • Ensuring detailed reporting of the strategies that are applied in implementation projects and implementation research and how well (or not) they worked;

  • Embracing the complexity of implementation interventions, rather than trying to categorise them;

  • Adopting theories, research methodologies and methods that can accommodate and capture the complexity of implementation.

Finally, returning to our starting point of making sure that we use implementation resources wisely, we would suggest a need for more detailed and rigorous study of the cost-effectiveness of different implementation strategies. Only then will we be able to make valid judgements about whether the investment in implementation offers an acceptable return in terms of the gains achieved in improved processes and outcomes of healthcare.
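
As a minimal sketch of how such a judgement might be framed, assuming that the incremental costs and gains of one implementation strategy relative to another can be measured, cost-effectiveness analysis would typically compare the two via an incremental cost-effectiveness ratio:

$$\mathrm{ICER} = \frac{C_{\text{strategy}} - C_{\text{comparator}}}{E_{\text{strategy}} - E_{\text{comparator}}}$$

where $C$ denotes the total cost of implementation and subsequent care under each option and $E$ denotes the improvement achieved on the chosen process or outcome measure; the investment in implementation would be judged worthwhile only if this ratio fell below what decision-makers are willing to pay per unit of gain.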

Ethical issues

Not applicable.

Competing interests

Authors declare that they have no competing interests.

Authors’ contributions

Both authors reviewed the commentaries, and GH then drafted the manuscript, which both authors agreed upon before submission.

Authors’ affiliations

1 School of Nursing, University of Adelaide, Adelaide, SA, Australia. 2 Alliance Manchester Business School, University of Manchester, Manchester, UK.

Citation: Harvey G, Kitson A. Single versus multi-faceted implementation strategies – is there a simple answer to a complex question? A response to recent commentaries and a call to action for implementation practitioners and researchers. Int J Health Policy Manag. 2016;5(3):215-217. doi:10.15171/ijhpm.2015.214

References

1. Chambers D, Simpson L. Introduction to the 7th Annual Conference on the Science of Dissemination and Implementation: transforming health systems to optimize individual and population health. Implement Sci. 2015;10(Suppl 1):I1. doi:10.1186/1748-5908-10-S1-I1
2. Harvey G, Kitson A. Translating evidence into healthcare policy and practice: single versus multi-faceted implementation strategies – is there a simple answer to a complex question? Int J Health Policy Manag. 2015;4(3):123–126. doi:10.15171/ijhpm.2015.54
3. Squires J, Sullivan K, Eccles M, Worswick J, Grimshaw J. Are multifaceted interventions more effective than single-component interventions in changing health-care professionals’ behaviours? An overview of systematic reviews. Implement Sci. 2014;9(1):152. doi:10.1186/s13012-014-0152-6
4. Rycroft-Malone J. It’s more complicated than that: Comment on "Translating evidence into healthcare policy and practice: single versus multi-faceted implementation strategies – is there a simple answer to a complex question?". Int J Health Policy Manag. 2015;4(7):481–482. doi:10.15171/ijhpm.2015.67
5. Wilkinson JE, Frost H. "Horses for courses": Comment on "Translating evidence into healthcare policy and practice: single versus multi-faceted implementation strategies – is there a simple answer to a complex question?". Int J Health Policy Manag. 2015;4(10):685–686. doi:10.15171/ijhpm.2015.127
6. Bucknall T, Fossum M. It is not that simple nor compelling! Comment on "Translating evidence into healthcare policy and practice: single versus multi-faceted implementation strategies – is there a simple answer to a complex question?". Int J Health Policy Manag. 2015;4(11):787–788. doi:10.15171/ijhpm.2015.142
7. Eldh AC, Wallin L. How single is "single" – some pragmatic reflections on single versus multifaceted interventions to facilitate implementation: Comment on "Translating evidence into healthcare policy and practice: single versus multifaceted implementation strategies – is there a simple answer to a complex question?". Int J Health Policy Manag. 2015;4(10):699–701. doi:10.15171/ijhpm.2015.133
8. Harvey G, Kitson A. Implementing Evidence-Based Practice in Healthcare: A Facilitation Guide. Abingdon, Oxon: Routledge; 2015.
9. Carlile PR. Transferring, translating, and transforming: an integrative framework for managing knowledge across boundaries. Organ Sci. 2004;15(5):555–568. doi:10.1287/orsc.1040.0094
10. Michie S, van Stralen M, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci. 2011;6:42. doi:10.1186/1748-5908-6-42
11. Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7:37. doi:10.1186/1748-5908-7-37
12. Weiner B. A theory of organizational readiness for change. Implement Sci. 2009;4:67. doi:10.1186/1748-5908-4-67
13. May C, Mair F, Finch T, et al. Development of a theory of implementation and integration: Normalization Process Theory. Implement Sci. 2009;4:29. doi:10.1186/1748-5908-4-29
14. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7:149–159. doi:10.1136/qshc.7.3.149
15. Rycroft-Malone J, Kitson A, Harvey G, et al. Ingredients for change: revisiting a conceptual framework. Qual Saf Health Care. 2002;11:174–180. doi:10.1136/qhc.11.2.174
16. Harvey G, Kitson A. PARIHS re-visited: introducing i-PARIHS. In: Harvey G, Kitson A, eds. Implementing Evidence-Based Practice in Healthcare: A Facilitation Guide. Abingdon, Oxon: Routledge; 2015.
17. Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, Lowery J. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50. doi:10.1186/1748-5908-4-50
18. Graham I, Tetroe JM. The Knowledge to Action Framework. In: Rycroft-Malone J, Bucknall T, eds. Models and Frameworks for Implementing Evidence-Based Practice: Linking Evidence to Action. Chichester, England: Wiley-Blackwell; 2010.
19. Carlile PR. A pragmatic view of knowledge and boundaries: boundary objects in new product development. Organ Sci. 2002;13(4):442–455. doi:10.1287/orsc.13.4.442.2953
20. Hoffmann TC, Glasziou PP, Boutron I, et al. Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ. 2014;348:g1687. doi:10.1136/bmj.g1687
21. Pinnock H, Epiphaniou E, Sheikh A, et al. Developing standards for reporting implementation studies of complex interventions (StaRI): a systematic review and e-Delphi. Implement Sci. 2015;10(1):42. doi:10.1186/s13012-015-0235-z
