In a previous IJIC editorial I reflected on the fact that we have yet to make any significant breakthrough in understanding the implementation and sustainability of the complex service innovations that so characterise the development of integrated care programmes [1]. Without such knowledge we may be able to describe the core building blocks of integrated care systems, but we cannot adequately explain the intricacies of effective implementation nor fully understand the causes of the outcomes we observe. This is not simply a methodological problem; it reflects a more deep-rooted challenge in the lack of value placed both on the commissioning of such research and on the findings it produces.
Last month I was at a one-day seminar in London, attended by a ‘who’s who’ of health service researchers, examining the Challenges of Evaluating Integrated Care Programmes. This was hosted by the Health Foundation and the Nuffield Trust and was a chance to reflect on the evidence for integrated care following the recent conclusion of the three-year New Care Models programme funded by NHS England [2,3,4]. This itself built on earlier and well-evaluated programmes such as the Integrated Care Pilots [5] and Integrated Care Pioneers [6]. Specifically, the seminar asked why all this research had not yet been able to reach any definitive conclusion as to how the impact (positive, negative or neutral) of different integrated care strategies could be explained.
For example, evidence was presented at the meeting that multi-speciality community providers (MCPs) – one of the new care models in England, which sought to promote specialist care in the community – had had a collective impact in lowering emergency hospital admissions (if not overall bed days) [4]. However, given the significant variation in ‘the what and the how’ of implementation in practice, it has been impossible to understand which interventions or processes made the difference, how these should be balanced in practice, or indeed at what cost [4]. This means we do not yet have enough knowledge about what to implement or how to implement it, nor about what really makes a difference to the outcomes we observe – the very practicalities that managers and professionals need if they are to replicate or adopt new ways of working effectively.
The upshot of the discussion was that we needed research allowing a deeper dive into the ‘black box’ of complexities that integrated care initiatives present. However, it was also noted that evaluations, even those of the highest technical quality, have rarely been given the opportunity to use an implementation science approach to examine what happens at the operational level. Moreover, because such qualitative research is both time- and resource-intensive (for example, requiring ethnographic and observational methods), investigations are often based instead on surveys and/or case study approaches of limited design. The consequence is a list of ‘truisms’ – e.g. ‘teamwork is essential’, ‘effective leadership matters’, or ‘success depended on a positive organisational culture’. Such findings give an understanding of the domains that matter when evaluating the difference between success and failure, but they do not identify the practical tools and approaches that might be used in specific contexts to build or replicate such capabilities (if required, of course, since some things happen naturally or even by fluke).
If we are serious about understanding this ‘black box’ of complexities, then we need to take social science seriously. Unfortunately, the value of an implementation science approach is under-appreciated: such studies are rarely commissioned and often not published in ‘high impact’ journals, as they are seen as sitting lower on the evidence hierarchy from which decisions are made. This is despite overwhelming evidence that it is the ‘softer’ issues – leadership and management, effective teams and networks, and positive cultures and behaviours – that are most likely to be the deciding factors in whether innovations in integrated care succeed or fail. Investigations therefore need to become much more granular in understanding how behaviours are influenced and what practical tools and approaches are required to do so.
To take a step forward in the development of practical tools to support the implementation of integrated care in practice, a cross-European project called SUSTAIN (Sustainable Tailored Integrated Care for Older People in Europe) sought to understand what worked, and what did not, when making improvements to highly variable integrated care initiatives across seven different countries [7]. The four-year study found the usual suspects – for example, the impact of professional attitudes and relationships, the importance of well-functioning teams, cultural factors and local contexts.
However, as the SUSTAIN researchers conclude, “the dynamic and non-linear nature of the improvement process must be recognised in order to take integrated care forward successfully. The environment within which integrated care programmes emerge in their own contexts change over time, and the agents (the people and organisations involved) are constantly adapting to such changes. Improvement processes help to reveal a range of issues that need to be addressed in order to implement services effectively and so achieve the improvement goals that integrated care initiatives seek to make. Whilst some of these issues may relate to minor changes in practice, others may reveal larger cultural or organisational issues that need to be addressed. An approach to integrated care improvement is therefore needed that is intelligent, sensitive, responsive and adaptive” [8, p. 62].
Over the course of the project, SUSTAIN identified and piloted many approaches to improvement. The final SUSTAIN Roadmap helpfully documents these improvement methods, providing a set of insights for others looking to implement similar approaches in their own contexts [8]. In this regard, key operational issues are addressed in detail – such as conducting meetings, establishing steering groups, maintaining team momentum, communicating and measuring progress, and providing effective feedback. The research suggests that the engine room of successful integrated care programmes, as complex service innovations, requires a cycle of continuous improvement, since making progress is always non-linear and usually a messy affair.
Progress continues to be made in implementation science methodology, going beyond the variations on realist synthesis reviewed previously [1], specifically in ways that help to recognise how and why the more intangible qualities of complex interventions might better explain outcomes. For example, the recently published NASSS framework provides a taxonomy for empirical application that helps explain the non-adoption or abandonment of technology by individuals and the difficulties of achieving scale-up, spread and sustainability – an approach with clear resonance for investigating integrated care [9]. Developmental evaluation is also a relatively new approach to support the creation of dynamic, complex innovations where expected outcomes are uncertain or unclear [10]. It has been used quite successfully in the context of transforming care delivery in Foundry – a province-wide initiative in British Columbia, Canada, involving over 100 partnerships and targeting young people aged 12–24 with a one-stop-shop for integrated mental health care, substance use services, primary care, social services, and youth and family peer support [11]. By creating a space for evaluation, the voices and priorities of young people could be brought into the co-design and development of the services with decision-makers and funders over time [12].
The conclusion to be drawn is that programme evaluations, however well designed to work through economic and other impacts, are likely to have limited ability to explain how integrated care works in practice. Process evaluations within these will pick up on important themes and issues in implementation, but not the tools and approaches used to enable them. What is needed is a shift in tactics, where evaluation takes a more practical and participatory form to support continuous reflective learning – embedded within integrated care projects and acting as a tool for quality improvement. Evaluation and monitoring practices may then become built into the DNA of everyday working practice, valued by all participants, and so enable the complexities of integrated care in specific contexts to be resolved in real time.
Some might argue that this approach would be too expensive and time-consuming, or that it may lack objectivity and thus the ability to validate findings. Yet, as explained in this editorial, integrated care programmes are complex and dynamic entities that require reflexive enquiry in order to evolve and improve over time. The methodologies we use to assess them therefore need to capture these dynamics and promote a more intimate relationship between research and practice. This has some profound consequences, since it implies that research needs to play a supportive (but not subordinate) role to practice, and equally that practice should respond to research findings. Unlocking the ‘black box’ to improve our knowledge of implementation complexity therefore implies the need to simultaneously tackle the tensions inherent in action research [13].
Acknowledgements
The author was a member of the SUSTAIN research consortium, a project funded under Horizon 2020 – the Framework Programme for Research and Innovation (2014–2020) from the European Union under grant agreement No. 634144.
Funding Statement
The author was a member of the SUSTAIN research consortium, a project funded under Horizon 2020 – the Framework Programme for Research and Innovation (2014–2020) from the European Union under grant agreement No. 634144.
Competing Interests
The author has no competing interests to declare.
References
- 1. Goodwin, N. Understanding and Evaluating the Implementation of Integrated Care: A ‘Three Pipe’ Problem. International Journal of Integrated Care, 2016; 16(4): 19. DOI: 10.5334/ijic.2609
- 2. National Audit Office. Developing new care models through NHS Vanguards; 2018. Available at: https://www.nao.org.uk/report/developing-new-care-models-through-nhs-Vanguards/.
- 3. The University of Manchester (online). The New Models of Care Vanguard Programme in England: National Programme Evaluation. Available at: https://www.research.manchester.ac.uk/portal/en/projects/the-new-models-of-care-vanguard-programme-in-england-national-programme-evaluation(03d3a975-e002-419d-9bae-fa4eb81dcb28).html.
- 4. Tallack, C. Evidence emerging from evaluation of the New Care Models. Paper presented at The King’s Fund, 21 March 2017. Available at: https://www.kingsfund.org.uk/audio-video/charles-tallack-evaluation-new-care-models.
- 5. RAND Europe, Ernst & Young LLP. National Evaluation of the Department of Health’s Integrated Care Pilots. RAND Europe, Ernst & Young LLP; March 2012. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/215103/dh_133127.pdf.
- 6. Erens, B, Wistow, G, Mounier-Jack, S, Douglas, N, Manacorda, T, Durand, MA and Mays, N. Early findings from the evaluation of the Integrated Care and Support Pioneers in England. Journal of Integrated Care, 2017; 25(3): 137–149. Available at: http://piru.lshtm.ac.uk/projects/current-projects/integrated-care-pioneers-evaluation.html. DOI: 10.1108/JICA-12-2016-0047
- 7. de Bruin, S, Stoop, A, Baan, C, Nijpels, G, Billings, J, on behalf of the SUSTAIN consortium. Sustainable Tailored Integrated Care for Older People in Europe (SUSTAIN project): Lessons learned from improving integrated care in Europe; 2019. Available at: https://www.sustain-eu.org/wp-content/uploads/sites/4/2019/01/SUSTAIN-overarching-policy-report_final-version.pdf.
- 8. SUSTAIN. Sustainable Tailored Integrated Care for Older People in Europe: The Roadmap; 2019. Available at: https://www.sustain-eu.org/wp-content/uploads/sites/4/2019/03/SUSTAIN-Roadmap.pdf.
- 9. Greenhalgh, T, Wherton, J, Papoutsi, C, Lynch, J, Hughes, G, A’Court, C, Hinder, S, Procter, R and Shaw, S. Analysing the role of complexity in explaining the fortunes of technology programmes: empirical application of the NASSS framework. BMC Medicine, 2018; 16: 66. DOI: 10.1186/s12916-018-1050-6
- 10. Patton, M. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. The Guilford Press; 2010.
- 11. Foundry (online). Foundry: Who We Are. Available at: https://foundrybc.ca/who-we-are/.
- 12. Salmon, A. Developmental evaluation as a tool for systems transformation: lessons from Foundry. Work-in-progress presentation, 6 December 2017. Available at: https://foundrybc.ca/work-progress-seminar-dr-amy-salmon/.
- 13. Hammersley, M. Action research: a contradiction in terms? Paper presented to the Annual Conference of the British Educational Research Association, 12–14 September 2002. Available at: http://www.leeds.ac.uk/educol/documents/00002130.htm.