Although the delivery of specialised skills by GPs is not new, the NHS Plan formalised the role of the GP with a special interest (GPwSI) as part of a radical programme to reconfigure the healthcare workforce.1 This development was part of a broader policy agenda to shift the balance of care towards the primary care sector, in order to deliver more patient-centred services and reduce waiting times and avoidable admissions to secondary care. However, against a background of increasing demands on limited resources and the need to maximise the benefits of additional health service investment, the focus has shifted to cost-effectiveness.2
Building on these developments, national frameworks were established to define skills, competencies and governance, while primary care organisations were encouraged to develop innovations in service delivery based on local need.3 However, despite the policy rhetoric, the initiative has gathered considerable momentum without any evidence base.
The randomised controlled trial by Baker et al4 in this month's Journal (page 912), showing no differences in clinical outcomes between hospital-based and practice-based orthopaedic clinics, reflects an early but developing evidence base on the effectiveness of GPwSIs. With the shift in emphasis to decision making at a local level, a key question is how the evidence base can be developed to support policy decisions in a way that is relevant to local health economies.
The evaluation of public policy spans a spectrum of approaches.
RATIONAL DECISION MAKING
The dominant analytical framework for health policy research, reflected in Baker et al's study, is known as a rational approach. In its broadest sense, this demands an explicit statement of objectives and values, and an examination of the costs and consequences of competing alternatives, in order to provide a rigorous and generalisable evidence base. These demands present a formidable challenge to health service researchers.5,6
A rational approach needs the purpose of investment in GPwSIs to be clear from the outset: whether GPwSIs are intended to be additional to, and working in cooperation with, existing secondary care services (increasing health care outputs more efficiently from additional resources); or substituting for, and therefore working in competition with, existing services (achieving a desired output at minimum cost). Although some developments have been shown to be an addition,7 within the context of a modernisation agenda that demands a balancing act between competition and cooperation, a more realistic perspective is to see them as a combination of both. This gives rise to conflicting economic perspectives; that is, different costs and benefits will be relevant from the perspective of the commissioning GP practice, the primary care organisation, the hospital trust or the NHS. The different answers can potentially destabilise local health economies.8 For example, in Bradford, where the PCT led the country in creating GPwSI posts, the hospital trust ran into serious financial difficulties. The combined effect of GPwSIs skimming off low-cost work against a background of national tariffs is likely to reduce hospital incomes.
Other problems with a rational approach are well recognised and may prove insurmountable. For example, studies must control for referral rates increasing with better access; it is difficult to weigh and integrate the many relevant outcomes; there may also be important but unanticipated consequences in other parts of the system that are not captured; each GPwSI development will reflect different local health economies, historical contexts, case and intervention mix.
DECISION MAKING IN A GARBAGE CAN
At the other extreme of the policy analysis spectrum is garbage can decision making.9 Here, relating means to ends is highly problematic, and policy making is often arbitrary. Problems and solutions float around at random and their resolution depends on the time they are picked up and the availability of cans in which to put them.
For example, an intermediate care headache clinic led by one of the authors began with a chance encounter between a GP with an interest in headache and a senior PCT manager who had severe migraine, together with the availability of a small amount of soft money to pump-prime an initiative that is now in its fifth year.10
INCREMENTAL MODELS
This approach sits mid-way between the two extremes and recognises that there are limits to rational behaviour due to limited information and processing power.
Incremental models identify how we ‘muddle through’ and stress the importance of change by mutual adjustment and negotiation underpinned by pragmatism.11,12 Such models emphasise the importance of the context in which economic transactions take place. The context, in turn, is influenced by culture and social norms, and the relative power wielded by different stakeholders.
If we accept this as a more accurate model of the world, an approach known as realistic evaluation may offer a more relevant framework within which to develop health service research.
DEVELOPING AN EVIDENCE BASE IN INTERMEDIATE CARE — TOWARDS REALISTIC EVALUATION
Despite significant investment, the impact of health service research on service delivery has been disappointing.13,14 Research is still viewed as a store in which researchers are busy filling shelves with a comprehensive set of studies that a decision maker might some day drop by to purchase.15 The aim is to extend the notion of internal validity to all customers, even in the presence of an increasingly heterogeneous set of confounding variables that come into play as soon as the check-out is reached.
Realistic evaluation reflects a foundation in scientific realist philosophy. It seeks to understand the ways in which mechanisms, such as GPwSI clinics, interact with contextual factors, such as local professional networks, history and culture, to bring about unique outcomes. In contrast, the currently prevailing approach minimises contextual factors in order to identify more direct and universal relationships between mechanisms and outcomes.16 Although realistic evaluation has been used widely in education and criminology research, we are aware of only one published study in health care, in which it was used to complement a randomised controlled trial investigating the impact of mental health link workers in primary care.17
Realistic evaluation promises more useful insights into specific interventions, shifting the question from ‘what works?’ to ‘what works for whom in what circumstances?’ The starting point is to generate a number of theories of how mechanism, context and outcomes may inter-relate. For example, one theory would be that the mechanism of shared discussion and support, operating in a context with a history of good relationships between GPs and consultants, leads to better health outcomes. Other examples are shown in the Journal's online supplementary information.
Such hypotheses then frame the research strategies to test possible configurations of context, mechanism and outcome, providing results that may be transferable rather than generalisable. Statistical significance is replaced by ‘likely to be of importance’. Although evidence from randomised controlled trials is not excluded, the importance of qualitative, ethnographic and case study research is elevated, as these methods can provide a richer understanding of local contexts and contingencies.
Baker et al's paper represents an important first step in the development of an evidence base to support GPwSI expansion. Realistic evaluation can offer an analytical framework to complement the randomised controlled trial, one that takes stock of social structures, local cultures and institutions. It reflects the reality that there are no idealised solutions; rather, it is the actions of stakeholders, triggered in conducive circumstances, that can lead to relevant outcomes. The research task is to identify, articulate, test and refine configurations of mechanism, context and outcome, rather than assuming that a specific intervention alone gives the desired results.
A broader evidence base supporting the right thinking is more likely to get us to an approximation of where we want to be than attempts to engineer health economies towards defined outcomes underpinned by rational analysis. Realistic evaluation offers an important framework that can facilitate this process in a way that is accessible and acceptable to local policy makers.
Supplementary information
Additional information is available online at http://www.rcgp.org.uk/journal/supp/index.asp
REFERENCES
- 1. NHS. An NHS Plan for investment, a plan for reform. London: The Stationery Office; 2000.
- 2. The Audit Commission. Transforming primary care: the role of primary care trusts in shaping and supporting general practice and a quicker treatment closer to home. www.audit-commission.gov.uk (accessed Feb 2005).
- 3. Department of Health. Guidelines for the appointments of a general practitioner with a special interest. London: Department of Health; 2002.
- 4. Baker R, Sanderson-Mann J, Longworth, et al. Randomised controlled trial to compare GP-run orthopaedic clinics based in hospital outpatient departments and general practices. Br J Gen Pract. 2005;55:912–917.
- 5. Pencheon D. Intermediate care. BMJ. 2002;324:1347–1348. doi: 10.1136/bmj.324.7350.1347.
- 6. Kernick D. Developing intermediate care provided by general practitioners with a special interest: the economic perspective. Br J Gen Pract. 2003;53:553–556.
- 7. Nocon A, Leese B. The role of UK general practitioners with special clinical interests: implications for policy and service delivery. Br J Gen Pract. 2004;54:50–56.
- 8. Mannion R. Payment by results and demand management – learning from the South Yorkshire laboratory. Report to the Department of Health. York: University of York, Centre for Health Economics; 2005.
- 9. Cohen M, March J, Olsen J. A garbage can model of organisational choice. Adm Sci Q. 1972;17:1–25.
- 10. Kernick D. A descriptive study of an intermediate headache clinic delivered by general practitioners with a special interest. Headache Care. 2005;2(2):101–104.
- 11. Simon HA. Administrative behaviour. New York: Free Press; 1957.
- 12. Lindblom CE. The science of muddling through. Public Adm Rev. 1959;19:78–88.
- 13. Black N. Evidence based policy: proceed with care. BMJ. 2000;323:275–279. doi: 10.1136/bmj.323.7307.275.
- 14. Ovretveit J, Gustafson D. Using research to inform quality programmes. BMJ. 2003;326:759–761. doi: 10.1136/bmj.326.7392.759.
- 15. Lomas J. Connecting research and policy. Can J Policy Res. 2000;1:140–144.
- 16. Pawson R, Tilley N. Realistic evaluation. London: Sage Publications; 1997.
- 17. Byng R, Norman I, Redfern S. Using realistic evaluation to evaluate a practice level intervention to improve primary healthcare for patients with long-term mental illness. Evaluation. 2005;11(1):69–93.
