Abstract
Background
Stratifying patient populations by risk of adverse events was believed to facilitate preventive care for those identified, but recent evidence does not support this. Emergency admission risk stratification (EARS) tools have been widely promoted in UK policy and GP contracts.
Aim
To describe availability and use of EARS tools across the UK, and identify factors perceived to influence implementation.
Design and setting
Cross-sectional survey in the UK.
Method
Online survey of 235 organisations responsible for UK primary care: 209 clinical commissioning groups (CCGs) in England; 14 health boards in Scotland; seven health boards in Wales; and five local commissioning groups (LCGs) in Northern Ireland. Analysis results are presented using descriptive statistics for closed questions and by theme for open questions.
Results
Responses were analysed from 171 (72.8%) organisations, of which 148 (86.5%) reported that risk tools were available in their areas. Organisations identified 39 different EARS tools in use. Promotion by NHS commissioners, involvement of clinical leaders, and engagement of practice managers were identified as the most important factors in encouraging use of tools by general practices. High staff workloads and information governance were identified as important barriers. Tools were most frequently used to identify individual patients, but also for service planning. Nearly 40% of areas using EARS tools reported introducing or realigning services as a result, but relatively few reported use for service evaluation.
Conclusion
EARS tools are widely available across the UK, although there is variation by region. There remains a need to align policy and practice with research evidence.
Keywords: Primary care, emergency health services, health care surveys, clinical prediction rule, implementation science, risk stratification
INTRODUCTION
Emergency admission risk stratification (EARS) tools have been widely promoted to help identify vulnerable people who may benefit from intervention in primary or community care.1,2 With rising emergency admissions that are costly and associated with poor patient outcomes,3,4 it was hoped that EARS could support efforts to positively impact on patient experience, health, and associated costs (the ‘triple aim’).1
The tools use clinical data, such as diagnoses, medications, and medical histories, alongside demographic data to calculate the risk of emergency hospital admission, typically within 12 months.5,6 Interest in EARS tools has accelerated over recent years as reports have emerged that up to a fifth of emergency admissions could be prevented through early intervention in primary care,7 while improvements in data availability, quality, and linkage have made it more feasible to generate risk scores from routine data.
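As a purely illustrative sketch of the general approach, and not the algorithm of any specific tool, the following shows how a logistic model over routine variables could yield a 12-month admission risk; all variable names and coefficients are invented for demonstration.

```python
# Illustrative only: a toy EARS-style risk score based on a logistic model over
# routine primary care variables. The variable names and coefficients below are
# invented for demonstration; real tools (for example, QAdmissions, SPARRA, or
# PRISM) use their own derived models and data specifications.
import math

# Hypothetical coefficients from a previously fitted logistic regression
COEFFICIENTS = {
    "intercept": -5.0,
    "age_per_decade": 0.35,              # per 10 years of age
    "prior_emergency_admissions": 0.60,  # count in the previous 12 months
    "chronic_conditions": 0.25,          # count of long-term conditions
    "polypharmacy": 0.30,                # 1 if >= 5 repeat medications, else 0
    "deprivation_quintile": 0.10,        # 1 (least) to 5 (most deprived)
}


def admission_risk(age, prior_admissions, conditions, polypharmacy, deprivation):
    """Return the modelled probability of emergency admission within 12 months."""
    z = (
        COEFFICIENTS["intercept"]
        + COEFFICIENTS["age_per_decade"] * (age / 10)
        + COEFFICIENTS["prior_emergency_admissions"] * prior_admissions
        + COEFFICIENTS["chronic_conditions"] * conditions
        + COEFFICIENTS["polypharmacy"] * polypharmacy
        + COEFFICIENTS["deprivation_quintile"] * deprivation
    )
    return 1 / (1 + math.exp(-z))  # logistic link


# Example: a 75-year-old with one recent emergency admission, three long-term
# conditions, polypharmacy, and mid-range deprivation.
print(round(admission_risk(75, 1, 3, 1, 3), 2))
```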
The literature identifies four stages that are essential to embed such clinical prediction rules in practice: rule development; validation in an external population; impact and effectiveness studies; and implementation in clinical practice.8–10
Much of the EARS literature focuses on the first two stages, with systematic reviews focusing on model derivation, technical characteristics, and performance.5,11,12
There is less literature addressing effectiveness, implementation, and impact. Of the studies that have used EARS to select patients for community interventions (typically case management or virtual wards), none reported reductions in admissions, cost savings, or significant patient benefit.13–16
Although there is no strong evidence supporting the clinical or cost effectiveness of interventions incorporating EARS tools, there has been concerted political and practical investment in EARS across the developed world.17–19 In the UK, state funding supported the development of risk models: Patients at Risk of Readmission (PARR) and the Combined Model in England; Scottish Patients at Risk of Readmission and Admission (SPARRA) in Scotland; and Predictive Risk Stratification Model (PRISM) in Wales. At the time this survey was distributed, all four nations had GP contracts that funded primary care ‘case management’ of patients at high risk of admission.
Use of EARS tools within GP contracts is not mandatory, but depends on local commissioning decisions. In 2011, the Department of Health decided to open the market for the supply of EARS tools, to promote commercial investment and choice,20 and to encourage clinical commissioning groups (CCGs) to coordinate local EARS requirements. However, little is known about EARS implementation at macro (nation), meso (commissioning group or health board), and micro (practice) levels; whether tool use has been limited to the GP contract case management activity; and what factors explain variation in accessibility and use. This study aimed to describe EARS tool implementation across the UK in terms of scale of uptake, which tools were available, how tools were used, and factors perceived by commissioners to influence usage.
How this fits in
EARS tools are reasonably accurate in terms of predicting which patients are at highest risk of admission to hospital in the following year. UK policy and GP contracts have advocated and incentivised the use of EARS to facilitate the provision of targeted care to those at highest risk, with an assumption that this would reduce emergency admissions. Recent evidence from a pragmatic randomised trial in general practice has shown that the introduction of EARS was associated with an increase in emergency admissions and days in hospital. To the authors’ knowledge, this study provides the first evidence relating to the implementation of EARS tools across the UK; policymakers and practitioners now need to consider the next steps in managing emergency admissions to hospital, and the role of EARS.
METHOD
Design
Cross-sectional online survey of NHS organisations with responsibility for primary care commissioning in the UK. This article accords with the Checklist for Reporting Results of Internet E-Surveys (CHERRIES) and the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) checklist.21,22
Participants and recruitment
The authors approached all 235 NHS commissioning organisations: 209 CCGs in England; 14 health boards in Scotland; seven health boards in Wales; and five local commissioning groups (LCGs) in Northern Ireland. Web searches and telephone enquiries identified 264 appropriate contacts across the 235 organisations. An email list was obtained from Binley’s Database of NHS Commissioning Organisations, including: CCG accountable officers; GP board members; and leads for commissioning acute unplanned or non-elective care, long-term conditions, primary care quality, information services, and data protection. In total, 1298 contacts were identified.
Each person was emailed, with up to four reminders. All emails were personalised and included a participant information sheet. The content of each reminder email was varied to increase impact, following the advice of Dillman et al.23 In all correspondence, the importance of gaining responses from organisations with or without EARS tools available in their area was emphasised. The SurveyMonkey platform was used to distribute emails, manage survey responses, and optimise the survey for desktops, tablets, and smartphones.23 The survey was live from November 2015 to May 2016.
An implied consent model was used, which infers willingness to participate from survey completion.24
Survey instrument
With no existing relevant instrument available, the authors developed a questionnaire to address the study’s objectives. Early versions were piloted with EARS specialists, survey methodologists, and with volunteers from 12 commissioning organisations from across the UK. Three volunteers took part in cognitive interviews to further test for ambiguity and usability, resulting in modifications to wording and the order of questions.
Lastly, technical aspects of the web-based survey management were tested, including mail merging and data extraction. The final questionnaire included closed questions and open-ended text questions, and addressed: responders’ characteristics; EARS tool availability; local use of EARS; and the role of specified factors in influencing EARS use (using a Likert scale from 1 [marked as ‘not at all important’] to 6 [‘very important’]). These factors were derived from relevant published literature.25–28
Text boxes enabled the capture of additional influencing factors not provided in the pre-identified list.
Analysis
IBM SPSS Statistics (version 22) was used to estimate unweighted descriptive statistics and cross tabulate responses by nation, size of organisation, and whether the responder was clinical or not. To analyse the question about the proportion of practices with access to an EARS tool, the midpoint of the range offered was used, namely 17% for ’1–33%’, 50% for ‘34–66%’, and 83% for ‘67–99%’, alongside 0% and 100%. NVivo (version 11) was used to manage textual replies and group them thematically. One author led coding, supported by a second author. Comments were selected for this paper to illustrate themes.
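A minimal sketch of this mid-range recoding is shown below, assuming the banded responses are held as labelled strings; the unweighted averaging is an assumption for illustration, as the published estimates also account for the number of practices per organisation.

```python
# Illustrative recoding of the banded survey responses to mid-range percentages,
# as described above. The band labels and the simple (unweighted) averaging are
# assumptions for demonstration; the published estimates also reflect the number
# of practices covered by each responding organisation.
MID_RANGE = {"0%": 0, "1-33%": 17, "34-66%": 50, "67-99%": 83, "100%": 100}


def estimated_coverage(responses):
    """Mean mid-range percentage of practices with EARS access across organisations."""
    values = [MID_RANGE[band] for band in responses]
    return sum(values) / len(values)


# Example: three organisations reporting '100%', '67-99%', and '1-33%' availability.
print(round(estimated_coverage(["100%", "67-99%", "1-33%"]), 1))  # (100 + 83 + 17) / 3 = 66.7
```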
RESULTS
Profile of responses
The authors received responses relating to 171 (72.8%) of the 235 organisations approached (Table 1).
Table 1.
Characteristic | n/N | %
---|---|---
Nation | |
England (CCGs) | 152/209 | 73
Scotland (NHS boards) | 7/14 | 50
Wales (health boards) | 7/7 | 100
Northern Ireland (LCGs) | 5/5 | 100
Size of catchment population | |
Small (22k to 202k) | 53/171 | 31
Medium (203k to 302k) | 55/171 | 32
Large (303k to 1142k) | 63/171 | 37
CCG deprivation quartiles (England only) | |
1 (lowest deprivation) | 40/152 | 26
2 | 34/152 | 22
3 | 37/152 | 24
4 (highest deprivation) | 41/152 | 27
Type of responder per organisation | |
GP | 49/171 | 29
Other clinician | 17/171 | 10
Non-clinician | 105/171 | 61
Employing organisation (multiple response) | |
Commissioning organisation (CCG, health board, or LCG) | 140/171 | 82
General practice | 35/171 | 20
Commissioning Support Unit (England only) | 7/171 | 4
Other | 5/171 | 3
CCG = clinical commissioning group. k = 1000. LCG = local commissioning group.
Five responders each covered two CCGs, and one responder covered all five Northern Ireland LCGs, reflecting the national approach taken there; the total number of individual contributors was therefore 161. Seventeen duplicate responses were excluded. Participating organisations supported a mean of 42.9 practices each and a mean population of 292 000; the 64 non-responding organisations supported a mean of 43.5 practices each and a mean population of 277 000. The majority of responses came from employees of the commissioning group or health board (n = 140 organisations, 81.9%). Median survey completion time was 9.2 minutes. Free-text comments were provided for 129 responses (75.4%).
Access to EARS tools across the UK
Of the 171 responding organisations, 148 (86.5%) reported availability of EARS tools to ≥1 practice in their area, but this varied widely by country (Table 2).
Table 2.
Reported estimate of practices with EARS availability in commissioning area, % | England CCGs, n (%) | Scotland health boards, n (%) | Wales health boards, n (%) | Northern Ireland LCGs, n (%) | Total (N = 171), n (%)
---|---|---|---|---|---
None (0) | 19 (12.5) | — | 4 (57.1) | — | 23 (13.5) |
1–33 | 9 (5.9) | — | 2 (28.6) | — | 11 (6.4) |
34–66 | 5 (3.3) | 2 (28.6) | 1 (14.3) | — | 8 (4.7) |
67–99 | 22 (14.5) | 1 (14.3) | — | — | 23 (13.5) |
All (100) | 97 (63.8) | 4 (57.1) | — | 5 (100) | 106 (62.0) |
Any access | 133/152 (87.5) | 7/7 (100) | 3/7 (42.8) | 5/5 (100) | 148 (86.5) |
EARS = emergency admission risk stratification. CCG = clinical commissioning group. LCG = local commissioning group.
Overall, the authors estimate almost three-quarters of practices in responding areas had access to EARS tools: Northern Ireland (100%), Scotland (91%), England (76%), and Wales (14%), based on mid-ranges as outlined in the Method section.
There were 39 EARS tools identified in use across the UK (Table 3 lists the main tools encountered). This included 37 in England (commercial and NHS tools), with the NHS versions often developed with or by Commissioning Support Units, which provide business intelligence support to multiple CCGs. In the other nations, a single tool, typically a national NHS product, was reported in each commissioning area: SPARRA in Scotland, PRISM in Wales, and a similar tool in Northern Ireland.
Table 3.
Supplier — Tool | Underlying risk model | Sources of data informing risk model algorithm (P/S) | England, n | Scotland, n | Wales, n | NI, n | Total, n | %
---|---|---|---|---|---|---|---|---
EMIS Web — Risk Stratification | QAdmissions | P | 58 | 1 | 0 | 0 | 59 | 34.5 |
TPP Systm One | TPP | P | 35 | 0 | 0 | 0 | 35 | 20.5 |
Bespoke local tool | Varies | Varies | 30 | 0 | 0 | 0 | 30 | 17.5 |
Sollis — Clarity Patients | ACG | P&S | 22 | 0 | 0 | 0 | 22 | 12.8 |
Vision | QAdmissions | P | 15 | 0 | 0 | 0 | 15 | 5.8 |
North of England CSU — RAIDR | Combined | P&S | 12 | 0 | 0 | 0 | 12 | 7.0 |
Vision — Basic tool | Vision | P | 10 | 0 | 0 | 0 | 10 | 5.9 |
Capita | ACG | P&S | 7 | 0 | 0 | 0 | 7 | 4.1 |
Dr Foster | Dr Foster | Unknown | 6 | 0 | 0 | 0 | 6 | 3.5 |
Eclipse | Eclipse | Unknown | 5 | 0 | 0 | 0 | 5 | 2.9 |
ISD Scotland — SPARRA | SPARRA | P&S | 0 | 5 | 0 | 0 | 5 | 2.9 |
Health Intelligence | Combined | P&S | 5 | 0 | 0 | 0 | 5 | 2.9 |
NI HSCB — Risk Stratification | NI model | P&S | 0 | 0 | 0 | 5 | 5 | 2.9 |
NHS Wales — PRISM | PRISM | P&S | 2 | 0 | 2 | 0 | 4 | 2.4 |
Other | Varies | Varies | 16 | 1 | 1 | 0 | 18 | 10.5 |
ACG = adjusted clinical group. EARS = emergency admission risk stratification. CSU = commissioning support unit. HSCB = Health and Social Care Board. ISD = Information Services Division. NI = Northern Ireland. P = primary care. PRISM = Predictive Risk Stratification Model. RAIDR = Reporting Analysis and Intelligence Delivering Results. S = secondary care. SPARRA = Scottish Patients at Risk of Readmission and Admission. TPP = The Phoenix Partnership.
Factors encouraging and inhibiting access and use of EARS tools
The factors seen as most important in encouraging access and use were: engagement of practice managers (mean score 4.84), clinical leadership (mean score 4.77), and the role of NHS commissioners (mean score 4.63) (Table 4). Research evidence (mean score 3.59), case studies of benefits from other areas (mean score 3.36), and the role of other NHS agencies (mean score 3.25) scored lowest. Text comments also flagged the importance of the unplanned admissions enhanced service in England, as well as other encouraging factors:
‘Ease of use and regular reminders about updated data.’
(LCG A, Scotland)
‘Additional engagement with practices through use in social care.’
(CCG A, North of England)
Table 4.
Factor | England (n = 127) | Northern Ireland (n = 5) | Scotland (n = 7) | Wales (n = 3) | Total (n = 142)
---|---|---|---|---|---
Engagement of practice managers | 4.85 (1.23) | 6.00 (0.00) | 3.71 (1.60) | 5.00 (1.00) | 4.84 (1.26) |
Clinical leadership | 4.74 (1.27) | 6.00 (0.00) | 4.57 (1.72) | 4.67 (0.58) | 4.77 (1.27) |
Role of CCG/health board/LCG | 4.67 (1.13) | 6.00 (0.00) | 3.14 (1.34) | 4.33 (2.08) | 4.63 (1.21) |
Financial incentives | 4.28 (1.50) | 6.00 (0.00) | 3.57 (1.51) | 2.00 (1.00) | 4.25 (1.54) |
Local or national priorities or policy | 4.10 (1.42) | 5.00 (0.00) | 3.14 (1.35) | 3.67 (1.15) | 4.07 (1.41) |
Local service provision aligned with EARS use | 4.14 (1.44) | 5.00 (0.00) | 3.14 (1.21) | 1.00 (0.00) | 4.05 (1.52) |
Role of practice clusters or networks | 3.81 (1.59) | 5.00 (0.00) | 3.57 (1.40) | 3.67 (2.52) | 3.84 (1.58) |
Research evidence | 3.59 (1.49) | 5.00 (0.00) | 3.57 (1.71) | 1.33 (0.58) | 3.59 (1.51) |
Case studies of benefits from other areas | 3.35 (1.40) | 4.00 (0.00) | 3.71 (1.49) | 2.00 (1.73) | 3.36 (1.40) |
Role of other NHS agencies | 3.24 (1.50) | 4.00 (0.00) | 2.86 (1.46) | 3.33 (1.15) | 3.25 (1.47) |
Importance rated on a Likert scale of 1–6; values are mean (SD). CCG = clinical commissioning group. EARS = emergency admission risk stratification. LCG = local commissioning group. SD = standard deviation.
The factors perceived as most important in inhibiting access to and use of EARS tools related to capacity: workload of practice staff (mean score 4.94) and workload of other care staff (mean score 4.08) (Table 5). Information governance issues were considered slightly less important (mean score 3.90). Text comments on other factors inhibiting general practice uptake highlighted perceptions of a lack of value or confidence in tool inputs and outputs, and commissioning challenges:
‘For many practices, the tool did not identify many patients who were not on their horizons already.’
(CCG C, South of England)
‘Biggest issue has been lack of reliability of the tool.’
(CCG D, North of England)
‘The CCG has struggled to commission an appropriate EARS [tool]; some of this has been related to contradictory offers from different providers, including local Commissioning Support Unit.’
(CCG E, Midlands and East of England)
Table 5.
Factor | England, with EARS (n = 127) | Northern Ireland, with EARS (n = 5) | Scotland, with EARS (n = 7) | Wales, with EARS (n = 3) | Subtotal, with EARS (n = 142) | England, without EARS (n = 14) | Wales, without EARS (n = 4) | Subtotal, without EARS (n = 18) | Total (n = 160)
---|---|---|---|---|---|---|---|---|---
Lack of research evidence | 2.86 (1.46) | 4.00 (0.00) | 3.00 (1.15) | 3.67 (2.51) | 2.93 (1.43) | 3.50 (1.45) | 3.25 (0.95) | 3.44 (1.38) | 2.99 (1.43) |
Lack of training/expertise in using EARS tools | 3.58 (1.58) | 3.00 (0.00) | 4.29 (1.80) | 1.33 (0.58) | 3.55 (1.55) | 4.36 (1.50) | 1.50 (1.00) | 3.72 (1.84) | 3.57 (1.58) |
Cost of introducing EARS tools | 2.85 (1.56) | 3.00 (0.00) | 3.00 (1.15) | 2.67 (2.89) | 2.86 (1.46) | 4.57 (1.34) | 1.50 (1.00) | 3.89 (1.81) | 2.98 (1.53) |
Information governance issues | 3.84 (1.64) | 5.00 (0.00) | 2.86 (1.36) | 4.67 (0.58) | 3.85 (1.58) | 4.50 (1.56) | 3.50 (1.91) | 4.28 (1.64) | 3.90 (1.59) |
Issues with software or hardware | 3.48 (1.63) | 4.00 (0.00) | 4.14 (1.87) | 3.33 (0.58) | 3.53 (1.56) | 3.86 (1.74) | 1.25 (0.50) | 3.28 (1.90) | 3.50 (1.60) |
Resistance from clinical leaders | 3.08 (1.54) | 2.00 (0.00) | 3.29 (1.11) | 4.33 (2.08) | 3.07 (1.51) | 2.92 (1.54) | 4.00 (1.82) | 3.17 (1.61) | 3.08 (1.52) |
Lack of interest from practice staff | 3.89 (1.55) | 4.00 (0.00) | 3.71 (1.70) | 3.00 (1.00) | 3.87 (1.47) | 3.71 (1.73) | 3.25 (2.06) | 3.61 (1.75) | 3.84 (1.50) |
Lack of alignment with service priorities or policy | 2.81 (1.36) | 3.00 (0.00) | 3.42 (1.40) | 1.00 (0.00) | 2.81 (1.33) | 3.07 (1.49) | 2.00 (1.41) | 2.83 (1.50) | 2.81 (1.35) |
Workload of practice staff | 5.02 (1.31) | 6.00 (0.00) | 4.57 (1.81) | 4.33 (2.08) | 5.02 (1.22) | 4.71 (1.44) | 2.75 (1.71) | 4.28 (1.67) | 4.94 (1.29) |
Workload of other care staff | 4.12 (1.55) | 5.00 (0.00) | 4.00 (1.63) | 3.33 (1.15) | 4.13 (1.48) | 4.28 (1.54) | 1.75 (1.50) | 3.72 (1.84) | 4.08 (1.52) |
EARS tool not integrated with clinical systems | 3.81 (1.66) | 5.00 (0.00) | 4.14 (1.77) | 2.67 (1.15) | 3.84 (1.61) | 4.42 (1.50) | 1.50 (0.58) | 3.78 (1.83) | 3.83 (1.64) |
Importance rated on a Likert scale of 1–6; values are mean (SD). Responses from organisations without EARS availability that answered 0 for every category were treated as missing data (n = 3). EARS = emergency admission risk stratification. SD = standard deviation.
Use of EARS tools
The majority of areas with access to EARS tools reported their use by GPs or other professionals to identify individual patients at risk (Table 6). Many areas were using the tools to support service planning or development at the level of commissioners or ‘practice clusters or networks’, but only a fifth of commissioners reported using EARS for service evaluation.
Table 6.
Use | England (N = 133), n | % | NI, Scotland, and Wales (N = 15), n | % | Total (N = 148), n | %
---|---|---|---|---|---|---
To identify patients for follow up or review (case finding) by practice staff | 113 | 85.0 | 12 | 80.0 | 125 | 84.5 |
To identify patients for follow up or review (case finding) by non-practice staff | 69 | 51.9 | 4 | 26.7 | 73 | 49.3 |
To inform service planning or development work at CCG/health board/LCG level | 64 | 48.1 | 7 | 46.7 | 71 | 48.0 |
To inform service planning or development by groups of practices (for example, practice clusters or networks) | 40 | 30.1 | 9 | 60.0 | 49 | 33.1 |
In relation to service evaluations | 26 | 19.5 | 5 | 33.3 | 31 | 20.9 |
CCG = clinical commissioning group. LCG = local commissioning group. NI = Northern Ireland.
When asked if the introduction of EARS tools had resulted in the ‘introduction of new services’ or the ‘redesign of existing services’ in their area, 58 of 148 organisations (39.2%) confirmed that they had (data not shown).
Text comments (n = 57) revealed a diverse range of service innovations operating mainly at the multipractice level (for example, locality) following the introduction of EARS tools. These initiatives had introduced or reshaped a range of staff roles (for example, care coordinators and community matrons) and new multidisciplinary services (for example, frailty teams, integrated care teams, hospital at home, and telecare):
‘Development of extended MDT [multidisciplinary team] programme. Helps identify patients for presentation.’
(CCG G, Midlands and East of England)
‘Case coordinators band 5 non-clinical employed by community provider to create list and coordinate MDT.’
(CCG H, South of England)
‘We are … implementing a Frailty Pathway, this has included the commissioning of a Community Geriatrician, GPs with an extended role, re-modelling our Community Teams, and bringing them together to form integrated Neighbourhood Teams.’
(CCG I, Midlands & East of England)
‘Principally risk tools have been well used through the national GP Contract work. However, the local Integrated Care programme has also mandated and incentivised use of the risk tool. Based on the clinical profile of high-risk patients, additional locally enhanced services are also being developed such as in the areas of frailty and palliative care’.
(CCG J, London)
DISCUSSION
Summary
Widespread implementation of EARS tools across the UK was found, with variation in implementation and use at macro (nation), meso (commissioning group), and micro (practice) levels. This variation extended to the choice of EARS tools. In England, Scotland, and Northern Ireland, many areas reported changing or introducing services following the introduction of EARS tools. Human factors were identified as drivers of EARS implementation, with practice managers, clinical leaders, and NHS commissioners in key roles. However, the main barriers to implementing EARS tools in general practice were concerns about workload and information governance. For both implementers and non-implementers, research evidence, or the lack of it, was a relatively low-scoring factor. Only a fifth of sites were contributing to evaluations.
Strengths and limitations
To the authors’ knowledge, this is the first study to examine implementation of EARS across the UK. The main strength of this study lies in its comprehensive coverage: 171 (72.8%) of the 235 NHS commissioners across four nations responded, which is notably higher than previous CCG surveys, where coverage was 27%,29 43%,30 and 52%.31 Nevertheless, non-responders could differ from responders, and the views of frontline staff may differ from those of the commissioning bodies surveyed here.
A further potential limitation is the variation in responder profile, with a mix of clinical and non-clinical staff. However, the authors believe this reflects the profile and involvement of both staff types in the commissioning process.
Comparison with existing literature
Most of the literature on EARS has focused on model development and validation.
The survey indicated that fewer than a fifth of Welsh practices had a tool available, reflecting the Welsh Government’s decision to delay rolling out EARS pending the results of a randomised trial (PRISMATIC) in 32 practices within one health board area.32 The findings justified this cautious approach, as the intervention was shown to be ineffective and resulted in increased costs, emergency admissions, and bed days.13,32 Further studies published after the widespread roll-out of EARS, including a systematic review of case management for at-risk individuals (not necessarily identified through EARS tools) and an observational study of multidisciplinary case management for risk-stratified patients, found increased costs and health service use.15,33 It is not uncommon for health innovations to be introduced at scale, and at great cost, before their effects are understood. Notable examples include the introduction of NHS Direct and the LUCAS mechanical chest compression device.34,35
Implications for research and practice
Widespread implementation of EARS across the UK represents a huge investment of time, energy, and financial resource. At the time of the major nationwide introduction of EARS associated with the GP contracts, there was no research evidence supporting the EARS-led case management approach. Implementation was based on assumptions that it would reduce admissions and improve care, but in the authors’ view this idealised perspective was not a reasonable foundation for the level of investment that followed. Nor was due consideration given to the potential for unintended consequences, including unmet need.36,37

Nonetheless, case management of those at high risk of admission continues as an international healthcare priority. EARS was the most common ‘intervention’ within the 50 NHS integrated care ‘Vanguard’ sites,38 and use is recommended in National Institute for Health and Care Excellence clinical guidance on multimorbidity39 and in major programmes elsewhere, for example, in Australia40 and Catalonia.41 EARS tools have been joined in the GP risk stratification armoury by frailty identification tools. Notable among these is the Electronic Frailty Index,42 which is recommended for use in the NHS England GP contract to identify frail older adults for preventive care and admission avoidance.43 As with EARS tools, frailty tool use has been encouraged in advance of effectiveness studies (for example, PROSPER44).

Like any such innovation, use of EARS and frailty tools will, in the longer term, depend on evidence of benefit.28,45,46 The academic community must therefore encourage evaluation in line with the edict to base innovation on rigorous evidence.1,2,47 Such evidence is not always forthcoming: a large pan-European study of EARS concluded that data on effectiveness were simply not available.17
This survey has confirmed that primary care in the UK responded to policy and contractual initiatives by widespread implementation of EARS tools. This may be deemed a success, arming general practice with infrastructure and data to support improved patient care. However, emerging evidence from studies completed after the widescale implementation suggests that the use of EARS is costly and ineffective, with opportunity costs from forgone alternative expenditure and activity. Going forward, policymakers must consider the current evidence base in decisions on the future use of emergency admission risk stratification and similar case identification tools.
Acknowledgments
The authors thank all participants and Deborah Kwan, Emma Richards, and Kayleigh Nelson for help identifying responders.
Funding
The authors are grateful for funding support from Abertawe Bro Morgannwg University Health Board (ABM UHB; now Swansea Bay University Health Board) and the Wales Centre for Primary and Emergency Care Research (PRIME; http://www.primecentre.wales).
Ethical approval
The authors gained NHS Research & Development permission from ABM UHB (Integrated Research Application System reference: 168535).
Provenance
Freely submitted; externally peer reviewed.
Competing interests
The authors have declared no competing interests.
Discuss this article
Contribute and read comments about this article: bjgp.org/letters
REFERENCES
1. Nuffield Trust. Choosing a predictive risk model: a guide for commissioners in England. 2011. https://www.nuffieldtrust.org.uk/resource/choosing-apredictive-risk-model-a-guide-for-commissioners-in-england (accessed 7 Sep 2020).
2. NHS England. Using case finding and risk stratification: a key service component for personalised care and support planning. 2015. https://www.england.nhs.uk/publication/making-it-happen-case-finding (accessed 7 Sep 2020).
3. National Audit Office. Emergency admissions to hospital: managing the demand. 2013. https://www.nao.org.uk/report/emergency-admissionshospitals-managing-demand (accessed 7 Sep 2020).
4. Steventon A, Deeny S, Friebel R, et al. Emergency hospital admissions in England. Which may be avoidable and how? 2018. https://www.health.org.uk/publications/emergency-hospital-admissions-in-england-which-may-beavoidable-and-how (accessed 7 Sep 2020).
5. Wallace E, Stuart E, Vaughan N, et al. Risk prediction models to predict emergency hospital admission in community-dwelling adults: a systematic review. Med Care 2014. doi: 10.1097/mlr.0000000000000171.
6. Paton F, Wilson P, Wright K. Predictive validity of tools used to assess the risk of unplanned admissions: a rapid review of the evidence. 2014. https://www.york.ac.uk/media/crd/predicting%20unplanned%20admissions.pdf (accessed 7 Sep 2020).
7. Blunt I. Focus on preventable admissions: trends in emergency admissions for ambulatory care sensitive conditions, 2001 to 2013. 2013. https://www.health.org.uk/sites/default/files/QualityWatch_FocusOnPreventableAdmissions.pdf (accessed 7 Sep 2020).
8. Toll DB, Janssen KJM, Vergouwe Y, Moons KGM. Validation, updating and impact of clinical prediction rules: a review. J Clin Epidemiol 2008. doi: 10.1016/j.jclinepi.2008.04.008.
9. Moons KGM, Kengne AP, Grobbee DE, et al. Risk prediction models: II. External validation, model updating, and impact assessment. Heart 2012. doi: 10.1136/heartjnl-2011-301247.
10. Adams ST, Leveson SH. Clinical prediction rules. BMJ 2012. doi: 10.1136/bmj.d8312.
11. O’Caoimh R, Cornally N, Weathers E, et al. Risk prediction in the community: a systematic review of case-finding instruments that predict adverse healthcare outcomes in community-dwelling older adults. Maturitas 2015. doi: 10.1016/j.maturitas.2015.03.009.
12. Kansagara D, Englander H, Salanitro A, et al. Risk prediction models for hospital readmission: a systematic review. JAMA 2011. doi: 10.1001/jama.2011.1515.
13. Snooks H, Bailey-Jones K, Burge-Jones D, et al. Effects and costs of implementing predictive risk stratification in primary care: a randomised stepped wedge trial. BMJ Qual Saf 2018. doi: 10.1136/bmjqs-2018-007976.
14. Lewis G, Georghiou T, Steventon A, et al. Impact of ‘Virtual Wards’ on hospital use: a research study using propensity matched controls and a cost analysis. Southampton: NIHR Service Delivery and Organisation programme, 2013.
15. Stokes J, Kristensen SR, Checkland K, Bower P. Effectiveness of multidisciplinary team case management: difference-in-differences analysis. BMJ Open 2016. doi: 10.1136/bmjopen-2015-010468.
16. Roland M, Lewis R, Steventon A, et al. Case management for at-risk elderly patients in the English integrated care pilots: observational study of staff and patient experience and secondary care utilisation. Int J Integr Care 2012. doi: 10.5334/ijic.850.
17. de Manuel Keenoy E, Nalin M, Alhambra T, et al. White paper on deployment of stratification methods. 2016. http://assehs.eu//upload/docpublicos/17/white-paper-assehs-european-project.pdf (accessed 7 Sep 2020).
18. Freund T, Wensing M, Geissler S, et al. Primary care physicians’ experiences with case finding for practice-based care management. Am J Manag Care 2012; 18(4): e155–e161.
19. Oliver-Baxter J, Bywood P, Erny-Albrecht K. Predictive risk models to identify people with chronic conditions at risk of hospitalisation. Adelaide: Primary Health Care Research & Information Service, 2015.
20. Johnson S. Risk stratification and next steps with DH risk prediction tools: patients at risk of re-hospitalisation and the combined predictive model. Letter from Stephen Johnson, Deputy Director, Head of Long Term Conditions. Department of Health, 2011.
21. Eysenbach G. Improving the quality of web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res 2004. doi: 10.2196/jmir.6.3.e34.
22. von Elm E, Altman DG, Egger M, et al. Strengthening the reporting of observational studies in epidemiology (STROBE) statement: guidelines for reporting observational studies. BMJ 2007. doi: 10.1136/bmj.39335.541782.AD.
23. Dillman DA, Smyth JD, Christian L. Internet, phone, mail, and mixed-mode surveys: the tailored design method. Hoboken, NJ: John Wiley & Sons, 2014.
24. Smith T. Ethics in medical research: a handbook of good practice. Cambridge: Cambridge University Press, 1999.
25. Hutchings HA, Evans BA, Fitzsimmons D, et al. Predictive risk stratification model: a progressive cluster-randomised trial in chronic conditions management (PRISMATIC) research protocol. Trials 2013. doi: 10.1186/1745-6215-14-301.
26. Porter A, Kingston MR, Evans BA, et al. It could be a ‘Golden Goose’: a qualitative study of views in primary care on an emergency admission risk prediction tool prior to implementation. BMC Fam Pract 2016. doi: 10.1186/s12875-015-0398-3.
27. Kingston MR, Evans BA, Nelson K, et al. Costs, effects and implementation of routine data emergency admission risk prediction models in primary care for patients with, or at risk of, chronic conditions: a systematic review protocol. BMJ Open 2016. doi: 10.1136/bmjopen-2015-009653.
28. Greenhalgh T, Robert G, Macfarlane F, et al. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q 2004. doi: 10.1111/j.0887-378X.2004.00325.x.
29. Williams I, Harlock J, Robert G, et al. Decommissioning health care: identifying best practice through primary and secondary research — a prospective mixed-methods study. Health Serv Deliv Res 2017. doi: 10.3310/hsdr05220.
30. Simmons D, Loughlan C. Approaches to diabetes health professional education in England: a report from the Diabetes UK Healthcare Professional Education Task and Finish Group. 2014. https://researchdirect.westernsydney.edu.au/islandora/object/uws:29879 (accessed 7 Sep 2020).
31. Dunne JA, Wormald JCR, Ghedia R, Soldin M. Implementation of national body contouring surgery guidelines following massive weight loss: a national cross-sectional survey of commissioning in England. J Plast Reconstr Aesthet Surg 2017. doi: 10.1016/j.bjps.2016.09.008.
32. Snooks H, Bailey-Jones K, Burge-Jones D, et al. Predictive risk stratification model: a randomised stepped-wedge trial in primary care (PRISMATIC). Southampton: NIHR Journals Library, 2018.
33. Stokes J, Panagioti M, Alam R, et al. Effectiveness of case management for ‘at risk’ patients in primary care: a systematic review and meta-analysis. PLoS One 2015. doi: 10.1371/journal.pone.0132340.
34. Munro J, Nicholl J, O’Cathain A, Knowles E. Impact of NHS Direct on demand for immediate care: observational study. BMJ 2000. doi: 10.1136/bmj.321.7254.150.
35. Gates S, Smith JL, Ong GJ, et al. Effectiveness of the LUCAS device for mechanical chest compression after cardiac arrest: systematic review of experimental, observational and animal studies. Heart 2012. doi: 10.1136/heartjnl-2011-301571.
36. Roland M, Abel G. Reducing emergency admissions: are we on the right track? BMJ 2012. doi: 10.1136/bmj.e6017.
37. Exley J, Abel GA, Fernandez JL, et al. Impact of the Southwark and Lambeth Integrated Care Older People’s Programme on hospital utilisation and costs: controlled time series and cost-consequence analysis. BMJ Open 2019. doi: 10.1136/bmjopen-2018-024220.
38. National Audit Office. Developing new care models through NHS vanguards. 2018. www.nao.org.uk/report/developing-new-care-models-through-nhsvanguards (accessed 7 Sep 2020).
39. Kernick D, Chew-Graham CA, O’Flynn N. Clinical assessment and management of multimorbidity: NICE guideline. Br J Gen Pract 2017. doi: 10.3399/bjgp17X690857.
40. NSW Government, The Agency for Clinical Innovation. Risk Stratification in NSW. 2018. https://www.aci.health.nsw.gov.au/resources/chronic-care/aci/risk-stratification-program (accessed 7 Sep 2020).
41. Duenas-Espin I, Vela E, Pauws S, et al. Proposals for enhanced health risk assessment and stratification in an integrated care scenario. BMJ Open 2016. doi: 10.1136/bmjopen-2015-010301.
42. Clegg A, Bates C, Young J, et al. Development and validation of an electronic frailty index using routine primary care electronic health record data. Age Ageing 2016. doi: 10.1093/ageing/afw039.
43. NHS England. Standard General Medical Services Contract 2017/18. 2018. https://www.england.nhs.uk/publication/nhs-england-standard-generalmedical-services-contract-2017-18 (accessed 7 Sep 2020).
44. Clegg A, Young J, Bower P, et al. Personalised care planning to improve quality of life for older people with frailty (PROSPER). 2018. https://medicinehealth.leeds.ac.uk/dir-record/research-projects/1173/personalised-care-planning-toimprove-quality-of-life-for-older-people-with-frailty-prosper (accessed 7 Sep 2020).
45. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care 1998. doi: 10.1136/qshc.7.3.149.
46. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci 2015. doi: 10.1186/s13012-016-0398-2.
47. Lewis G. Next steps for risk stratification in the NHS. 2015. https://www.england.nhs.uk/wp-content/uploads/2015/01/nxt-steps-risk-strat-glewis.pdf (accessed 7 Sep 2020).