The British Journal of General Practice. 2019 Feb 26;69(680):e154–e163. doi: 10.3399/bjgp19X701513

Incentive schemes to increase dementia diagnoses in primary care in England: a retrospective cohort study of unintended consequences

Dan Liu 1, Emily Green 2, Panagiotis Kasteridis 3, Maria Goddard 4, Rowena Jacobs 5, Raphael Wittenberg 6, Anne Mason 7
PMCID: PMC6400615  PMID: 30803980

Abstract

Background

The UK government introduced two financial incentive schemes for primary care to tackle underdiagnosis in dementia: the 3-year Directed Enhanced Service 18 (DES18) and the 6-month Dementia Identification Scheme (DIS). The schemes appear to have been effective in boosting dementia diagnosis rates, but their unintended effects are unknown.

Aim

To identify and quantify unintended consequences associated with the DES18 and DIS schemes.

Design and setting

A retrospective cohort quantitative study of 7079 English primary care practices.

Method

Potential unintended effects of financial incentive schemes, both positive and negative, were identified from a literature review. A practice-level dataset covering the period 2006/2007 to 2015/2016 was constructed. Difference-in-differences analysis was employed to test the effects of the incentive schemes on quality measures from the Quality and Outcomes Framework (QOF) and on four measures of patient experience from the GP Patient Survey (GPPS): patient-centred care, access to care, continuity of care, and the doctor–patient relationship. The researchers controlled for the effects of the contemporaneous hospital incentive scheme for dementia and for practice characteristics.

Results

National practice participation rates in DES18 and DIS were 98.5% and 76% respectively. Both schemes were associated with a positive impact on QOF quality outcomes, but also with negative impacts on some patient experience indicators.

Conclusion

The primary care incentive schemes for dementia appear to have enhanced QOF performance for the dementia review, and have had beneficial spillover effects on QOF performance in other clinical areas. However, the schemes may have had negative impacts on several aspects of patient experience.

Keywords: dementia, incentives, primary health care, programme evaluation, reimbursement

INTRODUCTION

Dementia is an umbrella term covering a range of progressive neurological conditions. It is a terminal condition that has a devastating effect on individuals and their families, and presents a huge challenge to society.1 In 2015 around 850 000 people were estimated to be living with dementia in the UK and this number is expected to rise to >2 million by 2052.2 However, in 2009 underdiagnosis was ‘the norm’,1 with between one-half and two-thirds of people in the UK with dementia having received no formal diagnosis.1,3 A key aim of the 2009 Dementia Strategy was to encourage earlier diagnosis.1 A raft of measures was introduced, including two voluntary financial incentive schemes in primary care: Directed Enhanced Service 18 (DES18), for ‘facilitating timely diagnosis of and support for dementia’,4–6 and the complementary Dementia Identification Scheme (DIS).7

DES18 ran from April 2013 to March 2016.4–6 It supported a proactive and timely approach for assessing patients considered at risk of developing dementia, and then testing them as appropriate. DES18 also aimed to improve support for individuals who were newly diagnosed with dementia and their carers by referring them to specialist services and offering a care plan or a carer health check.

DIS ran from 1 October 2014 to 31 March 2015 and was designed to support and complement DES18.7,8 The aim was to encourage GP practices to adopt a proactive approach in identifying patients with dementia and, working with their clinical commissioning groups (CCGs), to develop relevant services and care packages. Like DES18, this involved identifying at-risk patients, working with care or nursing homes to find symptomatic patients, and offering them a dementia assessment to improve the recording of dementia on the practice’s dementia register and hence to improve care.

DES18 and DIS appear to have boosted diagnosis rates,9 but the unintended effects (positive or negative) of the schemes are unknown. Incentive schemes can unintentionally impact on other aspects of patient care; for example, by diverting clinical and administrative resources away from core and/or unincentivised services or conditions.10 The aims of this study were to test the effects of these schemes on quality measures from the Quality and Outcomes Framework (QOF) and on patient experience.

METHOD

A literature review was conducted to inform the researchers’ selection of unintended effects (positive or negative) of the two primary care incentive schemes. The search was restricted to UK studies of incentive schemes in primary care that were published between 2006 and 2016. The authors searched MEDLINE, Embase, PsycINFO, CINAHL, and HMIC.

How this fits in

A previous evaluation of two primary care schemes for tackling underdiagnosis in dementia demonstrated that the schemes had been effective in terms of their intended effects, but their unintended consequences are unknown. This study addresses that gap in the evidence base. The researchers show that the schemes are associated with higher-quality care both for dementia and for other long-term conditions, but that some aspects of patient experience may have been adversely affected. Feedback from the GP Patient Surveys could help practices to identify and mitigate adverse effects.

The researchers screened 509 records and identified 22 relevant studies.10–31 None of these articles investigated the unintended consequences of DES18 or DIS. In total, evidence on 12 unintended effects from other incentive schemes was found. Effects on provider behaviours included: gaming (inappropriate exception reporting);10,17,28 reduced clinical autonomy;17,24,26,29 internal motivation;17,26 and provider professionalism.10,24 Effects on practices included greater use of computers and widespread adoption of electronic medical records.21,23,27 For patients, there were effects on health inequalities;10,11,15–20,22,30 loss of patient-centredness;13,14,18,19,21,25,27,30 the doctor–patient relationship;14,21,27 access to care;14 and continuity of care.13,14,18,19,22,28,30 Studies also identified spillover effects on the quality of care falling outside of the schemes.10,12,14,17,21,25,30,31

Data

This retrospective cohort quantitative study was conducted on primary care practices in England. The cohort was a balanced panel, so all practices contributed data in all 10 years of the study from 2006/2007 to 2015/2016. A list of the datasets used to construct the dependent and explanatory variables is provided in Box 1.

Box 1. Datasets used for the analysis.

| Dataset | Reporting level | Year range | Type of variable(s) derived | Details of variable |
| --- | --- | --- | --- | --- |
| QOF | GP practice | 2006/2007 to 2015/2016 | Dependent | Overall QOF achievement on the clinical domain; used to generate variables of the achievement of different QOF clinical indicators |
| | | | Control | Practice list size; percentage of practice patients aged ≥65 years |
| GPPS (unweighted)a | GP practice | 2008/2009 to 2015/2016 | Dependent | Practice-level responses to each question in the survey; used to generate variables to investigate patient-centred care, access to care, continuity of care, and the doctor–patient relationship |
| Dementia assessments data | GP practice | 2013/2014 to 2015/2016 | Policy | Used to identify participation in Directed Enhanced Service 18 (DES18): Facilitating Timely Diagnosis and Support for People with Dementia |
| List of participation for DIS | GP practice | 2013/2014 to 2015/2016 | Policy | Used to identify participation in DIS |
| Dementia Assessment and Referral data collection | GP practice | 2013/2014 to 2015/2016 | Control | Used to construct the ‘hospital effort’ indicator |
| HES | Patient | 2013/2014 to 2015/2016 | Control | Used to construct the ‘hospital effort’ indicator |
| GMSb | GP practice | 2011/2012 to 2015/2016 | Control | Proportion of practice patients in different age and sex bands (≥65 years), used to derive expected dementia registers; GMS contract status |
| ADS | GP practice | 2006/2007 to 2015/2016 | Control | Numbers of practice patients in each LSOA; used to generate practice-level weighted averages of rurality and deprivation |
| ONS: urban | LSOA | 2004 to 2011 | Control | Source of urban classifications; combined with ADS to derive the practice rurality measure. The 2004 data were used for missing values in 2011 |
| ONS: deprivation | LSOA | 2010 to 2015 | Control | Source of IMD classifications; combined with ADS to derive the practice deprivation measure. The 2010 data were used for missing values in 2015 |
| CCG code | GP practice | 2006/2007 to 2015/2016 | Control | Practice CCG code |
a GPPS is a questionnaire that is sent to a sample of each English practice’s registered patients and is designed to collect data on different aspects of patient experience.32

b The method of GMS data collection changed in 2015/2016 and data are missing for around 15% of practices.

ADS = Attribution Dataset. CCG = clinical commissioning group. DIS = Dementia Identification Scheme. GMS = General and Personal Medical Services dataset. GPPS = GP Patient Survey. HES = Hospital Episode Statistics. IMD = Index of Multiple Deprivation. LSOA = lower-layer super output area. ONS = Office for National Statistics. QOF = Quality and Outcomes Framework.

Five patient-focused measures that could be captured from available data were selected from the 12 potential unintended consequences. These domains are detailed in Box 2 along with the measures used to evaluate the effects of DES18 and DIS.

Box 2. Outcomes for the analysis of unintended consequences.

| Domain | Measure |
| --- | --- |
| 1. Schemes’ impacts on quality of care outside DES18 and DIS | Population achievement of all QOF clinical indicators excluding the dementia annual review and diagnosis indicators (a weighted measure of overall achievement of the QOF clinical domain [excluding the dementia review and diagnosis indicators], with the maximum points for each indicator used as weights) |
| | Population achievement of the QOF dementia annual review indicator |
| 2. Patient-centred care | Mean percentage of responders answering ‘good’ or ‘very good’ to each part of the question: ‘Last time you saw or spoke to a GP from your GP surgery: how good was that GP at involving you in decisions about your care? How good was your GP at listening to you? How good was that GP at treating you with care and concern?’ |
| 3. Access to care | Percentage of responders answering ‘good’ or ‘very good’ to: ‘Last time you saw or spoke to a GP from your GP surgery, how good was that GP at giving you enough time?’ |
| 4. Continuity of care | Percentage of responders answering ‘almost always’ or ‘always’ to: ‘How often do you see (or speak to) the doctor you prefer to see?’ |
| 5. Doctor–patient relationship | Percentage of responders answering ‘good’ or ‘very good’ to: ‘Last time you saw or spoke to a GP from your GP surgery, how good were they at explaining tests and treatments?’ |
| | Percentage of responders answering ‘yes, definitely’ to: ‘Did you have confidence and trust in the GP you saw or spoke to?’ |

DES18 = Directed Enhanced Service 18. DIS = Dementia Identification Scheme. QOF = Quality and Outcomes Framework.

Domain 1: The researchers used two measures of practice performance from the QOF data to evaluate the schemes’ impacts on the quality of care outside of the schemes.

The QOF is a voluntary financial incentive scheme designed to improve the quality of primary care.33,34 It incentivises 19 clinical areas as well as public health indicators;33 dementia was added to the QOF in 2006.35 Practices can exclude (‘exception report’) patients from specific indicators; these patients are not counted when calculating achievement for payment purposes.33 In contrast, a ‘population achievement’ measure includes exception-reported patients in the denominator, and the present researchers used this approach in both of the measures (a stylised illustration of the two definitions follows the list below):

  • a QOF composite measure of all clinical indicators excluding the two dementia-specific indicators (annual review and post-diagnostic tests for reversible dementia); and

  • the QOF dementia annual review indicator.33,36
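
As a stylised illustration of the distinction (not a formula taken from the QOF guidance itself), for a single indicator the two achievement definitions can be written as:

$$
\text{Payment achievement} = \frac{\text{patients for whom the indicator was delivered}}{\text{eligible patients} - \text{exception-reported patients}}, \qquad
\text{Population achievement} = \frac{\text{patients for whom the indicator was delivered}}{\text{all eligible patients, including those exception-reported}}.
$$

Because the population achievement denominator retains exception-reported patients, this measure is less sensitive to exception reporting than the payment measure.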

The first measure aimed to investigate the impact of participation in the dementia schemes on the quality of care for long-term conditions other than dementia. In theory, this effect could be negative: for example, diverting a practice’s resources towards dementia assessments could adversely affect the quality of care in other areas. Alternatively, it could be positive, in so far as better organised practices might perform well both on the dementia schemes and on the QOF. The second measure was used to assess the impact of the schemes on the dementia annual review for existing patients. It is plausible that attention could be focused on newly diagnosed patients at the expense of those with an existing diagnosis (a negative effect); alternatively, increased resources for dementia could have beneficial spillover effects on existing patients. The authors did not assess the impact on the QOF indicator incentivising tests for reversible dementia in newly diagnosed patients, as better-quality post-diagnostic care is an intended effect of the schemes.

Domains 2 to 5: patient experience domains included patient-centred care, access to care, continuity of care, and the doctor–patient relationship. The measures in these domains were constructed from the GP Patient Survey (GPPS). The rationale for including these indicators is similar to that of the first quality measure: that the impact of participation in the dementia incentive schemes on other patient experiences of, and access to, primary care could be either negative or positive, depending on how practices managed their resources.

The researchers’ key explanatory variables were practice participation in the incentive schemes. For any particular year in which DES18 was active, a practice was defined as a participant if it provided data on the number of dementia assessments conducted that year. Even practices that recorded zero assessments were counted as DES18 participants, because they had engaged with the incentive scheme by signing up for the scheme, for which they were paid, and by reporting data.

NHS England provided data on practices that participated in DIS, which was based on information collected by Local Area Teams for payment purposes.

As other factors may affect practices’ outcomes, the researchers adjusted for a range of practice characteristics: the proportion of patients aged ≥65 years; the practice list size; the proportion of patients living in the 20% most deprived small areas; the proportion of patients in urban areas; the number of full-time equivalent GPs per 1000 patients (in deciles); and whether practices held a GMS contract. The Office for National Statistics (ONS) provides data on the Index of Multiple Deprivation score and rural–urban classification for small areas known as lower-layer super output areas (LSOAs). These were attributed to practices as weighted averages, using the proportions of registered practice patients in each LSOA as weights. A variable to capture dementia screening activity in local hospitals (a ‘hospital effort’ indicator used in a previous study9) was also included, based on one of the Commissioning for Quality and Innovation Framework (CQUIN) schemes.37–39 The researchers also accounted for regional (CCG-level) characteristics.
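
As an illustration of this attribution (a sketch of the general approach, not the authors’ exact calculation), the practice-level deprivation measure can be written as a patient-weighted average over LSOAs:

$$
\text{Deprivation}_{p} = \sum_{\ell} \frac{n_{p\ell}}{\sum_{\ell'} n_{p\ell'}}\, D_{\ell},
$$

where $n_{p\ell}$ is the number of practice $p$’s registered patients living in LSOA $\ell$ and $D_{\ell}$ is that LSOA’s deprivation value (for example, 100 if the LSOA is among the 20% most deprived and 0 otherwise, which yields the percentage of a practice’s patients living in deprived areas). The rurality measure is constructed analogously from the urban classification.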

Statistical modelling

All the dependent variables were continuous measures ranging from 0 to 100. A difference-in-differences (DID) design was used to model the impact of DES18 and DIS on the unintended consequences. DID is a method that has been used extensively in the policy evaluation literature,40 and the approach was previously used to evaluate the effectiveness of the two schemes in terms of their intended consequences.9 This design is appropriate when information before and after the introduction of the incentive schemes is available for both the treatment group (those who participated in the schemes) and control group (those who never participated). An important assumption is that the treatment and control groups are subject to the same time trends, known as the ‘common trends’ assumption.40
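
For readers less familiar with the approach, the canonical two-group, two-period DID regression (a textbook sketch rather than the study’s full specification) takes the form:

$$
Y_{it} = \alpha + \beta\,\text{Treat}_{i} + \gamma\,\text{Post}_{t} + \delta\,(\text{Treat}_{i} \times \text{Post}_{t}) + \varepsilon_{it},
$$

where $Y_{it}$ is the outcome for practice $i$ in year $t$ and $\delta$ is the DID estimate of the policy effect: under the common trends assumption, it captures the change in outcomes among participating practices over and above the change experienced by non-participants.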

DIS operated during the period when DES18 was active, and practices could participate in one, both, or neither of the schemes. As DES18 ran for 3 years, practices could participate in DES18 in any number of these years. To account for these features in the model, the same eight DES18 groups and two DIS groups defined in the authors’ previous research were used.9
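
As a minimal illustration of how such participation patterns might be derived from yearly participation flags (a hypothetical sketch; the column names and data are invented and this is not the authors’ code):

```python
import pandas as pd

# Hypothetical practice-level flags for each DES18 year (2013/14 to 2015/16).
practices = pd.DataFrame({
    "practice_id": ["A", "B", "C"],
    "des_2013_14": [True, True, False],
    "des_2014_15": [True, True, True],
    "des_2015_16": [True, False, True],
})

year_cols = ["des_2013_14", "des_2014_15", "des_2015_16"]

# Encode each practice's participation history as a pattern such as 'YYY' or 'YYN'.
practices["des18_group"] = practices[year_cols].apply(
    lambda row: "".join("Y" if flag else "N" for flag in row), axis=1
)

print(practices[["practice_id", "des18_group"]])
```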

A mixed-effects linear DID model that allowed for multiple periods and multiple incentive schemes was applied (technical details of the model are available from the authors on request).9,41–43
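
The authors’ exact specification is available from them on request; purely as an illustration of the general approach, a mixed-effects linear DID model with a practice random intercept, year fixed effects, and the two policy indicators might be set up in Python as follows (the file, variable, and column names are hypothetical):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical practice-year panel: one row per practice per year, containing
# the outcome (for example a GPPS measure scored 0-100), the two policy
# indicators, and the practice covariates described above.
df = pd.read_csv("practice_panel.csv")

formula = (
    "outcome ~ des18_policy + dis_policy + C(year) "
    "+ pct_over_65 + list_size + pct_deprived + pct_urban + gms_contract"
)

# The practice-level random intercept absorbs unobserved, time-invariant
# differences between practices; the policy coefficients are the DID effects.
model = smf.mixedlm(formula, data=df, groups=df["practice_id"])
result = model.fit()
print(result.summary())
```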

RESULTS

National practice participation rates in DES18 and DIS were 98.5% and 76% respectively. In total, 7079 practices were included in the study sample. Table 1 shows the number of practice-years within each participation group.

Table 1.

Practice participation in DES18 and DIS from 2006/2007 to 2015/2016

| Scheme and years of participation | Practice-years | % |
| --- | --- | --- |
| DES18 | | |
| Years of participation: three | 56 200 | 79.39 |
|   YYY | 56 200 | 79.39 |
| Years of participation: two | 11 130 | 15.72 |
|   YYN | 1260 | 1.78 |
|   YNY | 1370 | 1.94 |
|   NYY | 8500 | 12.01 |
| Years of participation: one | 2400 | 3.39 |
|   YNN | 440 | 0.62 |
|   NYN | 680 | 0.96 |
|   NNY | 1280 | 1.81 |
| No participation | 1060 | 1.50 |
|   NNN | 1060 | 1.50 |
| Total | 70 790 | 100 |
| DIS | | |
| No | 16 970 | 23.97 |
| Yes | 53 820 | 76.03 |
| Total | 70 790 | 100 |
Table shows the size of the DES18 and DIS groups for the balanced panel. In total, 7079 practices contributed data for each year of the 10-year study period. The researchers identified different types of participant for the 3-year DES18, categorising practices according to the number and order of their participation years: for example, a practice that participated only in the first 2 years of DES18 (but not the third year) was categorised as YYN. DIS was only a 6-month scheme. DES18 = Directed Enhanced Service 18. DIS = Dementia Identification Scheme. N = no. Y = yes.

Sample statistics are presented in Table 2, and Table 3 summarises the DIS and DES18 policy effects on the quality of care and on patient experience. Results of the analysis of effects on the composite measure of care quality for long-term conditions in QOF (excluding dementia) are available from the authors on request. Figure 1 shows the trends of the mean QOF clinical composite measure (excluding dementia indicators). Figures 2 and 3 show the trends of the remaining measures. The formal tests (available from the authors on request) show that the pre-intervention time trends for the control and treatment practices are parallel at the 0.1% significance level.

Table 2.

Descriptive statistics of the estimation sample, N = 7079 practices

| Description | Mean | SD | Minimum | Maximum | n |
| --- | --- | --- | --- | --- | --- |
| Population achievement of dementia review, weighted measure, % | 76.56 | 14.05 | 0 | 100.00 | 70 790 |
| Population achievement excluding dementia indicators, weighted measure, % | 80.76 | 4.56 | 0.05 | 99.79 | 70 790 |
| Patient-centred care from 2008/2009 to 2015/2016, % | 83.04 | 6.84 | 39.11 | 99.15 | 57 896 |
| Access to care from 2008/2009 to 2015/2016, % | 88.20 | 6.28 | 41.46 | 100.00 | 57 896 |
| Continuity of care from 2008/2009 to 2015/2016, % | 48.09 | 17.92 | 0.00 | 100.00 | 57 560 |
| Doctor–patient relationship 1 (explaining tests and treatments) from 2008/2009 to 2015/2016, % | 82.17 | 7.61 | 36.84 | 100.00 | 57 896 |
| Doctor–patient relationship 2 (confidence and trust in the GP) from 2008/2009 to 2015/2016, % | 68.26 | 10.73 | 19.05 | 98.29 | 57 896 |
| Practice patients ≥65 years, % | 16.13 | 5.68 | 0.17 | 47.99 | 70 790 |
| Practice list size, 1000s | 7.30 | 4.21 | 0.63 | 60.38 | 70 790 |
| Practice patients living in 20% most deprived areas, % | 23.01 | 26.14 | 0 | 99.65 | 70 790 |
| Practice patients living in urban areas, % | 82.57 | 32.54 | 0 | 100.00 | 70 790 |
| Full-time equivalent GPsa per 1000 patients, deciles | 0.57 | 0.19 | 0.01 | 8.98 | 70 790 |
| Hospital effortb from 2013/2014 to 2015/2016 | 86.10 | 17.02 | 0 | 100.00 | 21 237 |
| GMS contract | 0.59 | 0.49 | 0 | 1 | 70 790 |
a Excluding retainers/registrars.

b Hospital effort is assumed to be zero in the period 2006/2007 to 2012/2013.

Unless otherwise stated, the variables cover 2006/2007 to 2015/2016. n = number of practice-years.

Table 3.

Estimated effects of the DIS and DES18 policy variables on the outcome measures

Cells in the DES18 policy and DIS policy rows show coefficients (95% CI).

| | Quality of primary care: all QOF indicators excluding dementia (population achievement) | Quality of primary care: QOF dementia annual review indicator (population achievement) | Patient-centred care | Access to care | Continuity of care | Doctor–patient relationship: explaining tests and treatments | Doctor–patient relationship: confidence and trust in the GP |
| --- | --- | --- | --- | --- | --- | --- | --- |
| DES18 policy | 0.743a (0.490 to 0.996) | 1.302a (0.555 to 2.050) | −0.525a (−0.755 to −0.296) | −0.364b (−0.582 to −0.145) | −0.417 (−0.890 to 0.057) | −0.371b (−0.620 to −0.122) | −0.520b (−0.837 to −0.202) |
| DIS policy | 0.429a (0.209 to 0.648) | −0.01 (−0.749 to 0.729) | 0.138 (−0.094 to 0.369) | 0.198 (−0.033 to 0.429) | −0.663b (−1.147 to −0.180) | 0.265c (0.006 to 0.525) | −0.189 (−0.527 to 0.148) |
| Within R2 | 0.116 | 0.025 | 0.056 | 0.086 | 0.431 | 0.371 | 0.100 |
| Between R2 | 0.141 | 0.142 | 0.362 | 0.384 | 0.302 | 0.297 | 0.380 |
| Overall R2 | 0.129 | 0.057 | 0.267 | 0.300 | 0.334 | 0.326 | 0.318 |
| Standard deviation of practice random effect | 2.976 | 5.706 | 4.316 | 3.994 | 12.599 | 4.788 | 7.224 |
| Intraclass correlation | 0.483 | 0.174 | 0.545 | 0.580 | 0.745 | 0.589 | 0.665 |
| Observations, practice-years | 70 790 | 70 790 | 57 896 | 57 896 | 57 560 | 57 896 | 57 896 |
| Practices, n | 7079 | 7079 | 7237 | 7237 | 7195 | 7237 | 7237 |
a 0.1% significance level. b 1% significance level. c 5% significance level. DES18 = Directed Enhanced Service 18. DIS = Dementia Identification Scheme. QOF = Quality and Outcomes Framework.

Figure 1. Trends in the quality of primary care for long-term conditions in QOF (excluding dementia): variation by participation in the schemes. DES18 = Directed Enhanced Service 18. DIS = Dementia Identification Scheme. QOF = Quality and Outcomes Framework. Light grey shading: DES18, April 2013 to March 2016. Dark grey shading: DIS, October 2014 to March 2015.

Figure 2. Trends of outcomes for the DES18 scheme. DES18 = Directed Enhanced Service 18. DIS = Dementia Identification Scheme. QOF = Quality and Outcomes Framework. Light grey shading: DES18, April 2013 to March 2016. Dark grey shading: DIS, October 2014 to March 2015.

Figure 3. Trends of outcomes for the DIS scheme. DES18 = Directed Enhanced Service 18. DIS = Dementia Identification Scheme. QOF = Quality and Outcomes Framework. Light grey shading: DES18, April 2013 to March 2016. Dark grey shading: DIS, October 2014 to March 2015.

Quality of care

Practices that participated in either or both of the schemes to incentivise early diagnosis of dementia had significantly higher overall quality of clinical care than non-participants. Participation in DES18 and DIS increased practice achievement on the composite QOF measure (excluding the dementia indicators) by 0.743 percentage points and 0.429 percentage points, respectively (Table 3).

Participation in DES18 was associated with a statistically significant positive effect on practice performance on the annual dementia review, with participation in DES18 increasing practice achievement by 1.302 percentage points on average. Participation in DIS had no significant effect (Table 3).

Patient experience

The schemes were associated with some negative effects on patient experience. Participation in DES18 decreased GPPS indicators of patient-centred care (−0.525 percentage points, 95% confidence interval [CI] = −0.755 to −0.296), access to care (−0.364 percentage points, 95% CI = −0.582 to −0.145), explaining tests and treatments (−0.371 percentage points, 95% CI = −0.620 to −0.122), and confidence and trust in the GP (−0.520 percentage points, 95% CI = −0.837 to −0.202), but had no statistically significant effect on continuity of care (the ability to see a preferred clinician). Conversely, participation in DIS negatively affected continuity of care (−0.663 percentage points, 95% CI = −1.147 to −0.180), but was linked to improved patient experience on one indicator of the doctor–patient relationship: explaining tests and treatments (0.265 percentage points, 95% CI = 0.006 to 0.525).

DISCUSSION

Summary

The researchers’ analysis of the unintended consequences of the schemes revealed mixed effects. The schemes appear not only to have enhanced QOF performance on the dementia review, but also to have had beneficial spillover effects on QOF performance in other clinical areas. This is possibly due to the use of extra funds attracted through the schemes to improve other areas of care at the practice; alternatively, it could reflect practices’ organisational skills, such as the ability to comply with incentive schemes. Whatever the reason, it is reassuring that there was no adverse effect on either the annual dementia review or the quality of care for patients with other long-term conditions. However, the present study also uncovered some negative consequences. Analysis of the GPPS indicators identified deleterious effects of DES18 on several aspects of patient experience. For DIS, the only significant negative impact found was on continuity of care. A possible causal mechanism for each of these negative effects is that practices diverted effort towards assessments for dementia, reducing the time available for other patients in a variety of ways, as described more fully in the data section.

Strengths and limitations

The major strength of this study was that it addressed a gap in the evidence base on the unintended effects of the incentive schemes for underdiagnosis in dementia. Comprehensive datasets were used covering almost all English general practices over a 10-year period.

There are several reasons why the present findings on the effects on patient experience should be interpreted with caution. First, the GPPS data are derived from a small sample of practice patients and so may not be representative. Second, the impact on different types of patients, such as those with or without dementia, or their carers, is unclear. It is possible that patient experience may have improved for some types of patients. Third, the researchers did not control for other DES schemes, as data on uptake are not available.

There are methodological weaknesses inherent in observational studies. Randomised controlled trials are considered the optimal study design for identifying causal effects, as they control for known and unknown biases.40 However, DID is a good alternative for non-experimental policy changes, such as the schemes evaluated in this study, when there are large numbers of observations (the schemes were national), when participation varies over time (as it did), and when the dataset covers a reasonably long time series. Although an extensive list of covariates was included to control for practice and regional characteristics, as well as hospital effort, there may be other confounders that could bias the present results. In addition, the authors could not test some potential unintended consequences due to lack of data.

Comparison with existing literature

There have been no previous studies on the unintended consequences of DES18 or DIS. Studies investigating the unintended consequences of the QOF or other local incentive schemes have found mixed effects on the quality of care outside of the schemes.10,12,14,31 One study found no significant effect on access to care or on the doctor–patient relationship,14 but two studies showed that continuity of care declined significantly.14,19

Implications for research and practice

The present study indicates that the schemes could have had a small adverse effect on patient experience. Alongside the unintended effects, policymakers should also consider that the schemes had a positive impact on tackling underdiagnosis.9 Depending on the relative values placed on improving the diagnosis of dementia as opposed to the small negative effects on some aspects of patient experience, policymakers may consider this trade-off acceptable. Future evaluations of incentive schemes should include analysis of the unintended as well as the intended effects. Feedback from the GPPS could help practices to identify and mitigate any potential adverse effects of this nature.

One potential area for future research is gaming (inappropriate exception reporting), which was highlighted in the literature review as a potential unintended consequence of QOF. There are no data to test whether practices assessed cases inappropriately in order to gain financially from the schemes, though qualitative work may shed light on this issue. There is also a risk of misdiagnosis, which can have ‘truly tragic consequences’,44 especially if doctors feel pressured into providing an early diagnosis.

There are variations in the availability of post-diagnosis support between CCGs,45 which may be a response to higher diagnosis rates in areas where the incentive scheme had most impact. Policymakers could focus on monitoring future schemes and ensuring practices are supported to deliver sufficient high-quality post-diagnostic support.

Acknowledgments

The authors would like to thank Kath Wright from the Centre for Reviews and Dissemination at the University of York for her support in completing the literature search. They are also grateful for feedback on an earlier draft of this article by attendees at the Health Economics Study Group (winter 2018, City University), and for comments from the project advisory group and from two referees.

Funding

This article is based on independent research commissioned and funded by the National Institute for Health Research (NIHR) Policy Research Programme (Policy Research Unit in the Economics of Health and Social Care Systems: reference 103/0001). The views expressed in this article are those of the authors and not necessarily those of the NHS, the NIHR, the Department of Health and Social Care, arm’s length bodies, or other government departments.

Ethical approval

Ethical approval was not required for this study.

Provenance

Freely submitted; externally peer reviewed.

Competing interests

The authors have declared no competing interests.

Discuss this article

Contribute and read comments about this article: bjgp.org/letters

REFERENCES

