Abstract
Background
The NHS Diabetes Prevention Programme (NHS-DPP) has been delivered by four commercial organizations across England, to prevent people with impaired glucose tolerance developing Type 2 diabetes. Evidence reviews underpinning the NHS-DPP design specification identified 19 Behavior Change Techniques (BCTs) that are the intervention “active ingredients.” It is important to understand the discrepancies between BCTs specified in design and BCTs actually delivered.
Purpose
To compare the BCTs observed in NHS-DPP delivery with (a) the NHS-DPP design specification, and (b) the programme manuals of the four provider organizations.
Methods
Audio-recordings were made of complete delivery of NHS-DPP courses at eight diverse sites (two courses per provider organization). The eight courses consisted of 111 group sessions, with 409 participants and 35 facilitators. BCT Taxonomy v1 was used to reliably code the contents of NHS-DPP design specification documents, programme manuals for each provider organization, and observed NHS-DPP group sessions.
Results
The NHS-DPP design specification indicated 19 BCTs that should be delivered, whereas only seven (37%) were delivered during the programme in all eight courses. By contrast, between 70% and 89% of BCTs specified in programme manuals were delivered. There was substantial under-delivery of BCTs that were designed to improve self-regulation of behavior, for example, those involving problem solving and self-monitoring of behavior.
Conclusions
A lack of fidelity in delivery to the underlying evidence base was apparent, due to poor translation of design specification to programme manuals. By contrast, the fidelity of delivery to the programme manuals was relatively good. Future commissioning should focus on ensuring the evidence base is more accurately translated into the programme manual contents.
Keywords: Type 2 diabetes, Diabetes prevention programme, Nondiabetic hyperglycemia, Intervention fidelity, Behavior change, Behavior change techniques
Key intervention elements to help people set, monitor and plan how to achieve diet and physical activity goals in a national diabetes prevention programme were under-delivered.
Introduction
In response to the increasing incidence of Type 2 diabetes, prevention programmes have been implemented internationally, which target individuals at increased risk of developing Type 2 diabetes due to nondiabetic hyperglycemia. Across multiple countries, trials have found such programmes to be effective at promoting weight loss and thereby reducing the risk of developing Type 2 diabetes [1]. In line with this, the first wave of the National Health Service Diabetes Prevention Programme (NHS-DPP) was implemented across England in 2016, following a pilot phase in several “demonstrator” sites [2]. The first wave of national NHS-DPP rollout was implemented with four different commercial provider organizations delivering their own versions of the NHS-DPP, based on the NHS design specification. This design specification consisted of an NHS-DPP service specification, which described features of prevention programmes to which each provider organization should adhere, for example, at least 13 sessions over a period of at least nine months [3], supplemented by the National Institute for Health and Care Excellence (NICE) public health 38 (PH-38) guidance “Type 2 diabetes: prevention in people at high risk” [4].
A key feature of the NHS-DPP design specification was that it identified, across the NHS-DPP service specification [3] and NICE PH38 [4] documents, 19 Behavior Change Techniques (BCTs) that each provider organization’s version of the NHS-DPP should deliver. BCTs have been defined as the “active ingredients” of interventions to change behavior, for example, setting goals regarding intended levels of physical activity [5]. These have been precisely defined using a standard description of 93 such BCTs [5]. The NHS-DPP design specification was developed following a systematic review of behavior change interventions to prevent diabetes that could be delivered in routine practice [6]. This systematic review did not draw clear conclusions around which BCTs to include, as the direct evidence base for which specific BCTs were associated with greater effectiveness in this specific population was limited [6]. Planned moderator analyses exploring this issue were not informative, because most interventions contained the same BCTs [6]. Given this, the NHS-DPP design specification included the BCTs identified by the NICE PH38 guidance [4] as well as those specifically identified in the NHS service specification document [3]. This NICE guidance was informed by a review of systematic reviews with comparable populations; these systematic reviews identified BCTs that have been found to be particularly effective at changing the key behaviors of physical activity, healthy eating, smoking cessation, alcohol reduction, and sleep [4].
Early evaluation of the programme highlighted concerns, expressed in interviews with stakeholders in the NHS-DPP demonstrator sites, about the extent to which the NHS-DPP provider organizations were in fact delivering the 19 BCTs that the NHS-DPP design specification indicates [7]. These stakeholders, including local commissioners and healthcare professionals who referred to the NHS-DPP, identified that procedures to ensure that BCTs were delivered as planned were unclear and relied on provider organizations verifying what was delivered, with no external validation. Any lack of delivery of the 19 specified BCTs is problematic for three main reasons. First, there is evidence that these BCTs are particularly effective at changing health-related behaviors [4]. Second, if provider organizations are not delivering these BCTs, but are instead delivering other BCTs, it will be difficult to establish the reasons for the effectiveness (or lack of effectiveness) of the NHS-DPP, and thereby improve the programme [8]. Third, given that four provider organizations were delivering the NHS-DPP, it is important to be sure that a similar service is being delivered by each provider organization.
It is generally agreed that it is useful to tailor or adapt programmes for each population that receives an intervention, for example, cultural adaptation of food recommendations [9]. Despite this, there is also general agreement that the mechanisms of action of programmes should be the same following adaptation [10]. In the NHS-DPP, the mechanisms of action are those processes through which the 19 core BCTs are supposed to have their effect [11].
The issue of whether provider organizations are delivering BCTs that they were supposed to deliver is one aspect of intervention fidelity [8]. Although various frameworks for intervention fidelity are available, a leading model is that proposed by the National Institutes of Health Behavior Change Consortium [8]. This model distinguished between five aspects of fidelity: (a) the design of the intervention, that is, what was planned to be delivered, (b) training for those facilitators delivering the intervention, (c) delivery of the intervention, (d) intervention receipt, that is, how the intervention is understood and experienced, and (e) intervention enactment, that is, whether people who receive the intervention carry out the activities that the intervention is designed to stimulate. The issue of intervention fidelity has attracted a good deal of research, but much of it focuses on fidelity of delivery to the detriment of other aspects [12].
Previous fidelity research on the NHS-DPP has thoroughly mapped out intervention design [13]. This research identified the core 19 BCTs in the NHS-DPP design specification, through mapping of BCTs in the NHS-DPP service specification document [3] and the NICE PH38 guidance [4]. It compared these BCTs with the intervention designs for each of the four NHS-DPP provider organizations, consisting of each provider organization’s intervention programme manual and framework response documents [13]. The programme manuals are the documents used by facilitators throughout the programme, and consist of a “cookbook” describing what BCTs should be delivered in the intervention, how they should be delivered, and how frequently (i.e., what “dose” of each BCT). Framework response documents were required by NHS England from each of the four provider organizations during the 2016 commissioning process. A section of these framework response documents detailed their proposed service delivery (including underlying theory and inclusion of BCTs), in line with NHS England requirements. Thus, this previous research compared two aspects of intervention design: that proposed by NHS England (the NHS-DPP design specification) and that proposed by the four commercial provider organizations (their intervention designs). This comparison showed that the four provider organizations each planned to deliver 14 of the 19 BCTs included in the NHS-DPP design specification, although which 14 BCTs varied between provider organizations [13]. Further, the four provider organizations’ programme manuals contained a further nine to 31 BCTs not included in the NHS-DPP design specification [13]. Thus, there was a lack of fidelity to the NHS-DPP design specification in each provider organization’s planned intervention design.
The present research extends this earlier research, aiming to examine what BCTs were actually delivered by NHS-DPP facilitators when delivering the full NHS-DPP courses (see Fig. 1 for a schematic that shows the aspects of fidelity considered in the present study: NHS-DPP design specification, provider intervention design, and intervention delivery).
Fig. 1.
Schematic showing aspects of intervention fidelity assessed in present study.
The present research also considers the dose of the BCTs delivered, in terms of both providers’ intervention designs and actual delivery. Dose can be considered in relation to duration, frequency, and total amount, which is a function of duration and frequency [14]. The present research considers dose in relation to frequency (i.e., the number of times a BCT was delivered), as each provider organization specified a frequency in their programme manuals, thereby allowing a direct comparison with the frequency of delivery. There is currently little guidance on the optimal dose of an intervention [14], so comparing the dose specified by provider organizations in their programme manuals with the dose actually delivered was a fair comparison, as the providers themselves indicated what they considered an appropriate dose when they wrote the programme manuals.
Given that it was not possible to identify the frequency of BCTs in the provider organizations’ framework responses, the present research considered only the programme manuals when examining dose. Similarly, a comparison of fidelity of delivery of dose with the NHS-DPP design specification was not possible, as this specification did not indicate a dose either. The present analyses report fidelity of delivery of BCTs, in relation to both presence/absence and dose, against the provider organizations’ programme manuals for three reasons. First, the provider organizations’ programme manuals were considered the best description of which BCTs were intended to be delivered, as these documents were used directly in intervention delivery. Second, the framework responses contained little information on BCTs beyond that in the programme manuals. Third, reporting fidelity to programme manuals for both presence/absence and dose facilitated comparisons between these two indices of fidelity. However, to ensure that the decision to consider only programme manuals, and not framework responses, as the provider organizations’ intervention designs did not affect conclusions, sensitivity analyses were conducted to examine the effects of these analytic choices on the results obtained.
Thus, specific objectives of the present analyses were:
a. To describe which BCTs were delivered, and with what frequency, for each of the four provider organizations;
b. To compare whether the BCTs that were delivered had fidelity to those BCTs in both the (i) NHS-DPP design specification, and the (ii) programme manuals of each of the four NHS-DPP provider organizations;
c. To assess whether the dose of BCTs being delivered had fidelity to the dose indicated in the programme manuals of each of the four NHS-DPP provider organizations.
Methods
Design
Delivery of the complete NHS-DPP course was observed at two sites per provider organization, to yield complete courses delivered at eight sites in total. Initial assessments where patients were offered participation in the NHS-DPP were not included, as the NHS-DPP service specification indicates that the intervention starts with the first group session [3]. This delivery was compared with two indicators of the design of the NHS-DPP programme, described earlier [13]:
The NHS-DPP design specification, derived by coding the service specification documents produced for the commissioning process [3], including NICE PH38 guidance for prevention of Type 2 diabetes in people at high risk [4].
The planned intervention design for each NHS-DPP provider organization was identified based on coding of each of the providers’ programme manuals [13].
Participants and Setting
Written consent was obtained for everyone present in the NHS-DPP course sessions observed, including the facilitators delivering the programme, patients, and family members/carers accompanying patients.
The NHS commissioned four commercial provider organizations to deliver the DPP, to mitigate the problems of implementing a national programme at scale in a short time frame. Three of the providers were national organizations delivering a range of programmes for health, wellbeing, and employment (i.e., Ingeus, ICS, and Reed Momenta), and one was a nonprofit organization (LWTC). We have not linked findings to named providers in the present report, as a condition of the present research was that we would not link findings to specific providers. For each provider organization, the intervention was delivered in line with NHS England’s stipulation of delivery in groups of no more than 15–20 adults with nondiabetic hyperglycemia, over at least 13 sessions [3]. Further details on participants, provider settings, and facilitators are provided in Table 1.
Table 1.
Characteristics of each site observed: group size, facilitator characteristics, and site location demographic features
Sites A1–A2, B1–B2, C1–C2, and D1–D2 were delivered by Providers A–D, respectively.

| Characteristic | Site A1 | Site A2 | Site B1 | Site B2 | Site C1 | Site C2 | Site D1 | Site D2 |
|---|---|---|---|---|---|---|---|---|
| Total number of unique patients consented | 85 | 95 | 42 | 22 | 51 | 33 | 35 | 27 |
| Mean number of patients/session (range) | 12 (5–19) | 14 (4–21) | 17 (12–23) | 16 (11–23) | 14 (7–30) | 10 (5–19) | 9 (2–16) | 7 (4–13) |
| Number of accompanying family members/carers consented | 2 | 2 | 2 | 3 | 3 | 1 | 4 | 2 |
| Total number of facilitators observed | 5 | 2 | 6 | 6 | 2 | 7 | 4 | 3 |
| Facilitator backgrounds | Public health; nutrition; psychology; nutrition therapist; teacher; personal trainer | Personal training; cardiac rehabilitation | Environmental science; nutritional therapy; sports science; personal training | Nutrition & community health; nutritionist; nutrition; sports nutrition; sports & coaching | Sports health & nutrition; nutrition | Health psychology; teacher; gym instructor; mental health; exercise, nutrition, & health; physical health & exercise | Personal training; health sciences; health trainer; nutrition | Health promotion; health psychology; psychotherapist |
| SES profile (IMD) of venue location^a | 2 | 2, 3* | 2 | 3 | 6 | 1 | 2 | 2 |
| Ethnicity profile of venue location (percentage white)^b | 15% | 75%, 65%* | 45% | 96% | 91% | 54% | 65% | 88% |
| Location | Community center | Hotel; leisure center | GP surgery | Leisure center | Community center | Chapel hall; charity building | Leisure center | Community center |
^a IMD = Index of Multiple Deprivation score associated with the lower super output area derived from venue postcodes, ranging from 1 (the 10% most deprived areas in England) to 10 (the 10% least deprived areas in England). Information obtained from the Department for Communities and Local Government [17].
^b Information on ethnicity for each geographical site was obtained from the Office for National Statistics [18], taken from Census 2011.
*Site A2 has two values for IMD and ethnicity profile because researchers attended two venues for the group observations.
Note: The number of group cohorts observed at each site is as follows: Site A1 = 3 cohorts; Site A2 = 3 cohorts; Site B1 = 2 cohorts; Site B2 = 1 cohort; Site C1 = 2 cohorts; Site C2 = 2 cohorts; Site D1 = 3 cohorts; Site D2 = 2 cohorts.
Procedure
NHS ethical approval for the study described in the manuscript was granted by the North West Greater Manchester East NHS Research Ethics Committee (ref. 17/NW/0426, August 1, 2017).
Sites were purposively sampled from an overall sampling frame of NHS-DPP providers in place during the first wave of national rollout in 2018–2019. The term “site” refers to the geographical location in which observations took place. We aimed to sample sites with variation in geographical location, deprivation, and ethnicity. Observation of more than one group cohort at a site was required in some cases (see Table 1), with the aim of sampling replacement group cohorts from the same provider organization that were as similar as possible to the original group cohorts. Resampling was required due to (a) provider organization delays in scheduling maintenance sessions for some cohorts (k = 4 sites), and (b) some participants attending the first session but not providing written consent, such that the first session had to be observed at different sites delivered by the same provider organization (k = 3 sites). In addition, some cohorts merged due to participant dropout (k = 3 sites). Observations were undertaken between August 2018 and November 2019.
Written consent was obtained on the first day that researchers were present at the group (usually Session 1), prior to the session commencing. Participants provided written consent on first meeting researchers, and this consent carried over to future NHS-DPP sessions in which researchers were present. At the beginning of each subsequent session, researchers checked that everyone present had previously provided consent. If a new patient was attending the group session, full written consent was obtained before recording.
Delivery of BCTs was captured via an audio recorder placed next to the facilitator during both group sessions and sessions that included one-to-one consultations with patients.
Analysis
Audio recordings were transcribed verbatim by an external transcription company. Researchers used both the audio recordings and transcripts to code BCT delivery, supplemented by contemporaneous notes. The use of audio recordings allowed information from tone of voice and speech to be used when coding. The contemporaneous notes documented any observations that may not have been picked up on the audio recorder, for example, handing out worksheets, any issues observed in the session, and so on. The audio recordings and notes sometimes gave additional context to the transcripts.
BCTs were extracted onto a coding sheet developed for the present study. The standardized 93-item BCT Taxonomy (BCTT) v1 was used [5], and researchers undertook the training in its use developed by the taxonomy authors [15]. Two further BCTs were also coded. The BCT “increase positive emotions” is not included in BCTTv1, but was noted by the BCTTv1 authors for inclusion in the next version of the taxonomy. The BCT “increase salience of behaviors” was also not listed in BCTTv1, but was previously identified as being specified by each of the four NHS-DPP provider organizations [13]. Definitions of both BCTs appear in footnotes to Tables 2 and 3.
Table 2.
Behavior change techniques specified in the NHS-DPP programme design specification compared to frequency of behavior change techniques delivered across whole course in each of eight sites
Table 3.
Frequency of behavior change techniques specified in each provider’s programme manuals compared to frequency of behavior change techniques delivered across whole course in each of eight sites
A set of coding rules was developed (see Supplementary S1), based on the coding rules previously used to code the documents describing the NHS-DPP design [13]. A discrete instance of a BCT was considered present on the commencement of a new activity or if a different health behavior (e.g., diet, physical activity) was targeted, to allow coding of the “dose” (the number of times each BCT was delivered) as well as its presence or absence. The programme manuals identified “activities,” which included problem solving discussions, filling out worksheets, providing information, and so on. Researchers coded every individually delivered BCT captured on the audio-recording, but when collating the number of BCTs delivered in a session, a BCT delivered to different attendees during their one-to-one reviews was counted only once.

A single researcher independently coded all 111 sessions, and 16 sessions (14%; two sessions per site) were double coded to allow assessment of the reliability of coding. Any discrepancies in coding were discussed by these researchers until agreement was reached. Researchers documented the reasons why a BCT was or was not coded in the relevant BCT coding sheet. These reasons were in line with BCTTv1 guidance; for example, the BCT “problem solving” was not coded where barriers were discussed but solutions to those barriers were not, and the BCT “social support (unspecified)” was not coded where an activity was not linked to the performance of a behavior. To assess agreement between the whole programme design and delivery on whether a BCT was present or not, percentage agreement and Cohen’s kappa coefficient were used [16]. To assess agreement on the dose of BCTs (i.e., the number of times each BCT occurred) between whole programme design specifications and delivery, Spearman’s rho statistic was used.
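For concreteness, the minimal sketch below (not the study’s analysis code) shows how these three agreement indices can be computed for a single hypothetical site, using invented presence/absence and dose vectors over a fixed list of candidate BCTs; the scikit-learn and SciPy functions stand in for whatever software was actually used.

```python
# Minimal sketch of the three agreement indices described above,
# using invented data for one hypothetical site.
from sklearn.metrics import cohen_kappa_score
from scipy.stats import spearmanr

# Presence (1) / absence (0) of each candidate BCT in the design
# specification and in observed delivery, over the same fixed BCT list.
specified = [1, 1, 1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
delivered = [1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 1]

percent_agreement = sum(s == d for s, d in zip(specified, delivered)) / len(specified)
kappa = cohen_kappa_score(specified, delivered)

# Dose: times each BCT should be delivered per the programme manual
# versus times it was observed being delivered (invented counts).
manual_dose   = [10, 4, 6, 12, 0, 0, 3, 0, 8, 0, 0, 5]
observed_dose = [ 7, 0, 6, 14, 2, 0, 3, 0, 5, 1, 0, 6]
rho, _ = spearmanr(manual_dose, observed_dose)

print(f"Percentage agreement: {percent_agreement:.0%}")
print(f"Cohen's kappa: {kappa:.2f}")
print(f"Spearman's rho (dose): {rho:.2f}")
```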
Two sensitivity analyses were conducted, to assess whether the decisions regarding how to operationalize provider intervention designs affected the results obtained. First, the specification of what should be delivered was broadened to also include BCTs that were included in the provider organizations’ framework response documents, but not necessarily indicated as compulsory in providers’ programme manuals. In the second sensitivity analysis, the specification of what should be delivered was broadened to also include BCTs that were indicated as “optional” in programme manuals.
Results
Complete delivery of the NHS-DPP was observed at two sites for each of the four provider organizations. Across the eight sites, 18 cohorts were recruited (see Table 1), because some cohorts merged following dropout, provider organizations delayed scheduling maintenance sessions for some cohorts, and it was not possible to obtain consent from all people present. In these eight sites, 111 group sessions were observed, with a total of 35 facilitators, 390 NHS-DPP participants, and 19 accompanying carers or relatives (see Table 1). The sites were generally in locations that were relatively deprived for England, with seven of eight sites having Index of Multiple Deprivation (IMD) scores of three or below [17], and with substantial variation in the proportion of white residents in each site’s locality based on census data (from 15% to 96% white) [18].
Which BCTs were Delivered?
The frequency with which each BCT was delivered was reliably coded, with Cohen’s kappa statistics ranging from 0.56 to 0.95 across the 16 double-coded sessions, with a mean of 0.77. Of these 16 kappa statistics, two were at the top end of the “moderate agreement” category (0.41–0.60), seven indicated “substantial agreement” (0.61–0.80), and seven indicated “almost perfect agreement” (0.81–1.00), according to conventional criteria [16]. These Cohen’s kappa statistics were comparable to those produced for coding of each of the four design specification documents, which ranged from 0.75 to 0.88 [13].
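As a quick reference, the hypothetical helper below encodes the conventional Landis and Koch [16] descriptive bands used throughout these Results; the band boundaries are the standard published ones, but the function itself is purely illustrative.

```python
def kappa_band(kappa: float) -> str:
    """Map a kappa value to the Landis and Koch [16] descriptive bands."""
    if kappa < 0.00:
        return "poor"
    if kappa <= 0.20:
        return "slight"
    if kappa <= 0.40:
        return "fair"
    if kappa <= 0.60:
        return "moderate"
    if kappa <= 0.80:
        return "substantial"
    return "almost perfect"

print(kappa_band(0.77))  # mean reliability kappa reported above -> "substantial"
```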
The number of times each BCT was delivered at each site over the whole programme is shown in Table 2. Definitions of all BCTs are provided in Supplementary S2, along with example codes. Across the sites, the most commonly delivered BCTs were “information about health consequences” (delivered 554 times), “social support [unspecified]” (128 times), “behavior substitution” (118 times), “feedback on outcome(s) of behavior” (104 times), and “self-monitoring of behavior” (86 times).
Were the BCTs that were Delivered Indicated in the NHS-DPP Design Specification?
The 19 BCTs indicated in the NHS-DPP design specification were generally among the most commonly delivered BCTs (see Table 2). For the eight sites observed, between 9 and 13 of the 19 specified BCTs were delivered across the whole programme (shown in Table 2). Of these 19 BCTs, only seven were delivered at all eight sites during the whole delivery of the NHS-DPP. Three of the BCTs were not delivered at any site at any point in the programme: “social support (emotional),” “pros and cons,” and “monitoring of outcome(s) of behavior without feedback.”
In addition to the 19 BCTs indicated in the NHS-DPP design specification, a further 33 BCTs were delivered at one or more sites. Of these 33 BCTs, seven were delivered at all sites, including “feedback on outcome(s) of behavior,” which was the fourth most frequently delivered BCT. Each site delivered between 15 and 25 additional BCTs that were not included in the NHS-DPP design specification. Given the low rates of delivery of specified BCTs and high rates of delivery of nonspecified BCTs, the agreement statistics between BCTs included in the NHS design specification and what was delivered ranged from κ = 0.22 to κ = 0.37 across the eight sites. This extent of agreement is considered “fair” according to conventional criteria [16].
Were the BCTs that were Delivered Indicated in the Four Provider Programme Manuals?
The four provider organizations included between 20 and 43 mandatory BCTs in their programme manuals [13]. At each of the eight sites, the majority of these BCTs were delivered during the programme (see Table 3). The proportion delivered ranged from 14/20 (70%) [Site D1] to 24/27 (89%) [Site A1]. Beyond the BCTs indicated in the programme manuals, each site delivered some additional BCTs, ranging from two (Sites C1 and C2) to 11 (Site D1).
The agreement statistics between BCTs included in programme manuals and what was delivered ranged from κ = 0.50 (Site D1) to κ = 0.78 (Site C1) across the eight sites. This extent of agreement is considered “moderate” for the two provider D sites [16], where the number of BCTs not included in programme manuals (9 and 11 for Sites D2 and D1 respectively) approached the number that were included (15 and 14 respectively). Agreement was “substantial” for the six other sites [16].
Sensitivity analyses showed that the agreement statistics between BCTs in the provider intervention designs and actual delivery were robust to how the provider intervention designs were operationalized. First, where the specification of what should be delivered was broadened to also include BCTs that were included in the provider organizations’ framework response documents as well as programme manuals (see Supplementary S3), the agreement statistics between design and delivery were slightly lower at all eight sites than in the main analysis, and ranged from κ = 0.45 (Site D2) to κ = 0.76 (Site C1). Second, when the specification of what should be delivered was broadened to also include BCTs that were indicated as “optional” in programme manuals as well as mandatory BCTs (see Supplementary S4), the agreement statistics between design and delivery were very similar to those in the main analysis, and ranged from κ = 0.50 (Site D1) to κ = 0.79 (Site C1).
What is the Dose of BCTs being Delivered, and is this In Line with the Dose Indicated in the Four NHS-DPP Provider Programme Manuals?
The programme manuals for the four provider organizations indicated that between 126 (Provider D) and 395 (Provider C) total instances of BCTs should be delivered (see Table 3). There was generally strong agreement between the number of times each BCT was delivered and the number of times that programme manuals indicated that BCT should be delivered (see Table 3). Spearman’s rho for the eight sites ranged from rs = 0.46 (Site D1) to rs = 0.83 (Site A1).
The most under-delivered BCT was “problem solving,” delivered 128 fewer times than indicated in the programme manuals, followed by “self-monitoring of behavior” (under-delivered 58 times) and “review outcome goal(s)” (under-delivered 55 times). By contrast, “feedback on outcome(s) of behavior” was delivered 58 more times than the programme manuals indicated, followed by “behavior substitution,” which was delivered 53 more times than indicated.
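To show how such discrepancy figures can be derived, the sketch below ranks BCTs by the difference between observed and manual-specified frequencies. The absolute counts are invented (Table 3 holds the real figures), but the differences are chosen to match the discrepancies reported above.

```python
# Hypothetical per-BCT dose discrepancies: negative values indicate
# under-delivery relative to the programme manual, positive over-delivery.
manual_dose = {"problem solving": 140, "self-monitoring of behavior": 90,
               "feedback on outcome(s) of behavior": 20, "behavior substitution": 65}
observed_dose = {"problem solving": 12, "self-monitoring of behavior": 32,
                 "feedback on outcome(s) of behavior": 78, "behavior substitution": 118}

discrepancies = {bct: observed_dose[bct] - manual_dose[bct] for bct in manual_dose}

# Most under-delivered first, most over-delivered last.
for bct, diff in sorted(discrepancies.items(), key=lambda item: item[1]):
    print(f"{bct}: {diff:+d}")
```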
Discussion
There was a substantial gap between the BCTs indicated in the NHS-DPP design specification and those actually delivered in the whole NHS-DPP courses at the eight sites observed. Only seven of the 19 specified BCTs were delivered at all sites. Across the eight sites, between 15 and 25 BCTs were delivered that were not included in the NHS-DPP design specification. It is notable that the four providers generally delivered those BCTs specified in their programme manuals, with between 70% and 89% of the planned BCTs being delivered during the courses across the eight sites observed. The dose of BCTs delivered was also generally in line with the dose specified in programme manuals. There was extensive delivery of some BCTs, notably providing information about health consequences, but under-delivery of others compared with what was specified in provider programme manuals, notably those involving problem solving, self-monitoring of behavior, and reviewing outcome (typically weight) goals.
A key strength of the present research is that it examined the fidelity of a programme where the research team were independent of the teams that developed the intervention. Examination of intervention fidelity by people who were not involved in its development is rare [19]. Other strengths of the present study include the elicitation and coding of all key documents for NHS-DPP design specification and provider intervention design, including commercially sensitive programme manuals. Further, the present study observed complete delivery of the NHS-DPP at eight sites, involving 35 facilitators being observed at 111 sessions. Design specifications of NHS-DPP and providers, and observations of delivery were reliably coded using standardized methods.
Clearly, eight sites cannot be truly representative of an entire programme, but the present approach aimed to capture in depth what was delivered in complete courses, within the constraints of time and resources. Further, two sites were observed for each provider organization, which indicated consistency within providers. A limitation that applies to nearly all research observing the delivery of interventions is that the facilitators were aware that they were being observed. However, the evidence on the effects of reactivity to being measured or observed suggests that such effects are typically fairly transitory [20], and if present, they would lead the present research to overestimate fidelity of delivery. A final limitation is that the present analysis focused only on the design and delivery of the NHS-DPP with regard to BCTs. We did not consider other aspects of intervention design and delivery that contribute to intervention success, such as therapist warmth and therapeutic alliance [21].
It has previously been shown that each provider planned to deliver 74% of the unique BCTs in the NHS-DPP specification, as well as a large number of BCTs that were not specified [13]. The present research builds on this earlier finding to show a substantial gap between the NHS-DPP design specification and what was delivered, for all four providers, highlighting a lack of fidelity to the original evidence base for the NHS-DPP programme [3, 4, 6]. Importantly, the gap between what providers planned to deliver (as indicated by their programme manuals) and what they actually delivered was comparatively small.
To our knowledge, this is the first thorough examination of fidelity of delivery to design specification for any diabetes prevention programme in the world. Empirical examination of fidelity of delivery is generally very rare for any intervention not developed by a study team [19], as with the NHS-DPP, and particularly where there were multiple providers, as in the present case. The most comparable examination is that of the national Stop Smoking Services in England, where 63% of planned BCTs based on programme manuals were delivered in face-to-face sessions [22] and 42% in telephone sessions [23]. The NHS-DPP compares favorably, with 82%–91% delivery of planned BCTs across the eight sites (shown in Table 3). Such high levels of fidelity of delivery of BCTs to programme manuals are rarely found in interventions not delivered by a research team [24]. The NHS-DPP also compares favorably in terms of not delivering BCTs that were absent from programme manuals, with the range of 4%–15% across the eight sites (shown in Table 3) being substantially lower than the 65% of unplanned BCTs delivered in face-to-face sessions and 23% in telephone sessions found in the Stop Smoking studies [22, 23].
Although many other studies have examined fidelity of delivery of behavior change interventions, often in relation to BCTs, the quality of fidelity assessment in such studies is typically lower than in the fidelity studies just mentioned, where the research team did not deliver the intervention and the intervention was delivered at scale. For example, systematic reviews of behavior change interventions have found that objective assessment of delivery was rare, and methods were often of uncertain reliability [25, 26]. For this reason, calls for greater validity in fidelity assessment have repeatedly been made [12, 26]. Given this, it may not be useful to compare the findings of those studies with the present study.
There was also notable under-delivery of other BCTs, particularly those involving problem solving, self-monitoring of behavior, and reviewing outcome (typically weight) goals. These BCTs aim to improve participants’ capability to self-regulate their behavior. Importantly, although outcome (weight) measurements frequently happened in the intervention sessions and were fed back to participants, they were not often reviewed with participants so that they could modify their goals in light of failure or success. Similar under-delivery of self-regulation techniques such as goal setting and action planning has been noted in other studies [27]. Other BCTs that were under-delivered by all providers were those that involved monitoring behavior (e.g., via diaries or pedometers) without feedback, prompting consideration of pros and cons, and providing emotional social support. The under-delivery of all these BCTs presents the greatest opportunity for improvement, given that the evidence review underpinning the NHS-DPP flagged such BCTs as important [4], in line with evidence from the wider literature on the effectiveness of self-regulation BCTs [28].
There was evidence of over-delivery of BCTs that are easier to deliver, notably providing information about health consequences. It was particularly notable that between 15 and 25 BCTs not included in the NHS-DPP design specification were delivered at the eight sites, with a total of 33 distinct BCTs being added to the 19 that the NHS-DPP design specified. The delivery of these additional BCTs has two implications. First, there is evidence from some reviews that interventions that include more BCTs produce larger changes in behavior than interventions with fewer BCTs [29], so the inclusion of these additional BCTs may be useful. However, the evidence base is stronger for the BCTs included in the NHS-DPP design specification, so the beneficial effects of these additional BCTs will probably be outweighed by the harmful effects of omitting BCTs in the NHS-DPP specification. Second, the inclusion of these additional BCTs resulted in substantial variation in what was delivered by the four provider organizations. Thus, although the NHS-DPP is a nationally implemented programme, there appear to be differences in which BCTs patients receive according to the provider organization to which they are allocated. It is currently unclear whether there are differences in effectiveness between provider organizations [30].
The authors of the present work are conducting several other streams of work on fidelity of the NHS-DPP, which are intended to shed light on the findings reported here. First, an analysis of mode of delivery, including number of sessions, length of sessions, and provider characteristics, has been conducted [31], indicating good fidelity of NHS-DPP delivery to these organizational and structural aspects. Other work has observed the training of intervention facilitators and found that facilitators were not trained in all BCTs noted in provider intervention design specifications [32]. This may provide at least a partial explanation for the discrepancies between design and delivery noted here. Ongoing qualitative work is examining how the intervention is received, an under-researched aspect of fidelity [12]. This examination of receipt may shed light on how the additional BCTs that providers included were received, as well as on issues such as the relative importance of BCT content compared with other aspects of the intervention, such as structural features or facilitator characteristics.
Future research could usefully examine the relative impacts of the four programmes on participant experience and behavior change. The substantial variation between provider organizations offers a useful natural experiment at scale from which researchers could profit. The best available evidence, based on analyses by the NHS-DPP team [30], suggests that the NHS-DPP is having an effect on weight loss (2.3 kg) and HbA1c (1.26 mmol/mol) in intention-to-treat analysis. The present analysis suggests there is room for the NHS-DPP to improve these outcomes in future iterations, given the under-delivery of self-regulatory BCTs that the wider literature on behavior change suggests are key to effective behavior change. Consideration of relative effectiveness on key outcomes between provider organizations would provide a more compelling evidence base for precisely which BCTs are most useful in changing these outcomes, particularly with regard to those BCTs that are not specified in the evidence base and which vary between providers. A final area where more research would be useful is interviewing facilitators or other provider organization staff to gain clearer insight into why BCT delivery was not more in line with provider intervention designs or the NHS-DPP design specification.
The key implication of these findings is that commissioners of national interventions need to put greater effort into ensuring provider organizations use the BCTs for which there is the strongest evidence. The present research has shown that delivery of BCT content that is contained within programme manuals is very good. However, there appears to be a gap between what the evidence base suggests and what is delivered, largely because of failures to translate the evidence base into manualized form by all four provider organizations [13]. Thus, focusing efforts on ensuring the evidence base translates into the contents of programme manuals in future rounds of commissioning appears warranted. It would be useful to examine how NHS-DPP commissioning and monitoring arrangements can best improve fidelity of BCT delivery in future commissioning rounds, as this would likely have much wider applicability.
Supplementary Material
Acknowledgments
This work is independent research funded by the National Institute for Health Research (Health Services and Delivery Research, 16/48/07 – Evaluating the NHS Diabetes Prevention Programme (NHS DPP): the DIPLOMA research programme (Diabetes Prevention – Long Term Multimethod Assessment)). The views and opinions expressed in this manuscript are those of the authors and do not necessarily reflect those of the National Institute for Health Research or the Department of Health and Social Care.
We would like to thank the NHS-DPP Programme team at NHS-England for facilitating this research at all stages, as well as the provider organizations for providing all relevant documentation and assisting in the organization of observations at each site. We are grateful to all the facilitators and attendees who consented to observations of NHS-DPP sessions. With thanks to researchers Hannah Long and Kelly Howells who both attended the NHS-DPP sessions for data collection when researchers EC or REH were unable to attend. We would also like to thank the following researchers in the DIPLOMA team who provided valuable feedback during the manuscript preparation: Simon Heller, Emma McManus, Lisa Miles, Sarah Cotterill, Claudia Soiland-Reyes, William Whitaker and Paul Wilson.
References
- 1. Sood HS, Maruthappu M, Valabhji J. The National Diabetes Prevention Programme: a pathway for prevention and wellbeing. Br J Gen Pract. 2015;65:336–337.
- 2. NHS England. NHS Diabetes Prevention Programme (NHS DPP). 2017. Available at https://www.england.nhs.uk/diabetes/diabetes-prevention/2017/. Accessibility verified July 20, 2018.
- 3. NHS England. Service specification no. 1: provision of behavioural interventions for people with non-diabetic hyperglycaemia [Version 01]. March 2016. Available at https://www.england.nhs.uk/wp-content/uploads/2016/08/dpp-service-spec-aug16.pdf. Accessibility verified March 21, 2020.
- 4. National Institute for Health and Care Excellence (NICE). PH38 Type 2 Diabetes: Prevention in People at High Risk. London: National Institute for Health and Care Excellence; 2012 (updated September 2017). Available at https://www.nice.org.uk/guidance/ph38/resources/type-2-diabetes-prevention-in-people-at-high-risk-pdf-1996304192197. Accessibility verified July 22, 2019.
- 5. Michie S, Richardson M, Johnston M, et al. The behavior change technique taxonomy (v1) of 93 hierarchically clustered techniques: building an international consensus for the reporting of behavior change interventions. Ann Behav Med. 2013;46:81–95.
- 6. Public Health England. A systematic review and meta-analysis assessing the effectiveness of pragmatic lifestyle interventions for the prevention of type 2 diabetes mellitus in routine practice. 2015. Available at https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/733053/PHE_Evidence_Review_of_diabetes_prevention_programmes-_FINAL.pdf. Accessibility verified March 21, 2020.
- 7. Penn L, Rodrigues A, Haste A, et al. NHS Diabetes Prevention Programme in England: formative evaluation of the programme in early phase implementation. BMJ Open. 2018;8:e019467.
- 8. Bellg AJ, Borrelli B, Resnick B, et al.; Treatment Fidelity Workgroup of the NIH Behavior Change Consortium. Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychol. 2004;23:443–451.
- 9. Evans RE, Craig P, Hoddinott P, et al. When and how do ‘effective’ interventions need to be adapted and/or re-evaluated in new contexts? The need for guidance. J Epidemiol Community Health. 2019;73:481–482.
- 10. Hawe P, Shiell A, Riley T. Complex interventions: how “out of control” can a randomised controlled trial be? BMJ. 2004;328:1561–1563.
- 11. Carey RN, Connell LE, Johnston M, et al. Behavior change techniques and their mechanisms of action: a synthesis of links described in published intervention literature. Ann Behav Med. 2019;53:693–707.
- 12. Toomey E, Hardeman W, Hankonen N, et al. Focusing on fidelity: narrative review and recommendations for improving intervention fidelity within trials of health behaviour change interventions. Health Psychol Behav Med. 2020;8:132–151.
- 13. Hawkes RE, Cameron E, Bower P, French DP. Does the design of the NHS Diabetes Prevention Programme intervention have fidelity to the programme specification? A document analysis. Diabet Med. 2020;37:1357–1366.
- 14. Voils CI, King HA, Maciejewski ML, Allen KD, Yancy WS Jr, Shaffer JA. Approaches for informing optimal dose of behavioral interventions. Ann Behav Med. 2014;48:392–401.
- 15. BCTTv1 online training. Available at https://www.bct-taxonomy.com. Accessibility verified June 26, 2020.
- 16. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–174.
- 17. Department for Communities and Local Government. The English indices of deprivation. 2015. Available at https://www.gov.uk/government/statistics/english-indices-of-deprivation-2015. Accessibility verified December 23, 2019.
- 18. Office for National Statistics. Census 2011. Available at https://www.nomisweb.co.uk/census/2011/ks201ew. Accessibility verified December 23, 2019.
- 19. Borrelli B. The assessment, monitoring, and enhancement of treatment fidelity in public health clinical trials. J Public Health Dent. 2011;71(suppl 1):S52–S63.
- 20. French DP, Miles LM, Elbourne D, et al. Reducing bias in trials from reactions to measurement: the MERIT study including developmental work and expert workshop. Health Technol Assess. 2020; forthcoming.
- 21. Godfrey E, Chalder T, Risdale L, Seed P, Ogden J. Investigating the “active ingredients” of cognitive behaviour therapy and counselling for patients with chronic fatigue in primary care: developing a new process measure to assess treatment fidelity and predict outcome. Br J Clin Psychol. 2007;46:253–272.
- 22. Lorencatto F, West R, Christopherson C, Michie S. Assessing fidelity of delivery of smoking cessation behavioural support in practice. Implement Sci. 2013;8:40.
- 23. Lorencatto F, West R, Bruguera C, Michie S. A method for assessing fidelity of delivery of telephone behavioral support for smoking cessation. J Consult Clin Psychol. 2014;82:482–491.
- 24. Williams SL, McSharry J, Taylor C, Dale J, Michie S, French DP. Translating a walking intervention for health professional delivery within primary care: a mixed-methods treatment fidelity assessment. Br J Health Psychol. 2020;25:17–38.
- 25. Lambert JD, Greaves CJ, Farrand P, Cross R, Haase AM, Taylor AH. Assessment of fidelity in individual level behaviour change interventions promoting physical activity among adults: a systematic review. BMC Public Health. 2017;17:765.
- 26. Walton H, Spector A, Tombor I, Michie S. Measures of fidelity of delivery of, and engagement with, complex, face-to-face health behaviour change interventions: a systematic review of measure quality. Br J Health Psychol. 2017;22:872–903.
- 27. Fredrix M, Byrne M, Dinneen S, McSharry J. “It’s an important part, but I am not quite sure that it is working”: educators’ perspectives on the implementation of goal-setting within the “DAFNE” diabetes structured education programme. Diabet Med. 2019;36:80–87.
- 28. Michie S, Abraham C, Whittington C, McAteer J, Gupta S. Effective techniques in healthy eating and physical activity interventions: a meta-regression. Health Psychol. 2009;28:690–701.
- 29. Olander EK, Fletcher H, Williams S, Atkinson L, Turner A, French DP. What are the most effective techniques in changing obese individuals’ physical activity self-efficacy and behaviour: a systematic review and meta-analysis. Int J Behav Nutr Phys Act. 2013;10:29.
- 30. Valabhji J, Barron E, Bradley D, et al. Early outcomes from the English National Health Service Diabetes Prevention Programme. Diabetes Care. 2020;43:152–160.
- 31. Hawkes RE, Cameron E, Cotterill S, Bower P, French DP. The NHS Diabetes Prevention Programme: an observational study of service delivery and patient experience. BMC Health Serv Res. 2020;20:1098.
- 32. Hawkes RE, Cameron E, Miles LM, French DP. Fidelity of training to intervention design in a National Diabetes Prevention Programme; under review.