Abstract
Objectives
To describe the quality of reporting and investigation into surgical Never Events in public reports.
Design
Semi-quantitative and qualitative review of published Quality Accounts for three years (2011/12–2013/14). Data on Never Events were compared with previously collated Never Events rates. Quality of reported investigations was assessed using the London Protocol.
Setting
English National Health Service.
Participants
All English acute hospital trusts.
Main outcome measures
Quality of Never Event reporting.
Results
Quality Accounts were available for all Trusts for all three years, of which 342 referred to years when a surgical Never Event had occurred. A total of 125 of 342 (37%) accounts failed to report any or all Never Events that had occurred; 13/342 (4%) provided full disclosure; 197 (58%) reported that some investigation had taken place. Of these 197, 61 (31%) were limited in scope; 61 (31%) were categorised as detailed reports. Task and Technology factors were the commonest factors identified in investigations (103/211 (49%)), followed by Individual factors (48/211 (23%)). Team and Work environment factors were identified in 29/211 (14%) and 23/211 (11%), respectively. Organisational and Management factors (5/211 (2%)) were rarely identified, and the Institutional context was never discussed.
Conclusions
Reporting of Never Events and their investigations by English NHS Trusts in their Quality Accounts is neither consistently transparent nor adequate. As with clinical error, the true root causes are likely to be organisational rather than individual.
Keywords: Patient safety, Never Events, transparency, candour, error, surgery, Quality Accounts
Background
Never Events are defined adverse events within healthcare that are supposed to be avoidable with current processes. Within the English National Health Service (NHS), they remain relatively unusual, predominantly low-harm events; surgically related Never Events occur in around 1 in 17,000 operations, with severe harm occurring in around 1 in 250,000.1 Notwithstanding the question of unrecognised or unreported incidents, every English hospital is required to report all of its Never Events to central reporting bodies.2
There has been a drive towards increasing transparency within healthcare at personal, provider and structural levels. This is partly in recognition of the fundamental importance of trust between the ‘user’ (patient or tax-payer) and the ‘provider’ (healthcare professional or organisation) and also of the role of transparency in helping organisations and individuals learn and avoid the same mistakes.3–5 Ratings for ‘open and honest reporting’ are now publicly available for most English NHS Trusts.6
One statutory requirement in the NHS is the annual publication of a ‘Quality Account’ by every NHS Trust. These Quality Accounts include
A statement from the organisation detailing the quality of the services they provide. Clinical teams, managers, patients and patient groups may all have a role in choosing what to write about in this section, depending on what is important to the organisation and to the local community.7
These Quality Accounts are the major, though not sole, source of public information about the organisation. If Trusts are committed to openness about safety, then the public might reasonably expect Never Events to be consistently reported and the lessons learned from investigations shared. Similarly, if Trusts are committed to learning and sharing learning, some detail of investigations, findings and actions should be available.
Given that data on Never Events are collected, and Quality Accounts published, by every Trust, we wished to explore the transparency and openness of reporting of Never Events. In addition, we wished to explore the extent to which investigation and learning was shared with the public and other healthcare organisations.
Methods
We undertook a semi-quantitative and qualitative study comparing the information provided about surgical Never Events in Trust Quality Accounts over a three-year period (2011–2014) with details of the surgical Never Events that were known to have occurred.
Data sources
Quality Accounts
These were obtained for every extant English NHS acute Trust for each of the three years from NHS Choices website,7 Trust websites and (in a small number of cases) by email request.
Surgical Never Events
As part of a previous investigation, every extant English NHS acute Trust was requested to provide details of surgical Never Events for the years 2011/12 and 2012/13,1 under the stipulations of the Freedom of Information Act. Data for 2013/14 were extracted from the NHS England published data.8 Surgical Never Events were defined as one of: retained foreign object following an invasive procedure; wrong site surgery; and wrong implant.
Quantitative data analysis
Following initial testing on a small sample of Quality Accounts, two rating systems were used to categorise the Quality Account information. The first related to reporting of the Never Event (Table 1), the second to reporting of the investigation and learning from the event (Table 2).
Table 1.
Scoring system for reporting of details of Never Events within Quality Accounts.
| Category description | Category |
|---|---|
| No Never Event mentioned in Quality Account (and no Never Events) | 1 |
| Incomplete disclosure: Never Events occurred but the Quality Account did not mention any/all of them | 2 |
| Explicitly and correctly identified no Never Events have occurred | 3 |
| Number of Never Events only | 4 |
| Brief description (2 or 3 out of the following criteria) 1. Number 2. Correct organ or foreign object or implant identified 3. States any (or none) corrective procedure or surgery required (or not) by the patient 4. States the outcome of the patient, i.e. none, low, moderate, severe harm or death; permanent or temporary | 5 |
| Full/Complete details (all of the following criteria) 1. Number 2. Correct organ or foreign object or implant identified 3. States any (or none) corrective procedure or surgery required (or not) by the patient 4. States the outcome of the patient, i.e. none, low, moderate, severe harm or death; permanent or temporary | 6 |
Table 2.
Scoring system for reporting of investigations into Never Events within Quality Accounts.
| Category description | Category |
|---|---|
| No Never Event investigation mentioned (but no Never Event occurred; or Never Event not reported in Quality Account) | 1 |
| Have had Never Events but no mention of investigations | 2 |
| Statement that investigation has taken place only (Including if investigation is limited to review of policies/procedures or equivalent) | 3 |
| ‘Ongoing’ investigation | 4 |
| Brief report 1. States actions taken after the investigations without full explanation 2. Single cause identified | 5 |
| Full/complete details 1. Identifies multiple errors/themes/contributing factors 2. Explanation of any actions taken or shared learning gathered | 6 |
The rating of quality of investigation was based on the National Patient Safety Agency framework for root cause analysis9 and the London Protocol.10 To be rated as ‘full’, the report of investigations had to identify the errors, themes and contributing factors as well as providing some explanation of any actions taken or shared learning.
The first author (NW) undertook all scoring initially. All items rated as 5 and 6 in both rating scales were discussed in detail by the authors. Forty-five (15 from each year) additional Quality Accounts were selected at random by the senior author (IM) and rated independently.
Relevant sections of the Quality Accounts are reproduced in this report as examples of positive and negative reporting. Although the Quality Accounts are all publicly available, we have chosen not to name the individual organisations, as this report seeks to promote learning, not blame.
Systems analysis of possible causes of failure to report
Using extracts from the original Freedom of Information request responses and statements in Quality Accounts, we used the London Protocol template (Table 3) to explore contributory factors that may have led to the lack of reporting of Never Events.
Table 3.
An abbreviated summary of the London Protocol. This framework was used to identify investigation/learning themes in the Quality Accounts, and to provide a qualitative analysis of the possible reasons underlying the quality of Never Event reporting.
| Factor types | Contributory influencing factor |
|---|---|
| Patient | Condition (complexity and seriousness) Language and communication Personality and social factors |
| Task and Technology | Task design and clarity of structure Availability and use of protocols Availability and accuracy of test results Decision-making aids |
| Individual (staff) | Knowledge and skills Competence Physical and mental health |
| Team | Verbal communication Written communication Supervision and seeking help Team structure (congruence, consistency, leadership, etc.) |
| Work environment | Staffing levels and skills mix Workload and shift patterns Design, availability and maintenance of equipment Administrative and managerial support Physical environment |
| Organisational and Management | Financial resources and constraints Organisational structure Policy, standards and goals Safety culture and priorities |
| Institutional context | Economic and regulatory context NHS executive Links with external organisations |
Results
Quality Accounts and Never Event data were available for every English Acute NHS Trust (n = 158).
There was high inter-rater reliability of the categorisation tool for both reporting of Never Events and their investigation. Of the 45 accounts reviewed, 42 and 37 had identical scores for reporting and investigation, respectively. Cohen’s kappa coefficients were 0.907 and 0.739, respectively (Cohen’s kappa is a statistic of agreement for categorical items that includes an allowance for agreement occurring by chance). Discrepancies in investigation scores were all over the interpretation of whether investigations were ongoing or completed.
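Cohen’s kappa can be computed directly from the two raters’ category assignments. A minimal sketch follows; the ratings below are illustrative only, not the study data:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Agreement between two raters on categorical items, corrected for chance."""
    n = len(rater1)
    # Observed proportion of agreement
    p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement from each rater's marginal category frequencies
    c1, c2 = Counter(rater1), Counter(rater2)
    p_chance = sum(c1[cat] * c2[cat] for cat in c1.keys() | c2.keys()) / n**2
    return (p_observed - p_chance) / (1 - p_chance)

# Two raters scoring ten hypothetical Quality Accounts on the 1-6 scale
r1 = [1, 2, 2, 3, 4, 4, 5, 5, 6, 6]
r2 = [1, 2, 2, 3, 4, 5, 5, 5, 6, 4]
print(round(cohens_kappa(r1, r2), 3))  # prints 0.756
```

A kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance; the study’s values of 0.907 and 0.739 would conventionally be read as very good and good agreement, respectively.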
The distribution of reporting categories is shown in Figure 1. In the three years combined, 37% (125/342) of Quality Accounts from Trusts that had experienced a surgical Never Event either made no mention of the Never Events that had occurred or failed to mention all of them (Category 2); 4% (13/342) provided full disclosure. There appears to have been an increase in the proportion of Quality Accounts reporting the number of Never Events which occurred (Category 4), with a concomitant decrease in the proportion failing to report them at all (Category 2), from 2011/12 onwards. There was no clear change in the proportions providing more detail (Categories 5 and 6).
Figure 1.
Categories of Never Event reports in English NHS Acute Trust Quality Accounts 2011–2014. (1) No Never Event mentioned (and no Never Event occurred), (2) incomplete disclosure, (3) explicitly and correctly identified no Never Events have occurred, (4) number of Never Events only, (5) brief description, (6) full/complete details.
The quality of investigation reporting is shown in Figure 2. Investigations were reported to have occurred in 58% (197/342) of Quality Accounts where a Never Event had occurred. Forty per cent (79/197) of these investigations were in name only with statements such as ‘we have conducted a rigorous investigation’ or a very superficial ‘we have reviewed our policies’ (Category 3).
Figure 2.
Categories of Never Event investigation reports in English NHS Acute Trust Quality Accounts 2011–2014. (1) No Never Event investigation mentioned (and no Never Event occurred, or not mentioned in Quality Account), (2) have had Never Events but no mention of investigations, (3) statement that investigation has taken place, or investigation is limited to review of policies/procedures or equivalent, (4) ‘ongoing’ investigation, (5) brief report, (6) full/complete details.
Of those who did report details of the investigation, 31% (61/197) were limited in scope (Category 5) and 31% (61/197) were categorised as detailed reports (Category 6). A small proportion of Quality Accounts (8%, 26/342) reported the involvement of external assessors/investigators. There was no clear change across the three years.
Based on the information given in the reports, Task and Technology factors were the commonest factors, identified in 49% (103/211) of investigations, followed by Individual factors (23% (48/211)). Team and Work environment factors were identified in 14% (29/211) and 11% (23/211), respectively. Patient (1% (3/211)) and Organisational and Management (2% (5/211)) factors were rarely identified, and the Institutional context was never discussed (Figure 3).
Figure 3.
Contributory factors identified by investigations of Never Events reported in Quality Accounts of English NHS Trusts 2011–2014. Categories of factors are inferred from reports using the London protocol framework.
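The reported percentages follow directly from the factor counts; a quick tally (counts taken from the results above, rounded to whole percentages as in the text):

```python
# Contributory factor counts among the 211 factors identified in investigations
factors = {
    "Task and Technology": 103,
    "Individual": 48,
    "Team": 29,
    "Work environment": 23,
    "Organisational and Management": 5,
    "Patient": 3,
}
total = sum(factors.values())  # 211
for name, count in factors.items():
    print(f"{name}: {count}/{total} ({100 * count / total:.0f}%)")
```

The Institutional context is absent from the tally because no investigation discussed it.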
Systems analysis of quality of Quality Account reporting
Failure to transparently report Never Events
The definitions of surgical Never Events are relatively clear-cut (but readers are referred to a recent report by Toft which challenges this view11). In a few cases, the diagnosis was disputed – which may have led to some confusion about what could or should be reported (Task and Technology: Decision-making aids). For instance, one Trust reported that
the … incidents could be interpreted as outside the criterion of the Never Event as further surgery was not required and the patients were happy with the outcome of their surgery.
There appears to be some uncertainty about the degree of information it is acceptable to place in the public domain. This is evidenced by the response of a few Trusts to the original Freedom of Information request such as
We cannot disclose the details of the never events as there is a potential risk of the patients being identified so we apply an exemption under section 40 of the Freedom of Information Act.
Most, but not all, Trusts released these data on appeal and review by a senior manager, suggesting that more junior staff may be erring on the side of caution (Individual: Knowledge and skills). There were a few cases where, following clarification with Trusts whose detailed Quality Account information did not match detailed Freedom of Information responses, it was clear that clerical errors or failure of internal communication had occurred (Individual: skills; Team: Communication).
National guidance on what should appear in Quality Accounts is somewhat vague (Task and Technology: Availability and use of protocols; Institutional Context: National Health Service executive). The Quality Accounts frameworks state that
Quality Accounts must be seen as a key mechanism by which you can demonstrate that a relentless focus on improving service quality is being maintained.12
and
To support them to do this, from 2012/13 … trusts would need to include … a commentary on their performance and any steps that the Trust intends to take, or has already taken, to improve.2
Similarly, the Never Events framework13 mandates that
Boards, Chief Executives and Accountable Officers … understand their obligations with respect to the publication of information on never event occurrence … in annual Quality Accounts,
and ‘Include NE numbers and type in Annual Reports and if possible Quality Accounts’ and
The type of incidents; The learning derived from the incidents, with a particular focus on the system changes that have been made to reduce the probability of it occurring again; Data on the total number of Never Events, including the historical context and related incidents such as prevented Never Events if appropriate; That learning has been shared more widely than the organisation.
Taken together, these statements can be interpreted as requiring public discussion of the Never Events and the steps the Trust has taken to improve. However, there is no explicit statement of what ‘good’ looks like for Trusts to aspire to.
The fact that some Trusts provided exemplary detail in their Quality Accounts would suggest that whatever the perceived minimum requirements for reporting, the safety culture and goals of the individual Trusts also play a part (Organisational and Management: Policy, standards and goals, Safety culture and priorities). One Trust reported:
… Retained surgical instrument (small gauze swab retained in knee after wound closure) – patient immediately returned to theatre and swab removed and there was no long term harm to the patient …
Conversely, this is presumably affected by the sometimes blame-focused approach of media and politicians.
Failure to report investigation sufficiently to inspire openness and learning for others
The failure to report transparently the investigations that should have taken place would have many of the same contributory factors as the failure to report Never Events adequately. There is clear guidance, in a variety of formats,9,10 on the components of an adequate investigation (Task and Technology: Decision-making aids) that is reinforced by local commissioning arrangements. When reported, many of the investigations appeared to lack breadth, depth or specificity of action. For example, two Quality Accounts reported that
… Following this event, a number of actions have been implemented to reduce the risk of recurrence and included: a full review of our practice and processes relating to completion of the World Health Organisation (WHO) surgical check list in theatres and provision of additional training.
and
… Maternity service guideline was not followed in respect of the … swab count …
Some investigations give the impression of a superficial systems analysis combined with a punitive approach to individual staff such as
The root cause was identified as …’systems failure’ and individual error.… managerial action is ongoing …
Conversely, there were examples of investigations and learning which not only demonstrated the organisation’s own desire to learn and looked wider than just the event itself, but are also of value to any healthcare organisation. Examples include
… whilst an investigation was undertaken. Reinforcement of the importance of team brief for catching unforeseen changes to the operating list. Locking down of operating lists 24 hours before. Video reflexivity exercise to reinforce safety behaviours. Identification of risks of partial EPR implementation.
and
… the investigation revealed a number of factors contributing to the incident … poor swab counting in an emergency situation; unfamiliarity of the team …;… handover …
There may be many reasons for this generally poor quality, including lack of adequate training (Individual: Knowledge and skills, Competence; Organisational and Management) and of organisational drive to find the true root causes rather than the easier ‘sharp end’ issues (Organisational and Management). The focus appeared to be on Task and Technology, Individuals and Teams. Deeper organisational issues, which inevitably reflect on the organisation as a whole rather than individuals or small departments, were rarely addressed (Organisational and Management: Financial resources, Safety culture, Standards).
Discussion
We have found that transparent reporting of Never Events by English NHS Trusts in their Quality Accounts is generally poor, but there are examples of excellence. The reporting of investigations that have occurred is also patchy.
There are limitations to our findings. Our tool for assessing reporting is a relatively blunt instrument. It is not intended to provide a narrative account of the content of the reports that contained qualitative information, but simply to describe, in broad terms, how much has been reported. The distinction between limited and full reporting is, of course, subjective to an extent. We suggest that the number of organisations which provide no or incorrect information, or just the number of incidents, is of more concern. The categorisation we have developed appears reliable, with good inter-rater agreement. We would suggest that it could be used primarily as a formative tool, allowing organisations to self-assess the quality of their reporting.
Quality Accounts are not the only mechanism for organisations to share information about Never Events. There were some notable examples of Trusts where the Quality Accounts were relatively poor, but publicly available Board papers contain full details of both the nature of the Never Events and the investigations and learning that have stemmed from these. Our only criticism of those organisations would be that the data are hard to find, which limits the opportunity for other organisations to learn.
With regard to the quality of investigation, we could only judge what was written in the Quality Account. Full and thorough investigations may well have taken place but not been reported. The widespread use of the NPSA Root Cause Analysis Toolkit9 suggests that these investigations should have taken place. However, given the relative shallowness of investigations when they were reported, we would speculate that the investigations might have left room for improvement. Work from the Perinatal Institute14 found that only one-quarter of significant concerns in care processes identified by independent panels were found in internal investigations. Similarly, recent external reviews of Never Event investigations11,15 have highlighted inadequacies in internal reports and a relatively narrow focus.
We chose to look only at surgical Never Events for several reasons. They clearly fulfil the ‘this really shouldn’t happen’ spirit of a Never Event, which some of the other Never Events do not. They have remained consistent over time,1 which makes comparison more straightforward, and there is a reasonable agreement about what is and is not a Never Event. It was apparent from some of the Freedom of Information responses, and some Quality Accounts, that Trusts vary in, and will argue about, their interpretation of what is or is not a Never Event. Surgical Never Events are collectively far more common than the other Never Events, accounting for around two-thirds of reported Never Events.8 Finally, surgical Never Events are found consistently in most definitions of Never Events across the globe.
There are probably many reasons why Quality Accounts do not contain what we would suggest is sufficient information. The NHS position on publication has changed from not being required at provider level when Never Events were first introduced,16 through a requirement to include moderate detail2 to headline publication on a searchable website.8 This may explain in part the apparent sharp increase in the proportion of Quality Accounts at least providing the number of Never Events from 2011/12 onwards. We do not suggest that our brief root cause analysis is a complete description of all the contributory factors, but we would assert that organisational and external influences, their interpretation by Trusts, and the relative lack of clarity from NHS England are likely to be the key drivers.
We were somewhat surprised by the lack of real change over the three years. Recent high profile failings in NHS care at Mid-Staffordshire17 and Morecambe Bay Hospitals18 have led to very public commitments to greater transparency and learning within healthcare. Similar issues no doubt exist in every healthcare framework. Despite this, there is no obvious change over this time. Looking forward, there is a new statutory duty for openness at the organisational and individual level. This ‘Duty of Candour’ came into force after the Quality Accounts examined in this report were published.19 The Care Quality Commission (the NHS quality regulator) advice to providers20 concerning the Duty of Candour explicitly states that
there should also be a commitment to being open and transparent at board level, or its equivalent such as a governing body.
Furthermore, it uses wider definitions than the legal framework (derived from Francis17), for instance
Transparency: Allowing information about the truth about performance and outcomes to be shared with staff, people who use the service, the public and regulators.
The individual professionals’ duty of candour has existed for a long time. However, the legal framework of the UK Duty of Candour and the regulator’s specific interest in whether the duty is being applied at organisational level may, in time, drive improvements.
Does this poor level of transparency matter? We suggest it does at several levels. First, it can be viewed as a marker of a lack of trust between a provider and its community. The lack of trust clearly has many origins and all sides bear responsibility for it. Second, every Quality Account is scrutinised by the key stakeholders relevant to the organisation. Very few commented on the lack of information about Never Events, which they must have already known about. This raises the question of whether the stakeholders understand what transparency and learning should look like. Third, there is a missed opportunity to learn. Numerous reports have been written across the globe urging healthcare to learn from its past mistakes. Transparent sharing of incidents and learning is a key part of this. This impediment to learning may in part explain the apparent lack of change in the Never Event rate over time.1 Fourth, of the investigations reported, too many appeared to stop at relatively superficial explanations and responses. Poor quality investigation is unlikely to solve the problem(s) and inevitably focuses more on the ‘sharp end’. The repeated emphasis on audit and compliance with use of the WHO Safe Surgery Checklist may be a symptom of this. Failure to understand why the WHO Checklist did not prevent events that it ostensibly should have done will lead to the same errors. Capability, capacity and culture will all have influenced this lack of transparency and rigour to varying degrees in different organisations.
However, we believe that our data provide an opportunity for organisations to improve. First, they are a benchmark of the quality of reporting which NHS England or any other body could use to gauge improvement over time. Second, we have created a framework for organisations to self-assess the quality of their reporting, whether internally or to external parties. We hesitate to suggest mandatory standards for reporting, as these may be counter-productive. But perhaps, by setting a vision of what good reporting looks like, organisations will feel safety in numbers when they do report their Never Events, and their learning, fully. Third, we suggest that our analysis provides evidentiary support for the role of an independent investigatory body for healthcare.21,22
In summary, reporting of Never Events and their investigations by English NHS Trusts in their Quality Accounts is neither consistently transparent nor adequate. There are many possible contributory factors but as with clinical error, the true root causes are likely to be organisational rather than individual. It would appear that NHS Trusts have some way to go before they embrace Don Berwick’s call5:
Recognise that transparency is essential and expect and insist on it.
Declarations
Competing interests
None declared
Funding
None declared
Ethical approval
Ethical approval was not required as the study involved analysis of data that are all in the public domain. No individual patient data were used or sought, nor can any patients be identified.
Guarantor
IM
Contributorship
IM conceived this study and oversaw all aspects of data analysis, development of the scoring system and the writing up of the paper. NW assisted with development of the scoring system, undertook the primary scoring and data abstraction and assisted with drafting, revision and final approval of the manuscript. SM conceived this study and assisted with revision and final approval of the manuscript.
Acknowledgements
None
Provenance
Not commissioned; peer-reviewed by Kunal Kulkarni.
References
- 1.Moppett IK, Moppett SH. Surgical caseload and the risk of surgical Never Events in England. Anaesthesia 2016; 71: 17–30.
- 2.Quality accounts: reporting requirements for 2011/12 and planned changes for 2012/13, Gateway reference number: 17240. See www.gov.uk/government/uploads/system/uploads/attachment_data/file/215165/dh_132727.pdf (2012, last checked 22 February 2016).
- 3.Department of Health. An organisation with a memory: report of an expert group on learning from adverse events in the NHS. See http://webarchive.nationalarchives.gov.uk/20130107105354/http://dh.gov.uk/prod_consum_dh/groups/dh_digitalassets/@dh/@en/documents/digitalasset/dh_4065086.pdf (2000, last checked 22 February 2016).
- 4.Donaldson L. An organisation with a memory. Clin Med 2002; 2: 452–457.
- 5.Berwick D. 2013. A promise to learn – a commitment to act: improving the safety of patients in England. See www.gov.uk/government/uploads/system/uploads/attachment_data/file/226703/Berwick_Report.pdf (2013, last checked 22 February 2016).
- 6.NHS England. NHS and social care transparency data. See www.nhs.uk/Service-Search/Accountability?OrgType=&OrgTopic=&Location.Name=e.g.+postcode+or+town&Location.Id=0&IsNationalSearch=True&submit=Go (2015, last checked 22 February 2016).
- 7.NHS England. Quality Accounts. See www.nhs.uk/aboutNHSChoices/professionals/healthandcareprofessionals/quality-accounts/Pages/about-quality-accounts.aspx (2015, last checked 22 February 2016).
- 8.Never Events data. See https://www.england.nhs.uk/patientsafety/never-events/ne-data/ (2015, last checked 22 February 2016).
- 9.Root Cause Analysis (RCA) report-writing tools and templates. See www.nrls.npsa.nhs.uk/resources/?entryid45=59847 (2008, last checked 22 February 2016).
- 10.Taylor-Adams S, Vincent C, Street P. Systems analysis of clinical incidents: the London protocol. Clin Risk 2004; 10: 211–220.
- 11.Toft B. External review of Never Events in interventional procedures co-commissioned by Sheffield Teaching Hospitals NHS Foundation Trust and Sheffield Clinical Commissioning Group. See http://chfg.org/wp-content/uploads/2015/01/External-review-redacted-version-at-the-request-of-patients.pdf (2014, last checked 22 February 2016).
- 12.Quality Accounts for 2010-11, Gateway reference number: 15119. See www.gov.uk/government/uploads/system/uploads/attachment_data/file/216295/dh_122541.pdf (2010, last checked 22 February 2016).
- 13.The Never Events Policy Framework: an update to the never events policy. Gateway Reference 17891. See http://www.idsc-uk.co.uk/docs-2012/never-events-policy-framework-update-to-policy.pdf (2012, last checked 22 February 2016).
- 14.Perinatal Institute. Confidential enquiry into intrapartum related deaths. See www.pi.nhs.uk/pnm/clinicaloutcomereviews/WM_IfH_-_IntrapartumConfidentialEnquiryReport_-_Oct%202010.pdf (2010, last checked 22 February 2016).
- 15.Clinical Human Factors Group. Never? See www.chfg.org/wp-content/uploads/2012/03/Never_Events_Corrected_Final_VersionApril12.pdf (2012, last checked 22 February 2016).
- 16.Process and action for Primary Care Trusts. Never events framework 2009/10. See www.nrls.npsa.nhs.uk/neverevents/?entryid45=59859 (2009, accessed 22 February 2016).
- 17.Report of the Mid Staffordshire NHS Foundation Trust Public Inquiry. See www.midstaffspublicinquiry.com/report (2013, last checked 22 February 2016).
- 18.Morecambe Bay investigation report. See www.gov.uk/government/publications/morecambe-bay-investigation-report (2015, last checked 22 February 2016).
- 19.Care Quality Commission. Regulation 20: duty of candour. See www.cqc.org.uk/content/regulation-20-duty-candour (2014, last checked 22 February 2016).
- 20.Regulation 20: Duty of candour. Information for all providers: NHS bodies, adult social care, primary medical and dental care, and independent healthcare. See www.cqc.org.uk/sites/default/files/20150327_duty_of_candour_guidance_final.pdf (2015, last checked 22 February 2016).
- 21.Macrae C, Vincent C. Learning from failure: the need for independent safety investigation in healthcare. J R Soc Med 2014; 107: 439–443.
- 22.Independent Patient Safety Investigation Service (IPSIS) Expert Advisory Group. See www.gov.uk/government/groups/independent-patient-safety-investigation-service-ipsis-expert-advisory-group (2015, last checked 22 February 2016).



