Abstract
Background
Although previous research suggests that different kinds of patient feedback are used in different ways to help improve the quality of hospital care, there have been no studies of the ways in which hospital boards of directors use feedback for this purpose.
Objectives
To examine whether and how boards of directors of hospitals use feedback from patients to formulate strategy and to assure and improve the quality of care.
Methods
We undertook an in-depth qualitative study in two acute hospital National Health Service foundation trusts in England, purposively selected as contrasting examples of the collection of different kinds of patient feedback. We collected and analysed data from interviews with directors and other managers, from observation of board meetings, and from board papers and other documents.
Results
The two boards used in-depth qualitative feedback and quantitative feedback from surveys in different ways to help develop strategies, set targets for quality improvement and design specific quality improvement initiatives; but both boards made less subsequent use of any kind of feedback to monitor their strategies or explicitly to assure the quality of services.
Discussion and conclusions
We have identified limitations in hospital boards' use of patient feedback. These suggest that boards should review their current practice to ensure that they use the different kinds of patient feedback available to them more effectively to improve, monitor and assure the quality of care.
Introduction
Boards of directors of hospitals in the National Health Service (NHS) in England have three key roles: formulating strategy; ensuring accountability; and shaping a healthy culture.1 In undertaking these roles, boards make decisions about annual plans and budgets that impact directly on the quality of care. The Department of Health expects boards of directors of NHS trusts to use feedback from patients to help assure the quality of care.2 3 Monitor, the independent regulator of NHS foundation trusts, has asked boards to seek feedback from patients and said that they could gain assurance by using patients to design improvements and monitor their impact.4 5
Patient feedback is often presented in verbal or written reports to hospital boards of directors.6 7 Boards spend substantial amounts of time reviewing patient satisfaction, listening to patients’ stories and discussing quality and safety.7 8 Although some studies of boards include data about patient experience as an element of quality, they do not show whether boards use the data explicitly to assure or improve quality.9–12 Although hospital boards often see the oversight of quality as a priority,13 14 and there is evidence that regular discussion of quality by boards is positively associated with the quality of care,15–19 these studies again do not show how boards use patient feedback to influence quality.
It is widely accepted that patient feedback should be used to improve the quality of hospital care.6 9 10 20–22 The relationships between patient feedback, patient experience and patient involvement are complex.23–25 Some authors distinguish between quantitative and qualitative feedback, identifying routine administrative statistics and different kinds of surveys of patients as examples of quantitative feedback, and in-depth interviews, focus groups, compliments and complaints as qualitative feedback.2 26 Recent analysis has focused on the strengths and limitations of national and local patient surveys,27–29 and on the relationships between quantitative and qualitative methods and the contents of the feedback (eg, the amount of detail and focus on ‘functional’ service issues or on patients’ subjective feelings).30 31 It is argued that different kinds of patient feedback may be used in different ways.23 26 32 Surveys and other quantitative methods are generalisable, help identify problem areas in the delivery of services and are useful for benchmarking and monitoring but lack the detail needed to design specific changes.27 29–31 33–35 Patient forums and other forms of patient involvement, and in-depth analysis of comments, complaints and patients’ stories, generate detailed understanding of patients’ feelings and specific aspects of care; they build commitment to change and lead to specific improvements.30 33 34 36
If different kinds of patient feedback are used in these various ways to improve the quality of hospital care more generally, questions follow about how boards of directors themselves use these diverse kinds of feedback. Do boards of trusts with a focus on surveys and other quantitative methods use these surveys and statistics to identify problems, assure quality and monitor improvements? Do boards of trusts with a focus on patient involvement use in-depth feedback to generate commitment to change and to support the design and implementation of specific improvements? To address these questions we report in this paper the findings of an in-depth qualitative study of whether and how the boards of directors of two NHS foundation trusts in England used feedback from patients to formulate strategy and to assure and improve the quality of care.
Methods
We adopted a qualitative research strategy to generate propositions about analytical relationships in the use of patient feedback that could be tested and generalised through further research.37–40 We studied two trusts in order to develop concepts and explanations that might not be identified from a single setting.41–43 We purposively selected the trusts (on the basis of informal local discussions and reading the trusts’ annual plans and quality accounts) as contrasting examples of the collection of patient feedback in order to analyse their similarities and differences in the use of different kinds of feedback.38 40 44
Hillside Hospital NHS foundation trust (a pseudonym) was selected as a good example of a hospital with a well established practice in using local surveys to collect patient feedback. It introduced a local inpatient survey in 2004, before these surveys were common in the NHS, and in 2011 received more than 1200 completed survey forms a month. Northbank Hospitals NHS foundation trust (a pseudonym) was selected because of its experience in patient involvement. It was widely recognised for practice and research in the use of patient workshops and forums in service development. The two trusts provided acute hospital services for the people of three inner-city local authority areas with a diverse population of 850 000 people. They provided a wide range of more specialised services and worked closely with the local university in teaching and research. Despite the geographical relationship and some organisational links between the two trusts, their collection and use of patient feedback were separate and distinct.
We collected data from three main sources in each trust: interviews with managers; trust documents; and non-participant observation of meetings. Ethical approval was granted before data collection started. Fieldwork was undertaken in 2011; our documentary data relate to 2010, 2011 and 2012. In collecting data we focused on the use of patient feedback in the management of the hospitals generally and on four specific examples: neurosciences and food service at Hillside Hospital; cancer services and outpatient experience at Northbank.
We selected managers for interview on the basis of information from our earlier contacts about which managers had responsibilities for patient feedback. We approached 59 people for interview (of whom one declined and one failed to respond) and interviewed 57 people as shown in table 1.
Table 1 Interviewees by role and trust

| Role | Hillside | Northbank | Total |
| --- | --- | --- | --- |
| Board member: executive or corporate director | 4 | 4 | 8 |
| Corporate manager* | 7 | 8 | 15 |
| Divisional or directorate manager | 6 | 7 | 13 |
| Clinical, service or project manager | 5 | 4 | 9 |
| Clinician with management responsibility | 4 | 4 | 8 |
| Trust governor | 3 | 1 | 4 |
| Total | 29 | 28 | 57 |

*‘Corporate managers’ are managers other than directors working in trust headquarters.
The eight board members, four trust governors and four other interviewees provided data about the use of patient feedback by the boards. We used an interview guide drawing on the research literature and focusing on the kinds of patient feedback interviewees used in their work, what the patient feedback was used for, and the organisational processes through which it was used. Interviews were semistructured, audio-recorded and lasted about 45 min. The transcribed audio recordings formed sources of data for analysis.
We observed all the meetings in public in 2011 of the boards of directors; Hillside board met 12 times and Northbank 8 times. Meetings lasted about 2 hours. We made contemporaneous handwritten notes of what was said about patient feedback and patient experience. After the meetings the notes were typed to form sources of data for analysis.
We identified and drew data from a wide range of government, regulatory and local trust paper and electronic documents. In focusing on the use of feedback by boards of directors we have drawn on the agendas and minutes of the 56 meetings in public of the two boards between 2010 and 2012. We have drawn on the annual plans, annual reports, quality accounts, quality strategies and reports of quality and assurance committees in each of the 3 years, and the regular chief executive’s reports, performance reports, scorecards and patient experience reports presented to meetings of the boards, a total of 180 trust documents.
We analysed the data in three stages. First, we developed a coding frame, both inductively and deductively, with five main categories: kinds (or sources) of patient feedback; domains (or subject matter) of feedback; uses made of feedback; aims of using feedback; and organisational processes in the use of feedback.45 We used NVivo to code the interview and observational data and generate thematic material for further analysis. Second, we used all the data to build up detailed time-ordered narratives, identifying the context and the specific decisions and actions in the boards’ use of feedback.46 Third, we used the coded data and the narratives to develop more detailed analyses of the relationships between the organisational processes and outcomes: the different uses by different boards of different kinds of patient feedback.38 47 48
Findings
We examine how the boards of directors of Hillside and Northbank Hospitals, trusts with different emphases on patient surveys and patient involvement, used different kinds of patient feedback in developing strategy, providing assurance and improving the quality of care.
Developing strategy
Following the Mid Staffordshire hospitals crisis in the mid-2000s and Department of Health guidance in 2010, the boards of directors at Northbank and Hillside Hospitals took different approaches in using patient feedback to develop strategies for the quality of care. The Northbank board used feedback from patient workshops and from national surveys to develop a new quality strategy; the Hillside board used national and local surveys to set quantified targets for quality improvement.
The proposed new quality strategy at Northbank developed from a review of the trust’s existing programme of patient and public involvement.
‘We have […] put a lot of emphasis since 2007 on putting patients at the heart of decision-making within the organisation […] our Board was also very, very keen to understand where we get patient feedback from.’ (Director A, Northbank Hospitals).
In developing the strategy, managers analysed information about patient experience from the National Inpatient Survey and from complaints and Patient Advice and Liaison Service data; they commissioned three workshops with patients to understand what made patients feel safe in hospital. The results were presented to the board in March 2011. A further workshop was convened for patients to identify priorities for improvement. A report (Quality Strategy 2011–2013) proposing eight ‘patient experience priorities’ was discussed at length at a board meeting in September 2011. It said that the board would be presented with feedback based on patients’ stories, complaints and local surveys to monitor progress with the strategy. Our observation notes of the meeting recorded that the chair ‘warmly welcomed’ the strategy; but the minutes did not say that the board approved it or made any decisions about it.
Our analysis of board papers shows that the Northbank board did not subsequently receive the full set of patient feedback proposed in the strategy. The board did receive information based on surveys and complaints, primarily in the context of the Monitor and Commissioning for Quality and Innovation (CQUIN) targets established by the Government. There was little discussion at the board about this feedback and the board did not base any formal decisions directly on it. Minutes of the quality committee and the board from 2012 also revealed no evidence of discussions or decisions based specifically on feedback from patients used to monitor the patient experience standards or the quality strategy more generally.
The board of directors at Hillside in 2010 adopted quantified targets for improvement in the results of its existing inpatient survey. It agreed in its annual plan and quality accounts for 2010/2011 three quality improvement priorities: use the trust’s ‘First Choice’ transformation programme to improve patient experience; achieve target patient satisfaction scores in CQUIN metrics measured by local and national patient surveys; and achieve the trust’s own ‘How Are We Doing’ survey benchmark, to be in the top 20% of trusts locally in the national hospital patients’ surveys by 2011/2012.
The Hillside board received reports at each meeting in 2011 showing performance against CQUIN and benchmarking targets measured by the trust’s regular survey of inpatients. The reports show that in the year the large majority of the CQUIN targets were achieved, but little progress was made towards the benchmarking targets. Although our observations showed that the board discussed the surveys and patient feedback in nearly all meetings in 2011, it made only two decisions resulting from these discussions: to ask for the information to be presented differently; and to ask the director of operations to establish responsibility for the provision of hand washing gels in wards and communal areas. Interviewees spoke about the use of patient feedback in decision making by the board.
‘… it may not be apparent at every Board meeting what those decisions that they might or might not be making are. […] what it does do for the Board of Directors is I think it focuses […] in terms of the trust strategy and overall direction, ‘where are we going to focus our efforts?’ (Director A, Hillside Hospital).
The board at Hillside focused more on the use of surveys to develop targets and priorities than to make decisions about their implementation and achievement.
Providing assurance
Department of Health and Monitor guidance in 2009 and 2010 asked boards of directors of NHS trusts to use patient feedback to help assure the quality of care. The boards at Northbank and Hillside Hospitals responded to this guidance in different ways.
The board at Northbank used an assurance and risk committee for quality assurance. Individual directors expressed different views about the use of patient feedback by the committee.
‘And it’s that Committee that tends to look at the broader issues of patterns of patients’ complaints, patterns of patients’ surveys, responsiveness to issues that are raised by patients. It tends to stick there rather than be something which gets addressed at the Board.’ (Director B, Northbank Hospitals).
‘We’ve got an Assurance and Risk Committee, which is a formal sub-committee of the Board, chaired by one of our Non-execs. […] But that very much doesn’t talk about patient feedback.’ (Director A, Northbank Hospitals).
Our observation data and analysis of the publicly available papers and minutes of the committee and board in 2011 and 2012 revealed no evidence that the board itself regularly received from the committee explicit information or statements based on patient feedback that demonstrated how the quality of care was assured.
The board at Hillside Hospital frequently discussed the results of patient surveys. Some managers and governors saw feedback as assurance.
‘I think it is about joining up with adverse incidents and complaints – it’s another way of really being assured, seeking assurance that what’s going out there is safe really.’ (Divisional manager, Hillside Hospital).
‘I mean if I go to a Directors’ meeting… at the last one I think they had a good 30 min plus discussion on issues from those surveys. They really mean it. It’s not just for show.’ (Governor, Hillside Hospital).
But, despite the discussion of the survey results, none of the managers or directors we interviewed explicitly stated that the board itself used feedback as assurance. Neither the minutes of board meetings nor anyone who spoke at those meetings explicitly stated that patient feedback provided assurance of the quality of care. The discussion at board meetings about patient surveys did not translate into explicit statements of assurance about quality.
Improving the quality of care
Policy guidance in England is unclear about how boards of directors of hospitals should use patient feedback to help improve the quality of care. In this section we examine two examples showing how the boards of Northbank and Hillside Hospitals used different kinds of feedback in specific improvement initiatives.
The board of the Northbank Hospitals trust discussed in 2010 and 2011 an initiative to improve access to the hospitals for outpatients. The initiative was the result of complaints and informal discussion with patients (rather than surveys or questionnaires) about telephone communications and outpatient appointments. There was detailed discussion about individuals’ experience at a meeting of the Patient Experience Working Group of the trust’s council of governors.
‘…it was something I raised at the Patient Experience sub-group of the Governors, and one of the Non-execs was there who wanted to hear about what our key issues were, and was very interested.’ (Corporate Manager, Northbank Hospitals).
Individual board members and governors were instrumental in persuading the board to take action. The chair of the governors’ Patient Experience Working Group spoke at the board meeting in January 2011.
‘But when they’re at the Board meeting, that’s when I really try and nail something down. I mean I nailed down all this business about out-patients.’ (Governor, Northbank Hospitals).
According to the minutes of the meeting, ‘Members of the Board agreed that this was an issue that required both a short term solution and be part of a longer term strategy’, but the board made no formal decisions about action to be taken. Following a meeting of the board in private, a board paper in March said that the board was ‘committed’ to making improvements. An ‘Operational Update’ to the next meeting of the board reported on the actions being taken to improve communications and access for outpatients. These actions were, according to the minutes of the meeting, ‘welcomed’ by the board, but the board did not formally approve or make other decisions about them.
Although qualitative patient feedback influenced the Northbank board in the development of the outpatient improvement initiative, the board did not subsequently use feedback to monitor its progress and success. The ‘Operational Update’ to the board said that progress would be measured by quantified operational measures and by the development of ‘softer measures’ including patient surveys, analysis of complaints and patient interviews. Progress with the actions taken by managers was discussed at subsequent board meetings, but the only measure of success reported to the board, or referred to verbally at any of the meetings, was the abandoned telephone call rate, itself measured electronically by the trust. No quantitative or qualitative ‘softer measures’ of patient experience were reported to the board. Despite the lack of information based on patient feedback, the minutes of the board meeting in November 2011 stated that the chair suggested that the action was now ‘complete’.
An initiative to improve the quality of the food service at Hillside Hospital was the result of patient surveys and perceived pressure from regulatory bodies. The board considered and formally approved the initiative as part of a broader transformation programme.
‘That’s definitely come from feedback, because we score in the bottom 20 percent nationally for trusts for help with feeding or patients having a perception that they’re not getting enough help.’ (Corporate Manager, Hillside Hospital).
Two other factors in addition to patient feedback influenced the board. Interviewees said that the new food service initiative was introduced when, for the first time in England, trust chief executives had been placed under a legal duty to protect patients from risks of inadequate nutrition. The Care Quality Commission (CQC) was known to be working on how it would measure compliance with this duty. Further, the trust had recently received a critical CQC report following a hygiene inspection visit.
‘… in the middle of 2009 we had very, very unhelpful hygiene code inspection.’ (Director B, Hillside Hospital).
‘Round about that time, just before Christmas, we had a spot check from CQC about cleanliness. And we were castigated for dirty mattresses and all sorts of things. So when that result came along, we were, the Trust was acutely sensitive about any kind of patient experience, bad things with the service.’ (Project Manager, Hillside Hospital).
As a result directors wanted to avoid further criticism by the commission; improving the food service for patients became a high priority. The initiative consisted of a project team of clinical and hotel service staff working with ward staff to improve the availability and presentation of food, and to help patients eat.
Following the approval by the Hillside board of the food service improvement initiative, data about patients’ perception of the food service, drawn from the trust’s regular survey of inpatients, were presented in patient experience reports to the board each month in 2011. The data showed that scores were somewhat higher in 2011 than in 2010 but remained below the benchmark that the trust had set for itself. The reports twice included comments about the food service, once to say that the score had fallen and once to say that it had risen. Our observation of the meetings showed that there was never any discussion at the board of the survey results about help with feeding or the food service; the minutes of the meetings contained no record of any further discussion or decisions by the board about it. A paper to the board in January 2012 proposed substantial changes to the transformation programme to make it more outward looking and to help meet the financial pressures being experienced by the trust. The food service initiative was not specifically identified in that paper or in the trust’s forward plan for 2012/2013.
Discussion
In this paper we have reported the results of the first detailed study of the use by hospital boards of directors of patient feedback to help improve the quality of care. Our findings lead to two main propositions about how the boards of directors of hospital trusts with different traditions of patient involvement and patient surveys use different kinds of patient feedback.
The literature suggests that boards of directors of hospital trusts with a focus on feedback from patient involvement might use in-depth feedback to build commitment and support the design of specific quality improvements, and that boards of trusts with a focus on surveys would use the surveys to identify problems and priorities for improvement.30 33 34 36 Our findings provide only partial support for this proposition. Although patient involvement at Northbank Hospitals contributed to the development of strategy and quality improvements, it did not result in formal commitment by the board to these initiatives. Although surveys and statistics at Northbank and Hillside Hospitals contributed to priorities for quality improvement, external pressures on the boards of both trusts also influenced these priorities. Our findings suggest that boards do use different kinds of quantitative and qualitative patient feedback to develop strategies and quality improvement initiatives, but that external pressures are equally important in determining whether and how boards use feedback.
The literature also suggests that boards of hospital trusts with a focus on patient surveys and statistics might use these kinds of feedback primarily to monitor improvements and assure the quality of care.27 29 30 34 35 We found little evidence, other than the monitoring of contractual targets, to support this argument. Boards of trusts with robust systems of patient surveys do not always use the feedback from surveys explicitly to monitor or assure the quality of care. Although previous studies6–8 have shown that boards receive and discuss surveys and other kinds of patient feedback, we have extended these findings by showing that the discussion of surveys and other kinds of feedback does not of itself lead to action or explicit assurance.
This analysis leads to three implications for policy and practice. First, it suggests that boards should review their current practice in using different kinds of patient feedback, to ensure that information and discussions lead to appropriate actions and decisions to improve and assure the quality of care. Second, boards may wish to be more explicit than at present about the standards based on patient feedback that they and their committees use to assure the quality of services. Third, as increasing amounts of feedback are collected from patients and as pressures on hospital services increase, boards may wish to discuss in public the relationships between patients’ views and other service priorities, so that patients have realistic expectations about the impact of their feedback on the quality of care.
In this paper we have examined the ways in which different boards use different kinds of patient feedback. The limitations of the study, in terms of data collection from two purposively selected trusts at a specific point in time, mean that further research is needed to develop and test the propositions we have presented here. Practice in these and other trusts has developed in recent years, with, for example, the implementation in England of the NHS Friends and Family Test in 2012. A priority for further research is to investigate the current use of patient feedback by boards of directors in a wide range of hospital trusts. This would have two aims: to investigate how boards of directors can effectively combine patient feedback from a variety of sources with other internal and external expectations to identify priorities for improvement; and to identify how surveys and other sources of statistical information are most effectively used by boards to monitor improvements and assure the quality of care.
Footnotes
Contributors: RL was responsible for collecting data and led the drafting of the manuscript. All the authors contributed to the analysis of the data. JB and NJF revised and edited the manuscript.
Competing interests: None declared.
Patient consent: There were no patients involved in this study.
Ethics approval: Education and Management Research Ethics Panel of King’s College London.
Provenance and peer review: Not commissioned; externally peer reviewed.
References
- 1. NHS Leadership Academy. The Healthy NHS Board: Principles for Good Governance. 2013.
- 2. Department of Health. Understanding What Matters: A Guide to Using Patient Feedback to Transform Services. London: Department of Health, 2009.
- 3. Department of Health. Hard Truths: The Journey to Putting Patients First. Volume One of the Government Response to the Mid Staffordshire NHS Foundation Trust Public Inquiry. Cm 8777-1. London: The Stationery Office Ltd, 2014.
- 4. Monitor. Quality Governance Framework. London: Monitor, 2010.
- 5. Monitor. Quality Governance: How Does a Board Know That Its Organisation Is Working Effectively to Improve Patient Care? Guidance for Boards of NHS Provider Organisations. London: Monitor, 2013.
- 6. Reeves R, Seccombe I. Do patient surveys work? Qual Saf Health Care 2008;17:437–41.
- 7. Millar R, Mannion R, Freeman T, et al. Hospital board oversight of quality and patient safety: a narrative review and synthesis of recent empirical research. Milbank Q 2013;91:738–70. doi:10.1111/1468-0009.12032
- 8. Jiang HJ, Lockee C, Bass K, et al. Board oversight of quality: any differences in process of care and mortality? J Healthc Manag 2009;54:15–29.
- 9. Jha AK, Epstein AM. A survey of board chairs of English hospitals shows greater attention to quality of care than among their US counterparts. Health Aff 2013;32:182–7. doi:10.1377/hlthaff.2012.1060
- 10. Rozenblum R, Lisby M, Hockey PM, et al. The patient satisfaction chasm: the gap between hospital management and frontline clinicians. BMJ Qual Saf 2013;22:242–50. doi:10.1136/bmjqs-2012-001045
- 11. Botje D, Klazinga NS, Wagner C. To what degree is the governance of Dutch hospitals orientated towards quality in care? Does this really affect performance? Health Policy 2013;113:134–41. doi:10.1016/j.healthpol.2013.07.015
- 12. Dixon-Woods M, Baker R, Charles K, et al. Culture and behaviour in the English National Health Service: overview of lessons from a large multimethod study. BMJ Qual Saf 2014;23:106–15. doi:10.1136/bmjqs-2013-001947
- 13. Bismark MM, Walter SJ, Studdert DM. The role of boards in clinical governance: activities and attitudes among members of public health service boards in Victoria. Aust Health Rev 2013;37:682–7. doi:10.1071/AH13125
- 14. Mannion R, Davies H, Freeman T, et al. Overseeing oversight: governance of quality and safety by hospital boards in the English NHS. J Health Serv Res Policy 2015;20:9–16. doi:10.1177/1355819614558471
- 15. Botje D, Klazinga NS, Suñol R, et al. Is having quality as an item on the executive board agenda associated with the implementation of quality management systems in European hospitals: a quantitative analysis. Int J Qual Health Care 2014;26(S1):92–9. doi:10.1093/intqhc/mzu017
- 16. Parand A, Dopson S, Renz A, et al. The role of hospital managers in quality and patient safety: a systematic review. BMJ Open 2014;4:e005055. doi:10.1136/bmjopen-2014-005055
- 17. Jiang HJ, Lockee C, Fraser I. Enhancing board oversight on quality of hospital care: an agency theory perspective. Health Care Manage Rev 2012;37:144–53. doi:10.1097/HMR.0b013e3182224237
- 18. Szekendi M, Prybil L, Cohen DL, et al. Governance practices and performance in US academic medical centers. Am J Med Qual 2015;30:520–5. doi:10.1177/1062860614547260
- 19. Tsai TC, Jha AK, Gawande AA, et al. Hospital board and management practices are strongly related to hospital performance on clinical quality metrics. Health Aff 2015;34:1304–11. doi:10.1377/hlthaff.2014.1282
- 20. Department of Health. High Quality Care for All: NHS Next Stage Review Final Report. Cm 7432. London: The Stationery Office, 2008.
- 21. Wiig S, Storm M, Aase K, et al. Investigating the use of patient involvement and patient experience in quality improvement in Norway: rhetoric or reality? BMC Health Serv Res 2013;13:206–19. doi:10.1186/1472-6963-13-206
- 22. Bismark M, Biggar S, Crock C, et al. The role of governing boards in improving patient experience: attitudes and activities of health service boards in Victoria, Australia. Patient Exp J 2014;1:144–52.
- 23. Mockford C, Staniszewska S, Griffiths F, et al. The impact of patient and public involvement on UK NHS health care: a systematic review. Int J Qual Health Care 2012;24:28–38. doi:10.1093/intqhc/mzr066
- 24. Dent M, Pahor M. Patient involvement in Europe: a comparative framework. J Health Organ Manag 2015;29:546–55. doi:10.1108/JHOM-05-2015-0078
- 25. Fumagalli LP, Radaelli G, Lettieri E, et al. Patient empowerment and its neighbours: clarifying the boundaries and their mutual relationships. Health Policy 2015;119:384–94. doi:10.1016/j.healthpol.2014.10.017
- 26. Coulter A, Fitzpatrick R, Cornwell J. The Point of Care: Measures of Patients’ Experience in Hospital: Purposes, Methods and Uses. London: The King’s Fund, 2009.
- 27. Robert G, Cornwell J. Rethinking policy approaches to measuring and improving patient experience. J Health Serv Res Policy 2013;18:67–9. doi:10.1177/1355819612473583
- 28. Beattie M, Murphy DJ, Atherton I, et al. Instruments to measure patient experience of healthcare quality in hospitals: a systematic review. Syst Rev 2015;4:97–118. doi:10.1186/s13643-015-0089-0
- 29. Gleeson H, Calderon A, Swami V, et al. Systematic review of approaches to using patient experience data for quality improvement in healthcare settings. BMJ Open 2016;6:e011907. doi:10.1136/bmjopen-2016-011907
- 30. Tsianakas V, Maben J, Wiseman T, et al. Using patients’ experiences to identify priorities for quality improvement in breast cancer care: patient narratives, surveys or both? BMC Health Serv Res 2012;12:271–81. doi:10.1186/1472-6963-12-271
- 31. Edwards KJ, Walker K, Duff J. Instruments to measure the hospital inpatient experience: a literature review. Patient Exp J 2015;2:77–85.
- 32. Tritter JQ. Revolution or evolution: the challenges of conceptualizing patient and public involvement in a consumerist world. Health Expect 2009;12:275–87. doi:10.1111/j.1369-7625.2009.00564.x
- 33. Morrow E, Cotterell P, Robert G, et al. Mechanisms can help to use patients’ experiences of chronic disease in research and practice: an interpretive synthesis. J Clin Epidemiol 2013;66:856–64. doi:10.1016/j.jclinepi.2012.12.019
- 34. Coulter A. Understanding the experience of illness and treatment. In: Ziebland S, Coulter A, Calabrese JD, et al, eds. Understanding and Using Health Experiences: Improving Patient Care. Oxford: Oxford University Press, 2013:6–15.
- 35. DeCourcy A, West E, Barron D. The National Adult Inpatient Survey conducted in the English National Health Service from 2002 to 2009: how have the data been used and what do we know as a result? BMC Health Serv Res 2012;12:71–82. doi:10.1186/1472-6963-12-71
- 36. Attree P, Morris S, Payne S, et al. Exploring the influence of service user involvement on health and social care services for cancer. Health Expect 2011;14:48–58. doi:10.1111/j.1369-7625.2010.00620.x
- 37. Eisenhardt KM. Building theories from case study research. Acad Manage Rev 1989;14:532–50.
- 38. Eisenhardt KM, Graebner ME. Theory building from cases: opportunities and challenges. Acad Manage J 2007;50:25–32. doi:10.5465/AMJ.2007.24160888
- 39. May T. Social Research: Issues, Methods and Process. Maidenhead: Open University Press, 2011.
- 40. Rohlfing I. Case Studies and Causal Inference: An Integrative Framework. Basingstoke: Palgrave Macmillan, 2012.
- 41. Tsoukas H. The validity of idiographic research explanations. Acad Manage Rev 1989;14:551–61.
- 42. Locock L, Ferlie E, Dopson S, et al. Research design: ‘upscaling’ qualitative research. In: Dopson S, Fitzgerald L, eds. Knowledge to Action? Evidence-Based Health Care in Context. Oxford: Oxford University Press, 2005:48–78.
- 43. Byrne D. Complex realist and configurational approaches to cases: a radical synthesis. In: Byrne D, Ragin CC, eds. The Sage Handbook of Case-Based Methods. London: Sage Publications Ltd, 2009:101–12.
- 44. Rueschemeyer D. Can one or a few cases yield theoretical gains? In: Mahoney J, Rueschemeyer D, eds. Comparative Historical Analysis in the Social Sciences. Cambridge: Cambridge University Press, 2003:305–36.
- 45. Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods 2006;5:80–92.
- 46. Yin RK. Case Study Research: Design and Methods. Thousand Oaks, CA: Sage Publications Inc, 2009.
- 47. Ritchie J, Spencer L. Qualitative data analysis for applied policy research. In: Bryman A, Burgess RG, eds. Analyzing Qualitative Data. London: Routledge, 1994:173–94.
- 48. Fiss PC. Case studies and the configurational analysis of organizational phenomena. In: Byrne D, Ragin CC, eds. The Sage Handbook of Case-Based Methods. London: Sage Publications Ltd, 2009:424–40.