Journal of the Royal Society of Medicine
2008 Dec 1;101(12):598–604. doi: 10.1258/jrsm.2008.080103

Can clinicians benefit from patient satisfaction surveys? Evaluating the NSF for Older People, 2005–2006

Steve Iliffe 1, Jane Wilcock 1, Jill Manthorpe 2, Jo Moriarty 2, Michelle Cornes 2, Roger Clough 3, Les Bright 4; Older People Researching Social Issues (OPRSI)
PMCID: PMC2625387  PMID: 19092030

Summary

Background

A transformation of healthcare is underway, from a sellers' market to a consumers' market, where the satisfaction of the patient's needs is part of the definition of quality. Patient satisfaction surveys are widely used to judge service quality, but clinicians are sceptical about them because they are too often poorly designed measures that do not lead to improvements in the quality of care.

Aim

To explore the use of patient satisfaction survey data in identifying problems with the provision of inpatient care for older people.

Methods

A case study using secondary analysis of postal survey data about older people's experiences of health and social care services, obtained during the evaluation of the National Service Framework for Older People in 2005–2006. The survey asked about experiences of inpatient care and of discharge from hospital, and sought perceptions of the avoidability of the admission.

Settings and participants

A total of 4170 people aged 50 years and over returned a postal questionnaire in six local authority areas of England. Responses from 584 who had experienced a recent overnight stay in hospital are reported and discussed.

Findings

The response rate was 35%, ranging from 26% to 44% in the six areas surveyed. The great majority of those who had recent direct experience of inpatient care reported that they had been engaged in decision-making, that staff promoted their independence and maintained their dignity. There were widespread examples, however, of the opposite experiences. Discharge from hospital was problematic for about one-third of survey respondents with this experience, and there were different accounts of poorly managed discharges from all areas.

Conclusions

Case studies using local survey data can be used as formative assessments of services. The response rate to the survey and the likelihood of responder bias mean that patient satisfaction survey data of this sort cannot be used to judge or compare services in a summative way, but can highlight areas where remedial action is needed. Small-scale local surveys may seem to lack the robustness of larger studies, but do identify similar areas of concern. Commissioners and clinicians could use the findings of such surveys to inform dialogues about the quality of hospital care for older people.

Background

The National Service Framework for Older People (NSFOP) was introduced in England by the Department of Health (DH) in 2001,1 'triggered by concerns about widespread infringement of dignity and unfair discrimination in older people's access to care'. Standard two of the NSFOP addresses person-centred care, stating: 'NHS and social care services should treat older people as individuals and enable them to make choices about their own care'. Milestones were to be met by 2005, and the DH held that these were well on track.2 In re-launching the NSFOP in 2006 the DH re-focused attention on older people's experiences and in particular on enhancing their confidence that 'in all care settings older people will be treated with respect for their dignity and their human rights'.3 Later inspection by the Healthcare Commission exploring general inpatient experiences identified continuing deficits in this area.4

In 2004–2005 the Healthcare Commission reviewed the implementation of the NSFOP, using a rapid appraisal approach5 that included a postal survey of patient satisfaction together with public listening events, focus groups and individual interviews. This mixed methodology was used to build a picture of service performance at local level (where information about the quality of services is often lacking6), and to guide interventions to improve the quality of care.4 The survey component of the appraisal was used to reach older people who might not respond to invitations to participate in public or group events, and to seek opinions about the person-centredness (or otherwise) of hospital inpatient care. The results of this survey informed the mid-point review of the NSFOP7 but have not been reported in detail until now.

Inpatient experience may be difficult to access, biased and incomplete. The views of people who have had inpatient care may be most valuable after the inpatient episode, when they are better able to comment on their experiences, though clearly with hindsight. Satisfaction surveys have been criticized for their limited ability to capture nuances of experience, their tendency to underestimate the extent of dissatisfaction,8 responder biases that exclude those with the worst health and greatest need9 and their weakness as outcome measures.10 Nevertheless they are widely used in health services research11 and in the regulation of services, for a number of reasons. First, service users have strong views about whether care is good or bad, are the best judges of aspects of care such as interpersonal relations and communication, attach high importance to being encouraged to ask questions about their treatment, and value having their choices explained.12 Second, a transformation of healthcare is underway, from a sellers' market to a consumers' market where the satisfaction of patients' needs is part of the definition of quality. Finally, there is the ideological reason that, in a democratic society, service users should have the right to influence decisions and activities affecting them.13

Clinicians are sceptical about the methods used to evaluate the services that they provide,14 contrasting the scientific rigour of the methods used to evaluate treatments with the subjectivity of methods used to evaluate services. Clinicians have been particularly critical of patient satisfaction surveys, partly because clinicians communicate regularly with their patients and hear their concerns15 and partly because satisfaction surveys have often been flawed, poorly designed measures of service quality that did not lead to improvements in the quality of care.16 However, there is an inexorable trend towards evaluating the quality of services and publicly reporting the findings, so commissioners and clinicians seeking to improve the quality of care need information that allows for remedial action before a regulatory body intervenes. Cleary has made a plea for clinicians to embrace patient satisfaction surveys as a way of stimulating and guiding quality improvement efforts rather than as a threatening mechanism for identifying 'bad apples'.15 In the absence of resources for complex public consultation mechanisms, one way to carry out this kind of screening for problems in service provision is to use a questionnaire survey.17

This paper explores the utility of such an approach by using secondary analysis of data from a survey carried out as part of the Healthcare Commission's review of the National Service Framework for Older People as a case study of satisfaction surveys. Case studies are used extensively in social sciences research and evaluation studies. They are particularly useful when investigators have little control over events, and when the focus is on contemporary phenomena within a 'real-life' context.18 The methodological problems of such a survey are discussed, as well as the potential benefits of this approach for commissioners and service providers.

Methods

The Healthcare Commission, together with the other inspectors of public services in health and social care, the Commission for Social Care Inspection (CSCI) and the Audit Commission in England, undertook a mid-point review of the National Service Framework for Older People (NSFOP) in 2004. A team of independent researchers was engaged to elicit the perspectives of older people about the impact and effectiveness of the NSFOP. Older people were involved from the start in the design of the evaluation, its implementation and in the analysis of findings19 through the involvement of the research co-operative Older People Researching Social Issues (OPRSI).20

Settings

A sample of localities in England was selected by the joint inspection bodies.21 These included urban and rural areas with diverse populations, including areas with high proportions of older people and those with declining numbers of older people, covering six local authority areas. A mixed methodology approach was used, including a postal survey. Data collection took place in 2005–2006.

Postal survey

This was constructed to answer the Healthcare Commission's questions about specific experiences of older people's health and social care services. Older People Researching Social Issues played a key part in formulating questions, advising on the format and piloting the questionnaire with older people, in order to address patient experiences as directly as possible,22 avoid wording that might overestimate satisfaction23 and avoid simplistic 'portmanteau' questions.24 During a six-month consultation process 10 versions of the questionnaire were produced. The questionnaire was designed to be as succinct and non-intrusive as possible, but with room for respondents to add detail of their views and experiences. It was composed in plain English, printed in large font and accompanied by freepost envelopes. Demographic details were sought, and questions were asked about experiences of inpatient care, discharge and the avoidability of admission. MREC approval was gained for the postal survey (reference number 05/MRE09/33).

The questions specifically asking about hospital care are shown in Box 1.

Box 1. Questions asked about hospital care.

Please think about your most recent experience of hospital care, if you have had more than one in the last year, or that of the person you care for.

  1. If you (or the person you care for) have been in hospital, could your inpatient treatment have been avoided?

    Can you explain your thinking if you answered ‘possibly’, ‘probably’ or ‘definitely’? Space for free text

  2. While you (or the person you care for) were in hospital, could you describe your care and treatment as:

    • Sensitive to your views, beliefs or preferences (or those of the person you care for)? Yes/No

    • Personalized, not standardized ('one size fits all')? Yes/No

    • Designed to involve you (or the person you care for) in making decisions about future care? Yes/No

    • Designed to promote independence? Yes/No

  3. If you (or the person you care for) have been in hospital, which of the answers below best describes your experience of leaving hospital?

    • My discharge from hospital was at the right time for me, and well organized

    • My discharge from hospital was at the right time for me, but badly organized

    • I went home too soon, but it was well organized

    • My discharge from hospital was too soon, and was badly organized

    • I went home later than I wanted, but it was well organized

    • I went home later than I wanted, and it was badly organized

    Can you give us any further details about this experience?

    Free-text answer

Six primary care trusts (PCTs) were asked to draw a stratified random sample of 2000 older people (aged 50 years and over) from the PCT register. The sampling procedure was similar to that followed by each PCT for the National Patient Survey (available at www.nhssurveys.org). However, for this survey the sample was stratified by age to ensure that all age groups over 50 years were adequately represented. First, the database of registered patients was separated into the following age bands: 50–64 years (required sample 700); 65–74 years (required sample 700); 75 years and over (required sample 600). Second, a random sampling technique was applied to achieve the required sample from each age band, and the PCTs arranged the mailing to ensure that responders remained anonymous to the research team.
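The two-step procedure described above can be sketched in code. This is an illustrative reconstruction, not the PCTs' actual sampling software; the band definitions follow the text, but all names and the toy register are hypothetical.

```python
import random

# Age-band stratified sampling sketch: split the register into the three
# bands described in the text, then draw a simple random sample of the
# required size from each band. (Illustrative only; upper bound 150 is an
# arbitrary cap for the open-ended 75-and-over band.)
AGE_BANDS = {               # band label -> (min age, max age, required n)
    "50-64": (50, 64, 700),
    "65-74": (65, 74, 700),
    "75+":   (75, 150, 600),
}

def draw_stratified_sample(register, seed=None):
    """register: iterable of (patient_id, age) pairs from a PCT register."""
    rng = random.Random(seed)
    sample = []
    for lo, hi, n in AGE_BANDS.values():
        eligible = [pid for pid, age in register if lo <= age <= hi]
        sample.extend(rng.sample(eligible, min(n, len(eligible))))
    return sample

# Toy register: 3200 patients aged 50-89, 80 per year of age
register = [(i, 50 + (i % 40)) for i in range(3200)]
sample = draw_stratified_sample(register, seed=1)
print(len(sample))  # 2000 (700 + 700 + 600)
```

Stratifying before sampling guarantees the 75-and-over band its required 600 invitations, which a single simple random sample of 2000 would not.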

Recipients were sent an explanatory covering letter, the survey and a consent form. Completed consent forms and questionnaires were returned by freepost to the Social Care Workforce Research Unit at King's College London, to ensure that PCTs could not identify respondents. Because the survey was one component of a time-limited rapid appraisal process no reminders were sent. Questionnaire answers were allocated locality identifiers and coded.

Analysis of quantitative and qualitative data

Quantitative data were entered into SPSS-PC to allow production of descriptive statistics and parametric and non-parametric tests as appropriate. Quality was assured by double data entry. Free-text responses to the survey were transcribed and added to a coding frame linked to the subject areas of the survey. Responses that covered more than one subject area (for example, commenting on primary and social care support after a hospital stay) were entered under both categories. The responses were read by three members of the research team from a variety of disciplinary backgrounds to consider meaning and context. Agreement about categorization was reached by discussion within the team.
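The double-data-entry quality check mentioned above works by comparing two independent entry passes field by field and flagging mismatches for resolution against the paper questionnaire. A minimal sketch, with hypothetical names and toy data:

```python
# Compare two independent data-entry passes. Each pass maps
# (respondent_id, field) -> entered value; any key where the two
# passes disagree (or appears in only one pass) is flagged.
def find_discrepancies(pass_a, pass_b):
    keys = set(pass_a) | set(pass_b)
    return sorted(k for k in keys if pass_a.get(k) != pass_b.get(k))

first_pass  = {(101, "age"): 72, (101, "sex"): "F", (102, "age"): 58}
second_pass = {(101, "age"): 72, (101, "sex"): "M", (102, "age"): 58}
print(find_discrepancies(first_pass, second_pass))  # [(101, 'sex')]
```

Any flagged key is then checked against the original questionnaire, so a keying error must occur identically in both passes to survive into the dataset.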

Results

There was an overall response rate of 35% (n=4170) for the survey of 12,000 people aged 50 years and over (the response rate ranged from 26% to 44% across the six PCTs). Thirty-six percent of those aged 50–64, 37% of those aged 65–74 and 27% of those aged 75 and over responded. The characteristics of the responders are shown in Table 1.

Table 1.

Characteristics of survey participants (n=4170)

Characteristics Responses, % (n)
Women 56% (4170)
Age group (years)
 50–64 36%
 65–74 37%
 75 and over 27% (4170)
Living alone 25% (4012)
White UK 97% (4010)
General health
 Excellent  6%
 Very Good or Good 65%
 Fair or Poor 29% (4118)
Longstanding illness 40% (4170)
Emergency support available if needed from family or other 92% (4170)
Mental health problem 8% (4083)
Heart problem 47% (4071)

The mean age of the 4170 survey respondents was 69 years (range 50–101 years) and 56% were women. Over half of respondents (61%) lived with their spouse or partner, the remainder living alone (25%), with their family (11%) or with others (3%). One percent each stated their ethnicity as White Irish, White Other or Asian, and less than 1% as Black UK, Black Caribbean or Black African. Respondents were asked to report on their general health: 29% stated that their health was 'Fair or Poor', 65% 'Very Good or Good' and 6% 'Excellent'. Less than half of the survey respondents (40%) had a disability or longstanding illness, and the majority (92%) had emergency support available from family or others if needed.

Fourteen percent of respondents had a personal experience of an overnight stay in hospital (n=584), and this group has been used for the secondary analysis; those answering for a relative or friend were excluded from the analysis. Table 2 illustrates the opinions and experiences of this group.

Table 2.

Opinions and experiences of those who had had at least one overnight stay in hospital in the 12 months prior to the survey (n=584, 14% of the total survey population)

n % (range across sites)
Could inpatient treatment have been avoided?
Definitely not 527 90.3 (82–92)
Possibly, if community services were better 15 2.6 (2–4)
Possibly, if doctors or nurses had acted earlier 15 2.6 (2–5)
Probably, if community services were better 9 1.5 (1–4)
Probably, if doctors or nurses had acted earlier 8 1.4 (0–3)
Definitely 10 1.7 (1–3)
Inpatient care and treatment was:
Sensitive to views, beliefs or preferences 478 82 (69–84)
Personalized 426 73 (65–82)
Designed to involve person in decision-making 438 75 (65–82)
Designed to promote independence 450 77 (71–87)
The experience of leaving hospital was:
Timely and well organized 403 69 (63–79)
Timely but badly organized 88 15 (11–22)
Premature but well organized 29 5 (1–7)
Premature and badly organized 35 6 (2–11)
Delayed but well organized 17 3 (2–5)
Delayed and badly organized 12 2 (1–4)

Ninety percent did not think that their admission could have been avoided, and around 80% felt that their care and treatment were sensitive to their views, beliefs or preferences, were personalized, and were designed to involve them in making decisions about future care and to promote independence. Of the small number who thought that their admission possibly or probably could have been avoided, half attributed admission to the poor quality of community services and half to the lack of timely intervention by doctors or nurses. Only 10 (2%) were certain that their admission had been avoidable. The free-text data contained a number of examples of poor hospital care in which older patients felt they were not treated with respect, and in which person-centred care seemed deficient (Box 2).

Box 2. Experiences of poor quality hospital care.

No dignity shown to the very elderly, left uncovered in wet soiled bed, some patients constantly crying out for attention. (80489 Male, 62; Retired engineering technician)

Lack of communication between all nursing staff and relatives of patients. Long delays in answering patients' buzzer for bed pan for half hour then 20 minutes before pan arrived. (80422 Male, 54; Builder)

I was in hospital recently; I was largely left alone and ignored because I cannot speak up for myself. (70205 Female, 97; Retired – housewife and mother)

Under-staffing. Lack of communication. Cancelled theatre arrangements after nil by mouth exclusion for two-thirds of day. Lack of aftercare/physio guidance. (70384 Female, 68; Housewife and mother)

Eighty-four percent of respondents felt that their discharge from hospital was timely, 11% that discharge was premature and 5% that discharge was delayed (Table 2). The free-text data yielded accounts of how well organized discharge had been for some older people, and how poorly managed for others (Box 3).

Box 3. Positive and negative experiences of hospital discharge.

Absolutely marvellous treatment for two new knee replacements. I can't praise the new hospital enough. (90344 Female, 71; School teacher)

Adequate time given for me to arrange transport to take me home. (90466 Male, 70; Retired – Purchasing manager)

Arrangements were made speedily and satisfactorily. (90335 Female, 79; Retired housing officer)

Care as an inpatient (excellent) and home amenities well provided for (promptly). (70114 Male, 69; Retired – Coach cleaner)

From Cottage Hospital back to residential home was all organized for me. (10665 Male, 94; Telephone operator)

I needed transportation on discharge which was attended to within one hour. Collected and delivered door to door. First class. (80802 Male, 67; Engineer)

I was discharged with medication and full advice given on how to use and how often. I had full details of when I had to return and what out patient treatment I could expect. (90090 Female, 63; Retired – Tea lady)

Confusion of information given – ward staff at variance with consultant. Given very little notice. (80811 Female, 62; Social worker)

Had to organize own transport /no ambulance available/not allowed to go home alone. (50596 Female, 80; Retired – Shop assistant)

No doctor in attendance to authorize discharge. (89999 Male, 69; Retired – Painter/decorator)

Seeing more than one doctor meant the right hand did not know what the left was doing, only my persistent asking got a response and eventually got me a discharge at 7:30 at night. (10559 Male, 63; Printer)

The hospital needed a bed and they gave a patient my bed while I waited in my dressing gown in the waiting room for my husband to come for me. (50520 Female, 60; Retired – Sales assistant)

Had to find own transport home. GP was not pleased as live alone, had been in bed two weeks. (90556 Female, 82; Shop assistant)

I was sent home with a package of pills I hadn't had beforehand, didn't know how I would react to them. I was told I would go home after lunch so my daughters were to wait at home for me, I finally arrived home in nightdress and wrapped in a blanket at 8:30. (70205 Female, 97; Retired – Housewife and mother)

Discharged too soon. Couldn't breath properly. Still had things wrong. (50339 Female, 68; Retired – Cleaner)

Discussion

What this study shows

This case study suggests that the majority of older users of hospital services in the six localities who responded to the invitation to comment on policy and practice were satisfied with the quality of their care. Hospital care appeared personalized for most respondents. Their responses and accounts suggest that for a large number of older people receiving inpatient care, their independence was promoted and they were engaged in decision-making. Notwithstanding the positive responses from most respondents, there were some people whose experiences of hospital care were far from person-centred, justifying the Department of Health's recent emphasis on promoting dignity in the care of older people in hospitals and other care settings.3 Problems with hospital discharge remain despite efforts over long periods of time to correct them, and according to these findings, persist for a significant group of older people needing inpatient care.

Limitations of the study

Although we were able to elicit the views of a large number of older people in this survey, the findings need to be interpreted with caution, and read with an awareness of the higher levels of satisfaction reported with advancing age.25,26 Surveys are not necessarily the best way to elicit opinions on complex issues such as quality of care, but in this case the survey included opportunities for brief descriptions of experiences to be incorporated in the responses, and was also set in the context of a multi-method rapid appraisal.27 Retrospective capture of views about the quality of services runs the risk of recall bias, but time delays in eliciting experience can be useful because satisfaction immediately after an episode of service utilization tends to reflect the quality of communication between patient and staff. Later assessments of satisfaction tend to reflect other outcomes (such as symptom control, need for repeat visits and functional ability) which may be more important.28 Despite our sampling strategy, the 75-and-over age group was under-represented among respondents.

The meaning of the study

We doubt that any clinician will be surprised by the accounts given, but the case study methods achieve authenticity29 in the sense that they are fair (all views are included) and catalytic (the findings lead to actual or potential stimulation of action, in this instance by the Healthcare Commission). The Healthcare Commission's judgement on the implementation of the NSFOP7 was that progress had been made in addressing problems of hospital care.

Although small-scale local surveys may seem to lack the robustness of larger studies, these findings identify similar areas of concern to the Picker Institute's large-scale surveys and a subsequent Healthcare Commission report using a different methodology.4 While this suggests that this case study has value as an evaluation of service performance, the trustworthiness of the data and our analysis of it depend on its credibility to others with experience of the topic, transferability to other settings, dependability (depth of description of methods, peer analysis of data, third-party evaluation of data gathering) and confirmability (by independent review of the data).30 In our view the dependability of this case study rests on the methods used (described here briefly but in more detail elsewhere21) and the confirmability through the Healthcare Commission's review and utilization of the findings. The credibility and transferability of the findings are left for the reader to judge.

Implications of the findings

The response rate to the survey is comparable to that obtained by the Picker Institute in its patient satisfaction surveys, before reminders are sent to non-responders31 but nevertheless has implications for interpretation of the findings. The greatest problem, in our view, is response bias rather than lack of population representativeness. When small-scale local surveys are used as screening tools for problems with local services rather than as an accurate representation of the spectrum of experience, low response rates only matter if problems are uncommon or rare. Our sample size was large enough to identify potential problems in hospital care. The tendency for those with the greatest needs not to respond to surveys,9 and for positive evaluations to 'hide' negative feelings,32 means that results are more indicative than descriptive. While our findings could not be used comparatively to judge 'winners and losers' among hospitals, the negative experiences revealed by the survey could be used to educate and inform hospital staff and to focus and facilitate quality improvement efforts.15

Conclusion

Service evaluation can be formative (guiding change) or summative (judging performance). Our interpretation of this case study is that it could only be formative. Small-scale patient satisfaction surveys do not measure patient satisfaction in any scientific sense, and cannot be used as robust and reliable metrics for service evaluation, but when used as formative case studies they can identify problems in service provision that may require remedial action. Clinicians and managers should avoid interpreting their findings as 'league table' results but instead use them to demonstrate that they are working collaboratively to respond to patients' concerns.

Footnotes

DECLARATIONS —

Competing interests None declared

Funding Funded by the Healthcare Commission T/03/56/MG and sponsored by King's College London

Ethical approval MREC approval was gained for the postal survey section of the evaluation (reference number 05/MRE09/33)

Guarantor SI

Contributorship JM was the lead applicant on the proposal and was overall project manager. RC was a co-applicant on the proposal, facilitated groups, trained the researchers and analysed the qualitative data. MC was the project coordinator and collated and analysed all data. LB was a co-applicant and facilitated groups and conducted interviews. SI was a co-applicant on the proposal, analysed groups and wrote up the study findings. OPRSI conducted focus groups and interviews and commented on the study findings and local reports

Acknowledgements

None

References

  • 1.Department of Health. National Service Framework for Older People. London: Department of Health; 2001. See http://www.dh.gov.uk/assetRoot/04/07/12/83/04071283.pdf. [Google Scholar]
  • 2.Department of Health. Better health in old age: Report from Professor Ian Philp. London: Department of Health; 2004. [Google Scholar]
  • 3.Department of Health. A new ambition for old age: Next steps in implementing the National Service Framework for Older People. London: Department of Health; 2006. [Google Scholar]
  • 4.Healthcare Commission. Caring for Dignity. London: Healthcare Commission; 2007. [Google Scholar]
  • 5.World Health Organization. Guidelines for Rapid Appraisal to Assess Community Health Needs. Geneva: World Health Organization; 1992. [Google Scholar]
  • 6.Robinson J, Elkan R. Health Needs Assessment. London: Churchill Livingstone; 1996. [Google Scholar]
  • 7.Healthcare Commission. Living Well in Later Life. A Review of Progress against the National Service Framework for Older People. London: Commission for Healthcare Audit and Inspection; 2006. [Google Scholar]
  • 8.Cohen G, Forbes J, Garraway M. Can different patient satisfaction survey methods yield consistent results? Comparison of three surveys. BMJ. 1996;313:841–4. doi: 10.1136/bmj.313.7061.841. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Ehnfors M, Smedby B. Patient satisfaction surveys subsequent to hospital care: problems of sampling, non-response and other losses. Qual Assur Health Care. 1993;5:19–32. doi: 10.1093/intqhc/5.1.19. [DOI] [PubMed] [Google Scholar]
  • 10.Scott A, Smith R. Keeping the customer satisfied: issues in the interpretation and use of patient satisfaction surveys. Int J Qual Health Care. 1994;6:353–9. doi: 10.1093/intqhc/6.4.353. [DOI] [PubMed] [Google Scholar]
  • 11.Ferlie E. Organisational studies. In: Fulop N, Allen P, Clarke A, Black N, editors. Studying the Organisation and Delivery of Health Services: Research Methods. London: Routledge; 2001. pp. 29–30. [Google Scholar]
  • 12.Cohen G. Age and health status in a patient satisfaction survey. Soc Sci Med. 1996;42:1085–93. doi: 10.1016/0277-9536(95)00315-0. [DOI] [PubMed] [Google Scholar]
  • 13.Vuori H. Patient satisfaction – does it matter? Int J Qual Health Care. 1991;3:183–9. doi: 10.1093/intqhc/3.3.183. [DOI] [PubMed] [Google Scholar]
  • 14.Harwood R. Evaluating the impact of the National Service Framework for Older People; qualitative science or populist propaganda? Age Ageing. 2007;36:483–5. doi: 10.1093/ageing/afm085. [DOI] [PubMed] [Google Scholar]
  • 15.Cleary PD. The increasing importance of patient surveys. BMJ. 1999;319:720–1. doi: 10.1136/bmj.319.7212.720. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Nelson CW, Niederberger J. Patient satisfaction surveys: an opportunity for total quality improvement. Hosp Health Serv Adm. 1990;35:409–27. [PubMed] [Google Scholar]
  • 17.Delbanco T. Quality of care through the patient's eyes. BMJ. 1996;313:832–3. doi: 10.1136/bmj.313.7061.832. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Yin R. Case Study Research: Design and Methods. 3rd edn. Thousand Oaks, CA: Sage Publications; 2003. [Google Scholar]
  • 19.Scott J, Bhaduri R, Sutcliffe C. Clear Voices: A Good Practice Guide to Involving Older People and Carers in Strategic Planning and Service Development. Manchester: Personal Social Services Research Unit; 2004. [Google Scholar]
  • 20.Clough R, Green B, Hawkes B, Raymond G, Bright L. Older People as Researchers: Evaluating a Participative Project. York: Joseph Rowntree Foundation; 2006. [Google Scholar]
  • 21.Klee D, Manthorpe J. Partnership in inspection: lessons from the review of the NSFOP. J Integrated Care. 2006;14:6. [Google Scholar]
  • 22.Gavin K, Turner M. Methods of surveying patients' satisfaction. BMJ. 1997;314:227. doi: 10.1136/bmj.314.7075.227. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Cohen G, Forbes J, Garraway M. Can different patient satisfaction survey methods yield consistent results? Comparison of three surveys. BMJ. 1996;313:841–4. doi: 10.1136/bmj.313.7061.841. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Carr-Hill R. The measurement of patient satisfaction. J Public Health Medicine. 1992;14:236–49. [PubMed] [Google Scholar]
  • 25.Cohen G. Age and health status in a patient satisfaction survey. Soc Sci Med. 1996;42:1085–93. doi: 10.1016/0277-9536(95)00315-0. [DOI] [PubMed] [Google Scholar]
  • 26.Calnan M, Almond S, Smith N. Ageing and public satisfaction with the health service: an analysis of recent trends. Soc Sci Med. 2003;57:757–62. doi: 10.1016/s0277-9536(03)00128-x. [DOI] [PubMed] [Google Scholar]
  • 27.Manthorpe J, Clough R, Cornes M, Bright L, Moriarty J, Iliffe S. OPRSI. Four years on: the impact of the National Service Framework for Older People on the experiences, expectations and views of older people. Age Ageing. 2007;36:501–7. doi: 10.1093/ageing/afm078. [DOI] [PubMed] [Google Scholar]
  • 28.Jackson J, Chamberlin J, Kroenke K. Predictors of patient satisfaction. Soc Sci Med. 2001;52:609–20. doi: 10.1016/s0277-9536(00)00164-7. [DOI] [PubMed] [Google Scholar]
  • 29.Guba E, Lincoln Y. Fourth Generation Evaluation. Thousand Oaks, CA: Sage Publications; 1989. [Google Scholar]
  • 30.Guba E, Lincoln Y. Effective Evaluation: Improving the Usefulness of Evaluation Results Through Responsive and Naturalistic Approaches. San Francisco, CA: Jossey-Bass; 1981. [Google Scholar]
  • 31.See http://www.pickerinstitute.org/documents/Improving%20service%20of%20care.pdf
  • 32.Williams B, Coyle J, Healy D. The meaning of patient satisfaction: an explanation of high reported levels. Soc Sci Med. 1998;47:1351–9. doi: 10.1016/s0277-9536(98)00213-5. [DOI] [PubMed] [Google Scholar]
