Health Expectations: An International Journal of Public Participation in Health Care and Health Policy, 2006 May 8; 9(2): 98–109. doi: 10.1111/j.1369‐7625.2006.00377.x

Involving mental health service users in quality assurance

Jenny Weinstein 1
PMCID: PMC5060339  PMID: 16677189

Abstract

Objective  This study compares the process and outcomes of two approaches to engaging mental health (MH) service users in the quality assurance (QA) process.

Background  QA plays a significant role in health and care services, including those delivered in the voluntary sector. The importance of actively, rather than passively, involving service users in evaluation and service development has been increasingly recognized during the last decade.

Design  This retrospective small‐scale study uses document analysis to compare two QA reviews of a MH Day Centre, one that took place in 1998 as a traditional inspection‐type event and one that took place in 2000 as a collaborative process with a user‐led QA agenda.

Setting and participants  The project was undertaken with staff, volunteers and service users in a voluntary sector MH Day Centre.

Intervention  The study compares the management, style, evaluation tools and service user responses for the two reviews; it considers staff perspectives and discusses the implications of a collaborative, user‐led QA process for service development.

Results  The first, traditional top–down inspection‐type QA event had less ownership from service users and staff and served mainly to demonstrate that services met organizational standards. The second review, undertaken collaboratively with a user‐led agenda, focused on different priorities, evolved a new approach to seeking users’ views and achieved a higher response rate.

Conclusions  Because both users and staff had participated in most aspects of the second review, they were more willing to work together on an action plan to improve the service. It is suggested that the process contributed to an evolving ethos of more effective quality improvement and user involvement within the organization.

Keywords: day centres, mental health, quality assurance, user involvement

Introduction

Quality assurance

The quality assurance (QA) discourse from the USA and Japan that informed 1980s business consulting was widely adopted within the UK public sector, particularly in the sphere of health and care provision. 1 The literature on QA is too broad to consider here but key approaches designed to ensure continuous service improvement through on‐going critical monitoring and evaluation are usefully summarized on the Department of Trade and Industry website. 2

The New Labour Government of 1997 inherited and enhanced inspection‐based QA processes in the public sector, whereby specialist quality staff check the work of others and allocate grades or stars. There was also interest in using frameworks such as Total Quality Management, 3 , 4 which involve all stakeholders of an organization, and accreditation systems such as the Excellence Model, 5 Investors in People, 6 or PQASSO. 7 More radically, Peters 8 advocated a dedicated focus on the customer as the only way to improve performance.

During the 1990s, the voluntary sector came under increasing pressure to adopt quality systems. 9 Not only did external grant‐makers or donors want to ensure that their money was being used effectively but, because government was looking to the sector to provide public services under the new ‘Compact’, 10 statutory funders increasingly required some form of QA system as part of their contracts.

Nevertheless, a review of the use of quality systems in the voluntary sector 11 reported that many voluntary organizations were reluctant to adopt what was perceived as a ‘business ethos’ and only a few had established internal QA frameworks by the end of the 1990s. 12 However, the medium‐sized voluntary agency, which is the subject of this study, was a pioneer in this respect, having introduced an internal quality system in 1992.

The system was modelled on the inspectoral approach whereby services adopt a set of standards and employ ‘inspectors’ (in this case trained volunteers) to visit the service, speak to stakeholders, peruse documentation and make an evaluation of the extent to which service standards are being met. 13 The purpose was to assure the Board of Trustees that the community was receiving quality services through a process whereby professionals and trained community volunteers defined excellence.

Involvement of mental health service users in service evaluation

The 1990s saw an increasingly strong movement for the empowerment of users and carers, including those involved with mental health (MH) services. 14 , 15 , 16 The Government's modernization agenda for both health and social services exemplified in documents such as the National Service Framework for Mental Health 17 professes to have empowerment and choice for service users at its heart. Nevertheless, service users experience some MH services as disempowering, 18 while many professionals still feel uncomfortable with the idea of working in partnership with service users. 19

There is mixed evidence about how far user involvement is being achieved or, where it is happening, whether it is having any impact on service development. 20 , 21 Strategies such as inviting user representatives onto planning and evaluation groups can be problematic 22 because of power dynamics and lack of clarity about their roles and influence. Another barrier is the fear that too much involvement will lead to ‘unrealistic expectations’, 23 while a further argument states that the users who are actively involved represent only their own interests, which will differ from those of other users. Refuting this, Crawford and Rutter 24 found that, on the contrary, the views and priorities of a MH user group were very similar to those of a randomly selected sample of other mentally ill patients, while Crepaz‐Keay et al. 25 argued that debates of this kind only serve to keep service user involvement at a tokenistic level.

There is limited material specifically on user involvement in QA, but a growing literature on related areas of work such as involving users in developing and planning services, for example Harding and Oldman, 26 or involving service users in research. 27 , 28 Many of the studies cited here argue that it is not sufficient to ask service users passively to respond to surveys and consultations devised by professionals – a much more dynamic and inclusive approach to involvement is required.

Godfrey 29 argues that simply asking users for their views is not enough; he says: ‘More needs to be done to include them in all aspects of research, planning and evaluation of social care’. Godfrey 29 engaged MH service users to interview others whose voices would not normally be heard. He quotes Beeforth et al., 30 who point out that a user interviewer, who is not responsible for the delivery of services, is more likely to gain a genuinely authentic response.

Objective of the project

This project was undertaken to compare the process and outcomes of two different approaches to engaging MH service users in the QA process of a voluntary agency. Two QA reviews of the day centre, a traditional inspection‐type event in 1998 and an inclusive collaborative process in 2000, were compared retrospectively. In the context of the increasing emphasis on user involvement within MH services, a different approach was adopted for the 2000 review as a way of contributing to the kind of changes recommended in the literature cited above. The agency was keen to find a more meaningful way of involving service users in the QA process than had been achieved in 1998. These reviews are reflected on here because they may be of interest to colleagues involved in similar work in the voluntary sector or other settings. 31

Setting and participants

The UK voluntary agency where the project took place is a medium‐sized provider of social work, residential and day services to adult service users of one particular minority ethnic group.

The Mental Health Day Centre that is the focus of the study had been transformed during the late‐1990s, from an old‐fashioned rehabilitation centre providing ‘sheltered’ employment – filling boxes and envelopes – to a modern therapeutic centre offering a range of creative activities. This centre catered for approximately 15 clients per day with severe and enduring MH issues and had a registered membership that fluctuated between 70 and 90. The majority of members were between 40 and 60 years old and a high proportion had been attending for many years. A manager and four project workers staffed the Centre.

Methodology

Although this study is based on a retrospective analysis of documentation, the process of the project being described has several features of an action research model. ‘Put simply, action research is ‘‘learning by doing’’ – a group of people identifies a problem, does something to resolve it. There is a dual commitment in action research to study a system and concurrently to collaborate with members of the system in changing it’. 32 It is in the tradition of ‘participatory reflective inquiry and practice, participatory inquiry for empowerment and evaluating as direct practice’ 33 and it ‘is committed to a view of evaluative purpose which is for service users and an evaluative process which involves participatory evaluating with service users’. 34

Two QA reviews of the centre are examined: one that took place in 1998 as a traditional inspection‐type event and one that took place in 2000 and involved service users and staff in a more inclusive QA process. The study compares the main QA tools – service user questionnaires – used in the two reviews, the first compiled by professionals and the second by service users; it reflects on the process of the two reviews, contrasting evaluation style, approach, response rate and response content; it considers the perspective and roles of staff in the QA process; and it discusses the implications of a collaborative approach and user‐led agenda for service development.

The author has carefully examined all the documentation relating to the two QA reviews, including minutes of meetings, correspondence, service standards, questionnaires and reports, in order to compare their process, methodology and outcomes. The full service standards, questionnaires and QA reports are not reproduced here, both to maintain anonymity and save space and because the study focuses on their role and style (illustrated with examples) rather than their content.

The organization and day centre have been anonymized and, following consultation with academic colleagues, it was agreed that there were no ethical procedures to be pursued at the formal level.

Limitations of the study

This account is retrospective, based on the author's experience and reflections on work undertaken in 2000, drawing on the documentation and records that were kept at the time. It was always the intention to write up the project, and this was discussed as a potential joint venture with the manager of the Day Centre, who had worked closely in partnership and should take most of the credit for the project. All the relevant documentation was carefully collated but, for a number of reasons including job changes, 5 years elapsed before the article was finally written. The author, who was involved in the project management, can in no way claim to be entirely objective or that her account is free of bias.

Findings

Table 1 provides a summary of the comparison of the two reviews described in more detail below.

Table 1. Summary of comparison between the two reviews

Planning process
  1998 review: Discussion between AD Quality Assurance and AD Mental Health
  2000 review: Steering group composed of service users, day centre staff, volunteers, QA team members and external service user consultant

QA process
  1998 review: 1‐day event
  2000 review: 6‐month process

Questionnaire purpose
  1998 review: Designed directly to identify how far service standards were being met
  2000 review: Designed to explore priorities identified by service users

Questionnaire wording
  1998 review: Drafted by QA Department
  2000 review: Drafted by QA steering group; wording revised by eight service users

Questionnaire distribution
  1998 review: Distributed by Day Centre staff to users who came into the centre during a 2‐week period
  2000 review: Posted to the home address of each service user

Questionnaire return
  1998 review: Questionnaires returned to Day Centre staff
  2000 review: Questionnaires returned in sealed envelopes directly to the QA department

Consultation with staff
  1998 review: Discussion with two members of staff on the day of the QA review
  2000 review: Manager and staff represented on steering group; full staff team consulted on draft questionnaire and involved throughout the process

Response rate to user questionnaire
  1998 review: 28%
  2000 review: 73%

Report
  1998 review: Analysed and written up by QA Department with judgements and recommendations
  2000 review: Analysed and written up by QA Department setting out only the user responses, with no judgements or recommendations

Feedback of findings
  1998 review: Presented to Manager of Service and Board of Trustees with recommendations
  2000 review: Presented to service users and staff for them to decide on a plan of action

Outcome
  1998 review: Staff indifference and user cynicism; no system for monitoring implementation of recommendations
  2000 review: Service users and staff agreed an action plan and monitoring process

Review 1: 1998

In 1998, the Day Centre was evaluated by the Quality Assurance Department using the established process. The Assistant Director for QA had a discussion with the Assistant Director of Mental Health and they agreed the parameters and time scale for the review. The Day Centre Manager and staff were informed about this. A questionnaire for Centre members was derived from the agreed MH service standards and was compiled by the QA department using a Likert satisfaction scale and inviting ‘additional comments’. Examples of standards (S) and related questions (Q) are set out below to illustrate the content, style and language used.

Examples from content of 1998 standards and questionnaire

S2: To be offered a comprehensive assessment, where possible by suitable health and care staff. The assessment will be undertaken in partnership with the client and any carer or relative requested by the client to be involved.

Q2: Are you satisfied that your health, personal situation and needs have been fully assessed?

Q2a: Were you involved in the discussion about your health, situation and service needs?

S6: To participate in the development of a written care plan. The plan will outline the services to be provided and the hoped‐for outcomes for the client.

Q4: Are you satisfied that you have a written care plan which specifies the services you receive and reflects your needs and goals?

Q5: Were you involved in the development of the care plan?

S7: To participate in regular reviews of the care plan, which should take place at minimum on a 6‐monthly basis.

Q6: Is your care plan reviewed every 6 months?

Q7b: Are you satisfied that you are helped to feel comfortable at your review?

Other standards and questions covered key worker system, confidentiality, medication, activities, food, staff attitudes and complaints procedure.

Distribution and analysis of 1998 questionnaire

The Centre Manager was given 75 questionnaires, sufficient for all registered members. These were distributed over a 2‐week period prior to the review day to members who came into the centre. Members were invited either to complete the questionnaires themselves or to attend the review day and complete the questionnaire with a member of the QA team. Questionnaires were collected by Day Centre staff and returned to the QA department. The QA review day involved members of the QA team spending a day at the centre and interviewing one member of staff, the manager and five clients.

Twenty‐one of the 75 questionnaires were returned, a response rate of 28%. These were analysed and, where more than 25% of respondents expressed dissatisfaction, the standard concerned was deemed not to have been met; where between 16% and 24% expressed dissatisfaction, the standard was deemed to have been partly met.
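
The scoring rule can be expressed, purely for illustration, as a short function. This sketch was not part of the original analysis, and its handling of the boundary cases left unstated in the rule (for example, exactly 25% dissatisfied) is an assumption.

    def rate_standard(dissatisfied, responses):
        # Scoring rule as described in the text: more than 25% dissatisfied means
        # 'not met'; 16-24% dissatisfied means 'partly met'. The remaining cases,
        # including the unstated boundary of exactly 25%, are treated here as
        # 'met' - an assumption of this sketch, not a rule stated in the article.
        pct = 100.0 * dissatisfied / responses
        if pct > 25:
            return 'not met'
        if 16 <= pct <= 24:
            return 'partly met'
        return 'met'

    # With the 21 questionnaires returned in 1998:
    print(rate_standard(6, 21))   # about 29% dissatisfied -> 'not met'
    print(rate_standard(4, 21))   # about 19% dissatisfied -> 'partly met'
    print(rate_standard(1, 21))   # about 5% dissatisfied  -> 'met'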

QA 1998 report

The following extract from the QA report illustrates how the report focused on whether or not service standards were met.

The majority of the standards were found to be met. Clients particularly appreciated:

• the excellent support provided by the centre;
• the range of activities;
• the food; and
• the key worker system.

Standard 8: Being consulted and feeling comfortable at reviews

…Some clients do not always feel that they are adequately consulted or that they can feel comfortable and express their own views at reviews.

Standard 23: Programme of activities in consultation with clients

…Some clients expressed the need to have somewhere they can come when they are feeling vulnerable or unwell, without necessarily having to participate in an organized activity…

Standard 19: Caring and professional approach to clients

Some clients feel that they are sometimes patronized and spoken to ‘like children’…

The report concluded with a series of recommendations made by the QA team to address the issues raised by clients. It was submitted to the Board with an appendix presenting a detailed breakdown of the survey results in pie charts demonstrating an overall high rate of satisfaction. The Board received it as information.

Outcomes

At this time, there was no robust system for reviewing the implementation of recommendations from QA reviews. Because line managers did not have any ownership of the QA process, some were understandably hostile to the idea of internal colleagues and volunteers making judgements about their service. Service users were not informed of the outcome of reviews, and some users told QA volunteers that they were not keen to participate because ‘nothing happens’ or ‘no one takes any notice’. The Quality Manager, in an ‘independent’ capacity, had no authority to insist on changes and could only pick up on issues that recurred at the next review in 1 or 2 years’ time.

Evolution of QA

By 2000, it had been agreed that QA should not remain quasi‐independent but should become integral to the operational and strategic management of the organization. Within a QA framework, the pressure on providers to feed user views into the QA cycle raises issues of control and affects the dynamics between users and staff. 34 In this respect, staff discomfort with the system was acknowledged and it was agreed that they should be enabled to have more ownership of the process.

It was also accepted, as suggested by Pilgrim, 35 that it is more in the interests of providers than of users to hear the user's voice as ‘consumer feedback’, like ‘hotel guests completing a satisfaction survey’. The outcome of such surveys ‘gives little indication of a user's experience of care, and what exactly it is that users are pleased with or would like improved. It is unlikely that such surveys are providing us with a reasonable reflection of users’ experience of health care’. 34 Williams et al. 36 found that service users who had responded ‘satisfied’ or ‘highly satisfied’ to a survey questionnaire were much more negative in their responses when engaged in unstructured discussions about their experiences.

An alternative process was therefore devised whereby providers hold back from controlling what is actually asked about, in order to allow consideration of a wider range of issues. 34 The new approach featured many of the principles underpinning user‐focused monitoring (UFM), 37 pioneered by Dr Diana Rose. Service users involved in UFM argue that when they set the questions, conduct the interviews and compile the reports, the outcome is more interesting to other service users and has a more powerful impact on service providers. 38 Pilgrim and Waldron 39 moved a step further with a project that went beyond consultation to involve service users in direct negotiations with providers for the changes that were required.

Review 2: 2000

Steering group and agreeing the agenda for review

In order to ensure that the process was genuinely user‐led, it was decided that service users must be involved from the outset. Simpson et al., 40 who compared evaluation design and findings in research conducted by service users with those in research undertaken by MH professionals, found that service users and MH professionals often had different priorities and views about what was most important. They also concluded that the process of being involved in a meaningful way was itself valuable for service users, boosting their self‐esteem and giving them more confidence.

The responsibility of the steering group was to decide on the methodology and oversee the process of the review. It consisted of two service users elected by members of the day centre, two members of the day centre staff, one day centre volunteer, one QA volunteer, one service user representative from an independent local user group and the QA manager. The independent user group representative, a local champion of user involvement, was invited to monitor the process and ensure that it was genuinely collaborative, and the agenda was user‐led.

The first stage was to agree the agenda for the review. User members of the steering group explained that the MH service standards, while important, did not represent the issues of most pressing importance for users. To identify these, a QA meeting was convened at the day centre where the service user representatives on the steering group canvassed the views of their fellow members. The meeting of 35 members came up with a list of about 20 items and then voted on their priorities, which were:

• holiday/short break;
• opening hours;
• making a commitment or dropping in;
• what users really need/want from the centre;
• role of staff running the centre; and
• programme of the centre.

A small subgroup including the service users developed a first draft of the questionnaire which was then piloted with eight service users at the centre who made changes where they found jargon or ‘unfriendly’ language.

The user‐designed questionnaire was then taken to a staff meeting for comment. Staff's main concern was the question about holidays and short breaks. Staff did not want to raise service user expectations, knowing it would be very unlikely that the organization would fund a holiday for members or even fund a member of staff to go away with members. Similarly they were concerned about raising expectations about bank holiday opening.

The Centre Manager was very helpful here, explaining that this was a user issue: while it might not be possible for staff to be involved in these activities, if users wanted to organize holidays, short breaks or outings on bank holidays, staff could help to facilitate this; not every centre‐based activity has to be staff‐led. The staff approved the questionnaire, although they continued to have concerns about raising expectations.

Wording of 2000 questionnaire

One of the ways in which the users decided to word the questionnaire was to identify common statements made by users on a subject they had disagreed about, and to ask all users to state the degree to which they agreed or disagreed with the statements.

In order to illustrate the style and language, two examples are set out below.

Q4: At the moment, except for Mondays and Sunday afternoons, members have to make a commitment to come to the centre on a limited number of days and to attend activities on those days. Please indicate whether you agree or disagree with the following statements:

(a) It is very helpful to know that I will be attending the Centre on certain days as it gives me a structure to my week.
(b) I would prefer it if more groups were open and we were not required to make a regular commitment to them.
(c) One of the reasons our programme works well is because we all attend our groups regularly to get to know each other and trust can develop.
(d) I find it much too restrictive to make a commitment to attend on certain days and would prefer a drop in system.
(e) I would like the evenings to run more like Sundays – without having to make a commitment.

What is the main aim of [the centre] from your point of view? Please agree or disagree with the following statements.

(a) It is very important that I can feel safe and comfortable in a warm atmosphere.
(b) It is very important that [the centre] provides me with a structure to my day and activities as an alternative to work.
(c) It is very important to me to attend groups and activities to learn new skills.
(d) My main aim in attending is to maintain good mental health.
(e) The main reason why I attend is to make friends and meet people who are in the same boat as me so we can support each other.
(f) My main reason for attendance is to get a good meal at an affordable price.

Distribution and analysis of 2000 questionnaire

Users on the steering group advised that questionnaires should be distributed by post to all registered members with an accompanying letter offering them the opportunity either to complete the questionnaire themselves or to do this with support from a fellow service user or a QA volunteer. A sealable envelope addressed directly to the QA department was enclosed because user representatives advised that some users may feel concerned about staff seeing negative comments. It was agreed that QA team members would visit the centre on 3 days of the review week to be available to speak to as many members as possible.

A total of 51 completed questionnaires were received, a response rate of 73% and an increase of 45 percentage points on the 28% response to the 1998 survey. These were analysed using a simple package called Pinpoint.
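
Purely as an illustration of the kind of tabulation involved (the actual analysis was carried out in Pinpoint, whose interface is not reproduced here, and the response data below are invented), a minimal sketch might look like the following; the figure of 70 registered members is an assumption consistent with the reported 73% response rate.

    from collections import Counter

    # 70 is assumed here; the registered membership fluctuated between 70 and 90
    returned, registered = 51, 70
    print('Response rate: {:.0f}%'.format(100.0 * returned / registered))   # 73%

    # Invented answers to one agree/disagree statement, for illustration only
    answers = ['agree', 'agree', 'disagree', 'not sure', 'agree', 'disagree']
    for answer, n in Counter(answers).most_common():
        print('{}: {} ({:.0f}%)'.format(answer, n, 100.0 * n / len(answers)))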

2000 QA report

The following extracts from the report's analysis of the responses illustrate the contrast with the 1998 report: whereas that report focused on standards, this one focuses directly on user views.

Opening times

The most popular time for the centre to be open was bank holidays, with 33 members keen for the centre to open then. Religious holidays followed, although 28 members were either not sure or did not want the centre to be open then.

Making a commitment or dropping in

Members were especially keen on the structure provided by the centre and generally indicated that the commitment required to attend on specific days was helpful to them. Only nine members (18%) agreed that they found it too restrictive to have to attend on certain days and would actually prefer a drop in system. However, with regard to attending in the evenings, nearly half the participants said they would like the evenings to be run like Sundays – without having to make a commitment.

What do users really need/want from the centre?

…The main reason for attending the centre is the supportive community atmosphere and the comfort and safety of being with members’ own ethnic group. The social aspects, the positive benefits for maintaining good mental health and the provision of a structure to the day were considered almost as important. The meal was seen as the least important reason for attending.

Programme of groups and activities

When asked about the programme of groups and activities, 37 of the 51 members who replied added comments. These included:

• do not want to be pressurized to do groups;
• could be more varied trips out;
• facilitators should inspire interest in subject;
• sometimes OK but sometimes clash of personalities; and
• excellent but not enough fine music.

When asked what activities they would like that are not currently offered, members suggested: music, yoga, poetry, swimming, counselling service…

Another key difference between this and the 1998 report is that here, only the findings were presented – no judgements or recommendations were made.

The user representatives on the steering group asked members how they would like the findings fed back. At their request, a PowerPoint presentation was made to the members and they were also given individually printed copies of the report, which was then left with the members and the staff to work on an action plan.

Members were asked how they had found this new approach to QA. While expressing appreciation of the process as a whole, some users asked why they had not been involved in the analysis of the findings and the preparation of the report. This was useful learning for the author about how easy it is to slip back into ‘taking over’ as the professional.

Outcomes of Review 2000

This project could not have been undertaken without the wholehearted support and involvement of the centre manager and staff. Staff acknowledged their initial reservations about a user‐led approach, in particular their fears about raising expectations. However, following the delivery of the findings, the QA process continued within the centre for a further 6 weeks with the development of an implementation plan between staff and service users. This included staff offering to support service users who wanted to organize joint holidays or meet socially outside the opening hours of the centre and addressing the needs of a minority of service users who wanted a less structured day. The process enabled staff to see that service users were able to be realistic about the resource and financial constraints and to negotiate a range of different strategies for meeting their needs. Over time, the problems of loneliness and isolation at weekends were mitigated by the establishment of a buddy scheme linking individual service users with members of the community.

In order to overcome barriers to organizational change, the commitment of leaders is critical. 41 In this case, the enthusiasm and encouragement of the Head of Mental Health and the Director of Community Services meant that the new QA approach was only one of a number of innovative user‐focused projects that developed over the next few years. In 2005, in recognition of an innovative project in which service users and staff promote understanding and education about MH within the community, the organization was first runner‐up for a national accolade.

Discussion

The comparison of the two QA reviews seems to capture vividly the differences between a professional‐led and a user‐led process of evaluation. During the 1990s this organization was seen as a pioneer in using trained community volunteers of the same ethnic background as the service users to evaluate services and seek users’ views. At that time, processes such as person‐centred care planning, holding reviews, inviting relatives and establishing key working 42 were vital improvements to a previously institutional service and were legitimately the subject of organizational monitoring and QA review.

It was therefore illuminating to see how different the agenda of the service users was when they were asked to identify their priorities. They did not prioritize care plans, reviews or even key working. This reflects their views expressed in the 1998 QA survey that they felt uncomfortable about being part of a care plan review meeting and chimes with findings such as those of McDermott 43 that service users on the Care Programme Approach did not understand what it was or why they were placed on it.

Service users were concerned about their own quality of life as human beings and as members of a community; they wanted to go on holiday like other people and they hated feeling lonely and isolated when the Centre was closed on bank holidays. Service users appreciated having a structure to their day but it was interesting to hear that the most important aspect of the Centre, from their perspective, was that it provided a safe haven where they felt comfortable and accepted among their own community. Addressing some of these issues began a process of further development of day services within the organization along the lines recommended by the Social Exclusion Unit 44 that day services should be more outward‐looking – enabling people to participate more actively in the life of the community.

In retrospect, the author can see that the first review was undertaken as a task that needed to be completed in order to provide a report to the Board. Apart from agreeing the date of the review and agreeing how the questionnaires would be distributed, the staff were barely involved. The users were not involved at all except for those who completed questionnaires and the five users who spoke with QA volunteers. The whole process did not take more than 6 weeks. The outcome of the first review was solely the report.

The second review took 5 months from first steering group meeting to presentation of the report to the service users. All the users in the Centre were involved in the process of agreeing the agenda and methodology of the review. A very high proportion actively participated by completing questionnaires. Two service users were involved in managing the process and, anecdotally, staff mentioned that this had increased their confidence and self‐esteem.

There had been concerns about staff feeling undermined by a user‐led process. This was addressed by having staff representatives on the steering group and giving staff a final veto on the questionnaire. Their concerns about raising expectations were assuaged when they found that users were quite happy to have staff assist them in making their own holiday and bank holiday arrangements with each other. Users had needed staff support because they did not know how to take those first steps.

In the field of MH, there is a relatively weak evidence base for the impact of user involvement on organizational change because outcomes are rarely measurable, reflecting the difficulty of measuring cultural and organizational change and its sustainability. 45 While not wishing to make unsubstantiated claims, the following developments are interesting. ‘Fear of raising expectations’ had been a long‐held reason for not involving service users in planning within the organization. Subsequently, in spite of many disappointments in relation to failed planning applications and bids falling through, users have been involved at every stage in plans to transfer their centre to a new building.

The chair of the QA committee, who was also a Trustee, gained the agreement of fellow board members to a more collaborative approach to QA, in spite of their on‐going bottom line that they wanted to be assured of quality control. More radically, the new Chief Executive, who at the time of the study was the Director of Community Services, has for the first time included the involvement of service users in governance as part of the organization's vision.

This project may also have played a part in the evolution of QA within the organization helping it to develop from:

• focusing on an inspection event to focusing on a QA process;
• the customer being ‘a vague concept’ to the customer being ‘a specific person or group with specific needs’; 46 and
• ‘quality being the responsibility of the quality department to quality being the responsibility of every employee’. 46

Further work on this subject would involve service users in the analysis and reporting of findings and in any publication relating to the developments.

References

1. Wilkes J. Quality assurance. In: Pierce R, Weinstein J (eds) Innovative Education and Training for Care Professionals: A Provider's Guide. London, UK: Jessica Kingsley, 2000: 111–120.
2. DTI. The Original Quality Gurus. Available at: http://www.dti.gov/quality/gurus, accessed May 2005.
3. Born G. Process Management to Quality Improvement. New York, USA: Wiley, 1994.
4. Asher M. Managing Quality in the Service Sector. London, UK: Kogan Page, 1996.
5. European Forum for Quality Management (EFQM). The Excellence Model. Available at: http://www.efqm.org, accessed May 2005.
6. Investors in People. Available at: http://www.investorsinpeople.org, accessed May 2005.
7. Charities Evaluation Service. Practical Quality Assurance System for Small Organisations (PQASSO). Available at: http://www.ces‐vol.org.uk, accessed May 2005.
8. Peters T. The Pursuit of WOW! Every Person's Guide to Topsy‐turvy Times. London, UK: Macmillan, 1995.
9. Barclay J, Abdy M. Quality Matters: Funders and Quality in the Voluntary Sector. London, UK: NCVO, 2001.
10. Home Office Active Community Unit. Compact on Relations between Government and the Voluntary and Community Sector in England. London, UK: Home Office Active Community Unit, 1998.
11. Quality Standards Task Group. Improving our Performance: A Strategy for the Voluntary and Community Sector. London, UK: NCVO, 2004.
12. Quality Standards Task Group. Sharing Learning QSTG 1997–2004. London, UK: NCVO, 2004.
13. CSCI. Extract from website on Inspection of Social Care Services. Available at: http://www.csci.org.uk, accessed May 2005.
14. Everett B. Something is happening: the contemporary consumer and psychiatric survivor movement in historical context. The Journal of Mind and Behaviour, 1994; 15: 55–70.
15. Whittington C. Collaboration and partnership in context. In: Weinstein J, Whittington C, Leiba T (eds) Collaboration in Social Work Practice. London, UK: Jessica Kingsley, 2003: 13–38.
16. Barnes M, Bowl R. Taking over the Asylum. London, UK: Palgrave, 2001.
17. DH. The National Service Framework for Mental Health. London, UK: Department of Health, 1999.
18. Bristol MIND. User Focused Study of In‐patient Services in 3 Bristol Hospitals. Bristol, UK: Bristol MIND, 2004.
19. Leiba T, Weinstein J. Who are the participants in the collaborative process and what makes collaboration succeed or fail? In: Weinstein J, Whittington C, Leiba T (eds) Collaboration in Social Work Practice. London, UK: Jessica Kingsley, 2003: 63–82.
20. Crawford M, Aldridge T, Bhui K et al. User involvement in the planning and delivery of mental health services: a cross‐sectional survey of service users and providers. Acta Psychiatrica Scandinavica, 2003; 107: 410–414.
21. Peck E, Gulliver P, Towel D. Information, consultation or control: user involvement in mental health services in England at the turn of the century. Journal of Mental Health, 2002; 11: 441–451.
22. Robert G, Hardacre J, Locock L, Bate P, Glasby J. Redesigning mental health services: lessons on user involvement from the mental health collaborative. Health Expectations, 2003; 6: 60–71.
23. Poulton B. User involvement in identifying health needs and shaping and evaluating services: is it being realized? Journal of Advanced Nursing, 1999; 30: 1289–1296.
24. Crawford MJ, Rutter D. Are the views of mental health user groups representative of those of ‘ordinary’ patients? A cross‐sectional survey of service users and providers. Journal of Mental Health, 2004; 13: 561–568.
25. Crepaz‐Keay D, Binns C, Wilson E. Dancing with Angels. London, UK: CCETSW, 1997.
26. Harding T, Oldman H. Involving Service Users and Carers in Local Services. London, UK: National Institute of Social Work, 1996.
27. Beresford P, Wallcroft J. Psychiatric system survivors and emancipatory research: issues, overlaps and differences. In: Barnes C, Mercer G (eds) Doing Disability Research. Leeds, UK: Disability Press, 1997: 26–33.
28. Telford R, Faulkner A. Learning about service user involvement in mental health research. Journal of Mental Health, 2004; 13: 549–559.
29. Godfrey M. More than ‘involvement’: how commissioning user interviewers in the research process begins to change the balance of power. Practice, 2004; 16: 223–231.
30. Beeforth M, Conlan E, Graley R. Have We Got Views for You: User Evaluation of Case Management. London, UK: The Sainsbury Centre for Mental Health, 1994.
31. Bloor M. Addressing social problems through qualitative research. In: Silverman D (ed.) Qualitative Research: Theory, Method and Practice. London, UK: Sage, 1998: 21.
32. O'Brien R. An Overview of the Methodological Approach of Action Research, 1997. Available at: http://www.web.net/robrien/papers/arfinahtm, accessed May 2005.
33. Shaw I. Just inquiry? Research and evaluation for service users. In: Kemshall H, Littlechild R (eds) User Involvement and Participation in Social Care. London, UK: Jessica Kingsley, 2000: 29–44.
34. Edwards C, Staniszewska S. Accessing the user's perspective. Health and Social Care in the Community, 2000; 8: 417–428.
35. Pilgrim D. Protest and co‐option – the voice of mental health service users. In: Bell A, Lindley P (eds) Beyond the Water Towers: The Unfinished Revolution in Mental Health Services 1985–2005. London, UK: Sainsbury Centre for Mental Health, 2005: 23.
36. Williams B, Coyle J, Healy D. The meaning of patient satisfaction: an exploration of high reported levels. Social Science and Medicine, 1998; 47: 1351–1359.
37. User Focused Monitoring Network. Doing it for Real: A Guide to Setting up and Undertaking a User Focused Monitoring Project. London, UK: Sainsbury Centre for Mental Health, 2003. Available at: http://www.scmh.org.uk, accessed May 2005.
38. Sainsbury Centre for Mental Health. User‐focused Monitoring, 2005. Available at: http://www.scmh.org.uk, accessed May 2005.
39. Pilgrim D, Waldron L. User involvement in mental health service development: how far can it go? Journal of Mental Health, 1998; 7: 95–104.
40. Simpson EL, Barkham M, Gilbody S, House A. Involving service users as researchers for the evaluation of adult statutory mental health services. The Cochrane Database of Systematic Reviews, 2003; CD004810 (doi: 10.1002/14651858.CD004810).
41. Joseph Rowntree Findings. Increasing User Involvement in Voluntary Organizations, 2003. Available at: http://www.jrf.org.uk, accessed October 2005.
42. DOH. The Care Programme Approach for People with a Mental Illness Referred to the Psychiatric Services, HC 23 LASSL. London, UK: Department of Health, 1990.
43. McDermott G. The Care Programme Approach: a patient perspective. Mentally Ill in the Community, 1998; 3: 47–63.
44. Social Exclusion Unit. Action on Mental Health: A Guide to Promoting Social Inclusion. London, UK: Office of the Deputy Prime Minister, 2004. Available at: http://www.socialexclusion.gov.uk, accessed May 2005.
45. Rose D, Fleischmann P, Tonkiss F, Campbell P, Wykes T. Review of the Literature: User and Carer Involvement in Change Management in a Mental Health Context. Report to NHS Service Delivery and Organisation Research and Development Programme, 2003. London, UK: NHS SDO. Quoted in Social Care Institute for Excellence, Has Service User Participation Made a Difference to Social Care Services? 2005. Available at: http://www.scie.org.uk, accessed October 2005.
46. The Ohio State University College of Dentistry. Principles of Continuous Quality Improvement (PQI): A Shift from a Focus on the End Product or Service to Process, 2005. Available at: http://www.dent.ohio‐state.edu/cqm/TeamMembers/principles_of_continuous_quality…, accessed April 2005.
