Table 2.
Benchmarking project number | Project | Rationale for the benchmarking project | Participation of centres in the programme | What impact did the benchmarking project have? | Success factors | Failure factors |
---|---|---|---|---|---|---|
BMP1 | National Practice Benchmarking | To promote measurement of clinical activity. | Decrease of participation after 8 years. | – | To make the survey more accessible, it was stratified into 2 sections (minimum data set and extra). | – |
BMP2 | Benchmarking Lombardy | To give feedback to hospitals about their performance and create a culture of evaluation. Few existing analyses of performance. | – | It helped directors draw up plans to improve critical areas. | Adjustment for diagnosis-related groups. Use of regional administrative data, so employees were more likely to accept the results. | Public disclosure of results might promote risk-averse behaviour by providers (discouraging them from accepting high-risk patients); this is subject to debate. |
BMP3 | Benchmarking for length of stay | To determine the potential for reduction in length of stay. | Full participation at the outset; hospitals later withdrew because they were engaged in other compulsory registration projects. | It helped to identify the medical specialties with the greatest potential for reducing length of stay. | – | – |
BMP4 | Benchmarking of breast cancer units | To ensure that care provided to breast cancer patients was based on clinical guidelines and quality assurance. | Participation is voluntary. Increase in specialist breast centres participating in the programme from 2003 to 2009. | Improvement in many clinical indicators and in indicators of clinical guideline use. | The project was voluntary and used anonymised data. | – |
BMP5 | Benchmarking of trauma centres | To improve outcomes of the trauma centres. | – | Highlighted the need for greater cooperation between trauma registry programme coordinators to ensure standardisation of data collection. | – | Crude hospital mortality is not a robust indicator for trauma centres as it does not take into account mortality after discharge. |
BMP6 | National Mental Health Benchmarking Project | Part of the National Mental Health Strategy. | Selection criteria were set for candidate organisations. | Modification of practices. | Commitment of management and securing of resources. Feeding benchmarking data back to clinical staff to maintain their motivation for the project. Forums for participants provided them with the opportunity to discuss the performance of their organisation and draw lessons from other organisations. | Data quality and variability in information systems/data interpretation. |
BMP7 | NCDAH | Measuring quality in palliative care is challenging. | There was a 13% increase in programme participation between round 2 and round 3. | Improvement in practices and in communication between health professionals. Participants found the exercise useful and reported that it improved care in their organisation. | Holding a workshop for participants to reflect on the data, enhance understanding and learn from others. | The feedback report should not contain too much data or overly complex information. |
BMP8 | PATH | Hospital performance assessment is a priority for the WHO Regional Office for Europe. There are few initiatives to compare hospital performance internationally. | 66 hospitals initially registered for participation, but a total of 51 hospitals actually participated. | Participation in the project facilitated the integration of different quality assessment activities and data collection. In some countries it was a stepping stone for starting quality implementation projects where none existed. | A stronger focus on international comparisons and improved validity. | Lack of personnel, expertise and time for participating hospitals to collect data. Some issues addressed by the indicators were felt to be too vague and difficult to put into practice. Competing priorities and reorganisation of hospitals. Competing or overlapping projects. |
BMP9 | Danish Indicator Project | There is no systematic outcome assessment of patient care. | Participation was mandatory for all hospitals and relevant clinical departments and units treating patients with the 8 diseases. | Increase in the percentage of patients receiving recommended care and interventions according to national practice guidelines. Improvement in waiting time. For lung cancer patients, a concerted action has been set up to improve this area. | Easy data collection: in the participating hospitals, data are collected electronically and transmitted safely via the Internet to the project's national database. In Denmark it is possible to assign a unique patient identifier, thus facilitating data collection. | – |
BMP10 | Nordic Indicator Project | Need to document and monitor the quality of health service performance. Desire for transparency and accountability. | – | It allowed evidence to be gathered about differences in prostate cancer survival rates. | – | Not all countries are equally able to track patients after hospital discharge (some countries assign unique patient identifiers, others do not). |
BMP11 | Cancer Network Management Benchmarking | The United Kingdom has the worst cancer survival rate in Europe. Benchmarking project set up to support a quality improvement strategy. | – | – | Using a mix of structure, process and outcome indicators. | – |
BMP12 | Emerge | To improve the quality of care in hospitals. | Participation was voluntary. | Quality improvement between the two cycles of benchmarking. | Interpretation of results should be guided by a culture of organisational learning rather than individual blame. | In the emergency department, there is a selection bias in the patient survey. |
BMP13 | Benchmarking NCCN | There is no information on clinical productivity. | Participating centres are members of the NCCN. | – | – | – |
BMP14 | Benchmarking CALNOC | Nurses comprise the largest group of professionals employed in hospitals, and are thus uniquely positioned to significantly influence patient safety and quality of care. | Low attrition rate (fewer than 3% of hospitals withdrawing from the project since 1998). Measures are tied to reimbursement, possibly providing financial incentives for hospitals to participate. | Participating CALNOC hospitals reduced their hospital-acquired pressure ulcer rates from 10% to 2.8%, with half of the hospitals achieving 0%. | Outcome measures include not only injuries but also near-misses, allowing the system to be corrected. CALNOC also offers educational and consultancy services in best practices, possibly contributing to the success of the project. | – |
BMP15 | Benchmarking of Comprehensive Cancer Centres | – | Centres selected by a case study. | – | Internal stakeholders must be convinced that others might have developed solutions for problems that can be translated to their own settings. Management must reserve sufficient resources for the entire benchmarking exercise. Limit the scope to a well-defined problem. Define criteria to verify the comparability of benchmarking partners based on subjects and process. Construct a format that enables a structured comparison. Use both quantitative and qualitative data for measurement. Involve stakeholders to gain consensus about the indicators. Keep indicators simple so that enough time can be spent on the analysis of the underlying processes. For indicators showing a large annual variation in outcomes, measurement over a number of years should be considered. Adapt the identified better working methods so that they comply with other practices in the organisation. | Due to different reimbursement mechanisms in different countries, the use of financial indicators is complex. When the CCC is in the middle of a complex merger. |
BMP16 | Benchmarking of radiotherapy departments | – | Centres selected by a case study. | – | – | Measuring the percentage of patients in clinical trials was not useful for radiotherapy. As some indicators were subject to large yearly variations, measuring indicators over a 1-year period does not always give a good impression of performance. |
BMP17 | Benchmarking of chemotherapy units | It is part of applying a business approach to improve the efficiency of chemotherapy by identifying best practices. | Centres selected by a case study. | Best practices from benchmarking were used in discussions about the planning system. Benchmarking made the partners aware that other organisations with similar problems were able to achieve better outcomes. | Benchmarking should not only be used to compare performance, but also to gain insight into underlying organisational principles. | Using business jargon can make medical and care professionals feel left out. |
BMP18 | Essence of Care | There are unacceptable variations in standards of care across the country, and reports showed a decline in the quality of care. The costs associated with negligence litigation might be a factor driving the development of quality initiatives. | No information. | Many improvements were reported at the local rather than the institutional level. Improved motivation of staff after receiving positive feedback. In one area the experience of the benchmarking process itself brought together sections of the division that would not normally meet. The benchmarking process has given more power and authority to matrons. | High awareness of the project among nurses. The project is seen as a top priority at the clinical governance level. | Although the definition of standards was detailed, the process for measuring them was not. Lack of dedicated funding. Lack of interest by physicians (seen as a nursing initiative). |
BMP19 | BELIEVE | To improve pain control. | Mix of public and private health facilities. Medical and surgical services. | 52 action plans were written, including training, adaptation of patient records, protocols and development of pain measurement tools. Pain control was put higher on the agenda and staff became more aware of it. Improvement of practices. | Project piloted by the CCECQA, an organisation that most hospitals are familiar with and that has a good reputation for its work. The benchmarking process was transparent. Before audit visits, a meeting was organised to share experiences. | Some questions were difficult to interpret. Too heavy a workload. |
BMP: Benchmarking Project number; CALNOC: Collaborative Alliance for Nursing Outcomes; CCC: Comprehensive Cancer Centre; CCECQA: Committee for Coordination of Evaluation and Quality in Aquitaine; NCCN: National Comprehensive Cancer Network; NCDAH: National Care of the Dying Audit – Hospitals; PATH: Performance Assessment Tool for Quality Improvement in Hospitals; WHO: World Health Organisation.
The list of indicators used in the projects is provided in Appendix 2 of the supplementary material; the full table is available on the website http://www.oeci.eu/benchcan