Table 4.
Author | Study design | Benchmarking model and/or steps | Indicators | Outcome | Impact (improvements/improvement suggestions) | Success factors |
---|---|---|---|---|---|---|
Barr [30] | Multiple-comparisons study using the National Practice Benchmark. | Partner: Industry; Content: Performance/Strategic; Purpose: Collaborative. National Practice Benchmark survey. | N.A. | The National Practice Benchmark reveals a process of change that is reasonably orderly and predictable, and demonstrates that the adaptation of the oncology community is directional, moving toward gains in efficiency as assessed by a variety of measures. | N.A. | To make the survey more accessible, it was stratified into two sections (minimum data set and extra). |
Brann [31] | Multiple-comparisons study in which representatives from child and adolescent mental health organizations used eight benchmarking forums to compare performance against relevant KPIs. | N.A. | Key performance indicators looking at outcomes in mental health | Benchmarking has the potential to illuminate intra- and inter-organizational performance. | N.A. | 1. Commitment of the management and securing of resources. 2. Feeding back benchmarking data interpretation to clinical staff to maintain their motivation for the project. 3. Forums for participants, providing them with the opportunity to discuss the performance of their organisation and draw lessons from other organisations. |
De Korne [3] | Mixed methods: a systematic literature review and semi-structured interviews. An evaluation frame (based on a systematic literature review) was applied longitudinally to a case study of nine eye hospitals that used a set of performance indicators for benchmarking. | Partner: Industry/Global; Content: Process/Performance; Purpose: Collaborative. 4P model: (1) the purposes of benchmarking; (2) the performance indicators used; (3) the participating organizations; and (4) the organizations’ performance management systems. | Performance outcome indicators | The benchmarking indicators were mostly used to initiate and facilitate discussions about management strategies. The eye hospitals in this study were not successful in reaching the goal of quantifying performance gaps or identifying best practices. | Indicators for benchmarking were not incorporated in a performance management system in any of the hospitals, nor were results discussed with or among employees; only the strategic level was involved. | Performance indicators should: 1. Represent strategically important items; 2. Be specific, measurable, acceptable, achievable, realistic, relevant, and timely (SMART); 3. Be convertible into measurable quantities; 4. Be comparable to those of other organizations; 5. Be relevant to the benchmarking purposes; 6. Have validity with respect to performance and participants, and also discriminate. |
De Korne [25] | Mixed methods: quantitative analysis included (i) analysis of fiscal year 2009 benchmarking performance data and (ii) evaluation of multiple cases by applying an evaluation frame abstracted from the literature to five U.S. eye hospitals that used a set of 10 indicators for efficiency benchmarking. Qualitative analysis of interviews, document analyses, and questionnaires. | Partner: Industry; Content: Performance; Purpose: Collaborative. 4P model: (1) the purposes of benchmarking; (2) the performance indicators used; (3) the participating organizations; and (4) the organizations’ performance management systems. | Efficiency outcome indicators | The benchmark initiative fulfilled many of its purposes, namely identifying performance gaps, implementing best practices, and stimulating exchange of knowledge. | Case studies showed that, to realize long-term efforts, broader cooperation is necessary. | 1. The 4P model suggests that reliable and comparable indicators are a precondition for a successful benchmark. 2. The case studies suggest that the development process is an important part of benchmarking. 3. Homogeneity in language, reimbursement systems, and administration. |
Schwappach [26] | Prospective and retrospective mixed methods: questionnaires; demographic, clinical, and performance data collected via specific data sheets; systematic data controlling. | Partner: Industry; Content: Process/Performance; Purpose: Collaborative. EMERGE: (1) selection of interested hospitals, participating on a voluntary basis; (2) joint development of a set of clinical performance indicators agreed upon by all parties; (3) establishment of a measurement system, development of measurement tools, and design of data collection instruments; (4) data collection in a first measurement cycle; (5) benchmarking of results and definition of shared, quantitative targets; (6) initialization of hospital-specific improvement activities; (7) data collection in a second measurement cycle; and (8) benchmarking of results. | Outcome indicator set including two main components: objective measures that evaluate clinical performance in terms of speed and accuracy of patient assessment, and patients’ experiences with care provided by EDs. | Concordance of prospective and retrospective assignments to one of three urgency categories improved significantly by 1%, and both under- and over-prioritization were reduced. Significant improvements in the reports provided by patients were achieved, mainly in structures of care provision and perceived humanity. | A number of improvement activities were initiated in individual hospitals, covering a wide range of targets from investment in ED structures to professional education and organization of care. | Interpretation of results should be guided by a culture of organisational learning rather than individual blame. |
Shaw [30] | Multiple-comparisons study using a questionnaire containing ten questions. | N.A. | Ten questions regarding ED patient utilization, wait times, services, and attending physician staffing of the nation’s PEDs. Indicators qualified as outcome indicators. | Benchmarking of PEM staffing and performance indicators by PEM directors yields important administrative data. PEDs have higher census and admission rates compared with information from all EDs, while their attending staffing, wait times, and rate of patients who leave without being seen are comparable to those of general EDs. | In larger departments, the opening of fast tracks during high-census times has allowed for shorter disposition of lower-acuity patients with good success; this has been recommended as one of the solutions to better ED throughput. | N.A. |
Van Lent [6] | Multiple-comparisons study internationally benchmarking operations management in cancer centres. | Partner: Industry/Global; Content: Performance; Purpose: Collaborative. Spendolini’s method extended to 13 steps: 1. Determine what to benchmark; 2. Form a benchmarking team; 3. Choose benchmarking partners; 4. Define and verify the main characteristics of the partners; 5. Identify stakeholders; 6. Construct a framework to structure the indicators; 7. Develop relevant and comparable indicators; 8. Stakeholders select indicators; 9. Measure the set of performance indicators; 10. Analyze performance differences; 11. Take action: results were presented in a report and recommendations were given; 12. Develop improvement plans; and 13. Implement the improvement plans. | Outcome indicators containing a numerator and a denominator. The selected indicators distinguished between the total organization level, diagnostics, surgery, medication-related treatments, radiotherapy, and research. | The results on the feasibility of benchmarking as a tool to improve hospital processes are mixed. Success factors identified are a well-defined and small project scope, partner selection based on clear criteria, stakeholder involvement, simple and well-structured indicators, and analysis of both the process and its results. | All multiple case studies provided areas for improvement, and one case study presented the results of a successful improvement project based on international benchmarking. | 1. Internal stakeholders must be convinced that others might have developed solutions for problems that can be translated to their own settings. 2. Management must reserve sufficient resources for the total benchmark. 3. Limit the scope to a well-defined problem. 4. Define criteria to verify the comparability of benchmarking partners based on subjects and process. 5. Construct a format that enables a structured comparison. 6. Use both quantitative and qualitative data for measurement. 7. Involve stakeholders to gain consensus about the indicators. 8. Keep indicators simple so that enough time can be spent on the analysis of the underlying processes. 9. For indicators showing a large annual variation in outcomes, measurement over a number of years should be considered. 10. Adapt the identified better working methods so that they comply with other practices in the organisation. |