Abstract
Objectives:
Performance indicators are a popular mechanism for measuring the quality of healthcare to facilitate both quality improvement and systems management. Few studies make comparative assessments of different countries’ performance indicator frameworks. This study identifies and compares frameworks and performance indicators used in selected Organisation for Economic Co-operation and Development health systems to measure and report on the performance of healthcare organisations and local health systems. Countries involved are Australia, Canada, Denmark, England, the Netherlands, New Zealand, Scotland and the United States.
Methods:
Identification of comparable international indicators and analyses of their characteristics and of their broader national frameworks and contexts were undertaken. Two dimensions of indicators – that they are nationally consistent (used across the country rather than just regionally) and locally relevant (measured and reported publicly at a local level, for example, a health service) – were deemed important.
Results:
The most commonly used domains in performance frameworks were safety, effectiveness and access. The search found 401 indicators that fulfilled the ‘nationally consistent and locally relevant’ criteria. Of these, 45 indicators are reported in more than one country. Cardiovascular, surgery and mental health were the most frequently reported disease groups.
Conclusion:
These comparative data inform researchers and policymakers internationally when designing health performance frameworks and indicator sets.
Keywords: Epidemiology/public health, health performance, performance indicators, international comparison, performance frameworks, quality of care
Introduction
For more than two decades, regulators, policymakers, researchers and clinicians have endeavoured to improve the quality of healthcare by designing and applying indicators of performance. There are national and international incentives for rating the performance of health systems. The World Health Organisation (WHO)1 and others2,3 have attempted to rank health systems for the insights gained from global comparisons, while consumers have an interest in selecting the best provider for treatment for their particular condition and knowing that their taxes are being spent wisely.4 To meet these multiple demands, performance indicators (‘measurable elements of practice performance for which there is evidence or consensus that they can be used to assess the quality, and hence change of quality, of care provided’) and performance frameworks (‘conceptual frameworks that set out the rationale and design principles for an indicator set’)5,6 are typically designed to routinely monitor aspects of healthcare performance such as effectiveness, efficiency, safety and quality.7 The quest for a single composite indicator of quality, prevalent in the early days of indicator development, has largely been abandoned in favour of multidimensional frameworks.1 Indicator sets commonly contain a combination of structure, process and outcome assessments.8
The Organisation for Economic Co-operation and Development (OECD) publishes 60 internationally comparable indicators of healthcare quality.9 These are useful and influential. However, many countries, even those with advanced data systems, have difficulty linking practice performance to outcomes because of limitations in data availability and poor capabilities to link data. Notwithstanding these kinds of shortcomings, it is useful to assess the frameworks and performance indicators in a sample of countries for the insights this provides.
Some health systems have moved faster than others in adopting performance indicators as tools for quality improvement and have made details of their indicators, and the systems for applying them, publicly available at national, regional or institutional levels. We purposively selected eight prominent health systems for review and assessment, choosing exemplars in using indicators and in making their data and performance systems available: Australia, Canada, Denmark, England, the Netherlands, New Zealand, Scotland and the United States. At the time of our review, all had made progress in selecting or applying indicators to measure or stimulate improved performance, and most had developed a framework for conceptualising performance improvement or indicator use.
This research aims to identify and analyse indicators and their frameworks which report on the performance of healthcare organisations and local health areas. This will provide comparative cases and information on progress for the benefit of regulators, policymakers and researchers within those countries and elsewhere, but is of particular use to policymakers interested in constructing future frameworks.
Methods
We searched for relevant performance indicators and their domains across the eight countries. Following this, we analysed performance indicators that were nationally consistent (used across the country rather than just regionally), locally relevant (measured and reported publicly at a local level, for example, a health service) and measured patient-level metrics. We conducted our study in four stages.
Stage 1: identify comparable nations using performance indicators to monitor and report on healthcare
To make comparison relevant, all selected countries are OECD members and have been classified by the World Bank as high income.10 Data on rates of health expenditure and life expectancy for 19 countries (Australia, Austria, Canada, Denmark, Finland, France, Germany, Greece, Ireland, Italy, Japan, the Netherlands, New Zealand, Norway, Portugal, Spain, Sweden, the United Kingdom and the United States) were obtained from Australia’s Health 2010,11 OECD reports9 and related national reports.12 After screening by the research team, the eight countries noted above were selected for detailed review on the basis that each had made substantial progress in using indicators and developing performance frameworks and had made their indicators and performance frameworks widely available.
Stage 2: finding performance indicators
We conducted our Internet search of performance indicator systems in the eight comparator countries in May 2013, covering the OECD and each country’s Department or Ministry of Health and associated government health organisations. A scoping table detailing the indicators by country was developed. Indicators were included if they were collected consistently on a national scale while remaining relevant and useful to local quality improvement efforts. The purpose of the table was to compile an initial ‘long-list’ of available indicators and then to identify a ‘short-list’ of those reported in multiple countries.
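To make the long-list/short-list logic concrete, the following minimal sketch applies the two inclusion criteria and then retains indicator names reported by more than one country. It is illustrative only, not the study’s actual tooling; the `Indicator` fields and example entries are hypothetical.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    country: str
    nationally_consistent: bool  # used across the country, not just regionally
    locally_relevant: bool       # measured and reported publicly at a local level

# Hypothetical scoping-table entries, for illustration only.
scoping_table = [
    Indicator("30-day AMI mortality", "Canada", True, True),
    Indicator("30-day AMI mortality", "Denmark", True, True),
    Indicator("Regional wait-time pilot", "Australia", False, True),
]

# 'Long-list': indicators meeting both inclusion criteria.
long_list = [i for i in scoping_table
             if i.nationally_consistent and i.locally_relevant]

# 'Short-list': indicator names appearing in more than one country.
names = Counter(i.name for i in long_list)
short_list = sorted(n for n, count in names.items() if count > 1)

print(short_list)  # ['30-day AMI mortality']
```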
Stage 3: detailed review of selected performance indicators
The performance indicators were subjected to a detailed assessment and were classified by the level at which they apply (community, hospital or population), country of origin and clinical specialty.
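A sketch of this classification step follows, again illustrative only (the records and field names are hypothetical); it shows how an indicator assigned to more than one level is tallied once per level, which is why the level counts in Table 4 sum to more than the 401 indicators found.

```python
from collections import Counter

# Hypothetical classified records, for illustration only.
classified = [
    {"name": "Unplanned readmission within 28 days",
     "levels": {"hospital"}, "country": "Australia", "specialty": "Surgery"},
    {"name": "HbA1c monitoring in diabetes",
     "levels": {"community", "hospital"}, "country": "England",
     "specialty": "Endocrine disease"},
]

# Tally each indicator once per level it applies to.
level_counts = Counter(level for rec in classified for level in rec["levels"])
specialty_counts = Counter(rec["specialty"] for rec in classified)

print(level_counts)      # Counter({'hospital': 2, 'community': 1})
print(specialty_counts)  # Counter({'Surgery': 1, 'Endocrine disease': 1})
```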
Stage 4: country-specific frameworks
The health system performance frameworks for each country were reviewed together with their accompanying online and published documentation. Domains within the performance frameworks were compared.
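The domain comparison amounts to counting, for each self-reported domain label, how many frameworks list it, as in Table 2. Below is a minimal sketch with hypothetical, abbreviated domain listings; note that labels are compared verbatim, so semantic overlap (e.g. ‘access’ vs. ‘accessibility’) is not resolved here.

```python
from collections import Counter

# Hypothetical, abbreviated listings; full self-reported listings are in Table 2.
frameworks = {
    "Australia - PAF": {"safety", "effectiveness", "access", "efficiency"},
    "Canada": {"accessibility", "effectiveness", "safety"},
    "Scotland": {"safe", "person-centred", "effective"},
}

# Count how many frameworks claim each verbatim domain label.
domain_counts = Counter(d for domains in frameworks.values() for d in domains)
for domain, n in domain_counts.most_common():
    print(f"{domain}: {n}")
# 'effectiveness' and 'safety' score 2; 'safe' and 'effective' count
# separately, illustrating the verbatim-label caveat.
```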
Results
Performance indicator frameworks
A summary of each country’s approach to performance indicator use is shown in Tables 1 and 2. Most of the eight countries have an overarching framework for the selection and reporting of indicators, which establishes the broader aims of their implementation activity and plays a large role in indicator selection and use. The number and focus of frameworks varied greatly between the eight countries, but typically included reference to both monitoring and improving the quality and efficiency of the healthcare system. There appears to be considerable overlap between the definitions of many of the domains, such as effectiveness and appropriateness. Indicators are sometimes also used to promote consumer choice at a regional or local level.
Table 1.
 | Australia | Canada | Denmark | England | The Netherlands | New Zealand | Scotland | United States
---|---|---|---|---|---|---|---|---|
Estimated population (rank)a,13 | 22,262,501 (55) | 34,568,211 (37) | 5,556,452 (11) | 53,900,00014 (22)b | 16,807,037 (64) | 4,365,113 (125) | 5,300,00014 (22)b | 316,668,567 (3) |
Life expectancy at birth: overall years (rank) | 82 (10) | 82 (13) | 79 (48) | 80 (30)b | 81 (21) | 81 (25) | 80 (30)b | 79 (51)
Infant mortality: deaths per 1000 live births (rank) | 4.49 (190) | 4.78 (182) | 4.14 (197) | 4.5 (189)b | 3.69 (205) | 4.65 (145) | 4.5 (189)b | 5.9 (174)
GDP ($US) (rank) | 986.7 billion (19) | 1.513 trillion (14) | 213.6 billion (55) | 2.375 trillion (9)b | 718.6 billion (24) | 134.2 billion (64) | 2.375 trillion (9)b | 15.94 trillion (2)
GDP per capita ($US) (rank) | 42,400 (94) | 43,400 (142) | 38,300 (32) | 37,500 (34)b | 42,900 (23) | 30,200 (50) | 37,500 (34)b | 50,700 (14)
Healthcare expenditure (%GDP) (rank) | 8.7 (2010) (48) | 11.3 (15) | 11.4 (14) | 9.6 (32)b | 11.9 (7) | 10.1 (30) | 9.6 (32)b | 17.9 (2)
Type of health system | Universal coverage – Medicare; voluntary private insurance available | Publicly funded – Medicare provides universal coverage for all hospital and physician services; out-of-pocket expenses for dental, optometry and pharmaceuticals; voluntary private insurance available | Publicly funded – out-of-pocket expenses for dental, optometry and pharmaceuticals; voluntary private insurance available | Publicly funded – NHS; voluntary private insurance available | Universal coverage ensured – mix of public and private insurance | Publicly funded; voluntary private insurance available | Publicly funded – NHS; voluntary private insurance available | Public and private insurance – majority private insurance
Health system performance frameworks | PAF and ROGS provide key conceptual principles | Framework is conceptualised across four dimensions: (1) health status, (2) non-medical determinants of health, (3) health system performance and (4) community and health system characteristics | No framework as yet | NHS Outcomes Framework; CCG Outcomes Indicator Set; QOF | Overarching framework to meet four needs: (1) staying healthy, (2) getting better, (3) living independently with a chronic illness and (4) end-of-life care | Six health targets (three focus on patient access and three on prevention); Primary Health Organisation targets; quality and safety markers; Atlas of Healthcare Variation | 12 Quality Outcome Indicators; HEAT targets and other measurements at local and national levels | Two locally reported: The Commonwealth Fund (no set framework; reports across health dimensions, see below) and Hospital Compare (no framework; reports on seven dimensions, see below)
Dimensions/domains reported | PAF – safety, effectiveness, appropriateness, quality, access, efficiency, equity, competence, capability, continuity, responsiveness, sustainability; ROGS – effectiveness, appropriateness, quality, access, efficiency, equity | Eight domains: (a) acceptability, (b) accessibility, (c) appropriateness, (d) competence, (e) continuity, (f) effectiveness, (g) efficiency and (h) safety | Under development | NHS Outcomes – five domains: premature death, quality of life, recovery, positive experience and care/safety; CCG – adds to the overarching NHS Outcomes; QOF – four domains: clinical, organisational, patient care experiences and additional services | Three overarching themes: (1) quality of care, (2) access to care and (3) healthcare expenditure | Diverse themes; Atlas domains: maternity, gout, demography, cardiovascular disease, polypharmacy and surgical procedures | Described as Quality Ambitions: safe, person-centred and effective | The Commonwealth Fund – four domains: access, prevention and treatment, costs and potentially avoidable hospital use, and health outcomes; Hospital Compare – seven dimensions: general information, timely and effective care, readmissions, complications and death, use of medical imaging, survey of patients’ experiences, and Medicare payment and number of Medicare patients
Framework purpose | ROGS, PAF: to support improved local-level performance assessment; to support a safe, high-quality Australian health system through improved transparency and accountability | To determine (1) the health of Canadians and (2) how well the health system performs, operating on the principles of providing reporting that is secure, respects Canadians’ privacy and is consistent, relevant, flexible, integrated, user-friendly and accessible | N/A | NHS Outcomes Framework and CCG Outcomes Indicator Set: to provide a national-level overview of how well the NHS is performing; to provide an accountability mechanism between the Secretary of State for Health and the NHS Commissioning Board for the effective spend of some £95 billion of public money; and to act as a catalyst for driving up quality throughout the NHS by encouraging a change in culture and behaviour. The QOF is not about performance management per se, but about incentivising and rewarding good practice | Used to compare healthcare system performance across years and countries, against policy and procedure and, where possible, between healthcare providers | Health targets are a set of national performance measures designed to improve the performance of health services that reflect significant public and government priorities; they provide a focus for action for DHBs and are focussed on accountability, not quality improvement. Primary Health Organisation targets aim to improve the health of enrolled populations and reduce inequalities in health outcomes through supporting clinical governance and rewarding quality improvement within PHOs; improvements in performance against a range of nationally consistent indicators result in incentive payments to PHOs. QSMs will be used to evaluate the success of the national safety campaign and determine whether the desired changes in practice and reductions in harm and cost have occurred. The Atlas of Healthcare Variation aims to stimulate debate by highlighting variation, rather than making judgements about why variation exists or whether it is appropriate, leading to improvements in healthcare services | To structure and coordinate the range of measurements taken across NHS Scotland: the 12 QOIs are used for national reporting on longer term progress towards the Quality Ambitions and Quality Outcomes (intended as indicators of quality, with no associated targets); HEAT targets describe the specific, short-term priority areas for focussed action in support of the Quality Outcomes | Commonwealth Fund: uses comparative data to assess the performance of healthcare systems, establishes priorities for improvement and sets achievement targets. Hospital Compare: to help stimulate and support improvements in the quality of care delivered by Medicare hospitals through the distribution of objective, easy-to-understand data on hospital performance, together with quality information from consumer perspectives
Data sources | Multiple data sources as identified in the data plan 2013–2016: Australian Institute of Health and Welfare national data holdings; National Partnership Agreement data submissions; Australian Bureau of Statistics data; other collections | Statistics Canada; CIHI; Canadian Hospital Reporting Project | Clinical Quality Development Programme (RKKP), individual registries and databases, Sundhed.dk | Health and Social Care Information Centre; Royal College of Physicians | Dutch Hospital Databank | Health Quality and Safety Commission/Atlas of Healthcare Variation; Primary Health Organisation Performance Programme | Information Services Division Scotland; Scottish Government | Main sources include the Centers for Medicare and Medicaid Services, The Joint Commission, Centers for Disease Control and Prevention, other Medicare data and data from within Hospital Referral Regions
DHB: District Health Boards; GDP: gross domestic product; N/A: not applicable; NHS: National Health Service; PAF: Performance and Accountability Framework; PHO: Primary Health Organisations; ROGS: Report on Government Services; QOF: Quality and Outcomes Framework; CCG: Clinical Commissioning Group; QOI: Quality Outcome Indicator; QSM: quality and safety marker; CIHI: Canadian Institute for Health Information.
a. Rank refers to the CIA World Factbook,15 comparing the country to the rest of the world.
b. Figures are for the United Kingdom as a whole, not England or Scotland specifically.
Table 2.
 | Australia – PAF | Australia – ROGS | England – High Quality Care for All | England – NHS Outcomes Framework | Canada – Canadian Health Indicator Framework | The Netherlands – dimensions of healthcare performance | The Netherlands – healthcare needs | Scotland – Quality Measurement Framework | United States – Agency for Healthcare Research and Quality | United States – Commonwealth Fund | OECD | Total
---|---|---|---|---|---|---|---|---|---|---|---|---|
Effectiveness | X | X | X | X | X | X | X | X | 8 | |||
Access | X | X | X | X | X | X | X | 7 | ||||
Safety | X | X | X | X | X | X | X | 7 | ||||
Efficiency | X | X | X | X (‘Efficiency and governance’) | X | 5 |
Quality | X | X | X | X | 4 | |||||||
Appropriateness | X | X | X | X | 4 | |||||||
Outcomes of care/health improvement | Three domains relate to outcomes | Four domains relate to outcomes | X | X | 4 | |||||||
Patient-centred/experience | X | X | X | X | 4 | |||||||
Cost | X | X | X | 3 | ||||||||
Equity | X | X | X | 3 | ||||||||
Responsiveness | X | X | X | 3 | ||||||||
Competence/capability | X | X | 2 | |||||||||
Continuity | X | X | 2 | |||||||||
Timely | X | 1 | ||||||||||
Acceptability | X | 1 | ||||||||||
Sustainability | X | 1 | ||||||||||
Avoidable hospital use | X | 1 |
PAF: Performance and Accountability Framework; ROGS: Report on Government Services; NHS: National Health Service; OECD: Organisation for Economic Co-operation and Development.
In Australia, the National Health Performance Authority (NHPA)16 was established under the Australian National Health Reform Act 201117 as an independent portfolio agency to monitor and report on healthcare system performance; it has since been merged with the Australian Institute of Health and Welfare. NHPA commenced operations in 2012. As part of its Strategic Plan 2012–2015,16 NHPA was required to regularly review its Performance and Accountability Framework (PAF) to ensure it remained relevant and continued to address the needs of the Australian public for high-quality healthcare. The PAF consists of 48 national indicators: 31 for Medicare Locals, now called Primary Health Networks (geographically based primary care co-ordinating agencies), and 17 for the performance of Local Hospital Networks and hospitals18 (see Table 1).
The Canadian framework has two main goals: to determine (1) the health of Canadians and (2) how well the health system performs, operating on published principles that reports respect Canadians’ privacy and are consistent, relevant, flexible, integrated, user-friendly and accessible.19,20 The indicator framework is conceptualised in terms of the provision of high-quality comparative information across four dimensions. Within these, eight domains of health system performance are defined.19,21
Denmark does not have a formal framework, but one is currently being developed. Instead, as a proxy framework, the Danish Institute for Quality and Accreditation in Healthcare (IKAS) manages the Danish Healthcare Quality Programme (DDKM), a national accreditation and standards-based programme. At the time of the study, this provided advanced indicators and applied them throughout the country. The standards are overseen by the International Society for Quality in Health Care (ISQua).22 The Danish National Indicator Project (DNIP) merged with the Clinical Quality Development Programme (RKKP) in 2010.8 Although Denmark lacks a formal framework, the DNIP manual outlines the thinking behind its clinical indicators and planned future indicators: (1) to improve prevention, diagnostics, treatment and rehabilitation; (2) to provide documentation for setting priorities and (3) to create an information resource for patients and consumers.
An example of a framework which operates at multiple geographical levels is that used in England’s National Health Service (NHS). This comprises three performance frameworks: the NHS Outcomes Framework, which focuses on performance and accountability; the Clinical Commissioning Group (CCG) Outcomes Indicator Set, aimed at helping CCGs in planning and benchmarking and at providing information to consumers; and the Quality and Outcomes Framework (QOF), a voluntary pay-for-performance programme for general practice in England.23
The Dutch framework, by comparison, is relatively streamlined and more consumer-focussed. Representatives of the Netherlands’ Ministry of Health, Welfare and Sport collaborated with academic researchers to develop the conceptual framework after reviewing the strategic goals of the healthcare system and the information needs of policymakers, and after studying existing theory and international experiences.24 The resulting framework divides healthcare into four specific community needs: (1) staying healthy, (2) getting better, (3) living independently with a chronic illness and (4) end-of-life care.25
New Zealand has included an atlas of healthcare variation as one of its four health system performance monitoring mechanisms,26–29 and other countries, such as the United States and the United Kingdom, use an atlas too. The atlas is organised according to clinical areas: maternity, demography, cardiovascular disease, gout, polypharmacy and surgical procedures.29 In 2013, the NZ Health Quality and Safety Commission launched its Open for Better Care campaign to measure whether planned changes in practice occur and whether they result in reduced costs and harms.28
Scotland conceptualised its Quality Measurement Framework on three levels to structure and coordinate the range of measurements taken across NHS Scotland. For monitoring long-term progress, there are 12 Quality Outcome Indicators (QOIs), which do not have specific targets; short-term priority areas are addressed by the ‘HEAT’ targets: Health improvement for the people of Scotland (H), Efficiency and governance improvements (E), Access to services (A) and Treatment appropriate to individuals (T); and a third category includes all other national and local reporting requirements.
In the United States, three identifiable entities report on healthcare performance: one nationally (the US Department of Health and Human Services’ Agency for Healthcare Research and Quality (AHRQ)), one internationally (The Commonwealth Fund) and one locally (Hospital Compare). While there is no single integrated framework, the AHRQ measures health system performance across four dimensions13 and the Commonwealth Fund aims to be a catalyst for change by identifying promising practices to help the United States create a high-performing healthcare system.14 The Commonwealth Fund spans four dimensions of health system performance: access, including insurance and affordability of care; prevention and treatment, including quality of ambulatory, acute, post-acute and palliative care; avoidable hospital use and cost, such as care that could have been avoided had the patient received appropriate care initially; and indicators assessing the extent to which people can enjoy long and healthy lives.15
Of the 11 frameworks published in five countries and the OECD, the most frequently used (self-reported) domains were effectiveness (eight), access and safety (seven each) and efficiency (five; Table 2). There is likely to be considerable overlap between the definitions of some of the domains, such as effectiveness and appropriateness. For example, the OECD considers these two domains as separate, while the Australian framework considers appropriateness a subset of effectiveness.6,18 Because of this, and the hierarchical relationships between domains within some frameworks, it is difficult to report the number of indicators used against each domain for each country.
Indicator choice
The search in eight countries found 401 indicators that fulfilled the ‘nationally consistent and locally relevant’ criteria we applied. Of these, 45 indicators are reported in more than one country. Table 3 contains a breakdown of indicators by country.
Table 3.
Country – primary source for an indicator | N |
---|---|
England | 111 |
United States | 94
Canada | 86
Denmark | 68 |
Australia | 56 |
New Zealand | 33 |
Scotland | 24 |
The Netherlands | 15 |
The search yielded 219 community-level, 231 hospital-level and 37 population-level indicators; because some indicators were classified into more than one category, these counts sum to more than the 401 indicators found (Table 4).
Table 4.
Domain | Community | Hospital | Population | Total |
---|---|---|---|---|
Access | 41 | 45 | 0 | 86 |
Patient experience | 25 | 21 | 1 | 47 |
Safety and quality | 146 | 145 | 2 | 293 |
Efficiency | 2 | 11 | 0 | 13 |
Population health outcomes | 5 | 9 | 34 | 48 |
Total | 219 | 231 | 37 | N/A |
We classified the indicators, where possible, into major disease groups (Table 5). Cardiovascular, surgery and mental health were the most frequently reported disease groups. Indicators tend to be more specifically linked to a clinical condition or disease group in some countries, such as Denmark.30
Table 5.
Major clinical grouping | N | %a |
---|---|---|
Cardiovascular disease | 62 | 15 |
Surgery | 45 | 11 |
Mental health | 42 | 10 |
Cancer | 26 | 6 |
Endocrine disease | 21 | 5 |
Respiratory disease | 20 | 5 |
Musculoskeletal | 17 | 4 |
Maternal and child health | 17 | 4 |
Emergency | 11 | 3 |
Radiology | 6 | 1 |
Chronic kidney disease | 5 | 1 |
Neurological disease | 4 | 1 |
a. Denominator = 401.
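As a check on the percentage column, the figures follow directly from the counts and the denominator of 401. A one-line computation (a sketch; the counts come from Table 5 and the denominator from the footnote):

```python
# Counts from Table 5; the denominator of 401 is given in the footnote.
counts = {"Cardiovascular disease": 62, "Surgery": 45, "Mental health": 42}
for group, n in counts.items():
    print(f"{group}: {100 * n / 401:.0f}%")
# Cardiovascular disease: 15%
# Surgery: 11%
# Mental health: 10%
```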
Review processes
Regular review of the performance framework and indicators is conducted in most of the eight countries by government or government-funded, arm’s-length bodies. For example, the Canadian framework has continually developed and evolved since its inception through collaboration among a dedicated group including the Canadian Institute for Health Information (CIHI), Statistics Canada (SC), Health Canada (HC) and representatives from other stakeholder groups.19 Similarly, the National Institute for Health and Clinical Excellence (NICE) has a key role in indicator development in England. NICE is responsible for managing the development process of clinical and public health indicators for the QOF and the CCG indicator sets.31 NICE also recommends whether existing indicators should continue to be part of the frameworks and has developed guides32,33 which set out in detail the processes involved in managing the development of indicators. Thus, indicators tend to be developed in a relatively open and transparent way, with input from individuals and stakeholder organisations. Of course, this statement masks the contested and political components of indicator development and use, which do not figure explicitly in policy documentation, academic articles or this review and are mostly country-specific.34
Reporting
The timing and mechanism of reporting on indicators were not consistent between countries, nor were they always internally consistent. This can be seen in the reporting on Canada’s health system performance, where various indicators are reported via multiple channels. There were 101 performance indicators listed on the SC website.35 The CIHI also has indicators listed under the Canadian Hospital Reporting Project (CHRP). While some of the indicators are the same as those listed by SC, there are some additional hospital performance indicators (21 in total).36 Additionally, the Government of Canada37 has a candidate list of 70 indicators that were approved for use by Federal, Provincial and Territorial Ministers of Health in 2004. However, it is difficult to gauge how many indicators are in use, because only certain indicators are selected for inclusion in the annual reports and there appear to be various degrees of overlap. For example, the Health Indicators 2013 report20 provides results for 88 indicators, 13 of which were additional indicators chosen to be included to measure performance by socioeconomic status at provincial and national levels.20 Although this appears confusing from an external perspective, variable reporting may be more effective in some instances, as CIHI addresses reporting needs by acknowledging different audiences and tailoring reporting for their requirements.
Meanwhile, in the Netherlands, a report detailing results for all 125 indicators is published every 2 years.38 From 2011, the information was updated via a website twice a year and, from 2014, the report will be published every 4 years.38 The indicators are reported at the national level, not locally; indicators reported locally (at regional levels) appear via the Dutch Hospital Database. Two dedicated websites provide consumers of healthcare with information about the quality of a service and ratings for their service:39 Independer (www.independer.nl) and Kiesbeter (www.kiesbeter.nl, ‘Choose Better’).
Similarly, in Denmark, clinical indicators are reported and a structured audit process is initiated every 6 months by audit groups of clinicians at national, regional and local levels to explain the risk-adjusted results and to plan improvements. After the audit process is complete, the data are released publicly, together with comments on the results from the audit groups.40 Reports on many of the indicators are available on the www.sundhed.dk website.
Discussion
In this study, the eight countries selected for review were using indicators and had implemented a performance framework, several of them for more than a decade. The progress they have made, and the choices taken in selecting and using indicators, might be of value for other health systems contemplating the development of their own indicators or frameworks, or modifying their performance mechanisms. A key finding was the widespread support for implementing a healthcare system performance framework. The importance of a logical, universally acceptable and viable conceptual framework to underpin development of a national performance indicator set is also emphasised in the literature.41,42 A conceptual framework sets out the rationale and design principles for the indicator set and links it to the wider health system context. It seeks to answer the question ‘performance of what – and to what ends?’6 Reasons given for developing such a framework are as follows: (1) to define the scope and dimensions of measurement;6,24 (2) to help align the performance measurement system with other policy priorities and financial exigencies;43 (3) to provide a clearly defined vision to encourage buy-in by clinicians and consumers24,43 and (4) by involving potential end-users of the data in the design of the framework, to ensure its future usefulness.44 A conceptual framework encompassing multiple domains, with balanced representation across structure, process and outcome indicators, is considered to be a key element of health reform over time.45
Although we presented the self-reported domains by country, consistency of definitions between countries and the level of semantic overlap were not tested; both are likely to be substantial. For example, in the Australian PAF, the domain appropriateness is subordinate to, or a sub-class of, the domain effectiveness.18 In Canada, these two domains are not grouped but are classified as separate concepts. Definitions for these domains are often not explicit in the policy documents.20 Definitional consistency between countries should be the subject of further research or of efforts at international standardisation.
Although there is a substantial literature dealing with the design, properties and scientific soundness of individual indicators, considerably less attention has been given to how indicators are used in practice and the impact they may have on the behaviour of health professionals or on the quality of care. While there is no settled answer to questions such as how many indicators to use, which domains to target or what the right mix of indicators should be, there is a fundamental debate centred on whether the purpose of performance indicators is accountability or quality improvement.41–43,46 Internationally, there is a split between those countries which emphasise public reporting and accountability (e.g. the UK NHS’s ‘star-ratings’ system of 2001–2005)47 and those that use results for non-publicised feedback to organisations to stimulate improvement.
It is broadly agreed that monitoring performance imposes an inherent pressure on healthcare organisations or services to improve practice.48 However, the extent to which this is accomplished is disputed and under-researched. The paucity of research examining the links between indicators and improvement may be due to the difficulty in attributing change to any particular policy initiative or intervention.49
The literature supports the use of performance indicators, suggesting that their impact is more likely to be on provider rather than consumer behaviour.43,50 However, there is a general call for more good quality studies on impact.50–52
England and Canada do the most extensive research and development work to select indicators. The role of NICE in England exemplifies a thorough and considered approach to continued indicator development. The process of review, seen especially in England, Canada and Australia, is critical to the continued development of performance indicators and their use.
Conclusion
A large amount of comparative information about international performance indicators is now available.53 We examined the systems in use in eight countries. Assessing commonalities and differences between indicator specification and application in comparable health systems may be of value to regulators, policymakers, researchers and clinicians and forms a foundation for further research into the practical impact of indicators on the quality of healthcare.
Acknowledgments
The authors would like to thank their colleagues Dr Diane Watson, Bernie Harrison, Jessica Stewart, Dr Julia Langton, Jason Boyd, Jessica Herkes and Kristiana Ludlow for their support to the project.
Footnotes
Declaration of conflicting interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.
Funding: This work was supported, in part, by the National Health Performance Authority, Australia (but no one including the Authority’s officers had a role in the study, its conduct, results, interpretation or publication of this article) and the National Health and Medical Research Council (APP1054146).
References
- 1. Smith P, Papanicolas I. Health system performance comparison: an agenda for policy, information and research. Copenhagen: WHO Regional Office for Europe, 2012.
- 2. Papanicolas I, Kringos D, Klazinga NS, et al. Health system performance comparison: new directions in research and policy. Health Policy 2013; 112: 1–3.
- 3. Veillard J, Moses McKeag A, Tipper B, et al. Methods to stimulate national and sub-national benchmarking through international health system performance comparisons: a Canadian approach. Health Policy 2013; 112: 141–147.
- 4. Giuffrida A, Gravelle H, Roland M. Measuring quality of care with routine data: avoiding confusion between performance indicators and health outcomes. Brit Med J 1999; 319: 94.
- 5. Crampton P, Perera R, Crengle S, et al. What makes a good performance indicator? Devising primary care performance indicators for New Zealand. N Z Med J 2004; 117(1191): U820.
- 6. Arah O, Westert GP, Hurst J, et al. A conceptual framework for the OECD Health Care Quality Indicators Project. Int J Qual Health Care 2006; 18: 5–13.
- 7. Institute of Medicine. Measuring the quality of health care. Washington, DC: The National Academies Press, 1999.
- 8. Donabedian A. Evaluating the quality of medical care. Milbank Q 1966; 44: 166–206.
- 9. Organisation for Economic Co-operation and Development. Statistical profile of Australia, 2013, http://www.oecd-ilibrary.org/economics/country-statistical-profile-australia_20752288-table-aus (accessed 9 February 2015).
- 10. The World Bank Group. High income: OECD 2015, http://data.worldbank.org/income-level/OEC (accessed 9 February 2015).
- 11. Australian Institute of Health and Welfare. Australia’s health 2010 (Australia’s Health Series No. 12). Canberra, ACT, Australia: AIHW, 2010.
- 12. Australian Government Department of Health and Ageing. Primary health care reform in Australia – report to support Australia’s first National Primary Health Care Strategy, 2009, http://apo.org.au/files/Resource/NPHC-supp.pdf (accessed 23 December 2016).
- 13. Agency for Healthcare Research and Quality. 2011 National healthcare quality and disparities reports, 2011, http://www.ahrq.gov/research/findings/nhqrdr/nhqrdr11/measurespec/patient_centeredness.html (accessed 9 February 2015).
- 14. The Commonwealth Fund. Foundation history, 2013, http://www.commonwealthfund.org/About-Us/Foundation-History.aspx (accessed 9 February 2015).
- 15. The Commonwealth Fund. Rising to the challenge – results from a scorecard on local health system performance, 2012, http://www.commonwealthfund.org/~/media/Files/Publications/Fund%20Report/2012/Mar/Local%20Scorecard/1578_EXEC_SUMM_Commission_rising_to_challenge_local_scorecard_2012_FINAL.pdf (accessed 9 February 2015).
- 16. National Health Performance Authority. About us, 2013, http://www.nhpa.gov.au/internet/nhpa/publishing.nsf/Content/About-us (accessed 15 February 2015).
- 17. Australian Government. National Health Reform Act 2011, 2011, http://www.comlaw.gov.au/Details/C2011C00952 (accessed 9 February 2015).
- 18. National Health Performance Authority. Performance indicator reporting, 2015, http://www.nhpa.gov.au/internet/nhpa/publishing.nsf/Content/Performance-Indicator-Reporting (accessed 9 February 2015).
- 19. Arah OA, Klazinga NS, Delnoij DMJ, et al. Conceptual frameworks for health systems performance: a quest for effectiveness, quality, and improvement. Int J Qual Health Care 2003; 15: 377–398.
- 20. Canadian Institute for Health Information. Health indicators 2013, 2013, https://secure.cihi.ca/estore/productFamily.htm?locale=en&pf=PFC2195&lang=fr (accessed 9 February 2015).
- 21. Arah OA, Westert GP. Correlates of health and healthcare performance: applying the Canadian health indicators framework at the provincial-territorial level. BMC Health Serv Res 2005; 5: 76.
- 22. Thomson S, Osborn R, Squires D, et al. International profiles of health care systems, 2012, http://www.commonwealthfund.org/Publications/Fund-Reports/2012/Nov/International-Profiles-of-Health-Care-Systems-2012.aspx (accessed 9 February 2015).
- 23. Department of Health. The NHS Outcomes Framework 2012/13. London: Department of Health, 2011.
- 24. Ten Asbroek AHA, Arah OA, Geelhoed J, et al. Developing a national performance indicator framework for the Dutch health system. Int J Qual Health Care 2004; 16(Suppl. 1): 65–71.
- 25. Van den Berg M, Heijink R, Zwakhals L, et al. Health care performance in the Netherlands: easy access, varying quality, rising costs. EuroHealth 2010; 16: 27–29.
- 26. Ministry of Health. About the health targets, 2014, http://www.health.govt.nz/new-zealand-health-system/health-targets/about-health-targets (accessed 9 February 2015).
- 27. DHB Shared Services. PHO Performance Programme, 2013, http://www.health.govt.nz/our-work/primary-health-care/primary-health-care-subsidies-and-services/pho-performance-programme-and-transition-integrated-performance-and-incentive-framework (accessed 23 December 2016).
- 28. Open for Better Care. About the campaign, 2013, http://www.open.hqsc.govt.nz/open/about-the-campaign/ (accessed 9 February 2015).
- 29. Health Quality & Safety Commission. Atlas of Healthcare Variation, 2013, http://www.hqsc.govt.nz/our-programmes/health-quality-evaluation/projects/atlas-of-healthcare-variation/ (accessed 9 February 2015).
- 30. OECD Health Division. OECD reviews of health care quality: Denmark: executive summary, assessment and recommendations, 2013, http://www.oecd.org/els/health-systems/ReviewofHealthCareQualityDENMARK_ExecutiveSummary.pdf (accessed 9 February 2015).
- 31. National Institute for Health and Clinical Excellence. About the Quality and Outcomes Framework (QOF), 2014, http://www.nice.org.uk/standards-and-indicators/qofindicators (accessed 23 February 2015).
- 32. National Institute for Health and Clinical Excellence. How we develop QOF, 2014, https://www.nice.org.uk/standards-and-indicators/How-we-develop-QOF (accessed 23 February 2015).
- 33. National Institute for Health and Clinical Excellence. Developing indicators for the Commissioning Outcomes Framework (COF): interim process guide. London: NICE, 2012.
- 34. Mannion R, Braithwaite J. Unintended consequences of performance measurement in healthcare: 20 salutary lessons from the English National Health Service. Intern Med J 2012; 42: 569–574.
- 35. Statistics Canada. Health indicators 2013: definitions, data sources and rationale, 2013, http://www.statcan.gc.ca/pub/82-221-x/2013001/tbl-eng.htm (accessed 9 February 2015).
- 36. Canadian Institute for Health Information. Canadian Hospital Reporting Project technical notes – clinical indicators, 2013, http://publications.gc.ca/collections/collection_2013/icis-cihi/H118-86-1-2013-eng.pdf (accessed 24 February 2015).
- 37. Government of Canada. Healthy Canadians: a federal report on comparable health indicators 2010, 2010, http://www.hc-sc.gc.ca/hcs-sss/pubs/system-regime/2010-fed-comp-indicat/index-eng.php (accessed 24 February 2015).
- 38. National Institute of Public Health and the Environment. Dutch healthcare performance report, 2012, http://www.rivm.nl/en/Documents_and_publications/Scientific/Reports/2012/april/Dutch_Health_Care_Performance_Report (accessed 9 February 2015).
- 39. Cordasev H, Björnberg A, Hjertqvist O. Cross border care EU: how to choose the best hospital? A study of hospital information portals in five EU countries, 2010, http://www.healthpowerhouse.com/files/HCP-study-Best-hospital-final-101119.pdf (accessed 10 October 2013).
- 40. Mainz J, Krog BR, Bjørnshave B, et al. Nationwide continuous quality improvement using clinical indicators: the Danish National Indicator Project. Int J Qual Health Care 2004; 16(Suppl. 1): i45–i50.
- 41. Raleigh VS, Foot C. Getting the measure of quality: opportunities and challenges. London: The King’s Fund, 2010.
- 42. Smith P, Mossialos E, Papanicolas I, et al. Conclusions. In: Smith P, Mossialos E, Papanicolas I, et al. (eds) Performance measurement for health system improvement. Cambridge: Cambridge University Press, 2010, pp. 675–706.
- 43. Berwick D, James B, Coye M. Connections between quality measurement and improvement. Med Care 2003; 41(1 Suppl.): I30–I38.
- 44. Collopy B. Clinical indicators in accreditation: an effective stimulus to improve patient care. Int J Qual Health Care 2000; 12: 211–216.
- 45. Adair CE, Simpson E, Casebeer AL. Performance management in healthcare: part II – state of the science findings by stage of the performance measurement process. Healthc Policy 2006; 2: 56–78.
- 46. Freeman T. Using performance indicators to improve health care quality in the public sector: a review of the literature. Health Serv Manage Res 2002; 15: 126–137.
- 47. Bevan G, Hood C. What’s measured is what matters: targets and gaming in the English public health care system. Public Admin 2006; 84(3): 517–538.
- 48. Boyce M, Browne J. Does providing feedback on patient-reported outcomes to healthcare professionals result in better outcomes for patients? A systematic review. Qual Life Res 2013; 22: 2265–2278.
- 49. Braithwaite J, Matsuyama Y, Mannion R, et al. Healthcare reform, quality and safety: perspectives, participants, partnerships and prospects in 30 countries. Farnham: Ashgate Publishing Ltd, 2015.
- 50. Shekelle P, Lim Y-W, Mattke S, et al. Does public release of performance results improve quality of care? London: Health Foundation, 2008.
- 51. Ketelaar N, Faber M, Flottorp S, et al. Public release of performance data in changing the behaviour of healthcare consumers, professionals or organisations. Cochrane Database Syst Rev 2011; 11: 1–62.
- 52. Marshall M, Shekelle P, Leatherman S, et al. The public release of performance data: what do we expect to gain? A review of the evidence. JAMA 2000; 283: 1866–1874.
- 53. Hibbert P, Hannaford N, Long J, et al. Final report: performance indicators used internationally to report publicly on healthcare organisations and local health systems. Sydney, NSW, Australia: Australian Institute of Health Innovation, University of New South Wales, 2013.