Supplemental Digital Content is available in the text.
Keywords: health services; historically controlled study; humans; quality of health care; reimbursement, incentive; stroke
Abstract
Background and Purpose—
Hospital uptake of evidence-based stroke care is variable. We aimed to determine the impact of a multicomponent program of financial incentives and quality improvement interventions on stroke care processes.
Methods—
A prospective study of interventions to improve clinical care quality indicators at 19 hospitals in Queensland, Australia, during 2010 to 2015, compared with historical controls and 23 other Australian hospitals. After baseline routine audit and feedback (control phase, 30 months), interventions involving financial incentives (21 months) and then the addition of externally facilitated quality improvement workshops with action plan development (9 months) were implemented. The postintervention phase was 13 months. Data for the analysis were obtained from a previous continuous audit in Queensland and, subsequently, the Australian Stroke Clinical Registry. Primary outcome: change in median composite score for adherence to ≤8 indicators. Secondary outcomes: change in adherence to self-selected indicators addressed in action plans and to 4 national indicators compared with other Australian hospitals. Multivariable analyses were adjusted for clustering by hospital.
Results—
There were 17 502 patients from the intervention sites (median age, 74 years; 46% women) and 20 484 patients from other Australian hospitals. Patient characteristics were similar between groups. There was an 18% improvement in the primary outcome across the study periods (95% CI, 12%–24%). The largest improvement followed the introduction of financial incentives (14%; 95% CI, 8%–20%), while indicators addressed in action plans improved by 8% (95% CI, 1%–17%). The national score (4 indicators) improved by 17% (95% CI, 13%–20%) versus 0% change in other Australian hospitals (95% CI, −3% to 3%). Access to stroke units improved more in Queensland than in other Australian hospitals (P<0.001).
Conclusions—
The quality improvement interventions significantly improved clinical practice. The improvements were driven primarily by the financial incentives, with an additional contribution from the externally facilitated quality improvement workshops. Assessment in other regions is warranted.
Improving access to evidence-based care, including specialized stroke units, intravenous thrombolysis and thrombectomy in eligible patients with acute ischemic stroke, and medications for secondary prevention, is recommended for optimal stroke outcomes. Yet many patients fail to receive these therapies, even in well-resourced settings.1,2 Various strategies have been proposed to reduce evidence-practice gaps. These include audit and feedback, education and training, and the influence of key opinion leaders and professional groups, often with local tailoring and used in combination.3–5 However, few studies have evaluated such multicomponent strategies targeting clinician behavior within hospitals for stroke; examples include the Get-With-The-Guidelines–Stroke program and the Quality in Acute Stroke Care trial in Australia.2,6,7 There is also a lack of evidence for the use of financial incentives to improve stroke care.8
Within Australia, there are 2 main standard quality improvement activities in hospitals: the biennial national audit program (since 2007)9 and the Australian Stroke Clinical Registry (AuSCR; established in 2009),10 which captures fewer clinical processes than the audit but covers all admitted patients and includes a 90-day outcome survey. Feedback to hospitals from these programs includes personalized benchmarked reports. Within Queensland, additional strategies have included the introduction of financial incentives and the use of external quality improvement officers to help local teams develop action plans to address evidence-practice gaps (StrokeLink program). We undertook a project (Stroke123) to assess the real-world effectiveness of the new Queensland quality improvement initiatives against a background of standard activities. The primary hypothesis was that Queensland hospitals eligible for financial incentives and participating in an enhanced StrokeLink program would demonstrate greater adherence to a defined set of acute stroke clinical indicators compared with a historical control period and with other hospitals in Australia.
Methods
Design
The Stroke123 study design is described in detail elsewhere.11 In brief, a prospective, multicenter, before-and-after design was used to compare performance across hospitals,12 reported according to the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines.13 We approached all 23 public hospitals in Queensland that admitted ≥100 patients (aged ≥18 years) with acute stroke or transient ischemic attack per annum. These hospitals all contributed data to AuSCR and previous audit programs.11 Ethics approvals were obtained from Monash University (2013-2058-1867) and the Metro South Human Research Ethics Committee (HREC/13/QPAH/31), the lead ethics committee for the Queensland hospitals. Individual informed consent was not required because this was secondary use of deidentified data. The data underlying the findings are available from the corresponding author on reasonable request.
Interventions
There were 2 intervention components to change practice: financial incentives and the enhanced StrokeLink program. The financial incentive program, implemented from 2012, provided payments to hospitals for increasing access to stroke units14 (Methods in the online-only Data Supplement; Table I in the online-only Data Supplement). Payments were also contingent on a minimum proportion of data being collected within AuSCR.
The enhanced StrokeLink program included a single outreach visit to each hospital where clinical staff participated in a workshop facilitated by a quality improvement officer with a clinical background in nursing or allied health. The StrokeLink program is based on the Plan-Do-Study-Act model.15 That is, benchmarked feedback is provided to clinicians on their hospital performance and they develop action plans to improve the care they provide. A unique aspect of the enhanced program being tested from 2014 was the provision of AuSCR clinical indicator and 90-day patient outcome data from the previous 12 months. Previously, only snapshot retrospective medical record review had been provided from the biennial national audit of 40 medical records (with no long-term patient outcome information).9 Other features were an interactive discussion on actions to overcome local barriers and the provision of ongoing support via telephone or email (Methods in the online-only Data Supplement; Table II in the online-only Data Supplement).
The interventions were added iteratively; the study therefore comprised 4 phases: T0 (preintervention control, with baseline audit and feedback via StrokeLink; January 2010 to June 2012), T1 (intervention 1, addition of financial incentives; July 2012 to March 2014), T2 (intervention 2, addition of the enhanced StrokeLink program; March 2014 to November 2014), and T3 (postintervention; November 2014 to December 2015), as outlined in the Figure.11
Figure.
Stroke123 data flowchart. Composite score: calculated by dividing the total number of relevant clinical indicators achieved by the sum of eligible indicators. Action plan: structured written plan of agreed strategies to overcome identified local barriers to implementing the desired practices to improve stroke care. Primary outcome composite score: includes all indicators: stroke unit care; intravenous thrombolysis, aspirin within 48 h, and prescribed antithrombotics if ischemic stroke; mobilized on the same or following day of admission; swallow screen/assessment; prescribed antihypertensive agents at discharge; and evidence that a care plan outlining postdischarge care in the community was developed with the team and the patient (only available for T1–T3 comparisons). Action plan score: limited to the indicators nominated by hospitals in their action plans. National score: the 4 indicators collected nationally: stroke unit care, intravenous thrombolysis, prescribed antihypertensive agents at discharge, and evidence that a care plan outlining postdischarge care in the community was developed with the team and the patient.
Outcome Measures
Consistent with other studies in this field, we used composite scores that summarize in a single measure the proportion of all needed care that was given.16 Composite scores were derived from individual patient adherence to the following clinical indicators: treatment in a stroke unit; in acute ischemic stroke, use of intravenous thrombolysis, aspirin <48 hours, and prescription of antiplatelet/other antithrombotic medication at discharge; early patient mobilization; use of a swallow screen/assessment before feeding; prescription of antihypertensive medication at discharge; and use of a discharge care plan if hospital separation is to the community (not measured in T0). Composite scores were calculated by dividing the total number of relevant clinical indicators achieved by the sum of eligible indicators for each comparator cohort.
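Expressed formally (the notation here is ours, for illustration; it does not appear in the study protocol), the patient-level composite score is

$$\text{Composite}_i = \frac{\sum_{j=1}^{J} a_{ij}}{\sum_{j=1}^{J} e_{ij}},$$

where, for patient $i$ and indicator $j$ of the $J$ measured indicators, $e_{ij}=1$ if the patient was eligible for that indicator (0 otherwise) and $a_{ij}=1$ if the eligible indicator was achieved (0 otherwise, including when documentation was missing). For example, a patient eligible for 6 of the 8 indicators who received 4 of them scores $4/6\approx0.67$.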
The primary outcome was the change in a composite score derived from ≤8 indicators collected consistently across hospitals in Queensland during the study periods (primary composite score). Secondary outcomes were the change in a composite score derived from the indicators nominated by hospitals in their action plans during the same period (action plan score) and, compared with hospitals in other parts of Australia, the change in a composite score derived from 4 common national clinical indicators (national score: receipt of stroke unit care, thrombolysis in acute ischemic stroke, prescription of antihypertensive medication at discharge, and use of a discharge care plan).
Statistical Analysis
Analyses were undertaken across the 4 time periods, as described in the published Statistical Analysis Plan (Figure).11 In this pragmatic real-world study, stroke clinical indicator data came from 2 comparable sources. Historical data from Queensland collected before 2012 using methods similar to AuSCR (ie, a minimum set of variables on all admitted patients) were matched to the AuSCR, and data quality was found to be similar (Table III in the online-only Data Supplement). The main difference was the absence of 1 variable in the historical Queensland dataset: a care plan outlining postdischarge care in the community developed with the team and the patient. Therefore, the primary composite score included either 7 (T0) or 8 (T1–T3) clinical indicators. To assess whether the results may have been influenced by this 1 clinical indicator missing from the historical period, we conducted a sensitivity analysis excluding use of a discharge care plan from the composite score. The action plan score analysis was limited to the 14 hospitals that nominated indicators in their action plans (Table III in the online-only Data Supplement).
Because of the skewed distribution of composite score data, the median (interquartile limits) for all composite scores in each time period is reported, and nonparametric methods or quantile regression analyses were used to measure change in composite score by time period, adjusted for clustering by hospital. Multilevel random-effects logistic regression was used to measure change in adherence to individual clinical indicators by time period at the hospital level. Models were also generated with a test of interaction (time period×location, ie, Queensland versus other Australian hospitals) but without adjustment for patient characteristics because all quality indicators were universally applicable to eligible patients.17 Secondary analyses adjusted for available patient characteristics (age, sex, stroke severity [ability to walk on admission], and stroke type) because patient-level factors may explain ≤2% of the variability in care.16 Statistical significance was 2-sided (α<0.05). All statistical analyses were undertaken using Intercooled Stata 12.1 for Windows (StataCorp, College Station, TX).
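To make the scoring and modeling steps concrete, the sketch below reimplements them in Python (the study itself used Stata). The file name and column names (hospital_id, period, and the 8 indicator columns) are hypothetical, and logistic regression with hospital-clustered standard errors stands in for the multilevel random-effects models reported here; it is an illustration of the approach, not the study code.

```python
# Minimal sketch, assuming a patient-level extract with one row per patient
# and a categorical study-period column (T0-T3).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("stroke123_patients.csv")  # hypothetical file name

# Indicator coding assumption: 1 = achieved, 0 = eligible but not achieved
# (the study conservatively treated missing care processes as not done),
# NaN = patient not eligible for that indicator.
indicator_cols = [
    "stroke_unit", "thrombolysis", "aspirin_48h", "early_mobilization",
    "swallow_screen", "antihypertensives", "antithrombotics", "care_plan",
]

# Composite score: achieved indicators divided by eligible indicators.
df["composite"] = (
    df[indicator_cols].sum(axis=1) / df[indicator_cols].notna().sum(axis=1)
)

# Median (quantile) regression of the composite score on study period.
median_fit = smf.quantreg("composite ~ C(period)", df).fit(q=0.5)
print(median_fit.summary())

# Change in adherence to one indicator (eg, stroke unit care): logistic
# regression with standard errors clustered by hospital.
eligible = df.dropna(subset=["stroke_unit"])
logit_fit = smf.logit("stroke_unit ~ C(period)", data=eligible).fit(
    cov_type="cluster", cov_kwds={"groups": eligible["hospital_id"]}
)
print(logit_fit.summary())
```

Note that QuantReg in statsmodels does not offer clustered standard errors directly, so reproducing the cluster adjustment for the median regression would require, for example, bootstrapping by hospital.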
Results
The study included 19 of the 23 eligible Queensland hospitals (83%) and 23 others located elsewhere in Australia (Figure). Across the 19 Queensland hospitals, a total of 17 502 patients were evaluated across the time periods, with 4781 in T0 and 3815 in T3 for the primary analysis. Correspondingly, there were 20 484 patients included across the 23 non-Queensland hospitals during all study periods, including 5903 in T0 and 5188 in T3 (Figure). Patient characteristics were similar between comparator periods and between Queensland and non-Queensland hospitals (Table IV in the online-only Data Supplement).
Table 1 shows that the median primary composite score increased from 0.57 (T0) to 0.75 (T3; P<0.001). In adjusted analyses, there was a 14% improvement in median scores after the introduction of financial incentives, and this increased to 18% during the full intervention period. These trends attenuated and became nonsignificant in the sensitivity analysis excluding the discharge care plan indicator from T1 to T3 (Table V in the online-only Data Supplement). In secondary outcome analyses limited to the clinical indicators specifically nominated by participating hospitals (n=14), median action plan scores increased by 5% after the introduction of financial incentives (T0 compared with T1, nonsignificant) and by 8% during the whole intervention period (T0 compared with T3, P<0.05; Table 2). Finally, limiting the analysis to the 4 common national indicators, the national score increased by 17% (95% CI, 13%–20%) at participating Queensland hospitals but was stable across non-Queensland hospitals (0%; 95% CI, −3% to 3%; Table 3). Results were consistent in supplementary analyses adjusting for patient characteristics (Table VI in the online-only Data Supplement). In both the intervention hospitals in Queensland and the nonintervention hospitals elsewhere, individual patient indicators trended toward improvement during the study period (Table 4; Table VII in the online-only Data Supplement).
Table 1.
Changes in Primary Composite Score (All Process of Care) in Queensland (n=19) Hospitals
Table 2.
Changes in Action Plan Score* (Secondary Outcome; n=14 Hospitals)
Table 3.
Changes in National Score (Secondary Outcome)
Table 4.
Multivariable Analyses for Change in Adherence to National Clinical Indicators*
Discussion
In this multicenter observational study, substantial improvements were evident across several clinical indicators of hospital performance following implementation of a multicomponent quality improvement program (ie, financial incentives and feedback of data to hospital clinicians through StrokeLink). Improvement was greatest after a financial incentive to improve stroke unit access was introduced (14% improvement from T0 to T1). These data suggest that multicomponent, complementary interventions are more effective than single-component interventions for improving systems of stroke care. This contrasts with a recent review in which the authors concluded that there was no compelling evidence that multicomponent interventions were more effective than single-component interventions.18
Our overall finding of an 18% improvement in the primary composite score is large and at the upper limit of the treatment effect for other types of quality improvement interventions assessed in randomized trials.3 However, it appears valid because this improvement occurred against a background of stable performance in hospitals elsewhere in Australia, where conventional passive audit and feedback processes were used during the study periods (national score). We acknowledge that baseline adherence to performance indicators in non-Queensland hospitals was higher than in Queensland hospitals (Table VII in the online-only Data Supplement) and may have influenced the overall comparisons if there were ceiling effects. However, this is unlikely because performance across non-Queensland hospitals remained below the achievable benchmarks of top-performing hospitals based on 2015 AuSCR data (ranging from 5% below the benchmark for intravenous thrombolysis to 31% below for discharge care plans).19 Overall, the comparator hospitals had small improvements across several individual clinical indicators that were consistent with systematic review evidence of audit and feedback interventions (pooled median effect size, 4.3%).20
Reported effects of other hospital-based pay-for-performance interventions have been variable,8 ranging from no effect21 to a 4% to 22% improvement in programs for acute coronary syndrome, heart failure, and pneumonia.22 The success of the Queensland financial incentive program is consistent with the evidence for an implementation strategy led by clinical leaders and supported by government in a setting of low baseline performance.8 Although funding mechanisms differ across health systems, the principles of the Queensland financial incentives provide a basis for other countries to establish similar pay-for-performance schemes for increasing access to stroke units.
The association between stroke unit care and improved patient outcomes is undisputed,23 and stroke unit care is the cornerstone of best-practice hospital care. The initial targeting of stroke unit care with financial incentives was associated with concurrent improvements in adherence to some (ie, thrombolysis for acute ischemic stroke and early mobilization), but not all, measured clinical indicators. This evidence is consistent with other research in which stroke units were associated with greater adherence to evidence-based processes of care than other models of care.11,24,25
Strategies to improve clinical practice should account for local factors that might inhibit quality improvement.26 In this study, the externally facilitated enhanced StrokeLink program was associated with a nonsignificant additive increase in both the primary composite score (4%) and the action plan score (3%) after the introduction of financial incentives, an effect size consistent with published findings of comparable interventions.26 The other reported system-wide quality improvement intervention based on audit and feedback in stroke is the Get-With-The-Guidelines program.7 This intervention was similar to our baseline quality improvement interventions, but more intensive (quarterly workshops, more education), and did not include feedback of patient-level outcomes. Those authors reported a composite score increase of 10.5% during 5 years, with an annualized odds ratio of 1.18 for receiving the measures in their composite score.7 Novel components of our intervention that may have contributed to our somewhat larger improvement include feedback of long-term patient outcomes and ongoing support provided to hospitals via email and telephone. Because many countries have national registries for monitoring stroke care, including longer-term patient outcomes,27 our approach of incorporating long-term outcomes into audit and feedback could be replicated where feasible.
Strengths of our study include its multicenter design, large sample size, and continuous longitudinal data. Only 4 public hospitals in Queensland were excluded because they did not participate in either AuSCR or the enhanced StrokeLink program. The included hospitals therefore represented different service levels, from small regional units to large comprehensive stroke units, strengthening the generalizability of our results. We used historical controls in Queensland and contemporaneous controls from other states in Australia to limit threats to internal validity.
Limitations of our study include the lack of randomization and the inability to firmly distinguish the effects of the individual intervention components, although the latter was not our aim. Attribution of changes observed after an improvement intervention may be complicated by factors other than the intervention that interfere with the system or disrupt the pattern of data.28 The temporal and additive nature of the observed associations provides confidence in the whole package, as does the lack of change in the national score in the other Australian comparator hospitals. We were also conservative in handling missing indicator data, assuming the process of care did not occur, so as not to overestimate effects.
In conclusion, the complementary and iterative interventions tested in this real-world study, comprising financial incentives and externally facilitated feedback of registry data with action planning, led to substantial and clinically relevant improvements in best-practice care. Our interventions are readily transferable. Our study contributes important knowledge that can be used to help improve health systems and clinical services for stroke. The individual components and the combined quality improvement intervention deserve further study and application in other regions and countries.
Acknowledgments
We thank the hospital clinicians and patients who contributed data to the Australian Stroke Clinical Registry (AuSCR). We are also grateful to the program facilitators; staff from the AuSCR, the George Institute for Global Health, the Florey Institute of Neuroscience and Mental Health, and the Stroke Foundation; and members of the AuSCR Management and Steering Committees for their support related to this project and AuSCR (see Acknowledgments in the online-only Data Supplement for coinvestigators and other contributors to the Stroke123 project or AuSCR). We are grateful to Prof Leonid Churilov for his advice on the statistical analysis.
Sources of Funding
The Stroke123 project was supported by the National Health and Medical Research Council (NHMRC) partnership grant 1034415, cofunded by Queensland Health, Monash University, and the Stroke Foundation of Australia. Financial incentives were funded by Queensland Health through their Quality Improvement Payment program. The following authors received research fellowship support from the NHMRC: Prof Cadilhac (1063761, cofunded by the Heart Foundation), Dr Andrew (1072053), Prof Anderson (1081356), Dr C. Levi (1043913), A/Prof Lannin (1112158), Prof Thrift (1042600), and Dr Kilkenny (1109426). The Australian Stroke Clinical Registry was supported by grants from the NHMRC (1034415), Monash University, Queensland Health, the Victorian Department of Health and Human Services, the Tasmanian government, the Stroke Foundation, Allergan Australia, Ipsen, Boehringer Ingelheim, and consumer donations.
Disclosures
The authors declared the following potential conflicts of interest with respect to the research, authorship, and publication of this article: Prof Cadilhac is the current Data Custodian for the Australian Stroke Clinical Registry (AuSCR). Dr R. Grimley is the clinical lead for the Queensland Statewide Stroke Clinical Network and a member of the Stroke Foundation Clinical Council. K. Hill manages the Stroke Foundation's National Stroke Audit and Stroke Clinical Guidelines programs. Professors Anderson, Middleton, Lannin, Cadilhac, Levi, Donnan, and Thrift, Dr S.G. Faux, G. Cadigan, K. Hill, and Dr R. Grimley are members of the AuSCR Steering or Management committees. Prof Thrift is a member of the Board of the Stroke Foundation. Prof Middleton was formerly a member of the NHMRC Research Committee, having been appointed after this grant was awarded. Prof Anderson reports receiving honoraria from Boehringer Ingelheim and Amgen. The other authors report no conflicts.
Supplementary Material
Footnotes
Prof Cadilhac and Dr Grimley contributed equally.
Presented in part at the European Stroke Organisation Conference, Gothenburg, Sweden, May 16–18, 2018.
The online-only Data Supplement is available with this article at https://www.ahajournals.org/doi/suppl/10.1161/STROKEAHA.118.023075.
References
1. Cadilhac DA, Andrew NE, Lannin NA, Middleton S, Levi CR, Dewey HM, et al; Australian Stroke Clinical Registry Consortium. Quality of acute care and long-term quality of life and survival: the Australian Stroke Clinical Registry. Stroke. 2017;48:1026–1032. doi: 10.1161/STROKEAHA.116.015714.
2. LaBresh KA, Reeves MJ, Frankel MR, Albright D, Schwamm LH. Hospital treatment of patients with ischemic stroke or transient ischemic attack using the "Get With The Guidelines" program. Arch Intern Med. 2008;168:411–417. doi: 10.1001/archinternmed.2007.101.
3. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50. doi: 10.1186/1748-5908-7-50.
4. Taylor MJ, McNicholas C, Nicolay C, Darzi A, Bell D, Reed JE. Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Qual Saf. 2014;23:290–298. doi: 10.1136/bmjqs-2013-001862.
5. Kitson AL, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci. 2008;3:1. doi: 10.1186/1748-5908-3-1.
6. Middleton S, McElduff P, Ward J, Grimshaw JM, Dale S, D'Este C, et al; QASC Trialists Group. Implementation of evidence-based treatment protocols to manage fever, hyperglycaemia, and swallowing dysfunction in acute stroke (QASC): a cluster randomised controlled trial. Lancet. 2011;378:1699–1706. doi: 10.1016/S0140-6736(11)61485-2.
7. Schwamm LH, Fonarow GC, Reeves MJ, Pan W, Frankel MR, Smith EE, et al. Get With The Guidelines-Stroke is associated with sustained improvement in care for patients hospitalized with acute stroke or transient ischemic attack. Circulation. 2009;119:107–115. doi: 10.1161/CIRCULATIONAHA.108.783688.
8. Kondo K, Damberg C, Mendelson A, Motu'apuaka M, Freeman M, O'Neil M, et al. Understanding the Intervention and Implementation Factors Associated with Benefits and Harms of Pay for Performance Programs in Healthcare. Washington, DC: Department of Veterans Affairs; 2015.
9. Harris D, Cadilhac D, Hankey GJ, Hillier S, Kilkenny M, Lalor E. National stroke audit: the Australian experience. Clinical Audit. 2010;2:25–31.
10. Cadilhac DA, Lannin NA, Anderson CS, Levi CR, Faux S, Price C, et al. Protocol and pilot data for establishing the Australian Stroke Clinical Registry. Int J Stroke. 2010;5:217–226. doi: 10.1111/j.1747-4949.2010.00430.x.
11. Cadilhac DA, Andrew NE, Kilkenny MF, Hill K, Grabsch B, Lannin NA, et al. Improving quality and outcomes of stroke care in hospitals: protocol and statistical analysis plan for the Stroke123 implementation study. Int J Stroke. 2018;13:96–106. doi: 10.1177/1747493017730741.
12. Cochrane Effective Practice and Organisation of Care (EPOC). EPOC resources for review authors: what study designs can be considered for inclusion in an EPOC review and what should they be called? 2017. https://epoc.cochrane.org/resources/epoc-resources-review-authors. Accessed April 27, 2019.
13. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP; STROBE Initiative. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. J Clin Epidemiol. 2008;61:344–349. doi: 10.1016/j.jclinepi.2007.11.008.
14. Queensland Government. Health Funding Principles and Guidelines 2017–18 Financial Year. Version 1.0. Brisbane, Australia: Queensland Health; 2018.
15. Institute for Healthcare Improvement. How to Improve. www.ihi.org/knowledge/Pages/HowtoImprove/default.aspx. Accessed March 27, 2019.
16. Reeves MJ, Gargano J, Maier KS, Broderick JP, Frankel M, LaBresh KA, et al. Patient-level and hospital-level determinants of the quality of acute stroke care: a multilevel modeling approach. Stroke. 2010;41:2924–2931. doi: 10.1161/STROKEAHA.110.598664.
17. National Stroke Foundation. Clinical Guidelines for Acute Stroke Management. Melbourne: National Stroke Foundation; 2010.
18. Squires JE, Sullivan K, Eccles MP, Worswick J, Grimshaw JM. Are multifaceted interventions more effective than single-component interventions in changing health-care professionals' behaviours? An overview of systematic reviews. Implement Sci. 2014;9:152. doi: 10.1186/s13012-014-0152-6.
19. Cadilhac DA, Lannin NA, Anderson CS, Kim J, Andrew N, Kilkenny M, et al. The Australian Stroke Clinical Registry Annual Report 2015. Report number 7. Melbourne, Australia: The Florey Institute of Neuroscience and Mental Health; December 2016. https://auscr2.files.wordpress.com/2017/05/auscr_2015_annual_report_final.pdf. Accessed April 27, 2019.
20. Ivers N, Jamtvedt G, Flottorp S, Young JM, Odgaard-Jensen J, French SD, et al. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;6:CD000259. doi: 10.1002/14651858.CD000259.pub3.
21. Ryan AM, Burgess JF Jr, Pesko MF, Borden WB, Dimick JB. The early effects of Medicare's mandatory hospital pay-for-performance program. Health Serv Res. 2015;50:81–97. doi: 10.1111/1475-6773.12206.
22. Benzer JK, Young GJ, Burgess JF Jr, Baker E, Mohr DC, Charns MP, et al. Sustainability of quality improvement following removal of pay-for-performance incentives. J Gen Intern Med. 2014;29:127–132. doi: 10.1007/s11606-013-2572-4.
23. Stroke Unit Trialists' Collaboration. Organised inpatient (stroke unit) care for stroke. Cochrane Database Syst Rev. 2013;9:CD000197. doi: 10.1002/14651858.CD000197.pub3.
24. Evans A, Perez I, Harraf F, Melbourn A, Steadman J, Donaldson N, et al. Can differences in management processes explain different outcomes between stroke unit and stroke-team care? Lancet. 2001;358:1586–1592. doi: 10.1016/S0140-6736(01)06652-1.
25. Cadilhac DA, Ibrahim J, Pearce DC, Ogden KJ, McNeill J, Davis SM, et al; SCOPES Study Group. Multicenter comparison of processes of care between stroke units and conventional care wards in Australia. Stroke. 2004;35:1035–1040. doi: 10.1161/01.STR.0000125709.17337.5d.
26. Baker R, Camosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, et al. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. 2015:CD005470. doi: 10.1002/14651858.CD005470.pub3.
27. Cadilhac DA, Kim J, Lannin NA, Kapral MK, Schwamm LH, Dennis MS, et al. National stroke registries for monitoring and improving the quality of hospital care: a systematic review. Int J Stroke. 2016;11:28–40. doi: 10.1177/1747493015607523.
28. Portela MC, Pronovost PJ, Woodcock T, Carter P, Dixon-Woods M. How to study improvement interventions: a brief overview of possible study types. BMJ Qual Saf. 2015;24:325–336. doi: 10.1136/bmjqs-2014-003620.