Abstract
Objectives
External inspections are widely used to improve the quality of care. The effects of inspections remain unclear and little is known about how they may work. We conducted a narrative synthesis of research literature to identify mediators of change in healthcare organisations subject to external inspections.
Methods
We performed a literature search (1980–January 2020) to identify empirical studies addressing change in healthcare organisations subject to external inspection. Guided by the Consolidated Framework for Implementation Research, we performed a narrative synthesis to identify mediators of change.
Results
We included 95 studies. Accreditation was the most frequent type of inspection (n=68), followed by statutory inspections (n=19) and external peer review (n=9). Our findings suggest that the regulatory context in which the inspections take place affects how they are acted on by those being inspected. The way inspections are conducted seems to be critical for how the inspection findings are perceived and followed up. Inspections can engage and involve staff, facilitate leader engagement, improve communication and enable the creation of new networks for reflection on clinical practice. Inspections can contribute to creating an awareness of the inspected organisation’s current practice and performance gaps, and a commitment to change. Moreover, they can contribute to facilitating the planning and implementation of change, as well as self-evaluation and the use of data to evaluate performance.
Conclusions
External inspections can affect different mediators of organisational change. The extent to which, and the way in which, they do so depends on a range of factors related to the outer setting, the way inspections are conducted, and how they are perceived and acted on by the inspected organisation. To improve the quality of care, the organisational change processes need to involve and impact the way care is delivered to patients.
Keywords: change management, clinical governance, health policy, quality in health care, organisational development
Strengths and limitations of this study
This is the first review addressing how external inspections can contribute to improving the quality of care.
We used a theoretical framework to extract and analyse heterogeneous data to explore mediators of change in the inspected health organisations.
For some of the theoretical constructs, we did not identify relevant studies and some of the review findings were graded low confidence, partly because of concerns about adequacy of data.
Background
External inspections are widely used in healthcare.1 2 They are a core element in accreditation, certification and regulation.3 These heterogeneous, complex processes consist of a set of activities that are introduced into varying organisational contexts and share a common ground in that ‘some dimensions or characteristics of a healthcare provider organisation and its activities are assessed or analysed against a framework of ideas, knowledge or measures derived or developed outside that organisation’.4 The phrase ‘external inspection’ also implies that the inspection is initiated and conducted by an organisation external to the one being inspected. External inspections can serve different purposes, such as promoting accountability and transparency in a regulated society.1 A key purpose of external inspections is to reveal possible substandard performance. When they reveal substandard performance, the inspected organisations are expected to implement necessary change, thereby improving the quality of care delivery.5
Evidence on the effect of external inspections on the quality of care remains unclear and contradictory.6–10 The way in which external inspections might mediate change in organisations is poorly understood.11 12 Better knowledge of how external inspections can contribute to improving the quality of care may increase our understanding of why the effects of external inspections seem to vary and facilitate more effective ways of conducting inspections.6 11 13
We conducted a systematic review and performed a narrative synthesis to explore how external inspections can contribute to mediating change and improving the quality of care in healthcare organisations.
Theoretical framework
Quality of care is understood as a property of healthcare systems delivering care.14 15 Accordingly, improving the quality of care depends on changing the performance of the healthcare system, which implies change in organisational behaviour.16 17 Change in organisational behaviour can be understood as a complex social process that involves individuals or groups with the capacity to initiate activities that can contribute to producing change in their organisation in the presence of the appropriate antecedent conditions.18–20
If external inspection is to contribute to improving the quality of care, it needs to impact organisational change, defined as ‘any modification in organisational composition, structure or behaviour’.21 Change activities take place in an organisation of social actors; thus, these activities are accompanied by communication among these actors. Following Schmidt,22 we suggest that the communication within the organisation can affect organisational ideas, which can be understood as the substantive content of the communication about the activities that are undertaken. These ideas represent a cognitive frame for how the organisational actors view and understand their organisation and the change activities that take place within it.22 We refer to the communication about organisational ideas as organisational discursive activities. Figure 1 shows our study framework.
Figure 1. Study framework.
To explore how external inspections can affect the change and discursive activities involved in organisational change aimed at improving the quality of care, we need to identify the related theoretical constructs and explore how these constructs can be affected by external inspections.
The Consolidated Framework for Implementation Research (CFIR) is a metatheoretical framework based on the synthesis of constructs from existing theories, which identifies and defines key constructs involved in effective implementation of organisational change (see online supplementary file 1).23 The CFIR encompasses five main domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved and the process of implementation. We regard these theoretical constructs as conceptualisations that may be impacted by external inspections during the process of implementing organisational change.
Methods
Literature review
We performed a systematic literature search (1980–January 2020) to identify empirical studies addressing changes in healthcare organisations subject to external inspection. Our inclusion criteria were quantitative and qualitative studies about external inspections that included empirical data about mediators of change at an organisational level for care delivery. The studies also needed to include a method section describing how data were obtained and analysed. Because the purpose of our study was to explore mediators of change rather than to assess the effect sizes of inspections, we found it expedient to include both qualitative and quantitative studies.
We searched the following electronic databases for studies: the Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, Database of Abstracts of Reviews of Effectiveness, Medline, Embase, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), PsycINFO and Web of Knowledge. A detailed description of the searches in all of the databases is provided in online supplementary file 2.
Data extraction and analysis
An overview of the article selection process is shown in figure 2. EHo performed an initial screening of the article titles and abstracts, focusing on relevance. EHo, GSB and EHa independently reviewed the full-text articles and then discussed which articles should be included in the review.
Figure 2. Flow chart of the article selection process.
The main reasons for exclusion were that articles did not include an external inspection, did not address how the external inspection affected the organisation, were not based on empirical findings or did not address clinical care. We included 95 articles for further analysis.
The articles used a range of study designs addressing different types of external inspections in different settings. Thus, we used narrative synthesis as our analytical strategy.24 Narrative synthesis comprises four main elements: developing an initial theory of how the intervention works, developing a preliminary synthesis of findings from the included studies, exploring relationships in the data and assessing the robustness of the synthesis. Our initial theory of how external inspections might affect the involved organisations was built around the CFIR. This framework defines key constructs involved in effective implementation of change in an organisation. Accordingly, we explored how external inspections could affect the theoretical constructs identified in the framework.
For each of the included studies, EHo extracted data about the study aim, study method, participants, setting, type of inspection and main findings. EHo conducted the preliminary synthesis of the data on how external inspection affected the involved organisations. The data extraction was guided by the theoretical constructs in the CFIR and comprised written summaries of key findings and excerpts of results from the included studies that addressed the constructs in the CFIR, including illustrative quotations from qualitative studies. GSB read the included articles to validate the data extraction.
To analyse and explore relationships in the extracted data, we conducted a thematic content analysis of all data included in the preliminary synthesis. We used a combination of directed and conventional approaches, as described by Hsieh and Shannon.25 Our initial coding scheme was guided by theory and comprised the theoretical constructs in the CFIR. During the analysis, we added codes derived from the data. Using an iterative process of coding, reflecting on the codes and condensing, we identified themes describing how external inspections could affect organisational change.26 The extracted data and analysis were shared and discussed among all authors.
The GRADE-CERQual approach (Grading of Recommendations, Assessment, Development and Evaluations-Confidence in the Evidence from Reviews of Qualitative Research) was developed to assess confidence in review findings from qualitative evidence syntheses and is based on four components: methodological robustness, relevance, adequacy of data and coherence of findings. Because most of our included studies used a descriptive design and were based on qualitative data, we found it expedient to apply the GRADE-CERQual principles to assess the confidence of our review findings.
To determine the methodological robustness of the included studies, we applied previously used assessment criteria,27 which were adapted from criteria developed by Cunningham et al28 and the Australian National Health and Medical Research Council.29 These criteria were developed to assess the methodological quality of a range of study designs. Based on the criteria, studies were scored from 1 to 3, where 3 denotes the most robust study methodology.
The criteria and the rating scheme are displayed in online supplementary file 3. EHo and GSB independently rated all of the included studies and then discussed each article to reach consensus on a CERQual statement for each review finding. Our review is based on previously published and publicly available data; therefore, ethical approval was not needed.
Patient and public involvement
No patients were involved.
Results
Table 1 presents the characteristics of the included studies. In total, 95 studies met the eligibility criteria. Accreditation was the most frequent type of inspection (n=68), followed by statutory inspections (n=19) and external peer review (n=9). A descriptive design was used in 46 of the studies, implying that the data were collected merely to describe and characterise the study population. An analytical design was used in 49 of the studies, implying that the data were collected to test a hypothesis or associations between variables. Five of the analytical studies included an intervention and randomisation. Data collection methods included surveys, performance indicators based on administrative data or patient records, interviews, focus group interviews, document analysis and observation. None of the studies used an experimental design to test the mediators of change in organisations subject to external inspections.
Table 1.
Characteristics of the included studies, Consolidated Framework for Implementation Research (CFIR) constructs addressed and quality rating
| Articles | Purpose of study | Study design | Data source and method | Setting | Type of inspection | CFIR construct addressed in article | Quality rating (1–3, 3 is most robust) |
| Boyd et al64 | To explore how inspection team members work together to conduct hospital inspections | Case study | Mixed methods study with interviews (N=78), observation, survey of 369 team members, response rate 66% | Hospital England | Statutory inspection | 7 | 3 |
| Boyd et al60 | To investigate the inter-rater reliability of assessments made by inspectors inspecting acute hospitals in England | Cross sectional | Survey data (N=286, response rate 65%). Analysis: multinomial logistic regression and Krippendorff’s alpha (Ka) | Hospital England | Statutory inspections | 7 | 3 |
| Algahtani et al97 | To assess the perceptions of health professionals on the impact of accreditation | Cross sectional | Survey (N=901, response rate 67%). Analysis: t-test, ANOVA | Hospital Saudi Arabia | Accreditation | 19, 21, 31, 32 | 2 |
| Allen et al10 | To examine the effect of inspections by the Care Quality Commission on emergency department performance | Cross sectional | Analysis of six standard emergency department performance indicators for 118 hospitals. | England, hospitals | Statutory inspection | 7 | 3 |
| Almasabi and Thomas112 | To examine factors explaining quality results in accreditation | Case study | Mixed methods, survey (N=669, response rate 69%), 12 interviews with managers, and document analysis. Regression analysis | 3 hospitals in Saudi Arabia | Accreditation | 27, 28, 31 | 3 |
| Alyahya et al78 | To explore how accreditation was perceived by primary care centres | Case study | Qualitative interviews with 56 healthcare professionals and staff | 7 primary care centres Jordan | Accreditation | 10, 11, 12, 19 | 2 |
| Andres et al109 | To assess association between accreditation and staff perception of organisational culture | Cohort study | Survey at three time points before and after accreditation. N=545, response rate 20%. Analysis: regression and t-test | One university hospital in Hong Kong | Accreditation | 26 | 1 |
| Andres et al74 | To evaluate impact of hospital accreditation on patient experience | Cross sectional | Three consecutive surveys with a total of 3083 patients, response rate 84%, analysed with t-test, ANOVA and regression analysis | Hospital Hong Kong | Accreditation | 9 | 3 |
| Arianson et al65 | Evaluation of the effect of inspections of delivery institutions | Cross sectional | Survey of staff at 26 delivery institutions (N=208, 89% response rate). Interviews | Norway, hospital | Statutory inspection | 7, 32 | 2 |
| Askim44 | Assess county experiences with statutory inspections | Case study | Document analysis, interviews with inspection teams, 8 county managers, government officials. Survey of inspection teams | Norway, primary care | Statutory inspection | 4, 11, 32 | 1 |
| Baskind et al93 | To explore the effects of a standards-based, peer review, accreditation model on standards of care in acute inpatient wards and explore how staff achieved change | Qualitative | 8 telephone interviews with local project leaders | England, mental health wards | Accreditation | 19, 21, 26, 28, 33 | 1 |
| Benson et al61 | To examine the impact of Commission for Health Improvement’s clinical governance reviews on NHS trusts in England | Cross sectional | 30 NHS trusts. Document analysis, 17 interviews with hospital staff, survey of key stakeholders (N=90, 44% response rate) | England, 30 NHS trusts | Statutory inspection | 4, 9, 11, 21, 22, 28, 30, 32, 33 | 2 |
| Bogh et al91 | To identify predictors of the effectiveness of hospital accreditation on process performance measures | Cohort study, with a stepped wedge design | 1 624 518 processes of care. Logistic regression analysis | Hospital Denmark | Accreditation | 18 | 3 |
| Braithwaite et al94 | To determine whether accreditation performance is associated with self-reported clinical performance and independent ratings of four aspects of organisational performance | Cross sectional | 19 health service organisations. Indicators for clinical performance. Assessment of organisational context measures through observation and interviews. Variance and correlation analysis | Australia, acute care: large, medium and small healthcare service organisations | Accreditation | 19, 26, 27 | 2 |
| Braun et al121 | To study the impact of organisational characteristics on quality-related activities in health centres | Cross sectional | Survey of executive directors at health centres (N=830, 35% response rate). Bivariate analysis | USA, 830 health centres | Accreditation | 33 | 2 |
| Campbell et al45 | To describe the development, content and piloting of version one of the Primary Medical Care Provider Accreditation scheme | Cross sectional | 36 general practices. Nonparametric analysis of compliance with quality criteria. Interviews with general practitioners, practice managers, nurses | England, 36 practices | Accreditation | 4, 8, 9, 26, 31, 32 | 1 |
| Castro-Avila et al8 | To evaluate the effect of Care Quality Commission external inspections of acute trust on adverse event rates | Interrupted time series analysis | Multilevel-random coefficient modelling of rates of falls with harm and pressure ulcers from 155 acute trusts | Hospital England | Statutory inspection | 23 | 2 |
| Chuang et al119 | To determine accreditation surveyors’ and hospitals’ use and perceived usefulness of clinical indicator reports and the potential to establish the control relationship between the accreditation and reporting systems. | Cross sectional | Survey of 306 surveyors, 24% response rate. Pearson χ2 tests. | Australia | Accreditation | 33 | 2 |
| Collopy et al122 | Assess the use of clinical indicators in the accreditation process | Case study | Performance indicators from 667 hospitals | Australia, hospitals | Accreditation | 33 | 1 |
| Commonwealth of Australia46 | To evaluate the impact of accreditation on the delivery of quality of care and quality of life to residents in aged care homes | Cross sectional | Survey of quality managers (N=1884, 33% response rate) and care staff (N=1116, 47% response rate). Interviews. Literature review | Australia, care homes | Accreditation | 4, 8, 9, 12, 33 | 1 |
| Correa et al110 | To develop a system measurement model of the Brazilian hospital accreditation system | Cross sectional | Survey sent to 515 accredited institutions, response rate 49%, analysis with χ2 test | 515 hospitals Brazil | Accreditation | 27 | 2 |
| Daucourt and Michel98 | To identify the areas of needed improvement that were most frequently identified in the first 100 accredited hospitals by the French Accreditation College | Cross sectional | Archive data on accreditation. Parametric and non-parametric tests | France, hospitals | Accreditation | 21 | 3 |
| Davis et al79 | To survey accredited healthcare departments on barriers to and supports of accreditation preparation, performance on accreditation standards, and benefits and improvements after accreditation | Cross sectional | 48 accredited healthcare departments. Survey of 48 managers, 100% response rate. Document analysis | USA | Accreditation | 8, 10, 32, 33 | 3 |
| Devkaran and O'Farrell39 | To evaluate whether accredited hospitals maintain quality and patient safety standards over the accreditation cycle by testing a life-cycle explanation of accreditation on quality measures | Case study | Case study. Register data with 23 quality indicators. Interrupted time series and regression analyses | United Arab Emirates, hospital | Accreditation | 1, 21, 32 | 3 |
| Devkaran et al116 | To evaluate whether hospital reaccreditation can sustain improvement over three accreditation cycles | Case study | Interrupted time series analysis of 27 quality measures using regression equations | Hospital United Arab Emirates | Accreditation | 32, 33 | 3 |
| Doyle and Grampp35 | To estimate the cost of the accreditation process and to analyse the effect of the accreditation process within the organisation | Case study | Case study of three hospitals. 73 interviews, 7 focus group interviews with hospital staff and team leaders. Survey 1: staff (N=500, 10% response rate); survey 2: team members (N=100, 19% response rate); survey 3: key staff (N=1). Observation | Ireland, 3 hospitals | Accreditation | 1, 8, 12, 22, 26, 28 | 1 |
| Duckett40 | To explore the impact of the accreditation programme on a random sample of 23 Australian hospitals | Case study | Case study with 23 hospitals. Interviews with senior hospital personnel. | Australia, 23 hospitals | Accreditation | 1, 10, 12, 19, 21, 26, 27, 32 | 1 |
| Due et al36 | To explore how general practitioners and their staff perceived the accreditation standards | Case study | Two rounds of qualitative interviews with 39 staff and general practitioners from 11 general practices | 11 general practices Denmark | Accreditation | 4, 28 | 3 |
| Ehlers et al71 | To evaluate effectiveness of unannounced versus announced surveys in detecting quality problems | Cluster- randomised controlled trial | Level of compliance with 113 performance indicators. Binomial regression analysis | 23 hospitals Denmark | Accreditation | 7 | 3 |
| El-Jardali et al86 | To examine impact of accreditation of primary healthcare centres | Cross sectional | Mixed methods, survey (N=403, response rate 76%), 22 qualitative interviews. Analysis: linear regression | 25 primary healthcare centres Lebanon | Accreditation | 12, 21, 28, 30, 31, 32 | 3 |
| El-Jardali et al113 | To assess the perceived impact of accreditation on quality of care and investigate the perceived contributing factors that can explain change in quality of care. | Cross sectional | Survey, 1485 nurses, 75.5% response rate, regression analysis | Lebanon, hospitals | Accreditation | 27, 31, 33 | 3 |
| Esik et al47 | To present an example of how external audit can be used to improve quality of care | Case study | 2 radiotherapy departments. Adherence to predefined quality criteria. Variance analysis | Austria and Hungary | External peer review | 4, 11, 21, 27 | 1 |
| Fairbrother and Gleeson37 | Evaluate accreditation process | Cross sectional | Survey with closed/open-ended questions. 88 respondents from clinical and managerial staff. 44% response rate | Australia, hospital | Accreditation | 1, 28, 32 | 1 |
| Greenfield et al72 | To evaluate short-notice surveys in accreditation programmes | Cross sectional | Two trials, one from hospital (N=20) and one from primary care (N=7). Regression analysis of performance data | Primary care and hospitals Australia | Accreditation | 7 | 3 |
| Greenfield et al53 | To identify resources and expertise required for the revision of accreditation standards | Case study | Expert panel and document analysis | General practice Australia | Accreditation | 4 | 1 |
| Greenfield et al58 | To examine challenges to surveyor reliability | Case study | Qualitative data from group interview with survey coordinators | Hospital Australia | Accreditation | 7 | 2 |
| Greenfield et al54 | To explore the development and implementation of an accreditation scheme in Australia | Case study | Multi method case study using document analysis, observations, interviews with 197 stakeholders | Hospital Australia | Accreditation | 4, 12 | 3 |
| Greenfield et al48 | To investigate understandings and concerns of stakeholders regarding evolution of accreditation programmes in Australia | Case study | 47 group and individual interviews with 258 stakeholders | Healthcare Australia | Accreditation | 4, 11, 12, 19, 21, 31, 33 | 2 |
| Greenfield et al59 | To investigate accreditation survey coordinators’ perceptions of reliability issues | Cross sectional | Survey data from two surveys. (N=53 and N=35). Descriptive analysis | Hospital Australia | Accreditation | 7 | 1 |
| Greenfield et al30 | To investigate whether an accreditation programme facilitates healthcare organisations to evolve and maintain high-performance human resource management systems | Cross sectional | Cross-sectional multimethod study of 6 high performing accredited healthcare organisations. Document analysis, interviews. | Australia | Accreditation | 1, 10, 11, 12, 21, 30, 32, 33 | 3 |
| Greenfield et al92 | To examine whether longitudinal participation in accreditation is translated into improvements in continuity of quality of patient care | Cohort study | Accreditation panel data from 311 hospitals. Regression analysis | Hospital Australia | Accreditation | 18 | 3 |
| Greenfield et al31 | To explore the experiences of health executives, managers and frontline clinicians who participated in organisational accreditation processes: what motivated them to engage, and what benefits accrued? | Qualitative | Case study. Interviews with 30 staff involved in an accreditation process at one hospital | Australia, teaching hospital | Accreditation | 1, 7, 10, 21, 26, 28, 30, 33 | 2 |
| Greenfield et al62 | To examine if accreditation survey processes are reliable | Case study | Mixed methods, 193 stakeholders participated in interviews and a survey | Hospital Australia | Accreditation | 7 | 2 |
| Greenfield et al69 | To examine survey reliability | Case study | Mixed methods with document analysis, observation and interviews of stakeholders | Hospital Australia | Accreditation | 7 | 2 |
| Grenade and Boldy38 | Explore participants’ experiences with accreditation in residential aged care facilities | Case study | Case study of 15 residential aged care facilities. 30 interviews with nurses and managers | Australia, residential aged care facilities | Accreditation | 1, 8, 26, 28 | 1 |
| Hayes et al117 | Compare the effect of two external feedback strategies on the adherence to congestive heart failure guidelines | Randomised controlled trial | 32 hospitals randomised to two different strategies for external feedback. Performance indicators and interviews with 32 quality managers before and after intervention. Multivariate analysis and t-test. | USA, 32 hospitals | Statutory inspection | 32 | 3 |
| Hinchcliff et al49 | Identify factors enabling effective implementation of accreditation programmes across different healthcare settings | Qualitative | Qualitative interviews with 258 healthcare stakeholders. 39 focus group interviews and 8 individual interviews | Australia | Accreditation | 4, 12, 22, 27, 28, 32 | 3 |
| Hinchcliff et al55 | To examine how consumer engagement can be promoted in accreditation | Case study | Multimethod, survey data (84% response rate), group and individual interviews of 258 health professionals and stakeholders | Hospital Australia | Accreditation | 4, 9 | 3 |
| Hofhuis et al99 | Evaluate the effects of visitation and to determine which factors are related to the effectiveness of visitation | Cross sectional | Survey of 151 allied health professionals. 73% response rate | The Netherlands | External peer review | 21, 32 | 2 |
| Hogden et al75 | To explore staff perceptions on factors influencing an aged care accreditation programme | Case study | Focus group interviews with 66 aged care staff from 11 aged care facilities | Primary care Australia | Accreditation | 9, 12, 28 | 2 |
| Hovlid et al100 | To explore how inspecting organisations express and follow-up non-compliant behaviour | Case study | Document analysis of correspondence between county governors and healthcare organisations in 36 inspections | Healthcare Norway | Statutory inspection | 21, 30, 32 | 2 |
| Kilsdonk et al95 | To explore the value and perceived impact of external review in cancer care | Case study | Qualitative data from interviews with 31 stakeholders from 15 hospitals | 15 hospitals Netherlands | External review | 19, 21, 23, 26, 28 | 2 |
| Kilsdonk et al105 | To examine the clinical impact of peer review on colorectal cancer care. | Case control | Performance indicators from 30 hospitals, 23 intervention and 7 control. Multivariate logistic analysis | The Netherlands | External peer review | 22 | 2 |
| Knutson et al85 | Explore beliefs and perceptions regarding the importance of accreditation, and to evaluate possible correlations between standard compliance and improved patient care and oncologic outcomes | Cross sectional | Performance indicators. Survey (N=2038, 50% response rate) | USA, cancer facilities | Accreditation | 12, 33 | 2 |
| Kousgaard et al96 | To explore how general practitioners and their staff experienced mandatory accreditation | Case study | Qualitative data from 11 clinics, 42 interviews | General practices Denmark | Accreditation | 1, 19, 21, 22, 26, 28, 30, 31, 32 | 3 |
| Lam et al90 | To compare performance of hospitals accredited by The Joint Commission and state survey hospitals | Cross sectional | 4 242 684 patients aged 65 years and older. Main outcomes: risk-adjusted mortality, readmission rates and patient satisfaction scores | 4400 hospitals in USA | Accreditation | 18 | 3 |
| Lee107 | To compare safety climate among nurses before and after accreditation | Cross sectional | Survey before (N=217, response rate 58%) and after accreditation (N=373, response rate 87%). Analysis: t-test and Pearson’s correlation coefficient | Hospital South Korea | Accreditation | 26 | 1 |
| Lombarts and Klazinga32 | Evaluate effect of supporting implementation processes after external peer review | Cross sectional | Survey. Telephone interviews. 25 specialist group practices, 67 specialists | The Netherlands | External peer review | 1, 9, 22, 26, 32 | 1 |
| Lombarts et al57 | To evaluate the impact of facilitation by management consultants on implementing recommendations from external peer review. | Cross sectional | Survey of 205 medical specialists, representing 50 hospital-based specialist groups, 54% response rate. Analysis: correlation using Spearman’s test | The Netherlands | External peer review | 4, 32 | 2 |
| Pomey et al80 | Explore organisational changes following accreditation | Case study | Single case study. Hospital staff. Survey (N=3362, 52% response rate), 67 interviews, document analysis and observation | France, university hospital | Accreditation | 10, 12, 19, 21, 25, 26, 27, 28, 31, 32 | 2 |
| Pomey et al41 | To explore how accreditation can facilitate organisational change | Case study | Case study based on document analysis, 25 individual interviews, 10 focus group interviews with hospital staff | Canada, 5 hospitals | Accreditation | 1, 10, 12, 19, 21, 23, 26, 28, 31, 32 | 2 |
| Melo51 | To explore how accreditation can lead to improvement | Case study | Qualitative interviews with 49 staff members | One hospital Portugal | Accreditation | 4, 12, 19, 23, 26, 27, 31, 32, 33 | 2 |
| Moe et al81 | To examine the impact of accreditation on community family practices | Case study | Performance indicators and survey data (N=82, response rate 81%). Analysis: t-test and ANOVA | 5 family practices in Canada | Accreditation | 10, 26, 27 | 2 |
| Mumford et al73 | To assess direct costs of hospital accreditation in Australia | Cohort | Mixed methods design incorporating stakeholder analysis, survey, activity-based cost analysis, and expert panel review | 6 acute care hospitals in Australia | Accreditation | 8 | 3 |
| Nekoei-Moghadam et al114 | To explore staff perceptions of hospital accreditation | Case study | Qualitative interviews with 17 staff members | Hospital Iran | Accreditation | 28 | 1 |
| Nouwens et al76 | To identify determinants of impact of a practice accreditation programme in primary care | Case study | Qualitative data from 33 primary care professionals | General practices Netherlands | Accreditation | 9, 12, 18, 19, 26, 28, 30, 31, 32, 33 | 2 |
| Nouwens et al101 | To evaluate the effectiveness of improvement plans in practice accreditation in primary care | Randomised controlled trial | 45 primary care practices randomised to intervention and control. Primary outcome: blood pressure, low-density lipoprotein (LDL) cholesterol and prescription of antiplatelet drugs. Analysis: regression | 45 primary care practices in the Netherlands | Accreditation | 21, 30, 32 | 3 |
| Office of Public Management130 | Evaluation of a national audit of specialist inpatient healthcare services for people with learning difficulties in England | Cross sectional | 72 NHS trusts. Survey of managers (N=242, 97% and 93% response rates) | England, hospitals | Statutory inspection | 1, 21 | 1 |
| Oliveira et al102 | To explore managers’ and professionals’ perception on changes deriving from accreditation | Case study | Qualitative interviews with 5 managers and 91 professionals from 5 hospitals | Hospital Brazil | Accreditation | 21, 27, 30, 32 | 2 |
| OPM Evaluation team66 | To assess how the inspection programme on healthcare-associated infections is working to achieve its intended outcomes over time | Cross sectional | 80 health trusts. Performance indicators. 98 interviews with board members, managers, hospital staff. Focus group interview with 20 service users. Survey of managers, 134 respondents, 78% response rate | UK, hospitals | Statutory inspection | 7, 11, 12, 19, 21, 22, 25, 26, 27, 28, 31, 32, 33 | 2 |
| Oude Wesselink et al131 | To evaluate the effects of a supervision programme on the provision of smoking-cessation counselling | Three substudies. 1: observational before–after study, 2 and 3: randomised controlled trial with only postintervention measurements | 233 primary care midwifery practices. Performance indicators, Survey A (n=113, 94% response rate), survey B (N=71, 75% response rate), survey C (N=11, 79% response rate). Analysis: linear and logistic regression | The Netherlands, primary care | Statutory inspection | 32 | 3 |
| Paccioni et al108 | Describe and understand the effects of the accreditation process on organisational control and quality management | Case study | Longitudinal case study of 2 primary care centres. Group interview (N=1), individual interviews (N=35), observation. Survey (N=328, response rate: institution A 27%, institution B 47%). Bivariate analysis and t-test | Canada, primary care | Accreditation | 26, 27, 31, 33 | 2 |
| Pedersen et al89 | To test whether accreditation contributes to foster intrinsic motivation | Cluster randomised stepped wedge | Baseline and follow-up survey of 1164 general practitioners. Response rate 56% and 54%. Analysis: mixed effects multilevel models | General practice Denmark | Accreditation | 14 | 2 |
| Pham et al56 | To examine the impact of quality reporting on hospitals’ data collection and review processes | Cross sectional | 36 hospitals. 111 interviews with hospital staff, hospital association leaders, JCAHO representatives | USA, 36 hospitals | Accreditation | 4, 8, 12, 21, 23, 25, 28, 33 | 1 |
| Pollard et al115 | To examine unit characteristics and the use of seclusion and restraint in a mental health unit before and after the promulgation of the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) 2000 standards | Case study | Performance indicators. t-test to compare prepolicy and postpolicy implementation | USA, acute mental health unit | Accreditation | 32 | 3 |
| Rajan et al82 | To explore staff perceptions about changes following accreditation | Case study | Qualitative interviews with managers and staff | 8 cancer centres in Netherlands | Accreditation | 10, 19, 21, 32, 33 | 2 |
| Reisi et al106 | To explore facilitators and barriers for implementation of accreditation | Cross sectional | Survey of 1706 respondents (response rate 76%) from 43 hospitals, multiple regression analysis | 43 hospitals in Iran | Accreditation | 24, 27, 31 | 3 |
| Rivas et al83 | Examine perceptions of local service change and concepts of change among participants in a UK nationwide randomised controlled trial of peer review with feedback to promote quality improvement | Qualitative | 43 interviews with hospital staff (24 in intervention group and 11 in control group) | England, NHS | External peer review | 10, 11, 19, 21, 26, 31, 32 | 3 |
| Saleh et al52 | To assess views of hospitals on worthiness of accreditation | Cross sectional | Survey (N=110 hospitals, response rate 63%). Analysis: Pearson χ2, Fisher’s exact test. Qualitative analysis of open-ended answers in survey | Hospital Lebanon | Accreditation | 4, 8 | 2 |
| Salmon et al118 | To assess the effects of an accreditation programme on public hospitals’ processes and outcomes | Randomised controlled trial | 53 hospitals randomised to accreditation or control. Performance indicators and degree of compliance with accreditation standard. χ2, correlation and analyses of variance | South Africa, 20 hospitals | Accreditation | 32 | 2 |
| Schaefer and Wiig67 | To examine how county governors contribute to improvement through external inspections | Case study | Qualitative data from interviews with 10 inspectors from two county governor offices | Healthcare Norway | Statutory inspection | 7, 10, 21 | 2 |
| Shah et al87 | To assess the impact of local boards of health governance functions on Local Health Departments | Cross sectional | Survey data from 329 Local Health Departments, response rate 58%. Analysis with logistic regression models | Primary care USA | Accreditation | 12 | 2 |
| Shakespeare et al103 | Explore effect of external audit on clinical practice and medical decision making in an oncology centre | Cross sectional | Assessment of 100 randomly chosen patient records according to predefined criteria. Univariate and multivariate analysis | Asia, cancer centre | External peer review | 21, 25 | 2 |
| Shaw et al111 | To investigate the relationship between ISO 9001 certification, accreditation and quality management | Cross sectional | Mixed methods multilevel study with data from 73 hospitals in seven countries. Performance data about care delivery and survey data. Analysis: multivariable linear regression | Hospital Europe | Accreditation | 27 | 3 |
| Smithson et al43 | To evaluate the impact of the Care Quality Commission (CQC) on provider performance | Cross sectional | Document review, observation of inspections, 170 qualitative interviews with senior CQC leaders, NHS trusts, clinical staff and stakeholder organisations. Analysis of different performance indicators | The UK, hospitals and general practices | Statutory inspection | 1, 7, 8, 10, 11, 12, 18, 19, 21, 26, 27, 31, 32, 33 | 3 |
| Teymourzadeh et al68 | To identify factors affecting surveyor management of hospital accreditation | Case study | Qualitative interview with 21 stakeholders | Hospital Iran | Accreditation | 7 | 1 |
| The Norwegian Board of Health Supervision50 | Evaluation of statutory inspections of primary elderly care | Cross sectional | 41 interviews with inspection team members, municipality managers and staff. Survey of municipality managers (N=325, 68% response rate) | Norway, primary care | Statutory inspection | 4, 7, 10, 32, 33 | 2 |
| Triantafillou88 | To explore the emergence of accreditation of hospitals in Denmark | Case study | Document analysis and qualitative interviews with 8 key informants | Hospital Denmark | Accreditation | 12 | 1 |
| Tuijn et al63 | To examine the effect of two interventions on reliability and validity of regulatory judgements | Randomised controlled trial | Variance and regression analysis of inspectors’ judgements and control variables | Nursing homes Netherlands | Statutory inspections | 7 | 3 |
| Uren et al70 | To evaluate stakeholder perception of short-notice (48 hours) assessment | Cohort | Repeated survey to 115 participants. Response rate 28%–46%. Outcomes analysed with Student’s t-test. | 2 hospitals in Australia | Accreditation | 7 | 1 |
| van Doorn-Klomberg et al120 | To examine the effect of accreditation on quality of care for diabetes, chronic obstructive pulmonary disease and cardiovascular disease. | Cohort | 138 family practices. Comparative observational study with two cohorts. Performance indicators from patient records. Multilevel regression analysis | The Netherlands, primary care | Accreditation | 33 | 2 |
| Vanoli et al33 | Evaluate effects of accreditation system | Cross sectional | 19 internal medicine units. Performance indicators | Italy, hospitals | Accreditation | 1, 21 | 1 |
| Walshe et al42 | To evaluate the Care Quality Commission’s acute hospital regulatory model | Cross sectional | 79 interviews with staff from the Care Quality Commission, inspection team members, hospital staff. Survey of inspection team members and hospital staff. Observation | England, NHS | Statutory inspection | 1, 6, 7, 9, 10, 19, 21, 23, 26, 28, 30, 31, 32 | 2 |
| Walshe et al104 | To explore the use of external approaches to quality improvement in healthcare organisations following external reviews | Cross sectional | 47 NHS trusts. Interviews with 151 clinicians and managers, and 12 review team members | England, 47 NHS Trusts | Statutory inspection | 21, 30, 32 | 3 |
| Åsprang et al77 | To explore organisational impact of inspections on blood transfusion services | Case study | Qualitative group interviews with 18 professionals | Hospital Norway | Statutory inspection | 10, 12, 21, 23, 27 | 2 |
The assessment criteria for the quality rating are presented in online supplementary file 3. Studies were scored from 1 to 3, where 3 denotes the most robust study methodology.
ANOVA, analysis of variance; NHS, National Health Service.
The presentation of the findings is structured according to the mediators of change identified in our analysis guided by the theoretical constructs in the CFIR. For clarity, some of the mediators of change are presented together. The theoretical constructs, the mediators of change identified in our analysis and the corresponding articles are shown in table 2.
Table 2.
Consolidated Framework for Implementation Research (CFIR) constructs, mediators of change identified in our analysis and the relevant articles
| Number | CFIR constructs | Change and discursive activities identified in our empirical data analysis | Articles |
| 1 | Intervention source | Self-assessment | 30–35 37–43 96 |
| 2 | Evidence strength and quality | None identified | |
| 3 | Relative advantage | None identified | |
| 4 | Adaptability | The importance of valid and relevant inspection standards | 36 44–55 61 |
| | | Guidance on how to follow up inspection findings | 43 44 50 56 57 |
| 5 | Trialability | None identified | |
| 6 | Complexity | Complexity of inspections | 42 |
| 7 | Design quality and packaging | Knowledge and skills of the surveyors and credibility of inspection schemes | 31 43 50 58–60 62–69 |
| | | Unannounced inspections | 42 66 70–72 |
| | | Choosing whom to inspect | 43 |
| 8 | Cost | Increased costs | 35 38 43 46 52 56 73 |
| 9 | Patient needs and resources | Patient focus | 32 42 45 46 55 61 74–76 |
| 10 | Cosmopolitanism | Improved relations and communication with external community partners | 30 31 40–43 50 67 77–83 |
| 11 | Peer pressure | External pressure | 30 43 44 47 48 61 66 78 83 |
| 12 | External policy and incentives | Public confidence in services | 30 40 41 43 46 48 66 78 85 |
| | | Incentives | 35 49 56 75 76 80 |
| | | Role of mass media | 35 43 48 66 77 |
| | | Government involvement | 48 51 54 86–88 |
| 13 | Knowledge and beliefs about the intervention | See the constructs: intervention source, adaptability, design quality and packaging | |
| 14 | Self-efficacy | Intrinsic motivation | 89 |
| 15 | Individual stage of change | None identified | |
| 16 | Individual identification with organisation | See the constructs: tension for change, and learning climate and culture | |
| 17 | Other personal attributes | None identified | |
| 18 | Structural characteristics | Structural characteristics | 43 76 90–92 |
| 19 | Networks and communications | Improved communication | 40 42 43 48 51 66 76 78 80 82 83 90 93–95 |
| | | Facilitate creation of networks | 40 41 43 78 80 97 |
| 20 | Culture | See the construct: learning climate and culture | |
| 21 | Tension for change | Awareness of current practice and performance gaps | 30 31 33 34 39–43 47 56 61 66 67 76 77 82 86 93 95 96 98–104 |
| | | Awareness of more desirable practice | 30 31 48 82 83 93 95 97 103 |
| | | Commitment to change | 41 61 76 80 96 104 |
| 22 | Compatibility | Perceived relevance of inspection findings | 32 35 49 61 66 76 96 105 |
| 23 | Relative priority | Priority of inspected area | 8 41 42 51 56 77 95 |
| 24 | Organisational incentives and rewards | Incentives | 106 |
| 25 | Goals and feedback | Goal setting | 56 66 80 103 |
| 26 | Learning climate and culture | Improved learning and organisational climate | 38 42 43 66 80 81 83 93 94 96 107–109 |
| | | Reflection on clinical practice | 31 32 41–43 45 66 76 80 |
| | | Improved understanding of organisation and interdependencies in clinical system | 31 35 38 40 41 80 95 96 108 |
| 27 | Leader engagement | Engage leaders in improvement | 40 43 47 49 51 66 77 80 81 94 102 106 108 110–113 |
| 28 | Available resources | Allocation of resources | 31 35–38 41 42 49 56 61 66 75 76 80 86 93 95 96 112 114 |
| 29 | Access to knowledge and information | None identified | |
| 30 | Planning | Planning improvement interventions | 30 31 42 61 76 86 96 100–102 104 |
| 31 | Engaging | Engage and involve staff in improvement activities | 41–43 45 48 51 66 76 80 83 86 96 97 106 108 112 113 |
| 32 | Executing | Implementing improvement interventions | 30 32 37 39–45 49–51 57 61 65 66 76 79 80 82 83 86 96 97 99–102 104 115–118 |
| 33 | Reflecting and evaluating | Use of measurement systems and data | 43 48 51 56 61 76 82 85 108 113 119–122 |
| | | Evaluation and continuous improvement | 30 31 46 48 50 61 66 76 79 85 93 116 |
Self-assessment
Survey and interview data suggest that the self-assessment tools were considered useful and initiated critical reflection on the organisation’s current practice.30–34 Four other studies reported that the self-assessment tool was confusing and time consuming and was concerned with finding and producing documentation rather than reviewing practice.35–38 Four case studies, three using interview data and one using time series analysis of performance indicators, found that the most substantial changes were made during the self-assessment phase,36 39–41 whereas one study using interview data found that the organisation did not make improvements during the self-assessment phase because it was considered sufficient just to identify improvement needs.42 Another case study found that changes made before the site visit were superficial and sought to achieve ritual compliance, focusing more on getting through the inspection process than on really improving the quality of care.43
Importance of valid and relevant inspection standards
Survey and interview data suggest that the standards used for inspections must be valid, focused on clinical practice and benefits for the patients, and should be translated into something perceived as relevant and useful for the services in their improvement work.36 44–50 To be perceived as relevant, the standards should be adapted to the local context.51 52 Development and revision of standards require a collaborative approach and the expertise of a range of stakeholders including the patients.53–55
Guidance on how to follow up inspection findings
Qualitative data indicate that inspected organisations want guidance on how to improve following an inspection,44 50 56 and two studies using survey data and interview data found that organisations receiving support and guidance from a consultant were more successful in implementing changes after the inspection.43 57
Complexity of inspections
Interview data suggest that inspection teams gather large amounts of qualitative and quantitative data during the inspections, and that it can be challenging for the inspection team to synthesise, weigh and make sense of all the gathered information.42
Knowledge and skills of surveyors and credibility of inspections schemes
Ensuring the trust and credibility of assessment schemes requires collaboration between regulators, assessment agencies and health services.58 A key factor for credibility is survey reliability,59 and qualitative and quantitative data suggest that the reliability and validity of surveyor judgements, and how surveyors interpret and apply the standards, can be a challenge.44 49 59–63 The relationship between surveyors and the staff in the organisations being inspected fundamentally affects the way inspections work.43 Qualitative data indicate that the inspection teams should be multiprofessional, and that they need specific knowledge about the services being inspected along with good communication skills, because such knowledge and skills were considered important for the inspected organisations’ confidence in the inspection findings.31 43 50 64–68 The skill and competence of surveyors, along with active collaboration and consensus meetings between surveyors, can improve the reliability and validity of their judgements.60 63 69
Choosing whom to inspect
Inspecting agencies can rely on a risk-based approach for choosing whom to inspect. Data from one case study suggest that the quantitative dataset used to risk assess provider performance and prioritise inspections did not correlate with the subsequent ratings of general practices and acute trusts.43
Unannounced inspections
Interview and survey data suggest that unannounced inspections could limit the time spent preparing for the inspection and could contribute to enhancing the credibility of the inspection findings.42 66 70 Two trials using actual performance data from inspections did not find any difference between regular inspections and unannounced/short-notice (48 hours) inspections in their ability to identify quality problems,71 72 while survey data from stakeholders indicate that short-notice (48 hours) surveys more effectively identified true organisational performance.70
Increased costs
Survey and interview data, along with cost indicators, suggest that accreditation and inspections can increase costs. The increased costs were related to staff doing different types of work, for example, participating in meetings, producing new documentation and collecting evidence to fulfil documentation requirements, and smaller facilities seemed to have higher costs.35 38 43 46 52 56 73
Patient focus
Three cross-sectional studies using interview and survey data found that the inspection process and the recommendations following the inspection should be more directed towards patient care and clinical practice in order to contribute to improvement.32 45 61 Interview and survey data suggest that accreditation can contribute to increasing the focus on patient needs and promoting patient involvement in care.55 74–76 Qualitative data suggest that inputs from patients during inspections are valuable, but more work is needed on how to incorporate such inputs in a meaningful way in the inspection processes.42 46
Relations and communication with external stakeholders
Data from two case studies suggest that findings from inspections can be aggregated and used to identify systemic and interorganisational issues, which can influence stakeholders and wider systems beyond the inspected organisations themselves.43 77 Interview and survey data suggest that inspection processes can contribute to improving relationships and communication with community partners by sharing experiences and receiving input on how to improve, and that achieving accreditation was viewed as a way to gain prestige and recognition.30 31 41–43 50 78–83 One case study indicated that inspection did not improve relations with community partners.40
External pressure
Interview and survey data suggest that being reviewed by someone external and independent can be viewed as positive because external feedback can provide a stimulus for change which is based on credible evidence.30 43 44 47 61 83 84 Social pressure from stakeholders in the community was viewed as an external pressure to participate in accreditation.48 78
Public confidence in services
Survey and interview data indicate that results from the inspection processes can be made publicly available, and that such publication can demonstrate that the quality of the services meets a certain standard, thereby contributing to public confidence in the services.30 40 41 43 46 55 66 78 85
External incentives
Interview data suggest that health organisations can participate in multiple accreditation programmes, and that resources are directed towards mandatory programmes with public disclosure of results.56 80 Participation in accreditation and inspections is resource demanding, and to promote effective implementation of accreditation and inspections, expectations to participate should be aligned with and supported by other regulatory incentives.35 49 75 76
Role of mass media
Interview data indicate that mass media are more likely to report bad news and shortcomings following inspection and accreditation than positive news, thereby giving the public an unbalanced picture of the current situation.35 43 66 Two case studies suggest that media coverage can contribute to the implementation of improvement measures following inspections.48 77
Government involvement
Qualitative data suggest that government can play an active role in developing accreditation schemes by incorporating them as requirements in their strategies for improving quality and safety, by facilitating collaboration between regulators and services, and through financial incentives.48 51 54 86–88
Intrinsic motivation
A cluster randomised trial indicated that accreditation contributed to fostering intrinsic motivation among staff, especially those who perceived accreditation as an instrument for quality improvement prior to accreditation.89
Structural characteristics
A cross-sectional study using data from 4400 hospitals did not find differences in outcomes to be associated with the type of accreditation programme.90 Performance data from two cohort studies suggest that hospital size and type did not predict the effects of accreditation91; however, lower performing hospitals improved at a greater rate than moderate and higher performing hospitals.92 Two case studies indicate that the degree of improvement can depend on the improvement capability of the inspected organisation,43 and that accreditation can be easier to implement in smaller facilities.76
Creation of networks, and improved communications
Qualitative data indicate that inspections can contribute to improved teamwork and communication within existing networks, in the sense that these can become more focused on how the organisation delivers clinical care.40 42 43 48 51 66 76 80 82 83 93–96 Involved staff members reported that inspections can expand existing networks by facilitating the creation of new meeting places.40 41 43 78 80 97
Awareness of current practice, performance gaps and a more desirable practice, and commitment to change
Interview and survey data suggest that an inspection can highlight problem areas needing improvement.31 43 47 67 76 77 82 86 93 95 96 98–103 Case studies suggest that external inspections rarely identify problems previously unknown in the inspected organisation, but the inspections still serve the purpose of confirming these problems and bringing them into the open so that they can be addressed.42 61 104
Other case studies suggest that inspections should not only address deficits in organisational performance but also be used to recognise and validate success30 41 so that the inspected organisation can maintain what it is currently doing well.30 42 66 Qualitative research indicates that the inspection can draw attention to and provide feedback about a more desirable practice and how this can be achieved.30 31 48 82 83 93 95 97 103
Data from interviews suggest that the authority of the inspecting organisation along with organisational reflections on performance gaps that take place during the inspection can contribute to creating an understanding of the necessity of improving and commitment to change.41 61 76 80 96 104
Inspection findings and goal setting
Data from surveys and interviews suggest that the inspection process should be translated into something meaningful and understandable for the front line in order to contribute to involvement in improvement work.32 35 49 61 66 76 96 105 Problem areas identified during inspections can receive increased attention by the inspected organisation.8 41 42 51 56 77 95 Qualitative studies indicate that feedback from the inspection can be used to define improvement goals for the inspected organisation.56 66 80 103
Organisational incentives and rewards
Survey data suggest that internal recognition and rewards were associated with perceived quality results in accreditation.106
Learning climate, reflection on clinical practice and understanding the clinical system
Findings from case studies and three survey studies using organisational context measures suggest that interprofessional collaboration and reflection can contribute to improving the organisational climate during inspection38 42 51 83 93 94 107 108 and strengthening the social relationships between staff members.80 109 However, interview and survey data also suggest that inspection can lower staff morale by focusing solely on what is wrong.43 66 81 96
A key feature of the new meeting places that can be created during an inspection seems to be that they can bring together a broad range of professionals and disciplines to discuss and reflect on clinical processes in a way that they had not done previously.31 32 41–43 45 66 76 80
Staff members report that communication and interprofessional reflection during the inspection process can contribute to enhancing their understanding of the clinical system and its interdependencies.35 38 95 96 Qualitative data suggest that interprofessional reflection can promote the breakdown of organisational silos and contribute to improving individuals’ understanding of the organisation as a whole, when clinicians learn about practices other than those in which they are usually involved.31 40 41 80 108
Engage leaders in improvement
Leader involvement and engagement in inspection are reported to be important because they give a direction to the improvement process and can facilitate the involvement and motivation of other staff members.47 49 Survey and interview data indicate that inspections can facilitate leader engagement in the area that is being inspected.43 51 66 77 81 102 110 111 However, survey data also indicate that accreditation has no effect—or even a negative impact—on leadership.108 Qualitative case studies report that inspections can provide leaders with a platform that can enable them to act to improve clinical work.40 47 80 Four cross-sectional studies using survey data support the idea that leadership engagement in accreditation can be associated with perceived quality results106 112 113 and accreditation performance.94
Allocation of resources
Qualitative data indicate that preparing for the inspection and being inspected can be burdensome and time consuming, and that the efforts do not necessarily match the outcomes.35 36 38 42 49 66 76 86 95 96 112 Survey and interview data suggest that inspections can have negative consequences for the time spent on clinical work.31 37 66 75 96 114 Interview data indicate that there is an inherent risk that staff members who are involved in accreditation processes without sufficient resources and competence in place can become less enthusiastic about future engagement because of the constraints under which their involvement takes place.35 Findings from qualitative studies suggest that resources may be allocated to areas needing improvement following the inspection.41 61 66 80 93
Planning improvement interventions
Interview data indicate that external inspections can provide an opportunity to reflect on current practice and performance gaps, and to plan actions for improvements in these areas.30 31
Staff members report that feedback from the inspection teams can also be used to plan improvement measures, and that perceived accreditation results were associated with quality improvement planning.31 42 61 76 86 96 100 101 104 Interview data suggest that the feedback following an inspection can provide a sense of direction and validity to the planning process by requesting the inspected organisation to produce a formal action plan addressing how non-conformities should be corrected.30 31 104
Engage and involve staff in improvement activities
Qualitative data indicate that the involvement of those being inspected can be a critical factor for change,45 108 and staff involvement in the accreditation process has been shown to be associated with perceived quality results.86 97 106 112 113 Case studies suggest that inspections can contribute to involving clinicians in improvement work,42 43 48 51 76 80 96 but there are also data indicating that an inspection may involve only part of the organisation,41 108 with the risk that those not sufficiently involved do not buy into its potential for real change.66
Implementing improvement interventions
Findings from case studies and quantitative cross-sectional studies combining survey data and performance indicators indicate that inspections can contribute to the implementation of planned changes, addressing substandard performance identified during the inspections.30 41–43 49 51 66 76 79 82 86 96 97 101 102 104 115 The inspected organisation can implement changes derived from new models of thought developed during the inspection.40 80
In accreditation, qualitative data and time series analyses of performance indicators suggest that the most substantial changes are implemented during the self-assessment phase prior to the actual inspection.39–41 116 In statutory inspections, survey and interview data indicate that changes were implemented throughout the inspection cycle.42 43 65 66 99
Interview and observational survey data in combination with performance indicators indicate that the feedback and recommendation provided to inspected organisations can facilitate the implementation of change.30 41 42 49 51 66 79 86 97 115 Staff members report that recommendations need to be explicit and realistic about what needs to be changed, and these should be cast in a way that eases implementation by being explicit about the aim of the change and its relevance for patient care.61 104
The inspected organisation can be held accountable for implementing changes,41 and this is reported to contribute to creating momentum to speed up the implementation process.42 43 61 66 104
Findings from qualitative studies, surveys and one randomised controlled trial indicate that inspections can contribute to organisational change, but the change does not necessarily affect quality of care.32 37 41 42 100 101 117 118 Interview and survey data, and one randomised controlled trial indicate that changes can be implemented to make the organisation comply with the standard without actually improving patient care.37 40 45 66 118 According to qualitative data, non-conformities identified during inspections and the corresponding feedback can address shortcomings in the quality management system and support processes.44 50 61 100 However, quality management systems are not always implemented systematically and clinicians question to what degree they actually support and impact clinical work.44 Qualitative findings suggest that there is a risk that corrective changes made to the management system will not have any impact on the quality of care.61 100
Use of measurement systems and data, and evaluation and continuous improvement
Staff members report that external inspections can contribute to reinforcing a healthcare organisation’s self-evaluation, which can impact the organisation’s long-term performance.30 43 66 76 The inspecting body can express expectations as to how the inspected organisation should monitor performance and progress of improvement, and what data they can use for this purpose following the inspection.48 51 82 Qualitative data suggest that such expectations should be stated in a way that makes it possible to follow up and monitor the progress of the implementation, as well as the effects of the changes on the quality of care.43 61 Survey data and performance data indicate that accreditation can promote continuous improvement and sustain improvements over time.46 116 However, a challenge is that the standard used for inspection is not necessarily sensitive to improvement over time, and it might not include requirements on how to use measurements to inform continuous evaluation and improvement.46 48 85
Cross-sectional studies based on survey and performance data indicate that the inspection process can facilitate the development of measurement systems, which can be used to evaluate performance,51 61 82 85 108 113 119 120 and survey and interview data suggest that inspection can have a positive influence on clinicians’ understanding of the necessity of measuring and evaluating improvement progress.56 121 122 However, contradictory findings based on survey data indicate that inspection does not affect the way the inspected organisation uses and processes data to evaluate its performance.108 Survey and interview data indicate that inspection can facilitate reflection on practice in a way that can provide inspiration for further changes beyond the theme of the inspection.31 50 79 93
Assessment of review findings
The assessment of confidence in our findings is shown in table 3. Nine findings were rated as high, 19 moderate and 8 low.
Table 3.
Assessment of confidence of review findings
| Number for CFIR construct | CFIR construct and our review finding | Studies contributing to the review finding | Assessment of methodological limitations | Assessment of relevance | Assessment of coherence | Assessment of adequacy | Overall assessment of confidence | Explanation of judgement |
| 1 | Intervention source: self-assessment contributing to improvement | 30–35 37–43 96 | Moderate methodological limitation in seven studies, minor methodological limitations in three studies and four studies fulfilled all assessment criteria. | 14 studies from different countries and inspection settings offering relevant data addressing the review question. | Inconsistent findings. Data show that self-assessment can be regarded as useful and as initiating improvements, but data also indicated that self-assessment could be troublesome and not lead to improvement. | 14 studies offering diverse data from different inspection settings. | Low confidence | This judgement was graded low because of inconsistent findings. Our findings support that self-assessment can contribute to improvement provided certain factors are in place. |
| 4 | Adaptability: the importance of valid and relevant inspection standards | 36 44–55 61 | Moderate methodological limitation in five studies, minor methodological limitations in five studies and four studies fulfilled all assessment criteria. | 14 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings indicating that standards used for inspections must be valid and focused on clinical practice and the benefits for the patients. | 14 studies offered diverse data overall. | High | This judgement was graded high because of diverse, relevant and coherent data from 14 studies showing the importance of valid and relevant inspection standards. |
| 4 | Adaptability: guidance on how to follow up inspection findings | 43 44 50 56 57 | Moderate methodological limitation in two studies, minor methodological limitations in two studies and one study fulfilled all assessment criteria. | 5 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that inspected organisations want guidance on how to improve following an inspection and that guidance can facilitate implementation. | Moderate concern about the adequacy of data. 5 studies offered diverse data overall. | Moderate | This judgement was graded moderate because of moderate concern about the adequacy of data, and because of methodological limitations in four studies. |
| 6 | Complexity: complexity of inspections | 42 | Moderate methodological limitation in one study. | 1 study offering relevant data. | Only one study reported that it can be challenging for the inspection teams to synthesise, weigh and make sense of all the gathered data. | Major concern about the adequacy of data because only one study was included. | Low | This judgement was graded low because of major concern about the adequacy of data. |
| 7 | Design quality and packaging: knowledge and skills of surveyors and credibility of inspection schemes | 31 43 50 58–60 62–69 | Moderate methodological limitation in two studies, minor methodological limitation in eight studies and four studies fulfilled all assessment criteria. | 14 studies from different countries and inspection settings offering relevant data. | Coherent findings showing that knowledge and skills of the inspection teams are important for the credibility of the inspection findings. | 14 studies offered diverse data. | High | This judgement was graded high because of diverse, relevant and coherent data from 14 studies showing the importance of knowledge and skills of surveyors and credibility of the inspection schemes. |
| 7 | Design quality and packaging: unannounced inspections | 42 66 70–72 | Moderate methodological limitation in one study, minor methodological limitations in two studies and two studies fulfilled all assessment criteria. | 5 studies from different countries and different inspection settings offering relevant data. | Inconsistent findings about the role of unannounced inspections. | Moderate concern about the adequacy of data. | Low | This judgement was graded low because of moderate concerns about the adequacy of data, and inconsistent findings. There is evidence to support that unannounced and short-notice surveys require fewer resources for preparations, but there are inconsistent findings as to whether they can more effectively reveal substandard performance. |
| 7 | Design quality and packaging: choosing whom to inspect | 43 | One study fulfilled all assessment criteria. | 1 study from one country and one inspection setting. | Consistent finding that data used to risk assess in advance of inspection were not correlated with subsequent ratings after the inspections. | Major concern about the adequacy of data. | Low | This judgement was graded low because of major concern about the adequacy of data. |
| 8 | Cost: increased cost | 35 38 43 46 52 56 73 | Moderate methodological limitation in four studies, minor methodological limitations in one study and two studies fulfilled all assessment criteria. | 7 studies from different countries offering data about different inspection settings. | Coherent findings that inspections increased costs. | Minor concern about the adequacy of data. 7 studies offered diverse data overall. | Moderate | This judgement was graded moderate because of minor concern about the adequacy of data and methodological limitations in five studies. |
| 9 | Patient needs and resources: inspection can contribute to increased patient focus | 32 42 45 46 55 61 74–76 | Moderate methodological limitation in one study, minor methodological limitations in three studies and three studies fulfilled all assessment criteria. | 9 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings showing that inspections should be more directed towards patient care and clinical practice in order to contribute to improvement, and that standards that address patient needs can contribute to increased patient focus. | Minor concern about the adequacy of data. | Moderate | This judgement was graded moderate because of minor concern about the adequacy of data, and methodological limitations in four studies. |
| 10 | Cosmopolitanism: improved relations and communication with external community partners | 30 31 40–43 50 67 77–83 | Moderate methodological limitation in one study, minor methodological limitations in five studies and four studies fulfilled all assessment criteria. | 15 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings in that 14 out of 15 studies reported that inspections improved relations and communication with external partners. | 15 studies that together offered moderately diverse data. | High | This judgement was graded high because of diverse, relevant and coherent data from 14 studies showing that inspections can contribute to improve relations and communication with external community partners. |
| 10 | Cosmopolitanism: improvement in other organisations besides the one being inspected | 43 77 | Minor methodological limitations in one study, and one study fulfilled all assessment criteria. | 2 studies from different countries, but the same inspection setting, offering relevant data addressing the review question. | Coherent finding from two case studies that inspections can contribute to improvement in organisations other than the one being inspected. | Major concerns about adequacy of data. Two studies that together offered diverse data. | Low | This judgement was graded low because of major concerns about the adequacy of data. |
| 11 | Peer pressure: external pressure | 30 43 44 47 61 66 78 83 | Moderate methodological limitation in two studies, minor methodological limitations in four studies and three studies fulfilled all assessment criteria. | 9 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings indicating that being reviewed by someone external and independent can create an external pressure for change. | Minor concerns about adequacy of data. 9 studies that together offered diverse data. | Moderate confidence | This judgement was graded moderate because of minor concerns about the adequacy of data, and methodological limitations in six studies. |
| 12 | External policies and incentives: public confidence in services | 30 40 41 43 46 48 66 78 85 | Moderate methodological limitation in three studies, minor methodological limitations in five studies and one study fulfilled all assessment criteria. | 9 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that making the inspection finding publicly available can contribute to public confidence in the services. | Minor concerns about adequacy. 9 studies that together offered diverse data. | Moderate confidence | This judgement was graded moderate because of minor concerns about adequacy of data, and methodological limitations in eight studies. |
| 12 | External policies and incentives: incentives | 35 49 56 75 76 80 | Moderate methodological limitation in two studies, minor methodological limitations in three studies and one study fulfilled all assessment criteria. | 6 studies from different countries, but only one inspection setting offering relevant data addressing the review question. | Coherent findings that incentives can promote the implementation of accreditation programmes. | Moderate concerns about adequacy. 6 studies that together offered diverse data from one inspection setting. | Moderate confidence | This judgement was graded moderate because of moderate concerns regarding adequacy of data, and six studies from one inspection setting. |
| 12 | External policies and incentives: role of mass media | 35 43 48 66 77 | Moderate methodological limitation in one study, minor methodological limitations in three studies and one study fulfilled all assessment criteria. | 5 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that mass media tend to focus on shortcomings following inspections, and that media coverage can contribute to the implementation of change. | Moderate concerns about adequacy of data. 5 studies that together offered diverse data. | Moderate confidence | This judgement was graded moderate because of moderate concerns regarding adequacy of data and methodological limitations in four studies. |
| 12 | External policies and incentives: government involvement | 48 51 54 86–88 | Moderate methodological limitation in one study, minor methodological limitations in three studies and two studies fulfilled all assessment criteria. | 6 studies from different countries but only one inspection setting offering relevant data addressing the review question. | Coherent findings that government involvement can promote participation in accreditation. | Moderate concerns about adequacy of data. 6 studies that together offered diverse data from one inspection setting. | Moderate | This judgement was graded moderate because of moderate concerns regarding adequacy of data and methodological limitations in four studies. |
| 14 | Self-efficacy | 89 | One study with minor methodological limitation. | 1 study from one country and one inspection setting. | Coherent finding that accreditation can crowd in intrinsic motivation. | Major concern about adequacy of data. | Low | This judgement was graded low because of major concerns about adequacy of data. |
| 18 | Structural characteristics | 43 76 90–92 | Minor methodological limitations in one study, and four studies fulfilled all assessment criteria. | 5 studies from different countries and inspection settings offering relevant data addressing the review question. | Mixed and inconclusive findings regarding how structural characteristics affect inspections. | Major concern about adequacy of data. | Low | This judgement was graded low because of major concerns about adequacy of data and inconsistent findings. |
| 19 | Networks and communications: improved communication and creation of networks | 40–43 48 51 66 76 78 80 82 83 93–96 | Moderate methodological limitation in two studies, minor methodological limitations in 12 studies and two studies fulfilled all assessment criteria. | 16 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that external inspections can contribute to facilitation of networks and improved communications in these networks. | 16 studies that together offered moderately diverse data. | High | This judgement was graded high because of diverse, relevant and coherent data from 16 studies. |
| 21 | Tension for change: awareness of current practice and performance gaps | 30 31 33 34 39–43 47 56 61 66 67 76 77 82 86 93 95 96 98–104 | Minor methodological limitation in 8 studies, moderate methodological limitations in 13 studies and 7 studies fulfilled all assessment criteria. | 28 studies from different countries and inspection setting offering relevant data addressing the review question. | Coherent findings that external inspections can contribute to awareness of current practice and performance gaps. | 28 studies offering diverse data. | High confidence | This judgement was graded high because of diverse, relevant and coherent data from 28 studies. |
| 21 | Tension for change: awareness of more desirable practice | 30 31 48 82 83 93 95 97 103 | Minor methodological limitation in six studies, moderate methodological limitations in one study and two studies fulfilled all assessment criteria. | 9 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that external inspection can contribute to awareness of more desirable practice. | Minor concerns about adequacy. 9 studies that together offered diverse data from interviews. | Moderate confidence | This judgement was graded moderate because of minor concern about adequacy of data, and methodological limitations in seven studies. |
| 21 | Tension for change: commitment to change | 41 61 76 80 96 104 | Moderate methodological limitations in one study, minor methodological limitation in four studies and one study fulfilled all assessment criteria. | 6 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that external inspections can contribute to commitment to change. | Moderate concern about adequacy of data. 6 studies offering moderately diverse data. | Moderate confidence | This judgement was graded moderate because of moderate concern regarding adequacy of data, and methodological limitations in four studies. |
| 22 | Compatibility: perceived relevance of inspection findings | 32 35 49 61 66 76 96 105 | Minor methodological limitation in four studies, moderate methodological limitations in two studies and two studies fulfilled all assessment criteria. | 8 studies from different countries and inspection setting offering relevant data addressing the review question | Coherent findings that perceived relevance of inspection findings is important for engagement. | Minor concern about adequacy. 8 studies offered moderately diverse data overall. | Moderate confidence | This judgement was graded moderate because of minor concern regarding adequacy of data, and methodological limitations in six studies. |
| 23 | Relative priority: priority of inspected area | 8 41 42 51 56 77 95 | Minor methodological limitation in six studies, and moderate methodological limitations in one study. | 7 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings in the seven studies that external inspections can contribute to increased priority of the inspected areas. | Moderate concern about adequacy of data, because of the number of studies. | Moderate confidence | This judgement was graded moderate because of moderate concern regarding adequacy of data, and methodological limitations in six studies. |
| 24 | Organisational incentives and rewards: incentives | 106 | One study fulfilled all assessment criteria. | 1 study from one country and one inspection setting. | Coherent finding that recognition and rewards were associated with perceived quality results. | Major concern about adequacy of data. | Low | This judgement was graded low because of major concerns about adequacy of data. |
| 25 | Goals and feedback: goal setting | 56 66 80 103 | Minor methodological limitation in three studies, and moderate methodological limitations in one study. | 4 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings in the four studies that external inspections can contribute to setting improvement goals. | Moderate concern about adequacy because of the number of studies. | Moderate confidence | This judgement was graded moderate because of moderate concern regarding adequacy of data, and methodological limitations in four studies. |
| 26 | Learning climate: improved organisational climate and learning climate | 38 42 43 51 66 80 81 83 93 94 96 107–109 | Moderate methodological limitation in four studies, minor methodological limitations in seven studies and three studies fulfilled all assessment criteria. | 14 studies from different countries and inspection settings offering relevant data addressing the review question. | Inconsistent findings: external inspections can contribute to improved organisational climate and learning climate, but can also lower staff morale by solely focusing on what is wrong. | Minor concerns about adequacy. 14 studies that together offered moderately diverse data. | Moderate confidence | This judgement was graded moderate because of minor concerns regarding adequacy of data, combined with methodological limitations in seven studies and the finding that inspections can lower staff morale if solely focusing on what is wrong. |
| 26 | Learning climate: reflection on clinical practice | 31 32 41–43 45 66 76 80 | Moderate methodological limitation in two studies, minor methodological limitations in six studies and one study fulfilled all assessment criteria. | 9 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that external inspection can facilitate reflection on clinical practice. | Minor concerns about adequacy of data. 9 studies that together offered moderately diverse data. | Moderate confidence | This judgement was graded moderate because of minor concerns regarding adequacy of data, and methodological limitations in eight studies. |
| 26 | Learning climate: improved understanding of organisation and interdependencies in clinical system | 31 35 38 40 41 80 95 96 108 | Minor methodological limitation in two studies, moderate methodological limitations in three studies and one study fulfilled all assessment criteria. | Moderate concern about the relevance of studies because all are from an accreditation context. | Coherent findings that external inspection can contribute to improved understanding of the clinical system and the organisation as a whole. | Moderate concerns about adequacy. 9 studies that together offered moderately diverse data. | Moderate confidence | This judgement was graded moderate because of coherent findings, but moderate concerns regarding adequacy of data and methodological limitations in seven studies. |
| 27 | Leadership engagement: leader engagement | 40 43 47 49 51 66 77 80 81 94 102 106 108 110–113 | Moderate methodological limitation in two studies, minor methodological limitations in nine studies and six studies fulfilled all assessment criteria. | 17 studies from different countries and inspection settings offering relevant data addressing the review question. | Inconsistent findings. The vast majority of the studies show that external inspections can contribute to leader engagement, but a few studies show that they do not. | Moderate concern about adequacy of findings because of conflicting evidence. | Moderate confidence | This judgement was graded moderate because of methodological limitations in 11 studies, moderate concern about adequacy of data and inconsistent findings. |
| 28 | Available resources: allocation of resources | 31 35–38 41 42 49 56 61 66 75 76 80 86 93 95 96 112 114 | Minor methodological limitation in nine studies, moderate methodological limitations in five studies and five studies fulfilled all assessment criteria. | 19 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that external inspection can affect allocation of resources in different ways. | 19 studies offering diverse data about how external inspections can affect allocation of resources. | High confidence | This judgement was graded high because of diverse, relevant and coherent data from 19 studies indicating that inspections contribute to altering the allocation of resources. |
| 30 | Planning: planning improvement interventions | 30 31 42 61 76 86 96 100–102 104 | Minor methodological limitation in six studies, and five studies fulfilled all assessment criteria. | 11 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that external inspections can contribute to the planning of improvement interventions. | Minor concerns about adequacy of data. 11 studies that together offered moderately diverse data. | Moderate confidence | This judgement was graded moderate because of minor concerns regarding adequacy of data, and methodological limitations in six studies. |
| 31 | Engaging: engage and involve staff in improvement activities is important for impact of inspections | 41–43 45 48 51 66 76 80 83 86 96 97 106 108 112 113 | Moderate methodological limitation in one study, minor methodological limitations in nine studies and seven studies fulfilled all assessment criteria. | 17 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that inspections can contribute to engaging staff in improvement activities, and that involvement of staff is important because it can be associated with quality results. | 17 studies offering diverse data about how external inspections can contribute to engaging staff. | High confidence | This judgement was graded high because of diverse, relevant and coherent data from 17 studies indicating that inspections can contribute to staff engagement. |
| 32 | Executing: implementing improvement interventions | 30 32 37 39–45 49–51 57 61 65 66 76 79 80 82 83 86 96 97 99–102 104 115–118 | Moderate methodological limitation in 5 studies, minor methodological limitations in 16 studies and 13 studies fulfilled all assessment criteria. | 34 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that external inspections can affect implementation of improvement measures, but this kind of change does not necessarily imply that the quality of care is improved. | 34 studies offering diverse data. | High confidence | This judgement was graded high because of diverse, relevant and coherent data from 34 studies. |
| 33 | Reflecting and evaluating: evaluation and continuous improvement if that is incorporated in the standard | 30 31 46 48 50 61 66 76 79 85 93 116 | Moderate methodological limitation in two studies, minor methodological limitations in seven studies and three studies fulfilled all assessment criteria. | 12 studies from different countries and inspection settings offering relevant data addressing the review question. | Inconsistent findings. Data support that external inspections can contribute to evaluation and continuous improvement when it is incorporated in the standard. | Minor concern about adequacy of findings. | Moderate confidence | This judgement was graded moderate because of methodological limitations in nine studies combined with minor concern about adequacy of data. |
| 33 | Reflecting and evaluating: use of measurement systems and data | 43 48 51 56 61 76 82 85 108 113 119–122 | Minor methodological limitation in 10 studies, moderate methodological limitations in 3 studies and 1 study fulfilled all assessment criteria. | 14 studies from different countries and inspection settings offering relevant data addressing the review question. | Coherent findings that external inspection can contribute to use of measurement systems. | 14 studies offered diverse and relevant data. | High confidence | This judgement was graded high because of diverse, relevant and coherent data from 14 studies. |
CFIR, Consolidated Framework for Implementation Research.
Discussion
Our review suggests that the mechanisms by which external inspections might contribute to improving the quality of care depend on complex interactions between factors related to the outer setting, how the inspections are conducted and how they are perceived and acted on by the inspected organisations.
Inspections serve different purposes including accountability and transparency in the services.1 Our findings indicate that public disclosure of inspection findings can contribute towards transparency about the standard of the service, and thereby to public trust, by ensuring that the services meet a certain minimum standard.66 85 Healthcare organisations serve the public, and the review findings suggest that the inspection process can contribute to accountability by creating an opportunity for community partners to provide input on how the inspected organisations should improve.30 41
Regulatory guidance
The regulatory context in which the inspections take place affects how they are perceived and acted on by the organisations being inspected. An active regulator along with incentives, disclosure of inspection results and mandatory programmes seem to promote participation and increase resource allocation towards the inspected area.49 56 Our findings indicate that inspected organisations value guidance on how to follow up inspection findings, and that such guidance can facilitate implementation of changes.50 57 Another key reason for conducting inspections is to promote quality improvement. Our findings suggest that the standards used during the inspections should focus on patient needs and be translated into something that is perceived as meaningful by the front-line staff.45 61
Use of self-assessment tools
Most external inspections include a self-assessment phase in which the inspected organisations are expected to prepare and review their own practice prior to the external assessment. Our findings about this phase are contradictory; the perceived usefulness and relevance of the available tools seem to have a great impact on its outcome. Self-assessment tools that address patient care and that help the front-line staff to review their clinical practice are perceived as useful and can contribute to change.30 39 Self-assessment tools primarily concerned with providing documentation and that do not address patient care are perceived as burdensome and do not seem to contribute to changes in care delivery.31 35
Credible, predictable and transparent inspection processes
External inspections represent an assessment of the inspected organisations’ performance, and an evaluation of how the staff do their work. If substandard performance is identified, the performance, and consequently the way the staff do their work, needs to be changed. In order to engage in change processes, the inspection scheme and the findings must be perceived as valid and reliable by the relevant stakeholders. The interaction between the surveyors and the inspected organisation is critical for establishing a credible inspection scheme in which the findings are acted on. The inspection teams’ knowledge about the inspected area and their communication skills are reported to be prerequisites for confidence in the inspection findings.50 65 Inspection processes should be predictable and transparent. The surveyors therefore need to ensure that the inspection findings and the judgement of them are reliable across different organisations.54 59
Measures to bring about change in clinical processes
Our findings suggest that external inspections can affect different mediators of organisational change. The main activities of the improvement process following an inspection include planning and implementing improvement measures, evaluation and continuous improvement.23 123 The findings suggest that when substandard performance is identified, the inspected organisation is expected to plan and implement improvement measures that address the identified performance gaps and evaluate their effects. Inspections can affect all steps in this basic improvement process.
Organisational change is a precondition for improved quality of care, but it is possible to implement organisational change that does not affect the way care is delivered.124 Our findings indicate that there is an inherent danger that improvement measures following an inspection mainly address deficiencies in management systems and support processes, that is, updating or creating new written guidelines and educational activities. Structural changes of this kind can be a precondition for making changes in the corresponding clinical processes.125 However, structural changes not sufficiently linked to corresponding changes in the clinical processes can have limited effect on the quality of care.126 127
We found evidence that inspections can affect other organisational change and discursive activities besides the basic steps in the improvement process. Our findings indicate that the change and discursive activities can interact and affect the improvement process in complex ways. Based on our theoretical framework, we argue that the way and to what extent inspections affect these change and discursive activities can be crucial for whether the inspection actually leads to organisational change that improves the quality of care.
Strengthening networks and promoting learning
Inspections can contribute to enhancing communication about clinical work and facilitating the development of networks through which organisational members meet and reflect on their own clinical practice and the findings of the inspection, thereby improving the learning climate. Involvement and engagement from leaders and staff can be a prerequisite for such multiprofessional reflection, which can be key to shaping a shared organisational understanding of the organisation’s actual performance and of areas needing improvement, thus contributing to readiness and acceptance for change. Moreover, the reflection can improve the organisational members’ understanding of their clinical system and its interdependencies, on which they can base their planning and implementation of improvement measures. We argue that improvement measures that are planned and implemented based on a new and enhanced understanding of the clinical system and its interdependencies are more likely to produce organisational change that changes the way clinical care is delivered and thereby improves the quality of care.
Strengths and limitations
The strength of our study is that we conducted a systematic literature search to identify relevant studies and used a theoretical framework to extract and analyse heterogeneous data to identify possible mediators of change. Our theoretical framework does not represent a complete list of all possible constructs that might contribute to explaining how external inspection affects organisational change and the quality of care. Nevertheless, the theoretical framework was sufficiently comprehensive to include a wide range of constructs for exploring how inspections affect organisations, which in turn can advance our understanding of how inspections can contribute to improving the quality of care.
We assessed nine of our findings to have high confidence and 19 of our findings to have moderate confidence, which indicates that it is highly likely or likely that our findings represent the phenomenon of interest (table 3). The fact that we assessed a finding to have high confidence does not necessarily imply that inspections always will have that particular impact, because they are complex and context dependent. Two of the findings assessed to have high confidence were external inspections’ contribution towards awareness of current practice and performance and implementing improvement interventions. These findings might seem contradictory to previous research suggesting that external inspections have limited impact on the quality of care.128 We argue that these findings illustrate what seems to be a key challenge for external inspections. Even though they might contribute to discovering substandard performance and facilitate the implementation of organisational change, the quality of care is not improved unless the change processes affect care delivery to patients. Eight findings were assessed to have low confidence, indicating that it is possible that the review findings are a reasonable representation of the phenomenon of interest. The main reasons for grading a finding as low were inadequacy of data and inconsistent findings, which can partly be explained by the fact that external inspections are complex interventions introduced into different organisational contexts.129
Implications and future research
In order to contribute to quality improvement, inspections need to affect organisational change activities involved in improving care delivery. We found that inspections can affect different mediators of organisational change, and our findings can thereby enhance our understanding of why inspections seem to have varying effects. Our findings can provide guidance for policy makers and inspectors on how future inspections should be designed and conducted to be more effective. Organisational change to improve clinical services may be promoted by regulatory guidance, use of self-assessment tools as part of the inspection, a credible, predictable and transparent inspection process, and development of measures to bring about change in clinical processes.
This is the first review addressing how external inspections can contribute to improving the quality of care. Future studies should further explore relationships between how the inspections are carried out, their contextual setting and the way they can mediate change in care delivery in the inspected organisations.
Conclusion
External inspections can affect different mediators of organisational change. The way and to what extent they do depend on a range of factors related to the outer setting, the way inspections are conducted and how they are perceived and acted on by the inspected organisation. They can affect the key activities involved in planning, implementing and evaluating organisational change and the organisational discourse about these ongoing change activities. To improve the quality of care, the organisational change processes need to involve and affect the way care is delivered to the patients.
Supplementary Material
Acknowledgments
The authors thank Signe Romuld, Lise Vik-Haugen and Gerd Vik, librarians at the Norwegian Board of Health Supervision and Western Norway University of Applied Sciences, for valuable help in developing and conducting the database search.
Footnotes
Contributors: EHo and GSB contributed to developing the study design, screening articles, data extraction, data synthesis, assessing quality of included articles and confidence in review findings, and drafting of the manuscript and revisions. EHa contributed to developing the study design, screening articles and drafting of the manuscript and revisions. KW, OB, SF, PS and JCF contributed to developing the study design, and drafting of the manuscript and revisions.
Funding: The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests: None declared.
Patient consent for publication: Not required.
Provenance and peer review: Not commissioned; externally peer reviewed.
Data availability statement: Data sharing not applicable as no datasets generated and/or analysed for this study. This is a systematic review, and all the data that we have used are taken from previously published material.
References
- 1.Walshe K, Boyd A. Designing regulation: a review. Manchester: The University of Manchester - Manchester Business School, 2007. [Google Scholar]
- 2.Shaw CD, Braithwaite J, Moldovan M, et al. Profiling health-care accreditation organizations: an international survey. Int J Qual Health Care 2013;25:222–31. 10.1093/intqhc/mzt011 [DOI] [PubMed] [Google Scholar]
- 3.Shaw C, Groene O, Berger E, et al. External institutional strategies: accreditation, certification, supervision. In: Busse R, Klazinga N, Panteli D, et al., eds. Improving healthcare quality in Europe: characteristics, effectiveness and implementation of different strategies. Copenhagen, Denmark: European Observatory on Health Systems and Policies, 2019. [PubMed] [Google Scholar]
- 4.Walshe K. Clinical governance: from policy to practice. Birmingham: Health Services Management Centre, University of Birmingham, 2000. [Google Scholar]
- 5.Walshe K, Phipps D. Developing a strategic framework to guide the care quality commission’s programme of evaluation. Manchester: University of Manchester - Manchester Business School, 2013. [Google Scholar]
- 6.Brubakk K, Vist GE, Bukholm G, et al. A systematic review of hospital accreditation: the challenges of measuring complex intervention effects. BMC Health Serv Res 2015;15:280. 10.1186/s12913-015-0933-x [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Flodgren G, Gonçalves-Bradley DC, Pomey M-P, et al. External inspection of compliance with standards for improved healthcare outcomes. Cochrane Database Syst Rev 2016;12:Cd008992. 10.1002/14651858.CD008992.pub3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Castro-Avila A, Bloor K, Thompson C. The effect of external inspections on safety in acute hospitals in the National health service in England: a controlled interrupted time-series analysis. J Health Serv Res Policy 2019;24:182–90. 10.1177/1355819619837288 [DOI] [PubMed] [Google Scholar]
- 9.Angus D, Brouwers M, Driedger M, et al. Designing theoretically-informed implementation interventions: the Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG). Implement Sci 2006;1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Allen T, Walshe K, Proudlove N, et al. Measurement and improvement of emergency department performance through inspection and rating: an observational study of emergency departments in acute hospitals in England. Emerg Med J 2019;36:326–32. 10.1136/emermed-2018-207941 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Flodgren G, Pomey M-P, Taber Sarah A, et al. Effectiveness of external inspection of compliance with standards in improving healthcare organisation behaviour, healthcare professional behaviour or patient outcomes. Cochrane Database Syst Rev 2011;11:CD008992. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Ngo D, Ed B, Putters K. Supervising the quality of care in changing healthcare systems - an international comparison. Rotterdam: Dept. of Healthcare Governance Institute of Health Policy and Management, Erasmus University Medical Center, 2008. [Google Scholar]
- 13.Kok J, Leistikow I, Bal R. Pedagogy of regulation: strategies and instruments to supervise learning from adverse events. Regul Gov 2019;13:470–87. 10.1111/rego.12242 [DOI] [Google Scholar]
- 14.Bengoa R, Kawar R, Key P, et al. Quality of care : a process for making strategic choices in health systems. Geneva: World Health Organization, 2006. [Google Scholar]
- 15.Institute of Medicine Committee on Quality of Health Care in America. Crossing the quality chasm: a new health system for the 21st century. Washington DC: National Academies Press, 2001. [Google Scholar]
- 16.Berwick DM. Crossing the boundary: changing mental models in the service of improvement. Int J Qual Health Care 1998;10:435–41. 10.1093/intqhc/10.5.435 [DOI] [PubMed] [Google Scholar]
- 17.Plsek PE, Greenhalgh T. Complexity science: the challenge of complexity in health care. BMJ 2001;323:625–8. 10.1136/bmj.323.7313.625 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 18.Grant D, Marshak RJ. Toward a discourse-centered understanding of organizational change. J Appl Behav Sci 2011;47:204–35. 10.1177/0021886310397612 [DOI] [Google Scholar]
- 19.Beach D. It's all about mechanisms – what process-tracing case studies should be tracing. New Political Economy 2016;21:463–72. 10.1080/13563467.2015.1134466 [DOI] [Google Scholar]
- 20.Machamer P, Darden L, Craver CF. Thinking about mechanisms. Philos Sci 2000;67:1–25. 10.1086/392759 [DOI] [Google Scholar]
- 21.Weiner BJ, Amick H, Lee SY. Conceptualization and measurement of organizational readiness for change: a review of the literature in health services research and other fields. Med Care Res Rev 2008;65:379–436. [DOI] [PubMed] [Google Scholar]
- 22.Schmidt VA. Discursive Institutionalism: the explanatory power of ideas and discourse. Annu Rev Polit Sci 2008;11:303–26. 10.1146/annurev.polisci.11.060606.135342 [DOI] [Google Scholar]
- 23.Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009;4:50. 10.1186/1748-5908-4-50 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 24.Popay J, Roberts H, Sowden A, et al. Guidance on the conduct of narrative synthesis in systematic reviews: final report. Swindon: ESRC Methods Programme, 2006. [Google Scholar]
- 25.Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res 2005;15:1277–88. 10.1177/1049732305276687 [DOI] [PubMed] [Google Scholar]
- 26.Creswell JW. Qualitative inquiry & research design: choosing among five approaches. Thousand Oaks: Sage Publications, 2007. [Google Scholar]
- 27.Hinchcliff R, Greenfield D, Moldovan M, et al. Narrative synthesis of health service accreditation literature. BMJ Qual Saf 2012;21:979–91. 10.1136/bmjqs-2012-000852 [DOI] [PubMed] [Google Scholar]
- 28.Cunningham FC, Ranmuthugala G, Plumb J, et al. Health professional networks as a vector for improving healthcare quality and safety: a systematic review. BMJ Qual Saf 2012;21:239–49. 10.1136/bmjqs-2011-000187 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.NHMRC. How to review the evidence: systematic identification and review of the scientific literature. Canberra: National Health and Medical Research Council, 1999. [Google Scholar]
- 30.Greenfield D, Kellner A, Townsend K, et al. Health service accreditation reinforces a mindset of high-performance human resource management: lessons from an Australian study. Int J Qual Health Care 2014;26:372–7. 10.1093/intqhc/mzu039 [DOI] [PubMed] [Google Scholar]
- 31.Greenfield D, Pawsey M, Braithwaite J. What motivates professionals to engage in the accreditation of healthcare organizations? Int J Qual Health Care 2011;23:8–14. 10.1093/intqhc/mzq069 [DOI] [PubMed] [Google Scholar]
- 32.Lombarts MJMH, Klazinga NS. Supporting Dutch medical specialists with the implementation of visitatie recommendations: a descriptive evaluation of a 2-year project. Int J Qual Health Care 2003;15:119–29. 10.1093/intqhc/mzg020 [DOI] [PubMed] [Google Scholar]
- 33.Vanoli M, Traisci G, Franchini A, et al. A program of professional accreditation of hospital wards by the Italian Society of internal medicine (SIMI): self- versus peer-evaluation. Intern Emerg Med 2012;7:27–32. 10.1007/s11739-011-0684-6 [DOI] [PubMed] [Google Scholar]
- 34.OoPM. Evaluation of a national audit of specialist in healthcare services for people with learning difficulties in England. Office of Public Management, 2007. [Google Scholar]
- 35.Doyle G, Grampp C. Accreditation as a quality tool in public sector reform: the fourth stage of convergence. Dublin: School of Business, College of Business and Law, University College Dublin, 2008. [Google Scholar]
- 36.Due TD, Thorsen T, Kousgaard MB. Understanding accreditation standards in general practice - a qualitative study. BMC Fam Pract 2019;20:23. 10.1186/s12875-019-0910-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 37.Fairbrother G, Gleeson M. EQuIP accreditation: feedback from a Sydney teaching hospital. Aust Health Rev 2000;23:153–62. 10.1071/AH000153 [DOI] [PubMed] [Google Scholar]
- 38.Grenade L, Boldy D. The accreditation experience: views of residential aged care providers. Geriaction 2002;20:5–9. [Google Scholar]
- 39.Devkaran S, O'Farrell PN. The impact of hospital accreditation on clinical documentation compliance: a life cycle explanation using interrupted time series analysis. BMJ Open 2014;4:e005240. 10.1136/bmjopen-2014-005240 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Duckett SJ. Changing hospitals: the role of hospital accreditation. Soc Sci Med 1983;17:1573–9. 10.1016/0277-9536(83)90102-8 [DOI] [PubMed] [Google Scholar]
- 41.Pomey M-P, Lemieux-Charles L, Champagne F, et al. Does accreditation stimulate change? a study of the impact of the accreditation process on Canadian healthcare organizations. Implement Sci 2010;5:31. 10.1186/1748-5908-5-31 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Walshe K, Addicott R, Boyd A. Evaluating the Care Quality Commission’s acute hospital regulatory model: final report. Manchester: The University of Manchester and The King’s Fund, 2014. [Google Scholar]
- 43.Smithson R, Richardson E, Roberts J. Impact of the Care Quality Commission on provider performance: room for improvement? London: The King’s Fund, 2018. [Google Scholar]
- 44.Askim J. Hvordan påvirker det statlige tilsynet kommunene og det lokale selvstyret? [How does state supervision affect the municipalities and local self-government?]. Oslo: OsloMet By- og regionforskningsinstituttet NIBR, 2013. [Google Scholar]
- 45.Campbell SM, Chauhan U, Lester H. Primary medical care provider accreditation (PMCPA): pilot evaluation. Br J Gen Pract 2010;60:e295–304. 10.3399/bjgp10X514800 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Commonwealth of Australia Evaluation of the impact of accreditation on the delivery of quality of care and quality of life to residents in Australian government subsidised residential aged care homes. Canberra: Commonwealth of Australia, 2007. [Google Scholar]
- 47.Ésik O, Seitz W, Lövey J, et al. External audit on the clinical practice and medical decision-making at the departments of radiotherapy in Budapest and Vienna. Radiother Oncol 1999;51:87–94. 10.1016/S0167-8140(98)00144-3 [DOI] [PubMed] [Google Scholar]
- 48.Greenfield D, Hinchcliff R, Hogden A, et al. A hybrid health service accreditation program model incorporating mandated standards and continuous improvement: interview study of multiple stakeholders in Australian health care. Int J Health Plann Manage 2016;31:e116–30. 10.1002/hpm.2301 [DOI] [PubMed] [Google Scholar]
- 49.Hinchcliff R, Greenfield D, Westbrook JI, et al. Stakeholder perspectives on implementing accreditation programs: a qualitative study of enabling factors. BMC Health Serv Res 2013;13:1–9. 10.1186/1472-6963-13-437 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 50.The Norwegian Board of Health Supervision. "Vi får satt fokus, blir bevisstgjort og må skjerpe faget" - en deskriptiv undersøkelse med kommunale helse- og omsorgstjenester til eldre ["We get things in focus, become more aware and have to sharpen our professional practice" - a descriptive study of municipal health and care services for the elderly]. Oslo: The Norwegian Board of Health Supervision, 2013. [Google Scholar]
- 51.Melo S. The impact of accreditation on healthcare quality improvement: a qualitative case study. J Health Organ Manag 2016;30:1242–58. 10.1108/JHOM-01-2016-0021 [DOI] [PubMed] [Google Scholar]
- 52.Saleh SS, Bou Sleiman J, Dagher D, et al. Accreditation of hospitals in Lebanon: is it a worthy investment? Int J Qual Health Care 2013;25:284–90. 10.1093/intqhc/mzt018 [DOI] [PubMed] [Google Scholar]
- 53.Greenfield D, Civil M, Donnison A, et al. A mechanism for revising accreditation standards: a study of the process, resources required and evaluation outcomes. BMC Health Serv Res 2014;14:571. 10.1186/s12913-014-0571-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 54.Greenfield D, Hinchcliff R, Banks M, et al. Analysing 'big picture' policy reform mechanisms: the Australian health service safety and quality accreditation scheme. Health Expect 2015;18:3110–22. 10.1111/hex.12300 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 55.Hinchcliff R, Greenfield D, Hogden A, et al. Levers for change: an investigation of how accreditation programmes can promote consumer engagement in healthcare. Int J Qual Health Care 2016;28:561–5. 10.1093/intqhc/mzw074 [DOI] [PubMed] [Google Scholar]
- 56.Pham HH, Coughlan J, O'Malley AS. The impact of quality-reporting programs on hospital operations. Health Aff 2006;25:1412–22. 10.1377/hlthaff.25.5.1412 [DOI] [PubMed] [Google Scholar]
- 57.Lombarts MJMHK, Klazinga NSN, Redekop KK. Measuring the perceived impact of facilitation on implementing recommendations from external assessment: lessons from the Dutch visitatie programme for medical specialists. J Eval Clin Pract 2005;11:587–97. 10.1111/j.1365-2753.2005.00595.x [DOI] [PubMed] [Google Scholar]
- 58.Greenfield D, Debono D, Hogden A, et al. Examining challenges to reliability of health service accreditation during a period of healthcare reform in Australia. J Health Organ Manag 2015;29:912–24. 10.1108/JHOM-02-2015-0034 [DOI] [PubMed] [Google Scholar]
- 59.Greenfield D, Hogden A, Hinchcliff R, et al. The impact of national accreditation reform on survey reliability: a 2-year investigation of survey coordinators' perspectives. J Eval Clin Pract 2016;22:662–7. 10.1111/jep.12512 [DOI] [PubMed] [Google Scholar]
- 60.Boyd A, Addicott R, Robertson R, et al. Are inspectors' assessments reliable? ratings of NHS acute hospital trust services in England. J Health Serv Res Policy 2017;22:28–36. 10.1177/1355819616669736 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Benson LA, Boyd A, Walshe K. Learning from regulatory interventions in healthcare: the Commission for health improvement and its clinical governance review process. Clin Govern Int 2006;11:213–24. [Google Scholar]
- 62.Greenfield D, Pawsey M, Naylor J, et al. Are accreditation surveys reliable? Int J Health Care Qual Assur 2009;22:105–16. 10.1108/09526860910944601 [DOI] [PubMed] [Google Scholar]
- 63.Tuijn SM, van den Bergh H, Robben P, et al. Experimental studies to improve the reliability and validity of regulatory judgments on health care in the Netherlands: a randomized controlled trial and before and after case study. J Eval Clin Pract 2014;20:352–61. 10.1111/jep.12136 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 64.Boyd A, Ross S, Robertson R, et al. How hospital survey teams function. J Health Organ Manag 2018;32:206–23. 10.1108/JHOM-07-2017-0175 [DOI] [PMC free article] [Google Scholar]
- 65.Arianson H, Elvbakken KT, Malterud K. Hvordan opplevde helsepersonell tilsynet med fødeinstitusjoner? [How did health personnel experience the inspection of maternity institutions?] Tidsskr Nor Laegeforen 2008;128:1179–81. [PubMed] [Google Scholar]
- 66.OPM Evaluation Team. Evaluation of the Healthcare Commission’s healthcare associated infections inspection programme. London: OPM, 2009. [Google Scholar]
- 67.Schaefer C, Wiig S. Strategy and practise of external inspection in healthcare services – a Norwegian comparative case study. Safety in Health 2017;3:3. 10.1186/s40886-017-0054-9 [DOI] [Google Scholar]
- 68.Teymourzadeh E, Ramezani M, Arab M, et al. Surveyor management of hospital accreditation program: a thematic analysis conducted in Iran. Iran Red Crescent Med J 2016;18:e30309. 10.5812/ircmj.30309 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 69.Greenfield D, Pawsey M, Naylor J, et al. Researching the reliability of accreditation survey teams: lessons learnt when things went awry. Health Inf Manag 2013;42:4–10. 10.1177/183335831304200101 [DOI] [PubMed] [Google Scholar]
- 70.Uren H, Vidakovic B, Daly M, et al. Short-notice (48 hours) accreditation trial in Australia: stakeholder perception of assessment thoroughness, resource requirements and workforce engagement. BMJ Open Qual 2019;8:e000713. 10.1136/bmjoq-2019-000713 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 71.Ehlers LH, Simonsen KB, Jensen MB, et al. Unannounced versus announced hospital surveys: a nationwide cluster-randomized controlled trial. Int J Qual Health Care 2017;29:406–11. 10.1093/intqhc/mzx039 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 72.Greenfield D, Moldovan M, Westbrook M, et al. An empirical test of short notice surveys in two accreditation programmes. Int J Qual Health Care 2012;24:65–71. 10.1093/intqhc/mzr074 [DOI] [PubMed] [Google Scholar]
- 73.Mumford V, Greenfield D, Hogden A, et al. Counting the costs of accreditation in acute care: an activity-based costing approach. BMJ Open 2015;5:e008850. 10.1136/bmjopen-2015-008850 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 74.Andres EB, Song W, Song W, et al. Can hospital accreditation enhance patient experience? longitudinal evidence from a Hong Kong hospital patient experience survey. BMC Health Serv Res 2019;19:623. 10.1186/s12913-019-4452-z [DOI] [PMC free article] [PubMed] [Google Scholar]
- 75.Hogden A, Greenfield D, Brandon M, et al. How does accreditation influence staff perceptions of quality in residential aged care? Qual Ageing Older Adults 2017;18:131–44. 10.1108/QAOA-07-2016-0028 [DOI] [Google Scholar]
- 76.Nouwens E, van Lieshout J, Wensing M. Determinants of impact of a practice accreditation program in primary care: a qualitative study. BMC Fam Pract 2015;16:78. 10.1186/s12875-015-0294-x [DOI] [PMC free article] [PubMed] [Google Scholar]
- 77.Åsprang AF, Frich JC, Braut GS. Organizational impact of governmental audit of blood transfusion services in Norway: a qualitative study. Transfus Apher Sci 2015;53:228–32. 10.1016/j.transci.2015.04.015 [DOI] [PubMed] [Google Scholar]
- 78.Alyahya M, Hijazi H, Harvey H. Explaining the accreditation process from the institutional isomorphism perspective: a case study of Jordanian primary healthcare centers. Int J Health Plann Manage 2018;33:102–20. 10.1002/hpm.2397 [DOI] [PubMed] [Google Scholar]
- 79.Davis MV, Cannon MM, Stone DO, et al. Informing the National public health accreditation movement: lessons from North Carolina's accredited local health departments. Am J Public Health 2011;101:1543–8. 10.2105/AJPH.2011.300199 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 80.Pomey M-P, Contandriopoulos A-P, François P, et al. Accreditation: a tool for organizational change in hospitals? Int J Health Care Qual Assur Inc Leadersh Health Serv 2004;17:113–24. 10.1108/09526860410532757 [DOI] [PubMed] [Google Scholar]
- 81.Moe G, Wang KH, Kousonsavath S. Accreditation: a quality improvement strategy for the community-based family practice. Healthc Q 2019;21:13–20. 10.12927/hcq.2019.25746 [DOI] [PubMed] [Google Scholar]
- 82.Rajan A, Wind A, Saghatchian M, et al. Staff perceptions of change resulting from participation in a European cancer accreditation programme: a snapshot from eight cancer centres. Ecancermedicalscience 2015;9:547. 10.3332/ecancer.2015.547 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 83.Rivas C, Taylor S, Abbott S, et al. Perceptions of changes in practice following peer review in the National chronic obstructive pulmonary disease resources and outcomes project. Int J Health Care Qual Assur 2012;25:91–105. 10.1108/09526861211198263 [DOI] [PubMed] [Google Scholar]
- 84.Savelsbergh CMJH, van der Heijden BIJM, Poell RF. The development and empirical validation of a multidimensional measurement instrument for team learning behaviors. Small Group Res 2009;40:578–607. 10.1177/1046496409340055 [DOI] [Google Scholar]
- 85.Knutson AC, McNamara EJ, McKellar DP, et al. The role of the American College of Surgeons' cancer program accreditation in influencing oncologic outcomes. J Surg Oncol 2014;110:611–5. 10.1002/jso.23680 [DOI] [PubMed] [Google Scholar]
- 86.El-Jardali F, Hemadeh R, Jaafar M, et al. The impact of accreditation of primary healthcare centers: successes, challenges and policy implications as perceived by healthcare providers and directors in Lebanon. BMC Health Serv Res 2014;14:86. 10.1186/1472-6963-14-86 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 87.Shah GH, Corso L, Sotnikov S, et al. Impact of local boards of health on local health department accreditation, community health assessment, community health improvement planning, and strategic planning. J Public Health Manag Pract 2019;25:423–30. 10.1097/PHH.0000000000000847 [DOI] [PubMed] [Google Scholar]
- 88.Triantafillou P. Against all odds? understanding the emergence of accreditation of the Danish hospitals. Soc Sci Med 2014;101:78–85. 10.1016/j.socscimed.2013.11.009 [DOI] [PubMed] [Google Scholar]
- 89.Pedersen LB, Andersen MKK, Jensen UT, et al. Can external interventions crowd in intrinsic motivation? a cluster randomised field experiment on mandatory accreditation of general practice in Denmark. Soc Sci Med 2018;211:224–33. 10.1016/j.socscimed.2018.06.023 [DOI] [PubMed] [Google Scholar]
- 90.Lam MB, Figueroa JF, Feyman Y, et al. Association between patient outcomes and accreditation in US hospitals: observational study. BMJ 2018;363:k4011. 10.1136/bmj.k4011 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 91.Bogh SB, Falstie-Jensen AM, Hollnagel E, et al. Predictors of the effectiveness of accreditation on hospital performance: a nationwide stepped-wedge study. Int J Qual Health Care 2017;29:477–83. 10.1093/intqhc/mzx052 [DOI] [PubMed] [Google Scholar]
- 92.Greenfield D, Lawrence SA, Kellner A, et al. Health service accreditation stimulating change in clinical care and human resource management processes: a study of 311 Australian hospitals. Health Policy 2019;123:661–5. 10.1016/j.healthpol.2019.04.006 [DOI] [PubMed] [Google Scholar]
- 93.Baskind R, Kordowicz M, Chaplin R. How does an accreditation programme drive improvement on acute inpatient mental health wards? an exploration of members' views. J Ment Health 2010;19:405–11. 10.3109/09638230903531118 [DOI] [PubMed] [Google Scholar]
- 94.Braithwaite J, Greenfield D, Westbrook J, et al. Health service accreditation as a predictor of clinical and organisational performance: a blinded, random, stratified study. Qual Saf Health Care 2010;19:14–21. 10.1136/qshc.2009.033928 [DOI] [PubMed] [Google Scholar]
- 95.Kilsdonk MJ, Siesling S, Otter R, et al. Two decades of external peer review of cancer care in general hospitals; the Dutch experience. Cancer Med 2016;5:478–85. 10.1002/cam4.612 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 96.Kousgaard MB, Thorsen T, Due TD. Experiences of accreditation impact in general practice - a qualitative study among general practitioners and their staff. BMC Fam Pract 2019;20:146. 10.1186/s12875-019-1034-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 97.Algahtani H, Aldarmahi A, Manlangit J, et al. Perception of hospital accreditation among health professionals in Saudi Arabia. Ann Saudi Med 2017;37:326–32. 10.5144/0256-4947.2017.326 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 98.Daucourt V, Michel P. Results of the first 100 accreditation procedures in France. Int J Qual Health Care 2003;15:463–71. 10.1093/intqhc/mzg071 [DOI] [PubMed] [Google Scholar]
- 99.Hofhuis H, Van Den Ende CHM, De Bakker DH. Effects of visitation among allied health professionals. Int J Qual Health Care 2006;18:397–402. 10.1093/intqhc/mzl044 [DOI] [PubMed] [Google Scholar]
- 100.Hovlid E, Høifødt H, Smedbråten B, et al. A retrospective review of how nonconformities are expressed and finalized in external inspections of health-care facilities. BMC Health Serv Res 2015;15:405. 10.1186/s12913-015-1068-9 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 101.Nouwens E, van Lieshout J, Bouma M, et al. Effectiveness of improvement plans in primary care practice accreditation: a clustered randomized trial. PLoS One 2014;9:e114045. 10.1371/journal.pone.0114045 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 102.Oliveira JLCd, Gabriel CS, Fertonani HP, et al. Management changes resulting from hospital accreditation. Rev Lat Am Enfermagem 2017;25:e2851. 10.1590/1518-8345.1394.2851 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 103.Shakespeare TP, Back MF, Lu JJ, et al. External audit of clinical practice and medical decision making in a new Asian oncology center: results and implications for both developing and developed nations. Int J Radiat Oncol Biol Phys 2006;64:941–7. 10.1016/j.ijrobp.2005.08.027 [DOI] [PubMed] [Google Scholar]
- 104.Walshe K, Wallace L, Freeman T, et al. The external review of quality improvement in health care organizations: a qualitative study. Int J Qual Health Care 2001;13:367–74. 10.1093/intqhc/13.5.367 [DOI] [PubMed] [Google Scholar]
- 105.Kilsdonk MJ, van Dijk BAC, Otter R, et al. The impact of organisational external peer review on colorectal cancer treatment and survival in the Netherlands. Br J Cancer 2014;110:850–8. 10.1038/bjc.2013.814 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 106.Reisi N, Raeissi P, Sokhanvar M, et al. The impact of accreditation on nurses' perceptions of quality of care in Iran and its barriers and facilitators. Int J Health Plann Manage 2019;34:e230–40. 10.1002/hpm.2642 [DOI] [PubMed] [Google Scholar]
- 107.Lee E. Safety climate and attitude toward medication error reporting after hospital accreditation in South Korea. Int J Qual Health Care 2016;28:508–14. 10.1093/intqhc/mzw058 [DOI] [PubMed] [Google Scholar]
- 108.Paccioni A, Sicotte C, Champagne F. Accreditation: a cultural control strategy. Int J Health Care Qual Assur 2008;21:146–58. 10.1108/09526860810859012 [DOI] [PubMed] [Google Scholar]
- 109.Andres EB, Song W, Schooling CM, et al. The influence of hospital accreditation: a longitudinal assessment of organisational culture. BMC Health Serv Res 2019;19:467. 10.1186/s12913-019-4279-7 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 110.Corrêa JÉ, Turrioni JB, Mello CHP, et al. Development of a system measurement model of the Brazilian hospital accreditation system. Int J Environ Res Public Health 2018;15:11. 10.3390/ijerph15112520 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 111.Shaw CD, Groene O, Botje D, et al. The effect of certification and accreditation on quality management in 4 clinical services in 73 European hospitals. Int J Qual Health Care 2014;26 Suppl 1:100–7. 10.1093/intqhc/mzu023 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 112.Almasabi M, Thomas S. The impact of Saudi hospital accreditation on quality of care: a mixed methods study. Int J Health Plann Manage 2017;32:e261–78. 10.1002/hpm.2373 [DOI] [PubMed] [Google Scholar]
- 113.El-Jardali F, Jamal D, Dimassi H, et al. The impact of hospital accreditation on quality of care: perception of Lebanese nurses. Int J Qual Health Care 2008;20:363–71. 10.1093/intqhc/mzn023 [DOI] [PubMed] [Google Scholar]
- 114.Nekoei-Moghadam M, Amiresmaili M, Iranemansh M, et al. Hospital accreditation in Iran: a qualitative case study of Kerman hospitals. Int J Health Plann Manage 2018;33:426–33. 10.1002/hpm.2480 [DOI] [PubMed] [Google Scholar]
- 115.Pollard R, Yanasak EV, Rogers SA, et al. Organizational and unit factors contributing to reduction in the use of seclusion and restraint procedures on an acute psychiatric inpatient unit. Psychiatr Q 2007;78:73–81. 10.1007/s11126-006-9028-5 [DOI] [PubMed] [Google Scholar]
- 116.Devkaran S, O'Farrell PN, Ellahham S, et al. Impact of repeated hospital accreditation surveys on quality and reliability, an 8-year interrupted time series analysis. BMJ Open 2019;9:e024514. 10.1136/bmjopen-2018-024514 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 117.Hayes RP, Baker DW, Luthi J-C, et al. The effect of external feedback on the management of Medicare inpatients with congestive heart failure. Am J Med Qual 2002;17:225–35. 10.1177/106286060201700605 [DOI] [PubMed] [Google Scholar]
- 118.Salmon J, Heavens J, Lombard C. The impact of accreditation on the quality of hospital care: KwaZulu-Natal Province, Republic of South Africa. Bethesda: Quality Assurance Project, University Research Co., 2003. [Google Scholar]
- 119.Chuang S, Howley PP, Hancock S. Using clinical indicators to facilitate quality improvement via the accreditation process: an adaptive study into the control relationship. Int J Qual Health Care 2013;25:277–83. 10.1093/intqhc/mzt023 [DOI] [PubMed] [Google Scholar]
- 120.van Doorn-Klomberg AL, Braspenning JCC, Wolters RJ, et al. Effect of accreditation on the quality of chronic disease management: a comparative observational study. BMC Fam Pract 2014;15:179. 10.1186/s12875-014-0179-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 121.Braun BI, Owens LK, Bartman BA, et al. Quality-related activities in federally supported health centers: do they differ by organizational characteristics? J Ambul Care Manage 2008;31:303–18. 10.1097/01.JAC.0000336550.67922.66 [DOI] [PubMed] [Google Scholar]
- 122.Collopy BT. Clinical indicators in accreditation: an effective stimulus to improve patient care. Int J Qual Health Care 2000;12:211–6. 10.1093/intqhc/12.3.211 [DOI] [PubMed] [Google Scholar]
- 123.Langley GL, Moen R, Nolan KM. The improvement guide: a practical approach to enhancing organizational performance. San Francisco: Jossey-Bass Publishers, 2009. [Google Scholar]
- 124.Berwick DM. Improvement, trust, and the healthcare workforce. Qual Saf Health Care 2003;12:448–52. 10.1136/qhc.12.6.448 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 125.Donabedian A. Evaluating the quality of medical care (reprinted from Milbank Memorial Fund Quarterly 1966;44:166–203). Milbank Q 2005;83:691–729. [PubMed] [Google Scholar]
- 126.Grimshaw JM, Thomas RE, MacLennan G, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess 2004;8:1. 10.3310/hta8060 [DOI] [PubMed] [Google Scholar]
- 127.Scott I. What are the most effective strategies for improving quality and safety of health care? Intern Med J 2009;39:389–400. 10.1111/j.1445-5994.2008.01798.x [DOI] [PubMed] [Google Scholar]
- 128.Flodgren G, Gonçalves-Bradley DC, Pomey M-P. External inspection of compliance with standards for improved healthcare outcomes. Cochrane Database Syst Rev 2016;12:CD008992. 10.1002/14651858.CD008992.pub3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 129.Campbell NC, Murray E, Darbyshire J, et al. Designing and evaluating complex interventions to improve health care. BMJ 2007;334:455–9. 10.1136/bmj.39108.379965.BE [DOI] [PMC free article] [PubMed] [Google Scholar]
- 130.Office of Public Management. Evaluation of a national audit of specialist in healthcare services for people with learning difficulties in England. Office of Public Management, 2007. [Google Scholar]
- 131.Oude Wesselink SF, Lingsma HF, Ketelaars CAJ, et al. Effects of government supervision on quality of integrated diabetes care: a cluster randomized controlled trial. Med Care 2015;53:784–91. 10.1097/MLR.0000000000000399 [DOI] [PubMed] [Google Scholar]
Supplementary Materials
bmjopen-2020-038850supp001.pdf (123.7KB, pdf)
bmjopen-2020-038850supp002.pdf (324.8KB, pdf)
bmjopen-2020-038850supp003.pdf (118.3KB, pdf)


