Abstract
Background:
The European Foundation for Quality Management (EFQM) model is a quality management system (QMS) widely used worldwide, including in Iran. The current study aims to verify the quality assessment results of the Iranian National Program for Hospital Evaluation (INPHE) against those of the EFQM model.
Methods:
This cross-sectional study was conducted in 2012 on a sample of emergency departments (EDs) affiliated with Tehran University of Medical Sciences (TUMS), Iran. The standard EFQM questionnaire (version 2010) was used to gather the data, and the results were compared with those of the INPHE. MS Excel was used to classify and display the findings.
Results:
The average assessment scores of the EDs based on the INPHE and the EFQM model differed widely (86.4% and 31%, respectively). In addition, the range of variation among the five EDs’ scores differed considerably between the two models (22% for EFQM versus 7% for the INPHE), particularly between the EDs with and without a prior record of applying QMSs.
Conclusion:
The INPHE’s assessment results were not confirmed by the EFQM model. Moreover, the wider range of variation among the EDs’ scores under the EFQM model suggests its greater power to differentiate performance compared with the INPHE. Given these findings, improving the INPHE by drawing on the strengths of other QMSs, such as EFQM, appears indispensable.
Keywords: Quality Assessment, EFQM, INPHE, Emergency Department, Iran
Introduction
The debate on quality and its management and improvement began in the 1950s, and quality has since served as an important criterion for evaluating the performance of most organizations, including health care organizations (HCOs) (1–3). Different models and systems, such as Total Quality Management (TQM), the standards of the International Organization for Standardization (ISO) and the EFQM model, have been developed or adapted over time to assess and improve quality in health care. Bohigas and Heaton (4) place these programs in four main categories: ISO certification; business excellence awards, such as the Malcolm Baldrige National Quality Award (MBNQA) and EFQM; professional peer review (e.g. visitation, ‘visitatie’ in Dutch); and accreditation. Some of these programs have been applied in the Iranian health sector. This study compares the evaluation results of two external assessment programs in Iran, seeking to verify the performance assessment results of the INPHE through the application of the EFQM model. Verification of this kind has previously been applied to EFQM results (5).
INPHE and EFQM: At a glance
The INPHE, in its current structure, was established by the Iranian Ministry of Health and Medical Education (MOHME) in 1997 to assess and improve the quality and safety of services delivered by all hospitals nationwide (6). The program features national standard setting with local (decentralized) monitoring (7). The highest evaluation grade granted to hospitals by the INPHE is, as Braithwaite et al. (8) put it, deemed a valid indicator of high organizational performance and central to safety and quality in HCOs.
The INPHE evaluation process normally starts with the emergency department (ED) of a hospital. The ED evaluation is conducted entirely independently of the rest of the hospital and has important implications for its assessment: if a hospital does not obtain an acceptable score for its ED, its evaluation is suspended until the ED gains a satisfactory score. In other words, the overall grade of the hospital can never exceed the grade of its ED (9). This emphasis on the EDs is understandable given the vital nature of the activities in this department (10). However, it might also lead hospitals to unwittingly neglect their main activities for the sake of obtaining higher grades in their EDs’ evaluation. Successful evaluation of the ED is thus the departure point for the evaluation of the entire hospital. Table 1 shows the INPHE scoring system for EDs alongside that of the EFQM model. The table suggests that the INPHE thresholds are stricter than those of EFQM, since HCOs must obtain at least 90% of the scores to reach the excellent grade.
Table 1:
Scoring systems of the INPHE (for EDs) and the EFQM model

| INPHE grade | INPHE score | EFQM recognition level | EFQM score |
|---|---|---|---|
| Excellent | ≥ 90% | Excellence award | > 50% |
| Good | 80%–89% | Recognized for excellence | > 30% |
| Intermediate | 70%–79% | Committed to excellence | No score* |
| Poor | 60%–69% | | |
| Substandard (non-compliant) | 50%–59% | | |
| Unauthorized** | ≤ 49% | | |

\* ‘Committed to excellence’ entails identifying, prioritizing and implementing improvement projects using the EFQM Excellence Model and RADAR logic; no score is awarded at this level.

\** These hospitals are not authorized to operate as a hospital; they may operate as a limited-surgery clinic (LSC).
Alongside this program, the MOHME has in recent years been urging HCOs to apply other quality improvement systems, such as EFQM, to promote quality and to prepare for the INPHE evaluation. It has mandated all Universities of Medical Sciences (UMSs) to apply the EFQM model in at least one of their hospitals (there are presently 40 UMSs in the country, responsible for both health service delivery and medical education). At present, approximately 48 hospitals nationwide have applied this quality system (12).
Briefly, the EFQM model is an internationally recognized model for assessing organizational quality. It serves as a diagnostic tool for self-assessment, allowing organizations to grade themselves against a set of detailed criteria grouped into enablers and results (13). Table 2 outlines the main similarities and differences between the INPHE and the EFQM model.
Table 2:
Main similarities and differences between the INPHE and the EFQM model

| Similarities | Differences |
|---|---|
| Both aim at quality improvement | EFQM assessment is voluntary, whereas the INPHE is mandatory |
| Pre-announced standards | Unlike EFQM, the INPHE does not have a conceptual framework |
| Multidisciplinary surveying teams | The EFQM model has an explicit scoring logic (RADAR) |
| Self-assessment | The EFQM model has a stronger consensus process |
| Similar data collection methods | The INPHE result alters tariffs, whereas EFQM certification affects the organization’s reputation |
| Consensual process for final score ranking | |
The Iranian National Quality Award (INQA) was designed in 2003 based on the EFQM model, with some modifications to the criteria, sub-criteria and weights of the original model. Subsequently, some hospitals attained the INQA certificate of excellence (14). However, the INQA was not oriented towards the health sector. The MOHME therefore later recommended that hospitals apply the original EFQM-2010 model, with no alteration to the nine criteria, sub-criteria or weights, except in its guidance notes (15). An important reason behind the Iranian health system’s interest in EFQM is to lay the groundwork for an effective presence in the Middle East medical services market, in line with the country’s “20-year Vision Plan” (16). A similar rationale for applying EFQM is echoed by Lee (17).
Several studies in the health sector have used the EFQM model (20–23); for example, in line with the aim of the current study, Vallejo and colleagues (21) concluded that the EFQM model is suitable for self-assessment at the departmental level in hospitals and for identifying areas of improvement. Numerous Iranian studies have also used the EFQM-2003 model to assess quality in HCOs (24–30). Despite the various studies on both EFQM and the Iranian evaluation program, several considerations justify the current study. First, all these studies relied on a single model to assess the quality of HCOs’ performance. A few Iranian studies have compared two quality assessment models; however, most compared the coverage of the INPHE standards against those of the Joint Commission International (JCI) without considering their results in practice (31, 32). Similarly, Delgoshai and Tofighi (33) compared the coverage of the ISO and INPHE standards. All these studies point to the deficient coverage of the INPHE standards compared with those of the JCI and ISO. In the international literature, existing studies have mainly investigated the differences and similarities of various models, and none has undertaken a comparison based on their assessment results (34–37). Second, unlike previous research, our study takes a departmental approach: rather than covering all activities of a hospital, it concentrates on a single department, the emergency department (ED), making it a more focused study. Whole hospitals have been studied repeatedly before, whereas the ED, a vital part of any hospital, has not previously been studied independently. The importance of the ED is also recognized in the way it is evaluated by the INPHE (as explained above).
Third, previous studies in the country applied the EFQM-2003 model; we instead used the EFQM-2010 model, the latest version at the time. In sum, the current study aims to contribute to this area of the literature by scrutinizing and validating the results of the INPHE against the evaluation results of an internationally known quality management system (QMS), the EFQM model. Since the country has also begun to review and replace its current evaluation system, the findings could have valuable practical implications for that purpose.
Methods
Conducted in the second half of 2012, the study investigated all EDs in TUMS-affiliated hospitals with more than 300 beds, as these hospitals had well-developed EDs. They were chosen because each had more than 30 emergency beds. According to the MOHME’s guidelines, fewer than 30 emergency beds constitute an emergency ‘unit’, which is dependent on the hospital and run by the hospital matron, whereas hospitals with 30 or more emergency beds have an emergency ‘department’ with greater authority and independence (38). The latter is the subject of this study, as the researchers considered that the performance of such departments was under their own control and therefore valid to measure.
The data collection team comprised five members, most of them experienced in both EFQM assessment and the INPHE. All held an EFQM assessor certificate, and two members additionally had more than five years of INPHE experience. As the INPHE is a state-run program, the researchers could only access the final evaluation results of the selected hospitals’ EDs. The INPHE evaluation process comprises a pre-arranged (announced) site visit by a multidisciplinary team of surveyors. The evaluation usually takes no more than one week, depending on the size of the hospital and the number of in-patient beds. During the evaluation, the surveyors, according to their specialties, investigate different aspects of the hospital’s activities, including medical equipment and clinical and paramedical spaces; they interview medical staff (mainly nurses) and sometimes patients, and review the related documents. At the end of the site visit, the surveyors are expected to meet the hospital’s managers to discuss the problems and brief them on any non-compliance with the pre-announced standards. The result of the assessment is usually sent to the hospital within a month of the visit; if an ED is non-compliant (i.e. achieves grade 4), it is given six months to remedy its deficiencies and solve the identified problems. Conversely, higher grades for EDs and hospitals allow an increase in the tariffs of the hospital’s hotel-type services (9).
The researchers, however, conducted the EFQM assessment themselves. Data were collected using the EFQM-2010 standardized questionnaire (39), after its translation into Farsi, back-translation into English and, lastly, a check by native English speakers to ensure its face validity. The team members completed their questionnaires for each department through observation, interviews and review of related documents. RADAR logic was used to score the EDs on each sub-criterion. Put simply, in EFQM language, for each enabler sub-criterion the team assessed whether appropriate and integrated approaches were systematically deployed in all sections of the EDs and whether these approaches and their deployment were constantly measured, reviewed and improved. Likewise, for the results-related sub-criteria, the questionnaire assessed whether objectives were specific and fulfilled, whether trends were improving compared with the past and with other EDs, whether a relationship between the results and the approaches was established, and the scope of the reported results.
Drawing on the EFQM consensus process, one consensus meeting was held among the assessment team members for each criterion, nine meetings in total. Consensus is a strength of EFQM that is also used in the INPHE score-setting process, but with a difference. In EFQM, if the variation among all team members’ scores is less than 25%, the average score is taken; otherwise, discussion of the members’ justifications must close the scoring gap, with a site revisit and re-check of the existing evidence as the last resort. In the INPHE, no re-check of the evidence or revisit is conducted, and there is no 25% variation condition for averaging the members’ scores.
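As an illustration only (not part of the study’s procedure), the sketch below expresses this consensus rule in code. It assumes the 25% threshold applies to the spread between the highest and lowest assessor scores on a percentage scale; the text states only a “25% variation” condition, so this reading is ours.

```python
def consensus_score(member_scores, max_spread=25.0):
    """Hedged sketch of the EFQM consensus rule described above.

    member_scores: one score per assessor for a sub-criterion, on a
    0-100 percentage scale. If the spread between the highest and lowest
    score is below the 25% threshold (our reading of the "25% variation"
    condition), the average is taken; otherwise the team must reconcile
    the scores in discussion, with a site revisit and evidence re-check
    as the last resort.
    """
    spread = max(member_scores) - min(member_scores)
    if spread < max_spread:
        return sum(member_scores) / len(member_scores)
    raise ValueError(
        f"Spread of {spread:.1f} points is not below the {max_spread}% threshold: "
        "reconcile the scores in a consensus meeting (or revisit the site)."
    )

# Example: five assessors scoring one enabler sub-criterion
print(consensus_score([30, 35, 40, 25, 30]))  # spread = 15 < 25 -> average 32.0
```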
To compare the performance differentiation of the two programs, besides checking the structure and process of the INPHE against EFQM (Table 2), the researchers mainly compared the overall results of the two quality assessment systems in order to evaluate their performance, as their similarities made the results comparable. The highest scores attainable for EDs under the INPHE and the EFQM model were 1672 and 1000, respectively (18, 40). The score of each ED under each model was calculated as a percentage of the maximum score and then compared (Fig. 1). In addition, we standardized the INPHE scores against the EFQM scores (see Fig. 1: SINPHE) to make them more comparable and the variations more recognizable; that is, we set the highest INPHE score achieved by the EDs equal to the highest EFQM score and adjusted the other EDs’ INPHE scores proportionally. MS Excel was used to display the results.
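The sketch below illustrates this normalization. Only the maximum attainable scores (1672 and 1000) and the percentage scores reported in the Results are taken from the paper; the helper names and the example raw score are illustrative assumptions, and the study itself used MS Excel rather than code.

```python
# Illustrative sketch of the score calculations described above.
INPHE_MAX, EFQM_MAX = 1672, 1000  # maximum attainable ED scores (18, 40)

def to_percent(raw, maximum):
    """Express a raw score as a percentage of the maximum attainable score."""
    return 100.0 * raw / maximum

def standardize_inphe(inphe_pct, efqm_pct):
    """SINPHE: rescale the INPHE percentages so that the best-scoring ED
    matches the best EFQM percentage, adjusting the other EDs proportionally."""
    factor = max(efqm_pct) / max(inphe_pct)
    return [round(p * factor, 1) for p in inphe_pct]

print(to_percent(1450, INPHE_MAX))  # hypothetical raw INPHE score -> ~86.7%

# Percentage scores of the five EDs as reported in the Results section
efqm_pct  = [43.3, 40.0, 26.9, 22.8, 21.8]
inphe_pct = [90.9, 90.9, 84.2, 83.2, 83.0]
print(standardize_inphe(inphe_pct, efqm_pct))  # -> [43.3, 43.3, 40.1, 39.6, 39.5]
```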
Fig. 1:
Comparison of the percentage scores of TUMS EDs based on INPHE and EFQM model, 2012
From a research ethics perspective, the proposal for this study was approved by the Research Ethics Committee of TUMS, and the names of the hospitals were anonymized for confidentiality.
Results
The total percentage scores of the EDs were 43.3, 40, 26.9, 22.8 and 21.8 based on the EFQM model, and 90.9, 90.9, 84.2, 83.2 and 83 based on the INPHE (Fig. 1), revealing a wide gap.
Of the five selected hospitals, two had already implemented external QMSs such as ISO and EFQM. The findings revealed that the scores of the departments with at least three years of experience in QMSs were higher than those of the other departments.
Figure 2 shows the percentage scores of the five EDs separately for the enablers and the results. The mean score of the enabler criteria was higher than that of the results criteria in all EDs.
Fig. 2:
Percentage scores of TUMS EDs based on EFQM model in terms of enablers and results criteria, 2012
The mean percentages of the EDs’ scores in the nine criteria of the EFQM model (i.e. leadership; strategy; people; partnership and resources; processes, products and services; customer results; people results; society results; and key results) were 30.6, 25.6, 33.8, 40.6, 38.4, 31.6, 29.6, 20.4 and 29.6, respectively (Fig. 3). The highest score pertained to partnership and resources and the lowest to society results. Although the INPHE has ten domains for assessing the quality of EDs (human resources; religious and ethical aspects; physical structure and establishment; medical equipment and medication; safety issues; patient satisfaction; management; hospital committees; sanitation and cleanliness; and information system and medical records), these domains lack the logical underpinning of the EFQM model’s components.
Fig. 3:
Comparison of the EFQM percentage scores of TUMS EDs with the average score in domestic and foreign studies, 2012
Figure 4 shows the quality web of the EDs across the nine criteria of the EFQM model. As can be seen, the distribution patterns of scores across the nine criteria are similar among the EDs. The area inside each web represents the quality level, and its symmetry implies balance in the implementation of quality management. On this basis, ED3 overall covered the largest area and showed the most symmetry. As the INPHE lacks an organized, criterion-based structure, such figures could not be drawn for that model.
Fig. 4:
Quality web of TUMS EDs in terms of EFQM nine criteria, 2012
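For readers who wish to reproduce such a quality web, the following matplotlib sketch draws a radar chart over the nine EFQM criteria. The study itself produced its figures in MS Excel, and the ED3 values below are illustrative placeholders, not data from the paper.

```python
import numpy as np
import matplotlib.pyplot as plt

criteria = ["Leadership", "Strategy", "People", "Partnership & resources",
            "Processes, products & services", "Customer results",
            "People results", "Society results", "Key results"]
scores = [45, 38, 42, 50, 47, 40, 38, 30, 41]  # hypothetical ED3 percentages

angles = np.linspace(0, 2 * np.pi, len(criteria), endpoint=False).tolist()
angles += angles[:1]          # repeat the first angle to close the polygon
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=1.5)
ax.fill(angles, values, alpha=0.25)   # enclosed area ~ overall quality level
ax.set_xticks(angles[:-1])
ax.set_xticklabels(criteria, fontsize=7)
ax.set_ylim(0, 100)
ax.set_title("Quality web across the nine EFQM criteria (illustrative)")
plt.tight_layout()
plt.show()
```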
Figure 1 illustrates the comparative percentage scores of the departments based on the EFQM and the INPHE, together with the standardized INPHE (SINPHE) curve (the INPHE scores are standardized against the EFQM scores). The mean percentage score of the EDs was 31% based on the EFQM model and 86.4% based on the INPHE, implying a large difference (approximately 55 percentage points) between the scores of the two programs.
The overall results of the EDs with previous implementation of any QMS were higher than those of the rest. The mean score of the former was 416.75, compared with 238.33 for those without any quality improvement system. The departments with a QMS scored almost twice as high on the leadership; strategy; processes, products and services; customer results; and society results criteria.
Discussion
A raft of models has emerged for assessing and improving quality in organizations (34). This abundance has made the choice of an appropriate model difficult for most HCOs. Comparing various QMSs, as in the current study, can provide a performance assessment of the QMSs themselves (41) and expose the strengths and weaknesses of these models, making selection somewhat easier for organizations.
The most substantial finding of this study comes from the comparison of the two programs’ scores. As noted earlier, only the final percentage scores were considered for this purpose (Fig. 1). Two important points can be inferred from the figure. First, a large gap exists between the score ranges of the INPHE and EFQM, alluding to the weaker level of the INPHE standards compared with those of EFQM, despite the fact that at first glance the INPHE standards appear stricter (Table 1). It might be argued that the difference could partly result from the dissimilar attitudes of the two programs’ surveyors; to alleviate this effect, however, the researchers had included some INPHE surveyors (holding EFQM certificates) in the EFQM assessment team. Another reason for the difference might relate to the organizational learning curve (42, 43): because the EDs have been evaluated against the INPHE checklists for many years, they may have mastered them and could therefore score high compared with the EFQM, which is fairly new to them. The figure also demonstrates that the gap between the EFQM and INPHE scores is not equal across the five EDs. Considering that the MOHME is both the owner and the assessor of these EDs (through the INPHE), and that hospitals with higher evaluation grades can charge higher tariffs and face smaller deficits, it could be argued that, owing to this conflict of interest, the MOHME may show leniency to prevent its hospitals from facing financial problems. The scores are therefore kept high, even for the EDs whose EFQM scores are at the lowest level. To address this issue, under the fifth plan of economic, social and cultural development (2009–2014), the parliament mandated the delegation of evaluation responsibility to an independent body (44), which has not yet been put into action. The MOHME has also moved towards developing a hospital accreditation program to be implemented in late 2012. The second point from Figure 1 concerns the variation among the EDs’ scores under the two programs. As the figure shows, the fluctuation is approximately three times greater in the EFQM scores (range = 22%) than in the INPHE scores (range = 7%). This differentiation is more noticeable when the scores of the EDs with (EDs 1 and 2) and without (EDs 3–5) a prior record of using QMSs are considered. This suggests that EFQM has higher differentiation power and is therefore apparently a more valid assessment tool.
The higher enabler scores imply that the EDs are potentially competent to deliver better results in terms of service quality. The scores the INPHE allocates to the different dimensions of the EDs somewhat confirm this finding: more than 60% of the scores are allocated to the structural and human-related aspects (enablers) of the EDs (42). We would argue that this overemphasis has drawn hospitals towards concentrating more on the enablers (capabilities) than on the results. Indeed, as Fig. 2 shows, wherever an enabler score is high, the corresponding results score is also high, confirming the logic behind the EFQM model.
In terms of RADAR logic, the different scores of the EDs with and without a history of applying QMSs arose because the EDs with no QMS presented only oral or scant evidence for their approaches. They had not gathered and analyzed the data required to assess approaches and deployment, nor used such data to refine them. These departments also could not present sufficient evidence for the results criteria regarding trends, targets, comparisons and causes.
Further investigation of the findings revealed that the scores of the departments with at least three years of experience in QMSs were 1.7 times higher than those of the other departments, confirming the results of studies published after the implementation of the EFQM model (21–23). Although inferring a causal relationship from longitudinal or cross-sectional descriptive studies has serious limitations, the repetition of the same results at different times and places may strengthen the hypothesis of causality.
The distribution pattern of scores between enablers and results found in this study was similar to the results of most domestic and foreign studies. Consistent with most similar studies on EFQM, the EDs obtained their highest scores in the people, partnership and resources, and processes criteria (Fig. 3). Improvement over several years has also been found to be more substantial in these criteria (22, 23). This arguably implies that quality improvement projects for the enabler criteria are easier to design and implement, as they can be readily prepared; furthermore, the outcomes for enablers appear more tangible and are achieved more quickly.
The lowest scores, in society results and strategy, are also echoed by all studies conducted inside the country and differ slightly from studies abroad, in which the society results and people results score lowest (Fig. 3). The society results scored low in our study owing to the lack of society-oriented attitudes among managers, as reflected in the hospitals’ policies and strategies. As for the strategy score, since strategy is mostly relevant to the whole organization, the score may not be attainable at the departmental level (e.g. an ED), as similarly argued by Vallejo (21). The average EFQM score of the EDs (31%) was lower than the mean scores of domestic (53.4%) and foreign (37.4%) studies conducted on whole hospitals. This difference might be because those studies mostly used self-assessment with a questionnaire or a scoring workshop, whereas this study relied only on external assessment.
Conclusion
Despite its limitations, the current study is one of very few in the literature to compare two external evaluation programs. Comparing (or verifying) the results of a domestic HCO evaluation program (the INPHE) against those of an internationally recognized quality assessment model (EFQM) was intended both to deliver a performance assessment of the former and to demonstrate the feasibility of applying EFQM at the departmental level. The results revealed that the INPHE assessments were not confirmed by those of the EFQM model (the EDs’ average scores were 31% based on EFQM and 86.4% based on the INPHE). The 55% difference in scores and the 15-percentage-point difference in the variation ranges of the two models (22% versus 7%) could point to the higher differentiation power of EFQM in assessing performance compared with the INPHE, and consequently call for scrutiny of the INPHE’s structure, processes and scoring system. Moreover, the EFQM-2010 model was found to be suitable for assessing the quality of hospital services at the departmental level. Such advantages of EFQM as its conceptual, criterion-based framework, self-assessment process, scoring logic (RADAR) and practical applicability could be drawn on in the design and implementation of local models for HCO evaluation. The cross-sectional design and small sample size are the main limitations of the study.
Ethical considerations
Ethical issues (including plagiarism, informed consent, misconduct, data fabrication and/or falsification, double publication and/or submission, redundancy, etc.) have been completely observed by the authors.
Acknowledgments
This study was funded and supported by Tehran University of Medical Sciences (TUMS) Grant no. 12323. The authors declare that there is no conflict of interest.
References
- 1. Yousuf N. Total Quality Management: Do health profession educators need to be educated? Education in Medicine Journal. 2011;3(2):65.
- 2. Murray CJL, Frenk J. A framework for assessing the performance of health systems. Bulletin of the World Health Organization. 2000;78(6):717–31.
- 3. Arah OA, Westert GP, Hurst J, Klazinga NS. A conceptual framework for the OECD health care quality indicators project. Int J Qual Health Care. 2006;18(suppl 1):5–13. doi: 10.1093/intqhc/mzl024.
- 4. Bohigas L, Heaton C. Methods for external evaluation of health care institutions. Int J Qual Health Care. 2000;12(3):231–8. doi: 10.1093/intqhc/12.3.231.
- 5. Zarafatangiz Langroudi M, Jandaghi GH. Validity Examination of EFQM’s Results by DEA Models. Journal of Applied Quantitative Methods. 2008;3(3):207–14.
- 6. Moghimi A. Familiarity with evaluation concepts and establishing quality measures. Centre for Healthcare Accreditation and Supervision, Healthcare Organisations Evaluation Group; Tehran, Iran: 2004.
- 7. Scrivens E. A Taxonomy of the Dimensions of Accreditation Systems. Social Policy & Administration. 1996;30(2):114–24.
- 8. Braithwaite J, Westbrook J, Pawsey M, Greenfield D, Naylor J, Iedema R, et al. A prospective, multi-method, multi-disciplinary, multi-level, collaborative, social-organisational design for researching health sector accreditation. BMC Health Services Research. 2006;6(1):113–23. doi: 10.1186/1472-6963-6-113.
- 9. The instruction of standards and principles of evaluation of the general hospitals. Centre for Healthcare Accreditation and Supervision, Healthcare Organisations Evaluation Group; Tehran, Iran: 1997.
- 10. Ashour OM, Okudan GE. Fuzzy AHP and utility theory based patient sorting in emergency departments. Int J of Collaborative Enterprise. 2010;1(3):332–58.
- 11. Anonymous. EFQM Recognition. 2001. Available from: http://www.efqm.org/en/PdfResources/Recognition%20Brochure%202012.pdf. [Accessed 23 February 2013]
- 12. Anonymous. Bylaws of Paragraph 4 single article of Iran’s 2009 Budget Act. 2009. Available from: http://mdar.behdasht.gov.ir/uploads/165_826_Aein-name-ejraei.pdf. [Accessed 23 February 2013]
- 13. Nabitz UDO, Klazinga N, Walburg JAN. The EFQM excellence model: European and Dutch experiences with the EFQM approach in health care. Int J Qual Health Care. 2000;12(3):191. doi: 10.1093/intqhc/12.3.191.
- 14. Anonymous. Quality and excellence in hospital centers. 2009. Available from: http://mdar.behdasht.gov.ir/%5Cuploads%5C165_708_EFQM%202.pdf. [Accessed 23 February 2013]
- 15. Anonymous. Iranian excellence model for health sector. 2011. Available from: http://www.iranaward.org/90/hem.pdf. [Accessed 23 February 2013]
- 16. Anonymous. Law of the Fourth Economic, Social and Cultural Development Plan of the Islamic Republic of Iran. 2004. Available from: http://www.dolat.ir/PDF/Program.pdf. [Accessed 23 February 2013]
- 17. Lee DH. Implementation of quality programs in health care organizations. Serv Bus. 2012;6(3):387–404.
- 18. Anonymous. Introducing excellence. 2003. Available from: http://ww1.efqm.org/en/PdfResources/PUB0723_InEx_en_v2.1.pdf. [Accessed 23 February 2013]
- 19. Jaafaripooyan E. Contextual approach to the performance analysis of Iran’s national accreditation programme for healthcare organisations [PhD thesis]. School of Management, University of Southampton; United Kingdom: 2011.
- 20. Vernero S, Nabitz U, Bragonzi G, Rebelli A, Molinari R. A two-level EFQM self-assessment in an Italian hospital. Int J Health Care Qual Assur. 2007;20(3):215–31. doi: 10.1108/09526860710743354.
- 21. Vallejo P, Ruiz-Sancho A, Dominguez M, Ayuso MJ, Mendez L, Romo J, et al. Improving quality at the hospital psychiatric ward level through the use of the EFQM model. Int J Qual Health Care. 2007;19(2):74–9. doi: 10.1093/intqhc/mzl074.
- 22. Sánchez E, Letona J, Gonazález R, García M, Darpón J, Garay JI. A descriptive study of the implementation of the EFQM excellence model and underlying tools in the Basque Health Service. Int J Qual Health Care. 2006;18(1):58–65. doi: 10.1093/intqhc/mzi077.
- 23. Nabitz U, Schramade M, Schippers G. Evaluating treatment process redesign by applying the EFQM Excellence Model. Int J Qual Health Care. 2006;18(5):336–45. doi: 10.1093/intqhc/mzl033.
- 24. Dehnavieh R, Ibrahimpour H, Nouri Hekmat S, Taghavi A, Jafari Sirizi M, Mehrolhassani MH. EFQM-based Self-assessment of Quality Management in Hospitals Affiliated to Kerman University of Medical Sciences. Int J Hospital Research. 2012;1(1):57–64.
- 25. Ghazvini SV, Bahrami ES. Performance Evaluation of Rajaei Hospital Based on EFQM Organizational Excellence Model. Payavard Salamat. 2012;6(1):70–8.
- 26. Maleki MR, Izadi A. Empowerment position in Tehran Social Security hospitals based on the organizational excellence model (the EFQM). Payesh Journal. 2010;2:131–6.
- 27. Torabi Pour A, Rekab Eslamizadeh S. Self-Assessment Based on EFQM Excellence Model in Ahvaz Selected Hospitals. Health Information Management. 2011;2:138–46.
- 28. Maleki MR, Izadi A. A comparative study on results of two hospitals in Tehran based on the Organizational Excellence Model. J Qazvin University of Medical Sciences. 2008;12(2):63–8.
- 29. Sajadi H, Hariri M, Karimi M, Baratpour S. Performance Self Assessment by the Excellence Model in Different Hospitals of Isfahan University of Medical Sciences and Healthcare Services. Pejouhesh. 2008;32(3):227–31.
- 30. Imani-Nasab MH, Tofighi S, Almasian A, Mohaghegh B, Toosani S, Khalesi N. Quality assessment of emergency wards in Khorramabad public hospitals based on EFQM model. Yafteh. 2012;14(4):17–27.
- 31. Ahmadi M, Khoshgam M, Mohammadpoor A. Comparative study of the Ministry of Health standards for hospitals with Joint Commission. Hakim Research Journal. 2008;10(4):45.
- 32. Mohammadpor A, Mehdipour Y, Karimi A, Rahdari A. A Comparative Study of the Iran Ministry of Health Patient and Family Education Standards with Joint Commission on Accreditation of Healthcare Organizations. Health Information Management. 2010;6(2):42–50.
- 33. Delgoshai B, Toffigi S. Comparison of Hospital standards with ISO principles and presentation appropriate model of Hospital standard development. Yafteh. 2005;6(4):19–29.
- 34. Sampaio P, Saraiva P, Monteiro A. A comparison and usage overview of business excellence models. TQM Journal. 2012;24(2):181.
- 35. Gouthier M, Giese A, Bartl C. Service excellence models: a critical discussion and comparison. Managing Service Quality. 2012;22(5):447.
- 36. del Mar Alonso-Almeida M. Quality awards and excellence models in Africa: An empirical analysis of structure and positioning. Afr J Bus Manage. 2011;5(15):6388–96.
- 37. Talwar B. Comparative study of framework, criteria and criterion weighting of excellence models. Measuring Business Excellence. 2011;15(1):49–65.
- 38. Anonymous. Bylaws of Establishment and operation of hospitals. 2004. Available from: http://medcare.behdasht.gov.ir/uploads/312_1394_tasise%20bimarestan.pdf. [Accessed 23 February 2013]
- 39. Anonymous. EFQM questionnaire. 2010. Available from: http://www.efqm.org/en. [Accessed 23 February 2013]
- 40. MOHME. The instruction of standards and principles of evaluation of the general hospitals: Emergency department. Centre for Healthcare Accreditation and Supervision, Healthcare Organisations Evaluation Group; Tehran, Iran: 1997.
- 41. Jaafaripooyan E, Agrizzi D, Akbari-Haghighi F. Healthcare accreditation systems: further perspectives on performance measures. Int J Qual Health Care. 2011;23(6):645–56. doi: 10.1093/intqhc/mzr063.
- 42. Arthur JB, Huntley CL. Ramping up the organizational learning curve: assessing the impact of deliberate learning on organizational performance under gainsharing. Academy of Management Journal. 2005;48(6):1159–70.
- 43. Fioretti G. The organizational learning curve. European Journal of Operational Research. 2007;177(3):1375–84.
- 44. Anonymous. Law of the fifth Economic, Social and Cultural Development Plan of the Islamic Republic of Iran. 2009. Available from: http://www.spac.ir/Portal/File/ShowFile.aspx?ID=90fa4381-ca1c-4d41-885a-8e889d572e3d. [Accessed 23 February 2013]




