Abstract
Objective and rationale
The increasing prominence of school self-evaluation (SSE) as a pivotal element in global quality assurance practices is attributed to its cost-effectiveness, its relevance to school contexts, and its capacity to empower schools to manage and lead their own evaluation processes while holding them accountable for achievement standards. This paper explores various approaches to and practices of SSE by tracing the evolution and current status of school self-evaluation within the realms of school inspection and quality assurance in Dubai, Ireland, New Zealand and Pakistan. The paper adopts a comparative perspective to showcase SSE models from these countries as pioneering initiatives for quality assurance in educational improvement, especially for nations where SSE practices are still evolving.
Methods
This study employs literature review, document analysis, and semi-structured interviews with school leaders (n = 32) to offer insights into current school self-evaluation practices in these four countries through a comparative analysis. These methods were chosen in line with the specific aims and research questions of the paper, which are: to provide an overview, from the research literature, of the development and current status of SSE as an element of school evaluation in each country studied; to analyse, compare and contrast policy in this area in each case; and to obtain, through interviews, rich and deep data on the attitudes and opinions of key personnel in each country towards SSE as policy and practice.
Results
School self-evaluation (of varying types) is being built into each school evaluation system and is becoming an integral part of inspection and quality assurance, and, it is argued here, is broadly moving along a similar spectrum in each case. For now, however, SSE practices differ across the four countries, usually because of the stage of development of the school inspection system and, above all, the overall structure of the respective educational landscape, that is, whether the education system is centralised, controlled and hierarchical or decentralised and autonomous.
Conclusion
The study suggests that school self-evaluation can be integrated into a wide range of educational systems and philosophies, eventually leading to educational improvement, and that other countries may draw on the case study systems discussed here as models.
Keywords: Self-evaluation, School leaders, Accountability, School improvement, School inspection
1. Introduction and background
The primary objectives of school inspection, or external evaluation, are widely recognised to encompass both accountability and school improvement. School inspection ensures schools' compliance with regulatory requirements and educational standards [1]. Simultaneously, it promotes improvements in the quality of education delivery. Achieving this latter objective involves not only external agencies monitoring schools for compliance with statutory and quality standards but also an internal process wherein schools evaluate their own practices. This internal review mechanism, known as School Self-Evaluation (SSE), has become an integral component of global school quality assurance (QA) protocols [2].
Previous research on SSE shows that it empowers schools to set their own objectives, engage staff, and focus on areas needing improvement based on internal evaluations. This fosters ownership of the improvement process and grants schools the autonomy to identify and address their specific needs [3,4]. Macbeath et al. [5] highlight SSE's critical role in enhancing school evaluation effectiveness, advocating for self-evaluation to be central in national school improvement strategies, integrating accountability and self-improvement (p.92).
Several studies also note an economic rationale for self-evaluation [6,7]. Wilcox [8] posits that external evaluation requires significant manpower and financial resources, particularly for inspector recruitment. Historically, inspection costs were obscured as inspectors undertook various roles, including managing school facilities and resources [9]. While external evaluation can drive change and support decision-making and resource allocation, it demands vigilant oversight and transparency to ensure resources are used effectively. Thus, accountability for linking inputs and outputs lies with both schools and inspectors.
Studies also highlight several limitations of external surveillance, such as lack of transparency [10], concerns about evaluator professionalism, inspector subjectivity, and schools resorting to window dressing under high-stakes systems [[11], [12], [13], [14], [15]]. These issues have led to a combined approach using both external and internal evaluations, where each complements and validates the other. Leading inspection jurisdictions and organisations like the Standing International Conference of Inspectorates (SICI)1 advocate for integrating SSE into school practices due to its proven benefits for student learning and educational excellence.
Although SSE is increasingly recognised as an indispensable component of quality assurance practices worldwide, discrepancies persist in its implementation across diverse education systems, and minimal research has been dedicated to identifying the most effective SSE approaches within various educational contexts. This paper explores the various approaches to and practices of SSE, together with its evolution and current status within the realms of school inspection and quality assurance, emphasising a comparative perspective. For this purpose, the study scrutinises the evolution and current status of SSE in four distinct educational landscapes: Dubai, Ireland, New Zealand, and Pakistan. These were chosen for both their similarities and their differences, as described below.
In Dubai, where 90 % of children and young people attend private schools, and 76 % of schools belong to the private sector [18,19], the Knowledge and Human Development Authority (KHDA), a government agency, oversees and controls the quality and operations of private schools [20]. This oversight includes managing the fee structure and class levels without offering any financial or other assistance [21]. In Ireland and New Zealand, schools for compulsory education are self-managing and, though they receive public funding, have a degree of autonomy to make decisions concerning aspects of staffing, budget, curriculum, governance and ethos [22]. In contrast, Pakistan has a hierarchical education system with substantial top-down control. This control extends over recruitment, promotion, curriculum development, and instructional content [23]. Despite the devolution of the education system to the district level since 2001, along with resource pooling at the district level [24], the prevailing top-down, autocratic, and authoritarian approach leaves little room for school leaders and teachers to drive change [[25], [26], [27]]. In every province,2 there are two parallel streams of quality assurance: Monitoring and Evaluation, focused on collecting quantitative data against a set of indicators,3 and supervisory staff with responsibility for enhancing the quality of and access to education [21,28].
Several studies have examined the benefits and challenges of implementing SSE [see for example, 4,6,7]. This study distinguishes itself by presenting a comparative analysis of SSE practices across four educational systems and aims to showcase SSE models from these countries as pioneering initiatives for quality assurance in educational improvement, especially for nations where SSE practices are still evolving. This study employs a dual approach, combining a thorough review of academic and grey literature with semi-structured interviews involving school leaders. Through these interviews, the study endeavours to capture the perspectives of school leaders on SSE, school improvement planning, SSE practices, and the various SSE-related supports available within their education systems. Understanding the dynamics of SSE is crucial for enhancing educational quality and fostering continuous improvement within educational systems. This study seeks to address the following research questions.
- How has SSE evolved within the realms of school inspection and quality assurance in Dubai, Ireland, New Zealand and Pakistan?
- What are the current practices of SSE in these countries and how do they compare?
- What are the general perceptions of school leaders in these countries about the significance and implementation of SSE?
- How does the structure of a country's educational system influence the development and implementation of SSE?
These questions aim to delve into the various aspects of SSE practices within four different educational contexts, focusing on comparative analysis, evolution, challenges, and influencing factors.
The following section delves into the literature on the evolution and development of SSE, discussing its benefits and theoretical underpinnings. Subsequently, the methodology employed in this study is explored. Following that, I examine the implementation of SSE in each of the four countries under consideration. Finally, a comparative analysis is undertaken to elucidate similarities and differences, as well as to address issues, challenges, and, importantly, the potential benefits of SSE within diverse education systems.
2. Literature review
Several definitions of SSE offer diverse insights into the nature of the process. Schildkamp and Visscher [29] describe SSE as a systematic information gathering procedure initiated by the school itself, aimed at assessing its functioning and the achievement of its educational objectives. The primary goal of SSE is to support decision-making, promote learning, and enhance overall school improvement. Meuret and Morlaix [30] emphasise the collaborative nature of SSE, where it is conducted by various school stakeholders, including management, staff, pupils, and parents, in contrast to the external evaluation performed by outside agents. Mutch [31] underscores the importance of balancing internal and external perspectives within SSE, especially in a system requiring both improvement and accountability. Finally, Macbeath [32] places internal evaluation at the core of the quality assurance and improvement process, emphasising its rigorous nature, systematic evaluation of practice, use of indicators as inquiry frameworks, and the application of various analytic and formative tools. These definitions collectively contribute to a broader understanding of the multifaceted nature of SSE and its recognised role in enhancing educational quality and accountability.
Academic research posits multiple reasons for the initiation of self-evaluation in schools. Neoliberal trends in education have provided schools with greater autonomy for self-management while also assigning them the responsibility to enhance their quality through the review and evaluation of their practices [17,[33], [34], [35], [36]]. Mutch [31] maintains that the evolution of SSE is an outcome of the broader movement in the field of evaluation, resulting in democratic approaches, including participatory, empowerment, and democratic evaluation methods. These methods encourage evaluands to participate more fully in evaluative activities, encompassing both self-evaluation and evaluations conducted by external evaluators. Similarly, schools are urged to lead their SSE while also participating in the external evaluation. This represents a shift from the traditional top-down approach to a bottom-up endeavour, where schools and teachers are engaged in evaluating their own performance [16].
Studies have also attributed the emphasis on self-evaluation to the high costs associated with cyclical external evaluations [[6], [7], [8],31,33,36]. Compared to external evaluations, SSE is undeniably cost-effective and offers a greater capacity for follow-through. The cost argument encompasses not only economic considerations but also intellectual ones: if the knowledge acquired during external evaluations is not utilised for school improvement and is instead taken away by the evaluators, it represents a missed opportunity [31]. Furthermore, these studies stress the importance of considering the unintended effects of school inspections, such as window dressing and game-playing, when assessing the costs of such inspections [37,38]. School inspections involve significant financial and time costs, along with the possibility of unintended consequences. These consequences may encompass heightened stress levels experienced by both teachers and school leaders, as well as the diversion of valuable time and resources towards inspection preparations, which could otherwise have been directed towards fundamental instructional practices [39].
Additionally, school inspections involve several aspects that render them vulnerable to controversy. For example, inspectors' subjective views on education can influence their judgments and interpretations, potentially leading to discrepancies in assessments and perceptions, which may raise concerns about the fairness and consistency of the process [12]. Inspectors often face criticism for their preconceived notions and biases, which can affect their objectivity and credibility, ultimately eroding trust between schools and inspectors [13]. Moreover, a common perception of insufficient transparency at various phases of the inspection process (pre-inspection, during inspection, and post-inspection) may undermine the credibility of the process and lead to controversy [10,14]. Lindgren and Ronnberg [10] maintain that the legitimacy of school inspection largely depends on transparency and 'black boxing', that is, a certain degree of concealment of the social production of facts in order to establish authority. School inspection also faces criticism for its limited scope: inspectors, primarily focusing on academic achievement, may overlook the broader context of the school and the intricacies of the education system. Last but not least are the high stakes associated with external evaluations, such as their impact on a school's reputation, funding allocation, and other accountability measures [14]. These controversies have resulted in dissatisfaction among schools and teachers with the school inspection process [40]. Consequently, education systems have shifted toward a devolved quality control mechanism, encouraging schools to assess their own performance by providing them with quality standards [33]. This approach fosters awareness of their current status and empowers them to determine their desired goals. Furthermore, it supports schools through the development of school improvement plans, enabling them to chart their path towards improvement.
Inspection systems across middle-income and lower-middle-income countries, such as Pakistan, encounter several shared challenges. A UNESCO report on Accountability in Education 2017/2018 highlights that these challenges include limitations in both human and financial resources, a reluctance among inspectors to make regular school visits, and a capacity deficit in identifying schools' genuine issues. Furthermore, a significant communication gap exists between school administrations and inspectors. Essentially, negative perceptions held by teachers about inspectors, coupled with a deficiency in the inspectors' professional approach to constructive school evaluations, hinder the effectiveness of the inspection procedures. Consequently, this impedes the overarching objective of inspections, which is to enhance school outcomes by reinforcing mutual accountability between schools and higher administrative bodies [11]. Therefore, SSE can be seen as a quality assurance measure with the potential to mitigate the controversies and challenges associated with external evaluation or school inspection and assist schools in their improvement efforts.
The foundation of SSE is rooted in several theories and frameworks that shape its practices. One such theory is Learning Organisation Theory, as proposed by Peter Senge, which envisions schools as learning organisations [41]. Within this framework, the principles of team learning, shared vision, and systems thinking support SSE. It underscores the process through which schools continuously acquire new knowledge, skills, and capabilities to enhance their performance while considering their unique contexts and challenges [3,41]. Another influential framework is the Plan-Do-Check-Act cycle, introduced by W. Edwards Deming in the 1950s [42]. This iterative model guides schools in planning improvements, implementing them, assessing their effectiveness, and taking responsive actions. Additionally, Wenger's concept of a 'community of practice' [43] serves as a foundation for SSE, fostering collaboration among school leaders, teachers, students, and parents who share common goals in achieving school improvement and better learning outcomes. According to Wenger-Trayner & Wenger-Trayner [44], ‘Communities of practice are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly’ (p.2). Every school constitutes a community of practice, aiming to build capacities and capabilities by sharing information and applying newfound knowledge to enhance its work. Together, these theories and frameworks contribute to the essential processes of SSE, promoting continuous improvement within educational institutions.
SSE represents an embodiment of school empowerment, effecting a transfer of authority from central government to the school community. Within SSE, school leaders, management, and school boards find a valuable mechanism for instigating organisational change [33]. Furthermore, school leaders and teachers who actively participate in SSE activities progressively develop their evaluation capacities, which can be effectively applied in various facets of their professional roles [3,4]. This collaborative process actively engages school leaders, teachers, parents, and students in the evaluation process, thus affording them a collective voice in school decision-making procedures. This approach fosters a sense of collegiality and collaboration among all stakeholders [45]. Notably, SSE demonstrates sensitivity to the local context and unique characteristics of individual schools [34,[46], [47], [48]]. Self-evaluating schools can establish their own agendas, allowing staff to concentrate on areas requiring improvement that are relevant to their specific context, thus encouraging ownership of the process [16,30,46,49]. Boyle et al. [49] concur that these attributes collectively render SSE a sustainable approach to school improvement.
To reap the numerous benefits of SSE, it is crucial to navigate several challenges when initiating and embedding this self-improving process in schools. These challenges include ensuring that school leaders and teachers possess the necessary professional skills for conducting evaluations [14,46], the availability of reliable and comprehensive educational data [50], and the time that school personnel can dedicate to SSE and school improvement planning [46]. Any education system seeking to implement SSE must proactively address these challenges to fully capitalise on its potential. Additionally, the implementation of SSE is influenced by varying governance contexts, local policies, and accountability systems, which can significantly impact school inspection and quality assurance practices. As a result, findings from Western studies on SSE may not be directly applicable to low- and middle-income countries [51]. To bridge this gap, this research presents a range of SSE exemplars that can be tailored to fit the unique socio-political contexts of different countries aiming to harness this proven method for sustainable school improvement.
While previous studies have underscored the value and significance of SSE [2,6,33], explored its general implementation [16,46,67], and examined its evolution within specific national contexts [e.g., 34, 31,76], a critical gap remains in the literature. There has been a lack of comparative analyses of SSE practices across different educational systems, as well as an examination of the historical trajectories of these practices. Understanding these variations is essential for identifying the most effective SSE strategies within specific educational contexts, whether centralised or decentralised. This study addresses this gap by providing a comparative analysis of the current status and historical evolution of SSE practices in four countries, highlighting the relationship between these practices and the organisational structure of their educational systems.
3. Methodology
In pursuit of comprehensive and nuanced insights, this research adopted a qualitative methodology, chosen to explore the subject matter in depth and to generate a rich understanding of the dynamics surrounding SSE practices and school improvement planning as perceived by school leaders. Employing a multiple case study design, the investigation focused on school inspection and quality assurance systems in four countries: Dubai, Ireland, New Zealand, and Pakistan. This design enabled a detailed analysis within each individual context as well as across different contexts. By comparing multiple cases, similarities and differences were identified, providing valuable insights into the various influences on different educational systems, as suggested by Gustafsson [52]. Additionally, the use of multiple case studies helped to enhance the external validity of the findings. By providing detailed descriptions of each case context, the study may also help others to determine the transferability of the findings to similar contexts. For the internal validity of each case, multiple sources of evidence were used to strengthen the conclusions, thereby ensuring a robust and reliable analysis.
Three primary data collection methods were employed to facilitate this exploration: literature review, document analysis, and semi-structured interviews with school leaders, who play a pivotal role in the SSE process. The literature review for this study aimed to identify existing research on SSE to provide context, highlight gaps in knowledge, and establish a theoretical foundation. Academic databases, including Google Scholar and Scopus, were primarily accessed, focusing on peer-reviewed journal articles published between 2000 and 2023. The search strategy involved using keywords such as 'self-evaluation', 'internal evaluation' and 'school inspection', and Boolean operators were applied to refine the search results. The extracted data were organised using a literature matrix, categorising studies by themes and findings. Thematic analysis was employed to identify common patterns and discrepancies among the studies. By synthesising the existing research, the results were contextualised within the broader field, demonstrating how the study contributes to advancing knowledge in SSE specifically and school inspection in general.
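To make the search and screening workflow easier to reproduce, the sketch below illustrates one hypothetical way of composing such a Boolean search string and recording retrieved studies in a literature matrix. It is only an illustrative aid under stated assumptions; the function and field names are invented for this sketch and do not represent the tooling actually used in the study.

```python
# Illustrative sketch only (hypothetical): composing the Boolean search string
# and organising retrieved studies into a simple literature matrix.
from dataclasses import dataclass, field
from typing import List

KEYWORDS = ['"self-evaluation"', '"internal evaluation"', '"school inspection"']
YEAR_RANGE = (2000, 2023)  # peer-reviewed articles within the review window


def build_query(keywords: List[str]) -> str:
    """Join the keywords with Boolean OR, restricting results to a school context."""
    return "(" + " OR ".join(keywords) + ") AND school"


@dataclass
class MatrixEntry:
    """One row of the literature matrix: study details, themes, and key findings."""
    authors: str
    year: int
    themes: List[str] = field(default_factory=list)
    findings: str = ""


def in_review_window(entry: MatrixEntry, year_range=YEAR_RANGE) -> bool:
    """Keep only studies published within the 2000-2023 window described above."""
    return year_range[0] <= entry.year <= year_range[1]


if __name__ == "__main__":
    print(build_query(KEYWORDS))
    matrix = [
        MatrixEntry("Placeholder & Placeholder", 2015,
                    themes=["accountability", "SSE capacity"],
                    findings="Illustrative placeholder entry."),
    ]
    print([e.authors for e in matrix if in_review_window(e)])
```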
The document analysis phase involved a systematic examination of official websites, notifications, reports, and departmental publications of the four quality assurance systems under study. Analysing the content of such documents, in most cases, reveals the motivation, intent, and purpose within their particular context. These sources were selected based on their inherent qualities of authenticity, credibility, meaning, and representativeness, as they serve as authoritative conduits of departmental viewpoints and regulatory information [53]. Document analysis provided insights into the evolution of SSE in these countries over time, the current practices of SSE, and its role within the broader school inspection system. Additionally, it elucidated the expectations placed on school leaders concerning SSE in these four countries. Document analysis not only served as a valuable source of initial information but also played a key role in shaping the subsequent semi-structured interviews. To maintain flexibility and ensure that interviewees' responses were not unduly constrained, open-ended questions were formulated, aligning with the exploratory nature of this research.
Patton [54] underscores the importance of purposeful sampling in qualitative research, advocating for the deliberate selection of information-rich cases for in-depth investigation (p. 230). Following this methodology, thirty-two school leaders were intentionally chosen, ensuring almost equal representation from each country. These selections primarily comprised school principals, alongside a few vice principals, headmistresses, and headmasters. Given the exploratory focus of the research, the emphasis was on uncovering novel ideas and gaining a profound understanding, rather than pursuing statistically representative data. Thus, a modest sample size was deliberately maintained, prioritising cases rich in information.
In accordance with the goal of purposeful sampling, namely to explore in depth cases that provide substantial insight into the central study issues, the selection criteria considered several factors. These encompassed geographical location, spanning both rural and urban settings, school type (primary or secondary), and school leaders' experience levels in monitoring, evaluation, and supervision. The intent was to eschew broad empirical generalisations in favour of a nuanced examination of specific cases. The study involved school leaders whose leadership experience varied from five to thirty years. This wide range of experience, covering different lengths of tenure, enriched the research by incorporating a diverse array of perspectives and contributed to a more thorough understanding of the significance and implementation of SSE practices. The criteria for selecting participants were formulated through collaborative discussions with colleagues in each country, who also aided in identifying suitable respondents.
The study primarily employed theoretical themes derived from the document analysis and literature review. The respondents, school leaders with extensive experience of school inspection and quality assurance practices, were carefully selected for their relevance [55]. The alignment of pre-determined themes and codes with the interview questions facilitated a process in which each successive interview validated the information obtained from previous interviews. This process led to data saturation, as defined by Fusch and Ness [56], where no new data, themes, or coding were identified. Interviews were conducted with school leaders from one country at a time to obtain a comprehensive and in-depth understanding of the data. The initial phase involved conducting four interviews to establish a base size and identify unique themes. Subsequently, three additional interviews were conducted to search for new information, and little new information was encountered in this second set. In the case of Pakistan, however, where the feedback on SSE was not very positive, four additional school leaders were interviewed to ensure data saturation and the robustness of the findings.
All interviews adhered to a consistent approach, with the same interviewer conducting each session. Prior written consent from interviewees was diligently obtained, and interviews were audio-recorded and transcribed verbatim to safeguard the accuracy and integrity of the collected data. The interview questions were thoughtfully structured around four principal themes: the significance of SSE in the QA system, the methodologies employed for SSE within schools, school improvement planning processes, and the training and support related to SSE provided to school leaders.
To uphold the ethical principle of participant confidentiality and anonymity, a systematic coding system was implemented for each research participant. Alphanumeric codes, such as SL1, SL2, and so forth, were assigned to each participant along with a country code (DB: Dubai, IE: Ireland, NZ: New Zealand, and PK: Pakistan). This coding system replaced any personally identifiable information and ensured that the identities of the school leaders remained protected throughout the study. This precaution was taken to encourage open and candid responses from the participants, thus fostering an environment of trust and honesty during the interviews and data analysis phases.
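Purely as an illustration of the anonymisation scheme described above, the short sketch below shows one hypothetical way such participant codes could be generated; the function name and the per-country counts are assumptions and do not reflect the actual procedure or distribution of interviewees.

```python
# Illustrative sketch (hypothetical) of the participant coding scheme described above:
# a country code (DB, IE, NZ, PK) combined with a sequential school-leader code (SL1, SL2, ...).
COUNTRY_CODES = {"Dubai": "DB", "Ireland": "IE", "New Zealand": "NZ", "Pakistan": "PK"}


def assign_codes(participants_per_country: dict) -> dict:
    """Map (country, participant index) to an anonymised code such as 'DBSL2'."""
    codes = {}
    for country, n in participants_per_country.items():
        prefix = COUNTRY_CODES[country]
        for i in range(1, n + 1):
            codes[(country, i)] = f"{prefix}SL{i}"
    return codes


if __name__ == "__main__":
    # Hypothetical near-equal split of the 32 interviewed leaders across the four countries.
    print(assign_codes({"Dubai": 8, "Ireland": 8, "New Zealand": 8, "Pakistan": 8}))
```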
The analysis of interview transcripts followed the procedural framework outlined by Clarke and Braun [57] for thematic analysis. This method was chosen due to its applicability for both inductive (data-driven) and deductive (theory-driven) analyses. While the study primarily adhered to a deductive approach, employing a predefined set of themes, the coding process retained an inductive element, accentuating noteworthy features across the entire dataset. The thematic analysis encompassed several key steps. Initial readings and subsequent reviews of the transcripts were conducted to establish familiarity with the data. Codes, both open and predefined, were identified, with an emphasis on organising them according to the predefined themes. A comprehensive review of themes ensued, involving an assessment of their effectiveness in relation to coded extracts and the dataset as a whole. The subsequent steps involved ongoing refinement of themes through continuous analysis, ensuring specificity and coherence. The definition of themes was a crucial phase, contributing to the articulation of the overall narrative derived from the analysis. The final step involved summarisation, where vivid and compelling examples were selected, and their relevance was assessed by contextualising them within the research question and existing literature.
3.1. SSE practices in Dubai
The Dubai School Inspection Bureau (DSIB), a part of the KHDA, initiated school inspections during the academic year 2008–2009. The emphasis on SSE was gradually adjusted, allowing schools to familiarise themselves with and gain confidence in the self-evaluation and improvement planning processes [58]. Initially, schools were encouraged to integrate the inspection framework into ongoing performance reviews, aligning it with other self-evaluation measures and prioritising future improvement efforts. This phase underscored two essential reflective questions: 'Where are we?' and 'Where would we like to be?' [59]. Subsequently, schools were prompted to align their SSE progressively with the quality indicators outlined in school inspection handbooks. As a result, SSE records became a pivotal component in the evidence-gathering process for DSIB's inspection preparation [60].
In the initial inspection round, approximately one-third of schools exhibited deficiencies in self-evaluation and improvement planning. However, during the subsequent academic year (2010–2011), inspectors noted a substantial improvement, with about half of the schools assessed as performing well or exceptionally in these areas [59]. By the academic year 2011–2012, SSE had evolved into an integral aspect of the school inspection process [61,62].
Schools receive supplementary guidelines for self-evaluation to facilitate this process [62]. Concurrent with the handbook's publication, the DSIB introduced an online resource named 'Self-Evaluation: An Online Resource for Schools,' accessible through its official website.4 This platform presents a set of essential inquiries designed for schools to employ during their self-evaluation exercise. It offers templates and relevant questions aligned with each quality indicator featured in the inspection handbook. Schools are strongly encouraged to utilise the information sheets and the provided key questions, either as integral components of their continuous self-evaluation processes or as elements of their submissions to accreditation teams or DSIB inspectors [60].
In Dubai, SSE is sequentially linked to external evaluation [47], requiring schools to submit self-evaluation forms electronically to the DSIB before onsite inspections. Inspectors conduct comprehensive assessments, encompassing interviews with school staff, interactions with students, classroom observations, and evaluations of student work [20,63]. The collected data undergo rigorous analysis and are matched with SSE findings to evaluate school leaders' understanding of their schools. The aim is to transition towards validated SSE as the primary mode of inspection [64].
The self-evaluation template closely mirrors the inspection criteria, mandating schools to rate themselves on a six-point scale for each standard outlined in the quality framework [65]. This structured and prescriptive tool, required for inspection preparation [21], undergoes close scrutiny during onsite inspections. Inspectors engage in discussions with school leaders to affirm the assigned grades, resembling a validation process.
Additionally, in almost every inspection cycle, 'special inspection topics' linked to the national agenda or policy are added to the quality criteria, which both inspectors and schools are required to evaluate during the on-site phase of the inspection process and in SSE. These topics encompass various areas, including the quality of a school's support for students with special educational needs, their progress in core subjects, the academic achievements and progress of Emirati students, the use of digital technology, and gamification, among others. Schools are expected to self-evaluate their performance in these specific areas, which are then assessed by inspectors as part of the overall evaluation process.
SSE practices have come a long way from a reflection based on two questions in 2009 (as mentioned above) to an SSE cycle that comprises Review (how are we doing?), Reflect (how well should we be doing?) and React (what will we be doing to improve?). The quality of SSE is also among the key limiting judgements5 on which schools have to achieve a good, very good, or outstanding rating in order to obtain an overall good, very good, or outstanding grade. No school can be rated highly if the quality of its SSE is not judged highly by the school inspectors, nor can a school stay in the market if it is repeatedly judged acceptable or below [28]. KHDA school inspections, with their strong focus on SSE, are also credited in scholarly research with driving continuous improvement in teaching and learning and with cultivating a professional education environment within private schools [20,66].
The interviewed school leaders unequivocally endorsed the significance of SSE in school improvement planning. They stressed its pivotal role in external inspections, highlighting its use as a foundation for validation. The leaders emphasised the critical need for accuracy in SSE.
‘The middle leaders along with the senior leadership team, sit down to complete SEF [self-evaluation form], which is a working document by the way it goes. It's an ongoing thing that gets completed well in time before the inspection … Self-evaluation tells them a whole story about where the school is, what we are doing, where we stand … So, inspectors use the SEF and SIP [school improvement plan] provided by each school as starting points for their work’ (DBSL2)
Another leader emphasised that inspectors prioritise how self-evaluation contributes to the school's self-improvement during inspections.
‘The primary focus for inspectors is how the self-evaluation contributes to the school's self-improvement. During inspections, there's a common concern among inspectors that many schools develop impressive Improvement Plans, but often they lack a foundation in evidence or fail to address the specific needs of the school’ (DBSL3).
Further insights were provided by a school leader who stressed the significance of accuracy in self-evaluation, noting its impact on the perception of leadership.
‘Self-evaluation is a big deal because it needs to be spot on. If it's not accurate, it really shines a light on the leadership’ (DBSL5).
In terms of supporting the development of evaluation capacity among school leaders, the leaders discussed a variety of publications and resources accessible through the KHDA website. Notable examples include the annual Key Findings of School Inspections and supplements dedicated to providing guidelines for school leaders and teachers on self-evaluation. Additionally, the KHDA offers professional development opportunities and facilitates the sharing of best practices and knowledge among schools graded as Outstanding or Very Good with others.
The prescriptive SSE instrument, along with supplementary resources for completing the SEF, has simplified the adoption of the SSE process by school leaders. Concurrently, the decisive role of the grade given to SSE in the overall school inspection report increases the urgency among school leaders to adhere to KHDA inspectors’ SSE guidelines. All school leaders interviewed were not only familiar with these expectations but also appeared to be following them diligently.
3.2. SSE practices in Ireland
SSE became mandatory in Ireland in 2012 (Department of Education [DoE] Circular Nos. 0040/2012 and 0039/2012), with processes for its implementation beginning well before [49]. The Whole School Evaluation pilot of 1999 initiated internal evaluation, a precursor to SSE. From 2001 to 2003, some schools participated in a European Commission-funded project to understand and implement SSE [67,68]. This led the Department of Education to recognise the need for a quality assurance system integrating internal and external evaluations [69]. As a first support mechanism, the Department of Education Inspectorate published Looking at Our Schools – an aid to second-level schools (LAOS) [69]. However, recognising the limited capacity of schools to carry out effective SSE and the limited training provided by the DoE support services, the inspectorate did not strongly enforce an impact-focused approach to SSE [35,70,71].
However, according to Brown et al. [46], the decline in Programme for International Student Assessment (PISA) test scores in 2011 expedited the launch of SSE as a mandatory requirement for all schools in 2012 [35,50]. SSE was introduced as a structured six-step cycle expected to be completed in three to four years in each school. Currently, schools are conducting the third cycle of SSE. Initially, during the first cycle of SSE, to simplify the process, school leaders were instructed (Circulars 0039/2012,6 0040/20127) to focus on literacy and numeracy, along with one additional area for school improvement. In the second cycle (Circulars 0039/2016,8 0040/20169), recognising schools' growing familiarity with SSE, they were encouraged to extend their focus to one or two curriculum areas in primary and post-primary schools. This extension aimed to identify and address aspects of teaching and learning, with reference to the Framework for Junior Cycle and other national initiatives, while maintaining a primary focus on teaching and learning. However, in the third cycle (2022–2026, Circular 0056/202210), schools are empowered to choose the areas they consider crucial for improvement. This includes a focus on Well-being and the integration of digital technologies in teaching and learning, aligning with the Wellbeing Policy Statement and Framework for Practice 2018–2023,11 and The Digital Learning Framework 2017,12 identified as national priority areas. Compared to the first cycle, the second cycle provided more flexibility and choice to schools in managing and organising the SSE process (Chief Inspector's report 2016–2020), and similarly, the third cycle allows even greater autonomy than the second [72,73].
The inspectorate offers robust support for school leaders and teachers to enhance evaluation skills and assist in school improvement planning. The Department of Education website13 provides various SSE resources, including guides to school self-evaluation and SSE support webinars, as well as the option for schools to request advisory visits by inspectors. The School Quality Framework (LAOS – Looking at Our School) has undergone two revisions, in 2016 and 2022, emphasising schools' self-evaluation for continuous development and improvement [74,75]. Each edition reflects recent educational reforms and emerging needs, covering areas such as child safeguarding, inclusion, digital competences, approaches to remote teaching and learning, and the development of pupils’ independent learning skills [75, p.6]. Schools are reminded that LAOS is a quality framework, not a rigid checklist, and they are not expected to cover all aspects during self-evaluation [74,75].
Since the implementation of SSE as a mandatory practice in Ireland, it has been presented as a cyclic six-step process14 for school improvement. This model involves identifying a focus, gathering evidence, analysing and making judgments, writing and sharing a report, implementing the improvement plan and monitoring actions, and evaluating impact. Notably, the model places a strong emphasis on evidence gathering and analysis, encouraging schools to make decisions based on their data—a practice commonly referred to as data-informed decision-making. Brown et al. [35] characterise the Irish inspection model as a co-professional evaluation. This designation stems from its unique attribute of allowing schools a significant voice through dialogue and self-evaluation, thereby shifting the balance of evaluation power. In essence, this approach transforms the role of inspectorates into one focused on aiding schools in enhancing quality through robust internal processes.
Inspectors must consider schools' internal reviews to validate commendable practices and achievements [70]. The symbiotic relationship between SSE and external evaluation is emphasised, with SSE serving as a crucial source of evidence for improvement. SSE and external inspection are viewed as complementary, and the quality of SSE is a pivotal aspect in the 'Whole School Evaluation' report. The Chief Inspector's report for 2016–2020 noted commendable engagement with SSE by school leaders, rated as either good or very good, which positively impacted the learning environment [73].
The interviewed school leaders consistently demonstrated their familiarity with the SSE process and understanding of its essential role in facilitating school improvement initiatives.
‘It [SSE] is essential. You're monitoring attendance, academic attainment, retention, literacy, numeracy, all those kinds of things. So, you're putting in place initiatives and you're having action plans and you're constantly setting targets, evaluating, seeing where you can make further improvements, and so forth’ (IESL2).
At the same time, they criticised it for being prescriptive and wanted the inspectorate to involve the schools more in rethinking the SSE model.
‘The department's model of self-evaluation is quite restrictive as it will have you looking at literacy and numeracy as the only real components of school success. I think schools' self-evaluation has a way to go in building evaluation competencies among school staff and adopting a transformational leadership model where schools take control of improvement themselves, set targets and actions, and work towards them’ (IESL4).
‘I think it's something that the inspectorate should look at maybe, talking to schools more about SSE and giving them more of the rationale for it’ (IESL1).
The school leaders also referred to inadequate professional support provided to schools to effectively carry out the self-evaluation, possibly leading to challenges in implementing the process.
‘There is a difficulty with SSE and the difficulty is that all schools were asked to do it. We were given the process, but we weren't taught about the product’ (IESL5).
In Ireland, where the SSE process was introduced and implemented over two decades ago, school leaders seek increased autonomy in conducting SSE while simultaneously requesting specific support for its effective implementation. Further, principals have a dual perspective on the SSE model: while they critique its prescriptiveness, they also recognise its importance and advocate for greater school involvement in reshaping the framework to better align with the needs and realities of educational practice.
3.3. SSE practices in New Zealand
In the 1980s, New Zealand embarked on educational reforms influenced by market principles to streamline bureaucracy and empower schools in decision-making. By the early 2000s, the focus shifted towards improving the quality of education, introducing education reviews and evaluation criteria [76]. Over two decades, this review methodology became integral, fostering collaboration between the Education Review Office (ERO) and schools, aiming for a self-improving and sustainable education system. This collaborative approach merges external evaluation with the constructive potential of self-review, emphasising ERO's role as an active partner in school evaluation [31].
The school evaluation indicators include ‘self-review’ as one of the sub-areas of the main dimension: leading and managing the school. The ERO [77] says, ‘schools are required to conduct their own self-review, although the manner in which this is to be done is not prescribed’ (p. 8). According to Timperley [78], it is important to distinguish between prescription, which implies a mere replication and checklist approach, and the specification of crucial processes rooted in a profound comprehension of their significance and how they can be implemented in alignment with a set of guiding principles. In this drive towards promoting self-review, schools were urged to actively engage in self-review across various dimensions, including strategic, regular, and emergent reviews, which also involved a keen examination of their current status, the integration of data as a regular practice, and the utilisation of evidence to gauge progress towards objectives and identify subsequent actions [76].
Recently, there has been a significant change in the ERO's evaluation approach. The new approach, called the Evaluation for Improvement Approach, is, according to Goodrick [79], based on the principles of Developmental Evaluation as presented by Patton (2010). The ERO has transitioned away from event-based evaluations, which previously yielded summative judgments of a school's performance following a three- or four-day on-site evaluation. Instead, each designated review officer, now referred to as an ‘evaluation partner,’ is allocated a portfolio of schools. This revised approach promotes a more collaborative engagement, whereby evaluation processes are conducted in partnership with schools, rather than being imposed upon them [80]. This collaborative approach offers ‘sharing of evidence at the conclusion of each phase, critical reflection opportunities, and progressive reporting to ERO’ [79, p.1].
Moreover, the approach is grounded in key principles aimed at achieving equity and excellence in education for all students. It emphasises learner-centeredness, participatory collaboration, and cultural responsiveness. Technical rigour is maintained throughout the evaluation process, with a focus on utilising results to inform decision-making and improve educational practices. Additionally, the framework supports the development of evaluation capacity within schools, enabling meaningful self-review [80]. For instance, a school principal shared, ‘It's [ERO evaluation approach] not reviewing the school itself. It's reviewing the school's own capacity for self-review. The model is actually arranged around the school's own evaluation for improvement cycle’ (NZSL3).
In conjunction with the assistance of an evaluation partner, ERO provides a range of supporting documents on its website. Among these resources, a notable document is the 'School Improvement Framework,' which offers a continuum for various dimensions15 of school improvement [81]. Schools, based on their self-review processes, determine their current status and select their goals for a three-year period. According to the school leaders interviewed, the Evaluation for Improvement Approach is a high-trust model based on evaluation partners' trust in school leaders' capacity to self-evaluate and schools' trust in the evaluation partners' ability to facilitate a professional dialogue on improvement and support. Founded on robust collaboration, the model allows the ERO to gain a more accurate perspective on what is and is not effective across schools in New Zealand [82]. However, during the interviews, school leaders referred to variability in evaluation capacity across schools, which results in evaluation partners spending more time in schools where the need is greater (NZSL5).
According to the ERO strategic plan for 2020–2024, the school evaluation strategies in place have demonstrated considerable efficacy. ERO has identified twenty percent of schools in New Zealand as strong, with the majority being classified as well-placed. These collectively encompass approximately 95 percent of student enrolments. Conversely, only eight percent of schools exhibit areas of concern that outweigh their areas of effective practice, as reported by ERO in 2020 [83, p.11]. Furthermore, an examination of the PISA results for 2018 indicates that New Zealand achieved scores above the Organisation for Economic Co-operation and Development (OECD) average in reading, mathematics, and science literacy. Notably, there was no statistically significant deviation in performance in reading, mathematics, and science literacy when compared to the PISA 2015 results [84].
All the school leaders interviewed were unambiguous about the significance of SSE in the SIP and referred to several levels at which SSE can be applied.
‘It [SSE] is integral to the SIP. It involves self-review processes at both the individual level, focusing on professional learning, and at the broader levels of faculty and the entire school’ (NZSL2).
School leaders frequently allude to the rigour that permeates their SSE practices, as they meticulously collect data from diverse sources to assess their school's performance and determine their subsequent course of action.
‘We are never satisfied with our level of performance and when something happens, we always go back and ask ourselves, how did it go? What went well? What do we need to improve? Who else should be included in the consultation? So we are pretty hard on ourselves in terms of our self-review, probably harder than what ERO would be actually’ (NZSL5).
The school leaders also acknowledged the array of resources and professional learning opportunities available to them. They appreciated the support of their evaluation partner, which allows them significant autonomy.
‘I much prefer the new relationship, because it is ongoing so we're able to set goals, and then they [evaluation partners] come in, observe what I've talked about, and give me some feedback on how I'm going towards achieving those goals’ (NZSL6).
‘We have the opportunity as a school to set our own targets with some conversation and critique and in dialogue with the evaluation partner’ (NZSL4).
The SSE practices, introduced through the evaluation for improvement approach, grant schools complete autonomy to utilise their data, select areas for improvement, and set targets based on their individual contexts. The role of ERO evaluation partners is merely to oversee the quality of the SSE process and help build the evaluation capacity of the school leaders by asking tough evaluative questions.
3.4. SSE practices in Pakistan
The term SSE was noticeably absent from the websites, official documents, and notifications of the Departments of Education across all four provinces. Consequently, I initiated an examination of the job descriptions (JD) for school leaders, as well as those for district-level supervisory and monitoring staff. The primary objective was to discern, within the delineated responsibilities and expectations of these roles, whether there is a direct or indirect reference to SSE practices. Simultaneously, I also sought to investigate any explicit mention of school improvement planning, aiming to establish a connection between SSE and SIP. Furthermore, I explored whether supervisory staff were tasked with the responsibility of enhancing the evaluation capacity of school leaders and teachers.
Remarkably, the JD and Key Performance Indicators (KPI) for school leaders are available only on the School Education and Literacy Department (Sindh) website. The Sindh school leaders interviewed, who had access to the JD, demonstrated familiarity with the expectations of their roles. School leaders in Balochistan also had a copy of the notification issued by the School Education Department16 that contains detailed JDs for various levels of school leaders. Conversely, in the other two provinces, Punjab and Khyber Pakhtunkhwa (KPK), the JD could not be located on the education department websites, and the interviewed leaders did not possess copies. However, the school leaders mentioned that their respective Departments of Education issue notifications periodically, detailing their responsibilities. The school leaders' JD and KPIs for Sindh include references to SSE-related activities and the SIP: developing, implementing, and monitoring school development plans; utilising student data for retention, grade repetition, and dropout analysis; and appraising teachers' performance through class inspections and student reviews. Moreover, some school leaders in Punjab shared 'A Head Teachers' Guide,' outlining various responsibilities of primary and elementary school leaders, including academic roles and tasks. This guide covers evaluating and preparing annual reports, analysing examination results, and reviewing the school's performance in both in-class and out-of-class activities [85, p.2].
Supervisory staff JDs17 are available on provincial education department websites, outlining responsibilities for ensuring quality education. Their roles encompass enhancing student learning, managing learning and development programmes, training staff, conducting inspections, and using data for teacher development. They also focus on student retention and staff appraisals, but their administrative duties often overshadow academic ones. Furthermore, a notable observation is the significant overlap among the duties associated with various designations within the departments of education. This overlap implies a complex interplay of roles, potentially impacting the effective execution of their responsibilities.
The KPK Education Monitoring Authority18 (KPEMA) website outlines its objectives, and a System User Guide shared by one of the monitoring officers provides details about the indicators and guidelines for different tiers of monitoring staff, including Data Collection and Monitoring Assistants. The Programme Monitoring and Implementation Unit19 (Punjab) website details the responsibilities of monitoring staff and provides descriptions and analysis of data collected from schools. The Directorate General of Monitoring and Evaluation in Sindh20, as well as the School Monitoring System in Balochistan,21 focus primarily on reporting information related to teacher attendance, student attendance, and the availability of basic facilities in schools.
Several studies [86,87,88,89,90] report that this service has generally contributed to teacher regularity and attendance, and it has improved the functionality and availability of basic facilities. Yang et al. [91] argue that monitoring systems have improved teacher attendance and regularity but do not guarantee quality teaching. Similarly, the service merely tallies the number of support visits conducted by district supervisory staff, completely disregarding the purpose and nature of these visits [92].
While SSE-related activities are referenced in the JDs, websites, and other documents, the term 'SSE' itself never appears in any of these materials, suggesting that SSE is not deemed obligatory. All the school leaders interviewed shared different measures they undertake to ensure the quality of education in their schools. Though they universally acknowledged the importance of SSE for school improvement, common SSE practices varied among them. Many mentioned observing lessons and evaluating students' notebooks to assess the quality of assignments and the rigour of checking. Some leaders also stated that they review lesson plans. The level of commitment to, and understanding of, these tasks varied greatly.
For instance, a school principal stated, ‘I observe newly appointed teachers only. It feels awkward to observe the lessons of senior and experienced staff. Some principals sit at the back of the classroom to assess teaching, but I don't find it appropriate. I prefer spot checks. I complete nearly 3–4 rounds of my school daily. I don't allow teachers to check notebooks in the classrooms [during teaching time] or use mobile phones’ (PKSL8).
Some school leaders also mentioned that they ‘conduct monthly review meetings with the staff, examining the overall weak areas of the school and collectively deciding measures to address them’ (PKSL5).
However, most of them considered SSE only as the daily monitoring of school activities.
Similarly, their understanding of school improvement planning and practice also greatly varied. A principal explained that he collaborates with a group of teachers and develops medium-term and long-term plans for the school.
‘The long-term plan involves the expansion of the infrastructure (classrooms, furniture) of the school in light of projected needs. Our medium-term plan encompasses areas such as annual school paint and repair work, timetabling, distribution of free schoolbooks among students, purchasing stationery for teachers, conducting surveys for new admissions, providing teaching resources for teachers, and planning examinations, among other activities’ (PKSL7).
For other school leaders, the SIP was more about maintaining a register and recording co-curricular and extra-curricular activities, and some referred to the SIP as an annual statement of resources and expenditures. However, one school leader clarified that 'School development plans are mostly not prepared, as it is not mandatory for schools. Not a common practice, actually' (PKSL4).
The only exceptions were certain school leaders in primary and elementary schools in Punjab, who acknowledged receiving a lesson observation instrument from their designated Assistant Education Officers, using it consistently during lesson observations, and coaching and mentoring teachers to develop activity-based learning approaches and the effective use of audio-visual aids; no other school leader mentioned any SSE-related professional development.
SSE-related tasks are incorporated into the job descriptions of all supervisors, but research, such as the study conducted by Noor and Nawab [93], indicates that the priorities of school leaders tend to lean more towards administrative and managerial tasks rather than focusing on student achievement, professional development for teachers, and the establishment of a conducive learning culture.
4. Discussion and conclusion
The methods utilised have proven instrumental in addressing the research questions and fulfilling the objectives of the paper. Through a comprehensive review of research literature and policy documents, coupled with interviews conducted with key stakeholders, I obtained a nuanced understanding of both the overarching policies and the practical implementations of SSE in each respective country. These methods not only provided a thorough overview of policy and practice but also facilitated a deeper insight into the perception of SSE within each context.
In this final section, I undertake a comprehensive examination of the SSE practices within the four distinct educational systems, as delineated in Table 1. Through meticulous comparison and contrast of these practices, my objective is to elucidate nuanced patterns, strengths, and potential areas for improvement. This comparative analysis not only underscores the diversity of SSE implementation but also highlights its adaptability across varying educational contexts, thereby filling the knowledge gap identified in previous research. Furthermore, I endeavour to draw policy conclusions regarding the value of SSE and propose ways forward to increase its role and effectiveness in very diverse education systems.
Table 1.
Comparison of SSE practices in Dubai, Ireland, New Zealand and Pakistan.
| Country | Role of SSE in QA | Introduction of SSE | Focus of SSE | Documents Guiding SSE | SSE and School Improvement Cycle | Other SSE supports | 
|---|---|---|---|---|---|---|
| Dubai | Functions complementarily with External Evaluation processes | Implemented gradually | SSE Form replicates School Inspection Framework | School Inspection Framework | One year | Supplementary resources on the KHDA website | 
| Ireland | Complementary to External Evaluation | Implemented gradually | Emphasis on Numeracy, Literacy, Digital Learning, Wellbeing, and other school-identified areas of significance | Looking at our School and circulars issued by the Department of Education prescribing SSE procedures | Three to four years | Advisory visits by school inspectors | 
| New Zealand | Constitutes a central component of the Quality Assurance system | Implemented gradually | Focus areas determined by the school according to contextual needs | School Evaluation Indicators | Three years | Evaluation partner's onsite capacity-building visits | 
| Pakistan | Not part of the QA framework, though mentioned in school leaders' JDs | Yet to be introduced as a QA practice | Student achievement and quality of teaching and learning (as in school leaders' JDs) | Referred to sporadically in different documents in different provinces | Not mentioned anywhere | Supervisory staff support visits | 
Regarding Pakistan, a distinctive circumstance prevails in which the term SSE finds no mention in any official documentation. The interviewed school leaders made no reference to using school data, including achievement and attendance, to set targets for school improvement. None referred to SSE and school improvement planning as a process; instead, they mostly talked about SSE as a singular, episodic accountability event. The prevailing explanation for this circumstance is an approach in which most district supervisory and monitoring and evaluation staff treat the availability of teachers and basic facilities as indicators of the quality of education, and this approach has permeated down to school leaders. Although the official documents mention using student achievement and retention data to plan interventions, and classroom observation as a means of assessing teachers' training needs, these are not common school practices. Furthermore, the centralised control structure significantly restricts the autonomy accorded to school leaders and teachers. Coupled with the paucity of professional development opportunities, as discerned through interviews with school leaders, this contributes to their constrained capacity to self-evaluate their professional practices. Despite the identified challenges, the current absence of SSE from the framework does not preclude the possibility of integrating it into the educational system.
As document analysis and interviews with school leaders in the other three case studies, Dubai, Ireland, and New Zealand, reveal, SSE places value on sustainable school improvement, but this is achieved in differing ways. In Dubai, SSE is a strictly rigid process imposed on private schools by a state agency, where quality criteria are used as a checklist by schools and then supported by evidence for a self-rating presented to school inspectors. In Ireland, SSE is characterised by a low-stakes accountability system [3,94], allowing schools to focus on two to three areas during one SSE cycle and to plan actions for improvement. In New Zealand, external evaluation is almost completely integrated into SSE, granting full autonomy to schools to assess their context and needs and make decisions accordingly.
Moreover, the SSE approach adopted appears to mirror the structure of the education system within each country. For example, in New Zealand, the collaborative partnership of ERO and SSE aligns with the significant autonomy granted to school leaders. Ireland's semi-prescriptive SSE correlates with its low-stakes accountability framework, providing some degree of constraint on otherwise autonomous school leaders. Conversely, in Dubai the fully prescriptive SSE of KHDA inspections reflects the stringent control exerted by KHDA over private schools to maintain their presence in Dubai's education market. The complete absence of SSE terminology from school documents underscores the hierarchical and autocratic nature of Pakistan's education system, where school leaders possess limited autonomy to drive improvement initiatives.
The implementation of SSE and improvement planning has been a gradual and incremental process across three of the four jurisdictions under consideration: Dubai, New Zealand and Ireland. Moreover, the mandatory nature of these initiatives has exhibited variation in both the approach and pace of adoption. In Dubai, for instance, within two years of its initial introduction, SSE became a compulsory undertaking, with potential downgrading consequences for schools failing to conduct realistic and evaluative SSE. In Ireland, schools are encouraged to engage in the SSE process, yet there are no repercussions for poor performance, given the absence of grading and publicly available school league tables [90]. Meanwhile, in New Zealand, the implementation has followed a gradual trajectory, emphasising the provision of tools and nationwide training programmes aimed at enhancing the self-review capacities of school leaders and utilising self-review as a mechanism for school improvement [95].
The recent approach observed in New Zealand reflects the outcome of sustained efforts by the ERO to strengthen the self-review capabilities of school leaders. This evolution has led to a transition toward a high-trust model, wherein the ERO trusts schools to possess the requisite capabilities for reviewing their practices and effecting improvements, particularly in enhancing student achievement. Simultaneously, evaluation partners contribute evaluative insights to schools, augmenting the evaluative quality of their self-review processes. A very similar gradual progression is taking place in Ireland. One can therefore argue that the acceptance and sustainability of SSE within schools in these less centralised and prescriptive jurisdictions depend on the gradual initiation and evolution of the process. Even in the very different educational governance environments of Dubai and Pakistan, one can perceive a slow but similar movement towards placing greater responsibility for school standards and improvement on the schools themselves. It may well be that, for reasons of cost and practicality if no other, the direction of travel is, if slowly, towards the New Zealand approach. SSE is a mandatory activity and a government imperative in Dubai, Ireland, and New Zealand. The differences, as previously discussed, lie in its prescriptive or non-prescriptive nature. Simons [36] posits that when SSE is only a voluntary activity, its uptake is generally slow and infrequent; schools often require an external imperative to initiate the process (p. 18). Hence, for SSE to be effectively implemented in any education system, it needs to be made a government imperative. As can be seen in the case of Pakistan, in the absence of an official mandate, SSE practices are erratic across the country.
Although SSE is a well-established process in Dubai, Ireland, and New Zealand, its initiation encountered various challenges. For instance, McNamara et al. [50] assert that the successful implementation of SSE in Ireland faced several issues, including the availability of key data in schools for conducting SSE, the capacity of school leaders and teachers, limited involvement of the entire school community, a restricted role for parents or pupils, and an unclear relationship between SSE and external evaluation. Similarly, in New Zealand, schools initially lacked the internal capacity for the analysis and use of data, and staff may have had insufficient time and motivation for gathering and scrutinising data. Challenges also arose in determining the types of data that were most pertinent and of the highest priority. These challenges were addressed through capacity building for SSE, deemed an essential avenue for professional learning and support for teachers, school leaders, and Boards of Trustees, facilitating effective school self-review and the efficient use of evaluative data for improvement [95].
In the case of Dubai, similar challenges are anticipated, necessitating the development of guidelines and support materials for SSE. When teachers and school leaders receive training in systematic data collection and utilisation, coupled with an awareness of its role in school development, the likelihood of sustaining the SSE process increases [36]. Therefore, prior to initiating SSE, staff responsible for ensuring school quality must undergo training in activities that guide and support SSE practices. There is also a need for continuous professional support for school leaders to enhance their evaluation capacity. The variability in school leaders’ ability to self-evaluate and the requirement for SSE practice training consistently emerged during interviews, particularly in cases involving Ireland and Pakistan. The KHDA has addressed this issue by simplifying the SSE process, while ERO has assigned an evaluation partner to several schools, catering to their capacity-building needs.
In Pakistan's context especially, the motivation of school leaders and teachers to engage in SSE can also pose a challenge. School leaders and teachers are civil servants recruited through provincial public service commissions, and terminating their services is a lengthy and challenging process, as Simons [36] illustrates in the case of teachers and school leaders in Spain. They may question why they should spend their time conducting SSE, as it might encroach upon their administrative and teaching responsibilities. The incentive, of course, is the improvement in teaching and the provision of better learning opportunities for students; nevertheless, it needs to be deeply ingrained in the system.
The successful implementation of SSE and school improvement planning, as discerned from the study of education systems other than Pakistan, requires a gradual and incremental initiation. SSE needs to become an obligatory requirement and, above all, those in school support roles, school leaders, and teachers must undergo training to build their evaluation capacity. With these prerequisites met, SSE can be successfully initiated by mandating schools to proactively assess their own performance and practices. SSE can typically be guided or overseen by supervisory staff, as would be the case in Pakistan, who hold official positions in the education system and are responsible for ensuring compliance with standards, regulations, and educational goals. In such contexts, supervisory staff play a crucial role in guiding and supporting schools' self-evaluation efforts, providing expertise and feedback and ensuring that the evaluation aligns with educational objectives and quality standards. Such collaboration between schools and supervisory staff can contribute to enhancing the overall quality of education and school performance. However, there is a need to shift the approach, moving the focus from merely considering student enrolment, teacher attendance, or basic facilities to improving the quality of teaching and learning and achieving better learning outcomes.
In summary, these exemplars provide a continuum of approaches to SSE, with Pakistan at one end, exhibiting minimal SSE practices dependent on school leaders' arbitrary choices and lacking official obligations. New Zealand, on the other hand, stands at the opposite end, where SSE serves as the primary quality assurance activity. Dubai and Ireland occupy positions somewhere in between. In countries like Pakistan, where SSE is scarcely present in the system, Dubai's prescriptive SSE model can serve as a valuable starting point. As schools become familiar with the SSE process, there is room for introducing more autonomy and flexibility, following Ireland's example. Eventually, collaborative partnering, similar to the model in New Zealand, could be the ultimate goal at a later stage. The findings of the study can be extrapolated to apply to any country where schools face constraints on autonomy or where SSE practices are either not initiated or are at a rudimentary stage.
5. Implications, limitations and future research directions
This study advances both the theoretical and practical understanding of SSE by demonstrating its potential integration into diverse educational systems and offering actionable insights for stakeholders. The comparative analysis of SSE practices in Dubai, Ireland, New Zealand, and Pakistan underscores SSE's adaptability to centralised as well as decentralised structures, highlighting its potential as a universal and scalable framework for global quality assurance and educational improvement. The research also provides a conceptual framework for understanding SSE's evolution in relation to school inspection systems and broader educational landscapes, while also offering a strategic roadmap for policymakers, educational leaders, and practitioners.
Possible limitations of the study include its reliance on the perspectives of school leaders, which may not capture the full scope of SSE practices. Incorporating the views of other stakeholders, such as teachers, students, and parents, would provide a more comprehensive understanding. Additionally, the study's focus on current SSE practices may not fully reflect long-term outcomes. To address these limitations, future research should consider a broader range of stakeholder perspectives and conduct longitudinal studies to examine the enduring effects of SSE.
Ethical approval statement
This study was approved by the Research Ethics Committee of Dublin City University, with ethics approval reference DCU/REC/2022/008.
Data availability statement
The data associated with this study has not been deposited into a publicly available repository due to the presence of confidential information.
Funding
This research did not receive any specific funding.
Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Footnotes
Punjab, Sindh, Khyber Pakhtunkhwa (KPK) & Balochistan.
School status, infrastructure details, teaching and non-teaching staff details, student enrolment and attendance details, administrative visits, delivery of textbooks, stipend details, basic facilities, Parent-teacher council details.
The other three key limiting judgements are student progress and achievement, quality of teaching and learning, and leadership and management.
https://www.dlplanning.ie/wp-content/uploads/2018/10/DLF_Primary.pdf.; https://www.dlplanning.ie/wp-content/uploads/2018/10/digital-learning-framework-post-primary.pdf.
Learners: Learner Progress & Achievement, Te Tiriti O Waitangi, Inclusive Learning Climate, Wellbeing & Safety and Conditions to support learners: Responsive Curriculum & Planning, Effective Teaching, Leadership and Capability, Partnerships, Stewardship, Evaluation for Improvement.
No.PPIU/8–70/(A7F)/2023/2552-61/dated 9th May 2023.
Sindh Education and Literacy Department http://www.sindheducation.gov.pk/notifications?S=Search&key=Job+description&pageNumber=1&id=34; Job Descriptions & Competencies of District Education Office, KPK https://kpese.gov.pk/job-descriptions/; Job Descriptions of the offices of the Directorate of Education (Schools) Balochistan No. PPIU/8–70/(A & F)/2023.
References
- 1.Baxter J. In: School Inspectors: Policy Implementers, Policy Shapers in National Policy Contexts. Baxter J., editor. Springer International Publishing; Cham: 2017. School inspectors as policy implementers: influences and activities; pp. 1–23. [Google Scholar]
- 2.O'Brien S., McNamara G., O'Hara J., Brown M., Skerritt C. Teacher leadership in school self-evaluation: an approach to professional development. Ir. Educ. Stud. 2022:1–16. doi: 10.1080/03323315.2022.2135568. [DOI] [Google Scholar]
- 3.Barry G., Walsh C., Ó Gallchóir C., Mannix-McNamara P. School self-evaluation and empowering leadership in DEIS schools: an exploration of success. Ir. Educ. Stud. 2023:1–18. doi: 10.1080/03323315.2022.2135569. [DOI] [Google Scholar]
- 4.Davies D., Rudd P. LGA Educational Research Programme & National Foundation for Educational Research (NFER); 2001. Evaluating School Self-Evaluation. [Google Scholar]
- 5.MacBeath J., Boyd B., Rand J., Bell S. Schools speak for themselves. Education Review-London. 1995;9:14–21. [Google Scholar]
- 6.MacBeath J. A new relationship with schools: inspection and self-evaluation. InFORM September. 2005;5:1–7. [Google Scholar]
- 7.McNamara G., Skerritt C., O'Hara J., O'Brien S., Brown M. For improvement, accountability, or the economy? Reflecting on the purpose (s) of school self-evaluation in Ireland. J. Educ. Adm. Hist. 2022;54(2):158–173. doi: 10.1080/00220620.2021.1985439. [DOI] [Google Scholar]
- 8.Wilcox B. UNESCO, International Institute for Educational Planning; 2000. Making School Inspection Visits More Effective: the English Experience. [Google Scholar]
- 9.Walsh T. In: Essays in the History of Irish Education. Walsh B., editor. Palgrave Macmillan; London: 2016. The national system of education, 1831–2000; pp. 7–43. [DOI] [Google Scholar]
- 10.Lindgren J., Rönnberg L. In: School Inspectors: Policy Implementers, Policy Shapers in National Policy Contexts. Baxter J., editor. Springer International Publishing; Cham: 2017. Knowing inspectors' knowledge: forms and transformations; pp. 159–181. [Google Scholar]
- 11.Hossain M. UNESCO; Paris: 2017. School Inspection Challenges: Evidence from Six Countries: Paper Commissioned for the 2017/18 Global Education Monitoring Report, Accountability in Education: Meeting Our Commitments.https://unesdoc.unesco.org/ark:/48223/pf0000259568 [Google Scholar]
- 12.Moreton H.J., Boylan M., Simkins T. In: School Inspectors: Policy Implementers, Policy Shapers in National Policy Contexts. Baxter J., editor. Springer International Publishing; Cham: 2017. Headteachers who also inspect: practitioner inspectors in England; pp. 137–158. [Google Scholar]
- 13.Penninckx M., Vanhoof J. In: School Inspectors: Policy Implementers, Policy Shapers in National Policy Contexts. Baxter J., editor. Springer International Publishing; Cham: 2017. What stated aims should school inspection pursue?—views of inspectors, policy-makers and practitioners; pp. 231–257. [Google Scholar]
- 14.Altrichter H. In: School Inspectors: Policy Implementers, Policy Shapers in National Policy Contexts. Baxter J., editor. Springer International Publishing; Cham: 2017. The short flourishing of an inspection system; pp. 207–230. [Google Scholar]
- 15.Perryman J. Inspection and the fabrication of professional and performative processes. J. Educ. Pol. 2009;24(5):611–631. doi: 10.1080/02680930903125129. [DOI] [Google Scholar]
- 16.Brady A.M. The regime of self-evaluation: self-conception for teachers and schools. Br. J. Educ. Stud. 2016;64(4):523–541. doi: 10.1080/00071005.2016.1164829. [DOI] [Google Scholar]
- 17.McNamara G., O'Hara J. The importance of the concept of self-evaluation in the changing landscape of education policy. Stud. Educ. Eval. 2008;34:173–179. doi: 10.1016/j.stueduc.2008.08.001. [DOI] [Google Scholar]
- 18.Knowledge and Human Development Authority KHDA's open data - dubai's private schools open data. 2021. https://www.khda.gov.ae/en/opendata
- 19.OECD . OECD Publishing; 2021. OECD Review of Wellbeing Policies and Practices in Dubai's Private School Sector.https://www.oecd-ilibrary.org/sites/c04eb3fb-en/index.html?itemId=/content/component/c04eb3fb-en [Google Scholar]
- 20.AlKutich M., Abukari A. Examining the benefit of school inspection on teaching and learning: a case study of Dubai private schools. J. Educ. Pract. 2018;9(5) [Google Scholar]
- 21.Gardezi S., McNamara G., Brown M., O’ Hara J. Auge y convergencia de los sistemas de inspección escolares: Análisis comparativo de la inspección escolar en Dubái, Irlanda, Nueva Zelanda y Pakistán. Supervisión 21: revista de educación e inspección. 2023;68(68) doi: 10.52149/sp21. [DOI] [Google Scholar]
- 22.Timperley H.S. The New Zealand educational context: evaluation and self-review in a self-managing system, A developmental and negotiated approach to school self-evaluation. Adv. Progr. Eval. 2013;14:23–38. doi: 10.1108/S1474-7863(2013)0000014020. [DOI] [Google Scholar]
- 23.Simkins T., Sisum C., Memon M. School leadership in Pakistan: exploring the headteacher's role. Sch. Effect. Sch. Improv. 2003;14(3):275–291. doi: 10.1076/sesi.14.3.275.15841. [DOI] [Google Scholar]
- 24.Zafar F. Ministry of Education, Decentralization Unit-EFA Wing, Government of Pakistan; 2003. Fiscal Devolution in Education: Case Study-Reflecting Initial Response. [Google Scholar]
- 25.Behlol M.G., Parveen Q. Quality of education and supervisory practices in Pakistani schools. Asian Journal of Education and e-Learning. 2013;1(5) [Google Scholar]
- 26.Gul R., Khilji G. Exploring the need for a responsive school curriculum to cope with the Covid-19 pandemic in Pakistan. Prospects. 2021;51:503–522. doi: 10.1007/s11125-020-09540-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Razzaq J., Forde C. The management of large-scale change in Pakistani education. Sch. Leader. Manag. 2014;34(3):299–316. doi: 10.1080/13632434.2014.905467. [DOI] [Google Scholar]
- 28.Gardezi S., McNamara G., Brown M., O'Hara J. School inspections: a rhetoric of quality or reality? Frontiers in Education. 2023;8 doi: 10.3389/feduc.2023.1204642. [DOI] [Google Scholar]
- 29.Schildkamp K., Vissher A. The utilisation of school self-evaluation instrument. Educ. Stud. 2010;36:371–389. doi: 10.1080/03055690903424741. [DOI] [Google Scholar]
- 30.Meuret D., Morlaix S. Conditions of success of a school self-evaluation some lessons of a European experience. Sch. Effect. Sch. Improv. 2003;14(1):53–71. doi: 10.1076/j.stueduc.2008.08.001. [DOI] [Google Scholar]
- 31.Mutch C. Complementary Evaluation: the development of conceptual framework to integrate external and internal evaluation in the New Zealand school context. Pol. Futures Educ. Internet. 2012;10(5):569–586. doi: 10.2304/pfie.2012.10.5.569. [DOI] [Google Scholar]
- 32.MacBeath J. University of Cambridge; Cambridge: 2012. Future of Teaching Profession. [Google Scholar]
- 33.Brown M., McNamara G., O’ Hara J., O'Brien S. Exploring the changing face of school inspections. Eur. J. Educ. Res. 2016;16(66):1–26. doi: 10.14689/ejer.2016.66.1. [DOI] [Google Scholar]
- 34.Brown M., McNamara G., O'Hara J., O'Brien S. In: School Inspectors: Operational Challenges in National Policy Contexts. Baxter J., editor. Springer International Publishing; Cham: 2017. Inspectors and the process of self-evaluation in Ireland; pp. 71–96. doi: 10.1177/1478210316656506. [Google Scholar]
- 35.Brown M., McNamara G., O’ Hara J., O'Brien S., Young C., Faddar J. Integrated co-professional evaluation?: converging approaches to school evaluation across frontiers. Australian Journal of Teacher Education (Online) 2018;43(12):76–90. doi: 10.14221/ajte.2018v43n12.6. [DOI] [Google Scholar]
- 36.Simons H. In: Lei M., Kushner S., editors. vol. 14. Emerald Group Publishing Limited; 2013. Enhancing the quality of education through school self-evaluation; pp. 14–27. (A Developmental and Negotiated Approach to School Self-Evaluation). [Google Scholar]
- 37.Altrichter H., Kemethofer D. Does accountability pressure through school inspections promote school improvement? Sch. Effect. Sch. Improv. 2015;26(1):32–56. doi: 10.1080/09243453.2014.927369. [DOI] [Google Scholar]
- 38.Jones K.L., Tymms P., Kemethofer D., O'Hara J., McNamara G., Huber S., Myrberg E., Skedsmo G., Greger D. The unintended consequences of school inspection: the prevalence of inspection side-effects in Austria, the Czech Republic, England, Ireland, The Netherlands, Sweden, and Switzerland. Oxf. Rev. Educ. 2017;43(6):805–822. doi: 10.1080/03054985.2017.1352499. [DOI] [Google Scholar]
- 39.Ryan K.E., Gandha T., Ahn J. School self-evaluation and inspection for improving US schools? Boulder, CO: National Education Policy Centre (NEPC) 2013 https://nepc.colorado.edu/publication/school-self-evaluation [Google Scholar]
- 40.Earley P. In: Improvement through Inspection?: Complementary Approaches to School Development. Earley P., Fidler B., Ouston J., editors. Routledge; New York: 2017. School improvement and OFSTED inspection: the research evidence; pp. 11–22. [Google Scholar]
- 41.Stoll L., Kools M. The school as a learning organisation: a review revisiting and extending a timely concept. Journal of Professional Capital and Community. 2017;2(1):2–17. http://www.emeraldinsight.com/2056-9548.htm [Google Scholar]
- 42.Taguma M., Gabriel F., Lim M.H. OECD Publishing; 2018. OECD Future of Education and Skills 2030: Curriculum Analysis. Connections between Anticipation-Action-Reflection and Continuous Improvement Cycles. https://www.oecd.org/education/2030/Connections-between-Anticipation-Action-Reflection-and-Continuous-Improvement-Cycles.pdf [Google Scholar]
- 43.Wenger E. Cambridge University Press; 1998. Communities of Practice: Learning, Meaning, and Identity. [DOI] [Google Scholar]
- 44.Wenger-Trayner E., Wenger-Trayner B. Introduction to communities of practice: a brief overview of the concept and its uses. 2015. https://www.wenger-trayner.com/introduction-to-communities-of-practice/
- 45.Brown M., Gardezi S., Cinqir S., Figueiredo M., Faddar J., Kurum G., Vanhoof J., McNamara G., O'Hara J., Ramalho H., Rocha, Skerritt C., O'Brien S. Introduction to distributed evaluation and planning in European schools. In: A Practitioner Toolkit for the Inclusion of Parents and Students in School Self Evaluation and Planning. EQI; Dublin: 2020. [Google Scholar]
- 46.Brown M., Gardezi S., del Castillo Blanco L., Simeonova R., Parvanova Y., McNamara G., O'Hara J., Kechri Z. School self-evaluation an international or country specific imperative for school improvement? International Journal of Educational Research Open. 2021;2 doi: 10.1016/j.ijedro.2021.100063. [DOI] [Google Scholar]
- 47.Kyriakides L., Campbell R.J. School self-evaluation and school improvement: a critique of values and procedures. Stud. Educ. Eval. 2004;30(1):23–36. doi: 10.1016/S0191-491X(04)90002-8. [DOI] [Google Scholar]
- 48.Vanhoof J., Petegem P.V. Matching internal and external evaluation in an era of accountability and school development: lessons from a Flemish perspective. Stud. Educ. Eval. 2007;33(2):101–119. doi: 10.1016/j.stueduc.2007.04.001. [DOI] [Google Scholar]
- 49.Boyle R., O'Hara J., McNamara G., Brown M. In: The Institutionalisation of Evaluation in Europe. Stockmann R., Meyer W., Taube L., editors. Palgrave Macmillan; 2020. Ireland; pp. 227–248. [Google Scholar]
- 50.McNamara G., O'Hara J., Brown M., Quinn I. Quality assurance in Irish schools: inspection and self-evaluation. Administration. 2020;68(4):161–180. doi: 10.2478/admin-2020-0029. [DOI] [Google Scholar]
- 51.Ehren M.C., Eddy-Spicer D., Bangpan M., Reid A. School inspections in low-and middle-income countries: explaining impact and mechanisms of impact. Compare. 2017;47(4):468–482. doi: 10.1080/03057925.2016.1239188. [DOI] [Google Scholar]
- 52.Gustafsson J. Single case studies vs. multiple case studies: a comparative study. 2017. https://www.diva-portal.org/smash/get/diva2:1064378/FULLTEXT01.pdf
- 53.Scott J.P. Sage Publications; Thousand Oaks, CA: 2006. Documentary Research. [Google Scholar]
- 54.Patton M.Q. 3rd ed. Sage Publications; Thousand Oaks, CA: 2002. Qualitative Research and Evaluation Methods. [Google Scholar]
- 55.Mwita K. Factors influencing data saturation in qualitative studies. International Journal of Research in Business and Social Science. 2022;11(4):414–420. doi: 10.20525/ijrbs.v11i4.1776. 2147-4478. [DOI] [Google Scholar]
- 56.Fusch P.I., Ness L.R. Are we there yet? Data saturation in qualitative research. Qual. Rep. 2015;20(9):1408–1416. http://nsuworks.nova.edu/tqr/vol20/iss9/3 [Google Scholar]
- 57.Clarke V., Braun V. Thematic analysis. J. Posit. Psychol. 2017;12(3):297–298. doi: 10.1080/17439760.2016.1262613. [DOI] [Google Scholar]
- 58.Dubai School Inspection Bureau . Dubai School Inspection Bureau; Dubai: 2008. Initial Quality Inspections Handbook. [Google Scholar]
- 59.Dubai School Inspection Bureau . Dubai School Inspection Bureau; Dubai: 2009. Inspection Handbook 2009-2010. [Google Scholar]
- 60.Dubai School Inspection Bureau . Dubai School Inspection Bureau; Dubai: 2010. Inspection Handbook 2010 – 2011. [Google Scholar]
- 61.Dubai School Inspection Bureau . Dubai School Inspection Bureau; Dubai: 2011. Inspection Handbook 2011 – 2012. [Google Scholar]
- 62.Knowledge and Human Development Authority. Dubai schools told to evaluate themselves. 2011. https://web.khda.gov.ae/en/About-Us/News/2011/Dubai-schools-told-to-evaluate-themselves
- 63.Thacker S., Cuadra E. vol. 91884. The World Bank; Washington, DC: 2014. pp. 1–64. (The Road Travelled: Dubai's Journey towards Improving Private Education-A World Bank Review). [Google Scholar]
- 64.Education development trust, setting up the Dubai schools inspection Bureau - education development trust. 2020. https://www.educationdevelopmenttrust.com/our-research-and-insights/case-studies/setting-up-the-dubai-schools-inspection-bureau
- 65.Knowledge and Human Development Authority. United Arab Emirates School Inspection Framework 2015-2016. KHDA; Dubai: 2015. [Google Scholar]
- 66.Ei Saadi D.H. The Contribution of the UAE School Inspection Framework as a Quality Assurance Tool for School Transformation and Performance Improvement. Master's Dissertation, The British University in Dubai; 2017. [Google Scholar]
- 67.McNamara G., Brown M., Gardezi S., O'Hara J., O'Brien S., Skerritt C. Embedding self-evaluation in school routines. Sage Open. 2021;11(4):1–10. doi: 10.1177/21582440211052552. [DOI] [Google Scholar]
- 68.Hislop H. The quality assurance of Irish schools and the role of evaluation: current and future trends. The Professor Seamas Ó Súilleabháin Memorial Lecture; 2012. Accessed November 10, 2023. https://assets.gov.ie/25310/68cf8ab4f10142b6a6dd3c7c8a9c3f03.pdf
- 69.Department of Education Looking at our school: an aid to self-evaluation in second level schools. 2003. http://www.tui.ie/_fileupload/looking_at_our_school_2nd_level_eng.pdf
- 70.McNamara G., O'Hara J. From looking at our schools (Laos) to whole school evaluation-management, leadership and learning (WSE-MLL): the evolution of inspection in Irish schools over the past decade. Educ. Assess. Eval. Account. 2012;24:79–97. doi: 10.1007/s11092-012-9143-9. [DOI] [Google Scholar]
- 71.Department of Education Chief inspector's report: 2010–2012. 2013. http://www.education.ie/en/Publications/Inspection-ReportsPublications/Evaluation-Reports-Guidelines/Chief-Inspector%E2%80%99s-Report-2010-2012-Main-Report.pdf [Google Scholar]
- 72.Department of Education, Self-evaluation: Next Steps September 2022 – June 2026, (2022). Accessed October 12, 2023, https://assets.gov.ie/232734/3e6ca885-96ec-45a6-9a08-3e810b7cd1ea.pdf.
- 73.Department of Education Chief inspector's report September 2016 – December 2020. 2022. https://assets.gov.ie/219404/0b911002-2f49-4f07-9e66-644aa5e28101.pdf
- 74.Department of Education Looking at our school: a quality framework for primary and special schools. 2022. https://assets.gov.ie/232720/c8357d7a-dd03-416b-83dc-9847b99b025f.pdf
- 75.Department of Education Looking at our school: a quality framework for post-primary schools. 2022. https://assets.gov.ie/232730/4afcbe10-7c78-4b49-a36d-e0349a9f8fb7.pdf
- 76.Earl L. Effective School Review. Considerations in the framing, definition, identification and selection of indicators of education quality and their potential use in evaluation in the school setting. Education Review Office. 2014 [Google Scholar]
- 77.Education Review Office . Author; Wellington: 2011. Evaluation Indicators for School Reviews. [Google Scholar]
- 78.Timperley H. Promoting and supporting improvement in schools through external review. Education Review Office. 2014 https://ero.govt.nz/how-ero-reviews/schoolskura-english-medium/school-evaluation-indicators/promoting-and-supporting-improvement-in-schools-through-external-review [Google Scholar]
- 79.Goodrick D. New school methodology approach within ERO. Education Review Office. 2020 https://ero.govt.nz/sites/default/files/2021-04/Goodrick%20Approach%20and%20Methodology%20for%20the%20External%20Evaluation%20of%20the%20new%20School%20Evaluation%20for%20Improvement%20Approach.pdf [Google Scholar]
- 80.Education Review Office Principles of practice: education evaluation for improvement in schools and early childhood services. 2021. https://ero.govt.nz/sites/default/files/2021-11/Principles%20of%20Practicev8.pdf
- 81.Education Review Office Te Ara Huarau | School improvement framework. 2022. https://ero.govt.nz/sites/default/files/2022-06/School%20Improvement%20Framework%20March%202022%20Education%20Review%20Office.pdf
- 82.Goodrick D. Schools: evaluation for improvement approach: implementation case studies. Education Review Office. 2022 https://ero.govt.nz/sites/default/files/2022-04/Goodrick%20report%20-%20Case%20Studies%20-%20Schools%20Evaluation%20for%20Improvement%20-%20Synthesis%20and%20implications%20-%2027Apr2022_0.pdf [Google Scholar]
- 83.Education Review Office . 2020. Strategic intentions 2020-2024.https://ero.govt.nz/sites/default/files/2021-04/Strategic%20Intentions%202020-2024.pdf [Google Scholar]
- 84.May S., Jang-Jones A., McGregor A. Ministry of Education; 2019. PISA 2018 New Zealand Summary Report: System Performance & Equity. https://www.educationcounts.govt.nz/__data/assets/pdf_file/0006/196629/PISA-2018-NZ-Summary-Report.pdf Accessed August 2023. [Google Scholar]
- 85.Directorate of Staff Development Head teachers guide for primary and elementary schools. 2021. https://aeoskp.com/wp-content/uploads/2021/03/HEAD-TEACHER-GUIDE.pdf
- 86.Qureshi W.A., us-Samad Assad, Mahmood T. Impact of school monitoring on quality education in public sector: empirical evidence from Pakistan. Journal of Positive School Psychology. 2023;7(6):301–309. http://journalppw.com/ [Google Scholar]
- 87.Javed M.L., Inayat M., Javed M.N. Monitoring and evaluation system in education: an overview of elementary schools. Review of Applied Management and Social Sciences. 2021;4(4):837–847. doi: 10.47067/ramss.v4i4.187. [DOI] [Google Scholar]
- 88.Nadeem H.A., Saadi A.M. An analysis of monitoring and evaluation system launched by Punjab education department. Journal of Educational Leadership and Management. 2019;1(1):17–32. [Google Scholar]
- 89.Nawaz K., Hussain L., Khan A.N. Effectiveness of the independent monitoring unit at secondary school level in southern districts of khyber Pakhtunkhwa, Pakistan. Dialogue. 2019;14(3) [Google Scholar]
- 90.Saleem S., Naureen S. A study to find out the effectiveness of monitoring system in government middle schools, chiltan town, quetta. Balochistan Review. 2017;XXXVIII(2) [Google Scholar]
- 91.Yang H.S., Kim B., Ullah I. Teachers' monitoring and schools' performance: evidence from public schools in Pakistan. KDI School of Public Policy & Management Paper No. 20-02; 2020.
- 92.Jan F., Iqbal M. An analysis of supervisory system for government boys high schools of khyber Pakhtunkhwa, Pakistan. Dialogue. 2018;13(4) [Google Scholar]
- 93.Noor T., Nawab A. Are school leaders working as instructional leaders? Exploration of school leadership practices in rural Pakistan. Cogent Education. 2022;9(1) doi: 10.1080/2331186X.2022.2109315. [DOI] [Google Scholar]
- 94.O'Brien S., McNamara G., O'Hara J., Brown M. Irish teachers, starting on a journey of data use for school self-evaluation. Stud. Educ. Eval. 2019;60:1–13. doi: 10.1016/j.stueduc.2018.11.001. [DOI] [Google Scholar]
- 95.Nusche D., Laveault D., MacBeath J., Santiago P. OECD Publishing; 2012. OECD Reviews of Evaluation and Assessment in Education: New Zealand 2011. [DOI] [Google Scholar]
