Abstract
It is timely and necessary to consider what Postgraduate Medical Training Programme outcomes are, how they are defined and revised over time, and how they can be used to align health professional performance with the healthcare needs of society. This article, which addresses those issues with specific reference to training in anaesthesiology, was prepared using a modified nominal group (or expert panel) approach.
i. The meaning and value of TP outcomes
The value of training programme (TP) outcomes derives exclusively from predefined purposes related to doctors’ professional performance(s) and wellbeing, and resultant benefits to the health of patients, populations and society. The training “system” that achieves these outcomes must be sustainable and adaptive. Viewed in this way, TP outcomes should provide a powerful and measurable means by which the training effects achieved are evaluated and matched to the changing health needs of society. We endorse the adoption of a systems-based approach across health and education, as proposed by the Lancet Commission,1 within which TP outcomes serve as critical components of a logic model such as that described by Van Melle et al.2
ii. What constitutes legitimate Training Programme outcomes?
Training Programme outcomes relate not just to the capability of individual trainees or graduates to provide high quality healthcare, but also to the likelihood that they will actually do so over the span of a professional career. The description and definition of TP outcomes should take account of multiple dimensions including inter-individual, inter-cohort and inter-institutional variations, as well as change in measures over time. The extent to which graduates of a TP collectively meet the medical workforce requirements of a population also constitutes a legitimate outcome. TP outcomes can be categorised as (i) patient-focused, (ii) trainee-focused or (iii) programme-focused. We acknowledge the high degree of interdependence between these categories of outcome, and overlap in the activities by which they are achieved. Viewed in this way, we suggest that TP outcomes can serve as critical control points in a systems-based approach to optimising healthcare. For instance, for accreditation purposes, the measurement of TP outcomes could be referenced to open access national/international databases for continuous quality improvement of the programme.
iii. Principles governing the achievement of Training Programme outcomes
In considering implementation science principles, we propose the following with respect to the achievement and utility of TP outcomes:
1) Data access: The consistent achievement of pre-determined outcomes will depend on accessing data streams from the programme and the clinical sites (pre-requisite).
2) Evidence base: The aggregate of data from clinical settings and training environment must be provided to stakeholders in a timely way at the correct level. These data can critically inform and influence decision-making. Progress and challenges should be identified at programme director level.
3) Context: The manner in which specific TP outcomes are achieved needs to take account of context-specific resources and local health needs.
4) Addressing barriers: The data generated locally need to be examined for unintended consequences. Barriers or perceived barriers to the implementation of programme changes relating to specific programme outcomes need to be identified across the system, including investment, faculty development, structures and resource access.
iv. Stakeholder collaboration in the development and renewal of Training Programme outcomes
Stakeholder collaboration is vital for the development and validation of TP outcomes and to optimise systems thinking. The collaboration should be based on a shared vision for societal benefit through improved healthcare and a shared set of values. This will require meaningful input from some traditionally voiceless constituencies, including patients. Within a collaborative governance structure, the stakeholder network should define patient-focused, trainee-focused and programme-focused outcomes, in particular ensuring that the needs of vulnerable populations are addressed. Data and information applied using a logic model should drive a co-production approach to inform decisions, which in turn improve instructional practice and institutional performance. The measurement and utility of achieving TP outcomes are problematic. The lack of evidence to clearly link specific clinical performance metrics to improved patient outcomes remains a concern. However, the inclusive approach we present here to optimising stakeholder collaboration to inform the training of doctors may be simply the ontologically right thing to do, consistent with modern values, public accountability, and future wellbeing. We suggest that this is an approach whose real value may not be measurable today in terms of traditional clinical outcomes, but is justifiable on rational, scientific and ethical grounds.
v. Priority areas in which new knowledge is required to optimise the value of Training Programme outcomes
In the future, the definition, consistent achievement and utility of TP outcomes will require new knowledge in areas such as:
i. methodologies for measurement and evaluation of technical and non-technical skills;
ii. the nature of skill attrition over time;
iii. the science of organisational change, in particular as applied to healthcare and education;
iv. the incorporation of systems thinking into the design and operation of TPs;
v. the development of a logic model for postgraduate medical training which functions across healthcare and education;
vi. data-driven instructional design and assessment of the clinical impact of training. This could entail the application of national or international datasets to enhance achievement of TP outcomes (for instance through learning analytics or educational data mining).
KEY POINTS
This statement endorses the adoption of a systems-based approach across health and education, as proposed by the Lancet Commission,1 within which Training Programme outcomes serve as critical components of a logic model.
Training Programme outcomes relate not just to the capability of individual trainees or graduates to provide high quality healthcare, but also to the likelihood that they will actually do so over the span of a professional career.
Training Programme outcomes can be categorised as (i) patient-focused, (ii) trainee-focused or (iii) programme-focused.
In considering implementation science principles, we propose that the achievement of TP outcomes requires appropriate access to data, evidence-based decision making, consideration of context and management of barriers.
Stakeholder collaboration is vital for the development and validation of TP outcomes and to optimise systems thinking.
Context
If the ultimate purpose of medical education and training is to ensure the consistent and sustained delivery of safe, high quality healthcare to patients and populations,3–5 then there is much room for improvement. Since the publication of the US Institute of Medicine report (2000) “To Err is Human”,5 the available information indicates that unsafe practice and preventable harm are widespread.6 Although improvements in avoidable mortality have been achieved since then (at least in high income countries), the magnitude of the decrease varies greatly and is not necessarily related to healthcare spending.7,8 In 2010, the Lancet Commission on Education of Health Professionals for the 21st century stated that “... a slow-burning crisis is emerging in the mismatch of professional competencies to patient and population priorities because of fragmentary, outdated, and static curricula producing ill-equipped graduates from underfinanced institutions”.1 This view is supported by credible evidence which indicates (i) that great variation (some of which is unwarranted) exists in physician performance9 and (ii) that medical errors occur commonly and persistently.10,11 Holmboe and Kogan have pointed out the need to identify the correlates of unwarranted variation in clinical care in medical education, and the need to address the interdependency of unwarranted variation between clinical and educational practices.9 In this article, we propose that a significant proportion of the patient harm related to physician performance should be amenable to correction through improvements in training, and that training programme outcomes have a critical role to play in that process.
The legitimate need to base medical training on the safety and health needs of society has resulted in regulators requiring that outcomes be defined. For instance, in its 2017 guidance, Excellence by Design, the UK General Medical Council12 specified that a TP description must contain a purpose statement that: “Specif(ies) the high level outcomes so it is clear what capabilities must be demonstrated, and to what level, to complete training”.
Increasingly, it has become feasible to capture clinical outcome data and individual or team practice/performance-related data, thereby creating the potential to link one with the other. Bandiera et al. have proposed a framework of learning outcomes, patient outcomes and health service outcomes for the purpose of accrediting TPs.13 It is also clear that TP outcomes can offer a utility other than as a means of justifying categorical decisions such as programme accreditation.14 The extent to which outcomes are achieved can and should also serve as critical input to the continuous quality improvement of programmes.
Methodology
Given the limited evidence linking medical training with educational or clinical outcomes,15–17 it was decided to adopt a modification of the nominal group (or expert panel) approach to arrive at a consensus.18 The members of the expert group were identified based on their expertise in postgraduate medical training (in particular in anaesthesiology) and accreditation, and/or by requesting nominations from learned societies or training bodies [specifically the European Society of Anaesthesiology and Intensive Care, the European Board of Anaesthesiology (European Union Medical Specialties), the Royal College of Physicians and Surgeons of Canada, and the College of Anaesthesiologists of Ireland]. Two preliminary meetings were held and resulted in agreement on the topics to be addressed and the structure of the Consensus Paper. A series of six formal structured meetings (from 15 April 2022 to 28 February 2023) was held to achieve consensus on the key observations, recommendations and final text. All individuals contributed iteratively to the drafting of specific sections, and latterly to the drafting of the final manuscript.
The meaning and value of postgraduate medical training programme outcomes
The value of TP outcomes derives exclusively from pre-defined purposes related to doctors’ professional performance(s) and wellbeing, and resultant benefits to the health of patients, populations and society. The training “system” that achieves these outcomes must be sustainable and adaptive. Viewed in this way, TP outcomes should provide a powerful and measurable means by which the training effects achieved are evaluated and matched to the changing health needs of society. We endorse the adoption of a systems-based approach across health and education, as proposed by the Lancet Commission,1 within which TP outcomes serve as critical components of a logic model such as that described by Van Melle et al.2
Training doctors as means to an end
For the purposes of this article, we propose that TP outcomes relate not just to the capability of individual graduates to provide high quality healthcare, but also to the likelihood that they will actually do so over the span of a professional career. In this, we acknowledge the useful taxonomy of Competency-based Medical Education outcomes proposed by Hall et al. (focus, level and timeline).19 Furthermore, we propose that TP outcomes also legitimately include the extent to which they collectively meet the medical workforce requirements of a population. Therefore, we propose that TP outcomes be categorised as (i) patient-focused, (ii) trainee-focused or (iii) programme-focused. (These are detailed further in Section 2.)
A systems-based approach
In expressing the failure of professional education to keep pace with the challenges of 21st century healthcare, the Lancet Commission1 envisaged the next generation of healthcare education reform as systems-based: competency-driven and requiring an understanding of the complex interactions between the education and health systems. In the resulting framework, people (or the population) were the drivers of these systems.
We agree strongly with the need for such a systems-based approach. Furthermore, we propose that its implementation will require application of the relatively new discipline of health systems science, which has been defined as “... a foundational platform and framework for the study and understanding of how care is delivered, how health professionals work together to deliver that care, and how the health system can improve patient care and healthcare delivery”.20 Within that approach, we view TP outcomes as key components.
Ideally, TP outcomes will derive from healthcare policy and inform the design and implementation of the programme (Panel 1). In such a systems-based approach, each TP outcome must be justifiable, either rationally or based on empirical evidence, as contributing to improved healthcare. Once defined and justified, the outcomes will be the primary determinants of programme design and operation.
Panel 1. Training programme outcomes as key components of a systems based approach – a hypothetical example
Convincing evidence exists to indicate that structured multi-disciplinary management of elderly patients with fractured neck of femur results in better clinical outcomes.
Guidelines are developed by a suitably qualified group of experts who carry out systematic reviews of the best available evidence and when limited evidence is available, make recommendations based on the group's experience and opinion.
Care is taken to express the evidence levels which underlie each recommendation.
These guidelines are well disseminated and made freely available (i.e. global resource available for application in different local settings).
In time, they and other relevant guidelines inform (national/regional/system-wide) healthcare policy which specifically identifies the need to improve interprofessional teamwork in the delivery of perioperative care to elderly patients.
As a result, training bodies (TB) across several disciplines including anaesthesiology alter their TP outcomes to include a specific competency on contributing to interprofessional team performance. Note that the TB response includes both institutional (affiliation and partnership across disciplines) and instructional (new explicit competencies defined) elements, demonstrating adaptivity of both structure and process.
Implementation of the policy requires a certain, quantified resource including a budget allocation.
New TP outcomes (in this example: educational, patient focused outcomes) are specified in terms of interprofessional performance.
Instructional adjustments in and across programmes are required in training activities, format and assessment.
Ongoing evaluation of the impact of the changes (in educational activity, clinical performance and patient outcome) determine whether (i) the stated improvements have been achieved and (ii) what further adjustments are necessary.
Although simplistic, this example provides an illustration of how a systems-based approach would work. The health and education systems interact with the specific shared intention of meeting the healthcare needs of a population (e.g. more elderly hip fracture patients returning to their homes at their pre-injury functional status). The need and best care are identified using current, high quality evidence if it is available, or expert opinion if it is not. Those examining and interpreting the data and offering opinion are experts in the relevant disciplines. The manner in which the underlying evidence and rationale for the change is made available to decision makers at different levels is based on implementation science principles. Training bodies (with combined clinical and educational expertise) provide improved education for interdisciplinary team performance. A regulator oversees the TB's intervention (curricular adjustment, format, activity, and assessment). Data streams which capture training activity, clinical performance and clinical outcome are concurrently available in comprehensible form to all stakeholders.
This idealised example depicts four key elements of how such a system should function:
i. The shared objective is an explicit and meaningful health gain.
ii. Decisions are made based on appropriate expert input and evidence presented at the appropriate level, format, and time.
iii. Interaction between the health and education systems is seamless (co-development principles apply).
iv. The overall system is data-driven and adaptive.
The example might also be viewed as a functioning logic model (Van Melle et al.2) with an (i) overall purpose (improved clinical outcome), (ii) inputs (expertise, evidence, budget), (iii) activities (interdisciplinary training sessions, tailored experience, specific assessments), (iv) outputs (workplace-based assessments, TB self-evaluation report) and (v) outcomes (clinical performance metrics, clinical outcomes). The success of the applied logic model in achieving its purposes will depend on feedback loops. These form the basis of the system's adaptivity. In such a model, it is notable that the TP outcome serves a critical function: it is the means by which the overall purpose is interpreted and translated into activity and effect. The system will not function without a clearly defined and attainable TP outcome.
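The components of such a logic model lend themselves to a simple computational representation. The following is a purely illustrative sketch (all names, metrics and target values are hypothetical, not drawn from any real programme) of how measured outcomes might be compared against stated targets to generate the feedback signals on which system adaptivity depends:

```python
# Hypothetical sketch of a logic model as a data structure.
# The metric names and values below are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    purpose: str
    inputs: list
    activities: list
    outputs: list
    outcomes: dict = field(default_factory=dict)  # metric -> measured value
    targets: dict = field(default_factory=dict)   # metric -> target value

    def feedback(self):
        """Outcome metrics falling short of their targets: the signals
        that drive adaptation of the training programme."""
        return {m: (v, self.targets[m])
                for m, v in self.outcomes.items()
                if m in self.targets and v < self.targets[m]}

model = LogicModel(
    purpose="Improved clinical outcome after hip fracture",
    inputs=["expertise", "evidence", "budget"],
    activities=["interdisciplinary training sessions", "specific assessments"],
    outputs=["workplace-based assessments", "TB self-evaluation report"],
    outcomes={"return_to_baseline_function_rate": 0.62},
    targets={"return_to_baseline_function_rate": 0.70},
)
shortfalls = model.feedback()
```

Here the feedback loop is reduced to a single dictionary comparison; a real system would of course involve many interacting loops, stakeholders and data streams.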
What the example does not address are the relationships between elements such as training activity, training assessment, clinical performance, skill attrition and clinical outcome. Currently, our understanding of these relationships is limited. An operating logic model might itself provide a means to improve our understanding of these complex relationships. (Eventually, one can envisage a “living lab” in which training data streams and data analytics are applied to the continual improvement of a predictive model for clinical outcome.)
Neither does the example above take account of the fundamental challenges which its implementation would face in terms of organisational change and achievement of genuine co-development structures. By presenting an idealised example, we pose the question: given the premise that improvement is required, is there a better alternative?
Training Programme outcomes should be both sensitive and adaptive to inputs (such as an aging population or advances in gene editing) and also determinants or effectors of outputs, and ultimately of outcomes (societal health). Of the many methodologies covered by the term “systems-based approach”, we see as most suitable the adoption of a logic-based model (such as that described by Van Melle et al.2).
Figure 1 from Van Melle2 with permission of the copyright holder.
Fig. 1.
Schematic of a logic model (reproduced with permission).
In general, a logic model will display inputs, activities, outputs and outcomes. Training Programme outcomes could serve as critical nodes within the model. Ideally, the outcomes are clearly defined and measurable, and the relationships between the other elements are understood and amenable to intervention.
For simplicity, the interactions between key components of a logic model are depicted as linear. However, effective real-world application of a logic model requires that complex interactions and feedback loops are taken into account. The application of a logic model to the training of doctors will require that the significant challenges of complexity, data access, attribution and measurability be addressed. It would be necessary to adopt a shared language or taxonomy across institutions and jurisdictions (such as that proposed by Hall et al.19). A “core model” could be established and customised for the many important “local” factors which influence outcome. The operating model would necessarily change over time in response to quality improvement efforts, new insights, and external factors. The function of the TP overall and the model components could be displayed using a dashboard or other visualisation technique.
A TP outcome is not an event or fixed categorical achievement, but rather an “indicator of current status”. At the level of an individual trainee, its meaning is dependent on a measure (e.g. of a particular competence at a point in time), in relation to other such concurrent measures (i.e. of other competencies) and trajectory (change of equivalent measures over time). At the level of the TP, application of aggregated outcome measures across time and cohort could identify systematic influences which favour or impede attainment (e.g. the order in which certain modules are taken, mentorship, type and frequency of casemix experienced).
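To make the idea of an “indicator of current status” concrete, the sketch below (with invented assessment scores) combines a trainee's latest measure for one competency with its trajectory, computed here as a least-squares slope over repeated assessments. The scoring scale and data are hypothetical:

```python
# Hypothetical sketch: current status of one competency = latest score plus
# trajectory (change of equivalent measures over time). Scores are invented.

def trajectory(scores):
    """Least-squares slope of scores against assessment index 0, 1, 2, ..."""
    n = len(scores)
    mean_x = (n - 1) / 2                   # mean of indices 0..n-1
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

# Four successive assessment scores (0-100 scale) for one competency
scores = [55.0, 60.0, 70.0, 75.0]
status = {"current": scores[-1], "trajectory": trajectory(scores)}
```

The same per-trainee summaries could then be aggregated across a cohort to look for the systematic influences on attainment mentioned above (module order, mentorship, casemix).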
What constitutes legitimate training programme outcomes?
Categories and evaluation
In Section 1, we propose that TP outcomes can be categorised as: (i) patient-focused, (ii) trainee-focused or (iii) programme-focused. Below we describe the domains to be addressed in greater detail, with some examples.
Directly patient-focused outcomes
1. The overall purpose or “raison d’être” of the Training Programme expressed in terms of societal benefit (e.g. to educate professionals and a workforce to provide consistently safe anaesthetic care for patients who undergo surgery).
2. Overall TP outcomes expressed in terms of the professional capabilities of a graduating doctor; these will be generic and aligned to the model of the “good doctor” adopted by a particular jurisdiction (e.g. to consistently relate to patients and their relatives in a manner which is open, clear, compassionate and professional). Crucially, these will include learning outcomes derived from health systems science, such as teamwork and value-based care.
3. Outcomes which describe a trainee's aggregate clinical performance over time and across clinical domains and circumstances. These indicate a trainee's development as they progress in training and undertake a widening scope of practice (e.g. to provide safe anaesthetic care to patients undergoing cardiac surgery in urgent and non-urgent settings).
4. “Synthesised” outcomes descriptive of a doctor's (point-in-time) capability and performance in a specific clinical environment and clinical circumstances (e.g. to elicit and apply clinical information about a specific patient to formulate an appropriate plan for anaesthetic care).
5. Discrete measurable outcomes of individual programme elements – task, knowledge, attitudes, skills – as set out in the programme's curriculum (e.g. to insert an epidural catheter in a patient in labour safely and efficiently).21,22
Trainee-focused outcomes
1. Trainees acquire the insights and means to maintain their own wellbeing while working in diverse, complex and stressful healthcare environments, and to support the wellbeing of their colleagues.
2. Trainees acquire an understanding and ability to learn continuously throughout their careers in a way that is self-regulated and adaptive.
Programme-focused outcomes
1. Consistency of satisfactory completion of the TP within cohorts and across cohorts over time.
2. Achievement of pre-defined metrics related to the subsequent professional progression and contribution of a doctor after graduation from the TP.
3. Achievement of pre-defined aggregate metrics on trainee recruitment, withdrawal and progression.
4. Evidence, acquired in co-operation with a regulatory body, of meaningful continuous quality improvement (for instance, based on trainee and faculty feedback, trainee retention, and curricular development).
Within these generic categories and sub-categories, local data, knowledge and experience can be used to define specific TP outcomes according to specific needs in healthcare and education (Panel 2).
Panel 2. Some areas relevant to anaesthesiology in which local education and healthcare needs would inform definition of specific TP outcomes
Specialist/generalist workforce planning vs. programme structure: TP outcomes will differ based on whether a discipline is categorised as a sub-specialty or supra-specialty.
Surgical services provided in remote or isolated settings: TP outcomes will take account of local patient safety practices, policies, and guidelines.
Team-based vs. individual care: TP outcomes will reflect the need for interprofessional and trans-professional training.
Recent and context-specific adverse event data will inform TP outcomes, including mandatory courses and specific proficiency standards for progression.
National/institutional digital transformation of health services: TP outcomes will include competence in interaction with electronic health records and the use of decision support tools and software.
National or regional affirmative action policies, transnational mobility of health professionals: TP outcomes will include reference to specific language-based communication skills such as huddle performance and handover.
TP outcomes should be individually and collectively amenable to evaluation. This requires a multi-faceted approach using traditional tools and parameters (e.g. examination results, trainee attrition rates), measures of trainee performance,23 healthcare quality measures,24 and clinical outcomes.25 Ideally, TP outcomes should be quantifiable. For the purposes of operating a logic model, it would be valuable if a valid and reliable measurement of a doctor's level of clinical performance for a specific task were referenced to a quantitatively defined proficiency standard. Although this quantification of performance level is currently feasible for certain skills,26,27 the practice is not widespread.
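As a minimal illustration of what referencing a performance measurement to a quantitatively defined proficiency standard might look like in practice (the task names, scores and thresholds below are entirely hypothetical, not established standards):

```python
# Hypothetical proficiency standards: minimum acceptable score (0-100 scale)
# per clinical task. All names and thresholds are invented for illustration.
PROFICIENCY_STANDARDS = {
    "epidural_catheter_insertion": 80.0,
    "rapid_sequence_induction": 85.0,
}

def meets_standard(task, score):
    """True if a measured performance score reaches the predefined standard."""
    return score >= PROFICIENCY_STANDARDS[task]

result = meets_standard("epidural_catheter_insertion", 83.0)
```

The hard work, of course, lies not in the comparison but in producing valid, reliable measurements and defensible standards in the first place.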
The evaluation of TP outcomes should inform refinement of the stated outcomes over time, and provide an understanding of how certain outcomes are achieved. Within the system-based approach we envisage, a training body and accrediting body would cooperate in interpreting such evaluation data in order to identify and share good instructional and/or institutional practices. For instance, an increasing number of European countries are adopting the European Diploma of Anaesthesiology and Intensive Care Part 1 examination as a pre-requisite for a trainee to complete the national training programme in anaesthesiology.22
Training programme outcomes and societal benefit – the attribution gap
The relationships between the above categories of outcome are important but have not yet been characterised. Intuitively, one might assume, for instance, that a graduate's degree of clinical competence would contribute to a sense of wellbeing, and thereby support the delivery of a sustainable, effective workforce. In that sense, an important criterion for considering a TP outcome as legitimate is that it should directly or indirectly contribute to a societal health gain.
In a logic model, TP outcomes could play a role in addressing the current “attribution gap”; they would express explicit, commonly held goals which operate at different levels, namely:
i. That an individual doctor (graduate of the TP) is capable of providing a particular quality of healthcare to a patient.
ii. That the individual doctor (graduate of the TP) does actually deliver that quality of healthcare consistently to patients (i) across a wide range of clinical conditions/complexity, (ii) in the range of clinical settings in which the doctor practices, and (iii) over time.
iii. That the graduate cohorts of a TP function as described in (i) and (ii) above with little inter-individual variation (or at least little negative variation).
iv. That graduate cohorts of a TP remain healthy, motivated, fulfilled and contributory in the medical workforce over a professional lifetime.
v. That graduates of different TPs across disciplines function collectively to contribute to superior health outcome for a population (a measurable health gain).
vi. That a regulatory body has sufficient outcome-related data to co-design effective quality improvement measures with the TP.
Ideally, TP outcomes, defined across each of these levels, would provide a means to identify at which point (from training activity to patient outcome) the chain of attribution breaks down (if it does) (Panel 3).
Principles governing the achievement of training programme outcomes – an implementation science approach
Each TP should have as its principal objective the comprehensive and consistent achievement of its stated outcomes. Evaluation of TP outcomes is, without doubt, the best means by which programme fidelity and efficacy can be validated, and investment of resources justified.
Implementation science has established that a clear sense of shared purpose is necessary to achieve change. For instance, a key feature of the PARiHS framework is “a shared understanding about the benefits, disbenefits, risks, and advantages of the new over the old”.28 The underlying purpose of each TP outcome should be valid and meaningful for those operating at each level of the (interacting) health and education systems. This might be expressed as: “will achievement of this outcome (directly or indirectly) improve the health of a person?” Thus, those responsible for policy, regulation, budget management, accreditation, programme oversight and programme delivery, together with teachers, trainers, trainees, patients and members of the general population, should all recognise the value in each TP outcome, each from a different perspective. Those who make decisions at the “higher levels” of the system should be able to recognise in each outcome a specific means by which a societal healthcare benefit will occur. They will (or should) bring an understanding of national or regional workforce needs, competing demands for resources, and the risks of unintended consequences of policy implementation. Those responsible for programme quality and delivery should recognise in each outcome a characteristic of “the good doctor”, or a means to ensure that doctors can and do contribute to high quality and personally fulfilling healthcare over the course of a career. Patients, carers and members of the general population should recognise in the TP outcomes features of the care they would wish to receive. Each outcome carries with it different and meaningful implications in terms of action and accountability at each level.
The necessary centrality of the patient to any system-based approach to training is captured in the model proposed by Wong et al., which refers to (i) the different levels and forms of decision making, (ii) the need for key stakeholders to participate in the co-production, measurement and evaluation of outcomes, (iii) the fact that decision making is context bound, and (iv) the invariable competition for resources.29
Thomas and Ellaway have described the application of implementation science principles to health professional education.30 These entail ensuring that decisions are based on the relevant available evidence (evidence base) being presented to the appropriate stakeholders (data access), who are best positioned to understand its implications (context) and to respond appropriately (addressing barriers). Using this approach, we propose the following with respect to the achievement of TP outcomes:
1. Data access: The consistent achievement of pre-determined outcomes will depend on accessing data streams from the programme and the clinical sites (pre-requisite).
-
2.
Evidence base: The aggregate of data from clinical settings and the training environment must be provided to stakeholders at the appropriate level in a timely way. These data can critically inform and influence decision-making. Progress and challenges should be identified at programme director level.
-
3.
Context: The manner in which specific TP outcomes are achieved needs to take account of context-specific resources and local health needs.
-
4.
Addressing barriers: The data generated locally need to be examined for unintended consequences. Barriers or perceived barriers to the implementation of programme changes relating to specific programme outcomes need to be identified across the system, including investment, faculty development, structures and resource access.
Of the various components of a logic-based model2 applied to training doctors, we propose that TP outcomes are uniquely important: (i) they represent the means by which society's healthcare needs are met through training, and (ii) they determine the specific “deliverables” within curriculum and instructional design. In this sense, the outcomes represent the centre or fulcrum of the professional training system and should determine its operation (upstream) and serve as measures of efficacy (downstream).
We recognise that Training Programmes operate in a complex, dynamic environment. Whilst the four principles above accommodate change, an operating logic model could actually derive benefit from change (in data content, environmental factors and advances in data science).
Stakeholder collaboration in the development and renewal of training programme outcomes
The operation of a logic model is predicated upon all stakeholders sharing ownership and understanding of TP outcomes, and a commitment to their achievement. The resultant obligations will be viewed differently by the various stakeholders within and across healthcare/education systems.
The realisation of societal health benefits will require multi-institutional input and (given the occupational mobility of medical practitioners) some international alignment on TP outcomes. Many training systems (e.g. CanMEDS31 and Competency by Design32 in Canada) follow curricula and processes with goals aligned to the archetype of the “good doctor”. Consensus on such models between Training Bodies is one pathway to agreeing upon outcomes which can be used to define “quality” with respect to the training of doctors. The achievement of such a consensus even for “high level” principles will be challenging.
As medical education has evolved, it has sought to reconcile standardised outcomes with individualised learning pathways. Informed by the growing evidence base in health systems science, modern training also acknowledges the views of patients, health systems managers, and society itself.20,33 However, medical education systems are increasingly complex and often involve a plurality of discrete stakeholders with limited feedback and communication between them. An individual who is a physician-trainee, taxpayer and patient may hold different viewpoints simultaneously. Effective stakeholder collaboration, which requires definition of a shared vision, is therefore a complex ideal.34 Systems thinking can enable genuine value creation both for individual stakeholders, and for the overall network.35 Applied to healthcare, one proposed framework describes how value co-creation occurs at network level, as stakeholders interact, contribute resources, and share knowledge and experience.35 Value capture takes place when those stakeholders add value during the process. Importantly, it will be necessary to establish boundary conditions for value leveraging to take place.35
Further complexity arises from the requirement for health professional education to function within two inter-dependent systems: education and health.1 Meaningful stakeholder collaboration is necessary to drive the development and evolution of TP outcomes. Within the logic model we envisage, the principles of value co-creation and capture apply in particular to (i) establishing a shared vision of societal benefit and a shared set of values (purpose) and (ii) defining and evolving the TP outcomes (outcomes) (Fig. 1). A stakeholders’ network can be deployed to define a vision and outcomes that are scientifically sound, but also socially acceptable and, therefore, robust. The vision and outcomes are products of collective co-production, delivering benefits to the entire system, as opposed to individual co-production, where the benefit is largely at the individual level.36 In setting out to define TP outcomes, the network necessarily includes all “levels”, from funders and policy makers who influence healthcare systems and address population health issues (e.g. the opioid crisis) to end-users (patients, families, societal needs, and the health system itself). Equally, it should offer equity, diversity and inclusion in representation. In particular, the patient voice, often silent, inherently diverse, but bringing value through lived experience, must be central to the collaborative network.37–39 Conflict is inevitable and a normal part of multi-stakeholder processes and may be necessary, if not desirable, for meaningful change to occur. However, resolving it will require a blend of designated and distributed leadership which is both collaborative and inclusive,40–42 operating within a Collaborative Governance structure.43 (Panel 3). Industry and the sustainability literature offer examples of how such deliberate, collective action across multiple organisations can be achieved.44,45 These might be adapted to the implementation of competency based medical education.46
Panel 3. Collaborative governance applied to medical training
One particularly intensive cardiac anaesthesiology module enables trainees to encounter diverse and complex clinical problems, many of which occur out of hours. The programme director notes that trainees find the module stimulating and that their skills related to case management improve greatly. The TP outcomes related to the module are invariably achieved. However, trainees completing the module consistently report feeling fatigued, and very few opt for cardiac anaesthesiology in selecting their higher sub-specialty training or fellowship. Input data (e.g. clinical caseload and casemix, including out of hours work) and output data (cohort level workplace based assessment and cohort level trainee feedback) identify an exceptional mismatch between learning and satisfaction, and deliver the relevant summary to the decision makers at the appropriate levels (module co-ordinator and Programme Director). The Programme team redesigns the module so that trainees undertake it as two separate and shorter attachments. Overall clinical exposure and skills acquisition are retained, with far fewer reports of fatigue and greater interest in cardiac anaesthesiology as a long-term career.
Identifying, acknowledging, and addressing tensions, pitfalls and sub-standards: adaptive management
Ideally, stakeholder collaborations exist as adaptive and dynamic equilibria, rather than as networks of siloed stakeholders, with outcomes arising from interactions as emergent phenomena.47,48 They are “adaptive” insofar as they have the capacity to learn from experience and change to suit a context. Healthcare systems increasingly acknowledge that functioning within complex adaptive systems is, in fact, a core physician competency (see Section 2. Patient focused outcomes). Ultimately, the leadership of developing coalitions and the transformation of systems may fall to physicians.47,48
However, aligning stakeholder collaboration is particularly challenging when diverse stakeholder perspectives and conflicting priorities exist (Panel 4). Johnson describes the Polarity Thinking Model for organisations, with examples of industries’ ability to accurately differentiate between problems amenable to solutions and irreconcilable polarities.49 Govaerts has suggested how such thinking can drive opportunities for improvement.50
Turnhout et al.36 caution that power and politics can shape processes and outcomes during co-production. Empowerment of stakeholders to work and contribute on an equal footing will be important to the process. This again underscores the importance of the patient as a key stakeholder, whose voice must remain prominent and valued for its perspective and vested interest.
Panel 4. Differing stakeholder perspectives on assessment of clinical competencies
How can stakeholders co-operate on the selection, use and interpretation of methodology for assessment of clinical competencies?
-
i.
Shared overall objectives: Within a Collaborative Governance structure, a stakeholder network has identified certain Clinical Competencies as TP outcomes. These are based on a shared agreement that doctors require these competencies to practise safe, effective healthcare.
-
ii.
Establish common ground: Across countries and institutions, assessment strategies for clinical competencies share significant commonality.45,46
-
iii.
Facilitate expression of differing stakeholder perspectives: Different stakeholders express competing and conflicting priorities in choosing metrics for assessment of clinical competencies. A Programme Director points to certification examinations (e.g. American or Canadian Board examinations) which remain central to trainee assessment in many training programmes. Trainees express concern regarding tension between formal “examinations” on the one hand and clinical learning and performance-related progression on the other. Thus, trainees may feel the need to choose between “study time” and “clinical time”. Patients and their advocates might welcome workplace-based assessments as determinants of minimal competence. This would favour development of valid, reliable forms of objective assessment (perhaps some form of metrics-based training). Educationalists raise the question of “reductionism” if specific skills are to be assessed in isolation. Local data and published evidence are presented relevant to the validity of assessment tools and relationships between trainee assessments and subsequent clinical performance and patient outcome.
-
iv.
A decision: Within a collaborative governance structure, there is agreement that a decision is necessary or desirable. For a particular programme, the decision is made (i) only after the different perspectives have been considered, (ii) by the person or entity responsible for assessment within the programme, (iii) based on a set of universally accepted principles (iv) which are tailored to local needs using local data.
Certain incontrovertible principles are agreed to as fundamental to a sound decision. These could include the requirement for construct validity and reliability in assessment tools, acknowledgement of the risks of over- and under-assessment, and the risk of generating a “hidden curriculum” causing trainees to work towards a particular form of assessment rather than towards a competency. Then, the available scientific evidence (perhaps linking performance at some form of assessment with subsequent clinical behaviour) and local constraints (perhaps availability of simulation facilities, or a faculty development programme which includes workplace-based assessments) are taken into account. A decision is then made at the appropriate level (in this example: perhaps Programme Director or Clinical Competency Committee) and the rationale shared with stakeholders. This might entail explanation of the need to achieve a balance between over- and under-assessment of trainees, quality of feedback (and feedback-related learning), investment of time (human capital) and value for money.
Stakeholders and evaluation – collaboration on continuous quality improvement
In general, evaluation is a challenging but essential component of process development and its validation.42 In the context of training doctors, we examine two branches of evaluation: (i) evaluation of the process of collaboration amongst stakeholders and (ii) evaluation of the selection, renewal and achievement of TP outcomes (the latter covered in Section 2).
Conducting ongoing evaluation of inter-organisational collaboration has been reported to offer, at least, the perception of progress and increased faith among stakeholders.13,14,51 Embedding ongoing assessment into the initial organisation and its planning is prudent. In the systems-based approach that we advocate, regulatory bodies play a crucial role in influencing and standardising the quality of TPs, in the continuous quality improvement (CQI) of curricula to align with population needs, and in improving learning environments.52
A logic model will operate successfully if programme specific data (ideally using continuous data streams) are used together with emerging published evidence. Ultimately, it will seek to align specific TP outcomes (at patient-, trainee- and programme-focused levels) with measures of healthcare quality or patient outcome. The stakeholder collaborative network must retain sight of its ideal goals with awareness of any trade-offs incurred in progressing towards them.
Priority areas in which new knowledge is required to optimise the value of training programme outcomes
Certain elements of a truly system-based approach will require generation of new knowledge through research. In this section, we point to those areas of greatest importance while recognising that valuable work is currently underway in each of them.
The assessment of clinical competencies
Trainee assessment in anaesthesiology has traditionally been based on observation of clinical skills and standardised tests.53 More recently, a number of innovations, including in the area of non-technical skills, have enabled more comprehensive assessment by incorporating case scenarios, simulations and communication-based didactics.54,55 In an era of competency-based training, standardisation of technical skills assessment is relatively deficient.56
Given the limitations of employing expert opinion alone to adjudicate competence in performing complex tasks,57–59 novel techniques such as Verification of Proficiency60 in trainees using mixed simulators and virtual reality based systems61 are rapidly gaining acceptance in surgical disciplines. Adopting similar innovations could potentially augment learning and allow for standardisation in the evaluation of the technical skills of anaesthesiology trainees.
Proficiency in non-technical skills such as communication, resource utilisation, situational awareness and adaptability has been incorporated into the European Training Requirement22 and the requirements of the American Board of Anesthesiology.62 To date, the assessment of these skills using quantifiable and reproducible metrics is not widely practised. Preliminary studies have employed techniques such as the Anaesthetists’ Non-Technical Skills tool55 to explore this domain, but its validity and reliability in different clinical settings need to be further substantiated. Future research in this area may support adoption of a uniform approach to assessment.
The value of certain patient-focused TP outcomes (including achievement of specific clinical competencies by an individual trainee) would be greatly enhanced if a valid reliable assessment tool and a quantifiable “competent” standard of performance were available and universally employed.
Skill attrition over time
To ensure practitioners remain abreast of current practices and guidelines, a number of programmes have mandated the recording of hours engaged in educational activity, maintenance of certification examinations and attendance at refresher courses as licensing requirements.63 Although well established, this does not address the totality of the risk of skill attrition over time. A large scale evidence base which could be used to describe the nature, determinants and variation in skill maintenance and attrition would be invaluable.
Organisational structures and culture
A careful evaluation of the organisational structure and culture within which TPs (including in anaesthesiology) operate is warranted. Limited information exists on the “institutional” role (whether hospital, healthcare system, clinical department or training body) in the achievement of TP outcomes. Many of the “parent” organisations which employ trainees or to which trainees are affiliated have undergone major changes in their financing model, governance, service delivery imperatives and operational structure. Across the world, the anaesthesiology trainee's work practices have undergone change: this includes incorporation of work hour restrictions, shift work, multiple patient handovers, specialty-oriented training and performance focus. Further studies are required in particular in the areas of trainee fatigue and wellbeing, organisational culture, management of adverse outcomes and the balance between clinical service and training functions.
Systems thinking – a core skill for organisational change
The limited and variable manner in which stakeholders engage in “systems thinking” is one of many challenges to implementing a system-based approach to the training of doctors. Although systems-based practice was introduced as a specific competency by the ACGME and the American Board of Medical Specialties in 1999,63 systems thinking is not evident in many healthcare organisations. Gonzalo et al.64 correctly point out that 21st century doctors require skills in Systems Based Practice if they are to thrive. Currently, many undergraduate and postgraduate education curricula are deficient in that respect. Future work is required to augment our understanding of how, when and to whom such education is provided.
Data-driven instructional design and learning analytics
We have described the need for decisions made within a system-based approach to medical training to be data driven. The application of a logic model will require that data streams (from training and clinical environments) are established using ethically sound principles and secure methodologies. Techniques for data analysis and visualisation are advancing rapidly; in particular the emerging discipline of learning analytics is likely to influence instructional design applied to medical education.65
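The kind of data stream described above can be illustrated with a deliberately simple sketch. The code below is purely hypothetical: the record fields (trainee identifier, module, assessment score) and the grouping logic are invented for illustration, not drawn from any actual programme's data model. It shows only the general idea of aggregating individual workplace-based assessment records into cohort-level summaries that could be surfaced to a module co-ordinator or Programme Director.

```python
# Illustrative sketch only: aggregating hypothetical workplace-based
# assessment (WBA) records into cohort-level summaries per module.
# All field names and values are invented for this example.
from statistics import mean

def cohort_summary(records):
    """Group hypothetical WBA records by module; report count and mean score."""
    by_module = {}
    for r in records:
        by_module.setdefault(r["module"], []).append(r["score"])
    return {
        module: {"n": len(scores), "mean_score": round(mean(scores), 2)}
        for module, scores in by_module.items()
    }

# Invented example data for three hypothetical assessments.
records = [
    {"trainee_id": "T1", "module": "cardiac", "score": 4},
    {"trainee_id": "T2", "module": "cardiac", "score": 3},
    {"trainee_id": "T1", "module": "obstetric", "score": 5},
]
print(cohort_summary(records))
```

In a real implementation, of course, such aggregation would sit behind the ethically sound, secure data pipelines the text calls for, with governance over who may view which level of summary.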
Acknowledgements relating to this article
This Statement has been endorsed by the European Society of Anaesthesiology and Intensive Care, the European Board of Anaesthesiology (Section of Anaesthesiology, European Union of Medical Specialists), the Royal College of Physicians and Surgeons of Canada and the College of Anaesthesiologists of Ireland.
We would like to thank Ms Rebecca Cornally for her assistance in the writing of this Statement.
Financial support or sponsorship: none.
Conflict of interest: none.
This manuscript was handled by Charles Marc Samama.
References
- 1.Frenk J, Chen L, Bhutta ZA, et al. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet 2010; 376:1924–1958. [DOI] [PubMed] [Google Scholar]
- 2.Van Melle E, Hall AK, Schumacher DJ, et al. Capturing outcomes of competency-based medical education: the call and the challenge. Med Teach 2021; 43:794–800. [DOI] [PubMed] [Google Scholar]
- 3.Frank JR, Taber S, van Zanten M, et al. The role of accreditation in 21st century health professions education: report of an International Consensus Group. BMC Med Educ 2020; 20: (Suppl 1): 305. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4.Fishbain D, Yanon YL, Nissanholz-Gannot R. Accreditation systems for postgraduate medical education: a comparison of five countries. Adv Health Sci Educ 2019; 24:503–524. [DOI] [PubMed] [Google Scholar]
- 5.To Err is Human: Building a Safer Health System. The Institute of Medicine. December 1999. Available at: http://www.nap.edu/books/0309068371/html/. [Google Scholar]
- 6.McGlynn EA, Asch SM, Adams J, et al. The quality of healthcare delivered to adults in the United States. NEJM 2003; 348:2635–2645. [DOI] [PubMed] [Google Scholar]
- 7.Mirror, Mirror 2021: Reflecting Poorly. Healthcare in the U.S. Compared to Other High-Income Countries. Available at: https://www.commonwealthfund.org/publications/fund-reports/2021/aug/mirror-mirror-2021-reflecting-poorly [Accessed 31 January 2023]. [Google Scholar]
- 8.Horton R. A new epoch for health professionals’ education. Lancet 2010; 376:1875–1877. [DOI] [PubMed] [Google Scholar]
- 9.Holmboe ES, Kogan JR. Will any road get you there? Examining warranted and unwarranted variation in medical education. Acad Med 2022; 97:1128–1136. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.de Feijter MJ, de Grave WS, Muijtjens AM, et al. Comprehensive overview of medical error in hospitals using incident-reporting systems, patient complaints and chart review of inpatient deaths. PLoS One 2012; 7:e31125. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Makary MA, Daniel M. Medical error—the third leading cause of death in the US. BMJ 2016; 353:i2139. [DOI] [PubMed] [Google Scholar]
- 12.Excellence by Design, the UK General Medical Council 2017. Available at: https://www.gmc-uk.org/education/standards-guidance-and-curricula/standards-and-outcomes/excellence-by-design [Accessed 30 March 2023]. [Google Scholar]
- 13.Bandiera G, Frank J, Scheele F, et al. Effective accreditation in postgraduate medical education: from process to outcomes and back. BMC Med Ed 2020; 20: (Suppl 1): 307. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Akdemir N, Peterson LN, Campbell CM, et al. Evaluation of continuous quality improvement in accreditation for medical education. BMC Med Ed 2020; 20: (Suppl 1): 308–314. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Tamblyn R, Abrahamowicz M, Dauphinee D, et al. Physician scores on a national clinical skills examination as predictors of complaints to medical regulatory authorities. JAMA 2007; 298:993–1001. [DOI] [PubMed] [Google Scholar]
- 16.Tamblyn R, Abrahamowicz M, Dauphinee D, et al. Association between licensure examination scores and practice in primary care. JAMA 2002; 288:3019–3026. [DOI] [PubMed] [Google Scholar]
- 17.Asch DA, Nicholson S, Srinivas S, et al. Evaluating obstetrical residency programs using patient outcomes. JAMA 2009; 302:1277–1283. [DOI] [PubMed] [Google Scholar]
- 18.Jones J, Hunter D. Consensus methods for medical and health services research. BMJ 1995; 311:376–380. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 19.Hall AK, Schumacher DJ, Thoma B, et al. Outcomes of competency-based medical education: a taxonomy for shared language. Med Teach 2021; 43:788–793. [DOI] [PubMed] [Google Scholar]
- 20.American Medical Association. Health System Science. Available at: https://edhub.ama-assn.org/health-systems-science [Accessed 25 March 23]. [Google Scholar]
- 21.Shorten GD, de Robertis E, Goldik Z, et al. European Section/Board of Anaesthesiology/European Society of Anaesthesiology consensus statement on competency-based education and training in anaesthesiology. EJA 2020; 37:421–424. [DOI] [PubMed] [Google Scholar]
- 22.Training Requirements for the Specialty of Anaesthesiology. European Standards of Postgraduate Medical Specialist Training. UEMS 2022. Available at: https://www.uems.eu/__data/assets/pdf_file/0004/156199/UEMS-2022.12-European-Training-Requirements-in-Anaesthesiology.pdf [Accessed 30 March 23]. [Google Scholar]
- 23.Blum RH, Boulet JR, Cooper JB, Muret-Wagstaff SL. Simulation-based assessment to identify critical gaps in safe anesthesia resident performance. Anesth Analg 2014; 120:129–141. [DOI] [PubMed] [Google Scholar]
- 24.Jacobs DB, Schreiber M, Seshamani M, et al. Aligning quality measures across CMS — the Universal Foundation. N Engl J Med 2023; 388:776–779. [DOI] [PubMed] [Google Scholar]
- 25.Kim JG, Rodriguez HP, Holmboe E, et al. The reliability of graduate medical education quality of care clinical performance measures. J Grad Med Educ 2022; 14:281–288. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 26.Breen D, O’Brien S, McCarthy N, et al. Effect of a proficiency-based progression simulation programme on clinical communication for the deteriorating patient: a randomised controlled trial. BMJ Open 2019; 9: e025992. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 27.Srinivasan KK, Gallagher A, O’Brien N, et al. Proficiency-based progression training: an ‘end to end’ model for decreasing error applied to achievement of effective epidural analgesia during labour: a randomised control study. BMJ Open 2018; 8: e020099. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 28.Kitson Al, Rycroft-Malone J, Harvey G, et al. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges. Implement Sci 2008; 3:1. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 29.Wong BM, Holmboe ES. Transforming the academic faculty perspective in graduate medical education to better align educational and clinical outcomes. Acad Med 2016; 91:473–479. [DOI] [PubMed] [Google Scholar]
- 30.Thomas A, Ellaway RH. Rethinking implementation science for health professions education: a manifesto for change. Perspect Med Educ 2021; 10:362–368. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 31.CanMEDS. Royal College of Physicians and Surgeons of Canada. Available at: https://www.royalcollege.ca/rcsite/canmeds/framework/canmeds-role-professional-e [Accessed 30 March 2023]. [Google Scholar]
- 32.Competence by Design, Royal College of Physicians and Surgeons of Canada. Available at: https://www.royalcollege.ca/rcsite/cbd/competence-by-design-cbd-e [Accessed 30 March 2023]. [Google Scholar]
- 33.Herbert CP, Busing N, Nasmith L. Collaborative governance of postgraduate medical education: can it be achieved? Med Teach 2020; 43:1413–1418. [DOI] [PubMed] [Google Scholar]
- 34.Reypens C, Lievens A, Blazevic V. Leveraging value in multistakeholder innovation networks: a process framework for value co-creation and capture. Ind Mark Manag 2016; 56:40–50. [Google Scholar]
- 35.Tragl L, Savage C, Andreen-Sachs M, Brommels M. Who counts when health counts? A case-study of multistakeholder initiative to promote value-creation in Swedish healthcare. Health Serv Manage Res 2020; 0: doi:10.1177/09514848221100751. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 36.Turnhout E, Metze T, Wyborn C, et al. The politics of co-production: participation, power, and transformation. Curr Opin Environ Sustain 2020; 22:15–21. [Google Scholar]
- 37.Réjean H. Engaging patients and the public in planning for care and driving healthcare reforms. In Managing a Canadian Healthcare Strategy 2017; MQUP. pp. 143–54. [Google Scholar]
- 38.Khalife R, Gupta M, Gonsalves, et al. Patient involvement in assessment of postgraduate medical learners: a scoping review. Med Educ 2022; 56:602–613. [DOI] [PubMed] [Google Scholar]
- 39.Batalden M, Batalden P, Margolis P, et al. Co-production of healthcare service. BMJ Qual Saf 2016; 25:509–517. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Best A, Greenhalgh T, Lewis S, et al. Large-system transformation in healthcare: a realist review. Milbank Q 2012; 90:421–456. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 41.Aunger JA, Millar R, Greenhalgh J, et al. Why do some inter-organisational collaborations in healthcare work when others do not? A realist review. Syst Rev 2021; 10:82. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Rycroft-Malone J, Burton C, Wilkinson J, et al. Collective action for implementation: a realist evaluation of organisational collaboration in healthcare. Implement Sci 2015; 11:17. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Herbert CP, Busing NP, Nasmith L. Collaborative governance of postgraduate medical education: can it be achieved? Med Teach 2021; 12 (43):1413–1418. [DOI] [PubMed] [Google Scholar]
- 44.Derks M, Berkers F, Tukker A. Toward accelerating sustainability transitions through collaborative sustainable business modeling: a conceptual approach. Sustainability 2022; 14:3803. [Google Scholar]
- 45.Buléon C, Eng R, Rudolph JW, et al. First steps towards international competency goals for residency training: a qualitative comparison of 3 regional standards in anesthesiology. BMC Med Educ 2021; 21:569. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 46.Weller JM, Naik VN, Ryan J, San Diego RJ. Systematic review and narrative synthesis of competency-based medical education in anaesthesia. Br J Anaesth 2020; 124:748–760. [DOI] [PubMed] [Google Scholar]
- 47.Van Aerde J, Gomes MM, Giuliani M, et al. Complex adaptive systems in CanMEDS 2025. Can Med Edu J 2023; 14:50–53. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48.Cupido N, Fowler N, Sonnenberg LK, et al. Adaptive expertise in CanMEDS 2025. Can Med Edu J 2023; 14:18–21. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Johnson B. Reflections: a perspective on paradox and its application to modern management. J Appl Behav Sci 2014; 50:206–212. [Google Scholar]
- 50.Govaerts MJB, van der Vleuten CPM, Holmboe ES. Managing tensions in assessment: moving beyond either-or thinking. Med Educ 2019; 53:64–75. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 51.Taber S, Akdemir N, Gorman L, et al. A “fit for purpose” framework for medical education accreditation system design. BMC Med Educ 2020; 20: (Suppl 1): 306–1306. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 52.Scientific Council for Government Policy. Supervising public interests. Towards a broader perspective on government supervision (Toezien op publieke belangen. Naar een verruimd perspectief op rijkstoezicht). WRR 2013. Available at: https://english.wrr.nl/publications/reports/2013/09/09/supervising-public-interests.-towards-a-broader-perspective-on-government-supervision [Accessed 30 March 23]. [Google Scholar]
- 53.Tetzlaff JE. Assessment of competency in anesthesiology. Anesthesiology 2007; 106:812–825. [DOI] [PubMed] [Google Scholar]
- 54.Ambardekar AP, Walker KK, McKenzie-Brown AM, et al. The anesthesiology milestones 2.0: an improved competency-based assessment for residency training. Anesth Analg 2021; 133:353–361. [DOI] [PubMed] [Google Scholar]
- 55.Boet S, Larrigan S, Martin L, et al. Measuring nontechnical skills of anaesthesiologists in the operating room: a systematic review of assessment tools and their measurement properties. Br J Anaesth 2018; 121:1218–1226. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 56.Yoganathan S, Finch DA, Parkin E, Pollard J. 360 degrees virtual reality video for the acquisition of knot tying skills: a randomised controlled trial. Int J Surg 2018; 54 (Pt A):24–27. [DOI] [PubMed] [Google Scholar]
- 57.Rutala PJ, Witzke DB, Leko EO, et al. Student fatigue as a variable affecting performance in an objective structured clinical examination. Acad Med 1990; 65: (Suppl): S53–S54. [DOI] [PubMed] [Google Scholar]
- 58.Chuan A, Thillainathan S, Graham PL, et al. Reliability of the direct observation of procedural skills assessment tool for ultrasound-guided regional anaesthesia. Anaesth Intensive Care 2016; 44:201–209. [DOI] [PubMed] [Google Scholar]
- 59.Morris MC, Gallagher TK, Ridgway PF. Tools used to assess medical students competence in procedural skills at the end of a primary medical degree: a systematic review. Med Educ Online 2012; 17. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 60.Matyal R, Mahmood F, Knio ZO, et al. Evaluation of the quality of transesophageal echocardiography images and verification of proficiency. Echo Res Pract 2018; 5:89–95. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 61.Nassar AK, Al-Manaseer F, Knowlton LM, Tuma F. Virtual reality (VR) as a simulation modality for technical skills acquisition. Ann Med Surg (Lond) 2021; 71:102945. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 62.Maintain Certification. American Board of Anesthesiology. Available at: https://www.theaba.org/maintain-certification/moca-eligibility/ [Accessed 30 March 23]. [Google Scholar]
- 63.Edgar L et al. Milestones 2.0: a step forward. Available at: https://meridian.allenpress.com/jgme/article/10/3/367/33604/Milestones-2-0-A-Step-Forward [Accessed 30 March 23]. [Google Scholar]
- 64.Gonzalo JD, Dekhtyar M, Starr SR, et al. Health systems science curricula in undergraduate medical education: identifying and defining a potential curricular framework. Acad Med 2017; 92:123–131. [DOI] [PubMed] [Google Scholar]
- 65.Ten Cate O, Dahdal S, Lambert T, et al. Ten caveats of learning analytics in health professions education: a consumer's perspective. Med Teach 2020; 42:673–678. [DOI] [PubMed] [Google Scholar]

