Abstract
Background
Qualitative realist analysis is gaining in popularity in health professions education research (HPER) as part of theory‐driven program evaluation. Although realist approaches such as syntheses and evaluations typically advocate mixed methods, qualitative data currently dominate. Various forms of qualitative analysis have been articulated in HPER, yet realist analysis has not. Although realist analysis is interpretive, it moves beyond description to explain generative causation, employing retroductive theorising. Ultimately, it attempts to build and/or ‘test’ (confirm, refute or refine) theory about how, why, for whom, when and to what extent programs work using the context‐mechanism‐outcome configuration (CMOC) heuristic. This paper aims to help readers better critique, conduct and report qualitative realist analysis.
Realist Analysis Methods
We describe four fundamentals of qualitative realist analysis: (1) simultaneous data collection/analysis; (2) retroductive theorising; (3) configurational analysis (involving iterative phases of identifying CMOCs, synthesising CMOCs into demi‐regularities and translating demi‐regularities into program theory); and (4) realist analysis quality (relevance, rigour, richness). Next, we provide a critical analysis of realist analyses employed in 15 HPER outputs—three evaluations and 12 syntheses. Finally, drawing on our understandings of realist literature and our experiences of conducting qualitative realist analysis (both evaluations and syntheses), we articulate three common analysis challenges (coding, consolidation and mapping) and strategies to mitigate these challenges (teamwork, reflexivity and consultation, use of data analysis software and graphical representations of program theory).
Conclusions
Based on our critical analysis of the literature and realist analysis experiences, we encourage researchers, peer reviewers and readers to better understand qualitative realist analysis fundamentals. Realist analysts should draw on relevant realist reporting standards and literature on realist analysis to improve the quality and reporting of realist analysis. Through better understanding the common challenges and mitigation strategies for realist analysis, we can collectively improve the quality of realist analysis in HPER.
Short abstract
Interested in realist analysis of qualitative data? Then check out this new Focus on Research Methods paper, which aims to help researchers better critique, conduct and report qualitative realist analysis.
1. INTRODUCTION
Realist approaches such as realist evaluation and realist synthesis unpack the black box of whether interventions (such as programs, policies or services) work or not, for whom and under what circumstances, how and why. 1 They serve to build and/or ‘test’ (confirm, refute or refine) program theory, which can be described as ‘why and how the program brought about the changes observed’. 2, p. 3 Realist approaches are gaining in popularity in health professions education research (HPER), especially realist analysis of qualitative data (hence the focus of this paper). This includes not only realist evaluation employing realist interviewing 3 , 4 , 5 but also realist syntheses. 6 , 7 , 8 Although realist syntheses include research outputs employing diverse methods (qualitative, quantitative or mixed), the outputs can themselves be considered as ‘qualitative’ data because they are text‐based. Qualitative analytical approaches are particularly useful for understanding how and why programs work (i.e., mechanisms; things that generate outcomes). 9

Although realist analysis of qualitative data is an interpretive process 10 and therefore has some parallels with interpretivist qualitative ‘thematic’ analysis, important differences exist. Non‐realist qualitative analysis typically serves to describe the phenomenon of interest (often participants' views and/or experiences) to better understand the topic of inquiry through some form of thematic analysis. 11 Sometimes, this analysis serves to build a theory about the phenomenon of interest, as in grounded theory methodology. 11 However, realist analysis employs retroductive theorising, 12 moving beyond description to an explanation of causation and focusing on building and/or testing program theory containing causal explanations using the context‐mechanism‐outcome configuration (CMOC) heuristic.

Realist analysis of voluminous qualitative data has many challenges, 9 , 10 , 13 but so far, little advice exists for realist researchers in HPER. 9 , 11 Therefore, this Focus on Research Methods paper follows on from our realist interviewing paper 11 to outline the fundamentals of realist analysis of qualitative data, as well as to provide a critical analysis of Medical Education papers employing realist analyses of qualitative data over the last decade. We also share the challenges associated with realist analysis, as well as potential mitigation strategies. This paper—based on our knowledge of the literature and experiences of realist analysis—is written primarily for those relatively new to realist analysis. We draw on the same case studies employed in our realist interviewing paper—a realist evaluation of state‐wide clinical supervision training in Australia, 14 , 15 , 16 and a UK evaluation of faculty development programs. 4 We therefore recommend that this realist analysis paper is read in conjunction with our previous realist methodology outputs. 9 , 11 Ultimately, we present this critique of existing realist analysis in HPER (alongside our experiences of conducting realist analyses) to provide recommendations to HPE researchers, so that they can better critique, conduct and report realist analyses. We also anticipate the paper will assist peer‐reviewers and editors within the HPER field to critically appraise qualitative realist analyses.
2. THE FOUR FUNDAMENTALS OF REALIST ANALYSIS
As described previously, realist methodologies (such as realist syntheses and realist evaluations) involve theory‐driven approaches to better understand how programs work, for whom and under what circumstances and why. 9 , 11 For readers interested in learning more about the philosophies (e.g., realist ontology) underpinning realist approaches, we direct you elsewhere. 9 , 17 , 18 Realist evaluations typically employ purposive sampling, and although mixed methods of data collection are advocated, qualitative data are dominant in HPER. 9 , 11 It is important that those embarking on qualitative realist analysis in realist evaluations consider four fundamentals: (1) simultaneous data collection and analysis, (2) retroductive theorising, (3) configurational analysis and (4) realist analysis quality (see Figure 1 for an overview of these four fundamentals and their relationships). These fundamental principles also apply to realist syntheses, so we also consider syntheses in this paper given their comparative dominance in the HPER literature.
FIGURE 1.

The four fundamentals of realist analysis and their relationships. (We developed this figure thinking primarily of realist interviews as part of realist evaluation, but much of it can also be applied to realist synthesis. The concentric circles represent the four fundamentals of realist analysis, with arrows within the circles highlighting the iterative (back‐and‐forth) relations between the different elements within each fundamental. The dotted (porous) lines and double‐headed arrows between the concentric circles illustrate the complex and dynamic inter‐relations between the four fundamentals.)
2.1. Simultaneous data collection and analysis
Realist data collection and analysis are ideally concurrent rather than sequential processes. 17 , 19 , 20 , 21 , 22 For example, in realist interviewing, realist analysis begins during the interview, with the interviewer's in‐the‐moment analysis influencing their questioning: what and how they ask questions to tease out contexts (dynamic and relational features triggering mechanisms), 23 mechanisms (typically hidden processes, structures or entities generating outcomes), 1 outcomes and relationships between these elements. 11 , 24 Analysis then continues more formally after the interview, through analysing the interview transcript (see retroductive theorising below). However, this formal analytic process often begins while interviews are still being conducted and transcribed. Early informal and formal preliminary analysis can help to shape and focus subsequent data collection, which can then be tailored to preliminary findings and thus to the developing program theory. 22 As described by Manzano, 21, p. 357 ‘realist analysis is not a defined separate stage of the research process; it is an ongoing iterative process of placing nuggets of information … within a wider configurational explanation’. Starting the analysis process as early as possible can also be helpful within realist syntheses, with searches, appraisals and syntheses often occurring iteratively. 22
2.2. Retroductive theorising
As explained by Manzano, 21, p. 357 realist analysis is an iterative, back‐and‐forth process of refining program theory: ‘to and from the evidence’. Central to this is retroductive theorising, 13 , 22 , 25 which aims to ‘elucidate patterns of generative causation’. 26, p. 505 In talking about realist syntheses, Dada and colleagues 26, p. 505 described the retroductive theorising process as non‐linear, involving the ‘to'ing and fro'ing between the programme theory developed and tested in the review, and the evidence included in the review’. As mentioned above, realist analysis serves to build and/or test program theory (which is theory about how and why the program brought about its outcomes). 2 , 12 , 13 , 20 Program theory can be seen as the main unit/output of analysis in realist evaluation and syntheses. 27 Therefore, realist analysis involves inductive, deductive and abductive approaches as part of retroductive theorising to iteratively build and/or test program theory. 11 , 25 , 26 See Box 1 for an explanation of the different types of analytical reasoning employed in realist analysis. As an example, an inductive analytical approach is typical when building initial program theory, with analysts identifying contexts, mechanisms, outcomes and CMOCs within the data to make sense of generative mechanisms and causality. However, once an initial program theory is developed, the analytical approach may switch to predominantly deductive reasoning, that is, analysing data to test the program theory, with the analyst judging whether the data refute the theory, or confirm it wholly or partly, thereby leading to theory refinement. 31 At any point in the analytic process, researchers can simultaneously be building one (part of the) theory and refining another while analysing any given data source (e.g., an interview transcript or a research paper). 32 Indeed, the analyst might incorporate new thinking and evidence into this building and/or testing of the program theory through abductive reasoning (see Box 1). As illustrated in Box 1, the essence of retroductive theorising is moving back and forth between different types of reasoning, and between theory and data, to identify hidden generative causality. 22
BOX 1. Relevant realist analytical concepts (in alphabetical order).
• Abductive reasoning: Bygstad and colleagues liken abduction to ‘conjecturing’. 28, p. 86 Abductive thinking involves the incorporation of insights and data outside of the theoretical framework into the analytic process. 22 An example of abductive reasoning, from our realist economic evaluation of supervision training, is where we conjectured based on the literature that extended‐duration training would not yield additional positive outcomes beyond those achieved through brief (and less expensive) training. 5
• Deductive reasoning: Involves testing theory by examining whether empirical data supports (or refutes) theory. 29 An example of deductive reasoning, from our realist economic evaluation of supervision training, is how we tested our initial program theory with our realist economic data to modify the program theory, thereby proposing different optimised economic models for brief training (for experienced supervisors with experienced facilitators) and extended‐duration supervision training programs (for inexperienced supervisors with junior facilitators). 5
• Generative causality: Actors are central to generative causality, with a focus on ‘how individuals construct causes and give meaning—there may be multiple different understandings of contextual conditions and of how interventions lead to outcomes’. 23, p. 3
• Inductive reasoning: Involves building theory from empirical data. 29 An example of inductive reasoning, from our realist economic evaluation of supervision training, is how we developed program theory through identifying in participants' data cost‐sensitive mechanisms and context (e.g., learner protected time, learner engagement and facilitator competence) generating positive training outcomes. 5
• Retroduction: Described as a creative and iterative process of moving back and forth between inductive and deductive reasoning, 28 , 30 and between theory and data. 11 Retroduction helps identify the hidden causal forces underlying identified patterns. 29
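To make the confirm/refute/refine bookkeeping of deductive theory testing (described above) more concrete, the following minimal Python sketch is our own illustration rather than part of any published realist method; the theory claim, the judgement labels and the decision logic are all hedged assumptions about how an analyst's judgements might be recorded.

```python
from enum import Enum

class Judgement(Enum):
    CONFIRMS = "confirms"                # excerpt supports the CMOC as theorised
    PARTLY_CONFIRMS = "partly confirms"  # e.g., a dyad (CM or MO) fits but one element differs
    REFUTES = "refutes"                  # excerpt contradicts the theorised configuration

# Hypothetical deductive pass: each data excerpt is judged against one theory claim.
claim = "protected time (C) triggers learner engagement (M), generating skill gains (O)"
judgements = [Judgement.CONFIRMS, Judgement.PARTLY_CONFIRMS, Judgement.CONFIRMS]

print(f"Testing: {claim}")
if any(j is Judgement.REFUTES for j in judgements):
    print("Refute: revise or reject this part of the program theory")
elif all(j is Judgement.CONFIRMS for j in judgements):
    print("Confirm: retain this part of the program theory")
else:
    print("Refine: adjust the configuration and re-test against further data")
```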
2.3. Configurational analysis
Variation can exist in the phases employed for qualitative realist analysis depending on the study's philosophical approach, whether it focuses on specific interventions or generalised phenomena, and whether the analysis focuses on building and/or testing program theory. 25 However, despite these variations, configurational analysis is central to all realist analysis. 9 , 13 , 20 , 22 , 30 , 33 , 34 , 35 , 36 , 37 Such configurational analysis in the HPER literature can be roughly seen as following three phases: (1) identifying CMOCs (instead of individual unlinked Cs, Ms and Os); (2) synthesising CMOCs into demi‐regularities (DRs), that is, prominent recurrent patterns 32 ; and (3) translating DRs into program theory, although there may be some to'ing and fro'ing between these phases and the data. 13 , 21 This is relatively consistent with the steps outlined recently by Peters, 22 where coding, consolidation and conceptual mapping were used as analytical tools in a realist synthesis.
Phase 1 typically involves researchers identifying contexts, mechanisms, outcomes and CMOCs to ‘formulate potential causal explanations to answer what works, for whom, how, and under what circumstances’. 25, p. 481 Jackson and Kolla 38 were the first to report coding dyads (e.g., CM, MO and CO) and triads (CMOs) from qualitative interview data. Although CMOCs are a typical analytic heuristic in realist analysis, realist researchers are increasingly suggesting CMOC alternatives to improve the clarity of generative causality: ultimately, the form of configuration needs to make sense as a causal explanation. Such alternatives include adding intervention features as in CIMOs 31 ; interventions and actors as in ICAMOs 34 , 35 ; or strategies as in SCMOs. 39 De Weger and colleagues 39 suggest that researchers should clarify and justify their configuration types if they are different from the typical CMOCs. Finally, researchers have advocated for more nuanced analyses of mechanisms (e.g., resources and reasoning) and contexts (e.g., observable features and relational and dynamic features) to account for their multi‐layered natures, 1 , 23 , 27 as well as the concatenation of CMOCs, whereby some mechanisms can be seen to yield other mechanisms and so on. 40 , 41 Others have also suggested the importance of annotations in this phase, documenting things like how CMOCs may change through the analytic process. 13 , 22
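For readers who like to see the heuristic made tangible, the following minimal Python sketch shows one way a Phase 1 coded configuration might be represented. It is our illustration alone; all field and variable names are hypothetical assumptions, not features of any CAQDAS package or published study. The optional fields allow dyads (e.g., an MO awaiting its context) to be recorded and completed later, with an annotation trail documenting how the configuration evolved.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CMOC:
    """One context-mechanism-outcome configuration coded from a chunk of data."""
    context: Optional[str] = None       # dynamic/relational features triggering mechanisms
    mechanism: Optional[str] = None     # hidden process/structure/entity generating outcomes
    outcome: Optional[str] = None
    intervention: Optional[str] = None  # for CIMO-style variants, if used
    source: str = ""                    # e.g., transcript ID or article citation
    annotations: list[str] = field(default_factory=list)  # memo trail of changes

    def is_complete(self) -> bool:
        """True for a full triad; False for dyads awaiting the missing element."""
        return all([self.context, self.mechanism, self.outcome])

# Example: an MO dyad coded first, with the context added once evidenced.
cmoc = CMOC(mechanism="feeling supported and confident",
            outcome="pursuing a research career", source="interview_07")
cmoc.annotations.append("Context thin at first pass; revisit follow-up data")
cmoc.context = "culture of research support"
print(cmoc.is_complete())  # True once all three elements are evidenced
```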
Phase 2 typically involves synthesising (often many) CMOCs into DRs by exploring dominant and/or fluctuating patterns. 25 , 33 This has been described as the data reduction (or consolidation) phase to help make sense of voluminous data. 22 Although large amounts of data can be reduced based on frequencies, 38 data can also be consolidated based on similarity (i.e., categorising conceptually similar CMOCs), especially to prevent excluding important (but infrequent) CMOCs. 22 Such decisions about how to synthesise CMOCs will depend on the data at hand (and its volume), as well as its relevance to the building and/or testing of program theory. Indeed, through progressive focusing, one might ignore (and intentionally not report) CMOCs in the data that are irrelevant to building and/or testing program theory.
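As a companion sketch for Phase 2 (again ours, with invented labels and data), the snippet below groups conceptually similar CMOCs under analyst‐assigned labels and counts them, mirroring consolidation by similarity and frequency while retaining infrequent configurations for inspection rather than discarding them on counts alone.

```python
from collections import defaultdict
from typing import NamedTuple

class CMOC(NamedTuple):
    context: str
    mechanism: str
    outcome: str
    source: str  # e.g., article or transcript identifier

# Analyst-assigned conceptual labels: consolidation is interpretive, not automatic.
labelled = [
    ("protected learning time", CMOC("supportive workplace", "learner engagement",
                                     "supervision skill gains", "article_01")),
    ("protected learning time", CMOC("rural health service", "learner engagement",
                                     "supervision skill gains", "article_07")),
    ("facilitator competence", CMOC("experienced facilitator", "credible role-modelling",
                                    "supervisor behaviour change", "article_03")),
]

demi_regularities = defaultdict(list)
for label, cmoc in labelled:
    demi_regularities[label].append(cmoc)

# Frequency highlights dominant patterns, but rare CMOCs relevant to program
# theory are kept (and reviewed) rather than excluded.
for label, cmocs in sorted(demi_regularities.items(), key=lambda kv: -len(kv[1])):
    print(f"{label}: {len(cmocs)} CMOC(s) from {[c.source for c in cmocs]}")
```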
Finally, phase 3 typically involves translating DRs into the building and/or testing of program theory, which is often expressed as middle‐range theory (described as formal theory more abstract than a program theory). 32 , 33 , 34 , 42 Notably, other substantive formal theories (e.g., educational theories) can be incorporated into this analytical process. 10 , 25 For example, in their fourth stage of analysis (called ‘generative mechanism refinement’), Putri and colleagues 25 drew on the concept of knowledge brokering to refine their initial generative mechanism relating to preceptor/mentor support. An example of a CMOC gleaned from this Putri et al. 25 study (see p. 487) is that role transitions (intervention) for emergency nurses (context) trigger knowledge brokering (mechanism), generating expected emergency nurse knowledge and skills and practice transformation (outcomes). Peters 22 described the importance of visualising the program theory at this stage using diagrams, flow charts and/or conceptual maps, illustrating the connections between CMOCs at different levels of abstraction. We direct the readers to diverse examples of visual representations of program theory; specifically, Ajjawi et al., 6, p. 943 Lefroy et al. 3, p. 1047 and Price et al. 43, p. 1002 See Table S1 for examples of code evolution across these realist analysis phases for the illustrative cases.
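Because program theory diagrams are essentially directed graphs of configurational links, even a few lines of code can draft one. The sketch below is our illustration (the node labels are invented, and this is one option among many tools); it emits Graphviz DOT text linking contexts to mechanisms to outcomes, producing a first‐draft conceptual map that a team can then render, critique and refine.

```python
# Emit Graphviz DOT text chaining context -> mechanism -> outcome for each CMOC.
cmocs = [
    ("culture of research support", "feeling supported and confident",
     "pursuing a research career"),
    ("protected learning time", "learner engagement", "supervision skill gains"),
]

lines = ["digraph program_theory {", "  rankdir=LR;"]
for context, mechanism, outcome in cmocs:
    lines.append(f'  "{context}" -> "{mechanism}" -> "{outcome}";')
lines.append("}")

# Save as theory.dot and render with: dot -Tpng theory.dot -o theory.png
print("\n".join(lines))
```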
2.4. Realist analysis quality
Realist analysis is ideally based on data collected using realist approaches (e.g., realist interviewing in realist evaluation), 11 so that the generative causality underpinning the relationships between contexts, mechanisms and outcomes is more likely to be evident. However, it is relatively common for realist analysis to be conducted on non‐realist data (particularly in realist syntheses, which commonly analyse/synthesise non‐realist primary studies), which may limit the realist insights possible. In terms of realist syntheses, authors are encouraged to consider the following data features when selecting and appraising the evidence: (a) relevance (i.e., to the topic of inquiry and to building and/or testing of program theory); (b) richness (in terms of the density of explanatory evidence within the data, so ideally being conceptually rich and contextually thick); and (c) rigour. 10 , 22 , 26 , 32 , 44 Note that rigour can be thought of as the trustworthiness of data (e.g., the believability of findings and whether results reflect other contexts), as well as the coherence and plausibility of the program theory (e.g., alignment between the program theory and data). We would argue that relevance, richness and rigour are equally important markers of quality in realist interview data for realist evaluation. Furthermore, realist scholars have recently suggested three criteria for judging the explanatory quality of program theory: consilience (the theory accounts for as much of the relevant data as possible); simplicity (the theory should be as parsimonious as possible); and analogy (the theory should fit with what is already known). 10 , 45 We recommend that researchers read other references for more details on the quality of realist analysis (e.g., Duddy & Wong). 10 Finally, several researchers 13 , 22 , 30 , 37 have advocated for the use of analytical documenting tools such as NVivo, Microsoft Excel and Microsoft PowerPoint to enhance the quality of qualitative realist analysis. Indeed, multiple tools may be needed to manage the complex and non‐linear analytic process. 21 Such tools are thought to enable quality through facilitating a team‐based approach, helping to manage and make sense of voluminous qualitative data, enabling theory building, testing and/or refinement, and providing an audit trail for inherently messy/complex analyses, thereby facilitating transparency of findings. 13 , 22 , 30 , 37
3. AN OVERVIEW OF REALIST ANALYSIS IN HPER
Given the four fundamentals of realist analysis outlined above, we next critique existing realist analysis in HPER to enable us to provide recommendations to HPE researchers, so that they can better critique, conduct and report realist analyses. As mentioned above, although realist approaches are becoming popular in HPER, especially realist syntheses, there are still few examples of qualitative realist analysis (compared with traditional interpretive qualitative data analysis). 46 , 47 In Table 1, we summarise 15 papers published in Medical Education (2016–2023), identified from a Web of Science search, that include realist analysis of qualitative data, either as part of a realist evaluation (where realist interview data were analysed), 3 , 4 , 5 or a realist synthesis (where research papers were synthesised). 6 , 7 , 8 , 43 , 48 , 49 , 50 , 51 , 52 , 53 , 54 , 55 The papers served to evaluate wide‐ranging education‐related interventions, such as those focused on learning (n = 8), workplace transitions (n = 3), assessment (n = 2), education research environments (n = 1) and education committee decision‐making (n = 1). The bulk of these papers employed realist synthesis methodology (n = 12), with the remainder using realist evaluation (n = 3), including one realist economic evaluation. Samples for realist evaluations (including the realist economic evaluation) ranged from 32 to 115 participants (average of 72 participants), whereas samples for syntheses ranged from 22 to 141 articles (average of 63 articles per synthesis). All papers articulated their purposes in relation to program theory, with the majority stating their aims to refine (n = 11), develop (n = 11) and/or test (n = 7) program theory, with fewer citing their aims to describe (n = 1), glean (n = 1) and/or consolidate (n = 1) program theory, illustrating the different terminology used by authors. Nearly all papers (n = 14) cited relevant reporting standards (RAMESES or RAMESES II). 20 , 32 However, none cited Jackson and Kolla, 38 and of the six papers published in 2019 or later, none cited Gilmore et al. 13 See Box 2 for key realist analysis methods papers.
TABLE 1.
Overview and core features of HPER realist analysis (organised by evaluation and then synthesis and alphabetical)
| Author | Program | Design a | Sample | Theory purpose a | RAMESES cited b | Configurational realist analysis | Team‐reflexivity statement | CAQDAS | Graphical presentation | Analytical challenges as reported by authors |
|---|---|---|---|---|---|---|---|---|---|---|
| Lefroy et al., 2017 3 | Doctors' memorable ‘firsts’ across the preparedness for practice transition | Realist evaluation | 32 medical students & 13 junior doctors (interviews) & 70 junior doctors & trainees (focus groups) | Testing & developing middle‐range theory | No | Yes: CMOCs c identified; presented by 3 themes of memorable ‘firsts’ | Yes (p. 1041) | NVivo Version 10 | Yes: program model | Yes: trustworthiness of middle‐range theories could have been enhanced with respondent validation |
| Proctor et al., 2020 4 | Faculty development | Realist evaluation | 32 faculty development leads & members from 17 UK medical schools (interviews) | Gleaning, refining & consolidating program theory | Cites RAMESES II but does not say guidelines were followed | Yes: Cs, Ms & Os identified & presented individually; then 7 dominant CMOCs presented | No: But clear articulation of teamwork | NVivo Version 12 | Yes: initial program theory & final program theory | Yes: uncertainty about causal pathways between contexts, mechanisms & outcomes because multiple mechanisms may operate simultaneously; acknowledges that findings may not be generalisable |
| Rees et al., 2022 5 | Supervision training (brief & extended) | Realist economic evaluation | 43 learners (brief training; interviews), 25 learners (extended training: interviews & longitudinal audio‐diaries) & cost measurement | Developing, testing & refining economic program theory | Yes: RAMESES II | Yes: CMOCs identified & compared for brief & extended training; presented as comparisons (Os, costs, Ms, Cs), cost‐sensitive mechanisms & context, & cost sensitivity analysis | No: But clear articulation of teamwork | NVivo Version 12 for coding qualitative realist data | Yes: other (comparison of costs for original programs versus cost‐optimised programs) | Yes: challenges integrating realist & economic analyses while staying internally coherent with realist principles; inclusion of subjective qualitative outcome measures without objective quantitative data |
| Ajjawi et al., 2018 6 | Successful research environments | Realist synthesis | 42 articles | Developing modified program theory | Yes: RAMESES & RAMESES II | Yes: CMOCs identified, extracted & synthesised to DRs; d presented by 3 cross‐cutting mechanisms, & CMOCs | No: But clear articulation of teamwork | Not stated | Yes: initial program theory & modified program theory | Yes: inconsistencies identifying CMOCs; challenges presenting ‘messily entangled’ mechanisms (p. 941); challenges untangling CMOCs & identifying DRs from large dataset |
| Bansal et al., 2022 7 | Developing learners' person‐centredness | Realist review | 61 articles | Describing & refining program theory | Yes: RAMESES | Yes: CIMOCs e identified, extracted & synthesised; presented by 9 CIMOCs (DRs) & 5 outcome‐generating mechanisms | Yes (p. 490) | Microsoft Excel for data extraction & NVivo version 12 for coding | Yes: refined program theory | Yes: separating educational strategies from learner contexts to identify mechanisms |
| Brennan et al., 2017 8 | Appraisal of doctors | Realist review | 125 articles | Developing, testing & refining program theory | Yes: RAMESES | Yes: CMOCs identified & synthesised; presented by 3 mechanisms | No: But clear articulation of teamwork | NVivo Version 10 | Yes: program theory | Yes: limited rich data to develop, test & refine program theory & the authors were: ‘unable to fully and reliably configure the CMOs’ (p. 1008) |
| Cornett et al., 2020 48 | Scholarly experiences in medical education | Realist review | 28 articles | Developing modified program theory | Yes: RAMESES | Yes: 36 CMOCs identified & synthesised into 5 DRs; presented as supplementary materials | No: But clear articulation of teamwork | Not stated | Yes: modified program theory | Yes: the reviewed studies contained limited description about contexts & mechanisms, & lacked mid‐range theory inhibiting identification of causal pathways |
| Davies et al., 2018 49 | Self‐management support training | Realist synthesis | 44 articles | Refining initial rough theories | Cites RAMESES but does not say guidelines were followed | Yes: 7 CMOCs (‘theories’) generated from if/then configurations & mapped onto learning theory | No: But clear articulation of teamwork | NVivo Version 10 | Yes: other (summary graphic of links between findings & learning theory but not CMOCs or program theory) | Yes: lack of depth in describing interventions & context‐specific challenges limiting the scope for testing theory; synthesis better at theory building than testing; interpretive nature of realist analysis |
| Kehoe et al., 2016 50 | International medical graduates' transitions to host countries | Realist synthesis | 88 articles | Testing & refining program theory | Yes: RAMESES | Yes: CMOCs identified; presented by 3 contextual levels | No: But clear articulation of teamwork | Microsoft Excel | Yes: refined program theory | Yes: lack of objective outcome measures in available literature on which to base CMOCs; existing studies typically lacked rich description necessary to explore theory; overlap between mechanisms & outcomes; subjective nature of realist approach |
| Kent et al., 2017 51 | Pre‐registration interprofessional clinical education | Realist review | 30 articles | Developing, testing & refining program theory | Yes: RAMESES | Yes: Cs, Ms & Os identified & synthesised for each article, & then presented by 4 dominant CMOCs (‘theories’) | Yes (p. 914) | Not stated | No | Yes: several different interventions were reviewed, introducing complexity to data extraction & CMOC identification processes; researcher inexperience with realist methods |
| Price et al., 2021 43 | Doctor remediation | Realist review | 141 articles | Developing, refining & testing program theory | Yes: RAMESES | Yes: 29 CMOCs identified; presented by the main program theory stages | No: But clear articulation of teamwork | NVivo Version 12 | Yes: program theory of remediation | Yes: quality of the existing literature affected robustness of findings; interpretive nature of realist analysis |
| Richmond et al., 2020 52 | Developing clinical reasoning ability | Realist review | 28 articles | Developing & refining program theory | Yes: RAMESES | Yes: CMOCs identified; presented by 5 contexts | Yes (p. 710) | NVivo Version 12 & Microsoft Excel | Yes: CMOCs for individual student contexts & CMOCs for knowledge‐related contexts | Yes: focus on theory building because interventions & outcomes were under‐reported in the articles |
| Schumacher et al., 2023 53 | Prospective entrustment decision‐making | Realist synthesis | 52 articles | Developing & refining program theory | Yes: RAMESES | Yes: CMOCs identified & synthesised; presented as 8 DRs | No: But clear articulation of teamwork | Microsoft Excel | Yes: initial program theory & middle‐range theory | None |
| Sholl et al., 2017 54 | Balancing healthcare professional education & patient care delivery | Realist synthesis | 22 articles | Developing, testing, & refining program theory | Yes: RAMESES | Yes: 23 CMOCs identified & synthesised; presented as summaries of interventions, Os, Cs & Ms & top 3 most frequently mentioned CMOCs (DRs) | Yes (p. 797) | No: explicit statement about not using ATLAS.ti (deviation from original protocol) because of small sample & equally effective to code directly onto papers | Yes: refined program theory in supplementary online material for three interventions (continuing professional development, ward rounds & protected learning time) | Yes: complexities of the process of identifying CMOCs & DRs; recognition that not all types of literature (e.g., peer‐reviewed versus grey literature) leant themselves to realist coding |
| Wiese et al., 2018 55 | Supervised workplace learning | Realist synthesis | 90 articles | Developing realist theory of postgraduate medical education | Yes: RAMESES | Yes: CMOCs identified & DRs; presented as 3 processes, with their six underpinning mechanisms, triggered by contexts at 3 levels | Yes (p. 954 & p. 963) | Microsoft Excel | Yes: middle‐range realist theory of supervised workplace learning | Yes: synthesis is an interpretive/subjective process & others may synthesise the data differently |
Note: These papers were identified through a Web of Science search conducted in January 2024 using the search terms ‘Medical Education’ (for source) and ‘realist’ (for title), and each paper was synthesised and critiqued by two analysts (CER and DWP/VNBN/EO).
a We use the terms employed by authors wherever possible, noting that ‘review’ and ‘synthesis’ are synonyms. 10
c CMOC: Context‐Mechanism‐Outcome Configuration.
d DR: demi‐regularity.
e CIMOC: Context‐Intervention‐Mechanism‐Outcome Configuration.
BOX 2. Relevant realist analysis methods papers (in chronological order).
Jackson SF, Kolla G. A new realistic evaluation analysis method: Linked coding of context, mechanism, and outcome relationships. Am J Evaluation. 2012;33(3):339–349. 38
Gilmore B, McAuliffe E, Power J, et al. Data analysis and synthesis within a realist evaluation: toward more transparent methodological approaches. Int J Qual Methods. 2019;18:1–11. 13
Bergeron DA, Gaboury I. Challenges related to the analytical process in realist evaluation and latest developments on the use of NVivo from a realist perspective. Int J Soc Res Methodol. 2020;23(3):355–365. 37
Dalkin S, Forster N, Hodgson P, et al. Using computer assisted qualitative data analysis software (CAQDAS; NVivo) to assist in the complex process of realist theory generation, refinement and testing. Int J Soc Res Methodol. 2021;24(1):123–134. 30
Jackson SF, Poland B, Gloger A, Morgan GT. A realist approach to analysis in a participatory research project. Progress in Community Health Partnerships: Research, Education and Action. 2022;16(S2):91–97. 31
Putri AF, Chandler C, Tocher J. Realist approach to qualitative data analysis. Nurs Res. 2023;72(6):481–488. 25
Peters LA. An exploration of analytical tools to conduct a realist synthesis and demonstrate programme theory development: An example from a realist review. Research Methods in Medicine & Health Sciences. 2024; doi:10.1177/26320843231224807. 22
All papers employed realist configurational analysis, presenting CMOCs as findings (e.g., ‘Through participation in a research project within a culture of research support (C), students increase research competence and are more likely to pursue a research career (O) due to students feeling supported and more confident in their capacity to undertake and complete research projects (M)’: Table S2), 48 with six studies also using the term ‘demi‐regularities’ to represent recurring CMOCs (e.g., ‘When a medical school has a scholarly experience that is highly regarded by both the institution and staff (context), a well‐supported and structured scholarly experience (mechanism) leads to students developing ability and confidence in a wide range of research skills (outcome)’). 48, p. 162 In addition to CMOCs, four papers also presented summaries of contexts, mechanisms and outcomes separately. Interestingly, four studies privileged mechanisms as a way of organising the presentation of study findings (e.g., time, identity and relationships), 6 whereas three privileged contexts (e.g., organisational contextual factors, contextual factors of training and individual contextual factors). 50 All papers provided clear accounts of teamwork (often reported in the ‘contributor’ section of the paper), with six including team‐reflexivity statements articulating what the different backgrounds/expertise of team members brought to the studies and/or how this influenced the analytical process. All realist evaluations used computer‐assisted qualitative data analysis software (CAQDAS—all used NVivo). Four of the realist syntheses did not articulate using any analysis software, whereas the other syntheses employed NVivo (n = 3), Microsoft Excel (n = 3) or both (n = 2) to support their realist analysis. Most papers (n = 10) presented graphical representations of program theory, either initial or modified program theory, or both (sometimes as supporting information). Another paper included a graphical representation of CMOCs, and two papers presented other findings in figures. Only one paper did not present findings graphically. Finally, all papers bar one discussed at least one analytical challenge associated with their realist approaches (see Table 1). The three most common challenges articulated were (a) non‐realist data in realist syntheses lacking the richness that would ideally be available to identify CMOCs and build and/or test program theory (n = 7); (b) difficulties identifying contexts, mechanisms, outcomes and CMOCs in voluminous qualitative data (n = 5), with further synthesis into DRs (n = 2); and (c) the interpretive/subjective nature of realist analysis (n = 4).
4. COMMON REALIST ANALYSIS CHALLENGES AND MITIGATION STRATEGIES
In designing this paper, we agreed to write something that we would have appreciated reading before embarking on our own qualitative realist analysis journeys. So here, we illustrate the key challenges of realist analysis (based on the literature and our experiences), as well as mitigation strategies, in the hopes of assisting novice realist analysts. In developing this paper, we individually documented what we found most challenging about realist analysis from our case studies, 4 , 14 , 15 , 16 as well as our mitigation strategies, and then, we met to discuss our collective experiences and confirm what we would present in this section (see Figure 2 for our three key challenges and the mitigation strategies that collectively helped us address them). Interestingly, our three challenges align neatly with the three iterative phases of configurational analysis articulated above, that is, identifying CMOCs (coding), synthesising CMOCs into DRs (consolidation) and translating DRs into presentable program theory (mapping). In Table S1, we present examples of code evolution across these three iterative analysis phases for our illustrative cases. We suggest that researchers read this current methodology paper in conjunction with the original papers underpinning these case studies. 4 , 15 , 16 However, we do not hold up our case studies as flawless exemplars of qualitative realist analysis; rather, we showcase them to offer a look inside the qualitative realist analysis process, including our struggles.
FIGURE 2.

Our three key challenges and mitigation strategies for realist analysis.
4.1. Coding challenges and mitigation strategies
The literature has largely focused on the challenges associated with coding, suggesting that researchers can struggle to identify mechanisms, contexts and configurations. In terms of mechanisms, it is easy to confuse/conflate mechanisms with intervention strategies and/or contexts. 9 , 10 , 28 , 31 , 34 , 37 , 38 , 56 Regarding contexts, one can take too narrow a view of contexts and/or treat them in formulaic ways, for example, failing to explain why and how contexts and mechanisms interact to produce outcomes. 10 , 23 Concerning configurations, researchers have sometimes (erroneously) conducted thematic analysis rather than realist configurational analysis, identifying contexts, mechanisms and outcomes separately without going on to identify CMOCs. 9 , 10 , 34 Furthermore, researchers can sometimes develop CMOCs in interview data only partly based on interviewees' explicit comments, inferring connections where data about contexts/mechanisms are missing or thin. 31 , 38 We experienced many of these coding challenges in our case studies and attempted to address these with all three mitigation strategies (see Figure 2).
4.1.1. Case study 1: Clinical supervision training
We had voluminous secondary qualitative data from our realist synthesis of clinical supervision training interventions 14 and primary qualitative realist data evaluating a state‐wide clinical supervision training program. 15 , 16 In total, our data comprised 29 papers for our realist synthesis 14 and, for our realist evaluation, 76 realist interviews and 176 longitudinal audio‐diaries (LADs). 15 , 16 After much agonising, we eventually agreed as a team to identify C, M, O and CMOCs in our realist synthesis by annotating directly onto the articles and transferring those annotations into a Microsoft Word extraction table. Our team‐based configurational analysis led us to identify 139 unique CMOCs. Given that the original studies were not realist in nature (none had employed realist evaluation), we sometimes found it hard to identify C, M, O and CMOCs, especially determining how contexts triggered mechanisms (and thus, we treated context in a relatively simplistic/formulaic way in our synthesis program theory as broad and non‐specific ‘discipline’ and ‘organisation’ contexts). 14 Furthermore, configurational aspects identified in these 29 papers could be intervention components in one chunk of data, but mechanisms in another. 14
Subsequently, through our team‐based analysis of our interview and LAD data for our evaluation, we coded C, M, O and CMOCs in NVivo (version 12) using a coding framework (based partly on our synthesis, and partly on our initial analysis of a sub‐set of data). We used memos to annotate the development, testing and refinement of program theory, which also enabled shared analytical thinking between our multiple analysts. As above, we identified some configurational elements (e.g., increased confidence) that could be an outcome in one chunk of data, but a mechanism in another. Interestingly, one peer reviewer later asked if we had conflated one of our mechanisms (i.e., mixed pedagogies) with intervention components. 15 However, we argued that the ways participants spoke about ‘mixed pedagogies’ indicated mechanism components of resource and reasoning, as suggested by Dalkin et al. 27 (e.g., ‘it was the way it was presented but also the freedom to tell stories from other people's work experiences as well [+M]’). 15, p. 1207 Oftentimes our a priori CMOCs did not quite fit our data; for example, the data may have possessed similar dyads (e.g., CM or MO), but the third configurational element did not fit, making it challenging to document the refinement of the CMOCs in NVivo. It was also time‐consuming to code full CMOCs across lengthy transcripts in NVivo or across different transcripts (e.g., a participant may have provided a partial CM or MO during a LAD entry, then provided the missing configurational element in follow‐up email correspondence with the facilitator). Despite our progressive focusing related to our initial program theory, we still struggled with the time‐consuming nature (and thus tedious process) of coding large numbers of CMOCs (and their variations including valence) and keeping track of such volume in our NVivo project with multiple analysts, as has been found by other researchers. 13 , 22 , 31 , 38 Furthermore, NVivo sometimes lacked the functionality required to track and capture the evolution of CMOCs and their variations in voluminous qualitative data. 22
The interpretive coding challenges were addressed (at least in part) by our team‐based approach to analysis, with various rounds of feedback/checking data interpretations by individual analysts, as well as regular team discussions for shared decision‐making. Although our coding in NVivo was not without its problems (as alluded to above), NVivo enabled us to manage our team‐based realist analysis, with clear communication between our analysts through coding and memos, and provided us with an audit trail for our messy/complex analyses. 13 Addressing the shortcomings of NVivo, as indicated by Peters, 22 we also extracted NVivo analytical outputs into Microsoft Word and Excel documents, which evolved through the analytic process to aid shared analysis and interpretations. These working documents included agreed definitions/explanations of terms (including illustrative quotes), exemplar CMOCs in the order of CIOM (with valence for O and M if needed) and visual representations of program theory.
4.1.2. Case study 2: Faculty development
Although we conducted a modest study compared with Case 1, involving a smaller team (with a single Masters researcher leading the data collection and analysis), our realist analysis employing NVivo (version 12) was still challenging. During our rapid synthesis of the literature to develop the initial program theory (IPT), the heterogeneous nature of the social dynamics at play in faculty development programs quickly became apparent. Even in this early stage of the realist evaluation, we found it challenging to present complex theory and determine whether a concept represented a context or a mechanism. We largely mitigated this through reflexivity and regular team discussions to reach group consensus, based on repeated engagement with published definitions of terms (context, mechanism and outcome) and other methodology literature. Specifically, we found Dalkin and colleagues' suggestion to think about the constituent parts of the mechanism (distinguishing between the resources offered by the intervention and participant responses) extremely helpful for differentiating between context, intervention and mechanism. 27 However, the existence of multiple published definitions subsequently presented another conundrum in determining the most relevant set of definitions to apply during our theory‐building process. We found reflexivity and team discussions helpful here too, since we reviewed multiple possible definitions before making decisions as a team. 57 We also received helpful encouragement from the realist community to interpret definitions flexibly to avoid them becoming overly restrictive during the retroductive theorising process. Although we derived several of the CMOCs from realist literature on faculty development, 41 others relied on non‐realist literature, giving rise to partial evidencing of CMOCs (CMs or MOs), or were based on our team's personal experience. To clarify the origin of these CMOCs, we produced a diagrammatic representation of the IPT with solid arrows denoting CMOCs derived from existing literature and dotted arrows representing additional CMOCs primarily derived from our team discussions.
4.2. Consolidation challenges and mitigation strategies
The literature has suggested that researchers sometimes identify too many unique CMOCs in their data, 31 , 38 therefore making it hard to recognise patterns such as DRs. Indeed, in 11 interviews (6.5 hours of data), Jackson and Kolla 38, p. 343 report identifying 46 context codes, 58 mechanism codes and 24 outcome codes and allude to having hundreds of CMOCs, stating that ‘most of the linked strings of codes were unique’. We also experienced this challenge in our case studies, and we addressed these with all three mitigation strategies.
4.2.1. Case study 1: Clinical supervision training
As mentioned above, we identified 139 unique CMOCs in 29 articles as part of our realist synthesis. 14 We struggled to reduce this large number of CMOCs into a more manageable number of DRs to make sense of our findings and developing program theory. This was especially the case as individual CMOCs continued to evolve across the processes of data collection and analysis. However, to make sense of (and reduce) this large number of CMOCs, we first divided the CMOCs into two categories: those pertaining to brief (n = 87) or extended‐duration (n = 52) training interventions. Then, as a team, we synthesised the CMOCs for brief interventions into 10 DRs, and those for extended‐duration interventions into 6 DRs, by asking ourselves questions such as: Is this CMOC found elsewhere in the data? How does this CMOC develop our program theory?
Likewise, we synthesised our 120 unique CMOCs in our realist evaluation data using the matrix coding query function in NVivo to examine similarities, differences and patterns in our data, especially looking at coding dominance to determine DRs. 15 For example, NVivo enabled us to bundle together multiple outcomes for the same/similar mechanisms, which allowed us to identify dominant CMOCs (and thus DRs) through reviewing coding counts. We also employed sets and the matrix coding query function to explore the relationships between DRs by context (e.g., profession, level of supervision experience and workplace supervision culture), to further explicate how contexts could trigger outcome‐generating mechanisms differently. 16 This consolidation also focused on refining the DRs identified in our realist synthesis, 14 as well as developing new DRs. The process involved considerable teamwork, as well as the use of texts (e.g., memos), figures (e.g., program theory) and tables (e.g., CMOCs) to make sense of the data, continuing into the writing and revising of the results sections of our papers.
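For readers unfamiliar with matrix coding queries, the following minimal sketch (ours, with invented data; pandas is simply a stand‐in for NVivo's functionality here) reproduces the underlying idea: cross‐tabulating coded CMOC labels against a contextual attribute to surface coding counts and candidate dominant patterns.

```python
import pandas as pd

# Hypothetical coding export: one row per coded CMOC instance.
df = pd.DataFrame({
    "participant": ["p01", "p02", "p03", "p04", "p05", "p06"],
    "profession": ["nursing", "medicine", "nursing",
                   "allied health", "medicine", "nursing"],
    "cmoc_label": ["protected time", "protected time", "facilitator competence",
                   "protected time", "facilitator competence", "protected time"],
})

# Analogue of an NVivo matrix coding query: counts of each CMOC label by
# context attribute, to see whether mechanisms fire differently by profession.
matrix = pd.crosstab(df["cmoc_label"], df["profession"])
print(matrix)

# Overall dominance across contexts can suggest candidate demi-regularities,
# subject to the team's interpretive judgement (counts alone do not decide).
print(matrix.sum(axis=1).sort_values(ascending=False))
```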
4.2.2. Case study 2: Faculty development
During our 32 realist interviews with faculty members and faculty development leads, we identified a total of 75 unique CMOCs. We later refined these into seven dominant CMOCs in our final program theory during monthly team meetings. In retrospect, we could have described these as DRs, but we did not name them as such in our paper. We employed the NVivo ‘Maps’ function to iteratively produce a diagrammatic representation of the developing theory, including a record of which study participants had provided data to support specific CMOCs (or partial CMOCs). We used this record to identify aspects of the developing theory requiring additional refinement and/or consolidation in subsequent interview phases 21 and to indicate when we had sufficient evidence for a given part of the theory. When consolidating aspects of the program theory, iteratively modifying this diagrammatic form allowed for a clear representation of instances where developing CMOCs had been adapted through their various iterations and where we required greater clarity of thinking to distinguish between contexts, mechanisms and outcomes. We also relied on reflexivity and team discussions to ensure the developing CMOCs adequately reflected participant contributions and to regularly review key concepts in the developing theory. However, our diagrammatic representation of theory was not used quantitatively to identify DRs.
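A simple evidence ledger captures the spirit of this record‐keeping. The sketch below is our hypothetical illustration (the CMOC labels, participant IDs and, in particular, the numeric sufficiency threshold are invented assumptions, not a published rule): it maps each developing CMOC to the participants who evidenced it and flags parts of the theory needing further interviews.

```python
# Hypothetical evidence ledger: developing CMOC label -> participants evidencing it.
support: dict[str, set[str]] = {
    "peer networks build confidence": {"p01", "p04", "p09"},
    "protected time enables engagement": {"p02"},
}

SUFFICIENT = 3  # illustrative team-agreed threshold, not a published rule

for label, participants in support.items():
    status = ("sufficient evidence" if len(participants) >= SUFFICIENT
              else "needs further interviews")
    print(f"{label}: n={len(participants)} -> {status}")
```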
In cases where CMOCs were not substantiated during configurational analysis, we moved them to a separate section of the ‘Map’. A clear record was beneficial in cases where participants suggested major amendments to the program theory late in the refinement phase, as we were able to cross‐check these suggestions against data collected from other study participants. This diagrammatic approach helped to mitigate the risk of important insights getting lost in the process, as did avoiding prolonged time intervals between interviews wherever possible. This ensured that we revisited key concepts during our team discussions and limited the need for re‐familiarisation with the developing program theory.
4.3. Mapping challenges and mitigation strategies
The literature has discussed how researchers can struggle to demonstrate/document analytical processes, program theory development and the relationship between the two. 22 Researchers can also find it difficult to communicate rich and complex configurations, DRs and program theory in transparent, clear and parsimonious ways. 9 , 39 We experienced these challenges in our case studies and addressed these again with all three strategies.
4.3.1. Case study 1: Clinical supervision training
As found in the literature, we struggled to keep track of our developing, testing and refining of program theory throughout our coding and consolidation phases. We also struggled to present our synthesised DRs in relation to our program theory in journals with limited word counts (despite the opportunity for online supplementary materials), as well as to readers with limited understandings of realist approaches and generative causality. Through multiple rounds of reviewer feedback for different iterations of our papers (sometimes at different journals if a paper was rejected), we continuously grappled with decision‐making over the optimal way to present our results so that our refined program theory (the core aspect of our study findings) met the three quality criteria—consilience, simplicity, and analogy—outlined by Duddy and Wong. 10 In each of our papers, we presented our key DRs in relation to the development, testing and refinement of program theory (i.e., we did not present all CMOCs), and we consistently articulated our CMOCs in an order (hopefully) most easily understandable to the reader. We also presented our program theory through figures: all three of our papers included visual representations of our initial and modified program theory. All these mapping processes were enabled through regular team discussions including reflexivity, use of NVivo and other analytical tools including graphical representations of our program theory.
4.3.2. Case study 2: Faculty development
We also found it challenging during the project to present a complex final program theory in a way that was adequately succinct to satisfy journal formatting criteria (e.g., limited word count and figures/tables) but also sufficiently detailed to clarify the meaning behind the CMOCs to a readership with potentially limited experience of realist methods. We adopted several measures to achieve this balance. For example, we produced visual summaries of our initial and final program theory using the iteratively modified program theory maintained during the analysis process, allowing for direct comparison. We also included a table with specific written examples of each CMOC, with contexts, mechanisms and outcomes labelled in parentheses. Like Papoutsi and colleagues, 58 and as in Case study 1, we found that writing the manuscript acted as the final part of the analysis process, helping us to fine‐tune our interpretations and reconsider the meaning and relevance of different components of the program theory. We also sought feedback from external experts about appropriate use of terminology and methodology.
5. CONCLUSIONS
In this Focus on Research Methods paper, we build on our previous realist interviewing article 11 to better explicate realist analysis to HPER readers. Firstly, we have outlined four fundamentals of realist analysis: (1) simultaneous data collection and analysis; (2) retroductive theorising; (3) configurational analysis (involving three iterative phases of identifying CMOCs, synthesising CMOCs into DRs and translating DRs into program theory); and (4) realist analysis quality (relevance, rigour, richness).

Secondly, we have provided a critical analysis of the qualitative realist analysis processes employed in three realist evaluations and 12 realist syntheses published in Medical Education. Relevant to their qualitative realist analyses, we have presented these studies' program interventions, designs, samples, purposes in relation to program theory (e.g., building, testing and/or refining), whether they cited reporting standards, 20 , 32 how they conducted their configurational analyses and whether they included team‐reflexivity, employed CAQDAS, provided graphical representations of their realist analysis findings and reported analytical challenges. Collectively, these studies possessed numerous strengths pertaining to their analyses, including their sufficient qualitative samples; clear articulations of program theory development, testing and/or refinement as per relevant RAMESES/RAMESES II reporting standards; evidence of configurational analyses; use of CAQDAS; and graphical representations of analysis findings. However, despite the interpretive nature of realist analyses, few HPER studies included team‐reflexivity statements, and none cited more recent articles on realist analysis from the broader realist methodology literature (as presented in Box 2), possibly because most of these have been published only in the last 5 years and none in HPER journals. Our critical analysis therefore suggests that realist analysts in HPER could do more to draw on (and explicitly cite) the realist analysis methods literature and high‐quality examples of studies involving realist analysis, as well as making clearer how their team members' backgrounds/expertise influenced their realist analytical processes.

Finally, based on the methodology literature, and our own experiences of realist analyses, we have outlined common realist analytical challenges (associated with the coding, consolidation and mapping phases of configurational analysis), as well as key mitigation strategies (teamwork, reflexivity and consultation, use of multiple forms of CAQDAS and graphical representations of program theory). Altogether, we have learnt how to conduct realist analysis better, partly based on the literature (including grey literature such as the RAMESES jiscmail) but also through experiential trial and error with high‐quality and generous peer‐review feedback from realist scholars. At a time when realist approaches are likely to grow further in HPER, we hope this paper will aid novice realist analysts to better critique, conduct and report their realist analyses and open up further dialogue within the HPER field about realist methodologies. We therefore challenge HPER scholars to continuously improve the quality of their realist analysis approaches through the recommendations in Box 3.
We also hope that this article will help journal peer‐reviewers and editors to evaluate realist analyses, accommodate the complexity of realist analysis and the presentation of findings and provide constructive developmental feedback for improvement, aligned with the latest methodological thinking.
BOX 3. Recommendations for improving realist analysis in HPER.
• HPE scholars should understand (and apply) the fundamentals of realist analysis.
• HPE researchers should consider realist syntheses and/or evaluations with realist analysis when answering research questions about whether interventions work (or not), for whom and under what circumstances, how and why.
• Realist scholars should draw on relevant RAMESES quality and reporting standards, key realist analysis methods outputs and high‐quality examples of studies involving realist analysis.
• HPE scholars should understand common realist analysis challenges (e.g., coding, consolidation and mapping).
• HPE scholars should also build mitigation strategies into their study designs, including teamwork, reflexivity and consultation, use of qualitative data analysis software (CAQDAS) and graphical representations of program theory.
AUTHOR CONTRIBUTIONS
Charlotte E. Rees: Conceptualization; writing—original draft; writing—review and editing; formal analysis; supervision. Dominic W. Proctor: Conceptualization; writing—review and editing; writing—original draft; formal analysis. Van N. B. Nguyen: Conceptualization; writing—review and editing; writing—original draft; formal analysis. Ella Ottrey: Conceptualization; writing—review and editing; writing—original draft; formal analysis. Karen L. Mattick: Conceptualization; writing—review and editing; writing—original draft; supervision.
CONFLICTS OF INTEREST STATEMENT
None.
ETHICS STATEMENT
Ethics approval was not required for this paper; however, the necessary ethics approvals were in place for case studies 1 and 2 presented here (see the original papers for further ethics details).
SUPPORTING INFORMATION
Table S1. Examples of code evolution across the realist analysis for the illustrative cases.
ACKNOWLEDGMENTS
We would like to thank our co‐authors for the cases illustrated in this paper. For case study 1, we thank (in alphabetical order) Corinne Davis, Charlotte Denniston, Vicki Edouard, Eve Huang, Sarah Lee, Claire Palermo, Kirsty Pope, Keith Sutton, Susan Waller and Bernadette Ward, Monash University, Australia. For case study 2, we thank David Leeder, University of Exeter, UK. Note that our experiences of data analysis may not necessarily represent those of our co‐authors. Thanks also to Geoff Wong, Associate Professor of Primary Care, University of Oxford, for his constructive feedback on an earlier draft of this paper. Open access publishing facilitated by The University of Newcastle, as part of the Wiley ‐ The University of Newcastle agreement via the Council of Australian University Librarians.
FUNDING INFORMATION
None.
DATA AVAILABILITY STATEMENT
Data sharing not applicable to this article as no datasets were generated or analysed during the current study.
REFERENCES
1. Astbury B, Leeuw FL. Unpacking black boxes: mechanisms and theory building in evaluation. Am J Eval. 2010;31(3):363-381. doi: 10.1177/1098214010371972
2. Shearn K, Allmark P, Piercy H, Hirst J. Building realist program theory for large complex and messy interventions. Int J Qual Methods. 2017;16(1):1-11. doi: 10.1177/1609406917741796
3. Lefroy J, Yardley S, Kinston R, Gay S, McBain S, McKinley R. Qualitative research using realist evaluation to explain preparedness for doctors' memorable 'firsts'. Med Educ. 2017;51(10):1037-1048. doi: 10.1111/medu.13370
4. Proctor D, Leeder D, Mattick K. The case for faculty development: a realist evaluation. Med Educ. 2020;54(9):832-842. doi: 10.1111/medu.14204
5. Rees CE, Foo J, Nguyen VNB, et al. Unpacking economic programme theory for supervision training: preliminary steps towards realist economic evaluation. Med Educ. 2022;56(4):407-417. doi: 10.1111/medu.14701
6. Ajjawi R, Crampton PES, Rees CE. What really matters for successful research environments? A realist synthesis. Med Educ. 2018;52(9):936-950. doi: 10.1111/medu.13643
7. Bansal A, Greenley S, Mitchell C, Park S, Shearn K, Reeve J. Optimising planned medical education strategies to develop learners' person-centredness: a realist review. Med Educ. 2022;56(5):489-503. doi: 10.1111/medu.14707
8. Brennan N, Bryce M, Pearson M, Wong G, Cooper C, Archer J. Towards an understanding of how appraisal of doctors produces its effects: a realist review. Med Educ. 2017;51(10):1002-1013. doi: 10.1111/medu.13348
9. Rees CE, Crampton PES, Nguyen VNB, et al. Introducing realist approaches in health professions education research. In: Rees CE, Monrouxe LV, O'Brien BC, et al., eds. Foundations in Health Professions Education Research: Principles, Perspectives & Practices. Wiley Blackwell; 2023:102-121.
10. Duddy C, Wong G. Grand rounds in methodology: when are realist reviews useful, and what does a 'good' realist review look like? BMJ Qual Saf. 2023;32(3):173-180. doi: 10.1136/bmjqs-2022-015236
11. Rees CE, Davis C, Nguyen VNB, Proctor D, Mattick KL. A roadmap to realist interviews in health professions education research: recommendations based on a critical analysis. Med Educ. 2024;58(6):697-712. doi: 10.1111/medu.15270
12. Jagosh J. Retroductive theorizing in Pawson and Tilley's applied scientific realism. J Crit Realism. 2020;19(2):121-130. doi: 10.1080/14767430.2020.1723301
13. Gilmore B, McAuliffe E, Power J, Vallières F. Data analysis and synthesis within a realist evaluation: towards more transparent methodological approaches. Int J Qual Methods. 2019;18:1-11. doi: 10.1177/1609406919859754
14. Rees CE, Lee SL, Huang E, et al. Supervision training in healthcare: a realist synthesis. Adv Health Sci Educ. 2020;25(3):523-561. doi: 10.1007/s10459-019-09937-x
15. Nguyen VNB, Rees CE, Ottrey E, et al. What really matters for supervision training workshops? A realist evaluation. Acad Med. 2022;97(8):1203-1212. doi: 10.1097/ACM.0000000000004686
16. Rees CE, Nguyen VNB, Ottrey E, et al. The effectiveness of extended-duration supervision training for nurses and allied health professionals: a realist evaluation. Nurse Educ Today. 2022;110:105225. doi: 10.1016/j.nedt.2021.105225
17. Ellaway RH, Kehoe A, Illing J. Critical realism and realist inquiry in medical education. Acad Med. 2020;95(7):984-988. doi: 10.1097/ACM.0000000000003232
18. Mukumbang FC, De Souza DE, Eastwood JG. The contributions of scientific realism and critical realism to realist evaluation. J Crit Realism. 2023;22(3):504-524. doi: 10.1080/14767430.2023.2217052
19. Wong G, Greenhalgh T, Westhorp G, Pawson R. Realist methods in medical education research: what are they and what can they contribute? Med Educ. 2012;46(1):89-96. doi: 10.1111/j.1365-2923.2011.04045.x
20. Wong G, Westhorp G, Manzano A, Greenhalgh J, Jagosh J, Greenhalgh T. RAMESES II reporting standards for realist evaluations. BMC Med. 2016;14(1):96. doi: 10.1186/s12916-016-0643-1
21. Manzano A. The craft of interviewing in realist evaluation. Evaluation. 2016;22(3):342-360. doi: 10.1177/1356389016638615
22. Peters LA. An exploration of analytical tools to conduct a realist synthesis and demonstrate programme theory development: an example from a realist review. Res Methods Med Health Sci. 2024;63. doi: 10.1177/26320843231224807
23. Greenhalgh J, Manzano A. Understanding 'context' in realist evaluation and synthesis. Int J Soc Res Methodol. 2022;25(5):583-595. doi: 10.1080/13645579.2021.1918484
24. Pawson R. Theorizing the interview. Br J Sociol. 1996;47(2):295-314. doi: 10.2307/591728
25. Putri AF, Chandler C, Tocher J. Realist approach to qualitative data analysis. Nurs Res. 2023;72(6):481-488. doi: 10.1097/NNR.0000000000000686
26. Dada S, Dalkin S, Gilmore B, Hunter R, Mukumbang FC. Applying and reporting relevance, richness and rigour in realist evidence appraisals: advancing key concepts in realist reviews. Res Synth Methods. 2023;14(3):504-514. doi: 10.1002/jrsm.1630
27. Dalkin SM, Greenhalgh J, Jones D, Cunningham B, Lhussier M. What's in a mechanism? Development of a key concept in realist evaluation. Implement Sci. 2015;10(1):49. doi: 10.1186/s13012-015-0237-x
28. Bygstad B, Munkvold BE, Volkoff O. Identifying generative mechanisms through affordances: a framework for critical realist data analysis. J Inf Technol. 2016;31(1):83-96. doi: 10.1057/jit.2015.13
29. Greenhalgh T, Pawson R, Wong G, Westhorp G, Greenhalgh J, Manzano A, et al. Retroduction in realist evaluation. The RAMESES II Project; 2017. www.ramesesproject.org
30. Dalkin S, Forster N, Hodgson P, Lhussier M, Carr SM. Using computer-assisted qualitative data analysis software (CAQDAS: NVivo) to assist in the complex process of realist theory generation, refinement and testing. Int J Soc Res Methodol. 2021;24(1):123-134. doi: 10.1080/13645579.2020.1803528
31. Jackson SF, Poland B, Gloger A, Morgan GT. A realist approach to analysis in a participatory research project. Prog Community Health Partnersh. 2022;16(S2):91-97. doi: 10.1353/cpr.2022.0043
32. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: realist syntheses. BMC Med. 2013;11(1):21. doi: 10.1186/1741-7015-11-21
33. Marchal B, van Belle S, van Olmen J, Hoeree T, Kegels G. Is realist evaluation keeping its promise? A review of published empirical studies in the field of health systems research. Evaluation. 2012;18(2):192-211. doi: 10.1177/1356389012442444
34. Marchal B, Kegels G, Van Belle S. Theory and realist methods. In: Emmel N, Greenhalgh J, Manzano A, Monaghan M, Dalkin S, eds. Doing Realist Research. Sage; 2018:80-89. doi: 10.4135/9781526451729.n6
35. Mukumbang FC, Marchal B, Van Belle S, van Wyk B. Using the realist interview approach to maintain theoretical awareness in realist studies. Qual Res. 2020;20(4):485-515. doi: 10.1177/1468794119881985
36. Bonell C, Warren E, Melendez-Torres GJ. Methodological reflections on using qualitative research to explore the causal mechanisms of complex health interventions. Evaluation. 2022;28(2):166-181. doi: 10.1177/13563890221086309
37. Bergeron DA, Gaboury I. Challenges related to the analytical process in realist evaluation and latest developments on the use of NVivo from a realist perspective. Int J Soc Res Methodol. 2020;23(3):355-365. doi: 10.1080/13645579.2019.1697167
38. Jackson SF, Kolla G. A new realistic evaluation analysis method: linked coding of context, mechanism, and outcome relationships. Am J Eval. 2012;33(3):339-349. doi: 10.1177/1098214012440030
39. De Weger E, Van Vooren NJE, Wong G, et al. What's in a realist configuration? Deciding which causal configurations to use, how, and why. Int J Qual Methods. 2020;19:1-8. doi: 10.1177/1609406920938577
40. Astbury B. Some reflections on Pawson's science of evaluation: a realist manifesto. Evaluation. 2013;19(4):383-401. doi: 10.1177/1356389013505039
41. Onyura B, Ng SL, Baker LR, Lieff S, Millar B-A, Mori B. A mandala of faculty development: using theory-based evaluation to explore contexts, mechanisms and outcomes. Adv Health Sci Educ. 2017;22(1):165-186. doi: 10.1007/s10459-016-9690-9
42. Westhorp G. Realist Impact Evaluation: An Introduction. Overseas Development Institute; 2014.
43. Price T, Wong G, Withers L, et al. Optimising the delivery of remediation programmes for doctors: a realist review. Med Educ. 2021;55(9):995-1010. doi: 10.1111/medu.14528
44. Rycroft-Malone J, McCormack B, Hutchinson AM, et al. Realist synthesis: illustrating the method for implementation research. Implement Sci. 2012;7(1):33. doi: 10.1186/1748-5908-7-33
45. Wong G. Data gathering in realist reviews: looking for needles in haystacks. In: Emmel N, Greenhalgh J, Manzano A, Monaghan M, Dalkin S, eds. Doing Realist Research. Sage; 2018:131-146. doi: 10.4135/9781526451729.n9
46. Konopasky A, Varpio L, Stalmeijer RE. The potential of narrative analysis for HPE research: highlighting five analytic lenses. Med Educ. 2021;55(12):1369-1375. doi: 10.1111/medu.14597
47. Park S, Griffin A, Gill D. Working with words: exploring textual analysis in medical education research. Med Educ. 2012;46(4):372-380. doi: 10.1111/j.1365-2923.2011.04184.x
48. Cornett M, Palermo C, Wallace MJ, Diug B, Ward B. A realist review of scholarly experiences in medical education. Med Educ. 2021;55(2):159-166. doi: 10.1111/medu.14362
49. Davies F, Wood F, Bullock A, Wallace C, Edwards A. Shifting mindsets: a realist synthesis of evidence from self-management support training. Med Educ. 2018;52(3):274-287. doi: 10.1111/medu.13492
50. Kehoe A, McLachlan J, Metcalf J, Forrest S, Carter M, Illing J. Supporting international medical graduates' transition to their host country: realist synthesis. Med Educ. 2016;50(10):1015-1032. doi: 10.1111/medu.13071
51. Kent F, Hayes J, Glass S, Rees CE. Pre-registration interprofessional clinical education in the workplace: a realist review. Med Educ. 2017;51(9):903-917. doi: 10.1111/medu.13346
52. Richmond A, Cooper N, Gay S, Atiomo W, Patel R. The student is key: a realist review of educational interventions to develop analytical and non-analytical clinical reasoning ability. Med Educ. 2020;54(8):709-719. doi: 10.1111/medu.14137
53. Schumacher DJ, Michelson C, Winn AS, Turner DA, Martini A, Kinnear B. A realist synthesis of prospective entrustment decision making by entrustment or clinical competency committees. Med Educ. 2024;58(7):812-824. doi: 10.1111/medu.15296
54. Sholl S, Ajjawi R, Allbutt H, et al. Balancing health care education and patient care in the UK workplace: a realist synthesis. Med Educ. 2017;51(8):787-801. doi: 10.1111/medu.13290
55. Wiese A, Kilty C, Bennett D. Supervised workplace learning in postgraduate training: a realist synthesis. Med Educ. 2018;52(9):951-969. doi: 10.1111/medu.13655
56. Westhorp G. Understanding mechanisms in realist research. In: Emmel N, Greenhalgh J, Manzano A, Monaghan M, Dalkin S, eds. Doing Realist Research. Sage; 2018:41-58. doi: 10.4135/9781526451729.n4
57. Pawson R, Tilley N. Realistic Evaluation. SAGE Publications; 1997.
58. Papoutsi C, Mattick K, Pearson M, Brennan N, Briscoe S, Wong G. Social and professional influences on antimicrobial prescribing for doctors-in-training: a realist review. J Antimicrob Chemother. 2017;72(9):2418-2430. doi: 10.1093/jac/dkx194