Abstract
Introduction:
Learning by concordance (LbC) is an educational approach designed to develop expertise, particularly clinical reasoning (CR), among healthcare professionals. It builds on the script concordance test (SCT) by adding feedback drawn from expert responses. The objective of this study was to map the scientific literature on this rapidly growing learning approach.
Methods:
A scoping review was conducted following the Arksey and O’Malley framework and the PRISMA-ScR guidelines. A systematic search was conducted in MEDLINE, Embase, CINAHL, Web of Science and Google Scholar up to March 10, 2025. Eligible primary studies had to focus on the LbC approach targeting healthcare learners.
Results:
Twenty-eight studies met the inclusion criteria: twenty focused on the implementation of the LbC approach and eight on its development. Most studies used mixed methods (quantitative and qualitative). The results mainly indicate that learners perceive the LbC approach as engaging and beneficial for decision-making. The articles identify five elements of LbC development that appear to contribute to its success.
Discussion:
The LbC approach could be applicable to a wide range of disciplines and learning levels. Variability in the procedures used to develop the approach, as well as in the objectives and methodologies of the studies, limits the comparability of results.
Conclusion:
LbC is a promising approach for promoting decision-making skills in a variety of uncertain clinical contexts. Adopting standardized development and evaluation frameworks for this approach could improve its applicability, effectiveness and reproducibility.
Introduction
Clinical expertise in healthcare professions is intrinsically linked to the ability to make informed decisions in contexts of uncertainty [1,2,3,4,5]. Indeed, healthcare professionals must make decisions to solve clinical problems that are often ill-structured [6,7,8], i.e., problems for which there is uncertainty as to how to solve them [9,10]. Despite the central role of uncertainty in clinical decision-making, the most effective educational strategies for training healthcare professionals to manage it remain unclear [11,12,13,14]. Over the past decades, research in health professions education has sought to optimize the training of clinical reasoning (CR) by integrating strategies that improve decision-making skills amid uncertainty [15,16,17,18]. Training in CR must incorporate strategies that explicitly address uncertainty, as clinical problems often do not have a single correct solution and can be resolved by several valid approaches [19,20,21,22,23]. Developing training approaches that foster decision-making in contexts of uncertainty is therefore essential, as this skill lies at the heart of clinical expertise.
The script concordance test (SCT) is a CR assessment tool that considers the individual’s ability to make choices and decisions in uncertain situations [24]. It has been widely studied in several healthcare fields [25,26,27,28,29]. The development of the SCT is based on script theory, which postulates that healthcare professionals organize, store and access their clinical knowledge notably through “illness scripts”. These scripts, which develop with clinical experience, enable healthcare professionals to recognize patterns in clinical data that are associated with health conditions, and to make decisions about patient care [25,26,30,31]. The SCT evaluates these scripts by presenting learners with vignettes depicting situations involving uncertainty; new data are then added, and learners answer Likert-type multiple-choice questions on the plausibility or relevance of a hypothesis or proposed action in light of these data [32]. To quantify respondents’ performance, the SCT weights responses according to their degree of concordance with the answers of a reference panel, usually made up of people experienced in the field under study. A learner thus receives one point for a concordance question when his or her answer matches that of the majority of panelists, a fraction of a point for an answer that matches that of a minority of panelists, and zero points when no panelist gave that answer [33].
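To make the scoring rule concrete, the sketch below implements aggregate scoring for a single concordance question, assuming the credit fraction is the number of panelists who chose the learner’s answer divided by the modal count; the panel data and function names are hypothetical, not taken from any published SCT implementation:

```python
from collections import Counter

def sct_item_score(learner_answer: str, panel_answers: list[str]) -> float:
    """Score one concordance question against a reference panel:
    1 point for the modal panel answer, count/modal_count for a
    minority answer, and 0 for an answer no panelist gave."""
    counts = Counter(panel_answers)
    modal_count = max(counts.values())
    return counts.get(learner_answer, 0) / modal_count

# Hypothetical panel of 10 experts rating a hypothesis on a Likert-type scale
panel = ["-1"] + ["0"] * 2 + ["+1"] * 7
print(sct_item_score("+1", panel))  # 1.0  (matches the majority answer)
print(sct_item_score("0", panel))   # ~0.29 (minority answer: 2/7)
print(sct_item_score("-2", panel))  # 0.0  (no panelist gave this answer)
```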
Learning-by-concordance (LbC) is a learning approach directly derived from the SCT: it uses the same format, except that it presents respondents with the panelists’ answers and explanations for the concordance questions, with the aim of providing learners with elaborate formative feedback [34]. Borrowing from the SCT, LbC training gives learners the opportunity to practice making decisions on diagnostic or action hypotheses in a context of uncertainty, and to receive feedback enabling them to compare their decision-making with that of a reference panel [34]. This innovative learning approach is increasingly used, albeit heterogeneously, in different domains of the healthcare professions [35,36,37]. In 2021, guidelines were proposed to support the design of training courses based on the LbC approach [34]. However, there is currently no comprehensive review of the empirical application and evaluation of the LbC approach in health professions education. Moreover, the evidence for its learning benefits and the elements that promote meaningful learning with this approach remain unclear.
A scoping review is a rigorous and appropriate method for exploring an emerging field of research and mapping the available literature on a given topic [38,39]. Unlike a systematic review, which aims to assess the effectiveness of an intervention, a scoping review examines the diversity of conceptual frameworks and methodological approaches used in the existing literature. This approach is therefore particularly suitable for exploring how LbC is studied and applied in the training of healthcare professionals. It allows for the identification of key determinants that influence the adoption of LbC and highlights the methodological and conceptual gaps that need to be explored in greater depth in future research.
Objectives
LbC training is an approach that is gaining importance in the literature on strategies for developing expertise in the healthcare professions, yet its literature has not been comprehensively reviewed. This study aims to fill that gap by conducting a scoping review of existing research on LbC, mapping its scope and nature, identifying the methodological frameworks used, and pinpointing gaps in research design and theoretical foundations to inform future research.
Methods
Methodological framework
This scoping review was conducted in accordance with the methodological framework established by Arksey and O’Malley [38]. It also follows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) guidelines [39]. These guidelines ensure the rigor and transparency needed to map the existing literature on a given topic, enabling in-depth understanding while identifying research gaps [40,41,42].
Identifying the research questions
Based on the objectives of this review, the research team formulated the following research question to guide the literature search: What is the scope and nature of scientific studies on the LbC approach in health professions education? More specifically, this review seeks to identify research questions and methodological approaches used to study the LbC approach, as well as to highlight key gaps in the existing literature.
Identifying relevant studies
A comprehensive bibliographic search was conducted in MEDLINE ALL (Ovid), Embase (Ovid), CINAHL, Web of Science Core Collection, and Google Scholar, covering all records from database inception to March 10, 2025. The search strategy was developed in collaboration with a health science librarian (SC) (see Appendix 1). The search strategies combined controlled vocabulary (e.g., MeSH terms) and keywords related to three core concepts: “learning-by-concordance”, “clinical reasoning”, and “health professions education”.
Inclusion criteria were as follows: studies had to (1) be original and present primary data collection; (2) be published in a peer-reviewed journal; (3) use LbC as the main educational intervention; (4) target learners who are health science students or healthcare professionals; (5) aim to teach CR or any other form of expertise; and (6) be published in English or French.
Exclusion criteria were as follows: (1) conference abstracts; (2) non-original studies, i.e., those based on secondary analyses such as literature reviews; and (3) studies primarily focused on the SCT that did not use the LbC approach (i.e., did not use panelists’ feedback as a training method). These criteria are consistent with recommendations for scoping reviews and with the objectives of this study, which are to extract validated data presented in a structured manner and to reduce potential biases associated with unpublished or non-peer-reviewed sources [38,42].
Study selection
Data extracted from bibliographic databases were exported to the Covidence bibliographic sorting platform (https://www.covidence.org/). Studies were selected using a two-stage iterative approach: 1. Screening: after the Covidence platform had eliminated duplicates, the titles and abstracts of the identified articles were independently reviewed by two reviewers (AAR and ASP) to determine their eligibility. A third reviewer (AR) ruled on any disagreement that remained after discussion.
2. Eligibility: the full texts of potentially relevant articles were evaluated for inclusion by the same two independent reviewers (AAR and ASP). Disagreements were resolved by discussion, with the third reviewer (AR) ruling on any that remained.
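For illustration only, the dual-review rule used at both stages can be expressed as a simple decision function (a minimal sketch; names and types are assumptions, not the review team’s actual tooling):

```python
from typing import Optional

def screening_decision(reviewer_a: bool, reviewer_b: bool,
                       adjudicator: Optional[bool] = None) -> bool:
    """Include a record when two independent reviewers agree; otherwise
    a third reviewer rules on any disagreement remaining after discussion."""
    if reviewer_a == reviewer_b:
        return reviewer_a
    if adjudicator is None:
        raise ValueError("unresolved disagreement: third-reviewer ruling needed")
    return adjudicator
```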
Charting the data
Data from the selected studies were extracted and charted in a table (Microsoft Excel) with predetermined variables. The data extracted for each study included: authors, year of publication, type of study, objective, methodological design, learner population, type of intervention, creation process, comparator, learning domain, description of the LbC educational intervention, main study results, and conclusions. Data entry was carried out by three independent reviewers (AAR, ASP, AR), who then cross-checked the extracted data.
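To illustrate the charting structure, the predetermined variables above could be modeled as one record per study (a sketch; field names paraphrase the listed variables and do not reproduce the authors’ actual spreadsheet columns):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtractionRecord:
    """One row of the charting table: one field per predetermined variable."""
    authors: str
    year: int
    study_type: str
    objective: str
    methodological_design: str
    learner_population: str
    intervention_type: str
    creation_process: str
    comparator: Optional[str]        # not all studies had a comparator
    learning_domain: str
    lbc_intervention_description: str
    main_results: str
    conclusions: str
```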
Collating, summarizing and reporting the results
The data extracted from the studies were systematically analyzed to produce a synthesis. First, a descriptive approach was used to categorize the characteristics of the selected studies, including research design, the methodological approaches used to examine LbC, the learner populations involved, how LbC was implemented, and the context of its use. To this end, the analysis followed the PCC framework: Population (P), learners in health professions education; Concept (C), Learning-by-Concordance (LbC), considering its description or operationalization as an educational method; and Context (C), learning contexts such as continuing education, academic and clinical settings [43] (see Appendix 2).
In addition, a content analysis was conducted to systematically classify and structure the data extracted from the selected studies relating to the concept of LbC and, more specifically, to the elements considered important for the success of this approach in terms of learning benefits [44,45]. This approach allowed for an objective synthesis of the ways LbC has been implemented and studied in different contexts, with the aim of optimizing the learning of expertise. These analyses and the synthesis of the results were carried out iteratively by two reviewers (AAR, ASP), who consulted each other and worked collaboratively throughout the process. The analyses were cross-checked by a third reviewer (AR) (see Appendices 3 and 4).
Stakeholder consultation
To enrich our analysis and enhance the relevance of our interpretations, we consulted experts (n = 3) in the field of CR and health professions education. These experts included authors of publications on CR (n = 2; MCA, BC) and an educator involved in developing training courses for healthcare professionals (JOD). A summary of the preliminary results was sent to these experts for comment, and their comments were incorporated into the interpretation of the results.
Results
Sorting of selected studies
The database search yielded 596 references. After removing 270 duplicates, 326 references remained. Screening of titles and abstracts excluded 288 references, leaving 38 for full-text review. The full-text review led to the exclusion of 10 references, leaving 28 studies that met the criteria (Appendix 4).
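As a quick check, the flow of records reported above is internally consistent (a trivial arithmetic sketch restating the reported counts):

```python
identified = 596
after_deduplication = identified - 270            # 326 references screened
full_text_reviewed = after_deduplication - 288    # 38 full texts assessed
included = full_text_reviewed - 10                # 28 studies included
assert (after_deduplication, full_text_reviewed, included) == (326, 38, 28)
```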
Objectives and methodological characteristics of selected studies
Of the selected studies, 20 focused on the implementation of training courses using the LbC approach and on the impact of the training on the learning of healthcare learners [36,37,46,47,48,49,50,51,52,66,67,68,69,71,72,74,75,76,79,80]. Among these implementation studies, nine adopted a mixed methods design, combining quantitative descriptive analyses (e.g., questionnaire results) and qualitative content analysis of learners’ discourse regarding their opinions of the approach [36,37,47,49,51,52,66,68,80]. Six studies used qualitative approaches, employing content analysis of participants’ discourse to explore learners’ experiences with the LbC approach [46,48,67,69,74,79]. Two studies adopted a mainly quantitative approach: one relied solely on data collection through questionnaires [50], while the other was a quasi-experimental study comparing two cohorts (control vs. intervention) using a customized assessment [76]. Three studies were descriptive implementation reports: two did not collect formal data [71,72], while one reported informal comments from students [75].
In addition to these implementation studies, eight examined the development and conceptualization of LbC training [31,53,54,55,70,73,77,78]. Two of these studies used a mixed methods design, integrating data from questionnaires with a qualitative analysis of participants’ opinions of the training [53,55]. Three studies used mainly qualitative methodologies: one applied a dialogic action research approach with thematic analysis [54], another used a nominal group technique to validate an operational framework for the development of LbC [31], and the third followed an interpretive description methodology, conducting four semi-structured dialogue groups [78]. Two studies were descriptive theoretical papers, focusing on the conceptual and pedagogical foundations of LbC [70,73]. One study used a modified e-Delphi method, conducting two rounds of expert consultation to validate a clinical reasoning assessment rubric [77].
Measuring the effects of the LbC approach
Trends emerged from the results regarding the potential impacts of the approach [36,37,46,47,48,49,50,51,52,67,68,69,70,71,72,75,76,79,80]. All studies on the implementation of LbC considered learners’ perceptions of LbC, which were mainly assessed through questionnaires, qualitative feedback, or a combination of both. Across these studies, learners consistently appreciated the approach, reporting it as engaging, interactive, and beneficial for the development of expertise. Some studies reported that learners expressed positive perceptions regarding their engagement in LbC, with self-reported improvements in knowledge organization and professional reasoning [36,48,49,51,67,68,71,72,75,76,79,80].
Population targeted
The “population” category of the PCC framework enabled classification of the information extracted according to the learners targeted. Most studies (60%) examined the use of the LbC approach among medical learners, including medical students [31,37,46,48,49,53,54,66,79] and practicing physicians [36,50,55]. Other studies focused on LbC training among nursing [47,69], speech-language pathology [51,76], dental [52,80], pharmacy [67], and veterinary medicine students [75].
Concept of the LbC approach
Development of the LbC training
The “concept” category of the PCC framework captured much of the information extracted from the selected studies. Across these studies, the LbC concept encompassed the way in which the approach is developed and implemented, and the development process itself appears integral to the LbC concept. The development of LbC training followed distinct methodological steps from one study to another. This process could be based on an assessment of educational needs [47,78] and/or a literature review of relevant theoretical foundations [49,68,76]. Several studies emphasized that the design of vignettes should be aligned with defined learning objectives [51,53,74,80] and could be based on established theoretical frameworks [31,52,55,73,77]. The development teams included educators [46,49,51,52,53,54,78], clinicians [31,36,67,69,72,75,79], experienced medical specialists actively involved in postgraduate training and clinical practice [50,68,71], and occasionally educators from other health professions such as pharmacy [67], nursing [69,70,73], dentistry [80], and veterinary medicine [75]. The most frequently cited qualification for vignette designers was clinical expertise in the target domain [31,36,37,46,50,52,68,71,72,75,79]. Some studies mentioned collaborations with program directors [47] and faculty members [49,78], who played a key role in aligning the design of the training with the institution’s curricula and educational objectives. One study indicated that vignettes were designed by a single author [51], while others noted that designers received specific training before contributing [50,53,68]. Moreover, some authors explicitly documented the involvement and significant contribution of students in the development or pilot-testing phase to ensure the appropriate level of difficulty and the relevance of the scenarios [68,76,80].
The validation of the vignettes was described as an important step in designing training courses based on the LbC approach. Most studies reported a process for validating vignettes and their associated questions. This process was usually conducted by experienced clinicians and/or experts in the field [31,46,50,53,54,67,68,69,71,72,75,76,79,80], generally selected on the basis of their clinical experience, peer recognition, and theoretical expertise [48,49,67,68,69,71,72,76,79,80]. Several studies emphasized the participation of students in the selection of experts or in the pre-testing of vignettes to ensure the vignettes’ adequacy and relevance to the students’ level of training [48,49,68,76]. In addition, some authors explicitly described the training provided to experts by vignette designers to maintain the consistency of validation criteria [32,47,68].
The number of experts involved in the validation process varied from one study to another, generally ranging from four to twelve per training program [36,47,67,68,71,72,75,76,79,80]. In most studies, experts’ responses were collected using Likert scales with justification boxes for open-ended responses [36,46,47,48,49,50,51,52,67,68,69,71,72,75,76,79,80]. Several studies adapted the Likert scale structure to learners’ levels of experience, using three response levels for novice learners [46,75], five for more experienced learners [36,46], and four in judgment-based LbC training [47,48].
Delivery of the LbC training
The way in which the LbC approach is delivered appears to be a central element of the concept underlying this approach. In all the studies, the LbC training format followed a structured sequence: 1) presentation of a clinical vignette, followed by a diagnostic or intervention hypothesis; 2) introduction of additional clinical data, requiring learners to assess the plausibility of the hypothesis; 3) selection of the answer on a Likert-type scale, followed by a justification in an open response field; and 4) comparison with expert responses, accompanied by structured feedback.
The structure and content of the feedback varied across studies, reflecting differences in instructional design and intended learning outcomes. The main types of feedback reported included: 1) experts’ justifications for their answers [36,49,67,68,69,71,72,75,76,79,80]; 2) statistical distributions of the expert responses [36,49,67,68,75,79,80]; 3) educational summaries of expert reasoning [56,57,68,71,76,79,80]; and 4) facilitated discussions between learners and educators [48,71,72,75].
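To make the sequence concrete, the sketch below models the four-step structure and the first two feedback types as simple data structures (class and field names are assumptions for illustration; the cited studies used various platforms, not this code):

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class PanelResponse:
    rating: str          # position on the Likert-type scale, e.g. "-2".."+2"
    justification: str   # the expert's explanation for that rating

@dataclass
class LbCQuestion:
    vignette: str                  # step 1: clinical situation and hypothesis
    new_data: str                  # step 2: additional clinical information
    scale: tuple[str, ...]         # step 3: Likert-type response options
    panel: list[PanelResponse] = field(default_factory=list)

    def feedback(self, learner_rating: str, learner_justification: str) -> dict:
        """Step 4: return feedback of types 1 and 2 described above
        (expert justifications and the distribution of expert ratings)."""
        return {
            "learner": (learner_rating, learner_justification),
            "expert_distribution": dict(Counter(r.rating for r in self.panel)),
            "expert_justifications": [r.justification for r in self.panel],
        }
```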
The number of vignettes and associated questions varied across studies. Some studies used single-question vignettes [48,55,71], while others included several questions per vignette, with an average of nine questions per vignette [36,68,70]. Most LbC training programs were delivered asynchronously, allowing learners and experts to participate at their convenience [68,70,72,79,80]. However, some programs adopted a hybrid approach, combining asynchronous self-paced learning with synchronous group discussions to enhance interactivity and real-time knowledge sharing [52,71,72].
Contexts of LbC training
Regarding the ‘context’ category of the PCC framework, the selected studies show that LbC has been implemented in various training contexts for the education of healthcare professionals. These contexts included undergraduate academic programs [46,48,49,67,69,72,76], continuing professional development [36,50,53,55,71], and clinical training environments [47,68,71,75,79].
Content analysis
Content analysis revealed five recurring elements considered important for the success of the LbC approach, presented here in descending order of frequency: 1) design of vignettes and questions; 2) involvement of experts; 3) feedback procedures; 4) training preparation; and 5) collaboration between designers and experts. For vignettes and questions, studies emphasized the importance of using realistic clinical scenarios that incorporate uncertainty to promote decision-making skills [31,47,49,52,53,54,68,71,75,76,79,80]. The participation of experts, i.e., individuals with relevant expertise in the specific field under study, was deemed essential, as they play a central role in validating vignettes by ensuring content validity, relevance to clinical practice, and an appropriate level of difficulty [36,47,53,68,71,72,75,76,79,80]. The experts’ answers served as a reference for learners, helping them refine their CR and decision-making processes. Feedback was considered paramount in this approach: expert justifications, statistical distributions of responses, and educational summaries reinforce learning by allowing learners to compare their decisions with those of experts [48,49,68,71,72,75,76,79,80]. Careful preparation was also considered essential, including defining the target learners, clarifying the objectives, and ensuring relevance to clinical practice to maximize impact [31,50,68,71,75,76,80]. Finally, collaboration between designers and experts facilitated the training of the experts involved and promoted exchanges between these actors, ensuring the quality of the training program, in particular the consistency of the vignettes and of the answers provided to the questions. This partnership was considered essential to optimize the relevance and educational value of the training [53,68,71,75,76,79,80].
Discussion
Scope and nature of LbC research
This review achieved its objective of describing the scope of studies devoted to the LbC approach for developing the expertise of healthcare professionals. The findings indicate that, despite growing interest in LbC, the literature on this approach remains relatively limited and heterogeneous. This reflects the innovative and emerging nature of the LbC approach, which has been used with different types of learners, in various disciplines and in different contexts [66,68,70,78,79,80]. The main features of this approach, in terms of vignette format, questions, learners’ answers, expert responses, and feedback based on these responses, were present in all included studies. However, there was some heterogeneity, particularly in how this type of training was developed and how learners received feedback [36,37,46,47,48,49,50,51,52,67,68,69,70,71,72,75,76,79,80]. This study also identified key elements that appear to be important for promoting successful learning through LbC. It further found gaps in the LbC literature, particularly regarding the methodological design of studies of this approach, the standardization of its development and implementation processes, and the limited assessment of its effects and impacts in the studies found.
Methodological considerations in LbC research
Most studies on LbC rely on cohort-based follow-up designs to explore learner engagement and implementation strategies [68,71,72,79]. Although these studies have consistently reported positive perceptions from learners [67,68,69,72,75,76,79,80], their designs cannot establish the added value of the LbC approach compared with other approaches, such as problem-based learning or case-based discussions [62,63]. Indeed, no study has compared the LbC approach to another educational approach.
Studies reported that students found the approach engaging, interactive, and beneficial for structuring their clinical reasoning [36,47,48,67,68,69,72,75,76,79,80]. The asynchronous digital format was particularly valued for its flexibility, allowing learners to progress at their own pace [68,70,72,79,80]. This adaptability makes LbC accessible to a broader range of learners, including those in geographically remote areas. The evaluation of LbC training has largely relied on learner satisfaction surveys and self-reported perceptions [67,68,69,71,72,75,76,79,80], a common limitation in health professions education research [58,59].
Furthermore, current studies on LbC lack outcome measures that objectively confirm the expertise developed through this approach. For example, longitudinal studies assessing the contribution of LbC to CR and decision-making over time would be relevant [68,71,77]. Further studies are needed to define a reference framework specifying which indicators of LbC success should be evaluated. In addition, theoretical models of learning in LbC should be refined to better understand the mechanisms by which the approach promotes the development of professional reasoning and decision-making.
Constituent elements of the LbC approach
This review revealed general agreement on the fundamental methodological elements of LbC training, in line with the published guidelines on its design and implementation [34]. These findings suggest that LbC is adaptable to various disciplines (e.g., medicine, nursing, pharmacy, veterinary medicine, dentistry, speech therapy, and patient partnership initiatives) [67,69,70,72,73,75,76,80], to various areas of competence (e.g., clinical reasoning, professionalism), and learner levels (students and professionals). However, the lack of standardized frameworks for developing vignettes, selecting experts, and feedback procedures highlights the need to develop more detailed methodological guidelines on these aspects of the approach [68,78].
Key elements for optimizing the LbC approach
The review highlighted five elements to consider to optimize learning through the LbC approach. The first element relates to the realism and authenticity of the vignettes, often cited as essential to effective learning, ensuring that the scenarios accurately reflect real clinical challenges [31,47,49,52,53,54,68,71,75,76,79,80]. The second element concerns the participation of experienced professionals. This is considered essential, particularly for validating educational material and ensuring that the level of uncertainty in the vignettes corresponds to clinical practice. The third element concerns feedback, which may include the degree of consensus or variability in the experts’ responses. It is interesting to note that this degree of consensus on the experts’ responses provides an indication of the level of difficulty and relevance of the vignettes [36,47,53,68,71,72,75,76,79,80]. At the heart of the LbC approach, learners benefit greatly from structured feedback that includes expert justifications, statistical distributions of responses and educational summaries, all of which play a crucial role in reinforcing learning [48,49,64,65,68,71,72,75,76,79,80]. Regarding the fourth element, which concerns training preparation, the studies emphasized the importance of clearly defining the target learners, aligning the objectives with clinical practice and ensuring good coordination between faculty members and vignette designers [31,50,68,71,75,76,80]. Finally, a fifth element, close collaboration, was identified as a key factor in optimizing LbC training. Effective collaboration facilitates the training of experts and ensures that feedback remains relevant to the learning objectives [53,68,71,75,76,79,80]. These methodological elements reflect current recommendations from research on LbC and provide a basis for the development of more structured guidelines for its implementation.
Challenges and limitations of implementing the LbC approach
Despite its advantages, LbC presents challenges. Effective clinical decision-making draws on a wide range of skills that are difficult to acquire through a single learning approach. LbC encourages learners to focus on individual diagnostic or intervention hypotheses, which does not always align with the holistic, multifactorial decision-making required in clinical practice. Furthermore, while LbC is grounded in script theory, real-world decision-making often involves a combination of pattern recognition and analytical reasoning, which may not be fully developed through LbC training alone.
Another significant challenge is the resource-intensive nature of LbC implementation. The development and validation of vignettes, collection of expert responses, and administration of training require substantial institutional support [68,71,78,79,80]. Establishing standardized procedures for these steps would streamline the adoption of the LbC approach across different educational settings.
Study strengths and limitations
This scoping review was conducted using a rigorous methodology, including systematic searches of multiple databases and adherence to established scoping review guidelines [38,39]. The specificity of the “Learning-by-Concordance” concept facilitated the development of precise search strategies in collaboration with a health science librarian.
One limitation of this study is the exclusion of grey literature, meaning that unpublished LbC initiatives may not have been captured. Given educators’ growing interest in LbC, many applications of this approach may exist outside the peer-reviewed literature. Furthermore, the heterogeneity of the included studies—in terms of methodological designs, targeted learning domains, and outcome measures—limits the ability to synthesize the findings comprehensively [68,71,76,79,80].
The effects of educational approaches should be verified using the most meaningful impact indicators available. Although it remains difficult to measure the impact of educational interventions on health science learners’ ability to improve patient outcomes, some studies in medical education have explored indirect measures, such as practitioner confidence, clinical decision-making effectiveness, or error reduction [60,61]. Future research could explore whether LbC training influences these intermediate indicators, rather than focusing solely on short-term outcomes associated with the acquisition of knowledge or skills that are not easily transferable to clinical practice.
Implications for future research and training practices
This review underscores the versatility of the LbC approach in fostering different forms of expertise in health professions education. LbC has been applied in domains ranging from CR and diagnostic decision-making to professionalism and perceptual judgment. Learner populations also varied across disciplines, including medicine, nursing, pharmacy, veterinary medicine, dentistry, speech therapy, and patient partnership initiatives [67,69,70,72,73,75,76,80].
A major takeaway from this review is the need for clearer methodological frameworks to support the implementation of LbC. The development of standardized guidelines for vignette design, expert validation, and feedback procedures will enhance the reproducibility and scalability of this approach [68,78]. Future research should also focus on exploring the cognitive processes underlying LbC learning, incorporating longitudinal studies to assess its long-term impact on professional practice [68,71,77].
Given learners’ consistently positive engagement and the adaptability of LbC, the approach should be considered a viable strategy for developing expertise in various health professions.
Conclusion
LbC training represents a structured approach to promoting the development of expertise in the healthcare professions. It is distinguished by the importance it places on decision-making in contexts of uncertainty and on the integration of feedback based on the responses of people with relevant expertise in the field under study. Although current studies highlight its adaptability across disciplines and levels of learning, methodological variability in its development and implementation remains a major challenge. Future research should focus on establishing standardized frameworks for vignette design, expert validation, and feedback procedures. Furthermore, investigating the cognitive processes underlying LbC learning and its long-term impact on professional reasoning could provide valuable insights for optimizing its implementation in health professions education.
Contribution to the advancement of health professions education
This scoping review provides the first comprehensive synthesis of research on LbC in health professions education, offering a structured overview of its applications. By identifying key procedural factors in LbC training, this review contributes to refining its implementation and highlights the need for standardized evaluation frameworks.
Beyond methodological perspectives, this study also informs the design of future LbC-based training programs, ensuring that vignette development, expert involvement, and feedback procedures are structured effectively. By clarifying the gaps in current research, this review serves as a basis for advancing both the study and practical application of LbC in health professions education.
Additional File
The additional file for this article can be found as follows:
Appendices 1 to 5.
Acknowledgements
We would like to express our deep gratitude to the Interdisciplinary Research Group on Cognition and Professional Reasoning, Faculty of Medicine, University of Montreal, for funding this project and its publication. We would also like to thank the University of Geneva and the Montpellier Institute of Physiotherapy Education for their valuable support throughout this research. Their collaboration and contributions were instrumental in the completion of this study. We would also like to thank Monique Car, health sciences librarian at the Faculty of Medicine of the University of Montreal, for her assistance in developing and verifying database search strategies and for her advice on improving the manuscript.
Competing Interests
The authors have no competing interests to declare.
References
- 1. Helou MA, DiazGranados D, Ryan MS, Cyrus JW. Uncertainty in Decision Making in Medicine: A Scoping Review and Thematic Analysis of Conceptual Models. Acad Med. 2020;95(1):157–65. DOI: 10.1097/ACM.0000000000002902
- 2. Hall KH. Reviewing intuitive decision-making and uncertainty: the implications for medical education. Medical Education. 2002;36(3):216–24. DOI: 10.1046/j.1365-2923.2002.01140.x
- 3. Cranley L, Doran DM, Tourangeau AE, Kushniruk A, Nagle L. Nurses’ uncertainty in decision-making: A literature review. Worldviews on Evidence-Based Nursing. 2009;6(1):3–15. DOI: 10.1111/j.1741-6787.2008.00138.x
- 4. Bhise V, Rajan SS, Sittig DF, Morgan RO, Chaudhary P, Singh H. Defining and Measuring Diagnostic Uncertainty in Medicine: A Systematic Review. J Gen Intern Med. 2018;33(1):103–15.
- 5. Smith SK, Benbenek MM, Bakker CJ, Bockwoldt D. Scoping review: Diagnostic reasoning as a component of clinical reasoning in the U.S. primary care nurse practitioner education. J Adv Nurs. 2022;78(12):3869–96. DOI: 10.1111/jan.15414
- 6. Roex A, Clarebout G, Dory V, Degryse J. Can ill-structured problems reveal beliefs about medical knowledge and knowing? A focus-group approach. BMC Medical Education. 2009;9:1–9. DOI: 10.1186/1472-6920-9-62
- 7. Sarsfield E. Differences Between Novices’ and Experts’ Solving Ill-Structured Problems. Public Health Nursing. 2014;31(5):444–53. DOI: 10.1111/phn.12100
- 8. Higgs J, Jones MA, Loftus S, Christensen N. Clinical Reasoning in the Health Professions E-Book. Elsevier Health Sciences; 2008.
- 9. Jonassen DH. Instructional design models for well-structured and ill-structured problem-solving learning outcomes. Educational Technology Research and Development. 1997;45(1):65–94. DOI: 10.1007/BF02299613
- 10. Chi MTH, Glaser R. Problem-Solving Ability. 1985.
- 11. Delamont S, Atkinson P. Doctoring Uncertainty: Mastering Craft Knowledge. Social Studies of Science. 2001;31(1):87–107. DOI: 10.1177/030631201031001005
- 12. Atkinson P. Training for certainty. Social Science & Medicine. 1984;19(9):949–56. DOI: 10.1016/0277-9536(84)90324-1
- 13. Mitchell AW. Teaching ill-structured problem solving using occupational therapy practice epistemology. Occupational Therapy in Health Care. 2013;27(1):20–34. DOI: 10.3109/07380577.2012.757408
- 14. Moffett J, Hammond J, Murphy P, Pawlikowska T. The ubiquity of uncertainty: a scoping review on how undergraduate health professions’ students engage with uncertainty. Adv Health Sci Educ Theory Pract. 2021;26(3):913–58. DOI: 10.1007/s10459-021-10028-z
- 15. Round A. Introduction to clinical reasoning. J Eval Clin Pract. 2001;7(2):109–17. DOI: 10.1046/j.1365-2753.2001.00252.x
- 16. Ruczynski LI, van de Pol MH, Schouwenberg BJ, Laan RF, Fluit CR. Learning clinical reasoning in the workplace: a student perspective. BMC Med Educ. 2022;22(1):19. DOI: 10.1186/s12909-021-03083-y
- 17. Kassirer JP, Wong JB, Kopelman RI. Learning Clinical Reasoning. Lippincott Williams & Wilkins; 2010. DOI: 10.1097/ACM.0b013e3181d5dd0d
- 18. Young ME, Thomas A, Lubarsky S, Gordon D, Gruppen LD, Rencic J, et al. Mapping clinical reasoning literature across the health professions: a scoping review. BMC Medical Education. 2020;20(1):107. DOI: 10.1186/s12909-020-02012-9
- 19. Richmond A. The chicken and the egg: Clinical reasoning and uncertainty tolerance. Med Educ. 2022;56(7):696–8. DOI: 10.1111/medu.14814
- 20. Gheihman G, Johnson M, Simpkin AL. Twelve tips for thriving in the face of clinical uncertainty. Medical Teacher. 2020;42(5):493–9. DOI: 10.1080/0142159X.2019.1579308
- 21. Moulder G, Harris E, Santhosh L. Teaching the science of uncertainty. Diagnosis (Berl). 2023;10(1):13–8. DOI: 10.1515/dx-2022-0045
- 22. Stojan JN, Daniel M, Hartley S, Gruppen L. Dealing with uncertainty in clinical reasoning: A threshold model and the roles of experience and task framing. Med Educ. 2022;56(2):195–201. DOI: 10.1111/medu.14673
- 23. Cooke S, Lemay J-F. Transforming Medical Assessment: Integrating Uncertainty Into the Evaluation of Clinical Reasoning in Medical Education. Academic Medicine. 2017;92(6). DOI: 10.1097/ACM.0000000000001559
- 24. Fournier JP, Demeester A, Charlin B. Script concordance tests: guidelines for construction. BMC Med Inform Decis Mak. 2008;8:18. DOI: 10.1186/1472-6947-8-18
- 25. Charlin B, Boshuizen HP, Custers EJ, Feltovich PJ. Scripts and clinical reasoning. Med Educ. 2007;41(12):1178–84. DOI: 10.1111/j.1365-2923.2007.02924.x
- 26. Charlin B, Roy L, Brailovsky C, Goulet F, van der Vleuten C. The Script Concordance test: a tool to assess the reflective clinician. Teach Learn Med. 2000;12(4):189–95. DOI: 10.1207/S15328015TLM1204_5
- 27. Charlin B, van der Vleuten C. Standardized assessment of reasoning in contexts of uncertainty: the script concordance approach. Eval Health Prof. 2004;27(3):304–19. DOI: 10.1177/0163278704267043
- 28. Dory V, Gagnon R, Vanpee D, Charlin B. How to construct and implement script concordance tests: insights from a systematic review. Med Educ. 2012;46(6):552–63. DOI: 10.1111/j.1365-2923.2011.04211.x
- 29. Kojich L, Miller SA, Axman K, Eacret T, Koontz JA, Smith C. Evaluating clinical reasoning in first year DPT students using a script concordance test. BMC Medical Education. 2024;24(1):329. DOI: 10.1186/s12909-024-05281-w
- 30. Lubarsky S, Dory V, Audétat MC, Custers E, Charlin B. Using script theory to cultivate illness script formation and clinical reasoning in health professions education. Can Med Educ J. 2015;6(2):e61–70. DOI: 10.36834/cmej.36631
- 31. Pietrement C, Morin M-P, Lefèvre-Utile A, Thibault L-P, Jobin V, Charlin B. Formation par concordance : la théorie des scripts et son application en enseignement revisitées. Pédagogie Médicale. 2024;25(1):51–7. DOI: 10.1051/pmed/2024004
- 32. Lubarsky S, Dory V, Duggan P, Gagnon R, Charlin B. Script concordance testing: from theory to practice: AMEE guide no. 75. Med Teach. 2013;35(3):184–93. DOI: 10.3109/0142159X.2013.760036
- 33. Charlin B, Gagnon R, Lubarsky S, Lambert C, Meterissian S, Chalk C, et al. Assessment in the context of uncertainty using the script concordance test: more meaning for scores. Teach Learn Med. 2010;22(3):180–6. DOI: 10.1080/10401334.2010.488197
- 34. Charlin B, Deschênes MF, Fernandez N. Learning by concordance (LbC) to develop professional reasoning skills: AMEE Guide No. 141. Med Teach. 2021;43(6):614–21. DOI: 10.1080/0142159X.2021.1900554
- 35. Charton L, Lahmar A, Hernandez E, Rougerie F, Lorenzo M. Impact of an online learning by concordance program on reflection. BMC Medical Education. 2023;23(1):822. DOI: 10.1186/s12909-023-04799-9
- 36. Lecours J, Bernier F, Friedmann D, Jobin V, Charlin B, Fernandez N. Learning-by-Concordance for Family Physicians: Revealing its Value for Continuing Professional Development in Dermatology. MedEdPublish. 2018;7:236. DOI: 10.15694/mep.2018.0000236.1
- 37. Chantal L, Driss K, Robert G, Bernard C, Nicolas F. Learning-by-Concordance of Perception: A Novel way to Learn to Read Thoracic Images. Acad Radiol. 2023;30(1):132–7. DOI: 10.1016/j.acra.2022.04.015
- 38. Arksey H, O’Malley L. Scoping studies: towards a methodological framework. International Journal of Social Research Methodology. 2005;8(1):19–32. DOI: 10.1080/1364557032000119616
- 39. Tricco AC, Lillie E, Zarin W, O’Brien KK, Colquhoun H, Levac D, et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann Intern Med. 2018;169(7):467–73. DOI: 10.7326/M18-0850
- 40. Anderson S, Allen P, Peckham S, Goodwin N. Asking the right questions: scoping studies in the commissioning of research on the organisation and delivery of health services. Health Res Policy Syst. 2008;6:7. DOI: 10.1186/1478-4505-6-7
- 41. Dixon-Woods M, Agarwal S, Jones D, Young B, Sutton A. Synthesising qualitative and quantitative evidence: a review of possible methods. J Health Serv Res Policy. 2005;10(1):45–53. DOI: 10.1177/135581960501000110
- 42. Levac D, Colquhoun H, O’Brien KK. Scoping studies: advancing the methodology. Implement Sci. 2010;5:69. DOI: 10.1186/1748-5908-5-69
- 43. Cook DA, West CP. Conducting systematic reviews in medical education: a stepwise approach. Medical Education. 2012;46(10):943–52. DOI: 10.1111/j.1365-2923.2012.04328.x
- 44. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15. DOI: 10.1111/j.1365-2648.2007.04569.x
- 45. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–88. DOI: 10.1177/1049732305276687
- 46. Charton L, Lahmar A, Hernandez E, Rougerie F, Lorenzo M. Impact of an online learning by concordance program on reflection. BMC Med Educ. 2023;23(1):822. DOI: 10.1186/s12909-023-04799-9
- 47. Deschênes M-F, Goudreau J. L’apprentissage du raisonnement clinique infirmier dans le cadre d’un dispositif éducatif numérique basé sur la concordance de scripts. Pédagogie Médicale. 2020;21(3):143–57. DOI: 10.1051/pmed/2020041
- 48. Fernandez N, Foucault A, Dubé S, Robert D, Lafond C, Vincent AM, et al. Learning-by-Concordance (LbC): introducing undergraduate students to the complexity and uncertainty of clinical practice. Can Med Educ J. 2016;7(2):e104–e13. DOI: 10.36834/cmej.36690
- 49. Foucault A, Dubé S, Fernandez N, Gagnon R, Charlin B. Learning medical professionalism with the online concordance-of-judgment learning tool (CJLT): A pilot study. Med Teach. 2015;37(10):955–60. DOI: 10.3109/0142159X.2014.970986
- 50. Hornos EH, Pleguezuelos EM, Brailovsky CA, Harillo LD, Dory V, Charlin B. The practicum script concordance test: an online continuing professional development format to foster reflection on clinical practice. J Contin Educ Health Prof. 2013;33(1):59–66. DOI: 10.1002/chp.21166
- 51. Maftoul R, Marcotte K. Formation par concordance de script en orthophonie : récit de pratique en évaluation des troubles acquis de la communication. Pédagogie Médicale. 2023;24(3):179–92. DOI: 10.1051/pmed/2023008
- 52. Vaillant-Corroy A-S, Girard F, Virard F, Corne P, Gerber Denizart C, Wulfman C, et al. Concordance of judgement: A tool to foster the development of professionalism in dentistry. European Journal of Dental Education. 2024;28(3):789–96. DOI: 10.1111/eje.13007
- 53. Deschênes MF, Charlin B, Phan V, Grégoire G, Riendeau T, Henri M, et al. Educators and practitioners’ perspectives in the development of a learning by concordance tool for medical clerkship in the context of the COVID pandemic. Can Med Educ J. 2021;12(6):43–54. DOI: 10.36834/cmej.72461
- 54. Fernandez N, Deschênes MF, Akremi H, Lecours L, Jobin V, Charlin B. What can Designing Learning-by-Concordance Clinical Reasoning Cases Teach Us about Instruction in the Health Sciences? Perspect Med Educ. 2023;12(1):160–8. DOI: 10.5334/pme.898
- 55. Lorenzo M, Bailly P, Lépine C. Should we add patients in concordance of judgment learning tool panels? – An analysis between patients and primary care physicians. Med Teach. 2024;46(5):697–704. DOI: 10.1080/0142159X.2023.2274285
- 56. Charlin B, Deschênes M-F, Dumas J-P, Lecours J, Vincent A-M, Kassis J, et al. Concevoir une formation par concordance pour développer le raisonnement professionnel : quelles étapes faut-il parcourir ? Pédagogie Médicale. 2018;19(3):143–9. DOI: 10.1051/pmed/2019019
- 57. Charlin B, Fernandez N. Former et évaluer par concordance : des modalités éducatives complémentaires. Pédagogie Médicale. 2022;23(2):131–3. DOI: 10.1051/pmed/2022005
- 58. Prystowsky JB, Bordage G. An outcomes research perspective on medical education: the predominance of trainee assessment and satisfaction. Med Educ. 2001;35(4):331–6. DOI: 10.1046/j.1365-2923.2001.00910.x
- 59. Emery M, Wolff M, Merritt C, Ellinas H, McHugh D, Zaher M, et al. An outcomes research perspective on medical education: Has anything changed in the last 18 years? Med Teach. 2022;44(12):1400–7. DOI: 10.1080/0142159X.2022.2099259
- 60. Glick TH. Viewpoint: Evidence-Guided Education: Patients’ Outcome Data Should Influence Our Teaching Priorities. Academic Medicine. 2005;80(2). DOI: 10.1097/00001888-200502000-00008
- 61. O’Malley PG, Pangaro LN. Research in Medical Education and Patient-Centered Outcomes: Shall Ever the Twain Meet? JAMA Internal Medicine. 2016;176(2):167–8. DOI: 10.1001/jamainternmed.2015.6938
- 62. McLean SF. Case-based learning and its application in medical and health-care fields: a review of worldwide literature. Journal of Medical Education and Curricular Development. 2016;3:JMECD.S20377. DOI: 10.4137/JMECD.S20377
- 63. Thistlethwaite JE, Davies D, Ekeocha S, Kidd JM, MacDougall C, Matthews P, et al. The effectiveness of case-based learning in health professional education. A BEME systematic review: BEME Guide No. 23. Medical Teacher. 2012;34(6):e421–e44. DOI: 10.3109/0142159X.2012.680939
- 64. Gil-Lacruz M, Gracia-Pérez ML, Gil-Lacruz AI. Learning by Doing and Training Satisfaction: An Evaluation by Health Care Professionals. Int J Environ Res Public Health. 2019;16(8). DOI: 10.3390/ijerph16081397
- 65. Rajeh MT, Abduljabbar FH, Alqahtani SM, Waly FJ, Alnaami I, Aljurayyan A, et al. Students’ satisfaction and continued intention toward e-learning: a theory-based study. Medical Education Online. 2021;26(1):1961348. DOI: 10.1080/10872981.2021.1961348
- 66. Foucault A, Dubé S, Fernandez N, Gagnon R, Charlin B. The Concordance of Judgement Learning Tool. Medical Education. 2014;48(5):541–2. DOI: 10.1111/medu.12467
- 67. Funk KA, Kolar C, Schweiss SK, Tingen JM, Janke KK. Experience with the script concordance test to develop clinical reasoning skills in pharmacy students. Currents in Pharmacy Teaching and Learning. 2017;9(6):1031–41. DOI: 10.1016/j.cptl.2017.07.021
- 68. Charton L, Hernandez E, Lorenzo M. Apports d’une méthode de formation par concordance mixte à l’apprentissage de la lecture des électrocardiogrammes; 2018.
- 69. Tedesco-Schneck M. Use of Script Concordance Activity With the Think-Aloud Approach to Foster Clinical Reasoning in Nursing Students. Nurse Educator. 2019;44(5):275. DOI: 10.1097/NNE.0000000000000626
- 70. Deschênes MF, Pelletier I, Tremblay K, Charlin B. La formation par concordance : une nouvelle forme de compagnonnage cognitif pour développer le raisonnement chez les étudiants. Pédagogie collégiale. 2020;33(2).
- 71. Henriksen C, Jobin V, Deschênes MF, Tremblay C, Charlin B, Fernandez N. Formation par concordance avec rétroaction multi-source aux questions qui émergent de la pratique médicale en contexte de pandémie COVID-19 [Learning by concordance with multi-source feedback for emerging COVID-19 pandemic related cases in medical practice]. Pédagogie Médicale. 2020;21(4):203–5. DOI: 10.1051/pmed/2020049
- 72. Jackson M, Descôteaux A, Nicaise L, Flora L, Berkesse A, Codsi MP, et al. Former en ligne au recrutement de patients partenaires : l’apport des formations par concordance. Pédagogie Médicale. 2020;21(2):101–6. DOI: 10.1051/pmed/2020035
- 73. Deschênes MF, Létourneau D, Goudreau J. Script Concordance Approach in Nursing Education. Nurse Educ. 2021;46(5):E103–7. DOI: 10.1097/NNE.0000000000001028
- 74. Patel R. General practice trainees’ learning experiences of formative think-aloud script concordance testing. Education for Primary Care. 2022;33(4):229–36. DOI: 10.1080/14739879.2022.2057240
- 75. Tayce JD, Saunders AB. The Use of a Modified Script Concordance Test in Clinical Rounds to Foster and Assess Clinical Reasoning Skills. Journal of Veterinary Medical Education. 2022;49(5):556–9. DOI: 10.3138/jvme-2021-0090
- 76. Bernard E, Elie-Deschamps J, Judet A, Pépin-Boutin A, Dupont-Bérail S, Lorenzo M. Développer le raisonnement clinique en première année d’orthophonie : création et évaluation d’un dispositif de formation. Glossa. 2023;(136):127–61.
- 77. Deschênes MF, Dionne É, Robert-Boluda L. Preliminary Validation of a Clinical Reasoning Theory-Based Assessment Rubric: An e-Delphi Study. Nursing Education Perspectives. 2024. DOI: 10.1097/01.NEP.0000000000001320
- 78. Deschênes MF, Charlin B, Akremi H, Lecours L, Moussa A, Jobin V, et al. Beliefs and experiences of educators when involved in the design of a Learning-by-concordance tool: A qualitative interpretative study. Journal of Professional Nursing. 2024;54:180–8. DOI: 10.1016/j.profnurs.2024.07.004
- 79. Verillaud B, Veleur M, Kania R, Zagury-Orly I, Fernandez N, Charlin B. Using learning-by-concordance to develop reasoning in epistaxis management with online feedback: A pilot study. Science Progress. 2024;107(3):00368504241274583. DOI: 10.1177/00368504241274583
- 80. Mainville G, Buithieu H, Charlin B, Strub M. Learning by Concordance as a Tool for Paediatric Dental Traumatology Education. European Journal of Dental Education. 2025. DOI: 10.1111/eje.13079