Medicine. 2025 Aug 15;104(33):e43920. doi: 10.1097/MD.0000000000043920

A competency-based instructional design: A progressive blended case-based teaching method

Pingyu Zhu a, Binglei Jiang b, Rong Lu c, Ben Wang a, Yixi He d, Kun Yang e, Qiang Zhang a, Pan Zhao f, Bo Huang g,*
PMCID: PMC12366975  PMID: 40826708

Abstract

Background:

Competency-based medical education requires innovative teaching models, yet limited evidence exists on the combined effectiveness of blended and case-based learning. This study aimed to evaluate a progressive blended case-based teaching model’s effectiveness in enhancing the clinical competencies of medical students.

Methods:

From September 2022 to January 2023, 258 5th-year clinical medical students were prospectively enrolled and randomly assigned to either a “3-tier progressive” case-based teaching group (experimental, n = 123) or a traditional lecture-based group (control, n = 135) for an integrated urology module. The experimental group utilized the “Rain Classroom” tool to implement a blended approach involving pre-class, in-class, and post-class activities. The control group received traditional lectures. Mixed-methods analysis assessed impacts on competencies, focusing on final evaluation scores. Informed consent was obtained.

Results:

A total of 258 students participated in the study (123 in the experimental group and 135 in the control group). Before the intervention, no significant differences were found between the experimental and control groups in diagnostics course scores (P = .523), indicating comparable baseline academic levels. After the intervention, the experimental group scored significantly higher on the post-test than the control group (85.6 ± 5.0 vs 81.3 ± 5.2, P < .001). For self-perceived competencies, the experimental group reported significantly higher mean scores in 6 of the 8 core domains: clinical skills and medical service ability, information collection and processing ability, medical knowledge and lifelong learning attitude, interpersonal and communication skills, teamwork and leadership ability, and health promotion and disease prevention (all P < .001); no significant differences were found in research skills or in professionalism and ethical practice. Additionally, frequency-based analysis of the Likert scale responses revealed a higher overall proportion of students in the experimental group recognizing positive impacts of the teaching model on their competency development.

Conclusion:

The “progressive blended” case-based teaching method stimulates students’ learning initiative, significantly enhances the quality of medical talent development, and facilitates continuous improvement in the clinical competencies of medical students.

Keywords: case-based teaching, clinical competencies, competency-based instructional design, outcome-based education, progressive blended learning

1. Introduction

The concept of outcome-based education, first introduced by Spady in 1981, emphasizes not only the acquisition of knowledge but also its application in real-world contexts.[1] This philosophy laid the foundation for the later development of competency-based medical education (CBME), which prioritizes measurable learning outcomes and practical competence in clinical settings.

In recent years, there has been an accelerating global shift toward CBME in response to the growing demand for clinically relevant, learner-centered approaches in undergraduate medical training.[2] This paradigm encourages not only the acquisition of theoretical knowledge but also the integration of communication, teamwork, and clinical reasoning as essential educational outcomes.[3,4]

In China, several medical schools have begun experimenting with blended instructional strategies that combine traditional lectures with case-based and digital modules to foster active learning.[5] Among these, digital platforms have increasingly supported the transition to blended learning. One such platform is Rain Classroom, a widely adopted smart teaching tool developed by Tsinghua University. It integrates seamlessly with PowerPoint and mobile devices to enable real-time interaction, in-class questioning, and personalized post-class review. By bridging synchronous and asynchronous instruction, Rain Classroom has been shown to enhance student engagement and support formative assessment in large-group settings.[6]

Case-based learning (CBL) has proven effective in developing early clinical reasoning and decision-making skills, especially when applied in preclinical and transitional-phase curricula.[7] Additionally, blended learning models – combining in-person discussion with online modules – have been reported to improve learner engagement and knowledge retention, particularly in resource-limited educational contexts.[8] These strategies are especially effective when aligned with national competency frameworks such as CanMEDS, which emphasize a broad range of physician competencies beyond medical expertise, including professionalism, communication, and lifelong learning.[9,10]

Despite these advances, limited empirical evidence exists regarding the combined application of blended and case-based teaching models within core clinical modules in Chinese medical schools. This study aims to address this gap by evaluating the effect of a progressive blended case-based teaching model on students’ perceived competency development and academic performance within an integrated urology module.

2. Methods

2.1. Participants

This study employed a randomized controlled design, enrolling 258 5th-year clinical medical students (2018 cohort) from September 2022 to January 2023. Participants were randomly assigned using a computer-generated randomization sequence to either the experimental group (n = 123) or the control group (n = 135). All students had completed the necessary foundational coursework prior to enrollment, including the standardized diagnostics course. As a core bridging subject between basic science and clinical training, diagnostics was used to represent baseline academic performance. Final grades in this course served as a proxy for students’ prior knowledge and initial clinical reasoning ability, enabling baseline comparison between groups. Informed consent was obtained from all participants. The study was approved by the Institutional Review Board and Ethics Committee of the Affiliated Hospital of North Sichuan Medical College (Approval No. 2025ER-251-1).

2.2. Study design

We chose the integrated urology course, covering diseases such as urinary system tumors, stones, and urinary obstruction, as the topic for this study because it requires students to apply foundational knowledge across diverse clinical scenarios, making it well suited for evaluating the effects of case-based and blended teaching approaches. Instructors with similar levels of experience were selected to teach the experimental and control groups. The teaching period for both groups lasted 4 consecutive weeks, with the same content and schedule delivered across both groups, for a total of 12 teaching hours per group. The post-test was administered immediately after the conclusion of the course, under the same timing and conditions for both groups.

The experimental group received instruction through a progressive blended case-based teaching model that integrated pre-class preparation, in-class interactive case discussions, and post-class consolidation. This model was implemented using the Rain Classroom platform, which enabled instructors to embed interactive questions into PowerPoint slides and push them to students’ smartphones in real time during class. Students could respond instantly, and their aggregated answers were displayed to guide immediate discussion and feedback. After class, the platform automatically sent personalized review materials – including key questions and explanations – to reinforce knowledge retention. Additionally, instructors were able to monitor student participation and comprehension through backend analytics, allowing for real-time adjustments in teaching pace and focus based on formative feedback.

Before the class, the instructor prepared lecture videos and supplementary materials for the course. Through the Rain Classroom platform, students received general guidelines for diagnosis and treatment, 3 reference papers related to the course topics, and approximately 30 minutes of video content. Each student was expected to study these materials independently during their free time outside of class. The in-class interactive case discussions were designed in a stepwise progression to enhance the practical application of clinical knowledge:

(1) Keyword extraction: using concise clinical case examples, the instructor restructured key knowledge points interactively through Rain Classroom tools. Students identified shared keywords across cases, facilitating their first exposure to real clinical scenarios while solidifying their theoretical understanding.

(2) Standardized cases: real clinical cases were introduced, accompanied by structured questions. Group discussions allowed students to collaboratively analyze and resolve complex issues, fostering problem-solving and critical thinking skills.

(3) Inquiry-based thinking: case studies simulating real diagnostic scenarios were distributed to students before class via the Rain Classroom platform. Students independently explored critical clinical challenges, leading to dynamic “student-to-student” and “teacher-to-student” debates that further stimulated active learning and engagement.

The control group followed a traditional lecture-based format, ensuring equivalent instructional time across groups. Before the lecture, students were asked to do a basic preview of the course without engaging in detailed video watching or extensive reading.

2.3. Data evaluation and statistical analysis

The instrument encompassed 8 core competency domains adapted from the CanMEDS framework,[2] including clinical skills, medical knowledge, communication, teamwork, professionalism, and research abilities. Each item was accompanied by a descriptive behavioral indicator to enhance clarity and support content validity. The initial version of the questionnaire was reviewed by a panel of 5 experts (3 senior medical educators and 2 curriculum specialists) and pilot-tested on 30 students who were not involved in the final study. Feedback was incorporated to refine the wording and ensure clarity and relevance. All items were rated on a 5-point Likert scale (1 = minimal impact; 5 = significant impact). The final version demonstrated good internal consistency, with a Cronbach’s alpha of 0.85. The full questionnaire is provided in the Supplementary Material (Table S1, Supplemental Digital Content, https://links.lww.com/MD/P676). Prior to the intervention, baseline academic performance was measured using students’ final scores from the standardized diagnostics course, a required subject bridging basic and clinical sciences. This score was obtained from the end-of-course examination, administered immediately after completion of that course and before the start of the teaching intervention. Upon completion of the instructional intervention, both groups were asked to complete the structured 5-point Likert scale questionnaire and to take a post-test consisting of 100 multiple-choice questions designed to evaluate knowledge application and diagnostic reasoning. These instruments were administered simultaneously at the end of the intervention period.
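For reference, the internal-consistency index reported above is Cronbach’s alpha, which for a k-item scale is defined by the standard formula below (stated with generic symbols rather than values from this study):

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)$$

where $k$ is the number of items, $\sigma^{2}_{Y_i}$ is the variance of item $i$, and $\sigma^{2}_{X}$ is the variance of the total questionnaire score; the reported value of 0.85 is conventionally interpreted as good internal consistency.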

Statistical analyses included descriptive statistics (means and standard deviations) to summarize self-reported competency ratings. Welch t-tests were used to compare mean Likert scores between groups. Additionally, Likert responses were dichotomized into “agree” (scores of 3–5) and “disagree” (scores of 1–2),[11] and chi-square tests were used to compare the frequency distributions of competency recognition between the 2 groups. Pretest diagnostics course scores were compared using independent-samples t-tests to verify baseline equivalence, and post-test academic performance was compared between groups in the same way. All statistical analyses were performed using SPSS version 20.0 (Chicago, IL). P values < .05 were considered statistically significant. A graphical overview of the study design is shown in Figure 1.
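For readers who wish to reproduce this analysis plan outside SPSS, the sketch below mirrors the 3 comparisons in Python (scipy). The group arrays are randomly generated placeholders, not study data, so the printed statistics are purely illustrative.

```python
# Illustrative re-implementation of the statistical plan described above.
# The original analyses were run in SPSS 20.0; the arrays below are placeholders, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
exp_likert = rng.integers(1, 6, size=123)  # hypothetical 5-point Likert ratings, experimental group
ctl_likert = rng.integers(1, 6, size=135)  # hypothetical 5-point Likert ratings, control group

# 1) Welch t-test comparing mean Likert scores between groups (equal variances not assumed)
t_welch, p_welch = stats.ttest_ind(exp_likert, ctl_likert, equal_var=False)

# 2) Dichotomize responses (agree = 3-5, disagree = 1-2) and compare frequencies with a chi-square test
contingency = [
    [int(np.sum(exp_likert >= 3)), int(np.sum(exp_likert <= 2))],
    [int(np.sum(ctl_likert >= 3)), int(np.sum(ctl_likert <= 2))],
]
chi2, p_chi2, dof, _ = stats.chi2_contingency(contingency)

# 3) Independent-samples t-test on pretest or post-test examination scores (placeholder data)
exp_scores = rng.normal(85.6, 5.0, size=123)
ctl_scores = rng.normal(81.3, 5.2, size=135)
t_ind, p_ind = stats.ttest_ind(exp_scores, ctl_scores)

print(f"Welch t = {t_welch:.2f} (P = {p_welch:.3f}); "
      f"chi2 = {chi2:.2f} (P = {p_chi2:.3f}); "
      f"independent t = {t_ind:.2f} (P = {p_ind:.3f})")
```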

Figure 1.

Flow diagram of the study design. The diagram illustrates the parallel-group, randomized controlled trial design. It outlines the distinct instructional stages (pre-class, in-class, post-class) for the experimental group (receiving progressive blended case-based teaching via Rain Classroom) and the control group (receiving traditional lecture-based teaching). The process culminates in a postintervention survey and post-test to compare outcomes between the 2 groups.

3. Results

3.1. Basic characteristics and information

A total of 258 5th-year clinical medical students were enrolled from September 2022 to January 2023. Baseline demographic and academic characteristics of the participants are presented in Table 1. The experimental group (n = 123) and control group (n = 135) showed no significant differences in age (21.1 ± 0.6 vs 21.1 ± 0.6, P = .843), gender distribution (male/female: 55/68 vs 65/70, P = .408), or prior academic performance, as measured by final grades in the diagnostics course (72.8 ± 5.3 vs 72.5 ± 5.5, P = .523). These results confirm baseline equivalence between the groups before the instructional intervention.

Table 1.

Demographic and baseline academic performance characteristics of all the participants.

Item Experimental group (N = 123) Control group (N = 135) Statistics P-value
Gender, n(%)
 Male 55 (44.7%) 65 (48.2%) χ2 = 0.684 .408
 Female 68 (55.3%) 70 (51.9%)
Age (mean ± SD) 21.1 ± 0.6 21.1 ± 0.6 t = 0.197 .843
Final grade in diagnostics (mean ± SD) 72.8 ± 5.3 72.5 ± 5.5 t = 0.639 .523

SD = standard deviation.

3.2. The comparison of self-perceived competency ratings based on Likert scale responses

Students’ perceived development across the 8 core competencies was assessed using the 5-point Likert scale questionnaire. In the experimental group, the mean self-perceived competency scores for clinical skills and medical service ability, information collection and processing ability, medical knowledge and lifelong learning attitude, interpersonal and communication skills, teamwork and leadership ability, and health promotion and disease prevention were 4.3 ± 0.5, 4.3 ± 0.5, 4.4 ± 0.5, 4.3 ± 0.5, 4.2 ± 0.5, and 4.4 ± 0.5, respectively; the corresponding scores in the control group were 3.8 ± 0.6, 3.9 ± 0.5, 4.1 ± 0.6, 4.0 ± 0.6, 3.9 ± 0.6, and 4.0 ± 0.6. For these 6 domains, the experimental group’s self-perceived competency scores were significantly higher than the control group’s (all P < .001). No significant differences were observed in students’ self-ratings for research skills (3.5 ± 0.5 vs 3.4 ± 0.5, P = .126) or professionalism and ethics (3.7 ± 0.6 vs 3.6 ± 0.5, P = .854) (shown in Table 2).

Table 2.

Mean scores of students’ self-perceived competency development based on Likert scale responses.

Primary evaluation item Experimental group (N = 123) Control group (N = 135) t P-value
Clinical skills and medical service abilities 4.3 ± 0.5 3.8 ± 0.6 6.454 <.001
Information collection and processing abilities 4.3 ± 0.5 3.9 ± 0.5 8.196 <.001
Medical knowledge and lifelong learning attitude 4.4 ± 0.5 4.1 ± 0.6 4.428 <.001
Interpersonal and communication skills 4.3 ± 0.5 4.0 ± 0.6 3.986 <.001
Teamwork and leadership abilities 4.2 ± 0.5 3.9 ± 0.6 5.614 <.001
Research capabilities 3.5 ± 0.5 3.4 ± 0.5 1.534 .126
Health promotion and disease prevention 4.4 ± 0.5 4.0 ± 0.6 4.104 <.001
Professionalism and ethical practice 3.7 ± 0.6 3.6 ± 0.6 0.185 .854

3.3. The comparison of the frequency distribution of students’ perceived competency improvement based on Likert scale categorization

The frequency-based analysis of perceived competency improvement is presented in Table 3. A significantly higher proportion of students in the experimental group reported improvement in clinical skills and medical service ability (94.3% vs 86.7%, P = .038), information collection and processing ability (96.8% vs 89.6%, P = .025), medical knowledge and lifelong learning attitude (98.4% vs 91.9%, P = .017), interpersonal and communication skills (95.9% vs 88.9%, P = .035), teamwork and leadership ability (97.6% vs 87.4%, P = .002), and health promotion and disease prevention (94.3% vs 85.2%, P = .017) than in the control group. No significant differences were observed in research skills (P = .348) and professionalism and ethics (P = .370).

Table 3.

Frequency distribution of perceived competency improvement.

Competency domain Experimental group (N = 123) Control group (N = 135) P-value
Agree Disagree Agree Disagree
Clinical skills and medical service abilities 94.3% (116/123) 5.7% (7/123) 86.7% (117/135) 13.3% (18/135) .038
Information collection and processing abilities 96.8% (119/123) 3.3% (4/123) 89.6% (121/135) 10.4% (14/135) .025
Medical knowledge and lifelong learning attitude 98.4% (121/123) 1.6% (2/123) 91.9% (124/135) 8.2% (11/135) .017
Interpersonal and communication skills 95.9% (118/123) 4.0% (5/123) 88.9% (120/135) 11.1% (15/135) .035
Teamwork and leadership abilities 97.6% (120/123) 2.4% (3/123) 87.4% (118/135) 12.6% (17/135) .002
Research capabilities 87.8% (108/123) 12.2% (15/123) 83.7% (113/135) 16.3% (22/135) .348
Health promotion and disease prevention 94.3% (116/123) 5.7% (7/123) 85.2% (115/135) 14.8% (20/135) .017
Professionalism and ethical practice 99.2% (122/123) 0.8% (1/123) 91.1% (123/135) 8.9% (12/135) .370

Agreement was defined as Likert scores of 3 to 5, and disagreement as scores of 1 to 2. Percentages represent the proportion of students who agreed or disagreed with each competency item in their respective group. Statistical comparisons were conducted using Chi-square tests.
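As a spot check of Table 3, and under the assumption that the reported chi-square tests were Pearson tests without continuity correction (the paper does not state this), the teamwork and leadership row can be recomputed directly from its agree/disagree counts; the result is consistent with the reported P = .002.

```python
# Recompute the chi-square for the "teamwork and leadership abilities" row of Table 3
# from its reported counts, assuming a Pearson chi-square without continuity correction.
from scipy.stats import chi2_contingency

table = [[120, 3],    # experimental group: agree, disagree
         [118, 17]]   # control group: agree, disagree
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, df = {dof}, P = {p:.4f}")  # roughly chi2 = 9.28, P = .0023
```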

3.4. The comparison of academic scores between groups before and after the intervention

The comparison of academic performance between groups is presented in Table 4. The pretest scores, based on final grades in the diagnostics course, showed no statistically significant difference between the experimental and control groups (72.8 ± 5.3 vs. 72.5 ± 5.5, P = .523), indicating comparable academic baselines. In contrast, post-test scores were significantly higher in the experimental group (85.6 ± 5.0) than in the control group (81.3 ± 5.2) (P < .001).

Table 4.

The comparison of academic scores between groups before and after the intervention.

Item Experimental group (N = 123) Control group (N = 135) t P-value
Final grade in diagnostics 72.8 ± 5.3 72.5 ± 5.5 0.639 .523
Post-test 85.6 ± 5.0 81.3 ± 5.2 6.845 <.001

4. Discussion

This study aimed to evaluate the effectiveness of a progressive blended case-based teaching model in improving clinical competencies among medical students. The findings demonstrated that students in the experimental group exhibited significantly higher scores in clinical skills, information processing, communication, and teamwork abilities compared to those in the control group. Furthermore, the frequency-based analysis of Likert scale responses showed that a higher proportion of students in the experimental group perceived positive effects of the teaching model on their competency development. In addition, the post-test scores confirmed that students in the experimental group achieved significantly better academic performance than those in the control group, further validating the effectiveness of this model. These results suggest that integrating structured case discussions, interactive learning, and digital teaching tools can foster both theoretical understanding and practical application in medical education.

However, the study also revealed no significant improvement in research skills or professionalism and ethical practice. These findings indicate that while case-based progressive learning is effective in strengthening core competencies, additional instructional strategies may be required to enhance students’ research capabilities and professional identity formation. Previous studies have similarly reported that traditional case-based or problem-based learning (PBL) approaches are more effective in developing clinical reasoning than in promoting scientific inquiry or ethical decision-making.[12]

Given the increasing emphasis on CBME worldwide, it is crucial to explore more comprehensive teaching strategies that integrate research methodology, ethical reasoning, and interprofessional collaboration into the existing curriculum.[13] The following discussion examines the implications of these findings, compares the effectiveness of this model to other teaching approaches, and outlines potential refinements for future implementation.

The results from both the Likert-based mean score analysis and frequency-based distribution analysis consistently showed that students in the experimental group reported higher levels of self-perceived improvement in core competencies. Specifically, greater self-reported gains were observed in clinical skills, information collection and processing, and teamwork abilities. These findings suggest that the progressive blended case-based model may be more effective than traditional lectures in promoting students’ confidence and perceived growth in these domains.

One key factor contributing to these improvements is the interactive case discussions, which require students to actively synthesize and apply knowledge rather than passively receive information. Prior research has established that CBL and team-based learning (TBL) stimulate cognitive engagement and promote long-term knowledge retention.[14] Our results align with these findings, suggesting that this structured, progressive approach effectively facilitates the transition from theoretical understanding to practical application.

Furthermore, students in the experimental group reported higher self-perceived competencies in interpersonal communication and teamwork. These outcomes were likely influenced by the collaborative case discussions and structured peer feedback embedded in the teaching model, which encouraged interaction and exchange of clinical reasoning processes. Prior studies have also shown that such interactive learning environments help cultivate professional communication and collaborative skills, which are essential in clinical settings.[15]

In addition, more students in the experimental group perceived improvement in competencies related to health promotion and disease prevention. This may be attributed to the integration of real-world case scenarios that addressed public health dimensions, thereby enhancing students’ awareness of preventive care and system-level thinking. These findings support the incorporation of public health perspectives within competency-based teaching frameworks.[16]

Despite the significant improvements observed in most core competencies, our study found no significant enhancement in research capabilities or professionalism and ethical practice. These findings warrant further discussion, as they reveal critical gaps in the current educational approach.

One possible explanation for the lack of improvement in research skills is that the curriculum primarily focused on clinical problem-solving rather than scientific inquiry. Research competency requires structured exposure to study design, data analysis, and evidence-based medicine methodologies, which were not explicitly incorporated into this teaching model. Prior studies have reported that medical students often struggle to develop research skills without dedicated research mentorship and hands-on academic projects.[17] To address this gap, future iterations of this teaching model should integrate dedicated research methodology workshops, structured scientific writing exercises, and guided clinical research opportunities to enhance students’ academic inquiry abilities.

Similarly, the lack of improvement in professionalism and ethical decision-making may be attributed to the short duration of the study and the complexity of professional identity formation. Unlike clinical knowledge and procedural skills, professionalism develops gradually through long-term mentorship, ethical case discussions, and real-world clinical exposure.[18] While case-based discussions provide opportunities for ethical reasoning, they may not fully replicate the nuanced ethical challenges encountered in real clinical practice. Future curriculum designs could incorporate longitudinal mentoring programs, structured ethical reflection sessions, and interprofessional education components to better foster professional identity formation.

Taken together, the self-reported outcomes from both mean score and categorical distribution analyses provide consistent evidence that students perceived greater competency gains under the progressive blended case-based teaching model. However, these results reflect students’ perceptions and do not constitute objective assessments of skill acquisition. Nonetheless, the lack of significant improvement in research skills and professional ethics highlights the need for additional instructional interventions. Future curriculum refinements could include structured research mentorship, ethics workshops, and interdisciplinary learning modules to further enhance these competencies.

Our study reinforces the growing recognition that CBME must extend beyond traditional knowledge transmission to include active learning, collaborative problem-solving, and real-world application. While our findings confirm that progressive blended CBL enhances clinical competencies, they also underscore the need to incorporate research training and professionalism development as integral components of medical education.

To further refine this model, future studies should evaluate its long-term impact on clinical performance by conducting multi-center trials across different medical institutions. Additionally, integrating digital learning tools (such as AI-driven adaptive learning platforms) and interprofessional education modules may further enhance the effectiveness of competency-based training in diverse healthcare environments.[19]

Medical education has widely adopted various instructional models, including PBL, TBL, and traditional lecture-based teaching, each with distinct pedagogical characteristics.

PBL fosters self-directed learning by encouraging students to identify learning objectives independently before engaging in group discussions. While this approach enhances problem-solving abilities, its effectiveness depends on students’ prior knowledge and self-regulation skills, potentially leading to inconsistencies in learning outcomes.[20] TBL follows a structured, team-based format that includes pre-class preparation, readiness assessments, and collaborative decision-making activities. It has been shown to improve teamwork and communication skills, yet its rigid framework may limit individual adaptability and personalized learning experiences.[21] Traditional lecture-based teaching, while efficient for large-scale knowledge dissemination, often promotes passive learning, lacking the interactive components necessary to develop critical thinking and clinical reasoning.[22]

Unlike these models, the progressive blended case-based teaching model incorporates structured case discussions and stepwise progression, integrating guided knowledge acquisition, interactive learning, and post-class reflection. Compared to PBL, it provides more structured guidance, reducing cognitive overload. Compared to TBL, it offers greater flexibility by allowing individualized learning while maintaining collaborative problem-solving. In contrast to traditional lectures, it promotes active engagement and deeper clinical reasoning through interactive case-based scenarios.

While the use of diagnostics course grades served as a consistent proxy for students’ baseline academic competence, this measure also has limitations. These grades reflect prior learning but are not specifically aligned with the competencies targeted by the teaching model. In addition, potential variation in grading standards and instructional approaches may introduce uncontrolled variability.

This study has several other limitations. Although students were randomly assigned, individual differences in motivation, learning styles, and engagement could have influenced the self-reported outcomes. Instructor teaching styles, classroom dynamics, and peer interactions were not fully standardized, which may have introduced unintended bias. The reliance on self-assessment tools captures perceived rather than actual performance. Finally, external factors – such as prior clinical exposure or differing interpretations of questionnaire items – could not be fully controlled and may have affected the responses.

5. Conclusion

This study suggests that the progressive blended case-based teaching model is associated with improved self-reported competencies among medical students, particularly in clinical skills, information processing, communication, teamwork, and system-based practice. These findings are based on structured Likert scale questionnaires and reflect students’ perceived development rather than objectively measured outcomes. Meanwhile, limited self-reported improvement was observed in research skills and professionalism, indicating areas for further curricular enhancement. Compared to PBL, TBL, and traditional lecture-based teaching, this model offers a structured yet flexible framework that integrates guided learning, interactive case discussions, and digital tools to foster student engagement and perceived competency growth.

Author contributions

Conceptualization: Binglei Jiang.

Data curation: Rong Lu.

Formal analysis: Ben Wang.

Investigation: Yixi He.

Methodology: Kun Yang.

Project administration: Pan Zhao.

Software: Qiang Zhang.

Writing – original draft: Pingyu Zhu.

Writing – review & editing: Bo Huang.

Supplementary Material

medi-104-e43920-s001.docx (51.7KB, docx)

Abbreviations:

AI = artificial intelligence, CBL = case-based learning, CBME = competency-based medical education, OBE = outcome-based education, PBL = problem-based learning, SPSS = Statistical Package for the Social Sciences, TBL = team-based learning.

This study was supported by the Sichuan Provincial Higher Education Talent Training Quality and Teaching Reform Project (2024–2026), “Exploration and Practice of a New Model for Enhancing Undergraduate Clinical Medicine Practical Competence through Industry-Education Integration and Research-Innovation Dual Drivers” (Grant No. JG2024-0941), and the Sichuan Provincial First-Class Undergraduate Course, “Structure, Function, and Diseases of the Urological and Male Reproductive System” (Grant No. YLKC02211). These grants provided financial support and assisted with this study’s implementation.

The authors have no conflicts of interest to disclose.

The datasets generated during and/or analyzed during the current study are not publicly available, but are available from the corresponding author on reasonable request.

Supplemental Digital Content is available for this article.

How to cite this article: Zhu P, Jiang B, Lu R, Wang B, He Y, Yang K, Zhang Q, Zhao P, Huang B. A competency-based instructional design: A progressive blended case-based teaching method. Medicine 2025;104:33(e43920).

BJ and RL contributed to this article equally.

Contributor Information

Pingyu Zhu, Email: zhupingyu@nsmc.edu.cn.

Binglei Jiang, Email: 280311791@qq.com.

Rong Lu, Email: 305772428@qq.com.

Ben Wang, Email: 774384020@qq.com.

Yixi He, Email: 604272721@qq.com.

Kun Yang, Email: 979834606@qq.com.

Qiang Zhang, Email: 877135638@qq.com.

Pan Zhao, Email: 66716891@qq.com.

References

[1]. Spady W. Outcome-Based Education: Critical Issues and Answers. The American Association of School Administrators; 1994.
[2]. Frank JR, Snell L, Sherbino J, eds. CanMEDS 2015 Physician Competency Framework. Royal College of Physicians and Surgeons of Canada; 2015.
[3]. Donkin R, Yule H, Fyfe T. Online case-based learning in medical education: a scoping review. BMC Med Educ. 2023;23:165.
[4]. Lim JJ, Veasuvalingam B. Does online case-based learning foster clinical reasoning skills? A mixed-methods study. Future Healthc J. 2025;12:100210–41.
[5]. Chen L, Tang XJ, Chen XK, Ke N, Liu Q. Effect of the BOPPPS model combined with case-based learning versus lecture-based learning on ophthalmology education for five-year paediatric undergraduates in Southwest China. BMC Med Educ. 2022;22:586.
[6]. Li X, Li Y, Li X, Chen X, Yang G, Yang L. Comparison of case-based learning combined with Rain Classroom teaching and traditional method in complete denture course for undergraduate interns. BMC Med Educ. 2022;22:610.
[7]. Surapaneni KM. Innovative self-directed, problem-oriented, lifelong learning, integrated clinical case exercise (SPLICE) modules promote critical thinking skills, early clinical exposure, and contextual learning among first professional-year medical students. Adv Physiol Educ. 2024;48:69–79.
[8]. Naidoo N, Akhras A, Banerjee Y. Confronting the challenges of anatomy education in a competency-based medical curriculum during normal and unprecedented times (COVID-19 pandemic): pedagogical framework development and implementation. JMIR Med Educ. 2020;6:e21701.
[9]. Junge HE. Competency-Based Under- and Postgraduate Medical Education in General Practice – An Evaluation of Tools and Outcomes [dissertation]. Homburg/Saar: Saarland University; 2023.
[10]. Nayak KR, Nayak V, Punja D, Badyal DK, Modi JN. Simulated patient videos to supplement integrated teaching in competency-based undergraduate medical curriculum. Adv Physiol Educ. 2023;47:296–306.
[11]. Alkharusi H. A descriptive analysis and interpretation of data from Likert scales in educational and psychological research. Ind J Psychol Educ. 2022;12:13–6.
[12]. Setia S, Bobby Z, Ananthanarayanan P, Radhika M, Kavitha M, Prashanth T. Case based learning versus problem based learning: a direct comparison from first year medical students’ perspective. Med Educ. 2011;2:WMC001976.
[13]. Shah N, Desai C, Jorwekar G, Badyal D, Singh T. Competency-based medical education: an overview and application in pharmacology. Indian J Pharmacol. 2016;48(Suppl 1):S5–9.
[14]. Thistlethwaite JE, Davies D, Ekeocha S, et al. The effectiveness of case-based learning in health professional education: a BEME systematic review: BEME Guide No. 23. Med Teach. 2012;34:e421–444.
[15]. Lam R. What students do when encountering failure in collaborative tasks. npj Sci Learn. 2019;4:6.
[16]. Alzayani S, Alsayyad A, Al-Roomi K, Almarabheh A. Innovations in medical education during the COVID-19 era and beyond: medical students’ perspectives on the transformation of real public health visits into virtual format. Front Public Health. 2022;10:883003.
[17]. Zhang G, Wu H, Xie A, Cheng H. The association between medical student research engagement with learning outcomes. Med Educ Online. 2022;27:2100039.
[18]. Omega A, Wijaya Ramlan AA, Soenarto RF, Heriwardito A, Sugiarto A. Assessing clinical reasoning in airway related cases among anesthesiology fellow residents using Script Concordance Test (SCT). Med Educ Online. 2022;27:2135421.
[19]. Yang Y, Ye Z, Su Y, Zhao Q, Li X, Ouyang D. Deep learning for in vitro prediction of pharmaceutical formulations. Acta Pharm Sin B. 2019;9:177–85.
[20]. Dolmans DH, De Grave W, Wolfhagen IH, van der Vleuten CP. Problem-based learning: future challenges for educational practice and research. Med Educ. 2005;39:732–41.
[21]. Michaelsen LK, Watson WE, Cragin JP, Dee Fink L. Team learning: a potential solution to the problems of large classes. Exchange. 1982;7:13–22.
[22]. Hwang GJ, Yang CL, Chou KR, Chang CY. An MDRE approach to promoting students’ learning performances in the era of the pandemic: a quasi-experimental design. Br J Educ Technol. 2022;53:1706–23.
