Abstract
Background
The Objective Structured Clinical Examination (OSCE) is a pivotal tool for assessing health care professionals and plays an integral role in medical education.
Objective
This study aims to map the bibliometric landscape of OSCE research, highlighting trends and key influencers.
Methods
A comprehensive literature search was conducted for materials related to OSCE from January 2004 to December 2023, using the Web of Science Core Collection database. Bibliometric analysis and visualization were performed with VOSviewer and CiteSpace software tools.
Results
Our analysis indicates a consistent increase in OSCE-related publications over the study period, with a notable surge after 2019, culminating in a peak of activity in 2021. The United States emerged as a significant contributor, responsible for 30.86% (1626/5268) of total publications and amassing 44,051 citations. Coauthorship network analysis highlighted robust collaborations, particularly between the United States and the United Kingdom. Leading journals in this domain—BMC Medical Education, Medical Education, Academic Medicine, and Medical Teacher—featured the highest volume of papers, while The Lancet garnered substantial citations, reflecting its high impact factor. Prominent authors in the field include Sondra Zabar, Debra Pugh, Timothy J Wood, and Susan Humphrey-Murto, with Ronald M Harden, Brian D Hodges, and George E Miller being the most cited. The analysis of key research terms revealed a focus on “education,” “performance,” “competence,” and “skills,” indicating these are central themes in OSCE research.
Conclusions
The study underscores a dynamic expansion in OSCE research and international collaboration, spotlighting influential countries, institutions, authors, and journals. These elements are instrumental in steering the evolution of medical education assessment practices and suggest a trajectory for future research endeavors. Future work should consider the implications of these findings for medical education and the potential areas for further investigation, particularly in underrepresented regions or emerging competencies in health care training.
Keywords: Objective Structured Clinical Examination, OSCE, medical education assessment, bibliometric analysis, academic collaboration, health care professional training, medical education, medical knowledge, medical training, medical student
Introduction
Objective Structured Clinical Examinations (OSCEs) have emerged as indispensable tools for assessing health care professionals, providing structured evaluations of clinical competencies, communication skills, and decision-making abilities [1,2]. Despite their widespread adoption since the 1970s, the landscape of OSCE research remains multifaceted and dynamic, reflecting ongoing innovations in medical, nursing, and allied health education [3].
While numerous studies have explored various aspects of OSCEs, gaps persist in our understanding of the overarching trends and global dynamics shaping this field. A comprehensive review of the existing literature highlights the need for a systematic approach to mapping the knowledge landscape and identifying emerging trends through bibliometric analysis [4-6]. By applying quantitative methods to scholarly publications, bibliometric analysis offers a unique opportunity to uncover hidden patterns, elucidate research trajectories, and forecast future directions in OSCE research.
Building on this rationale, our study aims to bridge these gaps by conducting a bibliometric analysis of OSCE literature from 2004 to 2023. We hypothesize that this analysis will reveal distinct patterns of publication output, collaboration networks, and thematic clusters within the OSCE research domain. Specifically, we seek to (1) identify key research themes, including but not limited to assessment methodologies, educational interventions, and technological innovations in OSCEs; (2) map the global distribution of OSCE research, highlighting geographic hotspots and areas of collaboration; and (3) explore the interconnections between different disciplines within medical education, shedding light on interdisciplinary collaborations and knowledge diffusion.
By elucidating these aspects, our study aims to provide stakeholders in medical education with valuable insights into the current state and future directions of OSCE research. Ultimately, this knowledge mapping exercise seeks to inform evidence-based decision-making, guide educational practices, and stimulate further research in the field of clinical skills assessment.
Methods
Data Acquisition and Search Strategy
The bibliographic accuracy of literature types in the Web of Science Core Collection (WoSCC) database is superior to that of other databases, making it an optimal choice for conducting literature analysis [7,8]. We therefore performed our search within this database. We searched the Web of Science (WoS) for all relevant papers published between January 1, 2004, and December 31, 2023, using the search formula “(TS=(The Objective Structured Clinical Examination)) OR TS=(OSCE)”. Literature screening was based on the following inclusion criteria: (1) full-text publications related to OSCEs; (2) papers and review manuscripts written in English; and (3) papers published between January 1, 2004, and December 31, 2023. The exclusion criteria were (1) topics not related to OSCEs and (2) papers in the form of conference abstracts, news briefs, and so on. A plain-text version of the records was exported.
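To make the screening step concrete, the following minimal Python sketch filters records from a WoS plain-text export by document type, language, and publication year. It assumes the standard WoS field tags (DT for document type, LA for language, PY for publication year, and ER marking the end of a record) and a hypothetical export file named savedrecs.txt; the actual screening in this study followed the criteria listed above.

```python
# Minimal sketch: screen WoS plain-text export records by document type,
# language, and publication year. Assumes the standard WoS field tags and a
# hypothetical export file "savedrecs.txt".

def read_records(path):
    """Yield each record as a dict of {field_tag: value}."""
    record, last_tag = {}, None
    with open(path, encoding="utf-8-sig") as fh:
        for line in fh:
            line = line.rstrip("\n")
            if line.startswith("ER"):          # end of record
                yield record
                record, last_tag = {}, None
            elif line[:2].strip():             # new two-letter field tag
                last_tag = line[:2]
                record[last_tag] = line[3:].strip()
            elif last_tag:                     # continuation line of the previous field
                record[last_tag] += " " + line.strip()

def keep(rec):
    doc_type = rec.get("DT", "")
    language = rec.get("LA", "")
    year = int(rec.get("PY", "0") or 0)
    return (
        ("Article" in doc_type or "Review" in doc_type)  # papers and reviews only
        and "English" in language                        # English-language only
        and 2004 <= year <= 2023                          # study window
    )

included = [r for r in read_records("savedrecs.txt") if keep(r)]
print(f"{len(included)} records retained after screening")
```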
General Data
Figure 1 shows the process of literature searching and bibliometric analysis. From January 1, 2004, to December 31, 2023, there were a total of 5268 publications related to the OSCE in the WoSCC database, including 4476 papers (84.96%) and 792 reviews (15.04%). The literature involved 133 countries and regions, 5291 institutions, and 24,478 authors.
Data Analysis
To depict annual publication trends and the distribution of national contributions, we used GraphPad Prism (version 8.0.2; Dotmatics). For the bibliometric analysis and the visualization of scientific knowledge maps, the study used both CiteSpace (6.2.4R, 64-bit advanced edition; Chaomei Chen, Drexel University) [10] and VOSviewer (version 1.6.18; Leiden University) [9]. These tools were selected for their robustness in handling extensive bibliometric data and their ability to graphically represent complex networks.
VOSviewer, a Java-based software tool introduced by van Eck and Waltman [9] in 2009, facilitates the construction of various types of network maps, such as bibliographic coupling, cocitation, and coauthorship networks. CiteSpace, developed by Professor Chaomei Chen, provides a dynamic, computer-based platform for identifying and visualizing patterns and trends in the scientific literature, thereby enabling the exploration of knowledge domains and predictive analysis of research trajectories [10].
Our methodological approach within these applications involved setting specific parameters for network density, threshold values for the inclusion of nodes, and time-slicing techniques to analyze temporal changes. The references corresponding to the software applications were verified against our citation list to ensure accuracy [9,10].
For the country-level analyses performed with VOSviewer and CiteSpace, collaborations between countries were defined by the countries of the first authors and corresponding authors listed in the paper bylines. This approach was chosen to capture collaborative work between researchers from different countries.
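As a rough illustration of this operational definition, the sketch below infers a paper's collaborating countries from its first-author and corresponding-author addresses, assuming the country is the final element of each WoS-style address string; the addresses and helper function are hypothetical, and VOSviewer and CiteSpace perform the equivalent extraction internally from the exported records.

```python
# Minimal sketch: classify a paper as an international collaboration based on
# the countries of its first and corresponding authors. Assumes the country is
# the last token of each WoS-style address; the addresses are hypothetical.
def country_of(address):
    # Take the last comma-separated element, then its last whitespace token
    # (handles "... New York, NY 10016 USA" as well as "... Leeds, England").
    return address.rstrip(".").split(",")[-1].split()[-1]

first_author_address = "NYU Grossman Sch Med, Dept Med, New York, NY 10016 USA"
corresponding_address = "Univ Leeds, Leeds Inst Med Educ, Leeds LS2 9JT, England"

countries = {country_of(first_author_address), country_of(corresponding_address)}
label = "international collaboration" if len(countries) > 1 else "single-country paper"
print(sorted(countries), "->", label)
```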
Burst detection in CiteSpace uses the Kleinberg algorithm, which models a document stream with an infinite-state automaton to extract meaningful structure from documents that arrive continuously over time [11]. These analyses can reveal fast-growing topics that persist for multiple years as well as those that last only a single year.
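To illustrate the principle, the sketch below implements a simplified two-state variant of Kleinberg's burst detection over yearly citation counts, using a Viterbi search over a base-rate state and an elevated-rate state. The yearly counts are hypothetical, and CiteSpace's actual implementation uses a different parameterization, so this is only an illustration of the idea.

```python
# Simplified two-state Kleinberg burst detection over yearly counts
# (illustrative only; CiteSpace's parameterization differs).
import math

def burst_years(years, hits, totals, s=2.0, gamma=1.0):
    """Return the years assigned to the elevated ('burst') state.

    hits[i]   -- citations to the item of interest in years[i]
    totals[i] -- total citations recorded in the data set in years[i]
    s         -- factor by which the burst rate exceeds the base rate
    gamma     -- cost scaling for entering the burst state
    """
    n = len(years)
    p0 = sum(hits) / sum(totals)      # base citation rate
    p1 = min(s * p0, 0.9999)          # elevated (burst) citation rate

    def emit_cost(p, r, d):
        # Negative binomial log-likelihood (constant term dropped).
        return -(r * math.log(p) + (d - r) * math.log(1.0 - p))

    trans = gamma * math.log(n)       # cost of switching from base to burst state
    INF = float("inf")
    cost = [[INF, INF] for _ in range(n)]
    back = [[0, 0] for _ in range(n)]
    cost[0][0] = emit_cost(p0, hits[0], totals[0])
    cost[0][1] = trans + emit_cost(p1, hits[0], totals[0])

    for t in range(1, n):             # Viterbi search over the two states
        for q, p in enumerate((p0, p1)):
            cand = [cost[t - 1][prev] + (trans if q > prev else 0.0) for prev in (0, 1)]
            prev_best = 0 if cand[0] <= cand[1] else 1
            cost[t][q] = cand[prev_best] + emit_cost(p, hits[t], totals[t])
            back[t][q] = prev_best

    state = 0 if cost[-1][0] <= cost[-1][1] else 1
    states = [state]
    for t in range(n - 1, 0, -1):
        state = back[t][state]
        states.append(state)
    states.reverse()
    return [year for year, q in zip(years, states) if q == 1]

# Hypothetical example: a reference cited heavily from 2020 to 2022.
years = list(range(2015, 2024))
hits = [1, 1, 2, 1, 2, 18, 25, 22, 3]
totals = [300, 320, 340, 360, 380, 400, 420, 430, 440]
print(burst_years(years, hits, totals))   # expected to flag the 2020-2022 window
```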
Rationale for Analysis Selection
The aforementioned techniques were chosen a priori due to their widespread use and effectiveness in bibliometric studies. They provide robust and complementary insights into productivity, impact, and collaborative patterns within the research field.
Results
Publication Trend
Since 2004, there has been a gradual increase in the number of papers published annually (Figure 2A). We divided this trend into 3 periods: from 2004 to 2010, growth was slow, with fewer than 150 papers published per year, indicating that the field had not yet captured researchers’ attention. From 2011 to 2018, the volume of publications gradually increased, indicating growing interest in the field. After 2019, the number of publications rose rapidly, peaking in 2021, which suggests that the field has since received widespread attention.
Country or Region and Institution Contributions
Figure 2B and C show the annual number of publications from the top 10 countries over the past decade. The top 5 countries in the field are, in order, the United States, the United Kingdom, Canada, Germany, and China. The United States accounts for 30.86% (1626/5268) of the total volume of publications, significantly surpassing the other countries.
Among the top 10 countries or regions in terms of the number of published papers, the United States had a citation count of 44,051, far exceeding all other countries or regions. Its citation-per-publication ratio (27.13) ranks third among all countries or regions, suggesting that its publications are generally of high quality. The United Kingdom had the second-highest number of published papers (576 papers) and ranked second in citation count (15,929 citations). The cooperation network (Figure 3A) indicates close collaboration between the United States and the United Kingdom, the two highest producers.
A total of 5291 institutions have published papers related to the OSCE. Among the top 10 institutions in terms of publication volume, 6 are from the United States, 2 are from the United Kingdom, and 2 are from Canada (Figure 3B).
Journals’ Contributions
Tables 1 and 2 list the top 10 journals with the highest outputs and the most cocitations, respectively. BMC Medical Education published the most papers in the field (227/5268, 4.31%), followed by Medical Teacher (179/5268, 3.40%), Medical Education (132/5268, 2.51%), and Journal of Surgical Education (66/5268, 1.25%). Among the top 10 most productive journals, Academic Medicine has the highest impact factor at 7.4. All 10 journals fall within the Q1 or Q2 quartiles.
Table 1. The top 10 journals publishing OSCE-related research.
Rank | Journals | Papers (N=5268), n (%) | IFa | Quartile in category |
1 | BMC Medical Education | 227 (4.31) | 3.6 | Q1 |
2 | Medical Teacher | 179 (3.40) | 4.7 | Q1 |
3 | Medical Education | 132 (2.51) | 7.1 | Q1 |
4 | Journal of Surgical Education | 66 (1.25) | 2.9 | Q2 |
5 | Academic Medicine | 64 (1.21) | 7.4 | Q1 |
6 | Patient Education and Counseling | 64 (1.21) | 3.5 | Q2 |
7 | Advances in Health Sciences Education | 60 (1.14) | 4.0 | Q1 |
8 | American Journal of Pharmaceutical Education | 59 (1.12) | 3.3 | Q2 |
9 | PLoS One | 59 (1.12) | 3.7 | Q2 |
10 | Nurse Education Today | 56 (1.06) | 3.9 | Q1 |
aIF: impact factor.
Table 2. The top 10 cocited journals.
Rank | Cited journals | Cocitations, n | IFa (2020) | Quartile in category |
1 | Medical Education | 1868 | 4.7 | Q1 |
2 | Academic Medicine | 1775 | 7.4 | Q1 |
3 | Medical Teacher | 1597 | 4.7 | Q1 |
4 | BMC Medical Education | 941 | 3.6 | Q1 |
5 | JAMA—Journal of the American Medical Association | 931 | 120.7 | Q1 |
6 | British Medical Journal | 827 | 107.7 | Q1 |
7 | Advances in Health Sciences Education | 802 | 4.0 | Q1 |
8 | The Lancet | 697 | 168.9 | Q1 |
9 | New England Journal of Medicine | 694 | 158.5 | Q1 |
10 | Teaching and Learning in Medicine | 599 | 2.5 | Q3 |
aIF: impact factor.
The influence of a journal can be gauged by how frequently it is cocited, which indicates the extent of its impact on the scientific community. According to Table 2, the most commonly cocited journal is Medical Education with 1868 cocitations, followed by Academic Medicine with 1775 and Medical Teacher with 1597. Among the top 10 journals by cocitation count, The Lancet was cocited 697 times and has the highest impact factor (168.9) within these top journals. All but one of the most cocited journals are in the Q1 zone; the exception is Teaching and Learning in Medicine (Q3).
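As a brief illustration of how such cocitation counts are derived, the sketch below tallies how often journals appear, alone and in pairs, across the reference lists of the retrieved papers. The per-paper reference lists shown here are hypothetical; VOSviewer computes the equivalent counts from the cited-reference field of the exported records.

```python
# Minimal sketch of journal cocitation counting: two journals are cocited when
# both appear in the reference list of the same paper. The per-paper reference
# lists below are hypothetical.
from collections import Counter
from itertools import combinations

reference_journals = [
    {"Medical Education", "Academic Medicine", "Medical Teacher"},
    {"Medical Education", "BMC Medical Education"},
    {"Academic Medicine", "Medical Education", "The Lancet"},
]

citations = Counter()    # how often each journal is cited by papers in the set
cocitations = Counter()  # how often each pair of journals is cited together

for journals in reference_journals:
    citations.update(journals)
    cocitations.update(combinations(sorted(journals), 2))

print(citations.most_common(3))
print(cocitations.most_common(3))
```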
Authors and Cocited Authors' Contributions
Among all authors who have published OSCE-related literature, Table 3 lists the top 10 with the most published papers. Together, these top 10 authors have published 185 papers, accounting for 3.51% of all papers (N=5268) in the field. Sondra Zabar has 26 publications, the highest number, followed by Debra Pugh with 22, Timothy J Wood with 20, and Susan Humphrey-Murto with 19. Further analysis indicates that among the top 10 ranked authors, 4 are from the United States, 3 are from Canada, 2 are from Australia, and 1 is from China. CiteSpace visualizes the network of relationships between authors (Figure 4).
Table 3. The top 10 most productive authors.
Rank | Authors | Papers, n | Locations |
1 | Zabar, Sondra | 26 | United States |
2 | Pugh, Debra | 22 | Canada |
3 | Wood, Timothy J | 20 | Canada |
4 | Humphrey-Murto, Susan | 19 | Canada |
5 | Gillespie, Colleen | 17 | United States |
6 | Shulruf, Boaz | 17 | Australia |
7 | Yang, Ying-Ying | 17 | China |
8 | Durning, Steven J | 16 | United States |
9 | Fuller, Richard | 16 | Australia |
10 | Park, Yoon Soo | 15 | United States |
Table 4. The top 10 cocited authors.
Rank | Cocited authors | Citations, n |
1 | Harden, Ronald M | 751 |
2 | Hodges, Brian D | 330 |
3 | Miller, George E | 222 |
4 | Epstein, Ronald M | 194 |
5 | van der Vleuten, Cees PM | 173 |
6 | Wass, Valerie | 172 |
7 | Khan, Kamran Z | 164 |
8 | Regehr, Glenn | 162 |
9 | Cook, David A | 160 |
10 | Downing, Steven M | 156 |
Table 4 displays the top 10 most cocited authors. A total of 148 authors have been cocited more than 50 times, indicating that their research has a high reputation and influence. The largest nodes are associated with the authors who have been cocited the most, including Ronald M Harden with 751 citations, Brian D Hodges with 330 citations, and George E Miller with 222 citations.
Analysis of Highly Cited References
Over the time span from 2004 to 2023, the cocitation network comprised 1053 nodes and 3508 links (Figure 5). According to the top 10 papers by cocitation frequency (Table 5), the most cocited reference is a paper in Advances in Medical Education and Practice (impact factor=2.0) titled “An evaluative study of Objective Structured Clinical Examination (OSCE): students and examiners perspectives” [12], whose first author is Md Anwarul Azim Majumder. The paper posits that the OSCE is a gold-standard and widely used format for assessing medical students’ clinical competence in a comprehensive, reliable, and effective manner.
Table 5. The top 10 most cocited references.
Rank | Titles | Journals | IFa (2021) | First authors | Total citations, n |
1 | An evaluative study of Objective Structured Clinical Examination (OSCE): students and examiners perspectives [12] | Advances in Medical Education and Practice | 2.0 | Majumder, Md Anwarul Azim | 38 |
2 | Implementing an online OSCE during the COVID-19 pandemic [13] | Journal of Dental Education | 2.3 | Kakadia, Rahen | 31 |
3 | Diagnostic and statistical manual of mental disorders [14] | Psychiatry Research | 11.3 | Mittal, Vijay A | 31 |
4 | A systematic review of the reliability of Objective Structured Clinical Examination scores [15] | Medical Education | 7.1 | Brannick, Michael T | 30 |
5 | Twelve tips for developing an OSCE that measures what you want [16] | Medical Teacher | 4.7 | Daniels, Vijay John | 30 |
6 | Is the OSCE a feasible tool to assess competencies in undergraduate medical education? [17] | Medical Teacher | 4.7 | Patricio, Madalena F | 29 |
7 | Techniques for measuring clinical competence: Objective Structured Clinical Examinations [18] | Medical Education | 7.1 | Newble, David | 26 |
8 | Assessment in medical education [19] | New England Journal of Medicine | 158.5 | Epstein, Ronald M | 26 |
9 | Assessing communication skills of medical students in Objective Structured Clinical Examinations (OSCE)-a systematic review of rating scales [20] | PLoS One | 3.7 | Cömert, Musa | 26 |
10 | Twelve tips for conducting a virtual OSCE [21] | Medical Teacher | 4.7 | Hopwood, Jenny | 26 |
aIF: impact factor.
Keyword Analysis
Keyword analysis offers a quick overview of a field’s current state and direction of development. Based on keyword co-occurrence in VOSviewer, the most frequent keyword is “education” (n=677 occurrences), followed by “performance” (n=536), “competence” (n=458), and “skills” (n=449; Table 6).
Table 6. The top 20 keywords by co-occurrence.
Rank | Keywords | Co-occurrences, n |
1 | Education | 677 |
2 | Performance | 536 |
3 | Competence | 458 |
4 | Skills | 449 |
5 | Reliability | 371 |
6 | Assessment | 342 |
7 | Students | 337 |
8 | Validity | 329 |
9 | Simulation | 284 |
10 | Medical education | 264 |
11 | Diagnosis | 228 |
12 | Care | 217 |
13 | Prevalence | 207 |
14 | Medical students | 197 |
15 | Management | 196 |
16 | Medical education | 171 |
17 | Curriculum | 168 |
18 | Communication | 161 |
19 | Impact | 156 |
20 | Clinical skills | 147 |
The Burst of Cocited References and Keywords
With CiteSpace, we identified the 50 strongest citation bursts in the field related to OSCE [12,13,15-62]. The reference with the strongest burst (strength 15.91) is a paper published in Medical Education titled “A systematic review of the reliability of Objective Structured Clinical Examination scores” [15], whose first author is Michael T Brannick. The paper notes that OSCEs consist of a series of simulated tasks used to assess medical practitioners’ skills in diagnosing and treating patients. Of the 50 references, 47 (94%) were published between 2004 and 2023, indicating that these papers have been frequently cited over nearly 20 years. Notably, 24 of these papers are currently at a citation peak (Figure 6A [12,13,15-62]), suggesting that OSCE-related research is likely to continue receiving significant attention.
Among the 768 emerging keywords in the field, we focused on the 50 with the strongest bursts (Figure 6B), which represent current hotspots and likely future research directions.
Discussion
Principal Findings
This study is pioneering in its bibliometric approach to OSCE, encapsulating a comprehensive view of the dynamic research trends in this field. By analyzing the bibliometric data internationally, we have mapped out collaboration networks, identified prevailing research directions, and forecasted potential future developments in OSCE scholarship. The surge in OSCE-related publications since 2019 underscores the recognition of OSCEs as essential for evaluating health care practitioners, meeting the demands of modern medicine for more robust and comprehensive assessment methods to gauge clinical competency [22,63].
Despite this growth, the concentration of research output in countries like the United States, the United Kingdom, and Canada may reflect deeper issues of resource allocation and priority setting in medical education globally [64,65]. This suggests a need for a more nuanced discussion on the uneven geographical spread of OSCE research and its implications. The disparity in research contribution could hinder the global exchange of innovative practices and perspectives in medical education [66,67].
Furthermore, the bibliometric data point to the importance of technology in OSCEs, particularly the integration of virtual and augmented reality. However, to fully understand the implications of technological advances, a more detailed analysis is warranted. This should include how technology shapes the development of OSCEs, its impact on the validity and reliability of assessments, and the potential barriers to its widespread adoption [68-70].
The high concentration of publications in Q1 and Q2 quartile journals, especially those with a significant impact factor, attests to the intersection of OSCE research with impactful clinical education and outcomes. The association with prestigious journals underlines the extensive influence and critical importance of OSCEs across multiple medical specialties [71-73].
The prominence of a core group of scholars leading OSCE research suggests a centralization of expertise that could be diversified through broader international collaboration. Such collaboration could introduce various cultural and pedagogical perspectives into the OSCE discourse, thereby enriching both the practice and the research of OSCEs worldwide [74,75].
The keyword analysis reflects a continual focus on the foundational elements of clinical education, such as “education,” “performance,” “competence,” and “skills,” which are at the heart of the OSCE methodology. Emerging research trends suggest a shift toward the integration of innovative educational technologies and methodologies, enhancing both the OSCE process and its outcomes [76,77].
Comparison to the Literature
Our findings align with those of Lim et al [78], who identified issues with construct, content, and predictive validity in OSCEs in pharmacy education, as well as significant resource challenges. These concerns are echoed in our analysis, where similar validity issues and logistical constraints were observed. Other studies, such as those by Hodges et al [79], have highlighted persistent challenges in psychiatric OSCEs, emphasizing the need for continuous refinement and adaptation. Our study extends these discussions by mapping global trends and collaboration networks, underscoring the necessity for continuous re-evaluation and innovation in OSCE methodologies.
Implications of Findings
The challenges associated with OSCEs suggest a need for evolving assessment methods that incorporate simulations, peer assessments, and reflective practices. The resource-intensive nature of OSCEs underscores the necessity for scalable and sustainable alternatives, such as virtual simulations. Policymakers and educators should leverage global collaboration networks to share best practices and develop adaptable, technology-enhanced assessment frameworks. This approach will help address validity concerns and logistical constraints, ensuring that educational assessments remain robust and relevant in the ever-evolving landscape of health care education.
Limitations
Our bibliometric analysis has limitations that may affect our findings. We used data only from the WoSCC database, potentially excluding studies not indexed there and introducing a bias toward English-language literature. This limits the scope of our analysis and overlooks valuable contributions from non-English sources.
Suggestions
To address this, future research should involve a wider range of databases and languages [80,81]. Moreover, the data quality in our study may vary, affecting the credibility of our knowledge mapping. Therefore, caution is needed when interpreting results, and complementary research methods should be considered for a more comprehensive understanding of the field. Longitudinal studies are crucial to assess the impact of OSCEs on medical performance, connecting educational assessments with clinical practice and patient care [82,83].
Moreover, understanding how OSCEs adapt to different health care systems, cultural contexts, and specializations will provide insights into their scalability and adaptability. This is particularly relevant as the health care sector grapples with rapid changes and as medical education seeks to prepare health care professionals for diverse practice environments [19,84].
Conclusions
In conclusion, this bibliometric study not only reaffirms the enduring importance and evolutionary path of OSCEs within medical education but also emphasizes the need for OSCEs to evolve in step with broader health care transformations. The data-driven insights from this analysis should inform future research directions, influence policymaking, and refine educational strategies. By doing so, OSCEs can continue to serve as a dynamic, relevant, and innovative tool in the arsenal of clinical education and evaluation methods.
Abbreviations
- OSCE
Objective Structured Clinical Examination
- WoS
Web of Science
- WoSCC
Web of Science Core Collection
Data Availability
All data generated or analyzed during this study are included in this published article.
Footnotes
Authors' Contributions: HB conceived and designed the ideas for the paper. HB, LZ, XH, and SL participated in all data collection and processing. HB was the major contributor in organizing records and drafting the manuscript. All authors proofread and approved the manuscript.
Conflicts of Interest: None declared.
References
- 1.Criscione-Schreiber L. Turning Objective Structured Clinical Examinations into Reality. Rheum Dis Clin North Am. 2020 Feb;46(1):21–35. doi: 10.1016/j.rdc.2019.09.010.S0889-857X(19)30086-9 [DOI] [PubMed] [Google Scholar]
- 2.Alkhateeb N, Salih AM, Shabila N, Al-Dabbagh A. Objective structured clinical examination: Challenges and opportunities from students' perspective. PLoS One. 2022;17(9):e0274055. doi: 10.1371/journal.pone.0274055. https://dx.plos.org/10.1371/journal.pone.0274055 .PONE-D-21-37498 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Jünger Jana, Schäfer Sybille, Roth C, Schellberg D, Friedman Ben-David M, Nikendei C. Effects of basic clinical skills training on objective structured clinical examination performance. Med Educ. 2005 Oct;39(10):1015–20. doi: 10.1111/j.1365-2929.2005.02266.x.MED2266 [DOI] [PubMed] [Google Scholar]
- 4.Gauthier É. Bibliometric analysis of scientific and technological research: a user's guide to the methodology. Science and Technology Redesign Project, CiteSeer. 1998. [2024-08-26]. https://www150.statcan.gc.ca/n1/en/catalogue/88F0006X1998008 .
- 5.Birch S, Lee MS, Alraek T, Kim T. Overview of Treatment Guidelines and Clinical Practical Guidelines That Recommend the Use of Acupuncture: A Bibliometric Analysis. J Altern Complement Med. 2018 Aug;24(8):752–769. doi: 10.1089/acm.2018.0092. [DOI] [PubMed] [Google Scholar]
- 6.Wilson M, Sampson M, Barrowman N, Doja A. Bibliometric Analysis of Neurology Articles Published in General Medicine Journals. JAMA Netw Open. 2021 Apr 01;4(4):e215840. doi: 10.1001/jamanetworkopen.2021.5840. https://europepmc.org/abstract/MED/33856477 .2778567 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 7.Wu H, Li Y, Tong L, Wang Y, Sun Z. Worldwide research tendency and hotspots on hip fracture: a 20-year bibliometric analysis. Arch Osteoporos. 2021 Apr 17;16(1):73. doi: 10.1007/s11657-021-00929-2.10.1007/s11657-021-00929-2 [DOI] [PubMed] [Google Scholar]
- 8.Vargas JS, Livinski AA, Karagu A, Cira MK, Maina M, Lu Y, Joseph AO. A bibliometric analysis of cancer research funders and collaborators in Kenya: 2007-2017. J Cancer Policy. 2022 Sep;33:100331. doi: 10.1016/j.jcpo.2022.100331. https://europepmc.org/abstract/MED/35792397 .S2213-5383(22)00010-8 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9.van Eck Nees Jan, Waltman L. Software survey: VOSviewer, a computer program for bibliometric mapping. Scientometrics. 2010 Aug;84(2):523–538. doi: 10.1007/s11192-009-0146-3. https://europepmc.org/abstract/MED/20585380 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Chen C. CiteSpace: A Practical Guide for Mapping Scientific Literature. New York, NY: Nova Science Publishers; 2016. [Google Scholar]
- 11.Kleinberg J. Bursty and hierarchical structure in streams. Data Min Knowl Discov. 2003;7:373–397. doi: 10.1023/A:1024940629314. [DOI] [Google Scholar]
- 12.Majumder MAA, Kumar A, Krishnamurthy K, Ojeh N, Adams OP, Sa B. An evaluative study of Objective Structured Clinical Examination (OSCE): students and examiners perspectives. Adv Med Educ Pract. 2019 Jun 5;10:387–397. doi: 10.2147/AMEP.S197275. https://europepmc.org/abstract/MED/31239801 .197275 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13.Kakadia R, Chen E, Ohyama H. Implementing an online OSCE during the COVID-19 pandemic. J Dent Educ. 2020 Jul 15;85(Suppl 1):1006–8. doi: 10.1002/jdd.12323. https://europepmc.org/abstract/MED/32666512 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14.Mittal VA, Walker EF. Diagnostic and statistical manual of mental disorders. Psychiatry Res. 2011 Aug 30;189(1):158–9. doi: 10.1016/j.psychres.2011.06.006. https://europepmc.org/abstract/MED/21741095 .S0165-1781(11)00451-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Brannick MT, Erol-Korkmaz HT, Prewett M. A systematic review of the reliability of Objective Structured Clinical Examination scores. Med Educ. 2011 Dec;45(12):1181–9. doi: 10.1111/j.1365-2923.2011.04075.x. [DOI] [PubMed] [Google Scholar]
- 16.Daniels VJ, Pugh D. Twelve tips for developing an OSCE that measures what you want. Med Teach. 2018 Dec;40(12):1208–1213. doi: 10.1080/0142159X.2017.1390214. [DOI] [PubMed] [Google Scholar]
- 17.Patrício MF, Julião M, Fareleira F, Carneiro AV. Is the OSCE a feasible tool to assess competencies in undergraduate medical education? Med Teach. 2013 Jun;35(6):503–14. doi: 10.3109/0142159X.2013.774330. [DOI] [PubMed] [Google Scholar]
- 18.Newble D. Techniques for measuring clinical competence: Objective Structured Clinical Examinations. Med Educ. 2004 Feb;38(2):199–203. doi: 10.1111/j.1365-2923.2004.01755.x.1755 [DOI] [PubMed] [Google Scholar]
- 19.Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387–396. doi: 10.1056/NEJMra054784.356/4/387 [DOI] [PubMed] [Google Scholar]
- 20.Cömert M, Zill JM, Christalle E, Dirmaier J, Härter M, Scholl I. Assessing communication skills of medical students in Objective Structured Clinical Examinations (OSCE)-a systematic review of rating scales. PLoS One. 2016 Mar 31;11(3):e0152717. doi: 10.1371/journal.pone.0152717. https://dx.plos.org/10.1371/journal.pone.0152717 .PONE-D-15-45543 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Hopwood J, Myers G, Sturrock A. Twelve tips for conducting a virtual OSCE. Med Teach. 2021 Jun;43(6):633–636. doi: 10.1080/0142159X.2020.1830961. [DOI] [PubMed] [Google Scholar]
- 22.Harden RM. Revisiting 'assessment of clinical competence using an objective structured clinical examination (OSCE)'. Med Educ. 2016;50(4):376–379. doi: 10.1111/medu.12801. [DOI] [PubMed] [Google Scholar]
- 23.Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357(9260):945–949. doi: 10.1016/S0140-6736(00)04221-5.S0140-6736(00)04221-5 [DOI] [PubMed] [Google Scholar]
- 24.Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226–235. doi: 10.1001/jama.287.2.226.jrv10092 [DOI] [PubMed] [Google Scholar]
- 25.Hodges B. OSCE! Variations on a theme by Harden. Med Educ. 2003;37(12):1134–1140. doi: 10.1111/j.1365-2923.2003.01717.x. [DOI] [PubMed] [Google Scholar]
- 26.Barman A. Critiques on the objective structured clinical examination. Ann Acad Med Singap. 2005;34(8):478–482. http://www.annals.edu.sg/pdf/34VolNo8200509/V34N8p478.pdf . [PubMed] [Google Scholar]
- 27.Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296(9):1094–1102. doi: 10.1001/jama.296.9.1094.296/9/1094 [DOI] [PubMed] [Google Scholar]
- 28.Rushforth HE. Objective Structured Clinical Examination (OSCE): review of literature and implications for nursing education. Nurse Educ Today. 2007;27(5):481–490. doi: 10.1016/j.nedt.2006.08.009.S0260-6917(06)00138-9 [DOI] [PubMed] [Google Scholar]
- 29.Turner JL, Dankoski ME. Objective structured clinical exams: a critical review. Fam Med. 2008;40(8):574–578. [PubMed] [Google Scholar]
- 30.American Psychiatric Association . Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition, Text Revision (DSM-5-TR) Washington, DC: American Psychiatric Publishing; 2022. [Google Scholar]
- 31.Pell G, Fuller R, Homer M, Roberts T, International Association for Medical Education How to measure the quality of the OSCE: A review of metrics - AMEE guide no. 49. Med Teach. 2010;32(10):802–811. doi: 10.3109/0142159X.2010.507716. https://eprints.whiterose.ac.uk/75619/ [DOI] [PubMed] [Google Scholar]
- 32.Brand HS, Schoonheim-Klein M. Is the OSCE more stressful? Examination anxiety and its consequences in different assessment methods in dental education. Eur J Dent Educ. 2009;13(3):147–153. doi: 10.1111/j.1600-0579.2008.00554.x.EJE554 [DOI] [PubMed] [Google Scholar]
- 33.Selim AA, Ramadan FH, El-Gueneidy MM, Gaafer MM. Using Clinical Examination (OSCE) in undergraduate psychiatric nursing education: is it reliable and valid? Nurse Educ Today. 2012;32(3):283–288. doi: 10.1016/j.nedt.2011.04.006.S0260-6917(11)00093-1 [DOI] [PubMed] [Google Scholar]
- 34.Mitchell ML, Henderson A, Groves M, Dalton M, Nulty D. The Objective Structured Clinical Examination (OSCE): optimising its value in the undergraduate nursing curriculum. Nurse Educ Today. 2009;29(4):398–404. doi: 10.1016/j.nedt.2008.10.007.S0260-6917(08)00151-2 [DOI] [PubMed] [Google Scholar]
- 35.American Psychiatric Association . Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition. Washington, DC: American Psychiatric Publishing; 2013. [Google Scholar]
- 36.Griesser MJ, Beran MC, Flanigan DC, Quackenbush M, Van Hoff C, Bishop JY. Implementation of an Objective Structured Clinical Exam (OSCE) into orthopedic surgery residency training. J Surg Educ. 2012;69(2):180–189. doi: 10.1016/j.jsurg.2011.07.015.S1931-7204(11)00236-4 [DOI] [PubMed] [Google Scholar]
- 37.Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: an historical and theoretical perspective. Med Teach. 2013;35(9):e1437–e1446. doi: 10.3109/0142159X.2013.818634. [DOI] [PubMed] [Google Scholar]
- 38.Kogan JR, Conforti L, Bernabeo E, Iobst W, Holmboe E. Opening the black box of clinical skills assessment via observation: a conceptual model. Med Educ. 2011;45(10):1048–1060. doi: 10.1111/j.1365-2923.2011.04025.x. [DOI] [PubMed] [Google Scholar]
- 39.American Educational Research Association Standards for Educational & Psychological Testing (2014 Edition) 2024. [2024-09-26]. https://www.aera.net/publications/books/standards-for-educational-psychological-testing-2014-edition .
- 40.Ilgen JS, Ma IWY, Hatala R, Cook DA. A systematic review of validity evidence for checklists versus global rating scales in simulation-based assessment. Med Educ. 2015;49(2):161–173. doi: 10.1111/medu.12621. [DOI] [PubMed] [Google Scholar]
- 41.Shirwaikar A. Objective Structured Clinical Examination (OSCE) in pharmacy education - a trend. Pharm Pract (Granada) 2015;13(4):627–630. doi: 10.18549/PharmPract.2015.04.627. https://europepmc.org/abstract/MED/26759616 .pharmpract-13-627 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Harden HR. OSC Guide. 2016. [2024-09-26]. https://www.osc.ca/en/news-events/subscribe/osc-guide .
- 43.Johnston ANB, Weeks B, Shuker M, Coyne E, Niall H, Mitchell M, Massey D. Nursing students' perceptions of the Objective Structured Clinical Examination: an integrative review. Clin Simul Nurs. 2017;13(3):127–142. doi: 10.1016/j.ecns.2016.11.002. https://europepmc.org/abstract/MED/31239801 .197275 [DOI] [Google Scholar]
- 44.Bevan J, Russell B, Marshall B. A new approach to OSCE preparation - PrOSCEs. BMC Med Educ. 2019;19(1):126. doi: 10.1186/s12909-019-1571-5. https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-019-1571-5 .10.1186/s12909-019-1571-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Lockyer J, Carraccio C, Chan MK, Hart D, Smee S, Touchie C, Holmboe ES, Frank JR, ICBME Collaborators Core principles of assessment in competency-based medical education. Med Teach. 2017;39(6):609–616. doi: 10.1080/0142159X.2017.1315082. [DOI] [PubMed] [Google Scholar]
- 46.Khan R, Payne MWC, Chahine S. Peer assessment in the Objective Structured Clinical Examination: a scoping review. Med Teach. 2017;39(7):745–756. doi: 10.1080/0142159X.2017.1309375. [DOI] [PubMed] [Google Scholar]
- 47.Boursicot K, Kemp S, Ong TH, Wijaya L, Goh SH, Freeman K, Curran I. Conducting a high-stakes OSCE in a COVID-19 environment. MedEdPublish (2016) 2020;9:54. doi: 10.15694/mep.2020.000054.1. https://europepmc.org/abstract/MED/38058921 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 48.Lara S, Foster CW, Hawks M, Montgomery M. Remote assessment of clinical skills during COVID-19: a virtual, high-stakes, summative pediatric Objective Structured Clinical Examination. Acad Pediatr. 2020;20(6):760–761. doi: 10.1016/j.acap.2020.05.029. https://europepmc.org/abstract/MED/32505690 .S1876-2859(20)30239-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 49.Graf J, Smolka R, Simoes E, Zipfel S, Junne F, Holderried F, Wosnik A, Doherty AM, Menzel K, Herrmann-Werner A. Communication skills of medical students during the OSCE: gender-specific differences in a longitudinal trend study. BMC Med Educ. 2017;17(1):75. doi: 10.1186/s12909-017-0913-4. https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-017-0913-4 .10.1186/s12909-017-0913-4 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 50.Yeates P, Cope N, Hawarden A, Bradshaw H, McCray G, Homer M. Developing a video-based method to compare and adjust examiner effects in fully nested OSCEs. Med Educ. 2019;53(3):250–263. doi: 10.1111/medu.13783. https://europepmc.org/abstract/MED/30575092 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 51.Norcini J, Anderson MB, Bollela V, Burch V, Costa MJ, Duvivier R, Hays R, Palacios Mackay MF, Roberts T, Swanson D. 2018 Consensus framework for good assessment. Med Teach. 2018;40(11):1102–1109. doi: 10.1080/0142159X.2018.1500016. [DOI] [PubMed] [Google Scholar]
- 52.Lewis KL, Bohnert CA, Gammon WL, Hölzer H, Lyman L, Smith C, Thompson TM, Wallace A, Gliva-McConvey G. The association of standardized patient educators (ASPE) standards of best practice (SOBP) Adv Simul (Lond) 2017;2:10. doi: 10.1186/s41077-017-0043-4. https://advancesinsimulation.biomedcentral.com/articles/10.1186/s41077-017-0043-4 .43 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 53.Chong L, Taylor S, Haywood M, Adelstein BA, Shulruf B. The sights and insights of examiners in Objective Structured Clinical Examinations. J Educ Eval Health Prof. 2017;14(3):34–242. doi: 10.3352/jeehp.2017.14.34. https://europepmc.org/abstract/MED/29278906 .jeehp.2017.14.34 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 54.Shehata MH, Kumar AP, Arekat MR, Alsenbesy M, Mohammed Al Ansari A, Atwa H, Ahmed SA, Deifalla A. A toolbox for conducting an online OSCE. Clin Teach. 2021;18(3):236–242. doi: 10.1111/tct.13285. [DOI] [PubMed] [Google Scholar]
- 55.Craig C, Kasana N, Modi A. Virtual OSCE delivery: the way of the future? Med Educ. 2020;54(12):1185–1186. doi: 10.1111/medu.14286. https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-020-02444-3 .10.1186/s12909-020-02444-3 [DOI] [PubMed] [Google Scholar]
- 56.Boyle JG, Colquhoun I, Noonan Z, McDowall S, Walters MR, Leach JP. Viva la VOSCE? BMC Med Educ. 2020;20(1):514. doi: 10.1186/s12909-020-02444-3. https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-020-02444-3 .10.1186/s12909-020-02444-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 57.Blythe J, Patel NSA, Spiring W, Easton G, Evans D, Meskevicius-Sadler E, Noshib H, Gordon H. Undertaking a high stakes virtual OSCE ("VOSCE") during Covid-19. BMC Med Educ. 2021;21(1):221. doi: 10.1186/s12909-021-02660-5. https://bmcmededuc.biomedcentral.com/articles/10.1186/s12909-021-02660-5 .10.1186/s12909-021-02660-5 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58.Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020;10(11):e042378. doi: 10.1136/bmjopen-2020-042378. https://bmjopen.bmj.com/lookup/pmidlookup?view=long&pmid=33154063 .bmjopen-2020-042378 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 59.Donn J, Scott JA, Binnie V, Bell A. A pilot of a virtual Objective Structured Clinical Examination in dental education. A response to COVID-19. Eur J Dent Educ. 2021;25(3):488–494. doi: 10.1111/eje.12624. https://eprints.gla.ac.uk/224064 . [DOI] [PubMed] [Google Scholar]
- 60.Boursicot K, Kemp S, Wilkinson T, Findyartini A, Canning C, Cilliers F, Fuller R. Performance assessment: consensus statement and recommendations from the 2020 Ottawa conference. Med Teach. 2021;43(1):58–67. doi: 10.1080/0142159X.2020.1830052. [DOI] [PubMed] [Google Scholar]
- 61.Hannan TA, Umar SY, Rob Z, Choudhury RR. Designing and running an online Objective Structured Clinical Examination (OSCE) on zoom: a peer-led example. Med Teach. 2021;43(6):651–655. doi: 10.1080/0142159X.2021.1887836.S1471-5953(18)30328-7 [DOI] [PubMed] [Google Scholar]
- 62.Solà-Pola M, Morin-Fraile V, Fabrellas-Padrés N, Raurell-Torreda M, Guanter-Peris L, Guix-Comellas E, Pulpón-Segura AM. The usefulness and acceptance of the OSCE in nursing schools. Nurse Educ Pract. 2020;43:102736. doi: 10.1016/j.nepr.2020.102736.S1471-5953(18)30328-7 [DOI] [PubMed] [Google Scholar]
- 63.Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE) Med Educ. 1979;13(1):41–54. doi: 10.1111/j.1365-2923.1979.tb00918.x. [DOI] [PubMed] [Google Scholar]
- 64.Lee GB, Chiu AM. Assessment and feedback methods in competency-based medical education. Ann Allergy Asthma Immunol. 2022;128(3):256–262. doi: 10.1016/j.anai.2021.12.010.S1081-1206(21)01309-0 [DOI] [PubMed] [Google Scholar]
- 65.Mathew MM, Thomas KA. Medical aptitude and its assessment. Natl Med J India. 2018;31(6):356–363. doi: 10.4103/0970-258X.262905. http://www.nmji.in/article.asp?issn=0970-258X;year=2018;volume=31;issue=6;spage=356;epage=363;aulast=Mathew .NatlMedJIndia_2018_31_6_356_262905 [DOI] [PubMed] [Google Scholar]
- 66.Zayyan M. Objective structured clinical examination: the assessment of choice. Oman Med J. 2011;26(4):219–222. doi: 10.5001/omj.2011.55. https://europepmc.org/abstract/MED/22043423 .OMJ-D-10-00135 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 67.Jiang Z, Ouyang J, Li L, Han Y, Xu L, Liu R, Sun J. Cost-effectiveness analysis in performance assessments: a case study of the objective structured clinical examination. Med Educ Online. 2022;27(1):2136559. doi: 10.1080/10872981.2022.2136559. https://europepmc.org/abstract/MED/36250891 . [DOI] [PMC free article] [PubMed] [Google Scholar]
- 68.Cook DA, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang AT, Erwin PJ, Hamstra SJ. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978–988. doi: 10.1001/jama.2011.1234.306/9/978 [DOI] [PubMed] [Google Scholar]
- 69.Bajpai S, Semwal M, Bajpai R, Car J, Ho AHY. Health professions' digital education: review of learning theories in randomized controlled trials by the digital health education collaboration. J Med Internet Res. 2019;21(3):e12912. doi: 10.2196/12912. https://www.jmir.org/2019/3/e12912/ v21i3e12912 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 70.Cheng A, Lang T, Starr S, Pusic M, Cook D. Technology-enhanced simulation and pediatric education: a meta-analysis. Pediatrics. 2014;133(5):e1313–1323. doi: 10.1542/peds.2013-2139.peds.2013-2139 [DOI] [PubMed] [Google Scholar]
- 71.Wilkinson TJ, Wade WB, Knock LD. A blueprint to assess professionalism: results of a systematic review. Acad Med. 2009;84(5):551–558. doi: 10.1097/ACM.0b013e31819fbaa2.00001888-200905000-00008 [DOI] [PubMed] [Google Scholar]
- 72.Preez RRD, Pickworth GE, van Rooyen M. Teaching professionalism: a South African perspective. Med Teach. 2007;29(9):e284–e291. doi: 10.1080/01421590701754128.788874185 [DOI] [PubMed] [Google Scholar]
- 73.Mueller PS. Teaching and assessing professionalism in medical learners and practicing physicians. Rambam Maimonides Med J. 2015;6(2):e0011. doi: 10.5041/RMMJ.10195. https://europepmc.org/abstract/MED/25973263 .rmmj-6-2-e0011 [DOI] [PMC free article] [PubMed] [Google Scholar]
- 74.Alinier G. A typology of educationally focused medical simulation tools. Med Teach. 2007;29(8):e243–e250. doi: 10.1080/01421590701551185.789167409 [DOI] [PubMed] [Google Scholar]
- 75.Fox-Robichaud AE, Nimmo GR. Education and simulation techniques for improving reliability of care. Curr Opin Crit Care. 2007;13(6):737–741. doi: 10.1097/MCC.0b013e3282f1bb32.00075198-200712000-00020 [DOI] [PubMed] [Google Scholar]
- 76.Eva KW, Regehr G. "I'll never play professional football" and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28(1):14–19. doi: 10.1002/chp.150. [DOI] [PubMed] [Google Scholar]
- 77.Colthart I, Bagnall G, Evans A, Allbutt H, Haig A, Illing J, McKinstry B. The effectiveness of self-assessment on the identification of learner needs, learner activity, and impact on clinical practice: BEME guide no. 10. Med Teach. 2008;30(2):124–145. doi: 10.1080/01421590701881699.790794805 [DOI] [PubMed] [Google Scholar]
- 78.Lim AS, Ling YL, Wilby KJ, Mak V. What's been trending with OSCEs in pharmacy education over the last 20 years? A bibliometric review and content analysis. Curr Pharm Teach Learn. 2024;16(3):212–220. doi: 10.1016/j.cptl.2023.12.028. https://linkinghub.elsevier.com/retrieve/pii/S1877-1297(23)00332-5 .S1877-1297(23)00332-5 [DOI] [PubMed] [Google Scholar]
- 79.Hodges BD, Hollenberg E, McNaughton N, Hanson MD, Regehr G. The psychiatry OSCE: a 20-year retrospective. Acad Psychiatry. 2014;38(1):26–34. doi: 10.1007/s40596-013-0012-8. [DOI] [PubMed] [Google Scholar]
- 80.Boulet J, Durning S. What we measure … and what we should measure in medical education. Med Educ. 2019;53(1):86–94. doi: 10.1111/medu.13652. [DOI] [PubMed] [Google Scholar]
- 81.Lucey CR, Hauer KE, Boatright D, Fernandez A. Medical education's wicked problem: achieving equity in assessment for medical learners. Acad Med. 2020;95(12S Addressing Harmful Bias and Eliminating Discrimination in Health Professions Learning Environments):S98–S108. doi: 10.1097/ACM.0000000000003717.00001888-202012001-00017 [DOI] [PubMed] [Google Scholar]
- 82.Tormey W. Education, learning and assessment: current trends and best practice for medical educators. Ir J Med Sci. 2015;184(1):1–12. doi: 10.1007/s11845-014-1069-4. [DOI] [PubMed] [Google Scholar]
- 83.Gröne O, Mielke I, Knorr M, Ehrhardt M, Bergelt C. Associations between communication OSCE performance and admission interviews in medical education. Patient Educ Couns. 2022;105(7):2270–2275. doi: 10.1016/j.pec.2021.11.005.S0738-3991(21)00728-X [DOI] [PubMed] [Google Scholar]
- 84.Min Simpkins AA, Koch B, Spear-Ellinwood K, St John P. A developmental assessment of clinical reasoning in preclinical medical education. Med Educ Online. 2019;24(1):1591257. doi: 10.1080/10872981.2019.1591257. https://europepmc.org/abstract/MED/30935299 . [DOI] [PMC free article] [PubMed] [Google Scholar]