Pharmacy Practice
2015 Dec 15;13(4):627. doi: 10.18549/PharmPract.2015.04.627

Objective structured clinical examination (OSCE) in pharmacy education - a trend

Annie Shirwaikar 1
PMCID: PMC4696119  PMID: 26759616

Abstract

Pharmacy education has undergone a radical change as it evolves towards becoming a more patient-oriented profession. With a greater emphasis on problem-based teaching and competency, the Objective Structured Clinical Examination (OSCE), supported by its reliability and validity, has become the gold standard for the evaluation of the clinical skills of undergraduate students of medicine and pharmacy worldwide. Core competency evaluation has become a mandatory and critical norm for the accountability of educational objectives, as traditional testing tools cannot evaluate clinical competence. Interpersonal and communication skills, professional judgment and problem-resolution skills may be best assessed through a well-structured OSCE, in comparison to oral examinations, multiple-choice tests and other methods of assessment. Though OSCEs, as an objective method of evaluation, offer several advantages to both students and teachers, they also have disadvantages and pitfalls in implementation. This article reviews the OSCE as a trend in pharmacy education.

Keywords: Educational Measurement, Education, Professional, Clinical Competence

INTRODUCTION

Since Harden and Gleeson1 introduced the OSCE, an acronym for Objective Structured Clinical Examination, as a method of student assessment in medical school in 1975, it has become the preferred method for the evaluation of learner performance across the health professions. An OSCE is defined as “an approach to the assessment of clinical competence in which the components of competence are assessed in a planned or structured way with the attention being paid to the objectivity of the examination”.2 Most significantly, the OSCE is part of high-stakes entry-to-practice licensing examinations, including the United States Medical Licensing Examination, the Medical Council of Canada Qualifying Examination and the Canadian Pharmacist Qualifying Examination.3,4,5 The Royal Pharmaceutical Society of Great Britain advocates the inclusion of competency-based learning and assessment in the form of OSCEs alongside traditional methods of assessment. The increasing interest of U.S. schools of pharmacy in this technique is demonstrated by a 7-fold increase in OSCE research presentations at academic forums in pharmacy between 2006 and 2009.6,7,8,9

Factors determining the OSCE

The traditional clinical examination has been criticized for its bias and the lack of strong correlation among different evaluators. Pharmaceutical education has undergone a radical change as it evolves towards a more patient-oriented profession. With a greater emphasis on problem-based teaching and competency,10,11,12 the OSCE, owing to its reliability and validity, has become the gold standard for the evaluation of the clinical skills of undergraduate students of medicine and pharmacy worldwide.12,13,14 Core competency evaluation has become a mandatory and critical norm for the accountability of educational objectives, as the commonly used traditional testing tools, such as multiple-choice questionnaires (MCQs) and self-assessment questionnaires (SAQs), cannot evaluate clinical competence in a pharmacy practice setting.15

Nature of the Trend

As the profession of pharmacy evolves to embrace the requirements set forth by the Center for the Advancement of Pharmacy Education and the Accreditation Council for Pharmacy Education (ACPE),15,16,17 a corresponding need to restructure pharmaceutical education has arisen, requiring a significant review of pharmacy curricula and competency assessment methods.10,18

Today the OSCE is used to assess the clinical competency of undergraduate pharmacy students and in licensure and certification examinations in many parts of the world.13,14 In an OSCE, candidates rotate through several stations on a timed basis. In the United States and Canada, approximately 12 to 16 stations are used for medical and pharmacy licensing examinations,5 compared with Harden’s original OSCE of 16 stations.2 At each station, the candidate faces a simulated task and has to perform specific functions. Both interactive and non-interactive stations are used: standardized patients are employed in interactive stations, with a trained examiner evaluating the encounter using a standardized marking key, while non-interactive stations require written responses and involve no observation. Interpersonal and communication skills, professional judgment and problem-resolution skills may be best assessed through a well-structured OSCE, in comparison to oral examinations, multiple-choice tests and other methods of assessment. However, high costs and the associated logistical difficulties have limited its extensive use.2,19

In the 1970s, British Columbia’s College of Pharmacists, the pharmacy practice licensing regulatory body of Canada’s third largest province, started using the OSCE format with patient simulations and standardized problems for the assessment of new and continuing competencies of its member pharmacists.11 In 1996, the Ontario College of Pharmacists, the provincial pharmacy regulatory body, introduced the OSCE as part of its mandatory Practice Review and Quality Assurance program for all practicing pharmacists. As important entry-to-practice competencies could not be adequately measured via traditional examination formats, momentum grew for a national OSCE for pharmacy. The Pharmacy Examining Board of Canada thus began developing a national OSCE for pharmacy practice, and in June 1997 a blueprint emerged which outlined the number of stations, inclusive of rest stations, and the testing time (200 minutes) needed for reliable and valid testing of entry-level competencies.11 The identified entry-level competencies included obtaining and interpreting information, recommending appropriate therapeutic options, effective communication, preparation and distribution of drug products, and professional judgment and ethics.4,20

Awaisu et al.21 designed and implemented a 13-station OSCE for a clinical pharmacy course. Patient counseling and communication, clinical pharmacokinetics, identification and resolution of drug-related problems, and literature evaluation/drug information were the broad competencies assessed. A majority of the students felt that the skill required at some stations demanded a higher degree of learning than they had achieved.
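The rotation mechanics described above can be made concrete. The short Python sketch below is illustrative only: round-robin assignment is a common way to run such a circuit, but the station count and candidate labels are invented and do not reproduce any licensing body’s actual scheduling.

```python
def rotation_schedule(n_stations: int, candidates: list[str]) -> list[list[str]]:
    """Round-robin OSCE rotation: candidate i starts at station i and
    advances one station with each buzzer until all stations are visited.
    Returns one row per rotation; row[s] is the candidate at station s."""
    assert len(candidates) <= n_stations, "at most one candidate per station"
    schedule = []
    for rotation in range(n_stations):
        row = [None] * n_stations  # None marks an empty (or rest) station
        for i, cand in enumerate(candidates):
            row[(i + rotation) % n_stations] = cand
        schedule.append(row)
    return schedule

# Example: 4 candidates rotating through a hypothetical 4-station circuit.
for r, row in enumerate(rotation_schedule(4, ["A", "B", "C", "D"])):
    print(f"rotation {r}: {row}")
```

Because every candidate occupies a different station during each rotation, a full cohort can be examined in the time it takes one candidate to complete the circuit.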

Sturpe22, in a study of OSCE practices in PharmD programs in the U.S., surveyed 108 U.S. schools of pharmacy. Of the 87 colleges that responded, 32 used OSCEs, and 11 of these regularly administered examinations comprising three or more stations. In these colleges, the scenarios were the same for all students, and it was ensured that the standardized patients portrayed their roles with consistency. Approximately half of the 55 programs that did not use OSCEs were interested in adopting the technique; cost and faculty workload were the most commonly reported barriers to implementation. Salinitri et al.15, assessing pharmacy student performance in problem-based learning (PBL) with an OSCE, valued the addition of an OSCE to written examinations for a more comprehensive assessment of the PBL experience. Hastings et al.23 used an OSCE in an elective on nonprescription medication to assess its effect on students’ skills, knowledge and satisfaction, and concluded that it was effective for the evaluation of skills and for self-care.

Advantages

OSCEs are generally more objective than most other assessments of practice, and the wide range of examiners involved at the various stations reduces the risk of examiner bias.24 This is advantageous to the examinee and upholds the teaching standard of the institution. With reusable stations, the OSCE takes much less time to execute and can examine more students at any given time over a broader range of subjects.25 Clinical-type examinations are often criticized for poor validity. The OSCE allows examiners to determine in advance the competencies to be tested and then to design the examination to test these competencies across several stations. Compared with traditional methods, skills and behaviours may be assessed at different levels of complexity.25,26 Published research on OSCEs has reported their reliability, validity and objectivity across a wide range of abilities, including problem solving, communication skills, decision-making and patient management. Their reproducibility, acceptability and structured marking schedule enable recall, teaching audit and the setting of standards.25 The large degree of control over variables supports validity and reliability; overall, the OSCE has better psychometric properties than traditional methods.26 The OSCE has also been considered important supplementary evidence of a trainee’s competence prior to granting a certificate of satisfactory completion of training.26
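One way such psychometric properties are quantified in practice is internal-consistency reliability across station scores, commonly summarized by Cronbach’s alpha. The sketch below is a minimal illustration under that assumption; the score matrix is invented for demonstration and is not data from the cited studies.

```python
def cronbach_alpha(scores: list[list[float]]) -> float:
    """Cronbach's alpha for an OSCE: rows are candidates, columns are
    station scores. alpha = k/(k-1) * (1 - sum of station variances /
    variance of candidate totals), using sample variance (n-1)."""
    k = len(scores[0])

    def var(xs: list[float]) -> float:
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    station_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(station_vars) / total_var)

# Hypothetical scores for 4 candidates across 3 stations (0-10 scale).
print(round(cronbach_alpha([[7, 8, 6], [5, 6, 5], [9, 9, 8], [6, 7, 6]]), 3))
```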

Disadvantages

The OSCE tends to compartmentalize a candidate’s skills and knowledge and hence may undermine holism. This is a particularly important criticism, as the sum of the individual parts does not always equal the whole.1

OSCEs are more difficult to organize and require more material and human resources.27 The resource implications are significant in terms of organisation, staffing, rooms, etc. All examiners require training, and descriptors of what constitutes a pass, a fail, or each grade must be developed for every station.24

Compared to traditional examinations, the OSCE is more expensive to conduct and requires more time.19,28 Higher levels of examinee stress and observer fatigue are also observed.

The idealized ‘textbook’ scenarios of OSCEs may not mimic real-life situations.25 Student performance can also be influenced by the order of stations and the precision of standardized patient simulation.27

Pitfalls in Implementation

The pitfalls of implementing OSCEs revolve around validity, reliability, feasibility and acceptability. Using an established set of components to assess students in an OSCE may appear accurate, but it puts at a disadvantage the efficient and astute candidates who can reach a diagnosis with minimal effort. As the OSCE necessitates conformity to the structured path set by its creators, the focus of the exam is on the exam structure rather than on the learner.1,27 The overall validity and reliability of the examination is determined by the quality of the individual stations. When stations are too short, the problem is further compounded, and even simple clinical skills may become more difficult to assess.26

Assessment of skills cannot be extended to emergency situations; moreover, attitudes, which are extremely difficult to assess, are best evaluated by close observation over a prolonged period. The “artificial” setting of the OSCE makes it necessary for examiners and students to acquaint themselves with the process, and the OSCE’s potential for measuring real-life patient-care situations is limited.27 Some examinees experience greater tension in an OSCE than in other examinations, although this has been observed to decrease with familiarity with the format. Repeated examinations require an adequate pool of stations, which may be limited by logistical difficulties and financial constraints.26 As assessing students with different patients may cause unreliability, it has been suggested that all patients in an OSCE should be examined by all candidates. Marks may not accurately reflect the ability of the student, as repetitious demands may tire the student, patient or examiner. Student performance across two examinations has been observed to correlate poorly, a finding attributed partly to test sequence and to examiner and patient variability. The accuracy of standardized patient portrayal and the order in which stations are encountered also affect students’ performance.27 There is also a risk of breach of confidentiality and leaking of stations, especially when residents or interns serve as observers.

Though the objectivity of the OSCE is claimed to depend on task standardisation and the station score checklist, this is not always the case. Inter-station performance differences have been found to be greater than those within a station (intra-station).27 Hence, patients should be selected carefully; a single station may require several patients, especially when there are many examinees or when the examination is particularly taxing. For examiners the exercise is grueling, but this can be compensated for by more efficient use of examiners’ time.1,26

GUIDELINES FOR IMPLEMENTATION OF THE OSCE

Blueprinting is crucial to establishing the validity of a test and its components. It helps in developing OSCE stations with simulation tasks and problems that are relevant to practice. Listed below are the important steps to be considered for proper implementation of the OSCE.11,28,29

Administrative Structure

Coordinator and Coordinating Committee: The key to a successful OSCE is meticulous planning and process management. The coordinator must oversee all aspects of the examination. The examination content must be decided, and stations must be developed to test this content within a scheduled timeframe. Standardized patients, trainers and support staff will be required.

Authoring Team: The task of developing OSCE stations can be started after deciding on the examination content and should be undertaken by pharmacy faculty well acquainted with the curriculum and its objectives. Detailed explanations and guidelines that have been reviewed, edited, and agreed upon by the coordinating committee should be given to authors for different stations.

The examination content must be determined by the coordinating committee. For reliable assessment of clinical competence, curricular material encompassing varying skills (history-taking, physical examination, problem-solving, laboratory data interpretation, etc.) must be broadly sampled across an adequate number of stations of equal duration, usually 10 minutes each. Moving time between stations must also be accounted for.
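As a concrete illustration of this blueprinting and timing step, the following sketch checks that every targeted skill is covered by at least one station and computes the total running time. The skills, the station-to-skill map, and the moving-time figure are hypothetical examples, not a committee’s actual blueprint.

```python
# Hypothetical blueprint: which skills each station samples.
blueprint = {
    1: {"history-taking"},
    2: {"physical examination"},
    3: {"problem-solving", "history-taking"},
    4: {"laboratory data interpretation"},
}
required_skills = {"history-taking", "physical examination",
                   "problem-solving", "laboratory data interpretation"}

# Every required skill must be sampled by at least one station.
uncovered = required_skills - set().union(*blueprint.values())
assert not uncovered, f"skills with no station: {uncovered}"

STATION_MIN, MOVE_MIN = 10, 2  # assumed station and moving-time durations
n = len(blueprint)
total = n * STATION_MIN + (n - 1) * MOVE_MIN
print(f"{n} stations -> {total} minutes per candidate, including moving time")
```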

Station items: The following must be developed: a) a candidate instruction sheet, b) a skill checklist for the station, c) a detailed standardized patient profile, d) post-encounter testing items, if any, and e) a list of the equipment required.
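These station items map naturally onto a simple record type, which can help authoring teams keep submissions complete and uniform. The sketch below is a hypothetical illustration; the field names and the example station are mine, not part of the guidelines.

```python
from dataclasses import dataclass, field

@dataclass
class OsceStation:
    """One station's authoring package, mirroring items a)-e) above."""
    candidate_instructions: str        # a) task sheet shown to the candidate
    skill_checklist: list[str]         # b) observable behaviours to be scored
    sp_profile: str                    # c) standardized patient script
    post_encounter_items: list[str] = field(default_factory=list)  # d) optional written items
    equipment: list[str] = field(default_factory=list)             # e) props required

# Hypothetical example station.
station = OsceStation(
    candidate_instructions="Counsel this patient on starting warfarin.",
    skill_checklist=["introduces self", "checks allergies", "explains INR monitoring"],
    sp_profile="62-year-old with new-onset atrial fibrillation, anxious about bleeding.",
    equipment=["warfarin patient information leaflet"],
)
```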

Patient recruitment and training: Standardized patients may be recruited (local actors, etc.). The first step in training these patients is the development of a blueprint of the patient profile, so as to produce a standardized, reproducible clinical encounter. The costs and budget associated with implementing the OSCE can vary widely and must be worked out in advance.
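Budgeting can be approached with simple arithmetic once local rates are known. The following sketch assumes one standardized patient and one examiner per station for the whole circuit; every rate and figure is a placeholder, since, as noted, real costs vary widely.

```python
def osce_cost(n_stations: int, n_candidates: int, station_min: int = 10,
              sp_hourly: float = 20.0, examiner_hourly: float = 50.0,
              sp_training_h: float = 4.0) -> float:
    """Rough per-sitting cost: one standardized patient (SP) and one
    examiner per station, each present for the whole circuit, plus SP
    training time. All rates are placeholders, not published figures."""
    circuit_h = n_candidates * station_min / 60  # hours each station runs
    sp_cost = n_stations * (circuit_h + sp_training_h) * sp_hourly
    examiner_cost = n_stations * circuit_h * examiner_hourly
    return sp_cost + examiner_cost

# Example: a hypothetical 12-station circuit for 48 candidates.
print(f"estimated cost: {osce_cost(12, 48):,.0f} (local currency units)")
```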

Administering the OSCE - the Logistics

Location: The location for the exam should recreate a real clinical encounter. The testing area should be illustrated with a diagram that clearly depicts the stations and flow patterns. The bell to indicate the time for station change should be audible throughout the examination centre.

Personnel: Additional personnel should be co-opted for effective administration.

Setting up the OSCE: All examiners, examinees, and patients should receive detailed instructions a week in advance of the examination. Examination materials, equipment and artifacts required must be procured and reminders sent to participating personnel. All examination materials must be re-checked. There must be a set of additional stations in case of problems with the planned stations.

Examination Day: The stations should be numbered about an hour before the commencement of the examination. The instructions defining the student’s task should be clearly posted, and each station set up with the required materials. All involved should reach the examination center at least half an hour before the examination, and the examiners, examinees and patients should be given separate orientations. When the setup is ready, examinees report to their preassigned stations and wait for the buzzer to begin. With each buzzer ring, examinees move to the subsequent station until they have rotated through all the stations. In lengthy examinations, patients must be relieved at appropriate intervals by substituting similar patients, and other patient needs should also be attended to during the conduct of the examination. The checklists and answer sheets should be collected and collated as soon as the examination is over, and a standardized scoring method used to determine the grades.
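The final step, turning the collected checklists into grades with a standardized scoring method, might look like the following sketch. The equal weighting of checklist items, the averaging across stations, and the pass mark are all assumptions for illustration; real examinations often use weighted keys and formal standard-setting.

```python
PASS_MARK = 60.0  # assumed cut-off, in percent

def station_score(checklist: dict[str, bool]) -> float:
    """Percentage of checklist items the examiner marked as performed."""
    return 100.0 * sum(checklist.values()) / len(checklist)

def final_grade(stations: list[dict[str, bool]]) -> tuple[float, str]:
    """Average the station percentages and apply the pass mark."""
    overall = sum(station_score(s) for s in stations) / len(stations)
    return overall, ("PASS" if overall >= PASS_MARK else "FAIL")

# Hypothetical collated checklists for one candidate across two stations.
candidate = [
    {"greets patient": True, "verifies identity": True, "counsels on dose": False},
    {"takes history": True, "identifies interaction": True, "documents advice": True},
]
print(final_grade(candidate))  # (83.33..., 'PASS')
```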

Examination Review: Once the examination has been completed, the entire process must be reflected upon to identify setbacks and problems. Recommendations may then be made for improving the examination stations and logistics in the future.25

EPILOGUE

Core competency evaluation has become a mandatory and critical norm for the accountability of educational objectives, as it has been shown that traditional testing tools cannot evaluate clinical competence. In a rapidly changing educational environment, the OSCE paves the way for unbiased, objective assessment and evaluation. However, for the proper implementation of a valid and reliable OSCE, care should be taken to avoid the pitfalls described above.

ACKNOWLEDGEMENTS

The author would like to thank Prof Dr Raja Bandaranayake and Prof Gamini Premadasa for their valuable guidance and comments.

Footnotes

Conflict of interest

None.

References

  • 1. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ. 1979;13(1):41-54.
  • 2. Harden RM. What is an OSCE? Med Teach. 1988;10(1):19-22. doi: 10.3109/01421598809019321.
  • 3. The Medical Council of Canada. Available at: http://www.mcc.ca (accessed September 15, 2013).
  • 4. The United States Medical Licensing Examination. Available at: http://www.usmle.org (accessed September 15, 2013).
  • 5. The Pharmacy Examining Board of Canada. Available at: http://www.pebc.ca (accessed September 15, 2013).
  • 6. Young A. Making learning portable: continuing education (CE) on an iPod. 107th Annual Meeting of the American Association of Colleges of Pharmacy; Jul 9-12, 2006; San Diego, CA.
  • 7. Stowe CD, O’Brien CE, Warmack TS, Gardner SF. Communication skill development: OSCE assessment of lay and healthcare provider encounters. 108th Annual Meeting of the American Association of Colleges of Pharmacy; Jul 14-17, 2007; Orlando, FL.
  • 8. Allen R, Mihm LB, Mihm DJ, Robinson D, et al. Evaluating the impact of a nutrition service-learning course on first-year pharmacy students. 109th Annual Meeting of the American Association of Colleges of Pharmacy; Jul 19-23, 2008; Chicago, IL.
  • 9. Deloatch KH, Coker HG, White-Harris CY. Comparability of student academic performance in a dual-campus doctor of pharmacy program using videoteleconferencing. 110th Annual Meeting of the American Association of Colleges of Pharmacy; Jul 18-22, 2009; Boston, MA.
  • 10. Monaghan MS, Turner PD, Vanderbush RE, Grady AR. Traditional student, nontraditional student and pharmacy practitioner attitudes toward the use of standardized patients in the assessment of clinical skills. Am J Pharm Educ. 2000;64(1):27-32.
  • 11. Austin Z, O’Byrne CC, Pugsley J, Munoz LQ. Development and validation processes for an objective structured clinical examination (OSCE) for entry-to-practice certification in pharmacy: the Canadian experience. Am J Pharm Educ. 2003;67(3):76.
  • 12. Ahmed A, Mohamad HN, Qais Ahmad MA. Perception of pharmacy students in Malaysia on the use of objective structured clinical examinations to evaluate competence. Am J Pharm Educ. 2007;71(6):118. doi: 10.5688/aj7106118.
  • 13. Rutter PM. The introduction of observed structured clinical examinations (OSCEs) to the M.Pharm degree pathway. Pharm Educ. 2002;1:173-180.
  • 14. Corbo M, Patel JP, Abdel Tawab R, Davies JG. Evaluating clinical skills of undergraduate pharmacy students using objective structured clinical examinations (OSCEs). Pharm Educ. 2006;6:53-58.
  • 15. Salinitri FD, O’Connell MB, Garwood CL, Lehr VT, Abdallah K. An objective structured clinical examination to assess problem-based learning. Am J Pharm Educ. 2012;76(3):44. doi: 10.5688/ajpe76344.
  • 16. Accreditation Council for Pharmacy Education. Accreditation standards. Available at: http://www.acpe-accredit.org/deans/standards.asp (accessed September 15, 2013).
  • 17. American Association of Colleges of Pharmacy. Center for the Advancement of Pharmaceutical Education. Available at: http://www.aacp.org/resources/education/Pages/CAPEEducationalOutcomes.aspx (accessed September 15, 2013).
  • 18. Commission to Implement Change in Pharmaceutical Education. Background paper II: Entry-level curricular outcomes, curricular content and educational process. Am J Pharm Educ. 1993;57:377-385.
  • 19. Cusimano MD, Cohen R, Rucker W, Murnaghan J, Kodama R, Reznick R. A comparative analysis of the costs of administration of an OSCE. Acad Med. 1994;69(7):571-576. doi: 10.1097/00001888-199407000-00014.
  • 20. Sloan DA, Donnelly MB, Schwartz RW, Felts JL, Blue AV, Strodel WE. The use of objective structured clinical examination (OSCE) for evaluation and instruction in graduate medical education. J Surg Res. 1996;63(1):225-230. doi: 10.1006/jsre.1996.0252.
  • 21. Awaisu A, Abd Rahman NS, Nik Mohamed MH, Bux Rahman Bux SH, Mohamed Nazar NI. Malaysian pharmacy students’ assessment of an objective structured clinical examination (OSCE). Am J Pharm Educ. 2010;74(2):34. doi: 10.5688/aj740234.
  • 22. Sturpe DA. Objective structured clinical examinations in doctor of pharmacy programs in the United States. Am J Pharm Educ. 2010;74(8):148. doi: 10.5688/aj7408148.
  • 23. Hastings JK, Flowers SK, Pace AC, Spadaro D. An objective standardized clinical examination (OSCE) in an advanced nonprescription medicines course. Am J Pharm Educ. 2010;74(6):98. doi: 10.5688/aj740698.
  • 24. Rushforth HE. Objective structured clinical examination (OSCE): review of literature and implications for nursing education. Nurse Educ Today. 2007;27(5):481-490. doi: 10.1016/j.nedt.2006.08.009.
  • 25. Zayyan M. Objective structured clinical examination: the assessment of choice. Oman Med J. 2011;26(4):219-222. doi: 10.5001/omj.2011.55.
  • 26. McAleer S, Walker R. Objective structured clinical examination (OSCE). Occas Pap R Coll Gen Pract. 1990;46:39-42.
  • 27. Barman A. Critiques on the objective structured clinical examination. Ann Acad Med Singapore. 2005;34(8):478-482.
  • 28. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447-451. doi: 10.1136/bmj.1.5955.447.
  • 29. Dent JA, Harden RM. A practical guide for medical teachers. 3rd ed. Oxford: Elsevier; 2009. ISBN 978-0-7020-4551-6.
