To the Editor. We read with great interest Dr. Sturpe's description of objective structured clinical examinations (OSCEs) at United States schools and colleges of pharmacy.1 Improving test reliability is central to the “objective” in OSCEs. Content specificity is a known concern with OSCE assessments, and increasing the number of stations substantially improves test reliability.2 Dr. Sturpe's guidance on the number of stations required for suitable test reliability, a suggested 12 to 16, is therefore especially valuable.
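To make the underlying psychometrics explicit (this illustration is ours, not drawn from Dr. Sturpe's article, and the per-station reliability below is hypothetical), the classical Spearman-Brown prophecy formula predicts the reliability of a test lengthened by a factor k from the reliability r of a single station:

\[
r_k = \frac{k\,r}{1 + (k - 1)\,r}
\]

With a hypothetical per-station reliability of r = 0.20, a 12-station OSCE would be predicted to reach r = (12)(0.20)/[1 + (11)(0.20)] = 2.4/3.2 = 0.75, and 16 stations would reach 0.80: more stations, markedly better reliability.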
Assessment drives learning, so the test reliability of assessments should be a key concern for pharmacy educators. Although numerous versions of advanced pharmacy practice experience (APPE) evaluations are used at colleges and schools of pharmacy around the country, the test reliability of these evaluations deserves the same consideration. If APPEs are conceptually viewed as analogous to OSCE stations, then together, like an OSCE, they can speak to a common ability of learners, ie, the ability to practice pharmacy across a range of environments. This ability continuum can range from limited to expansive, and students may fall anywhere along that spectrum.
Individual APPE rotation objectives must be linked to terminal school or college outcomes, and overall experience assessments mapped to these required objectives. Overall experience assessments should be standardized across preceptors and sites to ensure that students are assessed in a similar manner. As an example using SOAP notes as part of an overall experience assessment, notes should be assessed more than once within a single APPE and then repeatedly across multiple core APPEs (eg, 3 notes per APPE over 4 APPEs would provide 12 evaluations).
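Applying the same Spearman-Brown logic to this SOAP-note example (again with a purely hypothetical single-note reliability, here 0.25, chosen only for illustration):

\[
r_{12} = \frac{12 \times 0.25}{1 + 11 \times 0.25} = \frac{3.0}{3.75} = 0.80
\]

Twelve modestly reliable note evaluations could thus aggregate into a composite with respectable reliability, provided the notes are scored consistently across preceptors and sites.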
Additional “stations” also could be included to complement APPE assessments, similar to the variations that Hodges describes.3 Test reliability should be enhanced by additional rigorous assessments of similar APPE objectives, as long as all evaluations assess a similar ability in students. An advantage of including additional assessments is that they provide a reliable, standardized means of critical evaluation for all students. Examples of additional assessments include a final-year student presentation demonstrating evidence-based medicine skills,4 the National Association of Boards of Pharmacy's Pharmacy Curriculum Outcomes Assessment, or an individual college or school's outcome-based examination prior to APPEs.
Undoubtedly, colleges and schools of pharmacy are investing significant resources in experiential programs and sites. How rigorous (ie, reliable) are their methods of evaluation? We put forward an alternative paradigm for APPE evaluations that draws on the strengths of an OSCE approach (ie, improved assessment reliability through greater station numbers). Additionally, some colleges and schools interested in performance-based assessment (such as an OSCE) may struggle to find the resources to implement such an evaluation; leveraging existing experiential programs may foster the use of an OSCE approach to assessment.
Michael J. Peeters, PharmD, MEd,a Craig D. Cox, PharmDb
aUniversity of Toledo College of Pharmacy
bTexas Tech University Health Sciences Center School of Pharmacy
REFERENCES
1. Sturpe DA. Objective structured clinical examinations in doctor of pharmacy programs in the United States. Am J Pharm Educ. 2010;74(8):Article 148. doi:10.5688/aj7408148.
2. Eva KW, Neville AJ, Norman GR. Exploring the etiology of content specificity: factors influencing analogic transfer and problem solving. Acad Med. 1998;73(10):S1-S5. doi:10.1097/00001888-199810000-00028.
3. Hodges B. OSCE! Variations on a theme by Harden. Med Educ. 2003;37(12):1134-1140. doi:10.1111/j.1365-2923.2003.01717.x.
4. Peeters MJ, Sahloff EG, Stone GE. A standardized rubric for student presentations. Am J Pharm Educ. 2010;74(9):Article 171. doi:10.5688/aj7409171.