Abstract
Objective. To design, implement, and assess the use of “educational prescriptions” or Education Rx assignments in advanced pharmacy practice experiences (APPEs) in ambulatory care, and to assess the impact of the assignments on Doctor of Pharmacy (PharmD) students’ self-efficacy to practice evidence-based medicine (EBM).
Methods. Students enrolled in select ambulatory care APPEs completed up to four Education Rx assignments. The assignments required students to report the context of the question, source of information, results, appraisal of validity, and relevance of the evidence, and to answer the clinical question. The grading rubric contained three subparts: a patient/population, intervention, comparison, outcome (PICO) conformity score (8 points), presence of an answer to the PICO (1 point), and quality of the answer to the PICO (6 points). Demographic information was collected, and students were surveyed at the end of the APPE to rate their self-efficacy in executing seven EBM skills.
Results. Thirty students completed 110 Education Rxs. The average score (SD) was 13.6 (2.2) with a PICO conformity subsection score of 7.3 (1.3), and quality of answer subsection score of 5.3 (1.2). Only one Education Rx did not have an answer. Students consulted point-of-care references for a majority of the answers (65%). Sixteen (53%) students completed the self-assessment survey, and all strongly agreed or agreed that the Education Rx activity improved their ability to formulate a well-constructed clinical question and evaluate and apply the evidence.
Conclusion. Through Education Rx assignments, PharmD students increased their self-confidence and skills in finding answers to clinical questions.
Keywords: evidence-based medicine, EBM, PICO, Education Rx, educational prescription
INTRODUCTION
Pharmacists must be trained to efficiently and effectively apply available evidence in order to provide evidence-based pharmaceutical care for patients. The evidence-based medicine (EBM) framework consists of four skills, including ask (translate uncertainty into a question), acquire (retrieve evidence to answer the question), appraise (critically evaluate the evidence), and apply (implement the evidence into practice).1 This framework is relevant to all healthcare providers to improve health and health care. In 2007, the Institute of Medicine’s Roundtable on Evidence-Based Medicine created a goal: “by the year 2020, 90% of clinical decisions will be supported by accurate, timely, and up-to-date clinical information, and will reflect the best available evidence.”2
Recognizing the importance of EBM skills, the Accreditation Council for Pharmacy Education (ACPE) Accreditation Standards and Key Elements for the Professional Program in Pharmacy Leading to the Doctor of Pharmacy Degree (Standards 2016) and the 2013 Center for Advancement of Pharmacy Education (CAPE) Outcomes describe educational goals that embrace EBM skills, including patient-centered and population-based care. Standards 2016 also includes key elements for the regular assessment of evidence-based clinical reasoning skills throughout the curriculum, including “the ability to apply these skills across the patient’s lifespan, and the retention of knowledge that underpins these skills.”3,4
While inclusion in the ACPE standards and CAPE Outcomes underscores the importance of these skills, defining the optimal way to teach EBM skills has been challenging. Khan and Coomarasamy describe an approach that includes a hierarchy to effectively teach and learn EBM activities. The hierarchy consists of three levels, starting from lecture-based classroom teaching and advancing to interactive and clinically integrated activities, which Khan and Coomarasamy suggest lead to high-level educational outcomes. Such learning experiences mirror practice and facilitate meaningful comprehension.5 Another model to guide EBM training is a whole-task approach described by the Four Component Instructional Design (4C/ID) model. Within the 4C/ID model, complex skill training includes learning tasks, supportive information, procedural information, and part-task practice. These components enable the transfer of pharmacy students’ knowledge and skills into clinical practice.6
Although high-level educational outcomes are achieved from learning activities that are interactive and integrated directly into clinical teaching,5 incorporating specific EBM learning activities into a busy clinical practice during advanced pharmacy practice experiences (APPEs) can be challenging. This may be particularly difficult for preceptors who have not learned the EBM framework or nomenclature, and therefore may not be able to explicitly role model and explain how they retrieve, evaluate, and apply evidence in practice. As experiences for learners differ from site to site, a consistent approach to evaluate EBM skills is needed.
To address this challenge, medicine has incorporated the use of an “educational prescription” in clinical teaching of EBM skills during internal medicine,7,8 family medicine,8 pediatric,8 and emergency medicine9 clerkships. An educational prescription is a learning assignment co-written by the preceptor and learner that specifies the clinical problem that generated the question; states the question in all four of its key elements, ie, patient/population, intervention, comparison, outcome (PICO); specifies who is responsible for answering it; and reminds everyone of the deadline for answering it, taking into account the urgency of the clinical problem that generated it. These prescriptions also help learners practice the important lifelong habit of using EBM daily to help answer clinical questions. One study evaluated educational prescriptions answered by medical students on internal medicine, family medicine, and pediatric clerkships at a single institution. In a survey completed after the activity, approximately 40% of respondents stated the activity helped them better format a clinical question and appraise its answer.8
Although some of the clinical questions between medicine and pharmacy may differ, the essential EBM skills are the same. Using this framework, a group of ambulatory care clinical faculty members at the University of Minnesota College of Pharmacy adapted the educational prescription (Education Rx) activity for use during ambulatory care APPEs. The primary objectives of this study were to design, implement, and assess the use of Education Rx in ambulatory care APPEs. The secondary objective included assessing the impact of Education Rx assignments on students’ self-efficacy to practice EBM.
METHODS
The main portion of the Education Rx assignment involved completing a PICO for the clinical question. The grading rubric included three sub-scores: a PICO conformity score, presence of an answer to the PICO, and quality of the answer to the PICO. In the PICO conformity sub-score, modeled after Thomas and Cofrancesco,10 students received up to two points each for clearly stating the patient/population, intervention, comparison, and outcome, with completion of all elements resulting in a maximum score of eight points. The sub-score for quality of the answer was evaluated on three characteristics: directness (does the response directly answer the question posed by the PICO?), evidence (is there evidence provided to support the answer?), and preferred management (does the answer indicate a preferred management for this patient or population?), resulting in a maximum score of six points. Finally, students received one additional point if an answer was present. The three sections were summed, resulting in a final score out of a possible 15 points.
To prepare students and preceptors for the Education Rx, instructional guides were posted on the course management website (Moodle 2.8, Moodle Pty Ltd; West Perth, WA, Australia). Both the preceptor and student guides included a detailed description of the assignment rationale and instructions. The guides included three examples of an Education Rx. The preceptor guide contained additional information about the frequency of Education Rx assignments, as well as how to access the assignments and rubrics within the course management website.
Students enrolled in an ambulatory care APPE at a clinical faculty member’s practice site were expected to complete four Education Rx assignments per block based on clinical questions that arose during the rotation. These sites were chosen for this project because the faculty members were part of a standing teaching, practice, and research group at our institution. The assignments were submitted by students through the course management website. In addition to providing context and a brief background of the clinical situation, students were required to complete each of the following sections of the Education Rx: source of information, results of the information, appraisal of validity and relevance of the evidence, answer to the clinical question, and a self-evaluation of the process. Students could choose from the following sources: guideline, primary literature, tertiary literature (eg, review article, textbook), point-of-care reference (eg, DynaMed, UpToDate, Micromedex), and/or expert opinion. Students could select more than one information source. Demographic information was collected, including APPE block number, rotation site, and number of the Education Rx in the series (ie, one of four). Each preceptor evaluated all of his or her students’ Education Rx assignments using a rubric.10-12 A single investigator evaluated the assignment if the preceptor was unable to complete grading within a reasonable timeframe. Rubric ratings were not shared with students; however, students received verbal feedback directly from their preceptor, which included an assessment of their performance on some elements of the rubric. The study investigators wanted to analyze the data from the rubric ratings before deciding whether ratings would be included in APPE grades or shared with students.
Students were surveyed at the end of the APPE using Qualtrics (Qualtrics, Provo, UT) to rate their perceived self-efficacy on seven EBM skills: formulating a well-constructed clinical question, identifying the best resources to answer a question in the least amount of time, conducting an online search to acquire evidence, critically appraising evidence, determining if the evidence collected is relevant to the question, applying the evidence found, and efficiently executing the steps of evidence-based medicine during patient care. The skills were selected after reviewing two published tools.13,14 Students were also allowed to provide additional free-text comments about their experience with the Education Rx.
Rubric scores were analyzed using descriptive statistics. Scores were summed and averages for each sub-score were calculated. Frequency of use was calculated for source of information used. Results from the student self-efficacy survey were summed by Likert response, and the frequency of responses was calculated. The University of Minnesota’s Institutional Review Board deemed this study exempt from review. All analyses were conducted using Excel.
RESULTS
Thirty students completed 112 Education Rxs across seven sites during the 2015-2016 academic year. Two were excluded because of incomplete assessment by the evaluator, leaving 110 Education Rxs for analysis. The average (SD) Education Rx score was 13.6 (2.2) out of 15 total possible points (Table 1). The sub-score for the PICO conformity section of the rubric was 7.3 (1.3) out of 8 possible points, and the sub-score for the quality of the answer section was 5.3 (1.2) out of 6 possible points. Only one Education Rx did not have an answer. Students consulted point-of-care references for the majority of answers (66%). Primary literature (50%) and tertiary sources (50%) were also used frequently, followed by guidelines (29%) (Table 2).
Table 1.
Scores of Pharmacy Students Who Completed Education Rx Assignments During Advanced Pharmacy Practice Experiences (N=110)
Table 2.
Frequency With Which Pharmacy Students Used Resources to Complete Education Rx Assignments During Advanced Pharmacy Practice Experiences (N=110)
Sixteen (53%) students representing four of the seven APPE sites completed the self-assessment survey regarding the value of this educational activity. All respondents agreed or strongly agreed that completing the Education Rx activity increased their confidence in their ability to formulate a well-constructed clinical question, determine if the evidence found was relevant to the question, and apply the evidence found to the question. Additionally, 15 (94%) survey respondents agreed or strongly agreed that completing the Education Rx activity increased confidence in their ability to conduct an online search to acquire the evidence to answer a question, critically appraise the validity of the evidence found to answer a question, and efficiently execute the steps of evidence-based medicine to care for patients. Lastly, 12 (75%) survey respondents agreed or strongly agreed that completing the Education Rx activity increased confidence in their ability to identify the best resources to answer a question in the least amount of time (Table 3).
Table 3.
Student Evaluation of Confidence in Their Ability to Perform Evidence Based Medicine Tasks (N=16)
DISCUSSION
The results of this Education Rx activity showed that students submitted quality work and that the activity increased their confidence in their ability to efficiently answer clinical questions. The Education Rx activity presented an opportunity for students to take a whole-task approach to using EBM skills. Constructing questions based on problems they encountered in their APPEs, conducting searches for information, appraising the acquired information, and applying it to answer the questions provided students with an authentic experience to practice EBM skills. The activity was successfully implemented at the point of care across sites. These results are similar to those reported among medical students.7,8 Nixon and colleagues implemented a similar educational prescription for medical students completing internal medicine clerkships. Each student submitted one prescription, with a total of 191 prescriptions evaluated. A total of 61% of students achieved a PICO conformity score of seven or eight. For quality of answer, their study used a three-point scale, with the majority of students receiving two to three points.7
Several unexpected findings emerged from this study. Based on unsolicited feedback, many students had not previously used the PICO format for constructing clinical questions. Despite this inexperience, students were able to complete an Education Rx successfully, as evidenced by preceptor evaluations. Students struggled most with defining the “outcome” portion of the PICO, often confusing it with a general clinical outcome. This finding suggests that students need coaching to clarify what type of answer the preceptor expects to the clinical question. Student self-assessments were positive; students reported appreciating the opportunity to frame a question and perform a defined search process with a real patient. Overall, the researchers felt that this was a valuable experience, as it ensured practical use of a process that is necessary for all pharmacists and could easily be implemented at other sites.
Point-of-care references were used most frequently, while primary literature and guidelines were used less often. Comparing the information sources used in Education Rx assignments to those taught in the didactic curriculum is useful for understanding whether the didactic curriculum adequately prepares students for experiential education. The didactic curriculum may emphasize primary literature and guidelines; however, this emphasis was not reflected in the sources students chose to use in practice. Students may rely on point-of-care sources in practice when they are looking for answers quickly. Because students often confused the “outcome” part of the PICO with a clinical outcome, it is not surprising that point-of-care references were used most frequently. Had students correctly defined the “outcome” of the PICO, they may have used other references, such as guidelines and primary literature.
Since this initial study, EBM material has been threaded throughout the University of Minnesota’s didactic curriculum, exposing students to the PICO format and the Education Rx activity earlier than their APPE year. To this end, the institution has expanded the activity to all patient care APPEs, including community, acute care, and patient care electives. Each APPE type has one teaching assistant who grades the Education Rx, which results in consistent application of the rubric. A future direction of this research could include evaluating improvements in students’ confidence in answering the EBM question and in the quality of their answers as they complete six to eight Education Rx activities during their fourth year.
This study had some limitations. Only ambulatory care APPE sites with clinical faculty preceptors were used. This could have introduced researcher bias because of the faculty members’ immersion in the overall curriculum and dedication to implementing EBM. Additionally, because only ambulatory care APPE sites were used, the results cannot be extrapolated to other rotation types (eg, community, acute care, institutional). The response rate for the self-assessment was low, with just over half of the students responding; therefore, the results may not reflect the views of all students. Finally, while the rubric allowed for the evaluation of a student’s ability to correctly document the PICO, it was not designed to evaluate the quality of the student response in terms of clinical relevance or accuracy. Such evaluations may be best performed by the preceptor on site, which may require additional preceptor training and development. Given that some precepting faculty members in this study did not successfully submit rubric ratings even with preceptor training, it may be difficult to get all preceptors to effectively and consistently provide written feedback using the grading rubric in a timely manner. Determining what prevented precepting faculty members from completing and submitting rubric ratings is an important next step.
CONCLUSION
Central to the development of a clinically sound, contemporary practitioner is the ability to quickly assess a patient situation or clinical question, review the current literature, and summarize and apply the findings to the patient case. The successful integration of this process has been evaluated through use of a “prescription” in the practice and training of medical residents and students. A modification of this process, the Education Rx, is the first of its kind to be evaluated in pharmacy students. Through the Education Rx, students were able to apply the PICO assessment to a patient care scenario, and preceptors were able to better assess a student’s thought process in answering the question. Additionally, the use of this tool increased students’ confidence in critically appraising evidence to answer a clinical question.
ACKNOWLEDGMENTS
The authors would like to thank Claire Kolar and Jean-Baptiste Quillien for their assistance with data organization and analysis.
REFERENCES
- 1. Dawes M, Summerskill W, Glasziou P, et al. Sicily statement on evidence-based practice. BMC Med Educ. 2005;5(1):1.
- 2. Institute of Medicine US Roundtable on Evidence-Based Medicine. Leadership Commitments to Improve Value in Healthcare: Finding Common Ground: Workshop Summary. Washington, DC: National Academies Press (US); 2009. https://www.ncbi.nlm.nih.gov/books/NBK52847/. Accessed February 27, 2018.
- 3. Accreditation Council for Pharmacy Education. Accreditation standards and key elements for the professional program in pharmacy leading to the Doctor of Pharmacy degree. July 1, 2016. https://www.acpe-accredit.org/pdf/Standards2016FINAL.pdf. Accessed February 27, 2018.
- 4. Medina MS, Plaza CM, Stowe CD, et al. Center for the Advancement of Pharmacy Education 2013 educational outcomes. Am J Pharm Educ. 2013;77(8):162.
- 5. Khan KS, Coomarasamy A. A hierarchy of effective teaching and learning to acquire competence in evidence-based medicine. BMC Med Educ. 2006;6(1).
- 6. Maggio LA, Cate OT, Irby DM, O’Brien BC. Designing evidence-based medicine training to optimize the transfer of skills from the classroom to clinical practice: applying the four component instructional design model. Acad Med. 2015;90(11):1457-1461.
- 7. Nixon J, Wolpaw T, Schwartz A, Duffy B, Menk J, Bordage G. SNAPPS-Plus: an educational prescription for students to facilitate formulating and answering clinical questions. Acad Med. 2014;89(8):1174-1179.
- 8. Umscheid CA, Maenner MJ, Mull N, et al. Using educational prescriptions to teach medical students evidence-based medicine. Med Teach. 2016;38(11):1112-1117.
- 9. Ismach RB. Teaching evidence-based medicine to medical students. Acad Emerg Med. 2004;11(12):e6-e10.
- 10. Thomas PA, Cofrancesco J. Introduction of evidence-based medicine into an ambulatory clinical clerkship. J Gen Intern Med. 2001;16(4):244-249.
- 11. Huang X, Lin J, Demner-Fushman D. Evaluation of PICO as a knowledge representation for clinical questions. AMIA Annu Symp Proc. 2006:359-363.
- 12. Tilson JK, Kaplan SL, Harris JL, et al. Sicily statement on classification and development of evidence-based practice learning assessment tools. BMC Med Educ. 2011;11(1).
- 13. Salbach NM, Jaglal SB. Creation and validation of the evidence-based practice confidence scale for health care professionals. J Eval Clin Pract. 2010;17(4):794-800.
- 14. Melnyk BM, Fineout-Overholt E, Mays MZ. The evidence-based practice beliefs and implementation scales: psychometric properties of two new instruments. Worldviews Evid Based Nurs. 2008;5(4):208-216.