American Journal of Pharmaceutical Education. 2018 Oct;82(8):6390. doi: 10.5688/ajpe6390

Learning Activities to Build Population Health Management Skills for Pharmacy Students

Amy L Pakyz a, Kai I Cheang a, Jeremy S Stultz a, Leticia R Moczygemba b
PMCID: PMC6221523  PMID: 30425402

Abstract

Objective. To describe the implementation and evaluation of population health management learning activities in a second-year Doctor of Pharmacy (PharmD) course.

Methods. Population health learning sessions were implemented in a step-wise manner: population needs assessment activity to identify priority programs for implementation given a specific patient population; didactic materials to introduce program evaluation foundational knowledge; program evaluation design activity to evaluate implemented programs using the Centers for Disease Control and Prevention’s Program Evaluation Framework; and evaluation of program outcome data. Students’ self-rated abilities (grouped into Bloom’s Taxonomy classifications) and perceptions before and after program evaluation activities were assessed. Qualitative analyses evaluated student feedback on learning sessions.

Results. Students’ self-rated abilities for all Bloom’s classifications increased after the learning sessions. Student perceptions on importance of program evaluation also improved (from 71% reporting “agree” or “strongly agree” pre-activities to 79% post-activities). Students found the application to case scenarios and the opportunity to integrate each component of program evaluation into a complete process useful.

Conclusion. Step-wise population health management learning sessions were implemented, culminating in skill-based program evaluation activities. The activities improved students’ self-rated abilities and perceptions regarding program evaluation. Areas for improvement for the learning sessions were also identified and will inform future instructional design.

Keywords: Population health management, program evaluation, CAPE Educational Outcomes

INTRODUCTION

Recent developments in health care emphasize high quality care that is accessible and affordable.1 Population health management is one area that has received significant attention. The Accountable Care Organization model, in which different types of providers share joint responsibility for spending and quality for defined patient populations, is increasingly used in both federal and private-sector health plans.2 As vital members of health care teams, pharmacists need population health management skills to deliver health care services at the population level. Population health management has been selected as a top focus area for health-system pharmacist leaders because of its potentially significant impact on pharmacy practice in the coming years.3

Population health management is also a learning domain put forth by the Center for the Advancement of Pharmacy Education (CAPE) 2013 Educational Outcomes.4 The outcomes include four domains: foundational knowledge, essentials for practice and care, approach to practice and care, and personal and professional development. Within each domain are associated subdomains and suggested learning objectives to inform curriculum planning for US pharmacy colleges and schools. Population health management is a subdomain within the essentials for practice and care domain. Its learning objectives include assessing the health needs of a specific population, and implementing and evaluating interventions designed to improve the health of the population.

Published examples of curricular approaches to prepare students for population health management in PharmD programs are lacking. The authors of this study implemented a series of population health management learning sessions and activities, which consisted of a population needs assessment activity, a program evaluation foundational knowledge didactic component, and program evaluation application activities. The objective of this report is to describe the implementation of these learning sessions. In addition, the program evaluation application activities were assessed to determine whether students’ self-rated abilities improved, and to gauge students’ perceptions of the activities’ importance and utility. Results of this evaluation can inform the development of active-learning methods on population health management, and may aid pharmacy faculty in undertaking such efforts.

METHODS

In fall 2015, learning activities were introduced to the Epidemiology and Pharmacy Practice course (n=135) at Virginia Commonwealth University. This two-credit course is a second-year required course in the four-year PharmD program. Previously, this course only provided an introduction to the principles of epidemiology and their relation to pharmacy practice, including critical evaluation of the pharmacoepidemiologic literature. In fall 2015, new population health management learning activities were added, building on epidemiologic concepts from this course, and students’ previously acquired knowledge on research methods and statistics from other courses. The new learning sessions spanned six classroom hours as described below and depicted in Figure 1.

Figure 1. Learning Activities and Assessment

Needs Assessment Activity: Students completed a needs assessment survey (full survey available from authors upon request) via Qualtrics (Provo, UT). The purpose was to create a simulated patient population with diabetes. Diabetes was chosen because it was being taught in a concurrent pharmacotherapeutic course, thus allowing students to apply their therapeutic knowledge. For the purpose of this survey, students were asked to imagine themselves as patients with diabetes. They answered questions related to demographics, socioeconomic factors, social support measures, and health behaviors. They were also asked whether they were adherent to medications using the four-item Morisky scale.5 Quality-of-life questions were composed of the 13 items from the Audit of Diabetes-Dependent Quality of Life questionnaire (ADDQoL).6 After survey completion, the course coordinators compiled students’ responses into a spreadsheet and added other variables normally obtained through medical records (eg, prescription records, and clinical findings such as other diagnoses and laboratory values). The laboratory values and clinical information that were added were derived from the National Committee for Quality Assurance’s Healthcare Effectiveness Data and Information Set (HEDIS) 2015 measures7 regarding diabetes care standards (eg, A1c and blood pressure values).

The students then participated in a 2-hour in-class population needs assessment activity, working together in teams of five to six (24 teams total). Students were provided with the spreadsheet of survey responses, medications, and clinical information. Building upon skills learned previously in this course and their statistics course, students performed descriptive analyses on select variables, including patient demographics and diabetes care quality indicators.7-9 For example, students summarized findings pertaining to diabetes care standards, such as the proportion of patients with at-goal blood pressure (<140/90 mmHg)8 and with at-goal A1c (<7.0%).9 They also performed stratified analyses, such as calculating the proportion of patients who were taking insulin but had not received diabetes education, or who had hypertension but were not prescribed an angiotensin converting enzyme inhibitor or angiotensin II receptor blocker (ACE/ARB). In addition, students summarized the most frequently occurring subdomains of the ADDQoL instrument6 and the percentage of patients in each Morisky adherence category.5 They then constructed 2x2 tables to ascertain odds ratio estimates of relevant outcomes (eg, odds of uncontrolled hypertension among current smokers vs non-smokers).
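
As an illustration of the kinds of calculations the teams performed, the sketch below computes at-goal proportions and a 2x2 odds ratio in Python with pandas. The data and column names are hypothetical stand-ins, not the actual course spreadsheet:

```python
import pandas as pd

# Hypothetical stand-in for the compiled class spreadsheet; the actual
# dataset and variables were created by the course coordinators.
df = pd.DataFrame({
    "sbp":    [150, 160, 128, 138, 144, 132, 126, 118],
    "dbp":    [ 92,  95,  80,  76,  88,  78,  82,  72],
    "a1c":    [8.1, 9.0, 6.8, 7.4, 7.9, 6.5, 6.9, 7.2],
    "smoker": [True, True, True, True, False, False, False, False],
})

# Descriptive quality indicators: proportion of patients at goal
at_goal_bp = ((df["sbp"] < 140) & (df["dbp"] < 90)).mean()
at_goal_a1c = (df["a1c"] < 7.0).mean()
print(f"At-goal BP (<140/90 mmHg): {at_goal_bp:.0%}")
print(f"At-goal A1c (<7.0%): {at_goal_a1c:.0%}")

# 2x2 table: smoking status vs uncontrolled hypertension
df["uncontrolled_htn"] = (df["sbp"] >= 140) | (df["dbp"] >= 90)
table = pd.crosstab(df["smoker"], df["uncontrolled_htn"])

# Odds ratio = (a*d)/(b*c) from the 2x2 table cells
a = table.loc[True, True]    # smokers, uncontrolled
b = table.loc[True, False]   # smokers, controlled
c = table.loc[False, True]   # non-smokers, uncontrolled
d = table.loc[False, False]  # non-smokers, controlled
print(f"Odds ratio (smokers vs non-smokers): {a * d / (b * c):.2f}")
```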

At the end of the activity, based on their analyses, the teams generated and ranked the three highest-priority programmatic interventions (eg, diabetes counseling, smoking cessation, fitness-related focus program). A team representative recorded each team’s proposed programs in a table that was projected to the whole class. The number of times that a particular program was listed in the table was tallied, and the top two were identified.

Program Evaluation Foundational Knowledge: A 2-hour didactic lecture on program evaluation was provided. The CDC’s Framework for Program Evaluation10 was used to introduce students to the six program evaluation steps, which were: engage stakeholders, describe the program, focus the evaluation design, gather credible evidence, justify conclusions, and ensure use and share lessons learned. Examples of the six steps were also provided.

Program Evaluation Activity: Student teams participated in two in-class program evaluation activities. The first part of the activity involved applying the CDC framework in the primary care setting using the two programmatic interventions that students identified during the needs assessment activity – initiation of an ACE/ARB in patients with hypertension who were not currently on one, and initiation of a smoking cessation program for patients who smoke (Appendix 1). Using the CDC framework, student teams answered questions related to designing program evaluation for these specific programs. For example, for the step “engage key stakeholders,” students identified relevant stakeholders needed for different purposes (eg, to increase program credibility, to authorize continuation of the program). For the steps related to “focus evaluation design” and “gather credible evidence,” students recorded the main outcomes that they would measure, and how they would measure and analyze them.

In the second part of the activity, students were provided simulated program outcome data generated by course instructors. Two different case scenarios were presented with varying program results, levels of target patient participation, and levels of provider and patient satisfaction with the program (Appendix 2). The student teams discussed and recorded whether each specific program should be continued and its justifications (addressing CDC’s “justify conclusion” step). At the conclusion of each activity, the instructors provided a debriefing and facilitated class discussion.

The in-class needs assessment and program evaluation activities were graded, and a team score was given as part of the graded assignments component of the course. Further, the second course examination included five multiple-choice questions based on the program evaluation foundational knowledge.

After completion of the program evaluation activity, students took an optional paper-based survey specifically created for this evaluation. The survey was designed as a retrospective pre- and post-assessment of students’ perceptions of their abilities regarding program evaluation. This design involves a “self-report during the course or at the end of treatment that measured subjects’ recall of how they were functioning before program outset.”11 This method was chosen to limit response-shift bias that can occur when a respondent’s perception alters after additional experiences.12-14 Validity of this design has been demonstrated in studies evaluating student perceptions and self-assessments.15-20

The self-assessment survey consisted of 14 questions. The first 10 questions evaluated students’ self-rated ability in the achievement of instructional goals regarding the CDC framework components (questions 1-10, Figure 2). Students rated their abilities on a Likert scale (weak, fair, good, very good), both before and after the program evaluation activities. In addition, questions 11 and 12 evaluated students’ attitudes about program evaluation. These items were “I feel it is important for pharmacy students to learn about program evaluation” and “program evaluation will be helpful to me as a pharmacist,” with possible responses of strongly disagree, disagree, agree, and strongly agree. Questions 13 and 14 were open-ended and evaluated student-perceived strengths and weaknesses of the activities. These questions were “what aspect of the program evaluation assignment most enhanced your learning about the process of program evaluation?” and “what aspect of the program evaluation assignment could have been changed to improve the active learning activity?”

Figure 2. Students’ Self-rated Abilities Before and After Program Evaluation Activity (N=127 students)

The 10 questions on self-rated abilities were categorized into one of the six cognitive domain levels of Bloom’s Taxonomy of Educational Objectives through a consensus process.21 Three reviewers independently evaluated each question and assigned one of the six levels to it. If all three reviewers agreed on the classification, that level was used; otherwise, in-person discussion was held to reach consensus. Using the same consensus process, the reviewers further grouped the 10 questions into three broad Bloom’s Taxonomy classifications, combining those that involved remember or understand, apply or analyze, and create or evaluate (Figure 2).
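
The unanimity rule in this consensus process can be illustrated with a short sketch; the question labels and reviewer assignments below are invented, and the grouping dictionary follows the three broad classifications named above:

```python
# Broad Bloom's classifications, mapped from the six cognitive levels
BROAD = {
    "remember": "remember/understand", "understand": "remember/understand",
    "apply": "apply/analyze",          "analyze": "apply/analyze",
    "evaluate": "evaluate/create",     "create": "evaluate/create",
}

# Hypothetical independent assignments by the three reviewers
ratings = {
    "Q1": ("understand", "understand", "understand"),
    "Q2": ("apply", "analyze", "apply"),
}

for question, levels in ratings.items():
    if len(set(levels)) == 1:  # all three reviewers agree
        level = levels[0]
        print(f"{question}: consensus on '{level}' -> {BROAD[level]}")
    else:  # any disagreement is resolved by in-person discussion
        print(f"{question}: no unanimous agreement; discuss in person")
```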

Survey responses reflecting students’ self-rated abilities (questions 1-10) after the foundational knowledge learning session alone were compared with responses after the additional application activities, using the Wilcoxon Matched-Pairs Signed-Rank test (2-tailed, p≤.05 considered significant). Responses to the attitudinal questions 11 and 12, pertaining to the importance and applicability of the program evaluation learning sessions, were summarized. Content analysis was performed on questions 13 and 14 concerning the aspects that most enhanced learning, and opportunities to improve the learning sessions. For this, the authors independently reviewed student responses and categorized them, then met in person as a group to review and reach consensus on final categories. The number of responses within each category was tallied and percentages were calculated. The Virginia Commonwealth University Institutional Review Board approved this study.
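
To illustrate this analysis, the minimal sketch below applies SciPy’s implementation of the Wilcoxon matched-pairs signed-rank test to paired ordinal ratings. The Likert coding (weak=1 through very good=4) and the response data are invented for illustration and are not the study’s data:

```python
from scipy.stats import wilcoxon

# Likert coding assumed for illustration: weak=1, fair=2, good=3, very good=4
# Hypothetical paired self-ratings for one survey question (pre vs post)
pre  = [2, 1, 3, 2, 2, 1, 3, 2, 2, 3, 1, 2]
post = [3, 3, 4, 3, 2, 2, 4, 3, 3, 4, 2, 3]

# Wilcoxon matched-pairs signed-rank test (two-tailed by default);
# tied pre/post pairs (zero differences) are dropped under the default
# zero_method="wilcox"
stat, p_value = wilcoxon(pre, post)
print(f"W = {stat:.1f}, p = {p_value:.4f}")
```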

RESULTS

Out of 135 students, 127 responded to questions 1-10 concerning their self-rated abilities before and after the program evaluation activities (Figure 2). Students significantly improved in all three Bloom’s classifications (p<.0001 for each). Compared with their ratings after the foundational knowledge learning session alone, more students following the additional application activities reported that they felt it was important to learn about program evaluation (question 11, increasing from 96 to 106 students reporting agree/strongly agree). In addition, after the activities, more students agreed that program evaluation would be helpful to them as pharmacists (question 12, increasing from 93 to 104 students reporting agree/strongly agree).

Ninety-eight students responded to question 13 on aspects of the activities that most enhanced learning (Table 1). Aspects that students frequently deemed helpful were opportunities to apply knowledge of specific components of the program evaluation process (n=30). The most common specific components cited were defining outcomes (n=8), and differentiating outcome vs process variables (n=6). Other frequently reported aspects that enhanced student learning included using real-life scenarios (n=27), and going through the whole process and seeing how the components fit together (n=19). Representative quotations are listed in Table 1.

Table 1. Aspects of the Program Evaluation Activities that Most Enhanced Students’ Learning, as Reported by Students (N=98)

Results from the content analysis of question 14, pertaining to one aspect of the learning activities that could have been improved, are presented in Table 2. Among 85 total responses, the most frequently requested change was to the clarity, organization, and type of directions (n=31), particularly within the activity (n=27), followed by more examples or review (n=27). Some students also felt the activity was too long, and that too much time passed between the lecture and the activities (due to the one-week Thanksgiving break). Table 2 provides representative quotations.

Table 2. Aspects of the Program Evaluation Activities That Could Be Most Improved, as Reported by Students (N=85)

For the program results scenarios, all students made the correct determination for continuing or discontinuing a particular program, and provided appropriate justification (eg, based on program efficacy and stakeholders’ satisfaction). Fifteen of the 24 teams identified the appropriate constraints.

DISCUSSION

This article describes the implementation of population health management learning sessions, which consisted of a population needs assessment activity, a program evaluation foundational knowledge didactic component, and program evaluation application activities.

The effect of the program evaluation application activities on students’ learning was also evaluated. Students reported that the activities improved their skills in program evaluation compared with the didactic foundational knowledge learning session alone. In addition, students identified strengths of the activities, for example, helping them integrate key CDC framework components into the whole program evaluation process, and the practical, real-life nature of the activities. They also identified opportunities for improvement, including the clarity of the directions within the activities and the need for more examples within the foundational knowledge didactic session.

The ability to perform needs assessments and program evaluations is vital to practicing pharmacists.22 A didactic description of these skills is necessary to familiarize students with the process, but allowing students to work through example scenarios is a critical step in ensuring that pharmacy students graduate able to apply them in practice. The activities described in this study used learning skills at all levels of Bloom’s Taxonomy (Figure 1).21 Students’ self-rated improvement in learning highlighted the activities’ impact on their understanding of and skills in program evaluation. The beneficial effect of the application activities was also evident in the content analyses (Table 1). Notably, the majority (77.6%) of students highlighted that the most beneficial aspects of the activities were the ability to apply specific components of program evaluation and to learn how the specific components and steps fit together.

The activities were designed to strengthen skills previously learned in the curriculum (eg, descriptive data analysis, epidemiologic measures) by incorporating them in a population needs assessment activity. The students applied new information in program assessment principles through active-learning scenario-based activities. Working through an entire program assessment process step by step allowed the students to see how each component fit together, and increased their ability to apply these principles to program scenarios (Table 1).

Based on students’ suggestions regarding strengths and weaknesses of the activities, the authors plan to integrate the skill-based application activities on program evaluation with the foundational knowledge didactic component, rather than delivering the two sequentially in future course offerings. More examples will be added for components on which students desired more clarity (eg, measurement of process vs outcome variables). Furthermore, the authors plan to formally evaluate the population needs assessment activity in future offerings of this course.

There are limitations to this study. The authors relied on student perceptions for learning outcomes. Although more objective measures may be useful, this method has been used in previous reports, and the purpose of this study was to gather initial feedback about the learning sessions.23 Although a retrospective pre-post analysis was used to reduce response-shift bias, similar to previous pharmacy education evaluations of this nature, one could argue that obtaining students’ perceptions before the designed activities would have been preferable.24 Additionally, the student survey regarding program evaluation was optional. Although the majority of students (85 out of 135) elected to complete the survey, selection bias cannot be ruled out. Furthermore, this project was implemented at a single institution, and using multiple institutions could provide more insights. While there was no parallel control group to serve as a comparison, similar to previous instructional interventions, the students served as their own controls.23,24 Having a control group not exposed to the described learning activities would pose ethical concerns.

CONCLUSION

Step-wise population health management learning sessions were developed and implemented in a second-year professional pharmacy course. Students’ self-rated abilities and perceptions regarding program evaluation improved after the skill-based application activities, compared with didactic learning alone. Students found the application to case scenarios and the opportunity to integrate each component of program evaluation into a complete process particularly useful.

Appendix 1. Program Evaluation Activity Part 1

Based on your needs assessment of a population of 136 patients with diabetes at an urban academic medical center primary care clinic, and on discussions with stakeholders including pharmacists, physicians, patients, and medical directors, it was decided that two of your program recommendations would be implemented. These include initiating an ACE/ARB in patients with diabetes and hypertension who are not currently on one, and starting a smoking cessation program for smokers. From the needs assessment, you previously found that 83% (86/104) of patients had hypertension and were not on an ACE/ARB. Further, 49% (68/138) of patients smoked. Both programs were carried out by a pharmacist who, for one year, staffed a half-day-per-week pharmacy clinic located within the primary care clinic to conduct the necessary program activities. The pharmacist’s salary support was covered by the pharmacy department. The activities performed included pharmacist review of medical records in the clinic to identify patients eligible for the program. With assistance from personnel at the registration desk, patients were invited via telephone and postal mail to set up an appointment in the pharmacy clinic. The pharmacist also sent notifications to each patient’s primary care physician to suggest the pharmacy clinic for eligible patients. She performed counseling regarding smoking cessation and provided educational brochures produced by the CDC. If an ACE/ARB and/or smoking cessation-related medications were considered necessary, the pharmacist facilitated obtaining a written prescription from physicians in the clinic. She also tracked refills for patients initiated on drug therapy using refill authorization records from the pharmacy that serviced the clinic.

Your team has been re-consulted to conduct the program evaluation, and to consider a recommendation regarding extending the program to another clinic located off-site from the medical campus.

You will have access to the EMR, but not claims data. The program pharmacist also distributed surveys to the pharmacy-clinic participants. The survey contained questions regarding their satisfaction with services provided during any follow-up pharmacy-clinic appointments. The pharmacist also distributed surveys to the physicians in the clinic to assess their satisfaction with the pharmacist’s services.

Recall the components of the CDC’s program evaluation framework, including the following steps:

engage stakeholders, describe the program, focus the evaluation design, gather credible evidence, justify conclusions, and ensure use and share lessons learned.

Please design a program evaluation based on each step of the framework.

The first step is to engage stakeholders.

  • 1. Identify the top 2 key stakeholders who should have been/should be consulted for the purposes below and explain your reasoning.

    • Who are the key stakeholders needed to:

      • Increase credibility?

      • Implement the interventions that are central to this effort (besides the pharmacist)?

      • Advocate for changes to institutionalize this effort?

      • Fund/authorize continuation or expansion of this effort?

The second step involves describing the program that has taken place. Recall from the lecture that description of the program has the following components: need for the program, target population for the program, outcomes, activities, inputs/outputs, relationship of activities and outcomes, stage of development, context.

  • 2. Based on the population needs assessment exercise, and the program described on page 1, describe: need for the program, target population for the program.

  • 3. Identify the model activities.

  • 4. Using the Basic Program Logic Model structure, identify the INPUTs.

  • 5. What are the constraints of this program?

Next step: Evaluation Design and Gathering Evidence

  • 6. List the main outcomes of the program that you will measure, and how you will measure and analyze them.

    • Outcomes

    • Indicator: Explain how you will measure your outcomes (eg, A1c < 7.0, yes/no variable). What is considered a successful implementation?

    • Data Source

    • Analysis Plan

  • 7. In order to conduct your program evaluation, what are some specific data that you will want to collect from the clinic personnel regarding program fidelity (ie, how well the program was implemented as intended)?

  • 8. Given the three scenarios of program evaluation results, indicate which action you would take:

    • a. Discontinue program.

    • b. Request more resources and then conduct another evaluation.

    • c. Request more time and then conduct another evaluation.

    • d. Continue current program.

    • e. Continue program and extend to another site.

  • 9. Describe the main users of the program findings.

  • 10. Based on your description above, will the program evaluation need Institutional Review Board Approval (human subjects protection)? Why or why not?

Appendix 2. Program Evaluation Activity Part 2

After gathering evidence, the next step in the CDC program evaluation framework is to justify conclusions. For each of the three scenarios of program results below, indicate which action you would take (a-e, below) and state your reasoning:

  • a. Discontinue program.

  • b. Request more resources and then conduct another evaluation.

  • c. Request more time and then conduct another evaluation.

  • d. Continue current program.

  • e. Continue program and extend to another site.

Scenario 1:

24/86 (28%) of patients with diabetes and hypertension who were not on an ACE/ARB have been to the pharmacy clinic at least once and were prescribed one. Of those eligible for refills during the one-year period (n=20), a total of 15 (75%) had their prescriptions refilled on time.

A total of 50 out of 68 smokers (74%) had been referred to the pharmacy clinic; of these, 18 attended the clinic, and a total of 15 reported cessation of smoking by the end of the study period.

Of the 5 MDs that staffed the clinic, 2 filled out an evaluation form; one respondent agreed or strongly agreed with most survey items that the program activities enhanced patient care, while the other respondent disagreed or strongly disagreed with most items.

A total of 30 out of the 60 patients offered the survey completed it; 29/30 indicated that they agreed or strongly agreed that the program activities were beneficial; 1 strongly disagreed.

Scenario 2:

35/86 (41%) of patients with diabetes and hypertension who were not on an ACE/ARB have been to the pharmacy clinic at least once and were prescribed one. Of those eligible for refills during the one-year period (n=28), a total of 26 (93%) had their prescriptions refilled on time.

10/68 (15%) of patients who smoked attended the clinic, and a total of 8 reported cessation of smoking by the end of the study period.

Of the 5 MDs that staffed the clinic, all 5 filled out an evaluation form; 4 of the 5 indicated that they agreed or strongly agreed that the program activities enhanced patient care, while the other MD gave neutral responses.

A total of 22 patients filled out an evaluation form; 15 indicated that they agreed or strongly agreed that the program activities were beneficial, 1 strongly disagreed, and the rest were neutral.

Scenario 3:

44/86 (51%) of patients with diabetes and hypertension who were not on an ACE/ARB have been to the pharmacy clinic at least once and were prescribed one. Of those eligible for refills during the one-year period (n=36), a total of 16 (44%) had their prescriptions refilled on time.

20/68 (29%) of patients who smoked attended the clinic, and a total of 5 reported cessation of smoking by the end of the study period.

Of the 5 MDs that staffed the clinic, 2 filled out an evaluation form; one respondent strongly agreed with most survey items that the program activities enhanced patient care, while the other respondent strongly disagreed with most items.

A total of 9 patients filled out an evaluation form; 4 indicated that they agreed or strongly agreed that the program activities were beneficial, 4 disagreed or strongly disagreed, and 1 was neutral.

REFERENCES

  • 1. U.S. Department of Health and Human Services. About the law. http://www.hhs.gov/healthcare/about-the-law/index.html. Accessed September 11, 2016.
  • 2. Song Z. Accountable care organizations in the U.S. health care system. J Clin Outcomes Manag. 2014;21(8):364–371.
  • 3. Allen SJ, Zellmer WA, Knoer SJ, et al. ASHP Foundation Pharmacy Forecast 2017: strategic planning advice for pharmacy departments in hospitals and health systems. Am J Health Syst Pharm. 2017;74:27–53. doi: 10.2146/sp170001.
  • 4. Medina MS, Plaza CM, Stowe CD, et al. Center for the Advancement of Pharmacy Education 2013 Educational Outcomes. Am J Pharm Educ. 2013;77(8):Article 162. doi: 10.5688/ajpe778162.
  • 5. Shi L, Liu J, Koleva Y, Fonseca V, Kalsekar A, Pawaskar M. Concordance of adherence measurement using self-reported adherence questionnaires and medication monitoring devices. Pharmacoeconomics. 2010;28(12):1097–1107. doi: 10.2165/11537400-000000000-00000.
  • 6. Bradley C, Todd C, Gorton T, Symonds E, Martin A, Plowright R. The development of an individualized questionnaire measure of perceived impact of diabetes on quality of life: the ADDQoL. Qual Life Res. 1999;8(1-2):79–91. doi: 10.1023/a:1026485130100.
  • 7. National Committee for Quality Assurance. HEDIS measures. http://www.ncqa.org/hedis-quality-measurement/hedis-measures/hedis-2015. Accessed September 20, 2016.
  • 8. James PA, Oparil S, Carter BL, et al. 2014 evidence-based guideline for the management of high blood pressure in adults: report from the panel members appointed to the Eighth Joint National Committee (JNC 8). JAMA. 2014;311(5):507–520. doi: 10.1001/jama.2013.284427.
  • 9. American Diabetes Association. Standards of medical care in diabetes – 2015. Diabetes Care. 2015;38(Suppl 1):S33–S40.
  • 10. Centers for Disease Control and Prevention, Program Performance and Evaluation Office. Program evaluation. http://www.cdc.gov/eval/framework/index.htm. Accessed September 9, 2016.
  • 11. Davis NM. A medication error prevention educational retreat. Hosp Pharm. 2000;35(5):466–467.
  • 12. Howard GS, Schmeck RR, Bray JH. Internal validity in studies employing self-report instruments: a suggested remedy. J Educ Meas. 1979;16(2):129–135.
  • 13. Nicholson T, Belcastro PA, Gold RS. Retrospective pretest-posttest analysis versus traditional pretest-posttest analysis. Psychol Rep. 1985;57:525–526.
  • 14. Slack MK, Coyle RA, Draugalis JR. An evaluation of instruments used to assess the impact of interdisciplinary training on health professional students. Issues Interdiscipl Care. 2001;3(1):59–67.
  • 15. Sprangers M, Hoogstraten J. Pretesting effects in retrospective pretest-posttest designs. J Appl Psychol. 1989;74(2):265–272.
  • 16. Skeff KM, Stratos GA, Bergen MR. Evaluation of a medical faculty development program: a comparison of traditional pre/post and retrospective pre/post self-assessment ratings. Eval Health Prof. 1992;15:350–366.
  • 17. Howard GS. Response-shift bias: a problem in evaluating interventions with pre/post self-reports. Eval Rev. 1980;4(1):93–106.
  • 18. Bray JH, Howard GS. Methodological considerations in the evaluation of a teacher-training program. J Educ Psychol. 1980;72(1):62–70.
  • 19. Jackson TR, Popovich NG. The development, implementation, and evaluation of a self-assessment instrument for use in a pharmacy student competition. Am J Pharm Educ. 2003;67(2):Article 57.
  • 20. Jackson TL. Application of quality assurance principles: teaching medication error reduction skills in a “real world” environment. Am J Pharm Educ. 2004;68(1):Article 17.
  • 21. Anderson LW, Krathwohl DR. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. New York, NY: Longman; 2001.
  • 22. Benjamin GC. Ensuring population health: an important role for pharmacy. Am J Pharm Educ. 2016;80(2):Article 19. doi: 10.5688/ajpe80219.
  • 23. Warholak TL, Noureldin M, West D, Holdford D. Faculty perceptions of the educating pharmacy students to improve quality (EPIQ) program. Am J Pharm Educ. 2011;75(8):Article 163. doi: 10.5688/ajpe758163.
  • 24. Gilligan AM, Myers J, Nash JD, et al. Educating pharmacy students to improve quality (EPIQ) in colleges and schools of pharmacy. Am J Pharm Educ. 2012;76(6):Article 109. doi: 10.5688/ajpe766109.
