American Journal of Pharmaceutical Education
2011 Aug 10;75(6):116. doi: 10.5688/ajpe756116

A Quality Improvement Course Review of Advanced Pharmacy Practice Experiences

T Lynn Stevenson, Lori B Hornsby, Haley M Phillippe, Kristi Kelley, Sharon McDonough
PMCID: PMC3175668  PMID: 21931454

Abstract

Objectives. To determine strengths of and quality improvements needed in advanced pharmacy practice experiences (APPE) through a systematic course review process.

Design. Following the “developing a curriculum” (DACUM) format, course materials and assessments were reviewed by the curricular subcommittee responsible for experiential education and by key stakeholders. Course sequence overview and data were presented and discussed. A course review worksheet was completed, outlining strengths and areas for improvement.

Assessment. Student feedback was positive. Strengths and areas for improvement were identified. The committee found reviewing the sequence of 8 APPE courses to be challenging.

Conclusions. Course reviews are a necessary process in curricular quality improvement but can be difficult to accomplish. We found overall feedback about APPEs was positive and student performance was high. Areas identified as needing improvement will be the focus of continuous quality improvement of the APPE sequence.

Keywords: curriculum, quality assurance, advanced pharmacy practice experiences, assessment

INTRODUCTION

Curriculum management (development and assessment) is often difficult for colleges and schools of pharmacy. The Accreditation Council for Pharmacy Education (ACPE) mandates that all colleges and schools of pharmacy have in place a curriculum committee that is responsible for the “development, organization, delivery and improvement” of the college or school's professional curriculum.1 With accreditation standards focusing on objectives and outcomes, this process can be challenging. The ACPE accreditation standards also state that the curriculum committee must conduct “orderly and systematic reviews of curricular structure, content, process, and outcomes, based on assessment data” (Guideline 10.2) as well as use “a system of evaluation of curricular effectiveness” (Guideline 15.2).1 The experiential curriculum accounts for approximately 30% of a professional degree program in pharmacy. Because of this and because of the many variables (multiple advanced pharmacy practice experience [APPE] courses, faculty members, affiliate preceptors, and training sites) involved in experiential education, continuous quality assessment and improvement is of the utmost importance for all colleges and schools of pharmacy.

A Medline search using the terms course review, advanced pharmacy practice experience, pharmacy education, medical curriculum, curriculum development, curriculum evaluation, and curriculum reform was conducted for English-language articles published from 1966 through November 2010. Abstracts presented at the American Association of Colleges of Pharmacy (AACP) annual meetings from 2005 through 2010 were searched for relevant data. Articles pertinent to course reviews and curriculum changes were identified and reviewed.

The course review process used at the University of Kentucky College of Dentistry in 2000 consisted of 4 components: (1) course documentation, (2) self-assessment by the course director, (3) peer review, and (4) system assessment. Outcomes of their review process included curriculum and faculty development as well as increased peer interactions.2 Others have described curricular change in medical education involving leadership, governance, communication, faculty development, integration of courses, assessment of instructional methods, student assessment, and overall program evaluation.3

Barham described a well-designed course review process as one that “addresses curricular issues and leads to identification of possible solutions” resulting in “achievement of the desired educational outcomes.”4 Course reviews should aim for course improvement through identification of problems and include multiple assessments (ie, self, peer, and student) with participation from both internal and external reviewers. Maintaining a cooperative and collaborative review process is important. Activities within the course review process may ultimately result in faculty development as well.4

A process of curricular review and mapping within the University of Oklahoma College of Pharmacy has been described, but no information was found in the literature regarding a systematic course review process specifically for APPEs.5 Therefore, the purpose of this project was to determine quality improvements for APPEs through a systematic course review process at Auburn University Harrison School of Pharmacy (AUHSOP). This course review is part of the overall continuous quality improvement program of the curriculum. Findings from the course review are presented, including strengths of and areas for improvement in the APPE sequence, as well as reflections on the course review process.

DESIGN

The school's mission is to prepare graduates who are highly competent and can deliver primary pharmacy care both independently and collaboratively with other healthcare providers. The purpose of APPEs is to give student pharmacists opportunities to develop and demonstrate achievement of the school's ability-based outcomes. The APPE course series consisted of eight 5-week practice experiences for all fourth-year student pharmacists. The course sequence included the following experiences: 1 acute care medicine, 1 community pharmacy practice, 1 health-system pharmacy practice, 2 ambulatory care/primary care, 1 selective (repeat acute care medicine or ambulatory care/primary care), 1 drug information, and 1 elective. The practice experiences were precepted by pharmacy practice faculty members and affiliate faculty members in training sites across the state of Alabama; West Central Georgia; Pensacola, Florida; and Biloxi, Mississippi. Practice experience areas included primary care, cardiology, neurology, infectious disease, pediatrics/neonatology, pulmonary medicine, drug information, health-system pharmacy, public health/Indian Health Service, pharmaceutical industry, pharmacy management, managed healthcare, long-term care, home healthcare, professional associations, academia, and clinical research.

The experiential curriculum was managed through the Office of Experiential Learning. Faculty and staff in this office included the director (a faculty member who also serves as the APPE course coordinator) and a professional staff member who served as the administrative coordinator for the APPEs. Additional personnel included the clinical director of our introductory pharmacy practice experience (IPPE) program (faculty member) and 2 professional staff members who served as administrative coordinators for the IPPE program. The school implemented a comprehensive course-review method in 2006 that followed the DACUM (Developing a Curriculum)6 process. Each course or course sequence was reviewed soon after the course was taught the first time and then again every 3 to 4 years, or more frequently if major changes to course content occurred or if the previous review had resulted in such recommendations. The process encompassed 5 phases (pre-review, team development, data collection, assessment, and reporting), with the majority of the time spent on the data collection, assessment, and reporting phases, which included review and approval of the course assessment by the school's curriculum committee (Figure 1). The pre-review phase involved the curriculum committee's decision on review scope and deadlines as well as identification of the course-review facilitators. Most of the didactic courses within the school's curriculum had been reviewed using this established process, but as of 2008, no plans had been made to review the APPE sequence. The APPE review was going to be more challenging than other course reviews as the sequence consisted of multiple individual courses taught in various practice sites by multiple instructors/preceptors. However, standardization of the course review process was felt to be important in order to ensure a systematic evaluation of all courses. 
The curriculum committee charged the standing experiential subcommittee in fall 2008 to oversee the review of the APPE sequence using the same format as all other course reviews.

Figure 1.

APPE Course Review Process

Members of the experiential subcommittee began planning the review in December 2008. Members of this committee were appointed by the dean annually and consisted of faculty members, affiliate faculty members (preceptors), professional staff members, and student pharmacists. Faculty members (full-time and affiliate) were directly involved in the APPE courses as preceptors and therefore were expected to have an understanding of course content and instruction methods. One professional staff member was directly involved in coordination of the APPE courses. Most full-time faculty members as well as the professional staff member had been involved in prior course reviews at the school and were familiar with the process. During the planning stages, the experiential subcommittee was in communication with the curriculum committee to ensure that accommodations were made when necessary but that the same standardized format was followed.

Prior to the formal review sessions, the documents and information listed in Table 1 were collected and provided to the reviewers in an Open BlackBoard (Blackboard Vista, Version 8.0.4, Blackboard Inc., Washington, DC) course developed by the curriculum committee. The Office of Experiential Learning director and APPE administrative coordinator were responsible for gathering most of the materials for the review. Workload related to this task was the responsibility of the director, as course coordinator of the APPEs.

Table 1.

Requested Documents for Each Course Review

• Course coordinator's instructional philosophy
• Course syllabi
• Course development documenta
• Learning evaluation blueprintsa
• Results of learning evaluations (statistics and grade distribution report)
• Teaching journal (log or periodic reflections of things to improve upon)a
• Lecture/facilitation notesa
• Learning and teaching resources (orientation materials, multimedia, etc.) specific for APPE
• Handouts, problem sets, case examples (examples of patient presentations, journal clubs, in-services, data collection forms) specific for APPE
• Evaluations of learning
• Course assessments (student feedback evaluations)
• Other available and useful information (student survey of APPE sequence)
a Not applicable or not available for the course review

A checklist was developed based on the university-approved syllabus template that outlined expected components in all course syllabi and was used by committee members during the data collection phase (Table 2). All preceptors were asked to submit an updated version of their syllabus to the experiential office prior to the course review. The initial plan was to review all available syllabi. However, after reviewing 27 syllabi submitted voluntarily by preceptors using this checklist, the committee members determined there was a recurrence of similar issues and, therefore, did not feel review of additional syllabi was necessary.

Table 2.

Syllabus Checklist

• Course number and title
• Credit hours
• Prerequisite
• Course description
• Course coordinator contact information
• Preceptor contact information/site name
• APPE specific objectives/activities
• Recommended texts
• Links to online resources
• Ability-based outcomes and rotation objectives
• Course requirements/calculation of grade
○ % Pharmaceutical Care Ability Profile (at least 60%)
○ % presentations
○ % other projects
• Grading scale
• Policy statements
○ Attendance
○ Conduct and academic dishonesty
○ Documentation of student interventions
○ Grievances
○ Special Needs
• Link to Office of Experiential Learning Web site for APPE policies
• Information access

Abbreviations: APPE = advanced pharmacy practice experience.

Student pharmacist performance was assessed using a standardized evaluation form for all APPEs. The evaluation form was designed to assess student performance in 9 specific areas/domains to ensure competency in 8 ability-based curricular outcomes. These domains and outcomes are listed in Table 3. The Office of Experiential Learning provided a copy of the annual grade distribution report, which included grades by APPE type, region, and faculty appointment. The standardized evaluation tool also was available for review and discussion. Learning and teaching resources that were applicable to the APPE course sequence in general (ie, orientation materials, availability of other resources, such as presentation examples and data-collection sheets) were provided by the course coordinator.

Table 3.

Standardized APPE Evaluation Form Domains and Curricular Outcomes

Domains Curricular Outcomes
Patient Assessment 1
Drug Therapy Assessment 1,3
Develop, Implement, and Monitor Drug Therapy Plans 2,3
Communication Abilities 1,2
Critical Thinking and Problem-Solving Skills 1,2
Management/Organizational Abilities 5,6
Self-Learning Abilities 4,6
Professional Ethics and Identity 6,8
Social Interaction, Citizenship, and Leadership 6,7

Abbreviations: APPE = advanced pharmacy practice experience.

Curricular Outcomes: (1) Evaluate pharmacotherapy; (2) Provide appropriate pharmacotherapy interventions; (3) Ensure appropriate drug distribution; (4) Maintain and enhance competence through self-initiated learning; (5) Manage the pharmacy within the organization's business plan; (6) Develop practice and leadership; (7) Participate in public health and professional initiatives and policies; (8) Advance the profession.

Formal student feedback of overall preceptor teaching ability obtained from standard APPE course evaluations was provided for the review. Additional student feedback was obtained prior to the APPE course review through a 22-item survey instrument developed based on important concepts and requirements outlined in the American College of Clinical Pharmacy (ACCP) White Paper7 and Position Statement8 on experiential education. The clarity of questions included in the survey instrument was initially assessed by a small sample of fourth-year student pharmacists. The survey instrument was then administered to student pharmacists in the final month of their fourth year (April 2009), just prior to the course review. The student survey instrument addressed frequency of specific APPE activities, exposure to patient populations and disease states, elective opportunities, access to patient medical records and drug information resources, orientation to the practice site's scope of practice, preceptor/pharmacist supervision, concept of “colleague in training,” extent of challenge provided by each type of APPE, level of student self-directedness, extent of constructive feedback from preceptors, interaction with other healthcare disciplines, preparation for independent and collaborative practice, and extent of participation in professional writing assignments in APPEs other than the drug information APPE. (This survey instrument is available upon request from the corresponding author.)

The formal review of the APPE course sequence was then conducted during two 4-hour sessions in the summer of 2009. To review the 8-course sequence as efficiently and effectively as possible, the first session focused on APPEs in acute care medicine, drug information, and electives, and the second session focused on primary care, community pharmacy practice, and health system practice experiences. The APPE course review was a collaborative process that was open to all stakeholders (course coordinator, full-time and affiliate faculty, professional staff, student pharmacists, and school administration). To ensure representation of affiliate faculty members, the course coordinator requested attendance of select affiliate preceptors. Although some bias may have been introduced with this method of selection, these preceptors were chosen based on their prior experience and contributions to the experiential program. Video conferencing technology and teleconferencing were used to include off-site participants in the course review.

At the beginning of each review session, the course coordinator presented information and data to the review participants (20-30 minutes). Information presented included an overview of the course sequence, course and APPE assignment procedures, and information collected by the committee as previously outlined (Table 1). The coordinator also provided a personal assessment/reflection of the course sequence, outlining potential areas for improvement as well as recent initiatives developed to improve various aspects of the APPE sequence. The reviewers then asked questions of the course coordinator. To ensure open discussion, the course coordinator was excused for the remainder of each session once all questions from the reviewers had been answered.

Two experiential subcommittee members were appointed facilitators for the review process. The facilitators were responsible for maintaining the flow of the formal review and documenting feedback from the attendees. During this time, the reviewers assessed the course sequences using the school-approved course review worksheet. This worksheet, which was used to guide the dialogue and identify strengths and areas for improvement, consisted of 26 criteria in 3 primary areas: (1) teaching and learning, (2) assessment and evaluation, and (3) content (Table 4). Suggestions for improvement regarding any criterion were rated as minor (course instructors decide whether to implement suggestions), moderate (changes cited must be made before the next course/sequence review), or major (changes cited must be made before the next offering of the course/sequence). (Since this review was conducted, the ratings have been changed to suggested, moderate, and critical; however, the same definitions apply.) When determining the rating of each criterion, the reviewers considered both the significance of the deficiency and the logistics of implementing a change.

Table 4.

Course Review Criteria (Worksheet Components)

Teaching and Learning
• Emphasis on thinking
• Challenging
• Hold student accountable
• Coordinates with other faculty
• Preassessment
• Helpful orientation
• Developmental Teaching
• Transparency
Assessment and Evaluation of Learning
• Congruent
• Correct Educational Consequences
• Items
• Quality Testing
• Authentic
• Developmental
Content
• Ensuring Professionalization
• Ensuring Understanding
• Distinctive Pharmacy Expertise
• Realistic
• Explicit
• Motivating
• Integrated
• Appropriate Scope
• Scientific
• Valuable
• Appropriate Level
• Interesting

After the review session, the facilitator(s) completed the course review worksheet and provided a summary of the course review findings to the experiential subcommittee in a subsequent meeting. Once reviewed and approved, the written report was presented to the curriculum committee for approval and then to the course coordinator and entire faculty. This project was exempted by the Auburn University Institutional Review Board.

EVALUATION AND ASSESSMENT

This process was the first formal review of the APPE sequence of the curriculum implemented in 2005. The review included APPE data from May 2008 to April 2009. During this time, 113 student pharmacists completed 900 APPEs at 113 practice sites with 176 preceptors. The preceptors included 33 full-time faculty members and 143 affiliate faculty members. Full-time faculty members taught 37% of all APPEs during this time.

Course grades demonstrated a high level of performance, with 86% of grades being A (89.5-100). The overall grade average during this time was 93.2 (median 93.5). Community pharmacy practice and elective APPEs had the highest percentage of A grades (97%), while acute care medicine APPEs had the lowest (69%).

Course evaluations by students included 874 assessments of the preceptors and experiences, covering 97% of all APPEs completed during the 2008-2009 academic year. Ratings of preceptors’ overall teaching ability were exceptional (47.6%), very good (33.5%), good (13.8%), adequate (4.0%) and poor (1.1%).

Student responses to the survey administered just prior to the course review were positive (defined as >50% of students responding “to a great extent”) regarding their exposure to multiple patient populations and multiple disease states while on APPEs, as well as opportunities for self-directed learning and practicing collaboratively with other providers. Students also responded positively regarding access to medical records and drug information resources while on APPEs, and alternate supervision provided in the absence of their primary preceptor. Areas identified as needing further evaluation (defined as <50% of students responding “to a great extent”) included students’ perception of available electives to fit their personal interests/career plans, clearly defined objectives for elective APPEs, provision of adequate constructive feedback to students, treatment of students as a “colleague in training,” APPEs preparing students for independent practice, and level of supervision provided by preceptors based on student skill level. Student responses regarding the frequency of their involvement in select activities varied based on APPE type. The majority of students reported “daily” involvement in response to queries about activities during their primary care APPEs. There were similar findings for medicine APPEs, with the exception of provision of disease-state education to patients. Student responses regarding frequency of activities during their community-pharmacy practice and health-system practice APPEs were highly variable, with a majority of students reporting “seldom” or “never” for some queries. Students were asked to rate the extent to which they felt challenged during each APPE type. 
The majority reported they were challenged “to a great extent” during their medicine (72.5%) and primary care (73.9%) APPEs, whereas only 16% to 17% of students responded that they felt challenged “to a great extent” during their health-system and community-pharmacy practice APPEs.

Using the checklist developed by the committee, 27 (15%) syllabi voluntarily provided by preceptors were reviewed prior to the course review. The committee identified several issues with syllabi during this review, which are outlined in Table 5.

Table 5.

Issues Identified During Review of Syllabi (n=27, 15%)

• Missing APPE site/preceptor contact information
• Lack of APPE-specific objectives/activities
• Listing of preceptor as course coordinator rather than the university-recognized course coordinator (Director of Experiential Learning)
• Failing to use the approved template or using template from previous curriculum
• Standardized evaluation form percentage accounting for <60% of final grade
• Final grade calculation unclear
• Not including school's ability-based outcomes in the syllabus
• Omission of policy statements (ie, grievances, disabilities)

Based on data presented and feedback from students and the course review participants, the reviewers identified several strengths of the APPE sequence. These included preceptor experience and education, course sequence organization, course materials (syllabi templates, resources on the Office of Experiential Learning Web site), course assessments and evaluations (APPE evaluation form, presentation evaluation forms, electronic submission of evaluations, annual grade report), and overall student perceptions.

Several general areas for improvement also were identified. These included continued efforts to identify quality training sites and preceptors, identification of additional unique APPE electives, consistent use of syllabi templates by all preceptors, quality assurance of training sites and preceptors, appropriate use of the approved standardized evaluation form, improved communication and sharing of information between the Office of Experiential Learning and preceptors, and obtaining updated APPE descriptions for all training sites. Twenty-six criteria were assessed using the school-approved course review worksheet. One major finding was the need to review the availability of the drug information APPE to determine its sustainability as a required APPE. Specific areas for improvement in the APPE sequence that were identified and rated as minor included:

  1. Standardization of student practice activities for drug information, community-pharmacy and health-system practice APPEs to improve consistency among APPEs taught by different preceptors;

  2. Inclusion of APPE-specific objectives and clear explanation of final-grade determination in each APPE syllabus, and preceptor use of the standardized evaluation form to ensure adequate feedback;

  3. Preceptor development focusing on student accountability, use of student self-assessments/professional reflections, and importance of frequent feedback to students;

  4. Continued efforts to ensure preceptor engagement in training and development, dissemination of student feedback from regional meetings to all preceptors, sharing of information and resources between preceptors, and considering APPE-specific liaisons for preceptor-to-preceptor collaboration;

  5. Pre-assessment of students at the start of each APPE;

  6. Adequate and timely orientation to APPE sequence and to each APPE;

  7. Review of the standardized APPE evaluation form for improvements and/or simplification to ensure appropriate use by all preceptors and students as well as education on appropriate use of the form;

  8. Promotion of direct patient-care APPE activities whenever possible;

  9. Identification of training sites with integrative services and emphasis on interprofessional education;

  10. Clear communication of elective APPE opportunities to students, promotion of student involvement in professional organizations, and encouragement of clinical elective APPEs for students interested in residency training; and

  11. Development of APPE-specific activities that are valuable learning experiences for students, and consideration of a list of core topics the students should be exposed to through direct patient-care experiences or discussions/case-based reviews.

DISCUSSION

Continuous quality-improvement initiatives are an important component of curricular management. A course review process has been established at our institution and is being used to review and assess all courses within the curriculum on a cycle of every 3 to 4 years. A review of the literature provided limited information regarding a systematic process for institutions to undertake for course reviews. No published information regarding reviews of an APPE course sequence within pharmacy education was identified.

Based on the data presented and discussions by the reviewers during the course review, the following conclusions were made: (1) overall student performance based on grades was high, (2) overall student feedback based on APPE/preceptor evaluations and the student survey was positive, and (3) the APPE course sequence has several strengths as well as areas for improvement.

Conducting a course review for the APPE sequence posed many challenges. Systematically reviewing 8 individual courses consisting of multiple APPE types taught by multiple preceptors in multiple practice settings posed logistical challenges not encountered when reviewing a single course. Ideally, each individual APPE would undergo its own systematic course review; however, with 176 different preceptors offering their own experiences, the task would be prohibitively time-consuming. The Office of Experiential Learning conducts site visits on a regular basis as part of the quality assurance process for the experiential program using a standardized form developed by the Southeastern Pharmacy Experiential Education Consortium. These individual site visits involve collecting important information about the APPE regarding the site and preceptor's ability to adhere to the school's expectations, including ACPE standards. The course review process adopted by our school called for a review of the entire APPE course sequence, which the experiential subcommittee determined could be most efficiently conducted as outlined in this manuscript.

Several strengths of this course review process include its openness to all stakeholders (students, preceptors [affiliate and full-time faculty members], other faculty members, professional staff members, and administration), assessment of student experiences by means of required APPE evaluations and a survey instrument developed specifically for the APPE course review, assessment of grades received by students for each APPE type, and open discussion of each APPE type within the APPE sequence. The process included a self-assessment/reflection by the course coordinator and a collaborative peer review.

Limitations of this course review process included the challenge of assessing individual practice experiences (courses) in the context of this type of course review. This course review process took a significant amount of time. The data collection phase took the director/course coordinator and APPE administrative coordinator approximately 4 to 6 hours to complete. The actual course reviews were conducted during two 4-hour sessions for those attending the reviews. An additional 1 to 2 hours were required for the facilitators to complete the course review worksheet and summary, which was submitted to the curriculum committee.

Student performance was evaluated globally using APPE-specific grades, which are derived from individual domains defined in the standardized evaluation form. Although these domains relate to specific ability-based outcomes, student performance related to each outcome was not assessed in this review. Future reviews should consider methods for evaluating performance of each domain and/or ability-based outcome.

The method of syllabi selection for review may have biased the results, as only a sample of the syllabi submitted to the coordinator was evaluated. Preceptors who submitted their syllabi may have been more likely to comply with suggested templates than those who did not. Additional areas for improvement might have been identified if all syllabi had been evaluated.

While the student survey instrument was developed to address criteria provided as recommendations in the ACCP White Paper7 and Position Statement8 on experiential education, there are obvious limitations. Although the survey instrument was tested in a small number of fourth-year students for overall clarity of questions, there is a potential for varying interpretation of questions by individual students. Although the survey instrument was not validated, the intent was to use the responses to identify trends or outliers that may indicate strengths and/or weaknesses of the APPE sequence or individual components.

The course review worksheet was developed to incorporate as much objective assessment as possible with defined ratings of minor, moderate, and major. Even with these defined ratings, some subjectivity in the reviewers' assessment of each criterion still occurred.

A possible improvement in the review process might be to review a course sequence such as APPEs by individual course or experience type (ie, primary care/ambulatory care vs acute care). This process would allow for more direct feedback to individual preceptors regarding strengths and areas for improvement within their specific practice experience. Future assessments should include an evaluation of the changes implemented as a result of this course review. The entire APPE sequence is scheduled for review again in 2012.

Many of the strengths and limitations noted with the APPE reviews have also been found with other course reviews conducted at the school. Reviewing courses as sequences has proven to be both a strength and a limitation. The time required to conduct thorough reviews, especially of course sequences, is a limitation and results in many reviews being conducted during the summer, when stakeholders other than the key participants may not be available. With many reviews, it has been challenging to gather all of the data requested as part of the standardized course review process. An open Blackboard site is used as a centralized place to store materials for the review committees and will provide a record for future reviews.

After reviewing numerous courses over the past 4 years, the school's curriculum committee believes that the process of reviewing all courses in the curriculum is increasing communication among faculty members, both within and between departments. Reviews involving all courses and key stakeholders (faculty members, administrators, preceptors, staff members, and student pharmacists) have not only opened dialogue about individual courses but also appear to be changing the culture, with the faculty as a whole taking responsibility for the entire curriculum rather than for individual courses.

SUMMARY

Quality assessment through course reviews of pharmacy curricula is an important process for the continual improvement of pharmacy education. The course review process used by Auburn University Harrison School of Pharmacy aligns well with ACPE Guideline 15.2, which calls for "a system of evaluation of curricular effectiveness," and with Guideline 14.6, the "quality assurance procedure."1 Course reviews can be challenging and time consuming, especially for course sequences such as the APPEs, but they offer schools useful information from all stakeholders to guide quality improvement.

REFERENCES

  • 1. Accreditation Council for Pharmacy Education. Accreditation standards and guidelines for the professional program in pharmacy leading to the doctor of pharmacy degree. Chicago, IL: Accreditation Council for Pharmacy Education; 2007. http://www.acpe-accredit.org/pdf/ACPE_Revised_PharmD_Standards_Adopted_Jan152006.pdf. Accessed May 27, 2011.
  • 2. Skelton J, West KP, Zeff T. Phase 1 of a comprehensive course review process: program innovation. J Dental Educ. 2002;66(3):405–413.
  • 3. Mennin S, Krackov S. Reflections on relevance, resistance, and reform in medical education. Acad Med. 1998;73(9 Suppl):S60–S64. doi: 10.1097/00001888-199809001-00011.
  • 4. Barham I, Prosser M. Review and redesign: beyond course evaluation. Higher Educ. 1985;14(3):297–306.
  • 5. Britton M, Letassy N, Medina MS, Er N. A curricular review and mapping process supported by an electronic database system. Am J Pharm Educ. 2008;72(5):Article 99. doi: 10.5688/aj720599.
  • 6. Norton RE. DACUM Handbook. 2nd ed. Leadership Training Series No. 67. Columbus, OH: The Ohio State University; 1997. http://www.eric.ed.gov/PDFS/ED401483.pdf. Accessed July 6, 2011.
  • 7. Haase KK, Smythe MA, Orlando PL, Resman-Targoff BH, Smith LS. ACCP white paper: quality experiential education. Pharmacotherapy. 2008;28(10):219e–227e.
  • 8. Haase KK, Smythe MA, Orlando PL, Resman-Targoff BH, Smith LS. ACCP position statement: ensuring quality experiential education. Pharmacotherapy. 2008;28(12):1548–1551. doi: 10.1592/phco.28.12.1548.