American Journal of Pharmaceutical Education. 2007 Apr 15;71(2):20. doi: 10.5688/aj710220

Curriculum Mapping in Program Assessment and Evaluation

Cecilia M Plaza,* JoLaine Reierson Draugalis, Marion K Slack, Grant H Skrepnek, Karen Ann Sauer
PMCID: PMC1858603  PMID: 17533429

Abstract

Objectives

To demonstrate a curriculum mapping technique and its use in program evaluation and assessment, as well as to provide specific recommendations for potential uses in pharmacy education.

Methods

This study employed a descriptive cross-sectional study design based on a learning outcomes document and several existing student and curricular data sets.

Results

The population consisted of 209 PharmD students at the University of Arizona College of Pharmacy (UACOP) during the 2004-2005 academic year; 31 of the 34 required didactic courses in the curriculum were mapped. Student and faculty rankings of domain coverage were concordant across their respective curricular maps.

Conclusions

The agreement between the student and faculty graphical curriculum maps on the ranked order of relative emphasis of each domain suggests concordance between the intended/delivered and received curricula. This study demonstrated a curriculum mapping methodology that can be used to make sense of, and use, existing data in curricular evaluation.

Keywords: curriculum evaluation, curriculum mapping, assessment, curriculum

INTRODUCTION

In the American Association of Colleges of Pharmacy's (AACP's) commissioned Excellence Paper on curriculum development and assessment, Abate and colleagues recommended the use of curriculum mapping to demonstrate and explore the links between content and learning outcomes.1 In addition to numerous recommendations for the Academy, this Excellence Paper suggested the use of student learning outcomes as an integral component of curriculum development, serving as the basis for formative and summative assessment.1

Beyond taking an inventory of where data are already being collected in order to make use of existing points of contact with students, an important assessment consideration is the examination of the “designed curriculum,” the “delivered curriculum,” and the “experienced curriculum.”2,3 The designed or intended curriculum comprises the institutional and program requirements; the delivered or enacted curriculum is the instructional delivery; and the experienced or learned curriculum is what the students actually experience.3,4 Potential data sources for performance indicators to examine the curriculum include catalog or syllabus review, self-reports from faculty members and students, and assessments and examples of student work, such as portfolios.3

Curriculum mapping is a consideration of when, how, and what is taught, as well as of the assessment measures used to demonstrate achievement of expected student learning outcomes.5 Harden detailed the 2 main functions of curriculum maps in medical education: to make the curriculum more transparent to all stakeholders and to demonstrate the links between the various components of the curriculum. Curriculum maps can help in 3 primary ways: (1) identifying whether the intended material is actually being taught and what students actually learn; (2) demonstrating the links among the key components of the curriculum: learning outcomes, learning opportunities, content, and assessment; and (3) examining specific portions of the curriculum, such as learning location, learning resources, and timetables, in addition to examining the curriculum from multiple perspectives.5

Curriculum maps in pharmacy education have been used to identify potential deficiencies in the curriculum, aid in planning assessment activities, and develop different models to guide the assessment process for both ACPE accreditation and regional accreditation requirements.6,7 Mort and colleagues demonstrated an approach to examining the intended curriculum from the faculty perspective based on an outcomes document.8,9

Several curriculum mapping studies have been reported in the medical literature describing initiatives at both US and foreign medical schools to examine a specific component of the medical education curriculum. A curriculum map was used to explore to what extent, if any, medical students were exposed to disease prevention and health promotion in the first 3 years of a medical school curriculum.10 Competencies in disease prevention and health promotion served as the guide for assessing the scope of coverage of the content of interest in the curriculum map. From the map, the investigators determined that the domains of clinical prevention and quantitative skills were well represented across all 3 years, while the community aspect of practice was the least represented domain.

Citing the lack of formal palliative care education in undergraduate medical school curricula in the United States despite the identification of core competencies in the area, Meekin et al developed an instrument to facilitate curriculum mapping of this often “hidden” topic.11 A follow-up study used this instrument to develop strategic plans at 13 of the 14 participating medical schools in New York with the intent of increasing coverage of palliative care within each respective curriculum.12 Curriculum mapping combined with strategic planning increased the palliative care content in medical school curricula in New York state.12 These studies showed that a curriculum mapping tool could be used both as part of curricular self-assessment and to incorporate a desired topic into a curriculum by guiding strategic planning.11,12 Another study used curriculum mapping to examine a specific component, in this case cultural competency in a medical school curriculum, applying a triangulation method to examine the intended, taught, and received curriculum, to provide a transparent and complete picture of the curriculum, and to show linkages.10,13 The intended curriculum was examined using the medical faculty handbook, which contained the learning objectives for each course. The taught curriculum was examined through interviews with instructors responsible for entire courses or themed areas of study. The received curriculum was explored through focus group interviews with students. Mapping consisted mainly of listing specific instances where cultural competency had been included in the curriculum and qualitatively comparing the relative number of occurrences. Cultural competency was found in the curriculum; however, it was mostly “hidden,” and teachers reported a greater extent of coverage than did students. The triangulation method of mapping led to a better understanding of the current curriculum and informed change for the future curriculum.

Porter described a method of measuring the content of instruction, instructional materials, and the alignment between them, focusing on the delivered or enacted curriculum.4,14 He suggested that good measures of instructional content could be used to describe the taught curriculum or to measure the degree to which a new curriculum has been implemented.14 Porter described 3 tools for measuring content and alignment: (1) teacher surveys on instructional content, (2) content analyses of instructional materials, and (3) alignment indices describing the degree of overlap between content and standards or assessment.14 To measure content and alignment, a uniform language for describing content must be developed and used as the basis of the alignment measure.4,14 Such a language can include measures of the level of coverage and of relative emphasis, expressed as the share of total instructional time spent on each topic or category, arranged to form a matrix. Reflecting primary and secondary education perspectives, Porter suggested that content analysis of instructional materials such as textbooks could be used to examine the intended curriculum as a means of improving teaching.14 In professional health sciences curricula, where textbooks are less heavily used due to the dynamic nature of the healthcare disciplines, other measures of the intended curriculum, such as course syllabi, could be used instead. Alignment indices compare the proportions in each respective matrix to determine the degree to which they overlap, or align. Porter also recommended the use of topographical maps to display what content is emphasized relative to the standards used in the curriculum.14 Emphasis is depicted by the intensity of shading, with darker shading indicating greater emphasis and lighter shading indicating less emphasis. This graphical representation of a curriculum yields a map that can be visually compared to other measures of content coverage. While Porter did not use student measures of content coverage, such measures and an associated index of alignment could conceivably be constructed and mapped to determine the degree of alignment between the intended/taught curriculum and the received curriculum.
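
To make the alignment-index idea concrete, the following minimal Python sketch computes an index of the form Porter described: 1 minus half the sum of the absolute differences between cell proportions in two content matrices. The domain labels and proportions are hypothetical and are not drawn from this study.

```python
# A minimal sketch of an alignment index over two content matrices that
# have already been reduced to proportions. All values are hypothetical.

def alignment_index(p, q):
    """1 - (sum of absolute differences)/2. p and q map the same content
    cells to proportions that each sum to 1.0; returns 1.0 for identical
    emphasis and 0.0 for completely disjoint emphasis."""
    return 1.0 - sum(abs(p[cell] - q[cell]) for cell in p) / 2.0

# Hypothetical proportions of instructional emphasis by domain.
intended = {"domain1": 0.30, "domain2": 0.25, "domain3": 0.05,
            "domain4": 0.25, "domain5": 0.15}
received = {"domain1": 0.35, "domain2": 0.20, "domain3": 0.05,
            "domain4": 0.30, "domain5": 0.10}

print(f"alignment = {alignment_index(intended, received):.2f}")  # 0.90
```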

The purpose of this study was to demonstrate the use of curriculum mapping in program evaluation and assessment as well as to provide specific recommendations for potential uses in pharmacy education.

METHODS

This study employed a descriptive cross-sectional study design based on several existing student and curricular data sets. The population consisted of 209 PharmD students at the University of Arizona College of Pharmacy during the 2004-2005 academic year. The Human Subjects Protection Program declared this project exempt.

An inventory was taken to determine where data were already being collected, in order to make use of existing points of contact with students.2 As part of the reflective portfolio process, students completed a report for each domain, along with an associated item on the extent of domain coverage during that academic year, using a Microsoft Word document form. Student data were entered and de-identified by an independent student worker in the summer and early fall semester of 2005 to form the anonymous database. The various data sources used in the demonstration of this methodology are detailed in Table 1.

Table 1.

Data Sources Used in Curriculum Mapping


The curriculum mapping focused on elucidating the intended, delivered, and received curriculum based on the Outcomes Expected document. The “Outcomes Expected of Graduates of the Doctor of Pharmacy Program,” or Outcomes Expected document, was developed at The University of Arizona College of Pharmacy by the Evaluation and Special Study Committee. The development of the document, which integrated a competency-based learning outcomes framework into the curriculum at the College, is reported elsewhere in the literature.15 With the AACP's Center for the Advancement of Pharmaceutical Education (CAPE) Educational Outcomes document serving as the overarching guide, the Outcomes Expected document consisted of 5 domains with 18 associated competency statements.15,16 The 5 domains were: domain 1: patient care, ensuring appropriate therapy and outcomes; domain 2: dispensing medications and devices; domain 3: health promotion and disease prevention; domain 4: professionalism; and domain 5: health systems management. The Outcomes Expected document was operationalized as the overarching curricular framework through a reflective portfolio required of all pharmacy students at the University of Arizona College of Pharmacy and served as the primary consideration in the curriculum evaluation. Only the didactic portion was remapped because the original map from the 2001-2002 academic year covered the first 3 years of the curriculum, the portion shared by all students. Given the retrospective nature of data collection from existing databases from the 2004-2005 academic year, it was not possible to sufficiently differentiate between the intended and the delivered curriculum; thus, the curriculum was remapped from 2 perspectives rather than 3.

A review of syllabi from the 2004-2005 academic year served as a measure of the intended curriculum and as a proxy for the delivered curriculum, as further described in Table 1. The intended/delivered curriculum was compared with the received curriculum by examining student ratings of domain coverage in each year of the didactic portion of the curriculum against the curriculum map constructed from course syllabi.

Given changes in the curricular sequence at the University of Arizona College of Pharmacy since the original curriculum map was constructed, the principal investigator repeated the curriculum mapping using a portion of the original instrument and the course syllabi from the 2004-2005 academic year, as described in Table 1. Because 2 associated discussion courses were indicated by their respective course coordinators to be extensions of their lecture courses, and therefore were not described in separate syllabi, they were not mapped separately. For the basic science courses, the existing curriculum map was used to supplement the syllabi: because the basic science competencies are embedded and implied in the Outcomes Expected document, they were not explicitly stated in the syllabi of those courses. Elective courses were not mapped because they were not common to all students. Required courses that were not housed in the College or taught by College faculty, such as anatomy and biochemistry, were not mapped because the instructors of those courses were under no obligation to use the Outcomes Expected document, which was specific to pharmacy. The fourth-professional year (P4), the advanced experiential portion of the curriculum, was not mapped given the diversity of the rotations offered. The final curriculum map reported the number of courses covering each domain and associated competency, as reported by the course coordinators. Teaching or not teaching to a competency served as a proxy for the extent of competency coverage from the faculty perspective. A total of 31 courses were included in the faculty curricular map.

The curriculum was mapped using an adapted version of Porter's topographical maps to graphically display which competencies and associated domains were emphasized relative to each other.14 Intensity of shading was used to indicate competency or domain emphasis: the darker the shading, the greater the emphasis in the curriculum, based on content matrices that contained, in this case, the proportion of coverage for each competency or domain in each respective year of the didactic curriculum. Unlike Porter, who used “smoothing” to depict the intersections between topics and cognitive demand much like a real topographical map, the maps used in this study were left as grids because less precise measures were used to obtain information at the competency and course level.14 Teaching to a competency served as a proxy measure of the extent of domain coverage within the curriculum.
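
As a rough illustration of this shading scheme (not a reproduction of the study's figures), the Python sketch below renders a hypothetical proportion-of-coverage matrix as an unsmoothed grid, with darker cells indicating greater emphasis. All values and labels are invented for the example.

```python
# A minimal sketch of a grid-style (unsmoothed) curriculum map: each cell
# holds a proportion of coverage (0 to 1), and darker shading indicates
# greater emphasis. The matrix values are hypothetical.
import numpy as np
import matplotlib.pyplot as plt

# Rows: years P1-P3; columns: domains 1-5 (hypothetical proportions).
coverage = np.array([[0.60, 0.45, 0.10, 0.55, 0.20],
                     [0.55, 0.40, 0.15, 0.50, 0.35],
                     [0.65, 0.35, 0.10, 0.60, 0.40]])

# "gray_r" maps 0 -> white (no coverage) and 1 -> black (full coverage);
# leaving the image as a grid (no smoothing) matches the approach here.
plt.imshow(coverage, cmap="gray_r", vmin=0, vmax=1)
plt.xticks(range(5), [f"Domain {d}" for d in range(1, 6)])
plt.yticks(range(3), ["P1", "P2", "P3"])
plt.colorbar(label="Proportion of coverage")
plt.show()
```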

The intended/delivered curriculum was first graphically mapped by course and competency based on faculty syllabi. The dichotomous data were rendered in black (complete shading) to indicate that a course taught to the competency in question or in white (the absence of shading) to indicate that it did not. Courses were listed in the order in which they occurred in the curriculum to allow a visual depiction of the progression of the curriculum.

The intended/delivered curriculum was then mapped at the course and domain level by collapsing the competencies into their respective domains, summing the number of competencies taught within each domain for each course. The more competencies taught within a given domain, the greater the intensity of shading on the graphical map, reflecting the relatively greater emphasis on that domain. The number of competencies taught within a given domain was converted to a proportion so that domains with differing numbers of total competencies could be mapped using the same shading intensity ranges (eg, teaching 4 of the 4 total competencies in domain 1 would be shaded the same as teaching 3 of the 3 total competencies in domain 2).
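
A minimal sketch of this normalization step follows. Domain 1's 4 competencies come from the text; the other domain sizes and the per-course counts are hypothetical (chosen only so the sizes sum to the stated 18 competencies).

```python
# Normalize per-course competency counts by domain size so that domains
# of different sizes share one 0-1 shading range. Domain 1's size (4) is
# from the text; the other sizes and all counts are hypothetical.
domain_totals = {1: 4, 2: 3, 3: 4, 4: 4, 5: 3}   # competencies per domain
taught_counts = {1: 4, 2: 3, 3: 0, 4: 2, 5: 1}   # taught in one course

proportions = {d: taught_counts[d] / domain_totals[d] for d in domain_totals}
# 4/4 in domain 1 and 3/3 in domain 2 both map to 1.0 (darkest shading)
print(proportions)
```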

To map the intended/delivered curriculum at the year and domain level, the courses in a given year were further collapsed. Complete domain coverage, the maximum emphasis of a domain in a given year, would theoretically occur if each course in that year included each competency within the domain. Each course in a given academic year contributed toward domain coverage through the number of competencies covered in the course. For example, the first-professional year (P1) had 12 courses, so for domain 1, which had 4 competencies, total possible domain coverage would be 48 affirmative responses to teaching each competency within that domain across all courses in P1. To obtain the proportion of domain coverage for a given year, the total number of affirmative responses to teaching each competency within a given domain was divided by the total possible domain coverage. This measure reflected the extent of domain coverage for each respective year of the didactic curriculum. Collapsing the courses to the year level was necessary to produce a graphical map in the same metric as the student data, described in more detail below.
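
The following minimal sketch reproduces this collapsing arithmetic for the worked P1/domain 1 example (12 courses, 4 competencies, 48 possible affirmative responses). The yes/no matrix itself is randomly generated and hypothetical.

```python
# Collapse a dichotomous course-by-competency matrix to a year-level
# proportion of domain coverage. Dimensions match the P1/domain 1
# example in the text; the 0/1 entries are hypothetical.
import numpy as np

n_courses, n_competencies = 12, 4                  # P1 courses, domain 1
rng = np.random.default_rng(0)
taught = rng.integers(0, 2, size=(n_courses, n_competencies))  # 1 = taught

total_possible = n_courses * n_competencies        # 48 affirmative responses
proportion = taught.sum() / total_possible         # extent of coverage for P1
print(f"domain 1 coverage in P1: {proportion:.2f}")
```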

As part of the reflective portfolio, students were asked to rate the extent to which the curriculum focused, for the given year, on each respective domain in the Outcomes Expected document, as described in Table 1. To graphically map the student-reported extent of domain coverage, a mean was calculated for each domain for each year of the didactic curriculum; the resulting map was at the domain and year level. Each mean was converted to a proportion by dividing it by the maximum value the mean could assume, which was 3, so that the student data could be graphically mapped using the same shading intensity ranges as the faculty graphical map.
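
A brief sketch of that rescaling, using hypothetical ratings on the 0-3 scale described in Table 2:

```python
# Rescale a mean student rating (0-3 scale) to a 0-1 proportion so it
# shares the shading metric of the syllabus-based map. Ratings below
# are hypothetical.
ratings = [3, 2, 3, 2, 3, 1]              # one domain, one year
mean_rating = sum(ratings) / len(ratings)
proportion = mean_rating / 3.0            # maximum possible mean is 3
print(f"mean {mean_rating:.2f} -> proportion {proportion:.2f}")
```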

An index of alignment, as suggested by Porter, was not calculated because the data in this study were not collected using the uniform language required for such a measure of alignment.14 Qualitative comparisons of the graphical maps were used instead.

For each domain, a one-way ANOVA was conducted to determine if there were differences in the mean student-reported extent of domain coverage between years in the curriculum. A Bonferroni post-hoc test was used for significant F-tests. The a priori level of significance was 0.05.
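
As a sketch of this analysis (not the study's actual data or software), the following Python example runs a one-way ANOVA on hypothetical domain ratings for P1-P3, with Bonferroni-adjusted pairwise t tests as the post hoc step.

```python
# A minimal sketch of the analysis described above, using SciPy: one-way
# ANOVA on student-reported coverage for one domain across years, with
# Bonferroni-adjusted pairwise t tests. All data are hypothetical.
from itertools import combinations
from scipy import stats

groups = {                     # hypothetical ratings on the 0-3 scale
    "P1": [1, 2, 1, 2, 2, 1, 2],
    "P2": [2, 3, 2, 3, 2, 3, 2],
    "P3": [3, 2, 3, 3, 2, 2, 3],
}

f_stat, p_val = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

if p_val < 0.05:                              # a priori alpha
    n_comparisons = 3                         # P1-P2, P1-P3, P2-P3
    for a, b in combinations(groups, 2):
        t, p = stats.ttest_ind(groups[a], groups[b])
        # Bonferroni: multiply each pairwise p value by the comparison count
        print(f"{a} vs {b}: adjusted p = {min(p * n_comparisons, 1.0):.4f}")
```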

RESULTS

The graphical curriculum map of the intended/delivered curriculum by course and domain is shown in Figure 1. Domain 3 (health promotion and disease prevention) and its associated competencies appeared to receive less relative emphasis at the course level than the other domains. Domain 1 (patient care), domain 2 (dispensing medications and devices), and domain 4 (professionalism) had greater intensity of shading relative to the other domains. Domain 5 (health systems management) appeared to have greater emphasis relative to domain 3 (health promotion and disease prevention), but less than domains 1, 2, and 4.

Figure 1.


Map of intended/delivered curriculum by course and domain (domain 1: patient care, ensuring appropriate therapy and outcomes; domain 2: dispensing medications and devices; domain 3: health promotion and disease prevention; domain 4: professionalism; domain 5: health systems management).

The graphical map of the intended/delivered and the received curriculum by domain and professional year is shown in Figure 2. The shading intensity of the received curriculum graphical map appears to be darker than the intended/delivered curriculum graphical map. The shading intensity on the received curriculum map is particularly pronounced in domains 1 and 4. Both maps show the least relative shading intensity for domain 3. As with the intended/delivered curriculum, the graphical map of the received curriculum showed that domains 1, 2, and 4 received a greater relative intensity of shading than domains 3 and 5.

Figure 2.


Map of the intended/delivered and received curriculum by professional year and domain (domain 1: patient care, ensuring appropriate therapy and outcomes; domain 2: dispensing medications and devices; domain 3: health promotion and disease prevention; domain 4: professionalism; domain 5: health systems management).

The mean student rating of domain coverage by year is shown in Table 2. The only differences in mean rating of domain coverage between professional years occurred in domain 5 (health systems management): the P1 curriculum had a lower mean rating than both the second-professional year (P2) curriculum (p < 0.0001) and the third-professional year (P3) curriculum (p < 0.0001), while P2 and P3 students did not differ in their mean rating of coverage of this domain (p = 0.129).

Table 2.

Extent of Domain Coverage Reported by First-, Second- and Third-Professional Year Pharmacy Students*†


*Four-point rating scale: 0 = not at all, 1 = a very small extent, 2 = a moderate extent, and 3 = a very large extent

†Domain 1: patient care, ensuring appropriate therapy and outcomes; domain 2: dispensing medications and devices; domain 3: health promotion and disease prevention; domain 4: professionalism; domain 5: health systems management

DISCUSSION

Curriculum mapping can be used to make the curriculum more transparent and to demonstrate the links between its various components. There does appear to be some concordance between the intended/delivered and the received curriculum graphical maps, as the domains on both maps were ranked in the same order by relative intensity of shading. For example, domain 3 (health promotion and disease prevention) was the least emphasized domain on both graphical maps, as indicated by its lack of shading intensity compared with the other domains. The received curriculum map showed a higher relative emphasis on the various domains than the corresponding domains on the intended/delivered curriculum map, as indicated by the darker intensity of shading. Higher student ratings of domain coverage were expected given several possible confounders. While students were asked to indicate the extent to which the curriculum focused on a given domain during a particular academic year, it may have been difficult for them to exclude work experiences and extracurricular activities that could address various aspects of a given domain. Students may also be unable to make judgments about curricular coverage on an academic-year basis. As Litaker et al noted, it is not possible to determine the extent to which topics were reinforced in the curriculum, potentially resulting in an underreporting of domain coverage in syllabi.10 Students may also report a greater extent of domain coverage due to social desirability bias or an incomplete understanding of the competencies and their respective domains. McCurdy et al found that students had difficulty associating educational experiences with the appropriate educational outcomes and tended to attribute excessive educational outcomes to each experience.17 It is possible that students attributed excessive competencies to various experiences and thus overreported the extent of domain coverage.

While elective courses are not officially offered in P1, students who have previously taken the required biochemistry and/or physiology courses replace those credit hours with electives. Electives are formally offered as part of the curriculum in P2 and P3, and students are allowed to take more than the required number of elective units. The types and number of electives a particular student chooses may also have affected student ratings of the extent of domain coverage: a student taking electives with greater exposure to certain competencies would be expected to rate the extent of coverage higher on the domains containing those competencies. Given the retrospective nature of data collection and the current design of the reflective report form, it was not possible to control for this potential confounder.

Interestingly, mean student ratings of domain coverage did not vary between professional years except for domain 5, for which P1 students reported a lower mean extent of coverage than both P2 and P3 students. Because this study was cross-sectional, it was not possible to determine whether a single cohort followed longitudinally would show the same consistency in the relative ranking of domain coverage. It was also not clear whether the same consistency of relative ranking on the received curriculum map would have been seen had students been asked to rate the extent of coverage at the competency level. While not shown here, on the intended/delivered curriculum graphical map at the course and competency level, the shading intensity does change between years in the curriculum. For example, for competency 1.1, there was greater shading intensity for courses in P1 than in P2 or P3, suggesting greater emphasis in the curriculum during P1 than later. For the received curriculum graphical map and associated reported means, the lack of difference in shading intensity may reflect a loss of information from collecting data exclusively at the domain rather than the competency level.

Many of the limitations of this study resulted from the retrospective nature of the data collection. This curriculum mapping demonstration shared some of the limitations identified by Litaker et al: retrospective curriculum mapping is subject to recall bias from faculty members and students, and it is not possible to determine the extent to which topics were reinforced in the curriculum.10 In this study, mapping of the intended/delivered curriculum was done retrospectively using syllabi; while not subject to recall bias, this approach did not permit investigation of the relative emphasis, in terms of instructional time, given to each competency within each course. Retrospective syllabi review also provided a cruder measure than the teacher surveys Porter used to determine instructional emphasis,14 and the syllabi did not allow examination of the extent to which topics were reinforced. The assumption that the intended curriculum was the same as the delivered curriculum was another major limitation of this study. The student responses contained in the existing database were subject to recall bias at the time of data collection, since students were asked to consider the academic year they had just completed. Student responses may also have reflected the potential confounders discussed above.

While the graphical maps provide insight into the intended/delivered versus the received curriculum, it was not possible to calculate an index of alignment because the retrospective faculty and student data were on different metrics. Collection of data using a uniform language, as Porter suggested, was not possible given the retrospective nature of the data.4,14 Faculty data were based on syllabi, where teaching or not teaching to a competency served as a proxy for domain coverage. Student data were based on an item rating the extent of coverage of each respective domain in the curriculum. Since each domain is composed of associated competencies, which in most cases are further divided into components, students may have been answering the item based on any number of competencies or components within a given domain. The faculty map was also based exclusively on the 31 required courses that comprise the didactic portion of the curriculum, whereas student responses may have reflected didactic and elective courses, extracurricular activities, and work experience. As seen in the graphical maps based on faculty syllabi, there appeared to be differences in the relative emphasis on competencies within a given domain; at the domain level, these differences would not be distinguishable.

Another limitation was the inability, due to the retrospective nature of data collection, to differentiate between the intended and the delivered curriculum. While ideally they should not differ, indicating good alignment, the intended curriculum described in a syllabus may not be what is actually delivered in the classroom. Some faculty members included the Outcomes Expected material in their syllabi in the previous academic year but omitted it in the 2004-2005 academic year; listing the material at one point and omitting it in subsequent iterations suggests that the original inclusion may have been made merely to comply with the request that the document be added to each course syllabus, and that the delivered curriculum may therefore have differed from the intended one. Moreover, even the use of focus groups to assess the delivered curriculum would be susceptible to limitations such as recall bias and social desirability bias.

The graphical curriculum maps have been used in several ways by the College. They were used to identify potential redundancies or excessive review in the curriculum, as well as potential areas of omission or more “hidden” parts of the curriculum; health promotion and disease prevention was one such area of comparatively weak coverage. The curriculum maps were given to the Evaluation and Special Study Committee and the College's curriculum committee for consideration in updating the Outcomes Expected document and as a baseline picture prior to the anticipated curriculum revision process in response to Standards 2007. The curriculum maps will also be used at the spring 2007 curriculum retreat as a data source for informing faculty of the intended/delivered and received curricula as the College moves forward in planning improvements. Other colleges of pharmacy could use similar data sources to map their curricula and obtain a baseline picture of which AACP CAPE competencies are adequately addressed and which areas need improvement as the implementation of Standards 2007 approaches.

CONCLUSION

The agreement between the student and faculty graphical curriculum maps on the relative emphasis of the domains in the Outcomes Expected document suggests concordance between the intended/delivered and the received curricula. The methodology illustrated here provides a means for examining the intended, delivered, and received curriculum based on outcomes statements. While Porter's work focused on the delivered curriculum, schools and colleges of pharmacy can use this graphical mapping technique to examine the relative degree of concordance between student and faculty perceptions of coverage of the AACP CAPE Outcomes. The technique also provides a mechanism for visually determining when curriculum competencies are covered, as well as areas that are potentially not sufficiently covered. Future study will include faculty and student measures using the same level of detail in describing the outcomes of the Outcomes Expected document, allowing calculation of an index of alignment.

ACKNOWLEDGMENTS

The primary author gratefully acknowledges the support of the American Foundation for Pharmaceutical Education in the conduct of this study through a predoctoral fellowship. We also acknowledge the contributions of Jerome V. D'Agostino, PhD, and Megan Welsh, MS, of the Educational Psychology department at The University of Arizona College of Education for their methodological assistance.

The ideas expressed in this manuscript are those of the authors and do not represent the position of the American Association of Colleges of Pharmacy.

REFERENCES

1. Abate MA, Stamatakis MK, Haggett RR. Excellence in curriculum development and assessment. Am J Pharm Educ. 2003;67: article 89.
2. Ewell PT. National trends in assessing student learning. J Engineering Educ. 1998:107-13.
3. Ewell PT, Jones DP. Indicators of ‘Good Practice’ in Undergraduate Education: A Handbook for Development and Implementation. Boulder, CO: National Center for Higher Education Management Systems; 1996.
4. Porter AC, Smithson JL. Defining, developing, and using curriculum indicators. Philadelphia, PA: Consortium for Policy Research in Education, CPRE Research Report Series; 2001.
5. Harden RM. AMEE Guide No. 21: Curriculum mapping: a tool for transparent and authentic teaching and learning. Med Teach. 2001;23:123-37. doi: 10.1080/01421590120036547.
6. Zavod RM, Zgarrick DP. Appraising general and professional ability based outcomes: curricular mapping project. Am J Pharm Educ. 2001;65:75S-116S.
7. Bouldin AS, Wilkin NE, Wyandt CM, Wilson MC. General and professional education abilities: identifying opportunities for development and assessment across the curriculum. Am J Pharm Educ. 2001;65:75S-116S.
8. Mort JR, Houglum JE, Kaatz B. Use of outcomes in the development of an entry-level PharmD curriculum. Am J Pharm Educ. 1995;59:327-33.
9. Mort JR, Houglum JE. Comparison of faculty's perceived coverage of outcomes: pre- versus post-implementation. Am J Pharm Educ. 1998;62:50-3.
10. Litaker D, Cebul RD, Masters S, Nosek T, Haynie R, Smith CK. Disease prevention and health promotion in medical education: reflections from an academic health center. Acad Med. 2004;79:690-7. doi: 10.1097/00001888-200407000-00017.
11. Meekin SA, Klein JE, Fleischman AR, Fins JJ. Development of a palliative education assessment tool for medical student education. Acad Med. 2000;75:986-92. doi: 10.1097/00001888-200010000-00011.
12. Wood EB, Meekin SA, Fins JJ, Fleischman AR. Enhancing palliative care education in medical school curricula: implementation of the palliative education assessment tool. Acad Med. 2002;77:285-91. doi: 10.1097/00001888-200204000-00005.
13. Wachtler C, Troein M. A hidden curriculum: mapping cultural competency in a medical programme. Med Educ. 2003;37:861-8. doi: 10.1046/j.1365-2923.2003.01624.x.
14. Porter AC. Measuring the content of instruction: uses in research and practice. Educ Res. 2002;31:3-14.
15. Draugalis JR, Slack MK, Sauer KA, Haber SL, Vaillancourt RR. Creation and implementation of a learning outcomes document for a doctor of pharmacy curriculum. Am J Pharm Educ. 2002;66:253-60.
16. American Association of Colleges of Pharmacy. Educational Outcomes. Alexandria, VA: Center for the Advancement of Pharmaceutical Education; 1998. Available at: www.aacp.org. Accessed March 11, 2007.
17. McCurdy LB, Walcerz DB, Drake WH. A web-based approach for outcomes assessment. Paper presented at: American Society for Engineering Education Annual Conference & Exposition; 2001; Albuquerque, NM.
