Innovations in Pharmacy. 2025 Jan 14;15(4). doi: 10.24926/iip.v15i4.5840

Assessment of Student IPEC Competency Using Observer-Based Evaluation in Didactic Interprofessional Education Activities

Jacqueline M Zeeman, Kimberly A Sanders, Tia M Belvin, Philip T Rodgers
PMCID: PMC12090079; PMID: 40401303

Abstract

Introduction: Interprofessional education (IPE) competency requires multiple developmental experiences across diverse educational environments, including didactic and experiential learning. While the literature outlines various IPE activities, gaps exist in IPE evaluation strategies, with most published tools relying on self-evaluation. This study describes an observer-based assessment of individual students' IPEC Competency development in a didactic IPE activity and compares observer-based ratings with student self-evaluation ratings.

Innovation: The IPEC Competency Assessment Tool of Individual Students (I-CATIS) was piloted in an IPE case collaboration activity involving pharmacy and dental students. Faculty were trained on the I-CATIS and evaluated pharmacy students on thirteen predetermined IPEC sub-competencies. Students evaluated their self-efficacy on the selected IPEC sub-competencies, which was compared with I-CATIS results.

Findings: Sixty-three pharmacy students across 12 groups were evaluated by six faculty facilitators. Across all observed competencies, 26% of student ratings were “Minimal,” 64% were “Developing,” and 10% were “Competent.” Students' self-evaluation ratings were higher than observer-based ratings on all sub-competencies. Facilitators indicated the I-CATIS was easy to use but challenging to complete while concurrently facilitating interprofessional teams.

Conclusions: The I-CATIS enabled observer-based evaluation of individual students' IPEC Competency development in a didactic IPE activity. The I-CATIS can supplement and advance student self-evaluation data and inform didactic IPE curriculum development to ensure graduates are prepared and competent to practice in a collaborative healthcare environment.

Keywords: Interprofessional education, Outcome assessment, Competency-based education, Pharmacy education

Introduction

Interprofessional education (IPE) involves educators and learners from two or more health professions who jointly create and foster a collaborative learning environment.1 Research shows that IPE activities are effective for developing and improving attitudes toward interdisciplinary teamwork, communication, problem-solving, and the knowledge and skills needed to prepare collaborative practice-ready graduates.2-4

Recognizing the impact of IPE experiences in developing a collaborative practice-ready workforce, the Interprofessional Education Collaborative Core Competencies for Interprofessional Collaborative Practice (IPEC Competencies) were developed with representation from multiple healthcare professions to define a consensus framework supporting interprofessional competency development.5,6 The IPEC Competencies outline four core competencies and several related sub-competencies needed to prepare collaborative practice-ready graduates: Values and Ethics (VE), Roles and Responsibilities (RR), Communication (CC), and Teams and Teamwork (TT).5,6 In November 2023, a revision to the IPEC Competencies was released. Notably, the revision retained the original four competency domains and a majority of the sub-competency statements.6

In addition to this consensus effort, various health professions education programs include IPE in accreditation requirements.7,8 Within pharmacy, the Accreditation Council for Pharmacy Education (ACPE) Standards outline several areas related to IPE, including required experiential and didactic learning and assessment of student readiness to contribute as an interprofessional team member.9

Despite consensus on incorporating IPE into accreditation standards, effective assessment strategies for evaluating students' interprofessional competency remain insufficient. Most existing tools rely on student self-assessment, which, while convenient, has limitations.10,11 To ensure programs are preparing collaborative practice-ready graduates, it is critical to evaluate individual student IPEC Competency, as each graduate will interact with various teams in the workforce. A scan of the National Center for Interprofessional Practice and Education Resource Exchange (Nexus) database of IPE assessment and evaluation tools indicates that only six of 50 published tools focus on observer-based evaluation of a single individual.12 However, two of these six tools – the Performance Assessment Communication and Teamwork Tools Set (PACT)13 and The University of Auckland Behavioral Rating Scale (UA-BRS)14 – actually evaluate the team rather than the individual when using the inventory's observer-based component.12 Notably, none of these tools assess individual students on the IPEC Competencies using observer-based evaluation.

In an increasingly dynamic healthcare environment, assessing individual student ability is crucial for preparing collaborative practice-ready graduates.15 Despite the national consensus approach, few assessment tools incorporate the IPEC Competencies. Additionally, although prior studies have explored various IPE self-evaluation tools, a gap exists in best practices for observer-based evaluation of individual students' IPEC Competency. The purpose of this study was to assess individual student IPEC Competency development by an observing facilitator in a didactic IPE activity and to compare observer-based and student self-evaluation ratings.

Innovation

Prior literature guided the approach, with a focus on assessment of individuals (as opposed to teams), observer-based evaluation (as opposed to student self-evaluation), incorporation of the IPEC Competencies, and applicability to pharmacy learners (as opposed to tools designed for other disciplines). A 2017 systematic review of IPE assessment tools relevant to pharmacy16 identified 36 tools, most of which utilize self-assessment.17-20 Four tools were identified that utilize observer-based evaluation of individual student performance: the Interprofessional Professionalism Assessment (IPA),21 the Individual Teamwork Observation and Feedback Tool (iTOFT),22 the Team Observed Structured Clinical Encounter (TOSCE),23 and the Interprofessional Collaborator Assessment Rubric (ICAR).24 Though each is used for observer-based assessment, none directly evaluates all four IPEC Competency domains and sub-competencies. Further, the TOSCE has been used in clinical settings rather than didactic learning environments.23 Nonetheless, these tools guided development of the study's IPEC-Competency Assessment Tool of Individual Students (I-CATIS) (available upon request).

The I-CATIS was created with the objectives of assessing student IPEC Competency while remaining concise and feasible for evaluators to assess individual students' performance in a didactic IPE activity within a brief observation period. While other published tools align with select domains of the IPEC Competencies, the I-CATIS uses the specific competency statement language and corresponding domains outlined in the IPEC Competencies – avoiding the need for multiple tools for different competency domains. Additionally, this approach positions the I-CATIS to easily incorporate published updates or revisions to the IPEC Competencies. During the observation period, the I-CATIS rates student performance of the IPEC sub-competency statements on three anchors: 1-Minimal, 2-Developing, and 3-Competent. A “Not Observable” option allows the evaluator to indicate that the student was not observed demonstrating the sub-competency. Anchors were reviewed and refined by IPE experts and evaluated for meaning, language interpretation, and significance.
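For illustration, the rating scale described above lends itself to a simple data representation. The following is a minimal sketch in Python; the class and names are illustrative only and are not part of the authors' tool:

```python
from enum import Enum

class ICatisRating(Enum):
    """I-CATIS anchors as described in the text (names are illustrative)."""
    NOT_OBSERVABLE = 0  # evaluator did not observe the sub-competency
    MINIMAL = 1         # 1-Minimal
    DEVELOPING = 2      # 2-Developing
    COMPETENT = 3       # 3-Competent
```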

The I-CATIS tool was developed to be applicable to various didactic IPE activities. The specific sub-competencies included on the I-CATIS for the didactic IPE activity are determined by IPE curriculum leaders by identifying IPEC Competency statements that align with the activity’s learning objectives – a best practice in curricular design.25,26 In this pilot, faculty leading the IPE activity independently selected from the 39 sub-competencies in the 2016 IPEC Competencies5 (as the 2023 IPEC Competencies6 were not yet available). Consensus was reached on thirteen IPEC sub-competencies for this activity.

Pilot

The I-CATIS was piloted in a required didactic IPE activity involving third-year pharmacy and third-year dental students (Table 1).27 Students completed a voluntary retrospective-pre and post evaluation of their self-efficacy in performing the same IPEC sub-competencies on a 6-point scale.27 Cross-mapping of observer and student ratings was done a priori to facilitate comparison, such that student self-evaluation ratings of 1-Very unconfident and 2-Unconfident mapped to observer-rating 1-Minimal; self-evaluation ratings of 3-Somewhat unconfident and 4-Somewhat confident mapped to observer-rating 2-Developing; and self-evaluation ratings of 5-Confident and 6-Very confident mapped to observer-rating 3-Competent.
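A minimal sketch of this a priori cross-mapping, assuming ratings are stored as integers (Python; the dictionary and function names are illustrative, not the study's actual analysis code):

```python
# Map 6-point self-efficacy ratings onto the 3-point I-CATIS scale.
SELF_EFFICACY_TO_ICATIS = {
    1: 1,  # Very unconfident      -> 1-Minimal
    2: 1,  # Unconfident           -> 1-Minimal
    3: 2,  # Somewhat unconfident  -> 2-Developing
    4: 2,  # Somewhat confident    -> 2-Developing
    5: 3,  # Confident             -> 3-Competent
    6: 3,  # Very confident        -> 3-Competent
}

def map_self_rating(rating: int) -> int:
    """Map a 6-point self-efficacy rating to the 3-point I-CATIS scale."""
    return SELF_EFFICACY_TO_ICATIS[rating]
```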

Table 1.

Description of the didactic IPE Case Collaboration Activitya

Activity Name: Pharmacy-Dental patient case collaboration
Educational Activity: Students from the 3rd year pharmacy class and 3rd year dental class were formed into interprofessional groups as part of a required course in each of their schools/programs. The activity was to discuss their perspectives on a patient case involving infective endocarditis antibiotic prophylaxis, dental extractions, pain management, atrial fibrillation with anticoagulation management, and long-term dental care.
Educational Approach: Team-based learning approach combined with interprofessional collaboration to share unique perspectives.
Learning Objectives:
  1. Identify (efficiently, accurately and from authoritative sources, especially primary literature) the key information required to understand and correctly treat disease states the student is currently unfamiliar with.

  2. Apply the process of clinical decision-making effectively (i.e., using the best evidence) and efficiently (i.e., in a space- and time-limited format).

  3. Collaborate effectively and efficiently with peers in teams to apply clinical decision-making and problem-solving skills to achieve optimal medication and health-related outcomes.

  4. Design an effective pharmacotherapeutic plan to address a diverse range of patient-specific problems.

  5. Communicate pharmacotherapy plans accurately and concisely to a diverse range of patients and health professionals.

Materials: Respective textbook chapters on the topics noted above for each course; patient case provided as electronic documents or on a commercial electronic medical record teaching platform (EHR Go®, Archetype Innovations, LLC; Duluth, MN).
Educational Strategies: Interprofessional student teams included a faculty facilitator who attended a portion of the team meeting to help guide discussion and pose helpful questions to prompt discussion. Dental students were required to give a presentation to their team using slides about their findings, assessments, and interventions for the case. Pharmacy students were expected to explain drug-related problems and solutions, and explore the dental perspectives.
Incentives: As an activity in required courses, students were expected to attend team meetings or be marked absent. Course absences are considered when determining course failure and remediation eligibility. Team meeting content contributed to a graded presentation deliverable and active participation credit in each course.
Instructors: Instructors were pharmacy faculty, including the lead topic instructors and all team facilitators. Dental faculty served as their respective program course coordinators and reviewed content.
Delivery: Teams and facilitators met in a virtual environment (Zoom, version 4.6.8, Zoom Video Communications, Inc.), with 8-9 students per team and 1 faculty facilitator.
Environment: Students and facilitators were allowed to connect remotely from any location (home, campus, etc) where they could access the virtual connection. This facilitated dual campus connection and participation, regardless of primary campus assignment.
Schedule: Teams met on a designated morning for 2 hours to complete the case activity. Faculty facilitators were each assigned 2 student teams, attending ~45 minutes of each team's meeting online.
Format: All students were able to review the topics and case for 24-48 hours prior to their team meeting. Students spent 2 hours in their team meeting, and then 24-48 hours after the meeting preparing their assigned deliverable for their courses.
Planned changes/Adaptations: Due to the pharmacy school structure, the activity was conducted twice: half of each school’s/program’s students met during a designated day in one month to complete the activity, and the other half met during a designated day in another month to complete it. Pharmacy and dental classes were evenly divided between both months.
Unplanned changes/Modifications: There were no unplanned or unanticipated changes needed because the activity went as expected.
Attendance: Attendance was very high, with only a few anticipated excused absences. Faculty facilitators recorded attendance as students appeared in the virtual room.
Effective delivery: The team discussions were primarily student-led but faculty facilitators assessed and guided the quality of the discussion during the meeting. The deliverable prepared by the students demonstrated the interprofessional concepts learned from the meeting.
a Table format amended from the Guideline for Reporting Evidence-based Practice Educational Interventions and Teaching (GREET).36

Validity evidence for the full I-CATIS instrument and its application in this pilot study included review by IPE experts, evaluators, and students for face and content validity. Facilitator I-CATIS training was held to provide instruction on how to use the tool, calibration on applying scale anchors to enhance interrater reliability, and example cases for application practice.

Twelve teams of 6-10 students each met virtually for the didactic IPE activity. Each team included 3-7 pharmacy students, 3-4 dental students, and one faculty facilitator who guided group discussion. For the purposes of this pilot, the evaluation focused on pharmacy students only. Each facilitator evaluated two groups of pharmacy students over a 2-hour period, observing each group for approximately 45 minutes while using the I-CATIS.

Within 48 hours, facilitators submitted their I-CATIS evaluation for each student via Qualtrics (version 2021.10, QualtricsXM) and provided feedback on the I-CATIS's ease of use, time requirements, strengths, and opportunities for improvement. Students voluntarily completed their self-efficacy assessment via Qualtrics within one week of the IPE activity. All data were de-identified and analyzed using descriptive statistics, as prior literature has explored the validity and reliability of the IPEC Competencies in evaluating student learning in health professions education.28-32 Pre/post student self-evaluation ratings were compared using a paired t-test; student-post and observer ratings were compared using an unpaired t-test. Facilitator comments were reviewed for common themes. This project was determined to be exempt by the University of North Carolina at Chapel Hill Institutional Review Board.
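A minimal sketch of the statistical comparisons described above, using scipy.stats with hypothetical example data (the authors analyzed Qualtrics exports; the variable names and values here are illustrative only):

```python
from scipy import stats

# Paired t-test: each student's retrospective-pre vs. post self-rating
# (hypothetical example data on the mapped 3-point scale).
pre = [2, 2, 3, 2, 1, 2]
post = [3, 3, 3, 2, 2, 3]
t_paired, p_paired = stats.ttest_rel(pre, post)

# Unpaired t-test: student post self-ratings vs. observer I-CATIS ratings
# (different raters, so the samples are treated as independent).
observer = [2, 1, 2, 2, 3, 2, 1]
t_ind, p_ind = stats.ttest_ind(post, observer)

print(f"pre vs post: t={t_paired:.2f}, p={p_paired:.3f}")
print(f"post vs observer: t={t_ind:.2f}, p={p_ind:.3f}")
```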

Findings

Sixty-three student pharmacists were evaluated by six faculty facilitators. Across all observed competencies (Table 2), 26% of student ratings were “Minimal,” 64% were “Developing,” and 10% were “Competent.” Of the 13 sub-competencies evaluated, TT10 had the highest frequency of “Not Observable” ratings (87%), followed by TT11 (86%); notably, both were rated as “Not Observable” for 11 of 12 teams (92%).

Table 2.

Comparison of Observer-Based Competency Rating and Student Self-Evaluation of IPEC Competency

Sub-competency | Observer a: Minimal / Developing / Competent, % (n) | Student Retrospective-Pre b: Minimal / Developing / Competent, % (n) | Student Post b: Minimal / Developing / Competent, % (n)
Values/Ethics (VE) for Interprofessional Practice
VE4*,^ | 10% (3) / 79% (23) / 10% (3) | 3% (1) / 38% (12) / 59% (19) | 0% (0) / 13% (4) / 88% (28)
Roles/Responsibilities (RR)
RR1*,^ | 33% (5) / 47% (7) / 20% (3) | 9% (3) / 44% (14) / 47% (15) | 0% (0) / 19% (6) / 81% (26)
RR2*,^ | 25% (8) / 63% (20) / 13% (4) | 6% (2) / 41% (13) / 53% (17) | 0% (0) / 28% (9) / 72% (23)
RR3*,^ | 33% (7) / 57% (12) / 10% (2) | 6% (2) / 53% (17) / 41% (13) | 0% (0) / 28% (9) / 72% (23)
RR4*,^ | 30% (3) / 50% (5) / 20% (2) | 9% (3) / 56% (18) / 34% (11) | 0% (0) / 25% (8) / 75% (24)
RR5*,^ | 32% (7) / 64% (14) / 5% (1) | 3% (1) / 63% (20) / 34% (11) | 0% (0) / 31% (10) / 69% (22)
RR6*,^ | 70% (7) / 0% (0) / 30% (3) | 3% (1) / 53% (17) / 44% (14) | 3% (1) / 28% (9) / 69% (22)
RR9*,^ | 63% (10) / 38% (6) / 0% (0) | 3% (1) / 59% (19) / 38% (12) | 0% (0) / 28% (9) / 72% (23)
Interprofessional Communication (CC)
CC3*,^ | 23% (5) / 73% (16) / 5% (1) | 3% (1) / 56% (18) / 41% (13) | 3% (1) / 25% (8) / 72% (23)
CC4*,^ | 9% (2) / 77% (17) / 14% (3) | 3% (1) / 47% (15) / 50% (16) | 3% (1) / 19% (6) / 78% (25)
Teams and Teamwork (TT)
TT3*,^ | 17% (4) / 74% (17) / 9% (2) | 0% (0) / 63% (20) / 38% (12) | 0% (0) / 31% (10) / 69% (22)
TT10* | 0% (0) / 100% (8) / 0% (0) | 0% (0) / 53% (17) / 47% (15) | 0% (0) / 28% (9) / 72% (23)
TT11* | 11% (1) / 89% (8) / 0% (0) | 0% (0) / 50% (16) / 50% (16) | 0% (0) / 31% (10) / 69% (22)

Sub-competencies are from the Interprofessional Education Collaborative (IPEC) Core Competencies for Interprofessional Collaborative Practice.5

a IPEC-Competency Assessment Tool of Individual Students (I-CATIS) ratings: 1-Minimal, 2-Developing, 3-Competent.

b Student retrospective-pre and post self-evaluation: 6-point self-efficacy scale mapped to the I-CATIS, such that 1-Very unconfident and 2-Unconfident mapped to I-CATIS rating 1-Minimal; 3-Somewhat unconfident and 4-Somewhat confident mapped to I-CATIS rating 2-Developing; and 5-Confident and 6-Very confident mapped to I-CATIS rating 3-Competent.

* Statistically significant (p<0.05) difference observed between student retrospective-pre and post ratings.

^ Statistically significant (p<0.05) difference observed between observer-based rating and student post rating.

The I-CATIS ratings (1-Minimal, 2-Developing, 3-Competent) were compared with students' self-efficacy ratings (1-Very unconfident to 6-Very confident) (Table 2). Thirty-two of 63 students (51%) completed the voluntary retrospective-pre/post self-assessment. For all 13 sub-competencies, students rated their confidence higher post-activity than pre-activity (p<0.05). Additionally, students rated themselves higher than facilitators on all 13 (100%) sub-competencies, on both the retrospective-pre and post evaluations.

Facilitator feedback highlighted that a strength of the I-CATIS was its ease of completion. However, documenting the evaluation for each student while simultaneously facilitating student groups was challenging. When asked how much time was needed to perform both concurrently, facilitators reported 5-30 minutes per student. Facilitator suggestions included having a dedicated evaluator while faculty facilitate the case, or reducing the number of sub-competencies evaluated if additional personnel are unavailable.

Critical analysis

This study is one of the first to describe an observer-based strategy to assess individual student IPEC Competency during a didactic IPE activity, addressing a critical gap in evaluation strategies that move beyond student self-evaluation.

As prior literature has shown, a common assessment strategy in didactic IPE activities is student self-evaluation,10-12,16 which carries limitations such as self-report bias and social desirability bias.33 Implementing observer-based evaluation can advance and/or supplement this common assessment strategy and minimize these known limitations. Within this study, faculty rated student IPEC Competency lower than students rated themselves. This finding supports literature demonstrating that self-evaluations tend to be higher than non-self evaluations, and suggests trained faculty may better evaluate interprofessional competency given their experience and expertise.34,35 Use of the I-CATIS in combination with student self-evaluations provides a more robust evaluation of student IPEC Competency development.

Additionally, the I-CATIS was designed to evaluate student IPEC Competency across various didactic IPE activities and thus can support data-driven curriculum design. The I-CATIS's ability to focus on the IPEC Competencies emphasized in a specific IPE activity is a unique strength, as it allows targeted assessment of student competency development while minimizing the respondent fatigue that would occur if each student were evaluated on all 39 sub-competencies. In this study, the I-CATIS was piloted in an annual didactic IPE activity,27 and results indicated the TT domain was most frequently rated as “Not Observable.” Additionally, students' self-evaluations indicated lower self-efficacy on TT compared to other domains. This suggests the TT domain may be challenging to practice and/or assess in this didactic IPE activity, a critical insight not easily identified without the I-CATIS. IPE curriculum leaders may consider integrating additional teamwork content within this activity and/or emphasizing teamwork in alternate IPE activities. This is just one example of the curricular insights that can be gained by reviewing both I-CATIS observer ratings and student self-evaluation ratings.

While facilitators indicated the greatest strength of the I-CATIS was its ease of use, they found it challenging to actively facilitate interprofessional team discussion while simultaneously evaluating individual students. This suggests that while the I-CATIS is a well-designed tool, consideration should be given to dedicating an observing evaluator, in addition to the faculty facilitator, for each group. Another consideration is extending the observation time. While extending a live observation may require additional resources (eg, physical space, personnel), virtual activities can be recorded; recordings may reduce resource needs while improving the evaluator experience, since evaluators can re-watch the observation rather than facilitate and evaluate at the same time. Finally, consideration should be given to the number of IPEC sub-competencies included in the I-CATIS relative to the observation period and team size, to ensure students have sufficient opportunity to develop and demonstrate these skills. Including all, or too many, IPEC sub-competencies may result in evaluator fatigue.

Next steps

Study findings provide important insight into observer-based evaluation of individual student IPEC Competency development in didactic IPE activities and can be used by other programs to guide assessment strategies in didactic IPE curricula. This innovative approach can supplement commonly used student self-evaluation strategies in didactic IPE assessment, providing a more holistic understanding of student collaborative practice readiness and guiding didactic IPE curricula.

While this study had a number of strengths and addressed a critical literature gap, there are limitations to consider. This study was conducted at a single institution during one didactic IPE activity involving pharmacy and dental students. For the purposes of this pilot, only one discipline was evaluated using the I-CATIS to explore feasibility, strengths, and opportunities.

Faculty indicated simultaneously facilitating the case discussion hindered their ability to thoroughly evaluate each participant. Future research should pilot refinements to the I-CATIS, including additional resources (eg, independent evaluators, additional observation time), and expand the evaluation to include all participating disciplines involved in the didactic IPE activity.

Acknowledgments

The authors acknowledge Robert Hubal, PhD and members of the Research and Scholarship in Pharmacy (RASP) Program at the UNC Eshelman School of Pharmacy for feedback on the study design.

Funding/support: None

Conflicts of interest: None

Disclaimer: The statements, opinions, and data contained in all publications are those of the authors.

References

1. Buring SM, Bhushan A, Broeseker A, et al. Interprofessional education: definitions, student competencies, and guidelines for implementation. Am J Pharm Educ. 2009;73:59. doi: 10.5688/aj730459
2. Dyess AL, Brown JS, Brown ND, Flautt KM, Barnes LJ. Impact of interprofessional education on students of the health professions: a systematic review. J Educ Eval Health Prof. 2019;16:33. doi: 10.3352/jeehp.2019.16.33
3. Lie DA, Richter-Lagha R, Forest CP, Walsh A, Lohenry K. When less is more: validating a brief scale to rate interprofessional team competencies. Med Educ Online. 2017;22:1314751. doi: 10.1080/10872981.2017.1314751
4. Mitzel K, Storjohann T, Herrick A, Davis L, Shamblen C, Bonnin K. Interprofessional objective structured clinical examination with physician assistant and pharmacy students - a qualitative and quantitative study. Curr Pharm Teach Learn. 2020;12:174-180. doi: 10.1016/j.cptl.2019.11.011
5. Interprofessional Education Collaborative. Core competencies for interprofessional collaborative practice: 2016 update. Washington, DC: Interprofessional Education Collaborative; 2016. Accessed 31 October 2024. https://ipec.memberclicks.net/assets/2016-Update.pdf
6. Interprofessional Education Collaborative. IPEC core competencies revision: version 3 (2021-2023). Washington, DC: Interprofessional Education Collaborative. Accessed 31 October 2024. https://www.ipecollaborative.org/assets/core-competencies/IPEC_Core_Competencies_Version_3_2023.pdf
7. Brewer ML, Flavell H. High and low functioning team-based pre-licensure interprofessional learning: an observational evaluation. J Interprof Care. 2020;35:538-545. doi: 10.1080/13561820.2020.1778652
8. Rogers GD, Thistlethwaite JE, Anderson ES, et al. International consensus statement on the assessment of interprofessional learning outcomes. Med Teach. 2017;39:347-359. doi: 10.1080/0142159X.2017.1270441
9. Accreditation Council for Pharmacy Education (ACPE). Accreditation standards and key elements for the professional program in pharmacy leading to the Doctor of Pharmacy degree. Standards 2025. Accessed 31 October 2024. https://www.acpe-accredit.org/pdf/ACPEStandards2025.pdf
10. Salvati LA, Meny LM, de Voest MC, et al. Assessing the validity and reliability of the pharmacist interprofessional competencies tool. Am J Pharm Educ. 2020;84:7668. doi: 10.5688/ajpe7668
11. Thannhauser J, Russell-Mayhew S, Scott C. Measures of interprofessional education and collaboration. J Interprof Care. 2010;24:336-349. doi: 10.3109/13561820903442903
12. Nexus Resource Exchange. Assessment and evaluation - the measurement instrument collection. National Center for Interprofessional Practice and Education. Accessed 31 October 2024. https://nexusipe.org/advancing/assessment-evaluation
13. Chiu CJ. Development and validation of performance assessment tools for interprofessional communication and teamwork (PACT). Unpublished doctoral dissertation, University of Washington; 2014. Accessed 31 October 2024. https://digital.lib.washington.edu/server/api/core/bitstreams/b9b2142e-cd20-44d4-921c-2f66311ff40b/content
14. Weller J, et al. Evaluation of an instrument to measure teamwork in multidisciplinary critical care teams. BMJ Qual Saf. 2011;20(3):216-222. doi: 10.1136/bmjqs.2010.041913
15. Salvati L, Bright D, de Voest M, et al. An approach for the development and implementation of an assessment tool for interprofessional education learning activities. Innov Pharm. 2017;8:4. https://pubs.lib.umn.edu/innovations/vol8/iss4/4
16. Shrader S, Farland MZ, Danielson J, Sicat B, Umland EM. A systematic review of assessment tools measuring interprofessional education outcomes relevant to pharmacy education. Am J Pharm Educ. 2017;81:119. doi: 10.5688/ajpe816119
17. Archibald D, Trumpower D, MacDonald CJ. Validation of the interprofessional collaborative competency attainment survey (ICCAS). J Interprof Care. 2014;28:553-558. doi: 10.3109/13561820.2014.917407
18. Parsell G, Bligh J. The development of a questionnaire to assess the readiness of health care students for interprofessional learning (RIPLS). Med Educ. 1999;33:95-100. doi: 10.1046/j.1365-2923.1999.00298.x
19. Norris J, Carpenter JG, Eaton J, et al. Development and construct validation of the interprofessional attitudes scale. Acad Med. 2015;90:1394-1400. doi: 10.1097/ACM.0000000000000764
20. Orchard CA, King GA, Khalili H, Bezzina MB. Assessment of interprofessional team collaboration scale (AITCS): development and testing of the instrument. J Contin Educ Health Prof. 2012;32:58-67. doi: 10.1002/chp.21123
21. Frost JS, Hammer DP, Nunez LM, et al. The intersection of professionalism and interprofessional care: development and initial testing of the interprofessional professionalism assessment (IPA). J Interprof Care. 2019;33:102-115. doi: 10.1080/13561820.2018.1515733
22. Thistlethwaite J, Dallest K, Moran M, et al. Introducing the individual teamwork observation and feedback tool (iTOFT): development and description of a new teamwork measure. J Interprof Care. 2016;30:526-528. doi: 10.3109/13561820.2016.1169262
23. Lie D, May W, Richter-Lagha R, Forest C, Banzali Y, Lohenry K. Adapting the McMaster-Ottawa scale and developing behavioral anchors for assessing performance in an interprofessional team observed structured clinical encounter. Med Educ Online. 2015;20:26691. doi: 10.3402/meo.v20.26691
24. Curran V, Hollett A, Casimiro LM, et al. Development and validation of the interprofessional collaborator assessment rubric (ICAR). J Interprof Care. 2011;25:339-344. doi: 10.3109/13561820.2011.589542
25. Biggs J. Aligning teaching for constructive learning. Higher Education Academy. 2003;1:1-4.
26. Wiggins GP, McTighe J. Understanding by Design. 2nd expanded ed. Association for Supervision & Curriculum Development; 2005:13-34.
27. Zeeman JM, Rodgers PT, Sanders KA. Utilization of an IPEC core competency assessment instrument measuring student self-efficacy in a pharmacotherapy course. Am J Pharm Educ. 2020;84:8220. doi: 10.5688/ajpe8220
28. Dow AW, DiazGranados D, Mazmanian PE, Retchin SM. An exploratory study of an assessment tool derived from the competencies of the interprofessional education collaborative. J Interprof Care. 2014;28:299-304. doi: 10.3109/13561820.2014.891573
29. Lockeman KS, Dow AW, DiazGranados D, et al. Refinement of the IPEC Competency Self-Assessment survey: results from a multi-institutional study. J Interprof Care. 2016;30:726-731. doi: 10.1080/13561820.2016.1220928
30. Hasnain M, Gruss V, Keehn M, Peterson E, Valenta AL, Kottorp A. Development and validation of a tool to assess self-efficacy for competency in interprofessional collaborative practice. J Interprof Care. 2017;31:255-262. doi: 10.1080/13561820.2016.1249789
31. Kottorp A, Keehn M, Hasnain M, Gruss V, Peterson E. Instrument refinement for measuring self-efficacy for competency in interprofessional collaborative practice: development and psychometric analysis of IPECC-SET 27 and IPECC-SET 9. J Interprof Care. 2019;33:47-56. doi: 10.1080/13561820.2018.1513916
32. Lockeman KS, Dow AW, Randell AL. Validity evidence and use of the IPEC Competency Self-Assessment, Version 3. J Interprof Care. 2021;35:107-113. doi: 10.1080/13561820.2019.1699037
33. Austin Z, Gregory PAM. Evaluating the accuracy of pharmacy students' self-assessment skills. Am J Pharm Educ. 2007;71:89. doi: 10.5688/aj710589
34. Dow AW, DiazGranados D, Mazmanian PE, Retchin SM. An exploratory study of an assessment tool derived from the competencies of the interprofessional education collaborative. J Interprof Care. 2014;28:299-304. doi: 10.3109/13561820.2014.891573
35. Eva KW, Regehr G. Exploring the divergence between self-assessment and self-monitoring. Adv Health Sci Educ. 2011;16:311-329. doi: 10.1007/s10459-010-9263-2
36. Phillips AC, Lewis LK, McEvoy MP, Galipeau J, Glasziou P, Moher D, Tilson JK, Williams MT. Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET). BMC Med Educ. 2016;16:237. doi: 10.1186/s12909-016-0759-1
