Journal of General Internal Medicine. 2016 Apr 27;31(8):846–853. doi: 10.1007/s11606-016-3690-6

Charting a Key Competency Domain: Understanding Resident Physician Interprofessional Collaboration (IPC) Skills

Sondra Zabar 1, Jennifer Adams 1, Sienna Kurland 1, Amara Shaker-Brown 1, Barbara Porter 1, Margaret Horlick 1, Kathleen Hanley 1, Lisa Altshuler 1, Adina Kalet 1, Colleen Gillespie 1,2
PMCID: PMC4945565  PMID: 27121308

Abstract

BACKGROUND

Interprofessional collaboration (IPC) is essential for quality care. Understanding residents’ level of competence is a critical first step to designing targeted curricula and workplace learning activities. In this needs assessment, we measured residents’ IPC competence using specifically designed Objective Structured Clinical Exam (OSCE) cases and surveyed residents regarding training needs.

METHODS

We developed three cases to capture IPC competence in the context of physician–nurse collaboration. A trained actor played the role of the nurse (Standardized Nurse – SN). The Interprofessional Education Collaborative (IPEC) framework was used to create a ten-item behaviorally anchored IPC performance checklist (scored on a three-point scale: not done, partly done, well done) measuring four generic domains: values/ethics; roles/responsibilities; interprofessional communication; and teamwork. Specific skills required for each scenario were also assessed, including teamwork communication (SBAR and CUS) and patient-care–focused tasks. In addition to evaluating IPC skills, the SN assessed communication, history-taking and physical exam skills. IPC scores were computed as percent of items rated well done in each domain (Cronbach’s alpha > 0.77). Analyses include item frequencies, comparison of mean domain scores, correlation between IPC and other skills, and content analysis of SN comments and resident training needs.

RESULTS

One hundred and seventy-eight residents (of 199 total) completed an IPC case and results are reported for the 162 who participated in our medical education research registry. IPC domain scores were: Roles/responsibilities mean = 37 % well done (SD 37 %); Values/ethics mean = 49 % (SD 40 %); Interprofessional communication mean = 27 % (SD 36 %); Teamwork mean = 47 % (SD 29 %). IPC was not significantly correlated with other core clinical skills. SNs’ comments focused on respect and IPC as a distinct skill set. Residents described needs for greater clarification of roles and more workplace-based opportunities structured to support interprofessional education/learning.

CONCLUSIONS

The IPC cases and competence checklist are a practical method for conducting needs assessments and evaluating IPC training/curriculum that provides rich and actionable data at both the individual and program levels.

KEY WORDS: interprofessional collaboration, medical education, interprofessional education, OSCE

BACKGROUND

Collaboration among members of the health care team is essential to safe and effective practice.1 New models of care such as those associated with Patient-Centered Medical Homes and Accountable Care Organizations depend on effective interprofessional collaborative practice. Despite expert recommendations for training in interprofessional skills,2 there is little formal preparation in undergraduate and graduate medical education, a lack of opportunities for practice and feedback, and almost no systematic observation or assessment of interprofessional collaboration (IPC) in residency programs.1

To work together effectively, individual health professionals on a team need both complementary profession-specific and interprofessional collaborative competencies.3 In 2010, the Interprofessional Education Collaborative expert panel proposed a core set of truly “interprofessional” competencies in four domains:4 Values and Ethics (e.g., Respect, Trust, Cooperation, High Standards of Care), Roles and Responsibilities (e.g., Discuss and Clarify Roles, Recognize Limitations), Interprofessional Communication (e.g., Organize Information, Communicate Information, Facilitate Discussion, Listen, Give Feedback), and Teams and Team Work (e.g., Engage Team, Build Consensus, Support Teamwork, Resolve Conflict). This framework helped spur interest in the development of curricula and assessment strategies for addressing these competencies, which have already been included in the Next Accreditation System as milestones for all residency programs.5

To design IPC curricula, workplace learning experiences and assessments, it is critical to first understand resident physicians’ current levels of competence. Several medical training programs have described the use of the Objective Structured Clinical Exam (OSCE) as a valid and acceptable educational tool to both assess and teach interprofessional skills to medical students and residents.6–11 Learners and evaluators have also emphasized that immediate feedback, discussion and reflection sessions,6 often with interprofessional evaluators,9 are particularly high-yield aspects of these OSCEs.

Such feedback is not possible without quality assessment. In our initial review of IPC assessment tools, conducted in 2011 while preparing our cases and checklists, we were unable to find well-validated instruments for assessing nurse–physician interactions in outpatient clinical encounters: many tools focused on attitudes and perceived competence,12,13 and those that did assess skills were often based in team crisis management settings and not designed to assess individual skills.14 We therefore created a behaviorally defined assessment tool, based on Interprofessional Education Collaborative domains and competencies, and tailored to the routine physician–nurse interactions that occur frequently in primary care practice for common chronic (e.g., diabetes, hypertension) and urgent conditions. Additional performance-focused tools have since been developed (see, for example, the Interprofessional Collaborator Assessment Rubric15 and the Performance Assessment for Communication and Teamwork16). While these share many of the core skills and domains we identified from the literature, they focus on the team rather than the individual, include domains such as situational monitoring, team goals, and use of protocols/checklists that are not relevant to primary care scenarios, and tend to use judgment-based scoring options (e.g., below expectations or excellent). Our OSCE assessments are designed to help Standardized Patients rate consistently and accurately by using case-specific behavioral descriptors (observable actions) as the basis for selecting scoring options.

In this needs assessment of residents’ IPC skills, we designed OSCE cases that simulate common clinical scenarios that interprofessional care team members encounter.17 Specifically, we describe the scenarios and assessments we developed for measuring IPC practice skills in resident physicians; report IPC performance and its association with other skills as a means of understanding current levels and dimensions of competence in our sample of medicine residents; and describe resident physicians’ views on the amount of IPC training they have received and what kinds of training they think would be most helpful. These data were collected to inform the development of targeted curricula and workplace-based learning activities to support effective IPC practice.

METHODS

Sample

At our institution, an annual 6–10 station OSCE is fielded with all three post-graduate years (PGY) of Primary Care Internal Medicine residents (PC) and with the PGY2s in the Categorical Internal Medicine Program (CAT). Each OSCE since 2012 has included an interprofessional collaborative practice case, and we combined the data on IPC case performance over 3 years to arrive at a sample of 68 PC and 110 CAT residents, representing 89 % of all residents in these two programs (178/199; Table 1). The study reported here is covered under our Resident Research Registry, an NYU School of Medicine institutional review board (IRB)-approved registry in which residents are asked to provide consent for their routinely collected educational data to be compiled in a de-identified longitudinal database that can be used to answer research questions.18 Ninety-one percent (162) of the residents who completed these OSCEs are in the Registry (consented). We also surveyed residents about IPC training as part of a cross-residency program needs assessment; 100 of the residents who completed an IPC case in the OSCE also completed this survey (62 % of consenting residents).

Table 1.

Residency Programs, Years, Cases, Response Rates and Sample Size

Program | Year | Case | Total N of residents | Residents who completed the OSCE case | N consented
Primary care internal medicine | 2012 | Hypertension | 23 | 21 | 20
Primary care internal medicine | 2013 | Diabetes | 23 | 23 | 22
Primary care internal medicine | 2014 | Chest Pain | 24 | 24 | 22
Primary care internal medicine, total | | | 70 | 68 | 64 (91 %)
Categorical internal medicine | 2012 | Chest Pain | 44 | 36 | 33
Categorical internal medicine | 2013 | Chest Pain | 43 | 39 | 34
Categorical internal medicine | 2014 | Chest Pain | 42 | 35 | 31
Categorical internal medicine, total | | | 129 | 110 | 98 (76 %)
TOTAL | | | 199 | 178 | 162 (81 %)

OSCE Objective Structured Clinical Exam

Interprofessional Collaborative Practice Cases

IPC OSCE cases were embedded in an annual multi-station OSCE with six to ten cases. Cases were developed collaboratively by several PC physician and nurse educators, and were based on scenarios and nurse–physician interactions commonly seen in our clinics. The general format of the OSCE is as follows: the resident reads a brief description of the patient and required tasks, and then spends 10 minutes demonstrating skills by evaluating the standardized patient (SP) and/or interacting with the Standardized Nurse (SN) within the simulation center exam room. Three cases were developed to assess IPC practice skills; all three involved an SN, and one had both an SP and an SN. One case involved an SN who had just completed an initial workup of a patient complaining of chest pain. The clinical challenge of this case was to diagnose a right ventricular infarction and appropriately manage the immediate needs of the patient. The interprofessional collaboration challenge for the resident was to elicit needed information from the SN, including her assessment, and to work together to develop a care plan. The second case was a follow-up clinic visit in which the physician was expected to recognize an SN’s medical error in summarizing the medications recommended for a patient with hypertension, and the third case involved a phone call in which, ideally, the resident and SN collaboratively developed a treatment plan for a patient with diabetes and hyperglycemia.

Measures

Core clinical skills across competency areas were assessed by the SNs, who received 6 hours of training in both portraying the case and rating performance reliably and accurately,19 using a 32- to 44-item checklist of behaviorally anchored items, each scored on a three-point scale (not done, partly done, or well done). Competency areas other than IPC included communication skills (12 items), history taking (6–12 items), physical examination (4–11 items), patient education and counseling (2–5 and 2–3 items, respectively), treatment plan (2–3 items) and patient satisfaction (3–4 items). Scores were calculated as percent of items rated well done within each domain. Adequate internal consistency has been established for these OSCEs over time, with Cronbach’s alphas exceeding 0.80 for each of these domains.
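To make this scoring concrete, the sketch below shows one way the calculation could be implemented. It is illustrative only: the data frame, item names, and coding (0 = not done, 1 = partly done, 2 = well done) are hypothetical stand-ins, not the study’s dataset, and the Cronbach’s alpha function is the standard textbook formula rather than the authors’ code.

```python
import pandas as pd

# Hypothetical long-format ratings: one row per resident x checklist item.
# Ratings are coded 0 = not done, 1 = partly done, 2 = well done.
ratings = pd.DataFrame({
    "resident": ["R1"] * 3 + ["R2"] * 3,
    "domain":   ["values_ethics"] * 6,
    "item":     ["expresses_value", "responds_to_suggestions", "treats_respectfully"] * 2,
    "rating":   [2, 1, 2, 1, 1, 0],
})

# Domain score per resident: percent of that domain's items rated "well done" (2).
domain_scores = (
    ratings.assign(well_done=ratings["rating"].eq(2))
           .groupby(["resident", "domain"])["well_done"]
           .mean()
           .mul(100)
)
print(domain_scores)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Standard Cronbach's alpha for a residents-by-items matrix of ratings."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Residents-by-items matrix for the items of one domain.
wide = ratings.pivot(index="resident", columns="item", values="rating")
print(cronbach_alpha(wide))
```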

Based on the four competency domains in the Interprofessional Education Collaborative model,20 we generated items and specific behavioral anchors to measure each domain of competency, selecting and adapting Interprofessional Education Collaborative (IPEC) competencies according to their relevance to the resident–nurse interaction and to the needs of the clinical scenario. Table 2 shows these items and the distribution of performance as rated by the SNs. In addition to these generic IPC skills, we identified IPC skills specific to the hypertension and diabetes error cases, based on two communication and mutual support techniques included in the Agency for Healthcare Research and Quality (AHRQ) TeamSTEPPS materials: using the SBAR format (Situation, Background, Assessment, Recommendation) to organize information when presenting to the SN, and using the CUS format (Concern about the situation, Uncomfortable with the situation, Safety of the patient is at risk) to identify and respond to an error.21 The diabetes case called for patient education and counseling; therefore, we assessed the degree to which residents used the SN’s expertise in educating the patient. Table 2 provides summary scores for the three cases and associated internal consistency estimates (Cronbach’s alpha), as well as frequency distributions of ratings for the specific behaviorally anchored checklist items. The SNs also provided open-ended comments about the residents’ interprofessional collaboration as part of their assessment of and feedback to the residents, which we used to better understand residents’ strengths and weaknesses in IPC practice. Finally, in 2014, as part of a needs assessment conducted across multiple disciplines, we conducted an online survey of Internal Medicine residents on their exposure to IPC training during residency and their views on what kinds of IPC training would be most helpful (survey question: What kinds of interprofessional collaborative practice training do you think would be most helpful to you?).

Table 2.

Interprofessional Collaborative Practice (IPC) Skills Assessed in Three Cases (n = 162) and in Specific Cases (n = 22–42) with Internal Consistency Estimates (Cronbach’s Alpha)

IPC competency domain | Checklist item (skill) | % Not done (n) | % Partly done (n) | % Well done (n) | Behavioral descriptor of well done | Cronbach’s alpha | Mean % well done (n = 162)
(The three percentage columns give the frequency distribution of residents for each item; the last two columns give domain and total summary scores for the entire sample.)

Generic IPC Skills
Values/Ethics | Expresses value for the SN’s information | 4 % (6) | 38 % (62) | 58 % (94) | Attends to info; expresses value | 0.80 | 49 % (SD 40 %)
Values/Ethics | Responds well to SN suggestions | 2 % (3) | 58 % (94) | 40 % (65) | Listened to and acted on | |
Values/Ethics | Treats SN respectfully | 1 % (2) | 49 % (79) | 50 % (81) | Treats SN respectfully | |
Values/Ethics | Responds to SN as a person | 1 % (2) | 51 % (83) | 48 % (78) | Responds to SN as an equal person | |
Roles/Responsibilities | Introduces self and role | 9 % (15) | 63 % (102) | 28 % (45) | Introduces both | 0.81 | 37 % (SD 37 %)
Roles/Responsibilities | Discusses roles and responsibilities | 4 % (6) | 46 % (75) | 50 % (81) | Discusses roles and responsibilities | |
Interprofessional Communication | Fully explores SN knowledge of problem | 7 % (11) | 67 % (109) | 26 % (42) | Fully explores | 0.77 | 27 % (SD 36 %)
Interprofessional Communication | Explores SN assessment of situation | 14 % (23) | 62 % (100) | 24 % (39) | Fully explores, including SN conclusions | |
Interprofessional Communication | Determines what SN has done | 0 % (0) | 68 % (110) | 32 % (52) | Elicits and attends to SN’s report | |
Teamwork | Develops interprofessional follow-up plan | 3 % (5) | 50 % (81) | 47 % (76) | Develops follow-up plan with SN | N/A | 47 % (SD 29 %)
TOTAL IPC SCORE | | | | | | 0.78 | 41 % (SD 29 %)

Case Specific IPC Skills
Diabetes and Hypertension (n = 42) | Uses SBAR format to present case | 2 % (1) | 42 % (18) | 56 % (24) | Fully employs SBAR in presenting case | N/A1 |
Diabetes and Hypertension (n = 42) | Uses CUS to communicate mistake | 60 % (25) | 15 % (6) | 25 % (11) | Communicates with respectful use of CUS | N/A1 |
Diabetes (n = 20) | Makes maximum use of SN ability to educate the patient (glucometer) | 4 % (1) | 74 % (16) | 22 % (5) | Specifically discusses and confirms education SN can provide | N/A1 |
Diabetes (n = 20) | Uses check back for medications and dosages | 61 % (13) | 26 % (6) | 13 % (3) | Uses check back; meds and dosage | N/A1 |

1Cronbach’s alpha not calculated; each item is conceptually distinct and not part of a summary score

SN standardized nurse; SBAR Situation, Background, Assessment, Recommendation; CUS Concern about the situation, Uncomfortable with the situation, Safety of the patient is at risk

Analyses

Table 2 summarizes frequency distributions of residents’ performance for the generic and case-specific IPC items. Mean domain scores (percent of domain items rated as well done, calculated for each resident and then averaged across the full sample) were compared using repeated measures ANOVA to determine the significance of differences between domains and thus the relative strengths and weaknesses across IPC skill domains. Since the PC program has used all three cases at some point in its OSCEs, we were able to compare overall generic IPC scores (% well done across all generic IPC items) by case within this sub-group of residents (one-way ANOVA with three cases as the factor; chi-square for the single-item domain of the IPC plan; not tabled) to see whether IPC performance differed significantly among cases. Finally, we calculated correlation coefficients (Pearson’s r) to explore the degree to which IPC skills were associated with the other core clinical skills assessed in the OSCE (not tabled).
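As a rough illustration of how these comparisons can be run, the sketch below uses general-purpose Python statistics libraries (statsmodels for the repeated measures ANOVA, scipy for the one-way ANOVA and Pearson correlation) on simulated data. It is a hypothetical reconstruction of the analysis plan under assumed variable names, not the authors’ code.

```python
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
domains = ["values_ethics", "roles", "ipc_communication", "teamwork"]
cases = ["hypertension", "diabetes", "chest_pain"]
residents = [f"R{i}" for i in range(30)]

# Hypothetical long-format data: one row per resident x IPC domain.
# "score" stands in for the percent of that domain's items rated "well done";
# "comm_score" stands in for the resident's OSCE communication score.
scores = pd.DataFrame(
    [
        {
            "resident": r,
            "domain": d,
            "case": cases[i % 3],               # case this resident happened to see
            "score": rng.uniform(0, 100),
            "comm_score": rng.uniform(0, 100),
        }
        for i, r in enumerate(residents)
        for d in domains
    ]
)

# 1. Repeated measures ANOVA: do mean scores differ across the four IPC domains?
rm_results = AnovaRM(data=scores, depvar="score", subject="resident", within=["domain"]).fit()
print(rm_results.anova_table)

# 2. One-way ANOVA: does overall generic IPC performance differ by case?
overall = scores.groupby(["resident", "case"], as_index=False)["score"].mean()
groups = [g["score"].values for _, g in overall.groupby("case")]
print(stats.f_oneway(*groups))

# 3. Pearson correlation between each resident's overall IPC score and
#    another OSCE skill score (e.g., communication).
per_resident = scores.groupby("resident", as_index=False).agg(
    ipc=("score", "mean"), comm=("comm_score", "mean"))
print(stats.pearsonr(per_resident["ipc"], per_resident["comm"]))
```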

SNs’ comments about resident performance and residents’ suggestions for foci of IPC training were each coded by one of the authors (CG) through an iterative coding process in which related themes were grouped into broader thematic categories. Co-authors (ASB, SZ, JA) reviewed the coding and themes. Each resident’s response was counted within the category with which it was most closely aligned.

RESULTS

Participants

Half of our 162 residents are male, 80 % are younger than 29 years old, 56 % are white, and 8 % are from economically disadvantaged backgrounds; this distribution does not differ significantly from the full population of medicine residents (n = 199 enrolled in the residency programs; n = 178 who completed the OSCEs; response rate = 81 %).

IPC Performance: By Item, Domain, and Overall

As shown in Table 2’s item-level frequencies, most residents attempted to address IPC behaviors, with only a few failing to perform the indicated behaviors at all (not done). For example, only 24 % of residents were judged by the evaluators to have fully explored the SN’s assessment of the situation; however, 62 % partly did this task. Fewer than half the residents were rated as having performed well on the following items: introducing self and role, determining what the SN had done for the patient, responding well to the SN’s suggestions, responding to the SN as a person/treating the SN as a colleague, fully exploring the SN’s knowledge of the problem, fully exploring the SN’s assessment of the situation, and discussing/developing an interprofessional follow-up plan with the SN. The IPC domains we used in the assessment of performance were internally consistent (Cronbach’s alpha; Table 2). Residents’ mean percent of all items rated well done (overall summary score) was 41 %, with substantial variation (SD 29 %). Average scores were highest for the values/ethics domain (mean 49 %, SD 40 %) and lowest for interprofessional communication (mean 27 %, SD 36 %) (repeated measures ANOVA F = 19.43, p < 0.001).

Case-Specific Performance

Residents struggled with case-specific IPC tasks; a quarter or fewer residents were rated as well done in using the CUS method to communicate a mistake, making maximum use of the nurse for patient education in a diabetes case, and using the “check back” technique to confirm accuracy of medications and dosages.

Comparison of Performance by Case

Among the primary care residents who saw all three cases, the diabetes case appeared to be the most difficult: domain scores for values and for the interprofessional plan item were significantly lower for this case (values domain mean % well done = 42 %, SD 34 %; IPC plan item 26 % well done) than for hypertension (values domain mean % well done = 71 %, SD 35 %; IPC plan item 62 % well done) and for chest pain (values domain mean % well done = 85 %, SD 23 %; IPC plan item 54 % well done) (values domain one-way ANOVA F = 11.56, p < 0.001; IPC plan item chi-square = 5.86, p = 0.003; not tabled).

Correlations Between Interprofessional Collaborative Practice Skills and Other Core Clinical Skills Assessed in OSCE (n = 162)

IPC performance was not correlated significantly with any of the core clinical skills assessed in the OSCE: communication r = 0.04, p = 0.61; history gathering r = 0.09, p = 0.25; physical exam r = −0.08, p = 0.31; case-specific patient education and counseling r = 0.14, p = 0.08; treatment plan and management r = −0.09, p = 0.25; and patient centeredness r = −0.08, p = 0.32.

Standardized Nurses’ Open-Ended Comments on Residents’ Interprofessional Skills

Three major themes (Table 3) were identified in these comments: values/respect (demonstrating respect and partnership); how residents’ interactions with the SN could influence patient care (from failing to start with what the SN already knew to missing opportunities to truly collaborate); and evidence that IPC skills were distinct from other skills, with two main patterns. The SNs described how residents appeared to differ in their ability to balance the goals of responding to the patient, interacting effectively with the SN, and addressing the core challenges of the case: some interacted very well with the patient but then failed to use those same skills with the SN, while others interacted effectively with both the SN and the patient but failed to address the core challenges of the case.

Table 3.

Standardized Nurse Comments Regarding the Interprofessional Collaborative (IPC) Skills of Residents: Three Major Themes and Exemplar Quotes

Values/Respect | IPC communication skills influenced patient care | IPC as a distinct skill
• Took charge and was clear with his instructions, but could create more of a sense of partnership with his nurse (didn’t include me as part of the team).
• Great with the patient, but didn’t treat me as part of the process; did not collaborate with me in the process besides asking me for things.
• Started out okay, but by the end I was miffed. I didn’t feel respected or collaborated with, in fact, it somehow was very clear that he expected me to make the call, but never asked or even suggested. He is clearly knowledgeable, but that's not enough. He came, he heard, he told me what to do.
• Friendly, respectful and made me want to be as helpful to her as possible. She made me feel like I was good at my job and really on the ball. I liked her, and if I were a nurse, I’m sure I’d love working with her if she treats all her colleagues this way.
• Did well, but should be careful not to rush and to use the nurse more effectively. Took the information from me, but needs to remember what the nurse says is vital from the start.
• Great work with the diagnosis and acting quickly. Should be careful to get all the information from the nurse to start to better set up diagnosis and treatment. Asked the patient many questions from the start and then the nurse. Should have asked for all the nurse’s information up front.
• Seemed rushed and did not ask me as the nurse for the information I had gathered before he arrived. Went directly to the patient (great interaction), but did not ask me to collaborate in the process. Great information gathering from the patient and showing of concern for her. Great with the patient but didn’t treat the nurse like part of the process, except for asking me for things (did not elicit information).
• Great reaction to the seriousness of the situation. She gathered information quickly and then shared her diagnosis ideas out loud—wonderful collaboration. Used the nurse as a resource. Fast action to get patient basic care that was vital.
• Has a very calm presence and was very thoughtful in responses to the patient’s anxiety. Could use the nurse as a teammate in the future.
• Was very professional and efficient. Cut right to the important things and listened well to the patient. He used his resources well, but could check in more with the nurse as a partner.
• She was personable, easy to talk to, and made me feel like an important part of the patient’s health care team. She did miss quite a lot of stuff. She didn’t discuss the BP goals, didn’t look at the notes to double check and never repeated the BP to me.
• Great bedside manner and communication. She worked well with me and was respectful. Problem was, she did not diagnose correctly.

Residents’ Suggestions for IPC Training

Residents’ suggestions for training fell into five broad categories and are reported in Table 4. Of the 100 medicine residents who responded to a survey about IPC competencies (62 % response rate), 5 % (n = 5) reported receiving no formal training in IPC during residency, 30 % reported only a little training, 46 % reported receiving some training, and 19 % reported participating in a substantial amount of training. Fifty-eight residents (58 %) responded to an open-ended question asking for suggestions for IPC training. Fifteen residents felt that training should focus on enhancing understanding of roles and responsibilities; 11 suggested specific activities and structured time/space for the actual work of teamwork, including, for example, team huddles, round table meetings, daily rounds, and open forums; ten identified the need for more opportunities for simulation; and six identified specific skills upon which to focus, namely communication, conflict resolution and leadership skills. Finally, six residents identified workplace-based approaches focused on modeling, situational training, discussion and trouble-shooting, and observation and evaluation of team dynamics as needed.

Table 4.

Residents’ Suggestions for Interprofessional Collaborative (IPC) Practice Training (n = 58 Open-Ended Comments from 100 Completed Surveys)

Training focus | Number of residents who mentioned it | Examples of IPC training needs
Information on Roles 15 • A formal orientation of each team members skills vis-à-vis their areas of professional strengths and weaknesses
• More about understanding each other’s role and pressures that other disciplines are facing
• Discussion of roles and when to refer to another part of the team for help
• Developing a better idea of what nursing staff does on a regular basis. This would work both ways: For nurses to see what we do on a regular basis
• Understanding roles of each member of the team and the best ways to communicate with them
• Might be helpful to understand/hear about the types of issues different team members face, and how the team as a whole could work together to resolve the issue
Structures for “Work” of Teamwork 11 • Daily rounds, small group conferences, and research projects
• Going on rounds together
• Having a team meeting at the middle of a rotation to help formally assess how things are going and how things can be improved
• More outpatient interprofessional teamwork or rounds
• More team huddles. They should occur before every clinic session and ideally inpatient rounds would have each patient’s nurse and social worker involved
• Open forum, minimizing conflict, between different groups (i.e., doctors, nurses, social workers)
• Round table meetings
• Routine meetings/huddles
• Setting a specific time for team huddles in clinic
Practice (Simulation) 10 • An OSCE that includes social work, medical students, resident, intern and attending
• More simulation sessions and workshops
• OSCEs with SNs
• Scenario simulations
• Rapid Response Team simulations
Development of Specific Skills 6 • Communication skills
• Conflict resolution skills
• Leadership skills
Workplace-Based (in Situ) 6 • Modeling by senior team members; feedback and instruction in the moment
• Situational training in groups
• The most helpful would just be doing actual everyday activities then going over everybody’s role prior to and after these activities to establish who can do what
• Discussion and troubleshooting of problematic team dynamics
• Observational evaluation by a trainer on the wards with constructive feedback for individuals and on team dynamics

OSCE Objective Structured Clinical Exam; SN standardized nurse

DISCUSSION

Though IPC skills are now a core residency competency and a majority of residents report having received some IPC training, our OSCE cases documented deficits in IPC skills. Residents readily identified gaps in their knowledge (e.g., roles and responsibilities) and skills (e.g., communication and conflict resolution) and recognized their need for more training and practice, even suggesting specific settings and opportunities. This level of self-awareness is reassuring. We were heartened that residents’ scores in the values/ethics domain were higher than in the other domains, as these values are the foundation upon which skills can be built; however, there was still substantial room for improvement.

We found IPC skills to be a distinct domain of competence largely unrelated to other core clinical skills, including physician–patient communication. By adding the nursing point of view, we were able to identify two resident patterns: either effectively interacting with the patient but not with the SN, or effectively interacting with both the patient and SN, but not handling core IPC challenges of the case. While this may be in part an artifact of the assessment of a complex set of skills in a time limited simulation, it may provide insight into what and how residents prioritize under such circumstances.

Though medical educators advocate for common assessment criteria for interprofessional collaboration milestones, there is much work to be done.22 There are few instruments that meet these criteria for addressing IPC milestones at the level of the individual dyad, and there are limited opportunities for workplace assessment of these important skills. Direct workplace observation of a resident giving feedback to an interprofessional colleague may not be feasible, and simply placing learners in team-based care may be insufficient to teach these skills.23,24 Performance-based assessments such as OSCEs may be an ideal, efficient strategy for providing program needs assessments and educational opportunities for these critical but difficult-to-observe milestones.

Patient safety and quality of care are at risk if residents are unable to communicate effectively with the other members of the health care team. Although some residents have been exposed to these models,25 our residents were less effective than we expected on two process measures from AHRQ’s TeamSTEPPS materials: using the SBAR format to organize information when presenting to the SN and using the CUS format to identify and respond to an error. In our data, the tasks assessed in the context of caring for a patient with diabetes were more difficult than those in the other clinical cases. This case distinguished among residents who could integrate their clinical skills, IPC values, and communication skills with another health care provider over the phone to address a medication error. Maximizing residents’ interprofessional teamwork will most likely require additional workplace experience and debriefing for residents to become competent in these critical skills.

This analysis provides rich and actionable data on resident physicians’ IPC skills at the program level. We describe an educationally useful, feasible, and scalable method for evaluating both curriculum innovations and individual IPC practice skills. The current formal and informal residency curriculum is at least partially successful in supporting positive values and ethics with regard to working in an interprofessional team; however, this alone does not ensure skills competence. For our program, interpreting the OSCE performance data and the SP and SN comments, along with residents’ views on gaps and preferences for training in this domain, led to the following recommendations for refining the IPC curriculum: 1) physician trainees, even those who have spent significant time in the workplace, need formal education about the roles and responsibilities of the non-physician members of the health care team; 2) residents requested both more simulation-based practice with, and structured workplace experience of, team huddles, round table meetings, and daily interdisciplinary rounds; 3) our programs must ensure that attending physicians and nurses in the workplace are good role models for trainees in these domains; and 4) specific IPC skills education is needed, namely communication, conflict resolution, and leadership skills.

This type of analysis has several limitations. Our data were collected as part of an annual formative OSCE designed not solely to highlight IPC skills but to assess a broad cross-section of resident skills at a single institution. Assessment of skills based on a single performance in simulation should be interpreted cautiously and is certainly not adequate for high-stakes decision-making about the competence of an individual trainee. We focused on only one discipline other than medicine, nursing, because it represents residents’ most frequent interprofessional collaboration; however, this limits generalizability to other professions. We are designing cases that include more disciplines.

CONCLUSION

OSCEs can be used as a reliable and educational method for assessing residents’ interprofessional collaborative practice skills and for residency program evaluation. Residents recognize the need for more training and have excellent suggestions about how it should be undertaken. Though residents’ current IPC skills are not uniformly at the desired level, we are optimistic: residents demonstrated that they are familiar with the IPC skills expected of them and are eager to learn more.

Acknowledgements

The authors would like to recognize the valuable contributions of the interprofessional members of the Research on Medical Education Outcomes (ROMEO) group, the faculty, and the learners who participated in the IPC OSCEs.

This project was supported in part by HRSA PC residency training grant D58HP10328 and HRSA Bureau of Health Professions Academic Administrative Unit grant D54HP05446.

Contributors

According to the definition given by the International Committee of Medical Journal Editors (ICMJE), all authors qualify for authorship based on making one or more of the substantial contributions to the intellectual content: conception and design, acquisition of data, and/or analysis and interpretation of data. Additionally, all authors contributed to the writing and editing of the manuscript.

Compliance with Ethical Standards

Prior Presentations

Oral: “Using OSCE Cases to Assess Resident Physicians’ Competence in Interprofessional Collaborative Practice” C. Gillespie PhD; J. Adams MD; M. Horlick MD; B. Porter MD; K. Hanley MD; J. Fox BA; A. Burgess BA; S. Zabar MD. SGIM Annual Meeting, April 24-27, 2013, Denver, CO.

Oral: “Using OSCE Cases to Assess Resident Physicians’ Competence in Interprofessional Collaborative Practice” J. Adams MD; A. Burgess BA; J. Fox BA; C. Gillespie PhD; K. Hanley MD; B. Porter MD; S. Zabar MD. AAMC Annual Meeting, November 1-6, 2013, Philadelphia, PA.

Oral: “Using OSCE Cases to Assess Resident Physicians’ Competence in Interprofessional Collaborative Practice” C. Gillespie PhD; B. Porter MD; K. Hanley MD; J. Adams MD; J. Fox BA; S. Zabar MD. Ottawa Conference, April 25-29, 2014, Ottawa, Ontario, Canada.

Conflict of Interest

The authors declare that they do not have a conflict of interest.

REFERENCES

  • 1.Barnsteiner JH, Disch JM, Hall L, Mayer D, Moore SM. Promoting interprofessional education. Nurs Outlook. 2007;55:144–50. doi: 10.1016/j.outlook.2007.03.003. [DOI] [PubMed] [Google Scholar]
  • 2.Carlisle C, Cooper H, Watkins C. “Do none of you talk to each other?”: the challenges facing the implementation of interprofessional education. Med Teach. 2004;26:545–52. doi: 10.1080/61421590410001711616. [DOI] [PubMed] [Google Scholar]
  • 3.Barr H, Koppel I, Reeves S, Hammick M and Freeth DS. Effective interprofessional education: argument, assumption and evidence (Promoting Partnership for Health), vol. 39. John Wiley & Sons; 2008.
  • 4.Interprofessional Education Collaborative Expert Panel. Core competencies for interprofessional collaborative practice. 2011.
  • 5.Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med. 2012;366(11):1051–6. doi: 10.1056/NEJMsr1200117. [DOI] [PubMed] [Google Scholar]
  • 6.Symonds I, Cullen L, Fraser D. Evaluation of a formative interprofessional team objective structured clinical examination (ITOSCE): a method of shared learning in maternity education. Med Teach. 2003;25(1):38–41. doi: 10.1080/0142159021000061404. [DOI] [PubMed] [Google Scholar]
  • 7.Cullen L, Fraser D, Symonds I. Strategies for interprofessional education: the interprofessional team objective structured clinical examination for midwifery and medical students. Nurse Educ Today. 2003;23:427–33. doi: 10.1016/S0260-6917(03)00049-2. [DOI] [PubMed] [Google Scholar]
  • 8.Hall P, Marshall D, Weaver L, Boyle A, Taniguchi A. A method to enhance student teams in palliative care: piloting the McMaster-Ottawa team observed structured clinical encounter. J Palliat Med. 2011;14(6):744–50. doi: 10.1089/jpm.2010.0295. [DOI] [PubMed] [Google Scholar]
  • 9.Morison SL, Stewart MC. Developing interprofessional assessment. 2005;192–202.
  • 10.Gillespie C, et al. ‘We might as well be speaking different languages’: an innovative interprofessional education tool to teach and assess communication skills critical to patient safety. BMJ Simul Technol Enhanc Learn. 2015;1(2):54–60. [DOI] [PMC free article] [PubMed]
  • 11.Muller-Juge V, et al. Interprofessional collaboration on an internal medicine ward: role perceptions and expectations among nurses and residents. PLoS One. 2013;8(2):e57570. doi: 10.1371/journal.pone.0057570. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Abu-Rish E, et al. Current trends in interprofessional education of health sciences students: a literature review. 2012;26:444–51. [DOI] [PMC free article] [PubMed]
  • 13.Canadian Interprofessional Health Collaborative. An inventory of quantitative tools measuring interprofessional education and collaborative practice outcomes. 2012.
  • 14.Rosen MA, Weaver SJ, Lazzara EH, et al. Tools for evaluating team performance in simulation-based training. J Emerg Trauma Shock. 2010;3(4):353–9. doi: 10.4103/0974-2700.70746. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 15.Hayward MF, Curran V, Curtis B, Schulz H, Murphy S. Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents. BMC Med Educ. 2014;14:1049. doi: 10.1186/s12909-014-0279-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Chiu CJ, Brock D, Abu-Rish E, Vorvick L, Wilson S, Hammer D, Schaad D, Blondon K, Zierler B. Performance assessment of communication and teamwork (PACT) tool set. u.d. Retrieved from: http://collaborate.uw.edu/educators-toolkit/tools-for-evaluation/performance-assessment-of-communication-and-teamwork-pact-too. Accessed 01/25/2016.
  • 17.Simmons B, Egan-Lee E, Wagner SJ, Esdaile M, Baker L, Reeves S. Assessment of interprofessional learning: the design of an interprofessional objective structured clinical examination (iOSCE) approach. J Interprof Care. 2011;25:73–4. doi: 10.3109/13561820.2010.483746. [DOI] [PubMed] [Google Scholar]
  • 18.Gillespie C, Zabar S, Altshuler L, Fox J, Pusic M, Xu J, Kalet A. The Research on Medical Education Outcomes (ROMEO) registry: addressing ethical and practical challenges of using ‘Bigger’, longitudinal educational data. Acad Med. 2015 doi: 10.1097/ACM.0000000000000920. [DOI] [PubMed] [Google Scholar]
  • 19.Zabar S, Hanley K, Altshuler L, et al. Do clinical skills assessed in OSCEs transfer to the real world of clinical practice? using unannounced standardized patient visits to assess transfer. AAMC Med Educ Conf. 2014;6.
  • 20.Interprofessional Education Collaborative Expert Panel. Core competencies for interprofessional collaborative practice: report of an expert panel. 2011.
  • 21.Appendix: examples of the SBAR and CUS tools: improving patient safety in long-term care facilities, Module 2. Agency Healthc Res Qual. 2014. http://www.ahrq.gov/professionals/systems/long-term-care/resources/facilities/ptsafety/ltcmod2ap.html
  • 22.Wingo MT, et al. Interprofessional collaboration milestones: advocating for common assessment criteria in graduate medical education. BMC Med Educ. 2015;15.1:1. doi: 10.1186/s12909-015-0432-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 23.Soones TN, O’Brien BC, Julian KA. Internal medicine residents’ perceptions of team-based care and its educational value in the continuity clinic: a qualitative study. J Gen Intern Med. 2015;30(9):1279–85. doi: 10.1007/s11606-015-3228-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 24.Havyer RDA, et al. Teamwork assessment in internal medicine: a systematic review of validity evidence and outcomes. J Gen Intern Med. 2014;29(6):894–910. doi: 10.1007/s11606-013-2686-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Triola M, Djukic M et al. NYULMC division of educational informatics: NYU3T: teaching, technology, teamwork. NYU Sch Med NYU Coll Nurs Curric. http://dei.med.nyu.edu/research/nyu3t
