Abstract
Background
Patients can contribute to resident assessment in Competence by Design (CBD). This study explored the extent and nature of patient involvement in resident assessment, as well as the factors that facilitate and hinder it, within and across Canadian specialty, sub-specialty, and special programs that are transitioning or have transitioned to CBD.
Methods
We used a two-phase sequential explanatory mixed-methods design. In Phase 1, we surveyed program directors (PDs). In Phase 2, we interviewed PDs from Phase 1.
Results
In Phase 1, 63 of 101 respondents (62.4%) in the CBD preparation stage did not know whether patients will be involved in resident assessment, 21 (20.8%) will involve patients, and 17 (16.8%) will not involve patients. Of the 33 respondents in the field-testing or implementation stages, 24 (72.7%) do not involve patients in resident assessment, five (15.2%) do involve patients, and four (12.1%) do not know whether they involve patients. In Phase 2, 12 interviewees raised nine factors that facilitate or hinder patient involvement, including patients’ interests and abilities, guidelines and processes for patient involvement, type of Entrustable Professional Activities, type of patient interactions in programs, and support from healthcare organizations.
Conclusion
Patient involvement in resident assessment is limited. We need to engage in discussions on how to support such involvement within CBD.
Introduction
Competency-based medical education (CBME) requires multiple assessors of residents,1,2 authentic assessment, and multi-faceted assessment programs.1,3 Given the emphasis that CBME places on assessment for learning as well as the range of competencies included in, for example, the CanMEDS framework, it is essential to develop assessment programs that encompass a range of assessment strategies and assessors.1 Patients and family members or caregivers (herein referred to as patients) can, if given the opportunity, contribute to these assessment programs. Residency programs can include assessments by patients as part of multisource feedback (MSF).4 Used for the assessment of competencies, MSF can include the collection of data on residents’ performance from peers, supervising physicians, allied health professionals, and patients. By including patient assessments in MSF, residency programs can ensure that patients’ first-hand experiences with residents, as well as their perceptions of residents’ skills and abilities, are included as part of the learning process.
Researchers suggest that residents need assessments from patients to improve their performance,5 to develop and maintain patient-centred professionalism,6 to develop positive physician-patient relationships,7 and to understand how their interactions with patients impact health outcomes.8 Residents have reported high satisfaction with and utilization of patient assessments.9,10 Patient involvement in resident assessment can also empower patients to improve the care that they and others receive.11,12 In recognition of these mutual benefits, the Royal College of Physicians and Surgeons of Canada (RCPSC) recommends the involvement of patients in resident assessment in their CBME model, entitled Competence by Design (CBD). However, there is limited research on patient involvement in resident assessment.13 Towle and Godolphin’s bibliography on patient involvement in health professions education noted that a mere 29 of 657 (4.4%) studies focused on patient involvement in assessment, seven of which are from medicine.14 These studies found that patients can assess residents’ professionalism, collaboration, interpersonal abilities, and communication skills.7,15-17 Authors also showed that patients can provide reliable assessments of residents’ skills.12,18,19 Nonetheless, these studies did not explore the extent and nature of patient involvement in resident assessment within the emerging Canadian CBD context. Thus, the purpose of this two-phase sequential explanatory mixed-methods study was to explore and document the current state of and plans for patient involvement in resident assessment within these CBD programs to identify resources and activities needed to advance and sustain this important component of resident assessment. The following research questions, as they related to CBD, guided our study:
Phase 1: Survey
To what extent are patients or will patients be involved in resident assessment?
Why are patients or will patients be involved or not involved in resident assessment?
What skills and abilities can patients assess?
Phase 2: Interviews
What factor(s) facilitate and hinder patient involvement in resident assessment?
Methods
Study design
We used a two-phase sequential explanatory mixed-methods design.20 Phase 1 encompassed quantitative survey data from program directors (PDs) whose programs were in the CBD preparation, field-testing, or implementation stages at the time of the study. We then used the findings from Phase 1 to inform participant-level questions and identify participants for Phase 2, which comprised qualitative interview data from selected PDs. Our rationale for using interviews was pragmatic: when designing the study, we recognized that it would be challenging to schedule focus groups with busy PDs but that one-on-one interviews would allow PDs to participate at times most convenient for them. Nevertheless, the quantitative and qualitative portions of the study were equally important in addressing the research topic.20 We opted for this mixed-methods design because we anticipated that the qualitative interview findings (Phase 2) would build on the initial quantitative survey results (Phase 1) and thus provide a more comprehensive understanding of patient involvement in resident assessment than either a quantitative or a qualitative design alone.20 This design also enabled us to pose specific interview questions to PDs who responded “yes,” “no,” or “don’t know” to questions of patient involvement in resident assessment on their Phase 1 survey and thereby elicit accurate descriptions and understandings of the factor(s) that facilitate and hinder patient involvement in resident assessment. We obtained ethics approval for both Phases 1 and 2 from the University of Ottawa (File number 03-17-08).
Phase 1: Survey
Sample
Based on the RCPSC’s website, we identified 17 primary specialties, 12 sub-specialties, and one special program (i.e., Surgical Foundations) that were in the CBD preparation, field-testing, or implementation stages. We distributed the online survey to 462 PDs.
Instrument development
We developed survey dimensions and items using literature on strategies for investigating patient involvement in medical education9,13,21,22 and assessment practices.16,23-25 Survey items pertained to one of the following dimensions: (a) level and type of patient involvement; (b) purpose of assessment; (c) reasons for patient involvement; and (d) skills and abilities. Six reviewers knowledgeable in patient involvement in medical education, assessment, and CBD reviewed the survey. We also piloted it with five former PDs and five physician-educators ineligible for the study. We then translated the 23-item survey into French and piloted it with three former French-speaking PDs.
Data collection procedures
A Research Assistant (RA) emailed the survey link to PDs. We confidentially tracked responders/non-responders and used a modified version of Dillman et al.’s Tailored Design Method.26
Data analysis
We calculated descriptive statistics (i.e., frequencies, percentages) in SPSS (version 24).
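The authors computed these statistics in SPSS. As an illustrative sketch only, using hypothetical responses rather than the study’s data, the same frequency-and-percentage summary can be expressed in a few lines of Python:

```python
from collections import Counter

# Hypothetical survey responses (NOT the study's data) to the question
# "Will patients be involved in resident assessment?"
responses = ["don't know", "yes", "no", "don't know", "yes", "don't know"]

counts = Counter(responses)  # tally each response option
n = len(responses)

# Frequencies and percentages, analogous to an SPSS FREQUENCIES table
summary = {option: (count, round(100 * count / n, 1))
           for option, count in counts.items()}

for option, (count, pct) in summary.items():
    print(f"{option}: {count} ({pct}%)")
```

With the hypothetical responses above, this reports “don’t know” as 3 (50.0%), “yes” as 2 (33.3%), and “no” as 1 (16.7%), mirroring the n (%) format used in the tables below.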
Phase 2: Interviews
Sample
We used purposeful criterion-based sampling to identify and recruit PDs. To be eligible for an interview, PDs had to have completed the Phase 1 survey and expressed interest in participating in a Phase 2 interview.27
Instrument development
We used Phase 1 findings to inform the semi-structured interview guides. We developed separate guides for those who responded “yes,” “no,” or “don’t know” in regard to patient involvement in resident assessment in Phase 1. Each guide consisted of an introductory script and six open-ended questions on resources, values, or conditions that affect programs’ abilities to involve patients in resident assessment. We piloted the guides with five former English-speaking PDs, translated them to French, and then piloted them with three former French-speaking PDs.
Data collection procedures
The RA emailed information letters to eligible PDs, requesting them to reply if they were interested in participating. Each interview lasted 30 to 60 minutes. All interviewees provided informed consent prior to the interview. The RA conducted each interview by telephone, audio-recorded it, and transcribed it verbatim.
Data analysis
Two of us (KM and KE) analyzed the data, recognizing that our interpretations were mediated by our roles as medical education researchers and proponents of patient involvement in medical education and CBD. Miles and Huberman’s three-step iterative process (i.e., data condensation, data displays, drawing and verifying conclusions)28 informed our inductive analyses. The goal of the analyses was to identify the major themes articulated by the PDs representing factors that facilitate or hinder patient involvement in resident assessment. Throughout the analyses, we identified factors that were present in more than one interview.
Following each interview, we independently listened to the audio-recordings to identify factors that interviewees highlighted as facilitators or hindrances to patient involvement in resident assessment. We documented these factors to establish an audit trail. Following transcription, the RA compared the transcripts to the audio-recordings. We then independently reviewed the transcripts and summaries to create our own coding systems. We read each transcript, annotated phrases, and coded the data. We incorporated codes not previously identified in our listening to the audio-recordings. We then met to discuss our independent analyses. Together, we merged our analyses, created data display tables, and generated thematic conclusions, including exemplar quotations.
Results
Phase 1: Survey
Demographics
We obtained 134 completed surveys (response rate of 29.0%). When asked about their programs’ CBD stage, 101 (75.4%) indicated CBD preparation (i.e., the specialty committee is working with the RCPSC to prepare the program for CBD), 21 (15.7%) selected CBD field-testing (i.e., the program is field testing aspects of CBD), and 12 (9.0%) stated CBD implementation (i.e., residents are in a CBD-based program). Tables 1 and 2 provide demographic details.
Table 1.
Survey respondents’ demographic characteristics (N = 134)
| Characteristic | n (%) |
|---|---|
| Gender | |
| Female | 61 (45.5) |
| Male | 64 (47.8) |
| Prefer not to specify | 9 (6.7) |
| Years working as a Program Director | |
| <12 months | 32 (23.9) |
| 1-5 years | 65 (48.5) |
| 6-10 years | 25 (18.7) |
| 11-15 years | 4 (3.0) |
| >20 years | 1 (0.7) |
| Prefer not to specify | 7 (5.2) |
| Region of Canada | |
| British Columbia | 12 (9.0) |
| Alberta | 14 (10.4) |
| Saskatchewan | 6 (4.5) |
| Manitoba | 7 (5.2) |
| Ontario | 38 (28.4) |
| Québec | 36 (26.9) |
| Nova Scotia | 6 (4.5) |
| Newfoundland and Labrador | 1 (0.7) |
| Prefer not to specify | 14 (10.4) |
Table 2.
Survey respondents’ programs (N = 134)
| Program | n (%) |
|---|---|
| Anesthesiology | 4 (3.0) |
| Cardiology | 10 (7.5) |
| Critical Care Medicine | 8 (6.0) |
| Emergency Medicine | 4 (3.0) |
| Gastroenterology | 2 (1.5) |
| General Internal Medicine | 7 (5.2) |
| Geriatric Medicine | 2 (1.5) |
| Internal Medicine | 6 (4.5) |
| Medical Oncology | 5 (3.7) |
| Neonatal-Perinatal Medicine | 2 (1.5) |
| Nephrology | 6 (4.5) |
| Pediatrics | 12 (9.0) |
| Physical Medicine & Rehabilitation | 3 (2.2) |
| Psychiatry | 5 (3.7) |
| Radiation Oncology | 6 (4.5) |
| Respirology | 4 (3.0) |
| Rheumatology | 6 (4.5) |
| Cardiac Surgery | 2 (1.5) |
| General Surgery | 5 (3.7) |
| Neurosurgery | 3 (2.2) |
| Obstetrics & Gynecology | 7 (5.2) |
| Otolaryngology | 6 (4.5) |
| Plastic Surgery | 2 (1.5) |
| Surgical Foundations | 3 (2.2) |
| Urology | 3 (2.2) |
| Anatomical Pathology | 2 (1.5) |
| Clinical Immunology & Allergy | 3 (2.2) |
| Forensic Pathology | 1 (0.7) |
| General Pathology | 1 (0.7) |
| Nuclear Medicine | 4 (3.0) |
To what extent are patients or will patients be involved in resident assessment?
Of the 101 respondents in the CBD preparation stage, 63 (62.4%) stated that they did not know if patients will be involved in resident assessment, 21 (20.8%) said that they will involve patients, and 17 (16.8%) indicated they will not involve patients. Among the 33 respondents in the field-testing or implementation stages, 24 (72.7%) said that they do not involve patients in resident assessment, five (15.2%) stated that they do involve patients, and four (12.1%) stated that they do not know if they involve patients.
Preparation stage.
Ten (47.6%) of the 21 respondents in the preparation stage who indicated that they will be involving patients in resident assessment stated that they did not know how often patients will be involved over a four-week period. The remaining respondents stated that patients will never (n=2; 9.5%), rarely (n=4; 19.0%), sometimes (n=4; 19.0%), or frequently (n=1; 4.8%) be involved over a four-week period. Twelve (57.1%) of these 21 respondents stated that patients will be involved in formative assessment, one (4.8%) stated that patients will be involved in summative assessment, and eight (38.1%) responded that they did not know if patients will be involved in formative or summative assessment. When asked if patients will provide written assessments, ten of the 21 respondents (47.6%) noted that they did not know, six (28.6%) indicated that patients will provide written assessments, and five (23.8%) said that patients will not. In regard to verbal assessments, over half (n=11; 52.4%) of the 21 respondents said that patients will provide verbal assessments, seven (33.3%) said they did not know, and three (14.3%) said no. Four (19.0%) of the 21 answered that patients will assess residents in Objective Structured Clinical Examinations (OSCEs), and all 21 confirmed that patients will not assess residents in simulation. All 21 respondents indicated that patients will not be involved in assessment tool development.
Field-testing or implementation stage.
Of the five respondents in the field-testing or implementation stages who stated that they involve patients in resident assessment, four (80.0%) stated that patients are rarely involved in the assessment of residents over a four-week period and one (20.0%) indicated that they are frequently involved. All five stated that patients are involved in formative rather than summative assessment. Three (60.0%) indicated that patients provide written assessments, and two (40.0%) said that patients provide verbal assessments. All respondents confirmed that patients are not involved in assessments for OSCEs or simulation. They also noted that patients are not involved in assessment tool development.
Why are patients or will patients be involved or not involved in resident assessment?
The main reason why respondents in the preparation and field-testing or implementation stages will be involving or do involve patients in resident assessment is to provide and gain firsthand information on the care provided by residents. This reason was cited by 16 (76.2%) of the 21 programs in the preparation stage and four (80.0%) of the five programs in the field-testing or implementation stages that will be involving patients in assessment. Other reasons indicated by respondents included: (a) to improve the quality of care (57.1% preparation stage, 40.0% field-testing or implementation stages); (b) to empower patients (28.6% preparation stage, 20.0% field-testing or implementation stages); and (c) to satisfy program accreditation requirements (19.0% preparation stage, 40.0% field-testing or implementation stages). Appendix A provides a list of reasons why patients might not be or are not involved in resident assessment.
What skills and abilities can patients assess?
Table 3 presents respondents’ views on which skills and abilities patients can (and cannot) assess. Across the CBD stages and irrespective of patient involvement in resident assessment, the majority of respondents thought that patients could assess residents’ communication and respectfulness.
Table 3.
Skills and abilities patients can assess
| Patient Involvement | Skills and Abilities | Preparation Stage n (%) | Field-testing or Implementation Stages n (%) |
|---|---|---|---|
| Yes | | N = 21 | N = 5 |
| | Communication | 20 (95.2) | 4 (80.0) |
| | Team work | 8 (38.1) | 2 (40.0) |
| | Leadership | 4 (19.0) | 1 (20.0) |
| | Situational awareness (“Knowing what is going on around you”) | 8 (38.1) | 1 (20.0) |
| | Decision making | 4 (19.0) | 0 (0.0) |
| | Coping with stress | 5 (23.8) | 0 (0.0) |
| | Coping with fatigue | 1 (4.8) | 0 (0.0) |
| | Respectfulness | 18 (85.7) | 4 (80.0) |
| | Punctuality | 13 (61.9) | 1 (20.0) |
| | Awareness of limitations | 6 (28.6) | 1 (20.0) |
| | Ability to ask for help | 10 (47.6) | 1 (20.0) |
| | Comfort level in a clinical setting | 12 (57.1) | 3 (60.0) |
| | Adaptability | 4 (19.0) | 1 (20.0) |
| | Managing workloads | 1 (4.8) | 1 (20.0) |
| | Resolving conflicts | 8 (38.1) | 1 (20.0) |
| | Patients cannot assess residents | 0 (0.0) | 0 (0.0) |
| Don’t Know | | N = 63 | N = 4 |
| | Communication | 58 (92.1) | 4 (100.0) |
| | Team work | 21 (33.3) | 1 (25.0) |
| | Leadership | 12 (19.0) | 1 (25.0) |
| | Situational awareness (“Knowing what is going on around you”) | 28 (44.4) | 3 (75.0) |
| | Decision making | 16 (25.4) | 2 (50.0) |
| | Coping with stress | 11 (17.5) | 3 (75.0) |
| | Coping with fatigue | 2 (3.2) | 1 (25.0) |
| | Respectfulness | 56 (88.9) | 3 (75.0) |
| | Punctuality | 39 (61.9) | 3 (75.0) |
| | Awareness of limitations | 19 (30.2) | 3 (75.0) |
| | Ability to ask for help | 17 (27.0) | 2 (50.0) |
| | Comfort level in a clinical setting | 30 (47.6) | 2 (50.0) |
| | Adaptability | 14 (22.2) | 2 (50.0) |
| | Managing workloads | 3 (4.8) | 2 (50.0) |
| | Resolving conflicts | 19 (30.2) | 3 (75.0) |
| | Patients cannot assess residents | 0 (0.0) | 0 (0.0) |
| No | | N = 17 | N = 24 |
| | Communication | 14 (82.4) | 20 (83.3) |
| | Team work | 3 (17.6) | 5 (20.8) |
| | Leadership | 2 (11.8) | 2 (8.3) |
| | Situational awareness (“Knowing what is going on around you”) | 5 (29.4) | 10 (41.7) |
| | Decision making | 3 (17.6) | 6 (25.0) |
| | Coping with stress | 6 (35.3) | 7 (29.2) |
| | Coping with fatigue | 5 (29.4) | 3 (12.5) |
| | Respectfulness | 13 (76.5) | 20 (83.3) |
| | Punctuality | 11 (64.7) | 14 (58.3) |
| | Awareness of limitations | 5 (29.4) | 7 (29.2) |
| | Ability to ask for help | 2 (11.8) | 9 (37.5) |
| | Comfort level in a clinical setting | 8 (64.7) | 13 (54.2) |
| | Adaptability | 2 (11.8) | 7 (29.2) |
| | Managing workloads | 2 (11.8) | 3 (12.5) |
| | Resolving conflicts | 4 (23.5) | 7 (29.2) |
| | Patients cannot assess residents | 1 (5.9) | 1 (4.2) |
Phase 2: Interviews
Demographics
We interviewed 12 PDs; eight (66.7%) in the CBD preparation stage and four (33.3%) in the field-testing or implementation stages. Of the interviewees in the preparation stage, four (50.0%) reported that patients will be involved in resident assessment and four (50.0%) said that they did not know if patients will be involved. All interviewees in the field-testing or implementation stages noted that patients are not involved in resident assessment. The interviewees represented Anesthesiology, General Surgery, Medical Oncology, Neonatal-Perinatal Medicine, Nephrology, Obstetrics and Gynecology, Otolaryngology, Pediatrics, Respirology, and Surgical Foundations from various geographical locations.
What factor(s) facilitate or hinder patient involvement in resident assessment?
Factors that some interviewees viewed as facilitators were viewed by others as hindrances (see Table 4). Appendix B provides exemplar participant quotations.
Table 4.
Factors that facilitate and hinder patient involvement in resident assessment by stage of CBD
| Factor | Preparation stage | Field-testing or implementation stages |
|---|---|---|
| Patients’ interests and abilities | – | – |
| Funding | – | – |
| Guidelines and processes for patient involvement in assessment | – | – |
| Faculty members’ and residents’ perceptions | + – | – |
| Staffing and time | + – | – |
| Availability and existence of patient assessment tools | + – | – |
| Type of Entrustable Professional Activities | + – | – |
| Type of patient interactions in program | + – | – |
| Support from healthcare organizations | + – | – |
Note. – indicates that the interviewees perceived the factor to hinder patient involvement in resident assessment; + – indicates that the interviewees perceived the factor to both facilitate and hinder patient involvement in resident assessment.
Patients’ interests and abilities
All participants in the preparation and field-testing or implementation stages, irrespective of their desire to involve patients in assessment, thought that patients’ interests and abilities to be involved will be/are a hindrance. They commented on how they believe that patients will/do not want to be involved in education or that they will/do not have the time to participate in assessment. Interviewees also expressed concern about patients’ expertise to assess residents. They noted that patients would need training in resident assessment and medical education requirements. Interviewees thought that providing such training is unfeasible given current program resources.
Funding
Regardless of the participants’ CBD stage and their interest in involving patients in resident assessment, they all said that insufficient funding hinders them from involving patients. Participants commented on how they would require additional funding to implement and ensure ongoing patient involvement. They discussed how patient involvement is costly and impossible within existing budgets.
Guidelines and processes for patient involvement in assessment
Participants across all CBD stages viewed the lack of guidelines and processes as a hindrance to patient involvement in resident assessment. They noted the non-existence of guidelines and processes on how to collect resident assessment data from patients, which patients they should involve, how many patient assessments they should collect, what patient assessment reports should include, and if these reports should only include anonymized data. Several interviewees also commented on how the online assessment system will be/is restricted to faculty members and that there are no guidelines or processes on how patients will/can access it. They expressed that guidelines and processes would facilitate and standardize patient assessment activities but that without them, patient assessments will be/are haphazard or not occur. Moreover, most participants noted that they would not involve patients in resident assessment unless it was mandatory.
Faculty members’ and residents’ perceptions
Some interviewees in the preparation stage described how their positive perceptions of patient involvement in resident assessment would facilitate their efforts to involve patients. These participants believe that patients will bring unique insights to well-rounded assessment programs and provide residents with important information that they can use to improve their competencies. However, others in the preparation, field-testing, or implementation stages flagged that residents’ and faculty members’ preconceived negative perceptions of patient assessments might hinder involvement. They commented on how residents and faculty members may not take these assessments seriously. Participants also discussed how they perceive patient assessments to be biased (i.e., overly positive or negative), which hinders them from collecting and using them. They noted that if they, as the PDs, are not supportive of patient assessments then these assessments will not occur, as PDs champion the various assessment strategies. Furthermore, several of these interviewees disclosed that they do not see patient assessments as a priority.
Staffing and time
Participants discussed staffing and time as factors that both facilitate and hinder patient involvement in resident assessment. They focused on the amount of time available for staff to collect patient assessments and the willingness of staff and residents to assume such responsibilities. Some interviewees in the preparation stage explained how they would ask clerks to distribute assessment forms to patients. They also suggested that nurses could assist or residents could collect the patient assessments. However, others within the preparation stage detailed how they will not have the staff or time to collect patient assessments. They discussed how the time commitments of teaching and clinical responsibilities would hinder them from involving patients in resident assessment. Those in the field-testing or implementation stages echoed these thoughts and added that they do not have anyone in their programs who can assume responsibility for overseeing and collecting patient assessments. They also explained that residents would forget to collect these assessments and thus, it would be impractical to assign the task to them.
Existence and availability of patient assessment tools
Participants whose programs are in the preparation stage thought that the existence and availability of patient assessment tools will be both a facilitator and a hindrance to patient involvement in assessment. As a facilitator, some explained how they plan to involve patients in resident assessment using existing tools within their 360° assessment systems. However, others in the preparation stage noted that they do not have access to or are unaware of existing patient assessment tools, especially those with validity and reliability evidence. Interviewees in the field-testing or implementation stages also reiterated their lack of access to and knowledge of patient assessment tools as a hindrance. They noted the need to develop patient-specific assessment tools that align with their existing Entrustable Professional Activities (EPAs), but they explained that the development of such tools is beyond their expertise.
Type of Entrustable Professional Activities
Some participants in the preparation stage highlighted how the EPAs that they are developing for their programs will require or are amenable to patient involvement in resident assessment. In some cases, they described how it would be essential to obtain patient assessments to know if residents are progressing appropriately. Conversely, others in the preparation, field-testing, or implementation stages stressed how their programs’ EPAs will hinder their abilities to involve patients because the EPAs are not patient-oriented. They stated that the EPAs for their programs do not require patient assessments and focus solely on skills/abilities that only require faculty members’ assessments. These interviewees also noted that they will not/have not included patients as assessment sources and thus, patients will not be/are not involved.
Type of patient interactions in program
Several participants in the preparation stage discussed longitudinal patient interactions in their programs. They believed that these interactions would facilitate their abilities to incorporate patient assessments and enable patients to comment on residents’ competencies over time. However, other interviewees in the preparation, field-testing, or implementation stages commented on how their types of patient interactions are not conducive to patient involvement in resident assessment. They explained how interactions are episodic or short and therefore, patients will/do not have enough time with residents to provide reliable and constructive assessments.
Support from healthcare organizations
Participants in the preparation stage who plan to involve patients explained how support from the academic hospitals in which their residents are training would be essential for facilitating patient involvement in resident assessment. They noted that they are involving administrators as well as patient advisory groups in preparation efforts to understand how they can best involve patients. These interviewees detailed how these stakeholders will facilitate their programs’ efforts to contact patients for assessment activities. They discussed how these stakeholders are also providing them with advice on how to respect patients’ privacy concerns when involved in resident assessment. Participants mentioned that the patient-centred philosophies of their academic hospitals would support patient involvement in resident assessment. Nonetheless, those in the field-testing and implementation stages said that, regardless of their hospitals’ philosophies, they do not have staff or mechanisms to assist and guide them in involving patients in resident assessment.
Discussion
This study provides an understanding of the current extent, nature, as well as the facilitators and hindrances of patient involvement in resident assessment. The majority of Phase 1 respondents in the CBD preparation stage indicated that they did not know if patients would be involved in resident assessment. This uncertainty is not surprising as these programs are still in early CBD stages and navigating the transition process. However, the majority of respondents in the field-testing or implementation stages indicated that they do not involve patients in resident assessment. This finding does not bode well for patient involvement in resident assessment, as many of the programs in the preparation stage may look to these early CBD adopters, who are piloting the model to ensure that it is appropriate and feasible before implementing it across all programs, to inform their programs’ assessment activities.29 This lack of patient involvement in assessment also differs from the methods used to assess residents in Accreditation Council for Graduate Medical Education (ACGME) programs in the United States.24 Holt et al. in their review of methods used to assess residents in ACGME programs found that 61% of programs involved patients as assessors and that 34% included family members as assessors.24
All respondents in the present study also noted a lack of patient involvement in the development of assessment tools. This finding is unfortunate as research shows that patients can contribute to tool creation.30-32 Moreover, participants in Phase 2 commented on having no access to or awareness of patient assessment tools. However, there are several existing patient assessment tools with validity and reliability evidence that programs could use as part of their MSF activities.31,33-37 Nevertheless, researchers and other RCPSC stakeholders could better promote these tools to PDs to facilitate use. In addition, prior to using any existing tools, programs need to ensure that the tools’ items align with their programs’ EPAs, as these EPAs should be used as the blueprints that guide tool selection.38,39
On another note, participants in the current study who will or do involve patients in resident assessment noted how patient involvement provides patients’ perspectives to residents, improves care provision, and empowers patients. These benefits are consistent with the literature on patient involvement in assessment.7,40-42 That said, some participants discussed how they will or are only involving patients in resident assessment to satisfy program accreditation requirements. Unfortunately, this form of involvement often leads to token patient involvement.8
Phase 1 participants identified that patients could best assess residents’ communication and respectfulness, which is congruent with previous studies.7,16,24 In relation to the assessment of these skills, interviewees also noted that opportunities for longitudinal patient-resident interactions would facilitate patient involvement. Researchers have shown that a series of longitudinal patient assessments can illustrate the evolution of learners’ patient-oriented behaviours and communication abilities.43 Conversely, other interviewees detailed how resident-patient interactions in their programs are episodic or short and thus not conducive to patient involvement in resident assessment. However, as we have shown in a previous study, parent assessment of residents is feasible in pediatric emergency departments, where interactions are brief and isolated.31
In both Phases 1 and 2, regardless of CBD stage or intentions of involving patients, participants indicated several reasons why they might not or do not involve patients in resident assessment, including lack of funding and time. Holmboe et al.38 and Pinsk et al.44 confirm that CBME and CBD, respectively, lead to increased demands on faculty and resources and that assessment efforts are time-intensive. Thus, adding patient assessments is undoubtedly challenging. Participants also suggested that their lack of knowledge on how to involve patients in resident assessment is a reflection of a dearth of guidelines and processes on how to do this. Such guidelines and processes can support a culture of patient inclusion8,12 and include information on remuneration for patient assessors,17 strategies for increasing the diversity of patients involved in assessment,45 and data collection techniques.32
Of interest, Phase 2 interviewees highlighted hindrances and facilitators to involving patients in resident assessment that were not strongly represented in Phase 1. For example, while a small number of Phase 1 respondents indicated that their programs do not believe patients can assess residents, all Phase 2 participants thought that patients’ interests and abilities will be or are a hindrance to patient involvement in resident assessment. Interviewees focused heavily on patients’ lack of expertise to assess residents. Other researchers have commonly cited concerns about overburdening patients or about patients’ abilities to provide meaningful and reliable assessments because of their emotional, physical, or mental health.46 Few have commented specifically on patients’ expertise. Those who have alluded to patients’ expertise suggested that faculty members may feel threatened by the transfer of some assessment power from themselves to patients and thus resist it,47 or that programs have provided minimal or inappropriate training to patients to enable effective involvement in assessment.48
Interviewees also expanded on staffing as a factor that can both hinder and facilitate patient involvement in resident assessment. Some noted that their programs will not or do not have the staff to collect patient assessments. Naylor et al.17 echoed these participants’ concerns, noting that the Canadian context has deficiencies in the infrastructure needed to support authentic patient involvement. However, other interviewees commented on the potential of having other health professionals and residents assist with patient assessments. Previous research studies35,36,49 have successfully used such processes, but it is unclear how these processes would work in Canadian healthcare systems, where health professionals’ job descriptions do not include the collection of patient assessments.
Lastly, interviewees raised the importance of having leadership and support to facilitate patient involvement from both the healthcare organizations in which the residents provide care and the PDs themselves. A lack of sustained leadership and support is one of the main reasons why patient involvement in medical education is not mainstream.45 In order to facilitate such involvement, Towle et al.45 endorse the promotion of patient involvement “through directives such as accreditation standards, external and internal policies, pronouncements from professional bodies and best practice statements.”
Limitations & Future Directions for Research
This study has four limitations that future research can mitigate. First, some may view the survey response rate of 29.0% as a limitation. While it is plausible that a higher response rate would have yielded different results, this rate is consistent with other surveys targeted at PDs50 and those involving healthcare professionals.51 Future research in this area could work on increasing this response rate. Second, since we limited the survey to the perspectives of PDs of specialty/sub-specialty/special programs associated with the RCPSC, future research could also survey PDs of Family Medicine programs. Third, many Phase 1 respondents elected not to participate in Phase 2. We do not know whether those who participated in Phase 2 differed from those who did not; it is possible that the PDs who participated were more interested in the topic and thus expressed different views from those who did not. Lastly, although Phase 2 included a small group of PDs, it captured the views of PDs from several specialty/sub-specialty/special programs and various geographical locations. Nevertheless, it would be beneficial to undertake future studies to gather additional PDs’ perspectives on the topic. Moreover, patients and residents are key stakeholders on this topic, and future studies should also explore their views. Such studies could use focus groups rather than one-on-one interviews, as their interactional, synergistic nature would encourage these stakeholders to clarify or expand upon their points in response to those raised by others, helping to yield high-quality data.
Patient involvement in resident assessment appears limited and sporadic across Canadian specialty, sub-specialty, and special programs that are transitioning or have transitioned to CBD. Unfortunately, the majority of respondents whose programs are in the CBD field-testing or implementation stages indicated that they do not involve patients in resident assessment. This lack of involvement could have a deleterious impact on the extent of patient involvement in resident assessment more broadly, since other programs may follow the examples and activities of these early CBD adopters. The PDs also identified factors that facilitate and hinder such involvement. Overall, by highlighting the current state of patient involvement in resident assessment as well as these factors, we are optimistic that those leading CBD transitions will be motivated to engage in critical discussions about the extent to which, how, and why we need to enact measures to better support patient involvement in resident assessment. After all, since patients are central to healthcare and medical education, residents may benefit from and appreciate the involvement of patients in their assessment.
Appendix A
Table A1.
Why patients might not be or are not involved in resident assessment
| Patient Involvement | Why patients might not be or are not involved in resident assessment | Preparation Stage, n (%) | Field-testing or Implementation Stages, n (%) |
|---|---|---|---|
| **Yes** | | N=21 | N=5 |
| | Residents do not have direct contact with patients | 2 (9.5) | 0 (0.0) |
| | Program doesn’t know how to involve patients in resident assessment | 10 (47.6) | 2 (40.0) |
| | No funding to support patient involvement in resident assessment | 9 (42.9) | 1 (20.0) |
| | No time to support patient involvement in resident assessment | 8 (38.1) | 1 (20.0) |
| | No tools to support patient involvement in resident assessment | 12 (57.1) | 3 (60.0) |
| | Program does not believe patients can assess residents | 1 (4.8) | 3 (60.0) |
| | Patients’ health conditions impede them from assessing residents | 6 (28.6) | 1 (20.0) |
| | Don’t know | 1 (4.8) | 1 (20.0) |
| **Don’t Know** | | N=63 | N=4 |
| | Residents do not have direct contact with patients | 1 (3.2) | 0 (0.0) |
| | Program doesn’t know how to involve patients in resident assessment | 30 (47.6) | 1 (25.0) |
| | No funding to support patient involvement in resident assessment | 35 (55.6) | 2 (50.0) |
| | No time to support patient involvement in resident assessment | 35 (55.6) | 1 (25.0) |
| | No tools to support patient involvement in resident assessment | 43 (68.3) | 2 (50.0) |
| | Program does not believe patients can assess residents | 3 (4.8) | 0 (0.0) |
| | Patients’ health conditions impede them from assessing residents | 14 (22.2) | 2 (50.0) |
| | Don’t know | 2 (3.2) | 0 (0.0) |
| **No** | | N=17 | N=24 |
| | Residents do not have direct contact with patients | 2 (11.8) | 0 (0.0) |
| | Program doesn’t know how to involve patients in resident assessment | 5 (29.4) | 13 (54.2) |
| | No funding to support patient involvement in resident assessment | 3 (17.6) | 14 (58.3) |
| | No time to support patient involvement in resident assessment | 4 (23.5) | 13 (54.2) |
| | No tools to support patient involvement in resident assessment | 7 (41.2) | 15 (62.5) |
| | Program does not believe patients can assess residents | 3 (17.6) | 1 (4.2) |
| | Patients’ health conditions impede them from assessing residents | 3 (17.6) | 1 (4.2) |
| | Don’t know | 1 (5.9) | 0 (0.0) |
Appendix B
Table B1.
(Supplementary material) Exemplar quotations for factors that facilitate and hinder patient involvement in resident assessment by stage of CBD
| Factor | Preparation stage | Field-testing or implementation stages |
|---|---|---|
| **Patients’ interests and abilities** | | |
| As a facilitator | | |
| As a hindrance | | |
| **Funding** | | |
| As a facilitator | | |
| As a hindrance | | |
| **Guidelines and processes for patient involvement in assessment** | | |
| As a facilitator | | |
| As a hindrance | | |
| **Faculty members’ and residents’ perceptions** | | |
| As a facilitator | | |
| As a hindrance | | |
| **Staffing and time** | | |
| As a facilitator | | |
| As a hindrance | | |
| **Availability and existence of patient assessment tools** | | |
| As a facilitator | | |
| As a hindrance | | |
| **Type of Entrustable Professional Activities (EPAs)** | | |
| As a facilitator | | |
| As a hindrance | | |
| **Type of patient interactions in program** | | |
| As a facilitator | | |
| As a hindrance | | |
| **Support from healthcare organizations** | | |
| As a facilitator | | |
| As a hindrance | | |

Note. Y beside the participant identification number indicates that the interviewee’s program will be involving patients in resident assessment. DK beside the participant identification number indicates that the interviewee’s program does not know if it will be involving patients in resident assessment. N beside the participant identification number indicates that the interviewee’s program does not involve patients in resident assessment.
Footnotes
Conflicts of interest: There are no conflicts of interest for any of the authors.
Funding: The Royal College of Physicians and Surgeons of Canada funded this study.
References
- 1.Takahashi S, Abbott C, Oswald A, Frank JR, eds. CanMEDS teaching and assessment tools guide. Ottawa, ON: Royal College of Physicians and Surgeons of Canada; 2015. [Google Scholar]
- 2.Lockyer J, Carraccio C, Chan M-K, et al. Core principles of assessment in competency-based medical education. Med Teach. 2017;39:609-16. [DOI] [PubMed] [Google Scholar]
- 3.Ferguson P, Caverzagie KJ, Nousiainen MT, Snell L. Changing the culture of medical training: An important step toward the implementation of competency-based medical education. Med Teach. 2017;39:599-602. [DOI] [PubMed] [Google Scholar]
- 4.Evans RG, Edwards A, Evans S, Elwyn B, Elwyn G. Assessing the practising physician using patient surveys: a systematic review of instruments and feedback methods. Family Practice. 2007;24:117-27. [DOI] [PubMed] [Google Scholar]
- 5.Carraccio C, Englander R, Van Melle E, et al. Advancing competency-based medical education: A charter for clinician-educators. Acad Med. 2016;91:645-9. [DOI] [PubMed] [Google Scholar]
- 6.Spencer J. Some activity but still not much action on patient and public engagement. Med Educ. 2016;50:3-23. [DOI] [PubMed] [Google Scholar]
- 7.Abadel FT, Hattab AS. Patients' assessment of professionalism and communication skills of medical graduates. BMC Med Educ. 2014;14:1-8. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8.Tee S. Service user involvement: Addressing the crisis in confidence in healthcare. Nurse Educ Today. 2012;32:119-20. [DOI] [PubMed] [Google Scholar]
- 9.Towle A, Bainbridge L, Godolphin W, et al. Active patient involvement in the education of health professionals. Med Educ. 2010;44:64-74. [DOI] [PubMed] [Google Scholar]
- 10.Moreau K, Eady K, Jabbour M. Exploring residents' reactions to and use of parent feedback in a pediatric emergency department: A grounded theory study. Med Teach. In press. [DOI] [PubMed]
- 11.Ahuja AS, Williams R. Involving patients and their carers in educating and training practitioners. Curr Opin Psychiatry. 2005;18:374-80. [DOI] [PubMed] [Google Scholar]
- 12.Muir D, Laxton JC. Experts by experience: The views of service user educators providing feedback on medical students' work based assessments. Nurse Educ Today. 2012;32:146-50. [DOI] [PubMed] [Google Scholar]
- 13.Spencer J, Blackmore D, Heard S, et al. Patient-oriented learning: A review of the role of the patient in the education of medical students. Med Educ. 2000;34:851-7. [DOI] [PubMed] [Google Scholar]
- 14.Towle A, Godolphin W. Patient involvement in health professional education: A bibliography 1975-November 2016. 2016. [Internet]. Available at: https://pcpe.health.ubc.ca/node/207 [Accessed December 1, 2016].
- 15.Chisholm A, Askham J. What do you think of your doctor? A review of questionnaires for gathering patients' feedback on their doctor. Oxford, UK, 2006. [Google Scholar]
- 16.Moreau K, Eady L, Frank JR, et al. A qualitative exploration of which resident skills parents in pediatric emergency departments can assess. Med Teach. 2016;38:1118-24. [DOI] [PubMed] [Google Scholar]
- 17.Naylor S, Harcus J, Elkington M. An exploration of service user involvement in the assessment of students. Radiography. 2015;21:269-72. [Google Scholar]
- 18.Thomson AN. Reliability of consumer assessment of communication skills in postgraduate family practice examination. Med Educ. 1994;28:146-50. [DOI] [PubMed] [Google Scholar]
- 19.Thomson AN. Consumer assessment of interview skills in a family practice certification examination. Fam Med. 1993;25:41-4. [PubMed] [Google Scholar]
- 20.Creswell JW, Plano Clark V. Designing and conducting mixed methods research. Thousand Oaks: Sage, 2007. [Google Scholar]
- 21.Tew J, Gell C, Foster S. Learning From Experience: Involving Service Users and Carers in Mental Health Education and Training. Nottingham, UK: National Institute for Mental Health in England, 2004. [Google Scholar]
- 22.Institute for Patient- and Family-Centered Care. Medical Education. 2016. [Internet]. Available at: http://www.ipfcc.org/advance/topics/meded.html [Accessed December 3, 2016].
- 23.Chou S, Lockyer J, Cole G, McLaughlin K. Assessing postgraduate trainees in Canada: Are we achieving diversity in methods? Med Teach. 2009;31:e58-63. [DOI] [PubMed] [Google Scholar]
- 24.Holt K, Miller R, Nasca T. Residency programs' evaluations of the competencies: Data provided to the ACGME about types of assessments used by programs. J Grad Med Educ. 2010;2:649-55. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 25.Flin R, O'Connor R, Crichton M. Safety at the sharp end: A guide to non-technical skills. Farnham, UK: Ashgate, 2008. [Google Scholar]
- 26.Dillman D, Smyth J, Christian LM. Internet, phone, mail, and mixed-mode surveys: the tailored design method. New York, NY: John Wiley & Sons, 2014. [Google Scholar]
- 27.Patton MQ. Qualitative research & evaluation methods: Integrating theory and practice. Thousand Oaks, CA: Sage, 2015. [Google Scholar]
- 28.Miles MB, Huberman AM, Saldana J. Qualitative Data Analysis: A Methods Sourcebook. 3rd ed. Los Angeles, CA: Sage Publications, 2014. [Google Scholar]
- 29.Royal College of Physicians and Surgeons of Canada. Announcing our two early adopters for Competence by Design, 2014. [Internet]. Available at: https://ceomessage.royalcollege.ca/2014/04/30/announcing-our-two-early-adopters-for-competence-by-design/ [Accessed August 7, 2018].
- 30.Shippee N, Graces J, Lopez G, et al. Patient and service user engagement in research: A systematic review and synthesized framework. Health Expect. 2013. [DOI] [PMC free article] [PubMed]
- 31.Moreau K, Eady K, Jabbour M, Frank JR, Hamstra S. The development of the PARENTS: A tool for parents to assess medical residents’ non-technical skills in pediatric emergency departments. BMC Med Educ. 2017;17. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Moreau K, Pound CM, Eady K. Pediatric caregiver involvement in the assessment of physicians. BMC Med Educ. 2015;15. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 33.Campbell C, Lockyer J, Laidlaw T, MacLeod H. Assessment of a matched-pair instrument to examine doctor-patient communication skills in practising doctors. Med Educ. 2007;41:123-9. [DOI] [PubMed] [Google Scholar]
- 34.Makoul G, Krupat E, Chang C-H. Measuring patient views of physician communication skills: Development and testing of the Communication Assessment Tool. Patient Educ Couns. 2007;67:333-42. [DOI] [PubMed] [Google Scholar]
- 35.Street R, Makoul G, Arora N, Epstein R. How does communication heal? Pathways linking clinician-patient communication to health outcomes. Patient Educ Couns. 2009;74:295-301. [DOI] [PubMed] [Google Scholar]
- 36.Crossley J, Eiser C, Davies HA. Children and their parents assessing the doctor-patient interaction: a rating system for doctors' communication skills. Med Educ. 2005;39:820-8. [DOI] [PubMed] [Google Scholar]
- 37.McGraw M, Fellows S, Long A, et al. Feedback on doctors' performance from parents and carers of children: a national pilot study. Arch Dis Child. 2011;97:206-10. [DOI] [PubMed] [Google Scholar]
- 38.Holmboe E, Sherbino J, Long D, Swing S, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32:676-82. [DOI] [PubMed] [Google Scholar]
- 39.Green ML, AAgaard EM, Caverzagie KJ, et al. Charting the road to competence: Developmental milestones for internal medicine residency training. J Grad Med Educ. 2009;1:5-20. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 40.Jha V, Quinton ND, Bekker HL, Roberts TE. Strategies and interventions for the involvement of real patients in medical education: a systematic review. Med Educ. 2009;43:10-20. [DOI] [PubMed] [Google Scholar]
- 41.Walters K, Buszewicz M, Russell J, Humphrey C. Teaching as therapy: Cross sectional and qualitative evaluation of patients' experiences of undergraduate psychiatry teaching in the community. BMJ. 2003;326:740-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 42.Wykurz G, Kelly D. Developing the role of patients as teachers: Literature review. BMJ. 2002;325:818-21. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 43.Epstein RM, Franks P, Fiscella K, et al. Measuring patient-centered communication in patient-physician consultations: Theoretical and practical issues. Soc Sci Med. 2005;61:1516-28. [DOI] [PubMed] [Google Scholar]
- 44.Pinsk M, Karpinski J, Carlisle E. Introduction of competence by design to Canadian nephrology postgraduate training. Canadian Journal of Kidney Health and Disease. 2018;5:1-9. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 45.Towle A, Farrell C, Gaines ME, et al. The patient's voice in health and social care professional education: The Vancouver Statement. International Journal of Health Governance. 2016;21:1-8. [Google Scholar]
- 46.McMahon-Parkes K, Chapman L, James J. The views of patients, mentors and adult field nursing students on patients' participation in student nurse assessment in practice. Nurse Educ Pract. 2016;16:202-8. [DOI] [PubMed] [Google Scholar]
- 47.Felton A, Stickley T. Pedagogy, power and service user involvement. J Psychiatr Ment Health Nurs. 2004;11:89-98. [DOI] [PubMed] [Google Scholar]
- 48.Dogra N, Anderson J, Edwards R, Cavendish S. Service user perspectives about their roles in undergraduate medical training about mental health. Med Teach. 2008;30:e152-6. [DOI] [PubMed] [Google Scholar]
- 49.McGraw M, Fellows S, Long A, et al. Feedback on doctors' performance from parents and carers of children: a national pilot study. Arch Dis Child. 2012;97:206-10. [DOI] [PubMed] [Google Scholar]
- 50.The International Conference on Residency Education. The International Conference on Residency Education Conference Research Abstracts. 2017. [Internet]. Available at: www.royalcollege.ca/rcsite/.../icre/2017-icre-conference-abstracts-design-jgme.pdf [Accessed August 7, 2018].
- 51.Hill CA, Fahrney K, Wheeless SC, Carson CP. Survey response inducements for registered nurses. Western Journal of Nursing Research. 2006:322-34. [DOI] [PubMed] [Google Scholar]
