MedEdPublish. 2024 Jan 8;14:2. [Version 1] doi: 10.12688/mep.19247.1

Exploring residents’ perceptions of competency-based medical education across Canada: A national survey study

Heather Braund 1, Vivesh Patel 2, Nancy Dalgarno 1, Steve Mann 3,a
PMCID: PMC10933567  PMID: 38487752

Abstract

Background: As competency-based medical education (CBME) is implemented across Canada, little is known about residents’ perceptions of this model. This study examined how Canadian residents understand CBME and their lived experiences with implementation.

Methods: We administered a survey in 2018 with Likert-type and open-ended questions to 375 residents across Canada, of whom 270 were from traditional programs (“pre-CBME”) and 105 were in a CBME program. We used the Mann-Whitney test to examine differences across samples, and analyzed qualitative data thematically.

Results: Three themes were identified across both groups: program outcome concerns, changes, and emotional responses. In relation to program concerns, both groups were concerned about the administrative burden, challenges with the assessment process, and feedback quality. Only pre-CBME residents were concerned about faculty engagement and buy-in. In terms of changes, both groups discussed a more formalized assessment process with mixed reactions. Residents in the pre-CBME sample reported greater concerns about faculty time constraints, assessment completion, and quality of learning experiences, whilst those in CBME programs reported being more proactive in their learning and engaging in greater self-reflection. Residents' narrative responses expressed strong emotions, including greater stress and frustration in a CBME environment.

Conclusion: Findings demonstrate that residents have mixed feelings and experiences regarding CBME. Their positive experiences align with the aim of developing more self-directed learners. However, the concerns suggest the need to address specific shortcomings to increase buy-in, while the emotional responses associated with CBME may require a cultural shift within residency programs to guard against burnout.

Keywords: Competency-based medical education, resident experiences, resident perspectives, survey

Introduction

Canadian postgraduate medical education (PGME) programs are transitioning from a time-based model to competency-based medical education (CBME) 1 . The Royal College of Physicians and Surgeons of Canada (RCPSC) introduced CBME with the 2017 cohort of residents at one Canadian institution. Since then, Canadian PGME programs have been transitioning to the RCPSC competency-based model, competence by design (CBD) 1 .

CBME is supported by four tenets: a focus on outcomes, an emphasis on abilities, a de-emphasis of time-based training, and the promotion of learner-centredness. Another benefit of CBME is the requirement for frequent, utilitarian assessment 2 . This increased assessment is intended to promote learner-centred teaching that could help residents achieve the required competencies 2 . These benefits informed CBME implementation, with the goals of standardizing and improving residency training across Canada and fostering self-reflection and lifelong learning skills. Implementing this significant change has required much scholarship and collaboration 2–5 . However, although educational scholars have written extensively about CBME, there remains a gap in the literature examining resident perspectives 6, 7 . The CBME literature has focused on examining the impact of educational interventions on residents’ understanding of the process and purpose 8 , or has used qualitative methodology 9 . Additional areas of focus include understanding residents’ perceptions specific to assessment 10, 11 . The literature also focuses on resident perspectives within specific programs, including internal medicine 10, 11 , obstetrics and gynecology 12 , and anaesthesia 13 , rather than national data across specialities.

Research has shown that residents anticipate improved assessment, feedback, and flexibility to pursue self-identified educational needs, but are concerned regarding logistical challenges and administrative burden 9 . These perspectives are important, because residents are key stakeholders as learners, mentors, and assessors. Gaining increased insight into residents’ perspectives will support an environment that makes learning more efficient and meaningful. Furthermore, residents’ perspectives provide insight into the implementation of CBME programs and many of the daily intricacies of CBME that might otherwise go unnoticed. Given the rich knowledge that can be discovered from residents’ perspectives, they should be targeted to improve CBME implementation and help ensure the goals are being met.

This exploratory study examined how Canadian residents conceptualize CBME with a goal to gain insight and add to the discourse around CBME. This study may aid the RCPSC and program directors as they implement CBME and make evidence-informed decisions regarding their programs. Sharing the resident voice may also enhance the resident experience, improve buy-in, and ultimately cultivate more competent physicians.

Methods

This study examined Canadian residents’ perceptions of CBME before and after implementation through a survey design.

Ethical statement

This study received ethics approval through the institutional Health Sciences Research Ethics Board (File number: 6020800). Participants provided written informed consent for participation in the study and the use and publication of their data. Participants who wished to be compensated were entered into a draw for one of two Apple Watches.

Setting and participants

Scholars at a mid-sized university in Eastern Ontario recruited residents in time-based and CBME RCPSC training programs. We distributed the online survey examining residents’ perceptions of CBME from July 2018 through December 2018 to trainees in English-speaking Canadian residency programs, by means of an electronic notice sent through institutional listservs. A total of 10 Canadian institutions circulated the survey. However, because the survey was distributed by institutions, the sampling frame is unknown, as each institution used different listservs and varied numbers of reminders.

Data collection and tools

This study is a continuation of a project that started with individual interviews; those lived experiences informed survey development for this study 9 .

We collected data through a single-phase design consisting of a survey which included closed-ended items (Likert) and an open-ended question. The survey branched to questions according to whether participants indicated they were in a CBME program. The questions contained the same premise but were re-worded to make sense for participant context, either ‘pre-CBME’ or ‘in CBME’.

A total of 12 Likert items were divided into two categories: 1) assessment and feedback, and 2) learning experiences. Participants rated their agreement using a six-point scale (1=Strongly disagree, 2=Disagree, 3=Neither agree nor disagree, 4=Agree, 5=Strongly agree, and 6=Prefer not to answer). After the Likert items, participants received an open-ended question asking how CBME might affect their residency experience (pre-CBME) or how CBME has affected their residency experience (in CBME). The final question used a six-point Likert scale where residents rated their overall satisfaction with their current educational training (1=Very dissatisfied, 2=Dissatisfied, 3=Neither satisfied nor dissatisfied, 4=Satisfied, 5=Very satisfied, and 6=Prefer not to answer). No demographic data were collected as a means of protecting resident identity 14 , given that some residency programs in Canada are small. We piloted the survey between April 2018 and May 2018 and evaluated the items for their construct validity 15 . The survey was completed by 13 respondents external to the study who provided feedback on item wording, flow, completion time, and item interpretation. These respondents were purposefully selected based upon their expertise in survey design, knowledge of CBME, and interest in medical education. This process resulted in small refinements (e.g., clarifying terminology and rewording). We used Cronbach’s alpha to evaluate survey reliability 16 .
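
For readers who want to see how a reliability figure like the one reported later could be reproduced, the following is a minimal sketch of Cronbach's alpha in Python. It is an illustration only: the responses shown are hypothetical placeholders, not the study data, and the function simply implements the standard alpha formula from an item-response matrix.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_var = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' summed scores
    return (k / (k - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical data: 6 respondents answering 4 Likert items scored 1-5
scores = [[4, 5, 4, 4],
          [3, 3, 4, 3],
          [5, 5, 5, 4],
          [2, 3, 2, 3],
          [4, 4, 3, 4],
          [3, 4, 3, 3]]
print(round(cronbach_alpha(scores), 3))
```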

Data analysis

Quantitative: Data were de-identified and participants were assigned a study ID number for the analysis. Quantitative data were uploaded into the SPSS Statistical Package (version 25) for analyses. Our sample violated normality assumptions; hence we used the Mann-Whitney U test to examine differences across samples, with significance set at .05.
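
As an illustration only (the study itself used SPSS), a comparable non-parametric comparison can be sketched in Python with scipy. The responses below are simulated placeholders with the real group sizes, and the z value uses the normal approximation without the tie correction SPSS applies, so the numbers are indicative rather than a reproduction of the study's output.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Simulated Likert responses (1-5) for one item; actual group sizes were 270 and 105
pre_cbme = rng.integers(1, 6, size=270)
in_cbme = rng.integers(1, 6, size=105)

u_stat, p_value = stats.mannwhitneyu(pre_cbme, in_cbme, alternative="two-sided")

# Normal approximation for z (no tie correction), then effect size r = z / sqrt(N)
n1, n2 = len(pre_cbme), len(in_cbme)
mean_u = n1 * n2 / 2
sd_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
z = (u_stat - mean_u) / sd_u
r = z / np.sqrt(n1 + n2)

print(f"U = {u_stat:.1f}, p = {p_value:.3f}, z = {z:.2f}, r = {r:.2f}")
```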

Qualitative: All qualitative data were analyzed thematically using NVivo (version 12). The smallest level of analysis was a code; we grouped similar codes together into categories, and grouped similar categories together to form themes 17 . Two researchers coded 25% of the data independently and then met to discuss their coding, with an inter-coder reliability of 97%. The discussion resulted in a consensus-built codebook used for the remainder of the coding. We reached data saturation, with no new findings emerging from the last 20 responses. Once preliminary themes were determined, the research team met to discuss the findings and ensure accurate data representation.
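
The 97% figure reads as a simple percent agreement between the two coders on the independently coded subset (an assumption; the exact index is not stated). A minimal sketch of that calculation, using hypothetical code labels rather than the study codebook, is shown below.

```python
# Hypothetical labels assigned by two coders to the same 20 response excerpts
coder_a = ["admin_burden", "assessment", "feedback", "stress", "assessment"] * 4
coder_b = list(coder_a)
coder_b[7] = "admin_burden"  # introduce a single disagreement

matches = sum(a == b for a, b in zip(coder_a, coder_b))
print(f"Percent agreement: {matches / len(coder_a):.0%}")  # 95% in this toy example
```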

Researcher reflexivity

The lead author is an assistant professor and orthopaedic surgeon with a Master’s in Medical Education. The two researchers are mixed methodologists with extensive experience conducting educational scholarship. A medical student who was learning how to conduct research was also involved. The researcher and medical student conducted the coding and statistical analysis before discussing all results with the team. The researchers engaged in a reflexive process, including the development of a coding diary to identify common patterns and possible biases and to document memos. This process helped ensure that the interpretation of the data was in alignment with participants’ perspectives. The activities promoting reflexivity and the coding process described above are in alignment with recommended guidelines for establishing inter-coder reliability.

Results

A total of 375 residents completed the survey, of whom 270 were pre-CBME (72% of respondents) and 105 were enrolled in a CBME program (28% of respondents). A Cronbach’s alpha of .691 indicated that the Likert items were approaching acceptable reliability 18 . Narrative responses were provided by 86% of respondents.

Effect of CBME on assessment and feedback

An overview of the differences between groups is available in Table 1. Residents in the pre-CBME group had greater concerns that faculty time constraints would affect assessment completion (Mean = 4.13) than CBME residents (Mean = 3.91), U = 11663.50, z = -2.88, p = .004, r = -0.14. Similarly, pre-CBME residents (Mean = 3.28) were more skeptical about the level of faculty CBME knowledge and believed that faculty were less likely to have the required knowledge to complete assessments compared to CBME residents (Mean = 3.74), U = 1698.50, z = -3.84, p < .001, r = -0.20.
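
For readers tracing these numbers, the effect sizes appear to follow the conventional conversion r = z / √N for Mann-Whitney comparisons; this is an assumption, since the formula is not stated in the text. As a check, -2.88 / √375 ≈ -0.15 and -3.84 / √375 ≈ -0.20, consistent with the reported values once rounding is allowed for.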

Table 1. Differences between groups for effect of competency-based medical education (CBME) on assessment and feedback.

For every item, n = 270 pre-CBME residents and n = 105 CBME residents.

1. "Faculty time constraints will limit the extent to which they can complete the assessment tools and provide feedback." Mean pre-CBME 4.13; mean CBME 3.91; Z = -2.88; p = .004.
2. "The need for frequent assessments from faculty will make it difficult for residents to complete CBME assessment tools." Mean pre-CBME 3.96; mean CBME 3.68; Z = -2.24; p = .025.
3. "Faculty in my department have the knowledge required to complete quality assessments using the CBME model." Mean pre-CBME 3.28; mean CBME 3.74; Z = -3.84; p < .001.
4. "Faculty in my department will be able to make valid and reliable decisions about residents' competence using the CBME model." Mean pre-CBME 3.51; mean CBME 3.60; Z = -0.76; p = .445.
5. "Definitions of competence, competencies, EPAs, and milestones will be consistent among assessors and competency committees." Mean pre-CBME 2.83; mean CBME 3.14; Z = -2.46; p = .014.
6. "Assessment in the CBME model will result in more valuable feedback from faculty." Mean pre-CBME 3.53; mean CBME 3.50; Z = -0.96; p = .338.
7. "Assessment in the CBME model will positively impact faculty's ability to coach residents." Mean pre-CBME 3.51; mean CBME 3.47; Z = -0.68; p = .497.

Effect of CBME on resident learning experiences

An overview of the differences is available in Table 2. The only significant difference found was that pre-CBME residents (Mean = 3.46) agreed more strongly that faculty would spend greater time on administering CBME than on learning experiences when compared to CBME residents (Mean = 2.91), U = 9705.50, z = -4.96, p < .001, r = -0.26. Residents in both groups reported high satisfaction, with pre-CBME residents having slightly higher satisfaction (Mean = 4.26) than those in CBME programs (Mean = 4.19).

Table 2. Differences between groups for effect of competency-based medical education (CBME) on resident learning experiences.

For every item, n = 270 pre-CBME residents and n = 105 CBME residents.

1. "In the CBME model, faculty may spend more time on administering a competency-based programme than on ensuring the quality of the learning experiences." Mean pre-CBME 3.46; mean CBME 2.91; Z = -4.96; p < .001.
2. "The CBME model will accommodate individual learning needs and variability in the pace of learning." Mean pre-CBME 3.46; mean CBME 3.35; Z = -1.09; p = .273.
3. "The CBME model will encourage resident self-reflection." Mean pre-CBME 3.66; mean CBME 3.79; Z = -1.18; p = .239.
4. "The CBME model will encourage residents to engage in self-directed learning." Mean pre-CBME 3.59; mean CBME 3.69; Z = -1.01; p = .314.
5. "In the CBME model, once residents identify their learning needs, they will pursue learning opportunities to achieve competence in those areas." Mean pre-CBME 3.60; mean CBME 3.69; Z = -0.79; p = .432.
6. "Please rate your overall satisfaction with the quality of education you receive in your current program." Mean pre-CBME 4.26; mean CBME 4.19; Z = -0.97; p = .332.

Qualitative findings

Three themes were identified: 1) Program outcome concerns, 2) Changes, and 3) Emotional responses. Whilst these themes were evident across groups, the frequency with which they were mentioned differed. Tables 3–5 provide supporting quotations for each theme. Additional quotations can be found in Table 6.

Table 3. Supporting quotes for Theme 1: Program outcome concerns.

Administrative burden
  • "Incessant burden of paperwork" (P269, Pre-CBME)
  • "I suspect it will be more busy work whereas as a member of a smaller program I think I get enough supervision feedback and personalized attention" (P274, Pre-CBME)
  • "Extra stressor in terms of administrative burden to have assessments done for daily tasks" (P98, In CBME)

Assessment challenges
  • "Difficulty in getting staff to do the evaluations during the workday" (P105, Pre-CBME)
  • "More paperwork but more observed encounters too which are appreciated" (P18, In CBME)

Quality of feedback
  • "At times feedback is quite generic, but no different than prior to CBME" (P40, In CBME)
  • "I think it will be very beneficial in obtaining and seeking effective feedback." (P221, Pre-CBME)

Faculty engagement and buy-in
  • "Staff in my program already don't do Midterm assessments and end of rotation assessments. There is no way they are going to do competency forms" (P175, Pre-CBME)
  • "I find it stressful to try to get feedback on off-service rotations because the staff there don't really care" (P138, In CBME)

Table 4. Supporting quotes for Theme 2: changes.

Formalized assessment process
  • "More formal assessments" (P174, In CBME)
  • "I anticipate it will formalize evaluation of the learning objectives already in place, to document in greater detail whether residents are in fact meeting all of their rotation objectives" (P242, Pre-CBME)
  • "It is burdensome to approach faculty as there is a negative outlook on assessment as it is a frustrating process right now" (P64, In CBME)

Teaching and learning process
  • "There are certain objectives that I need to reach according to the CBME model, but staff have not changed their teaching style" (P126, In CBME)
  • "Less time receiving teaching from attendings given need for assessments" (P174, Pre-CBME)
  • "Staff will be more focused on filling out forms than on patient specific teaching" (P2, Pre-CBME)
  • "Changing the way that I instruct learners as a senior resident" (P15, Pre-CBME)
  • "I feel it has encouraged me to be a self-directed learner" (P136, In CBME)
  • "Being able to concentrate on the things I really am lacking in and minimize time spent on things I know very well" (P213, Pre-CBME)
  • "I feel CBME is overall positive because it allows a resident to progress at the rate they feel comfortable" (P234, Pre-CBME)

Table 5. Supporting quotes for Theme 3: emotional responses.

Negative feelings
  • "I think that it would be more concise and useful, but it will also require more time and will probably be more stressful" (P18, In CBME)
  • "It is estimated that our minds can accommodate ~7 items at a time. Between consults, paperwork, and ward issues we often become stressed. Adding a requirement to fill out at least 3 assessments a day (as estimated by my surgical PD last week) will increase distraction and stress, hopefully not limiting patient care" (P95, Pre-CBME)
  • "CBME has put a lot of stress on me in my residency. Faculty do not fill out assessments. I barely ever get them back" (P7, In CBME)
  • "I feel an extra stress on shift of completing EPAs as opposed to focussing on patients and flow" (P35, In CBME)
  • "Slightly more stressful as school transitions to CBME model and overcomes growing pains" (P65, In CBME)
  • "I find this constant feedback to be emotionally taxing. To elaborate, I find preceptors feel the need to always say something good you did and something bad you did even if both were not necessary. As a result, I feel like eval[uation]s have gotten way more nit-picky and consequently anxiety inducing" (P101, In CBME)
  • "Not yet sure. I worry about a lack of flexibility" (P178, Pre-CBME)
  • "I think it will encourage feedback more often. However, I worry about how difficult it will be to encourage staff to actually participate and be engaged in more frequent feedback" (P190, Pre-CBME)

Table 6. Additional quotations for each theme.

Theme 1: Program outcome concerns

a) Administrative burden (sample code: Greater administrative burden)
  • "Higher administrative burden on residents to pursue and complete evaluations." (P99, Pre-CBME)
  • "More time spent doing paperwork" (P132, In CBME)

b) Assessment challenges (sample code: Assessment completion)
  • "I fear there will be a significant gap in supervisor skill, and willingness to dedicate the time needed, to assess milestones/EPAs when CBME is first implemented." (P245, Pre-CBME)
  • "I am in R1 in a program transitioning to CBD for my year. It is a learning curve for staff in the Emergency department as they don't like having to actually observe me taking histories or doing physicals." (P45, In CBME)

c) Quality of feedback (sample codes: Quality feedback will be challenging; Hesitant to be constructive)
  • "But attending physicians are often very busy, measures need to be in place so that the quality of feedback is not compromised." (P122, Pre-CBME)
  • "Assessors are hesitant to give constructive feedback" (P59, In CBME)

d) Faculty engagement and buy-in (sample code: Staff engagement)
  • "However, I worry about how difficult it will be to encourage staff to actually participate and be engaged in more frequent feedback." (P188, Pre-CBME)
  • "Now, a year later, that has somewhat passed, and it is more of a hassle to track down staff / convince them to fill EPAs that it is a benefit." (P105, In CBME)

Theme 2: Changes

a) Formalized assessment process (sample code: More documentation)
  • "I anticipate I will spend more time completing documentation of clinical experiences/EPAs" (P25, Pre-CBME)
  • "There will be more time spent documenting tasks" (P114, Pre-CBME)

b) Teaching and learning process (sample codes: Tailored teaching; No difference in teaching; Decreased learning)
  • "I think I will get more tailored teaching to my level of understanding" (P177, Pre-CBME)
  • "While it was fresh in the first few months of introduction, staff would seek out opportunities to involve me in more complex cases (that aligned with complex EPAs). Now, a year later, that has somewhat passed, and it is more of a hassle to track down staff / convince them to fill EPAs that it is a benefit. It seems, aside from having forms filled, the teaching is not much different." (P105, In CBME)
  • "I find more of my time is spent on the logistics of CBME (chasing down evaluations, keeping track of my progress, competence reports etc.) rather than actually learning the content behind EPAs." (P4, In CBME)

Theme 3: Emotional responses

a) Negative feelings (sample codes: More stress; Worried; Stressed)
  • "Added stress of ensuring frequent assessments by staff in a busy clinic or unit" (P115, Pre-CBME)
  • "I am personally worried about staff not being willing or not having time to the CBME EPAs and evaluations which could potentially blow back on residents and making it hard for them to complete EPAs not from lack of trying, but from time constraints or other limitations outside of our control." (P55, Pre-CBME)
  • "Trying to obtain the required competencies, explain CBME milestones/EPAs to preceptors, arrange direct observation, and have preceptors complete the required forms has been by far the most stressful part of residency so far. It has taken away from my learning experiences, as I target my patient encounters towards seeing the patients I need to check-off from my list rather than those that are interesting, and it deters me from seeing challenging patients that I know I would receive a poor evaluation for" (P77, In CBME)

Theme 1: Program outcome concerns

Many residents were concerned about administrative burden, assessment challenges, feedback quality, and faculty engagement and buy-in (Table 3). However, residents in the pre-CBME group were more concerned about increased paperwork and administrative burdens, while residents experiencing CBME identified the administrative burden as a stressor.

Concerns were raised about the assessment process, such as assessment completion during clinical time. Whilst some CBME residents reported challenges with assessment completion and direct observation, others described increased direct observation as a benefit. Another concern was how feedback quality would be impacted, as some pre-CBME residents anticipated better feedback whereas some CBME residents reported no change in their feedback.

Residents anticipated challenges due to lack of engagement and buy-in from staff. Pre-CBME residents discussed issues with assessment completion before CBME which were expected to be an ongoing challenge. Residents in CBME expressed lack of engagement and buy-in to a lesser extent. Some residents reported additional stress from CBME implementation.

Theme 2: Changes

Residents identified processes that may change or had changed, including formalized assessment, teaching, and learning (Table 4). The pre-CBME residents focused on how the assessment process would become more formalized through direct observation and increased documentation. A few CBME residents reported that the increased documentation and requirements for assessment caused frustration.

Some CBME residents discussed how teaching had not changed following CBME implementation. Pre-CBME residents also discussed anticipated changes to teaching, as many were concerned about receiving less teaching due to assessment processes. Some residents identified the need to change their own teaching styles.

CBME residents reported that they were better able to direct their learning following implementation. Some pre-CBME residents indicated that they were excited for more tailored learning experiences and to focus on specific areas of need. They also anticipated a more individualized learning pace. There were clearly positive and negative implications for assessment, teaching, and learning in a CBME environment.

Theme 3: Emotional responses

A theme related to emotional responses to CBME was identified (Table 5). Generally, these emotions were negative and included stress, uncertainty, anxiety, and worry. Some pre-CBME residents were anticipating increased stress. The experience of added stress was also reported by CBME residents. A few residents expressed how the added stress could negatively impact patient care given that they were focused on completing entrustable professional activities (EPAs), rather than on patients and clinical flow. The amount of stress experienced varied among residents; some reported a high level of stress and anxiety, and others did not mention stress.

Other feelings included uncertainty as some residents were unsure how CBME would impact them. Sometimes the worry was in relation to faculty buy-in, engagement, and assessment completion. Despite not being asked about their emotions, residents shared how they were feeling about CBME.

Discussion

Canadian residents have mixed feelings and experiences about CBME. Our findings suggest that apprehensions held by pre-CBME residents were less concerning to CBME trainees. Statements suggesting difficulties with time constraints and engagement related to assessment completion had significantly lower agreement amongst CBME residents. However, while CBME residents were significantly less concerned about variability in defining competence, the mean score of 3.14/5 (compared to 2.83/5 for pre-CBME residents) indicates that there is still work required to instill confidence in trainees regarding the ability of the CBME system to provide clear definitions and measurable metrics of competence. Residents shared tensions arising from assessment processes, which is understandable given that within the CBME model we remain situated in an accountability paradigm and are using assessments for high-stakes decisions. However, future work should focus on identifying such tensions and documenting how residents and faculty are navigating them 19 . Further, we need to emphasize learning and de-emphasize measurement within CBME 20 .

Self-reflection and self-directed learning received strong agreement scores across groups, suggesting that this is a major benefit of CBME. Conversely, CBME residents were less optimistic that individual learning needs and variability in learning pace were being accommodated, which suggests that CBME still exists within the logistical demands and framework of rotation schedules, call coverage, and patient care. Reassuringly, both groups reported high satisfaction with their training.

Narrative responses were provided by 86% of respondents, yielding rich qualitative data. These responses confirmed the quantitative findings: concerns regarding the logistical and administrative aspects of CBME are still present post-implementation, and the time demands of assessment and documentation inherent in CBME, although perhaps not quite as onerous as the pre-CBME group anticipated, remain significant. Residents are aware of the magnitude of the change required, requesting adaptation of assessment and teaching strategies.

The negative emotional responses expressed by participants were surprising and concerning, although one study highlighted emotional responses of trainees during CBME implementation, including frustration due to assessment processes 21 . Additionally, departmental culture may contribute to these emotions if residents feel that the focus is on performance and assessment rather than on learning and improvement 22 . Burnout among residents is an issue, with one study reporting rates of 42% among second-year residents 23 . Emotional exhaustion is one of three dimensions of burnout 24 , and the risk is increased when significant job responsibility is coupled with low autonomy 25 . Although one goal of CBME is increased learner autonomy through enhanced self-reflection skills and self-directed learning, CBME residents bear increased responsibility for obtaining assessments. They have little autonomy, however, in ensuring that assessments are completed, or over the CBME transition itself. This additional assessment burden, coupled with already-significant training demands, may increase their risk of burnout.

Future research should examine CBME post-implementation, identifying resident reactions and required supports. Residents anticipate and experience benefits from CBME, but have both theoretical and practical concerns, which add emotional and administrative stress. It is important that training programs be aware of resident apprehensions regarding CBME and seek to proactively address these through open, two-way communication before, during, and after implementation. This may help alleviate concerns and uncertainty and increase resident engagement and co-production, which are instrumental to successful implementation 9, 26 . Additionally, program leaders should acknowledge the potential for resident burnout, which may be exacerbated by the stress surrounding CBME implementation, and apply strategies to prevent, identify, and manage added stress.

Limitations

The sampling frame is unknown given that the authors were dependent upon institutional PGME offices to recruit participants. Demographic data were not collected in order to maintain anonymity. The respondents represent a sample of the resident population in Canada, and results may not be generalizable across the population. Additionally, some residents may have had little experience with CBME. Further, participants had varied backgrounds, and we were unable to elicit whether there was a shared baseline knowledge of CBME. Lastly, the limited ability of a survey to provide deep exploration of qualitative themes must be acknowledged. However, it is important to recognize that participants provided rich narrative responses.

Conclusion

This exploratory study is the largest known examination of resident perspectives of CBME in Canada, and the mixed-methods design allowed for thematic exploration that supported and supplemented the quantitative data. Training programs vary, and resident perspectives on CBME will depend on their individual and institutional experiences. However, the widespread reporting of administrative stressors associated with CBME and of emotional responses to its implementation suggests that programs must be acutely aware of the risk of resident burnout and take proactive steps to address it.

List of abbreviations

CBME      Competency-based medical education

PGME      Post-graduate medical education

RCPSC     Royal College of Physicians and Surgeons of Canada

Acknowledgements

We would like to thank all residents who participated in this study.

Funding Statement

This work was supported by funding from the Robert Maudsley Scholarship and Research Fund awarded to Dr. Steve Mann.

The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

[version 1; peer review: 2 approved, 1 approved with reservations]

Data availability

Underlying data

The data has not been made publicly available due to our institutional ethics clearance which outlines that only members of the research team would have access to the full raw and identifiable dataset. For those who are interested in accessing the data, please contact the corresponding author indicating your interest and outlining your reasons for access. The corresponding author will then review the institutional ethics clearance and identify next steps. The additional sample quotations provided are available for those who would like to review more of the data.

Extended data

Mendeley Data: CBME Resident Survey. https://doi.org/10.17632/xtr9dbt56d.1 16 .

This project contains the following extended data:

  • CBME_Environment_Final Survey.docx. (The file provides the final survey used to collect the data described in this study).

Data are available under the terms of the Creative Commons Attribution 4.0 International license (CC-BY 4.0).

References

  • 1. RCPSC: Competence by Design. 2020.
  • 2. Frank JR, Snell LS, Cate OT, et al.: Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–45. doi: 10.3109/0142159X.2010.501190
  • 3. Ferguson PC, Caverzagie KJ, Nousiainen MT, et al.: Changing the culture of medical training: An important step toward the implementation of competency-based medical education. Med Teach. 2017;39(6):599–602. doi: 10.1080/0142159X.2017.1315079
  • 4. Jippes E, Van Luijk SJ, Pols J, et al.: Facilitators and barriers to a nationwide implementation of competency-based postgraduate medical curricula: A qualitative study. Med Teach. 2012;34(8):e589–e602. doi: 10.3109/0142159X.2012.670325
  • 5. Ten Cate O, Billett S: Competency-based medical education: Origins, perspectives and potentialities. Med Educ. 2014;48(3):325–32. doi: 10.1111/medu.12355
  • 6. Altahawi F, Sisk B, Poloskey S, et al.: Student perspectives on assessment: Experience in a competency-based portfolio system. Med Teach. 2012;34(3):221–5. doi: 10.3109/0142159X.2012.652243
  • 7. Cilliers FJ, Schuwirth LW, Adendorff HJ, et al.: The mechanism of impact of summative assessment on medical students’ learning. Adv Health Sci Educ Theory Pract. 2010;15(5):695–715. doi: 10.1007/s10459-010-9232-9
  • 8. Daniels VJ, Stach J, Sandhu G: Transitioning to competency-based medical education: impact of educational interventions on internal medicine residents’ understanding of the purpose and process. Can Med Educ J. 2019;10(4):e96–e98.
  • 9. Mann S, Hastings Truelove A, Beesley T, et al.: Resident perceptions of Competency-Based Medical Education. Can Med Educ J. 2020;11(5):e31–e43. doi: 10.36834/cmej.67958
  • 10. Gauthier S, Melvin L, Mylopoulos M, et al.: Resident and attending perceptions of direct observation in internal medicine: a qualitative study. Med Educ. 2018;52(12):1249–1258. doi: 10.1111/medu.13680
  • 11. Branfield Day L, Miles A, Ginsburg S, et al.: Resident Perceptions of Assessment and Feedback in Competency-Based Medical Education: A Focus Group Study of One Internal Medicine Residency Program. Acad Med. 2020;95(11):1712–1717. doi: 10.1097/ACM.0000000000003315
  • 12. Blades ML, Glaze S, McQuillan SK: Resident Perspectives on Competency-By-Design Curriculum. J Obstet Gynaecol Can. 2020;42(3):242–247. doi: 10.1016/j.jogc.2019.07.005
  • 13. Boet S, Pigford AAe, Naik VN: Program director and resident perspectives of a competency-based medical education anesthesia residency program in Canada: a needs assessment. Korean J Med Educ. 2016;28(2):157–68. doi: 10.3946/kjme.2016.20
  • 14. Allen M: Confidentiality and Anonymity of Participants. In: The SAGE Encyclopedia of Communication Research Methods. Thousand Oaks, CA: SAGE Publications, Inc; 2017.
  • 15. Cook DA, Beckman TJ: Current Concepts in Validity and Reliability for Psychometric Instruments: Theory and Application. Am J Med. 2006;119(2):166.e7–16. doi: 10.1016/j.amjmed.2005.10.036
  • 16. Braund H: CBME Resident Survey. Mendeley Data, V1, Dataset; 2023. doi: 10.17632/xtr9dbt56d.1
  • 17. Braun V, Clarke V: Using thematic analysis in psychology. Qualitative Research in Psychology. 2006;3(2):77–101. doi: 10.1191/1478088706qp063oa
  • 18. Bland JM, Altman DG: Statistics Notes: Cronbach's Alpha. BMJ. 1997;314(7080):572. doi: 10.1136/bmj.314.7080.572
  • 19. Govaerts MJB, van der Vleuten CPM, Holmboe ES: Managing tensions in assessment: moving beyond either-or thinking. Med Educ. 2019;53(1):64–75. doi: 10.1111/medu.13656
  • 20. Eva KW, Bordage G, Campbell C, et al.: Towards a program of assessment for health professionals: from training into practice. Adv Health Sci Educ Theory Pract. 2016;21(4):897–913. doi: 10.1007/s10459-015-9653-6
  • 21. Martin L, Sibbald M, Vegas DB, et al.: The impact of entrustment assessments on feedback and learning: Trainee perspectives. Med Educ. 2020;54(4):328–336. doi: 10.1111/medu.14047
  • 22. Watling CJ, Ginsburg S: Assessment, feedback and the alchemy of learning. Med Educ. 2019;53(1):76–85. doi: 10.1111/medu.13645
  • 23. Dyrbye LN, Burke SE, Hardeman RR, et al.: Association of Clinical Specialty With Symptoms of Burnout and Career Choice Regret Among US Resident Physicians. JAMA. 2018;320(11):1114–1130. doi: 10.1001/jama.2018.12615
  • 24. Thomas NK: Resident Burnout. JAMA. 2004;292(23):2880–9. doi: 10.1001/jama.292.23.2880
  • 25. Linzer M, Visser MR, Oort FJ, et al.: Predicting and Preventing Physician Burnout: Results from the United States and the Netherlands. Am J Med. 2001;111(2):170–5. doi: 10.1016/s0002-9343(01)00814-2
  • 26. Dagnone JD, Buttemer S, Hall J, et al.: Black Ice: ways to get a grip on resident co-production within medical education change. Can Med Educ J. 2020;11(1). doi: 10.36834/cmej.67919
MedEdPublish (2016). 2024 Apr 8. doi: 10.21956/mep.20620.r36182

Reviewer response for version 1

Jackson Hearn 1

Dear Authors,  

Thank you for the opportunity to review your recent article, “Exploring residents’ perceptions of competency-based medical education across Canada: A national survey study,” which addresses a topic meaningful to a broad range of medical educators and residents. Overall, the topic of resident perceptions of CBME (competency-based medical education) is relevant to programs implementing CBME.

---

Summary: 

This article presents data from a survey of 375 residents regarding perceptions of CBME (Competency Based Medical Education). The authors use closed ended (Likert scale) and open ended (typed narrative) responses. The respondents were grouped into either “pre-CBME” or “in CBME” groups and responses were compared using a mean difference of Likert items. Mann-Whitney U was used as a non-parametric comparison test. The authors conducted a thematic analysis from narrative responses. They identify the themes of “program outcome concerns, changes, and emotional responses,” and present representative quotes from narrative responses.  

Strengths:  

The sample is large and provides information describing participants' perceptions of various aspects of CBME. The authors’ qualitative analysis should be appreciated for its attention to the perspectives of residents who are (and will be) directly affected by implementation of CBME in training. Carefully evaluating resident perspectives is a crucial element in evaluating the quality of various elements of CBME implementation. The authors provide their survey instrument for review and ask about a number of important concerns regarding CBME, such as faculty interactions. The authors present compelling and illuminating quotations from residents. Their study improves our understanding of resident perspectives beyond residents' responses to survey items. The quotes provided from the responses are compelling and engaging and support the codes and themes they propose. One particularly interesting idea, found in several supporting quotes, is the concern for ineffective implementation of CBME or persistent challenges which exist regardless of CBME (e.g., “the staff there don’t really care,” “there is a negative outlook on assessment”). I think this offers a compelling area for further exploration.

Areas for Improvement: 

The authors sampled residents from 10 Canadian institutions. The confidentiality provided by eschewing demographic data limits the degree to which sampling methods can be evaluated. For instance, the number of respondents from each institution is not given, which may obscure a pattern of low response rates at various institutions, or a pattern in which institutions provided respondents (e.g. urban vs rural, many vs few residents). While collecting demographic data may enable further insights (through methods such as subgroup analysis), this must be balanced against the utility of maintaining confidentiality for the participants. In other words, we do not know a lot about the participants, but that is precisely the point of confidentiality!  

Regarding the quantitative analysis:  

First, I would suggest that the use of the mean to represent central tendency for interval data (such as Likert scale responses) is controversial [1]. While Likert scales (consisting of multiple Likert items) have been considered interval data, individual Likert items remain ordinal data and must be analyzed as such. “Furthermore, because the numbers derived from Likert scales represent ordinal responses, presentation of a mean to the 100th decimal place is usually not helpful or enlightening to readers.” [2]. The authors use means of individual items in their analysis of survey results but do not display frequency distributions or report percentages of responses to compare response patterns for single Likert items. The items are not pooled into Likert scales in the analysis, which could allow the statistical comparisons the authors propose. Including measures of variability (or displaying data in a histogram or Likert chart) might better illuminate patterns among respondents. We might ask whether participants favor extreme responses or cluster around the mean. This could be addressed by generating a histogram showing response frequencies or a Likert chart showing the proportions of responses at each level. In summary, I would encourage a review of the statistical methods utilized with an experienced psychometrician to ensure the analytical methods are sound. I readily admit my training is not as a psychometrician, and the analyses conducted by the authors may be appropriate given particular features of this survey, study design, or data which are not in this version of the article.

Second, the authors utilize Cronbach’s alpha as a measure of reliability; however, normally distributed data are assumed when using this measure. The authors report that the sample violated normality assumptions. In addition, Cronbach’s alpha does not necessarily establish the reliability or internal consistency of responses when measuring different factors (such as faculty time constraints or definitions of competence). This could be improved by modifying the survey methodology to generate scales for multiple factors.  

Regarding qualitative methods:

Providing the codebook would enable more engagement with the qualitative analysis performed. This could be included in an appendix, as the survey instrument is. Theoretically, the authors hint at a relationship between the three themes (“Residents anticipate and experience benefits from CBME, but have both theoretical and practical concerns, which add emotional and administrative stress”); or, put another way, that concerns about CBME drive emotional and administrative stress. The quotes in support of theme 3 may better support the simpler idea that residents experience stress specifically related to the tension between completing CBME documentation and competing clinical workload (rather than concerns about CBME more generally).

Overall, I thank the authors for their diligent attention to gathering a wide range of perspectives from hundreds of residents and thoughtfully analyzing narrative responses on the topic. In the world of medical education, the importance of considering CBME from the viewpoint of residents cannot be overstated.

Have any limitations of the research been acknowledged?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Yes

If applicable, is the statistical analysis and its interpretation appropriate?

Partly

Are all the source data underlying the results available to ensure full reproducibility?

No

Is the study design appropriate and is the work technically sound?

Yes

Are the conclusions drawn adequately supported by the results?

Partly

Are sufficient details of methods and analysis provided to allow replication by others?

Partly

Reviewer Expertise:

I study teaching communication to residents, resident assessment, and clinical competency committees.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard, however I have significant reservations, as outlined above.

References

  • 1. Resolving the 50-year debate around using and misusing Likert scales. Med Educ. 2008;42(12):1150–2. doi: 10.1111/j.1365-2923.2008.03172.x
  • 2. Analyzing and interpreting data from Likert-type scales. J Grad Med Educ. 2013;5(4):541–2. doi: 10.4300/JGME-5-4-18
MedEdPublish (2016). 2024 Mar 12. doi: 10.21956/mep.20620.r36042

Reviewer response for version 1

Ciraj Ali Mohammed 1

This study investigates Canadian residents' perceptions of competency-based medical education (CBME) highlighting areas of strength, and improvement within the CBME framework. The work contributes to a culture of continuous improvement in medical education, fostering transparency, accountability, and resident satisfaction in the field of CBME.

Research of this nature can guide medical educators in refining curriculum design, assessment methods, and assist administrators in enhancing support structures that better align with the needs and expectations of residents

Here are a few points that may be considered to help the researchers enhance the robustness of the study and provide a more comprehensive understanding of residents' experiences within the CBME framework.

Sampling: You mention that a total of 10 Canadian institutions circulated the survey. What was the basis of selection for these institutions?

How did you ensure a representative sample of residents across various specialties and training levels? What was the method employed?

Data collection: The qualitative component appears limited, hindering a deep exploration of residents' lived experiences. Focus group discussions or structured interviews conducted with selected residents would have shed more light on the nature of the concerns and ways of mitigating them.

Analysis: 

Did you have a provision to conduct subgroup analyses based on demographic factors (e.g., specialty, training level) to identify patterns and variations in perceptions? This may help in avoiding any unnecessary generalization of the nature of responses.

How did you address confounding variables such as individual characteristics, personal experiences, or concurrent life events that would have influenced residents' perceptions independently of CBME?

Discussion: 

In light of your findings briefly suggest in the discussion: 

a) Wellbeing measures proposed to alleviate stress and promote wellbeing among residents 

b) Faculty development  measures that will be useful in the implementation of the CBME approach

c) Plans for including residents' voices in the decision-making process

Annexures: 

Include the consensus-built codebook used in the study 

Reviewer's Recommendations: By providing evidence-based recommendations, the research serves as a valuable resource for institutions, policymakers, and educators striving to optimize CBME across the globe and hence can be accepted with possible inclusion/revision of suggested data/evidence.

Have any limitations of the research been acknowledged?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Yes

If applicable, is the statistical analysis and its interpretation appropriate?

Yes

Are all the source data underlying the results available to ensure full reproducibility?

Partly

Is the study design appropriate and is the work technically sound?

Yes

Are the conclusions drawn adequately supported by the results?

Yes

Are sufficient details of methods and analysis provided to allow replication by others?

Partly

Reviewer Expertise:

I have two decades of experience in Health Professions education with a special focus on active learning, PBL, faculty development, CBME, assessment, and accreditation.  I have served as a change management expert specifically in implementing educational intervention.

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

MedEdPublish (2016). 2024 Mar 6. doi: 10.21956/mep.20620.r36045

Reviewer response for version 1

Robert Cooney 1

Dear Authors,

I have reviewed your article "Exploring residents' perceptions of competency-based medical education across Canada: A national survey study" [version 1; peer review: awaiting peer review] and overall I find it to be a well-conducted study that provides important insights into how Canadian residents view competency-based medical education (CBME). I believe the article has academic merit and will be an important contribution to the literature on this topic. However, I do have some suggestions for improvement, which I outline below.

Summary:

The study collected survey data from 375 medical residents across Canada to examine their perceptions and experiences with the transition to CBME. The quantitative data was analyzed appropriately and the qualitative data from open-ended responses yielded rich insights through thematic analysis. Three main themes emerged: program outcome concerns, changes resulting from CBME, and emotional responses to CBME.  

Strengths:

- Large national sample across multiple programs and specialties

- Mixed-methods design allowing for exploration of quantitative trends as well as deeper insights from qualitative data

- Transparent reporting of methods, analysis, and researcher reflexivity 

- Findings align with and expand on existing literature regarding resident perceptions of CBME

- Highlights important areas for programs to focus on, such as addressing logistical challenges, faculty buy-in, and resident stress/burnout risk

Areas for Improvement:

1. Reproducibility and Details for Replication 

While the methods are described thoroughly overall, replication will be difficult.

- The survey instrument is provided as an appendix, which is a helpful first step. I would like to see:

- Details on the sampling approach and response rate, which are currently lacking, specifically:

   - How institutions/programs were selected for participation 

   - What was the total eligible population for each group (pre-CBME and CBME residents)

   - What were the response rates for each group relative to the eligible population (I acknowledge this is likely unknowable given the structure of how the survey was administered and the decision to not collect demographic data).

Providing these additional sampling and response details is important for assessing potential selection bias and better understanding the representativeness of the sample. This would also help to know the total possible pool of respondents. 

- More details on the qualitative coding process would be helpful. Again, the codebook could also be offered as a supplementary appendix and it would be helpful to know what theory guides the qualitative process (grounded theory, etc). Guessing this was a simple thematic analysis vs narrative analysis but it’s always helpful to spell this out in writing.

Main Revisions Needed:

To address the key points above and improve the scientific merit of the study, I would recommend the authors:

1) Provide additional details on the sampling approach, population sizes, and response rates for context.

2) Expand on the description of the qualitative coding process (consider supplemental appendix). Also, which type of qualitative theory informed your work?

Other than those points, the article is very well-written and follows a clear structure. The literature review appropriately sets up the study's rationale and the limitations are discussed transparently. With the additional details and data transparency suggested above, I believe this article would make a strong contribution to the literature.

I hope these comments are helpful for revising the manuscript.

Have any limitations of the research been acknowledged?

Yes

Is the work clearly and accurately presented and does it cite the current literature?

Yes

If applicable, is the statistical analysis and its interpretation appropriate?

Yes

Are all the source data underlying the results available to ensure full reproducibility?

Partly

Is the study design appropriate and is the work technically sound?

Yes

Are the conclusions drawn adequately supported by the results?

Yes

Are sufficient details of methods and analysis provided to allow replication by others?

Partly

Reviewer Expertise:

Medical Education; educational technology, health systems science, faculty development

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.
