Abstract
Background
A key component of competency‐based medical education is workplace‐based assessment, which includes observation (direct or indirect) of residents. Direct observation has been emphasized as an ideal form of assessment, yet challenges have been identified that may limit its adoption. At present, it remains unclear how often direct and indirect observation are being used within the clinical setting. The objective of this study was to describe patterns of observation in an emergency medicine competency‐based program 2 years postimplementation.
Methods
Emergency medicine residents (n = 19) recorded the type of observation they received (direct or indirect) following workplace‐based entrustable professional activity (EPA) assessments from December 15, 2019, to April 30, 2020. Assessment forms were reviewed and analyzed to describe patterns of observation.
Results
Assessments were collected on all 19 eligible residents (100% participation). A total of 1,070 EPA assessments were completed during the study period, of which 798 (74.6%) had the type of observation recorded. Of these recorded observations, 546 (68.4%) were directly observed and 252 (31.6%) were indirectly observed. The length of written comments contained within assessments following direct and indirect observation did not differ significantly. There was no significant association between resident gender and observation type or resident stage of training and observation type. Certain EPA assessments showed a clear preference toward either direct or indirect observation.
Conclusions
To the best of our knowledge, this study is the first to report patterns of observation in a competency‐based residency program. The results suggest that direct observation can be quickly adopted as the primary means of workplace‐based assessment. Indirect observation comprised a sizeable minority of observations and may be an underrecognized contributor to workplace‐based assessment. The preference toward either direct or indirect observation for certain EPA assessments suggests that the entrustable professional activity itself may influence the type of observation.
Keywords: competency‐based medical education, observation, workplace‐based assessment
INTRODUCTION
Over the past 20 years, there has been a global movement toward competency‐based medical education (CBME). 1 A major component of CBME is workplace‐based assessment (WBA), which refers to assessment of resident competence in clinical settings. 2 Within Canadian specialist CBME programs, resident progression is sequenced into four stages: transition to discipline, foundations of discipline, core of discipline, and transition to practice. 3 Each stage is associated with a unique set of specialty‐specific entrustable professional activities (EPAs), which represent the primary form of WBA (Data Supplement S1, Appendix S1, available as supporting information in the online version of this paper, which is available at http://onlinelibrary.wiley.com/doi/10.1002/aet2.10591/full). Both residents and supervisors have an important role in the completion of EPA assessments in the workplace (Figure 1). Through EPA assessments, residents demonstrate competence in each stage of training before promotion within the residency program. 4 For authentic judgments of competence to occur in the workplace, observation of resident performance has been promoted. 2
FIGURE 1.

Process for EPA selection, observation, and documentation
Observation of performance has been categorized into two types: direct and indirect. 5 Direct observation has been described as the process of watching residents work to develop an understanding of how they apply their knowledge and skills to clinical practice. 6 Conversely, indirect observation comprises observations that occur when the supervisor has not directly watched the resident perform the task being assessed. 5 Examples of indirect observation include listening to an oral case presentation or reviewing documentation. 5 , 6 Following indirect observation, assessment of performance is based on inferences drawn from surrogate data. 7
Direct observation is viewed as a key assessment strategy in CBME, and much of the assessment literature to date has focused on direct rather than indirect observation. 1 , 2 , 6 Despite this, adoption of direct observation has remained challenging due to multiple barriers, including resident concerns that they are burdening their supervisors and supervisor fears of decreasing resident autonomy and resident–supervisor trust. 8 , 9 , 10 Given the emphasis on direct observation, describing patterns of observation within a new CBME program will serve as an important marker of fidelity of implementation and of whether proposed barriers to direct observation impact its adoption in emergency medicine. The objective of this study was to describe patterns of observation in an emergency medicine competency‐based residency program 2 years postimplementation.
METHODS
Study setting and population
This study was conducted in the department of emergency medicine (DEM) at The Ottawa Hospital in Ottawa, Ontario, Canada. The DEM is a large, academic, tertiary care emergency department that serves as the major training site for emergency medicine competency‐based residents affiliated with the University of Ottawa. The study population consisted of all enrolled emergency medicine competency‐based residents (n = 19) during the study period (December 15, 2019–April 30, 2020).
Data collection
Beginning on December 1, 2019, the DEM instituted a policy whereby it became mandatory for residents to indicate whether EPA assessments were completed following either direct or indirect observation. Residents were instructed to record the observation type within the written comment field on the electronic EPA assessment form (Appendix S2). To facilitate this policy change, residents received definitions and examples for each observation type that were derived from the Royal College of Physicians and Surgeons of Canada (RCPSC; Appendix S3). 5 The RCPSC is the governing and accrediting body for specialist training in Canada. Residents were informed of the policy change both in person and through their institutional email. A monthly reminder email regarding this policy was sent to residents during the study period.
Data analysis
All EPA assessment forms completed during the study period were identified using the DEM’s electronic database and manually reviewed by one author (JML). A two‐tailed t‐test was performed to compare the mean number of words in the EPA assessment written comment field between observation types. Chi‐square analyses were used to explore the relationships between observation type and both resident gender and stage of training (PGY year). Statistical analyses were conducted using IBM SPSS Statistics for Windows Version 23.0 (IBM Corp.). This study received ethical approval from the Ottawa Hospital Research Ethics Board.
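For readers who wish to check the word‐count comparison, a two‐sample t‐test can be computed directly from summary statistics. The sketch below is a minimal pure‐Python illustration, not the authors' SPSS procedure: the function name is ours, the pooled‐variance (Student's) variant and the normal approximation to the two‐tailed p‐value are assumptions (the paper does not state which t‐test variant was used), and the inputs are the group sizes, means, and SDs reported in the Results.

```python
import math

def pooled_t_from_summary(n1, mean1, sd1, n2, mean2, sd2):
    """Two-sample pooled-variance t statistic from summary statistics.
    The two-tailed p-value uses a normal approximation, which is close
    here because the degrees of freedom (n1 + n2 - 2) are large."""
    df = n1 + n2 - 2
    pooled_var = ((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    t = (mean2 - mean1) / se
    p = math.erfc(abs(t) / math.sqrt(2))  # 2 * (1 - Phi(|t|))
    return t, p

# Written-comment word counts reported in the Results:
# direct observation (n = 546) vs. indirect observation (n = 252).
t, p = pooled_t_from_summary(546, 59.86, 34.52, 252, 63.80, 32.40)
# p rounds to 0.13, consistent with the reported nonsignificant result
```

Under these assumptions the computed p‐value matches the p = 0.13 reported below; a Welch (unequal‐variance) t‐test on the same summaries gives a slightly smaller p‐value.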
RESULTS
Entrustable professional activity assessment data were collected on all 19 eligible residents (100% participation). A total of 1,070 EPA assessments were completed during the data collection period, of which 798 (74.6%) had the type of observation recorded. Of the assessments with observation type recorded, 546 (68.4%) were directly observed and 252 (31.6%) were indirectly observed. Written comment word count for direct (mean ± SD = 59.86 ± 34.52) and indirect (mean ± SD = 63.80 ± 32.40) observation did not differ significantly (p = 0.13). There was no significant association between resident gender and observation type (χ²(1, n = 798) = 0.04, p = 0.84), or resident stage of training (PGY year) and observation type (χ²(1, n = 798) = 3.54, p = 0.06). Table 1 provides further characteristics of EPA assessments completed following each type of observation.
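The chi‐square statistics above can be reproduced from the counts in Table 1. The following is a minimal pure‐Python sketch, not the authors' code: the function name is ours, and the use of Pearson's chi‐square without Yates' continuity correction is an assumption (it happens to match the reported values).

```python
import math

def chi_square_2x2(table):
    """Pearson chi-square statistic (no continuity correction) for a
    2x2 table of observed counts, plus the p-value for df = 1."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    # For df = 1, the chi-square survival function reduces to erfc.
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Counts from Table 1: columns are (direct, indirect).
gender = [[334, 156], [212, 96]]   # rows: male, female
stage = [[360, 183], [186, 69]]    # rows: foundations (PGY1), core (PGY2)

chi2_gender, p_gender = chi_square_2x2(gender)  # ≈ 0.04, p ≈ 0.84
chi2_stage, p_stage = chi_square_2x2(stage)     # ≈ 3.54, p ≈ 0.06
```

Both results agree with the reported values to two decimal places.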
TABLE 1.
Characteristics of observed EPAs
| | Direct (%) | Indirect (%) | Total |
|---|---|---|---|
| Resident gender | | | |
| Male (n = 12) | 334 (68.1) | 156 (31.8) | 490 |
| Female (n = 7) | 212 (68.8) | 96 (31.1) | 308 |
| Stage of training (PGY) | | | |
| Foundations (PGY1; n = 10) | 360 (66.2) | 183 (33.7) | 543 |
| Core (PGY2; n = 9) | 186 (72.9) | 69 (27.0) | 255 |
| EPA type | | | |
| TD 1 | 1 (33.3) | 2 (66.7) | 3 |
| TD 2 | 5 (41.7) | 7 (58.3) | 12 |
| TD 3 | 2 (100.0) | 0 (0.0) | 2 |
| FD 1 | 59 (65.6) | 31 (34.4) | 90 |
| FD 2 | 48 (36.9) | 82 (63.1) | 130 |
| FD 3 | 20 (83.3) | 4 (16.7) | 24 |
| FD 4 | 76 (75.2) | 25 (24.8) | 101 |
| CD 1 | 26 (86.7) | 4 (13.3) | 30 |
| CD 2 | 14 (100.0) | 0 (0.0) | 14 |
| CD 3 | 80 (100.0) | 0 (0.0) | 80 |
| CD 4 | 27 (93.1) | 2 (6.9) | 29 |
| CD 5 | 23 (43.4) | 30 (56.6) | 53 |
| CD 6 | 23 (57.5) | 17 (42.5) | 40 |
| CD 7 | 15 (78.9) | 4 (21.1) | 19 |
| CD 8 | 6 (66.7) | 3 (33.3) | 9 |
| CD 9 | 10 (32.3) | 21 (67.7) | 31 |
| CD 10 | 2 (28.6) | 5 (71.4) | 7 |
| CD 11 | 0 (0.0) | 2 (100.0) | 2 |
| CD 12 | 1 (100.0) | 0 (0.0) | 1 |
| CD 13 | 94 (91.3) | 9 (8.7) | 103 |
| CD 14 | 9 (90.0) | 1 (10.0) | 10 |
| CD 15 | 5 (62.5) | 3 (37.5) | 8 |
Abbreviations: CD, core of discipline; EPA, entrustable professional activity; FD, foundations of discipline; TD, transition to discipline.
DISCUSSION
This study described patterns of observation in a competency‐based residency program. Direct observation was the primary method of observation, comprising over two‐thirds (68.4%) of all recorded WBAs. Despite being underrepresented in the medical education literature, indirect observation accounted for nearly one‐third (31.6%) of recorded observations, a notable minority.
No significant relationship between observation type and resident gender or resident stage of training (PGY year) was detected. Written comments contained within EPA assessments were, on average, four words longer following indirect observation. This difference was not statistically significant and is unlikely to represent a meaningful educational difference. Certain EPA types showed a clear preference toward either direct or indirect observation, suggesting that the nature of the task itself may influence what type of observation occurs.
Within the workplace, supervisors frequently make entrustment decisions as they aim to balance the provision of safe patient care while simultaneously fostering resident autonomy. 11 Entrustment decisions have been shown to be influenced by several factors including the resident, supervisor, context, task, and resident–supervisor relationship. 11 The decision to directly or indirectly observe an EPA assessment could be viewed as a form of entrustment decision with similar factors influencing supervisors’ observation type decisions. In this case, the EPA assessment would represent the task. Future work is required to better understand what combination of design elements contained within an EPA and external factors result in the decision for an EPA to be directly or indirectly observed.
Previous studies have identified barriers to direct observation, which include resident concerns that they are burdening their supervisors and supervisor fears of decreasing resident autonomy and resident–supervisor trust. 8 , 9 , 10 Barriers to direct observation have the potential to limit its adoption and implementation within the clinical setting. Given these challenges, alternative means of gathering WBA data, such as indirect observation, have garnered increased attention in recent years. 6 , 7 In this study, indirect observation comprised a sizeable minority of recorded observations and may be an underrecognized contributor to WBA within competency‐based programs. Future work should seek to develop a more in‐depth understanding of the role of indirect observation through exploration of resident, supervisor, and competence committee perceptions of the educational value and outcomes of indirect observation. Despite the challenges associated with direct observation described in the literature, the results of this study suggest that these barriers can be overcome and that direct observation can be quickly adopted as the primary means of gathering WBA data in an emergency medicine program.
LIMITATIONS
This study was conducted within a single residency program; therefore, generalizability to other programs should be considered carefully. Future work will seek to explore patterns of observation more broadly across multiple institutions. Further, 272 (25.4%) of completed EPA assessments did not have the type of observation recorded. Limitations of the electronic EPA assessment form required residents to manually record these data. Given that residents were asked to interpret and record the type of observation that occurred, their interpretations of what constitutes direct and indirect observation may have varied. An attempt to mitigate this limitation was made by providing residents with clear definitions and examples of each type of observation (Appendix S3). For observations that included components of both direct and indirect observation, residents were asked to record the type that applied to the majority of the EPA assessment.
Finally, this study only included EPA assessments across the first three stages of training. Given that Canadian emergency medicine residency programs transitioned to CBME in 2018, EPA assessments for the final stage of training were not available because no residents in the program had been promoted to this stage. It is possible that EPAs completed later in training may lean more heavily toward indirect observation as greater attention is given to fostering autonomy.
CONCLUSIONS
To the best of our knowledge, this study is the first to report patterns of direct and indirect observation in a competency‐based residency program. The results of this study suggest that direct observation can be quickly adopted as the primary means of gathering workplace‐based assessment data. Indirect observation comprised a sizeable minority of recorded observations and may be an underrecognized contributor to workplace‐based assessment. The preference toward either direct or indirect observation for certain entrustable professional activity assessments suggests that the entrustable professional activity itself may influence what type of observation occurs.
CONFLICT OF INTEREST
The authors have no potential conflicts to disclose.
AUTHOR CONTRIBUTIONS
Jeffrey M. Landreville conceived the study; acquired, analyzed, and interpreted the data; and drafted the manuscript. Jason R. Frank and Warren J. Cheung conceived the study, interpreted the data, and provided revisions of the manuscript.
Supporting information
Data Supplement S1
Landreville JM, Frank JR, Cheung WJ. Does direct observation happen early in a new competency-based residency program? AEM Educ Train. 2021;5:e10591. doi:10.1002/aet2.10591
REFERENCES
- 1. Ten Cate O. Competency‐based postgraduate medical education: past, present and future. GMS J Med Educ. 2017;34(5):Doc69.
- 2. Harris P, Bhanji F, Topps M, et al. Evolving concepts of assessment in a competency‐based world. Med Teach. 2017;39(6):603‐608.
- 3. Sherbino J, Bandiera G, Doyle K, et al. The competency‐based medical education evolution of Canadian emergency medicine specialist training. CJEM. 2020;22(1):95‐102.
- 4. Thoma B, Hall AK, Clark K, et al. Evaluation of a national competency‐based assessment system in emergency medicine: a CanDREAM study. J Grad Med Educ. 2020;12(4):425‐434.
- 5. Gofton W, Dudek N, Barton G, Bhanji F. Workplace‐Based Assessment Implementation Guide: Formative Tips for Medical Teaching Practice. Ottawa, ON: The Royal College of Physicians and Surgeons of Canada; 2017.
- 6. LaDonna K, Hatala R, Lingard L, Voyer S, Watling C. Staging a performance: learners’ perceptions about direct observation during residency. Med Educ. 2017;51(5):498‐510.
- 7. Landreville J, Cheung W, Hamelin A, Frank J. Entrustment checkpoint: clinical supervisors’ perceptions of the emergency department oral case presentation. Teach Learn Med. 2019;31(3):250‐257.
- 8. Holmboe E. Faculty and the observation of trainees’ clinical skills: problems and opportunities. Acad Med. 2004;79(1):16‐22.
- 9. Cheung W, Patey A, Frank J, Mackay M, Boet S. Barriers and enablers to direct observation of trainees’ clinical performance: a qualitative study using the theoretical domains framework. Acad Med. 2019;94(1):101‐114.
- 10. Watling C, LaDonna K, Lingard L, Voyer S, Hatala R. ‘Sometimes the work just needs to be done’: socio‐cultural influences on direct observation in medical training. Med Educ. 2016;50(10):1054‐1064.
- 11. Ten Cate O, Hart D, Ankel F, et al. Entrustment decision making in clinical training. Acad Med. 2016;91(2):191‐198.