Journal of Graduate Medical Education. 2013 Mar;5(1):46–53. doi: 10.4300/JGME-D-11-00324.1

Essential Facets of Competence That Enable Trust in Graduates: A Delphi Study Among Physician Educators in the Netherlands

Marjo Wijnen-Meijer, Marieke van der Schaaf, Kirstin Nillesen, Sigrid Harendza, Olle ten Cate
PMCID: PMC3613317  PMID: 24404226

Abstract

Background

There is a need for valid methods to assess the readiness for clinical practice of recently graduated physicians. To develop these methods, it is relevant to know the general features of trainees' performance that facilitate supervisors' trust in their ability to perform critical clinical tasks.

Objective

To discover such essential facets of competence (FOCs), based on the opinion of experienced physician educators.

Methods

We conducted a 2-round Delphi study among 18 experienced physician educators in the Netherlands. The study yielded a list of 25 FOCs. Mean, standard deviation, level of agreement, and skewness were calculated for the rated importance of each FOC for entrustment decisions.

Results

In the first round, means were between 6.50 and 7.00 on a 7-point Likert scale (SD, 0.42–2.18); in the second round, means ranged from 5.45 to 6.90 (SD, 0.3–2.02). The level of agreement was high for 92% of the FOCs in the first round and 100% of the FOCs in the second round.

Conclusions

Our Delphi study found consensus among experts about FOCs that are important for clinical entrustment decisions.


Editor's Note: The online version of this article contains 3 supplemental tables: the adapted calculation of levels of agreement according to the method of De Loe, expert ratings for completeness and clearness, and expert ratings for importance for entrustment decisions of facets of competence in the Delphi study.

What was known

Clinical supervisors need information on attributes of clinical trainees to facilitate decisions to entrust trainees with clinical tasks with indirect supervision or “after the fact” supervision.

What is new

A Delphi study of experts in the Netherlands found consensus on facets of competence that are important for clinical entrustment decisions.

Limitations

Small sample size and use of particular competency frameworks may limit the ability to generalize.

Bottom line

Experts in physician education agreed on general attributes of competence in medical trainees that can support a robust approach for supervisors making clinical entrustment decisions.

Introduction

Obtaining a correct impression of medical residents' and fellows' readiness for clinical practice is important for medical educators. This requires a valid method for the assessment of clinical competence, a topic that has occupied the minds of medical educators for decades. Despite multiple reviews and authoritative proposals,1–6 clinical educators, evaluators, and examiners in practice still have difficulty assessing medical trainees.7 Instruments to assess separate domains of competence in the workplace are scarce,8 and instruments that focus on specific tasks usually rely on simulated conditions to ensure standardization. The mini-clinical evaluation exercise9 and other direct-observation instruments10 are valuable feedback instruments, but the documentation of their psychometric properties is limited to date.10,11 One problem may be a lack of validity of those assessment approaches, because they do not directly consider the important question of "whether a medical trainee is ready for independent practice" but rather focus on the assessment of particular skills. Traditional checklists established for these purposes may not capture the essence of such entrustment decisions, as they insufficiently align with this essential construct.12 In clinical settings, supervisors decide on a daily basis whether to trust a medical trainee with a specific task and to what extent supervision is needed.13–16 Specific tasks pertain to a given procedure or skill that is to be carried out; general features pertain to task-independent characteristics. These general features can be called facets of competence (FOCs).

The aim of this study was to uncover general features of trainees that facilitate clinicians' trust of trainees with critical clinical tasks. The research question was: What do experienced clinical educators consider as essential facets of competence that determine decisions to entrust a trainee with critical clinical tasks? Our goal was not to answer this question for specific tasks, but for critical clinical tasks in general. The answer to this question is relevant when developing an instrument to evaluate medical graduates' readiness for practice.

Methods

Design

We used a Delphi technique to investigate consensus and the degree of agreement among experts. The Delphi technique is a widely accepted method for identifying desired features of professionals by eliciting expert opinions in successive rounds.17–19 An advantage of the Delphi process is that face-to-face meetings are not required, so there is no risk of peer pressure, and experts from different regions can easily participate in the study. The Delphi technique in our study comprised the following procedure: experts were surveyed with electronic questionnaires, and answers were collected, aggregated, and refined over 2 rounds. Although panel members did not know the individual answers of the other participants, after each round general feedback was provided to each panel member by summarizing all judgments of the previous round.

At the start of our study, participants received instructions about the study's general objectives. Next, they were asked to judge preliminary descriptions of general facets of competence of medical trainees and subsequently to judge successive revisions in 2 further rounds.

Participants

Participants comprised experienced physician-educators, acquainted with competence levels of starting residents in the Netherlands. We approached all 24 experienced clinicians in the Netherlands who met the following criteria: (1) holding an academic chair in medical education, (2) working in clinical practice, and (3) supervising medical residents.

Instrument Development

We developed a draft questionnaire with a preliminary list of FOCs drawn from a review of the literature on selected competency frameworks and other relevant publications. Three competency frameworks were analyzed, that is, the Canadian CanMEDS framework,20 the Dutch “Blueprint of Objectives for Medical Schools,”21 and the competency framework of the General Medical Council in the United Kingdom.22 A qualitative analysis of the FOCs in the different frameworks resulted in an initial list with the relevant FOCs.

The first draft of the relevant FOCs was completed through a literature review regarding important FOCs for clinicians. The literature yielded 3 articles,13,23,24 of which two13,23 used a holistic approach to the evaluation of medical graduates' competences. Ginsburg et al23 described characteristics of trainees that influence assessments by supervisors. Sterkenburg et al13 focused on factors that guide decisions of supervisors to trust residents with critical clinical tasks. The third article by Kearney24 described the results of a Delphi study exploring the features of professionalism.

We compared these 6 sources (the 3 frameworks and the 3 articles) and found many similarities in content (Table 1). We also found differences, mainly in the way the descriptions were ordered and labeled, and in their level of detail. From these sources, we constructed a list of 24 FOCs that met the following criteria: content correspondence with the original source, observable and assessable level of detail, and applicability to medical graduates.

Table 1.

Final List of Facets of Competence With References

Table 1 (continued).

Final List of Facets of Competence With References

Each of the resulting FOCs was scored on 2 aspects significant for valid competence descriptions: (1) clarity and completeness of formulation; and (2) importance for the entrustment of critical clinical tasks to starting residents. For each FOC, a 7-point Likert scale25 was developed that pertained to the following statements: (1) The description of this FOC is complete and clear. (2) This FOC is important for the entrustment of critical clinical tasks to beginning residents. Only the 2 endpoints of the scale were labeled: 1 (strongly disagree) and 7 (strongly agree). Respondents were asked to substantiate their answers with comments and to propose improvements for the FOC descriptions. At the end of the questionnaire, respondents were asked to add any new FOCs that they felt were missing.

Data Gathering

Our Delphi study had 2 rounds. For the first round, the list of 24 FOCs was provided electronically to the panelists. On the basis of the scores for completeness and clearness of the descriptions and the experts' suggestions for improvement, all FOC descriptions were slightly modified. Further, 2 of the FOCs were each divided into 2 separate FOCs, and 2 other FOCs were combined into 1. In the second round, a new list of 25 FOCs, together with a summary of the experts' judgments from the first round, was sent to the panelists. Participants were asked the same questions as in the first round. For 3 FOCs, multiple-choice questions were added to solicit opinions on comments from the first round that were contradictory or unclear. Round 2 started 2 weeks after the closing of round 1. In both rounds, the respondents had 2 weeks to complete the questionnaire.

Data Analysis

After each round, a key issue was the decision about whether particular FOCs should be accepted, revised, or deleted. This decision was based on expert ratings on the completeness and clearness of formulation and importance for entrustment decisions. We used experts' mean score to calculate their endorsement of the proposed FOCs.

Next we established the degree of consensus among the experts' judgments by using standard deviations and levels of agreement. For the calculation of the levels of agreement, we used a method described by De Loe26–28 for a 4-point scale, adapted to a 7-point scale. De Loe designates consensus as "medium" if 70% of the scores are given in 2 (of 4) contiguous scale levels. Because of our 7-point scale, we adapted this criterion to 3 (of 7) contiguous scale levels, which is a slightly more stringent judgment (provided as online supplemental material). However, consensus can only be assumed if judgments tend to be unidirectional.29 Therefore, the skewness of the distributions of the ratings was computed to check for symmetry and whether the experts' judgments tended in 1 direction.
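The adapted agreement criterion and the skewness check can be sketched as follows. This is an illustrative reconstruction, not the authors' code, and the panel ratings are hypothetical:

```python
import statistics

def level_of_agreement(ratings, scale_max=7, window=3):
    """Largest share of ratings that fall within any `window` contiguous
    levels of a 1..scale_max scale (adapted De Loe criterion)."""
    best = 0.0
    for low in range(1, scale_max - window + 2):
        levels = range(low, low + window)
        share = sum(r in levels for r in ratings) / len(ratings)
        best = max(best, share)
    return best

ratings = [7, 7, 6, 6, 6, 5, 7, 4, 6, 7]  # hypothetical ratings for one FOC
share = level_of_agreement(ratings)       # 0.9, meets the 70% criterion

# Sample-adjusted (Fisher-Pearson) skewness; a negative value means the
# ratings pile up toward the "strongly agree" end of the scale.
n = len(ratings)
mean = statistics.fmean(ratings)
sd = statistics.stdev(ratings)
skew = (n / ((n - 1) * (n - 2))) * sum(((r - mean) / sd) ** 3 for r in ratings)
```

With these hypothetical ratings, 9 of 10 scores fall within levels 5–7 and the skewness is negative, mirroring the unidirectional pattern the authors require before assuming consensus.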

Our purpose was to revise the FOCs, based on the experts' judgments and comments, until the list satisfied most experts. Therefore, we compiled all written comments from the experts and used them to modify the descriptions of the FOCs. To test the differences between the ratings of the 2 rounds on the completeness and clearness of the formulation, we used 2-tailed dependent t tests.30
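A dependent (paired) t test of this kind can be sketched as below; the two rating vectors are hypothetical, and the resulting t statistic would be compared against a t distribution with n − 1 degrees of freedom:

```python
import math
import statistics

def paired_t(before, after):
    """t statistic and degrees of freedom for a dependent (paired) t test;
    the 2-tailed p-value is then read from a t distribution with df = n - 1."""
    diffs = [b - a for a, b in zip(before, after)]
    n = len(diffs)
    mean_diff = statistics.fmean(diffs)
    sd_diff = statistics.stdev(diffs)  # sample SD of the paired differences
    return mean_diff / (sd_diff / math.sqrt(n)), n - 1

# hypothetical clarity ratings for one FOC from the same experts in rounds 1 and 2
round1 = [5, 6, 4]
round2 = [6, 6, 5]
t, df = paired_t(round1, round2)  # t = 2.0, df = 2
```

The pairing matters: each difference is taken within one expert across rounds, which is why only the 11 experts who participated in both rounds can contribute to such a test.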

In accordance with established national practice in the Netherlands, ethical approval is not required for education studies. We made sure that participants could not be identified from any materials presented to other participants, that participation would cause no plausible harm and would be fully voluntary, and that refusal or withdrawal would not incur any adverse consequences.

Results

Eighteen physician educators from 8 different hospitals and 11 disciplines participated in this study. In the first round, 14 participated in the Delphi study (58%), 13 of whom were men. The average age was 59 years (range, 52–66 years). In the second round, 15 experts responded (63%), including 13 men. The average age was 59 years (range, 49–66 years). The distribution of sex and age in the sample reflects that in the population of physician educators in the Netherlands. Responders in both rounds represented a wide range of surgical and nonsurgical disciplines (anesthesiology, cardiology, general practice, geriatrics, gynecology, internal medicine, neurology, oncology, pediatrics, rheumatology, and surgery). Eleven experts participated in both rounds, 3 in the first round only, and 4 in the second round only.

Table 2 shows the expert ratings on "comprehensiveness and clarity" for the descriptions of the FOCs in both rounds of the Delphi study. In the first round, means ranged from 4.77 to 6.46 and standard deviations from 0.78 to 2.07. The results of the second round were comparable, with means between 4.82 and 6.27 and standard deviations between 1.03 and 2.23. For the first and second rounds, the level of agreement among the experts was medium or high for 83% and 76% of the FOCs, respectively. To test the differences in ratings on comprehensiveness and clarity of the formulations between rounds 1 and 2, we used dependent t tests. This difference was statistically significant for only 1 FOC ("active listening to patients"), based on a 2-tailed t test (t(10) = 1.5, P = .03, r = .64). The mean rating for this FOC in the second round was higher than in the first.

Table 2.

Expert Ratings for "Completeness and Clearness" of Facets of Competence in Delphi Study

We also asked the panel members to indicate on a 7-point Likert scale the importance of each FOC for entrustment decisions. In the first round, these means ranged from 6.50 to 7.00, and in the second round, from 5.45 to 6.90. The level of agreement was high for 92% of the FOCs in the first round and for 100% of the FOCs in the second round (Table 3). In both rounds, and for both the judgments of "comprehensiveness and clarity" and the judgments of "importance," all ratings tended toward the "strongly agree" side of the scale, such that in all cases the skewness was negative. After round 2, minor changes were recommended to the descriptions of 11 FOCs to complete the final list of 25 FOCs (Table 1).

Table 3.

Expert Ratings for "Importance for Entrustment Decisions" of Facets of Competence in Delphi Study

Discussion

This study identifies the facets of trainees' general competence that appear to inform clinical supervisors' decisions to trust trainees with critical tasks. Mapping these facets is important in light of current approaches to assessment in competency-based medical education.31–34

We made a preliminary list of FOCs based on competency frameworks and literature reflecting an international perspective.13,20–24 After a 2-round Delphi study, we constructed a list of 25 FOCs. The most important finding was that the experts agreed about the formulation of these FOCs. They strongly agreed that these FOCs are important or very important for the entrustment of critical clinical tasks to starting residents.

Earlier studies about entrustment decisions have shown that these decision processes are complex and relate to several factors.13,14 These factors include the characteristics and achieved level of competence of the trainees. In the current study we clarified which FOCs of the trainees appear relevant for entrustment decisions.

Our study is limited by the small number of participants. Further evaluation of the identified FOCs in other countries and different educational climates will be important to enhance the generalizability of these findings and to inform understanding of entrustability.35,36 The validity of our findings could be further supported by evaluating actual entrustment decisions and their relationships with trainees' FOCs. Finally, our population was selective and not representative of clinical supervisors in general; participants' views were informed by their academic backgrounds, which could have influenced which FOCs were listed, although we have no specific hypotheses about how this would have biased the results.

Conclusions

We found consensus support by experts for a list of 25 important FOCs for the entrustment of critical tasks to medical trainees. Our results identified general features of medical graduates that enable supervisors to entrust them with critical clinical tasks. The findings are useful for the development of a valid method for assessing medical graduates' readiness for clinical practice. The findings may also be useful for frame-of-reference training for clinicians who must regularly make entrustment decisions.6

Footnotes

Marjo Wijnen-Meijer, MSc, is Coordinator, Quality Control, and Educational Researcher at the Center for Research and Development of Education, University Medical Center Utrecht, the Netherlands; Marieke van der Schaaf, PhD, is Associate Professor and Coordinator, Master Educational Design and Consultancy in the Department of Education at Utrecht University, the Netherlands; Kirstin Nillesen, MSc, is an Educationalist and worked as a Trainee at the Center for Research and Development of Education, University Medical Center Utrecht, the Netherlands; Sigrid Harendza, PhD, is Professor of Internal Medicine and Educational Research in the Department of Internal Medicine at University Medical Center Hamburg-Eppendorf, Germany; and Olle ten Cate, PhD, is Professor of Medical Education and Director of the Center for Research and Development of Education, University Medical Center Utrecht, the Netherlands.

Funding: The authors report no external funding source for this study.

The authors would like to thank Karen Hauer for critically reading and editing the English-language manuscript.

References

1. Van der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ. 1996;1:41–67. doi: 10.1007/BF00596229.
2. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357:945–949. doi: 10.1016/S0140-6736(00)04221-5.
3. Turnbull J, Van Barneveld C. Assessment of clinical performance: in-training evaluation. In: Norman GR, Van der Vleuten CPM, Newble DI, editors. International Handbook of Research in Medical Education. Dordrecht, the Netherlands: Kluwer Academic Publishers; 2002. pp. 793–810.
4. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226–235. doi: 10.1001/jama.287.2.226.
5. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387–396. doi: 10.1056/NEJMra054784.
6. Holmboe ES, Hawkins RE, editors. Practical Guide to the Evaluation of Clinical Competence. Philadelphia, PA: Mosby-Elsevier; 2008.
7. Govaerts MJ, Van der Vleuten CPM, Schuwirth LWT, Muijtjens AMM. Broadening perspectives on clinical performance assessment: rethinking the nature of in-training assessment. Adv Health Sci Educ. 2007;12:239–260. doi: 10.1007/s10459-006-9043-1.
8. Lurie SJ, Mooney CJ. Relationship between clinical assessment and examination scores in determining clerkship grade. Med Educ. 2010;44:177–183. doi: 10.1111/j.1365-2923.2009.03572.x.
9. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med. 1995;123:795–799. doi: 10.7326/0003-4819-123-10-199511150-00008.
10. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA. 2009;302:1316–1326. doi: 10.1001/jama.2009.1365.
11. Norcini J, Burch V. Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach. 2007;29:855–871. doi: 10.1080/01421590701775453.
12. Crossley J, Johnson G, Booth J, Wade W. Good questions, good answers: construct alignment improves the performance of workplace-based assessment scales. Med Educ. 2011;45:560–569. doi: 10.1111/j.1365-2923.2010.03913.x.
13. Sterkenburg A, Barach P, Kalkman C, Gielen M, ten Cate O. When do supervising physicians decide to entrust residents with unsupervised tasks? Acad Med. 2010;85:1408–1417. doi: 10.1097/ACM.0b013e3181eab0ec.
14. Dijksterhuis MGK, Voorhuis M, Teunissen PW, Schuwirth LW, ten Cate OT, Braat DDM, et al. Determining competence and progressive independence in postgraduate clinical training. Med Educ. 2009;43(12):1156–1165. doi: 10.1111/j.1365-2923.2009.03509.x.
15. Kilminster S, Cottrell D, Grand J, Jolly B. AMEE Guide No. 27: effective educational and clinical supervision. Med Teach. 2007;29(1):2–19. doi: 10.1080/01421590701210907.
16. ten Cate O. Trust, competence and the supervisor's role in postgraduate training. BMJ. 2006;333:748–751. doi: 10.1136/bmj.38938.407569.94.
17. Linstone HA, Turoff M, editors. The Delphi Method: Techniques and Applications. London, UK: Addison-Wesley Publishing Company; 1977.
18. Dunn WR, Hamilton DD, Harden RM. Techniques of identifying competences needed of doctors. Med Teach. 1985;7(1):15–25. doi: 10.3109/01421598509036787.
19. Clayton MJ. Delphi: a technique to harness expert opinion for critical decision making tasks in education. Educ Psychol. 1997;17:373–386.
20. Frank JR, editor. The CanMEDS 2005 Physician Competency Framework: Better Standards, Better Physicians, Better Care. Ottawa, Canada: The Royal College of Physicians and Surgeons of Canada; 2005.
21. Blueprint of Objectives for Medical Schools [in Dutch]. Dutch Federation of University Medical Centers; 2009.
22. The New Doctor: Guidance on Foundation Training. General Medical Council; 2009.
23. Ginsburg S, McIlroy J, Oulanova O, Eva K, Regehr G. Toward authentic clinical evaluation: pitfalls in the pursuit of competency. Acad Med. 2010;85(5):780–786. doi: 10.1097/ACM.0b013e3181d73fb6.
24. Kearney RA. Defining professionalism in anaesthesiology. Med Educ. 2005;39:769–776. doi: 10.1111/j.1365-2929.2005.02233.x.
25. Dawes J. Do data characteristics change according to the number of scale points used? An experiment using 5-point, 7-point and 10-point scales. Int J Market Res. 2007;50(1):61–77.
26. De Loe RC. Exploring complex policy questions using policy Delphi: a multi-round, interactive survey method. Appl Geogr. 1995;15(1):53–68.
27. Dekker-Groen AM, Van der Schaaf MF, Stokking KM. Teacher competences required for developing reflection skills of nursing students. J Adv Nurs. 2011;67(7):1568–1579. doi: 10.1111/j.1365-2648.2010.05591.x.
28. Van der Schaaf MF, Stokking KM. Construct validation of content standards for teaching. Scand J Educ Res. 2011;55(3):273–289.
29. Sackman H. Delphi Critique: Expert Opinion, Forecasting, and Group Process. Lexington, MA: Lexington Books; 1975.
30. Norman G. Likert scales, levels of measurement and the "laws" of statistics. Adv Health Sci Educ. 2010;15:625–632. doi: 10.1007/s10459-010-9222-y.
31. Carraccio C, Burke AE. Beyond competencies and milestones: adding meaning through context. J Grad Med Educ. 2010;2(3):419–422. doi: 10.4300/JGME-D-10-00127.1.
32. Jones MD, Rosenberg A, Gilhooly J, Carraccio C. Competencies, outcomes controversy. Acad Med. 2011;86:161–165. doi: 10.1097/ACM.0b013e31820442e9.
33. ten Cate O, Snell L, Carraccio C. Medical competence: the interplay between individual ability and the health care environment. Med Teach. 2010;32(8):669–675. doi: 10.3109/0142159X.2010.500897.
34. Frank JR, Snell LS, ten Cate O, Holmboe E, Carraccio C, Swing S, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–645. doi: 10.3109/0142159X.2010.501190.
35. Babbot S. Watching closely at a distance: key tensions in supervising resident physicians. Acad Med. 2010;85(9):1399–1400. doi: 10.1097/ACM.0b013e3181eb4fa4.
36. Boyce P, Spratt C, Davies M, McEvoy P. Using entrustable professional activities to guide curriculum development in psychiatry training. BMC Med Educ. 2011;11:96. doi: 10.1186/1472-6920-11-96.
