J Grad Med Educ. 2023 Aug;15(4):463–468. doi: 10.4300/JGME-D-22-00756.1

Table 3. Program Director Survey Response Frequencies

Survey prompts with response frequencies, n (%); N=291

My program’s CCC is successful at identifying residents who are failing.
Strongly disagree 5 (1.7)
Disagree 4 (1.4)
Neutral 24 (8.2)
Agree 134 (46.0)
Strongly agree 124 (42.6)

My program’s CCC is successful at identifying residents who require remediation in one or more areas but are not failing.
Strongly disagree 5 (1.7)
Disagree 7 (2.4)
Neutral 21 (7.2)
Agree 145 (49.8)
Strongly agree 113 (38.8)

My program’s CCC is successful at identifying residents who are exceeding expectations in training and may benefit from individualized education to achieve their potential.
Strongly disagree 2 (0.7)
Disagree 30 (10.3)
Neutral 57 (19.6)
Agree 133 (45.7)
Strongly agree 68 (23.4)
Missing 1 (0.3)

Do CCC members receive formal faculty development or training on CCC best practices? For example, this training might include the expectations of the CCC or how to synthesize assessment data and might occur through STFM, RLS, the ACGME, or your GME office.
Yes, all members receive formal CCC training 59 (20.3)
Yes, some members receive formal CCC training 101 (34.7)
Only the program director receives formal CCC training 25 (8.6)
Only one member (other than the program director) receives formal CCC training 19 (6.5)
No one has formal CCC training 87 (29.9)

Is there a formal policy describing a standardized way for residents in your program to receive feedback generated from the CCC?
Yes, we have a written policy describing this process 140 (48.1)
Yes, we have a process we always or usually follow but no written policy 132 (45.1)
No, we have no usual process, policy, or procedure, but residents usually get feedback 12 (4.1)
No, we have no usual process, policy, or procedure, and feedback to residents can be hit or miss 5 (1.7)
No, residents do not usually receive feedback after a CCC meeting 0 (0.0)
Missing 2 (0.7)

Which of the following best describes the data considered in your CCC meetings?
We use assessment data from multiple sources, such as rotation evaluation scores and written comments, procedure logs, etc. 276 (94.8)
We mostly use data from one source, such as rotation evaluations, and consider other data sources as well 14 (4.8)
We rely heavily on data from one source, such as rotation evaluations 0 (0.0)
Something else 0 (0.0)
Missing 1 (0.3)

Does your CCC have a policy or procedure for considering data from multiple sources? For example, does your CCC have a way of reviewing core faculty and non-core faculty evaluations differently, or stating they should be considered the same way?
Yes, we have a formal written policy or procedure for how to include different kinds of data 70 (24.1)
Yes, we have a procedure that we usually carry out, but it is not formal or written 174 (59.8)
No, we do not have a usual way of integrating data, or it may vary from meeting to meeting or resident to resident 45 (15.5)
Missing 2 (0.7)

For each 6-month milestone reporting interval, how much time does a typical CCC member spend on your CCC meetings, including time spent reviewing materials ahead of time, time in the meeting, and time spent completing any follow-up work afterward?
<3 hours 54 (18.5)
3-<5 hours 91 (31.3)
5-<7 hours 57 (19.6)
>7 hours 87 (29.9)
Missing 2 (0.7)

How efficient do you think your CCC is?
Very inefficient 10 (3.4)
Inefficient 50 (17.2)
Efficient 198 (68.0)
Very efficient 32 (11.0)
Missing 1 (0.3)

Which one of these scenarios best describes how your CCC functions?
Individual CCC members review one or more assigned resident files prior to the meeting and present their milestone placements 129 (44.3)
Most milestone rankings are generated automatically from the information and evaluations in the resident management system 32 (11.0)
The CCC works in smaller committee format, where groups of CCC members discuss assigned residents and make recommendations 12 (4.1)
The whole CCC meets together and assesses each resident file one at a time at the meeting, discussing each milestone 95 (32.6)
Some other format 21 (7.2)
Missing 2 (0.7)

Abbreviations: CCC, Clinical Competency Committee; STFM, Society of Teachers of Family Medicine; RLS, Residency Leadership Summit; ACGME, Accreditation Council for Graduate Medical Education; GME, graduate medical education.