Abstract
We studied the nature of feedback given after a miniCEX. We investigated whether the feedback was interactive: specifically, whether faculty allowed the trainee to react to the feedback, enabled self-assessment, and helped the trainee develop an action plan for improvement. Finally, we investigated the number and types of recommendations given by faculty. One hundred and seven miniCEX feedback sessions were audiotaped. Faculty provided at least 1 recommendation for improvement in 80% of the feedback sessions. The majority of sessions (61%) involved learner reaction, but faculty asked the intern for self-assessment in only 34% of sessions, and only 8% of sessions involved an action plan from the faculty member. Faculty are using the miniCEX to provide recommendations and often encourage learner reaction, but are underutilizing the other interactive feedback methods of self-assessment and action plans. Programs should consider both specific training in feedback and changes to the miniCEX form to facilitate interactive feedback.
Keywords: feedback, direct observation, evaluation
The mini clinical evaluation exercise, or miniCEX, is a valuable method for evaluating clinical skills. Previous work has shown that the miniCEX is reliable and possesses construct validity.1–3 Because the miniCEX involves direct observation of clinical skills, faculty have a significant opportunity to provide meaningful, real-time feedback to trainees.
Previous studies have highlighted the need for “interactive” feedback to help trainees correct deficiencies and grow professionally.4–7 Interactive feedback includes self-assessment by the trainee and allowing the learner to react to the feedback provided. Interactive feedback should also include an action plan, in which the trainee, with the guidance of the faculty, develops a behavioral plan to improve his or her clinical skills. For example, an action plan for a deficiency in physical examination skills could involve reading a physical examination textbook followed by repeat observation. While providing recommendations for improvement is important, recommendations are only a first step. Failure to allow the trainee to react or to develop action plans may impede or slow improvement, because the trainee may not accept or embrace the recommendations or may not know how to implement them.
As part of the American Board of Internal Medicine's (ABIM) miniCEX project, we undertook a separate parallel study to investigate the nature and content of the feedback provided by faculty as part of the miniCEX. To our knowledge, no previous study has investigated the content of feedback from a miniCEX. Our specific objective for this study was to examine how often faculty provided recommendations and used interactive techniques when providing feedback as part of a miniCEX.
METHODS
This was a prospective observational cohort study at 3 internal medicine residency programs participating in the ABIM miniCEX project2: the National Naval Medical Center (NNMC) in Bethesda, Md; the Washington Hospital Center (WHC) in the District of Columbia; and Yale University Primary Care Residency Program (YPC) in New Haven and Waterbury, Conn. The institutional review boards at all 3 sites approved the study.
For this study, all miniCEXs were completed in the outpatient setting. After completing a miniCEX, the faculty member provided feedback to the intern before the end of the clinic day. At the beginning of the project, all faculty were instructed on the goals of the miniCEX, including the importance of feedback in promoting improvement in clinical skills, and were provided an instructional handout prepared by the ABIM. Because the objective of this study was to investigate the nature of feedback associated with direct observation, no workshops on feedback were conducted. Feedback sessions were audiotaped and then transcribed. Faculty were encouraged to record all aspects of the feedback session.
A taxonomy of feedback was first developed using the constant comparative method.8 Each of the 4 authors initially coded transcripts independently, and the categories of interactive feedback were defined from the entire taxonomy. Final agreement on the taxonomy was reached by consensus. Content analysis was then performed by one of the authors (ESH) to quantify the type and amount of interactive feedback across all transcripts.9,10 A second author (SJH) independently coded 15 transcripts to check the reliability of the content analysis.
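For readers interested in replicating this kind of reliability check, the sketch below shows one common way to quantify agreement between two coders: Cohen's kappa. The article does not report which agreement statistic was used, so the statistic choice, category labels, and data here are illustrative assumptions only.

```python
# Illustrative only: Cohen's kappa for two coders' category assignments on
# double-coded transcripts. The study does not report which agreement
# statistic was used; the labels and data below are hypothetical.
from sklearn.metrics import cohen_kappa_score

# One label per coded feedback segment, from each coder independently.
coder_1 = ["recommendation", "learner_reaction", "self_assessment",
           "recommendation", "action_plan", "recommendation"]
coder_2 = ["recommendation", "learner_reaction", "recommendation",
           "recommendation", "action_plan", "recommendation"]

kappa = cohen_kappa_score(coder_1, coder_2)
print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect agreement, 0 = chance
```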
For this analysis, 4 categories of feedback were delineated: recommendations, promoting learner reaction, action plans, and self-assessment. Examples of each category can be found in the Appendix. Action plans could be directed by the faculty member or the intern, and self-assessment could be prompted or unprompted (spontaneous). Because this is a descriptive study, demographic characteristics and feedback results are presented as frequencies. Correlations were calculated using the Spearman rho statistic (SPSS, version 11.0, SPSS, Inc., Chicago, Ill).
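As a concrete illustration of the correlation analysis, the sketch below computes a Spearman rank correlation between two per-session variables. The data are hypothetical, and the study's own analysis was run in SPSS 11.0 rather than Python.

```python
# Minimal sketch of the correlation analysis: Spearman rho between the number
# of recommendations in a feedback session and a session-level miniCEX rating.
# Hypothetical data; the study's analysis used SPSS 11.0.
from scipy.stats import spearmanr

n_recommendations = [0, 1, 2, 1, 3, 0, 2, 4, 1, 0]   # per feedback session
overall_competence = [8, 7, 6, 7, 5, 9, 6, 4, 7, 8]  # miniCEX rating, 1 to 9

rho, p_value = spearmanr(n_recommendations, overall_competence)
print(f"Spearman rho = {rho:.2f}, P = {p_value:.3f}")
```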
RESULTS
Overall, 107 miniCEX feedback sessions were audiotaped and transcribed: 70 at NNMC, 12 at WHC, and 25 at YPC. The study involved 41 interns and 28 faculty across the 3 sites. Table 1 provides demographic information for the 3 sites. The majority of the faculty were general internists (89%) and early in their academic careers. However, the faculty group included 1 program director, 4 current or former associate program directors, 7 fellowship-trained general internists, and 17 faculty who had previously participated in clinical teaching faculty development workshops.
Table 1.

| Characteristic | National Naval Medical Center | Yale Primary Care Program | Washington Hospital Center |
|---|---|---|---|
| Audiotaped feedback sessions, n | 70 | 25 | 12 |
| Interns, n | 22 | 13 | 6 |
| Faculty, n | 16 | 8 | 4 |
| Faculty rank of assistant professor or lower, n | 14 | 8 | 4 |

MiniCEX Evaluations (N = 98)

| Characteristic, Median Ratings | National Naval Medical Center (N = 69) | Yale Primary Care Program (N = 19) | Washington Hospital Center (N = 10) |
|---|---|---|---|
| Complexity of encounter (out of 3) | 2 | 2 | 1.5 |
| Medical interviewing skills* | 7 | 7 | 6 |
| Physical examination skills* | 6 | 7 | 6 |
| Counseling skills* | 7 | 7 | 6 |
| Overall competence* | 7 | 7 | 6 |
| Resident satisfaction with MiniCEX† | 8 | 6 | 7 |
| Faculty satisfaction with MiniCEX† | 8 | 6 | 7.5 |

*Score on a 9-point scale. For medical interviewing, physical examination, counseling, and overall competence, 1 to 3 denotes unsatisfactory, 4 to 6 satisfactory, and 7 to 9 superior performance.

†On a 9-point scale; higher scores denote greater satisfaction.
For the 107 audiotaped sessions, 98 miniCEX evaluation forms were available for review; 5 transcripts lacked the identifying information needed to link them to an evaluation form, and for 4 sessions a miniCEX form was not submitted as part of the ABIM study.2 As shown in Table 1, the median complexity of the patient encounter was moderate at all 3 sites. The median ratings on the miniCEX 9-point scale for interns’ scores in medical interviewing, physical examination, and overall competence were similar at all 3 sites, falling either in the high satisfactory category (score = 6) or the low superior category (score = 7). The range of overall competence ratings was 4 to 9 at NNMC, 6 to 8 at YPC, and 6 to 9 at WHC.
Table 2 shows the frequencies of the recommendations given and the use of specific interactive feedback techniques. In total, 204 recommendations were given in the 107 sessions, with a mean of 1.9, a median of 1, and a range of 0 to 9 recommendations per feedback session. Faculty enabled learner reaction in 65 (61%) of the feedback sessions; allowing learner reaction correlated modestly with giving at least 1 recommendation (r = .23; P = .02). Self-assessment was used less frequently, with only 36 (34%) of the sessions involving self-assessment by the intern. Finally, despite the high percentage of feedback sessions with recommendations provided, only 11 (10%) of the sessions included an action plan, and in 3 of these sessions the intern, not the faculty member, generated the action plan. There were no significant associations between the presence of an action plan and the number of recommendations provided, learner reaction, or self-assessment.
Table 2.

| Category | Frequency |
|---|---|
| Sessions with ≥1 recommendation, n (%) | 86 (80) |
| Mean (median) number of recommendations per session | 1.9 (1) |
| Sessions in which faculty asked for self-assessment, n (%) | 36 (34) |
| Domain of recommendation, n (%) | |
|   Medical interviewing | 43 (40) |
|   Physical examination | 38 (36) |
|   Counseling | 27 (25) |
|   Medical knowledge | 2 (2) |
|   Humanism/professionalism | 1 (1) |
|   Other | 23 (21) |
| Sessions in which faculty enabled learner reaction, n (%) | 65 (61) |
| Sessions with action plans, n (%) | 11 (10) |
|   From faculty | 8 (8) |
|   From intern | 3 (3) |
For the 98 sessions with miniCEX evaluation form data available, the number of recommendations provided correlated modestly and negatively with scores in medical interviewing (r = −.28; P = .03), humanism (r = −.28; P = .002), and overall competence (r = −.39; P < .001). Enabling learner reaction (r = −.23; P = .03) and self-assessment (r = −.30; P < .01) also correlated modestly and negatively with evaluator satisfaction.
DISCUSSION
The miniCEX is a potentially powerful tool for providing high-quality, interactive feedback that could contribute to improvement in trainees’ clinical skills. Direct observation is a critical first step in helping trainees improve those skills, and the miniCEX provides a reliable, structured format for performing it.1–3 However, the evaluation generated by the miniCEX must lead to meaningful, useful feedback if it is to promote growth. On the positive side, this study demonstrates that in 3 separate internal medicine residency programs, the miniCEX frequently led to a recommendation for improvement, with the majority of recommendations focused on the clinical skills of medical interviewing, physical examination, and counseling. Recommendations concerning medical knowledge and professionalism were uncommon. The focus on clinical skills is encouraging given the abundant literature on deficiencies in these skills among trainees and practicing physicians.11,12
To be most effective, feedback needs to be interactive so that trainees can embrace and take ownership of their strengths and weaknesses. Faculty in this study did enable learner reaction in nearly two-thirds of the feedback sessions. However, self-assessment was used less frequently, and despite the substantial number of recommendations provided, explicit development of an action plan was rare. The lack of action plans is particularly unfortunate because it suggests that many faculty may not be “closing the loop” to ensure that the deficiencies noted are addressed by the intern.
Why were self-assessment and action plans used less often by faculty? Possible explanations include discomfort with these approaches, lack of experience using them, or simple dislike of them. Although overall ratings were high, self-assessment and learner reaction were modestly correlated with lower faculty satisfaction with the miniCEX. Our experience in faculty development suggests that faculty often worry that self-assessment opens a “Pandora's box” they feel ill equipped to handle. However, faculty can use self-assessment to first generate positive feedback by asking opening questions such as, “What do you think went well?” before moving on to corrective feedback.
Action plans may also be perceived as signaling that a deficiency is significant enough to require more formal intervention and follow-up, and pressured faculty may feel they lack the time or skills for that follow-up. Another contributing factor may be that faculty feel they, and not the resident, are responsible for completing the action plan. More work is needed to confirm these findings and to identify the barriers that keep faculty from using these interactive feedback techniques.
Programs implementing the miniCEX as part of their evaluation system will need to give equal attention to the quality of the observation and the quality of the feedback. Faculty development is one key approach, but this study highlights two important points. First, the majority of faculty (17/28, 61%) had already participated in at least one faculty development workshop, yet interactive techniques were still underused, highlighting the need for ongoing training and reinforcement. Such “reinforcement training” can occur at section meetings, clinic conferences, and clinical competency meetings. Second, feedback training should explicitly encourage the teaching and practice of interactive feedback approaches.
The current miniCEX form does not facilitate interactive feedback, and programs should consider revising it. At one program (YPC), a newly revised miniCEX form specifically asks for an action plan, and faculty are required to list at least one item. Work is ongoing to assess whether this approach improves interactive feedback. Faculty should also understand that they are not always primarily responsible for the action plan. The action plan can be used to facilitate resident self-directed learning; the faculty member need only follow up to see whether the resident completed the action plan task(s). Residents should also be encouraged to actively seek interactive feedback. This will help remind faculty and, equally important, promote reflective practice and professionalism among residents. Finally, program directors should assess and address local barriers to faculty use of interactive feedback.
Several limitations of this study should be noted. First, the majority of the feedback sessions came from a single program. However, that program alone involved 22 interns and 16 faculty, and the 3 sites combined contributed feedback sessions involving 41 interns and 28 faculty. In addition, the median miniCEX scores were similar at all 3 sites across multiple domains of competence. Second, the Hawthorne effect could have influenced the nature of the feedback sessions; if so, the results most likely represent an optimistic estimate of the interactive behaviors being used by faculty. Third, additional feedback may have been provided to the intern before or after the audiotaped portion of the feedback session. However, faculty were formally instructed to record all aspects of the session, and given the range in the number of sessions and recommendations, it is unlikely that substantial portions of the feedback sessions went uncaptured. We also did not assess the quality of the feedback or follow up with interns after the miniCEX to see whether they implemented any of the recommendations. Finally, we were unable to determine whether expressed or unexpressed action plans were actually completed by the faculty.
In conclusion, this study showed that the majority of faculty provide recommendations and enable learner reaction as part of miniCEX feedback, but do not use self-assessment and action plans with sufficient frequency. Additional research is needed to determine what barriers exist to using self-assessment and action plans, whether focused faculty development can improve interactive feedback behaviors, and whether changes to the miniCEX form can facilitate interactive feedback.
Acknowledgments
The authors wish to thank Rebecca Lipner and Gregory Fortna of the American Board of Internal Medicine for assistance in acquiring the miniCEX score data, and Ms. Leslie Galaty and Barbara Wanciak for data entry and transcription.
This research was supported in part by the American Board of Internal Medicine Foundation.
The views expressed herein are solely those of the authors and do not represent the views of the United States Navy or Department of Defense.
APPENDIX A
Feedback Categories: Examples
- Recommendations:
  - History taking: “One of the things I thought you should work on is where we talked about setting the agenda up front…get that list up front and decide what you are going to talk about.”
  - Physical examination: “But generally speaking when you do pitting edema it's not how hard it's how long. You just kind of press…like that…and then you let go.”
  - Counseling: “I think counseling you need to kind of work on a little bit more about telling them about what they (need) to do. And you can actually do counseling throughout.”
- Learner reaction: “…But other than that, you covered all the points very well. Do you have any questions or comments?”
- Self-assessment: “How did you feel about the uh…how everything went?”
- Action plan: “Well, interviewing (pause)…next time we do the miniCEX I’ll come into the room with you and maybe I can give you more hints on how to focus and things of that sort. But generally, you don’t want to shoot closed questions.”
REFERENCES
1. Norcini JJ, Blank LL, Arnold GK, Kimball HR. The Mini-CEX (Clinical Evaluation Exercise): a preliminary investigation. Ann Intern Med. 1995;123:795–9.
2. Norcini JJ, Blank LL, Duffy FD, Fortna GS. The Mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138:476–81.
3. Holmboe ES, Huot SJ, Chung J, Norcini JJ, Hawkins RE. Construct validity of the mini-clinical evaluation exercise (miniCEX). Acad Med. 2003;78:826–30.
4. Ende J. Feedback in clinical medical education. JAMA. 1983;250:777–81.
5. Skeff KM, Stratos GA, Berman J, Bergen MR. Improving clinical teaching. Evaluation of a national dissemination program. Arch Intern Med. 1992;152:1156–61.
6. Salerno SM, Jackson JL, O'Malley PG. Interactive faculty development seminars improve the quality of written feedback in ambulatory teaching. J Gen Intern Med. 2003;18:831–4.
7. Salerno SM, O'Malley PG, Pangaro LN, Wheeler GA, Moores LK, Jackson JL. Faculty development seminars based on the one-minute preceptor improve feedback in the ambulatory setting. J Gen Intern Med. 2002;17:779–87.
8. Strauss A, Corbin J. Basics of Qualitative Research. Thousand Oaks, Calif: Sage Publishing; 1998.
9. Crabtree BF, Miller WL. Doing Qualitative Research. Thousand Oaks, Calif: Sage Publishing; 1999.
10. Denzin NK, Lincoln YS. Collecting and Interpreting Qualitative Materials. Thousand Oaks, Calif: Sage Publishing; 1998.
11. Braddock CH III, Edwards KA, Hasenberg NM, Laidley TL, Levinson W. Informed decision making in outpatient practice. Time to get back to basics. JAMA. 1999;282:2313–20.
12. Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees. A comparison of diagnostic proficiency. JAMA. 1997;278:717–22.