The Journal of Education in Perioperative Medicine (JEPM). 1999 May 1;1(2):E007.

ORGANIZATION OF A COMPREHENSIVE ANESTHESIOLOGY ORAL PRACTICE EXAMINATION PROGRAM: Planning, Structure, Startup, Administration, Growth and Evaluation

Armin Schubert*, John Tetzlaff†, Michael Licina§, Edward Mascha, Michael P Smith
PMCID: PMC4803385  PMID: 27390795

Abstract

Background

Oral practice examinations (OPEs) are used in many anesthesiology programs to familiarize residents with the format of the oral qualifying examination given by the American Board of Anesthesiology (ABA). The purpose of this communication is to describe the planning, structure, startup, administration, growth and evaluation of a comprehensive oral examination program at a sizeable residency program in the Midwest.

Methods and Results

A committee of three experienced faculty was formed to plan the effort. Planning involved consideration of format and frequency of administration, timing for best resident and faculty availability, communication, forms design, clerical support, record keeping and quality monitoring. To accommodate resident rotation and faculty work schedules, a semiannual administration schedule on 3-4 consecutive Mondays was chosen. The mock oral format was deliberately constructed to resemble that used by the ABA so as to enhance resident familiarity and comfort with ABA style oral exams. Continuous quality improvement tools put in place consisted of regular examiner and examinee inservice sessions, communication and feedback from ABA associate examiners to faculty examiners, as well as review of examinee exit questionnaires. A set of OPE databases was constructed to facilitate quality monitoring and educational research efforts.

Continued administration of the OPE program required ongoing construction of a pool of guided case-oriented questions, selection of appropriate questions based on examinee training exposure, advance publication of the exam calendar and scheduling of recurring examiner and examinee activities. Significant issues which required action by the governing committee were exam timing, avoidance of conflict with clinical demands, use of OPE results, and procurement of training resources.

Despite initial skepticism, the OPE program was launched successfully and grew substantially, from 56 exams in the first year to 120 exams by year three. The OPE was perceived positively by the majority of residents: 90.2% of exit questionnaire responses acknowledged specific learning about oral exam technique, while only 0.3% indicated a lack of meaningful information exchange at OPE sessions. Fewer than 10% of responses indicated misleading questions or badgering by examiners. Although anxiety remained constant over time, resident preparedness increased with repeat OPE exposure.

Summary

A comprehensive mock oral examination of substantial scope was successfully planned, initiated, and developed at our anesthesiology resident training program. It is well accepted by residents and faculty. Its inception was associated with an increase in resident preparedness. Now in its tenth year of existence, it continues to be an asset and essential component of our training program.

Keywords: Education, anesthesiologists, residents, oral examinations, board certification

INTRODUCTION

In an era of continued and enhanced importance of board certification in anesthesiology, anesthesiology programs have had many indirect and direct incentives to assure that their residents have the best possible chance for passage of ABA (American Board of Anesthesiology) sponsored examinations. Anesthesiology mock oral examinations for residents have been conducted in various formats for many years. Programs in the 1970s and 1980s conducted mock orals largely on an individualized basis, frequently based on residents requesting them from specific faculty. Virtually no published material is available about practice oral examination efforts from this period. More recently, the ABA has encouraged the staging of mock oral examinations in all anesthesiology training programs.

Given the lack of available literature, the continued high level of interest among current and prospective residents, and the complexities of establishing and administering a large scale OPE program, the authors set out to describe the planning process, structure, startup, administration, growth and evaluation of their comprehensive OPE program. This information may be helpful to others contemplating initiating, expanding or enriching similar programs.

Planning and Initiation

Prior to the organized OPE program, the department had deliberated about holding oral practice exams. The recruitment of several anesthesiologists from a program with an established oral practice effort galvanized these deliberations into action. Support from the department chair was secured after an overview of the goals, advantages and resources for an OPE program was presented. Shortly thereafter, a governing committee of three experienced faculty was formed to plan and administer the effort. Each had at least four years’ experience with mock orals in another institution and all were or soon became ABA question writers or associate examiners. The responsibilities of the committee included planning, policy setting, examiner selection, review of questions, ABA liaison, research coordination and results reporting.

Planning involved goal setting, consideration of format and frequency of administration, timing for best resident and faculty availability, communication, forms design, clerical support, record keeping and quality monitoring. The goals of the OPE planning committee were as follows: (1) Providing a high quality practice exam (defined as mirroring the ABA format; minimizing examinee confusion; maximizing value to examinees), (2) Minimizing the impact on clinical responsibilities and (3) Creating an opportunity for study of issues important to anesthesiology residency training.

In its planning effort, the committee gave several factors special consideration: the size of the training program (30-60 residents and fellows would need to be examined each session), the busy clinical schedule of an operating suite handling over 30,000 patients a year, the effect of off-campus rotations and the need to demonstrate value to the training program.

To accommodate resident rotations, faculty work schedules and the number of residents to be tested, a semiannual administration schedule on 3-4 consecutive Mondays was chosen. The mock oral format was deliberately constructed to resemble closely that used by the ABA so as to enhance resident familiarity and comfort level with ABA style oral exams. Quality enhancement tools put in place consisted of regular examiner and examinee inservice sessions, communication and feedback from ABA associate examiners to faculty examiners as well as review of examinee exit questionnaires. An electronic database was identified as a key resource to facilitate exam administration, quality monitoring and educational research efforts.

Format and Content

The OPE was modeled closely after the oral qualifying examination given by the ABA. It therefore makes use of the guided question format and includes a stem (case scenario) that is divided into sections for preoperative evaluation (Section A), intraoperative management (Section B) and postoperative care (Section C). In addition to the stem, each question also contains Section D (“additional topics”) in the form of 2-3 clinical vignettes designed to explore expertise in areas different from the material covered in the stem. An example of such a guided question, as used in the OPE at this institution, appears in Appendix A.

Twenty-one stem questions (case scenarios) were used during the first five years of the OPE program (Table 1). Thereafter, new case scenarios were added at a rate of approximately three biannually. The authors of our OPE case scenarios were OPE faculty who based the content of the exam questions on their clinical, ABA or practice oral exam experience. An effort was made to achieve a diverse sampling of anesthetic problems within any one case scenario and through the use of additional topics (Section D). Questions thus generated were reviewed by the organizing committee, and edited as necessary. The guided question format utilized by the ABA served to standardize trainee exposure to examination material. This format is designed to allow conclusions about such consultant attributes as application of knowledge, judgment, adaptability under changing clinical circumstances, ability to manage complications, and effective communication. All candidates are required to return their stem question papers in an effort to preserve security of the case scenario material.

Table 1.

Frequency of OPE Stem Question Use*

Stem Question #   Frequency   Percent of Total OPEs
 1                18           4.0
 2                21           4.7
 3                24           5.4
 4                35           7.8
 5                18           4.0
 6                28           6.3
 7                15           3.4
 8                 6           1.3
 9                29           6.5
10                35           7.8
11                38           8.5
12                31           6.9
13                18           4.0
14                15           3.4
15                24           5.4
16                27           6.0
17                15           3.4
18                11           2.5
19                 5           1.1
20                22           4.9
21                12           2.7

* data from the first five years of the program

Faculty

Initially, twelve faculty examiners were invited to participate in the OPE. Although experience level varied among examiners, all were board-certified anesthesiologist members of the professional staff of the Cleveland Clinic, held permanent faculty appointments, had a demonstrated interest in resident education, and had attended at least one OPE inservice session. Small-group inservice sessions were conducted at least yearly for all examiners. OPE examiners also attended briefings and participated in information exchange with ABA associate examiners and board members. Several examiners with substantial non-operating-room commitments (intensive care, pain therapy) were included to broaden the scope and to provide greater flexibility in scheduling. Several days before the examination, the faculty examiners received a packet of materials that included the examination schedule, the examination questions, and a synopsis of the OPE format, grading procedure, and desirable examining technique (see www.jepm.org Electronic Educational Library: “Mock Oral Examiner Briefing Packet”). Less experienced examiners were assigned to examine with more experienced faculty examiners. The training material, as well as each OPE case scenario question, was reviewed for applicability to the oral exam process by faculty members who were also ABA associate examiners.

Administration

Continued administration of the OPE program required ongoing review and construction of guided case-oriented questions, selection of appropriate questions consistent with examinee training exposure, advance publication of the exam calendar and scheduling of recurring examiner and examinee activities. A schedule of communications associated with a typical OPE session appears in Table 2. Selection and assignment of OPE questions to each candidate involved (1) avoidance of duplication for residents who had prior OPEs and (2) avoidance of questions whose stem dealt with an area covered during a subspecialty rotation to which the resident had not yet been exposed (e.g., pediatrics). Electronic database queries to accomplish the above were necessary prior to each OPE session. Significant issues which required action by the governing committee were exam timing, avoidance of conflict with clinical demands, use of OPE results, procurement of training resources, and communication of results to training program and development leadership.

Table 2.

Communications Schedule for a Typical OPE Session

Subject | Timing | Distribution
OPE session dates | 6-9 months prior | Department heads, anesthesia control desks, SICU director, pain center director, all residents & fellows
Arrange exam room availability & refreshments | One month prior | Faculty affected
Examiner material (OPE assignment schedule*, OPE questions, instructions**, special communication(s)#) | 1-2 weeks prior | OPE examiners; anesthesia control desks
Arrange feedback location | 1-2 weeks after | Room scheduling
Examiner feedback (thank-you note, invitation to feedback session, summary of current statistics) | 1-2 weeks after | Examiners

* sample schedule available in JEPM electronic educational library
** available in electronic educational library
# example available in electronic educational library
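The question-selection step described above is essentially a two-condition filter over the question pool. The sketch below illustrates the idea in Python; the record layout, field names and rotation categories are assumptions made for illustration, since the actual data were kept in Superbase and Excel rather than in code.

```python
from dataclasses import dataclass, field

# Hypothetical record layout for illustration only; the actual program kept
# these data in Superbase/Excel rather than Python objects.
@dataclass
class Resident:
    code: int                                        # anonymized examinee code
    completed_rotations: set = field(default_factory=set)
    prior_stems: set = field(default_factory=set)    # stem numbers seen at earlier OPEs

@dataclass
class StemQuestion:
    number: int
    subspecialty: str | None = None                  # e.g. "pediatrics" if the stem assumes that rotation

def eligible_stems(resident: Resident, pool: list[StemQuestion]) -> list[StemQuestion]:
    """Stems the resident has not yet seen and whose subspecialty content
    matches a rotation the resident has already completed."""
    return [
        s for s in pool
        if s.number not in resident.prior_stems
        and (s.subspecialty is None or s.subspecialty in resident.completed_rotations)
    ]

# Example: a resident who has not yet rotated through pediatrics
resident = Resident(code=17, completed_rotations={"obstetrics", "neuro"}, prior_stems={3, 9})
pool = [StemQuestion(3), StemQuestion(9), StemQuestion(12, "pediatrics"), StemQuestion(15)]
print([s.number for s in eligible_stems(resident, pool)])    # -> [15]
```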

The OPE is administered in the spring and fall of each year. All Cleveland Clinic anesthesiology residents with at least 9 months of clinical anesthesia training are scheduled to participate in the semiannual OPE. Participation is mandatory. At least yearly, residents participate in a 2-hour inservice session during which faculty explain the format of and the rationale for the OPE. An agenda for a typical resident inservice session appears in Appendix B. OPE performance was deliberately not used in the evaluation of any resident, either by the program director or the clinical competence committee. However, it was resolved that the reliability and validity of the OPE as a resident assessment tool should be investigated.

Each examination session is preceded by a 7 to 10 minute preparation period during which the candidate reviews the short narrative of the stem question. The resident also completes a “candidate information sheet” (see Appendix C) prior to the exam. The candidate is then asked to enter a faculty office and is seated with two faculty examiners. Until the recent change in the ABA oral examination format in 1997, examiner 1 began with the preoperative evaluation (Section A) and questioned the candidate for 5 minutes. Examiner 2 continued for 15 minutes with the intra- and post-operative sections (Sections B & C), followed by examiner 1, who explored one to three additional topics (Section D) in depth for the next 10 minutes. This examiner returned to the stem question only if he or she felt that conduct of the exam did not allow a conclusive grade (FG 70 or 80) to be assigned. At the conclusion of the examination, the resident is briefly excused and the examiners independently complete the standardized grading sheet.

OPE Grading Technique

To guide examiner scoring, a standardized grading sheet is used (see Appendix D). Both examiners independently rate the examinee’s performance on the preoperative (A), intraoperative (B), postoperative (C) and additional topics (D) sections of the OPE. Each section (A through D) contains 3 to 6 sub-questions that are graded separately. A “sub-score” is generated for each sub-question. Sub-scores are combined to yield the section score (see below). Permissible sub-scores are 1, 2, 3 and 4, with 1 denoting the best and 4 the lowest performance. In addition, examiners mark the presence of weakness in one of four areas (judgment, application of knowledge, clarity and adaptability) for each sub-question. A final grade (FG) is then assigned on a four point scale [80, 77, 73, 70]. A grade of 80 is defined as “definite pass”, 70 as “definite fail”. A definite pass was understood to mean that the candidate functioned at or near the level of a board certified anesthesiologist. In grading, examiners are urged to identify one or two key areas in the exam. They are also reminded to accumulate sufficient information during the exam to be able to distinguish clearly between pass and fail. Only in situations where the examining procedure leaves room for uncertainty are the grades of 77 and 73 to be used. Questions demanding exclusively factual information were discouraged because they were considered ungradeable. The object of questioning during the exam is to elicit the candidate’s thought processes (application of knowledge) and evidence for a consultant level of function (judgment, communication skills, adaptability).

During the first five years of the OPE program, a systematic quantitation of trainee performance was undertaken, which went beyond the P/F grading employed in the ABA exam. The OPE scoring procedure provided the inputs for the four major indices of candidate performance. These are (1) the FG from the OPE grading sheet; (2) the P/F grade (a candidate was considered to have passed when the average of the two examiners’ FGs exceeded 75); (3) the section score (calculated for each section (i.e., A-D) as the sum of all reported sub-scores in that section divided by the number of sub-questions per section); and (4) the overall numerical score (ONS; defined as the sum of all sub-scores divided by the number of sub-questions). A summary of scores used during this period is found in the glossary. While OPE grades were not used for resident evaluation, this information was needed to enable faculty to conduct systematic educational investigation of a variety of OPE characteristics, such as inter-rater reliability, internal consistency, and validity.
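To make the arithmetic behind these four indices concrete, a minimal Python sketch follows. The function names, data structures and sample sub-scores are invented for the example and do not reflect the program's actual spreadsheet layout.

```python
# Minimal sketch of the four performance indices; data structures and sample
# values are invented for illustration.

def section_scores(sub_scores: dict[str, list[int]]) -> dict[str, float]:
    """Section score = sum of the section's sub-scores / number of sub-questions."""
    return {sec: sum(v) / len(v) for sec, v in sub_scores.items()}

def overall_numerical_score(sub_scores: dict[str, list[int]]) -> float:
    """ONS = sum of all sub-scores / total number of sub-questions."""
    all_scores = [x for v in sub_scores.values() for x in v]
    return sum(all_scores) / len(all_scores)

def pass_fail(fg1: int, fg2: int) -> str:
    """Pass when the average of the two examiners' final grades exceeds 75."""
    return "Pass" if (fg1 + fg2) / 2 > 75 else "Fail"

# Worked example for one hypothetical exam
scores = {"A": [1, 2, 2], "B": [2, 3, 2, 2], "C": [1, 2, 3], "D": [2, 2]}
print(section_scores(scores))            # A ~ 1.67, B = 2.25, C = 2.0, D = 2.0
print(overall_numerical_score(scores))   # 2.0
print(pass_fail(77, 80))                 # 'Pass' (average 78.5 > 75)
```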

Debriefing Session and Exit Questionnaire

Within a few minutes of having completed the mock oral examination, residents are asked to return to the exam location for a 5-10 minute debriefing session with the faculty. During the debriefing period, examiners attempt to elicit feedback from the examinee about their subjective experience and provide feedback to the resident about his or her performance. Grading is de-emphasized. Rather, the focus is on improving the resident’s approach to an oral exam question, pointing out good performance as well as pitfalls, and providing suggestions for further development. A specific effort was made to avoid teaching didactic material. Examiners point out speech and behavior patterns exhibited by the candidate and suggest modification, as necessary. To ensure that specific behaviors and responses are discussed, examiners make notes on the scoring sheet and consult them during the debriefing. Questions are answered and an opportunity for venting is created before the resident returns to his or her clinical assignment. Candidates also receive general information about the setting and characteristics of ABA style oral examinations. Thereafter, examinees complete an anonymous exit questionnaire (see Appendix E), designed to assess the impact of the OPE on resident perceptions and, ultimately, to improve exam quality.

Clerical Support, Data Management and Quality Improvement

The information contained in the hand-marked examiner scoring sheet (Appendix D), the candidate information sheet (Appendix C) and the exit evaluation questionnaire (Appendix E) is transferred into Microsoft Excel for Windows (Microsoft Co., Redmond, Washington) and Superbase 4 for Windows (Superbase Inc., Bohemia, New York) utilizing a dedicated personal computer. Anonymity is maintained by replacing individuals’ names with numerical examinee and examiner codes. The key to the code is safeguarded by the data management specialist. Examiners do not have access to the code.
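As an illustration of this de-identification step, a minimal sketch follows. It assumes a simple list-of-dictionaries record layout and a CSV key file; the field names and file format are hypothetical stand-ins for the Superbase/Excel workflow actually used.

```python
import csv

# Hypothetical sketch; field names and the CSV key file stand in for the
# Superbase/Excel workflow described above.
def anonymize(records: list[dict], key_path: str = "code_key.csv") -> list[dict]:
    """Replace examinee and examiner names with numeric codes; write the
    name-to-code key to a separate file held only by the data manager."""
    name_to_code: dict[str, int] = {}

    def code_for(name: str) -> int:
        if name not in name_to_code:
            name_to_code[name] = len(name_to_code) + 1
        return name_to_code[name]

    deidentified = [
        {**r, "examinee": code_for(r["examinee"]), "examiner": code_for(r["examiner"])}
        for r in records
    ]

    # The key is stored apart from the scoring data; examiners never see it.
    with open(key_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["name", "code"])
        writer.writerows(name_to_code.items())

    return deidentified

# Example
rows = [{"examinee": "Resident A", "examiner": "Examiner X", "fg": 77}]
print(anonymize(rows))   # [{'examinee': 1, 'examiner': 2, 'fg': 77}]
```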

After each semiannual OPE period, scoring sheets and exit questionnaires were summarized. The results, in particular the rates of disagreement on pass-fail and the rate of non-definitive final grade (i.e., neither 70 nor 80) scoring, were fed back to OPE faculty, along with a summary of those responses from the exit questionnaire which were considered to indicate quality issues. The rates at which residents reported badgering, misleading questions, being challenged, having their questions answered, having both strengths and weaknesses pointed out, etc., were considered indicators of exam technique and opportunities for improvement.
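These two rates, pass-fail disagreement ("split") and non-definitive final grades ("waffle"), reduce to simple counting over the paired final grades, using the definitions given with Table 4. The sketch below illustrates the calculation; the grade pairs are invented for the example.

```python
# Sketch of the two examiner-level quality indicators fed back after each session.
# Each exam contributes one pair of final grades (70, 73, 77, or 80), one per examiner.

def split_rate(fg_pairs: list[tuple[int, int]]) -> float:
    """Percent of exams whose examiners disagreed on pass-fail
    (final grades differing by more than 3 points)."""
    splits = sum(1 for a, b in fg_pairs if abs(a - b) > 3)
    return 100 * splits / len(fg_pairs)

def waffle_rate(fg_pairs: list[tuple[int, int]]) -> float:
    """Percent of individual final grades that were neither 70 nor 80."""
    grades = [g for pair in fg_pairs for g in pair]
    return 100 * sum(1 for g in grades if g not in (70, 80)) / len(grades)

# Hypothetical session of five exams
pairs = [(80, 80), (77, 70), (73, 77), (70, 70), (80, 73)]
print(f"split = {split_rate(pairs):.0f}%, waffle = {waffle_rate(pairs):.0f}%")
# split: 3 of 5 pairs differ by more than 3 points -> 60%
# waffle: 4 of 10 grades are 73 or 77 -> 40%
```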

DISCUSSION

Oral examinations continue to be part of the certification process for many medical specialty boards including the American Board of Anesthesiology. The format, strengths and weaknesses of oral qualifying examinations in anesthesiology have been well communicated.,,, Throughout the years, many anesthesiology programs have recognized that practice exposure to oral examinations may help prepare candidates. Despite the long history of oral practice examinations, it was not until 1989 that the ABA officially encouraged the staging of “mock” oral examinations in all anesthesiology training programs.1 In addition, courtesy observer status at the ABA new examiner workshop is extended to faculty from training programs without board examiners.

One justification for holding anesthesiology OPEs lies in the need for residents to be exposed to the style and content of an oral examination that represents for many the final hurdle to board certification as a consultant in anesthesiology. Anesthesiology residents take several written examinations during their training continuum, but consistent large scale access to practice with ABA-type oral examinations is not available in many programs, leading to the proliferation of commercial oral examination training courses. Intuitively, a resident who is familiar with the format of such an examination should perform better because anxiety about format will be minimized, allowing a more effective demonstration of relevant skills. However, greater familiarity with exam format alone does not encompass the full scope of potential benefits that either the training program or the individual trainee may ultimately reap from an OPE program. Table 8 summarizes such additional benefits.

Table 8.

Possible benefits from an organized oral practice examination program

  • Resident familiarity with the ABA format

  • Reduced anxiety with regard to the format and process (anxiety about lack of preparation may remain)

  • Feedback on performance during an oral examination

  • Ability of the resident and/or training director to initiate remedial actions

  • Feedback to the training director on curriculum and program adequacy

  • Asset in residency recruiting

  • Assessment tool in resident evaluation

The methods used by the authors resulted in a successful mock oral examination effort at their institution. This statement is supported by the growth of the program in the early phase, its continued prominence as part of the residency program and its reasonable performance on the evaluation tool, the exit questionnaire. More than 90% of responses indicated that specific learning took place with regard to the taking of an oral examination where clinical judgment and communication skills are evaluated. Although ideally, no resident should feel confused or badgered during an examination, the relatively low and constant rate (<10%) of these examination practices speaks in favor of the OPE program as administered. Another indication of the OPE program’s success may be found in the observation that resident self-assessed preparedness increased after the first OPE session and confidence about taking oral exams increased with repeat OPE exposure.

Not all information was positive. The rate with which examiners addressed residents’ strengths during the debriefing period decreased over time. The rate at which residents reported confusion with questioning was a steady 10-15%. These trends have been discussed at the faculty level and require continual attention.

Knowledge of the organization of the authors’ OPE program may assist anesthesiology educators in expanding or adjusting their own programs. We believe that we are reporting the most detailed information about mock oral examinations available to date. We also recognize that the circumstances that conspired to result in a successful program at our institution may not apply elsewhere. A program with a more extensive geographical dispersion of clinical training sites may encounter substantially more difficult logistical problems than we did. Some programs may choose to use OPE results as a means for resident assessment. Such a change in milieu would likely alter resident perceptions and faculty-resident interactions surrounding the oral exam, resulting in different observations. A small program with several active ABA examiners on staff may be able to provide an experience much more intense and realistic than was the case for our OPEs. We offer our observations as one view into a fascinating aspect of resident education and hope that others will provide additional information to further refine and develop oral case-based testing for experience and, possibly, for assessment of clinical competence.

Oral examinations have been criticized for lack of reliability, validity, and objectivity. Specific areas of weakness are inter-examiner agreement, case specificity (and therefore, poor generalizability), undue influences by extraneous factors, and poor correlation with “objective” measures of knowledge. On the other hand, oral examinations in anesthesiology have been refined considerably under the aegis of the ABA in the last two decades. In a previous report, the authors have shown that their OPEs have excellent internal consistency and an inter-rater reliability which was more than satisfactory for the purposes of an internal examination. OPE outcomes also correlated moderately well with global faculty evaluations and written in-training examination scores. This is consistent with other reports of successful oral examinations including the ABA oral examination3. Recognized means of improving the reliability of oral examinations include standard grading forms, examiner training, grading from videotaped recordings, standardized and scripted questions to reduce content sampling error, and the use of two simultaneous evaluators.,,,, The authors’ OPE includes several of these features, perhaps accounting for the acceptable observed reliability and validity.2,11

As part of their customer-focused definition of health care quality, policy makers may soon demand certification of competence based on performance in clinical situations. In 1991, the Liaison Committee on Medical Education formally incorporated performance-based assessment into the accreditation standards of medical schools. As of this year, the clinical competency examination for international medical graduates includes a clinical skills examination. Medical education, including that of the anesthesiologist, is already being shaped by an emphasis on demonstration of competence by simulated clinical performance in OSCEs and standardized patients,, and is being linked to improvements in quality of care.,, This trend will generate a growing need for measures of trainee performance which directly relate to the quality of health care provided. In this context, standardized oral practice examinations might contribute to a “clinical performance criterion” necessary to graduate students of anesthesiology into independent practice.

The oral examination in anesthesiology is likely to retain its importance in the process of board certification because it assesses clinical judgment, problem solving ability and communication skills, essential components of competence for anesthesiologists. Documentation of clinical competence throughout the residency training period is acknowledged to be a difficult task, with a general lack of agreement on what constitutes the best definition of competence. Measures other than written multiple choice examinations are needed to make up for variations in program and rater standards, as well as for the limited observation of residents by faculty. The retention of a face-to-face examination by many specialty boards, including the ABA, provides evidence of an uneasiness among specialty boards and medical teaching faculty,,, about entrusting certification of professional competence entirely to a written examination. Objective structured clinical examinations (OSCEs) are increasingly used for medical student assessment and have been introduced by several medical specialty boards. In some ways, the OPE resembles an OSCE. An OSCE presents a carefully scripted clinical situation to the candidate, whose responses are graded systematically. Validity of our OPE was similar to that of many OSCEs.7,, The OPE is a labor and time intensive process, requiring resources similar to those needed for OSCEs.26,

A comprehensive mock oral examination effort of substantial scope was successfully planned, rolled out, and grown at our anesthesiology resident training program. The feedback from residents and fellows who participate in this program is overwhelmingly positive. Now in its tenth year of existence, the OPE continues to be an acknowledged asset and essential component of our training program.

Table 3.

Frequency of Examinations Given by Examiners*

Examiner # of Exams Percent of Total Exams Pass Rate
1 65 7.5 60.0
2 18 2.1 38.9
3 57 6.6 59.6
4 78 9.0 44.9
5 44 5.1 43.2
6 53 6.1 39.6
7 51 5.9 72.5
8 19 2.2 68.4
9 80 9.2 62.5
10 147 17.0 39.5
11 170 19.6 67.1
12 27 3.1 44.4
13 23 2.7 73.9
14 16 1.8 93.8
15 6 0.7 33.3
16 11 1.3 36.4
* data from the first five years of the program

Table 4.

Pass, Disagreement And Waffle Rates (First Five Years)

Session    Spring'89  Fall'89  Spring'90  Fall'90  Spring'91  Fall'91  Spring'92  Fall'92  Spring'93
n          24         33       39         35       41         46       53         59       61
Pass %     42         42       43         31       37         43       43         42       31
Split %*   8          9        13         43       39         37       32         29       36
Waffle %#  31         32       17         26       23         13       15         19       20

* refers to percent of exams during which examiners disagreed on pass-fail status (gave FGs which differed by > 3 points)

# refers to percent of examiners who gave FGs which were neither 70 nor 80.

Table 5.

Preparedness and Anxiety over Time

Preparedness Anxiety

N % > 5 (95% CI) Mean N % > 5 (95% CI) Mean
Spring 1989 0 0
Fall 1989 19 16 (3,40) 3.4 20 70 (46,88) 6.3
Spring 1990 36 33 (19,51) 4.5 36 61 (43,77) 5.6
Fall 1990 32 41 (24,59) 4.9 32 53 (35,71) 5.2
Spring 1991 35 43 (26,61) 4.8 34 50 (32,68) 5.1
Fall 1991 38 61 (43,76) 5.6 43 51 (35,67) 5.8
Spring 1992 41 49 (33,65) 5.2 42 52 (36,68) 5.5
Fall 1992 45 51 (36,66) 5.0 47 60 (44,79) 5.8
Spring 1993 47 40 (26,56) 4.9 47 45 (30,60) 5.1
Fall 1993 49 37 (23,52) 4.5 52 52 (38,66) 5.7

Overall 342 43 4.8 353 54 5.5

Table 6.

Anxiety and Confidence in First Time vs. Repeat Examinees (Range: 1-10)

Anxiety (n=84) Confidence (n=78)
First-time 5.5 ± 1.7 3.7 ± 2.1
Repeat 5.6 ± 1.6 5.1 ± 1.5*
* P < 0.001 vs. first-time

Table 7.

Fractional Responses (%) on OPE Exit Questionnaire Results* (First Five Years, n=387)

Year

All 1989 1990 1991 1992 1993
(N=377) (N=56) (N=73) (N=86) (N=106) (N=56)
Misled 4.0% 5.4% 5.5% 2.3% 1.9% 7.1%
Badgered 4.5% 5.4% 2.7% 4.7% 6.6% 1.8%
Material too complex 4.5% 5.4% 5.5% 4.7% 3.8% 3.6%
Intimidated 22.3% 19.6% 23.3% 20.9% 24.5% 21.4%
Confident 10.6% 5.4% 8.2% 14.0% 9.4% 16.1%
In Control 16.2% 14.3% 16.4% 18.6% 16.0% 14.3%
Challenged 80.1% 82.1% 80.8% 81.4% 78.3% 78.6%
Difficulty Expressing 45.1% 37.5% 41.1% 47.7% 50.9% 42.9%
Missed Things 54.1% 60.7% 56.2% 41.9% 57.5% 57.1%
* see also Appendix E

ACKNOWLEDGEMENT

The authors thank the following individuals without whom the OPE program would not have been possible: the residents, fellows and staff anesthesiologists of The Cleveland Clinic Foundation; Mrs. Shelly Sords, education coordinator of the Division of Anesthesiology for her cheerful support and interest; Ms. Charlie Androjna for her skillful management of the large databank; Dr. Frances Rhoton for her encouragement and support during the project’s early days; Dr. Arthur Barnes, Vice Chairman and Dr. Fawzy G. Estafanous, Chairman of the Division of Anesthesiology, for creating an environment in which the OPE program could flourish. We especially thank Mrs. Nikki Williams for her expert secretarial support.

Appendix A. Sample OPE Question

A 42 year-old hypertensive, 135 kg, 155 cm, female is scheduled for elective cholecystectomy. She has a 40-pack year smoking history and complains of epigastric pain. There is a history of barbiturate allergy. Arterial blood gas on room air reveals pH=7.38, PO2=62, PCO2=43, BP=160/100, HR=96, RR=20, Hb=16.

  1. PREOPERATIVE EVALUATION

    1. Obesity: Is an obese person an increased anesthetic risk? Why? Criteria for severity? Further evaluation? Why? (For associated disorders: diabetes, cardiac reserve, pulmonary hypertension, hepatic problems.) What would you do with these test results?

    2. Hypertension: Would you require normalization of blood pressure perioperatively? Why? Why not? Risks of hypertension? Perioperative MI risk? Neurologic risk?

    3. Pulmonary Assessment: How do you explain the blood gas values? Hemoglobin? Are further pulmonary function tests needed? Under what circumstances? What if the PCO2 were 52?

  2. INTRAOPERATIVE COURSE

    1. Premedication: Do you think a sedative is necessary? Why? What would make you give a sedative premedication? Which and why? Is aspiration prophylaxis needed? Best regimen? Why?

    2. Selection of Monitors: CVP catheter indicated? Why/Why not? What would prompt you to recommend CVP? Arterial line? Why?

    3. Choice of anesthesia: Your surgical colleague asks if a combined regional/general anesthetic is better than general anesthesia? Effect of epidural anesthesia/analgesia on post op ABG/FRC/VC? Epidural opiates vs. local anesthetics?

    4. Induction of Anesthesia: How would you induce general anesthesia? (Consider airway, aspiration, risk, CV status). Justify each agent! When would you consider awake intubation/laryngoscopy? Assume the history of allergy to barbiturates is real. What are the implications for anesthetic management? Porphyria? Types? How would you manage? Why?

    5. Maintenance of Anesthesia (Pharmacokinetics of Obesity): Is narcotic or inhalational technique preferable in obesity? Why? Difference in recovery times? Dosing (TBW vs. BSA vs. LBW?) What pharmacokinetic properties would be desirable in the obese patient?

  3. POSTOPERATIVE CARE

    1. Should the patient be monitored in a special care unit postoperatively? How do you decide?

    2. Postoperatively, the patient is in pain and has an O2 saturation of 87% on 50% O2 by mask. What are your recommendations? Why?

    3. Suddenly the patient becomes extremely tachypneic and tachycardic with concomitant hypotension. What would you do? Why? Assume pulmonary embolism: How would you diagnose and manage? Why?

  4. ADDITIONAL TOPICS

    1. A pregnant nurse anesthetist asks if she should be allowed to work? Risk of miscarriage/fetal malformation?

    2. After induction with halothane and succinylcholine, a 10-year-old boy develops masseter spasm. Implications? Do you cancel the case? Why? What do you tell the parents/family? Why?

    3. A 55-year-old war veteran presents with chronic burning upper extremity pain. How do you evaluate? Assume reflex sympathetic dystrophy: Outline treatment plan. How do you decide on treatment modality?

Appendix B. Agenda for Typical Resident OPE Inservice Session

  1. Introduction: Background & benefit of mock orals

  2. Scheduling of OPEs

  3. Expectations for attendance

  4. Description of activities on day of OPE session

  5. How (and how not) to prepare for mock oral examinations

  6. How to approach a stem question; use of time immediately before the exam

  7. Conduct of the OPE; timing & content

  8. What do examiners look for (communication, clinical judgment, adaptability, defense of decisions made; etc.)

  9. Importance of exit questionnaire

  10. Discussion of debriefing session

  11. OPE scoring by examiners; implications for residents

Appendix C. Candidate Information Questionnaire

   Candidate Information Questionnaire

Name:__________________________ Started CA-1 Year: ________(m/y)

ON A SCALE OF 1-10, RATE HOW YOU FEEL PRESENTLY (10 = highest):

_____ I am anxious about the oral board exam.

_____ I feel prepared for the oral board exam.

_____ I feel I have adequate knowledge to take the oral board exam.

_____ I feel I have the knowledge, but may not be able to get it across to the examiners.

_____ I am so nervous I can’t think straight.

CHECK ALL APPLICABLE ITEMS:

_____ I have taken practice oral boards before.

_____ I have studied specifically for the practice oral boards by:

_____ Reading standard anesthesia texts.

_____ ASA Refresher courses and journal review articles.

_____ Editorials and journal articles.

_____ Attending a course geared to prepare for orals.

_____ Doing nothing specifically.

I look up pertinent information and read about anesthesia management in connection with the cases I do:

_____ > 75% of the time

_____ 50-75% of the time

_____ 25-50% of the time

_____ < 25% of the time

I feel I know:

_____ A little bit about everything

_____ A few topics in depth.

_____ A lot of topics in depth.

Appendix D. OPE Candidate Scoring Sheet

Candidate’s Name _____________________________________ Date ___________

CATEGORIES OF EVALUATION
  1. Ability to organize and express thoughts clearly
  2. Sound judgment in decision making and application
  3. Ability to apply basic science principles to clinical problems
  4. Adaptability to changing clinical conditions

SUBSCORE CODE
  1 = Definite Pass
  2 = Probable Pass
  3 = Probable Fail
  4 = Definite Fail

*Please circle topic number to designate major areas of test

Areas of weakness (check all applicable)

[Image: jepm-01-002_VolI_IssueII_Schubert_f0001.jpg]

Summary Statement:

Examiner _______________________________ Grade ___________

  (Print Name)

Appendix E. Exit Questionnaire

[Image: jepm-01-002_VolI_IssueII_Schubert_f0002.jpg]

Fig 1. Early growth of the OPE program: Number of exams administered 1989-1993

Fig 2. Resident Self-Assessed Anxiety and Preparedness*

Fig 3. Resident Ratings of the OPE Debrief

GLOSSARY

Acronyms

CA-1,2,3

Clinical Anesthesia, year one, two, three

ITE

In-Training Examination (written exam sponsored by joint ABA/ASA Council)

OPE

Oral Practice Examination

ONS

Overall Numerical Score = average of all sub-scores

ABA

American Board of Anesthesiology

OSCE

Objective structured clinical examination

IRR

Inter-rater reliability

Definition of Terms

Internal consistency:

The degree to which performance in one area of the OPE relates to performance in a different area.

Inter-rater reliability:

The agreement between scores assigned by two concurrent examiners.

Validity:

Defined here as criterion or concurrent validity, referring to the degree to which OPE scores are related to other concurrent measures of anesthesiology resident performance.

Section:

One of four parts of the OPE guided question:

A = preoperative C = postoperative

B = intraoperative D = additional topics

Sub-question:

Specific question/question complex within a section. There are 3-6 sub-questions per section.

Sub-score:

Examiner-assigned score based on examinee OPE performance on sub-questions within an OPE section (possible scores=1,2,3,4; 1=best).

Section score:

The average of all sub-scores in one section (range 1-4; 1=best; continuous variable).

Final grade (FG):

One of 4 possible scores (70=definite fail, 73=probable fail, 77=probable pass, 80=definite pass) assigned by examiners. Except for IRR calculations, the average of the two examiners’ FGs is used.

Pass-fail (P/F):

Pass is defined as an FG of >75, fail as ≤75.

OPE outcome:

Any measure of examinee performance on OPE.

REFERENCES

  1. ABA News. 1989;2(1):4.
  2. Carter HD. How reliable are good oral examinations? Calif J Educ Research. 1962;13:147–53.
  3. Kelley PR, Matthews KH, Schumacher CF. Analysis of the oral examination of the American Board of Anesthesiology. J Med Education. 1971;46:982–8. doi: 10.1097/00001888-197111000-00010.
  4. Pope WD. Anaesthesia oral examination (editorial). Can J Anaesth. 1993;40:907–10. doi: 10.1007/BF03010090.
  5. Eagle CJ, Martineau R, Hamilton K. The oral examination in anaesthetic resident evaluation. Can J Anaesth. 1993;40:947–53. doi: 10.1007/BF03010098.
  6. Colliver JA, Verhulst SJ, Williams RG, Norcini JJ. Reliability of performance on standardized patient cases: A comparison of consistency measures based on generalizability theory. Teaching and Learning in Medicine. 1989;1:31–7.
  7. Schubert A, Hull A, Tetzlaff J, Maurer W, Barnes A. Reliability and validity of anesthesiology “mock orals” during a three-year period. Anesthesiology. 1992;77:A1118.
  8. McGuire CH. Studies of the oral examination: experiences with orthopedic surgery. In: Lloyd JS, Langley DG, editors. Evaluating the skills of medical specialists. Chicago: American Board of Medical Specialists; 1983. pp. 105–109.
  9. Yang JC, Laube DW. Improvement of reliability of an oral examination by a structured evaluation instrument. J Med Education. 1983;58:864–72. doi: 10.1097/00001888-198311000-00005.
  10. Anastakis DJ, Cohen R, Reznick RK. The structured oral examination as a method for assessing surgical residents. Am J Surg. 1991;162:67–70. doi: 10.1016/0002-9610(91)90205-r.
  11. Muzzin LJ, Hart L. Oral Examinations (Ch 5). In: Neufeld VR, Norman GR, editors. Assessing Clinical Competence. New York: Springer Publishing Co; 1984.
  12. Evans LR, Ingersoll RW, Smith EJ. The reliability, validity and taxonomic structure of the oral examination. J Med Educ. 1966;41:651–7. doi: 10.1097/00001888-196607000-00002.
  13. Elliott RL, Juthani NV, Rubin EH, Greenfeld D, Skelton WD, Yudkowsky R. Quality in residency training: toward a broader, multidimensional definition. Academic Medicine. 1996;71(3):243–7. doi: 10.1097/00001888-199603000-00012.
  14. Liaison Committee on Medical Education. Functions and structure of a medical school. Chicago, IL: AMA Council on Medical Education; 1991. Vol. 14, pp. 174–9.
  15. Polk SL. Educational initiatives. Problems in Anesthesia. 1991;5:305–18.
  16. Stillman PL, Regan MB, Philbin M, Haley HL. Results of a survey on the use of standardized patients to teach and evaluate clinical skills. Academic Medicine. 1990;65:288–92. doi: 10.1097/00001888-199005000-00002.
  17. Polk SL. Educational initiatives. Problems in Anesthesia. 1991;5:305–18.
  18. Green JS, Robin HS, Schibanoff J, Goldstein P, Lorente JL, Hoagland P, et al. Continuous medical education: using clinical algorithms to improve the quality of health care in hospitals. J Cont Educ in Health Prof. 1992;12:143–55.
  19. Triplett HB, Wilson-Pessano S, Vintilla-Friedaman S, Levine R, Marshall G. Critical performance requirements for anesthesiology residency. Anesthesiology. 1988;69:A797.
  20. Orkin FK, Greenhow DE. A study of decision making: How faculty define competence. Anesthesiology. 1978;48:267–71. doi: 10.1097/00000542-197804000-00009.
  21. Stillman P, Swanson D, Regan MB, Philbin NM, et al. Assessment of clinical skills of residents utilizing standardized patients. A follow-up study and recommendations for application. Ann Intern Med. 1991;114(5):393–401. doi: 10.7326/0003-4819-114-5-393.
  22. Van der Vleuten CPM, Norman GR, DeGraaff E. Pitfalls in the pursuit of objectivity: issues of reliability. Medical Education. 1991;25:110–8. doi: 10.1111/j.1365-2923.1991.tb00036.x.
  23. Giles TJ. Task force completes four-year study of in-training evaluation in internal medicine. Ann R Coll Physicians Surg Can. 1983;16:265–6.
  24. Newble DI, Baxter A, Elmslie RG. A comparison of multiple choice and free response tests in examinations of clinical competence. Medical Education. 1979;13:263–8. doi: 10.1111/j.1365-2923.1979.tb01511.x.
  25. Elstein AS. Beyond multiple choice questions and essays: The need for a new way to assess clinical competence. Academic Medicine. 1993;68:244–9. doi: 10.1097/00001888-199304000-00002.
  26. Quattlebaum TG, Darden PM, Sperry JB. In-training examinations as predictors of resident clinical performance. Pediatrics. 1989;84:165–72.
  27. Maatsch JL, Huang R. An evaluation of construct validity of four alternative theories of clinical competence. Proceedings of the Annual Conference on Research in Medical Education; 1986. pp. 69–74.
  28. Kowlowitz V, Hoole AJ, Sloane PD. Implementing the objective structured clinical examination in a traditional medical school. Academic Medicine. 1991;66:345–7. doi: 10.1097/00001888-199106000-00008.
