Journal of General Internal Medicine. 2000 Feb;15(2):129–133. doi: 10.1046/j.1525-1497.2000.03119.x

Evidence-Based Medicine Training in Internal Medicine Residency Programs

A National Survey

Michael L Green 1
PMCID: PMC1495338  PMID: 10672117

Abstract

To characterize evidence-based medicine (EBM) curricula in internal medicine residency programs, a written survey was mailed to 417 program directors of U.S. internal medicine residency programs. For programs offering a freestanding (dedicated curricular time) EBM curriculum, the survey inquired about its objectives, format, curricular time, attendance, faculty development, resources, and evaluation. All directors responded to questions regarding integrating EBM teaching into established educational venues. Of 417 program directors, 269 (65%) responded. Of these 269 programs, 99 (37%) offered a freestanding EBM curriculum. Among these, the most common objectives were performing critical appraisal (78%), searching for evidence (53%), posing a focused question (44%), and applying the evidence in decision making (35%). Although 97% of the programs provided MEDLINE, only 33% provided Best Evidence or the Cochrane Library. Evaluation was performed in 37% of the freestanding curricula. Considering all respondents, most programs reported efforts to integrate EBM teaching into established venues, including attending rounds (84%), resident report (82%), continuity clinic (76%), bedside rounds (68%), and emergency department (35%). However, only 51% to 64% of the programs provided on-site electronic information and 31% to 45% provided site-specific faculty development. One third of the training programs reported offering freestanding EBM curricula, which commonly targeted important EBM skills, utilized the residents' experiences, and employed an interactive format. Less than one half of the curricula, however, included curriculum evaluation, and many failed to provide important medical information sources. Most programs reported efforts to integrate EBM teaching, but many of these attempts lacked important structural elements.

Keywords: evidence-based medicine, residency programs, curriculum, graduate medical education, survey


Evidence-based medicine (EBM) refers to the conscientious, explicit, and judicious use of the current best evidence in making decisions about the care of individual patients.1 The current advocacy of EBM derives from the growing evidence base supporting many clinical maneuvers,2 and the recognition of physicians' unmet information needs,3,4 poor information retrieval skills,5 deterioration of up-to-date knowledge after training,6 and practice variations for interventions with established efficacy.7 In response to these needs and recognized curricular deficiencies, the Accreditation Council for Graduate Medical Education8 and the Association of American Medical Colleges9 have called for the introduction of clinical epidemiology, biostatistics, critical appraisal, and medical informatics into medical school and graduate medical education curricula.

In many internal medicine residency programs, this training has traditionally occurred in journal clubs, in which small groups discuss articles chosen for their recentness, “landmark” status, or general relevance to a particular specialty. A recent national survey found journal clubs active in 95% of internal medicine programs.10 However, the majority of published reports of individual journal clubs either do not examine their effectiveness or find it to be very limited.11,12 Furthermore, these curricula generally focus on clinical epidemiology principles, critical appraisal skills, and “keeping up” with the medical literature, but neglect individual patient decision making.11,13

The practice of EBM, in contrast, begins and ends with an individual patient. In particular, an EBM curriculum must include the acquisition, appraisal, and application of “the evidence” in the context of individual patient decision making. Many internal medicine programs are either initiating new EBM curricula or transforming their traditional journal clubs. The objective of this study was to determine the prevalence and characteristics of EBM curricula in internal medicine residency programs.

METHODS

In July 1998, a written survey was mailed to program directors on the mailing list of the Association of Program Directors in Internal Medicine, which included all U.S. programs and 12 Canadian programs. A second request was sent 1 month later. Program directors were instructed to complete the survey themselves or to designate the faculty member with primary responsibility for EBM training to do so.

The survey was developed for this study and revised after pilot testing. General medicine faculty at the author's institution completed the survey with responses based on the primary care residency program's EBM curriculum and gave written comments on clarity and content. Most respondents required less than 10 minutes to complete the instrument.

The survey instructions included specific definitions of journal clubs, EBM, and EBM curricula, and distinguished freestanding from integrated EBM curricula. Evidence-based medicine refers to the conscientious, explicit, and judicious use of the current best evidence in making decisions about the care of individual patients and involves 4 steps: (1) convert information needs into answerable questions, (2) efficiently acquire the best evidence, (3) critically appraise the evidence for its validity and usefulness, and (4) interpret the results for an individual patient. The main goal of an EBM curriculum is to improve residents' skills and behaviors in integrating “the evidence” into their decision making for individual patients. An EBM curriculum may be freestanding, with dedicated curricular time, or may be integrated, with organized efforts to teach and exemplify EBM in “real time” in various venues. This is in contrast to journal clubs, which consist of group discussions of articles chosen for their recentness, landmark status, or general interest. The goals of a journal club are usually to improve generic critical appraisal skills and facilitate “keeping up” with emerging literature.

The survey first inquired about the existence of a freestanding EBM curriculum and, if offered, its objectives, format, attendance, faculty development, resources, and evaluation. With the exception of objectives, which required free text responses, the other questions were yes/no, multiple choice, or numeric. Prior to distribution, the author generated a list of potential objectives for coding, which ultimately represented 85% of the responses. Four new variables were added post hoc to capture the remaining 15%. There was little ambiguity in coding the responses for the objectives, but a reliability analysis was not done to confirm this.

Whether or not they offered a freestanding curriculum, respondents were asked about organized efforts to integrate EBM teaching into established educational and clinical activities, including bedside rounds, attending rounds, resident report, continuity clinic, and the emergency department. For programs attempting this integrated EBM teaching, the survey inquired about site-specific electronic medical information, faculty development, and documentation of resident EBM behaviors.

Frequencies and means were determined for descriptive data. For comparisons, χ2 tests and t tests were used for categorical and continuous variables, respectively.

RESULTS

Out of 417 programs, 269 returned the questionnaire (response rate, 65%). Three of the respondents indicated that their program had either been discontinued or merged with another program, leaving 266 programs for analysis. Thirty-seven percent (n = 99) of the programs offered a freestanding EBM curriculum, which was equally common in university-based programs (39%, 45 of 116) and community-based programs (37%, 54 of 146) (p = .80). The frequencies of learning objectives, curriculum characteristics, and medical information resources offered are listed in Table 1. The mean curricular time was 20 hours per year and the mean resident attendance per session was 17 (SD ± 11).
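To make the comparison above concrete, the short Python sketch below reruns the university- versus community-based χ2 comparison from the reported counts (45 of 116 vs 54 of 146). This is only an illustration under the assumption of a standard SciPy workflow; the original analysis software is not reported, and the exact p value depends on whether a continuity correction is applied, but it lands well above .05 either way, consistent with the reported p = .80.

```python
# Illustrative sketch only; the paper does not report the analysis software.
# Counts from the Results: 45 of 116 university-based and 54 of 146
# community-based programs offered a freestanding EBM curriculum.
import numpy as np
from scipy import stats

# 2 x 2 contingency table: rows = program type, columns = (curriculum, no curriculum)
table = np.array([[45, 116 - 45],
                  [54, 146 - 54]])

chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-square = {chi2:.3f}, df = {dof}, p = {p:.2f}")
# p is well above .05, i.e., no detectable difference between program types.
```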

Table 1.

Features of Freestanding Evidence-Based Medicine (EBM) Curricula in Programs That Had an EBM Curriculum (N = 99)

Feature Number of EBM Curricula
Objectives
 Perform critical appraisal of the literature 77
 Search for “the evidence” 52
 Articulate a focused clinical question 44
 Apply the evidence to individual patient decision making 35
 Integrate “the evidence” in decision making in actual practice 23
 Understand principles of biostatistics 16
 Obtain introduction to EBM 16
 Understand principles of clinical epidemiology 14
 Acquire more positive attitude toward EBM 12
 Keep up with the medical literature 7
 Establish habit of lifelong learning 7
 Appreciate uncertainty in medical decision making 3
 Engage in clinical research or improve research skills 3
 Judge the validity of clinical guidelines 2
Characteristics
 Implementation
  Ongoing experience 57
  Block time 28
  Both 12
 Format
  Faculty-directed didactic session 27
  Faculty-directed interactive session 10
  Resident-directed interactive session 26
  Both resident and faculty sessions 29
  Other 3
 Case-based discussions
  Sessions center on clinical scenario 70
  Sessions center on actual patient 68
 Selection of cases discussed
  Residents 25
  Faculty 10
  Both 64
 Faculty development
  EBM skills 51
  Facilitating techniques 51
Medical Information Sources
 MEDLINE 95
 Internet access 76
 ACP Journal Club 75
 Best Evidence 32
 Cochrane Library 31
 Evidence-Based Medicine (journal) 21

Of the 99 programs offering a freestanding EBM curriculum, 36 (36%) conducted an evaluation, which was equally common in university-based programs (33%, n = 15) and community-based programs (39%, n = 21) (p = .50). The outcomes measured in the curriculum evaluations are listed in Table 2.

Table 2.

Outcome Assessment in Evidence-Based Medicine (EBM) Curricula that Had an Evaluation Component (N = 36)

Outcome Assessment Number (%) of Curricula
Satisfaction questionnaire 30 (83)
Exercise based on critically appraising a journal article 22 (61)
Documentation of resident attendance and participation 21 (58)
Exercise based on applying the evidence to an individual patient 20 (56)
Assessment of attitude toward EBM 12 (33)
EBM knowledge examination 7 (19)
Documentation of residents' actual practice of EBM in clinical settings 5 (14)

Whether or not they offered a freestanding EBM curriculum, most of the responding programs undertook organized efforts to integrate EBM teaching in real time in one or more venues, including attending rounds (84%, n = 218), resident report (82%, n = 214), continuity clinic (76%, n = 199), bedside rounds (68%, n = 177), and emergency department (35%, n = 90). For the programs attempting this integrated EBM teaching, Table 3 lists the structural elements provided for each site.

Table 3.

Structural Elements Provided for Sites of Integrated Evidence-Based Medicine (EBM) Teaching (N = 261 programs)

Venue and Element Number (%) of Programs
Attending rounds (n = 218)
 On-site electronic information 111 (51)
 Faculty development 99 (45)
 Documentation of residents' actual EBM behaviors 34 (16)
Resident report (n = 214)
 On-site electronic information 120 (56)
 Faculty development 80 (37)
 Documentation of residents' actual EBM behaviors 59 (28)
Continuity clinic (n = 199)
 On-site electronic information 127 (64)
 Faculty development 84 (42)
 Documentation of residents' actual EBM behaviors 45 (23)
Bedside rounds (n = 177)
 On-site electronic information 85 (48)
 Faculty development 79 (45)
 Documentation of residents' actual EBM behaviors 25 (14)
Emergency department (n = 90)
 On-site electronic information 54 (60)
 Faculty development 28 (31)
 Documentation of residents' actual EBM behaviors 9 (10)

DISCUSSION

Recognizing the limitations of journal clubs, many internal medicine programs are developing EBM curricula or transforming their traditional journal clubs, but few curricula have been reported.14,15 This national survey found that 37% of programs dedicate curricular time to a freestanding EBM curriculum. The 4 most commonly cited objectives conform exactly to the 4 steps of evidence-based decision making.16 This represents an advance over journal clubs, which consistently target critical appraisal but rarely focus on the other 3 steps.11,13

As adult learners, residents should thrive in curricula informed by adult learning theory.17 Learners, in this paradigm, must understand why they need to learn something, take responsibility for their learning, exploit their experience as a resource, and link their readiness to learn with the exigency of real-life situations. The characteristics of the EBM curricula in this survey reflect attention to adult learning theory in their development. In most of the curricula, residents chose the cases, which often represented real clinical scenarios involving their actual patients. The seminars were usually interactive and 58% of the time were directed or codirected by the residents. The effectiveness of this approach has been confirmed in a controlled trial.15

To efficiently practice EBM, residents need access to a range of information resources. MEDLINE is an important resource but is limited by the predominance of basic science articles, imperfect indexing, and its complex search requirements. Furthermore, busy clinicians cannot practically identify, read, and appraise the entire literature addressing each of their questions. As a more realistic alternative, physicians can seek and apply evidence-based summaries of articles and systematic reviews.18 In one report of EBM on hospital rounds, most clinical questions were answered quickly with these resources as early options in a searching algorithm.19 Though most of the programs in this survey provided MEDLINE, only about 30% provided Best Evidence or the Cochrane Library, two collections of this readily accessible information.

Thirty-six percent of the programs evaluated their curricula. In over 50% of these, residents completed an exercise involving the critical appraisal of an actual journal article or the application of scientific evidence to an individual patient. This realistic measurement of skills represents an advance over the evaluation of journal clubs, which most commonly measure surrogate outcomes such as clinical epidemiology knowledge on multiple choice examinations.11,12

In addition to measuring the impact of a curriculum on skills, evaluators must ask: are residents more frequently acquiring, appraising, and applying “the evidence” in their day-to-day practice? In this survey, only 14% of the programs that conducted an evaluation of their EBM curriculum (and 5% of the programs with established curricula) documented the residents' actual practice of EBM in clinical settings. The questionnaire lacked sufficient detail to determine the exact outcome measures they used. The reported curriculum evaluations that measured behavior relied on self-report of hours spent reading, attention to methods and results sections, preferred sources of information, or frequency of referral to original articles to answer clinical questions.11 However, retrospective self-reporting may underestimate physicians' information needs and overestimate their information-seeking behaviors.3 In a promising report, Flynn and Helwig described using audiotapes of teaching sessions to directly determine the frequency with which residents use the evidence in their practices.20

Though freestanding curricula can help residents improve their EBM skills, this format does not confront the actual logistical problems and time constraints faced by busy clinicians. Clinicians will not fully embrace EBM unless it allows them to ask and answer most of their questions at the time that they emerge in the flow of patient care. This survey confirms that many training programs have undertaken efforts to teach and exemplify EBM in established venues.

In recent years, medical educators have explored ways to accomplish this type of integrated EBM training, but little has been reported.14,21 To successfully integrate EBM teaching, programs will most likely require on-site electronic medical information and site-specific faculty development. In this survey, 31% to 64% of programs provided these elements, depending on the particular venue. Furthermore, only the 10% to 23% of programs that tracked residents' EBM behaviors will be able to determine the impact of their curricular efforts.

There are a few important limitations of this study. The response rate of 65% may not have captured a completely representative group of programs; however, the programs that failed to respond had the same proportion of university-based programs and the same geographic distribution as those that responded. With the exception of the question about curriculum objectives, the survey questions required yes/no, multiple choice, or numeric answers, and this constrained format may have prevented some program directors from fully describing important curricular innovations. Finally, this type of survey is susceptible to overreporting, because respondents can list what they offer without reporting on its quality or effectiveness.

In conclusion, at the time of this survey, approximately one third of U.S. training programs offered freestanding EBM curricula, which commonly targeted important EBM skills, utilized the residents' actual experience, and employed an interactive seminar format. Less than one half of these, however, offered faculty development or performed an evaluation, and many failed to provide some useful medical information sources. Most programs reported efforts to integrate EBM teaching into established clinical and educational venues, but many of these attempts lacked important structural elements. As graduate medical education curricula in this area evolve, educators should focus on innovative ways to integrate EBM into the flow of patient care, on providing access to a broad range of information resources, and on curriculum evaluation, particularly of behavioral outcomes.

Acknowledgments

The author thanks Mary Cerreta, Dianne Kalish, and Eydie Sirica for their assistance with data collection and management and Dr. Patrick O'Connor for his thoughtful review of the manuscript.

REFERENCES

1. Sackett DL, Rosenberg WM, Muir Gray JA, Haynes RB, Richardson WS. Evidence-based medicine: what it is and what it isn't. BMJ. 1996;312:71–2. doi: 10.1136/bmj.312.7023.71.
2. Ellis J, Mulligan I, Rowe J, Sackett DL. Inpatient medicine is evidence based. Lancet. 1995;346:407–10.
3. Covell DG, Uman GC, Manning PR. Information needs in office practice: are they being met? Ann Intern Med. 1985;103:596–9. doi: 10.7326/0003-4819-103-4-596.
4. Gorman PN, Helfand M. Information seeking in primary care: how physicians choose which clinical questions to pursue and which to leave unanswered. Med Decis Making. 1995;15:113–9. doi: 10.1177/0272989X9501500203.
5. McKibbon KA, Haynes RB, Walker-Dilks CJ, et al. How good are clinical MEDLINE searches? A comparative study of clinical end-user and librarian searches. Comput Biomed Res. 1990;23:583–93. doi: 10.1016/0010-4809(90)90042-b.
6. Ramsey PG, Carline JD, Inui TS, et al. Changes over time in the knowledge base of practicing internists. JAMA. 1991;266:1103–7.
7. Soumerai SB, McLaughlin TJ, Spiegelman D, Hertzmark E, Thibault G, Goldman L. Adverse outcomes of underuse of beta-blockers in elderly survivors of acute myocardial infarction. JAMA. 1997;277:115–21.
8. Accreditation Council for Graduate Medical Education. Program requirements for residency education in internal medicine: special educational requirements. In: The Graduate Medical Education Directory, 1996–97. Chicago, Ill: American Medical Association; 1996:79.
9. Muller S, chairman. Physicians for the twenty-first century: report of the project panel on the general professional education of the physician and college preparation for medicine. J Med Educ. 1984;59(pt 2):127–8, 155–67.
10. Sidorov J. How are internal medicine residency journal clubs organized and what makes them successful? Arch Intern Med. 1995;155:1193–7.
11. Green ML. Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: a critical review of curricula. Acad Med. 1999;74:686–94. doi: 10.1097/00001888-199906000-00017.
12. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. CMAJ. 1998;158:177–81.
13. Alguire P. A review of journal clubs in postgraduate medical education. J Gen Intern Med. 1998;13:347–53. doi: 10.1046/j.1525-1497.1998.00102.x.
14. Reilly B, Lemon M. Evidence-based morning report: a popular new format in a large teaching hospital. Am J Med. 1997;103:419–26. doi: 10.1016/s0002-9343(97)00173-3.
15. Green ML, Ellis PJ. Impact of an evidence-based medicine curriculum based on adult learning theory. J Gen Intern Med. 1997;12:742–50. doi: 10.1046/j.1525-1497.1997.07159.x.
16. Sackett DL, Richardson WS, Rosenberg W, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. New York, NY: Churchill Livingstone; 1997:3.
17. Knowles M. The Adult Learner: A Neglected Species. Houston, Tex: Gulf Publishing Co; 1984.
18. Cook DJ, Mulrow CD, Haynes RB. Systematic reviews: synthesis of best evidence for clinical decisions. Ann Intern Med. 1997;126:376–80. doi: 10.7326/0003-4819-126-5-199703010-00006.
19. Sackett DL, Straus SE. Finding and applying evidence during clinical rounds: the “evidence cart”. JAMA. 1998;280:1336–8. doi: 10.1001/jama.280.15.1336.
20. Flynn C, Helwig A. Evaluating an evidence-based medicine curriculum. Acad Med. 1997;72:454–5. doi: 10.1097/00001888-199705000-00096.
21. Grimes DA. Introducing evidence-based medicine into a department of obstetrics and gynecology. Obstet Gynecol. 1995;86:451–7. doi: 10.1016/0029-7844(95)00184-S.
