Journal of General Internal Medicine. 2002 Jan;17(1):58–65. doi: 10.1046/j.1525-1497.2002.10121.x

Early Introduction of an Evidence-based Medicine Course to Preclinical Medical Students

Malathi Srinivasan 1,5, Michael Weiner 1,5,6, Philip P Breitfeld 2, Fran Brahmi 3, Keith L Dickerson 4, Gary Weiner 2
PMCID: PMC1494995  PMID: 11903776

Abstract

Evidence-based Medicine (EBM) has been increasingly integrated into medical education curricula. Using an observational research design, we evaluated the feasibility of introducing a 1-month problem-based EBM course for 139 first-year medical students at a large university center. We assessed program performance through the use of a web-based curricular component and practice exam, final examination scores, student satisfaction surveys, and a faculty questionnaire. Students demonstrated active involvement in learning EBM and ability to use EBM principles. Facilitators felt that students performed well and compared favorably with residents whom they had supervised in the past year. Both faculty and students were satisfied with the EBM course. To our knowledge, this is the first report to demonstrate that early introduction of EBM principles as a short course to preclinical medical students is feasible and practical.

Keywords: evidence-based medicine, preclinical medical students, web-based curriculum, problem-based learning, medical education


The teaching of evidence-based medicine (EBM) has been increasingly integrated into curricula at all levels of medical education.1–4 During residency and in practice, clinicians reinforce principles of clinical decision making by asking relevant clinical questions, interpreting the medical literature, and applying principles of biostatistics and clinical epidemiology to the care of individual patients. The Medical School Objectives Program developed by the Association of American Medical Colleges (AAMC) advocates incorporation of EBM principles throughout undergraduate education.5,6 Published reports of EBM curricula in medical schools describe teaching that occurs predominantly in the third and fourth years of medical school,7–9 perhaps because of students' increased experience with patient case studies. Yet increasingly, medical educators are finding innovative ways of integrating EBM techniques throughout the 4-year medical school experience, e.g., by teaching search strategies and evidence assessment during preclinical classes, then reviewing strategies for evaluating different types of articles (diagnosis, harm, prognosis, etc.) during clinical rotations.10

At our institution, we began formal introduction of EBM to fourth-year medical students during a required 1-week clerkship in Medical Informatics. This clerkship received the highest rating of all clinical clerkships in 1999. Students strongly recommended introducing the material before their third-year clinical rotations. To address their recommendations, we developed an EBM course for first-year medical students, modeled on McMaster University's problem-based learning format.11 We hypothesized that first-year medical students could master EBM principles even without clinical backgrounds, since EBM involves techniques for critically appraising information rather than comprehensive mastery of a specific field. We therefore sought to determine whether first-year medical students could master introductory EBM principles through a short course. Secondarily, we sought to assess whether web-based curricula were useful adjuncts in teaching EBM.

PROGRAM DESCRIPTION

Indiana University School of Medicine instructs the second largest medical student body in the country.12 For their first 2 years, 139 of the 279 students in each class are instructed at the Indianapolis center, while the others are instructed at 8 regional centers throughout Indiana. All students pursue their clinical instruction in Indianapolis. Our EBM intervention was implemented only in Indianapolis, since each center locally controls the methods used to teach the statewide core curriculum.

After review of AAMC's Medical School Objectives Program,5,6 United States Medical Licensing Examination Step 1 guidelines,13 and Indiana University School of Medicine's life-long learning competency and biostatistics requirements,14 we developed a short EBM course for our first-year students, consisting of 8 student contact hours.

In January 2000, our EBM course was conducted as two 1-hour lectures and three 2-hour small-group sessions. Sixteen small-group facilitators with EBM experience included faculty in emergency medicine (8), pediatrics (5), internal medicine (2), and library sciences (1). Small groups consisted of 1 to 2 facilitators and 10 to 11 students. A facilitator's handbook was developed to provide consistent small-group experiences, with detailed objectives, timelines, commonly asked questions and answers, sample dialog, completed “Users' Guides to the Medical Literature” worksheets,15–19 critical concept summaries, background reading material, small-group teaching strategies, and references. Prior to the course, facilitators met to review the handbook and to discuss teaching and learning strategies.

A 1-hour introductory lecture reviewed standard biostatistical concepts and construction of clinical questions (Table 1). The class was then introduced to clinical questions that would be discussed in small groups, after watching Viagra commercials (Pfizer, Washington, D.C.) and part of an “ER” television episode (October 1999). Three small-group, problem-based learning sessions focused on evaluation skills frequently used by clinicians20: assessment of the risks/benefits of therapeutic interventions and diagnostic tests, and of causation of harm. During each interactive session, groups discussed a clinical vignette, developed a relevant question, and evaluated a corresponding article: Session 1, therapeutic interventions: Among diabetic patients with erectile dysfunction, how efficacious is Viagra versus placebo in improving successful intercourse?21; Session 2, diagnostic tests: Among head trauma patients with brief loss of consciousness, how well does head CT versus physical examination rule out intracranial hemorrhage?22; and Session 3, causation of harm: Among patients with depression, how strong is the evidence for an associated sexual abuse history?23 Students completed worksheets with questions from “Users' Guides to the Medical Literature.”15–19 We emphasized standard EBM concepts such as question formulation, study design, bias, and statistical test interpretation (Table 1). All classrooms were internet-accessible, and during the third small-group session, we demonstrated search strategies for accessing MEDLINE and Cochrane articles. At each session's end, critical concepts were reviewed, and students received pocket cards with formulas and key concepts for reference. A final lecture reviewed all major EBM concepts covered in the initial lecture and small groups. In addition, the course director (GW) sent weekly “teaser” clinical questions via a first-year electronic discussion group. Dedicated secretarial support facilitated course coordination.
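As an illustration of the kind of search demonstrated in the third session (this example is ours, not a reproduction of the course handout), the Session 1 question maps to a MEDLINE strategy along the lines of:

    (erectile dysfunction OR impotence) AND (sildenafil OR Viagra) AND diabetes mellitus AND randomized controlled trial [Publication Type]

restricting retrieval to trials that address the population, intervention, and outcome named in the question.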

Table 1.

EBM Course Content for First-year Medical Students

Column headings: Learning Objectives; USMLE Step 1 Content; Method of Instruction (Reading Material, Lecture, Small-Group Session, Website)
1. Students will be able to describe the fundamental concepts of measurement and the description of scientific data.
 Describe each of the following types of data and correctly identify the type when described in a study
  Discrete, dichotomous
  Continuous
  Nominal
  Ordinal
  Interval
  Ratio
 Describe each of the following types of variables and identify the variable when used in a study
  Independent
  Dependent
 Describe and identify the following sample distributions when presented with a data set
  Nominal
  Binomial
 Calculate and describe the limitations of each of the following measures of central tendency
  Mean
  Median
  Mode
 Explain how measures of variability will change with sample size and measurement tool precision
  Standard deviation
  Confidence interval
 Identify the appropriate rule of probability to use when approaching a data set
  Additive rule
  Multiplicative rule
  Binomial expansion
2. Students will use the medical literature to determine whether a clinical article advocating or dismissing a treatment contains conclusions about the efficacy that are valid and could be applicable (sensible and feasible) in one's own practice.
 State the null hypothesis for a proposed treatment
  Null hypothesis
 Explain why each of the following is important when assessing the validity of a study
  Randomization
  Blinding
  Bias
 Assess disease outcomes
  Identify meaningful outcomes
  Control event rate
  Experimental event rate
  Intention to treat
 Measure impact on health by calculating and understanding each of the following
  Relative risk
  Relative risk reduction
  Absolute risk reduction
  Number needed to treat
  Number needed to harm
 Assess accuracy by understanding the application of each of the following
  Confidence Intervals
  Statistical significance
  Statistical power
  Type I Error
  Type II Error
  Sample size calculation
3. Students will use the medical literature to determine whether a clinical article has drawn valid and applicable conclusions about disease causality or clinical efficacy.
 Understand the fundamental concepts of study design and describe the hierarchy of evidence for each of the following designs
  Randomized controlled trials
  Case-control
  Cohort (retrospective and prospective)
  Case series
 Calculate each of the following
  Incidence
  Prevalence
  Nominal and adjusted disease rates
4. Students will use the medical literature to determine whether clinical data (signs, symptoms, or test results) are likely to be valid and applicable in practice.
 Calculate and understand the use and limitations of each of the following.
  Sensitivity
  Specificity
  Likelihood ratio
  Positive predictive value
  Negative predictive value
  Odds ratios
  Convert pre-test probability to pre-test odds
  Calculate post-test odds using pre-test odds and LR
  Understand the selection and use of reference standards
5. Students will be able to use library resources to access original research in the medical literature.
 Formulate a well designed clinical question
  Describe a population
  Describe an intervention or test
  Describe a desired outcome
 Use computer program with standard graphical interface to search the medical literature for articles related to the clinical question
  Access PubMed or Ovid on the internet
  Complete a literature search
 Use Cochrane Collection to access systematic reviews
  Retrieve appropriate article from Cochrane database

USMLE, United States Medical Licensing Examination.
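The pocket-card arithmetic behind objectives 2 and 4 can be made concrete with a brief sketch; the Python code below is illustrative only, and the numbers are hypothetical rather than drawn from the course materials.

    # Objective 4: updating a pre-test probability with a likelihood ratio (LR).
    def prob_to_odds(p):
        return p / (1 - p)

    def odds_to_prob(odds):
        return odds / (1 + odds)

    def post_test_prob(pre_test_prob, lr):
        # post-test odds = pre-test odds x LR
        return odds_to_prob(prob_to_odds(pre_test_prob) * lr)

    # Hypothetical example: pre-test probability 30%; a positive result on a test
    # with 90% sensitivity and 90% specificity gives LR+ = 0.90 / (1 - 0.90) = 9.
    print(round(post_test_prob(0.30, 9), 2))   # 0.79, i.e., about a 79% post-test probability

    # Objective 2: number needed to treat (NNT) = 1 / absolute risk reduction.
    print(round(1 / (0.10 - 0.06)))            # hypothetical event rates 10% vs. 6% -> NNT of 25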

Students were encouraged to pursue self-directed learning using supplemental online EBM curricula. Using WebCT (University of British Columbia, Vancouver, British Columbia), a program that builds educational websites with graphical and testing capabilities, the course director developed a 20-page web-based EBM curriculum and a practice examination. The website was colorful and easy to navigate. It provided access to a moderated electronic discussion group.

PROGRAM EVALUATION METHODS

We used a multidimensional approach to assess course performance, examining student preparation, performance, and participation; student and facilitator satisfaction; and utilization of the supplemental web curriculum.

Student performance was assessed using an online practice examination and a written final examination. The 60-item online practice exam was programmed through WebCT. Participation was voluntary, as part of formative self-evaluation. Four exam components paralleled the course structure: basic biostatistics, evaluation of therapies, diagnostic tests, and study design. Students could sign on from any internet-accessible site and could repeatedly submit exam components. The multiple-choice practice exam was open book. Once a component was submitted, incorrect answers were explained and links to relevant website sections were provided. WebCT stored submitted answers for each attempt by log-in ID and generated score distributions for each submitted attempt.

Summative performance evaluation was obtained through a proctored written final examination. The 30-item multiple-choice final exam paralleled course objectives and practice exam content, and was felt to reflect relevant problems encountered by practicing physicians. We emphasized understanding EBM concepts, and practical EBM applications to clinical problems, rather than formula memorization. Since clinicians have ready access to statistical formulas in clinical practice, the final exam was open book, with formulas provided. The questions on the final examination were different from those asked during the online examination, although the same concepts and applications were covered. Students chose among 4 answers per question; incorrect answers reflected common errors likely to be made while answering the question (Table 2).

Table 2.

EBM Practice and Final Examination: Questions and Student Performance

For each topic, a sample question is shown with its keyed answer, followed by results on the voluntary online practice exam (first submission: number of items, number of students, average % correct ± SD) and on the mandatory final exam taken by all 139 students (number of items, average % correct ± SD).
Basic biostatistics: A randomized, placebo-controlled, double-blinded trial reports that patients treated with Snoreeze have a relative risk of 0.25 for snoring compared to control, with a 95% confidence interval of 0.10–1.25. Which of the following statements is true?
A. If the investigators enrolled more patients, the 95% confidence interval for the relative risk would most likely increase.
B. You can be 95% confident that the reported point estimate of the relative risk (0.25) is the true relative risk for the whole population from which the study sample was drawn.
C. You can be 95% confident that Snoreeze decreases the risk of snoring.
D. You can be 95% confident that the true value for the RR for the whole population from which the study sample was drawn lies between 0.10 and 1.25.
Answer: D. Practice exam: 20 items, 119 students, 77% ± 17 correct. Final exam: 7 items, 96% ± 5 correct.
Therapy: In a trial of prophylactic Ineffectol to prevent depression in teenagers, 20% of control patients develop depression and 15% of treated patients develop depression. How many teenagers would need to be treated with prophylactic Ineffectol to prevent one teenager from developing depression?
A. 5
B. 20
C. 50
D. 75
Answer: B. Practice exam: 14 items, 123 students, 79% ± 20 correct. Final exam: 9 items, 97% ± 2 correct.
Diagnosis: You have decided to use a new early-detection test for melanoma on a group of professional sunbathers with a family history of malignant melanoma. The prevalence of melanoma in this population is 60%, and the test is 90% sensitive and 90% specific. In this population, what is the chance (probability) that an individual with a positive test has a malignant melanoma?
A. 60%
B. 72%
C. 85%
D. 93%
Answer: D. Practice exam: 20 items, 121 students, 75% ± 16 correct. Final exam: 9 items, 92% ± 5 correct.
Study design: The State Health Department questioned 100 people who got sick after a picnic and 100 people who did not, asking each whether they had eaten the egg salad. Sixty of those who got sick ate the egg salad; twenty of those who did not get sick ate the egg salad. What is the correct measure of the association between eating egg salad and getting sick?
A. 6.0 Odds Ratio
B. 3.0 Odds Ratio
C. 2.27 Relative Risk
D. 0.16 Relative Risk Reduction
Answer: A. Practice exam: 5 items, 119 students, 77% ± 23 correct. Final exam: 5 items, 91% ± 7 correct.
* Answers are shown after each question. Students improved significantly between the first practice exam and the final exam for each section (P < .05, paired t test).

RR, relative risk.
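The keyed answers to the sample questions in Table 2 follow directly from the formulas taught in the course; the short Python sketch below (ours, not part of the course or examination materials) reproduces them.

    # Therapy item: number needed to treat = 1 / absolute risk reduction.
    control_rate, treated_rate = 0.20, 0.15
    print(round(1 / (control_rate - treated_rate)))           # 20 -> answer B

    # Diagnosis item: probability of melanoma given a positive test
    # (prevalence 60%, sensitivity 90%, specificity 90%).
    prevalence, sensitivity, specificity = 0.60, 0.90, 0.90
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    print(round(true_pos / (true_pos + false_pos), 2))        # 0.93 -> answer D

    # Study design item: odds ratio for egg salad exposure in the case-control data.
    odds_exposed_sick = 60 / 40    # exposed vs. unexposed among the 100 who got sick
    odds_exposed_well = 20 / 80    # exposed vs. unexposed among the 100 who did not
    print(odds_exposed_sick / odds_exposed_well)              # 6.0 -> answer A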

Student participation was assessed through small-group attendance and through facilitators' ratings of student preparation and participation, based on 15 questions in a 50-item written facilitator questionnaire. Student use of the web-based supplemental curriculum was tracked through WebCT, which records unique log-in IDs, the number of times each page is accessed (“hits”), and the time spent per page.

Student satisfaction was assessed through anonymous online and written questionnaires. Students completed a 23-item online questionnaire anytime during the course, focusing on their course preparation and satisfaction. After the final examination, a 40-item written questionnaire was distributed, focusing on online curriculum utility, course assessment, and satisfaction. Items were scored from strongly agree (1) to strongly disagree (4).

Facilitator satisfaction and experiences were assessed through a 50-item questionnaire administered after the course. Facilitators reported their teaching responsibilities, prior EBM experience, course preparation time, and satisfaction with course content and structure. In addition, 15 questions assessed facilitators' impressions of student performance and preparation, scored from strongly disagree (1) to strongly agree (5). Eight questions asked facilitators to rate their students' abilities in using EBM relative to residents whom they had supervised in the previous year. We calculated means and standard deviations, t tests, or χ2 tests as appropriate for each data set.
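The article does not report its analysis code; as a rough sketch of the kinds of comparisons described above (a paired t test on matched scores and a χ2 test on categorical responses), written with placeholder data rather than the study's data, one might use:

    # Illustrative analysis sketch with invented placeholder data (not the study data).
    from scipy import stats

    practice_scores = [75, 80, 70, 85, 78, 82]   # hypothetical percent-correct, practice exam
    final_scores = [92, 95, 90, 97, 93, 96]      # hypothetical percent-correct, final exam (matched by ID)

    t_stat, p_value = stats.ttest_rel(final_scores, practice_scores)   # paired t test
    print(f"paired t = {t_stat:.2f}, P = {p_value:.4f}")

    contingency = [[30, 10], [25, 20]]            # hypothetical 2 x 2 table of categorical responses
    chi2, p, dof, expected = stats.chi2_contingency(contingency)       # chi-square test
    print(f"chi-square = {chi2:.2f}, P = {p:.4f}")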

PROGRAM EVALUATION RESULTS

Small-group Attendance

All 139 first-year medical students in Indianapolis took the EBM course. Attendance data showed that only 1 student missed a small-group session, and that student rescheduled it.

Small-group Preparation and Participation

Student small-group preparation was assessed using the voluntary online student questionnaire, completed by 82 students (59%). All responding students indicated that they prepared in advance for their small groups, spending on average slightly more than 1 hour per session. Most (70%) did not have significant prior biostatistical experience. Because the questionnaire was anonymous, we could not assess differences between respondents and nonrespondents.

Fifteen of 16 (94%) facilitators completed the written facilitator questionnaire. Facilitators reported preparing 3.6 ± 1.5 hours per session. Facilitators reported that on a 5-point scale (5 = strongly agree), students were well prepared for class (4.5 ± 0.7), were interested in learning EBM (4.5 ± 0.6), participated in EBM topics (4.3 ± 0.5), and demonstrated responsibility in learning material (4.2 ± 0.6).

Knowledge Acquisition

Knowledge acquisition by students was assessed using formative and summative testing. The formative online practice exam was used by 123 (88%) students (Table 2). Students submitted exam components once (66% of users), twice (29%), three times (3%), or more than three times (2%). Students who submitted more than once significantly improved their scores between the first and second attempts (P < .0001; 2-tailed paired t test). However, since answers became available once a component was submitted, we report percentages of correct answers for first submissions only.

All 139 students took the summative final examination. The average score was 28.3 ± 1.9 points out of a possible 30. Final exam and practice exam component scores are compared in Table 2. We were able to match the log-on IDs used for the practice exam with the identifiers used for the final exam for all but 7 students. The average score of these 112 identified students improved significantly between the first practice exam and the final exam (P < .05; 2-tailed paired t test).

Use of Web-based Curriculum

Students used the website extensively: 133 (96%) logged on to the site. Overall, the 20-page supplemental curriculum received 1,505 hits, or 11.3 hits per student who logged on (range, 1 to 85). The site was used for over 110 student-hours, or 5.8 hours per page. Three pages each received over 100 hits: Glossary of Practical Epidemiologic Concepts (116 hits, 19 hours total), Diagnostic Tests (113 hits, 10.5 hours total), and Expanded Answers on Therapy Tests (110 hits, 14 hours total). The number of hits and the time spent per page provide some evidence that students were reading, and not just browsing through, the program material. Website use increased during the last week, especially before the final exam.

The written student questionnaire was completed by 98 (71%) students. In this questionnaire, 94% agreed or strongly agreed that the web curricula enhanced their understanding of course content.

Student Satisfaction

Online student survey data are presented as a combination of “agree and strongly agree” responses. Students reported enjoying the course (98%) and felt the material was appropriate for their training level (95%). Students found small groups helpful (97%) and preferred not to have a self-paced, computer tutorial course without small-group sessions (95%). Students felt that their facilitators were enthusiastic (93%) and extremely knowledgeable (92%). According to the written survey, students understood how course material related to clinical practice (100%), and felt the course encouraged application of EBM knowledge (99%).

Facilitator Satisfaction and Opinions

Facilitators reported that they had taught for 15 ± 6 years and supervised students, residents, and fellows 10 ± 3 months per year. Facilitators taught their learners in clinics (7), inpatient units (6), or emergency departments (5). Facilitators had participated in formal EBM courses (77%), taken biostatistics courses (69%), previously taught EBM (69%), conducted critical literature reviews for journals (46%), held MPH degrees (31%), and written about EBM for peer-reviewed journals (23%).

Facilitators felt that students performed well in 8 specific EBM-related areas, with mean item scores near or above 4 on a 5-point scale (Table 3). Further, facilitators felt that students could use EBM concepts as well as or better than residents whom they had supervised in the past year. Overall, facilitators felt that the course material was appropriate for first-year medical students (4.5 ± 0.5) and that EBM should first be taught during the preclinical years (4.5 ± 1.0). Three facilitators felt that they had not directly assessed residents' EBM utilization and therefore did not compare residents to students. Most facilitators reported that discussing articles about diagnostic tests was their greatest teaching challenge, particularly teaching pre- and post-test probability and likelihood ratios. All facilitators reported that they enjoyed teaching the course and would teach it again.

Table 3.

Facilitators' Assessment of First-Year Medical Students' Abilities

For each content area, the first value is facilitators' perception of first-year medical students (“…Overall, my first-year students…”; 15 respondents)* and the second is facilitators' perception of students' ability to use EBM skills compared with previously supervised residents (12 respondents),§ each reported as mean ± SD.
Understood EBM principles 4.3 ± 0.5 3.4 ± 0.7
Are interested in learning EBM 4.5 ± 0.6 3.9 ± 0.9
Can structure clinical questions 4.5 ± 0.7 3.8 ± 0.7
Can apply EBM to evaluate an article about therapy 3.8 ± 0.8 3.3 ± 0.9
Can apply EBM to evaluate an article about diagnosis 3.7 ± 0.8 3.1 ± 1.0
Can apply EBM to evaluate an article about causation/harm 4.0 ± 0.5 3.3 ± 1.0
Can engage in discussions about articles 4.1 ± 0.8 3.6 ± 1.2
Can self-assess areas of EBM knowledge deficits 3.7 ± 0.8 3.5 ± 0.7
* Scale: 1 = strongly disagree; 3 = average; 5 = strongly agree. The number of respondents per item may have varied, since 3 faculty did not attend some small-group sessions and therefore did not complete the corresponding sections.

§ Scale: 1 = worse than residents; 3 = same as residents; 5 = better than residents. Three faculty did not feel that they had sufficient opportunities to gauge resident EBM abilities.

DISCUSSION

The abilities to analyze data critically and to evaluate study methods and outcomes are essential cognitive skills that physicians use in caring for their patients. To our knowledge, this is the first report demonstrating that most EBM principles can be introduced to preclinical medical students in a short first-year course. Longer interventions have been reported during clinical rotations, and 1 report has described a preclinical search-strategy curriculum that extends into the clinical years.10 Introducing EBM principles earlier in the medical school curriculum may encourage students to think more critically about the therapeutic and diagnostic decisions they make upon entering their third- and fourth-year rotations,24,25 and may motivate students to learn more content during concurrent preclinical courses. The short length of our course provides one template by which EBM skills can be introduced into an already busy preclinical curriculum.

This EBM course demonstrated reasonable measures of success. Overall, students and facilitators felt that the curriculum was well conceived and well implemented. Both experienced educators and students felt that the material was appropriate for this level of learner, contrary to the conventional wisdom that EBM should be taught primarily in the clinical years. Final examination results indicated mastery of course concepts and their application to clinical problems. Facilitators felt that students performed at the same level as, or better than, residents whom they had supervised over the past year. This suggests that the practice of EBM depends less on learners' medical knowledge and more on the ability to apply methodology to clinical studies. The website was also successful, providing interactive, independent study in a user-friendly manner. Students collectively spent well over 100 hours on the website and indicated a learning preference for combined small-group and web-based instruction. We speculate that this supplemental curriculum was a valuable factor in the course's success. Such portable websites can also facilitate sharing interactive material among institutions.

Although our EBM course is a good introductory first step, EBM principles, like most cognitive skills, need to be reinforced through application and integrated repetition. After introduction, educators have multiple opportunities to reinforce EBM techniques. For instance, during the preclinical years, EBM examples can be included when discussing the sensitivity and specificity of examination findings during physical examination classes, the effectiveness of medications during pharmacology, or causal relationships during physiology and pathophysiology. One study reported including search strategies and basic evidence assessment in the first 2 years of medical school, then using these strategies to identify unknown products in a microbiology class10 before expanding those EBM skills during clinical rotations. Other reports have focused on EBM skills of students and residents in predominantly clinical venues, such as morning report,26 obstetric rounds,8 journal clubs,27 and ambulatory care blocks.28 Reinforcing introductory and advanced EBM principles through these methods and through patient presentations and, most importantly, role-modeling can powerfully assist our students in becoming more critical clinicians.

Several limitations of our study should be acknowledged. Most notably, our assessment of EBM skills acquisition was performed immediately after the course, so long-term outcomes were not measured. Assessing students' long-term retention of EBM principles and their application of those principles during clinical rotations would be a desirable next step. Second, we used an uncontrolled evaluation design. Because we had no authority over the biostatistics/epidemiology courses at the 8 other regional centers associated with our medical school, we had no suitable comparison group with sufficient numbers or an appropriate alternative intervention. Third, facilitators' comparison of students with residents was limited by variation in their experiences interacting with residents and by variation in residents' training in EBM. Fourth, online student survey results may have been affected by volunteer bias. Fifth, final exam scores were quite high, probably because of the open-book format; however, we feel that the test reflects the level of difficulty encountered by practicing clinicians and simulates normally available clinician resources. Sixth, our course had sufficient numbers of skilled educators with significant EBM experience, and the availability of these resources might limit application of our small-group method elsewhere. Finally, our assessment does not allow us to determine the relative effectiveness of different curricular components. Future work should focus on minimizing these limitations and, in particular, on tracking the EBM skills of trainees and controls longitudinally through 4 years of medical school and beyond.

Despite these limitations, the test performance and satisfaction of our students, their high use of our web-based curricular component, and the positive evaluations provided by our experienced facilitators provide encouraging evidence that introduction of clinically relevant EBM material early in the career of medical students is practical, feasible, and desirable.

Acknowledgments

We would like to acknowledge the exemplary efforts of Drs. Carey Chisholm, William Cordell, James Corrall, Brent Furbee, James H. Jones, James B. Jones, Mark Kirk, Barbara Mahon, Christi Monts, Jean Molleston, Joe Phillips, Kevin Rodgers, and Steven Steiner in helping to facilitate this course.

This project was funded in part by the Department of Pediatrics and the Office of Medical Student and Curricular Affairs at Indiana University. Dr. Srinivasan was supported by a National Research Service Award fellowship at the Regenstrief Institute for Health Care.

REFERENCES

1. Green M. Graduate medical education training in clinical epidemiology, critical literature appraisal, and evidence based medicine: a critical review of curricula. Acad Med. 1999;74:686–94. doi: 10.1097/00001888-199906000-00017.
2. ACGME Outcomes Project. General Competencies. Chicago, Ill: Accreditation Council for Graduate Medical Education; 1996.
3. Michaud GC, McGowan JL, van der Jagt RH, Dugan AK, Tugwell P. The introduction of evidence-based medicine as a component of daily practice. Bull Med Libr Assoc. 1996;84:478–81.
4. AAMC. Physicians for the twenty-first century: report of the project panel on the general professional education of the physician and college preparation for medicine. J Med Educ. 1984;59:127–8, 155–67.
5. Medical School Objectives Project Writing Group. Learning objectives for medical student education—guidelines for medical schools: report I of the medical school objectives project. Acad Med. 1999;74:13–8. doi: 10.1097/00001888-199901000-00010.
6. Medical Informatics Advisory Panel. Medical School Objectives Project: Medical Informatics Objectives. Washington, D.C.: Association of American Medical Colleges; 1999. pp. 1–11.
7. Wadland WC, Barry HC, Farquhar L, Holzman C, White A. Training medical students in evidence-based medicine: a community campus approach. Fam Med. 1999;31:703–8.
8. Grimes D, Bachica J, Learman L. Teaching critical appraisal to medical students in obstetrics and gynecology. Obstet Gynecol. 1998;92:877–82. doi: 10.1016/s0029-7844(98)00276-2.
9. Bennett KJ, Sackett DL, Haynes RB, Neufeld VR, Tugwell P, Roberts R. A controlled trial of teaching critical appraisal of the clinical literature to medical students. JAMA. 1987;257:2451–4.
10. Barnett SH, Kaiser S, Morgan LK, et al. An integrated program for evidence-based medicine in medical school. Mt Sinai J Med. 2000;67:163–8.
11. Sackett DL, Straus S, Richardson S, Rosenberg W, Haynes RB. Evidence Based Medicine: How to Practice and Teach EBM. New York, NY: Churchill Livingstone; 1997. pp. 185–206.
12. Medical Schools in the United States, Appendix 1A, Table 2. U.S. medical school enrollments for academic year 1998–1999. JAMA. 1999;282:888–91.
13. United States Medical Licensing Examination. Step 1: Content Description and Sample Test Materials. Philadelphia, Pa: Federation of State Medical Boards, Inc., and National Board of Medical Examiners; 2000. p. 13.
14. The Indiana Initiative. Physicians for the 21st Century. Indianapolis, Ind: Indiana University School of Medicine; 1996. pp. 105–6, 118–21.
15. Guyatt GH, Sackett DL, Cook DJ. Users' guides to the medical literature. II. How to use an article about therapy or prevention. A. Are the results of the study valid? Evidence-Based Medicine Working Group. JAMA. 1993;270:2598–601. doi: 10.1001/jama.270.21.2598.
16. Guyatt GH, Sackett DL, Cook DJ. Users' guides to the medical literature. II. How to use an article about therapy or prevention. B. What are the results and will they help me in caring for my patients? Evidence-Based Medicine Working Group. JAMA. 1994;271:59–63. doi: 10.1001/jama.271.1.59.
17. Jaeschke R, Guyatt GH, Sackett DL. Users' guides to the medical literature. III. How to use an article about a diagnostic test. A. Are the results of the study valid? Evidence-Based Medicine Working Group. JAMA. 1994;271:389–91. doi: 10.1001/jama.271.5.389.
18. Jaeschke R, Guyatt GH, Sackett DL. Users' guides to the medical literature. III. How to use an article about a diagnostic test. B. What are the results and will they help me in caring for my patient? Evidence-Based Medicine Working Group. JAMA. 1994;271:703–7. doi: 10.1001/jama.271.9.703.
19. Levine M, Walter S, Lee H, et al. Users' guides to the medical literature. IV. How to use an article about harm. Evidence-Based Medicine Working Group. JAMA. 1994;271:1615–9. doi: 10.1001/jama.271.20.1615.
20. Covell D, Uman G, Manning P. Information needs in office practice: are they being met? Ann Intern Med. 1985;103:596–9. doi: 10.7326/0003-4819-103-4-596.
21. Rendell MS, Rajfer J, Wicker PA, Smith MD. Sildenafil for treatment of erectile dysfunction in men with diabetes: a randomized controlled trial. JAMA. 1999;281:421–6. doi: 10.1001/jama.281.5.421.
22. Borczuk P. Predictors of intracranial injury in patients with mild head trauma. Ann Emerg Med. 1995;25:731–6. doi: 10.1016/s0196-0644(95)70199-0.
23. Cheasty M, Clare AW, Collins C. Relation between sexual abuse in childhood and adult depression: case-control study. BMJ. 1998;316:198–201. doi: 10.1136/bmj.316.7126.198.
24. Bordley DR, Fagan M, Theige D. Evidence-based medicine: a powerful educational tool for clerkship education. Am J Med. 1997;102:427–32. doi: 10.1016/S0002-9343(97)00158-7.
25. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence based medicine) skills: a critical appraisal. Can Med Assoc J. 1998;158:177–81.
26. Reilly B, Lemon M. Evidence-based morning report: a popular new format in a large teaching hospital. Am J Med. 1997;103:419–26. doi: 10.1016/s0002-9343(97)00173-3.
27. Bazarian JJ, Davis CO, Spillane LL, Blumstein H, Schneider SM. Teaching emergency medicine residents evidence-based critical appraisal skills: a controlled trial. Ann Emerg Med. 1999;34:148–54. doi: 10.1016/s0196-0644(99)70222-2.
28. Green ML, Ellis PJ. Impact of an evidence-based medicine curriculum based on adult learning theory. J Gen Intern Med. 1997;12:742–50. doi: 10.1046/j.1525-1497.1997.07159.x.
