Journal of General Internal Medicine. 2008 Jul 10;23(7):1057–1059. doi: 10.1007/s11606-008-0625-x

Evaluation of a Longitudinal Medical School Evidence-Based Medicine Curriculum: A Pilot Study

Colin P West 1, Furman S McDonald 1
PMCID: PMC2517920  PMID: 18612744

Abstract

Background

Evidence-based medicine (EBM) is increasingly taught in medical schools, but few curricula have been evaluated using validated instruments.

Objective

To evaluate a longitudinal medical school EBM curriculum using a validated instrument.

Design, Participants, Measurements

We evaluated EBM attitudes and knowledge of 32 medical students as they progressed through an EBM curriculum. The first part was an EBM “short course” with didactic and small-group sessions occurring at the end of the second year. The second part integrated EBM assignments with third-year clinical rotations. The validated 15-item Berlin Questionnaire was administered before the course, after the short course, and at the end of the third year.

Results

EBM knowledge scores increased from baseline by 2.8 points at the end of the second-year portion of the course (p = .0001) and by 3.7 points at the end of the third year (p < .0001). Self-rated EBM knowledge increased from baseline by 0.8 and 1.1 points, respectively (p = .0006 and p < .0001). Students rated EBM as highly important for medical education and clinical practice at all time points, with ratings peaking after the short course.

Conclusions

A longitudinal medical school EBM curriculum was associated with increased EBM knowledge. This knowledge increase was sustained throughout the curriculum.

Electronic supplementary material

The online version of this article (doi:10.1007/s11606-008-0625-x) contains supplementary material, which is available to authorized users.

KEY WORDS: medical education, evidence-based medicine, medical school

BACKGROUND

Evidence-based medicine (EBM) has become an important component of medical education for medical students and practicing physicians alike. A variety of curricula have been developed to teach EBM, but most have been seminar series or “short courses”.1–4 Few curricula have extended EBM instruction longitudinally throughout clinical rotations.5,6 In addition, EBM curricula have rarely been evaluated using validated instruments to assess both short-term acquisition and longer-term retention of EBM knowledge.7

In this pilot study, we investigated the potential effectiveness of a new EBM curriculum introduced at the Mayo Medical School in 2006, which combined a short course with longitudinal EBM practice throughout third-year clinical experiences. We evaluated self-reported EBM knowledge and attitudes regarding the importance of EBM for medical education and clinical practice. We also used the validated Berlin Questionnaire8 to assess EBM knowledge over the course of this curriculum.

METHODS

Participants and Curriculum

The Mayo Clinic institutional review board approved this study. The current EBM curriculum at the Mayo Medical School began in 2006 near the end of the second year of medical school with a short course of 22 contact hours for each of the 32 students intending to immediately begin third-year clinical rotations. This short course was adapted from the model developed at McMaster University, Canada.9 Didactic sessions were used to introduce EBM skills following the Users’ Guides to the Medical Literature text for common article types in the medical literature (therapy, harm, diagnosis, prognosis, and systematic reviews).9 Students then worked with the 2 course instructors (CPW and FSM) in small group sessions in which journal articles were critically appraised following the criteria set forth in the text.

The curriculum continued throughout the third year of medical school, integrated with clinical experiences. During each third-year clinical rotation (Internal Medicine, Surgery, Pediatrics, Obstetrics-Gynecology, Neurology, Psychiatry, and Family Medicine), students generated a clinical question, searched for an article addressing that question, critically appraised the article, and produced a brief summary of the evidence and how it applied to the patient from whom the clinical question arose. The course instructors evaluated each assignment and provided substantive feedback on each student’s review. The course was graded on a pass–fail scale. (See the online appendix for more information regarding the curriculum.)
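To make the structure of these per-rotation assignments concrete, the following minimal Python sketch records a single assignment. The class, field names, and placeholder values are illustrative assumptions only; they are not part of the actual course materials.

```python
# Hypothetical record for one per-rotation EBM assignment; the field
# names mirror the four steps described above and are illustrative only.
from dataclasses import dataclass

ROTATIONS = ["Internal Medicine", "Surgery", "Pediatrics",
             "Obstetrics-Gynecology", "Neurology", "Psychiatry",
             "Family Medicine"]

@dataclass
class EBMAssignment:
    rotation: str           # one of the 7 third-year rotations
    clinical_question: str  # question arising from a patient encounter
    article_citation: str   # article found to address the question
    appraisal: str          # critical appraisal of the article
    application: str        # brief summary applying the evidence to the patient

example = EBMAssignment(
    rotation="Pediatrics",
    clinical_question="(placeholder clinical question)",
    article_citation="(placeholder citation)",
    appraisal="(placeholder appraisal notes)",
    application="(placeholder application summary)",
)
assert example.rotation in ROTATIONS
```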

The instructors each had advanced training in biostatistics and epidemiology and had participated in How to Teach Evidence-based Clinical Practice workshops at McMaster University. In addition, both instructors had taught basic and advanced EBM topics to Internal Medicine residents at Mayo.

Evaluation Instrument

Students were asked to report their self-rated EBM knowledge and their assessment of the importance of EBM for medical education and clinical practice on 5-point Likert scales ranging from 1 (very low) to 5 (very high). Students also completed the Berlin Questionnaire, a well-validated, reliable, and objective instrument designed to measure EBM knowledge.8 This instrument consists of 15 multiple-choice questions designed to assess the ability to apply concepts rather than simply reproduce facts, and it covers a wide range of EBM domains. The questions are structured around clinical scenarios and linked to published research literature. Scores may range from 0 to 15, with each question weighted equally. Each student completed testing on the first day of the course, at the completion of the second-year short course, and upon completion of the third year of medical school. The 2 psychometrically equivalent Berlin Questionnaire formats were used in a crossover fashion: on the first day of the course, each student was randomly assigned (by computer-generated randomization) to an initial format, received the alternate format at the second administration, and received the initial format again at the third administration.
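As an illustration of this crossover scheme, a minimal Python sketch follows. The format labels “A” and “B”, the function name, and the use of Python's random module are assumptions for illustration; the study used its own computer-generated randomization procedure.

```python
# Illustrative sketch of the crossover assignment of the 2 Berlin
# Questionnaire formats (labeled "A" and "B" here as an assumption).
import random

def assign_format_schedule(student_ids, seed=None):
    """Randomly pick each student's initial format, then alternate:
    initial format at test 1, the other at test 2, initial again at test 3."""
    rng = random.Random(seed)
    schedule = {}
    for sid in student_ids:
        first = rng.choice(["A", "B"])
        second = "B" if first == "A" else "A"
        schedule[sid] = (first, second, first)
    return schedule

# Example: 32 students, fixed seed for reproducibility.
for sid, formats in assign_format_schedule(range(1, 33), seed=0).items():
    print(sid, formats)
```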

Statistical Analysis

Data were entered and analyzed using SAS Version 9.1 (SAS Institute, Cary, NC). Differences in EBM attitudes and knowledge between each of the 3 time points were tested using Wilcoxon signed rank tests for paired data. Correlations between self-rated EBM knowledge and Berlin score were assessed using Spearman rank correlations. Nonparametric statistics were applied given the small sample size and ordinal nature of the data. The threshold for statistical significance was set at 0.05.
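As a minimal sketch of this style of analysis in Python with SciPy (the study itself used SAS 9.1), the example below runs a paired Wilcoxon signed rank test and a Spearman rank correlation. All data values are hypothetical placeholders, not the study data.

```python
# Illustrative re-analysis sketch; the study used SAS 9.1, and these
# values are hypothetical placeholders, not the actual study data.
import numpy as np
from scipy.stats import wilcoxon, spearmanr

# Paired Berlin scores (0-15) for the same students at 2 time points.
baseline  = np.array([5, 6, 7, 5, 8, 6, 4, 7, 6, 5])
end_year2 = np.array([8, 9, 10, 7, 11, 9, 8, 10, 9, 8])

# Wilcoxon signed rank test for paired ordinal data.
stat, p = wilcoxon(baseline, end_year2)
print(f"Wilcoxon signed rank: W = {stat:.1f}, p = {p:.4f}")

# Spearman rank correlation between self-rated knowledge (1-5)
# and the objective Berlin score at the same time point.
self_rated = np.array([2, 3, 3, 2, 4, 3, 2, 3, 3, 2])
rho, p_rho = spearmanr(self_rated, baseline)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.4f}")
```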

RESULTS

Results from this study are presented in Table 1. All 32 eligible students contributed data at the baseline and end-of-second-year assessments; 2 students missed the end-of-third-year assessment owing to scheduling conflicts. The sample consisted of 17 male students (53%) and 15 female students (47%), with an average age of 25.4 years at the start of the course. Before the course, self-rated EBM knowledge was poor (average score 2.2); it increased to fair upon completion of the short course and of the longitudinal curriculum (average scores 3.0 and 3.3, respectively, both p < .0001). The average Berlin score increased from 6.1 (out of a possible 15) to 8.9 (p = .0001) by the end of the second-year portion of the curriculum, and to 9.8 (p < .0001) by the end of the third year. Median Berlin scores (interquartile range) were 6.0 (5–8), 9.0 (7.5–10), and 10.0 (8–11), respectively.

Table 1.

Evidence-based Medicine Knowledge and Attitude Mean Scores over the Course of the Medical School EBM Curriculum

Variable (possible range)                Baseline (n = 32)  End Year 2 (n = 32)  P*     End Year 3 (n = 30)  P*     P†
Self-rated EBM knowledge (1–5)           2.2                3.0                  <.001  3.3                  <.001  .01
Berlin score (0–15)                      6.1                8.9                  <.001  9.8                  <.001  .04
Importance for medical education (1–5)   3.8                4.4                  .002   3.9                  .56    <.001
Importance for clinical practice (1–5)   4.3                4.6                  .03    4.3                  .63    .02

*Comparison with baseline, Wilcoxon signed rank test
†Comparison with result at end of Year 2, Wilcoxon signed rank test

The perceived importance of EBM for both medical education and clinical practice was generally high before the course (average scores 3.8 and 4.3, respectively, out of a possible 5). These values were essentially stable over the curriculum, with a modest increase at the end of the second year that was no longer evident at the end of the third year.

In addition, at each time point there was a small positive correlation between self-rated EBM knowledge and Berlin score, with Spearman rank correlations ranging from 0.16 to 0.48 (average across the 3 time points, 0.27).

DISCUSSION

We report a medical school EBM curriculum that is effective in improving both perceived and measured EBM knowledge. Key elements of this curriculum include a short course in which students learn important principles of EBM and practice critical appraisal facilitated by experienced instructors, and a longitudinal component in which students gain further experience in applying EBM principles to clinical questions encountered during patient care rotations. Our results suggest that both parts of this curriculum contributed to the improvement and maintenance of EBM knowledge in this group of Mayo medical students. Attitudes regarding the importance of EBM were positive at all stages of the curriculum, with the perceived importance of EBM for medical education peaking at the end of the short course.

Many EBM curricula focus almost exclusively on critical appraisal skills.2,10 Our curriculum also teaches students how to ask clinical questions, search for the best evidence to answer these questions, and apply valid evidence to clinical practice. It is important to note that our curriculum emphasizes the integration of EBM into current clinical experience, as has been stressed in the literature but rarely incorporated into medical school curricula.6,11,12

Limited previous work has found that self-perceived EBM ability correlates poorly with objective assessment of EBM knowledge.13 Our findings in this regard are similar, and the small observed correlations suggest that objective knowledge assessment is crucial in the evaluation of EBM curricula. To this end, the Berlin Questionnaire is 1 of only 2 well-validated, reliable, and objective instruments intended to evaluate the full spectrum of EBM.14 This instrument uses a multiple-choice format, making implementation simple, but it does have some limitations. For example, it does not allow students to demonstrate their real-time ability to perform EBM tasks such as generating a clinical question or searching literature databases.8,14 The Fresno Test15 allows this but is far more cumbersome to grade. Future evaluation of the Mayo curriculum using the Fresno Test is planned.

Our study has additional limitations. First, although students served as their own controls in the pre-post design, the small size of each Mayo Medical School class precluded a concurrent control group. It is therefore possible that factors outside the curriculum were responsible for the observed increases in EBM knowledge, although there is no other specific EBM teaching in the Mayo Medical School curriculum. Second, this study represents the experience of a single medical school class at 1 institution. Replication of these results in subsequent classes is necessary and would ideally involve additional medical schools and instructors, which might also allow the incorporation of control groups to strengthen the validity of these findings.

In summary, a medical school EBM curriculum combining an initial short course and subsequent integration of EBM practice with clinical activities resulted in sustained increases in perceived and measured EBM knowledge. Additional research using alternative EBM knowledge assessment instruments such as the Fresno Test and controlled study designs is needed to confirm the impact of this curriculum.

Acknowledgments

Conflict of Interest: None disclosed.

References

1. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. Can Med Assoc J. 1998;158:177–81.
2. Green ML. Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: a critical review of curricula. Acad Med. 1999;74:686–94.
3. Taylor R, Reeves B, Ewings P, Binns S, Keast J, Mears R. A systematic review of the effectiveness of critical appraisal skills training for clinicians. Med Educ. 2000;34:120–5.
4. Akl EA, Izuchukwu IS, El-Dika S, Fritsche L, Kunz R, Schunemann HJ. Integrating an evidence-based medicine rotation into an internal medicine residency program. Acad Med. 2004;79:897–904.
5. Barnett SH, Kaiser S, Morgan LK, et al. An integrated program for evidence-based medicine in medical school. Mt Sinai J Med. 2000;67:163–8.
6. Del Mar C, Glasziou P, Meyer D. Teaching evidence based medicine. BMJ. 2004;329:989–90.
7. Smith CA, Ganschow PS, Reilly BM, et al. Teaching residents evidence-based medicine skills: a controlled trial of effectiveness and assessment of durability. J Gen Intern Med. 2000;15:710–5.
8. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H-H, Kunz R. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ. 2002;325:1338–41.
9. Guyatt G, Rennie D. Users’ guides to the medical literature. Chicago: AMA Press; 2002.
10. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA. 2002;288:1110–2.
11. Bradt P, Moyer V. How to teach evidence-based medicine. Clin Perinatol. 2003;30:419–33.
12. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329:1017–21.
13. Khan KS, Awonuga AO, Dwarakanath LS, Taylor R. Assessments in evidence-based medicine workshops: loose connection between perception of knowledge and its objective assessment. Med Teach. 2001;23:92–4.
14. Shaneyfelt T, Baum KD, Bell D, Feldstein D, Houston TK, Kaatz S, Whelan C, Green M. Instruments for evaluating education in evidence-based practice: a systematic review. JAMA. 2006;296:1116–27.
15. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno Test of competence in evidence based medicine. BMJ. 2003;326:319–21.
