BMJ. 2004 Oct 30;329(7473):1029–1032. doi: 10.1136/bmj.329.7473.1029

Evaluating the teaching of evidence based medicine: conceptual framework

Sharon E Straus 1, Michael L Green 2, Douglas S Bell 3, Robert Badgett 4, Dave Davis 5, Martha Gerrity 6, Eduardo Ortiz 7, Terrence M Shaneyfelt 8, Chad Whelan 9, Rajesh Mangrulkar 10; the Society of General Internal Medicine Evidence-Based Medicine Task Force
PMCID: PMC524561  PMID: 15514352

Short abstract

Although evidence for the effectiveness of evidence based medicine has accumulated, there is still little evidence about which methods of teaching it are most effective.


Interest in evidence based medicine (EBM) has grown exponentially, and professional organisations and training programmes have shifted their agenda from whether to teach EBM to how to teach it. However, there is little evidence about the effectiveness of different teaching methods,1 and this may be related to the lack of a conceptual framework within which to structure evaluation strategies. In this article we propose such a framework for evaluating methods of teaching EBM. Showing that these methods are effective relies on both psychometrically strong measurement instruments and methodologically rigorous, appropriate study designs; our framework addresses the former.

This effort was initiated by the Society of General Internal Medicine Evidence-Based Medicine Task Force.2 In an attempt to tackle the challenges in designing and evaluating a series of teaching workshops on EBM for busy practising clinicians, the task force created a conceptual framework for evaluating teaching methods. This was done by a working group of clinicians interested in the subject. They completed a literature review of instruments used for evaluating teaching of EBM (manuscript in preparation), and two members of the task force used the information to draft a conceptual framework. This framework and relevant background materials were discussed and revised at a consensus conference including 10 physicians interested in EBM, evaluation of education methods, or programme development. We then sent a revised framework to all members of the task force and six other international colleagues interested in the subject. We incorporated their suggestions into the framework presented in this article.

When formulating clinical questions, advocates of EBM suggest using the “PICO” approach—defining the patient, intervention, comparison intervention, and outcome.3 We used this approach to provide a framework for the evaluation matrix, specifically:

  • Who is the learner?

  • What is the intervention?

  • What is the outcome?

The answers to these three questions form the structure of our conceptual model.

Who is the learner?

Learners can be doctors, patients, policy makers, or managers. This article focuses on doctors, but our evaluation framework could be applied to other audiences.


Not all doctors want or need to learn how to practise all five steps of EBM (asking, acquiring, appraising, applying, assessing).4,5 Indeed, most doctors consider themselves users of EBM, and surveys of clinicians show that only about 5% believe that learning all five steps is the most appropriate way of moving from opinion based to evidence based medicine.4

Doctors can incorporate evidence into their practice in three ways.3,6 In a clinical situation, the extent to which each step of EBM is performed depends on the nature of the encountered condition, time constraints, and level of expertise with each of the steps. For frequently encountered conditions (such as unstable angina) and with minimal time constraints, we operate in the “doing” mode, in which at least the first four steps are completed. For less common conditions (such as aspirin overdose) or for more rushed clinical situations, we eliminate the critical appraisal step and operate in the “using” mode, conserving our time by restricting our search to rigorously preappraised resources (such as Clinical Evidence). Finally, in the “replicating” mode we trust and directly follow the recommendations of respected EBM leaders (abandoning at least the search for evidence and its detailed appraisal). Doctors may practise in any of these modes at various times, but their activity will probably fall predominantly into one category.

The various methods of teaching EBM must therefore address the needs of these different learners. One size cannot fit all. Similarly, if a formal evaluation of the educational activity is required, the evaluation method should reflect the different learners' goals. Although several questionnaires have been shown to be useful in assessing the knowledge and skills needed for EBM,7,8 we must remember that the learners whose knowledge and skills these tools assess may not be similar to our own. The careful identification of our learners (their needs and learning styles) forms the first dimension of the evaluation framework that we are proposing.

What is the intervention?

The five steps of practising EBM form the second dimension of our evaluation framework. But what are the appropriate dose and formulation? If our learners are interested in practising in the “using” mode, our teaching should focus on formulating questions, searching for evidence already appraised, and applying that evidence; evaluation of the effectiveness of the teaching should assess only these steps. In contrast, doctors interested in practising in the “doing” mode would receive training in all five steps of practising EBM, and the evaluation of the training should reflect this.

Published evaluation studies of teaching EBM show the diversity of existing teaching methods. Some studies evaluated teaching EBM as an approach to clinical practice, whereas others evaluated training in a single skill of EBM, such as searching Medline9 or critical appraisal.10 Indeed, one review of 18 reports of graduate medical education in EBM found that the courses most commonly focused on critical appraisal skills, in many cases to the exclusion of other necessary skills.11 Some studies looked at 90 minute workshops, whereas others included courses held over several weeks to months, thereby increasing the “dose” of teaching. Evaluation instruments should be tailored to the dose and delivery method, thereby assessing outcomes and behaviours that are congruent with the intended objectives.

What are the outcomes?

Effective teaching of EBM will produce a wide range of outcomes. Various levels of educational outcomes could be considered, including attitudes, knowledge, skills, behaviours, and clinical outcomes. The outcome level (the third dimension of the conceptual framework) reflects Miller's pyramid for evaluating clinical competence12 and builds on the competency grid for evidence based health care proposed by Greenhalgh.13 Changes in doctors' knowledge and skills are relatively easy to detect, and several instruments have been evaluated for this purpose.7,8 However, many of these instruments primarily evaluate critical appraisal skills, focusing on the role of “doer” rather than “user.” A Cochrane review of teaching critical appraisal found only one study that met the authors' inclusion criteria; that study showed that the course increased knowledge of critical appraisal.10 With our proposed framework, evaluation of this teaching course falls into the learner domain of “doing,” the intervention domain of “appraisal,” and the outcome domain of “knowledge.”

Changes in behaviours and clinical outcomes are more difficult to measure because they require assessment in the practice setting. For example, in a study evaluating a family medicine training programme, doctor-patient interactions were videotaped and analysed for EBM content.14 A recent before and after study has shown that a multi-component intervention including teaching EBM skills and providing electronic resources to consultants and house officers significantly improved their evidence based practice (Straus SE et al, unpublished data). With our proposed framework, evaluation of this latter teaching intervention would be categorised into the learner domain of “doing.” The intervention domains include all five steps of EBM, and the outcome domain would be “doctor behaviour.”

Implementing the evaluation framework

The EBM task force developed teaching workshops for practising doctors that focused on formulating questions and on searching for and applying preappraised evidence. Because these workshops were unlike traditional ones, which focus on all five steps of practising EBM,15 we concluded that their evaluation must also be different. We created an evaluation instrument to detect an effect on learners' EBM knowledge, attitudes, and skills.

When we applied the evaluation framework to our evaluation instrument we found that our learners' goals were different from what we were assessing (table 1). We found that we placed greater emphasis on the skills necessary for practising in the “doing” mode than those required in the “using” mode, whereas the intervention was targeted to improve “user” behaviour. Moreover, the assessment mirrored traditional evaluation methods, focusing on appraisal skills, with little attention paid to question formulation. Finally, we saw that our evaluation predominantly measured skills rather than behaviour. This reflection led us to redesign our evaluation instrument to more closely reflect the learning objectives. We also attempted to show how the evaluation framework could be used—how to move from a concept to actual use (table 2).

Table 1.

Application of evaluation framework to SGIM EBM Task Force evaluation tool

Intervention* (column headings): Ask, Acquire, Appraise, Apply, Assess

Outcome | Learner | Ask | Acquire | Appraise | Apply | Assess
Attitudes | Replicator | 1 |  |  |  |
Attitudes | User | 1 |  |  |  |
Attitudes | Doer | 1 |  |  |  |
Knowledge | Replicator |  |  |  |  |
Knowledge | User |  | 10, 12 |  |  |
Knowledge | Doer |  | 10, 12 |  |  |
Skills | Replicator | 3, 5 | 5 | 5 | 5 |
Skills | User | 3, 5 | 5 | 5 | 5, 16ii |
Skills | Doer | 3, 5 | 5 | 5, 8, 9, 11, 15, 16i | 5, 13, 14, 16ii |
Behaviour | Replicator | 3 | 1, 2 |  |  |
Behaviour | User | 3 | 1, 2 |  |  |
Behaviour | Doer | 3 | 1, 2 |  |  |
Clinical outcomes | Replicator |  |  |  |  |
Clinical outcomes | User |  |  |  |  |
Clinical outcomes | Doer |  |  |  |  |

SGIM EBM Task Force=Society of General Internal Medicine Evidence-Based Medicine Task Force.

*Numbers refer to questions on the evaluation tool (see sample questions from evaluation tool on bmj.com).

Table 2.

Application of the conceptual framework for formulating clinical questions

Attitudes
  • Replicator: Recognise the importance of identifying knowledge gaps; recognise that converting the gap into a focused clinical question is important; be open to new knowledge and to seeking new knowledge
  • User: Replicator objectives, and recognise that multiple knowledge deficits commonly exist in clinical situations
  • Doer: User objectives

Knowledge
  • Replicator: List and understand crucial, relevant components of a focused clinical question
  • User: List and understand all relevant components of a focused clinical question
  • Doer: User objectives

Skill
  • Replicator: Construct a focused clinical question that contains relevant components
  • User: Be able to ask a focused clinical question containing all relevant components for each type of clinical question that arises; be able to select the appropriate question(s) to pursue from the list based on importance to user's and patient's needs
  • Doer: User objectives

Behaviour
  • Replicator: Occasionally ask appropriate colleagues focused clinical questions containing relevant components
  • User: Frequently use appropriate, focused clinical questions relevant to clinical patients in order to seek new knowledge about the care of these patients
  • Doer: User objectives, and often record the focused clinical questions that arise and those questions that have been answered

Clinical outcomes
  • Replicator: Use clinical questions to identify gaps in practice and to change practice accordingly
  • User: Replicator objectives
  • Doer: Replicator objectives

Limitations of this framework

Our model requires that teachers work with learners to understand their goals, to identify in what mode of practice they want to enhance their expertise, and to determine their preferred learning style. This simple model could be expanded to include other dimensions, including the role of the teacher and the “dose” and “formulation” of what is taught. However, our primary goal was to develop a matrix that was easy to use. Although we have applied this framework to several of the published evaluation instruments and have found it to be useful, others may find that it does not meet all of their requirements.

What's next?

While EBM teachers struggle with developing innovative course materials and evaluation tools, we propose a coordinated sharing of these materials in order to minimise duplication of effort. Using the proposed framework as a categorisation scheme, the task force is establishing an online clearinghouse to serve as a repository for evaluations of methods of teaching EBM including details on their measurement properties.2 Teachers will be able to identify evaluation tools that might be useful in their own setting, using the framework to target their needs.

There is still little evidence about the effectiveness of different teaching methods,1 and attempting to evaluate such teaching is challenging given the complexity of the learners, the interventions, and the outcomes. One way to help meet these challenges is to develop a collaborative research network to conduct multicentre, randomised trials of educational interventions. We invite interested colleagues to join us in developing this initiative and to create the clearinghouse for evaluation tools (www.sgim.org/ebm.cfm).

Summary points

There is little evidence about the effectiveness of different methods of teaching evidence based medicine

Doctors can practise evidence based medicine in one of three modes—as a doer, a user, or a replicator

Instruments for evaluating different methods of teaching evidence based medicine must reflect the different learners (their learning styles and needs), interventions (including the dose and formulation), and outcomes that can be assessed

Our framework provides only one way to conceptualise the evaluation of teaching EBM; many others could be offered. We hope that our model serves as an initial step towards discussion and that others will offer their suggestions so that we may work together towards improved understanding of the evaluation process and promote more rigorous research on the evaluation of teaching EBM.

Supplementary Material

Question samples
bmj_329_7473_1029__.html (17.2KB, html)

Sample questions from the task force's summative evaluation tool appear on bmj.com

The members of the SGIM EBM Task Force included: Rob Golub, Northwestern University, Chicago, IL; Michael Green, Yale University, New Haven, CT; Robert Hayward, University of Alberta, Edmonton, AB; Rajesh Mangrulkar, University of Michigan, Ann Arbor, MI; Victor Montori, Mayo Clinic, Rochester, MN; Eduardo Ortiz, DC VA Health Centre, Washington, DC; Linda Pinsky, University of Washington, Seattle, WA; W Scott Richardson, Wright State University, Dayton OH; Sharon E Straus, University of Toronto, Toronto, ON. We thank Paul Glasziou for comments on earlier drafts of this article.

Funding: SES is funded by a Career Scientist Award from the Ministry of Health and Long-term Care and by the Knowledge Translation Program, University of Toronto. DSB is funded in part by the Robert Wood Johnson Foundation Generalist Physician Faculty Scholars Program.

Competing interests: None declared.

References

1. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA 2002;288:1110-2.
2. Society of General Internal Medicine. Evidence based medicine. www.sgim.org/ebm.cfm (accessed 1 Oct 2004).
3. Sackett DL, Straus SE, Richardson WS, Rosenberg WMC, Haynes RB. Evidence-based medicine: how to practice and teach EBM. London: Churchill Livingstone, 2000.
4. McColl A, Smith H, White P, Field J. General practitioners' perceptions of the route to evidence-based medicine: a questionnaire survey. BMJ 1998;316:361-5.
5. McAlister FA, Graham I, Karr GW, Laupacis A. Evidence-based medicine and the practicing clinician: a survey of Canadian general internists. J Gen Intern Med 1999;14:236-42.
6. Straus SE, McAlister FA. Evidence-based medicine: a commentary on common criticisms. CMAJ 2000;163:837-41.
7. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H, Kunz R. Do short courses in evidence based medicine improve knowledge and skills? Validation of Berlin questionnaire and before and after study of courses in evidence based medicine. BMJ 2002;325:1338-41.
8. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence based medicine. BMJ 2003;326:319-21.
9. Rosenberg WM, Deeks J, Lusher A, Snowball R, Dooley G, Sackett D. Improving searching skills and evidence retrieval. J R Coll Physicians Lond 1998;32:557-63.
10. Parkes J, Hyde C, Deeks J, Milne R. Teaching critical appraisal skills in health care settings. Cochrane Database Syst Rev 2001;(3):CD001270.
11. Green ML. Graduate medical education training in clinical epidemiology, critical appraisal and evidence-based medicine: a critical review of curricula. Acad Med 1999;74:686-94.
12. Miller GE. The assessment of clinical skills/competency/performance. Acad Med 1990;65(9 suppl):S63-7.
13. Greenhalgh T, Macfarlane F. Towards a competency grid for evidence-based practice. J Eval Clin Pract 1997;3:161-5.
14. Ross R, Verdieck A. Introducing an evidence-based medicine curriculum into a family practice residency—is it effective? Acad Med 2003;78:412-7.
15. Kunz R, Fritsche L, Neumayer HH. Development of quality assurance criteria for continuing education in evidence-based medicine. Z Arztl Fortbild Qualitatssich 2001;95:371-5.
