Editorial

BMJ. 2002 Apr 20;324(7343):929–930. doi: 10.1136/bmj.324.7343.929

Doctors' knowledge about evidence based medicine terminology

General practitioners may not know the jargon, but could use the knowledge

James D Woodcock 1, Sarah Greenley 1, Stuart Barton 1
PMCID: PMC1122890  PMID: 11964325

In a report published in this issue (p 950), Australian general practitioners rated themselves and were then tested on their evidence based medicine skills.1 The results are not encouraging. Fifty general practitioners rated their understanding of seven common terms from evidence based medicine on a scale from “It would not be helpful for me to understand this term” to “I understand this and could explain it to others.” On average, only 22% said they understood each term and could explain it to others. Worse still, in the subsequent structured interview only one general practitioner could provide a fully satisfactory explanation of any of the terms, and many of the explanations revealed considerable misunderstanding. The authors of the study argue that general practitioners need to understand these terms to practise evidence based medicine and that there is little good research on how this can be achieved. For those working in evidence based medicine these results make depressing reading.

There are some problems in interpreting this study. The authors attempt to validate self rating of evidence based medicine skills, but what they actually test is knowledge. The authors recognise that people who cannot demonstrate knowledge in a potentially intimidating academic environment may be more successful at using that knowledge in real life. The ability to explain a term may not be the kind of knowledge required of general practitioners. The criteria for fully understanding each term were also quite challenging. It is possible to explain a term without meeting all the stated criteria (for example, the criteria for number needed to treat include mentioning that it is the reciprocal of the absolute risk reduction). Unfortunately, even if the results from those general practitioners whose answers were partially correct are combined with the fully correct answers, we are still left with poor results. Only four of 74 claims of understanding were confirmed or partially confirmed in the subsequent interview. Many participants, whilst claiming understanding, refused to explain the terms, which may or may not indicate ignorance. We do not know how the sample was selected or how many general practitioners had had formal training in critical appraisal. Can we apply the results from this group of Australian general practitioners to clinicians around the world? Australia is seen as a centre for evidence based medicine, and one would not expect much better results elsewhere.2
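The reciprocal relation mentioned in the parenthesis above lends itself to a short worked example. The event rates below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Number needed to treat (NNT) as the reciprocal of the absolute risk
# reduction (ARR). The event rates are invented for illustration and
# are not drawn from the study discussed above.

control_event_rate = 0.20    # 20% of untreated patients have the event
treatment_event_rate = 0.15  # 15% of treated patients have the event

arr = control_event_rate - treatment_event_rate  # absolute risk reduction
nnt = 1 / arr  # patients to treat to prevent one additional event

print(f"ARR = {arr:.2f}, NNT = {nnt:.0f}")  # ARR = 0.05, NNT = 20
```

At these rates, 20 patients would need to be treated to prevent one additional adverse event.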

The results cast doubt on the use of self assessed knowledge as a proxy for actual skills. This supports earlier research in the United Kingdom that examined knowledge of six evidence based concepts, two of which were used in the Australian study (relative risk and absolute risk).3 Khan et al studied 55 healthcare professionals including some hospital doctors. They found poor correlation between participants' self evaluated knowledge and multiple choice test scores.
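For readers who have not met the two terms tested in both studies, a minimal sketch with hypothetical event rates shows how relative and absolute risk can diverge:

```python
# Hypothetical event rates chosen to show why the two terms can mislead:
# a treatment that halves the relative risk may still yield only a small
# absolute risk reduction when the baseline event rate is low.

control_rate = 0.02    # 2% of untreated patients have the event
treatment_rate = 0.01  # 1% of treated patients have the event

relative_risk = treatment_rate / control_rate            # risk halved
absolute_risk_reduction = control_rate - treatment_rate  # one in a hundred

print(f"relative risk = {relative_risk:.2f}")                      # 0.50
print(f"absolute risk reduction = {absolute_risk_reduction:.2f}")  # 0.01
```

A halved relative risk can thus coexist with an absolute risk reduction of only one percentage point, which is one reason the distinction matters when explaining treatments to patients.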

These studies support the view that evidence based medicine skills are not well developed in general practitioners around the world. They do not tell us about general practitioners' demand for evidence, and they certainly do not undermine the case for providing it. In a survey of English general practitioners most felt that their role lies in applying evidence based conclusions.4 Only a small minority felt that their time was best used learning the skills of evidence based medicine. A recent trend has been to distinguish users of evidence from searchers and appraisers of evidence.5 The assumption is that the skills required of users of evidence are primarily those of relating the evidence to particular patients and of explaining qualitatively the risks and benefits of treatment options, rather than mastery of clinical epidemiology. However, even users of evidence may need to communicate with patients who have done their own searches and who need to weigh conflicting evidence. Although there is disagreement on the best method of teaching critical appraisal, there is evidence that a variety of methods can improve knowledge.6,7 What we do not know is which skills of users of evidence are necessary to improve consultations or patient outcomes.

The challenge for those working in evidence based medicine is to provide summaries of the evidence in a variety of formats that reflect the range of skills of users of evidence, using innovative methods of presentation. These should be arranged hierarchically, so that those with the interest and skills can drill down to find detail. Such transparency is the best safeguard against bias in pre-appraised summaries. More and better training would not go amiss either.

Primary care p 950

Footnotes

  JDW, SG, and SB work on Clinical Evidence, a compendium of evidence that includes results presented using many of the terms studied by Young et al.

References

1. Young JM, Glasziou P, Ward JE. General practitioners' self ratings of skills in evidence based medicine: validation study. BMJ. 2002;324:950–951. doi: 10.1136/bmj.324.7343.950.
2. Wooldridge M. Australia first in world to adopt evidence based medicine. www.health.gov.au/archive/mediare1/1998/MN7798.htm (accessed 6 Nov 01).
3. Khan KS, Awonuga AO, Dwarakanath LS, Taylor R. Assessment in evidence-based medicine workshops: loose connection between perception of knowledge and its objective assessment. Medical Teacher. 2001;23:92–94. doi: 10.1080/01421590150214654.
4. McColl A, Smith H, White P, Field J. General practitioners' perceptions of the route to evidence based medicine: a questionnaire survey. BMJ. 1998;316:361–365. doi: 10.1136/bmj.316.7128.361.
5. Guyatt GH, Meade MO, Jaeschke RZ, Cook DJ, Haynes RB. Practitioners of evidence based care. BMJ. 2000;320:954–955. doi: 10.1136/bmj.320.7240.954.
6. Hyde C, Parkes J, Deeks J, Milne R. Systematic review of effectiveness of teaching critical appraisal. ICRF/NHS Centre for Statistics in Medicine, 2000 (UK National R&D Programme Project reference 12-8). www.bham.ac.uk/arif/sysrevs/teachcritapp.pdf (accessed 6 Nov 01).
7. Smith CA, Ganschow PS, Reilly BM, Evans AT, McNutt RA, Osei A, et al. Teaching residents evidence-based medicine skills: a controlled trial of effectiveness and assessment of durability. J Gen Intern Med. 2000;15:710–715. doi: 10.1046/j.1525-1497.2000.91026.x.
