BMJ. 2003 Jan 18;326(7381):142–145. doi:10.1136/bmj.326.7381.142

Transferability of principles of evidence based medicine to improve educational quality: systematic review and case study of an online course in primary health care

Trisha Greenhalgh a, Peter Toon a, Jill Russell a, Geoff Wong a, Liz Plumb b, Fraser Macfarlane c
PMCID: PMC140008  PMID: 12531848

The success of evidence based medicine has led to pressure to make medical education more evidence based. Greenhalgh and colleagues tested the transferability of these principles when developing a postgraduate course.

Evidence based medicine advocates a structured and systematic approach to clinical decision making using a five point sequence (box 1). The same principles, linked to audit and performance review, have been used extensively in policy making1,2 and quality improvement initiatives3,4 in health care. They have also been advocated as an approach to improving the quality of education in general,5 and medical education in particular,6,7 though others have strongly rejected such approaches.8 We explored the extent to which the five stage evidence based medicine sequence can be applied to developing and implementing quality standards in online education.

Summary points

  • It is widely believed that the education of health professionals should be more evidence based

  • Good randomised controlled trials in education (especially postgraduate education) are hard to find

  • A systematic review of evidence on online education found only one relevant randomised controlled trial

  • Independent qualitative analysis of students' and staff experience on our online course was invaluable when testing the validity and transferability of published research evidence and quality standards

  • Evidence in education should include not only formal, research derived knowledge but also tacit knowledge (informal knowledge, practical wisdom, and shared representations of practice)

Box 1.

Sequence of evidence based medicine

Frame a focused question
Search thoroughly for research derived evidence
Appraise the evidence for its validity and relevance
Seek and incorporate the user's values and preferences
Evaluate effectiveness through planned review against agreed success criteria

Aims

As the developers of an online degree course for health professionals, we aimed to:

  • Evaluate the use of an evidence based medicine framework in an educational development setting

  • Develop robust quality standards for the delivery of an online postgraduate course in primary health care

  • Draw general lessons about the transferability of the principles of evidence based medicine to educational practice.

Research team

We are a multidisciplinary, research oriented academic team comprising four general practitioners (one with a strong interest in information technology), a social scientist, a psychologist, and an educationalist; we work closely with an academic nursing unit. We had previously taught in undergraduate medicine, postgraduate short courses, and work based training, but we were new to the online environment.

Educational context

We established a part time MSc degree in primary health care at University College London in 1999. The course is entirely online except for an initial one week summer school. It caters for a diverse student group of general practitioners, public health physicians, community nurses, pharmacists, and managers drawn from the United Kingdom and mainland Europe, most of whom sign up to the course to achieve goals such as developing new services, establishing local research programmes, or developing and evaluating teaching and training initiatives.

When we embarked on this project in 1997, University College London had a policy of discouraging distance learning because of concerns about quality. Hence, we found ourselves a test case for wider issues concerning the credibility and feasibility of online learning at our institution.

Methods

The study comprised three overlapping phases: secondary research, primary research, and synthesis, as shown in figures 1 and 2.

Figure 1. Methods used in preparation of the quality framework

Figure 2. Methods used to synthesise the quality framework

Secondary research phase

We did a systematic review of the literature on online education. Following the sequence in box 1, we framed focused questions and tried to select research designs, search strategies, and data sources appropriate to each. Although we initially constructed these questions in terms of how the course affected student performance, our final list of questions was as follows:

  • What is a high quality online learning experience for postgraduate students of primary health care?

  • How can we provide that experience consistently and efficiently?

  • How can we reliably demonstrate the quality of our course to internal critics and external evaluators?

  • How can we best support, train, and supervise our staff?

  • How will we know when we are failing?

  • How can we improve our performance year on year?

We applied a formal search strategy to online databases (notably ERIC (Educational Resources Information Centre) and PsycInfo). We also searched books, grey literature (especially dissertations and internal reports), and key journals by hand. We gained additional insights from attending conferences and courses (including online Open University courses), joining academic email lists, and making direct contact with experts in the field. Through these, we encountered many examples of existing online programmes, which we considered as case studies.

We examined literature on the development of audit and quality assurance programmes from industry and the service sector (for example, ISO 9000, Investors in People, Royal College of General Practitioners Quality Practice Award) and identified official guidelines on distance education produced by the Quality Assurance Agency in the United Kingdom and comparable publications from other countries.

We applied standard critical appraisal checklists to guidelines.9 For qualitative research papers (where difficulties in appraisal are well described10), we used a range of checklists7,11–13 to guide in-depth discussion of published studies and prompt contact with authors where necessary. These strategies and sources are described fully on bmj.com (appendix 1).

Primary research phase

An independent researcher (LP) from a separate department did an extensive primary research study of the experiences of students and staff on our course using a range of qualitative methods (see bmj.com for full details). Interviews and focus groups were audiotaped, transcribed in full, and analysed for themes—that is, LP developed a preliminary taxonomy of areas of concern, flagged critical incidents, and suggested explanations for particular behaviours or phenomena. LP periodically presented these themes, together with her impressions from shadowing and observing participants, to staff and students and modified the themes in response to feedback and discussion.

Synthesis phase

We held regular review meetings to consider the emerging results of the secondary and primary research, reframe questions where necessary, and formally reflect on our role as both researchers of, and participants in, the project. Over several such meetings, we developed and refined a first draft of a detailed quality framework for our course. We circulated this draft to our students, external examiners, and around 30 colleagues within and outside the college, some of whom were selected for their critical views of online learning. We presented the second draft at academic meetings and conferences and again invited feedback, but in practice made little subsequent modification.

We tested the transferability of our quality framework to other courses, institutions, and contexts. One of us (FM) modified it at the University of Surrey to provide draft quality standards for placing course materials online and running optional email discussions for students in conventionally taught MSc programmes. We also used a modified version of the framework in the development of a series of CD-ROM based continuing professional development modules for general practitioners (see www.apollobmj.com).

Results of systematic review

We found only one randomised controlled trial examining what works in online education in our subject area: a small trial of the effect of online postgraduate programmes in primary health care.1 The full results of our search are available on bmj.com. Of around 300 primary research articles and 700 reviews and editorials, we rejected around 95% as irrelevant or methodologically poor. Many original research papers had not been peer reviewed (some had been published exclusively on the internet), and most were limited to technical details or superficial case description. The studies of undergraduate medical education were the only ones whose sampling frames, interventions, and outcomes could be meaningfully compared in a summary table, and we have published a systematic review of these studies.14

Of the 15 guidelines for online education, around half were relevant and potentially transferable, but validity was hard to assess. The recommendations from the UK Quality Assurance Agency generally seemed sensible, but the evidence base was not clear and there was little advice on dissemination, implementation, or local adaptation (see appendix 2 on bmj.com for details). Several US guidelines seemed more robust and flexible, but most of these still took an institutional focus and the practical lessons for people developing courses were unclear.

Formulating a quality framework

Combining our diverse secondary and primary sources to produce a clear vision for quality, a succinct set of standards, and a set of measurable success criteria for our own course proved difficult, requiring repeated discussion and revisiting of concepts. Our primary research often provided rich case examples that enabled us to make sense of (or challenge) the published recommendations. Critical incidents proved particularly useful as triggers for action. Examples of all these and the final version of the quality framework are given on bmj.com (appendices 3 and 4).

Despite the plethora of papers and guidelines on online education, we found no simple recipes for developing evidence based quality standards in our educational project. We repeatedly found that reflection on practical experience (rather than, say, the application of critical appraisal checklists) enabled us to test the validity and transferability of published evidence to our course.

Applicability of evidence based medicine

We believe there are four key differences between evidence based education and conventional evidence based medicine. Firstly, many questions relating to clinical practice fall into a simple and logical taxonomy (such as prevalence, prognosis, or therapy). Each type of question has a corresponding preferred research design (survey, cohort study, randomised controlled trial, etc) with accepted criteria for assessing validity (the critical appraisal checklist13). Educational questions have a more complex taxonomy, a less direct link with particular preferred study designs, and no universally accepted criteria for assessing validity.15,16

Secondly, most of the definitive research questions generated for this project were qualitative—that is, they began with exploratory, open ended stems such as how or what. In the early stages of the project, we constructed questions in the format used in evidence based medicine (population, intervention, comparison, and outcome)—for example, “What proportion of students will pass an exam if all the teaching is online compared with the proportion that would pass if taught conventionally?” Implicit in this question is a behaviourist model of learning, in which the students are viewed as a population sample; the online course as an intervention; conventional education as the comparison; and student performance as the outcome. The validity of such assumptions is highly questionable, especially when (as in many postgraduate and continuing professional development courses) the goals of the course are humanistic rather than behaviourist (professional development, motivation, support, confidence). It is even more questionable when (as with most part time adult learners) students vary widely in background, personal goals, life commitments, learning styles, and ongoing circumstances.17,18

Thirdly, we found the online educational literature difficult to access and navigate. This is unlikely to be wholly due to our lack of technical familiarity with the databases, since the user interface for ERIC and PsycInfo is identical to that for Medline. Some key search terms (e-learning, computer mediated communication) have multiple synonyms, and others (quality, performance) have multiple meanings. Given the diverse nature of qualitative research, search filters intended to identify such studies are in reality neither sensitive nor specific.10

Fourthly, and perhaps most importantly, we found that educational development requires practical wisdom and not merely research evidence. Although the theoretical knowledge we gained about online learning from published guidelines often scored well on objective measures of quality, it served to confuse as much as inform us. But the practical knowledge that we gleaned from conferences, academic mailing lists, expert contacts, Open University courses, and our portfolio of case examples from education, health care, and industry was invaluable in converting evidence into action.

Conclusions

Hammersley has accused the evidence based medicine movement of “making false and dangerous promises” for the transferability of its methods to education. In particular, he claims, it does not address how research evidence should be combined with other kinds of evidence in making practical judgments in educational development.8 We agree that the educational community must take care not to climb uncritically on the evidence based medicine bandwagon in the politically fashionable drive towards a focused and scientific approach. We propose an alternative decision making sequence (box 2) that better reflects the reality of evidence based education.

Box 2.

Suggested sequence for evidence based educational development

Frame a detailed question that fully reflects the context and complexity of the course being considered
Search thoroughly for research derived evidence
Appraise the evidence for its validity and relevance
Seek practical know-how through personal contacts and networking
Undertake rigorous, in-depth primary research on the experience of staff and students
Integrate these diverse sources iteratively into a draft development plan
Evaluate effectiveness through planned review against agreed success criteria

In conclusion, the linear and formulaic link between evidence and practice implicit in evidence based medicine proved inadequate for the complexities of educational research. Conceptual models designed for multifaceted problems, which may be more appropriate, include cognitive restructuring theory,19 complexity (non-linearity) theory,20 activity theory (the relation between course developers, contexts, and tools),21 and the sharing of tacit knowledge in informal communities of practice.22


Acknowledgments

Further details of the course described in this paper can be viewed at www.ucl.ac.uk/openlearning/msc/index.html. We thank the following people for help with the development of our quality framework: Gilly Salmon, Robin Mason, Ann Rossiter, Lewis Elton, Pat Cryer, David Perry, Gene Feder, Marcia Rigby, Angela Chesser, Ann Leyland, Will Coppola, and all students on our course. We also thank the referee, Janet Grant, for helpful and constructive comments.

Footnotes

Funding: This project was funded partly via an educational development grant from the University of London External System New Technologies Fund.

Competing interests: None declared.

Further details of the systematic review and the quality framework are given on bmj.com.

References

1. Eriksson C. Learning and knowledge-production for public health: a review of approaches to evidence-based public health. Scand J Public Health 2000;28:298–308.
2. Mulrow CD, Lohr KN. Proof and policy from medical research evidence. J Health Polit Policy Law 2001;26:249–266. doi:10.1215/03616878-26-2-249.
3. Secretary of State for Health. A first class service: quality in the new NHS. London: Stationery Office, 1998.
4. Greenhalgh T, Donald A. Evidence based medicine as a tool for quality improvement. In: Oxford Textbook of Primary Care. Oxford: Oxford University Press, 2002.
5. Cullen J, Hadjivassiliou K, Hamilton E, Kelleher J, Sommerlad E, Stern E. Review of current pedagogic research and practice in the fields of post-compulsory education and lifelong learning. London: Tavistock Institute, 2002.
6. Hesketh EA. A framework for developing excellence as a clinical educator. Med Educ 2001;35:555–564. doi:10.1046/j.1365-2923.2001.00920.x.
7. Morrison JM. Evidence-based education: development of an instrument to critically appraise reports of educational interventions. Med Educ 1999;33:890–893. doi:10.1046/j.1365-2923.1999.00479.x.
8. Hammersley M. Some questions about evidence-based practice in education. Paper presented at the symposium on “Evidence-based practice in education” at the Annual Conference of the British Educational Research Association, University of Leeds, September 13-15, 2001. www.leeds.ac.uk/educol/documents/00001819.htm (accessed 13 December 2002).
9. Cluzeau FA, Littlejohns P, Grimshaw JM, Feder G, Moran SE. Development and application of a generic methodology to assess the quality of clinical guidelines. Int J Qual Health Care 1999;11:21–28. doi:10.1093/intqhc/11.1.21.
10. Popay J, Rogers A, Williams G. Rationale and standards in the systematic review of qualitative literature in health services research. Qual Health Res 1998;8:341–351. doi:10.1177/104973239800800305.
11. Giacomini MK, Cook DJ. Users' guides to the medical literature. XXIII. Qualitative research in health care A. Are the results of the study valid? Evidence-Based Medicine Working Group. JAMA 2000;284:357–362. doi:10.1001/jama.284.3.357.
12. Giacomini MK, Cook DJ. Users' guides to the medical literature. XXIII. Qualitative research in health care B. What are the results and how do they help me care for my patients? Evidence-Based Medicine Working Group. JAMA 2000;284:478–482. doi:10.1001/jama.284.4.478.
13. Critical Appraisal Skills Programme. Critical appraisal checklists for published evidence. www.phru.org.uk/~casp/resources/index.htm (accessed 25 November 2002).
14. Greenhalgh T. Computer assisted learning in undergraduate medical education. BMJ 2001;322:40–44. doi:10.1136/bmj.322.7277.40.
15. Prideaux D. Medical education research: is there virtue in eclecticism? Med Educ 2002;36:502–503. doi:10.1046/j.1365-2923.2002.01247.x.
16. Hammersley M. On “systematic” reviews of research literatures: a “narrative” response to Evans and Benefield. British Educational Research Journal 2001;27:543–554.
17. Brigley S. Continuing medical education: the question of evaluation. Med Educ 1997;31:67–71. doi:10.1111/j.1365-2923.1997.tb00046.x.
18. Newman P, Peile E. Valuing learners' experience and supporting further growth: educational models to help experienced adult learners in medicine. BMJ 2002;325:200–202. doi:10.1136/bmj.325.7357.200.
19. Desforges C. How does experience affect theoretical knowledge for teaching? Learning and Instruction 1995;5:385–400.
20. Fraser SW, Greenhalgh T. Coping with complexity: educating for capability. BMJ 2001;323:799–803. doi:10.1136/bmj.323.7316.799.
21. Engeström Y. Expansive learning at work: towards an activity theoretical reconceptualisation. Journal of Education and Work 2001;14:133–161.
22. Wenger E. Communities of practice: learning, meaning and identity. Cambridge: Cambridge University Press, 1998.

Supplementary Materials

Further details (see bmj.com):

bmj_326_7381_142__2.pdf (92.8KB)
bmj_326_7381_142__3.pdf (150.2KB)
bmj_326_7381_142__4.pdf (318.5KB)
