BMJ. 1999 May 8;318(7193):1265–1267.

Guidelines for evaluating papers on educational interventions

Education Group for Guidelines on Evaluation
PMCID: PMC1115651  PMID: 10231261

Education is an important part of the work of most doctors, and the BMJ is interested in publishing original studies that will be useful to doctors in their educational role. Unfortunately, many of the accounts we receive of educational interventions comprise a thin description of the innovation and an evaluation that says little more than that the students liked the innovation. This is not good enough. The standard of papers evaluating educational interventions should be as high as that of any other original papers we publish.

We recognise, however, that many of the methodologies that are best for evaluating educational innovations are different from the methods with which BMJ readers are familiar—for instance, methods for evaluating new drugs. We thus set up a group of advisers, consisting of people expert in medical education, to produce guidelines that we could use when reviewing original papers that describe educational innovations. This is a first version of those guidelines. The guidelines are intended for authors, editors, reviewers, and readers. We have no doubt that they can be improved, and it might be that we should evolve different guidelines for those different groups.

We are doing three things with these guidelines:

  • Publishing them today and inviting comment. The group that produced the guidelines will revise them in the light of the responses we receive;

  • Sending them to various groups and individuals interested in evaluating educational innovations and asking them to comment;

  • Using them in reviewing papers that describe educational innovations, to see whether the guidelines work in practice.

Summary points

  • Doctors are increasingly involved in education, and they should benefit from being exposed to research in medical education

  • General medical journals have published little educational research, but the BMJ is interested in publishing more

  • The methods used in educational research are often different from those most familiar to readers of general medical journals

  • The journal has therefore worked with education experts to develop guidelines for authors, editors, reviewers, and readers for evaluating studies on educational interventions

  • Two crucial factors in good studies are that the educational rationale behind the intervention is made explicit and that the evaluation is planned in advance

  • The guidelines are being widely circulated and will then be revised before final adoption

Guidelines for authors, editors, reviewers, and readers to evaluate papers on educational interventions

1 Overview

(a) Is the paper right for the BMJ?

Papers on educational interventions are of several types. Retrospective and descriptive studies that describe a change in a programme or module, or the development of a new teaching method or curriculum, may be of interest to those involved in curriculum management, but are less likely to be of general enough interest for the BMJ.

Well conducted studies examining educational innovations have a better chance of publication. These may take various forms: detailed observational studies, properly conducted questionnaire surveys, or randomised controlled studies. In all cases the design and evaluation criteria listed in section (3) below should be applied.

(b) Does it add anything new and valuable?

The BMJ is most interested in publishing studies that are genuinely original in that nobody has ever done anything like this before. There is also a place for studies that confirm and extend previous studies. The BMJ will be interested in the first few studies that confirm previous studies, particularly if they are methodologically superior to previous work and extend it. But the BMJ is not interested in studies that confirm what has been shown several times before—albeit in different countries and settings.

(c) Is it suitable for a general readership?

The BMJ is interested in material that will be useful and understandable to a general audience. Papers that are intended primarily for an audience with a specialist interest in education should be published elsewhere. Authors of educational papers that are submitted to the BMJ should avoid educational jargon; at the very least, jargon should be explained simply and fully in the text—and perhaps in a glossary.

(d) Is it readable?

It is important that the paper is presented in a logical fashion and written in a coherent, readable style. The tables and diagrams should be well presented, useful, and relevant.

2 Theoretical considerations

(a) Are the aims and objectives clearly stated?

The aims and objectives of the study should be clearly stated. The educational rationale, context of the study, and methodology should relate to the aims and objectives. The research techniques used must be appropriate to answer the question(s) posed in the aims, and to achieve the learning objectives.

Summary of guidelines for evaluating papers on educational interventions

Checklist

1 Overview
(a) Is the paper right for the BMJ?
(b) Does it add anything new and valuable?
(c) Is it suitable for a general readership?
(d) Is it readable?

2 Theoretical considerations
(a) Are the aims and objectives clearly stated?
(b) Is the educational rationale explicit?
(c) Is the intervention described in context?

3 Study presentation and design
(a) Is the method described in enough detail?
(b) Does the study allow the questions posed to be answered?
(c) Are the methods used for recruitment described in enough detail?
(d) Was the evaluation method planned in advance and linked to the aims of the study?
(e) Is the evaluation tool described in enough detail?
(f) Are the results meaningful?

4 Discussion
(a) Is it well structured?
(b) Does it discuss the strengths and weaknesses of the study in relation to other studies?
(c) Does it discuss the meaning and implications of the results?
(d) Does it discuss the need for further work?

(b) Is the educational rationale explicit?

The educational rationale behind the innovation should be explicit, and it should be obvious from the paper that the study is founded on the application of theoretical principles. An adequate review of the literature should be given to support the basis of the study.

(c) Is the intervention described in context?

The context of educational change is important and may have a direct bearing on the implementation of change. The paper should describe the context of the intervention in terms of healthcare delivery systems, political policy, and external drives to encourage change where these have been important. In addition, the study should take into account the study population and the stage of educational development, and it should describe the relevant details of local issues such as the individual course or module, its place within the curriculum, and the physical environment in which the study took place.

3 Study presentation and design

(a) Is the method described in enough detail?

As educational interventions are often specific to the context in which they take place, a large amount of background detail may be needed to familiarise readers with the setting. A balance is needed between giving enough detail to allow scrutiny and reproducibility of the intervention and creating information overload. More detailed information can always be published in the eBMJ.

(b) Does the study allow the questions posed to be answered?

The aims of the intervention should be reflected in the aims of the research and then in the methodology selected.

(c) Are the methods used for recruitment described in enough detail?

The method of recruitment needs justification. If control groups are used, the process of selecting controls should be fully described and rigorous. The usual policy of the BMJ is to publish only trials in which controls are randomly selected unless there is a convincing reason why randomisation is not possible.1 Randomised trials should conform with the CONSORT criteria.2,3 Purposive sampling may be more informative than attempts at randomisation and controls, which are difficult to achieve in adult education.

(d) Was the evaluation method planned in advance and linked to the aims of the study?

Evaluations should not be an afterthought. Every evaluation has a purpose specific to the research question and the context. Researchers planning an intervention should have designed the evaluation at the outset to answer their specific questions.

(e) Is the evaluation tool described in enough detail?

An evaluation may be valid and useful within the context in which it took place without meeting the criteria required for research; generalisability or reproducibility demands a higher level of rigour. Attempts should be made to correlate reported changes in behaviour after an intervention with more objective measures, such as referral rates and investigation patterns.

(f) Are the results meaningful?

Educational interventions are often difficult to analyse because multiple variables are involved and because there may not be only one explanation for the results. The results need to be presented in sufficient detail to be meaningful, and the statistical analysis should be appropriate for the study design.

4 Discussion

(a) Is it well structured?

The BMJ is proposing to introduce structured discussions,4 and these should be used in educational articles as in any others. The discussion should begin with a sentence on the principal finding, followed by a thorough examination of the strengths and weaknesses of the study itself.

(b) Does it discuss the strengths and the weaknesses of the study in relation to other studies?

Strengths and weaknesses should then be discussed in relation to previous studies, with particular emphasis on any differences in results and on why different conclusions have been reached.

(c) Does it discuss the meaning and implications of the results?

Next, the “meaning” of the study in terms of possible mechanisms, and implications for clinicians or policy makers, needs to be explored but should not be overstated.

(d) Does it discuss the need for further work?

Finally, questions that remain unanswered and the further work needed should be discussed, without speculation.

Footnotes

Members of the group are listed at the end of the article

References

1. Altman DG. Randomisation. BMJ. 1991;302:1481–1482. doi: 10.1136/bmj.302.6791.1481.
2. Altman DG. Better reporting of randomised trials: the CONSORT statement. BMJ. 1996;313:570–571. doi: 10.1136/bmj.313.7057.570.
3. Begg C, Cho M, Eastwood S, Horton R, Moher D, Olkin I, et al. Improving the quality of reporting of randomised controlled trials: the CONSORT statement. JAMA. 1996;276:637–639. doi: 10.1001/jama.276.8.637.
4. Docherty M, Smith R. The case for structuring the discussion of scientific papers. BMJ. 1999;318:1224–1225. doi: 10.1136/bmj.318.7193.1224.
