Bulletin of the World Health Organization. 2015 Dec 2;94(1):58–64. doi: 10.2471/BLT.15.167585

Reporting guidelines for implementation and operational research

Simon Hales a, Ana Lesher-Trevino a, Nathan Ford b, Dermot Maher c, Andrew Ramsay c, Nhan Tran d
PMCID: PMC4709804  PMID: 26769997

Abstract

In public health, implementation research is done to improve access to interventions that have been shown to work but have not reached many of the people who could benefit from them. Researchers identify practical problems facing public health programmes and aim to find solutions that improve health outcomes. In operational research, routinely collected programme data are used to uncover ways of delivering more effective, efficient and equitable health care. As implementation research can address many types of questions, many research designs may be appropriate. Existing reporting guidelines only partially cover the methods used in implementation and operational research, so we ran a consultation through the World Health Organization (WHO), the Alliance for Health Policy and Systems Research (AHPSR) and the Special Programme for Research and Training in Tropical Diseases (TDR) and developed guidelines to facilitate the funding, conduct, review and publishing of such studies. Our intention is to provide a practical reference for funders, researchers, policymakers, implementers, reviewers and editors working with implementation and operational research. This is an evolving field, so we plan to monitor the use of these guidelines and develop future versions as required.

Introduction

Implementation and operational research are growing in importance and recognition. Major donors, including the Canadian Institutes of Health Research, the European & Developing Countries Clinical Trials Partnership, the United States of America’s National Institutes of Health and the Wellcome Trust, are increasing funding support for this research area, and leading scientific journals have established sections promoting the publication of such research. Implementation research contributes a growing part of the evidence base used by the World Health Organization (WHO), which promotes, supports, publishes and evaluates such research.

The combined results of basic, clinical and implementation research have made it possible to reach millions of people with life-saving treatments and effective prevention measures. For example, the global scale-up of antiretroviral drugs for human immunodeficiency virus (HIV) has increased the number of people on treatment from 800 000 in 2003 to 13.6 million,1 and reduced deaths from 2.4 million in 2005 to 1.5 million in 2013.2 As another example, tuberculosis is declining; however, to sustain these gains and reach elimination targets for tuberculosis, new drugs and vaccines are needed, along with the implementation research that shows how these can be delivered where they are needed and in a form that works. Deaths from malaria decreased by 58% between 2000 and 2015,1 in part because malaria control measures, including indoor residual spraying, long-lasting insecticidal nets, point-of-care diagnostic tests and artemisinin-based combination therapies, have been delivered effectively to communities.

Operational research uses an existing resource – the data routinely collected by programmes – to find ways of improving programme operations and thereby delivering more effective, efficient and equitable care. Implementation and operational research are usually carried out in close collaboration between researchers and public health practitioners. Operational research is typically very specific to a single programme or activity. The costs are generally modest, yet these studies have the potential for a huge magnifier effect in extending the impact of health interventions.3
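To make this concrete, the short sketch below shows a minimal before-after comparison of the kind listed under "influence, with adequacy" in Table 1, applied to routinely collected programme data. It is only an illustration: the clinic, the indicator and every figure are hypothetical, and a real analysis would need to account for secular trends and seasonality, for example with an interrupted time series design.

# Minimal sketch of a before-after comparison on routinely collected
# programme data; all names and figures are hypothetical.
from scipy import stats

# Monthly treatment-success proportions reported by a hypothetical clinic,
# six months before and six months after a change to appointment scheduling.
before = [0.71, 0.68, 0.70, 0.72, 0.69, 0.70]
after = [0.74, 0.76, 0.75, 0.78, 0.77, 0.76]

mean_before = sum(before) / len(before)
mean_after = sum(after) / len(after)

# A two-sample t-test gives a rough indication of whether the shift in the
# indicator is larger than month-to-month noise; it deliberately ignores
# trends and seasonality, which a real study would have to address.
t_stat, p_value = stats.ttest_ind(after, before)

print(f"mean before: {mean_before:.3f}, mean after: {mean_after:.3f}")
print(f"difference: {mean_after - mean_before:+.3f} (t = {t_stat:.2f}, p = {p_value:.4f})")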

Implementation research can help answer questions about why effective interventions are not reaching the people who could benefit from them.4,5 It is also useful for understanding how health system failures create barriers to the delivery of policies or programmes. For example, providers’ non-adherence to evidence-based service delivery guidelines may result from a lack of monitoring and feedback mechanisms within the system, or from a time allocation per visit that limits what providers are able to achieve. A broad understanding of such system failures and their relationship to implementation barriers is a key aspect of much implementation research. Resolving a barrier such as non-adherence to treatment guidelines may therefore have less to do with training providers and more to do with changing the system, by allowing more time per visit or establishing better feedback mechanisms.

Because implementation research has such a broad range of applications, many different research methods may be used, depending on the type of problem studied (Table 1). Existing guidelines and their extensions cover some – but not all – of the required reporting areas. These gaps can make it difficult for researchers, implementers and journal editors to ensure that studies are reported in sufficient detail to allow replication. A further difficulty is that the success of implementation, particularly for complex interventions, often depends heavily on context, and the traditional structure of a scientific research article may not provide a good framework for reporting important contextual issues.

Table 1. Research objectives, implementation questions and research methods.4

Objective | Description | Implementation question | Research methods
Explore | Explore an idea or phenomenon to make hypotheses or generalizations from specific examples | What are the possible factors and agents responsible for good implementation of a health intervention? For enhancing or expanding a health intervention? | Qualitative: grounded theory, ethnography, phenomenology, case studies and narrative approaches; key informant interviews, focus groups, historical reviews. Quantitative: network analysis, cross-sectional surveys. Mixed methods: combining qualitative and quantitative methods
Describe | Identify and describe the phenomenon and its correlates or possible causes | What describes the context in which implementation occurs? What describes the main factors influencing implementation in a given context? | Quantitative: cross-sectional (descriptive) surveys, network analysis. Qualitative: grounded theory, ethnography, phenomenology, case studies and narrative approaches; key informant interviews, focus groups, historical reviews. Mixed methods: both qualitative and quantitative inquiry with convergence of data and analyses
Influence | Test whether an intervention produces an expected outcome
With adequacy | With sufficient confidence that the intervention and outcomes are occurring | Is coverage of a health intervention changing among beneficiaries of the intervention? | Before-after or time series in intervention recipients only; participatory action research
With plausibility | With greater confidence that the outcome is due to the intervention | Is a health outcome plausibly due to the implemented intervention rather than other causes? | Concurrent, non-randomized cluster trials (health intervention implemented in some areas and not in others); before-after or cross-sectional study in programme recipients and non-recipients; typical quality improvement studies
With probability | With a high (calculated) probability that the outcome is due to the intervention | Is a health outcome due to implementation of the intervention? | Partially controlled trials: pragmatic and cluster randomized trials (health intervention implemented in some areas and not in others); effectiveness-implementation hybrids
Explain | Develop or expand a theory to explain the relation between concepts, the reasons for the occurrence of events, and how they occurred | How and why does implementation of the intervention lead to effects on health behaviour, services or status in all its variations? | Mixed methods: both qualitative and quantitative inquiry with convergence of data and analyses. Quantitative: repeated measures of context, actors, depth and breadth of implementation across subunits; network identification; can use designs for confirmatory inferences; effectiveness-implementation hybrids. Qualitative: case studies, phenomenological and ethnographic approaches with key informant interviews, focus groups, historical reviews. Participatory action research
Predict | Use prior knowledge or theories to forecast future events | What is the likely course of future implementation? | Quantitative: agent-based modelling; simulation and forecasting modelling; data extrapolation and sensitivity analysis (trend analysis, econometric modelling). Qualitative: scenario-building exercises; Delphi techniques from opinion leaders

Note: Table reproduced from Peters et al.4

For the past 10 years, the Bulletin of the World Health Organization has published many examples of implementation research in a section called Lessons from the Field.6 The papers in this section have described actions taken in response to a wide variety of public health problems in many different settings, but we have found that authors often needed to be prompted to provide sufficient detail on local context, interventions and measures of impact.

Here we describe the development of reporting guidelines for implementation and operational research (Table 2). In future, authors submitting relevant research articles to the Bulletin will be asked to follow these guidelines. The guidelines are the result of collaboration between WHO, the Alliance for Health Policy and Systems Research (AHPSR), the Special Programme for Research and Training in Tropical Diseases (TDR), journal editors, researchers and funders. Our intention is to provide a useful reference for all involved in implementation and operational research, and to revise the guidelines, as required, after a one-year trial period.

Table 2. Reporting guidelines for operational/implementation research.a

Section | Reporting item
Title and abstract | Identify the study as implementation or operational research in the title. Provide a structured summary of study context, rationale, objectives, design, methods, results and conclusions.

Introduction
Background | Explain the scientific background relating to both the intervention and the implementation. What is already known about the issue?
| Describe the policy or programme context. Describe relevant elements of the setting or settings (for example, geography, physical resources, organizational culture, history of change efforts).
| What is it about implementation in this setting that warrants research and reporting of findings?
Problem | Briefly describe the nature and severity of the specific issue or problem that was addressed.
| Specify who (champions/supporters) and what (events/observations) triggered the decision to make changes, why in this location, and why now.
Implementation strategy | Describe the mechanisms or strategies by which components were expected to cause changes, and plans for testing whether these were effective.
Intervention | What evidence-based intervention or innovation is proposed?
Intended outcomes | Describe the specific aim of the proposed study (changes/improvements in processes and outcomes).

Methods
Study design | Identify the study design (for example, observational, quasi-experimental, experimental, qualitative, mixed) chosen for measuring the impact of the intervention on primary and secondary outcomes (if relevant).
Setting | Give exact details of study locations, baseline population characteristics, recruitment of participants, and relevant dates for implementation, follow-up and data collection.
Implementation | Describe the implementation strategy: frequency, duration and intensity, including how and when interventions were actually implemented; additional resources required to support implementation; mode of delivery; and why and when the study ended.
| Describe the intervention (if relevant), in sufficient detail to allow replication of the study. For well-established interventions, it is sufficient to refer to previously published studies.
| Explain the methods used to assure data quality (for example, blinding; repeating measurements and data extraction; training in data collection; collection of sufficient baseline measurements).
Participants | For qualitative studies: what was the approach (e.g. ethnography, grounded theory, narrative) and theory?
| Indicate how the size of the target population was determined.
| Cohort study: give the eligibility criteria, and the sources and methods of selection of participants. Describe methods of follow-up.
| Case-control study: give the eligibility criteria, and the sources and methods of case ascertainment and control selection. Give the rationale for the choice of cases and controls.
| Cross-sectional study: give the eligibility criteria, and the sources and methods of selection of participants.
| For matched studies, give the matching criteria and the number of exposed and unexposed, or the number of controls per case.
| For randomized studies, describe how randomization was done and, for cluster randomized studies, how clusters were defined. Was the study blinded?
Variables | Clearly define all outcomes, exposures, predictors, potential confounders and effect modifiers.
Data sources/measurement | For each variable of interest, give the sources of data and methods of assessment (or measurement). Describe sampling strategies and the comparability of assessment methods if there is more than one group.
| Describe methods for processing data before and during analysis, including translation, transcription, data entry, data management and security, verification of data integrity, data coding and de-identification.
| Explain how variables were handled in the analyses. If applicable, describe which groupings were chosen and why, and how data were coded.
Analyses | State which analyses were pre-specified and which were exploratory.
| For qualitative analyses: describe the process by which inferences or themes were identified and developed, including the researchers involved in data analysis.
| For quantitative analyses: describe statistical methods, including those used to adjust for sampling methods and to control for confounding.
| Where both qualitative and quantitative analyses are used, describe both types of analysis and how the findings were synthesized.
| Describe any methods used to examine subgroups and interactions.
| Explain how missing data were addressed.
| Cohort study: explain how loss to follow-up was addressed.
| Case-control study: describe the matching of cases and controls.
Ethical considerations | Describe consent procedures, if relevant. How was confidentiality ensured?
| How was the balance between the potential risks and benefits of this research to individuals or communities assessed?

Results
Descriptive data | Report the numbers of individuals at each stage of the study – e.g. numbers eligible, included in the study, completing follow-up and analysed. Include a flow diagram, timeline or graph, if relevant.
| Cross-tabulate the number of participants by relevant subgroups – e.g. demographic, clinical and social characteristics, response rates, loss to follow-up or other sources of missing data, and potential confounders – for those who received the intervention and those who did not.
Outcomes | Explain the actual course of the intervention, if relevant. For example, describe the sequence of steps, events or phases, and the type and number of participants at key points, preferably using a timeline diagram or flowchart.
| Document the degree of success in implementation:
| • changes in processes and outcomes associated with the intervention;
| • changes observed in outcomes (for example, population behaviour change, morbidity, mortality, function, patient/staff satisfaction, service utilization, cost, care disparities);
| • benefits, harms, costs, unexpected results, problems and failures.
Outcome data | Report the numbers of outcome events (or summary measures over time), separately for those who received the intervention and those who did not. Include summary statistics and measures of variance (SD or SE).
Main results | Report the main findings (e.g. interpretations, inferences and themes); these might include the development of a theory or model, or integration with prior research or theory.
| Provide unadjusted estimates of the intervention effect and, if applicable, confounder-adjusted estimates and their precision (e.g. 95% confidence interval).
| Consider translating estimates of relative risk into absolute risk for a meaningful time period (a worked sketch follows this table).
| Synthesize the quantitative and qualitative results.
Other analyses | Report other analyses done – e.g. analyses of subgroups and interactions, sensitivity analyses, costs.

Discussion
Key results | Summarize the key results with reference to the study objectives.
Limitations | Discuss the limitations of the study, taking into account possible sources of confounding, bias or imprecision in design, measurement and analysis that might have affected the study outcomes (internal validity).
| Discuss both the direction and the magnitude of any potential bias.
Interpretation | Interpret the results considering the objectives, limitations, multiplicity of analyses, results from similar studies and other relevant evidence.
| Compare and contrast the study results with relevant findings of others, drawing on a broad review of the literature; a summary table may be helpful in building on existing evidence.
| Suggest steps that might be modified to improve future performance.
| Review the opportunity cost and the actual financial cost of the intervention.
Contextual factors | Describe success factors and barriers, and how the barriers were overcome.
Generalizability | Discuss the generalizability (external validity) of the study results.
| Explore factors that could affect generalizability – for example, representativeness of participants, effectiveness of implementation, dose–response effects and features of the local setting.
| Discuss applicability to other settings and potential barriers to scale-up.
Conclusion | Consider the overall practical usefulness of the intervention.
| Suggest implications for the implementation programme. How will the results be used or translated into practice in the context of the study?
| Suggest implications for further studies.
Other information | Indicate whether the study is registered and whether the data are available.
| Give the source of funding and the role of the funders for the present study and, if applicable, for the original study on which the present article is based.
| State the role of individuals in the study and any conflicts of interest.

SD: standard deviation; SE: standard error.

a The reporting items in this table are intended to cover the wide range of study designs for implementation and operational research. As a result, not all of the items are relevant for all studies (for example, some studies will not involve the testing of an implementation strategy).
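As an illustration of the "Main results" items in Table 2, the worked sketch below computes an unadjusted risk ratio with a 95% confidence interval from a two-by-two table and translates it into an absolute risk difference for a meaningful time period. The counts are hypothetical, chosen only to show the arithmetic; confounder-adjusted estimates would require regression methods appropriate to the study design.

import math

# Hypothetical counts: outcome events over one year of follow-up among
# people who received the intervention and people who did not.
events_treated, n_treated = 30, 1000
events_control, n_control = 60, 1000

risk_treated = events_treated / n_treated    # 0.030
risk_control = events_control / n_control    # 0.060

# Unadjusted relative risk, with a 95% confidence interval computed on the
# log scale (standard error of log RR from the two-by-two counts).
relative_risk = risk_treated / risk_control  # 0.50
se_log_rr = math.sqrt(1 / events_treated - 1 / n_treated
                      + 1 / events_control - 1 / n_control)
ci_low = math.exp(math.log(relative_risk) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(relative_risk) + 1.96 * se_log_rr)

# Translate the relative effect into an absolute risk difference, which is
# often more meaningful to programme managers than a ratio.
risk_difference = risk_control - risk_treated  # 0.030

print(f"relative risk: {relative_risk:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
print(f"absolute risk reduction: {risk_difference:.3f}, i.e. "
      f"{risk_difference * 1000:.0f} fewer events per 1000 people per year")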

Guidelines development

A flowchart summarizing the development of the guidelines is shown in Fig. 1. We reviewed existing guidelines, and guidelines under development, on the EQUATOR website7 and selected relevant items from the existing guidelines listed in Table 3. Since many of the selected items came from the Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) guidelines,8 we used the STROBE checklist as a starting point. We compiled a list of 120 researchers, funders, editors and implementers and sent them a link to an online survey, asking them to score the importance of potential reporting items. We received 44 responses from people working in a wide range of countries. Most respondents identified themselves as researchers.

Fig. 1. Flow chart illustrating the process of guideline development.

Table 3. Reporting guidelines.

Type of study | Guideline name | Example extensions
Randomized trials | CONSORT | TIDieR
Observational studies | STROBE | RECORD
Systematic reviews | PRISMA | PRISMA-P
Qualitative research | SRQR | COREQ
Diagnostic/prognostic studies | STARD | TRIPOD
Quality improvement studies | SQUIRE |
Economic evaluations | CHEERS |
Phase IV implementation studies | STaRI |
Policy interventions | UNTIDieR |

Several respondents suggested including more items relevant to qualitative studies. Others pointed out that implementation studies often, but not always, involve defining a problem to be solved; in some cases, the reasons for the success of a programme might be the main focus of a study. Opinions varied on the right balance between reporting the implementation strategy and reporting the specific details of an intervention. Some expressed concern that overly detailed guidelines might put implementers off rather than encourage them.

A revised version was sent for review by email, and 33 responses were received. To gain consensus on key issues and help reconcile comments from people with different perspectives, a consensus meeting was held at WHO headquarters in October 2015. Participants agreed on the inclusion of a set of standard reporting items and pointed out that the guidelines should not refer exclusively to health-care settings. Some suggested that the guidelines should refer to existing guidelines where relevant: for example, studies using qualitative methods could use the SRQR guidelines,9 and studies using routine data could use RECORD.10 In revising the guidelines, we tried to balance completeness with user-friendliness and decided that it was preferable to produce a comprehensive guideline that includes all relevant items, as shown in Table 2, rather than referring authors to multiple guidelines.

Conclusion

A major challenge in the development of reporting guidelines for implementation and operational research is that this research is governed by the nature of the questions rather than by specific methods or designs. As such, the guidelines presented in this paper build upon and bring together a range of existing guidelines. The process of developing these reporting guidelines has brought people with different expertise and perspectives to the debate and helped build consensus. It is hoped that the present guidelines will be a useful reference, but further discussion and development will be required to overcome challenges in this evolving field.

Acknowledgements

We thank Hilary Pinnock and Steph Taylor for providing the draft STaRI guidelines, and the participants of the consensus meeting for their contributions.

Funding:

Special Programme for Research and Training in Tropical Diseases, World Health Organization.

Competing interests:

None declared.

References

1. Resolution A/RES/70/1. Transforming our world: the 2030 Agenda for Sustainable Development. In: Seventieth session of the United Nations General Assembly, New York, 25 September 2015. Agenda items 15 and 116. New York: United Nations; 2015. Available from: http://www.un.org/ga/search/view_doc.asp?symbol=A/RES/70/1&Lang=E [cited 2015 Dec 9].
2. The Millennium Development Goals Report 2015. New York: United Nations; 2015. Available from: http://www.un.org/millenniumgoals/2015_MDG_Report/pdf/MDG%202015%20rev%20%28July%201%29.pdf [cited 2015 Dec 9].
3. Maher D. The Global Fund and global goals: the role of implementation and operational research. Public Health Action. 2015 Sep 21;5(3):154–5. doi: 10.5588/pha.15.0046
4. Peters DH, Tran NT, Adam T. Implementation research in health: a practical guide. Geneva: World Health Organization; 2013. Available from: http://who.int/alliance-hpsr/alliancehpsr_irpguide.pdf [cited 2015 Dec 9].
5. Peters DH, Adam T, Alonge O, Agyepong IA, Tran N. Implementation research: what it is and how to do it. BMJ. 2013;347:f6753.
6. Revisiting lessons from the field. Bull World Health Organ. 2006;84(1):3. doi: 10.2471/BLT.05.029132
7. Enhancing the quality and transparency of health research (EQUATOR) [Internet]. Oxford: Equator Network; 2015. Available from: http://www.equator-network.org/ [cited 2015 Dec 9].
8. von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP; STROBE Initiative. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. Ann Intern Med. 2007 Oct 16;147(8):573–7. doi: 10.7326/0003-4819-147-8-200710160-00010
9. O’Brien BC, Harris IB, Beckman TJ, Reed DA, Cook DA. Standards for reporting qualitative research: a synthesis of recommendations. Acad Med. 2014 Sep;89(9):1245–51. doi: 10.1097/ACM.0000000000000388
10. Benchimol EI, Smeeth L, Guttmann A, Harron K, Moher D, Petersen I, et al.; RECORD Working Committee. The REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) statement. PLoS Med. 2015 Oct;12(10):e1001885. doi: 10.1371/journal.pmed.1001885
