Abstract
Background:
Theories of change explaining how interventions work are increasingly important, yet the methods/data to develop these are less advanced than for evaluating effects.
Methods:
We conducted a systematic evidence synthesis to develop a theory of change for structural adolescent contraception interventions. We reflect on the utility of the information provided in evaluation reports.
Findings/discussion:
Few of the included evaluations presented their theory of change, or included rich, qualitative process data. Authors’ descriptions of context and implementation, typically in introduction and discussion sections, were very useful. These helped to understand the intervention’s context, how it was experienced and why or how it had the effect that it did. We recommend incorporating rich process evaluations into studies, and reporting contextual insights into the intervention’s development, implementation and experience. We also recommend including these data and insights within syntheses that aim to develop theories of change.
Keywords: adolescent, contraception, evaluation, evidence synthesis, process evaluation
Introduction
Outcome evaluations tell us whether an intervention is effective or not. However, it has been increasingly recognised that it is also important to understand how an intervention works (i.e. what the causal mechanisms are), to explain why some interventions are more or less effective than others, and to consider whether an intervention is likely to be applicable to a new context (1–3). To answer these questions, we also need to know how the intervention was implemented, what the context was and how it was experienced.
A theory of change sets out how an intervention is expected to lead to an outcome, describing the causal mechanisms and what contextual and other factors may affect their enactment (4). It is useful throughout the process of intervention development and evaluation, from helping to determine what the intervention should be and how it should be delivered, to shaping the evaluation and finally, refining the theory based on empirical evidence gathered through evaluation (5). However, despite increasing awareness of their importance, the methods and data required for developing and refining theories of change are not as advanced as those for evaluating effects (4). Furthermore, despite their utility across different stages of an intervention, theories of change are more commonly used to inform intervention development than to be empirically tested (5). Therefore, while a theory of change is often included to inform or represent the design of an intervention or evidence synthesis, less attention is given to evaluating and updating this theory based on knowledge developed through the research/evaluation itself.
Theories of change can be useful at the individual study level, but can also be developed through synthesis of multiple studies (6). One approach is to use meta-ethnography to synthesise individual studies’ theories of change (7). However, studies’ reporting of theories of change is by no means universal and their prevalence varies between fields of research (7,8). In instances where insufficient studies have reported theories of change to allow synthesis, reviewers could draw on other data and information provided in the studies, for example, from process evaluations, to develop a theory of change for a body of evidence.
Calls have been made in recent decades for process evaluations to accompany outcome evaluations, and the number of process evaluations published has increased (3,9). However, process evaluations often focus on quantitative data, reporting what took place, for whom, and/or measures of the perceived acceptability of the intervention, rather than exploring how an intervention was experienced, or how and why it had an effect (10,11). Such quantitative data can be relatively simple to collect and analyse, but lack the richness needed to understand an intervention’s causal mechanisms.
The question therefore remains: how best to conduct a synthesis that helps to explain interventions’ mechanisms of action when a theory of change has not been set out within a study and when rich process data are lacking.
Standard evidence syntheses of effectiveness typically draw mainly on the research methods and findings sections of evaluation reports, extracting data on the intervention content, study design and results. In this short communication, we reflect on the data we found useful in a recent evidence synthesis undertaken to develop a theory of change.
Materials and methods
We undertook a systematic evidence synthesis to develop a theory of change for structural interventions to enable adolescent contraceptive use (12). We wanted to understand how interventions were implemented and experienced to help explain their effect (or lack thereof). We used a case-based approach, intervention component analysis (ICA), that drew on insights from all aspects of the included literature (13). ICA is an iterative method that involves extracting information about all aspects of the intervention and its evaluation (see Burchett et al. (12) for more details). The benefits of the method stem from its ability to draw insights from reports beyond those traditionally incorporated into reviews, particularly those related to the evaluation’s context and the intervention’s implementation – two factors that are understood to be of critical importance for developing theories of change. During and after the analysis, the team discussed and reflected on the types of data and insights that were most helpful in informing the development of our theory.
Results/discussion
Only a minority of the included studies explicitly set out their theory of change, and those that did rarely reflected on or refined it based on empirical data gathered in their evaluation. We found that the most useful data were often authors’ descriptions of the context and interactions within it, typically found in introduction and discussion sections. For example, in an evaluation of a Bangladeshi intervention teaching computer skills to adolescent girls, the authors explained that, ‘The technology used is important for its novelty – to keep girls coming and to ensure attendance – as well as to generate a favorable impression in the community regarding the program’ (14, p. 36). This offers insight not only into the context – that is, technology was novel there at that time – but also how the intervention might have been experienced, explaining why girls attended.
Insights were also gleaned on intervention implementation and experience. The authors of an Indian evaluation described efforts to gain community acceptance,
When launched, the project faced considerable opposition from parents and other adults in the community. In order to allay concerns about exposing adolescents, especially girls, to information about sex . . . the project, led by local NGO partners, held community-level meetings . . . These efforts went a long way towards gaining community acceptance for the training programme (15, p. 19).
This shows not only that the context led to initial hostility, but that the project took steps to address this and achieve acceptance (and described how it did so).
Finally, authors have offered suggestions on how a study’s control arm was experienced, helping to explain the evaluation’s results. In a study in Zimbabwe, the control group not only received substantial care, support and intervention, but the authors also reflected that,
There may also have been some ‘contamination’ across arms where control participants converted study reimbursements ($5/study visit) into economic opportunities – such as paying for school fees or buying and selling goods at a higher price. Although we did not capture this activity in a systematic way, program monitoring data indicate this did occur among some participants (16, p. 15).
Conclusion
We have presented some examples of how authors’ perspectives can be useful in understanding intervention evaluations, specifically their context and causal mechanisms, in order to develop a theory of change through synthesis of a set of intervention evaluations. Insights into interventions’ contexts and how interventions were implemented and experienced were particularly useful. We believe that data, insights and authors’ reflections from all parts of an article or report can help in understanding an intervention’s context, implementation, experience and possibly even mechanisms of action. These may be particularly useful when rich process data and contextual information are lacking, and when individual interventions’ theories of change are either unavailable or unable to be synthesised. Based on our experience, we recommend intervention evaluators incorporate rich process evaluations into their studies, and report contextual data as well as insights into the intervention’s development, implementation and experience. We also recommend that rich process data and contextual insights be included within evidence syntheses that aim to develop theories of change, or explain how interventions work, in order to provide more rigorous evidence for these aspects of interventions.
Footnotes
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The project that this paper drew on was funded by the Centre of Excellence for Development Impact and Learning (CEDIL), supported by U.K. aid from the U.K. Government. The views expressed in this paper do not necessarily reflect the U.K. Government’s official policies or those of CEDIL.
ORCID iD: Helen Elizabeth Denise Burchett
https://orcid.org/0000-0002-8380-2476
References
- 1. Davey C, Hassan S, Cartwright N, Humphreys M, Masset E, Prost A, et al. Designing evaluations to provide evidence to inform action in new settings (CEDIL Inception Paper) [Internet]. 2018[cited 2023 November 20]. Available from: https://cedilprogramme.org/wp-content/uploads/2018/10/Designing-evaluations-to-provide-evidence.pdf
- 2. Skivington K, Matthews L, Simpson SA, Craig P, Baird J, Blazeby JM, et al. Framework for the development and evaluation of complex interventions: gap analysis, workshop and consultation-informed update. Health Technol Assess. 2021; 25: 1–132. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3. Moore GF, Audrey S, Barker M, Bond L, Bonell C, Hardeman W, et al. Process evaluation of complex interventions: medical research council guidance. BMJ. 2015; 350: h1258. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 4. De Silva MJ, Breuer E, Lee L, Asher L, Chowdhary N, Lund C, et al. Theory of change: a theory-driven approach to enhance the Medical Research Council’s framework for complex interventions. Trials. 2014; 15: 267. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 5. Breuer E, Lee L, De Silva M, Lund C. Using theory of change to design and evaluate public health interventions: a systematic review. Implement Sci. 2015; 11: 63. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6. Kneale D, Thomas J, Booth A, Garside R, Noyes J. Chapter 4. Developing and using logic models. In: Noyes J, Harden A. (eds). Cochrane-Campbell Handbook for Qualitative Evidence Synthesis, Version 1, 2023 [cited 2023 November 20]. Available from: https://training.cochrane.org/qeschapter4logv0161023
- 7. Tancred T, Paparini S, Melendez-Torres GJ, Thomas J, Fletcher A, Campbell R, et al. A systematic review and synthesis of theories of change of school-based interventions integrating health and academic education as a novel means of preventing violence and substance use among students. Syst Rev. 2018; 7: 190. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 8. Jones B, Paterson A, English M, Nagraj S. Improving child health service interventions through a theory of change: a scoping review. Front Pediatr. 2023; 11: 1037890. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 9. Oakley A, Strange V, Bonell C, Allen E, Stephenson J. Process evaluation in randomised controlled trials of complex interventions. BMJ. 2006; 332: 413–416. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10. Wierenga D, Engbers LH, Van Empelen P, Duijts S, Hildebrandt VH, Van Mechelen W. What is actually measured in process evaluations for worksite health promotion programs: a systematic review. BMC Public Health. 2013; 13: 1190. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11. McIntyre SA, Francis JJ, Gould NJ, Lorencatto F. The use of theory in process evaluations conducted alongside randomized trials of implementation interventions: a systematic review. Transl Behav Med [Internet]. 2018; 10: 168–178 [cited 2023 November 20]. Available from: https://academic.oup.com/tbm/advance-article/doi/10.1093/tbm/iby110/5208274 [DOI] [PubMed] [Google Scholar]
- 12. Burchett HED, Griffin S, De Melo M, Picardo JJ, Kneale D, French RS. Structural interventions to enable adolescent contraceptive use in LMICs: a mid-range theory to support intervention development and evaluation. Int J Environ Res Public Health. 2022; 19: 14414. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 13. Sutcliffe K, Thomas J, Stokes G, Hinds K, Bangpan M. Intervention Component Analysis (ICA): a pragmatic approach for identifying the critical features of complex interventions. Syst Rev. 2015; 4: 140. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 14. Amin S, Ahmed J, Sah J, Hossain M, Haque E. Delaying child marriage through community-based skills-development programs for girls: results from a randomized controlled study in rural Bangladesh [Internet]; 2016. [cited 2021 January 19]. Available from: https://knowledgecommons.popcouncil.org/departments_sbsr-pgy/557/
- 15. Pandey N, Jejeebhoy SJ, Acharya R, Singh SK, Srinivas M. Effects of the PRACHAR project’s reproductive health training programme for adolescents: findings from a longitudinal study [Internet]; 2016. [cited 2021 January 19] Available from: https://knowledgecommons.popcouncil.org/departments_sbsr-pgy/566/#:~:text=Findings%20confirm%20that%20the%20training,implemented%20among%20large%20proportions%20of
- 16. Dunbar MS, Kang Dufour MS, Lambdin B, Mudekunye-Mahaka I, Nhamo D, Padian NS. The SHAZ! project: results from a pilot randomized trial of a structural intervention to prevent HIV among adolescent women in Zimbabwe. PLoS ONE. 2014; 9: e113621. [DOI] [PMC free article] [PubMed] [Google Scholar]
