Highlights
• Health promotion programmes in various settings are effective means to improve the health of the population.
• Despite substantial research, results from programme implementation remain unclear and challenging to evaluate.
• A shift from ‘one size fits all’ evidence-based fidelity to anchoring and tailoring interventions to their contexts is needed.
• Recurrent combinations of contextual factors (namely Typical Contextual Equations) occur in a given type of context.
• TCEs focus on a selection of the key critical factors that have a drastic impact on implementation.
Keywords: School health promotion, Implementation, Context
One strategy to improve the health of the population, and particularly of children, is to implement health promotion programmes in various settings (school, sports club, …). However, evidence of successful implementation remains limited and evaluation is still challenging. In addition, the processes leading to such results are often obscure and described in the literature as a “black box” [1,2]. Furthermore, demonstrating a positive and sustainable impact on health inequalities remains difficult. For example, Cochrane review conclusions are often expressed cautiously and are tentative at best; many authors also consider evaluation results with prudence, given the level of complexity affecting the effectiveness of health promotion programmes [3]. Assessment of health promotion programme effectiveness, upscaling and transferability issues remain rarely examined. Yet building evidence on programme implementation is a pressing priority for public health [4].
1. Is programme fidelity possible in health promotion?
Frameworks used and developed for programme evaluation are usually grounded in a linear programme fidelity perspective [4,5,13], based on the underlying assumption that: “a) either the programme is delivered as planned or not; and b) either it delivers expected outcomes, or not” [6]. However, this ideological perspective, with its overly simplistic outcome and fidelity measures, is clearly at odds with reality, as the complexity of implementation defies such linear, one-dimensional thinking [17]. Additionally, expected achievements in prevention programmes are multi-level and complex, often manifesting on a long-term basis and encompassing quite complex interactions between people and their life ecosystems [8–10]. Such variability frequently emerges during evaluation, through the gap between expectations and achievements [13], which is due to multiple and interwoven contextual factors relating to the nature of the intervention and, more importantly, to the different contexts of implementation (setting- and community-specific), as well as the differing characteristics, dispositions and practices of the professionals and stakeholders involved [3,11–15]. Professionals may choose not to implement a programme which they consider a “detail” that adds to other priorities in their workload; to customize the programme or build something new with already available resources; or to redefine the status of the programme and follow their own path.
The recognition of these factors sets transferability and wider replication of interventions as the two main challenges to overcome, as streamlined outcomes and identical replication seem unrealistic, the factors involved being so variable and contextually influenced [6]. ‘One size’ does not ‘fit all’. To assume that contexts and stakeholders respond similarly to the same programme adversely impacts intervention sustainability and decreases effectiveness, as it limits community ‘buy-in’. Contexts cannot simply be deemed to have failed when they do not deliver pre-defined objectives in a pre-defined way. The adoption of a flexible, ground-up and empowering process which takes the community/setting as its point of origin could result in outcomes which may even exceed pre-defined goals [6].
2. From evidence-based fidelity, to ground-up context-informed implementation
Health promotion programmes often facilitate new solutions and innovations, yet this contribution is rarely measured. Programmes can be a catalyst of change within a context, a revealer of specific conditions for implementation, or even a constraint. Sometimes, the very introduction of the programme triggers the changes. However, interactions between the newly introduced programme and the context are not always assessed, as focusing on outcomes means these changes are often missed. Programme outputs remain important, but they should be considered in the light of the process that created them.
Recent years have seen a strong trend towards implementation fidelity despite the fact that, at a micro level, contexts can vary quite significantly [7]. Schools, for example, might seem similar from a macro point of view in terms of structure and operation; however, in terms of culture and internal dynamics, they may show substantial differences.
The deep complexity of schools as organisations may not only impact the successes and failures of programme implementation, but also the meaningfulness of its expected impact. In fact, settings are so complex that more evidence is needed to identify the essential elements which must be taken into account in setting-specific programme design, to shift from evidence-based fidelity, to a more flexible perspective of tailoring interventions to fit each context. From this perspective, the programme and its content generally remain the same; however, more context-specific thinking is applied to the implementation process, and the types of outcomes that might be expected. In effect, this is not a reinvention of the wheel, but rather a reflection on how existing research can be used to provide operational tools in a flexible, applied and pragmatic manner [6].
3. Typical contextual equations, or the quest for regularities in contexts
A single coherent framework to conceptualize and examine programme implementation and evaluation has not yet emerged. Realist evaluation [16] is a potential choice as it focuses on ‘what works in which circumstances and for whom?’, rather than merely ‘does it work?’. However, realist evaluation is often used to understand why expected programme outcomes were not achieved and what factors were involved in successes and failures. While realist evaluation has much to offer here, we suggest a reorientation of focus towards the implementation process rather than programme outcomes, and a revisioning [3] of the well-known context–mechanism–outcome triad from Ref. [16].
“Context: A given context is a complex system of specific interacting factors, for example the characteristics or features in the setting, the community and stakeholders, that in their interactions create the specific conditions that exist prior to implementation.” [6]. Combinations of key contextual factors influencing implementation processes (“contextual equations”) give practitioners and researchers a lens through which to assess the conduciveness of a context towards implementation at a given moment in time. We advocate recognition that contextual equations inherently change and vary over time.
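To make the notion of a contextual equation more concrete, the minimal Python sketch below records a setting’s key contextual factors at a given moment and derives a crude conduciveness score. It is purely illustrative: the factor names, the −1 to +1 rating scale and the averaging rule are our own assumptions, not a formalisation proposed in the literature.

```python
# Illustrative sketch only: the factor names, rating scale and scoring rule
# are hypothetical, chosen to show how a "combination of key contextual
# factors" could be recorded and re-assessed over time.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class ContextualEquation:
    """A snapshot of the key contextual factors of one setting at one moment."""
    setting: str
    assessed_on: date
    # Each factor is rated on a simple -1 (hindering) to +1 (supportive) scale.
    factors: dict[str, float] = field(default_factory=dict)

    def conduciveness(self) -> float:
        """Crude aggregate of how conducive the context currently is."""
        if not self.factors:
            return 0.0
        return sum(self.factors.values()) / len(self.factors)


# Example: the same (hypothetical) school re-assessed a year later, because
# contextual equations change and vary over time.
before = ContextualEquation(
    setting="School A",
    assessed_on=date(2023, 9, 1),
    factors={"management_support": 0.8, "staff_turnover": -0.6, "role_vision": 0.2},
)
after = ContextualEquation(
    setting="School A",
    assessed_on=date(2024, 9, 1),
    factors={"management_support": 0.3, "staff_turnover": -0.9, "role_vision": 0.2},
)
print(before.conduciveness(), after.conduciveness())
```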
Outcomes can be defined as the intended and unintended results (potentially positive and negative) of the implementation process. Outcomes are observed across the whole context (e.g. health capacity building through organizational changes, changes in leadership or partnership, competency development, policy development, pedagogical and curriculum innovation), as well as in potential feedback on the programme itself (e.g. evolution in programme content and/or the health-related programme impact set beforehand).
Mechanisms account for the interactions between programme and context, thus influencing outputs significantly; these interactions may be modelled using a causal loop framework [19]. Mechanisms are not currently given much focus in programme implementation as such; however, they warrant close analysis.
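As a hedged illustration of how such mechanisms might be explored with causal loop or system dynamics thinking [19], the sketch below iterates a simple reinforcing loop in which visible programme delivery builds staff engagement, which in turn sustains delivery, while turnover erodes engagement. All variables, coefficients and the monthly time step are hypothetical assumptions chosen only to show the modelling style.

```python
# Minimal causal-loop sketch: a reinforcing loop between programme delivery
# and staff engagement, with staff turnover as a balancing drag. All numbers
# are illustrative assumptions, not parameters reported in this paper.

def simulate(steps: int = 24,
             engagement: float = 0.5,
             delivery: float = 0.2,
             turnover_drag: float = 0.05) -> list[tuple[float, float]]:
    """Iterate a simple two-variable feedback loop (monthly time step)."""
    history = []
    for _ in range(steps):
        # Visible delivery reinforces engagement; turnover erodes it.
        engagement += 0.1 * delivery - turnover_drag * engagement
        # Engaged staff deliver more of the programme.
        delivery += 0.1 * engagement - 0.02 * delivery
        # Keep both variables on a 0-1 scale.
        engagement = min(max(engagement, 0.0), 1.0)
        delivery = min(max(delivery, 0.0), 1.0)
        history.append((round(engagement, 3), round(delivery, 3)))
    return history


if __name__ == "__main__":
    for month, (eng, deliv) in enumerate(simulate(), start=1):
        print(f"month {month:2d}: engagement={eng:.2f} delivery={deliv:.2f}")
```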
Previous work shows that the number of factor combinations at play during programme implementation is limited, and that recurrent combinations of contextual factors can be found [3,10], namely “Typical Contextual Equations” (TCEs). TCEs provide a sense of the critical factors involved during the implementation process in a certain context, without discarding the variability which would emerge in a more detailed analysis. In this sense, Typical Contextual Equations could be compared to a setting or community implementation profile [3,6]. For example, staff turnover, support from the management team and an individual’s vision of their role in health promotion are critical factors in school settings.
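The sketch below illustrates, under our own assumptions, how a small catalogue of Typical Contextual Equations could serve as implementation profiles against which an observed context is matched. Only the three critical factors named above (staff turnover, management support and the individual’s vision of their role) come from the text; the profile labels, factor values and matching rule are hypothetical.

```python
# Hypothetical catalogue of Typical Contextual Equations used as
# implementation profiles; the profiles and matching rule are illustrative.

TCE_CATALOGUE = {
    "stable_supported_school": {
        "staff_turnover": "low",
        "management_support": "high",
        "role_vision": "health promotion seen as part of one's role",
    },
    "overstretched_school": {
        "staff_turnover": "high",
        "management_support": "low",
        "role_vision": "health promotion seen as an added 'detail'",
    },
}


def closest_tce(observed: dict[str, str]) -> str:
    """Return the catalogue profile sharing the most factor values."""
    return max(
        TCE_CATALOGUE,
        key=lambda name: sum(
            observed.get(factor) == value
            for factor, value in TCE_CATALOGUE[name].items()
        ),
    )


observed_context = {
    "staff_turnover": "high",
    "management_support": "low",
    "role_vision": "health promotion seen as an added 'detail'",
}
print(closest_tce(observed_context))  # -> "overstretched_school"
```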
4. Implementation patterns to support health promotion programme implementation
Our work has led us to consider that a focus on “what works” is not sufficient to take up the challenge of reducing the gradient of inequalities [6]. However, a better understanding of the interaction between programme and contexts could contribute to upscaling the design of effective health promotion strategies. Our suggestion is to focus programme design on such interactions, to transform potential vulnerabilities or strengths in contexts into opportunities, without discarding the importance of programme content and features. Based on Typical Contextual Equations, implementation patterns could guide action and, in turn, inform policy development, programme design and practices, based on proportionate universalism. Research and analysis of effective practices from the field could help us to create such patterns, as well as useful tools for programme design and implementation.
Abraham Maslow once wrote: “It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail” [17]. Classical approaches to programme effectiveness are not sufficient to understand the implementation process in complex settings such as workplaces, sports clubs, schools, hospitals and nursing homes. Researchers and practitioners need to think ‘outside the box’ of programme fidelity to illuminate potential solutions to the current challenges of health promotion programme implementation, without causing harm to the process [6].
Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
References
- 1. Rowling L., Samdal O. Filling the black box of implementation for health-promoting schools. Health Educ. 2011;111(5):347–362.
- 2. Samdal O., Rowling L. Theoretical and empirical base for implementation components of health-promoting schools. Health Educ. 2011;111(5):367–390.
- 3. Darlington E.J., Simar C., Jourdan D. Implementation of a health promotion programme: a ten-year retrospective study. Health Educ. 2017;117(3):252–279. Available from: http://www.emeraldinsight.com/doi/abs/10.1108/HE-09-2016-0038
- 4. Wimbush E., Watson J. An evaluation framework for health promotion: theory, quality and effectiveness. Evaluation. 2000;6(3):301–321.
- 5. Spencer B., Broesskamp-Stone U., Ruckstuhl B., Ackermann G., Spoerri A., Cloetta B. Modelling the results of health promotion activities in Switzerland: development of the Swiss model for outcome classification in health promotion and prevention. Health Promot. Int. 2008;23(1):86–97. doi: 10.1093/heapro/dam038.
- 6. Darlington E.J., Mannix McNamara P., Jourdan D. Enhancing the efficacy of health education interventions: moving the spotlight from implementation fidelity to quality of the implementation process. ECER Conference 2017, “Reforming Education and the Imperative of Constant Change: Ambivalent Roles of Policy and the Role of Educational Research”, Copenhagen, Denmark, 22–25 August 2017.
- 7. Nutbeam D. Evaluating health promotion – progress, problems and solutions. Health Promot. Int. 1998;13(1):27–44. Available from: http://heapro.oxfordjournals.org/content/13/1/27.short
- 8. Rowling L., Jeffreys V. Capturing complexity: integrating health and education research to inform health-promoting schools policy and practice. Health Educ. Res. 2006;21(5):705–718. doi: 10.1093/her/cyl089.
- 9. McIsaac J.-L.D., Read K., Veugelers P.J., Kirk S.F.L. Culture matters: a case of school health promotion in Canada. Health Promot. Int. 2013:207–217. doi: 10.1093/heapro/dat055. Available from: http://www.ncbi.nlm.nih.gov/pubmed/23945087
- 10. Stewart-Brown S. What Is the Evidence on School Health Promotion in Improving Health or Preventing Disease and, Specifically, What Is the Effectiveness of the Health Promoting Schools Approach? Copenhagen: WHO Regional Office for Europe; 2006.
- 11. Durlak J.A., DuPre E.P. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am. J. Community Psychol. 2008;41(3–4):327–350. doi: 10.1007/s10464-008-9165-0.
- 12. Meyers D.C., Durlak J.A., Wandersman A. The quality implementation framework: a synthesis of critical steps in the implementation process. 2012.
- 13. Fixsen D.L., Naoom S.F., Blase K.A., Friedman R.M., Wallace F. Implementation Research: A Synthesis of the Literature. Tampa, FL: Louis de la Parte Florida Mental Health Institute, Publication #231; 2005. Available from: http://nirn.fmhi.usf.edu
- 14. McIsaac J.-L.D., Mumtaz Z., Veugelers P.J., Kirk S.F.L. Providing context to the implementation of health promoting schools: a case study. Eval. Program Plann. 2015;53:65–71.
- 15. Darlington E.J., Violon N., Jourdan D. Implementation of health promotion programmes in schools: an approach to understand the influence of contextual factors on the process? BMC Public Health. 2018;18(1). doi: 10.1186/s12889-017-5011-3.
- 16. Pawson R., Tilley N. Realist evaluation. In: An Evidence-Based Approach to Public Health and Tackling Health Inequalities: Practical Steps and Methodological Challenges. London: Sage; 2004. pp. 1–36.
- 17. Maslow A. The Psychology of Science: A Reconnaissance. Harper & Row, Gateway Editions; 1966. pp. 15–16.
- 18. Fixsen D.L., Naoom S.F., Blase K.A., Friedman R.M., Wallace F. Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, The National Implementation Research Network; 2005. pp. 1–119.
- 19. Hirsch G.B., Levine R., Miller R.L. Using system dynamics modeling to understand the impact of social change initiatives. Am. J. Community Psychol. 2007;39(3–4):239–253. doi: 10.1007/s10464-007-9114-3.