BMJ Open. 2024 Aug 22;14(8):e080078. doi: 10.1136/bmjopen-2023-080078

Mapping review of ‘proof-of-concept’ in mental health implementation research using the TRL framework: a need for a better focus and conceptual clarification

Cindy E Woods 1, Sue Lukersmith 1, Luis Salvador-Carulla 1
PMCID: PMC11344517  PMID: 39179274

Abstract


Background

Proof-of-concept (PoC) development is a key step in implementation sciences. However, there is a dearth of studies in this area and the use of this term in health and social sciences is ambiguous.

Objective

The objective was to remove the ambiguity surrounding the PoC and pilot study stages in the research development process, using the standard system for rating the development of projects and applications provided by the Technology Readiness Levels (TRL) framework.

Design

Mapping review and critical analysis using TRL as the standard measure.

Search strategy and charting method

PubMed and PsycInfo databases were searched for papers that reported PoC studies of mental health interventions up to August 2023. Data were extracted, described and tabulated.

Eligibility criteria

PoC studies in mental health implementation research were included. Studies were excluded if they related to biomedical (drug) development, neurocognitive tools, neuropsychology or medical devices, were literature reviews or discussion papers, or did not include the term ‘proof-of-concept’ in the title, abstract or text.

Results

From the 83 citations generated from the database search, 22 studies were included in this mapping review. Based on the study title, abstract and text, studies were categorised by research development stage according to the TRL framework. This review showed that 95% of the studies used the term PoC to describe the development stage of their research even though they were not at this specific level of project development.

Conclusions

The TRL was a useful reference framework for improving terminological clarity around the term ‘proof-of-concept’ in implementation research. To extend the use of the TRL in implementation sciences, the framework has now been adapted and validated for a health and social science research context, accompanied by a health-related glossary of research process terms and definitions to promote a common vocabulary and shared understanding in implementation sciences.

Keywords: Mental health, Systematic review, Health, Implementation science


Strengths and limitations of this study.

  • This study gives insight into common misunderstandings and confusion about the way proof-of-concept is used in mental health implementation research by analysing available literature.

  • The mapping review approach is useful for clarifying complex research topics, such as the developmental stages of implementation research, and for highlighting the need for conceptual clarification of commonly used research terms.

  • Quality appraisal of the included studies was considered out of scope and was not performed.

Introduction

Proof-of-concept (PoC) and pilot studies are key steps in implementation sciences; however, a better understanding of the differences between these two levels of project development is urgently needed. The terms ‘proof-of-concept’ and ‘pilot study’ are frequently used interchangeably in the literature, possibly because they share similarities in their objectives. Yet they are distinct in their scope, purpose and phase of project development.1

PoC studies aim to determine the workability of an application: that is, whether a particular technology, product, intervention or design could work as intended, solve a specific problem or meet a particular need. PoC is not intended to provide comprehensive results or solutions but rather to show that the core idea is worth further investigation. A PoC study can lead to a decision on whether to invest further resources in developing the idea. The National Science Foundation’s Accelerating Innovation Research-Technology Translation program defines ‘proof-of-concept’ as ‘the realisation of a certain method or idea to ascertain its scientific or technological parameters. A PoC should be understood sufficiently so that potential application areas can be identified and a follow-on working prototype designed’.2

Feasibility studies aim to test and validate a prototype, a working model or a preliminary version of an application (a technology, concept, process, product or service).3 Testing for feasibility may include testing for relevance, acceptability, applicability, practicality, effectiveness/efficiency and value.4–6 At this level, testing aims to demonstrate that the prototype application can function as expected in the intended real-world setting with the target audience.

The primary objective of a pilot study is to test the application in a relevant environment with a small representative sample of the intended target audience before conducting a larger scale study.7 Moore et al define pilot studies as ‘preparatory studies designed to test the performance characteristics and capabilities of study designs, measures, procedures, recruitment criteria and operational strategies that are under consideration for use in a subsequent, often larger, study’.8

A pilot study should be carried out after the PoC and feasibility studies to help researchers identify potential issues and gather information to refine their study design and assess whether the study procedures are practical and effective. Researchers collect data during the pilot phase to assess the study’s logistics, data collection instruments and processes.9

The results of a pilot study are not used to draw definitive conclusions but to adjust and improve the research design.10 A pilot study can be likened to a rehearsal for full-scale implementation,11 allowing for subsequent adjustments or improvements that can increase the likelihood of a successful larger-scale study.

In spite of the relevance of these research terms within the research process, the extent of the lack of clarity between PoC and pilot studies is unknown. We postulated that the term ‘proof-of-concept’ is commonly used in the literature to describe ‘pilot’ and ‘feasibility’ studies. Both terms are used in the literature to describe studies undertaken in preparation for a future larger study to evaluate the effect of an intervention or to gather information about the feasibility of implementing an intervention in a future study.7 12

Jobin et al provide a concise historical narrative of the genesis of the concept of PoC and its evolution over time, beginning with NASA and the Technology Readiness Levels (TRL) framework.13 The TRL scale was developed by NASA in the 1970s as a means of measuring the maturity or readiness of a technology or component to be launched into space.14 15 The TRL framework is a 9-point scale: TRL 1 indicates the earliest ideas stage of the research process; TRL 2 and 3 cover formulating and developing the concept; TRL 4, completion of the prototype; TRL 5, validation of the prototype; TRL 6, pilot testing; TRL 7, demonstration; and TRL 8 and 9 indicate that the technology is fully tested and ready to be applied in its intended or operational environment.

The TRL scale can be used to monitor the progress of an application through the different research stages over time. The TRL scale has been adopted, adapted and widely used by various industries and sectors, including defence in Australia,16 medical drug development,17 research consortiums (eg, the Digital Health Cooperative Research Centre in Australia) and international research funding agencies such as the European Commission, which has used it for its Horizon 2020 and Horizon Europe Research and Innovation funding programmes since 2014 (table 1).13 18

Table 1. Technology Readiness Level scale (European Commission, 2014).

Maturity level | Description | Stages
TRL 1 | Basic principles observed | Ideas stages
TRL 2 | Technology concept formulated | Ideas stages
TRL 3 | Experimental proof of concept | Development stages
TRL 4 | Technology validated in lab | Development stages
TRL 5 | Technology validated in relevant environment | Validating and pilot testing stages
TRL 6 | Technology demonstrated in relevant environment | Validating and pilot testing stages
TRL 7 | System prototype demonstration in operational environment | Demonstration in real world stages
TRL 8 | System complete and qualified | Demonstration in real world stages
TRL 9 | Actual system proven in operational environment | Demonstration in real world stages

Each of the TRL stages has a role to play in the research process, and while this process is not always linear, the TRL does facilitate an understanding of which stage of the research process an application is currently in. The TRL scale can also help to clarify and specify the difference between particular development stages, such as PoC and pilot testing. Recently, the TRL has been adapted for health and social science implementation research, and the adapted TRL-IS has been validated for use in a health implementation research context.3
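To make this classification step concrete, the following minimal sketch (ours, not part of the TRL or TRL-IS publications) encodes the European Commission TRL scale from table 1 as a simple lookup that could be used to label studies during data extraction; the dictionary and function names are illustrative assumptions.

```python
# Illustrative sketch only: a simple encoding of the European Commission (2014) TRL scale
# from table 1. The level descriptions and stage groupings are taken from the table;
# the names TRL_SCALE and describe_trl are our own and not part of any published tool.

TRL_SCALE = {
    1: ("Basic principles observed", "Ideas"),
    2: ("Technology concept formulated", "Ideas"),
    3: ("Experimental proof of concept", "Development"),
    4: ("Technology validated in lab", "Development"),
    5: ("Technology validated in relevant environment", "Validating and pilot testing"),
    6: ("Technology demonstrated in relevant environment", "Validating and pilot testing"),
    7: ("System prototype demonstration in operational environment", "Demonstration in real world"),
    8: ("System complete and qualified", "Demonstration in real world"),
    9: ("Actual system proven in operational environment", "Demonstration in real world"),
}

def describe_trl(level: int) -> str:
    """Return a readable label for a TRL level, eg for tagging a study in an extraction sheet."""
    description, stage = TRL_SCALE[level]
    return f"TRL {level}: {description} ({stage} stages)"

print(describe_trl(3))  # TRL 3: Experimental proof of concept (Development stages)
print(describe_trl(6))  # TRL 6: Technology demonstrated in relevant environment (Validating and pilot testing stages)
```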

In summary, the terms PoC and pilot show a terminological problem due to the unclarity (ambiguity or vagueness) of these scientific terms. Terminological ambiguity exists when a term (the dyad of a name and its definition) can reasonably be interpreted in more than one way (eg, two different codes of a reference classification system can be assigned to the same entity). Vagueness occurs when a word or phrase is underspecified and therefore admits borderline cases or relative interpretation.19

The aim of this study was to remove the ambiguity surrounding the stages of the research development process to which PoC and pilot studies belong, using the standard system for rating the development of projects and applications provided by the TRL-IS framework. The following research question was addressed: how frequently is PoC used correctly within the health implementation research literature to report a study in the PoC stage of development according to the TRL-IS framework?

Methods

Mapping review

A mapping review20 21 was conducted to systematically map the extent to which mental health implementation research studies include the term ‘proof-of-concept’ in the title, abstract or text when reporting pilot/feasibility studies or studies at different stages of the research process. A mapping review was selected to identify, describe and catalogue the available evidence and evidence gaps relating to the research question; it is an appropriate approach given the descriptive nature of the extracted data and the higher-level (predefined) codes.20 The review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist22 to ensure all necessary steps were included.

Literature search strategy

A search of the PubMed and PsycInfo databases was conducted in August 2023 using the following search terms: ‘mental health care’ AND ‘proof of concept’; ‘digital mental health’ AND ‘proof of concept’; and ‘Technology Readiness Level’ AND ‘digital mental health’ (see online supplemental file 1). No date limits were applied to the search, but papers were required to be in the English language. Two authors (CEW, LS-C) scanned the titles and abstracts.

Inclusion criteria

Studies were included if they were published in the English language, the full text was available and they reported a PoC study in mental health implementation research. Studies were excluded if they related to interventions involving biomedical (drugs) development, neurocognitive tools, neuropsychology, medical devices, were literature reviews or discussion papers or were not reporting mental health implementation research.

Data extraction

Relevant data were extracted from the selected studies, including publication date, study design, sample size and the evidence used to determine whether the study was at the PoC stage, the pilot stage or another stage according to the European Commission 2014 TRL framework and definitions (table 1). Extracted data were recorded in an Excel spreadsheet.
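As an illustration of the shape of the records captured at this step, the sketch below defines one possible structure for an extraction entry; the field names and example values are hypothetical assumptions on our part, since the actual extraction was recorded in an Excel spreadsheet rather than in code.

```python
# Hypothetical sketch of a single data-extraction record, mirroring the items listed above.
# The real extraction was done in an Excel spreadsheet; field names and example values
# here are illustrative assumptions, not data from the review.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ExtractionRecord:
    publication_year: int
    study_design: str            # eg "RCT", "RCT protocol", "observational", "qualitative"
    sample_size: Optional[int]   # None where the sample size was not reported
    trl_evidence: str            # text used as evidence for the assigned stage
    assigned_trl: int            # TRL level per the European Commission 2014 framework (table 1)

# Example (invented values, shown only to illustrate the record shape):
record = ExtractionRecord(
    publication_year=2017,
    study_design="RCT",
    sample_size=54,
    trl_evidence="authors describe testing in a relevant environment with a small sample",
    assigned_trl=6,
)
print(record.assigned_trl)  # 6
```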

Patient and public involvement statement

There was no patient or public involvement in this study.

Results

The database search returned 83 results. Four duplicates were removed, leaving a total of 79 papers. Two authors (CEW, LS-C) screened the titles and abstracts, and 45 papers were retained. The full text of the selected studies was examined by one author (CEW) to ensure they met the inclusion/exclusion criteria; where there was any uncertainty, papers were referred to a second author (LS-C). Twenty-three papers were excluded with reasons, leaving a final 22 papers for inclusion in the review (figure 1). A protocol paper was excluded because its full results paper was also identified in the search results.

Figure 1. PRISMA Flowchart. Adapted from Page et al (2021). PoC, proof-of-concept; PRISMA, Preferred Reporting Items for Systematic Reviews and Meta-Analyses.


The two authors classified the studies according to the current TRL stage. Extracted data were descriptively organised into a table which illustrates whether the terms PoC and/or pilot appear in the title and/or abstract.

Study design

Most of the selected studies were randomised controlled trials (RCTs) (n=13); the remainder comprised a clinical trial (n=1), secondary analyses of RCT data (n=2), RCT protocols (n=3), an observational study (n=1), a report of the outcomes of a funding programme (n=1) and a qualitative study (n=1).

Just over half (n=12) of the studies used a small sample of <100, seven studies used a sample size of between 101 and 300, the observational study had a sample size of 398,23 and the sample size was not reported for two studies.24 25

Currency of evidence

All of the selected studies were published over an 11-year period (2012–2023).

PoC or pilot study

None of the selected studies provided a definition of the term ‘PoC’ or ‘pilot study’. Overall, 18 of the 22 selected studies or study protocols were rated as TRL 6—pilot studies. A number of these studies used PoC and feasibility/pilot study as synonyms or interchangeable terms in the same paper (n=8), while others used PoC as a synonym for feasibility in the development process (n=10). Of the remaining papers, one is a descriptive study of funding programme outcomes (TRL 7),25 one describes the implementation of a model of care (TRL 7),23 one reports validation of a prototype (TRL 5) prior to testing in a pilot study but also reports TRL 3 (PoC—co-design and development of a prototype) and TRL 4 (testing of a preliminary prototype) activities26 and one is a PoC study (TRL 3).27 All 22 studies used PoC in the title and/or abstract; one used PoC in the title and ‘demonstration of concept’ in the abstract (online supplemental file 2).25 Only one study used the term ‘pilot’ in the title and/or abstract.28 Online supplemental file 2 shows the evidence used to indicate that a study is a pilot or feasibility study rather than a PoC or other study. These figures (21/22) translate to 95% of the authors of the selected studies using PoC incorrectly. This finding indicates that the term PoC lacks clarity and is not a well-understood concept.

Discussion

This study has demonstrated that despite clear differences in the definitions of PoC and pilot study, these terms continue to be used interchangeably and uncritically. While the two terms have distinct definitions, they share a common thread of validation and testing in the early stages of research which may lead to confusion in some contexts.

The lack of a clear understanding and use of these terms emerged as a significant challenge. While the scientific community recognises their importance, the interchangeable use of PoC and feasibility/pilot study terminology remains prevalent.29 Our findings underscore the need for greater precision in defining and applying these concepts and universally accepted definitions.

The use of PoC in the context of feasibility/pilot studies has conceivably gained traction as a way to shortcut the process of first assessing the viability or workability of an idea and then developing and validating a prototype. By combining PoC and feasibility/pilot testing, proving the viability of an idea can be integrated with practical validation of its economic, operational and/or logistical feasibility. However, it is more likely that PoC (theoretical viability or workability) is being confused with operational and/or logistical feasibility, and even with prototyping. Language is fluid, and words and phrases can evolve in meaning over time.30 As more people use these terms interchangeably, their meanings may become less clear.

In interdisciplinary projects (eg, biomedical engineering, technology development), terminology can vary depending on the background of the researchers involved, leading to overlap or confusion of terms. Both PoC and feasibility/pilot studies are crucial steps in the research and development process, but they serve different purposes and stages of application development.

In a research context, it is good practice to use terminology accurately to avoid ambiguity and potential misunderstandings. Terminology is defined as ‘a set of designations belonging to one special language’,31 and its main purpose is to eliminate unclarity and ambiguity from technical languages by means of standardisation.32

Our study underscores the importance of refining terminology and promoting consistency in the scientific community. Clear definitions of PoC and feasibility/pilot studies are essential for accurate reporting, robust evidence generation and informed decision-making. Researchers, reviewers and policymakers should observe standardised guidelines, such as the TRL,3 18 to foster a shared understanding of these critical stages in the research continuum.

Strengths and limitations

The outcomes of this study promote consistency in terminology, fostering a robust scientific ecosystem, facilitating knowledge exchange and ensuring that research findings contribute meaningfully to advancements in various fields.

Quality assessment was not used in this review and we recognise that the papers may be of variable quality.

The incorrect use of terminology (PoC vs feasibility/pilot) does not affect the validity of the studies described but may create misunderstandings about the stage of research of each study.

Conclusions

The findings from this study indicate, on the one hand, the significant contribution of the TRL framework to improving the clarity of project development in implementation sciences and, on the other, the limited use of this framework in mental health research in spite of the recommendations in international guidelines. The TRL has been adapted and tailored to a health and social science implementation research context (TRL-IS) with discipline-specific guides.3 The adapted TRL-IS and guides provide researchers with a mutual understanding of maturity stages achieved through a shared language that can be used across organisations and research institutes to better communicate and monitor developmental progress. The progressive levels of the TRL-IS provide a systematic approach and framework to guide the planning, development, monitoring of progression and implementation of health-related interventions. The use of the TRL-IS may also affect funding and policy decisions, as the maturity or ‘readiness level’ reflects how close an intervention is to being validated, tested and proven ready for use in routine care.

These findings also indicate the need for an international glossary of health and social science research terms and definitions to promote a common vocabulary and a shared understanding of research terminology and to prevent unclarity and ambiguity. Such a glossary could be maintained by an international organisation such as the WHO, similar to its Health Promotion Glossary of Terms 2021.33 Using the correct terminology becomes particularly important for research funding applications, such as incubator or seed funding, PoC funding or assigning a TRL to applications for funding programmes.

Supplementary material

online supplemental file 1
bmjopen-14-8-s001.pdf (90.7KB, pdf)
DOI: 10.1136/bmjopen-2023-080078
online supplemental file 2
bmjopen-14-8-s002.pdf (228.4KB, pdf)
DOI: 10.1136/bmjopen-2023-080078

Footnotes

Funding: The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

Prepub: Prepublication history and additional supplemental material for this paper are available online. To view these files, please visit the journal online (https://doi.org/10.1136/bmjopen-2023-080078).

Provenance and peer review: Not commissioned; externally peer reviewed.

Patient consent for publication: Not applicable.

Ethics approval: Not applicable.

Patient and public involvement: Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

Contributor Information

Cindy E Woods, Email: cindy.woods@canberra.edu.au.

Sue Lukersmith, Email: sue.lukersmith@canberra.edu.au.

Luis Salvador-Carulla, Email: luis.salvador-carulla@anu.edu.au.

Data availability statement

All data relevant to the study are included in the article or uploaded as supplementary information.

References

  • 1.Searchfield GD, et al. Good vibrations: a proof-of-concept study of the preferred temporal characteristics in surf-like sounds for tinnitus therapy. Can J Speech Lang Pathol Audio. 2019;43:216–29. [Google Scholar]
  • 2.National Science Foundation . Program Solicitation: Accelerating Innovation Research-Technology Translation, Directorate for Engineering, Industrial Innovation and Partnerships. NSF; 2014. pp. 14–569. [Google Scholar]
  • 3.Salvador-Carulla L, Woods C, de Miquel C, et al. Adaptation of the technology readiness levels for impact assessment in implementation sciences: the TRL-IS checklist. Heliyon. 2024;10:e29930. doi: 10.1016/j.heliyon.2024.e29930. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Slade M, Thornicroft G, Glover G. The feasibility of routine outcome measures in mental health. Soc Psychiatry Psychiatr Epidemiol. 1999;34:243–9. doi: 10.1007/s001270050139. [DOI] [PubMed] [Google Scholar]
  • 5.Zeilinger EL, Nader IW, Brehmer-Rinderer B, et al. CAPs-IDD: characteristics of assessment instruments for psychiatric disorders in persons with intellectual developmental disorders. J Intellect Disabil Res. 2013;57:737–46. doi: 10.1111/jir.12003. [DOI] [PubMed] [Google Scholar]
  • 6.Zeilinger EL, Gärtner C, Janicki MP, et al. Practical applications of the NTG-EDSD for screening adults with intellectual disability for dementia: a German-language version feasibility study. J Intellect Dev Disabil. 2016;41:42–9. doi: 10.3109/13668250.2015.1113238. [DOI] [Google Scholar]
  • 7.Leon AC, Davis LL, Kraemer HC. The role and interpretation of pilot studies in clinical research. J Psychiatr Res. 2011;45:626–9. doi: 10.1016/j.jpsychires.2010.10.008. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Moore CG, Carter RE, Nietert PJ, et al. Recommendations for planning pilot studies in clinical and translational research. Clin Transl Sci. 2011;4:332–7. doi: 10.1111/j.1752-8062.2011.00347.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Lancaster GA, Dodd S, Williamson PR. Design and analysis of pilot studies: recommendations for good practice. J Eval Clin Pract. 2004;10:307–12. doi: 10.1111/j.2002.384.doc.x. [DOI] [PubMed] [Google Scholar]
  • 10.Lancaster GA, Campbell MJ, Eldridge S, et al. Trials in primary care: statistical issues in the design, conduct and evaluation of complex interventions. Stat Methods Med Res. 2010;19:349–77. doi: 10.1177/0962280209359883. [DOI] [PubMed] [Google Scholar]
  • 11.Conn VS, Algase DL, Rawl SM, et al. Publishing pilot intervention work. West J Nurs Res. 2010;32:994–1010. doi: 10.1177/0193945910367229. [DOI] [PubMed] [Google Scholar]
  • 12.Eldridge SM, Lancaster GA, Campbell MJ, et al. Defining feasibility and pilot studies in preparation for randomised controlled trials: development of a conceptual framework. PLoS One. 2016;11:e0150205. doi: 10.1371/journal.pone.0150205. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Jobin C, Hooge S, Masson PL. What Does the Proof-of-Concept (POC) Really Prove? A Historical Perspective and A Cross-Domain Analytical Study. in XXIXème Conférence de l’Association Internationale de Management Stratégique (AIMS) 2020. [Google Scholar]
  • 14.Héder M. From NASA to EU: the evolution of the TRL scale in public sector innovation. TIJ. 2017;22:1–23. [Google Scholar]
  • 15.Mankins JC. Technology readiness assessments: a retrospective. Acta Astronaut. 2009;65:1216–23. doi: 10.1016/j.actaastro.2009.03.058. [DOI] [Google Scholar]
  • 16.Moon T, Smith J, Cook S. Technology Readiness and Technical Risk Assessment for the Australian Defence Organisation. ICE Australia; 2005. [Google Scholar]
  • 17.Årdal C, Baraldi E, Theuretzbacher U, et al. Insights into early stage of antibiotic development in small- and medium-sized enterprises: a survey of targets, costs, and durations. J Pharm Policy Pract. 2018;11:8. doi: 10.1186/s40545-018-0135-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Bruno I, et al. Technology readiness revisited: a proposal for extending the scope of impact assessment of European public services. In: Proceedings of the 13th International Conference on Theory and Practice of Electronic Governance; 2020. [DOI] [Google Scholar]
  • 19.Castelpietra G, Simon J, Gutiérrez-Colosía MR, et al. Disambiguation of psychotherapy: a search for meaning. Br J Psychiatry. 2021;219:532–7. doi: 10.1192/bjp.2020.196. [DOI] [PubMed] [Google Scholar]
  • 20.Campbell F, Tricco AC, Munn Z, et al. Mapping reviews, scoping reviews, and evidence and gap maps (EGMs): the same but different- the “big picture” review family. Syst Rev. 2023;12:45. doi: 10.1186/s13643-023-02178-5. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 21.James KL, Randall NP, Haddaway NR. A methodology for systematic mapping in environmental sciences. Environ Evid . 2016;5:1–13. doi: 10.1186/s13750-016-0059-6. [DOI] [Google Scholar]
  • 22.Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. 2018;169:467–73. doi: 10.7326/M18-0850. [DOI] [PubMed] [Google Scholar]
  • 23.Faddy SC, McLaughlin KJ, Cox PT, et al. The mental health acute assessment team: a collaborative approach to treating mental health patients in the community. Australas Psychiatry. 2017;25:262–5. doi: 10.1177/1039856216689655. [DOI] [PubMed] [Google Scholar]
  • 24.Henderson C, Brohan E, Clement S, et al. A decision aid to assist decisions on disclosure of mental health status to an employer: protocol for the CORAL exploratory randomised controlled trial. BMC Psychiatry. 2012;12:133. doi: 10.1186/1471-244X-12-133. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.Savage RM, Dillon JM, Hammel JC, et al. The alabama coalition for a healthier black belt: a proof of concept project. Community Ment Health J. 2013;49:79–85. doi: 10.1007/s10597-012-9488-z. [DOI] [PubMed] [Google Scholar]
  • 26.Realpe A, Elahi F, Bucci S, et al. Co-designing a virtual world with young people to deliver social cognition therapy in early psychosis. Early Interv Psychiatry. 2020;14:37–43. doi: 10.1111/eip.12804. [DOI] [PubMed] [Google Scholar]
  • 27.Verhagen SJW, Simons CJP, van Zelst C, et al. Constructing a reward-related quality of life statistic in daily life-a proof of concept study using positive affect. Front Psychol. 2017;8:1917. doi: 10.3389/fpsyg.2017.01917. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Kooistra LC, Wiersma JE, Ruwaard J, et al. Blended vs. face-to-face cognitive behavioural treatment for major depression in specialized mental health care: study protocol of a randomized controlled cost-effectiveness trial. BMC Psychiatry. 2014;14:290. doi: 10.1186/s12888-014-0290-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Bond C, Lancaster GA, Campbell M, et al. Pilot and feasibility studies: extending the conceptual framework. Pilot Feasibility Stud. 2023;9:24. doi: 10.1186/s40814-023-01233-1. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Steels L, Szathmáry E. The evolutionary dynamics of language. BioSystems. 2018;164:128–37. doi: 10.1016/j.biosystems.2017.11.003. [DOI] [PubMed] [Google Scholar]
  • 31.Roche C. Ontoterminology: How to Unify Terminology and Ontology into a Single Paradigm. in LREC 2012, Eighth International Conference on Language Resources and Evaluation. European Language Resources Association; 2012. [Google Scholar]
  • 32.Gutierrez-Colosia MR, Hinck P, Simon J, et al. Magnitude of terminological bias in international health services research: a disambiguation analysis in mental health. Epidemiol Psychiatr Sci. 2022;31:e59. doi: 10.1017/S2045796022000403. [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.World Health Organization . Health Promotion Glossary of Terms 2021. Geneva: World Health Organization; 2021. [Google Scholar]
