BMJ Open. 2020 Aug 13;10(8):e037643. doi: 10.1136/bmjopen-2020-037643

Identifying optimal frameworks to implement or evaluate digital health interventions: a scoping review protocol

Charlene Soobiah 1, Madeline Cooper 1, Vanessa Kishimoto 1, R Sacha Bhatia 1, Ted Scott 2,3, Shelagh Maloney 4, Darren Larsen 1,5, Harindra C Wijeysundera 6, Jennifer Zelmer 7, Carolyn Steele Gray 8,9, Laura Desveaux 1,9,
PMCID: PMC7430416  PMID: 32792444

Abstract

Introduction

Digital health interventions (DHIs) are defined as health services delivered electronically through formal or informal care. DHIs can range from electronic medical records used by providers to mobile health apps used by consumers. DHIs involve complex interactions between the user, the technology and the healthcare team, posing challenges for implementation and evaluation. Theoretical or interpretive frameworks are crucial in providing researchers with guidance and clarity on implementation or evaluation approaches; however, there is a lack of standardisation on which frameworks to use in which contexts. Our goal is to conduct a scoping review to identify frameworks to guide the implementation or evaluation of DHIs.

Methods and analysis

A scoping review will be conducted using methods outlined by the Joanna Briggs Institute reviewers’ manual and will conform to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews. Studies will be included if they report on frameworks (ie, theoretical, interpretive, developmental) that are used to guide either implementation or evaluation of DHIs. Electronic databases, including MEDLINE, EMBASE, CINAHL and PsycINFO, will be searched in addition to grey literature and reference lists of included studies. Citations and full text articles will be screened independently in Covidence after a reliability check among reviewers. We will use qualitative description to summarise findings and focus on how research objectives and type of DHIs are aligned with the frameworks used.

Ethics and dissemination

We engaged an advisory panel of digital health knowledge users to provide input at strategic stages of the scoping review to enhance the relevance of findings and inform dissemination activities. Specifically, they will provide feedback on the eligibility criteria, data abstraction elements and interpretation of findings, and will assist in developing key messages for dissemination. This study does not require ethical review. Findings from the review will support decision making when selecting appropriate frameworks to guide the implementation or evaluation of DHIs.

Keywords: protocols & guidelines, quality in health care, statistics & research methods


Strengths and limitations of this study.

  • To our knowledge, this is the first scoping review to identify frameworks to implement or evaluate digital health interventions on a broad scale.

  • The study protocol was informed by rigorous and established methods as suggested by the Joanna Briggs Institute approach for scoping reviews and adheres to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews.

  • Digital health knowledge users, such as policymakers, researchers, clinicians and developers, have been engaged in the design and development of the review since its inception to ensure the relevance and scope of the project.

  • This scoping review will not examine the usability of the frameworks; as such, our findings will be limited to descriptive syntheses.

  • Findings stemming from this review will provide practical guidance for digital health knowledge users and enable them to use evidence informed approaches to select optimal frameworks to implement or evaluate digital health interventions.

Introduction

Frameworks help to systematically organise and link research objectives or constructs, and provide useful insights in quantitative and qualitative analyses, which can inform interpretation or decision making.1 2 The Medical Research Council (MRC) categorises frameworks into four distinct groups: (1) development frameworks, which can model processes and outcomes; (2) feasibility frameworks, which can guide pilot testing of an intervention; (3) implementation frameworks, which guide evidence into clinical practice; and (4) evaluation frameworks, which determine intervention effectiveness.3

A recent scoping review identified over 159 knowledge translation frameworks to guide implementation and evaluation of health interventions in clinical practice settings, presenting a plethora of options for the implementation and evaluation of digital health interventions (DHIs).4 Implementation and evaluation frameworks present an opportunity not only to address whether an intervention works but also to provide actionable insights into how to support its uptake in practice.

DHIs differ from traditional health interventions, such as implementing a new programme or evaluating drug effectiveness. DHIs include any health service or treatment delivered using technology that aims to facilitate, capture or exchange knowledge.5 Examples of DHIs include electronic medical records, mobile applications and wearable sensors for remote monitoring. DHIs are complex, differing in both intended functionality (eg, self-management support vs data sharing) and intended users (eg, patients vs providers). DHIs are not static; rather, the interaction between the technology, end-user, healthcare team and setting is inherently dynamic and can therefore vary substantially over time.6 Given the unique sociotechnical aspects of DHIs, it remains unclear which frameworks can be appropriately applied in this emerging field.

This paper outlines the protocol for a scoping review to identify frameworks to guide the implementation or evaluation of DHIs. Specifically, our objectives are to:

  1. Describe the attributes of existing frameworks that have been used to guide the implementation or evaluation of DHIs.

  2. Identify the proposed role of each framework, including the constructs and mechanisms they target.

  3. Describe how each framework has been applied in primary studies, if applicable.

The results of this review will provide practical guidance and support for researchers, clinicians, policymakers and developers in selecting the most appropriate framework for DHIs, which will support evidence-based approaches in relation to implementation and evaluation efforts.

Methods and analysis

We will conduct a scoping review to comprehensively search the literature, ‘map’ the evidence and identify gaps in the research knowledge base.7 8 The study will be conducted using established methods outlined by the Joanna Briggs Institute reviewers’ manual7 and reporting will conform to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews.9 This protocol is registered on the Open Science Framework (OSF) and is available at https://osf.io/8jydm/. OSF is an open-source platform where researchers can share protocols and data, contributing to the transparency of research.10

Eligibility criteria

Studies reporting on the development or application of frameworks (ie, theoretical or interpretive) to guide implementation or evaluation of DHIs in healthcare will be included. We will use the WHO definition of health, which encompasses physical, mental and social well-being and spans multiple disciplines such as psychology, sociology and the medical sciences.11 A DHI was defined as any health service or treatment delivered using technology that aims to facilitate, capture or exchange knowledge (formally or informally).5 This definition was generated from a search of the literature and consultations with digital health knowledge users, including policymakers, researchers, clinicians and developers. Implementation frameworks will be operationalised according to MRC guidance as frameworks that aim to guide research into practice, which can include development, feasibility and dissemination frameworks.3 Evaluation frameworks will be defined as frameworks that focus on determining the effectiveness of DHIs, which includes measuring outcomes and understanding processes or mechanisms of action.3 No limitations will be placed on user population, comparators, study design, publication status or geographic region, to enhance the comprehensiveness of our results and avoid unintended exclusion of relevant studies. Conference abstracts/proceedings and white papers will be included. We will include studies reported in other languages and use appropriate tools (ie, Google Translate, translation services, contacting authors) to assess inclusion. Commentaries and studies examining mathematical or statistical frameworks will be excluded.

Information sources

An experienced information specialist developed the literature search in consultation with the multidisciplinary research team. The search will be peer reviewed by a second information specialist using the Peer Review of Electronic Search Strategies (PRESS) checklist to ensure the search is comprehensive and maximises the use of appropriate search terms.12

We will search MEDLINE, EMBASE, CINAHL and PsycINFO using keywords such as ‘digital health’ and ‘framework’. Additional search terms were drawn from multiple disciplines such as psychology, nursing, sociology and medicine to ensure comprehensiveness. The databases will be searched from inception to present, and the search strategy is presented in online supplementary appendix 1. We chose not to use the BeHEMoTh (behaviour of interest, health context, exclusions and models or theory) approach,13 which was specified in our OSF registration. Although this approach has been successful in identifying frameworks in knowledge translation,4 it did not prove feasible in our scoping review as it yielded a vast number of citations with limited specificity related to our objectives. We instead used a simplified heuristic, which included identifying DHIs in various healthcare contexts, adding terms for frameworks and excluding records such as animal studies (online supplementary appendix 1).

Supplementary data

bmjopen-2020-037643supp001.pdf (45.8KB, pdf)

The search strategy will be supplemented by a search for grey literature using the checklist suggested by the Canadian Agency for Drugs and Technologies in Health.14 Specifically, we will search for white papers or benefit evaluation studies through health technology assessment agencies such as the Agency for Healthcare Research and Quality and the National Institute for Health and Care Excellence, as well as Canada Health Infoway and other relevant organisations involved in providing guidance on the delivery of healthcare services. We will use keywords such as ‘digital health’, ‘frameworks’ and ‘benefits evaluation’ to refine our supplementary search. In addition, we will scan reference lists of included studies and conduct a forward citation search (ie, examine studies that reference the included studies) in Web of Science using the cited reference search feature. This will ensure our approach is comprehensive.

Eligibility screening process

Citations obtained from the literature search will be uploaded to Covidence,15 a systematic review software program which organises citations, enables screening of citations by multiple reviewers and identifies discrepancies. We will apply a two-step process for identifying relevant citations. At level 1, titles and abstracts will be assessed using the eligibility criteria (online supplementary appendix 2). Studies with abstracts fulfilling criteria will be passed to level 2 where the eligibility criteria will be applied to the full text articles.

Supplementary data

bmjopen-2020-037643supp002.pdf (25.8KB, pdf)

Prior to screening, a pilot test will be completed using a random sample of 10% of citations or full text articles, with the express purpose of assessing agreement between reviewers at each level. Specifically, percent agreement will be used to assess agreement among reviewers (inter-rater reliability ≥80% will be considered adequate). If adequate agreement is not reached, a second pilot will be conducted with another random sample of 10%. A third reviewer will mediate any disagreements. Citations and full text articles will be screened in duplicate by two reviewers.
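To illustrate the agreement check described above, the sketch below computes simple percent agreement between two reviewers’ include/exclude decisions on a pilot sample. The decision lists and the function name are hypothetical placeholders, not data or tooling from this review, which will rely on Covidence and the process described above.

```python
# Minimal sketch (assumed example): percent agreement between two reviewers
# on a pilot screening sample. The decision lists are hypothetical; in practice
# the decisions would come from the screening records in Covidence.

def percent_agreement(reviewer_a, reviewer_b):
    """Return the proportion of citations on which both reviewers made the same decision."""
    if len(reviewer_a) != len(reviewer_b):
        raise ValueError("Both reviewers must screen the same set of citations")
    matches = sum(a == b for a, b in zip(reviewer_a, reviewer_b))
    return matches / len(reviewer_a)

# Hypothetical pilot sample of 10 include/exclude decisions (True = include)
reviewer_a = [True, True, False, False, True, False, True, True, False, True]
reviewer_b = [True, False, False, False, True, False, True, True, False, True]

agreement = percent_agreement(reviewer_a, reviewer_b)
print(f"Percent agreement: {agreement:.0%}")  # 90% in this toy example
```

In this toy example the reviewers disagree on one of ten citations, giving 90% agreement and meeting the ≥80% threshold; a result below the threshold would trigger the second pilot described above.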

Data items and abstraction process

Studies fulfilling the eligibility criteria will be abstracted in Excel. We will extract the following study characteristics for the identified frameworks: name, reference, theory associated with the framework (if applicable), description of its components or constructs and its application in research (or stage of research to which it was applied, if applicable). For studies outlining the application of a framework, additional characteristics will be abstracted, such as the type of DHI, healthcare setting, method of application and the nature and directionality of the results. From included studies, we will abstract information such as the name of the framework, its role in the study (ie, development, feasibility/pilot testing, implementation, evaluation), the components of the framework that were used, the type of DHI, the objective of the study (if applicable) and the healthcare setting. A rough illustration of how these items could be organised is sketched below.
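The sketch below organises the abstraction items above as a structured record written to a CSV file that could be opened in Excel. The field names follow the data items described in this section, but the record structure, file name and example values are assumptions for illustration rather than the review’s actual abstraction template.

```python
# Illustrative abstraction record; field names mirror the data items described above,
# but the structure and example values are assumptions, not the review's actual template.
import csv
from dataclasses import dataclass, asdict, fields
from typing import Optional

@dataclass
class FrameworkRecord:
    framework_name: str
    reference: str
    associated_theory: Optional[str]   # if applicable
    components_or_constructs: str
    role_in_study: str                 # development, feasibility/pilot testing, implementation, evaluation
    dhi_type: Optional[str]            # for application papers
    healthcare_setting: Optional[str]
    study_objective: Optional[str]
    method_of_application: Optional[str]
    results_summary: Optional[str]

# Hypothetical example entry written to a CSV for use in Excel
record = FrameworkRecord(
    framework_name="Example framework", reference="Author et al, 2020",
    associated_theory=None, components_or_constructs="Constructs A and B",
    role_in_study="evaluation", dhi_type="mobile application",
    healthcare_setting="primary care", study_objective="assess effectiveness",
    method_of_application="guided outcome selection", results_summary="to be abstracted",
)
with open("abstraction.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=[fld.name for fld in fields(FrameworkRecord)])
    writer.writeheader()
    writer.writerow(asdict(record))
```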

Methodological appraisal

We will not assess the quality of included articles in the scoping review (consistent with the Joanna Briggs Institute reviewer’s manual7), as our purpose is to gain an overview of frameworks used in relation to DHIs and not to assess the quality of their application.

Ethics and dissemination

This scoping review is focused on published reports and studies of DHIs and does not involve patients or primary data collection; as such, no formal ethics approval is required.

The dissemination plan will be tailored to end-users and will include passive and interactive strategies such as peer-reviewed publications, conference presentations and other networking events with digital health knowledge users. To ensure broader reach, we will also disseminate our findings through social media platforms and public-facing communications, such as one-page briefs released on the Women’s College Institute for Health Innovation website at Women’s College Hospital.

Patient and public involvement

We employed an integrated knowledge translation strategy to engage digital health knowledge users in the review process to ensure the scope of the project met the needs of various end-users. Knowledge users are defined as individuals who are likely to use the findings to inform health decision making.2 A priori, we decided to engage senior leaders and policymakers at organisations that promote or support implementation of digital health solutions, as well as researchers, clinicians and developers evaluating DHIs in real-world settings. An advisory panel of digital health knowledge users was established to provide input at strategic phases of the scoping review.

Potential panelists were identified through organisational networks and were invited to participate via email. Six members (CSG, TS, HCW, JZ, SM, DL) agreed to participate on the advisory panel. Panelists and the research team convened a meeting to discuss the strategic steps and opportunities for involvement and input in the review. Specifically, the advisory panel will support refinement of inclusion criteria and prioritisation of data abstraction elements, assist in the interpretation of findings and help develop dissemination strategies. Panelists have national and international networks that will ensure the scope of the review reflects the knowledge needs of a diverse audience, which is directly in line with the stated aim of providing practical guidance on the selection and application of frameworks for DHIs. As the intended audience of this paper does not include patients and members of the general public, they were not included as part of the advisory panel. The perspectives of patients and the general public will be incorporated through their participation and involvement in the respective studies included as part of this review.

Analysis

Included studies will be summarised using qualitative description, an approach that seeks to create an understanding of a phenomenon by accessing the meanings ascribed by authors.16 Descriptions of individual frameworks will be organised by key categories, including study design, report type (published vs non-published), methodological approach (ie, how the framework is intended to be applied) and application papers (ie, how the framework has been applied in practice). We will then synthesise findings by mapping core components of the frameworks and examining how research objectives and types of DHIs are linked to the frameworks. Categorisation will use language directly from included studies, where possible, and authors will be contacted when information is not present or unclear. The advisory panel will guide the synthesis of findings by providing input on the level of detail abstracted from included articles and on the categorisation of frameworks, where appropriate.

Strengths and limitations

To our knowledge, this is the first scoping review to examine the use of frameworks to guide implementation or evaluation of DHIs on a broad scale. The protocol was generated using established methods for the conduct of scoping reviews and informed by input from digital health knowledge users to define scope and ensure the relevance of the project. A clear understanding of which frameworks can be used for development, feasibility, implementation and evaluation of DHIs will facilitate decision making by making evidence-based approaches available to policymakers, clinicians and developers. Additionally, this guidance will support researchers in identifying appropriate frameworks with the goal of establishing consistency across studies, minimising duplication and accelerating scientific progress.

Given the breadth of this scoping review, we anticipate a few key challenges. The first relates to the inconsistent and often ill-defined nature of DHIs and frameworks. To be inclusive, we have defined DHIs broadly as any health intervention that can be delivered using technology, to ensure we capture frameworks that are currently being used across formal (eg, care delivered within the walls of a healthcare organisation) and informal settings (eg, direct-to-consumer technologies). Moreover, use of the term framework itself creates challenges. For the purposes of this scoping review, we have defined a framework as a tool to systematically organise and link research questions or constructs, but a range of terms are often used synonymously (eg, models or processes). To account for this variability, we will include studies reporting on ‘models’ and work closely with the advisory panel to confirm whether the reported framework aligns with our a priori definition, as well as the needs of relevant digital health knowledge user groups.

Second, we anticipate challenges searching the literature as a product of the inconsistent terminology outlined above. We have constructed our search to balance comprehensiveness and specificity, working closely with an information specialist to ensure the volume of citations is focused and feasible to screen. Several iterations of the literature search were conducted; specifically, we added and removed keywords in a stepwise fashion to understand the impact on the specificity and sensitivity of our search. Through this iterative process, we developed our search strategy, which was then peer reviewed using the PRESS checklist; however, we anticipate additional challenges when screening.

Third, we anticipate challenges arising from poor reporting or limited description, as evidenced by previous studies.17 18 Authors may not provide sufficient details on the frameworks they use or their method of application.19 To mitigate this, we will contact authors to obtain additional information whenever information is missing or unclear.

Finally, we anticipate that some included frameworks will have a dual purpose of addressing implementation and evaluation, or may contain components that lend themselves to both constructs. We will convene with the advisory panel on a quarterly basis to discuss these issues as they arise and will devise the most appropriate plan for analysis through group consensus.

Overall, identification of frameworks will serve as a guide for researchers, clinicians, policymakers and developers of DHIs by providing practical guidance on which frameworks may be most appropriate for which objectives (ie, implementation or evaluation). In parallel, the results will contribute to a more nuanced understanding of how to evaluate and implement DHIs, including the identification and understanding of key constructs.

Acknowledgments

We would like to thank Becky Skidmore and Anne Dabrowski for their assistance with generating the literature search, as well as Beatrice Choremis, who helped screen a few preliminary citations.

Footnotes

Twitter: @lauradesveaux

Contributors: CS, LD conceived and developed the study. CS drafted the manuscript. MC, VK, RSB, TS, SM, DL, HCW, JZ, CSG and LD reviewed and edited the manuscript.

Funding: The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

Competing interests: None declared.

Patient and public involvement: Patients and/or the public were involved in the design, or conduct, or reporting, or dissemination plans of this research. Refer to the Methods section for further details.

Patient consent for publication: Not required.

Provenance and peer review: Not commissioned; externally peer reviewed.

References

  • 1. Tabak RG, Khoong EC, Chambers DA, et al. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med 2012;43:337–50. doi:10.1016/j.amepre.2012.05.024
  • 2. Straus S, Tetroe J, Graham ID. Knowledge translation in health care: moving from evidence to practice, 2014.
  • 3. Craig P, Dieppe P, Macintyre S, et al. Developing and evaluating complex interventions: the new Medical Research Council guidance. Int J Nurs Stud 2013;50:587–92. doi:10.1016/j.ijnurstu.2012.09.010
  • 4. Strifler L, Cardoso R, McGowan J, et al. Scoping review identifies significant number of knowledge translation theories, models, and frameworks with limited use. J Clin Epidemiol 2018;100:92–102. doi:10.1016/j.jclinepi.2018.04.008
  • 5. Murray E, Hekler EB, Andersson G, et al. Evaluating digital health interventions: key questions and approaches. Am J Prev Med 2016;51:843–51. doi:10.1016/j.amepre.2016.06.008
  • 6. Shaw J, Agarwal P, Desveaux L, et al. Beyond "implementation": digital health innovation and service design. NPJ Digit Med 2018;1:48. doi:10.1038/s41746-018-0059-8
  • 7. Aromataris E, Munn Z, eds. Joanna Briggs Institute reviewer's manual. The Joanna Briggs Institute, 2017.
  • 8. Arksey H, O'Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol 2005;8:19–32. doi:10.1080/1364557032000119616
  • 9. Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med 2018;169:467–73. doi:10.7326/M18-0850
  • 10. Foster ED, Deardorff A. Open Science Framework (OSF). J Med Libr Assoc 2017;105. doi:10.5195/JMLA.2017.88
  • 11. World Health Organization. Constitution of the World Health Organization. In: World Health Organization: basic documents. 45th edn, 2005. Available: https://apps.who.int/gb/bd/ [Accessed 8 Jun 2020].
  • 12. McGowan J, Sampson M, Salzwedel DM, et al. PRESS Peer Review of Electronic Search Strategies: 2015 guideline statement. J Clin Epidemiol 2016;75:40–6. doi:10.1016/j.jclinepi.2016.01.021
  • 13. Booth A, Carroll C. Systematic searching for theory to inform systematic reviews: is it feasible? Is it desirable? Health Info Libr J 2015;32:220–35. doi:10.1111/hir.12108
  • 14. Canadian Agency for Drugs and Technologies in Health (CADTH). Grey matters: a practical search tool for evidence-based medicine, 2013. Available: http://www.cadth.ca/resources/grey-matters
  • 15. Covidence, 2019. Available: https://www.covidence.org/home
  • 16. Sandelowski M. What's in a name? Qualitative description revisited. Res Nurs Health 2010;33:77–84. doi:10.1002/nur.20362
  • 17. Glasziou P, Meats E, Heneghan C, et al. What is missing from descriptions of treatment in trials and reviews? BMJ 2008;336:1472–4. doi:10.1136/bmj.39590.732037.47
  • 18. Hoffmann TC, Erueti C, Glasziou PP. Poor description of non-pharmacological interventions: analysis of consecutive sample of randomised trials. BMJ 2013;347:f3755. doi:10.1136/bmj.f3755
  • 19. Breuer E, Lee L, De Silva M, et al. Using theory of change to design and evaluate public health interventions: a systematic review. Implement Sci 2016;11:63. doi:10.1186/s13012-016-0422-6
