BMC Medical Ethics
2025 Oct 8;26:132. doi: 10.1186/s12910-025-01291-5

Bridging ethical gaps in digital health research: a framework for informed consent aligned with NIH guidance

Rahma Rizky Alifia 1, Malihe Sadeghi 1,2, Maheswari Eluru 1, Mohammad Jafari 3, Maria Adela Grando 1
PMCID: PMC12509369  PMID: 41063017

Abstract

Background

Digital health technologies, including mobile applications, wearable devices, and sensors, are rapidly transforming clinical research. However, current informed consent practices often fall short of addressing the unique ethical risks introduced by these technologies. This study aims to develop and assess a comprehensive ethical consent framework to improve transparency, equity, and participant protection in digital health research.

Methods

We developed a consent framework aligned with national research ethics guidance, comprising 63 attributes and 93 subattributes across four domains: Consent, Grantee (Researcher) Permissions, Grantee (Researcher) Obligations, and Technology. We conducted a thematic analysis of the guidance and then reviewed 25 informed consent forms from real-world digital health studies to expand the framework and assess each form's alignment with it. We used descriptive statistics to measure attribute completeness and to identify missing ethical elements.

Results

None of the consent forms fully adhered to all the required or recommended ethical elements, especially those related to technology-specific risks. The highest completeness for the required attributes reached only 73.5%. We also identified four ethically salient consent elements not present in the current national guidance: commercial profit sharing, study information disclosure, during-study result sharing, and data removal requests.

Conclusions

These findings reveal persistent ethical gaps in participant protection and highlight the need for more comprehensive, equity-oriented consent practices. Our framework offers a practical tool to strengthen transparency, autonomy, and justice in digital health research.

Supplementary Information

The online version contains supplementary material available at 10.1186/s12910-025-01291-5.

Keywords: Digital health ethics, Informed consent, Wearable devices, Mobile health, Sensor, Data governance, Participant rights, Clinical research, Technology and ethics, Digital equity

Introduction

Digital health technologies (DHTs) are reshaping healthcare and research landscapes, enhancing real-time data collection and intervention delivery [1–3]. However, ethical frameworks—particularly those governing informed consent—have not evolved at the same pace [1, 4]. Foundational principles from the Belmont Report and the Declaration of Helsinki emphasize informed autonomy, beneficence, and justice [5, 6]. However, these principles are increasingly strained by the complexities of DHTs, especially with respect to data privacy, participant comprehension, and third-party technology involvement [7].

Participants in DHT-based studies often engage with technologies that collect sensitive clinical, genomic, and behavioral data [1, 6]. However, consent forms rarely convey the implications of data reuse, third-party access, or technological limitations [7, 8]. Ethical risk is amplified in global and low-resource settings [9], where digital literacy, language barriers, and infrastructural constraints further hinder meaningful consent [10, 11].

In response, the NIH Office of Science Policy (OSP) issued a guidance document in 2024 outlining key considerations for digital health consent [12]. However, its voluntary nature and limited adoption raise concerns about consistent implementation and protection [13]. Several prior efforts have attempted to develop informed consent frameworks, but these frameworks are largely context-specific and fail to address the unique ethical demands of digital health. Existing frameworks have been tailored to pediatric research, fetal and embryonic tissue collection, genomic research, or humanitarian emergencies [5, 14–16]. The Global Alliance for Genomics and Health (GA4GH), for example, developed the Consent Toolkit to assist in genomic and clinical research, with sample clauses for rare disease studies, pediatric populations, and large-scale initiatives [17]. Similarly, European GDPR-aligned toolkits emphasize legal compliance more than ethical transparency or participant understanding [18]. While some studies have explored methodological innovations such as dynamic consent [6, 19–21] or focused on improving readability and comprehension [22], these approaches do not offer a comprehensive framework tailored to the digital health context.

Therefore, this study seeks to address those gaps through the development and evaluation of a comprehensive, ethically grounded consent framework that extends NIH guidance.

Methodology

Ethically grounded framework development

The researchers conducted a qualitative analysis to develop the initial version of the informed consent framework, using the NIH OSP guidance document titled Informed Consent for Research Using Digital Health Technologies: Points to Consider & Sample Language (2024) [12] as the initial source. The extraction process followed the six-step approach of thematic analysis, with some modifications [23]:

  1. Data Familiarization: The researchers reviewed the guidance document several times to understand the main ideas, structure, and intentions of the data. This step helped identify where important guidance and examples appeared across different sections, including the Introduction, Procedures, Data Sharing and Ownership, Potential Risks, and Withdrawal.

  2. Code generation: The researchers examined each segment of the text, ranging from multiple sentences to sample phrases, for relevant meaning and labeled it with descriptive codes. These codes reflected specific consent-related requirements, expectations, or values. For example, the section Sample Language Components: Introduction stating, “Include a clear statement about how the technology is being used to address the study aims, whether the technology has been approved by the FDA for its intended use, and if the efficacy of the technology is being studied,” was decomposed into three distinct requirements: (a) an explanation of how the technology addresses study aims, (b) whether the technology has FDA approval for its intended use, and (c) whether its efficacy is being evaluated. Then, those requirements were coded into (a) technology purpose, (b) regulatory approval, and (c) technological efficacy evaluation.

To reduce researcher bias during coding, independent coding rounds were conducted, followed by consensus-building sessions. A third team member reviewed and validated the coding decisions. All codes were developed through a structured process, using predefined criteria informed by foundational ethical frameworks. Specifically, these criteria were derived from sources such as the Belmont Report [24] and peer-reviewed literature on digital autonomy, data stewardship, and risk-benefit transparency [25]. This approach helped ensure consistency and ethical rigor throughout the coding procedure.

  3. Attribute, subattribute, and theme formation: After coding, the researchers grouped related codes and merged them into broader attributes or subattributes. Each attribute represented a major consent topic, whereas subattributes captured more detailed parts within that topic. For example, codes related to the company’s contact for data questions, the physical or cloud location of participant data, and data security procedures were all categorized as subattributes under the attribute Data Storage. Attributes such as data storage and information confidentiality were further grouped under higher-level themes, such as grantee obligations, reflecting responsibilities that researchers must fulfill.

  4. Attribute Reviews: We assessed whether ethically relevant consent information was missing from the framework. The research team deliberated on whether specific topics, especially those underrepresented in the NIH guidance, should be added to the framework, and developed additional attributes through consensus.

  5. Significance Assessment: We also examined each attribute and subattribute for its importance and relevance to digital health research. This helped us decide whether to keep, combine, or remove items on the basis of their value in the context of studies using DHTs (wearable devices, sensors, and mobile health applications).

  6. Findings Reporting: The results of this process became Framework Version 1, which included attributes and subattributes organized under themes. Each attribute and subattribute is also complemented with cardinality, description, and technology relevance information based on the research team’s judgment. The research team then finalized this version and used it for testing against real-world ICFs.

Informed consent form collection

To evaluate and refine the framework, we collected and screened ICFs from clinical trials utilizing digital health technologies. This phase of the research mirrored a systematic review approach, following clearly defined inclusion criteria and reproducible search strategies.

The study retrieved ICFs from ClinicalTrials.gov [26] using a set of keyword combinations representing digital health modalities (digital health technology, wearable devices, mobile apps, sensor technology, eHealth, and mHealth) and was limited to studies that met the following criteria: (a) conducted within the United States; (b) active, recruiting, or completed status; (c) start date of January 1, 2019, or later; and (d) publicly available ICFs. We collected and selected ICF documents to ensure variation in technology type and study design. Details of each study and the corresponding ClinicalTrials.gov identifiers are provided in Supplementary File 3.
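The paper does not describe the retrieval mechanics in detail; the sketch below shows one way such a keyword search could be scripted against the public ClinicalTrials.gov API. The endpoint and parameter names (`query.term`, `query.locn`, `filter.overallStatus`, `filter.advanced`) reflect the v2 API as commonly documented, but their exact use here is an assumption, not the study's actual procedure; no network request is made.

```python
from urllib.parse import urlencode

# Keyword combinations from the search strategy (digital health modalities).
KEYWORDS = [
    "digital health technology", "wearable devices", "mobile apps",
    "sensor technology", "eHealth", "mHealth",
]

def build_search_url(keyword: str) -> str:
    """Build a ClinicalTrials.gov API v2 query URL for one modality keyword.

    Parameter names and filter syntax are assumptions based on the public
    v2 API; adjust against the official API documentation before use.
    """
    params = {
        "query.term": keyword,
        # (a) US-based studies
        "query.locn": "United States",
        # (b) status filter (assumed comma-separated status codes)
        "filter.overallStatus": "RECRUITING,ACTIVE_NOT_RECRUITING,COMPLETED",
        # (c) start date of January 1, 2019, or later (assumed Essie syntax)
        "filter.advanced": "AREA[StartDate]RANGE[2019-01-01,MAX]",
    }
    return "https://clinicaltrials.gov/api/v2/studies?" + urlencode(params)

urls = [build_search_url(k) for k in KEYWORDS]
```

Each URL would then be fetched and the resulting studies screened manually against criterion (d), since the availability of a public ICF document cannot be filtered automatically.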

Ethically grounded framework iterative testing and refinement

After we developed Framework Version 1, which includes attributes extracted from NIH guidance and attributes generated through researchers’ interpretations and consensus, we iteratively refined and expanded the framework by testing it on the ICFs collected. This process aimed to improve the framework's clarity, relevance, and completeness on the basis of how consent elements are written in practice.

One researcher (RRA) reviewed the first five ICFs to identify any new attributes not covered by the framework. In this review, RRA modified some attribute names to better match the language used in digital health studies. Two other researchers (MS and MAG) reviewed the updated version, focusing on improving the grouping of attributes and ensuring that the descriptions were consistent with the NIH guidance and real-world consent forms.

The refinement continued in multiple rounds. In each round, the researchers analyzed five more ICFs and added new attributes if they were found in the documents but not yet covered in the framework. When different ICFs used various terms to describe the same concept, we adjusted the framework to include common variations. The research team discussed all the changes, and then reached final decisions through consensus.

This process continued until thematic saturation was reached—the point at which no new attributes appeared in the reviewed ICFs [27]. At this point, we considered the framework a complete theoretical framework for informed consent in digital health studies. The final framework consolidated findings from three sources: (1) attributes coded from the NIH guidance, (2) attributes identified through analysis of the ICFs, and (3) refinements proposed by the research team. These were presented in structured tables indicating their origin (source), cardinality, and category.

In summary, Fig. 1 presents the end-to-end methods used in this study.

Fig. 1.

Fig. 1

Flow diagram of ICFs collection and selection process for framework refinement

Results

Initial ethically grounded framework (Framework Version 1)

The NIH guidance document organizes consent elements into seven primary sections: Introduction, Procedures, Data Sharing and Ownership, Potential Risks, Potential Benefits, Costs, and Withdrawal. From this guidance, we extracted 98 distinct requirements relevant to digital health consent and categorized them (see Supplementary File 1). We described each requirement in five columns:

  1. Section: The heading in the NIH document under which the statement or sentence appears

  2. No: An identifier for a requirement or subrequirement

  3. Proposed Attribute: A title representing the intent of the requirement

  4. Parent Category: Applied only to subattributes, indicating which attribute serves as the general classification of the subattribute(s)

  5. Source: An indication of where the requirement/attribute was found, whether in the NIH guidance, the researchers’ deliberation, or both

We then translated these requirements into structured attributes and subattributes. Through deliberation, the researchers also defined additional attributes not covered in the guidance. The resulting framework, referred to as Framework Version 1 (see Supplementary File 2), organized attributes into three primary groups:

  • Consent: Outlines the fundamental aspects of research participation, such as study purpose, benefits, compensation, and voluntary participation. This group has 47 attributes and 44 subattributes.

  • Grantee Permissions: Describes the rights granted to a grantee (researcher) by a grantor (study subject) regarding procedures and data, including access, usage, and sharing. It has 7 attributes and 15 subattributes.

  • Grantee Obligations: Includes the responsibilities researchers must uphold, such as maintaining data security, ensuring compliance with consent terms, and providing participants with mechanisms to withdraw consent. This category contains 10 attributes and 13 subattributes.

Informed consent forms overview

The study team collected 45 ICFs and analyzed 25 of them [26]. These forms represented a range of digital health studies utilizing mobile apps, wearable devices, or sensor technologies. Every ICF incorporated at least one digital health technology, and several studies included multiple types, enabling comprehensive testing of the framework across various contexts. Sensor technologies appeared in 18 ICFs, and mobile apps and wearable devices each appeared in 17 studies. Ten studies used all three technologies, and the remainder used different combinations: sensors and mobile apps (5 ICFs), sensors and wearable devices (2 ICFs), mobile apps only (5 ICFs), and sensors only (3 ICFs).

Final ethically grounded framework

By iteratively comparing Framework Version 1 with ICFs, the researchers expanded the framework and then reached thematic saturation after reviewing the 25th ICF. Based on this analysis, we introduced new consent elements and created a fourth group, Technology, to address topics unique to DHTs, such as how the technology affects participant routines, responsibilities for device damage, and device proprietary concerns. The final framework included 63 attributes and 93 subattributes distributed across Consent, Grantee Permissions, Grantee Obligations, and Technology groups.

We compiled all the attributes into three structured tables (Supplementary Files 4–5): (a) an attribute list with cardinality and definitions; (b) attribute presence across the 25 ICFs; and (c) attribute sources, with three columns: NIH guidance, ICFs, and recommendation. In total, 78 attributes/subattributes were sourced from the NIH guidance, 32 originated in the ICFs, and the remaining 46 were marked as recommendations. A recommendation indicates the researchers' suggestion of an attribute that should be included in consent templates but is not mentioned in either the NIH guidance or the ICFs.

Ethically grounded framework testing results

The comparison between real-world ICFs and the theoretical framework revealed several key findings. First, many attributes and subattributes were frequently missing across forms, and none of the 25 ICFs contained the full set of required or recommended elements from the framework. When considering only mandatory attributes, those with cardinality starting with “1..” (e.g., 1..1, 1..*, etc.), ICF 4 had the highest level of completeness, covering 50 out of 68 (73.5%) attributes/subattributes. For the NIH-recommended elements, ICF 3 performed best, with 44 out of 92 (48%) present. When focusing specifically on attributes that are both mandatory and NIH-recommended, ICF 4 again ranked highest, covering 30 out of 47 (63%) attributes.
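The completeness calculation described above can be sketched in a few lines: an attribute is mandatory when its cardinality string starts with “1..”, and an ICF's completeness is the share of mandatory attributes it covers. The attribute names and presence values below are invented for illustration; only the cardinality convention comes from the paper.

```python
def completeness(presence: dict[str, bool], cardinality: dict[str, str]) -> float:
    """Fraction of mandatory attributes (cardinality '1..*' or '1..1')
    present in one ICF's attribute-presence map."""
    mandatory = [a for a, c in cardinality.items() if c.startswith("1..")]
    covered = sum(presence.get(a, False) for a in mandatory)
    return covered / len(mandatory)

# Toy example with hypothetical attribute names: two mandatory attributes,
# one optional; the ICF covers one of the two mandatory ones.
cardinality = {"Consent copy": "1..1", "Technology risk": "1..*", "Optional note": "0..1"}
presence = {"Consent copy": True, "Technology risk": False, "Optional note": True}
print(completeness(presence, cardinality))  # 0.5
```

Under this scheme, the best-performing form (ICF 4) scores 50/68 ≈ 0.735, matching the 73.5% reported above.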

Figures 2 and 3 display the presence (green) and absence (gray) of consent attributes across 25 ICFs, differentiating between all attributes in the framework (Fig. 2) and those specifically recommended by the NIH guideline for digital technology (Fig. 3). In Fig. 2, foundational consent elements—typically located in the upper rows—are frequently present across ICFs, indicating broad coverage of standard ethical components. In contrast, technology-specific and digital-context attributes found in the lower half of the framework are often missing.

Fig. 2.

Fig. 2

Presence of 63 Framework Attributes and 93 Framework Subattributes in 25 ICFs

Fig. 3.

Fig. 3

Presence of 36 NIH-recommended Digital Technology Attributes and 55 Subattributes in 25 ICFs

Figure 3 narrows the focus to NIH-recommended attributes, highlighting that even these prioritized elements are not consistently documented. Several ICFs show notable gaps, especially in attributes related to (32.a) data breaches, (32.c) reidentification risks, and (60) DHT risks. Across both figures, we observe substantial variation in overall completeness: while some ICFs (e.g., ICF 4, ICF 17, and ICF 19) include more than 80 attributes/subattributes, others (e.g., ICF 11, ICF 14, and ICF 24) include fewer than 60. Most fall within the range of 60–80 attributes/subattributes out of 156, reflecting inconsistent application of digital health consent elements, even among studies using similar types of technologies.

Second, more than 25 attributes/subattributes were missing in more than 90% of the ICFs, including those recommended by the NIH for studies involving digital technology. Within the Technology attributes alone, 17 NIH-recommended elements were found in no more than two ICFs. These include (41) Activity impact of use, (43) After study, service access, (44) Damage (or) loss recovery, (45.a) Data collection using technology: Additional information collected, and (47) End-user agreements and its subattributes: (47.a) Data use notice, (47.b) Documents link (or) attachment, (47.c) Controller, (47.d) Updates (or) changes notice, and (47.d.1) Risk of update. Other rarely observed elements include (55.a) Technology cost: Duration of cost, (55.b) Technology cost: Responsibility for cost, (57.a) Technology discontinuation: Data and software removal, (60.a) Technology risk (or) discomfort: Data access and linking technologies, (60.b) Technology risk (or) discomfort: Technology integration, (61.b) Technology selection, and (63.b) Technology use: Mandatory status.

In contrast, a small number of nondigital technology-related consent elements were present in all the ICFs. These include attributes such as (4) Compensation, (5) Consent, (6) Contact for participant rights or issues, (13.a) Study team contact information, (17) Voluntary participation, and (31) Consent copy. Among the NIH-recommended elements, (25) Procedure and (25.j) Test article were the most consistently present. This contrast reveals a distinct implementation gap: while foundational ethical elements are routinely disclosed, digital health–specific elements, particularly those addressing technological risk, data governance, and ongoing participant rights, are systematically underrepresented.

The analysis also compared missing attributes by the type of DHT involved in each study. As shown in Fig. 4, the median percentage of missing attributes was similar across the three categories: wearable devices (55.6%), sensor technologies (56.2%), and mobile apps (56.2%). The interquartile ranges and overall spread were also comparable, indicating that no single technology consistently demonstrated better or worse attribute completeness.

Fig. 4.

Fig. 4

Percentage of missing attributes/subattributes by the number and type of technologies used

When comparing study start dates, studies initiated after the NIH guideline release (May 2024) showed slightly improved attribute completeness. However, the limited number of postguideline studies restricts the ability to draw firm conclusions. Only two ICFs (ICF 4 and ICF 12) started after May 2024; ICF 12 had a small positive net percentage (+5.1%), while ICF 4 still had more missing attributes than present attributes (–17.9%). Additionally, most preguideline ICFs had more missing attributes than present attributes, with several exhibiting net deficits of over 25%.

Lastly, as shown in Fig. 4 in purple-colored text, this study revealed four technology-related consent items that appeared in real-world ICFs but were not included in the NIH guidance: (30) Commercial profit sharing, (35) Disclosure of study information, (36) During-study result sharing, and (37.d) Data removal requests. The second and third items were mentioned in several ICFs, while (30) Commercial profit sharing and (37.d) Data removal requests were rarely included.

Discussion

This study reveals persistent ethical shortfalls in current informed consent practices for digital health research. Despite the availability of NIH guidance, most ICFs failed to meet foundational ethical standards for transparency, informed autonomy, and equitable burden disclosure.

Respect for autonomy and comprehension

A central ethical failure was the widespread omission of technology-specific details. Participants were frequently not informed about the following:

  • Who controls their data

  • How long data will be retained or accessible

  • Their ability to access or delete data after study completion

  • Financial obligations associated with technology use

Such omissions undermine participants’ capacity to make informed choices and violate the principle of respect for persons—a core tenet of biomedical ethics and international ethical guidelines such as the Declaration of Helsinki. These omissions are likely rooted in institutional template limitations, gaps in researchers’ digital literacy, and the absence of standardized disclosure practices for emerging technologies. Incorporating modular, technology-specific sections into ICFs and strengthening training in digital ethics may help mitigate these issues.

Justice and risk distribution

Several missing attributes reflected a failure to equitably distribute the burdens of participation. For example, few forms disclosed who would bear costs for device damage or whether participants could continue accessing services poststudy. This lack of clarity may disproportionately affect vulnerable populations, including those in low-resource settings or with limited digital literacy or language proficiency. Language barriers and lack of technical skills can significantly hinder individuals’ ability to comprehend informed consent materials and participate meaningfully in digital health research. Addressing these issues is essential to prevent exclusion and ensure equity.

The ethical principle of justice requires transparency in these matters to avoid undue exploitation or unacknowledged burdens.

Ethical accountability in vendor relationships

A unique finding of this study is the degree to which ethical accountability may be obscured by third-party technology partnerships. Consent documents often did not include information about terms of service, data sharing with vendors, or who the data controller was—attributes linked to third-party apps and devices. This lack of visibility shifts ethical responsibility away from the researcher without participant awareness or consent.

Such “outsourced” accountability creates a significant ethical blind spot in data governance and undermines participant trust.

Institutional and structural barriers

The consistency of missing attributes across technologies suggests that the shortcomings stem more from institutional practices—such as outdated templates or limited IRB awareness—than from the technologies themselves. Addressing these gaps will require system-level changes, including updated institutional guidance, staff training, and redesigned consent templates that incorporate digital-specific ethics.

Strengthening ethical guidance and policy

While the NIH guidance provides a foundation, our results suggest that it is insufficiently comprehensive. The identification of four technology-relevant elements not currently included in NIH recommendations—particularly profit sharing, interim result sharing, public study disclosure, and data removal rights—highlights the evolving ethical expectations around participant agency and benefit sharing.

Moreover, recent international efforts such as the Global Patient co-Owned Cloud (GPOC) initiative have emphasized co-ownership models that address patient autonomy, cross-border ethics, and legislative diversity [28–32]. These frameworks propose consent as a human rights mechanism, offering broader protections in transnational contexts and highlighting the ethical complexity of data commercialization [33]. Additionally, decentralized technologies, most notably blockchain, have shown promise in improving transparency, consent management, and data security for personal health records (PHRs), pointing toward scalable solutions in global digital infrastructures [34].

Future iterations of federal or international consent guidance should address these areas to ensure ethically robust consent practices that align with digital realities.

Limitations

The first limitation is that the analysis focused solely on ICFs from U.S.-based studies, which may limit generalizability to international contexts with different regulatory, ethical, and data protection standards. Second, the study included only DHTs defined in the NIH guidance, which are mobile health applications, wearable devices, and sensor technologies. It did not examine studies involving other emerging digital tools, such as artificial intelligence (AI), virtual reality (VR), augmented reality (AR), or implantable technologies. These newer modalities may introduce unique consent challenges that require additional framework extension.

Furthermore, because the framework aligns with NIH guidance focused on DHTs, it may not fully address ethical considerations in studies that integrate digital components with traditional biomedical interventions. Fourth, the small number of studies initiated after the release of the NIH guideline in May 2024 limits the ability to assess the real compliance of ICFs and the impact of the NIH guideline release. Finally, our analysis relied on publicly available ICFs from ClinicalTrials.gov, which may not fully capture verbal explanations or supplementary materials provided during the consent process.

Despite these limitations, the study offers a robust, actionable, and ethically grounded framework to enhance ethical transparency, participant protection, and institutional accountability in digital health research.

Toward implementation and future application

To translate this framework into practice, future research should pilot its application in active digital health studies to assess its impact on ethical completeness and participant understanding. Feedback from IRB members, investigators, and study staff can further evaluate the framework’s clarity, usability, and integration into review processes.

To support broader adoption, the framework should be converted into user-friendly tools, such as annotated consent templates, training modules, and interactive guides. It may also serve as a consent design checklist or be embedded within e-consent platforms. Encoding the framework in machine-readable formats (e.g., HL7 FHIR) could facilitate integration with automated review systems and regulatory templates.
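As one concrete illustration of such machine-readable encoding, the sketch below shows how a single framework attribute might be expressed as a provision in an HL7 FHIR R4 Consent resource. This is a hypothetical mapping, not a validated FHIR profile: the extension URL is a placeholder, and the coding choices are illustrative only.

```python
import json

# Illustrative only: one possible representation of the framework attribute
# "(37.d) Data removal requests" inside a FHIR R4 Consent resource.
# The extension URL below is a made-up placeholder, not a registered one.
consent = {
    "resourceType": "Consent",
    "status": "active",
    "scope": {
        "coding": [{
            "system": "http://terminology.hl7.org/CodeSystem/consentscope",
            "code": "research",
        }]
    },
    "provision": {
        "type": "permit",
        "period": {"start": "2025-01-01", "end": "2026-01-01"},
        # Hypothetical extension flagging which framework attribute this
        # provision documents, so automated review tools could check coverage.
        "extension": [{
            "url": "https://example.org/fhir/consent-framework-attribute",
            "valueString": "37.d Data removal requests",
        }],
    },
}

print(consent["resourceType"])  # Consent
document = json.dumps(consent)  # serialized form an e-consent platform might store
```

An automated reviewer could then scan a study's Consent resources for the set of framework attribute identifiers and report the same completeness gaps this study measured manually.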

Conclusion

This study underscores significant ethical gaps in current informed consent practices for digital health research. Despite the NIH efforts to modernize guidance, most consent forms fail to disclose essential technology-related risks, responsibilities, and participant rights. Our proposed framework provides a practical, ethically grounded tool to support more transparent, equitable, and participant-centered consent processes. To uphold core principles of autonomy, justice, and accountability in the digital age, ethics review boards, investigators, and policymakers must adopt consent practices that fully reflect the complexities of emerging technologies.

Supplementary Information

Supplementary Material 1. (35.6KB, docx)

Acknowledgements

Not applicable.

Abbreviations

AI

Artificial Intelligence

AR

Augmented Reality

DHT

Digital Health Technology

GA4GH

Global Alliance for Genomics and Health

GDPR

General Data Protection Regulation

ICFs

Informed Consent Forms

NIH

National Institutes of Health

NIH OSP

NIH Office of Science Policy

VR

Virtual Reality

Authors’ contributions

R.R.A. conducted the formal analysis, data curation, investigation, and visualization, wrote the original draft of the manuscript, and refined the manuscript. M.A.G. conceptualized and designed the study, contributed to the methodology and validation, provided resources, supervised the project, managed project administration, acquired funding, and contributed to reviewing and editing the manuscript. M.S. contributed to the methodology, validation, investigation, supervision, and manuscript review and editing. M.E. and M.J. contributed to providing resources for the study. All authors have read and approved the final version of the manuscript.

Funding

This work was supported by the National Institute on Drug Abuse (R01 DA056984-06A1), awarded through SHARES (Substance use HeAlth REcord Sharing). M.A.G. is the principal investigator of SHARES. The funder had no role in the design of the study, data collection, analysis, interpretation, or writing of the manuscript.

Data availability

All data analyzed during this study consist of publicly available informed consent documents and published guidelines available through these links: https://osp.od.nih.gov/ and https://clinicaltrials.gov/. The outcomes of the study are all available and shared through this manuscript.

Declarations

Ethics approval and consent to participate

This study did not involve human participants, identifiable private data, or human biological material. This study was deemed exempt by the Arizona State University Institutional Review Board. All data analyzed during this study consist of publicly available informed consent documents and published guidelines available through https://osp.od.nih.gov/ and https://clinicaltrials.gov/. No consent to participate was needed.

Consent for publication

Not applicable. No participants were recruited, and no individual-level data were collected. Therefore, informed consent was not required.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Associated Data


Supplementary Materials

Supplementary Material 1. (35.6KB, docx)



Articles from BMC Medical Ethics are provided here courtesy of BMC
