Journal of Medical Internet Research
. 2024 Nov 15;26:e57628. doi: 10.2196/57628

Human Factors, Human-Centered Design, and Usability of Sensor-Based Digital Health Technologies: Scoping Review

Animesh Tandon 1,2,3, Bryan Cobb 4, Jacob Centra 5, Elena Izmailova 6, Nikolay V Manyakov 7, Samantha McClenahan 5, Smit Patel 5, Emre Sezgin 8, Srinivasan Vairavan 9, Bernard Vrijens 10, Jessie P Bakker 5,11,12,; Digital Health Measurement Collaborative Community (DATAcc) hosted by DiMe13
Editor: Taiane de Azevedo Cardoso
Reviewed by: Shubin Yu, Michael Nissen
PMCID: PMC11607562  PMID: 39546781

Abstract

Background

Increasing adoption of sensor-based digital health technologies (sDHTs) in recent years has cast light on the many challenges in implementing these tools into clinical trials and patient care at scale across diverse patient populations; however, the methodological approaches taken toward sDHT usability evaluation have varied markedly.

Objective

This review aims to explore the current landscape of studies reporting data related to sDHT human factors, human-centered design, and usability, to inform our concurrent work on developing an evaluation framework for sDHT usability.

Methods

We conducted a scoping review of studies published between 2013 and 2023 and indexed in PubMed, in which data related to sDHT human factors, human-centered design, and usability were reported. Following a systematic screening process, we extracted the study design, participant sample, the sDHT or sDHTs used, the methods of data capture, and the types of usability-related data captured.

Results

Our literature search returned 442 papers, of which 85 papers were found to be eligible and 83 papers were available for data extraction and not under embargo. In total, 164 sDHTs were evaluated; 141 (86%) sDHTs were wearable tools while the remaining 23 (14%) sDHTs were ambient tools. The majority of studies (55/83, 66%) reported summative evaluations of final-design sDHTs. Almost all studies (82/83, 99%) captured data from targeted end users, but only 18 (22%) out of 83 studies captured data from additional users such as care partners or clinicians. User satisfaction and ease of use were evaluated for 83% (136/164) and 91% (150/164) of sDHTs, respectively; however, learnability, efficiency, and memorability were reported for only 11 (7%), 4 (2%), and 2 (1%) out of 164 sDHTs, respectively. A total of 14 (9%) out of 164 sDHTs were evaluated according to the extent to which users were able to understand the clinical data or other information presented to them (understandability) or the actions or tasks they should complete in response (actionability). Notable gaps in reporting included the absence of a sample size rationale (reported for 21/83, 25% of all studies and 17/55, 31% of summative studies) and incomplete sociodemographic descriptive data (complete age, sex/gender, and race/ethnicity reported for 14/83, 17% of studies).

Conclusions

Based on our findings, we suggest four actionable recommendations for future studies that will help to advance the implementation of sDHTs: (1) consider an in-depth assessment of technology usability beyond user satisfaction and ease of use, (2) expand recruitment to include important user groups such as clinicians and care partners, (3) report the rationale for key study design considerations including the sample size, and (4) provide rich descriptive statistics regarding the study sample to allow a complete understanding of generalizability to other patient populations and contexts of use.

Keywords: digital health, remote, decentralized, sensors, connected care, usability, ergonomics, human-centered design, user experience, systematic scoping review, human factors, screening, clinicians, wearable, mobile phone

Introduction

Sensor-based digital health technologies (sDHTs), defined as connected digital medicine products that process data captured by mobile sensors using algorithms to generate measures of behavioral and/or physiological function [1], have been increasingly adopted in both research and health care in recent years [2,3]. sDHTs include products designed to capture data passively (such as continuous glucose monitors and wearables for monitoring sleep) or during active tasks (such as mobile spirometry or smartphone-based cognitive assessments) from wearable, implantable, ingestible, or ambient tools. Implementation of sDHTs requires interactions across the hardware containing the sensor or sensors, the software that is used to convert sensor data to health-related measures, and the users (who could be consumers, patients, clinicians, and more) who interact at one or more stages of data capture. Given this complexity and the increasing use of sDHTs, defining and understanding best practices for human factors, human-centered design, and usability (defined in Textbox 1) of sDHTs is a critical need. Although regulatory guidance focused on the usability of medical devices is well established, sDHTs require unique consideration because (1) sDHTs used in clinical research studies for data capture may or may not be regulated medical devices [4], (2) research participants likely have different motivations and needs related to their use of the technology, (3) sDHTs are often used over much longer time periods in research compared with health care settings, and (4) digital measures captured in large studies may be analyzed with limited human oversight or clinical interpretation.

Definitions.

Human factors

  • The application of knowledge about human behavior, abilities, limitations, and other characteristics of users to the design and development of a sensor-based digital health technology (sDHT) to optimize usability within a defined intended use or context of use. This definition incorporates terminology and concepts from the US Food and Drug Administration (FDA) [20], the UK Medicines and Health Care Products Regulatory Agency (MHRA) [21], and the National Medical Products Administration (NMPA) of China (translated) [22].

Human-centered design

  • An approach to interactive systems that aims to make systems usable and useful by focusing on the users, their needs and requirements, and by applying human factors and usability knowledge and techniques, as defined in the International Organization for Standardization (ISO) 9241-210:2019 standard [23].

Usability

  • The extent to which an sDHT can be used to achieve specified goals with ease, efficiency, and user satisfaction within a defined intended use or context of use. This definition incorporates terminology and concepts from the FDA [20], the MHRA [21], the NMPA (translated) [22], and ISO 9241-210:2019 [23].

The methodological approaches taken toward sDHT usability evaluation have varied substantially [5,6], casting light on the many challenges in implementing these tools into clinical trials and patient care at scale across diverse patient populations [7,8]. For example, some studies have adopted questionnaires developed for products and systems other than sDHTs [9], while others have described the approach to participatory design alongside qualitative data capture [10]. Inadequate attention to human-centered design and usability testing approaches can hinder the evaluation of health care interventions, contribute to insufficient adoption, perpetuate health disparities, increase costs, and potentially introduce safety risks [11-14]. Thus, integrating human factors considerations in the design, development, and evaluation of sDHTs is critical to improving their likelihood of being adopted and properly utilized in a way that is safe, effective, inclusive, and optimizes the user experience.

While several systematic reviews have focused on understanding and quantifying the usability of digital health products for specific applications [15-19], their focus has primarily been on study outcomes rather than evaluating methodological approaches. Recognizing the urgency of addressing sDHT usability-related challenges, a precompetitive collaboration within the Digital Health Measurement Collaborative Community (DATAcc) hosted by the Digital Medicine Society (DiMe) undertook a scoping review to highlight studies that have performed a usability-related evaluation for sDHTs, outline the dimensions of usability data that were assessed, and highlight the methods of usability evaluation. Our objective was to explore the current landscape and identify gaps, which will inform the development and dissemination of recommendations and an evidence-driven evaluation framework of sDHTs as being fit for purpose from a usability perspective.

Methods

Overview

We followed the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines for scoping reviews (Multimedia Appendix 1) [24]. As a scoping review, this work did not meet the criteria for registration on PROSPERO [25]. The protocol is available from the corresponding author.

Literature Search

We completed our literature search in PubMed using search terms designed in 6 layers as follows (terms within each layer were separated by the Boolean operator “OR”, while the layers themselves were separated using “AND” or “NOT”): (1) Medical Subject Heading (MeSH; [26]) term for human participants; (2) MeSH terms related to sDHTs, such as wearable electronic devices and digital technology; (3) keywords related to sDHTs such as wear* (asterisk indicates truncation), remote, and connected; (4) keywords related to human-centered design, usability, human factors, and ergonomics; (5) exclusion of out-of-scope publication types such as editorials and case reports; and (6) published between January 1, 2013, and May 30, 2023. The complete search string is provided in Table S1 in Multimedia Appendix 2.

To avoid potentially overlooking novel or emerging technologies, the search terms did not include descriptions of specific sensor types (such as accelerometer), form factors (such as watch), methodology (such as actigraphy), wear location (such as wrist), or technology make or model.
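As an illustration of how such a layered query can be assembled, the sketch below joins the terms within each layer with "OR" and combines the layers with "AND" or "NOT", as described above. The terms shown are placeholders only, not the actual search terms; the complete search string is provided in Table S1 in Multimedia Appendix 2.

```python
# Illustrative sketch of a 6-layer Boolean PubMed query.
# All terms below are hypothetical examples, not the study's search string.

def or_group(terms):
    """Join the terms within one layer with OR and wrap in parentheses."""
    return "(" + " OR ".join(terms) + ")"

include_layers = [
    ["humans[MeSH]"],                                              # 1: human participants
    ["wearable electronic devices[MeSH]", "digital technology[MeSH]"],  # 2: sDHT MeSH terms
    ["wear*", "remote", "connected"],                              # 3: sDHT keywords
    ["usability", "human factors", "ergonomics", "human-centered design"],  # 4
]
exclude_layer = ["editorial[pt]", "case reports[pt]"]              # 5: out-of-scope types
date_layer = '("2013/01/01"[dp] : "2023/05/30"[dp])'               # 6: publication dates

# Layers 1-4 are combined with AND, layer 5 is subtracted with NOT,
# and the date range is added with a final AND.
query = " AND ".join(or_group(layer) for layer in include_layers)
query += " NOT " + or_group(exclude_layer)
query += " AND " + date_layer
print(query)
```

The same pattern extends to any number of layers; only the grouping function and the connective between layers change.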

Study Selection

We systematically screened publications identified in the literature search based on the PICO (patients/participants; intervention; comparator; outcomes) eligibility criteria outlined in Table 1, designed to identify studies describing the incorporation of knowledge about human behavior, abilities, limitations, and other characteristics of users to the design and development process; human-centered design; and ease of use, efficiency, or user satisfaction of sDHTs. Studies reporting sDHT adherence (eg, average wear time) or measurement success metrics (eg, percentage of in-range measurements obtained) were considered out of scope unless they reported one of the aforementioned concepts.

Table 1.

Study selection eligibility criteria.

PICOa frameworkb Eligibility criteria
Patient or participant
  • Exclude studies that do not report data collected from human participants

Intervention
  • Exclude studies that do not assess a specific sDHTc, defined according to the definition of BioMeT in the V3 frameworkd:

    • Connected

      • Interpreted as a digital method of data transfer from the sDHT to the location of data analysis, either wired or wireless

    • Mobile

      • Interpreted as the tool being capable of collecting data in the out-of-clinic setting, although the study may have deployed the tool in clinic

    • Sensor-based

      • Interpreted as the tool containing at least one sensor sampling a physical construct such as acceleration, light, or temperature

      • Used for purposes of measurement, diagnosis, and/or treatment of a behavioral or physiological function

Comparator
  • N/Ae

Outcome or outcomes
  • Exclude studies that do not report data on human factors, human-centered design, or usability (see Textbox 1 for definitions)

aPICO: patients/participants; intervention; comparator; outcomes.

bThe PICO framework is described by Eriksen and Frandsen [29].

csDHT: sensor-based digital health technology.

dThroughout this review, we refer to “sensor-based digital health technology” (sDHT); however, this was operationalized according to the definition of “biometric monitoring technology” (BioMeT) as described in Goldsack et al [1].

eNot applicable.

Two independent investigators (JC and JPB) began by screening a random selection of 20% of publications; disagreements were resolved by consensus, and clarifications were made to the wording of the eligibility criteria to reduce ambiguity. The same two investigators then reviewed another random selection of 20% of publications; it was determined a priori that if the reviewers were in agreement for ≥90% of these publications, the remaining 60% would be reviewed by a single investigator (JC) as described elsewhere [27,28].
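The a priori decision rule described above reduces to a simple raw-agreement calculation. The sketch below assumes reviewer decisions are recorded as booleans (True = include); the decision lists are hypothetical, and only the 90% threshold is taken from the protocol.

```python
# Minimal sketch of the a priori screening rule: two reviewers screen a
# random sample; if raw agreement is >= 90%, the remainder may be
# screened by a single reviewer. Decisions here are hypothetical.

def percent_agreement(decisions_a, decisions_b):
    """Raw proportion of publications on which both reviewers agree."""
    if len(decisions_a) != len(decisions_b):
        raise ValueError("reviewers must screen the same publications")
    matches = sum(a == b for a, b in zip(decisions_a, decisions_b))
    return matches / len(decisions_a)

def single_reviewer_allowed(decisions_a, decisions_b, threshold=0.90):
    """Apply the a priori >= 90% agreement rule."""
    return percent_agreement(decisions_a, decisions_b) >= threshold

# Example: the reviewers agree on 9 of 10 publications (90%), so the
# threshold is met and single-reviewer screening may proceed.
a = [True, True, False, False, True, False, True, True, False, True]
b = [True, True, False, False, True, False, True, True, False, False]
print(percent_agreement(a, b))        # 0.9
print(single_reviewer_allowed(a, b))  # True
```

Note that raw agreement does not correct for chance; chance-corrected statistics such as Cohen κ are often reported alongside it, although the protocol above specifies raw agreement.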

Data Extraction and Analysis

Data extraction fields included study design and sample characteristics; the type, maturity, make or model, form factor, and wear location (if applicable) of each sDHT evaluated along with the health concept or concepts generated by each sDHT; the methodological approaches; and the types of usability-related data reported in each study. Most fields for data capture were categorical, with categories created in advance to minimize error. Extraction from each publication was undertaken by one of three investigators with adjudication by an independent investigator as needed.

Categories of usability-related data are described in Table 2, and compiled based on the literature including the International Organization for Standardization (ISO) 9241-210:2019 standard [23] and Nielsen’s [30] usability attributes, as well as the studies identified in this review; that is, data not clearly fitting into an existing category were extracted and categorized post hoc. We acknowledge that there are various models for capturing data describing usability and related topics [31]; however, there is no single standard that has been widely adopted.

Table 2.

Categories of usability-related data extracted from eligible papers.

Category Definitiona,b
User satisfaction The extent to which a user finds the sDHTc to be pleasant to use, which may reflect trust, comfort, aesthetics, engagement, desirability, emotional response, and other considerations. Always captured through self-report.
Ease of use The ease with which a user is able to perform user tasks. Can be captured through self-report (such as the mental demand or effort required to complete a task) or objective measures (such as the number of actions, number of attempts, or time required to complete a task).
Efficiency The ease with which a user is able to perform user tasks after having learned how to use the sDHT. Captured according to the definition of ease of use above.
Learnabilityd The ease with which a user is able to perform user tasks during their first encounter with the sDHT. Captured according to the definition of ease of use above.
Memorability The ease with which a user is able to perform user tasks after a period of nonuse, assessed in a test-retest paradigm. Captured according to the definition of ease of use above.
Usefulnesse The extent to which a user finds the sDHT, or its specific features or functions, to be valuable, productive, or helpful. Always captured through self-report.
Use errorsf An action or lack of action that may result in a use-related hazard (a potential source of harm), as well as error recovery defined as the ability of a user to make a correction following a use error in order to complete a task. Can be captured through self-report or objective assessments.
Technical performance or malfunctions Technical performance, such as page load times, or the number, type, and severity of errors associated with sDHT malfunction. Can be captured through self-report or objective assessments.
Readability The reading skills a user must possess to understand information presented to them through the sDHT itself, or through written materials such as instructions for use, cautions, warnings, or contraindications [32]. Always captured through objective assessments, and typically reported as a reading grade.
Understandability or actionability The extent to which users of diverse backgrounds, languages, and varying levels of health literacy understand (1) the clinical data or other information, such as instructions, cautions, warnings, and contraindications, presented to them; and (2) the actions or tasks they should complete in response, such as an sDHT-derived blood glucose measurement requiring an adjustment to medication [32]. Always captured through objective assessments.

aNote that in the definitions, “self-report” includes data captured through surveys, interviews, and focus groups, while “objective” includes data captured through observation (direct or video) or through the sDHT itself (or any related software) such as timestamps, app crash reports, and page load times.

bComfort and trust were extracted separately for the purposes of this review.

csDHT: sensor-based digital health technology.

dLearnability refers to the operation of the sDHT rather than a practice effect associated with a research study outcome or endpoint.

eWe have adopted the term usefulness instead of utility, to avoid confusion with clinical utility, which refers to the extent to which implementing a medical product leads to improved health outcomes or provides useful information about diagnosis, treatment, management, or prevention of disease [33].

fThe categories listed after “use errors” in this table are not typically considered usability data but are related concepts often captured during usability evaluations.

Consistent with the goal of a scoping review, all data were analyzed descriptively.

Results

Literature Search and Study Selection

The PubMed search conducted on June 1, 2023, yielded 442 results, one of which was published only as an abstract and was excluded. After applying the eligibility criteria described in Table 1, a further 356 publications were excluded, leaving 85 eligible studies; however, 2 of these were under embargo, leaving 83 studies for data extraction (Figure 1). A complete list of all included studies is provided in Table S2 in Multimedia Appendix 2 [9,10,34-114].

Figure 1.

Figure 1

PRISMA flowchart. PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses; sDHT: sensor-based digital health technology.

As described above, two investigators reached consensus on 20% (n=88) of the 442 publications before any further publications were screened. The same investigators then screened a further 88 publications independently, reaching 100% agreement on eligibility. Per protocol, a single investigator screened the remaining 266 papers.

Study Design Considerations

The majority of studies (55/83, 66%; Table 3) reported summative evaluations of products that were marketed or production-equivalent (ie, sample products of final design assembled in a way that differs from—but is equivalent to—the manufacturing processes used for the marketed product [115]). The remaining 28 (34%) out of 83 studies reported formative evaluations of prototype products; we did not identify any reports focused solely on sDHT design. Most studies (53/83, 64%) were conducted partially or completely off-site. Study sample sizes spanned a wide range (range 1-623; median 27, IQR 13-60); however, only 21 (25%) of the full set of 83 studies, and 17 (31%) of the 55 summative studies, reported a rationale for the sample size (with or without a power calculation).

Table 3.

Study design and sample characteristics across therapeutic areas.


Therapeutic area of sDHTa end users (number of studies in parentheses)

Aging (n=19) Cardiovascular (n=9) Endocrine (n=3) Neurology (n=13) Oncology (n=3) Respiratory (n=6) Surgery (n=5) Healthy (n=15) Otherb (n=10) Total (n=83)
Study design, n (%)

Observational 17 (89) 9 (100) 3 (100) 13 (100) 3 (100) 5 (83) 4 (80) 14 (93) 10 (100) 78 (94)

Interventional 2 (11) 0 (0) 0 (0) 0 (0) 0 (0) 1 (17) 1 (20) 1 (7) 0 (0) 5 (6)
Study focusc , n (%)

Summative; sample size rationale 3 (16) 3 (33) 0 (0) 4 (31) 0 (0) 1 (17) 2 (40) 1 (7) 3 (30) 17 (20)

Summative; no sample size rationale 6 (32) 2 (22) 2 (67) 7 (54) 3 (100) 4 (67) 1 (20) 9 (60) 4 (40) 38 (46)

Formative; sample size rationale 2 (11) 1 (11) 0 (0) 0 (0) 0 (0) 0 (0) 0 (0) 0 (0) 1 (10) 4 (5)

Formative; no sample size rationale 8 (42) 3 (33) 1 (33) 2 (15) 0 (0) 1 (17) 2 (40) 5 (33) 2 (20) 24 (29)
Setting, n (%)

Remote 11 (58) 3 (33) 1 (33) 5 (38) 2 (67) 6 (100) 1 (20) 7 (47) 6 (60) 42 (51)

On-site 4 (21) 6 (67) 1 (33) 5 (38) 0 (0) 0 (0) 3 (60) 8 (53) 3 (30) 30 (36)

Both remote and on-site 4 (21) 0 (0) 1 (33) 3 (23) 1 (33) 0 (0) 1 (20) 0 (0) 1 (10) 11 (13)
Duration of sDHT data collection, n (%)

≤1 day 5 (26) 3 (33) 1 (33) 2 (15) 0 (0) 0 (0) 2 (40) 7 (47) 2 (20) 22 (27)

>1 to ≤7 days 2 (11) 3 (33) 0 (0) 4 (31) 0 (0) 2 (33) 2 (40) 2 (13) 2 (20) 17 (20)

>7 to ≤30 days 6 (32) 2 (22) 1 (33) 3 (23) 1 (33) 0 (0) 1 (20) 2 (13) 0 (0) 16 (19)

>31 to ≤90 days 3 (16) 0 (0) 0 (0) 2 (15) 1 (33) 4 (67) 0 (0) 3 (20) 1 (10) 14 (17)

>90 to ≤180 days 1 (5) 1 (11) 0 (0) 0 (0) 1 (33) 0 (0) 0 (0) 0 (0) 3 (30) 6 (7)

>180 days 1 (5) 0 (0) 0 (0) 2 (15) 0 (0) 0 (0) 0 (0) 0 (0) 1 (10) 4 (5)

Not reported 1 (5) 0 (0) 1 (33) 0 (0) 0 (0) 0 (0) 0 (0) 1 (7) 1 (10) 4 (5)
Study sample

Sample size, median (IQR) 30 (13.5-52.5) 24 (10-41) 35 (20-189) 40 (22-70) 30 (22-31.5) 14.5 (9.25-19.75) 29 (15-60) 25 (13-105) 21 (12-81.25) 27 (13-60)

Sample size, range 8-125 5-156 5-343 5-623 14-33 1-314 10-77 1-243 3-407 1-623
Users, n (%)

End usersd 19 (100) 9 (100) 3 (100) 13 (100) 3 (100) 6 (100) 5 (100) 15 (100) 9 (90) 82 (99)

Care partner usersd 0 (0) 1 (11) 1 (33) 2 (15) 0 (0) 0 (0) 0 (0) 3 (20) 1 (10) 8 (10)

Clinician usersd 2 (11) 3 (33) 1 (33) 2 (15) 0 (0) 1 (17) 1 (20) 1 (7) 1 (10) 12 (14)

Expertsd 1 (5) 0 (0) 1 (33) 0 (0) 0 (0) 0 (0) 0 (0) 1 (7) 0 (0) 3 (4)
Age, n (%)

Adults only 19 (100) 7 (78) 1 (33) 10 (77) 2 (67) 3 (50) 5 (100) 11 (73) 7 (70) 65 (78)

Children only 0 (0) 0 (0) 1 (33) 1 (8) 0 (0) 2 (33) 0 (0) 3 (20) 3 (30) 10 (12)

Both adults and children 0 (0) 1 (11) 0 (0) 2 (15) 1 (33) 1 (17) 0 (0) 1 (7) 0 (0) 6 (7)

Not reported 0 (0) 1 (11) 1 (33) 0 (0) 0 (0) 0 (0) 0 (0) 0 (0) 0 (0) 2 (2)
Sex/gender, n (%)

Male or men only 0 (0) 0 (0) 0 (0) 0 (0) 0 (0) 0 (0) 0 (0) 1 (7) 2 (20) 3 (4)

Female or women only 1 (5) 0 (0) 0 (0) 0 (0) 1 (33) 1 (17) 0 (0) 0 (0) 2 (20) 5 (6)

Both or all sexes/genders 18 (95) 7 (78) 2 (67) 13 (100) 2 (67) 5 (83) 5 (100) 13 (87) 6 (60) 71 (86)

Not reported 0 (0) 2 (22) 1 (33) 0 (0) 0 (0) 0 (0) 0 (0) 1 (7) 0 (0) 4 (5)
Race/ethnicity, n (%)

Race/ethnicity reported 2 (11) 1 (11) 0 (0) 4 (31) 0 (0) 2 (33) 0 (0) 3 (20) 2 (20) 14 (17)

Race/ethnicity not reported 17 (89) 8 (89) 3 (100) 9 (69) 3 (100) 4 (67) 5 (100) 12 (80) 8 (80) 69 (83)
Number of sDHTs assessed

Range 1-7 1-3 1-1 1-5 1-6 1-5 1-2 1-7 1-11 1-11

asDHT: sensor-based digital health technology.

b“Other” therapeutic area category contains studies with enrollment eligibility focused on anaphylaxis, muscular dystrophy, hemophilia, nocturnal enuresis, blood and marrow transplant, overweight or obesity, pregnancy, and nonspecific hospitalized or chronic illness. One study recruited clinicians only (no end users of the sDHT) and is included in this category.

cStudies reporting formative and summative evaluations are categorized as summative.

dCategories are not mutually exclusive.

Sample Characteristics

As shown in Table 3, the largest target populations were aging and healthy participants (19 and 15 studies, respectively; 34/83, 41% of all studies). Among the various diseases studied, neurology and cardiovascular were the most common therapeutic areas (13 and 9 studies, respectively; 22/44, 50% of studies assessing nonhealthy individuals). Table S3 in Multimedia Appendix 2 contains a list of conditions falling into each therapeutic area.

Almost all studies (82/83, 99%) captured data from targeted end users; the remaining study captured data only from clinician users [63]. Several studies captured data from multiple user groups; in total, 8 and 12 studies gathered data from care partner users and clinician users, respectively. Three studies involved experts (not considered to be sDHT users); two of these described a formal heuristic evaluation [79,99], while the third involved experts in design, biomedical engineering, computer science, and mobile health system production in the sDHT's design and formative testing process [103]. Finally, we noted substantial missing participant demographic data; age, sex/gender, and race/ethnicity were not reported in 2, 4, and 69 studies, respectively.

sDHTs Assessed in Eligible Studies

Across the 83 studies included in our review, a total of 164 different sDHTs were assessed (141 wearable and 23 ambient tools; Table 4), with 1 to 11 sDHTs assessed within a single study. Ingestible and implantable sDHTs were in scope, but none were identified in our literature search. A wide range of form factors (22 distinct categories) and wear locations (14 anatomical locations presented in 5 categories) were identified. Digital clinical measures of vital signs (n=76 sDHTs), physical activity (n=61 sDHTs), and mobility (n=35 sDHTs) were most prevalent. Table S4 in Multimedia Appendix 2 contains more comprehensive information regarding wear locations and health concepts captured by sDHTs.

Table 4.

sDHTa descriptive information across therapeutic areas.


Therapeutic area of sDHT users (number of sDHTs in parentheses)

Aging (n=27) Cardiovascular (n=12) Endocrine (n=3) Neurology (n=31) Oncology (n=8) Respiratory (n=15) Surgery (n=7) Healthy (n=35) Otherb (n=26) Total (n=164)
sDHT type, n (%)

Wearable 26 (96) 9 (75) 1 (33) 30 (97) 8 (100) 11 (73) 7 (100) 33 (94) 16 (62) 141 (86)

Ambient 1 (4) 3 (25) 2 (67) 1 (3)  0 (0) 4 (27)  0 (0) 2 (6) 10 (38) 23 (14)
sDHT maturity, n (%)

Prototype 9 (33) 5 (42) 1 (33) 5 (16)  0 (0)  0 (0) 2 (29) 5 (14) 11 (42) 38 (23)

Final or marketed 18 (67) 6 (50) 2 (67) 26 (84) 8 (100) 15 (100) 5 (71) 27 (77) 12 (46) 119 (73)

Not reported  0 (0) 1 (8)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 3 (9) 3 (12) 7 (4)
Form factor, n (%)

Adhesive patch 2 (7)  0 (0) 1 (33) 5 (16)  0 (0)  0 (0) 1 (14) 2 (6) 1 (4) 12 (7)

Balance board  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (4) 1 (<1)

Camera, video, or still  0 (0) 1 (8)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 2 (8) 3 (2)

Clip 4 (15)  0 (0)  0 (0)  0 (0)  0 (0) 1 (7) 1 (14) 1 (3)  0 (0) 7 (4)

Clothing or shoes 5 (19) 3 (25)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 4 (11) 5 (19) 17 (10)

Contact lens  0 (0)  0 (0)  0 (0) 1 (3)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (<1)

Cuff or wrap  0 (0) 1 (8)  0 (0) 1 (3)  0 (0)  0 (0) 1 (14) 2 (6)  0 (0) 5 (3)

Electrode or electrodes  0 (0)  0 (0)  0 (0) 1 (3)  0 (0)  0 (0)  0 (0) 2 (6)  0 (0) 3 (2)

Exercise equipment  0 (0)  0 (0) 1 (33)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 2 (8) 3 (2)

Glasses 1 (4)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (3)  0 (0) 2 (1)

Gloves 2 (7) 1 (8)  0 (0) 1 (3)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 4 (2)

Glucometer  0 (0)  0 (0) 1 (33)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (<1)

Handheld thermometer  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (7)  0 (0)  0 (0)  0 (0) 1 (<1)

Mattress pad  0 (0) 1 (8)  0 (0) 1 (3)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 2 (1)

Medication package  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (7)  0 (0)  0 (0) 1 (4) 2 (1)

Phone or tablet  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (3) 4 (15) 5 (3)

Probe  0 (0)  0 (0)  0 (0) 1 (3)  0 (0)  0 (0)  0 (0)  0 (0) 1 (4) 2 (1)

Ring  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (7)  0 (0)  0 (0)  0 (0) 1 (<1)

Spirometer  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 2 (13)  0 (0)  0 (0)  0 (0) 2 (1)

Contactless unit 1 (4)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (3)  0 (0) 2 (1)

Strap 12 (44) 4 (33)  0 (0) 20 (65) 8 (100) 9 (60) 4 (57) 21 (60) 9 (35) 87 (53)

Weight scale  0 (0) 1 (8)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (<1)
Wear location, n (%)

Arms or wrists or hands 11 (41) 5 (42) 0 (0) 19 (61) 8 (100) 8 (53) 3 (43) 19 (54) 9 (35) 82 (50)

Head or face 1 (4) 0 (0) 0 (0) 4 (13) 0 (0) 0 (0) 0 (0) 3 (9) 0 (0) 8 (5)

Legs or ankles or feet 2 (7) 2 (17) 0 (0) 1 (3) 0 (0) 0 (0) 2 (29) 3 (9) 0 (0) 10 (6)

Neck or torso or hips 10 (37) 2 (17) 1 (33) 3 (10) 0 (0) 2 (13) 2 (29) 6 (17) 6 (23) 32 (20)

Multiple locationsc 2 (7)  0 (0)  0 (0) 3 (10)  0 (0)  0 (0)  0 (0) 2 (6) 1 (4) 8 (5)

N/Ad 1 (4) 3 (25) 2 (67) 1 (3)  0 (0) 5 (33)  0 (0) 2 (6) 10 (38) 24 (15)
Interaction typee , n (%)

Passive 24 (89) 7 (58) 1 (33) 26 (84) 8 (100) 12 (80) 3 (43) 30 (86) 15 (58) 126 (77)

Active 3 (11) 5 (42) 2 (67) 5 (16)  0 (0) 3 (20) 4 (57) 5 (14) 11 (42) 38 (23)
Health conceptsf, n (%)

Activities of daily living 6 (22) 2 (17)  0 (0) 2 (6)  0 (0)  0 (0)  0 (0) 1 (3)  0 (0) 11 (7)

Physical activity 16 (59) 4 (33) 1 (33) 6 (19) 9 (113) 9 (60) 1 (14) 15 (43)  0 (0) 61 (37)

Adherence  0 (0)  0 (0)  0 (0)  0 (0)  0 (0) 1 (7)  0 (0)  0 (0)  0 (0) 1 (<1)

Electrical activity  0 (0)  0 (0)  0 (0) 15 (48)  0 (0)  0 (0)  0 (0) 1 (3)  0 (0) 16 (10)

Mobility 5 (19) 4 (33)  0 (0) 11 (35)  0 (0)  0 (0) 2 (29) 13 (37)  0 (0) 35 (21)

Sleep 6 (22) 1 (8)  0 (0)  0 (0) 1 (13) 1 (7)  0 (0) 5 (14)  0 (0) 14 (9)

Vital signs 9 (33) 5 (42) 2 (67) 18 (58)  0 (0) 5 (33) 14 (2) 23 (66)  0 (0) 76 (46)

Otherg 4 (15) 2 (17)  0 (0) 4 (13)  0 (0) 2 (13)  0 (0)  0 (0)  0 (0) 12 (7)

asDHT: sensor-based digital health technology.

b“Other” therapeutic area category contains studies with enrollment eligibility focused on anaphylaxis, muscular dystrophy, hemophilia, nocturnal enuresis, blood and marrow transplant, overweight or obesity, pregnancy, and nonspecific hospitalized or chronic illness. One study recruited clinicians only (no end users of the sDHT) and is included in this category.

cRefers to multisensor sDHTs worn on different parts of the body, or sDHTs that can be positioned in one of many locations.

dWear location is not applicable to ambient sDHTs. Wear locations are presented in greater detail in Table S4 in Multimedia Appendix 2.

ePassive: sDHT data are collected over long time periods without user input other than aspects such as charging or changing batteries (such as actigraphy); includes tools for which the absence of data is meaningful (such as smart packaging for adherence monitoring). Active: sDHT data collection requires user engagement at defined timepoints. Categories described previously [28].

fHealth concepts are not mutually exclusive; a single sDHT can capture data in multiple categories. Health concepts are presented in greater detail in Table S4 in Multimedia Appendix 2.

g“Other” health concept category includes bladder volume, body habitus, cardiac output, fall detection, gaze or visual movement, intraocular pressure, lung or airway function, and tremor detection.

Most sDHTs (126/164, 77%) required only passive interaction by users, meaning that data were captured without user input other than basic tasks such as charging or changing batteries. The remaining 38 (23%) sDHTs required active engagement at specific times such as completion of physical therapy [58], exercise [97], or blood glucose tests [79].

Methodological Approaches

As described in Table 5, most sDHTs (139/164, 85%) were evaluated in the actual environment in which they were intended to be used, while 25 sDHTs were assessed in a simulated environment only. The vast majority were evaluated during actual use (148/164, 90%) rather than through “look and feel” approaches. Of particular interest, a variety of methods were used to evaluate usability and related concepts, including interviews (49 sDHTs), focus groups (29 sDHTs), direct or video observation (35 sDHTs), think-aloud (15 sDHTs), and heuristic analysis (2 sDHTs). Surveys were the most prevalent method for capturing usability data; 86 sDHTs were evaluated using referenced surveys while 81 sDHTs were evaluated using surveys developed in house by study investigators. Data for 4 sDHTs were captured using the sDHT itself; for example, instances of connectivity loss or data capture drops were recorded as “use errors” or “technical performance or product errors” [42,100]. An illustrative example of a use error that may be addressed through design modification is the report of users turning an sDHT on and off repeatedly as it was not clear whether the product was operating correctly [53]. Additional examples of product errors, distinct from use errors, included instances of system crash [51] and software malfunctions requiring computer program patches [54].

Table 5.

Methodological approaches to usability data collection.


sDHTa type (number of sDHTs in parentheses)

Ambient (n=23) Wearable (n=141) Total (n=164)
Data collection environment, n (%)

Actual environment 23 (100) 107 (76) 130 (79)

Simulated environment  0 (0) 25 (18) 25 (15)

Both actual and simulated  0 (0) 9 (6) 9 (5)
Interactions with sDHT, n (%)

Look and feel 1 (4) 15 (11) 16 (10)

Actual use 22 (96) 126 (89) 148 (90)
Usability evaluation methodsb, n (%)

Interviews 5 (22) 44 (31) 49 (30)

Focus groups 10 (43) 19 (13) 29 (18)

Surveys—referenced 14 (61) 72 (51) 86 (52)

Surveys—in house 7 (30) 74 (52) 81 (49)

Think-aloud 1 (4) 14 (10) 15 (9)

Observation (direct or video) 1 (4) 34 (24) 35 (21)

Measured by the sDHT  0 (0) 5 (4) 5 (3)

Heuristic analysis 1 (4) 1 (<1) 2 (1)
Type or types of usability data reported, n (%)

Mixed methods 14 (61) 58 (41) 72 (44)

Quantitative only 6 (26) 60 (43) 66 (40)

Qualitative only 3 (13) 23 (16) 26 (16)
Categories of usability and related data reported, n (%)

User satisfaction 19 (83) 117 (83) 136 (83)

Comfort 5 (22) 107 (76) 112 (68)

Ease of use; self-report 23 (100) 122 (87) 145 (88)

Ease of use; objectively captured 1 (4) 4 (3) 5 (3)

Learnability 1 (4) 10 (7) 11 (7)

Efficiency  0 (0) 4 (3) 4 (2)

Memorability  0 (0) 2 (1) 2 (1)

Usefulness 16 (70) 96 (68) 112 (68)

Use errors 6 (26) 26 (18) 32 (20)

User trust 12 (52) 53 (38) 65 (40)

Readability  0 (0)  0 (0) 0 (0)

Understandability or actionability 1 (4) 13 (9) 14 (9)

Technical performance or product errors 19 (83) 79 (56) 98 (60)
Adherence to sDHT reported, n (%)

Objectively measured by the sDHT 6 (26) 44 (31) 50 (30)

Self or care partner report 1 (4) 14 (10) 15 (9)

Both objective and self or care partner  0 (0) 4 (3) 4 (2)

Reported but method not described 1 (4) 7 (5) 8 (5)

Adherence not reported 15 (65) 72 (51) 87 (53)

asDHT: sensor-based digital health technology.

bCategories are not mutually exclusive.

Categories of Usability-Related Data Reported

User satisfaction was captured for the majority of sDHTs (136/164, 83%), often as a measure of acceptability or user attitudes. Although overall ease of use was also commonly reported, captured through either self-report or objective methods (n=145 and n=5, respectively), the related concepts of learnability, efficiency, and memorability were reported for only 11, 4, and 2 sDHTs, respectively. Technical performance and product errors associated with malfunction were captured for 98 sDHTs, while use errors were captured for only 32 sDHTs. Finally, although none of the studies in our review reported the readability of information presented to the user, 14 sDHTs were evaluated according to the extent to which users were able to understand the data or information presented to them (understandability) or the actions or tasks they should complete in response (actionability).

Adherence (such as wear or use time) was reported for 77 sDHTs. Of these, adherence was captured objectively by the sDHT for 50 sDHTs, through self-report or care partner report for 15, through both objective and self- or care partner report for 4, and via a method that was not described for 8.

The relationships in our dataset between usability evaluation methods and sDHT form factors, and between usability evaluation methods and the categories of usability-related data reported, are depicted in Figures 2 and 3, respectively. For example, the width of each chord in Figure 2 is proportional to the number of sDHTs of the relevant form factor that were assessed using the linked method, demonstrating that surveys (both referenced and in house) were the most common evaluation methods while heuristic analysis was the least common. Similarly, Figure 3 demonstrates that overall satisfaction and self-reported ease of use were captured frequently, in contrast to data related to objective ease of use, efficiency, learnability, and memorability. Both figures contain a large number of linked chords, indicating that specific usability evaluation methods were adopted across diverse sDHT form factors and outcome measures.

Figure 2.


Chord diagram depicting the relationship between sDHT form factors and usability evaluation methods. sDHT: sensor-based digital health technology.

Figure 3.


Chord diagram depicting the relationship between usability evaluation methods and categories of usability-related data reported. sDHT: sensor-based digital health technology.

Tables S5-S7 in Multimedia Appendix 2 present the data shown in Tables 3-5 for the subset of 55 studies reporting the results of summative evaluations, while Tables S8-S10 in Multimedia Appendix 2 present these data for the subset of 28 studies reporting formative evaluations.

Discussion

Principal Findings

This paper represents the first scoping review reporting the methodological approaches adopted during usability-related studies specifically focused on sDHTs. We identified 83 formative and summative studies published over the decade from 2013 to 2023 that evaluated human factors, human-centered design, or usability for 164 ambient and wearable tools. Most studies (67/83, 81%) recruited nonhealthy individuals, thereby providing informative data regarding sDHT usability across many diseases in addition to other aspects of health such as aging and pregnancy. Most sDHTs were evaluated in the intended use environment, with multiple facets of usability-related data captured via a range of mixed method approaches including heuristic analysis, surveys, observation, think-aloud, focus groups, interviews, and use errors or technical performance errors captured by the sDHT itself such as instances of connectivity loss.

This review highlights 4 notable gaps that warrant attention as the field advances. First, the breadth and depth of usability and related data were limited, relying largely on surveys capturing user satisfaction and ease of use (each captured for >80% of sDHTs) with sparse reporting of sDHT use errors, learnability, efficiency, or memorability. The extent to which users understood the health- and behavior-related data or other information presented to them (understandability) and the actions or tasks they should complete in response (actionability) was assessed for only 9% (14/164) of sDHTs. Understandability and actionability are particularly important for sDHTs, given that they are often used by patients or participants in out-of-clinic settings without clinical supervision. For the use of sDHTs in clinical care settings, it is imperative that users understand whether and how to react to clinical data [116], and thus the lack of focus on understandability and actionability is concerning and could be due to the early-stage nature of sDHTs in clinical practice. In the context of clinical research, however, sharing sDHT data with participants in real time has the potential to introduce bias and affect user behavior, thereby posing a risk of yielding inaccurate results [4]. Additional dimensions related to understandability and actionability, such as understanding optimal ways of implementing remote examinations, also warrant further investigation.

Second, only 22% (18/83) of studies considered users other than end users (patients or participants), such as care partners and clinicians, who play crucial roles in sDHT implementation and therefore the quality of data captured [117]. Especially in populations where care partners play a key role in sDHT implementation (eg, children, older people, those with language barriers, and those with disabilities), understanding usability from the care partner perspective is vital. Although existing usability data may be available for some sDHTs that are regulated as medical devices, research participants likely have needs and motivations for using the sDHT that differ from those of patients using the product as part of usual care. Similarly, the needs of investigator users are likely different from the needs of clinician users, requiring further evaluation.

Third, we found that only 31% (17/55) of summative studies (referred to by the US Food and Drug Administration as “human factors validation studies” [20]) provided a rationale for the sample size, with or without a power calculation. An understanding of key study design considerations, including sample size, is important for evaluating the robustness of study conclusions.

Finally, as has been noted previously [28,118], we observed a deficiency in reporting basic sample demographics, with studies typically providing information on age and sex/gender but neglecting to include details on the race and ethnicity of participants. Inadequate reporting of descriptive data, including sociodemographics, precludes a complete understanding of generalizability, potentially leading to the need to repeat studies while contributing to disparities and biases in clinical research [119].

As described above, while there are several existing systematic reviews describing the usability of digital health products for specific applications [15-19], few have focused specifically on evaluating methodological approaches. In addition, most prior systematic reviews with similar objectives have focused on digital health technologies that are not sensor-based, such as electronic medical records systems [120] and mobile clinical decision support tools [121], which are not used for remote data capture. In 2023, Maqbool and Herold [5] published a systematic review of usability evaluations describing a broad suite of over 1000 digital health tools consisting mostly of mobile health apps and including a subset of 20 products approximately aligned to our definition of sDHT, including fitness or activity trackers, digital sphygmomanometers, and wearable fall risk assessment systems. Compared to our study, Maqbool and Herold [5] found relatively higher rates of clinician and care partner participation, as well as more frequent reporting of learnability, efficiency, and memorability. Such differences emphasize substantial variability in usability study methodology across subcategories of digital health technologies, as well as differences in definitions and terminology of the concepts reported, underscoring the need for a common evaluation framework.

Strengths and Limitations

Strengths of this review include the robust approach taken to testing our search terms, including a careful assessment against a list of target papers identified a priori to ensure that we were capturing appropriate literature. This process was intended not only to ensure the inclusion of relevant literature but also to strengthen the reliability of our findings, helping to provide a foundation for subsequent reviews and meta-analyses. In-depth data extraction across many domains allowed for a thorough comparative analysis of the identified studies. The decision to focus on studies published within the last decade (2013-2023) was also carefully considered, as it encompasses the recent surge in studies reporting sDHT implementation. While sDHTs have a lengthy history prior to 2013, this temporal scope ensures that our findings reflect contemporary developments and trends, offering insights into the current state of sDHT implementation.

A number of limitations are acknowledged. First, we limited our search to the peer-reviewed literature. We acknowledge that many usability studies undertaken by technology manufacturers may be published in the gray literature; however, our ultimate goal is to use the findings of our review to guide the development of a framework representing best practices, and therefore, the peer review process was used as an indicator of methodological rigor and reporting quality. Second, terminology in the field of digital medicine is still evolving, and investigators use many different terms to describe sDHTs; although we incorporated 25 descriptive keywords in Layer C of our search terms (Table S1 in Multimedia Appendix 2), we found it necessary to rely on MeSH terms developed by the National Library of Medicine [26] as a means of limiting our literature search to a feasible number of publications. As a consequence, we were limited to conducting our search in PubMed, as this is the clinical research database of the National Library of Medicine. While MeSH terms are widely accepted and systematically applied, their specificity may have excluded relevant studies using different terminology, potentially resulting in unintentional omissions. In addition, the decision to search within one database may have resulted in missed publications. Our hope is that as the field matures, terminology will become harmonized and sDHT-specific indexing will support the identification of studies adopting these technologies. Third, our decision to exclude keywords describing specific sensors, form factors, methodologies, wear locations, and technology makes or models may have excluded publications that used these types of keywords in the absence of other descriptors and MeSH terms. This approach was taken to reduce the possibility of overlooking novel or emerging technologies in favor of established digital products such as actigraphy tools.
Finally, only 40% (176/442) of publications were screened for eligibility by multiple investigators. This approach to study identification, which has been described and adopted previously [27,28], allowed us to screen a greater number of papers, which was necessary given the lack of systematic indexing. The high agreement levels between investigators suggest that our quality-control approach maintained a robust screening process, despite part of the work being conducted by a single investigator.

Conclusions and Future Directions

Based on our findings, we suggest 4 actionable recommendations that will help to advance the implementation of sensor-based digital measurement tools in both clinical and research settings. First, we encourage investigators to adopt in-depth assessment and reporting of usability data beyond user satisfaction and ease of use. In particular, it is valuable to understand use errors alongside technical errors, and it is critical to evaluate the extent to which users understand the clinical data and information presented to them and the appropriate tasks to undertake in response, if applicable. Second, it is essential to embrace the diversity of users in all respects, including diversity of stakeholders within the human-centered design process; evaluation of usability across multiple user groups, including care partners and clinicians; and recruitment of participating users who are representative of the intended use population in terms of sociodemographics, social determinants of health, and other characteristics. Third, rigorous study design is key. Usability is a heterogeneous concept, and it is often beneficial to evaluate usability alongside other objectives such as analytical or clinical validation; thus, we do not advocate a particular study design or set of study outcome measures. We do, however, believe that careful consideration of usability evaluation criteria, study sample sizes, and predetermined thresholds of success is critical for making go or no-go decisions as to whether a particular sDHT is sufficiently usable for implementation in a particular context of use. Finally, we recommend adhering to reporting and publication checklists such as Annex B in ISO 9241-11:2018 [122] and EVIDENCE [123], the latter of which describes optimal reporting requirements of studies evaluating several aspects of sDHT quality, including usability assessments.
Ensuring consistency in reporting will enable meaningful comparisons between studies, facilitate better assessments of findings, and enhance the accurate interpretation of results and limitations across studies.

Our long-term goal is to develop and disseminate an evidence-driven framework for evaluating sDHTs as being fit for purpose from a usability perspective, informed in part by the findings of this review. By developing such a framework, we endeavor to contribute to the ongoing discourse surrounding sDHTs, ultimately paving the way for the development of safe and effective tools that lead to a more inclusive and patient-centric health care ecosystem poised to improve clinical trials and clinical practice.

Acknowledgments

The authors wish to acknowledge Bethanie McCrary and Danielle Stefko for assistance with project management, Katerina Djambazova for assistance with data extraction, and Jennifer Goldsack for providing feedback on the manuscript. Chord diagrams were created with flourish.studio online software. This work was undertaken within the Digital Health Measurement Collaborative Community (DATAcc), hosted by the Digital Medicine Society (DiMe).

Abbreviations

MeSH

Medical Subject Heading

PICO

patients/participants; intervention; comparator; outcomes

PRISMA

Preferred Reporting Items for Systematic Reviews and Meta-Analyses

sDHT

sensor-based digital health technology

Multimedia Appendix 1

PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) checklist.

jmir_v26i1e57628_app1.pdf (162.1KB, pdf)
Multimedia Appendix 2

Supplementary tables.

jmir_v26i1e57628_app2.docx (140.5KB, docx)

Footnotes

Authors' Contributions: All authors contributed to the study design, data interpretation, and manuscript preparation. The literature search, literature screening, and data extraction were undertaken by JC, SM, and JPB.

Conflicts of Interest: AT is a consultant for Synergen Technology Labs, LLC; Siemens Healthineers; and Gabi SmartCare. BC is an employee of Genentech, a member of the Roche Group and Roche Pharmaceuticals, and owns company stock. EI is an employee of Koneksa Health and may own company stock. NM and SV are employees of Johnson & Johnson Innovative Medicine and hold company stocks or stock options. JPB reports financial interests (consulting income, shares, or stock) in Philips, Signifier Medical Technologies, Koneksa Health, and Apnimed. ES serves on the editorial board as an Associate Editor of JMIR Publications.

References

  • 1.Goldsack JC, Coravos A, Bakker JP, Bent B, Dowling AV, Fitzer-Attas C, Godfrey A, Godino JG, Gujar N, Izmailova E, Manta C, Peterson B, Vandendriessche B, Wood WA, Wang KW, Dunn J. Verification, analytical validation, and clinical validation (V3): the foundation of determining fit-for-purpose for biometric monitoring technologies (BioMeTs) NPJ Digital Med. 2020;3:55. doi: 10.1038/s41746-020-0260-4. https://doi.org/10.1038/s41746-020-0260-4 .260 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 2.Library of digital endpoints. Digital Medicine Society (DiMe) 2021. [2024-01-23]. https://dimesociety.org/get-involved/library-of-digital-endpoints/
  • 3.Marwaha JS, Landman AB, Brat GA, Dunn T, Gordon WJ. Deploying digital health tools within large, complex health systems: key considerations for adoption and implementation. NPJ Digital Med. 2022;5(1):13. doi: 10.1038/s41746-022-00557-1. https://doi.org/10.1038/s41746-022-00557-1 .10.1038/s41746-022-00557-1 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 4.Center for Drug Evaluation, Research Digital health technologies for remote data acquisition in clinical investigations. US Food and Drug Administration. [2024-01-23]. https://www.fda.gov/regulatory-information/search-fda-guidance-documents/digital-health-technologies-remote-data-acquisition-clinical-investigations .
  • 5.Maqbool B, Herold S. Potential effectiveness and efficiency issues in usability evaluation within digital health: a systematic literature review. J Syst Software. 2024;208:111881. doi: 10.1016/j.jss.2023.111881. http://paperpile.com/b/S6KR0h/06lr . [DOI] [Google Scholar]
  • 6.Mathews SC, McShea MJ, Hanley CL, Ravitz A, Labrique AB, Cohen AB. Digital health: a path to validation. NPJ Digital Med. 2019;2:38. doi: 10.1038/s41746-019-0111-3. https://doi.org/10.1038/s41746-019-0111-3 .111 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 7.Whitehead L, Talevski J, Fatehi F, Beauchamp A. Barriers to and facilitators of digital health among culturally and linguistically diverse populations: qualitative systematic review. J Med Internet Res. 2023;25:e42719. doi: 10.2196/42719. https://www.jmir.org/2023//e42719/ v25i1e42719 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 8.Bent B, Sim I, Dunn JP. Digital medicine community perspectives and challenges: survey study. JMIR Mhealth Uhealth. 2021;9(2):e24570. doi: 10.2196/24570. https://mhealth.jmir.org/2021/2/e24570/ v9i2e24570 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 9.Evans J, Papadopoulos A, Silvers CT, Charness N, Boot WR, Schlachta-Fairchild L, Crump C, Martinez M, Ent CB. Remote health monitoring for older adults and those with heart failure: adherence and system usability. Telemed e-Health. 2016;22(6):480–488. doi: 10.1089/tmj.2015.0140. https://europepmc.org/abstract/MED/26540369 . [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 10.Ummels D, Braun S, Stevens A, Beekman E, Beurskens A. Measure it super simple (MISS) activity tracker: (re)design of a user-friendly interface and evaluation of experiences in daily life. Disabil Rehabil Assist Technol. 2022;17(7):767–777. doi: 10.1080/17483107.2020.1815089. [DOI] [PubMed] [Google Scholar]
  • 11.Chen Y, Clayton EW, Novak LL, Anders S, Malin B. Human-centered design to address biases in artificial intelligence. J Med Internet Res. 2023;25:e43251. doi: 10.2196/43251. https://www.jmir.org/2023//e43251/ v25i1e43251 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 12.Levander XA, VanDerSchaaf H, Barragán VG, Choxi H, Hoffman A, Morgan E, Wong E, Wusirika R, Cheng A. The role of human-centered design in healthcare innovation: a digital health equity case study. J Gen Intern Med. 2024;39(4):690–695. doi: 10.1007/s11606-023-08500-0.10.1007/s11606-023-08500-0 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 13.Benishek LE, Kachalia A, Biddison LD. Improving clinician well-being and patient safety through human-centered design. JAMA. 2023;329(14):1149–1150. doi: 10.1001/jama.2023.2157.2802009 [DOI] [PubMed] [Google Scholar]
  • 14.Tase A, Vadhwana B, Buckle P, Hanna GB. Usability challenges in the use of medical devices in the home environment: a systematic review of literature. Appl Ergon. 2022;103:103769. doi: 10.1016/j.apergo.2022.103769. https://linkinghub.elsevier.com/retrieve/pii/S0003-6870(22)00092-8 .S0003-6870(22)00092-8 [DOI] [PubMed] [Google Scholar]
  • 15.Ye B, Chu CH, Bayat S, Babineau J, How T, Mihailidis A. Researched apps used in dementia care for people living with dementia and their informal caregivers: systematic review on app features, security, and usability. J Med Internet Res. 2023;25:e46188. doi: 10.2196/46188. https://www.jmir.org/2023//e46188/ v25i1e46188 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 16.Siette J, Dodds L, Sharifi F, Nguyen A, Baysari M, Seaman K, Raban M, Wabe N, Westbrook J. Usability and acceptability of clinical dashboards in aged care: systematic review. JMIR Aging. 2023;6:e42274. doi: 10.2196/42274. https://aging.jmir.org/2023//e42274/ v6i1e42274 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 17.Kraaijkamp JJM, van Dam van Isselt EF, Persoon A, Versluis A, Chavannes NH, Achterberg WP. eHealth in geriatric rehabilitation: systematic review of effectiveness, feasibility, and usability. J Med Internet Res. 2021;23(8):e24015. doi: 10.2196/24015. https://www.jmir.org/2021/8/e24015/ v23i8e24015 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 18.Keogh A, Argent R, Anderson A, Caulfield B, Johnston W. Assessing the usability of wearable devices to measure gait and physical activity in chronic conditions: a systematic review. J Neuroeng Rehabil. 2021;18(1):138. doi: 10.1186/s12984-021-00931-2. https://jneuroengrehab.biomedcentral.com/articles/10.1186/s12984-021-00931-2 .10.1186/s12984-021-00931-2 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 19.Butler S, Sculley D, Santos DS, Fellas A, Gironès X, Singh-Grewal D, Coda A. Usability of eHealth and mobile health interventions by young people living with juvenile idiopathic arthritis: systematic review. JMIR Pediatr Parent. 2020;3(2):e15833. doi: 10.2196/15833. https://pediatrics.jmir.org/2020/2/e15833/ v3i2e15833 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 20.Center for Devices, Radiological Health. Applying human factors and usability engineering to medical devices. US Food and Drug Administration. [2024-01-23]. https://www.fda.gov/regulatory-information/search-fda-guidance-documents/applying-human-factors-and-usability-engineering-medical-devices .
  • 21.Department of Health and Social Care. Guidance on applying human factors and usability engineering to medical devices including drug-device combination products in Great Britain. Medicines and Healthcare products Regulatory Agency. [2024-01-30]. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/970563/Human-Factors_Medical-Devices_v2.0.pdf .
  • 22.Guiding principles for technical review of human factors design of medical devices [DiMe, Trans.] National Medical Products Association. [2024-01-30]. https://datacc.dimesociety.org/wp-content/uploads/2023/09/NMPA-Human-Factors-Guidance-English-Translation-FINAL.pdf .
  • 23.ISO. ISO 9241-210. 2019. [2024-01-23]. https://www.iso.org/standard/77520.html .
  • 24.Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, Shamseer L, Tetzlaff JM, Akl EA, Brennan SE, Chou R, Glanville J, Grimshaw JM, Hróbjartsson A, Lalu MM, Li T, Loder EW, Mayo-Wilson E, McDonald S, McGuinness LA, Stewart LA, Thomas J, Tricco AC, Welch VA, Whiting P, Moher D. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71. doi: 10.1136/bmj.n71. http://www.bmj.com/lookup/pmidlookup?view=long&pmid=33782057 . [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 25.PROSPERO. [2024-01-23]. https://www.crd.york.ac.uk/prospero/
  • 26.Medical subject headings—home page. U.S. National Library of Medicine. 2020. [2024-01-30]. https://www.nlm.nih.gov/mesh/meshhome.html .
  • 27.Nussbaumer-Streit B, Sommer I, Hamel C, Devane D, Noel-Storr A, Puljak L, Trivella M, Gartlehner G. Rapid reviews methods series: guidance on team considerations, study selection, data extraction and risk of bias assessment. BMJ Evidence-Based Med. 2023;28(6):418–423. doi: 10.1136/bmjebm-2022-112185. http://ebm.bmj.com/lookup/pmidlookup?view=long&pmid=37076266 .bmjebm-2022-112185 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 28.Olaye IM, Belovsky MP, Bataille L, Cheng R, Ciger A, Fortuna KL, Izmailova ES, McCall D, Miller CJ, Muehlhausen W, Northcott CA, Rodriguez-Chavez IR, Pratap A, Vandendriessche B, Zisman-Ilani Y, Bakker JP. Recommendations for defining and reporting adherence measured by biometric monitoring technologies: systematic review. J Med Internet Res. 2022;24(4):e33537. doi: 10.2196/33537. https://www.jmir.org/2022/4/e33537/ v24i4e33537 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 29.Eriksen MB, Frandsen TF. The impact of patient, intervention, comparison, outcome (PICO) as a search strategy tool on literature search quality: a systematic review. J Med Libr Assoc. 2018;106(4):420–431. doi: 10.5195/jmla.2018.345. https://europepmc.org/abstract/MED/30271283 .jmla-106-420 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 30.Nielsen J. Usability Engineering. Boston, MA: Academic Press; 1993. [Google Scholar]
  • 31.Shachak A, Kuziemsky C, Petersen C. Beyond TAM and UTAUT: future directions for HIT implementation research. J Biomed Inform. 2019;100:103315. doi: 10.1016/j.jbi.2019.103315. https://linkinghub.elsevier.com/retrieve/pii/S1532-0464(19)30234-5 .S1532-0464(19)30234-5 [DOI] [PubMed] [Google Scholar]
  • 32.Shoemaker SJ, Wolf MS, Brach C. Development of the patient education materials assessment tool (PEMAT): a new measure of understandability and actionability for print and audiovisual patient information. Patient Educ Couns. 2014;96(3):395–403. doi: 10.1016/j.pec.2014.05.027. https://europepmc.org/abstract/MED/24973195 .S0738-3991(14)00233-X [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 33.Bossuyt PMM, Reitsma JB, Linnet K, Moons KGM. Beyond diagnostic accuracy: the clinical utility of diagnostic tests. Clin Chem. 2012;58(12):1636–1643. doi: 10.1373/clinchem.2012.182576.clinchem.2012.182576 [DOI] [PubMed] [Google Scholar]
  • 34.Weenk M, Bredie SJ, Koeneman M, Hesselink G, van Goor H, van de Belt TH. Continuous monitoring of vital signs in the general ward using wearable devices: randomized controlled trial. J Med Internet Res. 2020;22(6):e15471. doi: 10.2196/15471. https://www.jmir.org/2020/6/e15471/ v22i6e15471 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 35.Bentley CL, Powell L, Potter S, Parker J, Mountain GA, Bartlett YK, Farwer J, O'Connor C, Burns J, Cresswell RL, Dunn HD, Hawley MS. The use of a smartphone app and an activity tracker to promote physical activity in the management of chronic obstructive pulmonary disease: randomized controlled feasibility study. JMIR Mhealth Uhealth. 2020;8(6):e16203. doi: 10.2196/16203. https://mhealth.jmir.org/2020/6/e16203/ v8i6e16203 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 36.Liverani M, Ir P, Wiseman V, Perel P. User experiences and perceptions of health wearables: an exploratory study in Cambodia. Glob Health Res Policy. 2021;6(1):33. doi: 10.1186/s41256-021-00221-3. https://ghrp.biomedcentral.com/articles/10.1186/s41256-021-00221-3 .10.1186/s41256-021-00221-3 [DOI] [PMC free article] [PubMed] [Google Scholar]
  • 37.Nasseri M, Nurse E, Glasstetter M, Böttcher S, Gregg NM, Nandakumar AL, Joseph B, Attia TP, Viana PF, Bruno E, Biondi A, Cook M, Worrell GA, Schulze-Bonhage A, Dümpelmann M, Freestone DR, Richardson MP, Brinkmann BH. Signal quality and patient experience with wearable devices for epilepsy management. Epilepsia. 2020;61 Suppl 1:S25–S35. doi: 10.1111/epi.16527. [DOI] [PubMed] [Google Scholar]

Associated Data

This section collects the data availability statements and supplementary materials included in this article.

Supplementary Materials

Multimedia Appendix 1

PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews) checklist.

jmir_v26i1e57628_app1.pdf (162.1KB, pdf)
Multimedia Appendix 2

Supplementary tables.

jmir_v26i1e57628_app2.docx (140.5KB, docx)

Articles from Journal of Medical Internet Research are provided here courtesy of JMIR Publications Inc.
