Digital Health
2026 Apr 17;12:20552076261443211. doi: 10.1177/20552076261443211

Enhancing mental health care through human-computer interaction: AI holistic design of a mental health APP in China

Zongyi Zhang 1, Xuanxuan Tan 1,
PMCID: PMC13100375  PMID: 42027714

Abstract

Objective

The integration of Human-Computer Interaction (HCI) and healthcare technologies is transforming the landscape of mental health interventions. Despite the growing adoption of mental health apps, current evaluation methods often neglect the interplay between interface design, personalization, emotional resonance, privacy, and community engagement. These gaps limit the capacity of digital tools to meet therapeutic goals while maintaining user trust and long-term engagement. This study examined the role of Xin Dao Diary, an AI-assisted platform, in enhancing emotional well-being through innovative design and user engagement strategies.

Methods

Using a mixed-methods approach that combined a walkthrough method, diary studies, and sentiment analysis of user feedback, we explored how digital interfaces can facilitate effective mental health care.

Results

Our findings reveal that intuitive interface design and personalized AI interventions improve user satisfaction and emotional health outcomes. However, challenges remain in data privacy, algorithmic transparency, and the authenticity of emotional responses, which may undermine user trust and limit long-term engagement.

Conclusion

The present research proposes a Holistic AI Care Design that emphasizes the integration of multiple factors, including user needs, AI personalization, privacy, and community building, in app design. It also incorporates usability, user engagement, and ethical considerations into the evaluation of AI-assisted mental health apps. This research underscores the importance of interdisciplinary approaches in advancing digital health solutions, offering valuable insights for developers and healthcare practitioners aiming to optimize user experience and therapeutic efficacy.

Keywords: human-computer interaction, healthcare, AI care design, mental health, interface design

1. Introduction

Digital health technologies, especially artificial intelligence (AI), are emerging as a transformative force in the field of healthcare. As the global mental health crisis intensifies, the growing prevalence of mental health disorders is placing unprecedented pressure on healthcare systems worldwide. 1 Traditional face-to-face counseling and treatment models are proving inadequate, falling short in meeting the immense demand for scalable, accessible, and personalized mental health interventions. In this context, AI and HCI have together begun to reshape how mental health services are designed, delivered, and experienced. These technologies introduce new paradigms for emotionally intelligent systems that resonate with user needs. 2 By using large-scale data, machine learning algorithms, and natural language processing, AI-assisted mental health applications can deliver interventions that dynamically respond to individual emotional states, behavioral patterns, and contextual factors. In parallel, HCI serves as the critical infrastructure for ensuring these tools are intuitive, usable, and emotionally engaging. The intersection of AI and HCI, therefore, holds promise for advancing mental health care, yet remains underexplored in academic literature, especially from a multidisciplinary and user-centered perspective. Current evaluation methods often neglect the interplay between interface design, personalization, emotional resonance, privacy, and community engagement. These gaps limit the capacity of digital tools to meet therapeutic goals while maintaining user trust and long-term engagement.

Specifically, while individual aspects of HCI, such as interface usability and AI personalization, have been studied in isolation within digital mental health research, the integrated examination of how these factors collectively shape user experience, therapeutic efficacy, and sustained engagement remains significantly underexplored. 3 Current evaluation frameworks typically focus on single dimensions such as clinical efficacy, usability, or user satisfaction without accounting for the interdependencies between interface design, AI-driven personalization, privacy and trust, feedback mechanisms, and community support features. This fragmented approach obscures the holistic user experience and limits our understanding of how technology can be optimized to support both clinical outcomes and user wellbeing.

Thus, the present research addresses the need for a comprehensive methodological framework that integrates multiple factors essential for designing and evaluating AI-assisted mental health applications. We propose a Holistic AI Care Design that foregrounds user-centered interface design, adaptive AI personalization, ethical data practices, active feedback mechanisms, and social connectivity. This interdisciplinary approach aligns with the principles of responsible innovation, emphasizing both technical performance and emotional relevance in mental health technologies. It also responds to critical limitations in existing evaluation paradigms, which often focus narrowly on technical accuracy or clinical efficacy, while overlooking the lived experiences of users, the interpretability of AI recommendations, or the psychosocial implications of digital engagement. 3 To ground this investigation, we conducted a case study on Xin Dao Diary, a Chinese AI-assisted mental health application that exemplifies the convergence of therapeutic content and social networking features. Launched in 2020, Xin Dao Diary has grown into an emotional support platform with more than two million users. It integrates AI-driven features such as mood tracking, personalized emotional reports, and virtual therapy sessions, alongside community-based functionalities that enable anonymous peer support and shared emotional expression. The app’s design philosophy bridges psychological self-help with participatory media environments, making it particularly relevant to both HCI and digital healthcare research. Its dual role—as both a mental health intervention and a digital community—invites critical examination of how interface, interaction, and emotional intelligence intersect to shape user experiences and outcomes.

This study seeks to explore how Xin Dao Diary supports emotional well-being through innovative design and user engagement strategies. Specifically, the study addresses four interconnected research questions: (1) How are principles of HCI implemented within the app to support users’ mental health experiences, particularly through interface design that facilitates emotional expression and routine self-reflection? (2) How do AI-driven features contribute to the delivery of personalized mental health interventions that respond dynamically to users’ emotional states and behavioral patterns? (3) How do user interactions and feedback influence the app’s continuous development and psychological efficacy? 4,5 (4) What implications does Xin Dao Diary offer for evaluating and designing AI-assisted mental health apps? These questions focus on the intersection of HCI and healthcare, emphasizing user experience, AI interaction, and mental health outcomes. In this sense, this integrated approach advances our understanding of how technology can simultaneously achieve clinical effectiveness and user-centered design excellence.

1.1 Human-computer interaction and AI healthcare design

HCI is a critical bridge between technology and user behavior in digital health interventions, shaping the effectiveness of healthcare systems. 6 Interface design and user experience affect the usability, acceptability, and safety of healthcare applications, thereby influencing their long-term effectiveness. 7 Intuitive navigation and visually coherent interfaces build user trust in mental health platforms and improve operational efficiency in integrated healthcare systems, enabling collaboration between patients and providers. Effective HCI design also supports health monitoring and timely emergency response. AI-assisted mental health applications, particularly those leveraging large-scale data, machine learning algorithms, and natural language processing, have demonstrated therapeutic efficacy in delivering personalized interventions that respond to individual emotional states and behavioral patterns. 1 Recent evidence supports the effectiveness of these approaches across multiple mental health conditions, 1 with particular success in specialized domains such as addiction care monitoring 8 and personalized mental health interventions. 9 Despite its importance, Blandford underscores a persistent gap in the field, 10 noting that “insufficient integration of human factors insights into healthcare technology design remains widespread.” Suboptimal interface design carries tangible risks: studies demonstrate that poorly conceived user interfaces in complex healthcare systems can directly compromise patient safety, even leading to life-threatening errors. 11 Consequently, optimizing user interface (UI) design is imperative to strengthen interactions between healthcare technologies and their users. 12 Notably, reliance on prior knowledge among trained professionals fails to mitigate errors stemming from flawed system architecture, emphasizing the non-negotiable role of usability in clinical adoption.
Usability and user experience jointly dictate the real-world implementation of healthcare technologies. Enhanced usability correlates strongly with increased device utilization, while superior user experience drives iterative improvements in system design. 10 Ultimately, the synergy of human-centered user experience and robust technical systems forms the cornerstone of achieving measurable healthcare outcomes, as evidenced by longitudinal studies on technology-integrated care models. 13 However, implementing AI-assisted mental health applications presents significant challenges that must be carefully addressed. Surveillance fatigue remains a critical concern, as continuous monitoring and data collection may lead to user discomfort and reduced engagement. 14 Additionally, maintaining appropriate therapeutic boundaries between AI systems and users is essential, as unclear boundaries can undermine the therapeutic relationship and user trust. 15 The effectiveness of AI interventions also depends on robust feedback loops that enable continuous system refinement based on user outcomes and clinical effectiveness. 16 These implementation challenges underscore the importance of integrating rigorous HCI principles with ethical considerations to ensure that AI-assisted mental health applications enhance rather than compromise user well-being and therapeutic efficacy.

HCI principles promote sustained user engagement in healthcare technologies by minimizing cognitive demands and integrating emotionally resonant design. One of the key strategies involves leveraging gamification elements to incentivize consistent use, as demonstrated in behavioral health interventions. 17 Given that poor usability remains a critical barrier to adoption, healthcare applications must prioritize intuitive, user-friendly interfaces that balance simplicity with clinical efficacy. Gamified components, such as progress tracking and reward systems, not only foster positive psychological engagement but also mitigate frustration and cognitive overload during system interactions. Simultaneously, real-time interactive features, such as mood diaries and symptom trackers, amplify users’ capacity for self-reflection. These tools exemplify the broader HCI imperative for information exchange in smart mobile medical systems. At the intersection of HCI and emotional health research, self-tracking enables users to monitor activities, dietary habits, and mood fluctuations, while smartphone or web-based platforms facilitate real-time condition management, thereby reinforcing self-regulatory behaviors. 18 Central to effective HCI design is the systematic analysis of users’ environmental contexts and device interactions, which generates activity logs and historical data to underpin reflective practices. However, a critical consideration lies in balancing functional sophistication with usability: over-engineered tools risk overwhelming users, as evidenced by Ben-Zeev et al., 9 who observed that functionally redundant mental health apps exacerbated cognitive strain in anxious populations, resulting in reduced adherence. 
Consequently, the central challenge in healthcare HCI is transforming technological complexity into seamless, intuitive experiences—often described as “senseless services” in user-centered design—that support effortless engagement without compromising clinical utility.

The integration of AI into digital health interventions introduces opportunities for personalization and adaptability in mental healthcare. AI-driven personalization represents a paradigm shift, enabling the creation of tailored treatment plans that account for individual behavioral patterns, cognitive profiles, and real-time physiological data. 1 Such precision in intervention design improves therapeutic efficacy by aligning strategies with patients’ unique challenges and strengths, thereby enhancing treatment adherence, shortening recovery timelines, and elevating user satisfaction. In addiction care, AI systems demonstrate particular utility through continuous behavioral monitoring. By analyzing triggers, stressors, and substance use patterns, these systems can identify high-risk scenarios in real time, alerting both clinicians and patients to imminent relapse risks and enabling proactive adjustments to treatment protocols. 8 This capability underscores adaptability as a critical metric for evaluating AI’s impact in mental health interventions. Machine learning algorithms dynamically assess user progress and iteratively refine therapeutic approaches, circumventing the inefficiencies of traditional trial-and-error methods while optimizing outcomes through data-driven iteration. Notably, AI enhances evidence-based practices such as cognitive behavioral therapy (CBT) by adapting interventions to users’ evolving needs. For instance, algorithms can detect users’ cognitive patterns and modulate therapeutic content to target these traits specifically, thereby improving CBT’s relevance and effectiveness across diverse populations. Concurrently, AI-enabled continuous monitoring serves as a preventive tool, detecting early indicators of mental health deterioration—such as deviations in sleep, activity levels, or speech patterns—in conditions like depression and bipolar disorder. 
These alerts empower timely clinical interventions while fostering longitudinal care through continuous data collection. The aggregation of longitudinal datasets further enriches therapeutic decision-making, offering clinicians actionable insights into patients’ mental health trajectories and guiding personalized intervention strategies. Beyond clinical settings, AI-powered tools such as chatbots and virtual therapists can also scale mental healthcare delivery. By providing timely and accessible support, these solutions expand access to underserved populations, reduce treatment costs, and alleviate systemic shortages in mental health resources. 19 Thus, AI not only helps bridge gaps in service delivery but also broadens access to patient-centered care.

Despite these advantages, the integration of AI into mental healthcare faces two key challenges: ethical dilemmas and barriers to user acceptance. Data privacy concerns remain paramount due to the sensitive nature of mental health data. Robust safeguards for patient information are imperative, particularly as persistent anxieties about large-scale data surveillance undermine user confidence. 2 Meanwhile, AI’s inherent limitations in human empathy and contextual understanding, qualities essential to therapeutic relationships, risk eroding trust. Surveillance fatigue, poor management of therapeutic boundaries, and unintended disclosure of non-clinical personal data may further weaken users’ trust, a critical concern given that trust is foundational to effective digital mental health interventions. Another concern is the opacity of algorithmic decision-making, often termed the “black box” problem, which compromises transparency and fuels skepticism about AI-driven conclusions. When AI influences clinical decisions, accountability becomes ambiguous, particularly in crisis scenarios where questions arise about ultimate responsibility for harmful outcomes. 14 Thus, AI algorithms must address concerns about transparency and algorithmic decision-making to build user trust. Explainable Artificial Intelligence (EAI), which aims to make AI decision-making processes transparent and interpretable to users, has emerged as a critical solution to enhance credibility, accountability, and trust in high-stakes applications such as mental health interventions. Moreover, the implementation of anthropomorphic AI designs requires cautious calibration, as excessive human-like features may provoke resistance to perceived mechanization of care. 15 Thus, the successful adoption of AI in mental healthcare hinges not only on technological advancement but also on developing user-comprehensible interpretive frameworks.
Addressing these challenges demands proactive ethical governance and regulatory structures. Ensuring AI tools align with both ethical standards and therapeutic objectives requires participatory frameworks that embed clinicians, patients, and policymakers as co-designers throughout the technology lifecycle. Co-design approaches with end-users have been shown to enhance designer empathy and foster more responsive, culturally appropriate solutions. 20 Participatory frameworks, which involve users in the design and refinement process, have similarly demonstrated their effectiveness in fostering user trust and ensuring that digital mental health applications align with user needs and preferences. 16 By incorporating these co-design and participatory principles into the development of AI-assisted mental health applications, developers can ensure that technology solutions genuinely address user needs while maintaining high standards of safety and efficacy.

User feedback serves as a central catalyst for the iterative optimization of digital health platforms, underpinning collaborative efforts among stakeholders. As a cornerstone of HCI and user-centered design (UCD) methodologies, feedback mechanisms critically inform the development and refinement of these systems. 6 Distinct from alternative approaches, UCD prioritizes users throughout the HCI lifecycle, enabling the creation of intuitive tools that align with clinical workflows and user expectations. By systematically integrating user insights, digital health technologies can enhance engagement while ensuring compatibility with existing practices—a prerequisite for sustained adoption. 16 In conversational interfaces such as chatbots, precise interpretation of user inputs and contextually appropriate responses are essential for maintaining seamless engagement, directly influencing utilization rates and long-term retention. 21 This principle holds heightened significance in mental health applications, where user feedback proves indispensable for identifying unintended adverse effects of technologies like behavioral tracking. For instance, Sanches et al. emphasize the necessity of designing tracking systems that mitigate discouragement by carefully framing negative data and providing supportive user interactions. 22 Co-design practices that engage service users and caregivers as domain experts are vital for developing ethical and effective mental health AI. Feedback loops bridge the innovation-acceptability divide, ensuring technologies resonate with user preferences and therapeutic goals. However, scholars caution against overreliance on self-reported user needs, as individuals may lack awareness of their latent requirements or the technical possibilities available. 
6 Consequently, improving digital health applications demands a dual focus: fostering end-user participation while employing agile development frameworks that enable continuous product iteration informed by empirical evaluation.

The adaptability of digital health platforms can be enhanced through iterative design processes that integrate real-world user insights. For instance, a study analyzing AI-driven Conversational Agents (CAs) in mental health, which are designed as “diary-like” applications, revealed communication breakdowns as a primary contributor to user disengagement. 23 Such breakdowns occur when conversational agents inadequately interpret or contextually respond to user inputs, thereby diminishing intervention efficacy and fostering negative experiences. These findings underscore the imperative to refine conversational agents’ capabilities in natural language processing and response generation, directly informing iterative design improvements. In digital mental health, user feedback derived from mixed-methods approaches, such as triangulating qualitative data (interviews or questionnaires) with quantitative engagement metrics, provides critical insights into technology integration and its psychosocial impacts. Researchers have observed new user behaviors, including clinicians repurposing online interventions as adjunct therapeutic tools or patients using mood diaries to raise sensitive topics during therapy. 21 These emergent usage patterns and unanticipated challenges constitute actionable feedback for adaptive system redesign. Stakeholder-centered design methodologies further illuminate nuanced user requirements and contextual constraints. Co-designing prototypes with end-users cultivates designer empathy, ensuring solutions align with lived experiences while preemptively addressing usability barriers. By embedding user perspectives into development cycles, such participatory frameworks not only enhance experiential outcomes but also foster trust in digital interventions—a prerequisite for sustained adoption.

HCI lays the foundation of user experience, AI technology empowers personalized service, and user feedback drives continuous optimization of the system. The core of the synergy of the three lies in the “human-centered” design philosophy: from the ease of interface interaction, to the interpretability of AI decision-making, to the initiative of user participation, all of which need to be centered on the emotional needs and cognitive characteristics of users. Despite these contributions, current methodological approaches to AI-assisted mental health app design and evaluation often remain fragmented and limited in scope. Many studies prioritize either clinical efficacy, algorithmic functionality, or usability in isolation, without sufficiently accounting for the interconnected nature of users’ psychological, technological, and social experiences. This disciplinary siloing results in evaluation tools that overlook how emotional resonance, interface design, AI personalization, privacy expectations, and social dynamics collectively shape user engagement and therapeutic effectiveness. Moreover, critical concerns—such as trust in algorithmic decision-making, ethical data handling, and the sustainability of user participation—are frequently underexplored. These gaps highlight the pressing need for an integrative evaluative framework that brings together multiple interdependent factors: HCI principles, usability and user experience, AI-driven personalization, privacy and trust safeguards, feedback loops, and mechanisms for social support and community building. Such a comprehensive, interdisciplinary approach ensures that mental health technologies are not only functionally sound but also emotionally relevant, ethically responsible, and socially resonant, laying a stronger foundation for effective, user-centered digital mental health interventions.

Building on these theoretical foundations, our experimental investigation focuses on five interconnected HCI elements that represent critical components of effective digital mental health interventions. First, we examine interface design and usability through analysis of navigation structures and affordances that facilitate user interaction. Second, we assess emotional resonance through design elements, including soft music, visual cues, and pre-set emotional tags that enhance the emotional support experience. Third, we investigate personalization through AI features such as adaptive recommendations and tailored interactions that respond to individual user needs. Fourth, we evaluate feedback mechanisms and iterative design processes that enable continuous refinement based on user input. Fifth, we analyze community engagement features that foster peer support and social connection. By systematically examining these elements through our mixed-methods approach, we can provide comprehensive evidence for how thoughtful HCI design, combined with AI capabilities and community features, creates effective digital mental health interventions.

2. Methods

This research examines Xin Dao Diary, an emotional healing app in China that doubles as a social media platform. Launched in 2020, Xin Dao Diary (Figure 1) combines therapeutic content (e.g., mindfulness exercises, AI counseling), user-generated support communities, and social networking features, and has over 2 million users. 24 Xin Dao Diary is characterized by its integration of social media attributes with emotion management features. The platform facilitates this engagement by enabling users to document their emotional experiences in digital diaries, which the app then analyzes to generate personalized emotional reports. This study was conducted in mainland China from September to November 2024. It combines (1) a researcher-led technical walkthrough of the production app, (2) a two-week remote diary study with 11 university students, and (3) a sentiment analysis of public user reviews from the Xiaomi App Store. All activities were carried out remotely on participants’ own smartphones; no clinical sites were involved. The walkthrough is a researcher-led, HCI-focused audit intended to systematically examine the app’s environment of expected use and its interface materiality and constraints; it is designed to complement, rather than replace, the real-user data from the diary study and sentiment analysis. Because Xin Dao Diary is a mature platform launched in 2020 with over 2 million users, our study is not a pilot of the app but a case-study evaluation of app usage.

Figure 1.

Figure 1.

Push notification of Xin Dao Diary (English text was translated by corresponding author).

2.1. Walkthrough method

The walkthrough method, a standard HCI approach, is effective for systematically identifying usability problems. It is valued for its structured approach.20,25 The method requires few resources and can be conducted by designers, developers, or researchers and thus it is cost-effective for iterative design and evaluation cycles.25,26 The method has been successfully adapted for various application areas, including mobile apps, health applications, and domain-specific tools, and can be combined with other techniques for more comprehensive results.25–27 This study employs the walkthrough method to investigate the design, interface, and features of the Xin Dao Diary app from an HCI perspective. 28 This approach enables a comprehensive analysis of user interactions and interface usability. Specifically, we examined the environment of expected use by looking at the vision and operating model. We analyzed the strategies employed by the app provider to manage and control user interactions in order to maintain the app’s operational model and achieve its vision. We also examined information regarding user data applications, privacy, and safety. 28 Subsequently, throughout the technical walkthrough process, we adopted the perspective of a user, interacted with the app’s interface, navigated through screens, pressed buttons, and investigated the menu options. In this phase, we generated detailed field notes. We focused on the app’s materiality, such as the actions it demands and directs users to perform, and considered how users might interpret these as affordances or constraints. 28 To be specific, we examined user interface arrangement, functions and features, textual content and tone, and symbolic representation.

By navigating the app as users, researchers evaluated the interface’s usability and its role in facilitating effective human-computer interaction. This method allowed for the identification of design elements, such as emotion logging and pre-set tags, which enhance user engagement and emotional expression. The analysis highlighted how these features align with user expectations for mental health care.

To address the limitations of the walkthrough method, such as its lack of analysis regarding user content, activities, or attitudes, we incorporate supplementary data from diary studies and app reviews. This combination provides a richer understanding of user attitudes and activities, enhancing our evaluation of the app’s impact on mental health outcomes and its role in AI-assisted mental health care.

2.2. Diary studies

We recruited 11 university students (2 male and 9 female, aged 18-22) to participate in a two-week diary study (Table 1). Participants were recruited through purposive sampling based on two key criteria: (1) they demonstrated genuine interest in and need for mental health support applications, and (2) they had prior experience using other mental health or wellness apps, but had never previously used Xin Dao Diary. This sampling strategy ensured authentic, informed engagement with the study while enabling participants to provide comparative perspectives based on their existing app experience. Participants were instructed to engage with the app at least twice daily, documenting their experiences and capturing screenshots when necessary. Each week, they completed a user report detailing their app usage frequency, recent emotional states, and specific interactions related to emotion regulation.

Table 1.

Demographic information of diary study participants.

Participant ID Gender Age Usage motivation Perceived AI effectiveness Community engagement
P01 Female 18 Emotional ventilation, connection Partially effective Active
P02 Female 18 Emotional regulation, expression Ineffective Active
P03 Female 19 Digital journaling alternative Effective Moderate
P04 Female 19 Browsing content, observation Ineffective Passive
P05 Male 19 Self-reflection, AI interaction Partially effective Moderate
P06 Female 20 Peer support, emotional comfort Effective Active
P07 Female 20 Self-awareness, emotional clarity Effective Active
P08 Female 20 Shared emotional experience Effective Active
P09 Female 21 Mental stress release Partially effective Active
P10 Male 21 Curiosity, comparison Unclear Passive
P11 Female 22 Emotional tracking, reflection Effective Moderate

In the diary study, participants recorded their daily experiences with the app, including their expectations and perceptions of the app’s empathetic qualities. They also noted how closely they adhered to the app’s guidance and instructions, and reported on their sense of autonomy while using the app. At the end of the study, participants provided an overall evaluation of their experience, assessing the app’s influence on their emotions both online and offline.

Prior to the study, participants were briefed on its purpose, procedures, and privacy protection measures, and provided informed consent. Given the sensitive nature of mental health topics, participants were informed of the possibility of psychological discomfort and were free to withdraw from the study at any time. To ensure privacy, all data were anonymized. Participants received a compensation of 100 RMB (approximately 13 USD) upon completing the study.

The diary studies provided valuable insights into the app’s healthcare impacts by capturing users’ emotional experiences and interactions over time. Through detailed personal accounts, we gained an understanding of how the app affects users’ emotional well-being and its potential role in emotion regulation. Participants’ detailed accounts of their interactions with AI features, like the “Forest Healing Room,” provided qualitative insights into how these elements offer personalized mental health interventions. The study captured user experiences with AI-driven sessions that employ psychological techniques (e.g., CBT, Dialectical Behavior Therapy (DBT)), illustrating how these contribute to user satisfaction and emotional well-being.

2.3. Sentiment analysis

Using a web crawler, we collected 9,346 reviews of Xin Dao Diary from the Xiaomi Store. After excluding irrelevant comments and emojis, we processed 9,171 reviews spanning August 2021 to August 2024 using sentiment analysis in Atlas.ti. This approach allowed us to systematically evaluate user satisfaction and emotional responses to the app. By examining the sentiment in user feedback, researchers can identify trends and patterns that reflect the app’s impact on mental health outcomes. Positive feedback might indicate beneficial interactions and successful outcomes, while negative feedback could suggest areas where the app needs improvement.
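The study itself coded reviews in Atlas.ti; purely as an illustration of the general lexicon-based approach to sentiment tagging, the sketch below classifies reviews into the three categories reported in Table 2. The word lists and threshold are hypothetical placeholders, not the coding scheme used in the study.

```python
# Minimal lexicon-based sentiment tagging of app reviews.
# Illustrative only: the study used Atlas.ti; the word lists here
# are hypothetical placeholders, not the study's coding scheme.

POSITIVE = {"healing", "comforting", "helpful", "warm", "love"}
NEGATIVE = {"crash", "ads", "annoying", "useless", "leak"}

def classify(review: str) -> str:
    """Tag one review as positive, negative, or neutral by lexicon hits."""
    tokens = review.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def tally(reviews):
    """Aggregate counts per category, as in the study's Table 2."""
    counts = {"positive": 0, "negative": 0, "neutral": 0}
    for r in reviews:
        counts[classify(r)] += 1
    return counts
```

In practice, tool-assisted qualitative coding also involves human review of ambiguous cases, which a bare lexicon cannot capture.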

3. Results

3.1. User interface and experience

Xin Dao Diary’s interface and features effectively facilitated user interaction, emotional tracking, and engagement, creating a healing and supportive environment that aligned with user expectations for emotional expression and well-being. The app’s design promoted interaction through its digital diary feature, which allowed users to log their feelings using pre-set emotional tags. This structure not only helped users articulate their emotions but also organized those emotions systematically, promoting mental health care. Specific interface elements, such as push notifications (Figure 1), soft music, and inspirational quotes, further enhanced user engagement by reminding them to document their emotions and providing comfort when needed.

For instance, when a user was not actively using the app, it sent push notifications prompting them to record their emotions. Upon the user’s next login, a feedback interface appeared, asking if they had new feelings to share. The app might also push comforting content, like soft music or inspirational quotes, to offer emotional support. Within the diary feature, users were encouraged to document their emotions using pre-set tags, which helped them better understand and articulate their feelings. The app employed algorithms to match users’ emotion records with similar ones, displaying these on their page. When a user clicked on another person’s mood record, they were directed to a comment interface, where the app prompted them to “warmly respond and empathize.” Additionally, users had the option to view calming images or watch short videos related to emotional healing during the session.
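The matching behavior described above can be sketched as indexing mood records by their pre-set emotional tag and surfacing other users' records under the same tag. The function and field names are our own illustration; Xin Dao Diary's actual matching algorithm is not publicly documented.

```python
# Sketch of matching a user's mood record to others with the same
# pre-set emotional tag. Names are hypothetical; the app's actual
# matching algorithm is not publicly documented.
from collections import defaultdict

def build_tag_index(records):
    """records: list of dicts like {"user": ..., "tag": ..., "text": ...}."""
    index = defaultdict(list)
    for rec in records:
        index[rec["tag"]].append(rec)
    return index

def similar_records(index, user, tag, limit=3):
    """Return up to `limit` records sharing `tag`, excluding the user's own."""
    return [r for r in index.get(tag, []) if r["user"] != user][:limit]
```

A production system would presumably also rank candidates (e.g., by recency), but tag-based grouping captures the core idea of showing users records that mirror their current emotion.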

Beyond notifications, the app implements soft control mechanisms that structure user activity and access. These include time-gated resource collection, staged capability cultivation, episodic event triggers, and community access barriers. For instance, “Daily Heart Collection and Tasks” is a time-gated loop in which users collect “hearts” at fixed intervals, complete daily tasks, and level up; higher levels yield more hearts per unit time. This progression scaffolds continued engagement while controlling users’ gains. Real-time interactions on the “Floating Island” are gated by level and other criteria, creating an entry threshold for new users.
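The time-gated loop described above can be illustrated with a toy model in which higher levels shorten the collection interval, so more hearts accrue per unit time, and community access is gated behind a level threshold. All interval lengths and thresholds below are hypothetical illustrations, not the app's actual values.

```python
# Toy model of the time-gated "heart" collection loop and the
# level-gated "Floating Island" access described above. All numbers
# are hypothetical illustrations, not the app's actual parameters.

def collection_interval_minutes(level: int) -> int:
    """Higher levels collect hearts more often, floored at 10 minutes."""
    base = 60  # assumed minutes between collections at level 1
    return max(10, base - 5 * (level - 1))

def hearts_per_day(level: int) -> int:
    """Hearts obtainable per day if collected at every interval."""
    return (24 * 60) // collection_interval_minutes(level)

def can_enter_floating_island(level: int, min_level: int = 5) -> bool:
    """Real-time community access gated behind an assumed level threshold."""
    return level >= min_level
```

The point of the model is the shape of the incentive: gains grow with level but are capped, which scaffolds continued engagement while controlling how fast users can accumulate resources.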

The app does not implement a built-in pre–post outcome comparison for video or related content. Accordingly, our assessment relied on diary-study self-reports about emotion regulation and the sentiment patterns observed in app-store reviews. User feedback highlighted a positive reception of these features. Sentiment analysis revealed that users expressed positive emotions when evaluating the app (Table 2).

Table 2.

Result of sentiment analysis generated from APP reviews.

N %
Negative emotions 314 3%
Positive emotions 7022 75%
Neutral emotions 2083 22%
Total 9419 100%

With 75% of the reviews reflecting positive emotions, users were satisfied with the app, appreciating its ability to provide timely solutions to emotional challenges, a healing environment, and anonymous social interactions. The following quotes are extracted from the diary studies. Participants were instructed to document their daily experiences with the app, and these data were analyzed to identify themes related to user satisfaction, emotional impact, and perceived therapeutic benefit. Participant identifiers (P01-P11) are used to preserve anonymity while enabling traceability.

Participant P04 said, “I have certain expectations for the app: I hope it can help me regulate negative emotions and become my ‘emotional trash can,’ allowing me to express many ‘unspeakable’ little secrets. These expectations were partially fulfilled. Specifically, the app indeed provided me with a space to vent my emotions and document my life. Additionally, the posts I shared received many comments from ‘island friends’ (other users). They gave me comfort and empathy, making me feel cared for by strangers.” Similarly, P07 acknowledged, “Before using the app, I hoped it could truly replace a traditional paper diary, as the app might make regulating emotions more convenient and effective. The app could be opened at any time, allowing for instant recording without the need for pen and paper, making recording emotions much more accessible. Upon entering the app’s diary recording interface, a character would inquire whether my mood has improved. If not, the app would offer me some suggestions for emotional improvement. This feature compensated for the lack of interaction in traditional diaries, where the absence of communication can diminish the therapeutic effect. As a result, the app served better as a companion for emotional support.” The app’s promotional video also highlighted its strengths in providing a healing environment to meet users’ expectations.

User feedback highlighted satisfaction with the app’s structured yet flexible environment for emotional expression. Users such as P04 noted the convenience of being able to record emotions instantly, enhancing the therapeutic effect compared to traditional diaries. However, some users, such as P01, raised concerns about privacy, suggesting the need for more robust privacy settings to maintain user trust and satisfaction.

3.2. AI-driven personalization in mental health care

The AI features within Xin Dao Diary played a crucial role in delivering personalized mental health care by offering tailored recommendations and interventions. A standout feature was the app’s AI-powered “Forest Healing Room,” where users could engage in a 10-minute “psychological therapy” session with an animal avatar powered by a large language model (Figure 2). During these sessions, the animal therapist actively inquired about the user’s recent emotional state and employed psychological techniques, such as CBT, DBT, or Motivational Interviewing, to help regulate emotions when appropriate.
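One common way to implement such a session is to condition a large language model on a persona-and-technique system prompt. The sketch below shows this general pattern; the prompt wording, technique summaries, and function names are our own assumptions, not Xin Dao Diary's actual implementation, which is not public.

```python
# Illustrative construction of a system prompt for an "animal therapist"
# avatar session. Persona text, technique summaries, and names are
# hypothetical; the app's actual prompts are not publicly documented.

TECHNIQUES = {
    "CBT": "help the user identify and reframe unhelpful thoughts",
    "DBT": "guide the user through distress-tolerance and acceptance skills",
    "MI": "use open questions and reflective listening to build motivation",
}

def build_session_prompt(avatar: str, technique: str) -> str:
    """Assemble a system prompt for one 10-minute healing-room session."""
    if technique not in TECHNIQUES:
        raise ValueError(f"unknown technique: {technique}")
    return (
        f"You are {avatar}, a gentle animal therapist in the Forest Healing Room. "
        f"Begin by asking about the user's recent emotional state. "
        f"When appropriate, {TECHNIQUES[technique]}. "
        f"Keep the session to about 10 minutes and respond with warmth."
    )
```

The returned string would then be passed as the system message to whichever language-model API the app uses, with the user's diary context appended as conversation history.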

Figure 2.

“Forest Healing Room,” an AI-driven psychological therapy session at Xin Dao Diary (screenshot of XinDao).

Users reported that these sessions provided valuable insights and aided in emotional regulation. For instance, P08 shared her experience with the AI animal therapists, who helped her navigate negative emotions. She added, “I feel the AI animal therapists grasp my emotions and perspectives. Though they may not fully empathize, their programmed companionship occasionally enlightens and aids me through challenges. With their help, I will not be constantly stuck in negative emotions.”

AI-driven recommendations were generally perceived as effective, with users appreciating personalized content that resonated with their current emotional states. However, trust in AI varied among users, with some expressing skepticism about the authenticity of emotional records generated by the algorithms. Several participants openly admitted to having low trust in the app, feeling hesitant to post content involving personal privacy because of the lack of robust privacy settings and the high likelihood that their published diaries could be seen by others. P04 stated, “I feel hesitant to post personal content because the app lacks privacy settings, and it seems like anyone could view my diary.” Similarly, P11 held a critical view of the application, including the AI animal therapists, questioning the authenticity of emotional records, the rationality of emotional quantification, and the actual effectiveness of emotional regulation. These concerns highlighted the need for continuous improvement in AI personalization to enhance user trust and the effectiveness of mental health interventions.

3.3. User interaction and feedback

Xin Dao Diary’s responsive adaptation to user feedback led to the integration of innovative features and community-driven activities, fostering a supportive environment that enhanced users’ mental health and sense of belonging. The platform actively evolved based on user input, as evidenced by the introduction of features like AI animal therapists, which were added following user suggestions. Additionally, community activities that gave a voice to people with depression were implemented to address users’ suggestions and their need for psychological counseling.

The app’s adaptability not only improved the user experience but also garnered positive feedback. One user on social media remarked, “As a ‘veteran’ user of Xin Dao Diary, I have seen many aspects of the app. The most touching aspect for me is the attitude of the app developer. The app is the most conscientious app I have used so far.” The user further illustrated that when the online community “Floating Island” faced operational issues, “the app management team promptly adjusted and came up with plans. They are responsible and do far more than we think.”

Social networking features played a pivotal role in fostering a supportive community within the app. The algorithmic matching of users experiencing similar emotions encouraged mutual support and empathy, creating a connected environment that enhanced mental health outcomes. For example, in Figure 3, users sharing the same emotion, such as feeling sad, were shown on the same interface, and a user could click the emotion icon to interact with another user.

Figure 3.

Social networking feature blended with emotion at Xin Dao Diary (English text translated by the corresponding author; screenshot from the author’s phone).

Through the app, users developed a sense of self-identity by considering their emotional needs and hoping the app could satisfy them. Specifically, using the app enhanced users’ self-awareness and self-efficacy, while also fostering a sense of social belonging. For example, participant P06 noted that the app allowed her to express her feelings, which strengthened her self-awareness and provided a sense of satisfaction.

Users also gained a sense of social belonging. Participant P08 shared, “I really enjoy reading letters from other users; most of the letters are sincere long texts, and the ones I receive are from peers facing similar dilemmas. Waiting for the letters to arrive makes me cherish them even more, and I keep all the letters I get. For me, having genuine help documented gives me more motivation to support more people.” Through sincere communication with peers, resonance with shared struggles, anticipation and appreciation for incoming letters, and the act of saving letters, users developed a sense of identification with and belonging to the community, which inspired their internal drive to support others.

Users’ self-efficacy was enhanced when they discovered they could help others and accomplish tasks within the app. Participant P07 shared her experience of comforting a user who was struggling to adapt to the stress of senior high school through the app’s messaging function. The user’s gratitude for her response gave P07 a sense of self-efficacy in providing psychological support to others.

Users like P04 and P10 shared experiences of receiving virtual empathy and support from other users, reinforcing the community’s positive atmosphere. Beyond verbal communication, users employed emojis and icons to convey understanding of and resonance with others’ emotions. P04 recounted, “After posting a diary entry on feeling emotionally drained, strangers (other users) ‘hugged’ me virtually and reassured me that things would improve. In other entries, strangers express shared feelings, indicating a moment of shared emotions, allowing others to empathize or project themselves into my narrative, thus experiencing similar emotions.” Similarly, P10 mentioned, “In the app, we comfort and support each other, virtually lighting a bonfire to give a hug. This interaction makes me feel a genuine connection with others.” Through these specific acts of shared vulnerability and virtual comfort, users actively built the empathetic, positive atmosphere that defines the app’s community.

4. Discussion

4.1. Development of the holistic AI care design framework

The Holistic AI Care Design framework emerged systematically from our mixed-methods analysis. The walkthrough method identified specific HCI elements implemented within Xin Dao Diary’s interface, documenting design features such as pre-set emotional tags, push notifications, and visual affordances. Concurrently, diary studies with 11 participants revealed rich user experiences related to interface design, AI personalization, privacy concerns, and community engagement features over the two-week study period. By synthesizing findings across all three data sources, we identified four interconnected dimensions that collectively characterize effective AI-assisted mental health design: user-centered interface design, AI-driven personalization, privacy and trust, and community support features. Each pillar of the framework was validated by specific findings from our methodology: interface design elements identified through the walkthrough method demonstrated measurable improvements in usability; personalization features revealed in the diary study correlated with higher user satisfaction; privacy concerns documented by participants informed the trust and security dimension; and community features emerged as critical enablers of user engagement and self-efficacy. Furthermore, sentiment analysis of app reviews complemented these qualitative findings, quantifying overall user satisfaction levels. This data-driven approach ensured that the framework directly reflects the real-world experiences and needs of users, rather than representing theoretical constructs disconnected from actual implementation.

4.2. Framework components

The research findings informed a Holistic AI Care Design (Table 3), an approach for designing and evaluating AI-assisted mental health apps that emphasizes user-centered design, personalization, privacy, feedback, and community support.

Table 3.

Holistic AI care design.

Element Requirement
Design Framework User-Centered Design Describe strategies for incorporating user needs, preferences, and feedback into the design process.
AI Personalization Outline methods for implementing AI-driven personalization and adaptive learning.
Privacy Concerns Detail approaches for ensuring data privacy and compliance with regulations.
Community Support Discuss how to integrate community features and peer support into the design.
Evaluation Criteria Usability Usability is a critical determinant of app adoption and sustained use. Evaluation criteria for usability include effectiveness, efficiency, and satisfaction,29,30 appropriateness/usefulness, and aesthetics.31 Effectiveness can be assessed by determining whether users could achieve core goals, such as emotion logging, regulation, reflection, and peer support. Efficiency is inferred from convenience indicators, such as guided prompts and push notifications, that reduce effort and time-on-task. Satisfaction can be measured through diary narratives and review sentiment, capturing users’ feelings such as comfort, empathy, and perceived companionship. Appropriateness/usefulness is judged by fit to user needs. Aesthetics can be evaluated via user reactions to elements and content such as calming visuals, soft music, and inspirational content that create a soothing experience.
Engagement User engagement describes how user engagement and retention will be measured. It is essential for effectiveness, particularly in health apps. Features such as self-monitoring, personalized content, reminders, and online communities are linked to higher engagement.32–34 Metrics include frequency and duration of use, interaction with features, and subjective satisfaction.32,33,35
Ethical Considerations Evaluation criteria for ethical considerations include beneficence (user-reported healing and satisfaction), non-maleficence (privacy-related hesitation, trust erosion, distress incidents), autonomy (use of visibility controls, self-efficacy gains), explicability/transparency (clarity of AI rationales, user understanding), privacy/data protection (granular settings, anonymization uptake, incident rates), and accountability (timely developer responses, corrective updates).36–38
Interdisciplinary Method Walkthrough methods, Diary studies, Sentiment analysis, etc.

The study on Xin Dao Diary underscores the pivotal role of user-centered design in AI-assisted mental health applications. A well-crafted interface can significantly enhance user engagement and emotional tracking by creating a soothing and supportive environment. For instance, incorporating soft music and inspirational quotes fosters a healing ambiance, while emotion tagging and intuitive navigation address diverse user needs. Personalization further amplifies user satisfaction by tailoring features such as social interactions and AI-driven therapy sessions. These interventions support emotional regulation, ensuring that users feel seen and supported.

Privacy and trust are fundamental to the success of AI-assisted mental health apps. Xin Dao Diary has faced challenges in this area, as some users expressed reluctance to share personal content due to concerns about privacy protections. A more transparent and ethical framework for user data usage and processing is necessary to address these concerns. For example, stronger privacy settings, including options for anonymization and selective content sharing, can build trust and encourage users to engage more openly with the app.

The app leverages AI-driven personalization to deliver unique emotional support experiences. One standout feature is the “Forest Healing Room,” where users interact with animal avatars for personalized support sessions. By employing large language models to analyze user emotions and behaviors, Xin Dao Diary tailors interactions in real time, integrating evidence-based techniques such as CBT and Motivational Interviewing. This adaptive learning ensures that interventions remain relevant and effective for each user, reinforcing the app’s role as a trusted mental health tool.

Community support is another essential element of Xin Dao Diary’s design. The app fosters a sense of belonging by connecting users with similar emotional experiences, promoting empathy and mutual support. Features such as the “Floating Island” and comment interfaces that encourage users to “warmly respond and empathize” create a supportive environment where individuals feel understood.

Evaluation criteria include usability, user engagement, and ethical implications. Usability focuses on interface design, accessibility, and user experience, with features like the digital diary and emotion tagging earning positive feedback for providing a structured, user-friendly environment. User engagement is tracked through usage frequency and community participation, with push notifications and algorithmic emotional support proving effective in sustaining interest. Evaluation criteria for ethical considerations encompass beneficence, assessed through user reports of healing and satisfaction; non-maleficence, reflected in privacy-related hesitation, erosion of trust, and distress events; autonomy, evidenced by use of visibility controls and gains in self-efficacy; explicability and transparency, measured by the clarity of AI rationales and user comprehension; privacy and data protection, evaluated via granular settings, uptake of anonymization, and rates of incidents; accountability, shown by prompt developer responses and corrective updates; stakeholder engagement, captured by the frequency and impact of feedback-driven changes; lifecycle and iterative improvement, indicated by feature evolution aligned with user input; risk identification, demonstrated through documented risks and mitigation actions; and metrics and monitoring, tracked with sentiment trends and usage indicators.

While anonymization processes and transparent policies address some concerns, ongoing user feedback and regular audits are crucial to mitigate biases, enhance privacy protections, and ensure responsible AI use.

Xin Dao Diary exemplifies the principles of Holistic AI Care Design by combining user-centered design, AI personalization, privacy, and community support. While the app demonstrates significant strengths in areas like personalization and emotional support, addressing privacy concerns and enhancing ethical standards remain critical for building user trust and ensuring its long-term success (Table 4).

Table 4.

Mapping of HCI elements, evaluation methods, and outcomes.

HCI element(s) Evaluation method(s) Outcome (Improved usability & satisfaction)
Pre-set Emotional Tags Walkthrough + Diary Studies Enabled quick mood tracking and emotional awareness
Push Notifications Diary Studies + Sentiment Analysis Increased engagement; users appreciated timely, empathetic reminders
Soft Music & Inspirational Quotes Diary Studies + Sentiment Analysis Created supportive ‘healing environment’; reduced user stress
Feedback Interface & Check-in Walkthrough + Diary Studies Enhanced continuous emotional articulation and engagement
Algorithmic Emotional Matching Diary Studies + Sentiment Analysis Connected users with similar experiences; fostered belonging
Comment Interface & Empathy Prompts Diary Studies Enabled peer support; users received comfort and validation
Calming Visual/Video Content Diary Studies Provided therapeutic resources; enhanced healing experience
AI Forest Healing Room & Avatars Diary Studies + Sentiment Analysis Delivered personalized psychological interventions
Personalized Emotional Reports Walkthrough + Diary Studies Enabled self-awareness and tracking of emotional patterns
Community Features & Virtual Support Diary Studies + Sentiment Analysis Enhanced self-efficacy and genuine connection

5. Conclusion

This study examined the role of Xin Dao Diary, an AI-assisted mental health platform, in enhancing emotional well-being through innovative design and user engagement strategies. With a mixed-methods approach, including walkthrough methods, diary studies, and sentiment analysis of user feedback, we explored how digital interfaces can facilitate effective mental health care. Our findings reveal that intuitive interface design and personalized AI interventions are associated with higher user satisfaction and self-reported emotional health improvements. The app cultivates an emotionally supportive user experience through a combination of minimalist design, emotionally resonant features, and adaptive support mechanisms. Specific elements—such as pre-set emotional tags, timely push notifications, calming audio-visual stimuli, and AI-generated avatars—assist users in articulating and regulating emotions more effectively than traditional self-help methods. Moreover, the app’s community-oriented features, including anonymous diary sharing and peer-to-peer interaction, foster a sense of belonging and emotional reciprocity, thereby addressing the social isolation commonly associated with mental health challenges. Nonetheless, the study also highlights several critical concerns, particularly regarding data privacy, algorithmic transparency, and the authenticity of emotional responses, which may undermine user trust and limit long-term engagement. The present research proposes a Holistic AI Care Design that emphasizes the integration of multiple factors, including user needs, AI personalization, privacy, and community building in app design. It also incorporates usability, user engagement, and ethical considerations into the evaluation of AI-assisted mental health apps.
This research underscores the importance of interdisciplinary approaches in advancing digital health solutions, offering valuable insights for developers and healthcare practitioners aiming to optimize user experience and therapeutic efficacy.

Theoretically, this study advances the field of HCI and digital mental health by moving beyond fragmented evaluations of individual features toward a holistic, integrative framework. Existing literature has tended to examine discrete design elements—such as usability, personalization, or community support—in isolation, without accounting for their interdependencies or collective therapeutic impact. The Holistic AI Care Design framework proposed here bridges this gap by demonstrating how affective, functional, social, and ethical dimensions of app design mutually reinforce one another to produce sustained user engagement and emotional well-being. This contribution extends relational perspectives in HCI research by specifying how AI-mediated environments can be designed not merely to respond to users, but to co-constitute supportive emotional ecologies. In doing so, the framework offers a transferable conceptual vocabulary for evaluating AI-assisted mental health interventions across diverse cultural and clinical contexts.

From a practical standpoint, these findings carry concrete implications for app developers, mental health practitioners, and platform policymakers. Developers should prioritize not only technical functionality but also the emotional resonance and ethical architecture of their products, ensuring that personalization features respect user autonomy and data privacy. Practitioners can employ the Holistic AI Care Design framework to evaluate whether a given application is therapeutically appropriate and aligned with user needs. Policymakers and institutional stakeholders, particularly in China and other contexts where mental health stigma and service gaps persist, should recognize AI-assisted platforms as complementary tools that can extend care reach, while establishing regulatory safeguards around algorithmic transparency and informed consent. The present case study of Xin Dao Diary suggests that embedding these considerations into design from the outset—rather than retrofitting them after deployment—may help foster user trust and long-term engagement.

Several limitations of this study merit acknowledgment and point toward directions for future research. First, the diary study sample was relatively small and demographically homogeneous—comprising predominantly young female university students—which may limit the generalizability of findings to broader populations, including older adults, clinical users, or individuals from different socioeconomic backgrounds. Second, as a single-platform case study, the research reflects the specific design philosophy and user community of Xin Dao Diary; cross-platform comparative studies would strengthen the external validity of the Holistic AI Care Design framework. Third, the two-week study duration precludes conclusions about long-term engagement trajectories or the sustained therapeutic impact of repeated app use. Future research should employ longitudinal designs, incorporate clinically validated psychological outcome measures, and expand to diverse user populations and cultural contexts. Comparative studies examining how different implementations of the Holistic AI Care Design principles perform across varying app ecosystems would further advance evidence-based guidelines for AI-assisted mental health intervention development.

These findings inform a Holistic AI Care Design framework that integrates user-centered interface design, AI-driven personalization, privacy protections, active feedback mechanisms, and community support features to create effective mental health applications. This comprehensive approach is essential because these elements collectively build trust, ensure safety, and maximize the real-world impact of AI-assisted mental health tools while fostering individualized care. By systematically distinguishing between the specific HCI elements implemented, the rigorous evaluation methods employed to assess them, and the documented outcomes demonstrating improved usability and satisfaction, this research clarifies how thoughtful design choices translate into measurable improvements in user experience. The framework demonstrates that successful digital mental health interventions require not simply the presence of individual features, but rather their thoughtful integration, rigorous evaluation, and alignment with user needs and preferences. Human-centered design approaches help identify potential risks such as over-reliance on AI, ensuring that technology augments rather than replaces human judgment and empathy. Given the sensitive nature of mental health data, robust privacy protections, transparent data practices, and active user feedback remain critical for building trust and ensuring responsible AI deployment. Ultimately, this integrated approach ensures that technology enhances rather than undermines the quality and safety of mental health care, providing a model for future development of digital mental health interventions. As the global mental health crisis continues to evolve, such holistic and user-informed approaches are not only timely but essential for realizing the promise of digital well-being.

Acknowledgments

The authors would like to thank Lin Gao and Peng Ding for their help in data collection and discussion. Special thanks go to the participants in the diary studies. The authors used AI writing assistance tools for manuscript organization and language refinement. All data, analyses, and key intellectual contributions were conducted by the authors. The use of AI did not compromise the scientific integrity of the research.

Author biographies

Zongyi Zhang is currently an assistant professor in the School of Communication at Soochow University. His research interests include new media and digital culture, digital platform studies, and critical algorithm studies in China. He obtained his PhD from the Chinese University of Hong Kong and was a visiting researcher at the University of Southern California.

Xuanxuan Tan is an assistant professor at the School of Communication at Soochow University. Her research focuses on affective governance and social and political impacts of technologies, including AI and public health technologies. She obtained her PhD in Cultural Studies from the Chinese University of Hong Kong and was a visiting researcher at Aarhus University and Southern University of Science and Technology.

Author contributions: ZZ: Writing – review & editing, Conceptualization, Methodology; XT: Writing – original draft, Conceptualization, Methodology, Formal Analysis.

Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research is supported by China Postdoctoral Science Foundation (2024M762277).

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Copyright declarations: No copyrighted figures from third-party publications have been reproduced. All app screenshots were captured by the research team and are used strictly for non-commercial academic research, critical commentary, and scholarly discussion purposes.

ORCID iDs

Zongyi Zhang https://orcid.org/0000-0002-5576-4118

Xuanxuan Tan https://orcid.org/0000-0002-6090-6379

Ethical considerations

There was no ethics or institutional committee in place at the researchers’ institution at the time the study was conducted.

Consent to participate

Written informed consent was obtained from all participants prior to study initiation; each participant signed a written informed consent form before engaging in any study activities. All collected data were anonymized to protect participant identities.

Data Availability Statement

All data generated or analyzed during the study are included in the published article.


