Editorial

Alpha Psychiatry. 2024 Sep 1;25(5):667–668. doi: 10.5152/alphapsychiatry.2024.241827

Artificial Intelligence-Assisted Adjunct Therapy: Advocating the Need for Valid and Reliable AI Tools in Mental Healthcare

Waqar Husain 1, Seithikurippu R Pandi-Perumal 2,3, Haitham Jahrami 4,5
PMCID: PMC11562293  PMID: 39553490

Mental disorders have been rising globally, and the number of psychiatrists and other mental health professionals falls far short of the number of psychiatric patients worldwide. Social stigmatization worsens the situation: nearly one-third of those who need mental health services do not seek professional treatment, for cultural, religious, and economic reasons.1 Artificial intelligence (AI) can help address this gap. This editorial proposes AI-assisted adjunct therapy (AIAAT) and advocates the development of AI tools designed specifically for this purpose.

Existing AI applications and websites, especially conversational bots, are already being used for mental health assistance. These tools can provide real-time responses, psychoeducation, and therapeutic exercises to those who wish to avoid the social stigma associated with visiting a mental health professional.2 ChatGPT and similar tools can facilitate emotional catharsis, guide clients toward a feasible therapeutic process, and offer immediate advice for dealing with stress and anxiety. This is especially valuable for clients who experience frequent episodes of distress and need immediate support from time to time.3 Drawing on rich databases, AI tools can detect mental health problems, identify symptoms, suggest a diagnosis, and recommend a professional course of action.4 Most importantly, these tools circumvent the social stigma attached to mental disorders and therapy.5

AI tools are not meant to replace conventional mental healthcare. They can play a supplementary role for both clients and practitioners by reducing long wait times and enabling faster diagnoses. A client who is registered with a practitioner and attending regular sessions can also benefit from these tools, which can provide ongoing support between sessions by helping the client work toward the goals already set by the therapist.6 For example, an AI system might offer reminders to practice mindfulness exercises, track mood fluctuations, and provide motivational support. Such continuous engagement in the psychotherapeutic process could lead to better progress and improved outcomes.

Despite these potential benefits, the integration of AI tools into mental health services has raised several concerns.7 The foremost is the reliability of the information provided: AI can be a source of misguidance, as it is not yet capable of observing clients in natural settings, and users can mislead it by reporting false complaints or exaggerating symptoms. Specialized AI applications are therefore needed to overcome the drawbacks of using general-purpose AI for mental health. We urge stakeholders to consider developing specialized programs and software for AIAAT.

AIAAT must include statistically validated algorithms for prevention, diagnosis, and the handling of client-centered needs. It must be noted that clients cannot take psychiatric medication without an appropriate prescription. Moreover, no single psychotherapy claims to be universally beneficial.8 Because AI tools are used worldwide by clients of diverse races, ethnicities, and cultural backgrounds, AIAAT must incorporate cross-cultural considerations alongside Westernized approaches to mental health. AIAAT must never claim to be an alternative to conventional therapy; with existing AI tools, clients may feel satisfied with the suggestions provided and never consult a professional, so AIAAT must encourage clients to seek professional advice in addition to the assistance it offers. AIAAT must also maintain a client-based memory that records the last advice given and the actions the client has taken in response. Software companies must consult mental health professionals when developing such a tool, and psychological studies carried out worldwide can serve as baseline information for understanding clients' specific cultures and individual needs. Ethical considerations that are currently missing from existing AI tools must also be addressed,9 including privacy, confidentiality, and genuineness, which are core foundations of conventional therapy. Most importantly, Continuing Medical Education training on such tools could be offered to mental health professionals.
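To make the requirement of a client-based memory concrete, the following is a minimal, purely illustrative sketch in Python; the class, field, and method names are hypothetical and do not describe any existing tool or the authors' implementation. It only shows how such a memory could retain the last advice given, the client's reported follow-up actions, and a standing flag that keeps the tool pointing clients back to a human professional.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class ClientMemory:
    """Hypothetical sketch of an AIAAT client-based memory record."""
    client_id: str
    last_advice: str = ""                                       # most recent suggestion given by the tool
    actions_reported: List[str] = field(default_factory=list)   # client-reported follow-up actions
    last_contact: Optional[datetime] = None
    needs_professional_referral: bool = True                    # by default, keep encouraging a clinician visit

    def record_interaction(self, advice: str, reported_action: Optional[str] = None) -> None:
        # Store the latest advice and any action the client reports having taken.
        self.last_advice = advice
        if reported_action:
            self.actions_reported.append(reported_action)
        self.last_contact = datetime.now()

    def next_prompt(self) -> str:
        # Compose a follow-up that recalls the prior advice and, by default,
        # reiterates the referral to a human professional.
        parts = []
        if self.last_advice:
            parts.append(f"Last time we discussed: {self.last_advice}.")
        if self.needs_professional_referral:
            parts.append("Please also consider discussing this with a mental health professional.")
        return " ".join(parts)

# Example use (all values invented for illustration):
memory = ClientMemory(client_id="client-001")
memory.record_interaction("a brief daily breathing exercise", reported_action="practiced twice this week")
print(memory.next_prompt())

Any real AIAAT implementation would additionally require encryption, informed consent, and clinically validated content to meet the privacy, confidentiality, and genuineness requirements noted above.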

Fraudulence and artificiality have long been observed in the information technology sector, where several web-based tools exploit the technology unfairly and for profit. Through this editorial, we encourage software companies to come forward and develop a standardized AIAAT application that can be approved by the leading psychiatric and psychological associations. A trusted AIAAT application would not only help the world combat mental health problems but would also serve as a legitimate and profitable venture for the IT sector and mental health professionals alike.

Funding Statement

The authors declare that this study received no financial support.

Footnotes

Ethics Committee Approval: N/A

Informed Consent: N/A

Peer-review: N/A

Author Contributions: Concept – W.H., S.R.P., H.J.; Design – W.H., S.R.P., H.J.; Supervision – W.H., S.R.P., H.J.; Resources – W.H., S.R.P., H.J.; Materials – W.H., S.R.P., H.J.; Data Collection and/or Processing – W.H., S.R.P., H.J.; Analysis and/or Interpretation – W.H., S.R.P., H.J.; Literature Search – W.H., S.R.P., H.J.; Writing – W.H., S.R.P., H.J.; Critical Review – W.H., S.R.P., H.J.

Acknowledgments: N/A

Declaration of Interests: S.R.P. is serving as one of the Editorial Board members of this journal. We declare that S.R.P. had no involvement in the processing of this article. The authors have no conflicts of interest to declare.

Data Availability Statement

N/A

References

1. Husain W. Barriers in seeking psychological help: public perception in Pakistan. Community Ment Health J. 2020;56(1):75-78. doi: 10.1007/s10597-019-00464-y
2. Fitzpatrick KK, Darcy A, Vierhile M. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Ment Health. 2017;4(2):e19. doi: 10.2196/mental.7785
3. Inkster B, Sarda S, Subramanian V. An empathy-driven, conversational artificial intelligence agent (Wysa) for digital mental well-being: real-world data evaluation mixed-methods study. JMIR mHealth uHealth. 2018;6(11):e12106. doi: 10.2196/12106
4. Pandi-Perumal SR, Narasimhan M, Seeman MV, Jahrami H. Artificial intelligence is set to transform mental health services. CNS Spectr. 2024;29(3):155-157. doi: 10.1017/S1092852923002456
5. Lee EE, Torous J, De Choudhury M, et al. Artificial intelligence for mental healthcare: clinical applications, barriers, facilitators, and artificial wisdom. Biol Psychiatry Cogn Neurosci Neuroimaging. 2021;6(9):856-864. doi: 10.1016/j.bpsc.2021.02.001
6. Torous J, Wisniewski H, Liu G, Keshavan M. Mental health mobile phone app usage, concerns, and benefits among psychiatric outpatients: comparative survey study. JMIR Ment Health. 2018;5(4):e11715. doi: 10.2196/11715
7. Bickmore TW, Schulman D, Sidner CL. A reusable framework for health counseling dialogue systems based on a behavioral medicine ontology. J Biomed Inform. 2011;44(2):183-197. doi: 10.1016/j.jbi.2010.12.006
8. Husain W, Ijaz F, Husain MA, Zulfiqar M, Khalique J. Simplifying the understanding and measurement of mental disorders thru a comprehensive framework of psychosocial health. OBM Integr Complement Med. 2024;9(1):011. doi: 10.21926/obm.icm.2401011
9. Gerke S, Minssen T, Cohen G. Ethical and legal challenges of artificial intelligence-driven healthcare. Artif Intell Healthc. 2020:295-336. Epub 2020 Jun 26. doi: 10.1016/B978-0-12-818438-7.00012-5


