Abstract
The use of artificial intelligence (AI) in psychiatry has risen over the past several years to meet the growing need for improved access to mental health solutions. Shortages of mental health providers during the COVID-19 pandemic have further exacerbated the burden of mental illness worldwide. Existing AI applications assist with psychiatric diagnosis, symptom tracking, disease course prediction, and psychoeducation, and are delivered through the internet, smartphone applications, and digital gaming. Here we review emerging AI-based interventions in digital psychiatry: chat and therapy bots (conversational applications that teach users emotional coping skills and support people with communication difficulties), avatar therapy built on computer-generated faces, and intelligent animal-like robots. We discuss the implications of incorporating AI chatbots into clinical practice and offer perspectives on how these interventions will further shape the field of psychiatry.
Supplementary Information
The online version contains supplementary material available at 10.1007/s11126-022-09973-8.
Keywords: Artificial intelligence, Chatbot, Therapy bots, Digital therapy, Digital psychiatry
Overview of Existing Intelligent Applications
Artificial intelligence (AI), commonly defined as “the development of computer systems able to perform tasks that normally require human intelligence” [1], has become increasingly prevalent in modern medicine and in the field of psychiatry. AI draws on a wide variety of computer algorithms classified under machine learning (ML), including random forests, support vector machines, linear discriminant analysis, and natural language processing [2]. Beginning in the 1960s, a computer program known as ELIZA was developed to emulate the conversational abilities of a psychotherapist. The idea was for the machine to simulate human conversation while allowing the patient to do most of the cognitive work of interpretation. Although the program was intended only for research on natural language processing, it ultimately spurred a much broader conversation about artificial intelligence [3]. In 1971, another computer model was designed to simulate paranoia in the setting of a diagnostic psychiatric interview, as an attempt to characterize the inner structure of the paranoid behavior clinicians often encounter when interviewing paranoid patients [4].
Within the last two decades, AI began to incorporate neuroimaging studies of psychiatric patients, using deep learning models to classify patients with psychiatric disorders. For example, Kim et al. classified schizophrenia patients and healthy controls with an accuracy of 85.5% by extracting functional connectivity patterns from resting-state functional MRI [5]. These findings suggest that deep learning can classify psychosis using neuroanatomical and neurofunctional information.
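To make the connectivity-based approach concrete, the sketch below builds a feature vector from pairwise correlations between regional time series and assigns a new subject to the nearer class mean. Everything here is a synthetic stand-in: the toy data, the nearest-centroid rule (in place of the deep network used in the cited work), and all function names are illustrative assumptions, not the published pipeline.

```python
# Toy illustration of connectivity-based classification. The nearest-centroid
# rule stands in for the deep learning model used in the actual studies.
import math

def pearson(x, y):
    """Pearson correlation between two equal-length time series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def connectivity_features(timeseries):
    """Flatten the upper triangle of the region-by-region correlation
    matrix into a feature vector (the functional connectivity pattern)."""
    n = len(timeseries)
    return [pearson(timeseries[i], timeseries[j])
            for i in range(n) for j in range(i + 1, n)]

def nearest_centroid(features, labels, sample):
    """Label `sample` with the class whose mean feature vector is closest."""
    best, best_dist = None, float("inf")
    for lab in set(labels):
        rows = [f for f, l in zip(features, labels) if l == lab]
        centroid = [sum(col) / len(rows) for col in zip(*rows)]
        d = math.dist(centroid, sample)
        if d < best_dist:
            best, best_dist = lab, d
    return best

# Synthetic subjects, two brain regions each: one group with synchronized
# regional activity, the other anti-synchronized.
patients = [[[1, 2, 3, 4], [1, 2, 3, 4]], [[2, 4, 6, 8], [1, 2, 3, 4]]]
controls = [[[1, 2, 3, 4], [4, 3, 2, 1]], [[1, 3, 5, 7], [7, 5, 3, 1]]]
train = [connectivity_features(s) for s in patients + controls]
labels = ["patient"] * 2 + ["control"] * 2
new = connectivity_features([[1, 2, 2, 4], [1, 2, 3, 4]])
prediction = nearest_centroid(train, labels, new)  # positively coupled regions
```

A real pipeline would of course use hundreds of regions, preprocessed fMRI data, and a trained neural network, but the shape of the problem (correlation features in, diagnostic label out) is the same.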
Other targets of AI encompass digital gaming interventions and smartphone applications. Digital gaming was initially used to track symptoms and for psychoeducation but has now evolved into complete interventional programs. Gaming modalities now address psychosocial and cognitive domains, focusing on specific deficits in various psychiatric disorders; services provided may include cognitive behavioral therapy, behavioral modification, social motivation, attention enhancement, and biofeedback [6]. Games continue to have widespread appeal and can also be played on smartphones. Smartphone applications are another vehicle for AI, with projects such as mindLAMP (Learn, Assess, Manage, Prevent) and BiAffect. mindLAMP uses smartphones and their embedded sensors to understand people’s experiences of mental illness and helps predict recovery through the collection of surveys, cognitive tests, GPS coordinates, and exercise information. BiAffect uses machine learning algorithms and keyboard metadata, such as variability in typing dynamics, errors, and pauses in user messaging, to predict manic and depressive episodes in people with bipolar disorder [7].
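As a rough illustration of the kind of keyboard metadata such systems consume, the sketch below summarizes one typing session into timing and error features. The function name, feature set, and thresholds implied here are hypothetical assumptions for exposition, not BiAffect's actual (proprietary) pipeline.

```python
# Hypothetical keystroke-dynamics feature extraction; a downstream model
# would track these features over days and flag shifts (e.g., faster, more
# erratic typing) as possible signals of a mood episode.
from statistics import mean, pstdev

def typing_session_features(press_times, backspaces, total_keys):
    """Summarize one typing session.
    press_times: key-press timestamps in seconds, in order.
    backspaces / total_keys: a crude proxy for typing errors."""
    delays = [b - a for a, b in zip(press_times, press_times[1:])]
    return {
        "mean_delay": mean(delays),           # overall typing speed
        "delay_variability": pstdev(delays),  # irregularity of rhythm
        "error_rate": backspaces / total_keys,
    }

features = typing_session_features(
    [0.0, 0.2, 0.4, 0.9], backspaces=2, total_keys=20
)
```

The appeal of this approach is that the features are content-free metadata: the system never needs to read what the user typed, only how they typed it.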
AI applications appear to have great potential for transforming the delivery of psychiatric care and have already been utilized to assist with making psychiatric diagnoses, symptom tracking, prediction of acute disease exacerbations and recovery, and psychoeducation. In the era of the COVID-19 pandemic, another form of AI technology has gained momentum to offer digital help for psychiatric disorders: chatbots.
Chat/Therapy Bots
Mental illness represents a large burden for individuals, communities, and nations. The emergence of COVID-19 presented doctors with an unprecedented challenge: how to increase access to care during a pandemic. Therapeutic devices that work over SMS text messaging or other messaging devices are currently being explored as a way to address psychiatric symptoms exacerbated by the unrelenting global health crisis and to help those with an existing mental health condition.
Woebot is an automated conversational application, available through Facebook Messenger or mobile apps, that provides tools automating the process of cognitive behavioral therapy (CBT). It was developed to monitor symptoms and manage episodes of anxiety and depression through learned skills such as identifying and challenging cognitive distortions [8]. In a randomized controlled trial, 70 subjects were randomized to Woebot or an e-book on depression; the Woebot group reported a significantly greater decrease in depression than the e-book group [9]. Because chatbots are conversational and keep users engaged, higher levels of engagement might explain the significantly better outcomes, as well as why such tools draw attention from financial sponsors [10]. Tess is another program, reachable through a phone number, that uses text messaging to coach individuals through times of emotional distress. It enables the user to have therapeutic conversations similar to those they might have with a psychologist and delivers emotional wellness coping strategies [11].
In a similar approach, new forms of avatar therapy have been developed to provide therapeutic conversations with their users. Replika is a smartphone application that lets users have conversations about themselves and thereby gain a better understanding of their own good qualities. Replika reconstructs a footprint of the user’s personality from the text conversations the user has with their avatar. One of its strongest draws is that the user can have vulnerable conversations with the avatar without fear of judgement. Much like therapy sessions with a psychiatrist or personal conversations with a trusted friend, the avatar can hold therapeutic conversations and help the user gain insight into their own personality [12]. Another use of avatars is Avatar Therapy, in which computer-generated faces interact with patients with schizophrenia via intelligent algorithms. Patients undergo six ten-minute sessions of Avatar Therapy in which they challenge the persecutory voice hallucinations they experience and gradually learn to gain control over the distressing voices. Initial studies have shown that Avatar Therapy decreases the distress patients feel in relation to their voices, the frequency of hearing voices, and the extent to which patients feel overwhelmed by them [13].
In addition to AI designed to replicate human processes, clinicians and scientists have explored using intelligent animal-like robots to improve psychiatric outcomes such as reducing stress, loneliness, and agitation, and improving mood. Companion bots such as Paro, a robotic seal, and eBear, an expressive bear-like robot, interact with patients and provide the benefits of animal therapy. Paro has already been used to help patients with dementia who may be isolated or experiencing feelings of depression. AI-enabled robots have also been studied as aids for children with autism spectrum disorders (ASDs) through education and therapy. Robots such as Kaspar and Nao can teach children social skills and help them with facial recognition and appropriate gaze response, with initial studies reporting that children with ASDs performed better with robotic intervention than with human therapists [8]. Another application that has made an impact on these individuals is Apple’s virtual assistant Siri, which can engage children with ASDs and accommodate the hyper-focus on specific interests that can accompany the disorder. Humans may lack the desire or the patience to engage with children in the minutiae they are focusing on, but Siri can do so. Through such engagement, children can develop the skills necessary to interact socially with others without negative repercussions for the social faux pas that inevitably occur; Siri provides the child a safe learning environment and the patience needed to practice these skills [14].
Discussion
In summary, AI in psychiatry holds great promise, with growing need for and utilization of AI bots in managing psychiatric symptoms and augmenting therapeutic treatments. Mental illness continues to be a heavy burden for society at large, and AI-based interventions such as those discussed in this article could provide some relief of that burden, especially in an era with a shortage of mental health providers.
Innovations such as chatbots, avatar therapy, and companion bots offer many advantages: reducing the stigma associated with sharing symptoms of mental illness with physicians, increasing personal comfort with self-disclosure, cost effectiveness, and broader accessibility. On the other hand, AI bots lack the varied skill set of a trained psychiatrist or therapist, are limited in their ability to apply personal patient details to the cognitive work required of the patient, and may not have the nuanced emotional awareness and empathetic response of a human counterpart. Currently, no such applications are available for direct use in the clinical setting, and there is no national standard for comparison during technological development [7].
Another major concern involves legal responsibility: who would be responsible if a bot makes an incorrect diagnosis or misinterprets a patient’s distress [15]? Additional concerns surround the safety of protected health information (PHI), as defined under the Health Insurance Portability and Accountability Act (HIPAA), during data exchange. Fortunately, cloud-based web services such as Amazon Web Services (AWS) have recently started to offer “HIPAA-compliant” connections [16].
Despite the advances in this field of technology, newer tools have not been immediately embraced by mental health providers. Some psychiatrists who strongly value interpersonal interactions with patients may be slow to adopt these new methods, indicating a slow diffusion-innovation process in the mental health community [17]. Furthermore, adoption of mental health applications is growing rapidly, which makes risk assessment more challenging and perhaps more likely to occur only after harm has been done. Future work should investigate the efficacy of AI-based interventions in large, controlled trials and develop methods for incorporating AI into clinical practice.
Acknowledgements
None.
Authors’ Contributions
The idea for the article was formed by Salih Selek MD. All authors contributed to the literature search and data analysis. The first draft of the manuscript was written by Kay Pham and all authors critically revised previous versions of the manuscript. All authors read and approved the final manuscript.
Funding
Not applicable.
Availability of Data and Material
Not applicable.
Code Availability
Not applicable.
Declarations
Conflicts of Interest/Competing Interests
No conflicts of interest.
Ethics Approval
Not applicable.
Consent to Participate
Not applicable.
Consent for Publication
Not applicable.
Footnotes
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
- 1. Nasrallah HA, Kalanderian H. Artificial intelligence in psychiatry: potential uses of machine learning include predicting the risk of suicide, psychosis. Current Psychiatry. 2019;18:33.
- 2. Paulus MP, Huys QJM, Maia TV. A roadmap for the development of applied computational psychiatry. Biol Psychiatry Cogn Neurosci Neuroimaging. 2016;1(5):386–92. doi:10.1016/j.bpsc.2016.05.001.
- 3. Bassett C. The computational therapeutic: exploring Weizenbaum’s ELIZA as a history of the present. AI Soc. 2019;34:803–12. doi:10.1007/s00146-018-0825-9.
- 4. Colby KM, Weber S, Hilf FD. Artificial paranoia. Artif Intell. 1971;2(1):1–25. doi:10.1016/0004-3702(71)90002-6.
- 5. Vieira S, Pinaya WHL, Mechelli A. Using deep learning to investigate the neuroimaging correlates of psychiatric and neurological disorders: methods and applications. Neurosci Biobehav Rev. 2017;74(Pt A):58–75.
- 6. Vajawat B, Varshney P, Banerjee D. Digital gaming interventions in psychiatry: evidence, applications and challenges. Psychiatry Res. 2021;295:113585. doi:10.1016/j.psychres.2020.113585.
- 7. Allen S. Artificial intelligence and the future of psychiatry. IEEE Pulse. 2020;11(3):2–6. doi:10.1109/MPULS.2020.2993657.
- 8. Fiske A, Henningsen P, Buyx A. Your robot therapist will see you now: ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. J Med Internet Res. 2019;21(5):e13216.
- 9. Fitzpatrick KK, Darcy A, Vierhile M. Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR Ment Health. 2017;4(2):e19.
- 10. Sachan D. Self-help robots drive blues away. Lancet Psychiatry. 2018;5:547. doi:10.1016/S2215-0366(18)30230-X.
- 11. Fulmer R, et al. Using psychological artificial intelligence (Tess) to relieve symptoms of depression and anxiety: randomized controlled trial. JMIR Ment Health. 2018;5(4):e64.
- 12. Murphy M, Templin J. Replika. replika.ai. Retrieved September 16, 2021, from https://replika.ai/about/story.
- 13. Garety P, et al. Optimising AVATAR therapy for people who hear distressing voices: study protocol for the AVATAR2 multi-centre randomised controlled trial. Trials. 2021;22(1):366. doi:10.1186/s13063-021-05301-w.
- 14. Raccio AJ. Newman: To Siri with Love: A Mother, Her Autistic Son, and the Kindness of Machines. J Autism Dev Disord. 2019;49:3472–3. doi:10.1007/s10803-019-03996-0.
- 15. Hariman K, Ventriglio A, Bhugra D. The future of digital psychiatry. Curr Psychiatry Rep. 2019;21(9):88. doi:10.1007/s11920-019-1074-4.
- 16. Ruggiano N, et al. Chatbots to support people with dementia and their caregivers: systematic review of functions and quality. J Med Internet Res. 2021;23(6):e25006.
- 17. Karlinsky H. Psychiatry, technology, and the corn fields of Iowa. Can J Psychiatry. 2004;49(1):1–3. doi:10.1177/070674370404900101.