Indian Journal of Psychiatry
Editorial
Indian J Psychiatry. 2023 Mar 3;65(3):297–298. doi: 10.4103/indianjpsychiatry.indianjpsychiatry_112_23

Artificial intelligence in the era of ChatGPT - Opportunities and challenges in mental health care

Om P Singh
PMCID: PMC10187878  PMID: 37204980

Chat Generative Pre-trained Transformer (ChatGPT)[1] is a powerful AI-based chatbot system launched on November 30, 2022, by San Francisco-based OpenAI. It gained massive attention and currently has over 100 million users, making it the fastest-growing consumer application to date.[1]

It is a transformer-based neural network system that produces human-like language through which it communicates. Such AI programs are trained on vast amounts of text data to understand the context and relevance of human communication.
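
To make the architecture concrete, the following is a minimal toy sketch of scaled dot-product self-attention, the core operation of a transformer, written in plain Python with NumPy. It is illustrative only: production systems such as ChatGPT stack many such layers with learned projection weights and billions of parameters.

```python
# Toy scaled dot-product self-attention; production models add learned
# Q/K/V projections, multiple heads, and many stacked layers.
import numpy as np

def self_attention(X):
    """X: (sequence_length, d) array of token embeddings."""
    d = X.shape[-1]
    Q, K, V = X, X, X                               # real models use learned projections
    scores = Q @ K.T / np.sqrt(d)                   # pairwise token relevance
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ V                              # context-aware representations

tokens = np.random.rand(4, 8)                       # 4 tokens, 8-dim embeddings
print(self_attention(tokens).shape)                 # -> (4, 8)
```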

There is intense competition in this segment, as multiple similar or more advanced apps have launched or are on the verge of being launched, such as Google Bard,[2] Microsoft Bing AI, China's Ernie Bot, Korea's SearchGPT, Russia's YaLM 2.0, Chatsonic, Jasper Chat, Character AI, Perplexity AI, and YouChat.

ChatGPT and other AI platforms hold enormous potential in many fields, including mental health, and their possible uses are expanding rapidly. From chats and games to writing computer programs, music compositions, songs, and teleplays, and from essays and letters to scientific papers, abstracts, and introductions, they will affect nearly everyone. It is not hard to predict that they will make a massive difference in the mental healthcare delivery system.

There is a huge treatment gap in mental health care in developing, low-, and lower-middle-income countries. According to the WHO, the treatment gap for mental disorders in developing countries is 76%–85%. According to the National Mental Health Survey, the treatment gap in India for any mental disorder is as high as 83%. A severe deficit of mental health professionals, far below the specified norms, and inequitable resource distribution make the gap more prominent.[3] AI and digital interfaces are emerging as viable alternatives for reducing this gap and making psychiatric diagnosis and treatment accessible and affordable.

The ability of ChatGPT and other AI-based chatbots to generate human-quality responses can provide companionship, support, and therapy for people facing barriers of accessibility and affordability in terms of time, distance, and finances. Their ease, convenience, and simulation of talking to another human being make them well suited to delivering psychotherapies. ChatGPT and similar AI-based chatbots are trained on vast knowledge about psychiatric conditions and can respond with empathy; still, they cannot reliably and accurately diagnose specific mental health conditions or provide treatment details.
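
As a hedged illustration of how a developer might constrain such a chatbot to offer support without diagnosing, the sketch below uses the OpenAI Python SDK (openai>=1.0). The model name and system prompt are illustrative assumptions, not a clinically validated configuration.

```python
# Illustrative sketch only: constrain a ChatGPT-style assistant to supportive,
# non-diagnostic replies. Requires OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model; substitute as appropriate
    messages=[
        {
            "role": "system",
            "content": (
                "You are a supportive listener. Respond with empathy, but do "
                "not diagnose conditions or give treatment advice; encourage "
                "the user to consult a qualified mental health professional."
            ),
        },
        {"role": "user", "content": "I've been feeling anxious and can't sleep."},
    ],
)
print(response.choices[0].message.content)
```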

Though there is a lot of excitement about the use of AI in various psychiatric conditions, several areas of concern remain. To start with, ChatGPT and other AI systems are trained on web-based information and refined with reinforcement learning from human feedback. If they are not trained on authentic sources and curated responses, they can provide wrong information about a condition and inappropriate advice, which may be potentially harmful to persons with mental health problems.
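
The toy sketch below conveys the core idea of reinforcement learning from human feedback in miniature: human annotators rank candidate replies, a stand-in reward function encodes those rankings, and the system prefers higher-reward outputs. All replies and scores here are hypothetical, and real RLHF fine-tunes the language model itself against a learned reward model.

```python
# Vastly simplified, hypothetical illustration of the RLHF idea.
import random

candidates = [
    "You should stop taking your medication.",             # unsafe
    "I'm sorry you're struggling; talking to a qualified "
    "professional could really help.",                     # preferred
    "Everyone feels low sometimes; just ignore it.",       # dismissive
]

# Hypothetical human rankings (higher score = preferred by annotators).
human_scores = dict(zip(candidates, [0, 2, 1]))

def reward_model(reply: str) -> int:
    """Stand-in for a learned reward model fit to the human rankings."""
    return human_scores[reply]

# "Policy improvement" in miniature: sample replies, keep the best-rewarded.
sampled = random.sample(candidates, k=2)
print(max(sampled, key=reward_model))
```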

Confidentiality, privacy, and data safety are significant areas of concern.[4] Anyone using an AI-based app for a mental health condition and therapy is bound to share important personal details about themselves and their family members, making them potentially vulnerable in the event of a breach of confidentiality or a data breach.

Other concerns include the lack of proper standardization and monitoring, the questionable universality of applications, misdiagnosis and wrong diagnosis, inappropriate advice, and the inability to handle crises.[4] There are also concerns regarding safety, efficacy, and tolerability.

Together, these raise significant ethical concerns about the use of ChatGPT and AI-based apps in academics, diagnosis, treatment, and therapy.

There is a definite need to regulate and monitor AI-based apps. The American Psychiatric Association (APA) has formed a digital psychiatry task force to evaluate and monitor AI and mental health-related apps for their efficacy, tolerability, safety, and potential to provide mental health care.[5]

The APA has also launched an innovative App Evaluation Model called App Advisor.[5] This model has been adopted and replicated by several other healthcare organizations, for example, the Division of Digital Psychiatry at BIDMC (Harvard Medical School), the Health Navigator App Evaluator Model and Assessment tools, and the NYC Department of Health and Mental Hygiene's NYC Well App Advisor, to name a few.[6]

Given the vast differences in awareness, education, language, and levels of understanding within the Indian population, the Indian Psychiatric Society and other stakeholders should begin evaluating and regulating global and local AI-based apps for safety, efficacy, and tolerability, and guide the general public in their proper and safe use.

REFERENCES

