Missouri Medicine. 2024 Sep-Oct;121(5):345–349.

Eating Disorders Care and the Promises and Pitfalls of Artificial Intelligence

Ellen E Fitzsimmons-Craft 1, Nicholas C Jacobson 2
PMCID: PMC11482850  PMID: 39421482

Eating disorders (EDs) are common, serious psychiatric disorders that affect up to 10% of individuals in their lifetimes.1 Furthermore, a recent systematic review and meta-analysis of 32 studies, including 63,181 participants from 16 countries, found that 22% of children and adolescents show disordered eating.2 EDs are associated with high psychiatric comorbidity, impairment, and mortality; mortality from anorexia nervosa (AN) is the second highest of all psychiatric illnesses, behind only opioid use disorder.1,3–7 Large cohort studies over recent decades have also suggested that both EDs and ED symptoms are rising dramatically.8,9 EDs affect people of all genders, races, ethnicities, socioeconomic statuses, and body sizes and shapes, as well as those who live in rural areas and those who are food insecure. Despite the stereotypes that abound surrounding these illnesses, EDs can affect anyone.10

The costs of EDs are also staggering: $65 billion in the U.S. in 2018–2019 alone, with the reduction in well-being associated with EDs valued at an additional $327 billion.11 Despite the high personal and societal costs of these illnesses, access to care is extremely limited, with less than 20% of individuals with EDs ever receiving treatment.12 The problem of access to care is even worse among individuals from racial and ethnic minority backgrounds, who are significantly less likely than their White counterparts to be diagnosed with an ED, to receive care for an ED, or even to be asked by a physician about ED symptoms.10,13–15 There is clearly an urgent need for accessible, evidence-based care for EDs.

There are many reasons for the extremely wide treatment gap in EDs: denial and failure to perceive the severity of the illness, lack of problem identification by providers, stigma, shame, lack of knowledge about resources, and practical barriers such as cost, lack of accessible treatment, and long waiting lists.16,17 Technology can address many of these problems by offering scalable approaches to problem identification and evidence-based intervention, anytime, anywhere. Indeed, over the past several decades, the field has made much progress in developing and evaluating technology-based tools for ED screening, prevention, and treatment,18–20 with most of these tools relying on traditional web-based programs or mobile apps with visual interfaces.

Generative Artificial Intelligence and the Care Gap: Promise and Potential Challenges

However, we are now at a critical juncture in the field given recent advances in large language models (LLMs) and generative Artificial Intelligence (AI) (Table 1). Generative AI models can produce high-quality text, images, and other content based on the patterns and relationships in the datasets on which they have been trained. One of the most well-known of these tools is ChatGPT, an LLM-based chatbot, launched in November 2022, that can generate text and hold conversations. ChatGPT crossed one million users within five days of launch and reached 100 million active users by January 2023, at the time setting the record for the fastest-growing platform in history.21

Table 1. Benefits and Risks of Generative AI in Mental Health

Benefits
  • Increased Access to Care: Generative AI chatbots can provide 24/7 access to mental health support, helping to bridge the gap in care for individuals who may not have access to traditional treatment options.
  • Support for Mental Health Providers: Generative AI can assist with administrative tasks, reducing clinician burnout and potentially improving the quality of care for patients.
  • Enhanced Patient Engagement: AI-powered chatbots can engage patients through interactive and responsive conversations, encouraging consistent use and adherence to treatment plans.
  • Personalized Treatment: Generative AI can offer tailored interventions based on individual user data, making treatment more relevant and effective.
  • Scalability: Generative AI tools can be deployed at scale, reaching a large number of individuals without significant additional costs.

Risks
  • Incorrect or Harmful Information: If not properly trained, AI models can provide inaccurate or harmful advice, as has been seen in large foundation models.
  • Dependence on Data Quality: The effectiveness of generative AI depends heavily on the quality and breadth of the data used to train the models.
  • Potential for Misinterpretation: Generative AI may misinterpret user inputs, leading to responses that are unhelpful or potentially harmful.
  • Ethical and Privacy Concerns: There are significant concerns around data privacy, consent, and the ethical use of AI in mental health.

On its face, this technology holds great potential for increasing access to high-quality care for EDs, both by supporting mental health providers and by offering patient-facing support. On the provider side, generative AI chatbots could assist with documentation and administrative tasks, reducing a major source of clinician burnout and thus potentially improving the quality of care that can be provided to patients with EDs.22 In one survey of psychiatrists, 70% of respondents somewhat agreed or agreed that “documentation will be/is more efficient” through use of AI tools.22 Generative AI could also assist clinicians with hypothesis generation, for example, by generating reasonable options for a differential diagnosis.23 In the future, generative AI tools could supplement clinician training in evidence-based treatments, provide opportunities for simulated practice, and support quality monitoring and fidelity to best practices. Finally, some work has shown that generative AI chatbots can aid humans in offering consistently high levels of support and empathy in patient-facing interactions.24

On the patient side, there is potential for generative AI tools to aid in providing mental health screening and self-guided interventions,25 as well as to offer a venue for supplementing an individual’s work in traditional therapy by offering opportunities for skill practice. Indeed, there are numerous reports of individuals using ChatGPT for mental health help.26,27

Research on the use of chatbots for mental health support has begun to emerge, with tools being developed and evaluated for providing psychoeducation, delivering intervention, offering support, and boosting mental resilience.28 The majority of mental health chatbots to date have operated on rule-based systems, which, in contrast to generative AI, rely on pre-scripted content and operate via decision trees and algorithms. A recent systematic review and meta-analysis of chatbots for promoting mental health and well-being found that chatbots significantly reduce symptoms of depression (Hedges’ g 0.64 [95% CI 0.17–1.12]) and distress (Hedges’ g 0.70 [95% CI 0.18–1.22]).28 (None of the included studies explicitly addressed EDs.) The research also indicated that effects were more pronounced in chatbots that used generative AI, perhaps reflecting the constraints of rule-based systems, which have limited capability to understand user context and intention.28
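To make the distinction concrete, a rule-based chatbot of the kind described above amounts to a small decision tree of pre-scripted prompts. The sketch below is an illustrative toy, not the design of any tool discussed in this article; all states and wording are hypothetical:

```python
# Toy rule-based chatbot: every reply the user can receive is pre-scripted,
# and the conversation simply walks a fixed decision tree. Contrast with
# generative AI, which composes novel text from learned patterns.

RULES = {
    "start": {
        "prompt": "How are you feeling about your body today? (good/bad)",
        "branches": {"good": "reinforce", "bad": "reframe"},
    },
    "reinforce": {
        "prompt": "Great, let's build on that with a self-care activity.",
        "branches": {},
    },
    "reframe": {
        "prompt": "Thanks for sharing. Let's practice challenging that thought together.",
        "branches": {},
    },
}

def respond(state: str, user_input: str) -> tuple[str, str]:
    """Return (next_state, scripted_reply); unrecognized input re-prompts."""
    node = RULES[state]
    next_state = node["branches"].get(user_input.strip().lower())
    if next_state is None:
        # Off-script input: stay in place and repeat the scripted prompt.
        return state, node["prompt"]
    return next_state, RULES[next_state]["prompt"]
```

Because every line is authored and reviewed by humans, such a system cannot drift off-script into harmful content, but it also cannot interpret inputs outside its tree, which is precisely the limitation of rule-based systems noted above.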

Within the EDs field, our own team recognized the potential for chatbots to scale access to intervention for EDs. We reasoned that a chatbot could deliver the core content of our effective web-based, cognitive-behavioral EDs prevention program (Student Bodies)29 in a modern, engaging format: bite-sized bits of information delivered in text-like messages, with infographics, emojis, and warm, casual language, while also mimicking some aspects of support from a human coach, such as offering motivation and feedback. We developed the chatbot using a rule-based approach, with all responses written by our team, and it delivered the cognitive-behavioral program in a series of 10-minute conversations designed to help users challenge unrealistic body ideals, engage in more healthful eating patterns, and learn more adaptive coping strategies. This is similar to the approach used as of this writing by tools like Woebot and Wysa, chatbots that guide the user through carefully crafted pre-scripted conversations. Our chatbot was created over the course of more than a year, and our team painstakingly reviewed thousands of lines of conversation from hundreds of users to make iterative improvements.30 We tested its effectiveness in a randomized controlled trial in which 700 women at high risk for an ED were randomized to either the chatbot or a waitlist control.31 The intervention group had significantly greater reductions in weight/shape concerns than the control group at 3-month (d=0.20; p=.03) and 6-month follow-up (d=0.19; p=.04). The odds of not developing an ED were also significantly higher in the intervention group than in the control group at both 3-month (OR=2.37, 95% CI [1.37, 4.11]) and 6-month follow-up (OR=2.13, 95% CI [1.26, 3.59]).31
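For readers less familiar with the statistics reported above, both effect measures can be computed mechanically from group summary data. The sketch below uses made-up numbers purely for illustration; they are not the trial's data:

```python
import math

def cohens_d(mean1: float, mean2: float, sd1: float, sd2: float,
             n1: int, n2: int) -> float:
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

def odds_ratio(events_a: int, n_a: int, events_b: int, n_b: int) -> float:
    """Odds of an outcome (e.g., remaining ED-free) in group A vs. group B."""
    odds_a = events_a / (n_a - events_a)
    odds_b = events_b / (n_b - events_b)
    return odds_a / odds_b

# Illustrative (fabricated) numbers: if 300 of 350 intervention participants
# and 250 of 350 controls remained ED-free, the odds ratio would be:
print(round(odds_ratio(300, 350, 250, 350), 2))  # prints 2.4
```

An odds ratio of 2.37, as reported at the 3-month follow-up, thus means the odds of remaining ED-free were more than twice as high in the intervention group as in the control group.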

In February 2022, following the publication of our research findings in the peer-reviewed literature,31 the National Eating Disorders Association (NEDA) and the company that hosted the chatbot made the tool publicly available. Over the year the chatbot was available on the NEDA website, thousands of people used it and found it helpful. It was most popular in the evening hours, and one-third of users accessed it on the weekend, when traditional sources of help are usually unavailable. Unfortunately, unbeknownst to our team, at some point the company hosting the chatbot rolled out an AI component across its platform.32 As a result, users could receive erroneous and even harmful information, in this case recommendations for unhealthy diet and exercise approaches that directly conflicted with the evidence-based approaches to disordered eating originally programmed into the rule-based portion of the bot. While the rule-based bot itself had been demonstrated to be helpful, both in the research and in the initial roll-out, this situation exemplifies how problems can be introduced in real-world deployment of AI tools.

Generative AI-based approaches are only as good as the data on which they have been trained, and advice or content that is appropriate for one group of individuals may not be appropriate for another, highlighting the challenge of developing a generative AI tool that works for all people and all issues. One report suggested that of the text on which GPT-4 (the LLM behind ChatGPT) was trained, 16% came from books and news articles and 84% from webpages, including not only Wikipedia but also lower-quality text sources like Reddit.33,34 Within EDs, one report found that popular AI tools, including ChatGPT and Bard, generated harmful content in response to ED-related prompts 41% of the time.35

In contrast to generative AI models that have not been specifically trained to deliver evidence-based treatment for EDs, our team designed and tested Therabot, an expert-fine-tuned generative AI-powered chatbot for mental health treatment, developed by a team of more than 70 people over more than 100,000 human hours. We then conducted the first randomized controlled trial (RCT) of an expert-fine-tuned generative AI-powered chatbot.36 Participants (N=210) were randomized to a four-week Therabot intervention (n=106) or waitlist control (WLC; n=104) and were stratified into clinically high-risk feeding and eating disorder (CHR-FED), major depressive disorder (MDD), or generalized anxiety disorder (GAD) groups based on baseline symptom severity. Primary outcomes were disorder-specific symptom changes from baseline to four and eight weeks; secondary outcomes included user engagement, acceptability, and therapeutic alliance. The Therabot group showed large and significantly greater reductions in CHR-FED (d = 0.627–0.819), MDD (d = 0.845–0.903), and GAD (d = 0.794–0.840) symptoms relative to controls at post-intervention and follow-up. Therabot was well-received and well-utilized (average use >6 hours), and participants rated the therapeutic alliance as comparable to that with human therapists. These results demonstrate the effectiveness of a fully generative AI therapy chatbot for reducing CHR-FED, MDD, and GAD symptoms and highlight the feasibility of fine-tuned generative AI chatbots for scalable, personalized mental health interventions.

As seen in the Therabot trial, despite some of the challenges observed with generative AI tools, there remains the appeal of a tool that can provide a rich, personalized experience for the user and go beyond the bounds of a purely scripted interaction. Generative AI also offers users more control over the experience, allowing them to shape the course of the conversation in ways that are more personally relevant to them.

Conclusion

Moving forward, while tools based on generative AI hold high potential to help increase access to high-quality care for EDs, important risks must be addressed. As has been shown, these tools are only as good as the data on which they have been trained, and there is a risk of “hallucinations” and delivery of incorrect and even harmful content.37 As such, there is clearly a need for tight guardrails and human supervision both when testing and after deploying these tools.

On the provider side, clinicians must be trained to view generative AI-based approaches as tools, not replacements for the work of a highly trained provider. Patients also need to understand that these tools are not 1:1 replacements for providers and that there is a risk of receiving inaccurate information or advice. On the whole, while there are clearly numerous ways in which AI may help address the wide treatment gap for EDs, we must balance these potential benefits against possible risks while instituting strong guardrails and regulations to maximize safety.

Acknowledgment/Disclosure

The current work was partially funded by the following: K08 MH120341 from the National Institute of Mental Health, National Eating Disorders Association Feeding Hope Fund Grant, 5 P30 DA029926 from the National Institute on Drug Abuse, and R01 MH123482 from the National Institute of Mental Health and National Institute of General Medical Sciences. Ellen Fitzsimmons-Craft receives royalties from UpToDate, is a consultant for Kooth, is on the Clinical Advisory Board for Beanbag Health, and receives speaking fees related to her research. Nicholas Jacobson has received a grant from Boehringer-Ingelheim, has edited a book through Academic Press and receives book royalties, and also receives speaking fees related to his research. Artificial intelligence was not used in the study, research, preparation, or writing of this manuscript.

Footnotes

Ellen E. Fitzsimmons-Craft, PhD, (pictured), is in the Department of Psychological and Brain Sciences and Department of Psychiatry, Washington University, St. Louis, Missouri. Nicholas C. Jacobson, PhD, is at the Center for Technology and Behavioral Health, Geisel School of Medicine, Dartmouth College, Hanover, New Hampshire.

References

  • 1. Hudson JI, Hiripi E, Pope HG Jr, Kessler RC. The prevalence and correlates of eating disorders in the National Comorbidity Survey Replication. Biological Psychiatry. 2007;61(3):348–358. doi: 10.1016/j.biopsych.2006.03.040.
  • 2. López-Gil JF, García-Hermoso A, Smith L, et al. Global proportion of disordered eating in children and adolescents: A systematic review and meta-analysis. JAMA Pediatrics. 2023. doi: 10.1001/jamapediatrics.2022.5848.
  • 3. American Psychiatric Association. Treatment of patients with eating disorders. The American Journal of Psychiatry. 2006;163(7 Suppl):4–54.
  • 4. van Hoeken D, Hoek HW. Review of the burden of eating disorders: mortality, disability, costs, quality of life, and family burden. Current Opinion in Psychiatry. 2020;33(6):521. doi: 10.1097/YCO.0000000000000641.
  • 5. Arcelus J, Mitchell AJ, Wales J, Nielsen S. Mortality rates in patients with anorexia nervosa and other eating disorders: A meta-analysis of 36 studies. Archives of General Psychiatry. 2011;68(7):724–731. doi: 10.1001/archgenpsychiatry.2011.74.
  • 6. Smink FR, van Hoeken D, Hoek HW. Epidemiology, course, and outcome of eating disorders. Current Opinion in Psychiatry. 2013;26(6):543–548. doi: 10.1097/YCO.0b013e328365a24f.
  • 7. Haynos AF, Egbert AH, Fitzsimmons-Craft EE, Levinson CA, Schleider JL. Not niche: eating disorders as an example in the dangers of overspecialisation. The British Journal of Psychiatry. 2023:1–4. doi: 10.1192/bjp.2023.160.
  • 8. Galmiche M, Déchelotte P, Lambert G, Tavolacci MP. Prevalence of eating disorders over the 2000–2018 period: a systematic literature review. The American Journal of Clinical Nutrition. 2019;109(5):1402–1413. doi: 10.1093/ajcn/nqy342.
  • 9. Daly M, Costigan E. Trends in eating disorder risk among US college students, 2013–2021. Psychiatry Research. 2022;317:114882. doi: 10.1016/j.psychres.2022.114882.
  • 10. Schaumberg K, Welch E, Breithaupt L, et al. The science behind the Academy for Eating Disorders’ nine truths about eating disorders. European Eating Disorders Review. 2017;25(6):432–450. doi: 10.1002/erv.2553.
  • 11. Streatfeild J, Hickson J, Austin SB, et al. Social and economic cost of eating disorders in the United States: evidence to inform policy action. International Journal of Eating Disorders. 2021;54(5):851–868. doi: 10.1002/eat.23486.
  • 12. Kazdin AE, Fitzsimmons-Craft EE, Wilfley DE. Addressing critical gaps in the treatment of eating disorders. International Journal of Eating Disorders. 2017;50(3):170–189. doi: 10.1002/eat.22670.
  • 13. Becker AE, Franko DL, Speck A, Herzog DB. Ethnicity and differential access to care for eating disorder symptoms. International Journal of Eating Disorders. 2003;33(2):205–212. doi: 10.1002/eat.10129.
  • 14. Cachelin FM, Striegel-Moore RH. Help seeking and barriers to treatment in a community sample of Mexican American and European American women with eating disorders. International Journal of Eating Disorders. 2006;39(2):154–161. doi: 10.1002/eat.20213.
  • 15. Marques L, Alegria M, Becker AE, et al. Comparative prevalence, correlates of impairment, and service utilization for eating disorders across US ethnic groups: Implications for reducing ethnic disparities in health care access for eating disorders. International Journal of Eating Disorders. 2011;44(5):412–420. doi: 10.1002/eat.20787.
  • 16. Ali K, Farrer L, Fassnacht DB, Gulliver A, Bauer S, Griffiths KM. Perceived barriers and facilitators towards help-seeking for eating disorders: A systematic review. International Journal of Eating Disorders. 2017;50(1):9–21. doi: 10.1002/eat.22598.
  • 17. Ali K, Fassnacht DB, Farrer L, et al. What prevents young adults from seeking help? Barriers toward help-seeking for eating disorder symptomatology. International Journal of Eating Disorders. 2020;53(6):894–906. doi: 10.1002/eat.23266.
  • 18. Fitzsimmons-Craft EE, Balantekin KN, Graham AK, et al. Results of disseminating an online screen for eating disorders across the U.S.: Reach, respondent characteristics, and unmet treatment need. International Journal of Eating Disorders. 2019;52(6):721–729. doi: 10.1002/eat.23043.
  • 19. SBIRT for Eating Disorders. https://eatingdisorderscreener.org
  • 20. Linardon J, Shatte A, Messer M, Firth J, Fuller-Tyszkiewicz M. E-mental health interventions for the treatment and prevention of eating disorders: An updated systematic review and meta-analysis. Journal of Consulting and Clinical Psychology. 2020;88(11):994. doi: 10.1037/ccp0000575.
  • 21. Nerdynav. 107 up-to-date ChatGPT statistics & user numbers [April 2024]. Accessed June 10, 2024. https://nerdynav.com/chatgpt-statistics/
  • 22. Blease C, Worthen A, Torous J. Psychiatrists’ experiences and opinions of generative artificial intelligence in mental healthcare: An online mixed methods survey. Psychiatry Research. 2024;333:115724. doi: 10.1016/j.psychres.2024.115724.
  • 23. Kanjee Z, Crowe B, Rodman A. Accuracy of a generative artificial intelligence model in a complex diagnostic challenge. JAMA. 2023;330(1):78–80. doi: 10.1001/jama.2023.8288.
  • 24. Sharma A, Lin IW, Miner AS, Atkins DC, Althoff T. Human–AI collaboration enables more empathic conversations in text-based peer-to-peer mental health support. Nature Machine Intelligence. 2023;5(1):46–57.
  • 25. Schueller SM, Morris RR. Clinical science and practice in the age of large language models and generative artificial intelligence. Journal of Consulting and Clinical Psychology. 2023;91(10):559. doi: 10.1037/ccp0000848.
  • 26. Landwehr J. People are using ChatGPT in place of therapy—what do mental health experts think? https://www.health.com/chatgpt-therapy-mental-health-experts-weigh-in-7488513
  • 27. Hale E. ChatGPT is giving therapy. A mental health revolution may be next. https://www.aljazeera.com/economy/2023/4/27/could-your-next-therapist-be-aitech-raises-hopes-concerns
  • 28. Li H, Zhang R, Lee Y-C, Kraut RE, Mohr DC. Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. NPJ Digital Medicine. 2023;6(1):236. doi: 10.1038/s41746-023-00979-5.
  • 29. Taylor CB, Bryson S, Luce KH, et al. Prevention of eating disorders in at-risk college-age women. Archives of General Psychiatry. 2006;63(8):881–888. doi: 10.1001/archpsyc.63.8.881.
  • 30. Chan WW, Fitzsimmons-Craft EE, Smith AC, et al. The challenges in designing a prevention chatbot for eating disorders: Observational study. JMIR Formative Research. 2022;6(1):e28003. doi: 10.2196/28003.
  • 31. Fitzsimmons-Craft EE, Chan WW, Smith AC, et al. Effectiveness of a chatbot for eating disorders prevention: A randomized clinical trial. International Journal of Eating Disorders. 2022;55(3):343–353. doi: 10.1002/eat.23662.
  • 32. Jargon J. A chatbot was designed to help prevent eating disorders. Then it gave dieting tips. Wall Street Journal. June 1, 2023.
  • 33. King DR, Nanda G, Stoddard J, et al. An introduction to generative artificial intelligence in mental health care: considerations and guidance. Current Psychiatry Reports. 2023;25(12):839–846. doi: 10.1007/s11920-023-01477-x.
  • 34. Zhao WX, Zhou K, Li J, et al. A survey of large language models. arXiv preprint arXiv:2303.18223. 2023.
  • 35. Center for Countering Digital Hate. AI and eating disorders: How generative AI is enabling users to generate harmful eating disorder content. 2023.
  • 36. Heinz MV, Mackin DM, Trudeau BM, et al. Evaluating Therabot: A randomized control trial investigating the feasibility and effectiveness of a generative AI therapy chatbot for depression, anxiety, and eating disorder symptom treatment. 2024.
  • 37. De Freitas J, Cohen IG. The health risks of generative AI-based wellness apps. Nature Medicine. 2024:1–7. doi: 10.1038/s41591-024-02943-6.
