ABSTRACT
Artificial Intelligence (AI) has rapidly altered the practice of couple/marital and family therapy (CMFT). While much has been written in other disciplines about the competencies needed to integrate AI into one's work, to date no such competencies have been presented for CMFT. In this article, we present the first competencies for AI in CMFT. These new competencies span six domains and were informed by three other competency frameworks: the condensed core competencies for CMFT, interdisciplinary competencies for telebehavioral health, and AI competencies for medicine. Emphasis is placed on the development of AI‐specific competencies that align with relational ethics and person‐centered care.
Keywords: AI competencies, artificial intelligence, couple and family therapy, ethics, systemic therapy, telebehavioral health
1. Introduction
Couple/marital and family therapy (CMFT) has consistently emphasized the importance of human connection, relational dynamics, and systemic thinking (Wampler et al. 2019). These tenets are now being tested by the emergence of tools powered by artificial intelligence (AI), such as chatbots, large language models (LLMs), and embodied AI agents (Stade et al. 2024). AI broadly refers to systems that use algorithms, hardware, and data to simulate human intelligence and task performance, such as decision‐making, pattern recognition, and language processing (Bajwa 2024). In CMFT, AI introduces both opportunities for remarkable innovation and complicated dilemmas that challenge longstanding clinical practice and training norms. AI is often used in CMFT through tools that provide predictive analytics, diagnostic support, between‐session support, natural language processing, and conversational agents (i.e., chatbots) (Espejo et al. 2023). As CMFTs incorporate AI into their service delivery, there is a growing imperative to understand the implications of AI for a CMFT's clinical training, therapeutic outcomes, and practice (Babu and Joseph 2024; Bajwa 2024; Springer et al. 2020; Taylor et al. 2025).
In CMFT, AI supports therapists and clients alike. Therapists use AI to streamline case documentation, identify problematic clinical patterns, support diagnostic hunches, and develop highly personalized treatment plans (Espejo et al. 2023; Arbind Kumar et al. 2025; Manole et al. 2024). Clients engage with AI tools for symptom tracking, guided interventions, or support between sessions (Saeidnia et al. 2024; Shaik et al. 2023). An even larger number of people are turning to AI tools as a substitute for traditional therapy. For example, OpenAI (a research and deployment company) reported that over a million people talk to ChatGPT about suicide weekly (Zeff 2025). Other adults report using AI systems, including mental health chatbots, for emotional support because they are more accessible and cost‐effective than seeing a human therapist. These varied uses among clinicians and clients underscore AI's growing role in the mental health arena. For CMFTs, AI will play an increasingly significant role in clinical practice, either as an adjunct to care or as the primary service delivery agent.
The use of AI in CMFT, however, is not without controversy (Pandya et al. 2024). Clients may not always use AI in productive or therapeutic ways: many already use AI tools for therapy‐like purposes, with or without therapist guidance. They may turn to AI chatbots to seek confirmation of their experiences and to validate or justify their feelings and actions (Khawaja and Bélisle‐Pipon 2023), even when that justification is contraindicated by the treatment goals. Clients motivated to solve an issue without sitting with uncomfortable feelings until the next therapy session may engage in chatbot therapy in the interim, posing prompts that yield a more palatable perspective or solution. Clients may also compare therapist and AI guidance and weigh which interventions to use without disclosing this to the therapist.
Therapists can also use AI in ways that do not promote best practices and can inadvertently compromise client care. Therapists might query AI in search of diagnoses (Muetunda et al. 2024); in so doing, they may provide enough detail to violate confidentiality, or rely heavily on the feedback provided by the AI system and less on other aspects of the clinical presentation or interview that would indicate a different diagnosis. There may also be heavy, or potentially disproportionate, reliance on AI for developing interventions (Békés et al. 2025). The tendency may be to generate an intervention for a person and their symptoms, not for the process or the relationships. Overreliance on such technologies may inhibit the critical thinking needed to address the dynamic in unique ways (Carr 2008). If clinicians rely on AI‐derived patterns, those patterns can inadvertently bias their framing, interventions, and language, potentially overriding their human‐centric systemic observations. In fact, AI tools can reinforce confirmation bias, leading therapists to misinterpret adaptive outputs as objective truths.
Further compromising the procedural and ethical integration of AI in CMFT is the fact that there are no systemic ethical codes or established competencies for AI use in our discipline. This creates important and unique challenges for CMFTs in how we use AI in our current platforms (e.g., recording, autopopulating responses in sessions and notes, the impact of confirmation bias, and legal ramifications).
CMFTs must now determine how to preserve the relational core of therapy—one of, if not the, defining tools of our trade (Babu and Joseph 2024; Hubble et al. 1999)—while harnessing the efficiency and analytical power of AI in competent ways (Elyoseph et al. 2024; Volkmer et al. 2024). This paper offers a first step toward addressing these issues: a high‐level overview of the competencies therapists need to engage with the new third entity in the therapy room: AI. To establish the need for AI competencies in CMFT, we describe how these systems have irrevocably altered systemic practice, outline instances in which competencies have been introduced to address an emerging need, and present an introductory set of AI competencies for CMFT. The goal of this article is not to provide a treatment manual or a model‐specific application (e.g., integrating AI within a specific framework or writing appropriate prompts), but rather to begin a critical conversation about how AI can be intentionally and responsibly integrated into the practice of CMFT by presenting this new set of competencies.
2. The State of Artificial Intelligence in CMFT
The digital revolution constitutes the fourth industrial revolution (Sixto‐García et al. 2024); the state of AI competencies in CMFT is likewise in a paradigm shift, positioned between first‐order and second‐order conceptualizations. Thus far, CMFTs commonly accept and use AI as a tool in a first‐order way. Specific first‐order applications of AI include:
Using AI for writing case notes (Kolding et al. 2024)
Providing interventions (Graham et al. 2020)
Providing supervision (Maurya and DeDiego 2025)
Prescribing AI‐guided work in between sessions (Diano et al. 2023)
Using conversational AI agents (e.g., Woebot™, Wysa™, and Moodfit™) to help clients identify emotions and learn skills to reduce anxiety, improve overall resilience, and reduce depressive symptoms (Fitzpatrick et al. 2017; Ray et al. 2022; Sachan 2018).
Machine learning, or systems that learn from data and make decisions that improve performance on specific tasks without being explicitly programmed, including assisting in developing treatments and interventions (Noll et al. 2022; Tekin 2023)
While valuable, viewing AI as only a tool is a limited account of how AI is actually used in CMFT and discounts how AI alters the process of therapy in a second‐order way. AI's influence on dynamics and reflection, both within and outside of the session, affects the rules of therapy and the roles of the therapist, clients, and AI in treatment, and may maintain the problem. Shifting toward viewing AI in a second‐order way requires that CMFT appreciate AI's role as a third actor/third entity through which new rules must be negotiated and governed, and whose unique contributions fundamentally change the practice for clients, therapists, and the profession as a whole.
AI is embedded in CMFT in ways that go beyond service delivery. It represents a sweeping shift in how the system views itself and the problem. It introduces complex, recursive changes into the therapeutic system that demand immediate professional attention by the discipline that knows this space best: CMFTs. The CMFT field has a history of profound innovation, notably leveraging concepts like cybernetics and general systems theory (Bowen 1978; Bateson 1972), which shifted clinical practice from linear to circular causality and established the competency of viewing clients systemically. That shift was not about learning new techniques, but about adopting a completely new epistemology, a different way of knowing and observing human problems. It also required that therapists consider themselves part of the relationship and dynamic in the room, operate from a level of reflexivity, and examine isomorphism in their practices. Therapists paid attention to feedback loops and attempted to use them to their advantage in the treatment process. Examples of CMFTs using AI in second‐order ways include:
A client's undisclosed recording of a session, which fundamentally alters the dynamics of confidentiality and trust within the relational field.
Using natural language processing (NLP) to analyze raw data such as clinical case notes and recorded/transcribed therapy sessions, providing summaries, treatment suggestions, and predictive analytics to determine which clients are at risk for developing further disorders (Cambria and White 2014; Graham et al. 2019; Sachan 2018)
Using machine learning to teach pattern detection or adherence to a model (Le Glaz et al. 2021)
Platforms that mediate connection, offer diagnosis and support, and evolve by adding new information with each search term entered
Attention to how camera angle, Wi‐Fi stability, and interface design all influence affective tone and empathy exchange (Grondin et al. 2021)
Considering the AI as a third actor in the room
AI‐based chatbots, wellness apps, and therapeutic algorithms that co‐construct meaning and alter the results for others moving forward (a prime example of recursive feedback loops, Bateson 1972; Lee et al. 2025)
Second‐order thinking requires the therapist to acknowledge their own role in the system and leverage interaction with that “third person” (the therapeutic relationship itself) to challenge and shift the underlying rules and epistemology of the therapeutic system, now infused with AI. Systemic therapies must expand from human‐to‐human dyads to human–human–technology triads. Using AI also affects one's clinical decision‐making: therapists who depend on AI for determining patterns may be subject to bias in the way issues are framed and interventions are shaped, with AI tools reinforcing confirmation bias and leading therapists to misinterpret adaptive outputs as objective truths. In short, AI introduces a new epistemology of information (a new way of gathering and presenting data) and a new mode of communication and pattern identification that fundamentally transforms the practice environment for CMFTs.
3. The Critical Role of the Profession and Organizations in AI Competency Development
In addition to the systemic innovations that moved our field from linear to circular causality, the Marriage and Family Therapy field was an early adopter of audio and video recording technology, with the advent of audiotapes, portable reel‐to‐reel systems, and later VHS recording systems (McKenzie et al. 1986). This allowed the pioneers of our field, such as Carl Whitaker, Don Jackson, Jay Haley, and Salvador Minuchin, to use live session recordings to enhance clinical systemic training (Lawson and Falke 2017). Family therapy's emphasis on observable interaction patterns, nonverbal communication, and dyadic relational dynamics made audio and video recordings ideal for helping clinicians observe real cases without intruding in session. Recording also greatly enhanced systemic supervision and live supervision, a hallmark of the CMFT field (Bartle‐Haring et al. 2009).
In the past, AAMFT and COAMFTE established policies that, while well‐intentioned, inadvertently hindered innovation and the early adoption of competencies in AI. While there was a general push for technology integration across clinical training and research, COAMFTE's pre‐2020 policy did not permit student therapists to count telebehavioral health (TBH) hours toward their required clinical experience, even when sessions involved couple and family therapy under AAMFT‐approved supervision. This approach directly hindered the development of essential competencies. It also slowed the integration of technology into clinical training programs and further discouraged competency development in CMFT, a message contrary to the direction other mental health disciplines were taking (Novotney 2011).
To address this lag, the Commission on Accreditation for Marriage and Family Therapy Education (COAMFTE) responded with significant and necessary updates to its standards. These positive changes centered on formal recognition of digital practice and the mandate for new technological competency. The most immediate and critical change was the acceptance of clinical hours earned through TBH. The commission also revised its policies to include procedures for both virtual and on‐campus site visits for program review, normalizing the delivery of education and supervision through technology. In addition, the latest COAMFTE Standards (Version 12.5) explicitly set parameters for the inclusion of entry‐level training and experience in teletherapy practice (Commission on Accreditation for Marriage and Family Therapy Education 2021). This shifted TBH from an optional skill to a required professional competency. Examples of new competencies referenced in these documents include state‐specific licensure laws, inter‐jurisdictional practice, and mobility; technological security, including compliance with privacy laws (e.g., HIPAA), secure use of platforms, and data encryption; and identification of the effectiveness of TBH for specific clients and relational issues.
AAMFT also published Best Practices in the Online Practice of Couple and Family Therapy, which suggested that AAMFT work with COAMFTE to develop competencies for online therapy to be embedded in COAMFTE programs. These changes have effectively worked to address the lag and sought to ensure that graduating CMFTs are competent in the modalities required to deliver care in contemporary practice (technology and digital ethics).
But more work needs to be done. AI continues to challenge the field, both through advancing technology and through its presence as a third actor in the room. Competencies must therefore be developed above and beyond specific applications (first‐order) to address the expansive impact of AI on our profession (second‐order).
Reddy (2023) argued for discipline‐specific AI standards so that precise regulations can provide clear guidelines for addressing the challenges of AI use, ensuring thoughtful, ethical, and legal AI implementation (Howley and Whelan 2025). Garvey et al. (2021) considered it imperative for disciplines to develop training, skillsets, and mechanisms to regulate and inform how AI influences clinical disciplines; specifically, they call for clinical fields to articulate the competencies needed to accommodate AI training and education. And yet, we could not find any published AI competencies for mental health practitioners. CMFT, as a clinical field, is subject to this imperative.
4. Building the AI Competencies for Couple/Marital and Family Therapy (AICMFT)
AI not only alters therapeutic practice but also transforms what is considered effective, appropriate practice. It demands the creation of a new standard of care. Traditional models of how therapists become competent at their craft emphasize supervision, feedback, and the acquisition of a particular skill set. In this new age of AI, that skill set must also be obtained through new modalities and techniques. In fact, it is in training and the provision of feedback that AI is already playing a significant role (Kopelovich et al. 2025). As AI becomes embedded in therapeutic practice, CMFTs must be equipped with new competencies that complement traditional clinical training. These include both TBH competencies and AI‐specific knowledge. Three frameworks inform the development of the AICMFT: (1) Condensed Core Competencies for MFT (Northey and Gehart 2020), (2) Interprofessional TBH Competencies (Maheu et al. 2018), and (3) AI Competencies for Healthcare Professionals (Russell et al. 2023) (see Figure 1).
Figure 1.

Competency frameworks informing systemic AI competencies. AI, artificial intelligence; CMFT, couple/marital and family therapy; TBH, telebehavioral health. [Color figure can be viewed at wileyonlinelibrary.com]
4.1. Condensed Core Competencies for MFT
Introduced by AAMFT in 2004 (American Association for Marriage and Family Therapy 2004), the original CMFT core competencies (MFT‐CC) comprised a list of 128 competencies to be taught by CMFT training programs (Nelson et al. 2007). The MFT‐CC were those competencies agreed on by at least four members of the steering committee and were organized by type of skill: conceptual, evaluative, executive, perceptual, and professional. This remarkable effort was a response to the call for competency‐based education and moved the field forward, further legitimizing its training techniques and the field itself as a discipline.
Motivated in part by the difficulty of teaching and measuring so many items and the unrealistic time commitment required to do so reliably, Northey and Gehart (2020) distilled the MFT‐CC into a condensed set of competencies (C‐MFT‐CC) that reflected the skills needed for independent practice and were more manageable to teach and assess. The C‐MFT‐CC comprise 16 items covering: MFT Theories; Human Development; Cultural and Contextual Awareness; Selecting Treatment Models; Therapeutic Relationships; Diagnosis; Relational Assessment; Treatment Planning; Treatment, Intervention, and Practice; Safety Planning; Collaboration; Law and Ethics; Supervision and Consultation; Self‐of‐the‐Therapist; Measuring Effectiveness; and Research.
In short, each of these sets of competencies for couple and family therapists focuses on the following: (1) clinical assessment (ability to assess mental health conditions), (2) therapeutic techniques (ability to apply theory and interventions in real time), (3) contextual competence (respecting diverse populations, unique circumstances, and personalized mental health needs), and (4) ethical and legal considerations (command of and compliance with AAMFT's ethical standards and navigation of legal aspects). Thus, these domains must be represented in the AICMFT.
4.2. Interprofessional TBH Competencies
The Interprofessional TBH competencies were developed by a strategically assembled collection of experts dedicated to articulating and training the skills necessary for all behavioral health clinicians to be successful in this new environment (Maheu et al. 2018). Members were selected based on their notable experience in TBH, including developing and reviewing TBH standards, leading national TBH initiatives, writing extensively on TBH, actively teaching and mentoring in TBH spaces, and delivering professional presentations and trainings on TBH. Hertlein et al. (2021) described how these competencies could be adapted for CMFT. While the article referenced the entire set of competencies, its focus was on the areas of clinical evaluation and care, telepresence and virtual environments, and legal and regulatory issues. For each domain, Hertlein et al. (2021) detailed how it applies to a couple and family therapist's practice. Other work has also been done to develop competencies for the supervision of relational telemental health (Springer et al. 2020).
There are seven general domains with specific competencies in each. The first domain is Clinical Evaluation and Care. Broken into three subdomains, the objectives of this domain are to train clinicians to make evidence‐based decisions in client care and to competently deliver services through asynchronous and synchronous technologies. Sample competencies include conducting appropriate assessments for services, monitoring client comfort, effectively adapting in‐person care strategies, monitoring engagement, and establishing professional boundaries. The second domain, Virtual Environment and Telepresence, aims to help the clinician effectively use technology to build rapport and maintain therapeutic presence.
Specific competencies under this domain include the clinician's ability to adjust their clinical environment to support TBH, facilitate presence, and create an environment free from distractions, much like in‐person clinical care. The third domain is Technology. Its objective is to ensure the clinician has enough knowledge of hardware and software to be an effective TBH clinician. Specific competencies in this domain include the ability to address technology issues when they arise, knowledge of security in software systems, and enough technological understanding to ascertain which systems provide the most benefit for privacy and confidentiality.
The fourth domain concerns Legal and Regulatory Issues. Its primary goal is to demonstrate adherence to local, regional, and federal laws and regulations governing TBH practice; specific competencies center on practicing in alignment with those regulations and laws. The fifth domain is Evidence‐Based and Ethical Practice, centered on ensuring that clinicians utilize best practices in their service delivery. Specific competencies include, but are not limited to: identifying relevant documents for ethical TBH service delivery, discussing appropriate uses of social media, and demonstrating knowledge of how social media can negatively impact treatment.
The sixth domain is Mobile Health Technologies. This domain concerns wearables and includes competencies about the effects of wearables on treatment outcomes and the therapeutic relationship. Examples of competencies in this domain include educating clients that TBH is evidence‐based and reviewing with clients the privacy policies of specific apps. The final domain is Telepractice Development, which focuses on how a therapist advertises their services and stays engaged in training and knowledge about technology in practice. Examples include developing an appropriate marketing plan and ensuring adherence to relevant marketing law.
While often confused with TBH, AI is distinct in both function and purpose. TBH—a broader application of technology under which AI use can fall—involves the delivery of behavioral health services through telecommunications technologies (Haber et al. 2024). AI, on the other hand, is an embedded system that automates or enhances aspects of clinical care, such as assessment, documentation, or treatment planning, without necessarily involving human‐to‐human interaction (Espejo et al. 2023; Miner et al. 2019). Indeed, AI's place in clinical care grew directly out of TBH: the systems that were central to TBH helped contribute to developments in AI.
One reason the Coalition for Technology in Behavioral Science (CTiBS) group was so successful in writing this document is that the workgroup's membership represented each of the major disciplines in the conversation: social work, psychiatry, psychology, couple and family therapy, counseling, and nursing. It was from these professional spaces and expert voices that conversations kept returning to the same few issues: (1) how to effectively use technology (competence in using TBH platforms and troubleshooting basic technology issues), (2) how to relate to clients via TBH (engagement and verbal and nonverbal communication), (3) how to understand and manage security in a new way (including confidentiality and privacy), (4) how to manage emergencies and crises, (5) how to demonstrate knowledge of and compliance with legal and ethical TBH practice (a special challenge since, in many states, provisions for practice had not yet been developed and technological advances were outpacing policy), and (6) how to ensure cultural and contextual sensitivity (TBH practices accounting for cultural nuances in virtual spaces). As with the Northey and Gehart (2020) condensed competencies, these too must be accounted for in the AICMFT.
4.3. AI Competencies for Healthcare Professionals
Russell et al. (2023) articulated the first known AI competencies for medicine. This list closely mirrors competencies not specifically related to AI and further supports the idea that disciplines should adapt their competencies to incorporate advances in AI. The first competency from Russell et al. (2023) is AI Literacy: medical professionals need to be aware of how AI tools are trained (i.e., what data are used and what is input to achieve certain results) so that they have a clearer view of the tools' limitations and intentions. The second competency is Ethical and Social Implications, which centers on ensuring that practitioners follow ethical practices in data management, with a premium on securing data privacy. Clinical Application is the third competency and focuses on using AI to appropriately improve the clinical encounter with a patient. In medicine, there is a growing push for personalizing care (Aboraya 2022; Hays 2021; Moon et al. 2024), and this competency continues to move the field in that direction. The fourth competency is Evidence‐Based Evaluation, which speaks specifically to whether the AI tools being used are effective and safe for users and patients. This involves reliance on the scholarly literature but also a continued conversation with developers and those in the medical profession to communicate needs and provide feedback about how well the tools achieve their goals.
The fifth competency is Workflow Integration, which points to the role that AI plays in clinical decision‐making and documentation processes. Because software products vary in their capacities, capabilities, and integration with other systems, adopting AI significantly alters one's workflow; even changing products to perform the same task may require redesigning many workflows across a healthcare activity, from patient care to documentation and all components in between (Novak et al. 2023). The sixth competency, Practice‐Based Learning and Improvement Regarding AI‐Based Tools, concerns the evolving nature of these technologies and makes space for medical professionals to continually evaluate their knowledge and command of these systems as they are integrated into their work.
5. AICMFT
The development of the AICMFT is a response to the innovations of AI in the therapy room and to the demand that practicing therapists develop these skills to be effective in today's technology‐informed, highly competitive mental health marketplace, especially from a relational context. These competencies were developed with the three collections of competencies described above in mind, and each domain reflects each of those groups of competencies (see Table 1).
Table 1.
Competency frameworks logic model.
| AI competencies for couple and family therapy domains | Systemic influence (condensed CMFT competencies; Northey and Gehart 2020) | TBH influence (TBH competencies; Maheu et al. 2018) | AI in medicine influence (AI competencies; Russell et al. 2023) |
|---|---|---|---|
| 1: AI literacy in telebehavioral health tools | MFT theories; selecting treatment models | Technology | Basic knowledge of AI |
| 2: Contextual and systemic integration | Cultural and contextual awareness; relational assessment | Clinical evaluation and care | AI‐enhanced clinical encounters |
| 3: Ethical and culturally responsive practice | Cultural and contextual awareness | Clinical evaluation and care; evidence‐based and ethical practice | Social and ethical implications of AI |
| 4: Legal and regulatory understanding | Law and ethics | Legal and regulatory issues | Basic knowledge of AI |
| 5: Relationship‐ and person‐centered care | Therapeutic relationship | Clinical evaluation and care; virtual environment and telepresence | AI‐enhanced clinical encounters |
| 6: Continuous professional development | Self‐of‐the‐therapist; measuring effectiveness | Telepractice development | Workflow analysis for AI‐based tools; practice‐based learning and improvement regarding AI‐based tools |
There are six proposed domains for AI Competencies for CMFT: (1) AI Literacy in Telebehavioral Health Tools, (2) Contextual and Systemic Integration, (3) Ethical and Culturally Responsive Practice, (4) Legal and Regulatory Understanding, (5) Relationship‐ and Person‐Centered Care, and (6) Continuous Professional Development. Together, these competencies provide a comprehensive skill set for therapists seeking to responsibly engage with AI in relational contexts, operate from a place of systemic thought, and reflect the consistent themes observed in the aforementioned competency lists.
Table 2 outlines the description of each competency domain as well as the specific competencies under each. The language used for the specific competencies was informed by Bloom's taxonomy for two reasons. First, it establishes a way to think about how the competencies can be observed. Second, it is the system most akin to Nelson et al.'s (2007) competencies, which are organized into five different skill sets.
Table 2.
Couple and family therapy AI domains and competencies.
| AI competency domains | Description of domain | Specific AI competencies for couple and family therapy (CMFT) |
|---|---|---|
| 1: AI Literacy in Telebehavioral Health (TBH) Tools | Responsibly integrate tools that support client engagement and meet the therapeutic objectives. | |
| 2: Contextual and Systemic Integration | Systemic therapists should be able to integrate/synthesize therapeutic techniques and CMFT frameworks into AI while maintaining the integrity of the CMFT frameworks, one's personal theory of change, and therapeutic relationships. | |
| 3: Ethical and Culturally Responsive Practice | Practice with awareness of cultural, contextual, and ethical issues. | |
| 4: Legal and Regulatory Understanding | Practice consistent with the guidelines and standards of relevant jurisdictions. | |
| 5: Relationship‐ and Person‐Centered Care | Establish and sustain emotionally attuned, secure, and equitable therapeutic relationships that reflect the integrity and focus of systemic therapy. | |
| 6: Continuous Professional Development | Establish a plan for continued learning about AI and its impacts on CMFT. | |
6. AICMFT Domain #1: AI Literacy in TBH Tools
The first domain, AI Literacy, speaks to the need for CMFTs to be highly knowledgeable about technology systems. As the old adage goes, knowledge is power. Yet this suggestion may be a precarious prescription for systemic therapists concerned about the perceived power they wield in the room. Informed by many postmodern approaches, the honoring of multiple truths and realities may not square with the modernist perspective more closely associated with learning about and leaning into AI and technological systems. Another concern is that leaning into AI literacy may appear to dismiss AI's known limitations, such as its fabrications and its lack of knowledge in particular areas. Rather than moving away from these already‐acknowledged limitations, CMFTs need to know that they occur, understand under what circumstances they arise, and understand AI well enough to collaborate with developers to achieve effective results.
Becoming literate in a new discipline is the first step to competency development. First, AI literacy can reduce hesitation and misconceptions about technology use, fostering a more open and informed perspective toward AI adoption. Second, AI literacy allows CMFTs to recognize areas where AI tools can effectively support their practice (both administratively and clinically). For example, there are several AI tools that can support administrative tasks (e.g., Clinical Notes AI™, Jotpsych™). These tools can automate notetaking and session summarization, provide treatment recommendations, and enhance patient monitoring. Third, AI literacy increases one's awareness of the ethical implications concerning client privacy and confidentiality, which is critical when making decisions about what tools to adopt. Fourth, with this knowledge CMFTs are better able to choose AI tools that align with their theoretical approach and treatment plan. For example, CMFTs may choose to utilize AI‐supported chatbots to help clients practice mindfulness techniques outside of session, with the goal of empowering a client to better manage their anxiety, so long as this use aligns with the CMFT's treatment plan.
Ultimately, CMFTs need to realize that we may already be utilizing AI tools without knowing it, as they are already integrated into many of the telehealth platforms and software systems we use. Platforms such as Simple Practice™ and Zoom™ have AI‐enhanced features that automatically respond to client inquiries and assist with note taking. Other mental health software applications include the ability to analyze facial expressions and vocal patterns to assess client emotional states (e.g., Affectiva™) (Therapy Talk Team 2025). Understanding how to utilize these tools, and in what contexts, will be critical for the future of CMFTs. See Table 2 for specific competencies in this domain.
6.1. AICFT Domain #2: Contextual and Systemic Integration
The second competency domain—Contextual and Systemic Integration—leans heavily into more traditional systemic practices. This domain focuses on how to effectively use AI in one's practice. It includes understanding how AI can be used to advance clinical decision‐making (not replace it), how to apply systemic theory across digital platforms, and how to use standard CMFT tools such as the genogram in digital forms. This domain is the space where systemic therapists can compete with similar disciplines that market personalized health care options and tailor their techniques to the clients in front of them.
Selecting AI tools that enhance our systemic practice and keep CMFTs true to the field's relational origins is crucial. As discussed previously, there are myriad therapeutic AI tools on the market that could enhance therapeutic treatment goals, especially when conceptualized and supervised through a family systems approach. It is up to the CMFT to ensure they are using the tools to meet these systemic goals. This includes using the tools for assessment, for in‐session interventions, and for broader applications such as outcome monitoring or the use of assessment instruments (e.g., AI tools integrated with common CMFT frameworks, such as Gottman or CBT approaches). For more detail, see Table 2.
It is also important that as clinicians we do not overrely on AI tools to the point that they take the place of therapy. Whether a CMFT utilizes an AI tool to teach mindfulness or to help a client recognize their cognitive distortions, tying it back to systemic outcomes will be essential. One example of a relational application used by CMFTs—Patient Notes™—is designed to assist with documenting and highlighting family dynamics and creating therapeutic strategies. Determining whether this software or others can assist the therapist in achieving their relational/systemic outcomes must be at the forefront of our decision making and will have an impact from a second‐order change perspective.
6.2. AICFT Domain #3: Ethical and Culturally Responsive Practice
The third domain contains competencies that reflect an Ethical and Culturally Responsive Practice. As mentioned earlier regarding TBH's ability to serve populations who may not otherwise have access to services, technology and AI have the potential to be an equalizer, at least to an extent. The competencies in this domain prepare therapists to understand how AI can advance equity, to implement the AAMFT guidelines and Code of Ethics as they apply to AI, and to take leadership in the therapeutic relationship in protecting clients from harm.
While AAMFT currently lacks ethical guidance for the use of AI in practice, we suggest the following ethical considerations, established by the American Psychological Association (2025), when integrating AI into your systemic practice. The first is to include a statement in your informed consent, similar to what is required when providing online therapy. This statement should clearly disclose whether AI is being used in treatment and the risks and limitations of AI tools. Second, clinicians need to evaluate the AI tools they use in therapy to mitigate any bias that could be introduced into the treatment process, especially for underserved and underrepresented groups. Third, CMFTs need to select AI tools that are HIPAA compliant, thus ensuring the data privacy and security of client information. Fourth, ensure that AI tools have been validated so that the information being provided is accurate and misinformation is reduced. Fifth, AI tools should only be used to augment, not replace, therapy. In other words, CMFTs need to be the final decision makers on whether any recommendation an AI tool makes is ethical, is based in sound clinical judgment, and aligns with systemic theories and treatment outcomes. Finally, CMFTs need to be aware of the liabilities, risks, and responsibilities associated with using AI. Currently, state licensing agencies are beginning to address AI use in the mental health field. Keeping up to date on the laws and regulations in place in one's state will be critical for ethical practice (see Domain #4) (APA 2025).
While early studies have shown that deep learning models are effective in disease risk prediction for clinical and non‐clinical data (Su et al. 2020), these models are not without limitations. First, they depend on large and diverse data sets, which are crucial for ensuring that the models are accurate and predictive, and such data sets are scarce. This scarcity is further complicated in CMFT, a field that generally lacks large databases because of its focus on process (Johnson et al. 2017). Next, as with other ML models, bias in data sets and patient privacy need to be addressed, as do competencies in how clinical decision making is done with these data (Su et al. 2020).
Despite conversational AI's utility, this emerging technology is limited by many of the same issues discussed previously, including data privacy, data set bias, and hallucinations that result in inappropriate responses. These data sets also lack diversity and cultural nuance, leading to biased responses (Fabuyi 2024; Russell et al. 2023; Stade et al. 2024). To review the specific competencies under this domain, view Table 2.
6.3. AICFT Domain #4: Legal and Regulatory Understanding
The fourth domain is Legal and Regulatory Understanding. Each state has its own provisions for what systemic therapists can do in their role. It is incumbent upon CMFTs to remain up to date on the laws and regulations that state licensure boards or national organizations place on the use of AI in clinical practice. More uniquely, it requires that CMFTs have knowledge about how AI systems are designed, who owns the data, and how algorithms change. These are questions systemic therapists have not had to ask before; they reflect the second‐order impact of AI on the profession and are now part of the landscape of the work.
For example, three states (New York, Nevada, and Utah) have created new laws and/or policy regulations related to AI use in clinical practice. These laws range from requiring companies to disclose that AI chatbots are not human to restricting behavioral healthcare providers from using AI systems while treating patients. In each case, these laws and regulations regard AI as a third actor and aim to ensure community safety (Ruder 2025). It is imperative that CMFTs learn about the regulations and laws that exist in the states where they practice. Where laws do not yet exist, we suggest following the competencies established in this paper, as well as seeking out the guidance established by other national organizations, including the American Counseling Association (American Counseling Association 2025), the National Board for Certified Counselors (National Board for Certified Counselors 2024), and the Utah Office of Artificial Intelligence Policy's guidance letter on best practices for the use of artificial intelligence (Office of Artificial Intelligence Policy: Utah Department of Commerce 2025).
For example, ML systems built on large language models (LLMs) are prone to fabricating information, a phenomenon known as hallucination (Azamfirei et al. 2023; Roustan and Bastardot 2025). This is because the large models pull from various sources indiscriminately, and as a result their outputs can be factually incorrect or unsupported by training data or real‐world facts. This makes using LLMs in clinical settings a concern and requires CMFTs to do further research to ensure that these models are providing accurate information (Machine Learning is Bridging the Gap in Mental Health Services 2025).
Despite the promising potential for early screening and diagnosis, as well as digital interventions (e.g., chatbots, open GPTs) that can provide personalized interventions, psychoeducation, and clinical assistance (Stade et al. 2024), these tools are still in their early phases of development (Hu et al. 2025). Hu et al. (2025), in the most comprehensive review of large language models to date, caution that there are no standardized frameworks for evaluating the effectiveness and safety of these mental health applications. This lack of standardized evaluation hinders the comparison of models and the assessment of their true impact on mental health outcomes. In addition, there are serious concerns about data privacy, transparency, reproducibility of the data, and general effectiveness when using open GPTs (Hu et al. 2025). As a result, CMFT as a profession must be cautious about advocating for or using these as standalone interventions and should instead focus on their use under the care of trained CMFTs (Hu et al. 2025). Even so, there is strong evidence that clinicians are interested in using LLMs in their practice but need more training and support to implement them effectively and safely (Mirzaei et al. 2024). To review the competencies under this section, see Table 2.
6.4. AICFT Domain #5: Relationship‐ and Person‐Centered Care
The fifth domain concerns the competency of relationship development and maintenance: Relationship‐ and Person‐Centered Care. The significant value the profession places on relationships and connection to learners, faculty, and clients has set CMFT back as a profession in its adoption of TBH and AI. The technology—when used correctly—should support (and perhaps improve) the relationship rather than feel limiting, distracting, or risky. This includes helping practitioners develop an appropriate sense of presence over technology, providing support with boundaries, and communicating care and compassion across any platform. Specific competencies in this section include teaching our trainees how to model healthy communication within relationships over technology (since our clients are already using technology to talk to each other) and how to apply systems theory to create relational depth and increase a sense of social presence. Hertlein (2012), in discussing the Couple and Family Therapy Technology Framework, highlighted the way that social presence advances communication and disclosure. These key aspects of the therapeutic process can be achieved through effective use of AI. This is just one example of the competencies in this section, but it speaks to the ability of AI to transform the work of systemic therapists in positive ways.
Another competency in this section is a focus on interprofessional work and developing a skill set that drives collaborating with and learning from interprofessional teams. Drude et al. (2020) highlighted the barriers to interprofessional education, including professional centrism, exclusive educational practices, and the lack of an interdisciplinary perspective in many professions. Drude et al. (2020) believed that technology is a unifying force influencing every behavioral health profession and that the skills in applying technology in practice are not unique to any one mental health profession but rather are skills that can be learned (1) en masse and (2) in settings where similarly situated professionals with common interests can work together to solve problems and develop networking relationships with one another (see Table 2).
6.5. AICFT Domain #6: Continuous Professional Development
The final domain—Continuous Professional Development—is focused on the field of CMFT making an active commitment to evolve as AI tools evolve. CEUs, for example, function to keep CMFTs apprised of changes in law, policy, ethics, standards of care, and effective practices. Keeping pace with AI would require CMFTs to continue learning about advances in technologies, increased safety risks in the systems they use, new systems with different (and hopefully fewer) vulnerabilities in terms of safety and care, and evidence of effectiveness from studies performed in our discipline with our populations. The need for CMFTs to engage in continuous professional development, including recommendations for seasoned practitioners to engage in supervision when using TBH, has been discussed previously (Hertlein et al. 2021). Such an approach, however, requires that CMFTs continue to embrace a stance of humility rather than fear when approaching these technologies, so as to avail themselves of the lessons to be learned (Hertlein et al. 2021). Specific competencies in this domain include understanding and navigating AI‐enhanced workflows (e.g., EHR integration, AI dashboards) while safeguarding relational information and honoring relational ethics in team‐based care, staying current with research and best practices in using AI, and staying current with new laws and limits on usage. Visit Table 2 for more detail.
7. Discussion
As Artificial Intelligence continues to evolve, its presence in CMFT will only deepen. This article has emphasized that couple and family therapists are uniquely positioned to lead in the ethical deployment of AI, not just as users but as co‐designers of therapeutic technologies. The core values of systemic therapy (empathy, relational context, and ethical responsibility) must remain central even as AI tools offer efficiency, accessibility, and predictive power.
As presented in this article, AI should not be seen as applied in isolation to a CMFT's practice but as part of a larger ecosystem of systems and processes that contribute to the profession and shape its dynamics. Bronfenbrenner and Ceci (1994) noted the existence of such systems and their influence on our functioning through the inclusion of the chronosystem in the ecological model (Hertlein and Earl 2019). The current context in which we live is amid a highly significant cultural revolution, with new ways of relating, new terminology, and impacts on our health and well‐being that cannot yet be measured.
7.1. Future Directions
The field stands at a critical crossroads: either passively (and haphazardly) adopt AI tools as they emerge thereby allowing other mental health disciplines to dictate standards of care and common use and abdicating the power we have to put the relationship piece front and center, or be an active and leading voice in shaping AI's ethical, relational, and clinical integration into practice and training. Therapists, scholars, researchers, and trainers must critically assess AI's role, determining when it functions as a supportive tool and when it risks undermining the human connection essential to change. Future directions for the field mandate that we work toward:
Developing superior, formalized training curricula that integrate fundamental AI literacy with systemic clinical skills. The training curricula should reflect current COAMFTE criteria and include foundational knowledge of AI specific to mental health, communication, collaboration, and strategic implementation.
Developing and applying specific strategies for how to quickly and effectively integrate AI into a CMFT's practice. This includes: (1) being able to ethically provide treatment using this new technology, (2) ensuring that it can increase access to mental health services, (3) ensuring that the technology can be provided in acceptable ways for the desired population, (4) assessing whether it actually improves mental health outcomes, and (5) assessing its ability to increase the efficiency of providing treatment for both the client and therapist (Taylor et al. 2025).
Forging cross‐disciplinary and interprofessional collaborations between mental health professionals, AI developers, ethicists, and clients to co‐create technology that is inclusive, safe, and effective. When it comes to developing systems that will work for CMFTs, the premise holds: garbage in, garbage out. CMFTs must dig into their collaborative and networking spirit to best advance the profession and clinical care. The development of electronic medical records, for example, was intended to ease administrative burden; but because therapists were not working alongside developers, the interfacing can actually increase burden. When we discuss integration and collaboration, it will need to occur outside of the therapy room and even the profession. It will require attendance at transprofessional conferences and workgroups to ensure that the products being created meet the needs of CMFTs. It will also require discussion of large language models (like ChatGPT™, Gemini™, Claude™, etc.) to address the diversity of models and uses, offering both positive and problematic examples.
Producing high‐quality research on AI efficacy and impact in relational therapy. Large language models are still early in development for psychotherapy and require more testing as adjuncts to structured treatments. This might include research on treatment outcomes, clinical efficacy, client perceptions of the process, therapist experiences, program and training outcomes, therapeutic alliance, specific protocols, and professional perspectives. We also need to produce empirically supported, cutting‐edge applications of AI in systemic therapy and write scholarship that explores the issues inherent in AI use to effectively guide CMFTs in their work (e.g., legal ramifications of recording, confirmation bias, auto‐population of notes).
Advocating for sound regulatory standards that ensure transparency, privacy, boundaries, and accountability in AI applications within therapy. This will include contemporaneous engagement with other professionals as these standards are being developed, to ensure that design will be aligned with clinical need. Engagement should also address ways that therapists and clients can develop healthy boundaries in AI use. If CMFTs do not understand the competencies related to good AI use, setting boundaries will only lead to first‐order change, not second‐order change, with AI use. Future scholarship and training on ethics and AI would do well to highlight these areas.
Detailing competencies specific to trainee developmental levels. Couple and family therapists honor both legacy and change in their clients. We are advocating for the same: we wish to lean into the legacy of understanding the impact of external systems on our clients while encouraging them to change in the face of the unknown. We fully expect, and indeed invite, that as AI changes, so too will these competencies. The list of domains and competencies published here is hopefully the first of many versions. We invite current and future scholars, practitioners, and trainees to revise, refine, and adapt it as the technologies evolve. This is not the final word on the matter but rather the beginning of a complicated and long‐term conversation for the field as to how we incorporate AI into CMFT practice and the profession in productive ways.
8. Conclusion
The emergence of AI in mental health care is not a passing trend. It is a fundamental shift in how care is conceptualized, delivered, regulated, and evaluated. Couple and family therapists—before AI itself eclipses the care we offer—must rise to the challenge and preserve the integrity of relational care while embracing and advancing the tools that can enhance it. The future of CMFT as a profession and discipline must involve immediate proactive engagement with AI including shaping new competencies for practice that ensure support of therapeutic relationships, preserve client autonomy, and advance equitable care across diverse populations.
References
- Aboraya, A. S. 2022. Manual for the Standard for Clinicians' Interview in Psychiatry (SCIP): A New Assessment Tool for Measurement‐Based Care (MBC) and Personalized Medicine in Psychiatry (PMP). Springer International Publishing. 10.1007/978-3-030-94930-3. [DOI] [Google Scholar]
- American Association for Marriage and Family Therapy . 2004. “Marriage and Family Therapy Core Competencies. Alexandria, VA. https://www.aamft.org/common/Uploaded%20files/COAMFTE/Accreditation%20Resources/MFT%20Core%20Competencies%20(December%202004).pdf.
- American Counseling Association . 2025. “Recommendations for Practicing Counselors and Their Use of AI.” https://www.counseling.org/resources/research-reports/artificial-intelligence-counseling/recommendations-for-practicing-counselors.
- American Psychological Association . 2025. “Ethical Guidance for AI in the Professional Practice of Health Service Psychology. https://www.apa.org/topics/artificial-intelligence-machine-learning/ethical-guidance-professional-practice.pdf.
- Arbind Kumar, C. , Anamitra H., Venkata Naveen Kumar P., and Abirami R.. 2025. “Evaluating the Effectiveness of Microbiota‐Targeted Therapies and AI‐Driven Tools in Personalized Medicine: A Systematic Review and Meta‐Analysis.” Batna Journal of Medical Sciences (BJMS) 12, no. 2: 167–174. 10.48087/BJMSoa.2025.12204. [DOI] [Google Scholar]
- Azamfirei, R. , Kudchadkar S. R., and Fackler J.. 2023. “Large Language Models and the Perils of Their Hallucinations.” Critical Care 27, no. 1: 120. 10.1186/s13054-023-04393-x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Babu, A. , and Joseph A. P.. 2024. “Artificial Intelligence in Mental Healthcare: Transformative Potential Vs. the Necessity of Human Interaction.” Frontiers in Psychology 15: 378904. 10.3389/fpsyg.2024.1378904. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bajwa, M. 2024. “Case‐Based Content for A.I. and Machine Learning in Medical Education: Computer Vision.” Presented October 28, 2024, SUNY.
- Bartle‐Haring, S. , Silverthorn B. C., Meyer K., and Toviessi P.. 2009. “Does Live Supervision Make a Difference? A Multilevel Analysis.” Journal of Marital and Family Therapy 35, no. 4: 406–414. 10.1111/j.1752-0606.2009.00124.x. [DOI] [PubMed] [Google Scholar]
- Bateson, G. 1972. Steps to an Ecology Of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology. The University of Chicago Press. [Google Scholar]
- Békés, V. , Bőthe B., and Aafjes‐van Doorn K.. 2025. “Acceptance of Using Artificial Intelligence and Digital Technology for Mental Health Interventions: The Development and Initial Validation of the UTAUT‐AI‐DMHI.” Clinical Psychology & Psychotherapy 32, no. 3: e70085. 10.1002/cpp.70085. [DOI] [PubMed] [Google Scholar]
- Bowen, M. 1978. Family Therapy in Clinical Practice. Jason Aronson. [Google Scholar]
- Bronfenbrenner, U. , and Ceci S. J.. 1994. “Nature‐Nurture Reconceptualized in Developmental Perspective: A Bioecological Model.” Psychological Review 101, no. 4: 568–586. 10.1037//0033-295x.101.4.568. [DOI] [PubMed] [Google Scholar]
- Cambria, E. , and White B.. 2014. “Jumping NLP Curves: A Review of Natural Language Processing Research.” IEEE Computational Intelligence Magazine 9: 48–57. 10.1109/MCI.2014.2307227. [DOI] [Google Scholar]
- Carr, N. 2008. “Is Google Making Us Stupid?” Teachers College Record: The Voice of Scholarship in Education 110, no. 14: 89–94. 10.1177/016146810811001427. [DOI] [Google Scholar]
- Commission on Accreditation for Marriage and Family Therapy Education . 2021. “COAMFTE Standards (Version 12.5).” https://www.coamfte.org/Common/Uploaded%20files/COAMFTE/Accreditation%20Resources/COAMFTE%20Standards%20Version%2012.5%20-%20Published%20August%202021%20-%208.26.21%20%28with%20links%29.pdf.
- Diano, F. , Sica L. S., and Ponticorvo M.. 2023. “A Systematic Review of Mobile Apps as An Adjunct to Psychological Interventions for Emotion Dysregulation.” International Journal of Environmental Research and Public Health 20, no. 2: 1431. 10.3390/ijerph20021431. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Drude, K. P. , Hertlien K. M., Maheu M. M., Hilty D. M., and Wall K.. 2020. “Telebehavioral Health Competencies in Interprofessional Education and Training: A Pathway to Interprofessional Practice.” Journal of Technology in Behavioral Science 5, no. 1: 30–39. 10.1007/s41347-019-00112-y. [DOI] [Google Scholar]
- Elyoseph, Z. , Refoua E., Asraf K., Lvovsky M., Shimoni Y., and Hadar‐Shoval D.. 2024. “Capacity of Generative AI to Interpret Human Emotions From Visual and Textual Data: Pilot Evaluation Study.” JMIR Mental Health 11: e54369. 10.2196/54369. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Espejo, G. , Reiner W., and Wenzinger M.. 2023. “Exploring the Role of Artificial Intelligence in Mental Healthcare: Progress, Pitfalls, and Promises.” Cureus 15, no. 9: 44748. 10.7759/cureus.44748. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Fabuyi, J. A. 2024. “Leveraging Synthetic Data as a Tool to Combat Bias in Artificial Intelligence (AI) Model Training.” Journal of Engineering Research and Reports 26, no. 12: 24–46. 10.9734/jerr/2024/v26i121337. [DOI] [Google Scholar]
- Fitzpatrick, K. K. , Darcy A., and Vierhile M.. 2017. “Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial.” JMIR Mental Health 4, no. 2: e19. 10.2196/mental.7785. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Garvey, K. V. , Craig K. J. T., Russell R. G., et al. 2021. “The Potential and the Imperative: The Gap in AI‐Related Clinical Competencies and the Need to Close It.” Medical Science Educator 31, no. 6: 2055–2060. 10.1007/s40670-021-01377-w. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Le Glaz, A. , Haralambous Y., Kim‐Dufor D.‐H., et al. 2021. “Machine Learning and Natural Language Processing in Mental Health: Systematic Review.” Journal of Medical Internet Research 23, no. 5: e15708. 10.2196/15708. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Graham, S. , Depp C., Lee E. E., et al. 2019. “Artificial Intelligence for Mental Health and Mental Illnesses: An Overview.” Current Psychiatry Reports 21, no. 11: 116. 10.1007/s11920-019-1094-0. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Grondin, F. , Lomanowska A. M., Békés V., and Jackson P. L.. 2021. “A Methodology to Improve Eye Contact in Telepsychotherapy via Videoconferencing With Considerations for Psychological Distance.” Counselling Psychology Quarterly 34, no. 3–4: 586–599. 10.1080/09515070.2020.1781596. [DOI] [Google Scholar]
- Haber, Y. , Levkovich I., Hadar‐Shoval D., and Elyoseph Z.. 2024. “The Artificial Third: A Broad View of the Effects of Introducing Generative Artificial Intelligence on Psychotherapy.” JMIR Mental Health 11: e54781. 10.2196/54781. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Hays, P. 2021. Advancing Healthcare Through Personalized Medicine. Second edition. Springer International Publishing. 10.1007/978-3-030-80100-7. [DOI] [Google Scholar]
- Hertlein, K. M. 2012. “Digital Dwelling: Technology in Couple and Family Relationships.” Family Relations 61, no. 3: 374–387. 10.1111/j.1741-3729.2012.00702.x. [DOI] [Google Scholar]
- Hertlein, K. M. , Drude K., and Jordan S. S.. 2021. “What Next?: Toward Telebehavioral Health Sustainability in Couple and Family Therapy.” Journal of Marital and Family Therapy 47, no. 3: 551–565. 10.1111/jmft.12475. [DOI] [PubMed] [Google Scholar]
- Hertlein, K. M. , Drude K. P., Hilty D. M., and Maheu M. M.. 2021. “Toward Proficiency in Telebehavioral Health: Applying Interprofessional Competencies in Couple and Family Therapy.” Journal of Marital and Family Therapy 47, no. 2: 359–374. 10.1111/jmft.12496. [DOI] [PubMed] [Google Scholar]
- Hertlein, K. M. , and Earl R.. 2019. “Internet‐Delivered Therapy in Couple and Family Work.” In Internet‐Delivered Therapy and Consultancy for Groups, Families, and Organizations, edited by Rolnick A.. Routledge. [Google Scholar]
- Howley, L. D. , and Whelan A. J.. 2025. “From the World Wide Web to AI: Why We Must Learn From Our Past to Transform the Future of Medical Education.” Academic Medicine 100: S30–S33. 10.1097/ACM.0000000000006103. [DOI] [PubMed] [Google Scholar]
- Hu, M. , Ma C., and Li W., et al. 2025. "A Survey of Scientific Large Language Models: From Data Foundations to Agent Frontiers." (arXiv:2508.21148).” arXiv. 10.48550/arXiv.2508.21148. [DOI]
- Hubble, M. A. , Duncan, B. L. , and Miller, S. D. , ed. 1999. The Heart And Soul of Change: What Works in Therapy. American Psychological Association. 10.1037/11132-000. [DOI] [Google Scholar]
- Johnson, L. N. , Miller R. B., Bradford A. B., and Anderson S. R.. 2017. “The Marriage and Family Therapy Practice Research Network (MFT‐PRN): Creating a More Perfect Union Between Practice and Research.” Journal of Marital and Family Therapy 43, no. 4: 561–572. 10.1111/jmft.12238. [DOI] [PubMed] [Google Scholar]
- Khawaja, Z. , and Bélisle‐Pipon J. C.. 2023. “Your Robot Therapist Is Not Your Therapist: Understanding the Role of AI‐Powered Mental Health Chatbots.” Frontiers in Digital Health 5: 1278186. 10.3389/fdgth.2023.1278186. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Kolding, S. , Lundin R. M., Hansen L., and Østergaard S. D.. 2024. “Use of Generative Artificial Intelligence (AI) in Psychiatry and Mental Health Care: A Systematic Review.” Acta Neuropsychiatrica 37: Article e37. 10.1017/neu.2024.50. [DOI] [PubMed] [Google Scholar]
- Kopelovich, S. L. , Slevin R., Brian R. M., et al. 2025. “Preliminary Investigation of An Artificial Intelligence‐Based Cognitive Behavioral Therapy Training Tool.” Psychotherapy 62, no. 1: 12–21. 10.1037/pst0000550. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Lawson, L. , and Falke S.. 2017. “Live supervision.” In The Sage Encyclopedia of Marriage, Family, and Couples Counseling, 4, 970–972. SAGE Publications, Inc. 10.4135/9781483369532.n292. [DOI] [Google Scholar]
- Lee, K. S. , Yeung J., Kurniawati A., and Chou D. T.. 2025. “Designing Human‐Centric AI Mental Health Chatbots: A Case Study of Two Apps.” In Information Systems for Intelligent Systems, edited by Joshi A., Patel B., Iglesias A., and Shin J., 1255, 435–452. Springer. 10.1007/978-981-96-1747-0_36. [DOI] [Google Scholar]
- IEEE Transmitter . 2025. “Machine Learning Is Bridging the Gap in Mental Health Services.” https://transmitter.ieee.org/machine-learning-is-bridging-the-gap-in-mental-health-services.
- Maheu, M. M. , Drude K. P., Hertlein K. M., Lipschutz R., Wall K., and Hilty D. M.. 2018. “Correction To: An Interprofessional Framework for Telebehavioral Health Competencies.” Journal of Technology in Behavioral Science 3, no. 2: 108–140. 10.1007/s41347-018-0044-1. [DOI] [Google Scholar]
- Manole, A. , Cârciumaru R., Brînzaș R., and Manole F.. 2024. “Harnessing AI in Anxiety Management: A Chatbot‐Based Intervention for Personalized Mental Health Support.” Information 15, no. 12: 768. 10.3390/info15120768. [DOI] [Google Scholar]
- Maurya, R. K. , and DeDiego A. C.. 2025. “Artificial Intelligence Integration in Counsellor Education and Supervision: A Roadmap for Future Directions and Research Inquiries.” Counselling and Psychotherapy Research 25, no. 1: 12727. 10.1002/capr. [DOI] [Google Scholar]
- McKenzie, P. N. , Atkinson B. J., Quinn W. H., and Heath A. W.. 1986. “Training and Supervision in Marriage and Family Therapy: A National Survey.” The American Journal of Family Therapy 14, no. 4: 293–303. 10.1080/01926188608250652. [DOI] [Google Scholar]
- Miner, A. S. , Shah N., Bullock K. D., Arnow B. A., Bailenson J., and Hancock J.. 2019. “Key Considerations for Incorporating Conversational AI in Psychotherapy.” Frontiers in Psychiatry 10: 746–756. 10.3389/fpsyt.2019.00746. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Mirzaei, T. , Amini L., and Esmaeilzadeh P.. 2024. “Clinician Voices on Ethics of LLM Integration in Healthcare: A Thematic Analysis of Ethical Concerns and Implications.” BMC Medical Informatics and Decision Making 24: 250. 10.1186/s12911-024-02656-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Moon, W.‐K. , Jeong J.‐Y., Park S.‐W., Yun S.‐Y., Lee E., and Shin S.. 2024. “Integrative Personalized Medicine Care for Adjustment Disorder of a post‐COVID‐19 Patient: A CARE‐Compliant Case Report.” Medicine 103, no. 31: e39121. 10.1097/MD.0000000000039121. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Muetunda, F. , Sabry S., Jamil M. L., Pais S., Dias G., and Cordeiro J.. 2024. “AI‐Assisted Diagnosing, Monitoring and Treatment of Mental Disorders: A Survey.” ACM Transactions on Computing for Healthcare 5, no. 4: Article 23. 10.1145/3681794. [DOI] [Google Scholar]
- National Board for Certified Counselors . 2024. Ethical Principles for Artificial Intelligence in Counseling. https://www.nbcc.org/assets/ethics/EthicalPrinciples_for_AI.pdf.
- Nelson, T. S. , Chenail R. J., Alexander J. F., Crane D. R., Johnson S. M., and Schwallie L.. 2007. “The Development of Core Competencies for the Practice of Marriage and Family Therapy.” Journal of Marital and Family Therapy 33, no. 4: 417–438. 10.1111/j.1752-0606.2007.00042.x. [DOI] [PubMed] [Google Scholar]
- Noll, R. , Schaaf J., and Storf H.. 2022. “The Use of Computer‐Assisted Case‐Based Reasoning to Support Clinical Decision‐Making – A Scoping Review.” In Case‐Based Reasoning Research and Development, edited by Wiratunga N. and Keane M. T., 13405, 395–409. Springer International Publishing AG. 10.1007/978-3-031-14923-8_26. [DOI] [Google Scholar]
- Northey, W. F., and Gehart D. R.. 2020. “The Condensed MFT Core Competencies: A Streamlined Approach for Measuring Student and Supervisee Learning Using the MFT Core Competencies.” Journal of Marital and Family Therapy 46, no. 1: 42–61. 10.1111/jmft.12386. [DOI] [PubMed] [Google Scholar]
- Novak, L. L. , Russell R. G., Garvey K., et al. 2023. “Clinical Use of Artificial Intelligence Requires AI‐Capable Organizations.” JAMIA Open 6, no. 2: ooad028. 10.1093/jamiaopen/ooad028. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Novotney, A. 2011. “A New Emphasis on Telehealth: How Can Psychologists Stay Ahead of the Curve — and Keep Patients Safe?” Monitor on Psychology 42, no. 6: 40. https://www.apa.org/monitor/2011/06/telehealth. [Google Scholar]
- Office of Artificial Intelligence Policy: Utah Department of Commerce . 2025. “Guidance Letter: Best Practices for the Use of Artificial Intelligence by Mental Health Therapists.” https://ai.utah.gov/wp-content/uploads/Best-Practices-Mental-Health-Therapists.pdf.
- Pandya, A. , Lodha P., and Ganatra A.. 2024. “Is ChatGPT Ready to Change Mental Healthcare? Challenges and Considerations: A Reality‐Check.” Frontiers in Human Dynamics 5: 1289255. 10.3389/fhumd.2023.1289255. [DOI] [Google Scholar]
- Ray, A. , Bhardwaj A., Malik Y. K., Singh S., and Gupta R.. 2022. “Artificial Intelligence and Psychiatry: An Overview.” Asian Journal of Psychiatry 70: 103021. 10.1016/j.ajp.2022.103021. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Reddy, S. 2023. “Navigating the AI Revolution: The Case for Precise Regulation in Health Care.” JMIR mHealth and uHealth 25, no. 2: e49989. https://www.jmir.org/2023/1/e49989. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Roustan, D. , and Bastardot F.. 2025. “The Clinicians' Guide to Large Language Models: A General Perspective With a Focus on Hallucinations.” Interactive Journal of Medical Research 14: e59823. 10.2196/59823. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Ruder, E. 2025. “3 States Regulating AI and Mental Health.” Becker's Behavioral Health. https://www.beckersbehavioralhealth.com/ai-2/3-states-regulating-ai-and-mental-health/.
- Russell, R. G. , Lovett Novak L., Patel M., et al. 2023. “Competencies for the Use of Artificial Intelligence‐Based Tools by Health Care Professionals.” Academic Medicine 98, no. 3: 348–356. 10.1097/ACM.0000000000004963. [DOI] [PubMed] [Google Scholar]
- Sachan, D. 2018. “Self‐Help Robots Drive Blues Away.” The Lancet Psychiatry 5, no. 7: 547. 10.1016/S2215-0366(18)30230-X. [DOI] [PubMed] [Google Scholar]
- Saeidnia, H. R. , Hashemi Fotami S. G., Lund B., and Ghiasi N.. 2024. “Ethical Considerations in Artificial Intelligence Interventions for Mental Health and Well‐Being: Ensuring Responsible Implementation and Impact.” Social Sciences 13, no. 7: 381. 10.3390/socsci13070381. [DOI] [Google Scholar]
- Shaik, T. , Tao X., Higgins N., et al. 2023. “Remote Patient Monitoring Using Artificial Intelligence: Current State, Applications, and Challenges.” WIREs Data Mining & Knowledge Discovery 13, no. 2: e1485. 10.1002/widm.1485. [DOI] [Google Scholar]
- Sixto‐García, J. , Quian, A. , Rodríguez‐Vázquez, A.‐I. , Silva‐Rodríguez, A. , and Soengas‐Pérez, X. , ed. 2024. Journalism, Digital Media and the Fourth Industrial Revolution, 1st ed. Springer Nature Switzerland. 10.1007/978-3-031-63153-5. [DOI] [Google Scholar]
- Springer, P. , Bischoff R. J., Kohel K., Taylor N. C., and Farero A.. 2020. “Collaborative Care at a Distance: Student Therapists' Experiences of Learning and Delivering Relationally Focused Telemental Health.” Journal of Marital and Family Therapy 46, no. 2: 201–217. 10.1111/jmft.12431. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Stade, E. C. , Stirman S. W., Ungar L. H., et al. 2024. “Large Language Models Could Change the Future of Behavioral Healthcare: A Proposal for Responsible Development and Evaluation.” NPJ Mental Health Research 3, no. 1: Article 12. 10.1038/s44184-024-00056-z. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Su, C. , Xu Z., Pathak J., and Wang F.. 2020. “Deep Learning in Mental Health Outcome Research: A Scoping Review.” Translational Psychiatry 10, no. 1: 116. 10.1038/s41398-020-0780-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- Taylor, N. C. , Springer P. R., and Bischoff R. J.. 2025. “Guidelines for Integrating Technology in Clinical Practice.” International Journal of Systemic Therapy 37, no. 1: 72–94. 10.1080/2692398X.2025.2496839. [DOI] [Google Scholar]
- Tekin, Ş. 2023. “Ethical Issues Surrounding Artificial Intelligence Technologies in Mental Health: Psychotherapy Chatbots.” In Technology Ethics: A Philosophical Introduction and Readings, edited by Robson and Tsou, 92–104. Routledge. [Google Scholar]
- Therapy Talk Team . 2025. “10 AI in Mental Health Breakthroughs to Transform Care.” Weling Talk Therapy. https://therapytalk.io/blogs/10-ai-in-mental-health-breakthroughs-to-transform-care.
- Volkmer, S. , Meyer‐Lindenberg A., and Schwarz E.. 2024. “Large Language Models in Psychiatry: Opportunities and Challenges.” Psychiatry Research 339: 116026. 10.1016/j.psychres.2024.116026. [DOI] [PubMed] [Google Scholar]
- Wampler, K. S. , Blow A. J., McWey L. M., Miller R. B., and Wampler R. S.. 2019. “The Profession of Couple, Marital, and Family Therapy (CMFT): Defining Ourselves and Moving Forward.” Journal of Marital and Family Therapy 45, no. 1: 5–18. 10.1111/jmft.12294. [DOI] [PubMed] [Google Scholar]
- Zeff, M. 2025. “OpenAI Says Over a Million People Talk to ChatGPT About Suicide Weekly.” TechCrunch. [Google Scholar]
