BMC Psychiatry. 2025 Jul 7;25:685. doi: 10.1186/s12888-025-07135-1

Psychiatrists’ and trainees’ knowledge, perception, and readiness for integration of artificial intelligence in mental health care in Nigeria

Olatunji Alao Abiodun 1, Peter Omoniyi Ajiboye 1, Mumeen Olaitan Salihu 2, Dauda Sulyman 3, Adesanmi Akinsulore 4, Okwudili Obayi 5, Hassan Bala Salihu 6
PMCID: PMC12235781  PMID: 40624506

Abstract

Background

Artificial intelligence (AI) is revolutionising healthcare globally, including in Nigeria. AI shows promise in psychiatry, particularly in addressing the shortage of psychiatrists and rural healthcare gaps. However, research on AI adoption among Nigerian psychiatrists is lacking.

Aims

This study assesses Nigerian psychiatrists’ and trainees’ knowledge, perception, and readiness toward AI adoption in psychiatric practice.

Methods

An online cross-sectional survey was conducted using a convenience sample of 200 psychiatrists and trainees. Participants completed a structured online questionnaire assessing demographics, knowledge, perception, and readiness for AI adoption in psychiatry.

Results

The mean age of the participants was 39 years (range: 26–68). Most (86.5%) were aware of AI’s usefulness in psychiatric practice, particularly in diagnostic assistance (54%), patient monitoring (60%) and predicting outcomes (59%). However, only 38.5% were familiar with its use. About 73.5% had a positive perception of AI integration in psychiatry; most agreed with AI’s potential benefits in standardising and personalising care plans (63%), addressing the shortage of psychiatrists (61%), minimising bias (73.5%), and prompting help-seeking behaviour among patients (68%). Respondents were sceptical about AI surpassing average psychiatrists in tasks requiring empathy (91.0% unlikely) and mental status examinations (68% unlikely). Data security, potential loss of human interaction, and diminished empathy were significant concerns. Only 29.5% had used AI-based tools, while 79.5% expressed readiness for future adoption.

Conclusion

Nigerian psychiatrists view AI as valuable in addressing psychiatric service gaps but emphasise the need for ethical regulations and targeted training to ensure safe, empathetic, and culturally appropriate AI applications in psychiatry.

Ethics registration

The study was approved by the Institutional Review Committee of the Kwara State University Teaching Hospital with approval protocol number KWASUTH/IRC/246/VOL.II/46.

Clinical trial number

Not applicable.

Keywords: Artificial intelligence (AI) in psychiatry, Psychiatrists’ shortage, Data privacy concerns, Cultural Bias and adaptation, Human empathy and therapeutic alliance

Introduction

Artificial intelligence (AI) combines computer science with large datasets to mimic human-like intelligence and is revolutionising healthcare globally, particularly in high-income countries [1, 2]. While developing countries in Africa face challenges in fully harnessing AI’s potential, they have made notable progress in healthcare applications, including AI-driven analysis of medical images (chest X-rays and blood films) for tuberculosis and malaria diagnosis, which provides accurate and rapid results [2–5]. In Nigeria, AI in healthcare is evolving, with applications in diagnostic radiology, telemedicine, laboratory medicine, and ophthalmology [6–8].

However, Nigeria faces a significant mental health burden, with approximately 20% of its over 200 million population affected by mental illness and a severe shortage of psychiatrists (fewer than 300), leading to a vast treatment gap [9–11]. This gap is further exacerbated by the uneven distribution of psychiatrists and brain drain, resulting in nearly 90% of individuals with psychiatric illnesses being either undiagnosed or untreated [11–13]. In this context, AI in psychiatry holds promise for improving treatment access, bridging the human resource gap, ensuring precise psychiatric diagnoses, facilitating easy monitoring and follow-up regardless of distance, and developing comprehensive treatment plans [14, 15]. AI can be used to deliver mental health services in remote and hard-to-reach areas, thereby improving access to specialist care, which could help address the skewed distribution of specialists and the severe shortage of mental health professionals in Nigeria, estimated at 0.1 psychiatrists per 100,000 population [13, 15].

The use of AI in mental health care is still limited compared to other specialities of medicine [16]. This could be due to the nature of psychiatric practice, which relies on soft skills such as building rapport and a therapeutic relationship with patients while also observing their emotions and behaviour [17]. Notwithstanding, AI has been found very useful in preventive mental health practice through machine learning-assisted pre-diagnostic screening tools and risk factor models that determine an individual’s predisposition to mental illness [18]. Similarly, AI has demonstrated great potential in redefining mental health diagnosis, helping to better understand mental illness [14]. For instance, AI has been used to diagnose and differentiate depression and attention deficit hyperactivity disorder (ADHD) from healthy controls, with reported accuracy greater than 90% using electroencephalogram (EEG)-based machine learning [19, 20]. AI has also demonstrated accuracy of more than 70% in diagnosing psychotic disorders and has been deployed to analyse speech to predict the level of incoherence in patients with schizophrenia [21].

In the area of treatment, machine learning was reported to be effective in improving refractory symptoms and medication adherence in patients with treatment-resistant schizophrenia, with a potential for improved quality of life, using virtual reality therapy [22, 23]. AI-enabled robots have been used as at-home care assistants to provide companionship and interaction for elderly patients with dementia, helping them reduce stress, agitation, and loneliness and improve mood and social interaction [24]. AI-assisted technology can also support a stepped care approach, in which the least resource-intensive care is offered to most people first, using an AI-assisted triage system, while intensive care is reserved for the patients most in need; this reduces overall healthcare costs and minimises waste [25].

Furthermore, AI has been found helpful in mental health research, as it provides a clearer understanding of disease progression and supports treatment discovery, among other uses [26]. AI has shown promising results in psychiatric medical education through self-directed learning, case-based learning, simulation, content synthesis and examination assessment tools for medical students and residents [27]. AI applications can also be helpful for administrative purposes through electronic health records (EHRs), assisting in real-time population health risk assessment for planning and resource management [28].

Despite these potential benefits, AI use in psychiatry is not without risks and concerns. These include the data privacy and security of patient health information (including hacking and unauthorised monitoring), misinformation, empathy issues, AI algorithmic bias, gaps in the regulation of AI use, and over-dependence on AI tools for specific psychiatrists’ tasks, such as diagnosis and treatment planning, with the fear of psychiatrists losing their clinical skills [29, 30]. Other areas of concern are the costs of providing the services, digital divides, underdeveloped infrastructure, a lack of trained professionals, and the cultural peculiarities of psychiatric care in Africa [31].

Evidence has shown that a deep understanding and knowledge of AI tools influence the attitude and intention to use AI tools in psychiatric practice [32]. Cecil et al. [32] found a very low understanding of AI-enabled tools among mental health practitioners: only 0.6% of respondents demonstrated excellent knowledge of their application by mentioning all four key areas of use (i.e. diagnostic, treatment, feedback and management), 53.7% mentioned only one application area, 37.6% stated two application areas, while 10.5% failed to describe any use. In the same study, 45.4% of the participants reported that they had never heard of AI-enabled technologies in psychiatry, and the vast majority (93.37%) had not used AI tools in their clinical practice [32]. Ahmed et al. [33] found that a significant number of doctors (74%) had a basic knowledge of AI, but less than a third (27.3%) were aware of its applications in healthcare. In the same study, many respondents demonstrated a positive attitude toward the usefulness of AI in healthcare, but only 14.8% of the doctors had ever practically used AI in their practice [33]. In another study conducted among psychiatrists in Bahrain, 41.9% and 20.9% demonstrated average and above-average knowledge of AI in psychiatric care, respectively, with only 4.7% showing poor knowledge; at the same time, a good number displayed a positive attitude towards AI use in specific psychiatric tasks, such as formulating personalised treatment and predicting outcomes, and its benefits in improving psychiatric practice [34]. Doraiswamy et al. [35], in a global survey involving 791 psychiatrists from 22 countries, found that one in two respondents expressed a positive attitude that AI would substantially improve their job, particularly in information synthesis, documentation and updating medical records.

Studies on the knowledge, perception and adoption of AI in psychiatric practice among psychiatrists are unavailable in Nigeria. However, some positive progress in AI use has been reported in other specialities, such as radiology, in Nigeria [36]. A recent local study on AI conducted among medical professionals across different specialities, including medical interns and medical students from the six geopolitical zones, found that nearly all (98.6%) respondents were aware of AI, 77.1% had no theoretical training on AI, and three-quarters (75%) had a positive attitude towards the use of AI in healthcare practice [37]. To bridge the knowledge gap and promote the adoption of AI in the West African sub-region, a capacity-building programme was conducted by the National Universities Commission in collaboration with the National Open University of Nigeria and the West Africa Office of the Association of African Universities [38]. This 13-week intensive training on AI was provided to 931 staff members from Nigerian universities, polytechnics, and colleges of education; it covered key aspects of AI and included practical sessions, and some beneficiaries were academic psychiatrists who also worked as clinicians in teaching hospitals. Integrating AI into mental health care practice in Nigeria depends on mental health practitioners’ readiness for adoption and knowledge of its application. Hence, this study aims to assess the knowledge, perception, and adoption readiness for AI in psychiatric practice among psychiatrists and trainees practising in Nigeria.

Methods

Study design and participants

This study adopted a cross-sectional design, utilising an online survey conducted between July and September 2024. At the time of the study, about 280 psychiatrists and 181 trainees were registered on the website of the Association of Psychiatrists in Nigeria (APN) [10]. The inclusion criterion was psychiatrists and trainees still practising and residing in Nigeria.

The Google survey link was distributed online through the social networks of the Association of Psychiatrists in Nigeria (APN) and the Early Career Psychiatrists section of the APN (APN-ECPs) to reach eligible participants. The APN WhatsApp page contained all registered consultant psychiatrists in Nigeria, while the ECPs page included all residents/trainees as well as consultant psychiatrists who had qualified within the previous 5 years. The respondents practise in different psychiatry training centres, including private institutions, spread across the six geopolitical zones of Nigeria (North-east, North-west, North-central, South-south, South-east and South-west). The survey link was also sent to the departmental forum pages of accredited psychiatric training centres via investigators practising in those institutions, to serve as a reminder for those who may have missed the questionnaire link on the two main APN pages. Eligible respondents were encouraged to complete the survey only once to prevent multiple entries from an individual. Reminders to complete the questionnaire were sent to all these networks twice weekly (Mondays and Fridays) for 12 weeks. All participants were informed about the voluntary nature of the study at the beginning of the Google Form. Data confidentiality was maintained throughout the study, with survey responses anonymised and stored securely to protect participants’ privacy. The aim was to cover nearly 80% of the targeted population using convenience sampling; however, a total of 200 respondents completed the survey, yielding a response rate of 43.4% (200/461).

The questionnaire was developed by the researchers, with most items adapted and modified from similar previous surveys [34, 35, 37] in which the items had been validated and used. Nevertheless, the questionnaire was pretested on five consenting psychiatrists and trainees; the pretest did not indicate any need for modification. The questionnaire contained three sections.

Section I: Demographic and Professional Information. This section collected data on respondents’ socio-demographic and professional characteristics, including gender, age, religion, rank, years of consultant experience, residency training, institution of practice and subspecialty.

Section II: This section assessed:

i) Knowledge of AI in psychiatric practice, using seven key questions centred on AI application areas and familiarity with AI use in psychiatric practice: (1) Can AI be useful in the practice of psychiatry? (2) Can AI be useful in making psychiatric diagnoses? (3) Is AI of use in carrying out psychiatric investigations? (4) Is AI useful in the treatment of psychiatric patients? (5) Can AI be of use in monitoring progress during psychiatric treatment? (6) Can AI be useful in predicting psychiatric outcomes? (7) Are you familiar with AI use in psychiatry? The response options were Yes, No, or Not Sure.

ii) Perception/views of the respondents on:

a) The role of AI in psychiatry (e.g., Should AI use in psychiatry be encouraged? Do you agree that AI can improve the accuracy of psychiatric diagnoses? Do you believe AI can improve the treatment of psychiatric patients? Do you believe the use of AI will improve teaching in psychiatry? Do you believe AI can improve the quality of research in psychiatry? Do you believe using AI in psychiatry will reduce the need for psychiatrists?). The 5-point Likert response options were Strongly Agree, Agree, Do Not Know, Disagree, and Strongly Disagree.

b) The future competence of AI in 10 key tasks of psychiatrists: (1) provide documentation (e.g. update medical records) about patients, (2) provide empathetic care to patients, (3) formulate personalized medication and/or therapy treatment plans for patients, (4) evaluate when to refer patients to outpatient versus inpatient treatment, (5) analyze patient information to establish prognoses, (6) analyze patient information to detect homicidal thoughts, (7) analyze patient information to detect suicidal thoughts, (8) synthesize patient information to reach diagnoses, (9) perform a mental status examination, and (10) interview psychiatric patients in a range of settings to obtain medical history. These ten items opened with a general question: “Do you think AI will perform the listed tasks better than average psychiatrists in the future?” The 6-point Likert response options were Extremely Unlikely, Unlikely, Somewhat Unlikely, Somewhat Likely, Likely, and Extremely Likely.

c) The benefits of AI in psychiatric practice (e.g. scalability of treatment where there is a shortage of psychiatrists, eliminating human errors, ability to work in any environment, etc.). The response options were Yes, No, or Not Sure.

d) The risks and limitations of AI in psychiatric practice (e.g. data security and privacy concerns, lack of empathy, psychiatrists will forsake clinical thinking, etc.). The response options were Yes, No, or Not Sure.

Section III: This section assessed the use of AI in psychiatric practice and readiness to adopt AI within the next 5 years. The questions were: (1) Have you ever used AI in your practice? and (2) Would you be willing to adopt AI in your practice within 5 years? The response options were Yes, No, or Not Sure.

Ethical considerations

This study received ethical approval from the Kwara State University Teaching Hospital Institutional Review Committee (KWASUTH/IRC) with protocol number KWASUTH/IRC/246/VOL.II/46. Participants were informed about the study’s voluntary nature and their right to withdraw without penalty at the beginning of the Google Form. Only those who provided informed consent proceeded with the questionnaire. Survey responses were anonymized and stored securely to maintain confidentiality, protecting participants’ privacy throughout the study.

Data analysis

Data analysis was conducted using EPI-INFO version 7.2.6.0. Categorical variables were summarized using frequencies, while continuous variables were described using means and standard deviations. To facilitate interpretation, Likert-scale responses were recategorized: “Agree” (combining “strongly agree” and “agree”), “Disagree” (combining “strongly disagree” and “disagree”), and “Don’t Know” remained a separate category. Additionally, likelihood responses were collapsed into two categories: “Likely” (including “somewhat likely,” “likely,” and “extremely likely”) and “Unlikely” (including “extremely unlikely,” “unlikely,” and “somewhat unlikely”).
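The analysis itself was performed in EPI-INFO; purely as an illustration of the recoding rule described above, the collapsing of Likert categories can be sketched in Python (the names `AGREEMENT_MAP`, `LIKELIHOOD_MAP`, and `recode` are hypothetical, not part of the study's workflow):

```python
# Illustrative sketch (assumption: not the authors' EPI-INFO procedure).
# Collapses 5-point agreement and 6-point likelihood Likert responses
# into the broader categories described in the Data analysis section.
from collections import Counter

AGREEMENT_MAP = {
    "strongly agree": "Agree",
    "agree": "Agree",
    "do not know": "Don't Know",
    "disagree": "Disagree",
    "strongly disagree": "Disagree",
}

LIKELIHOOD_MAP = {
    "extremely unlikely": "Unlikely",
    "unlikely": "Unlikely",
    "somewhat unlikely": "Unlikely",
    "somewhat likely": "Likely",
    "likely": "Likely",
    "extremely likely": "Likely",
}

def recode(response: str, mapping: dict) -> str:
    """Collapse a raw Likert response into its summary category."""
    return mapping[response.strip().lower()]

# Example: tally recoded likelihood responses for one task.
raw = ["Extremely Likely", "Somewhat Unlikely", "Likely", "Unlikely"]
counts = Counter(recode(r, LIKELIHOOD_MAP) for r in raw)
# counts holds 2 "Likely" and 2 "Unlikely"
```

Frequencies of the collapsed categories can then be tabulated directly, matching the two-category "Likely"/"Unlikely" percentages reported in the Results.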

Results

Sociodemographic characteristics of the respondents

The results reflect data collected from 200 psychiatrists and trainees practising in Nigeria. The respondents’ ages ranged from 26 to 68 years, with a mean of 39 years. There were more male respondents (64.5%), and most (85.5%) were married. Consultant psychiatrists constituted 38.0% of the sample, while trainees constituted the majority (62.0%). Among the consultants, 63.2% had obtained their fellowship within the previous 10 years, only 6.5% had obtained theirs more than 20 years earlier, and the majority (76.5%) specialised in general psychiatry. Among the residents, a third (33.9%) had been in residency training for more than 4 years. Most respondents (90.0%) worked in federally owned institutions, comprising federal neuropsychiatric hospitals (49.0%), federal university teaching hospitals (37.5%) and federal medical centres (3.5%), while only 2.5% worked in private institutions (Table 1).

Table 1.

Demographic and professional characteristics of psychiatrists participating in a study on AI utilization in psychiatric practice

Variables Frequency (%)
Age Mean age: 39 years (Range: 26–68 years)
Gender
 Female 71 (35.5)
 Male 129 (64.5)
Marital Status
 Single 25 (12.5)
 Married 171 (85.5)
 Separated/Divorced 4 (2.0)
Religion
 Christian 112 (56.0)
 Islam 83 (41.5)
 Others 5 (2.5)
Rank
 Consultant 76 (38.0)
 Senior Registrar 65 (32.5)
 Registrar/Medical Officer 59 (29.5)
Years of Experience as a Consultant n = 76
 ≤ 10 years 48 (63.2)
 11–20 years 23 (30.3)
 > 20 years 5 (6.5)
Years of training as a resident in Psychiatry n = 121
 ≤ 4 years 80 (66.1)
 > 4 years 41 (33.9)
Institution of practice
 Federal 180 (90.0)
 State 15 (7.5)
 Private 5 (2.5)
Sub-specialization
 General Psychiatry 153 (76.5)
 Addiction 14 (7.0)
 Child and Adolescent 6 (3.0)
 Community 2 (1.0)
 Consultation-liaison 19 (9.5)
 Forensic 6 (3.0)

AI training needs and regulation

Figure 1 shows that only 11.5% of respondents had received formal AI training, with the majority (88.5%) having no prior training. However, nearly all respondents (97.5%) expressed interest in AI training, with preferred formats being online courses (87.5%), workshops and seminars (67.5%), and in-service training (56.5%). A substantial proportion of respondents (95.5%) advocated regulating AI use in psychiatry, with 68.5% favouring strict regulation.

Fig. 1

Respondents’ perspective on AI training needs and regulation

Respondents’ knowledge of AI applications in psychiatry

Most respondents (86.5%) were aware of the usefulness of AI in the practice of psychiatry. However, only 38.5% were familiar with its use. Regarding specific clinical applications of AI in psychiatry, slightly above half of the respondents recognised the role of AI in making psychiatric diagnoses (54.0%) and treating psychiatric patients (54.5%), and about three-fifths knew its role in monitoring patients’ treatment progress (60.0%) and predicting outcomes (59.0%). However, slightly above half (51.0%) were unsure or unaware of AI use in psychiatric investigations (Table 2).

Table 2.

Respondents’ knowledge on artificial intelligence application in core clinical psychiatry practice

Can Artificial intelligence (AI) be useful in the practice of psychiatry? Frequency (%)
 Yes 173 (86.5)
 No 8 (4.0)
 Not Sure 19 (9.5)
Are you familiar with AI use in psychiatry?
 Yes 77 (38.5)
 No 94 (47.0)
 Not Sure 29 (14.5)
Can AI be of use in making psychiatric diagnosis?
 Yes 108 (54.0)
 No 19 (9.5)
 Not Sure 73 (36.5)
Is AI of use in carrying out psychiatric investigations?
 Yes 98 (49.0)
 No 22 (11.0)
 Not Sure 80 (40.0)
Is AI useful in the treatment of psychiatric patients?
 Yes 109 (54.5)
 No 22 (11.0)
 Not Sure 69 (34.5)
Can AI be of use in monitoring progress during psychiatric treatment?
 Yes 120 (60.0)
 No 16 (8.0)
 Not Sure 64 (32.0)
Can AI be of use in predicting psychiatric outcome?
 Yes 118 (59.0)
 No 17 (8.5)
 Not Sure 65 (32.5)

Respondents’ views on the role of AI in psychiatry

Respondents identified specific areas where they believed AI could benefit psychiatric practice. The majority (73.5%) of respondents positively perceived AI use in psychiatry and encouraged it. Most respondents agreed with its potential to improve research quality (90.5%) and teaching (84%) in psychiatry, and the majority believed that AI would enhance treatment (69%) and the accuracy of psychiatric diagnosis (61.5%). However, slightly above half (53.5%) disagreed that AI would replace or reduce the need for psychiatrists (Fig. 2).

Fig. 2

Respondents’ views on the role of AI in psychiatry

Respondents’ views on the future competence of AI on 10 key aspects of psychiatrists’ job

When respondents were asked to assess AI’s ability to perform various psychiatric tasks better than average psychiatrists, most believed it unlikely that future AI would provide empathetic care to patients (91.0%) or perform mental status examinations (68%) better than average psychiatrists. However, most expressed optimism that future AI would likely excel in documentation tasks such as updating patients’ medical records (73.0%), analysing patient information to establish prognoses (71.5%) and synthesising patient information to reach diagnoses (74.5%) (Fig. 3).

Fig. 3

Respondents’ views on the future competence of AI on 10 key aspects of psychiatrists’ job

Respondents’ views on the benefits of artificial intelligence in psychiatric practice

Respondents also outlined the potential benefits of AI in psychiatric practice. Nearly half (49%) agreed that AI could eliminate human errors, though 37.5% were unsure. A significant number of respondents felt that AI could improve the standardisation and personalisation of care plans (63%), address the shortage of psychiatrists by expediting treatment delivery (61%), streamline task flow in ways that could improve practitioners’ quality of life (59.5%), minimise gender or racial bias (73.5%), and promote early help-seeking behaviour among patients (68%). However, about half of respondents were unsure or sceptical about AI’s potential to eliminate human errors (51%) or provide a better understanding of the brain pathologies causing mental disorders (54.5%) (Table 3).

Table 3.

Respondents’ views on the benefits of artificial intelligence in psychiatric practice

Eliminate human errors? Frequency (%)
 Yes 98 (49.0)
 No 27 (13.5)
 Not sure 75 (37.5)
Standardization and personalization of care plans?
 Yes 126 (63.0)
 No 16 (8.0)
 Not sure 58 (29.0)
Ability to work under any environment?
 Yes 130 (65.0)
 No 18 (9.0)
 Not sure 52 (26.0)
Scalability of treatment where there is a shortage of psychiatrists?
 Yes 122 (61.0)
 No 19 (9.5)
 Not sure 59 (29.5)
More truthful responses from patients to an AI machine?
 Yes 55 (27.5)
 No 45 (22.5)
 Not sure 100 (50.0)
Improved quality of life for psychiatrists if workflow is streamlined through AI?
 Yes 119 (59.5)
 No 15 (7.5)
 Not sure 66 (33.0)
Better elucidation of brain pathologies contributing to aetiology of mental disorders?
 Yes 90 (45.0)
 No 29 (14.5)
 Not sure 81 (40.0)
Not associated with race or gender bias?
 Yes 146 (73.5)
 No 9 (4.5)
 Not sure 45 (25.5)
It may help patients at risk to seek help early?
 Yes 136 (68.0)
 No 5 (2.5)
 Not sure 59 (29.5)

Respondents’ views on the risks and limitations of artificial intelligence in psychiatric practice

The study also highlighted respondents’ concerns and reservations regarding AI. Privacy concerns were significant, with 77% of respondents identifying risks related to data security. Nearly 90% were concerned about reduced human interaction, and 82.5% worried about the potential lack of empathy in AI-driven care. Concerns regarding the accuracy and reliability of AI tools were raised by 56.5% of respondents, while 81.5% feared over-dependence on AI technologies, and 63% believed that AI could diminish psychiatrists’ creative clinical thinking. Concerns about potential job displacement were noted by 52% of respondents, while 73.5% expressed concern over the cost of implementing AI. However, slightly above half (54%) of the respondents expressed optimism that the potential benefits of AI would outweigh the risks or harms (Table 4; Fig. 4).

Table 4.

Respondents’ views on risk and limitations of artificial intelligence in psychiatric practice

Risks and limitations Frequency (%)
Data security and privacy concerns?
 Yes 154 (77.0)
 No 30 (15.0)
 Not sure 16 (8.0)
Lack of human interaction
 Yes 179 (89.5)
 No 12 (6.0)
 Not sure 9 (4.5)
Lack of empathy?
 Yes 165 (82.5)
 No 17 (8.5)
 Not sure 18 (9.0)
Dependence on technology?
 Yes 163 (81.5)
 No 19 (9.5)
 Not sure 18 (9.0)
Psychiatrists will forsake creative critical thinking?
 Yes 126 (63.0)
 No 42 (21.0)
 Not sure 32 (16.0)
Feelings of antipathy towards AI due to job loss?
 Yes 104 (52.0)
 No 46 (23.0)
 Not sure 50 (25.0)
Risk of greater burnout for psychiatrists if the time saved by the use of AI is used (by administrators) to increase psychiatrists’ patient loads?
 Yes 81 (40.5)
 No 46 (23.0)
 Not sure 73 (36.5)
Accuracy and Reliability of AI Tools?
 Yes 113 (56.5)
 No 28 (14.0)
 Not sure 59 (29.5)
Cost of Implementation?
 Yes 147 (73.5)
 No 24 (12.0)
 Not sure 29 (14.5)
Lack of Knowledge and Training?
 Yes 28 (14.0)
 No 147 (73.5)
 Not sure 25 (12.5)

Fig. 4

Respondents’ view on whether the potential benefits of AI outweigh possible risks/harms

AI use and future adoption readiness

Figure 5 shows that only 29.5% of respondents had used AI-based tools in their practice, with commonly cited tools including diagnostic support systems, patient monitoring apps, and predictive analytics platforms. However, most (79.5%) expressed willingness to adopt AI in their practice within 5 years.

Fig. 5

Ever used and readiness to adopt AI-based tools in psychiatric practice

Discussion

The present study investigated the knowledge, perception and adoption readiness for AI in mental health care among Nigerian psychiatrists and trainees.

Knowledge and familiarity with AI use in psychiatric practice

Our findings revealed that a significant number of respondents (86.5%) knew about AI use in psychiatric practice. However, only about one in two were knowledgeable about its specific clinical application areas, such as diagnostic assistance (54%), treatment (54.5%) and investigation (49%) of patients with mental illness. This is comparable to the study by Cecil et al. [32], in which half of the participants demonstrated a basic understanding of one AI application area in psychiatry, with AI-enabled tools for diagnostic assistance (43.4%) and treatment (41.4%) among the helpful application areas reported. A study conducted among Pakistani doctors reported that more than two-thirds of respondents had a basic knowledge of AI, but 77% were unaware of its practical application in medicine [33]. Also, nearly two-thirds (61.5%) of the respondents in our study were not familiar with AI use in psychiatry, higher than the 45.4% reported by Cecil and colleagues [32]. However, a study among Korean doctors reported that only 6% had good familiarity with AI [39]. Compared to the present study, the low familiarity with AI in the latter study may be attributed to its larger sample size and mixed participants, comprising medical students, trainee physicians, and physicians from different specialities. Also, most respondents (88.5%) in the present study reported no prior training on AI use in psychiatry, while nearly all expressed a desire for such training, consistent with previous studies [32, 40, 41]. This may suggest that the current Nigerian postgraduate residency training curriculum does not cover AI-related topics, as psychiatric trainees constituted 62% of the total respondents in our study. This, therefore, underscores the need for curriculum review to include technological innovations in mental health care during postgraduate psychiatric medical education and, by extension, undergraduate psychiatric medical education to improve future competence [27].

Perspectives on the AI roles, benefits, and risks in psychiatry

The findings of this study indicate that respondents generally hold an optimistic view of the usefulness of artificial intelligence in psychiatry. Most respondents supported integrating AI into psychiatric practice, consistent with previous findings [42, 43]. Our study also revealed that most respondents believed AI would enhance diagnostic accuracy, treatment, teaching and research quality in psychiatry, which agrees with previous studies [34, 42]. This underscores the importance of AI in psychiatric practice and psychiatric medical education [44]. Also, about a quarter (24%) agreed that AI would reduce the need for psychiatrists, slightly lower than the findings by Oh et al. [39], where 35.4% of respondents believed that AI would replace physicians’ jobs. However, some authors have posited that AI is unlikely to replace physicians in the future but can complement their tasks [45]. The possible reasons highlighted included the limitations of AI in providing empathetic care and establishing therapeutic relationships [45].

Our study revealed that respondents were sceptical about AI surpassing average psychiatrists in tasks requiring empathy (91.0% unlikely) and mental status examinations (68% unlikely). However, they were optimistic about AI excelling in documentation and analytical tasks, including updating medical records (73.0% likely), establishing prognoses (71.5% likely), and making diagnoses (74.5% likely). These findings align with previous studies in which most respondents (83%) felt that future AI was unlikely to provide empathetic care or perform mental status examinations better than average psychiatrists, while predicting that future AI would likely replace them in documentation and synthesising information [34, 35]. A similar study among family physicians in the UK found that respondents felt AI would be unlikely to provide empathetic care (94.4% unlikely), make diagnoses (68% unlikely), personalise treatment plans (61.2% unlikely) or refer patients to other health professionals (61.7% unlikely), but would be useful for documentation (80.2% likely) and predicting prognosis (52.7% likely) [46].

The present study highlighted potential AI benefits in psychiatric practice, including improving care plan standardization and personalization and addressing psychiatrist shortages. Many also believed AI could streamline workflows, minimize bias, and promote early help-seeking. Deploying AI to address the critical human-resources-for-health deficit is particularly relevant in Nigeria, where there are fewer than 0.1 psychiatrists per 100,000 population and a huge treatment gap of close to 90% [11, 13]. AI-based tools could assist by improving equitable access to specialist care for patients needing help, including those in hard-to-reach communities [29]. AI-driven administrative tools, such as triage management systems, can also streamline workflows, promoting efficiency and productivity and reducing burnout among psychiatrists [25, 29, 47]. AI integration in psychiatry can lead to standardization and personalization of care, potentially reducing cost and human error [28, 34]. This study also reported the potential of AI to reduce gender and racial biases in psychiatric care. This finding is significant given the well-documented challenges of racial, gender, and socio-economic biases in healthcare [28, 48]. By mitigating these biases, AI could provide a more equitable standard of care, an advancement that would be transformative in Nigeria and similar settings where systemic biases often exacerbate health disparities.

Despite the perceived benefits, respondents expressed several reservations, the most pressing concerns being data security, loss of human interaction, and reliability. Data security concerns arise worldwide whenever AI in health care delivery is considered [29, 30]. Many practitioners have cited privacy issues as an ethical reason for not embracing AI technology [29]. This is particularly important in psychiatric care, where patient confidentiality is critical. Regulatory frameworks focusing on data privacy in AI applications will be essential to address these concerns [49, 50]. Notably, nearly all respondents in our study also supported the need for AI regulation in Nigeria, with more than two-thirds calling for strict regulation. Concerns over reduced human interaction and empathy, also raised in this study, reflect apprehensions that technology could make patient care too mechanical, compromising the therapeutic relationship necessary for effective psychiatric management. The reservation about AI tools' lack of empathy is also a recurring theme among mental health professionals and other physicians, who have registered concerns that AI could hinder empathy-driven care [35, 46]. In a society like Nigeria, where interpersonal relationships often play a crucial role in the acceptance of mental health treatment, a lack of empathic care may pose a major challenge [51]. Previous research [35] reported that about 40% of psychiatrists were uncertain whether the potential benefits of future AI would outweigh the risks and harms, consistent with our study, where a rate of 36.5% was found. The reasons could be attributed to the ambiguities around AI-assisted tools' limitations, especially data security, empathic care, privacy concerns and regulation [29].

AI use and adoption readiness in mental health care

The benefits of AI in mental healthcare depend on its adoption by professionals [32]. Our study found that only 29.5% of respondents currently use AI tools, such as diagnostic support systems and predictive analytics. Despite this, 79.5% expressed interest in adopting AI within the next 5 years. A previous study [33] reported that 14.8% of doctors had used AI, a figure lower than the 29.5% found in the present study; however, 73.5% of their respondents indicated readiness to adopt AI in their future practice, which is encouraging and consistent with our results [33]. The possible reasons for low AI use included poor knowledge and understanding of AI and its applications, lack of training, an absent curriculum, low interest in AI-based tools and poor technological advancement, among others [33]. Also, evidence suggests that mental health practitioners were more inclined to learn about than to practically use AI-based tools, with psychiatrists displaying a willingness to use clinician-centred feedback or practice-management AI tools rather than patient-centred tools [32]. This may be due to a lack of trust in the reliability of AI-based tools, which respondents also expressed in the present study. Some have also posited that the higher risks of using AI-based technologies for diagnostic or treatment decisions, compared with receiving feedback or administrative support, could explain this, as diagnostic or treatment errors can have severe negative consequences, potentially resulting in wrong or delayed treatment, a worse prognosis and attendant litigation [32, 52]. Similarly, one study reported that 60% of psychiatrists do not recommend technology-based treatment to their patients [40]. However, Cross and colleagues reported that 43% of mental health professionals in Australia used AI in their practice, which is higher than the 29.5% reported in our study [42].
Factors such as infrastructural and technical issues, psychological concerns and workload-related issues have been identified as possible barriers to AI implementation in health care [53]. Consequently, there is a need for clinical validation of AI diagnostic and treatment tools, for training, and for strengthening jurisdiction-specific legislation to minimize human error, boost confidence and trust, and encourage AI adoption among practitioners.

Strengths and limitations

To our knowledge, this is the first survey in Nigeria assessing knowledge, perception and adoption readiness for AI use in psychiatric practice among psychiatrists and trainees, and it provides foundational insights into this topic among Nigerian psychiatrists. It is a national survey with fair representation of psychiatrists and trainees across the six geopolitical zones, which strengthens the findings of the present study.

The study’s limitations include the moderate response rate of 43.4%, although this may be adequate given results from similar online surveys among doctors [12, 54]. Response bias is possible, as respondents who chose to participate may have been those already interested in AI, which could affect the findings. The convenience sampling technique may have introduced a biased sample, limiting the generalizability of the results to the target population. The study focused on descriptive statistics and excluded subgroup analyses using bivariate statistics, which could have provided more detailed insights, including influencing variables; further studies could address this. Finally, our study did not assess in-depth knowledge of AI applications in psychiatry; a qualitative design would have provided more insight into this concept, and we recommend it for future research.

Conclusion

This study showed that while most psychiatrists and trainees in Nigeria were aware of the usefulness of AI in psychiatric practice, only about half were knowledgeable about its basic clinical application areas, and only 38.5% were familiar with its use. Most respondents positively perceived AI use in psychiatry, especially in diagnosis, treatment, teaching and research. Most expressed optimism that future AI could excel over average psychiatrists in tasks such as documentation, diagnosis and prognosis prediction. The highlighted benefits of AI in psychiatry include standardisation and personalisation of care plans, addressing the shortage of psychiatrists, streamlining workflows, promoting early help-seeking behaviour and minimising bias. Many expressed concerns over data security and the loss of human interaction with AI-based tools. The study also revealed low current AI use but high future adoption readiness among practitioners. Notably, there is a significant training gap, as only about one-tenth of respondents had previous training, and nearly all believed that AI should be regulated. Therefore, we recommend national policy formulation and a regulatory framework on the ethical use of AI in psychiatric practice that aligns with global best practices to encourage AI adoption in mental health care in Nigeria. There is also a need for a curriculum review of undergraduate and postgraduate psychiatric medical education to accommodate AI-related topics with practical exposure, to bridge the training gap and improve future competence among practitioners. Future studies should build on this national survey with qualitative designs to gain further insight into psychiatrists’ knowledge, perception and use of AI in psychiatry and to identify possible predictors for targeted interventions.
Government and other relevant stakeholders should invest in technological innovations and healthcare infrastructure to transform the mental health delivery landscape and improve treatment access in Nigeria.

Acknowledgements

We are grateful to all the respondents who participated in the study for their support, patience and cooperation throughout the period of data collection.

Abbreviations

ADHD

Attention Deficit Hyperactivity Disorder

AI

Artificial Intelligence

APN

Association of Psychiatrists in Nigeria

ECPs

Early Career Psychiatrists

EEG

Electroencephalogram

EHRs

Electronic Health Records

IRC

Institutional Review Committee

KWASUTH

Kwara State University Teaching Hospital

Authors’ contributions

O.A.A., P.O.A., M.O.S., and D.S. conceptualized the study. O.A.A., P.O.A., M.O.S., and D.S. wrote the study protocol. All authors participated in data collection. D.S. and M.O.S. analysed the data. M.O.S. and D.S. wrote the first draft of the manuscript. All authors contributed to the revision of the article with critical contributions to the intellectual content. All authors have contributed sufficiently to the manuscript and have read and approved the final version.

Funding

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Data availability

The anonymized data used in this study are not publicly available due to ethical reasons but are available from the corresponding author upon reasonable request.

Declarations

Ethics approval and consent to participate

The study was approved by the Institutional Review Committee of the Kwara State University Teaching Hospital (approval protocol number KWASUTH/IRC/246/VOL.II/46). The study was conducted in accordance with the principles of the Declaration of Helsinki. Participation in the survey was voluntary, and only those who provided informed consent were recruited into the study.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Footnotes

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

References

  1. Zuhair V, Babar A, Ali R, Oduoye MO, Noor Z, Chris K, et al. Exploring the impact of artificial intelligence on global health and enhancing healthcare in developing nations. J Prim Care Community Health. 2024;15:1–10.
  2. Alaran MA, Lawal SK, Jiya MH, Egya SA, Ahmed MM, Abdulsalam A, et al. Challenges and opportunities of artificial intelligence in Africa health space. Digit Health. 2025;11:1–7.
  3. Oduoye MO, Fatima E, Muzammil MA, Dave T, Irfan H, Fariha FNU, et al. Impacts of the advancement in artificial intelligence on laboratory medicine in low- and middle-income countries: challenges and recommendations: a literature review. Health Sci Rep. 2024;7(1):e1794. 10.1002/hsr2.1794.
  4. Alami H, Rivard L, Lehoux P, Hoffman SJ, Cadeddu SBM, Savoldelli M, et al. Artificial intelligence in health care: laying the foundation for responsible, sustainable, and inclusive innovation in low- and middle-income countries. Global Health. 2020;16(1):52. 10.1186/s12992-020-00584-1.
  5. McKinsey & Company. Unlocking digital healthcare in lower- and middle-income countries [Internet]. 2021. Accessed May 16, 2025. https://www.mckinsey.com/industries/healthcare/ourinsights/unlocking-digital-healthcare-in-lower-and-middle-incomecountries
  6. Paranjape K, Schinkel M, Hammer RD, Schouten B, Nannan Panday RS, Elbers PWG, et al. The value of artificial intelligence in laboratory medicine. Am J Clin Pathol. 2021;155(6):823–31. 10.1093/ajcp/aqaa170.
  7. Tahir MY, Mars M, Scott RE. A review of teleradiology in Africa: towards mobile teleradiology in Nigeria. SA J Radiol. 2022;26(1):2257.
  8. Bellemo V, Lim ZW, Lim G, Nguyen QD, Xie Y, Yip MYT, et al. Artificial intelligence using deep learning to screen for referable and vision-threatening diabetic retinopathy in Africa: a clinical validation study. Lancet Digit Health. 2019;1(1):e35–44. 10.1016/S2589-7500(19)30004-4.
  9. Gureje O, Lasebikan VO, Kola L, Makanjuola VA. Lifetime and 12-month prevalence of mental disorders in the Nigerian survey of mental health and well-being. Br J Psychiatry. 2006;188:465–71. 10.1192/bjp.188.5.465.
  10. Association of Psychiatrists of Nigeria. Association of Psychiatrists of Nigeria: home page. 2025. Available at https://www.apn.org.ng/. Last accessed May 16, 2025.
  11. Okechukwu CE. Shortage of psychiatrists: a barrier to effective mental health-care delivery in Nigeria. Int J Noncommunicable Dis. 2020;5(1):22. 10.4103/jncd.jncd_1_20.
  12. Essien EA, Mahmood MY, Adiukwu F, Kareem YA, Hayatudeen N, Ojeahere MI, et al. Workforce migration and brain drain: a nation-wide cross-sectional survey of early career psychiatrists in Nigeria. Camb Prisms Glob Ment Health. 2024;11:e30, 1–9.
  13. Mugisha J, Abdulmalik J, Hanlon C, Petersen I, Lund C, Upadhaya N, et al. Health systems context(s) for integrating mental health into primary health care in six Emerald countries: a situation analysis. Int J Ment Health Syst. 2024;11(1):7. 10.1186/s13033-016-0114-2.
  14. Ray A, Bhardwaj A, Malik YK, Singh S, Gupta R. Artificial intelligence and psychiatry: an overview. Asian J Psychiatry. 2022;70:103021.
  15. Luxton DD. An introduction to artificial intelligence in behavioral and mental health care. In: Luxton DD, editor. Artificial intelligence in behavioral and mental health care. Elsevier Academic Press; 2016. p. 1–26.
  16. Miller DD, Brown EW. Artificial intelligence in medical practice: the question to the answer? Am J Med. 2018;131(2):129–33. 10.1016/j.amjmed.2017.10.035.
  17. Gabbard GO, Crisp-Han H. The early career psychiatrist and the psychotherapeutic identity. Acad Psychiatry. 2017;41(1):30–4. 10.1007/s40596-016-0627-7.
  18. Shatte A, Hutchinson DM, Teague SJ. Machine learning in mental health: a scoping review of methods and applications. Psychol Med. 2019;49(9):1426–48. 10.1017/S0033291719000151.
  19. Saeedi A, Saeedi M, Maghsoudi A, Shalbaf A. Major depressive disorder diagnosis based on effective connectivity in EEG signals: a convolutional neural network and long short-term memory approach. Cogn Neurodyn. 2021;15(2):239–52. 10.1007/s11571-020-09619-0.
  20. Tenev A, Markovska-Simoska S, Kocarev L, Pop-Jordanov J, Müller A, Candrian G. Machine learning approach for classification of ADHD adults. Int J Psychophysiol. 2014;93(1):162–6. 10.1016/j.ijpsycho.2013.01.008.
  21. Antonucci LA, Raio A, Pergola G, Gelao B, Papalino M, Rampino A, et al. Machine learning-based ability to classify psychosis and early stages of disease through parenting and attachment-related variables is associated with social cognition. BMC Psychol. 2021;9(1):47. 10.1186/s40359-021-00552-3.
  22. Bain EE, Shafner L, Walling DP, Othman AA, Chuang-Stein C, Hinkle J, et al. Use of a novel artificial intelligence platform on mobile devices to assess dosing compliance in a phase 2 clinical trial in subjects with schizophrenia. JMIR Mhealth Uhealth. 2017;5(2):e18. 10.2196/mhealth.7030.
  23. du Sert OP, Potvin S, Lipp O, Dellazizzo L, Laurelli M, Breton R, et al. Virtual reality therapy for refractory auditory verbal hallucinations in schizophrenia: a pilot clinical trial. Schizophr Res. 2018;197:176–81. 10.1016/j.schres.2018.02.031.
  24. Yu R, Hui E, Lee J, Poon D, Ng A, Sit K, et al. Use of a therapeutic, socially assistive pet robot (PARO) in improving mood and stimulating social interaction and communication for people with dementia: study protocol for a randomized controlled trial. JMIR Res Protoc. 2015;4(2):e45. 10.2196/resprot.4189.
  25. Bower P, Gilbody S. Stepped care in psychological therapies: access, effectiveness and efficiency: narrative literature review. Br J Psychiatry. 2005;186(1):11–7. 10.1192/bjp.186.1.11.
  26. Sun J, Dong QX, Wang SW, Zheng YB, Liu XX, Lu TS, et al. Artificial intelligence in psychiatry research, diagnosis, and therapy. Asian J Psychiatry. 2023;87:103705.
  27. Lee QY, Chen M, Ong CW, Hui Ho CS. The role of generative artificial intelligence in psychiatric education: a scoping review. BMC Med Educ. 2025;25:438. 10.1186/s12909-025-07026-9.
  28. Obermeyer Z, Powers B, Vogeli C, Mullainathan S. Dissecting racial bias in an algorithm used to manage the health of populations. Science. 2019;366(6464):447–53.
  29. Fiske A, Henningsen P, Buyx A. Your robot therapist will see you now: ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. J Med Internet Res. 2019;21(5):e13216. 10.2196/13216.
  30. Harishbhai TM, Kumar CP, Choppadandi A, Kaur J, Naguri S, Saoji R, et al. Ethical considerations in the use of artificial intelligence and machine learning in health care: a comprehensive review. Cureus. 2024;16(6):e62443. 10.7759/cureus.62443.
  31. Chakrabarti S. Digital psychiatry in low- and middle-income countries: new developments and the way forward. World J Psychiatry. 2024;14(3):350.
  32. Cecil J, Kleine A, Lermer E, Gaube S. Mental health practitioners’ perceptions and adoption intentions of AI-enabled technologies: an international mixed-methods study. BMC Health Serv Res. 2025;25:556.
  33. Ahmed Z, Bhinder KK, Tariq A, Tahir MJ, Mehmood Q, Tabbassum SS, et al. Knowledge, attitude, and practice of artificial intelligence among doctors and medical students in Pakistan: a cross-sectional online survey. Ann Med Surg. 2022;76:103493.
  34. Al-Ansari AM, Al-Medfa MK. Psychiatrists’ attitudes towards artificial intelligence: tasks, job security and benefits. Bahrain Med Bull. 2023;45(3):1528–31.
  35. Doraiswamy PM, Blease C, Bodner K. Artificial intelligence and the future of psychiatry: insights from a global physician survey. Artif Intell Med. 2020;102:101753. 10.1016/j.artmed.2019.101753.
  36. Ayinde JB, Adebayo PO, Aremu IB, Akande HJ, Majeed HA, Osawe AA, et al. A CNN-based AI model for detecting acute intracranial haemorrhages on non-contrast CT scans in a cohort of blacks. ECR 2025; poster number C-19899. 10.26044/ecr2025/C-19899.
  37. Ojedokun S, Afolabi S, Olukoyejo O, Alatise T. Perception and opinions of medical professionals on artificial intelligence in optimizing the health sector. Asian J Med Princ Clin Pract. 2024;7(1):279–88.
  38. Kuni Tyeesi. NUC, NOUN and AAU train 931 scholars on use of AI. ThisDay Newspaper. Available at https://www.thisdaylive.com/index.php/2024/05/01/nuc-noun-aau-train-931-scholars-on-use-of-ai/?amp. Last accessed 24/05/2025.
  39. Oh S, Kim JH, Choi SW, Lee HJ, Hong J, Kwon SH. Physician confidence in artificial intelligence: an online mobile survey. J Med Internet Res. 2019;21(3):e12422. 10.2196/12422.
  40. Bauer R, Glenn T, Monteith S, Whybrow PC, Bauer M. Survey of psychiatrist use of digital technology in clinical practice. Int J Bipolar Disord. 2020;8(1):29.
  41. Pucchio A, Rathagirishnan R, Caton N, Gariscsak PJ, Del Papa J, Nabhen JJ, et al. Exploration of exposure to artificial intelligence in undergraduate medical education: a Canadian cross-sectional mixed-methods study. BMC Med Educ. 2022;22(1):815.
  42. Cross S, Bell I, Nicholas J, Valentine L, Mangelsdorf S, Baker S, et al. Use of AI in mental health care: community and mental health professionals survey. JMIR Ment Health. 2024;11(1):e60589.
  43. Graham S, Depp C, Lee EE, Nebeker C, Tu X, Kim HC, et al. Artificial intelligence for mental health and mental illnesses: an overview. Curr Psychiatry Rep. 2019;21:1–8.
  44. Espejo GD. Artificial intelligence: the next frontier in psychiatric treatment and education. Acad Psychiatry. 2023;47(4):437–8.
  45. Krittanawong C. The rise of artificial intelligence and the uncertain future for physicians. Eur J Intern Med. 2018;48:e13–4. 10.1016/j.ejim.2017.06.017.
  46. Blease C, Bernstein MH, Gaab J, Kaptchuk TJ, Kossowsky J, Mandl KD, et al. Computerization and the future of primary care: a survey of general practitioners in the UK. PLoS ONE. 2018;13(12):e0207418. 10.1371/journal.pone.0207418.
  47. Topol EJ. High-performance medicine: the convergence of human and artificial intelligence. Nat Med. 2019;25(1):44–56.
  48. Togioka BM, Young E. Diversity and discrimination in health care. 2024 May 2. In: StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2025. PMID: 33760480.
  49. Babikian J. Securing rights: legal frameworks for privacy and data protection in the digital era. Law Res J. 2023;1(2):91–101.
  50. Wang C, Zhang J, Lassi N, Zhang X. Privacy protection in using artificial intelligence for healthcare: Chinese regulation in comparative perspective. Healthcare (Basel). 2022;10(10):1878. 10.3390/healthcare10101878.
  51. Block VJ, Haller E, Villanueva J, Meyer A, Benoy C, Walter M, et al. Meaningful relationships in community and clinical samples: their importance for mental health. Front Psychol. 2022;13:832520.
  52. Kisely S, Scott A, Denney J, Simon G. Duration of untreated symptoms in common mental disorders: association with outcomes: international study. Br J Psychiatry. 2006;189(1):79–80.
  53. Borges do Nascimento IJ, Abdulazeem H, Vasanthan LT, Martinez EZ, Zucoloto ML, Østengaard L, et al. Barriers and facilitators to utilizing digital health technologies by healthcare professionals. NPJ Digit Med. 2023;6(1):161. 10.1038/s41746-023-00899-4.
  54. Cunningham CT, Quan H, Hemmelgarn B, Noseworthy T, Beck CA, Dixon E, et al. Exploring physician specialist response rates to web-based surveys. BMC Med Res Methodol. 2015;15:32. 10.1186/s12874-015-0016-z.
