Abstract
Background
Artificial Intelligence (AI)-enabled chatbots can offer anonymous education about sexual and reproductive health (SRH). Understanding chatbot acceptability and feasibility allows the identification of barriers to their design and implementation.
Methods
In 2020, we conducted an online survey and qualitative interviews with SRH professionals, recruited online, to explore their views on AI, automation and chatbots. Qualitative data were analysed thematically.
Results
Amongst 150 respondents (48% specialist doctor/consultant), only 22% perceived chatbots as effective and 24% saw them as ineffective for SRH advice (Mean = 2.91, SD = 0.98, range: 1–5). Overall, there were mixed attitudes towards SRH chatbots (Mean = 4.03, SD = 0.87, range: 1–7). Chatbots were most acceptable for appointment booking, general sexual health advice and signposting, but not acceptable for safeguarding, virtual diagnosis and emotional support. Three themes were identified: “Moving towards a ‘digital age’”, “AI improving access and service efficacy” and “Hesitancy towards AI”.
Conclusions
Half of SRH professionals were hesitant about the use of chatbots in SRH services, owing to concerns about patient safety and a lack of familiarity with this technology. Future studies should explore the role of AI chatbots as supplementary tools for SRH promotion. Chatbot designers need to address the concerns of health professionals to increase acceptability and engagement with AI-enabled services.
Keywords: Europe, location, prevention
Introduction
The rapid digitalisation of sexual and reproductive health services (SRHS) during the COVID-19 pandemic has offered valuable opportunities for improving healthcare utilisation, with the provision of remote consultations and screening for sexually transmitted infections (STIs). Advancements in Artificial Intelligence (AI) may further support healthcare delivery by automating routine tasks and by mining large datasets to support clinical decisions.1 AI-enabled interventions, such as medical algorithms and virtual assistants, are likely to improve STI prevention and control by increasing surveillance and by enhancing online interventions through tailored and targeted health promotion.2 Proof-of-concept studies have already demonstrated successful applications of machine learning to identify patients at increased risk of STIs/HIV who may benefit from HIV pre-exposure prophylaxis or more frequent sexual health screening.3–5 Several AI-supported risk-prediction tools have also been shown to be capable of predicting user risk of HIV, syphilis, gonorrhoea and chlamydia, as well as predicting contraceptive use amongst young people.6–8 Thus, AI models could inform clinical decisions, predict trends in the utilisation of SRHS and support resource allocation.
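To illustrate the class of risk-prediction tool described above, the following is a minimal sketch trained on synthetic data: a logistic-regression classifier whose predicted probabilities could flag patients for more frequent screening or a PrEP discussion. The feature names, toy data and 0.5 threshold are illustrative assumptions, not the published models of references 6–8.

```python
# Minimal sketch of an ML risk-prediction tool (illustrative only).
# Features, synthetic data and the 0.5 threshold are hypothetical assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical predictors: age, partners in past year, condomless sex (0/1), prior STI (0/1)
X = np.column_stack([
    rng.integers(18, 60, n),
    rng.poisson(2, n),
    rng.integers(0, 2, n),
    rng.integers(0, 2, n),
])
# Synthetic outcome loosely tied to the behavioural predictors, for demonstration only
logit = -3 + 0.3 * X[:, 1] + 1.0 * X[:, 2] + 1.2 * X[:, 3]
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Predicted probabilities, not hard labels, would drive clinical follow-up
risk = model.predict_proba(X_test)[:, 1]
print(f"AUC on held-out data: {roc_auc_score(y_test, risk):.2f}")
print(f"Flagged for follow-up (risk > 0.5): {(risk > 0.5).sum()} of {len(risk)}")
```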
Medical chatbots, or conversational agents, are AI-enabled interventions that recognise human speech and respond with pre-set answers, imitating a message exchange between patient and healthcare professional (HCP).9 While chatbots have been found to be acceptable in healthcare, particularly in mental health,10 oncology11 and public health,12 their efficacy and safety in SRHS still need to be established. Most patients prefer to consult an HCP for general medical queries. However, for conditions associated with higher stigma, such as STIs, some patients expressed a preference for an anonymous, non-judgmental consultation with a chatbot.13 Our previous research has demonstrated moderate acceptability of chatbots amongst sexual health patients (40%) and internet users (67%), with 48% expressing willingness to seek specialist advice about sexual health via this technology.14,15 Our qualitative study on engagement with a dedicated sexual health chatbot found mixed attitudes: users appreciated the chatbot’s convenience and anonymity while having concerns about accuracy and the chatbot’s ability to understand individual needs.16 This type of intervention was seen as mostly acceptable for anonymous sex education in young people and for signposting to appropriate clinical services. Previous studies also demonstrated that computer-assisted sexual health interviewing elicited greater reporting of behaviours associated with STI risk, suggesting that chatbots may facilitate referral to appropriate screening tests if they enable greater disclosure from users.17 A systematic review of 31 studies on the acceptability and effectiveness of conversational agents for sexual health promotion provides further support for the application of patient-facing automated systems.18 It demonstrated high levels of satisfaction and engagement, with 86% of users recommending these automated systems to others. However, more research is needed to understand the impact of chatbot consultations on specific health behaviours such as uptake of screening, vaccination or prophylaxis.
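As a concrete, hypothetical illustration of the ‘pre-set answers’ design described above, the sketch below matches keywords in a user message against a small set of intents and falls back to signposting when nothing matches. The intents, keywords and response wording are our own assumptions and do not reproduce any chatbot evaluated in the cited studies.

```python
# Minimal sketch of a rule-based sexual health chatbot (illustrative only).
# Intents, keywords and wording are hypothetical assumptions.
PRESET_RESPONSES = {
    ("test", "screening", "check-up"): (
        "You can order a free STI self-sampling kit online or book a clinic "
        "appointment. Would you like a link to your nearest service?"
    ),
    ("symptom", "discharge", "pain"): (
        "I can't diagnose symptoms. Please contact a sexual health clinic or "
        "your GP for a clinical assessment."
    ),
    ("prep", "prophylaxis"): (
        "PrEP is a medication that reduces the risk of acquiring HIV. A clinician "
        "can advise whether it is suitable for you."
    ),
}
FALLBACK = "I'm not sure I understood. I can help with testing, symptoms or PrEP."

def reply(message: str) -> str:
    """Return the pre-set answer whose keywords appear in the message."""
    text = message.lower()
    for keywords, answer in PRESET_RESPONSES.items():
        if any(word in text for word in keywords):
            return answer
    return FALLBACK  # signposting fallback when no intent matches

if __name__ == "__main__":
    print(reply("Where can I get an STI test?"))  # matches the screening intent
    print(reply("Is PrEP right for me?"))         # matches the PrEP intent
```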
HCPs’ views on AI play an important role in developing and implementing technology in healthcare. One study showed that while some doctors thought chatbots could support, motivate and coach patients, acting as surrogate caregivers complementing the work of HCPs, others were concerned about chatbots’ ability to comprehend the emotional states of patients.19 No study to date has explored HCPs’ views on the acceptability of AI in SRHS, and without their support, this technology may not be successfully utilised. Therefore, our study aimed to explore attitudes towards AI-led interventions in SRHS and identify the perceived usefulness of automation for various service streams.
Methods
Design
An exploratory mixed-methods study, comprising an online survey and follow-up interviews, was conducted between May and July 2020 to gain an in-depth understanding of HCPs’ attitudes towards AI-led interventions for SRHS. The University of Westminster Research Ethics Committee approved the study (Ref: ETH19200979).
Participants and recruitment
The research focussed on health professionals working in SRHS. These included consultants, specialist doctors, nurses, health advisors, psychologists, support workers, healthcare assistants, commissioners, service managers and health promotion practitioners. To place the study within the National Health Service (NHS) context, those working outside of the UK were excluded.
Due to the ongoing COVID-19 pandemic, no in-person recruitment was permitted. An online study advert was therefore distributed via Twitter and the internal newsletters of the British Association for Sexual Health and HIV, with a request to participate in the study as well as to distribute the survey link within relevant professional networks. Recruitment thus utilised both convenience and snowball sampling approaches to maximise the response; however, the participation rate could not be recorded due to the nature of snowball sampling. Participation was voluntary and no incentive was offered.
Measurement and procedure
The study advert contained a link to an online Qualtrics survey consisting of four scales. Participants were first asked demographic questions (age, gender, ethnicity, professional role). The first part of the survey asked about attitudes towards the rapid provision of digital sexual health services during the pandemic, published in a separate report.20 The second part explored the perceived usefulness of various automated services in SRHS, with 21 items representing different service streams such as ‘booking appointments’, ‘patient triage’, ‘risk assessment via sexual history taking’, ‘partner notification’ and ‘safeguarding for sexual assault’. Participants rated the usefulness of automation on a 5-point Likert scale ranging from ‘not at all useful’ to ‘extremely useful’. A further 13-statement scale explored attitudes towards automation, AI and chatbots for SRHS, including statements such as ‘I am sceptical about the use of AI in medicine’ and ‘A sexual health chatbot could negatively affect my work’, with 7-point response options ranging from ‘strongly agree’ to ‘strongly disagree’. Additionally, participants were asked about their personal experience with chatbots in general and with a dedicated chatbot for sexual health. Perceived effectiveness of sexual health chatbots was also assessed, on a 5-point Likert scale ranging from ‘very ineffective’ to ‘very effective’.
Participants who wished to take part in an online follow-up interview were invited to submit their contact details, and all who did so were interviewed. A topic guide further exploring the acceptability of AI and chatbots in SRHS was employed, with questions aimed at identifying potential barriers and facilitators to the implementation of AI-led interventions. The interviews were conducted via telephone or Skype and lasted approximately 30 minutes. All were audio-recorded and transcribed verbatim.
Data analysis
Quantitative data were analysed in SPSS using descriptive statistics (percentages, means and standard deviations) to identify trends in attitudes. The qualitative data were analysed independently by two researchers (TN and NK) using the six-step Thematic Analysis approach,21 and findings were compared to identify patterns in participant responses. The analysis informed the formulation of themes corresponding to the research question, and these were checked for consistency, coherence and applicability.
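For readers wishing to reproduce this style of descriptive analysis outside SPSS, a minimal sketch in Python/pandas is shown below. The column names and the five example responses are hypothetical stand-ins; the study dataset itself is not reproduced here.

```python
# Minimal sketch of the descriptive analysis (percentages, means, SDs).
# Column names and responses are hypothetical, not the study data.
import pandas as pd

# Each row is one respondent; values are Likert ratings
df = pd.DataFrame({
    "useful_booking": [5, 4, 5, 3, 4],       # 5-point usefulness scale
    "useful_safeguarding": [1, 2, 1, 2, 1],  # 5-point usefulness scale
    "attitude_overall": [4, 5, 3, 4, 4],     # 7-point attitude scale
})

# Means and standard deviations per item, as reported in the Results
print(df.agg(["mean", "std", "min", "max"]).round(2))

# Percentage rating an item moderately/very useful (4 or 5 on the 5-point scale)
pct = (df["useful_booking"] >= 4).mean() * 100
print(f"Booking rated useful by {pct:.0f}% of respondents")
```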
Results
Overall, 150 SRH professionals, mainly employed in England (82.7%) with a median age of 49 years (range: 24–80), completed the survey. Nearly half (48%) were specialist doctors or consultants, 28.5% were nurses and 12% were sexual health advisors; 71% were women and 84% identified as white. At the time of the study, over a third were required to take on a different role due to the ongoing COVID-19 crisis. Additionally, 24 survey participants (age range: 31–76; 54% doctors; 54% women; and 83% white) were interviewed.
There were mixed attitudes towards the automation of SRHS (Mean = 4.03, SD = 0.87, range: 1–7; Figure 1). Most participants did not see the automation of services requiring psychological and emotional support, such as safeguarding for sexual assault, trauma and abuse, as useful or acceptable. There was also low perceived usefulness for automated services offering a virtual diagnosis based on photography or confidential chemsex support. Just over half thought that automation of follow-up care or risk assessment via sexual history taking was either moderately or very useful. The perceived usefulness of automation was higher for electronic prescribing services, patient triage and HIV medication adherence. About two-thirds of participants thought that automation could be useful for identifying those at high risk of STIs, as well as those who may benefit from sexual health screening and HIV pre-exposure prophylaxis. Approximately the same proportion perceived partner notification and completion of antibiotics after STI treatment as potentially useful for automation. Finally, there was a high level of perceived usefulness of automation for appointment booking, condom distribution, general sexual health advice and signposting, as well as service evaluation based on patient responses.
Figure 1.
Perceived usefulness of automation in sexual and reproductive health services.
Although 40% of participants reported personal experience with a chatbot, only 5% had used a dedicated sexual health chatbot. While 22% perceived AI chatbots as an effective intervention, 24% thought they were ineffective and 54% were unsure (Mean = 2.91, SD = 0.98, range: 1–5). There were also mixed attitudes towards AI-enabled chatbots for sexual and reproductive healthcare (Figure 2). While half agreed that chatbots could provide more personalised treatment, a large proportion of participants were unsure whether chatbots would improve patient privacy, widen health inequalities or worsen the quality of patient care. Just under half were sceptical about the use of AI in medicine and about whether chatbots could improve access to SRHS. About a third of participants reported not understanding how sexual health chatbots work, and were unsure whether patients would disclose clinically relevant information to chatbots. The majority of respondents disagreed that chatbots could prevent unnecessary visits or reduce travel time to healthcare providers.
Figure 2.
Attitudes towards chatbots amongst sexual and reproductive health professionals.
The qualitative analysis resulted in three themes regarding the acceptability of automation and AI in SRHS.
Moving towards a ‘digital age’
Most interviewed HCPs agreed that the COVID-19 crisis had accelerated the digitalisation of SRHS in the UK, facilitating the transition to remote consultations and STI screening. Many believed that the NHS needed to be innovative in its use of online platforms to increase access to, and the quality of, healthcare services, but were unsure how advancements in AI could benefit their work and the patient experience. Despite positive views on the digitalisation of healthcare services, most participants admitted that they had little or no experience with AI, so their understanding of the technology was limited. Several HCPs emphasised the need for automation and AI because of the increased demand for SRHS, the shortage of qualified specialists and limited funding. Chatbots, in particular, were seen as complementary interventions that could signpost users to appropriate services; they were compared to existing online triage systems that help to manage demand and user expectations. However, there was a general concern that patients of older age, those without access to technology, or those with lower digital literacy skills may be less likely to benefit from AI-led interventions such as chatbots. There was a common view that while some patients may be comfortable with automated services, many would still prefer a face-to-face consultation with an HCP.
“I think AI and chatbots and automated kind of services will become the norm within the next 5-10 years, if not sooner. When people gain more confidence in them and they become more kind of sophisticated. But then we also need to remember that a lot of people will not be happy or find any form of digital health acceptable, or won’t have access to smart phones, or internet. So, we need to be careful that we don’t leave people behind who may be marginalised, or have worse health outcomes anyway” [Consultant]
“That’s basically what we’re doing with our telephone triage now, which is essentially we are intelligent bots ourselves, you know, doing the Q&A over the phone, and you don’t get it right every time, but you get it right a lot of the time.” [Nurse]
AI improving access and service efficacy
Automation and AI were seen as most acceptable for repetitive, non-clinical administrative tasks, such as booking and managing patient appointments, follow-ups or checking medication adherence. There was a consensus that AI chatbots could be well suited to providing basic generic advice, raising awareness of STIs and different SRHS, and engaging in personalised health promotion and education for different social groups. However, they were not seen as suitable for complex cases that require clinical input, such as diagnosis or treatment recommendation. Although there was no clear understanding of the role of AI in SRHS, many participants thought that automation could free up more time for HCPs to deal with complex cases and with vulnerable or underserved patient groups. AI was also seen as potentially improving the overall quality of the service if it facilitated better patient-HCP consultations by offering advanced online triage, service signposting or preparation of users for in-person sexual history taking.
“Something like a chatbot is very interactive, you spend a bit more time with it, so you could ask more questions, find out what someone’s specific needs are. So, you can present them with exactly what the service offers them, whether that’s an appointment, or an online test. But we’ve always tried to build into our tools that there’s a secondary level of information offered, which is what our clinicians would be. So, it’s like have you sort of thought about free condoms, have you thought of free testing in this time, have you thought about HIV PrEP. So, it’s almost the chatbot tool itself, or the triage tool, our website, because if you have that information that is automated, then you’re able to give them that much richer information, rather than just ‘please call to make an appointment’”. [Specialist doctor]
There was a view that, unlike popular internet search engines, which can portray inaccurate and stigmatising information about sex and sexual health, AI chatbots could identify individual user needs and offer rich, clinically validated information. Thus, those who engage in behaviours that increase their chances of acquiring an STI could be nudged to undertake online STI screening or contact an HCP. Several participants also believed that chatbot conversations could be translated for non-English-speaking users, thus overcoming language barriers for underserved populations. The anonymity offered by chatbots was seen as a potential advantage, especially for users who are embarrassed or uncomfortable talking to HCPs. However, there was an emphasis that chatbots should not pose an additional barrier for people undergoing sexual health screening, and should therefore be designed in a sex-positive way.
“If you think that in a few years’ time, that you might well worry about how well your chatbot will stand up to a Bangladeshi person in East London, but you know, in a few years’ time, your translation algorithm should be good enough that you can press a button and say, let’s have this, and boom it’ll do it. You know, you just press a button, and the machine will do it for you.” [Sexual health advisor]
Hesitancy towards AI
A large proportion of sampled HCPs were sceptical about AI, automation and chatbots for SRHS. This was explained by a lack of familiarity with AI-led interventions and a lack of clear scientific evidence that could inform clinical guidelines. There were concerns about the accuracy and equity of algorithms, and about whether AI-enabled interventions can be trusted. One participant noted that any AI chatbot is only as good as its programming, emphasising the need for HCPs and patients to be involved in the design of automated services. Many were also concerned about confidentiality, data privacy and information governance, questioning the overall management of data for AI algorithms.
“People are complex really and I’m not absolutely confident that for most people chatbots could be programmed well enough to perform those sorts of complexities.” [Consultant]
In terms of AI chatbots, participants thought that the technology may not yet be capable of understanding difficult questions about sex and sexuality. They believed that empathy and compassion cannot be delivered by bots and that users would not feel comfortable sharing intimate information. They emphasised the importance of non-verbal communication and the ‘human factor’ in SRHS, stressing that although AI may respond to simple questions, it would not be as emotionally intelligent as an HCP. Some participants were concerned that chatbots may fail to identify sensitive issues, especially those requiring safeguarding, which could be a missed opportunity for interaction. Some were worried that directing vulnerable users to interact with a chatbot instead of an HCP could be harmful or lead to unintended negative consequences. Conversely, chatbots programmed to recommend sexual health screening for those at lower risk of STIs could increase overall demand for SRHS and add unnecessary workload. There was agreement that more evidence is needed to understand how AI chatbots could improve health outcomes and patient wellbeing.
“You’re always going to need clinicians in order to be able to have that human factor, that you’re never going to get with a chatbot.” [Nurse]
“That one that came out, what was it last year? All over the place. Chatbot called something… dangerous. It just gave all the wrong diagnoses. So, the biggest fear is it says the wrong thing to somebody, just sends them up the wrong path, or it goes to the wrong clinical decision tree, gives the wrong diagnosis, the wrong advice. It’s quite high risk for healthcare with almost any problem.” [Consultant]
Discussion
This is the first study exploring the acceptability of AI-led interventions amongst sexual health professionals. It demonstrates mixed attitudes towards, and support for, the use of AI in SRHS, with a large proportion of HCPs expressing hesitancy about automation. While many acknowledged the potential benefits of AI, there was a general reluctance due to the lack of evidence on the effectiveness, safety and equity of AI-led interventions, as well as the absence of clear implementation guidelines. This could also be explained by a lack of familiarity with the technology and the time required for widespread application in healthcare, as described by the Diffusion of Innovation theory, which differentiates between innovators, early adopters and laggards.22 Therefore, the observed suboptimal acceptability of AI is not unexpected and may improve as more AI-enabled interventions and services are established across healthcare.
Previous studies involving HCPs reported similar results. A survey of 100 clinicians in the US showed that 76% believed AI chatbots were useful for administrative tasks such as scheduling doctor appointments and providing medication information; however, the same proportion thought that chatbots could not effectively care for all patients’ needs, lacking empathy and posing a substantial risk to patients through inadequate self-diagnosis.19 Similarly to our study, many HCPs believed that patients would not disclose sensitive information to chatbots. However, withholding medically relevant information from clinicians is well documented amongst women, younger patients and those with poorer self-rated health,23 and several studies have indicated that the level of nondisclosure could be lower with chatbots.24,25 Thus, further research is needed to establish the differences in information disclosure between HCPs and AI-led interventions. Additionally, an online survey of 250 healthcare staff in Saudi Arabia demonstrated a general lack of knowledge about AI and concerns about potential job displacement due to automation.26 Many of the surveyed clinicians were not aware of the benefits of AI, highlighting the need for training and technology education in the context of healthcare delivery.
Our study achieved a sample size of HCPs comparable to those previously published and used a mixed-methods design to offer a better understanding of attitudes towards AI and automation. However, the data collection took place during the first UK lockdown of the COVID-19 pandemic, when face-to-face recruitment was not permitted. The online survey may therefore reflect the attitudes of HCPs who were more technologically informed, inflating the moderate acceptability of AI. Our survey did not measure digital competencies and literacy, and it is likely that those with high ‘AI literacy’ might have more positive attitudes towards chatbots.27 The survey was conducted amongst UK-based HCPs working for universal, publicly funded healthcare services, with potentially lower incentives for automation compared with those working for privately funded healthcare. HCPs from other countries, especially those with less or more developed digital services, may have different views on AI and automation. At present, there is no validated tool to measure AI acceptability amongst HCPs; future research should therefore develop a standardised method of assessing AI hesitancy in HCPs. In addition, the study was conducted before the emergence of large language models and generative AI tools such as ChatGPT. There is a chance that advancements in conversational AI will affect attitudes towards chatbots as more HCPs become familiar with the technology.28
The findings emphasise the need to monitor the acceptability of AI amongst HCPs, as their hesitancy about automation is likely to influence the implementation of AI-led solutions for SRHS. Doctors, nurses and health advisors should be involved in the design, development and evaluation of AI systems for optimal acceptability and implementation. Detailed guidelines outlining the evidence for the safety, effectiveness and equity of automation are required to reduce AI hesitancy. Additionally, HCPs would benefit from specific staff training to improve knowledge and skills related to automation, specifically ‘know-how’ about AI auditing, monitoring and evaluation. Finally, there were positive attitudes towards automating administrative and non-clinical tasks such as providing specialised knowledge about SRH, signposting users to appropriate healthcare services and identifying service users in need of prophylaxis and screening. These tasks can already be performed by chatbots and risk-prediction tools based on AI algorithms. Thus, future studies should explore how these types of digital interventions could be incorporated into existing SRHS.
Supplemental Material
Supplemental Material for “But can chatbots understand sex?” Attitudes towards artificial intelligence chatbots amongst sexual and reproductive health professionals: An exploratory mixed-methods study by Tom Nadarzynski, Alexandria Lunt, Nicky Knights, Jake Bayley, and Carrie Llewellyn in International Journal of STD & AIDS.
Declaration of conflicting interests: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding: The author(s) received no financial support for the research, authorship, and/or publication of this article.
Contributorship: All authors contributed to the design of the study. AL collected all data. AL, NK and TN conducted data analysis. All authors contributed to the interpretation of the analysis and write-up of the manuscript.
Supplemental Material: Supplemental material for this article is available online.
ORCID iDs
Tom Nadarzynski https://orcid.org/0000-0001-7010-5308
Carrie Llewellyn https://orcid.org/0000-0002-9107-8473
References
1. Malik P, Malik P, Pathania M, et al. Overview of artificial intelligence in medicine. J Family Med Prim Care 2019; 8(7): 2328–2331.
2. Young SD, Crowley JS, Vermund SH. Artificial intelligence and sexual health in the USA. Lancet Digit Health 2021; 3(8): e467–e468.
3. Marcus JL, Sewell WC, Balzer LB, et al. Artificial intelligence and machine learning for HIV prevention: emerging approaches to ending the epidemic. Curr HIV/AIDS Rep 2020; 17(3): 171–179.
4. Mutai CK, McSharry PE, Ngaruye I, et al. Use of machine learning techniques to identify HIV predictors for screening in sub-Saharan Africa. BMC Med Res Methodol 2021; 21(1): 159.
5. Ridgway JP, Friedman EE, Bender A, et al. Evaluation of an electronic algorithm for identifying cisgender female pre-exposure prophylaxis candidates. AIDS Patient Care STDS 2021; 35(1): 5–8.
6. Xu X, Ge Z, Chow EP, et al. A machine-learning-based risk-prediction tool for HIV and sexually transmitted infections acquisition over the next 12 months. J Clin Med 2022; 11(7): 1818.
7. Gruber S, Krakower D, Menchaca JT, et al. Using electronic health records to identify candidates for human immunodeficiency virus pre-exposure prophylaxis: an application of super learning to risk prediction when the outcome is rare. Stat Med 2020; 39(23): 3059–3073.
8. Liu Z, Lin Z, Cao W, et al. Identify key determinants of contraceptive use for sexually active young people: a hybrid ensemble of machine learning methods. Children 2021; 8(11): 968.
9. Tudor Car L, Dhinagaran DA, Kyaw BM, et al. Conversational agents in health care: scoping review and conceptual analysis. J Med Internet Res 2020; 22(8): e17158.
10. Abd-Alrazaq AA, Rababeh A, Alajlani M, et al. Effectiveness and safety of using chatbots to improve mental health: systematic review and meta-analysis. J Med Internet Res 2020; 22(7): e16021.
11. Xu L, Sanders L, Li K, et al. Chatbot for health care and oncology applications using artificial intelligence and machine learning: systematic review. JMIR Cancer 2021; 7(4): e27850.
12. Oh YJ, Zhang J, Fang ML, et al. A systematic review of artificial intelligence chatbots for promoting physical activity, healthy diet, and weight loss. Int J Behav Nutr Phys Act 2021; 18(1): 160.
13. Miles O, West R, Nadarzynski T. Health chatbots acceptability moderated by perceived stigma and severity: a cross-sectional survey. Digit Health 2021; 7: 20552076211063012.
14. Nadarzynski T, Bayley J, Llewellyn C, et al. Acceptability of artificial intelligence (AI)-enabled chatbots, video consultations and live webchats as online platforms for sexual health advice. BMJ Sex Reprod Health 2020; 46(3): 210–217.
15. Nadarzynski T, Miles O, Cowie A, et al. Acceptability of artificial intelligence (AI)-led chatbot services in healthcare: a mixed-methods study. Digit Health 2019; 5: 2055207619871808.
16. Nadarzynski T, Puentes V, Pawlak I, et al. Barriers and facilitators to engagement with artificial intelligence (AI)-based chatbots for sexual and reproductive health advice: a qualitative analysis. Sex Health 2021; 18(5): 385–393.
17. Richens J, Copas A, Sadiq ST, et al. A randomised controlled trial of computer-assisted interviewing in sexual health clinics. Sex Transm Infect 2010; 86(4): 310–314.
18. Balaji D, He L, Giani S, et al. Effectiveness and acceptability of conversational agents for sexual health promotion: a systematic review and meta-analysis. Sex Health 2022; 19(5): 391–405.
19. Palanica A, Flaschner P, Thommandram A, et al. Physicians’ perceptions of chatbots in health care: cross-sectional web-based survey. J Med Internet Res 2019; 21(4): e12887.
20. Lunt A, Llewellyn C, Bayley J, et al. Sexual healthcare professionals’ views on the rapid provision of remote services at the beginning of the COVID-19 pandemic: a mixed-methods study. Int J STD AIDS 2021; 32(12): 1138–1148.
21. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006; 3(2): 77–101.
22. Kaminski J. Diffusion of innovation theory. Can J Nurs Inform 2011; 6(2): 1–6.
23. Levy AG, Scherer AM, Zikmund-Fisher BJ, et al. Prevalence of and factors associated with patient nondisclosure of medically relevant information to clinicians. JAMA Netw Open 2018; 1(7): e185293.
24. Lee YC, Yamashita N, Huang Y. Designing a chatbot as a mediator for promoting deep self-disclosure to a real mental health professional. Proc ACM Hum Comput Interact 2020; 4(CSCW1): 1–27.
25. Tsai WHS, Lun D, Carcioppolo N, et al. Human versus chatbot: understanding the role of emotion in health marketing communication for vaccines. Psychol Mark 2021; 38(12): 2377–2392.
26. Abdullah R, Fakieh B. Health care employees’ perceptions of the use of artificial intelligence applications: survey study. J Med Internet Res 2020; 22(5): e17620.
27. Ng DTK, Leung JKL, Chu SKW, et al. Conceptualizing AI literacy: an exploratory review. Comput Educ: Artif Intell 2021; 2: 100041.
28. Grünebaum A, Chervenak J, Pollet SL, et al. The exciting potential for ChatGPT in obstetrics and gynecology. Am J Obstet Gynecol 2023. doi:10.1016/j.ajog.2023.03.009.