Digital Health. 2025 Feb 3;11:20552076241311144. doi: 10.1177/20552076241311144

Overcoming barriers and enabling artificial intelligence adoption in allied health clinical practice: A qualitative study

Jane Hoffman 1,2, Rachel Wenke 1, Rebecca L Angus 1, Lucy Shinners 3, Brent Richards 1,4, Laetitia Hattingh 1,2,5
PMCID: PMC11792011  PMID: 39906878

Abstract

Background

Artificial intelligence (AI) has the potential to revolutionise healthcare. If implemented successfully, it could improve healthcare outcomes for both patients and organisations. Little is known about the perceptions of allied health professionals (AHPs) towards AI in healthcare.

Objective

This study investigated barriers and enablers to AI implementation in the delivery of healthcare from the perspective of AHPs.

Methods

A qualitative study informed by behaviour change theory, using focus groups with AHPs at a health service in Queensland, Australia.

Results

Twenty-four barriers and 24 enablers were identified by 25 participants across four focus groups. Barriers included lack of AI knowledge, explainability challenges, risk to professional practice, negative impact on professional practice, and role replacement. Enablers included AI training and education, regulation, reputation, understanding the healthcare benefits of AI, and engaging clinical champions.

Conclusions

AHPs have concerns about the impact and trustworthiness of AI and the readiness of organisations to support its use. Organisations must take a proactive approach and adopt targeted, multifaceted strategies to address barriers. These may include workforce upskilling, clear communication of the benefits of AI, use of local champions, and ongoing research.

Keywords: Allied health, artificial intelligence, hospital, digital health, behaviour change

Introduction

Artificial intelligence (AI) has the potential to accelerate digital transformation and revolutionise healthcare. The rapid advancements in technology from the fourth digital revolution are expected to profoundly impact the healthcare system, public health, and the individuals who deliver and receive care. 1 As pressure on healthcare increases globally due to a growing aging population, chronic disease, and a dwindling workforce, it is essential that healthcare organisations consider innovative and sustainable strategies to adapt to this demand.2–6 AI presents promising, novel solutions for these logistical and economic pressures.7,8 It has the potential to improve patient safety and drive positive system change by reducing mundane tasks, improving productivity, reducing risks, and supporting clinical decision making.8–10 AI and related technologies emulate human cognition by analysing immense volumes of data via complex computational models. 11 This can be used to improve the access, quality, and efficiency of healthcare via generative AI, machine learning, natural language processing and computer vision.12–15

Successful technology adoption depends on understanding the social dimensions of the human-technology relationship. 16 Prior digital transformations have shown that the increase of technology in healthcare delivery prompted a need to better understand the expectations, skills and resources of human users. 17 It is likely that AI will be used by the same people who currently deliver traditional healthcare; it is therefore important to explore the perceptions of the current workforce on implementation challenges and facilitators. 18 Studies show that the more complex the technology, the less likely the intended user is to adopt and use it,19,20 risking the abandonment or misuse of health technologies. 21 Failure to identify the key barriers and facilitators influencing AI adoption in healthcare could result in AI technology abandonment, posing a risk to patient safety in an overstretched health system.

Although there is vast potential for AI in healthcare, considerable barriers challenge implementation and user adoption. Reported barriers influencing AI implementation include human factors such as lack of knowledge and skills10,21,22; lack of explainability21,23–25; a perceived substitution crisis 4 ; regulatory barriers including liability, regulation, governance and policy6,21,26,27; and data-related barriers due to the quality and availability of health data27,28 combined with lack of interoperability of systems.6,22 Enabling factors include training and education,4,6,23,27 communicating potential benefits of AI in healthcare21,23,27,28 and leveraging influential staff. 23 Previous reports on these barriers and enablers have focused on the perceptions of the medical and nursing professions.6,23,25,28 While providing valuable insights for those professions, it is not yet known whether these findings translate to the allied health (AH) workforce. There is currently no internationally agreed definition of AH; however, the roles and functions of AHPs are similar between countries. 29 Australian allied health professionals (AHPs) are university-qualified practitioners with expertise to prevent, diagnose and treat a range of conditions and illnesses.30–32 AH is the second largest health workforce in Australia, consisting of at least sixteen diverse professions, including physiotherapy, pharmacy, occupational therapy, speech pathology, social work and dietetics.30–35

Implementing change in clinical practice requires a change management strategy underpinned by a theoretical understanding of clinician behaviour. 36 At the hub of the validated behaviour change wheel are three essential components, capability, opportunity and motivation (COM), which provide a framework for understanding behaviour (B) known as the COM-B model of behaviour.36–38 This recognises that behaviour is influenced by many factors: capability may be psychological (knowledge) or physical (skills); opportunity may be social (societal influences) or physical (environmental resources); motivation may be automatic (emotion) or reflective (beliefs, intentions). 39 The theoretical domains framework (TDF) is a validated tool describing 14 domains derived from the COM-B model, providing a more detailed understanding of the determinants of behaviour. 40 The behaviour change wheel links the components of both these frameworks (COM-B and TDF) to strategically develop practical interventions. Although they have had limited application in the area of AI in healthcare, 41 both frameworks are widely used to understand the barriers and enablers contributing to implementation problems and to develop targeted intervention strategies to address them.37,40

This study aimed to investigate determinants of AI implementation in healthcare from the perspective of AHPs. Understanding these barriers and enablers will provide essential insights to help inform future AI implementation strategies and facilitate the provision of quality healthcare.

Methods

Overview

A qualitative focus group study was conducted in 2023, exploring barriers and enablers of AI implementation in healthcare from the perspective of AHPs. Analysis was informed by the validated behaviour change wheel: the COM-B model of behaviour36,42 and the TDF. 40

The consolidated criteria for reporting qualitative research (COREQ) (Supplemental file 1) were used in the development, analysis and reporting of the study. 43 This study constitutes the second phase of a larger study, which included a quantitative survey exploring AHP perceptions of AI in healthcare. 44

Study setting

The Gold Coast Hospital and Health Service (GCHHS) employs approximately 10,000 staff to deliver public healthcare services to a general population of over 630,000 in south-east Queensland, Australia.45–47 Approximately 1200 AHPs are employed across one large tertiary hospital: Gold Coast University Hospital (GCUH); one mid-sized urban hospital: Robina Hospital; one day-surgery unit: Varsity Lakes Day Hospital; and community health services. 47 All GCHHS-employed AHPs were eligible to participate in this study, with no exclusion criteria.

Recruitment of participants

This study is phase two of a larger study. Phase one 44 comprised a convenience sample of GCHHS AHPs who responded to an invitation to participate in an online survey, disseminated using email, broadcasts, posters and staff meetings. The AH professions invited to participate were audiology; clinical measurement sciences; dietetics; medical imaging; occupational therapy; pharmacy; physiotherapy; podiatry; psychology; speech pathology and social work. 47

AHPs who participated in phase one 44 were invited to participate in a single 90-min focus group targeting perceptions of barriers and enablers of AI in health care. Potential participants were contacted via email and provided with study information and the opportunity to ask questions. All participants provided written consent prior to data collection. Four focus groups were scheduled, with participants from different AH professions evenly distributed within each for a balanced representation of professional groups. A decision was made to separate those in senior leadership roles from junior staff of the same AH profession to encourage open discussion and full participation. 48 Participant numbers were limited to a maximum of ten people, considering that the recommended number of participants for effective discussion is eight to twelve. 49

Data collection

Focus group workshops were conducted between 6 June and 6 July 2023 at GCUH and Robina hospitals. Workshop attendance was limited to research participants and facilitators; workshops were delivered in a workplace meeting room, audio-recorded, and facilitated by two members of the research team (JH, LH or RA), who took field notes for reflexivity. Participants were advised of the research team membership, and in a few cases, facilitators were known to the participants. The focus group question guide (Supplemental file 2) and prompts were designed to explore participants' perceptions of barriers and enablers to implementing AI in healthcare across the three constructs of COM-B36,42 and the 14 domains of the TDF. 40 The facilitator ensured discussions reached saturation point (i.e. when no new comments or ideas were presented) for each question before progressing to the next question; focus group sessions also continued until saturation was achieved, meaning no novel information was being gathered. Prior to the focus group workshops, questions were emailed to five AHPs to check for face and content validity and piloted with one, which resulted in minor changes to wording.50,51

The research team of five women and one man included registered health professionals in allied health (pharmacy, speech pathology and dietetics), medicine and nursing. Two (JH, BR) were working as clinicians within the health service during the study period. All had research qualifications (MMedRes or PhD). Three had a special interest in behaviour change theory and three in digital health initiatives.

Data analysis

Digital audio recordings of the focus groups were transcribed verbatim by a professional transcription service. A coding guide was developed in alignment with the TDF 40 and COM-B,36,42 then reviewed and agreed by the research team. Barriers and enablers were deductively coded against these two frameworks, with further subcategories identified using an inductive approach. Transcripts were coded according to the guide by the first author. A second author reviewed the coding, with differences discussed and resolved by consensus, before findings were shared with the wider research team. Similar subcategories were then merged for reporting the key findings. NVivo (QSR International) 52 was used to facilitate data analysis. Transcripts and results were not circulated to participants for comment.
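The study performed this coding and tallying in NVivo; purely as an illustrative sketch (not the authors' actual workflow, and using made-up excerpt data), the counting that underlies Table 2's "Number of focus group(s)" and "Mentions" columns could be expressed as:

```python
from collections import defaultdict

# Hypothetical coded excerpts: (focus_group, COM-B construct, TDF domain,
# subcategory, barrier/enabler). Real coding was done in NVivo.
coded_excerpts = [
    ("FG1", "Capability", "Knowledge", "Lack of AI knowledge", "Barrier"),
    ("FG2", "Capability", "Knowledge", "Lack of AI knowledge", "Barrier"),
    ("FG2", "Capability", "Skills", "Lack of first-hand experience", "Barrier"),
    ("FG3", "Motivation", "Emotion", "Fear and anxiety about AI", "Barrier"),
]

mentions = defaultdict(int)   # total mentions per subcategory
groups = defaultdict(set)     # focus groups in which each subcategory arose

for fg, com_b, domain, subcategory, kind in coded_excerpts:
    key = (com_b, domain, subcategory, kind)
    mentions[key] += 1        # one mention per coded excerpt
    groups[key].add(fg)       # distinct focus groups discussing it

# Rank subcategories by mention count, as Table 2 does within each domain
for key in sorted(mentions, key=mentions.get, reverse=True):
    com_b, domain, subcategory, kind = key
    print(f"{com_b} | {domain} | {subcategory} | {kind} | "
          f"groups={len(groups[key])} mentions={mentions[key]}")
```

With the sample data above, "Lack of AI knowledge" would be ranked first with two mentions across two focus groups, mirroring how the most frequently mentioned subcategories lead each domain in Table 2.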

Results

Participant demographics

Of the 245 participants that participated in phase one of the study, 44 69 opted in to participate in a focus group; of these, 44 subsequently declined the invitation due to workload or did not respond. A total of 25 participants attended one of four focus groups, each running for 90 min. The research team agreed that data saturation had occurred.

Participants were predominantly under 40 years old (11/25, 44%), female (17/25, 68%) and mostly in clinical (11/25, 44%) or managerial (8/25, 32%) roles. The three most represented professions were pharmacists (9/25), followed by occupational therapists (5/25) and physiotherapists (3/25). Participant demographics are summarised in Table 1.

Table 1.

Participant demographics n = 25.

Question Category n (%)
Age 18–30 5 (20)
31–40 6 (24)
41–50 6 (24)
51–60 4 (16)
61–70 0 (0)
Unknown 4 (16)
Gender Man 8 (32)
Woman 17 (68)
Profession Dietetics/nutrition 1 (4)
Medical imaging 1 (4)
Occupational therapy 5 (20)
Pharmacy 9 (36)
Physiotherapy 3 (12)
Psychology 2 (8)
Social work 2 (8)
Speech therapy 2 (8)
Role Clinical informatics/technology 3 (12)
Clinician 11 (40)
Educator/clinical facilitator 1 (4)
Governance 1 (4)
Manager 8 (32)
Researcher/academic 1 (4)

Barriers and facilitators

Twenty-four barriers and 24 enablers for AI implementation were identified from the focus group discussions. The most frequently mentioned barriers included lack of AI knowledge, explainability challenges, risk to professional practice, negative impact on professional practice and role replacement. Frequently mentioned enablers were AI training and education, regulation, reputation, understanding the healthcare benefits of AI and engaging clinical champions. These notable subcategories were described across 11 of the 14 TDF domains, aligning with all three constructs of COM-B. Subcategories within each domain are listed in order of frequency of mentions, along with the number of focus groups in which each subcategory was discussed (Table 2). Most comments fell within eight TDF domains. A narrative description of the most frequently mentioned subcategories is presented in relation to the three constructs of COM-B: capability, opportunity and motivation. 40 Quotations are used to support the narrative, with the source focus group annotated as FG1 to FG4.

Table 2.

Barriers and enablers of AI in healthcare according to allied health professionals, ranked in order of mentions, grouped by COM-B and TDF domains (n = 25).

COM-B TDF domain Subcategory description Barrier/ Enabler Number of focus group(s) Mentions
Capability Knowledge Lack of AI knowledge, and explainability challenges Barrier 4 62
AI training and education Enabler 4 56
Research and post implementation evaluation Enabler 4 31
Lack of unbiased research or post implementation evaluation Barrier 4 19
Speed of change and volume of technology development Barrier 4 19
Lack of credible sources providing information, education and guidance on AI Barrier 3 9
Behavioural regulation Risk to professional practice Barrier 4 44
Clinical governance and leadership Enabler 4 35
Skills Using AI in healthcare, initiated slowly Enabler 4 37
Lack of first-hand experience with AI Barrier 4 33
Opportunity Social influences Clinical champions: influential leaders supporting change Enabler 4 54
Absence of AI information from leadership groups. Barrier 4 38
Appropriate stakeholder identification and engagement to enable AI adoption. Enabler 3 15
Inappropriate stakeholder identification and lack of engagement for AI development and implementation Barrier 2 4
Environmental context and resources Workforce and workplace readiness Barrier 4 25
Prohibitive cost associated with development, implementation and maintenance of AI Barrier 3 11
Clinicians providing technical support to AHPs during AI implementation Enabler 2 11
Lack of technical support for AHPs and troubleshooting assistance Barrier 3 10
Lack of interoperability of existing electronic systems Barrier 3 9
Adequate funding and resources to support effective implementation Enabler 3 5
Motivation Social or professional role & identity Negative change to professional scope of practice and role replacement Barrier 4 42
Change to professional scope of practice that is favourable to AHPs Enabler 4 12
Clinical deskilling of AHPs using AI Barrier 3 11
Prevention of AHP clinical deskilling Enabler 1 3
Beliefs about consequences Improved efficiencies Enabler 4 40
Improved quality of healthcare Enabler 4 34
Not meeting the needs of the clinician. Barrier 4 33
Reduced human contact Barrier 4 32
Problem is clearly identified Enabler 4 28
Reputational decline of health service if not using AI Enabler 4 22
Reduced quality of healthcare and increased patient safety risk Barrier 4 21
Poor accuracy of the AI system: Limited data access and poor data quality Barrier 4 20
Increased workflow complexity with AI Barrier 4 18
AI helping AHPs deliver quality care as work force shortages continue. Enabler 4 18
Compromising patient privacy and confidentiality, lack of ethical framework Barrier 4 14
Communicate the key benefits of using AI in healthcare to AHPs. Enabler 4 11
AI helping to increase human contact in the delivery of health care. Enabler 3 10
AI improving accuracy in diagnostics, and synthesising information to support AHPs Enabler 3 9
Improve patient safety when using AI Enabler 3 9
Workplace culture resisting technical solutions in the delivery of healthcare Barrier 3 4
Ensuring patient privacy and confidentiality, provision of an ethical framework for AI implementation Enabler 3 4
Reduced efficiency for AHPs using AI Barrier 2 6
Simplification of workflow using AI Enabler 1 2
Beliefs about capabilities Strategies to improve user accuracy to ensure accurate AI system outputs. Enabler 3 5
Poor user AI competence affecting AI accuracy Barrier 2 2
Emotion Feelings of fear, concern and anxiety about AI implementation in healthcare. Barrier 4 13
Optimism Enthusiasm for the potential positive impacts of AI Enabler 3 4
Reinforcement Liability of the clinician if they do not use an AI tool proven to improve health care Enabler 2 4

Capability

Barriers and enablers of AI in healthcare relating to capability included knowledge, behavioural regulation and skills.

Knowledge

A lack of AI knowledge was a barrier to AI implementation in healthcare, along with the corresponding enabling factors of education, training and research.

Participants reported they did not have adequate AI comprehension, or even a basic level of AI knowledge, and thus had no means of assessing the level of training and education required to be competent in AI. Further, participants were concerned they would be unable to understand how an AI system derived the information it provided, resulting in distrust of AI output: ‘… my biggest concern is … issues with trust across the board are around the pace of the advancement. Like we can't keep up with knowing how it [AI] got to the conclusion that it got to or we can't keep up with being able to clinically understand to be able to check it as well’ (FG4).

Consequently, participants in all focus groups believed that acquiring knowledge of AI could be a key enabler for implementation. They wanted high quality, credentialed education from reputable providers to upskill the AI literacy of clinicians and its application to the delivery of healthcare. Participants indicated that within the workplace, training would ideally be provided by senior clinicians who were users of the relevant AI system and ‘definitely not the vendor’ (FG1).

Participants in all focus groups identified credible, independent, peer-reviewed research as a key enabler for implementation. Types of valued research identified ranged from randomised controlled trials to post-implementation evaluation to ensure AI tools met required expectations. ‘I’d like it to be well investigated to show that… outputs are effective. Not only effective but cost effective and deliver on what they say they deliver compared to the usual treatment’ (FG4).

Behavioural regulation

The risk to professional practice due to lack of legislation, perceived unclear indemnity insurance policies and risk to professional registration were identified as barriers to implementation by all focus groups. Participants identified a need for clear accountability, clinical governance and professional organisation leadership as enablers.

Participants voiced a need for a legal framework to guide the application and use of AI and recognised that legislation has historically lagged behind, failing to keep pace with the speed of technology development. ‘…what legislative frameworks are we having to work within to be able to release some AI product. Keeping the legislation up to date as the technology develops…when legislation takes so long to go through anyway’ (FG2). Additionally, a lack of clarity from indemnity insurance companies on the use of AI in healthcare generated concern about clinician accountability and the professional risk to which AHPs may be exposed, raising concerns about the potential impact on professional registration and livelihood. ‘…professional registration potentially as well, and what is competent then when you have AI, what does that mean then? As a clinician what then makes you competent?’ (FG2).

Participants believed clear professional accountability around using AI systems in healthcare would facilitate use in the clinical setting and provide clarity on the responsibilities of the individual clinician versus the obligations of the health organisation. Professional indemnity insurance policies that included a position on the clinical use of AI would encourage confident use without fear of litigation. ‘…indemnity insurance… saying, “this is an accepted part of practice… the technology fails you are not wholly to blame” …’ (FG2). Further, some participants noted that there may be a professional obligation to use AI systems in the future if they improve the quality of healthcare and patient safety: ‘… if you don’t use this it’s less safe… you are liable if you’re not using it’ (FG2).

Leadership from professional organisations was discussed as a potential enabler. Participants wanted these organisations to provide guidance on the use of AI in healthcare through the usual channels that define best clinical practice. ‘Our professional organisations release position statements … that would definitely impact on our ability to implement AI’ (FG3). ‘professional endorsement … [if] professional bodies endorsed it [AI] and… provided that kind of quality assurance that would make you more confident…’ (FG3).

Clinical governance leadership within health organisations would provide a further enabler. Clinical governance procedures were identified as a means to provide AHPs with confidence that a problem exists and that the proposed AI solution is safe, with appropriate regulation and policy in place to provide guidance to clinicians. ‘Clinical governance plays a critical role in building ensuring trust with the emerging health technologies’ (FG1). ‘… governance has the best view of what are the problems that we should really be focusing on’ (FG1).

Skills

Participants in all focus groups identified a lack of first-hand experience with AI as a key barrier to implementation. They perceived AI as not yet available for use in healthcare and were unaware of opportunities to utilise AI in their current delivery of care. They believed that attaining hands-on experience with AI would provide basic skills to support further implementation. Participants recommended that AI implementation in health care should be gradual, initially targeting nonclinical tasks. ‘…it would have to be a slow integration and starting from a non-clinical point of view first…’ (FG2).

Opportunity

Barriers and enablers related to opportunities for implementation of AI in health care included social influences and environmental context and resources.

Social influences

The role of clinical leaders in supporting change was identified by participants in all focus groups as key for AI implementation. Participants believed that implementation of AI requires a clinical leader to help influence change. Such leaders should be ‘…senior trusted clinicians ….’ (FG2) because participants acknowledged that ‘…if I'm taught how to do things by someone that I really trust, then I'm going to be more likely to listen and take that on and implement it into my practice’ (FG2). Collaboration between AI technical experts and clinical managers would be required to influence staff acceptance and behaviour. ‘…you need the people with the referent power… to help shape it, guide it, lead it, make it happen. But then you also need the expertise of the people that actually know and understand it to actually be able to answer the questions that come’ (FG1). The involvement of three departments within the health service, ‘governance, research, informatics’ (FG1), was identified as highly influential in enabling the adoption of AI in health care.

Participants in all focus groups identified the absence of AI information from leadership as a key barrier to implementation, noting a lack of discussion about AI by managers, senior colleagues and clinical governance bodies. As a result, participants were cautious in discussing the role AI might have in health care due to a lack of clarity around the support for it throughout the organisation. ‘…there's not a lot of talk in the governance space specifically around AI’ (FG1).

Environmental context and resources

Participants described pressures associated with workforce shortages and limited time as barriers to AI implementation. One participant stated, ‘I can see it's going to be a lot of work to get a successful product working’ (FG2) and based on prior experience noted that health services may not have this capacity due to workforce shortages and high demand for services. Participants believed workforce shortages would impose operational limitations on readiness initiatives such as training, suggesting the health organisation would not have the ‘…ability to release staff in the current circumstance, especially to then learn a whole new system’ (FG1). They also suggested that the health service may not be ready to incorporate AI in health care due to infrastructure and equipment limitations. ‘I don't think we have the structures, processes, in place to really start to move forward with that [AI]’ (FG1), noting that their prior experience of digital systems using existing infrastructure and lack of interoperability had resulted in disruptions to service ‘…we have had a lot of technology fail …’ (FG2).

Motivation

Barriers and enablers related to motivation for implementing AI in health care included social or professional role and identity, beliefs about consequences, beliefs about capabilities, emotion, optimism and reinforcement.

Social or professional role and identity

A barrier to AI implementation was a concern expressed in all focus groups that AI may have a negative impact on professional scope of practice: ‘As the technology changes, the roles changes, the scope of practice has to change. Role descriptions, remuneration, all that’ (FG1). There were concerns about AI replacing tasks associated with valued experience and knowledge. ‘…is this replacing me, but also can I trust it? It's taking away my decision making, my value, my independence. Do I actually trust a computer to do the things that I've been trained to do in my 20 years of experience or whatever? How can it do the job that I do?’ (FG1). Some expressed fear that AI would replace their role entirely, ‘… staff are going to be replaced by or supplemented to a large extent by AI…’ (FG4), ‘Just leave it alone. We like our jobs. We want to keep them’ (FG3), or result in an overall reduction in the number of staff delivering healthcare: ‘…there is a lot of concern about a decrease in the workforce and increase in the AI…’ (FG3). Some suggested that the introduction of AI might result in non-clinical, unskilled workers performing complex clinical tasks currently performed by experienced clinicians, as encapsulated by this comment: ‘… admin [administrative] person using a brain surgery piece of equipment’ (FG4).

Beliefs about consequences

Many subcategories were described by participants for this domain. For completeness, all subcategories are listed in Table 2; the most frequently mentioned subcategories are described below. Participants in all focus groups identified the key barriers as AI not meeting the needs of clinicians, reduced human contact with patients, poor access to quality data and a perceived reduction in healthcare quality because of AI. Key enablers related to communicating the following to AHPs: potential improved efficiencies, benefits of AI in health care, clear problem identification and potential positive impact on organisational reputation.

A key enabler to AI adoption was understanding how AI could improve efficiencies when delivering care. ‘… If we can convince people … this is actually going to give you back time to do the job you need to do to make people less pressured… That's an easy sell’ (FG1). Notably, communicating key messages about the benefit of efficiencies gained alongside anticipated workforce shortages. ‘…[AI] may increase our efficiency… we're going to have to treat an increased population… there's just never going to be enough healthcare workers to go around. So it can potentially automate some of the more routine parts of your role and allow you to see more patients potentially’ (FG3). Connecting efficiencies gained to maintaining quality healthcare was also viewed as a facilitator for AI adoption ‘… we've got an aging population, a growing population. Workforce is a real issue. … if we are actually going to be able to support our communities needs we need AI to [help clinicians] become more effective and efficient. So if we don’t, we're just not going to be able to … maintain the levels of health care we're used to’ (FG2).

Communicating the benefits of AI was identified as a key enabler to AI implementation. Participants were motivated by opportunities for improved quality of care and equitable access to healthcare, which could empower the workforce: ‘…we've got a really good opportunity to improve patient outcomes … To ignore something that is so powerful, admittedly problematic to implement I feel like is doing a disservice to our community and our patients’ (FG4).

There were beliefs that AI implementation efforts may not be driven by problems experienced by clinicians in delivering clinical care, which would act as a barrier to AI implementation. There were perceptions that AI systems would not meet the needs of clinicians but rather that organisations would ‘…implement something that nobody has really asked for’ (FG2). Efforts to pursue innovation could result in the abandonment of the real issues facing clinicians because these issues may not be novel or newsworthy. ‘We are just not even doing some of the basics that would make it better and free up clinician time to deliver patient care. They're the things that actually matter at the end of the day’ (FG1). Participants believed AI implementation strategies might instead be motivated by commercial potential. ‘We have to be careful not to choose what we are going to develop based on the ability to commercialise it, because it links back to solving the right problem’ (FG1).

A key barrier to AI implementation was the belief that AI would result in reduced human contact between the patient and clinician, along with less interprofessional collaboration. Participants were concerned that reduced human contact would negatively impact patients: ‘…make sure that they [patients] don't feel like we've abandoned them to robots’ (FG3).

Participants in all focus groups believed that clinicians would use AI if they clearly understood that it addressed a specific problem. Of importance was to know how the problem was identified, who prioritised it and why, and how it aligned with the existing clinical priorities of the organisation: ‘…who's curating that list of issues? Who's deciding what are the priorities on that list?’ (FG1), to be reassured that ‘… the health service … [is] solving the right problems’ (FG1). Further, participants also wanted to understand the scope of the AI system, how it would be used, and what its limitations would be: ‘…the solution is understanding the scope [of AI]’ (FG1), ‘being clear of what [AI] can and can't be…’ (FG3).

Participants in all focus groups identified the reputation of the health organisation as an enabler for AI implementation. There were concerns that failure to pursue AI in health care would negatively impact the reputation of their workplace ‘…we’re trying to strive towards being a world class health facility. If everyone else starts doing it and we're not, it's potential impact on our reputation’ (FG1). Some would find it ‘…personally very embarrassing’ (FG3) to not be viewed as an innovative health organisation and believed the health service would fall ‘…behind the rest of the world’ (FG3). This would bring the performance of individual clinicians into question ‘…[if] we don't implement it, we're obviously behind, so it'll be viewed as inefficient and incompetent’ (FG1).

In addition, participants recognised the impact of organisational reputation on capacity to recruit and retain quality staff. ‘… we wouldn’t be progressive…. we wouldn’t be seen as innovative… we wouldn’t be able to recruit’ (FG3). Further, patient perceptions about the reputation and quality of the health service were important to AHPs ‘I think the public's perspective also impacts the clinician's view, if that makes sense. So what patients think matters to clinicians’ (FG1).

Participants identified that AI systems could increase patient safety risks and ultimately lead to a decline in the quality of healthcare: ‘…I think there's a real risk that there could be worse outcomes as well from it’ (FG1); ‘…how do we …make sure… it's safe for patients and that it's not adding extra risk to patient care?’ (FG4).

The perception that AI output would be inaccurate due to limited data access and poor data quality was identified as a barrier to implementation. Participants noted that access to patient data is complex due to cybersecurity restrictions that protect patient privacy and confidentiality, and raised concerns that data access limitations would result in incomplete and questionable AI output: ‘[AI would have to]…go through like a stringent cybersecurity check before you can even use it, which I think is a barrier as well’ (FG4).

Participants recognised existing issues with the quality of available data and how this would impact the development of AI systems. ‘… trying to do research with data in the organisation and seeing the variability of that data I'm thinking, how could we - how do we get to the point where the data feeding into it is consistent enough for it to spit out a consistent message?’ (FG2). Participants acknowledged that AHP inputs contribute to the quality of the available data: ‘… it's only as good as the quality of data that's put in. We currently have to employ people into positions to make sure the quality of data going in is correct …’ (FG4). Poor quality data would produce inaccurate AI information, potentially impacting patient care: ‘…we can have the smartest computer in the world, but if the data is flawed, then the output's going to be bad’ (FG4).

Discussion

This is the first study of its kind to explore AHPs' perceived barriers and enablers to the implementation of AI in healthcare. Key barriers identified by AHPs included a lack of AI knowledge, explainability challenges, perceived risk to professional practice and employment, and AHP liability. Key enablers were quality training and education to increase digital literacy, leadership from professional organisations, credible research and evaluation, communicating the benefits of AI, utilising clinical champions, and maintaining or improving the reputation of the health service. The results demonstrated a variety of factors that aligned with the three COM-B constructs (capability, opportunity, and motivation) 37 and 11 of the TDF domains. 40 This mapping can be used to guide the development of interventions to achieve behavioural change, an important consideration when planning implementation strategies for any change in healthcare delivery.

AHPs in this study were representative of the allied health workforce in Australia, reported by the Australian Institute of Health and Welfare as mostly female and aged under 44 years. 34 Physiotherapists and occupational therapists comprise the largest professions in the allied health workforce, 31 yet pharmacists were predominantly represented in this study. Phase one of this study found that perceptions about the professional impact of AI varied between individual AH professions 44 and that pharmacists were more likely than occupational therapists, physiotherapists, and social workers to think AI will impact their professional role. 44 Pharmacists may be more aware of the potential impact of AI on their professional future and are increasingly represented in the literature,44,53–55 which may explain the number of pharmacists who agreed to participate in this study.

The present study revealed that AHPs feel highly responsible and liable for the quality of healthcare they provide, and as such need to understand the clinical tools available to support their decisions. Addressing technical literacy is challenging when complex algorithms are rarely taught within health curricula and AI produces unexplainable and unexpected ‘black box’ outputs.21,23–25 The lack of AI system explainability can lead to clinician mistrust in the output provided, prompting concerns for patient safety.6,23 Yet digital health implementations rely on trust to succeed. 56 A study in the Netherlands found radiologists required a technical understanding of an AI tool to comprehend how the output was derived and to establish trust in the reliability, quality and safety of the output.21,24 Consistent with these findings, our study found AHPs do not believe they have adequate AI comprehension to understand how an AI system works, and this impacts their ability to validate the findings of an AI tool. As a result, AHPs expressed distrust in AI output and concern for patient safety, which would likely hinder AI implementation efforts. Healthcare organisations therefore need to be aware of this distrust and explain how AI output is derived as part of implementation.

To address the reported knowledge deficits and resultant distrust toward AI, our findings indicate that quality education and training are imperative for developing a skilled workforce to enable AI implementation in AH practice, in line with other research.57–59 Efforts should target increasing general AI knowledge and improving the technical explainability of each AI tool.6,44 An earlier phase of our study found that AHPs had little to no knowledge, training, or firsthand experience with AI and that efforts to improve this should be tailored to individual AH professional groups. 44 From our focus groups, it was clear that AHPs want high-quality, accredited, transferable training delivered by professional organisations and/or academic institutions. Although training institutions have a responsibility to prepare health professionals for AI, 25 digital health training providers and formal AHP competency frameworks are lacking in this respect.59,60 Further, health professionals indicate greater trust in universities than in entities that stand to profit from health professional training in, and use of, their systems. 56 Our results highlight the critical role that professional organisations will have in enabling AI adoption in healthcare: they can support AHP education and professional accreditation and provide influential leadership through position statements while digital literacy is gradually embedded in healthcare curricula.58–60 This study highlighted a need for ongoing development and evaluation of education strategies and initiatives. Healthcare organisations may need to drive AI knowledge and training strategies for AHPs by collaborating with educational and professional organisations.

Another important barrier identified in the study was the perception among health professionals that adopting AI in practice may pose an unacceptable risk to their professional practice and livelihood, a complex and considerable barrier to implementation. As has been noted in the medical professions, our study found that AHPs believe digital advancements such as AI could result in the replacement of clinical staff.4,24,26,57 This perceived substitution crisis is a serious barrier in the deployment of AI initiatives, 4 as clinicians will not be motivated to adopt AI if it is perceived to threaten their employment or diminish their professional role. Although the impact on employment in health has not been clearly determined, the literature suggests AI presents opportunities for increased efficiency and productivity 61 rather than reduced employment opportunities. Health professional roles may also evolve over time to incorporate digital skills in clinical practice,61–63 which is more likely to lead to an extended scope of practice and greater job satisfaction. In addition, use of AI in healthcare is not clearly supported by professional standards, legislation, regulation or professional indemnity policies. This places a risk on the AHP if liability falls on the clinician rather than on the technology or the organisation.21,26,58,64 It is interesting to note, however, that the AHPs in our study also questioned the potential for professional liability if AI tools are available but are not used. This further highlights the need for clarity on the professional application of AI tools.

Communicating the advantages of AI in healthcare to AHPs was identified as an important enabler for AI implementation, consistent with prior studies.10,26,27,57 Participants believed information about potential efficiency gains and improvements in the quality of healthcare would assist AHPs to adapt to AI in healthcare. One strategy recommended by an American study was to demonstrate the advantages and usability of AI by presenting successful cases of efficiency improvements and applicability. 27 Embedding research and evaluation within AI implementation has been established as a key facilitation strategy,28,64,65 and was similarly identified as important by our study participants. Quality research would provide credible and targeted information to support AHPs' understanding of the advantages and applicability of AI.44,53–55 This finding highlights the importance of developing a communication plan prior to implementation, tailored to include credible information about the efficiencies gained, the applicability to individual AH professions and the impact on the quality of healthcare.27,28

Organisational leadership was seen by AHPs as another key to the successful implementation of AI technology, provided it is supported by effective communication and reliable information to promote trust and use. One strategy was for local clinical champions to take leadership roles. There is evidence that local champions can be essential to overcoming resistance to AI implementation in healthcare. 24 Clinician-led change leverages the considerable impact of influential leaders on successful digital health implementations.24,57,66 An Australian study found that influential leaders had the greatest impact on pharmacy staff using a newly implemented dispensing robot, and that this early influence continued to affect staff usage fifteen months after implementation, highlighting the importance of identifying and deploying champions early in digital implementation to leverage a long-term positive impact. 66 In our study, AHPs identified the potential for clinical champions to play a crucial role in AI adoption. As each AH profession is unique and diverse in its professional scope of practice, organisations should identify a clinical champion in each area to meet AHPs' need for a senior clinical leader who is using an AI system to facilitate adoption.

An interesting finding of our study was that preventing reputational decline of the health organisation is important to AHPs and could thus be an enabler for the workforce in adopting AI in healthcare. To our knowledge, this is the first study to report organisational reputation as a key enabler of AI implementation for AHPs, and this should be considered in the messaging for future AI implementation efforts. Reputation can help determine the success or failure of an organisation and can influence the expectations, behaviour and attitudes of patients and health professionals. 67 The present study found that working for a health organisation renowned for innovation aligned with professional credibility and the capacity to recruit quality staff.

Strengths and limitations

A key strength of our study was the use of behaviour change frameworks (COM-B and TDF) to underpin the study. Focus group discussions are ideal for investigating complex behaviours and motivations and provide a forum for participants to discuss AI in clinical settings openly and robustly.68,69 Different AH professions were distributed across focus groups for balanced representation, and senior leaders were separated from participants in junior roles of the same AH profession to encourage open discussion and full participation.

Although the study had a good response rate and representation from eight AH professions, it was conducted at a single health service, limiting the generalisability of the findings to other health settings. We have previously reported that pharmacists are more likely than other AH professions to perceive that AI will impact their professional role. 44 In combination with the potential influence of two senior pharmacists leading the study (JH, LH), this may account for the high proportion of pharmacist participation. 44 AHPs with an existing interest in and awareness of AI in healthcare may have been more motivated to participate, so those without an interest in AI or resistant to technology change may be underrepresented. Despite efforts to engage and recruit across all disciplines, only eight of the sixteen allied health professions employed within our health service were represented, limiting the generalisability of the findings across the range of AH professions. Future studies may consider targeting the AH professions not represented in this study to comprehensively explore AHP barriers and enablers of AI in healthcare and compare results between individual AH professions to guide targeted approaches. Further investigation is required into the best educational modality and frequency to address AHP learning requirements. Future studies should also evaluate the impact of AI on AHP employment, the reduction of simple administrative tasks, the increase in professional tasks and the extension of scope of practice.

Conclusion

AI presents a promising and novel solution to ease pressures from current and future healthcare demands. Rather than a one-size-fits-all approach, implementation strategies should be informed by a clear understanding of workforce perceptions of AI and its impact on staff capability, opportunity, and motivation, as these will influence adoption and resultant patient outcomes. Our study showed that AHPs have concerns about the impact of AI on healthcare and the readiness of organisations to support them in the adoption of new technologies. Targeted strategies are needed to overcome these concerns, incorporating interventions such as workforce upskilling through high-quality, accredited, transferable training delivered by professional organisations and academic institutions, clear communication of the benefits of AI and the problem it is intended to solve, use of local clinical champions and ongoing research. Healthcare organisations need to take a proactive, tailored and multifaceted approach to facilitate the use and uptake of AI in the delivery of healthcare and provide the best possible patient care.

Supplemental Material

sj-pdf-1-dhj-10.1177_20552076241311144 - Supplemental material for Overcoming barriers and enabling artificial intelligence adoption in allied health clinical practice: A qualitative study

Supplemental material, sj-pdf-1-dhj-10.1177_20552076241311144 for Overcoming barriers and enabling artificial intelligence adoption in allied health clinical practice: A qualitative study by Jane Hoffman, Rachel Wenke, Rebecca L Angus, Lucy Shinners, Brent Richards and Laetitia Hattingh in DIGITAL HEALTH

sj-docx-2-dhj-10.1177_20552076241311144 - Supplemental material for Overcoming barriers and enabling artificial intelligence adoption in allied health clinical practice: A qualitative study

Supplemental material, sj-docx-2-dhj-10.1177_20552076241311144 for Overcoming barriers and enabling artificial intelligence adoption in allied health clinical practice: A qualitative study by Jane Hoffman, Rachel Wenke, Rebecca L Angus, Lucy Shinners, Brent Richards and Laetitia Hattingh in DIGITAL HEALTH

Acknowledgements

The authors would like to thank the allied health staff at Queensland Health GCHHS for their participation in this study and GCHHS Allied Health Research for providing in-kind funding to complete this research.

Appendix

Notation

AH

allied health

AHP

allied health professional

AI

artificial intelligence

COM-B

capability, opportunity, motivation, behaviour

GCHHS

Gold Coast Hospital and Health Service

GCUH

Gold Coast University Hospital

TDF

theoretical domains framework

Footnotes

Author contributions: JH, LH, RW conceptualised the study and developed the methodology. All authors researched literature, developed the protocol, and gained ethical approval. LH, RW, RA and JH were involved in participant recruitment, and data collection. JH conducted data analysis; LH validated the data. JH drafted the manuscript. All authors reviewed, edited, and approved the final version of the manuscript.

Consent to participate: All participants provided informed written consent prior to data collection.

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Ethical considerations: Ethics approval was granted by the Gold Coast Health Human Research Ethics Committee (HREC/2023/QGC/96821).

Funding: The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the Gold Coast Health, Study, Education and Research Trust Account (SERTA).

Supplemental material: Supplemental material for this article is available online.

References

1. Cwiklicki M, Duplaga M, Klich J. The digital transformation of healthcare, Health 4.0. New York: Routledge, 2022.
2. OECD and World Health Organization. Health at a Glance: Asia/Pacific 2020: Measuring Progress Towards Universal Health Coverage. Paris: OECD Publishing, 2020.
3. Jakovljevic M, Westerman R, Sharma T, et al. Aging and global health. In: Kickbusch I, Ganten D, Moeti M (eds) Handbook of global health. Cham: Springer International Publishing, 2021, pp.73–102.
4. Hameed BMZ, Nithesh N, Sufyan I, et al. Breaking barriers: unveiling factors influencing the adoption of artificial intelligence by healthcare providers. Big Data Cogn Comput 2023; 7: 105.
5. World Health Organisation. Regulatory considerations on artificial intelligence for health. Geneva: WHO, 2023.
6. Chomutare T, Tejedor M, Svenning TO, et al. Artificial intelligence implementation in healthcare: a theory-based scoping review of barriers and facilitators. Int J Environ Res Public Health 2022; 19: 16359.
7. Topol EJ. The Topol review: preparing the healthcare workforce to deliver the digital future. An independent report on behalf of the Secretary of State for Health and Social Care. England: Health Education England, 2019.
8. World Health Organisation. Global strategy on digital health 2020–2025. Geneva: World Health Organisation, 2021.
9. Davenport T, Kalakota R. The potential for artificial intelligence in healthcare. Future Healthc J 2019; 6: 94–98.
10. Klarenbeek SE, Schuurbiers-Siebers OCJ, van den Heuvel MM, et al. Barriers and facilitators for implementation of a computerized clinical decision support system in lung cancer multidisciplinary team meetings - a qualitative assessment. Biol (Basel) 2020; 10: 9.
11. Russell S, Norvig P. Artificial intelligence: a modern approach. 3rd ed. USA: Pearson, 2016.
12. Lai MC, Brian M, Mamzer MF. Perceptions of artificial intelligence in healthcare: findings from a qualitative survey study among actors in France. J Transl Med 2020; 18: 14.
13. Lee EE, Torous J, De Choudhury M, et al. Artificial intelligence for mental health care: clinical applications, barriers, facilitators, and artificial wisdom. Biol Psych 2021; 6: 856–864.
14. Alami H, Lehoux P, Auclair Y, et al. Artificial intelligence and health technology assessment: anticipating a new level of complexity. J Med Internet Res 2020; 22: e17707.
15. He J, Baxter SL, Xu J, et al. The practical implementation of artificial intelligence technologies in medicine. Nat Med 2019; 25: 30–36.
16. Hanseth O, Monteiro E. Changing irreversible networks: institutionalisation and infrastructure. Aix-en-Provence, France: European Conference on Information Systems, 1997.
17. Ziebland S, Hyde E, Powell J. Power, paradox and pessimism: on the unintended consequences of digital health technologies in primary care. Soc Sci Med 2021; 289: 114419.
18. Ćwiklicki M, Klich J, Chen J. The adaptiveness of the healthcare system to the fourth industrial revolution: a preliminary analysis. Futures 2020; 122: 102602.
19. Cresswell K, Sheikh A. Organizational issues in the implementation and adoption of health information technology innovations: an interpretative review. Int J Med Inform 2013; 82: e73–e86.
20. Greenhalgh T, Wherton J, Papoutsi C, et al. Beyond adoption: a new framework for theorizing and evaluating nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J Med Internet Res 2017; 19: e367.
21. Winter PD, Chico TJA. Using the non-adoption, abandonment, scale-up, spread, and sustainability (NASSS) framework to identify barriers and facilitators for the implementation of digital twins in cardiovascular medicine. Sensors 2023; 23: 6333.
22. Cresswell K, Domínguez Hernández A, Williams R, et al. Key challenges and opportunities for cloud technology in health care: semistructured interview study. JMIR Hum Factors 2022; 9: e31246.
23. Balch JA, Loftus TJ. Actionable artificial intelligence: overcoming barriers to adoption of prediction tools. Surgery 2023; 174: 730–732.
24. Strohm L, Hehakaya C, Ranschaert ER, et al. Implementation of artificial intelligence (AI) applications in radiology: hindering and facilitating factors. Eur Radiol 2020; 30: 5525–5532.
25. Singh RP, Hom GL, Abramoff MD, et al. Current challenges and barriers to real-world artificial intelligence adoption for the healthcare system, provider, and the patient. Transl Vis Sci Technol 2020; 9: 45.
26. Haider H. Barriers to the adoption of artificial intelligence in healthcare in India. K4D Helpdesk Report 780. Brighton, UK: Institute of Development Studies, 2020.
27. Zemplenyi A, Tachkov K, Balkanyi L, et al. Recommendations to overcome barriers to the use of artificial intelligence-driven evidence in health technology assessment. Front Public Health 2023; 11: 1088121.
28. Watson J, Hutyra CA, Clancy SM, et al. Overcoming barriers to the adoption and implementation of predictive modeling and machine learning in clinical care: what can we learn from US academic medical centers? JAMIA Open 2020; 3: 167–172.
29. Turnbull C, Grimmer-Somers K, Kumar S, et al. Allied, scientific and complementary health professionals: a new model for Australian allied health. Aust Health Rev 2009; 33: 27–37.
30. Angus RL, Hattingh HL, Weir KA. The health service perspective on determinants of success in allied health student research project collaborations: a qualitative study guided by the consolidated framework for implementation research. BMC Health Serv Res 2024; 24: 43.
31. Australian Government Department of Health and Aged Care. About allied health care, https://www.health.gov.au/topics/allied-health/about (2024, accessed 17/05/2024).
32. Australian Health Practitioner Regulation Agency. Professions and divisions, https://www.ahpra.gov.au/Registration/Registers-of-Practitioners/Professions-and-Divisions.aspx (2024, accessed 17/05/2024).
33. Queensland Health. Allied health workforce. Queensland, Australia: Office of the Chief Allied Health Officer, 2017. https://qheps.health.qld.gov.au/alliedhealth/html/professions/professions-landing-page (accessed 06/12/2023).
34. Australian Government, Australian Institute of Health and Welfare. Health workforce 2022. https://www.aihw.gov.au/reports/workforce/health-workforce (accessed 29/11/2023).
35. Queensland Government. Queensland Health careers: allied health requirements to practice, https://www.careers.health.qld.gov.au/allied-health-careers/registration-requirements (2024, accessed 22/05/2024).
36. Michie S, Atkins L, West R. The behaviour change wheel: a guide to designing interventions. Great Britain: Silverback Publishing, 2014.
37. Michie S, van Stralen MM, West R. The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci 2011; 6: 42.
38. Hattingh L, Sim TF, Sunderland B, et al. Successful implementation and provision of enhanced and extended pharmacy services. Res Social Adm Pharm 2020; 16: 464–474.
39. McDonagh LK, Saunders JM, Cassell J, et al. Application of the COM-B model to barriers and facilitators to chlamydia testing in general practice for young people and primary care practitioners: a systematic review. Implement Sci 2018; 13: 130.
40. Cane J, O'Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci 2012; 7: 37.
41. Luo C, Yang C, Yuan R, et al. Barriers and facilitators to technology acceptance of socially assistive robots in older adults - a qualitative study based on the capability, opportunity, and motivation behavior model (COM-B) and stakeholder perspectives. Geriatr Nurs 2024; 58: 162–170.
42. Boyd J, McMillan B, Easton K, et al. Utility of the COM-B model in identifying facilitators and barriers to maintaining a healthy postnatal lifestyle following a diagnosis of gestational diabetes: a qualitative study. BMJ Open 2020; 10: e037318.
43. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007; 19: 349–357.
44. Hoffman J, Hattingh L, Shinners L, et al. Allied health professionals' perceptions of artificial intelligence in the clinical setting: a cross sectional survey. JMIR Form Res 2024; 8: 57204.
45. Australian Bureau of Statistics. Gold Coast 2021 Census all persons QuickStats, https://www.abs.gov.au/census/find-census-data/quickstats/2021/309 (2023, accessed 12 May 2023).
46. Gold Coast City Council. City of Gold Coast population data, https://www.goldcoast.qld.gov.au/Council-region/About-our-city/Population-data (2023, accessed 12 May 2023).
47. Queensland Health. State of Queensland (Gold Coast Hospital and Health Service) annual report 2022–2023. Queensland, Australia: Queensland Health, 2023.
48. Nguyen ML, Sunderland B, Lim S, et al. A qualitative exploration of factors contributing to non-guideline adherent antipsychotic polypharmacy. Res Soc Admin Phar 2022; 18: 2457–2467.
49. Chronic Care Network. Participant experience focus groups: facilitation guide. Chatswood: Agency for Clinical Innovation, 2016.
50. Smith F. Health services research methods in pharmacy: survey research: (2) survey instruments, reliability and validity. Int J Pharm Pract 2011; 5: 216–226.
51. Kelly K, Clark B, Brown V, et al. Good practice in the conduct and reporting of survey research. Int J Qual Health Care 2003; 15: 261–266.
52. NVivo (QSR International Pty Ltd).
53. Lambert SI, Madi M, Sopka S, et al. An integrative review on the acceptance of artificial intelligence among healthcare professionals in hospitals. NPJ Digit Med 2023; 6: 111.
54. Aldughayfiq B, Sampalli S. Patients', pharmacists', and prescribers' attitude toward using blockchain and machine learning in a proposed ePrescription system: online survey. JAMIA Open 2022; 5: ooab115.
55. Hogue SC, Chen F, Brassard G, et al. Pharmacists' perceptions of a machine learning model for the identification of atypical medication orders. J Am Med Inform Assoc 2021; 28: 1712–1718.
56. Adjekum A, Blasimme A, Vayena E. Elements of trust in digital health systems: scoping review. J Med Internet Res 2018; 20: e11254.
57. Ajmera P, Jain V. Modelling the barriers of health 4.0 - the fourth healthcare industrial revolution in India by TISM. Operat Manag Res 2019; 12: 129–145.
58. Scheetz J, Rothschild P, McGuinness M, et al. A survey of clinicians on the use of artificial intelligence in ophthalmology, dermatology, radiology and radiation oncology. Sci Rep 2021; 11: 5193.
59. Butler-Henderson K, Dalton L, Probst Y, et al. A meta-synthesis of competency standards suggest allied health are not preparing for a digital health future. Int J Med Inf 2020; 144: 104296.
60. Woods L, Janssen A, Robertson S, et al. The typing is on the wall: Australia's healthcare future needs a digitally capable workforce. Aust Health Rev 2023; 47: 553–558.
61. Ernst E, Merola R, Samaan D. Economics of artificial intelligence: implications for the future of work. IZA J Lab Pol 2019; 9. DOI: 10.2478/izajolp-2019-0004.
62. Robert N. How artificial intelligence is changing nursing. Nurs Manage 2019; 50: 30–39.
63. Martin K. Artificial intelligence more likely to transform HI jobs than replace workers. J AHIMA 2024: 1.
64. Paul Y, Hickok E, Sinha A, et al. Artificial intelligence in the healthcare industry in India. Bengaluru: The Centre for Internet and Society, India, 2018.
65. Australian Alliance for Artificial Intelligence in Healthcare. A national policy roadmap for artificial intelligence in healthcare. North Ryde: AAAIH, 2023.
66. Hogan J, Grant G, Kelly F, et al. Factors influencing acceptance of robotics in hospital pharmacy: a longitudinal study using the extended technology acceptance model. Int J Pharm Pract 2020; 28: 483–490.
67. Radu M, Radu G, Condurache A, et al. The influence of digital media on the success of a health care unit. J Med Life 2018; 11: 254–256.
68. Basch CEP. Focus group interview: an underutilized research technique for improving theory and practice in health education. Health Educ Q 1987; 14: 411–448.
69. Patton MQ. Qualitative research & evaluation methods: integrating theory and practice. 4th ed. Thousand Oaks, California: Sage Publications, 2015.

Associated Data

This section collects any data citations, data availability statements, or supplementary materials included in this article.

Supplementary Materials

sj-pdf-1-dhj-10.1177_20552076241311144 – Supplemental material for “Overcoming barriers and enabling artificial intelligence adoption in allied health clinical practice: A qualitative study” by Jane Hoffman, Rachel Wenke, Rebecca L Angus, Lucy Shinners, Brent Richards and Laetitia Hattingh in Digital Health.

sj-docx-2-dhj-10.1177_20552076241311144 – Supplemental material for the same article by the same authors in Digital Health.


Articles from Digital Health are provided here courtesy of SAGE Publications
