Abstract
PURPOSE
Artificial intelligence (AI) applications in radiotherapy (RT) are expected to save time and improve quality, but implementation remains limited. Therefore, we used implementation science to develop a format for designing an implementation strategy for AI. This study aimed to (1) apply this format to develop an AI implementation strategy for our center; (2) identify insights gained to enhance AI implementation using this format; and (3) assess the feasibility and acceptability of this format to design a center-specific implementation strategy for departments aiming to implement AI.
METHODS
We created an AI-implementation strategy for our own center using implementation science methods. These included a stakeholder analysis, a literature review, and interviews to identify facilitators and barriers, after which we designed strategies to overcome the barriers. These methods were subsequently used in a workshop with teams from seven Dutch RT centers to develop their own AI-implementation plans. The acceptability, appropriateness, and feasibility were evaluated by the workshop participants, and relevant insights for AI implementation were summarized.
RESULTS
The stakeholder analysis identified internal (physicians, physicists, RT technicians, information technology, and education) and external (patients and representatives) stakeholders. Barriers and facilitators included concerns about opacity, privacy, data quality, legal aspects, knowledge, trust, stakeholder involvement, ethics, and multidisciplinary collaboration, all integrated into our implementation strategy. The workshop evaluation showed high acceptability (18 participants [90%]), appropriateness (17 participants [85%]), and feasibility (15 participants [75%]) of the implementation strategy. Sixteen participants fully agreed with the format.
CONCLUSION
Our study highlights the need for a collaborative approach to implement AI in RT. We designed a strategy to overcome organizational challenges, improve AI integration, and enhance patient care. Workshop feedback indicates the proposed methods are useful for multiple RT centers. Insights gained by applying the methods highlight the importance of multidisciplinary collaboration in the development and implementation of AI.
INTRODUCTION
Artificial intelligence (AI) can transform health care by addressing complex challenges such as improving quality of life, extending survival, and enhancing safety. It is also hypothesized to be part of the solution to the expected shortage of health care staff in the coming years.1-3
CONTEXT
Key Objective
Can validated methods from implementation science help us implement innovations more efficiently?
Knowledge Generated
We have applied methods from implementation science to formulate an implementation strategy for artificial intelligence (AI), and our study highlights the need for a collaborative approach to implement AI in radiotherapy (emphasized by both literature and clinical staff). We designed a strategy to overcome organizational challenges, improve AI integration, and enhance patient care.
Relevance (J.L. Warner)
This report of an implementation science workshop illustrates the need for objective and reproducible approaches to the incorporation of AI into clinical workflows.*
*Relevance section written by JCO Clinical Cancer Informatics Editor-in-Chief Jeremy L. Warner, MD, MS, FAMIA, FASCO.
AI is a catch-all term and has many applications in the clinical domain. In this paper, we focus on radiation oncology (RO). First, in this field, AI is being developed, applied, and evaluated for automation and optimization of workflows, leading to more efficient and higher-quality treatment processes. Examples are automated radiation treatment planning methods and fast autosegmentation tools. Moreover, the escalating personnel shortage in RO increases the need to implement these kinds of time-saving techniques4 in the routine care path. Second, AI-based predictive outcome models are being developed to support personalized treatment choices.5-7
For the clinical implementation of AI in RO, the typical steps for introducing an innovative technology into clinical practice have already been recommended,8 that is, a commissioning phase, followed by the clinical implementation phase, and finally the daily use of the AI model and the (daily) quality assurance specific to the innovative technology.8,9 These recommendations were built on experiences with the integration of AI software in the RO workflow. They specifically emphasize the intricacies of the innovation content itself within each distinct phase. However, they lack recommendations for addressing problems at the organizational level during the actual implementation process. The absence of such recommendations might be a factor contributing to the currently limited application of AI in RO,5,9-11 and potentially contributes to the gap between research translation and clinical practice.8-13
To date, to our knowledge, there is no unified picture of the factors affecting AI implementation in RO. In the scientific literature, there is widespread agreement that implementation processes are more complicated than the products, services, or technologies they aim to introduce. Every aspect of the implementation process can be riddled with challenges, ranging from transforming systems to altering employee behavior and adapting to changing organizational contexts.14,15 This complexity stems from the multitude of interconnected factors in various domains, including technology, processes, behaviors, and organizational settings. Shaw et al15 suggest that implementation science and implementation practice have a role to play in thoroughly contemplating the broader spectrum of issues associated with AI implementation, encompassing the health system, social, and economic implications of deploying AI in health care settings. This additional knowledge is important for all radiotherapy (RT) professionals and researchers seeking to close the AI-implementation gap.
The purpose of this study was three-fold: first, to develop an implementation strategy for AI in our own center, using validated methods from the field of implementation science; second, to identify insights gained from using this format that may improve the implementation of AI; and third, to assess whether using this format to design a center-specific implementation strategy for AI is a feasible and acceptable approach for departments that aim to implement AI.
METHODS
For the first purpose, we developed an implementation strategy on the basis of validated methods from implementation science.16-21 We initially optimized the strategy for a large academic RO department in the Netherlands. The AI innovations covered two types of AI: AI for increasing the efficiency and quality of care, and AI for developing predictive outcome models to support personalized treatment choices. Our own strategy was built by mapping stakeholders, identifying barriers and facilitators, and formulating an implementation strategy. For the second purpose, we gathered insights from building our own implementation strategy and from the multidisciplinary workshop in which other RO centers applied the format to develop their own strategies. For the final purpose, we evaluated the format used to develop an implementation strategy in a multidisciplinary workshop. Figure 1 provides an overview of the study.
FIG 1.

Workflow of the study. AI, artificial intelligence; CFIR, Consolidated Framework for Implementation Research; ERIC, Expert Recommendations for Implementing Change.
Development of an Implementation Strategy in Our Center
Identifying Stakeholders and Conducting Stakeholder Analysis
To gain a deeper understanding of the stakeholders who determine the success of the implementation process, stakeholder mapping theory was used.18,19 First, we identified the stakeholders in the implementation of AI. A stakeholder is anyone who can affect or is affected by the implementation of AI.18 We therefore used our organization chart, in which all functions of the organization are listed, as a checklist to ensure that no stakeholders were overlooked; relevant stakeholders for the implementation of AI in clinical care were identified by two independent researchers (R.S. and W.v.E.). Next, all stakeholders were categorized by determining the influence, interest, knowledge, and proximity (directly affected or not) of each stakeholder. On the basis of these four characteristics, relationship characteristics were defined (conflicting, complementary, collaborative, following, or directive/prescriptive).18
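The categorization above can be captured in a simple data structure. The sketch below is purely illustrative: the class fields mirror the four characteristics used in the analysis, but the example stakeholders, attribute values, and the grouping step are hypothetical, not the study's actual data or tooling.

```python
from dataclasses import dataclass

# Hypothetical record for one stakeholder in the mapping; fields mirror the
# four characteristics (interest, influence, proximity, knowledge) plus the
# derived relationship characteristic.
@dataclass
class Stakeholder:
    name: str
    internal: bool       # internal or external to the department
    interest: str
    influence: str
    proximity: str       # "directly involved", "reasonable proximity", "further away"
    knowledge: str
    relation: str        # conflicting/complementary/collaborative/following/directive

# Illustrative entries only
stakeholders = [
    Stakeholder("RTTs", True, "efficient planning", "proficient AI users",
                "directly involved", "in-depth process knowledge",
                "complementary/following"),
    Stakeholder("Health insurers", False, "affordable care", "reimbursement of care",
                "further away", "", "conflicting"),
]

# Group by proximity to see which stakeholders must be engaged first
directly_involved = [s.name for s in stakeholders if s.proximity == "directly involved"]
print(directly_involved)  # ['RTTs']
```

Keeping the mapping in a structured form like this makes it trivial to filter or sort stakeholders by any characteristic when planning engagement.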
Identifying Barriers and Facilitators
Barriers and facilitators for the implementation of AI were identified through semistructured stakeholder interviews and a literature search, using the Consolidated Framework for Implementation Research (CFIR).17,21 The interview guide was based on the CFIR (Data Supplement, Appendix S1). Each 30- to 45-minute interview was audiorecorded with consent, transcribed, and analyzed using Atlas.ti. Coding began with open coding, followed by axial coding, and codes were finally categorized according to the CFIR's five domains. In addition, a literature search was performed in PubMed, using search terms such as AI, implement*, facilitators, barriers, challenges, and success factors to extract relevant articles mentioning barriers and facilitators for implementing AI in health care. The search was performed in May 2023 and included articles published between May 2018 and May 2023. The Data Supplement (Appendices S2 and S3) shows the search terms and the inclusion and exclusion criteria. An overview was made of all the barriers and facilitators from the interviews and literature search, categorized according to the CFIR domains.
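Once interview fragments and literature findings are coded, tallying mentions per CFIR domain and factor is a simple counting exercise. A minimal sketch with made-up coded fragments (not the study's data) follows:

```python
from collections import Counter

# Each coded fragment: (CFIR domain, factor, barrier or facilitator).
# The fragments below are invented for illustration.
coded_mentions = [
    ("innovation", "black box, explainability", "barrier"),
    ("innovation", "black box, explainability", "barrier"),
    ("individuals", "trust in AI", "facilitator"),
    ("inner setting", "finance and resources", "barrier"),
]

# Count how often each (domain, factor) pair was mentioned, as in the
# "Total" column of a CFIR overview table
per_factor = Counter((domain, factor) for domain, factor, _ in coded_mentions)
print(per_factor[("innovation", "black box, explainability")])  # 2
```

In practice, qualitative software such as Atlas.ti produces these counts directly, but the underlying aggregation is the same.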
Definition of the Implementation Strategy
Combining the CFIR overview and the Expert Recommendations for Implementing Change (ERIC) strategies using the CFIR-ERIC Implementation Strategy Matching Tool19,20 provided a list of relevant implementation strategies to consider. These strategies were translated to the local context to formulate an initial implementation strategy. In collaboration with the various stakeholders, and through the collection of their feedback on the initial implementation strategy, the final implementation strategy for AI was determined on the basis of all the acquired information.
Designing Center-Specific Implementation Strategies
We organized a multidisciplinary workshop to gather insights on AI implementation in RO across several centers and to assess the transferability of the methods for developing AI-implementation strategies. AI teams from Dutch RO centers, part of the Taskforce Innovation Implementation (Dutch Society for Radiation Oncology—NVRO), were invited. Seven of the nine invited centers participated, each bringing a case for strategy development. Three cases were discussed: implementing new commercial autocontouring software, an AI algorithm for autocontouring in brachytherapy, and autocontouring for prostate cancer. The workshop had 20 participants, including clinical physicists, managers, researchers, radiotherapy technicians (RTTs), and radiation oncologists.
The session began with an overview of AI in RO and the value of implementation science, including the methods for developing the AI-implementation strategy. Participants were divided into subgroups to create their own strategies for the implementation of AI-based autosegmentation as an example, guided by an implementation scientist and a clinical expert in AI.
To evaluate the methods, participants completed a questionnaire assessing the acceptability, appropriateness, and feasibility on the basis of the Acceptability of Intervention Measure (AIM), Intervention Appropriateness Measure (IAM), and Feasibility of Intervention Measure (FIM),22 immediately after the workshop.
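As a sketch of how such questionnaire results can be summarized: the AIM, IAM, and FIM each consist of a few Likert-scale items, and one common summary is the percentage of participants whose responses indicate agreement. The scoring rule below (mean item score of 4 or higher on a 5-point scale counts as agreement) and the response data are assumptions for illustration; only the idea of reporting agreement percentages matches the study.

```python
# Hedged sketch: percentage of participants "agreeing" with a measure,
# assuming 5-point Likert items (1 = completely disagree, 5 = completely
# agree) and a mean item score >= 4 counting as agreement. Data are invented.

def pct_agree(responses, threshold=4.0):
    """Return the percentage of participants whose mean item score >= threshold."""
    agree = sum(1 for items in responses if sum(items) / len(items) >= threshold)
    return 100 * agree / len(responses)

# Four hypothetical participants, four items each
example = [[5, 5, 4, 4], [4, 4, 5, 5], [2, 3, 3, 2], [4, 4, 4, 3]]
print(pct_agree(example))  # 50.0
```

With the study's 20 participants, 18 participants agreeing would yield the reported 90%.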
RESULTS
Development of an Implementation Strategy in Our Center
Identified Stakeholders and Stakeholder Analysis
First, the stakeholder analysis identified a diverse group of internal stakeholders, including physicians, physicists, RTTs, the chief medical information officer, patient planners, physician assistants, management of patient care, information technology (IT), the finance and purchasing department, and the program manager for education. Important external stakeholders were patients and patient representatives. Table 1 lists all stakeholders along with their influence, interest, proximity, knowledge, and relationship characteristics.
TABLE 1.
Identified Stakeholders and Stakeholder Analysis
| Internal/External | Stakeholder | Interest | Influence | Proximity | Important Knowledge | Relation Characteristicsa |
|---|---|---|---|---|---|---|
| Internal | RTTs | Optimal treatment, efficient planning, improved radiation plans, error-resistant methodology, consistent plans | Proficient in AI usage; potential resistance to job profile changes | Directly involved | In-depth knowledge of processes and treatment | Complementary/following |
| | Physicists | Optimal treatment, efficient organization, improved radiation plans, error-resistant methodology, consistent plans | Proficient in AI usage; potential job changes or departmental restructuring | Directly involved | Profound knowledge, often involved in software development and technical implementation | Prescriptive/complementary/cooperative |
| | Radiation oncologists | Optimal treatment, minimal side effects, improved radiation plans, error-resistant methodology, adaptable during treatment | Proficient in AI usage | Directly involved | Proficient knowledge of processes and treatment | Complementary |
| | Researchers | Best treatment, latest treatments, new developments, new prediction models | Proficient in developing new AI solutions; not all may be relevant to the clinic | Directly involved | Profound knowledge, developers of AI solutions | Cooperative |
| | IT | Secure software systems and effective application management | Technically sound setup and management of software | Directly involved | Proficient knowledge of processes and software setup and management | Cooperative |
| | Management | Decision making and budgeting for AI implementation | Major influence in decision making and budget allocation | Directly involved | Knowledge of strategic goals, KPIs, financials, prioritization | Directive/prescriptive |
| | Medical physics engineers | Safe and responsible treatment | — | Directly involved | Strong knowledge of processes, quality assurance, and quality control | Cooperative |
| | Patient planners/doctor's assistants | Best treatment and processes | Limited interest | Further away | | Following |
| | Finance and control and purchasing | Procurement processes and financing | | Reasonable proximity | Knowledge of finance and potential suppliers, procurement processes | Cooperative |
| External | Patients | Greatest chance of survival with the fewest possible side effects | Shared decision making | Directly involved | Personal situation and preferences | Cooperative/following |
| | Referrers | Best treatment for the patient and efficient referral processes | Available and compatible data and data infrastructure; when implementing collaborative decision making, good collaboration is important to implement AI at the right place in the care process/patient journey | Reasonable proximity | | Cooperative |
| | Companies/suppliers | Developing/delivering software and technology for optimal treatment; marketing and sales | | Reasonable proximity | Knowledge about developments, equipment, and software | Cooperative |
| | Health insurers | High-quality and affordable care | Reimbursement of care | Further away | | Conflicting |
| | Other RT departments | Provide best treatment for the patient | Dissemination of AI within RT departments | Reasonable proximity | | Complementary |
| | Universities | Dissemination of knowledge | | Further away | | Prescriptive/cooperative/complementary |
| | Governments | High-quality and affordable care | Considerable influence concerning policy and legislation, tariff structure, and health care reimbursement | Further away | Knowledge about new developments in terms of regulations, tariff structure, and health care reimbursement | Prescriptive |
| | Patient council | Greatest chance of survival with the fewest possible side effects | Shared decision making | Directly involved | Personal situation and preferences | Cooperative/following |
Abbreviations: AI, artificial intelligence; IT, information technology; KPIs, key performance indicators; RT, radiotherapy; RTTs, radiotherapy technicians.
aConflicting: different end goals; complementary: same interest; collaborative: same end goal, different interest; following: not necessarily involved in the implementation process, but will need to work with new/adjusted processes; directive/prescriptive: giving/prescribing instructions concerning implementation.
Barriers and Facilitators Categorized According to the CFIR
In total, 23 interviews were held with researchers (n = 6), radiation oncologists (n = 4), radiotherapy technicians (RTTs; n = 3), physicists (n = 3), managers (n = 2), team leaders in the IT department (n = 4), and the RTT advisory board, consisting of five RTTs who were interviewed in a group setting (n = 1), to gain insight into the barriers perceived by the different stakeholders involved in clinically implementing AI. The literature search resulted in 29 articles included in this study. The Data Supplement (Appendix S4) shows the Preferred Reporting Items for Systematic Reviews and Meta-Analyses flowchart for including articles.
Among the prominent barriers identified in both interviews and literature, the lack of transparency of AI, often called the black box nature of its operations, emerged as a recurring theme. Closely associated was the issue of explainability, or the lack thereof. Regarding predictive outcome models, health professionals frequently expressed apprehension about the quality of the data underpinning AI models and their development, making data quality and availability significant barriers. Privacy and security concerns, along with laws and legislation, were also frequently mentioned barriers to the implementation of AI.
However, factors facilitating the implementation of AI included a sound knowledge and understanding of AI concepts, the establishment of trust in AI systems, educational initiatives, a multidisciplinary approach, active stakeholder involvement, and effective leadership. Table 2 provides an overview of all identified barriers and facilitators, categorizing them according to the CFIR domains and suggesting potential general ERIC strategies to address these challenges.
TABLE 2.
Facilitators and Barriers From Literature and Interview Classified by CFIR Domain
| CFIR Domain16,17 | Factor | Facilitator/Barrier | Literature, 29 Papers (No. [references])a | Interviews (n = 23)a | Total | CFIR Implementation Strategies to Consider (ERIC strategies)19 |
|---|---|---|---|---|---|---|
| Innovation | Black box, explainability | B | 9 (refs 15, 23-30) | 7 | 16 | Promote adaptability; Identify and prepare champions; Conduct educational meetings; Inform local opinion leaders; Conduct local consensus discussions; Capture and share local knowledge; Develop educational materials; Conduct educational outreach visits; Identify early adopters |
| | Privacy and security | B | 11 (refs 9, 15, 24, 26, 28, 29, 31-35) | 2 | 13 | |
| | Data availability and quality | B | 12 (refs 1, 9, 10, 12, 15, 23, 26-29, 33, 35) | 2 | 14 | |
| | Validation models, validityb | B | 3 (refs 27, 36, 37) | 3 | 6 | |
| | Interoperability, standardization | B | 2 (refs 1, 12) | 3 | 5 | |
| | QA, updating modelsb | B | 1 (ref 34) | 4 | 5 | |
| | Complexity | B | 2 (refs 38, 39) | 2 | 4 | |
| | Legal liability | B | 3 (refs 29, 40, 41) | | 3 | |
| | Transparency, usability, and liability | B | 3 (refs 26, 27, 41) | | 3 | |
| | Technical design | F | 1 (ref 12) | 1 | 2 | |
| | Good feasibility and desirability | F | 2 (refs 38, 39) | | 2 | |
| | Generalizability | B | 2 (refs 26, 27) | | 2 | |
| | Quality and safety | B | 1 (ref 40) | | 1 | |
| | Reliability, accuracy | B | 1 (ref 37) | | 1 | |
| | Scalability | B | 1 (ref 15) | | 1 | |
| | Reproducibility | B | 1 (ref 36) | | 1 | |
| | Expected added benefit | F | 1 (ref 40) | | 1 | |
| | Minimize workflow changes | F | 1 (ref 40) | | 1 | |
| | Systemic bias in the data | B | 1 (ref 37) | | 1 | |
| Outer setting | Laws and legislation, policy (MDR, GDPR, CE marking) | B | 9 (refs 1, 12, 13, 27, 28, 31, 34, 40, 41) | 5 | 14 | |
| | Meeting standards and quality requirements | B | 1 (ref 13) | | 1 | |
| | Lack of political commitment | B | 1 (ref 37) | | 1 | |
| | Analysis of multicenter data is limited because of differences in database structures across systems (eg, electronic medical records database of different service providers) | B | 1 (ref 37) | | 1 | |
| | AI models are not reimbursed by insurance | B | | 1 | 1 | |
| Inner setting | Finance and resources | B | 3 (refs 12, 40, 42) | 3 | 6 | Assess for readiness, and identify barriers and facilitators; Identify and prepare champions; Conduct local consensus discussions; Conduct educational meetings; Build a coalition; Create a learning collaborative; Conduct local needs assessment; Capture and share local knowledge; Alter incentive/allowance structure; Facilitation; Promote adaptability; Inform local opinion leaders; Involve executive boards; Tailor strategies; Recruit, designate, and train for leadership; Organize clinician implementation team meetings; Identify early adopters; Promote network weaving; Use advisory boards and workgroups; Access new funding; Develop a formal implementation blueprint; Use an implementation adviser; Distribute educational materials; Fund and contract for clinical innovation; Conduct cyclical small tests of change; Involve patients/consumers and family members; Visit other sites |
| | Communication | B | 2 (refs 23, 40) | 4 | 6 | |
| | Transformation of health care professions and care processes | B | 2 (refs 12, 13) | 3 | 5 | |
| | Good management/leadership | F | 4 (refs 12, 23, 30, 42) | | 4 | |
| | Resistance to change | B | 1 (ref 42) | 2 | 3 | |
| | Gap research—clinic | B | | 3 | 3 | |
| | Innovation strategy | F | 1 (ref 40) | | 1 | |
| | Innovation manager | F | 1 (ref 40) | | 1 | |
| | Local champions | F | 1 (ref 40) | | 1 | |
| | Timing: clinical need v data availability | B | | 1 | 1 | |
| | Culture | B | | 1 | 1 | |
| | Clinicians with too little time and/or interest in AI | B | | 1 | 1 | |
| | Lack of resources to build and maintain IT infrastructure to support AI process | B | 1 (ref 37) | | 1 | |
| | Regulatory compliance issues in the process of managing a high volume of sensitive information | B | 1 (ref 37) | | 1 | |
| | Raw fragmented or unstructured data (eg, electronic medical records, imaging reports), which are difficult to aggregate and analyze | B | 1 (ref 37) | | 1 | |
| | Lack of well-described patient-level health databases | B | 1 (ref 37) | | 1 | |
| | Support | F | | 1 | 1 | |
| | Good = good enough | B | | 1 | 1 | |
| Individuals | Knowledge and understanding of AIb | F | 8 (refs 1, 8, 12, 23, 28, 32, 40, 42) | 3 | 11 | Identify and prepare champions; Conduct educational meetings; Develop educational materials; Inform local opinion leaders; Conduct educational outreach visits |
| | Trust in AI | F | 8 (refs 10-13, 28, 40, 42, 43) | 2 | 10 | |
| | Confidence in clinical data from which an AI/ML model learns | B | 1 (ref 10) | 3 | 4 | |
| | Autonomy loss physician | B | 1 (ref 24) | 2 | 3 | |
| | Ownership | B | | 1 | 1 | |
| | Lack of appropriate skills for applying AI methods | B | 1 (ref 37) | | 1 | |
| Implementation process | Lack of stakeholder involvement/engagement/consensus | B | 6 (refs 11-13, 29, 30, 40) | 6 | 12 | Identify and prepare champions; Conduct local consensus discussions; Inform local opinion leaders; Assess for readiness, and identify barriers and facilitators; Build a coalition; Identify early adopters; Conduct local needs assessment; Develop a formal implementation blueprint; Involve patients/consumers and family members; Obtain and use patients/consumers and family feedback; Conduct educational meetings; Recruit, designate, and train for leadership; Develop and implement tools for quality monitoring; Facilitation; Audit and provide feedback; Use advisory boards and workgroups; Capture and share local knowledge; Create a learning collaborative; Develop and organize quality monitoring systems; Prepare patients/consumers to be active participants; Organize clinician implementation team meetings |
| | Internal and external multidisciplinary collaborationb | F | 4 (refs 8, 13, 23, 30) | 7 | 11 | |
| | Educationb | F | 7 (refs 8, 9, 13, 23, 37, 42, 43) | 1 | 8 | |
| | Lack of effect measurement | B | 1 (ref 31) | 3 | 4 | |
| | Implementation strategy/guidelines | F | 2 (refs 9, 13) | | 2 | |
| | Clear goals and process | F | 2 (refs 38, 39) | | 2 | |
| | Risk analysisb | F | 2 (refs 8, 9) | | 2 | |
| | Evaluation and testingb | F | 2 (refs 8, 12) | | 2 | |
| | Frequent project/team meetingsb | F | 2 (refs 8, 12) | | 2 | |
Abbreviations: AI, artificial intelligence; CE, Conformité Européenne; CFIR, Consolidated Framework for Implementation Research; ERIC, Expert Recommendations for Implementing Change; GDPR, General Data Protection Regulation; IT, information technology; MDR, Medical Device Regulation; ML, machine learning; QA, quality assurance.
aNumbers in these columns refer to the number of times mentioned in literature/interviews.
bFactors included in current recommendations that can be interpreted as a consensus of radiotherapy centers.2
Furthermore, Table 2 offers a comparative analysis, juxtaposing these factors with those already incorporated into existing recommendations for AI implementation in RO. This comparative work shows the insights gained from the current study, providing a clear perspective on additional implementation challenges that merit attention in the context of AI integration in RO practices.
Definition of the Implementation Strategy
The CFIR-ERIC Implementation Strategy Matching Tool suggested actions such as identifying champions, informing opinion leaders, conducting consensus discussions, assessing readiness, identifying barriers and facilitators, adjusting incentives, organizing educational meetings, sharing local knowledge, promoting adaptability, building a coalition, and identifying early adopters. The tool's output lists, for each barrier, the percentage of experts endorsing each strategy; cumulating these percentages across barriers ranks the strategies from most to least preferred. The Data Supplement (Appendix S5) visually represents these strategies. Because each barrier can be addressed by several strategies, the cumulative percentages exceed 100%, highlighting the comprehensive and multifaceted approach needed to address implementation barriers; in our study, this translated into a range of actions, from education to enhancing interaction between clinicians and researchers.
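The ranking step can be illustrated with a short sketch. The endorsement percentages below are invented and are not the tool's actual output; only the mechanism (summing per-barrier endorsement percentages across the selected barriers, which is why cumulative values can exceed 100%) reflects the description above.

```python
from collections import defaultdict

# Invented per-barrier expert-endorsement percentages for a few ERIC strategies
endorsements = {
    "black box, explainability": {"Conduct educational meetings": 40.0,
                                  "Identify and prepare champions": 25.0},
    "lack of stakeholder involvement": {"Identify and prepare champions": 50.0,
                                        "Build a coalition": 35.0},
}

# Sum endorsement percentages across barriers; strategies that address several
# barriers accumulate more than any single-barrier percentage
cumulative = defaultdict(float)
for strategy_pcts in endorsements.values():
    for strategy, pct in strategy_pcts.items():
        cumulative[strategy] += pct

ranked = sorted(cumulative.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0])  # ('Identify and prepare champions', 75.0)
```

In this toy example, champions rank highest precisely because they are endorsed for both barriers, mirroring how broadly applicable strategies rise to the top of the tool's cumulative ranking.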
These insights formed the basis for developing a comprehensive implementation strategy, combining a generic approach to prepare the organization for AI use with a project-specific strategy for individual AI-implementation projects. In collaboration with stakeholders, we transformed these actions into a specific AI-implementation strategy. The Data Supplement (Appendix S5) highlights the importance of working with champions and local opinion leaders, so we appointed contact persons in the clinical teams and AI champions to promote AI innovations on the work floor, gain support, address obstacles, and share knowledge about the innovation. Table 3 provides a thorough overview of the AI-implementation strategy, which addresses challenges such as bridging the research-clinical gap and aligning clinical needs with data availability.
TABLE 3.
Implementation Strategy for AI
| Problem | Action | Elaboration | CFIR Domain |
|---|---|---|---|
| Generic strategy | |||
| Gap research—clinic Timing: clinical need v data availability Communication Support |
AI research agenda | Appoint contact person per clinical team: look at the possibilities and needs to create a shared AI research/development agenda Contact person is a link between research and clinic Quarterly meeting clinicians—researchers |
Inner setting |
| Support Ownership Communication |
AI champions/opinion leaders in clinical teams | Champions promote innovation on the work floor, gain support from other people, define, and overcome obstacles and share information and knowledge about the innovation | Inner setting |
| Lack of knowledge and understanding | Education program | Define basic knowledge level for employees Define specific requirements per function group Training program and teaching materials will be made available |
Characteristics of individuals |
| Management/leadership | Management development program AI | Development of a module leading digital transformation for management | Inner setting |
| Complexity, clear goals, sufficient employees, and good feasibility and desirability | Quick scan project for chance of timely implementation | In previous research, a prediction model for timely implementation of innovations was developed.38,39 All AI projects will use the model at the start, during, and at the end of the project. Project leader, project members, and project owners will fill out the model. This provides project leaders with a quick scan of their project and this way they can ensure that the goals and process are clear for all stakeholders, ensure sufficient employees to work on the project, see whether the project is feasible and desirable, and whether project members find it a complex project. With this information, the project leader can better prepare for potential obstacles/hurdles during the project | Process |
| Policy and legislation Good = good enough |
Develop AI policy | Aim for automation but keep flexibility for continuous improvements Sacrifices must be made, since a small improvement means the AI solution must be rebuilt and validated A roadmap for the introduction and management of AI following relevant current legislation |
Outer setting Inner setting |
| Project-specific strategy | |||
| Lack of stakeholder involvement/engagement Clear goals and process Support Ownership Complexity Good feasibility and desirability Communication |
Interactive, multidisciplinary kick-off workshop | Participants of the workshop: all relevant stakeholders needed at any given time in a specific AI project (from researcher to ICT) Goal workshop: stimulate implementation by creating collective problem awareness/collective ownership Workshop content Goal and planning of project Quick scan project using prediction model timely implementation Inventory bottlenecks from various stakeholders Brainstorming solutions for bottlenecks Result: action list solution bottlenecks, including who picks up which task Evaluation: acceptability, feasibility, appropriateness, costs |
Intervention characteristics Inner setting Process |
| Frequent project meetings Risk analyses Evaluation and testing Lack of effect measurement Education Clear goals and process |
Project plan | Project plan including Phasing, milestones/deliverables, risk analysis, effect measurement, training, and communication plan Complete project planning, including Time commitment, staff deployment, planning in time If there are separate phases, evaluation after each phase Regular project meetings of whole project team |
Process Inner setting Characteristics of individuals |
| Evaluation and testing Lack of effect measurement |
Evaluation | Evaluation before the project: acceptability, appropriateness, feasibility, and the cost of the project Evaluation during the project: acceptability, feasibility, adoption, fidelity, coverage/scope, and cost of the project Evaluation after the project: an evaluation takes place on both content and process: Does it deliver what it is supposed to? Does it do what it is supposed to do? Lessons learned? What should be done differently next time? |
Process |
Abbreviations: AI, artificial intelligence; CFIR, Consolidated Framework for Implementation Research; ICT, Information Communication Technology.
Designing Center-Specific Implementation Strategies
Relevant Insights Gained From Our Own Implementation Strategy and the Workshop for the Implementation of AI
Important barriers identified in the interviews and the literature include knowledge and understanding of AI, trust in AI, confidence in the clinical data used to train AI/machine learning models, lack of stakeholder involvement, the research-clinical practice gap, multidisciplinary collaboration, and the lack of effect measurement. Notably, confidence in the clinical data used to train models, the research-clinical practice gap, and the lack of effect measurement are less frequently addressed in the literature. Clinical staff emphasized the importance of stakeholder involvement and multidisciplinary collaboration.
In two of the three cases, the workshop led to useful and unexpected insights into the barriers encountered. For example, in case 1, no stakeholder analysis had been performed, and the project faced several months of delay while awaiting the hospital IT department's response to proceed with software implementation. The IT department was involved only after purchase, resulting in delayed prioritization on their end. Involving IT early on might have integrated the project into their planning. The results, which were similar across the three cases, are summarized in Table 4. Although the projects differed in their stages and specific challenges, they all underscored the crucial role of stakeholder analysis and involvement in ensuring project success.
TABLE 4.
Insights Gathered During the Workshop
| Case | Gained Insights |
|---|---|
| Case 1: implementation of commercial autocontouring software | The project encountered significant delays (several months) as the team awaited a response from the hospital IT department to proceed with software implementation. Upon conducting a stakeholder analysis during the workshop, it became evident that the hospital IT department had not been involved in the project from the beginning but only after the software program had already been bought. This highlights a common scenario in which software is acquired without consulting IT. In such cases, IT is brought in only after purchase, with the expectation that they will prioritize and expedite the project. By contrast, including IT as a stakeholder in project planning ensures they are informed about upcoming tasks and can integrate them into their planning promptly. This proactive approach prevents delays. Furthermore, involving IT from the outset allows them to contribute their expertise, offering insights into various scenarios and evaluating the impact of the different available software packages throughout the project |
| Case 2: implementation of an AI algorithm for autocontouring in a brachytherapy workflow | In the initial stages of this project, the researcher indicated that the AI algorithm was not sufficiently effective for automated contouring. However, after analyzing the stakeholders during the workshop, it became apparent that radiation oncologists found the algorithm effective, while RTTs deemed its quality insufficient. This discrepancy arose from a shift in the RTTs responsible for contouring, as different planning systems are in use and not all RTTs possess all the skills. The RTTs currently in charge of contouring differed from the original group, leading to a lack of acceptance of the automatic contouring by their colleagues. Had a comprehensive stakeholder analysis and barrier assessment been conducted beforehand, it likely would have resulted in a different solution |
| Case 3: autocontouring for prostate cancer | This project had not yet started, so this group set out to draft an implementation strategy instead of evaluating an existing project and the approach used. First, stakeholders were identified; participants acknowledged that the introduced methods provided a clear and structured approach to identifying stakeholders. They were especially fond of using the organizational chart, since it helps ensure that no important stakeholders are forgotten. Care professionals are primarily focused on the content of their work and innovations. During the workshop, they realized that more factors and perspectives need to be considered when implementing AI |
Abbreviations: AI, artificial intelligence; IT, information technology; RTTs, radiotherapy technicians.
Evaluation of the Applied Format for Developing an Implementation Strategy (Feasibility, Appropriateness, and Acceptability)
The multicentric evaluation of the workshop revealed unanimous agreement among all participants regarding the workshop's high utility and the relevance of its content. Participants agreed that the methods used and the outcomes of the workshop could be effectively integrated into their own organizations' clinical routines. Additionally, 16 participants (80%) fully endorsed the workshop's format for developing an implementation strategy.
The workshop introduced new methods for designing an implementation strategy: 60% of participants were unfamiliar with stakeholder mapping/analysis, 80% were unfamiliar with categorizing barriers and facilitators according to the CFIR, and 80% had not used the CFIR-ERIC matching tool for strategy formulation. In terms of acceptability, appropriateness, and feasibility, 18 participants (90%) found the chosen concept for developing an implementation strategy highly acceptable, 17 participants (85%) deemed the methods used appropriate, and 15 participants (75%) acknowledged its feasibility. Figure 2 shows the overall evaluation results.
FIG 2.

Evaluation of the hands-on multidisciplinary workshop. CFIR, Consolidated Framework for Implementation Research; ERIC, Expert Recommendations for Implementing Change.
All participants agreed that the workshop's methods were appropriate for broader use and other innovations. The methods were praised for their practicality, systematic approach, concrete guidance, and efficiency, although 20% of participants noted the risk of a one-size-fits-all approach. Once proficient in the methods, teams can apply them independently within their organizations, reducing the need for a structured workshop. Participants also highlighted the benefits for complex projects, including early stakeholder involvement, explicit planning, valuable insights, and a fresh perspective.
DISCUSSION
We developed an AI-implementation strategy for RO clinicians using validated methods from implementation science. These methods proved generalizable and acceptable to other RO centers in the Netherlands. To our knowledge, this is the first study to develop an implementation strategy for AI in RO and to assess its applicability, appropriateness, and feasibility. The positive response to the workshop suggests the methods appeal to RO professionals, supporting further dissemination within the RO community.
This study started with a stakeholder analysis, which is important for successful clinical implementation. First, interviews with stakeholders identified organization-specific barriers and facilitators, enabling RO centers to build tailored implementation strategies. Stakeholders also identified overlooked risks and challenges, leading to more informed decision making.44-46 Second, participatory stakeholder analysis builds trust, strengthens relationships, and uncovers biases.18 Third, involving stakeholders in the initial stages enhances a project's success rate by creating ownership and ensuring engagement.47-49 Overall, engaging stakeholders early on is vital for the effective clinical implementation of AI and for mitigating resistance to it.49,50 Neglecting stakeholder perspectives might constrain successful AI implementation.51 The workshop supported these findings: all participants underscored the crucial role of stakeholder involvement, even though the cases differed in their stages and specific challenges.
Looking at the barriers and facilitators found in this study, the results from the literature differed from the factors mentioned in the interviews with employees. The literature identified data quality and availability, and privacy and security as prominent barriers, whereas the interviews highlighted AI's black-box nature, multidisciplinary collaboration, and lack of stakeholder involvement as the most prominent barriers. These differences arise from the different methodologies: literature reviews uncover trends and gaps in existing literature,52 while interviews provide insights and contextual understanding often absent from written sources.53 Our combined approach of literature review and interviews is therefore vital for a thorough understanding of all relevant factors, as insights from both methods complement each other.52,53
The variation in factors highlights the importance of participatory stakeholder analysis and of identifying barriers and facilitators within one's own organization, as these factors are inherently specific to the local context.54 AI is a catch-all term, and which topics are considered relevant depends on the stakeholder: researchers focus on data collection, data quality, and model building and performance; radiation oncologists on predictive models, AI's suitability for the patient currently in front of them, and concerns about AI being a black box; and RTTs on automation, AI's impact on job roles, the trustworthiness of AI, and whether the AI performs better. The same applies to the type of AI solution being implemented: whether implementing prediction models that influence treatment decisions, commercial AI tools, or research AI solutions, the barriers depend on the tool, the people involved, and the context. Our study does not aim to develop a universal strategy for the implementation of AI but to provide the steps for centers to develop tailored strategies addressing specific local challenges.19,20,54 It provides a guide to developing a customized implementation strategy, which should be part of a comprehensive project plan that outlines the specific steps and details needed to implement the project successfully, including resources and budget.55
Creating an implementation plan may seem to take more time, but good preparation prevents implementation problems, allowing the extra preparation time to be quickly recouped during execution. Furthermore, the steps in the plan are simple, allowing them to be quickly routinized by RT professionals.
One strength of this study is its use of validated methods from implementation science, such as stakeholder analysis,18 the CFIR,16,17 and the CFIR-ERIC matching tool.19,20 Furthermore, validated evaluation measures (AIM, IAM, and FIM)22 were used to assess the workshop's effectiveness. However, potential limitations and biases, such as participant subjectivity, should be acknowledged. Moreover, the participating centers are members of the innovation implementation task force, demonstrating a collective commitment to fostering innovation and seeking collaborative opportunities. This shared ethos of seeking collaboration, rather than reinventing the wheel individually, suggests openness toward the proposed methods and thus a potential selection bias.
Future research should focus on process validation and use more quantitative methods to demonstrate effectiveness and efficacy,56 as well as explore the longitudinal impact of the implemented strategies and delve deeper into the specific challenges faced by different RO centers. Evolving trends in AI implementation, such as emerging technologies and changing regulatory landscapes, should guide future directions. Furthermore, the implementation of AI is an iterative process in which barriers and facilitators change as knowledge, understanding, and experience with AI evolve; as barriers are resolved, new ones may arise.
In conclusion, using insights from implementation science, we devised a strategy to enhance AI implementation in clinical settings. On the basis of workshop outcomes, we recommend the following for implementing AI in clinical care: (1) Conduct a structured stakeholder analysis before starting an AI project, paying special attention to key stakeholders not involved in day-to-day AI use (eg, IT in automated contouring). (2) Go beyond content and project planning by identifying potential barriers in collaboration with stakeholders, considering center-specific challenges that may not be evident in the literature. (3) Cocreate implementation strategies with stakeholders, addressing identified barriers. Ensure AI champions are appointed, provide education on AI, and integrate AI into your broader strategy. (4) Use validated implementation science methods to develop a strategy supported by all stakeholders.
Our experience underscores the importance of multidisciplinary collaboration and stakeholder involvement in AI implementation. Multiple RO centers positively evaluated the acceptability, appropriateness, and feasibility of the proposed strategy, as well as the methods and format used to develop it. In this era of rapidly emerging AI applications in the medical field and extensive AI development in RO, efficient and effective implementation is crucial. Future research should demonstrate the effectiveness of this implementation strategy and the generalizability of the methods to other cancer disciplines.
ACKNOWLEDGMENT
The authors express their gratitude to all participants in both the interviews and the workshop for their valuable input and feedback. Participating centers in this study were the radiotherapy departments of University Medical Center Groningen, University Medical Center Utrecht, Erasmus University Medical Center, Amsterdam University Medical Center, Radboud University Medical Center, Haaglanden Medical Center, and Radiotherapeutisch Instituut Friesland and Maastro.
PRIOR PRESENTATION
Presented at ESTRO 2024, Glasgow, UK, May 3-7, 2024.
AUTHOR CONTRIBUTIONS
Conception and design: Rachelle Swart, Liesbeth Boersma, Rianne Fijten, Paul Cremers, Maria J.G. Jacobs
Collection and assembly of data: Rachelle Swart
Data analysis and interpretation: Rachelle Swart, Liesbeth Boersma, Rianne Fijten, Wouter van Elmpt, Maria J.G. Jacobs
Manuscript writing: All authors
Final approval of manuscript: All authors
Accountable for all aspects of the work: All authors
AUTHORS' DISCLOSURES OF POTENTIAL CONFLICTS OF INTEREST
The following represents disclosure information provided by authors of this manuscript. All relationships are considered compensated unless otherwise noted. Relationships are self-held unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject matter of this manuscript. For more information about ASCO's conflict of interest policy, please refer to www.asco.org/rwc or ascopubs.org/cci/author-center.
Open Payments is a public database containing information reported by companies about payments made to US-licensed physicians (Open Payments).
Rianne Fijten
Research Funding: Janssen-Cilag (Inst)
Wouter van Elmpt
Speakers' Bureau: Varian Medical Systems
Research Funding: Varian Medical Systems (Inst)
No other potential conflicts of interest were reported.
REFERENCES
- 1. Chua IS, Gaziel-Yablowitz M, Korach ZT, et al: Artificial intelligence in oncology: Path to implementation. Cancer Med 10:4138-4149, 2021
- 2. Esteva A, Robicquet A, Ramsundar B, et al: A guide to deep learning in healthcare. Nat Med 25:24-29, 2019
- 3. What to expect from AI in oncology. Nat Rev Clin Oncol 16:655, 2019
- 4. Claessens M, Vanreusel V, De Kerf G, et al: Machine learning-based detection of aberrant deep learning segmentations of target and organs at risk for prostate radiotherapy using a secondary segmentation algorithm. Phys Med Biol 67:115014, 2022
- 5. Sauerbrei A, Kerasidou A, Lucivero F, et al: The impact of artificial intelligence on the person-centred, doctor-patient relationship: Some problems and solutions. BMC Med Inform Decis Mak 23:73, 2023
- 6. Brown C, Nazeer R, Gibbs A, et al: Breaking bias: The role of artificial intelligence in improving clinical decision-making. Cureus 15:e36415, 2023
- 7. Bjerring J, Busch J: Artificial intelligence and patient-centered decision-making. Philos Technol 34:349-371, 2021
- 8. Vandewinckele L, Claessens M, Dinkla A, et al: Overview of artificial intelligence-based applications in radiotherapy: Recommendations for implementation and quality assurance. Radiother Oncol 153:55-66, 2020
- 9. Brouwer CL, Dinkla AM, Vandewinckele L, et al: Machine learning applications in radiation oncology: Current use and needs to support clinical implementation. Phys Imaging Radiat Oncol 16:144-148, 2020
- 10. Cabitza F, Campagner A, Balsano C: Bridging the "last mile" gap between AI implementation and operation: "Data awareness" that matters. Ann Transl Med 8:501, 2020
- 11. Gama F, Tyskbo D, Nygren J, et al: Implementation frameworks for artificial intelligence translation into health care practice: Scoping review. J Med Internet Res 24:e32215, 2022
- 12. Chomutare T, Tejedor M, Svenning TO, et al: Artificial intelligence implementation in healthcare: A theory-based scoping review of barriers and facilitators. Int J Environ Res Public Health 19:16359, 2022
- 13. Petersson L, Larsson I, Nygren JM, et al: Challenges to implementing artificial intelligence in healthcare: A qualitative interview study with healthcare leaders in Sweden. BMC Health Serv Res 22:850, 2022
- 14. Fixsen DL, Naoom SF, Blase KA, et al: Implementation Research: A Synthesis of the Literature. Tampa, FL, University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network, 2005
- 15. Shaw J, Rudzicz F, Jamieson T, et al: Artificial intelligence and the implementation challenge. J Med Internet Res 21:e13659, 2019
- 16. Damschroder L, Hall C, Gillon L, et al: The Consolidated Framework for Implementation Research (CFIR): Progress to date, tools and resources, and plans for the future. Implement Sci 10:A12, 2015
- 17. Damschroder LJ, Reardon CM, Widerquist MAO, et al: The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci 17:75, 2022
- 18. Reed MS, Graves A, Dandy N, et al: Who's in and why? A typology of stakeholder analysis methods for natural resource management. J Environ Manage 90:1933-1949, 2009
- 19. Powell BJ, Waltz TJ, Chinman MJ, et al: A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci 10:21, 2015
- 20. Waltz TJ, Powell BJ, Fernández ME, et al: Choosing implementation strategies to address contextual barriers: Diversity in recommendations and future directions. Implement Sci 14:42, 2019
- 21. Damschroder LJ, Aron DC, Keith RE, et al: Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci 4:50, 2009
- 22. Weiner BJ, Lewis CC, Stanick C, et al: Psychometric assessment of three newly developed implementation outcome measures. Implement Sci 12:108, 2017
- 23. Fiorino C, Jeraj R, Clark CH, et al: Grand challenges for medical physics in radiation oncology. Radiother Oncol 153:7-14, 2020
- 24. Arora A: Conceptualising artificial intelligence as a digital healthcare innovation: An introductory review. Med Devices (Auckl) 13:223-230, 2020
- 25. Arbelaez OL, Starke G, Lorenzini G, et al: Re-focusing explainability in medicine. Digit Health 8:205520762210744, 2022
- 26. Koski E, Murphy J: AI in healthcare, in Honey M, Ronquillo C, Lee TT, et al (eds): Nurses and Midwives in the Digital Age. Amsterdam, Netherlands, IOS Press, 2021, pp 295-299
- 27. Huynh E, Hosny A, Guthier C, et al: Artificial intelligence in radiation oncology. Nat Rev Clin Oncol 17:771-781, 2020
- 28. Aung YYM, Wong DCS, Ting DSW: The promise of artificial intelligence: A review of the opportunities and challenges of artificial intelligence in healthcare. Br Med Bull 139:4-15, 2021
- 29. Saw SN, Ng KH: Current challenges of implementing artificial intelligence in medical imaging. Phys Med 100:12-17, 2022
- 30. Liao F, Adelaine S, Afshar M, et al: Governance of clinical AI applications to facilitate safe and equitable deployment in a large health system: Key elements and early successes. Front Digit Health 4:931439, 2022
- 31. Wolff J, Pauling J, Keck A, et al: Success factors of artificial intelligence implementation in healthcare. Front Digit Health 3:594971, 2021
- 32. Chew HSJ, Achananuparp P: Perceptions and needs of artificial intelligence in health care to increase adoption: Scoping review. J Med Internet Res 24:e32939, 2022
- 33. Sunarti S, Fadzlul Rahman F, Naufal M, et al: Artificial intelligence in healthcare: Opportunities and risk for future. Gac Sanit 35:S67-S70, 2021 (suppl 1)
- 34. Cohen IG, Evgeniou T, Gerke S, et al: The European artificial intelligence strategy: Implications and challenges for digital health. Lancet Digit Health 2:e376-e379, 2020
- 35. Pirracchio R: The past, the present and the future of machine learning and artificial intelligence in anesthesia and postanesthesia care units (PACU). Minerva Anestesiol 88:961-969, 2022
- 36. Christie JR, Lang P, Zelko LM, et al: Artificial intelligence in lung cancer: Bridging the gap between computational power and clinical decision-making. Can Assoc Radiol J 72:86-97, 2021
- 37. Zemplényi A, Tachkov K, Balkanyi L, et al: Recommendations to overcome barriers to the use of artificial intelligence-driven evidence in health technology assessment. Front Public Health 11:1088121, 2023
- 38. Swart RR, Jacobs MJ, Roumen C, et al: Factors predicting timely implementation of radiotherapy innovations: The first model. Br J Radiol 94:20200613, 2021
- 39. Swart RR, Fijten R, Boersma LJ, et al: External validation of a prediction model for timely implementation of innovations in radiotherapy. Radiother Oncol 179:109459, 2023
- 40. Strohm L, Hehakaya C, Ranschaert ER, et al: Implementation of artificial intelligence (AI) applications in radiology: Hindering and facilitating factors. Eur Radiol 30:5525-5532, 2020
- 41. Young AT, Amara D, Bhattacharya A, et al: Patient and general public attitudes towards clinical artificial intelligence: A mixed methods systematic review. Lancet Digit Health 3:e599-e611, 2021
- 42. Yousefi Nooraie R, Lyons PG, Baumann AA, et al: Equitable implementation of artificial intelligence in medical imaging: What can be learned from implementation science? PET Clin 16:643-653, 2021
- 43. Darcel K, Upshaw T, Craig-Neil A, et al: Implementing artificial intelligence in Canadian primary care: Barriers and strategies identified through a national deliberative dialogue. PLoS One 18:e0281733, 2023
- 44. Fazey I, Fazey JA, Fazey DMA: Learning more effectively from experience. Ecol Soc 10:art4, 2005
- 45. Bennett RH: The importance of tacit knowledge in strategic deliberations and decisions. Manag Decis 36:589-597, 1998
- 46. Yang L, Ene IC, Arabi Belaghi R, et al: Stakeholders' perspectives on the future of artificial intelligence in radiology: A scoping review. Eur Radiol 32:1477-1495, 2022
- 47. Reed MS: Stakeholder participation for environmental management: A literature review. Biol Conserv 141:2417-2431, 2008
- 48. Barrane FZ, Ndubisi NO, Kamble S, et al: Building trust in multi-stakeholder collaborations for new product development in the digital transformation era. Benchmarking Int J 28:205-228, 2020
- 49. Gibson KR, O'Leary K, Weintraub JR: The little things that make employees feel appreciated. Harvard Business Review, 2020. https://hbr.org/2020/01/the-little-things-that-make-employees-feel-appreciated
- 50. Murphy J, Qureshi O, Endale T, et al: Barriers and drivers to stakeholder engagement in global mental health projects. Int J Ment Health Syst 15:30, 2021
- 51. Hogg HDJ, Al-Zubaidy M; Technology Enhanced Macular Services Study Reference Group, et al: Stakeholder perspectives of clinical artificial intelligence implementation: Systematic review of qualitative evidence. J Med Internet Res 25:e39742, 2023
- 52. Xiao Y, Watson M: Guidance on conducting a systematic literature review. J Plan Educ Res 39:93-112, 2019
- 53. Sutton J, Austin Z: Qualitative research: Data collection, analysis, and management. Can J Hosp Pharm 68:226-231, 2015
- 54. Powell BJ, Beidas RS, Lewis CC, et al: Methods to improve the selection and tailoring of implementation strategies. J Behav Health Serv Res 44:177-194, 2017
- 55. van de Schoot J, Brons E: Onderzoek, projectplan, evaluatie en goede afspraken [Research, project plan, evaluation, and good agreements]. PodoSophia 22:22-24, 2014
- 56. Sharma S, Goyal S, Chauhan K: A review on analytical method development and validation. Int J Appl Pharm 10:8-15, 2018
