2024 Jan 17;11:e47031. doi: 10.2196/47031

Table 4.

Other factors related to acceptance (N=77) of medical AI^a applications (14/32, 43.8%, of studies).

Factor category and factors from the rapid review → Umbrella factor used in the survey

Technology-related factors

- Performance expectancy (n=4, 28.6%); design and output quality (n=4, 28.6%); accuracy (n=2, 14.3%); efficiency (n=1, 7.1%) → Performance of AI applications in medicine (reproducibility of outcomes, accuracy)
- Perceived ease of use (n=2, 14.3%); user-friendliness (n=2, 14.3%); actual system use (n=1, 7.1%); compatibility (n=1, 7.1%); facilitating conditions (n=1, 7.1%) → Possibility of integration of AI applications into existing clinical workflows
- Perceived risk (n=1, 7.1%) → Clear balance of risks and benefits of the AI application
- Transparency (n=3, 21.4%); explainability (n=2, 14.3%); evidence strength (n=1, 7.1%); trustworthiness (n=1, 7.1%) → Explainability and transparency of the processes and outcomes

Legal and ethical factors

- Adequate regulations, legislation, and governance (n=2, 14.3%); ethical risks (n=1, 7.1%); political support (n=1, 7.1%) → Adequacy of the regulations and governance of AI applications in medicine
- Data protection/security (n=2, 14.3%); patients' consent to the continuous collection and processing of data (n=1, 7.1%) → Data use transparency
- Accountability and responsibility (n=2, 14.3%); tort liability (n=1, 7.1%) → Clear accountability and responsibility of the AI application (machine vs human responsibility)

Additional factors

- Replacement of the doctor/lack of human touch and moral support when evaluated by AI alone (n=1, 7.1%) → Impact on job availability (machines replacing humans)
- Trust in AI applications (n=3, 21.4%) → Acceptance emerging from trust

^a AI: artificial intelligence.