
Table 3.

Generative artificial intelligence (AI) in health care: categories and security or privacy threats in the model training or building phase and the implementation phase.

| Category | Model training and building phase: integrity threats | Model training and building phase: availability threats | Implementation phase: integrity threats | Implementation phase: confidentiality threats |
|---|---|---|---|---|
| Medical diagnostics | Adversarial training and classification manipulation (eg, image classification manipulation) | Model performance degradation from poisoned training data | AI hallucination (eg, made-up diagnoses), misinformation or disinformation, and adversarial use exploitation | Data extraction from carefully crafted prompts and privacy attacks |
| Drug discovery | Adversarial training and classification manipulation | Model performance degradation from poisoned training data | AI hallucination (eg, made-up chemical compounds or protein structures), misinformation or disinformation, and adversarial use exploitation | Data extraction from carefully crafted prompts and privacy attacks |
| Virtual health assistants | Adversarial training and classification manipulation | Model performance degradation from poisoned training data | AI hallucination (eg, made-up medical advice), misinformation or disinformation, and adversarial use exploitation | Data extraction from carefully crafted prompts and privacy attacks |
| Medical research | Adversarial training and classification manipulation | Model performance degradation from poisoned training data | AI hallucination (eg, made-up findings, hypotheses, and citations), misinformation or disinformation, and adversarial use exploitation | Data extraction from carefully crafted prompts and privacy attacks |
| Clinical decision support | Adversarial training and classification manipulation | Model performance degradation from poisoned training data | AI hallucination (eg, made-up conclusions, findings, and recommendations), misinformation or disinformation, and adversarial use exploitation | Data extraction from carefully crafted prompts and privacy attacks |
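To make the "adversarial training and classification manipulation" integrity threat in the table concrete, the following minimal sketch (not taken from the article) shows how a small, crafted perturbation can change the output of an image classifier using the fast gradient sign method. The toy diagnostic model, input size, and perturbation budget are illustrative assumptions only.

```python
# Minimal FGSM sketch of an adversarial "classification manipulation" attack.
# The stand-in classifier (32x32 grayscale scan -> 2 classes) is hypothetical;
# with a trained clinical model, a small epsilon is often enough to flip the label.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(32 * 32, 64),
    nn.ReLU(),
    nn.Linear(64, 2),
)
model.eval()

x = torch.rand(1, 1, 32, 32)   # benign input image (placeholder data)
y = torch.tensor([0])          # its true label
x.requires_grad_(True)

loss = F.cross_entropy(model(x), y)  # loss with respect to the true label
loss.backward()

epsilon = 0.05                 # perturbation budget (assumed value)
# Step in the direction that increases the loss, then keep pixels in range.
x_adv = (x + epsilon * x.grad.sign()).clamp(0, 1).detach()

print("clean prediction:", model(x).argmax(dim=1).item())
print("adversarial prediction:", model(x_adv).argmax(dim=1).item())
```

The same gradient-based idea underlies the image classification manipulation example listed for medical diagnostics: the perturbed scan looks unchanged to a clinician but can be misclassified by the model.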