Shifts in Input Data
Potential causes: Changes in demographics, new hardware or software, changes in the image acquisition protocol, artifacts that degrade input data quality, and shifts in disease prevalence.12
Ongoing QA: Distribution (dataset) shifts may cause AI tools to deviate from their baseline performance. Shifts may be anticipated, due to planned changes, or unexpected; they may also be isolated incidents caused by off-label use (eg, an adult tool applied to pediatric patients) or by corrupted input (eg, poor image quality). Ongoing QA should include periodic review for distribution shifts and revalidation against the reference datasets.
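The periodic review for distribution shifts can be sketched as a statistical comparison between a validation-time reference and current inputs. The snippet below is a minimal illustration, not a vendor-specified procedure: it applies a two-sample Kolmogorov–Smirnov test to a hypothetical scalar summary feature (mean image intensity per study); the function name, feature choice, and alpha threshold are all assumptions.

```python
import numpy as np
from scipy import stats

def input_shift_flagged(reference: np.ndarray, current: np.ndarray,
                        alpha: float = 0.01) -> bool:
    """Two-sample Kolmogorov-Smirnov test: compare a current batch of a
    scalar input feature (eg, mean image intensity per study) against the
    reference distribution recorded during validation. A small p-value
    flags a possible distribution shift."""
    _, p_value = stats.ks_2samp(reference, current)
    return bool(p_value < alpha)

rng = np.random.default_rng(42)
# Hypothetical feature values captured when the tool was validated.
reference = rng.normal(loc=100.0, scale=10.0, size=2000)
# A batch acquired after, eg, a protocol change that brightened images.
shifted_batch = rng.normal(loc=120.0, scale=10.0, size=500)

print(input_shift_flagged(reference, shifted_batch))  # a 2-sigma mean shift is flagged
```

In practice the reference sample would be frozen alongside the validation report, and a flag would trigger the revalidation step described above rather than an automatic action.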
Hardware Reliability
Potential causes: Hardware failure, incompatibility with updated hardware, or general wear.
Ongoing QA: Failures of physical components (eg, X-ray tubes, detectors, or sensors) that affect inputs or computational capacity may degrade the performance and reliability of AI tools.12 QA procedures depend on the specific hardware configuration, the type of AI tool in use, and the operational environment in which it is deployed. QA should include regular hardware diagnostics and stress tests, especially for critical components, as instructed by the manufacturer or vendor.
Software Issues
Potential causes: Software bugs, version incompatibility, and security vulnerabilities in AI algorithms and supporting systems.
Ongoing QA: Ongoing QA may need to consider the interoperability of the AI tool with relevant medical data standards.12 Periodically assessing the tool's compliance with evolving cybersecurity regulations is essential; QA should include regular security audits and penetration testing.
Data Integrity
Potential causes: Incomplete, incorrect, biased, or AI-derived input data.
Ongoing QA: Implement automatic QC checks to monitor for drift in AI performance over time. If drift occurs, determine whether compromised input data integrity is the cause.
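One way to realize an automatic QC check for performance drift is a simple control-chart rule over a rolling window of per-case quality scores (eg, agreement between the AI output and the finalized report, scored 1 for agree and 0 for disagree). The class below is an illustrative sketch under those assumptions; the baseline rate, window size, and the k multiplier are hypothetical parameters, not values from the source.

```python
from collections import deque
from statistics import mean

class PerformanceDriftMonitor:
    """Shewhart-style QC sketch: compares the rolling mean of a per-case
    quality score against the baseline rate established at validation.
    Drift is flagged when the rolling mean drops below
    baseline - k * sigma, where sigma is the standard error of the
    rolling mean expected under the baseline rate."""

    def __init__(self, baseline_rate: float, window: int = 100, k: float = 3.0):
        self.baseline = baseline_rate
        self.window = window
        self.k = k
        self.scores = deque(maxlen=window)  # keeps only the last `window` cases

    def update(self, score: float) -> bool:
        """Record one case; return True if drift is currently flagged."""
        self.scores.append(score)
        if len(self.scores) < self.window:
            return False  # not enough data for a stable estimate yet
        sigma = (self.baseline * (1 - self.baseline) / self.window) ** 0.5
        return mean(self.scores) < self.baseline - self.k * sigma

monitor = PerformanceDriftMonitor(baseline_rate=0.95, window=100)
flags = [monitor.update(1.0) for _ in range(100)]       # healthy stream: no flags
drift_flags = [monitor.update(0.0) for _ in range(30)]  # sustained failures: flag raised
```

A raised flag would prompt the investigation step above: checking whether input data integrity (eg, truncated or mislabeled studies) explains the drop before suspecting the model itself.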