Table 1.
Medical image AI deployment: regulatory comparison (US vs. EU vs. Japan).
| Regulatory aspect | United States (FDA) | European Union (MDR/IVDR + AI Act) | Japan (PMDA) |
|---|---|---|---|
| Primary regulatory bodies | Food and Drug Administration (FDA), Center for Devices and Radiological Health (CDRH) | European Commission, National Competent Authorities, Notified Bodies | Ministry of Health, Labour and Welfare (MHLW), Pharmaceuticals and Medical Devices Agency (PMDA) |
| Key legislation | Federal Food, Drug, and Cosmetic Act (FD&C Act); 21st Century Cures Act (2016) | Medical Devices Regulation (MDR, EU 2017/745); In Vitro Diagnostic Regulation (IVDR, EU 2017/746); AI Act (EU 2024/1689); GDPR | Pharmaceuticals and Medical Devices Act (PMD Act, 2014) |
| Classification system | Risk-based: Class I (low), II (moderate), III (high risk) | Risk-based: Class I, IIa, IIb, III (Medical Devices); Minimal, Limited, High-risk, Unacceptable (AI Act) | Risk-based: Class I (General), II (Controlled), III and IV (Highly Controlled) |
| Approval pathways | 510(k) (predicate device comparison); De Novo (novel low-to-moderate risk); PMA (Premarket Approval for high risk); Breakthrough Devices Program | CE marking via self-certification (Class I) or Notified Body assessment (Class IIa–III); combined conformity assessment under MDR/IVDR and the AI Act for high-risk AI | Todokede (notification, Class I); Ninsho (third-party certification, Class II and some Class III); Shonin (approval, Class III/IV) |
| AI-specific framework | Predetermined Change Control Plan (PCCP) finalized December 2024; allows pre-approved algorithm modifications without new submissions | AI Act (effective August 2024, full implementation by August 2027); high-risk AI systems require strict compliance including human oversight, transparency, data governance | Post-Approval Change Management Protocol (PACMP, March 2023); DASH for SaMD 2 strategy; Two-stage approval system for SaMD |
| Typical approval timeline | 510(k): 3–6 months; De Novo: 6–12 months; PMA: 12–18+ months; most AI devices use the 510(k) pathway | Class IIa: 6–12 months; Class IIb/III: 12–24+ months; current bottleneck: Notified Body capacity constraints | Class I: 1–2 months; Class II/III: 6–12 months; Class IV: 12–18 months; SaMD Priority Review: 6-month target (from 2024) |
| Approved AI devices | 1,250+ AI-enabled devices (as of July 2025); ∼712 in radiology; exponential growth from 6 (2015) to 223 (2023) | Thousands approved under MDR; exact numbers vary by member state; most radiology AI devices Class IIa or higher | Limited compared to US/EU; only 3 therapeutic apps approved by Sept 2023 vs. 50+ in US/EU; 15% increase in AI imaging device approvals 2018–2023 |
| Post-market surveillance | Medical Device Reporting (MDR); Real-World Evidence (RWE) emphasis; Performance monitoring required under PCCP | Stringent post-market surveillance under MDR; Vigilance reporting; AI Act requires continuous monitoring of high-risk systems | Good Vigilance Practice (GVP) Ordinance; Safety management measures; Enhanced monitoring for SaMD updates |
| Algorithm update management | PCCP Framework (2024): - Pre-approved modifications without a new submission - Modifications must follow the pre-specified protocol exactly - QMS documentation required - Deviations require a new submission (see the update-gate sketch after this table) | AI Act Requirements: - Predefined change protocols - Continuous oversight required - Conformity must be maintained throughout the total product life cycle (TPLC) - Combined MDR/AI Act assessment | PACMP (2023): - Predefined parameters for updates - Risk-mitigated modifications - Post-approval changes within safety boundaries - 30-day review for IDATEN system changes |
| Data requirements | Clinical validation required; increasing acceptance of RWE; bridging studies may be accepted for global data | High-quality datasets mandated under AI Act; GDPR compliance required; data from EU populations often preferred | Japanese population data often required; bridging studies from global trials accepted; stricter requirements for novel devices |
| Transparency & explainability | Transparency Guidance (June 2024): - Human-centered design principles - User interface clarity - Model description in submissions - Not mandated but strongly recommended | AI Act Mandates: - High transparency for high-risk systems - Explainability requirements - Fundamental rights considerations - Documentation of AI decision-making process | Transparency requirements aligned with international standards (JIS T 62366-1:2022); Human factors engineering required as of April 2024 |
| Data privacy framework | HIPAA (sector-specific): - Applies to covered entities and their business associates - Permits data use for treatment, payment, and healthcare operations - State privacy laws add separate requirements - No comprehensive federal AI privacy law | GDPR (cross-sectoral): - Comprehensive data protection - Applies to all health data - Strict consent requirements - Right to explanation - Data minimization principles | Act on the Protection of Personal Information (APPI): - Similar to GDPR but less stringent - Special provisions for sensitive medical data - Focus on appropriate data handling |
| Liability framework | Mixed liability model: - Product liability (Restatement (Third) of Torts) - Medical malpractice for physicians - Manufacturer responsibility unclear for adaptive AI - Case-by-case determination | Strict liability model: - Product Liability Directive 85/374/EEC replaced by the revised Product Liability Directive (EU 2024/2853), which expressly covers software and AI - No-fault (strict) product liability - Eased burden of proof for claimants in complex cases - Penalties under the AI Act | Mixed liability model: - Product Liability Act - Medical malpractice framework - Manufacturer accountability for defects - Healthcare provider responsibility for clinical use |
| Human oversight requirements | Recommended but not mandated; clinical decision support software must allow independent physician review | AI Act Mandates: - Human oversight required for high-risk AI - Cannot fully replace human judgment - Override capability necessary - Recognized as risk mitigation factor | Encouraged through human factors engineering requirements; physician final decision-making authority maintained |
| Documentation language | English | Official languages of member states; English increasingly accepted | Japanese required for all documentation; English submissions piloted for certain applications (as of Sept 2024) |
| International harmonization | Participates in IMDRF (International Medical Device Regulators Forum); collaborative guidance with UK MHRA and Health Canada | Leader in international AI governance; IMDRF participation; AI Act influences global standards | Active IMDRF participant; aligns with ICH, ICMRA, MDSAP; ISO 13485 compliance |
| Innovation support mechanisms | - Breakthrough Devices Program (accelerated review) - Pre-submission meetings - Q-Submission program - Real-World Evidence pilots | - Innovation support via EU funds - Regulatory sandboxes (member state dependent) - SME support initiatives - Notified Body guidance | - DASH for SaMD 2 program - Priority review for SaMD - Two-stage approval system - Pre-submission consultations - Expanded review team for SaMD |
| Key challenges | - 510(k) predicate pathway complexity - Unclear guidance for truly novel AI - Time-intensive for complex devices - Post-market surveillance requirements | - Notified Body capacity constraints - Dual compliance (MDR + AI Act) - Regulatory fragmentation across member states - Lengthy certification timelines - High compliance costs | - Language barrier (Japanese documentation) - Japanese population data requirements - Slower adoption than US/EU - Limited number of approved SaMD - Rigorous QMS requirements |
| Deployment timeline impact | Fast-to-Market: - 510(k) enables rapid market entry - Established predicate pathways - Largest approved device database - Iterative updates via PCCP | Moderate Pace: - CE marking timeframe variable - Notified Body availability critical - Dual assessment (MDR + AI Act) adds complexity - Single market access advantage | Deliberate Pace: - Focus on quality over speed - Comprehensive review process - SaMD priority review improving timelines - Cultural emphasis on thorough validation |
| Clinical trial requirements | - Pivotal clinical trials for PMA pathway - May accept foreign clinical data - Good Clinical Practice (GCP) compliance - IDE required for investigational devices | - Clinical evaluation required under MDR - Clinical Trials Regulation (CTR) - GCP compliance mandatory - May require EU-specific data | - GCP compliance required - Often requires Japanese population data - Bridging studies common - Clinical data from outside Japan may be accepted with justification |
| Bias & fairness requirements | Emphasis on bias mitigation in PCCP guidance; recommendations for diverse datasets; no explicit mandates (see the subgroup-monitoring sketch after this table) | AI Act requires: - High-quality, representative datasets - Bias monitoring and mitigation - Fundamental rights assessment - Population diversity considerations | Aligned with international standards; increasing focus on data quality and representation |
| Cybersecurity requirements | FDA premarket and postmarket cybersecurity guidance; Software Bill of Materials (SBOM) required in premarket submissions | Cybersecurity required under MDR general safety and performance requirements (MDCG 2019-16 guidance); NIS2 Directive for healthcare entities; cybersecurity mandatory for high-risk AI systems under the AI Act | Aligned with international cybersecurity standards; QMS includes cybersecurity provisions |
| Cost implications | Moderate: - 510(k): $10,000–$50,000 - De Novo: $50,000–$150,000 - PMA: $200,000–$1M+ - User fees plus compliance costs | High: - Notified Body fees: €50,000–€300,000+ - Dual compliance (MDR + AI Act) - Additional national registrations where required - Substantial legal/consulting fees | Moderate–High: - Significant translation costs - Japanese marketing authorization holder (MAH) required - Consultant fees for regulatory navigation - QMS certification costs |
| Market access strategy | Single approval for entire US market (330M+ population); largest single medical device market | Single CE mark grants access to 27 EU member states (450M+ population); some national requirements remain | Third-largest medical device market (projected $30B by 2025); gateway to the broader Asia-Pacific region |
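
The PCCP and PACMP rows above describe the same underlying mechanism: an algorithm update may be deployed without a new regulatory submission only if it stays inside bounds that were pre-specified at initial approval, and any deviation triggers a fresh submission. The sketch below makes that gate logic concrete; the `ChangeProtocol` class, the threshold values, and the metric names are hypothetical illustrations for this table, not any agency's published criteria.

```python
from dataclasses import dataclass

@dataclass
class ChangeProtocol:
    """Pre-specified modification bounds, fixed at initial approval.

    All thresholds are hypothetical examples, not published
    FDA PCCP or PMDA PACMP values.
    """
    min_sensitivity: float = 0.90   # candidate must meet or exceed
    min_specificity: float = 0.85
    max_auc_drop: float = 0.02      # vs. the currently approved model
    allowed_change_types: tuple = ("retrain_same_architecture",
                                   "threshold_tuning")

def may_deploy_without_new_submission(protocol: ChangeProtocol,
                                      change_type: str,
                                      candidate: dict,
                                      approved: dict) -> bool:
    """Return True only if the update stays inside the pre-approved
    protocol; anything else is a deviation requiring a new submission."""
    if change_type not in protocol.allowed_change_types:
        return False  # e.g. a new architecture was never pre-approved
    if candidate["sensitivity"] < protocol.min_sensitivity:
        return False
    if candidate["specificity"] < protocol.min_specificity:
        return False
    if approved["auc"] - candidate["auc"] > protocol.max_auc_drop:
        return False
    return True  # in-bounds: document under QMS and deploy

if __name__ == "__main__":
    protocol = ChangeProtocol()
    approved = {"auc": 0.94}
    candidate = {"sensitivity": 0.93, "specificity": 0.88, "auc": 0.935}
    print(may_deploy_without_new_submission(
        protocol, "retrain_same_architecture", candidate, approved))
    # True -> deploy under the change protocol; False -> new submission
```

In practice the pre-approved bounds live in the QMS and submission documentation rather than in code; the point is only that "in-protocol versus deviation" is a mechanical check once the protocol is fixed.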
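
Similarly, the bias and fairness row refers to monitoring performance across patient subgroups. One common way to operationalize that is to compute a metric such as sensitivity per subgroup and flag gaps above a chosen tolerance, as in the minimal sketch below; the grouping variable, the tolerance, and the record layout are illustrative assumptions.

```python
from collections import defaultdict

def subgroup_sensitivity(records, group_key="site"):
    """Sensitivity (recall on positive cases) per subgroup.

    `records` is a list of dicts with keys: group_key, 'label'
    (0/1 ground truth), and 'pred' (0/1 model output) -- an
    assumed layout, not a standard format.
    """
    tp = defaultdict(int)
    fn = defaultdict(int)
    for r in records:
        if r["label"] == 1:
            if r["pred"] == 1:
                tp[r[group_key]] += 1
            else:
                fn[r[group_key]] += 1
    return {g: tp[g] / (tp[g] + fn[g])
            for g in set(tp) | set(fn) if tp[g] + fn[g] > 0}

def flag_performance_gaps(per_group, tolerance=0.05):
    """Flag subgroups whose sensitivity trails the best-performing
    subgroup by more than `tolerance` (a hypothetical threshold)."""
    best = max(per_group.values())
    return {g: s for g, s in per_group.items() if best - s > tolerance}

if __name__ == "__main__":
    records = [
        {"site": "A", "label": 1, "pred": 1},
        {"site": "A", "label": 1, "pred": 1},
        {"site": "B", "label": 1, "pred": 0},
        {"site": "B", "label": 1, "pred": 1},
    ]
    per_group = subgroup_sensitivity(records)
    print(per_group)                         # {'A': 1.0, 'B': 0.5}
    print(flag_performance_gaps(per_group))  # {'B': 0.5}
```

The same pattern extends to any grouping variable (age band, sex, scanner vendor, acquisition site) and any per-group metric; the regulatory differences in the table concern how strictly such monitoring is mandated, not how it is computed.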