If your organisation is already GDPR-compliant or ISO 27001 certified, you have a head start — but significant gaps remain. This mapping shows exactly what carries over and what still needs to be built.
Data and data governance (AI Act Article 10)
- AI Act requirement: training data documentation, bias checks, and provenance records
- GDPR: Article 30 records of processing cover data provenance and purpose
- ISO 27001: A.8 Asset management covers data classification and ownership
- Gap: the AI Act adds bias detection and demographic representativeness requirements not covered by GDPR or ISO 27001
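To make the provenance requirement concrete, here is a minimal sketch of one entry in a training-data register. The schema is entirely illustrative: the AI Act requires documenting how data was collected, prepared, and where it falls short, not this exact structure.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """One illustrative entry in a training-data provenance register."""
    name: str          # dataset identifier
    source: str        # where the data came from
    licence: str       # terms under which it may be used
    collected: str     # collection period, e.g. "2022-01 to 2023-06"
    preprocessing: list = field(default_factory=list)  # cleaning/labelling steps
    known_gaps: list = field(default_factory=list)     # under-represented groups, known biases
```

A register of such records, one per dataset, is one way to keep provenance and representativeness findings in a form auditors can review.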
Risk management system (AI Act Article 9)
- AI Act requirement: continuous risk management covering accuracy, bias, and foreseeable misuse
- GDPR: DPIA (Article 35) covers high-risk processing but is narrower in scope
- ISO 27001: Clause 6.1 requires formal risk assessment and treatment
- Gap: AI Act risk management is ongoing across the system lifecycle, not a one-time assessment, and must cover AI-specific risks such as model drift and adversarial attacks
Technical documentation (AI Act Article 11 and Annex IV)
- AI Act requirement: a nine-section technical file covering architecture, training, validation, and monitoring
- GDPR: no equivalent requirement
- ISO 27001: A.12 operations documentation covers some system design records
- Gap: the Annex IV documentation is largely new work for both GDPR-only and ISO 27001-certified organisations
Record-keeping and logging (AI Act Article 12)
- AI Act requirement: automatic event logging throughout system operation, with tamper protection
- GDPR: Article 33 requires breach logging but not full audit trails
- ISO 27001: A.12.4 Logging and monitoring covers audit trail requirements
- Gap: ISO 27001 logging covers security events; AI Act logging must also cover model decisions and outputs, which typically requires additional tooling
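What does decision logging with tamper protection look like in practice? One common approach is a hash chain: each log entry includes the hash of the previous one, so altering any past entry invalidates everything after it. The sketch below is illustrative, not a prescribed AI Act mechanism; all names are ours.

```python
import hashlib
import json
from datetime import datetime, timezone

class DecisionLog:
    """Append-only log of model decisions, hash-chained for tamper evidence."""

    def __init__(self):
        self.entries = []
        self._prev_hash = "0" * 64  # genesis value for the chain

    def record(self, model_version, inputs, output, confidence):
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "model_version": model_version,
            "inputs": inputs,
            "output": output,
            "confidence": confidence,
            "prev_hash": self._prev_hash,
        }
        # Hash the entry (which embeds the previous hash), then chain forward.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self._prev_hash = entry["hash"]
        self.entries.append(entry)
        return entry

    def verify(self):
        """Recompute the chain; returns False if any entry was modified."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

In production this would write to append-only storage rather than memory, but the chaining idea carries over unchanged.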
Human oversight (AI Act Article 14)
- AI Act requirement: technical measures enabling humans to monitor, intervene in, and override AI outputs
- GDPR: Article 22 right not to be subject to automated decisions (with exceptions)
- ISO 27001: no direct equivalent
- Gap: AI Act human oversight must be built into the product UI, not just granted as a policy right; most organisations will need to add product features
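As a sketch of what "built into the product" can mean: route low-confidence outputs to a human review queue, and let a human decision always override the model's. The threshold, field names, and routing rule below are our assumptions, not anything the AI Act specifies.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Decision:
    output: str
    confidence: float
    needs_review: bool = False
    human_override: Optional[str] = None

    def final(self) -> str:
        # A human decision, when present, always wins over the model's output.
        return self.human_override if self.human_override is not None else self.output

def gate(output: str, confidence: float, threshold: float = 0.85) -> Decision:
    """Flag low-confidence model outputs for human review before release."""
    return Decision(output, confidence, needs_review=confidence < threshold)
```

The same pattern extends to sampling high-confidence decisions for spot checks, so oversight is not limited to the cases the model itself is unsure about.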
Transparency (AI Act Article 13)
- AI Act requirement: instructions for use for deployers; AI disclosure for end users
- GDPR: Articles 13-14 require comprehensive transparency notices
- ISO 27001: no direct equivalent
- Gap: GDPR transparency covers data use; AI Act transparency additionally requires disclosing how the AI works, its limitations, and how to exercise human oversight
Third-party AI models (AI Act Article 25)
- AI Act requirement: provider obligations remain even when using third-party AI models
- GDPR: Article 28 processor agreements required for all third-party data processors
- ISO 27001: A.15 Supplier relationships covers third-party security requirements
- Gap: the AI Act adds an obligation to understand and document third-party AI model characteristics, limitations, and training data, beyond standard data processor agreements
Accuracy, robustness, and cybersecurity (AI Act Article 15)
- AI Act requirement: validated accuracy across demographic groups; robustness and cybersecurity testing
- GDPR: no equivalent requirement
- ISO 27001: A.14 System acquisition covers testing but not AI-specific performance metrics
- Gap: disparate impact testing across demographic groups is an AI Act-specific requirement with no equivalent in GDPR or ISO 27001
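A starting point for disparate impact testing is to compare favourable-outcome rates across groups. The sketch below computes the ratio of the lowest group selection rate to the highest; the "four-fifths rule" (flagging ratios below 0.8) is a US employment-law heuristic we borrow for illustration, since the AI Act itself does not prescribe a specific metric or threshold.

```python
from collections import defaultdict

def disparate_impact_ratio(records):
    """records: iterable of (group, outcome) pairs, outcome True = favourable.

    Returns (ratio, per-group rates), where ratio is the lowest group
    selection rate divided by the highest. Ratios below 0.8 are commonly
    treated as a signal worth investigating.
    """
    totals = defaultdict(int)
    favourable = defaultdict(int)
    for group, outcome in records:
        totals[group] += 1
        if outcome:
            favourable[group] += 1
    rates = {g: favourable[g] / totals[g] for g in totals}
    return min(rates.values()) / max(rates.values()), rates
```

This only measures outcome parity; a full Article 15 test plan would also cover calibration, error rates per group, robustness, and adversarial inputs.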
Post-market monitoring (AI Act Article 72)
- AI Act requirement: a formal post-market monitoring plan with data collection and incident reporting
- GDPR: no equivalent requirement
- ISO 27001: Clause 9 performance evaluation covers ongoing monitoring but not AI-specific drift
- Gap: the AI Act requires a documented monitoring plan specifically tracking model performance, drift, and real-world incidents, which is typically new work
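Drift tracking usually means comparing the live input distribution against the one the model was trained on. The Population Stability Index (PSI) below is one widely used metric, offered here as an illustration rather than anything the AI Act mandates; the thresholds in the docstring are credit-risk rules of thumb.

```python
import math

def population_stability_index(expected, actual, bins=10):
    """PSI between a training-time (expected) and live (actual) sample.

    Common rule-of-thumb thresholds: < 0.1 stable, 0.1-0.25 moderate
    drift, > 0.25 significant drift (investigate and possibly retrain).
    """
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0

    def histogram(values):
        counts = [0] * bins
        for v in values:
            idx = min(max(int((v - lo) / width), 0), bins - 1)
            counts[idx] += 1
        # Smooth empty bins so the log term stays defined.
        return [(c or 0.5) / len(values) for c in counts]

    e, a = histogram(expected), histogram(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

A monitoring plan would run a check like this on a schedule per input feature and per output score, log the results, and tie threshold breaches into the incident-reporting process.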
Conformity assessment and registration (AI Act Articles 43 and 49)
- AI Act requirement: conformity assessment and EU AI database registration before market placement
- GDPR: no equivalent requirement
- ISO 27001: certification provides third-party validation but is not EU AI Act conformity
- Gap: a completely new requirement; ISO 27001 certification does not substitute for EU AI Act conformity assessment
GDPR compliance covers transparency, data governance, and user rights — saving roughly 20–30% of EU AI Act compliance work for high-risk systems.
ISO 27001 certification covers risk management, logging, and supplier management — saving an additional 15–25% of compliance work.
What remains regardless of existing certifications: Annex IV technical documentation, AI-specific accuracy and bias testing, human oversight product features, post-market monitoring plan, and EU database registration.