RegulatorySignals
EU AI Act · 6 min read · 2026-05-11

EU AI Act vs GDPR for SaaS: What Changes in 2026, What Overlaps, and What You Can Reuse

Side-by-side breakdown of EU AI Act and GDPR obligations for SaaS teams. Three new obligations, two overlaps, and one conflict you need to know before August 2026.

"We are GDPR-compliant, so we are covered, right?" is the most expensive misconception in EU AI Act preparation. GDPR is point-in-time, output-neutral, and focused on personal data processing. The AI Act is continuous, output-specific, and focused on automated decision-making risk, and it applies regardless of whether personal data is involved. The two regimes diverge in three specific places, overlap in two, and conflict in one. Understanding that map before August 2026 is the difference between reusing roughly 70% of your existing compliance work and rebuilding it from scratch.

Run a free scan on your site at Regulatory Signals to surface which obligations apply to your specific setup.

The Side-by-Side

| Dimension | GDPR | EU AI Act |
| --- | --- | --- |
| Scope trigger | Collecting or processing personal data | Providing or deploying an AI system in the EU |
| Personal data required? | Yes: scope is defined by personal data | No: applies to AI systems regardless of data type |
| Timing | Point-in-time DPIA before significant processing begins | Continuous throughout the AI system lifecycle |
| Documentation | Records of Processing Activities (RoPA) | Technical documentation + conformity assessment before market placement |
| Primary obligation | Lawful basis, data subject rights, retention limits | Risk management, transparency, human oversight, robustness |
| Enforcement | National DPAs (e.g., CNIL, BfDI, the Irish DPC) | National competent authorities designated under the AI Act + European AI Office |

The key practical difference for engineering teams: GDPR lets you batch compliance events. You run a DPIA when you start a new processing activity. The AI Act, via Article 9 of Regulation (EU) 2024/1689, requires that risk management is continuous. New training runs, model version updates, production drift detection events, and changes to the deployment context all trigger a review obligation. GDPR does not have an equivalent trigger.

The 3 New Obligations GDPR Did Not Impose

1. Continuous risk management throughout the AI lifecycle (Article 9)

GDPR's DPIA (Article 35) is a pre-processing gate — you run it once before a significant new processing activity, file it, and update it only when you materially change the processing. It is not a continuous obligation.

AI Act Article 9 is explicitly different. The risk management system must be "implemented in a continuous iterative process." This means the risk register is not a document you file before launch — it is a live record that must reflect the current state of the system. Retrain the model: update the risk register. Deploy to a new customer segment: update the risk register. Notice unexpected output distributions in production: update the risk register and document the investigation.
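The "live record" idea above can be sketched in a few lines. This is a minimal, hypothetical model of a risk register entry, not a prescribed schema; the trigger names simply mirror the events listed in this article:

```python
from dataclasses import dataclass, field
from datetime import date

# Lifecycle events the article identifies as Article 9 review triggers.
REVIEW_TRIGGERS = {
    "model_retrain",
    "model_version_update",
    "new_deployment_context",      # e.g. a new customer segment
    "production_drift_detected",
}

@dataclass
class RiskRegisterEntry:
    risk_id: str
    description: str
    mitigation: str
    last_reviewed: date
    history: list = field(default_factory=list)  # audit trail of reviews

def on_lifecycle_event(entry: RiskRegisterEntry, event: str, today: date) -> bool:
    """Record the event and mark the entry as reviewed if it is a trigger."""
    if event in REVIEW_TRIGGERS:
        entry.history.append((today.isoformat(), event))
        entry.last_reviewed = today
        return True
    return False
```

The point of the sketch is the shape of the workflow: ordinary events pass through untouched, while any triggering event leaves a dated entry in the register's history.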

For SaaS teams used to treating compliance as a gate, this is a workflow change. Article 9 compliance needs to be embedded in your development process, not handled by the legal team once a quarter.

2. Pre-deployment technical documentation (Article 11)

GDPR requires a RoPA entry for new processing activities and a DPIA for high-risk processing — but neither requires technical documentation of the AI system's design and architecture.

Article 11 requires technical documentation covering: the intended purpose of the AI system, the general logic and principal design choices, the training methodology and training data (or data inputs), the performance metrics used, known limitations and failure modes, and the expected lifetime of the system. This documentation must exist before the system is placed on the market or put into service.

For SaaS teams with a "ship first, document later" culture, this is the obligation most likely to cause a compliance gap. The documentation must precede the EU deployment, not follow it. Wiring a documentation template into the feature PR process is the engineering solution.
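One way to wire that gate in: a CI check that fails the PR when the documentation artifact is missing a required section. The section keys below mirror the Article 11 items listed above, but the key names and file convention are illustrative, not mandated by the Act:

```python
from pathlib import Path

# Section keys mirroring the Article 11 items listed in this article;
# the names themselves are an assumed house convention, not the Act's wording.
REQUIRED_SECTIONS = [
    "intended_purpose",
    "design_choices",
    "training_data",
    "performance_metrics",
    "known_limitations",
    "expected_lifetime",
]

def missing_sections(doc_path: Path) -> list[str]:
    """Return the Article 11 sections absent from the doc file; empty list = pass."""
    if not doc_path.exists():
        return list(REQUIRED_SECTIONS)
    text = doc_path.read_text()
    return [s for s in REQUIRED_SECTIONS if s not in text]
```

In CI, a non-empty return value blocks the merge, which is what makes the documentation precede the deployment rather than follow it.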

3. Conformity assessment before market entry (Articles 43–49)

There is no GDPR equivalent to the conformity assessment. GDPR has no mechanism requiring you to demonstrate, before processing begins, that your system meets a defined set of technical and procedural requirements.

For most high-risk AI systems, the Act permits self-assessment under Annex VI (rather than requiring a third-party notified body). But the self-assessment must be completed before EU deployment, and the documentation must be retained for 10 years from market placement. For SaaS teams on continuous delivery cycles, this means a documented assessment checkpoint before each major feature that involves a high-risk AI component — not a single assessment for the entire product.
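A documented checkpoint like that can be as simple as a structured checklist gating the release. The items below loosely paraphrase the Annex VI internal-control procedure; verify them against the annex text itself before treating the list as complete:

```python
from datetime import date

# Illustrative checklist paraphrasing Annex VI; not the annex's own wording.
ANNEX_VI_CHECKPOINT = {
    "quality_management_system_in_place": False,
    "technical_documentation_complete": False,
    "design_and_development_match_documentation": False,
    "post_market_monitoring_plan_defined": False,
}

RETENTION_YEARS = 10  # keep the assessment record this long after market placement

def may_ship(checkpoint: dict) -> bool:
    """A high-risk AI feature ships to the EU only when every item is checked."""
    return all(checkpoint.values())

def retention_deadline(placed_on_market: date) -> date:
    """Earliest date the assessment record may be discarded."""
    return placed_on_market.replace(year=placed_on_market.year + RETENTION_YEARS)
```

Running this per high-risk feature, rather than once per product, matches the continuous-delivery framing above.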

The 2 Overlaps (Where GDPR Work Counts)

Overlap 1: DPIA fusion with AI Act risk assessment

For AI systems that process personal data — which covers most SaaS — you can run a single combined document that satisfies both GDPR Article 35 (DPIA) and AI Act Article 9 (risk management system initialisation). The EDPB has explicitly encouraged this approach. A combined document covers: the lawful basis and data subject rights analysis (GDPR), AND the risk categories, residual risks, testing methodology, and update cadence (AI Act).

This is the biggest efficiency gain available to SaaS teams that already have a functioning DPIA process. You are not starting from zero — you are extending an existing document to cover the AI Act dimensions.

The combined document does not satisfy the Article 11 technical documentation requirement or the Article 43–49 conformity assessment. Those are separate. But it does collapse two compliance activities into one, saving significant time for every AI feature that processes personal data.

Overlap 2: Sub-processor disclosure covers AI providers

GDPR Article 28 requires you to list sub-processors in your Data Processing Agreement and give data subjects visibility into who processes their data. AI Act Article 11 (technical documentation) and Article 53 (obligations for providers of general-purpose AI models) require disclosure of AI component providers as part of the technical record.

These are the same list. OpenAI, Anthropic, Google Cloud AI, Mistral — these are both GDPR sub-processors (if they process personal data on your behalf) and AI component providers under the AI Act. One updated DPA and sub-processor list satisfies both disclosure obligations. Maintain one canonical list and reference it from both your Privacy Policy (GDPR) and your AI technical documentation (AI Act).
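The "one canonical list" pattern is easy to enforce in code. This sketch assumes a simple vendor record; the field names (and the example vendor "Postmark") are illustrative, not a prescribed schema:

```python
# One canonical vendor record feeds both disclosure documents.
# Field names and the non-AI vendor example are illustrative assumptions.
VENDORS = [
    {"name": "OpenAI", "processes_personal_data": True, "ai_component": True},
    {"name": "Google Cloud AI", "processes_personal_data": True, "ai_component": True},
    {"name": "Postmark", "processes_personal_data": True, "ai_component": False},
]

def gdpr_subprocessor_list(vendors: list[dict]) -> list[str]:
    """Names for the DPA / Privacy Policy sub-processor list (GDPR Art. 28)."""
    return [v["name"] for v in vendors if v["processes_personal_data"]]

def ai_act_provider_list(vendors: list[dict]) -> list[str]:
    """Names for the AI technical documentation (AI Act Art. 11)."""
    return [v["name"] for v in vendors if v["ai_component"]]
```

Because both outputs are derived from the same source list, the two disclosures cannot drift apart.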

Audit your current sub-processor list and AI Act documentation gaps in one run. Get the audit pack — it cross-checks your Privacy Policy, DPA, and AI system documentation against what is actually deployed on your site.

The 1 Place They Conflict

Training data minimisation vs AI Act Article 10 data governance

GDPR Article 5(1)(c) requires data minimisation: you collect and retain only what is necessary for the specified purpose. This is a strong default obligation — when in doubt, delete.

AI Act Article 10 requires comprehensive data governance over training datasets, including documentation of data origin, data processing operations, bias testing, and demographic coverage. Article 10(2)(f) and (g) specifically require examination of datasets for possible biases and appropriate measures to detect, prevent, and mitigate them; Article 10(5) even permits processing special categories of personal data where strictly necessary for bias detection and correction. In practice, bias testing and audit trails require retaining training data, including diverse training data, for longer than data minimisation alone would suggest.

The resolution comes from EDPB Opinion 28/2024 (adopted December 2024): where AI Act Article 10 creates a specific legal obligation to retain or document training data for auditability and bias assessment purposes, this constitutes a legitimate purpose under GDPR Article 5(1)(b) that justifies retention beyond what data minimisation alone would permit. The retention must be documented explicitly in your Records of Processing Activities, citing the AI Act Article 10 obligation as the legal basis for the extended retention period.

Do not delete training data needed to satisfy Article 10 without a written legal analysis confirming that the AI Act obligation no longer requires it.
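That "do not delete without sign-off" rule can be encoded as a guard in your data-retention tooling. This is a hypothetical sketch; the dataset model and field names are assumptions, not an established schema:

```python
from dataclasses import dataclass

@dataclass
class TrainingDataset:
    name: str
    article10_hold: bool   # retained for bias testing / audit trail under Art. 10
    hold_basis: str = ""   # the RoPA entry citing the AI Act obligation

def may_delete(ds: TrainingDataset, legal_signoff: bool = False) -> bool:
    """Block deletion while an Article 10 hold exists, unless a written
    legal analysis has released it."""
    return legal_signoff or not ds.article10_hold
```

The guard inverts the GDPR default: for held datasets, deletion requires an explicit release rather than being the safe fallback.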

What This Means for SaaS Compliance Workflows

If your GDPR operations are already in order, your AI Act uplift consists of four specific additions:

  1. Convert DPIAs to combined DPIA + AI risk assessments — Reuse approximately 70% of each existing DPIA document. Add the AI Act-specific sections: risk categories, residual risks, testing methodology, and review cadence.

  2. Add Article 11 technical documentation to the feature PR process — This has no GDPR equivalent. It is a new workflow gate. Every AI feature PR needs a documentation artifact committed alongside the code.

  3. Add a conformity self-assessment checkpoint before production launches of high-risk features — Again, no GDPR equivalent. This is a structured checklist (Annex VI), not a free-form audit.

  4. Extend your incident response playbook to include AI incident timelines — Article 73 establishes serious-incident notification obligations to national competent authorities for high-risk systems, and Article 55 adds reporting duties for GPAI models with systemic risk. Your existing GDPR 72-hour breach notification process is the right model; the triggers, recipients, and deadlines are different.
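The playbook extension in step 4 amounts to a routing table: incident type in, recipient and deadline out. The table below is illustrative; the 72-hour GDPR window is well established, but the AI Act deadline shown is an assumption you should verify against the Act's serious-incident provisions before relying on it:

```python
# Illustrative incident routing table; the AI Act deadline is an assumed
# value for the sketch and must be verified against the Act itself.
INCIDENT_ROUTES = {
    "gdpr_personal_data_breach": {
        "recipient": "supervisory authority (DPA)",
        "deadline": "72 hours",          # GDPR Art. 33
    },
    "ai_serious_incident": {
        "recipient": "national competent authority",
        "deadline": "15 days (verify)",  # assumed window; confirm in the Act
    },
}

def route_incident(kind: str) -> dict:
    """Look up the notification recipient and deadline for an incident type."""
    if kind not in INCIDENT_ROUTES:
        raise ValueError(f"unclassified incident type: {kind}")
    return INCIDENT_ROUTES[kind]
```

Raising on unclassified incidents is deliberate: an incident that fits neither route needs a human decision, not a silent default.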

You are not starting from zero. The GDPR infrastructure you built is reusable in the ways described above. But Article 9's continuous monitoring obligation is genuinely new, and it requires tooling — a one-time audit will not keep you compliant when your model updates ship.

Related

If your AI Act and GDPR compliance workflows are still separate processes, the free scan at /demo shows you which obligations overlap for your specific setup in 30 seconds. The audit pack generates the combined DPIA + AI risk assessment template pre-filled with findings from your site.
