
Stop Waiting Until the Last Minute: The EU AI Act Deadline Is Closer Than You Think

If you haven't started your EU AI Act compliance yet, you are in the majority. Surveys of European businesses consistently show that most companies deploying AI have not begun formal compliance work. That might feel reassuring — but it isn't. The deadline is fixed, and the work takes longer than almost everyone expects.

How much time is actually left?

The main EU AI Act enforcement date for high-risk AI systems is August 2, 2026. As of today, there are 93 days left. That sounds like a lot. It isn't — especially when you understand what "compliance" actually involves.
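The countdown is easy to verify yourself. A minimal Python sketch, pinning "today" to an assumed date consistent with the 93-day figure (swap in `date.today()` for a live count):

```python
from datetime import date

# EU AI Act enforcement date for high-risk AI systems (Article 113 timeline).
ENFORCEMENT_DATE = date(2026, 8, 2)

# Assumed "as of" date for illustration; replace with date.today() in practice.
as_of = date(2026, 5, 1)

days_left = (ENFORCEMENT_DATE - as_of).days
print(days_left)  # 93
```

The same subtraction works from any start date, which makes it a useful sanity check when planning a 3-to-6-month compliance project backwards from the deadline.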

How long does compliance actually take?

This depends entirely on your risk level — which is why classification is the first step. But here are realistic timelines:

Minimal risk (most AI systems)

A few hours. Classify your system, document why it's minimal risk, move on. There are no mandatory obligations, though documenting your classification is good practice.

Limited risk (chatbots, content generators)

A few days. Add AI disclosure banners to your product, update your terms of service, label AI-generated content. Straightforward — but you have to actually do it.

High-risk (HR AI, credit scoring, education, infrastructure)

3 to 6 months minimum if you're starting from scratch. Here's what the work involves:

  • Technical documentation (Annex IV) — 9 sections covering system architecture, training data, validation methodology, risk analysis, and more. This alone takes weeks for most teams.
  • Risk management system (Article 9) — a documented process for identifying, assessing, and mitigating risks throughout the system lifecycle. Not a one-page document.
  • Data governance review (Article 10) — documenting training data sources, data quality measures, bias detection. Often requires going back through historical development decisions.
  • Human oversight implementation (Article 14) — if your product doesn't currently allow users to review or override AI outputs, you may need to build this feature.
  • Conformity assessment — a formal self-assessment verification that can take weeks to complete properly.
  • EU database registration (Article 49) — required before the system goes to market or remains on the market after August 2.

If you have a high-risk AI system and haven't started: you now have 93 days to complete months of work. That's still possible — but only if you start immediately.

Why are companies still waiting?

The reasons are predictable:

  • "We'll do it when we have more time." There is no future moment where you have more time than now. The deadline is not moving.
  • "It doesn't apply to us." This assumption is often wrong. If you have EU users and an AI product, it very likely applies.
  • "It's too expensive." Compliance consultants do cost €50K–€200K. But that's not the only option — and a €15 million fine is considerably more expensive.
  • "Enforcement won't be strict at first." Possibly true for minimal and limited risk systems. Not a safe assumption for high-risk systems that cause demonstrable harm.

The real risk for procrastinating companies

There are two risks — one regulatory, one commercial.

The regulatory risk: national market surveillance authorities can order non-compliant high-risk AI systems off the EU market. For a company dependent on EU revenue, this is potentially catastrophic.

The commercial risk: enterprise procurement teams are increasingly asking about EU AI Act compliance status. EU customers — especially larger companies subject to their own compliance obligations as AI deployers — will require their vendors to be compliant. Non-compliance becomes a deal-blocker before it becomes a legal problem.

What to do today

Three steps, in order:

  • Classify your AI systems. Until you know your risk tier, you cannot scope the work. Use ActReady's free classifier at getactready.com/classify — it takes 60 seconds per system and tells you exactly what obligations apply. No signup required.
  • Prioritise high-risk systems. If any systems are high-risk, start the technical documentation and risk management work immediately. Don't wait until everything else is done.
  • Handle limited-risk systems quickly. AI disclosure banners and content labels can typically be implemented in a day or two. Get these done now while your team works on the harder stuff.

The companies that will struggle in August are the ones reading this in July. Don't be one of them.
