EU AI Act for HR Software Buyers: What Deploying Workday, HireVue, or Any AI Hiring Tool Actually Requires
Most EU AI Act coverage focuses on the companies building AI systems. But if your HR team uses Workday, HireVue, Eightfold, or any other AI-powered recruitment or performance management tool, you have your own set of obligations, even though you never wrote a line of code.
Under the EU AI Act, the company deploying an AI system in a workplace context is a deployer. Article 26 gives deployers of high-risk AI systems binding legal obligations that sit alongside — and are separate from — whatever compliance the vendor has done.
Why HR AI is always high-risk
Annex III Point 4 of the EU AI Act classifies the following as high-risk AI:
- AI used in recruitment and selection — including CV screening, ranking candidates, and filtering applications
- AI used to make or assist decisions on promotion, task allocation, and performance monitoring
- AI used for termination decisions or monitoring employee behaviour
This covers virtually every AI-powered HR tool on the market. If your ATS uses AI to rank applicants, that is a high-risk system. If your performance management platform uses AI to flag underperformers, that is a high-risk system. The fact that you bought it off the shelf does not change your obligations as the deployer.
What deployers of HR AI must do
1. Conduct a Fundamental Rights Impact Assessment (FRIA)
Under Article 27, deployers that are public bodies or private entities providing public services must carry out a Fundamental Rights Impact Assessment before putting a high-risk AI system into use. The FRIA must assess the potential impact on equality, non-discrimination, and data protection — particularly relevant when AI is used in hiring decisions that could disproportionately affect protected groups.
2. Implement human oversight
You cannot simply let the AI make decisions. Article 14 requires high-risk systems to be built for human oversight, and Article 26 requires you, the deployer, to assign that oversight to people with the competence, training, and authority to monitor, understand, and where necessary override or stop the system. For HR tools, this means:
- Ensuring hiring managers understand what the AI is doing and why
- Maintaining a clear process for human review of AI-generated shortlists or scores
- Documenting that final decisions are made by humans, not the system alone
3. Monitor the system in use
Deployers are responsible for monitoring the AI system's performance in their specific context. A tool that performs well in aggregate may produce biased outputs for your particular workforce. You need a process for identifying when the system is not working as intended and escalating issues to the provider.
4. Keep records
You must keep the logs automatically generated by the system, to the extent they are under your control, for at least six months (Article 26(6)), plus records sufficient to demonstrate compliance with your deployer obligations. This is separate from whatever logging the vendor does on their side.
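The Act does not prescribe a record format for deployers. As an illustrative sketch only — every field name here is an assumption, not a compliance template — a minimal record of one AI-assisted HR decision might capture who used which system, for what, and who made the final call:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class AIUsageLogEntry:
    """One record of AI-assisted HR decision use (illustrative fields only)."""
    timestamp: str          # when the system was used (UTC, ISO 8601)
    system: str             # which tool produced the output
    purpose: str            # e.g. "candidate shortlisting"
    ai_output_ref: str      # pointer to the stored AI output
    human_reviewer: str     # who reviewed the output
    final_decision_by: str  # "human": the final call must not rest with the system
    override: bool          # whether the reviewer departed from the AI suggestion

# Hypothetical example entry; identifiers are invented for illustration.
entry = AIUsageLogEntry(
    timestamp=datetime.now(timezone.utc).isoformat(),
    system="ATS candidate ranking",
    purpose="shortlisting for role REQ-1042",
    ai_output_ref="scores/2026-03-14/req-1042.json",
    human_reviewer="j.doe",
    final_decision_by="human",
    override=False,
)
print(json.dumps(asdict(entry), indent=2))
```

Whatever shape you choose, the point is that the record ties each AI output to a named human reviewer and a documented final decision.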
5. Inform employees
Workers who are subject to AI-assisted decisions have a right to be informed. If AI is being used to monitor performance, screen applications, or make recommendations about their employment, you must tell them — clearly and in advance.
What your vendor is responsible for
The provider (Workday, HireVue, etc.) is responsible for building and maintaining a compliant system — including Annex IV technical documentation, conformity assessment, and EU database registration. They bear the primary product compliance burden.
But their compliance does not cover your deployment. You cannot point to your vendor's EU AI Act certification and consider yourself covered. The deployer obligations listed above fall on your organisation, not theirs.
Before August 2, 2026, ask your HR software vendors for their EU AI Act compliance documentation. Any reputable provider of high-risk AI should be able to produce evidence of conformity assessment and Annex IV technical documentation on request.
What to do now
- Audit your HR tech stack. List every tool that uses AI to assist decisions about hiring, performance, or monitoring. Each one needs to be assessed.
- Contact your vendors. Ask for their EU AI Act classification and compliance documentation before the deadline.
- Document your human oversight process. How are hiring managers instructed to use AI outputs? Is this written down?
- Review your employee communications. Do your privacy notices and employment policies cover AI-assisted decision making?
- Classify your deployment. Use the free classifier at getactready.com/classify to understand your specific obligations as a deployer in your context.
Stay ahead of the deadline
Get EU AI Act updates, enforcement news, and compliance guides delivered to your inbox. No spam — unsubscribe any time.
Check your AI system's risk level for free
Our classifier maps your AI system against the EU AI Act in under 60 seconds. No signup required.
Classify Your AI System