Credit scoring, fraud detection, and insurance underwriting AI all fall under Annex III, point 5 (access to essential private and public services). If your AI affects whether individuals get loans, insurance, or financial services, you must comply by August 2, 2026.
Annex III, point 5 covers AI used to evaluate creditworthiness, set insurance premiums, or determine access to essential services. This includes:

- Credit scoring: AI that evaluates creditworthiness, approves or rejects loan applications, or sets credit limits for individuals
- Fraud detection: systems that flag, block, or restrict accounts based on fraud risk scores, when those decisions affect access to services
- Insurance underwriting: AI that calculates premiums, assesses risk profiles, or decides coverage eligibility for individual applicants
- Public services: AI systems that evaluate, grant, reduce, or revoke access to public assistance, benefits, or essential financial services
Not every fintech AI tool triggers Annex III. The key factor is whether AI decisions directly affect individual access to financial services.
Even if your system is not high-risk, transparency obligations under Article 50 may still apply. Run the free classifier to find out.
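The triage logic above can be sketched as a simple rule check: high-risk if the use case falls into an Annex III, point 5 category and the decision affects an individual's access to services, otherwise check Article 50. This is an illustrative simplification, not legal advice; the category names and the `classify` function are hypothetical:

```python
# Hypothetical triage helper mirroring the Annex III, point 5
# categories described above. Illustrative only -- not legal advice.

ANNEX_III_5_USES = {
    "credit_scoring",     # creditworthiness evaluation, loan decisions
    "fraud_blocking",     # fraud scores that restrict access to services
    "insurance_pricing",  # premium setting, coverage eligibility
    "benefits_access",    # public assistance / essential services
}

def classify(use_case: str, affects_individual_access: bool) -> str:
    """Return a rough risk bucket for a fintech AI use case."""
    if use_case in ANNEX_III_5_USES and affects_individual_access:
        return "high-risk (Annex III, point 5)"
    return "check Article 50 transparency obligations"

# Example: a loan-approval model that decides individual applications
print(classify("credit_scoring", affects_individual_access=True))
```

Note the second argument: a fraud model used purely for internal analytics, with no effect on whether a customer can access their account, would not hit the high-risk branch.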
Each of these obligations must be in place by August 2, 2026. Non-compliance risks fines of up to €15 million or 3% of global annual turnover, whichever is higher.
Some work carries over
If you already meet GDPR requirements, your data governance documentation and DPIA processes will partially cover EU AI Act Article 10 (data governance) and Article 9 (risk management). But the AI Act adds AI-specific requirements that GDPR doesn't cover: bias detection, model accuracy documentation, conformity assessment, and continuous post-market monitoring.
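The overlap described above can be summarized as a small mapping from existing GDPR artifacts to the AI Act articles they partially satisfy, with the AI-specific gaps that remain. The structure and coverage labels here are illustrative assumptions, not a complete legal mapping:

```python
# Illustrative partial mapping of GDPR artifacts to EU AI Act articles,
# per the overlap described above. Labels are simplifications, not legal advice.
GDPR_TO_AI_ACT = {
    "data governance documentation": {
        "ai_act_article": "Article 10 (data and data governance)",
        "coverage": "partial",
        "gap": "bias detection, model accuracy documentation",
    },
    "DPIA process": {
        "ai_act_article": "Article 9 (risk management system)",
        "coverage": "partial",
        "gap": "conformity assessment, continuous post-market monitoring",
    },
}

# The gaps are the AI-specific work GDPR compliance does not cover:
remaining_work = [entry["gap"] for entry in GDPR_TO_AI_ACT.values()]
print(remaining_work)
```

A table like this is a useful starting point for a gap analysis: everything marked "partial" needs an AI Act-specific extension rather than a from-scratch build.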
See the full GDPR overlap mapping →