AI Act ATS: the CNIL compliance check-list before August 2026

The AI Act becomes fully applicable on 2 August 2026. A 7-point check-list aligned with the CNIL FAQ to audit your ATS before the deadline.

Alexandre Noto

2 August 2026. In just over three months, the AI Act becomes fully applicable to AI systems used in recruitment. Your ATS is in scope, even if you have never thought of it as an AI tool. Automated candidate sorting, matching scores, relevance rankings: all of it falls within the perimeter. The CNIL has just published its first FAQ on the regulation. It is short, dense, and it sets the practical frame.

Why the CNIL publishes now

The CNIL FAQ arrives three months before the deadline. That is no coincidence. The authority is preparing its audits and wants companies to have had time to get compliant. As a signal, the 5 million euro fine handed to France Travail in January 2026 for security failings reminded everyone that the CNIL now hits the public sector as hard as the private sector. The idea of asymmetric tolerance is dead.

The AI Act does not replace the GDPR. The CNIL keeps hammering it: both regulations stack. An ATS processing applications must comply with both. A DPIA remains mandatory whenever the conditions of article 35 GDPR are met. The AI Act adds its own framework on top.

Annex III: your ATS is high-risk

The regulation classifies AI systems into four levels of risk. Annex III lists high-risk use cases, and point 4 targets employment directly: recruitment, selection, evaluation, targeted advertising of vacancies. An ATS that sorts or ranks candidates falls in scope without debate.

In the text's vocabulary, two roles are distinguished: the provider (the one that designs and places the system on the market) and the deployer (the one that uses it in production). Your ATS vendor is a provider. You, the recruiter, are a deployer. Each role carries its own stack of obligations. Think of a car manufacturer and a driver: both are accountable, each for their own share.

The check-list in 7 points

Each point must be verified with your vendor. Ask for written proof, not a sales promise.

1. High-risk classification assumed (Annex III, point 4)
The vendor must explicitly acknowledge that its system falls under point 4 of Annex III. If the pitch boils down to "our AI is not concerned", run. The text leaves no room for interpretation on candidate sorting.

2. Compliant instructions for use (article 13)
The provider must deliver clear instructions: capabilities, limits, expected input data types, accuracy level, interpretation of outputs. Not a marketing PDF: a readable technical document. Without these instructions, you cannot exercise your human oversight.

3. Operational human oversight (article 14)
You must be able to understand the output, challenge it, ignore it, stop the system. The vendor must document how its interface enables that oversight. Automation bias is named in the text: a recruiter who systematically validates what the AI suggests is not supervising.

4. Input data under control (article 26, paragraph 4)
The deployer ensures that input data is relevant and representative for the intended purpose. If your ATS learns from your biased historical data, you must document and correct it. That becomes your responsibility, not only the vendor's.

5. Logs kept for at least six months (article 26, paragraph 6)
Automatically generated activity logs must be kept for at least six months, unless stricter obligations apply. Check that your vendor produces these logs, that you can access them, and that your internal policy covers retention. This is what will let you document a contested decision.
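As an illustration only, here is a minimal Python sketch of the retention check a deployer might run on exported log timestamps. The function name and the idea of a timestamp export are hypothetical, not any vendor's actual API; the six-month floor comes from article 26, paragraph 6.

```python
from datetime import datetime, timedelta

# At least six months of logs (article 26(6)); stricter internal
# policies may require a longer window.
RETENTION = timedelta(days=183)

def covers_retention_window(log_timestamps, now=None):
    """Return True when the oldest retained log entry reaches back
    at least six months from `now`.

    log_timestamps: iterable of datetime objects, one per log entry
    (a hypothetical export format for this sketch).
    """
    now = now or datetime.now()
    timestamps = list(log_timestamps)
    if not timestamps:
        return False  # no logs at all: nothing to document a decision with
    oldest = min(timestamps)
    return now - oldest >= RETENTION
```

The point of the check is audit readiness: if the export cannot reach back six months, a contested decision from last winter cannot be documented.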

6. CE marking and technical documentation (articles 16, 47, 48)
The provider must affix the CE marking and maintain the technical documentation. Ask for the EU declaration of conformity. Without CE marking, the system cannot be placed on the European market after 2 August 2026 for the uses listed in Annex III.

7. Informing candidates and employee representatives (article 26, paragraphs 7 and 11)
Two distinct obligations. On the candidate side, applicants must be told, before processing, that their application is handled by a high-risk AI system. Internally, employee representatives and affected employees must be informed before the system goes live at work. These two notices stack on top of standard GDPR obligations; they do not replace them.

What the CNIL FAQ does not settle

The FAQ sets the frame; it does not detail everything. The FRIA, the fundamental rights impact assessment required by article 27, is not mandatory for private-sector recruitment. It is for public bodies and delegated public services. If you are a local authority, a public institution, or an OPCO, you are in scope. The official template from the European AI Office is expected before the deadline.

On the GDPR side, the existing CNIL rules remain in place: adequate, relevant, strictly necessary data, two-year active-base retention after last contact, candidate information. The AI Act removes nothing, it stacks.

What we suggest you do this week

Take the 7 points above, paste them into an email, and send it to your ATS vendor. Give yourself three weeks to get an answer. What you do not get in writing, you do not have.

At JobAffinity, a French ATS vendor, we document our compliance on all 7 points. We host in France, our AI runs on our own servers, and our instructions for use are available. By June we will publish our AI Act compliance sheet for our clients, so point 2 of the check-list can be ticked without delay. If your current vendor meets your questions with silence or vagueness on these points, it is probably a good time to compare.

Topics covered:

Compliance, AI Act, CNIL, ATS

Frequently Asked Questions

Is my ATS covered by the AI Act?
Yes, if your ATS sorts, ranks, scores or evaluates candidates. Annex III of the regulation places such systems in the high-risk category, under point 4 on employment. It does not matter whether the feature is marketed as AI or not: the purpose is what counts.
What should I check with my vendor before August 2026?
Seven points: high-risk classification assumed, article 13 instructions for use, article 14 human oversight, control over input data, logs kept for at least six months, CE marking and technical documentation, information of the candidate and of employee representatives. Ask for written proof on each item.
Does the GDPR still apply?
Yes. The CNIL states it explicitly in its FAQ: both regulations apply in parallel. The GDPR covers personal data, the AI Act covers the AI system. An ATS processing applications must comply with both. A DPIA remains mandatory whenever the conditions of article 35 GDPR are met.
Am I required to run a FRIA?
The FRIA, the fundamental rights impact assessment of article 27, is mandatory for public bodies, private entities providing a public service, and deployers under points 5(b) and 5(c) of Annex III. Private-sector recruitment is not directly covered, but the CNIL encourages the exercise to document proportionality.
What are the penalties for non-compliance?
Up to 15 million euros or 3% of worldwide annual turnover, whichever is higher, for breaches of high-risk obligations. Up to 35 million or 7% for prohibited practices. GDPR fines remain in force on top: the CNIL fined France Travail 5 million euros in January 2026 on data security grounds alone.
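The cap arithmetic behind those figures follows a "whichever is higher" rule (article 99 of the regulation). A minimal sketch, illustrative only and not legal advice; the function name is ours:

```python
def max_fine_eur(worldwide_turnover_eur, prohibited_practice=False):
    """Upper bound of an AI Act fine: a fixed amount or a share of
    worldwide annual turnover, whichever is higher (simplified).

    High-risk obligation breaches: 15 M EUR or 3% of turnover.
    Prohibited practices: 35 M EUR or 7% of turnover.
    """
    if prohibited_practice:
        return max(35_000_000, 0.07 * worldwide_turnover_eur)
    return max(15_000_000, 0.03 * worldwide_turnover_eur)
```

For a company with one billion euros of turnover, the high-risk cap is 30 million (3% exceeds the 15 million floor); for a 100-million-euro company, the 15 million floor applies.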
Is my ATS vendor a provider or a deployer?
The vendor that designs and markets the AI system is the provider. The recruiter who uses it in production is the deployer. The two roles carry distinct obligations. Article 26 lists those of the deployer: compliant use, human oversight, data quality, logs, information of data subjects. They remain your responsibility, even when the vendor takes care of its own share.

Ready to optimize your recruitment?

Discover JobAffinity and transform the way you recruit

Request a demo