AI resume scoring: what your ATS is not telling you

67% of companies plan to hire fewer juniors because of AI. Automated resume scoring optimizes the past, not potential.

By Alexandre Noto · 7 min read

67% of companies will hire fewer juniors. What role does AI play?

That is the figure making the rounds. According to an IDC study conducted for Deel across 5,500 companies in 22 countries (including 250 in France), 67% of business leaders plan to reduce junior hiring over the next three years. The reason cited: efficiency gains generated by AI.

Let us be precise. The problem is not AI. It is what we ask AI to do. And when it comes to AI scoring in recruitment, we often ask it to do exactly what it should not: sort humans the way you sort invoices.

Scoring optimizes the past. Not potential.

An AI scoring system works in a fairly straightforward way. It takes a resume, extracts the data (parsing), then compares it to the criteria of a job listing. It assigns a score. 85%, 62%, 41%. The recruiter sees the list, sorted from highest to lowest.
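The pipeline described above can be sketched in a few lines of Python. This is an illustrative toy (real ATS parsers handle layout, synonyms, and resume sections), but the core logic is the same: extract tokens, count keyword overlap against the job criteria, emit a percentage.

```python
# Minimal sketch of keyword-overlap resume scoring. Illustrative only:
# production parsers are far more elaborate, but the bias works the same way.
import re

def parse(text: str) -> set[str]:
    """Crude 'parsing': lowercase the text and extract word tokens."""
    return set(re.findall(r"[a-z+#]+", text.lower()))

def score(resume: str, job_criteria: set[str]) -> int:
    """Share of job criteria found verbatim in the resume, as a percentage."""
    found = parse(resume) & job_criteria
    return round(100 * len(found) / len(job_criteria))

criteria = {"python", "sql", "docker", "kubernetes", "aws", "terraform"}

senior = "10 years building Python services on AWS, Docker, Kubernetes, Terraform, SQL."
junior = "Graduate project: built a web app in Python. Internship in data analysis."

print(score(senior, criteria))  # 100: every keyword present, top of the list
print(score(junior, criteria))  # 17: one keyword, bottom of the pile
```

Note what the function never sees: the coherence of a career path, a cover letter, a volunteer commitment. It counts words.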

In theory, it saves time. In practice, it is a machine for reproducing the past.

Take a junior fresh out of training. Their resume is one page. Little experience, no industry-specific keywords, an internship and a final-year project. Up against a senior with ten years of experience and a resume packed with technical terms, the junior has no chance. Low score, pushed to the bottom of the pile, never seen by the recruiter.

It is like judging a marathon by the first 100 meters. The fastest runner at the start is not always the one who crosses the finish line.

A human recruiter might have noticed the coherence of the career path, the quality of the cover letter, an interesting volunteer commitment. The algorithm sees an empty resume. Score: 34%. Next.

83% of French companies surveyed by IDC are already observing job transformations or eliminations linked to AI. If you add a scoring system that automatically eliminates the least experienced profiles, you are not just reducing junior hires. You are cutting the pipeline for team renewal.

Scoring versus matching: the distinction no one makes

There is a word used constantly in ATS platforms without ever being clearly defined: matching. And that is a problem.

Scoring, in itself, is information. It tells the recruiter: "This resume matches 78% of the job criteria." It is an indicator. The recruiter decides what to do with it.

Matching is something else. It is when the algorithm sorts, ranks, and pre-selects applications. It no longer provides information to the recruiter; it makes a decision on their behalf. The candidate at 45% will never be displayed. The one at 72% will be buried on page 3.

The boundary between the two is smoke and mirrors. In practice, when a recruiter receives 200 applications and the ATS sorts them by descending score, they will never look beyond the top 20. Scoring becomes de facto matching. Information becomes a selection filter, and the recruiter is not even aware of it.
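That silent conversion from information into filter is easy to simulate. In this hypothetical sketch, 200 scored applications are sorted in descending order and only the top 20 are read; everything below the shortlist's lowest score is effectively rejected, even though no one ever decided to reject it.

```python
# 200 hypothetical applications with random scores; the ATS sorts descending.
import random

random.seed(0)  # reproducible illustration
applications = [(f"candidate_{i:03d}", random.randint(20, 95)) for i in range(200)]
ranked = sorted(applications, key=lambda a: a[1], reverse=True)

shortlist = ranked[:20]   # what the recruiter actually reads
never_seen = ranked[20:]  # 90% of applicants, silently filtered out

cutoff = shortlist[-1][1]
print(f"Effective cutoff: any score below {cutoff} is never read.")
print(f"{len(never_seen)} of {len(applications)} applications are invisible.")
```

No rejection rule was ever configured. The cutoff emerged from reading habits, which is exactly why the recruiter is not aware of it.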

It is the difference between a GPS that shows you three routes and a GPS that chooses for you. Both use the same data, but only one lets you decide.

AI Act: recruitment scoring classified as "high risk"

If the debate were purely philosophical, we could leave it there. But there is a regulatory timeline that makes this very concrete.

On August 2, 2026, the obligations of the European AI Act become fully applicable for high-risk AI systems. And guess what: automated resume screening, candidate scoring, and performance rating are explicitly on the list.

The obligations are significant. Technical documentation to be retained for ten years. Registration in a centralized European database. Mandatory human oversight with the ability to override automated decisions. Incident reporting. Full traceability.

The penalties? Up to 35 million euros or 7% of global revenue for serious violations. 15 million or 3% for less severe infractions.

Let us do the math honestly. On one side: the cost of compliance (DPIA, audits, documentation, human oversight, training). On the other: a compatibility score that most recruiters check once during the demo and then forget in daily use.

For many companies, AI scoring in recruitment is the heated seat of the ATS. Impressive on the spec sheet, rarely used in real life. Except now, the heated seat comes with a 200-page compliance manual and a potential fine of several million euros.

What AI actually does well (and what we should be asking it to do)

JobAffinity is a pioneer in AI applied to recruitment. We speak at numerous webinars about best practices for optimizing recruitment with AI without dehumanizing the process. JAI is our AI integrated into the ATS, scoring included. But we draw a clear line between assisting the recruiter and deciding on their behalf.

Here is what AI does well in recruitment:

Writing job postings. Going from a manager's brief to a structured, inclusive, job-board-optimized listing in minutes instead of an hour. The AI drafts, the recruiter validates and adjusts. The recruiter stays in control of their messaging.

Preparing interviews. Analyzing a resume to suggest the right questions. Identifying gaps, career transitions to explore, skills to verify. The AI prepares, the recruiter conducts the interview.

Assisting you daily. JAI is the guardian angel of your process. It can remind you that of the 5 candidates you contacted last week, 3 have not responded, and offer a one-click follow-up. It can flag when a listing is losing traction on new applications and suggest posting on additional job boards.

In all three cases, AI is an assistant. It processes the raw material so the human recruiter makes better decisions, faster. Scoring, when it is transparent and configurable, follows the same logic: information serving the recruiter, not an autonomous filter.

The exercise to do tomorrow morning

Open your ATS. Look at the last job listing you published. If your tool offers AI scoring, ask yourself three questions:

  1. Have you set up your application forms to inform candidates they will be scored (AI Act)?
  2. Have you ever rejected a profile solely because their score was low?
  3. Do you know exactly which criteria the algorithm weights to calculate the score, and can you modify them?

If the answer to question 1 is "no," scoring is a significant legal and financial risk. Is the gain worth that risk? If the answer to question 2 is "yes," you have an automated bias problem. If the answer to question 3 is "no," you are using an opaque tool to pre-select candidates. And from August 2026, that is a legal risk.

Score, yes. But not just any way.

At JobAffinity, we offer scoring because it is a feature that can be useful when used with awareness. But it is transparent, configurable, and compliant with the AI Act.

We were recruiters before we became software vendors. Our mission since 2009 has been to help recruiters do their job effectively and ethically. Every AI feature we integrate passes a simple filter: does it genuinely help the recruiter day to day, or is it just convincing in a demo? We do not serve you the next trendy AI feature. We take a critical look at what we integrate, because we know the profession, the recruiter, and the candidate.

In practice, what does that look like?

Data sovereignty. Our AI is French and self-hosted. Your candidates' resumes do not end up with an American cloud provider subject to the CLOUD Act. This is also a direct advantage for AI Act and GDPR compliance.

Transparent scoring. The recruiter sees why a candidate received a given score. No black box. This is exactly what the AI Act will require for high-risk systems, and it is what we already do.

Configurable by the recruiter. You adjust the criteria, you fine-tune the weightings. You define what matters for your role, not the algorithm. Scoring remains your tool, not your replacement.

The score informs, it does not decide. With us, scoring helps prioritize which applications to read. It does not automatically reject anyone. The recruiter stays in control from the first resume to the final interview.
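To illustrate the principle (a generic sketch, not JobAffinity's actual implementation), transparent and configurable scoring can be as simple as recruiter-owned weights plus a per-criterion breakdown, so every score can be explained:

```python
# Generic sketch of recruiter-configurable weighted scoring. The names and
# weights here are hypothetical; the point is that the recruiter owns the
# criteria and every score comes with its explanation (no black box).

def weighted_score(candidate: dict[str, float],
                   weights: dict[str, float]) -> tuple[int, dict[str, float]]:
    """Return the overall score and a per-criterion contribution breakdown."""
    total = sum(weights.values())
    breakdown = {c: candidate.get(c, 0.0) * w / total for c, w in weights.items()}
    return round(100 * sum(breakdown.values())), breakdown

# The recruiter decides what matters for THIS role, not the algorithm:
weights = {"python": 3, "sql": 1, "communication": 2}
candidate = {"python": 0.4, "communication": 0.9}  # 0..1 ratings per criterion

score, why = weighted_score(candidate, weights)
print(score, why)  # the breakdown shows exactly where the score comes from
```

Returning the breakdown alongside the score is what turns a black box into an indicator: the recruiter can see that a low total comes from, say, a missing SQL rating, and decide that it does not matter for this role.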

The next junior you hire may have an empty resume and potential that only a human can spot. Our job is to give you the right tools to see it.

Topics covered:

AI recruitment · ATS · Resume scoring

Frequently Asked Questions

What is the difference between scoring and matching in an ATS?
Scoring assigns a compatibility score to a resume against a job listing. It is information provided to the recruiter. Matching sorts and ranks applications automatically. The line between the two is blurry: when the recruiter only looks at resumes above 80%, scoring becomes a disguised selection filter.

Is AI scoring reliable for evaluating junior candidates?
No. AI scoring relies on past experience, resume keywords, and degrees. A junior candidate, by definition, has little experience. Their score will systematically be low, even if they show strong potential. This is a structural bias of the method, not a bug.

What does the AI Act say about AI scoring in recruitment?
The European AI regulation (AI Act) classifies resume scoring and screening systems as high-risk systems. From August 2, 2026, companies using these tools must comply with requirements for technical documentation (retained for 10 years), transparency, human oversight, and registration in a European database. Penalties can reach 35 million euros or 7% of global revenue.

How do I know if my ATS scoring complies with the AI Act?
Ask your vendor three questions: Is the candidate informed they will be scored? Can you see and modify the weighting criteria? Does the tool maintain complete technical documentation? If the answer is no to any of these, your scoring presents a legal risk from August 2026 onward.

Does AI scoring replace the recruiter?
No, provided it is used as an indicator rather than an automatic filter. Transparent and configurable scoring helps recruiters prioritize which applications to review. But if the recruiter only consults high scores, the tool is making decisions on their behalf, which raises bias and regulatory compliance issues.

What are the concrete risks of opaque AI scoring in recruitment?
Three main risks: a structural bias against junior and non-traditional profiles (low scores due to lack of experience), a legal risk under the AI Act (penalties up to 35 million euros), and a risk of dehumanizing the process that depletes the talent pipeline over time.

Ready to optimize your recruitment?

Discover JobAffinity and transform the way you recruit

Request a demo