Artificial intelligence (AI) is rapidly transforming many industries, and recruitment is no exception. AI-powered tools are being used to automate tasks such as screening CVs, scheduling interviews, and even conducting initial assessments.

While AI can offer some significant benefits, I believe it risks being over-used, and it can actually harm the recruitment process. However, let’s start by looking at the good bits:

The Benefits of AI in Recruitment

AI can offer some advantages within talent acquisition:

• Increased efficiency: AI can automate many time-consuming tasks, such as scheduling interviews and creating job descriptions and adverts. Some platforms even use it to speed up sourcing and finding the right talent.
• Reduced bias: Some claim that AI can help reduce unconscious bias in the recruitment process by evaluating candidates against objective criteria – a claim I intend to explore in this blog.
• Improved candidate experience: AI can provide candidates with a more streamlined and personalised experience – another assumption I intend to challenge!

The Limitations of AI in Recruitment

As mentioned above, AI has some limitations that need to be considered:

• Lack of human judgement: AI cannot replicate the human ability to assess a candidate’s soft skills, such as communication, teamwork, and cultural fit.
o Equally, as Charlotte Lytton points out in her BBC article (link below), AI screens CVs for keywords and automatically rejects potential talent.
• Potential for bias: AI algorithms can be biased if they are trained on biased data.
• Ethical concerns: The use of AI in recruitment raises ethical concerns, such as the potential for discrimination against certain groups of candidates.

Let’s take each of those points and dive a bit deeper.

The Importance of Human Oversight

At application stage, candidates are at the mercy of whoever (or whatever) reads their profile. I’ve always believed CV writing is a skill, and candidates – often from a younger demographic – are not always adequately prepared to approach the job market with a well-structured and detailed CV (that’s a blog for a different day).

But it’s not just newly qualified candidates looking for their first jobs whose CVs can be poor. I’ve seen hundreds of CVs from people with 15–20+ years’ experience – and the quality of their CVs has not reflected them well.

Does it mean they can’t do the job? Absolutely not. What it does mean is that no-one has ever shown them how to articulate their experience on paper in an optimal way.

Now I know what you might be thinking. “It’s not my fault if they can’t write a good CV.” And that could be seen as a perfectly legitimate argument! If you’re lucky enough to be hiring in an industry in 2024 with a plethora of experienced and qualified candidates applying for every single vacancy in your business, then you’re in the luxurious position to be able to reject candidates based on the quality of their CVs.

However, if you’re hiring in any of the industries struggling with talent shortages, the unfortunate reality is that you need to dig deeper, read between the lines, find the glimmers of hope, and give people a chance.

It’s experienced people with good training that can do this – not algorithms.

Where this all ends up is simple: when humans are in control of the recruitment process, we can make choices based on experience and gut instinct – and perhaps give that person with the weird CV a call, because we see something in them and we have more questions to ask.

If AI is in control, we can potentially lose out on very capable candidates.

“One biased human hiring manager can harm a lot of people in a year, and that’s not great. But an algorithm that is maybe used in all incoming applications at a large company… that could harm hundreds of thousands of applicants” – Hilke Schellman (US-based author of ‘The Algorithm: How AI Can Hijack Your Career and Steal Your Future’, and an assistant professor of journalism at New York University)

Candidate Experience

One of the other claims made by the huge numbers of AI platforms rushing to market is that AI can streamline and strengthen the candidate experience.

Some evidence suggests this is not the case: job candidates rarely know whether these tools are the sole reason companies reject them, and the software often doesn’t tell users how they’ve been evaluated.

Now, this is where I actually think AI can be useful in evolving the ‘standard rejection’ email. I also believe AI can help ensure that everyone who applies gets a response – even if it’s a rejection. There’s nothing worse than being ‘ghosted’ by a company (and this does NOTHING for your candidate experience!).

AI can assist with the creation of rejection templates, which can at least give some kind of feedback to candidates. If you create templated options such as “not enough experience” or “location doesn’t fit the role”, these can help you respond quickly to candidates – especially when there are a lot of applicants and you need to filter efficiently.
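To make the idea concrete, here’s a minimal sketch of what reason-coded rejection templates could look like behind the scenes. All the names, reason codes, and wording below are hypothetical – a sketch of the approach, not any real platform’s implementation.

```python
# Hypothetical sketch: reason-coded rejection templates that guarantee
# every applicant gets a response with at least some feedback.

REJECTION_TEMPLATES = {
    "experience": (
        "Thank you for applying, {name}. On this occasion we won't be "
        "progressing your application, as the role needs more hands-on "
        "experience than your CV currently shows."
    ),
    "location": (
        "Thank you for applying, {name}. Unfortunately the location of "
        "this role doesn't fit with the details on your application."
    ),
}

def build_rejection(name: str, reason: str) -> str:
    """Return a personalised rejection message for a given reason code."""
    template = REJECTION_TEMPLATES.get(reason)
    if template is None:
        # Fall back to a generic note rather than ghosting the candidate.
        return (f"Thank you for applying, {name}. We won't be progressing "
                "your application this time.")
    return template.format(name=name)

print(build_rejection("Sam", "experience"))
```

Even something this simple beats silence: the candidate hears back quickly, and the reason code gives them a thread to pull on.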


Potential for Bias

The different types of biases are well documented in the recruitment world, and we spend a lot of time in our business training hiring managers on how to interview in an unbiased way.

AI in recruitment has been billed as reducing, if not eliminating, bias from the process. But this all relies on the training data.

Here’s an example from an anonymous company in the US.

Its AI CV screener had been trained on existing profiles of people in the business. The algorithm had learned that profiles listing hobbies such as basketball and baseball belonged to the more successful people in the business – often men. As a result, when candidates were screened, those who’d listed softball among their hobbies – typically women, in this case – were filtered out.
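The softball example boils down to a few lines of logic. Here’s an entirely hypothetical sketch – not any vendor’s real algorithm – of how a screener trained on a male-skewed workforce quietly rejects capable candidates:

```python
# Hypothetical illustration of the screening failure described above.
# The "winning" hobbies were learned from a male-skewed set of existing
# employees, so the filter inherits that skew.

LEARNED_POSITIVE_HOBBIES = {"basketball", "baseball"}  # from biased data

def passes_screen(hobbies: list[str]) -> bool:
    """Naive screener: pass only CVs that share a 'winning' hobby."""
    return any(h.lower() in LEARNED_POSITIVE_HOBBIES for h in hobbies)

print(passes_screen(["baseball", "chess"]))   # True
print(passes_screen(["softball", "chess"]))   # False - capable candidate lost
```

Nothing in the code mentions gender, yet the outcome is discriminatory – which is exactly why “objective criteria” learned from historical data are anything but.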

In another researched case, one applicant who had been rejected resubmitted the same application but tweaked their birth date to appear younger. They were offered an interview.

Bias absolutely exists within humans, don’t get me wrong. But it feels to me that, through training and a clear and fair process, we can tackle human bias far more easily and effectively than we can fix AI’s.

The Ethics

This brings us nicely (or not so nicely) onto the ethical conversation. Clearly, from the examples above, ageism and sexism are not being eliminated by the introduction of AI. You could argue that there are only a handful of documented cases, but surely one example is all that’s required to open a very serious conversation and debate about the effectiveness and fairness of AI in hiring.

Companies looking to implement AI technology into their recruitment process (40% of 8,500 companies surveyed by IBM in late 2023) should be extremely wary of the ethical and legal risks involved, especially if partnering with ‘rushed-to-market’ technology and algorithms that have not been fully stress-tested. It could open a completely unnecessary can of worms.

Wrap Up

In conclusion, AI can be a valuable tool for supplementing the recruitment process, but it should never entirely replace human judgment. Recruiters need to use their experience and expertise to assess candidates’ soft skills, identify potential biases, read between the lines on CVs, and ensure that the recruitment process is fair and ethical.

I think that in the hype of the AI tsunami, a lot of us have rushed to shoehorn ChatGPT, Gemini and various other tools into our personal and professional lives, under the illusion that we can automate things and be more productive.

However, I think that when it comes to hiring for our businesses, we need to take a people-focused approach and remember that AI is here to help us, not replace us.