A contact of mine recently applied for a management job whose advertised requirements closely matched his background.
Within a minute he received a reply saying he had been unsuccessful because his background didn’t match the specific requirements.
Quite puzzled, he picked up the phone and spoke to the Recruiter.
The Recruiter was quick to reiterate that my contact did not have the background they were seeking.
My contact then pointed out that quite clearly his background DID align with the role advertised.
The Recruiter apologised profusely and booked an interview. But while my contact did go on to meet this Recruiter, his impression had been tainted.
Of course, AI is not ready to replace humans, and that includes Recruiters, although it can help with some of the less exciting and repetitive tasks we do.
That said, a few years ago Jeffrey Dastin reported for Reuters that “Amazon scrapped a secret AI recruiting tool that showed bias against women.” His article went on to say, “But by 2015, the company realised its new system was not rating candidates for software developer jobs and other technical posts in a gender-neutral way.”
It turned out that Amazon’s computer models were trained to vet applicants by observing patterns in resumes submitted over a 10-year period. Most came from men, a reflection of male dominance across the tech industry. In effect, Amazon’s system taught itself that male candidates were preferable.
I would like to think that ChatGPT has this sorted but, given my contact’s experience, AI screening still isn’t as good as it needs to be.
AI can’t phone screen candidates and identify the soft skills I discussed in my post last week. Or in an interview, AI doesn’t pick up on social cues and can’t build rapport. The list goes on.
AI will be a huge part of our future, automating repetitive, basic tasks, but it won’t be replacing Recruiters any time soon.
If you’re looking to grow your sales team and want real human interaction, please do give Miller Recruitment a call.