Amazon scraps 'sexist AI' recruitment tool
Members of the team working on the system said it effectively taught itself that male candidates were preferable
Amazon has scrapped a “sexist” tool that used artificial intelligence to decide the best candidates to hire for jobs.
The artificial intelligence software was created by a team at Amazon’s Edinburgh office in 2014 as a way to automatically sort through CVs and select the most talented applicants.
But the algorithm rapidly taught itself to favour male candidates over female ones, according to members of the team who spoke to Reuters.
They realised it was penalising CVs that included the word “women’s,” such as “women’s chess club captain.” It also reportedly downgraded graduates of two all-women’s colleges.
The problem stemmed from the fact that the system was trained on CVs submitted by applicants over a 10-year period – much of which was said to have come from men.
Five members of the team who developed the machine learning tool – none of whom wanted to be named publicly – said the system was intended to review job applications and give applicants a score ranging from one to five stars.
Some of the team members pointed to the fact this mirrored the way shoppers rate products on Amazon.
“They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those,” one of the engineers said.
But by 2015 it was obvious the system was not rating candidates in a gender-neutral way, because it had been built on data accumulated from CVs submitted to the firm mostly by men.
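The dynamic is straightforward to reproduce. The following is a purely hypothetical sketch in Python with scikit-learn – not Amazon’s actual system – showing how a text classifier trained on CVs where past hires skew male can learn a negative weight for a term such as “women’s”:

```python
# Purely illustrative sketch (not Amazon's system): a text classifier
# trained on historically skewed hiring data learns to penalise the
# token "women" wherever it appears.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Synthetic training data: past CVs labelled 1 (hired) or 0 (rejected).
# Because most past hires were men, terms associated with female
# applicants appear mainly in the rejected CVs.
cvs = [
    "software engineer chess club captain",          # hired
    "software engineer rowing team captain",         # hired
    "software engineer women's chess club captain",  # rejected
    "software engineer women's coding society",      # rejected
]
labels = [1, 1, 0, 0]

vectoriser = CountVectorizer()
X = vectoriser.fit_transform(cvs)
model = LogisticRegression().fit(X, labels)

# The vectoriser tokenises "women's" to "women"; its learned weight is
# negative, so any CV containing the word receives a lower score.
idx = vectoriser.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])
```

In this toy example the negative coefficient arises purely from the composition of the training data – the same mechanism Amazon’s team reportedly uncovered.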
The project was eventually discarded, although Reuters reported that recruiters looked at the tool’s recommendations for a period but never relied on them exclusively.
Automation has played a critical role in Amazon’s e-commerce clout – from operations inside its warehouses to pricing decisions.
According to a survey by software firm CareerBuilder, about 55 per cent of US human resources managers said AI would have a role to play in recruitment within the next five years.
But concerns have previously been raised about how trustworthy and consistent algorithms can be when they are trained on potentially biased data.
In May last year, a report claimed that an algorithmic risk-assessment program used by a US court was biased against black prisoners.
The program flagged black people as twice as likely as white people to reoffend – a flaw attributed to the skewed historical data it was learning from.
As the tech industry builds artificial intelligence, there is a risk of it embedding sexism, racism and other deep-rooted prejudices into code that will go on to make decisions for years to come.
Charlotte Morrison, general manager of global branding and design agency Landor, told The Independent: “The fact that Amazon’s system taught itself that male candidates were preferable, penalising resumes that included the word ‘women’s’, is hardly surprising when you consider 89 per cent of the engineering workforce is male.
“Brands need to be careful that when creating and using technology it does not backfire by highlighting society’s own imperfections and prejudices.
“The long-term solution is of course getting more diverse candidates into STEM education and careers – until then, brands need to be alert to the dangers of brand and reputational damage from biased, sexist, and even racist technology.”
Amazon did not immediately respond to The Independent’s request for comment.