Amazon Scraps Secret AI Recruiting Tool That Showed Bias Against Women

By Reuters. Amazon.com Inc’s (AMZN.O) machine-learning specialists uncovered a big problem: their new recruiting engine did not like women.

The team had been building computer programs since 2014 to review job applicants’ resumes with the aim of mechanizing the search for top talent, five people familiar with the effort told Reuters.

Automation has been key to Amazon’s e-commerce dominance, be it inside warehouses or driving pricing decisions. The company’s experimental hiring tool used artificial intelligence to give job candidates scores ranging from one to five stars – much like shoppers rate products on Amazon, some of the people said.

“Everyone wanted this holy grail,” one of the people said. “They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those.” (Read more from “Amazon Scraps Secret AI Recruiting Tool That Showed Bias Against Women” HERE)

_____________________________________________

Amazon Built an AI Tool to Hire People but Had to Shut It Down Because It Was Discriminating Against Women

By Business Insider. . .A year later, however, the engineers reportedly noticed something troubling about their engine — it didn’t like women. This was apparently because the AI combed through predominantly male résumés submitted to Amazon over a 10-year period to accrue data about whom to hire.

Consequently, the AI concluded that men were preferable. It reportedly downgraded résumés containing the word “women’s” and filtered out candidates who had attended either of two all-women’s colleges.
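Amazon’s actual model has never been published, but the mechanism described here is easy to reproduce in miniature. The sketch below is purely illustrative: fabricated résumé snippets and a simple scikit-learn text classifier, trained on historically skewed hiring labels, end up assigning a negative weight to the token “women” solely because of that skew.

```python
# A minimal, hypothetical sketch -- not Amazon's system. Fabricated data
# illustrates how a classifier trained on skewed historical hiring
# decisions learns gendered terms as negative signals.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy history: past decisions happened to favor resumes without gendered terms.
resumes = [
    "captain of chess club, java developer",          # hired
    "java developer, intramural soccer",              # hired
    "captain of women's chess club, java developer",  # not hired
    "women's soccer team, java developer",            # not hired
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# Inspect learned weights: "women" comes out strongly negative, an
# artifact of the biased labels rather than of any real qualification signal.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(sorted(weights.items(), key=lambda kv: kv[1])[:3])
```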

Amazon’s engineers apparently tweaked the system to remedy these particular forms of bias but couldn’t be sure the AI wouldn’t find new ways to unfairly discriminate against candidates.
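That caveat is the crux: scrubbing explicit terms does not remove the underlying signal, because correlated proxy features survive. Continuing the hypothetical sketch above — Reuters did not name the colleges involved, so “smith college” here is an arbitrary stand-in for an all-women’s school, with invented data — the negative weight simply migrates to the proxy token:

```python
# Continuing the hypothetical sketch: the explicit gendered token is gone,
# but a proxy (an all-women's college name, here an invented stand-in)
# carries the same skewed signal.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "java developer, state university",
    "java developer, tech institute",
    "java developer, smith college",  # proxy for gender in the history
    "java developer, smith college",
]
hired = [1, 1, 0, 0]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# "college" and "smith" come out most negative -- the bias resurfaces
# through the proxy even though no gendered word appears anywhere.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(sorted(weights.items(), key=lambda kv: kv[1])[:2])
```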

Gender bias was not the only problem, Reuters’ sources said. The computer programs also recommended candidates who were unqualified for the position.

Remedying algorithmic bias is a thorny issue, as algorithms can pick up on subconscious human bias. In 2016, ProPublica found that risk-assessment software used to forecast which criminals were most likely to reoffend exhibited racial bias against black people. Overreliance on AI for things like recruitment, credit scoring, and parole judgments has also created issues in the past. (Read more from “Amazon Built an AI Tool to Hire People but Had to Shut It Down Because It Was Discriminating Against Women” HERE)
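Bias of this kind is typically caught with aggregate checks rather than by reading individual scores. One minimal example, with invented numbers: US hiring audits often apply the EEOC’s “four-fifths rule,” which flags adverse impact when one group’s selection rate falls below 80% of another’s.

```python
# Hypothetical fairness check of the kind used in hiring audits.
# Numbers are invented; the 0.8 threshold is the EEOC "four-fifths rule".
def disparate_impact(selected_a: int, total_a: int,
                     selected_b: int, total_b: int) -> float:
    """Ratio of the lower group's selection rate to the higher group's."""
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# 30% of group A selected vs. 60% of group B -> ratio 0.5, well under 0.8.
print(disparate_impact(30, 100, 60, 100))  # 0.5 -> flags adverse impact
```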
