WASHINGTON (REUTERS) - Amazon's software engineers recently uncovered a big problem: Their new online recruiting tool did not like women.
The glitch, sources told Reuters, stemmed from the fact that Amazon's computer models were trained on patterns in job candidates' resumes submitted over a 10-year period - most of them from men - and so, in effect, taught themselves that male candidates were preferable.
"The technology thought, 'Oh, Amazon doesn't like any resume that has the word 'women's' in it - captain of a women's chess club, women's soccer team, some all-women's universities," said Reuters correspondent Jeffrey Dastin. "Because the company has hired so many male engineers or software developers, data scientists and so forth, that clearly the unsuccessful candidates were ones that would have this word 'women's' in it.''
Amazon never relied solely on the tool, though, and disbanded the team that created it by the start of last year, sources said. It now uses a "much watered-down version" for administrative chores. The company declined to comment.
Mr Dastin explained that artificial intelligence is only as smart as the information it is fed.
"What people say in the industry is, 'Garbage in, garbage out'. So if you give it bad data, or that reflects some bias or whatever, the computer is just gonna mimic that. It's going to mimic whatever human flaws there are."
A growing number of companies are automating recruitment, hoping this will make hiring faster and more uniform. Hilton and Unilever are among those using software made by HireVue, which lets applicants video-record answers to employers' questions. HireVue's chief executive officer says his firm analyses candidates' speech and facial expressions in order to reduce reliance on resumes.
Amazon, a source said, has assembled a new team to give online screening another try, this time with a focus on diversity.