Amazon had been working to develop an artificial intelligence (AI) system to analyze and rate applicants’ résumés. However, as Reuters reported in October 2018, Amazon’s AI discounted the résumés of female applicants in favour of male applicants. Here's what happened...
Amazon identified deficiencies in the AI's output early in the process, when the system showed a gender bias in its assessment of candidates. First, résumés that included the word “women’s”, as in “women’s soccer” or “women’s health”, or that referenced women’s colleges, were penalized. Second, the system preferred résumés containing certain verbs, like “executed” or “captured”, which were more commonly used by male applicants.
Why, you may ask, would the AI system exhibit this gender bias? The system was trained to identify patterns in Amazon’s hiring practices by analyzing résumés submitted to the company over a 10-year period. Because most technical applicants and hires at Amazon were men, the output reflected that history: the system learned to favour male applicants.
But why did training on real-world data go so wrong?
Amazon's failure illustrates a limitation of AI that must not be underestimated: an AI system will only be as good as the data and training it receives. If the data exhibits a gender bias, so too will the system. If the data reflects human behaviour, which is rife with implicit biases, the system will reinforce those biases. In this case, it entrenched a diversity problem instead of solving it. AI is not magic; it reflects its programming and training, including any defects therein.
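To make this concrete, here is a toy sketch, with entirely hypothetical data and a deliberately naive scoring rule (nothing like Amazon's actual system): a résumé scorer that learns word weights from biased historical hiring outcomes ends up penalizing a gender-correlated word, even though no one programmed it to.

```python
# A naive résumé scorer trained on biased (synthetic) hiring history.
from collections import Counter

# Hypothetical history: (résumé tokens, hired?). Past hires skew toward
# "male-coded" verbs, and "womens" appears mostly on rejected résumés.
history = [
    (["executed", "python"], True),
    (["captured", "java"], True),
    (["executed", "sql"], True),
    (["womens", "python"], False),
    (["womens", "java"], False),
    (["led", "sql"], False),
]

hired = Counter()
rejected = Counter()
for tokens, was_hired in history:
    (hired if was_hired else rejected).update(tokens)

def weight(token):
    # Smoothed hire/reject ratio: >1 favours a token, <1 penalizes it.
    return (hired[token] + 1) / (rejected[token] + 1)

def score(tokens):
    s = 1.0
    for t in tokens:
        s *= weight(t)
    return s

# Two equally qualified candidates; only one token differs.
print(score(["executed", "python"]))  # favoured
print(score(["womens", "python"]))    # penalized
```

The model never sees a "gender" field; it simply compresses the historical outcomes it was given, bias included, into its weights.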
According to Reuters' reporting, Amazon never relied on the AI system to evaluate applicants, though the company did not deny that recruiters viewed its ratings. In any event, Amazon abandoned the project because of its gender bias and other flaws, such as recommending unqualified (presumably male) candidates. Go figure.
The future is not bleak.
Amazon was an early adopter of AI, which drives the company’s product recommendations. Despite its expertise, Amazon was unsuccessful in this attempt to introduce AI into its HR processes. The lesson is that, even with an experienced team to administer and train an AI system, robust validation is needed. In fact, validation is critical. Thankfully, Amazon closely scrutinized its AI's recommendations, thereby averting the hiring of unqualified male staff (and a renaming of the business to the Dunder Mifflin Paper Company).