Tuesday, November 20, 2018

Amazon’s Artificial Intelligence Is Misogynistic?

Can we send robots to sensitivity training? Because apparently Amazon’s employee-recruiting artificial intelligence is biased against women. According to the Daily Mail, an artificial intelligence program used by Amazon in Scotland taught itself to downgrade resumes that included words like “women’s,” as in “women’s chess club champions.” It also downgraded the resumes of graduates of all-women’s colleges, and it gave preference to applicants who used “male” verbs like “executed” and “captured.”
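The mechanism is depressingly mundane: a model trained to imitate years of male-dominated hiring decisions will reward whatever words happen to correlate with past hires. Here's a toy sketch of the idea, with entirely made-up data and a deliberately crude word-scoring scheme (nothing to do with Amazon's actual system):

```python
from collections import Counter

# Invented "historical" data: (resume keywords, was the person hired?).
# The hired/rejected labels reflect a skewed past — which is exactly
# what the model ends up learning.
history = [
    (["executed", "captured", "engineering"], 1),
    (["executed", "leadership"], 1),
    (["captured", "engineering"], 1),
    (["women's", "chess", "engineering"], 0),
    (["women's", "leadership"], 0),
]

def word_scores(data):
    """Score each word by how often it shows up in hired vs. rejected resumes."""
    hired, rejected = Counter(), Counter()
    for tokens, label in data:
        (hired if label else rejected).update(set(tokens))
    vocab = set(hired) | set(rejected)
    return {w: hired[w] - rejected[w] for w in vocab}

scores = word_scores(history)
# In this toy data, "women's" appears only in rejected resumes, so it
# gets a negative score — the model has learned a gendered proxy
# without ever being told anyone's gender.
```

No one programmed the bias in; the model simply reproduced the pattern in its training data, which is why "it taught itself" is both accurate and cold comfort.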

Amazon said that it did not rely solely on this AI to hire applicants, and once it discovered the bias, it relegated the AI to judgment-free chores, like making sure the same applicant did not apply more than once. This is not the first time that machine learning has gone awry. In 2016, Microsoft’s chatbot Tay had to be taken offline less than 24 hours after its launch when it began posting racist messages.

So, for the time being, the jobs of human recruiters are safe. They may not be for long, however. In what I think is the realization of my technological dystopian nightmares, artificial intelligence is apparently being developed to interview employees who have been the victims of workplace harassment. Who needs eye contact and empathy when dealing with HR issues? We can look forward to a future where chatbots and automated phone menus run our HR departments.