Is Artificial Intelligence Sexist and Racist?
Monday, February 18, 2019

Last year, Amazon scrapped a machine-learning recruiting tool after discovering it had a major problem: the artificial intelligence didn't like women. The tool was designed to analyze resumes and compare potential applicants to Amazon's current workforce, taking in 100 resumes and returning the top five candidates.

The problem was a pre-existing gender gap in software development and other technical roles. When the tool analyzed patterns in Amazon's hiring over the prior 10 years, it taught itself to favor male candidates over female ones, and Amazon ultimately abandoned it.

Amazon's experience highlights an important limitation of machine-learning tools: they are only as good as the data they are given. While artificial intelligence can screen potential job candidates quickly and efficiently, such algorithms can inadvertently reinforce discrimination in hiring practices. In Amazon's case, applicants for technical jobs were more likely to be male than female, and the algorithm mistakenly interpreted this gender gap as a hiring preference. Thus, instead of highlighting qualified women, the algorithm screened those candidates out.
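To make this failure mode concrete, here is a minimal, hypothetical sketch in Python. It is in no way Amazon's actual system; the features, data, and numbers are all invented for illustration. It shows how a model trained on historically biased hiring decisions absorbs that bias, even though nothing ever tells it to prefer men:

```python
# Hypothetical illustration only -- not Amazon's actual system.
# A model trained on historically biased hiring labels learns the bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000
skill = rng.normal(size=n)                 # genuinely job-relevant signal
is_female = rng.integers(0, 2, size=n)     # protected attribute (1 = female)

# Simulated historical labels: past recruiters rewarded skill but, at the
# same skill level, hired women less often -- the pre-existing gender gap.
hired = (skill - 0.8 * is_female + rng.normal(scale=0.5, size=n)) > 0.5

model = LogisticRegression().fit(np.column_stack([skill, is_female]), hired)

# The learned coefficient on the gender feature is strongly negative:
# the tool has "taught itself" to penalize female applicants.
print(dict(zip(["skill", "is_female"], model.coef_[0].round(2))))
```

The model was never instructed to prefer men; it simply reproduced the pattern baked into its training data.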

Employers today have a panoply of tech-based tools at their disposal. Websites like Monster.com and Indeed.com advertise job openings and generate large numbers of applicants, and employers are turning to automated tools to reduce the time and cost of hiring. Such tools, however, are designed to mimic human decision-making, so when a tool relies on data that is inaccurate or biased, it can inadvertently discriminate against women or minorities. Studies have found that these tools can also discriminate in subtler ways. For example, an employer attempting to maximize employee tenure found that those who lived closer to work tended to stay longer; screening applicants by how close they lived to work, however, disproportionately screened out certain minority candidates.
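The commute-distance example can be sketched the same way. The numbers below are invented; the point is that a facially neutral screen on a feature correlated with a protected class reproduces the disparity even though race never appears as a model input:

```python
# Hypothetical illustration of proxy discrimination: race is never used as
# an input, yet a "neutral" commute-distance screen produces a disparity.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
minority = rng.integers(0, 2, size=n).astype(bool)

# Invented assumption for illustration: residential patterns mean minority
# applicants live farther from the worksite on average.
miles_to_work = rng.normal(loc=np.where(minority, 14.0, 8.0), scale=4.0)

passed_screen = miles_to_work < 10.0       # screen: prefer short commutes

for label, mask in [("minority", minority), ("non-minority", ~minority)]:
    print(f"{label}: selection rate {passed_screen[mask].mean():.0%}")
```

Under these assumed numbers, the "neutral" screen passes roughly two-thirds of non-minority applicants but only about one in six minority applicants.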

Under Title VII of the Civil Rights Act of 1964 and analogous state and local laws, the employer is responsible for ensuring that it screens job applicants in a nondiscriminatory manner. Therefore, if you are using or considering a tech-based tool to help screen job applicants, you should take steps to ensure that the tool is not disproportionately screening out candidates based on gender, race, or other protected classes. Simply telling a tech-based tool not to discriminate against minorities or women may be insufficient, because such tools attempt to identify candidates who reflect your existing hiring practices. Some helpful tips to consider when using tech-based hiring tools:

  1. Do not rely exclusively on tech-based hiring tools. Most tools rank candidates; employers should also review lower-ranked candidates and make independent assessments based on nondiscriminatory criteria.
  2. Consistently review and update data provided to your hiring tool. Make sure the data your hiring tool relies on does not reflect discriminatory hiring practices.
  3. Independently audit the results and rankings generated by the hiring tool and make appropriate adjustments as necessary (one simple audit heuristic is sketched below).
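
For the audit in tip 3, one common first-pass heuristic is the "four-fifths" (80%) rule from the EEOC's Uniform Guidelines on Employee Selection Procedures: if any group's selection rate falls below four-fifths of the highest group's rate, the tool's output warrants a closer look. Here is a minimal sketch using invented sample data; passing this check is a screening step, not a legal safe harbor:

```python
# First-pass audit sketch using the EEOC's four-fifths (80%) rule of thumb.
from collections import Counter

def four_fifths_check(outcomes):
    """outcomes: iterable of (group, selected) pairs from the hiring tool."""
    totals, selected = Counter(), Counter()
    for group, was_selected in outcomes:
        totals[group] += 1
        selected[group] += was_selected
    rates = {g: selected[g] / totals[g] for g in totals}
    top_rate = max(rates.values())
    # Flag any group whose selection rate is below 80% of the best rate.
    return {g: {"rate": round(r, 2), "flagged": r < 0.8 * top_rate}
            for g, r in rates.items()}

# Hypothetical tool output: 25 of 50 men advanced, 10 of 50 women advanced.
sample = ([("men", True)] * 25 + [("men", False)] * 25
          + [("women", True)] * 10 + [("women", False)] * 40)
print(four_fifths_check(sample))
# women's rate (0.20) is below 80% of men's (0.50) -> flagged for review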

Over time, these tech-based hiring tools will likely improve and, hopefully, screen applicants free of any discriminatory bias. Until then, employers should take steps to make sure that members of protected classes are not disproportionately screened out by tech-based hiring algorithms.
