EEOC Issues Nonbinding Guidance on Permissible Employer Use of Artificial Intelligence to Avoid Adverse Impact Liability Under Title VII
Wednesday, May 31, 2023

US Labor, Employment, and Workplace Safety Alert

On 18 May 2023, the US Equal Employment Opportunity Commission (EEOC) issued nonbinding guidance on how existing federal anti-discrimination law may apply to employers’ use of artificial intelligence (AI) when hiring, firing, or promoting employees (the EEOC AI Disparate Impact Guidance). The EEOC AI Disparate Impact Guidance aims to help employers that use AI in such “selection procedures” avoid disparately or adversely impacting groups protected under Title VII and incurring related liability.1

The EEOC AI Disparate Impact Guidance is notably limited in scope. It does not address the separate prohibition under Title VII and other federal employment laws on intentional discrimination against employees based on protected characteristics, including race, color, religion, sex (including pregnancy, sexual orientation, and gender identity), or national origin, nor does it address validation of a selection procedure in the event adverse impact is found.

Nevertheless, as discussed further below, employers currently using, or intending to use, AI—especially AI created by third-party vendors—as part of their employment selection procedures should consider the EEOC AI Disparate Impact Guidance before implementing such procedures to avoid potential disparate impact liability under Title VII.

TITLE VII DISPARATE IMPACT

Title VII prohibits “disparate” or “adverse” impact discrimination. This prohibition precludes employers from using facially neutral selection procedures or tests that have the effect of disproportionately excluding persons protected by Title VII if the tests or selection procedures are not “job related for the position in question and consistent with business necessity.”2 There is a three-pronged test for analyzing disparate impact:

  1. Does an employer use a particular employment practice that has a disparate impact on a group protected by Title VII?
  2. If there is a disparate impact, are the selection procedures or tests job related and consistent with business necessity?
  3. Even if the selection procedures or tests are job related and consistent with business necessity, is there a less discriminatory alternative available?3

The EEOC AI Disparate Impact Guidance relates only to item (1).

The long-standing Uniform Guidelines on Employee Selection Procedures (the “Guidelines”) under Title VII, which have been in place since 1978, provide EEOC guidance to employers on how to determine whether their selection procedures or tests are lawful under a Title VII disparate impact analysis.4

THE EEOC AI DISPARATE IMPACT GUIDANCE

The EEOC AI Disparate Impact Guidance begins with a refresher on key provisions in the Guidelines.

“Selection rate” is the proportion of applicants or candidates who are hired, promoted, or otherwise selected;5 the selection rate for a particular group is calculated by dividing the number of persons hired, promoted, or otherwise selected from that group by the total number of persons in that group.6 By way of the EEOC’s example, if 80 white persons and 40 Black persons applied for a position and an employer’s use of an AI-powered employment selection tool selected 48 white applicants and 12 Black applicants, the selection rate for white persons and Black persons would be 60% (48/80) and 30% (12/40), respectively.

The Guidelines also set forth a “four-fifths rule,” a general rule of thumb providing that the selection rate for a particular group will be “substantially” different from that of another group if the ratio of the two groups’ selection rates is less than 80%.7 In the example above, the ratio of the selection rate for Black applicants to the selection rate for white applicants is 50% (30%/60%). Because that ratio is less than 80%, the four-fifths rule indicates that the selection rate for Black applicants is substantially different from the selection rate for white applicants, which is possible evidence of discrimination against Black applicants.
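
For readers who want to reproduce this arithmetic, the following short Python sketch is purely illustrative; it is not part of the EEOC AI Disparate Impact Guidance, and the group labels and applicant counts are simply the hypothetical figures from the EEOC’s example above.

# Illustrative calculation of selection rates and the four-fifths ratio,
# using the hypothetical applicant counts from the EEOC's example above.

applicants = {"white": 80, "Black": 40}   # total applicants per group
selected = {"white": 48, "Black": 12}     # applicants selected per group

# Selection rate for each group: selected / total applicants.
rates = {group: selected[group] / applicants[group] for group in applicants}

# Compare the lowest selection rate to the highest one.
low_group = min(rates, key=rates.get)
high_group = max(rates, key=rates.get)
ratio = rates[low_group] / rates[high_group]

print(f"Selection rates: {rates}")                # white 0.60, Black 0.30
print(f"Ratio of selection rates: {ratio:.0%}")   # 50%

# Under the Guidelines' rule of thumb, a ratio below 80% suggests a
# substantially different selection rate, i.e., possible adverse impact.
if ratio < 0.8:
    print(f"Possible adverse impact on {low_group} applicants under the four-fifths rule.")

In practice, any such calculation would be run on an employer’s actual applicant-flow data rather than these hypothetical counts.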

Turning to how AI-powered employment selection tools may intersect with Title VII’s prohibition on disparate impact discrimination, the EEOC issued the following recommendations for employers in the “Questions and Answers” section in the EEOC AI Disparate Impact Guidance:8

  • AI-powered tools used for making or informing decisions about hiring, promoting, terminating, or taking similar actions toward applicants or current employees would be subject to the Guidelines as a “selection procedure.” The Guidelines define “selection procedure” as any “measure, combination of measures, or procedure” used as a basis for making an employment decision.9
  • An employer can, and should, assess whether the use of an AI-powered employment selection tool has an adverse impact on a particular protected group. The use of the AI-powered employment selection tool will have an adverse impact on a particular protected group where its results cause a selection rate for individuals in that protected group that is substantially less than the selection rate for individuals in another group.10 If so, then using the AI-powered employment selection tool will violate Title VII unless the employer can show that its use is “job related and consistent with business necessity.”11
  • An employer can be liable for the disparate impact caused by an AI-powered employment selection tool, even where the tool is designed or administered by a third party. If an employer administers the use of an AI-powered employment selection tool that causes a disparate impact, the employer may be liable under Title VII even if the tool was developed by an outside vendor. Additionally, an employer may be liable under Title VII where it relies on the results of an AI-powered employment selection tool administered by a third party, because an employer may be liable under Title VII for the actions of its agents if the employer gave them the authority to act on the employer’s behalf.12 As a precaution, an employer should, at a minimum, ask the third party, whether it is the developer or the administrator of the AI-powered employment selection tool, if it has evaluated whether use of the tool causes a substantially lower selection rate for individuals in groups protected by Title VII. If the third party states that the tool should be expected to cause such a disparate impact, the employer should consider whether using the tool is job related and consistent with business necessity and whether there are alternatives that may meet the employer’s needs with less of a disparate impact. An employer may even be liable where it relies on a third-party developer’s or administrator’s own incorrect assessment of whether the tool results in a disparate impact.
  • “The four-fifths rule is merely a rule of thumb” and “may be inappropriate under certain circumstances.”13 For instance, smaller differences in selection rates may still indicate adverse impact where the procedure is used to make a large number of decisions14 or where there is evidence that an employer’s actions disproportionately discouraged individuals from applying on the basis of a characteristic protected under Title VII.15 The four-fifths rule also does not automatically supersede the results of a test of statistical significance.16 Nor is the EEOC bound to rely on the four-fifths rule when making a determination on a charge alleging employment discrimination.17 An employer should ask a third-party developer or administrator of any AI-powered employment selection tool whether it relied on the four-fifths rule or another test of statistical significance in determining whether use of the tool will have a disparate impact on a group protected by Title VII (a simple illustration of such a statistical test appears after this list).
  • Where an employer discovers that an AI-powered employment selection tool would have an adverse impact on one or more groups of individuals protected by Title VII, it should take steps to reduce the impact or select a different employment selection tool. Employers should also conduct self-analyses on an ongoing basis to ensure their AI-powered employment selection tools are not causing a disparate impact.
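
As noted in the list above, the four-fifths rule does not displace formal tests of statistical significance. The sketch below is a minimal illustration, assuming a standard two-proportion z-test applied to the same hypothetical figures from the EEOC’s example; it is not a method prescribed by the EEOC or the Guidelines, and the function name and the conventional 0.05 threshold referenced in the comments are assumptions for demonstration only.

import math

def two_proportion_z_test(selected_a, total_a, selected_b, total_b):
    # Standard two-sided z-test for a difference between two selection rates.
    # Illustrative only; not a method prescribed by the EEOC or the Guidelines.
    rate_a = selected_a / total_a
    rate_b = selected_b / total_b
    # Pooled rate under the null hypothesis that the two rates are equal.
    pooled = (selected_a + selected_b) / (total_a + total_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / total_a + 1 / total_b))
    z = (rate_a - rate_b) / std_err
    # Two-sided p-value from the standard normal distribution.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Same hypothetical figures as in the four-fifths example above.
z, p = two_proportion_z_test(48, 80, 12, 40)
print(f"z = {z:.2f}, two-sided p-value = {p:.4f}")
# A small p-value (for example, below the conventional 0.05 threshold) suggests the
# observed difference in selection rates is unlikely to be the product of chance alone.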

CONCLUSION

The EEOC AI Disparate Impact Guidance comes at a potentially revolutionary moment for AI, given its myriad applications in resume scanners, monitoring software, chatbots, and video interviewing software, among other tools.

Employers and their counsel should read the EEOC AI Disparate Impact Guidance alongside the Biden administration’s “Blueprint for an AI Bill of Rights,” which is concerned with how AI and the software underlying it may reinforce existing biases in employment, housing, education, and other critical areas.

The EEOC AI Disparate Impact Guidance also follows the EEOC’s April 2023 joint statement with the US Department of Justice, Federal Trade Commission, and Consumer Financial Protection Bureau foretelling enforcement efforts to come: “We also pledge to vigorously use our collective authorities to protect individuals’ rights regardless of whether legal violations occur through traditional means or advanced technologies.”

Lastly, in addition to ongoing federal law developments related to AI, employers and their counsel should continue tracking state legislation, which may further complicate employer use of AI when making key employment decisions.


1 In May 2022, the EEOC weighed in on how employers may use AI when assessing job applicants without violating the Americans with Disabilities Act. There, as with this latest guidance, the EEOC expressed concern about how cutting-edge AI technology could adversely affect classes protected under federal employment law, including workers with disabilities, and offered tips on what applicants can do if they believe an employer’s AI screened them out because of their disability.

2 42 U.S.C. § 2000e-2(a)(2), (k).

3 See id. § 2000e-2(k); Griggs v. Duke Power Co., 401 U.S. 424 (1971).

4 See 29 C.F.R. pt. 1607.

5 Id. § 1607.16(R).

6 See EEOC, Questions and Answers to Clarify and Provide a Common Interpretation of the Uniform Guidelines on Employee Selection Procedures, Q&A 12 (Mar. 1, 1979) [hereinafter Questions and Answers], https://www.eeoc.gov/laws/guidance/questions-and-answers-clarify-and-provide-common-interpretation-uniform-guidelines.

7 29 C.F.R. §§ 1607.4(D), 1607.16(B).

8 See EEOC AI Disparate Impact Guidance, Questions and Answers Section.

9 29 C.F.R. § 1607.16(Q).

10 Id. § 1607.16(B).

11 42 U.S.C. § 2000e-2(k)(1); 29 C.F.R. § 1607.3(A).

12 EEOC, Compliance Manual Section 2 Threshold Issues § 2-III.B.2 (May 12, 2000), https://www.eeoc.gov/laws/guidance/section-2-threshold-issues#2-III-B-2.

13 See EEOC AI Disparate Impact Guidance, Questions and Answers Section, Question 5.

14 Questions and Answers, supra note 6, Q&A 22; see also 29 C.F.R. § 1607.4(D).

15 29 C.F.R. § 1607.4(D); see also Uniform Guidelines on Employee Selection Procedures, 43 Fed. Reg. 38,290, 38,291 (“[A]n employer’s reputation may have discouraged or ‘chilled’ applicants of particular groups from applying because they believed application would be futile. The application of the ‘4/5ths’ rule in that situation would allow an employer to evade scrutiny because of its own discrimination.”).

16 See, e.g., Isabel v. City of Memphis, 404 F.3d 404, 412 (6th Cir. 2005) (rejecting the argument that “a test’s compliance with the four-fifths rule definitively establishes the absence of adverse impact”); Jones v. City of Boston, 752 F.3d 38, 46–54 (1st Cir. 2014) (rejecting the use of the four-fifths rule to evaluate a test with a large sample size); Howe v. City of Akron, 801 F.3d 718, 743 (6th Cir. 2015) (“[The Sixth Circuit] ha[s] used the four-fifths rule as the starting point to determine whether plaintiffs alleging disparate impact have met their prima facie burden, although we have used other statistical tests as well.”); Questions and Answers, supra note 6, Q&A 20, 22.

17 29 C.F.R. § 1607.16(I).
