EEOC Issues New Workplace Artificial Intelligence Technical Assistance
Sunday, May 21, 2023

Since late October 2021, when the Equal Employment Opportunity Commission (EEOC) launched its Initiative on Artificial Intelligence (AI) and Algorithmic Fairness, the agency has taken several steps to ensure AI and other emerging tools used in hiring and other employment decisions comply with federal civil rights laws that the agency enforces. Among other things, the EEOC has hosted disability-focused listening and educational sessions, published technical assistance regarding the Americans with Disabilities Act (ADA) and the use of AI and other technologies, and held a public hearing to examine the use of automated systems in employment decisions.

Consistent with its initiative and its Draft Strategic Enforcement Plan for 2023-2027, on May 18, 2023, the EEOC issued new “technical assistance” entitled Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964. Publication of the new technical assistance comes on the heels of a joint federal agency declaration on AI and arguably signals that the agency is building a regulatory and enforcement framework in response to the accelerating use of disruptive technologies in the workplace.

The technical assistance answers some initial questions that developers and employers may have about how Title VII of the Civil Rights Act of 1964, as amended, applies to newer “algorithmic decision-making tools” (ADTs) used in the workplace. It is also intended to help employers determine whether such tools, when used in selection decisions such as hiring, promotion, and termination, may have an adverse or disproportionately negative impact on categories or classes protected by Title VII, the Age Discrimination in Employment Act (ADEA), and the ADA.

Purposefully “limiting” the technical assistance’s scope to “selection procedures,” the EEOC does not address other aspects of a disparate impact analysis (e.g., whether a selection tool constitutes a valid measure of a job-related trait), nor does it cover the application of ADTs to other employment practices of a covered employer. The EEOC also does not discuss potential “disparate treatment,” or intentional discrimination, arising from the use of ADTs.

The technical assistance defines central terms used in the guidance, including “software,” “algorithm,” and “artificial intelligence,” aligning those definitions with ones used by various government organizations and in federal legislation. It also identifies examples of ADTs, including resume-screening software, virtual assistants and chatbots that interview and evaluate candidates for employment, and testing software deployed for personality, aptitude, or cognitive-skill testing.

To assist employers in determining whether tests and selection procedures utilizing ADTs have an adverse impact on a protected class or category, the technical assistance relies on the Uniform Guidelines on Employee Selection Procedures (the “Guidelines”), a set of guidelines issued over four decades ago to determine adverse impact. The technical assistance confirms that the Guidelines apply equally to selection tools that constitute ADTs.

According to the Guidelines, if a selection tool produces a selection rate for individuals in a protected group or category that is substantially lower than the rate for the group with the highest selection rate (less than four-fifths, or 80%, of that rate, the “Four-Fifths Rule of Thumb”), a preliminary finding of adverse impact is likely, and the employer must examine the ADT to determine whether it in fact has an adverse impact. If it does, the employer must show either that the use of the ADT is job related and consistent with business necessity under Title VII or that the preliminary Four-Fifths Rule assessment was in error. As the technical assistance notes, employers must exercise caution when undertaking such adverse impact assessments: the Guidelines analysis might not be appropriate in all circumstances (for example, where the sample size is too small to be statistically meaningful), and a court might apply a different test to determine disparate impact, which could result in a finding that Title VII was violated.
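To make the arithmetic concrete, the short Python sketch below applies the Four-Fifths Rule of Thumb to hypothetical selection data. The group names, applicant counts, and hire counts are illustrative assumptions, not figures from the EEOC's technical assistance, and the sketch is not a substitute for a full adverse impact analysis.

# A simplified, hypothetical illustration of the Four-Fifths Rule of Thumb
# described above. The groups and counts are invented for illustration only.

def selection_rate(hired: int, applied: int) -> float:
    """Selection rate: number selected divided by number who applied."""
    return hired / applied

def four_fifths_flags(rates: dict[str, float]) -> dict[str, bool]:
    """Flag any group whose selection rate is less than 80% of the
    highest group's rate, a preliminary indicator of adverse impact."""
    highest = max(rates.values())
    return {group: (rate / highest) < 0.8 for group, rate in rates.items()}

# Hypothetical example: 48 of 80 applicants selected in Group A,
# 12 of 40 applicants selected in Group B.
rates = {
    "Group A": selection_rate(48, 80),  # 0.60
    "Group B": selection_rate(12, 40),  # 0.30
}

print(rates)                    # {'Group A': 0.6, 'Group B': 0.3}
print(four_fifths_flags(rates)) # {'Group A': False, 'Group B': True}
# Group B's rate is 0.30 / 0.60 = 50% of Group A's rate, below the 80%
# threshold, so a preliminary finding of adverse impact would be indicated.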

According to the EEOC, an employer’s reliance on a vendor to develop or administer an ADT underscores the importance of conducting an adverse impact analysis. In such circumstances, the employer must determine whether an adverse impact analysis has been conducted and, if so, what kind, and whether any such analysis suggests the tool results in a disparate selection impact on a protected class or category. If the adverse impact assessment suggests or determines that the selection procedure may violate Title VII, the employer likely must explore alternative selection methods or adjustments to the ADT itself.
