On July 20, 2023, U.S. Senators Bob Casey (D-PA) and Brian Schatz (D-HI) introduced the “No Robot Bosses Act.” Aside from suggesting a catchy title for a dystopian science fiction novel, the bill aims to regulate the use of “automated decision systems” throughout the employment life cycle and, as such, appears broader in scope than New York City’s Local Law 144 of 2021, about which we have previously written and which New York City recently began enforcing. Although the text of the proposed legislation has not yet been widely circulated, a two-page fact sheet released by the sponsoring Senators outlines the bill’s pertinent provisions regarding an employer’s use of automated decision systems affecting employees. According to the fact sheet, the bill would:
- prohibit employers’ exclusive reliance on automated decision systems;
- require pre-deployment and periodic testing and validation of automated decision systems to prevent unlawful biases;
- require operational training;
- mandate independent, human oversight before using outputs;
- require timely disclosures of use, data inputs and outputs, and employee rights with respect to the decisions; and
- establish a regulatory agency at the U.S. Department of Labor (“DOL”) called the “Technology and Worker Protection Division.”
The bill does not define with specificity “automated systems,” nor does it define or limit the term “employment decision.” The fact sheet, however, sets forth examples of automated systems potentially subject to the “No Robot Bosses Act,” including “recruitment software, powered by machine learning algorithms,” “automated scheduling software,” and “tracking algorithms” applicable to delivery drivers. These examples suggest a broad intended application that could include other types of monitoring technology. But the fact sheet does not provide examples of the nature or scope of an “employment decision,” nor does it identify the industries or classes of employees subject to the law. Moreover, at this time, the bill is silent as to enforcement mechanisms, penalties, or fines for violations.
In addition, Senators Casey and Schatz, joined by Cory Booker (D-NJ), have introduced the “Exploitative Workplace Surveillance and Technologies Task Force Act of 2023.” Like the “No Robot Bosses Act,” the text of the bill is not yet available, but the Senators released a one-page fact sheet detailing that the proposed legislation would create a “dynamic interagency task force,” led by the DOL and the Office of Science and Technology Policy, to study a range of issues related to automated systems and workplace monitoring technology.
While these proposed bills are still in their early stages, lawmakers at the state, local, and federal levels continue to consider methods of regulating employment-related automated systems and artificial intelligence more broadly. At the same time, federal regulators and private plaintiffs are leveraging existing employment laws, including Title VII of the Civil Rights Act, in connection with employers’ use of technology that automates employment decisions. For example, the EEOC recently published a technical assistance memorandum alerting employers to the risks of using “automated decision tools” in the workplace and assisting them in mitigating those risks. Consequently, it is critical that employers, especially personnel involved in recruiting, hiring, and promotion, identify and assess potential risk in the use of AI tools in employment decision-making by:
- Understanding and documenting the systems and vendors used in making employment decisions throughout the employment life cycle;
- Assessing the need for an artificial intelligence governance framework, or other internal policies and procedures, taking into account considerations related to safety, algorithmic discrimination, data privacy, transparency, and human oversight and fallback;
- In conjunction with counsel, conducting impact assessments as to the use of automated systems; and
- Ensuring compliance with all applicable laws governing automated decision systems and artificial intelligence.