On April 6, 2023, the New York City Department of Consumer and Worker Protection (DCWP) adopted highly anticipated final rules implementing the city’s law regulating the use of automated employment decision tools (AEDTs) in hiring, which will take effect on July 5, 2023.
The AEDT law, which took effect on January 1, 2023, restricts the use of automated employment decision tools and artificial intelligence (AI) by employers and employment agencies, requiring that such tools be subjected to bias audits and that employers and employment agencies notify employees and job candidates when such tools are being used to evaluate them.
The final rules come after the DCWP first proposed rules in September 2022, which it later revised in December 2022 after a public hearing. The final rules include a number of changes to earlier versions, including expanding the scope of “machine learning, statistical modeling, data analytics, or artificial intelligence,” modifying bias audit standards, and clarifying information that must be disclosed. Here are some key points from the new rules.
Automated Employment Decision Tools
The law defines an AEDT as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation” that is used to “substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” Maintaining the approach adopted in the December 2022 revised proposed rules, the final rules provide that the phrase “to substantially assist or replace discretionary decision making” refers to:
- relying “solely on a simplified output (score, tag, classification, ranking, etc.), with no other factors considered;” or
- using a simplified output as “one of a set of criteria” where it is weighed more than others in the set; or
- using a simplified output to “overrule” other conclusions based on other factors, including “human decision-making.”
On the other hand, the final rules alter the definition of “machine learning, statistical modeling, data analytics, or artificial intelligence” proposed in the earlier versions of the rules, and provide that the term means “a group of mathematical, computer-based techniques” that (i) “generate a prediction, meaning an expected outcome” or “generate a classification, meaning an assignment of an observation to a group,” and (ii) “for which a computer at least in part identifies the inputs, the relative importance placed on those inputs, and, if applicable, other parameters for the models in order to improve the accuracy of the prediction or classification.” This definition omits the requirement, included in the earlier versions of the proposed rules, that the techniques be ones “for which the inputs and parameters are refined through cross-validation or by using training and testing data.”
Bias Audits
Under the AEDT law, before employers or employment agencies may use AEDTs, the tools must be subjected to “a bias audit conducted no more than one year prior to the use of such tool.” A “bias audit” is defined as “an impartial evaluation by an independent auditor” to assess the tool’s potential “disparate impact” on sex, race, and ethnicity. The employer or employment agency must also post a “summary of the results of the most recent bias audit” on its website.
The final rules clarify the requisite calculations for a bias audit; a simplified illustration of these calculations appears after the list below. Where an AEDT is used to select candidates for hiring or promotion to move forward in the hiring process, or to classify them into groups, “a bias audit must, at a minimum”:
- “Calculate the selection rate for each category”;
- “Calculate the impact ratio for each category”;
- Separately calculate the impact on: (i) “[s]ex categories”; (ii) “[r]ace/[e]thnicity categories”; and (iii) “intersectional categories of sex, ethnicity, and race (e.g., impact ratio for selection of Hispanic or Latino male candidates vs. Not Hispanic or Latino Black or African American female candidates).”
- Ensure that all the calculations are “performed for each group, if an AEDT classifies candidates for employment or employees being considered for promotion into specified groups (e.g., leadership styles)”; and
- “Indicate the number of individuals the AEDT assessed that are not included in the required calculations because they fall within an unknown category.”
The final component represents an additional requirement that was not expressly addressed in the prior versions of the rules.
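The rules prescribe what a bias audit must calculate, but not the mechanics of the calculation. The sketch below is a minimal, hypothetical illustration of how an auditor might carry out these calculations; the applicant data and category labels are invented, and it assumes the standard formulations of selection rate (the number of individuals selected in a category divided by the total number in that category) and impact ratio (a category’s selection rate divided by the selection rate of the most-selected category), which should be confirmed against the final rules.

```python
# Hypothetical illustration of the selection-rate and impact-ratio
# calculations described above. The data, category labels, and formula
# details are assumptions for the example, not taken from the final rules.
from collections import Counter

# Each record: (sex, race/ethnicity, selected to move forward?).
# None marks an applicant whose demographic category is unknown.
applicants = [
    ("Male", "Hispanic or Latino", True),
    ("Female", "Black or African American", False),
    ("Female", "White", True),
    ("Male", "White", True),
    (None, None, False),  # unknown category, counted separately below
]

def audit(records, key):
    """Selection rate and impact ratio for each category produced by `key`."""
    totals, selected = Counter(), Counter()
    unknown = 0
    for record in records:
        category = key(record)
        if category is None:
            unknown += 1  # reported separately, not folded into a category
            continue
        totals[category] += 1
        if record[2]:
            selected[category] += 1

    selection_rates = {c: selected[c] / totals[c] for c in totals}
    highest = max(selection_rates.values())
    impact_ratios = {c: rate / highest for c, rate in selection_rates.items()}
    return selection_rates, impact_ratios, unknown

# Separate calculations for sex, race/ethnicity, and intersectional categories.
by_sex = audit(applicants, key=lambda r: r[0])
by_race = audit(applicants, key=lambda r: r[1])
by_intersection = audit(
    applicants,
    key=lambda r: None if None in (r[0], r[1]) else f"{r[0]} / {r[1]}",
)
```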
In another change to the bias audit requirements from the earlier versions of the proposed rules, the final rules state that notwithstanding the requirements of paragraphs 2 and 3, detailed above (and the similar requirements for a bias audit on an AEDT that scores candidates for employment or employees being considered for promotion), “an independent auditor may exclude a category that represents less than 2% of the data being used for the bias audit from the required calculations for impact ratio.” The final rules also specify that “[w]here such a category is excluded, the summary of results must include the independent auditor’s justification for the exclusion, as well as the number of applicants and scoring rate or selection rate for the excluded category.”
Sources of Data
The final rules incorporate provisions that address the use of historical data and test data. The provisions relating to the use of historical data are largely unchanged. According to the final rules, multiple employers or employment agencies using the same AEDT may rely on the same bias audit conducted using historical data of other employers or employment agencies only if the employer or employment agency “provided historical data from its own use of the AEDT to the independent auditor conducting the bias audit or if such employer or employment agency has never used the AEDT.”
The final rules relating to the use of test data are more explicit about the limited circumstances in which an employer or employment agency may utilize test data, and specify that the bias audit may rely upon “test data if insufficient historical data is available to conduct a statistically significant bias audit.” The final rules maintain the requirement that the summary of results for a bias audit that uses test data “must explain why historical data was not used and describe how the test data used was generated and obtained.”
Characteristics of an Independent Auditor
The final rules end any lingering uncertainty about which individuals or entities may perform the bias audit required by the law by retaining the definition of an independent auditor contained in the December 2022 proposed rules. As such, the final rules provide that an “[i]ndependent auditor” means “a person or group that is capable of exercising objective and impartial judgment on all issues within the scope of a bias audit of an AEDT.” The final rules identify three disqualifying characteristics, namely a person or group that:
- “is or was involved in using, developing, or distributing the AEDT;
- at any point during the bias audit, has an employment relationship with an employer or employment agency that seeks to use or continue to use the AEDT or with a vendor that developed or distributes the AEDT; or
- at any point during the bias audit, has a direct financial interest or a material indirect financial interest in an employer or employment agency that seeks to use or continue to use the AEDT or in a vendor that developed or distributed the AEDT.”
Bias Audit Summary Results
Before using an AEDT, employers and employment agencies must publicly disclose the date of the most recent bias audit of the AEDT and a “summary of the results.” The final rules expand the December 2022 list of information that must be included in the summary, and specify that it must include:
- “the source and explanation of the data used to conduct the bias audit”;
- “the number of individuals the AEDT assessed that fall within an unknown category”;
- “the number of applicants or candidates, the selection or scoring rates, as applicable, and the impact ratios for all categories”; and
- “[t]he distribution date of the AEDT.”
The final version of the rules continues to specify that the notice requirements may be met “with an active hyperlink to a website” that must be “clearly identified as a link to the results of the bias audit.” Additionally, the employer or employment agency must keep the summary posted for “at least [six] months after its latest use of the AEDT for an employment decision.”
The final rules also specify the required notices to candidates and employees. These provisions are unchanged from the December 2022 proposed rules, and specify that notice to candidates may be provided via the website, in a job posting, or by mail “at least 10 business days before use of an AEDT.” Notice to employees being considered for promotion may be provided in a policy or procedure that is distributed “at least 10 business days before use of an AEDT.”
Key Takeaways
Employers are increasingly relying on AEDTs and AI systems to make hiring decisions or screen candidates, which can increase efficiency and improve results. New York City is one of several jurisdictions to put guardrails around this emerging technology amid concerns about bias. The final rules newly adopted by the New York City DCWP provide further guidance and clarification on the city’s new restrictions.
Employers and employment agencies in New York City may want to consider reviewing their use of automated decision-making tools or AI in making hiring and promotion decisions. If such tools are being used or are planned, employers may want to confirm that the tools have been subjected to the required bias audits.