On November 8, 2024, the California Privacy Protection Agency (CPPA) voted 4-1 to proceed with formal rulemaking regarding automated decision-making technology (“ADMT”), which the draft regulations define as “any technology that processes personal information and uses computation to execute a decision, replace human decisionmaking, or substantially facilitate human decisionmaking.” If enacted, the regulations would impose sweeping requirements on employers who rely on assistance from artificial intelligence (AI) tools in making employment decisions, including hiring; allocation of work; compensation and benefits; promotion; and demotion, suspension, or termination. The draft regulations take an approach similar to laws already enacted in New York and laws likely to be enacted in Colorado, in that they require certain disclosures and risk assessments and require that employees and applicants have the ability to opt out of being evaluated by AI in some contexts.
The key elements of the draft regulations include the following:
- Pre-Use Notice – Employers that use ADMT for significant employment decisions must inform “consumers”—which includes not only employees, but also independent contractors and job applicants—about the employer’s use of ADMT, the consumer’s right to opt out of ADMT, and the consumer’s right to access ADMT, before the employer processes any personal information.
- Bias Review – Employers that use physical or biological identification or profiling for a significant employment decision must conduct a bias review to ensure that the software does not discriminate based upon protected classes. It is unclear from the draft regulations whether such a review would need to be unique to the particular employer. In New York City, for example, employers may in some contexts rely on bias audits that use data on other entities’ employees (e.g., if the employer has never used the tool before or has shared its data with the auditor for inclusion in the audit).
- Opt-Out – Employers must provide consumers the ability to opt out of the use of ADMT and request qualified human review of the significant employment decision. However, employers may deny an opt-out request for decisions regarding hiring, allocation of work, and compensation if the ADMT has “accuracy and nondiscrimination safeguards,” meaning the employer has evaluated the ADMT and implemented policies, procedures, and training to ensure it works as intended for the business’s proposed use and does not discriminate based upon protected classes.
- Cybersecurity Audits – Employers that use ADMT for significant employment decisions must complete an annual cybersecurity audit to ensure that data protection measures are current and that any gaps or weaknesses in the business’s cybersecurity program are promptly addressed to safeguard personal data. Businesses must use a qualified, objective, independent auditor, which may be internal or external to the business.
- Risk Assessments – Employers that use ADMT for significant employment decisions must complete an annual risk assessment before initiating the processing of personal information, to determine whether the risks to consumers’ privacy from the processing outweigh the benefits to the consumer, the business, other stakeholders, and the public from that same processing. Employers must submit the annual risk assessment to the CPPA. Among other requirements, the risk assessment must identify the purpose for processing the personal information (generic terms such as “to improve our services” do not suffice); the method for processing the personal information, including the retention period and technology used; the negative impacts to privacy; and safeguards to address any negative impacts.
Now that the CPPA has published its notice of proposed rulemaking, the public comment period has begun. The CPPA has requested that the standard 45-day public comment period be extended due to the holidays. Thus, comments will likely be due in early 2025, though no specific deadline has been set.