California’s Privacy Regulator Had a Busy November, Automated Decisionmaking Edition: What Does It Mean for Businesses?
Tuesday, December 10, 2024

In the second installment of our series on new CCPA regulations out of California, we look at the proposed rules for the use of automated decisionmaking technology. As a reminder, the CCPA addresses these technologies in connection with profiling, defined as “any form of automated processing of personal information” used to analyze or predict, among other things, a person’s work performance, health, or personal preferences.

The law called on the California Privacy Protection Agency (CPPA) to promulgate rules giving consumers the ability to opt out of the use of these technologies and to access information about how the tools are used when making decisions about them. The first set of proposed rules was met with some concern, some of which has been addressed in this newest version. Highlights of the changes are below:

  • Narrowing the definition of “automated decisionmaking technology”: The law does not define this term, and in 2023 the agency had proposed defining it broadly as any system that “in whole or in part” facilitates human decisionmaking. The term has now been narrowed to technology that either replaces humans or substantially facilitates their decisionmaking, meaning that it is a “key factor” in the human’s decision. The rule gives an example: using a tool’s score as the primary factor in making a significant decision about someone.
  • Automated decisionmaking and risk assessments: As part of the new rules for risk assessments, the agency has included specific provisions on profiling. First, companies would need to conduct risk assessments themselves. Second, the proposed rules impose obligations on entities that make automated decisionmaking or AI technologies available to others if the technology is trained on personal information. In those cases, the company would need to give the other entities the information they need to conduct their own risk assessments, and that information would need to be provided in “plain language.”
  • Automated decisionmaking that results in a “significant decision”: If the technology will be used to make a “significant decision,” the rules contemplate a “pre-use” notice. The 2023 version of the rules also contemplated such a notice, but there the obligation arose if there was a “legal or similarly significant” impact (the language of the CCPA). Under the newly proposed rules, the agency instead refers to “significant decisions” impacting an individual, giving examples that include education and employment opportunities. Also covered are extensive profiling and the training of automated decisionmaking technology that might, among other things, identify someone or make a significant decision about them.
  • Changes to company privacy policies: The rules as revised would require companies to state in their privacy policies (in the rights section) that individuals can opt out of having their information used by automated decisionmaking technology that results in a “significant decision.” The policy would also need to explain how individuals can access information about the business’s use of automated decisionmaking.

Putting It Into Practice: The California privacy agency has addressed some of the concerns raised about the initial automated decisionmaking rules. However, the obligations remain expansive and may impact many organizations’ uses of AI tools, especially in the HR space. That said, the obligations outlined in the rules should look familiar to those who already fall under NYC’s AI law.

James O'Reilly also contributed to this article.
