The rulemaking process on California’s Proposed “Regulations on CCPA Updates, Cybersecurity Audits, Risk Assessments, Automated Decisionmaking Technology, and Insurance Companies” (2025 CCPA Regulations) has been ongoing since November 2024. With the one-year statutory deadline to complete the rulemaking, or be forced to start anew, on the horizon, the California Privacy Protection Agency (CPPA) voted unanimously on May 1 to advance a revised set of draft regulations to public comment, which began May 9 and closes at 5 pm Pacific on June 2, 2025. The revisions cut back on the regulation of Automated Decision-making Technology (ADMT), eliminate the regulation of artificial intelligence (AI), address potential Constitutional deficiencies in the risk assessment requirements, and somewhat ease cybersecurity audit obligations. The CPPA projects that this substantially revised draft will save California businesses approximately $2.25 billion in the first year of implementation, a 64% savings from the projected cost of the prior draft.
ADMT: The most notable changes relate to ADMT and are said to result in an 83% cost savings for businesses compared to the prior draft. “Cut to the bone” is the way Chair Jennifer Urban characterized them, which is welcome news to many, likely including California Governor Gavin Newsom, who had sent the CPPA a letter stating that he was “pleased to learn about the Board’s decision, at its April 4, 2025 meeting, to direct Agency staff to narrow the scope of the ADMT regulations.” The revised ADMT regulations no longer address “artificial intelligence” at all and include the following revisions (among others):
- Deleting the definition of “extensive profiling” (behavioral advertising or monitoring of employees, students, or publicly available spaces) and shifting the focus of the transparency and choice obligations to uses of ADMT to make a significant decision about consumers. However, risk assessments would still be required for profiling based on systematic observation and for training ADMT to make significant decisions, to verify identity, or to conduct biological or physical profiling.
- Streamlining the definition of ADMT to “mean any technology that processes personal information and uses computation to replace … or substantially replace human decision-making [which] means a business uses the technology output to make a decision without human involvement.” Prior drafts had also covered uses that merely helped facilitate human decisions.
- Streamlining the definition of significant decisions to remove decisions regarding “access to” and limit applicability to the “provision or denial of” the following narrower types of goods and services: “financial or lending services, housing, education enrollment or opportunities, employment or independent contracting opportunities or compensation, or healthcare services,” and clarifying that use for advertising is not a significant decision.
- Obligations to evaluate the risk of error and discrimination for certain types of ADMT uses were deleted, but the general risk assessment obligations were largely kept. The requirement to implement policies, procedures, and training to ensure that certain types of ADMT work as intended and do not discriminate was removed.
- Pre-use notice obligations were streamlined.
- Opt-out rights were limited to uses of ADMT to make a significant decision.
- Businesses were given until January 1, 2027, to comply with the ADMT regulations.
Cybersecurity Audits: The draft regulations on cybersecurity audits were also pared back, though more through a phased-in implementation timeline than through changes to the substantive requirements. Here are the highlights of where the CPPA landed:
- Timing for completion of a first annual cybersecurity audit and filing an audit report with the state will depend on the size of the business:
- April 1, 2028: businesses with more than $100 million in gross revenue.
- April 1, 2029: businesses with gross revenue between $50 million and $100 million.
- April 1, 2030: businesses with gross revenue under $50 million.
- Rather than requiring the Board of Directors to review audit results and certify their sufficiency, these obligations were shifted to a member of management with responsibility for cybersecurity.
- The scope of what an audit must assess remains broad, including the sufficiency of the inventory and management of personal information and the business’s information systems, including “data maps and flows identifying where personal information is stored, and how it can be accessed” as well as hardware and software (including cookies) inventories and approval and prevention processes.
Privacy Risk Assessments: We have detailed the prior drafts of the risk assessment regulations here. The latest draft reflects not only the ADMT changes discussed above but also a more thoughtful approach to the purpose and process for conducting and documenting assessments:
- In keeping with the removal of the concepts of “extensive profiling” (public monitoring, HR/educational monitoring, and behavioral advertising) under the ADMT regulations, these concepts were also removed from the types of high-risk activities that require a risk assessment. They were replaced with “profiling a consumer through systematic observation of that consumer when they are acting in their capacity as an educational program applicant, job applicant, student, employee or independent contractor for the business” and “profiling a consumer based upon their presence in a sensitive location.” However, in the draft published for comment, these activities were more narrowly defined to include only the use of such observation “to infer or extrapolate intelligence, ability, aptitude, performance at work, economic situation, health (including mental health), personal preferences, interests, reliability, predispositions, behavior, location or movements,” but excluding “using a consumer’s personal information solely to deliver goods to, or provide transportation for, that consumer at a sensitive location.” These edits are responsive to concerns raised by Board member Mactaggart (e.g., a nurse ordering pizza delivered to work).
- The high-risk assessment trigger of training ADMT or AI was modified to remove the reference to AI and is now limited to where the business intends to use the ADMT to make a significant decision concerning a consumer, or to train facial recognition, emotion recognition, or other technology that verifies a consumer’s identity or conducts physical or biological identification or profiling of a consumer. Triggers tied to the generation of deepfakes and the operation of generative models, such as large language models, were removed.
- The other high-risk activities from prior drafts remain: selling personal information, sharing personal information (for targeted advertising), and processing sensitive personal information.
- While risk assessments must still include a harm-benefit analysis (Section 7152(a)(4) and (5)), that information, and the business judgment analysis of the pros and cons thereof, is no longer required to be included in the Risk Assessment Report (a new term) that is subject to government inspection. This will make the assessment requirements much less vulnerable to First Amendment challenge as unconstitutional compelled speech on a matter of opinion rather than mere recitation of facts, a concern publicly expressed previously by at least one CPPA Board member. This is a very significant development. [Note that while the Colorado regulations require documentation of such a risk-benefit analysis as part of assessment documentation, they also provide that assessments may be subject to legal privilege protections.]
- Similarly, the forms of assessment summaries that must be filed with the state are limited to factual recitations. The new draft abandons the prior approach of requiring the filing of abridged assessments summarizing each assessment in favor of a single attestation of annual compliance accompanied by some basic information: the number of assessments, in total and by type of covered processing activity, and the categories of personal information and/or sensitive personal information covered. This substantially reduces what must be disclosed in agency filings and again helps insulate against compelled speech challenges.
- Also likely to address Constitutional issues, Section 7154 was changed from prohibiting processing activities if risks to consumer privacy are not outweighed by identified benefits to stating that the goal of risk assessments is to restrict or prohibit processing activities if privacy risks outweigh processing benefits. This should go a long way toward protecting a business’s subjective business judgment and ethical value decisions, which should not be subject to second-guessing by the government absent violation of clear and unambiguous statutory requirements.
- While high-risk activities occurring on and after the effective date of the regulations (likely before the end of 2025) will be subject to assessment, businesses will have until December 31, 2027, to complete the documentation of the corresponding Risk Assessment Reports for activities through that date, and the first annual assessment attestation would not be due until April 1, 2028.
Amendments to Existing Regulations: Finally, the proposed amendments to the existing CCPA regulations were also revised:
What stayed in:
- If a business maintains personal information (collected after January 1, 2022) for longer than 12 months, it must enable consumers to specify a date range for their request or treat the request as without time limitation.
- A business must ensure that when a consumer corrects their personal information, it remains corrected.
- A business must inform consumers who request correction of their personal information of the source of the incorrect data, or inform the source that the information was incorrect and must be corrected.
- Symmetry of choice applies to any opt-in, not just an opt-in after opt-out.
- A website must display the status of a consumer’s opt-out choice based on Global Privacy Control (GPC) or other opt-out preference browser signals, or on an opt-out request. [Most CMPs already offer this feature as an option; an illustrative sketch follows this list.]
- A business must provide a means to confirm that a request to limit the use of sensitive personal information has been received and is being honored.
- When verifying an authorized agent’s authority and the consumer’s identity, a business may not require the consumer to resubmit the request individually.
- Consumer statements regarding contested accuracy of health data, which are already required, must be shared with recipients of that data if the consumer requests.
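For illustration of the GPC bullet above only: the following is a minimal sketch, not anything prescribed by the draft regulations, of how a site might surface a consumer’s opt-out status from the GPC browser signal or a stored opt-out request. The status element ID and storage key are hypothetical placeholders; in practice, most CMPs manage this state and display for you.

```typescript
// Minimal sketch: reflect opt-out status in the page UI based on the
// Global Privacy Control (GPC) signal or a stored opt-out request.
// The element ID and storage key below are hypothetical.

function hasOptOutPreferenceSignal(): boolean {
  // navigator.globalPrivacyControl is the GPC browser property; it is
  // undefined in browsers that do not implement the signal.
  return (navigator as any).globalPrivacyControl === true;
}

function hasStoredOptOutRequest(): boolean {
  // Hypothetical flag set when the consumer submits an opt-out request.
  return localStorage.getItem("optOutOfSaleSharing") === "true";
}

function displayOptOutStatus(): void {
  const optedOut = hasOptOutPreferenceSignal() || hasStoredOptOutRequest();
  const statusEl = document.getElementById("opt-out-status"); // hypothetical element
  if (statusEl) {
    statusEl.textContent = optedOut
      ? "You have opted out of the sale/sharing of your personal information."
      : "You have not opted out of the sale/sharing of your personal information.";
  }
}

displayOptOutStatus();
```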
What got cut:
- The requirement that businesses and service providers implement measures to ensure deleted personal information remains deleted or de-identified was removed.
- The requirement to inform consumers, as part of a request response, of their right to file a complaint with the state was removed.
- The requirement to inform recipients to which the business has disclosed personal information of a subsequent consumer correction was removed.
- The requirements to provide internal and external notice of consumer claims of inaccuracy that were not corrected (due to insufficient proof), unless the request was fraudulent or abusive, were removed.
The current draft remains subject to further revision based on the new round of public comment, which could lead to adding back or otherwise changing provisions. However, the CPPA Board members all seemed to express the opinion at the May 1 meeting that a set of regulations needed to be completed on time, and that future rulemaking could build on the foundation of the draft that has been advanced. Accordingly, it would appear that we are in the “home stretch” with the finish line in clear view.