Colorado Passes Consumer Protection Law Regulating AI
Wednesday, June 19, 2024

While some states have existing laws governing certain aspects of the use of AI, on May 17, Colorado became the first state to enact comprehensive artificial intelligence (AI) legislation. Senate Bill 24-205, “Concerning Consumer Protections in Interactions with Artificial Intelligence Systems,” (the “Colorado AI Act”) will be effective February 1, 2026. The Colorado AI Act takes aim at conduct that is already illegal – unlawful discrimination in important consumer activities like lending and employment – but Colorado now builds an intricate set of obligations for companies to meet if the companies use AI in assisting these decisions.

The Colorado AI Act applies to developers and deployers of “high-risk artificial intelligence systems,” defined as AI which, “when deployed, makes, or is a substantial factor in making, a consequential decision.” The types of consequential decisions regulated here include “the provision or denial to any consumer, or cost or terms of”:

  • An education opportunity
  • An employment opportunity
  • A financial service
  • An essential government service
  • Health care
  • Housing
  • Insurance
  • Legal services

The Colorado AI Act thus applies to AI used in a decision-making context, as opposed to most current uses of generative AI. Although we could imagine a context where a chatbot or other large language model directs consumers in a manner that could be considered a consequential decision in one of the highlighted areas, the Colorado AI Act would be unlikely to apply to many AI functionalities used by business and government today.

The Colorado AI Act was passed to remedy a concern over the possibility of “algorithmic discrimination.” This is defined as “unlawful differential treatment or impact that disfavors an individual or group of individuals on the basis of their actual or perceived age, color, disability, ethnicity, genetic information, limited proficiency in the English language, national origin, race, religion, reproductive health, sex, veteran status or other classification protected under the laws of this state or federal law.”


The burdens of the Colorado AI Act fall on developers and deployers of these decision-making systems. Under the Colorado AI Act, developers of this type of AI must use reasonable care to protect consumers from any known or reasonably foreseeable risks of algorithmic discrimination, and must provide specific documentation to deployers or other developers of high-risk AI systems, including a general statement describing the reasonably foreseeable uses and known harmful or inappropriate uses of the system. They must also detail the system’s training data, limitations, purpose, intended benefits, and uses and include documents necessary to assist in understanding how to monitor algorithmic decisions for bias.

Entities that deploy these decision-making AI systems in the relevant categories of benefit take on a new affirmative duty of reasonable care to protect consumers from known or reasonably foreseeable risks of algorithmic discrimination. The duty of care includes a number of consumer notifications, ongoing programs, and system analysis reports.

Deployers must develop and apply a new risk management program for AI decision making, and must provide consumers with notice that an AI decision-making system is being used. The latter obligation includes statements about the purpose of the system, the nature of its decisions, and notice of consumers’ right to opt out of processing. Where a consumer receives an adverse decision facilitated by AI, the deployer must provide a clear, effective statement disclosing the reasons for the decision and the data used to make it, and must give the consumer an opportunity both to appeal the decision and to correct any incorrect data.

Annual impact assessments for each decision-making AI system will soon be required of anyone deploying such systems in Colorado.

The Colorado AI Act does not create a private right of action; violations may be enforced by the state attorney general as an unfair trade practice.

Companies that train, sell, or license decision-making AI systems, and companies that use those systems, will have a new set of requirements, notices, and explanations to provide to Colorado consumers by February 2026.

Organizations should examine how AI is employed within their operations and work with legal counsel to determine if those activities may fall under the Colorado AI Act.
