On May 8, 2024, Colorado passed “An Act Concerning Consumer Protections in Interactions with Artificial Intelligence Systems” (SB205), a state law that comprehensively regulates the use of certain “Artificial Intelligence (AI)” systems.[1] The law is aimed at addressing AI bias, establishing a requirement of human oversight throughout the life cycle of AI systems, and requiring significant documentation around the use of AI. This article covers whom the law applies to, effective dates and penalties, important definitions, and initial steps companies should consider taking to prepare for complying with the law.
Whom does SB205 apply to?
SB205 applies to any person doing business in Colorado who develops an “AI system” or deploys a “high-risk AI system” (each is discussed further below).[2] The law defines “deploy” as “use,”[3] meaning that SB205 applies to any company using a high-risk AI system, whether that system is consumer-facing or not. Under the law, developing an AI system also includes actions that “intentionally and substantially modify” an existing AI system.[4]
Additionally, certain obligations of deployers will apply to their third-party contractors as well.[5] These flow-down obligations mean that companies that provide B2B services involving AI systems will need to confirm whether their customers are deployers under SB205, because those service providers will also have to comply with the law.
How is the law enforced?
SB205 explicitly excludes a private right of action, leaving enforcement solely with the Colorado Attorney General.[6] Additionally, SB205 provides that if the Attorney General brings an enforcement action relating to high-risk AI systems, a company benefits from a rebuttable presumption that it used reasonable care if it complied with the provisions of the section it is accused of violating.[7] For example, a developer facing an enforcement action related to the development of a high-risk AI system that could demonstrate it had the processes and documentation required by Section 6-1-1702 in place would benefit from a rebuttable presumption that it exercised reasonable care to protect consumers from risks of algorithmic discrimination. The law also provides companies with an affirmative defense against actions by the Attorney General if the company discovers the violation, takes corrective action, and maintains a compliance program that meets certain criteria.[8]
How does the law work? Key Definitions
SB205 contains key definitions that determine what specific steps companies must take to be in compliance with the law. Companies must be aware of what constitutes “algorithmic discrimination,” be able to assess whether their AI systems are “high risk,” and determine whether they are a developer, a deployer, or both.
“Algorithmic Discrimination” is defined as “any condition in which the use of an artificial intelligence system results in an unlawful differential treatment or impact that disfavors any individual or group of individuals on the basis of their actual or perceived age, color, disability, ethnicity, genetic information, limited proficiency in the English language, national origin, race, religion, reproductive health, sex, veteran status, or other classification protected under the laws of this state or federal law.”[9]
Further, the law’s main obligations attach to AI systems classified based on their capabilities and uses. “High-Risk Artificial Intelligence System” means “any artificial intelligence system that, when deployed, makes, or is a substantial factor in making, a consequential decision.”[10] Note that the definition is subject to a litany of exclusions, mainly related to technical functions, provided that the excluded technology meets varying criteria.[11]
Finally, companies will need to distinguish whether they are developers, deployers, or both:
- A developer is “a person doing business in [Colorado] that develops, or intentionally and substantially modifies, an artificial intelligence system.”[12]
- A deployer is “any person doing business in [Colorado] that deploys a high-risk artificial intelligence system.”[13] As mentioned above, the law defines “deploy” as “use.”[14]
Whether a company meets the criteria of either or both will be context-dependent, and will influence both statutory and contractual considerations.
When does the law take effect, and what should I do to prepare for SB205? Five Initial Steps
All of SB205’s provisions take effect on February 1, 2026.[15] By that date, all companies must implement a notice within consumer-facing AI systems that alerts consumers to the presence of AI, whether the system is high-risk or not, with limited exceptions.[16]
If you or your customers do business in Colorado, there are five key actions you should consider taking before SB205 takes effect:
- Determine whether you are a developer, a deployer, or both. The answer may depend on the types of AI your company uses and the ways in which it uses them. Companies that provide B2B services also need to determine whether their customers are deployers whose obligations flow down to service providers under the law.
- Determine if you have a high-risk AI system as defined by the law. Because most of SB205’s substantive provisions apply only to high-risk systems, having a clear idea of whether your AI systems are covered will be crucial. You should also consider future use cases for AI systems that are not yet high-risk but may become high-risk depending on how they are deployed.
- Review SB205’s notice requirements. As mentioned above, all consumer-facing AI systems must include a notice within the system informing the consumer that AI is present, effective February 1, 2026, with limited exceptions.[17] In addition, there are multiple other required notices, some of which must be publicly available.[18]
- Review SB205’s impact assessment requirements. The law requires impact assessments in particular contexts that differ somewhat from data processing impact assessments that companies may already be conducting to comply with privacy laws.[19]
- Determine whether you need to implement a risk-management policy and program. SB205 requires deployers using high-risk AI systems to implement risk-management policies and programs that have to meet several criteria.[20] Additionally, any company wishing to benefit from the affirmative defense provided by SB205 will need to have a compliance program in place that meets several criteria.[21]
[1] See S.B. 24-205, 74th Gen. Assemb., Reg. Sess. (Colo. 2024). Other states have regulated specific uses of AI; for example, California, like Colorado, gives consumers opt-out rights from profiling. At the time of this blog, the law has not yet been signed by Colorado’s governor.
[2] S.B. 24-205, Secs. 6-1-1701(6) & (7), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[3] S.B. 24-205, Sec. 6-1-1701(5), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[4] See S.B. 24-205, Secs. 6-1-1701(7) & (10)(a), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[5] See generally S.B. 24-205, 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[6] S.B. 24-205, Sec. 6-1-1706(6), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[7] See, e.g., S.B. 24-205, Secs. 6-1-1702(1) & 6-1-1703(1), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[8] S.B. 24-205, Sec. 6-1-1706(3), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[9] S.B. 24-205, Sec. 6-1-1701(1), 74th Gen. Assemb., Reg. Sess. (Colo. 2024). The definition also clarifies that “algorithmic discrimination” does not include uses related to testing AI systems for discrimination, “expanding applicant, customer, or participant” pools to increase diversity, or acts or omissions of private clubs as covered by 42 U.S.C. 2000a(e). Id.
[10] S.B. 24-205, Sec. 6-1-1701(9)(a), 74th Gen. Assemb., Reg. Sess. (Colo. 2024). The law also defines “consequential decision” as “any decision that has a material legal or similarly significant effect on the provision or denial to any consumer of, or the cost or terms of, (a) education enrollment or an education opportunity, (b) employment or an employment opportunity, (c) a financial or lending service, (d) an essential government service, (e) health-care services, (f) housing, (g) insurance, or (h) a legal service.” S.B. 24-205, Sec. 6-1-1701(3), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[11] S.B. 24-205, Sec. 6-1-1701(9)(b), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[12] S.B. 24-205, Sec. 6-1-1701(7), 74th Gen. Assemb., Reg. Sess. (Colo. 2024). The law also defines “intentional and substantial modification.” S.B. 24-205, Sec. 6-1-1701(10), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[13] S.B. 24-205, Sec. 6-1-1701(6), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[14] S.B. 24-205, Sec. 6-1-1701(5), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[15] See generally S.B. 24-205, 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[16] S.B. 24-205, Sec. 6-1-1704, 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[17] S.B. 24-205, Sec. 6-1-1704, 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[18] See generally S.B. 24-205, 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[19] See, e.g., S.B. 24-205, Sec. 6-1-1703(3)(a), 74th Gen. Assemb., Reg. Sess. (Colo. 2024) (making impact assessments a requirement for deployers and their third-party contractors). Note that the law also implies scenarios in which a developer would also conduct impact assessments. S.B. 24-205, Sec. 6-1-1702(3)(a), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[20] S.B. 24-205, Sec. 6-1-1703, 74th Gen. Assemb., Reg. Sess. (Colo. 2024).
[21] S.B. 24-205, Sec. 6-1-1706(b), 74th Gen. Assemb., Reg. Sess. (Colo. 2024).