On May 12, the Colorado legislature passed Senate Bill 26‑189, substantially rewriting its 2024 law establishing consumer protections for artificial intelligence (formerly referred to as the CO AI Act) and replacing it with a more targeted framework for “automated decision‑making technology” (ADMT). The changes take effect on January 1, 2027.
Financial institutions doing business in Colorado should carefully assess these changes and how they may apply to their own uses of computerized systems that “make, guide, or assist” consequential decisions about individuals relating to financial or lending services, as well as other enumerated consumer opportunities and services.
Scope
The law defines ADMT broadly as technology that processes personal data and uses computation to generate outputs (predictions, recommendations, classifications, rankings, or scores) that are used to make or assist decisions about individuals. A system becomes a “covered ADMT” when its outputs “materially influence” a consequential decision, meaning they are more than incidental and affect the outcome of the decision, for example by ranking applicants, constraining options, or determining pricing.
Consequential decisions are those that determine a person’s access to, eligibility for, selection for, compensation for, or pricing of key opportunities and services. The covered domains for such consequential decisions include education, employment decisions that may create an employer–employee relationship, leasing or purchasing residential real estate in Colorado, financial and lending services, insurance (including underwriting, pricing, coverage, and claims), health‑care services, and essential government services and public benefits. The law also treats materially worse pricing or terms that effectively limit or deny access in these covered domains as consequential decisions.
At the same time, the statute narrows its reach by excluding low‑stakes or routine processes that do not materially influence eligibility, pricing, or access. It specifically carves out advertising and marketing, differentiated product recommendations, and content moderation. It also excludes core infrastructure and basic tools, such as databases, firewalls, simple calculators, and spreadsheets that do not use machine learning. Systems used for anti‑money laundering, Office of Foreign Assets Control (OFAC), and sanctions compliance; fraud detection and prevention (including identity verification); cybersecurity; and related controls are excluded to the extent they are performing those compliance functions rather than making the kind of consequential decisions covered by the law. Uses for administrative purposes are also excluded.
The law no longer contains a small business exemption for companies with fewer than fifty (50) full-time employees. The amendments also remove a safe harbor that companies could previously assert based on compliance with the National Institute of Standards and Technology Artificial Intelligence Risk Management Framework, the ISO/IEC 42001 framework for an Artificial Intelligence Management System, or another recognized artificial intelligence framework.
Developer Obligations
Developers, defined as entities doing business in Colorado that sell, license, or substantially modify covered ADMT for consequential decisions, must provide deployers with enough information to understand how these systems should and should not be used. Developers must supply technical documentation describing the ADMT’s intended uses and known harmful or inappropriate uses; the categories of training data, including personal data, to the extent known; known limitations and risks, including circumstances in which the system should not be used; and instructions for appropriate use, monitoring, and meaningful human review.
Developers must also give deployers the information they reasonably need to comply with their own obligations under the statute. If certain information is withheld because it is a trade secret or otherwise legally protected, the developer must notify the deployer. In addition, developers are required to notify deployers of “material updates,” such as new versions or patches that materially affect outputs, performance, or intended use, and to retain records like version identifiers and changelogs for at least three years.
These duties attach when a developer has marketed, documented, configured, or contracted a system for use in consequential decisions, or becomes aware that it is being used in that way consistent with its intended and contracted uses.
The amendments remove prior developer obligations to post a public statement about artificial intelligence on their websites, to notify the Colorado attorney general, to maintain processes around algorithmic discrimination, and to exercise an affirmative duty of care.
Deployer Obligations
Deployers, defined as businesses using covered ADMT to materially influence consequential decisions, face obligations focused on transparency and process.
Before using covered ADMT in such a decision, a deployer must provide a clear and conspicuous notice to the consumer that an automated system is being used or will be used in a consequential decision affecting them and explain how the consumer can obtain additional information. The law allows this to be satisfied by a prominent public notice that is reasonably accessible at points of consumer interaction, such as an online application portal, so long as it is reasonably proximate to where the consequential decision may occur.
If the deployer’s use of ADMT leads to an “adverse outcome” (e.g., a denial of a loan, a materially reduced benefit, or significantly worse pricing), the deployer must, within thirty days, give the consumer a plain‑language description of the decision and the role of the ADMT; explain how to request additional information about the system and the types, categories, and sources of personal data used (to the extent provided by the developer); and describe the consumer’s rights under the statute and how to exercise them.
The Attorney General must adopt rules by January 1, 2027, to flesh out these post‑adverse outcome disclosure requirements, including sector‑specific guidance and standards for describing the system’s role in a way that is reasonably understandable to consumers. The statute expressly anticipates alignment with existing adverse action regimes under laws such as the Equal Credit Opportunity Act and the Fair Credit Reporting Act: creditors that provide compliant federal notices and include required information about ADMT use can satisfy Colorado’s requirements for the same decision, avoiding duplicative notices. All notices and disclosures must be reasonably accessible to consumers with disabilities and to consumers with limited English proficiency.
The amendments remove prior deployer obligations to post a public statement about artificial intelligence on their websites, to notify the Colorado attorney general, and to maintain processes around algorithmic discrimination.
Consumer Rights
Consumers who experience an adverse outcome from a consequential decision materially influenced by covered ADMT gain specific rights.
They may request access to personal data used in the decision and request correction of factually incorrect or materially inaccurate personal data, in coordination with Colorado’s existing privacy framework. The statute makes clear that certain exceptions (such as under the Gramm-Leach-Bliley Act) in the Colorado Privacy Act do not apply to this correction right in the ADMT context. The obligation also does not extend to correcting opinions, predictions, scores, or protected evaluations.
Consumers may also request “meaningful human review” and reconsideration of the decision, to the extent commercially reasonable. Meaningful human review requires a trained individual with authority to approve, modify, or override the decision who considers relevant evidence, does not simply default to the system output, and has enough information to understand the system’s intended use, limitations, inputs, and main factors behind its output, without requiring disclosure of proprietary source code or model details.
Enforcement, Bias, and Existing Law
SB 26‑189 is enforced exclusively by the Colorado Attorney General under the Colorado Consumer Protection Act. A violation of the developer or deployer requirements is deemed a deceptive trade practice, but the statute creates no new private right of action, so private parties cannot bring suit under it. Before bringing an enforcement action, the Attorney General generally must provide a 60‑day notice of violation and an opportunity to cure, except where violations are knowing or repeated.
The law does not create any freestanding prohibition on bias, discrimination, or unfair outcomes in ADMT itself. Instead, it assumes those issues will be addressed under existing anti‑discrimination and consumer protection laws, such as the Colorado Anti‑Discrimination Act. The statute clarifies how fault is allocated between developers and deployers in such cases. Both can be held liable under existing law where a covered ADMT materially influenced a consequential decision that violated anti‑discrimination statutes, with responsibility apportioned according to their relative fault. Developers are liable only when their system was used as they intended, documented, or contracted, and they are not liable for uses outside that scope if they have met their documentation duties. The law also declares void any contract clauses that purport to indemnify a developer or deployer against their own acts or omissions that violate Colorado anti‑discrimination law in connection with covered ADMT.
Finally, the statute emphasizes that compliance with SB 26‑189 does not excuse noncompliance with any other applicable state or federal law. Use of ADMT in a consequential decision does not justify or provide a defense to discrimination or consumer protection violations. It is an additional set of process and transparency obligations layered on top of existing legal requirements.
Removal of Governance and Bias Assessment Requirements
The new version of the law eliminates several requirements of the 2024 law: the provisions requiring developers to create governance frameworks, engage in model risk assessment and testing, and take measures to prevent algorithmic bias are all absent from the new version.
Our Take and Unanswered Questions
The new version of the Colorado AI Act, if signed by Governor Jared Polis (D), who has already signaled his intention to sign, represents a significant improvement over the 2024 version in terms of the duties it imposes on financial services companies. It does, however, leave some unanswered questions for industry to grapple with, which may be addressed by the Attorney General’s rulemaking, required to be completed by January 1, 2027. Those questions include whether a consumer’s receipt of less than the best pricing on a financial product constitutes an “adverse outcome” under the CO law, and whether the law will require adverse outcome notices in situations where ECOA and the FCRA do not require them (e.g., in the case of accepted counteroffers). It also remains to be seen how the Attorney General will interpret the “commercially reasonable” language with respect to human review of ADMT decisions, which in our view is both commercially unworkable and counterproductive from a fair lending standpoint in consumer lending decision-making. The final unanswered question is when the law will actually be enforced. Enforcement after the January 1, 2027 effective date may be affected by the pending litigation challenging the 2024 version of the law and by the Attorney General’s recent agreement in that litigation to stay enforcement until the regulations are finalized and the court decides the plaintiffs’ preliminary injunction motion. We’ll continue to monitor these developments closely and report on them on this blog and in our podcasts.
