Colorado AI Act compliance guide

Equip developers and deployers to satisfy SB24-205 by February 1, 2026 with documented inventories, assessments, notices, and rapid reporting to the Colorado Attorney General.

Updated to incorporate the Attorney General’s September 2024 proposed rules, fall 2025 readiness directives, and Zeph Tech’s Q4 runbooks.

Colorado SB24-205 readiness timeline

  • Q4 2025. Finalise high-risk inventories, document developer–deployer contracts, and circulate preliminary impact assessment templates before the last pre-enforcement quarter closes. (Zeph Tech briefing, Oct 18 2025)
  • By February 1, 2026. Launch written risk management policies, complete pre-deployment impact assessments, publish public statements describing each high-risk system, and confirm consumer notice scripts are ready for go-live. (SB24-205 §§6-1-1702(4), 6-1-1703(2)-(5))
  • Ongoing. Refresh assessments at least annually and within 90 days of intentional and substantial modifications, update public statements within 90 days, and deliver AG notifications no later than 90 days after discovering algorithmic discrimination; a deadline-tracking sketch follows this list. (SB24-205 §§6-1-1702(4)(b), 6-1-1703(3)(a)-(c), 6-1-1703(7))
  • Rule adoption watch. Track Attorney General rulemaking from the September 2024 notice of proposed rulemaking through final publication so workflow tooling reflects any new forms or definitions before enforcement. (Colorado AG NPRM, Sept 2024)
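
The 90-day clocks in the Ongoing item are easy to miss without tooling. Below is a minimal Python sketch of deadline tracking, assuming a simple dataclass; the event types, field names, and example dates are illustrative, not statutory terms.

```python
# Hedged sketch: compute SB24-205 90-day deadlines for events such as
# discovering algorithmic discrimination or making a substantial modification.
# Event types, field names, and example dates are our own placeholders.
from dataclasses import dataclass
from datetime import date, timedelta

NOTIFICATION_WINDOW = timedelta(days=90)  # the statute's recurring 90-day window

@dataclass
class ComplianceEvent:
    system_name: str
    event_type: str   # e.g. "discrimination_discovered", "substantial_modification"
    occurred_on: date

    @property
    def due_by(self) -> date:
        """Last day to notify the AG or refresh the affected artefact."""
        return self.occurred_on + NOTIFICATION_WINDOW

    def days_remaining(self, today: date) -> int:
        return (self.due_by - today).days

# Example: discrimination discovered on 15 March 2026 must be reported by 13 June 2026.
event = ComplianceEvent("loan_underwriting_model", "discrimination_discovered", date(2026, 3, 15))
print(event.due_by, event.days_remaining(date(2026, 4, 1)))
```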

Inventory and risk management operations

Colorado’s presumption of reasonable care hinges on documented programmes. Use this checklist to coordinate developer artefacts and deployer controls.

  • Consolidate system catalogues. Merge product, risk, and vendor inventories to flag which AI systems meet the statute’s consequential decision triggers, then record the intended and known harmful uses captured in developer statements; an inventory record sketch follows this list. (SB24-205 §§6-1-1702(2)(a), 6-1-1703(2))
  • Embed framework alignment. Structure the deployer risk management programme around the NIST AI RMF and ISO/IEC 42001 controls cited in §6-1-1703(2)(a)(I), including governance roles, data quality checks, bias testing cadence, and monitoring metrics. (SB24-205 §6-1-1703(2)(a)(I))
  • Capture developer assurances. Require suppliers to deliver model cards, dataset cards, evaluation evidence, and mitigation guidance that allow deployers or third-party assessors to complete impact assessments. (SB24-205 §6-1-1702(2)-(3))
  • Document lifecycle checkpoints. Retain risk programme updates, inventory decisions, and cross-functional approvals so evidence packages cover the Attorney General’s potential 90-day disclosure demands. (SB24-205 §6-1-1703(9))
  • Run governance cadences. Schedule board or executive reviews of SB24-205 metrics quarterly, logging risk acceptance, remediation targets, and cross-jurisdiction harmonisation with the EU AI Act and U.S. federal mandates. (Zeph Tech briefing, Nov 14 2025)
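
One way to make the consolidated catalogue concrete is a structured record per high-risk system. The sketch below assumes a plain Python dataclass; the field names, NIST AI RMF control labels, and example system are placeholders rather than a prescribed schema.

```python
# Illustrative inventory record for a high-risk AI system. Field names and the
# example entries are assumptions chosen to mirror the checklist above.
from dataclasses import dataclass, field

@dataclass
class HighRiskSystemRecord:
    name: str
    vendor: str
    consequential_decision: str                                       # e.g. "employment", "lending", "housing"
    intended_uses: list[str] = field(default_factory=list)
    known_harmful_uses: list[str] = field(default_factory=list)
    framework_controls: dict[str, str] = field(default_factory=dict)  # NIST AI RMF / ISO/IEC 42001 mapping
    developer_artefacts: list[str] = field(default_factory=list)      # model cards, eval evidence, etc.

inventory = [
    HighRiskSystemRecord(
        name="resume_screener_v3",
        vendor="ExampleVendor",
        consequential_decision="employment",
        intended_uses=["shortlisting applicants for recruiter review"],
        known_harmful_uses=["fully automated rejection without human review"],
        framework_controls={"GOVERN 1.1": "AI governance charter", "MEASURE 2.11": "quarterly bias testing"},
        developer_artefacts=["model_card.pdf", "dataset_card.pdf"],
    )
]

# Flag systems still missing developer artefacts before the Q4 2025 inventory freeze.
print([r.name for r in inventory if not r.developer_artefacts])
```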

Developer documentation obligations

Deployer readiness is only possible if upstream developers share inventories, performance evidence, and discrimination alerts on statutory timelines.

  • Maintain technical packets. Provide intended use statements, capabilities, limitations, and training data overviews so deployers can populate §6-1-1703 impact assessments without guesswork; a packet completeness sketch follows this list. (SB24-205 §6-1-1702(2))
  • Publish public statements. List every high-risk system in a public use-case inventory or equivalent webpage and update it within 90 days of intentional and substantial modifications. (SB24-205 §6-1-1702(4))
  • Share discrimination findings. Notify the Attorney General and all known deployers within 90 days when testing or credible reports uncover algorithmic discrimination tied to intended uses. (SB24-205 §6-1-1702(5))
  • Control confidentiality. Flag trade secrets and privileged workpapers when responding to Attorney General requests while still honouring the 90-day delivery window for documentation. (SB24-205 §6-1-1702(7))
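
A lightweight completeness check can keep developer packets aligned with the topics above before they ship to deployers. In the sketch below the REQUIRED_SECTIONS labels paraphrase §6-1-1702(2) subject matter; they are convenience keys, not statutory headings.

```python
# Hedged sketch: verify a developer documentation packet covers the topics the
# checklist above expects. Section keys and the example packet are placeholders.
REQUIRED_SECTIONS = (
    "intended_uses",
    "capabilities",
    "known_limitations",
    "training_data_overview",
    "evaluation_evidence",
    "discrimination_mitigations",
)

def missing_sections(packet: dict) -> list[str]:
    """Return the topics a packet still needs before release to deployers."""
    return [s for s in REQUIRED_SECTIONS if not packet.get(s)]

packet = {
    "intended_uses": "Rank applications for human review.",
    "capabilities": "Scores applications from 0 to 100.",
    "known_limitations": "Not evaluated on non-English CVs.",
    "training_data_overview": "",  # still outstanding
    "evaluation_evidence": "bias_eval_2025Q3.pdf",
    "discrimination_mitigations": "Score thresholds reviewed quarterly.",
}
print(missing_sections(packet))  # -> ['training_data_overview']
```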

Impact assessment workflow

Section 6-1-1703(3) prescribes what each impact assessment must document. Build automation so every deployment and major change ships with a compliant pack; a validation sketch follows the numbered steps below.

  1. Purpose and scope. Record intended use cases, deployment context, benefits, and human involvement, referencing developer-provided statements for consistency. (SB24-205 §6-1-1703(3)(b)(I))
  2. Risk analysis. Evaluate known or foreseeable algorithmic discrimination, document testing results, and log mitigations or compensating controls. (SB24-205 §6-1-1703(3)(b)(II))
  3. Data lineage. Summarise inputs, outputs, and any deployer-provided customisation data to evidence provenance and suitability. (SB24-205 §6-1-1703(3)(b)(III)-(IV))
  4. Performance and transparency. List evaluation metrics, known limitations, consumer notices, and monitoring safeguards, then capture post-deployment oversight plans and change logs. (SB24-205 §6-1-1703(3)(b)(V)-(VII))
  5. Comparable coverage. Reuse impact assessments created for other regimes when scope and effect match Colorado’s requirements to reduce duplication while preserving statutory elements. (SB24-205 §6-1-1703(3)(e))
  6. Lifecycle records. Version control assessments for at least three years after final deployment, tag board certifications, and map evidence to AG disclosure requests to satisfy §6-1-1703(3)(f) and §6-1-1703(9). (SB24-205 §§6-1-1703(3)(f), 6-1-1703(9))
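
A deployment gate can refuse to ship a high-risk system until its impact assessment pack covers the numbered elements above. The sketch below is illustrative: the keys mirror this guide’s steps and the citations map each gap back to the provision it would evidence.

```python
# Hedged sketch of an impact assessment completeness gate. Keys are this
# guide's labels for the statutory elements, not the statute's own enumeration.
IMPACT_ASSESSMENT_ELEMENTS = {
    "purpose_and_scope":            "SB24-205 §6-1-1703(3)(b)(I)",
    "discrimination_analysis":      "SB24-205 §6-1-1703(3)(b)(II)",
    "data_lineage":                 "SB24-205 §6-1-1703(3)(b)(III)-(IV)",
    "performance_and_transparency": "SB24-205 §6-1-1703(3)(b)(V)-(VII)",
    "post_deployment_monitoring":   "SB24-205 §6-1-1703(3)(b)(V)-(VII)",
}

def assessment_gaps(pack: dict) -> dict[str, str]:
    """Map each missing element to the provision it would evidence."""
    return {key: cite for key, cite in IMPACT_ASSESSMENT_ELEMENTS.items() if not pack.get(key)}

def ready_to_deploy(pack: dict) -> bool:
    return not assessment_gaps(pack)

pack = {
    "purpose_and_scope": "Tenant screening; a human reviewer makes the final call.",
    "discrimination_analysis": "Disparate impact testing, November 2025 run.",
    "data_lineage": "Inputs: application data. Outputs: risk tier.",
    "performance_and_transparency": "",  # not yet drafted
    "post_deployment_monitoring": "Quarterly drift and complaint review.",
}
print(assessment_gaps(pack))
print(ready_to_deploy(pack))  # False until every element is documented
```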

Consumer notice and appeals

Consequential decisions require plain-language disclosures, appeal pathways, and correction channels.

  • Draft multi-channel notices. Prepare scripts and templates that describe the AI system, purpose, data categories, opt-out rights, and contact information in every language the organisation normally uses, formatted for accessibility. (SB24-205 §6-1-1703(4)(a)-(c))
  • Explain decisions. Configure systems to capture the principal reasons for outcomes, the role of AI, data sources, and human review options so appeals teams can respond without delay; a decision record sketch follows this list. (SB24-205 §6-1-1703(4)(b))
  • Operationalise corrections. Integrate CRM, ticketing, and data governance workflows so consumers can amend inaccurate inputs and receive status updates within statutory timelines. (Zeph Tech briefing, Oct 23 2025)
  • Publish public statements. Maintain the required website disclosures listing each high-risk system, mitigation approach, and data usage; refresh them within 90 days when the inventory changes. (SB24-205 §6-1-1703(5))
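
Appeals teams respond fastest when every consequential decision is stored with its explanation already assembled. The record sketch below assumes a simple dataclass store; the statute names the content (principal reasons, the AI system’s role, data sources, human review), while the structure and example values here are placeholders.

```python
# Hedged sketch of a per-decision explanation record for appeals handling.
# The field layout and example values are assumptions, not a prescribed format.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AdverseDecisionRecord:
    consumer_id: str
    decision: str                  # e.g. "application_denied"
    principal_reasons: list[str]   # plain-language reasons behind the outcome
    ai_role: str                   # how the high-risk system contributed
    data_sources: list[str]
    human_reviewer: str            # contact point for human review on appeal
    decided_at: datetime

record = AdverseDecisionRecord(
    consumer_id="c-10482",
    decision="application_denied",
    principal_reasons=["income below policy threshold", "short credit history"],
    ai_role="Model produced a risk score; an underwriter confirmed the denial.",
    data_sources=["credit bureau report", "application form"],
    human_reviewer="appeals@example.com",
    decided_at=datetime(2026, 2, 10, 14, 30),
)
print(record.principal_reasons)
```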

Consumer chatbot transparency

Colorado requires clear disclosures whenever consumers interact with AI systems, even outside high-risk consequential decision contexts.

  • Instrument greetings. Ensure conversational AI greets Colorado users with an upfront statement that they are interacting with an automated system unless the interaction is obviously automated; a greeting middleware sketch follows this list. (SB24-205 §6-1-1704(1)-(2))
  • Log disclosure coverage. Track the channels where human-to-AI handoffs occur (web chat, IVR, kiosks) and monitor transcripts to confirm the notice fires before consequential decisions are rendered. (SB24-205 §6-1-1704(1))
  • Update scripts as rules evolve. Align disclosure language with Attorney General rulemaking once final forms issue so consumer messaging stays compliant. (Colorado AG NPRM, Sept 2024)
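
Below is an illustrative sketch of greeting instrumentation: a hypothetical session opener that prepends the automated-interaction disclosure and logs which channel delivered it. The wording, channel names, and the obviously_automated flag are placeholders pending counsel-approved language and final AG rules.

```python
# Hedged sketch: prepend the AI-interaction disclosure to the first message a
# Colorado consumer sees and log the delivery. Wording and channels are placeholders.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ai_disclosure")

DISCLOSURE = (
    "You are chatting with an automated assistant. "
    "A human representative is available on request."
)

def open_session(channel: str, first_message: str, obviously_automated: bool = False) -> str:
    """Return the opening message, adding the disclosure unless the interaction
    is already obviously automated."""
    if obviously_automated:
        log.info("disclosure skipped (obvious automation), channel=%s", channel)
        return first_message
    log.info("disclosure delivered, channel=%s", channel)
    return f"{DISCLOSURE}\n\n{first_message}"

print(open_session("web_chat", "How can I help with your claim today?"))
```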

Attorney General engagement

Colorado centralises enforcement with the Attorney General. Build playbooks that aggregate the evidence the office can demand.

  • Incident notification drills. Simulate 90-day reporting windows for algorithmic discrimination, assigning legal, risk, and customer teams to compile impact summaries, remediation steps, and consumer outreach evidence. (SB24-205 §6-1-1703(7))
  • Developer disclosures. Ensure developers can notify the Attorney General and all known deployers within 90 days of discovering discrimination through testing or credible reports. (SB24-205 §6-1-1702(5))
  • Portal readiness. Configure document repositories and attestations to align with the forms previewed in the September 2024 proposed rulemaking, including inventory statements, impact assessment uploads, and contact designations. (Colorado AG NPRM, Sept 2024)
  • Disclosure response kits. Pre-build packages containing risk management policies, the latest impact assessments, records of annual reviews, and privilege designations to respond to §6-1-1702(7) or §6-1-1703(9) information requests within 90 days; a manifest check sketch follows this list. (SB24-205 §§6-1-1702(7), 6-1-1703(9))
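
The manifest check sketched below keeps a disclosure response kit audit-ready; the artefact filenames and directory path stand in for whatever records the organisation actually holds for §§6-1-1702(7) and 6-1-1703(9) requests.

```python
# Hedged sketch: report which response-kit artefacts are still missing from a
# shared directory. Filenames and the directory path are placeholders.
from pathlib import Path

KIT_CONTENTS = (
    "risk_management_policy.pdf",
    "latest_impact_assessment.pdf",
    "annual_review_log.csv",
    "privilege_designations.md",
)

def kit_gaps(kit_dir: Path) -> list[str]:
    """List artefacts not yet present in the response kit directory."""
    return [name for name in KIT_CONTENTS if not (kit_dir / name).exists()]

print(kit_gaps(Path("compliance/response_kit")))
```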

Small deployer accommodations

Organisations with fewer than 50 full-time employees can leverage a narrow exemption but must satisfy strict guardrails.

  • Validate eligibility. Confirm headcount, ensure the deployer does not train the system with its own data, and verify the AI continues learning from external sources before invoking the exemption; an eligibility check sketch follows this list. (SB24-205 §6-1-1703(6)(a)-(b))
  • Use intended functionality. Limit deployments to the developer’s documented intended uses so reliance on developer artefacts remains compliant. (SB24-205 §6-1-1703(6)(b)(I))
  • Distribute developer impact assessments. Publish any impact assessments received from the developer that substantively mirror Colorado’s requirements so consumers can still access transparency materials. (SB24-205 §6-1-1703(6)(c))
  • Monitor growth triggers. Track staffing and data usage so the exemption is withdrawn immediately if the organisation scales beyond thresholds or begins training with proprietary datasets. (SB24-205 §6-1-1703(6))
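
The eligibility check below rolls the conditions above into a single function. It is a planning aid built on this guide’s reading of the exemption, not the statute’s own test or legal advice.

```python
# Hedged sketch of a small-deployer exemption pre-check. The boolean inputs
# summarise the bullets above; confirm the statutory text with counsel.
def exemption_available(
    full_time_employees: int,
    trains_with_own_data: bool,
    learns_from_external_sources: bool,
    limited_to_intended_uses: bool,
    developer_impact_assessment_published: bool,
) -> bool:
    return (
        full_time_employees < 50
        and not trains_with_own_data
        and learns_from_external_sources
        and limited_to_intended_uses
        and developer_impact_assessment_published
    )

print(exemption_available(35, False, True, True, True))  # True: exemption can be invoked
print(exemption_available(35, True, True, True, True))   # False: proprietary training data
```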

Integrate with multi-state AI governance

Colorado’s statute intersects with federal and state frameworks. Use harmonised tooling to avoid duplicated effort.

  • Map SB24-205 controls to EU AI Act Annex IV and U.S. OMB M-24-10 requirements so documentation serves multiple regulators.
  • Extend procurement questionnaires to cover Colorado-specific warranties, algorithmic discrimination notifications, and rights to updated impact assessments. (Zeph Tech briefing, Aug 19 2025)
  • Align customer care playbooks with the Tennessee ELVIS Act and other state transparency laws to maintain consistent notices and appeals.
  • Instrument dashboards that track mitigation status, training completion, and AG submissions ahead of board and audit committee reviews; a roll-up sketch follows this list. (Zeph Tech briefing, Nov 14 2025)
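
A quarterly roll-up for those dashboards can be as simple as counting statuses across the system inventory. The snapshot sketch below uses invented metric names and example systems purely to show the shape of the aggregation.

```python
# Hedged sketch of a board-pack roll-up. Metric names and example systems are
# placeholders; wire the same shape to the real inventory and ticketing data.
from collections import Counter

systems = [
    {"name": "resume_screener_v3", "mitigation_status": "in_progress", "training_complete": True,  "ag_submissions": 0},
    {"name": "tenant_scoring",     "mitigation_status": "closed",      "training_complete": False, "ag_submissions": 1},
]

snapshot = {
    "mitigation_status": Counter(s["mitigation_status"] for s in systems),
    "training_completion_rate": sum(s["training_complete"] for s in systems) / len(systems),
    "ag_submissions_total": sum(s["ag_submissions"] for s in systems),
}
print(snapshot)
```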