
Policy · Credibility 94/100 · 2 min read

Policy Briefing — EU AI Act high-risk system obligations commence

Twenty-four months after the EU AI Act entered into force, providers and deployers of high-risk AI systems must comply with its Chapter III requirements, including conformity assessments, quality management systems, and post-market monitoring, starting 2 August 2026.

Executive briefing: Regulation (EU) 2024/1689 applies its Chapter III obligations to high-risk AI systems from 2 August 2026. Providers must complete conformity assessments, implement quality management systems, and register qualifying systems in the EU database, while deployers assume documentation, human oversight, and logging duties.

Mandatory deliverables

  • Quality management. Article 17 requires documented policies covering design, testing, and post-market surveillance for each high-risk system.
  • Technical documentation. Articles 11 and 18 mandate comprehensive technical files and their retention, while Articles 10 and 12 require data governance evidence and automatic event logs that authorities can audit; a minimal log-record sketch follows this list.
  • Post-market monitoring. Articles 72 and 73 demand post-market monitoring plans, serious-incident reporting, model drift tracking, and corrective actions notified to market surveillance authorities.
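
To make the logging expectation concrete, here is a minimal sketch, assuming a Python-based serving stack, of the kind of structured, timestamped event record a provider might retain under the Article 12 record-keeping duty. The HighRiskEventRecord structure and its field names are hypothetical illustrations, not a format prescribed by the regulation.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json
import sys


@dataclass
class HighRiskEventRecord:
    """Hypothetical shape of one auditable event log entry (illustrative only)."""
    system_id: str              # identifier of the registered high-risk system
    event_type: str             # e.g. "inference", "human_override", "incident"
    occurred_at: str            # ISO 8601 timestamp in UTC
    input_reference: str        # pointer to the input evidence, not the raw data
    output_summary: str         # decision or outcome produced by the system
    human_reviewer: str | None  # who exercised oversight, if anyone


def log_event(record: HighRiskEventRecord, sink) -> None:
    """Append one immutable JSON line to an append-only audit sink."""
    sink.write(json.dumps(asdict(record)) + "\n")


# Example usage: record a single automated decision with its human reviewer.
log_event(
    HighRiskEventRecord(
        system_id="recruitment-screening-v2",
        event_type="inference",
        occurred_at=datetime.now(timezone.utc).isoformat(),
        input_reference="evidence-store://applications/12345",
        output_summary="candidate shortlisted",
        human_reviewer="case-officer-17",
    ),
    sys.stdout,
)
```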

Program actions

  • Conformity assessments. Finalise internal control checks or, where Annex III use cases require it, engage notified bodies before placing systems on the market or putting them into service after 2 August 2026.
  • Deployers’ governance. Establish human oversight playbooks, risk logs, and data minimisation checks aligned with Article 26 responsibilities.
  • Market surveillance readiness. Prepare for requests from national supervisory authorities and the EU AI Office by curating evidence repositories and contact points.

Enablement moves

  • Map EU AI Act controls to NIST AI RMF and Colorado SB24-205 artefacts to consolidate compliance investments (an illustrative mapping sketch follows this list).
  • Integrate real-time monitoring outputs with incident response teams to accelerate serious-incident reporting under Article 73.
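
As an illustration of the mapping move above, the sketch below shows one way a compliance team might keep a single control register spanning the EU AI Act, the NIST AI RMF, and Colorado SB24-205. The control names, article pairings, and evidence paths are assumptions for demonstration, not an authoritative crosswalk.

```python
# Illustrative control register: one artefact reused across three regimes.
# The pairings below are demonstration assumptions, not an official crosswalk.
CONTROL_REGISTER = [
    {
        "control": "Risk management system maintained and reviewed",
        "eu_ai_act": ["Article 9"],
        "nist_ai_rmf": ["GOVERN", "MAP"],
        "colorado_sb24_205": ["risk management policy"],
        "evidence": "risk-register.xlsx",
    },
    {
        "control": "Technical documentation and event logging retained",
        "eu_ai_act": ["Article 11", "Article 12", "Article 18"],
        "nist_ai_rmf": ["MEASURE"],
        "colorado_sb24_205": ["impact assessment"],
        "evidence": "tech-file/",
    },
    {
        "control": "Serious incidents reported to authorities",
        "eu_ai_act": ["Article 73"],
        "nist_ai_rmf": ["MANAGE"],
        "colorado_sb24_205": ["attorney general notification"],
        "evidence": "incident-log/",
    },
]


def controls_for(article: str) -> list[str]:
    """Return every registered control that cites the given EU AI Act article."""
    return [c["control"] for c in CONTROL_REGISTER if article in c["eu_ai_act"]]


print(controls_for("Article 73"))  # -> ['Serious incidents reported to authorities']
```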

Sources

  • EU AI Act
  • High-risk AI
  • Conformity assessment
  • AI governance