
EU DSA obligations begin for first VLOPs/VLOSEs

The EU Digital Services Act’s heightened duties for designated Very Large Online Platforms and Search Engines took effect on 25 August 2023, triggering systemic risk assessment, mitigation, and transparency reporting obligations.


VLOP/VLOSE Designation and Compliance Timeline

Digital Services Act (DSA) obligations specific to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) began to apply on 25 August 2023. The European Commission designated 19 services meeting the threshold of 45 million average monthly active users in the EU on 25 April 2023, giving those platforms four months to achieve compliance with the enhanced due diligence obligations.

The designated VLOPs include major social media platforms, e-commerce marketplaces, video sharing services, and app distribution platforms, while VLOSEs include dominant search engines. Subsequent designations expanded the list as additional services crossed the user threshold, creating ongoing compliance monitoring requirements.

Systemic Risk Assessment Requirements

Designated services must perform annual systemic risk assessments examining how their design, functioning, and use may contribute to specific categories of harm. These include the dissemination of illegal content; negative effects on fundamental rights, including privacy and freedom of expression; impacts on electoral processes and civic discourse; negative effects relating to gender-based violence and public health; and serious negative consequences for minors and for users' physical and mental well-being.
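
Teams often encode these categories in internal tooling so findings can be tagged consistently across product surfaces. The Python sketch below is one possible model, not a mandated schema: the enum labels paraphrase the Article 34 categories, and the RiskFinding fields (surface names, the 1-5 severity scale) are illustrative assumptions.

    from dataclasses import dataclass, field
    from datetime import date
    from enum import Enum

    class SystemicRiskCategory(Enum):
        """Risk categories paraphrased from DSA Article 34(1)."""
        ILLEGAL_CONTENT = "dissemination of illegal content"
        FUNDAMENTAL_RIGHTS = "negative effects on fundamental rights"
        CIVIC_DISCOURSE_ELECTIONS = "effects on elections and civic discourse"
        GBV_HEALTH_MINORS = "gender-based violence, public health, minors, well-being"

    @dataclass
    class RiskFinding:
        """One documented finding feeding the annual assessment report."""
        category: SystemicRiskCategory
        surface: str                     # e.g. "recommender", "ads", "search"
        description: str
        severity: int                    # internal 1-5 scale, illustrative
        identified_on: date
        mitigations: list[str] = field(default_factory=list)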

Risk assessments must be documented with appropriate methodology, submitted to the Commission and Digital Services Coordinators, and considered when designing mitigation measures. The assessment scope extends beyond content to examine algorithmic systems, advertising practices, and platform design features that may amplify identified risks.

Risk Mitigation and Audit Requirements

Following risk assessment, VLOPs and VLOSEs must implement mitigation measures proportionate to identified systemic risks. Mitigation approaches may include adapting content moderation practices, adjusting algorithmic recommender systems, modifying advertising acceptance policies, enhancing transparency tools, and increasing cooperation with relevant authorities.

Critically, designated services must also submit to annual independent audits examining compliance with DSA obligations and the adequacy of risk mitigation measures. Audit organizations must meet independence and expertise requirements established in delegated acts, creating a new category of specialized DSA compliance auditors.

Transparency and Data Access Obligations

The DSA imposes extensive transparency requirements on designated platforms. Services must publish searchable repositories of advertisements displayed on their platforms, including targeting parameters and delivery information.
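
A repository entry along these lines could capture what Article 39 calls for. The sketch below is illustrative only; the field names and types are assumptions, not a prescribed format.

    from dataclasses import dataclass
    from datetime import date

    @dataclass(frozen=True)
    class AdRepositoryRecord:
        """One public ad-archive entry, loosely mirroring DSA Article 39."""
        ad_id: str
        ad_content: str                       # the creative, or a reference to it
        advertiser: str                       # on whose behalf the ad was presented
        payer: str                            # who paid for the ad, if different
        first_shown: date
        last_shown: date
        targeting_parameters: dict[str, str]  # main parameters used for targeting
        total_recipients: int                 # aggregate reach; no personal data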

Recommender system logic must be explained in terms accessible to average users, with options to modify or disable personalized recommendations. Annual transparency reports must detail content moderation activities, automated detection measures, illegal content reports, and complaints received. Researchers studying systemic risks gain access rights to platform data under specific conditions, enabling independent analysis of platform effects on public discourse and societal harm.
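
For the recommender opt-out described above, a service might keep a ranking path that uses no profiling alongside its personalized ranker. This is a minimal sketch assuming a toy Item model and scoring function; a real system would wire the preference through user settings.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Item:
        item_id: str
        created_at: datetime
        topic: str

    def personal_score(item: Item, interests: set[str]) -> float:
        """Toy engagement score: boost items matching stated interests."""
        return 1.0 if item.topic in interests else 0.0

    def rank_feed(items: list[Item], interests: set[str], personalized: bool) -> list[Item]:
        """DSA Article 38 requires at least one option not based on profiling."""
        if not personalized:
            # Non-profiling fallback: plain reverse-chronological ordering.
            return sorted(items, key=lambda i: i.created_at, reverse=True)
        return sorted(items, key=lambda i: personal_score(i, interests), reverse=True)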

Enforcement Architecture and Penalties

The European Commission serves as primary enforcer for VLOP/VLOSE obligations, complementing Digital Services Coordinator oversight for smaller platforms. This direct Commission oversight reflects concerns that national regulators might face challenges constraining the most powerful global platforms.

Penalties for non-compliance can reach up to 6% of global annual turnover, and repeated violations can, in serious cases, lead to temporary suspension of the service in the EU. The enforcement framework includes procedural rights for platforms to contest findings and propose remedial measures before penalties are finalized. Affected platforms should understand the enforcement process and build compliance documentation that can withstand regulatory scrutiny.
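
The 6% ceiling itself is simple arithmetic. The snippet below only illustrates the statutory maximum; how the Commission sets an actual fine depends on the gravity and duration of the infringement.

    def max_dsa_fine(global_annual_turnover_eur: float) -> float:
        """Statutory ceiling for DSA fines: 6% of worldwide annual turnover."""
        return 0.06 * global_annual_turnover_eur

    # A platform with EUR 50 billion in turnover faces fines of up to EUR 3 billion.
    print(f"EUR {max_dsa_fine(50e9):,.0f}")  # EUR 3,000,000,000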

Implementation Considerations for Platform Teams

Platform, legal, and trust and safety teams supporting EU users should map the duties in DSA Articles 34-42 to existing product controls and identify gaps requiring new capabilities. Recommender systems may require adjustment to support the non-personalized alternatives mandated by the Act, and advertising systems need enhanced transparency features and archival capabilities.
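
One lightweight way to run that mapping is a duty register keyed by article. In the sketch below, the article summaries are paraphrases, and the owners, controls, and gap flags are placeholder assumptions for illustration.

    # Illustrative duty register; owners, controls, and gap flags are placeholders.
    DSA_DUTY_MAP = {
        "Art. 34 systemic risk assessment": {"owner": "trust-and-safety", "control": "annual assessment",  "gap": False},
        "Art. 35 risk mitigation":          {"owner": "product",          "control": "mitigation backlog", "gap": True},
        "Art. 37 independent audit":        {"owner": "legal",            "control": "audit engagement",   "gap": True},
        "Art. 38 recommender opt-out":      {"owner": "feed-ranking",     "control": "non-profiling mode", "gap": False},
        "Art. 39 ad repository":            {"owner": "ads-engineering",  "control": "public ad archive",  "gap": True},
        "Art. 42 transparency reporting":   {"owner": "policy",           "control": "reporting pipeline", "gap": False},
    }

    open_gaps = [duty for duty, row in DSA_DUTY_MAP.items() if row["gap"]]
    print(open_gaps)  # duties still needing new capabilities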

Notice-and-action workflows must meet specific DSA requirements for content moderation decisions and appeals. Cross-functional coordination between product, policy, legal, and engineering teams is essential given the scope of required changes, and teams should prepare for ongoing compliance monitoring as the Commission issues guidance and delegated acts clarifying specific requirements.
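
The notice-and-action lifecycle can be modeled as a small state machine so that every decision passes through acknowledgement, a statement of reasons, and an appeal path. The states and transitions below are a simplified assumption mapped loosely to Articles 16, 17, and 20, not the statutory text.

    from enum import Enum, auto

    class NoticeState(Enum):
        """Simplified notice lifecycle (cf. DSA Articles 16, 17, and 20)."""
        RECEIVED = auto()         # notice submitted via the Art. 16 mechanism
        ACKNOWLEDGED = auto()     # confirmation of receipt sent to the notifier
        DECIDED = auto()          # moderation decision taken
        REASONS_SENT = auto()     # Art. 17 statement of reasons provided
        APPEALED = auto()         # Art. 20 internal complaint lodged
        APPEAL_RESOLVED = auto()

    ALLOWED = {
        NoticeState.RECEIVED: {NoticeState.ACKNOWLEDGED},
        NoticeState.ACKNOWLEDGED: {NoticeState.DECIDED},
        NoticeState.DECIDED: {NoticeState.REASONS_SENT},
        NoticeState.REASONS_SENT: {NoticeState.APPEALED},
        NoticeState.APPEALED: {NoticeState.APPEAL_RESOLVED},
    }

    def advance(current: NoticeState, nxt: NoticeState) -> NoticeState:
        """Enforce that decisions cannot skip required steps."""
        if nxt not in ALLOWED.get(current, set()):
            raise ValueError(f"invalid transition {current.name} -> {nxt.name}")
        return nxt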

Broader DSA Implementation Timeline

While VLOP/VLOSE obligations began in August 2023, the full DSA applied to all in-scope digital services from 17 February 2024. Smaller platforms face proportionate obligations including transparency reporting, notice-and-action procedures, and trusted flagger programs. Organizations operating digital services in the EU should assess whether they fall within DSA scope, even if below VLOP thresholds, and ensure appropriate compliance measures address applicable obligations. The DSA is a comprehensive regulatory framework that will continue evolving through enforcement decisions, guidance documents, and potential legislative amendments.
