
EU designates first DSA very large platforms

The DSA's first Very Large Online Platform designations in April 2023 triggered enhanced obligations. Services from Google, Meta, Amazon, and others now face systemic risk assessment and transparency requirements.


The European Commission issued its first Digital Services Act (DSA) designations on 25 April 2023, classifying platforms with over 45 million EU users as Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs). Designated services including Amazon Store, Facebook, Instagram, TikTok, Twitter, YouTube, AliExpress, and the Apple App Store now have four months from notification to implement systemic risk assessments, mitigations, and independent audits.

The designations trigger obligations around recommender system transparency, researcher data access, crisis protocols, and advertising repositories. Compliance teams must validate whether their services meet the user-threshold criteria, review forthcoming Commission audit guidance, and prepare to document risk mitigation for disinformation, illegal content, and fundamental rights impacts.

Designated Platforms and Services

The Commission designated 17 Very Large Online Platforms: major social networks (Facebook, Instagram, TikTok, Twitter, YouTube, LinkedIn, Pinterest, Snapchat), e-commerce platforms (Amazon Store, Alibaba AliExpress, Zalando), app stores (Apple App Store, Google Play), and other services (Booking.com, Google Maps, Google Shopping, Wikipedia). Two search engines (Google Search, Bing) received VLOSE designations.

The 45-million threshold refers to average monthly active recipients in the European Union, roughly 10% of the Union's population. Platforms must publish their recipient counts at least every six months and provide them to the Commission on request. Subsequent designations will occur as additional services meet the criteria.
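
As a rough illustration of the threshold test, the sketch below averages a six-month series of EU monthly active recipients against the 45 million mark. The figures and function name are hypothetical; the DSA prescribes the threshold, not a calculation method.

```python
from statistics import mean

# DSA threshold: 45 million average monthly active recipients in the EU,
# roughly 10% of the Union's population.
VLOP_THRESHOLD = 45_000_000

def crosses_vlop_threshold(monthly_active_recipients: list[int]) -> bool:
    """Return True if the average over the measurement period meets the threshold."""
    return mean(monthly_active_recipients) >= VLOP_THRESHOLD

# Hypothetical six-month series of EU monthly active recipients.
eu_mar = [43_900_000, 44_600_000, 45_200_000, 46_100_000, 46_800_000, 47_300_000]
if crosses_vlop_threshold(eu_mar):
    print("Average MAR meets the 45M threshold: expect designation and a four-month compliance clock.")
```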

Systemic Risk Assessment Requirements

Designated platforms must conduct annual risk assessments covering systemic threats including illegal content distribution, fundamental rights impacts (privacy, expression, non-discrimination), electoral integrity, public health, and impacts on minors. Risk assessments must identify how algorithmic systems, recommender systems, and advertising systems might amplify identified risks.
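
A compliance register might capture each identified risk in a structured record along these lines; the field names, severity scale, and category labels are assumptions for illustration, not a DSA-mandated schema.

```python
from dataclasses import dataclass, field
from datetime import date

# Labels loosely mirroring the DSA's systemic risk categories.
RISK_CATEGORIES = (
    "illegal_content",
    "fundamental_rights",         # privacy, expression, non-discrimination
    "civic_discourse_elections",
    "public_health_minors",
)

@dataclass
class RiskAssessmentEntry:
    """One documented systemic risk, retained for regulator review."""
    category: str
    description: str
    amplifying_systems: list[str] = field(default_factory=list)  # e.g. recommender, ads
    severity: str = "unassessed"  # hypothetical scale: low / medium / high
    assessed_on: date = field(default_factory=date.today)

    def __post_init__(self):
        if self.category not in RISK_CATEGORIES:
            raise ValueError(f"unknown risk category: {self.category}")

risk = RiskAssessmentEntry(
    category="illegal_content",
    description="Sale of counterfeit goods via marketplace listings",
    amplifying_systems=["search ranking", "sponsored placements"],
    severity="high",
)
```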

The Commission is preparing methodological guidance for risk assessments, specifying categories of harm, evaluation criteria, and documentation requirements. Platforms must retain supporting risk assessment documentation for at least three years and make it available to regulators upon request.

Mitigation Measures and Audits

Platforms must implement reasonable, proportionate, and effective mitigation measures addressing identified risks. Measures may include content moderation policy adjustments, algorithmic changes, advertising restrictions, or improved detection capabilities. Mitigation effectiveness must be evaluated and documented.
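
One way to keep mitigation evidence audit-ready is to link each measure back to the risk it addresses and record its effectiveness review. The structure below is a sketch under that assumption; nothing in the DSA prescribes this shape.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class MitigationMeasure:
    """Links an identified systemic risk to a mitigation and its evaluation."""
    risk_id: str                  # reference into the risk assessment register
    measure: str                  # e.g. "restrict ad targeting of minors"
    implemented_on: date
    effectiveness_review: Optional[str] = None  # documented evaluation outcome
    reviewed_on: Optional[date] = None

    def is_evaluated(self) -> bool:
        # The DSA expects measures to be reasonable, proportionate, and
        # effective; an unevaluated measure is a likely audit finding.
        return self.effectiveness_review is not None
```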

Annual independent audits, conducted by qualified organizations independent of the platform, must assess compliance with DSA obligations. Auditors evaluate risk assessments, mitigation effectiveness, transparency reports, and operational compliance. Audit reports must be submitted to the Commission and published with redactions for confidential information.

Algorithmic Transparency

VLOPs must explain recommender system parameters influencing content display, including main criteria and their relative importance. Users must receive options to modify recommendations, including at least one option not based on profiling. The DSA requires explanations in clear, accessible language about how algorithms determine content visibility.
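
To make the obligation concrete, here is a hypothetical transparency payload exposing a recommender's main parameters, their relative importance, and a non-profiling alternative. The parameter names and weights are invented for illustration.

```python
# Hypothetical configuration a VLOP might surface to users: the main
# recommender parameters, their relative weight, and a non-profiling option.
RECOMMENDER_OPTIONS = {
    "personalized": {
        "main_parameters": [
            {"name": "predicted_engagement", "weight": 0.5},
            {"name": "account_follows",      "weight": 0.3},
            {"name": "recency",              "weight": 0.2},
        ],
        "uses_profiling": True,
    },
    "chronological": {
        "main_parameters": [{"name": "recency", "weight": 1.0}],
        "uses_profiling": False,  # the non-profiling alternative
    },
}

def describe_option(option: str) -> str:
    """Render a plain-language explanation of a recommender option."""
    cfg = RECOMMENDER_OPTIONS[option]
    parts = ", ".join(f"{p['name']} ({p['weight']:.0%})" for p in cfg["main_parameters"])
    return f"'{option}' ranks content by: {parts}; profiling: {cfg['uses_profiling']}"

print(describe_option("personalized"))
```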

Advertising transparency requirements mandate maintaining public repositories of advertisements displayed on platforms, including targeting parameters and advertiser identities. Political advertisements face heightened disclosure obligations during election periods.
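
An advertising repository entry could look something like the following sketch. The DSA specifies what information repositories must contain, not a schema, so every field name here is illustrative.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AdRepositoryRecord:
    """One illustrative entry in a public ad transparency repository."""
    ad_id: str
    advertiser: str              # person or entity on whose behalf the ad ran
    content_summary: str
    targeting_parameters: dict   # main parameters used to target recipients
    first_shown: date
    last_shown: date
    total_recipients_eu: int     # aggregate reach only, no personal data

record = AdRepositoryRecord(
    ad_id="ad-2023-000123",
    advertiser="Example Retail B.V.",
    content_summary="Spring footwear promotion",
    targeting_parameters={"age_range": "25-44", "interests": ["running"]},
    first_shown=date(2023, 4, 1),
    last_shown=date(2023, 4, 30),
    total_recipients_eu=1_250_000,
)
```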

Researcher Data Access

Designated services must provide access to data for vetted researchers examining systemic risks and platform dynamics. Vetting runs through national Digital Services Coordinators, with the Commission specifying data access protocols that balance research needs against privacy and competitive concerns. Access mechanisms must be operational alongside the other obligations, within four months of designation.
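
A minimal sketch of how a platform might track vetted-researcher requests follows; the statuses and field names are assumptions rather than the regulation's wording.

```python
from dataclasses import dataclass, field

@dataclass
class ResearcherAccessRequest:
    """Sketch of a vetted-researcher data access request record."""
    researcher: str
    affiliation: str
    systemic_risk_studied: str  # access is scoped to systemic-risk research
    datasets_requested: list[str] = field(default_factory=list)
    vetted: bool = False        # vetting runs through regulators, not the platform
    status: str = "submitted"

    def approve(self) -> None:
        """Grant access only after the researcher has been vetted."""
        if not self.vetted:
            raise PermissionError("researcher must be vetted before access is granted")
        self.status = "approved"
```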

Compliance Implementation

Affected organizations should establish DSA compliance programs addressing risk assessment processes, mitigation implementation, audit preparation, and transparency obligations. Technical teams require systems supporting advertising repositories, researcher access, and algorithmic documentation.

Cross-functional coordination between legal, compliance, product, and engineering teams ensures full obligation coverage. Designated platforms should engage with the Commission during implementation to clarify interpretation questions and demonstrate good-faith compliance efforts.
