Governance · 7 min read · Credibility 93/100

EU Digital Services Act Enforcement Intensifies with Major Platform Penalties

The European Commission is ramping up Digital Services Act enforcement in 2026, with active investigations into major US technology platforms including X, Google, Meta, and Apple. Recent enforcement actions have resulted in substantial fines, with X receiving a €120 million penalty for transparency violations.

Accuracy-reviewed by the editorial team


The European Union's Digital Services Act (DSA) enforcement entered a more aggressive phase in 2026, with the European Commission pursuing multiple investigations and levying significant penalties against major technology platforms. The €120 million fine against X (formerly Twitter) for advertising and user account transparency violations signals increased willingness to impose substantial penalties. Very large online platforms (VLOPs) and very large online search engines (VLOSEs) face heightened compliance pressure as enforcement priorities expand to include generative AI risk assessments, minor protection, and algorithmic transparency requirements.

DSA enforcement structure

The DSA establishes a dual enforcement structure: the European Commission supervises VLOPs and VLOSEs, while national Digital Services Coordinators (DSCs) oversee other intermediary services. The Commission initially designated 17 platforms and 2 search engines as VLOPs/VLOSEs based on average monthly European user counts exceeding 45 million, with further services designated since. These services are subject to the most stringent DSA requirements and to direct Commission oversight.
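The designation threshold is simple to express. A minimal illustrative sketch (the function name is ours, not from the regulation; the 45 million figure corresponds to roughly 10% of the EU population):

```python
# Illustrative sketch of the DSA designation threshold (Article 33).
# 45 million is roughly 10% of the EU population.
VLOP_THRESHOLD = 45_000_000

def meets_designation_threshold(avg_monthly_eu_recipients: int) -> bool:
    """True if a service's average monthly active EU recipients reach
    the VLOP/VLOSE designation threshold."""
    return avg_monthly_eu_recipients >= VLOP_THRESHOLD
```

In practice, designation is a formal Commission decision based on user numbers that platforms themselves must publish; the threshold check is only the starting point.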

National DSC designation has proceeded unevenly across member states. Several countries faced infringement proceedings from the Commission for delayed authority designation, creating enforcement gaps in non-VLOP oversight. The uneven national implementation complicates compliance for services operating across multiple member states, as enforcement approaches and priorities vary.

The European Board for Digital Services (EBDS), operational since February 2024, serves as a coordination body for national DSCs. The Board provides advisory opinions, coordinates cross-border enforcement, and develops guidance on DSA interpretation. The European Centre for Algorithmic Transparency (ECAT) supports technical enforcement aspects, particularly for complex algorithmic systems.

Enforcement authority includes significant penalty power. Non-compliance can result in fines up to 6% of global annual turnover, creating substantial financial exposure for large platforms. Periodic penalties for ongoing violations can reach 5% of average daily worldwide turnover, providing continuous compliance incentive.
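To make the exposure concrete, a hypothetical calculation under the two ceilings described above (the turnover figure is illustrative and not drawn from any actual case):

```python
def max_dsa_fine(global_annual_turnover: float) -> float:
    """Ceiling for a DSA non-compliance fine: 6% of global annual turnover."""
    return 0.06 * global_annual_turnover

def max_periodic_penalty_per_day(global_annual_turnover: float) -> float:
    """Ceiling for periodic penalties: 5% of average daily worldwide
    turnover (approximated here as annual turnover / 365)."""
    return 0.05 * (global_annual_turnover / 365)

# Hypothetical platform with EUR 50 billion annual turnover:
turnover = 50e9
print(f"Max fine: EUR {max_dsa_fine(turnover):,.0f}")
print(f"Max daily periodic penalty: EUR {max_periodic_penalty_per_day(turnover):,.0f}")
```

For that hypothetical platform, the one-off fine ceiling is EUR 3 billion and the periodic penalty ceiling is roughly EUR 6.8 million per day of continued non-compliance, which is why ongoing violations carry pressure well beyond the headline fine.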

Active enforcement actions

The €120 million fine against X addressed failures in advertising transparency and user account verification. The Commission found that X failed to adequately disclose advertising repositories, provide searchable advertisement databases, and implement proper verification procedures for advertisers. The penalty represents the first major DSA fine and signals Commission willingness to pursue substantial enforcement.

Ongoing investigations target multiple major platforms. Google faces scrutiny over advertising transparency, recommender system documentation, and search result ranking practices. Meta's investigations address targeted advertising practices, content recommendation algorithms, and data access for researchers. Apple's probe focuses on App Store policies and compliance with interoperability requirements.

TikTok investigations address minor protection measures, addiction-related design features, and content recommendation transparency. The platform's significant youth user base makes minor protection requirements particularly relevant. Commission concerns about addictive design patterns reflect broader regulatory attention to platform impacts on younger users.

Investigation timelines vary, but enforcement activity has accelerated through late 2025 and early 2026. Platforms under investigation face information requests, compliance reviews, and potential penalty proceedings. The backlog of open investigations creates sustained enforcement pressure across the VLOP/VLOSE category.

2026 enforcement priorities

Risk assessment requirements receive particular enforcement attention as the Commission evaluates how platforms identify and mitigate systemic risks. DSA Article 34 requires VLOPs to conduct annual risk assessments addressing content moderation, algorithmic amplification, and potential negative effects on fundamental rights. Commission scrutiny focuses on assessment thoroughness and corresponding mitigation measures.

Generative AI integration creates new risk assessment obligations. Platforms incorporating generative AI features must evaluate risks associated with synthetic content generation, potential for misuse, and impacts on information integrity. The intersection of DSA systemic risk requirements and AI governance represents an evolving compliance area.

Minor protection requirements under DSA Article 28 face increased enforcement focus. Platforms must implement appropriate measures to ensure high privacy, safety, and security for minors, including age verification and design choices that avoid exploiting vulnerabilities. Enforcement attention to minor protection reflects broader societal concern about platform impacts on young users.

Advertising transparency continues as a core enforcement priority. DSA Article 39 requires platforms to maintain publicly accessible repositories of advertisements shown on their services. Commission enforcement examines both repository completeness and accessibility, with the X penalty demonstrating willingness to sanction advertising transparency failures.
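The information Article 39 requires a repository entry to expose can be sketched as a record type. This is an illustrative schema only: the regulation specifies the information to be disclosed, not field names or a data model.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class AdRepositoryEntry:
    """Illustrative (non-normative) record of the information a DSA
    Article 39 advertisement repository must expose."""
    ad_content: str                    # the advertisement as presented
    advertiser: str                    # person on whose behalf the ad ran
    payer: str                         # who paid for the ad, if different
    period_start: date                 # when presentation started
    period_end: date                   # when presentation ended
    targeted: bool                     # whether targeting was used
    targeting_parameters: dict = field(default_factory=dict)  # main parameters
    total_recipients_reached: int = 0  # aggregate reach
```

Commission scrutiny of repository completeness means each of these fields must actually be populated and searchable; a repository that exists but omits targeting parameters or reach figures is the kind of gap the X penalty addressed.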

Compliance requirements for VLOPs

VLOPs face comprehensive compliance obligations beyond those that apply to smaller intermediary services. Annual systemic risk assessments must identify potential negative effects related to illegal content, fundamental rights, public discourse, electoral processes, public health, and minors. These assessments require documented methodology, an evidence basis, and board-level review.

Mitigation measures corresponding to identified risks must be implemented and documented. Platforms must demonstrate reasonable and proportionate responses to systemic risks, with measures subject to Commission evaluation. The relationship between risk identification and mitigation implementation receives particular scrutiny.

Recommender system transparency requirements under DSA Article 27 mandate clear explanation of the main parameters used in recommendation algorithms. For VLOPs, Article 38 additionally requires offering users at least one recommender option that is not based on profiling. Platform compliance requires both technical implementation and user interface design supporting informed choice.
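The non-profiling-option requirement reduces to a simple invariant a platform can assert over its recommender configuration. A hypothetical sketch (option names and the configuration shape are illustrative, not from the regulation):

```python
# Hypothetical recommender configuration; option names are illustrative.
RECOMMENDER_OPTIONS = {
    "for_you": {
        "uses_profiling": True,
        "parameters": ["engagement_history", "follows", "watch_time"],
    },
    "following_chronological": {
        "uses_profiling": False,
        "parameters": ["recency", "follows"],
    },
}

def has_non_profiling_option(options: dict) -> bool:
    """Invariant: at least one offered option must not be based on profiling."""
    return any(not cfg["uses_profiling"] for cfg in options.values())
```

The `parameters` lists double as the disclosure material for the transparency requirement: the "main parameters" a platform must explain to users are exactly what such a configuration would enumerate.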

Independent audit requirements under DSA Article 37 mandate annual compliance verification by qualified auditors. Audit reports must be published and submitted to the Commission. The independent audit requirement creates additional compliance documentation and external verification of DSA adherence.

Private enforcement considerations

While the DSA primarily relies on regulatory enforcement, private enforcement mechanisms exist. Users and consumer organizations can seek compensation for damages resulting from platform violations of DSA obligations. This private enforcement avenue creates additional compliance incentive beyond regulatory penalties.

Class action mechanisms available under EU consumer protection law may be applied to DSA violations affecting multiple users. Consumer organizations have signaled interest in pursuing collective actions against platforms for systemic violations. Platform operators should anticipate private enforcement as a supplement to Commission action.

Data access provisions for researchers create accountability mechanisms beyond formal enforcement. DSA Article 40 requires platforms to provide access to data for vetted researchers studying systemic risks. Research findings may inform both regulatory enforcement priorities and public pressure for compliance improvements.

Reputation effects from enforcement actions extend beyond direct penalties. Public disclosure of violations, investigation findings, and penalty decisions creates market pressure for compliance. Platform operators must consider reputational implications alongside financial penalty exposure when assessing compliance investments.

International implications

The DSA's extraterritorial application affects US-based platforms serving European users. Enforcement against major US technology companies has created transatlantic tension, with concerns about discriminatory application of European regulations. The Trump administration has threatened retaliatory measures in response to perceived targeting of American companies.

Despite political tensions, platform operators must comply with DSA requirements to maintain European market access. The size of the European market makes withdrawal impractical for major platforms. Compliance investments are necessary business costs regardless of political debates about regulatory appropriateness.

The DSA's "Brussels Effect" influences regulatory approaches in other jurisdictions. Other countries are developing digital platform regulations referencing DSA concepts and requirements. Platforms should anticipate that DSA compliance investments will have utility beyond European operations as similar regulations spread globally.

Interaction with other EU digital regulations creates compliance complexity. The Digital Markets Act, the GDPR, and the AI Act create overlapping obligations that platforms must reconcile. Integrated compliance approaches addressing multiple regulatory frameworks provide efficiency advantages.

Near-term action plan

  • Assess VLOP/VLOSE designation status and applicable DSA obligation tier.
  • Review systemic risk assessment methodology against Commission enforcement priorities.
  • Evaluate advertising transparency implementation for repository completeness and accessibility.
  • Assess minor protection measures against DSA Article 28 requirements and enforcement guidance.
  • Review recommender system transparency and user option implementation.
  • Prepare for or conduct required independent audits under DSA Article 37.
  • Establish monitoring for Commission investigation announcements and enforcement decisions.
  • Brief leadership on DSA enforcement trends and organizational compliance status.

Bottom line

DSA enforcement has entered a more aggressive phase with the Commission demonstrating willingness to impose substantial penalties for compliance failures. The €120 million X penalty provides precedent for significant enforcement against VLOPs. Organizations subject to DSA obligations should treat compliance as an operational necessity rather than a theoretical requirement.

2026 enforcement priorities focus on systemic risk assessment, minor protection, advertising transparency, and recommender system documentation. These areas receive particular Commission attention and should receive corresponding compliance investment. Organizations lagging in these areas face elevated enforcement risk.

The intersection of DSA requirements and generative AI creates emerging compliance challenges. Platforms integrating AI features must evaluate new systemic risks and implement appropriate mitigations. This AI-related compliance area will likely receive increased attention as AI integration in platform services accelerates.

This analysis recommends that organizations subject to DSA obligations conduct thorough compliance assessments focused on current enforcement priorities. The Commission's demonstrated willingness to act makes compliance gaps an immediate enforcement risk. Investment in DSA compliance should be prioritized accordingly.


Further reading

  1. Enforcing the Digital Services Act: State of play — europarl.europa.eu
  2. Digital Services Act Decoded: DSA Enforcement Key Points — nyu.edu
  3. EU Strengthens Tech Regulation Enforcement Ahead of 2026 — europeanbusinessmagazine.com
  • Digital Services Act
  • EU Platform Regulation
  • DSA Enforcement
  • VLOP Compliance
  • Advertising Transparency
  • Platform Governance
