
Policy Briefing — EU Digital Services Act Fully Applies

The Digital Services Act became fully applicable EU-wide on 17 February 2024, requiring all online intermediaries to operationalise notice-and-action, trader traceability, ad transparency, risk mitigation, and regulator engagement frameworks.


Executive briefing: The Digital Services Act (DSA) entered full application across the European Union on 17 February 2024, extending obligations beyond the initial cohort of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to encompass all online intermediaries serving EU users. Hosting services, marketplaces, app stores, social media platforms, and content-sharing services must now demonstrate compliance with the DSA’s comprehensive framework, which covers notice-and-action mechanisms, illegal content removal, trader traceability, advertising transparency, systemic risk mitigation, and cooperation with Digital Services Coordinators (DSCs). The regime imposes fines up to 6% of global turnover for violations and, in extreme cases, temporary service suspension.

Large platforms already designated as VLOPs/VLOSEs have been subject to enhanced duties since August 2023, including risk assessments and independent audits. Now, small and medium-sized platforms, hosting providers, and intermediaries must implement scaled obligations commensurate with their role. Micro and small enterprises (fewer than 50 employees and annual turnover below €10 million) benefit from certain exemptions but must still comply with core duties if they act as marketplaces or host user-generated content.

Why it matters for governance teams

The DSA consolidates fragmented national rules into a single EU-wide regime. Compliance requires cross-functional coordination spanning legal, policy, trust and safety, cybersecurity, engineering, procurement, and customer support. Boards and senior management must oversee DSA compliance through documented governance structures, assigning accountable executives and establishing reporting cadence to audit or risk committees. Failure to demonstrate effective oversight could invite investigations from DSCs or the European Commission.

The law also introduces new transparency expectations. Platforms must publish annual transparency reports detailing content moderation activity, notices received, automated detection tools, average response times, suspensions, and appeals outcomes. Online marketplaces must collect and verify seller information (“Know Your Business Customer”), display trader identity to consumers, and make reasonable efforts to assess product safety. Advertising transparency requires platforms to label ads, disclose the advertiser, explain key targeting parameters, and offer meaningful choices to users. These obligations influence product design, data governance, and vendor management.
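
For illustration only, the sketch below shows the kind of record an internal ad repository might hold and the plain-language explanation it can drive. The field names and Python structure are assumptions, not a prescribed format.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AdTransparencyRecord:
    """One illustrative entry in an internal ad repository."""
    ad_id: str
    advertiser_name: str                  # entity on whose behalf the ad is presented
    payer_name: str                       # entity that paid for the ad, if different
    main_targeting_parameters: list[str]  # e.g. ["age range", "language", "region"]
    shown_from: datetime
    shown_until: datetime | None = None
    label: str = "Sponsored"              # on-surface label presented with the ad
    profiling_of_minors: bool = False     # profiling-based ads must not be served to minors

def why_am_i_seeing_this(record: AdTransparencyRecord) -> str:
    """Assemble a plain-language explanation of the main targeting parameters."""
    params = ", ".join(record.main_targeting_parameters) or "no targeting parameters"
    return f"This ad is shown on behalf of {record.advertiser_name}, based on: {params}."
```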

Governance checkpoints

  • Accountable leadership: Appoint a DSA compliance officer or establish a governance committee with representatives from trust and safety, legal, and engineering. Define escalation paths to senior leadership and DSCs.
  • Notice-and-action workflows: Implement user-friendly mechanisms enabling individuals and trusted flaggers to report illegal content. Document assessment procedures, response timelines, and appeal processes. Ensure automation is transparent and subject to human oversight (see the data-model sketch after this list).
  • Trader traceability: For marketplaces, verify trader identities before allowing listings. Collect business registration numbers, contact information, and relevant authorisations. Integrate verification into onboarding workflows and conduct periodic reviews to remove fraudulent sellers.
  • Advertising transparency controls: Build systems to label ads clearly, provide “why am I seeing this” explanations, and offer opt-out choices for profiling-based advertising (especially for minors). Maintain ad repositories accessible to regulators and the public.
  • Risk management and auditing: Conduct systemic risk assessments covering illegal content dissemination, disinformation, impacts on fundamental rights, public security, and mental health. Implement mitigation measures (e.g., algorithmic adjustments, user empowerment tools). Large platforms must commission independent audits; smaller entities should document internal reviews.
  • Data access readiness: Prepare to respond to data access requests from DSCs and, where applicable, vetted researchers under Article 40. Maintain inventories of relevant datasets, security safeguards, and legal assessments.
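
The checkpoints above translate most directly into data models and review queues. As a sketch only, assuming an internal notice schema that the regulation itself does not define, a notice-and-action intake record and a trivial triage rule might look like this:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class NoticeStatus(Enum):
    RECEIVED = "received"
    UNDER_REVIEW = "under_review"
    ACTIONED = "actioned"      # content removed or access restricted
    REJECTED = "rejected"      # notice assessed as unfounded

@dataclass
class IllegalContentNotice:
    notice_id: str
    content_url: str
    legal_ground: str                    # reporter's explanation of why the content is illegal
    reporter_contact: str | None = None  # some categories allow anonymous notices
    trusted_flagger: bool = False        # trusted-flagger notices are handled with priority
    automated_assessment: bool = False   # any use of automation is disclosed to the notifier
    human_review_completed: bool = False
    status: NoticeStatus = NoticeStatus.RECEIVED
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def triage_priority(notice: IllegalContentNotice) -> int:
    """Lower number = reviewed sooner; trusted flaggers jump the queue."""
    return 0 if notice.trusted_flagger else 1
```

Whatever schema a platform settles on, the point is that every status transition should be timestamped and attributable, so the same records can feed appeals handling and transparency reporting without re-keying.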

These checkpoints should map to internal controls, policies, training, and monitoring metrics. Boards should review dashboards tracking notice volumes, removal rates, appeal outcomes, trader verification status, and advertising compliance.
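
As one way to make that board review concrete, the sketch below aggregates a list of moderation-event dictionaries into headline dashboard figures. The event keys ("action", "response_hours", "appeal_outcome") are assumed for illustration rather than drawn from any standard schema.

```python
from statistics import median

def board_dashboard(events: list[dict]) -> dict:
    """Aggregate moderation events into the headline figures a board pack might track."""
    total = len(events)
    removed = sum(1 for e in events if e.get("action") == "removed")
    response_hours = [e["response_hours"] for e in events if "response_hours" in e]
    appeals = [e for e in events if e.get("appeal_outcome") is not None]
    overturned = sum(1 for e in appeals if e["appeal_outcome"] == "overturned")
    return {
        "notices_received": total,
        "removal_rate": removed / total if total else 0.0,
        "median_response_hours": median(response_hours) if response_hours else 0.0,
        "appeal_overturn_rate": overturned / len(appeals) if appeals else 0.0,
    }
```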

Implementation roadmap

Immediate (Q1 2024): Conduct a gap analysis against DSA articles applicable to your service type. Update terms of service, community guidelines, and internal policies to reflect new obligations. Notify DSCs of points of contact and, where required, appoint legal representatives within the EU.

Q2 2024: Deploy or enhance notice-and-action tooling. Train moderation teams on illegal content categories defined by EU and member-state law. Establish appeals portals and ensure decisions include reasoning and references to legal bases. Begin building transparency reporting pipelines that capture metrics automatically.
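
To make "decisions include reasoning and references to legal bases" concrete, here is a hedged sketch of the fields a statement of reasons might carry. The structure is illustrative only and should be checked against Article 17 of the DSA and the Commission's transparency database schema rather than treated as the required format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class StatementOfReasons:
    """Illustrative fields for a moderation decision notice sent to the affected user."""
    decision_id: str
    content_reference: str
    restriction: str                    # e.g. "removal", "visibility restriction", "suspension"
    facts_and_circumstances: str        # what was found and how it was detected
    legal_basis: str | None             # statutory provision relied on, if the content was illegal
    terms_of_service_basis: str | None  # ToS clause relied on, if the content breached policy
    automated_means_used: bool
    redress_options: tuple[str, ...] = (
        "internal complaint handling",
        "out-of-court dispute settlement",
        "judicial redress",
    )
    issued_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
```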

Q3 2024: For marketplaces, audit seller onboarding and product listings. Implement proactive checks using product safety databases (Safety Gate, formerly RAPEX) and coordinate with customs authorities as necessary. For social platforms, assess algorithmic recommender systems for systemic risks and document mitigation measures.
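
A proactive check could be as simple as the sketch below, assuming a recall feed already exported to a list of dictionaries keyed by GTIN; real Safety Gate exports need their own parsing and fuzzier matching, so treat this as a starting point, not an implementation.

```python
def flag_recalled_listings(listings: list[dict], recall_feed: list[dict]) -> list[dict]:
    """Return marketplace listings whose GTIN appears in a product-recall feed."""
    recalled_gtins = {alert["gtin"] for alert in recall_feed if alert.get("gtin")}
    return [listing for listing in listings if listing.get("gtin") in recalled_gtins]
```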

Q4 2024: Publish the first annual transparency report reflecting full-year DSA compliance. Engage external auditors if subject to VLOP/VLOSE obligations. Prepare for DSC inspections by maintaining evidence repositories, including training logs, risk assessment reports, and incident response records.

Operational considerations

Compliance requires robust documentation management. Platforms should implement policy management systems capturing version history, approvals, and distribution. Logging infrastructure must record moderation actions, user notifications, and appeals outcomes, retaining data for at least six months (longer if national law mandates). Cybersecurity teams should ensure data used for compliance (e.g., advertiser information, trader documents) is protected against breaches, noting that DSA violations could coincide with GDPR penalties.
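
A minimal sketch of append-only moderation logging with an explicit purge date follows, assuming JSON-lines files and a six-month floor; the retention constant should be extended wherever national law requires more.

```python
import json
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=183)  # "at least six months"; lengthen where national law mandates

def log_moderation_action(log_path: str, record: dict) -> None:
    """Append one moderation action as a JSON line with an explicit purge-after date."""
    now = datetime.now(timezone.utc)
    record = {**record, "logged_at": now.isoformat(), "purge_after": (now + RETENTION).isoformat()}
    with open(log_path, "a", encoding="utf-8") as fh:
        fh.write(json.dumps(record, ensure_ascii=False) + "\n")
```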

Companies operating multiple brands should harmonise processes while accounting for service-specific risk profiles. Establish shared services for legal interpretation, policy updates, and training, but allow product teams to tailor user experience changes. Engage with industry associations (e.g., DOT Europe, Ecommerce Europe) to monitor guidance and share best practices.

Platforms must also manage interplay with the Digital Markets Act (DMA) and sector-specific rules. Gatekeepers subject to DMA obligations need to ensure DSA compliance measures—such as recommender transparency or user choice screens—do not conflict with interoperability or self-preferencing prohibitions. Financial services platforms should coordinate with anti-money laundering KYC programmes, while media streaming services must align takedown processes with copyright directives.

Smaller intermediaries should document proportionality assessments explaining how they scale obligations relative to their size. DSCs can request these justifications during investigations, and well-evidenced assessments can mitigate enforcement exposure.

Record-keeping obligations require storing notices, decisions, and trader verification evidence for at least six months. Implement retention schedules and access controls consistent with GDPR, and prepare to export records in machine-readable formats when DSCs initiate investigations.
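
As an illustrative sketch of machine-readable export, the helper below dumps the same records as both JSON and CSV; the record layout is whatever the platform already stores, and these formats are assumptions rather than a mandated schema.

```python
import csv
import json

def export_records(records: list[dict], json_path: str, csv_path: str) -> None:
    """Write the same records as JSON and CSV so a regulator request can be met either way."""
    with open(json_path, "w", encoding="utf-8") as fh:
        json.dump(records, fh, ensure_ascii=False, indent=2, default=str)
    fieldnames = sorted({key for record in records for key in record})
    with open(csv_path, "w", encoding="utf-8", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)
```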

Risk watch

Track Commission guidance, delegated acts, and decisions designating new VLOPs/VLOSEs. Monitor enforcement actions against early violators—these cases will set precedents on acceptable moderation timelines, transparency detail, and trader vetting. Expect DSCs to coordinate with consumer protection and competition authorities, potentially leading to joint investigations.

By treating the DSA as a continuous governance programme—covering legal compliance, user trust, and operational resilience—online intermediaries can reduce enforcement exposure, protect brand reputation, and contribute to a safer digital ecosystem.

