European Parliament adopts the Digital Services Act
The European Parliament formally adopted the Digital Services Act on 5 July 2022, confirming sweeping platform liability, recommender transparency, and systemic risk mitigation duties ahead of Council approval and publication.
The European Parliament adopted the Digital Services Act by an overwhelming majority on 5 July 2022, leaving only formal Council approval and publication in the Official Journal before the EU's landmark platform regulation could enter into force. The DSA establishes tiered obligations for digital service providers based on their role in the digital ecosystem and the risks their services pose to users and society.
Service Provider Categories
Intermediary services form the broadest category, covering any service transmitting, hosting, or providing access to information at user request. Basic obligations include transparency reporting, terms of service requirements, and cooperation with judicial and administrative orders.
Hosting services face additional notice-and-action obligations requiring expeditious removal of illegal content once they have actual knowledge of it. These providers must implement complaint mechanisms, provide a statement of reasons for each content moderation decision, and notify law enforcement of suspected serious criminal activity.
Online platforms connecting buyers and sellers or enabling content sharing face further obligations, including traceability of traders, a ban on dark patterns in user interfaces, a prohibition on advertising based on sensitive personal data, and transparency in recommender systems.
Very Large Online Platforms (VLOPs) with more than 45 million average monthly active users in the EU face the most stringent requirements, including systemic risk assessments, independent audits, crisis response mechanisms, and direct regulatory oversight by the Commission.
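The cumulative tiering described above can be read as a simple decision rule. The Python sketch below is illustrative only: the 45 million threshold comes from the regulation, but the function, class, and input names are assumptions, and actual designation is a legal determination made by the Commission, not a computation.

```python
from enum import Enum

VLOP_USER_THRESHOLD = 45_000_000  # DSA threshold for "very large" designation

class DsaTier(Enum):
    INTERMEDIARY = "intermediary service"      # baseline transparency duties
    HOSTING = "hosting service"                # + notice-and-action duties
    ONLINE_PLATFORM = "online platform"        # + trader traceability, ad rules
    VLOP = "very large online platform"        # + risk assessments, audits

def classify_provider(hosts_content: bool,
                      disseminates_to_public: bool,
                      avg_monthly_eu_users: int) -> DsaTier:
    """Hypothetical mapping of a service onto the DSA's cumulative tiers."""
    if not hosts_content:
        return DsaTier.INTERMEDIARY        # e.g. mere conduit or caching
    if not disseminates_to_public:
        return DsaTier.HOSTING             # e.g. private cloud storage
    if avg_monthly_eu_users >= VLOP_USER_THRESHOLD:
        return DsaTier.VLOP
    return DsaTier.ONLINE_PLATFORM

# A social network with 50 million EU users lands in the strictest tier:
print(classify_provider(True, True, 50_000_000))  # DsaTier.VLOP
```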
Content Moderation Framework
Notice-and-action procedures standardize how users report illegal content and how platforms process such reports. Platforms must provide accessible reporting mechanisms, acknowledge receipt, and communicate decisions with reasoning that enables effective appeals.
Platforms must accompany each content moderation decision with a statement of reasons identifying the legal or terms-of-service basis for the decision, whether automated systems were involved, and the redress mechanisms available. This transparency enables users to understand decisions and supports external accountability.
Internal complaint mechanisms require platforms to provide users with free, accessible processes to appeal content moderation decisions. Appeal decisions must be handled under the supervision of appropriately qualified staff and may not be taken solely by automated means.
Out-of-court dispute settlement establishes certified bodies to resolve content moderation disputes between users and platforms. Platforms must engage in good faith with these proceedings, which give users a low-cost alternative to litigation, although the resulting decisions do not bind the parties.
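Read together, these mechanisms form an escalation ladder for a contested moderation decision. The minimal sketch below traces that ladder; the step labels paraphrase this article, and the `ModerationCase` type is a hypothetical illustration, not a structure defined by the DSA.

```python
from dataclasses import dataclass

# Redress stages distilled from the mechanisms described above.
REDRESS_LADDER = [
    "notice submitted via the reporting mechanism",
    "platform acknowledges receipt",
    "decision communicated with a statement of reasons",
    "internal complaint: free appeal with human oversight",
    "certified out-of-court dispute settlement body",
]

@dataclass
class ModerationCase:
    content_id: str
    stage: int = 0  # index into REDRESS_LADDER

    def escalate(self) -> str:
        """Advance the case to the next redress stage, if one remains."""
        if self.stage < len(REDRESS_LADDER) - 1:
            self.stage += 1
        return REDRESS_LADDER[self.stage]

case = ModerationCase("post-123")
print(case.escalate())  # platform acknowledges receipt
```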
VLOP Obligations
Systemic risk assessments require VLOPs to identify, analyze, and assess risks stemming from their services, including the dissemination of illegal content, impacts on fundamental rights, public health effects, and threats to electoral integrity. Assessments must be conducted at least annually and before significant changes to the service.
Risk mitigation measures require VLOPs to implement reasonable, proportionate measures addressing identified risks. These may include algorithmic adjustments, content moderation changes, terms of service updates, or cooperation with trusted flaggers and authorities.
Independent audits subject VLOPs to annual compliance audits by certified organizations, examining both procedural compliance and the effectiveness of risk mitigation measures. Audit reports are submitted to regulators and published with appropriate confidentiality protections.
Data access provisions require VLOPs to provide vetted researchers with access to data necessary to study systemic risks, subject to appropriate safeguards for user privacy and platform security. The Commission may compel data access if voluntary cooperation fails.
Enforcement Framework
National Digital Services Coordinators serve as primary enforcement authorities in each member state, with the Commission exercising direct supervision over VLOPs. Penalties can reach 6% of global annual turnover, with periodic penalties for ongoing violations and potential service restrictions for serious, repeated non-compliance.
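To make the 6% ceiling concrete, the one-line sketch below computes the maximum fine for a hypothetical turnover figure; the actual penalty would be set by the enforcing authority anywhere up to this cap.

```python
def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound on a DSA fine: 6% of global annual turnover."""
    return 0.06 * global_annual_turnover_eur

# Hypothetical platform with EUR 80 billion in global annual turnover:
print(f"EUR {max_dsa_fine(80e9):,.0f}")  # EUR 4,800,000,000
```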
If you are affected, assess your DSA obligations based on service type and user base, implement the required transparency and content moderation procedures, and prepare for the compliance deadlines: most obligations apply from 17 February 2024, with designated VLOPs subject to their duties earlier.