AI-generated content: Responses are generated by AI, automatically assembled and may contain errors. Conformi is a research tool and does not replace legal advice or case-by-case legal review. All responses should be verified using the linked original sources.

Digital Services Act (DSA) — Regulation on a Single Market for Digital Services

Analysis from 18 April 2026 · 2 sources · Original version of 27.10.2022 · EUR-Lex Original

Does our platform need a compliance officer, risk assessment and advertising repository under the DSA — and what happens if we miss the obligations?

Every intermediary service offered to recipients in the EU has been subject to the DSA since 17 February 2024, with fines of up to 6% of worldwide annual turnover; platforms reaching 45 million average monthly active recipients face additional VLOP obligations enforced directly by the European Commission.

Short Answer

The DSA imposes layered due diligence obligations on all intermediary services operating in the EU: mere conduit, caching, hosting, online platforms, and very large online platforms or search engines (VLOPs/VLOSEs) [Art. 11-43]. All hosting services must implement notice-and-action mechanisms and provide statements of reasons for content restrictions [Art. 16, Art. 17]. Online platforms must additionally ensure complaint handling, advertising transparency and ban dark patterns [Art. 20, Art. 25, Art. 26]. VLOPs/VLOSEs designated by the Commission must conduct annual systemic risk assessments, submit to independent audits and maintain public advertising repositories [Art. 34, Art. 37, Art. 39].

Who is affected

All providers of intermediary services offered to recipients in the EU, regardless of where the provider is established [Art. 2(1)]. The Regulation distinguishes four tiers: (1) all intermediary services including mere conduit, caching and hosting; (2) hosting services that store user-provided content; (3) online platforms that disseminate information to the public, excluding micro and small enterprises [Art. 19]; (4) VLOPs/VLOSEs exceeding 45 million average monthly active recipients in the EU [Art. 33(1)]. Non-EU providers must designate a legal representative in a Member State [Art. 13].

Deadline

The DSA is fully applicable since 17 February 2024 [Art. 93(2)]. All obligations are currently enforceable. VLOP/VLOSE-specific obligations apply 4 months after Commission designation [Art. 33(6)]. Ongoing obligations include: annual transparency reports [Art. 15], semi-annual publication of average monthly active recipients for platforms [Art. 24(2)], annual systemic risk assessments for VLOPs/VLOSEs [Art. 34(1)], and annual independent audits for VLOPs/VLOSEs [Art. 37(1)].

Risk

Fines up to 6% of worldwide annual turnover for non-compliance with DSA obligations [Art. 52(3)]. Periodic penalty payments of up to 5% of average daily worldwide turnover per day of delay [Art. 52(4)]. For incorrect, incomplete or misleading information: up to 1% of annual worldwide turnover [Art. 52(3)]. The Commission enforces directly against VLOPs/VLOSEs with identical fine ceilings [Art. 74]. As a last resort, a Digital Services Coordinator may request judicial authority to temporarily restrict access to the non-compliant service [Art. 51(3)].

Proof

Legal status

  • In force (as of 18 April 2026)
  • Original version of 27.10.2022

What to do now

Legal / DPO

  • Classify your service into the correct DSA tier (mere conduit, hosting, online platform, VLOP/VLOSE) and map applicable obligations per tier, as each tier triggers cumulative due diligence requirements [Art. 2, Art. 3(g), Art. 33].
  • Review and update terms and conditions to include clear, machine-readable information on content moderation policies, restriction criteria, and algorithmic decision-making — including plain-language versions accessible to minors if the service targets them [Art. 14].
  • Designate a legal representative in an EU Member State if your organisation is not established in the Union, and establish a single point of contact for authorities and for recipients of the service [Art. 11, Art. 12, Art. 13].
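The tier classification in the first action item can be sketched as a simple decision function. This is an illustrative helper, not an official taxonomy: the attribute and tier names are our assumptions, while the cumulative structure and the 45-million threshold follow Art. 3 and Art. 33(1).

```python
from dataclasses import dataclass

VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the EU, Art. 33(1)

@dataclass
class Service:
    stores_user_content: bool           # hosting, Art. 3(g)(iii)
    disseminates_to_public: bool        # online platform, Art. 3(i)
    avg_monthly_active_recipients: int  # the figure published under Art. 24(2)

def dsa_tiers(s: Service) -> list[str]:
    """Return the cumulative DSA tiers whose obligations apply to the service."""
    tiers = ["intermediary"]                     # Art. 11-15 apply to all tiers
    if s.stores_user_content:
        tiers.append("hosting")                  # adds Art. 16-18
        if s.disseminates_to_public:
            tiers.append("online_platform")      # adds Art. 19-28
            if s.avg_monthly_active_recipients >= VLOP_THRESHOLD:
                tiers.append("vlop")             # Art. 33-43, after designation
    return tiers
```

Note that crossing the threshold does not by itself trigger the VLOP tier in practice: the obligations apply only 4 months after designation by the Commission [Art. 33(4), Art. 33(6)].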

Compliance

  • Implement a notice-and-action mechanism allowing any person to report illegal content, with electronic confirmation, clear timelines for action, and reasoned decisions communicated to the notifier [Art. 16, Art. 17].
  • Publish annual transparency reports covering content moderation activities, orders received from authorities, notices processed, and use of automated detection — micro and small enterprises are exempt unless designated as VLOP [Art. 15].
  • For VLOPs/VLOSEs: conduct annual systemic risk assessments covering illegal content dissemination, effects on fundamental rights, civic discourse, elections, public health and minors, and implement proportionate mitigation measures [Art. 34, Art. 35].
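The notice-and-action elements above can be modelled as two records. The field names below are our own naming, assumed for illustration; Art. 16(2) and Art. 17(3) list the required elements but prescribe no schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:                      # minimum elements of an Art. 16(2) notice
    reasons: str                   # why the content is allegedly illegal
    exact_url: str                 # exact electronic location of the item
    notifier_name: str             # name and email, with Art. 16 exceptions
    notifier_email: str
    good_faith_statement: bool     # notifier confirms accuracy in good faith

@dataclass
class StatementOfReasons:          # minimum elements under Art. 17(3)
    restriction: str               # e.g. "removal", "demotion", "suspension"
    facts_and_circumstances: str
    automated_means_used: bool     # whether automated detection/decision was used
    legal_or_tc_ground: str        # legal ground or terms-and-conditions clause
    redress_information: str       # internal complaint, ODS body, court
    issued_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def acknowledge(notice: Notice) -> dict:
    """Electronic confirmation of receipt, as required by Art. 16(4)."""
    return {"received": True, "url": notice.exact_url}
```

A statement of reasons must be sent to the affected recipient for each restriction taken on a notified item [Art. 17(1)].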

IT / Security

  • Build an internal complaint-handling system that processes complaints electronically, ensures decisions are made under human supervision and not solely by automated means, and allows recipients at least 6 months to lodge complaints [Art. 20].
  • Implement technical safeguards against dark patterns: the online interface must not deceive, manipulate or materially distort users' ability to make free and informed decisions [Art. 25].
  • For VLOPs/VLOSEs: provide at least one recommender system option that is not based on profiling, maintain a searchable public advertising repository retained for one year, and provide vetted researchers with data access for systemic risk research [Art. 38, Art. 39, Art. 40].
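The six-month complaint window from the first bullet can be enforced with a simple admissibility check. Approximating "at least six months" [Art. 20(1)] as 183 days is our assumption; a provider may grant a longer window, never a shorter one.

```python
from datetime import date, timedelta

# Policy window for internal complaints; must be >= the Art. 20(1) minimum
# of six months. 183 days is an assumed approximation of that minimum.
COMPLAINT_WINDOW_DAYS = 183

def complaint_admissible(decision_notified: date, lodged: date) -> bool:
    """True if the complaint was lodged within the window after notification."""
    deadline = decision_notified + timedelta(days=COMPLAINT_WINDOW_DAYS)
    return decision_notified <= lodged <= deadline
```

Admissible complaints must then be handled in a timely, non-discriminatory and non-arbitrary manner, with decisions taken under qualified human supervision rather than solely by automated means [Art. 20(4), Art. 20(6)].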

Product / Engineering

  • Ensure advertising transparency by clearly marking each advertisement as such, identifying the advertiser and payer, and disclosing the main targeting parameters used — profiling-based ads using special category data under GDPR Art. 9(1) are prohibited [Art. 26].
  • For online marketplaces: implement trader verification (KYBT) before allowing traders to offer products, collecting name, address, ID, payment account and a self-certification of EU law compliance, and make trader identity visible to consumers [Art. 30].
  • Design the platform to protect minors by implementing appropriate privacy, safety and security measures, and prohibit profiling-based advertising targeting recipients known to be minors [Art. 28].
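The trader-verification gate from the marketplace bullet can be sketched as a pre-listing check: no offer goes live until every Art. 30(1) element is on file. The field names mirror the elements listed above but are our own naming, not the Regulation's.

```python
# Elements a marketplace must obtain before a trader may offer products,
# per Art. 30(1); the key names here are illustrative assumptions.
REQUIRED_TRADER_FIELDS = {
    "name",
    "address",
    "id_document",
    "payment_account",
    "self_certification_eu_law",
}

def may_list_products(trader_record: dict) -> bool:
    """Allow listing only when every required element is present and non-empty."""
    provided = {key for key, value in trader_record.items() if value}
    return REQUIRED_TRADER_FIELDS <= provided
```

The marketplace must also make best efforts to assess whether the information is reliable and complete, and suspend traders who fail to correct deficiencies [Art. 30(2), Art. 30(3)].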

Key Terms

Intermediary service
An information society service that consists of mere conduit (transmission), caching (temporary storage for efficient forwarding), or hosting (storage at the request of a recipient) [Art. 3(g)].
Online platform
A hosting service that, at the request of a recipient, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service [Art. 3(i)].
Very Large Online Platform (VLOP)
An online platform reaching at least 45 million average monthly active recipients in the EU, designated by the European Commission, subject to additional systemic risk obligations [Art. 33].
Digital Services Coordinator (DSC)
The independent national authority designated by each Member State as responsible for supervising intermediary service providers established in its territory and enforcing the DSA [Art. 49].
Illegal content
Any information that, in itself or in relation to an activity, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, regardless of the subject matter or nature [Art. 3(h)].
Trusted flagger
An entity awarded trusted flagger status by a Digital Services Coordinator on the basis of its expertise, independence and diligent reporting, whose notices receive priority processing by online platforms [Art. 22].
Recommender system
A fully or partially automated system used by an online platform to suggest, prioritise or rank specific information in the interface presented to recipients of the service [Art. 3(s)].
Dark pattern
Online interface design practices that deceive, manipulate, or materially distort the ability of recipients to make free and informed decisions, prohibited under the DSA [Art. 25].

Frequently Asked Questions

Which services fall under the DSA?
The DSA applies to all intermediary services offered to recipients in the EU, regardless of the provider's place of establishment [Art. 2(1)]. Intermediary services include mere conduit (e.g. internet access providers), caching (e.g. CDNs), hosting services (e.g. cloud storage), online platforms (e.g. social networks, marketplaces) and online search engines [Art. 3(g)-(j)]. The key criterion is whether the service stores, transmits or caches information provided by recipients.
What is a Very Large Online Platform (VLOP) and how is one designated?
A VLOP is an online platform with at least 45 million average monthly active recipients in the EU, corresponding to approximately 10% of the EU population [Art. 33(1)]. The same threshold applies to Very Large Online Search Engines (VLOSEs). The European Commission designates VLOPs/VLOSEs by decision, and the additional obligations apply 4 months after notification of the designation [Art. 33(4), Art. 33(6)].
Are small companies exempt from DSA obligations?
Micro and small enterprises (as defined in the Annex to Recommendation 2003/361/EC) are exempt from the additional obligations in Section 3 (online platform obligations such as complaint handling, trusted flaggers, advertising transparency) [Art. 19]. However, they must still comply with all obligations applicable to intermediary and hosting services [Art. 11-18]. The exemption does not apply if a micro or small enterprise is designated as a VLOP or VLOSE [Art. 19].
What is a trusted flagger and what priority do their notices receive?
Trusted flaggers are entities awarded that status by a Digital Services Coordinator based on demonstrated expertise, independence, and diligent reporting [Art. 22(2)]. Online platforms must process notices from trusted flaggers with priority and without undue delay [Art. 22(1)]. The status can be revoked if the trusted flagger submits a significant number of insufficiently substantiated notices [Art. 22(6)].
What must a systemic risk assessment cover for VLOPs?
VLOPs/VLOSEs must identify, analyse and assess at least annually the systemic risks stemming from (a) the dissemination of illegal content, (b) negative effects on fundamental rights including freedom of expression and privacy, (c) negative effects on civic discourse, electoral processes, and public security, and (d) negative effects related to gender-based violence, public health and the protection of minors [Art. 34(1)]. The assessment must consider the design of recommender systems, content moderation, terms and conditions, advertising systems and data practices [Art. 34(2)].
Who enforces the DSA?
Each Member State must designate a Digital Services Coordinator (DSC) as primary enforcer for providers established in its territory [Art. 49]. The European Commission has exclusive enforcement powers over VLOPs and VLOSEs regarding their additional Section 5 obligations [Art. 56(2), Art. 65]. The European Board for Digital Services coordinates consistent application across Member States [Art. 61].
Can users claim compensation for DSA violations?
Yes. Recipients of the service have a right to seek compensation from providers of intermediary services for any damage suffered as a result of the provider's infringement of its DSA obligations, in accordance with applicable Union and national law [Art. 54].