Digital Services Act (DSA) - what you need to do now!



The Digital Services Act (DSA) is the EU regulation that, since 25 August 2023, has been aimed in particular at combating so-called disinformation (the German version of this article can be found here). The Digital Services Act (DSA) was introduced for this purpose by the EU Commission. "Digital services" refers to a wide range of online services, from websites to online platforms. The companies addressed include, for example, online marketplaces, social networks and app stores (collectively referred to as "platforms"). Platforms offering their services in the EU are obliged to actively combat the dissemination of illegal content and other societal risks.


Digital Services Act - what the DSA is all about


The primary intention behind the Digital Services Act, or DSA, is to strengthen security in the digital environment, protect the fundamental rights of users and create a level playing field for businesses. This is intended to promote innovation, growth and competitiveness in both the European and international markets.


Member states are required to adapt their national legislation to the provisions of the DSA and thus give effect to the Digital Services Act. Companies, in turn, are obliged to carefully review the requirements of the DSA and take appropriate measures where necessary. The Digital Services Act becomes mandatorily applicable in all member states by 17 February 2024 at the latest.


For VLOPs (so-called "Very Large Online Platforms") and VLOSEs (so-called "Very Large Online Search Engines"), the Digital Services Act (DSA) has already applied since 25 August 2023. Companies fall into this category if their service has an average of at least 45 million monthly active users in the Union. An overview of the platforms designated by the Commission as VLOPs and VLOSEs can be found here: https://ec.europa.eu/commission/presscorner/detail/en/IP_23_2413.


This article focuses on the new obligations under the DSA and summarises some of the crucial aspects of the Digital Services Act.


Digital Services Act - what is the core problem?


The Digital Services Act closes a gap: until now, platforms were subject to practically no regulatory control. They essentially regulated themselves and set their own guidelines. In their so-called "transparency reports", they sometimes explained what measures they were taking to prevent the spread of disinformation, for example. However, such statements were almost impossible to verify. The DSA now aims to change this.


DSA - What does the Digital Services Act contain?


Increased autonomy and safety for users are the focus of the DSA


The DSA contains various obligations that specific companies in the digital sector must fulfil. Accordingly, the Digital Services Act requires technology companies to implement new procedures for removing unlawful content such as hate speech, incitement to terrorism and child sexual abuse material.


At the same time, the DSA underlines the prohibition of general "content monitoring", i.e. the systematic and continuous monitoring, collection, analysis and archiving of content. New rights allow users to challenge moderation decisions: if platforms block accounts or downgrade or delete content, they must justify this to the users concerned.


Furthermore, the Digital Services Act provides for measures to force tech giants to increase transparency regarding the algorithms they use. Tech giants are large, influential technology companies that dominate the digital economy and operate in areas such as IT, software and the internet. Examples are Apple Inc, Amazon.com Inc, Google LLC, Meta Platforms Inc, Microsoft Corp and Alibaba Ltd, whose services have in turn all been designated as VLOPs or VLOSEs. Under the DSA, these platforms must set out in their terms of use exactly how they moderate content and how their algorithmic recommendation systems work. In addition, they are obliged under the Digital Services Act to offer users at least one alternative recommendation or feed option without "profiling", i.e. without targeted marketing based on customer profiles.


In the future, users of platforms such as Instagram or TikTok will have the right to receive information about why certain content is displayed in their feed. Those who do not want personalised recommendations can opt out of this process to prevent their data from being processed for this purpose ("opt-out"). Meta Platforms Inc. has announced in response to the DSA that it will offer its users a chronologically ordered feed.
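
How such a non-profiling option might look in practice is, of course, up to each platform. The following Python sketch is purely illustrative; all names in it are hypothetical and it merely shows the idea of switching from a profiling-based ranking to a chronological feed:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        created_at: datetime
        relevance_score: float  # produced by a profiling-based ranker

    def build_feed(posts: list[Post], profiling_opted_out: bool) -> list[Post]:
        # If the user has opted out of profiling, fall back to a purely
        # chronological ordering that uses no personal data for ranking.
        if profiling_opted_out:
            return sorted(posts, key=lambda p: p.created_at, reverse=True)
        return sorted(posts, key=lambda p: p.relevance_score, reverse=True)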


Protection against manipulative advertising through the Digital Services Act


The Digital Services Act includes restrictions on targeted advertising and misleading designs. For example, the DSA prohibits advertising targeted at children and profiling based on "sensitive" characteristics such as religious belief or sexual orientation. In addition, the DSA introduces restrictions on platform design that must not mislead or manipulate users, including "dark patterns", i.e. design patterns that are intended to lead users to take actions they might not otherwise have taken.


General transparency and reporting requirements of the Digital Services Act oblige platforms to produce annual reports on their content moderation. These reports must state the number of items of illegal content deleted on the order of Member States or following notices from "Trusted Flaggers", as well as the volume of user complaints. "Trusted Flaggers" are organisations or users on online platforms that the platform or company has deemed trustworthy and reliable enough to perform certain tasks. Typically, they are tasked with reporting or reviewing inappropriate or problematic content, assisting the platform in identifying and removing offensive, dangerous or objectionable material and thereby ensuring compliance and platform safety.
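
For illustration only, the report contents named above could be captured in a structure like the following Python sketch; the DSA prescribes what must be reported, not a data format, and all field names and figures here are our own assumptions:

    from dataclasses import dataclass

    @dataclass
    class TransparencyReport:
        year: int
        removals_ordered_by_member_states: int   # deletions ordered by authorities
        removals_after_trusted_flagger_notices: int
        user_complaints: int                     # volume of complaints received

    report = TransparencyReport(2023, 1200, 4800, 15000)  # hypothetical figures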


To reduce systemic risks, the DSA introduces further requirements for the largest platforms. Under the Digital Services Act, the so-called VLOPs must examine, according to formal guidelines, how their products, including algorithmic systems, could amplify societal risks. They must then take measurable action to mitigate these risks.


What is unlawful content under the DSA?


The DSA refers to "unlawful content" in many places. The regulation defines this as content that conflicts with Union law or the law of the respective Member State. Depending on the context, this covers not only explicitly unlawful content but also activities that themselves violate applicable law, such as the provision of services contrary to the requirements of consumer protection law. The Digital Services Act lists a catalogue of examples to illustrate some of the illegal content meant. Reference is often made to the term hate speech, which, as a primarily political term, is difficult to pin down in a legal framework, which is why the DSA explicitly refers to unlawful hate speech.


What do companies have to consider as a result of the Digital Services Act?


The Digital Services Act distinguishes between different categories of providers with regard to the respective obligations. Exemptions apply to micro and small enterprises, i.e. companies with fewer than 50 employees and an annual turnover/balance sheet total of up to EUR 10 million. Depending on the classification of the respective company, stricter or less strict DSA obligations apply. We have compiled the most important obligations for you below (a schematic sketch of how the size thresholds translate into provider categories follows the lists). You can also download them here.


1. Most important obligations for intermediary services, Art. 11 et seq. DSA:

  • Obligation to designate a central contact point for authorities and users;
  • Obligation to revise the general terms and conditions (in particular with regard to information on all guidelines, procedures, measures and tools used in content moderation);
  • Obligation to publish annual transparency reports.


2. Additional obligations for hosting service providers, Art. 16 et seq. DSA (including online platforms):

  • Establish a reporting and redress procedure for illegal content;
  • Obligation to give reasons for any restrictions on content to the users concerned;
  • Obligation to report suspicions of criminal offences to the competent authorities.


3. Additional important obligations for providers of online platforms, Art. 20 et seq. DSA:

  • Integration of an internal complaints management system;
  • Obligation to inform users about access to an out-of-court dispute resolution body;
  • Requirement of technical and organisational measures for the priority processing of reports from trusted flaggers;
  • Obligation to temporarily suspend services for users who frequently provide manifestly illegal content;
  • Expanded transparency reporting obligations, especially with regard to average monthly user numbers;
  • Prohibition of dark patterns in the design and organisation of online interfaces;
  • Obligation for transparent labelling of advertising;
  • Obligation to state the most important parameters of any recommendation systems used in the general terms and conditions;
  • The need for appropriate and proportionate measures to protect minors.


4. Additional important obligations for online platforms in distance selling, Art. 30 et seq. DSA:

  • Obligation to obtain information from traders to establish identity and traceability;
  • Online interfaces shall be designed and organised in such a way that traders can comply with their obligations regarding pre-contractual information, conformity and product safety information under applicable Union law;
  • Duty to inform consumers as soon as they become aware of illegal products and services offered via the platform.


5. Additional duties of very large online platforms and search engines, so-called VLOPs and VLOSEs (at least 45 million monthly active users):

  • Obligation to carry out and document an annual risk assessment in relation to the design and operation of the service;
  • Obligation to take appropriate, proportionate and effective risk mitigation measures;
  • Implementation of appropriate measures in the event of a crisis as decided by the Commission;
  • Duty to undergo annual independent audits of compliance with obligations and commitments under the DSA and to cooperate with the auditing organisations;
  • Obligation to present an option without "profiling" in the context of recommendation systems;
  • Additional transparency obligations for online advertising;
  • Duty to provide access to data for monitoring and assessing compliance with the DSA;
  • Obligation to establish a compliance department;
  • Obligation to pay an annual supervision fee set by the Commission.
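
As announced above, the size thresholds can be pictured schematically. The following Python sketch is deliberately simplified and purely illustrative; the legal classification is more nuanced than a three-way switch, and the thresholds used are simply those stated in the text above:

    def dsa_category(employees: int, annual_turnover_eur: float,
                     monthly_active_users_eu: int) -> str:
        # At least 45 million average monthly active users in the Union:
        if monthly_active_users_eu >= 45_000_000:
            return "VLOP/VLOSE (strictest obligations, categories 1-5)"
        # Micro/small enterprise: fewer than 50 employees and up to
        # EUR 10 million annual turnover/balance sheet total:
        if employees < 50 and annual_turnover_eur <= 10_000_000:
            return "micro/small enterprise (exempt from platform-specific rules)"
        return "other intermediary/hosting/platform provider (categories 1-4)"

    print(dsa_category(30, 5_000_000, 100_000))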


How will the Digital Services Act be enforced?


Platforms are required to share their internal data with independent auditors, EU authorities and national authorities. Compliance with the regulation is to be ensured by a newly created "European Digital Services Board" and 27 national "Digital Services Coordinators" under the direct supervision of the Commission. These bodies have the power to investigate various breaches of the regulation and to take (provisional) deletion decisions. Intentional or negligent non-compliance with the DSA can be punished with fines of up to 6% of the company's total turnover in the previous business year.
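
To make the 6% ceiling tangible, here is a minimal worked example in Python; the turnover figure is hypothetical, and the actual fine is set case by case by the competent authority:

    def max_dsa_fine(previous_year_turnover_eur: float) -> float:
        # Upper bound only: 6% of total turnover in the previous business year.
        return 0.06 * previous_year_turnover_eur

    # A hypothetical turnover of EUR 500 million caps the fine at EUR 30 million:
    print(f"EUR {max_dsa_fine(500_000_000):,.0f}")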


In Germany, responsibility for implementing the Digital Services Act (DSA) lies primarily with the Federal Network Agency (Bundesnetzagentur). The agency is to be expanded into a digital authority that monitors not only the traditional infrastructure operators but also the digital economy. It will thereby become the Digital Services Coordinator within the meaning of the DSA and receive the necessary powers and obligations at the national level.


FAQs on the Digital Services Act


What does the Digital Services Act regulate?

The aim of the DSA is to create a safer digital space where users' fundamental rights are protected and businesses are offered a level playing field.

In addition, the Digital Services Act will in future regulate the activities of digital service providers within the EU, making it one of the most significant pieces of digital-policy regulation in Europe.

The EU Digital Services Act (DSA) and the Digital Markets Act (DMA) are harmonised rules. They pursue two important objectives:

  1. Creating a safer digital environment in which the fundamental rights of all users of digital services are protected
  2. Creating a level playing field to promote innovation, growth and competitiveness, both in the European internal market and globally

The EU Commission's strategy aims to strengthen the European single market in the digital age through innovation, growth and competitiveness.


To whom does the Digital Services Act apply?

The Digital Services Act applies to intermediary services offered to users in the EU. The term is very broad and covers all digital services that provide consumers with access to services, content and goods, i.e. that act as intermediaries. The DSA looks in particular at access, caching and hosting providers. In addition to general internet services such as telecoms, this includes online platforms such as social networks, search engines and online trading platforms. For the application of the Digital Services Act, it is initially irrelevant how large the service is or how many users it has. However, micro and small businesses are not subject to the specific rules for online platforms and online trading platforms. In addition, particularly strict rules apply to very large online platforms and search engines (VLOPs and VLOSEs). To be designated as such, a service must have at least 45 million monthly users and the EU Commission must have explicitly designated it. This is intended in particular to address large technology companies such as Google or Meta.


What is the EU Digital Services Act?

The EU passed the Digital Services Act in 2022. Its purpose is to ensure that websites and search engines remove illegal content more quickly than before. Users are also to find it easier to report such content. In general, large providers have to follow stricter rules than small ones.

The Digital Services Act complements the e-Commerce Directive and updates parts of it. It creates uniform horizontal rules on due diligence obligations and liability exemptions for intermediary services (such as online platforms) in order to establish a safe, predictable and trustworthy online environment and to support the EU single market for intermediary services.

Because of the particular risks they pose, extended rules apply to very large online platforms that reach more than ten percent of the 450 million consumers in Europe, i.e. at least 45 million users.

In addition, in future there will be uniform procedures throughout Europe for reporting and immediately removing illegal content. There are also additional responsibilities for very large online platforms and search engines.


What changes does the Digital Services Act bring?   

An EU official has emphasised with regard to the Digital Services Act that, in future, terms and conditions should be written so that even a child can understand them. Customer service must also meet a minimum standard. Online marketplaces such as Amazon or Alibaba's AliExpress, for example, must remove counterfeit clothing or dangerous toys as far as possible and inform buyers about them.

Online marketplaces must also ensure that consumers can easily find out who they are doing business with. Suppliers and vendors on online platforms must disclose their contact details, trade register entries and other relevant information.

Online platforms and search engines are not only to delete illegal content more quickly than before; in future they must also report in detail to the EU Commission on the dangers their services pose to citizens in Europe. Services such as Snapchat and YouTube must investigate whether they incite violence or impair freedom of expression and take action if necessary. They must also disclose which posts they remove and provide opportunities to object.

In future, the DSA will no longer allow advertising to be targeted at children. Platforms must take measures to protect the safety, privacy and mental health of minors.



Digital Services Act: What companies need to do now!

Because they fall into one of the categories of providers, companies are potentially directly affected by the DSA. Each company must therefore be classified according to the criteria of the regulation in order to know which due diligence obligations must be complied with and to what extent. If your company falls within the scope of the DSA, the applicable obligations should be examined and implemented now.


We support you in the audit and advise you on the implementation of the obligations under the DSA.


We support you in the implementation of the Digital Services Act


Our lawyers will be happy to assist you with the implementation of the DSA. You can rely on our expertise in IT law. Call us, send us an email or use our contact form.

