New obligations for online platforms under the EU Digital Services Act

The Digital Services Act (DSA) is an ambitious piece of legislation enacted by the EU legislature, which entered into force on 16 November 2022. Some provisions of this comprehensive regulation are already applicable, while others will become applicable as of 17 February 2024.

The DSA fully harmonizes the rules that apply to online intermediary services (mere conduit services, caching services and hosting services) in the internal market. To provide some context, a “mere conduit” service consists of the transmission, in a communication network, of information provided by a service recipient, or the mere provision of access to a communication network. A “caching” service likewise consists of the transmission of information provided by a service recipient in a communication network, but involves the automatic and temporary storage of that information, performed for the sole purpose of making the onward transmission of the information to other recipients more efficient. The third and most prevalent type of intermediary service, the “hosting” service, consists of the storage of information which is provided by a service recipient and which takes place at that recipient’s request.

The objective of the regulation is to ensure a safe, predictable and trusted online environment. More specifically, it aims to address the dissemination of illegal content online and the societal risks that the spread of disinformation may generate, while at the same time effectively protecting the fundamental rights of EU citizens and fostering innovation. The addressees of the new rules include social networks, online marketplaces, app stores, content-sharing platforms and online travel and accommodation platforms.

It must be noted that the rules established by the DSA will apply to providers of intermediary services even if they are established or located outside the EU, provided that they offer services in the EU and there is a substantial connection to the Union. Such a substantial connection should be considered to exist where the service provider has an establishment in the EU or, in the absence of such an establishment, where the number of service recipients in one or more Member States is significant in relation to their population, or on the basis of the targeting of activities towards one or more Member States.

One concept of significant importance which is established by the Digital Services Act is the notion of a “very large online platform”. This concept is a key component of the regulatory toolbox of the EU legislator in its ambitious aim to regulate Big Tech and create a safe online environment free from illegal and harmful content.

Very large online platforms (VLOPs) are platforms which have an average number of monthly active recipients of the service in the EU equal to or higher than 45 million (10% of the EU population), and which have additionally been designated as VLOPs by a decision of the European Commission.

The platforms currently designated as VLOPs by the European Commission are: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and Zalando. It can be noted that two of those platforms, Amazon and Zalando, have initiated proceedings before the General Court of the EU challenging their designation as VLOPs. Additionally, two search engines, Google Search and Bing, have been designated as very large online search engines (VLOSEs).

Providers of online platforms will have to publish information on their average monthly active recipients of the service in the EU at least once every six months. On the basis of that data, or of any other information available to it, the European Commission will verify whether an online platform reaches the above-mentioned threshold of 45 million and will adopt a decision designating it as a VLOP, or alternatively terminating that designation, and will then communicate that decision to the platform concerned.

It must be noted that the obligations established by the Digital Services Act are already applicable to the above-mentioned entities designated as VLOPs and VLOSEs as of 25 August 2023.

These obligations include, among others, the following:

  • Risk assessment – VLOPs must identify and assess any systemic risks in the EU stemming from the design or functioning of their service and its related systems (risks related to illegal content, public security, electoral processes, public health, etc.). Platforms must preserve the supporting documents of those risk assessments for at least three years after their performance and, upon request, communicate them to the Commission and the national regulator;
  • Risk mitigation – Based on the above-mentioned risk assessment, VLOPs must put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified, with particular consideration to the impacts of such measures on fundamental rights;
  • Independent audit – VLOPs will be subject, at their own cost and at least once a year, to independent external audits assessing their compliance with the DSA;
  • Crisis response – In case of crisis (threat to public security or public health), the Commission may adopt a binding decision that requires a VLOP to assess how the use of its services may contribute to the threat and to apply specific measures in order to eliminate any such contribution to the threat;
  • Advertising transparency – VLOPs must have a publicly available repository of all the advertisements served on their interface;
  • Data access – VLOPs must provide national regulators and the Commission with access to data that are necessary to monitor and assess their compliance with the regulation. This would also include explaining the design, logic and functioning of their algorithmic systems;
  • Compliance function – VLOPs must establish an internal compliance function which is independent from their operational functions.

It needs to be underlined that the Digital Services Act will significantly affect not only Big Tech but practically all online platforms. In addition to the special requirements applicable to very large online platforms, the DSA introduces important obligations that will apply to virtually every online platform offering its services in the EU.

The DSA defines an online platform as a type of hosting service. In particular, it is “a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service”.

All hosting service providers and online platforms, without any exceptions, will have to establish mechanisms that allow any individual or entity to notify them of content present on their service which that individual or entity considers to be illegal. The provider may decide to remove or disable access to content uploaded by users of the service. In that case, the platform will be obliged to inform the affected users of the decision and to provide a clear and specific statement of reasons for it. That statement should include information about the possibilities for redress available to those users (the existence of an internal complaint-handling system, options for out-of-court dispute settlement and judicial redress).

There is a bundle of additional obligations that will apply to all online platforms with the exception of micro and small enterprises. Under the Commission Recommendation of 6 May 2003, a small enterprise is one which employs fewer than 50 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 10 million, while a microenterprise is one which employs fewer than 10 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 2 million. These additional obligations include, among others, the following requirements:

  • Platforms must provide users with access to an effective internal complaint-handling system for a period of at least six months following a decision to remove content, to suspend or terminate a user’s account, or to suspend or terminate the provision of the service;
  • Recipients of the service that are affected by the above-mentioned decisions will also be entitled to bring their case before an out-of-court dispute settlement body that has been certified by the national regulator under the DSA;
  • Article 25 of the DSA imposes a prohibition on the use of so-called “dark patterns”. In particular, providers of online platforms shall not design or organize their online interfaces in a way that deceives or manipulates service recipients, or in a way that otherwise materially distorts or impairs their ability to make free and informed decisions. For example, online platforms will be prohibited from presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory or other components when asking the recipient of the service for a decision. They will likewise be prohibited from repeatedly requesting a recipient of the service to make a choice that has already been made, and from making the procedure of cancelling a service significantly more cumbersome than signing up to it;
  • Providers of online platforms that are accessible to minors are under an obligation to put in place appropriate measures to ensure a high level of privacy, safety and security of minors. Furthermore, providers of online platforms are prohibited from presenting advertisements on their interface based on profiling when they are aware with reasonable certainty that the recipient of the service is a minor;
  • Online platforms must ensure that notices submitted by “trusted flaggers” are processed and decided upon with priority. “Trusted flaggers” are entities which have been awarded that status by a national regulator and which have particular expertise in detecting and notifying illegal content;
  • Platforms will be entitled to suspend, for a reasonable period of time, the provision of their services to recipients that frequently provide manifestly illegal content. However, such suspension can be put in place only after a prior warning has been given;
  • In case a platform displays advertising on its interface, it must provide each user, in real time, with the following information: the fact that the displayed content is an advertisement; the entity on whose behalf the advertisement is presented and, where different, the entity who paid for it; and meaningful information about the parameters used to determine the recipient of the advertisement and, where applicable, how to change those parameters;
  • If an online platform allows consumers to conclude distance contracts with traders, it must ensure that such traders can use its services only if the platform has obtained certain identifying information from them in advance, together with a self-certification by which the trader commits to offer only products or services that comply with EU law. The platform will also have to make some of this information available to end users;
  • Platforms that allow the conclusion of distance contracts with traders must design their online interface in a way that enables such traders to comply with their obligations regarding pre-contractual information, compliance and product safety information;
  • If a provider of an online platform that allows consumers to conclude distance contracts with traders becomes aware that an illegal product or service has been offered to consumers located in the EU through its services, that provider will be under an obligation to inform the consumers who purchased the illegal product or service through its services of the following: (i) the fact that the product or service is illegal; (ii) the identity of the trader; and (iii) the means for redress.

Online platforms that offer their services in the EU and have fewer than 45 million active users will have to comply with all of those requirements as of 17 February 2024.