Digital Services Act: what it entails and what changes

As of February 17, 2024, the obligations introduced by Regulation (EU) 2022/2065 on a Single Market for Digital Services (Digital Services Act, or DSA) will apply to all intermediary service providers operating in the European Union.

With regard to the so-called Big Tech companies, some of the Digital Services Act's provisions came into effect on August 25, 2023, introducing new transparency and oversight obligations aimed at creating a safer digital environment for all users.

The Big Tech companies identified by the European Commission

The Digital Services Act imposes stricter obligations and responsibilities on very large online platforms and very large online search engines, i.e. those online service providers whose services have an average of 45 million or more monthly active recipients in the Union.

After the entry into force of the DSA (on November 16, 2022), online platforms had three months to report the number of active end users of their services. Based on the data provided, the Commission identified the platforms that qualified as very large.

In April 2023, the companies concerned were notified through the Commission's designation decisions and, from that moment, had four months to comply with the obligations laid down by the DSA.

The list of Big Tech companies drawn up by the Commission includes Amazon, Apple App Store, Booking, Facebook, Google, Instagram, LinkedIn, TikTok, Twitter, Wikipedia, YouTube, Pinterest, Snapchat, Zalando, Bing and Alibaba AliExpress.

The new obligations of the Big Tech companies

Because these platforms can reach 10% of the European population every month (the 45-million-recipient threshold corresponds to roughly 10% of the EU's approximately 450 million residents), the DSA lays down a set of obligations aimed at preventing and containing the risks users may face when using the services provided by large platforms.

In designing these new obligations, the European legislator recognized that these platforms can have a significant impact on society; they are therefore required to identify, analyze and periodically report to the Commission the systemic risks stemming from the design and functioning of their services. When carrying out this risk assessment, Big Tech companies must evaluate, among other things, risks connected to the dissemination of illegal content through their services, risks to fundamental rights and freedoms, and potential negative effects of the service on public debate and electoral processes. Once the risks are identified, platforms must implement all measures necessary to contain and prevent them, and undergo periodic independent audits.

Besides identifying risks, large platforms are required to be more transparent towards users about their terms and conditions, advertising, and content recommendation and moderation systems. Moreover, stronger guarantees for the protection of minors and specific measures to tackle illegal content are introduced.

A closer look at content recommender algorithms

Platforms that use content recommender algorithms will have to clearly explain how these systems work and which parameters are used to suggest certain content to users.

Moreover, Big Tech companies must always ensure that users can choose to view content in an order that is not determined by profiling mechanisms.
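To make these two requirements concrete, here is a minimal, purely illustrative TypeScript sketch of how a platform might implement them: the main ranking parameters are disclosed in plain language, and a non-profiled chronological ordering is offered alongside the profiled one. All type and function names (Post, orderFeed, and so on) are hypothetical and are not drawn from any real platform's code or from the text of the DSA.

```typescript
// Purely illustrative sketch: all names are hypothetical, not part of
// any real platform's API or mandated verbatim by the DSA.

interface Post {
  id: string;
  publishedAt: Date;
  relevanceScore: number; // computed from profiling signals
}

// Transparency: the main parameters of the recommender system are
// disclosed to users in plain language (cf. Article 27 DSA).
const rankingParameters: string[] = [
  "recency of the post",
  "past interactions of the user with similar content",
  "popularity among accounts the user follows",
];

type FeedOrdering = "recommended" | "chronological";

// The "chronological" option orders content without profiling,
// reflecting the DSA requirement that very large platforms offer at
// least one recommender option not based on profiling (cf. Article 38).
function orderFeed(posts: Post[], ordering: FeedOrdering): Post[] {
  if (ordering === "chronological") {
    // No profiling: sort strictly by publication time, newest first.
    return [...posts].sort(
      (a, b) => b.publishedAt.getTime() - a.publishedAt.getTime()
    );
  }
  // Profiled ranking: sort by the personalised relevance score.
  return [...posts].sort((a, b) => b.relevanceScore - a.relevanceScore);
}
```

The point worth noting in this sketch is that the non-profiled path depends only on an objective attribute of the content (its publication time) and never on data about the individual user.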

Ilaria Feriti