The Digital Services Act ("DSA") is a piece of EU legislation that establishes responsibilities for all parties involved in the digital field, as well as the essential elements of the institution of the Digital Services Coordinator. The DSA imposes stricter requirements on very large online platforms. To understand the reason for imposing such measures on some economic actors, we must clarify the definition of a very large online platform: a platform with at least 45 million monthly users in the EU during one calendar year, a scale considered to pose a significant risk to society.
Once the Commission designates a platform as a VLOP ("Very Large Online Platform") or a search engine as a VLOSE ("Very Large Online Search Engine"), based on the user numbers that all platforms and search engines, regardless of size, were obliged to publish by 17 February 2023, the respective platform or search engine must comply with the DSA within four months. Under the DSA, platforms and search engines must update these figures at least once every six months.
Designation triggers specific rules addressing the particular risks these large services pose to European citizens and to society with respect to illegal content, as well as their impact on fundamental rights, public safety and wellbeing.
The Commission will revoke its decision if the platform or search engine no longer reaches the threshold of 45 million monthly users during one calendar year.
On 25 April 2023, the Commission designated the following as VLOPs and VLOSEs:
VLOPs: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, News, Google Maps, Google Shopping, etc.
VLOSEs: Google Search, among others.
The designated very large platforms must comply with the regulation sooner than other economic actors.
Some of the VLOP and VLOSE obligations are:
- To establish a contact point;
- To report criminal offences;
- To provide terms of use that are easy to navigate and understand;
- To be transparent about advertising, recommender systems and content moderation decisions;
- To identify and assess risks relating to:
  - Illegal content;
  - Fundamental rights, such as freedom of expression, media freedom and pluralism, non-discrimination, consumer protection and children's rights;
  - Public security and electoral processes;
  - Gender-based violence, public health, the protection of minors, and mental and physical wellbeing.
Part of the compliance procedure is implementing a risk management system, which entails identifying and mitigating potential dangers to society, such as harmful content.
The risk management procedure requires very large platforms to apply the relevant mitigation measures annually. These measures are verified by independent auditors, and the audit firm then submits its risk assessment report to the European Commission and to the national authorities of the Member State where the provider's headquarters are located.
Enforcement responsibilities are divided between the EU executive, as the main enforcer with respect to very large platforms, and, for smaller actors, the Digital Services Coordinator, which collaborates with other competent national authorities.
The European Board for Digital Services will ensure consistent cross-border enforcement and cooperation.
Member States may delegate some tasks to authorities other than the Coordinator, which nonetheless remains the single point of contact for companies established within their territory and presents the national position before the Board.
The Digital Services Coordinators have, among other tasks, the power to carry out joint investigations, to cooperate with their counterparts in cross-border cases, and to request that the European Commission assess whether a large platform poses a systemic issue.
The Board may also cooperate with EU bodies and agencies on matters regarding data protection, competition and non-discrimination.
Each Member State has a further year to designate its Digital Services Coordinator. This means that Romania, too, must designate, by 17 February 2024, a politically independent institution as its Digital Services Coordinator. This institution must be established and operate in accordance with the DSA's provisions; in practice, it will supervise digital platforms and ensure that the DSA is enforced.
If the Government does not choose to establish a new national authority for this role, one of the existing authorities, such as ANCOM, the CNA, the Ombudsman, the Ministry of Research, Innovation and Digitalisation or the National Directorate for Cyber Security, will most likely take on this function.
The Digital Services Coordinator acts as a complaints office for users, as provided by Article 53 of the DSA, and also gives researchers access to the data generated by the very large platforms. Under Article 36 of the DSA, in crisis situations the Coordinator may make recommendations within its field of expertise. In certain special situations, content on an online platform may be illegal in one Member State but permitted in another; in such cases, the Coordinator may initiate a request for removal of that content.