The scale of online disinformation is widely considered one of the greatest challenges to providing users with a “safe, predictable and trusted online environment”.


For years, individual European Union countries have passed various legal acts to reduce the risks associated with disinformation. At the EU level, a fundamental change came with the adoption of the Digital Services Act (DSA) Regulation, which for the first time introduced EU-wide legal obligations to combat disinformation, together with liability for addressees who fail to fulfil those obligations.


For very large online platforms and search engines, these rules came into force on 25 August 2023; for all other intermediaries, they apply from 17 February 2024. The following are ten key takeaways on the fight against disinformation under the Digital Services Act.


1. Disinformation – definition and types of harm in European Union documents.

Disinformation activities are undertaken for a variety of reasons – mainly to achieve political objectives and/or economic gain. They may result in:

a) public harm (e.g. undermining the democratic electoral process),

b) personal harm (e.g. damage to the good name of a particular natural or legal person).


Calls to combat disinformation have appeared for years in various European Union documents. In those documents, disinformation is understood as “the deliberate dissemination of false or misleading information aimed at undermining trust in institutions, societies and particular people”. Against this background, previous EU documents have focused on the prevention of public harm. It is therefore worth noting that some national laws extend measures against disinformation to the prevention of personal harm as well. Such a solution was also adopted in the draft law on the protection of freedom of expression in online social networks prepared by the Polish Ministry of Justice (the act has not yet been passed).


2. Disinformation under the Digital Services Act.

One of the main objectives of the DSA Regulation was to provide “a safe, predictable and trusted online environment” (Article 1(1)), which undoubtedly includes combating disinformation, as evidenced by the use of the term in the recitals of the Digital Services Act.


The provisions of the Digital Services Act do not contain a legal definition of disinformation. As a consequence, there is a risk of fragmentation in the fight against it across the individual member states of the European Union. For Internet intermediaries other than Very Large Online Platforms (VLOPs) or Very Large Online Search Engines (VLOSEs), the obligation to remove content applies only to illegal content, while not all EU countries prohibit disinformation by law (see point 4 below).


3. Two basic ways to combat disinformation in the Digital Services Act.

Under the Digital Services Act, there are two main ways to combat disinformation:

a) as illegal content within the meaning of the DSA, which all online intermediaries are obliged to combat,

b) as socially harmful content, for which the DSA imposes specific obligations only on a selected category of online intermediaries (“Very Large Online Platforms” and “Very Large Online Search Engines”).


4. Obligation to combat disinformation as illegal content within the meaning of the Digital Services Act.

According to Article 3(h) of the Digital Services Act, illegal content within the meaning of the DSA is ‘information which, by itself or by reference to an activity, including the sale of products or the provision of services, is incompatible with Union law or with the law of any Member State which is compatible with Union law, irrespective of the specific subject matter or nature of that law’. As mentioned above, not all countries of the European Union consider the same content illegal. As a result, the same content may be subject to the removal obligation in one country and at the same time be freely disseminated in another (e.g. Poland has not adopted legislation prohibiting public reporting of the allegedly harmful effects of a COVID-19 vaccine).


From the point of view of combating disinformation, it is important that, where disinformation is considered illegal under the law of a given EU country, all online intermediaries, including online platforms, are obliged to take action to remove it, under pain of administrative fines.


5. Instruments in the DSA aimed at combating disinformation as illegal content.

Under the DSA, the following legal instruments aimed at combating illegal content should be mentioned:

a) the obligation to take action against illegal content on the basis of an order issued by the relevant judicial or administrative authorities (Article 9(1) of the DSA),

b) the obligation for online intermediaries to put in place mechanisms enabling any person or entity to report information that the person or entity considers to be illegal content (the notice and action mechanism) (Article 16 of the DSA),

c) the obligation to ensure priority handling of notices submitted by entities referred to as trusted flaggers, operating in designated areas in which they have expertise (Article 22 of the DSA).


6. Disinformation and prohibition of targeted advertising based on profiling sensitive data.

The Digital Services Act also introduces legal regulation of online advertising, which may reduce the dissemination of disinformation. Article 26(3) of the DSA prohibits targeted advertising based on the profiling of sensitive data within the meaning of Article 9 of the GDPR. Such data includes, inter alia, information on ‘political opinions’.


7. Disinformation and DSA – responsibilities of very large online platforms (VLOPs) and very large online search engines (VLOSEs).

Some platforms are also obliged to prevent the dissemination of harmful content, which does not necessarily have to be illegal under European Union law or the national laws of EU member states. This is, in particular, the case for online intermediaries that have obtained the status of Very Large Online Platform (VLOP) or Very Large Online Search Engine (VLOSE) because they have an average of at least 45 million monthly active users in the Union and have therefore been designated as such by the European Commission.


From the point of view of combating disinformation, VLOPs and VLOSEs are subject to two additional obligations compared with other online intermediaries:

a) the obligation to assess the systemic risks arising from the design, operation and use of their services, as well as from the potential misuse of services by recipients of the service (Article 34); when assessing systemic risks, they should focus on the systems or other elements that may contribute to the risk, including any algorithmic systems that may be relevant, in particular their recommendation and advertising systems;

b) the obligation to take measures to address identified systemic risks (Article 35).


8. Disinformation as a systemic risk under the DSA.

In the light of the DSA, disinformation may primarily constitute two of the systemic risks defined in the provisions of the Digital Services Act:

a) the risk of an actual or foreseeable negative impact on democratic processes, civic discourse and electoral processes, as well as on public security (recital 82),

b) the risk of an actual or foreseeable negative effect on the protection of public health and minors, of serious negative consequences for a person’s physical and mental well-being, or of gender-based violence. Such risks may also stem from coordinated disinformation campaigns related to public health, or from online interface design that may stimulate behavioural addictions of recipients of the service (recital 83).


9. Combating disinformation – additional obligations of VLOPs and VLOSEs.

The Digital Services Act also provides for additional obligations that may be relevant to the fight against disinformation by VLOPs and VLOSEs. According to recital 108 of the DSA, the European Commission may take the initiative to develop voluntary crisis protocols to coordinate rapid, collective and cross-border responses in the online environment. This may be the case, for example, where online platforms are abused for the rapid dissemination of illegal content or disinformation, or where there is a need for the rapid dissemination of reliable information.

In turn, according to Article 37 of the DSA, providers of very large online platforms and very large online search engines are obliged, at their own expense, to undergo independent audits at least once a year to assess their compliance with the obligations set out, inter alia, in point 7 above.


10. The DSA and the strengthened Code of Practice on Disinformation.

In September 2018, the European Commission approved the Code of Practice on Disinformation. Adherence to the Code is voluntary, and its signatories include the largest Internet companies as well as industry organizations.


The amended, so-called strengthened Code of Practice on Disinformation was signed and presented on 16 June 2022 by 34 signatories.


The activities carried out within the framework of the Code will complement the implementation of the requirements set out in the DSA.


Adherence to the Code may also constitute proof of compliance with the obligations imposed by the DSA on VLOPs and VLOSEs.


The Code may also become a code of conduct under Article 45 of the DSA for the very large online platforms and very large online search engines that have signed the commitments and apply the measures contained in the Code.


Have more questions? Contact the author!

Xawery Konarski

Attorney-at-law, Senior Partner, Co-Managing Partner

[email protected]