In a decision with far-reaching repercussions, the Federal Supreme Court (STF) concluded on 26 June 2025 the judgement that redefined the civil liability of digital platforms for content generated by third parties, declaring the partial unconstitutionality of art. 19 of the Marco Civil da Internet, the Brazilian Internet Civil Rights Framework (Law no. 12.965/2014). By an 8 to 3 vote, the majority of the justices held that the rule, which conditioned the liability of providers on a specific court order to remove content, had become insufficient to prevent the spread of disinformation and criminal activities online.
The decision, handed down in the context of two extraordinary appeals with recognised general repercussion (RE nº 1.037.396, Theme 987 and RE nº 1.057.258, Theme 533), establishes a new paradigm for content moderation in Brazil, relaxing the need for judicial intervention to remove publications in cases of serious crimes.
Article 19 of the Brazilian Internet Civil Rights Framework stipulated that:
Art. 19. In order to ensure freedom of expression and prevent censorship, the Internet application provider may only be held civilly liable for damages arising from content generated by third parties if, after a specific court order, it fails to take steps to make the content indicated as infringing unavailable within the scope and technical limits of its service and within the indicated period, subject to legal provisions to the contrary.
This rule protected platforms from lawsuits over posts by their users, unless they failed to comply with a court decision. It also aimed to prevent a form of “private censorship”: to avoid litigation, a platform might remove content upon a mere notification from a user or company that felt offended, arbitrarily silencing criticism and taking down legitimate content, and effectively becoming the arbiter of discourse itself.
I. The new interpretation of Article 19 established by the STF
The majority of the justices followed the vote of the president of the Court, Justice Luís Roberto Barroso, who proposed a thesis establishing different liability regimes for the platforms. The decision establishes the so-called “duty of care” for technology companies, which must be proactive in removing illegal content.
The new rule is structured as follows:
a) Duty of care in the event of mass circulation of serious illegal content: this is the most stringent part of the heightened liability established by the STF. For an exhaustive list of offences considered serious, platforms have a proactive duty to remove content even without prior notification; if they fail to act diligently, they may be held civilly liable. This list includes anti-democratic conduct and acts, terrorism, incitement to suicide, hate crimes and discrimination, crimes against women, sexual crimes against the vulnerable (such as child pornography) and human trafficking.
The STF held that the liability of platforms in this regard is conditional on a “systemic failure”, which occurs only when the provider “fails to adopt adequate measures to prevent or remove the illegal content listed above, constituting a violation of the duty to act responsibly, transparently and cautiously”. The Court also held that “the existence of illegal content in an isolated, atomised form is not, in itself, sufficient to give rise to the application of the civil liability of this item”; a single piece of illegal content, in isolation, therefore does not trigger liability. However, the Court maintained the regime of article 21 of the Marco Civil, under which the platform is subsidiarily liable for the unauthorised disclosure of images, videos or other material containing nudity or sexual acts of a private nature when, after notification by the participant or their representative, it fails to make the content unavailable.
b) Presumption of liability in the case of illicit paid content: the Court established a presumption of platform liability for illicit content conveyed through paid advertisements and boosted posts, as well as through artificial distribution networks (chatbots or bots).
The decision established that, in these cases, liability may arise regardless of notification, effectively reversing the burden of proof: the platform is presumed liable from the outset and is exempt only if it proves that it acted “diligently and within a reasonable time” to make the content unavailable.
c) Crimes against honour: for crimes against honour (slander, libel and defamation), the general rule of art. 19 of the Marco Civil continues to apply: the platform will only be liable if it fails to comply with a specific court order to remove the content. Maintaining this requirement seeks to protect freedom of expression and to prevent platforms from excessively removing content that falls into a “grey zone”.
d) Replication of content already deemed illegal: if a piece of content has already been declared illegal by a court decision, platforms will have the duty to remove identical reproductions upon simple notification, without the need for a new court decision for each post.
e) Private communications: the STF upheld the general rule in this case, establishing that “art. 19 of the Marco Civil applies to: (a) email service providers; (b) application providers whose primary purpose is to hold closed meetings by video or voice; (c) instant messaging service providers (also called private messaging service providers), exclusively with regard to interpersonal communications, protected by the secrecy of communications (art. 5, item XII, of the Brazilian Constitution)”.
f) Marketplaces: the decision established that “internet application providers that operate as marketplaces are civilly liable in accordance with the Consumer Defence Code (Law 8.078/90)”.
II. Other duties imposed on providers
Drawing strong inspiration from the European Union’s Digital Services Act (DSA), the STF imposed a series of procedural and governance duties on platforms, such as the obligation to adopt self-regulation with clear moderation rules, publish annual transparency reports, maintain accessible customer service channels and, crucially, appoint and maintain a legal representative in Brazil with full powers to respond judicially and administratively.
III. Consequences of the decision
The STF’s decision has profound and immediate implications for users, platforms and the public debate in Brazil.
Technology companies will be forced to invest heavily in legal teams, moderation technology and compliance structures to adapt to the new, multifaceted liability regime. The decision increases legal uncertainty for the sector, which will now have to interpret extrajudicial notifications and assess the legality of content at the risk of being held liable.
One criticism of the decision is that it does not distinguish between the tech giants (the “big techs”) and smaller companies in the sector, such as start-ups, even though the former have incomparably greater resources to adapt to the new regime. This raises the risk of increased economic concentration in the sector, to the detriment of newer and smaller companies.
Victims of hate speech and other serious crimes will have a quicker route to content removal, without the need to wait for a lengthy judicial process. On the other hand, there are fears that stricter moderation could lead to the undue removal of criticism, satire and legitimate opinions.
The decision also intensifies the debate about the limits of freedom of expression in the digital environment. While some see it as an essential tool to combat disinformation and hate speech, others consider it a step backwards that could lead to censorship and the curtailment of public debate by large technology companies.
The STF modulated the effects of the decision so that it applies only to events occurring after 26 June 2025, the date of the judgement, preserving cases that have already become res judicata.
The rules defined by the STF will shape the digital environment in Brazil until the legislature passes legislation on the subject, such as PL 2630/20 (the “PL das Fake News”, or “Fake News Bill”), which has been under discussion for years without reaching consensus.