Digital Services Act: A Tool For Censorship?

Many have claimed that the Digital Services Act, which entered into full effect on 17 February 2024, is a Regulation geared towards censorship: under the pretext of tackling illegal content and disinformation on the internet, the DSA may lead to the removal of online content based on the opinions it expresses. A debate on censorship is never to be taken lightly, particularly when new Regulations affect fundamental rights such as freedom of expression. In this article, we will look at some of the concerns raised about the Digital Services Act and censorship, and at how the Regulation seeks to tackle illegal content and disinformation online while protecting freedom of expression.

Freedom of expression: a fundamental right of the EU

Freedom of expression is enshrined in Article 11 of the Charter of Fundamental Rights of the European Union. Therefore, should we conclude that the Digital Services Act enables censorship and restricts freedom of expression, the Regulation would be in direct conflict with one of the fundamental rights of the European Union.

Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.

Article 11 - Charter of Fundamental Rights of the EU

What online content can be removed according to the DSA?

To assess whether the DSA truly enables censorship practices, we should start by checking in which cases the Regulation allows the removal of information published online.

The Digital Services Act enables the removal of:

  1. content against the terms and conditions of the platform;
  2. illegal content.

Content incompatible with terms and conditions

As to the first case, a private entity, such as a social media company, is free to have a moderation policy governing what can and cannot be published on its platform. While this may result in censorship, which may not be good publicity for enterprises whose mission is letting people communicate, it is nothing new: the practice of taking down content deemed contrary to the terms of use is well known to social media platforms. However, this has little to do with the Digital Services Act, since the decision on what to ban is taken entirely by the platforms.

Illegal content

Content is illegal when it breaks the law: the Digital Services Act itself does not define when content is illegal. What makes content illegal is its conflict with other EU or national legislation. However, sanctioning the removal of content through a generic reference to a norm the content allegedly violates may still look very much like censorship.

In other words, labelling content as illegal through a loose reference to a norm would be just an expedient to censor it, and the DSA may provide a legal basis for that. It will therefore be important to see how the DSA is implemented and in which cases content is considered illegal and taken down.

Digital Services Act and censorship: what the critics say

What is disinformation?

One of the arguments in the debate on the Digital Services Act and censorship concerns the term disinformation: one of the stated objectives of the DSA is indeed to "tackle disinformation". Critics of the Regulation point out that the term disinformation is too vague. Given its ambiguity, any unwanted piece of content could be labelled as disinformation and removed, which would amount to nothing less than censorship.

A piece of information is either true or false, and content is illegal when it breaches a norm; but how is disinformation defined, and when does it justify the removal of content under the Digital Services Act?

It may be worth noting that the term disinformation is used frequently in the recitals, that is, in the introduction of the DSA, but not even once in the articles of the Regulation. Therefore, the impact of the term disinformation on the implementation of the DSA may not be as critical as one might think. That may sound optimistic, but from a technical standpoint it seems unlikely that a piece of content could be removed from the internet purely because it is deemed disinformation.

However, what caused particular alarm is the reference made to disinformation by Recital n° 84, which concerns the assessment of systemic risks:

When assessing the systemic risks identified in this Regulation, those providers should also focus on the information which is not illegal, but contributes to the systemic risks identified in this Regulation. Such providers should therefore pay particular attention to how their services are used to disseminate or amplify misleading or deceptive content, including disinformation.

Recital n° 84

Regardless of the impact recitals may have on the implementation of the Regulation, it is frankly concerning that it starts off by declaring the intention to target content which is not illegal, based on loosely defined features such as "deceptive" and "misleading", or again "disinformation". Not the most promising introduction.

Crisis response mechanism and content moderation

One of the aspects of the Digital Services Act that raised concerns about freedom of expression and censorship is the impact of the crisis response mechanism on content moderation. Article 36 provides that in situations of crisis the Commission may require providers of very large online platforms or of very large online search engines to take certain actions, including measures drawn from Article 35, such as

adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified

Article 35(1)(c)

What many find concerning is the power conferred on the Commission to compel online platforms to implement specific content moderation measures: this would enable the European Commission to introduce whatever new restriction it finds appropriate, given the situation of crisis, and therefore new constraints on freedom of expression. That seems a legitimate concern.

Safeguards against censorship in the Digital Services Act

As we have seen, in some cases the Digital Services Act enables the removal of content from the internet: this has led many to see the DSA as a tool for censorship, and we found that some of the concerns are not unreasonable. However, we should note that the DSA also contains measures aimed at protecting freedom of expression. We will look at them now.

Terms and conditions: information on restrictions

The terms and conditions of online platforms must contain information on any restriction they may impose on users.

That information shall include information on any policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review

Article 14(1)

The DSA also requires that:

Providers of intermediary services shall act in a diligent, objective and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter.

Article 14(4)

We shall see how online platforms comply with this, and how precise they will be required to be in drafting their terms and conditions.
A generic description of what content may be subject to restrictions would frustrate the purpose of the norm, which is to let users know in advance the boundaries of what can be published; a wider and less predictable set of cases in which content may be taken down would mean more room for censorship and unjustified compression of freedom of expression.

Content moderation

The Digital Services Act introduces obligations for online platforms engaging in content moderation.

Statement of reasons: article 17 of DSA

For restrictions such as:

  • restrictions of the visibility of specific items of information provided by the recipient of the service, including the removal of content, disabling access to it, or demoting it;
  • suspension or termination of the user's account: the platform stops providing services to the user;
  • suspension or restriction of monetary payments: demonetisation;

online platforms must provide a statement of reasons explaining the grounds for the restriction, and specifically:

  • for alleged illegal content, a reference to the legal ground relied on and explanations as to why the information is considered illegal content;
  • if the information is deemed incompatible with the terms and conditions of the platform, a reference to the contractual ground relied on and explanations as to why the information is considered incompatible.

Digital Services Act and shadow banning

We are pleased to note that, by mentioning restrictions of the visibility of specific items of information provided by the recipient of the service, the Digital Services Act requires platforms to give explanations for shadow banning too: a restriction that does not remove content, but hides it from other users.

What makes shadow banning particularly unfair is that the user is not aware of it: the content is not removed, and the author can still see it online, while other users cannot. It would be a good thing if the DSA put an end to such a sneaky practice.
Time will tell.


Right to compensation: article 54 of DSA

The Digital Services Act establishes a right to compensation for users who suffer damages due to an infringement of the Regulation by the provider of the platform.

Recipients of the service shall have the right to seek, in accordance with Union and national law, compensation from providers of intermediary services, in respect of any damage or loss suffered due to an infringement by those providers of their obligations under this Regulation.

Article 54

This too seems good news, as it would allow users to hold online platforms accountable for unlawful restrictions. Consider the following case: on a video-sharing platform, a video is demonetised, and its author does not receive the money he or she would have earned based on the number of views. The demonetisation of a video is a loss for the user, and an easy one to quantify, since the profit is usually linked to the number of views. Therefore, should the demonetisation turn out to be illegitimate, that is, unjustified according to the DSA, the user may claim damages from the provider of the video-sharing platform.


Conclusions

  • The DSA tries to strike a balance between tackling illegal content online, fighting disinformation (whatever that means), and protecting freedom of expression. Not an easy task.
  • Although the Digital Services Act does affect freedom of expression online, it also provides measures enabling users to hold online platforms accountable, or at least obtain explanations, whenever they impose restrictions.
  • Therefore, in order to preserve freedom of speech on the internet, it will be crucial that users are aware of their rights, and specifically of those the Digital Services Act provides them with against unjustified restrictions on their online activities.
  • It remains to be seen how effective these safeguards against censorship will be; for that, we will have to wait for the practical implementation of the DSA over the coming months.

About the author

Vincenzo Lalli

Founder of Avvocloud.

Avvocloud is an Italian network of lawyers passionate about law, innovation and technology.
Feel free to reach out for any info: send a message.

Thanks for reading!

Creative Commons License






Support Avvocloud

Our mission is to promote innovation in law: if you like our project, you may consider a small donation.

Donate with Paypal