Vitalik Buterin criticizes Europe’s Digital Services Act (DSA)

In a recent post on X, Ethereum co-founder Vitalik Buterin criticized the rhetoric underlying the European Union's Digital Services Act (DSA), a regulation that imposes strict obligations on large digital platforms to limit harmful content such as hate speech, cyberbullying, scams, and dangerous products.

The European Commission, from its official account @DigitalEU, promoted the spirit of the law with an emphatic slogan:

There is no room for cyberbullying. There is no room for dangerous products. There is no room for hate speech. There is no room for scams. YEAH. With the Digital Services Act, what is illegal offline remains illegal online.

This message seeks to communicate that what is illegal outside the internet has no place in the digital environment. The institutional intention is clear: to emphasize the platforms' active responsibility to combat online harms.

Post from the official account of the European Union laying out the "guidelines" of the Digital Services Act. Source: X

However, Vitalik Buterin believes that this approach can lead to an authoritarian impulse that, under the pretext of safeguarding users, ends up unjustifiably restricting the diversity of ideas online. In his reply, the programmer stated that the idea that there should be "no room" for certain expressions reflects "a totalitarian and anti-parliamentary impulse."

According to his reading:

The idea that there should be no ‘room’ for something you don’t like is fundamentally a totalitarian and anti-parliamentary impulse. It is incompatible with being in an environment that you do not completely control.

Vitalik Buterin, co-founder of Ethereum.

His argument centers on pluralism and freedom of expression, maintaining that the complete elimination of content considered bad or dangerous, especially when its definition is subjective, can open the door to centralized control and censorship mechanisms.

Buterin clarifies that this is not about defending digital chaos, but about accepting that in a free society there will always be opinions or content that some find harmful. What is problematic, he maintains, is not the existence of these corners, but the massive amplification of such content by algorithms designed to maximize engagement, something that has characterized networks like X (formerly Twitter).

On this point, the Digital Services Act aims precisely to mitigate that effect by requiring large platforms to offer users feed options not based on algorithmic recommendations (i.e., not personalized), as part of its digital rights approach.

Buterin warns that adopting a "no room" philosophy may take Europe down a "dark" path where regulations, although well-intentioned, become tools to impose a single vision of truth on the digital public space. For him, true protection lies not in suppressing controversial ideas, but in designing platforms and policies that minimize the dominance of toxic content without sacrificing pluralism.

This debate over the balance between online safety and freedom of expression places the DSA at the center of a global regulatory tension: how can users be protected without falling into control mechanisms that restrict the diversity of opinions? Buterin's criticism invites us to rethink the application of laws like the DSA under principles that do not conflict with the fundamental values of a free and open internet.
