Should Social Media Platforms Be Held Accountable for Disinformation?

Published August 26, 2024

EDYN Features are short articles where our members share their thoughts on key issues like youth engagement, disinformation, democracy, EU enlargement, and conflict resolution. The goal is to spark conversations, highlight different perspectives, and help us all think more deeply about what’s happening in the world.

Want to be a part of the series? Write a message to [email protected] and share your ideas!


Written by Pavel Havlíček
Member of EDYN Czech Republic
Research Fellow at the Association for International Affairs
Member of TOP 09 party

One of the big discussions of today is how best to combat disinformation and misinformation in the information space, and which tools to use to limit their spread, since a quickly developing information ecosystem oversaturated with content often causes (moral) panic about being manipulated.

A significant part of the debate is connected to social media platforms, since they now serve as an essential space for social interaction and exchange of views among their users; public discussion has largely moved online to domains like Facebook, Twitter, Instagram, YouTube and TikTok, among other applications.

While these platforms have benefitted from the attention of billions of users and have commercialised their online presence and products, including through advertising, they have overwhelmingly underestimated the security threats, foreign influence operations (often conducted by bots and trolls), and radicalisation spreading in their ecosystems, even though these clearly violate their own community rules.

This became most visible when the European Commission introduced its Code of Practice on Disinformation in 2018, which was supposed to motivate the platforms to invest in the integrity of their own processes and in the protection of users and their rights. However, this was based on the principle of self-regulation: the rules were to be implemented, and at the same time evaluated, by the social media actors themselves, without the direct involvement of European regulators.

Unsurprisingly, the results were mixed at best and pushed the European Commission to amend its approach and adopt a much stricter type of regulation, the so-called Digital Services Act (DSA), which legally obliges the platforms to implement a new set of rules and to report the results of their activities to the European Commission.

Role of DSA

Based on this previous experience and the lack of engagement by the social media platforms after 2018, the DSA enforces several key principles that should ensure that the spread of disinformation and illegal content online comes back under European control.

The first of them is transparency, which should make users of social media platforms more aware of their rights and opportunities, including when reporting illegal content as well as legal but harmful (disinformation) content online. Transparency should also help open up the “intestines” of the platforms, namely the algorithms that determine who sees what and when, often without users understanding why or how it happens. Finally, the principle of transparency should ensure that the platforms report on high risks and allow for external audits.

Another key issue has been user empowerment: ensuring that it is not the platforms but European rules and norms that prevail over their own interpretation of rules and regulations. With the tables turned and the EU’s will enforced across the common European digital space, the platforms now need to respect users’ right of appeal and, ultimately, the rulings of national courts over their own judgements about what should and should not happen online. In practice this means, for example, that if Donald Trump were “deplatformed” in the EU’s digital space in July 2024, he would have the right to appeal the decision and ask an independent authority, and finally also a court, to decide about his social media presence. This rule now applies to all users, whether individual or commercial.

Finally, the EU’s own Code of Practice on Disinformation, originally adopted in 2018 and amended several times since, became the gold standard for approaching disinformation and other forms of problematic content online. This means, for instance, that the platforms are supposed to deprioritise and demonetise problematic content, in effect going against their original revenue model, which is built on the spread of polarising and thus very often highly trending content.

In many of these instances, the new European approach to the so-called very large online platforms and other online actors is revolutionary and deserves our attention. At the same time, it also shows that the platforms themselves were neither motivated nor keen to do much to tackle disinformation online. Now this is changing, and we should make proper use of all of these new opportunities at the EU, national and individual levels.

