France and other countries (among them Germany and the United Kingdom) are currently investing efforts in regulating the moderation of harmful content on social media platforms. While such initiatives are often supported by public authorities, they are the source of heated debates regarding their potential impact on freedom of expression. In this interview, Serge Abiteboul, member of the Collège de l'Arcep, researcher at the École normale supérieure and at Inria, and member of the French mission to regulate social networks, emphasizes the importance of involving citizens in this process.
You participated in the mission to regulate social networks, which issued a report published on May 10th. What key issues related to content moderation does the report highlight?
In my opinion, three key points are worth remembering from the report. First, it is essential that we work to ensure that social networks become welcoming spaces where users feel protected from potential attacks. That being said, it is also crucial that the means used to achieve this do not impede freedom of expression. One of the main advantages of social networks is that they allow everyone to express themselves, and this invaluable asset must be preserved. Finally, citizens must accept the balance to be struck between civility and respect on the one hand, and freedom of expression on the other; this rules out both self-regulation by platforms alone and purely state-driven regulation. In my opinion, platforms and state services should not be the only parties involved in content moderation: the judiciary, academics, and more generally all of civil society must also be engaged in these debates.
What did the collaboration between Facebook and the French government to produce this document consist of?
The main advantage of this collaboration is that it allowed us to understand the problem from within the platform, although this approach also has its limitations. The experience confirmed that collaboration between platforms and the state is essential if we want to achieve our goals, even if it provides only part of the solution.
In your opinion, how can we best ensure that the decision to remove content from a platform is not interpreted as an act of censorship?
We are constantly caught between Scylla and Charybdis: delete too little content and citizens will not be protected; delete too much and it amounts to censorship. We must also be wary of states that would be ready to use this pretext to censor speech that does not suit them. This is already happening in some countries, and Europe must fight against this risk.