01/08/2019

Challenges of Content Moderation: Making Progress with Citizens

Interview with Serge Abiteboul

Serge Abiteboul
Senior researcher at the Ecole Normale Supérieure de Paris

France and other countries (among them Germany and the United Kingdom) are currently investing efforts in regulating the moderation of harmful content on social media platforms. While such initiatives are often supported by public authorities, they are the source of heated debates regarding their potential impact on freedom of expression. In this interview, Serge Abiteboul, member of the Collège de l'Arcep, researcher at the École normale supérieure and at Inria, and member of the French mission to regulate social networks, emphasizes the importance of involving citizens in this process.

You participated in the mission to regulate social networks, which published its report on May 10th. What key issues related to content moderation does the report highlight?

In my opinion, three key points are worth remembering from the report. First, it is essential that we work to ensure that social networks become welcoming spaces where users feel protected from potential attacks. That being said, it is also crucial that the means used to do so do not impede freedom of expression. One of the main advantages of social networks is that they allow everyone to express themselves, and this invaluable asset must be preserved. Finally, citizens must accept the balance to be struck between civility and respect on the one hand, and freedom of expression on the other, which rules out both self-regulation by platforms alone and regulation by the state alone. In my opinion, platforms and state services should not be the only ones involved in the exercise of content moderation: the judiciary, academics, and more generally all of civil society must also be engaged in these debates.

What did the collaboration between Facebook and the French government to produce this document consist of? 

The main advantage of this collaboration is that it allowed us to understand the problem from within the platform, although this approach also faces limitations. This experience confirmed that it is important for platforms and the state to collaborate if we want to achieve our goals, even if it provides only part of the solution.

In your opinion, how can we best ensure that the decision to remove content from a platform is not interpreted as an attempt at censorship?

We are constantly caught between Scylla and Charybdis: delete too little content and citizens will not be protected, delete too much and it will amount to censorship. We must also be wary of states that would be ready to use this pretext to censor speech that does not suit them. This is already the case in some countries, and Europe must fight against this risk.


Collaboration between platforms and governments, which I have just mentioned, is one way to avoid this. Other measures can be established in order to strengthen such mechanisms and enable their acceptance by civil society, such as the conception of shared rules for moderation. The scenario whereby platforms would be responsible for moderating content under the supervision of a regulator is also an option.

Moderation decisions must then be communicated to those they concern, and it must be possible to challenge them through appeal procedures. The final decision can only be made by the judiciary.

Second, it is important that citizens be educated on these issues, so that they can understand the decisions made and take part in the design of the principles that govern them. Transparency must be a common thread, present from the design of moderation rules to the rendering and explanation of decisions.

It is essential that digital means be mobilized to respond to the difficulties posed by content moderation. The moderation system itself must function as a social network, a space that fosters dialogue. In this respect, the problem of moderation by algorithms, which favor some content over others, must be addressed: we must be able to question them and demand more transparency about how they work. This is obviously complex, since these algorithms lie at the heart of platforms' business models. Algorithmic moderation can be democratized, but only if it is done well, which requires extensive public research in the field, as the detection of hate speech is particularly difficult. I stress the importance of the public nature of this research – we should not delegate this type of investigation to platforms.

Sharing annotated data corpora is also essential to train algorithms and make moderation accessible to smaller companies. Finally, we must not forget to question the model of big platforms: the problems we face today are also due to the excessively dominant, even monopolistic, positions they occupy.
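To make the point about annotated corpora concrete, here is a minimal, hypothetical sketch (not drawn from the interview or the mission's report) of how a small labelled corpus can be used to train a toy text classifier with common open-source tools; the example texts, labels, and model choice are placeholder assumptions, deliberately far simpler than anything a platform would actually deploy.

```python
# Minimal sketch: training a toy classifier on an annotated corpus,
# illustrating why shared labelled data matters for algorithmic moderation.
# The corpus and labels below are hypothetical placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical annotated corpus: 1 = flagged as hateful, 0 = acceptable.
texts = [
    "you are all worthless and should disappear",
    "thanks for sharing, very interesting thread",
    "people like you deserve to be hurt",
    "great photo, congratulations on the award",
]
labels = [1, 0, 1, 0]

# TF-IDF features plus logistic regression: a deliberately simple baseline;
# real moderation systems rely on much larger corpora and stronger models.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["what an interesting discussion"]))  # expected: [0]
```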
