
Challenges of Content Moderation: Addressing the Knowledge Gap

Interview with Charlie Beckett

INTERVIEW - 11 July 2019

France and other countries (among them Germany and the United Kingdom) are currently investing effort in regulating the moderation of harmful content on social media platforms. While such initiatives are often supported by public authorities, they are the source of heated debate over their potential impact on freedom of expression. In this interview, Charlie Beckett, Professor of Media and Communications at the London School of Economics (LSE) and Lead Commissioner for the LSE Truth, Trust & Technology Commission (T3), advocates for the creation of an agency with the independence and expertise required to monitor platforms' moderation of content.

What are the key takeaways from the T3 Commission's report Tackling the Information Crisis: A Policy Framework for Media System Resilience, in particular with regard to content moderation and its regulation?

First, it is worth noting that several key issues led to the writing of this report. One of them was the growing lack of trust in the mainstream media, fuelled by a general sense of confusion, due in particular to the ever-increasing volume of information in circulation.

Other questions concerned the role of social media platforms after the Cambridge Analytica scandal: could advertising be deployed transparently, without invading users' privacy? What is the function of social media platforms during elections? The role of the T3 Commission was thus to take a step back and identify evidence describing the potential harms caused by our new information system, in order to help dissipate this general sense of confusion, but also to avoid the dangers of overreaction. For instance, the urge to moderate content runs up against the importance of protecting freedom of speech. There is always a risk that certain voices will be silenced, when it is essential that everyone be able to take part in public discussion, especially in times of political change.

Therefore, the goal of the T3 Commission's report was mainly to understand what we are really worried about when we talk about the information crisis, to assess whether there is any evidence that disinformation is spread by external actors, to determine whether disinformation changes the way we do politics, and to examine the role played by the general socio-economic context in these new phenomena. Another of our aims was to gain an in-depth understanding of social media platforms - their structure, the way they operate, their biases - and to provide frameworks that hold these organizations more accountable, while simultaneously recognizing the positive aspects of some of their activities.

What solutions have you identified to ensure that social media platforms become more accountable?

In the report, we advocate for the establishment of an Independent Platform Agency. The idea is for this institution to have access to the data of all the major platforms in order to monitor their moderation activities, and to be able to impose fines when platforms fail to provide data. Its role would thus go beyond merely recommending transparency reports. Social media platforms have very recently claimed to be in favor of regulation, having previously resisted it. Yet they themselves do not know what regulation would look like, and it is important that it not be conceived exclusively on their terms. We therefore need an independent body that would work hand in hand with platforms, and that would have the power, responsibility, and expertise required to examine from within the mechanisms they employ. Such an institution, which would also coordinate with regulators and legislators, is what will allow us to build an accountability framework, one that will ultimately need to involve sanctions.

Until now, discussions have mainly focused on ways in which either platforms or governments - or both - can and should moderate content. Yet, if they do not include civil society, such initiatives might be accused of being undemocratic, given the current context of distrust. How can we include citizens in these debates?

It is indeed key to further involve citizens in these discussions, because there is a real risk that people will grow increasingly cynical toward digital technologies. I believe one way to do so is to label the way content is prioritized. These processes are still too opaque, and the criteria used for content curation need to be made more transparent.

Of course, making curation and moderation algorithms more transparent could allow malicious users to circumvent content moderation, and this non-negligible risk must not be forgotten. Yet we currently do not have the luxury of finding perfect solutions to the problems we face, as almost every action taken will inevitably have some downsides.

The key thing for now is to gain some oversight over platforms' moderation activities. The idea of embedding democratic criteria within content moderation algorithms is also often brought up, yet determining who decides what these criteria are might be very tricky. Indeed, there is a great plurality of values, and it might be problematic to try to encode only a subset of them in content moderation algorithms.

Overall, this issue deserves far more attention. Everyone is currently talking about AI and the digital economy, but there is a real knowledge gap when it comes to content moderation, and we need institutions and universities to be more engaged on this topic. No organization is currently dedicated specifically to this matter, so it is crucial that we build more expertise before we start making significant policy decisions.

Institut Montaigne
59, rue la Boétie 75008 Paris

© Institut Montaigne 2017