It will be the role of an interim judge to qualify a piece of news as "fake", based on three criteria:
- the fake news must be manifest,
- be disseminated deliberately on a massive scale,
- and lead to a disturbance of the peace or compromise the outcome of an election.
Dealing with grey areas: True or false? Legitimate or illegitimate?
A lot of ink was spilled contesting the project: what does "manifest" mean? Is there really a clear line between fake and true news, and is a judge the right person to draw it? These questions are impossible to ignore, especially since social scientists from a wide array of disciplines, such as Bruno Latour, have long questioned the very notion of objective facts: in this line of thought, facts are not merely "out there" but are constructed by scientists in a human environment shaped by power dynamics. The question of who asserts the objectivity of a fact therefore matters a great deal.
When dealing with content moderation more generally, the point of view from which content is judged "appropriate" is essential. Content moderation means deleting content and is therefore, to some extent, censorship. In the end, the tricky question is: how many people need to agree that a specific piece of content should be censored for that censorship to be legitimate? Take, for example, those who deny the existence of climate change. Few do in France, but in the United States a range of people doubt that the phenomenon is real. Should their argument be banned because a large majority of scientists have proved it wrong? Such debates create deep divides.
Technical complications
In addition to these questions of legitimacy, deleting content raises a technical problem. Today, platforms such as Facebook are investing heavily in artificial-intelligence tools to detect specific types of content (nudity, fake news, hate speech). In the wake of the Christchurch attack, Facebook announced it had invested $7.5 million to improve video and image analysis technology in order to identify duplicates and delete them more quickly. The technology's efficiency varies with the type of content: Nicholas Thompson and Fred Vogelstein report in a Wired article that on Facebook "the success rate for identifying nudity is 96 percent," but it is only 52 percent for hate speech.
Even when the technology accurately categorizes a piece of content ("nude"), it can still make the wrong call. Tarleton Gillespie's book on content moderation, Custodians of the Internet, opens with Nick Ut's famous 1972 photograph, The Terror of War, in which a naked little girl runs from a burning village. The image contains nudity, but should it be deleted? It is difficult for an algorithm to take the picture's historical dimension into account, and such misclassifications will inevitably happen. If platforms are punished for leaving inappropriate content (in this case, nudity) online, some argue they will filter more content than necessary to avoid paying large fines.
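The over-filtering incentive described above can be illustrated with a toy simulation. Everything here is hypothetical (the score distributions, the 10% rate of inappropriate posts, the thresholds); the sketch only shows the trade-off: when penalties fall solely on content left up, a platform is pushed to lower its removal threshold, deleting far more acceptable material as a side effect.

```python
import random

random.seed(0)

def simulate(threshold, n=10_000):
    """Toy moderation run: remove any post whose classifier score
    meets the threshold; count both kinds of error."""
    removed_ok = 0   # false positives: acceptable content removed
    kept_bad = 0     # false negatives: inappropriate content left up
    for _ in range(n):
        inappropriate = random.random() < 0.1  # hypothetical base rate
        # Overlapping score distributions model an imperfect classifier.
        score = random.gauss(0.7 if inappropriate else 0.3, 0.15)
        removed = score >= threshold
        if removed and not inappropriate:
            removed_ok += 1
        if not removed and inappropriate:
            kept_bad += 1
    return removed_ok, kept_bad

# A cautious threshold balances the two error types...
fp_high, fn_high = simulate(threshold=0.5)
# ...but if fines punish only content left up, the platform lowers
# the bar, and removals of acceptable content balloon.
fp_low, fn_low = simulate(threshold=0.35)

print("threshold 0.50:", fp_high, "wrongly removed,", fn_high, "wrongly kept")
print("threshold 0.35:", fp_low, "wrongly removed,", fn_low, "wrongly kept")
```

Lowering the threshold does reduce the fineable errors (inappropriate content kept online), but only by removing many more legitimate posts, which is precisely the over-censorship critics warn about.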
Two examples: initiatives in France and in the United Kingdom
As previously mentioned, the French government has begun to take action to answer these questions and counter the spread and curation of harmful speech online. The bill presented by Laetitia Avia in March, which focuses on hateful speech, would oblige platforms to remove any content inciting hatred within 24 hours of its being posted. It also recommends simplifying the current processes by which platform users can report hateful speech, and emphasizes the need for platforms to be transparent about the processes they themselves employ. Laetitia Avia has also made clear that the bill, which the French Parliament will review in July, will propose establishing a platform's failure to remove hateful content as a criminal offense. Conversely, the deputy noted that excessive censorship by platforms should equally be sanctioned. A regulator is also expected to be appointed to oversee the implementation of this new legislation.