Note
JULY 2020

Information Manipulations Around Covid-19: France Under Attack

Authors
Théophile Lenoir
Fellow - Misinformation and Digital Policy

Théophile Lenoir is an associate researcher at Institut Montaigne. For four years (from 2017 to 2021), he developed Institut Montaigne's work program on digital issues. His interests focus on communication technologies and the transformations of the public sphere. He is notably the co-author, for Institut Montaigne, of the note Information Manipulations Around Covid-19: France Under Attack (July 2020). He also worked with Visiting Fellow Alexandra Pavliuc, a doctoral student at the Oxford Internet Institute and author of the note State-backed Information Manipulation: The French Node (February 2021), and coordinated the drafting of several reports, including Media Polarization "à la française"? Comparing the French and American ecosystems (June 2019).

Théophile is pursuing a PhD at the University of Leeds on the controversies surrounding measurements of digital technology's environmental impact, to better understand what the notion of objectivity covers in public policy. He is a graduate of the London School of Economics and of the USC Annenberg School for Communication and Journalism, where he completed the dual Global Media and Communications program.

Before joining Institut Montaigne, Théophile worked in start-ups at the intersection of media and technology (an analytics tool and a content platform), in London and Los Angeles.

In times of a pandemic, what harm can social media posts actually cause? During Covid-19, it seems they can do quite a lot. From messages arguing that bleach can cure Covid-19 to ones highlighting the weaknesses of the West in dealing with the crisis, some of the information circulating on social media has been problematic for governments and health organizations.

Whilst more effort is needed from platforms to deal with illegal and dangerous information and to reduce amplification mechanisms, public authorities also need to raise their awareness of the interest online communities take in hostile narratives. This study by Institut Montaigne and the Institute for Strategic Dialogue, based on data collected by Linkfluence, shows that influencers gathered around both far-right and far-left themes are interested in anti-Europe narratives, but only the former are also interested in pro-authoritarian narratives. Successful influence messages exploit these interests to circulate within specific communities.

With the help of Linkfluence, a French media monitoring and web analysis startup, 37 messages related to Covid-19 were selected and grouped by theme. Their circulation within six thematic communities from December 1, 2019 to April 30, 2020 was then analyzed.

How we identified messages

Our selection of messages is based on existing studies led by fact-checking organizations as well as research in the field of information manipulation. For each message, keywords were identified and then searched for in online discussions on Twitter and on Facebook’s public pages. The list of messages is non-exhaustive; it seeks to include the main narratives surrounding the Covid-19 crisis.
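The keyword-based approach described above can be sketched as a simple matching pass over a corpus of posts. The themes, keywords, and posts below are invented placeholders for illustration; the actual Linkfluence queries are not public.

```python
# Illustrative sketch of keyword-based message detection.
# THEME_KEYWORDS and the sample posts are hypothetical, not the real queries.
from collections import defaultdict

# Each message theme maps to the keywords identified for it.
THEME_KEYWORDS = {
    "5G": ["5g", "antenne", "ondes"],
    "Fake cures": ["chloroquine", "eau de javel", "remède"],
}

def tag_posts(posts):
    """Group post IDs by the themes whose keywords they mention."""
    hits = defaultdict(list)
    for post_id, text in posts.items():
        lowered = text.lower()
        for theme, keywords in THEME_KEYWORDS.items():
            if any(kw in lowered for kw in keywords):
                hits[theme].append(post_id)
    return dict(hits)

posts = {
    1: "La 5G et ses antennes sont dangereuses",
    2: "La chloroquine est un remède miracle",
    3: "Restez chez vous et lavez-vous les mains",
}
print(tag_posts(posts))  # post 3 matches no theme
```

In practice such keyword lists would need iteration against false positives; the point here is only the structure of the matching step.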

The groups of messages are:

Pro-authoritarian narratives

  • Promoting Russia
  • Russia as a victim
  • Promoting China
  • China as a victim
  • Authoritarian interpretations

Anti-Europe narratives

  • Weaknesses of the European Union
  • The end of the European project
  • Anti-liberal messages

French conspiracy theories

  • Chloroquine - Agnès Buzyn, former Health Minister, and her husband, Yves Lévy, former chair of the French health research center Inserm, plotted to wreck Professor Raoult’s work on chloroquine
  • Chinese bioweapon

Government and business

  • France and the government
  • Pharmaceutical companies

5G

  • 5G - Correlation between 5G deployment and the pandemic

Health

  • Dangerous information
  • Fake cures

Others (found in very small proportions)

  • Xenophobia    
  • Institut Pasteur (a French medical foundation)
  • Responsibility of the United States
  • Anti-elitism, including against Bill Gates
  • Conspiracies against Trump

How we identified discussion groups in which these messages circulate

30,000 online posts and articles in the public domain were found mentioning these messages. Six groups of influencers sharing them were then constructed, based on both explicit information (a declarative statement in the description of a Twitter account) and implicit information (online practices, particularly the sharing of certain types of sources). In other words, Linkfluence would start by pinpointing influencers identifying themselves as "nationalists" or "anti-technology", before pooling together highly visible users sharing posts from these influencers, as well as articles from sources these influencers often quote. In this way, Linkfluence created a list of influencers and highly visible accounts that serve as a sample for the communities below.
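The two-step construction described above, seeding on explicit self-description and then expanding through sharing behavior, can be sketched as follows. All account handles, bios, and sharing data are invented for illustration.

```python
# Hypothetical sketch of the two-step community construction:
# step 1 seeds on explicit self-description in account bios;
# step 2 expands to accounts that relay content from the seeds.
# All data below is invented.

def build_community(accounts, shares, seed_terms):
    """accounts: {handle: bio}; shares: {handle: set of accounts it relays}."""
    # Step 1: explicit -- bios containing a seed term.
    seeds = {h for h, bio in accounts.items()
             if any(t in bio.lower() for t in seed_terms)}
    # Step 2: implicit -- accounts that share content from the seeds.
    expanded = {h for h, shared in shares.items()
                if shared & seeds}
    return seeds | expanded

accounts = {
    "@a": "Patriote et fier de l'être",
    "@b": "Amateur de cuisine",
    "@c": "Actualité et politique",
}
shares = {
    "@b": {"@z"},
    "@c": {"@a"},  # relays posts from the seed account @a
}
community = build_community(accounts, shares, ["patriote", "nationaliste"])
print(sorted(community))  # ['@a', '@c']
```

A real pipeline would also weight accounts by visibility and include the sources they cite, as the study describes; this sketch keeps only the seed-and-expand logic.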

The communities are not exhaustive and do not give a full picture of the French public debate. They were selected because they were the most visible communities relaying information on the messages identified.

The 6 discussion communities are the ones that present themselves as:

  • Nationalists: using keywords such as "patriot", "nationalist", "Rassemblement National" (French far-right party led by Marine Le Pen);
  • Anti-Islam: using keywords such as "anti-Islam", racial slurs, or insults against Islam and Muslims;
  • The Yellow Vests movement (a French protest movement that started in October 2018): using keywords such as "Yellow Vests" or a profile picture referring to the movement;
  • Anti-technology: using keywords such as "anti-technology", "anti-5G", "Linky meters" (which measure and share information on electrical consumption in French households, and were the source of intense debates online);
  • Naturalists: using keywords such as "anti-vaccine", "homeopathy", "natural", or sharing information against processed food products and pharmaceutical companies;
  • Anti-capitalists: using keywords such as "alter-globalization" ("altermondialisme"), "far left", "France Insoumise" (French far-left party led by Jean-Luc Mélenchon), "French Communist Party".

A word of caution

The results below and in our study indicate a community's interest in the debate surrounding a message, not its belief in that message: this study looks at the presence of messages in French online communities. When tracking disinformation pieces, the corpus often also contains articles that debunk them.

Narratives and disinformation do not circulate evenly online

Our mapping shows that the sharing of information and disinformation is always political: information, whether true or false, only circulates in communities that are interested in it. Consequently, players can take advantage of events such as the Covid-19 crisis to draft messages that meet these interests and serve political goals. More than a creator of disinformation, the pandemic has catalyzed information creation, both true and false, on specific topics with political aims.

The influencers that gather around far-right themes are the most sensitive to messages that both promote authoritarian regimes and show Europe's weaknesses. These influencers were also the most interested in most of the pieces of disinformation and conspiracy theories we examined (concerning 5G, or the claim that Agnès Buzyn, former Health Minister of France, and her husband, Yves Lévy, former chair of the national health research center Inserm, plotted to wreck Professor Raoult's work on chloroquine).

The influencers that gather around far-left themes are relatively immune to overt pro-authoritarian narratives, but are more interested in reading about Europe's weaknesses. They are also more drawn to pieces of disinformation and conspiracy theories that concern corporations and include an economic dimension.

Finally, our study shows that the influencers that gather around technology and health issues are relatively impermeable to narratives that do not concern them directly. The messages that interest these influencers are tailored to their themes (for example: non-medical cures, 5G…).

France was relatively immune to international conspiracy theories during the lockdown

Messages such as the ones accusing Bill Gates of having created the coronavirus were largely absent from the discussions we looked at in France, though they were successful in the United States.

Our study suggests that, as of today, language remains a safety net against foreign disinformation. This is coherent with previous findings on the "#Macron leaks" operation, which showed that extremist groups in Russia and the United States posted English content in French discussion groups, hindering its circulation. Overall, despite the European Union bringing to light information manipulation operations coming from China during the Covid-19 crisis, we found fewer occurrences of French messages promoting China than of messages promoting Russia. It is possible that China-related players still mainly operate in English.

Fact checking: a challenge for traditional media

Fake cures or dangerous information regarding the virus (including, for example, the idea that bleach is a remedy) were largely absent from Twitter and Facebook. It could be that this information spread mostly on messaging services such as WhatsApp or Telegram. This may be a particularity of the French context, as ISD has identified significant sharing of fake cures in English on social media, notably the harmful assertion that colloidal silver can help "resolve" coronavirus.

The absence of such messages on social media raises the difficult question of the media's role in covering disinformation. By warning against the circulation of a piece of disinformation, media organizations can play a significant role in making it visible. The 5G misinformation illustrates this: an April 17 interview with French virologist Luc Montagnier by a French media outlet, in which Mr. Montagnier stated that 5G frequencies may have contributed to the spread of the virus, generated active debates online.

In a world of online influence and manipulations, governments, researchers and platforms are part of the solution

Most government initiatives in France and from social media platforms have focused on sharing reliable information on the virus. This is a necessary step, but it is not a sufficient response to foreign interference. In parallel to encouraging platforms to take more action and delete illegal and dangerous content, public authorities also need to raise their awareness of the interest online communities take in hostile narratives.

Three dimensions are therefore crucial to ensuring that democratic societies develop their understanding of the challenges ahead, and learn to live in a world of online influence and manipulations:

  1. Governments need to recognize this challenge as a priority area, by continuing to make public health-related information reliable, transparent and compelling. They should also create new regulations to open up channels of communication between platforms and governments, to incentivize companies to share information, and to require more transparency from platforms, including through audit mechanisms.
     
  2. Researchers need to have access to more data from online platforms to understand the fragilities of public debates and the extent to which they are exploited by foreign actors. The real-time reporting on disinformation and polarization campaigns is essential to helping governments understand the scope of the challenge and to make communication infrastructures a priority in State-level negotiations.
     
  3. Online intermediary platforms need to be responsible for prioritizing authoritative information and sources, deplatforming malign ones, and down-ranking and clearly labelling misinformation. New regulation should lead them to undertake robust and transparent research, and to design privacy-compliant products and moderation systems for information-sharing and communication ecosystems. This should include private groups and messaging apps when they are related to the spread of potentially harmful mis- and disinformation.