
The Noise Around Disinformation

ARTICLES - 13 October 2021

On Monday October 11, 2021, representatives of social media platforms met with the French State Secretary in charge of Digital Affairs, Cédric O, and the Secretary General for National Defense and Security, Stéphane Bouillon, to present their work on foreign interference. This meeting took place as the new French government agency, Viginum, is set to monitor foreign information operations targeting France. This presents an opportunity to reflect on what France has learned since the 2017 election and the "Macron Leaks" operation, as well as what could be expected in the coming months.

The state of disinformation in 2017

In 2017, when France held its last presidential election, the French government and other Western nations were ill-prepared to deal with online political interference. The 2018 French law on the manipulation of information did not exist yet, and the agency that was in charge of digital threats, the National Agency for the Security of Information Systems (Anssi), looked at cyber attacks, but did not investigate their links with the messages and rhetoric that circulated online. Moreover, the media regulator, the Audiovisual Council (CSA), had no authority over social media companies.

Some candidates, wary that ill-intentioned actors might try to intervene in the French election, took action at the individual level. Emmanuel Macron and his team included false emails in their communication channels to make it harder to differentiate true from false messages in the case of a hack, as reported by Jean-Baptiste Jeangène Vilmer in an Atlantic Council and French Institute for Strategic Research (Irsem) paper. This was cited as one of the reasons why the Macron Leaks were not as damaging as they could have been.

The state of disinformation in 2021

Four and a half years later, the situation has changed significantly. Disinformation campaigns, both international and domestic, have become an increasing source of concern, alongside misinformation (the former, contrary to the latter, being shared with the explicit aim of deceiving the public).
Numerous countries participate in information operations. These include the Philippines, Iran, and China, but also France, which defended its military intervention in the Central African Republic as a response to an information operation targeting it. The 2021 edition of the Oxford Internet Institute's annual report on state-backed information operations found that 81 countries took part in strategies to influence opinion online.

New tactics have emerged. The 2020 American election, for instance, differed from 2016 in two respects. First, analysts observed a growing number of fake accounts using profile pictures created with Generative Adversarial Network (GAN) technologies. These were used by foreign actors such as Iran, China, and Russia. Second, domestic actors such as content creators or journalists were increasingly found to be involved in foreign information operations, sometimes without their knowledge, for example by being hired to cover a story for a foreign organization. Similar techniques were reported by Irsem in its September 2021 report on Chinese operations.

Questioning disinformation

Alongside the growing digital means to interfere in elections, some are questioning the disinterestedness of the field of disinformation studies on the one hand, and the very impact of disinformation on the other. 

In the US, Joseph Bernstein, Senior Reporter at BuzzFeed News and Fellow at the Nieman Foundation, notes that the American institutions most invested in fighting disinformation are politically liberal, which suggests that some have more to gain than others in the defense of truth. And indeed, in 2017, two considerable blows to the liberal international order, namely Brexit and the election of Donald Trump, had taken Western nations by surprise. In both cases, disinformation was singled out as a potential explanation for the sudden change in opinion. To make such decisions, voters (in both countries) surely must have had their facts wrong.

Since 2017, it has become increasingly apparent that truth can be weaponized. This was the case with the debates around the definition of fake news in the 2018 French law, with Facebook's argument that it does not want to become the arbiter of truth, and with falsehoods linked to Covid-19 being reinterpreted by political groups online: truth is hard to define, harder to agree upon, and always serves someone's interest. This makes the fight against disinformation more difficult, since it prevents government agencies and civil society organizations from framing it as an indisputable and fully legitimate fight for truth.

Have we made too big a deal out of disinformation? 

This question is even more interesting given that some argue there is little evidence suggesting that content (be it advertising or news) is capable of dictating how consumers or citizens behave. The paradox is that most of us are entirely convinced that content has the power to tell us what to buy or who to vote for, but scientists have not yet determined what impact content truly has on how opinions are formed. 

Evidence shows that advertising might not have the effects it claims to have. For example, scientific papers have demonstrated that the effectiveness of advertising at eBay was overestimated by up to 4,100%. Similarly, a group of scientists found that, in half of the fifteen experiments analyzed, the increase in sales from online advertising was overestimated by a factor of three or more. These papers help put into perspective the impact of disinformation on voter attitudes.
In his book The Hype Machine, Sinan Aral also notes that most studies fail to demonstrate a link between exposure to social media and causal changes in voting behavior. So why do we assume that people presented with disinformation lack the ability to form their own judgment, to the point that they will end up believing something that they previously did not? It could be that disinformation is an easy way to explain some of society’s most complex problems (for example, tensions over immigration or distrust in representative democracy). 

The task ahead 

In France, a new agency, Viginum, is tasked with monitoring disinformation campaigns in the run-up to the April 2022 election. Viginum will help the French government raise awareness of and react to online threats. This is a most welcome improvement to the French government's ability to understand foreign online threats. But grasping the implications of the questions raised above is paramount.

On the one hand, overestimating the impact of disinformation (by arguing, for example, that Russian bots are the reason why French citizens do not trust the government) shifts attention away from the underlying tensions within our societies (concerns over immigration and security, distrust in government, all of which predate social media). On the other hand, underestimating disinformation carries the risk of remaining inactive, on the premise that it has no real impact. It would be just as incorrect to assert that, say, the Russian Internet Research Agency is the reason why Trump was elected, as to deny the role information operations play in geopolitics.

Sinan Aral clearly shows how the Russian government suppressed pro-Ukrainian voices and created pro-Russian content during the weeks surrounding the annexation of Crimea in 2014. Similarly, there is no doubt that Russia very much intended to influence the result of the 2016 American election. More recently, the French Irsem report on Chinese operations shows the extent to which China has gone out of its way to influence foreign politics online. Whether these actions were successful or not, contemporary states have deployed significant efforts to manipulate opinion. In this regard, France must up its game and improve its ability to monitor online activity.

Whilst building stronger capabilities to monitor online threats, Viginum will also need to better document how foreign actors exploit domestic fears in their operations. "Disinformation" often encompasses a wide range of content that includes so-called "fake news," conspiracy theories, and factually incorrect information, but also rumors or partisan information. It can be tempting to label a piece of political content as fake simply because it does not align with one's worldview. Addressing disinformation in 2022 will require both stopping the spread of foreign information that deliberately aims to deceive, and curbing hasty conclusions in public debates that portray disinformation as the cause of social tensions.
Institut Montaigne
© Institut Montaigne 2017