
The French and the Rest: The Reach of State-backed Disinformation Networks

BLOG - 31 July 2020

Some of the most powerful examples of state-backed information manipulation we are aware of started with a seed: an individual anecdote or story that grew into a piece of widely accepted public opinion. Growing this seed into a sprout - a story with some traction - and then into a tall and imposing tree - a piece of accepted public opinion - takes dedicated time and strategy. In 1983, a Soviet-backed newspaper published an anonymous letter claiming that AIDS had been developed in an American military lab. Five years later, international outlets republished the story, lending it credibility and authority. A decade later, 15% of Americans believed that "the AIDS virus was created deliberately in a government laboratory." This is just one seed that has taken on a life of its own. With social media today, the ability to plant and grow manipulated information can feel limitless.

Another word for this manipulated information is disinformation. Wikipedia defines disinformation as "false or misleading information that is spread deliberately to deceive" and files the term under its series on war. This editorial decision was likely made because countries around the world are developing their own foreign influence capabilities, and because Russia spent about US$1.25 million per month on spreading disinformation in 2016. It takes time, money, and strategy for state actors to develop their disinformation capabilities. Equal investment and coordination are required from researchers and social media platforms to unravel the intent behind these decades-long operations.

Since 2018, Twitter has been identifying, shutting down, and publicly releasing information on accounts it attributes to state-backed information operations, some of which had been growing on the platform since as early as 2007. The purpose of these data releases is to encourage "open research and investigation of these behaviors from researchers and academics around the world", and the international community of social media and information manipulation researchers has answered the call.


These data releases have been analyzed extensively by a number of researchers, including myself. The Stanford Internet Observatory has analyzed pre-publication versions of the Twitter data releases. They found that Chinese accounts were often created specifically to manipulate information around certain events, particularly the Hong Kong protests and COVID-19.

They also found that recent Russian data releases have focused on "pro-Kremlin, anti-opposition, and anti-Western content". Accounts originating in Turkey were created in large batches and promoted President Erdoğan's 2017 power consolidation, as well as Turkish interventions in Syria, both domestically and internationally. An in-depth look at Egypt found that state actors promoted a hashtag about terror in France and mimicked an authentic French media outlet. This research was conducted through a mixed-methods approach that included temporal network analyses and statistical analysis of tweet metadata.
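
Batch-created accounts of the kind observed in the Turkish releases can in principle be flagged from account metadata alone. The sketch below is purely illustrative - the timestamps, the threshold, and the function name are my own assumptions, not the method used in the research above - but it shows the basic idea: group account-creation dates and flag days with an abnormal number of registrations.

```python
from collections import Counter
from datetime import datetime

# Illustrative account-creation timestamps (invented, not real release data).
creation_times = [
    "2017-03-01", "2017-03-01", "2017-03-01", "2017-03-01",
    "2017-03-02", "2017-05-20", "2017-08-14",
]

def batch_days(timestamps, threshold=3):
    """Flag days on which an unusually large number of accounts
    were registered -- a simple signal of batch creation."""
    per_day = Counter(datetime.strptime(t, "%Y-%m-%d").date() for t in timestamps)
    return [day for day, n in sorted(per_day.items()) if n >= threshold]

print(batch_days(creation_times))  # the 2017-03-01 registration spike stands out
```

On real release data, the threshold would need tuning against the platform's normal registration rhythm, since legitimate sign-up campaigns can also cluster in time.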

When networked timelines of intent on social media are cross-analyzed with timelines of newsworthy events and with bilateral changes in policy and relationships between countries, strategies begin to unfurl. Regarding Russia's Internet Research Agency (IRA), the troll farm found to have attempted to influence the 2016 US election, my research with Dr Charles Kriel for NATO Defence Strategic Communications reverse-engineered the IRA's operation on Twitter through a data science method called temporal network analysis. This method visualizes relationships between entities in a dataset over a period of time. We found evidence within the networks of Twitter data that the IRA had tested multiple strategies to gain traction on social media, including experimentation with coordination (see Figure 1), survival of the fittest (or most-followed) personas, and further polarizing existing divides in the US.



Figure 1: A symmetrical network of Russian IRA accounts coordinating tweets to real people on Twitter (Source: NATO Defence Strategic Communications).


The book Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics argued that the outcome of the 2016 American election was swayed not so much by Russia as by internal dynamics of the US, including the framing and agenda-setting strategies of mainstream media outlets. The authors emphasized the importance of measuring whether state-backed information operations had an impact on their target audiences. Although the datasets released by Twitter give little indication of the impact of the information operations, they do divulge the intent of state actors through the digital crumbs they leave behind.

The aforementioned researchers, among others, have found evidence that state-backed information operations target their own populations as much as international populations, whether promoting domestic or international narratives. The Oxford Internet Institute has published The Global Disinformation Order, an inventory of countries with internal and foreign influence operations. This evidence implies that multiple state actors may seek to influence international populations or narratives, possibly including those of France. Stanford's research on Egyptian information operations found evidence of French targets, but research has not yet been conducted on how the French have been targeted by these operations.


As a Visiting Fellow in International Information Manipulation at Institut Montaigne, I will research whether and how the French public has been targeted by state-backed information operations on Twitter over the last decade, using temporal network analysis. I will work to uncover the intent of the states that have targeted France and to assess their impact on French media narratives. States found to be targeting France will be ideal candidates for future research and policy efforts. The question of whether and how state-backed information operations have targeted France, and whether they have been successful, remains open; only the data holds the answer.





Institut Montaigne
59, rue la Boétie 75008 Paris

© Institut Montaigne 2017