28/02/2020

Disinformation: The German Approach and What to Learn From It

By Julian Jaursch, Project Director "Strengthening the Digital Public Sphere | Policy", Stiftung Neue Verantwortung
and Théophile Lenoir, Fellow - Misinformation and Digital Policy

In technology policy circles, everyone is looking for ways to govern the Internet. Over the past few years, several approaches have emerged for governments to play a role in making the Internet safer and more inclusive. One of the major regulatory topics since 2016 has been how to hold big social media companies accountable for their practices, especially content moderation.

While the UK is publishing the follow-ups to its much-discussed White Paper on Online Harms, and Facebook and the European Commission are debating the right level of accountability for private actors, Germany is proposing regulations across several policy fields. Three times over the past three years, it has put forward ambitious ideas to regulate Internet giants, in some cases inspiring similar action in France.

The NetzDG: a law to deal with illegal content

The first was the NetzDG, the "Network Enforcement Act", which entered into force in January 2018. The NetzDG was the first European regulation to force platforms to delete "obviously illegal content" within 24 hours of being notified, to establish notification systems, to publish transparency reports on deletions, and to designate persons accountable for these decisions.

The law has been a success in creating some form of oversight of big tech companies, mainly by forcing them to be more open about their content moderation practices. However, the NetzDG has been criticized since its inception for outsourcing legal decisions about content to private companies. In an open letter to the federal government, several civil society organizations have recently criticized a proposed NetzDG reform. They argue the reform does not alleviate, and in some instances worsens, concerns that the NetzDG lacks important safeguards for protecting free speech.

The NetzDG has been a source of inspiration for French lawmakers. In November 2018, French parliamentarians adopted a bill to "combat the manipulation of information", focusing specifically on disinformation during election periods. During the three months before citizens go to the polls, a legal procedure allows a judge to be asked to rule on whether a piece of content is false, with a decision required within 48 hours. More recently, the "online hatred bill" obliges platforms to delete illegal content, establish a notification system, publish transparency reports and designate accountable company representatives.

The Interstate Media Treaty: a new legal framework for recommendation algorithms

A second instance of Germany’s lead in regulating big tech companies is the upcoming reform of the country’s media regulatory framework. Media regulation is in the hands of the German federal states (not the federal government in Berlin). The draft "Medienstaatsvertrag" (Interstate Media Treaty), published in December 2019, addresses the challenges posed by sorting and recommendation algorithms on video portals such as YouTube and Netflix and on so-called "media intermediaries" such as Facebook and Google. In short, the draft aims to prevent video platforms from discriminating against content (i.e. making it less visible for no justified reason), gives priority to journalistic content (likely meaning content produced by public broadcasters) and stipulates that there must be easy-to-understand information on how recommendation algorithms work. The transparency and non-discrimination rules also apply to media intermediaries.

The interstate treaty is supposed to go into effect in September 2020. Until then, many open questions remain on how exactly the proposed rules will be interpreted and implemented. For instance, it remains unclear what exactly algorithmic transparency means, who it is supposed to be for (the users? the regulators? the media?) and how this is supposed to be overseen. The federal states’ media authorities, which are non-governmental regulatory bodies, will for the first time be confronted with overseeing giant global tech companies, raising coordination and implementation issues.

For its part, France’s law on the audiovisual sector focuses mainly on copyright issues and on ensuring that the content that video platforms such as Netflix produce or present online is diverse (two topics highly debated by industry players in France). It makes little mention of algorithms (it only states, in the introduction, that recommendation algorithms play a role in weakening social cohesion by isolating citizens). Interestingly, it insists on the ability of users to classify content, for instance to distinguish content produced by a platform from other types of content. However, while it asks platforms to do so, it does not include measures to examine exactly how the content is classified, and it does not address the question of how recommendations work.

The Digitization Act: updating Germany’s competition law

A third and final example of German internet regulation is that of the new competition rules proposed by the Federal Ministry for Economic Affairs and Energy. A reform of the long-standing Act against Restraints of Competition ("GWB-Digitalisierungsgesetz") aims at addressing power imbalances in the digital economy, specifically by looking at how large players with strong positions across markets may abuse that role to restrict competition. The law, if passed, would acknowledge that market dominance can be attained when companies hold a lot of (user) data. By becoming a type of gatekeeper (or "intermediary power"), these strong and dominant companies can affect other businesses’ development and overall competition.
The Federal Cartel Office would receive expanded powers to tackle such intermediary power, which is expected to apply to companies such as Facebook, Google and Amazon. For example, the draft proposes that competitors would have easier access rights to data held by companies that dominate a market. The reform could be an important step in further adapting the legal framework to the data-driven business models that underlie some of the most crucial challenges of the digital public sphere.

A holistic vision of transparency and oversight

Although many questions remain concerning the implementation of these regulations, together they represent a way of dealing with tech platforms and their business and content moderation practices across multiple policy fields. The strength of this approach is that it recognizes that the challenge arising from dominant global tech platforms is not only the existence of disinformation, for example, but also that of a privately owned, lightly overseen infrastructure that supports it.

Indeed, from a technology policy standpoint, the real challenge that disinformation raises is not that untruths circulate online, but that it is possible to give these untruths disproportionate substance and turn them into political weapons.

To understand how this system can be gamed in this way, and to find effective solutions, actors other than the platforms need to have access to it. They need access to the pipes through which disinformation flows and an understanding of how the pipe managers deal with it. While it might seem enticing to envision a single non-state, independent oversight body (preferably at the EU level) to inspect the pipes and pipe managers, for now, this is just that: a pipe dream. In the absence of such overarching EU cooperation, different existing regulatory authorities will have to figure out ways to deal with the big tech companies that help shape the information and media environment for billions of people.

A first step towards that is creating transparency at the platforms, which is evident in the German approach: tech companies need to be more open about their content moderation practices (NetzDG) and their algorithms (media regulation), and to provide data access to users and competitors (competition law). If the specifics of what transparency means (and for whom) are established, this could be helpful in dealing with disinformation, making it more difficult to purposefully distort reality with the intent to do harm through coordination and mechanical amplification.

