In technology policy circles, everyone is looking for ways to govern the Internet. Over the past few years, governments have advanced proposals to make the Internet safer and more inclusive. Since 2016, one of the major regulatory goals has been holding big social media companies accountable for their practices, especially concerning content moderation.
While the UK is publishing the follow-ups to its much-discussed White Paper on Online Harms, and Facebook and the European Commission are debating the right level of accountability for private actors, Germany is proposing a number of regulations across different policy fields. Three times over the past three years, it has put forward ambitious ideas to regulate Internet giants, in some cases inspiring similar measures in France.
The NetzDG: a law to deal with illegal content
The first was the NetzDG, the "Network Enforcement Act", which entered into force in January 2018. The NetzDG was the first European regulation to force platforms to delete "obviously illegal content" within 24 hours of being notified, to establish notification systems, to publish transparency reports on deletions, and to designate persons accountable for these decisions.
The law has succeeded in creating some oversight of big tech companies, mainly by forcing them to be more open about their content moderation practices. However, the NetzDG has been criticized since its inception for outsourcing legal decisions about content to private companies. In an open letter to the federal government, several civil society organizations recently criticized a proposed NetzDG reform, arguing that it does not alleviate, and in some instances worsens, concerns that the NetzDG lacks important safeguards for free speech.
The NetzDG has been a source of inspiration for French regulators. In November 2018, the French parliament adopted a bill to "combat the manipulation of information", focusing specifically on disinformation during election periods. During the three months before citizens go to the polls, a legal procedure allows a judge to determine, within 48 hours, whether a piece of content is false. More recently, the "online hatred bill" obliges platforms to delete illegal content, establish a notification system, publish transparency reports, and designate accountable company representatives.
The Interstate Media Treaty: a new legal framework for recommendation algorithms
A second instance of Germany's lead on regulating big tech companies is the upcoming reform of the country's media regulatory framework. Media regulation is in the hands of the German federal states (not the federal government in Berlin). The draft "Medienstaatsvertrag" (Interstate Media Treaty), published in December 2019, addresses the challenges posed by sorting and recommendation algorithms at video portals such as YouTube and Netflix and at so-called "media intermediaries" such as Facebook and Google. In short, the draft aims to prevent video platforms from discriminating against content (i.e. making it less visible for no justified reason), gives priority to journalistic content (likely meaning content made by public broadcasters), and stipulates that there must be easy-to-understand information on how the recommendation algorithms work. The transparency and non-discrimination rules also apply to media intermediaries.