The challenges of regulation...
While Facebook did not directly oppose the President (its CEO, Mark Zuckerberg, merely announced a review of the content moderation policies that apply to posts about the use of force by the state), the social network may not emerge unscathed.
Following Twitter's first moves (urging its users to fact-check claims about mail-in voting), the President signed an executive order calling for an investigation into the business model of content platforms, an examination of possible bias in social networks, and a review of Section 230 of the Communications Decency Act of 1996. While the executive order will have little effect on federal law, it could trigger a longer-term review of Section 230, which states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider". As we noted in our article Tackling Disinformation: Going Beyond Content Moderation, this provision effectively shields social networks from liability for the content they host and has withstood most legal challenges.
On Wednesday, June 17, the US Department of Justice released recommendations for reforming Section 230. One of their aims is to "encourage platforms to be more transparent and accountable to their users" by providing more information on the content they delete in "good faith". The Department accordingly calls for a statutory definition of "good faith" to clarify its scope. The recommendations also include removing the platforms' Section 230 protection from civil lawsuits brought by the government and from antitrust liability. While Congress would need to act for any of these recommendations to go further, they reveal the Trump administration's determination to prevent platforms from deleting content without facing consequences.
The question of how to hold platforms accountable for the content they host and promote is not new for social networks. The European Commission has already taken the lead on platform responsibility for content, after countries such as France and Germany drafted and implemented laws requiring platforms to take a more active role in moderating material such as disinformation or illegal content. For example, the 2000 European e-commerce directive, which protects platforms, is undergoing a major overhaul under the Digital Services Act, and a public consultation on the issue was launched on June 2.
This new text aims to create a legal framework for the liability of digital platforms, spanning issues from content dissemination to the employment status of workers such as Uber drivers. What these issues have in common is the transparency of socio-technical systems (a mix of technical tools, for example those using artificial intelligence, and the human decisions that shape them) and their responsibility for the decisions they lead to.