
Digital Services Act: With Great Power Comes Great Responsibility

Arno Amabile
Engineer, Corps des mines
Théophile Lenoir
Fellow - Misinformation and Digital Policy
Basile Thodoroff
Engineer, Corps des mines

At the beginning of June, the European Commission opened a public consultation to collect contributions from civil society on the Digital Services Act, the European text aimed at holding digital intermediaries accountable. Institut Montaigne responded to the consultation and, in this series of articles, presents the major issues of platform accountability, drawing on the work carried out by several of its working groups in recent years. This introductory article presents the difficulties that platforms raise, and the risks they induce for European societies.

If there was still any need to prove the importance of digital platforms in all aspects of our societies, the lockdowns took care of it. Constrained by physical distancing, we connected to these platforms to work, order our groceries, keep in touch with our loved ones, and keep up to date with the state of the pandemic and new regulations. This period was a time of forced experimentation both for some people - who had hitherto kept well away from digital intermediaries - and for some sectors - which had hitherto seemed impervious to digitalization.

Each sector is indeed seeing the emergence of digital intermediaries that disrupt the established players, while yesterday's bad behaviors and fraudulent actors are replaced by their modern-day avatars. It is therefore not surprising that these intermediaries are at the heart of most major economic and political issues: competition and small business, urban mobility, disinformation and populism, bias and discrimination, the status of platform workers...

The European Digital Services Act, a major project of the new Von der Leyen Commission, will attempt, by the end of the year, to better regulate their activities and create a new framework of responsibility to ensure the protection of European citizens and users. This initiative is both necessary and delicate, as the stakes are so high. Both inaction and ill-advised action would have significant costs.

The numerous challenges of digital intermediaries

The difficulties posed by digital intermediaries are numerous. First, they are competitive in nature, stemming from the platforms' very nature as economic players. They connect buyers and producers in many markets and acquire dominant positions, whether in trade and logistics (Amazon, Shopify), information (Google News, Twitter), transportation (Uber), tourism (Booking), entertainment (TikTok, YouTube, Twitch), or advertising (Google, Facebook). Network effects mean that there is often room for only a limited number of intermediaries. The winner takes over the vast majority of the market, and sets its own rules.

Second, digital intermediaries are creating new categories of activity that still fall outside the scope of the current legal framework, and therefore escape its rules. Such is the case with workers on "task-based" platforms (Kaptn, Deliveroo, etc.), for instance. They are considered independent contractors and therefore enjoy much weaker social protection than regular employees. And yet, these platforms have considerable power to get drivers or delivery riders to work during a certain time slot or in a certain area. How should this relationship between the platform and its workers be framed?

In another area, we are still struggling to decide whether social networks should be considered publishers (like a newspaper, a TV channel, or a radio station) and therefore be responsible for the content published by their users, or simple hosts. We have connected everyone, including those with ill intentions. The content that circulates online today has problematic effects on our individual daily lives (cyberbullying, especially among young people) and on a collective level (when democratic processes are disrupted). To what extent should actors such as Facebook and YouTube be held accountable for disinformation or hateful content published on their platforms and recommended by their algorithms? What concrete measures can they put in place to limit the circulation of dangerous content without hindering freedom of communication and economic freedom?

One thing in common


Why try to write a single text covering Amazon in e-commerce, Uber in transportation, and Facebook in the information market? What is the common ground between these platforms and the challenges they pose? As we will see in this series, these intermediaries create challenges of a distinctly digital nature. They generate valuable, sometimes intimate data (on usage, purchases, behavior), which they use to better understand their users and influence the decisions those users make (reading one article rather than another, working at a specific time, buying a particular product, etc.). Digital intermediaries and their systems therefore play an important role in each of these sectors.

The common problem is that of the transparency of the systems they put in place to recommend services, products or content on the one hand, and to identify illegal or dangerous services, products or content on the other. Since the decisions they make regarding such content affect consumers, citizens or workers, they must be subject to external scrutiny. Public authorities, but also civil society (associations for the protection of rights, individuals, consumers, etc.) must therefore be able to participate in establishing rules to frame these systems and avoid abuses. But how can these systems be opened up? How can we set up democratic and legitimate decision-making processes, especially concerning grey areas such as hateful content or disinformation? These are the questions that the European Commission is working to answer in the context of the Digital Services Act.

Note that the summer of 2020 proved a particularly dense concentration of all these difficulties for the platforms. In the United Kingdom, the algorithm used to estimate grades for A-levels, the final exams for high school students, was accused of favoring students from private schools. In Europe and the United States, misinformation about the ongoing pandemic is putting public acceptance of a future vaccine at risk, while the popularity of TikTok among young people has raised fears of growing Chinese influence. In the face of all these difficulties, even the CEOs of Facebook, Microsoft and Google are calling for regulation - which, of course, they hope will be favorable to their business models.

In the coming days, Institut Montaigne will publish a series of articles to highlight the contributions of several of its studies to the debate on the liability of digital intermediaries. The first concerns the cross-cutting issue of the supervision of algorithms and their possible biases, based on the report Algorithms: Please Mind the Bias! The second addresses the issues related to self-employed workers on platforms, based on the conclusions of the report Travailleurs des plateformes: liberté oui, protection aussi, published in French. Finally, the third article deals with the issues related to content, evoked in the reports French Youth: Online and Exposed and Tackling Disinformation: Going Beyond Content Moderation.
