Therefore, the goal of the T3 Commission’s report was mainly to understand what we are really worried about when we talk about the information crisis, to assess whether there is any evidence that disinformation is spread by external actors, to determine whether disinformation changes the way we do politics, and to examine the role played by the general socio-economic context in these new phenomena. Another of our aims was to gain an in-depth understanding of social media platforms - their structure, the way they operate, their biases - and to provide frameworks for making these organizations more accountable, while recognizing the positive aspects of some of their activities.
What solutions have you identified to ensure that social media platforms become more accountable?
In the report, we advocate for the establishment of an Independent Platform Agency. The idea is for this institution to have access to the data of all the major platforms in order to monitor their moderation activities, and to be able to impose fines when platforms fail to provide data. Its role would thus go beyond merely recommending transparency reports. Social media platforms have very recently claimed that they are in favor of regulation, whereas they had previously resisted it. Yet they themselves do not know what such regulation would look like, and it is important that it not be conceived exclusively on their terms. We therefore need an independent body that works hand in hand with platforms and that has the power, responsibility and expertise required to examine the mechanisms they employ from within. Such an institution, which would also coordinate with regulators and legislators, is what will allow us to build an accountability framework, which will ultimately need to involve sanctions.
Until now, discussions have mainly focused on the ways in which either platforms or governments - or both - can and should moderate content. Yet, if such initiatives do not include civil society, they might be accused of being undemocratic, given the current context of distrust. How can we include citizens in these debates?
It is indeed key to involve citizens further in these discussions, because there is a real risk that people will grow increasingly cynical with regard to digital technologies. I believe one way to do so is to label the way content is prioritized. These processes are still too opaque, and the criteria used for content curation need to be made more transparent.