Calculating the shortest route on our phones, automatically creating playlists of our favorite songs, or finding the most relevant result on a search engine: algorithms help us every day. But what happens when a recruiting algorithm systematically excludes women or ethnic minorities? How can we make sure such biases are acknowledged and corrected?
To answer these questions, we interviewed forty experts from various sectors, with the aim of offering concrete solutions that limit potential abuses and increase public trust in algorithms.
This report offers a French perspective on algorithmic bias, a topic viewed today largely through an American lens. It builds on Algorithms: Bias, Discrimination and Fairness, a study published in 2019 by Télécom Paris and Fondation Abeona.