The platform is accused of promoting content that violates its own rules.
Last September, the Mozilla Foundation launched a browser extension called RegretsReporter. Its goal was to collect reports of “regrettable recommendations” made by the YouTube algorithm, in order to better understand how it works. Users who noticed that a questionable video had been suggested to them were invited to send a report to the researchers.
YouTube claims to have made progress on this issue
Ten months later, the results of this study have arrived, and they are not very flattering for the platform. A total of 37,380 people downloaded the extension, and 3,362 regrets were shared with the researchers. This data was then sifted through by 41 research assistants, who found that 71% of the regrets reported by participants came from recommendations made by the YouTube algorithm.
As our colleagues at ZDNet noted, in half of the cases these videos had no real connection to the user's previous viewing, like the user who was recommended a far-right channel while watching wilderness survival content. Likewise, some of the recommended videos violated YouTube's own policies. Many have since been deleted, but they had previously been viewed by millions of people.
The researchers acknowledge that the results of this study have limitations. For one, nothing prevents malicious participants from trying to skew the data by deliberately seeking out certain objectionable videos. Likewise, the 37,380 participants are not necessarily representative of the broader population. Still, the authors hope the findings will prompt YouTube to be more transparent in the future about how its recommendation system works.
For its part, YouTube has reacted to the study. Quoted by TheNextWeb, a spokesperson said: “We are constantly working to improve the experience on YouTube and, in the last year alone, we have launched over 30 different changes to reduce recommendations of harmful content. Thanks to this change, consumption of borderline content that comes from our recommendations is now well below 1%.”
The platform also says it is considering bringing in external researchers to study its systems. YouTube would likewise like to examine the validity of the Mozilla study, but says it was unable to obtain all of the underlying data.