Paris (AFP)

The YouTube algorithm, regularly accused of promoting harmful content, relies more on the keywords attached to a video than on its popularity when recommending it, according to a study by the CSA, France's audiovisual regulator, presented on Tuesday.

YouTube, owned by Google, has remained secretive about the settings of its constantly evolving algorithm, never revealing exactly what role the various metrics (time spent on a video, session length, number of views or "likes") play in its recommendation engine.

The platform, which claims a billion hours of viewing per day, does not publish any figures on the share of views coming from recommendations, as opposed to videos users search for directly.

In the study, the CSA examines the "diversity of points of view" offered in the algorithm's selections. The Council, which is expected to soon gain the power to sanction platforms as part of France's audiovisual reform, repeated an experiment already conducted by YouTubers, researchers and media outlets, focusing on the opinions conveyed by the videos.

Forty-two test users played videos on 23 "divisive" topics such as bullfighting, secularism, street harassment, veganism and the death of Michael Jackson. The 10 videos successively recommended by YouTube were then analyzed.

As a result, in its first recommendations, "the algorithm seems to give less weight to the number of views, the publication date or the number of reactions than to the keywords associated with the starting theme of the videos," the CSA explains. The activity of communities mobilized around certain social issues (particularly in the comments) also seems to play "a central role" in these videos' visibility.

Moreover, more than a third of the recommended videos "express the same point of view as the original video," at the risk of producing an "echo chamber" effect, the CSA says. Across the recommendation chain, 44% of the videos offered by autoplay take the same stance as the preceding video.

The experiment also brought to light a "pivot": the third video proposed by autoplay, after which "the algorithm seems to move definitively away from the starting theme". The algorithm then recommends "increasingly popular videos (by number of views) and increasingly recent ones," says the CSA.

© 2019 AFP