Since their inception, the Internet and the World Wide Web have been surrounded by hopes and fears. Early utopias celebrated limitless communication and almost unrestricted access to information. Compared with what is available quickly and free of charge via Wikipedia, the content of the great Brockhaus encyclopedia seems meager. Yet the online encyclopedia also demonstrates the downsides of free availability and open participation: some contributions are of dubious quality, others are the subject of constant battles over interpretive authority. The balance looks even worse when one considers other information sources on the Internet that share neither Wikipedia's aspirations nor its correction mechanisms, YouTube being a prominent example.

The Internet offers something for everyone, but that also means: nonsense, misinformation, and a broad spectrum of moral and political positions. Digital communication in particular makes it easier to reach large numbers of people with ideas that are rejected by the mainstream or do not appear there at all. Radical and extremist attitudes are reinforced when algorithms preferentially present information that matches a user's existing interests and predominantly connect like-minded people with one another. Anecdotal evidence points to two mechanisms of radicalization: recommendation systems that offer increasingly extreme content based on user behavior, and links between extreme and moderate content that arise, for example, when people from extremist or conspiracy-theory circles appear on channels with a wider audience. In this way, channels that engage critically with the mainstream, which is mostly perceived as "left," can become gateways to political extremism.

Archetypes of news consumption

A recently published study by American computer and communication scientists examines, for the first time with meaningful and reliable data from the video platform YouTube, whether and how such online radicalization takes place. The study analyzes the browsing behavior of a representative sample of the American population comprising more than 300,000 participants over a four-year period from 2016 to 2019, both on and off the popular platform. During this time, the participants consumed almost ten million different YouTube videos from more than two million channels, around a tenth of them devoted to the news and political content that was the focus of the analysis. More than half a million videos from almost 1,000 channels could be categorized along the political spectrum. Between the classic poles of left and right, an additional category was introduced for "anti-woke" videos: those opposing a milieu that describes itself as "woke" and is sensitive to inequality and identity, and criticizing "political correctness," gender-inclusive language, and left-wing "identity politics."

The analysis shows that even this simple categorization distinguishes six "archetypes" of news consumption quite clearly: news is predominantly obtained from a single category, so that one can assume relatively homogeneous reception communities. Overall, right-wing extremist and anti-"woke" content is consumed relatively seldom compared with politically moderate and apolitical content. However, both categories gained considerably in popularity on YouTube during the period under review. Above all, they seem to exert a strong pull: anyone who watches several videos from these categories in one session subsequently uses the corresponding channels significantly more often.

But what role does the platform's recommendation algorithm play in this? Apparently not a particularly large one. The data show that videos with extremist content are more often reached from outside the platform. Nor is there a trend toward increasingly radical content over the course of a session. The much-maligned algorithms are apparently not responsible for the fact that people prefer information sources that confirm, or further radicalize, their own point of view. Rather, users bring their opinions and interests with them and move through digital space much as they would through a library: purposefully, yet open along the way to chance finds that match their own preferences.