
Rodrigo Terrasa

Madrid

Updated Saturday, February 3, 2024 - 00:09

Years ago, a group of researchers discovered that YouTube's algorithm was capable of identifying a video of a semi-naked child on the Internet. Soon the platform connected the dots and learned to promote that content to users looking for pedophile material. Using the same equation, Facebook concluded that its users spent more hours on the network the more alarmist, conspiratorial or radical their friends' posts were. So that's what it gave them.

Two decades have passed since the official launch of the social network that Mark Zuckerberg created to connect the planet, and since then almost everything has gotten worse by replicating the same formula that Max Fisher now dissects. Drawing on years of research, the New York Times journalist unravels in The Networks of Chaos (Peninsula) how social networks have learned over the last decade to exploit the psychological fragility of their users, designing algorithms that amplify radicalization, extremism and violence.

The remedy? "Disable the algorithm," he replies.

"It took the tobacco companies half a century and the threat of potentially fatal litigation to admit that their products caused cancer," he recalls in his book. "How likely was it that Silicon Valley would accept that its products could cause riots and even genocide?"

It is now 20 years since the launch of Facebook. Do you remember what your first impression of this social network was?


At first it was just one of many social networks, like MySpace and Orkut. It's hard to remember, but back then social media was very boring. They were mediocre businesses, not very exciting websites and not very significant from a cultural point of view. I think we all assumed Facebook would be like that too. Perhaps the only difference was that it was more oriented towards university students, since at first it was limited to students from certain universities. It wasn't until several years later, when Facebook launched its newsfeed, that it became clear that it would be a new and much more impactful type of experience.


What would you say has been the greatest positive contribution of Facebook and the social networks that came after?


The ability to form a large, disparate community around some shared issue or concern has been extremely significant. For better and for worse. Social movements like MeToo or Black Lives Matter would not have been possible without large social networks. That is a very important contribution. But of course, social media also allows the creation of more harmful communities, such as QAnon or anti-vaccine movements. That in itself is not Silicon Valley's fault, of course, but what we now understand is that companies like Facebook and YouTube designed their social networks to attract people to the most harmful and destructive version of this community-building impulse, because it is more effective at generating engagement and increasing their revenue.


How do networks make us better?


Platforms have become exponentially more effective at displaying content that specifically appeals to you. They do this using very sophisticated algorithms that determine your specific tastes and interests and show you what attracts you most. Anyone who has spent time, for example, browsing YouTube has seen many videos that respond to their interests and that they would not have discovered otherwise. But the problem, again, is that these algorithms have also learned that the best way to capture our attention is to cultivate and activate the darkest and most destructive parts of our nature.

"A social network can be extremely effective in changing your behavior, your sense of right and wrong, even your understanding of the line between truth and lies."

What has been Facebook's worst sin?


It is difficult to define the "worst" sin. The most harmful? The most immoral? Their gravest sin was deliberately designing their platform to exploit our innate psychological needs to get us to spend more time online. When they started doing it in the late 2000s, they told themselves that what they were doing was fine, because getting us to spend more time online could only be beneficial for us. They believed that the internet would literally be the salvation of humanity, and therefore anything they did to make us more connected was good. But it soon became clear that the most effective way to do this was by amplifying our worst instincts: toward hate, division and misinformation. And it became so lucrative for them that company leaders deliberately ignored the consequences, even when their own internal researchers told them that their products were indoctrinating millions of people in racial hatred, medical conspiracies and other dangerous beliefs.


How has this translated into the real world?


Its most immoral sin, I would say, was refusing to bring Facebook under control in Myanmar, when the platform was spreading racist lies and hate speech there on such a large scale that even the United Nations said Facebook was substantially contributing to the genocide. They could have disabled the algorithm whenever they wanted, and they didn't because it would have been bad business.


How have social networks made us worse 20 years later?


Networks train us to unleash some of our most destructive and damaging instincts, and to exaggerate them on a scale that rarely occurs otherwise. Every time you log in, the platform is subtly changing your behavior. When you post something that the platform wants to encourage, the algorithm will push your post to more people so that it gets more likes and more shares. If you post something that the platform wants to discourage, it will hide it from other users so you feel ignored. It does this again and again. Every day. This is a very powerful form of social reward that, research shows, can be extremely effective in changing your behavior, your sense of right and wrong, even your understanding of the line between truth and lies. And we know that the behaviors that social networks most reward are indignation, hatred of an external social group, the division of us against them, and misinformation that satisfies our deepest fears or hatreds.


In your book you talk about the dictatorship of the like, misinformation, echo chambers, extremism, anger, hatred, trolling, paranoia, the impact on democracy, the damage to mental health... Of all the evils caused by social networks, which one has the worst remedy?


The extreme promotion of moral indignation is probably the most consequential, because networks lead us to dramatically exaggerate this instinct. Moral indignation can be healthy and useful in certain circumstances. It's how we, as a society, discourage harmful or antisocial behavior. But social media encourages you to dramatically deepen your sense of moral outrage, to amplify the rage with which you express it, and to broaden the range of people you direct that outrage at.


You claim that Facebook's main problem is Facebook. Was it possible to create a good social network? Was a different Facebook possible?


Of course it was, and it remains possible. If social platforms removed the technological features that maximize engagement, like the algorithms that govern what you see, the like button or the counter that shows the number of shares beneath each post, then the harms of social media would be substantially reduced, while the positive aspects would mostly remain. We know this is possible because this is how social media worked 20 years ago.


What do you think will be the main legacy of social networks?


It's too soon to say. The influence of social networks on our politics and our society is increasing every day, and the platforms are always evolving and changing.


What would be the most urgent measure to deactivate the chaos machine you speak of?


Turn off the algorithms that promote and rank content based on what generates the most engagement. Let us see what our friends publish without interference or manipulation.


The Networks of Chaos

Max Fisher

Peninsula. 544 pages. 21.90 euros.