Disinformation: "It's about creating confusion and chaos"
Misinformation has become more dangerous, says psychologist Stephan Lewandowsky. The world is approaching a post-truth era. The good news: we can arm ourselves against it.
Stephan Lewandowsky is a professor of cognitive psychology and is considered one of the world's leading researchers on disinformation. Ahead of the European elections, we asked him how dangerous disinformation campaigns are and what can be done against them.
ZEIT ONLINE: Ahead of the European elections, there is great fear of disinformation campaigns and election manipulation. You say we live in a post-truth era in which misinformation and disinformation are spreading. Isn't that painting things too black-and-white?
Stephan Lewandowsky: The amount of misinformation is extremely difficult to quantify. We know that its circulation has increased. But it is possible that there is simply more information overall, while the proportion of false information has stayed the same.
Stephan Lewandowsky is Professor of Cognitive Psychology at the University of Bristol. For years he has been researching why myths stick in people's memory and how to debunk them.
But what matters is something else. Anyone dealing with misinformation should ask two questions: Is the amount large enough to drown out true information and change society for the worse? And: Has the nature of misinformation changed, so that the post-truth era has its own fingerprint? I think the answer to both questions is yes.
ZEIT ONLINE: Why?
Lewandowsky: First, because misinformation has long since become effective. A good example is the work of political scientist Kathleen Hall Jamieson, who in her book Cyberwar quantitatively analyzed Russia's influence on the 2016 US presidential election. She concludes that misinformation campaigns mattered for the outcome. And there are good studies from the UK as well, showing that the Brexit Leave campaign benefited significantly from a false figure: the lie that Britain transfers £350 million to Brussels every week.
Second, and I think even more interesting: the nature of misinformation has changed. Let's go back to the year 2003. At that time, the US government under George W. Bush and the UK government under Tony Blair claimed that Iraq possessed weapons of mass destruction, although they knew that such weapons did not exist. It was a massive deception to lead people into war. The whole operation was carefully orchestrated and propagandistic, the effort immense. They were literally in a contest for the truth with the United Nations inspectors, who had found no weapons. But, and this is crucial, Blair and Bush were referring to the same reality as the inspectors.
Today, that sometimes seems different: people who spread nonsense no longer seem to care whether there is a reality to be disputed at all. Voices saying that nobody knows what is true are getting louder and louder. Everything is in the eye of the beholder, it is said. There is an explicit commitment to subjectivity. That is why Trump's former campaign manager Kellyanne Conway spoke of "alternative facts."
ZEIT ONLINE: Misinformation campaigns are often no longer about anchoring particular false facts in people's minds, but about unsettling people: making them feel they can no longer believe anything, in order to destabilize societies.
Lewandowsky: That's right! It is no longer about convincing people of something. It's about creating confusion and chaos. This is different today than it used to be. Russian bots, for example, tweet on both sides of the vaccine dispute in the US, some for it, some against it. They come from the same troll factory in St. Petersburg. The only goal is to deepen divisions in society. And honestly, Donald Trump does the same thing: he lies about things he does not need to lie about in order to get ahead politically. Why would he do that? It brings him no advantage, unless his strategic goal is to undermine the very concept of truth.
ZEIT ONLINE: People are often confronted with facts and misinformation at the same time. What determines which of the two sticks?
Lewandowsky: Unfortunately, we often only know that in hindsight. A rule of thumb: the shorter and simpler a piece of information, and the more emotionally charged it is, whether funny, frightening, or provoking outrage, the more likely it is to stick.