"A pretty crazy idea."

This is how Facebook founder Mark Zuckerberg dismissed the accusation, in November 2016, that Facebook could have influenced the outcome of the US presidential election.

A few days earlier, Donald Trump had surprisingly won a very close election against Hillary Clinton.

In the search for explanations, social networks were blamed for delivering the presumably decisive votes to the Republicans.

"Donald Trump won because of Facebook," one could read.

There are still around six weeks until the next US election on November 3rd.

It cannot be ruled out that history will repeat itself.

Contrary to current polls, Donald Trump could be re-elected president.

But there is one difference: the social networks, especially Facebook, are already under observation before the election.

And they are reacting: with new policies and features, they are trying to prevent voters from being manipulated.

The supposedly pretty crazy idea doesn't seem that crazy anymore, even to Mark Zuckerberg.

In fact, the public's perception of Facebook has changed since November four years ago.

While data protection concerns dominated the debate in previous years, after the election the focus increasingly shifted to Facebook's influence on democratic discourse, its role in the spread of hate speech and disinformation, and how the company deals with this responsibility.

Recent revelations such as those of former employee Sophie Zhang, who claims that the network negligently ignored political manipulation for years, heighten concerns that Facebook could be a far more powerful electoral tool than previously thought. But is that concern justified?

How big was Facebook's influence on the 2016 US election really?

That various sides attempted to manipulate voters via social networks in the run-up to the last US presidential election is well documented.

Just three examples: Macedonian teenagers systematically posted false stories on Facebook intended to discredit Hillary Clinton, because the posts generated clicks and brought in substantial advertising revenue.

A Russian campaign ran themed election ads, ostensibly in support of both Republican and Democratic candidates, that were shown to millions of American Facebook users.

And the British company Cambridge Analytica analyzed the data of 87 million users to identify those who might be particularly susceptible to pro-Republican messaging.

Trump's campaign team proudly claimed to have carried out one of the largest and most expensive digital campaigns to date.

However, it is by no means certain that the attempted manipulation was ultimately successful and that it really delivered Donald Trump the decisive votes.

Rather, more recent studies conclude that although many people saw fake news on Facebook, it still accounted for only a small share of news consumption in the run-up to the election four years ago (Nature Human Behaviour: Guess et al., 2020).

Studies of disinformation on Twitter (Grinberg et al., 2019) come to similar results.

And even if voters were shown false news on social networks, some researchers argue, this only rarely led them to actually believe it (PLOS One: Garrett, 2019).

Experts such as the communication scientist Martin Emmer from the Weizenbaum Institute for the Networked Society and the political scientist Thomas Rid believe that the impact of Facebook and of political influence campaigns on social networks is sometimes overestimated.

In other words: The strength of the network lies in connecting like-minded people and providing them with content that confirms their worldview.

But when it comes to convincing people of something new, these services quickly reach their limits.