Facebook founder Mark Zuckerberg doesn't care about the criticism of his social network. He made that clear once again last week: more than 900 companies and self-employed people have announced, as part of the Stop Hate For Profit initiative, that they no longer want to advertise on the platform. Mainly because Facebook does so little about the hate that is widespread there: because, for example, it left up calls for violence against the Black Lives Matter movement, and because it classified the far-right website Breitbart as a trustworthy source despite its ties to white nationalists.

Among the signatories are large US companies such as Coca-Cola, Starbucks and Verizon, but also DAX companies such as Henkel, SAP and Volkswagen. Even though these companies command huge advertising budgets, Zuckerberg seems unfazed. He said internally, "My guess is that all of these advertisers will be back on the platform soon." For his company, the boycott is more of a reputational problem than a financial one. A verbal middle finger to the boycott.

Facebook can afford this defiance. The advertising boycott is meant to hit the social network where it hurts most: its source of income. This point of attack did seem to make investors nervous at first: when Unilever, one of the world's largest consumer goods companies, announced on June 26 that it would support the initiative, the share price fell sharply and Facebook's market value dropped by 56 billion dollars. But the shock lasted only a few days. The share price has long since recovered and is back at its pre-boycott level.

The boycott could have a similarly minor impact on revenue, because a boycott only works if many participate. And 900 companies may sound like a lot. But, first, the list of boycotting advertisers has long included not only large companies with correspondingly large advertising budgets, but also a yoga studio from Bridgeport, Connecticut, and a law firm from Houston, Texas. Second, 900 companies are a negligible number for Facebook, which claims to work with more than seven million advertisers. If a few drop out, there are plenty of other companies willing to pay for the ad space instead.

A minimum of cosmetic effort is required

Of course, Facebook has reacted a little anyway; after all, it's not just about money, but also about image. The social network has blocked the accounts, pages and groups of an extreme right-wing network in the USA. If Facebook classifies a post as newsworthy but at the same time sees it as violating its own rules against hate, it will in future attach a warning label to it. And it wants to rank original sources, i.e. the current news reports on which other media base their coverage, higher in the news feed. But these remain cosmetic interventions, like so many of Facebook's measures in recent years, because they all treat the symptoms of the problems instead of tackling them at the root. Disinformation? Facebook is currently fighting it by working with fact-checkers worldwide and flagging problematic websites. Hate on the net? Users can report it, but it often stays up anyway.

Facebook would have to intervene much earlier. And one wonders what has to happen before it finally does so.

To count on that, however, one has to assume that Facebook can change at all. But what if that isn't possible? What if Facebook, with its attention and advertising logic, is beyond repair? Journalist Chris O'Brien poses these interesting questions in the US tech magazine VentureBeat. Facebook's problems are not simply the result of hesitant leadership, even if that has made the situation worse, he writes. Between April and September 2019 alone, Facebook deleted 3.2 billion fake accounts, more than the service's 2.4 billion monthly active users. Still, it feels as if nothing has changed. The problem lies in the "nature of the beast" itself, writes O'Brien.

Facebook doesn't just depict, Facebook weights

Facebook is not the cause of the problems; "Facebook holds up a mirror to society," wrote Nick Clegg, formerly the UK's Deputy Prime Minister and now Facebook's Vice President of Global Affairs and Communications, in a blog post. Everything that is good, bad or ugly is expressed by users on Facebook, Instagram or WhatsApp. There is, of course, a kernel of truth in this statement: all the anger and hatred, the quarrels and lies that we so readily blame on the particular way people communicate on social networks appear almost everywhere people gather. Around the regulars' table, nuanced voices always have a harder time than loudmouths with a firm opinion, whether analog or digital. Family members can spread false information over coffee just as well as online. And women, people of color and trans people often experience hate and threats in everyday life.

But what Clegg omits: the internet changes how many people individual voices can reach. Through social networks like Facebook, hatred, disinformation and manipulation can multiply. How many likes something gets and how often we see a particular contribution influences how much approval we think it enjoys, and thus how important we perceive it to be. A group that in analog life could quickly be identified as a few scattered cranks can, through excessive sharing and posting, appear online to represent a relevant social camp. This fundamental phenomenon can be found on all major platforms. On YouTube. On Twitter. On Twitch. And also on many smaller ones.

Facebook, however, bears a special responsibility, because the company has a strong influence on how many people receive news. The social network does not simply depict what is happening anyway; it weights. If a post gets a lot of likes and comments, it is very likely to be washed prominently into other users' news feeds. If a person interacts with the content of a news source, there is an increased chance that content from that source will be displayed prominently again and again, which can at some point distort their perception of reality. Facebook decides exactly what is shown to whom and what might interest whom. It is not a mirror of society; it is a mirror of our interests. Or rather: a mirror of the interests that Facebook ascribes to us in order to keep us on its pages as long as possible, so that we are shown as many ads as possible and as much money as possible can be earned from our attention.
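To make the logic described above concrete, here is a deliberately simplified sketch in Python of engagement-weighted feed ranking. It is not Facebook's actual algorithm; the Post structure, the weights and the affinity bonus are hypothetical and only illustrate how likes, comments and prior interaction with a source can compound into visibility.

```python
from dataclasses import dataclass

@dataclass
class Post:
    source: str    # name of the page or news outlet
    likes: int
    comments: int
    shares: int

def rank_feed(posts, interaction_history,
              w_like=1.0, w_comment=2.0, w_share=3.0, affinity_bonus=1.5):
    """Order posts by a toy engagement score.

    interaction_history maps a source name to how often the user has
    interacted with that source before; past interaction multiplies the
    score, so familiar sources keep resurfacing (the feedback loop the
    article describes). All weights are invented for illustration.
    """
    def score(post):
        engagement = (w_like * post.likes
                      + w_comment * post.comments
                      + w_share * post.shares)
        # The more often a user has engaged with this source,
        # the more prominently its next post is shown.
        boost = affinity_bonus ** interaction_history.get(post.source, 0)
        return engagement * boost

    return sorted(posts, key=score, reverse=True)

# A post from a source the user already engages with can outrank
# a post with comparable raw engagement from an unfamiliar source.
feed = rank_feed(
    [Post("local_newspaper", likes=120, comments=10, shares=5),
     Post("outrage_page", likes=80, comments=40, shares=20)],
    interaction_history={"outrage_page": 3},
)
print([p.source for p in feed])
```

Even in this toy version, the design choice is visible: the ranking optimizes for predicted engagement rather than for accuracy or diversity, which is exactly why what we see mirrors the interests the system ascribes to us.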