
When society divides and tension rises, that division plays out on social media. Platforms like Facebook are a mirror of society: with more than 3 billion people using Facebook's apps every month, everything that is good, bad and ugly in our societies will find expression on our platform. That places a huge responsibility on Facebook and other social platforms when deciding where to draw the line on what content is acceptable.

Facebook has come under fire in recent weeks over its decision to let President Trump's controversial posts stand, and many people - including companies that advertise on our platform - have questioned our approach to tackling hate speech. I want to be clear: Facebook does not benefit from hate. Billions of people use Facebook and Instagram because they have good experiences there. They don't want to see hateful content, our advertisers don't want to see it, and we don't want to see it. We have no incentive to do anything other than remove it.

More than 100 billion messages are sent through our services every day. That is all of us, talking to each other, sharing our lives, our opinions, our hopes and our experiences. Within all those billions of interactions, only a small fraction contain hate. When we find hateful posts on Facebook and Instagram, we take a zero-tolerance approach and remove them. When content does not qualify as hate speech - or violate our other policies aimed at preventing harm or voter suppression - we err on the side of free expression because, ultimately, the best answer to hurtful, divisive and offensive speech is more speech. Exposing it to sunlight is better than hiding it in the shadows.

Unfortunately, zero tolerance does not mean zero incidents. With so much content posted every day, rooting out hate is like looking for a needle in a haystack. We invest billions of dollars each year in people and technology to keep our platform safe. We have tripled - to more than 35,000 - the number of people working on safety and security at Facebook. We are pioneering artificial intelligence technology to remove hateful content at scale.

We are making real progress. According to a recent European Commission report, Facebook assessed 95.7% of hate speech reports in less than 24 hours, faster than YouTube and Twitter. Last month we reported that we removed almost 90% of the hate speech we detected before anyone reported it - up from 24% a little over two years ago. We took action against 9.6 million pieces of content in the first quarter of 2020, up from 5.7 million in the previous quarter. And 99% of the ISIS and Al Qaeda content we take down is removed before anyone reports it.

We are improving, but we are not complacent. That is why we recently announced new policies and products to make sure everyone can stay safe, stay informed and, ultimately, use their voice where it matters most: at the ballot box. These measures include the largest voter information campaign in US history, with the goal of registering four million voters, as well as policy updates to take strong action against voter suppression and crack down on hate speech. Many of these changes are the direct result of feedback from the civil rights community. We will continue to work with them and with other experts to adjust our policies as new risks emerge.

Of course, focusing on hate speech and other types of harmful content on social media is necessary and understandable, but it's worth remembering that the vast majority of those billions of conversations that do occur are positive.

A good example is what has happened during the coronavirus pandemic. Billions of people used Facebook to stay connected while they were physically apart. Grandparents and grandchildren, brothers and sisters, friends and neighbors. More than that, people also came together to help one another. Thousands upon thousands of local groups were formed, and millions of people organized to help the most vulnerable in their communities. Others gathered to celebrate and support our healthcare workers. And when businesses had to close their doors to the public, Facebook was a lifeline for many of them. More than 160 million businesses use Facebook's free tools to reach their customers, and many used those tools to stay afloat while their doors were closed, saving people's jobs and livelihoods.

Facebook has also helped people get accurate, reliable health information. We directed more than two billion people on Facebook and Instagram to information from the World Health Organization and other public health authorities, and more than 350 million people clicked through to that official information.

And it's worth remembering that when dark and terrible things happen in our society, social media gives people a way to shine a light on them: to show the world what is happening, to unite against hate, and to let millions of people around the world show their solidarity. We have seen it countless times, and we are seeing it right now with the Black Lives Matter movement.

We may never be able to completely prevent hate from showing up on Facebook, but we are constantly improving when it comes to stopping it.

* Nick Clegg is Vice-President of Global Affairs and Communications at Facebook
