Facebook has its own rules that prohibit hateful content attacking anyone on the basis of skin color, ethnicity, national origin, religious affiliation, sexual orientation, caste, gender, gender identity, illness or disability.

But how well does the company follow its own rules?

Left up for several years

We sent Facebook 26 comments that Swedish courts have ruled to be criminal hate speech. Some of the hateful posts had been left up for more than two years in some of the largest Swedish anti-refugee groups on Facebook.

After SVT informed Facebook, the company writes that 25 of the posts violate the platform's own rules and have now been removed. Many of the hateful posts had never been reported to Facebook. Some had been reported, yet were still left up.

"We have begun work to understand why we did not remove them," Facebook writes in an email response to SVT.

"No moderation"

Näthatsgranskaren (the Net Hate Examiner) is a non-profit association that monitors the social platforms. According to project manager Tomas Åberg, the association has filed some 5,000 reports in recent years.

- It's very easy to come along afterwards and say "oh, we did wrong, we'll remove it". But the tens of thousands of comments still left up, what happens to them?

SVT quickly found other attacks against Muslims, immigrants and dark-skinned people that Facebook had done nothing about.

- They have no moderation whatsoever; they rely entirely on users to report content to them. That is not good enough for a company making many billions of dollars in profit, says Tomas Åberg.

According to Facebook, understanding context is a challenge, for example distinguishing hateful content from posts that expose injustice. The company says it is investing heavily in digital tools to detect malicious content more quickly, even when it has not been reported, and that progress has been made over the past year.

- The talk about this being difficult and complicated is pure bullshit. Now they need to get their act together, says Tomas Åberg.
