Facebook has agreed to pay $52 million to settle a lawsuit filed against it by content moderators who reviewed material posted on the site.

The suit, initially filed by content moderator Selena Scola, dates back to 2018, when Scola claimed she developed permanent post-traumatic stress disorder after having to view thousands of images and videos of extreme violence on Facebook because of the nature of her work.

Outlets such as The Guardian, The Verge, The Washington Post, and others have reported on the conditions in which Facebook's content moderators work.

The moderators, most of whom are employed by third-party contractors working with Facebook, said the job required spending hours watching clips of murders, animal cruelty, sexual abuse, child abuse, and other horrific footage, often with little administrative or psychological support and, in some cases, without clear guidance.

Facebook will pay an initial $1,000 to each of these moderators, while those diagnosed with post-traumatic stress disorder as a result of reviewing such material can apply for compensation of up to approximately $50,000.

"We are very happy that Facebook worked with us to create an unprecedented program to help people do something unimaginable even a few years ago," said Steve Williams, the attorney representing the plaintiffs, in a written statement. .

More than 11,000 current and former content moderators who worked in Arizona, California, Florida, and Texas will receive at least $1,000 each from the settlement.

Employees who have been formally diagnosed with mental health conditions such as PTSD or depression will receive an additional $1,500 per diagnosis to cover treatment costs, up to a maximum of $6,000 per employee. Those with other conditions may be able to obtain additional compensation for damages by providing evidence that their injuries resulted from working on Facebook content.

Zuckerberg has said previously that Facebook aims to automate content moderation (Getty Images)

In its statement, Facebook said it is "grateful to the people who do this important work," adding, "We are committed to providing them with additional support through this settlement and in the future."

The automation solution
Automating this work is Facebook's ultimate goal: with automation, the company would need fewer contractors to examine harmful content, and those contractors would be exposed to less of the most shocking material, reducing the incidence of mental health disorders.

The company's latest Community Standards Enforcement Report indicates that its automated tools have already improved in this area, but there is still a long way to go.

The company said that about 90% of the hate speech that was removed from Facebook in the last quarter was detected automatically before it was reviewed by moderators.

Like other platforms, Facebook now faces rapidly multiplying misinformation about the coronavirus, a particular challenge for a site with more than 2.6 billion users.

Facebook CEO Mark Zuckerberg said last April that the site's fact-checkers had placed about 50 million warning labels on coronavirus-related content, based on 7,500 articles, and that the labels appear to be effective: around 95% of viewers did not click through to content flagged as false.