In late June, the British newspaper Daily Mail published a video on its Facebook page.

The video contained clips showing, among other things, a number of black people.

Facebook users who saw the video in their feed received an automated prompt asking whether they wanted to "continue watching videos with primates". The message led Facebook to investigate the matter and disable the AI-powered recommendation feature behind it.

Facebook apologizes

On Friday, September 3, Facebook apologized, calling the incident an "unacceptable error" and saying it was investigating the feature with the aim of "preventing this from happening again".

Dani Lever, a spokeswoman for Facebook, said in a statement: "We have improved our AI, but we know it is not perfect and we have more work ahead of us. We apologize to anyone who may have seen these offensive recommendations."

Facebook is not alone

Similar incidents have occurred in the past at other tech giants.

In 2015, for example, Google categorized images of black people as "gorillas".

The company apologized at the time and promised to fix the problem.

Facial recognition technology has at times proven inadequate for people with darker skin and can have trouble identifying them correctly.

Last year, for example, a man was wrongfully arrested after a facial recognition program misidentified him as the perpetrator of a crime.

"Resolving racism is not a priority"

Darci Groves resigned this summer after four years at Facebook.

In an interview with The New York Times, Groves said that racism is not a priority for the company, given the mistakes it has made time and time again.

"Facebook cannot keep making mistakes like this and then just say sorry," she said.