Paris (AFP)

An African-American man wrongfully arrested because of facial recognition software: the case has revived the debate over bias in artificial intelligence, amid a broad mobilization against racism and police violence.

The case dates back to early January: Robert Williams was arrested in Detroit and spent 30 hours in detention because software had matched the photo on his driver's license with surveillance-camera footage of a watch thief. Wrongly.

For the American Civil Liberties Union (ACLU), which filed a complaint on his behalf on June 24, "although this is the first known case, he is probably not the first person to have been wrongfully arrested and questioned on the basis of false facial recognition."

For Joy Buolamwini, founder of the activist group Algorithmic Justice League, this case is indicative "of how systemic racism can be encoded and reflected in artificial intelligence (AI)."

Under pressure from associations like the powerful ACLU, Microsoft, Amazon and IBM announced in early June that they would restrict the use of their face analysis tools by the police.

AI relies on machine learning from data supplied by its designers, which the system then analyzes. If that data is biased, the results are skewed.

A study by the Massachusetts Institute of Technology published in February 2018 revealed large disparities across population groups in the leading facial recognition programs it examined, with error rates of less than 1% for white men but as high as 35% for black women.

- Thermometer and gun -

In a tweet that went viral, Nicolas Kayser-Bril of the NGO Algorithm Watch showed that when presented with images of people holding a forehead thermometer, Google's "Vision" image analysis program identified "binoculars" in a white-skinned hand but "a gun" in a black-skinned hand.

According to him, this bias was "probably due to the fact that the images of black people used in the database were more often associated with violence than those of white people."

Google acknowledged to Algorithm Watch that the result was "unacceptable."

However, software of this kind is legion, sold to companies and government agencies around the world, and not only by the big names in tech.

"This makes it very difficult to identify the conditions under which the data set was collected, the characteristics of these images, the way in which the algorithm was formed," said Seda Gürses, researcher at the university of technology. from Delft in the Netherlands.

This multiplicity of actors reduces costs, but the resulting complexity blurs the tracing and allocation of responsibility, according to the researcher.

"A racist policeman can be trained or replaced, while in the case of an algorithm", decisions in companies are determined by this algorithm, which obeys above all economic criteria, according to Ms. Gürses.

This also applies to programs claiming to predict criminal behavior.

Witness the recent controversy over software boasting that it could "predict with 80% accuracy whether a person is a criminal based solely on a photo of their face."

More than 2,000 people, including many scientists, signed a petition asking the publisher Springer Nature not to publish an article devoted to this technology, which was developed and defended in the article by several professors at Harrisburg University in Pennsylvania.

But the publisher, contacted by AFP, said the article had "never been accepted for publication."

"The problem is not so much the algorithm as that of the researchers' presuppositions, comments Nicolas Kayser-Bril, who rejects the" purely technological "approach." Even with excellent quality data sets, there is n come to nothing if you don't take into account all the social issues behind it. For that you have to work with sociologists, at a minimum. "

"You can't change the history of racism and sexism," says Mutale Nkonde, artificial intelligence researcher at Stanford and Harvard universities. But you can prevent the algorithm from becoming the ultimate decision maker. "

© 2020 AFP