Jimmy Gomez is a Democrat from California, a Harvard graduate and a Hispanic lawmaker who serves in the US House of Representatives. But to Amazon's face recognition system, he looks like a potential criminal.

Gomez is one of 28 members of the US House of Representatives whom the software falsely matched with mugshots of people who had been arrested, in a test of the face recognition program conducted last year by the American Civil Liberties Union (ACLU).

Nearly 40 percent of the false matches produced by the Amazon tool, which has been used by police, involved people of color.

The findings reinforce growing concerns among civil liberties groups, lawmakers and even some technology companies that face recognition systems may harm minorities as the technology becomes more widespread.

Some forms of the technology are already used in iPhone and Android phones, and police departments, shops and schools have gradually begun to adopt it. But studies have shown that facial recognition systems have difficulty identifying women and dark-skinned people, which can lead to disastrous false results.

According to a study by researchers at the Massachusetts Institute of Technology's Media Lab, face recognition systems from Microsoft, IBM and Face++ had far more difficulty identifying the gender of dark-skinned women, such as African Americans, than that of white men.

Another study from the same university, published in January, found that Amazon's face recognition technology had more difficulty than Microsoft's or IBM's in identifying the gender of dark-skinned women.

Apple uses face recognition technology to unlock its recent iPhones (Reuters)

Causes of the problem
There are several reasons why face recognition systems may perform worse on minorities and women than on white men. The most notable, according to Clare Garvie of Georgetown's Center on Privacy and Technology, is that the public image collections technologists use to train face recognition systems include far more pictures of white men than of minorities.

For example, if a company trains its system on images from a database of celebrities, those images skew toward white people, because minorities are underrepresented in Hollywood.
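The training-data imbalance described above can be illustrated with a toy simulation (every name, number and distribution here is hypothetical, not drawn from the studies the article cites): a classifier fitted mostly on examples from one group ends up with a decision boundary tuned to that group's statistics, so its error rate on the underrepresented group is much higher.

```python
import random

random.seed(0)

def sample(group, label, n):
    # Hypothetical 1-D "face feature": the two classes (labels 0 and 1)
    # sit around different midpoints in each demographic group.
    base = 0.0 if group == "A" else 1.0   # group B's features are shifted
    return [(random.gauss(base + (2.0 if label else 0.0), 0.6), label)
            for _ in range(n)]

# Imbalanced training set: 900 examples from group A, only 100 from group B.
train = (sample("A", 0, 450) + sample("A", 1, 450) +
         sample("B", 0, 50) + sample("B", 1, 50))

# "Train" a one-parameter classifier: a threshold halfway between the two
# class means. The means are dominated by group A, so the threshold is too.
mean0 = sum(x for x, y in train if y == 0) / 500
mean1 = sum(x for x, y in train if y == 1) / 500
threshold = (mean0 + mean1) / 2

def error_rate(group):
    test = sample(group, 0, 1000) + sample(group, 1, 1000)
    wrong = sum((x > threshold) != bool(y) for x, y in test)
    return wrong / len(test)

err_a, err_b = error_rate("A"), error_rate("B")
print(f"error on well-represented group A: {err_a:.1%}")
print(f"error on underrepresented group B: {err_b:.1%}")
```

Real face recognition systems are vastly more complex, but the mechanism is the same: a model optimized on a skewed sample fits the majority group's statistics and misfires more often on everyone else.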

The engineers at technology companies, who are mostly white men, may also inadvertently design face recognition systems that work better on certain races, according to Garvie.

Then there are the challenges posed by lower color contrast on darker skin, and by women who use make-up to hide wrinkles or wear their hair differently, Garvie said.

Technology companies are working hard to reduce errors in their face recognition systems, but fears persist that such systems could be abused to discriminate against migrants or minorities, partly because people still struggle with bias in their own lives.