After the AI firm Clearview collected billions of images from the public web, including sites such as Instagram, Venmo and LinkedIn, to build a facial recognition tool for law enforcement, many concerns were raised about the norms the company and its tool had broken.

Beyond the privacy implications and the legality of what Clearview did, there have been questions about whether the tool actually works as advertised: could the company really find a particular person's face in a database of billions?

Clearview's tool had been in the hands of law enforcement agencies for years before its accuracy was tested by an impartial third party.

Now, after two rounds of US federal testing over the past month, the tool's accuracy is no longer a major concern.

In results announced Monday, the New York-based company Clearview was among the top 10 of nearly 100 facial recognition vendors in a federal test designed to reveal which tools are best at finding the right face while searching through photos of millions of people.

But Clearview performed less well on another version of the test, which simulates using facial recognition to grant people access to a building, for example by verifying that someone is an employee there.

"We are pleased," said Hwan Tun, CEO of Clearview.

"This reflects our actual use case," he added.

The company also did well last month on a different test, of one-to-one matching, which measures the ability to match two different photos of the same person and simulates the face verification people use to unlock their smartphones.

The positive results "have been a catalyst for the sales team," Ton-That said.

The National Institute of Standards and Technology has been running tests for companies selling facial recognition tools for two decades.

Since those tests began, the report notes, "facial recognition has undergone an industrial revolution, with algorithms increasingly tolerant of blurred and other low-quality images, as well as poorly posed subjects."

In its first appearance in the rankings, Clearview produced impressive results in both one-to-one comparisons and database searches, but the top performers were SenseTime of China and Cubox of South Korea.

In 2019, the US Department of Commerce blacklisted SenseTime, along with 27 other Chinese entities, because its products were implicated in China's campaign against Uighurs and other Muslim minorities.

Axios reported that the company's name was changed to "Beijing SenseTime" to reduce the impact of the blacklisting.

Regardless of its accuracy, questions remain about the legality of Clearview's tool.

Authorities in Canada and Australia have said Clearview broke their laws by not obtaining consent from citizens whose photos are included in the database, and the company is fighting privacy lawsuits in Illinois and Vermont.

© The New York Times Company 2021. Translated by Al Arabiya.