Facebook wants nothing more to do with Voyager Labs.

The social networking giant began legal proceedings on Thursday, January 12, to ban the social network analysis company from using Facebook or Instagram.

The Israeli-founded company's alleged offense?

Having "unduly collected data from Facebook, Instagram and other platforms", assures Meta, the parent company of the social network, in its press release announcing the legal action.

Fake accounts, real data vacuum

Voyager Labs, described as a "data mining and surveillance" company by Facebook, is accused of obtaining the information by creating fake accounts in order to gain access to private Facebook groups and pages.

Such behavior would violate the terms of use of Mark Zuckerberg's social network.

The information gleaned in this way is then run through Voyager Labs' in-house algorithms to obtain "a better understanding of the interactions and relationships between individuals", according to the company's official website.

A polished way of describing the company's core business: to be a Madame Irma 2.0 – a digital fortune teller – of human behavior.

It was this predictive aspect that first brought Voyager Labs under scrutiny, in 2021.

The British daily The Guardian presented it as the embodiment of a new "tech business ecosystem" selling AI-powered promises to police forces in search of tools to better prevent or predict crime.

The Los Angeles police tested Voyager Labs' "predictive policing" system for several months in 2019. California law enforcement then spent more than a year negotiating a long-term contract, which ultimately fell through due to budget cuts linked to the Covid-19 pandemic.

"Voyager Labs belongs to this family of companies, some of which have become very large and well known - such as Palantir or PredPol - which claim to use the power of algorithms to facilitate the work of the police by analyzing public data available on the Internet", summarizes Griff Ferris, specialist in the use of artificial intelligence in the field of criminal justice for the British NGO Fair Trials.

Hunt "terrorists" or control an epidemic

Voyager Labs proposed taking social network surveillance a step further, according to internal documents obtained by the Brennan Center for Justice, an American legal advocacy organization.

Its sales pitches suggested, among other things, using fake profiles to carry out digital undercover missions and gain access to private spaces on social networks.

Voyager Labs claimed that this hunt for information – identity data, photos, messages, conversations – allowed its algorithms to predict which individuals were most likely to commit a crime or, for example, to turn to terrorism.

In support of its claims, the company notably cited the case of Bahgat Saber, an Egyptian national living in New York who is close to the Muslim Brotherhood and who had called, at the start of the Covid-19 pandemic, for people to "voluntarily contaminate employees of Egyptian embassies and consulates".

By analyzing his web of connections on Facebook and Twitter, Voyager Labs argued that this individual posed a terrorist threat because he knew at least two US government employees who could be influenced.

Voyager Labs also claimed its tools could help better control pandemics such as Covid-19.

With a certain opportunism, the company explained in a 2020 presentation that it had reconstructed the social interactions of Mattia – an Italian identified as "patient zero of the epidemic in Italy" (when he was, at best, patient 1, and even that is uncertain) – in order to understand how the virus spread in the country.

The result: the most likely scenario was that he first infected… his office colleagues.

Guilt by association

Los Angeles law enforcement appears to have used this prediction machine to analyze "500 Facebook user profiles and thousands of messages," says The Guardian.

In documents obtained by the Brennan Center for Justice, a police officer claims the tool identified "a few new people of interest".

"All this still looks a lot like a system of guilt by association or by friendship which is not very concerned with the presumption of innocence", regrets Griff Ferris.

Simply being Facebook friends with the wrong people could be enough to put you in the tool's crosshairs.

"Be careful not to confuse correlation and causation. A link to suspicious individuals, or a history of disturbing messages, does not necessarily mean someone will act," the expert continues.

Voyager Labs' defense is that it is simply updating traditional police work for the digital age: establishing connections that move an investigation forward.

The company merely claims to do so faster, and across more data, thanks to its algorithms.

It is an argument put forward by most companies selling "predictive policing" solutions, notes the Brennan Center for Justice, which has catalogued the police departments that have used such tools.

The concern is that these are "private companies beyond any external control and doing the work of public bodies which, themselves, must be accountable for their methods", recalls Griff Ferris.

Internet users whose online activity is scrutinized in this way therefore do not enjoy the same protections as people subject to a formal police investigation.

Nor does a Facebook friend carry the same weight as an acquaintance in real life.

"In the hyper-connected world in which we operate, having 'friends' on Facebook no longer means much. You really need to have a sufficiently large database and a quality algorithm to succeed in bring up really meaningful connections,” notes Griff Ferris.

Facebook hypocrisy?

This is the problem with the "in-house algorithms" of Voyager Labs, Palantir or others.

They are black boxes whose contents and operation no one really knows, and which, when used by law enforcement, can have significant consequences for individuals' lives.

“We know that this type of tool has already been used to establish whether a suspect should remain in police custody or not, to estimate the length of a sentence, etc.,” underlines Griff Ferris.

But Voyager Labs is not the only one in the dock in this case.

Why did Facebook wait almost two years before cracking down on this company?

"We have the impression that Facebook is mainly acting as a guardian of the temple of data that it no longer wants to share with others," said Griff Ferris.

The specialist points out that the social network itself uses very similar methods to analyze the probable behavior of its users.

The difference is that it does so not on behalf of law enforcement, but for advertisers and brands that want to know which Internet users are most likely to buy their products.

According to the expert, regulators remain largely absent from this field.

Unless regulatory limits are placed on these practices, there is nothing to stop companies like Voyager Labs from offering their services to police forces hungry for more.

NGOs such as Access Now and Fair Trials are campaigning for this issue to be addressed in the AI Act – the new European regulation on artificial intelligence currently under discussion in Brussels – and for an outright ban on "predictive policing".
