• The Senate is due to vote this Tuesday on a bill relating to the Olympic Games that introduces a two-year experiment with algorithmic video protection (VPA).

  • This technique detects abnormal movements in a crowd and alerts the operator in the viewing center, without, however, processing individuals' biometric data, which is prohibited by European regulations.

  • But several voices have been raised against a system that has not yet proven itself and is considered too intrusive, and against an experiment that is not really one.

Smile, you're being analyzed.

With its bill relating to the 2024 Olympic and Paralympic Games, due to be voted on Tuesday in the Senate before being debated in the National Assembly, the government is taking a new step toward a surveillance society.

Via Article 7, the executive intends to legalize, on an experimental basis, the processing of video protection images by artificial intelligence (AI).

Concretely, the images will be analyzed by algorithms capable of detecting abnormal events in a crowd (jostling, a terrorist attack, an abandoned package, etc.) and alerting the operator in the viewing center.
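The actual algorithms are proprietary, but the principle described above can be illustrated with a deliberately naive sketch, entirely hypothetical and unrelated to the real VPA systems: score the amount of motion between consecutive frames and raise an alert for the operator when it jumps above a threshold.

```python
# Naive anomaly-detection sketch (illustrative only, not the actual VPA):
# flag frames whose inter-frame motion exceeds a fixed threshold.
import numpy as np

def motion_score(prev: np.ndarray, cur: np.ndarray) -> float:
    """Mean absolute pixel difference between two consecutive frames."""
    return float(np.abs(cur.astype(int) - prev.astype(int)).mean())

def flag_anomalies(frames: list, threshold: float) -> list:
    """Return the indices of frames whose motion score exceeds the threshold."""
    return [
        i for i in range(1, len(frames))
        if motion_score(frames[i - 1], frames[i]) > threshold
    ]

# Synthetic footage: mostly static 8x8 grayscale frames,
# with an abrupt change at frame 5 standing in for a "crowd movement".
frames = [np.full((8, 8), 100, dtype=np.uint8) for _ in range(10)]
frames[5] = np.full((8, 8), 200, dtype=np.uint8)

print(flag_anomalies(frames, threshold=50.0))  # → [5, 6]
```

Real systems rely on far more sophisticated models, but the debate in the article (who sets the "defined criteria", and what counts as abnormal) maps directly onto the hand-picked threshold in this toy version.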

However, the experiment will not be limited to the Olympic Games; it will also extend to "sporting events" more broadly.

For the government, the use of AI is necessary because there are simply far too many cameras to watch: “Live viewing of all the images captured by the video protection cameras is materially impossible”.

And it cites the example of the Paris police headquarters, which "relied, in 2020, on 3,762 cameras belonging to the police headquarters and 37,800 cameras belonging to third-party authorities", i.e. more than 40,000 lenses.

For the executive, AI "would [also] save precious time", especially in the event of a terrorist act.

Moreover, the impact study is categorical: “Only the use of algorithmic processing is likely to signal this risky situation in real time”.

“It is absolutely disproportionate”

This Article 7 has provoked an outcry from La Quadrature du Net, an association that "defends fundamental freedoms in the digital environment".

“It is absolutely disproportionate, and it is a very dangerous technology in public space,” denounces Alouette, who follows the issue for La Quadrature.

According to her, algorithmic video surveillance (VPA) presents several problems.

The first stems from the imperfections of AI.

“Technology is not neutral: it is developed by humans and incorporates their biases,” she believes.

It targets those who spend the most time on the street, often the poorest.

“We are moving towards a codification of public space,” warns Thomas Dossus, an EELV senator who notably tabled, in vain, an amendment to delete this famous Article 7.

To limit bias, "we must avoid having a team of coders who are all 26 years old, all Caucasian, from the same school and of the same gender, so that there is diversity in the representations", says Adrien Basdevant, a lawyer specializing in digital law.

However, “in the text, there is nothing to regulate these biases”, laments Thomas Dossus.

The law's impact study specifies that the VPA “[will] detect, using defined criteria, abnormal events”.

But who will define these “criteria” and this abnormality, and how?

Everything remains completely vague.

Surprising timing

La Quadrature du Net also denounces the “enormous economic stakes behind this experiment”.

Clearly, the Olympics will serve as a showcase for the French champions of video protection and will allow their algorithms to feed on data and become more effective.

“It's a market opportunity,” analyzes Thomas Dossus.

“This was also reaffirmed by Gérald Darmanin in the Senate. What's more, it is covered by industrial secrecy, so it is impossible to know how the algorithms are calibrated.”

"This experiment is, in part, a response to the lobby of private hardware (camera) and software companies," argues Guillaume Gormand, a researcher associated with Cerdap² at the IEP of Grenoble and a specialist in video protection.

“We have the right not to be hostile to national interests,” counters Dominique Legrand, president of the national video protection association, which is both a lobby and a think tank. “And yes, the Olympics are a showcase, but for all companies.”

However, it is difficult to determine whether the experiment is being conducted for the benefit of public decision-making, by measuring the effectiveness of VPA, or for the benefit of those who develop these software solutions.

Thus the Ministry of the Interior, contacted by 20 Minutes, indicates both that this experiment makes it possible to “evaluate the operational interest” and to “refine the device technically and operationally”.

Meanwhile, a parliamentary mission led by MPs Philippe Gosselin (LR) and Philippe Latombe (MoDem) aims, according to the latter, “to see what, in the current legal framework, requires adjustments”.

For example, should algorithmic video protection be integrated?

“If the report is adopted, it could turn into law by the end of the year,” calculates the deputy.

That is a year and a half before the end of the experiment, scheduled for June 30, 2025, which is precisely supposed to make it possible to “assess the relevance of the device”!

No one to evaluate

“We knew that the Olympics excuse was a decoy,” says Thomas Dossus, hardly surprised.

“You don't run this kind of experiment in order not to perpetuate it.

Either it goes well and we'll be told it's thanks to the AI, or it goes badly, and we'll just be told that the AI needs to improve.”

“We will have trained agents, developed software, we cannot go back,” regrets Alouette.

What's more, Guillaume Gormand has “big doubts about the state's ability to produce an evaluation process”.

As early as 2011, the Court of Auditors asked the state to produce an evaluation of video protection, and this has still not been done.

In 2020, the Court thus recalled that “the scale of the sums committed for more than ten years indeed requires an objective assessment of the effectiveness of video protection”.

Likewise, in a December 2021 summary, it wrote that "the lack of evaluation of the effectiveness and efficiency of the video protection plan of the Paris police headquarters has persisted since 2010".

The camera goddess

And that is perhaps the most incredible part.

While France is covered with cameras - at least 75,000 in 2018 outside Paris and the inner suburbs - we still do not know if they are useful.

Guillaume Gormand conducted a study, at the request of the gendarmerie, on four municipal territories in the Grenoble region to assess their effectiveness.

Here is what he found:

• Video protection does not act as a deterrent, nor does it reassure people: it is the communication campaign around the installation of the cameras that produces this effect, and it fades over time.

• It plays only an anecdotal role in solving everyday crimes, in about 1% of cases, but it is more useful in emblematic cases.

• Finally, with a good chain of command, it can help secure and direct interventions on the ground, “a bit like the radio”.

These findings are corroborated by the Court of Auditors, which writes that “no overall correlation has been noted between the existence of video protection devices and the level of crime committed on the public highway, or even clearance rates”.

This leads the researcher to say that, on this subject, “we are dealing with an almost religious belief” and that the absence of evaluations makes it possible not to “shatter this technological miracle”.

“Video protection is the answer when you have no answer,” summarizes Guillaume Gormand.

"Wherever we don't have cameras, we install them"

"No tool is beyond criticism," tempers Dominique Legrand, who brushes aside the 1% clearance figure and "calls for a real study".

As proof that it works, according to him, "wherever we don't have cameras, we install them".

He also recalls that it is thanks to video protection – among other things – that the police were able to arrest Nordahl Lelandais, in the Maëlys case.

For its part, the SNCF, via its Innovation department, has been experimenting for some time with AI in the real-time processing of video protection images.

“The tests are very conclusive and interesting,” indicates the rail security department. “In the case of an abandoned package, we can quickly find the owner and get traffic moving again.”

“There is a problem of scale,” denounces Thomas Dossus.

To guard against abandoned luggage, you have to chip away at a whole set of rights and freedoms.”

Above all, AI is not (yet) perfectly reliable.

"There was an experiment in England a few years ago, and it turned out that there were many more false positives and false negatives than true positives," specifies the Grenoble researcher.

"Detecting when an individual pulls out a weapon is something AI does not know how to do; there are too many false positives," adds Philippe Latombe.
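For readers unfamiliar with the vocabulary, the imbalance the two experts describe can be made concrete with two standard metrics, precision and recall, computed here on purely hypothetical alert counts (none of these numbers come from the trials mentioned):

```python
# Hypothetical counts for illustration only: 10 correct alerts (true positives),
# 90 false alarms (false positives), 40 missed events (false negatives).
def precision(true_pos: int, false_pos: int) -> float:
    """Share of raised alerts that were genuine detections."""
    return true_pos / (true_pos + false_pos)

def recall(true_pos: int, false_neg: int) -> float:
    """Share of genuine events that actually triggered an alert."""
    return true_pos / (true_pos + false_neg)

print(precision(10, 90))  # → 0.1: only one alert in ten is genuine
print(recall(10, 40))     # → 0.2: four events in five go undetected
```

With numbers like these, an operator in a viewing center would spend most of their time dismissing false alarms while most real incidents slip through, which is exactly the operational concern being raised.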

The risk of a slippery slope

Above all, some are worried that this is only the first step towards real-time processing of biometric data and therefore facial recognition.

"It's exactly the same technology, which is why the slide is so easy," says Alouette.

“This processing does not use any biometric identification system […] and does not implement any facial recognition technique,” promises the bill.

“As in China, do we want cameras that can recognize Uighurs? Absolutely not,” assures Dominique Legrand.

Moreover, “real-time biometrics is prohibited in Europe”, recalls Adrien Basdevant.

But the SNCF, for its part, would not be opposed to venturing into this field.

“Our results are limited by the non-use of biometric data,” the security department analyzes. “The bill will greatly restrict our action, but we are already taking what is available.”

Above all, “it leads to conditioning,” regrets Adrien Basdevant.

“You start with transport, you gradually move on to schools, then you drift to places of residence and work, and then you have real-time monitoring.”

This is why he calls for "a social and political dialogue on the choice of tools".

“Just because a technology exists, must we necessarily use it?

Is it proportionate to the purpose for which it would be deployed?

What are the safeguards?” he wonders.


Video protection or video surveillance?

From a legal point of view, "video surveillance is practiced in a private setting while video protection concerns public places," says Adrien Basdevant, a lawyer specializing in digital technology.

And depending on which qualification applies, separate legal regimes follow.

Using one term more than another is also a way of emphasizing one's position vis-à-vis video protection.

La Quadrature du Net only uses the term video surveillance, which has a more negative connotation, whereas the public authorities favor “video protection”.
