Sunday shop openings, the establishment of a health center in Seine-Saint-Denis, administrative background checks on future accredited staff... It is a catch-all law that the National Assembly is set to adopt in a solemn vote on Tuesday, March 28, to prepare for the 2024 Olympics. But Article 7 of the law is particularly contested: it provides, on an experimental basis, for the use of algorithmic video surveillance (VSA) to secure the Games. Human rights groups denounce a dangerous technology.
During the general examination phase, this article of the Olympic bill was adopted by 59 votes in favour (presidential majority, The Republicans, National Rally) to 14 against (New Ecological and Social People's Union). It provides, on an experimental basis, that the security of large-scale "sporting, recreational or cultural events" can be ensured by means of algorithms.
A technology under scrutiny
"Algorithmic video surveillance is a new technology based on computer software capable of analyzing, in real time, the images captured by surveillance cameras," explains Arnaud Touati, a lawyer specializing in digital law. "The algorithms used in the software are based on machine learning, which allows VSA systems to keep improving over time and to adapt to new situations."
Proponents of the technology tout its ability to anticipate crowd movements and to detect abandoned luggage or thrown projectiles. Compared with conventional video surveillance, the analysis is fully automated by algorithms – which, its defenders say, limits human error and lapses of attention.
"As France presents itself as a champion of human rights around the world, its decision to allow AI-assisted mass surveillance during the Olympics will lead to a widespread assault on the right to privacy, the right to protest, and the rights to freedom of assembly and expression," Amnesty International said in a statement after the article was adopted.
France, the future herald of video surveillance in Europe?
Katia Roux, Technology and Human Rights specialist at the NGO, explains the fears this technology crystallizes: "Under international law, such legislation must strictly meet criteria of necessity and proportionality. But the legislator has not demonstrated this," she says. "We are talking about an assessment technology, one that must evaluate behaviors and categorize them as at-risk in order to trigger action afterwards."
"This technology is not legal today. In France, there have been experiments, but never with the legal basis this law proposes to create," she says. "Nor is it legal at the European level. It is in fact part of the ongoing discussions in the European Parliament on the regulation of artificial intelligence systems. The legislation could therefore also violate the European regulation currently being drafted."
[Video: Olympics 2024 – smart cameras, anti-drone lasers, video surveillance: tech makes its Games © FRANCE 24]
"By passing this law, France would become the champion of video surveillance in the EU and set an extremely dangerous precedent. It would send an extremely worrying signal to states that might be tempted to use this technology against their own populations," she warns.
One of the fears is that the algorithm, seemingly cold and infallible, in fact carries discriminatory biases: "These algorithms will be trained on a set of data decided and designed by human beings. They may therefore simply absorb the discriminatory biases of the people who designed and conceived them," notes Katia Roux.
"VSA has already been used for racist purposes, notably by China for the exclusive surveillance of the Uighurs, a Muslim minority in the country," says lawyer Arnaud Touati. "Because ethnic minorities are under-represented in the data fed to the algorithms for their training, there are significant discriminatory and racist biases. According to an MIT study, the facial recognition error rate is 1% for white men but reaches 34% for black women."
Arnaud Touati, however, prefers to see the glass as half full: "The use of VSA at events of such magnitude could also expose the discriminatory, misogynistic and racist biases of the algorithm, by identifying people from minorities as potential suspects at rates too high to be fair," he explains.
Pressed by the left-wing opposition in the National Assembly to clarify which situations would be detected, Interior Minister Gérald Darmanin quipped: "Not [people wearing] hoodies." The French government believes the limits set by the law – no facial recognition, data protection safeguards – will be enough to prevent any abuses.
"We have put safeguards in place so that calls for tenders are reserved for companies that comply with a certain number of rules, including hosting data on national territory and complying with the CNIL and the GDPR [General Data Protection Regulation, Editor's note]," says MoDem MP Philippe Latombe, who co-signed an amendment with the National Rally to give priority to European companies in the call for tenders. "Clearly, we don't want a Chinese company processing the data in China and using it for something else."
"The guarantees provided by the government are not likely to reassure us. In reality, no adjustment can fix it: this technology is, in itself, problematic and dangerous for human rights," says Katia Roux. "It will remain so until there has been a serious evaluation, a demonstration of the necessity and proportionality of its use, and a real debate with the various actors of civil society on the issue."
Sport, a perennial testing ground
While the Olympics are clearly the target event, the experiment can begin as soon as the law is promulgated and will end on December 31, 2024, four months after the close of the Paralympic Games. It could therefore cover a wide range of events, starting with the Rugby World Cup in September-October.
Opponents of VSA fear that its use, exceptional at first, will eventually become widespread. Sporting events are often used as testing grounds for policing, security and new technologies. The 2012 London Olympics, for instance, helped generalize video surveillance in the British capital.
"We are afraid of seeing it become widespread, extending beyond this exceptional period," says Katia Roux, who recalls that after the 2018 football World Cup in Russia, the voice recognition technologies authorized for the event were used to repress the opposition.
Finally, Amnesty International is concerned that, in the long run, video surveillance will drift toward biometric or voice surveillance: "Facial recognition is just a feature waiting to be activated," warns Katia Roux.
The Olympic law has not yet completed its legislative journey. Beyond Tuesday's solemn vote in the National Assembly, the text must still shuttle back to the Senate, which previously approved it in different terms, until the two chambers agree on a common version.
Peter O'Brien of Tech 24 contributed to this report.