In a new report, France's data protection authority, the National Commission on Informatics and Liberty (CNIL), draws red lines around the use of facial recognition in order to protect citizens' privacy. To keep the technology from turning into "Big Brother", the CNIL places particular emphasis on the consent of the people targeted.

How can society avoid sliding toward something between "Black Mirror" and "Big Brother"? On Friday, the National Commission on Informatics and Liberty (CNIL) drew "red lines" not to be crossed in the use of facial recognition, in particular respect for citizens' privacy.

"All is not and will not be allowed in terms of facial recognition," warns a report from the CNIL, which highlights the "potential new surveillance that may involve societal choices." Whether in "1984", George Orwell's novel, or in the British television series "Black Mirror", where a reader implanted in the eye allows access to the internet and his personal data, facial recognition has long been associated with a restriction of freedoms.

Transparency and right of withdrawal

While it does not oppose the use of facial recognition in principle, the French personal data watchdog sets out several precise requirements to frame any experimentation. First, "draw boundaries" before any use, even experimental, in order to define what is politically and socially "desirable" and what is technologically and financially "possible". The CNIL recalls, for example, that it recognized the legitimacy of certain uses, such as access control at the Nice Carnival carried out on a sample of volunteers. By contrast, it notes that it opposed the technology's use for controlling access to schools.

The organization also recommends paying particular attention to the protection of personal data that could be processed through facial recognition. It lays out several cardinal principles: the consent of the people concerned, individuals' control over their data, transparency, the right to opt out of the system and to access information, and the security of biometric data. "Experiments cannot ethically have the purpose or effect of accustoming people to intrusive surveillance techniques," the institution insists. Finally, the CNIL advocates a genuinely experimental approach, in order to "test and improve technical solutions that respect the legal framework".

An experiment at Orly in 2020

In France, the ADP group plans to trial the technology next year at Paris-Orly airport, where facial recognition will allow passengers to pass through the various controls, from check-in to boarding. The Ministry of the Interior is also experimenting with Alicem, an application that will give smartphone users access to online services requiring a high level of identity verification, after completing a facial recognition step.