Algorithmic justice takes hold in France


Text by: Dominique Desaunay


The Council of State has just validated DataJust, a computerized decision-support system that allows courts to establish the compensation to which victims of bodily injury are entitled.

Associations defending fundamental freedoms denounce the use of personal data without the prior consent of litigants.



The objective of the DataJust program is to assist magistrates in assessing, for example, the amounts of compensation that victims of physical assault or road accidents can claim. The launch of this experiment, announced by decree on March 27, 2020, "aims to develop an algorithm responsible for automatically extracting and exploiting the data contained in court decisions relating to compensation for bodily injury," the Ministry of Justice later specified on its website.

To "train" this system, which is based on artificial intelligence programs, draws from a file containing thousands of sensitive data.

These include the victims' first and last names, as well as information about their injuries and medical assessments, but also their professional or financial situation.

While most of this information has been anonymized, certain identifying elements, such as dates of birth or family ties between victims and their relatives, appear unredacted in the database.

Associations for the defense of fundamental freedoms mobilized

According to several lawyers, this form of data processing violates the GDPR, the European regulation on the protection of personal data, as well as the French Data Protection Act.

The same goes for associations.

"

The State is freeing itself with this experiment of algorithmic justice, laws that protect personal data and privacy

," said Bastien Le Querrec, lawyer and member of the litigation group at La Quadrature du Net, the association for the defense of fundamental freedoms in digital environments.

► Read also: Digital data: "Protected on paper, not in reality"

"

This methodology, which consists of affixing an experimentation label to allow yourself to go and dig up even more personal information, bothers us. Each time, we decide that it is experimental, do not worry, we will check if the result proposed by the algorithm is proportionate, while in reality we are playing the sorcerer's apprentice. This belief that an algorithm can make better decisions than a human being is spreading, we observe in the police sector, now the justice sector, in matters of social control in the systems of the Allowance Fund. family ... we see that these algorithms are interfering more and more in the daily life of citizens and it is really a choice of society. At Quadrature du Net,we try to fight against the biases generated by these automatic programs, because technology sometimes can lead to abuses, as is the case today with the DataJust device 

», Regrets Bastien Le Querrec.

The DataJust algorithm validated by the highest court

The Council of State, petitioned by La Quadrature du Net, states in its conclusions that since the project serves the public interest, the consent of the persons appearing in the file is not required in order to process their personal data.

The DataJust system, whose effectiveness has yet to be demonstrated, should be operational by the end of March 2022, according to the Ministry of Justice.

