Orsay (France) (AFP)

Imagine new algorithms, give them the ability to learn, solve a few mysteries... for the first time, a supercomputer is giving French artificial intelligence research the means to leave "ancient times" behind.

Inaugurated on Friday, the machine offers, at full power, a computing capacity equivalent to that of 40,000 personal computers (16 petaflops, or 16 million billion operations per second), doubling France's computing power.
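For scale, the quoted figures can be cross-checked with a line of arithmetic (the per-PC performance that falls out is an inference from the article's numbers, not a figure from AFP):

```python
# Sanity check on the figures quoted in the article.
machine_flops = 16e15      # 16 petaflops = 16 million billion operations/second
equivalent_pcs = 40_000    # "40,000 personal computers"

# Implied performance of one such PC: 4e11 ops/s, i.e. 400 gigaflops,
# a plausible order of magnitude for a modern desktop processor.
per_pc_flops = machine_flops / equivalent_pcs
print(per_pc_flops)  # 400000000000.0
```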

"The GAFA very quickly acquired large supercomputers dedicated to AI, and were quickly imitated by the United States," Jamal Atif, a researcher at the CNRS, told AFP.

If the British company DeepMind was able to design AlphaGo, whose victory over the world Go champion caused a stir in 2016, it is because it had Google's computing power behind it.

But in France, "there was nothing. And without computing power, we were stuck in ancient times," admits the researcher from the Laboratory for Analysis and Modeling of Systems for Decision Support.

"The competition is fierce in this field, and French researchers considered themselves at a serious disadvantage compared to their competitors," says Denis Girou, former director of the CNRS's Institute for Development and Resources in Scientific Computing (Idris).

- "After a week!" -

One of the challenges of artificial intelligence is to give computers the ability to learn from data, a field known as machine learning, and in particular its deep learning branch.

But to learn to perform a specific task (recognizing an image, translating a text, playing Go...), an algorithm needs to ingest billions of examples.
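A toy sketch of what "learning from examples" means in practice (a single adjustable parameter fitted by gradient descent; real workloads involve billions of examples and millions of parameters, but the principle is the same):

```python
import random

# Toy "machine learning": estimate the slope w in y = 3x from example
# pairs, by stochastic gradient descent. Each example nudges the
# parameter a little; with enough examples, the estimate converges.
random.seed(0)
examples = [(x, 3.0 * x) for x in (random.uniform(-1, 1) for _ in range(10_000))]

w = 0.0    # initial guess for the parameter
lr = 0.1   # learning rate

for x, y in examples:
    prediction = w * x
    error = prediction - y
    w -= lr * error * x  # gradient of (prediction - y)**2 / 2 with respect to w

print(round(w, 3))  # converges close to 3.0
```

Multiplying the number of examples (and parameters) by several orders of magnitude is precisely what turns this loop into a supercomputer workload.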

"Processing this huge mass of data requires a great deal of computation," explains Jamal Atif. The same goes for testing these algorithms or for designing new ones.

In a large white room, some forty huge black cabinets house processors and storage disks. Nothing spectacular, except the deafening noise.

Housed in the CNRS's Idris data center on the Saclay plateau near Paris, the new machine, named Jean Zay and costing 25 million euros, is a technological gem equipped with specialized processors (GPUs). "These accelerators are absolutely crucial for AI researchers," explains Denis Girou.

"A lot is expected of my community," acknowledges Jamal Atif, citing "the promise of the autonomous car", intelligent assistants, predictive justice, automated diagnostic assistance...

But before all that, some mysteries remain to be solved, in particular the problem of "explainability": the operations carried out by these algorithms are so complex that it is difficult to explain how an algorithm arrived at a given decision.

In the United States, the COMPAS software is used to predict recidivism. "But we don't know exactly what it bases its predictions on..." explains Jamal Atif.

Another limitation of deep learning: it remains vulnerable to attack. Even a tiny change to an image, imperceptible to the naked eye, can fool an algorithm, preventing it from recognizing a stop sign or making it mistake a school bus for an ostrich.
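A minimal sketch of why such tiny perturbations can flip a decision, using a toy linear classifier rather than a real vision model (all numbers here are illustrative, not drawn from any actual system):

```python
# Toy adversarial example: for a linear classifier score(x) = w . x,
# nudging every "pixel" by a tiny eps in the direction of sign(w_i)
# shifts the score by eps * sum(|w_i|) -- enough to flip the decision
# even though no single pixel changes perceptibly (the intuition
# behind FGSM-style attacks on deep networks).
w = [0.8, -0.5, 0.3, -0.9]   # classifier weights (illustrative)
x = [0.1, 0.2, -0.1, 0.1]    # the "image" being classified

def score(v):
    return sum(wi * vi for wi, vi in zip(w, v))

eps = 0.1                    # imperceptibly small per-pixel change
x_adv = [xi + eps * (1 if wi > 0 else -1) for xi, wi in zip(x, w)]

print(score(x))      # about -0.14: one class
print(score(x_adv))  # about +0.11: the decision has flipped
```

Deep networks are far more complex than this linear toy, but they inherit the same sensitivity: a carefully chosen, barely visible perturbation can move an input across a decision boundary.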

That is a serious problem if you want to deploy deep learning algorithms in autonomous vehicles or planes. "We understand many things, but many things still elude us," the researcher summarizes.

Jean Zay will also serve other fields that are just as demanding of computing time, such as climatology, astrophysics, molecular dynamics and genomics...

In particular, to simulate the evolution of the climate, recreate the solar environment, or visualize the behavior of influenza viruses at different temperatures.

"Instead of waiting a month for results, we will have them after a week!" rejoices Marc Baden, director of the CNRS theoretical biochemistry laboratory.

© 2020 AFP