Artificial intelligence can recognize the six basic emotions (illustration). - Smith Collection / Gado / Sipa USA / SIPA

  • The EU unveiled its artificial intelligence and data battle plan on Wednesday to catch up and dispel fears of Big Brother-style control.
  • Facial recognition is an increasingly strategic sector.
  • What is artificial intelligence capable of in the field of emotion recognition? For the moment, we credit it with abilities it does not have.

What can artificial intelligence detect about our emotional states? Facial recognition, an increasingly strategic sector, is seen as a technology of the future. Its fields of application are numerous, and China - with its social credit system - and the GAFA (Google, Apple, Facebook, Amazon) have been on the move for a while. While the EU unveiled its battle plan on artificial intelligence and data on Wednesday to catch up and dispel fears of Big Brother-style control, 20 Minutes takes stock of emotion recognition. Could a mere machine cheer us up in a bout of depression?

Amazon's facial recognition system, called Rekognition, is reportedly able to detect the six basic emotions (joy, sadness, anger, surprise, disgust, confusion), according to a Guardian article published last Sunday. These facial expressions, defined by the work of the American psychologist Paul Ekman, are found across all of humanity. Trained on large databases, algorithms learn to recognize these six emotions, to which "calm" is added.
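To make that concrete, here is a minimal sketch of what querying such a system looks like through boto3, the AWS SDK for Python. The image file name, the region, and the sorting by confidence are assumptions added for illustration; the DetectFaces call, the Attributes=["ALL"] flag, and the emotion labels it returns (including CALM) are documented parts of the Rekognition API.

```python
# Minimal sketch: asking Amazon Rekognition for emotion estimates on one image.
# Assumes AWS credentials are configured; "face.jpg" and the region are placeholders.
import boto3

client = boto3.client("rekognition", region_name="eu-west-1")

with open("face.jpg", "rb") as f:
    response = client.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # emotion estimates are only returned with "ALL"
    )

for face in response["FaceDetails"]:
    # Each detected face carries a list of emotions with confidence scores;
    # these are estimates of facial expression, not of inner feeling.
    for emotion in sorted(face["Emotions"], key=lambda e: -e["Confidence"]):
        print(f"{emotion['Type']}: {emotion['Confidence']:.1f}%")
```

The output is a ranked list of labels such as HAPPY or CALM with percentages - exactly the kind of flat score that the researchers quoted below are skeptical of.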

The palette of emotions

Today's algorithms work rather well at recognizing basic emotions when the person is frozen in front of a camera, much less so in an uncontrolled space. But in everyday life, the face is rarely frozen. "Everyday life is made up of a multitude of fairly subtle expressions; you can have several emotions at the same time," points out Laurence Devillers, a specialist in affective computing at the CNRS and a professor at Sorbonne University. "These emotions, strong fear, strong sadness, strong joy... when does that happen in everyday life?"

Humans themselves do not always manage to decipher the emotions of others, so what chance does a machine have? Klaus Scherer, a former professor at the University of Geneva, has worked on the perception of emotions. He surveyed thousands of people and showed that certain emotions are well identified (around 80% of the time) while others are much less so (around 60%). Laughter can express many different emotions (embarrassment, joy, sometimes rage...). Similarly, anger is not expressed the same way toward a banker as toward an emergency call center. Context plays a major role.

"There is a wealth of content that cannot be traced by machines," insists Laurence Devillers. The algorithms currently use very little context. "In a lot of situations, the face alone cannot give the emotional state of the person, we can introduce other sensors, physiological sensors in certain cases, to better analyze the emotion, specifies Mohamed Daoudi, IT professor at IMT Lille Douai and researcher at the CRIStAL laboratory. But there is still a long way to go, not to say inaccessible.

"The machine does not understand anything"

Different communities do not share the same value systems. And within a single community, individuals do not all translate their emotional states the same way either. There is a great risk of falling into the trap of bias, with machines attributing emotions on the basis of databases that were skewed from the start. In the field of artificial intelligence, databases are the sinews of war. This is why the GAFA, rich in data, and China, with its mass surveillance tools, are more than a step ahead in this field. If a machine is trained to recognize emotions from the facial expressions of a specific population (say, Western men and women), it will struggle to recognize the expressions of a person from another population; the sketch below shows how such a gap can be measured. This is why it is necessary to work with databases that are as representative as possible.
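As a hedged illustration of that point, the sketch below computes a classifier's accuracy separately for each demographic group in a test set. The predictions, labels, and group tags are entirely hypothetical; the point is simply that a flattering global accuracy can hide a wide gap between the population the model was trained on and everyone else.

```python
# Hypothetical sketch of auditing an emotion classifier for dataset bias:
# a single global accuracy figure can mask very uneven per-group performance.
from collections import defaultdict

def accuracy_by_group(predictions, labels, groups):
    """Return the classifier's accuracy computed separately per group."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for pred, label, group in zip(predictions, labels, groups):
        total[group] += 1
        if pred == label:
            correct[group] += 1
    return {g: correct[g] / total[g] for g in total}

# Made-up results: a model trained mostly on population "A" scores
# well there and poorly on population "B".
preds  = ["joy", "anger", "joy", "sadness", "joy", "surprise"]
labels = ["joy", "anger", "sadness", "sadness", "anger", "surprise"]
groups = ["A", "A", "B", "B", "B", "A"]
print(accuracy_by_group(preds, labels, groups))
# {'A': 1.0, 'B': 0.333...} despite a global accuracy of 0.67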

"What we are doing for the moment are scams. We know how to detect positive versus negative, and even that is shaky," Laurence Devillers complains. "What we detect best is expressive versus not expressive." Asking machines to understand every situation and every subtlety of emotion is out of reach. "The machine does not understand anything, and we sorely lack annotated data on these subjects to do anything serious," says the artificial intelligence specialist. Suffice to say that the angle of attack must change.

The most promising application in the long run would be systems dedicated to a single person, able to detect nuances in that person's audio or visual behavior. "In cars, we can put cameras capable of observing the emotional state of the driver (whether he is falling asleep or anxious); it can be interesting to personalize a service according to the state of the person," considers Mohamed Daoudi. In certain illnesses, severe depression or Alzheimer's, facial expressions can help the doctor better understand what is going on. The question is no longer the variability across all individuals but the variability within one individual, an idea sketched below. And that would avoid playing sorcerer's apprentice.
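A hedged sketch of that per-individual idea: rather than comparing a reading against population norms, the system calibrates to one person's own baseline and flags deviations from it. The "expressiveness score" here is a hypothetical scalar from some upstream model; only the standard statistics module is real.

```python
# Hypothetical sketch: flag readings that deviate from ONE person's usual
# range, instead of judging them against the whole population.
import statistics

class PersonalBaseline:
    def __init__(self, calibration_scores):
        # Scores recorded for this one person during a calm calibration phase.
        self.mean = statistics.mean(calibration_scores)
        self.stdev = statistics.stdev(calibration_scores)

    def is_unusual(self, score, threshold=2.0):
        # A reading more than `threshold` standard deviations from this
        # person's own mean is unusual *for them*, whatever it would
        # mean for someone else.
        return abs(score - self.mean) > threshold * self.stdev

baseline = PersonalBaseline([0.42, 0.45, 0.40, 0.44, 0.43])
print(baseline.is_unusual(0.41))  # False: within this person's normal range
print(baseline.is_unusual(0.80))  # True: a marked change for this individual
```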


  • Artificial intelligence
  • GAFA
  • Facial recognition
  • Future(s)
  • Near future