
Artificial intelligence has been used for several years to create realistic images and computer-generated voices. The technology makes it possible, in particular, to put words in public figures' mouths by using their image and reproducing their voice, a practice known as a deepfake (or "hypertrucage" in French). It can be used for pranks and hoaxes, but also to spread fake news, and it seems to be of particular interest to hackers.

Reproducing someone's voice to make them say things is an effective way for hackers to achieve their ends, especially as the technology improves year after year and becomes ever more realistic. Criminals have already used audio deepfakes to defraud people, and the number of potential victims could grow as voice clones become more sophisticated.

Ever more realistic audio deepfakes

In its latest report, the security consultancy NISOS explains that it analyzed one of these audio deepfake scams. Hackers cloned the voice of a company's CEO and left an employee a voicemail saying he supposedly needed "immediate assistance to finalize an urgent business deal."

The audio, shared with the Motherboard site, is far from perfect and the voice still sounds somewhat robotic, but in an emergency it is entirely possible to be fooled and follow the instructions given. "It sounds really human. They ticked that box with respect to: does it sound more robotic or more human? I would say more human," Rob Volkert, a researcher at NISOS, told Motherboard.

In this case, the scam did not work: the employee found the message suspicious and reported it to his company's legal department. But it is easy to imagine this kind of attack multiplying and succeeding, as it has in the past, and the number of victims could explode as the technology is perfected.

A serious precedent

In 2019, the CEO of a British energy company was taken in by just such a scam. He transferred 220,000 euros to a Hungarian supplier after receiving a phone call that appeared to come from the head of the firm's parent company, who insisted the transfer was urgent and had to be made within the hour. The criminals have still not been identified.

For hackers, simple audio recordings are enough to train an artificial intelligence. That is a real problem for public figures who speak regularly on television or online, as well as for executives whose calls on financial results are recorded. From these recordings, the AI can learn to copy the person's voice; the better the audio quality, the more convincing the result.

The only way to avoid being fooled by an audio deepfake attack is to keep calm despite the pressure from the caller, hang up, and call the person back to make sure it really is them. A code word could also be agreed on in advance, but in an emergency it is not always easy to remember one, which is exactly what the hackers are counting on.

