More and more people are falling victim to crimes involving deepfakes.

SBS's 'I Want to Know That', which aired on the 27th, covered crimes involving deepfakes under the subtitle 'See, Hear, and Suspect: A War with Fakes, Deepfakes'.

One day, the informants were shocked to receive sexually explicit videos from strangers. The videos were not ones they had filmed, yet the faces in them closely resembled their own. With these videos, strangers demanded money from the women. This is the so-called deepfake sex crime.

'Deepfake' refers to technology that uses deep learning, a field of artificial intelligence, to create fake videos by synthesizing a person's face or other features onto existing footage.
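The definition above can be made concrete. The classic face-swap approach (popularized by the open-source 'deepfakes' autoencoder method) trains one shared encoder together with a separate decoder per identity; a swap is produced by encoding person A's face and decoding it with person B's decoder. The sketch below shows only that data flow, with random weights standing in for a trained model; all dimensions and names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a "face" is a flattened 8x8 grayscale patch.
FACE_DIM, LATENT_DIM = 64, 16

# One shared encoder, plus one decoder per identity.
# Random weights stand in for trained parameters.
W_enc = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.1
W_dec_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1
W_dec_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1

def encode(face):
    # Shared encoder: maps any face into a common latent space.
    return np.tanh(W_enc @ face)

def decode(latent, weights):
    # Identity-specific decoder: renders a latent code as that identity's face.
    return weights @ latent

face_a = rng.standard_normal(FACE_DIM)

recon_a = decode(encode(face_a), W_dec_a)  # normal reconstruction of A
fake = decode(encode(face_a), W_dec_b)     # "deepfake": A's expression, B's decoder

print(fake.shape)  # (64,)
```

In practice the encoder and decoders are convolutional networks trained on thousands of aligned face crops; the point of the sketch is simply that a shared latent representation is what makes swapping decoders, and thus faces, possible.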

Yet sex crimes exploiting the technology have been occurring frequently. In particular, deepfake videos of K-pop stars have raised concern that they could lead to serious crimes.

Of course, technologies exist to tell deepfakes apart, but generation technology advances faster than security technology, and it is not easy to stop those who abuse it. One expert worried, "Against deepfake generation techniques I have not anticipated, the detection rate is 0%." Experts also pointed out that when a crime involves deepfake video, the damage can be even greater.

Accordingly, the broadcast conducted an experiment re-enacting a crime committed with deepfakes.

How well could parents, who know their child's face better than anyone, recognize such a fake?

The production crew synthesized photos of the three participants into other footage to create deepfake videos showing them hospitalized.

They then sent the videos to the participants' parents, asking for a photo of a credit card on the pretext of urgently needing money.

In this experiment, the parents of two of the participants immediately sent a photo of their card, confirming that if such a crime actually occurred, many people could be harmed.

Recently, petitions calling for sites that upload deepfake videos of female celebrities to be shut down, and for their users to be severely punished, have been posted on the national petition site.

Deepfake videos of female celebrities are a form of illegal sexual exploitation that combines celebrities' faces with pornography.

The reporter who covered this said, "Even before the Nth Room case came to light, such deepfake chat rooms existed. The quality is far beyond ordinary adult content, so I thought that if left unattended it could become a social problem and damage the national image."

According to a 2019 report on deepfakes, 96% of deepfake videos were pornographic, and 25% of the victims appearing in them were Korean celebrities.

In particular, Korean K-pop singers appeared in such pornography more than anyone except British actors.

And that finding still holds today.

An expert explained, "K-pop stars are being exploited because they are world-famous and the people who make these videos choose targets their buyers are interested in. The videos sell well online, so they keep being made."

Sites where deepfake videos of Korean celebrities were uploaded were easy to find.

Shockingly, one of these sites even ranked its pornographic videos.

A reporter who covered deepfakes worried, "One idol debuted at the end of last year, and a deepfake composite was already released two or three months after her debut. If you debut today, there could be a deepfake video of you tomorrow."

So who makes these videos?

In response, a digital security expert said, "These are not the work of ordinary individuals. Many of the videos have clearly passed through expert hands, so polished that you can hardly tell they are deepfakes."

One site dealing in celebrity deepfake videos drew criticism by posting notices claiming the videos were fake and intended solely for entertainment and educational purposes.

One informant said that the first domestic deepfake videos were released on porn sites.

He said, "They first appeared on a site called K-Pop OOO, mostly composites of K-pop idols and Japanese AV actors."

And according to him, many of the creators are IT developers.

"There was a competitive atmosphere over who could make the more sophisticated pornographic composite," he explained.

In fact, last year, Korean and Chinese IT developers were caught producing and selling deepfake pornography, and an investigation followed.

The investigator in charge explained, "We could not make arrests because the site was based overseas and the people involved could not be identified."

What is the purpose of those who produce deepfake videos?

Their purpose was simply to make money.

An expert said, "If it is a video of a public figure or a popular singer or actor, it spreads faster than it can be deleted. Because of videos made purely for money, the people appearing in them are bound to suffer lifelong damage."

The production crew contacted entertainment agencies for comment on the deepfake videos.

Some were unaware the deepfakes existed; others knew but avoided mentioning them, believing the damage would only grow the moment they did.

Some responded that they had even filed lawsuits but gave up because catching the perpetrators proved impossible.

Meanwhile, the victims of deepfake crime were spreading from celebrities to ordinary people.

In fact, women who suffered deepfake abuse reported it to the police, only to be told that the perpetrators could not be identified because the material had been posted on overseas sites, and to be advised not to post photos on social media.

The victims even blamed themselves for having posted their photos on social media.

One woman who fell victim to a deepfake crime through an acquaintance filed a complaint against him.

Recalling the complaint, the woman said, "I thought he would be punished. At the very least, that he would serve a year before getting out."

The case involving this woman had as many as 14 victims.

However, the perpetrator was released on probation on the grounds that he was a first-time offender and was repenting.

The victims said, "It was so unfair. It made me furious that he was living his life just fine."

The production crew requested an interview with the perpetrator.

The perpetrator responded, "I didn't do it myself, so why should I even take this call?"

He then claimed, "Someone I met on the Internet asked me for photos, so I just handed them over. He paid me, so I just sent whatever photos I had. That's all I did." A victim, meanwhile, said, "Rumors spread, so I couldn't go out properly. I couldn't even live a normal life." The victims' pain was immeasurable.

Because deepfake crime is something anyone can easily stumble into, there is also a risk of it leading to juvenile crime.

The production crew easily found a deepfake seller on the Internet and got him on the phone.

The seller, who said he was a 17-year-old high school student, explained, "When I first opened this server, about 30 people a day were coming in, and I've made about 500,000 won so far. I started for the money. I synthesize photos and make deepfake videos, and I also pass some on to people who sell them."

When asked whether he realized how terrible a crime he was committing, he said, "I know, but even if I get caught, I'll just keep selling."

And when asked whether he could actually take responsibility, he left the crew speechless by answering that he could not.

Recently, two teenagers who sold more than 3,000 deepfake photos of celebrities on social media were arrested, a reminder of how serious this crime has become.

The production crew received a report from a woman who said she had suffered deepfake damage.

The woman said that a foreigner she encountered on a website had sent her a pornographic video bearing her own face.

The crew then found another Korean victim on the same site.

This other victim reported several pieces of deepfake sexual exploitation material made with her face, and the person who had posted them was presumed to be Korean.

Based on the clues, the crew tracked down an SNS account.

The account used a photo of the victim as its profile picture.

It was also an account through which sex crimes were being committed: obscene acts such as genital exposure were filmed and sold.

The production crew contacted the account and asked who the person in the profile picture was.

The account holder replied that it was a photo of an acquaintance and that deepfakes of her existed as well.

Judging his behavior to be serious, the production crew used the videos he had uploaded to identify the area where he appeared.

After confirming the area, they reported the man to the relevant cyber investigation unit.

The crew also informed the police that the man was distributing deepfake videos and selling sex-crime videos.

The police responded, "This is a serious crime. Under the Information and Communications Network Act, it appears to constitute distribution of obscene material, and if the videos were produced and distributed after June 25, 2020, it can also be punished as production and distribution of false videos."

Regarding the man's use of a foreign site, they added, "People assume cyberspace leaves no traces, but every crime leaves traces. If we investigate the material posted on Twitter, we will be able to identify him."

After filing the police report, the production crew canvassed the area and met a number of people who had seen the man.

Astonishingly, the man had been boasting on his SNS of committing obscene acts for over 10 years.

Experts advised that an immediate arrest was necessary: "Crimes involving video can circulate widely, and the scope of the damage can be vast. If he is not arrested, the crime will continue."

Fortunately, five weeks after the crew's report, news came that the man had been arrested.

The police explained, "We identified the suspect the day after the report and arrested him within five days. We secured a substantial amount of related evidence, and he has partially admitted to the crime."

It also emerged that the woman whose face was used in the deepfakes had been unaware of the abuse.

The victim is reportedly not close to the perpetrator; the two are believed to have met online a few years ago.

The police said, "To prevent secondary harm, we plan to actively support the deletion of the videos and provide psychological support for the victims."

Such support became possible because the law was revised in June of last year to punish deepfake crimes under the special act on sexual violence.

An expert explained, "Previously, measures imposed on sexual violence offenders, such as electronic anklets, disclosure of personal information, and employment restrictions, did not apply to deepfake offenders, but now they do, and measures to protect victims are possible."

Among the forms of victim support, a video deletion support center is in operation.

An official said, "We take reports of damage through our online bulletin board 24 hours a day. We support deletion, and once a case is confirmed, we provide urgent deletion for three months immediately. All support is public and free, so if you have been victimized, please take courage and come to the center right away."

The police also promised prompt deletion support and vowed to pursue and arrest perpetrators to the end, however long it takes.

They stressed, "We are working to raise gender sensitivity and investigators' expertise in protecting victims. If victims take courage and respond actively rather than giving in to frustration, the police can fully support them."

They further pledged to do everything possible to protect victims by strengthening their own investigative capabilities and actively pursuing international cooperation.

Deepfake developers emphasized that crimes using deepfakes are bad, but the technology itself is not.

"The harm is only one side of this technology," one developer said, expressing concern about public misconceptions of deepfakes.

He added that the technology holds great potential across content and social media, and said it would be a pity if negative public perception took hold before the technology fully blossomed, blowing away its chance to grow and develop.

At the same time, virtual faces created with deepfakes can protect someone or even give them a new life.

In fact, the broadcast itself used deepfakes to protect the victims.

One expert also suggested that educational content using celebrities' faces could be more effective: "Given the Korean Wave and the K-pop craze, it would be good to use celebrities' faces in Korean-language educational content."

Deepfakes also make it possible to meet once more those who have passed away.

A few years ago, in the 'Star Wars' series, Carrie Fisher's Princess Leia was restored with deepfake technology, producing a moving scene.

Experts said, "Deepfakes learn from human data, and deepfakes are created by human imagination. What deepfakes create is a world of fantasy, and whether it becomes a good dream or a nightmare is up to people." They stressed that we should think about what must be done so that ours becomes a society where we can see, hear, and be reassured, not one where, because of deepfakes, we see, hear, and doubt.

Meanwhile, the broadcast used deepfakes to protect the victims' privacy, and showcased the technology's positive side with a deepfake of host Kim Sang-jung created to aid viewers' understanding.

(SBS Entertainment News Editor Kim Hyo-jeong)