What crimes might using AI face-swapping to "create yellow rumors" constitute? Lawyer: Spreading obscene materials and defamation; the perpetrator may be punished for both

After those photos appeared, the life of Lin Zhi (a pseudonym) was completely disrupted. A week ago, several strangers asking to "make friends" sent her photos that left her stunned: obscene images of women whose faces resembled her own.

Shame and anger came first; then, as she dug for the truth, anxiety and fear gradually followed. With the help of AI technology, perpetrators hidden around her had plunged her into a profound sense of insecurity.

In recent years, AI has moved from "industrial-grade" technology into ordinary people's homes, and the falling threshold for use and cost of creation have made things easier for wrongdoers and even criminals. Cases keep emerging: AI face-swapping scams, livestream sales fronted by swapped celebrity faces, face-swapped "kisses" with celebrities, subway photos run through "undressing" tools. AI face theft, "yellow rumors" and "one-click undressing" have broken through legal and ethical boundaries and invaded the lives of ordinary people.

An investigation by a Beijing News reporter found that readily available image-making tools and hard-to-trace communication channels have made the makers of AI "yellow rumors" ever more brazen. While they spread rumors with impunity, victims are trapped in proving their own innocence: they no longer trust social networks, they scrutinize themselves over and over, and their trust in the world around them gradually disintegrates.

In other businesses spun off from AI face-swapping, there is no shortage of profiteers. Some peddle obscene videos bearing celebrities' faces on social platforms; some offer paid, customized face-swapping and "undressing" image services; others use "stripping" as a gimmick to sell tutorials.

Sudden friend requests from multiple strangers

A week ago, Lin Zhi suddenly received WeChat friend requests from nearly 10 strangers, all carrying the message "Little sister, let's be friends".

Lin Zhi found it strange; she rarely gave out her WeChat. Her social circle is small, and she hardly adds anyone beyond colleagues, friends and family. She refused them one by one, but two or three strangers kept sending requests, and eventually she accepted one.

The stranger said he had seen some "explicit" photos of Lin Zhi. He first sent a few of her everyday photos, which Lin Zhi recognized as pictures she had posted on her WeChat Moments. Then came pornographic photos and videos she had never seen before, with facial features that resembled her own.

"My face was stolen by AI." Lin Zhi grasped the situation quickly; she had never expected this kind of thing to happen to her.

To find out the truth, Lin Zhi accepted the friend requests of two other strangers. From the chats she learned that they frequented obscene chat groups on the foreign social app Telegram, where someone had been posting her photos and videos. The accompanying text always carried words like "slut" and "provocative", some of it implying she was a sex worker, and group members would then message the publisher privately to obtain her personal information.

Although Lin Zhi told them at the outset that she was married, they kept up the frivolous teasing: "sister", "I'll spoil you, I want to chase you".

The more she learned, the more frightened Lin Zhi felt. The image makers knew her well: her name, her age, her home address. Judging from the time span of the leaked photos, they had known her for a long time. "The earliest photos were posted in May 2022, and the latest in early May this year. I suspect it was an acquaintance who had been lying in wait for a long time."

Lin Zhi said the "yellow rumors" mixed real photos with fake, AI face-swapped ones. The two look very much alike and are hard for ordinary people to tell apart, which adds "realism". Trying to prove her innocence would only play into the rumor-mongers' trap. She wanted to find the rumor-monger and demand an explanation.

"I've been torn about it." Lin Zhi worried about the rumors spreading further and hoped the rumor-monger would stop posting; but if he stopped posting, he could never be found. Lin Zhi and her husband downloaded Telegram and tried to find clues using keywords the strangers had provided. They retrieved only more obscene images forged with her face, while the publisher kept selling and posting new ones, making him difficult to trace.

To catch the image maker "in the act", she had to keep searching and checking. "Every time I looked, it was mental pollution."

During the epidemic, Lin Zhi had suffered from depression; her condition had stabilized two months earlier and she had just stopped taking medication. But after the incident, anxiety, doubt and unease enveloped her again, and ringing in her head and insomnia showed signs of returning. She worried she would soon have to go back on medication.

"You did nothing wrong; it is the rumor-mongers who are wrong." Her husband and best friends kept encouraging Lin Zhi. But at times she still fell into self-doubt. When she was nervous and anxious, anyone could look like the image maker: strange men at the milk tea shop, company colleagues, even family members became objects of her suspicion.

After the incident, Lin Zhi abandoned the WeChat account she had used for more than ten years; on her new account, she set her Moments to "visible only to myself". "All my desire to share is gone. I won't dare post photos casually again."

On May 5, Lin Zhi went to the Public Security Bureau to report the case and gave a statement. Other victims chose silence for various reasons.

After one victim's face was stolen by AI to create pornographic rumors, a stranger added her as a friend and kept calling her and sending AI-forged obscene videos. Out of fear, she did not dare call the police. "I didn't do anything wrong," she comforted herself, then blocked the stranger and ignored it.

Another victim told the Beijing News that two selfies she had posted on social media were spread after being "stripped" with AI. The incident taught her that the only answer was to "guard your privacy everywhere". In her view, rumor-mongers on overseas apps like Telegram are hard to catch, making it difficult to defend her rights.

"It's disgusting when those people harass me, but when I think that those bodies aren't mine, my conscience is clear."

Servers are set up overseas, making it difficult to trace the source

Telegram is known for "privacy" and "security", and because its servers are located abroad, it is harder for the relevant authorities to trace rumor-mongers. The reporter also noticed that the "burn after reading" feature makes it hard for police to gather evidence: users can destroy previously posted information when they cancel their accounts.

It is these characteristics that make criminals believe they have found a safe haven. Typing sexually suggestive keywords into the Telegram search bar, the reporter turned up nearly 10 obscene chat groups, some with more than 5,000 subscribers.

"Contrast", "exposure", "self-destruction" and "social death" are high-frequency words in these groups' names, and most of the chat content involves producing and disclosing women's obscene images and private information. Beyond images, many women's height, weight, age and other details have been made public, sometimes with home addresses, social media accounts and even ID numbers.

The reporter noticed a large amount of "slut-shaming" content in the chats, riddled with abusive words such as "dog", "slut" and "owed".

As for the purpose of producing and distributing the pornographic images, publishers often mention words like "debt", "revenge" and "humiliation". In the comments, some netizens ask for the exposed women's contact information; some publishers post it openly, while others require a private chat.

The reporter found that simply clicking on a group name in the search results allows anyone to browse its entire message history for free, without even joining the group.

Some highly realistic face-swapped videos can pass for the real thing, leading people to believe the everyday photos and the obscene images show the same person; even the victims themselves can be disoriented on seeing them. Lin Zhi told the reporter that the facial features in the fake photos looked very much like her own, but on close inspection the face shape was different.

In some low-quality face swaps, the flaws show directly in the facial contours: blurred, distorted or even broken lines. Netizens in the comments point them out too: "obviously a face swap at a glance", "too fake".

Yet regardless of quality, the comment sections are filled with graphic sexual descriptions and abuse. Because the victims' private information had been made public, some netizens in the same city claimed they would seek the women out, with harassment clearly the intent.

Publishers use videos as a gimmick to sell software

Boosted by this traffic, a group of criminals has poured in, profiting from private commissions, custom processing and course sales. Among them are many profiteers selling obscene content, who regularly post AI works on mainstream social platforms and hint that netizens can commission obscene images for a fee.

One "supplier" told the reporter he could produce customized AI images of women to order. Depending on image quality, prices range from 100 to 180 yuan per image. When the reporter asked to see a sample first, he immediately sent an AI-generated picture of a woman with her private parts exposed. "See? Everything is done to your request," he said, then recalled the explicit picture.

When the reporter asked whether a real photo could be "stripped" with AI, he said it could be done once a deposit was paid.

In addition, face-swapped videos of female celebrities have become a major gimmick for profiteers. The reporter found netizens on Baidu Tieba who specialize in selling obscene images bearing female celebrities' faces.

Meanwhile, the "face-swapping AI" section of one porn site is full of obscene videos bearing female celebrities' faces. The site says a 200-yuan recharge makes you a VIP member with unrestricted access to the "face-swapping AI" section.

In Telegram groups, similar behavior is even more rampant. Searching keywords such as "AI face swap" and "undressing" turned up multiple related groups, all spreading obscene images. One group with more than 4,000 subscribers held a large number of AI face-swapped and "undressed" obscene images.

Among the stolen faces are many well-known actresses and internet celebrities. At the same time, photos of ordinary women in everyday settings such as schools, shopping malls, offices and subways have also become targets of AI face theft. The AI swaps their faces, or strips away their clothes to render them naked. Words like "contrast" and "inner beauty" are used to caption the processed photos. Messages in such groups draw thousands or even nearly 10,000 views each.

One publisher wrote: "For appropriate entertainment and self-satisfaction only. Don't send it to the person or harass them, or the police will come for you! AI stripping: you give me a picture, I'll make you a picture."

Other publishers use obscene videos as bait to post purchase links for AI face-swapping and stripping software in the groups: "Private customization, no hardware threshold, fully in Chinese, foolproof operation, instant results."

The reporter clicked one purchase link to inquire. Customer service said the software itself sells for 200 to 350 US dollars (phone and computer versions are priced differently), is valid for one year, and comes with instruction.

Commissioned work runs $30 per picture and $50 per video. After payment, users need only provide a photo or video as material and receive the processed image within an hour. Customer service added that software buyers can join a VIP-only group with access to more AI image resources.

Expert: The threshold for AI face-swapping has fallen, making illegal use easy

The reporter's investigation found that the cost and skill threshold for producing obscene images with AI are low; on some websites, a paid account can generate AI "stripping" images with one click.

Take one foreign one-click "stripping" website as an example. No login is required: after a roughly 40-second wait, you upload a photo that meets the site's requirements, click process, and in under half a minute an "undressed" image is generated. At that point the displayed image is blurry; downloading the HD version requires a top-up. $29.95 buys 100 one-click AI "stripping" images.

"To copy a person's facial information, a single photo is often enough." An expert with many years in network security told the reporter that with current AI technology, even a single frontal photo lets AI infer views from other angles through deep learning, reconstruct a three-dimensional face, and then map it onto video.

The expert showed the reporter an example: through AI completion, a figure in a flat painting was rendered in three dimensions; a subject who originally showed only a frontal face turned its head, revealing the profile. The reporter noticed that the 3D figure moved smoothly and the inferred facial details looked natural.

The expert said that most photos posted on mainstream social platforms are sharp enough to generate closely fitting face-swapped photos and videos that are hard to tell from the real thing. Even low-resolution images can be enhanced with AI to improve the fit of the swap.

This is owed to the maturity of deep learning. The internet hosts many open-source face-model training projects with strong learning capability, many of them free and simple to operate, capable of exporting very natural, realistic face-swapped videos.

Hardware requirements for running the AI have also dropped sharply. Much face-swapping software connects to cloud servers: the processing happens in the cloud and the result is sent back to the device afterward.

In this expert's view, although a swapped image can reach a very high degree of realism, a close look can still tell it apart.

For example, check whether the transitions between the eyes, eyebrows and skin tone look natural; whether the light and shadow are consistent; and whether the proportions of the person and the surroundings are realistic. Image quality is another clue: face swapping involves image processing and compositing, which can degrade quality, leaving blur, distortion or heavy noise.
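The image-quality clue lends itself to a rough automated check. As a minimal illustration (a hypothetical sketch, not a tool described in the article), the variance of an image's Laplacian is a common sharpness proxy: a blurry composited region scores low, while crisp detail scores high. A stdlib-only Python version:

```python
# Hypothetical sketch: variance-of-Laplacian as a blur/sharpness proxy.
# A face-swapped region that was blurred during compositing tends to
# produce a lower score than untouched, sharp image regions.

def laplacian_variance(gray):
    """gray: 2D list of grayscale pixel values (0-255).
    Returns the variance of the 4-neighbour Laplacian over interior pixels."""
    h, w = len(gray), len(gray[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # 4-neighbour discrete Laplacian at (x, y)
            lap = (gray[y-1][x] + gray[y+1][x] + gray[y][x-1] + gray[y][x+1]
                   - 4 * gray[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

# A sharp checkerboard scores high; a flat (fully blurred) patch scores 0.
sharp = [[255 if (x + y) % 2 else 0 for x in range(8)] for y in range(8)]
flat = [[128] * 8 for _ in range(8)]
print(laplacian_variance(sharp) > laplacian_variance(flat))  # True
```

A low score by itself only indicates blur, not forgery; practical detectors combine many cues like those the expert lists above.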

On the one hand, low-cost, low-threshold AI face-swapping gives everyone access to an entertainment tool; on the other, technology this readily available also lends itself to illegal and criminal use.

Lawyer: Victims should take screenshots, preserve evidence and report to the police immediately

Lan Tianbin, a senior partner at Jiangsu Fade Dongheng Law Firm, said that criminals who use AI to produce and disseminate pornographic pictures or videos without the subjects' consent are, from a civil-law perspective, suspected of infringing the subjects' portrait rights, reputation rights and privacy rights. The perpetrator is also suspected of violating the Public Security Administration Punishment Law, and public security organs may impose administrative penalties.

Lan Tianbin noted that if the perpetrator acts for profit and the quantity or sums involved reach a certain level, he is suspected of the crime of producing and disseminating pornographic materials for profit.

"Even without profit, the perpetrator may be suspected of the crime of disseminating obscene materials," Lan Tianbin said.

Lei Jiamao, a lawyer at Hebei Hounuo Law Firm, agrees. In his analysis, simply "creating yellow rumors" was in the past generally only a civil tort, namely infringement of the right to reputation, with serious cases potentially constituting the crime of defamation. Using AI face-swapping to spread pornographic pictures and videos of others alongside their real photos is suspected of both the crime of disseminating pornographic materials and the crime of defamation, and the perpetrator will be punished for both.

Lei Jiamao said the main difficulties in defending rights in such cases lie in collecting evidence and identifying the offenders. Because criminals may shut down websites or delete links, evidence can become impossible to preserve; and because online information spreads easily and publishers are anonymized, the source of publication is hard to pin down.

Lei Jiamao suggested that victims in this situation immediately take screenshots to preserve evidence, report to the police promptly, and provide the police with the evidence and clues they have kept.

Beijing News reporters Cong Zhixiang and Xiong Lixin