Europe 1 with AFP // 9:28 p.m., January 26, 2024

Images that sparked outrage. American politicians and Taylor Swift's fans have strongly criticized the spread of fake pornographic images of the singer, created using AI. One image has been viewed more than 47 million times, an episode that could prompt a law to stamp out the phenomenon.

The American political class and Taylor Swift's fans voiced their indignation on Friday after fake pornographic images of the singer, created with generative AI, were widely shared in recent days on X (formerly Twitter) and other platforms. One of these images has been viewed more than 47 million times on the network. According to American media, it remained on X for more than 17 hours before being removed.

Fake pornographic images ("deepfakes") of famous women, and of many ordinary people as well, are nothing new. But many activists and regulators warn that the rise of generative artificial intelligence (AI) programs risks producing an uncontrollable flood of degrading content. That the target this time is Taylor Swift, the world's second most-listened-to artist on Spotify, could nonetheless help alert the authorities to the problem, given the outrage of her millions of fans.

The platform said it was "removing all identified images"

"The only 'positive point' in this happening to Taylor Swift is that she is big enough for a law to be passed to eliminate this. You guys are sick," Danisha Carter, an influencer, posted on X with an audience of several hundred thousand people on social networks.

X is known for having laxer rules on nudity than Instagram or Facebook. Apple and Google can police the content circulating on apps through the rules they impose on their mobile operating systems, but so far they have tolerated the situation on X. In a statement, X said it applies "a zero-tolerance policy" to the non-consensual publication of nude images.

The platform said it was "actively removing all identified images" of the singer and "taking appropriate action against the accounts that posted them." Representatives of the American singer have not yet commented. "What has happened to Taylor Swift is nothing new; women have been the targets of fake images without their consent for years," said Democratic Congresswoman Yvette Clarke, who has backed legislation to fight the phenomenon. "With advances in AI, creating these images is easier and cheaper."

A 2019 study estimated that 96% of deepfake videos were pornographic. According to Wired magazine, 113,000 such videos were uploaded to major porn sites in the first nine months of 2023.