[Image: Nuca example motif. The camera doesn't actually pixelate anything. Photo: Nuca]

Nuca comes across like the product of a start-up, complete with its own website, slogans like “AI Powered to be You” and a collection of press photos. But it is a “speculative product,” as its makers say. Some might see Nuca as a playful commentary on the tech world, others as a nightmare taken to its logical conclusion: a camera that instantly turns portrait photos into nude pictures.

The art project was created by Mathias Vef, an artist from Berlin, and Benedikt Groß, a designer from Schwäbisch Gmünd. Nuca alludes to the growing number of AI-generated nude photos and porn videos, which are usually created without the knowledge, let alone the consent, of the people depicted; the majority of those affected are women. Special apps and customized AI models such as Stable Diffusion can now generate this material almost at will. When trolls spread fake nude photos of superstar Taylor Swift, there was a media outcry.

From a purely technical perspective, Nuca isn't exactly revolutionary: the clunky-looking camera sends each portrait photo to a server, where it is classified according to gender, age, hairstyle, body shape and 41 other categories. From this information, a prompt (text command) for Stable Diffusion is derived. The appropriately pre-trained AI generates a matching naked body; in a final step, the face and pose from the original photo are transferred onto it. The finished artificial nude image is sent back to the camera and shown on its display. The whole process takes only about ten seconds, because the AI model already knows billions of combinations of the various categories and can therefore quickly retrieve “sample bodies.”
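To make the described loop concrete, here is a minimal sketch of what such a classify-prompt-generate-composite pipeline could look like in Python. The article does not disclose the artists' actual code: `classify_portrait` and `transfer_face_and_pose` are hypothetical stand-ins, and only the image-generation call uses the real Hugging Face `diffusers` API for Stable Diffusion.

```python
# Illustrative sketch of a Nuca-style pipeline -- not the artists' actual code.
# classify_portrait() and transfer_face_and_pose() are hypothetical stand-ins;
# only the Stable Diffusion call below uses the real `diffusers` API.
from diffusers import StableDiffusionPipeline
from PIL import Image
import torch

def classify_portrait(photo: Image.Image) -> dict:
    """Hypothetical classifier: a real system would run a vision model that
    sorts the portrait into gender, age, hairstyle, body shape and 41 other
    categories. Here we just return fixed example attributes."""
    return {"age": "about 35", "hair": "short dark hair", "build": "slim"}

def build_prompt(attributes: dict) -> str:
    # Derive a Stable Diffusion text prompt from the classified categories.
    traits = ", ".join(attributes.values())
    return f"full-body studio photo of a person, {traits}"

def transfer_face_and_pose(body: Image.Image, portrait: Image.Image) -> Image.Image:
    """Hypothetical compositing step: transfer the face and pose from the
    original portrait onto the generated body (e.g. via a face-swap model).
    Left unimplemented here."""
    raise NotImplementedError

portrait = Image.open("portrait.jpg")               # photo taken by the camera
prompt = build_prompt(classify_portrait(portrait))  # categories -> prompt

# Generate a matching body with Stable Diffusion (this API call is real).
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
body = pipe(prompt).images[0]

result = transfer_face_and_pose(body, portrait)     # final composite
```

In the real installation this loop runs server-side and returns a result in about ten seconds, since, as described above, the model draws on combinations of categories it has already learned rather than being trained anew for each photo.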

“It was very important to us not to celebrate the anonymous deepfakes of Taylor Swift, for example,” says Mathias Vef in an interview with SPIEGEL. Rather, the two wanted to set “the sometimes naive fascination” with the underlying technology against reality.

Some of those who have already tried Nuca found the results amusing. Vef explains it like this: “When I am photographed, I can of course see that the result does not correspond to reality. On the one hand, that is quite funny. On the other hand, you can watch the AI at work, so to speak. And it becomes apparent that it is not perfect, but works with probabilities that are not correct.” Exposing that was one of their goals. The algorithmic bias of the AI model is sometimes quite obvious: “The attractive bodies are more likely to be placed under the faces of young people. The older a person is estimated to be, the more likely they are to get a bigger belly.” (Read more about the images Stable Diffusion was trained with here.)

So far, Vef and Groß have only tested Nuca in small circles, mainly among friends, where a relationship of trust existed that is absent when a stranger uses other people's pictures, without asking, as templates for nude images. Nevertheless, as a precaution, the faces of the people shown on the website and in this article are AI-generated, and their genitals are pixelated. The camera itself doesn't actually pixelate anything.

The official presentation, including the opportunity to try out the camera on oneself, is planned for the end of June at the Berlin gallery nüüd.