Last summer, they discovered with horror that so-called "generative" AI programs could now produce, on a simple request, a drawing of a dog "in the style of Sarah Andersen" or an image of a nymph "the Karla Ortiz way".

It is an appropriation carried out without the artists' consent, credit or compensation, the three "Cs" at the heart of their battle.

In January, artists collectively filed a complaint against Midjourney, Stable Diffusion and DreamUp, three AI models trained using billions of images collected from the internet.

Sarah Andersen, one of the main plaintiffs, said she felt personally violated when she saw a drawing generated under her name, in the style of her comic "Fangs".

Her outraged reaction on Twitter was widely shared, and other artists then contacted her. "We hope to set a legal precedent and force AI companies to follow rules," she said.

In particular, the artists want the right to accept or refuse to have their works used by a model, rather than having to request their removal after the fact, when that is even possible.

Under those conditions, one could imagine a "licensing system, but only if the fees are enough to live on," notes Karla Ortiz, another plaintiff.

"Easy and cheap"

There is no question of "receiving cents while the company pockets millions," insists the illustrator, who has worked for Marvel Studios.

Artist Karla Ortiz shows off the Doctor Strange costume she designed for Marvel Studios, at her studio in San Francisco, California, March 8, 2023 © Amy Osborne / AFP

On social media, artists describe how they have lost a large share of their commissions.

"Art is dead, man. It's over. AI won. Humans lost," Jason Allen told The New York Times in September 2022, after submitting an image generated by Midjourney to a competition, which he won.

The Mauritshuis Museum in The Hague is currently exhibiting an AI-generated image for a competition to create works inspired by Vermeer's "The Girl with a Pearl Earring".

The San Francisco Ballet sparked debate by using Midjourney in its December campaign promoting "The Nutcracker".

"It's easy and cheap, so even institutions don't hesitate, even if it's not ethical," says Sarah Andersen.

The accused companies did not respond to AFP's requests for comment, but Emad Mostaque, the boss of Stability AI (Stable Diffusion), likes to compare these programs to simple tools, such as Photoshop.

These tools will allow "millions of people to become artists" and "create tons of new creative jobs," he has said, arguing that "unethical" or illegal use is the "problem" of the users, not the technology.

Apocalypse of creation

Companies will invoke the legal concept of "fair use", a kind of copyright exception clause, says lawyer and developer Matthew Butterick.

"The magic word is +transformation+. Does their system offer anything new? Or does it replace the original on the market?" says the consultant.

Together with the Joseph Saveri law firm, he represents the artists, as well as programmers in another complaint over Microsoft software that generates computer code.

With a trial still far off and its outcome uncertain, the pushback is also taking shape on the technological front.

Called in to help by artists, a laboratory at the University of Chicago last week released software that lets artists publish their works online while protecting them from AI models.

Called "Glaze", the program adds a layer of data on the image, invisible to the naked eye, which "blurs the tracks", summarizes Shawn Shan, the student in charge of the project.

The initiative is greeted with enthusiasm, but also skepticism.

"The responsibility will fall on the artists to adopt these techniques," says Matthew Butterick. "And it's going to be a game of cat and mouse" between companies and researchers.

He fears the next generation will become discouraged.

"When science fiction imagines the apocalypse by AI, robots arrive with laser guns," notes the lawyer. "But I think the victory of AI over humanity is when people give up and stop creating."

© 2023 AFP