AI startups are letting customers stay in virtual contact with people who have died, in a field that remains shrouded in mystery and raises many questions.

In a promotional video, Ryu Soon Yoon sits in front of a microphone and a giant screen showing her husband, who died a few months earlier. "Honey, it's me," she tells him, shedding tears as she begins a kind of conversation with him. After learning he had terminal cancer, the 76-year-old South Korean, Lee Byung-hual, hired the company DeepBrain AI, which recorded hours of video of him to create a digital version capable of answering questions.

Joseph Murphy, head of development at DeepBrain AI, says of the program, called Re-Memory: "We don't create new content." In other words, the technology does not generate phrases that the deceased never spoke or wrote during their lifetime.

The same principle is adopted by StoryFile, which features 92-year-old actor William Shatner as the promotional face of its website.

Stephen Smith, head of the service, which the company says is used by thousands of people, explains: "Our approach is to keep that person's magic for as long as possible" during their lifetime, "and then use artificial intelligence."

In China, companies specializing in funeral services offer mourners the chance to interact virtually with the deceased during the funeral, using artificial intelligence.

In early April 2023, entrepreneur and engineer Pratik Desai caused a stir when he urged people to "start taking recordings," audio or video, "of parents, the elderly and relatives," saying that from "the end of this year" it would be possible to create a virtual avatar of a deceased person, and explaining that he was working on a project along those lines.

The post on Twitter sparked a storm of criticism, prompting him to insist a few days later that he was not a "gravedigger," saying: "This is a very personal matter, and I sincerely apologize for hurting people."

At StoryFile, Stephen Smith says: "This is an ethically sensitive area, and we are taking great precautions."

Some experts argue that using artificial intelligence to communicate with the dead appeals only to a specific group and is not a growth sector (Getty Images)

Ethical challenges

After her best friend Roman died in a car accident in 2015, Eugenia Kuyda, a Russian engineer based in California, created a chatbot named after him and fed it thousands of text messages he had sent to relatives, aiming to produce something like a virtual version of him.

In 2017, she launched Replika, which offers some of the most advanced personal chatbots on the market and which some users spend several hours a day talking to.

But despite its origins with Roman, Replika "is not a platform designed to recreate a loved one," a company spokeswoman warned.

London-based Somnium Space aims to use the metaverse to create virtual versions of users during their lifetime, avatars that would live on in that parallel world after their death, without human intervention.

In a YouTube video about the company's product, called "Live Forever" and announced for launch at the end of the year, general manager Artur Sychov acknowledges that the service "is not aimed at everyone, of course."

"Do I want to meet my grandfather with AI? It will be available to whoever wants it."

The question that arises here is: to what extent is it acceptable for a deceased loved one to have a virtual existence and, thanks to generative AI, say things they never said before their death?

Joseph Murphy acknowledges that "the challenges are philosophical, not technical," adding: "I don't think society is ready yet. There is a line we do not plan to cross."

He says Re-Memory, which has a few dozen users, is "targeted at a specific category and not a growth sector," adding, "I don't expect it to be very successful."

Candi Cann, a professor at Baylor University who is currently researching the topic in South Korea, says: "Interacting with an AI version of a person while mourning can help one move forward with minimal trauma, particularly with the help of a professional."

Mary Dias, a professor of medical psychology at Johnson & Wales University, asked many of her grieving patients about the idea of virtual contact with their deceased parents.

"The most common answer was, 'I don't trust AI, I'm afraid it will say something I won't accept,'" she explains.