
‘Deathbots’: the AI that lets people keep talking to the dead

By Jejak Kabar Dunia · Published 2 days ago · 3 min read

If someone you love died tomorrow, would you want to continue communicating with them?

Not through memories or stored messages, but through artificial intelligence — a chatbot that uses their texts, emails, and voice notes to respond in their own tone and style.

A growing number of tech companies now offer such services as part of the “digital afterlife” industry, estimated to be worth more than £100 billion, and some people are using them as a way to cope with grief.

Jenny Kidd of Cardiff University has led research into so-called “deathbots”, published in the Cambridge University Press journal Memory, Mind and Media, and described the findings as “both fascinating and educational.”

Attempts to communicate with the dead are nothing new.

From séance rituals to spiritual mediumship, similar practices have existed for centuries.

However, as technology advances, AI has the potential to make these experiences more convincing and far easier to deliver at scale.

In 2024, James Vlahos told the BBC how he recorded his father’s voice and created an AI chatbot after receiving the devastating news that his father had cancer and would soon die.

He described how wonderful it was to keep his father’s memory alive, and while it did not take away the pain of his death, he added: “I have this incredible body of interactive information that I can draw on.”

Grief support organisations say they have not seen widespread use of deathbots, but rather curiosity about them.

“These deathbots and AI tools will only be as effective as the accuracy of the information they are given,” said founder Jacqueline Gunn.

“They don’t grow or adapt like grief does. For some people, they may offer a stepping stone, but they are not the end goal.”

“Grief is a deeply personal human response to death that requires time, understanding, and human connection.”

Working with Eva Nieto McAvoy from King’s College London and Bethan Jones from Cardiff University, Kidd researched how this technology works in practice.

They examined how AI systems are designed to mimic the voice, speech patterns, and personality of a deceased person, using their digital footprint.
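The mechanics behind such systems are proprietary, but the basic idea the researchers describe — seeding a chat model with a person’s digital footprint so it answers in their voice — can be sketched in a few lines. The following is a hypothetical, simplified illustration, not any platform’s actual implementation; all names, messages, and helper functions are invented, and real services would use much larger corpora, retrieval pipelines, and fine-tuned models.

```python
# Hypothetical sketch of a "deathbot" persona pipeline:
# 1) build a style-imitation prompt from stored messages,
# 2) retrieve the stored message most relevant to a new query.

def build_persona_prompt(name, messages, max_examples=3):
    """Assemble a system prompt asking a chat model to imitate
    `name`, seeded with a few sample messages as style examples."""
    examples = "\n".join(f"- {m}" for m in messages[:max_examples])
    return (
        f"Respond in the voice and style of {name}.\n"
        f"Examples of how {name} wrote:\n{examples}"
    )

def most_similar(query, messages):
    """Return the stored message sharing the most words with the
    query -- a crude stand-in for the retrieval step a real
    system might use before generating a reply."""
    query_words = set(query.lower().split())
    return max(messages, key=lambda m: len(query_words & set(m.lower().split())))

# Invented sample "digital footprint"
corpus = [
    "Don't forget to water the tomatoes!",
    "See you Sunday, love Dad.",
    "The match was brilliant last night.",
]

prompt = build_persona_prompt("Dad", corpus)
retrieved = most_similar("will I see you on sunday", corpus)
```

In a real system the prompt and retrieved context would be passed to a language model; the quality of the imitation depends entirely on the breadth and accuracy of the stored messages, which is exactly the limitation Gunn and the researchers point to.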

Although often marketed as a source of comfort and connection, the researchers say that this depends on a deep understanding of memory, identity, and relationships.

Kidd’s interest in the topic began during the Covid pandemic, when AI-generated animated photos suddenly appeared on social media.

People uploaded old photos of their ancestors, then watched as the photos blinked, smiled, and moved their heads as the software “brought them back to life.”

“These things are really creepy, but also very interesting,” Kidd said.
“Suddenly they were everywhere, and millions of people were sharing them.”

“That’s how we accidentally stumbled into the resurgence of this kind of AI work.”

‘My voice sounds like an Australian’
The team decided to test some deathbots themselves, exploring four commercial platforms.

“It feels strange to interact with ourselves in that way, but it’s mostly unsatisfying due to the technical limitations of current platforms,” said Kidd.

In one experiment, Kidd used her own voice data to create a chatbot.

“It didn’t sound like me; it had an Australian accent,” she added.

Kidd believes the technology will advance, but she is skeptical that a large market will emerge.

“We already have many established rituals and traditions surrounding death,” she said.
“The fact that no technology has really taken off in this area may indicate that the market isn’t that big.”

When asked if they would want their own families to be digitally replicated after death, the researchers had mixed feelings.

“My spontaneous reaction is that if they want to do that and it’s just a bit of fun, that’s fine,” said Kidd.

“But if there were signs, especially in the future, that the persona was continuing to evolve, or saying things I would never say, or holding loyalties I would never hold, and it started to shape people’s understanding of me and my values, then I think I would have a big problem.”

Dr. Nieto McAvoy said she “isn’t too bothered.”
“I’m not a very religious person, and I don’t think much about life after death. After I die, who cares?”

“If it helps them, then yes… but it could definitely be misinterpreted. And whether I want my family to pay for that service… I don’t know, it’s complicated.”



© 2026 Creatd, Inc. All Rights Reserved.