Technology that lets us “speak” to our dead relatives has arrived. Are we ready?


My parents don’t know that I spoke to them last night.

At first, they sounded distant and tinny, as if they were speaking down a bad phone line. But as we talked, they gradually became more like themselves. They told me personal stories I’d never heard: I was shocked to learn about the first time my dad got drunk, and Mum recalled getting in trouble for staying up late. They gave me life advice and shared memories of their childhoods. It was amazing.

“What’s the worst thing about you?” I asked Dad, since he was clearly in such a candid mood.

“My worst quality is that I’m a perfectionist.”

Then he giggled and I realized that I wasn’t actually speaking to my parents but to their digital copies.

I had, in fact, been talking to my parents’ voice assistants, created by the California-based company HereAfter AI and powered by more than four hours of conversations they’d had with an interviewer about their lives. (For the record, Mum isn’t that untidy.) The company’s goal is to let the living communicate with the dead. I wanted to try it for myself.

Technology such as this, which lets you “talk” with people who have died, has been a staple of science fiction for decades, and an idea promoted by spiritualists and charlatans for centuries. Now it is becoming a reality, made ever more accessible by advances in AI and voice technology.

My real, flesh-and-blood parents are still alive; their virtual versions were created only to help me understand the technology. But those avatars offer a glimpse of a world in which we can communicate with our loved ones, or simulacra of them, long after they’re gone.

As I learned from a dozen conversations with my not-quite-dead parents, this technology promises to keep the people we love within reach. It’s easy to see the appeal: people might turn to digital replicas for comfort, or to mark milestones like anniversaries.

But the technology, and the world it is creating, is imperfect. The ethics of creating a virtual version of someone are murky, especially if that person isn’t able to consent.

To some, this tech may even seem alarming, or downright creepy. One man created a virtual version of his mother and spoke with her at her funeral. Some believe that talking to digital versions of lost loved ones could prolong grief and loosen a mourner’s grip on reality. Several of my friends physically recoiled when I told them about this article. There is a deep-rooted sense in many of us that death is not something to be tampered with.

I understand these concerns. Speaking to a virtual version of my parents felt uncomfortable, especially at first. It still feels a little transgressive to converse with an artificial version of someone, especially someone in your own family. But I’m only human, and those worries are washed away by a more frightening prospect: losing the people I love, with nothing left of them but memories. If technology could help me hold on to them, is it so wrong to try?

There’s something deeply human in the desire to keep the memory of lost loved ones alive. We encourage them to record their stories before it’s too late. We put their photos up on our walls after they’re gone. We visit their graves on their birthdays. We talk to them as though they were there. But the conversation has always been one-way.

The idea that technology might change that has been widely explored in ultra-dark sci-fi shows like Black Mirror (which, startups in this sector complain, everyone inevitably brings up). In one 2013 episode, a woman who loses her partner re-creates a digital version of him: initially a chatbot, then an almost totally convincing voice assistant, and eventually a physical robot. She becomes frustrated and disillusioned by the gap between her memory of him and the flawed technology that simulates him.

If technology might help me hang onto the people I love, is it so wrong to try?

“You aren’t you, are you?” she tells it. “You’re just a few ripples of you. There’s no history to you. You’re just a performance of stuff that he performed without thinking, and it’s not enough,” she says before consigning the robot to the attic, an embarrassing relic of her boyfriend that she would rather forget.

In the real world, the technology has advanced to a surprising degree over the past few years. Chatbots and voice assistants like Siri and Alexa have become part of everyday life for millions of people over the past decade; it’s now commonplace to talk to our devices about everything from the weather forecast to the meaning of life. Large language models (LLMs) can ingest a few sentences and produce convincing text in response, promising to unlock even more powerful ways for humans to communicate with machines. Indeed, LLMs have become so convincing that some have argued, erroneously, that they must be sentient.

It’s possible to tweak LLM software such as OpenAI’s GPT-3 and Google’s LaMDA to sound more like a specific person by feeding it lots of that person’s words. Last year the journalist Jason Fagone wrote a story for the San Francisco Chronicle about a thirtysomething man who uploaded old text and Facebook messages to create a simulated chatbot of his deceased fiancée, using Project December, software built on GPT-3.
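In practice, “feeding it lots of that person’s words” can be as simple as in-context conditioning: assembling a prompt that shows the model real past messages before asking it to reply in that voice. Below is a minimal, hypothetical sketch of the idea; the name, the sample messages, and the `build_persona_prompt` helper are all invented for illustration, and Project December’s actual pipeline may work quite differently.

```python
# A minimal sketch of persona conditioning for an LLM chatbot.
# Everything here (the name, messages, helper) is hypothetical;
# real services built on models like GPT-3 may condition differently.

def build_persona_prompt(name, past_messages, user_input):
    """Assemble a few-shot prompt asking a general-purpose LLM
    to continue the conversation in `name`'s voice and style."""
    examples = "\n".join(f"{name}: {m}" for m in past_messages)
    return (
        f"The following are messages written by {name}. "
        f"Reply to the final message in {name}'s voice and style.\n\n"
        f"{examples}\n"
        f"Me: {user_input}\n"
        f"{name}:"
    )

past_messages = [
    "don't forget your umbrella today!! xx",
    "so proud of you, always x",
]
prompt = build_persona_prompt("Sam", past_messages, "I miss you")
print(prompt)  # this string would be sent to the model's completion API
```

The more (and more varied) past messages go into the prompt, the more the model’s replies tend to pick up the person’s idioms, emoji habits, and spelling quirks.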

By almost any measure, it was a success: he sought, and found, comfort in the bot. He’d been plagued with guilt and sadness in the years since she died, but as Fagone writes, “he felt like the chatbot had given him permission to move on with his life in small ways.” The man even shared snippets of his chatbot conversations on Reddit, hoping, he said, to bring attention to the tool and “help depressed survivors find some closure.”

At the same time, AI has grown startlingly good at mimicking specific voices, a practice called voice cloning. It is also improving at injecting digital personalities, whether cloned from a human or wholly artificial, with more of what makes a voice sound “human.” Amazon recently shared a clip of a young boy listening to his late grandmother read The Wizard of Oz, her voice artificially re-created from a recording of her speaking that lasted less than a minute.

As Rohit Prasad, Alexa’s senior vice president and head scientist, put it: “While AI can’t eliminate the pain of loss, it can definitely make the memories last.”

My own experience of talking to the “dead” came about through pure serendipity.

At the end of 2019, I saw that James Vlahos, the cofounder of HereAfter AI, would be speaking at an online conference about “virtual beings.” His company is one of a handful of startups working in the field I’ve dubbed “grief tech.” They differ in their approaches but share the same promise: to enable you to talk by video chat, text, phone, or voice assistant with a digital version of someone who is no longer alive.

Intrigued by the premise, I managed to get an introduction and convinced Vlahos and his colleagues to let me try the software out on my parents.

At first, I thought of it as a fun experiment to see what was technically possible. But the pandemic lent the project urgency. Images of people on ventilators and photos of freshly dug graves filled the news. I worried that my parents might die, and that because of the UK’s strict restrictions on hospital visits at the time, I might never get to say goodbye.


The first step was an interview. It turns out that data, lots of it, is essential to creating a convincing digital representation of someone. HereAfter, which starts interviewing subjects while they are still alive, asks them endless questions about everything from their earliest memories to their first date to what they believe will happen after death. (My parents were interviewed by a human, though nearly two years on, interviews are typically handled by a bot.)

As my sister and I rifled through pages of suggested questions for our parents, we edited them to be more personal or pointed and added some of our own: What books did they like? How did our mum muscle her way into the UK’s overwhelmingly male, privileged legal sector in the 1970s? What inspired Dad to invent the silly games he played with us as children?

Whether out of pandemic-induced malaise or a weary willingness to humor their younger daughter, my parents put up no resistance. In December 2020, HereAfter’s interviewer, a friendly woman named Meredith, spoke to each of them for several hours. The company took the recordings and set about assembling them into voice assistants.

A few months later, Vlahos sent me a note. My virtual parents were available.

On one occasion, my husband mistook my testing for an actual phone call, and looked at me as if I were completely insane when he realized it wasn’t.

Virtual Mum and Dad arrived as an email attachment; I could talk to them through the Alexa app on my phone or an Amazon Echo device. I was eager to hear their voices, but I had to wait several more days, because I had promised MIT Technology Review’s podcast crew that I would record my reaction the first time I spoke to my parents’ avatars. My hands were shaking when I finally opened the file, my colleagues watching and listening on Zoom. London was deep in a long, cold, depressing lockdown. I hadn’t seen my real parents in six months.


“Would it be better to speak with Paul or Jane?” a voice asked. After a moment’s deliberation, I chose my mum.

A voice that was hers but strangely stiff and cold spoke.

“Hello, this is Jane Jee, and I’m happy to tell you about my life. How are you today?”

I laughed, nervously.

“I’m well, thanks, Mum. How are you?”

Long pause.

“Good.”

She ignored me and continued speaking.

“Here are some pointers before we start. Unfortunately, my listening skills are not the best, so wait until I’ve finished talking before you ask a question. And please keep your answers short when it’s your turn: a few words or a simple sentence is all you need.” After a little more introduction, she said: “Okay, let’s get started. There’s so much we could discuss: my childhood, my career, my interests. Which of those sounds best?”

Scripted bits like this sounded stilted and strange, but as we moved on, with my mother recounting memories and speaking in her own words, “she” sounded far more relaxed and natural.

But this conversation, like those that followed, had its limits. When I asked my mum about her favorite jewelry, she replied: “Sorry, I didn’t understand that. You can try asking in another way, or move on to another topic.”

There were also mistakes so bizarre they were almost funny. One day Dad’s bot asked me how I was feeling. When I replied, “I feel sad today,” he responded with a cheerful, upbeat “Good!”

The overall experience was undoubtedly strange. And yet, at times, it genuinely felt as if I were talking to my parents. Once, my husband mistook my testing of the bots for a real phone call; when he realized it wasn’t, he rolled his eyes, looked confused, and shook his head as if I’d lost my mind.

Earlier this year, I got a demo of a similar technology from StoryFile, a five-year-old startup that promises to take things to another level: its Life service records your responses on video, not just in audio.

You choose from hundreds of questions about the subject, then record the person answering them. This can be done on any device with a camera and microphone, even a smartphone, though the better the recording quality, the better the result. After you upload the files, the company transforms them into a digital representation of the person that you can see and speak to. Like HereAfter’s bots, it can answer only the questions it has been programmed to answer; in essence, it’s HereAfter with video.

StoryFile’s CEO, Stephen Smith, demonstrated the technology over a video call, where we were joined by his mother. Although she died of cancer earlier this year, there she was on the call, sitting in a comfortable chair in her living room, shown briefly via Smith’s screen. She was soft-spoken, with soft hair and friendly eyes, and full of life advice. She seemed wise.

Smith told me his mother “attended” her own funeral: at the end of the service, she said, “I guess that’s all from me… goodbye!” and everyone broke down in tears. Her digital participation, he said, was well received by family and friends, and he took deep comfort in having captured his mother on camera before she died.

The video technology looked relatively slick and professional, though the result still fell vaguely within the uncanny valley, especially in the facial expressions. As with my parents’ bots, there were times when I had to remind myself that she wasn’t really there.

Both HereAfter and StoryFile are designed to preserve a person’s life story, not to let you have a fresh conversation each time. That points to one of the main limitations of many current grief-tech offerings: they are generic. These replicas might sound like someone you love, but they know nothing about you. Anyone can talk to them, and they will reply the same way; ask the same question twice, and you’ll get the same answer.
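This fixed, prerecorded behavior can be illustrated with a toy retrieval sketch: match the user’s question against the questions captured in the interview and play back the stored clip. The recordings, the matching rule, and the threshold below are all invented for illustration; they are not HereAfter’s or StoryFile’s actual implementation.

```python
# Toy sketch of a prerecorded-answer bot: match the user's question
# against the questions captured in the interview and play back the
# stored clip, or fall back if nothing matches well enough.
# The data, matching rule, and threshold are illustrative only.

RECORDINGS = {
    "what was your childhood like": "childhood_memories.mp3",
    "how did you start your career": "career_story.mp3",
    "what are your hobbies and interests": "interests.mp3",
}

FALLBACK = "Sorry, I didn't understand that."

def pick_answer(user_question: str) -> str:
    """Return the clip whose recorded question shares the most words
    with the user's question, or a fallback below a minimum overlap."""
    words = set(user_question.lower().split())
    best_clip, best_overlap = None, 0
    for question, clip in RECORDINGS.items():
        overlap = len(words & set(question.split()))
        if overlap > best_overlap:
            best_clip, best_overlap = clip, overlap
    # Require at least two shared words; a lone "your" shouldn't count.
    return best_clip if best_overlap >= 2 else FALLBACK

print(pick_answer("Tell me about how your career started"))  # career_story.mp3
print(pick_answer("What's your favorite jewelry?"))           # falls back
```

This structure also explains the characteristic failure mode of such bots: a question that was never recorded, like one about jewelry, can only ever produce the fallback.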

“The biggest problem with [existing] technology is the idea that you can create a single universal person,” says Justin Harrison, founder and CEO of You, Only Virtual. “But the way we experience people is unique to us.”

You, Only Virtual and a handful of other startups want to go further, arguing that recounting memories can’t capture the fundamental essence of a relationship between two people. Harrison wants to build a bot that’s personalized just for you.

The first incarnation of the service, set to launch in early 2023, will let people build a bot by uploading someone’s text messages, emails, and voice conversations, and Harrison hopes they will keep feeding it data over time. The company is also working on a communication platform that customers can use to chat with loved ones while they are still alive, so that all the data is ready to be turned into a bot once they aren’t.

That is exactly what Harrison has done with his mother, Melodi, who has stage 4 cancer. “I built it by hand using five years’ worth of her messages. It took 12 hours to export, and it runs to thousands of pages,” he says of the chatbot. He believes his interactions with it are more meaningful than simply replaying memories would be. The Melodi bot uses his mother’s phrases and replies the way she would, calling him “honey” and using the same emojis and spelling quirks. It won’t be able to ask him questions about his life, but that doesn’t bother him: what he wants to capture is the way she communicates. Merely reminiscing about past events, he says, doesn’t capture the essence of a relationship.

Avatars can have lasting power when people feel a deep personal bond with the person behind them. In 2016, the entrepreneur Eugenia Kuyda built what is thought to be the first bot of this kind, using her text conversations with her friend Roman after he died. She went on to found Replika, a startup that creates virtual companions not based on real people.

She found it a hugely helpful way to process her grief, and she still speaks to Roman’s bot today, she says, especially around his birthday and the anniversary of his passing.

But she warns against using this technology to try to re-create or preserve a person. “I didn’t want to bring back his clone, but his memories,” she says. “The idea was to create a digital memorial where you can interact with and hear from that person.”

Some people do find that hearing a lost loved one’s voice helps them grieve. According to Erin Thompson, a clinical psychologist who specializes in grief, it’s not unusual for people to replay voicemails from someone who has died. A virtual avatar you can hold fuller conversations with, she says, could be a valuable and healthy way to stay connected to someone you love and have lost.

But Thompson and others agree with Kuyda that it’s possible to place too much weight on the technology. A grieving person should remember that these bots can only ever capture a small fraction of who someone was; they are not sentient, and they cannot replace healthy, functioning human relationships.

People may find any reminders of the deceased person triggering: “In the acute phase of grief, you can get a strong sense of unreality, not being able to accept they’re gone.”

“You may be talking to them, but your parents are not really there,” says Erica Stonestreet, an associate professor of philosophy at the College of Saint Benedict & Saint John’s University who studies identity and personhood.

Particularly in the first weeks or months after a death, people may struggle to accept the loss and find any reminder of the person triggering. In the acute phase of grief, Thompson says, there can be a strong sense of unreality and an inability to accept that the person is gone. That kind of intense grief can lead to or worsen mental illness, and it could be fueled by constant reminders of the deceased.


Arguably, this risk is small today, given these technologies’ flaws. Though I sometimes fell for the illusion, it was obvious that my parent bots weren’t the real deal. But as the technology improves, so will the risk that people fall for the phantom of personhood.

There are other risks. Creating a digital copy of someone raises complex ethical questions about privacy and consent. Some might argue that permission matters less once someone is no longer living, but shouldn’t the person being replicated get a say?

And what if that person is not, in fact, dead? People could use grief tech to create virtual copies of living people without their consent: an ex-partner, say. Companies selling services powered by past messages are aware of this possibility, and say they will delete an individual’s data on request. But no one is obliged to check that the technology is used only on consenting or deceased subjects, and it isn’t clearly illegal to build avatars of other people; good luck explaining the problem to your local police department. Imagine how it would feel to discover a virtual version of yourself somewhere, under someone else’s control.

If digital replicas are to become commonplace, we will need new processes for managing the legacies we leave behind online. If the history of technology teaches us anything, it’s better to grapple with the ways these replicas could be misused before they go mainstream.

Will that ever happen? You, Only Virtual uses the tagline “Never Have To Say Goodbye,” but it’s not clear how many people would actually want to live in such a world. For most people, grieving for someone who has died remains one of the few aspects of life still largely untouched by modern technology.

On a more basic level, cost could be a barrier. While some services offer free versions, many can easily run to hundreds or even thousands of dollars.

HereAfter’s top-tier unlimited plan, which lets you record as many conversations with the subject as you want, costs $8.99 a month. That may sound cheaper than the one-off $499 payment StoryFile charges for its premium, unlimited package, but at $108 a year, HereAfter’s fees could quickly add up, if you’ll forgive some ghoulish back-of-the-envelope math on lifetime costs. You, Only Virtual is expected to cost between $9.99 and $19.99 a month when it launches.
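That ghoulish math is easy to reproduce. Using only the prices quoted above (and ignoring free tiers, discounts, or future price changes), a monthly subscription overtakes StoryFile’s one-off fee in under five years:

```python
# Compare a monthly subscription against a one-off payment, using the
# prices quoted in the article: HereAfter at $8.99/month vs. StoryFile's
# one-time $499 premium package. Real pricing may change, of course.

HEREAFTER_MONTHLY = 8.99
STORYFILE_ONE_OFF = 499.00

for years in (1, 5, 10, 20):
    total = HEREAFTER_MONTHLY * 12 * years
    print(f"{years:>2} years of HereAfter: ${total:,.2f} "
          f"(StoryFile one-off: ${STORYFILE_ONE_OFF:,.2f})")

# Months until the subscription costs more than the one-off payment:
breakeven_months = STORYFILE_ONE_OFF / HEREAFTER_MONTHLY
print(f"Break-even after roughly {breakeven_months:.0f} months")
```

On these assumptions, the break-even point falls at around 56 months, after which the one-off payment is the cheaper deal.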

Creating an avatar or chatbot of someone also takes time and effort, not least the motivation and energy to get started. That applies to both the user and the subject, who may be near death and still needs to be involved.

Fundamentally, people don’t like grappling with the fact that they are going to die, says Marius Ursache, who launched a company called Eternime in 2014. It offered a Tamagotchi-like avatar that people could train while alive to preserve a digital copy of themselves. Although it drew huge interest from around the world, few people actually used it, and the company shut down in 2018 after failing to attract enough users.

“It’s something that you can put off till next week, next month, next year,” he says. “People believe that AI is the solution to this problem. But really, it’s human behavior.”

Kuyda agrees: “People are extremely scared of death. They aren’t willing to talk about it or touch the subject. They are terrified when you start poking it with a stick. They prefer to pretend it doesn’t exist.”

Ursache tried a lower-tech approach with his own parents, giving them a notebook and pens as a birthday gift and asking them to write down their memories. His mother wrote two pages; his father said he was too busy. Ursache asked whether he could record some of their conversations instead, but they never got around to it.

“My dad died last year, and I never made those recordings,” he says. “Now I feel like an idiot.”

Personally, I have mixed feelings about this experiment. Imperfect as they are, I’m glad to have these virtual versions of my mum and dad. The bots have let me learn things about my parents I never knew, and it’s comforting to think that a version of them will still be there for me even when they’re not. I’m already wondering who else I might want digitally captured: my husband (who will probably roll his eyes again), my sister, maybe even my friends.

But, like many people, I don’t like to dwell on the mortality of the people I love; it’s not pleasant, and many people react with discomfort when I mention my morbid project. It’s a little sad that it took a stranger Zooming in on my parents from another continent to prompt me to explore their complex, multifaceted personalities. But I’m grateful for that push, and for the reminder to spend more time with them, learning about them face-to-face, without any technology at all, while I still can.
