When AI meets grief: The controversial application that's dividing the internet over digital "resurrections". Is this acceptable?

Technology meets memory: some find healing, others find heartbreak in simulated conversations with the departed. Image Source: Unsplash/ Igor Omilaev

Imagine opening an application and hearing a familiar voice—one you thought you’d never hear again. Not a recording, not a memory, but a real-time conversation powered by artificial intelligence (AI), designed to mimic the speech patterns, personality, and even emotional tone of someone you’ve lost.



That’s the promise—and the controversy—behind a new application that’s sparking heated debate across social media and beyond.



Some call it a breakthrough in healing, a way to find closure in the digital age. Others see it as unsettling, even exploitative, raising ethical questions about consent, memory, and the boundaries of technology.



Whether you find it comforting or chilling, one thing’s clear: this isn’t science fiction anymore. It’s here. And it’s asking all of us—would you try it?





How the technology works

2WAI, the AI startup co-founded by Calum Worthy, officially rolled out its iOS beta version on November 11.



Using advanced AI, the application recreates the voice, personality, and communication style of someone who has passed away.



Users upload digital artifacts—such as voice recordings, text messages, emails, or social media posts—which the system analyzes to simulate how that person might have spoken or written. Once trained, the AI can respond to typed or spoken messages in a way that mirrors the tone and phrasing of the deceased.
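For readers curious about what happens under the hood, the sketch below shows, in very simplified Python, how this kind of persona simulation is commonly put together: gather a small set of the person's messages, then use them to steer the tone of a general-purpose language model. 2WAI has not published its code, so every name here (PersonaProfile, build_persona, persona_prompt) and all of the sample data are invented purely for illustration, and the actual model call is left out.

from dataclasses import dataclass, field

@dataclass
class PersonaProfile:
    # Distilled style cues pulled from a person's digital artifacts.
    name: str
    sample_messages: list[str] = field(default_factory=list)

def build_persona(name: str, artifacts: list[str]) -> PersonaProfile:
    # Keep a capped set of representative snippets to condition a model on.
    samples = [a.strip() for a in artifacts if a.strip()][:50]
    return PersonaProfile(name=name, sample_messages=samples)

def persona_prompt(persona: PersonaProfile, user_message: str) -> str:
    # Assemble a prompt asking a general-purpose language model to mimic the tone.
    examples = "\n".join(f"- {m}" for m in persona.sample_messages)
    return (
        f"You are imitating the writing style of {persona.name}.\n"
        f"Examples of how they wrote:\n{examples}\n\n"
        f"Reply to this message in their tone and phrasing:\n{user_message}\n"
    )

if __name__ == "__main__":
    artifacts = [
        "Morning love, don't forget your umbrella today x",
        "Ha! Told you the roses would come back. See you Sunday.",
    ]
    persona = build_persona("Grandma June", artifacts)
    # A real product would send this prompt to a speech or text model;
    # printing it here just shows the shape of the approach.
    print(persona_prompt(persona, "I miss our Sunday walks."))

Running the script simply prints the assembled prompt, which is enough to see how a handful of old messages can end up steering the tone of every reply.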





Some versions even feature animated video avatars, adding a lifelike visual component to the experience.



With just a few minutes of recorded material, users can generate a personalized HoloAvatar powered by 2WAI’s proprietary FedBrain technology.



This system processes user-approved content and limits responses to that data, ensuring a controlled and customized interaction.
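FedBrain itself is proprietary, so the details are not public, but the basic idea of answering only from material the user has approved can be shown with a toy example. The approved facts and the simple keyword check below are stand-ins chosen for illustration, not how 2WAI actually implements it.

# All names and data here are invented for illustration; the real FedBrain
# implementation has not been published.
APPROVED_FACTS = {
    "garden": "She spent every spring replanting the rose beds out back.",
    "recipe": "Her apple pie used twice the cinnamon the recipe called for.",
}

def constrained_reply(question: str) -> str:
    # Answer only from the approved material; decline anything outside it.
    q = question.lower()
    for topic, fact in APPROVED_FACTS.items():
        if topic in q:
            return fact
    return "I don't have anything shared with me about that."

if __name__ == "__main__":
    print(constrained_reply("Tell me about the garden."))        # inside the approved data
    print(constrained_reply("What did she think of politics?"))  # outside it, so the bot declines

In a real system the matching would be far more sophisticated, but the guardrail is the same: if a question falls outside the approved material, the avatar declines rather than inventing an answer.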



Currently free during its beta phase, the application is expected to shift to a tiered subscription model, with monthly pricing projected between $10 and $20—similar to other AI services.





What grief experts are saying

Mental health experts are voicing strong concerns about how grief-related AI tools might interfere with the natural process of mourning. While some view these technologies as comforting, others caution that they may disrupt emotional healing by offering a simulated connection that delays acceptance.



Dr. Jessica Heesen, lead ethicist of the Edilife project at the University of Tübingen, explained the risk clearly: “Digital avatars could act like a painkiller in preventing the bereaved from accepting and dealing with their loss.”



Her warning reflects a growing unease among psychologists and ethicists who worry that emotional reliance on AI recreations could complicate closure and prolong grief.






Beijing-based psychologist Wang Qiang also emphasized that a key part of healthy grieving involves coming to terms with the loss and reshaping one’s emotional connection to the person who has died.



While AI tools may replicate surface-level interactions, he cautions that they risk prolonging an artificial sense of closeness.



As he explained it: “Although AI can capture casual human interaction, these ‘grief bots’ may continue a kind of fake emotional bond to the deceased, which may limit the emotional and psychological health of their users.”



His warning reflects growing concern among mental health professionals about the unintended consequences of using technology to simulate relationships that have ended.



The ethics of digital afterlife


This isn’t the first time technology has tried to bridge the gap between the living and the dead.



From hologram concerts featuring departed musicians to AI chatbots trained on famous figures, the idea of a “digital afterlife” is becoming more common. But it raises big questions: Who owns our digital legacy? Should there be rules about how our data is used after we’re gone? And is it healthy to keep talking to someone who’s no longer here?



For now, the 2WAI controversy serves as a stark reminder that not every technological capability should become a commercial reality—especially when it targets our most vulnerable moments and profound human experiences.



Key Takeaways
  • A new application has been released that allows users to speak to deceased loved ones using artificial intelligence technology.
  • The application has generated controversy, raising concerns about ethics.
  • Critics worry about the emotional impact the application could have on users who are grieving.
  • Others have questioned how the application sources data and whether it could potentially be misused.

So, what do you think, GrayViners? Would you use an application like this to “talk” to someone you’ve lost? Or does the idea send shivers down your spine? Maybe you see it as a harmless way to remember the good times, or perhaps you think it’s a step too far.



And remember, whether you’re a technology enthusiast or a bit of a skeptic, there’s no right or wrong way to remember those we’ve lost. The most important thing is to find comfort and connection in whatever way feels right for you.
