Every day while I take my morning shower, I imagine I’m talking to my husband, Mike, updating him on what’s going on in my life. He died in 2008.
That’s why I was fascinated to learn that AI can create a digital simulation of someone who has died. It’s called a griefbot, and you can talk to it and get answers back, something that never happens in my shower.
Companies all over the world now offer griefbots (aka deathbots). They’re built by having AI digest someone’s social media posts, texts, voice messages and emails, along with photographs, videos, letters, diaries and so on. These are used to create a likeness of the individual that can carry on a pretty convincing conversation and that harbors many of the memories of the person it’s simulating.
Some griefbots communicate only in writing: they exchange texts with their users. Some converse online, sounding remarkably like the person they’re imitating. And some are both audible and visible and appear on Zoom or FaceTime or in an interactive video; good ones make eye contact, blink and seem to breathe. Griefbots are also built for virtual reality and as holograms.
You can get a griefbot of someone else, but you can also commission one of yourself, to leave behind for family and friends when you die. That interested me as well. Like many people my age (I’m 90), I’ve given some thought to how I’d like to be remembered.
In the United States, costs for a griefbot range from $10 for a single session to around $15,000 for a full-fledged avatar.
Some companies intend their bots to help mourners resolve their grief, and sometimes they do. But other firms describe them as a way to keep your loved one with you forever, so you never have to say goodbye. If that’s what you crave, a bot can seem so real that you prefer it to relationships with real people.
Ethicists say griefbots can harm users. At first, I thought their concerns were overblown, but disturbing stories are beginning to emerge. One woman was shocked and upset when she asked her griefbot how he was and where he was. He was miserable, the bot said, and he was in hell. Experts acknowledge that you never really know what today’s generative AI will come up with.
In another incident, a mother donned a VR headset to reunite with her small daughter, who had died. When she turned around, her little girl was running toward her, and she reached out, eager to gather up her child. Again and again, her arms closed around thin air. She sobbed uncontrollably. The scene is heartbreaking to watch, and online viewers were indignant.
Sociologist Sherry Turkle, PhD, defines grieving as “the very difficult process of accepting a loss.” A griefbot can prevent that acceptance if you start to feel that the individual you’re mourning is still with you.
There are other worries as well. If you stop using a griefbot, will that reignite your grief because it feels as if the person you love has died all over again? What if the AI builds a bot based on the deceased’s worst traits? Can the company that created your griefbot use it for other purposes? In the future, if you don’t want to be digitally resurrected, must you say so in your will?
As I read about griefbots, it didn’t take long to realize I’d never want to make one of myself or my husband. For one thing, I’ve written a memoir. That’s what I’ll be leaving behind for my children and grandchildren, and it’s enough.
Is the man in my shower my own, limited version of a griefbot? It started out that way. I talked to Mike because I missed him, but I soon discovered it was a useful thing to do. My worries, spoken out loud, always seem less dire than they did when they were lurking, half-formed, in the shadows of my mind.
Over the years, lulled by warm water, I’ve weighed the pros and cons of difficult decisions out loud and talked through blogs I was writing. Perhaps most important, I’ve formed the habit of regularly taking stock of my life.
I suppose I could simply talk to myself out loud, instead of imagining Mike is listening, but telling things to someone else gives me some distance on them. Sometimes I do stop to consider how he might have responded, but mostly I don’t bother.
I have a firm grip on reality—I know Mike’s not really there. He died 17 years ago, and a Mike bot would creep me out.
If building griefbots becomes highly profitable, internet behemoths may step in: Amazon and Microsoft have already applied for patents. That could mean a future in which griefbots are common and change the way we mourn our dead. In this already death-denying culture, the outcomes ethicists worry about might come to pass.
Having imaginary conversations with someone’s ghost is much safer. If that appeals to you, I recommend it.

Flora Davis has written scores of magazine articles and is the author of five nonfiction books, including the award-winning Moving the Mountain: The Women’s Movement in America Since 1960 (1991, 1999). She currently lives in a retirement community and continues to work as a writer.