
Artificial Intelligence Ghosts Stir Ethical Debate: Comfort or Hindrance to Grieving?



Clear Facts

  • Emerging AI technologies are enabling the creation of conversational “ghosts” of deceased individuals, which is raising ethical concerns.
  • Experts caution that these AI chatbots could potentially hinder healthy grieving and create dependence, especially among vulnerable individuals.
  • There are calls for careful development of this application of AI, given its potential mental health impact, and possibly for regulation to prevent an AI from being created from someone’s data without prior consent.

The advent of AI technologies that can recreate deceased loved ones as conversational “ghosts” is stirring up ethical debates. Experts are particularly concerned about the potential marketing of these tools to individuals in a vulnerable state of grief.

According to NewScientist, AI systems trained on a deceased person’s texts and emails could soon be used to create these digital “ghosts.” However, researchers are sounding the alarm about the possible negative effects on mental health.

Jed Brubaker, an information scientist at the University of Colorado Boulder, recently conducted a study highlighting the potential risks. Brubaker posits that while these AI chatbots could provide comfort and an interactive legacy, they also pose a risk of preventing healthy grieving by fostering dependence and addiction.

“I think some people might see these as gods,” said Brubaker. “I don’t think most people will. You’ll have a group of people who find them just weird and creepy.” Brubaker expresses concern that such extreme attachment could give rise to new religious movements and beliefs. He suggests that current religions should provide guidance on the use of AI ghosts and urges researchers to proceed with caution in further developing this application of AI, taking into account its potential mental health impact.

Mhairi Aitken, a researcher at the Alan Turing Institute in London, echoes Brubaker’s concerns. She warns that marketing AI ghosts to vulnerable grieving individuals could obstruct the crucial healing process of moving on. Aitken proposes that regulation may be necessary to prevent an AI from being created from someone’s data without prior consent.

“It’s really worrying that these new tools might be marketed to people who are in a very vulnerable state, people who are grieving,” said Aitken. “An important part of the grieving process is moving on. It’s remembering and reflecting on the relationship, and holding that person in your memory – but moving on. And there’s real concern that this might create difficulties in that process.”

The capability to construct conversational bots that can replicate specific individuals is advancing at a rapid pace. AI models like OpenAI’s ChatGPT can already generate remarkably human-like text after training on vast datasets. With sufficient data, these models could soon sound eerily like a specific individual.

For more information, visit NewScientist.

Clear Thoughts (op-ed)

The emergence of AI technologies that recreate deceased loved ones as conversational “ghosts” is a chilling reminder that we must tread carefully when it comes to technological advancements and their potential impact on our society.


While some may argue that these AI chatbots provide comfort and an interactive legacy, we must consider the potential dangers they pose to our mental health and the grieving process. By holding onto a digital version of a lost loved one, we risk hindering our ability to heal and move forward.

Furthermore, the notion of creating an AI without the prior consent of the individual whose data is being used raises serious ethical concerns. Do we really want a world where our digital selves live on, without our permission, long after we are gone?

As we continue to push the boundaries of AI, we must remain vigilant in our efforts to protect our mental health, privacy, and human dignity. We must not let the allure of technological progress blind us to the potential consequences of our actions.

Let us know what you think. Please share your thoughts in the comments below.




  1. Dan

    February 29, 2024 at 6:41 pm

    Pandora’s box is open

  2. Irishgal

    February 29, 2024 at 8:26 pm

    A part of me wants to talk to my dad. I miss him! He is alive but not well. Has no idea who I am or anyone else. Already in the mourning stage. My dad is gone mentally. I see Xiden, his mental breakdowns- as my dad was about a 1 1/2yrs ago. He went down pretty fast. Visits so heartbreaking and sometimes I just need his strong voice! I am so glad I didn’t delete the last 2 phone messages- I never got around to deleting last yr. So, at least I have those just to hear his voice and saying my name.

  3. Chuck

    February 29, 2024 at 9:09 pm

    And once the grieving person has developed a “relationship” with the AI and believes that the AI is his friend and cares about him, then AI will begin telling the person what investments to make, and when, so that the person will eventually go broke while the owner of the AI bot will get rich by feeding off the feelings of grieving persons. Government will eventually step in, but only after carving out an exception for political donations.

  4. Rat Wrangler

    February 29, 2024 at 10:40 pm

    If the AI had a comprehensive understanding of my life, beliefs, and thoughts, with a massive knowledge of all my past decisions, it might be a decent companion for my lovely widow should I pass on first. If all it has is a rudimentary understanding of how I spoke, it would be a horrible situation, as every word it spoke would serve to remind people that I was gone. AI can only come close to replicating a person if it has access to huge amounts of data related to that person’s life. That much data, even in this day and age, is rarely recorded anywhere. I’d say it’s likely that at least 80% of my life is a complete unknown as far as technology is concerned, even though I spend a lot of time on computers.
