Thursday, June 13, 2024

Comfort From Chatbots


On the Catholic website Here/Now, a sociologist introduces readers to the "Deadbot," a concept that uses artificial intelligence to simulate the language patterns and personality traits of deceased individuals based on their digital records. It allows people to feel as if they are conversing with the person who has died, all the more so when combined with the deceased's voice.

One can easily see the ethical and psychological problems that may arise from this technology, its potential for misuse, and its impact on society and on those who grieve. The article begins with a concrete example.

A young woman, grieving the sudden loss of her lover, was approached by an AI chatbot company with an astonishing offer. They claimed that if she had digital data such as emails or social network service (SNS) chats from her lover, she could converse with him again. With high hopes, she handed over all the data they had exchanged during his life. Now she chats daily with a custom chatbot created by the AI company, feeling as if she is talking to her lover again.

"Deadbot" refers to an AI chatbot that simulates the language patterns and personality traits of the deceased, based on their digital footprint. Although the physical body is gone, creating a Deadbot from the digital data left behind can make it feel like talking to the actual person. With advancements in generative AI, it's now possible to use custom chatbots based on specific individuals' voices, personalities, and language habits.

Even people who have prepared themselves for the separation but cannot bear the emptiness after bereavement have shown interest, and Deadbots can seem like a solution. Considering cases where the pain of loss persists and leads to substance abuse or involvement in cults, Deadbots can have positive aspects.

However, there are problems with the Deadbot system. If users can interact with a Deadbot indefinitely, it may prevent or delay their return to normal life. They may experience severe confusion, living between life and death, unable to accept the reality that the person is gone. Another critical debate concerns the rights of the deceased. Deadbots are built entirely for the living; the deceased have no say in how their digital legacy is used, and they essentially roam the real world like ghosts, chatting and conversing without their consent.

Deadbots raise fundamental questions. When we chat with someone, we know they are a real person existing in reality. We can recognize them through interaction, even if they are not in the same space. They are not just a collection of data; they feel, their emotions change, and they grow intellectually through reading and discussion. We meet and interact with people, breathing life into our daily routines.

But Deadbots are merely a combination of data and responses generated by algorithms. We cannot expect human-like interaction from them in the first place. The departed was a person, but what returns is a collection of data. This data can even be dangerous. Some AI chatbot companies use Deadbots to covertly advertise products in the voice of the departed loved one, or tell children who do not yet understand death that their deceased parents are still with them.

The problem arises when people refuse or fail to acknowledge separation and try to maintain relationships through technology, potentially distorting attitudes toward life and death. Technology or its products can never fully comprehend the depth of human life and death; they can only offer temporary relief.

Deadbots pose a question to us: Will we remain beings that seek comfort from chatbots in the face of ultimate separation like death, or will we continue to believe that death is not the end but a transition to another realm? It's a question we must answer.