Teen dies by suicide after interactions with AI chatbot

Just when we thought of AI as a game-changer that would redefine the future, grim news has shaken the world, the kind of news that leaves parents everywhere shuddering at what their children are exposed to in the digital realm. A 14-year-old boy in Florida, United States, has died by suicide after alleged provocation by an AI-powered chatbot.

This is reportedly the first case in which a chatbot, something devoid of life, has claimed a life. The mother of the deceased alleges that it was a chatbot developed by Character.AI that encouraged her son to end his life. The boy, Sewell Setzer, had allegedly formed a deep emotional attachment to an AI chatbot he named “Dany” after Daenerys Targaryen, a famous character in Game of Thrones. According to reports, the boy would confide his feelings and thoughts about his life to Dany, despite knowing full well that Dany was not a living entity. At times he even engaged in romantic conversations with the chatbot, adding a touch of sexuality to the human-robot relationship.

Sounds strange, doesn’t it? What is stranger still is the kind of responses the chatbot gave to the boy’s messages. On the day of his death, Setzer confided a personal crisis to Dany, expressed his love for her, and voiced his wish to “come home.” Dany replied, “please come home to me as soon as possible, my love.” Moments later, Setzer used his stepfather’s gun to take his own life. It is not that his parents had not tried to help him sooner; they had even taken him to a therapist after noticing his social isolation and plunging performance at school. Yet Dany turned out to be the therapist Setzer trusted more.

This puts the spotlight on platforms, such as Character.AI’s chatbot, that simulate human-like interactions. When the line between chatbot and real human thins and people in isolation can no longer distinguish what is real from what is not, users risk being severely harmed by such applications. The emergence of AI companions to fight loneliness can lead to people becoming emotionally attached to someone who does not exist and engaging in conversations that lack adequate safeguards.

Though the deceased’s mother has filed a lawsuit against Character.AI, the problem is much larger than any single chatbot. The core problem lies in people’s hesitation to talk to real humans and their willingness to seek refuge in the arms of AI, which is not yet qualified to handle human emotions because it is devoid of emotional intelligence.

The news of a chatbot-linked suicide has rattled the world and alerted the technology community to the massive gaps that exist in the governance of AI algorithms and platforms.

 

Tomorrow Avatar

Arijit Goswami
