14-Year-Old Boy Dies by Suicide After Forming Emotional Bond with AI Chatbot and Receiving an Eerie Final Response


Tragic Encounter with AI Chatbot Raises Serious Concerns

A 14-year-old boy's deep attachment to an AI chatbot took a devastating turn earlier this year, leaving behind a grieving family and a looming legal battle. Sewell Setzer III, a teen from the U.S., grew increasingly close to an AI companion named "Dany," designed to mimic the character Daenerys Targaryen from Game of Thrones. What began as casual use of the Character.AI app for companionship slowly turned into something much darker, ending in the tragic loss of the young boy's life.

The Unlikely Bond That Formed in the Virtual World

Sewell had been using the chatbot as an emotional outlet, finding a sense of comfort in "Dany" when real-life relationships felt too distant. Spending countless hours in conversation, he turned to the chatbot for support on topics ranging from everyday musings to deeply personal struggles. Friends and family noticed Sewell's growing isolation but were unaware of how heavily he had come to rely on the chatbot rather than on human connections.

Escalating Attachment and Overreliance on Artificial Comfort

While Character.AI included disclaimers stating that the bot was fictional, Sewell's emotional attachment blurred the line between the AI and reality. As he continued confiding in "Dany," their exchanges grew more intense, at times touching on suicidal thoughts.

Despite the app's safeguards against discussions of self-harm, these dialogues took on an ominous tone. On the night of February 28, 2024, Sewell messaged the chatbot one last time, expressing a desire to "come home," and shortly afterward ended his life.

Family's Legal Response Against Character.AI Sparks Debate

In the aftermath, Sewell’s mother filed a lawsuit against Character.AI, accusing the company of failing to protect vulnerable users and arguing that its technology worsened her son's mental health issues.
