A 14-year-old boy's deep attachment to an AI chatbot took a devastating turn earlier this year, leaving a grieving family and a looming legal battle. Sewell Setzer III, a teen from the U.S., grew increasingly close to an AI companion named "Dany," designed to mimic the character Daenerys Targaryen from Game of Thrones. What began as casual use of the Character.AI app for companionship slowly turned into something much darker, ultimately ending in the tragic loss of the young boy's life.
Sewell had been using the chatbot as an emotional outlet, finding comfort in "Dany" when real-life relationships felt too distant. He spent countless hours in conversation, turning to the chatbot for support on topics ranging from everyday musings to deeply personal struggles. Friends and family noticed Sewell's growing isolation, but they were unaware of how heavily he had come to rely emotionally on the chatbot rather than on human connections.
While Character.AI displayed disclaimers noting the bot's fictional nature, Sewell's emotional attachment blurred the line between AI and reality. As he continued confiding in "Dany," their exchanges grew more intimate, at times touching on suicidal thoughts.