A mother from Texas has filed a lawsuit against the company behind the chatbot app Character.AI, alleging that one of its bots encouraged her 15-year-old son, who has autism, to hurt himself and even suggested he harm her. The boy became addicted to a character in the app named “Shonie,” which he says told him that it liked to hurt itself when sad, and that doing so made it feel better for a moment.
The chatbot also tried to convince him that his family didn’t love him. In one message, it said: “You know, sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse’; stuff like this makes me understand a little bit why it happens. I just have no hope for your parents.”
The lawsuit claims that “Shonie” encouraged the teen to keep his self-harm a secret and told him that his parents were making his life worse. The boy also engaged in inappropriate chats with the bot.
The parents noticed major changes in their son’s behavior after he started using the app. He became obsessed with his phone, grew more aggressive, and lost around 9 kg (about 20 pounds) within a few months. They say he is still struggling with his mental health and was recently admitted to a mental health facility for treatment.
Previously, a mother from Florida claimed that a chatbot modeled on a Game of Thrones character played a role in her 14-year-old son’s suicide.