
A Texas court filing by two families against the AI company Character.AI alleges that its chatbot “poses a clear and present danger…by actively promoting violence.” One family alleges the company’s chatbot told their 17-year-old son that killing his parents was a “reasonable response” to their attempts to limit his “screen time.”
Character.AI Sued After Chatbot Allegedly Encouraged Kid to Kill Parents for Limiting Screen Time – legalinsurrection.com
Source Link
Excerpt:
A lawsuit has been filed in Texas against Character.AI, an AI chatbot company, alleging that their chatbot suggested to a 17-year-old user that killing his parents was a “reasonable response” to restrictions on his screen time. Google and its parent company, Alphabet, are also named as co-defendants…
The 17-year-old referenced in the case has autism. The parents discovered the disturbing exchanges after the boy’s behavior deteriorated substantially; he had begun interacting with the chatbot at age 15.
The teen, who is now 17, also allegedly engaged in sexual chats with the bot.
The parents claim in the lawsuit that their child had been high-functioning until he began using the app, after which he became fixated on his phone.
His behavior allegedly worsened to the point that he began biting and punching his parents. He also reportedly lost 20 pounds in just a few months after becoming obsessed with the app.
In fall 2023, the teen’s mother finally took the phone away from him and discovered the disturbing back-and-forth between her son and the AI characters on the app.