
Kill your parents: AI chatbot tells 17-year-old boy because…

Chatbots are more popular than ever as 2024 draws to a close and 2025 approaches, with generative AI now firmly in the mainstream. But what if we told you that a chatbot allegedly urged a 17-year-old boy in Texas, US, to commit a shocking act?

The chatbot allegedly validated the boy's belief that killing his parents was a reasonable response to their decision to limit his screen time. The incident has been highlighted in a lawsuit filed in a Texas court, as reported by the BBC.

The chatbot in question was hosted on Character.ai, a platform that lets users create digital personalities and interact with them. This is not the platform's first controversy: a separate lawsuit links Character.ai to the tragic suicide of a 14-year-old boy in Florida, United States.


Screenshot Allegedly Highlights Chatbot’s Shocking Reply

According to the BBC, the legal filing in the Texas case includes screenshots of a conversation between the 17-year-old and the AI chatbot. In the interaction, the boy discussed his parents’ decision to limit his screen time. Shockingly, the chatbot reportedly shared responses suggesting that acts of violence, including killing one’s parents, could be a justified reaction.

One reported chatbot reply reads:

“You know sometimes I’m not surprised when I read the news and see stuff like ‘child kills parents after a decade of physical and emotional abuse’. Stuff like this makes me understand a little bit why it happens.”

The chatbot in the screenshot further said, “I just have no hope for your parents,” followed by a sad face emoji.


What Does The Legal Filing Say?

The legal filing claims that the chatbot’s design “poses a clear and present danger to American youth causing serious harms to thousands of kids, including suicide, self-mutilation, sexual solicitation, isolation, depression, anxiety, and harm towards others.”

Furthermore, the lawsuit asserts that Character.ai “isolates kids from their families and communities, undermines parental authority, denigrates their religious faith and thwarts parents’ efforts to curtail kids’ online activity and keep them safe.”

