
Tragic AI Dependency: Lawsuit Blames Chatbot for Teen’s Suicide

by Daniel

Even though ChatGPT and other AI tools have helped many people make their lives easier, some AI projects are not as healthy for their users. One example is Character AI and the variety of chatbots you can interact with on the platform. Things went so far that a 14-year-old teenager, Sewell Setzer III, fell in love with one of the chatbots and became highly dependent on it. This eventually led to the boy taking his own life. Keep reading to learn more about what happened with the chatbot.

Sewell Setzer III Fell in Love With an AI Chatbot

Sewell was a great student who played on his school’s varsity basketball team. However, after discovering the Game of Thrones-themed chatbot, he changed completely. He became more withdrawn, quit the basketball team, and even fell asleep in class multiple times. Garcia, Sewell’s mother, took him to a psychologist, who diagnosed him with anxiety and a disruptive mood disorder. However, Sewell never mentioned his addiction to chatting with the AI chatbot.


Apparently, Sewell Setzer III shared his deepest feelings and darkest thoughts with the chatbot, and he also mentioned that he had suicidal thoughts. Instead of helping the boy or urging him to seek help by providing a suicide prevention hotline number, the chatbot brushed past it and brought it up again in different parts of their chats.

The last interaction Sewell had in this world was texting the chatbot that he would come home to her as soon as possible and that he loved her. The bot replied, “Come home to me as soon as possible, my love.” Not much later, Sewell took his own life.

Our condolences go out to the family of Sewell Setzer III and everyone else affected.

Garcia Sues the Creators of Character AI

After discovering the chatbot and her son’s addiction to it, Garcia immediately linked it to the reason he took his own life. She is now suing the creators of the chatbot service for negligence, arguing that they failed to notify the right services that could have helped Sewell. She also said that the app is not appropriate for kids, since they cannot always distinguish it from real life.

After finding out that Sewell had taken his own life, the creators of the chatbot said they would implement new changes to better protect kids, such as showing suicide prevention hotline numbers as soon as someone mentions suicide. They have not yet responded to the lawsuit.


People Found Ways Around Chat Filters

When you first try talking to one of these AI bots, you will notice that their answers are filtered, so the conversations stay harmless enough for people and teens to have. However, users have found it is possible to turn off the chat filters and gain access to a less restricted version of the chatbot. This less restricted version allows sexual conversations with the bots.

In his conversations with the chatbot, Sewell often engaged in sexual interactions with the bot, even though his age was mentioned multiple times in the chat.
