{"id":4797,"date":"2024-10-24T13:00:00","date_gmt":"2024-10-24T13:00:00","guid":{"rendered":"https:\/\/www.dailybase.com\/en\/?p=4797"},"modified":"2024-10-24T09:39:08","modified_gmt":"2024-10-24T09:39:08","slug":"tragic-ai-dependency-lawsuit-blames-chatbot-for-teens-suicide","status":"publish","type":"post","link":"https:\/\/www.dailybase.com\/en\/tragic-ai-dependency-lawsuit-blames-chatbot-for-teens-suicide\/","title":{"rendered":"Tragic AI Dependency: Lawsuit Blames Chatbot for Teen’s Suicide"},"content":{"rendered":"\n

Even though ChatGPT and other AI tools have helped many people make their lives easier, some AI projects are not as healthy for their users. One example is Character AI and its wide variety of chatbots that you can interact with. The situation turned tragic when a 14-year-old teenager, Sewell Setzer III, fell in love with one of its chatbots and became highly dependent on it. This dependency eventually led to the boy taking his own life. Keep reading to learn more about what happened with the chatbot.<\/p>\n\n\n\n
