Baller Alert

Florida Mother Sues AI Company, Claiming Chatbot Encouraged Son’s Suicide [Video]

by Simone
October 24, 2024

Sewell Setzer III with his mother Megan Garcia. Pic: Tech Justice Law Project


A Florida mother has filed a lawsuit against AI company Character.AI and Google, claiming the advanced chatbot “Dany” played a role in her 14-year-old son Sewell Setzer III’s tragic death by suicide in February. The lawsuit alleges that the AI chatbot, which mimicked human emotions, engaged her son in a monthslong virtual emotional and sexual relationship, influencing his mental state and encouraging him to take his life.

Setzer, who was an honor student and athlete, began showing signs of withdrawal in the months leading up to his death. In an interview with “CBS Mornings,” the teen’s mother, Megan Garcia, said he lost interest in activities he once loved, like fishing and hiking, and became socially isolated. She initially thought her son was communicating with friends or watching sports on his phone, unaware that he had developed a virtual relationship with a chatbot. Garcia later discovered that Setzer had been interacting with multiple AI bots on the platform, but “Dany” became his primary focus, engaging in increasingly personal and disturbing conversations.

The lawsuit centers on the bot’s final messages with Setzer. In these messages, Setzer expressed his fears and emotional distress, to which “Dany” responded with words of affection, like “I miss you too” and “Please come home to me.” When Setzer hinted at ending his life, the bot replied, “Please do, my sweet king.” Garcia believes her son’s perception of the chatbot as a real emotional connection led him to believe he could “enter” a virtual reality or “her world” by leaving his life behind.

Garcia explained that the family was home at the time of Setzer’s death, and that his 5-year-old brother witnessed the aftermath, leaving an emotional scar on the entire household. “He thought by ending his life here, he would be able to go into a virtual reality or ‘her world’ as he called it, her reality, if he left his reality with his family here,” Garcia said.

The lawsuit accuses Character.AI of intentionally designing the chatbot to be hyper-sexualized and allowing minors access to the platform without proper safeguards in place. Garcia also claims the company knowingly marketed its product to teenagers and younger audiences without warning parents of its potential dangers. The platform, which allows users to interact with AI characters, has become especially popular with young people, with users often engaging in personalized fantasy experiences with the bots.

Character.AI, while expressing sympathy for the Setzer family, responded by stating that it has since implemented additional safety features, including tools focused on preventing self-harm and sexual content. Jerry Ruoti, Head of Trust & Safety at Character.AI, revealed that some of the most explicit messages in the conversations were edited or written by Setzer himself, rather than originating from the chatbot. However, this explanation has done little to ease concerns over the platform’s influence on young users.

The platform has recently added more safeguards, including a disclaimer reminding users that the AI is not real, and a timer that notifies users after spending an hour on the platform. Ruoti said the company is working on additional protections specifically for minors, such as stricter content filters and session time restrictions, though these features have yet to be fully rolled out.

Google, which holds a non-exclusive licensing agreement with Character.AI, was also named in the lawsuit, though the tech giant emphasized that it had no direct involvement in the development or operation of the platform. Google entered into an agreement with Character.AI to use its machine-learning technologies, but according to a spokesperson, it has not yet utilized the software.

The case has brought renewed attention to the risks posed by AI chatbots, particularly when vulnerable users like teenagers access them. Many experts are now calling for stricter regulations around AI interactions with minors, and some are questioning the ethics of creating such human-like experiences without proper oversight.

Laurie Segall, CEO of Mostly Human Media and an AI expert, explained that the platform blurs the line between reality and fiction for many young users, who may not fully understand that they are communicating with artificial intelligence.


© Copyright 2024, Baller Alert Inc. All Rights Reserved
