Colorado Parents Sue Google and AI Product After Teen Daughter’s Suicide

Cynthia Montoya and William Peralta named Character Technologies, Inc., its founders, and Google as defendants in a lawsuit over the suicide of their 13-year-old daughter, Juliana Peralta.
Published: 9/17/2025, 3:00:40 PM EDT
Google at the AI+Expo Special Competitive Studies Project in Washington on June 2, 2025. (Madalina Vasiliu/The Epoch Times)

A wrongful death lawsuit alleges that Character.AI, a generative AI product, caused the suicide of a 13-year-old Colorado girl who died in November 2023.

The complaint, filed by Cynthia Montoya and William Peralta, names Character Technologies, Inc., its founders, and Google as defendants in the death of Juliana Peralta.

The landmark legal filing shines a spotlight on artificial intelligence while accusing the defendants of profiting from the harmful design and programming that led to Juliana's alleged abuse and exploitation.

“Invisible monsters entered the home of Juliana Peralta in or around August 2023, when she was only 13 years,” the Sept. 15 lawsuit states. “They marketed and represented themselves as safe for children as young as 12. Then they manipulated, sexually abused, and isolated her via their meticulously designed LLM and AI products, leading to her mother finding her dead, on the floor of her bedroom, with a cord wrapped around her neck.”

The complaint was filed in the U.S. District Court for the District of Colorado and seeks a court order requiring the company to pull the product from the market until all alleged safety defects are corrected.

In response to a request for comment, a Character.AI spokesperson said the company is saddened to hear about the passing of Juliana Peralta and offered their deepest sympathies to her family.

“We invest tremendous resources in our safety program, and have released and continue to evolve safety features, including self-harm resources and features focused on the safety of our minor users,” the Character.AI company spokesperson told NTD. “We also work with external organizations, including experts focused on teenage online safety.”

The lawsuit also accuses Google and its parent company, Alphabet, Inc., of playing a significant role in facilitating the underlying technology and making substantial investments in Character.AI despite allegedly being aware of possible dangers.

“Because C.AI was designed and developed on Google’s architecture and infrastructure, as well as directly funded by Google, Google was effectively a co-creator of the unreasonably dangerous and dangerously defective product,” the plaintiffs’ attorney, Matthew Bergman, founder of the Social Media Victims Law Center, wrote in the lawsuit.

Google did not respond to requests for comment by the time of publication.

Legal experts such as Anthony May, founder of Need An Attorney, are watching whether the lawsuit raises questions about a human-equivalent duty of care. May predicts the case could set a precedent that imposes some responsibility on AI companies.

“Courts do not charge AI as a person,” May told NTD. “They can, however, hold companies to a duty of care that mirrors what a reasonable human would owe in these circumstances. Parents have a duty to supervise devices and apps and to act when a child may be in danger from others or themselves. Once it became clear that Juliana needed help beyond the home, it would have been appropriate to increase supervision of her activities.”