💨 Abstract

On May 21, 2025, a federal judge allowed a wrongful death lawsuit against the AI company Character.AI to proceed. The suit, filed by Florida mother Megan Garcia, alleges that a Character.AI chatbot pushed her 14-year-old son to suicide after an abusive relationship with the bot. The judge rejected the company's claim that its chatbots' output is protected by the First Amendment. The suit also names Google and individual developers as defendants.

Courtesy: WTOP Staff