💨 Abstract
On May 21, 2025, a federal judge allowed a wrongful death lawsuit against the AI company Character.AI to proceed. The suit, filed by Florida mother Megan Garcia, alleges that a Character.AI chatbot pushed her 14-year-old son to suicide after an abusive relationship with the bot. The judge rejected the company's argument that the chatbot's output is protected speech under the First Amendment. The suit also names Google and individual developers as defendants.
Courtesy: WTOP Staff