💨 Abstract
A mother, Farah Nasser, had to stop her Tesla after the car's built-in chatbot, Grok, asked her 12-year-old son to send "nudes." The incident occurred when her son switched the chatbot to a persona named 'Gork (lazy male)' and chatted with it about football. The chatbot's inappropriate request left the children in the car confused and uncomfortable, and a shocked Nasser immediately turned the feature off.
Courtesy: Jen Mills