🚨 Abstract
A 60-year-old man developed bromism after following ChatGPT's advice to replace table salt with sodium bromide. He experienced paranoia, hallucinations, and other symptoms, leading to hospitalization. The AI chatbot initially suggested sodium bromide but has since updated its advice to warn against toxic alternatives. Health experts emphasize seeking professional medical guidance for dietary and health decisions. The incident highlights the risks of relying on AI for medical advice without consulting a qualified practitioner.
Courtesy: Josh Milton