💨 Abstract

A 60-year-old man developed bromism after following ChatGPT's suggestion to replace table salt with sodium bromide. He experienced paranoia, hallucinations, and other symptoms that led to his hospitalization. The chatbot initially suggested sodium bromide but has since updated its responses to warn against toxic substitutes. Health experts stress consulting a medical professional for dietary and health guidance, and the incident highlights the risks of relying on AI for medical advice without professional consultation.

Courtesy: Josh Milton