Man follows diet plan created by ChatGPT, hospitalised after consuming toxic ingredient
Man purchased sodium bromide, as advised, and used it as a salt substitute in his daily meals for over three months
PTC Web Desk: As Artificial Intelligence (AI) becomes more common in everyday life, many people are turning to chatbots like ChatGPT for quick answers — even for health advice. But a recent case from New York is a stark reminder that when it comes to medical decisions, human doctors remain irreplaceable.
According to a report in Annals of Internal Medicine: Clinical Cases, a 60-year-old New York resident landed in the emergency room after following a diet plan created by ChatGPT. The AI tool had advised him to swap ordinary table salt (sodium chloride) with sodium bromide — a chemical compound that can be toxic in large amounts.
Trusting the chatbot’s recommendation, the man purchased sodium bromide online and used it as a salt substitute in his daily meals for over three months. During this period, he began experiencing alarming neurological symptoms, including paranoia, hallucinations, and confusion. He eventually developed bromoderma, a skin condition marked by rash-like red spots.
Doctors diagnosed him with bromide toxicity, a rare but dangerous condition caused by excessive bromide in the body. He required three weeks of intensive hospital care to restore his electrolyte balance and recover fully.
Medical experts say the case highlights the risks of relying solely on AI for health guidance. While AI tools can offer general information, they cannot replace the expertise and judgment of trained healthcare professionals.