Bromism

ChatGPT’s Advice Lands Man in Hospital: A Cautionary Tale of AI and User Error

A recent case report, published in a journal of the American College of Physicians, details the hospitalization of a 60-year-old man who developed bromism after consulting ChatGPT. Seeking to eliminate sodium chloride from his diet, the man followed the chatbot's advice and replaced table salt with sodium bromide, which led to paranoia, hallucinations, and dermatologic symptoms. He was discharged after three weeks in the hospital. The case highlights the dangers of relying on AI for medical advice, as ChatGPT and similar systems can generate inaccurate information.