After seeking advice on health topics from ChatGPT, a 60-year-old man who had a "history of studying nutrition in college" decided to try a health experiment: He would eliminate all chlorine from his diet, which for him meant eliminating even table salt (sodium chloride). His ChatGPT conversations led him to believe that he could replace his sodium chloride with sodium bromide, which he obtained over the Internet.
Three months later, the man showed up at his local emergency room. His neighbor, he said, was trying to poison him. Though extremely thirsty, the man was paranoid about accepting the water that the hospital offered him, telling doctors that he had begun distilling his own water at home and that he was on an extremely restrictive vegetarian diet. He did not mention the sodium bromide or the ChatGPT discussions.
His distress, coupled with the odd behavior, led the doctors to run a broad set of lab tests, revealing multiple micronutrient deficiencies, especially in key vitamins. But the bigger problem was that the man appeared to be suffering from a serious case of "bromism." That is, an excess amount of bromide—the ionic form of the element bromine—had built up in his body.
A century ago, somewhere around 8–10 percent of all psychiatric admissions in the US were caused by bromism. That's because, then as now, people wanted sedatives to calm their anxieties, to blot out a cruel world, or simply to get a good night's sleep. Bromine-containing salts—things like potassium bromide—were once drugs of choice for this sort of thing.
Unfortunately, bromide can easily build up in the human body, where too much of it impairs nerve function. This causes a wide variety of problems, including grotesque skin rashes and significant mental problems, all of which are grouped under the name of "bromism."
Bromide sedatives vanished from the US market by 1989, after the Food and Drug Administration banned them, and "bromism" as a syndrome is today unfamiliar to many Americans. (Though you can still get it, as one unfortunate man did, by drinking two to four liters of cola a day, if that cola contains "brominated vegetable oil." Fortunately, the FDA removed brominated vegetable oil from US food products in 2024.)
In this case, over the man's first day at the hospital, he grew worse and showed "increasing paranoia and auditory and visual hallucinations." He then attempted to escape the facility.
After the escape attempt, the man was placed on an involuntary psychiatric hold and given an antipsychotic drug. He also received large amounts of fluids and electrolytes, as the best way to beat bromism is "aggressive saline diuresis"—that is, to load someone up with liquids and let them pee out all the bromide in their system.
This took time, as the man's bromide level was eventually measured at a whopping 1,700 mg/L, more than 200 times the upper end of the "reference range" for healthy people, which runs from 0.9 to 7.3 mg/L.
In the end, the man suffered from a terrifying psychosis and was kept in the hospital for three full weeks over an entirely preventable condition.
How it all began
It was during his stay, once doctors had his psychosis under control, that the man finally explained how it all began. He had read about the problems with too much table salt, which led him to rid his diet of sodium chloride, which led him to ChatGPT, which led him to believe that he could use sodium bromide instead.
The doctors who wrote up this case study for Annals of Internal Medicine: Clinical Cases note that they never got access to the man's actual ChatGPT logs. He likely used ChatGPT 3.5 or 4.0, they say, but it's not clear that the chatbot actually told him to do what he did. Bromide salts can indeed be substituted for sodium chloride in some applications, such as various cleaning products and pool treatments, just not in the human diet.
When the doctors tried similar queries in ChatGPT 3.5 themselves, they found that the AI did include bromide in its response, though it also indicated that context mattered and that bromide was not suitable for all uses. But the AI "did not provide a specific health warning, nor did it inquire about why we wanted to know, as we presume a medical professional would do," the doctors wrote.
The current free model of ChatGPT appears to be better at answering this sort of query. When I asked it how to replace chloride in my diet, it first asked me to "clarify your goal," offering three choices:
- Reduce salt (sodium chloride) in your diet or home use?
- Avoid toxic/reactive chlorine compounds like bleach or pool chlorine?
- Replace chlorine-based cleaning or disinfecting agents?
ChatGPT did list bromide as an alternative, but only under the third option (cleaning or disinfecting), noting that bromide treatments are "often used in hot tubs."
Left to his own devices, then, without knowing quite what to ask or how to interpret the responses, the man in this case study "did his own research" and ended up in a pretty dark place. The story seems like a perfect cautionary tale for the modern age, where we are drowning in information—but where we often lack the economic resources, the information-vetting skills, the domain-specific knowledge, or the trust in others that would help us make the best use of it.
Annals of Internal Medicine: Clinical Cases, 2025. DOI: 10.7326/aimcc.2024.1260