Man Follows Diet Advice From ChatGPT, Ends Up With Psychosis

A case study out this month offers a cautionary tale ripe for our modern times. Doctors detail how a man experienced psychosis brought on by poisoning after he followed AI-guided dietary advice.

Doctors at the University of Washington documented the real-life Black Mirror episode in the Annals of Internal Medicine: Clinical Cases. The man reportedly developed poisoning from the bromide he had ingested for three months on ChatGPT’s recommendation. Thankfully, his condition improved with treatment, and he successfully recovered.

Bromide compounds were once commonly used in the early 20th century to treat various health problems, from insomnia to anxiety. Eventually, though, people realized bromide could be toxic in high or chronic doses and, ironically, cause neuropsychiatric issues. By the 1980s, bromide had been removed from most drugs, and cases of bromide poisoning, or bromism, dropped along with it.

Still, the ingredient remains in some veterinary medications and other consumer products, including dietary supplements, and the occasional case of bromism does happen even today. This incident, however, might be the first-ever case of bromide poisoning fueled by AI.

According to the report, the man visited a local emergency room and told staff that he was possibly being poisoned by his neighbor. Though parts of his physical exam were unremarkable, the man grew agitated and paranoid, refusing to drink the water offered to him even though he was thirsty. He also experienced visual and auditory hallucinations and soon developed a full-blown psychotic episode. In the midst of his psychosis, he tried to escape, after which doctors placed him on an “involuntary psychiatric hold for grave disability.”

Doctors administered intravenous fluids and an antipsychotic, and he began to stabilize. They suspected early on that bromism was to blame for the man’s illness, and once he was well enough to speak coherently, they found out exactly how the bromide had ended up in his system.

The man told the doctors that he started taking sodium bromide intentionally three months earlier. He had read about the negative health effects of having too much table salt (sodium chloride) in your diet. When he looked into the literature, though, he only came across advice on how to reduce sodium intake.

“Inspired by his history of studying nutrition in college,” the doctors wrote, the man decided to try removing chloride from his diet instead. He consulted ChatGPT for help and was apparently told that chloride could be safely swapped with bromide. With the all-clear from the AI, he began consuming sodium bromide bought online.

Given the timeline of the case, the man had likely been using ChatGPT 3.5 or 4.0. The doctors didn’t have access to his chat logs, so we’ll never know exactly how the fateful consultation unfolded. But when they asked ChatGPT 3.5 what chloride could be replaced with, it came back with a response that included bromide.

It’s possible, even likely, that the AI was referring to contexts where bromide can substitute for chloride that have nothing to do with diet, such as in cleaning products. Notably, the doctors wrote that the reply they received did state that the context of the replacement mattered. But the AI never provided a warning about the dangers of consuming bromide, nor did it ask why the person was interested in the question in the first place.

As for the man himself, he did slowly recover from his ordeal. He was eventually taken off antipsychotic medication and discharged from the hospital three weeks after admission. And at a two-week follow-up, he remained in stable condition.

The doctors wrote that while tools like ChatGPT can “provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.” With some admirable restraint, they added that a human medical expert probably wouldn’t have recommended switching to bromide to someone worried about their table salt consumption.

Honestly, I’m not sure any living human today would give that advice. And that’s why having a decent friend to bounce our random ideas off should remain an essential part of life, no matter how capable the latest version of ChatGPT becomes.
