
AI Diet Advice Gone Wrong: Man Ends Up With Psychosis
A recent case study is a real eye-opener, showing how blindly following AI advice can seriously backfire. It tells the story of a man who ended up with bromide poisoning and psychosis after trusting ChatGPT for dietary guidance. Seriously, it sounds like something straight out of a dystopian movie!
Doctors at the University of Washington reported this bizarre incident. Apparently, the guy had been taking bromide for three months, all on ChatGPT's say-so. Thankfully, with the right treatment, he pulled through and made a full recovery. But it makes you think twice, right?
Back in the day, bromide compounds were a go-to treatment for everything from insomnia to anxiety. But surprise, surprise: it turns out bromide is toxic if you take too much of it. By the '80s, it had been phased out of most medications, and cases of bromism became rare.
Even so, you can still find bromide in some veterinary drugs and dietary supplements. What makes this particular case wild is that it appears to be the first documented bromide poisoning directly linked to AI advice.
The report says the man showed up at the ER convinced his neighbor was poisoning him. Although some of his physical results were normal, he was agitated, paranoid, and even hallucinating. His symptoms escalated into a full psychotic episode, and he had to be placed on a psychiatric hold.
After doctors stabilized him, they learned he'd been intentionally taking sodium bromide for months. His reasoning? He'd read that too much table salt (sodium chloride) was bad for you. But instead of simply cutting back on salt, he decided to eliminate chloride from his diet altogether.
That's when he turned to ChatGPT, which apparently told him bromide was a safe substitute. So, naturally, he started buying sodium bromide online and taking it. The doctors didn't have access to the man's chat logs, but when they asked ChatGPT 3.5 what chloride could be replaced with, its response also included bromide.
Now, it's possible ChatGPT was talking about bromide in a completely different context. Context is key, right? But the AI never warned him about the dangers of consuming bromide, nor did it ask why he wanted to know.
Luckily, the man steadily recovered with treatment and was eventually taken off antipsychotics. He was discharged after three weeks and was still doing well at his follow-up appointment.
The doctors pointed out that while AI tools can be great for bridging the gap between experts and the public, they also risk spreading information without context. It's safe to say no human doctor would recommend swapping chloride for bromide if you're worried about salt intake. It just seems obvious!
I think this whole story is a good reminder that we still need to use our brains. Having someone to talk things through with, someone who can say, "Hey, maybe that's not such a great idea," is more important than ever, no matter how smart AI gets.
Source: Gizmodo