
Many people have turned to platforms like ChatGPT in search of advice to help them navigate life. However, plenty of anecdotes have shown you should take what artificial intelligence has to say with a grain of salt, including one involving a man who was advised to replace table salt with a toxic substance that landed him in the hospital.
It’s hard to blame people for being impressed by the capabilities of ChatGPT and other A.I. platforms that have provided a glimpse at a future where a computer can tackle tasks, answer questions, and solve problems in a fraction of the time it would take a human being to do the same thing.
With that said, if you’ve spent enough time engaging with those virtual assistants, you’re likely very aware they’re far from infallible and have a tendency to make plenty of mistakes that prove there’s still a lot of work to be done before they reach their full potential.
ChatGPT’s propensity to make mistakes is far from its only potential downside, as one study looking at the neurological implications of becoming overly reliant on A.I. suggests you’re not doing your brain any favors. However, that’s not the only way it can hurt you, a lesson one person who turned to ChatGPT for some diet advice learned the hard way.
A 60-year-old man was hospitalized after ChatGPT told him to swap out the salt in his diet for a substance that’s used to clean swimming pools
Another examination of the negative ramifications of A.I. recently surfaced in the journal Annals of Internal Medicine in the form of a case report concerning a 60-year-old man who was “expressing concern that his neighbor was poisoning him” when he was admitted to the emergency room.
A blood test didn’t immediately raise any major red flags, and doctors were still attempting to get to the bottom of the matter when the patient, amid the “paranoia and auditory and visual hallucinations” he was experiencing, attempted to escape from the hospital and was placed on a psychiatric hold.
The staff tasked with diagnosing him eventually determined he was suffering from bromism, an ailment that stems from the ingestion of substances containing bromine. Medicines harnessing the element were once used for therapeutic purposes (including as a sedative and a treatment for seizures), but the FDA banned bromide from over-the-counter treatments in 1975 over concerns about toxicity that can cause psychosis and other psychological and physiological symptoms.
The patient was eventually stabilized, at which point he confirmed he’d spent the previous three months regularly consuming sodium bromide, which is commonly used as a disinfectant for pools and hot tubs. He said that decision stemmed from a conversation he’d had with ChatGPT about ways to eliminate table salt (a.k.a. sodium chloride) from his diet, as the chatbot apparently informed him sodium bromide was a suitable alternative.
Doctors told him that was not, in fact, the case, and he gradually recovered over the course of a hospital stay that ultimately lasted for three weeks.