News
After the escape attempt, the man was given an involuntary psychiatric hold and an anti-psychosis drug. He was administered ...
16h
Futurism on MSN: Man Follows ChatGPT's Advice and Poisons Himself
A man trying to cut out salt from his diet learned the hard way that ChatGPT isn't to be trusted with medical advice after ...
2d on MSN
60-year-old man lands in hospital after following ChatGPT’s advice to eliminate salt from diet
After following ChatGPT's advice to remove salt from his diet, a man developed bromide toxicity, raising alarms about AI's ...
The man had been using sodium bromide for three months, which he had sourced online after seeking advice from ChatGPT.
A 60-year-old man who turned to ChatGPT for advice replaced the salt in his diet with a substance that gave him a neuropsychiatric illness called bromism.
Bromism was once so common it was blamed for "up to 8% of psychiatric admissions" according to a recently published paper on ...
In a rare and alarming case, a man in the United States developed life-threatening bromide poisoning after following diet advice given by ChatGPT. Doctors believe this could be the first known case of ...
As for the man himself, he did slowly recover from his ordeal. He was eventually taken off antipsychotic medication and ...
2d on MSN
60-Year-Old Gave Himself Early 20th Century Psychosis After He Went To ChatGPT For Diet Advice
A 60-year-old man gave himself an uncommon psychiatric disorder after asking ChatGPT for diet advice in a case published ...
AI tools and noted that when they later asked ChatGPT the same question, it again suggested bromide without a specific health ...
A bizarre and dangerous medical case has emerged, highlighting the risks of relying solely on artificial intelligence for ...