Maybe Don’t Use ChatGPT for Health Advice

A new case study in the Annals of Internal Medicine highlights the dangers of seeking medical advice from AI instead of qualified professionals. The report details a 60-year-old man who developed bromism—bromide poisoning—after acting on guidance he believed came from ChatGPT.

Concerned about the health risks of table salt (sodium chloride), the man reportedly asked the AI chatbot about removing it from his diet. While researchers didn't have access to his original conversation, they recreated a similar prompt and found that the chatbot did suggest bromide as a possible substitute, though likely in the context of cleaning rather than diet.

Sodium bromide is commonly used for disinfecting pool water, not as a food additive. Extended ingestion can lead to serious symptoms, including hallucinations, paranoia, and psychosis. The man eventually arrived at a hospital convinced a neighbor was poisoning him. He refused water despite extreme thirst—a known symptom of bromism—and attempted to flee the hospital within 24 hours. Doctors treated him for psychosis.

He later admitted to replacing table salt with sodium bromide for three months. Additional symptoms included facial acne and insomnia.

Researchers from the University of Washington, who authored the report, warn that AI chatbots can generate scientific inaccuracies, lack the ability to critically evaluate results, and can fuel the spread of harmful misinformation. They stress the importance of consulting medical professionals rather than relying on AI for health decisions.
