AI health advice gone wrong: When a chatbot’s answer leads to the ER
By Veronica E.
Artificial intelligence is becoming a go-to tool for everything from vacation planning to solving everyday tech problems—but what happens when you use it for medical advice?
For one 60-year-old man, the answer was a dangerous hospital stay that could have been avoided.
While AI tools like ChatGPT can deliver quick, confident-sounding responses, they can also provide inaccurate or even harmful suggestions.
A recent case shows how one simple question about salt substitutes led to a toxic reaction and serious health complications.
It’s a cautionary reminder that AI can be helpful for learning—but it’s no substitute for a real doctor.

AI can be a helpful tool for research, but medical decisions should always be guided by a trusted healthcare provider. Image Source: Pexels / Airam Dato-on.
A dangerous substitution with serious consequences
A 60-year-old man concerned about reducing his salt intake turned to ChatGPT for advice.
Instead of suggesting common low-sodium alternatives, the AI recommended sodium bromide—a chemical compound typically used in industrial applications, not food.
Trusting the advice, he purchased sodium bromide online and used it in place of table salt for three months.
Before long, he developed confusion, paranoia, hallucinations, fatigue, acne, poor muscle coordination, and extreme thirst.
Doctors diagnosed him with bromism, a toxic reaction to bromide exposure, and he required both medical treatment and psychiatric care.
Also read: Alone with a panic attack? Here's how ChatGPT stepped in
Why AI can’t replace a healthcare professional
This case, published in Annals of Internal Medicine Clinical Cases, underscores the risks of relying on AI for medical guidance.
While AI can sound confident, it lacks the context, critical thinking, and personal assessment that a trained healthcare provider offers.
In this instance, experts noted that a medical professional would never have recommended sodium bromide for dietary use.
Also read: Is AI getting better at being human? A new test has people talking
The growing trend of using AI for wellness advice
Despite these risks, a survey by Talker Research for The Vitamin Shoppe’s annual Trend Report found that 35% of Americans already use AI to help manage their health and wellness.
An even larger share, 63%, reported turning to AI for guidance, ahead of those who consult social media (43%) or influencers (41%).
AI's speed, round-the-clock availability, and the sense that it won't judge you make it appealing, but the trade-off can be safety.
Also read: FDA’s new AI tool has blind spots—Could that be bad news for seniors?
Smart ways to use AI for health questions
Experts stress that AI should be treated as a supplemental tool, not a source of prescriptions.
Use it to learn general information or prepare questions before a doctor visit, but never to make medication or dietary changes without professional input.
Unusual suggestions—especially for chemicals or supplements you’ve never heard of—should always be double-checked with a licensed provider.
AI can be helpful for research and organization, but it’s no substitute for a real conversation with a medical expert.
As this case shows, even the most advanced AI can make dangerous mistakes.
Staying informed is important—but so is knowing when to rely on trained professionals.
Read next: How one woman used AI to wipe out $12K in debt—and how you can too
Key Takeaways
- A 60-year-old man was hospitalized after following ChatGPT’s advice to replace table salt with sodium bromide, leading to bromism, a form of bromide toxicity.
- He developed paranoia, hallucinations, fatigue, acne, poor muscle coordination, and extreme thirst before receiving medical and psychiatric treatment.
- Medical experts caution that AI tools can spread misinformation and should never replace professional health advice or diagnoses.
- A survey found 35% of Americans use AI for health and wellness management, though most still trust medical professionals over AI.
Have you ever received questionable advice from AI or the internet? What steps do you take to make sure your health information is accurate?