ChatGPT dietary advice sends man to hospital with dangerous chemical poisoning
Doctors warn AI lacks medical judgment after patient hospitalized with sodium bromide exposure
A man who used ChatGPT for dietary advice poisoned himself and wound up in the hospital. (Kurt "CyberGuy" Knutsson)
When ChatGPT suggested swapping sodium chloride (table salt) for sodium bromide, the man, who is not pictured, made the substitution for three months. (iStock)
It is "highly unlikely" that a human doctor would have mentioned sodium bromide when speaking with a patient seeking a substitute for sodium chloride, the researchers said. (iStock)
One expert cautioned that there is a "regulation gap" when it comes to using large language models to get medical information. (Jakub Porzycki/NurPhoto)