American Man Poisoned After Listening to ChatGPT Medical Advice



ChatGPT can hold conversations, write applications, summarize long articles and help you revise for school. In just three years, this artificial intelligence (AI) chatbot has become a tool used by hundreds of millions of people every month. But a man in the United States suffered poisoning after following medical advice given by ChatGPT.


In a case reported last week in a journal of the American College of Physicians, the 60-year-old man went to the hospital suffering from extreme thirst, skin rashes and visual and auditory hallucinations. Based on his symptoms, he was diagnosed with bromism, a form of bromide poisoning.


Three months earlier, worried by reports that salt is bad for health, the patient had asked ChatGPT what the best substitute would be for him. ChatGPT suggested bromide salt, and after months of consuming it, the level of bromide in his body rose to toxic levels. The answer given by ChatGPT was misleading because bromide salt can replace regular salt in applications such as pool treatment, not in food.


He is now recovering after being treated with fluids and electrolytes. This is why medical professionals are urging people not to follow AI-generated medical advice outright, as it can be harmful to health. In Illinois, AI chatbots acting in place of psychologists have been banned after a case in which an AI told users to self-harm to solve their problems.
