A 60-year-old man was hospitalized for three weeks after following incorrect advice from ChatGPT, a case that highlights the potential dangers of relying on AI for medical information.

The man had asked ChatGPT how to replace salt in his diet, and the chatbot suggested sodium bromide, a substance now considered toxic. Without seeking professional medical advice, he purchased sodium bromide and consumed it, which triggered a series of severe symptoms. He experienced significant distress, including fear, confusion, and extreme thirst, and was eventually hospitalized, where doctors worked to stabilize his condition and restore his health.

The case, published in a journal of the American College of Physicians, underscores the importance of consulting medical professionals for health and nutrition advice rather than relying on AI in place of expert guidance.
