
    ChatGPT’s salt advice sparks medical emergency

    A recent case has highlighted the risks of relying on artificial intelligence after a 60-year-old man was hospitalized for following ChatGPT’s diet advice, which led him to replace table salt with a toxic compound.

    According to reports, the man, who was concerned about his sodium intake, asked ChatGPT how to remove sodium chloride (table salt) from his diet. The AI suggested sodium bromide as an alternative, a compound that was used in early 20th-century medications but is now known to be harmful in large doses.

    Without consulting a healthcare professional, the man purchased sodium bromide on the strength of ChatGPT’s diet advice and used it in his cooking for three months.

    He soon developed symptoms including hallucinations, paranoia, excessive thirst, and skin lesions. Doctors diagnosed him with bromism, a rare condition caused by excessive bromide levels in the body.

    The man had no prior psychiatric or physical health issues. His condition worsened to the point of psychosis, requiring an involuntary psychiatric hold and intensive treatment with fluids and electrolytes.

    Experts warn that while AI tools like ChatGPT can be useful for general information, they are not substitutes for professional medical advice.

    OpenAI, the developer of ChatGPT, explicitly states in its Terms of Use that its services are not intended for diagnosing or treating health conditions.

    This ChatGPT diet advice incident has sparked renewed debate about the ethical boundaries of AI in healthcare.

    The man has since recovered after a three-week hospital stay, but his case serves as a warning to everyone.

    Read More: AI therapy not protected by law: OpenAI CEO

    Earlier, OpenAI CEO Sam Altman warned users who rely on ChatGPT for emotional support and counselling, highlighting that conversations with the AI chatbot do not have legal protection.

    Altman made the remarks during a recent appearance on the podcast ‘This Past Weekend with Theo Von’. Unlike conversations with therapists, doctors, or lawyers, exchanges with ChatGPT are not covered by legal confidentiality.