According to a new case study, ChatGPT’s health advice was behind a man’s hospitalization. The study describes a 60-year-old man who suffered from a rare form of poisoning that caused numerous symptoms, including psychosis. The poisoning, identified as the result of long-term sodium bromide consumption, reportedly occurred after the patient consulted ChatGPT about dietary changes. Notably, with GPT-5, OpenAI is now emphasizing the accuracy of its artificial intelligence (AI) chatbot’s responses, promoting it as a key feature.
ChatGPT reportedly suggested sodium bromide when the man asked how to replace table salt
According to a report titled “A Case of Bromism Influenced by the Use of Artificial Intelligence,” published in Annals of Internal Medicine: Clinical Cases, a man developed bromism after consulting the AI chatbot ChatGPT for health information.
The case study states that the patient, a 60-year-old man with no prior psychiatric or medical history, was admitted to the emergency room fearing that he was being poisoned by his neighbor. He presented with excessive thirst, insomnia, fatigue, impaired muscle coordination (ataxia), paranoia, hallucinations, and skin changes, including acne and cherry angiomas.
After consulting the Poison Control Department and running a series of tests, medical professionals were able to diagnose the condition as bromism, a syndrome that results from long-term consumption of sodium bromide (or another bromide salt).
According to the case study, the patient reported consulting ChatGPT about replacing sodium chloride (table salt) in his diet; after receiving sodium bromide as a suggested alternative, he used it regularly for three months.
Based on the case’s timeline, the study’s authors believe that either GPT-3.5 or GPT-4 was used for the consultation. However, the researchers noted that they had no access to the conversation logs, so it is not possible to determine the exact prompt or the AI’s response. It is also possible that the man took the ChatGPT response out of context.
“However, when we asked ChatGPT 3.5 what chloride can be replaced with, we also produced a response that included bromide,” the researchers wrote, adding that the response neither provided a specific health warning nor inquired why they wanted to know.
Live Science reached out to OpenAI for comment. A company spokesperson directed the publication to the company’s terms of use, which state that users should not rely on output from ChatGPT as a “sole source of truth or factual information, or as a substitute for professional advice.”
The study claims that after prompt action and three weeks of treatment, the man’s condition began to improve. “It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation,” the researchers said.