60-Year-Old Man Lands In Hospital After Following ChatGPT-Generated Diet Plan

It all started after the man read about the adverse health effects of table salt. He consulted ChatGPT and was told that it could be swapped with sodium bromide.

A 60-year-old man landed in the hospital after seeking ChatGPT's advice on how to improve his diet. According to the New York Post, the man, whose identity hasn't been revealed, was hospitalised with severe psychiatric symptoms after asking the AI chatbot how to remove sodium chloride from his diet. Doctors noted that the man had no past psychiatric or medical history; however, during the first 24 hours of hospitalisation, he expressed increasing paranoia and reported auditory and visual hallucinations. He also experienced intense thirst and coordination problems.

It all started after the man read about the adverse health effects of table salt. He consulted ChatGPT, which told him it could be swapped with sodium bromide. Notably, sodium bromide looks similar to table salt, but it is an entirely different compound. It is occasionally used in medicine but is more commonly found in industrial and cleaning products. Ingesting too much bromide can cause neuropsychiatric and dermatologic symptoms, as per the outlet.

The man's case was published in Annals of Internal Medicine: Clinical Cases, a journal of the American College of Physicians. According to the report, the man, who had studied nutrition in college, was conducting an experiment in which he eliminated sodium chloride from his diet and replaced it with sodium bromide he purchased online.

He was admitted to the hospital after three months of the diet swap. He told the doctors that he distilled his own water and adhered to multiple dietary restrictions. He complained of thirst but was suspicious when water was offered to him, the report stated.

The man was treated with fluids, electrolytes and antipsychotics. After attempting to escape, he was placed in the hospital's inpatient psychiatry unit. He spent three weeks in the hospital before he was well enough to be discharged.

"It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation," the authors of the report warned, per the New York Post. 

Notably, OpenAI, the developer of ChatGPT, also states in its Terms of Use that the chatbot's output "may not always be accurate". "You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice," the Terms of Use say.

"Our Services are not intended for use in the diagnosis or treatment of any health condition," it states. 
