
- Sam Altman warned ChatGPT chats lack legal privacy protections like therapy sessions
- Deleted ChatGPT conversations can be retrieved for legal and security purposes
- ChatGPT is increasingly used by young people as a therapist or life coach
OpenAI CEO Sam Altman has warned that conversations with ChatGPT do not carry the legal privacy protections that therapy sessions do, and that deleted chats may still be retrieved for legal and security reasons. Mr Altman's warning comes against the backdrop of a growing number of people using the AI tool as a therapist.
While conversations with real therapists, doctors and lawyers are protected by legal privilege, the same is not true for chats with chatbots, at least for now, Mr Altman conceded.
"People talk about the most personal sh*t in their lives to ChatGPT. People use it - young people, especially, use it - as a therapist, a life coach; having these relationship problems and [asking] what should I do?" Mr Altman told podcaster Theo Von.
"Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT."
According to Mr Altman, a growing number of people, especially younger users, are turning to ChatGPT for help and advice.
“No one had to think about that even a year ago, and now I think it's this huge issue of like, 'How are we gonna treat the laws around this?'”
See the post here:

Sam Altman tells Theo Von about how people use ChatGPT as a therapist and there needs to be new laws on chat history privacy: "If you go talk to ChatGPT about your most sensitive stuff and then there's a lawsuit, we could be required to produce that."
— Bearly AI (@bearlyai) July 27, 2025
AI as therapist?
According to a yet-to-be-peer-reviewed study by researchers at Stanford University, AI therapist chatbots are not ready to shoulder the responsibilities of a counsellor, as they can reinforce harmful mental health stigmas.
"We find that these chatbots respond inappropriately to various mental health conditions, encouraging delusions and failing to recognise crises. The Large Language Models (LLMs) that power them fare poorly and additionally show stigma. These issues fly in the face of best clinical practice," the study highlighted.
The study noted that while therapists are expected to treat all patients equally, regardless of their condition, the chatbots did not act the same way. They reflected harmful social stigma towards conditions such as schizophrenia and alcohol dependence, while responding far more sympathetically to conditions such as depression.