
OpenAI CEO Sam Altman said on Tuesday that ChatGPT, the company's generative artificial intelligence (AI) chatbot, will become more human-like with updates rolling out in the coming weeks. The chatbot will be able to act as a companion, and users who have verified that they are adults will also be allowed to access "erotica".
"We made ChatGPT pretty restrictive to make sure we were being careful with mental health issues. We realize this made it less useful/enjoyable to many users who had no mental health problems, but given the seriousness of the issue we wanted to get this right," he wrote in a post on X (formerly Twitter).
"Now that we have been able to mitigate the serious mental health issues and have new tools, we are going to be able to safely relax the restrictions in most cases."
"In a few weeks, we plan to put out a new version of ChatGPT that allows people to have a personality that behaves more like what people liked about 4o (we hope it will be better!)."
He suggested that if users want ChatGPT to respond in a more human-like way, use emojis, or act like a friend, it should be able to do so.
"In December, as we roll out age-gating more fully and as part of our 'treat adult users like adults' principle, we will allow even more, like erotica for verified adults," he added.
One user commented, "ChatGPT used to feel like a person you could actually talk to, then it turned into a compliance bot. if it can be made fun again without losing the guardrails, that's a huge win. people don't want chaos, just authenticity."
Altman responded, "Almost all users can use ChatGPT however they'd like without negative effects; for a very small percentage of users in mentally fragile states there can be serious problems. 0.1% of a billion users is still a million people. We needed (and will continue to need) to learn how to protect those users, and then with enhanced tools for that, adults that are not at risk of serious harm (mental health breakdowns, suicide, etc) should have a great deal of freedom in how they use ChatGPT."