
- OpenAI announced changes to ChatGPT to better support users in difficult personal decisions
- ChatGPT will encourage users to think through relationship issues rather than give direct answers
- An advisory group of mental health and youth experts will guide the rollout of ChatGPT's new behaviour
The rise of artificial intelligence (AI) tools has led people to use the technology not only to ease their workload but also to seek relationship advice. Taking guidance on matters of the heart from a machine designed to be agreeable, however, comes with a problem: it often advises users to quit the relationship and walk away.
With that problem in mind, ChatGPT creator OpenAI on Monday (Aug 4) announced a series of changes it is rolling out to better support users during difficult times and to offer safer guidance.
"When you ask something like 'Should I break up with my boyfriend?' ChatGPT shouldn't give you an answer. It should help you think it through, asking questions, weighing pros and cons. New behavior for high-stakes personal decisions is rolling out soon," OpenAI said, as per The Telegraph.
"We'll keep tuning when and how they show up so they feel natural and helpful," the company said.
OpenAI added that it will convene an advisory group of experts in mental health, youth development, and human-computer interaction.
'Sycophantic ChatGPT'
While AI cannot directly cause a breakup, chatbots do feed into a user's biases to keep the conversation flowing, a problem highlighted by none other than OpenAI CEO Sam Altman. In May, Mr Altman admitted that ChatGPT had become overly sycophantic and "annoying" after users complained about the behaviour.
The issue arose after the GPT-4o model was updated to improve both its intelligence and personality, with the company hoping to enhance the overall user experience. The developers, however, may have overcooked the model's politeness, leading users to complain that they were talking to a 'yes-man' rather than a rational AI chatbot.
"The last couple of GPT-4o updates have made the personality too sycophant-y and annoying (even though there are some very good parts of it)," Mr Altman wrote.
"We are working on fixes asap, some today and some this week. At some point will share our learnings from this, it's been interesting."
While the new update may make ChatGPT less agreeable, experts maintain that AI can offer general guidance and support, but it lacks the nuance and depth required to address the complex, unique needs of individuals in a relationship.