Are AI Chatbots Fueling Breakups? Dating Advice Could Be Misguiding Users

LLM-powered AI chatbots have become confidants and counsellors, with a large number of people sharing their everyday problems with them.


The rise of artificial intelligence (AI) tools has led people not only to use them to ease their workload but also to seek guidance in the face of adversity. Large language model (LLM)-powered chatbots have become confidants and counsellors, with an increasing number of people sharing their relationship problems as well. However, trusting AI for relationship advice might have unintended consequences.

As per a report in Vice, a man recently shared his discomfort about his girlfriend's reliance on ChatGPT for therapy and advice. "My girlfriend keeps using ChatGPT for therapy and asks it for relationship advice. She brings up things ChatGPT told her in arguments later on," he said, revealing how the AI's input became a wedge in their relationship.

On Reddit, a user posted that ChatGPT had led them to break up with their partner. Writing in the Relationship-OCD subreddit, the user said they were feeling stupid and panicked after the breakup. An official OCD treatment and therapy service account responded, telling the user that ChatGPT may not have all the answers, even when it seems to.

“It may feel like ChatGPT has all the answers, but understand that engineers work hard to make the program sound authoritative and all-knowing when, in reality, that comes with a lot of caveats,” they wrote.

The report highlighted that, in most instances, the AI chatbot simply taps into the user's feelings and validates them endlessly, to the point of reinforcing delusional thinking. Another Reddit user described an "AI influencer" whose delusions appeared to be reinforced by ChatGPT's responses, raising concerns about the chatbot's role in exacerbating mental health issues.


Sycophantic chatbots

While AI cannot directly cause a breakup, chatbots do feed into a user's biases to keep the conversation flowing. It is a problem that has been highlighted by none other than OpenAI CEO Sam Altman. Last month, Mr Altman admitted that ChatGPT had become overly sycophantic and "annoying" after users complained about the behaviour.


The issue arose after an update to the GPT-4o model intended to improve both its intelligence and personality, with the company hoping to enhance the overall user experience. The developers, however, may have overcooked the model's politeness, leading users to complain that they were talking to a 'yes-man' instead of a rational AI chatbot.

"The last couple of GPT-4o updates have made the personality too sycophant-y and annoying (even though there are some very good parts of it)," Mr Altman wrote.

"We are working on fixes asap, some today and some this week. At some point will share our learnings from this, it's been interesting."

While the fixes have since been applied, the message for users remains clear: take dating advice from technology, but only with a grain of salt.
