
Mental Health App Koko Tested AI Chatbot On Users, Faces Backlash

People can seek advice and help from other users through the peer-to-peer mental health service Koko.


The test was shut down because it sounded "inauthentic."

The latest trend in mental health care is the use of apps that offer remote counselling, eliminating the need for in-person sessions. One such app, however, is now facing criticism for substituting an AI chatbot for human responders without users' knowledge.

Koko, a mobile mental health service, tested AI-generated responses on users without notifying them first.

Rob Morris, co-founder of Koko, revealed on Twitter last week that about 4,000 people had received support composed with the help of GPT-3, an AI language model.

Morris described the company's "co-pilot" strategy, in which humans monitored the AI's output and stepped in as needed.

"We used a 'co-pilot' approach, with humans supervising the AI as needed. We did this on about 30,000 messages...," he wrote.

He further explained: "Messages composed by AI (and supervised by humans) were rated significantly higher than those written by humans on their own (p < .001). Response times went down by 50%, to well under a minute."
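Koko has not published its implementation, but the description above suggests a standard human-in-the-loop pattern: the model drafts a reply, and a human supervisor approves, edits, or discards it before anything reaches the user. Below is a minimal sketch of such a flow, using OpenAI's legacy GPT-3 completion API as it existed in early 2023; the prompt wording, model choice, and review logic are illustrative assumptions, not Koko's actual code.

```python
# Hypothetical sketch of a human-in-the-loop ("co-pilot") reply flow.
# Koko has not published its code; the prompt, model choice, and review
# step here are illustrative assumptions, not Koko's implementation.
import openai

openai.api_key = "YOUR_API_KEY"  # assumed to be configured by the operator


def draft_reply(user_message: str) -> str:
    """Ask GPT-3 (legacy completion API, circa early 2023) for a draft reply."""
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=(
            "You are helping a peer supporter respond kindly to this message:\n"
            f"{user_message}\n\nSupportive reply:"
        ),
        max_tokens=150,
        temperature=0.7,
    )
    return response.choices[0].text.strip()


def send_with_human_review(user_message: str) -> None:
    """Draft with AI, then let a human supervisor approve, edit, or discard."""
    draft = draft_reply(user_message)
    print(f"AI draft:\n{draft}\n")
    decision = input("Send as-is (s), edit (e), or discard (d)? ").lower()
    if decision == "s":
        deliver(draft)
    elif decision == "e":
        deliver(input("Edited reply: "))
    # "d" (or anything else) drops the draft; a human writes from scratch


def deliver(reply: str) -> None:
    # Placeholder for the app's actual message-delivery path
    print(f"Sent: {reply}")
```

The key property of this pattern is that no AI-drafted text is delivered without an explicit human decision, which matches the "humans supervising the AI as needed" framing in Morris's thread.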

Speaking to Gizmodo, Morris said, "Frankly, this is going to be the future. We're going to think we're interacting with humans and not know whether there is an AI involved. How does that affect human-to-human communication? I have my own mental health challenges, so I really want to see this done correctly."

The test, however, angered social media users, many of whom questioned its ethics.

"I can't believe you did this to people who needed mental health care. I'm shocked," commented a user.

"He admits it's unethical and harmful to patients," wrote another user.
