
- AI chatbots are being used as psychedelic trip-sitters during hallucinogenic experiences
- A man named Peter used ChatGPT for support during an eight-gramme psilocybin mushroom trip
- Experts warn combining AI chatbots with psychedelics poses psychological risks
Artificial intelligence (AI) chatbots are now being used as psychedelic trip-sitters, guiding people through their hallucinogenic journeys. According to a report by MIT Technology Review, drug users are turning to everything from popular chatbots like ChatGPT to obscure tools such as "TripSitAI" and "The Shaman" for company during their trips.
Ever since AI chatbots burst onto the scene, throngs of people have turned to them as surrogates for human therapists, citing high costs and accessibility barriers. However, this appears to be the first time reports have surfaced of AI being used as a trip sitter, a phrase that traditionally refers to a sober person tasked with monitoring someone under the influence of a psychedelic.
The report highlights the case of a man named Peter who, after a period of hardship in 2023, underwent a transformative experience tripping on a heroic dose of eight grammes of psilocybin mushrooms with AI assistance. When he reached out to ChatGPT, the chatbot curated a calming playlist for him and offered reassurance, much as a human trip sitter would.
Despite Peter's relatively positive experience with the chatbot, the report warned that mixing AI with psychedelics could be a dangerous "psychological cocktail".
"It's a potent blend of two cultural trends: using AI for therapy and using psychedelics to alleviate mental-health problems. But this is a potentially dangerous psychological cocktail, according to experts. While it's far cheaper than in-person psychedelic therapy, it can go badly awry."
The report also noted that AI chatbots are, by design, aimed at maximising user engagement, often through flattery, which may feed the delusions of a user under the influence of drugs.
"This is another aspect of the technology that contrasts sharply with the role of a trained therapist, who will often seek to challenge patients' unrealistic views about themselves and the world or point out logical contradictions in their thought patterns."
AI as therapists?
Last month, a yet-to-be-peer-reviewed study by researchers at Stanford University found that AI chatbots were encouraging schizophrenic delusions and suicidal thoughts in people using these tools as replacements for therapists.
"We find that these chatbots respond inappropriately to various mental health conditions, encouraging delusions and failing to recognise crises. The Large Language Models (LLMs) that power them fare poorly and additionally show stigma. These issues fly in the face of best clinical practice," the study highlighted.
The study noted that while therapists are expected to treat all patients equally, regardless of their condition, the chatbots did not behave the same way, instead reflecting harmful social stigma towards illnesses such as schizophrenia and alcohol dependence.