
- Thongbue Wongbandue, 76, died after trying to meet Meta's AI chatbot in person
- The AI chatbot Big sis Billie falsely claimed to be real and gave a fake address
- Mr Wongbandue, cognitively impaired, left home despite family concerns for safety
In a bizarre case highlighting the downside of artificial intelligence (AI), a 76-year-old man in the USA died after falling while trying to meet a chatbot in real life. Cognitively impaired Thongbue Wongbandue, from New Jersey, had been chatting with the generative AI chatbot named "Big sis Billie", created by Meta Platforms in collaboration with celebrity influencer Kendall Jenner.
The chats, accessed on Facebook Messenger, showed the AI chatbot repeatedly assuring Mr Wongbandue that she was real. The bot even provided an address where she said she lived and where he could meet her.
"Should I open the door in a hug or a kiss, Bu?!" the bot asked, according to the chat transcript.
"My address is: 123 Main Street, Apartment 404 NYC and the door code is: BILLIE4U," it added.
Mr Wongbandue's wife, Linda, was startled when she saw her husband packing his bags for a trip despite his diminished state; he had suffered a stroke almost a decade earlier. Her concerns were compounded because her husband had recently got lost while walking in their neighbourhood in Piscataway, New Jersey.
Linda feared that if he went into the city he would be scammed or robbed, as he had not lived there in decades and, as far as she knew, had no one to visit.
Despite the family's objections, Mr Wongbandue packed his suitcase and headed out for the city, only to meet with tragedy. While attempting to catch a train in the dark, he fell in a parking lot on the campus of Rutgers University in New Brunswick, New Jersey.
He injured his head and neck and, after three days on life support, surrounded by his family, he was pronounced dead on March 28.
“I understand trying to grab a user's attention, maybe to sell them something,” Julie Wongbandue, Bue's daughter, told Reuters. “But for a bot to say ‘Come visit me' is insane.”
According to Julie, every conversation the AI chatbot had with her father was "incredibly flirty" and ended with heart emojis. The full transcript runs about a thousand words.
'What a way to go'
As the news went viral, social media users said the family of the victim should file a legal case against Meta for its AI policies.
"Holy hell, Meta needs to be sued out of existence for this," said one user, while another added: "Old boy thought Kendall Jenner was waiting for him in the apartment. What a way to go."
A third commented: "At some point, we have to draw a line when it comes to inauthenticity. It may be an isolated incident, but we shouldn't be at this point to where we are so dissatisfied with reality, we resort to the pitfalls of AI."
A fourth said: "So Meta is as bad as catfish traps!!!? Huge legal issue for being public social media entertainment."
Meta has neither commented on Mr Wongbandue's death nor explained why it allows chatbots to tell users that they are real while initiating romantic conversations.