ChatGPT's Wrong Advice Leaves Influencer Couple Stranded At Airport

When the couple arrived at the airport, airline staff told them they could not board their flight without an ESTA.

Mery Caldass tearfully explains the incident in a viral video.
  • Spanish influencer couple missed flight due to ChatGPT's incorrect visa advice for Puerto Rico
  • ChatGPT failed to mention the need for an ESTA travel authorisation for Puerto Rico
  • Airline staff prevented the couple from boarding the flight without an ESTA

As artificial intelligence (AI) technology makes rapid advances, an increasing number of people across the globe are turning to chatbots for advice. Experts have warned against delegating too many daily tasks and too much decision-making to AI-powered chatbots, as a Spanish influencer couple found out the hard way. The couple missed their flight after seeking travel advice from ChatGPT.

In a now-viral video, Mery Caldass can be seen crying as her boyfriend, Alejandro Cid, tries to console her while they walk through the airport.

“Look, I always do a lot of research, but I asked ChatGPT and it said no,” Ms Caldass explained, referring to whether they needed a visa to visit Puerto Rico to watch artist Bad Bunny perform.

She added that the chatbot told them they did not require a visa, but failed to mention that they needed an ESTA (Electronic System for Travel Authorization) to enter Puerto Rico. When they arrived at the airport, airline staff told them they could not board the flight without one.

“I don't trust that one [ChatGPT] anymore because sometimes I insult him. I call him a bastard, I tell him 'you're useless, but inform me well', so this is his revenge,” she added, accusing ChatGPT of holding a grudge.

ChatGPT advice backfires

This is not the first time that seeking advice from an AI chatbot has backfired. According to a case report published in an American College of Physicians journal, a 60-year-old man was hospitalised after asking ChatGPT how to remove salt (sodium chloride) from his diet, having read about the negative health effects of table salt.

After consulting the chatbot, the man removed table salt from his diet and replaced it with sodium bromide, a compound commonly used in medications in the early 1900s but now known to be toxic in large quantities. The doctors noted that he had developed bromism after asking ChatGPT for dietary advice.

"He had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT, in which he had read that chloride can be swapped with bromide, though likely for other purposes, such as cleaning," the report highlighted.
