AI is sending people to places that don't exist
An imagined town in Peru, an Eiffel Tower in Beijing: travellers are increasingly using tools like ChatGPT for itinerary ideas – and being sent to destinations that don't exist.
Miguel Angel Gongora Meza, founder and director of Evolution Treks Peru, was in a rural Peruvian town preparing for a trek through the Andes when he overheard a curious conversation. Two unaccompanied tourists were chatting amicably about their plans to hike alone in the mountains to the "Sacred Canyon of Humantay".
"They [showed] me the screenshot, confidently written and full of vivid adjectives, [but] it was not true. There is no Sacred Canyon of Humantay!" said Gongora Meza. "The name is a combination of two places that have no relation to the description. The tourist paid nearly $160 (£118) in order to get to a rural road in the environs of Mollepata without a guide or [a destination]."
What's more, Gongora Meza insisted that this seemingly innocent mistake could have cost these travellers their lives. "This sort of misinformation is perilous in Peru," he explained. "The elevation, the climatic changes and accessibility [of the] paths have to be planned. When you [use] a program [like ChatGPT], which combines pictures and names to create a fantasy, then you can find yourself at an altitude of 4,000m without oxygen and [phone] signal."
In just a few years, artificial intelligence (AI) tools like ChatGPT, Microsoft Copilot and Google Gemini have gone from a mere novelty to an integral part of trip planning for millions of people. According to one survey, 30% of international travellers are now using generative AI tools and dedicated travel AI sites such as Wonderplan and Layla to help organise their trips.
While these programs can offer valuable travel tips when they're working properly, they can also lead people into some frustrating or even dangerous situations when they're not. This is a lesson some travellers are learning when they arrive at their would-be destination, only to find they've been fed incorrect information or steered to a place that only exists in the hard-wired imagination of a robot.
Dana Yao and her husband recently experienced this first-hand. The couple used ChatGPT to plan a romantic hike to the top of Mount Misen on the Japanese island of Itsukushima earlier this year. After exploring the town of Miyajima with no issues, they set off at 15:00 to hike to the mountain's summit in time for sunset, exactly as ChatGPT had instructed them.
"That's when the problem showed up," said Yao, a creator who runs a blog about traveling in Japan, "[when] we were ready to descend [the mountain via] the ropeway station. ChatGPT said the last ropeway down was at 17:30, but in reality, the ropeway had already closed. So, we were stuck at the mountain top."
A 2024 BBC article reported that Layla briefly told users that there was an Eiffel Tower in Beijing and suggested a marathon route across northern Italy to a British traveller that was entirely unfeasible. "The itineraries didn't make a lot of logical sense," the traveller said. "We'd have spent more time on transport than anything else."
© BBC
