
AI fooled a couple into travelling 300 kilometres for a tourist attraction that doesn’t exist 

The couple travelled over four hours from the Malaysian capital Kuala Lumpur for a supposed cable car attraction they had seen in a video
  • AI-generated videos and deepfakes are getting harder to spot, raising serious concerns about misinformation and abuse

An elderly couple from Malaysia travelled some 300 kilometres to visit a cable car attraction, only to discover no such attraction exists.

A hotel worker shared details of the incident in a post on Threads in late June, saying the experience had shocked her, reports Singapore’s Straits Times. The couple arrived at the local hotel to check in and asked the staff member if she had ridden the cable car at Kuak Hulu. “I thought they were just kidding,” she wrote. 

When she told them there was nothing to see in Kuak Hulu, a quiet village in Gerik, the couple did not believe her. They had seen a nearly three-minute-long video in which a journalist from “TV Rakyat” introduced the attraction and interviewed visitors, including tourists from Thailand. But all of it – the attraction, the journalist, the interviews – was fake, generated by artificial intelligence (AI).

“I was so shocked,” the hotel worker wrote. She tried to explain to the elderly woman that the video was not real. The woman questioned why anyone would want to lie, pointing to the “reporter” in the video, and insisting that she had not seen any comments under the video indicating that it was fake.

There are subtle clues in the video that point to it being AI-generated. We see the journalist interviewing visitors, as well as scenes of people queuing up at the “Kuak SkyRide” ticket counter. Then it cuts to scenes of the cable car passing over an expanse of trees, near a stream and a group of deer grazing, before stopping near the foot of Baling Mountain in the neighbouring state of Kedah.

As Straits Times noted, the biggest clue comes at the end of the video when an old woman in the background does a handstand. As she flips back to her feet, her legs and body merge into a misshapen blob, before she lands on her feet looking completely normal.

[See more: China is making labels compulsory for all AI-generated content]

Such a bizarre visual would be a dead giveaway to anyone paying attention, but with AI-generated videos advancing quickly, most people don’t realise they need to actively work to separate real from fake. Obvious clues were a common sight in early AI-generated videos, where even the most realistic clips quickly devolved into bizarre surrealist vignettes. Dialogue was another weakness of these early videos, so robotic and obviously unreal that most clips included none.

The AI-generated cable car video shows how the technology is advancing, however. Even knowing the whole thing was fake, the elderly woman wanted to sue the “journalist” in the clip, prompting the hotel worker to remind her that the woman in the video did not exist.

[See more: AI generated bogus attractions and fictional cuisine for a Japanese tourism website]

Ahmad Salimi Md Ali, the acting police chief in Baling, told the media that he had not received any official complaints as of 3 July, three days after the post. He reiterated that there was no such cable car project in the district, and advised the public to verify what they see in viral social media posts.

“In this era of AI-generated media, misleading materials can spread easily and cause confusion,” he said. Local police across the country echoed the plea for caution and encouraged verification. 

This is far from the first time travellers have been fooled by AI-generated content, with everything from a Japanese tourism website rife with ‘hallucinations’ (the term for fabricated or false information from AI) to fake travel photos and too-good-to-be-true vacation rentals bamboozling the public.
