Want to really annoy someone who relies on Google Maps for satellite navigation? Researchers have come up with a novel way of stealthily sending people in the wrong direction, using $250 of equipment that can spoof GPS signals and switch in “ghost” maps that appear to be the real thing but are in fact a kind of digital illusion.
The researchers—from Virginia Tech, the University of Electronic Science and Technology of China, and Microsoft Research—tested out their attacks at midnight in Chengdu, China, riding around in a Ford Escape, guided by Google’s navigation software running on two different phones, a Xiaomi with Android 8.0 and a Huawei on Android 6.0. Their hacks used an algorithm that searched for map layouts that look similar but aren’t the same as the real ones. It’s then possible to switch in a “ghost location” to replace the legitimate place the driver wants to get to without them noticing, according to the researchers’ paper.
Why bother with the ghost map? To keep the driver truly unaware they’re going the wrong way. In a typical GPS spoofing attack, the hacker forces the software to connect to their own equipment rather than the legitimate satellite systems. The hacker can then start sending false GPS data. But any sensible driver would be able to determine something was wrong if the map suddenly looked very different. For instance, a driver might see a straight road where the spoofed GPS shows a crossroads. So for a truly stealthy attack, a replica map is required.
The algorithm searches for matching maps using a dataset of 600 taxi trips taken across Manhattan and Boston, acquired from the NYC Taxi and Limousine Commission (TLC) and the Boston taxi trace dataset used by the MIT Challenge. “On average, our algorithm identified 1547 potential attacking routes for each target trip for the attacker to choose from,” the researchers claimed in their paper.
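The core idea is geometric: a spoofed route only stays believable if its shape (the sequence of turns and roughly matching segment lengths) mirrors the road the victim is actually driving. The paper's actual algorithm is not public in this article, so the sketch below is purely illustrative; the function name, route encoding, and tolerance are assumptions, not details from the research.

```python
def routes_match(real, ghost, length_tol=0.2):
    """Illustrative check: two routes are plausible substitutes if they
    share the same turn pattern ('L'eft, 'R'ight, 'S'traight) and each
    segment length agrees within a fractional tolerance.
    Not the researchers' algorithm; a toy sketch of the matching idea."""
    if len(real) != len(ghost):
        return False
    for (turn_a, len_a), (turn_b, len_b) in zip(real, ghost):
        if turn_a != turn_b:
            return False  # turn pattern differs; the driver would notice
        if abs(len_a - len_b) > length_tol * len_a:
            return False  # segment lengths diverge too much
    return True

# A route encoded as (turn, segment_length_in_meters) pairs.
real_route  = [('S', 300), ('L', 120), ('R', 450)]
ghost_route = [('S', 280), ('L', 130), ('R', 470)]  # similar shape elsewhere on the map
mismatch    = [('S', 300), ('R', 120), ('R', 450)]  # different turn pattern

print(routes_match(real_route, ghost_route))  # True
print(routes_match(real_route, mismatch))     # False
```

In this toy model, searching a city's road graph for segments passing such a check against the victim's planned route would yield the kind of “ghost” candidates the paper counts; the large number of matches the researchers report (over 1,500 per trip on average) suggests urban grids offer many geometrically similar stretches.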
In the first of two real-world attacks, a ghost destination was set to another location on the original route. This meant the driver wasn’t alerted by the “recalculating” voice prompt, even though they were taken 2.1 kilometers away from the original destination. The second attack did trigger a recalculating prompt, but the driver was taken in the opposite direction to where they’d asked to go.
All this was achieved with a portable spoofer assembled from off-the-shelf equipment (a HackRF One software-defined radio, a Raspberry Pi, a portable power source and an antenna) that cost a total of $220 and could easily fit into a backpack. It could be controlled remotely, with the spoofing equipment installed under the car, the academics claimed in a paper due to be presented at the 2018 USENIX Security Symposium taking place in Baltimore this August.
There’s still the possibility a user would notice road names or other landmarks were different. But in tests on a driving simulator, where 40 participants in the study were asked to motor around a virtual world, 38 were still tricked into heading to the wrong destinations. Kexiong Zeng, a researcher from Virginia Tech, told Forbes the attacks were primarily aimed at people who didn’t know the area in which they were travelling.
A ‘troubling’ hack
The real risk for drivers is the possibility of being diverted and ambushed, said Zeng. He also claimed that his attacks would work on other GPS-based software, including Apple Maps and Pokemon Go.
Alan Woodward, a professor at the University of Surrey in the U.K., said the attack was “troubling both in its subtlety and its apparent effectiveness.”
“We are all becoming so reliant upon what our car driver aids tell us that we’re reaching a stage where we believe what we’re told even if our common sense says otherwise. Hence, if someone can change the information being presented by any driver aid, they can effectively control you. Why hack the computer-controlled steering of a car to take you somewhere when they can make you drive there yourself?
“The very reason we use GPS maps is because we are unfamiliar with an area so you can easily see how this attack, if done subtly, would be effective.”
Such attacks could be prevented with encryption. But it would be incredibly hard to deploy that across the myriad GPS technologies in use across the world, according to Zeng. “If you want to stop this problem in a fundamental way, you have to implement encryption, which requires you to modify the satellites and the GPS hardware and software,” he added. “It’d require a very high modification cost and a very long cycle to implement this, given there are billions of GPS receivers out there. … It’s a pain in the ass.”
Google hadn’t responded to a request for comment at the time of publication.
Tesla’s secret security
Earlier this week, Zeng and his colleagues tried out their hacks on a Tesla Model S from 2014. They wanted to see if they could manipulate the vehicle’s navigation system using the same techniques. But, thanks to a piece of tech used by Tesla, they failed.
“We tried to take over its navigation system by overpowering the GPS signals but were not able to manipulate the location as we want. More specifically, Tesla is using an advanced u-blox navigation chip, which implements some anti-spoofing function,” Zeng told me.
“Theoretically, this kind of defense can still be cracked by a more advanced spoofer. Now we are working on improving our spoofer and plan to test it on that specific u-blox chip.”
Zeng gave credit to Tesla for deploying such defensive measures. “Luxury cars come with luxury navigation chips.”