We know the Russians are using GPS spoofing in Crimea, Ukraine. Todd Humphreys explains how it is now possible even for script kiddies.
We also know that u-blox, which builds the receivers inside many of the GPS devices we use, is aware of the security problem now on the open market: a recent firmware update contains a spoofing detector. Hopefully this will help efforts to increase security.
Todd does an excellent job pointing out that all his work is unclassified, but the FBI, recognizing the significance and potential of his research, has stayed in touch. Yet in his last paragraph he avoids addressing the elephant in the room, leaving a gaping hole in the article. He closes with: "Maybe, after the GitHub posting and the DEF CON presentation, some important customer finally did."
My guess is that the CIA or the NSA visited u-blox and paid for a spoofing detector to be installed in the firmware update.
The next logical step is for the Russians to target u-blox and its software, and to defeat the new spoofing detection; the Chinese will follow suit. Swiss counterintelligence now needs to get involved (u-blox is a Swiss corporation): we have a potential global security problem looming.
The US Department of Defense has recognized this possibility for decades; I know it was the subject of much consternation, discussion, and argument back in the mid-90s, and many of my jobs hinged on the security of GPS devices. More than 25 years later, the danger has only increased. I can only hope we've fully addressed the dilemma.
By Todd Humphreys
The University of Texas at Austin, http://radionavlab.ae.utexas.edu/
CTN Issue: February 2016
Script Kiddies: GPS Spoofing is now only a Download Away
For the fourth time in as many years, FBI agents visited my office last August. Each time it’s the same line of questioning: “Is the code safe?” “Have you been sticking to the security protocol we agreed on?” “Has anyone suspicious contacted you, requesting details about how your code works or wanting to join your research group?” Each time I tell them that we’re doing our best to safeguard the code, following which I remind them that the ethos of open inquiry at the University of Texas can’t tolerate the high level of security they wish we had.
In any case, after what I told them last August, they haven’t been back.
I arrived at the University of Texas in 2009 as a fresh assistant professor with lots of big plans. My bright new students and I immediately got started on a research effort in navigation security. We had a lot of questions: What happens when someone sends counterfeit GPS signals into an unsuspecting receiver—so-called GPS spoofing? Can the false signals be detected from within the receiver? In other words, is there any way to tell them apart from genuine GPS signals received from overhead satellites? And if you’ve got a mixture of false and genuine signals coming through your antenna, how can you prevent the false signals from corrupting your receiver’s position (or time) once you’ve detected them? Can other sensors important for drones or self-driving cars also be spoofed or jammed? Automotive radar? Lidar?
We heard rumors that some of these questions had already been examined in the classified world, but we saw no evidence that this classified work had made navigation and timing any more secure for non-military users of GPS. So we forged ahead, confident that a better understanding of GPS vulnerability and how to patch it would be important to the billions of GPS users worldwide.
Within a few years we had developed the prototype GPS spoofing device I had begun work on while in graduate school at Cornell University into a formidable lying machine: it could fool any commercial GPS receiver into reporting the wrong position or time. It did this by listening carefully to signals from overhead GPS satellites, and then generating counterfeit replica signals that were nearly perfectly aligned with the authentic ones. This potent receiver-spoofer device was the first of its kind—including, I believe, in the classified world, though I can’t be entirely sure. It was also relatively cheap: the parts, almost all of them off-the-shelf, amounted to about $2,000. One could also use an off-the-shelf GPS signal simulator for GPS spoofing, but these were a lot more expensive (nearly half a million dollars) and a lot less subtle in the attack. Whereas our box could slip fake GPS signals right underneath the true ones, then gradually raise their power to effect a stealthy and seamless takeover of the victim receiver’s tracking loops, a commercial GPS signal simulator didn’t have a way to align the fake and true signals. Its best strategy would be to first jam the target receiver and then overpower it with false signals in hopes it would grab hold of the more powerful fake signals instead of the true ones upon re-acquisition.
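The takeover dynamic can be illustrated with a toy model (my own sketch, not code from the author's spoofer): the victim's correlator is modeled as tracking the peak of an idealized triangular code-correlation function within a narrow search window, while an initially aligned spoofing signal slowly ramps its amplitude and, once dominant, drags its code phase away from the truth.

```python
import numpy as np

def tri(x):
    # Idealized C/A-code autocorrelation: a triangle that falls
    # to zero one chip away from perfect alignment.
    return np.maximum(0.0, 1.0 - np.abs(x))

def takeover(epochs=2000, true_amp=1.0, ramp=0.02, drag=0.01):
    est = 0.0        # receiver's code-phase estimate (chips); truth is 0
    spoof_tau = 0.0  # spoofer's code phase, initially aligned with truth
    spoof_amp = 0.0
    for _ in range(epochs):
        spoof_amp = min(3.0, spoof_amp + ramp)  # gradual power ramp
        if spoof_amp > 1.5 * true_amp:          # once clearly dominant...
            spoof_tau += drag                   # ...drag the victim off
        # The receiver only searches a narrow window around its last
        # estimate, which is why a slow, aligned takeover goes unnoticed.
        taus = est + np.linspace(-0.5, 0.5, 201)
        corr = true_amp * tri(taus) + spoof_amp * tri(taus - spoof_tau)
        est = taus[np.argmax(corr)]
    return est, spoof_tau
```

Running `takeover()` shows the receiver's estimate ending many chips away from the true code phase of zero, having followed the spoofer's peak the whole way; an abrupt overpowering attack, by contrast, risks a visible loss of lock.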
The secret to our device being both lower cost and more sophisticated (for spoofing) than an off-the-shelf signal simulator was that it was built as a software-defined radio: all signal processing downstream of its analog-to-digital converter was done on a general-purpose processor. The advantages of a software-defined receiver-spoofer were huge: we had complete control over the signals we tracked and transmitted, and fixing problems that would arise was as easy as debugging our C++ code.
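As a concrete taste of what "software-defined" means here, the sketch below (my own illustration, not the author's code) generates the GPS L1 C/A spreading code for a given satellite using the two 10-stage shift registers defined in the public GPS interface specification (IS-GPS-200); in a software-defined radio, everything from this code generation to correlation and tracking is ordinary code running on a general-purpose processor.

```python
def ca_code(tap1, tap2, length=1023):
    """Generate a GPS L1 C/A (Gold) code from the PRN-specific G2 taps.

    The C/A code is the XOR of two 10-stage linear-feedback shift
    registers: G1 (feedback taps 3 and 10) and G2 (feedback taps
    2, 3, 6, 8, 9, 10), with the G2 output drawn from the pair of
    stages that selects the satellite (e.g. stages 2 and 6 for PRN 1).
    """
    g1 = [1] * 10
    g2 = [1] * 10
    chips = []
    for _ in range(length):
        chips.append(g1[9] ^ g2[tap1 - 1] ^ g2[tap2 - 1])
        new1 = g1[2] ^ g1[9]                                  # G1 feedback
        new2 = g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]  # G2 feedback
        g1 = [new1] + g1[:9]
        g2 = [new2] + g2[:9]
    return chips
```

For PRN 1 (G2 taps 2 and 6) the first ten chips come out as 1100100000, matching the octal value 1440 listed in the interface specification.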
But a software-defined spoofer also has a significant downside: if the code were ever leaked to the public, then anyone who had read our papers, knew the basics of programming, and could wire together a few circuit boards could replicate our box. It wouldn’t take years and a team of PhDs like the first time. Writing in GPS World magazine back in 2005, Logan Scott, a prominent security expert in the GPS community, had actually warned of this very scenario. He conceded that building a functional GPS spoofer would be no easy task, but that downloading one and running it on commercial hardware would be well within the capability of “script kiddies”—hackers who don’t have the talent or experience to write a computer virus by themselves, but who can easily download pre-packaged viruses created by others and launch them effectively. A software-defined GPS spoofer, argued Scott, was a lot like a computer virus. In the wrong hands, it could do serious damage to our economy, disrupting travel by fooling GPS-guided aircraft or ships, communications by de-synchronizing cellular base stations, even the power grid by manipulating the timing of measurements assumed to be synchronized. And Logan’s warning of a “gathering threat” came long before we realized that GPS-guided drones and autonomous cars would soon be a reality.
It was the worry that script kiddies would get hold of our code that kept the FBI returning to my office.
But last August I had to give them the bad news. Our code was safe, so far as I knew, but that didn’t matter anymore. Just two months prior to their visit, a Japanese researcher named Takuji Ebinuma had posted a fully-functional software-defined GPS signal simulator to GitHub. No doubt Ebinuma’s motives were good: the GPS community has long been clamoring for a signal simulator that doesn’t cost hundreds of thousands of dollars. With a simulator, receiver testing is much more efficient and repeatable. But I doubt that before uploading his code to public-access GitHub, Ebinuma paused to think about the security aspects of what he was doing. He was uploading a spoofer. He was handing his code to the script kiddies.
Probably not by coincidence, the hacker conference DEF CON 23, held in August, 2015, featured a presentation by a Chinese hacker showing off her working software-defined GPS spoofer, whose software component was a hodgepodge of simulator code downloaded from public sites on the Internet and the hacker’s own work. Her presentation, now available on the DEF CON website, details how someone with no prior experience in GPS signal processing can put together a spoofer in a matter of months. She found several low-cost off-the-shelf hardware platforms that were capable hosts for her software, turning her line-by-line instructions into fake but convincing GPS signals. Like my research group did years ago, she demonstrated her ability to arbitrarily dictate the output position of everyday GPS receivers. But her presentation was different from everything we had done: we showed that spoofing was possible; she showed that anyone can now do it.
I’ve since heard from a colleague at the University of Bath in the UK. Out of curiosity, he downloaded Ebinuma’s code from GitHub and gave it a try. It’s not elegant, he tells me, but it does the job. A GPS receiver he had in the lab, the popular ublox receiver found in millions of devices across the globe, from drones to ankle monitors to the Google[x] self-driving car, happily reported whatever position and time the spoofer dictated.
Besides my not having to worry about more visits from the FBI, there is perhaps one silver lining to the dark cloud that is the first public release of spoofing code. Just weeks ago, ublox released a firmware update for their most recent, and very capable, M8 GPS/GNSS receiver. The update enables tracking of signals from the new Galileo constellation, Europe’s answer to the US GPS. This much was expected. But the update also includes a surprising security enhancement: a spoofing detector. Just six months ago a ublox vice president had told me and others gathered for a panel on GPS security that, while they recognized the threat of GPS spoofing was real, and the consequences could be serious, their customers were just not asking for any anti-spoofing features. Maybe, after the GitHub posting and the DEF CON presentation, some important customer finally did.
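One plausible class of spoofing check (my own illustration; the receiver vendor has not published how its detector actually works) watches for a sudden, uniform rise in carrier-to-noise density across every tracked satellite: a single-antenna spoofer tends to raise power on all channels at once, whereas natural fades and multipath affect satellites unevenly.

```python
def spoofing_alerts(epochs, jump_db=5.0):
    """Flag epochs where the mean C/N0 across all tracked satellites
    jumps sharply relative to the previous epoch.

    epochs: list of per-epoch lists of per-satellite C/N0 values (dB-Hz).
    Returns the indices of suspicious epochs. The 5 dB threshold is an
    arbitrary choice for illustration, not a calibrated value.
    """
    alerts = []
    prev_mean = None
    for i, cn0s in enumerate(epochs):
        mean = sum(cn0s) / len(cn0s)
        if prev_mean is not None and mean - prev_mean > jump_db:
            alerts.append(i)
        prev_mean = mean
    return alerts
```

A real detector would combine several such cues (received power, correlation-peak distortion, clock-drift anomalies), but even this crude heuristic catches the "overpower everything at once" attack that a simulator-based spoofer must mount.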
Editor-in-Chief: Alan Gatherer (firstname.lastname@example.org)