Who's really in control?
As cars become more advanced, they also become more technically complicated. Instead of being purely mechanical machines, modern and future autonomous cars are very much computers on wheels. And like any computer, smartphone, or tablet, they’re prone to being hacked. The question is, what can be done about it? Simple: cybersecurity for cars.
Regulus Cyber, a company that specializes in sensor security and malicious attack prevention, figured it’d be interesting to see whether it could hack a new Tesla Model 3 while the car was using Navigate on Autopilot. As first reported by Bloomberg, Regulus Cyber bought a $400 GPS spoofer and a $150 jammer antenna, then mounted a small spoofing antenna on the EV’s roof to simulate an outside attack.
The goal was to see whether the Model 3 could protect itself against the spoofing, since this is the likely method an external hacker would use to take control of the vehicle. The small roof-mounted antenna also kept the spoofed signal from reaching nearby cars or other GNSS receivers. The testers then took to the highway at 59 mph with Navigate on Autopilot engaged. The route would direct them to a nearby town and require the car to make an autonomous exit. Here’s where things get interesting. The “hacker” team sent fake satellite coordinates to the Model 3’s GPS receiver, which caused the car to believe the exit was just 500 feet away. The real exit was over a mile away. The Model 3 then slowed to just 15 mph and turned into an emergency pit stop instead of the exit.
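To see why a spoofed fix is so disruptive, consider how a navigation stack decides when an exit is coming up: it compares the reported GPS position against the exit’s map coordinates. Here is a minimal sketch of that distance calculation using the standard haversine formula. The coordinates are entirely made up for illustration; only the scale of the discrepancy (roughly a mile versus roughly 500 feet, as in the test) matches the article.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical coordinates: the exit is really over a mile (~1,700 m) ahead,
# but the spoofed fix places the car only ~145 m (under 500 ft) from it.
exit_pos    = (32.1000, 34.8000)   # made-up exit location
true_pos    = (32.0847, 34.8000)   # real vehicle position
spoofed_pos = (32.0987, 34.8000)   # position reported by the spoofed receiver

print(round(haversine_m(*true_pos, *exit_pos)))     # → 1701 (meters, real distance)
print(round(haversine_m(*spoofed_pos, *exit_pos)))  # → 145 (meters, spoofed distance)
```

If the car trusts the spoofed distance, it starts its exit maneuver a mile early, which is essentially what the testers observed.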
The driver did not have his hands on the steering wheel at the time, and by the time he regained control, it was too late: the Model 3 could not maneuver safely back onto the highway. Because both Navigate and Autopilot use GPS and Google map data to determine things like lane location and when to exit, the Model 3 is susceptible to GPS spoofing attacks.
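One common defense against this kind of attack is a plausibility check: reject any GPS fix whose implied movement exceeds what the car’s own wheel-speed sensors could account for. The sketch below is an illustration of that idea, not Tesla’s actual safeguard; the function name, positions, and tolerance are all hypothetical.

```python
def fix_is_plausible(prev_pos_m, new_pos_m, speed_mps, dt_s, tolerance_m=15.0):
    """Reject a GPS fix whose implied jump exceeds what the odometry
    (wheel speed * elapsed time) could plausibly account for."""
    dx = new_pos_m[0] - prev_pos_m[0]
    dy = new_pos_m[1] - prev_pos_m[1]
    jump = (dx * dx + dy * dy) ** 0.5  # straight-line distance between fixes
    return jump <= speed_mps * dt_s + tolerance_m

# At 59 mph (~26.4 m/s), the car covers about 26 m per one-second GPS epoch.
print(fix_is_plausible((0, 0), (25, 0), 26.4, 1.0))    # → True  (ordinary fix)
print(fix_is_plausible((0, 0), (1500, 0), 26.4, 1.0))  # → False (1.5 km teleport)
```

A real receiver would layer more checks on top (signal strength, clock drift, multi-constellation cross-checks), but even this simple gate would flag a fix that suddenly teleports the car a mile down the road.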
In fact, Regulus Cyber claims the car was spoofed several times, causing “extreme deceleration and acceleration, rapid lane changing suggestions, unnecessary signaling, multiple attempts to exit the highway at incorrect locations and extreme driving instability.” None of that sounds good. What about other Teslas? Regulus claims it previously tested a Model S, but the spoofing didn’t affect the driving. Instead, the sedan’s air suspension changed “unexpectedly” because it believed it was driving on surfaces it actually wasn’t. Also not good.
Regulus Cyber reached out to Tesla regarding its findings and was told the following: “Any product or service that uses the public GPS broadcast system can be affected by GPS spoofing, which is why this kind of attack is considered a federal crime.” Tesla also added that the effects of a spoofing attack would be minimal, but that it continues to take steps to “introduce safeguards in the future which we believe will make our products more secure against these kinds of attacks.”