
How a little electrical tape can trick a Tesla into speeding

Security researchers found an unsettling vulnerability in Tesla’s intelligent cruise control.

Security researchers discovered a simple road sign hack that will trick Tesla’s intelligent cruise control feature.
Jonathan Nackstrand/AFP via Getty Images
Rebecca Heilweil
Rebecca Heilweil covered emerging technology, artificial intelligence, and the supply chain.

McAfee researchers recently tricked a Tesla into speeding while the car’s intelligent cruise control feature was engaged. This news signals, yet again, that completely safe, fully autonomous cars have still not arrived, and it suggests that they face new types of vulnerabilities.

Over the course of 18 months, the researchers, whose report was published today, explored how they could get a Tesla to misread a speed limit sign by interfering with the vehicle’s ability to see. They placed visual distractions, like stickers and tape, on a 35-miles-per-hour speed limit sign to trick the car’s camera system into misreading it.

A 35-mile-per-hour speed limit sign with a piece of black electrical tape on the 3, making the middle part of the 3 just a bit longer so it resembles an 8.
Here’s the sticker that confused the Tesla.

While the researchers successfully spoofed the camera’s reading in several different ways, they found that just a 2-inch piece of black electrical tape across the middle of the 3 in a 35 MPH speed limit sign could cause the system to read the sign as an 85 MPH sign. In a live test with a 2016 Model S 70 using an EyeQ3 camera from Mobileye, they found that when the car’s Traffic-Aware Cruise Control (TACC) was activated, the vehicle’s system would attempt to determine the current speed limit with help from the camera.

That’s when those visual distractions — that small piece of black tape, in one case — could cause the car to misread the speed limit and head toward the 85 MPH speed. (The researchers note that they applied the brakes before the car reached that speed and that no one was hurt during testing.)
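The underlying idea can be sketched with a toy example. This is not the neural pipeline inside the EyeQ3 camera (which is proprietary); it is a deliberately simple nearest-template "digit reader" on tiny bitmaps, invented here to show how a patch covering just two of fifteen pixels can flip a 3 into an 8:

```python
# Toy illustration only (NOT the actual Mobileye pipeline): classify a digit
# bitmap by finding the template with the fewest mismatched pixels.

TEMPLATES = {
    "3": ["###", "..#", "###", "..#", "###"],
    "8": ["###", "#.#", "###", "#.#", "###"],
}

def distance(a, b):
    """Count mismatched pixels between two 5x3 bitmaps."""
    return sum(ca != cb for ra, rb in zip(a, b) for ca, cb in zip(ra, rb))

def classify(bitmap):
    """Return the label of the closest template."""
    return min(TEMPLATES, key=lambda label: distance(bitmap, TEMPLATES[label]))

clean_sign = ["###", "..#", "###", "..#", "###"]   # a clean "3"
tampered   = ["###", "#.#", "###", "#.#", "###"]   # same "3" with two "tape" pixels added

print(classify(clean_sign))  # 3
print(classify(tampered))    # 8
```

The toy patch fills in the left-edge pixels rather than extending the middle bar as the real attack did, but the effect is the same: a tiny, targeted change moves the input closer to the wrong class than to the right one.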

“This system is completely proprietary (i.e. Black Box), we are unable to specify exactly why the order of operations is essential,” Steve Povolny, head of McAfee Advanced Threat Research, told Recode in an email. He also cautioned that the “real-world implications of this research are simplistic to recreate but very unlikely to cause real harm given a driver is behind the wheel at all times and will likely intervene.” Povolny added that cybercriminals have yet to publicly attempt to hack self-driving cars, although plenty of people are worried about the possibility.

Still, the research demonstrates how self-driving cars, or cars with some autonomous abilities, can fall short. And it’s not the first time researchers have tricked a car like this. Just last April, similar stickers were used to get a Tesla to switch lanes improperly.

Tesla didn’t respond to a request for comment, but a spokesperson from Mobileye argued that the stickers and tape used by McAfee could confuse the human eye, too, and therefore didn’t qualify as an “adversarial attack.”

“Traffic sign fonts are determined by regulators, and so advanced driver assistance systems (ADAS) are primarily focused on other more challenging use cases, and this system in particular was designed to support human drivers — not autonomous driving,” said the spokesperson. “Autonomous vehicle technology will not rely on sensing alone, but will also be supported by various other technologies and data, such as crowdsourced mapping, to ensure the reliability of the information received from the camera sensor and offer more robust redundancies and safety.”
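The kind of redundancy the Mobileye spokesperson describes can be sketched as a simple cross-check. This is a hypothetical sanity filter, not Tesla’s or Mobileye’s actual logic: the camera’s reading is trusted only when it roughly agrees with an independent, map-based speed limit.

```python
# Hypothetical sanity check (not any automaker's real implementation):
# cross-reference the camera's speed-limit reading against map data and
# reject implausible disagreements.

def plausible_speed_limit(camera_mph: int, map_mph: int, tolerance_mph: int = 10) -> int:
    """Trust the camera reading only when it is close to the mapped limit;
    otherwise fall back to the map value."""
    if abs(camera_mph - map_mph) <= tolerance_mph:
        return camera_mph
    return map_mph

print(plausible_speed_limit(35, 35))  # camera and map agree -> 35
print(plausible_speed_limit(85, 35))  # tampered sign read as 85 -> falls back to 35
```

Under this scheme, the taped sign in the McAfee test would be overruled by the mapped 35 MPH limit rather than sending the car toward 85 MPH.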

The researchers also said that they studied a 2020 Tesla vehicle with a newer version of the Mobileye camera and did not observe the same problem, though they noted that testing was “very limited.” The study says that only Teslas produced from 2014 to 2016 and equipped with the EyeQ3 camera showed the vulnerability. The researchers also noted that neither Tesla nor Mobileye had expressed any “current plans” to address the vulnerability in their existing hardware.

But this vulnerability isn’t about Tesla. It’s about the challenges raised by self-driving car technology and the growing industry that aims to make roads safer for all of us — but also requires strict testing and regulation. After all, time has shown that teaching a computer to drive is not as easy as teaching a human.

As Future Perfect’s Kelsey Piper has explained:

Following a list of rules of the road isn’t enough to drive as well as a human does, because we do things like make eye contact with others to confirm who has the right of way, react to weather conditions, and otherwise make judgment calls that are difficult to encode in hard-and-fast rules.

Such a judgment call might be spotting a weird-looking speed-limit sign and noticing if the car suddenly went more than double the speed limit. As Povolny told Recode, the flaw analyzed by McAfee could be just one of many issues that a self-driving car encounters in both the “digital” and “physical” worlds, including “classic software flaws, to networking issues, configuration bugs, hardware vulnerabilities, [and] machine learning weaknesses.”

So that signals a long road ahead for self-driving cars. After all, the Teslas involved in the McAfee study still require a human to be in the car and alert, though as several Autopilot accidents have shown, plenty of Tesla drivers overestimate the technology. Let’s hope when fully autonomous vehicles are finally on the highways, they won’t be so easily distracted.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
