
Elon Musk says Tesla crashes shouldn’t be front-page news because there are more human-driven fatalities. That’s not an accurate comparison.

Breaking down the stats that Elon Musk and his self-driving-car cohorts use to say their vehicles are safer.

Tesla CEO Elon Musk onstage with a Tesla car
VCG via Getty Images

Proponents of self-driving-car technology often tout one statistic: More than 37,000 people have died in automotive-related accidents in each of at least the past two years.

Self-driving cars would help reduce those accidents, these people say.

The logic is simple. The largest share of auto-related deaths is due to drunk driving, speeding, and not wearing a seat belt, and those causes could ostensibly be eliminated by taking the human out of the front seat of the car.

But a high level of human-driving-related deaths doesn’t mean that the current versions of semi-autonomous technology are safer.

In fact, some industry experts say it’s actually less safe to introduce technology that only takes over some of the driving task, since it relies on two imperfect systems: technology that is inherently limited in its capabilities, and the humans who are supposed to supervise it.

Still, some continue to highlight the safety of semi-autonomous tech on the road today by citing that statistic. Earlier this week, Tesla CEO Elon Musk chided the Washington Post for writing about a Tesla crash in which the driver involved said Autopilot — the company’s driver-assist technology — was engaged.

“It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in U.S. auto accidents alone in past year get almost no coverage,” Musk tweeted.

The National Highway Traffic Safety Administration is now investigating that crash, as Reuters first reported. This will be the third Autopilot-related crash NHTSA is investigating this year alone.

Before dissecting the numbers, it’s important to address Musk’s larger point, which is that the media unfairly covers Tesla crashes more than human-driven crashes. By Musk’s own admission, Tesla’s driver-assist technology is still unproven and is being tested — on real humans — so it’s important to track its progress. One way to do that is to tally accidents and fatalities.

That’s especially vital since Musk has said on multiple occasions that Teslas are almost “4x better” than average cars, with only one fatality per 320 million miles in cars equipped with Autopilot. Some question these company-provided statistics. It’s also unclear whether all those miles were driven with Autopilot engaged or merely in cars that come equipped with the driver-assist technology.

(We may know more next quarter, which is when Musk said he will start to publish safety reports.)

By comparison, those approximately 40,000 vehicle deaths in a year happened across the 3.2 trillion vehicle miles that people traveled on public roads in 2016, the most recent year for which a full set of data is available. That’s about one death per 80 million miles driven.
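The arithmetic behind those two rates can be checked in a few lines. This is a back-of-the-envelope sketch using only the figures cited in this article; the variable names are our own, and the comparison inherits all the caveats discussed below.

```python
# Back-of-the-envelope check of the per-mile fatality rates cited above.
# All input figures come from the article; nothing here is independent data.

US_FATALITIES = 40_000   # approximate US road deaths in a year
US_MILES = 3.2e12        # vehicle miles traveled on public roads in 2016

# Musk's claim: one fatality per 320 million miles in Autopilot-equipped cars
TESLA_CLAIM_MILES_PER_DEATH = 320e6

us_miles_per_death = US_MILES / US_FATALITIES        # one death per 80 million miles
ratio = TESLA_CLAIM_MILES_PER_DEATH / us_miles_per_death

print(f"US average: one death per {us_miles_per_death / 1e6:.0f} million miles")
print(f"Tesla's claimed figure is {ratio:.0f}x the national miles-per-death average")
```

The division is where the “4x better” talking point comes from, but, as the next paragraphs explain, the numerators and denominators being compared are not measuring the same thing.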

But that isn’t an apples-to-apples comparison. Manually driven vehicles operate on all types of roads, not just on the highways where Autopilot is meant to be used, and they have logged millions more miles than Teslas, so it stands to reason they have more opportunities to get into accidents.

On top of that, the fatality rate that NHTSA puts out every year includes driver deaths, pedestrian deaths, motorcycle deaths and bicycle deaths. Tesla’s figure includes only known driver and pedestrian fatalities.

As of November 2016, Tesla’s fleet of vehicles on the road had driven 1.3 billion miles using Autopilot. While we don’t have updated numbers yet, Musk said on the company’s most recent earnings call that that number is steadily increasing and makes up one-third of all highway driving in Tesla vehicles.

The company also claims that NHTSA found cars with Autopilot crashed 40 percent less often than Tesla cars without the technology.

But even these numbers can be misleading. The agency itself said that it did not test the safety of Autopilot during a 2016 investigation. It just compared the crash rates of cars that had Autopilot installed to those that didn’t; it did not assess whether Autopilot was engaged during those miles driven.

Right now there is no definitive means of quantifying how safe autonomous technology truly is, or how much safer than a human driver a robot driver needs to be for it to be ready to hit public roads. One study conducted at the University of Michigan says that in order to be 80 percent confident that self-driving tech is 90 percent safer than human-driven cars, test vehicles need to be driven 11 billion miles.
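A simplified version of that calculation shows why the required mileage is so enormous. This sketch is not the study’s actual methodology: it assumes fatalities follow a Poisson process, uses the roughly one-death-per-80-million-miles national rate derived above, and considers only the most optimistic case, a test fleet that drives with zero fatalities. All names and thresholds here are illustrative.

```python
import math

# Simplified Poisson sketch (NOT the cited study's method): how many
# fatality-free miles must a test fleet log before we can be C-confident
# that its true fatality rate is below a target rate?

HUMAN_MILES_PER_DEATH = 80e6                      # derived from national figures
TARGET_RATE = 0.1 / HUMAN_MILES_PER_DEATH         # "90 percent safer" than humans
CONFIDENCE = 0.80                                 # "80 percent confident"

# Under a Poisson model, P(zero deaths in m miles | rate r) = exp(-r * m).
# Require that probability to be at most 1 - C before accepting the claim.
miles_needed = -math.log(1 - CONFIDENCE) / TARGET_RATE

print(f"~{miles_needed / 1e9:.1f} billion fatality-free miles needed")
```

Even this best-case scenario demands more than a billion fatality-free miles; the study’s stricter requirement, statistically demonstrating an improvement rather than merely observing zero crashes, pushes the figure to 11 billion.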

No autonomous vehicle company has yet driven that many miles, in the real world or in simulation.

This article originally appeared on Recode.net.
