Google’s Self-Driving Car Hit Another Vehicle for the First Time

It’s unclear if Google’s system was at fault.

Since Google’s robot cars have been on the road, they have been involved in 17 accidents. But in none of those incidents was Google’s system to blame — either another car struck Google’s vehicle, or the test driver behind the wheel of the autonomous vehicle was at fault.

Until earlier this month, that is. On Feb. 14, one of Google’s self-driving Lexus SUVs struck a municipal bus in Mountain View, according to documents filed with the California DMV.

According to the report, the Google car was waiting at an intersection to turn right when it encountered several sandbags blocking the lane. When the light turned green, the car moved left to avoid the bags, then struck a public bus coming from behind.

Google’s autonomous driving mode was active when the crash occurred (in other incidents, Google’s test drivers had switched to manual mode). The bus was traveling at 15 miles per hour and Google’s car was going two miles per hour. According to the report, the test driver “saw the bus approaching in the left side mirror but believed the bus would stop or slow to allow the Google [autonomous vehicle] to continue.” No one was injured.

Google’s self-driving car unit has repeatedly stressed that autonomous vehicles are far safer than human-piloted ones. Winning regulatory approval and consumer acceptance of driverless fleets is a key pillar of the unit’s business strategy.

It’s unclear whether Google will ascribe the bus accident to an error in its driving system or simply to the complexity of traffic. Very few of the thorny insurance and policy questions about how to treat robotic systems have been answered.

We reached out to Google for additional comment. Tomorrow is the first of the month, when Google typically puts out its monthly traffic report detailing each incident involving its cars. Google said no accidents were registered in December or January.

Update: Google released a snippet of its February self-driving car report a day early to address the bus crash. The company described the incident as something that happens “every day” on the road, but noted that Google “clearly bear[s] some responsibility.”

Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.

This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements. In this case, we clearly bear some responsibility, because if our car hadn’t moved there wouldn’t have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.

This article originally appeared on Recode.net.
