Self-Driving Cars Have Been in 4 Accidents in California Since September [Update: Google Responds]
A handful of the self-driving cars currently on the road in California have run into trouble—albeit allegedly not caused by the cars themselves. An investigation by The Associated Press
has discovered that four of the 48 autonomous vehicles licensed for operation on public roads there have already gotten into some kind of accident.
All four of the scrapes reportedly happened at less than 10 mph. In two of them the car was in autonomous mode; in the other two, the human required by law to be behind the wheel was in control.
There aren’t many additional details available at this time. AP
got the information in part because accidents since September have to be reported to the California DMV, but that agency wouldn’t give out any more information. From the AP report:

Three involved Lexus SUVs that Google Inc. outfitted with sensors and computing power in its aggressive effort to develop “autonomous driving,” a goal the tech giant shares with traditional automakers. The parts supplier Delphi Automotive had the other accident with one of its two test vehicles. Google and Delphi said their cars were not at fault in any accidents, which the companies said were minor.
Delphi says its car—an Audi SQ5, perhaps the one it recently used to drive autonomously from San Francisco to New York City—was hit by another, non-autonomous vehicle while waiting to make a left turn. Google wouldn’t say what happened in its three accidents, but has acknowledged before that its fleet of street-legal autonomous cars has gotten into some minor scrapes and fender benders. None of the other automakers licensed for autonomous testing in California, including Audi and Mercedes, have reported accidents.
While questions about who’s at fault when a self-driving car gets itself into an accident could be answered sooner rather than later, we’re most concerned with Google’s secrecy concerning incidents involving its vehicles. After all, it’s going to take a lot of convincing for many people to give up control behind the wheel, and honesty about what’s going right or wrong with such projects seems like a logical step to take to that end.
UPDATE, May 12: Google didn’t waste any time in trying to change the conversation about its self-driving cars. Not long after the AP
released its report about the accidents, Chris Urmson, the director of Google’s self-driving car program, took to Backchannel with a post elucidating the situations Google cars encounter on the road.

Urmson also doubled down on the claim that the self-driving cars were not at fault in any of the accidents in which they’ve been involved, including those that predate the AP report (the emphasis is Google’s):

Over the 6 years since we started the project, we’ve been involved in 11 minor accidents (light damage, no injuries) during those 1.7 million miles of autonomous and manual driving with our safety drivers behind the wheel, and not once was the self-driving car the cause of the accident.

The AP report provides some perspective on the crash rate, stating that “the national rate for reported ‘property-damage-only crashes’ is about 0.3 per 100,000 miles driven, according to the National Highway Traffic Safety Administration. Google’s 11 accidents over 1.7 million miles would work out to 0.6 per 100,000, but as company officials noted, as many as five million minor accidents are not reported to authorities each year—so it is hard to gauge how typical this is.”

Urmson cites NHTSA data blaming driver error for 94 percent of automotive crashes, and also offers anecdotes of unwise decisions that human drivers have made in the vicinity of Google cars. He demonstrates this visually via graphics showing how the cars’ computers see the road in general—you can check them out, including a video, here—as well as when a human cuts off the vehicle while trying to turn right from the left lane.
Urmson has a point. One of the strongest arguments for vehicle autonomy is safety: there would be fewer accidents if emotional, distracted, unpredictable humans weren’t behind the wheel. Yet as we alluded to in the original version of this story and as our pal Damon at Jalopnik
expanded on, it’s not enough for Google to build the car of the future. It also needs to win the public-relations battle to sell Americans on the idea of self-driving cars. It looks a little defensive or fishy—we won’t go nearly so far as to say evil—for the company to finally release a flood of data only when its hand has been forced.
A version of this story originally appeared on popularmechanics.com via AP; it was published here first on May 11 at 3:54 p.m.