That’s according to a study released Thursday by the University of Michigan’s Transportation Research Institute. The report compiled all the publicly available data on crashes involving self-driving cars and compared it with crash statistics for conventional vehicles from the National Highway Traffic Safety Administration.
After adjusting for under-reporting of accidents, researchers Brandon Schoettle and Michael Sivak found that:
- Autonomous vehicles (AVs) got into more crashes overall: 9.1 crashes per million miles driven, compared with 4.1 per million miles for conventional vehicles.
- AVs had a higher rate of injury per crash: 0.36 injuries per crash, compared with 0.25 for conventional vehicles.
- AVs weren’t responsible for any of the crashes they were involved in.
- Most AV crashes were low-speed, and the injuries involved were minor compared with those sustained in conventional vehicle crashes.
“Having a crash, of course, isn’t desirable, but the increased crash rate with the vehicles at this point doesn’t imply that they’re less safe,” said Schoettle, the lead author.
He compared the early trends to those seen at intersections before and after traffic circles are installed. Though the number of crashes at an intersection can increase after a traffic circle goes in, they are typically less severe because few are head-on or angled collisions. Damage to property alone is preferable to injuries to people.
For instance, Google's crash reports describe situations where drivers felt whiplash, but nothing more serious than that. Some crashes were so minor they didn't even involve damage to the vehicle, such as an incident where a human driver passed a Google AV on the right and brushed a sensor.
Some, including a representative of Consumer Watchdog, have posited that autonomous vehicles might be accident-prone because they behave differently from human drivers. Schoettle said the data in the study isn’t enough to weigh in on that question, but it's one possible explanation for why the vehicles appear to be involved in more crashes even though the self-driving software hasn’t yet been at fault.
In fact, Schoettle said, anecdotal evidence suggests that people in Mountain View, Calif. — the main testing ground for Google — act differently around the driverless cars.
“They see the vehicles and they recognize them and everyone is of the opinion that you want to get in front of it,” Schoettle said. “You don’t want to be behind it because it might be overly cautious and stop [when you don’t expect it to].”
The authors cautioned readers to take the findings with a few grains of salt. With just 11 self-driving car crashes on the record and almost all the miles driven in favorable weather, Schoettle said he and Sivak couldn’t yet rule out the possibility that self-driving cars actually get into fewer crashes than conventional vehicles.
“The margin of error, the confidence intervals for the self-driving vehicles, are still pretty large at this point,” he said. “The real number will fall between those two extremes, the margin of error.”
But with Google — the company with the largest self-driving vehicle fleet, at least in terms of those cars reported to the California Department of Motor Vehicles — surpassing 1 million miles driven in autonomous mode earlier this year, Schoettle said he felt that research could finally begin into the cars’ safety.
The bottom line, he said, is that there needs to be more data.
“As is probably the case for the conventional vehicles, a significant increase in mileage … if we could get a situation where there’s 50 million or 100 million [miles driven in autonomous mode], that would really strengthen the statistical power of the self-driving data,” he said.