“The Volvo was in self-driving mode with a human backup driver at the wheel when it struck 49-year-old Elaine Herzberg as she was walking a bicycle outside the lines of a crosswalk in Tempe, police said.
Uber immediately suspended all road-testing of such autos in the Phoenix area, Pittsburgh, San Francisco and Toronto. The ride-sharing company has been testing self-driving vehicles for months as it competes with other technology companies and automakers like Ford and General Motors. ...”
The global coverage of this auto accident has been immense and unprecedented, and the issues raised range from the role of the human backup driver to the reliability of the technology to new state and federal laws governing autonomous vehicles.
A full investigation is underway, and all sides cautioned against jumping to any conclusions without all the facts being examined. Nevertheless, several early articles seemed to defend the autonomous Uber and claim the woman may have suddenly and unexpectedly jumped in front of the car.
However, by later in the week, videos released by the police showed the woman more than halfway across the road, walking slowly with her bicycle.
Here is a sample of some of the headlines involving this topic:
Bloomberg: Uber victim stepped suddenly in front of self-driving car — “Police say a video from the Uber self-driving car that struck and killed a woman on Sunday shows her moving in front of it suddenly, a factor that investigators are likely to focus on as they assess the performance of the technology in the first pedestrian fatality involving an autonomous vehicle.”
The Verge: Uber ‘likely’ not at fault in deadly self-driving car crash, police chief says — “Uber was likely not at fault in the deadly crash of its self-driving vehicle in Arizona on Sunday evening, Tempe Police Chief Sylvia Moir told the San Francisco Chronicle in a startling interview the following day. Her comments have caused a stir in this closely watched investigation, which is being characterized as the first human killed by an autonomous vehicle.”
NY Times: Uber SUV's Autonomous System Should Have Seen Woman — “Two experts say video of a deadly crash involving a self-driving Uber vehicle shows the sport utility vehicle's laser and radar sensors should have spotted a pedestrian, and computers should have braked to avoid the crash.”
The Atlantic: Self-Driving Cars Still Don't Know How to See — “This is the second death in the United States caused by a self-driving car, and it’s believed to be the first to involve a pedestrian. It’s not the first accident this year, nor is this the first time that a self-driving Uber has caused a major vehicle accident in Tempe: In March 2017, a self-driving Uber SUV crashed into two other cars and flipped over on the highway. As the National Transportation Safety Board opens an inquiry into the latest crash, it’s a good time for a critical review of the technical literature of self-driving cars. This literature reveals that autonomous vehicles don’t work as well as their creators might like the public to believe.”
USA Today: Operator of self-driving Uber vehicle that killed pedestrian was a felon — “The operator behind the wheel of a self-driving Uber vehicle that hit and killed a 49-year-old woman Sunday night had served almost four years in an Arizona prison in the early 2000s on an attempted armed robbery conviction.”
WSJ: How Uber’s Self-Driving Car Could Have Missed the Pedestrian — “The roads north of Arizona State University are in many ways ideal for testing self-driving cars, with wide, clearly marked lanes and minimal traffic late at night, when the vehicle’s laser sensors work best.
The optimal conditions make it especially troubling that an Uber Technologies Inc. self-driving car plowed straight into and killed a pedestrian walking across a street here at night, without appearing to brake or veer, according to a video from the vehicle released by police Wednesday.”
Background: Autonomous Vehicle Context Within Technology Trends and Platforms
The Internet of Things (IoT), and more specifically smart cities and smart transportation systems, are among the hottest leading-edge technologies as we head toward 2020.
Within this IoT space, autonomous vehicles (AV) are perhaps the No. 1 area of public interest. (Side note: Gartner places AV within the artificial intelligence (AI) category.) Public- and private-sector involvement (or lack thereof, as the case may be) and global research efforts are all around us. Here are helpful articles on the background and research regarding the technology.
- Pew Research Center: Americans had concerns about self-driving cars before fatal Arizona accident — “Slightly more than half of U.S. adults (54%) said in a Pew Research Center survey conducted in May 2017 that they were somewhat or very worried about the development of driverless vehicles, while 40% said they were at least somewhat enthusiastic about it. A majority of U.S. adults (56%) also said they would not personally want to ride in a driverless car if they had the opportunity, compared with 44% who would.”
Here is an excerpt from one analysis of the crash:
“Why didn’t the automated car avoid the pedestrian? Was there a sensor failure? A faulty algorithm? Why didn’t the human “safety driver” prevent the crash? Was the car at fault? Or, as seems to be implied by coverage that takes pains to point out that she was not on a crosswalk, was the pedestrian really to blame? Talking to The San Francisco Chronicle, Tempe Chief of Police Sylvia Moir noted gravely that “It is dangerous to cross roadways in the evening hour [outside crosswalks] when well-illuminated, managed crosswalks are available.”
As for Uber, Sunday’s fatality clearly hurts their numbers. Their test fleet has logged over two million miles, a figure the company hit in late December. That would extrapolate out to a rate of close to 50 fatal crashes per 100 million miles driven, which, caveats about apples and oranges aside, stands poorly against the figure for human drivers in the U.S. of 1.18 deaths per 100 million miles.”
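To make the excerpt’s extrapolation concrete, here is a minimal back-of-the-envelope sketch in Python, assuming the roughly two million test miles and single fatality cited above; the figures are illustrative, not an official safety statistic.

```python
# Back-of-the-envelope extrapolation from the excerpt above.
# Assumptions: ~2 million autonomous test miles and 1 fatality (the Tempe crash),
# compared against the cited U.S. human-driver rate of 1.18 deaths
# per 100 million vehicle miles traveled.

UBER_TEST_MILES = 2_000_000      # approximate fleet mileage as of late December
UBER_FATALITIES = 1              # the Tempe crash
HUMAN_RATE_PER_100M = 1.18       # cited U.S. rate: deaths per 100 million miles

PER_100M = 100_000_000

av_rate_per_100m = UBER_FATALITIES / UBER_TEST_MILES * PER_100M
print(f"Extrapolated AV rate: {av_rate_per_100m:.0f} deaths per 100M miles")   # ~50
print(f"Cited human-driver rate: {HUMAN_RATE_PER_100M} deaths per 100M miles")
print(f"Ratio: roughly {av_rate_per_100m / HUMAN_RATE_PER_100M:.0f}x")
```

Of course, as the excerpt itself concedes, one fatality over two million miles is far too small a sample for a statistically meaningful rate; the sketch only shows where the “close to 50” figure comes from.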
Other important questions were raised earlier by the Institute of Electrical and Electronics Engineers (IEEE). Read this informative article on “the big problem with self-driving cars”: the unpredictable behavior of people crossing streets in congested neighborhoods where traffic rules are not always followed.
Wired Magazine offered this helpful perspective on the current reality: research shows that puny humans still see the world better than self-driving cars. “You’re probably safer in a self-driving car than with a 16-year-old, or a 90-year-old,” says Schoettle. “But you’re probably significantly safer with an alert, experienced, middle-aged driver than in a self-driving car.” (Vindication for those 40-somethings feeling past their prime.)
Public-Sector Involvement Regarding Autonomous Vehicles
Here are some of the important policy efforts related to autonomous vehicles over the past few years:
- Federal Automated Vehicles Policy — U.S. Department of Transportation — “The U.S. Department of Transportation is partnering with a broad coalition including industry, academia, states and local governments, safety advocates, and transportation stakeholders to encourage the safe development, testing and deployment of automated vehicle technology. The Department of Transportation is committed to facilitating a new era of transportation innovation and safety, and ensuring that our country remains a leader in automated vehicle technology. The Department seeks to support the technology and transportation industry, state and local governments, and other key stakeholders as they consider and design best practices relative to the testing and deployment of automated vehicle technologies.”
- Self Drive Act — National Governors Association — “On behalf of the nation’s governors, legislatures, state transportation officials, state motor vehicle administrators and highway safety officials, we appreciate the efforts made by the House Energy and Commerce Committee’s Digital Commerce and Consumer Protection Subcommittee to include key stakeholders in developing the appropriate federal framework for the use of autonomous vehicle (AV) technologies on our nation’s roadways. …”
- Governors, States Embrace Innovation — “In his remarks, Gov. [Brian] Sandoval drew heavily on transportation and energy to illustrate opportunities for innovation, including autonomous and electric vehicle technology, drones, ride-sharing, and more. (His NGA chair’s initiative, Ahead of the Curve: Innovation Governors, focuses on how states can respond to emerging technology in the energy and transportation sectors.)”
- Governors Highway Safety Association (GHSA) — Autonomous Vehicles — “It is widely acknowledged that fully autonomous vehicles, or cars and trucks that can drive themselves without a human at the controls, are coming soon. In fact, Levels 1 and 2 autonomous vehicles are already on our roads. Many companies are currently testing autonomous vehicles (AVs), and AV programs have been launched across the country by various companies in the technology and transportation industries.”
- In early 2017, GHSA published a Spotlight on Highway Safety Report to help states understand and address issues related to autonomous vehicles. The report provides an overview of existing and upcoming technologies, information on public knowledge and attitudes, and recommendations for states to effectively prepare for autonomous vehicles and ensure that traffic safety is at the forefront of all AV development and policy discussions. The full report along with infographics is available for download here.
So a fatal accident has occurred. What’s next?
First, we must recognize that states and political leaders are competing for dollars, jobs, research facilities and reputations for innovation as they try to attract more testing of autonomous vehicles to their states. No one wants anyone to die in the process, but Uber moved its testing to Arizona from California because of lighter regulation of autonomous vehicles.
TheNextWeb.com discussed how states are competing to get autonomous vehicles on the road, but asked: Should they be? After this accident, there is no doubt that the stakes are higher, but most governors are not changing their policies and are taking a wait-and-see attitude. The article lists several pros and cons of current policy, and citizens are now paying attention.
Second, there have been thousands of questions raised about new laws, rules, insurance and more. These questions are now under more scrutiny.
- Harvard Business Review offered these hard questions on our transition to driverless cars in 2017.
- ABC News in Australia raised these five ethical questions on AV. Each of them is important, but No. 5 gets to the heart of the matter — public TRUST of the new technology.
- I also like this list of questions being asked locally in Arizona, which is not surprising given what happened. And similar questions are being asked all over the world, such as in the United Kingdom.
- These wider ethical questions on autonomous vehicles will start showing up more often in the news media and even in car showrooms as we head toward more autonomy in vehicles. From minimizing risk of injury to wider issues like the loss of trucking jobs, public- and private-sector organizations are discussing these topics now.
- The National League of Cities offers this helpful policy guide on autonomous vehicles: Autonomous Vehicles — National League of Cities
- The U.S. Government Accountability Office (GAO) recommends that the Department of Transportation develop a comprehensive plan to better manage departmental initiatives related to automated vehicles.
- The Information Technology and Innovation Foundation (ITIF) offers this Policymaker's Guide to Connected Cars.
I am generally an advocate for autonomous vehicles, and I am excited about the prospect of driverless cars coming soon for my family.
Nevertheless, I was surprised by several articles (like this one) that came out immediately after the accident and essentially defended autonomous vehicles on the grounds that thousands of people die on the highways every year. While technically true, these AV defense pieces seemed pre-written, ready to be published as soon as someone died in an autonomous vehicle crash.
Not only do I think these articles are in poor taste (written too soon after the accident), I also think they don’t help the auto industry as much as the authors think they do.
Why?
Building trust is key, and an implicit point of these pieces is that some people will need to die in order to move society more quickly to an all-autonomous road network (with presumably fewer accidents once we reach this transportation utopia). This defense makes me feel that victims are being compared to soldiers fighting in wars, dying for the good of society. In my opinion, this comparison doesn’t work, because we are not fighting enemy combatants; we are trying to prevent accidental deaths on highways.
More important, this rationale does not build societal trust in the testing and public/private decision-making processes for autonomous vehicles during the next decade of transition. It may actually lower trust in autonomous vehicles.
To build more trust in AV, the facts of this case (and future cases) must be transparent. Blaming the woman for not using a crosswalk won’t help, because most drivers have seen this situation many times and reacted. The public wants to know why the car did not see the woman, or at least slow down or swerve, when she was halfway across the road. Was this truly a one-off situation, or a deeper flaw?
We have not even added in other autonomous vehicle questions and factors, such as hacking, bad weather or unmarked roads, which will complicate the conversation further.
Talking about the Uber accident, one friend commented, “I am now watching more closely to see what happens over the next few years. Fool me once, shame on you, fool me twice, shame on me.”
Bottom line, everyone in the world is now focused more intently on autonomous vehicle testing, and the margin for error just got thinner.