In the wake of Uber's fatal
autonomous car crash, companies such as Intel and Waymo are insisting
that their autonomous vehicle systems would have prevented this tragedy.
What are these companies doing differently—and will those differences
be enough to keep public confidence in the safety of these vehicles?
On
March 19th in Tempe, Arizona, one of Uber's test vehicles struck and
killed a pedestrian, 49-year-old Elaine Herzberg, despite a safety
driver sitting behind the wheel of the car. This is the first fatal
collision between an autonomous vehicle and a pedestrian, though Tesla's
Autopilot was involved in the first fatal autonomous crash, a 2016
collision with another vehicle.
There are clear ramifications for Uber directly, including Arizona's immediate suspension of Uber's autonomous vehicle testing privileges. Other autonomous vehicle manufacturers, however, are also anticipating backlash as the industry as a whole is cast in a negative light.
How does Uber's sensing technology compare to systems designed by competitors? How are these competing companies reacting to this historic and tragic milestone in autonomous vehicle history?
Uber's Approach to Autonomous Vehicle Sensor Systems
If you've seen the distressing video of the fatal crash, you'll notice that Uber's car, in autonomous mode, did not brake or attempt to avoid the pedestrian at all. This points to a clear-cut failure of the vehicle's ability to detect and avoid obstacles in its path, though it hasn't yet been established what precisely went wrong.

One obvious possibility is that the vehicle's sensors weren't operating properly, or weren't adequate for the driving conditions. For environment sensing, Uber uses Velodyne's LiDAR sensors. At its most basic level, LiDAR technology emits laser pulses and measures the reflections that return after those pulses strike objects in the environment.
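The ranging principle behind those laser pulses is time-of-flight: the sensor measures how long each pulse takes to return and converts that delay to a distance, then uses the beam's angle to place a point in 3D space. The sketch below is purely illustrative of that principle (the function names and angle convention are my own, not Velodyne's implementation):

```python
import math

# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2.
# Illustrative sketch only -- real LiDAR units handle pulse timing, noise,
# and multiple returns in dedicated hardware.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s: float) -> float:
    """Convert a pulse's round-trip time (seconds) to range in meters."""
    return C * round_trip_s / 2.0

def point_from_pulse(round_trip_s: float, azimuth_rad: float,
                     elevation_rad: float) -> tuple:
    """Convert one pulse return into an (x, y, z) point in the sensor frame."""
    r = tof_distance(round_trip_s)
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)

# A pulse returning after roughly 67 nanoseconds corresponds to an
# object about 10 meters away.
```

Sweeping many such beams across azimuth and elevation is what produces the 3D "point cloud" that the vehicle's decision-making software must then interpret.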
Velodyne is one of several companies developing solid-state LiDAR for automotive use, aiming to reduce both the sensors' footprint and their cost. These advancements represent further investment in a future of autonomous driving that sensor companies are incentivized to protect.
Two iterations of the Velodyne Velarray sensor. Left: Velarray as pictured in its press release (image courtesy of Business Wire). Right: Velarray on display at CES 2018.
It remains uncertain, however, whether the incident was caused by this sensing technology, the decision-making systems that interpret sensor data, or some other portion of the process that autonomous vehicles use to make decisions.
Velodyne's President, Marta Hall, stated in no uncertain terms that "we do not believe the accident was due to LiDAR." She added that Velodyne has no involvement in the decision-making systems that interpret the data its LiDAR sensors gather.
Also trying to distance itself from the fallout of the incident is NVIDIA, which was quick to point out that its partnership with Uber is limited to GPUs. Notably, NVIDIA offers its own AI platform for autonomous driving, NVIDIA DRIVE, which was not involved in the fatal incident. In a show of proactivity, however, NVIDIA has decided to halt its own testing of autonomous vehicles on public roads. NVIDIA's co-founder and CEO, Jen-Hsun Huang, believes it is extremely important that all autonomous vehicle companies take a step back and try to learn from this accident.
Intel Demonstrates Proprietary ADAS Over Uber Crash Footage
In a statement from Intel's newsroom, Professor Amnon Shashua, Senior Vice President at Intel Corporation and CEO and CTO of Mobileye (an Intel company), responded to the incident in broad strokes. He believes this is the right time to take a step back and analyze the current hardware and software designs in these vehicles; to ensure that autonomous vehicles are truly safe for pedestrians and drivers alike, he argues, the field needs to scrutinize both sensing and decision-making.

Intel's response as a whole pointed out how its ADAS design differs from Uber's: its system includes features such as automatic emergency braking (AEB) and lane keeping support that could have helped prevent this accident from ever occurring.
In a bold demonstration, Intel ran its ADAS software over the footage of the fatal accident in Tempe. In the demonstration, Intel's software picked up on Ms. Herzberg and her bicycle. The three images below include green detection boxes generated by pattern recognition and a "free-space" detection module.
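The overlay step of such a demonstration, taking a detector's bounding boxes and marking them in green on each video frame, can be sketched in a few lines. This is a hypothetical illustration only; the coordinates, labels, and drawing routine are mine, not Mobileye's pipeline:

```python
import numpy as np

def draw_box(frame: np.ndarray, x0: int, y0: int, x1: int, y1: int,
             color=(0, 255, 0)) -> None:
    """Draw a 1-pixel rectangle outline (a detection box) on the frame
    in place. frame is an H x W x 3 uint8 image; (x0, y0) and (x1, y1)
    are the box's opposite pixel corners."""
    frame[y0, x0:x1 + 1] = color      # top edge
    frame[y1, x0:x1 + 1] = color      # bottom edge
    frame[y0:y1 + 1, x0] = color      # left edge
    frame[y0:y1 + 1, x1] = color      # right edge

# Hypothetical detector output: (x0, y0, x1, y1, label) per object.
frame = np.zeros((120, 160, 3), dtype=np.uint8)   # stand-in video frame
detections = [(40, 30, 90, 100, "pedestrian"),
              (95, 50, 130, 100, "bicycle")]
for x0, y0, x1, y1, _label in detections:
    draw_box(frame, x0, y0, x1, y1)
```

The hard part, of course, is not drawing the boxes but producing them: the detector must recognize a pedestrian reliably across lighting conditions, which is exactly what Intel claims its system demonstrated on this footage.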
The footage from Uber's crash overlaid with Mobileye's ADAS system response. Image courtesy of Intel.
Along with this software, Mobileye's Road Experience Management™ (REM) mapping system uses a high refresh rate to keep its Time to Reflect Reality (TTRR) low. On the hardware side, Mobileye's system-on-chip (SoC) comes from its EyeQ® family.
The Mobileye family of EyeQ chips and which level of autonomy each supports. Image from Mobileye.
What sets Mobileye apart is its proprietary computation cores (or accelerators), which are used for a variety of computer-vision, signal-processing, and machine-learning tasks. Each programmable accelerator core provides the following:
- The Vector Microcode Processor (VMP) is a VLIW SIMD processor that provides hardware support for the kinds of operations common in computer-vision applications.
- The Multithreaded Processing Cluster (MPC) is, in Mobileye's words, more versatile than a GPU and more efficient than a CPU.
- The Programmable Macro Array (PMA) enables computation density approaching that of fixed-function hardware accelerators without sacrificing programmability.
Parallels with Waymo
Waymo CEO John Krafcik addressed the Uber accident as well, telling Forbes that "What happened in Arizona was a tragedy. It was terrible." Like Intel, Krafcik says he feels very confident that Waymo's car would have handled that situation.

Waymo, like Uber, uses LiDAR technology in its autonomous vehicles. In fact, the two companies have been locked in a major lawsuit over LiDAR sensing technology.
A differentiating factor, however, is that Waymo develops its hardware and software under the same roof: its sensors and software are designed together by in-house AI specialists and built into a single integrated system.
Waymo intends to open an autonomous ride-sharing business in Phoenix this year, putting even more scrutiny on their LiDAR technology and Arizona's approach to the public roadways for autonomous vehicles.
The Fate of an Industry
This incident is not the first to cast doubt on the safety of autonomous vehicles. Aside from failures in sensor hardware or in the vision-processing software that interprets it, there are also plenty of issues that can arise from actively malicious third parties interfering with these systems. From altered street signs to spoofing attacks that directly target autonomous sensing, these vehicles face many challenges.

Herzberg's death represents a major event for an industry that has seen extraordinary interest from sensor developers and automakers alike. Will more companies, like NVIDIA, press pause on their testing programs? Will regulatory bodies respond with new rules? Given the reaction to Uber's fatal crash so far, it's clear that 2018 will be a linchpin year for the autonomous vehicle industry.