Meet WARLORD: Metawave Aims to Bring Millimeter-Wave RADAR Sensors to the Automotive Industry

30 Aug 2018

We've heard all about LiDAR sensors for automotive applications. But what about RADAR? Metawave has developed a RADAR sensor, dubbed WARLORD, that CEO Dr. Maha Achour believes will eventually allow safer Level 4 and Level 5 autonomous vehicles.

LiDAR has been a quickly rising star in the sensing arena. RADAR sensors, however, may stand to give it a run for its money.

AAC's Mark Hughes spoke to Metawave's founder and CEO, Dr. Maha Achour, and Metawave's VP of Strategic Alliances, Tim Curley, to take a look at how millimeter-wave RADAR sensors may unseat LiDAR as the future of automotive sensing.

The Current State of LiDAR

The first autonomous cars were built with mechanical LiDAR units affixed atop their roofs. These systems generate large datasets called point clouds that are then passed to a computer for processing (3D SLAM). Inside the computer, advanced algorithms try to determine which objects are cars, people, trees, buildings, signs, and so on. By watching the objects move over time, the central computer can determine velocity and bearing and predict collisions.

Point-cloud map of the environment surrounding an autonomous vehicle. Image used courtesy of Velodyne LiDAR
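As a rough illustration of the tracking step described above, the sketch below estimates an object's speed and bearing from two consecutive point-cloud centroids. It is a minimal sketch under assumed inputs (2D centroids in the vehicle frame and a fixed time step), not Velodyne's or any other vendor's actual pipeline.

```python
import math

def track_kinematics(p_prev, p_curr, dt):
    """Estimate an object's speed and bearing from two consecutive
    point-cloud centroids (x = right, y = forward, in meters), dt seconds apart."""
    dx = p_curr[0] - p_prev[0]
    dy = p_curr[1] - p_prev[1]
    speed = math.hypot(dx, dy) / dt              # m/s
    bearing = math.degrees(math.atan2(dx, dy))   # degrees clockwise from straight ahead
    return speed, bearing

# Example: an object moved 1.5 m forward and 0.2 m to the right over 0.1 s.
print(track_kinematics((0.0, 0.0), (0.2, 1.5), 0.1))
```

In a real pipeline this per-object step runs continuously over the segmented point cloud, which is what lets the central computer project trajectories and predict collisions.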

While these first LiDAR units were adequate for the early days of Level 1 and Level 2 autonomous driving, they were aesthetically obtrusive atop their parent vehicle, and they committed the unforgivable sin of being prohibitively expensive. For LiDAR to enter the mass market, the per-unit price had to drop substantially, and the easiest way to realize that goal was to remove the rotating array and eliminate any macroscopic moving parts.

For the last several years, multiple companies (Velodyne, Innoviz, LeddarTech, and others) have been working on solid-state LiDAR, especially for automotive applications such as autonomous vehicles. The goal of these companies is to provide a 3D point cloud with no moving parts, or only MEMS-based movement. The technologies of the various companies differ, but all of the units are small, have a limited scan area, and are almost insignificant in cost compared to their mechanical predecessors. The low price (several companies claim $100 price points in mass production) allows multiple units to be attached at the corners of a vehicle to provide 360° coverage.
The LiDAR units are not used in isolation; the data they create is fused with other sensor data. Current Level 3 vehicles also might incorporate visual and IR cameras, ultrasonic sensors, RADAR arrays, and a few other technologies that are all centrally fed to a powerful computer such as the NVIDIA Jetson TX1/TX2. The computer combines the sensor data to better understand the environment surrounding the car. Since each RADAR/LiDAR sensor can generate up to tens of millions of points per second, cars need gigabit transfer networks and computers capable of processing the data in real time.
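A quick back-of-the-envelope check shows why gigabit links are needed. The figures below (16 bytes per point, four corner-mounted sensors) are assumptions for illustration, not specifications from any vendor.

```python
# Back-of-the-envelope sensor bandwidth estimate (assumed figures, not vendor specs):
# each point carries x, y, z, and intensity as 32-bit floats = 16 bytes.
POINTS_PER_SECOND = 10_000_000   # "tens of millions of points per second" (low end)
BYTES_PER_POINT   = 16
SENSOR_COUNT      = 4            # one unit per corner for 360° coverage

bits_per_second = POINTS_PER_SECOND * BYTES_PER_POINT * 8 * SENSOR_COUNT
print(f"{bits_per_second / 1e9:.1f} Gbit/s")   # about 5.1 Gbit/s before any compression
```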

Unfortunately, most visual detection methods (e.g., visual and IR camera, LiDAR) are adversely affected by weather conditions, dirt, and highway debris. As autonomous cars continue to progress to Level 4 and Level 5, where no driver interaction is required, automobile makers need technology that isn’t flummoxed by a swarm of bees, a mud puddle, or a rainy day.

Active RADAR Antennas

Millimeter-wave RADAR has several advantages over LiDAR. The first, and perhaps most significant, advantage is that RADAR sensors are not affected by weather and cannot easily be obstructed by highway debris. Where a conventional LiDAR unit can become compromised in heavy rain or can be partially obstructed by a bug strike or other debris, RADAR can see right through those obstructions. A camera or LiDAR unit sees a grasshopper as an opaque object that can completely obscure its field of view, whereas a RADAR unit sees a 1-2 dB decrease in signal strength but is otherwise able to function fully. This means that, for example, a child who is hidden from view behind leaves on a tree on a fog-filled day is invisible to cameras and LiDAR, but remains visible in the millimeter-wave spectrum up to ¼ mile away.
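To put that 1-2 dB figure in perspective, the short calculation below converts it to the fraction of signal power that still reaches the receiver. The arithmetic is illustrative only; the 1-2 dB number comes from the article.

```python
# Convert the quoted 1-2 dB loss into the fraction of RADAR power still received.
for loss_db in (1.0, 2.0):
    remaining = 10 ** (-loss_db / 10)
    print(f"{loss_db:.0f} dB loss -> about {remaining * 100:.0f}% of the power still received")
```

Roughly 79% and 63% of the power survives a 1 dB and 2 dB loss, respectively, whereas a camera pixel fully blocked by the same debris carries no information at all.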

Metawave created an electrically steerable RADAR antenna system called WARLORD (W-band Advanced RADAR for Long-Range Object Recognition and Detection).


Exploded view of WARLORD. Image used courtesy of Metawave

Don’t think of the device as an antenna array, with dozens of feedpoints connected to dozens of antennas. It is a single antenna fed by a single transceiver port. Proprietary integrated circuits on the antenna shift the phase of the signal and are able to steer a main lobe up to ±60°; alternatively, multiple lobes can be created to track multiple targets simultaneously. Additionally, the proprietary ICs can contribute to lower costs. "We augment our structure with our own IC," says Dr. Achour. "This is where some of the cost initially would be high, but since we're targeting three markets and they're all in the same range between 60 gigahertz to 80 gigahertz, 5G, and at the same time the automotive radar. We know that the volume for 5G is quite high, so that can offset the cost of the IC."
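For intuition on how phase shifting steers a beam, the sketch below uses the textbook uniform-linear-array relationship, where each element is delayed by a constant phase increment to point the main lobe off boresight. Metawave's single-feed metamaterial design is proprietary, so treat this as the conventional phased-array analogue with assumed parameters (77 GHz carrier, half-wavelength element spacing), not WARLORD's actual steering method.

```python
import math

C = 3e8  # speed of light, m/s

def element_phases(n_elements, spacing_m, freq_hz, steer_deg):
    """Per-element phase shifts (degrees) that steer a uniform linear array's
    main lobe to steer_deg off boresight. Textbook phased-array math only."""
    lam = C / freq_hz
    k = 2 * math.pi / lam
    dphi = -k * spacing_m * math.sin(math.radians(steer_deg))  # radians per element
    return [round(math.degrees(i * dphi) % 360, 1) for i in range(n_elements)]

# Example: 8 elements at half-wavelength spacing, 77 GHz, steered 60° off boresight.
f = 77e9
print(element_phases(8, (C / f) / 2, f, 60.0))
```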
Multiple devices can be mounted at the four corners of the vehicle to provide full 360° coverage, or augmented with other, less expensive sensors (such as ultrasonic) for close-range situations. Level 4 and Level 5 autonomous vehicles require no driver interaction, and the burden of responsibility in an accident is shifted to the manufacturer of the car rather than the occupants, so the cost of adding safety features is negligible compared to the cost of a lawsuit.

Most current LiDAR units send the point cloud to the central computer for processing. WARLORD is able to process the data from the antenna and send object detection and classification information to the central computer (the point cloud is still available for customers who wish to process their own data), greatly decreasing the computational complexity. The unit will send back information that describes the speed of the object (using the Doppler effect), where the object is relative to the car (distance, bearing, elevation), as well as what the object is. For example, WARLORD will notify the main computer that a truck is 500 meters directly ahead and traveling at 20 miles per hour away from the car, and that a child is in the crosswalk 50 meters ahead and about to cross into the car’s path. This feat of engineering is accomplished by Metawave's in-house team of AI programmers and testers. Since the RADAR is able to detect objects at such great distances, it provides the central computer ample time to track and respond to potential hazards.
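For reference, the radial speed reported for each object follows from the Doppler relationship v = f_d · c / (2 · f_c). The sketch below applies it at an assumed 77 GHz carrier (WARLORD operates in W-band; its exact carrier isn't stated here) and packs the result into a hypothetical per-object report of the kind described above; the field names are illustrative, not Metawave's interface.

```python
C = 3e8  # speed of light, m/s

def radial_speed(doppler_shift_hz, carrier_hz=77e9):
    """Radial speed (m/s) from a measured Doppler shift: v = f_d * c / (2 * f_c).
    77 GHz is a typical automotive RADAR carrier, assumed here for illustration."""
    return doppler_shift_hz * C / (2 * carrier_hz)

# Hypothetical per-object report of the kind described in the article.
report = {
    "object_class": "truck",
    "range_m": 500.0,
    "bearing_deg": 0.0,
    "elevation_deg": 0.5,
    "radial_speed_mps": radial_speed(4590),  # ~8.9 m/s, about 20 mph, moving away
}
print(report)
```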

The cost of the device is expected to be less than $500 in mass production, in no small part because it is made with readily available metamaterials on a conventional production line: "There is no exotic material or special processing that needs to be done," Dr. Achour says. "And we've manufactured them using conventional production line, and we expect the yield to comply with all the precision and tolerances of these production lines. So expect the yield to be 100% of these structures."

How Does WARLORD Work?

WARLORD has a custom antenna, created with custom materials, controlled by custom integrated circuits.


Active antenna created from adaptive metamaterials. Image used courtesy of Metawave

The signal from a single feedpoint is controlled with the custom ICs to provide an electrically steerable beam pattern.

Beam pattern from an active RADAR antenna. Image used courtesy of Analog Devices.

This active antenna configuration allows WARLORD to change its beam pattern at will to create one or many lobes. This allows the system to simultaneously track multiple objects or to focus on particular objects of interest. Narrow beams allow the system to track objects with a smaller RADAR cross-section at a greater distance.
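The link between beam width and range follows from the classic radar range equation, where maximum detection range scales with the fourth root of both the squared antenna gain (narrower beam, higher gain) and the target's RADAR cross-section (RCS). The sketch below uses made-up but plausible values (1 W transmit power, 30 dB gain, 77 GHz carrier, an assumed receiver sensitivity), not WARLORD's specifications.

```python
import math

def max_range(pt_w, gain_linear, wavelength_m, rcs_m2, s_min_w):
    """Classic radar range equation, solved for maximum detection range (m)."""
    num = pt_w * gain_linear**2 * wavelength_m**2 * rcs_m2
    den = (4 * math.pi) ** 3 * s_min_w
    return (num / den) ** 0.25

lam = 3e8 / 77e9            # ~3.9 mm wavelength at a 77 GHz carrier
gain = 10 ** (30 / 10)      # 30 dB antenna gain, i.e., a fairly narrow beam
for rcs in (100.0, 1.0):    # truck-like vs. motorcycle-like RCS (illustrative values)
    print(f"RCS {rcs:>5.1f} m^2 -> max range ~{max_range(1.0, gain, lam, rcs, 1e-12):.0f} m")
```

With these assumed numbers the large-RCS target is detectable out to roughly three times the range of the small-RCS one, which is why a steerable narrow beam matters for spotting motorcycles and pedestrians early.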


Metawave's WARLORD used to track and identify multiple targets. Image used courtesy of Metawave.

Challenges for RADAR in the Autonomous Vehicle Industry

It is impossible to predict the future while the technology is still in the early stages of development, but mechanical rotary LiDAR units appear to have been rejected by OEMs en masse. LeddarTech, Innoviz, and Velodyne have solid-state LiDAR units that are currently being integrated into Level 3 autonomous vehicles. The cost of these units will continue to decrease and their performance will improve. However, all modern LiDAR and camera units suffer from the same critical issues: they have limited range and can be obstructed by debris.

By that same token, however, Dr. Achour says that there's only one real challenge remaining for millimeter-wave technology when it comes to hardware limitations:

"When you start doing beam-forming instead of just sending the signal everywhere, you put this digital wave on every single antenna or analog phase shifter. Now you're operating this array as a phase array antenna. The problem with this approach occurs if it's not being designed in concert together, if they are designed independently. As soon as [the matching antennas] start steering the beam, it goes way above 10 db. You have reflection coming from the antenna, and that reflection basically kills your PA, kills your IC, creates this thermal noise. So, these are talking about the limitations, not talking about the signal processing and the delay and the power consumption in doing this expensive digital signal processing."

Another challenge for autonomous vehicles as a whole is the question of responsibility for the "decisions" that a car makes. Dr. Achour pointed out that multiple RADAR sensors may be "overkill for Level 3 cars" such as a Tesla because there is a driver who is responsible for the safe operation of the car. "But when you go to Level 4 and 5," she says, "Well now the safety is the responsibility of the company that operates this fleet of cars. The profit is not just per car sold but is basically per mile driven. So it's a very different business model for both the car OEM and the service provider."
Metawave, however, does not claim to use its AI to make decisions at a vehicle level.

"Artificial intelligence covers a very broad functionality inside the car. So, if this is centralized, that means we have only one central processor that takes raw data and processes the whole thing. I think that the trend is going to be in doing what we call a hybrid or hybrid centralized and decentralized AI algorithm, so AI processing. Now you have each sensor provide some sort of labeling of these objects to the sensor fusion, and the sensor fusion does another layer of AI to decide 'Should I stay on this lane? Should I brake? Should I change lanes? What should I do?' We [at Metawave] don't do the sensor fusion and there are a lot of companies that do. In addition, all the car OEMs also want to own that sensor fusion because, in the end, this is the brain of the car and the company that has the smartest and safest brain is going to be the winner. We don't expect all of the players to survive a level four or level five challenge. Very few."

So if Metawave's AI isn't intended to perform sensor fusion and produce "decisions" to direct a vehicle's actions, what does the AI do?


"What we offer is an AI algorithm that sits only in the RADAR and only is responsible for processing the radar data and provide it with some level of confidence about the object. For example, if I see a truck maybe with 90% probability, I can provide that label to the sensor fusion, let's say at 300 meters. If I see a motorcycle at 300 meters because the cross-sections are smaller, I will provide it maybe with a 50% accuracy. Now, the sensor fusion will take this information and will instruct the LiDAR and the camera to look in the direction of the motorcycle instead of looking everywhere and wasting time just to verify is this really a motorcycle or not. By doing that, we provide the sensor fusion enough time to react before the car hits the motorcycle, and at the same the RADAR doesn't become liable of the final decision because we provide the long-range information."

Metawave also says it offers something unique to allow for better decision-making. "We give [OEMs, etc.] the option to have raw data. Today, none of the RADAR companies provide raw data. They only provide the two-point cloud, which is the range and the Doppler, just because it's a Level 2, Level 3 [application]. But if we provide them with the raw data, they can do whatever they want with it (and we provide them with the post-processed data of course on a different business model). Then, they have a very stronger platform to work with to make sure that the operation of the car is seamlessly maintained in any kind of operating condition, in any type of weather condition, and at the highest safety expectation."

What’s Next? Ambitions to Unseat LiDAR

Current ADAS (advanced driver-assistance systems) require cameras, LiDAR, and other sensor systems, all of which will almost certainly be necessary for Level 4 and Level 5 vehicles. But, Dr. Achour says, this may change in 10 to 15 years, once sensor fusion has further evolved. With sufficiently advanced RADAR sensors ("with high-resolution imaging capability that is capable of operating in all weather conditions and all environments and also adding the non-line-of-sight detection and tracking, doing the V2V communication"), you may be able to eliminate the need for short- and mid-range sensors altogether.

"You add more functionality to the RADAR," she says. "You may not need these short-range and mid-range RADAR sensors. So you are eliminating other sensors."


Metawave is still refining its millimeter-wave RADAR technology, as are other companies that haven’t yet made their presence known in the market. In a few years, when the RADAR-based technology companies are ready for tier-1 integration, they might very well supplant the solid-state LiDAR that is all the rage today.
