

30 Sept 2016

4 Key Technologies to Watch at Embedded World 2016

Embedded World 2016 is almost here. Here are some technologies to keep your eye on.

Autonomous cars are the future of automotive electronics, but they are years away from mainstream commercial realization. What we have right now is advanced driver assistance systems, or ADAS, and the Embedded World 2016 show in Nuremberg, Germany, will provide a glimpse of where ADAS products and technologies stand on the road to commercial realization.
ADAS is a technology to watch in 2016

ADAS technology is driving demand for chips from automotive OEMs and Tier 1 suppliers, as well as from after-market manufacturers. Not surprisingly, chip firms ranging from Altera to Cadence and Cypress to Imagination Technologies are unveiling their ADAS offerings at the Embedded World 2016 show, being held on 23-25 February. Below are four key technology ingredients that will likely drive the growth of ADAS products and are thus worth watching during Embedded World 2016.

Computer Vision

The smart camera is clearly at the heart of ADAS; combined with powerful computer vision algorithms, it enables vehicles to look outside to monitor road conditions and look inside to monitor driver behavior.
Advanced computer vision algorithms are a driving force in next-generation ADAS offerings, and that calls for low-power vision chips that can process a vast array of visual information while tracking roadway objects such as pedestrians and animals.
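
To make the idea concrete, here is a minimal sketch of a camera-based pedestrian detector in Python, using OpenCV's stock HOG-plus-linear-SVM people detector. It is an illustration of the general technique only: the camera index and detector parameters are assumptions, and production ADAS vision chips run far more sophisticated, often deep-learning-based, pipelines.

    # Minimal pedestrian-detection sketch using OpenCV's built-in
    # HOG + linear-SVM people detector. Illustrative only; real ADAS
    # vision pipelines are far more sophisticated.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(0)  # assumed front-facing camera at index 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Returns pedestrian bounding boxes plus confidence weights.
        boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
        for (x, y, w, h) in boxes:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("pedestrian view", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):  # press 'q' to quit
            break
    cap.release()
    cv2.destroyAllWindows()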

Freescale (now part of NXP) bought CogniVue to acquire computer vision IP

The rising number of cameras inside and outside the car is a testament that computer vision technology will be a key influence on ADAS in 2016. Add artificial intelligence and deep learning algorithms to the computer vision recipe, and that may lead to groundbreaking ADAS product developments during the year.

Automotive Radars

Cameras are clearly going to play a central role in driver assistance systems, but they are hindered at night and in conditions like fog, snow, and rain. So it's quite likely that the automotive industry won't rely on just one sensor technology to enable ADAS. Enter radio detection and ranging, or radar.
Radar is the other major sensor after the camera in ADAS products; it can see through darkness and fog, and it can also measure the speed and distance of objects. LiDAR (light detection and ranging) systems go a step further, providing higher resolution and more granular detail about objects. According to Frost & Sullivan, seven of the top 13 automotive OEMs are incorporating LiDAR technology in their vehicles.
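
To see where those speed and distance figures come from, here is a back-of-envelope Python sketch of the standard FMCW radar relations: range from the beat frequency and relative speed from the Doppler shift. The chirp bandwidth and duration below are assumed illustrative values, not the specification of any particular chip.

    # FMCW radar back-of-envelope: range from beat frequency,
    # relative speed from Doppler shift. Chirp parameters are
    # illustrative assumptions.
    C = 3.0e8            # speed of light, m/s
    F_CARRIER = 77e9     # 77 GHz automotive radar band
    BANDWIDTH = 1.0e9    # assumed chirp sweep bandwidth, Hz
    CHIRP_TIME = 50e-6   # assumed chirp duration, s

    def range_from_beat(f_beat_hz):
        # R = c * f_beat * T_chirp / (2 * B)
        return C * f_beat_hz * CHIRP_TIME / (2 * BANDWIDTH)

    def speed_from_doppler(f_doppler_hz):
        # v = c * f_doppler / (2 * f_carrier)
        return C * f_doppler_hz / (2 * F_CARRIER)

    print(range_from_beat(2.0e6))      # 2 MHz beat tone -> 15.0 m
    print(speed_from_doppler(10.0e3))  # 10 kHz Doppler  -> ~19.5 m/s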


NXP's small CMOS radar chip aims to replace traditional ultrasonic solutions

However, LiDAR systems are still at a nascent stage, with modest resolution and limited range. Current LiDAR products are relatively expensive, they can scan only up to about 100 meters with limited reflectivity, and they come in larger packages made up of two or three SiGe chips.
NXP showcased a single-chip CMOS solution for short-range automotive radar at CES 2016. NXP claims that its 77 GHz RF transceiver chip consumes 40 percent less power than conventional radar ICs and will help car OEMs replace bulky ultrasonic sensors with lightweight, high-resolution radar sensors.

ASIL Certification

Connected-car technologies like ADAS are ultimately about functional safety, so the ISO 26262 specification mandates stringent reliability features in automotive chips. At the same time, these chips require ever greater processing power to accomplish the computational consolidation that is the hallmark of technologies like ADAS.

TI's TDA3x ADAS chip complies with the ISO 26262 functional safety standard

Not surprisingly, the Automotive Safety Integrity Level (ASIL) classification defined in ISO 26262, the functional safety standard for road vehicles, has become a major priority for ADAS solution providers. Take CEVA, supplier of the XM4 vision processor IP, which recently received ASIL B certification.

Sensor Fusion

An ADAS product usually comprises one or more cameras, GPS, an inertial sensor, a processor, and a communications modem. The processor runs computer vision, artificial intelligence, and deep learning algorithms to enable contextually aware speech and image recognition for ADAS warning systems.

Sensor fusion is crucial amid the rising number of sensors in ADAS (Image: TI)

A driver assistance system is more than camera and radar sensors, however. A cocoon of sensors is required to enable ADAS features such as 360-degree view and automated parking. An inertial sensor, for example, monitors the vehicle's speed and acceleration, and there are also proximity and light detection sensors.

Robust sensor fusion is crucial for ensuring that the flood of data from cameras, radars, and other sensors is gathered accurately in real time. The data from the different sensors is usually routed to a central place, the sensor hub, which offloads the sensor number crunching from the central processor. Here, accuracy and power consumption are the main challenges in fusing a heterogeneous world of sensors.
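
As a deliberately simplified illustration of what a sensor hub's fusion step can look like, the Python sketch below combines a camera range estimate with a radar range estimate using inverse-variance weighting, one of the simplest fusion rules. All numbers are made-up assumptions.

    # Toy fusion step: combine two range estimates with an
    # inverse-variance weighted average, trusting the less-noisy
    # sensor more. All numbers are illustrative.
    def fuse(est_a, var_a, est_b, var_b):
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
        return fused, 1.0 / (w_a + w_b)

    # Camera: 24.0 m (noisy at range); radar: 25.2 m (much tighter).
    distance, variance = fuse(24.0, 4.0, 25.2, 0.25)
    print(round(distance, 2), round(variance, 3))  # -> 25.13 0.235

A real sensor hub would typically run a Kalman filter over time rather than a single static average, but the principle of weighting each sensor by its reliability is the same.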
