Products on Show

Raw data sensor fusion technology accelerates ADAS performance

A recurring and critical question for engineering teams worldwide is how to solve the object detection, classification and tracking challenges that unlock ADAS features in vehicles.

As demonstrated on nuScenes, a standard benchmark for evaluating object detection, raw data sensor fusion and perception has exhibited strong performance and represents a significantly better alternative to the conventional approach for meeting the challenges of Level 2-3 ADAS development.

At the expo, LeddarTech will highlight LeddarVision, a raw data sensor fusion and perception solution for safe and reliable ADAS and AD applications.

Each sensor, whether lidar, radar or camera, has its weaknesses: the camera performs poorly in bad weather, radar has poor resolution, and lidar has range limitations. Raw data sensor fusion solves this problem. Unlike traditional perception systems, which perform object-level fusion and classify objects from each sensor individually, it combines raw data from all sensor modalities to provide confident and accurate object detection, tracking and classification.

Raw data sensor fusion technology delivers superior object detection, classification and tracking performance; it produces fewer false positives and false negatives and has built-in redundancy. By contrast, when sensor data is not fused, the system may receive contradicting inputs from its sensors and therefore be unable to determine the next course of action with an acceptable degree of certainty. For example, if an obstacle is detected by the camera but not by the lidar or radar, the system hesitates over whether the vehicle should stop, potentially causing an accident. Raw data fusion solves this problem.
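The contradicting-inputs scenario above can be sketched in a few lines of Python. This is purely an illustrative toy, not LeddarTech's implementation: the per-sensor confidence values and the independent log-odds combination rule are assumptions chosen to show why deciding per sensor can deadlock while combining evidence before thresholding yields a single, confident decision.

```python
# Illustrative toy only (not LeddarVision's actual method): contrast
# object-level fusion, where each sensor's detection is thresholded
# independently, with raw-data-style fusion, where per-sensor evidence
# is combined before a single decision is made. Confidence values and
# the log-odds combination rule are hypothetical assumptions.
import math


def object_level_fusion(confidences):
    """Each sensor votes obstacle / no obstacle; votes can contradict."""
    votes = {sensor: p >= 0.5 for sensor, p in confidences.items()}
    agree = len(set(votes.values())) == 1
    return votes, agree


def raw_data_fusion(confidences):
    """Combine per-sensor evidence first (independent log-odds),
    then make one thresholded decision on the fused confidence."""
    log_odds = sum(math.log(p / (1 - p)) for p in confidences.values())
    fused_p = 1.0 / (1.0 + math.exp(-log_odds))
    return fused_p, fused_p >= 0.5


# Camera sees the obstacle clearly, lidar weakly, radar misses it.
evidence = {"camera": 0.80, "radar": 0.45, "lidar": 0.60}

votes, agree = object_level_fusion(evidence)
print(votes, "agree:", agree)  # radar disagrees: contradicting inputs

fused_p, decision = raw_data_fusion(evidence)
print(f"fused confidence {fused_p:.2f}, obstacle: {decision}")
```

With these hypothetical numbers, the per-sensor votes contradict each other (radar says no, camera and lidar say yes), while the fused evidence yields one decision with a clear confidence, which is the behavior the paragraph above describes.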

Booth: 1052
