Products on Show

Day 2: LeddarTech showcases raw-data sensor fusion and perception for ADAS and AD systems

On Day 2, LeddarTech is demonstrating its raw-data sensor fusion and perception solution, LeddarVision.

“Sensor fusion is the merging of data from at least two sensors,” explained company representatives at the expo. “In autonomous vehicles, perception refers to the processing and interpretation of sensor data to detect, identify, classify and track objects. Sensor fusion and perception enables an autonomous vehicle to develop a 3D model of the surrounding environment that feeds into the vehicle’s control unit.”

While sensor fusion (combining data from different sensors) and perception (building an online picture of the surrounding environment) are already in use in current ADAS and AD applications, LeddarTech says the technology still has one major drawback: each detection is based on suboptimal information (sensing data from the camera, radar, lidar, etc), resulting in partial or even contradictory information that can lead the system to make a wrong decision.

Currently, the most common type of fusion employed by software providers is object-level fusion, in which perception is performed separately on each sensor and the resulting object lists are then merged. For LeddarTech, this is not optimal: because the sensor data is not fused before the system makes a decision, the inputs may contradict one another. For example, if an obstacle is detected by the camera but not by the lidar or the radar, the system may hesitate over whether the vehicle should stop.
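The failure mode described above can be sketched in a few lines. This is a hypothetical illustration (the voting rule, sensor names and object labels are invented for the example, not LeddarVision code): a naive late-fusion stage that keeps an object only when a majority of sensors report it will drop an obstacle seen by the camera alone.

```python
def object_level_fuse(detections):
    """Naive object-level (late) fusion by majority vote.

    detections maps each sensor name to the list of object labels
    that sensor's own perception stage produced. An object survives
    only if more than half of the sensors detected it.
    Hypothetical sketch, not an actual LeddarVision component.
    """
    counts = {}
    for objects in detections.values():
        for obj in objects:
            counts[obj] = counts.get(obj, 0) + 1
    majority = len(detections) / 2
    return [obj for obj, n in counts.items() if n > majority]


# The camera sees a small obstacle that the lidar and radar miss:
detections = {
    "camera": ["pedestrian"],
    "lidar": [],
    "radar": [],
}
fused = object_level_fuse(detections)  # the pedestrian is voted out
```

With only one of three sensors reporting the pedestrian, the vote discards it, and the downstream planner never learns the obstacle exists; this is the contradiction that motivates fusing before, rather than after, per-sensor decisions.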

Instead, LeddarTech employs a raw-data fusion approach: the raw data from the different sensors is first fused into a dense and precise 3D environmental RGBD model, and decisions are then made on the basis of that single model, built from all the available information. Fusing raw data from multiple frames and multiple measurements of a single object improves the signal-to-noise ratio (SNR), enables the system to overcome single-sensor faults and allows the use of lower-cost sensors. The result is better detection and fewer false alarms, especially for small obstacles and unclassified objects.
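The SNR gain from fusing multiple frames can be demonstrated with a minimal sketch. All values here are hypothetical, and simple averaging stands in for the real pipeline (LeddarVision fuses raw lidar, radar and camera data into an RGBD model, not scalar ranges); the sketch only shows why combining N independent, zero-mean-noise measurements shrinks the error by roughly sqrt(N).

```python
import random
import statistics


def fuse_frames(frames):
    """Fuse repeated range measurements of one object by averaging.

    Assuming independent zero-mean noise per frame, the noise variance
    of the mean falls by 1/N, so the SNR rises with each extra frame.
    """
    return statistics.fmean(frames)


random.seed(0)
TRUE_RANGE = 12.0   # metres, hypothetical obstacle distance
NOISE_STD = 0.5     # metres, hypothetical per-measurement noise

# Error of single-frame estimates vs. 16-frame fused estimates:
single = [TRUE_RANGE + random.gauss(0, NOISE_STD) for _ in range(1000)]
fused = [
    fuse_frames([TRUE_RANGE + random.gauss(0, NOISE_STD) for _ in range(16)])
    for _ in range(1000)
]

single_err = statistics.pstdev(single)
fused_err = statistics.pstdev(fused)
# fused_err comes out close to single_err / 4, i.e. sqrt(16) lower.
```

The same principle applies across sensors: every extra independent measurement of the same object, whether from another frame or another modality, tightens the estimate, which is why raw-data fusion tolerates noisier, lower-cost sensors.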

Booth: 1052
