Products on Show

Hyperspectral synthetic data to accelerate ADAS/AV development

The AV and ADAS industry is deploying a new generation of deep learning perception systems built on advanced sensors and optics that demand accurate data to cover the most challenging scenarios. These new sensing capabilities include photodetectors covering different parts of the spectrum and automatic adaptation to low-light conditions. Synthetic data will unequivocally be needed to design, train, calibrate and validate this new generation of autonomous systems. However, not all synthetic data is created equal: pixel-accurate synthetic data is required to reduce the domain gap and faithfully reproduce sensor behavior.

Anyverse has deployed a flexible, modular software platform for scene generation, rendering and sensor simulation, allowing users to design their use cases and data set production pipelines at will.

The company employs a pure spectral ray-tracing engine that computes the spectral radiance of every light beam interacting with materials in the scene, simulating lights and materials at a physical level. Using the spectral information provided by the renderer, Anyverse then simulates the physics inside the sensor and image signal processor (ISP). As a result, users can evaluate and test any camera sensor with high accuracy and decide which sensor best fits their system. Furthermore, the platform enables users to design scenarios using an extensive asset library to compose the scene, apply dynamic behaviors, program environmental variability, produce data sets in the cloud and explore the results, including all the associated ground truth data.
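To make the sensor-simulation step concrete, the following is a minimal sketch of how per-pixel spectral radiance can be converted into a sensor's electron count by integrating it against a channel's quantum-efficiency curve. All function names, curves and parameter values here are illustrative assumptions, not Anyverse's actual implementation or API.

```python
import numpy as np

# Visible band sampled in 10 nm bins (illustrative resolution).
wavelengths_nm = np.arange(400, 701, 10, dtype=float)
bin_width_nm = 10.0

def gaussian_qe(center_nm, width_nm, peak=0.6):
    """Toy quantum-efficiency curve for one color channel."""
    return peak * np.exp(-0.5 * ((wavelengths_nm - center_nm) / width_nm) ** 2)

def photoelectrons(spectral_radiance, qe, exposure_s=0.01, gain=1e3):
    """Integrate spectral radiance weighted by QE over wavelength
    (simple Riemann sum) and scale by exposure time to get an
    electron count for one pixel. Units are purely illustrative."""
    response = np.sum(spectral_radiance * qe) * bin_width_nm
    return response * exposure_s * gain

# Flat (spectrally uniform) illumination of 1.0 in every wavelength bin:
flat = np.ones_like(wavelengths_nm)
red_qe = gaussian_qe(center_nm=600, width_nm=40)
e_count = photoelectrons(flat, red_qe)
```

A real pipeline would add shot noise, dark current and the ISP's demosaicing and tone mapping on top of this integration, which is what makes spectral (rather than RGB) rendering necessary for sensor-accurate results.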

Anyverse’s approach opens new opportunities for AV and ADAS developers at any deployment stage, allowing data engineers to simulate any scenario programmatically, configure variations, test different weather, lighting and environmental conditions, and reproduce any corner case virtually while guaranteeing accuracy throughout the process.
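Programmatic scenario variation of this kind typically amounts to sweeping a grid of environmental parameters into individual scenario configurations. The sketch below shows the general idea; the parameter names and values are hypothetical and do not reflect Anyverse's actual configuration schema.

```python
from itertools import product

# Illustrative variation axes for a data set production run.
weather = ["clear", "rain", "fog"]
time_of_day = ["noon", "dusk", "night"]
pedestrian_density = [0.1, 0.5, 0.9]

# Cartesian product of all axes -> one config per scenario variant.
scenarios = [
    {"weather": w, "time_of_day": t, "pedestrian_density": d}
    for w, t, d in product(weather, time_of_day, pedestrian_density)
]
```

Each resulting config can then be submitted to a cloud rendering job, which is how a small set of variation axes yields a large, systematically varied data set covering corner cases.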

Booth: 1016
