Simulation, scenarios, verification and validation
Large-scale simulation supporting HARA for ISO 26262, SOTIF and STPA
Dr Edward Schwalb Lead data scientist MSC Software USA
We will review a Bayesian approach to hazard analysis and risk assessment, which gives rise to ‘smart hazardous miles’. We describe a generic hazard control loop that enables active avoidance of accidents, and illustrate how it can be applied to classify hazards as potential or developing, consistent with the UK theoretical driving test administered to human drivers. This approach further guides our development of tools enabling accelerated development and testing. We discuss the implied hazard specification extensions to the PEGASUS approach. An integrated workflow will be presented, which is consistent with ISO 26262, SOTIF and STPA.
Obstacles on the road to massive scenario-based verification
Yoav Hollander Founder and CTO Foretellix Ltd Israel
There is now a growing consensus that massive scenario-based testing should be a major component of verifying ADAS/AV safety. However, experience shows that when one goes from, say, 1K scenario runs to 10M scenario runs (per week), some serious new problems arise. For instance: understanding what was already tested (and where the 'holes' are) can be challenging; lack of repeatability can make debugging a big challenge; checking the results (and handling 'gray areas') can become a nightmare.
Enabling virtual test and validation: creating a virtual proof of validation in the ENVITED ecosystem
Carlo van Driesten Systems architect for virtual test and validation BMW Group Germany
New forms of cooperation are necessary to turn the vision of autonomous driving and fully connected mobility systems into reality. Virtual validation is an essential part of the development process. Standards for model and system interchange are vital for cross-company and cross-domain virtual integration and simulation of HAD functions. Standards like FMI, SSP and the OpenX family at ASAM e.V. showcase current possibilities, challenges and future directions, as well as a vision of future collaborations. The foundations for the future ecosystem have been laid by the ENVITED ecosystem: standardized data for virtual test and validation; an open and modular simulation architecture; and traceability of standardized data and test results for a virtual proof of validation.
Enabling scenario-based verification for autonomous driving
Alexandre Mugnai Business development manager Esteco Italy
Siddhant Gupta Research engineer, verification and validation, autonomous driving/ADAS Volvo Cars Corporation Sweden
This paper proposes a methodology to enable scenario-based verification for autonomous driving by creating a scenario and test database using a suitable design of experiments. An optimization strategy then evaluates the criticality of these test cases, assigning non-critical cases to a software-in-the-loop (SIL) environment and critical cases to a vehicle-in-the-loop (VIL) environment to ensure coverage across both platforms. The toolchain for creating the design of experiments and optimization strategy is the Simulation Platform Active Safety (SPAS), a virtual platform enabling SIL verification developed at Volvo Cars, co-simulated with the multi-domain optimization tool modeFRONTIER.
Panthera driving simulation framework with high-quality content for human-in-the-loop testing
Jelle van Doornik Product manager ADAS and AD Cruden Netherlands
Although the industry seems to be focused on full autonomous driving (L5), Cruden believes that partial automation (L1-L4) is going to be around for decades to come. The increasing number of automated driving functions causes the interaction between driver and vehicle to change. The most time-effective way to test and validate this properly is with an open-architecture simulation framework that allows easy integration of the customer's engineering tools while displaying high-quality content for the human driver. The Panthera driving simulation framework provides a safe and cost-efficient way to add human-in-the-loop simulation to the existing development, test and validation toolchain.
OmniCAV: hybrid simulation for AV stack verification
Dr Gavin Jackman Managing director Aimsun UK
OmniCAV is a consortium project that is partly funded by the UK Government. It aims to deliver a highly realistic simulation environment for AV stack verification that considers all road users and road types. A unique environment that covers all road types and eventualities is being created and validated with real AV testing within the OmniCAV project. Our consortium is applying cutting-edge technology in creating this environment, with high-fidelity lidar mapping, mixed-traffic simulated environments with motorized and non-motorized vehicles (bicycles) and pedestrians, advanced driving simulators and more. The solution will cover the urban environment, the strategic road network and the rural road network, allowing all conditions to be simulated and evaluated. This is being delivered in combination with a UK local government and a Zenzic/CCAV-funded testbed location.
Edge-case hunting in scenario-based virtual validation of AVs
Dr Henning Lategahn CEO Atlatec GmbH Germany
Validating ADAS and AV stacks largely in virtual environments using simulators is now a given. The industry broadly agrees that quality matters more than mere quantity. But what is the quality of the scenario set being tested against? It essentially boils down to the ability to identify edge cases that reveal the exact limits of the system under test. In this talk, we present how one can take real-world scenarios, translate them into digital-twin counterparts for simulation and finally modify them to identify the crucial edge cases.
Autonomous vehicle engineering simulation tools for appropriate driver acceptance and comfort
Dr Andras Kemeny Expert leader immersive simulation Renault France
Autonomous vehicle validation relies on massive simulation, due to the vast number of kilometers to be run in various road, traffic and weather conditions. Nevertheless, acceptance of the proposed automation system will heavily influence how the system is used and how efficiently drivers handle sharing or taking back vehicle control. Driver-in-the-loop simulation will play an essential role, using high-performance driving simulators or dedicated configurations, including virtual reality and web-based online solutions. These simulation tools will be essential to complete efficient autonomous vehicle engineering design for driver acceptance and comfort.
Developing a future-proof scenario database in a world of emergent standards
Mike Freeman Project engineer Warwick Manufacturing Group UK
Testing is fundamental to the safety of automated driving software, but driving billions of miles to achieve sufficient scenario coverage is unfeasible and requires a better approach. Scenario sharing across the industry is gaining support as being the solution. With this aim, the standardization of scenario description is being worked on but we are still some way from a universal standard. This puts the system architect in a difficult position: how to design a scenario database that will support today’s standards as well as those of tomorrow? As part of the UK’s Midlands Future Mobility project, we answer this question.
Generation of safety-critical scenarios for validation of autonomous vehicles
Julien Niol Research engineer Apsys France
Autonomous vehicles introduce a new safety challenge: they rely on algorithms to analyze their environment and make decisions accordingly, without any human supervision. This paradigm requires the consideration of not only functional safety but also the safety of the intended functionality (SOTIF). We present how we have experimented with a model-based safety assessment methodology to generate classes of safety-critical scenarios for the validation of autonomous vehicles. This approach is based on a high-abstraction-level behavioral model integrating the system architecture and its immediate environment, developed to address key concepts introduced in ISO 21448.
Automated validation toolchain for autonomous driving functions
Thorsten Drogge System architect Elektrobit Automotive GmbH Germany
Handling huge amounts of raw sensor and vehicle bus data in the hundreds-of-petabytes range, as well as maintaining a precise and comprehensive sensor model tightly coupled to a restbus simulation, poses a major challenge for sufficiently testing and validating automated and autonomous driving functions. A high-performance web portal to ingest, track and reprocess such data in the cloud, or from locally attached mass storage of hardware- or software-in-the-loop simulation solutions and driving scene catalogs, makes the task of data orchestration manageable. In addition, suitable tooling provides hardware support for maintaining a precise sensor model and bus simulation to emulate a full automotive restbus to the device under test.
Vision, lidar and sensor test and development
New consumer-friendly ADAS rating system including innovative lidar solutions
Dr Mircea Gradu Senior vice president quality and validation Velodyne Lidar USA
Rapid developments in the field of driver assistance technologies require clear communication from the industry. Customers need crystal-clear information about what their vehicles can do. Simply naming ADAS functions such as LKA or ACC as a feature is not enough. A clear, general nomenclature is needed to understand and accept new functions. Service descriptions must be refined, and test protocols must be adapted to the use cases. Greater safety through new sensor technologies requires common, transparent standards that must be clearly communicated to enable responsible implementation. The presentation gives an overview of identified shortcomings in current ADAS sensor taxonomy, testing and validation, and proposes a standardization approach leading to a new ADAS feature rating system.
Synthetic data utilization for AD/ADAS sensing functions performance evaluation – challenges and opportunities
Farid Kondori Verification tools lead Aptiv Sweden
To tackle the problem of verification and validation of autonomous vehicles, synthetic data generated in virtual environments can be employed to complement field testing, because these environments are highly scalable and inexpensive. Although in recent years virtual environments have been used by the community to develop and test ADAS/AD functionalities such as AEB or ACC, there has been limited scope for using synthetic data to evaluate the performance of sensing functions. This presentation will address the challenges and opportunities of virtual environments for the test and validation of camera-based sensing algorithms.
Modular solid-state lidar to find the best fit for automotive applications and car integration
Filip Geuens CEO XenomatiX – True Solid State Lidar Belgium
XenomatiX and Marelli have elaborated a modular solid-state lidar approach. Based on compact modules, different lidar systems can be composed to find the right fit between lidar performance, application requirements and vehicle integration restrictions. A number of system configurations will be explained and detection capabilities will be shown. As Level 3 ADAS is the next target for automotive mass-production vehicles, a high-resolution front-view solid-state lidar configuration will be presented.
LWIR thermal sensing is a must in the autonomous suite
Raz Peleg Sales director AdaSky Israel
The current trio of CMOS, radar and lidar sensing fails to address certain corner cases. Adding LWIR thermal sensing to this suite will allow all-weather detection and classification, VRU classification even through fences, VRU classification when only part of the body is exposed (between parked vehicles), a solution for blinding situations and faster AV shuttles. The sensor is passive and cost-effective and can support the shift from prototypes to mass autonomous production.
See the unseen: the future of driving
Tim LeBeau Chief business development officer Seek Thermal USA
The best vehicles on the market are hyper-aware and able to detect challenges in their path more quickly than occupants. As autonomous vehicles begin to hit the road, it is imperative that ADAS are optimized to safely avoid hazards with speed and accuracy. The presentation will explore concerns around autonomous vehicles and share the impact of thermal sensors on minimizing human control in driving with real-time insights. It will discuss how data collected from thermal sensors can be leveraged to identify impending dangers before they occur, sharing how temperature values create visual images, accurately distinguishing living from non-living objects.
Physics-based sensor simulation – essential for developing the safest autonomous vehicles
Serge Laverdure Connected and automated unit director ESI France
The certification of autonomous vehicles will require more and more virtual testing as a consequence of a new level of complexity that the automotive industry has never faced before. Nevertheless, real testing should evolve accordingly and should be supported by interoperable software tools. ESI is providing an interoperable solution relying on high-fidelity physics-based sensor models to tackle the harsh weather conditions in which autonomous vehicles will operate in the field (fog, rain, snow, dust and sand). The aim of this presentation is to evaluate the effect of harsh weather conditions on lidar performance.
Verifying L2+ collaborative driving and ADAS functions using DIL
Ram Mirwani Director, global business development, ADAS Konrad Technologies & VI-grade Germany
As more standards are worked on and released for defining autonomous driving capability, the test focus on L2+ ADAS features being released in several vehicles by OEMs continues to intensify. With collaborative driving taking center stage for near-term autonomous driving functions, key variables to verify for L2+ ADAS functions include driver reaction time, acceptable tolerances for taking over control and overall user experience. In this session, VI-grade and Konrad Technologies will highlight how DIL with sensor fusion test is a platform for jointly verifying the functional performance and driver experience for L2+ ADAS functions.
Real-time over-the-air automotive radar hardware in the loop to test autonomous vehicles
Dr Kasra Haghighi CEO Uniquesec Sweden
Fahimeh Rafieinia CTO Uniquesec Sweden
Autonomous driving will revolutionize the future of mobility. The main pillar of autonomous driving, as well as of advanced driver assistance systems (ADAS), is sensing. Among all sensors, radars are the most reliable and versatile, providing environment perception. Test and validation of radars and radar-based safety functions is a necessity that requires an advanced OTA radar HIL setup in which rich and dynamic traffic scenarios can be emulated. Certain real-time requirements need to be established between the environment simulator and the radar target simulator. This presentation will demonstrate a full OTA HIL setup enabling emulation of more than 200 moving targets for the radar under test.
Platform-based approach for a future-proof ADAS V&V infrastructure
Balazs Toth Technical sales project manager National Instruments Germany
The growing variety, resolution and data rate of modern ADAS sensors and components require adequate systems and methods for the validation and verification of ADAS sensors and sensor fusion systems. An open, scalable, modular, high-performance ADAS record-and-replay platform provides an optimal way of addressing today's complex ADAS V&V challenges.
Reference measurement system for development and validation of ADAS functions
Steffen Metzner Technology scout ADAS, simulation and control AVL List GmbH Austria
Despite all efforts to carry out tests of vehicle automation functions in simulation, many tests still need to be executed directly in-vehicle. For efficient development and verification of sensor systems, a highly accurate reference sensor system is necessary. Without such systems, troubleshooting and objective evaluation of vehicle behavior become very expensive. The solution discussed in this presentation aims to support the development of vehicle automation functionalities throughout the entire development process, from sensor selection to function validation. The main advantage is the possibility of getting feedback in the vehicle, which can then be refined in post-processing.
Dr Simon Funke Partnership Manager Automotive understandAI GmbH Germany
Testing autonomous driving functions requires a huge number of miles to be driven to ensure safe operation. To track development progress, miles per disengagement are reported. Recently, alternative metrics have been proposed, such as the Driverless Readiness Score, which also takes false negatives and simulation into account.
We propose a data setup in which raw lidar data is translated into simulation models and then augmented by varying the environment, the traffic and, especially, the occurrence of critical situations. Through sensor-realistic simulation, different sensors and sensor positions can be analyzed on the basis of a single data set, drastically reducing the number of miles to be driven. We validate our approach by showing better perception performance on the augmented data set compared with the raw data.
Solid-state lidar based on liquid crystal metasurfaces
Gleb Akselrod CTO Lumotive USA
Since the first mechanical lidar sensors surfaced for use in robotic cars, multiple approaches have been proposed to address the challenging performance requirements in terms of range, field of view, resolution and frame rate. Although OPA- and MEMS-based systems show some promise in cost-effective design, the small aperture of the mirrors or phased arrays in these systems makes it impossible to meet industry performance goals. A lidar based on liquid crystal metasurfaces (LCMs) manufactured in CMOS silicon can be built with large optical aperture size, yielding good optical performance while also achieving low cost in mass production.
Best Practices & Innovation in Test & Development
How augmented reality and vehicle-to-pedestrian communication increase safety
Robert Kempf Vice president sales and business development ADAS/autonomous driving Harman International Germany
Vehicle-to-pedestrian (V2P) communication is an innovative application of ADAS technologies that has the potential to significantly increase public safety. V2P proactively alerts automobile drivers and pedestrians to possible safety conflicts through the use of vehicle-to-everything (V2X) technology. In the car, V2P can work on low-latency 5G peer-to-peer signals to identify objects in the vehicle’s path through proximity scanning. Similarly, pedestrians or cyclists with a C-V2X-enabled mobile device will also receive an alert that a vehicle is entering their path. With this improved awareness, vehicles, pedestrians and cyclists can be alerted to potential safety conflicts even in situations where advanced cameras can’t see past physical obstructions – such as around corners or through parked vehicles. This presentation will give an overview of the status of the technology, current developments and V2P's potential for pedestrian safety as well as the industry.
Latest achievements in the world of connected driving
Thomas Jäger Senior vice president and head of technology technical service management Dekra SE Germany
The presentation will discuss the latest V2X technology developments (ITS-G5, DSRC-wave, cellular); the latest regulatory and certification situation (EU, USA and others); testing requirements for connected and automated driving; the most recent developments in regional and global interest groups; challenges and outlook for the future.
Autonomous Vehicle AI, Software and Sensor Fusion Conference
Scale, variability and accuracy: synthetic data sets for training and validation
Danny Atsmon Founder and CEO Cognata Israel
Machine learning and deep neural networks require tremendous quantities of data for training and validation; but even at scale, raw, repetitive or inaccurately labeled data doesn't produce results. Training and validation alike call for accurate large-scale data sets comprising common scenarios, edge cases and every sort of variation in between. Furthermore, each process requires a distinct data set. We will explore how new techniques in synthetic data generation are helping time-pressured industries like automated driving satisfy the ever-growing need for larger, more diverse and highly accurate data sets.
Functional safety and security – partners or independent contributors?
As the complexity of systems increases, so does the need for functional safety and security. Quite often these goals are meant to be fulfilled by the same component. But how can they be analyzed and designed so that, instead of being two independent areas, they contribute together toward the achievement of the system goals? The presentation will show the similarities and differences in goals, processes, concepts and mechanisms between functional safety and security. It will also discuss the analysis that can be done to bring them together on the same page.
Combined model-based and AI architectures for safety and comfortable driving
Dr Son Tong Senior research engineer Siemens Digital Industries Software Belgium
This talk presents our autonomous vehicle (AV) algorithm developments, exploiting combined architectures of model-based methods and artificial intelligence (AI) toward safe and comfortable driving objectives. Recently, AI has been investigated in AV control; however, it suffers from a lack of rigorous results on explainability and safety. We discuss several strategies that incorporate data learning into control development to deal with these challenges while enhancing performance: imitation learning of human-like driving in lane-keeping; Gaussian process control for snowy driving; and reinforcement learning control. Finally, our experience in applying AI to robust sensor fusion is also highlighted.
Tackle the challenges of AD algorithm development with versatile software tools
Nicolas du Lac CEO Intempora France
Software has become essential to autonomous vehicles; at the same time, the number of software tasks is increasing rapidly with new sensor technologies. Automated driving development is becoming more complex, expensive and time-consuming. It is critical for developers to find reliable, versatile, powerful software tools that enable them to collaborate, face these challenges by reducing the workload, and handle most automotive use cases from R&D to production. This presentation will focus on software development concepts for automated driving and will show how some OEMs and Tier 1 suppliers succeed, with algorithm demonstrations in multi-threaded and distributed architectures.
Migrating autonomous software from prototype to production
Robert Day Director automotive solutions and platforms embedded and automotive division Arm USA
This session discusses the challenges in moving autonomous vehicles from prototype to deployment, starting with the general requirements for the compute platform and then looking in more detail at the huge task of migrating the enormous software stack into a commercially deployable platform. The presentation will cover software platforms from an open-source and a commercial perspective, including the use of operating systems and hypervisors, middleware and application stacks. There will also be discussion around functional safety and how mixed-criticality systems can be deployed effectively.
Dynamic risk management (DRM) for safe and driver-accepted autonomous driving
Thomas Freudenmann CEO EDI GmbH - Engineering Data Intelligence Germany
With our EDI hive IoT framework we trained an AI-based dynamic risk management (DRM) system – algorithms that enable autonomous vehicles to react to critical driving situations in a context-sensitive way, like an experienced human driver. The AI-based algorithms were trained on more than 100,000 incidents, specifically focused on drivers' decision-making processes in critical traffic situations. In addition, we have established a validation environment with PTV Vissim in order to validate different driver models in simulation.
Real-time trajectory planning for automated driving and some related applications
Dr Joshué Pérez Rastelli Principal investigator Tecnalia Spain
Automated driving has expanded the functionalities of semi-, highly and fully automated vehicles in recent years. Most applications receive inputs from onboard sensors and from communication with the infrastructure and other vehicles. Some motion planning and control techniques have been implemented for complex environments; however, most are computationally expensive to execute. This presentation describes the techniques used by different research teams, their contributions to motion planning, and a comparison among these techniques. Furthermore, an approach based on a testing methodology for the validation of path planning and control algorithms for current and future automated vehicles is presented. A high degree of modularity, adaptability and real-time generation has been considered in the design of the proposed method, which shows good results in real applications in complex environments.
Design and implementation of a novel lane-departure detection algorithm
Dr Imran Hayee Professor University of Minnesota Duluth USA
Existing lane-departure detection systems use either some kind of image processing or advanced differential GPS technology. We have designed a novel algorithm that can detect an unintentional lane departure using standard GPS technology, without any need for a camera or lane-level-resolution maps. We have successfully implemented the algorithm in a proof-of-concept system and published our results at TRB 2019. The algorithm's success rate is almost 100%. We have filed a US patent application and secured funding to develop a smartphone app that will make this feature widely available to the public.
Multi-sensor AI considerations
Nicola Croce Technical program manager Deepen AI USA
Accurate classification and segmentation across multiple sensors is required for developing critical ADAS and autonomous vehicle components. Having redundant sensors is important to avoid safety risks in perception, tracking and path/motion planning algorithms. This talk will cover best practices for how to manage and benchmark your AV/ADAS AI models for fused sensor configurations. It will include data validation aspects, early versus late fusion, and data taxonomy implications for your model.
Raw sensor fusion – environmental perception model
Ronny Cohen Co-founder and CEO VayaVision Israel
L2 and L2+ systems are becoming mainstream and are implemented in many vehicles today. However, the switch to L3 and above is extremely problematic, since responsibility moves from the driver to the AV. This means that the AV must be safer than a human driver (many would say at least 10 times safer). To meet these functional safety requirements, sensor fusion algorithms must become far more precise, reliable and robust, with multiple redundancies. To achieve this, a raw sensor fusion methodology must be implemented. Unlike object-level fusion, raw sensor fusion can achieve much better precision, reduce false alarms, work with multiple sensor inputs and continue working even when some sensors degrade in performance or go completely offline.
Challenges of large-scale sensor data processing for autonomous driving
Jan Wiegelmann CEO Autovia GmbH Germany
During the development and validation of autonomous driving systems, engineers must collect and store a huge amount of sensor data for analysis, deep learning and safety validation. In the presentation, we will show insights from using frameworks for large-scale data processing and distributed applications running in on-premises clusters and in the cloud. We will share our experiences and lessons learned on accelerating the end-to-end engineering process from data ingest and cataloging to analysis, development and safety validation.
Processing sensor data with a hybrid multi-protocol scale-out file system
Stefan Radtke Technical director systems engineering, EMEA Qumulo Germany
Sensor data for autonomous vehicle development is growing exponentially. Large OEMs and suppliers typically have tens of petabytes stored in their data centers. However, machine learning and other technologies drive demand for compute power that often cannot be satisfied within those data centers. Training deep neural networks, for example, requires thousands of compute hours, which can often be better performed in the cloud, where bursty workloads can be satisfied with on-demand resources such as GPU-optimized servers. During the presentation, we'll explain how Qumulo addresses these challenges with its multi-protocol file system, which works in data centers as well as in the cloud.
How reinforcement learning can boost the way to fully automated driving
Christian Spohn Innovation manager automotive Atos Information Technology GmbH Germany
Dr Christian Peeren Senior data scientist - information management and analytics Atos Information Technology GmbH Germany
An important aspect of perfecting autonomous driving is providing quality labeled data to feed the numerous deep-learning algorithms. Since the amount of data is in the petabyte range, labeling is a very sophisticated task, mostly done manually. By its nature, reinforcement learning doesn't need fully labeled data, but rather large quantities of data and a reward signal for teaching a neural network certain actions. In this presentation, we discuss whether reinforcement learning can help speed up the labor-intensive data curation process.
Please note: this conference programme may be subject to change