Conference Program
The full conference program has been announced!
The two-day conference brings together world-leading experts in autonomous vehicle research, AI, software, sensor fusion, AV testing, validation, development, standards and safety. 50+ speakers from international OEMs, Tier 1 suppliers, R&D centers and innovative transportation startups will share best practices and innovative strategies. Hear from IBM, Volvo Cars, Torc Robotics, Tier IV, The Goodyear Tire & Rubber Company, Argonne National Laboratory and more.
Click the links below to see speakers & presentations
Room 1
Key issues, strategies and innovations and their implications for safe AD/ADAS development and deployment
9am - 12.25pm
Moderator
Continuously delivering safety cases for an ADAS/AD product – easy?
For traditional automotive E/E systems, if that is an allowed expression, functions such as ABS were developed to be sufficiently good. ISO 26262 was then applied to define additional means and measures so that the implementation would be reasonably safe for use, fulfilling the ISO 26262 safety norm. For ADAS and AD we have a more complex situation: we are more or less forced to release functional growth along with the advances of functional development and its verification and validation. This puts very stringent requirements on the components that constitute the ADAS or AD system. All safety-relevant components need to have their contribution to the safety case known, and deviations need to be handled by the system. What does that imply? A lot. The presentation will elaborate on some of the areas of concern, such as the SEooC concept and ODD expansion in the light of ISO 26262, as well as verification in the light of ISO 21448.
Transitioning from research to production-level autonomous vehicle testing
Torc has a rich, diverse history in the autonomous vehicle space dating back to 2005. Since becoming an independent subsidiary of Daimler in 2019, Torc has become laser-focused on delivering an autonomous long-haul trucking solution. Torc's team of highly trained test crews (in-vehicle fallback test drivers and safety conductors) is an important part of its research and development testing to help ensure safety. The research testing has evolved into a rigorous development lifecycle process. As we progress toward driverless operations, a cohesive cross-platform SIL/HIL/VIL testing strategy that includes best practices and standards will need to be employed. Attend the session to learn about Torc's testing journey, including specific testing examples and its plans for continuing to work toward delivering a safe and reliable autonomous solution.
What the audience will learn
- Brief synopsis of Torc's history and product delivery vision
- Practical overview of the autonomous vehicle testing approaches and technologies
- What we did - How Torc tested and deployed autonomous vehicles early in the development cycle
- How we've evolved - What Torc has learned and adopted over the past four years to increase testing rigor today
- Where we are going - Torc's fully evolved testing strategy for releasing a safe autonomous trucking product in 2027
Autonomous vehicle safety readiness for ridesharing service
Applying modern AI to advanced autonomous driving
Join Anurag's presentation to dive into the application of the latest AI models in developing autonomous driving technology, discuss the transformative impact of modern AI in building more robust and adaptive AD systems, and hear about Plus's real-world commercialization experience across the U.S., Europe and Australia.
The role of inertial sensors for automated driving
In addition to perception sensors such as cameras, lidar and radar, accurate vehicle localization requires intelligent use of 'absolute' sensors such as GNSS, inertial measurements and a high-definition map. This presentation describes some of the key requirements and design trade-offs for inertial sensors, and how these sensors contribute to automation in challenging environments.
What the audience will learn
- The important role of vehicle motion sensors for automated driving
- Differences between Absolute Positioning (in global reference frame) versus Relative Positioning (in vehicle reference frame)
- Continuous improvements in performance & cost of inertial sensors
High definition sensor redundancy for automated driving
This presentation will demonstrate the value added by imaging radar, thermal cameras and lidar for automated driving. It will discuss situations where one sensor doesn't perform well but is compensated for by the other two modalities. It will focus on key use cases such as lost cargo.
Tire intelligence for ADAS and AV applications
Kanwar Bharat Singh
Program manager, algorithms and software engineering
The Goodyear Tire & Rubber Company
USA
Room 1
Regulations, standards, metrics and collaboration – building blocks for safe ADAS and AD technologies
2pm - 5pm
Moderator
Test & deployment of autonomous vehicles in California – a regulatory update
California is the birthplace and epicenter of autonomous vehicles and remains at the heart of AV testing. In what has been an eventful year, it’s more important than ever to keep up-to-date with regulatory developments, and engineers wishing to test their technologies on California’s streets should not miss this presentation.
Automated driving system behavior evaluation
The behavioral capability of the automated driving system (ADS) is an important attribute of ADS safety. Behavioral capabilities can be categorized into three broad categories: regulatory compliance, conflict avoidance and collision avoidance. In particular, the annexes in ISO 21448:2022 highlighted the need for more detailed scenario-dependent behavioral evaluation around conflict avoidance. Assuming the identification of a set of scenarios a priori, this points to the need for objective metrics that can evaluate whether the behavior of the ADS meets the scenario-specific behavioral targets. This presentation aims to discuss computational metrics that can be used as behavioral benchmarks for flagging when ADS behavior does not meet expectations. Such metrics can also be applied to discovering novel scenarios and events with behavioral issues from existing data sets.
What the audience will learn
- Establishing a clear conceptualization of good driving through Drivership
- Operationalizing Drivership through behavioral benchmarks
- Research needs in the space of behavioral reference models
- An overview of some of Waymo’s computational behavioral models including NIEON, Surprise, and FSM.
Building trust in AVs via safety metrics
A safety case is a topic that has been much discussed when it comes to safe deployment for AVs. Within the framework, it becomes critical to talk about what metrics are identified, generated and reported upon. Even more critical is their relevance and impact on building trust and confidence, internal and external. This topic dives into metrics within a safety case and their value and impact in building trust.
Updates from ASAM OpenODD specifications: taxonomy, modular conditions and operational domains
What is the AVSC and how is it influencing standards?
The presentation will include an overview of the AVSC (Automated Vehicle Safety Consortium) and its mission regarding the development of best practices and how they lead into other standards organizations, along with the impact and influence AVSC has had since its inception in 2019.
What the audience will learn
- What is the AVSC and how is it influencing the autonomous vehicle industry
- The mission and goals of the AVSC
- How the AVSC operates
- Advantages over other standards organizations
- Latest published work
Open-source SDV development: Autoware Open AD Kit 3.0
This presentation will introduce the Autoware Open AD Kit 3.0, a collaborative effort from key software-defined vehicle stakeholders such as Arm, AWS, eSync Alliance, Leo Drive, Excelfore, Red Hat, PIX Moving, SOAFEE and TIER IV. The Open AD Kit framework is a testament to the Autoware Foundation's vision of developing SDV solutions based on open standards and collaboration. On the technical side, the presentation will go through the company's elaborate CES demo to explain the essential components (cloud-native, DevOps, OTA, containerization, etc) that enable the big loop, a concept that is frequently used to describe how modern software development methodologies are translating into automotive.
What the audience will learn
- The Autoware Open AD Kit framework and how the AWF collaborates with key automotive stakeholders
- How to replicate the Open AD Kit 3.0 (an open-source, well-documented blueprint)
- The important components of SDV: containerization, cloud-native, consolidated and elastic compute, OTA, DevOps tools, etc
- The value SDV will bring to all stakeholders (e.g. auto makers, consumers, technology developers, etc) across the board
- Unlocking new business/revenue models through SDV (based on a methodical work conducted with the Wharton School)
Room 2
Developing software, AI, big data and architecture – challenges and innovations
9am - 12.25pm
Moderator
Empowering the automotive ecosystem through collaborative mapping
The automotive industry is being transformed by connected vehicles and advanced perception systems, generating a wealth of data that has the potential to revolutionize mapping. However, challenges in data processing, integration, and standardization remain. Overture Maps offers a collaborative solution.
Overture provides an open, customizable base map for both driver-facing navigation and machine-readable applications, empowering the automotive ecosystem to build unique map products, enhance user experiences, and unlock new revenue streams. By establishing interoperability for automotive data layers like traffic, hazards, and parking information, Overture enables a rich ecosystem for seamless data exchange.
Sensor data generated by millions of vehicles holds tremendous potential. However, challenges remain in standardizing data formats, streamlining mapmaking, and establishing efficient sharing mechanisms. We invite OEMs, suppliers, and the broader automotive ecosystem to collaborate with us to unlock the full potential of this data. Together, we can enhance map quality and coverage, while fostering innovation and efficiency only possible through collaborative mapping.
Join us to learn how Overture is empowering the automotive ecosystem to harness the full potential of collaborative mapping.
Exploring generative AI in AD/ADAS: applications and limitations
Developing and deploying self-driving technology safely
After two decades and billions of dollars poured into self-driving, access to this technology is limited to a few locations. Before we can realize the tremendous personal and societal value that driving assistance and automation promise to deliver, we must design a system that can safely and verifiably handle the persistent long-tail of real-world driving scenarios. Traditional systems have failed to tackle the long-tail of driving scenarios using brittle and time-consuming rules-based programming. Wayve is pioneering a new approach using end-to-end embodied AI. Wayve will present its end-to-end embodied AI as a solution to effectively handle complex, real-world driving situations.
What the audience will learn
- Develop an understanding of the complexity of the long-tail of driving problems
- Recognize the challenges presented when attempting to solve the long-tail with concepts-and-rules-based programming
- Understand the benefits and recognize the potential of Wayve’s end-to-end embodied AI to tackle and conquer the long-tail
Accelerate AV development with data-driven automotive AI
Developing AVs is a time-intensive and complex process that requires best-in-class data and AI training infrastructure. Companies developing software-defined vehicles need to accelerate time-to-market and minimize costs without sacrificing safety, combining vehicle sensors, map data, telematics and navigation guidance using machine learning and data fusion techniques. Data-driven development is not without its challenges. One of the biggest challenges is data collection and integrity, as data needs to be collected accurately and consistently to drive accurate decisions. Additionally, data-driven development requires data analysis capabilities, as information needs to be analyzed and interpreted to make meaningful decisions. Finally, data-driven development requires collaboration between stakeholders and technical teams to ensure that data is being used appropriately. In close collaboration with Red Hat and Nvidia, IBM will deliver fully integrated systems that bring AI-powered computing to everywhere data is created, from the edge to the cloud, helping businesses easily deploy tailored AI applications to drive innovation. This presentation examines how to best leverage AI-powered data infrastructures and software to accelerate AV development and achieve maximum efficiency.
Accelerating ADAS and autonomous vehicle development with cloud-native solutions
Autonomous vehicles collect petabyte-scale sensor data. The IoT FleetWise Rich Sensor data solution allows auto makers to scale the collection and transfer of meaningful data by defining driving campaigns, enabling them to extract more value and context, reducing the time and cost of autonomous driving feature development. Simultaneously, software is playing a pivotal role in vehicle design and evolution, driving the transition toward software-defined vehicles (SDVs). This session will showcase the latest breakthroughs in cloud-native development, demonstrating how auto makers can leverage AWS and Qualcomm technologies, such as the Snapdragon Digital Chassis, to scale and accelerate vehicle software development, including leveraging virtualized digital twins and cloud-native tools to test and validate code. This session aims to equip attendees with the knowledge and strategies to embrace the 'Big Loop' of data-driven and software-defined development, accelerating the path to safer and more capable ADAS and autonomous vehicle technologies.
What the audience will learn
- Autonomous vehicles collect petabyte-scale sensor data; it’s necessary to optimize its collection and use
- Cloud-native software development, enabled by technologies like AWS and Qualcomm's Snapdragon Digital Chassis, allows auto makers to scale and accelerate vehicle software development, including testing and validating code on virtualized digital twins
- By combining rich sensor data and cloud-native software development, auto makers can embrace the 'Big Loop' of data-driven and software-defined development
Integrating FPGA and AI technology
Without new architectural solutions for immediate implementation, silicon chip manufacturers struggle to keep up with automotive innovation. As a driver of automotive progress, AMD is able to solve this with new embedded AMD Ryzen™ processors and adaptable FPGA/SoCs, elevating heterogeneous computing. We will discuss the benefits of AMD's new offerings and demonstrate them on Xylon's Surround View system with integrated AI capabilities. This POC showcases a unique combination of programmable FPGA logic for data pre-processing, dedicated AI processing partitions, and scalar processing on a multi-core part for post-processing. Performance metrics and demonstration videos will highlight the advantages of this hybrid compute platform.
What the audience will learn
- New AMD offerings in the automotive space and ways in which AMD will be driving the future of automotive systems
- New architectures and the benefits of combining them in a single chip
- Design flows that successfully partition chip architecture in the most efficient ways
- Performance improvements achieved in the real-life POC system, shown via a demonstration
Data management and the homologation challenge
To introduce automated driving systems (ADS) to market, OEMs must demonstrate ethically acceptable safety levels, which requires achieving a positive risk balance amid numerous real-world risk factors. Real-world data insights must be complemented with model-based and scenario-based testing methodologies. In this talk, we will explore leveraging AI for identifying critical data sequences and using advanced technologies like GenAI for automated scenario generation. Additionally, we will outline innovative solutions in smart fleet management to address data coverage gaps for ADS development and validation. This holistic approach aims to accelerate development cycles by optimizing data acquisition and enhancing safety validation processes.
What the audience will learn
- Big picture: the homologation challenge for vehicles, systems and components
- Combining an analytics-based approach and a data-driven explorative approach for identification of hazardous scenarios (ISO 34502)
- Defining parameter ranges using real-world traffic data for scenario-based safety assessment
- Retrieval-augmented generation (RAG) based scenario generation copilot
- ODD gap coverage through targeted logging fleets
Room 2
Key components in the ADAS/AD test and development toolchain
2pm - 5pm
Moderator
ADAS performance variability in unique and challenging scenarios
The presentation will explore ways of visualizing the performance of certain ADAS features in scenarios currently tested in NHTSA's NCAP and provide insight into some unique and challenging scenarios where performance is degraded or becomes more variable.
What the audience will learn
- Visualizing data on ADAS performance
- Identifying several scenarios that are challenging for current ADAS features
- Potential areas for future research, spurring improvement in ADAS technology
ODD modeling, scenarios and ODD coverage – safety-driven validation
The presentation introduces modeling of ODD (operational design domain) in ASAM OpenSCENARIO DSL 2.1, and incorporating it into safety-driven validation of ADS. It presents virtual testing, interaction of scenarios and ODD. Specific attention will be given to the challenges of modeling ODDs and validating correct behavior of the ADS within an ODD.
What the audience will learn
- How to model an ODD using ASAM OpenSCENARIO DSL 2.1
- How to simulate and validate ADS/AV and incorporate ODDs into the simulation
- How to produce coverage metrics on ODDs
Simple on-production target verification of ADAS software
In order to validate that your autonomous software is capable of behaving correctly in all driving scenarios, including those that occur once in 10 million driving events, it is necessary to employ a comprehensive simulation and testing methodology that supports the replay of real-world vehicle data. The cornerstones of a comprehensive testing strategy most likely include a pure virtual software open loop (SoL) solution as well as a hardware open loop (HoL) testing solution. Together with our OEM customers, we have identified an additional market need for a simpler and less expensive (compared with HoL) production target replay and simulation solution.
What the audience will learn
- On-target verification (why/advantages)
- Verification based on real-world data (why/advantages)
- Verification concepts with middleware (validation build versus target build)
- Concepts of on-target stimulation (time/event based)
- High-data bandwidth GETK interface for combined on-target stimulation and measurement with low impact on target
Development of ADAS systems: from critical scenarios to virtual hardware
The rapid advancement of software-defined vehicles and related advanced driver assistance systems (ADAS) necessitates a progressive approach not just to software development but also to the electronics hardware and physical systems development for delivering the features. By leveraging holistic digital twins, auto makers can significantly enhance the robustness and thorough testing of ADAS functionalities. This shift-left approach integrates scenario-based testing, virtual validation and compliance with safety of the intended functionality (SOTIF), leveraging virtual hardware for digital validation before the hardware is finalized.
What the audience will learn
- Leverage real-world data for scenario-based testing
- Automatically generate unknown/unsafe scenarios per the SOTIF standard
- Leverage SDV framework to evaluate ADAS performance on virtual hardware
- How to test AD and ADAS functionality on virtual hardware-in-the-loop
Transforming autonomous transit: lessons learned from real-world ART operations
Explore the future of autonomous transit with our autonomous rapid transit (ART) system as we examine its real-world applications in enhancing transportation services for airports and public transit. We will share several ART operations utilizing full-size buses and ADA-compliant vehicles, alongside an innovative ART airport people mover project that integrates electric buses and a vehicle management platform. We will discuss the implementation process and lessons learned in adopting these solutions. Join us and uncover how ART offers faster deployment and reduced total costs while meeting safety requirements, making it a game-changer for the autonomous transit market.
What the audience will learn
- ART's role in addressing transit challenges: attendees will understand how the autonomous rapid transit (ART) system addresses today's transit challenges
- Real-world ART applications: through real-world use cases, attendees will understand the technologies that ART integrates and its practical applications
- Airport people mover project: by sharing the ART-based airport people mover project, attendees will learn more about ART's capabilities
- Implementation processes: attendees will gain valuable insights into ART implementation processes, practical considerations and the lessons learned from ART projects
- Key benefits: by exploring ART’s benefits, such as faster deployment and reduced total cost, attendees will be equipped with knowledge
Reduction of parameter space scenario testing via safety model
Reduction of the parameter space for synthetic scenario creation is conducted with a novel approach using the GCAPS safety model. Using real-world data from perception systems, CISS, or naturalistic driving data, the decomposition process of GCAPS creates a concrete scenario. The concrete scenario is parameterized, and ranges of the parameters are identified. The safety model uses the object trajectories, road path and informed statistics to determine several metrics such as conflict probability. Applying the safety model to the concrete scenario and parameter ranges segregates the parameter space by the safety model metrics, enabling a focused evaluation of the ADAS.
What the audience will learn
- Methodology in converting real-world data to virtual events to enable synthetic scenario creation
- Safety model capabilities for conflict probability, risk determination and other safety-related metrics
- Application of the safety model to separate scenarios for evaluation based on safety metrics, enabling focused evaluation in HIL and SIL
Room 1
Safety considerations and best practices for advancing ADAS and AV technologies
9am - 3.40pm
Moderator
Bodo Seifert
Senior automotive functional safety engineer and practice lead
TÜV Rheinland of North America
USA
Using model-based system engineering in ISO 26262 processes with independence
This study investigates the integration of model-based system engineering (MBSE) into ISO 26262 processes, focusing on the idea of independence in safety-critical automotive systems. This idea is borrowed from other domains, like DO-178C for aviation, and extended with back-to-back testing. Practices and tools commonly used in MBSE will be examined, and their place in the verification and validation process will be shown. The role of independence at various safety integrity levels will be examined. Finally, a case study will be presented to show how this can be achieved with existing tools and processes.
What the audience will learn
- Model-based system engineering is becoming more important in a variety of domains
- Back-to-back testing with model-based system engineering is considered in ISO 26262
- 'With independence' is a concept from aviation that provides many benefits in the automotive world
- Safety integrity levels in ISO 26262 and how they relate to engineering practices
Enhance and enable Level 4 autonomous parking with real-time HD semantic map solutions
The automotive industry is undergoing rapid transformation with the introduction of new vehicle architectures and software-defined vehicles, alongside advanced processing platforms. This shift creates a wealth of innovative data-driven opportunities. In this presentation, Dr Xiao will explore real-time HD semantic map solutions designed to enhance parking experiences and enable Level 4 autonomous parking systems. The solutions use low-cost sensors – surround-view cameras – to create sharable parking lot maps. These maps, stored in the cloud, provide real-time updates on parking spot availability and type (EV, handicapped, etc). Dr Xiao will explain how these advancements lead to smarter, more efficient parking solutions.
What the audience will learn
- Utilizing crowdsourced parking maps derived from sensor data, sourced from mapping cars or end customer fleets
- Generating shareable global semantic parking maps on demand for autonomous valet parking, leveraging the detection of landmarks like parking lines
- Achieving centimeter-level accuracy in vehicle localization within the map through V-Localize technology
- Providing live status updates on parking spot availability, including handicapped/EV/reserved spots, ensuring efficient and accessible parking solutions
Developing sustainable eco-driving strategies for connected and autonomous electric vehicles
Connected and autonomous electric vehicles (CAEVs) allow the deployment of more advanced driving strategies, such as ecological driving (or eco-driving) strategies, toward even lower energy consumption when driving on streets and passing intersections. Deep learning-based algorithms can be used to optimize the eco-driving strategies for CAEVs on transportation networks (isolated intersections, arterial streets and road networks). Computer simulation and driving simulators can be used to evaluate the effectiveness of the proposed optimization models and eco-driving strategies. Such a study helps car manufacturers and transportation and environmental agencies at all levels understand the design, operation and impacts of optimal eco-driving strategies.
What the audience will learn
- Connected and autonomous electric vehicles (CAEVs) allow the design of eco-driving strategies
- Eco-driving strategy will yield lower energy consumption when driving on streets and passing through intersections
- Deep learning-based algorithms can optimize the eco-driving strategies for connected and autonomous electric vehicles on transportation networks
- Such a study helps government agencies and car manufacturers understand the design, operation and impacts of optimal eco-driving strategies
An analysis of AV and ADAS shortcomings and potential solutions
It seems that on a weekly basis we see another accident or incident involving the apparent failure of an AV or ADAS. Regardless of the frequency of these incidents, they are used as fodder by the media to frighten the public, earning more clicks and ad revenue. We've seen AVs jamming up streets, hitting cyclists, dragging pedestrians and, in a recent tragedy, a Ford Mach-E (supposedly with BlueCruise engaged) striking a stopped vehicle on a highway, resulting in a fatality. I will discuss my opinions as to why such incidents happen and propose technical solutions to reduce or eliminate these issues in the future. Solutions can take the form of better or increased sensors on vehicles or more robust localization and communication systems. There are also many investments that can be made around infrastructure, which could greatly improve the safety and efficiency of ADAS and AV systems. I will discuss a few of these technical approaches (including V2V and V2X) as well as who should be paying for them.
What the audience will learn
- How well do today's ADAS sensors perform in certain conditions – is superhuman sensing possible?
- Is there a silver bullet (can one sensor solve all our issues)?
- What systems need improvements outside the car (localization, communication, etc)?
- How sensor data is combined with AI to interpret the world
- Is affordable autonomy achievable, and what's the ROI for these systems?
Effects of aftermarket modifications on a vehicle’s ADAS functionality
Luis Morales
Director of vehicle technology and product development
Specialty Equipment Market Association (SEMA)
USA
As ADAS technology proliferates in passenger vehicles, there is a pressing need to understand how aftermarket modifications may affect the functionality of these systems. To address these challenges, SEMA stays at the forefront of research and testing, particularly with rapidly evolving vehicle technologies such as ADAS and autonomous features. In this presentation, SEMA will share the results of its groundbreaking research projects that tested the effects of various aftermarket modifications (suspension lifts) on the ADAS functionalities in a Chevrolet Silverado and Ford F-150, including lane departure warnings/lane-keeping assist, forward collision warnings/automatic emergency braking, blind-spot detection and rear cross-traffic alerts.
What the audience will learn
- ADAS technology and features in new vehicles and how they can be affected by aftermarket products and modifications
- The importance of research and data analysis of ADAS functionality after modifications
- Findings of SEMA’s breakthrough research and future projects
- The disconnect between auto makers and the aftermarket industry and how it affects products and services
Preventing unexpected behaviors in autonomous vehicles
Bodo Seifert
Senior automotive functional safety engineer and practice lead
TÜV Rheinland of North America
USA
Four lines of defense for a safer vehicle: process – build what you specify (Automotive SPICE); functional safety – ISO 26262:2018; safety of the intended functionality – ISO 21448; cybersecurity – ISO 21434.
What the audience will learn
- Defining functional goals/specifications
- Identifying hazards
- Creating traceability
- Performing verification and validation
- Producing a safety/cybersecurity plan
Panel discussion: Where standards and regulations meet – how can they best interact?
Join a lively discussion of the current landscape for regulations and standards, as well as what developments might be on the horizon. Our expert panel will also discuss possible interactions between the two, what barriers there might be to interaction and how they might be negotiated.
Moderator
ISO 26262 for cloud-based precise positioning
As the automotive industry advances toward higher levels of vehicle autonomy and vehicle-to-everything (V2X) communication, cloud-based services will play a larger role. While the ISO 26262 standard sets functional safety requirements for software running inside the vehicle, cloud-based applications must adhere to the same requirements. Swift Navigation's Skylark Precise Positioning Service is a cloud-based GNSS corrections service running on AWS that provides vehicles with high-accuracy location data for precise navigation, ADAS, autonomy and V2X communication. Skylark recently achieved ISO 26262 certification, elevating the role of GNSS in the automotive sensor suite and setting a design pattern for the development of cloud-based applications used for safety-critical use cases. Learn more about the service architecture and cloud infrastructure used to ensure compliance with safety standards while leveraging the unique benefits of the cloud.
What the audience will learn
- Application of automotive safety standard principles in a cloud-based environment
- Addressing performance, availability and safety trade-offs
- The state of the art in precise absolute positioning
The 5GAA roadmap for advanced driving: connectivity as a linchpin
This session will present a vision for how connectivity technologies can meaningfully support advanced driver assistance systems (ADAS) worldwide. The 5GAA roadmap for advanced driving includes both wide area and local broadcast communications and shares a vision for how they can augment ADAS solutions. Attendees will learn the immediate and near-term benefits of vehicles equipped with 5G modems and gain insight into the regulatory and standards headwinds to overcome.
What the audience will learn
- How connectivity aids ADAS today and future growth strategies
- The role of industry associations, including 5GAA, in working toward a connected and automated future
- The hard work needed in standards and regulations to make it happen
Innovative techniques and methods for a changeable roadway testing environment
Battelle is assisting the Federal Highway Administration (FHWA) with research to develop realistic and easily changeable mock roadway features (e.g. curbs, pavement markings, barriers, vegetation) to rapidly create and take down diverse scenarios for testing automated driving systems (ADS). These realistic, lightweight, portable mock roadway features will enable proving grounds and other facilities to increase their testing adaptability and agility without having to invest in expensive infrastructure upgrades. We will present several concepts under development – including foam curbs, removable pavement marking and lightweight concrete barriers – and the methods used to develop and test them (e.g. sensor signatures, likeness and durability testing).
What the audience will learn
- Innovative techniques and methods to develop artificial features that help quickly create and take down ADS testing scenarios
- How ADS sensors view real-world features versus the corresponding mock features and associated ADS perception implications
- Applicability and feasibility of using mock roadway features to test ADS
- Insights around ADS behaviors and responses to different roadside conditions
- Challenges and solutions to speed up ADS testing in controlled test settings
Formal methods: game-changer in ensuring automotive software safety and security
The presentation will show how, by automating the use of formal methods via abstract interpretation among other techniques, enterprises can alleviate the tester burden, reduce iterations generated by penetration, fuzzing and unit testing, and allow developers to focus on high-value tasks. This unique approach, which combines static and dynamic analysis, was recommended by NIST (NISTIR 8151) and the White House's ONCD, and has the additional powerful advantage that it yields no false negatives. This means a guaranteed absence of undefined behaviors such as buffer overflows. We will further demonstrate how developers can comprehensively verify software properties and produce critical-bug-free code, ensuring memory safety.
What the audience will learn
- Formal methods accelerate verification for complex cases, reduce user fatigue, and improve developer efficiency, cost-effectiveness and software reliability
- Undefined behaviors represent the root cause of the majority of the most severe software defects and vulnerabilities
- Multiple government agencies and standards bodies have recently recommended widespread industry adoption of formal methods
Room 2
State-of-the-art hardware and software test & development
9am - 3.40pm
Moderator
Next-generation ADAS test for V2X and vehicle-level radar test
As Level 3 ADAS functions continue to dominate vehicle roadmaps in the short term, adequately testing these functions remains a challenge for the automotive ecosystem. In this session, Rohde & Schwarz will introduce two new ADAS test methods: optimized, cost-effective on-vehicle radar testing, and the combination of V2X with sensor HIL testing for reliability checks ahead of road tests. The session will share initial project/demo descriptions, key learnings and potential application areas for these new test methods.
What the audience will learn
- How to accurately verify key radar sensor test parameters with the sensor mounted on the vehicle
- The growing need for and importance of vehicle-level radar sensor and ADAS tests
- How you can combine V2X communication into the sensor HIL test loop for more complete ADAS tests
- The role of V2X HIL testing for vehicle safety functions
Meeting SOTIF for ADAS/AD through efficient software and scenario validation
Validation of automated driving functions, especially for SAE L3, has become a challenging aspect of new vehicle development. Large numbers of tests have to be performed in different environments. Complex SIL and HIL setups and highly scalable cloud simulations are necessary to cover the required testing. In ISO 21448 (SOTIF), sampling is a central component and focuses on targeted parameterization and variation of scenarios. In this presentation, the audience will learn about the SOTIF standard and its effect on vehicle development. We will also present a scalable, virtual toolchain solution for automation, parameterization and efficient, targeted analysis of the test results.
What the audience will learn
- What the SOTIF standard is about
- SOTIF areas – what is Area 3?
- How the new SAE L3 standard is applied
- Selecting the right scenarios to find critical behavior
- Options to utilize virtualization for scalable, automated test environments
Enhancing autonomous vehicle development through simulation
Explore the critical role of perception simulation in developing safe and reliable autonomous vehicles. This keynote offers an in-depth look at aiMotive's aiSim and its capabilities in simulating real-world sensor data to train and validate perception systems. Learn how aiSim empowers engineers to create highly realistic and challenging scenarios for perception algorithms, ensuring they are prepared for complex road environments. Discover the impact of perception simulation on accelerating the development of AI-driven perception in self-driving technology.
What the audience will learn
- An in-depth understanding of perception simulation in autonomous vehicle development
- Practical applications and real-world case studies using aiSim
- How to create and validate perception algorithms in complex, simulated scenarios
- The benefits of accelerated AI-driven perception development through simulation
- The future of perception simulation and its role in advancing self-driving technology
Argonne’s anything-in-the-loop workflow: assessing the energy impact of CAVs
Estimating connected and automated vehicles' (CAVs) energy impacts is challenging due to the non-linear dependence of energy consumption on CAVs' interactions with the environment, and the scarcity of real-world data. To overcome these limitations, an 'anything-in-the-loop' (XIL) workflow has been developed. The XIL workflow integrates various CAV controllers with real powertrain components or vehicles in a safe, controlled and highly repeatable experimental environment, enabling a thorough validation and analysis of functionalities, dynamic performance and energy impacts. The presentation provides an overview of the workflow tasks and highlights achievements in energy conservation and insights into validating a multivehicle simulation leveraging experimental results.
What the audience will learn
- Experimental energy characterization of CAVs
- In-the-loop testing methods development for intelligent mobility
- Validation of multivehicle simulations using laboratory data
A server-based HIL rig proposal
An essential methodology that every ADAS/AD project must leverage is hardware-in-the-loop testing (HIL). While traditional HIL testing is more accurate than SIL, it runs orders of magnitude more slowly than cycle-for-cycle software-in-the-loop testing. Complexity, cost and deployment challenges limit scalability. In this session, Ambarella, in cooperation with Dell, will introduce a new methodology that delivers a low-cost, scalable server-based HIL rig simulation architecture that matches or exceeds the speed and accuracy of traditional HIL rigs for AI model verification with the flexibility of on-premises and/or remote deployment.
What the audience will learn
- The current challenges of traditional HIL rig architectures
- A new approach to AI model development and verification using Ambarella's CV3 ECU controller
- How IT best practices and techniques can help to dramatically reduce ADAS/AD development cycle times
- An introduction to the server-based HIL rig architecture
- Large-scale deployments of a server-based HIL rig reduce complexity, development cycle times and cost
Precipitation characterization for ADAS development of sensor performance at ACE
Ontario Tech University's Automotive Centre of Excellence (ACE), its core research facility, will share the journey of precipitation characterization and of testing ADAS functionality when weather-related soiling affects vehicle sensors and target detection. Ontario Tech and ACE are leading the way in developing new, advanced development routines and testing methodologies for real-world weather conditions. The world-class Climatic Aerodynamic Wind Tunnel and other environmental chambers provide testing in rain, snow, ice and fog at varying temperatures and humidity. ACE provides calibrated rain/snow/ice/fog precipitation characterization that accurately and repeatably reproduces the real-world performance of lidar, radar, cameras and other optical sensors in full vehicle operation.
What the audience will learn
- The presentation will introduce Ontario Tech University and ACE (Automotive Centre of Excellence) capabilities for the R&D and testing of ADAS
- The presentation will share the journey of precipitation characterization and testing functionality/performance of ADAS sensors in real-world weather conditions
- The world-class Climatic Aerodynamic Wind Tunnel and other environmental chambers can provide testing in rain/snow/ice/fog in temperatures/humidity
- ACE's calibrated rain/snow/ice/fog precipitation characterization accurately and repeatably reproduces real-world performance of sensors
- The reproducible performance of lidar, radar, cameras and other optical sensors at varying speeds on city, rural, highway and high-speed roads
Radar ADAS: virtual modeling of targets
The safety of autonomous systems is one of the challenges addressed through simulation by the CVH project at IRT SystemX. One of its aims concerns perception, addressing physically realistic sensor simulation from a hardware point of view on the one hand, and environment modeling on the other. In the case of radar, the applied methodology comprises four main topics: building the virtual sensor, modeling virtual targets, describing simulation scenarios and characterizing the phenomenological model of perturbations. This presentation discusses the modeling of virtual targets, demonstrating the relevance of the methodology through quantified results.
What the audience will learn
- The methodology of testing and validating sensors through a complete toolchain of comparison of real measurements and simulated data
- The targets chosen for virtual modeling and their response to radar sensors
- The experiment plan and the post-processing work with IMT Atlantique to acquire real data measurements for each target
- The working pipeline to get usable 3D models in the tools SE-WORKBENCH and SCANeR Studio
- The methodology to test the representativity of the radar data through RCS measurements and simulation
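For readers unfamiliar with the term: the radar cross-section (RCS, $\sigma$) used in the measurement/simulation comparison above is conventionally defined (standard definition, not taken from the abstract) as

```latex
\sigma \;=\; \lim_{R \to \infty} \, 4\pi R^{2} \, \frac{\lvert E_{s} \rvert^{2}}{\lvert E_{i} \rvert^{2}}
```

where $E_{i}$ is the incident electric field at the target, $E_{s}$ is the scattered field observed at distance $R$, and $\sigma$ has units of area (m²).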
Is ADAS a bridge to full autonomous driving?
Both approaches – ADAS and full autonomous driving – utilize sensors, hardware, processors, cameras, etc, but their integration and computing power requirements differ. The presentation will examine where the concepts diverge and determine if there is a direct path from one to the other. Another essential factor to consider is how the vehicle perceives the immediate environment. Different mapping options – HD maps versus mapless – will be presented and compared.
What the audience will learn
- The development aspects that need to be considered when attempting to move from L2 to L3 to L4 and beyond
- Whether or not autonomous vehicles are simply ADAS on steroids
- The pros and cons of the two leading perception approaches as the basis for motion planning
Moderator
Novel trends in 3D lidar visualization systems for automotive technologies
Analysis of the current state of development of autonomous vehicle technologies requires a critical analysis of the requirements and key elements of autonomous driving systems, in particular visualization systems. The presentation focuses on emerging trends in lidar sensor technologies to improve the performance of autonomous vehicles. New trends include optical sensors with improved performance, fast-response lidar sensing principles, optical laser beam distribution systems – particularly diffractive optics – and a new method of obtaining information in a visual camera format with distance information (possibly in color), enabling analysis by artificial intelligence on a chip.
What the audience will learn
- Analysis of the present situation and problems of conventional lidar sensors for Level 2+ autonomous technologies and beyond
- Novelty and innovations of lidar sensors, in particular optical sensors with high sensitivity, up to single photons for autonomous technologies
- The FLASH principle of scene visualization with a single laser and diffractive optics for high-performance lidar sensors
- Novel camera readout principle of 3D visualization with FLASH mode and TOF ranging perspectives
- Miniaturization of lidar sensors as a way to improve the 3D dynamic visualization system architecture for autonomous technologies
Targetless lidar calibration – unleashing new dimensions in autonomous tech
The presentation will cover challenges with traditional lidar calibration, advantages of targetless calibration, use cases and applications and Deepen AI's targetless calibration.
Generative AI for requirements management and continuous homologation in autonomous engineering
Expand your operational design domain (ODD) and cover new geographical regions by fine-tuning large language models (LLM) and augmenting autonomous driving scenarios with abstract knowledge of standards, regulations and laws.
What the audience will learn
- Generative AI and large language models for ODD expansion
- Regulatory frameworks for automated mobility
- The challenge of expanding into new markets
Testing Time Sensitive Networking and MACsec encryption in an AV
- Introduction to in-vehicle network (IVN) security
- TSN standards and industry adoption
- Security architectures
- Multi-level security validation
- Conclusions
What the audience will learn
- TSN standards and industry adoption
- Security architectures
- Multi-level security validation
- MACsec
- TSN (Time Sensitive Networking)