Conference Program


Room A Strategies, innovations and requirements for the safe deployment of ADAS and autonomous technologies
9am - 12.25pm

Moderator

Dr Sven Beiker
External Advisor
SAE International
USA

9am

Deploying a safe and trustworthy AV in different markets

Vivetha Natterjee
Autonomous Vehicle Safety Specialist
CEVT
Sweden
Now that we have removed the human from the vehicle, has the AV become safer in traffic? No. We have removed the human but not the human element. But wait, is removing the human the solution? No. In fact the opposite is true: the more the merrier. In Vivetha's multi-pillared approach to continuously improving and deploying safe and trustworthy autonomous vehicles, 'inclusion is key'. Data, verification and personas need to be weighed equally. Humans in our different roles (as engineers, as drivers, as traffic inspectors, etc.) are to be modelled using AI in order to understand and prove the safety of AVs.

What the audience will learn

  • How to instill trust in end users so they will hail a robotaxi
  • How the safety of an AV can be measured using a multi-pillared approach
  • How sensible it is to share the AV ecosystem

9.25am

Collision avoidance bus system: radar and camera technology for safe transportation

Kevin Jones
Head of Innovation, America
ZF
USA
Join us for an engaging talk on the revolutionary collision avoidance bus system developed by ZF, incorporating advanced radar and camera technology. This groundbreaking system aims to prevent collisions with cars, bikes, and pedestrians, ensuring enhanced safety on American roads. We will delve into the value propositions of this innovative driver-assist product and overall increased safety. Our discussion will also address the technological challenges associated with scaling up this autonomous product. Discover how this system is transforming the transportation and logistics industry, paving the way for a safer and more efficient future.

What the audience will learn

  • Are you a bus manufacturer or MTA? Do you need a collision avoidance system?
  • Are you a startup and want to collaborate?
  • Do you need an intro to ZF?
  • Do you need other hardware, software, or business help?

9.50am

Safely delivering autonomous trucking solutions

Michelle Chaka
SVP, Safety and Regulatory
Torc Robotics
USA
Torc is focused on helping the freight industry thrive by providing a safe, reliable, and cost-effective trucking solution. Delivering on safety requires going beyond just building a safe product. Collaboration with partners and stakeholders from across the ecosystem to inform the full lifecycle and customer journey is required. Attend our session to stay connected with how safety is shaping the future of freight.

What the audience will learn

  • How Torc is approaching its safety management system to support a complex freight industry
  • What opportunities and challenges remain for the industry to deliver on safety
  • Why collaboration and sharing of safety information with partners and customers is paramount to achieving safety goals

10.15am - 10.45am

Break

10.45am

Validator design and SoTIF processes to address the million miles problem

Hakan Sivencrona
Chief Safety Officer
Zenseact
Sweden
Andrew Backhouse
Technical Specialist Collision-Avoidance Threat-Assessment
Zenseact
Sweden
The “million miles problem” is not unique to AD design. It is an issue for ADAS too. Autonomous emergency steering and autonomous emergency braking have the potential to cause harm, and these faults cannot be addressed by functional safety alone. Perception and decision algorithms that are insufficiently robust can lead to hazardous situations. We will present validator concepts that can be used to design more robust ADAS as well as AD systems. We will then also present how one can iterate the design to minimize the risk of functional insufficiencies and build an argument to motivate that the systems are safe.

What the audience will learn

  • Why the million miles problem is relevant for ADAS
  • The importance of tolerant time intervals in SoTIF
  • Architectural design for handling the tolerant time intervals of functional insufficiencies
  • How event scanners and tail distributions can be used in the verification and validation of ADAS
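The tolerant-time-interval idea above can be sketched as a runtime monitor that only flags a functional insufficiency once it has persisted beyond the interval the system can tolerate. This is an illustrative sketch only, not Zenseact's validator design; the confidence signal, threshold and frame budget are invented for the example.

```python
def ftti_monitor(confidences, threshold, tolerated_frames):
    """Flag frames where a perception insufficiency (confidence below
    threshold) has persisted longer than the tolerated time interval,
    expressed here as a number of consecutive frames."""
    run, flags = 0, []
    for c in confidences:
        run = run + 1 if c < threshold else 0   # length of current low-confidence run
        flags.append(run > tolerated_frames)    # hazardous only once the budget is exceeded
    return flags

# An insufficiency lasting up to 2 frames is tolerated; the 3rd consecutive
# low-confidence frame is flagged (hypothetical values)
flags = ftti_monitor([0.9, 0.4, 0.3, 0.2, 0.95], threshold=0.5, tolerated_frames=2)
```

An event scanner over logged drives could then count how often such flags occur and study the tail of their durations.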

11.10am

Cooperative congestion management

Fang-Chieh Jerry Chou
Researcher
Alliance Innovation Lab
USA
Traffic congestion is a major issue in many urban areas, resulting in increased travel times, air pollution, and economic costs. Congestion management is critical to improving traffic flow, but traditional approaches based on central control and infrastructure investments have limitations. Cooperative congestion management control is a promising new approach that optimizes traffic flow by coordinating driving speeds of connected and automated vehicles. Using a recurrent bottleneck on I-680 as a case study, this presentation will explore the key concepts, benefits, and challenges of the system.

What the audience will learn

  • Overview of congestion management
  • Opportunities of connected and automated vehicles for solving congestion
  • Concept of cooperative congestion management control
  • Case studies of real-world traffic

11.35am

Ensuring software safety and security with continuous testing

Andrey Madan
Director of Solution Engineering
Parasoft
USA
Agile methodologies foster a collaborative culture within organizations and have a proven track record of streamlining the product development process in the automotive industry. They place value on effective communication, integration and cross-functional teamwork to enable continuous and rapid delivery of reliable products that are also safe, secure and compliant with automotive industry standards like MISRA C:2023, ISO 26262 and ISO 21434.

What the audience will learn

  • The high value of adopting a modern development CI/CD workflow
  • Continuous testing: static analysis, unit testing, and code coverage
  • TARA: Integrating security into your SDLC

12pm

Measuring surprising road user behavior

Azadeh Dinparastdjadid
Senior Research Scientist
Waymo
USA
The key role of expectations and expectation violations (i.e., surprise) in the context of road traffic has long been acknowledged. Despite the important conceptual role surprise plays in traffic safety research, there is no precise quantitative definition or computational model of surprise in this domain. We demonstrate, for the first time, how computational models of surprise rooted in cognitive science and neuroscience combined with state-of-the-art machine learned generative models can be used to detect surprising human behavior in complex, dynamic environments like road traffic. In traffic safety, such models can support the identification of traffic conflicts, modeling of road user response time, and driving behavior evaluation for both human and autonomous drivers. We also present novel approaches to quantify surprise and use naturalistic driving scenarios to demonstrate a number of advantages over existing surprise measures from the literature.

What the audience will learn

  • We demonstrate, for the first time, how computational models of surprise rooted in cognitive science and neuroscience combined with state-of-the-art machine-learned generative models can be used to detect surprising human behavior in complex, dynamic environments like road traffic
  • We present novel approaches to quantify surprise and use naturalistic driving scenarios to demonstrate a number of advantages over existing surprise measures from the literature
  • We discuss the application of surprise in the identification of traffic conflicts, modeling of road user response time, and driving behavior evaluation for both human and autonomous drivers
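One common way to make "surprise" computable is as the negative log-likelihood of the observed behavior under a predictive model, so that unlikely events score high. The sketch below uses a simple Gaussian predictor with invented numbers; the models discussed in the session are machine-learned generative models, not this toy.

```python
import math

def surprise_nll(observed, predicted_mean, predicted_std):
    """Surprise of an observation as its negative log-likelihood
    under a Gaussian predictive distribution."""
    var = predicted_std ** 2
    return 0.5 * math.log(2 * math.pi * var) + (observed - predicted_mean) ** 2 / (2 * var)

# Lead vehicle predicted to decelerate at -1.0 m/s^2 (std 0.5):
mild = surprise_nll(-1.2, -1.0, 0.5)  # close to expectation -> low surprise
hard = surprise_nll(-4.0, -1.0, 0.5)  # hard braking -> high surprise
```

Thresholding such a score over naturalistic logs is one simple way to surface candidate traffic conflicts for review.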

12.25pm - 1.35pm

Lunch

Room A Software, AI, architecture and data management
1.35pm - 4.35pm

Moderator

Dr Edward Schwalb
Consultant
Schwalb Consulting LLC
USA

1.35pm

Integrating general purpose and AD/ADAS-specific middleware

Dr Stuart Mitchell
Regional Solution Field Manager North America
ETAS Inc
USA
AUTOSAR Adaptive has a proven record as middleware for HPC nodes within next generation E/E architectures. AUTOSAR provides a range of services supporting development including diagnostics, communication, scheduling and security. However, as a general-purpose middleware it lacks features considered necessary for AD/ADAS development including reproducible behavior (for validation through simulation), a high-performance data transport layer (to support AD/ADAS bandwidth needs) and highest levels of safety (ASIL-D). In this presentation we briefly review the capabilities of AUTOSAR Adaptive and then consider how general-purpose and AD/ADAS-specific middleware can be integrated to combine the best of both when building an autonomous driving solution.

What the audience will learn

  • The role and capabilities of AUTOSAR Adaptive as a general-purpose middleware component
  • Requirements for AD/ADAS-specific middleware that are not covered by AUTOSAR Adaptive, including reproducibility, determinism and validation through simulation
  • Integration points when combining middleware components into a single platform
  • Lessons learned from combining general-purpose and AD/ADAS-specific middlewares

2pm

Virtual ADAS/AV sensor validation in the cloud

Esti Mor Yosef
Senior Program Manager
Microsoft
USA
The variety of ADAS/AV sensors available on the market today is large and growing. Automotive companies are not always aware of all existing sensor options, and even if they were, they would not be able to evaluate all the sensors in the real world because the process is long and expensive. If, at the end of the real-world sensor evaluation process, the sensor's performance is unsatisfactory, making it unsuitable for use in production, the automotive company needs to look for an alternative and start the evaluation process all over again, resulting in delays and cost. This session will focus on the benefits of pre-real-world sensor evaluation using a cloud-based platform to evaluate sensors virtually, compare them and eventually narrow down the number of sensors that will be taken forward for real-world evaluation.

What the audience will learn

  • AV/ADAS sensor types and functionality
  • The challenges of sensor selection for production programs
  • How Microsoft is working with partners to overcome these challenges through virtualization and cloud computing

2.25pm

Revolutionizing autonomous vehicles: harnessing the power of quantum computing for advanced automotive applications

Perminder Singh
Global Tech Lead - automotive
Amazon
USA
Erik Garcell
Technical Marketing Manager
Classiq
USA
Are you ready to experience the transformative potential of quantum computing in the autonomous vehicle industry? Join us as we delve into the groundbreaking applications of this emerging technology and its impact on the safety, reliability, and efficiency of self-driving cars. Discover how quantum-enhanced machine learning is revolutionizing object detection and data processing from advanced sensors like LiDAR. Learn how quantum computing's unparalleled processing speed can optimize route planning and reduce traffic congestion, enabling a safer and more efficient transportation ecosystem. This presentation will challenge your current understanding of the autonomous vehicle landscape and inspire you to envision a future where quantum computing drives innovation in the automotive sector. Don't miss this unique opportunity to explore the cutting edge of technology and its potential to reshape the world of autonomous vehicles.

What the audience will learn

  • Understand how quantum computing enhances autonomous vehicle sensor data processing for faster and more accurate decision-making.
  • Learn about real-world applications of quantum computing in improving route optimization and reducing traffic congestion.
  • Discover the advancements in quantum machine learning for object detection and classification, contributing to safer autonomous driving.
  • Gain insights into the challenges and limitations of integrating quantum computing technology into the automotive industry.
  • Explore the future prospects of quantum computing in accelerating research and development in electric vehicles and novel automotive technologies.

2.50pm - 3.20pm

Break

3.20pm

Developing new ADAS and autonomous systems in the cloud

Christian John
President and Chair
Tier IV
USA
This presentation will look at some of the new development and deployment methodologies that can be used to develop new ADAS and AD functions. The SOAFEE initiative is taking cloud-native standards, tools and technologies and applying them to the development of software-defined functions in the car. This allows for both early development in the cloud and then continuous integration/continuous delivery (CI/CD) to the vehicle. We will cover some of the key technologies and methodologies in the SOAFEE architecture and show an example of how they can be applied to an autonomous software stack.

What the audience will learn

  • How cloud-native development methodologies can be used to develop and deploy ADAS and AD software
  • What the SOAFEE initiative is and how to get involved
  • How SOAFEE can be used to develop an autonomous stack, with a real-world example
  • The key cloud-native technologies that can be used to develop vehicle software functions

3.45pm

Beyond scenario-based testing – validating the software-defined vehicle of tomorrow

Felix Mueller
General manager
TraceTronic, Inc
USA
Florian Rohde
Managing Partner
iProcess LLC
USA
Software, ADAS/AD and the user experience are the key differentiators for future vehicles. This presents new challenges to ensure the systems’ quality. Developing advanced functions requires a complex, interdependent environment of different specialized tools and processes to satisfy the legal and technological requirements. Continuous integration and early detection of failures are critical for success. All system levels must be evaluated and validated. Scenario-based testing with variations only analyzes a system in an ideal state of operation. This presentation will demonstrate how comprehensive test environments and state-of-the-art methodology achieve the highest level of software quality – continuously.

What the audience will learn

  • Challenges companies are facing to validate complex, software-defined vehicles
  • How multi-staging and DevOps help to release high-quality software features
  • What aspects beyond scenario-based testing have a critical impact on product success
  • Insights into the ideal test workflow from function developer to vehicle

4.10pm

Artificial intelligence for AVs: a step-change in training data quality

Dennis Berryman
Technical Sales Engineer
rFpro Ltd
UK
Artificial intelligence (AI) is the only way a driverless vehicle can navigate dynamic public roads. But its effectiveness can only ever be as good as the training data it learns from. In this presentation, rFpro will provide insights into its new ray tracing rendering simulation technology, developed in partnership with Sony. For the first time, it accurately replicates the way vehicle sensor systems perceive the world, significantly increasing the fidelity and quality of synthetic training data. Now available within High Performance Computing (HPC) solutions, it helps to remove one of the biggest challenges facing the industry: collecting high-quality, safety-critical training data for the development of AI.

What the audience will learn

  • The benefits of using high-quality synthetic data to train artificial intelligence
  • The importance of physically modelling sensors, allowing motion blur and rolling shutter effects to be accurately captured
  • How ray-traced synthetic data overcomes the shortfalls of real-world-only data collection
  • The benefits of decoupling simulation from real time to overcome the trade-off between rendering quality and running speed, now available within HPC solutions
  • How a technical partnership with Sony has created an end-to-end perception simulation pipeline for the training of artificial intelligence

Room B Standards, regulations and law, and their impact on engineers and technology
9am - 4.35pm

Moderator

Dan Richardson
Director of Market Analysis
Spatial Web Foundation
USA

9am

Regulations and standards for autonomous road vehicles worldwide: an overview

Surya Kopparthi
Certified Functional Safety Engineer & Autonomy Safety Professional
TÜV Süd America
USA
The worldwide legislative/regulatory and standards landscape for autonomous road vehicles is complex due to several factors: regional differences in legislation/regulations; multiple layers of legislation/regulations for a single administrative region; legislation/regulations leaving room for interpretation, including regarding the applicability of industry standards; rapid changes in legislation/regulations; and liability questions outside of legislation/regulations. This talk will give an overview of the achievements and issues in the field of legislation/regulations and standards for autonomous road vehicles.

What the audience will learn

  • An overview of legislation/regulations for autonomous road vehicles worldwide
  • An overview of industry standards for autonomous road vehicles
  • An overview of liability in the context of autonomous road vehicles

9.25am

Test & deployment of autonomous vehicles in California – a regulatory update

Dr Bernard Soriano
Deputy Director
California Department of Motor Vehicles
USA
California is the birthplace and epicenter of autonomous vehicles. And a regulatory structure governing the technology has been in place for almost 10 years. Automotive and technology companies are testing their autonomous vehicles on the public streets in California. Some companies have deployed their vehicles in a variety of business models. Learn about the latest developments, who is testing where, and what companies have deployed their vehicles, plus lessons learned in the last 10 years. Then discover what engineers need to consider when testing & deploying the next generation of autonomous vehicle technology in California’s evolving regulatory landscape.

9.50am

The evolving regulatory environment for Advanced Driver Assistance Systems (ADAS)

Chris Monk
Senior Managing Scientist
Exponent, Inc.
USA
It has been two years since the Bipartisan Infrastructure Law (BIL) was enacted, and the National Highway Traffic Safety Administration (NHTSA) has issued Notices of Proposed Rulemaking (NPRMs), initiated research in support of future regulatory actions, and moved forward with required updates to the New Car Assessment Program (NCAP). This presentation will address the ADAS technologies the BIL requires NHTSA to address, and the latest NHTSA actions in the regulatory and research stages in support of these required actions. Insights into NHTSA's future actions will also be included.

What the audience will learn

  • The ADAS technologies affected by the Bipartisan Infrastructure Law (BIL)
  • NHTSA's latest regulatory and research activities to address the BIL requirements
  • Insights into expected future actions by NHTSA related to ADAS and ADS

10.15am - 10.45am

Break

10.45am

AV privacy and cybersecurity: legal and regulatory framework and considerations

Ji Won Kim
Senior Associate
Norton Rose Fulbright US LLP
USA
This presentation will explore the current legal and regulatory landscape surrounding autonomous vehicle privacy and cybersecurity in the US and beyond. Attendees will learn about the various privacy and cybersecurity risks as well as strategies to manage those risks.

What the audience will learn

  • US federal laws and regulations and recent/upcoming developments
  • US state laws and regulations and recent/upcoming developments
  • Laws and regulations and recent/upcoming developments outside the US
  • Privacy and cybersecurity risks in autonomous vehicles
  • Strategies to manage privacy and cybersecurity risks in autonomous vehicles

11.10am

Product Calculus Series - Wireless communications and ADS safety: the state of V2X regulatory developments

Katherine Sheriff
Lead, Mobility and Transportation Industry Group
Davis Wright Tremaine
USA
Stakeholders in the mobility and transportation industry have long envisioned vehicle safety communications enabled by direct communication technology (V2X). Yet this vision has enjoyed limited deployment due to many challenges in getting V2X technologies to market. This presentation tackles one pervasive but necessary challenge: regulation. Specifically, what is the state of V2X regulation in the US, and how does this evolving framework factor into your product calculus for ADS technology?

What the audience will learn

  • State of V2X regulatory development in the United States
  • How US federal agencies are collaborating to regulate V2X technologies
  • Role of the public and private sectors in getting V2X technologies to market
  • Complexities presented by current regulatory uncertainty

11.35am

AAMVA Guidelines for Testing and Deployment of Automated Vehicles

Nanette M. Schieke
CAV Program Manager
Maryland Department of Transportation
USA
Paul Steier
Director, Vehicle Programs
American Association of Motor Vehicle Administrators (AAMVA)
USA
This session will provide a detailed overview of the AAMVA publication Safe Testing and Deployment of Vehicles Equipped with Automated Driving Systems, Edition 3. The guidance document provides assistance and recommendations for the regulation of testing, piloting and deployment of automated vehicles. As there are no national standards for regulating automated vehicle operations, this guidance serves as a benchmark for oversight of safe testing and deployment of these vehicles. The overview will provide an in-depth discussion of the guidance given for the main areas of consideration: the vehicle, the operator and law enforcement.

What the audience will learn

  • Understand the role and responsibilities states have taken on with testing and piloting of AVs
  • Gain an in-depth understanding of the AAMVA Guidance document
  • Provide input for future AV testing and piloting for state AV regulatory guidance development
  • Learn about law enforcement involvement and considerations with AV use
  • Obtain contacts to collaborate with state regulators

12pm

A vision for promoting AV rules & public confidence through clarity in safety engineering standards

Joshua Wilkenfeld
Senior Director, Regulatory, Delivery/Core/Emerging Tech
Uber
USA
AVs are caught in a chicken-and-egg problem: regulators want rules to wait for increased public trust, while improved trust depends on compliance with regulatory standards. This presentation offers a solution to this logjam. First, different organs of government can take steps to apply existing motor vehicle approaches to AVs. Second, third parties can encourage developers to compile a clear, unitary description of their safety approach. By providing a public accounting for safety, this approach can give regulators a firmer foundation and richer, more technical, more developed content that can serve as the basis for new AV-specific rules.

What the audience will learn

  • Safety standards are not moving forward with any speed. This results, at least in part, from a chicken-and-egg problem: regulators want rules to wait for increased public trust, while improved trust depends on compliance with regulatory standards
  • The absence of safety standards directly affects further development: industry is left with little certainty if regulators could, at any moment, dramatically change design expectations
  • Even in the absence of total clarity on ultimate safety standards for AVs, regulators in different organs of government can take steps to apply existing motor vehicle approaches to AVs, and thereby steadily fill in our collective understanding
  • Outside government, third parties can help encourage movement toward greater regulatory certainty, including by encouraging developers to compile a clear, unitary description of their safety approach and by developing industry standards

12.25pm - 1.35pm

Lunch

Moderator

Tim Daniel
Segment Leader ADAS
AVL Mobility Technologies Inc
USA

1.35pm

AI components: technology, validation, liability and solutions for ADAS/AI

Dr Rahul Razdan
Senior Director Advanced Mobility Institute
Florida PolyTechnic University
USA
AI components are a powerful new capability at the center of ADAS and AV solutions. However, they present significant challenges in the domain of validation and verification (V&V). Without a clear methodology for handling AI components, ADAS/AV solutions introduce significant legal liability for product developers. This presentation will cover the history of AI components, the V&V challenges, the impact on ADAS/AV legal liability, and structural pathways toward successful integration. Without a comprehensive approach to AI components, the ultimate value of functionality such as ADAS is limited; with reasonable solutions, the value perceived by customers rises rapidly.

What the audience will learn

  • History and role of AI components
  • Validation challenges for AI components
  • Validation challenges for ADAS systems with AI components
  • Legal perils of current OEM V&V methodology
  • Pathways to successfully navigate ADAS validation and manage legal liability

2pm

Open Standards: Making sure they actually work

Benjamin Engel
Chief Technology Officer
ASAM eV
Germany
Open standards make it easy to exchange data between tools and stakeholders. At least that is the goal, but the reality is often different. As adoption of ASAM OpenX standards in the industry grows, we observe that there are often differences in interpretation or implementation of the standards. This often means that it is difficult to determine whether a specific tool or file actually conforms to a standard. Standards need to be supported by test suites, checker tooling and more to really enable consistent understanding of the standards. This talk will give some insight into the latest developments at ASAM on the journey to deliver more than 'just' standards.

2.25pm

The future of global AI governance

Dan Richardson
Director of Market Analysis
Spatial Web Foundation
USA
Peter Stockburger
San Diego Managing Partner and Co-lead, Global Autonomous Vehicles Practice
Dentons
USA
Governing autonomous systems presents a significant challenge when taking into account the needs of all stakeholders, including engineers. While the demands for governance are clear, the path forward remains uncertain. Peter Stockburger from Dentons, and Dan Richardson from the Spatial Web Foundation, will be presenting next generation socio-technical standards and related policy proposals that will help drive global discourse on the governance of Autonomous Systems. These standards have the potential to address what current regulatory and governance efforts cannot: the critical issues of interoperability, explainability, and AI’s exponential advance toward greater intelligence and autonomy.

What the audience will learn

  • An overview of the socio-technical standards being developed by the Spatial Web Foundation in partnership with the IEEE
  • How Socio-technical standards ensure interoperability of data, models and systems.
  • How Socio-technical standards make it possible for Autonomous Intelligent Systems to be compliant with diverse local, regional, national, and international regulatory demands, cultural norms, and ethics.
  • About the

2.50pm - 3.20pm

Break

3.20pm

Exploring the latest updates of MISRA C:2023

Alex Lim
Lead Field Application Engineer
LDRA
USA
Learn about the latest updates in MISRA C:2023, the widely used coding standard for embedded software development, and how these updates specifically apply to the automotive industry. Through practical examples and case studies, we will discuss key changes in MISRA C:2023, highlight their implications for automotive software development and share best practices for implementation. Stay ahead of the curve and ensure compliance with industry best practices for developing safe and reliable embedded software in the automotive domain.

What the audience will learn

  • What's new in MISRA C:2023
  • Example cases of how the new rules work
  • How MISRA C:2023 affects the automotive industry

3.45pm

Legal and consumer requirements for ADAS in the US and Europe

Andres Aparicio
Head, ADAS and Connected and Automated Vehicles
Applus IDIADA
Spain
The deployment of ADAS in the American and European markets is guided by legal and consumer requirements defined by local stakeholders. Currently, the US defines consumer requirements under NHTSA's US NCAP program, and a new set of protocols is available. In Europe, the General Safety Regulation defines mandatory requirements that urge standard fitment of multiple ADAS in all new vehicles. Beyond the legal framework, Euro NCAP foresees complex safety and assisted driving systems, with a comprehensive roadmap until 2030. The presentation will review current and future requirements for ADAS applicable to the American and European markets.

What the audience will learn

  • Current and future NHTSA US NCAP protocols for ADAS
  • Legal framework in EU defined by the General Safety Regulation with focus on ADAS
  • Overview of the Euro NCAP Vision 2030 roadmap for safety and assisted driving systems
  • Overview of legal and consumer requirements applicable to the American and European markets

4.10pm

Complying with UNECE R155 and R156

Eystein Stenberg
CTO
Northern.tech
USA
The automotive industry continues to evolve toward the software-centered and software-defined vehicle. To ensure automotive safety in this new world, legislation is coming into force through the work of the UNECE Working Party 29 (WP.29). While these new requirements can seem daunting at first glance, they are a formalization of best practices for over-the-air (OTA) software updates and cybersecurity controls. Still, missing key requirements or uncertainty around implementation can lead to unnecessary compliance failures, delays, and wasted time and money. We will go over the key requirements, how they relate, and a practical implementation of compliance.

What the audience will learn

  • An overview of ISO/SAE 21434, UNECE R155, R156 and how they relate
  • Key requirements dictated by the standard and regulations
  • Outline of a general-purpose compliant OTA update architecture
  • An example of a concrete implementation using Mender

Room C Advanced simulation and scenario-based testing
9am - 4.35pm

Moderator

Chris Reeves
Head of CAV Technologies
HORIBA MIRA Ltd
UK

9am

Smart & efficient methods for testing AVs

Mihai Nica
Global Head of ADAS, Automated Driving & Connectivity
AVL List GmbH
Austria
Autonomous vehicles are a key enabler for increasing safety and access to mobility for today's society, and through efficient driving behavior they can also strongly contribute to reducing the CO2 footprint of current vehicles. To enable AVs at mass scale, it is crucial to have smart and efficient testing techniques that assure proper operation of the system in all environmental conditions. This presentation will give an insight into such approaches, which make use of ODD information and AI- and combinatorics-based testing to enable a comprehensive test program with the highest coverage of the parameter space.

What the audience will learn

  • How to address the issue of coverage for testing AV systems
  • The multi-pillar approach used for creating the safety argumentation with respect to the verification and validation process
  • An ontology-based test generation method, which makes use of the ODD and combinatorial testing to automatically generate efficient test programs
  • A game-based method for creating critical test scenarios by using gaming industry concepts to generate edge and corner cases

9.25am

A quality of exposure metric for SOTIF validation

Jeremiah Robertson
Principal Safety Engineering Team Lead
Motional
USA
The question of how many miles is enough to determine readiness for driverless deployment has been a popular topic in the automated vehicle industry for some time. This presentation aims to explore the various aspects that should be considered in the decision-making process to establish driverless readiness. While a specific mileage accumulation target is perhaps the easiest to establish as a concrete goal, the oversimplification of this approach and its associated pitfalls are laid out here. As a result, a more comprehensive approach is required to establish such a target. One new concept Motional is using is the Quality of Exposure metric.

What the audience will learn

  • There are many issues with using a pure quantity of miles to validate AV performance
  • True AV performance depends on many factors including capabilities or features exercised
  • Highly accelerated life testing can be used to argue for lower validation miles
  • Statistical methods can be used to analyze the number of miles to build a certain confidence
  • Unique metrics can be applied to validate AV performance against human benchmarks for certain behaviors
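As a hedged illustration of the statistical bullet above (a generic zero-failure bound, not Motional's actual methodology), the following sketch shows why pure mileage targets grow so quickly:

```python
import math

def required_failure_free_miles(target_rate_per_mile, confidence):
    """Miles that must be driven with zero failures to claim, at the given
    confidence, that the true failure rate is below target_rate_per_mile.
    Assumes failures follow a Poisson process (a strong simplification)."""
    # P(0 failures in m miles | rate r) = exp(-r * m) <= 1 - confidence
    return -math.log(1.0 - confidence) / target_rate_per_mile

# Illustrative assumption: target a human-like fatality rate of roughly
# 1 per 100 million miles, at 95% confidence.
miles = required_failure_free_miles(1e-8, 0.95)
print(f"{miles / 1e6:.0f} million failure-free miles")  # prints: 300 million failure-free miles
```

The bound scales inversely with the target rate, which is one reason mileage alone is an impractical validation criterion and quality-of-exposure metrics become attractive.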

9.50am

Leveraging ODD at VW Commercial Vehicles: ODD specification, validation and coverage

Dr Edward Schwalb
Consultant
Schwalb Consulting LLC
USA
Dr Schwalb will provide an overview of the components of the ODD specification. The presentation will demonstrate how a detailed ODD can be developed in a modular distributed fashion whereby different aspects (e.g. highway pilot vs parking assistant) can be integrated into a single coherent interpretable specification. We will review how the Operational Domain (OD) is leveraged, how coverage is determined and clarify the relationship to the scenarios used for testing. For each use case, the applicable OpenODD aspect will be highlighted and references will be provided to more detailed method documentation available in the form of IEEE publications from the group.

10.15am - 10.45am

Break

10.45am

Testing ADAS and AVs in a deterministic simulation environment

Mike Dempsey
Managing Director
Claytex
UK
It's essential that we can test our ADAS and AV systems on a wide range of scenarios covering nominal, critical and edge cases. This must be done in a deterministic simulation environment so that the same test can be run in a repeatable manner with the only variable being the ADAS/AV controller behaviour. AVSandbox provides a deterministic, sensor-realistic simulation environment that allows the ADAS/AV controller to be immersed in the virtual world and tested on a diverse range of scenarios to identify failure modes and focus the development effort to improve safety.

What the audience will learn

  • Insights into the challenges of virtual testing and how to effectively harness this to drive development
  • Why deterministic simulation is important for ADAS and AV system development and testing
  • How a sensor realistic simulation environment enables the ADAS/AV controller to be immersed into the virtual world

11.10am

Safety-driven validation – enable large-scale ADS/ADAS safety validation using ASAM OpenSCENARIO 2.0.0

Gil Amid
Chief Regulatory Affairs Officer
Foretellix
Israel
The presentation describes a new approach to large-scale safety verification and validation of ADS/ADAS. The approach utilizes ASAM OpenSCENARIO 2.0.0, coupled with scenario-based, coverage-driven validation, producing safety metrics. This approach changes the way the automotive industry conducts verification and validation of automated driving systems. It also demonstrates an advanced method for safety verification that allows tackling the infinite space of scenarios. Several use cases will be presented, with an emphasis on how the new approach enables a step function in ensuring the correctness and safety of automated driving systems.

What the audience will learn

  • A full flow/solution for safety-driven validation – tools, methodology and their usage.
  • Usage of ASAM OpenSCENARIO® 2.0.0 advanced features for safety scenarios.
  • Producing metrics, data and evidence for safety argumentation.

11.35am

Accelerating autonomous driving development through simulation-based validation and verification

Ben Hager
Head of Autonomous Driving
dSPACE Inc.
USA
The presentation will focus on simulation-based validation for autonomous driving, which has emerged as an effective tool to complement physical testing and accelerate the development process. We will discuss the benefits and challenges of simulation-based validation for autonomous driving, including the use of advanced sensor models, scenario generation tools, and real-time simulation platforms. Additionally, we will present the latest trends and innovations in the autonomous driving simulation and validation field, along with case studies and examples of successful simulation-based validation projects. Attendees will learn how to leverage simulation-based validation to ensure the safety and reliability of autonomous driving systems.

What the audience will learn

  • Benefits of simulation-based validation in addressing the challenges of autonomous driving.
  • The techniques and tools available for simulation-based validation, such as advanced sensor models, scenario generation tools, and real-time simulation.
  • The latest trends and innovations in the autonomous driving simulation and validation field.
  • Real-world examples of successful simulation-based validation projects
  • Best practices for leveraging simulation-based validation to ensure the safety and reliability of autonomous driving systems.

12pm

Critical scenario creation methodology for safety assessment

Akshay Sheorey
Business Development Manager
Siemens Industry Software N.V
Netherlands
Infrastructure design can have an impact on overall safety, even though in most accident cases fault is attributed to the driver. The introduction of the SOTIF standard has brought a fundamental shift in the safety case for autonomous vehicles, with the vehicle manufacturer carrying an increased burden of responsibility and liability. Critical Scenario Creation (CSC) is a proprietary methodology to systematically and automatically generate unsafe-unknown scenarios. By adopting the CSC process, cities and traffic planners can identify problem areas and evaluate infrastructure changes that support safer operations, thus increasing safety in the urban environment.

What the audience will learn

  • How to automatically generate critical scenarios and assess safety in the context of ADAS and AV functionality
  • How to generate scenarios to evaluate safety per the SOTIF standard
  • How to evaluate the safety impact of infrastructure changes

12.25pm - 1.35pm

Lunch

Moderator

Dr Sagar Behere
Vice President of Safety
Foretellix
USA

1.35pm

Digital Twin Concepts for Autonomous Vehicle Testing

Dr Greg VanWiggeren
R&D Program Manager
Keysight Technologies
USA
The pace of innovation and digital transformation in the automotive industry is growing exponentially. The push toward software-defined vehicles and digital twins enables engineers to accelerate the definition, design, and production of cars, modules, and critical sensors for the automotive industry. The environment remains a variable when validating the functionality of a new vehicle in the real world. This context is beyond the control of automakers, creating a challenge when testing a car’s functionality with confidence. However, automotive manufacturers do have control over testing in the lab using digital twins. In fact, when testing, you can analyze, record, and assess

What the audience will learn

  • Facilitate the path to autonomy with in-lab testing.
  • How to validate radar-based advanced driver assistance systems (ADAS)/autonomous driving (AD) with 512-pixel resolution.
  • Synchronize with and prove V2X functionality.
  • Integrate this testing with any hardware-in-the-loop system, 3D modeler, or V2X stack software.

2pm

Measuring digital impact on the zero-crash paradigm

Michael Morgan
Lead Active Safety Solutions Engineer
Humanetics
USA
This presentation centers around the integration of the virtual and simulation components within the AV and ADAS domains, aiming to achieve a zero-crash paradigm. Using virtual technologies with a blend of simulated and real-world situations, vehicles can undergo comprehensive testing that enhances their ability to respond and intervene effectively in critical contexts. Through simulation and validation of a wide array of driving scenarios, preemptive crash avoidance strategies can be swiftly deployed. This approach permits exhaustive scenario validation in the virtual realm, reserving physical testing for unique edge cases, thus optimizing resource allocation, value, and time-to-market. The ultimate outcome manifests as more resilient AI, AV, and ADAS systems, yielding a substantial reduction in accidents, injuries, and fatalities.

2.25pm

Accelerating the data loop

Nijanthan Berinpanathan
Co-Founder
DeepScenario
Germany
The physical world is complex and highly dynamic, with sudden changes and unforeseen anomalies. Therefore, it is essential to establish a systematic way to prepare autonomous systems for safe operation in the physical world. In this presentation, we give insights into our novel AI Scenario Engine, which helps customers deploy autonomous systems significantly faster and at dramatically lower cost. At the core of this platform is world-class computer vision software that extracts traffic scenarios from monocular cameras in a highly accurate and fully automated way. DeepScenario's AI Scenario Engine has been used by some of the world's most recognizable companies, such as BMW, Bosch and Torc Robotics, to solve the challenges of autonomous system deployment.

What the audience will learn

  • Conduct large-scale traffic observations significantly faster and at dramatically less cost with stationary cameras.
  • Gain knowledge from motion data by parameterizing the real-world measurements.
  • Convert critical scenarios into industry standards to enable continuous training and testing in simulation.

2.50pm - 3.20pm

Break

3.20pm

Workflows for generating virtual scenarios from recorded vehicle data

Seo-Wook Park
Principal Application Engineer
MathWorks
USA
This presentation introduces workflows for generating virtual scenarios from vehicle logs utilizing Automated Driving Toolbox™ and RoadRunner Scenario. The workflows consist of multiple data processing tasks, including:

  • GPS and IMU sensor fusion to generate the ego trajectory
  • Ego localization using lane detections and a high-definition map
  • Target vehicle trajectory creation from recorded lidar, radar, and vision sensor data

RoadRunner Scenario exports the created scenarios to OpenSCENARIO, which is used for regression testing of advanced driver-assistance system (ADAS) and autonomous driving (AD) algorithms.

What the audience will learn

  • Create virtual scenarios from recorded vehicle data
  • GPS and IMU sensor fusion to generate ego trajectory
  • Ego localization using lane detections and a high-definition map
  • Target vehicle trajectory creation from recorded lidar, radar, and vision sensors data
  • Export the scenarios to OpenSCENARIO
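As a language-neutral sketch of the first workflow step (GPS/IMU fusion for the ego trajectory), here is a minimal 1-D complementary filter; the sensor values and blend gain are illustrative assumptions, not toolbox code:

```python
# Minimal 1-D complementary-filter sketch of GPS/IMU fusion for an ego
# trajectory. Generic illustration only: real workflows use full 3-D
# filters (e.g. an EKF) with calibrated noise models.

def fuse_trajectory(gps_positions, imu_accels, dt=0.1, gain=0.2):
    pos, vel = gps_positions[0], 0.0
    fused = [pos]
    for gps, acc in zip(gps_positions[1:], imu_accels[1:]):
        vel += acc * dt                       # dead-reckon velocity from IMU
        pred = pos + vel * dt                 # predicted position
        pos = (1 - gain) * pred + gain * gps  # blend with (noisy) GPS fix
        fused.append(pos)
    return fused

# Constant 1 m/s^2 acceleration: an ideal GPS would read 0.5 * a * t^2.
gps = [0.5 * 1.0 * (k * 0.1) ** 2 for k in range(11)]
imu = [1.0] * 11
traj = fuse_trajectory(gps, imu)
```

The smooth-but-drifting IMU integration and the noisy-but-absolute GPS complement each other; the gain trades off between the two sources.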

3.45pm - 4.35pm

Panel Discussion - Safety validation for Highly Automated Driving

Highly Automated Driving offers many potential benefits. But despite technological progress, timelines for large-scale commercial deployments remain uncertain, the main barrier being safety. In this panel, experts from the industry will discuss and provide insights regarding the challenges of safety verification and validation and the innovations needed for assuring safety and building a valid Safety Case.
Vangelis Kokkevis
Head of Simulation and Validation
Woven by Toyota
USA
Andreas Reschka
Senior Director of Product, Systems, and Safety
Pony.ai
USA
Quresh Sutarwala
Lead Systems Engineer
Kodiak Robotics
USA
Dr Sagar Behere
Vice President of Safety
Foretellix
USA
Ury Zhilinsky
Senior Director, Head of Learned Behavior and Simulation
Nuro
USA

Room A Real-world test and deployment – lessons learned
8.50am - 10.55am

Moderator

Katherine Sheriff
Lead, Mobility and Transportation Industry Group
Davis Wright Tremaine
USA

8.50am

Autonomous Rapid Transit: Clear alternative to light-rail and people-mover

Jia-Ru Li
CEO
LILEE Systems
USA
To increase efficiency and reduce cost, LILEE Systems has been deploying its Autonomous Rapid Transit (ART) system since 2018. Compared with legacy light rail and people movers, ART reduces cost by 60% and implementation time by 75%. LILEE has completed projects operating full-sized buses on open roads in America and Taiwan. We will share challenges and opportunities from the development, validation, and deployment phases: 1. System verification by working with cities and operators. 2. A DMV license to operate self-driving buses from 9 a.m. to 6 p.m. for two years. 3. Infrastructure connecting signal priority and vehicle/pedestrian detection to the Operation Control Center.

What the audience will learn

  • Why can Autonomous Rapid Transit (ART) be cheaper and faster to implement than Light Rail and People Mover?
  • How does ART compare to robotaxis, city-wide home delivery services, and long-haul trucking?
  • What are the steps of developing an ART system?
  • What are the challenges when developing an ART system?
  • Which city or what projects in the world are actively pursuing the ART system?

9.15am

Lessons learned from robotaxi services in multiple cities in Korea

Joonwoo Son
Founder & Chair
Sonnet Co., Ltd.
Korea
Sonnet.AI, South Korea's leading robotaxi startup, launched its first commercial robotaxi service in late 2021 and has since expanded to various cities. We would like to share the possibilities and limitations of operating a robotaxi service that serves different purposes – tourism, commuting, and daily transportation – depending on the service area. Robotaxis will play an important role in maintaining mobility not only in large cities but also in depopulated cities, yet they have not yet secured profitability. However, Sonnet.AI is creating a business model that can operate in the black with modest help from government transportation subsidies.

What the audience will learn

  • Understanding the possibilities and limitations of operating a robotaxi service in South Korea
  • Understanding Korean legislation for commercial robotaxi services
  • The case studies for safe and profitable commercial robotaxi operations

9.40am

The Importance of adverse weather condition testing with automated transit buses

Cemre Kavvasoglu
Product Management Director
ADASTEC
USA
One of the main challenges of automated transit bus deployments is the varying weather conditions. Depending on where an automated transit bus gets deployed, it is vital to assess the environment’s climate, including extreme weather conditions such as heavy rain, snow, fog, and extreme heat. To operate throughout varying seasons, ADASTEC has undergone tremendous amounts of testing in various climates around the globe. Furthermore, ADASTEC has collected vast amounts of data and experience to not only handle a variety of environmental and weather conditions but also operate on different road settings during all four seasons.

What the audience will learn

  • What are the difficulties of operating in adverse weather conditions? How does ADASTEC operate its buses in adverse weather conditions such as heavy rain, snow, and fog?
  • Developing, implementing, and testing the software in various weather and road conditions around the globe
  • How does ADASTEC mitigate challenges with human acceptance and awareness of automated public transportation buses?
  • Mitigating risks due to lack of availability of infrastructure communication (V2I) and environmental factors

10.05am

Vehicle development support for PTI impact testing of automated vehicles

Thomas Tentrup
Director R&D
KUES Bundesgeschäftsstelle
Germany
In the future, safety-relevant ADAS will need to be tested as part of the periodic technical inspection (PTI) in the after-sales area. Similar to the ViL tests in the vehicle development phase, the tests are provided as reliable impact testing during dynamic driving on a specific functional test bench at the KÜS DRIVE test line, allowing steering while driving at up to 130 km/h without vehicle fixation. A monitor and a radar target simulator stimulate the ADAS sensors without ADAS ECU communication. The impact tests are performed in this way because standardized 'open' interfaces to the ADAS ECUs and sensors are not implemented during the vehicle development phase.

What the audience will learn

  • ADAS functionalities need to be checked by impact tests of the complete functional chain over the whole vehicle lifetime.
  • Current impact tests at KÜS DRIVE are able to check ADAS without ADAS ECU communication, and therefore with some limitations.
  • The implementation of standardized open ADAS ECU and sensor interfaces during vehicle development would facilitate impact testing for PTI.
  • Impact tests at KÜS DRIVE are a safe and economical alternative to checking ADAS outdoors on streets with movable targets.
  • In particular, AEBS tests at KÜS DRIVE can be performed without danger to driver and vehicle, with traceable, exact measurement results.

10.30am

High-speed tests of collision mitigation systems on cars and trucks

Shawn Harrington
Principal & Founder
Forensic Rock
USA
Higher closing speed tests (up to 70-75 mph) of passenger and commercial vehicle forward collision warning (FCW) and automatic emergency braking (AEB) systems in stationary rear collision scenarios will be presented. The research covers the timing of the issuance of FCWs and the initiation of AEB in both passenger and commercial vehicles in stationary and moving rear collision scenarios. Testing speeds range from 10 to 75 mph, and the collision mitigation systems of over a dozen passenger vehicles and heavy trucks from model years 2013 to 2023 will be compared. Real-world LKAS and LDW research will also be presented.

What the audience will learn

  • Performance of FCW/AEB at high closing speeds
  • Tests on model years 2013 - 2023
  • Performance of heavy truck collision mitigation systems
  • Performance and comparison of passenger vehicle collision mitigation systems
  • Performance of LKAS and LDW in the real world
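To illustrate why higher closing speeds stress these systems, here is a generic stopping-distance sketch; the 7 m/s² deceleration and 0.5 s system latency are illustrative assumptions, not measured values from the tests above:

```python
# Generic kinematics sketch of AEB stopping distance at various closing
# speeds. Deceleration and latency figures are illustrative assumptions.

def stopping_distance_m(speed_mps, decel_mps2=7.0, system_latency_s=0.5):
    """Distance covered from AEB trigger to standstill: travel during
    system latency plus braking distance v^2 / (2a)."""
    return speed_mps * system_latency_s + speed_mps ** 2 / (2.0 * decel_mps2)

MPH_TO_MPS = 0.44704
for mph in (25, 50, 75):
    v = mph * MPH_TO_MPS
    print(f"{mph} mph -> {stopping_distance_m(v):.0f} m to stop")
```

Because the braking term grows with the square of speed, tripling the closing speed more than triples the required stopping distance, which is why warning and braking timing at 70-75 mph is so demanding.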

10.55am - 11.25am

Break

Room A Software, AI, architecture and data management – continued
11.25am - 3pm

Moderator

Ram Mirwani
Group Manager, Automotive Business Development
Rohde & Schwarz
USA

11.25am

Current trends in high-performance automotive datalogging

Bernhard Kockoth
Global Technology Scout
ViGEM GmbH
Germany
The verification and validation of ADAS systems for automated driving requires the accurate recording of high data rates from sensors and vehicle buses in real-world use. High-performance dataloggers record every bit of high-resolution raw video streams. In 2019 ViGEM introduced distributed logging, which places capture modules and adapter probes close to data sources and transmits the collected, timestamped data over robust Ethernet connections to central storage. After four years of successful worldwide deployments, we present a scalable solution with a new datalogger that fits into existing data capture setups.

What the audience will learn

  • State-of-the-art high-performance data logging
  • Trends in hardware and software development for logging devices
  • Distributed data logging explained
  • Future developments outlook

11.50am

Standardized APIs for autonomous, connected (V2X) and cockpit software

David Cole
Director - Engineering Solutions
Danlaw Inc.
USA
Different components of autonomous software are developed in an integrated environment using both commercial and open-source software such as AUTOWARE. However, other key components of the automotive environment – in particular, V2X and cockpit software – are developed in isolation. Hence, we have standardized the APIs that help these components interact seamlessly. These software APIs for connected cars, cockpit, and autonomous software are becoming increasingly important in the automotive industry. They provide a standardized interface for different software components, enabling faster development and deployment of new software functionality. APIs also promote innovation and collaboration, driving growth and revenue in the automotive industry.

What the audience will learn

  • Overview of AUTOWARE, Connected Car (V2X) and Cockpit features
  • Key features of V2X and cockpit software that have an impact on autonomous software
  • Challenges of not having standardized APIs
  • Definition of APIs and their implementation
  • Advantages of having Standardized APIs for Autonomous, Connected (V2X) and Cockpit Software

12.15pm

Reaching production: Solving deployment challenges through scalable cloud infrastructure

Dev Patel
Product Manager
Applied Intuition
USA
In the past decade, AV capabilities have gone from dream to near-reality. However, remaining blockers for production deployment are amongst the most challenging, including validating new software across millions of test cases, monitoring and mining fleet data, quantifiably proving safety, and expanding operational design domains (ODDs) at scale. Cloud infrastructure and tooling play a significant role, powering development loops with scaled compute and data that enable programs to go to market faster and cost-effectively without compromising on safety. This presentation discusses four key tactics that industry leaders are deploying: Virtual validation, data management, monitoring and deployment, and cloud collaboration.

12.40pm - 1.45pm

Lunch

1.45pm

Enabling future Software Defined Vehicles for ADAS and Automated Driving

Samuel Kuttler
Senior Business Development Engineer
Vector
Germany
Vector is a hidden champion in the embedded systems and software field; at the same time, the company pioneers the path to the Software Defined Vehicle (SDV) through its software products, adopting both embedded and cloud-native approaches. Today, Adaptive MICROSAR drives in all German-built cars, providing the highest level of dependability for ADAS applications. While Silicon Valley accelerates software and system development through innovation, there is a potential trade-off in quality. Let's see how we can make the best of both worlds: dependable systems engineering and fast-paced software craftsmanship for automated mobility.

What the audience will learn

  • Embedded or cloud-native – the first step is a collaborative approach.
  • What Silicon Valley can learn from automotive software engineering to provide Level 4/5 automated driving.
  • The next technologies for ADAS at the software platform level.
  • Combining Adaptive MICROSAR and ROS (Robot Operating System) on one target.

2.10pm

Modern Software-defined vehicle architectures: The foundation for autonomous vehicles

Pedro López Estepa
Director of Automotive
Real-Time Innovations (RTI)
Spain
Modern software-defined vehicle architectures are the foundation of the autonomous vehicle (AV) era. The new AV development paradigm requires safety-critical software not only to meet the requirements set forth in the functional safety (FuSa) standards, but also to provide flexibility, scalability, compatibility and upgradability across different platform components. Platform-independent solutions together with standard data models help OEMs optimize the path towards autonomy while reducing associated risk and cost. In addition, choosing the right business model and liability allocations will help ensure long-term success. This session will highlight the path towards a solid software-defined architectural strategy as a foundation for AVs.

What the audience will learn

  • RTI will present the best practices in order to secure a long term strategy for the Software-defined vehicle
  • RTI will present the optimal Design Cycle in order to integrate functional safety software components in a production program.
  • Description of challenges at Business Model and Liability level in Functional Safety production programs from a supplier perspective
  • A presentation of RTI's Connext Drive and how it is where safety and functionality meet for AV framework design.
  • Describe the importance of platform independent components and standard data-models to secure a continuous evolution of the AV software solution

2.35pm

Augmenting the Why: outsourced anomaly detection data pipelines

Aaron Bianchi
Director, ML Solutions
Digital Divide Data
USA
AD and ADAS systems constantly interact with new situations and scenarios. Some of these may cause rapid corrective action, human takeover, and other similar events. Understanding what causes these corrective actions can help to inform testing, ODD definition, and much more. Join this talk to learn more about how we help AD leaders quickly and effectively parse and bin these corrective actions to help drive their functionality and error handling capabilities into the future.

What the audience will learn

  • Identify driver takeover and similar cases to triage and define
  • Build a conceptual model for root cause analysis and ODD definition
  • Drive meaningful insights into vehicle performance to augment the engineering team
  • Apply similar approaches to other engineering areas and disciplines (such as test engineering)

Room B Sensor test, development, fusion, calibration and data
8.50am - 3pm

Moderator

Jeremiah Robertson
Principal Safety Engineering Team Lead
Motional
USA

8.50am

Economics of enabling technologies for lidar

Sunil Khatana
Chief Technology Officer
Inyo System Inc
USA
There are several competing technologies for lidar. This presentation will discuss the cost/performance trade-offs of the underlying technologies that enable the realization of lidar for automotive and 3D sensing. It will review the ranging methods, lasers, detectors and scanning methods used in lidar design, how these map to performance and cost, and map the various device technologies to the application spaces where they are likely to be most competitive. It will provide comprehensive coverage of ranging methods (iTOF, dTOF, coherent), lasers (EEL, VCSEL, fiber laser), illumination (spots, line scan, flying spots) and detectors (APD, SiPM, SPAD, CIS and PIN), and present a comparison of the strengths and weaknesses of each in terms of cost, range, FOV, range resolution and spatial resolution.

9.15am

Silicon photonics for LiDAR and sensing application

Marcus Yang
Sr. Director, Head of LIDAR Sensing - Silicon Photonics Product Division
Intel
USA
FMCW LiDAR offers precise distance measurement using a continuous-wave laser with modulated frequency. Advantages include higher resolution, velocity detection, and cost-effectiveness. Silicon Photonics (SiP) enables compact and efficient FMCW LiDAR systems by integrating optical components on a single chip. SiP's role in sensing spans communication, computing, automotive, bio-sensing, optical gyroscopes, and AI applications. Intel offers a mature, high-volume SiP platform with industry-leading quality, unique devices such as the hybrid laser, and cutting-edge PICs for FMCW LiDAR sensing.

9.40am

Radars in autonomy: current landscape, challenges, and the future

Arvind Srivastav
Software Engineer, Radar Perception
Zoox, Inc.
USA
This presentation will provide a compelling overview of radars in autonomy today and how to maximize their contribution to autonomous perception. Topics will include radar fundamentals; radar data formats and an illustration of their strengths and weaknesses; the role of radar in perception; current radar deep learning approaches; early-fusion research models and their promise; radar occupancy flow models; and methods for direct target tracking on radar data. The talk aims to give the audience an improved understanding of the radars used in autonomy and to encourage an increased contribution from radars to make autonomy safe, robust, and reliable.

What the audience will learn

  • Why radars aren't able to live up to their promise in autonomous perception today
  • Where does the problem lie and why recent deep learning approaches offer a better promise
  • What the future of radar in autonomy might look like

10.05am

Utilizing LiDARs at scale for ADAS safety, compliance and efficiency

Mohammad Musa
Founder and CEO
Deepen AI
USA
Maintaining accurate sensor calibration is key to all ADAS and AV systems. Now that most new cars will have ADAS features in them, undertaking multi-sensor calibration at scale is becoming a real bottleneck. The presentation will discuss the multi-sensor calibration lifecycle, all types of calibrations required and best practices for conducting multi-sensor calibration at scale.

What the audience will learn

  • Why LiDAR? Safety, Compliance and Efficiency
  • LiDAR placements and integration with other sensors
  • LiDAR Annotation & Calibration

10.30am - 11am

Break

11am

Next-Generation Sensors for Automated Road Vehicles - SAE EDGE™ Research Report Discussion

Cameron Gieda
Sr. Director of Business Development
Pony.AI
USA
Based on an SAE EDGE™ Report authored by Sven Beiker with support from a number of mobility experts, this discussion will cover the spectrum of currently available sensors for higher levels of automated driving, such as lidar, radar, cameras and ultrasonics. Pony.ai's Cameron Gieda, one of the primary contributors to the report, will explain the nuanced differences between the approaches that manufacturers employ when designing these sensors, as well as why these decisions are made. Details will be provided as to why certain sensors are used for certain tasks, as well as what the 'Achilles heel' of any given modality is. Lastly, he will address how sensor fusion can be used to leverage the best of each type of sensor, and what trends we will see in the future.

What the audience will learn

  • What are the common sensor architectures in automated vehicles?
  • What are the minimum sensors needed for higher levels of ADAS?
  • Can cameras alone solve the problem?
  • Why is redundancy important for safety and reliability?
  • How does computing power throttle the use of some sensor modalities?

11.25am

Solving critical issues affecting safety of ADAS - accelerate AD adoption

Hannah Osborn
Director, America Sales & Business Development
LeddarTech
Canada
Consumer confidence drives the pace of ADAS and AD development. This presentation will unveil some critical issues surrounding safety and performance that are evident today, why they affect customer confidence, and how they can be addressed to accelerate greater safety and autonomy.

What the audience will learn

  • Sensor fusion and perception software directly impact the safety and performance of ADAS and AD
  • Sensor performance is ever-evolving, with each type fulfilling a function based on the level of autonomy; redundancy is extremely important
  • Sensor limitations and degradation issues are highlighted in harsh conditions but can be overcome with better fusion and perception software
  • Low-level fusion (LLF) helps mitigate the impact of a malfunctioning/degraded sensor, providing better detections and fewer false alarms for small obstacles
  • Suited for on-road and off-road applications, LLF is the future of perception

11.50am

Turning miles into minutes with greater realism in test

Ajay Vemuru
Director, PNT Simulation
Spirent Communications PLC
USA
Testing in the lab saves engineers time and money over road testing, and delivers results with greater traceability and repeatability. For this reason, increasing realism in lab testing is a potential game changer for automotive developers. In this presentation we'll look at how to ensure a hardware-in-the-loop test environment is optimized for realism, particularly by reducing the impact of latency. We'll look at how you can recreate the local signal environment to get a better idea of how your equipment will perform on the road, including by using record and playback. Lastly, we'll discuss how you can bring multi-sensor testing into the lab.

What the audience will learn

  • Local environment – how can developers model the local environment more realistically, and why?
  • Multi-sensor testing – how can you integrate multiple sensors efficiently in the lab?
  • Hardware-in-the-loop – how does latency impact the integrity of your testing?
  • Record & playback – how high-fidelity record & playback can combine the integrity of the real world and
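A back-of-envelope sketch of why latency matters in hardware-in-the-loop testing (illustrative numbers, not Spirent figures): any delay between the simulated vehicle state and the generated signal shows up as a position error proportional to vehicle speed.

```python
# Sketch: worst-case position offset introduced by end-to-end HIL latency.

def latency_position_error_m(speed_mps: float, latency_s: float) -> float:
    """Position error (m) = vehicle speed (m/s) * test-loop latency (s)."""
    return speed_mps * latency_s

# At highway speed (30 m/s), 100 ms of loop latency displaces the
# simulated truth by 3 m - enough to invalidate lane-level testing.
err = latency_position_error_m(30.0, 0.100)
```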

12.15pm

FMCW as the end state: exploring advantages of instantaneous velocity

Matt Last
Director of Product
Aeva
USA
Next-generation Frequency Modulated Continuous Wave LiDAR systems can detect and track objects farther, faster, and with greater precision than ever before. Aeva’s FMCW LiDAR-on-chip system adds Doppler velocity to the standard range, azimuth, elevation, and reflectivity channels generated by traditional 3D LiDAR systems. This session will explore the unique perception capabilities enabled by the velocity channel, how they deliver improved safety and reliability to vehicle automation, and why FMCW-based systems are the end state for high performance automotive LiDAR.

What the audience will learn

  • How FMCW LiDAR systems can detect and track objects farther, faster, and with greater precision than ever before
  • The unique perception capabilities enabled by the Doppler velocity channel
  • How these capabilities deliver improved safety and reliability to vehicle automation
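As a minimal sketch of the physics behind the Doppler channel (not Aeva's implementation, and assuming an illustrative 1550 nm wavelength), radial velocity follows directly from the measured frequency shift:

```python
# Sketch: radial (closing) velocity from the Doppler shift between
# emitted and returned light: v = f_d * lambda / 2.
# Wavelength and shift values are illustrative, not vendor specifics.

def radial_velocity(doppler_shift_hz: float,
                    wavelength_m: float = 1550e-9) -> float:
    """Radial velocity in m/s for a given Doppler frequency shift."""
    return doppler_shift_hz * wavelength_m / 2.0

# A ~12.9 MHz shift at 1550 nm corresponds to roughly 10 m/s closing speed.
v = radial_velocity(12.9e6)
```

This per-point velocity is what lets an FMCW system separate moving objects from the static background in a single frame, rather than inferring motion across frames.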

12.40pm - 1.45pm

Lunch

Moderator

Hannah Osborn
Director, America Sales & Business Development
LeddarTech
Canada

1.45pm

Fix the optical subsystem, fix lidar

Eric Aguilar
Co-founder & CEO
Omnitron Sensors
USA
While highly promising, today’s optical subsystems for lidar remain fragile, large, expensive to build and maintain, overly susceptible to environmental conditions, and inconsistent in their performance. We can reach the full potential of lidar by fixing the optical subsystems on which lidar systems rely. With experience that spans core sensor development and systems integration, Eric Aguilar learned first-hand what automotive integrators need for affordable, reliable, long-range lidar systems. He’ll both review the pros and cons of today’s optical subsystems and introduce a new, cost-effective MEMS scanning mirror for lidar that ticks all the boxes for automotive integrators and manufacturers.

What the audience will learn

  • The role played by the optical subsystem in LiDAR for ADAS and autonomous systems
  • Automotive industry requirements for optical subsystems for LiDAR
  • The top 3 issues with existing optical subsystems for LiDAR—Voice Coil, SCALA, spinning polygon, Galvo
  • The great potential—and challenges—of MEMS mirrors
  • Problem-solver: first mass-produced low-cost, rugged, reliable MEMS scanning mirror

2.10pm

Advancements in stereo vision for night-time and low-light scenarios

Piotr Swierczynski
Director of Engineering
NODAR
USA
Join us for an illuminating session on how advanced stereo vision overcomes the challenges of night-time and low-light driving in autonomous vehicles. We will share how NODAR's 3D vision system outperforms other sensor systems in these challenging conditions and provides the reliability and performance required for L3 and above autonomy. This presentation will cover the technology behind stereo vision and NODAR’s unique take on this age-old technique. We will offer valuable insights into how this new technology can enhance safety in autonomous driving.

What the audience will learn

  • The benefits and limitations of stereo vision technology for nighttime and low-light driving scenarios
  • Real-world examples of how stereo vision is being used in autonomous vehicles
  • Insights into how stereo vision enhances safety in autonomous driving, particularly in nighttime and low-light scenarios

2.35pm

Effect of UVH coatings on self-cleaning performance for automotive sensors

Songwei Lu
Research Associate I
PPG Industries, Inc.
USA
Using a stationary testbed, we have evaluated the effect of UVH coatings on self-cleaning performance for automotive sensors under lab-simulated inclement weather. Four types of inclement weather were simulated: rain, mud, fog, and bugs. Images from the vision camera were analyzed using the Modulation Transfer Function (MTF) and signal-to-noise ratio (SNR) to evaluate the optical distortion incurred by weathering. Evaluation results for the UVH coatings as prepared, and after up to 3,000 hours of Weather-O-Meter testing, will be presented. Current results point to a significant benefit of using UVH coatings to improve the signal reading of vision cameras under inclement weather.

What the audience will learn

  • The effect of UV-durable hydrophobic (UVH) coatings on autonomous sensors under inclement weather
  • Lab-simulated weather conditions including light and heavy rain, light and heavy mud, fog, and bugs
  • The presentation will mainly focus on automotive vision cameras; the effect on IR cameras, LiDAR and radar will also be mentioned
  • Significant benefits of using UVH coatings to improve the signal reading of vision cameras under inclement weather
  • UVH coatings on sensors will help ensure safe and effective driving of autonomous vehicles
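As a toy illustration of the signal-to-noise metric mentioned above (one common SNR definition, not necessarily PPG's exact protocol), a soiled lens scatters light and lowers the SNR of an image patch; the pixel values below are invented:

```python
# Sketch: SNR of a camera image patch as mean intensity over the
# standard deviation of intensity. Mud/rain scatter lowers this ratio.
import statistics

def patch_snr(pixels):
    """Illustrative SNR = mean / population stdev of pixel intensities."""
    mu = statistics.mean(pixels)
    sigma = statistics.pstdev(pixels)
    return float('inf') if sigma == 0 else mu / sigma

clean  = [200, 202, 198, 201, 199, 200]  # low-noise, clear-lens patch
soiled = [120, 180, 90, 160, 70, 150]    # mud-scattered patch
```

Comparing `patch_snr(clean)` with `patch_snr(soiled)` shows the kind of degradation a self-cleaning coating is meant to prevent.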

Room C Best practices for accelerating the test, development and deployment of safe ADAS & AD systems
8.50am - 3pm

Moderator

Pedro López Estepa
Director of Automotive
Real-Time Innovations (RTI)
Spain

8.50am

Connected mobility solution for reliable performance of automated vehicles

Subrata Kundu
Senior Manager
R&D Division, Hitachi America, Ltd
USA
Digitalization is rapidly transforming the mobility industry and providing new opportunities in connected automated vehicles and mobility services. As the acceptance of connected vehicles increases, innovative solutions to improve safety and operational efficiency and to reduce the possibility of error in automated vehicles are beginning to emerge. We have been developing advanced sensors, a high-performance electronic control unit (ECU), and a connected mobility solution that together enable improved and reliable performance of connected automated vehicles. This presentation will introduce innovative solutions to maximize safety and ensure component functionality of connected automated vehicles.

What the audience will learn

  • Connected Vehicle Application Platform
  • Smart Routing Solution to Maximize Safety of Connected Automated Vehicle
  • Connected Diagnostic Solution to Ensure Component Functionalities

9.15am

Cloud-Powered development for AVs at scale

J.J. Navarro
Customer Experience Lead, Autonomous Vehicles Google Cloud
Google
USA
Learn how leading AV companies are using the cloud to accelerate development of their AI/ML autonomy platforms and build for scale and elasticity, and how they transition from R&D to reliable commercial operations.

What the audience will learn

  • Learn how AV companies are using the cloud to:
  • optimize for cost and speed
  • build for global scale and reliable commercial operations
  • scale their ML and simulation workloads

9.40am

Communication of automated vehicles with other road users

Dr Sven Beiker
External Advisor
SAE International
USA
This presentation will discuss how automated vehicles will and should communicate with other road users. Conventional (human-driven) vehicles, bicyclists, and pedestrians already have a functioning system of understanding each other while on the move. Adding automated vehicles to the mix requires assessing the spectrum of existing modes of communication – both implicit and explicit, biological and technological – and how they will interact with each other in the real world. The impending deployment of AVs represents a major shift in the traditional approach to ground transportation; its effects will inevitably be felt by parties directly involved in vehicle manufacturing and use, and by those that play roles in the mobility ecosystem (e.g., aftermarket and maintenance industries, infrastructure and planning organizations, automotive insurance providers, marketers, telecommunication companies). The audience of this presentation will learn about multiple scenarios that are likely to evolve in a future not too far away and how they are likely to play out in practical ways.

What the audience will learn

  • Overview of previous work related to external communication of AVs
  • Understanding of the challenges that seemingly obvious solutions present
  • Insights into what experts demand and question regarding AV communication
  • Appreciation for the differences between visual and auditory communication solutions

10.05am

Determining the performance of an ADS through three key questions

Chris Reeves
Head of CAV Technologies
HORIBA MIRA Ltd
UK
ADS are transforming how we travel: systems are becoming more prevalent, more complex and broader in their application, all within an unpredictable external environment. This creates a major technical challenge: how to ensure the features are safe and functionally robust without exponentially increasing validation and verification time and cost. HORIBA MIRA’s ASSURED CAV centre of excellence uses a multi-pillar approach and novel techniques to answer three critical questions – what to test, how to test and when to stop – ensuring vehicle performance is determined for real-world complexity.

10.30am - 11am

Break

11am

Insights on V2X, from standardization to validation

Tony Vento
Strategic Development
S.E.A.
USA
V2X (vehicle-to-everything) adds capabilities to ADAS sensor fusion such as non-line-of-sight (NLOS) awareness and sensor sharing. It will improve vehicle safety, which is increasingly necessary as fatalities involving vehicles and VRUs (vulnerable road users such as pedestrians) continue to rise. Hear insights on standardization and validation efforts. Standardization involves automotive OEMs and Tier 1s, network operators, semiconductor companies, and device testers. Validation includes 3GPP tests at the physical level, the V2X protocol level, and Day 1 use cases at the application level. We have tested many V2X devices and collected field data from OBUs (on-board units) and RSUs (roadside units). Additional 5G benefits are coming.

What the audience will learn

  • How V2X will improve the safety of Autonomous Driving
  • The status of V2X standardization
  • Validation efforts for V2X

11.25am

Skeleton-based gesture recognition model for automated vehicles

Jagdish Bhanushali
Senior Deep Learning Software Engineer
Valeo
USA
In autonomous vehicles, passengers can interact with the system through physical buttons or voice commands, but pedestrians are outside the vehicle and have no communication channel with it. One way to convey messages from pedestrians to automated vehicles is gesture recognition, where pedestrians perform various actions to inform autonomous vehicles of their intentions. Here we present how skeleton-based gesture recognition helps create a communication channel between autonomous vehicles and pedestrians.

What the audience will learn

  • Challenges in complex urban environments for autonomous vehicles
  • Existing methodologies to solve the problems
  • Dataset collection and annotation
  • How Valeo’s approach differs from existing methods
  • Performance and limitations of the system
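To make the skeleton-based idea concrete, here is a toy sketch (not Valeo's method, which uses learned models over keypoint sequences): given 2D pose keypoints in image coordinates (y grows downward), a raised-hand "stop" gesture can be flagged when a wrist sits above the corresponding shoulder. Keypoint names and values are invented for illustration.

```python
# Toy skeleton-gesture rule: hand is "raised" if a wrist keypoint is
# above (smaller y than) the matching shoulder keypoint. Real systems
# classify sequences of keypoints with a trained network.

def is_hand_raised(keypoints: dict) -> bool:
    for side in ("left", "right"):
        wrist = keypoints.get(f"{side}_wrist")
        shoulder = keypoints.get(f"{side}_shoulder")
        if wrist and shoulder and wrist[1] < shoulder[1]:
            return True
    return False

pose = {"left_wrist": (100, 40), "left_shoulder": (110, 90),
        "right_wrist": (180, 120), "right_shoulder": (170, 95)}
```

The appeal of working on skeletons rather than raw pixels is that the rule (or classifier) is invariant to clothing, lighting and background, which matters in the complex urban scenes discussed above.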

11.50am

Multimodal feedback for drivers of AVs during transfer of control

Salman Safdar
Automotive Consultant
Ansible Motion
UK
Level 3 vehicles are now commercially available, allowing drivers to engage in non-driving-related tasks. However, autonomous control systems have design and intent limitations, so it is important to provide suitable feedback for safe and timely transfer of control between vehicle and driver. Driver-in-the-loop simulator investigations led by Prof. Bani Anvari (Professor of Intelligent Mobility at University College London) focus on how haptic feedback through a novel mechano-tactile driver's seat reduces drivers’ reaction times and improves the success of control transfer. The IM@UCL team, supported by Ansible Motion, has found that multimodal feedback outperforms single-modal feedback: combining auditory, visual and haptic feedback offers the highest proportion of successful vehicle control transfers with the lowest reaction times. We believe this research can be extended in the future to include 6-DOF motion for highly immersive feedback.

What the audience will learn

  • Effectiveness of different types of feedback for takeover requests
  • Impact of feedback on drivers’ perception during different levels of autonomy
  • Understanding situational awareness based on physiological and behavioural sensors
  • Experimental methodology to validate takeover requests in a driver-in-the-loop simulator
  • Design of robotic human-machine interfaces to support takeover requests

12.15pm

Beyond machine learning: Solving the Long Tail Problem of ADAS/AVs

Stan Stringfellow
CEO / Founder
PlasticFog Technology Corp.
USA
Machine learning, which is based on past experience, can never address all possible future ADAS/AV scenarios, especially combinatorially complex and unfamiliar scenarios, which arise all the time. This is called the Long Tail Problem. It is a huge impediment to safe and effective ADAS/AVs, yet no publicly known technology has been able to solve it. We are developing a conceptual-level reasoning capability that builds on top of machine learning yet is completely transparent. We think of it like the all-seeing eye above the pyramid base. This is essentially a fast, real-time, widely distributed combinatorial optimization solution that targets ADAS/AVs.

What the audience will learn

  • Why the Long Tail Problem is critical to solve, and why it is currently unsolved
  • How it is possible to solve roadway combinatorial optimization problems in real-time
  • How ADAS technology can be evolved into city-wide hierarchical ADAS ecosystems, which even assist non-connected roadway users
  • How to overcome the problem of intrusive and/or unnecessary ADAS events by utilizing a conceptual-level reasoning capability
  • How to overcome the problem of false classifications of roadway observations by optimizing across multiple sensor/ML viewpoints

12.40pm - 1.45pm

Lunch

Moderator

Salman Safdar
Automotive Consultant
Ansible Motion
UK

1.45pm

Shaping the future of road safety

Nihat Kücük
CTO
Terranet
Sweden
In a world where cities grow bigger and urban traffic becomes denser and more complicated, we have the power to make a difference. With the use of cutting-edge technology, we can protect the most vulnerable road users. We believe that a complementary set of intelligent solutions in vehicles will transform urban road safety and ensure a safer future for all.

What the audience will learn

  • Vision Zero: megacities, micromobility, urbanization
  • Facts on road fatalities
  • Why we need better and faster sensors
  • Today’s sensor landscape

2.10pm

EMI gap pads simplify material needs, enabling high-performing ADAS components

Bongjoon Lee
Product Development Scientist
Henkel Corporation
USA
Consumer demand is propelling rapid growth of autonomous driving and advanced driver assistance systems. This increase in automotive electronic sensors, which tend to use high-frequency (77 GHz) radio waves, coupled with the need for electronic devices to become more power intensive, creates two critical concerns: electromagnetic interference and overheating. Both can lead to damaged components, malfunctions, equipment errors, reduced component life, failure and more. Thermally conductive EM-absorbing materials combine electromagnetic attenuation and effective heat dissipation in one material, with typical applications on ADAS components such as radar.

What the audience will learn

  • EMI Gap Pads reduce the risks of electronics failure due to overheating and electromagnetic interference
  • Optimization of thermal pathway, dielectric and magnetic properties is the key for high performance
  • Using fillers that have both high thermal conductivity and high EM absorbing property gives big performance/processing/cost benefits

2.35pm

Slimmer, smaller, smarter: Optimizing LiDAR design for today’s automotive trends

Dr Mark McCord
Chairman of Technology Advisory Board & Co-Founder of Cepton
Cepton, Inc.
USA
Lidar technology is expected to evolve as the automotive industry embraces new trends: safety, autonomy, software definability and electrification. How can lidar design be optimized to offer the performance needed, while meeting the increasing OEM demand for seamless sensor integration and intelligence? In this presentation, we will discuss the latest lidar trends that address this challenge: 1) unlocking new placement options with slimmer, smaller lidar design; 2) enabling adaptive 3D perception with software-defined imaging capabilities; and 3) leveraging automotive lidar programs to achieve lidar scalability.

What the audience will learn

  • In-vehicle placement options, their advantages, and how to build integration solutions that address real-world driving needs like self-cleaning and cooling
  • Market demand for smaller, slimmer lidar, and how ASICs enable higher sensor performance without increasing footprint and power consumption
  • How software-definable lidars increase flexibility without changes to hardware, and how to combine performance enhancement, software definability and size reduction
  • How simulation helps streamline design and development processes involving lidar integration for OEMs and autonomous vehicle developers, with examples showcased
  • OEM validation in scaling lidar for mass-market ADAS and AV applications, and how to increase lidar’s usability in consumer vehicles