Conference Program



Room A Real-world test and deployment – lessons learned
8.50am - 10.55am

Moderator

Katherine Sheriff
Lead, Mobility and Transportation Industry Group
Davis Wright Tremaine
USA

8.50am

Autonomous Rapid Transit: Clear alternative to light-rail and people-mover

Jia-Ru Li
CEO
LILEE Systems
USA
To increase efficiency and reduce cost, LILEE Systems has offered the Autonomous Rapid Transit (ART) system since 2018. Compared with legacy light rail and people movers, ART reduces cost by 60% and implementation time by 75%. LILEE has completed projects operating full-sized buses on open roads in America and Taiwan. We will share challenges and opportunities from the development, validation, and deployment phases: 1. System verification in partnership with cities and operators. 2. A DMV license to operate self-driving buses from 9 a.m. to 6 p.m. for 2 years. 3. Infrastructure connecting signal priority and vehicle/pedestrian detection to the Operation Control Center.

What the audience will learn

  • Why can Autonomous Rapid Transit (ART) be cheaper and faster to implement than light rail and people movers?
  • How does ART compare with robotaxis, city-wide home delivery services, and long-haul trucking?
  • What are the steps in developing an ART system?
  • What are the challenges when developing an ART system?
  • Which cities and projects around the world are actively pursuing ART systems?

9.15am

Lessons learned from robotaxi services in multiple cities in Korea

Joonwoo Son
Founder & Chair
Sonnet Co., Ltd.
Korea
Sonnet.AI, South Korea's leading robotaxi startup, launched its first commercial robotaxi service in late 2021 and has since expanded to various cities. We will share the possibilities and limitations of operating a robotaxi service that serves different purposes - tourism, commuting, and daily transportation - depending on the service area. Robotaxis will play an important role in maintaining mobility not only in large cities but also in depopulated ones, yet they have not yet secured profitability. However, Sonnet.AI is creating a business model that can operate in the black with modest government transportation subsidies.

What the audience will learn

  • Understanding the possibilities and limitations of operating a robotaxi service in South Korea
  • Understanding Korean legislation for commercial robotaxi services
  • Case studies of safe and profitable commercial robotaxi operations

9.40am

The importance of adverse weather condition testing with automated transit buses

Cemre Kavvasoglu
Product Management Director
ADASTEC
USA
One of the main challenges of automated transit bus deployments is varying weather conditions. Depending on where an automated transit bus is deployed, it is vital to assess the environment’s climate, including extreme weather such as heavy rain, snow, fog, and extreme heat. To operate throughout the seasons, ADASTEC has conducted extensive testing in various climates around the globe. In doing so, ADASTEC has amassed the data and experience not only to handle a variety of environmental and weather conditions but also to operate in different road settings during all four seasons.

What the audience will learn

  • What are the difficulties of operating in adverse weather conditions? How does ADASTEC operate its buses in heavy rain, snow, and fog?
  • Developing, implementing, and testing the software in various weather and road conditions around the globe
  • How does ADASTEC mitigate challenges with human acceptance and awareness of automated public transportation buses?
  • Mitigating risks due to lack of available infrastructure communication (V2I) and environmental factors

10.05am

Vehicle development support for PTI impact testing of automated vehicles

Thomas Tentrup
Director R&D
KUES Bundesgeschäftsstelle
Germany
In the future, safety-relevant ADAS will need to be tested as part of the periodic technical inspection (PTI) in the aftersales area. Similar to the ViL tests of the vehicle development phase, the tests are performed as reliable impact testing during dynamic driving on a specific functional test bench at the KÜS DRIVE test line, which allows steering while driving at up to 130 km/h without fixing the vehicle in place. A monitor and a radar target simulator stimulate the ADAS sensors without ADAS ECU communication. The impact tests are performed in this way because standardized “open” interfaces to the ADAS ECUs and sensors are not implemented during the vehicle development phase.

What the audience will learn

  • ADAS functionality needs to be checked by impact tests of the complete functional chain over the whole vehicle lifetime.
  • Current impact tests at KÜS DRIVE can check ADAS without ADAS ECU communication, and therefore with some limitations.
  • Implementing standardized open ADAS ECU and sensor interfaces during vehicle development would facilitate impact testing for PTI.
  • Impact tests at KÜS DRIVE are a safe and economical alternative to checking ADAS outdoors on streets with movable targets.
  • AEBS tests in particular can be performed at KÜS DRIVE without danger to driver and vehicle, with traceable, exact measurement results.

10.30am

High-speed tests of collision mitigation systems on cars and trucks

Shawn Harrington
Principal & Founder
Forensic Rock
USA
Higher-closing-speed tests (up to 70-75 mph) of passenger and commercial vehicle forward collision warning (FCW) and automatic emergency braking (AEB) systems in stationary rear collision scenarios will be presented. The research covers the timing of FCW issuance and AEB initiation in both passenger and commercial vehicles in stationary and moving rear collision scenarios. Testing speeds range from 10 to 75 mph, and the collision mitigation systems of over a dozen passenger vehicles and heavy trucks from model years 2013 to 2023 will be compared. Real-world LKAS and LDW research will also be presented.

What the audience will learn

  • Performance of FCW/AEB at high closing speeds
  • Tests on model years 2013 - 2023
  • Performance of heavy truck collision mitigation systems
  • Performance and comparison of passenger vehicle collision mitigation systems
  • Performance of LKAS and LDW in the real world

10.55am - 11.25am

Break

Room A Software, AI, architecture and data management – continued
11.25am - 3pm

Moderator

Ram Mirwani
Group Manager, Automotive Business Development
Rohde & Schwarz
USA

11.25am

Current trends in high-performance automotive datalogging

Bernhard Kockoth
Global Technology Scout
ViGEM GmbH
Germany
The verification and validation of ADAS systems for automated driving require accurate recording of high data rates from sensors and vehicle buses in real-world use. High-performance dataloggers record every bit of high-resolution raw video streams. In 2019 ViGEM introduced distributed logging, which places capture modules and adapter probes close to the data sources and transmits the collected, timestamped data over robust Ethernet connections to central storage. After four years of successful worldwide deployments, we present a scalable solution with a new datalogger that fits into existing data capture setups.

What the audience will learn

  • State-of-the-art high-performance datalogging
  • Trends in hardware and software development for logging devices
  • Distributed data logging explained
  • Outlook on future developments

11.50am

Standardized APIs for autonomous, connected (V2X) and cockpit software

David Cole
Director - Engineering Solutions
Danlaw Inc.
USA
Different components of autonomous driving software are developed in an integrated environment using both commercial and open-source software such as AUTOWARE. However, other key components of the automotive environment, in particular V2X and cockpit software, are developed in isolation. Hence, we have standardized APIs that help these components interact seamlessly. Software APIs for connected cars, the cockpit, and autonomous software are becoming increasingly important in the automotive industry. They provide a standardized interface between software components, enabling faster development and deployment of new software functionality. APIs also promote innovation and collaboration, driving growth and revenue in the automotive industry.

What the audience will learn

  • Overview of AUTOWARE, Connected Car (V2X) and Cockpit features
  • Key features of V2X and the cockpit that impact autonomous software
  • Challenges of not having standardized APIs
  • Definition of APIs and their implementation
  • Advantages of having Standardized APIs for Autonomous, Connected (V2X) and Cockpit Software

12.15pm

Reaching production: Solving deployment challenges through scalable cloud infrastructure

Dev Patel
Product Manager
Applied Intuition
USA
In the past decade, AV capabilities have gone from dream to near-reality. However, remaining blockers for production deployment are amongst the most challenging, including validating new software across millions of test cases, monitoring and mining fleet data, quantifiably proving safety, and expanding operational design domains (ODDs) at scale. Cloud infrastructure and tooling play a significant role, powering development loops with scaled compute and data that enable programs to go to market faster and cost-effectively without compromising on safety. This presentation discusses four key tactics that industry leaders are deploying: Virtual validation, data management, monitoring and deployment, and cloud collaboration.

12.40pm - 1.45pm

Lunch

1.45pm

Enabling future Software Defined Vehicles for ADAS and Automated Driving

Samuel Kuttler
Senior Business Development Engineer
Vector
Germany
Vector is a hidden champion in the embedded systems and software field; at the same time, the company pioneers the path to Software Defined Vehicles (SDV) through software products that adopt an embedded or cloud-native approach. Today, Adaptive MICROSAR drives around in all German-built cars, providing the highest level of dependability for ADAS applications. While Silicon Valley accelerates software and system development through innovation, there is a potential trade-off in quality. Let’s see how we can make the best of both worlds: dependable systems engineering and fast-paced software craftsmanship for automated mobility.

What the audience will learn

  • Embedded or cloud-native - the first step is a collaborative approach
  • What Silicon Valley can learn from automotive software engineering to provide Level 4/5 automated driving
  • The next technologies for ADAS at the software platform level
  • Combining Adaptive MICROSAR and ROS (Robot Operating System) on one target

2.10pm

Modern Software-defined vehicle architectures: The foundation for autonomous vehicles

Pedro López Estepa
Director of Automotive
Real-Time Innovations (RTI)
Spain
Modern software-defined vehicle architectures are the foundation of the Autonomous Vehicle (AV) era. The new AV development paradigm requires safety-critical software not only to meet the requirements set forth in the Functional Safety (FuSa) standard, but also to provide flexibility, scalability, compatibility, and upgradability across platform components. Platform-independent solutions together with standard data models help OEMs optimize the path towards autonomy while reducing associated risk and cost. In addition, choosing the right business model and liability allocations will help ensure long-term success. This session will highlight the path towards a solid software-defined architectural strategy as a foundation for AVs.

What the audience will learn

  • Best practices for securing a long-term strategy for the software-defined vehicle
  • The optimal design cycle for integrating functional safety software components into a production program
  • Challenges at the business model and liability level in functional safety production programs, from a supplier perspective
  • How RTI's Connext Drive brings safety and functionality together in AV framework design
  • The importance of platform-independent components and standard data models in securing the continuous evolution of AV software

2.35pm

Augmenting the Why: outsourced anomaly detection data pipelines

Aaron Bianchi
Director, ML Solutions
Digital Divide Data
USA
AD and ADAS systems constantly interact with new situations and scenarios. Some of these may cause rapid corrective action, human takeover, and other similar events. Understanding what causes these corrective actions can help to inform testing, ODD definition, and much more. Join this talk to learn more about how we help AD leaders quickly and effectively parse and bin these corrective actions to help drive their functionality and error handling capabilities into the future.

What the audience will learn

  • Identify driver takeover and similar cases to triage and define
  • Build a conceptual model for root cause analysis and ODD definition
  • Drive meaningful insights into vehicle performance to augment the engineering team
  • Apply similar approaches to other engineering areas and disciplines (such as test engineering)

Room B Sensor test, development, fusion, calibration and data
8.50am - 3pm

Moderator

Jeremiah Robertson
Principal Safety Engineering Team Lead
Motional
USA

8.50am

Economics of enabling technologies for lidar

Sunil Khatana
Chief Technology Officer
Inyo System Inc
USA
There are several competing technologies for lidar. This presentation will discuss the cost/performance trade-offs of the underlying technologies that enable the realization of lidar for automotive and 3D sensing. It will review the ranging methods, lasers, detectors, and scanning methods used in lidar design, how these map to performance and cost, and which application spaces each device technology is likely to be most competitive in. It will comprehensively cover ranging methods (iTOF, dTOF, coherent), lasers (EEL, VCSEL, fiber laser), illumination (spots, line scan, flying spots), and detectors (APD, SiPM, SPAD, CIS, and PIN), comparing the strengths and weaknesses of each in terms of cost, range, FOV, range resolution, and spatial resolution.

9.15am

Silicon photonics for LiDAR and sensing application

Marcus Yang
Sr. Director, Head of LIDAR Sensing - Silicon Photonics Product Division
Intel
USA
FMCW LiDAR offers precise distance measurement using a continuous-wave laser with modulated frequency. Advantages include higher resolution, velocity detection, and cost-effectiveness. Silicon Photonics (SiP) enables compact and efficient FMCW LiDAR systems by integrating optical components on a single chip. SiP's role in sensing spans communication, computing, automotive, bio-sensing, optical gyroscopes, and AI applications. Intel offers a mature, high-volume SiP platform with industry-leading quality, unique devices such as the hybrid laser, and cutting-edge PICs for FMCW LiDAR sensing.

9.40am

Radars in autonomy: current landscape, challenges, and the future

Arvind Srivastav
Software Engineer, Radar Perception
Zoox, Inc.
USA
This presentation will provide a compelling overview of radars in autonomy today and how to maximize their contribution to autonomous perception. Topics will include radar fundamentals, radar data formats with an illustration of their strengths and weaknesses, the role of radar in perception, current radar deep learning approaches, early-fusion research models and their promise, radar occupancy flow models, and methods for direct target tracking on radar data. The talk aims to give the audience an improved understanding of radars used in autonomy and to encourage increased contribution from radars to make autonomy safe, robust, and reliable.

What the audience will learn

  • Why radars aren't able to live up to their promise in autonomous perception today
  • Where the problem lies and why recent deep learning approaches offer better promise
  • What the future of radar in autonomy might look like

10.05am

Utilizing LiDARs at scale for ADAS safety, compliance and efficiency

Mohammad Musa
Founder and CEO
Deepen AI
USA
Maintaining accurate sensor calibration is key to all ADAS and AV systems. Now that most new cars will have ADAS features in them, undertaking multi-sensor calibration at scale is becoming a real bottleneck. The presentation will discuss the multi-sensor calibration lifecycle, all types of calibrations required and best practices for conducting multi-sensor calibration at scale.

What the audience will learn

  • Why LiDAR? Safety, Compliance and Efficiency
  • LiDAR placements and integration with other sensors
  • LiDAR Annotation & Calibration

10.30am - 11am

Break

11am

Next-Generation Sensors for Automated Road Vehicles - SAE EDGE™ Research Report Discussion

Cameron Gieda
Sr. Director of Business Development
Pony.AI
USA
Based on an SAE EDGE™ Research Report authored by Sven Beiker with support from a number of mobility experts, this discussion will cover the spectrum of currently available sensors for higher levels of automated driving, such as lidar, radar, cameras, and ultrasonics. Pony.ai's Cameron Gieda, one of the primary contributors to the report, will explain the nuanced differences between the approaches manufacturers employ when designing these sensors, and why these decisions are made. He will detail why certain sensors are used for certain tasks and what the "Achilles heel" of any given modality is. Lastly, he will address how sensor fusion can leverage the best of each type of sensor, and what trends we will see in the future.

What the audience will learn

  • What are the common sensor architectures in automated vehicles?
  • What are the minimum sensors needed for higher levels of ADAS?
  • Can cameras alone solve the problem?
  • Why is redundancy important for safety and reliability?
  • How computer power throttles the use of some sensor modalities.

11.25am

Solving critical issues affecting safety of ADAS - accelerate AD adoption

Hannah Osborn
Director, America Sales & Business Development
LeddarTech
Canada
Consumer confidence drives the pace of ADAS and AD development. This presentation will unveil some critical issues surrounding safety and performance that are evident today, why they affect customer confidence, and how they can be addressed to accelerate greater safety and autonomy.

What the audience will learn

  • The sensor fusion and perception software directly impacts the safety and performance of ADAS and AD
  • Sensor performance is ever-evolving, each type fulfilling a function based on level of autonomy and redundancy is extremely important
  • Sensor limitations and degradation issues are highlighted in harsh conditions but can be overcome with better fusion and perception software
  • Low-level fusion helps mitigate the impact of a malfunctioning/degraded sensor, providing better detections and fewer false alarms for small obstacles
  • Suited for on-road and off-road applications, LLF is the future of perception

11.50am

Turning miles into minutes with greater realism in test

Ajay Vemuru
Director, PNT Simulation
Spirent Communications PLC
USA
Testing in the lab saves engineers time and money over road testing, and delivers results with greater traceability and repeatability. For this reason, increasing realism in lab testing is a potential game changer for automotive developers. In this presentation we’ll look at how to ensure a hardware-in-the-loop test environment is optimized for realism, particularly by reducing the impact of latency. We’ll look at how you can recreate the local signal environment to get a better idea of how your equipment will perform on the road, including by using record & playback. Lastly, we’ll discuss how you can bring multi-sensor testing into the lab.

What the audience will learn

  • Local environment - how can developers model the local environment more realistically, and why?
  • Multi-sensor testing - how can you integrate multiple sensors efficiently in the lab?
  • Hardware-in-the-loop - how does latency impact the integrity of your testing?
  • Record & playback - how high-fidelity record & playback can combine the realism of the real world with the repeatability of the lab
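Why latency matters at highway speed can be sketched with one line of arithmetic. The speed and latency figures below are illustrative assumptions, not specifications of any particular simulator.

```python
# Illustrative hardware-in-the-loop latency arithmetic: every
# millisecond of simulator latency maps to position error at speed.

def position_error_m(speed_kmh: float, latency_ms: float) -> float:
    """Distance the vehicle travels during one latency interval."""
    return (speed_kmh / 3.6) * (latency_ms / 1000.0)

# At 120 km/h, a 50 ms simulation lag displaces the vehicle by ~1.7 m,
# which is comparable to a highway lane-keeping tolerance.
print(f"{position_error_m(120.0, 50.0):.2f} m")
```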

12.15pm

FMCW as the end state: exploring advantages of instantaneous velocity

Matt Last
Director of Product
Aeva
USA
Next-generation frequency modulated continuous wave (FMCW) LiDAR systems can detect and track objects farther, faster, and with greater precision than ever before. Aeva’s FMCW LiDAR-on-chip system adds Doppler velocity to the standard range, azimuth, elevation, and reflectivity channels generated by traditional 3D LiDAR systems. This session will explore the unique perception capabilities enabled by the velocity channel, how they deliver improved safety and reliability to vehicle automation, and why FMCW-based systems are the end state for high-performance automotive LiDAR.

What the audience will learn

  • How FMCW LiDAR systems can detect and track objects farther, faster, and with greater precision than ever before
  • The unique perception capabilities enabled by the Doppler velocity channel
  • How these capabilities deliver improved safety and reliability to vehicle automation
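The relationship behind the Doppler velocity channel fits in a few lines. The 1550 nm wavelength below is a typical assumption for FMCW lidar, not a stated Aeva specification.

```python
# How an FMCW lidar derives per-point radial velocity: the Doppler
# shift of the returned light is proportional to the target's radial
# speed. Wavelength is an assumed, typical value.

WAVELENGTH_M = 1550e-9  # assumed operating wavelength

def radial_velocity_ms(shift_hz: float) -> float:
    """v_r = lambda * f_d / 2 (factor 2: round trip to target and back)."""
    return WAVELENGTH_M * shift_hz / 2.0

def doppler_shift_hz(radial_velocity_ms: float) -> float:
    """Inverse relation: f_d = 2 * v_r / lambda."""
    return 2.0 * radial_velocity_ms / WAVELENGTH_M

# A car receding at 30 m/s produces a shift of roughly 39 MHz,
# measurable per point without frame-to-frame differencing.
print(f"{doppler_shift_hz(30.0) / 1e6:.1f} MHz")
```

This per-point, instantaneous velocity is what traditional pulsed 3D lidar lacks; it otherwise has to estimate velocity by differencing object positions across frames.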

12.40pm - 1.45pm

Lunch

Moderator

Hannah Osborn
Director, America Sales & Business Development
LeddarTech
Canada

1.45pm

Fix the optical subsystem, fix lidar

Eric Aguilar
Co-founder & CEO
Omnitron Sensors
USA
Lidar is highly promising, but today’s optical subsystems for lidar remain fragile, large, expensive to build and maintain, overly susceptible to environmental conditions, and inconsistent in their performance. We can reach the full potential of lidar by fixing the optical subsystems on which lidar systems rely. With experience that spans core sensor development and systems integration, Eric Aguilar learned first-hand what automotive integrators need for affordable, reliable, long-range lidar systems. He’ll review the pros and cons of today’s optical subsystems and introduce a new, cost-effective MEMS scanning mirror for lidar that ticks all the boxes for automotive integrators and manufacturers.

What the audience will learn

  • The role played by the optical subsystem in LiDAR for ADAS and autonomous systems
  • Automotive industry requirements for optical subsystems for LiDAR
  • The top 3 issues with existing optical subsystems for LiDAR—Voice Coil, SCALA, spinning polygon, Galvo
  • The great potential—and challenges—of MEMS mirrors
  • Problem-solver: first mass-produced low-cost, rugged, reliable MEMS scanning mirror

2.10pm

Advancements in stereo vision for night-time and low-light scenarios

Piotr Swierczynski
Director of Engineering
NODAR
USA
Join us for an illuminating session on how advanced stereo vision overcomes the challenges of night-time and low-light driving in autonomous vehicles. We will share how NODAR's 3D vision system outperforms other sensor systems in these challenging conditions and provides the reliability and performance required for L3 and above autonomy. This presentation will cover the technology behind stereo vision and NODAR’s unique take on this age-old technique. We will offer valuable insights into how this new technology can enhance safety in autonomous driving.

What the audience will learn

  • The benefits and limitations of stereo vision technology for nighttime and low-light driving scenarios
  • Real-world examples of how stereo vision is being used in autonomous vehicles
  • Insights into how stereo vision enhances safety in autonomous driving, particularly in nighttime and low-light scenarios

2.35pm

Effect of UVH coatings on self-cleaning performance for automotive sensors

Songwei Lu
Research Associate I
PPG Industries, Inc.
USA
Using a stationary testbed, we have evaluated the effect of UVH coatings on the self-cleaning performance of automotive sensors under lab-simulated inclement weather. Four types of inclement weather were simulated: rain, mud, fog, and bugs. Images from the vision camera were analyzed using the modulation transfer function (MTF) and signal-to-noise ratio (SNR) to evaluate the optical distortion incurred by weathering. Evaluation results for the UVH coatings as prepared and after up to 3,000 hours of Weather-O-Meter testing will be presented. Current results point to a significant benefit of using UVH coatings to improve the signal reading of vision cameras under inclement weather.

What the audience will learn

  • The effect of UV-durable hydrophobic coatings on autonomous vehicle sensors under inclement weather
  • Lab-simulated weather conditions including light and heavy rain, light and heavy mud, fog, and bugs
  • The presentation will mainly focus on the automotive vision camera; effects on IR camera, LiDAR, and radar will be mentioned
  • Significant benefits of using UVH coatings to improve the signal reading of vision cameras under inclement weather
  • UV-durable hydrophobic (UVH) coatings on sensors will help ensure the safe and effective driving of autonomous vehicles
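As a rough illustration of the SNR metric the study uses, here is a minimal sketch comparing a clean and a heavily soiled image patch. The patch sizes and noise levels are invented examples, not PPG's test data or pass/fail criteria.

```python
# Illustrative signal-to-noise ratio over a camera image patch:
# soiling, droplets, or fog raise pixel noise and lower the SNR.
import numpy as np

def patch_snr_db(patch: np.ndarray) -> float:
    """SNR in dB: mean intensity over intensity standard deviation."""
    mean = float(patch.mean())
    std = float(patch.std())
    if std == 0.0:
        return float("inf")
    return 20.0 * float(np.log10(mean / std))

rng = np.random.default_rng(0)
clean = 128.0 + rng.normal(0.0, 2.0, (64, 64))    # low-noise patch
soiled = 128.0 + rng.normal(0.0, 20.0, (64, 64))  # heavy-rain-like noise
print(patch_snr_db(clean) > patch_snr_db(soiled))
```

A coating that sheds water faster keeps the soiled case closer to the clean one, which is the benefit the MTF/SNR analysis is designed to quantify.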

Room C Best practices for accelerating the test, development and deployment of safe ADAS & AD systems
8.50am - 3pm

Moderator

Pedro López Estepa
Director of Automotive
Real-Time Innovations (RTI)
Spain

8.50am

Connected mobility solution for reliable performance of automated vehicles

Subrata Kundu
Senior Manager
R&D Division, Hitachi America, Ltd
USA
Digitalization is rapidly transforming the mobility industry and providing new opportunities in connected automated vehicles and mobility services. As the acceptance of connected vehicles increases, innovative solutions are emerging to improve safety and operational efficiency and to reduce the possibility of error in automated vehicles. We have been developing advanced sensors, a high-performance electronic control unit (ECU), and a connected mobility solution that together enable improved, reliable performance of connected automated vehicles. This presentation will introduce innovative solutions to maximize safety and ensure the component functionality of connected automated vehicles.

What the audience will learn

  • Connected Vehicle Application Platform
  • Smart Routing Solution to Maximize Safety of Connected Automated Vehicle
  • Connected Diagnostic Solution to Ensure Component Functionalities

9.15am

Cloud-Powered development for AVs at scale

J.J. Navarro
Customer Experience Lead, Autonomous Vehicles Google Cloud
Google
USA
Learn how leading AV companies are using the cloud to accelerate the development of their AI/ML autonomy platforms, build for scale and elasticity, and transition from R&D to reliable commercial operations.

What the audience will learn

  • How AV companies are using the cloud to optimize for cost and speed
  • How they build for global scale and reliable commercial operations
  • How they scale their ML and simulation workloads

9.40am

Communication of automated vehicles with other road users

Dr Sven Beiker
External Advisor
SAE International
USA
This presentation will discuss how automated vehicles will and should communicate with other road users. Conventional (human-driven) vehicles, bicyclists, and pedestrians already have a functioning system of understanding each other while on the move. Adding automated vehicles to the mix requires assessing the spectrum of existing modes of communication - implicit and explicit, biological and technological - and how they will interact with each other in the real world. The impending deployment of AVs represents a major shift in the traditional approach to ground transportation; its effects will be felt both by parties directly involved in vehicle manufacturing and use and by those that play roles in the wider mobility ecosystem (e.g., aftermarket and maintenance industries, infrastructure and planning organizations, automotive insurance providers, marketers, telecommunication companies). The audience will learn about multiple scenarios that are likely to evolve in the not-too-distant future and how they are likely to play out in practice.

What the audience will learn

  • An overview of previous work related to external communication of AVs
  • An understanding of the challenges that seemingly obvious solutions present
  • Insights into what experts demand and question regarding AV communication
  • An appreciation of the differences between visual and auditory communication solutions

10.05am

Determining the performance of an ADS through three key questions

Chris Reeves
Head of CAV Technologies
HORIBA MIRA Ltd
UK
ADS are transforming how we travel; systems are becoming more prevalent, more complex, and broader in their application in an unpredictable external environment. This creates a major technical challenge: how to ensure features are safe and functionally robust without exponentially increasing validation and verification time and cost. HORIBA MIRA’s ASSURED CAV centre of excellence uses a multi-pillar approach and novel techniques to answer three critical questions - what to test, how to test, and when to stop - ensuring vehicle performance is determined for real-world complexity.

10.30am - 11am

Break

11am

Insights on V2X, from standardization to validation

Tony Vento
Strategic Development
S.E.A.
USA
V2X (vehicle-to-everything) adds capabilities to ADAS sensor fusion such as non-line-of-sight (NLOS) perception and sensor sharing. It will improve vehicle safety, which is increasingly necessary as fatalities involving vehicles and VRUs (vulnerable road users such as pedestrians) continue to rise. Hear insights on standardization and validation efforts. Standardization involves automotive OEMs and Tier 1s, network operators, semiconductor companies, and device testers. Validation includes 3GPP tests at the physical level, the V2X protocol level, and Day 1 use cases at the application level. We have tested many V2X devices and collected field data from OBUs (on-board units) and RSUs (roadside units). Additional 5G benefits are coming.

What the audience will learn

  • How V2X will improve the safety of Autonomous Driving
  • The status of V2X standardization
  • Validation efforts for V2X

11.25am

Skeleton-based gesture recognition model for automated vehicles

Jagdish Bhanushali
Senior Deep Learning Software Engineer
Valeo
USA
In autonomous vehicles, passengers can interact with the system through various physical buttons or voice commands, but pedestrians are outside the vehicle and have no communication channel with it. One way to convey messages from pedestrians to automated vehicles is gesture recognition, where pedestrians perform various actions to inform autonomous vehicles of their intentions. We will present how skeleton-based gesture recognition helps create a communication channel between autonomous vehicles and pedestrians.

What the audience will learn

  • Challenges of complex urban environments for autonomous vehicles
  • Existing methodologies to solve the problems
  • Dataset collection and annotation
  • How Valeo’s approach differs from existing methods
  • Performance and limitations of the system
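As a minimal illustration of the idea, here is a hand-written rule that flags a raised-arm gesture from 2D pose keypoints. This is a sketch, not Valeo's learned model; the keypoint names are assumptions, and real systems feed keypoint sequences to a trained classifier instead of rules.

```python
# Sketch of skeleton-based gesture logic: given 2D keypoints from a
# pose estimator, flag a "hailing" gesture when a wrist is above the
# corresponding shoulder in image coordinates (y grows downward).
from typing import Dict, Tuple

Keypoints = Dict[str, Tuple[float, float]]  # name -> (x, y)

def is_arm_raised(kp: Keypoints) -> bool:
    """True if either wrist is above its shoulder in the image."""
    for side in ("left", "right"):
        wrist = kp.get(f"{side}_wrist")
        shoulder = kp.get(f"{side}_shoulder")
        if wrist and shoulder and wrist[1] < shoulder[1]:
            return True
    return False

pedestrian = {
    "left_shoulder": (100.0, 200.0), "left_wrist": (110.0, 150.0),
    "right_shoulder": (140.0, 200.0), "right_wrist": (150.0, 260.0),
}
print(is_arm_raised(pedestrian))  # left wrist is above the shoulder
```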

11.50am

Multimodal feedback for drivers of AVs during transfer of control

Salman Safdar
Automotive Consultant
Ansible Motion
UK
Level 3 vehicles are now commercially available, allowing drivers to engage in non-driving-related tasks. However, autonomous control systems have design and intent limitations, so it is important to provide suitable feedback for a safe and timely transfer of control between vehicle and driver. Prof. Bani Anvari’s (Professor of Intelligent Mobility at University College London) driver-in-the-loop simulator investigations focus on how haptic feedback through a novel mechano-tactile driver’s seat reduces drivers’ reaction times and improves the success of control transfer. The IM@UCL team, supported by Ansible Motion, has found that multimodal feedback outperforms single-modal feedback: combining auditory, visual, and haptic feedback yields the highest proportion of successful vehicle control transfers with the lowest reaction times. We believe this research can be extended in the future to include 6-DoF motion for highly immersive feedback.

What the audience will learn

  • Effectiveness of different types of feedback for takeover requests.
  • Impact of feedback on drivers’ perception during different levels of autonomy.
  • Understanding situational awareness based on physiological and behavioural sensors.
  • Experimental methodology to validate takeover requests in a driver-in-the-loop driving simulator.
  • Design of robotic human-machine interfaces to support takeover requests.

12.15pm

Beyond machine learning: Solving the Long Tail Problem of ADAS/AVs

Stan Stringfellow
CEO / Founder
PlasticFog Technology Corp.
USA
Machine learning, which is based on past experience, can never address all possible future ADAS/AV scenarios, especially combinatorially complex and unfamiliar scenarios, which occur all the time. This is called the Long Tail Problem. It is a huge impediment to safe and effective ADAS/AVs, yet no publicly known technology has been able to solve it. We are developing a conceptual-level reasoning capability that builds on top of machine learning, yet is completely transparent. We think of it like the all-seeing eye above the pyramid base. This is essentially a fast, real-time, widely distributed combinatorial optimization solution that targets ADAS/AVs.

What the audience will learn

  • Why the Long Tail Problem is critical to solve, and why it is currently unsolved
  • How it is possible to solve roadway combinatorial optimization problems in real-time
  • How ADAS technology can be evolved into city-wide hierarchical ADAS ecosystems, which even assist non-connected roadway users
  • How to overcome the problem of intrusive and/or unnecessary ADAS events by utilizing a conceptual-level reasoning capability
  • How to overcome the problem of false classifications of roadway observations by optimizing across multiple sensor/ML viewpoints

12.40pm - 1.45pm

Lunch

Moderator

Salman Safdar
Automotive Consultant
Ansible Motion
UK

1.45pm

Shaping the future of road safety

Nihat Kücük
CTO
Terranet
Sweden
In a world where cities grow bigger and urban traffic becomes denser and more complicated, we have the power to make a difference. With the use of cutting-edge technology, we can protect the most vulnerable road users. We believe that a complementary set of intelligent solutions in vehicles will transform urban road safety and ensure a safer future for all.

What the audience will learn

  • Vision Zero: megacities, micromobility, urbanization
  • Facts on road fatalities
  • Why we need better and faster sensors
  • Today’s sensor landscape

2.10pm

EMI gap pads simplify material needs, enabling high-performing ADAS components

Bongjoon Lee
Product Development Scientist
Henkel Corporation
USA
Consumer demand is propelling the rapid growth of autonomous driving and advanced driver assistance systems. This increase in automotive electronic sensors, which tend to use high-frequency (77 GHz) radio waves, coupled with the need for electronic devices to become more power intensive, creates two critical concerns: electromagnetic interference and overheating. Both can lead to damaged components, malfunctions, equipment errors, reduced component life, failure and more. Thermally conductive EM-absorbing materials combine electromagnetic attenuation and effective heat dissipation in one material, with typical applications on ADAS components such as radars.

What the audience will learn

  • EMI Gap Pads reduce the risks of electronics failure due to overheating and electromagnetic interference
  • Optimization of thermal pathway, dielectric and magnetic properties is key to high performance
  • Using fillers that have both high thermal conductivity and high EM-absorbing properties provides significant performance, processing and cost benefits

2.35pm

Slimmer, smaller, smarter: Optimizing LiDAR design for today’s automotive trends

Dr Mark McCord
Chairman of Technology Advisory Board & Co-Founder of Cepton
Cepton, Inc.
USA
Lidar technology is expected to evolve as the automotive industry embraces new trends: safety, autonomy, software definability and electrification. How can lidar design be optimized to offer the performance needed, while meeting the increasing OEM demand for seamless sensor integration and intelligence? In this presentation, we will discuss the latest lidar trends that address this challenge: 1) Unlocking new placement options with slimmer, smaller lidar design; 2) Enabling adaptive 3D perception with software-defined imaging capabilities; and 3) Leveraging automotive lidar programs to achieve lidar scalability.

What the audience will learn

  • In-vehicle placement options, their advantages and how to build integration solutions that address real-world driving needs like self-cleaning and cooling.
  • Market demand for smaller, slimmer lidar, and how ASICs enable higher sensor performance without increasing footprint and power consumption.
  • How software-definable lidars increase flexibility without hardware changes, and how to combine performance enhancement, software definability and size reduction.
  • How simulation helps streamline design and development processes involving lidar integration for OEM and autonomous vehicle developers, with examples showcased.
  • The role of OEM validation in scaling lidar for mass-market ADAS and AV applications, and how to increase lidar’s usability in consumer vehicles.