Discover more about the topics and technologies to be discussed at this year's (fee-to-attend) conferences in a series of exclusive interviews with a selection of our expert speakers.
Dr Wolfgang Stolzmann, lead engineer and consultant for ADAS systems at CMORE, talks about recent developments in driver monitoring technology and why they are an important step before the industry can move on to fully autonomous vehicles.
Catch Wolfgang’s presentation, ‘Validation of driver monitoring systems’, at the Autonomous Vehicle Interior Design & Technology Symposium. Purchase your delegate pass here.
Tell us about what you will be presenting.
It sounds like a paradox, but autonomous driving is a big enabler for driver monitoring systems in mass-production vehicles. To reach high and full automation (levels 4 and 5), conditional automation (level 3) must be established first. To guarantee that the driver is able to take over responsibility, inattentiveness and sleepiness must be detected, which can only be done by a driver monitoring system. CMORE is a global partner for the validation of driver monitoring systems (DMS), covering ground truth system concepts, development and integration into the vehicle, and data collection, annotation and analysis.
Our ground truth systems Head-GT and Gaze-GT allow high-precision measurement of the driver’s head position, orientation and gaze direction. We’re currently collecting data using a representative cross-section of the world population. This way, we can make sure that driver monitoring systems work without errors, irrespective of the driver’s gender, age or ethnicity. We’re using our labeling tool C.LABEL for data annotation to deliver precise measurements of the relevant properties of the drivers’ eyes. This includes eye closure values, which are core values for the detection of inattentiveness and sleepiness.
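As a rough illustration of how annotated eye-closure values can feed sleepiness detection, the sketch below computes a PERCLOS-style metric (the fraction of recent time the eyes are closed over a sliding window). The class name, window size and closure threshold are illustrative assumptions, not details of CMORE's C.LABEL pipeline.

```python
from collections import deque


class PerclosEstimator:
    """Illustrative PERCLOS-style sleepiness metric (hypothetical sketch).

    Eye-closure samples are assumed to lie in [0, 1], where 1.0 means
    fully closed. The 0.8 closure threshold and the window size are
    made-up example values, not a production calibration.
    """

    def __init__(self, window_size=60, closed_threshold=0.8):
        self.window = deque(maxlen=window_size)
        self.closed_threshold = closed_threshold

    def update(self, eye_closure):
        """Add one annotated eye-closure sample; return current PERCLOS."""
        self.window.append(eye_closure >= self.closed_threshold)
        return sum(self.window) / len(self.window)


est = PerclosEstimator(window_size=5)
for value in [0.1, 0.9, 0.95, 0.2, 0.85]:
    perclos = est.update(value)
print(round(perclos, 2))  # → 0.6 (3 of the last 5 samples count as closed)
```

A real system would fuse such a metric with blink duration and head pose rather than rely on a single threshold.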
What are the most important driver monitoring systems today and which ones will be important in the future?
In the early 2000s, Daimler started to develop drowsiness detection algorithms based on the steering wheel behavior of the driver. Since 2009, these algorithms have been used in the Mercedes-Benz Attention Assist, a feature introduced as standard equipment on the W212. Every Mercedes model series since then has been equipped with the system.
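Steering-based drowsiness detection typically looks for a characteristic pattern: a phase with almost no steering activity followed by a sudden corrective movement. The sketch below counts such events in a steering-angle trace; the function name and all thresholds (degrees per second, seconds) are illustrative assumptions, not the actual Attention Assist calibration.

```python
def drowsiness_events(steering_angles, dt=0.1, quiet_threshold=0.5,
                      jerk_threshold=8.0, quiet_duration=2.0):
    """Count 'quiet phase, then sudden correction' patterns in a
    steering-angle trace (degrees, sampled every dt seconds).

    Hypothetical sketch: thresholds are made-up example values.
    """
    events = 0
    quiet_time = 0.0
    prev = steering_angles[0]
    for angle in steering_angles[1:]:
        rate = abs(angle - prev) / dt  # steering rate in deg/s
        if rate < quiet_threshold:
            quiet_time += dt           # wheel is essentially still
        else:
            # A fast correction right after a long quiet phase is a
            # known drowsiness signature.
            if quiet_time >= quiet_duration and rate >= jerk_threshold:
                events += 1
            quiet_time = 0.0
        prev = angle
    return events


# 2.5 s of no steering, then an abrupt 2-degree correction in one step:
trace = [0.0] * 26 + [2.0]
print(drowsiness_events(trace))  # → 1
```

Production systems additionally normalize for road curvature, speed and individual driving style before scoring such events.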
It has set the standard to this day, but because it takes its information from the driver’s steering inputs, it evidently cannot work in self-steering cars. For conditional automation (level 3), only direct driver observation with a driver camera is suitable for driver monitoring systems. Driver cameras will become as important for level 3 automation as steering wheel behavior monitoring is for manual driving.
Can it be guaranteed at all that a ‘driver’ will be able to take over when required if the vehicle handles all driving functions in practice?
A quick look at level 2 cars with steering assist shows the challenge. A driver can overestimate the abilities of a self-steering car and try to drive hands-off. To prevent this foreseeable misuse, all self-steering cars have hands-off detection: if the driver’s hands are not on the steering wheel, the system prompts the driver to touch the wheel again.
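The hands-off escalation described above can be sketched as a simple timer-based rule. The function name, states and timing thresholds here are illustrative assumptions, not any OEM's actual calibration.

```python
def hands_off_state(hands_on, hands_off_seconds,
                    warn_after=10.0, escalate_after=30.0):
    """Illustrative hands-off detection logic (hypothetical sketch).

    Returns "ok", "warn" (prompt the driver to touch the wheel), or
    "escalate" (e.g. louder warnings, then a safe stop). The timing
    thresholds are made-up example values.
    """
    if hands_on:
        return "ok"
    if hands_off_seconds >= escalate_after:
        return "escalate"
    if hands_off_seconds >= warn_after:
        return "warn"
    return "ok"


print(hands_off_state(False, 15.0))  # → warn
print(hands_off_state(False, 45.0))  # → escalate
```

In a real vehicle the timer would be driven by a torque or capacitive sensor in the wheel, and escalation would end in a minimum-risk maneuver rather than simply a stronger warning.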
In a conditionally automated driving scenario (level 3), the driver can still overestimate the car’s ability to take over responsibility for driving safety: the driver might become too inattentive or, worse, fall asleep. To prevent this, level 3 cars will have driver responsiveness detection, for instance as described in patent DE102015001686B4. Camera-based detection of inattentiveness and sleepiness can also be combined with a dead man’s switch.
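One way to picture combining camera-based monitoring with a dead man's switch is a simple fusion rule: trust the camera while it reports an attentive driver, and fall back to an explicit responsiveness prompt otherwise. This decision rule and its names are a hypothetical sketch, not the method of the cited patent.

```python
def driver_ready_to_take_over(camera_attentive, eyes_open,
                              responded_to_prompt):
    """Illustrative fusion of camera-based driver state with a dead
    man's switch-style responsiveness check (hypothetical sketch).

    camera_attentive/eyes_open: booleans from the driver camera.
    responded_to_prompt: whether the driver acknowledged an explicit
    prompt (e.g. pressed a button) within the allowed time.
    """
    if camera_attentive and eyes_open:
        return True  # camera evidence alone suffices
    # Camera suggests inattentiveness or sleepiness: require an
    # explicit response before treating the driver as available.
    return responded_to_prompt


print(driver_ready_to_take_over(True, True, False))   # → True
print(driver_ready_to_take_over(False, False, False)) # → False
```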