In-car gesture controls and automotive human-machine interactions are emerging in premium cars from BMW, Jaguar, Volkswagen, and others. They complement driver monitoring systems (DMS), which use computer vision to track driver attentiveness.
By issuing warnings that alert the driver as needed, a DMS saves lives. But what can be done to improve driver safety even before the moment of distraction?
In this blog post, we will explore how hand tracking and gesture recognition for in-car touch-free gesture controls pair with driver monitoring systems as a complementary duo that helps keep drivers focused.
Safety First: DMS and Multimodal Infotainment Controls
Infotainment system and cognitive load
It takes only 2.5 seconds of eyes off the road before a driver’s risk of crashing significantly increases, yet completing an interaction on a touchscreen infotainment system requires 18 to 25 seconds.
With mid-air gesture controls, drivers can keep their eyes on the road throughout an interaction, greatly improving road safety by simplifying interactions with the infotainment system.
Complementing the DMS with gesture controls
Even advanced driver monitoring systems that track pupil dilation, gaze, and head orientation can only react once attention has already drifted, and it takes just seconds of distraction before a driver’s risk of crashing significantly increases. When a driver monitoring system is combined with hand tracking and gesture recognition for in-car touch-free gesture controls, safety increases dramatically.
By simply swiping to change the song or using mid-air gesture controls to adjust the volume, this intuitive human-machine interface enables precise, seamless navigation of display functionalities. Clay AIR’s software even recognizes the movement of individual fingers, with up to 99% accuracy (F-measure against ground truth) on simple gestures and sequences.
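As a rough illustration of how an accuracy figure like this is computed, the F-measure combines precision and recall over a labeled ground-truth set. The counts below are made-up for illustration and are not Clay AIR’s evaluation data:

```python
def f_measure(tp: int, fp: int, fn: int) -> float:
    """F-measure (F1) from true positives, false positives, false negatives."""
    precision = tp / (tp + fp)  # fraction of detected gestures that were correct
    recall = tp / (tp + fn)     # fraction of performed gestures that were detected
    return 2 * precision * recall / (precision + recall)

# Example: 99 correctly recognized gestures, 1 false detection, 1 miss
score = f_measure(tp=99, fp=1, fn=1)  # -> 0.99
```

A score of 0.99 here corresponds to the "up to 99%" figure quoted above, assuming equal weighting of precision and recall.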
This multimodal experience offers a variety of ways to interact with a car’s infotainment system via speech and touch-free gesture controls alongside real-time driver monitoring. It decreases the driver’s cognitive load and also provides a strong use case for emerging driverless-vehicle passenger controls aimed at a luxury lifestyle experience.
Seamless Integration into Existing DMS Camera or Infrared System
There are numerous ways to implement hand tracking and gesture recognition; computer vision is one of the most effective, because it can take advantage of the camera already embedded for the DMS.
These are usually Time of Flight (ToF) cameras, which with Clay AIR’s software recognize a driver’s gestures within a wide interaction zone of 25 to 45 centimeters from the car’s dashboard for high-performance results.
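Conceptually, a ToF camera returns a per-pixel distance, so restricting gesture detection to the 25–45 cm zone amounts to gating the depth data. This is a minimal sketch of that idea; the function name and frame layout are illustrative assumptions, not Clay AIR’s actual API:

```python
MIN_RANGE_M = 0.25  # 25 cm from the dashboard-mounted ToF camera
MAX_RANGE_M = 0.45  # 45 cm

def gate_interaction_zone(depth_row: list[float]) -> list[float]:
    """Zero out ToF depth samples outside the gesture interaction zone.

    depth_row: per-pixel distances in metres for one row of a depth frame.
    Returns a copy where only in-zone samples (hands near the dashboard)
    are kept; background and too-close readings are suppressed.
    """
    return [d if MIN_RANGE_M <= d <= MAX_RANGE_M else 0.0 for d in depth_row]

row = [0.30, 0.60, 0.10, 0.40]  # metres
gated = gate_interaction_zone(row)  # -> [0.30, 0.0, 0.0, 0.40]
```

Gating like this lets the downstream hand-tracking model ignore the driver’s body, the seats, and anything resting on the dashboard.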
Whether integrating Clay AIR’s hand tracking and gesture recognition into an existing camera or infrared system, no additional hardware is required, making it a cost-effective solution.
Enhancing The In-Car Experience
DMSs with multimodal controls will redefine driver safety and comfort standards, offering a reliable path to modernized rides with enhanced in-car gesture controls.
Clay AIR offers an easily customizable library of hand gestures that can be used, for example, to control and navigate the GPS, adjust the climate, access entertainment, and make and receive calls, enhancing a driver’s interactions with the infotainment system. Each finger and hand segment is tracked and interpreted at speeds faster than human perception.
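A customizable gesture library of this kind typically reduces to a dispatch table mapping recognized gesture labels to infotainment actions. The sketch below shows the pattern; the gesture names, class, and methods are hypothetical placeholders, not Clay AIR’s real SDK:

```python
from typing import Callable, Dict

class Infotainment:
    """Toy stand-in for a car's infotainment system."""
    def __init__(self) -> None:
        self.volume = 5
        self.track = 0

    def volume_up(self) -> None:
        self.volume = min(10, self.volume + 1)

    def next_track(self) -> None:
        self.track += 1

def build_gesture_map(system: Infotainment) -> Dict[str, Callable[[], None]]:
    # Each recognized gesture label maps to exactly one action,
    # so integrators can rebind gestures without touching the recognizer.
    return {
        "swipe_right": system.next_track,
        "rotate_clockwise": system.volume_up,
    }

hmi = Infotainment()
gestures = build_gesture_map(hmi)
gestures["swipe_right"]()       # skip to the next song
gestures["rotate_clockwise"]()  # nudge the volume up
```

Keeping recognition and actions decoupled like this is what makes the gesture library "easily customizable": swapping a binding is a one-line change in the table.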
The solution is optimized for automotive applications, ensuring low latency, low power consumption, and high accuracy.
Thanks to these advantages, Clay AIR partnered with the Renault-Nissan-Mitsubishi Alliance to integrate Clay DRIVE, a bespoke hand tracking and gesture recognition software, bringing custom in-air gesture controls that reduce the risk of crashing.
Get in touch
If you’d like to read more, send us an email at email@example.com to receive the case study.