
Continental gestures to a safer driving future

April 10, 2017 Read time: 2 mins
To improve non-verbal communication between drivers and their vehicles, Continental has devised a range of user-friendly touch gestures for the cockpit, using a combination of gesture interaction and touch screens. This enables drivers to draw specific, defined symbols on the input display to trigger a diverse array of functions and features for rapid access.

According to Dr Heinz Abel, head of Cross Product Solutions at Continental’s Instrumentation and Driver HMI business unit, the use of gestures and system control through haptic methods allows drivers to access controls and functions much faster than with conventional control concepts involving buttons and switches. But there is still potential for drivers to become distracted. By combining gesture interaction with touch screens, Continental believes it can significantly reduce levels of driver distraction compared with the standard touch-screen method.

Drivers can enable touch gesture interaction simply by touching the display with two fingers. They can then use two fingers to draw a heart symbol to access a favourite contact, or a house-roof symbol to tell the navigation system to drive home. By drawing a circle, the driver can turn on the air-conditioning system in their apartment. “To ensure that such concepts are accepted, it is important that the gestures used are intuitive and do not have to be specially learned. At the same time, it should be possible to draw the gestures without getting distracted from the task of driving and the gestures should be easy to remember. Current in-house user studies prove that we have succeeded on both counts,” says Dr Abel.
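The interaction model described here can be sketched in a few lines: two-finger input switches the system into gesture mode, and a recognised symbol is looked up in a table of shortcuts. The symbol names and action strings below are illustrative assumptions, not Continental’s actual software interface.

```python
# Hypothetical sketch of the gesture-shortcut concept described above.
# Gesture names and actions are assumptions for illustration only.

GESTURE_ACTIONS = {
    "heart": "call_favourite_contact",        # heart symbol -> favourite contact
    "house_roof": "navigate_home",            # roof symbol -> drive home
    "circle": "smart_home_aircon_on",         # circle -> apartment air-conditioning
}

def handle_touch(finger_count: int, symbol: str) -> str:
    """Gesture mode is only active for two-finger input; a single-finger
    touch is passed through to the ordinary touchscreen controls."""
    if finger_count != 2:
        return "standard_touch_input"
    return GESTURE_ACTIONS.get(symbol, "unrecognised_gesture")
```

A table-driven design like this also makes it easy to let users define their own favourites at the first menu level, as the article goes on to describe.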

A lab study conducted by Continental showed that two-finger gestures can reduce the length of time required to call up the desired features and functions by around one third. Another result was that, compared with one-finger touch gestures, two-finger touch gestures reduced the mental effort involved in operation to around one quarter.

Two-finger touch gestures can be drawn anywhere on the touch-sensitive surface of the input display, with drivers hardly having to avert their eyes from the road; this ensures intuitive and user-friendly operation. At the same time, this concept extends the conventional human–machine dialogue by allowing users to create favourites that can be accessed directly at the first menu level.
