Affectiva and Nuance to develop humanised automotive assistant

September 7, 2018

US companies Affectiva and Nuance Communications plan to jointly develop an automotive assistant that detects driver distraction and drowsiness and voices recommendations such as navigating to a coffee shop. The solution is intended to align its dialogue with a motorist’s emotional state, based on facial and verbal expressions.

The integrated solution will combine Affectiva Automotive AI with Nuance’s Dragon Drive platform.

Affectiva Automotive AI measures facial expressions, emotions such as anger and surprise, and verbal expressions in real time. It also displays icons indicating drowsiness cues, such as yawning, eye closure and blink rates, as well as physical or mental distraction.
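To illustrate how such signals could drive an assistant’s dialogue, the minimal Python sketch below blends hypothetical drowsiness indicators (eye closure, yawn rate, blink rate) and an anger score into a recommendation trigger. The class, weights and thresholds are illustrative assumptions and do not represent the actual Affectiva or Nuance APIs.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DriverState:
    # Hypothetical signals of the kind a camera-based system might report.
    eye_closure: float  # fraction of recent frames with eyes closed, 0-1
    yawn_rate: float    # yawns per minute
    blink_rate: float   # blinks per minute
    anger: float        # emotion score, 0-1

def drowsiness_score(state: DriverState) -> float:
    """Blend the drowsiness indicators into a single 0-1 score (weights are assumptions)."""
    return (0.5 * state.eye_closure
            + 0.3 * min(state.yawn_rate / 3.0, 1.0)
            + 0.2 * min(state.blink_rate / 30.0, 1.0))

def assistant_prompt(state: DriverState) -> Optional[str]:
    """Return a spoken recommendation when the driver's state warrants one."""
    if drowsiness_score(state) > 0.6:
        return "You seem tired. Shall I navigate to a nearby coffee shop?"
    if state.anger > 0.7:
        return "Traffic looks stressful. Would you like a calmer alternative route?"
    return None  # no intervention needed

if __name__ == "__main__":
    drowsy = DriverState(eye_closure=0.4, yawn_rate=4.0, blink_rate=28.0, anger=0.1)
    print(assistant_prompt(drowsy))  # prints the coffee-shop suggestion
```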

Through the partnership, Dragon Drive will enable the in-car assistant to interact with passengers based on emotion and cognitive state detection. It currently facilitates this interaction through gesture, touch, gaze detection and voice recognition powered by natural language understanding.

Stefan Ortmanns, executive vice president and general manager of Nuance Automotive, says these additional modes of interaction will help its OEM partners develop automotive assistants that can ensure the safety and efficiency of connected and autonomous cars.

In the future, the automotive assistant may also be able to take control of semi-autonomous vehicles if the driver displays signs of physical or mental distraction.
