
Affectiva and Nuance to offer assistance

December 6, 2018
US company Affectiva and Nuance Communications plan to develop a joint automotive assistant that detects driver distraction and drowsiness and voices recommendations, such as navigating to a coffee shop. The solution is intended to align its dialogue with a motorist’s emotional state, based on facial and verbal expressions.


The integrated solution will combine the Affectiva Automotive AI solution with US-based Nuance Communications’ Dragon Drive platform.

Affectiva Automotive AI measures facial and verbal expressions of emotions such as anger and surprise in real time. It also detects indicators of drowsiness, such as yawning, eye closure and blink rate, as well as physical and mental distraction.

Through the partnership, Dragon Drive will enable the in-car assistant to interact with passengers via emotional and cognitive state detection. It currently facilitates this interaction through gesture, touch, gaze detection and voice recognition powered by natural language understanding.

Stefan Ortmanns, executive vice president and general manager, Nuance Automotive, says these additional modes of interaction will help its OEM partners develop automotive assistants that improve the safety and efficiency of connected and autonomous cars.

In the future, the automotive assistant may also be able to take control of semi-autonomous vehicles if the driver displays signs of physical or mental distraction.
