Camera-based DMS to be chief enablers of safe, semi-autonomous driving, says research

October 21, 2016
ABI Research has identified camera-based driver monitoring systems (DMS) as the chief enablers of safe, semi-autonomous driving. The market is forecast to reach 17.5 million camera-based DMS shipments in 2026.

Biometric availability encompasses the driver's alertness, their engagement with the driving process, and even their ability to manually interact with the system as required. The key to enabling such a holistic driver monitoring system is the use of internal cameras, either stereoscopic or time-of-flight, to identify and track facial features, gaze direction and upper body position. ABI Research identifies a number of vision analytics companies active in this space, including EDGE3 Technologies, FotoNation, Jungo Connectivity and gestigon.

According to James Hodgson, industry analyst at ABI Research, a number of semi-autonomous system launches from OEMs such as Mercedes-Benz, Nissan and Tesla have highlighted the importance of a robust human-machine interface (HMI) in scenarios that require an automated system to work in tandem with a human driver. Leveraging camera-based DMS to give the host autonomous system a comprehensive understanding of the driver's biometric availability is the foundation of a safe, semi-autonomous HMI.

"In many ways, this represents a new cost burden to OEMs looking to deploy semi-autonomous systems," concludes Hodgson. "As gesture control and driver identification emerge as popular features to justify the additional cost of an autonomous system to end users, OEMs are exploring how these new features can be offered via the enabling hardware for next-generation DMS, in order to capitalise on this movement."
