
Artificial intelligence systems for autonomous driving on the rise, says IHS

June 17, 2016 Read time: 3 mins
According to the latest report from market research firm IHS, the Automotive Electronics Roadmap Report, the growing complexity and penetration of in-vehicle infotainment systems and advanced driver assistance systems (ADAS) is creating a need for hardware and software solutions that support artificial intelligence (AI), which uses electronics and software to emulate the functions of the human brain. IHS expects unit shipments of AI systems used in infotainment and ADAS applications to rise from just 7 million in 2015 to 122 million by 2025. The attach rate of AI-based systems in new vehicles was 8 percent in 2015, with the vast majority focused on speech recognition; it is forecast to reach 109 percent in 2025, as many cars will carry multiple AI systems of various types.
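An attach rate above 100 percent simply means that more AI systems are shipped than new vehicles are sold, because the average car carries more than one. The toy calculation below illustrates the arithmetic; the new-vehicle sales figure is an assumption chosen for illustration only and is not taken from the IHS report.

```python
# Illustrative arithmetic only: the vehicle-sales figure is a hypothetical
# assumption, not an IHS number. It shows how shipping 122 million AI systems
# into a smaller pool of new vehicles yields an attach rate above 100 percent.

ai_systems_shipped = 122_000_000   # AI systems forecast for 2025 (from the report)
new_vehicles_sold = 112_000_000    # hypothetical new-vehicle sales in 2025

attach_rate = ai_systems_shipped / new_vehicles_sold * 100
print(f"Attach rate: {attach_rate:.0f}%")  # >100% when cars carry multiple AI systems
```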

According to the report, AI-based systems are relatively rare in automotive applications today, but they will grow to become standard in new vehicles over the next five years, especially in two areas:

  • Infotainment human-machine interface, including speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistance and natural language interfaces;
  • ADAS and autonomous vehicles, including camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion engine control units (ECUs).
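As an illustration of what a sensor fusion ECU does, the minimal sketch below combines a camera range estimate and a radar range estimate by inverse-variance weighting. The sensor readings and variances are invented for the example and do not describe any particular production system.

```python
# A minimal, illustrative sketch of sensor fusion in an ADAS ECU: two noisy
# range estimates (camera and radar) are combined by inverse-variance
# weighting, so the more confident sensor dominates the fused result.

def fuse_range(camera_range_m, camera_var, radar_range_m, radar_var):
    """Fuse two noisy range estimates into one, weighting by confidence."""
    w_cam = 1.0 / camera_var
    w_rad = 1.0 / radar_var
    fused = (w_cam * camera_range_m + w_rad * radar_range_m) / (w_cam + w_rad)
    fused_var = 1.0 / (w_cam + w_rad)
    return fused, fused_var

# Radar typically measures range more precisely than a camera, so its lower
# variance gives it more weight here. All numbers are hypothetical.
distance, variance = fuse_range(camera_range_m=42.0, camera_var=4.0,
                                radar_range_m=40.5, radar_var=0.25)
print(f"Fused range: {distance:.1f} m (variance {variance:.2f})")
```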

Specifically in ADAS, deep learning, which mimics human neural networks, presents several advantages over traditional algorithms and is a key milestone on the road to fully autonomous vehicles. For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and reduces the development time of ADAS systems.
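As a rough illustration of camera-based object classification with deep learning, the sketch below (assuming PyTorch is available) runs a tiny convolutional network over a dummy camera crop. The class labels, input size and layer sizes are illustrative assumptions, not the architecture of any production ADAS system.

```python
# A toy convolutional classifier of the kind used for camera-based object
# classification in ADAS. Labels, input size and layers are assumptions.
import torch
import torch.nn as nn

CLASSES = ["pedestrian", "vehicle", "cyclist", "background"]  # hypothetical labels

class TinyAdasClassifier(nn.Module):
    def __init__(self, num_classes=len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyAdasClassifier().eval()
frame_crop = torch.rand(1, 3, 64, 64)             # dummy 64x64 RGB camera crop
with torch.no_grad():
    probs = model(frame_crop).softmax(dim=1)
print(CLASSES[probs.argmax(dim=1).item()])        # predicted (untrained) class
```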

The hardware required to embed AI and deep learning in safety-critical, high-performance automotive applications is not yet available at mass-production volumes, due to the high cost and sheer size of the computers needed to perform these advanced tasks. Even so, elements of AI are already available in vehicles today. In the infotainment human-machine interfaces currently installed, most speech recognition technologies already rely on neural network algorithms running in the cloud. The 2015 BMW 7 Series is the first car to use a hybrid approach, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla claims to implement neural network functionality, based on the Mobileye EyeQ3 processor, in its autonomous driving control unit.
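The hybrid speech-recognition approach described above amounts to a simple fallback pattern: prefer the cloud recogniser when connectivity is available and drop back to an embedded engine when it is not. The sketch below illustrates the idea; both recogniser functions and the connectivity failure are hypothetical placeholders, not a real automotive API.

```python
# A minimal sketch of the hybrid speech-recognition pattern: cloud first,
# embedded fallback. Both recognisers are hypothetical stand-ins.

def cloud_recognise(audio: bytes) -> str:
    """Placeholder for a neural-network ASR service running in the cloud."""
    raise ConnectionError("no wireless connectivity")

def embedded_recognise(audio: bytes) -> str:
    """Placeholder for a smaller ASR model running on in-vehicle hardware."""
    return "navigate home"

def recognise_command(audio: bytes) -> str:
    try:
        return cloud_recognise(audio)       # higher accuracy, needs connectivity
    except ConnectionError:
        return embedded_recognise(audio)    # degraded but always available

print(recognise_command(b"dummy audio"))    # -> "navigate home"
```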
