
Artificial intelligence systems for autonomous driving on the rise, says IHS

June 17, 2016 Read time: 3 mins
According to the latest report from market research firm IHS, the Automotive Electronics Roadmap Report, as the complexity and penetration of in-vehicle infotainment systems and advanced driver assistance systems (ADAS) increase, there is a growing need for hardware and software solutions that support artificial intelligence (AI), which uses electronics and software to emulate the functions of the human brain. In fact, unit shipments of AI systems used in infotainment and ADAS applications are expected to rise from just 7 million in 2015 to 122 million by 2025, says IHS. The attach rate of AI-based systems in new vehicles was 8 percent in 2015, with the vast majority focused on speech recognition. However, that figure is forecast to reach 109 percent in 2025, as many cars will carry multiple AI systems of various types.

According to the report, AI-based systems in automotive applications are currently relatively rare, but they will grow to become standard in new vehicles over the next five years, especially in two areas:

  • Infotainment human-machine interfaces, including speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistance and natural language interfaces;
  • ADAS and autonomous vehicles, including camera-based machine vision systems, radar-based detection units, driver condition evaluation and sensor fusion engine control units (ECUs).

Specifically in ADAS, deep learning, which mimics human neural networks, presents several advantages over traditional algorithms and is a key milestone on the road to fully autonomous vehicles. For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and reduces the development time of ADAS systems.
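To make the object classification point concrete, the sketch below shows roughly what a deep-learning classifier of the kind described above looks like in code. It is a minimal, illustrative example only: the class labels, input size and network layout are assumptions chosen for brevity, not details taken from the IHS report or from any production ADAS system.

```python
# Illustrative sketch only: a tiny convolutional classifier of the sort used
# for camera-based object classification in ADAS. Labels, image size and
# architecture are hypothetical, not from the IHS report or a real product.
import torch
import torch.nn as nn

CLASSES = ["vehicle", "pedestrian", "cyclist", "traffic_sign"]  # hypothetical labels

class TinyAdasClassifier(nn.Module):
    """Maps a 64x64 RGB camera crop to one of the object classes."""
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# Usage: classify a single crop (random data stands in for a camera frame).
model = TinyAdasClassifier().eval()
with torch.no_grad():
    crop = torch.rand(1, 3, 64, 64)
    scores = model(crop)
    print(CLASSES[scores.argmax(dim=1).item()])
```

In a real ADAS stack the crop would come from a detection stage and the network would be far larger and trained on labelled driving data; the point of the sketch is simply that the "neural network" referred to above is a learned function mapping pixels to object classes.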

The hardware required to embed AI and deep learning in safety-critical, high-performance automotive applications at mass-production volumes is not yet available, owing to the high cost and sheer size of the computers needed to perform these advanced tasks. Even so, elements of AI are already present in vehicles today. In the infotainment human-machine interfaces currently installed, most speech recognition technologies already rely on neural network-based algorithms running in the cloud. The 2015 BMW 7 Series is the first car to use a hybrid approach, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla claims to implement neural network functionality, based on the Mobileye EyeQ3 processor, in its autonomous driving control unit.
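The hybrid arrangement described above (cloud-based recognition when a data link exists, an embedded engine otherwise) can be summarised in a short sketch. Every name here is a hypothetical placeholder rather than the API of BMW's system or of any real speech engine.

```python
# Illustrative sketch only: route an utterance to a cloud recogniser when
# connectivity is available, otherwise fall back to an on-board engine.
# All function names are hypothetical placeholders.
from typing import Callable

def recognise(audio: bytes,
              has_connectivity: Callable[[], bool],
              cloud_engine: Callable[[bytes], str],
              embedded_engine: Callable[[bytes], str]) -> str:
    """Return a transcript, preferring the cloud engine when online."""
    if has_connectivity():
        try:
            return cloud_engine(audio)   # higher accuracy, needs a data link
        except ConnectionError:
            pass                         # link dropped mid-request: fall back
    return embedded_engine(audio)        # smaller model, always available

# Usage with stand-in engines: no connectivity, so the embedded path is used.
print(recognise(b"...", lambda: False,
                cloud_engine=lambda a: "cloud transcript",
                embedded_engine=lambda a: "embedded transcript"))
```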

Related Content

  • Indra drones to manage road traffic in Spain
    October 14, 2019
    Indra is to use drones to monitor road traffic and detect incidents in Lugo, Spain. The company plans to employ the drones as sensors for current transportation monitoring systems and integrate them into its transportation control solution Mova Traffic. It will also develop tools to analyse video and images taken by drones in a bid to detect incidents automatically. Additionally, the company will incorporate its drones with a transportation control centre, which will process real-time image and video…
  • Driver aids make inroads on improving safety
    November 12, 2015
    In-vehicle anti-collision systems continue to evolve and could eliminate some incidents altogether. John Kendall rounds up the current developments. A few weeks ago, I watched a driver reverse a car from a parking bay at right angles to the road, straight into a car driving along the road. The accident happened at walking pace, no-one was hurt and both cars had body panels that regain their shape after a low speed shunt.
  • Vector and Baselabs partner on ADAS
    October 8, 2014
    Vector and Baselabs have formed a partnership aimed at jointly creating products and services for the development of advanced driver assistance systems (ADAS) and automated driving. Vector will focus on special software tools for the development of ADAS and automated vehicles, while Baselabs will concentrate on software for data fusion in multiple-sensor scenarios, including the necessary algorithms. Baselabs will also provide application support in the development of ADAS and automated vehicles…
  • Don’t drive drunk – or use a hands-free phone
    August 29, 2019
    Despite law changes, drivers’ bad habits have been creeping back in. TRL’s Dr Shaun Helman tells Adam Hill why using a phone at the wheel is just as distracting as driving after a few drinks. Research from as far back as 2002 (see box) suggests that driving while making a phone call – either hands-free or holding a handset to your ear – creates the same amount of distraction as being drunk behind the wheel. While it is notoriously hard to predict how alcohol will affect an individual (due to the speed of…