
Artificial intelligence systems for autonomous driving on the rise, says IHS

June 17, 2016 Read time: 3 mins
According to the Automotive Electronics Roadmap Report, the latest report from market research firm IHS, as the complexity and penetration of in-vehicle infotainment systems and advanced driver assistance systems (ADAS) increase, there is a growing need for hardware and software solutions that support artificial intelligence (AI), which uses electronics and software to emulate the functions of the human brain. In fact, unit shipments of AI systems used in infotainment and ADAS systems are expected to rise from just 7 million in 2015 to 122 million by 2025, says IHS. The attach rate of AI-based systems in new vehicles was 8 percent in 2015, with the vast majority focused on speech recognition. However, that figure is forecast to rise to 109 percent in 2025, as many cars will carry multiple AI systems of various types.

According to the report, AI-based systems are still relatively rare in automotive applications, but they will become standard in new vehicles over the next five years, especially in two areas:

  • Infotainment human-machine interfaces, including speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistance and natural-language interfaces;
  • ADAS and autonomous vehicles, including camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion engine control units (ECUs).

Specifically in ADAS, deep learning -- which mimics the neural networks of the human brain -- presents several advantages over traditional algorithms, and it is a key milestone on the road to fully autonomous vehicles. For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and will reduce the development time of ADAS.
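As a rough illustration of the classification idea behind such systems, the sketch below shows the basic building block of a neural network: a weighted sum of input features passed through a sigmoid activation. The feature names, weights and threshold here are purely hypothetical for demonstration; real ADAS perception stacks use deep networks with millions of learned parameters, not hand-picked values.

```python
import math

def sigmoid(x):
    """Squash a real number into (0, 1), interpreted as a confidence score."""
    return 1.0 / (1.0 + math.exp(-x))

def classify(features, weights, bias):
    # One artificial neuron: weighted sum of inputs plus a bias,
    # followed by a non-linear activation.
    activation = sum(f * w for f, w in zip(features, weights)) + bias
    return sigmoid(activation)

# Hypothetical, hand-picked parameters for a toy "obstacle vs. clear road"
# decision from two illustrative sensor features (apparent size, motion).
weights = [2.0, 1.5]
bias = -1.0

obstacle_score = classify([1.0, 0.8], weights, bias)  # large, moving object
clear_score = classify([0.0, 0.0], weights, bias)     # nothing detected
```

In a trained network, the weights and bias would be learned from labelled sensor data rather than chosen by hand, and many such neurons would be stacked in layers to recognise multiple object classes at once.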

The hardware required to embed AI and deep learning in safety-critical, high-performance automotive applications at mass-production volume is not currently available, due to the high cost and sheer size of the computers needed to perform these advanced tasks. Even so, elements of AI are already available in vehicles today. In the infotainment human-machine interfaces currently installed, most speech recognition technologies already rely on neural-network-based algorithms running in the cloud. The 2015 BMW 7 Series is the first car to use a hybrid approach, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla claims to implement neural network functionality, based on the Mobileye EyeQ3 processor, in its autonomous driving control unit.
