Artificial intelligence systems for autonomous driving on the rise, says IHS

June 17, 2016 Read time: 3 mins
According to the Automotive Electronics Roadmap Report, the latest report from market research firm IHS, as the complexity and penetration of in-vehicle infotainment systems and advanced driver assistance systems (ADAS) increase, there is a growing need for hardware and software solutions that support artificial intelligence (AI), which uses electronics and software to emulate the functions of the human brain. In fact, unit shipments of AI systems used in infotainment and ADAS applications are expected to rise from just 7 million in 2015 to 122 million by 2025, says IHS. The attach rate of AI-based systems in new vehicles was 8 percent in 2015, with the vast majority focused on speech recognition. However, that figure is forecast to rise to 109 percent in 2025, as many cars will carry multiple AI systems of various types.

According to the report, AI-based systems are still relatively rare in automotive applications, but they will grow to become standard in new vehicles over the next five years, especially in:

  • Infotainment human-machine interfaces, including speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistance and natural language interfaces;
  • ADAS and autonomous vehicles, including camera-based machine vision systems, radar-based detection units, driver condition evaluation and sensor fusion engine control units (ECUs).

Specifically in ADAS, deep learning -- which loosely mimics the human brain's neural networks -- presents several advantages over traditional algorithms and is a key milestone on the road to fully autonomous vehicles. For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and will reduce ADAS development time.
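To give a rough sense of the "mimics neural networks" idea behind object classification, the sketch below runs a single forward pass of a tiny fully connected network that scores an input feature vector against a few object classes. The class names, layer sizes, weights and feature values are all invented for illustration; a production ADAS network would be vastly larger and trained on real sensor data.

```python
import math

# Toy object classes a perception stack might distinguish
# (all names and weights below are invented for illustration).
CLASSES = ["pedestrian", "vehicle", "cyclist"]

def forward(features, w_hidden, w_out):
    """One forward pass: input -> hidden layer (tanh) -> class probabilities (softmax)."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features)))
              for row in w_hidden]
    logits = [sum(w * h for w, h in zip(row, hidden)) for row in w_out]
    exps = [math.exp(l - max(logits)) for l in logits]  # numerically stable softmax
    total = sum(exps)
    return [e / total for e in exps]

# Invented example: 4 input features, 3 hidden units, 3 classes.
w_hidden = [[0.5, -0.2, 0.1, 0.4],
            [-0.3, 0.8, 0.05, -0.1],
            [0.2, 0.1, -0.6, 0.3]]
w_out = [[1.0, -0.5, 0.2],
         [-0.4, 0.9, 0.1],
         [0.3, 0.2, -0.7]]

probs = forward([0.9, 0.1, 0.4, 0.7], w_hidden, w_out)
prediction = CLASSES[probs.index(max(probs))]
```

The weights here are fixed by hand; in deep learning they would be learned from labelled examples, which is what lets one network handle multiple object types at once.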

The hardware required to embed AI and deep learning in safety-critical and high-performance automotive applications at mass-production volume is not currently available, due to the high cost and the sheer size of the computers needed to perform these advanced tasks. Even so, elements of AI are already available in vehicles today. In the infotainment human-machine interfaces currently installed, most speech recognition technologies already rely on neural network-based algorithms running in the cloud. The 2015 BMW 7 Series was the first car to use a hybrid approach, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla claims to implement neural network functionality, based on the Mobileye EyeQ3 processor, in its autonomous driving control unit.
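The hybrid approach mentioned above follows a simple fallback pattern: prefer the cloud recognizer, and degrade to the embedded engine when connectivity is unavailable. The sketch below illustrates that pattern only; both recognizer functions are hypothetical stand-ins, not any vendor's actual API.

```python
# Sketch of the hybrid speech-recognition pattern: cloud first,
# embedded fallback. All function names are hypothetical.

def cloud_recognize(audio):
    """Stand-in for a cloud speech API call; fails without a network."""
    raise ConnectionError("no wireless connectivity")

def embedded_recognize(audio):
    """Stand-in for the on-board (embedded) recognizer."""
    return "navigate home"

def recognize(audio, online=True):
    if online:
        try:
            return cloud_recognize(audio)
        except ConnectionError:
            pass  # degrade gracefully to the embedded engine
    return embedded_recognize(audio)

result = recognize(b"...", online=True)  # cloud fails here, so -> "navigate home"
```

The design point is that the caller sees one `recognize` interface regardless of connectivity, which is what makes the embedded hardware useful as a transparent fallback.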

Related Content

  • Electronic toll collection: Change is in the air
    November 7, 2024
    Trends in technology plus users’ comfort in adopting new advances indicate that the environment for a new electronic toll collection architecture is evolving. Hal Worrall considers what this might look like
  • Caltrans trials Xerox’s Passenger Detection System
    October 30, 2015
    Xerox’s Passenger Detection System has been trialled in California and compared with the state’s team of human counters giving some interesting results, as Colin Sowman discovers. Like others adopting high-occupancy and high-occupancy vehicle (HOV) lanes for congestion management, Caltrans has faced challenges with compliance in what has been effectively an ‘honour system’ with drivers trusted to set their tags correctly or comply with the multi-passenger requirement.
  • IR’s invisible benefit for traffic surveillance and enforcement
    June 30, 2016
    Advances in vision technology are enhancing traffic surveillance and enforcement applications. Variable lighting conditions have long been a stumbling block for vision technology applications in the transport sector. With applications such as ANPR, the read-rate may vary between daylight and night and can be adversely affected by glare and low sun. Madrid, Spain-based Lector Vision had these considerations in mind when designing its Traffic Eye ANPR system, which combines off-the-shelf and custom hardware
  • Renesas single chip solution for ADAS
    December 19, 2014
    Renesas Electronics’ R-Car V2H is the company’s newest system-on-a-chip (SoC), implementing image recognition technology to support high-resolution surround viewing utilised in advanced driver assistance systems (ADAS). The R-Car V2H enables embedded system manufacturers to deliver high-resolution surround-view monitoring systems, with multiple cameras, for advanced point-of-view switching.