Artificial intelligence systems for autonomous driving on the rise, says IHS

June 17, 2016
According to Automotive Electronics Roadmap Report, the latest report from market research firm IHS, as the complexity and penetration of in-vehicle infotainment systems and advanced driver assistance systems (ADAS) increase, there is a growing need for hardware and software solutions that support artificial intelligence (AI), which uses electronics and software to emulate the functions of the human brain. In fact, unit shipments of AI systems used in infotainment and ADAS systems are expected to rise from just 7 million in 2015 to 122 million by 2025, says IHS. The attach rate of AI-based systems in new vehicles was 8 percent in 2015, with the vast majority focused on speech recognition. However, that number is forecast to rise to 109 percent in 2025, as many cars will carry multiple AI systems of various types.
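For context, the cited figures imply a steep compound annual growth rate (CAGR); a quick back-of-the-envelope check, using only the 7 million and 122 million unit figures from the report:

```python
# Check of the growth figures cited by IHS: 7 million units in 2015
# rising to 122 million by 2025 implies a compound annual growth rate
# (CAGR) of roughly 33 percent over the ten-year span.
start, end, years = 7_000_000, 122_000_000, 10
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR: {cagr:.1%}")  # -> implied CAGR: 33.1%
```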

According to the report, AI-based systems are still relatively rare in automotive applications, but they will become standard in new vehicles over the next five years, especially in two areas:

  • Infotainment human-machine interfaces, including speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistance and natural language interfaces;
  • ADAS and autonomous vehicles, including camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion engine control units (ECUs).

Specifically in ADAS, deep learning -- which mimics human neural networks -- presents several advantages over traditional algorithms; it is also a key milestone on the road to fully autonomous vehicles. For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and will reduce development time of ADAS systems.
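To make the idea concrete, the sketch below shows the core mechanism behind such classifiers: a tiny feedforward neural network mapping sensor features to object-class probabilities. This is purely illustrative, with random weights and hypothetical class labels; production ADAS systems use deep convolutional networks trained on large labelled datasets.

```python
import numpy as np

# Illustrative only: a minimal feedforward network showing the idea
# behind deep-learning object classifiers in ADAS. Weights are random
# here; training would normally set them by gradient descent.
rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical input: a 4-value feature vector from a perception sensor.
features = np.array([0.2, 0.8, 0.5, 0.1])

# Two layers of weights and biases (randomly initialised for the sketch).
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)

hidden = relu(W1 @ features + b1)   # learned feature transformation
scores = softmax(W2 @ hidden + b2)  # probabilities over 3 classes

classes = ["pedestrian", "vehicle", "cyclist"]  # hypothetical labels
print(classes[int(scores.argmax())], scores)
```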

The hardware required to embed AI and deep learning in safety-critical, high-performance automotive applications at mass-production volume is not currently available, due to the high cost and sheer size of the computers needed to perform these advanced tasks. Even so, elements of AI are already available in vehicles today. In the infotainment human-machine interfaces currently installed, most speech recognition technologies already rely on neural network algorithms running in the cloud. The 2015 BMW 7 Series is the first car to use a hybrid approach, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla claims to implement neural network functionality, based on the Mobileye EyeQ3 processor, in its autonomous driving control unit.
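The hybrid approach described above follows a simple fallback pattern: prefer the cloud recogniser when connected, drop back to the embedded one when not. A minimal sketch, with entirely hypothetical function names (this is not BMW's or any vendor's API):

```python
# Sketch of the hybrid cloud/embedded pattern the article describes.
# All names are hypothetical placeholders.

def cloud_recognize(audio: bytes) -> str:
    # Stand-in for a network-based speech recogniser; here it
    # simulates the no-connectivity case.
    raise ConnectionError("no wireless connectivity")

def embedded_recognize(audio: bytes) -> str:
    # Stand-in for an on-board (offline) recogniser.
    return "navigate home"

def recognize(audio: bytes) -> str:
    try:
        return cloud_recognize(audio)      # higher accuracy when online
    except ConnectionError:
        return embedded_recognize(audio)   # offline fallback

print(recognize(b"...pcm samples..."))  # -> navigate home
```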

Related Content

  • How intelligent video security supports smart mobility
    April 1, 2021
    The ease of getting from point A to point B, the effective movement of goods and services, and the flexibility and integration of various modes of transportation are key aspects of mobility today. Jayden Xu, senior manager, ITS Solutions, at Hikvision, details the role of intelligent video security technology for improved traffic management
  • Virtual cockpit in cars ‘edges closer to reality’
    September 3, 2015
    New analysis from Frost & Sullivan, Rise of Virtual Cockpits in Cars finds that the instrument cluster (IC) market in North America and Europe is expected to clock a compound annual growth rate (CAGR) of 2.2 percent from 2014 to 2021, with digital IC expected to reach a CAGR of approx. 26 percent by 2021. While the virtual cockpit will be limited to premium-segment vehicles, fully digital clusters that will be standard in about 20 percent of cars will also be offered as an option on medium-segment cars.
  • Multi-technology sensor fusion comes into focus
    July 30, 2025

    The future of intelligent transportation is arriving at this year’s Congress with the launch of Sensys Networks’ MultiSens Intersection. This is a practical advancement in traffic detection that merges video AI and wireless sensors into a single, unified platform for signal actuation and traffic analysis. Sensys Networks’ system delivers true sensor fusion, offering detailed insights into traffic flow.

  • TRW launches camera technology in the US
    April 15, 2013
    To support its growing video camera business worldwide, TRW Automotive Holdings has launched production of camera systems in the US and is investing in its electronics facility in Illinois. TRW's scalable video camera (S-Cam) family incorporates technology from Mobileye and can provide a range of safety functions including lane departure warning, forward collision warning, headlight control, traffic sign recognition and pedestrian detection. When integrated with vehicle chassis systems, the S-Cam can provide…