
Artificial intelligence systems for autonomous driving on the rise, says IHS

June 17, 2016 Read time: 3 mins
According to the latest report from market research firm IHS, the Automotive Electronics Roadmap Report, as the complexity and penetration of in-vehicle infotainment systems and advanced driver assistance systems (ADAS) increase, there is a growing need for hardware and software solutions that support artificial intelligence, which uses electronics and software to emulate the functions of the human brain. In fact, unit shipments of artificial intelligence (AI) systems used in infotainment and ADAS systems are expected to rise from just 7 million in 2015 to 122 million by 2025, says IHS. The attach rate of AI-based systems in new vehicles was 8 percent in 2015, and the vast majority were focused on speech recognition. However, that figure is forecast to rise to 109 percent in 2025, as many cars will carry multiple AI systems of various types.
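
Those headline figures imply a steep growth curve. The short calculation below is purely illustrative and not taken from the IHS report; it shows the compound annual growth rate implied by the unit-shipment forecast, and why an attach rate can exceed 100 percent.

```python
# Illustrative back-of-the-envelope check of the forecast figures quoted above
# (7 million units in 2015, 122 million by 2025). The growth rate is derived
# here for illustration only; it is not stated in the IHS report.

units_2015 = 7_000_000
units_2025 = 122_000_000
years = 2025 - 2015

cagr = (units_2025 / units_2015) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")  # roughly 33%

# An attach rate above 100 percent simply means more than one AI system per
# new vehicle on average, e.g. 109 systems for every 100 cars sold.
attach_rate_2025 = 1.09
print(f"Average AI systems per new vehicle in 2025: {attach_rate_2025:.2f}")
```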

According to the report, AI-based systems are still relatively rare in automotive applications, but they will grow to become standard in new vehicles over the next five years, especially in two areas:

  • Infotainment human-machine interfaces, including speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistants and natural language interfaces;
  • ADAS and autonomous vehicles, including camera-based machine vision systems, radar-based detection units, driver condition evaluation and sensor fusion engine control units (ECUs).

Specifically in ADAS, deep learning -- which mimics human neural networks -- presents several advantages over traditional algorithms; it is also a key milestone on the road to fully autonomous vehicles. For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and will reduce development time of ADAS systems.
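To make the camera-based use case above concrete, the sketch below shows the general shape of a deep-learning object classifier of the kind described. PyTorch and the four-class label set are assumptions chosen for illustration; this is not taken from the IHS report and is not a production ADAS system.

```python
# A minimal sketch of a camera-based object classifier for ADAS: a small
# convolutional network that maps a 64x64 road-scene image crop to one of a
# few ADAS-relevant classes. PyTorch is assumed purely for illustration.
import torch
import torch.nn as nn

CLASSES = ["car", "pedestrian", "cyclist", "traffic_sign"]  # hypothetical label set

class TinyAdasClassifier(nn.Module):
    def __init__(self, num_classes: int = len(CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                               # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                               # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# Example forward pass on a dummy RGB crop (untrained weights, so the
# predicted class is meaningless; this only demonstrates the data flow).
model = TinyAdasClassifier()
logits = model(torch.randn(1, 3, 64, 64))
print(CLASSES[logits.argmax(dim=1).item()])
```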

The hardware required to embed AI and deep learning in safety-critical and high-performance automotive applications at mass-production volume is not currently available, due to the high cost and the sheer size of the computers needed to perform these advanced tasks. Even so, elements of AI are already available in vehicles today. In the infotainment human-machine interfaces currently installed, most of the speech recognition technologies already rely on algorithms based on neural networks running in the cloud. The 2015 BMW 7 Series is the first car to use a hybrid approach, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla claims to implement neural network functionality, based on the Mobileye EyeQ3 processor, in its autonomous driving control unit.
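
The hybrid approach described for the 2015 BMW 7 Series can be pictured as a simple routing decision between a cloud-based recogniser and an embedded fallback. The sketch below illustrates that pattern only; the function names and stubs are hypothetical and do not reflect BMW's actual implementation.

```python
# Schematic sketch of hybrid speech recognition: prefer a cloud-based
# recogniser when connectivity is available, otherwise fall back to an
# embedded one. All names here are hypothetical illustrations.
from typing import Callable

def recognise_speech(
    audio: bytes,
    has_connectivity: Callable[[], bool],
    cloud_asr: Callable[[bytes], str],
    embedded_asr: Callable[[bytes], str],
) -> str:
    """Route an utterance to the cloud or the embedded recogniser."""
    if has_connectivity():
        try:
            return cloud_asr(audio)      # neural-network ASR running in the cloud
        except ConnectionError:
            pass                         # connection dropped mid-request
    return embedded_asr(audio)           # on-board fallback, no network needed

# Usage example with stub recognisers and no connectivity.
text = recognise_speech(
    b"\x00\x01",
    has_connectivity=lambda: False,
    cloud_asr=lambda a: "navigate home (cloud)",
    embedded_asr=lambda a: "navigate home (embedded)",
)
print(text)  # -> "navigate home (embedded)"
```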

Related Content

  • Targeted roadside advertising project uses deep learning to analyse traffic volumes
    June 22, 2016
    A targeted roadside advertising project for digital signage using big data and deep learning, launched in Tokyo, Japan, by US smart data storage company Cloudian, will focus on vehicle recognition and the ability to present display ads relevant to each vehicle's make and model. Together with Dentsu, Smart Insight Corporation and QCT (Quanta Cloud Technology) Japan, and with support from Intel Japan, the project will conduct, at its first stage, deep learning analysis – artificial intelligence (AI) for recog
  • Global V2V penetration in new cars to reach 69 per cent by 2027
    November 21, 2013
    The latest analysis by ABI Research expects global V2V penetration in new cars to increase from 10.9 per cent in 2018 to 69 per cent in 2027. ABI Research vice-president and practice director Dominique Bonte comments: “Huge interest in autonomous driving across the automotive ecosystem firmly positions V2X technology and applications as a key component of driverless car systems. However, some OEMs are claiming some forms of (semi)-autonomous driving can be achieved by just using in-vehicle ADAS sensors.
  • Hikvision showcases AI Check-Point cameras
    March 21, 2018
    Hikvision is presenting a check-point camera that aims to bring artificial intelligence (AI) to critical infrastructure support at Intertraffic. The platform uses automatic number plate recognition, classification and automotive dead reckoning to detect and track criminals and identify unlicensed or uninsured drivers.
  • Nissan using anthropologist to develop proPILOT autonomous vehicle
    August 17, 2016
    Nissan is using an array of technical talent to develop its next generation autonomous vehicle, including automobile and software engineers, experts on sensor technology and artificial intelligence, computer scientists, production specialists and an anthropologist. Melissa Cefkin, principal scientist and design anthropologist at the Nissan Research Center in Silicon Valley, is playing a key role in the project, analysing human driving interactions to ensure that it is prepared to be a ‘good citizen’ on the ro