
Artificial intelligence systems for autonomous driving on the rise, says IHS

June 17, 2016
According to the latest report from market research firm IHS, the Automotive Electronics Roadmap Report, as the complexity and penetration of in-vehicle infotainment systems and advanced driver assistance systems (ADAS) increase, there is a growing need for hardware and software solutions that support artificial intelligence, which uses electronics and software to emulate the functions of the human brain. In fact, unit shipments of artificial intelligence (AI) systems used in infotainment and ADAS systems are expected to rise from just 7 million in 2015 to 122 million by 2025, says IHS. The attach rate of AI-based systems in new vehicles was 8 percent in 2015, and the vast majority were focused on speech recognition. However, that figure is forecast to rise to 109 percent in 2025, as many cars will carry multiple AI systems of various types.
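An attach rate above 100 percent can look odd at first glance: it simply means more AI systems will ship than new vehicles, because many cars will carry several. The short calculation below illustrates the arithmetic using the report's 122 million shipment forecast; the new-vehicle sales figure is a hypothetical number chosen for illustration, not one taken from the IHS report.

    # Attach rate = AI systems shipped / new vehicles sold, as a percentage.
    # 122 million comes from the IHS forecast; the vehicle sales figure below
    # is a hypothetical illustration, not a number from the report.
    ai_systems_shipped_2025 = 122_000_000
    new_vehicles_sold_2025 = 112_000_000  # assumed for illustration

    attach_rate = 100 * ai_systems_shipped_2025 / new_vehicles_sold_2025
    print(f"2025 attach rate: {attach_rate:.0f} percent")  # roughly 109 percent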

According to the report, AI-based systems are still relatively rare in automotive applications, but they will grow to become standard in new vehicles over the next five years, especially in:

  • Infotainment human-machine interfaces, including speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistance and natural-language interfaces;
  • ADAS and autonomous vehicles, including camera-based machine vision systems, radar-based detection units, driver condition evaluation, and sensor fusion engine control units (ECUs).

Specifically in ADAS, deep learning, which mimics the neural networks of the human brain, presents several advantages over traditional algorithms; it is also a key milestone on the road to fully autonomous vehicles. For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and reduces the development time of ADAS systems.
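To make the camera-based machine vision use case more concrete, the sketch below runs a pretrained deep-learning detector over a single dashcam frame and keeps only confident detections. It is one way such a pipeline might look, not the approach described in the IHS report; the torchvision model, the frame file name and the 0.7 confidence threshold are illustrative assumptions.

    # Minimal sketch: multi-object detection on one camera frame with a
    # pretrained deep-learning model (a stand-in for a production ADAS network).
    import torch
    from torchvision.io import read_image
    from torchvision.models.detection import fasterrcnn_resnet50_fpn
    from torchvision.transforms.functional import convert_image_dtype

    model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

    # "dashcam_frame.jpg" is a hypothetical input image.
    frame = convert_image_dtype(read_image("dashcam_frame.jpg"), torch.float)
    with torch.no_grad():
        detections = model([frame])[0]

    # Report only the objects the network is reasonably confident about.
    for box, label, score in zip(detections["boxes"], detections["labels"], detections["scores"]):
        if score > 0.7:
            print(f"class {int(label)} at {[round(v, 1) for v in box.tolist()]} (score {score:.2f})")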

The hardware required to embed AI and deep learning in safety-critical, high-performance automotive applications at mass-production volumes is not yet available, owing to the high cost and sheer size of the computers needed to perform these advanced tasks. Even so, elements of AI are already available in vehicles today. In the infotainment human-machine interfaces currently installed, most speech recognition technologies already rely on neural-network-based algorithms running in the cloud. The 2015 BMW 7 Series is the first car to use a hybrid approach, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla claims to implement neural network functionality, based on the Mobileye EyeQ3 processor, in its autonomous driving control unit.
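The hybrid approach amounts to a simple routing decision: use a large cloud-hosted recogniser when a data connection is available and fall back to a compact on-board model when it is not. The sketch below shows only that control flow; every function in it is a hypothetical placeholder rather than the actual BMW or supplier API.

    # Hybrid speech recognition sketch: prefer the cloud recogniser when a
    # connection exists, otherwise fall back to the embedded on-board model.
    # All functions are hypothetical placeholders, not any vendor's real API.

    def cloud_recognize(audio: bytes) -> str:
        return "<transcript from large cloud-hosted model>"   # placeholder

    def embedded_recognize(audio: bytes) -> str:
        return "<transcript from compact on-board model>"     # placeholder

    def recognize_speech(audio: bytes, connected: bool) -> str:
        if connected:
            try:
                return cloud_recognize(audio)    # best accuracy when online
            except TimeoutError:
                pass                             # degraded link: fall through
        return embedded_recognize(audio)         # works without connectivity

    # Example: no wireless connectivity, so the embedded recogniser answers.
    print(recognize_speech(b"\x00" * 16000, connected=False))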
