Artificial intelligence systems for autonomous driving on the rise, says IHS

June 17, 2016 | Read time: 3 mins
According to the latest Automotive Electronics Roadmap Report from market research firm IHS, as the complexity and penetration of in-vehicle infotainment systems and advanced driver assistance systems (ADAS) increase, there is a growing need for hardware and software that support artificial intelligence (AI), which uses electronics and software to emulate functions of the human brain. Unit shipments of AI systems used in infotainment and ADAS are expected to rise from just 7 million in 2015 to 122 million by 2025, says IHS. The attach rate of AI-based systems in new vehicles was 8 percent in 2015, and the vast majority were focused on speech recognition. That figure is forecast to reach 109 percent in 2025, as many cars will carry multiple AI systems of different types.
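To put those shipment figures in perspective, the short sketch below works out the growth rate they imply and why an attach rate above 100 percent is possible. Only the 7 million and 122 million shipment numbers and the 109 percent attach rate come from the IHS report; the rest is plain arithmetic.

```python
# Back-of-the-envelope check of the IHS figures quoted above.
shipments_2015 = 7_000_000
shipments_2025 = 122_000_000
years = 2025 - 2015

# Implied compound annual growth rate of AI system shipments.
cagr = (shipments_2025 / shipments_2015) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 33% per year

# An attach rate above 100% simply means more than one AI system per new
# vehicle on average (attach rate = AI systems shipped / new vehicles sold).
attach_rate_2025 = 1.09
print(f"Average AI systems per new vehicle in 2025: {attach_rate_2025:.2f}")
```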

According to the report, AI-based systems are still relatively rare in automotive applications, but they will become standard in new vehicles over the next five years, particularly in two areas: infotainment human-machine interfaces, including speech recognition, gesture recognition (including handwriting recognition), eye tracking and driver monitoring, virtual assistants and natural language interfaces; and ADAS and autonomous vehicles, including camera-based machine vision systems, radar-based detection units, driver condition evaluation and sensor fusion engine control units (ECUs). A minimal sensor fusion sketch follows below.
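Sensor fusion of the kind an ADAS ECU performs typically combines independent, noisy estimates from different sensors into a single more confident one. The sketch below shows a simplified inverse-variance (Kalman-style) weighting of a radar and a camera range estimate; it is illustrative only, and the sensor values and noise figures are hypothetical rather than drawn from the report.

```python
# Illustrative sensor fusion: combine two noisy range estimates (e.g. radar
# and camera) with inverse-variance weighting, a simplified Kalman-style update.
# The measurements and variances below are made-up numbers for demonstration.

def fuse(estimate_a: float, var_a: float, estimate_b: float, var_b: float):
    """Return the fused estimate and its variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * estimate_a + w_b * estimate_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Radar sees the car ahead at 42.0 m (variance 0.25 m^2);
# the camera estimates 43.5 m (variance 1.0 m^2).
distance, variance = fuse(42.0, 0.25, 43.5, 1.0)
print(f"Fused distance: {distance:.2f} m (variance {variance:.2f} m^2)")
```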

Specifically in ADAS, deep learning -- which loosely mimics the neural networks of the human brain -- offers several advantages over traditional algorithms and is a key milestone on the road to fully autonomous vehicles. For example, deep learning allows detection and recognition of multiple objects, improves perception, reduces power consumption, supports object classification, enables recognition and prediction of actions, and will shorten ADAS development time.
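As a rough illustration of the object classification step such a perception stack performs, the sketch below runs a single forward pass of a tiny, untrained neural network over a fake camera feature vector. The layer sizes, random weights and class labels are hypothetical and not from the report; a production network would be far larger and trained on labelled driving data.

```python
import numpy as np

# Toy forward pass of a small neural network classifier, showing how a
# deep-learning perception stack maps sensor features to object classes.
rng = np.random.default_rng(0)
classes = ["car", "pedestrian", "cyclist", "traffic sign"]

# Pretend a camera pipeline produced a 16-dimensional feature vector.
features = rng.normal(size=16)

# Two fully connected layers: 16 -> 8 (ReLU) -> 4 (softmax).
w1, b1 = rng.normal(size=(16, 8)), np.zeros(8)
w2, b2 = rng.normal(size=(8, 4)), np.zeros(4)

hidden = np.maximum(features @ w1 + b1, 0.0)   # ReLU activation
logits = hidden @ w2 + b2
probs = np.exp(logits - logits.max())
probs /= probs.sum()                           # softmax over the four classes

print({c: round(float(p), 3) for c, p in zip(classes, probs)})
```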

The hardware required to embed AI and deep learning in safety-critical, high-performance automotive applications at mass-production volume is not yet available, due to the high cost and sheer size of the computers needed to perform these advanced tasks. Even so, elements of AI are already present in vehicles today. In current infotainment human-machine interfaces, most speech recognition already relies on neural-network algorithms running in the cloud. The 2015 BMW 7 Series is the first car to use a hybrid approach, offering embedded hardware able to perform voice recognition in the absence of wireless connectivity. In ADAS applications, Tesla claims to implement neural network functionality, based on the Mobileye EyeQ3 processor, in its autonomous driving control unit.
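The hybrid approach described above is essentially a fallback pattern: use the more capable cloud recogniser when a data connection is available, and drop to an on-board engine when it is not. The sketch below illustrates only that control flow; the function names are hypothetical placeholders, not any vendor's actual API.

```python
# Illustrative hybrid speech-recognition dispatcher. cloud_recognize() and
# embedded_recognize() are hypothetical stand-ins for a cloud service and an
# on-board engine respectively.

def cloud_recognize(audio: bytes) -> str:
    raise ConnectionError("no wireless connectivity")   # simulate being offline

def embedded_recognize(audio: bytes) -> str:
    return "navigate home"                               # canned on-board result

def recognize(audio: bytes) -> str:
    """Prefer the cloud recogniser; fall back to the embedded one."""
    try:
        return cloud_recognize(audio)
    except ConnectionError:
        return embedded_recognize(audio)

print(recognize(b"\x00\x01"))   # -> "navigate home" when offline
```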
