Intel outlines AV limits of perception

CES 2021: Intel boss Amnon Shashua suggests radar and Lidar as redundant add-ons
By Ben Spencer, January 12, 2021
Shashua: 'You have to be 1,000 times better than these statistics'

What is an acceptable failure rate of a vehicle's perception system?

And how does this influence the development and regulation of autonomous vehicles (AVs)?

These were among the key areas covered by Professor Amnon Shashua, senior vice president of Intel and chief executive officer of Mobileye, at this week's CES 2021 event.

In an online session, Shashua revealed that the company measures failure rate in terms of hours of driving between crashes.

“If we google, we will find out that about 3.2 trillion miles a year in the US are being travelled by cars and there are about six million crashes a year,” he said.

“So divide one by another, you get: every 500,000 miles on average there is a crash.”

“Let's assume that in 50% of crashes it's your fault, so let's make this one million [miles], and let's divide this by 20 miles per hour on average, so we get about once every 50,000 hours of driving we'll have a crash,” he added.

Shashua then applied this level of performance to a hypothetical deployment of 50,000 robotic cars.

“It would mean that every hour, on average, [the fleet] will have an accident that is our fault because it’s a failure of the perception system,” he continued.

“From a business perspective this [is] not sustainable, and from a society perspective, I don't see regulators approving something like this, so you have to be 1,000 times better than these statistics.”
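For readers who want to check the arithmetic, here is a minimal Python sketch of the back-of-envelope calculation Shashua walks through; the figures are the rounded ones he quotes in the talk, and the variable names are ours:

```python
# Back-of-envelope reproduction of the failure-rate arithmetic quoted above.
# All figures are the rounded ones from Shashua's CES 2021 talk, not official statistics.

miles_per_year = 3.2e12        # total US vehicle miles travelled per year
crashes_per_year = 6e6         # total US crashes per year
at_fault_share = 0.5           # assume half of all crashes are "your fault"
avg_speed_mph = 20             # assumed average driving speed

miles_per_crash = miles_per_year / crashes_per_year                  # ~533,000 miles
miles_per_at_fault_crash = miles_per_crash / at_fault_share          # ~1 million miles
hours_per_at_fault_crash = miles_per_at_fault_crash / avg_speed_mph  # ~53,000 hours

fleet_size = 50_000            # the hypothetical AV deployment in the example
fleet_at_fault_crashes_per_hour = fleet_size / hours_per_at_fault_crash

print(f"Miles between crashes:           {miles_per_crash:,.0f}")
print(f"Hours between at-fault crashes:  {hours_per_at_fault_crash:,.0f}")
print(f"Fleet at-fault crashes per hour: {fleet_at_fault_crashes_per_hour:.2f}")
```

On these numbers the 50,000-car fleet logs roughly one at-fault crash every hour of operation, which is the figure Shashua rounds to above; being “1,000 times better” would stretch the interval from about 50,000 to about 50 million hours of driving between at-fault perception failures.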

Mobileye is acutely aware of this, having just announced it will be testing AVs in new cities this year: Detroit, Tokyo, Shanghai, Paris and (pending regulation) New York City.

From a technological point of view, Shashua insisted it is “so crucial to do the hard work”, rather than combining all the sensors at the beginning and carrying out a “low-level fusion – which is easy to do”.

“Forget about the radars and Lidars, solve the difficult problem of doing an end-to-end, standalone, self-contained camera-only system and then add the radars and Lidars as a redundant add-on,” he concluded. 
