Update on autonomous cars: mastering city street driving

May 14, 2014
In a recent blog post, Chris Urmson, director of Google’s self-driving car project, gave an update on technology he says is better than the human eye.

Google’s autonomous vehicles have logged nearly 700,000 miles on the streets of the company’s hometown, Mountain View, California. Urmson says a mile of city driving is much more complex than a mile of freeway driving, with hundreds of different objects moving according to different rules of the road in a small area.

He claims that Google has improved its software so it can detect hundreds of distinct objects simultaneously—pedestrians, buses, a stop sign held up by a crossing guard, or a cyclist making gestures that indicate a possible turn. A self-driving vehicle can pay attention to all of these things in a way that a human physically can’t—and it never gets tired or distracted.

Urmson says: “As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer. As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it). We still have lots of problems to solve, including teaching the car to drive more streets in Mountain View before we tackle another town, but thousands of situations on city streets that would have stumped us two years ago can now be navigated autonomously.”

With nearly 700,000 autonomous miles under its belt, Google is growing more optimistic that it is heading toward an achievable goal—a vehicle that operates fully without human intervention.
