Update on autonomous cars: mastering city street driving

May 14, 2014 Read time: 2 mins
In a recent blog post, Chris Urmson, director of Google’s self-driving car project, has given an update on the technology, which he says is better than the human eye.

Google’s autonomous vehicles have logged nearly 700,000 miles on the streets of the company’s hometown, Mountain View, California. Urmson says a mile of city driving is far more complex than a mile of freeway driving, with hundreds of different objects moving according to different rules of the road within a small area.

He claims that Google has improved its software so it can detect hundreds of distinct objects simultaneously—pedestrians, buses, a stop sign held up by a crossing guard, or a cyclist making gestures that indicate a possible turn. A self-driving vehicle can pay attention to all of these things in a way that a human physically can’t—and it never gets tired or distracted.

Urmson says: “As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer. As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it). We still have lots of problems to solve, including teaching the car to drive more streets in Mountain View before we tackle another town, but thousands of situations on city streets that would have stumped us two years ago can now be navigated autonomously.”

With nearly 700,000 autonomous miles under its belt, Google is growing more optimistic that it is heading toward an achievable goal: a vehicle that operates fully without human intervention.
