
Update on autonomous cars: mastering city street driving

May 14, 2014
In a recent blog post, the director of Google's self-driving car project, Chris Urmson, has given an update on technology that he says is better than the human eye.

Google’s autonomous vehicles have logged nearly 700,000 miles on the streets of the company’s hometown, Mountain View, California. Urmson says a mile of city driving is much more complex than a mile of freeway driving, with hundreds of different objects moving according to different rules of the road in a small area.

He claims that Google has improved its software so it can detect hundreds of distinct objects simultaneously—pedestrians, buses, a stop sign held up by a crossing guard, or a cyclist making gestures that indicate a possible turn. A self-driving vehicle can pay attention to all of these things in a way that a human physically can’t—and it never gets tired or distracted.

Urmson says: “As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer. As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it). We still have lots of problems to solve, including teaching the car to drive more streets in Mountain View before we tackle another town, but thousands of situations on city streets that would have stumped us two years ago can now be navigated autonomously.”
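
Urmson's description of "software models of what to expect" suggests a prediction layer that assigns likelihoods to the behaviour of each tracked object. The following is a purely illustrative sketch of that idea, not Google's actual system; every class name, context label, and probability here is invented for illustration.

```python
from dataclasses import dataclass

# Illustrative only: a toy behaviour-prediction layer of the kind Urmson
# describes, where likely outcomes (a car stopping at a red light) get
# high probability and unlikely ones (blowing through it) get low
# probability. All labels and numbers below are invented.

@dataclass
class TrackedObject:
    kind: str      # e.g. "car", "cyclist", "pedestrian"
    context: str   # e.g. "approaching_red_light", "signalling_left"

# Hand-tuned priors standing in for models built from thousands of
# encountered situations.
BEHAVIOUR_PRIORS = {
    ("car", "approaching_red_light"): {"stop": 0.98, "run_light": 0.02},
    ("cyclist", "signalling_left"):   {"turn_left": 0.90, "go_straight": 0.10},
}

def predict_behaviours(obj: TrackedObject) -> dict:
    """Return a probability distribution over plausible next behaviours."""
    default = {"continue": 1.0}
    return BEHAVIOUR_PRIORS.get((obj.kind, obj.context), default)

if __name__ == "__main__":
    car = TrackedObject(kind="car", context="approaching_red_light")
    print(predict_behaviours(car))  # {'stop': 0.98, 'run_light': 0.02}
```

In a real vehicle these distributions would be learned from logged driving data and updated continuously for hundreds of objects at once, which is what lets the software treat a chaotic-looking street as "fairly predictable".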

With nearly 700,000 autonomous miles under its belt, Google is growing more optimistic that it is heading toward an achievable goal—a vehicle that operates fully without human intervention.
