Update on autonomous cars: mastering city street driving
May 14, 2014 Read time: 2 mins
In a recent blog post, Chris Urmson, director of Google’s self-driving car project, gave an update on technology that he says is better than the human eye.

Google’s autonomous vehicles have logged nearly 700,000 miles on the streets of the company’s hometown, Mountain View, California. Urmson says a mile of city driving is much more complex than a mile of freeway driving, with hundreds of different objects moving according to different rules of the road in a small area.

He claims that Google has improved its software so it can detect hundreds of distinct objects simultaneously—pedestrians, buses, a stop sign held up by a crossing guard, or a cyclist making gestures that indicate a possible turn. A self-driving vehicle can pay attention to all of these things in a way that a human physically can’t—and it never gets tired or distracted.

Urmson says: “As it turns out, what looks chaotic and random on a city street to the human eye is actually fairly predictable to a computer. As we’ve encountered thousands of different situations, we’ve built software models of what to expect, from the likely (a car stopping at a red light) to the unlikely (blowing through it). We still have lots of problems to solve, including teaching the car to drive more streets in Mountain View before we tackle another town, but thousands of situations on city streets that would have stumped us two years ago can now be navigated autonomously.”

With nearly 700,000 autonomous miles under its belt, Google is growing more optimistic that it is heading toward an achievable goal—a vehicle that operates fully without human intervention.
