
Waymo redesigns fifth-generation hardware sensor suite

Waymo has redesigned its fifth-generation hardware sensor suite with the aim of enabling the scaled deployment of Waymo Driver autonomous vehicles (AVs).
By Ben Spencer March 16, 2020 Read time: 2 mins
Waymo's self-driving Jaguar I-Pace electric SUV (Source: Waymo)

In a blog post, Satish Jeyachandran, head of hardware at Waymo, says the new 360 Lidar system provides a bird's-eye view of the cars, cyclists and pedestrians surrounding the vehicle. It allows Waymo Driver to navigate the complexities of city driving, distinguishing a car door opening a city block away, and lets the company's trucks spot debris hundreds of metres ahead on the highway, he adds.
 
Perimeter Lidars are now placed at four points around the sides of the vehicle to help it navigate tight gaps in city traffic and cover potential blind spots on hilly terrain.
 
According to Jeyachandran, long-range cameras and a 360 vision system can identify pedestrians and stop signs more than 500m away.
 
Additionally, a perimeter vision system is expected to work in conjunction with the perimeter Lidars to give Waymo Driver another perspective on objects close to the vehicle.
 
“For example, while our perimeter Lidars detect obstacles directly in front of the vehicle with precision, our perimeter cameras provide our machine learning algorithms additional details to reliably identify objects, providing more context to the traffic scene,” Jeyachandran explains.
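As a rough illustration of how this kind of camera-lidar fusion can work (a minimal sketch in Python, not Waymo's actual pipeline; the detection classes, fields and matching threshold below are assumptions made for the example), lidar supplies precise obstacle positions while a camera-based classifier supplies labels, and the two are matched to enrich the description of the traffic scene:

    # Minimal sketch of camera-lidar fusion (illustrative only, not Waymo's pipeline).
    # Lidar gives precise obstacle positions; camera detections add semantic labels.
    from dataclasses import dataclass
    import math

    @dataclass
    class LidarDetection:
        x: float  # metres ahead of the vehicle
        y: float  # metres to the left (+) or right (-)

    @dataclass
    class CameraDetection:
        x: float
        y: float
        label: str  # e.g. "pedestrian", "cyclist", "opening car door"

    def fuse(lidar_hits, camera_hits, max_match_m=1.0):
        """Attach the nearest camera label to each lidar obstacle, if one is close enough."""
        fused = []
        for obstacle in lidar_hits:
            best_label, best_dist = "unknown", max_match_m
            for det in camera_hits:
                dist = math.hypot(obstacle.x - det.x, obstacle.y - det.y)
                if dist < best_dist:
                    best_label, best_dist = det.label, dist
            fused.append((obstacle, best_label))
        return fused

    # Example: a lidar return directly ahead is labelled by a nearby camera detection.
    print(fuse([LidarDetection(12.0, 0.4)], [CameraDetection(12.3, 0.5, "cyclist")]))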
 
He claims that a peripheral vision system allows Waymo to ‘peek’ around a truck driving in front to determine whether it is safe to overtake or wait.
 
Waymo has also redesigned its radar's architecture, outputs and signal processing capabilities to create an “imaging radar system for self-driving”.
 
“Our next-generation radar can also see objects at great distances, including detecting a motorcyclist from hundreds of metres away,” he continues. “Like with our other long-range sensors, being able to accurately detect objects at greater distances gives us a longer reaction time to make a more comfortable experience for our riders.”
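As a back-of-the-envelope illustration of that trade-off (the closing speed and detection ranges below are assumptions, not Waymo figures), the time available to respond is simply the detection range divided by the closing speed:

    # Illustrative only: assumed closing speed and detection ranges, not Waymo data.
    closing_speed_ms = 30.0  # roughly 110 km/h
    for detection_range_m in (100, 300):
        seconds = detection_range_m / closing_speed_ms
        print(f"Detected at {detection_range_m} m -> about {seconds:.1f} s to respond")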
 
The company has integrated its new generation of sensors on its Jaguar I-PACE vehicle.
 
“With the first of these new vehicles, we’ve completed comprehensive module-level and system-level tests to ensure our next-generation hardware can withstand whatever the roads throw at it - from stormy weather and salted roads, to extreme heat and dirt storms,” Jeyachandran concludes.

 
