Lidar technology wins big in China’s autonomous vehicle challenge

November 26, 2013 Read time: 2 mins
China’s fifth annual Future Challenge earlier this month pitted eleven unmanned intelligent vehicles against each other on a 23-kilometre course designed to test their capabilities on suburban and urban roads.

All of the first eight cars to finish were equipped with Velodyne’s 3D Lidar vision technology, which provides active sensing for crash avoidance, driving automation, and mobile road survey and mapping. Velodyne HDL-64E and HDL-32E sensors deliver 360-degree views of the car’s environment, with real-time updates twenty times per second.

Cars on the course needed to demonstrate the ability to recognise traffic lights, avoid interference from pedestrians and other vehicles, successfully detour around construction zones, turn around and come to a stop. All were also required to show they could make a U-turn, accelerate and decelerate. Performance was graded on safety, smartness, smoothness and speed.

"This is simply a remarkable accomplishment," said Wolfgang Juchmann, PhD, 2259 Velodyne Lidar director of sales and marketing. "The Future Challenge course was nothing less than demanding throughout, with terrain and tests that demonstrated Lidar’s versatility and reliability in real time. And the fact that eight of eleven vehicles were so equipped stands as a huge vote of confidence in our technology."
