
US regulator ‘paves the way for Google’s self-driving car’

February 11, 2016 Read time: 4 mins
A letter to Google from the US federal transport regulator, the National Highway Traffic Safety Administration (NHTSA), appears to pave the way for self-driving cars, but adds the proviso that the rule-making could take some time.

Google had requested clarification of a number of provisions in the Federal Motor Vehicle Safety Standards (FMVSSs) as they apply to Google’s described design for self-driving vehicles (SDVs).

“If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable to identify the driver as whatever (as opposed to whoever) is doing the driving,” Paul A. Hemmersbaugh, NHTSA chief counsel, said in the letter to Chris Urmson, director of Google’s self-driving car project. “In this instance, an item of motor vehicle equipment, the SDS [self-driving system], is actually driving the vehicle.”

The NHTSA’s current 49 CFR 571.3 rule defines a driver as “the occupant of a motor vehicle seated immediately behind the steering control system.”

According to Google, its SDVs are fully autonomous motor vehicles, i.e. vehicles whose operations are controlled exclusively by a self-driving system (SDS). The SDS is an artificial-intelligence driver: a computer built into the motor vehicle itself that controls all aspects of driving by perceiving its environment and responding to it. Google believes the vehicles have no need for a human driver and asked NHTSA how provisions in the FMVSSs should be interpreted for the operation of the new vehicles.

According to Hemmersbaugh, Google's description of its proposed vehicles corresponds to Level 4 Full Self-Driving Automation as defined in NHTSA's May 2013 preliminary policy statement on automated vehicles.

According to that Statement, a Level 4 vehicle “is designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip. Such a design anticipates that the driver will provide destination or navigation input, but is not expected to be available for control at any time during the trip. This includes both occupied and unoccupied vehicles. By design, safe operation rests solely on the automated vehicle system.”

“In essence, Google seeks to produce a vehicle that contains L4 automated driving capabilities, and removes conventional driver controls and interfaces (like a steering wheel, throttle pedal, and brake pedal, among many other things),” said Hemmersbaugh.

However, consumer advocacy group Consumer Watchdog said NHTSA is wrong to count the artificial intelligence guiding an autonomous robot car as the driver, adding that Google's own test data demonstrates the need for a human driver who can take control when necessary.

"Google says its robot technology failed and handed over control to a human test driver 272 times and the driver was scared enough to take control 69 times," said John M. Simpson, Consumer Watchdog's privacy project director. "The robot cars simply cannot reliably deal with everyday real traffic situations. Without a driver, who do you call when the robots fail?"

Consumer Watchdog reiterated its support for regulations proposed by the California Department of Motor Vehicles covering the general deployment of autonomous robot cars on the state's highways.

"The DMV would require a licensed driver behind the wheel," Simpson noted. "If you really care about the public's safety, that's the only way to go."

Commenting on NHTSA's interpretation that the robot technology can count as a driver, Anthony Foxx, US Secretary of Transportation, said: "We are taking great care to embrace innovations that can boost safety and improve efficiency on our roadways. Our interpretation that the self-driving computer system of a car could, in fact, be a driver is significant. But the burden remains on self-driving car manufacturers to prove that their vehicles meet rigorous federal safety standards."
