NHTSA opens investigation into fatal Tesla crash

July 1, 2016
The US National Highway Traffic Safety Administration (NHTSA) has opened a preliminary investigation into a fatal crash involving a Tesla autonomous car in Florida. According to a Florida Highway Patrol report, the 40-year-old driver was killed when his 2015 Model S drove under the trailer of an 18-wheel truck.

In a blog post on the crash, which happened in early May, Tesla said “the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S. Had the Model S impacted the front or rear of the trailer, even at high speed, its advanced crash safety system would likely have prevented serious injury as it has in numerous other similar incidents.”

The investigation comes just as the NHTSA was expected to issue new guidelines for self-driving vehicles. Secretary of Transportation Anthony Foxx and NHTSA director Mark Rosekind have publicly pressed for the rapid deployment of the technology. Non-profit consumer advocacy group Consumer Watchdog said NHTSA should conclude its investigation into the Tesla crash and publicly release its data and findings before moving forward with the guidance.

"We hope this is a wake-up call to federal regulators that we still don't know enough about the safety of self-driving cars to be rushing them to the road. The Administration should slow its rush to write guidelines until the causes in this crash are clear, and the manufacturers provide public evidence that self-driving cars are safe. If a car can't tell the difference between a truck and the sky, the technology is in doubt," said Carmen Balber, executive director with Consumer Watchdog.

In a telephone interview with Bloomberg, Clarence Ditlow, executive director of the Center for Auto Safety, an advocacy group in Washington, said that if the Autopilot system did not recognise the tractor trailer, Tesla will have to recall the cars to fix the flaw. Ditlow said that Tesla’s Autopilot system needs to be able to recognise all possible road conditions.

“That’s a clear-cut defect and there should be a recall,” Ditlow said. “When you put Autopilot in a vehicle, you’re telling people to trust the system even if there is lawyerly warning to keep your hands on the wheel.”

Consumer Watchdog has called on NHTSA to hold a public rulemaking on self-driving cars, and to require the cars to have a steering wheel and pedals to allow a human driver to take over when the technology fails.
