
Two deaths in Tesla crash with no driver

Victims found in the front and back seats - but this was not an autonomous vehicle
By Ben Spencer April 21, 2021 Read time: 1 min
Tesla says data recovered so far showed Autopilot was not enabled (© Sylvain Robin | Dreamstime.com)

Two men were killed after a Tesla vehicle crashed into a tree in Houston, Texas, sparking an investigation.

The 2019 Tesla Model S was travelling at a high speed when it failed to negotiate a curve on a winding road.

A report by the BBC says police believe there was nobody present in the driver's seat at the time of the accident. 

Mark Herman, Harris County Precinct 4 constable, is quoted as saying that evidence suggests “no-one was driving the vehicle at the time of impact”. 

He added that the case was still under investigation. 

One victim was found in the front passenger seat and the other was in the back of the vehicle. 

Tesla says data recovered so far showed the Autopilot advanced driver assistance system was not enabled. 

A Tesla Model X operating on Autopilot claimed the life of its driver in 2018 after the vehicle crashed into a roadside barrier in California. 

During the same year, Uber pulled its autonomous vehicle operations out of Arizona after one of its test vehicles killed a pedestrian.

 

Related Content

  • August 11, 2016
    Tesla crash in China puts autonomous cars in the spotlight again
    Tesla is investigating the crash in Beijing, China last week, when a Tesla Model S in autopilot mode hit the side of a parked car. According to Reuters, Tesla said it had reviewed data to confirm the car was in autopilot mode, a system that takes control of steering and braking in certain conditions. Tesla also said it was the driver's responsibility to maintain control of the vehicle. In this case, it said, the driver's hands were not detected on the steering wheel.
  • February 14, 2019
    Ride-hailing and taxi drivers could face tougher criminal checks in England
    Drivers who ply their trade on apps such as Uber could be under greater scrutiny as part of proposals being put forward by the UK government. The potential risk to passengers from the explosion of ride-hailing apps, as private-hire drivers are perceived to receive less thorough vetting – for example, to flag up past convictions – has long been argued. Incidents such as the murders of passengers by a Didi driver in China heightened such concerns.
  • November 1, 2021
    Verizon applies C-V2X pedestrian safety
    California’s CCTA will initiate validation of the tech for its ADS Grant Program 
  • January 24, 2017
    Tesla Autopilot system ‘not at fault’ in fatal crash
    A nine-month investigation by the US National Highway Traffic Safety Administration (NHTSA) into the fatal car crash involving a Tesla Model S in Florida last year has concluded that the car’s Autopilot system, which was in operation at the time, was not at fault. The decision noted that Autopilot is a Level 2 self-driving system and, therefore, requires the driver to always monitor the system and be at the ready to intervene – a stipulation that the driver failed to perform, the administration says.