Tesla crash in China puts autonomous cars in the spotlight again

August 11, 2016 · Read time: 3 mins
Tesla is investigating a crash that occurred in Beijing, China last week, in which a Tesla Model S in autopilot mode hit the side of a parked car. According to Reuters, Tesla said it had reviewed data to confirm the car was in autopilot mode, a system that takes control of steering and braking in certain conditions.

Tesla also said it was the driver's responsibility to maintain control of the vehicle. In this case, it said, the driver's hands were not detected on the steering wheel.

"The driver of the Tesla, whose hands were not detected on the steering wheel, did not steer to avoid the parked car and instead scraped against its side," a Tesla spokeswoman said in an emailed response to Reuters.

Richard Cuerden, chief scientist, engineering &amp; technology, at the UK’s Transport Research Laboratory (TRL), said the collision in China further highlights potential issues around the use of automated systems, particularly in cases where the driver is still required to remain alert and attentive at the controls of the vehicle.

He said, “The Society of Automotive Engineers currently specifies five levels of vehicle automation. Tesla’s autopilot system is classified as level two automation, which means the driver is required to maintain alertness and be ready at the controls, even in autopilot mode. This presents well-known challenges in terms of drivers’ awareness and understanding of the capabilities of the automation systems and the process by which control is shared and shifted between the driver and the vehicle in different modes of automated operation.

“We are going to see more collisions like this where, for whatever reason, the driver and the technology didn’t identify and react to the threat. What we need to do now is understand why the vehicle made the decisions it did and act accordingly. This is where projects like MOVE_UK, which will compare the behaviour of automated systems to human drivers, can really help. By understanding what went wrong and why, we can quickly teach collision avoidance systems to better predict any risks in real life environments.

“At the same time, it’s vital that drivers of vehicles with automated functionality remain aware and follow the instructions provided by the manufacturer, so that incidents like the one in China can be avoided as we discover more about this new technology.”
