
Tesla crash in China puts autonomous cars in the spotlight again

August 11, 2016
Tesla is investigating a crash in Beijing, China, last week in which a Tesla Model S in Autopilot mode hit the side of a parked car. According to Reuters, Tesla said it had reviewed data to confirm the car was in Autopilot mode, a system that takes control of steering and braking in certain conditions.

Tesla also said it was the driver's responsibility to maintain control of the vehicle. In this case, it said, the driver's hands were not detected on the steering wheel.

"The driver of the Tesla, whose hands were not detected on the steering wheel, did not steer to avoid the parked car and instead scraped against its side," a Tesla spokeswoman said in an emailed response to Reuters.

Richard Cuerden, chief scientist, engineering & technology, at the UK’s Transport Research Laboratory (TRL), said the collision in China further highlights potential issues around the use of automated systems, particularly in cases where the driver is still required to remain alert and attentive at the controls of the vehicle.

He said, “The Society of Automotive Engineers currently specifies five levels of vehicle automation. Tesla’s Autopilot system is classified as Level 2 automation, which means the driver is required to maintain alertness and be ready at the controls, even in Autopilot mode. This presents well-known challenges in terms of drivers’ awareness and understanding of the capabilities of the automation systems and the process by which control is shared and shifted between the driver and the vehicle in different modes of automated operation.

“We are going to see more collisions like this where, for whatever reason, the driver and the technology didn’t identify and react to the threat. What we need to do now is understand why the vehicle made the decisions it did and act accordingly. This is where projects like MOVE_UK, which will compare the behaviour of automated systems to human drivers, can really help. By understanding what went wrong and why, we can quickly teach collision avoidance systems to better predict any risks in real life environments.

“At the same time, it’s vital that drivers of vehicles with automated functionality remain aware and follow the instructions provided by the manufacturer, so that incidents like the one in China can be avoided as we discover more about this new technology.”
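For readers unfamiliar with the SAE classification Cuerden refers to, the short sketch below is an illustration added for clarity (it is not taken from TRL or Tesla material). It lists the SAE J3016 automation levels and shows why a Level 2 system such as Autopilot still leaves the human driver responsible for monitoring the road.

```python
# Illustrative sketch only: the SAE J3016 driving-automation levels and who
# must monitor the driving environment at each level.
from enum import IntEnum

class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # human driver does everything
    DRIVER_ASSISTANCE = 1       # steering OR braking/acceleration assistance
    PARTIAL_AUTOMATION = 2      # steering AND braking/acceleration combined (e.g. Autopilot)
    CONDITIONAL_AUTOMATION = 3  # system monitors the road; driver must take over on request
    HIGH_AUTOMATION = 4         # no driver intervention needed within a defined domain
    FULL_AUTOMATION = 5         # no driver needed under any conditions

def driver_must_monitor(level: SAELevel) -> bool:
    """At Levels 0-2 the human driver must monitor the road at all times."""
    return level <= SAELevel.PARTIAL_AUTOMATION

# A Level 2 system such as Autopilot still leaves the driver responsible
# for watching the road and being ready to intervene, as Cuerden notes above.
assert driver_must_monitor(SAELevel.PARTIAL_AUTOMATION)
assert not driver_must_monitor(SAELevel.CONDITIONAL_AUTOMATION)
```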

Related Content

  • October 20, 2017
    MOVE_UK develops new validation method to speed up AV deployment
    MOVE_UK has completed the first phase of its three-year research programme for the real-world testing of autonomous vehicles (AVs) in the borough of Greenwich, London. The project has enabled the company to develop a new validation method to reduce the time taken to test automated driving systems and bring them to market. The project’s data is gathered from sensors installed on a fleet of Land Rover vehicles that have already completed more than 30…
  • January 24, 2017
    Tesla Autopilot system ‘not at fault’ in fatal crash
    A nine-month investigation by the US National Highway Traffic Safety Administration (NHTSA) into the fatal car crash involving a Tesla Model S in Florida last year has concluded that the car’s Autopilot system, which was in operation at the time, was not at fault. The decision noted that Autopilot is a Level 2 self-driving system and therefore requires the driver to monitor the system at all times and be ready to intervene – a requirement the driver failed to meet, the administration says.
  • January 26, 2012
    Increasing road safety with automated driver assistance systems
    Jon Masters looks at how drivers will be trained to use the increasing number of advanced driver assistance systems being incorporated into modern cars
  • February 11, 2016
    US regulator ‘paves the way for Google’s self-driving car’
    A letter to Google from the US federal transport regulator, the National Highway Traffic Safety Administration (NHTSA), appears to pave the way for self-driving cars, but adds the proviso that the rule-making could take some time. Google had requested clarification of a number of provisions in the Federal Motor Vehicle Safety Standards (FMVSSs) as they apply to Google’s described design for self-driving vehicles (SDVs). “If no human occupant of the vehicle can actually drive the vehicle, it is more reasonable…