Tesla crash in China puts autonomous cars in the spotlight again

August 11, 2016 Read time: 3 mins
Tesla is investigating the crash in Beijing, China last week, when a Tesla Model S in autopilot mode hit the side of a parked car. According to Reuters, Tesla said it had reviewed data to confirm the car was in autopilot mode, a system that takes control of steering and braking in certain conditions.

Tesla also said it was the driver's responsibility to maintain control of the vehicle. In this case, it said, the driver's hands were not detected on the steering wheel.

"The driver of the Tesla, whose hands were not detected on the steering wheel, did not steer to avoid the parked car and instead scraped against its side," a Tesla spokeswoman said in an emailed response to Reuters.

Richard Cuerden, chief scientist, engineering & technology, at the UK’s Transport Research Laboratory (TRL), said the collision in China further highlights potential issues around the use of automated systems, particularly in cases where the driver is still required to remain alert and attentive at the controls of the vehicle.

He said, “The Society of Automotive Engineers currently specifies five levels of vehicle automation. Tesla’s autopilot system is classified as level two automation, which means the driver is required to maintain alertness and be ready at the controls, even in autopilot mode. This presents well-known challenges in terms of drivers’ awareness and understanding of the capabilities of the automation systems and the process by which control is shared and shifted between the driver and the vehicle in different modes of automated operation.

“We are going to see more collisions like this where, for whatever reason, the driver and the technology didn’t identify and react to the threat. What we need to do now is understand why the vehicle made the decisions it did and act accordingly. This is where projects like MOVE_UK, which will compare the behaviour of automated systems to human drivers, can really help. By understanding what went wrong and why, we can quickly teach collision avoidance systems to better predict any risks in real life environments.

“At the same time, it’s vital that drivers of vehicles with automated functionality remain aware and follow the instructions provided by the manufacturer, so that incidents like the one in China can be avoided as we discover more about this new technology.”

Related Content

  • January 30, 2012
    In-vehicle systems as enforcement enablers?
From an enforcement perspective at least, Toyota's recent recalls over problems with accelerator pedal assemblies had a positive outcome in that, for the first time, a major motor manufacturer outside of the US acknowledged publicly what many have known or suspected for quite a while: that the capability exists within certain car companies to extract data from a vehicle onboard unit which can be used to help ascertain, if not prove outright, just what was happening in the vital seconds up to an accident or crash
  • September 23, 2014
    Does ADAS create as many problems as it solves?
    Victoria Banks and Neville Stanton of Southampton University’s Transportation Research Group examine the real impact of creeping driver automation. Safety research suggests that 90% of accidents result from driver inattentiveness to unpredictable or incomplete information, and the vision is that highly automated vehicles will lead to accident-free driving in the future.
  • July 29, 2016
    Ignoring deadly defects in autonomous cars serves no one, say auto safety advocates
    The US Center for Auto Safety, Consumer Watchdog and former National Highway Traffic Safety Administration (NHTSA) administrator Joan Claybrook have told NHTSA administrator Mark Rosekind that "you inexcusably are rushing full speed ahead" to promote the deployment of self-driving robot car technology instead of developing adequate safety standards "crucial to ensuring imperfect technologies do not kill people by being introduced into vehicles before the technology matures." In a letter to Rosekind in response
  • October 19, 2016
    FEMA and Dutch motorcyclists question Tesla’s type approval
    Dutch motorcyclists’ organisations Motorrijders Actie Groep (MAG), the Koninklijke Nederlandse Motorrijders Vereniging (KNMV) and the Federation of European Motorcyclists’ Associations (FEMA) have written to RDW, the Netherlands Vehicle Authority, to express their concerns about the way car manufacturers implement driver assist systems. According to FEMA, crashes, studies and evasive answers to its questions indicate that these systems are not properly tested, and certainly not with motorcycles. FEMA