
Tesla crash in China puts autonomous cars in the spotlight again

August 11, 2016
Tesla is investigating a crash in Beijing, China, last week in which a Tesla Model S in Autopilot mode hit the side of a parked car. According to Reuters, Tesla said it had reviewed data to confirm the car was in Autopilot mode, a system that takes control of steering and braking in certain conditions.

Tesla also said it was the driver's responsibility to maintain control of the vehicle. In this case, it said, the driver's hands were not detected on the steering wheel.

"The driver of the Tesla, whose hands were not detected on the steering wheel, did not steer to avoid the parked car and instead scraped against its side," a Tesla spokeswoman said in an emailed response to Reuters.

Richard Cuerden, chief scientist, engineering & technology, at the UK’s Transport Research Laboratory (TRL), said the collision in China further highlights potential issues around the use of automated systems, particularly in cases where the driver is still required to remain alert and attentive at the controls of the vehicle.

He said, “The Society of Automotive Engineers currently specifies five levels of vehicle automation. Tesla’s autopilot system is classified as level two automation, which means the driver is required to maintain alertness and be ready at the controls, even in autopilot mode. This presents well-known challenges in terms of drivers’ awareness and understanding of the capabilities of the automation systems and the process by which control is shared and shifted between the driver and the vehicle in different modes of automated operation.

“We are going to see more collisions like this where, for whatever reason, the driver and the technology didn’t identify and react to the threat. What we need to do now is understand why the vehicle made the decisions it did and act accordingly. This is where projects like MOVE_UK, which will compare the behaviour of automated systems to human drivers, can really help. By understanding what went wrong and why, we can quickly teach collision avoidance systems to better predict any risks in real life environments.

“At the same time, it’s vital that drivers of vehicles with automated functionality remain aware and follow the instructions provided by the manufacturer, so that incidents like the one in China can be avoided as we discover more about this new technology.”

    Long-time industry leader John Worthington reflects on where transportation in the US is heading – and where it should be going. Interview with Jason Barnes. The US’s new transportation bill reflects much of what is wrong in the sector in general and in ITS in particular, according to John Worthington. While a decision is welcome, he says, it does little more than provide certainty of funding for anything other than day-to-day operations. Worthington, former Chairman and CEO of TransCore, is back in the ITS