Drivers need clarity on liability with automated vehicles, says FIA

March 14, 2017 | Read time: 2 mins
FIA Region I recently presented the consumer view on liability and automated driving at the Driving Future platform, where it stressed the need to increase consumer confidence in driverless technologies by guaranteeing safety and swift compensation for traffic victims.

FIA believes the transition to fully autonomous vehicles will take time, during which different levels of automation will coexist on our roads, creating challenges for the current insurance model.

It says there must be a clear differentiation between lower and higher levels of automation. Up to SAE Level 2, driver interaction is required in some form, so drivers should remain liable, provided the systems are properly designed and the driver is aware of their function, limits and constraints. At higher levels of automation, drivers can be asked to take over only under certain circumstances; in those cases, the recording of a limited set of data will be needed to establish liability in the event of an accident.
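The differentiation FIA describes can be read as a simple decision rule: up to SAE Level 2 liability stays with the driver, while at higher levels it turns on whether the automated system was driving and whether it had asked the driver to resume control. The sketch below is purely illustrative and uses hypothetical names (IncidentContext, presumed_liability, LiableParty); it is not drawn from any FIA or regulatory text.

```python
# Illustrative sketch only: a simplified mapping of the liability split FIA
# argues for, keyed on SAE automation level and whether a takeover was
# requested. All names and rules here are hypothetical assumptions.
from dataclasses import dataclass
from enum import Enum


class LiableParty(Enum):
    DRIVER = "driver"
    MANUFACTURER_OR_INSURER = "manufacturer_or_insurer"
    NEEDS_DATA_REVIEW = "needs_data_review"


@dataclass
class IncidentContext:
    sae_level: int            # 0-5, SAE J3016 automation level
    system_engaged: bool      # was the automated system active at the time?
    takeover_requested: bool  # had the system asked the driver to resume control?


def presumed_liability(ctx: IncidentContext) -> LiableParty:
    """Rough decision rule reflecting the differentiation FIA describes."""
    if ctx.sae_level <= 2:
        # Up to SAE Level 2 the driver must stay engaged, so liability
        # remains with the driver (assuming a properly designed system).
        return LiableParty.DRIVER
    if not ctx.system_engaged or ctx.takeover_requested:
        # At higher levels, recorded data must show whether a takeover
        # request was issued before responsibility can be assigned.
        return LiableParty.NEEDS_DATA_REVIEW
    # The system was driving and no takeover was requested: the driver,
    # in FIA's view, should not be held liable.
    return LiableParty.MANUFACTURER_OR_INSURER


if __name__ == "__main__":
    ctx = IncidentContext(sae_level=4, system_engaged=True, takeover_requested=False)
    print(presumed_liability(ctx))  # LiableParty.MANUFACTURER_OR_INSURER
```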

FIA Region I interim director general, Laurianne Krid, said: "Drivers need to be properly informed about upcoming automated systems and their responsibilities to make correct use of the technology as it is released. At higher automation levels, drivers expect to be able to engage in other tasks and should, in our view, not be held liable in case of accident or infringement. Limited data recording through a Data Storage System should help clarify liability in case of doubt."
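Krid's reference to limited data recording implies a small, fixed set of fields logged around a takeover event. The record below is a hypothetical sketch of what such a "limited set of data" might contain; the field names are assumptions, not the specification of any actual Data Storage System.

```python
# Hypothetical sketch of a minimal event record a Data Storage System might
# keep to clarify liability; field names are illustrative assumptions only.
from dataclasses import dataclass
from datetime import datetime


@dataclass(frozen=True)
class AutomationEvent:
    timestamp: datetime        # when the event occurred
    automation_active: bool    # whether the automated mode was engaged
    takeover_requested: bool   # whether the system asked the driver to resume
    driver_responded: bool     # whether the driver took back control
    speed_kph: float           # vehicle speed at the time of the event


# Example: a record showing a takeover request that went unanswered.
event = AutomationEvent(
    timestamp=datetime(2017, 3, 14, 9, 30),
    automation_active=True,
    takeover_requested=True,
    driver_responded=False,
    speed_kph=92.0,
)
```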
