
New Tesla models to have ‘full self-driving capability’

October 21, 2016
In its online blog, Tesla says that self-driving vehicles will play a crucial role in improving transportation safety and accelerating the world’s transition to a sustainable future. Full autonomy will enable a Tesla to be substantially safer than a human driver, lower the financial cost of transportation for those who own a car and provide low-cost on-demand mobility for those who do not.

The company has announced that from now on, all Tesla vehicles produced in its factory, including Model 3, will have the hardware needed for full self-driving capability at a safety level greater than that of a human driver. Eight surround cameras provide 360-degree visibility around the car at up to 250 meters of range. Twelve updated ultrasonic sensors complement this vision, allowing for detection of both hard and soft objects at nearly twice the distance of the prior system. A forward-facing radar with enhanced processing provides additional data about the world on a redundant wavelength, capable of seeing through heavy rain, fog, dust and even the car ahead.
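For a concrete picture of the sensor suite described above, the short Python sketch below models it as plain data. Only the counts (eight cameras, twelve ultrasonic sensors, one forward radar), the 360-degree coverage and the 250-metre maximum camera range come from the announcement; every other range value and name is an illustrative placeholder, not a published specification.

```python
from dataclasses import dataclass

@dataclass
class Sensor:
    kind: str           # "camera", "ultrasonic" or "radar"
    max_range_m: float  # detection range in metres (placeholder unless noted)
    facing: str         # rough mounting position, hypothetical labels

# Suite as described in the announcement: eight surround cameras giving
# 360-degree coverage at up to 250 m, twelve ultrasonic sensors, and one
# forward-facing radar. Ultrasonic and radar ranges are placeholders.
SENSOR_SUITE = (
    [Sensor("camera", 250.0, f"surround-{i}") for i in range(8)]
    + [Sensor("ultrasonic", 8.0, f"perimeter-{i}") for i in range(12)]
    + [Sensor("radar", 160.0, "forward")]
)

if __name__ == "__main__":
    by_kind = {}
    for s in SENSOR_SUITE:
        by_kind.setdefault(s.kind, []).append(s)
    for kind, sensors in by_kind.items():
        print(f"{kind}: {len(sensors)} unit(s), "
              f"longest range {max(s.max_range_m for s in sensors)} m")
```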

To make sense of all of this data, a new onboard computer with more than 40 times the computing power of the previous generation runs the new Tesla-developed neural net for vision, sonar and radar processing software. Together, this system provides a view of the world that a driver alone cannot access, seeing in every direction simultaneously and on wavelengths that go far beyond the human senses.
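The paragraph above describes the onboard computer combining camera, radar and sonar data into a single view of the surroundings. The sketch below is a generic, heavily simplified illustration of that idea, not Tesla's software: detections from each modality are grouped by direction and the nearest obstacle per sector is kept, so redundant readings from different sensors reinforce one another. All function and field names here are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Detection:
    source: str          # which sensor modality reported the object
    bearing_deg: float   # direction relative to the vehicle heading
    distance_m: float    # estimated distance to the object

def fuse(detections: List[Detection], sector_deg: float = 45.0) -> Dict[int, Detection]:
    """Group detections into angular sectors and keep the nearest one per
    sector -- a toy stand-in for merging redundant sensor data into one
    360-degree view of the car's surroundings."""
    nearest: Dict[int, Detection] = {}
    for d in detections:
        sector = int(d.bearing_deg % 360 // sector_deg)
        if sector not in nearest or d.distance_m < nearest[sector].distance_m:
            nearest[sector] = d
    return nearest

# Example: a camera and the radar both see the vehicle ahead; the radar
# (which works through rain or fog) reports the slightly closer estimate,
# while an ultrasonic sensor reports an object close behind the car.
readings = [
    Detection("camera", 2.0, 41.5),
    Detection("radar", 1.0, 39.8),
    Detection("ultrasonic", 180.0, 3.2),
]
for sector, det in sorted(fuse(readings).items()):
    print(f"sector {sector}: {det.source} object at {det.distance_m} m")
```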

Model S and Model X vehicles with this new hardware are already in production.

Before activating the features enabled by the new hardware, Tesla says it will further calibrate the system using millions of miles of real-world driving to ensure significant improvements to safety and convenience.

During this period, Tesla cars with the new hardware will temporarily lack certain features currently available on cars with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control. As these features are robustly validated, they will be enabled over the air, together with a rapidly expanding set of entirely new features.
