
New Tesla models to have ‘full self-driving capability’

October 21, 2016
In its online blog, Tesla says that self-driving vehicles will play a crucial role in improving transportation safety and accelerating the world’s transition to a sustainable future. Full autonomy will enable a Tesla to be substantially safer than a human driver, lower the financial cost of transportation for those who own a car and provide low-cost on-demand mobility for those who do not.

The company has announced that from now on, all Tesla vehicles produced in its factory, including Model 3, will have the hardware needed for full self-driving capability at a safety level greater than that of a human driver. Eight surround cameras provide 360-degree visibility around the car at up to 250 meters of range. Twelve updated ultrasonic sensors complement this vision, allowing for detection of both hard and soft objects at nearly twice the distance of the prior system. A forward-facing radar with enhanced processing provides additional data about the world on a redundant wavelength, capable of seeing through heavy rain, fog, dust and even the car ahead.
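As a rough illustration only (this is not Tesla's software, and the class and field names below are hypothetical), the announced sensor suite can be summarised as a small configuration structure:

```python
from dataclasses import dataclass

@dataclass
class SensorSuite:
    """Illustrative summary of the announced hardware (hypothetical names)."""
    surround_cameras: int     # cameras providing 360-degree visibility
    camera_range_m: int       # maximum stated camera range, in meters
    ultrasonic_sensors: int   # near-field detection of hard and soft objects
    forward_radar: bool       # redundant wavelength; sees through rain, fog, dust

# Values as stated in the announcement.
hw2 = SensorSuite(
    surround_cameras=8,
    camera_range_m=250,
    ultrasonic_sensors=12,
    forward_radar=True,
)

print(hw2)
```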

To make sense of all of this data, a new onboard computer with more than 40 times the computing power of the previous generation runs the new Tesla-developed neural net for vision, sonar and radar processing. Together, this system provides a view of the world that a driver alone cannot access, seeing in every direction simultaneously and on wavelengths that go far beyond the human senses.
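To illustrate the general idea of combining overlapping sensors into a single surround view, here is a minimal, generic Python sketch; it is not Tesla's neural-net pipeline, and the Detection and fuse names are invented for this example:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    source: str          # "camera", "radar", or "ultrasonic"
    bearing_deg: float   # direction relative to the vehicle, 0-360
    range_m: float       # distance to the detected object, in meters

def fuse(detections: List[Detection]) -> List[Detection]:
    """Toy fusion step: keep the closest report in each rough bearing sector.

    A real system would associate and track objects across sensors and over
    time; this only illustrates how overlapping fields of view reinforce
    each other to give full 360-degree coverage.
    """
    best = {}
    for d in detections:
        sector = int(d.bearing_deg // 15) % 24   # 15-degree sectors around the car
        if sector not in best or d.range_m < best[sector].range_m:
            best[sector] = d
    return list(best.values())

readings = [
    Detection("camera", 2.0, 120.0),
    Detection("radar", 1.5, 118.5),       # radar confirms the camera track ahead
    Detection("ultrasonic", 185.0, 1.2),  # close object behind the car
]
print(fuse(readings))
```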

Model S and Model X vehicles with this new hardware are already in production.

Before activating the features enabled by the new hardware, Tesla says it will further calibrate the system using millions of miles of real-world driving to ensure significant improvements to safety and convenience.

During this period, Tesla cars with the new hardware will temporarily lack certain features currently available on cars with first-generation Autopilot hardware, including some standard safety features such as automatic emergency braking, collision warning, lane holding and active cruise control. As these features are robustly validated, they will be enabled over the air, together with a rapidly expanding set of entirely new features.
