EsoftThings works with Renesas’ R-Car to realise autonomous driving
November 8, 2017 Read time: 2 mins
EsoftThings (EST) has announced it has joined Renesas’ R-Car Consortium to accelerate the development of Advanced Driver Assistance Systems (ADAS) and Automated Driving (AD) with computer vision technology optimized for R-Car systems-on-chip (SoCs).

The IMP-X5 computer vision and cognitive accelerator on the R-Car SoCs is designed to process the large volumes of real-time input from the cameras and radar sensors being added to future vehicle models.

EST integrates and delivers a selection of algorithms on the Renesas autonomy platform, using the dedicated on-chip accelerators to achieve high performance at low power consumption. In addition, EST provides training and consulting to Renesas partners and customers to enable optimal use of these solutions in their target applications.

Eric Pinton, director at Renesas' Global ADAS Center, said: "eSoftThings have gained in-depth know-how of our accelerators for sensing and cognitive applications. This helps us to train our customers and partners to implement their solutions effectively, as well as shortening time-to-market. We value eSoftThings as an important partner for our Renesas autonomy platform. Therefore, we are pleased that they are now officially a member of the R-Car Consortium."

Related Content

  • June 18, 2015
    Land Rover demonstrates remote-control Range Rover Sport
    Jaguar Land Rover, part of the UK Autodrive consortium, has demonstrated a remote control Range Rover Sport research vehicle, showing how a driver could drive the vehicle from outside the car via their smartphone. The smartphone app includes control of steering, accelerator and brakes as well as changing from high and low range. This would allow the driver to walk alongside the car, at a maximum speed of 4mph, to manoeuvre their car out of challenging situations safely, or even to negotiate difficult off…
  • April 29, 2015
    Taking the hassle out of parking
    A team of senior electrical and computer engineers from Rice University in Houston, Texas, has developed a new parking technology called ParkiT, with the aim of making it easier to find a parking space in a crowded car park. The team claims the new system is cheaper than sensor technology currently being used and would provide car park managers and attendants with real time information on available parking spaces. That information could then be shared with drivers through electronic signs or a driver-fri…
  • June 30, 2016
    Machine vision’s transport offerings move on apace
    Colin Sowman considers some of the latest advances in camera technology and transport-related vision technology applications. Vision technology in the transportation sector is moving apace as technical developments on both the hardware and software sides combine to make cameras more multifunctional with a single digital camera now able to cover a multitude of tasks.
  • January 25, 2012
    Machine vision - cameras for intelligent traffic management
    For some, machine vision is the coming technology. For others, it’s already here. Although it remains a relative newcomer to the ITS sector, its effects look set to be profound and far-reaching. Encapsulating in just a few short words the distinguishing features of complex technologies and their operating concepts can sometimes be difficult. Often, it is the most subtle of nuances which are both the most important and yet also the most easily lost. Happily, in the case of machine vision this isn’t the case: