
Allied Vision and TORC Robotics help blind driver ‘see’

May 22, 2015 Read time: 3 mins
TORC Robotics has partnered with the Robotics and Mechanisms Laboratory (RoMeLa) at the Virginia Polytechnic Institute and State University (Virginia Tech) with the aim of developing the next generation of National Federation of the Blind (NFB) Blind Driver Challenge vehicles.

The NFB developed the Blind Driver Challenge, which calls upon developers and innovators to create interface technologies that allow people who are blind to drive a car independently. Held at the Daytona Speedway as a pre-race event for the annual Rolex 24 sports car endurance race, the challenge required a blind driver to independently drive the vehicle down the main straight and onto the road course.

Using a crossover SUV, TORC implemented its ByWire drive-by-wire conversion modules, SafeStop wireless emergency stop system, and PowerHub distribution modules on the vehicle. Drive-by-wire gives a driver electronic control of a vehicle. The premise comes from aircraft fly-by-wire systems, in which the pilot’s controls produce electronic signals that are interpreted by computing systems connected to actuators, which in turn move the control surfaces on the wings and tail.
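The principle can be illustrated with a minimal sketch: a computer expresses desired vehicle behaviour as electronic actuator commands. All class and function names below are hypothetical, invented for illustration; the article does not describe TORC’s actual ByWire interface.

```python
# Minimal sketch of the drive-by-wire principle: driving intent becomes
# an electronic command sent to the vehicle's actuators. Hypothetical names.
from dataclasses import dataclass

@dataclass
class DriveCommand:
    throttle: float  # 0.0 (released) to 1.0 (full throttle)
    brake: float     # 0.0 (released) to 1.0 (full braking)
    steering: float  # -1.0 (full left) to 1.0 (full right)

def clamp(value: float, lo: float, hi: float) -> float:
    return max(lo, min(hi, value))

def send_to_actuators(cmd: DriveCommand) -> None:
    # A real system would write these values to the vehicle's actuator
    # bus (e.g. CAN); here we only validate ranges and print the command.
    throttle = clamp(cmd.throttle, 0.0, 1.0)
    brake = clamp(cmd.brake, 0.0, 1.0)
    steering = clamp(cmd.steering, -1.0, 1.0)
    print(f"throttle={throttle:.2f} brake={brake:.2f} steering={steering:.2f}")

send_to_actuators(DriveCommand(throttle=0.3, brake=0.0, steering=-0.1))
```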

Jesse Hurdus, TORC’s project manager for this event, stated, “Cars are much further behind in taking this step.  In order to have an autonomous vehicle, you need to have it so a computer can control the throttle, transmission, and braking systems.  This is drive-by-wire”.

The team also used light detection and ranging (LIDAR), which measures distance by emitting a laser pulse and analysing the reflected light, to determine the obstacles the driver has to drive around. However, LIDAR has difficulty classifying obstacles and differentiating objects such as vegetation from solid objects, which is where Allied Vision’s Prosilica GC1290C camera provided the solution.
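The ranging principle behind LIDAR is simple time-of-flight arithmetic: distance is half the pulse’s round-trip travel time multiplied by the speed of light. A brief illustration, with an invented timing value:

```python
# Time-of-flight ranging as used by LIDAR: d = c * t / 2, since the
# pulse travels to the surface and back. Example value is invented.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after ~200 nanoseconds corresponds to roughly 30 m.
print(f"{lidar_distance(200e-9):.1f} m")
```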

TORC used the camera to help overcome the challenges LIDAR presents, taking sensor data and feeding it into the software to provide an understanding of what is around the vehicle and to detect lane markings. The information is fed back to the autonomous system and provides input to the blind driver so that he or she can keep the vehicle centred within the lane.
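As a hedged sketch of the idea, the vehicle’s offset from the lane centre, inferred from the detected markings, can be turned into a proportional steering cue. The function names and gain below are invented for illustration and are not TORC’s actual software:

```python
# Sketch: convert detected lane markings into a lane-keeping steering cue.
# All positions are lateral offsets in metres within the camera's frame.

def lane_centre_offset(left_marking_x: float, right_marking_x: float,
                       vehicle_x: float) -> float:
    """Signed offset of the vehicle from lane centre (positive = right of centre)."""
    lane_centre = (left_marking_x + right_marking_x) / 2.0
    return vehicle_x - lane_centre

def steering_correction(offset_m: float, gain: float = 0.5) -> float:
    """Proportional correction: steer left when right of centre, and vice versa."""
    return max(-1.0, min(1.0, -gain * offset_m))

# Vehicle 0.4 m right of centre -> gentle steer to the left.
offset = lane_centre_offset(left_marking_x=-1.8, right_marking_x=1.8, vehicle_x=0.4)
print(steering_correction(offset))  # -0.2
```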

The blind driver wears special DriveGrip gloves and sits on a SpeedStrip padded insert on the driver’s seat. The gloves contain small vibrating motors on top of each finger that relay steering information from the autonomous system: vibrations in the gloves signal the direction in which the car needs to be turned. The seat padding contains vibrating motors stretching along the driver’s legs and back that relay the vehicle’s speed information, vibrating to tell the driver to accelerate or brake.
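A rough sketch of how such a haptic encoding might work follows; the specific mappings, scaling, and names are assumptions for illustration, as the article does not specify the actual encoding used by DriveGrip and SpeedStrip:

```python
# Illustrative haptic encoding: steering corrections map to glove
# vibration on the side to turn towards; speed error maps to the seat
# strip (legs = accelerate, back = brake). Mappings are invented.

def glove_vibration(steering: float) -> dict:
    """Vibrate the hand on the side the wheel should be turned towards."""
    strength = min(1.0, abs(steering))
    if steering < 0:
        return {"left_glove": strength, "right_glove": 0.0}
    return {"left_glove": 0.0, "right_glove": strength}

def seat_vibration(current_speed: float, target_speed: float) -> dict:
    """Vibrate along the legs to signal 'accelerate', the back to signal 'brake'."""
    error = target_speed - current_speed
    if error > 0:
        return {"legs": min(1.0, error / 10.0), "back": 0.0}
    return {"legs": 0.0, "back": min(1.0, -error / 10.0)}

print(glove_vibration(-0.2))                               # nudge: turn left
print(seat_vibration(current_speed=25, target_speed=30))   # speed up
```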

While TORC’s systems were developed specifically for the Challenge, they could potentially be used in future solutions. Hurdus concluded, “This was an exploratory effort to see how we could use the cameras to achieve the goal. A person blind from birth was able to drive a vehicle outfitted with sensor technology to give him an understanding of the environment generated by a combination of Allied Vision’s cameras, LIDAR systems, and GPS localisation systems. The fusion of all this data was able to give this person the ability to ‘see’ the environment as a person would be able to see through their own eyes.”
