Allied Vision and TORC Robotics help blind driver ‘see’

May 22, 2015
TORC Robotics has partnered with the Robotics and Mechanisms Laboratory (RoMeLa) at the Virginia Polytechnic Institute and State University (Virginia Tech) with the aim of developing vehicles for the next generation of National Federation of the Blind (NFB) Blind Driver Challenge vehicles.

The NFB developed the Blind Driver Challenge, which calls upon developers and innovators to create interface technologies that allow people who are blind to drive a car independently. Held at the Daytona Speedway as a pre-race event for the annual Rolex 24 sports car endurance race, the challenge required a blind driver to drive the vehicle independently down the main straight and onto the road course.

Using a crossover SUV, TORC implemented its ByWire drive-by-wire conversion modules, SafeStop wireless emergency stop system, and PowerHub distribution modules on the vehicle.  Drive-by-wire gives a driver electronic control of a vehicle.  The premise comes from the fly-by-wire system, where an aircraft’s controls produce electronic signals which are read and put through computing systems connected to actuators that control the surfaces of the wings and tail.
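
The core idea is easy to illustrate: software issues throttle, brake and steering set-points, and an actuator controller turns them into vehicle motion. The Python below is a simplified, hypothetical sketch of such a command layer, not TORC's actual ByWire interface; the value ranges and the throttle/brake interlock are assumptions for illustration only.

```python
# Hypothetical drive-by-wire command layer (illustrative only, not TORC's ByWire API):
# software produces throttle, brake and steering set-points for the actuators.
from dataclasses import dataclass


@dataclass
class DriveCommand:
    throttle: float  # 0.0 (released) .. 1.0 (fully applied)
    brake: float     # 0.0 .. 1.0
    steering: float  # -1.0 (full left) .. 1.0 (full right)


def clamp(value: float, lo: float, hi: float) -> float:
    """Keep a command within the range the actuators accept."""
    return max(lo, min(hi, value))


def make_command(throttle: float, brake: float, steering: float) -> DriveCommand:
    # Assumed safety rule: never apply throttle and brake together.
    if brake > 0.0:
        throttle = 0.0
    return DriveCommand(clamp(throttle, 0.0, 1.0),
                        clamp(brake, 0.0, 1.0),
                        clamp(steering, -1.0, 1.0))


if __name__ == "__main__":
    print(make_command(throttle=0.3, brake=0.0, steering=-0.2))
```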

Jesse Hurdus, TORC’s project manager for this event, stated, “Cars are much further behind in taking this step.  In order to have an autonomous vehicle, you need to have it so a computer can control the throttle, transmission, and braking systems.  This is drive-by-wire”.

The team also used light detection and ranging (LIDAR) which measures distance by emitting a laser pulse and analysing the reflected light to determine the obstacles a driver has to drive around.  However, LIDAR has difficulty with classifying obstacles and differentiating objects such as vegetation from other solid objects, which is where Allied Vision’s Prosilica GC1290C camera provided the solution.
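
The ranging principle itself is straightforward: the distance to a reflecting surface follows from the round-trip time of the laser pulse. The short calculation below illustrates the principle only, not any specific sensor.

```python
# LIDAR ranging principle: distance = (speed of light x round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second


def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0


# A pulse returning after 200 nanoseconds corresponds to roughly 30 m.
print(round(range_from_time_of_flight(200e-9), 1))  # 30.0
```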

TORC used the camera to help overcome the challenges LIDAR presents: camera data is fed into the software to build an understanding of what is around the vehicle and to detect lane markings. This information is fed back to the autonomous system, which provides input to the blind driver so that he or she can keep the vehicle centred within the lane.
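
As a rough illustration of what such a camera pipeline can involve, the sketch below extracts edges from an image and fits straight line segments to them, a common way of finding painted lane markings. It is an assumption based example using OpenCV, not TORC's actual software.

```python
# Illustrative lane-marking detection: edge extraction plus line fitting.
import numpy as np
import cv2  # OpenCV


def detect_lane_segments(frame_bgr: np.ndarray):
    """Return candidate lane-marking segments as (x1, y1, x2, y2) tuples."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=40, minLineLength=40, maxLineGap=10)
    return [] if segments is None else [tuple(s[0]) for s in segments]


if __name__ == "__main__":
    # Synthetic test frame: a dark road surface with one bright painted line.
    frame = np.zeros((240, 320, 3), dtype=np.uint8)
    cv2.line(frame, (60, 230), (160, 20), (255, 255, 255), 5)
    print(detect_lane_segments(frame)[:1])
```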

The blind driver wears special DriveGrip gloves and sits on a SpeedStrip padded insert on the driver’s seat.  The gloves contain small vibrating motors on top of each finger which relay steering information from the autonomous system; vibrations in the gloves signal the direction in which the car needs to be turned.  The padding on the driver’s seat also contains vibrating motors stretching along the driver’s legs and back, which relay the vehicle’s speed information and vibrate to tell the driver to accelerate or brake.
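
The article does not describe exactly how the guidance is encoded as vibration, so the sketch below is purely illustrative: it maps a normalised steering correction to per-finger motor intensities, with more fingers vibrating as the needed correction grows.

```python
# Hypothetical haptic steering cue (not the actual DriveGrip encoding):
# larger corrections activate more vibration motors on the relevant hand.
def glove_vibration(steering_error: float, fingers_per_hand: int = 4):
    """Map a steering error (-1 full left .. +1 full right) to motor intensities.

    Returns (left_hand, right_hand) lists of per-finger intensities in 0..1.
    """
    magnitude = min(1.0, abs(steering_error))
    active = round(magnitude * fingers_per_hand)
    pattern = [1.0 if i < active else 0.0 for i in range(fingers_per_hand)]
    idle = [0.0] * fingers_per_hand
    return (pattern, idle) if steering_error < 0 else (idle, pattern)


# A moderate correction to the right vibrates two fingers of the right hand.
print(glove_vibration(0.5))  # ([0.0, 0.0, 0.0, 0.0], [1.0, 1.0, 0.0, 0.0])
```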

While TORC’s systems were developed specifically for the Challenge, they can potentially be applied to future solutions.  Hurdus concluded, “This was an exploratory effort to see how we could use the cameras to achieve the goal.  A person blind from birth was able to drive a vehicle outfitted with sensor technology to give him an understanding of the environment generated by a combination of Allied Vision’s cameras, LIDAR systems, and GPS localisation systems.  The fusion of all this data was able to give this person the ability to ‘see’ the environment as a person would be able to see through their own eyes.”

    Q-Free’s Marco Sinnema looks at how the commoditisation of high-quality vision-based solutions is widening their application. Machine vision technology’s entry into the ITS/traffic management sector has followed a classic top-down path. This is unsurprising given the extremely demanding performance criteria which are the standard in its market of origin, manufacturing processing. Very high image qualities combined with frame rates often in the hundreds per second range resulted in vision systems with capabi