
SmartDrive launches new suite of sensors to tackle high-risk driving behaviour

SmartDrive Systems has introduced SmartSense for Distracted Driving (SSDD), the first in a new line of intelligent sensors designed to identify dangerous driving habits and intervene with drivers before a catastrophic event occurs. It uses computer vision-based algorithms, together with SmartDrive's video analytics platform, to recognise when a driver is distracted.
November 2, 2017 Read time: 3 mins


When combined with the SmartDrive program and its extended recording capability, SSDD shows fleets what led to the driver's distraction, how it manifested and the outcome, enabling them to provide detailed feedback and actionable coaching to improve driver safety.

The purpose-built sensors fuse their output with engine computer data, telematics, accelerometer readings and SmartDrive analytics data. Using a reviewed video training database of more than 200 million analysed driving events, the sensor's algorithms can be tuned to optimise triggering efficacy and system performance.
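The idea of tuning trigger behaviour against a reviewed event database can be sketched as a simple calibration loop. This is purely illustrative and not SmartDrive's actual method: the scores, labels and candidate thresholds below are hypothetical, and the sketch assumes each reviewed event has been reduced to a single risk score.

```python
# Illustrative sketch: pick the trigger threshold that performs best (by F1)
# against events labelled by human reviewers. All data here is hypothetical.
def tune_threshold(events, candidates):
    """events: list of (risk_score, is_distraction) pairs from reviewed footage.
    candidates: threshold values to try. Returns the best-scoring threshold."""
    def f1(threshold):
        tp = sum(1 for score, label in events if score >= threshold and label)
        fp = sum(1 for score, label in events if score >= threshold and not label)
        fn = sum(1 for score, label in events if score < threshold and label)
        if tp == 0:
            return 0.0
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return 2 * precision * recall / (precision + recall)

    return max(candidates, key=f1)

# Toy labelled data: (score assigned by the sensor, reviewer's verdict).
labelled = [(0.9, True), (0.8, True), (0.7, False), (0.4, True), (0.2, False)]
best = tune_threshold(labelled, [0.3, 0.5, 0.75])  # → 0.3 on this toy data
```

In practice a fleet-scale system would optimise over far richer signals than a single score, but the principle (calibrating triggers against human-verified ground truth) is the same.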

SSDD interprets driver cues proven to indicate distraction, such as head and eye movements, and triggers a video whenever distraction, inattention or drowsiness is detected. The video is prioritised and offloaded for immediate verification and intervention, allowing fleets to act quickly.
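The cue-to-trigger flow described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not SmartDrive's algorithm: the cue fields, thresholds and priority values are all assumptions for the sake of the example.

```python
# Illustrative sketch of cue-based event triggering. Field names and
# thresholds are hypothetical, not SmartDrive's actual parameters.
from dataclasses import dataclass
from typing import Optional

GAZE_OFF_ROAD_SECONDS = 2.0   # sustained eyes-off-road time before triggering
HEAD_YAW_LIMIT_DEG = 30.0     # head turned further than this counts as inattention

@dataclass
class DriverCues:
    eyes_off_road_s: float    # continuous seconds with gaze away from the roadway
    head_yaw_deg: float       # head rotation from straight ahead, in degrees
    eyelid_closure: float     # 0.0 fully open .. 1.0 fully closed (drowsiness proxy)

def classify_event(cues: DriverCues) -> Optional[str]:
    """Return an event label if the cues indicate risk, else None."""
    if cues.eyelid_closure > 0.8:
        return "drowsiness"
    if cues.eyes_off_road_s >= GAZE_OFF_ROAD_SECONDS:
        return "distraction"
    if abs(cues.head_yaw_deg) > HEAD_YAW_LIMIT_DEG:
        return "inattention"
    return None

def priority(label: str) -> int:
    """Higher number = video offloaded for human review sooner."""
    return {"drowsiness": 3, "distraction": 2, "inattention": 1}[label]

event = classify_event(DriverCues(eyes_off_road_s=2.5, head_yaw_deg=5.0,
                                  eyelid_closure=0.1))  # → "distraction"
```

A real in-cab system would run these checks continuously on computer-vision output and attach the triggered video clip to the prioritised event before offload; the sketch only shows the classification step.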

Other features include purpose-built hardware with infrared sensors that capture distraction even when the driver is wearing sunglasses, and in-cab alerts when distraction or inattention occurs. The system also provides prioritised review and risk scoring for distraction video events, and integrates with the SmartDrive video safety program.

Video evidence from the SmartDrive library has revealed that drivers who engage in distracted driving frequently over-rely on their ability to respond to dangerous situations should they occur, for example by putting themselves in what they perceive as safe modes before texting. In these situations, drivers move to the right lane, set cruise control at or below the speed limit, and position themselves where surrounding traffic is light or at a following distance that appears safe. Drivers also regularly misjudge the length and frequency of their distraction, texting for longer than they estimate and diverting their eyes from the road more often, and for more time, than they perceive.

Steve Mitgang CEO of SmartDrive, said: “It’s estimated that distracted driving accounts for 10% of all fatal crashes and 17% of all collisions that cause injuries—at a cost of at least $129 billion annually. Given the difficulty of proving distraction as a root cause, these numbers are probably low. With SmartSense for Distracted Driving, we’re tackling this issue head-on by delivering an intelligent sensor tuned specifically to this risk. And, because it’s delivered with our video safety program, fleets finally have both a comprehensive view of the frequency, severity and impact of distracted driving, and a solution to an industry epidemic that costs money and lives.”

