
SmartDrive launches new suite of sensors to tackle high-risk driving behaviour

SmartDrive Systems has introduced SmartSense for Distracted Driving (SSDD), the first in a new line of intelligent sensors designed to identify dangerous driving habits and intervene with drivers before a catastrophic event occurs. It uses computer vision-based algorithms along with SmartDrive’s video analytics platform to recognise when a driver is distracted.
November 2, 2017 Read time: 3 mins


When combined with the SmartDrive program and its extended recording capability, SSDD shows fleets what led the driver to distraction, how it manifested and the outcome, enabling them to provide detailed feedback and actionable coaching to improve driver safety.

The purpose-built sensors combine their output with engine computer data, telematics, accelerometer readings and SmartDrive analytics. Using a reviewed video and training database of more than 200 million analysed driving events, the sensor’s algorithms can be tuned to optimise triggering efficacy and system performance.
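As an illustration only (the article does not describe SmartDrive’s internals), the sketch below shows how readings from the in-cab sources listed above might be fused into a single distraction score before an event is flagged; the field names, weights and thresholds are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical snapshot of the inputs the article lists: engine computer
# data, telematics, accelerometer readings and the vision-based cues.
@dataclass
class SensorFrame:
    speed_kmh: float          # from engine computer / telematics
    lateral_accel_g: float    # from the accelerometer
    eyes_off_road_s: float    # from the computer-vision sensor
    head_pose_deg: float      # head yaw away from straight ahead

def distraction_score(frame: SensorFrame) -> float:
    """Combine cues into a single 0..1 risk score (illustrative weights)."""
    score = 0.0
    if frame.eyes_off_road_s > 2.0:       # sustained gaze away from the road
        score += 0.5
    if abs(frame.head_pose_deg) > 30:     # head turned well away from forward
        score += 0.3
    if frame.speed_kmh > 80:              # weight risk up at highway speed
        score += 0.1
    if abs(frame.lateral_accel_g) > 0.3:  # lane drift or sudden correction
        score += 0.1
    return min(score, 1.0)
```

Tuning against a labelled library of reviewed driving events, as the article describes, would amount to adjusting weights and thresholds like these to balance missed detections against false triggers.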

SSDD interprets driver cues proven to indicate distraction, such as head and eye movements, and triggers a video whenever distraction, inattention or drowsiness is detected. The video is prioritised and offloaded for immediate verification and intervention, allowing fleets to act quickly.
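The trigger-and-offload flow described above could be sketched as a simple priority queue; the class, field names and scores below are assumptions for illustration, not SmartDrive’s API.

```python
import heapq
from typing import Optional

class EventQueue:
    """Priority queue that offloads the riskiest distraction clips first."""

    def __init__(self) -> None:
        self._heap = []
        self._counter = 0  # tie-breaker so equal scores pop in trigger order

    def trigger(self, score: float, video_id: str) -> None:
        # Negate the score so the highest-risk event is popped first.
        heapq.heappush(self._heap, (-score, self._counter, video_id))
        self._counter += 1

    def next_for_review(self) -> Optional[str]:
        # The clip handed to a reviewer for verification and intervention.
        return heapq.heappop(self._heap)[2] if self._heap else None

queue = EventQueue()
queue.trigger(0.9, "cam-017/clip-0042")  # e.g. drowsiness detected at speed
queue.trigger(0.4, "cam-003/clip-0108")  # e.g. a brief glance away from the road
print(queue.next_for_review())           # the highest-risk clip is offloaded first
```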

Other features include purpose-built hardware with infrared sensors that capture distraction even when sunglasses are worn, as well as in-cab alerts when distraction or inattention occurs. In addition, it offers prioritised review and risk scoring for video distraction events and integrates with the SmartDrive video safety program.

Video evidence from the SmartDrive library shows that drivers who engage in distracted driving frequently over-rely on their ability to respond to dangerous situations should they occur, for example by putting themselves in perceived safe modes before texting. In these situations, drivers move to the right lane, use cruise control at or below the speed limit, and position themselves where surrounding traffic is light or at a following distance that appears safe. Drivers also regularly misjudge the length and frequency of their distraction, texting for longer than they estimate and diverting their eyes from the road more often and for more time than they perceive.

Steve Mitgang, CEO of SmartDrive, said: “It’s estimated that distracted driving accounts for 10% of all fatal crashes and 17% of all collisions that cause injuries—at a cost of at least $129 billion annually. Given the difficulty of proving distraction as a root cause, these numbers are probably low. With SmartSense for Distracted Driving, we’re tackling this issue head-on by delivering an intelligent sensor tuned specifically to this risk. And, because it’s delivered with our video safety program, fleets finally have both a comprehensive view of the frequency, severity and impact of distracted driving, and a solution to an industry epidemic that costs money and lives.”

