MATE Version 4.0

MATE-Intelligent Video has released Version 4.0 of its intelligent video analytics system and a new intelligent video encoder, TriggerNG (Next Generation), a transmission device with 4CIF/CIF resolution and MJPEG/MPEG4 video streaming.
February 3, 2012
The TriggerNG supports eight different video analytics detection features, object classification and an analogue output with a video analytics overlay.

In Version 4.0, MATE unveils Rule Dependency, a feature that lets users combine multiple detection rules with logical operators, along with custom naming of alarms to give end users greater control. MATE has also increased channel capacity, offering systems able to analyse up to 12 analogue or IP channels on a single server.
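
MATE does not describe how Rule Dependency is configured, so the sketch below is only an illustration of the general idea, assuming a hypothetical model in which detection rules are predicates combined with AND/OR logic and attached to a custom-named alarm; none of the names or functions here come from MATE's product.

```python
# Hypothetical illustration only -- not MATE's actual API.
# Shows the general idea of Rule Dependency: combining detection
# rules with logical operators and attaching a custom alarm name.
from dataclasses import dataclass
from typing import Callable, Dict

# A detection rule is modelled as a predicate over a video-analytics event.
Rule = Callable[[Dict], bool]

def rule_all(*rules: Rule) -> Rule:
    """Fire only when every combined rule matches (logical AND)."""
    return lambda event: all(r(event) for r in rules)

def rule_any(*rules: Rule) -> Rule:
    """Fire when at least one combined rule matches (logical OR)."""
    return lambda event: any(r(event) for r in rules)

@dataclass
class Alarm:
    name: str          # custom, user-defined alarm name
    condition: Rule    # combined rule dependency

    def evaluate(self, event: Dict) -> bool:
        return self.condition(event)

# Example: raise "Loading bay intrusion" only when an object is both
# classified as a person and detected inside the watched zone.
is_person = lambda e: e.get("class") == "person"
in_zone = lambda e: e.get("zone") == "loading_bay"

alarm = Alarm("Loading bay intrusion", rule_all(is_person, in_zone))
print(alarm.evaluate({"class": "person", "zone": "loading_bay"}))  # True
```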

In keeping with its open architecture principle, MATE continues to support previously integrated platforms while adding new technology partners. Version 4.0 is fully integrated with American Dynamics VideoEdge, Hirsch Velocity, the IQeye 753 series and DVTel 5.3.
