
Vision technology is bringing 2024 into sharp focus

What vision trends should we be looking out for? AI? Autonomous vehicles? Video analytics? Let’s ask the experts
By Adam Hill January 9, 2024 Read time: 4 mins
Urban mobility is increasingly complex (© Dimitar Lazarov | Dreamstime.com)

Predicting the future is famously fraught with peril, particularly in fast-moving areas of technology. For example, this time last year everyone had heard of artificial intelligence but relatively few people were au fait with ChatGPT. The future comes at you quickly. 

But with those caveats, it’s worth doing a bit of horizon-scanning, looking ahead just a short distance to see the trends or technologies that vision experts expect to come to the fore in 2024.

“The adoption of computer vision technologies is continuing to grow, and 2024 could represent a breakout year,” says Anthony Incorvati, segment development manager at Axis Communications.

 

“The one trend that we believe in and are actually developing is video analytics” Jerry Diaz, Ekin Americas


The use of vision technology to improve road safety is particularly prevalent in today’s thinking on mobility. “Computer vision sensors coupled with specialised AI provide real-time infrastructural insight and situational awareness to connected and autonomous vehicles, thereby predicting and detecting conflicts with pedestrians, vehicles or other vulnerable road users [VRUs], to avoid collisions,” Incorvati continues. 

“In addition, the technology supports adaptive traffic management by detecting and tracking vehicles, VRUs and safety conditions. This enables adaptive traffic signal actuation based on safety events like red-light running or jaywalking. The technology integrates with industry-standard signal controllers.”
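As a rough illustration of the actuation step Incorvati describes, the sketch below maps vision-detected safety events to signal-controller commands. It is a minimal sketch only: the event types, confidence threshold and command strings are hypothetical, not part of any Axis product or industry-standard controller interface.

```python
# Minimal sketch: vision-detected safety events driving adaptive signal actuation.
# Event types, thresholds and command strings are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum, auto


class EventType(Enum):
    RED_LIGHT_RUNNING = auto()
    JAYWALKING = auto()
    VRU_IN_CROSSWALK = auto()


@dataclass
class SafetyEvent:
    event_type: EventType
    approach: str       # e.g. "northbound"
    confidence: float   # detector confidence, 0..1


def actuate(event: SafetyEvent, min_confidence: float = 0.8) -> str:
    """Map a detected safety event to a (hypothetical) controller command."""
    if event.confidence < min_confidence:
        return "NO_ACTION"
    if event.event_type is EventType.RED_LIGHT_RUNNING:
        # Hold the all-red interval briefly to protect conflicting traffic.
        return f"EXTEND_ALL_RED:{event.approach}"
    if event.event_type in (EventType.JAYWALKING, EventType.VRU_IN_CROSSWALK):
        # Extend the pedestrian clearance interval on the affected approach.
        return f"EXTEND_PED_CLEARANCE:{event.approach}"
    return "NO_ACTION"


if __name__ == "__main__":
    print(actuate(SafetyEvent(EventType.RED_LIGHT_RUNNING, "northbound", 0.93)))
```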
 

Private and public
 

Teledyne Imaging sees two trends continuing over the next 12 months: the first is additional vision sensor modules being integrated into autonomous vehicles. “This is true both for private transportation, like a car, and for efforts into growing autonomous public transportation,” says Manny Romero, the firm’s senior product manager.

“The second trend is the continuously growing usage and/or investigation of AI in ITS in general. This might be more edge-based in smart transportation, but can be cloud-based for non-critical decision-making.”

 

“As city traffic systems continue to grow in complexity, traffic governance has become a critical area for video technology applications” Jayden Xu, Hikvision


As more city authorities around the world encourage active travel, there will be other applications, thinks Incorvati. “Computer vision is also being applied to crosswalks to detect pedestrians and other VRUs, predict their intent-to-cross and apply flashing strobes at the right time,” he points out. “This removes the need for pedestrians to push a button, and instead the system automatically and preemptively alerts nearby vehicles—ultimately improving VRU safety and ensuring effective traffic control around crossings.”
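The intent-to-cross logic can be pictured as a simple time-to-kerb check on each tracked pedestrian. The sketch below is a hypothetical illustration of that idea; the track fields, prediction horizon and strobe command are assumptions, not the deployed system Incorvati refers to.

```python
# Hypothetical sketch of intent-to-cross prediction gating crosswalk strobes.
from dataclasses import dataclass


@dataclass
class PedestrianTrack:
    distance_to_kerb_m: float     # metres from the crossing entry point
    speed_toward_kerb_mps: float  # positive when moving toward the crossing


def predict_intent_to_cross(track: PedestrianTrack, horizon_s: float = 3.0) -> bool:
    """Simple heuristic: will the pedestrian reach the kerb within the horizon?"""
    if track.speed_toward_kerb_mps <= 0:
        return False
    time_to_kerb = track.distance_to_kerb_m / track.speed_toward_kerb_mps
    return time_to_kerb <= horizon_s


def strobes_on(tracks: list[PedestrianTrack]) -> bool:
    # Activate the flashing strobes if any tracked pedestrian is predicted to cross.
    return any(predict_intent_to_cross(t) for t in tracks)


if __name__ == "__main__":
    print(strobes_on([PedestrianTrack(2.4, 1.2)]))  # True: roughly 2 s to the kerb
```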

Monitoring speeding and traffic light violations will remain key (© Leung Cho Pan | Dreamstime.com)


Meanwhile, video analytics will be key for Ekin next year. “The one trend that we believe in and are actually developing is video analytics,” says Jerry Diaz, president of Ekin Americas. “This will allow us to eliminate hardware components that impede the continuous growth of AI. Software is a limitless evolution while hardware is finite. The future, in our case the present, is to utilise video analytics for traffic management, enforcement, civil security and school safety.”
 

Driver monitoring systems

Another potentially significant trend in vision technology within ITS in 2024 is advanced vision-based driver monitoring systems (DMS), according to Tattile. “The integration of more sophisticated DMS using advanced vision technology is expected to become increasingly prevalent in vehicles and transportation systems,” thinks Tattile CTO Alex Filippini.

 

"The integration of more sophisticated driver monitoring systems using advanced vision technology is expected to become increasingly prevalent” Alex Filippini, Tattile


“These systems typically utilise cameras and AI computer vision algorithms to monitor the driver's attention, behaviour and overall state while operating a vehicle. Advanced vision-based DMS is likely to be a pivotal component in ensuring safer and more reliable transportation systems, and their increased implementation is expected to be a key trend in the vision tech aspect of intelligent transportation by 2024.”
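At its simplest, the per-frame outputs of such a vision model can be aggregated into driver-state alerts. The sketch below illustrates that aggregation step using common DMS measures such as PERCLOS (percentage of eye closure); the class names, window and thresholds are assumptions, not Tattile's implementation.

```python
# Minimal sketch of DMS alerting: per-frame gaze and eye-closure flags (assumed
# to come from an upstream vision model) are aggregated over a rolling window.
from collections import deque


class DriverMonitor:
    def __init__(self, fps: float = 30.0, window_s: float = 10.0):
        self.frames = deque(maxlen=int(fps * window_s))  # rolling window of frames

    def update(self, eyes_on_road: bool, eyes_closed: bool) -> str:
        self.frames.append((eyes_on_road, eyes_closed))
        if len(self.frames) < self.frames.maxlen:
            return "OK"                      # not enough history yet
        off_road = sum(1 for on, _ in self.frames if not on) / len(self.frames)
        perclos = sum(1 for _, closed in self.frames if closed) / len(self.frames)
        if perclos > 0.30:                   # eyes closed >30% of the window
            return "DROWSINESS_ALERT"
        if off_road > 0.50:                  # gaze off road >50% of the window
            return "DISTRACTION_ALERT"
        return "OK"


if __name__ == "__main__":
    dm = DriverMonitor(fps=2, window_s=2)    # tiny window so the demo triggers quickly
    for _ in range(4):
        print(dm.update(eyes_on_road=False, eyes_closed=False))
```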

To address this need, in addition to existing solutions for monitoring speeding and traffic light violations, Tattile says it is developing solutions integrated with its cameras, entirely based on AI, such as vehicle inside inspection (VII) and advanced tracking modelling (ATM).
 

“[One trend is] the additional vision sensor module being integrated into autonomous vehicles” Manny Romero, Teledyne Imaging

 

Finally, for its part, Hikvision anticipates the integration of radar and camera systems coming to the fore in 2024. “Radar has shown great advantages in object detection and movement tracking, providing accurate, long-range detection, unaffected by weather conditions,” says Jayden Xu, the company’s senior ITS solution manager. 

“By merging radar and video into multi-dimensional camera systems, they surpass the limitations of conventional one-dimensional video perception. With enhanced perception capabilities for traffic systems, camera-radar fusion demonstrates vast potential in improving traffic safety management, from early obstacle detection to identifying speed violations, resulting in safer roads and well-maintained environments.”
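One way to picture the fusion Xu describes is nearest-neighbour association on bearing: the radar contributes accurate range and speed, the camera contributes classification. The sketch below is an illustrative assumption of that pairing step, not Hikvision's design; the field names and angular threshold are hypothetical.

```python
# Illustrative sketch of camera-radar fusion by nearest-neighbour bearing association.
from dataclasses import dataclass


@dataclass
class RadarTrack:
    bearing_deg: float   # azimuth of the radar return
    range_m: float       # radar excels at range and speed, in any weather
    speed_mps: float


@dataclass
class CameraDetection:
    bearing_deg: float   # azimuth derived from the bounding-box centre
    label: str           # cameras excel at classification, e.g. "car", "pedestrian"


def fuse(radar: list[RadarTrack],
         camera: list[CameraDetection],
         max_gap_deg: float = 2.0) -> list[dict]:
    """Attach each radar track's range and speed to the closest camera detection by bearing."""
    fused = []
    for r in radar:
        best = min(camera,
                   key=lambda c: abs(c.bearing_deg - r.bearing_deg),
                   default=None)
        if best and abs(best.bearing_deg - r.bearing_deg) <= max_gap_deg:
            fused.append({"label": best.label,
                          "range_m": r.range_m,
                          "speed_mps": r.speed_mps})
    return fused


if __name__ == "__main__":
    print(fuse([RadarTrack(10.1, 84.0, 22.5)], [CameraDetection(10.4, "car")]))
```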


Towards intelligent automation...
 

This is part of an ITS trend which emphasises smart traffic and highways management – sometimes using AI and Internet of Things (IoT) tech to create more efficient and convenient traffic services. 

“As city traffic systems continue to grow in complexity, traffic governance has become a critical area for video technology applications, where technologies like AI and IoT are being utilised to facilitate real-time monitoring, violation detection, and traffic flow control, enhancing the responsiveness of traffic management,” Xu continues. 

“Meanwhile, the future of highways is shifting towards intelligent automation with smart traffic systems, being capable of identifying hazards, aiding in route selection, and even facilitating autonomous driving.”
