
5G-Safe project developing road weather services based on vehicle data

April 21, 2017 | Read time: 2 mins
VTT Technical Research Centre of Finland is coordinating the 5G-Safe project, which is part of Tekes’ Challenge Finland competition. It is focused on the identification of local weather and road conditions on the basis of data collected from vehicles, and the sending of warnings to road users. In addition, real-time video and radar data will be exchanged between passing vehicles.


Other issues being investigated include the use of data on local road weather conditions to improve the situational awareness of autonomous vehicles and the enhancement of autonomous operation in harsh weather.

The services being developed require no action from the driver in order to send data or receive warnings. Instead, the prevailing local weather and road conditions are identified automatically from data collected from vehicles. Warnings and other useful information are sent in real time to road users, road operators and autonomous vehicle control systems. The new network and cloud computing technologies being researched in the project aim to reduce delays in data exchange and to scale better than current services.

According to Tiia Ojanperä, project manager at VTT, the wide introduction of real-time services based on sensor and video data collected from vehicles is being made possible by next-generation 5G mobile network technology and new solutions supporting optimal data collection and exchange.

As an example, she says 5G will form the cornerstone of interaction between autonomous cars. Contemporary driver support systems are mainly vision-based, relying on signals generated by the vehicle’s own sensors. 5G and short-range radios will also bring the power of speech and hearing to vehicles, taking their capabilities to a new level, she claims.

