Q-Free pioneers next-generation road user charging (RUC) for private vehicles

April 24, 2025 Read time: 2 mins

Since 1984, Q-Free has been a leader in tolling solutions, and now the company is driving innovation in road user charging (RUC) — a smarter, more flexible way to pay for road usage. Unlike traditional tolling, RUC calculates fees based on distance driven, with dynamic pricing for factors like rush hour congestion or urban vs rural travel. It also shifts revenue focus, covering external costs like accidents, noise, and delays rather than just infrastructure.
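The pricing model described above, distance-based fees with dynamic multipliers for congestion and location, can be sketched in a few lines. This is an illustrative toy model only; the rates, multipliers, and categories are hypothetical and are not Q-Free's actual pricing scheme.

```python
# Toy sketch of a distance-based road user charge with dynamic pricing.
# All rates and multipliers are hypothetical illustrations.
def ruc_fee(km: float, rush_hour: bool, urban: bool) -> float:
    base_rate = 0.05          # hypothetical per-km base rate
    multiplier = 1.0
    if rush_hour:
        multiplier *= 1.5     # congestion surcharge during peak hours
    if urban:
        multiplier *= 1.2     # higher external costs (noise, accidents) in cities
    return round(km * base_rate * multiplier, 2)

# Same 20 km trip, priced differently by time and place:
print(ruc_fee(20, rush_hour=True, urban=True))    # urban peak
print(ruc_fee(20, rush_hour=False, urban=False))  # rural off-peak
```

The point of the sketch is that, unlike a flat toll, the same distance yields different charges depending on when and where it is driven.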

With fuel tax revenues declining due to electric vehicles and reduced car ownership, RUC offers a sustainable alternative. However, existing heavy-vehicle RUC solutions are too bulky, costly, and power-intensive for private cars. Q-Free is changing that with a next-generation GNSS-based tag, the Tag4All, which will be launched in Seville.

Tag4All is compact, battery-powered, and easy to install. The cable-free device removes key barriers to RUC adoption while taking privacy to a new level: no detailed location information is shared, making it fully compliant with GDPR, and it also offers standards-compliant DSRC technology.
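One way such a GDPR-friendly design can work (a hypothetical sketch, not a description of Q-Free's actual protocol) is for the tag to aggregate distance on-device per charging category and report only those totals, so raw GNSS coordinates never leave the device. The class and category names below are illustrative assumptions.

```python
# Hypothetical sketch: on-device aggregation for privacy-preserving RUC.
# Raw position data stays on the tag; only per-category distance totals
# are ever reported to the charging back end.
from collections import defaultdict


class PrivacyPreservingTag:
    def __init__(self):
        # km driven per (zone_type, period) charging category
        self._totals = defaultdict(float)

    def record_segment(self, km: float, zone_type: str, period: str):
        # The GNSS fix is used locally to classify the segment,
        # then discarded; only the distance total is retained.
        self._totals[(zone_type, period)] += km

    def billing_report(self) -> dict:
        # Only aggregate distances are shared, never coordinates.
        return dict(self._totals)


tag = PrivacyPreservingTag()
tag.record_segment(10.0, "urban", "peak")
tag.record_segment(5.0, "urban", "peak")
tag.record_segment(30.0, "rural", "offpeak")
print(tag.billing_report())
```

The back end can compute the fee from these totals alone, which is what allows distance-based charging without detailed location tracking.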

RUC implementations vary by market, with governments typically overseeing revenue collection. Q-Free’s development is backed by key stakeholders, including the Norwegian Research Council, SINTEF, and the Norwegian Public Roads Administration. Following successful large-scale pilots, the company is refining the technology for broader adoption.

“With Tag4All, we’ve developed a solution that combines advanced technology with user simplicity. Our goal has been to make road user charging easy to use for private vehicles without compromising on privacy or reliability,” says Ola Martin Lykkja, RUC 2.0 Project Manager.

Q-Free is inviting delegates to its stand at the ITS European Congress to learn more about RUC 2.0, Tag4All and how it’s shaping the future of road charging.

Stand: D4
