Q-Free unveils futuristic Q-City virtual reality experience

Q-Free broke the mould when it unveiled Q-City at 2014’s Intertraffic. A computerised rendering of a modern urban area, Q-City lets users see how the company’s large suite of ITS products works together to make roads safer, cleaner and less congested. At this year’s show, Q-Free and Q-City have gone a step further: visitors can now enjoy a fully immersive virtual reality tour.
April 4, 2016 Read time: 2 mins
Jenny Simonsen of Q-Free
Q-City brings the process of understanding ITS into the 21st century. Starting from a bird’s-eye view, users can zoom in and out to explore application areas such as tolling, traffic management, parking and infomobility, and to see how these previously discrete sectors have converged to become more holistic and connected. The new virtual reality experience enables users to stand at street level and gain an even more hands-on perspective.

“We’re a technology innovator, so it makes sense to use technology to demonstrate what we do,” says Jenny Simonsen, Q-Free’s Global Director Marketing & Communication. “It’s more than just a gimmick. By being able to move quickly around a cityscape, either alone or in the company of our technology experts, it’s possible to gain a real feel for what ITS can do far more quickly than might otherwise be the case.

“The virtual reality tours aren’t the only way in which Q-City has evolved. Q-Free has spent a lot of time since the last Intertraffic expanding and fine-tuning its portfolio. We’ve needed to reflect the new additions and the finessing which has occurred,” says Simonsen. “This latest version of Q-City is right up to date and features all of our products and services.”

Q-City also forms the centrepiece of a group experience here at Intertraffic. Each day at 3pm, the company’s Chief Technologist, Knut Evensen, will use it to give a guided tour of the company’s ITS capabilities, followed by drinks and networking opportunities.

    Cars are starting to learn to understand the language of pointing – something that our closest relative, the chimpanzee, cannot do. And such image recognition technology has profound mobility implications, says Nils Lenke Pointing at objects – be it with language, using gaze, gestures or eyes only – is a very human ability. However, recent advances in technology have enabled smart, multimodal assistants - including those found in cars - to action similar pointing capabilities and replicate these human qual