Motional VR environments aid AV research 

VR environments include parked cars, swaying trees and birds chirping
By Ben Spencer December 15, 2021 Read time: 2 mins
Motional scenarios show AV and manually-driven car stopping at an intersection (image credit: Motional)

Motional has made its nuReality virtual reality (VR) environments open source to help accelerate research on the interaction between autonomous vehicles (AV) and pedestrians.

Motional says nuReality is a set of VR experiences it developed for its expressive robotics research, which explores how to train robots to respond to their environment the way a person would. 

The company developed nuReality to understand how expressive AV behaviours, such as flashing lights, deliberate sounds and exaggerated braking, can aid human-machine communication and signal a vehicle's intentions to pedestrians. 

Using expressive behaviours to help AVs communicate in crossing situations enables pedestrians to understand an AV's intent more quickly and to feel more confident in their own decisions, the company adds. 

The VR environment includes an animation file of an AV with a side mirror, roof-mounted lidar sensors and no visible occupants. A second file shows a human-driven model in which the driver looks straight ahead and remains motionless while at the intersection. 

The files contain vehicle animation scenarios in which both vehicles stop at an intersection, followed by two more in which they do not stop. A further scenario features an AV using expressive behaviour, such as a light bar or sounds, to signal its intentions. 

The virtual environments include road and building texturing, parked cars, swaying trees, birds chirping, cars driving by and people talking. 

According to Motional, these details enhance place illusion and allow users to sense spatial presence within the virtual environment – giving the impression that they are standing on an actual street. 

The company claims this "VR immersion experience" was so convincing that it provoked instinctive angry reactions from several participants, including swearing and gesturing at vehicles that did not stop for them.

The nuReality files can be adapted for a variety of applications, allowing others to build on Motional's work in expressive robotics. 

In 2019, Motional made the nuScenes autonomous driving dataset available to help further research that seeks to bring safe AVs to streets and communities faster. 
