
Nervous about AV travel? You’ll get the Gist

Help is on the way for those anxious folk who will accept rides from automated vehicles but may feel uncomfortable doing so, reports David Arminas
February 4, 2025 Read time: 3 mins
Oh, just sit back and relax (© Belena839 | Dreamstime.com)

Scientists from South Korea’s Gwangju Institute of Science and Technology (Gist) say they have created strategies to make passengers feel safer in self-driven vehicles.

Automated vehicles (AVs) may be part of the future urban mobility mix, but passenger trust remains a challenge. Providing timely, passenger-specific explanations for the decisions AVs make can bridge this gap - so Gist’s researchers have investigated a method of increasing passengers’ sense of safety and confidence in AV trips.

Their TimelyTale is a novel dataset designed to capture real-world driving scenarios and passenger explanation needs. The goal, they say, is for this in-vehicle multimodal dataset to be used in all AVs.

 

Fostering trust

Driverless cars enable human users to engage in non-driving-related tasks such as relaxing, working or watching multimedia en route. However, widespread adoption is hindered by passengers' limited trust. Understanding why an AV reacts a certain way can foster trust by giving passengers a sense of control and reducing negative experiences - but explanations must be informative, understandable and concise to be effective.

Existing explainable artificial intelligence (XAI) approaches mainly cater to developers, focusing on high-risk scenarios or complex explanations, both of which are potentially unsuitable for passengers. Instead, passenger-centric XAI models need to understand the type and timing of information that passengers require in real-world driving scenarios, said Professor SeungJun Kim, director of the Human-Centered Intelligent Systems Lab at Gist.

 

"Our research lays the groundwork for increased acceptance and adoption of AVs” SeungJun Kim, Gwangju Institute of Science and Technology

 

TimelyTale was created to include passenger-specific sensor data for context-relevant explanations which make people feel more confident about their trip. “Our research shifts the focus of XAI in autonomous driving from developers to passengers,” said Kim. “We have developed an approach for gathering passengers' actual demand for in-vehicle explanations and methods to generate timely, situation-relevant explanations for passengers.”

The researchers first studied the impact of various visual explanation types, including perception, attention and a combination of both - and their timing - on passenger experience under real driving conditions, using augmented reality.

They found that the vehicle’s perception state alone improved trust, perceived safety, and situational awareness without overwhelming the passengers. They also discovered that traffic risk probability was most effective for deciding when to deliver explanations, especially when passengers felt overloaded with information.
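To illustrate how such a finding might be applied in practice, the sketch below shows a simple timing rule that withholds explanations until an estimated traffic-risk probability crosses a threshold. The function name, the threshold values and the overload adjustment are hypothetical assumptions for illustration, not the Gist system.

```python
# Hypothetical sketch: deliver an explanation only when estimated traffic risk
# is high enough to justify interrupting the passenger. Threshold values and
# names are illustrative, not taken from the TimelyTale implementation.

RISK_THRESHOLD = 0.6  # assumed cut-off for "worth interrupting the passenger"

def should_explain(traffic_risk_probability: float,
                   passenger_overloaded: bool) -> bool:
    """Return True if the AV should surface an explanation right now."""
    if passenger_overloaded:
        # When the passenger already feels overloaded with information,
        # only the riskiest situations warrant an extra message.
        return traffic_risk_probability >= RISK_THRESHOLD + 0.2
    return traffic_risk_probability >= RISK_THRESHOLD

# Example: moderate risk, passenger not overloaded -> show an explanation
print(should_explain(0.65, passenger_overloaded=False))  # True
```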

 

Telling a TimelyTale

Building upon these findings, the researchers developed the TimelyTale dataset. This approach includes various types of data: exteroceptive (regarding the external environment, such as sights and sounds); proprioceptive (about the body's position and movements); and interoceptive (about the body's sensations, such as pain).

The data was gathered from passengers using a variety of sensors in naturalistic driving scenarios and serves as the key features for predicting their explanation demands. Notably, this work also incorporates the concept of interruptibility - the shift in passenger focus from so-called ‘non-driving-related’ tasks to ‘driving-related’ information. The method effectively identified both the timing and frequency of passengers’ demands for explanations, as well as the specific explanations that passengers want in given driving situations.
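To make the three data categories concrete, here is a minimal, hypothetical sketch of how one timestamped, labelled sample of this kind might be structured. The field names are assumptions for illustration only and do not reflect the actual TimelyTale schema.

```python
from dataclasses import dataclass, field

# Hypothetical schema for one multimodal passenger sample; field names are
# illustrative and not the actual TimelyTale dataset format.

@dataclass
class PassengerSample:
    timestamp: float                                     # seconds since trip start
    exteroceptive: dict = field(default_factory=dict)    # e.g. traffic around the car, sounds
    proprioceptive: dict = field(default_factory=dict)   # e.g. posture, head and body movement
    interoceptive: dict = field(default_factory=dict)    # e.g. heart rate, reported discomfort
    interruptible: bool = False        # was the passenger open to shifting attention?
    explanation_demanded: bool = False # label: did the passenger want an explanation here?

sample = PassengerSample(
    timestamp=132.5,
    exteroceptive={"nearby_vehicles": 4, "braking_event": True},
    proprioceptive={"head_turned_to_road": True},
    interoceptive={"heart_rate_bpm": 88},
    interruptible=True,
    explanation_demanded=True,
)
```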

Using this approach, the researchers developed a machine-learning model that predicts the best time for providing this information. Additionally, as proof of concept, the researchers conducted city-wide modelling for generating textual explanations based on different driving locations.
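The article does not describe the model itself, but a minimal sketch of the general idea - assuming a standard supervised classifier trained on sensor-derived features like those above - might look like the following. The feature choices, values and the use of a random forest are assumptions, not the researchers' published method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical feature rows: [traffic_risk, passenger_interruptible (0/1),
# heart_rate_bpm, nearby_vehicles]; labels: 1 = passenger wanted an
# explanation at that moment, 0 = not. All values are made up for illustration.
X = np.array([
    [0.8, 1, 95, 6],
    [0.2, 0, 70, 1],
    [0.6, 1, 88, 4],
    [0.1, 1, 65, 0],
])
y = np.array([1, 0, 1, 0])

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Predict whether to deliver an explanation for a new driving moment.
new_moment = np.array([[0.7, 1, 90, 5]])
print(model.predict(new_moment))  # e.g. [1] -> show an explanation now
```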

"Our research lays the groundwork for increased acceptance and adoption of AVs, potentially reshaping urban transportation and personal mobility in the coming years," concludes Kim.
