
Columbia brings the noise to VRUs

‘Twalking’ – the practice of staring at a smartphone screen while walking – may be a source of wry amusement for the non-addicted, but it is potentially hazardous to the phone users themselves. A US research project may have found a solution, finds Alan Dron
May 7, 2020
The headset uses an array of four microphones to localise the sound of an approaching vehicle

It has become an increasingly common sight on city streets: a pedestrian – usually young – head-down and engrossed in their smartphone who steps on to the road, oblivious to oncoming traffic. This is called ‘twalking’.

Odds are that, if you’re a motorist, you’ve cursed their idiocy. Or, if one of the head-down tribe yourself, that you’ve had a narrow escape at some point.

The problem is real: statistics have shown a sharp rise in serious and fatal accidents in such circumstances in recent years. However, a new device currently under development in the US may provide a solution.

Professor Xiaofan Fred Jiang and his team at New York’s Columbia University started work on the device after he personally discovered that noise-cancelling headphones and busy urban streets were not a good combination. He realised that listening to music or watching a small screen posed similar hazards.

“We did some preliminary research and it turns out this is a big problem,” said Jiang, assistant professor in the department of electrical engineering and a member of the university’s Data Science Institute. The smart headphone project was awarded a $1.2 million grant from the US National Science Foundation in 2017.

His team’s proposal? An array of four miniature microphones fitted into headphones. These are linked to a signal processing device and machine learning algorithms that can be incorporated into a smartphone via an app.

The machine learning algorithm needs to be ‘fed’ sounds from the streets, such as engine and tyre noise

In simple terms, the microphones detect the sound of vehicles, localise the direction from which they are approaching by measuring the tiny differences in time at which the sound reaches the different microphones and then – if appropriate – the system sends a warning to the wearer.
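As a rough illustration of that time-difference-of-arrival idea – not the team’s actual algorithm – the sketch below estimates a bearing from one pair of microphones by cross-correlating their signals. The microphone spacing and sample rate are assumed values.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at room temperature
MIC_SPACING_M = 0.15     # distance between one microphone pair (assumed)
SAMPLE_RATE = 48_000     # Hz (assumed)

def estimate_bearing(left: np.ndarray, right: np.ndarray) -> float:
    """Estimate a source bearing, in degrees off the pair's broadside axis,
    from the time difference of arrival (TDOA) at two microphones."""
    # The peak of the cross-correlation gives the delay between the channels.
    corr = np.correlate(left, right, mode="full")
    lag_samples = np.argmax(corr) - (len(right) - 1)
    tdoa_s = lag_samples / SAMPLE_RATE
    # Convert the delay to an angle; clamp to the physically valid range first.
    ratio = np.clip(tdoa_s * SPEED_OF_SOUND / MIC_SPACING_M, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))
```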

The sensors are simple MEMS microphones, miniature devices that are readily available from a computer store. “We were very conscious of the cost,” Jiang told ITS International. “We don’t want to make something costing hundreds of dollars that consumers can’t afford. We calculate the material costs over a 10,000-unit production run as around $25.”

However, all the other aspects of the system were developed by the team. The customised integrated circuit converts raw sound from the microphones into digital features without going through a traditional analogue-to-digital converter. “That’s one part of the system that’s very different from the classical approach,” said Jiang. The custom integrated circuit design is led by Jiang’s colleague Peter Kinget.
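The real front end does this in the analogue domain, but a very rough software stand-in gives a sense of what such a compact feature vector might look like: the sketch below reduces one audio frame to a handful of log band energies. The frame rate and band edges are chosen arbitrarily for illustration.

```python
import numpy as np

SAMPLE_RATE = 48_000                                        # Hz (assumed)
BAND_EDGES_HZ = [50, 200, 500, 1_000, 2_000, 4_000, 8_000]  # assumed band edges

def frame_features(frame: np.ndarray) -> np.ndarray:
    """Reduce one audio frame to log energies in a few frequency bands."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / SAMPLE_RATE)
    energies = [spectrum[(freqs >= lo) & (freqs < hi)].sum()
                for lo, hi in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:])]
    return np.log1p(np.array(energies))   # log-compress the dynamic range
```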

That digital data – a vector of sound signatures from an approaching vehicle – is then transmitted over Bluetooth to the user’s smartphone. An app running in the background then uses machine learning algorithms to localise vehicles and flashes a warning onto its screen.
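On the phone side, the flow might resemble the hypothetical handler below: a Bluetooth packet arrives carrying the feature vector and the headset’s bearing estimate, a fitted classifier scores it, and a warning is shown if the score clears a threshold. The function names and the 0.8 threshold are illustrative assumptions, not the project’s published design.

```python
import numpy as np

def handle_packet(features: np.ndarray, bearing_deg: float, classifier) -> None:
    """Process one Bluetooth packet from the headset.

    `classifier` is any fitted scikit-learn-style model whose class 1 is
    assumed to mean 'vehicle' (see the training sketch further down).
    """
    vehicle_prob = classifier.predict_proba(features.reshape(1, -1))[0, 1]
    if vehicle_prob > 0.8:                         # assumed confidence threshold
        show_warning(f"Vehicle approaching from {bearing_deg:+.0f} degrees")

def show_warning(message: str) -> None:
    # Stand-in for flashing a warning on the phone's screen.
    print(message)
```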

However, the team is also looking at the possibility of transforming that alert into a warning tone in the headphones themselves. The closer the vehicle, the louder that tone would be. “If they’re already listening to music, we can lower the volume of the music and increase the volume of the beep,” Jiang says.
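A minimal sketch of that ducking behaviour, assuming the system eventually produces a range estimate: the closer the vehicle, the higher the beep gain and the lower the music gain. The maximum range and 70% duck depth are invented figures for illustration.

```python
def mix_levels(distance_m: float, max_range_m: float = 30.0) -> tuple[float, float]:
    """Return (music_gain, beep_gain), each in [0, 1], for one range estimate.

    The closer the vehicle, the louder the beep and the quieter the music.
    max_range_m and the 70% maximum duck are assumed values.
    """
    proximity = max(0.0, 1.0 - distance_m / max_range_m)  # 0 = far away, 1 = right beside the wearer
    return 1.0 - 0.7 * proximity, proximity

# Example: a vehicle 6 m away gives roughly (0.44, 0.8) -- quiet music, loud beep.
```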

Current prototypes of the device can detect fairly accurately the direction from which a vehicle is approaching; gauging its distance from the wearer is more difficult. “We’re still working on that part,” he explains.

Work is under way to create the necessary ‘machine intelligence’ that would allow the app and the smartphone to distinguish vehicles from the background noise of a busy urban street.

This involves building up a library of sound signatures, in the same way that a warship’s sonar system compares underwater sounds to a library of noise signatures of different classes of enemy submarine.

“The machine learning algorithm in the smartphone has to be trained by ‘labelling’ data,” said Jiang. “We have to bring our device to the streets and feed it those sounds of vehicles and then ‘feed’ those with a label.” Those sounds include engine and tyre noise.

“After you perform this training phase, that’s when the machine learning classifier becomes intelligent enough to tell you whether there’s a car, and where the car is.”
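In scikit-learn terms, that training phase might look something like the sketch below, with the labelled street recordings already reduced to feature vectors. The file names and the choice of a random-forest classifier are assumptions, not a description of the team’s actual model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical labelled dataset recorded on the street: each row is one
# frame's feature vector, each label is 1 for vehicle noise (engine, tyres)
# and 0 for other street sounds. The file names are placeholders.
X = np.load("street_features.npy")   # shape (n_frames, n_features)
y = np.load("street_labels.npy")     # shape (n_frames,)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```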

The electrical engineering department will undertake the great bulk of development work for the system. But Jiang insists: “The idea here is we would build a few prototypes and demonstrate that this is something that we could achieve. Our hope is that headphone manufacturers, or companies such as Apple or Microsoft, would be interested.”  

For example, said Jiang, if Apple took up the idea, it could conceivably embed the system into a future generation of AirPod wireless headphones. “A lot of headphones are already connected to smartphones by Bluetooth.”

Work on a power system for the device is progressing rapidly. “Initially, we used two AAA batteries that would only last around four hours,” he continues. “We’re now using a coin cell battery that lasts for 10 hours. We can replace that with a rechargeable battery. Some of the new headsets are already wireless-powered; we simply use a small portion of that to power our device. We could also leech power off the smartphone. That’s a little more difficult [but] we could use a biasing voltage from the 3.5mm jack.”
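As a rough sanity check on that 10-hour figure (using a typical coin-cell capacity, which the article does not give):

```python
# Back-of-envelope check on the quoted 10-hour coin-cell runtime.
# The ~225 mAh capacity is a typical CR2032-class figure and is an assumption.
coin_cell_mah = 225
runtime_h = 10
print(f"implied average draw: {coin_cell_mah / runtime_h:.0f} mA")   # roughly 22 mA
```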

Jiang said that the device had not yet been presented to manufacturers: “It’s still a research project. We want to create the initial technology. It still requires a lot of engineering effort to bring this to market. There are still issues right now: for example, it can’t localise multiple vehicles. We’re trying to improve that.”

He believes that his team could do 90-95% of the necessary work, then turn to a large manufacturer to undertake the last lap of the development required to turn it into a consumer device. While it is difficult to put a precise timeframe on this, he believes that two years could be a reasonable period for a company to get it to market.

ABOUT THE AUTHOR:
Alan Dron is a freelance journalist working in the transport sector and regularly reports from major ITS events.


Silent but deadly

A problem related to ‘twalking’ arises from the lack of engine noise from the new generation of electrically-powered vehicles (EVs).

Companies have started to realise this creates a new market niche. One, UK-based Brigade Electronics, believes it is the first to offer a solution with its Quiet Vehicle Sounder (QVS), which it likens to “the next reversing alarm”. The QVS is intended primarily for commercial vehicles and consists of a device that broadcasts a mix of tonal and white noise.

Brigade Electronics is trialling its Quiet Vehicle Sounder to give pedestrians warning of approaching electric vehicles

“With tonal, you can hear a pitch shift, while white sound is used for directional ability, so people can understand where the vehicle is coming from,” said a company spokesman. The volume of noise increases in line with the speed of the vehicle. This is particularly important in urban settings, where vehicles travelling at low speeds – up to around 30mph – are especially quiet.
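A toy rendering of that tonal-plus-white-noise mix, with pitch and level rising with speed, might look like the sketch below; every constant in it is an assumption for illustration rather than Brigade’s actual tuning.

```python
import numpy as np

SAMPLE_RATE = 44_100   # Hz

def sounder_frame(speed_mph: float, duration_s: float = 0.5,
                  base_freq_hz: float = 400.0) -> np.ndarray:
    """Generate one frame of a tonal-plus-white-noise warning sound.

    The tone's pitch rises with speed (the audible pitch shift) and the
    overall level scales with speed; all constants here are assumed.
    """
    t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    tone = np.sin(2 * np.pi * (base_freq_hz + 10.0 * speed_mph) * t)
    noise = np.random.default_rng(0).uniform(-1.0, 1.0, t.shape)  # broadband component
    level = min(1.0, speed_mph / 30.0)   # louder as the vehicle speeds up
    return level * (0.6 * tone + 0.4 * noise)
```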

“Legislation now has forced OEMs to think about what they have to do in the future. I believe in 2021 [EVs] will have to come out of the factory with an acoustic alerting system already fitted.” Any new models introduced from now will need to be fitted with a suitable acoustic device, with all such vehicles requiring them by next year.

The advent of EVs is potentially a particular problem for the blind.

“We’ve been running a campaign for a while now, to make sure all new EVs have a noise device fitted to them as a minimum requirement,” said a spokesman for the UK’s Guide Dogs for the Blind charity. “A European regulation came into force last year, so that someone with sight loss who’s trying to make a judgement [on approaching vehicles] at least has that information to go on.”

The charity is not adapting the training of guide dogs to the new problem: “When people see someone out with a dog, they assume the dog makes the decision to cross the road. It’s not: it’s the person, who uses all the information they have – including noise. So it is important that quiet vehicles are audible.”
