Multilingual announcements for onboard systems

Multilingual announcements in human-quality voices can now be generated directly from a low-cost module or a full-featured mountable audio amplifier, thanks to the latest high-definition speech synthesis hardware from US company TextSpeak.
December 19, 2014 Read time: 1 min

Dynamic, real-time passenger information, announcements and security warnings in 20 languages can be spoken from message queues, CAD/AVL systems, streaming data or directly from typed text at bus and rail stations, on vehicles, and at kiosks, parking lots and unattended platforms.

The tiny TTS-EM module and the TTS-EN-M amplifier systems offer high-quality voice synthesis paging and announcements in a stand-alone package that requires only a digital input signal and a speaker connection to produce spoken audio with integrated text-to-speech. The conversion of informational data to a clear, natural-sounding voice is completely automatic.
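To illustrate the kind of integration described above, the sketch below queues multilingual announcement texts and frames each one as the byte stream a stand-alone TTS module would receive over its digital input. This is a hypothetical illustration only: the framing (a language-tag prefix and newline terminator) and the `send` callback are assumptions for the sketch, not the documented TextSpeak interface, and in practice `send` would write to the module's serial port.

```python
import queue

def frame_announcement(text: str, lang: str = "en") -> bytes:
    """Wrap announcement text with an assumed language tag and terminator.

    The [lang] prefix / newline framing is illustrative, not the
    vendor's actual wire protocol.
    """
    return f"[{lang}] {text}\n".encode("utf-8")

def drain_queue(announcements: queue.Queue, send) -> int:
    """Pop queued (text, lang) pairs and pass each framed payload to send().

    send() stands in for whatever writes bytes to the module's
    digital input (e.g. a serial port handle).
    """
    count = 0
    while not announcements.empty():
        text, lang = announcements.get_nowait()
        send(frame_announcement(text, lang))
        count += 1
    return count

if __name__ == "__main__":
    q = queue.Queue()
    q.put(("The next train departs from platform 2.", "en"))
    q.put(("Le prochain train part du quai 2.", "fr"))
    sent = []
    drain_queue(q, sent.append)  # collect payloads instead of writing to hardware
```

Because the module handles the text-to-speech conversion itself, the host system only ever deals with plain text such as this; no audio files or voice recordings are managed on the sending side.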

Digital signage can support ADA and disability audio announcements at the push of a button.
