
Social media mooted for traffic management

November 13, 2012
SQLstream vice president of marketing Ronnie Beggs remains convinced that the single most important traffic data source going forward will be GPS/GNSS

SQLstream’s Ronnie Beggs discusses with Jason Barnes the potential and pitfalls of using social media for traffic monitoring and management.

Cataclysmic events such as hurricanes and tsunamis have challenged perceptions of what constitutes robust traffic management infrastructure in recent times. Presumptions that only fixed systems could offer high levels of unbroken service, accuracy and communication bandwidth have been taught some hard lessons by nature. In many respects wireless systems now represent the standard in terms of resilience, typically being first back into service once a disaster has passed. Concomitant developments in wireless telephony standards have reinforced this view; LTE/4G may not yet match hard-wired counterparts in data capacity, but it does offer continuity of connection and data volumes adequate for many transport-related applications. Its follow-ons will only be better.


In developed countries especially, the costs of replacing infrastructure amid economic slump have also had a formative effect on opinions and on both technological and doctrinal development. Solutions which use very little additional infrastructure are finding favour, and greater efforts are being made to detect and monitor traffic via inference and absence rather than absolutes. This is especially so in more remote locations where roads are poorly instrumented, but moves are also afoot to enrich and complement data sets on strategic and urban routes. Wherever the location, this represents a significant departure from the pursuit of conventional detection and monitoring systems with accuracies in the high 90 per cents.

Mass-market solutions

The use of GPS and consumer protocols such as Bluetooth to detect vehicles and monitor traffic flows has become more common, and social media such as Twitter have excited a fair bit of interest. The thinking is that Tweets from travellers could warn of incidents in advance of existing infrastructure, even in heavily instrumented locations. But as with any information source not tightly controlled, there is a need to be wary of its veracity. That’s before consideration of Twitter or other social media as part of a well-orchestrated prank or terrorist attack.

As a data stream management specialist, SQLstream is well placed to comment on the utility of social media. According to the firm’s vice president of marketing, Ronnie Beggs, there has been more than a hint of over-hopefulness with regard to the use of such sources of data. The biggest perceived benefit is the ability to add crowd data for little or no cost, he says, but subscription services provided by Twitter itself are probably the better bet, as these remain consistent when Tweet levels are high; by contrast, public Twitter feeds, which constitute only around 1% of total traffic volume, tend to be the first to get throttled back.

Extracting usable data

“There is useful information to be had,” Beggs states, “but the reality is that the majority of Tweets don’t contain the really useful information, such as location. You might see a sudden spike of ‘I’m stuck in traffic’ messages but work is still needed to derive usable data. It is possible to get position from triangulation if working in cooperation with the telcos but, really, it’s up to individual Tweeters to add their locations.”

Social media, he notes, exhibit particular characteristics when it comes to traffic incidents. One is the immediacy of reporting, another is the speed at which Tweets fall off once an incident has passed.
“The information is very much in real-time – there is a real sense of something occurring and passing. However, at present, geo-tagging is something which Tweeters explicitly have to opt into, and Twitter could have done better in this regard. Wider geo-tagging could still result from agencies approaching Twitter with requests to streamline the mechanism,” says Beggs.

There are two ways of extracting information from Twitter. If Tweets are geo-tagged, then network managers can look at the volumes of Tweets at given locations. Semantic analysis of Tweet content can be done, but is difficult as people often take shortcuts with spelling and grammar. Specific words such as ‘accident’ or ‘traffic congestion’ might not be used.
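To make the distinction concrete, here is a minimal sketch of both approaches: counting geo-tagged Tweets per map grid cell, and flagging messages whose text matches simple incident keywords. The Tweet structure, field names and keyword list are illustrative assumptions, not Twitter’s actual API.

```python
from collections import Counter

INCIDENT_KEYWORDS = {"accident", "crash", "stuck", "jam", "congestion"}
CELL_SIZE = 0.01  # degrees; roughly a 1 km grid cell at mid-latitudes

def grid_cell(lat, lon):
    """Map a coordinate to a coarse grid cell for volume counting."""
    return (round(lat / CELL_SIZE), round(lon / CELL_SIZE))

def analyse(tweets):
    """Return per-cell volumes of geo-tagged Tweets, plus all Tweets
    whose text matches an incident keyword (geo-tagged or not)."""
    volumes = Counter()
    matches = []
    for t in tweets:
        words = set(t["text"].lower().split())
        if words & INCIDENT_KEYWORDS:  # crude matching; real text is messier
            matches.append(t)
        if t.get("lat") is not None and t.get("lon") is not None:
            volumes[grid_cell(t["lat"], t["lon"])] += 1
    return volumes, matches

# Usage with hypothetical, already-parsed Tweets:
tweets = [
    {"text": "Stuck in traffic again", "lat": 51.5072, "lon": -0.1276},
    {"text": "lovely morning", "lat": None, "lon": None},
]
volumes, matches = analyse(tweets)
```

As the article notes, exact keyword matching of this kind misses misspellings and slang, which is precisely where the harder semantic analysis comes in.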

Data extraction

“From our perspective, social media are a potentially important overlay information source. The problem is that little information can be extracted from even a large number of Tweets,” Beggs says.

“There is a whole industry centred on text analysis and semantic modelling and people are now looking at how to better exploit the technology but this is fraught with difficulties. Where it has been successful is in analysis of medical research papers – in summarising large documents.

“In reporting the outcome of medical trials, the technology has been used to trawl social media and blogs on medicines and their side-effects.

“Online marketing companies have also used it to provide competitive intelligence reports on companies’ products. However, the technology has been unsuccessful thus far at processing real-time Twitter feeds, even where there are large numbers of Tweets. Geo-fixing is a solution.”

That said, Twitter has to be regarded as a first-generation Tweeting technology. Competitors may emerge and Twitter itself is likely to develop; suppliers of navigation systems may provide products with some form of in-built Tweeting capability.

A general fear with crowd-sourcing technology, though, is that it is prone to misuse. This is far from an abstract concern, and platforms ought to be taking account of it by indicating levels of confidence in the veracity of the data. Beggs feels that once there’s a greater understanding of how they work, social media will become more prone to abuse.
“Deliberate use of social media to corral people together to make them vulnerable to attack would have to involve a very sophisticated, multi-sourced effort. Pranks are more likely than terrorism; from people sitting at a bus or tram stop, posting false information about services running late, for instance.”

Where social media really come into their own is in providing information along arterials and in residential areas, for reinforcing data from buses’ GPS systems, for instance. But detecting an incident is one thing; it’s quite another to determine whether it requires intervention. Severe congestion might be entirely normal at given points and times of the day. This is where relational analysis is useful.
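A minimal sketch of that relational idea follows, assuming a hypothetical store of historical counts for the same location and time slot: an observation is flagged only when it sits well outside the established norm for that place and time.

```python
import statistics

def is_abnormal(current_count, history, threshold=3.0):
    """Flag an observation well above the historical mean for this
    location/time slot. `history` holds past counts for the same slot."""
    if len(history) < 2:
        return False  # too little data to judge what is normal
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return current_count > mean
    return (current_count - mean) / stdev > threshold

# Usage: Monday 08:00 at a junction where heavy traffic is routine.
history = [120, 135, 128, 140, 133]  # hypothetical past counts for the slot
print(is_abnormal(150, history))     # False: busy, but normal for the slot
print(is_abnormal(400, history))     # True: a genuine anomaly worth a look
```

The design point is that the baseline is keyed to location and time of day, so routine rush-hour congestion never trips the alarm while a genuinely unusual spike does.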

Benefits and contrasts

“The main benefits of social media are going to come from improvements they bring to the overall traveller experience, for example, in providing reliable information on bus arrivals,” Beggs continues. In terms of traffic services, it is difficult to gauge how much quicker reaction to incidents might become. Social media provide an additional mechanism which helps to gauge severity but does not necessarily increase speed or accuracy.

“A perceived benefit is that the processing infrastructure needed is very light. In part, that’s because the actual volume of Twitter feeds is relatively low; it’s around one billion per month, which equates to a few hundred per second. Compare that to the worldwide number of GPS updates, which is around eight million per second,” Beggs says.
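Those figures stand up to back-of-envelope arithmetic; the short check below reproduces them, with one billion Tweets per month working out at a few hundred per second, some 20,000 times fewer than the quoted GPS rate.

```python
# Back-of-envelope check of the volumes quoted above: roughly one billion
# Tweets per month versus about eight million GPS updates per second.
SECONDS_PER_MONTH = 30 * 24 * 3600  # ~2.59 million seconds

tweets_per_second = 1_000_000_000 / SECONDS_PER_MONTH
gps_updates_per_second = 8_000_000

print(f"Tweets per second: {tweets_per_second:,.0f}")  # ~386
print(f"GPS-to-Tweet ratio: "
      f"{gps_updates_per_second / tweets_per_second:,.0f}x")  # ~20,700x
```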

“The major overheads come in with semantic analysis of Tweets. For traffic management, this data cannot simply be stored for later analysis; real-time processing is needed, on a par with processing GPS data in real time. Comparison of real-time models with historical data is something that very few organisations can do at present.”

Although social media will continue to make their presence felt, Beggs remains convinced that the single most important traffic data source going forward will be GPS/GNSS.

He cites several reasons: “It’s relatively easy to expand and improve coverage, especially onto minor roads, and the already high number of GPS updates will expand significantly. GPS is commonplace on smartphones and a big increase from traffic-related applications is expected. Much of the current generation of fleet sensor technology provides updates every minute; coming solutions will do so every few seconds. Data feeds will increase exponentially as a result.”
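The arithmetic behind that claim is straightforward; the sketch below, using a hypothetical fleet size, shows how shortening the reporting interval alone multiplies the feed volume before any growth in the number of devices.

```python
# Illustration of the update-rate growth described above: moving a fleet
# from one GPS report per minute to one every five seconds multiplies the
# data rate twelvefold. The fleet size and intervals are hypothetical.
FLEET_SIZE = 10_000

updates_per_min_now = FLEET_SIZE * 1           # one report per minute
updates_per_min_next = FLEET_SIZE * (60 // 5)  # one report every 5 seconds

print(updates_per_min_now)   # 10000 updates/minute today
print(updates_per_min_next)  # 120000 updates/minute, a 12x increase
```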

