
Social media mooted for traffic management

November 13, 2012 Read time: 7 mins
SQLstream vice president of marketing Ronnie Beggs remains convinced that the single most important traffic data source going forward will be GPS/GNSS

SQLstream’s Ronnie Beggs discusses with Jason Barnes the potential and pitfalls of using social media for traffic monitoring and management.

Cataclysmic events such as hurricanes and tsunamis have challenged perceptions of what constitutes robust traffic management infrastructure in recent times. Presumptions that only fixed systems could offer high levels of unbroken service, accuracy and communication bandwidth have been taught some hard lessons by nature. In many respects wireless systems now represent the standard in terms of resilience, typically being first back into service once disaster has passed. Concomitant developments in wireless telephony standards have reinforced this view: LTE/4G may not yet match hard-wired counterparts in data capacity, but it does offer continuity of connection and data volumes adequate for many transport-related applications. Its successors will only improve on this.


In developed countries especially, the costs of replacing infrastructure amid economic slump have also had a formative effect on opinions and on both technological and doctrinal development. Solutions which use very little additional infrastructure are finding favour, and greater efforts are being made to detect and monitor traffic via inference and absence rather than absolutes. This is especially so in more remote locations where roads are poorly instrumented, but moves are also afoot to enrich and complement data sets on strategic and urban routes. Wherever the location, this represents a significant deviation from the pursuit of conventional detection and monitoring systems with accuracies in the high 90 per cent range.

Mass-market solutions

The use of GPS and consumer protocols such as Bluetooth to detect vehicles and monitor traffic flows has become more common, and social media such as Twitter have excited a fair bit of interest. The thinking is that Tweets from travellers could warn of incidents in advance of existing infrastructure, even in heavily instrumented locations. But as with any information source that is not tightly controlled, there is a need to be wary of its veracity. That is before consideration of Twitter or other social media as part of a well-orchestrated prank or terrorist attack.

As a data stream management specialist, SQLstream is well placed to comment on the utility of social media. According to the firm’s vice president of marketing Ronnie Beggs, there has been more than a hint of over-hopefulness with regard to the use of such sources of data. The biggest perceived benefit is the ability to add crowd-sourced data at little or no cost, he says, but subscription services provided by Twitter itself are probably the better bet, as these remain consistent when Tweet levels are high; by contrast, public Twitter feeds, which constitute only around 1% of total traffic volume, tend to be the first to be throttled back.

Extracting usable data

“There is useful information to be had,” Beggs states, “but the reality is that the majority of Tweets don’t contain the really useful information, such as location. You might see a sudden spike of ‘I’m stuck in traffic’ messages but work is still needed to derive usable data. It is possible to get position from triangulation if working in cooperation with the telcos but, really, it’s up to individual Tweeters to add their locations.”

Social media, he notes, exhibit particular characteristics when it comes to traffic incidents. One is the immediacy of reporting, another is the speed at which Tweets fall off once an incident has passed.
“The information is very much real-time – there is a real sense of something occurring and passing. However, at present, geo-tagging is something which Tweeters explicitly have to opt into, and Twitter could have done better in this regard. Wider geo-tagging could still result from agencies approaching Twitter with requests to streamline the mechanism,” says Beggs.

There are two ways of extracting information from Twitter. If Tweets are geo-tagged, then network managers can look at the volumes of Tweets at given locations. Semantic analysis of Tweet content can be done, but is difficult as people often take shortcuts with spelling and grammar. Specific words such as ‘accident’ or ‘traffic congestion’ might not be used.
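Both approaches can be sketched in a few lines; the coordinates and tweet texts below are invented for illustration, and the keyword list is a crude stand-in for real semantic analysis, which, as the article notes, must also cope with shortcuts in spelling and grammar.

```python
import re
from collections import Counter

# Hypothetical tweet records: (latitude, longitude, text). A real feed would
# come from a streaming API; these field shapes are illustrative only.
tweets = [
    (51.5074, -0.1278, "stuck in traffic again :("),
    (51.5076, -0.1280, "huge jam, not moving"),
    (51.5078, -0.1281, "accident ahead, avoid this road"),
    (53.4808, -2.2426, "lovely morning for a walk"),
]

# Crude keyword spotting -- messages that never use these words are missed.
INCIDENT_WORDS = re.compile(r"\b(traffic|jam|accident|crash|congestion)\b", re.I)

def grid_cell(lat, lon, size=0.01):
    """Snap a coordinate to a coarse grid cell (roughly 1 km at UK latitudes)."""
    return (round(lat / size), round(lon / size))

# Count incident-like geo-tagged tweets per cell; a spike suggests an incident.
counts = Counter(
    grid_cell(lat, lon)
    for lat, lon, text in tweets
    if INCIDENT_WORDS.search(text)
)

SPIKE_THRESHOLD = 3  # would normally be derived from historical cell volumes
hotspots = [cell for cell, n in counts.items() if n >= SPIKE_THRESHOLD]
print(hotspots)
```

The three central-London messages fall into the same cell and trip the threshold; the unrelated message elsewhere does not.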

Data extraction

“From our perspective, social media are a potentially important overlay information source. The problem is that little information can be extracted from even a large number of Tweets,” Beggs says.

“There is a whole industry centred on text analysis and semantic modelling and people are now looking at how to better exploit the technology, but this is fraught with difficulties. Where it has been successful is in analysis of medical research papers – in summarising large documents.

“In reporting the outcome of medical trials, the technology has been used to trawl social media and blogs on medicines and their side-effects.

“Online marketing companies have also used it to provide competitive intelligence reports on companies’ products. However, the technology has been unsuccessful thus far at processing real-time Twitter feeds, even where there are large numbers of Tweets. Geo-fixing is a solution.”

That said, Twitter has to be regarded as a first-generation Tweeting technology. Competitors may emerge and Twitter itself is likely to develop; suppliers of navigation systems may provide products with some form of in-built Tweeting capability.

A general fear with crowd-sourcing technology, though, is that it is prone to misuse. This is far from a theoretical concern, and platforms ought to be taking account of it by indicating levels of confidence in the veracity of the data. Beggs feels that once there is a greater understanding of how they work, social media will become more prone to abuse.
“Deliberate use of social media to corral people together to make them vulnerable to attack would have to involve a very sophisticated, multi-sourced effort. Pranks are more likely than terrorism; from people sitting at a bus or tram stop, posting false information about services running late, for instance.”

Where social media really come into their own is in providing information along arterials and in residential areas, reinforcing data from buses’ GPS systems, for instance. But detecting an incident is one thing; it’s quite another to determine whether it requires intervention. Severe congestion might be entirely normal at given points and times of the day. This is where relational analysis is useful.
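The relational idea can be sketched simply: a reading only warrants intervention when it stands well outside what history says is normal for that location and time of day. The baseline figures below are invented for illustration.

```python
from statistics import mean, stdev

# Historical congestion scores for one location at 08:00 on past weekdays
# (invented values -- in practice these would come from archived sensor data).
baseline = [72, 68, 75, 70, 74, 69, 73]

def needs_intervention(current_score, history, k=2.0):
    """Flag only readings more than k standard deviations above the norm
    for this place and time; routine rush-hour congestion passes silently."""
    return current_score > mean(history) + k * stdev(history)

print(needs_intervention(74, baseline))  # False: heavy, but normal for rush hour
print(needs_intervention(95, baseline))  # True: well outside the usual range
```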

Benefits and contrasts

“The main benefits of social media are going to come from improvements they bring to the overall traveller experience, for example, in providing reliable information on bus arrivals,” Beggs continues. In terms of traffic services, it is difficult to gauge how much quicker reaction to incidents might become. Social media provide an additional mechanism which helps to gauge severity but does not necessarily increase speed or accuracy.

“A perceived benefit is that the processing infrastructure needed is very light. In part, that’s because the actual volume of Twitter feeds is relatively low; it’s around one billion per month, which equates to a few hundred per second. Compare that to the worldwide number of GPS updates, which is around eight million per second,” Beggs says.
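Those figures can be sanity-checked with simple arithmetic; the 30-day month is an assumption for the estimate.

```python
# Back-of-envelope check of the volumes quoted: roughly one billion tweets
# per month versus around eight million GPS updates per second.
SECONDS_PER_MONTH = 30 * 24 * 3600           # ~2.59 million seconds

tweets_per_second = 1_000_000_000 / SECONDS_PER_MONTH
print(round(tweets_per_second))              # a few hundred per second, as quoted

gps_per_second = 8_000_000
# GPS outweighs the Twitter feed by roughly four orders of magnitude.
print(round(gps_per_second / tweets_per_second))
```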

“The major overheads come in with semantic analysis of Tweets. For traffic management, the data cannot simply be stored for later analysis; real-time processing is needed, on a par with processing GPS data in real time. Comparison of real-time models with historical data is something that very few organisations can do at present.”
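A sliding-window count is a minimal illustration of the kind of continuous, in-memory aggregate a streaming engine maintains over a tweet or GPS feed; this sketch is generic Python, not SQLstream’s actual streaming SQL.

```python
from collections import deque

class SlidingWindowCounter:
    """Count events seen in the last `window_seconds`, evicting old ones as
    new events arrive, so no full history is ever stored -- the essence of a
    continuous query over an unbounded feed."""

    def __init__(self, window_seconds):
        self.window = window_seconds
        self.events = deque()  # event timestamps, oldest first

    def add(self, timestamp):
        self.events.append(timestamp)
        # Evict anything that has fallen out of the window.
        while self.events and self.events[0] <= timestamp - self.window:
            self.events.popleft()

    def count(self):
        return len(self.events)

w = SlidingWindowCounter(window_seconds=60)
for t in (0, 10, 20, 65, 70):   # timestamps in seconds
    w.add(t)
print(w.count())  # only events within 60s of t=70 remain
```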

Although social media will continue to make their presence felt, Beggs remains convinced that the single most important traffic data source going forward will be GPS/GNSS.

He cites several reasons: “It’s relatively easy to expand and improve coverage, especially onto minor roads, and the already high number of GPS updates will expand significantly. GPS is commonplace on smartphones and a big increase from traffic-related applications is expected. Much of the current generation of fleet sensor technology provides updates every minute; coming solutions will do so every few seconds. Data feeds will increase exponentially as a result.”
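The scale of that growth is easy to illustrate; the fleet size here is an invented round number, not a figure from the article.

```python
# Moving a fleet from one GPS update per minute to one every five seconds
# multiplies feed volume twelvefold, before any growth in fleet size.
fleet = 100_000                        # vehicles (illustrative)
per_minute_today = fleet * 1           # one update per vehicle per minute
per_minute_future = fleet * (60 // 5)  # one update every five seconds

print(per_minute_future // per_minute_today)  # 12x more data
```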

