
Social media mooted for traffic management

November 13, 2012
SQLstream vice president of marketing Ronnie Beggs remains convinced that the single most important traffic data source going forward will be GPS/GNSS

SQLstream’s Ronnie Beggs discusses with Jason Barnes the potential and pitfalls of using social media for traffic monitoring and management.

Cataclysmic events such as hurricanes and tsunamis have challenged perceptions of what constitutes robust traffic management infrastructure in recent times. Presumptions that only fixed systems could offer high levels of unbroken service, accuracy and communication bandwidth have been taught some hard lessons by nature. In many respects wireless systems now represent the standard in terms of resilience, typically being first back into service once disaster has passed. Concomitant developments in wireless telephony standards have reinforced this view; LTE/4G may not yet match hard-wired counterparts in data capacity, but it does offer continuity of connection and data volumes adequate for many transport-related applications. Its follow-ons will only be better.


In developed countries especially, the costs of replacing infrastructure amid economic slump have had a formative effect on opinions and on both technological and doctrinal development. Solutions which use very little additional infrastructure are finding favour, and greater efforts are being made to detect and monitor traffic via inference and absence rather than absolutes. This is especially so in more remote locations where roads are poorly instrumented, but moves are also afoot to enrich and complement data sets on strategic and urban routes. Wherever the location, this represents a significant deviation from the pursuit of conventional detection and monitoring systems with accuracies in the high 90 per cent range.

Mass-market solutions

The use of GPS and consumer protocols such as Bluetooth to detect vehicles and monitor traffic flows has become more common, and social media such as Twitter have excited a fair bit of interest. The thinking is that Tweets from travellers could warn of incidents in advance of existing infrastructure, even in heavily instrumented locations. But as with any information source not tightly controlled, there is a need to be wary of its veracity. That is before considering the possibility of Twitter or other social media being used as part of a well-orchestrated prank or terrorist attack.

As a data stream management specialist, SQLstream is well placed to comment on the utility of social media. According to the firm’s vice president of marketing Ronnie Beggs, there has been more than a hint of over-optimism with regard to the use of such data sources. The biggest perceived benefit is the ability to add crowd-sourced data at little or no cost, he says, but subscription services provided by Twitter itself are probably the better bet, as these remain consistent when Tweet levels are high; by contrast, public Twitter feeds, which constitute only around 1% of total traffic volume, tend to be the first to be throttled back.

Extracting usable data

“There is useful information to be had,” Beggs states, “but the reality is that the majority of Tweets don’t contain the really useful information, such as location. You might see a sudden spike of ‘I’m stuck in traffic’ messages but work is still needed to derive usable data. It is possible to get position from triangulation, if working in cooperation with the telcos, but, really, it’s up to individual Tweeters to add their locations.”
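A minimal sketch of that kind of spike detection, assuming a rolling five-minute window, an illustrative keyword list and an arbitrary alert threshold (none of which are drawn from SQLstream’s own implementation), might look like this in Python:

```python
# Hypothetical sketch only: keywords, window and threshold are assumptions.
from collections import deque

TRAFFIC_KEYWORDS = ("stuck in traffic", "accident", "congestion")  # assumed list
WINDOW_SECONDS = 300   # assumed five-minute rolling window
SPIKE_THRESHOLD = 50   # assumed alert level

window = deque()       # timestamps of keyword-matching Tweets

def on_tweet(text: str, timestamp: float) -> bool:
    """Record a Tweet; return True when matches in the window hit the threshold."""
    if any(kw in text.lower() for kw in TRAFFIC_KEYWORDS):
        window.append(timestamp)
    # Expire entries that have fallen outside the rolling window
    while window and window[0] < timestamp - WINDOW_SECONDS:
        window.popleft()
    return len(window) >= SPIKE_THRESHOLD
```

A detector like this only signals that something may be happening; as Beggs notes, without location the spike still has to be tied to a place on the network before it is usable.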

Social media, he notes, exhibit particular characteristics when it comes to traffic incidents. One is the immediacy of reporting, another is the speed at which Tweets fall off once an incident has passed.
“The information is very much in real-time – there is a real sense of something occurring and passing. However, at present, geo-tagging is something which Tweeters explicitly have to opt into, and Twitter could have done better in this regard. Wider geo-tagging could still result from agencies approaching Twitter with requests for the opt-in mechanism to be streamlined,” says Beggs.

There are two ways of extracting information from Twitter. If Tweets are geo-tagged, then network managers can look at the volumes of Tweets at given locations. Semantic analysis of Tweet content can be done, but is difficult as people often take shortcuts with spelling and grammar. Specific words such as ‘accident’ or ‘traffic congestion’ might not be used.
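For the first of those routes, one simple illustration (the grid size and hotspot threshold are assumptions for the sketch, not anything described in the article) is to snap geo-tagged Tweets onto coarse grid cells and count volumes per cell:

```python
# Hypothetical sketch only: grid size and threshold are assumptions.
from collections import Counter

CELL_DEG = 0.01  # ~1 km cells at mid latitudes, chosen for illustration

def cell(lat: float, lon: float) -> tuple:
    """Snap a coordinate onto a coarse grid so nearby Tweets aggregate."""
    return (round(lat / CELL_DEG), round(lon / CELL_DEG))

counts = Counter()

def on_geotagged_tweet(lat: float, lon: float) -> None:
    counts[cell(lat, lon)] += 1

def hotspots(min_count: int = 20) -> dict:
    """Grid cells with enough Tweets to suggest a possible incident."""
    return {c: n for c, n in counts.items() if n >= min_count}
```

Counting by location sidesteps the semantic problem entirely, which is why geo-tagging matters so much to the approach.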

Data extraction

“From our perspective, social media are a potentially important overlay information source. The problem is that little information can be extracted from even a large number of Tweets,” Beggs says.

“There is a whole industry centred on text analysis and semantic modelling and people are now looking at how to better exploit the technology, but this is fraught with difficulties. Where it has been successful is in analysis of medical research papers – in summarising large documents.

“In reporting the outcome of medical trials, the technology has been used to trawl social media and blogs on medicines and their side-effects.

“Online marketing companies have also used it to provide competitive intelligence reports on companies’ products. However, the technology has been unsuccessful thus far at processing real-time Twitter feeds, even where there are large numbers of Tweets. Geo-fixing is a solution.”

That said, Twitter has to be regarded as a first-generation Tweeting technology. Competitors may emerge and Twitter itself is likely to develop; suppliers of navigation systems may provide products with some form of in-built Tweeting capability.

A general fear with crowd-sourced technology, though, is that it is prone to misuse. This is far from an abstract concern, and platforms ought to be taking account of it by indicating levels of confidence in the veracity of the data. Beggs feels that once there is a greater understanding of how they work, social media will become more prone to abuse.
“Deliberate use of social media to corral people together to make them vulnerable to attack would have to involve a very sophisticated, multi-sourced effort. Pranks are more likely than terrorism: people sitting at a bus or tram stop posting false information about services running late, for instance.”

Where social media really come into their own is in providing information along arterials and in residential areas, and in reinforcing data from buses’ GPS systems, for instance. But detecting an incident is one thing; it is quite another to determine whether it requires intervention. Severe congestion might be entirely normal at given points and times of the day. This is where relational analysis is useful.
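A minimal sketch of that relational check, assuming a historical baseline held per location and time slot and an illustrative two-standard-deviation threshold (details the article does not give), might be:

```python
# Hypothetical sketch only: the baseline scheme and threshold are assumptions.
import statistics

def needs_intervention(current: float, history: list) -> bool:
    """Flag a live reading only if it is unusual against the historical
    baseline for the same location and time slot (e.g. Mondays, 08:00-08:15)."""
    if len(history) < 2:
        return False                      # not enough baseline data to judge
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return current > mean + 2 * stdev     # severe-but-normal peaks pass quietly

# Severe congestion that is normal for the slot is not flagged:
rush_hour_history = [180, 200, 190, 210, 195]     # vehicles per 5 minutes
print(needs_intervention(205, rush_hour_history))  # False - a normal peak
print(needs_intervention(320, rush_hour_history))  # True  - abnormal for the slot
```

The point of the design is exactly the one made above: the question is not whether congestion exists, but whether it is abnormal for that place and time.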

Benefits and contrasts

“The main benefits of social media are going to come from improvements they bring to the overall traveller experience, for example, in providing reliable information on bus arrivals,” Beggs continues. In terms of traffic services, it is difficult to gauge how much quicker reaction to incidents might become. Social media provide an additional mechanism which helps to gauge severity but do not necessarily increase speed or accuracy.

“A perceived benefit is that the processing infrastructure needed is very light. In part, that’s because the actual volume of Twitter feeds is relatively low; it’s around one billion per month, which equates to a few hundred per second. Compare that to the worldwide number of GPS updates, which is around eight million per second,” Beggs says.
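Those figures are easy to sanity-check; the conversion below assumes a 30-day month:

```python
# Back-of-envelope check of the volumes quoted above (30-day month assumed).
tweets_per_month = 1_000_000_000
seconds_per_month = 30 * 24 * 3600               # ~2.6 million seconds

tweets_per_second = tweets_per_month / seconds_per_month
print(f"Tweets per second: {tweets_per_second:.0f}")   # ~386: 'a few hundred'

gps_updates_per_second = 8_000_000
print(f"GPS/Twitter ratio: {gps_updates_per_second / tweets_per_second:,.0f}x")
```

On those numbers, the worldwide GPS feed is some four orders of magnitude larger than the Twitter firehose, which puts the “light infrastructure” claim in perspective.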

“The major overheads come in with semantic analysis of Tweets. For traffic management, these data cannot simply be stored for later analysis, so real-time processing is needed, on a par with processing GPS data in real time. Comparison of real-time models with historical data is something that very few organisations can do at present.”

Although social media will continue to make their presence felt, Beggs remains convinced that the single-most important traffic data source going forward will be GPS/GNSS.

He cites several reasons: “It’s relatively easy to expand and improve coverage, especially onto minor roads, and the already high number of GPS updates will expand significantly. GPS is commonplace on smartphones and a big increase from traffic-related applications is expected. Much of the current generation of fleet sensor technology provides updates every minute; coming solutions will do so every few seconds. Data feeds will increase exponentially as a result.”
