Traffic jams and mishaps are often painful and sometimes dangerous facts of life. At this week’s IEEE International Conference on Big Data in Boston, researchers from TACC and their colleagues will present a new deep learning tool that uses raw footage from City of Austin traffic cameras to recognize objects – people, cars, buses, trucks, bicycles, motorcycles and traffic lights – and to characterize how those objects move and interact.
The researchers, from the Texas Advanced Computing Center (TACC), the University of Texas Center for Transportation Research and the City of Austin, have been collaborating to develop tools that allow sophisticated, searchable traffic analyses using deep learning and data mining. An account of the work (Artificial Intelligence and Supercomputers to Help Alleviate Urban Traffic Problems), written by Aaron Dubrow, was posted this week on the TACC website.
The tool has been tested in parts of Austin, where it automatically counted vehicles in a 10-minute video clip from cameras mounted on signal lights; preliminary results showed it was 95 percent accurate overall.
“We are hoping to develop a flexible and efficient system to aid traffic researchers and decision-makers for dynamic, real-life analysis needs,” said Weijia Xu, a research scientist who leads the Data Mining & Statistics Group at TACC. “We don’t want to build a turn-key solution for a single, specific problem. We want to explore means that may be helpful for a number of analytical needs, even those that may pop up in the future.”

The algorithm they developed automatically labels all potential objects in the raw footage, tracks objects by comparing them with previously recognized objects, and compares the outputs from each frame to uncover relationships among the objects.
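The general shape of such a detect–track–summarize pipeline can be illustrated with a short sketch. This is a minimal, hypothetical illustration rather than the team’s code: the input format, the greedy IoU matching, and the threshold value are all assumptions made for clarity.

```python
# Minimal sketch of a detect -> track pipeline, mirroring the description
# above. All names, thresholds, and data formats here are illustrative
# assumptions, not the team's implementation.

def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / float(area_a + area_b - inter) if inter else 0.0

def track(frames, iou_threshold=0.4):
    """Assign persistent IDs by matching each detection to the closest
    previously recognized box of the same class (greedy IoU matching)."""
    tracks = {}      # track_id -> (label, last_box)
    next_id = 0
    history = []     # per-frame list of (track_id, label, box)
    for detections in frames:          # each frame: [(label, box), ...]
        frame_out = []
        for label, box in detections:
            best_id, best_iou = None, iou_threshold
            for tid, (tlabel, tbox) in tracks.items():
                overlap = iou(box, tbox)
                if tlabel == label and overlap > best_iou:
                    best_id, best_iou = tid, overlap
            if best_id is None:        # unseen object: start a new track
                best_id, next_id = next_id, next_id + 1
            tracks[best_id] = (label, box)
            frame_out.append((best_id, label, box))
        history.append(frame_out)
    return history

# Toy input: two frames of detections (label, (x1, y1, x2, y2)),
# as a detector such as YOLO might emit them.
frames = [
    [("car", (10, 10, 50, 40)), ("person", (200, 30, 220, 80))],
    [("car", (14, 10, 54, 40)), ("person", (203, 31, 223, 81))],
]
for i, frame in enumerate(track(frames)):
    print(f"frame {i}: {frame}")
```

In a real deployment the per-frame detections would come from the neural network rather than a hand-written list, and the cross-frame comparison step would feed the query layer described below.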
The team used YOLO, an open-source library and neural network for real-time object detection developed by researchers at the University of Washington and Facebook. According to the team, this is the first time YOLO has been applied to traffic data. For the data analysis and query component, they incorporated HiveQL, a query language maintained by the Apache Software Foundation that lets users search and compare data in the system.
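As a rough illustration of the query side, detections written out as a table (one row per detected object per frame) could be searched with a HiveQL statement along the lines of the sketch below. The table and column names are hypothetical; the article does not describe the team’s actual schema.

```python
# Hypothetical example of a HiveQL query over per-frame detections.
# The table name ("detections") and its columns are assumptions made for
# illustration. The query string would be submitted through the Hive CLI,
# beeline, or a client library such as PyHive.

HIVEQL_QUERY = """
SELECT camera_id,
       object_label,
       COUNT(DISTINCT track_id) AS object_count
FROM detections                      -- one row per detected object per frame
WHERE object_label IN ('car', 'bus', 'truck')
  AND frame_time BETWEEN '2017-12-11 07:00:00' AND '2017-12-11 09:00:00'
GROUP BY camera_id, object_label
ORDER BY camera_id, object_count DESC
"""

print(HIVEQL_QUERY)  # in practice, pass this string to a Hive client/cursor
```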

“Current practice often relies on the use of expensive sensors for continuous data collection or on traffic studies that sample traffic volumes for a few days during selected time periods,” said Natalia Ruiz Juri, a research associate and director of the Network Modeling Center at UT’s Center for Transportation Research. “The use of artificial intelligence to automatically generate traffic volumes from existing cameras would provide a much broader spatial and temporal coverage of the transportation network, facilitating the generation of valuable datasets to support innovative research and to understand the impact of traffic management and operation decisions.”
Whether autonomous vehicles will mitigate the problem is an ongoing debate, and Ruiz Juri notes, “The highly anticipated introduction of self-driving and connected cars may lead to significant changes in the behavior of vehicles and pedestrians and on the performance of roadways. Video data will play a key role in understanding such changes, and artificial intelligence may be central to enabling comprehensive large-scale studies that truly capture the impact of the new technologies.”
Link to full article: https://www.tacc.utexas.edu/-/artificial-intelligence-and-supercomputers-to-help-alleviate-urban-traffic-problems
Link to video on the work: http://soda.tacc.utexas.edu
Images: TACC