Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

October 21, 2013

GPU Database Speeds Big Data Visualization

Tiffany Trader

Creating visualizations of enormous datasets used to be a strictly HPC endeavor, but that’s starting to change. A new massively parallel database, called MapD, developed by MIT researchers Todd Mostak and Samuel Madden, uses off-the-shelf GPUs to crunch complex spatial and GIS data in real time. The approach is significantly faster than conventional CPU-based systems. Using a single high-performance GPU card, Mostak reported a 70-fold speedup in the rendering of Twitter data.

As this article at MIT Technology Review explains, the “new technology achieves big speed gains by storing the data in the onboard memory of graphics processing units (GPUs) instead of in central processing units (CPUs), as is conventional.”
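The article doesn't show MapD's internals, but the core idea — keeping each field as a contiguous, device-resident column so a filter becomes one parallel scan — can be sketched in a few lines. The sketch below is purely illustrative (hypothetical data and function names, NumPy on the CPU standing in for a GPU scan); on a GPU the same vectorized pass would run across thousands of threads over onboard memory, which is where the speedup comes from.

```python
import numpy as np

# Hypothetical tweet table stored column-wise: each field is one
# contiguous array, analogous to a column held in GPU memory.
lat = np.array([40.7, 51.5, 30.0, 41.9])    # latitude per tweet
lon = np.array([-74.0, -0.1, 31.2, 12.5])   # longitude per tweet
ts  = np.array([100, 200, 300, 400])        # timestamp per tweet

def bbox_filter(lat, lon, lat_min, lat_max, lon_min, lon_max):
    """Return a boolean mask of rows inside the bounding box.

    One vectorized pass over the columns -- the columnar equivalent
    of a massively parallel GPU scan."""
    return (lat >= lat_min) & (lat <= lat_max) & \
           (lon >= lon_min) & (lon <= lon_max)

mask = bbox_filter(lat, lon, 35.0, 55.0, -80.0, 15.0)
print(ts[mask])  # timestamps of the tweets inside the box
```

Because the data never leaves the device and each row is tested independently, the scan parallelizes almost perfectly — the property MapD exploits.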

Falling hardware prices and the advance of social media analytics have made visualization technology more accessible; still, turning big datasets into useful animations has remained a time-consuming process for researchers who lack a powerful workstation or cluster.

Where previous technology would take seconds or longer to render data into images or animations, MapD turns millions of data points into maps and animations in just milliseconds. The MapD technology will work for different kinds of data, but the prototype is being demonstrated on tweets. As the video below illustrates, MapD can show how a meme (in this case “rain”) is trending in real time on regional or world maps. The user can set search terms and other parameters, e.g., time frame or geographical region, and the new visualization appears instantly. It’s a lot like using a search engine.
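A query like the "rain" demo pairs a predicate (keyword, time window, region) with a spatial aggregation that turns millions of matching points into a small grid a map layer can color instantly. A minimal sketch of the aggregation half, using NumPy with synthetic data (grid size and names are illustrative, not MapD's API):

```python
import numpy as np

# Synthetic stand-in for geolocated tweets that matched a keyword filter.
rng = np.random.default_rng(0)
n = 1_000_000
lat = rng.uniform(-90, 90, n)     # latitude of each matching tweet
lon = rng.uniform(-180, 180, n)   # longitude of each matching tweet

# One vectorized 2-D histogram buckets a million points into an
# 18 x 36 lat/lon grid of counts -- the heatmap a front end renders.
grid, _, _ = np.histogram2d(lat, lon, bins=[18, 36],
                            range=[[-90, 90], [-180, 180]])
print(int(grid.sum()))  # every point lands in exactly one cell
```

Since binning is a single data-parallel reduction, moving it onto a GPU keeps the whole query-to-pixels path in the millisecond range the article describes.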

The impetus for the idea came during Mostak’s time as a Harvard graduate student in Middle Eastern studies. His thesis project on Egyptian politics during the Arab Spring required some 40 million geolocated tweets to be processed, but mapping the large dataset for interactive analysis would take days. His solution was to build his own database on inexpensive hardware designed for gamers, i.e., GPUs.

“By building a tool to explore data sets like this in a truly interactive fashion, with latencies measured in milliseconds rather than seconds or minutes, we hope to remove a computational bottleneck from the process of hypothesis formulation, testing, and refinement,” Mostak says.

One of the early adopters of this technology will be the Sunlight Foundation, which advocates for open and transparent campaign financing. The organization will use MapD to analyze 22 years of US state and federal campaign donation records to see how the more than 20 million donations break down by donor, region, elected official and other factors.

The combination of low-cost analytics tools and social media data is a powerful force for the democratization of big data visualization, with implications for business, government and academia. For example, the ability to harness geographical data from mobile devices and social media streams in real time would be a tremendous resource for epidemiology and disaster response teams.

Even though MapD has just launched, the research team already plans to expand its hardware support to Intel parts (perhaps Phi) and general x86 processors. Mostak has also said he is 99 percent sure he wants to make MapD open source. He will keep certain parallel processing algorithms proprietary, but publish the base of the data processing system and the compute modules under an open source license.

The public version of the tool has a database of 50 million geocoded tweets posted between September 28 and October 6. Visitors to the site can input “what,” “who” and “where,” then zoom all the way down to the level of individual tweet data.