Big Data Versus Big Compute

By Tiffany Trader

December 10, 2013

Big data and big compute are not new concepts. Before the term “big data” took off as the buzzword du jour, the HPC community expressed these same ideas as compute-intensive and data-intensive computing. Problems were compute-bound, I/O-bound, or both.

The world is, however, in the midst of a data explosion. In 2013, the amount of data flowing through the Internet reached 667 exabytes, equivalent to more than 141 billion DVDs. The rapid rise of the big data framing reflects this shift, and big compute works nicely as a complementary term. The two are essentially two sides of the same coin. Or are they?

In a recent TEDx Talk, Virginia Tech professor and noted HPC expert Wu Feng discusses how these elements are experienced differently across nations.

Feng begins his talk with a question: “In today’s rapidly evolving technological world, is our future in big data or big compute?”

As he provides an overview of the terms, Feng references HokieSpeed, the GPU-accelerated supercomputer that he developed, which debuted as the greenest commodity supercomputer in the US in November 2011. HokieSpeed is a big compute resource, notes Feng, capable of performing 500 trillion operations per second*, 100,000 times faster than a typical PC.
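For a sense of scale, that speedup squares with a typical desktop of the era delivering a few gigaflops. A quick back-of-envelope check, using the footnoted single-precision peak (the PC figure is an assumption for illustration, not from the article):

```python
# Back-of-envelope check of the "100,000 times faster than a typical PC"
# claim, using the footnoted single-precision peak. The PC figure is an
# assumption for illustration.
hokiespeed_flops = 455e12   # 455 teraflops, single-precision peak (per footnote)
typical_pc_flops = 4.55e9   # assumed ~4.5 gigaflops for a circa-2011 desktop
print(f"Speedup: {hokiespeed_flops / typical_pc_flops:,.0f}x")  # -> 100,000x
```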

HokieSpeed and other systems like it are being used for epidemiological studies, which can be used to guide public policy in the event of disease outbreaks. Simulations boost scientists’ understanding of how viruses spread, enabling them to assist public health officials in devising appropriate containment measures.
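To make the simulation idea concrete, here is a minimal sketch of the kind of compartmental SIR (susceptible-infected-recovered) model that epidemic simulations build on. It is an illustrative toy, not HokieSpeed's actual code; the function name and all parameter values are hypothetical.

```python
# Minimal SIR epidemic sketch. Illustrative only -- production
# epidemiological studies use far larger, agent-based models.

def simulate_sir(population, infected, beta, gamma, days):
    """Step a simple SIR model forward one day at a time."""
    s, i, r = population - infected, float(infected), 0.0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / population  # contacts that transmit
        new_recoveries = gamma * i                  # infected who recover
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Example: 1 million people, 10 initial cases, R0 = beta/gamma = 2.5.
trajectory = simulate_sir(1_000_000, 10, beta=0.5, gamma=0.2, days=180)
peak = max(i for _, i, _ in trajectory)
print(f"Peak simultaneous infections: {peak:,.0f}")
```

Scaling such a model from one aggregate population to millions of interacting individuals is what pushes this work onto big compute resources.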

Another HokieSpeed project aims to reverse-engineer the brain. Researchers are trying to find repeating patterns of higher-order motor function in EEG brain readings, and simulations are used to map neurological pathways.
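As a rough illustration of what finding repeating patterns in a signal can mean computationally, here is a toy sliding-window correlation sketch. Everything in it is hypothetical; real EEG motif mining involves far more sophisticated methods and vastly more data.

```python
# Toy motif search: slide a template across an EEG-like signal and
# report where the (Pearson) correlation is high. Illustrative only.
import numpy as np

def find_motif_matches(signal, template, threshold=0.9):
    """Return start indices where the template strongly matches the signal."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    matches = []
    for start in range(len(signal) - n + 1):
        window = signal[start:start + n]
        std = window.std()
        if std == 0:
            continue
        corr = np.dot((window - window.mean()) / std, t) / n
        if corr >= threshold:
            matches.append(start)
    return matches

# Synthetic example: bury the same burst twice in noise and recover it.
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 4 * np.pi, 50))
signal = rng.normal(0, 0.3, 1000)
signal[100:150] += template
signal[600:650] += template
print(find_motif_matches(signal, template))  # clusters near 100 and 600
```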

One of the neurological ailments in the news today is CTE, a progressive, degenerative brain disease that affects athletes with a history of brain trauma, namely concussions. CTE can only be definitively diagnosed after death, but neurologists are working towards diagnosing and treating CTE in living patients. On a PC, this kind of research would take months or years; on a system like HokieSpeed, it takes hours or days.

Big data has many definitions, and one important characteristic is that it’s relative, i.e., more data than you are used to. “Big data is your humongous haystack and various algorithms that you use to root around that haystack. Big compute is lots of metal detectors,” explains Feng. “They’re the devices with which you are going to try and find all the little needles of information in the haystack that you can glean some insight and knowledge from.”
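The metaphor maps neatly onto parallel search: the haystack is the dataset, and each metal detector is a worker scanning its own share of it. A minimal sketch, with hypothetical data and an arbitrary worker count:

```python
# Feng's metaphor as code: split one big haystack across several
# "metal detectors" (worker processes). Data and counts are hypothetical.
from multiprocessing import Pool

def scan_chunk(chunk):
    """One metal detector: return any needles found in this chunk."""
    return [record for record in chunk if "needle" in record]

if __name__ == "__main__":
    haystack = [f"hay-{i}" for i in range(1_000_000)]
    haystack[123_456] = "needle-a"
    haystack[987_654] = "needle-b"

    detectors = 8  # more compute = more detectors sweeping the same haystack
    chunks = [haystack[i::detectors] for i in range(detectors)]
    with Pool(processes=detectors) as pool:
        found = [needle for hits in pool.map(scan_chunk, chunks) for needle in hits]
    print(found)
```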

Feng makes the case that different nations have different priorities when it comes to investing in big data or big compute.

In May 2013, Feng met with White House officials to discuss DNA sequencing research in the life sciences. One application is finding mutations in genomes, which makes it possible to infer the pathways that cause cancer and sets the stage for potential treatments. At this function, notes Feng, the focus was clearly on big data; big compute, while important, was secondary.

Three weeks later, Feng traveled to China as part of a US delegation, where he found that the converse was true.

“Here, we look at big data as being more important,” Feng states. “And in China, big compute is more important than big data, so much so that they created a supercomputer called Tianhe-2 that is 282 times faster than HokieSpeed and twice as fast as the fastest US supercomputer.”

In China, big data is viewed merely as an application area of big compute, notes Feng.

Feng contends that big data, at least in the US, has been elevated to a position above big compute, in part because the compute side is so often hidden from the user. For example, Google returns search results with lightning speed, but the average person does not realize the immensity of the underlying computational infrastructure that enables the transaction.

He cites IBM Watson’s Jeopardy appearance as another example of a very visible “big data” application where the compute side was essentially hidden from the audience.

So what should we be investing in? asks Feng. As complementary forces, data and compute go hand in hand. “In order to make sense of the data, we need to compute on the data.” There is a cycle in which data becomes information, then knowledge, then wisdom, and each of these steps requires computing.
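That cycle can be pictured as a pipeline in which every stage is itself a computation. A toy sketch with entirely hypothetical stage logic, echoing the epidemiology example above:

```python
# Data -> information -> knowledge -> wisdom, with computing at each
# step. All thresholds and stage logic are hypothetical illustrations.
raw_data = [98.6, 99.1, 103.2, 98.7, 104.0, 98.5]  # patient temperatures (F)

def to_information(data):       # compute: attach structure to raw readings
    return list(enumerate(data))

def to_knowledge(information):  # compute: extract a pattern (fevers)
    return [(i, t) for i, t in information if t >= 100.4]

def to_wisdom(knowledge):       # compute: turn the pattern into a decision
    return "investigate possible outbreak" if len(knowledge) >= 2 else "keep monitoring"

print(to_wisdom(to_knowledge(to_information(raw_data))))
```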

*Note: According to Virginia Tech’s announcement, HokieSpeed claims “a single-precision peak of 455 teraflops, 455 trillion operations per second, and a double-precision peak of 240 teraflops, or 240 trillion operations per second.”
