Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

August 16, 2010

TeraGrid 2010 Keynote: Attendees Peer into Blue Waters

by Jan Zverina

NCSA’s Wilhelmson discusses first research projects for supercomputer’s 2011 debut

Blue Waters, expected to be one of the most powerful supercomputers in the world for open scientific research when it comes online next year, is being counted on to help solve some of the world’s most vexing scientific and social challenges, from figuring out how the first galaxies formed to simulating the spread of disease across large populations to better prepare us for such medical emergencies.

At this year’s TeraGrid conference, Bob Wilhelmson, recently retired chief science officer of the National Center for Supercomputing Applications (NCSA) and former applications lead for the Blue Waters project, delivered a keynote address in which he discussed the Blue Waters architecture and shared several planned projects for the new supercomputer, a joint effort between NCSA, the University of Illinois, IBM, the Great Lakes Consortium for Petascale Computation, and the National Science Foundation (NSF).

Eighty percent of the Blue Waters resource will be dedicated to NSF awardees through the Petascale Computing Resources Allocation Program, or PRAC, Wilhelmson told TG’10 attendees in Pittsburgh, Pa. Each PRAC award identifies a scientific challenge requiring advanced modeling and simulation capabilities that only a system delivering sustained performance approaching one petaflop can provide. Awardees receive a $40,000 travel grant to learn about the new supercomputer system and prepare their algorithms to scale to hundreds of thousands of processors/cores. To date, 18 awards have been made and 55 proposals are under review. Approximately 10 new awards are expected.

Some of the first areas of research selected for Blue Waters include:

  • The simulation of stellar weather, including high-resolution turbulence simulations. These simulations will help researchers better understand convection in the Sun and other stars.
     
  • The study of chromatophores, the cells largely responsible for generating skin and eye color in cold-blooded animals. At the bacterial level, this study is expected to assist researchers in better understanding human disease and in new drug development.
     
  • The actions and interactions of quarks, the elementary particles that are fundamental constituents of matter. This particle physics project will provide information of value for research in astronomy, physics, meteorology, and other fields.
     
  • The simulation of disease spread and pandemics in very large social networks. The project will model predictions of network behavior among human populations of 300 million or more, and provide guidance for medical emergency preparedness, such as mass vaccinations.
     
  • The formation of the first galaxies. Blue Waters’ massive computer power will allow researchers to simulate large numbers of galaxies with much higher resolution.
     
  • A ‘bio evolution’ project. This effort will focus on how bacteria mutate and how to clean up environmental contamination by developing multi-scale models of bacteria populations.
     
  • Simulating supercell storms and tornadoes. Blue Waters’ resources will be used to carry out simulations of tornadoes embedded in supercell storms with unprecedented detail and accuracy, using up to 8 million times as many grid points as was possible to compute in the 1970s.

Blue Waters is being built from the most advanced computing technologies under development at IBM, including the multicore ‘POWER7’ microprocessor. The system will have more than one petabyte of memory, more than 10 petabytes of disk storage, and eventually up to 500 petabytes of archival storage. It will take up approximately 5,000 square feet of floor space in a new state-of-the-art computer facility at the University of Illinois.

“The concept here is to develop a well-balanced machine both in terms of compute power, memory size, disk/archive storage, and IO capability,” Wilhelmson told TG’10 attendees. “It is one of the things that many organizations struggle with.”

While the numbers behind the data-intensive Blue Waters supercomputer are impressive — its 300,000-plus cores will help the system achieve peak performance of approximately 10 petaflops, or 10 quadrillion calculations per second, and deliver a sustained performance of at least one petaflop on a range of real-world science and engineering applications — Wilhelmson said it is the science and scientific advances that are really important.

“Machines are just technology,” he said. “They live for five years and then they’re gone, replaced by something else. What does not die is the application, because it is developed and used to gain a deeper understanding of the world around us.”

Wilhelmson, an atmospheric scientist at the University of Illinois at Urbana-Champaign, also had some advice for students and young researchers working with TeraGrid, the nation’s largest open-access scientific discovery infrastructure.

“Expect to work in teams,” he said, adding that the days of researchers working alone on a project are over. “Teams and collaborations are crucial to solving interdisciplinary problems and furthering our understanding because the problems are so big and often quite complex.”

Wilhelmson said that today’s scientists must be “nimble and adaptive,” willing to try new things, and “find new ways to deal with the data explosion which we are in part creating.

“We will be able to do things on Blue Waters that I never dreamed about,” he said, adding that “we are now solving problems that we didn’t have enough computational power to solve in the past.”

In conclusion, Wilhelmson stressed the need for adequate funding for applications development and system support, calling it essential to progress and leadership. Yet he expressed doubt about the next frontier: exascale computing, a thousand-fold increase in performance over the petascale level.

“I’ll make a claim,” he told the TG’10 audience. “There will be no general purpose exascale machine ever built that anyone can afford to operate, much less buy,” largely because of the massive amount of funding that will be needed, along with the extreme power requirements. Upon reflection, Wilhelmson challenged today’s young computer scientists: “Who will show that this prediction is wrong?”
