Since 1987 - Covering the Fastest Computers in the World and the People Who Run Them

July 25, 2012

NASA Builds Supercomputing Lab for Earth Scientists

Robert Gelber

This week, NASA announced it would soon launch a new HPC and data facility that will give earth scientists access to four decades of satellite imagery and other datasets. Known as the NASA Earth Exchange (NEX), the facility is being promoted as a “virtual laboratory” for researchers interested in applying supercomputing resources to areas like climate change, soil and vegetation patterns, and other environmental topics.

Much of the work will be based on high-resolution images of Earth that NASA has been accumulating since the early 70s, when the agency began collecting the data in earnest. Originally known as the Earth Resources Technology Satellite (ERTS) program, and later renamed Landsat, its mission was to serve up images of the Earth, allowing scientists to observe changes to our planet over time. This includes tracking forest fires, urban sprawl, climate change, and a host of other phenomena. Data generated by these satellites has been extremely popular in the global science community. In the last 10 years, more than 500 universities around the globe have used Landsat data to support their research.

Over time though, the program’s growth created a logistical problem. Multiple datasets eventually spanned facilities around the US, which presented challenges for researchers looking to retrieve satellite imagery. Recognizing the issue, NASA created the NEX program with the goal of increasing access to the three-petabyte library of Landsat data.

NEX will house all data generated by Landsat satellites and related datasets, as well as offer analysis tools powered by the agency’s HPC resources. We spoke with NASA Ames Earth scientist Ramakrishna Nemani, who explained the purpose behind the NEX facility and how it has been implemented. “The main driver is really big data,” he told HPCwire. “Over the past 25 years we have accumulated so much data about the Earth, but the access to all this data hasn’t been that easy.”

Prior to NEX, he said, researchers would be tasked with locating, ordering and downloading relevant data. The process could be time-consuming because the satellite imagery they wanted might be housed at any of several locations. Even after locating the desired images, data transfer times would often be prohibitive.

NASA set out to solve the problem, leveraging one of its strongest assets: supercomputing. The agency decided to take all of the disparate datasets and migrate them to the Ames Research Center. “We said ‘let’s do an experiment.’ We already have a supercomputer here at Ames, so we can bring all these datasets together and locate them next to the supercomputer,” said Nemani.

That system, known as Pleiades, is the world’s largest SGI Altix ICE cluster and the agency’s most powerful supercomputer. Pleiades has been upgraded over time, accumulating several generations of Intel Xeon processors: Harpertown, Nehalem, Westmere, and, most recently, Sandy Bridge. For extra computational horsepower, the Westmere nodes are equipped with NVIDIA Tesla GPUs. Linpack performance is 1.24 petaflops, which earned it the number 11 spot on the June 2012 TOP500 list.

The system also includes 9.3 petabytes of DataDirect storage. Given that capacity, Ames is now able to host the three petabytes of image data at a single location. But NEX was created to do more than hold all the satellite imagery under one roof. A collection of tools was developed to help researchers analyze the data using the Pleiades cluster.

For example, a scientist could map vegetation patterns with the toolset, piecing together satellite images like a jigsaw puzzle. The program estimates that processing a scene containing 500 billion pixels would take under 10 hours. Without the NEX toolset, scientists would have to develop their own computational methods to perform similar research.
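For a sense of scale, that estimate implies a sustained rate of roughly 14 million pixels per second. A quick back-of-the-envelope check (the figures come from the estimate above; nothing here is an official NEX benchmark):

```python
# Throughput implied by the NEX estimate: a 500-billion-pixel
# scene processed in under 10 hours.
pixels = 500e9
seconds = 10 * 3600  # 10 hours = 36,000 seconds

throughput = pixels / seconds  # pixels per second, sustained
print(f"Implied throughput: {throughput / 1e6:.1f} Mpixel/s")
# prints "Implied throughput: 13.9 Mpixel/s"
```

That rate is only achievable by spreading the work across many nodes at once, which is exactly what colocating the data next to Pleiades makes practical.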

While making Pleiades’ compute resources available was beneficial for researchers, it posed something of a challenge for the NEX project team, since a certain level of virtualization is required to support concurrent access. The marriage of virtualization and supercomputing can be “tricky business,” according to Nemani, but the program had a unique plan in this regard.

“We have two sandboxes that sit outside of the supercomputing enclave,” he said. “We bring in people and have them do all the testing on the sandboxes. After they get the kinks worked out and they’re ready to deploy, we send them inside.”

Eventually, the program would like to have scientists run their own sandbox program and upload it to the supercomputer as a virtual machine.

While NEX has some cloud elements to it, NASA could not feasibly run the project on a public cloud infrastructure. “We are trying to co-locate the computing and the data together, just like clouds are doing. I would not say this is typical cloud because we have a lot of data. I cannot do this on Amazon because it would cost me a lot of money,” said Nemani.

The NEX program also features a unique social networking element, which allows researchers to share their findings. It’s not uncommon for scientists to move on after working on a particular topic, which reduces access to the codes and algorithms used in their research. The social networking tools provided by NEX allow peers to go back and verify the results of previous experiments. Combined with access to HPC and the legacy datasets, the facility provides what may be the most complete set of resources of its kind in the world.

“Basically, we are trying to create a one-stop shop for earth sciences,” said Nemani.
