TeraGrid 2010 Keynote: The Essential Role Cyberinfrastructure Plays in the Geosciences

By Faith Singer-Villalobos

August 12, 2010

Dr. Tim Killeen, representing the National Science Foundation (NSF), last week addressed the annual TeraGrid ’10 conference in Pittsburgh, Pa. His keynote emphasized the urgent need for sustainable cyberinfrastructure in the geosciences and across all domains of science.
“The geosciences is a domain in which cyberinfrastructure is incredibly important,” Killeen, the NSF assistant director for geosciences, said. “There is need for end-to-end cyberinfrastructure that is accessible to brilliant young career professionals across the country. We need the capabilities now.”

A year ago, the NSF and funding agencies from countries including Brazil, Australia, Russia, Canada, France, Germany, Great Britain, and Japan declared that they would work collectively “to deliver knowledge to support human action and adaptation to regional environmental change” on global issues such as climate change and the availability of fresh water on the planet.

“We don’t have a century to get this right,” Killeen said. “We need the resources, sustained investments, smooth transitions, and accessibility of these resources to the brain trust of the nation and internationally. It’s amazing when you look at the strategic plans of other countries and see how parallel they are to our own.”

The crux of the challenge is developing an earth-human knowledge management system (aka “Earth-Cubed”) to support a more complete understanding of the Earth system and human interactions with that system. “It’s a scientific and technical challenge for the 21st century,” Killeen said.

Killeen asked each conference participant to consider their own role, and the TeraGrid’s role, with regard to “Earth-Cubed,” citing the geosciences as an example of the domain requirements and cyberinfrastructure demands placed on the TeraGrid community.

“I have yet to see a high-performance computing center that doesn’t use the geosciences as a driver or rationale as to why we need this type of capability,” he said.

Yet focusing on sustainability will stretch the NSF and other agencies to deliver this in a robust way, Killeen said. “It’s going to place demands on the types of products and services that come out of TeraGrid. Cyberinfrastructure’s interface with science and society is going to be challenging — no question about it.”

Currently, the NSF’s Geosciences Directorate invests 10 percent of its overall budget in cyberinfrastructure, in addition to the investments made by the NSF Office of Cyberinfrastructure (OCI). “We like to invest in multiple approaches with multiple outcomes…things that enhance productivity and capability. We look to the community for direction and priority. We like to understand the full life cycle costs and process. We want to anticipate increases in demand, make new investments, and address workforce issues.”

It’s a very exciting time for the geosciences, according to Killeen. There are new ways of looking at the Earth system, and new methods for addressing much tougher problems that require advanced computing simulations. In addition, data volumes are large and the data return is in the upper 90th percentile. “It’s about the data and the value that data brings to understanding. Data-intensive computing is a very high priority,” he said.

According to Killeen, on-demand, global experiments in the geosciences are becoming the norm. But the changing context of human interactions with the Earth system is much more complex. It requires another level of integrated assessment models that can only be achieved through advanced computing simulation.

The next generation of models will be able to resolve detailed processes in the oceans, atmosphere and land. “We’ve reached this threshold, in part, by TeraGrid’s efforts in building the appropriate cyberinfrastructure and computational capability. We’re coming to a point where the sensor arrays and the development of the earth models are maturing at the same time. Models have become substantially more complete; they drive the most capable computation.”

Killeen said the common architecture for cyberinfrastructure includes hardware, software, data provision, networking, sensor deployment, model assimilation, middleware, tools, Web services, cloud computing, free global access to information, and single-password authentication, as well as integrated services (data and instrument services, governance activities, computational expertise).

“Overall, it’s a sustained investment and a balanced approach to drive transformation in the scientific disciplines. This is what NSF wants to get to…it involves training people to address incredibly important societal challenges with all the tools at our command. It’s going to be the cyberinfrastructure that transforms the geosciences and takes it to the next level. We’re now poised to do it in the next 10 years.”
