The Future of Climate Research: A Q&A with ORNL’s James Hack

By Leo Williams (NCCS Science Writer)

November 7, 2008

When James Hack came to Oak Ridge National Laboratory (ORNL) at the end of 2007, he was given two hats: one as the director of ORNL’s National Center for Computational Sciences (NCCS) and the other as leader of ORNL’s laboratory-wide climate science effort.

At the helm of the NCCS, he guides the most powerful open science supercomputing center in the world. The NCCS hosts leading research in climate dynamics and the development of alternative energy sources, as well as a wide range of computational sciences — from basic explorations in nuclear physics and quantum dynamics to astrophysical studies of supernovas and dark matter.

As leader of ORNL’s Climate Change Initiative, he is in charge of pulling together scientists and engineers from across ORNL to advance the state of the science. Hack is uniquely qualified to take on this role. Before coming to ORNL, he headed the Climate Modeling Section at the National Center for Atmospheric Research (NCAR) in Boulder, Colo., and served as deputy director of the center’s Climate and Global Dynamics Division.

We asked Hack about the future of climate science and the climate initiative at ORNL.

HPCwire: How will climate research evolve in the coming years?

Hack: Climate science has largely been curiosity-driven research. But the growing acceptance that humans affect the evolution of atmospheric composition, land use, and so on, all of which in turn affect the climate state, provides a little more focus and a little more urgency to taking a harder look at what the modeling tools are capable of providing in the form of specific consequences for society.

That to me is the transformation. There’s a growing need for improvements in simulation fidelity and predictive skill. The potential consumers of that kind of simulation information will be leaning hard on the climate change community to provide answers to their questions. That’s the change that’s going to differentiate the next 10 years of climate change science from the previous 30.

For example, we know from observations over the last 50 years that the snowpack in the Pacific Northwest has been decreasing. At the same time, temperature in the same region has been increasing. If that trend continues, it raises lots of concerns for water resource managers who have counted on storing their water in the form of snow until a certain time of year when it starts melting.

If precipitation never comes down as snow or if it starts melting sooner than you need it, you may not be able to meet your water demands. It’s an example of an infrastructure that’s vulnerable to specific changes in a region’s climate state. Many of the solutions to this problem may also bring with them other environmental consequences.

HPCwire: So what can you do to help users of climate data?

Hack: We need to know if we can tie down with some certainty how climate will change on the scales that matter to people. It’s one thing to tell somebody that the planet’s going to warm by 2 degrees centigrade between now and 2100, but it doesn’t really help anybody who’s in the business of planning or managing societal infrastructures on regional scales. We know from the models that it won’t be a homogeneous change. The high latitudes are going to feel maybe 8-degree increases in temperature, and the lower latitudes are going to feel considerably less. And quantifying changes in the hydrological cycle on regional scales may be even more important than temperature changes.

We think we might currently have sufficient skill to project climate change on regional scales about the size of the Southeast, Pacific Northwest, Rocky Mountain West, or Farm Belt. As a community we need to demonstrate that the potential is really there and try and quantify what the uncertainties are. We haven’t done a very good job with this challenge so far. But I think the scientific community is starting to realize that we have an opportunity to take a step back and ask, “What can we do on regional scales and timescales that we think are predictable?”

For example, there’s a belief that climate statistics have some predictive skill on decadal timescales. The driver for that is going to reside in the ocean, the motion scales of which have a very, very long time frame. There is a belief in the scientific community that the ocean’s behavior can be predicted several decades into the future.

If you can do the ocean part of the problem, given the fact that 70 percent of the planet is covered with water, you have a very strong constraint on the other parts of the system. Then the question is, “Will the other component models follow?” The atmosphere doesn’t have any deterministic predictive skill beyond a few weeks. So you’re dealing with statistics that are forced by components of the climate system that have much slower variability than the atmosphere. Even the terrestrial components of the climate system, particularly land use changes, come into play on longer timescales.
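The idea that slowly varying components can lend predictability to atmospheric statistics, even when the day-to-day weather is unpredictable, can be illustrated with a toy model in the spirit of Hasselmann-type stochastic climate models. The model and every number in it are my own illustration, not anything from the interview: a shared, slowly oscillating "ocean-like" forcing drives two runs whose fast "weather" noise differs.

```python
import numpy as np

n_days = 7300                          # twenty years of daily values
t = np.arange(n_days)

# Slow, "ocean-like" forcing shared by both runs: a multi-year oscillation.
slow = np.sin(2 * np.pi * t / 1460)    # roughly a four-year period

def run(seed):
    """A fast 'weather' variable: the slow forcing plus large daily noise."""
    noise = np.random.default_rng(seed).normal(0.0, 3.0, n_days)
    return slow + noise

a, b = run(1), run(2)                  # identical forcing, different weather

def annual_mean(x):
    return x.reshape(-1, 365).mean(axis=1)

daily_r = np.corrcoef(a, b)[0, 1]                            # near zero
annual_r = np.corrcoef(annual_mean(a), annual_mean(b))[0, 1]  # high
```

Day by day, the two runs are essentially uncorrelated — deterministic weather skill is absent by construction. But their annual-mean statistics agree closely, because the slow forcing survives the averaging while the fast noise cancels. That is the sense in which statistics forced by slowly varying components can be predictable.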

HPCwire: Are we ready to make predictions about the ocean?

Hack: As is the case with the atmosphere, we’re still building knowledge about the ocean component. A difficult challenge will be initializing the ocean state for the purpose of prediction. I believe there’s a tremendous opportunity for people who want to pursue the ocean initialization problem.

To deliver decadal prediction we will need to treat the climate problem as an initial value problem and not a hypothetical boundary-value problem. Besides getting the statistical behavior right, you need the phase of low frequency variability to be correct as well. For example, predicting when an El Niño will occur or when a La Niña will occur. If we can accurately predict this type of ocean behavior, there is evidence that other features of the climate state can be accurately reproduced. That’s a matter of correctly initializing the model and accurately incorporating all the necessary physics in the respective component models.
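The difference between treating climate as an initial value problem rather than a boundary-value problem can be sketched with the classic Lorenz-63 system — a standard toy example, not anything drawn from the interview. Two integrations that differ only minutely in their initial state track each other for a finite window (the payoff of good initialization) and then diverge, even though their long-run statistics agree. Forward Euler is used here only for brevity; a real integration would use a higher-order scheme.

```python
import numpy as np

def lorenz(state, dt=0.01, steps=2000):
    """Integrate the Lorenz-63 system with forward Euler (illustrative only)."""
    x, y, z = state
    traj = np.empty((steps, 3))
    for i in range(steps):
        dx = 10.0 * (y - x)
        dy = x * (28.0 - z) - y
        dz = x * y - (8.0 / 3.0) * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        traj[i] = (x, y, z)
    return traj

truth = lorenz((1.0, 1.0, 1.0))
forecast = lorenz((1.0 + 1e-6, 1.0, 1.0))   # tiny initialization error
err = np.abs(truth[:, 0] - forecast[:, 0])
# Early on the error stays tiny (a skillful window bought by initialization);
# by the end of the run it has grown to the size of the attractor itself.
```

Getting the phase of low-frequency variability right — knowing when an El Niño or La Niña will occur — is the climate analogue of staying inside that early window, which is why initializing the ocean state accurately matters so much.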

HPCwire: How do you demonstrate that you’re getting it right?

Hack: We can come up with numerical experiments to assess whether the global model can produce useful information on the timescales and space scales of most importance to resource managers and planners. They may want to know where the temperature’s headed locally, how the hydrological cycle is likely to behave, or how extreme events will change. Do the models provide us with the kind of predictive skill we need, and if not, how can they be improved?

When you start windowing down to very small space scales, at what point does the uncertainty or natural noise in the system begin to swamp the signal that you’re trying to find? We can illuminate that with retrospective simulations because we have lots of data for an instrumented period that’s multidecadal. It’s not all the same quality, but it quantifies what’s happened in the climate record in a much more complete way, say, than going back to paleoclimate times or even going back a few centuries. Retrospective simulations over the latter part of the 20th century can help to quantitatively establish what the models are capable of doing or not capable of doing on relatively fine spatial scales.
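One way to see how natural noise can swamp a fine-scale signal is a synthetic trend-detection exercise. The numbers below are entirely illustrative (not climate data, and a deliberately crude setup — real internal variability is spatially correlated): the same imposed trend is recoverable from a regional average long before it emerges at any single gridpoint.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(30)
trend = 0.02   # degrees per year, imposed everywhere (the "signal")

# 400 gridpoints of independent interannual noise -- a crude stand-in
# for internal variability.
field = trend * years[:, None] + rng.normal(0.0, 1.0, size=(30, 400))

def fitted_trend(series):
    return np.polyfit(years, series, 1)[0]

point_trends = np.array([fitted_trend(field[:, j]) for j in range(400)])
regional_trend = fitted_trend(field.mean(axis=1))
# At individual gridpoints the spread of fitted trends is comparable to the
# trend itself (some points even appear to cool); the regional average
# suppresses the noise and recovers the imposed 0.02 per year.
```

Retrospective simulations against the instrumented record play the same role in practice: they quantify, scale by scale, where the signal stands clear of the noise and where it does not.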

HPCwire: What is the role of computing in this effort?

Hack: Computing is a big part of the effort. To fully evaluate the skill in our modeling tools, we need very large computer systems — petascale machines. Assimilating data streams that will be used in the evaluation of modeling frameworks requires very large computer and data systems.

Clearly, a significant computational piece is modeling — building models that have all the components they need to accurately predict the evolution of the earth’s climate system. That’s computationally very intensive. Incorporating the complexities of the carbon cycle in these models, using the expertise of ORNL’s Environmental Sciences Division, contributes to the computational demands. And then mining the data to deal with questions of human impacts and climate extremes, that again is very computationally intensive.

So computation does in fact tie the whole effort together. It cuts across all the various climate science applications. There are certain areas of science where you need a virtual laboratory to explore the what-if experiments, and that’s what computation provides for the climate problem.

Global modeling is something that has been funded under programs like SciDAC [Scientific Discovery through Advanced Computing] and other DOE programs in partnership with other national labs like NCAR. For example, there’s an almost 20-year history of ORNL partnering with NCAR on the development of global models and implementing global models efficiently on high-performance computing systems. We are also in the process of building strong new relationships with our NOAA [National Oceanic and Atmospheric Administration] and NASA [National Aeronautics and Space Administration] climate modeling colleagues, looking at high-resolution global modeling, quantifying predictive skill on climate timescales, identifying climate extremes in global simulations, and exploring climate impacts in the context of integrated assessment modeling. All this builds on strong preexisting partnerships with many other DOE laboratories.

HPCwire: You are leading a new multidisciplinary effort at ORNL focused on climate science. What is the reasoning behind this effort?

Hack: ORNL has identified climate change as an opportunity that could very effectively exploit existing competencies, particularly high-performance computing and ORNL’s long history in contributing to fundamental knowledge about carbon science and in global modeling. The lab also has expertise in evaluating impacts on societal infrastructure. Take rising sea levels. A large share of the world’s population lives close to coastlines, so if the sea level rises even a meter, it has a huge societal impact. The people who are displaced must go somewhere else, maybe moving into areas that were previously used for agriculture. That displaces agricultural activities. ORNL has a very strong GIS [geographic information systems] group that can contribute to quantification of these scenarios.

So we’re looking at how we can bring these various competencies together to provide a capability that’s unique among the laboratories. The end result for us is to provide stakeholders, resource managers, and others with information they need to deal with the consequences of climate change.

HPCwire: What will ORNL’s initiative look like?

Hack: It’s a cross-cutting initiative. We’re trying to engage people from across the laboratory to stretch the kind of work they’re doing in such a way that it requires partnerships with other ORNL folks. So far, many of the more promising proposals include collaborations that cut across the Biological and Environmental Sciences Directorate and CCSD [Computing and Computational Sciences Directorate].

As the initiative matures, I hope we’ll begin to incorporate more people in the energy arena, another strong part of the ORNL scientific program. These things could include ways to link climate change and the hard questions we’re facing in energy production, like bioenergy and renewable energy technologies, as well as energy consumption. Dealing directly with climate mitigation questions, such as strategies for the sequestration of carbon, is an opportunity for this initiative.

From an energy production point of view, planning has a multidecadal timeframe. Anyone planning investments in the energy infrastructure needs to understand what role the environment might play. That’s the goal — to be able to say 20 years from now, “Here’s what we anticipate will happen with regard to environmental change on a regional scale.”
