Supercomputing Experts React to Dire Climate Report

By Oliver Peckham

August 26, 2021

The Intergovernmental Panel on Climate Change (IPCC) recently released the first major component of its Sixth Assessment Report (AR6). The IPCC’s first assessment report since 2014, AR6 paints a bleak picture of both the present and the future, illustrating “widespread, rapid and intensifying” climate changes and all but guaranteeing that the world will cross the +1.5°C threshold in the coming years.

HPC earns a few mentions in this first installment of AR6 – no surprise, as this installment covers the physical science of climate change, and HPC remains core to large-scale climate modeling. The relationship, of course, goes both ways: climate modeling was one of the first applications of HPC and remains one of its foremost.

In the wake of the new climate report, HPCwire spoke with three HPC experts about the present and future of climate supercomputing:


Kate Evans, division director for Oak Ridge National Laboratory’s Computational Science and Engineering Division – and previously, group leader for ORNL’s Computational Earth Sciences Group.

Thomas Sterling, professor of intelligent systems engineering at Indiana University and co-developer of the Beowulf cluster.

Rick Stevens, associate laboratory director for computing, environment and life sciences at Argonne National Laboratory.


Reactions to AR6

The interviewees lauded the scientific improvements demonstrated in AR6, attributing many of them to advances in HPC. “HPC’s footprint is all over AR6, right?” Evans said. “Of course I’m happy to see that they’re using more advanced models which took advantage of higher resolution meshes, more detail, more fidelity, more ensembles – and you couldn’t have done that without HPC.”

Stevens agreed that AR6 was a “refinement … of what we had before,” but stressed that while the wording may be more severe, the science of the new report is anything but shocking for many people working in the field. “For people that haven’t been paying attention, it should be pretty sobering,” he said. “For people who’ve been paying attention, it’s like – well, this is what we’ve been telling you guys for the last 20 years!”

Sterling, meanwhile, was somewhat more unsettled by the report. “We’re in major trouble,” he said. “And I’m not an alarmist – though I am mildly alarmed. … We will survive COVID. … It’s not absolutely clear we’re going to survive climate change.”


Adaptation gains prominence

AR6 has served as the loudest warning yet that climate change is not only coming: it’s here, and it’s quickly accelerating. This abrupt transition from theory to reality – given form by the intensifying disasters around the world – has decision-makers talking more about adaptation research (focused on weathering the effects of climate change), potentially at the expense of mitigation research (which focuses on preventing climate change).

“I think the shift to adaptation [research] started years ago,” Stevens said. “It’s like a slow-motion movie, right? … Even if you got the emissions corrected immediately, we’d still be riding out climate change for a while. So that realization has been around for a decade or more. And so in policy spaces, even if you go back to the Obama administration, there was already a shift toward adaptation in our work.” Now, Stevens said, there’s a balance between adaptation work and mitigation/decarbonization work.

“I do see a shift [away from] trying to prevent it from happening,” Evans agreed. “I think because more people are seeing it happening in their own backyard in a way they didn’t ten years ago. I do think there’s a shift from ‘how do we prevent it’ to ‘how do we deal with it.’”


The continued need for HPC-powered basic climate science

The researchers pushed back against the notion that climate modeling and mitigation research become less important as the changes in the climate grow more apparent. “There’s even kind of a weird reverse logic going on, right?” Stevens said. “‘Well, now that we know that climate change is real and it’s gonna be bad, do we really need to keep doing simulations?’ … That’s not ideal.”

“We have to understand climate change,” Sterling said. “We have to model it. We have to predict it with such confidence that we can act quickly and in concerted efforts. … The target has to be very high-confidence climate models, which we do not have.” Stressing again that he wasn’t an alarmist (“I actually am anti-alarmist!”), Sterling added that he was “concerned that we didn’t have models that were accurate enough to tell us that the entire west side of the country was going to be on fire.”

The interviewees agreed that continued investment in this kind of basic climate science would yield important answers with real-world impacts. “If you go back 10, 20 years ago, the primary goal of the modeling was trying to get the global balances right,” Stevens said. “And I think now, we’ve made good progress on that, but we need to use these models to try to get the local impacts understood.”

“And we still need to understand how the biosphere actually is going to react to all these things,” he added. “There hasn’t been a deep sense of when you have a short-term climate shock … how biospheres react to that, how we understand that process and how we can model it.”


The growing role of AI

Throughout the interviews, the researchers cited the growing cachet of AI in high-level climate modeling. “It’s only more recently that data analytics [and] deep learning have been adequately applied to the [climate] data coming in,” Sterling said, highlighting the relatively novel opportunity to correlate atmospheric data with our bottom-up, chemical understanding of climate dynamics.

“We think that AI is going to make a big difference,” Stevens added, “both in how we can do downscaling and how we can do interpretation of the output of models and interpretation of observational data. But AI can also play a role in the models themselves – and that’s going to be developing over the next 5, 10 years – but it could result in more accuracy, better understanding of processes and reduced uncertainty.”

Evans, like Stevens, cited the burgeoning intersection between AI techniques and more traditional climate modeling. “Combining [machine learning and simulation], I think, is the future,” she said. “It’s not that we are going to do [just] machine learning or just mod sim. I think the combination of data-driven models and physically defined models is going to get us where we need to go – because you do need both, there are things you can’t characterize with equations and there are things you can really characterize well with equations … Modeling and simulation does a great job with things like fluid dynamics and those things where we really understand physically what’s happening.”
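The combination Evans describes is often realized as a hybrid scheme, in which a learned term corrects or augments the tendency computed by a physics-based solver. The minimal sketch below illustrates the idea only in outline – the physics_step, learned_correction and placeholder weights are hypothetical stand-ins for illustration, not drawn from any model discussed in the interviews:

```python
import numpy as np

def physics_step(state, dt=0.1, damping=0.05):
    """Toy 'physically defined' update: a damped linear tendency
    standing in for a fluid-dynamics solver."""
    return state - dt * damping * state

def learned_correction(state, weights):
    """Hypothetical data-driven term: a tiny linear 'network' whose
    weights would be trained offline on observations or high-resolution runs."""
    return weights @ state

def hybrid_step(state, weights, dt=0.1):
    """One time step combining the physics-based update with the learned correction."""
    return physics_step(state, dt) + dt * learned_correction(state, weights)

# Example: march a four-variable toy state forward ten steps.
state = np.array([1.0, 0.5, -0.3, 0.2])
weights = 0.01 * np.eye(4)  # placeholder for trained parameters
for _ in range(10):
    state = hybrid_step(state, weights)
print(state)
```

In practice the learned term would be a neural network trained against observations or high-resolution simulations, and the physics step a full Earth-system model; the structure – physically defined update plus data-driven correction – is the part that carries over.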


Climate lessons from the pandemic

A year and a half ago, virtually every major HPC research organization suddenly had a new, urgent and extraordinarily resource-intensive task: COVID-19 research. But far from viewing HPC’s pivot to COVID research as an impediment to climate research, the interviewees described how the emphasis on COVID research had proved useful for climate research, and vice versa.

“I see it as a 100 percent gain,” Evans said. “My division is computational science, covering many domains – including earth and climate science, where I came from. But we also have a section on human health and biology [and] what we’ve done is actually combine the two. … What we found is that the kinds of climate science people are interested in is how it affects them, right?”

“For example,” she continued, “I was with this working group where we looked at vector-borne diseases. So, mosquitoes carry disease, mosquitoes exist in certain areas and not in others – but with climate change, their habitats will change. And so you can expect if you have more water, right, or ponding or puddles and things, you get more mosquitoes. And so we’re looking at how you connect disease to climate change, so you can imagine that even though the pandemic is getting a lot of attention, we can use that in a way that actually will help people understand the changes for the climate.”

Evans also cited the health impacts of the wildfires, noting how combining the two research teams helped them understand the intertwined effects. Sterling, similarly, referenced a Harvard report. “What they showed was the fires on the West Coast” – specifically, the smoke from those fires – “significantly aggravate the likelihood of COVID being caught by people,” he said.


A model for rapid research

Beyond the scientific intersections, some also cited the pandemic as a model for how urgent research can be conducted swiftly and efficiently. “One of the most important successes … has been the ability to rapidly change,” Sterling said, noting changes in the accessibility of medical data and computational resources. “[Researchers] were able to reduce months of competitive bureaucracy down to a few weeks. Japan did this [with Fugaku and other systems].”

“That’s a remarkable piece of bureaucracy,” he continued. “That’s exactly what we have to do not just with COVID, but with climate change – and we have to do it fast.” Evans also referenced post-COVID changes in the workflow: “I think the pandemic obviously changed the way we collaborate,” she said, adding that “getting cycles and access to HPC is going to force us to collaborate in a way that we should be doing.”


What can the HPC community do?

Countless HPC practitioners are, of course, working on modeling and addressing climate change. “Whether it’s from a standpoint of creating technology – like energy storage or resilient grids or electric vehicles or whatever – there’s a hundred different things the [national] labs are doing that are affecting the decarbonization agenda or the electrification agenda or the clean energy agenda,” Stevens said, going on to cite the imminent promise of exascale climate supercomputing. Sterling, similarly, referred to major advances and investments in HPC for weather and climate, highlighting the recent deal between Microsoft and the UK’s Met Office to provide the world’s most powerful system dedicated to weather and climate research.

Still, the question remains: is there anything else the HPC community can do that would help advance climate change research or awareness?

Sterling returned to organizational considerations. “We need a stronger synthesis [among] the research community,” he said. “Remember, most of the research community are really little mom-and-pop operations in various institutions that are primarily, but not exclusively, being funded by NSF. … In the same way that we have the NIH, we need an oversight body that manages money and manages coordination [and] correlation of work efforts without constraining research ideas.”

“We need the Manhattan Project for climate modeling,” he added. “We need such a project with … people being able to do the research without having to spend most of their time writing the proposals.”

Stevens doubled down on the need for basic research to understand new facets of Earth’s climate. “We still have big open questions in the climate system,” he said. “Not that it would challenge the basic trends that we have, but details still really need to be understood.”

Evans, meanwhile, advised a healthy level of detachment. “I like to separate the two concerns,” she said. “HPC’s job is to do a better job of telling people the true state of affairs and our best predictions of the future. And then what we do with that and how freaked out we are is someone else’s job to worry about.”


Read More

Climate, Exascale & the Ultimate Answer

Microsoft to Provide World’s Most Powerful Weather & Climate Supercomputer for UK’s Met Office

Is Weather and Climate Prediction the Perfect ‘Pilot’ for Exascale?
