CAS 2005: Focus on Earth System Modeling, Katrina

By Christopher Lazou

September 23, 2005

From Sept. 11 to Sept. 15, 80 meteorologists and HPC experts from 12 countries and five continents attended the biennial CAS 2005 workshop on the use of HPC in meteorology, held at the idyllic Imperial Palace Hotel in Annecy, France, and organized by the National Center for Atmospheric Research (NCAR).

This excellent, relatively small and friendly workshop provided a tour de force in meteorological and computing techniques by active practitioners striving to get the most out of the latest HPC technology to refine and improve their climate prediction models. It was augmented by talks from broader scientific centers of excellence, such as NERSC and ORNL from the U.S. and CCLRC from the UK, the latter presenting its e-science program. Most presenters came from sites in the U.S. with large IBM P3/4/5 systems, while the European contingent included a strong representation from sites with large NEC SX-6 and SX-8 systems. This article highlights a few of the many climate issues raised by presentations given at this workshop.

There were 39 presentations over four and a half days, some describing the Grid's potential to enable international collaboration within the community of climate system modeling (CCSM). The talks were crammed with technical information on how to use parallel supercomputers to run the mathematical models that describe climate and weather patterns over time. They were interspersed with weather maps and video clips from simulations, which were compared with satellite pictures of actual weather events.

Why are meteorologists doing all this Earth System Modeling and what is the urgency? Dramatic reports of flooding and other climate change events now appear frequently in the press and on television. Climate simulations show that intensely hot summers and increased rainfall, with the flooding it brings, are likely to become more common. These images are injecting a political dimension into the proceedings.

The destruction of New Orleans by Hurricane Katrina provides a cautionary tale. The Earth System Models correctly predicted the path of Katrina days before it hit the historic city. The predictions were communicated to the authorities, yet the emergency response infrastructure failed in its mission to minimize damage to property and ensure prompt evacuation of citizens, minimizing loss of life and suffering. It is easy to scapegoat individuals for this failure, but the reality goes much deeper. It is a systemic failure in how quickly scientific discovery is translated into public infrastructure for the benefit of society at large. Reporting new knowledge in scientific forums is not enough. It takes effort to engage the political class so that it can take ownership of this knowledge and inform policy for risk-management infrastructure, making it part of the fabric of the emergency response process. In the case of Katrina, the Bush administration's denial of global warming exacerbated this lack of preparation. Such a mindset inevitably downgrades the fiscal provisions made to deal with potential risk. For the people of New Orleans, the outcome was a catastrophic tragedy.

Experiments for the Intergovernmental Panel on Climate Change (IPCC) were completed this summer, and the fourth assessment report, to be published in 2007, is being prepared. The scientific results from these experiments predict a grim future. The trend is clear: more extreme weather, flooding, droughts, stronger and more frequent hurricanes, and climate changes involving desertification and higher sea levels are inevitable during this century. The experiments show that human activity is contributing to global warming. The reduction of snow in the north, the melting of glaciers, the projection of no snow in the north in the year 2100 (even at the North Pole) and the implied rise in sea level raise questions about the state of the atmosphere, oceans, sea ice, land surfaces and humankind. In short, there is a perceived pending catastrophe, because of global warming exacerbated by greenhouse gases and other pollutants from human activities.

Some scenarios show that sea level rise alone could deprive a billion people of food over the next 100 years. Insurance companies cannot protect against consequences of this magnitude. Thus, the stakes are high, and finding answers to the socio-economic effects of climate change has climbed to the top of the political agenda, though sadly not in the U.S., where more than a quarter of the world's pollutants and greenhouse gases are generated, according to the communiqué from July's G8 meeting at Gleneagles.

The key goal of the climate change efforts is to develop and enhance our capability to monitor and predict how the Earth System is evolving. On seasonal and inter-annual temporal scales, weather forecasting and climate change predictions are dominated by the initial conditions of the atmosphere and the oceans, and by forcing factors, both naturally occurring and human-induced.

Warren Washington, chairman of the NSF National Science Board, presented the findings of his team at NCAR in a talk entitled "IPCC climate change simulations of the 20th and 21st century: Present and Future."

“The Community Climate System Model (CCSM) has produced one of the largest data sets for the IPCC fourth assessment,” he said. “As a result of this and other assessments, most of the climate research science community now believes that humankind is changing the Earth's system and that global warming is taking place.”

CCSM is a comprehensive system for simulating the past, present and future climates of the Earth. It grew out of a collaborative development effort involving NCAR, university investigators and scientists from several U.S. federal agencies. One of CCSM's distinguishing features is that the complete source code, documentation and simulation data sets are freely distributed to the international climate research community. It initially consisted of four major components representing the atmosphere, ocean, sea ice and land surface. The exchange of energy, water and other constituents at the interfaces among these components is simulated using a flux coupler.
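
To make the idea of a flux coupler concrete, here is a minimal Python sketch with hypothetical component interfaces (the real CCSM coupler is a parallel Fortran component that conservatively regrids fields between different model grids): each component advances one coupling interval, exports its interface fluxes or state, and the coupler hands them across the air-sea boundary.

```python
# Toy flux-coupler loop (hypothetical interfaces; illustrative values only).

class Atmosphere:
    def __init__(self):
        self.sst_seen = 288.0                     # K, surface state received from the ocean

    def step(self, dt):
        pass                                      # advance atmospheric dynamics (omitted)

    def export_fluxes(self):
        # Downward heat and freshwater fluxes at the air-sea interface
        return {"heat_W_m2": 10.0, "freshwater_kg_m2_s": 1e-5}

    def import_state(self, ocean_state):
        self.sst_seen = ocean_state["sst_K"]


class Ocean:
    def __init__(self):
        self.sst = 288.0                          # sea surface temperature, K

    def step(self, dt, fluxes):
        # Crude mixed-layer response: 50 m of water warmed by the net heat flux
        heat_capacity = 4.2e6 * 50.0              # J m^-3 K^-1 times depth in m
        self.sst += fluxes["heat_W_m2"] * dt / heat_capacity

    def export_state(self):
        return {"sst_K": self.sst}


def run_coupled(n_steps=48, dt=3600.0):
    atm, ocn = Atmosphere(), Ocean()
    for _ in range(n_steps):
        atm.step(dt)
        fluxes = atm.export_fluxes()              # a real coupler would regrid conservatively here
        ocn.step(dt, fluxes)
        atm.import_state(ocn.export_state())      # state passed back to the atmosphere
    return ocn.sst

print(f"SST after two days of hourly coupling: {run_coupled():.4f} K")
```

In CCSM itself the same hand-off takes place among all four components, atmosphere, ocean, sea ice and land surface, each on its own grid, which is what makes the coupler a substantial piece of software in its own right.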

The current version of the model, called CCSM3, has been developed to facilitate work on a variety of scientific problems. These include the interactions between aerosols and climate, the relative importance of natural and anthropogenic forcing over the last millennium, and the nature of abrupt climate change. Results from CCSM3 form the basis for NCAR's contribution to the forthcoming international (IPCC and WMO) fourth climate assessments. This talk chronicled the major new features and improvements in CCSM3 relative to its predecessors.

These include new radiation and cloud parameterizations in the atmosphere, heating of the ocean surface by chlorophyll, and detailed vegetation ecology. The improvements in simulations of present-day climate produced by the new model physics were illustrated with recent coupled experiments. Global and regional climate aspects investigated using the climate model include El Niño, La Niña, the monsoons, the North Atlantic Oscillation and the Arctic Oscillation.

The controversy over global warming was settled in 2005. With more greenhouse gases, climate models project a troposphere temperature increase, a stratosphere temperature decrease, a surface temperature increase, and a troposphere that warms more than the Earth's surface. Observations show that since 1960 the surface and the troposphere have warmed at about the same rate, and that since 1979 there have been strong decreases in stratosphere temperature and increases in tropopause height (T. Karl, NOAA).

Climate change scenarios show that: “At any point in time, we are committed to additional warming and sea level rise from the radiative forcing already in the system. Warming stabilizes after several decades, but sea level from thermal expansion continues to rise for centuries. Each emission scenario has a warming impact.”

Climate models can be used to provide information on changes in extreme events such as heat waves. Heat wave severity is defined as the mean annual three-day warmest nighttime minima event. The model compares favorably with observed present-day heat wave severity. In a future, warmer climate, heat waves become more severe in southern and western North America and in the western European and Mediterranean regions.
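
As a rough sketch of how such a severity metric might be computed from model or station output (assuming one plausible reading of the definition above, not necessarily the exact algorithm used in the study): take running three-day means of the daily minimum temperature, keep the warmest such event in each year, and average those annual maxima.

```python
import numpy as np

def heat_wave_severity(tmin_daily, days_per_year=365):
    """Mean annual warmest three-day nighttime-minimum event (deg C).

    tmin_daily: 1-D array of daily minimum temperatures covering whole
    years. This follows one plausible reading of the definition quoted
    above; the metric used in the study may differ in detail.
    """
    tmin_daily = np.asarray(tmin_daily, dtype=float)
    n_years = len(tmin_daily) // days_per_year
    annual_events = []
    for y in range(n_years):
        year = tmin_daily[y * days_per_year:(y + 1) * days_per_year]
        three_day = np.convolve(year, np.ones(3) / 3.0, mode="valid")  # running 3-day mean
        annual_events.append(three_day.max())    # warmest three-day event of the year
    return float(np.mean(annual_events))         # average over all years

# Synthetic 20-year daily Tmin series with a seasonal cycle, just to exercise the function
rng = np.random.default_rng(0)
doy = np.tile(np.arange(365), 20)
tmin = 12.0 + 8.0 * np.sin(2 * np.pi * (doy - 100) / 365) + rng.normal(0.0, 2.0, doy.size)
print(f"Heat wave severity of the synthetic series: {heat_wave_severity(tmin):.2f} deg C")
```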

In the next few years, CCSM will be further expanded to include reactive troposphere chemistry, detailed aerosol physics and microphysics, comprehensive biogeochemistry, ecosystem dynamics and the effects of urbanization and land use change. These new capabilities will considerably expand the scope of earth system science that can be studied with CCSM and other climate models of similar complexity. Higher resolution is especially important for mountains, river flow and coastlines. Full hydrological coupling, including ice sheets, is important for sea level changes. The model will also include better vegetation and land surface treatments with ecological interactions, as well as carbon and other biogeochemical cycles.

For example, one of the carbon cycle methods being tested is based on microbe activity. There is a strong feedback between decomposition and plant growth: soil mineral nitrogen is the primary source of nitrogen for plant growth. Nitrogen-fixing bacteria and algae are very important; however, there are limited field and laboratory data on their role. It has been suggested that their nitrogen fixing can result in a shift from “carbon source” to “carbon sink” under a warming scenario.

The proposed DoE climate science Computational End Station (CES), set up at ORNL, will address Grand Challenge-scale problems in predicting future climate change resulting from various energy options. It will use the CCSM for studies of model biases, climate variability, abrupt climate change, and global carbon and other chemical cycles, and will pursue high-resolution atmosphere and ocean studies.

The computer requirements for the next generation of comprehensive climate models can only be satisfied by major advances in computer hardware, software and storage. The classic problems climate models face on supercomputer systems are that the machines (with the exception of vector systems) are not balanced among processor speed, memory bandwidth and inter-processor communication bandwidth, including the needs of global operations. They are also more difficult to program and optimize; it is hard to get I/O out of the computers efficiently; and computer facilities need to expand archival data capability into the petabyte range. In addition, there is only a weak relationship between peak performance and performance on actual working climate model programs.
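
A back-of-the-envelope, roofline-style estimate illustrates why peak performance alone says so little; the node figures below are purely hypothetical rather than measurements of any machine mentioned here. When a climate kernel performs only a fraction of a floating-point operation per byte moved, memory bandwidth, not processor peak, caps the sustained rate.

```python
def sustained_gflops(peak_gflops, mem_bw_gb_s, flops_per_byte):
    """Bandwidth-limited (roofline-style) estimate of sustained performance.

    All numbers used below are illustrative, not measurements of the
    systems discussed in the article.
    """
    bandwidth_bound = mem_bw_gb_s * flops_per_byte   # GF/s deliverable by memory traffic alone
    return min(peak_gflops, bandwidth_bound)

# A hypothetical commodity node vs. a hypothetical vector node running a
# low arithmetic-intensity stencil kernel (~0.25 flop per byte moved).
for name, peak, bw in [("commodity node", 6.0, 6.4), ("vector node", 16.0, 64.0)]:
    s = sustained_gflops(peak, bw, 0.25)
    print(f"{name}: peak {peak:.1f} GF/s, sustained estimate {s:.1f} GF/s "
          f"({100.0 * s / peak:.0f}% of peak)")
```

Under these assumed figures the commodity node sustains roughly a quarter of its peak while the better-balanced vector node stays compute-bound, which is the imbalance the presenters were describing.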

The major atmospheric research centers now have systems consisting of several thousand IBM P3/4/5 processors, up to a thousand Cray X1E vector processors, or several hundred NEC SX-6 and SX-8 vector processors. In each case, they can achieve about half a teraflop of sustained performance, and sometimes one teraflop, on certain application codes. The exception is the Earth Simulator in Japan, based on NEC SX-6 technology (5,120 processors), which delivers over 12 teraflops of sustained performance.

Thus, with sustained teraflop computing on the horizon, and occasionally already on stream, meteorologists are moving from climate modeling to Earth System Modeling (ESM). This is because the feedback loops between the climate system and other relevant systems, such as ecology and the socio-economy, are not negligible; climate modeling is not possible without proper representation of these systems, hence ESM. Earth System Modeling is multi-scale (in time and space), multi-process and multi-topical (physics, chemistry, biology, geology, economics). It is both very compute intensive and very data intensive. Some claim it requires several orders of magnitude more computing power to tackle the problem. Sustained petaflop and, eventually, exaflop performance are therefore eagerly awaited.

According to Tom Bettge, deputy director of the scientific computing division at NCAR, “A factor of 25 times the present NCAR computing resources is needed to accommodate CCSM requirements over the next two years, to prepare for the next IPCC assessment starting in 2007. How this deficiency is to be remedied is a great challenge. Although special architectures like the IBM Blue Gene R&D system for protein folding are delivering good results in their niche areas, this architecture is not suited to ESM, which needs a small number of fat nodes rather than thousands of processors, as in Blue Gene.”

It was noted that despite many computing centers having IBM systems with 15 to 25 teraflops of peak performance, these sites are delivering only a few hundred gigaflops of sustained performance to user applications. Presently, CCSM is in the hundreds-of-gigaflops era. Only Earth System Models running on the Japanese Earth Simulator have graduated to teraflops. This was aptly illustrated by the talk from Michel Desgagne of Environment Canada on the study of hurricane behavior. His simulations were performed on the ES and achieved 13 teraflops of sustained performance, using 495 compute nodes and 7 TB of memory. Each run took seven to eight days of wall clock time.

Added Desgagne: “Recent studies have shown that very high resolution is essential to properly resolve waves that have direct impact on the intensification of hurricanes. In particular, innovative potential vorticity diagnostic tools were applied to diagnose inner spiral bands formed in explicitly simulated hurricanes. It was shown that wave-number one and two anomalies are in fact vortex Rossby waves that explain 40 percent to 50 percent of the wave activity in a period of 24 hours. These meso-vortices within the inner core of a hurricane are responsible for the dynamical processes controlling the redistribution of angular momentum and numerical resolution of these vortices could help to more accurately predict the intensification of hurricanes”.

For the vortex Rossby waves (VRWs), it was found that a 6 km resolution was not good enough; a 1 km resolution had to be used to obtain useful results. What has recently been identified is a Rossby wave train starting from the Indonesia area and moving across the oceans, causing bad weather. Using the ES, one achieves one hour of simulation for one hour of computation. To follow and analyze VRWs across the globe, one needs to attain a day's simulation for 10 minutes of computation. This translates to approximately 150 times more computing than that achieved on the ES, i.e., roughly 2 petaflops of sustained performance.
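
The arithmetic behind that estimate is easy to verify from the figures quoted above; a minimal check:

```python
# Quick check of the scaling estimate quoted above.
es_sustained_tflops = 13.0                # sustained performance achieved on the Earth Simulator
current_ratio = 1.0 / 1.0                 # one hour simulated per hour of computation
target_ratio = 24.0 / (10.0 / 60.0)       # one day simulated per 10 minutes of computation

speedup_needed = target_ratio / current_ratio
required_pflops = es_sustained_tflops * speedup_needed / 1000.0

print(f"Speed-up needed: ~{speedup_needed:.0f}x")                    # ~144x, i.e. roughly 150x
print(f"Required sustained performance: ~{required_pflops:.1f} PF")  # ~1.9, i.e. roughly 2 petaflops
```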

Several talks concentrated on projects implementing Earth System Modeling frameworks: ESMF in the USA and PRISM in Europe. PRISM has now moved from being an R&D project to the PRISM Support Initiative, which delivers a service. For example, PRISM will be the modeling environment at DKRZ for the IPCC fifth assessment.

During the last workshop in 2003, a strong emphasis was placed on data management and the challenges it entails. This time, the emphasis was more on the power consumed by supercomputers and their physical footprint, as facility space and energy requirements are now of greatest concern.

Christopher Lazou, a frequent contributor to HPCwire, is managing director at HiPerCom Consultants Ltd.

Brands and names are the property of their respective owners.

Copyright: Christopher Lazou, HiPerCom Consultants Ltd., UK. September 2005.

