GRACE Data Contributes to Understanding of Climate Change

May 23, 2019

May 23, 2019 — The University of Texas at Austin team that led the Gravity Recovery and Climate Experiment (GRACE), a twin-satellite system launched in 2002 to take detailed measurements of the Earth, reports in the latest issue of the journal Nature Climate Change on the contributions that nearly two decades of GRACE data have made to our understanding of global climate patterns.

Among the many contributions that GRACE has made:

  • GRACE recorded a tripling of ice mass loss in polar and mountain regions since it began taking measurements — a consequence of global warming.
  • GRACE enabled measurement of how much heat has been added to the ocean and where that heat remains stored. Its detailed observations confirm that the majority of the warming occurs in the upper 2,000 meters of the ocean.
  • GRACE has observed that 13 of the 37 largest land-based aquifers have undergone critical mass loss. This loss, driven by both climate-related and anthropogenic (human-induced) factors, documents the shrinking availability of clean, fresh water for human consumption.
  • The information gathered from GRACE provides vital data for the United States Drought Monitor and has shed light on the causes of drought and aquifer depletion in places worldwide, from India to California.
Illustration of the twin GRACE Follow-On satellites. (NASA/JPL-Caltech)

Intended to last just five years in orbit as a limited, experimental mission to measure small changes in the Earth’s gravitational field, GRACE operated for more than 15 years and provided unprecedented insight into our global water resources, from more accurate measurements of polar ice loss to a better view of ocean currents and the rise in global sea levels. The mission was a collaboration between NASA and the German Aerospace Center (DLR) and was led by researchers in the Center for Space Research (CSR) in UT’s Cockrell School of Engineering.

UT’s Texas Advanced Computing Center (TACC) has played a critical role in this international project over the last 15 years, according to Byron Tapley, the Clare Cockrell Williams Centennial Chair Emeritus in the Department of Aerospace Engineering and Engineering Mechanics who established the Center for Space Research at UT in 1981 and who served as principal investigator of the GRACE mission.

“As the demand for the GRACE science deliverables has grown, TACC’s ability to support these demands has grown. It has been a seamless transition to a much richer reporting environment,” he said.

By measuring changes in mass that cause deviations in the strength of gravity’s pull on the Earth’s various systems — water systems, ice sheets, atmosphere, land movements, and more — the satellites can detect small changes in how those Earth systems interact.
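
The monthly gravity fields behind those measurements are distributed as sets of spherical-harmonic (Stokes) coefficients of the Earth’s gravitational potential. As background for the parameter counts discussed later in this article, the standard fully normalized expansion (a textbook form, not a formula quoted from the mission documentation) can be written as:

    % Fully normalized spherical-harmonic expansion of the geopotential V
    % at radius r, geocentric latitude \varphi, and longitude \lambda.
    \[
      V(r,\varphi,\lambda) = \frac{GM}{r}
        \sum_{n=0}^{N}\left(\frac{a}{r}\right)^{n}
        \sum_{m=0}^{n}\bar{P}_{nm}(\sin\varphi)
        \left[\bar{C}_{nm}\cos(m\lambda)+\bar{S}_{nm}\sin(m\lambda)\right]
    \]
    % GM: Earth's gravitational parameter; a: reference equatorial radius;
    % \bar{P}_{nm}: fully normalized associated Legendre functions;
    % \bar{C}_{nm}, \bar{S}_{nm}: the Stokes coefficients estimated in each
    % monthly solution; N: maximum degree (spatial resolution) of the model.

Month-to-month changes in the estimated coefficients are what translate into the mass changes described above.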

“By monitoring the physical components of the Earth’s dynamical system as a whole, GRACE provides a time variable and holistic overview of how our oceans, atmosphere and land surface topography interact,” Tapley said.

The data system for the mission is highly distributed and requires significant data storage and computation across an internationally distributed network. Although the final data products for the CSR solutions are generated at TACC, considerable effort takes place at the GFZ German Research Centre for Geosciences in Potsdam and at the NASA Jet Propulsion Laboratory (JPL) in Pasadena, California. The final CSR analysis at TACC starts with a data downlink from the satellites to a raw data collection center in Germany. The data is then transmitted to JPL, where the raw measurements are converted into geophysical measurements consisting of GPS, accelerometer, attitude quaternion, and high-accuracy intersatellite ranging data, collected by each satellite over a month-long observation span.
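
To make the shape of that pipeline concrete, the sketch below groups Level-1-style records into the month-long spans mentioned above. It is a minimal, hypothetical Python illustration; the record fields and function names are assumptions for exposition, not the actual CSR/JPL processing software.

    # Hypothetical sketch: group GRACE Level-1-style records into month-long
    # observation spans. This is NOT the real CSR/JPL processing chain; the
    # record fields and the solver stub are illustrative assumptions based on
    # the measurement types named in the article.
    from collections import defaultdict
    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Level1Record:
        """One downlinked measurement converted to a geophysical quantity."""
        satellite: str      # e.g., "GRACE-A" or "GRACE-B"
        epoch: datetime     # time tag of the measurement
        kind: str           # "gps", "accelerometer", "quaternion", or "kbr"
        values: tuple       # the measurement itself (illustrative placeholder)

    def group_by_month(records):
        """Bin records into (year, month) observation spans."""
        months = defaultdict(list)
        for rec in records:
            months[(rec.epoch.year, rec.epoch.month)].append(rec)
        return months

    def summarize_month(month_records):
        """Stand-in for the gravity-field estimation performed at TACC:
        here we only count what would be handed to the solver."""
        counts = defaultdict(int)
        for rec in month_records:
            counts[rec.kind] += 1
        return dict(counts)

    if __name__ == "__main__":
        demo = [
            Level1Record("GRACE-A", datetime(2010, 4, 1, 0, 0, 5), "kbr", (1.23,)),
            Level1Record("GRACE-B", datetime(2010, 4, 1, 0, 0, 5), "gps", (0.0, 0.0, 0.0)),
        ]
        for month, recs in group_by_month(demo).items():
            print(month, summarize_month(recs))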

“The collection of information from this international community is brought together by the fundamental computing capability and the operational philosophy at TACC to undergo the challenging data analysis required to obtain the paradigm-shifting view of the Earth’s interactions,” Tapley said.

Despite being a risky venture operating on minimal funding, the GRACE mission surpassed all expectations and continues to provide a critical set of measurements.

“The concept of using the changing gravimetric patterns on Earth as a means to understanding major changes in the Earth system interactions had been proposed before,” Tapley said. “But we were the first to make it happen at a measurement level that supported the needs of the diverse Earth-science community.”

One of the remarkable benefits of working with TACC, according to Tapley, is the ability to pose questions whose solutions would not have been feasible prior to TACC, and to find the capability to answer those questions.

“As an example, when we began the GRACE mission, our capability was looking at gravity models that were characterized by approximately 5,000 model parameters, whose solution was obtained at approximately yearly analysis intervals. The satellite-only GRACE models today are based on approximately 33,000 parameters that we have the ability to determine at a daily interval. In the final re-analysis of the GRACE data, we’re looking to expand this parameterization to 4,000,000 parameters for the mean model. The interaction with TACC has always been in the context of: ‘If the answer to a meaningful question requires extensive computations, let’s find a way to satisfy that requirement,'” Tapley said.
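
For context on those parameter counts, a gravity model complete to spherical-harmonic degree and order N has (N + 1)^2 coefficients, so the figures Tapley cites are roughly consistent with models of about degree 70 early on and about degree 180 today. That mapping is an inference, not something stated in the article; the arithmetic itself is shown in the short Python check below.

    # Back-of-the-envelope check of the parameter counts quoted above.
    # Assumption (not stated in the article): the parameters are the
    # coefficients of a spherical-harmonic gravity model complete to
    # degree and order N, which gives (N + 1)**2 coefficients in total.

    def coefficient_count(max_degree: int) -> int:
        """Total spherical-harmonic coefficients through max_degree."""
        return (max_degree + 1) ** 2

    for n in (60, 70, 120, 180):
        print(f"degree {n:3d}: {coefficient_count(n):6,d} coefficients")

    # degree  70 ->  5,041  (close to the ~5,000 early-mission figure)
    # degree 180 -> 32,761  (close to the ~33,000 quoted for today's models)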

Now that the GRACE Follow-On mission, in which the CSR will continue to play a role, has launched successfully, scientists have the chance to extend the GRACE record with a second multi-decadal measurement of changes in mass across the Earth system. Engineers and scientists anticipate that the longer data record will give them an even clearer picture of how the planet’s climate patterns behave over time.


Source: Faith Singer-Villalobos, TACC
