A Tribute to the Earth Simulator

By Markus Henkel

July 8, 2008

The Earth Simulator is a legend in computational meteorology and long-term climate simulation. For years it has been one of the fastest supercomputers in the world, designed to evaluate the effects of global warming and to study problems in solid earth geophysics. In 2009, a comprehensive upgrade will begin a new era for the history-making supercomputer.

For six years, the Earth Simulator has been a constant in the twice-yearly TOP500 list of the most powerful supercomputers. Currently ranked 49th, the system is scheduled for a significant enhancement: a new and comprehensive reinstallation is underway to enable researchers to run climate simulations that produce results beyond the scope of any other computing system.

NEC Corporation of Japan, which built the original system in 2002, has again been chosen to deliver the upgraded Earth Simulator. The new mega-simulator is being installed at the Japanese research centre JAMSTEC (Japan Agency for Marine-Earth Science and Technology) and is destined to help scientists find ways to mitigate the impact of global climate change on the earth’s ecosystems. With a peak processing power of 131 teraflops (131 trillion arithmetic operations per second), the planned system, based on NEC’s SX-9 series, will be able to run so-called ultra-high-speed simulations in real time.

Until the new system is installed in 2009, the current Earth Simulator will continue to work on unique scientific topics. Simulations of the global climate, both in the atmosphere and in the oceans, established the Earth Simulator as a leading resource for climatology and one of the most important contributors to the Fourth IPCC Assessment Report in 2007. The need to continue this effort is obvious: between 1950 and 2005 a massive increase in major weather-related natural catastrophes was observed, and between 1994 and 2005 there were almost three times as many weather-related natural catastrophes as in the 1960s.

The beginning: The fastest computer in the world

It all started in 2002, when the fastest computer in the world was installed at the Yokohama Institute for Earth Sciences (YES). Its 640 nodes, with eight vector processors each, occupy 3,250 square meters. The system is based on the SX-6 architecture and comprises 5,120 CPUs, 10 terabytes of main memory, 700 terabytes of hard disk storage and 1.6 petabytes of tape storage. An unrivalled accomplishment back then, the system is still very much in use today, and not only by Japanese researchers: international collaborative projects with institutes in the US, France and UK have produced unique results.
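
For readers who want to see how those headline numbers fit together, here is a minimal back-of-the-envelope sketch in C. The figure of roughly 8 gigaflops peak per vector CPU is an assumption based on the commonly cited SX-6 specification, not something stated in the article.

```c
/* Back-of-the-envelope aggregates for the original Earth Simulator.
 * ASSUMPTION: ~8 GFLOPS peak per vector CPU (commonly cited SX-6 figure). */
#include <stdio.h>

int main(void)
{
    const int nodes          = 640;   /* SX-6 style nodes                */
    const int cpus_per_node  = 8;     /* vector processors per node      */
    const double gflops_cpu  = 8.0;   /* assumed per-CPU peak (GFLOPS)   */
    const double memory_tb   = 10.0;  /* total main memory (terabytes)   */

    int total_cpus        = nodes * cpus_per_node;            /* 5,120   */
    double peak_tflops    = total_cpus * gflops_cpu / 1000.0; /* ~41 TF  */
    double mem_per_node_gb = memory_tb * 1024.0 / nodes;      /* 16 GB   */

    printf("CPUs: %d, peak: %.2f TFLOPS, memory per node: %.0f GB\n",
           total_cpus, peak_tflops, mem_per_node_gb);
    return 0;
}
```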

At first Japan used the supercomputer for its own interests. In 1923 the Kanto earthquake killed 130,000 people and destroyed Tokyo and Yokohama; in addition, the birth rate rose sharply, forcing the government to pay much more attention to potential national catastrophes. Today, however, the Earth Simulator is in demand worldwide, helping to clarify problems associated with the global climate. This has indeed become a global issue, as evidenced by recent deadly natural disasters such as the 2004 tsunami and the latest earthquake in China, which together killed hundreds of thousands of people.

One important aim of climate supercomputers is not only to understand events like earthquakes, but to predict them as early as possible. Another field of activity for which the original Earth Simulator is ideally suited is coupled simulation of the atmosphere and the ocean, or even complete earth-system simulations.

Same architecture — only smaller and more effective

“Currently there is no other platform that is able to run calculations of that kind in a comparably short time, even though there are machines capable of more output according to the available data,” says Dr. Sébastien Masson from L’Institut Pierre-Simon Laplace (IPSL) in Paris, France. He has been using the Earth Simulator for his climate models for a long time now, not least because the interaction between hardware and software works so well. Another key aspect is the possibility of running coupled models: complicated ocean models, complex land models, and climate models can be simulated simultaneously and then brought together.
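
To illustrate the idea behind coupled models, the toy sketch below advances two independent components, a hypothetical atmosphere and ocean, and exchanges boundary values at every coupling step. All names and numbers are made up; real coupled systems such as those run at IPSL exchange full 2-D and 3-D fields through dedicated coupler software.

```c
/* Toy illustration of coupled atmosphere-ocean time stepping.
 * Purely illustrative; every quantity here is hypothetical. */
#include <stdio.h>

static double step_atmosphere(double sea_surface_temp)
{
    /* Pretend atmosphere physics: heat flux depends on the SST it sees. */
    return 0.1 * (sea_surface_temp - 15.0);   /* net heat flux to ocean  */
}

static double step_ocean(double sst, double heat_flux)
{
    /* Pretend ocean physics: SST responds to the imposed flux. */
    return sst + 0.05 * heat_flux;
}

int main(void)
{
    double sst  = 18.0;   /* initial sea surface temperature, deg C */
    double flux = 0.0;

    for (int coupling_step = 0; coupling_step < 5; ++coupling_step) {
        flux = step_atmosphere(sst);   /* atmosphere sees current SST     */
        sst  = step_ocean(sst, flux);  /* ocean sees the atmospheric flux */
        printf("step %d: SST = %.3f, flux = %.3f\n",
               coupling_step, sst, flux);
    }
    return 0;
}
```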

“The Earth Simulator is almost identical in construction to the SX-6 machine we are using in Hamburg,” says Michael Böttinger, who is responsible for application software at the German Climate Computing Centre (DKRZ). “Except that its vector architecture is 25 times bigger than ours.” That architecture is one of the key benefits of the Earth Simulator: simulating natural phenomena and calculating natural catastrophes and their climate impact demands ever-increasing processing power.

Vector processors, with their strong single-processor performance and very high bandwidth to main memory, are particularly effective at meeting the special demands of computing in the earth sciences. Clusters of vector computers also allow a hybrid of MPI between nodes and OpenMP within the extremely powerful SMP nodes. As an example of how well these codes map to the vector architecture, the AFES climate model delivers up to 60 percent of the Earth Simulator’s peak performance.
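
The hybrid programming style mentioned above typically looks like the minimal sketch below: MPI distributes work across nodes while OpenMP threads share each node’s memory. This is a generic illustration under that assumption, not code from AFES or the Earth Simulator.

```c
/* Minimal hybrid MPI + OpenMP sketch: one MPI rank per SMP node,
 * OpenMP threads inside the node. Compile with e.g.
 *   mpicc -fopenmp hybrid.c -o hybrid
 */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    double local_sum = 0.0;

    /* Each rank takes a strided share of a long loop; the share is
     * further split across the threads of its SMP node. */
    #pragma omp parallel for reduction(+:local_sum)
    for (long i = rank; i < 10000000L; i += size)
        local_sum += 1.0 / (1.0 + (double)i);

    double global_sum = 0.0;
    MPI_Reduce(&local_sum, &global_sum, 1, MPI_DOUBLE,
               MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("global sum = %f (from %d ranks)\n", global_sum, size);

    MPI_Finalize();
    return 0;
}
```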

Since the first successful numerical weather forecast in 1950, the world has changed dramatically. With the increase in unusual weather events, computational meteorology has taken on a new role in assessing future changes to the environment rather than solely forecasting tomorrow’s weather. Society recognizes the value of investing in these kinds of supercomputer systems and in their utilization, as evidenced by the scheduled reinstallation of the Earth Simulator.

“Oceans, with their complex biological interactions, play a key role in climate research, producing 50 percent of the oxygen and storing one third of the atmospheric CO2,” explains Dr. Onno Groß, marine biologist and founder of the ocean protection organisation DEEPWAVE. “But temperature change and the acidification of the sea by exhaust emissions threaten marine organisms. The impact of these changes can only be made tangible with supercomputers like the Earth Simulator.”

Thanks to these enhancements, the new computer will support extended applications, such as cloud-resolving physics and ensemble studies, enabling exact high-speed analysis and, with that, even more precise forecasts of worldwide environmental phenomena. Warming of the climate system is unequivocal, as is now evident from observed increases in global average air and ocean temperatures and rising global average sea level. With adequate adaptation and mitigation strategies tied to emission reduction concepts, however, scientists can provide guidance to minimize the impact on ecosystems and human society. Systems like the Earth Simulator will play a key role in defining and evaluating these strategies and scenarios. Neither emission reduction nor adaptation alone will avoid all climate change impacts, but together they can reduce the risks of climate change.

Looking Ahead

Another supercomputer is also being constructed in Japan: the RIKEN Institute (Institute of Physical and Chemical Research) is developing a 10-petaflop system. The enormous increase in processing power in HPC shows that the story of supercomputing is far from over.

About the Author

Markus Henkel is a geodesist and science writer who lives in Hamburg, Germany. He writes about supercomputing, environmental protection and clinical medicine. For more information, email him at [email protected] or visit the Web sites http://laengsynt.de and http://netzwelt.de.
