The Prisoner’s Dilemma

By Michael Feldman

May 12, 2011

Over the past few years, there has been a subtle shift in the tone of the conversation about climate change solutions, from prevention and remediation to mitigation and adaptation. That shift is not all that surprising, given the failure of the international community to deliver any substantive policy toward reducing greenhouse gas emissions. As a result, people seem to be talking less about reducing carbon footprints and more about how high to build the sea walls in Manhattan.

I would put the effort to build climate models for exascale supercomputers in the latter category as well. By 2018, when the first exaflop machines are expected to debut, chances are that climate change will have advanced to the point where policy decisions will be strongly focused on mitigation and adaptation. That also seems to reflect the thinking of others in the research community. Marc Snir, principal investigator of the newly hatched G8 research project on exascale climate simulations, is one such person. In an article this week in HPCwire, he talked about the utility of enhancing climate simulations for exascale machines:

I suspect that all participants in our project believe that the time to act on global warming is now, not ten years from now. The unfortunate situation is that we seem incapable of radical action, for a variety of reasons. It is hard to have international action when any individual country will be better served by shirking its duties — the prisoners’ paradox — and it is hard to act when the cost of action is immediate and the reward is far in the future. As unfortunate as this is, we might have to think of mitigation, rather than remediation. More accurate simulations will decrease the existing uncertainty about the rate of global warming and its effects; and will be needed to assess the effect of unmitigated climate change, and the effect of various mitigation actions.

Snir’s “prisoners’ paradox,” better known as the Prisoner’s Dilemma, describes a situation in which cooperation is trumped by the motivation to act in one’s own self-interest. The paradox is that individuals, or individual countries for that matter, will often behave this way even when their long-term interests would be better served by working together.
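For readers who haven’t seen the formal version, here is a minimal sketch in Python. The payoff numbers are illustrative assumptions, not anything from the article; any values satisfying temptation > reward > punishment > sucker’s payoff produce the same trap:

```python
# A minimal sketch of the two-player Prisoner's Dilemma. The payoff
# numbers are illustrative assumptions; any values satisfying
# temptation > reward > punishment > sucker's payoff work the same way.
# Each entry maps (row move, column move) -> (row payoff, column payoff).
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # mutual cooperation (reward)
    ("cooperate", "defect"):    (0, 5),  # sucker's payoff vs. temptation
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # mutual defection (punishment)
}

def best_response(opponent_move):
    """Return the row player's payoff-maximizing move against a fixed
    opponent move."""
    return max(("cooperate", "defect"),
               key=lambda mine: PAYOFFS[(mine, opponent_move)][0])

# Defection is the best response no matter what the other side does...
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"

# ...yet mutual defection (1, 1) leaves both players worse off than
# mutual cooperation (3, 3) -- the trap Snir describes for nations.
print(PAYOFFS[("defect", "defect")], "<", PAYOFFS[("cooperate", "cooperate")])
```

Swap “cooperate” for cutting emissions and “defect” for business as usual, and the logic of stalled climate negotiations falls out of a four-entry table.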

That realization appears to be sinking in more generally. A report released Thursday by the National Research Council discusses the necessity of preparing to adapt to climate change, while also talking up the need for “substantial” action to limit its magnitude. But even here, the authors admit that such action wouldn’t turn back the clock on climate change:

Aggressive emissions reductions would reduce the need for adaptation, but not eliminate it. Climate change is already happening, and additional changes can be expected for all plausible scenarios of future greenhouse gas emissions. Prudent risk management demands advanced planning to deal with possible adverse outcomes — known and unknown — by increasing the nation’s resilience to both gradual changes and the possibility of abrupt disaster events.

I suppose it’s a good sign that the governments of the world are investing so much in climate research. In the US, most of the top Department of Energy and NASA supercomputers spend at least some portion of their cycles on climate simulation codes (not to mention the NCAR and NOAA machines). The same goes for most of the top-tier supercomputers in Europe and Asia. It’s probably not an exaggeration to say that hundreds of millions of dollars in supercomputing infrastructure and software development have been invested globally to address climate change.

As anyone who has even casually followed this research over the last 20 years knows by now, the models are all pointing in the same direction: global warming. The latest assessment by the Intergovernmental Panel on Climate Change predicts the average surface temperature of the Earth will increase between 2.0 and 11.5 degrees Fahrenheit by the end of the century, assuming no heroic efforts are instituted to restrict greenhouse gas emissions.
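For readers who think in Celsius, a quick conversion sketch; the only subtlety is that a temperature difference converts by a factor of 5/9, without the 32-degree offset used for absolute temperatures:

```python
# Temperature *differences* convert between Fahrenheit and Celsius by
# a factor of 5/9; the 32-degree offset applies only to absolute
# temperatures, not to a projected increase.
def warming_f_to_c(delta_f):
    return delta_f * 5.0 / 9.0

low_f, high_f = 2.0, 11.5  # IPCC projected warming range, degrees F
print(f"{warming_f_to_c(low_f):.1f} to {warming_f_to_c(high_f):.1f} degrees C")
# Prints: 1.1 to 6.4 degrees C
```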

It’s disheartening, therefore, to see how little, policy-wise, has been accomplished by world governments based on the results of all the research they took the trouble to sponsor. It’s sort of like getting your car checked out every year with the latest diagnostics, but not fixing anything when the mechanic tells you the engine is going to blow. In truth though, that analogy is not quite fair. In this case, we can’t pull the car into the repair shop to fix it; we’ve got to retool it while it’s careening down the highway.

Ironically, the biggest reduction in greenhouse gas emissions was the result of the 2008-2009 recession, which brought CO2 emissions down to late-1990s levels. The depth of the economic downturn was so severe that, according to the US Energy Information Administration (EIA), emission levels aren’t expected to return to their 2008 high point of 6 billion metric tons per year until around 2025. After that, levels are expected to grow at an average of about 0.2 percent per year. Of course we could always hope for another economic disaster to mitigate an environmental one, but that seems like a poor trade-off.
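As a rough illustration of how slowly a 0.2 percent annual rate compounds, here is a back-of-the-envelope projection; the starting value and growth rate are simply the EIA figures cited above, not an independent forecast:

```python
# Back-of-the-envelope compounding of the EIA projection cited above:
# emissions back at roughly 6 billion metric tons per year around 2025,
# then growing about 0.2 percent annually. Figures are the article's,
# not an independent forecast.
BASELINE = 6.0   # billion metric tons of CO2 per year, circa 2025
GROWTH = 0.002   # 0.2 percent annual growth

for year in (2025, 2035, 2050):
    emissions = BASELINE * (1 + GROWTH) ** (year - 2025)
    print(year, round(emissions, 2))
# 2025 6.0, 2035 ~6.12, 2050 ~6.31
```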

Mitigating climate change, rather than reversing it, appears to be our destiny at this point. Human nature is certainly better adapted to acting on problems that confront us in real time than on those lurking at some undefined point in the future. Exascale-level climate simulations would almost certainly put us in a better position to do that by the end of the decade. But we still might want to start building those sea walls now.
