October 21, 2005
One of the most comprehensive climate models of the continental United States predicts more extreme temperatures throughout the country and more extreme precipitation along the Gulf Coast, in the Pacific Northwest and east of the Mississippi.
This new climate model, run on supercomputers at Purdue University, takes into account a large number of factors that have been incompletely incorporated in past studies, such as the effects of snow reflecting solar energy back into space, and of high mountain ranges blocking weather fronts from traveling across them, said Noah Diffenbaugh, the team's lead scientist. Diffenbaugh said a better understanding of these factors -- coupled with a more powerful computer system on which to run the analysis -- allowed the team to generate a more coherent image of what weather we can expect to encounter in the continental United States for the next 100 years. Those expectations paint a very different climate picture for most parts of the country.
"This is the most detailed projection of climate change that we have for the U.S.," said Diffenbaugh, an assistant professor of earth and atmospheric sciences in Purdue's College of Science and a member of the Purdue Climate Change Research Center. "And the changes our model predicts are large enough to substantially disrupt our economy and infrastructure."
A climate model is a sophisticated computer application that attempts to incorporate as many details about the complex workings of our environment as possible. Hundreds of dynamic processes, such as ocean currents, cloud formations, vegetation cover and -- of particular import -- the increase in atmospheric greenhouse gases, are programmed into the computers, which then attempt to discern the net effects on square-shaped plots of land that represent small pieces of the Earth's surface. The smaller these squares are, the better the resolution the model can provide.
"Just as a digital camera that creates images with more pixels can result in a better photograph, we want to make those squares as small as possible," Diffenbaugh said. "We'd also like to incorporate as much of the climate system as we can so the analysis will be realistic."
Despite the number-crunching power of the linked computers used for these simulations, a model must factor in so many changing variables that a full analysis can require months of nonstop computational effort. Diffenbaugh's team required five months to run their model on a cluster of Sun computers at the Rosen Center for Advanced Computing on Purdue's campus.
"The results were worth it, though, because this model allows us to project changes in climate with unprecedented resolution," Diffenbaugh said.
Until now, the fastest computers have been used to resolve squares 50 kilometers (about 31 miles) to a side, which can return a reasonably accurate but rather grainy "photograph" of climate change. The new model has twice the resolution, analyzing areas that are only 25 kilometers (about 16 miles) to a side. This allows the model to discern landscape features more precisely.
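The arithmetic behind that gain is simple: halving the side of each grid cell quadruples the number of cells needed to cover the same domain, and roughly quadruples the work per simulated time step. The short sketch below illustrates this with an approximate, illustrative figure for the area of the contiguous United States (the 8-million-square-kilometer value is an assumption for demonstration, not a number from the study):

```python
def cell_count(domain_area_km2, cell_side_km):
    """Number of square grid cells needed to tile a domain."""
    return domain_area_km2 / cell_side_km ** 2

# Illustrative domain size: roughly the area of the contiguous U.S.
AREA_KM2 = 8_000_000  # approximate, for demonstration only

coarse = cell_count(AREA_KM2, 50)  # earlier models: 50 km cells
fine = cell_count(AREA_KM2, 25)   # the Purdue model: 25 km cells

print(f"50 km grid: {coarse:,.0f} cells")
print(f"25 km grid: {fine:,.0f} cells")
print(f"ratio: {fine / coarse:.0f}x")  # 4x more cells at 25 km
```

This factor of four is what Diffenbaugh means later in the article by having "quadrupled the spatial detail."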
With these improvements over previous models, the team was able to make several observations about the change in climate over the next century, particularly the late century, when accumulated greenhouse gases will have a greater effect than they would, say, a decade from now.
"These projections are not necessarily about specific weather events," Diffenbaugh said. "But they do give us a good idea about what kind of weather to expect over the long run in a particular part of the country."
Some of the climate projections include:
The model, Diffenbaugh said, assumes that greenhouse gases will reach concentrations more than twice their current levels, but he said he is confident that the model's performance gives as accurate a picture of the future as we can hope for at the moment.
"We checked our model's performance by analyzing the period from 1961 to 1985 for which, of course, we do not need a prediction," Diffenbaugh said. "The model performed admirably, which tells us we've got a good understanding of how to represent the physical world in terms of computer code. It's certainly not perfect, but we'll need a computer at least 100 times as powerful as the cluster we used to really improve the accuracy. We would like to have access to such computing power in the future."
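This kind of hindcast validation boils down to comparing modeled values against the observational record and quantifying the mismatch. The sketch below shows two common skill metrics, mean bias and root-mean-square error, on invented annual-mean temperatures; the numbers are purely illustrative and are not data from the study, which validated against 1961-1985 observations:

```python
import math

# Hypothetical hindcast check: modeled vs. observed annual-mean
# temperatures for a validation period. Values are invented for
# illustration only.
observed = [11.2, 11.5, 10.9, 11.8, 11.4]  # deg C, illustrative
modeled = [11.0, 11.6, 11.1, 11.5, 11.3]   # deg C, illustrative

n = len(observed)
# Mean bias: does the model run systematically warm or cold?
bias = sum(m - o for m, o in zip(modeled, observed)) / n
# RMSE: typical size of the model-observation mismatch.
rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(modeled, observed)) / n)

print(f"mean bias: {bias:+.2f} C, RMSE: {rmse:.2f} C")
```

Small bias and RMSE over a period the model was not tuned to reproduce is the kind of evidence Diffenbaugh points to when he says the model "performed admirably."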
Commenting on the study, Stanford University's Stephen H. Schneider said the results confirm scientists' suspicions about the future of climate change.
"This study is the latest and most detailed simulation of climatic change in the United States," said Schneider, who is Stanford's Melvin and Joan Lane Professor for Interdisciplinary Environmental Studies. "Critics have asserted that the coarse resolution of previous studies made their sometimes dire predictions suspect, but this new result with a very high resolution grid over the United States shows potential climatic impacts at least as significant as previous results with lower-resolution models. As the authors wisely note, such potential impacts certainly should not be glibly dismissed."
Indeed, a recent climate study from the National Center for Atmospheric Research (NCAR) also projects an intensification of weather patterns. In agreement with the Purdue model, the NCAR model predicts more precipitation in the northwestern and northeastern U.S. as well as drought in the Southwest.
Diffenbaugh emphasized that, while his model was in no way designed to return an alarmist image of our climate's future, the picture it painted should be considered. "The more detail we look at with these models, the more dramatic the climate's response is," he said. "Critics have complained that climate models lack sufficient spatial detail to be trusted. In terms of looking at the whole contiguous United States, we've quadrupled the spatial detail and, as a result, it appears that climate change is going to be even more dramatic than we previously thought. Of course, we can never be completely certain of the future, but it's clear that as we consider more and more detail, the picture of future climate change becomes more and more severe."