March 16, 2011
"Climate change is a global problem that requires global solutions," writes Oxford University professor Tim Palmer. The international organization CERN studies particle physics, but there is no equivalent group dedicated to the issue of climate change. Palmer argues the time has come for such a facility. Palmer is a Royal Society Research Professor at Oxford University and also co-director of the program on predicting climate change at the Oxford Martin School. He believes that scientists' ability to create accurate, reliable, high-resolution climate models is being thwarted by a deficit in computing power.
Current weather models are quite good at predicting the weather a few days ahead, but global climate models need to see much further out. For example, the models used in the fourth assessment report of the Intergovernmental Panel on Climate Change (IPCC) attempt to see 100 years into the future. Because they must factor in so many more variables over such long timescales, these long-range models are less accurate. To make a century-long simulation feasible, shortcuts are taken elsewhere, the main one being grid spacing: short-term forecasts use a fine grid spacing of a few tens of kilometres, while the very long-range simulations typically use a spacing roughly ten times coarser. Because of this, they cannot reliably predict whether certain weather patterns will become more or less likely with increased greenhouse gas concentrations.
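As a rough illustration of that resolution gap, here is a back-of-the-envelope sketch of how many horizontal grid cells are needed to tile the globe at two spacings (the 25 km and 250 km figures are round numbers chosen for the example, not the configuration of any particular model):

import math

EARTH_SURFACE_KM2 = 4 * math.pi * 6371**2  # approximate surface area of the Earth in km^2

def horizontal_cells(spacing_km):
    """Rough count of horizontal grid cells needed to tile the globe at a given spacing."""
    return EARTH_SURFACE_KM2 / spacing_km**2

weather = horizontal_cells(25)    # illustrative short-range forecast spacing
climate = horizontal_cells(250)   # illustrative century-scale climate spacing
print(f"{weather:,.0f} cells at 25 km vs {climate:,.0f} cells at 250 km "
      f"(about {weather / climate:.0f}x fewer in the coarse model)")

Coarsening the spacing by a factor of ten cuts the horizontal cell count by a factor of one hundred, which is why individual weather systems are poorly resolved in century-scale runs.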
The problem is not one of physics, but one of computation. According to Palmer, "we do not have the computing power to solve the known partial differential equations of climate science with sufficient accuracy."
Increasing the resolution of a model creates ever larger computational demands, Palmer explains: halving the grid spacing can increase the computational cost by up to a factor of 16. On top of that, a host of other processes is vying for computer time at national climate-prediction institutes. The UK Meteorological Office, for example, must run fluid-dynamics algorithms while also accounting for the Earth's relevant biological and chemical processes, such as the carbon cycle. Monte Carlo calculations are needed to estimate the effects of unavoidable approximations. And climate scenarios must not only run forward but also look back, integrating up to a thousand years of historical climate data.
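The factor of 16 follows from simple bookkeeping: halving the spacing doubles the number of points in each horizontal direction, doubles the number of vertical levels if those are refined too, and typically forces the time step to halve as well for numerical stability. A minimal sketch of that scaling argument (a simplification for illustration, not the cost model of any particular centre):

def cost_factor(refinement=2, refine_vertical=True, refine_timestep=True):
    """Multiplicative increase in compute cost when grid spacing shrinks by `refinement`."""
    factor = refinement ** 2          # more points in each horizontal direction (x and y)
    if refine_vertical:
        factor *= refinement          # more vertical levels
    if refine_timestep:
        factor *= refinement          # smaller time step (CFL-type stability constraint)
    return factor

print(cost_factor())                         # 16: spacing halved in x, y, z and time
print(cost_factor(refine_vertical=False))    # 8: horizontal-only refinement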
For the above reasons, Palmer argues that the computing needs of today's climate modelers are not being met, and cannot be met by individual research outfits. By pooling resources, however, a well-funded global coalition could accomplish far more than regional sites. Researchers would no longer have to leave important elements out of their models, or choose between equally worthy scientific projects, for lack of funding and resources.
It is time to start planning for a truly international climate-prediction facility, on the scale of ITER or CERN. Such a centre would not replace existing national climate centres. Rather, it would allow them to do the sort of research and experimentation that is currently impossible. Indeed, the relationship between the proposed facility and the national climate centres could resemble that between CERN and the university groups that devise the experiments run at the lab: collaboration rather than competition.
Such a facility would allow cutting-edge exascale technology (10^18 operations per second) to be dedicated to understanding and predicting climate, for the benefit of society worldwide, as soon as that technology becomes available in a few years' time.
Full story at Physics World (PDF)