“Climate change is a global problem that requires global solutions,” writes Tim Palmer, a Royal Society Research Professor at Oxford University and co-director of the program on predicting climate change at the Oxford Martin School. The international organization CERN studies particle physics, but there is no equivalent group dedicated to climate change. Palmer argues the time has come for such a facility: he believes that scientists’ ability to create accurate, reliable, high-resolution climate models is being held back by a shortage of computing power.
Current weather models are quite good at predicting conditions a few days ahead, but global climate models must look much further out. For example, the models used in the fourth assessment report of the Intergovernmental Panel on Climate Change (IPCC) attempt to project 100 years into the future. Because they must factor in so many more variables over such a long period, these long-range models rely on shortcuts, the main one being grid spacing. Short-term forecasts use a fine grid spacing of a few tens of kilometres, while the very long-range models typically use a spacing roughly ten times coarser. As a result, they cannot reliably predict whether particular weather patterns will become more or less likely as greenhouse gas concentrations increase.
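A back-of-envelope calculation shows what a tenfold-coarser grid costs in detail. The figures below are illustrative assumptions, not numbers from the article: Earth's surface area of roughly 5.1 × 10^14 m², a 25 km "weather" grid, and a 250 km "climate" grid.

```python
# Rough comparison of horizontal grid-point counts at two model resolutions.
# Assumed figures (illustrative, not from the article): Earth's surface area
# ~5.1e14 m^2; 25 km weather-model spacing vs 250 km climate-model spacing.

EARTH_SURFACE_M2 = 5.1e14  # approximate total surface area of Earth

def horizontal_cells(spacing_m: float) -> float:
    """Approximate number of grid columns covering Earth's surface."""
    return EARTH_SURFACE_M2 / spacing_m**2

weather_cells = horizontal_cells(25_000)    # ~8.2e5 columns
climate_cells = horizontal_cells(250_000)   # ~8.2e3 columns

# A grid ten times coarser has a hundred times fewer horizontal points.
print(f"{weather_cells / climate_cells:.0f}x fewer points on the coarse grid")
```

Each coarse cell is then asked to represent the average of weather over a region hundreds of kilometres across, which is why localized patterns such as storms fall through the grid.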
The problem is not one of physics, but one of computation. According to Palmer, “we do not have the computing power to solve the known partial differential equations of climate science with sufficient accuracy.”
Increasing the resolution of a model creates ever larger computational demands, Palmer explains: reducing the grid spacing by half can increase the computational cost by up to a factor of 16. On top of that, a host of other processes vie for computer time at national climate-prediction institutes. The Meteorological Office in the UK, for example, must run fluid-dynamics algorithms while also accounting for the Earth's relevant biological and chemical processes, such as the carbon cycle. Monte Carlo calculations are needed to estimate the effects of unavoidable approximations. And climate scenarios must not only run forward but also look back, integrating up to a thousand years of historical climate data.
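The factor of 16 can be sketched with standard scaling reasoning, though Palmer's exact accounting may differ: halving the spacing in all three spatial dimensions multiplies the point count by 2³ = 8, and numerical stability (the CFL condition) forces the time step to halve as well, for 2⁴ = 16 times the work.

```python
# Why halving grid spacing can cost ~16x more compute: a rough scaling model.
# Assumption (standard CFL-style reasoning, not Palmer's exact accounting):
# refining the mesh by a factor r in x, y, and z multiplies the number of
# grid points by r^3, and the explicit time step must shrink by r as well.

def cost_factor(refinement: float, spatial_dims: int = 3) -> float:
    """Relative compute cost of refining grid spacing by `refinement`."""
    points = refinement ** spatial_dims  # more grid points per snapshot
    steps = refinement                   # proportionally shorter time step
    return points * steps

print(cost_factor(2))  # halving the spacing -> 16.0
```

By the same logic, the tenfold jump from climate-model to weather-model resolution would multiply the cost by roughly 10⁴, which is why it is out of reach for any single national centre.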
For these reasons, Palmer argues, the computing needs of today's climate modelers are not being met, at least not by individual research outfits. By pooling resources, however, a well-funded global coalition could accomplish far more than regional sites. Researchers would no longer be forced to leave important elements out of their models, or to choose between equally worthy scientific projects for lack of funding and resources.
Writes Palmer:
It is time to start planning for a truly international climate-prediction facility, on a scale such as ITER or CERN. Such a centre would not replace existing national climate centres. Rather, it would allow them to do the sort of research experimentation currently impossible. Indeed, the collaboration between the proposed facility and the national climate centres could be similar to that between CERN and the university groups that devise the experiments run at the lab. There would be collaboration rather than competition.
Such a facility would allow the dedicated use of cutting-edge exascale (10^18 operations per second) technology for understanding and predicting climate, for the benefit of society worldwide as soon as this technology becomes available in a few years’ time.