September 24, 2010

Supercomputing Energy Use Getting a Bad Rap

Tiffany Trader

In a thought-provoking piece over at ZDNet, Numerical Algorithms Group’s Andrew Jones takes a look at the supercomputing power consumption equation, examining whether its current trajectory is really as untenable as commonly assumed.

He writes:

There are a range of estimates for the likely power consumption of the first exaflops supercomputers, which are expected at some point between 2018 and 2020. But probably the most accepted estimate is 120MW, as set out in the Darpa Exascale Study edited by Peter Kogge (PDF).

At this figure, the supercomputing community panics and says it is far too much — we must get it down to between 20MW and 60MW, depending who you ask — and we worry even that is too much. But is it?

What follows is a comparison of today’s largest supercomputers with their closest kin, major scientific research facilities.

In Jones’ opinion:

[T]he largest supercomputers at any time, including the first exaflops, should not be thought of as computers. They are strategic scientific instruments that happen to be built from computer technology. Their usage patterns and scientific impact are closer to major research facilities such as Cern, Iter, or Hubble.

Thinking of the big supercomputers that way, their power consumption and other costs — construction, operation, and so forth — are comparable to other major research centers and not that outrageous, concludes Jones.

Jones also tackles the subject of whether it makes sense to continually improve and replace systems every couple of years (as we currently do) or whether it would offer more value to society to collaborate on the construction of one mega-supercomputer every decade — putting ten years of resources into it, and then relying only on that system for ten years. There are, of course, pros and cons to each path. Because supercomputing performance increases exponentially, the first option delivers a greater number of exaflops per year; the second option, however, would save the resources spent continually rewriting and validating code, and could deliver the equivalent of a 2030-era system ten years ahead of schedule.
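The exponential-growth arithmetic behind the first option can be sketched with a quick back-of-the-envelope calculation. The doubling period and baseline performance below are illustrative assumptions for the sake of the comparison, not figures from Jones’ article:

```python
# Back-of-the-envelope comparison of cumulative compute over a decade.
# Assumptions (not from the article): performance doubles every two
# years, and the baseline system delivers 1 exaflops in year 0.

def continual_replacement(years=10, doubling_period=2.0, base=1.0):
    """Cumulative exaflops-years when systems are upgraded continually,
    so each year's machine tracks the exponential performance curve."""
    total = 0.0
    for year in range(years):
        total += base * 2 ** (year / doubling_period)
    return total

def fixed_system(years=10, base=1.0):
    """Cumulative exaflops-years from one year-0 system run unchanged."""
    return base * years

upgraded = continual_replacement()
fixed = fixed_system()
print(f"continual replacement: {upgraded:.1f} exaflops-years")
print(f"fixed year-0 system:   {fixed:.1f} exaflops-years")
```

Under these assumed numbers, continual replacement yields roughly seven times the cumulative compute of standing pat on a single machine, which is the intuition behind "more exaflops per year" — though, as Jones notes, it says nothing about the software-rewriting costs that a stable decade-long platform would avoid.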

Jones is not sold on either path, but wonders why we are so set on the first option without giving some consideration to the second. Check out the full article for more in-depth treatment of these ideas.

Full story at ZDNet
