Here is a collection of highlights, selected totally subjectively, from this week’s HPC news stream as reported at insideHPC.com and HPCwire.
10 words and a link
AMD to cut 1,100 jobs and reduce salaries
IBM reports record 2008
Digging into Data challenge announced, first deadline March 15
Open Education Cup deadline draws near
Budget set for Japan’s next generation super in ’09
HPC helps build crankshafts 80 percent faster
NCSA may be stiffed on state funds for Blue Waters
University of Canterbury first to teach supercomputing in Australasia
TotalView adds support For HP Cluster System
NVIDIA partners with India-based Wipro for CUDA services
AMD and Sun deliver small cluster to China
LinkedIn as a resource for staying hooked up in HPC
UK Met Office super found to have a black thumb
The TimesOnline ran a story last week about the UK Met Office’s new £33M ($65M, roughly) IBM super. Evidently some public reaction to the computer’s carbon footprint has been negative:
For the Met Office the forecast is considerable embarrassment. It has spent £33m on a new supercomputer to calculate how climate change will affect Britain — only to find the new machine has a giant carbon footprint of its own.
“The new supercomputer, which will become operational later this year, will emit 14,400 tonnes of CO2 a year,” said Dave Britton, the Met Office’s chief press officer. This is equivalent to the CO2 emitted by 2,400 homes — generating an average of six tonnes each a year.
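The quoted figures are at least internally consistent; a quick back-of-the-envelope check (a throwaway sketch, not anything the Met Office published):

```python
# Sanity check: 2,400 homes averaging 6 tonnes of CO2 each
# should match the machine's quoted 14,400 tonnes per year.
homes = 2400
tonnes_per_home = 6
machine_tonnes_per_year = 14_400

assert homes * tonnes_per_home == machine_tonnes_per_year
print(homes * tonnes_per_home)  # 14400
```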
This is a point of sensitivity in the UK because the Met Office has recently been in the business of publishing admonitions about the consequences of failing to reduce carbon footprints:
However, when it came to buying a new supercomputer, the Met Office decided not to heed its own warnings. The ironic problem was that it needed the extra computing power to improve the accuracy of its own climate predictions as well as its short-term weather forecasting. The machine will also improve its ability to predict extreme events such as fierce localised storms, cloudbursts and so on.
Met Office officials argue that supercomputing is the only way to get a handle on climate change, and on the potential environmental impact of changes in human behavior driven by energy consumption policy. I’m certainly not arguing that the Met Office made a bad decision, but I think they are guilty of having a tin ear on this issue, which at least suggests some level of organizational dysfunction with respect to carbon footprint (i.e., concern about it has not yet become a “gut” issue that pervades the organization). The real problem here, in my opinion, is that the Met Office was not prepared to frame its purchase in those terms, or to address the concerns when they were raised.
New HPC services, hosting firm announced in Europe
UK-based EigenForge Limited came out of the closet, press release in hand, on Monday of this week. EigenForge will focus on three areas of business:
1) Fully Managed HPC Service. Following a business and work-flow consultancy review with a potential client, EigenForge will procure, host and manage the client’s HPC system in an EigenForge data centre.
2) CPU Brokerage…EigenForge will match clients with an excess of HPC resource with those who demand extra capacity to meet intermittent demand.
3) HPC Consultancy…Expert and impartial advice on internal infrastructure, data centre appraisals, market reviews, tender writing, and best practices will ensure the fastest and most successful way of starting the procurement process.
This brings together services offered by a number of boutique firms throughout the HPC ecosystem. For example, Parabon (cycle brokering, article at HPCwire), NAG (consultancy, article at HPCwire back in August), and Nimbis Services (brokering, consultancy; InsideTrack here) all come to mind. Managing other people’s HPC gear is kind of like traditional colocation, I guess, with the twist that there is a real chance for added value if the colo folks actually know something about HPC.
The founder, Jason Hogan-O’Neill, apparently does know something about HPC, from a user and sales perspective anyway. He was a user (condensed matter physics), managed HPC centers, and worked pre-sales at SGI.
Does all of this transfer into potentially valuable consulting? Potentially. But successful operation of these petulant beasts is the long product of all kinds of lessons, many of which one only learns after decades of finding them, one by one, standing in the middle of a week of downtime on your machines. We’ll have to wait for customers to tell us if there is value here for them.
All of their facilities are in the UK right now, so this looks like a European play for the time being.
House Democrats have innovation on their minds
The outstanding Computing Research Policy Blog has an analysis of what the Democrats propose to do with respect to science and computing to stimulate the economy. You can read the full text (pdf) of the draft legislation as well as the committee report (pdf) online.
Here are some highlights from the legislation related to computing and high performance computing:
$100 million increase for the Advanced Scientific Computing Research program. The only other program in Science to get a specific call-out is the brand new Advanced Research Projects Agency — Energy (ARPA-E), which would receive $400 million.
The NSF gets an additional $3B:
Of the $3 billion, $2.5 billion would go to the Research and Related Activities Account, home of NSF’s core research efforts. Of that $2.5 billion, $300 million would go to the Major Research Instrumentation program and an additional $200 million to academic research facilities modernization. This leaves an additional $2.0 billion to be spread among the research directorates for their core programs!
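The arithmetic in that breakdown checks out; here is a quick sketch (figures in millions of dollars, taken from the quote above):

```python
# NSF Research and Related Activities breakdown, in $ millions,
# per the committee report quoted above.
rra_total = 2500        # $2.5B to Research and Related Activities
mri = 300               # Major Research Instrumentation program
facilities = 200        # academic research facilities modernization

core_programs = rra_total - mri - facilities
print(core_programs)    # 2000, i.e. the $2.0B left for core programs
```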
Plus lots more, for NIST, NIH, and others. The CRA is very upbeat on this proposal, banking on the perspective that science and technology innovation will drive the economy out of the dumper.
In summary, though, this looks awfully good to us and will likely go a long way towards recharging the Nation’s innovation engine.
I’ve been a federal employee off and on for the past 15 years (more on than off), and one thing I’ve learned is never to underestimate any branch of the government’s ability to snatch defeat from the jaws of victory. One thing in this bill’s favor is the presidential honeymoon.
I haven’t read the bill text (and probably won’t, I’m pushed right now), but a text search for combinations of HPC and supercomputing didn’t turn up anything. There does appear to be a lot of discretion in this bill for the various agency directors to do the Right Thing, which I’m in favor of in general. Hopefully the bill will be delayed long enough that the agencies will have their new heads and that the rotation will open an opportunity to bring in new thinking.
—–
John West is part of the team that summarizes the headlines in HPC news every day at insideHPC.com. You can contact him at [email protected].