Here's a collection of highlights, selected totally subjectively, from this week's HPC news stream as reported at insideHPC.com and HPCwire.
>>10 words and a link
Dan Reed and Tony Hey are featured at upcoming RENCI/Microsoft eScience workshop;
http://insidehpc.com/2007/05/07/renci-and-microsoft-sponsor-escience-workshop/
Release 0.2.6 of JPPF, a grid computing toolkit for Java;
http://insidehpc.com/2007/05/07/jppf-update-to-026/
French computer maker Bull introduces new HPC system;
http://insidehpc.com/2007/05/07/bulls-new-system/
Progress in quantum computing research: a coupled qubit circuit;
http://insidehpc.com/2007/05/08/coupled-qubits/
RapidMind revs Cell/GPGPU application development platform;
http://insidehpc.com/2007/05/09/rapidmind-revs-development-platform/
Odd bedfellows: Dell, Novell, Microsoft and Red Hat, IBM;
http://insidehpc.com/2007/05/09/odd-bedfellows-dell-novell-microsft-red-hat-ibm/
>>Losses for primary HPC vendors
Both SGI and Cray reported quarterly financials this week, and if this was NPR's “Marketplace” we'd be playing the sad music.
Supercomputer pioneer Cray reported Q1 results for the quarter ending 31 Mar. Total revenue for the quarter was down year over year, to $47.1 million from $48.5 million in the same quarter of 2006. There was some bright news: the net loss narrowed to $0.8 million ($0.03 per share), versus $5.3 million in Q1 2006. Recall that Cray earned a profit of $0.33 per share in Q4 2006. Cray's financial filings indicate that it is re-spinning some of the key silicon in its next product offerings, a move that may result in delays in Black Widow and the XMT.
Graphics pioneer turned supercomputer manufacturer SGI reported its second quarterly results since exiting bankruptcy. The company reported quarterly revenue of $111 million with an operating loss of $20 million. Still, the company ended the quarter with $70 million in cash, having spent about $6 million in cash during the quarter for settlement of bankruptcy-related obligations.
>>Changes in utility computing pricing models reflect optimism
Utility computing — where a hosting company buys the supercomputer and you just rent the time — has gotten a lot of press lately, though it's certainly not a new idea. These days the timesharing concepts developed for mainframes of the 60s and 70s are conflated with modern terms like “software as a service” and “service-oriented architecture,” but many of the ideas are still recognizable.
Despite the concept's fairly unflattering track record over the past several decades, particularly in the technical computing arena, the recent growth of interest in HPC to enable core enterprise business processes (the fastest growing segment of the HPC market) has spurred providers like Sun, Unisys, Amazon and others to give it a go.
Recently the idea appears to have gotten some traction even among scientific customers, and that is spurring a closer evaluation of pricing for these services. Byte and Switch ran a piece last week on the recent efforts by Sun, Amazon and Unisys to overhaul their hosted computation solutions. The article is fairly skeptical, as illustrated by this quote from Michael Dortch, director at the Robert Frances Group: “People understand the concept but there are still not enough pre-built applications or documented case studies to lower peoples' confusion thresholds.”
But it notes that both Sun and Amazon have adapted their pricing model in an attempt to balance the price people pay with the demands they place on the infrastructure. This is similar, of course, to the way that you pay a different rate for electricity than your local manufacturing plant.
I view these changes as overall positive for the utility computing business. If companies are investing the time and effort to tweak pricing based upon well-understood exemplars (like the power utility pricing models) then they have evidence that it's worth their time to do so.
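To make the power-utility analogy concrete, here is a minimal sketch of how a tiered usage-based pricing model works. The tier boundaries and rates below are invented for illustration only; they are not Sun's, Amazon's, or any other provider's actual prices.

```python
# Hypothetical tiered price sheet (invented numbers, not any vendor's
# real rates): the more CPU-hours you consume in a billing period, the
# cheaper the marginal hour -- mirroring power-utility rate tiers.
TIERS = [
    (1_000, 1.00),          # first 1,000 CPU-hours at $1.00/hr
    (9_000, 0.80),          # next 9,000 CPU-hours at $0.80/hr
    (float("inf"), 0.60),   # everything beyond that at $0.60/hr
]

def usage_cost(cpu_hours: float) -> float:
    """Total charge for a billing period under the tiered rates above."""
    cost, remaining = 0.0, cpu_hours
    for tier_size, rate in TIERS:
        hours = min(remaining, tier_size)  # hours billed in this tier
        cost += hours * rate
        remaining -= hours
        if remaining <= 0:
            break
    return cost

print(usage_cost(500))     # small job: all 500 hours in tier 1 -> 500.0
print(usage_cost(15_000))  # 1000*1.00 + 9000*0.80 + 5000*0.60 -> 11200.0
```

The design point is the one in the article: a heavy user and a light user place very different demands on the shared infrastructure, so a flat per-hour rate either overcharges one or undercharges the other, and tiering lets the provider balance the two.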
You can get a pointer to specifics on the pricing changes and to the Byte and Switch article at http://insidehpc.com/2007/05/08/byte-and-switch-hosted-grid-offerors-working-hard-to-lure-skeptical-users/.
>>Arctic ice models 30 years off
So, this is bad news if you ask me. According to a study by scientists at the National Center for Atmospheric Research (NCAR) and the University of Colorado's National Snow and Ice Data Center (NSIDC):
“[B]ecause of the disparity between the computer models and actual observations, the shrinking of summertime ice is about 30 years ahead of the climate model projections. As a result, the Arctic could be seasonally free of sea ice earlier than the IPCC-projected timeframe of any time from 2050 to well beyond 2100.”
Read the whole thing at http://www.ucar.edu/news/releases/2007/seaice.shtml, then go out and buy a hybrid.
-----
John West summarizes the headlines in HPC every day at insideHPC.com, and writes on leadership and career issues for technology professionals at InfoWorld and on his own blog at onlytraitofaleader.com. You can contact him at [email protected].