Here’s a collection of highlights, selected totally subjectively, from this week’s HPC news stream as reported at insideHPC.com and HPCwire.
>>10 words and a link
ACM/IEEE HPC Ph.D. Fellowship Program accepting nominations;
4th annual Stanford HPC conference;
HPC Oak Ridge photo tour at C|Net;
Foster on HPC 2008 in Cetraro;
Verari’s 16 core vis solution;
>>HPC’s Trickle Effect
LinuxWorld has an interesting article today on HPC’s influence on the greater landscape of enterprise IT. Generally speaking, high performance computing organizations operate on the fringe of what most consider “enterprise information technology” — some more lunatic than others. Either way, the constant forward drive of the HPC industry as a whole has become a significant influence on the enterprise world.
Just as the U.S. space program yielded innovations such as scratch-resistant lenses, all-weather radial tires, and hospital equipment for monitoring patients’ vital signs, HPC has been a source of innovation for smaller-scale architectures. The technology used to run HPC is spilling over into mainstream IT, reaching smaller entities and companies operating at a far smaller scale than Lawrence Livermore or Sandia National Laboratories. The enterprise and SMB markets now have an abundance of resources to draw from traditional HPC.
>>Dreamworks and Intel Team Up
DreamWorks Animation and Intel have announced a “strategic alliance” for creating 3D animated movies. Beginning early next year, DreamWorks will produce all of its films on Intel-based platforms. From the release:
To meet the increased demands of creating 3-D animated feature films, Intel will provide DreamWorks Animation with the latest high-performance processing technologies, including future chips with multiple processing cores. Intel software engineers will help to optimize DreamWorks’ applications for these advanced processors.
Interesting…this could be the first sign of Intel’s recent graphics investments rearing their head in commercial industry.
>>Cray Henry on HPCMP Archive Strategy
Cray Henry, director of the DoD’s High Performance Computing Modernization Program, has written an article for Government Computer News detailing the current and future plans for the HPCMP’s data archive system. For those who don’t know, the HPCMP is tasked with providing computational and scientific support to more than 4,000 scientists and engineers performing DoD-sponsored R&D. According to the article, the program expects to generate each year roughly one-third of the total data it has accumulated over its 15-year history. The HPCMP has begun looking at more efficient ways of managing all this data:
Meeting the next five years’ storage requirements will involve increasing the number of machines devoted to storage, improving mechanisms for predicting future storage needs, and possibly integrating algorithms into applications that allow users to catalog and define the storage period for new data.
During the next year, we will institute a number of strategies, such as a revised retention policy, reliance on the users to more proactively manage their data and an upgrade of storage systems, including new storage-density technologies.