Renting HPC: What's Cloud Got to Do with It?
July 1, 2010

If virtualization is one of the biggest hurdles preventing clouds from handling HPC applications with supercomputer-class might and speed, why not simply remove the abstraction and ditch the virtualization altogether? The result would be very recognizable to many in research and academia: it is the age-old "rent a cluster" paradigm. And what is wrong with it? Or, to clarify: what is wrong with it now that the term "cloud" has all the gloss of a gleaming new datacenter, and cluster rental as a concept (CRAC) seems far less appealing?