The staff at Intersect360 Research have published their seventh Site Budget Allocation Map, pulling back the curtain on HPC site spending across industry, government and academic sites to identify important trends. Overall this was a strong year, characterized by fairly stable spending across the major categories (hardware, software, etc.), with 59 percent of all sites anticipating further growth.
While advancing the field of HPC into the exascale era is beset by many obstacles, resiliency may be the thorniest of all. As core counts proliferate, so too does the number of incorrect behaviors, threatening not just the operation of the machine but the validity of the results as well.
CEO Pete Manca details Egenera’s unusual journey from hardware vendor to software provider.
During the International Supercomputing Conference, Bull's Matthew Foxton sounded an alarm for the European supercomputing community, warning that R&D will not prove useful to Europe's future without solid investment in the "D", not just the "R".
Photorealistic rendering for design and animation is pushing multicore processors to their limit with key software advancements.
A recent effort led by Cycle Computing, based on the SHOC benchmark, revealed comparable performance between GPU-accelerated cloud instances and native hardware.
With exascale predictions all the rage, here’s a more sobering look at the next big thing in supercomputing.
The idea that HPC in the cloud should be simple and fulfill the true promise of instant, on-demand resources without effort is faulty, according to Joe Landman, who argues that customer expectations are out of step with HPC cloud realities.
Achieving workable software-based fault tolerance will require a fresh approach for developers.
Big Blue sees green in mainstream high performance computing market.