June 18, 2008
On Wednesday Microsoft announced something big in a little number: NCSA placed one of its systems, the 9,000+ core Abe cluster, at number 23 on the latest Top500 list. This is the highest-ranking Windows HPC system to date, with important implications for Microsoft and the HPC community.
June 18, 2008
There is no other way to characterize this year: 2008 will be remembered as "the year" -- the year that one petaflops of Linpack performance was achieved. It is a milestone that has been anticipated for almost a decade and a half, and one accomplished through the synthesis of the two big trends that have emerged as the driving forces in HPC over the last few years -- multicore and heterogeneous computing.
June 13, 2008
The 23rd annual International Supercomputing Conference (ISC) will bring together many of the world's leading experts in high performance computing this week in Dresden, Germany. HPCwire had the opportunity to ask conference chair Prof. Hans Meuer about the upcoming conference and his thoughts on the direction of supercomputing.
June 9, 2008
Petaflop. Sure, it's just a number, but it's a big number. On June 10, IBM announced that its Roadrunner supercomputer reached a record-breaking one petaflop -- a quadrillion floating point operations per second -- on the standard Linpack benchmark. It is the first general-purpose computer to reach this milestone.
As genomic data becomes ubiquitous, infrastructure bottlenecks are tightening for life sciences organizations. But speedy analysis and real-time decision making don't have to remain out of reach: modern end-to-end systems are emerging as flexible solutions that deliver a competitive edge.
As data processing grows increasingly specialized, effective storage strategies are more important than ever. For IT professionals reevaluating their storage needs, software-defined and object-based storage are gaining ground by automating storage management – a trend that is only set to continue.
High performance workloads, big data, and analytics are increasingly important in extracting real value from today's applications and data. Before we can deploy applications and mine data for mission and business insights, we need a high-performance, rapidly scalable, resilient infrastructure foundation that can accurately, securely, and quickly access data from all relevant sources. Red Hat offers technology that supports high performance workloads on a scale-out foundation, integrating multiple data sources and moving workloads across on-premises and cloud boundaries.
HPC may once have been the sole province of huge corporations and national labs, but with hardware and cloud resources becoming more affordable, even small and mid-sized companies are taking advantage.
© HPCwire. All Rights Reserved. A Tabor Communications Publication
Reproduction in whole or in part in any form or medium without express written permission of Tabor Communications, Inc. is prohibited.