June 2, 2010
Even as we gain a footing in the era of petaflops computing, we have set in motion the exploration of the undiscovered domain of exaflops computing. This year has seen the launch of multiple programs to develop the concepts, architectures, software stacks, programming models, and new families of parallel algorithms necessary to make exaflops capability a practical reality before the end of this decade. Read more…
June 1, 2010
Chipmaker Intel is reviving its Larrabee technology for the HPC market, with plans to bring a manycore coprocessor to market in the next few years. During the ISC'10 opening keynote, Kirk Skaugen, vice president of Intel's Architecture Group and general manager of the Data Center Group, announced that the company is developing what it calls a "Many Integrated Core" (MIC) architecture, which will be the basis of a new line of processors aimed squarely at high performance technical computing applications. Read more…
May 31, 2010
A Chinese supercomputer called Nebulae, powered by the latest Fermi GPUs, grabbed the number two spot on the TOP500 list announced earlier today. The new machine delivered 1.27 petaflops of Linpack performance, yielding only to the 1.76 petaflop Jaguar system, which retained its number one berth. Read more…
May 28, 2010
Dr. Ashwini Nanda has been at the center of some of the most cutting-edge HPC projects and initiatives in the world. In this interview, Dr. Nanda talks about high performance computing in India, how he sees the industry today, and what led him to found his company, HPC Links. Read more…