June 22, 2009
Thomas Lippert, director of the Jülich Supercomputing Center in Germany, is speaking at this year's International Supercomputing Conference in Hamburg about his experiences with the exotic systems that lead the TOP500 list of the HPC community's preeminent supercomputers, and the scientific breakthroughs they enable. We caught up with Dr. Lippert by email before the conference to get a sneak peek at his thoughts on working at the extremes of computation. Read more…
June 21, 2009
When 1,500 leading members of the world’s high performance computing community convene June 23-26 at the 2009 International Supercomputing Conference, the opening keynote address will be presented by Andreas “Andy” von Bechtolsheim, the legendary co-founder of Sun Microsystems and founder and Chief Development Officer of Arista Networks. Von Bechtolsheim will discuss “The Evolution of Interconnects for High Performance Computing.” Read more…
March 25, 2009
The 24th International Supercomputing Conference (ISC) convenes June 23-26 in the Congress Center Hamburg, and organizers are expecting more than 1,500 participants and about 120 exhibitors from around the world. As final preparations are being made, Prof. Hans Meuer took a break from his duties as general conference chair to discuss the event's new venue in Hamburg as well as this year's highlights. Read more…
As data processing grows increasingly specialized, effective storage strategies matter more than ever. For IT professionals reevaluating their storage needs, software-defined and object-based storage are gaining ground by automating storage management – a trend that is only set to continue.
High performance workloads, big data, and analytics are increasingly important for extracting real value from today's applications and data. Before deploying applications and mining data for mission and business insights, organizations need a high-performance, rapidly scalable, resilient infrastructure foundation that can access data from all relevant sources accurately, securely, and quickly. Red Hat offers technology that supports high performance workloads on a scale-out foundation, integrating multiple data sources and transitioning workloads across on-premises and cloud boundaries.
HPC may once have been the sole province of huge corporations and national labs, but with hardware and cloud resources becoming more affordable, even small and mid-sized companies are taking advantage.
© HPCwire. All Rights Reserved. A Tabor Communications Publication
Reproduction in whole or in part in any form or medium without express written permission of Tabor Communications, Inc. is prohibited.