January 12, 2010
Company offers way to make large-memory x86 machines. Read more…
January 6, 2010
More videos from HPC's premier event of 2009. Read more…
December 2, 2009
Small vendors, flash memory on the rise. Read more…
November 27, 2009
Big Blue unveils "Blue Waters" server node. Read more…
November 25, 2009
Before SC09 recedes too far in the rear-view mirror, it's probably worth recapping some of the news connected to the big trends that emerged at the conference. Read more…
November 20, 2009
The simple statement most often put forward is that we need ubiquitous parallelism in the classroom. In the near future, most electronic devices will have multiple cores that would benefit greatly from parallel programming. The low-hanging fruit is, of course, the student's laptop, and helping students make full use of it. Read more…
November 20, 2009
Supercomputer performance has grown at a fairly constant rate of a 1,000-fold increase per decade. Will the sprint to exascale be able to hold that pace? Read more…
Five Recommendations to Optimize Data Pipelines
When building AI systems at scale, managing the flow of data can make or break a business. Each stage of the AI data pipeline poses its own challenges that can disrupt or misdirect that flow, ultimately undermining the effectiveness of AI storage and systems.
With so many applications and such diverse requirements for data types, management systems, workloads, and compliance regulations, these challenges are only amplified. Without a clear, continuous flow of data throughout the AI data lifecycle, AI models can perform poorly or even behave dangerously.
To ensure your AI systems are optimized, follow these five essential steps to eliminate bottlenecks and maximize efficiency.
Karlsruhe Institute of Technology (KIT) is an elite public research university located in Karlsruhe, Germany, engaged in a broad range of disciplines across the natural sciences, engineering, economics, the humanities, and the social sciences. For institutions like KIT, HPC has become indispensable to cutting-edge research in these areas.
KIT’s HoreKa supercomputer supports hundreds of research initiatives, including a project aimed at predicting when the Earth’s ozone layer will be fully healed. With HoreKa, projects like these can process larger amounts of data, enabling researchers to deepen their understanding of highly complex natural processes.
Read this case study to learn how KIT implemented its supercomputer, powered by Lenovo ThinkSystem servers with Lenovo Neptune™ liquid cooling technology, to attain higher performance while reducing power consumption.