September 27, 2021
Setting aside the relatively recent rise of electronic signatures, personalized stamps have been a popular form of identification for formal documents in East A Read more…
July 16, 2021
In early March 2020, on the cusp of the pandemic’s global acceleration, a popular Twitter user called ZDoggMD (in real life, a physician named Zubin Damania) Read more…
April 3, 2021
Like any virus, HIV requires host cells to reproduce; however, key steps in HIV’s reproductive process have eluded scientists for decades. Now, a team of rese Read more…
January 26, 2021
COVID-19 may have dominated headlines and occupied much of the world’s scientific computing capacity over the last year, but many researchers continued their Read more…
July 30, 2020
As the COVID-19 pandemic progresses, the hunt for antibodies (the protective proteins produced by the human body in response to an infection) has reached a feve Read more…
June 9, 2020
Pittsburgh Supercomputing Center (PSC, a joint research organization of Carnegie Mellon University and the University of Pittsburgh) has won a $5 million award Read more…
July 8, 2019
For the last 12 years, the “Galaxy Zoo” has been working hard to improve our understanding of the cosmos. Despite its name, the Galaxy Zoo doesn’t house a Read more…
July 31, 2015
A plenary panel at the XSEDE15 conference, which took place this week in St. Louis, Mo., highlighted the broad spectrum of computing resources provided by the N Read more…
Five Recommendations to Optimize Data Pipelines
When building AI systems at scale, managing the flow of data can make or break a business. The various stages of the AI data pipeline pose unique challenges that can disrupt or misdirect the flow of data, ultimately impacting the effectiveness of AI storage and systems.
With so many applications and diverse requirements for data types, management systems, workloads, and compliance regulations, these challenges are only amplified. Without a clear, continuous flow of data throughout the AI data lifecycle, AI models can perform poorly or even behave dangerously.
To ensure your AI systems are optimized, follow these five essential steps to eliminate bottlenecks and maximize efficiency.
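The five steps themselves are in the full piece, but one common way to keep data flowing between pipeline stages is to overlap loading and preprocessing with consumption so no stage sits idle. Below is a minimal, illustrative Python sketch of that idea, assuming a simple producer/consumer setup with a bounded prefetch queue; the stage functions (load_batch, preprocess, train_step) are hypothetical placeholders and not drawn from any product or vendor mentioned here.

```python
# Illustrative sketch only: overlapping pipeline stages with a bounded prefetch queue.
import queue
import threading

def load_batch(i):
    # Placeholder for an I/O-bound stage (e.g., reading from storage).
    return list(range(i * 4, i * 4 + 4))

def preprocess(batch):
    # Placeholder for a CPU-bound transform stage.
    return [x * 2 for x in batch]

def train_step(batch):
    # Placeholder for the consuming stage (e.g., feeding an accelerator).
    return sum(batch)

def producer(q, num_batches):
    # Load and preprocess in a background thread while the consumer runs.
    for i in range(num_batches):
        q.put(preprocess(load_batch(i)))
    q.put(None)  # Sentinel: no more data.

def run_pipeline(num_batches=8, prefetch=2):
    q = queue.Queue(maxsize=prefetch)  # Bounded queue applies backpressure.
    t = threading.Thread(target=producer, args=(q, num_batches))
    t.start()
    total = 0
    while (batch := q.get()) is not None:
        total += train_step(batch)
    t.join()
    return total

if __name__ == "__main__":
    print(run_pipeline())
```

The bounded queue is the key design choice in this sketch: it lets the loading stage run ahead of the consumer just far enough to hide I/O latency, while preventing unbounded memory growth when one stage is slower than another.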
Karlsruhe Institute of Technology (KIT) is an elite public research university located in Karlsruhe, Germany, engaged in a broad range of disciplines across the natural sciences, engineering, economics, humanities, and social sciences. For institutions like KIT, HPC has become indispensable to cutting-edge research in these areas.
KIT’s HoreKa supercomputer supports hundreds of research initiatives, including a project aimed at predicting when the Earth’s ozone layer will be fully healed. With HoreKa, projects like these can process larger amounts of data, enabling researchers to deepen their understanding of highly complex natural processes.
Read this case study to learn how KIT implemented their supercomputer powered by Lenovo ThinkSystem servers, featuring Lenovo Neptune™ liquid cooling technology, to attain higher performance while reducing power consumption.