September 14, 2022
When DeepMind, an Alphabet subsidiary, started out more than a decade ago, solving some of the most pressing research questions and problems with AI wasn’t at the top of the company’s mind. Instead, the company began its AI research with computer games. Every score and win was a measuring stick of success... Read more…
July 14, 2016
It’s perhaps fitting that in the middle of the summer, when water management is a common challenge, a paper in the Proceedings of the National Academy of Sciences (PNAS) offers more proof that life as we know it can’t occur without water. Using Ohio Supercomputer Center resources, researchers have shown the critical role water plays in actively guiding protein folding and movement. “For a long time, scientists have been trying to figure out how water interacts with proteins... Read more…
January 20, 2015
The smartphone has already usurped the role of the computer in a number of ways, whether you use it for navigation, to check the weather, or to take and share photos... Read more…
October 22, 2012
After a successful five-year run, Sony is ending its participation in Stanford University's Folding@home project. Read more…
November 4, 2010
Originally designed for video games, GPUs are now making their mark in the world of chemistry. Read more…
August 4, 2010
The Singularity is not so near after all. Read more…
February 24, 2009
Last week, the Folding@home team reported that they achieved five petaflops of processing power for their popular protein folding research project. Read more…
Five Recommendations to Optimize Data Pipelines
When building AI systems at scale, managing the flow of data can make or break a business. The various stages of the AI data pipeline pose unique challenges that can disrupt or misdirect the flow of data, ultimately impacting the effectiveness of AI storage and systems.
With so many applications and diverse requirements for data types, management systems, workloads, and compliance regulations, these challenges are only amplified. Without a clear, continuous flow of data throughout the AI data lifecycle, AI models can perform poorly or even produce dangerous results.
To ensure your AI systems are optimized, follow these five essential steps to eliminate bottlenecks and maximize efficiency.
Karlsruhe Institute of Technology (KIT) is an elite public research university located in Karlsruhe, Germany, engaged in a broad range of disciplines across the natural sciences, engineering, economics, humanities, and social sciences. For institutions like KIT, HPC has become indispensable to cutting-edge research in these areas.
KIT’s HoreKa supercomputer supports hundreds of research initiatives, including a project aimed at predicting when the Earth’s ozone layer will be fully healed. With HoreKa, projects like these can process larger amounts of data, enabling researchers to deepen their understanding of highly complex natural processes.
Read this case study to learn how KIT implemented its supercomputer powered by Lenovo ThinkSystem servers, featuring Lenovo Neptune™ liquid cooling technology, to attain higher performance while reducing power consumption.