September 14, 2022
When DeepMind, an Alphabet subsidiary, started out more than a decade ago, solving some of the most pressing research questions and problems with AI wasn't at the top of the company's mind. Instead, the company began its AI research with computer games. Every score and win was a measuring stick of success... Read more…
September 2, 2022
Fusion, the nuclear reaction that powers the Sun and the stars, has incredible potential as a source of safe, carbon-free and essentially limitless energy. But Read more…
July 15, 2022
The development of a whole device model (WDM) for a fusion reactor is critical for the science of magnetically confined fusion plasmas. In the next decade, the Read more…
May 26, 2021
Inertial confinement fusion (ICF) is a speculative method of fusion energy generation that would compress a fuel pellet to generate fusion energy ju Read more…
July 29, 2015
In a recent IEEE Micro article, a team of engineers and computer scientists from chipmaker Advanced Micro Devices (AMD) detail AMD's vision for exascale computing, which in its most essential form combines CPU-GPU integration with hardware and software support to facilitate running scientific workloads on exascale-class systems. Read more…
September 29, 2014
University of Texas at Austin physicist Wendell Horton has been using the resources of the Texas Advanced Computing Center (TACC) to study the full 3D structure Read more…
August 27, 2013
Fusion science, which seeks to recreate the energy of the stars for use on Earth, has long been the holy grail of energy researchers. A recent experiment at Lawrence Livermore's National Ignition Facility puts fusion energy one step closer. Read more…
March 20, 2013
LLNL researchers have successfully harnessed all 1,572,864 of Sequoia's cores for one impressive simulation. Read more…
Five Recommendations to Optimize Data Pipelines
When building AI systems at scale, managing the flow of data can make or break a business. The various stages of the AI data pipeline pose unique challenges that can disrupt or misdirect the flow of data, ultimately impacting the effectiveness of AI storage and systems.
With so many applications and diverse requirements for data types, management systems, workloads, and compliance regulations, these challenges are only amplified. Without a clear, continuous flow of data throughout the AI data lifecycle, AI models can perform poorly or even dangerously.
To ensure your AI systems are optimized, follow these five essential steps to eliminate bottlenecks and maximize efficiency.
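As a rough illustration of where such bottlenecks show up, the sketch below runs a generic three-stage pipeline (ingest, transform, load) and times each stage so the slowest one stands out. The stage names and functions are hypothetical placeholders for illustration only, not the five recommendations from the report itself.

# Minimal sketch of a staged data pipeline with per-stage timing.
# All stage functions here are hypothetical stand-ins for real ingest,
# transform, and load logic.
import time
from typing import Callable, List, Tuple

def ingest() -> List[dict]:
    # Hypothetical ingest stage: pull raw records from a source.
    return [{"id": i, "raw": f"sample-{i}"} for i in range(10_000)]

def transform(records: List[dict]) -> List[dict]:
    # Hypothetical transform stage: normalize each record.
    return [{**r, "clean": r["raw"].upper()} for r in records]

def load(records: List[dict]) -> int:
    # Hypothetical load stage: write records to AI-ready storage.
    return len(records)

def run_pipeline(stages: List[Tuple[str, Callable]]) -> None:
    # Run stages in order, timing each one to surface the bottleneck.
    data = None
    for name, fn in stages:
        start = time.perf_counter()
        data = fn() if data is None else fn(data)
        elapsed = time.perf_counter() - start
        print(f"{name:<10} {elapsed * 1000:8.1f} ms")

if __name__ == "__main__":
    run_pipeline([("ingest", ingest), ("transform", transform), ("load", load)])

In practice the same idea applies at much larger scale: instrumenting each pipeline stage is what tells you whether storage, preprocessing, or model-side consumption is the stage disrupting the flow of data.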
Karlsruhe Institute of Technology (KIT) is an elite public research university in Karlsruhe, Germany, engaged in a broad range of disciplines across the natural sciences, engineering, economics, humanities, and social sciences. For institutions like KIT, HPC has become indispensable to cutting-edge research in these areas.
KIT’s HoreKa supercomputer supports hundreds of research initiatives, including a project aimed at predicting when the Earth’s ozone layer will be fully healed. With HoreKa, projects like these can process larger amounts of data, enabling researchers to deepen their understanding of highly complex natural processes.
Read this case study to learn how KIT implemented their supercomputer powered by Lenovo ThinkSystem servers, featuring Lenovo Neptune™ liquid cooling technology, to attain higher performance while reducing power consumption.