Leibniz QIC’s Mission to Coax Qubits and Bits to Work Together

March 14, 2023

Four years after passing the U.S. National Quantum Initiative Act and decades after early quantum development and commercialization efforts started – think D- Read more…

Quantum – Are We There (or Close) Yet? No, Says the Panel

November 19, 2022

For all of its politeness, a fascinating panel on the last day of SC22 – Quantum Computing: A Future for HPC Acceleration? – mostly served to illustrate the Read more…

What’s New in HPC Research: Quantum Cryptography, Radiation Damage, Human Settlements & More

October 25, 2019

In this bimonthly feature, HPCwire highlights newly published research in the high-performance computing community and related domains. From parallel programm Read more…

Supercomputers Help Uncover the Link Between Earthquakes and Tsunamis

September 9, 2019

In 2018, a powerful earthquake hit Sulawesi in Indonesia. Palu – the capital of Central Sulawesi – was struck by an unexpected ensuing tsunami. In total, ov Read more…

Lenovo Unveils Warm Water Cooled ThinkSystem SD650 in Rampup to LRZ Install

February 22, 2018

This week Lenovo took the wraps off the ThinkSystem SD650 high-density server with third-generation direct water cooling technology developed in tandem with par Read more…

Earthquake Simulation Hits Petascale Milestone

April 15, 2014

German researchers are helping to push back the goalposts on large-scale simulation. Using the IBM “SuperMUC” high performance computer at the Leibniz S Read more…

  • Click Here for More Headlines

Whitepaper

Streamlining AI Data Management

Five Recommendations to Optimize Data Pipelines

When building AI systems at scale, managing the flow of data can make or break a business. The various stages of the AI data pipeline pose unique challenges that can disrupt or misdirect the flow of data, ultimately impacting the effectiveness of AI storage and systems.

With so many applications and diverse requirements for data types, management systems, workloads, and compliance regulations, these challenges are only amplified. Without a clear, continuous flow of data throughout the AI data lifecycle, AI models can perform poorly or even behave dangerously.

To ensure your AI systems are optimized, follow these five essential steps to eliminate bottlenecks and maximize efficiency.

Download Now

Sponsored by DDN

Whitepaper

Taking research further with extraordinary compute power and efficiency

Karlsruhe Institute of Technology (KIT) is an elite public research university located in Karlsruhe, Germany, engaged in a broad range of disciplines across the natural sciences, engineering, economics, humanities, and social sciences. For institutions like KIT, HPC has become indispensable to cutting-edge research in these areas.

KIT’s HoreKa supercomputer supports hundreds of research initiatives, including a project aimed at predicting when the Earth’s ozone layer will be fully healed. With HoreKa, projects like these can process larger amounts of data, enabling researchers to deepen their understanding of highly complex natural processes.

Read this case study to learn how KIT implemented its supercomputer, powered by Lenovo ThinkSystem servers featuring Lenovo Neptune™ liquid cooling technology, to attain higher performance while reducing power consumption.

Download Now

Sponsored by Lenovo
