February 16, 2021
With the one-year mark of the pandemic in the U.S. rapidly approaching and vaccinations ramping up, decision-makers and stakeholders are beginning to look back Read more…
July 2, 2020
Supercomputing, big data and artificial intelligence are crucial tools in the fight against the coronavirus pandemic. Around the world, researchers, corporations Read more…
March 14, 2020
Genome editing stands to change the trajectory of human civilization, with massive implications for treatments of any genetic disease and potential for even broader Read more…
February 26, 2020
While not the novel coronavirus that is now sweeping across the world, the 2009 H1N1 flu pandemic (pH1N1) infected up to 21 percent of the global population and Read more…
January 18, 2018
The rich history of collaboration between UC San Diego and AIST in Japan is getting richer. The organizations entered into a five-year memorandum of understanding Read more…
August 21, 2013
San Diego Supercomputer Center (SDSC) is announcing a bold new cloud and analytics-based initiative, called Sherlock. Established by SDSC with the assistance of SD Technology and Chickasaw Nation Industries, the Sherlock-branded project represents an "extensive portfolio of information technology services for healthcare and government." Read more…
Five Recommendations to Optimize Data Pipelines
When building AI systems at scale, managing the flow of data can make or break a business. The various stages of the AI data pipeline pose unique challenges that can disrupt or misdirect the flow of data, ultimately impacting the effectiveness of AI storage and systems.
With so many applications and diverse requirements for data types, management systems, workloads, and compliance regulations, these challenges are only amplified. Without a clear, continuous flow of data throughout the AI data lifecycle, AI models can perform poorly or even behave dangerously.
To ensure your AI systems are optimized, follow these five essential steps to eliminate bottlenecks and maximize efficiency.
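As a rough illustration of what a staged pipeline with per-stage validation can look like, here is a minimal Python sketch. It is not taken from the whitepaper, and every name in it (Record, ingest, validate, transform, run_pipeline) is hypothetical; it simply shows how chaining discrete stages lets malformed records be caught at a stage boundary instead of silently degrading a downstream model.

```python
# Minimal sketch of a staged data pipeline with per-stage validation.
# All names are illustrative; the whitepaper's actual five
# recommendations are not reproduced in this excerpt.
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator

@dataclass
class Record:
    id: int
    text: str

def ingest(raw_rows: Iterable[dict]) -> Iterator[Record]:
    """Stage 1: pull raw rows into typed records."""
    for row in raw_rows:
        yield Record(id=int(row["id"]), text=str(row.get("text", "")))

def validate(records: Iterable[Record]) -> Iterator[Record]:
    """Stage 2: drop records that would poison downstream training."""
    for rec in records:
        if rec.text.strip():  # reject empty or whitespace-only text
            yield rec

def transform(records: Iterable[Record]) -> Iterator[Record]:
    """Stage 3: normalize text before it reaches storage or a model."""
    for rec in records:
        yield Record(id=rec.id, text=rec.text.strip().lower())

def run_pipeline(raw_rows: Iterable[dict],
                 stages: list[Callable]) -> list[Record]:
    """Chain the stages lazily so each record streams through once."""
    stream: Iterable = raw_rows
    for stage in stages:
        stream = stage(stream)
    return list(stream)

if __name__ == "__main__":
    rows = [{"id": 1, "text": "  Hello HPC  "},
            {"id": 2, "text": "   "},  # caught by validate()
            {"id": 3, "text": "AI pipelines"}]
    clean = run_pipeline(rows, [ingest, validate, transform])
    print(clean)  # -> records 1 and 3, normalized
```

Because each stage is a generator, records stream through once and memory stays flat regardless of dataset size; the same validate-at-every-boundary structure carries over to distributed frameworks in production.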
Karlsruhe Institute of Technology (KIT) is an elite public research university located in Karlsruhe, Germany, engaged in a broad range of disciplines across the natural sciences, engineering, economics, humanities, and social sciences. For institutions like KIT, HPC has become indispensable to cutting-edge research in these areas.
KIT’s HoreKa supercomputer supports hundreds of research initiatives, including a project aimed at predicting when the Earth’s ozone layer will be fully healed. With HoreKa, projects like these can process larger amounts of data, enabling researchers to deepen their understanding of highly complex natural processes.
Read this case study to learn how KIT implemented its supercomputer, powered by Lenovo ThinkSystem servers featuring Lenovo Neptune™ liquid cooling technology, to attain higher performance while reducing power consumption.