July 22, 2024
Getting intact qubits from here to there is the basic challenge for any quantum internet scheme. Now, scientists from the University of Chicago, Stanford University …
May 3, 2022
Control of quantum computers has always required fast, precise coordination between a traditional computer and the quantum computer. Mostly, these are custom systems …
January 15, 2021
Over the course of the last year, many detailed computational models of SARS-CoV-2 have been produced with the help of supercomputers, but those models …
March 27, 2018
The lithium-ion battery, which essentially transformed portable electronics, has been a tough act to follow. Last week, researchers from Argonne National Laboratory …
September 19, 2017
The National Science Foundation has awarded a second-phase, $10 million grant to the Chameleon cloud computing testbed project led by the University of Chicago with …
May 5, 2016
Chameleon, the NSF-funded cloud testbed co-located at the University of Chicago and the Texas Advanced Computing Center, has been operating for less than one year …
October 21, 2014
Grid computing pioneer and big data visionary Charlie Catlett recently delivered a presentation on “Big Data and the Future of Cities” as part of the Argonne …
August 21, 2014
The National Science Foundation (NSF) announced funding for two cloud testbeds, named “Chameleon” and “CloudLab.” A total award of $20 million to be split evenly …
As federal agencies navigate an increasingly complex, data-driven world, getting the most out of high-performance computing (HPC), artificial intelligence (AI), and machine learning (ML) is imperative to their missions. These technologies can significantly improve efficiency and effectiveness and drive innovation that better serves citizens' needs. Implementing HPC and AI solutions in government also brings challenges: fragmented datasets, the computational hurdles of training ML models, and the ethical implications of AI-driven decision-making. To address them, CTG Federal, Dell Technologies, and NVIDIA have partnered to integrate HPC capabilities seamlessly into existing enterprise architectures. This integration helps organizations extract actionable insights, improve decision-making, and gain a competitive edge across domains ranging from supply chain optimization to financial modeling and beyond.
Data centers face rising power consumption, space constraints, and cooling demands driven by the unprecedented computing power of today’s chips and servers. HVAC cooling systems consume approximately 40% of a data center’s electricity. These systems traditionally rely on air conditioning, air handling, and fans to cool the facility and IT equipment, resulting in high energy consumption and high carbon emissions. Data centers are therefore moving to direct liquid cooling (DLC) systems to improve cooling efficiency, lowering their power usage effectiveness (PUE), operating expenses (OPEX), and carbon footprint (see the illustrative calculation below).
This paper describes how CoolIT Systems (CoolIT) meets the need for improved energy efficiency in data centers and includes case studies showing how CoolIT’s DLC solutions improve energy efficiency, increase rack density, lower OPEX, and enable sustainability programs. CoolIT is the global market and innovation leader in scalable DLC solutions for the world’s most demanding computing environments. CoolIT’s end-to-end solutions meet the rising demands for both cooling capacity and energy efficiency.
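To make the PUE relationship concrete, here is a minimal, purely illustrative sketch. PUE is total facility power divided by IT equipment power; the ~40% cooling share comes from the paragraph above, while the IT load, facility overhead, and the assumed 70% reduction in cooling energy from DLC are hypothetical placeholders, not figures from the whitepaper or from CoolIT.

```python
# Illustrative PUE comparison with assumed numbers (not vendor data).
# PUE = total facility power / IT equipment power.

def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT power."""
    return (it_kw + cooling_kw + other_kw) / it_kw

# Hypothetical air-cooled facility where cooling is ~40% of total draw,
# consistent with the figure cited above (all values chosen for illustration).
it_kw = 1000.0          # IT equipment load
other_kw = 80.0         # lighting, power distribution losses, etc. (assumed)
air_cooling_kw = 720.0  # ~40% of the resulting ~1800 kW total facility draw

# Hypothetical DLC retrofit: assume liquid cooling cuts cooling energy by ~70%
# (an assumed ratio used only to show how PUE responds).
dlc_cooling_kw = air_cooling_kw * 0.3

print(f"Air-cooled PUE: {pue(it_kw, air_cooling_kw, other_kw):.2f}")  # ~1.80
print(f"DLC PUE:        {pue(it_kw, dlc_cooling_kw, other_kw):.2f}")  # ~1.30
```

Under these assumed inputs, cutting cooling power is the dominant lever on PUE because the IT load in the denominator stays fixed while the facility total in the numerator shrinks.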