September 18, 2022
Albert Einstein famously described quantum entanglement as "spooky action at a distance" due to the non-intuitive nature of superposition and entangled particles. Read more…
April 9, 2018
The past 10-15 years have seen a sharp rise in the volume, diversity, and density of scientific data being generated in every scientific discipline. Chief among the drivers has been the explosion of laboratory technologies that generate large amounts of data in life-sciences and healthcare research. Much of this data now sits in very large storage namespaces with little to no organization, accompanied by a general unease about how to approach analyzing it. Read more…
June 28, 2016
As he approaches retirement, Reagan Moore reflects on SRB, iRODS, and the ongoing challenge of helping scientists manage their data. In 1994, Reagan Moore managed the production computing systems at the San Diego Supercomputer Center (SDSC), a job that entailed running and maintaining huge Cray systems as well as networking, archival storage, security, job scheduling, and visualization systems. At the time, research was evolving from analyses done by individuals on single computers into a collaborative activity using distributed, interconnected, and heterogeneous resources. Read more…
October 21, 2015
Getting useful information from life sciences laboratory data in a timely manner requires selecting a suitable architecture that brings together complementary c Read more…
August 27, 2015
A curse of dealing with mounds of data so massive that they require special tools, said computer scientist Valerio Pascucci, is that if you look for something, you will probably find it, thereby injecting bias into the analysis. Read more…
April 4, 2014
When natural disaster strikes – be it a flood, an earthquake or a tsunami – every second counts. Just as emergency teams must be ready to go at a moment's Read more…
January 25, 2012
NCSA chooses Globus Online as big data mover. Read more…
Between 2012 and 2022, CoreHive Computing collaborated with IBM to upgrade the National Oceanic and Atmospheric Administration's (NOAA) Weather and Climate Operational Supercomputing System (WCOSS). One of the most powerful high-performance computing (HPC) systems in the world, WCOSS plays a vital role in producing forecasts, watches, and warnings, and in sharing data for public and international use.
The upgraded system integrated IBM and Cray supercomputers under IBM Spectrum Scale, delivering a combined computational speed of 8.4 petaflops. The added capacity enables NOAA to process larger data volumes and generate higher-resolution weather models, yielding more precise forecasts and enhanced support services for communities worldwide. By meeting the stringent performance requirements of the WCOSS contract, CoreHive Computing demonstrated its expertise in delivering and supporting HPC systems.