HPC + AI Wall Street to Feature ‘Spooky’ Science for Financial Services

September 18, 2022

Albert Einstein famously described quantum mechanics as "spooky action at a distance" due to the non-intuitive nature of superposition and quantum entangled particles. Read more…

Transitioning from Big Data to Discovery: Data Management as a Keystone Analytics Strategy

April 9, 2018

The past 10-15 years have seen a sharp rise in the density, size, and diversity of scientific data generated across every scientific discipline. Key among the sciences has been the explosion of laboratory technologies that generate large amounts of data in life-sciences and healthcare research. Much of that data is now stored in very large storage namespaces, with little to no organization and a general unease about how to approach analyzing it. Read more…

Profile of a Data Science Pioneer

June 28, 2016

As he approaches retirement, Reagan Moore reflects on SRB, iRODS, and the ongoing challenge of helping scientists manage their data. In 1994, Moore managed the production computing systems at the San Diego Supercomputer Center (SDSC), a job that entailed running and maintaining huge Cray computing systems as well as networking, archival storage, security, job scheduling, and visualization systems. At the time, research was evolving from analyses done by individuals on single computers into a collaborative activity using distributed, interconnected, and heterogeneous resources. Read more…

Matching the Use Case to Architecture is Critical in Life Science

October 21, 2015

Getting useful information from life sciences laboratory data in a timely manner requires selecting a suitable architecture that brings together complementary… Read more…

Exploring Large Data for Scientific Discovery

August 27, 2015

A curse of dealing with mounds of data so massive that they require special tools, said computer scientist Valerio Pascucci, is that if you look for something, you will probably find it, thus injecting bias into the analysis. Read more…

Data Management in Times of Disaster

April 4, 2014

When natural disaster strikes – be it a flood, an earthquake or a tsunami – every second counts. Just as emergency teams must be ready to go at a moment's notice… Read more…

Blue Waters Supercomputer to Use Grid-Based File Service

January 25, 2012

NCSA chooses Globus Online as big data mover. Read more…


Whitepaper

HPC in a Global Weather Environment

Between 2012 and 2022, CoreHive Computing collaborated with IBM to upgrade the National Oceanic and Atmospheric Administration's (NOAA) Weather and Climate Operational Supercomputing System (WCOSS). One of the most powerful high-performance computing (HPC) systems in the world, WCOSS plays a vital role in providing forecasts, watches, and warnings, and in sharing data for public and international use.

The upgraded system seamlessly integrated IBM and Cray supercomputing systems using IBM Spectrum Scale, delivering a computational speed of 8.4 petaflops. The new system enables NOAA to process larger data volumes and generate higher-resolution weather models, resulting in more precise forecasts and enhanced support services for communities worldwide. By meeting the stringent performance requirements of the WCOSS contract, CoreHive Computing demonstrated its expertise in delivering and supporting HPC systems.

Download Now

Sponsored by CoreHive
