Tag: data-intensive

NSF Forges Further Beyond FLOPs

May 22, 2013 |

In a recent solicitation, the NSF laid out its needs for advancing scientific and engineering infrastructure with new tools that go beyond peak performance. Having already delivered systems like Stampede and Blue Waters, the agency is turning an eye to data-intensive challenges. We spoke with the NSF's Irene Qualters and Barry Schneider about…

FLOPS Fall Flat for Intelligence Agency

Mar 27, 2013 |

The Intelligence Advanced Research Projects Activity (IARPA) is putting out some RFI feelers in hopes of pushing new boundaries with an HPC program. However, at the core of its evaluation process is an overt dismissal of traditional benchmarks, including floating-point operations per second (FLOPS).

SGI Supercomputer Takes Twitter’s Pulse

Nov 22, 2012 |

The UV 2 system can create heat maps of tweets during hurricanes and elections.

Convey Cooks Personality into New MX Line

Nov 21, 2012 |

Last week at SC12 in Salt Lake City, Convey pulled the lid off its MX big-data architecture, aimed at the graph analytics problems that were at the heart of the show's unmistakable data-intensive computing thrust this year. The new MX line is designed to exploit massive degrees of parallelism while efficiently handling hard-to-partition big data applications.

Big Data Is HPC – Let’s Embrace It

Oct 25, 2012 |

Big data is all the rage these days. It is the subject of a recent Presidential Initiative, has its own news portal, and, in the guise of Watson, is a game show celebrity. Big data has also caused concern in some circles that it might sap interest and funding from the exascale computing initiative. So, is big data distinct from HPC – or is it just a new aspect of our evolving world of high-performance computing?

ESnet Launches Architecture to Help Researchers Deliver on Data-Intensive Science

Apr 26, 2012 |

To help research institutions capitalize on the growing availability of high-bandwidth networks for managing ever-larger data sets, the DOE's Energy Sciences Network, known as ESnet, is working with the scientific community to encourage the use of a network design model called the "Science DMZ." Leading the development of this effort is Eli Dart, a network engineer with previous experience at Sandia National Laboratories and the National Energy Research Scientific Computing Center. In this interview, Dart talks about the nature of the project and explains how such an architecture can help researchers.

Convey Cranks Up Performance with Latest FPGA-Accelerated Computer

Apr 24, 2012 |

Convey Computer has launched its newest x86-FPGA "hybrid-core" server. Dubbed the HC-2, it represents the first major upgrade of the system since the company introduced the HC-1 product back in 2008. The new offering promises much better performance in a price range similar to that of the original system.

Flash Forward: SDSC Launches Data-Intensive Supercomputer

Dec 6, 2011 |

Gordon, the largest flash memory-based computer on the planet, was officially launched at a ceremony held Monday at the San Diego Supercomputer Center (SDSC). Two years in the making, and backed by a $20 million Track 2 grant from the National Science Foundation (NSF), Gordon is the first large-scale supercomputer purpose-built for data-intensive applications.

IU Team Sets Data Transfer Record at SC11

Nov 22, 2011 |

Indiana University's SCinet Research Sandbox entry sets new records and renews the promise of cloud for data-intensive science workloads.

Climate Workshop Will Explore Data-Intensive Research Methods

Oct 31, 2011 |

One of the most promising use cases for "Big Data" is to help advance climate research. At SC11, Reinhard Budich (Max Planck Institute for Meteorology), John Feo (Pacific Northwest National Laboratory), Tobias Weigel (DKRZ), and Per Nyberg (Cray) will co-host the second Climate Knowledge Discovery (CKD) workshop to explore new data-intensive methods. HPCwire talked about this with Budich and Feo.
