Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

Tag: data-intensive

Rensselaer Orders Up Blue Gene/Q for Exascale and Data-Intensive Research

Oct 25, 2011

Last month Rensselaer Polytechnic Institute announced it had been awarded a $2.65 million grant to acquire a 100-teraflop Blue Gene/Q supercomputer for its Computational Center for Nanotechnology Innovations. The new system will also include a multi-terabyte RAM-based storage accelerator, petascale disk storage, and a rendering cluster plus a remote display wall system for visualization.

Big Data Rains Down on Seattle

Oct 20, 2011

At SC11 in Seattle, the stage is set for data-intensive computing to steal the show. This year’s theme aligns directly with the “big data” trend that is reshaping enterprise and scientific computing. We give an insider’s view of some of the top sessions for the big data crowd and a broader sense of how this year’s conference is shaping up overall.

HPC Players Embrace Hadoop

Oct 19, 2011

SGI, Microsoft warm up their Hadoop offerings.

Convey Bends to Inflection Point

Oct 7, 2011

Convey recently noted that HPC is “no longer just numerically intensive, it’s now data-intensive—with more and different demands on HPC system architectures.” They claim that the “whole new HPC” gathered under the banner of data-intensive computing possesses a number of unique characteristics, and they see unique opportunities in all of that data and in new memory and co-processor architectures.

With Windows Support, SGI Casts Altix UV in New Light

Apr 3, 2011

SGI has been getting a lot of mileage out of its SGI UV shared memory platform, having delivered close to 500 systems since it started shipping them in June 2010. Now, with the recent addition of support for Microsoft’s Windows Server, the company is looking to expand its customer base in a big way.

Cray Pushes XMT Supercomputer Into the Limelight

Jan 26, 2011

When announced in 2006, the Cray XMT supercomputer attracted little attention. The machine was originally targeted for high-end data mining and analysis for a particular set of government clients in the intelligence community. While the feds have given the XMT support over the past five years, Cray is now looking to move these machines into the commercial sphere. And with the next generation XMT-2 on the horizon, the company is gearing up to accelerate that strategy in 2011.

Graph 500 Takes Aim at a New Kind of HPC

Nov 15, 2010

Data-intensive applications are quickly emerging as a significant new class of HPC workloads. This class of applications will require a new kind of supercomputer, and a different way to assess such systems. That is the impetus behind the Graph 500, a set of benchmarks that aim to measure the suitability of systems for data-intensive analytics applications.
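
The Graph 500’s core kernel is a breadth-first search over a large synthetic graph, with performance reported in traversed edges per second (TEPS) rather than flops. The sketch below is a minimal single-node Python illustration of that metric, with a simple random graph and a serial BFS standing in for the benchmark’s actual Kronecker generator and distributed reference implementation; all names here are illustrative.

```python
import random
import time
from collections import deque

def random_graph(n, avg_degree):
    # Toy stand-in for Graph 500's Kronecker graph generator.
    adj = [[] for _ in range(n)]
    for _ in range(n * avg_degree // 2):
        u, v = random.randrange(n), random.randrange(n)
        adj[u].append(v)
        adj[v].append(u)
    return adj

def bfs_teps(adj, root):
    # Breadth-first search from root; returns the parent array and the
    # traversed-edges-per-second rate for this run.
    parent = [-1] * len(adj)
    parent[root] = root
    frontier = deque([root])
    edges = 0
    start = time.perf_counter()
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:
            edges += 1
            if parent[v] == -1:
                parent[v] = u
                frontier.append(v)
    elapsed = time.perf_counter() - start
    return parent, edges / elapsed

if __name__ == "__main__":
    g = random_graph(1 << 14, 16)
    _, teps = bfs_teps(g, 0)
    print(f"~{teps:,.0f} traversed edges per second")
```

The point of the metric is that performance is bound by irregular memory accesses while chasing graph edges, not by floating-point throughput.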

SDSC Puts Data at Center Stage

Sep 7, 2010

The naming of Michael Norman as director of the San Diego Supercomputer Center (SDSC) last week was long overdue. SDSC has been without an official director for more than 14 months, with Norman filling the spot as the interim head since last July. The appointment could mark something of a comeback for the center, which has not only gone director-less during this time, but has been operating without a high-end supercomputer as well.

TeraGrid 2010 Keynote: The Physics of Black Holes with Cactus

Aug 11, 2010

TeraGrid ’10, the fourth annual conference of the TeraGrid, took place last week in Pittsburgh, Pa. HPCwire will be running a series of articles highlighting the conference. The first in the series covers Gabrielle Allen’s keynote talk on Cactus, an open, collaborative software framework for numerical relativity.

Back to the Future: Solid-State Storage in Cloud Computing

May 28, 2010

Solid-state devices based on flash and PCIe are emerging as a new class of enterprise storage option: Tier-0. Tier-0 is a storage tier optimized specifically for high-performance workloads, which benefit the most from flash memory.
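
To make the tiering idea concrete, here is a toy Python sketch of a promote-on-access policy, in which blocks that are read repeatedly get copied into a flash-backed Tier-0 cache while cold blocks stay on disk; the class, threshold, and block names are illustrative and do not reflect any particular vendor’s API.

```python
from collections import defaultdict

class TieredStore:
    # Toy two-tier store: hot blocks are promoted to a flash-backed Tier-0
    # cache, cold blocks remain in (simulated) slower disk storage.
    def __init__(self, promote_after=3):
        self.disk = {}                    # block_id -> data (capacity tier)
        self.tier0 = {}                   # block_id -> data (flash tier)
        self.reads = defaultdict(int)     # per-block access counts
        self.promote_after = promote_after

    def write(self, block_id, data):
        self.disk[block_id] = data

    def read(self, block_id):
        self.reads[block_id] += 1
        if block_id in self.tier0:
            return self.tier0[block_id]   # fast path: served from flash
        data = self.disk[block_id]
        if self.reads[block_id] >= self.promote_after:
            self.tier0[block_id] = data   # promote a hot block to Tier-0
        return data

store = TieredStore()
store.write("blk42", b"frequently read data")
for _ in range(4):
    store.read("blk42")
print("blk42 promoted to Tier-0:", "blk42" in store.tier0)
```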