Tag: big data

SGI ERI Installation Illustrates Multi-use HPC

May 20, 2015 |

Traditional research and Big Data apps are increasingly run on the same HPC system as lines between their computational requirements blur and demand for dual-use capability grows, said Bob Braham, SGI CMO. He pointed to last month’s ramp up of the latest SGI supercomputer at the Earthquake and Volcano Information Center at the University of Tokyo’s Earthquake Read more…

HPC for Advanced Analytics at the USPS

May 5, 2015 |

Today, the United States Postal Service is on its third generation of supercomputers, with each generation more capable than its predecessor. IDC believes the USPS embrace of HPC exemplifies an important, accelerating IT trend: Leading organizations in the private and public sectors are increasingly turning to high-performance computing to tackle challenging big data analytics workloads Read more…

Why TACC’s New Data ‘Wrangler’ Is a Big Deal

Apr 30, 2015 |

While there’s been a lot of activity around the coming crop of “exascale-relevant” supercomputers, the HPC landscape is also shifting to become more data-aware. Perhaps no system reflects this transition better than Wrangler, the I/O-optimized open science system from Dell and EMC that debuted earlier this month at the Texas Advanced Computing Center (TACC). In a presentation Read more…

New Models for Research, Part III – Embracing the Big Data Stack

Mar 30, 2015 |

In the third installment of a four-part series, Jay Etchings, director of operations for research computing and senior HPC architect at Arizona State University, explores the brave new world of open big data alternatives to traditional bare metal HPC+MPI+InfiniBand for research computing.

FPGA-Accelerated Search Heads for Mainstream

Mar 10, 2015 |

With data volumes now outpacing Moore’s Law, there is a move to look beyond conventional hardware and software tools. Accelerators like GPUs and the Intel MIC architecture have extended performance goals for many HPC-class workloads. Although field-programmable gate arrays (FPGAs) have not seen the same level of adoption for traditional HPC workloads, a subset of big data Read more…

New Models for Research Computing, Part I

Mar 9, 2015 |

In the first of a four-part series, Jay Etchings, director of operations for research computing and senior HPC architect at Arizona State University, lays out the concept of the next-generation cyberinfrastructure, a set of integrated technology components working together to support the diverse needs of the research community across disciplines and across scale. Where does high-performance computing go from Read more…

Catching Up with Baseball’s Mystery Cray Owner

Mar 3, 2015 |

If you’re a sports fan and an HPC-watcher, then you know that supercomputing has hit the major leagues, Major League Baseball (MLB) that is. Last year, an MLB team purchased a Cray Urika supercomputer with the aim of transforming enormous volumes of disparate data into game-winning intelligence. The technology and implications of this early-adoption trend are interesting, Read more…

Tyrone Brings HPC and Big Data Solutions to India and the World

Mar 2, 2015 |

India’s appetite for high performance computing – ranging from powerful HPC clusters to advanced workstations – continues to grow as an increasing number of companies address opportunities not only within the country, but worldwide as well. Indian enterprises are making their mark in such fields as seismic engineering for oil and gas, aeronautics and Read more…

NERSC Realigns for Enhanced Data Focus

Feb 26, 2015 |

The National Energy Research Scientific Computing Center (NERSC), one of the nation’s primary HPC facilities for scientific research, has implemented several organizational changes, which it says will help its 6,000 users “more productively manage their data-intensive research.” NERSC’s Storage Systems Group will move under the Services Department, in order to foster greater synergy with the Read more…

Army Research Lab Lays Out HPC Strategy

Feb 19, 2015 |

The US Army Research Laboratory (ARL) is counting on supercomputing and large-scale analytics to provide the competitive edge it needs to maintain its position as the nation’s premier laboratory for land forces. As laid out in the recently released Technical Implementation Plan for 2015-2019, the ARL sees advanced computing as fundamental to its mission to “discover, Read more…