Tag: big data

TACC Director Lays Out Details of 2nd-Gen Stampede System

Jun 2, 2016 |

With a $30 million award from the National Science Foundation announced today, the Texas Advanced Computing Center (TACC) at The University of Texas at Austin (UT Austin) will stand up a second-generation Stampede system based on Dell PowerEdge servers equipped with Intel “Knights Landing” processors, next-generation Xeon chips and future 3D XPoint memory.

HPC and Big Data Convergence Takes Life at PSC’s Bridges

Jun 1, 2016 |

Roughly three months into early operations, the Bridges computing resource being deployed at the Pittsburgh Supercomputing Center is bearing fruit. Designed to accommodate both traditional HPC and big data analytics, Bridges had supported 245 projects as of May 26. This ramping up of the $9.6 million NSF-funded Bridges project is an important step in delivering practical …

Cray Bakes Big Data Software Framework Into Urika-GX Analytics Platform

May 24, 2016 |

Cray continued its courtship of the advanced scale enterprise market with today’s launch of the Urika-GX, a system that integrates Cray supercomputing technologies with an agile big data platform designed to run multiple analytics workloads concurrently. While Cray has pre-installed analytics software in previous systems, the new system takes pre-configuration to a new level with an open software framework designed to eliminate installation, integration and update headaches that stymie big data implementations.

TACC Wrangler Supports Zika Hackathon

May 19, 2016 |

By now most people have heard of Zika, the mosquito-borne disease that can cause fever and birth defects and that threatens to spread to the United States from Latin America. Earlier this week more than 50 data scientists, engineers, and University of Texas students gathered for the “Austin Zika Hackathon” at big data …

Nielsen and Intel Migrate HPC Efficiency and Data Analytics to Big Data

May 16, 2016 |

Nielsen has collaborated with Intel to migrate important pieces of HPC technology into Nielsen’s big-data analytic workflows, including MPI, mature numerical libraries from NAG (the Numerical Algorithms Group), and custom C++ analytic codes. This complementary hybrid approach integrates the benefits of Hadoop data management and workflow scheduling with an extensive pool of HPC tools and C/C++ capabilities for analytic applications. In particular, the use of MPI reduces latency, permits reuse of the Hadoop servers, and co-locates the MPI applications close to the data.

Wrangler Supercomputer Speeds Through Big Data

Mar 11, 2016 |

Like the old Western cowboys who tamed wild horses, Wrangler tames beasts of big data, such as computing problems that involve analyzing thousands of files that need to be quickly opened, examined and cross-correlated.

Making Sense of HPC in the Age of Democratization

Mar 8, 2016 |

These are exciting times for HPC. High-performance computing and its cousin high-productivity computing are expanding such that the previous definitions of HPC as a moving performance target or as the purview of modeling and simulation are breaking down. The democratization of HPC has spurred a lot of focus on the impact that HPC-hatched technologies are …

Mapping Out a New Role for Cognitive Computing in Science

Feb 25, 2016 |

The avalanche of data being produced by experimental instruments and sensors isn’t just a big (data) problem – it’s perhaps the biggest bottleneck in science today. What’s needed, say the distinguished authors of a new white paper from the Computing Community Consortium, is a community-wide agenda to develop cognitive computing tools that can perform scientific …

Create Mixed HPC/Big Data Clusters Today Says Bright Computing

Feb 3, 2016 |

The latest version of convergence – blending traditional HPC and big data computing into a ‘single’ environment – dominates much of the conversation in HPC today, including within HPCwire. Elegant unified solutions, certainly at the very high end, are still in the making. That said, Bright Computing is in the thick of efforts to …

Toward a Converged Exascale-Big Data Software Stack

Jan 28, 2016 |

Within the HPC vendor and science community, the groundswell of support for HPC and big data convergence is undeniable, with sentiments running the gamut from the pragmatic to the enthusiastic. For Argonne computer scientist and HPC veteran Pete Beckman, the writing is on the wall. As the leader of the Argo exascale software project and …