Tag: big data

Cray Bakes Big Data Software Framework Into Urika-GX Analytics Platform

May 24, 2016

Cray continued its courtship of the advanced scale enterprise market with today’s launch of the Urika-GX, a system that integrates Cray supercomputing technologies with an agile big data platform designed to run multiple analytics workloads concurrently. While Cray has pre-installed analytics software in previous systems, the new system takes pre-configuration to a new level with an open software framework designed to eliminate installation, integration and update headaches that stymie big data implementations.

TACC Wrangler Supports Zika Hackathon

May 19, 2016

By now most people have heard of Zika, the mosquito-borne disease that can cause fever and birth defects and that threatens to spread to the United States from Latin America. Earlier this week, more than 50 data scientists, engineers, and University of Texas students gathered for the “Austin Zika Hackathon” at big data Read more…

Nielsen and Intel Migrate HPC Efficiency and Data Analytics to Big Data

May 16, 2016

Nielsen has collaborated with Intel to migrate important pieces of HPC technology into Nielsen’s big-data analytic workflows, including MPI, mature numerical libraries from NAG (the Numerical Algorithms Group), and custom C++ analytic codes. This hybrid approach combines the benefits of Hadoop data management and workflow scheduling with an extensive pool of HPC tools and C/C++ capabilities for analytic applications. In particular, the use of MPI reduces latency, permits reuse of the Hadoop servers, and co-locates the MPI applications with the data.
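
To make the hybrid pattern concrete, here is a minimal C++/MPI sketch, not Nielsen’s actual code (the data values and partition sizes are invented): each rank aggregates its local partition, and a single MPI_Reduce collective produces the global result, the kind of low-latency reduction that replaces a costly MapReduce shuffle.

    // Minimal sketch (illustrative only, not Nielsen's code): each MPI rank
    // holds a local data partition; one collective computes the global mean.
    // In the hybrid setup the article describes, each rank would instead read
    // the Hadoop-managed block stored on its own node (data locality).
    #include <mpi.h>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    int main(int argc, char** argv) {
        MPI_Init(&argc, &argv);

        int rank = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);

        // Stand-in for a locally stored partition (values are made up).
        std::vector<double> partition(1000, static_cast<double>(rank + 1));

        double local_sum = std::accumulate(partition.begin(), partition.end(), 0.0);
        long long local_count = static_cast<long long>(partition.size());

        // One low-latency collective instead of a MapReduce shuffle phase.
        double global_sum = 0.0;
        long long global_count = 0;
        MPI_Reduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
        MPI_Reduce(&local_count, &global_count, 1, MPI_LONG_LONG, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            std::printf("global mean = %f over %lld values\n",
                        global_sum / global_count, global_count);

        MPI_Finalize();
        return 0;
    }

Compiled with an MPI wrapper (e.g., mpic++ mean.cpp -o mean) and launched with mpirun -np 4 ./mean, the aggregation finishes in a single in-memory collective call rather than a disk-backed shuffle, which is where the latency savings come from.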

Wrangler Supercomputer Speeds Through Big Data

Mar 11, 2016

Like the old Western cowboys who tamed wild horses, Wrangler tames the beasts of big data: computing problems that involve analyzing thousands of files that must be quickly opened, examined, and cross-correlated.
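
For a concrete sense of that access pattern, the short C++17 sketch below (purely illustrative, not TACC code; the directory argument and the “first numeric field” rule are invented) opens every file in a directory, examines one value from each, and then computes a statistic across all of them: the metadata-heavy open/examine/cross-correlate loop that flash-backed systems like Wrangler are built to accelerate.

    // Illustrative sketch only: scan a directory of many small files, pull the
    // first numeric field from each, and summarize across files. Every
    // iteration pays an open/read/close, which is what makes this workload
    // I/O- and metadata-bound. Compile with g++ -std=c++17 (older GCC may
    // also need -lstdc++fs).
    #include <filesystem>
    #include <fstream>
    #include <iostream>
    #include <vector>

    int main(int argc, char** argv) {
        const std::filesystem::path dir = (argc > 1) ? argv[1] : ".";
        std::vector<double> values;

        for (const auto& entry : std::filesystem::directory_iterator(dir)) {
            if (!entry.is_regular_file()) continue;   // skip subdirectories etc.
            std::ifstream in(entry.path());           // one open() per file
            double v;
            if (in >> v) values.push_back(v);         // examine first numeric field
        }

        // "Cross-correlate" stands in here for any cross-file statistic.
        double sum = 0.0;
        for (double v : values) sum += v;
        std::cout << values.size() << " files scanned, mean first value = "
                  << (values.empty() ? 0.0 : sum / values.size()) << "\n";
        return 0;
    }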

Making Sense of HPC in the Age of Democratization

Mar 8, 2016

These are exciting times for HPC. High-performance computing and its cousin high-productivity computing are expanding such that the previous definitions of HPC as a moving performance target or as the purview of modeling and simulation are breaking down. The democratization of HPC has spurred a lot of focus on the impact that HPC-hatched technologies are Read more…

Mapping Out a New Role for Cognitive Computing in Science

Feb 25, 2016

The avalanche of data being produced by experimental instruments and sensors isn’t just a big (data) problem – it’s perhaps the biggest bottleneck in science today. What’s needed, say the distinguished authors of a new white paper from the Computing Community Consortium, is a community-wide agenda to develop cognitive computing tools that can perform scientific Read more…

Create Mixed HPC/Big Data Clusters Today Says Bright Computing

Feb 3, 2016

The latest version of convergence – blending traditional HPC and big data computing into a ‘single’ environment – dominates much of the conversation in HPC today, including within HPCwire. Elegant unified solutions, certainly at the very high end, are still in the making. That said, Bright Computing is in the thick of efforts to Read more…

Toward a Converged Exascale-Big Data Software Stack

Jan 28, 2016

Within the HPC vendor and science community, the groundswell of support for HPC and big data convergence is undeniable, with sentiments running the gamut from the pragmatic to the enthusiastic. For Argonne computer scientist and HPC veteran Pete Beckman, the writing is on the wall. As the leader of the Argo exascale software project and Read more…

SGI’s HPC GM Gabriel Broner on 2016 Trends

Jan 25, 2016

Roughly six months ago, SGI reorganized into three business units – traditional HPC; high-performance data analytics (HPDA); and SGI services – each with a general manager at its head. Gabriel Broner, then vice president of software innovation, became general manager for the HPC unit. Broner is a familiar voice in the HPC community. He Read more…

Argonne Paves the Way for Future Systems

Jan 14, 2016

Last April, the third and final piece of the CORAL acquisition program clicked into place when the U.S. Department of Energy signed a $200 million supercomputing contract with Intel to supply Argonne National Laboratory with two next-generation Cray supercomputers: an 8.5-petaflop “Theta” system based on Knights Landing (KNL) and a much larger 180-petaflop “Aurora” supercomputer. The staff Read more…