Tag: big data

Making Sense of HPC in the Age of Democratization

Mar 8, 2016 |

These are exciting times for HPC. High-performance computing and its cousin, high-productivity computing, are expanding to the point that earlier definitions of HPC, whether as a moving performance target or as the purview of modeling and simulation, are breaking down. The democratization of HPC has drawn considerable attention to the impact that HPC-hatched technologies are Read more…

Mapping Out a New Role for Cognitive Computing in Science

Feb 25, 2016 |

The avalanche of data being produced by experimental instruments and sensors isn’t just a big (data) problem – it’s perhaps the biggest bottleneck in science today. What’s needed, say the distinguished authors of a new white paper from the Computing Community Consortium, is a community-wide agenda to develop cognitive computing tools that can perform scientific Read more…

Create Mixed HPC/Big Data Clusters Today, Says Bright Computing

Feb 3, 2016 |

The latest version of convergence – blending traditional HPC and big data computing into a ‘single’ environment – dominates much of the conversation in HPC today, including within HPCwire. Elegant unified solutions, certainly at the very high end, are still in the making. That said, Bright Computing is in the thick of efforts to Read more…

Toward a Converged Exascale-Big Data Software Stack

Jan 28, 2016 |

Within the HPC vendor and science community, the groundswell of support for HPC and big data convergence is undeniable, with sentiments running the gamut from the pragmatic to the enthusiastic. For Argonne computer scientist and HPC veteran Pete Beckman, the writing is on the wall. As the leader of the Argo exascale software project and Read more…

SGI’s HPC GM Gabriel Broner on 2016 Trends

Jan 25, 2016 |

Roughly six months ago, SGI reorganized into three business units – traditional HPC, high performance data analytics (HPDA), and SGI services – each headed by a general manager. Gabriel Broner, then vice president of software innovation, became general manager of the HPC unit. Broner is a familiar voice in the HPC community. He Read more…

Argonne Paves the Way for Future Systems

Jan 14, 2016 |

Last April, the third and final piece of the CORAL acquisition program clicked into place when the U.S. Department of Energy signed a $200 million supercomputing contract with Intel to supply Argonne National Laboratory with two next-generation Cray supercomputers: an 8.5-petaflop “Theta” system based on Knights Landing (KNL) and a much larger 180-petaflop “Aurora” supercomputer. The staff Read more…

DDN’s Alex Bouzari on Beating Moore’s Law

Jan 13, 2016 |

As 2015 entered its home stretch, DataDirect Networks (DDN) refreshed its high-performance SFA block storage line with the launch of the SFA14K and SFA14KE, formerly codenamed “Wolfcreek.” DDN also took the wraps off its Infinite Memory Engine (IME14K), which leverages solid state and nonvolatile memory technologies to create a data caching tier between processor and Read more…

SC15 Video: IBM’s Dave Turek on Big Data, CORAL, and HPC’s Evolution

Dec 8, 2015 |

Accelerated computing certainly dominated the IBM message at SC15, but there were many sub-themes in Austin. Big data, the beneficial impact of software frameworks (think Apache Spark), workflow optimization, and a growing role for cloud in HPC delivery were all in the mix. HPCwire managing editor John Russell sat down with Dave Turek, IBM VP Read more…

Cray Lays Out Vision for HPC-Big Data Convergence

Dec 3, 2015 |

Messaging around HPC-big data convergence, which had been ramping up all year, reached new heights at SC15, and you’d be hard-pressed to find a bigger champion of the unified platform strategy than American supercomputer maker Cray. HPCwire met with Cray’s Barry Bolding at the show in Austin last month to discuss the company’s latest customer wins, its take on Read more…

Contrary View: CPUs Sometimes Best for Big Data Visualization

Dec 1, 2015 |

Contrary to conventional thinking, GPUs are often not the best vehicles for big data visualization. In this commentary, I discuss several key technical reasons why a CPU-based “Software Defined Visualization” approach can deliver a more flexible, performant, scalable, cost-effective, and power-efficient solution for big data visualization than a conventional GPU-based approach. An example Read more…