Tag: data analytics

HPE Gobbles SGI for Larger Slice of $11B HPC Pie

Aug 11, 2016

Hewlett Packard Enterprise (HPE) announced today that it will acquire rival HPC server maker SGI for $7.75 per share, or about $275 million, inclusive of cash and debt. The deal ends the seven-year reprieve that kept the SGI banner flying after Rackable Systems purchased the bankrupt Silicon Graphics Inc. for $25 million in 2009 and assumed the SGI brand. Bringing SGI into its fold bolsters HPE’s high-performance computing and data analytics capabilities and expands its position…

Intel Xeon E7 Balloons In-memory Capacity, Targets Real-Time Analytics

Jun 8, 2016

Who crunches more data faster, wins. That drive clarifies the essence of the evolutionary spirit in the computer industry: the dual desire to reach real time with bigger and bigger chunks of data. The locomotive: HPC technologies adapted to enterprise mission-critical data analytics. With its memory capacity of up…

Nielsen and Intel Migrate HPC Efficiency and Data Analytics to Big Data

May 16, 2016

Nielsen has collaborated with Intel to migrate important pieces of HPC technology into Nielsen’s big data analytic workflows, including MPI, mature numerical libraries from NAG (the Numerical Algorithms Group), and custom C++ analytic codes. This hybrid approach integrates the benefits of Hadoop data management and workflow scheduling with an extensive pool of HPC tools and C/C++ capabilities for analytic applications. In particular, the use of MPI reduces latency, permits reuse of the Hadoop servers, and co-locates the MPI applications close to the data.

Making Sense of HPC in the Age of Democratization

Mar 8, 2016

These are exciting times for HPC. High-performance computing and its cousin high-productivity computing are expanding such that the previous definitions of HPC as a moving performance target or as the purview of modeling and simulation are breaking down. The democratization of HPC has spurred a lot of focus on the impact that HPC-hatched technologies are…

HPC for Advanced Analytics at the USPS

May 5, 2015

Today, the United States Postal Service is on its third generation of supercomputers, with each generation more capable than its predecessor. IDC believes the USPS embrace of HPC exemplifies an important, accelerating IT trend: Leading organizations in the private and public sectors are increasingly turning to high-performance computing to tackle challenging big data analytics workloads…

Army Research Lab Lays Out HPC Strategy

Feb 19, 2015

The US Army Research Laboratory (ARL) is counting on supercomputing and large-scale analytics to provide the competitive edge it needs to maintain its position as the nation’s premier laboratory for land forces. As laid out in the recently released Technical Implementation Plan for 2015–2019, the ARL sees advanced computing as fundamental to its mission to “discover…”

Cray Advances Hadoop for HPC

Feb 4, 2014

In a recent blog entry, Mike Boros, Hadoop Product Marketing Manager at Cray, Inc., writes about the company’s positioning of Hadoop for scientific big data. Invoking the old adage, “when the only tool you have is a hammer, every problem begins to resemble a nail,” Boros suggests that the Law of the Instrument may be true…

New OpenSFS Rep on the Future of Lustre

Mar 12, 2013

OpenSFS has chosen its Community Representative Director for 2013: Tommy Minyard, director of Advanced Computing Systems (ACS) at the Texas Advanced Computing Center (TACC). We got the new director’s views on Lustre’s opportunities in big data and exascale, maintaining a single source tree, and new features on the horizon.

Penguin Joins Microserver ARMs Race

Oct 18, 2012

Penguin Computing has launched its first ARM-based server platform. Known as the UDX1, the Penguin box is based on Calxeda’s latest ARM server chip and is aimed at cloud computing, Web hosting, and, especially, data analytics — UD stands for Ultimate Data. The move puts Penguin into the front ranks of computer makers testing the waters of the burgeoning microserver market.

Wyoming Plays Host to Top 10 Super

Jun 11, 2012

Petascale supercomputing is coming to one of the least populated states in the US.