Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

Tag: big data

HPE Gobbles SGI for Larger Slice of $11B HPC Pie

Aug 11, 2016 |

Hewlett Packard Enterprise (HPE) announced today that it will acquire rival HPC server maker SGI for $7.75 per share, or about $275 million, net of cash and debt. The deal ends the seven-year reprieve that kept the SGI banner flying after Rackable Systems purchased the bankrupt Silicon Graphics Inc. for $25 million in 2009 and assumed the SGI brand. Bringing SGI into its fold bolsters HPE’s high-performance computing and data analytics capabilities and expands its position…

Obama, NIH Announce Big Data Gathering Push for Precision Medicine

Jul 7, 2016 |

One could be forgiven for experiencing a bit of hopeful skepticism in response to the Obama Administration’s statement in May regarding re-energizing the “War Against Cancer.” The war against cancer is a decades-old effort with mixed results – great progress in many areas matched with disappointment in others. Winning the war Read more…

OLCF Researchers Scale R to Tackle Big Science Data Sets

Jul 6, 2016 |

Sometimes lost in the discussion around big data is the fact that big science has long generated huge data sets. “In fact, large-scale simulations that run on leadership-class supercomputers work at such high speeds and resolution that they generate unprecedented amounts of data. The size of these datasets—ranging from a few gigabytes to hundreds of Read more…

TACC Director Lays Out Details of 2nd-Gen Stampede System

Jun 2, 2016 |

With a $30 million award from the National Science Foundation announced today, the Texas Advanced Computing Center (TACC) at The University of Texas at Austin (UT Austin) will stand up a second-generation Stampede system based on Dell PowerEdge servers equipped with Intel “Knights Landing” processors, next-generation Xeon chips and future 3D XPoint memory.

HPC and Big Data Convergence Takes Life at PSC’s Bridges

Jun 1, 2016 |

Roughly three months into early operations, the Bridges computing resource being deployed at the Pittsburgh Supercomputing Center is bearing fruit. Designed to accommodate traditional HPC and big data analytics, Bridges had supported 245 projects as of May 26. This ramping up of the NSF-funded ($9.6 million) Bridges project is an important step in delivering practical Read more…

Cray Bakes Big Data Software Framework Into Urika-GX Analytics Platform

May 24, 2016 |

Cray continued its courtship of the advanced scale enterprise market with today’s launch of the Urika-GX, a system that integrates Cray supercomputing technologies with an agile big data platform designed to run multiple analytics workloads concurrently. While Cray has pre-installed analytics software in previous systems, the new system takes pre-configuration to a new level with an open software framework designed to eliminate installation, integration and update headaches that stymie big data implementations.

TACC Wrangler Supports Zika Hackathon

May 19, 2016 |

By now most people have heard of Zika, the mosquito-borne disease that can cause fever and birth defects and that threatens to spread to the United States from Latin and South America. Earlier this week more than 50 data scientists, engineers, and University of Texas students gathered for the “Austin Zika Hackathon” at big data Read more…

Nielsen and Intel Migrate HPC Efficiency and Data Analytics to Big Data

May 16, 2016 |

Nielsen has collaborated with Intel to migrate important pieces of HPC technology into Nielsen’s big-data analytic workflows, including MPI, mature numerical libraries from NAG (the Numerical Algorithms Group), and custom C++ analytic codes. This complementary hybrid approach integrates the benefits of Hadoop data management and workflow scheduling with an extensive pool of HPC tools and C/C++ capabilities for analytic applications. In particular, the use of MPI reduces latency, permits reuse of the Hadoop servers, and co-locates the MPI applications close to the data.
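The co-location pattern described above can be sketched in miniature. The snippet below is a hypothetical illustration only – it uses Python’s `multiprocessing` as a stand-in for MPI (Nielsen’s actual stack uses MPI with C++ analytic codes): each worker computes a partial aggregate over its local data shard, and the partial results are reduced at the root, mirroring an MPI reduce over data that lives on the compute nodes.

```python
# Hypothetical sketch of compute/data co-location, using Python's
# multiprocessing in place of MPI. Each worker aggregates only its own
# data shard; the root then reduces the partial results -- the same
# shape as an MPI_Reduce over node-local data.
from multiprocessing import Pool

def local_sum(partition):
    # Stand-in for an analytic kernel running against one node's data.
    return sum(partition)

def reduce_over_partitions(partitions):
    with Pool(processes=len(partitions)) as pool:
        partials = pool.map(local_sum, partitions)  # compute near the data
    return sum(partials)  # root-side reduction of the partial results

if __name__ == "__main__":
    # Four hypothetical "nodes", each holding one shard of the dataset.
    shards = [[1, 2, 3], [4, 5], [6], [7, 8, 9]]
    print(reduce_over_partitions(shards))  # 45
```

The design point is that only small partial aggregates cross process (or, in the real MPI case, network) boundaries; the bulk data never moves, which is what keeps latency low when the MPI ranks are placed on the same Hadoop servers that hold the data.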

Wrangler Supercomputer Speeds Through Big Data

Mar 11, 2016 |

Like the old Western cowboys who tamed wild horses, Wrangler tames beasts of big data, such as computing problems that involve analyzing thousands of files that need to be quickly opened, examined and cross-correlated.

Making Sense of HPC in the Age of Democratization

Mar 8, 2016 |

These are exciting times for HPC. High-performance computing and its cousin high-productivity computing are expanding such that the previous definitions of HPC as a moving performance target or as the purview of modeling and simulation are breaking down. The democratization of HPC has spurred a lot of focus on the impact that HPC-hatched technologies are Read more…