Tag: big data

NSF Backs ‘Big Data Spokes’ with $10M in Grants

Sep 28, 2016 |

In recent years the Obama Administration and the National Science Foundation have worked to spur the growth of big data infrastructure to handle academic, government, and industrial data-intensive research. In 2012, OSTP launched the Big Data Research and Development Initiative, and last year NSF announced BD Hubs. Today NSF continued adding muscle to the growing Read more…

MIT Programmers Attack Big Data Memory Gap

Sep 15, 2016 |

Among the computing challenges presented by big data is the scattering of unstructured items across huge datasets. Pulling together that data from arbitrary locations in main memory is therefore emerging as a major performance bottleneck in CPUs. Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory have proposed a solution to the memory “locality” problem Read more…

HPE Gobbles SGI for Larger Slice of $11B HPC Pie

Aug 11, 2016 |

Hewlett Packard Enterprise (HPE) announced today that it will acquire rival HPC server maker SGI for $7.75 per share, or about $275 million, inclusive of cash and debt. The deal ends the seven-year reprieve that kept the SGI banner flying after Rackable Systems purchased the bankrupt Silicon Graphics Inc. for $25 million in 2009 and assumed the SGI brand. Bringing SGI into its fold bolsters HPE’s high-performance computing and data analytics capabilities and expands its position…

Obama, NIH Announce Big Data Gathering Push for Precision Medicine

Jul 7, 2016 |

One could be forgiven for experiencing a bit of hopeful skepticism in response to the Obama Administration's statement in May about re-energizing the "War Against Cancer." The war against cancer is a decades-old effort with mixed results – great progress in many areas matched with disappointment in others. Winning the war Read more…

OLCF Researchers Scale R to Tackle Big Science Data Sets

Jul 6, 2016 |

Sometimes lost in the discussion around big data is the fact that big science has long generated huge data sets. “In fact, large-scale simulations that run on leadership-class supercomputers work at such high speeds and resolution that they generate unprecedented amounts of data. The size of these datasets—ranging from a few gigabytes to hundreds of Read more…

TACC Director Lays Out Details of 2nd-Gen Stampede System

Jun 2, 2016 |

With a $30 million award from the National Science Foundation announced today, the Texas Advanced Computing Center (TACC) at The University of Texas at Austin (UT Austin) will stand up a second-generation Stampede system based on Dell PowerEdge servers equipped with Intel “Knights Landing” processors, next-generation Xeon chips and future 3D XPoint memory.

HPC and Big Data Convergence Takes Life at PSC’s Bridges

Jun 1, 2016 |

Roughly three months into early operations, the Bridges computing resource being deployed at the Pittsburgh Supercomputing Center is bearing fruit. Designed to accommodate both traditional HPC and big data analytics, Bridges had supported 245 projects as of May 26. This ramp-up of the NSF-funded ($9.6M) Bridges project is an important step in delivering practical Read more…

Cray Bakes Big Data Software Framework Into Urika-GX Analytics Platform

May 24, 2016 |

Cray continued its courtship of the advanced scale enterprise market with today’s launch of the Urika-GX, a system that integrates Cray supercomputing technologies with an agile big data platform designed to run multiple analytics workloads concurrently. While Cray has pre-installed analytics software in previous systems, the new system takes pre-configuration to a new level with an open software framework designed to eliminate installation, integration and update headaches that stymie big data implementations.

TACC Wrangler Supports Zika Hackathon

May 19, 2016 |

By now most people have heard of Zika, the mosquito-borne disease that can cause fever and birth defects and that threatens to spread to the United States from Latin America. Earlier this week more than 50 data scientists, engineers, and University of Texas students gathered for the "Austin Zika Hackathon" at big data Read more…

Nielsen and Intel Migrate HPC Efficiency and Data Analytics to Big Data

May 16, 2016 |

Nielsen has collaborated with Intel to migrate key pieces of HPC technology into Nielsen's big data analytic workflows, including MPI, mature numerical libraries from NAG (the Numerical Algorithms Group), and custom C++ analytic codes. This hybrid approach combines the benefits of Hadoop data management and workflow scheduling with an extensive pool of HPC tools and C/C++ capabilities for analytic applications. In particular, the use of MPI reduces latency, permits reuse of the Hadoop servers, and co-locates the MPI applications close to the data.