Tag: big data

DDN’s Alex Bouzari on Beating Moore’s Law

Jan 13, 2016 |

As 2015 was in its home stretch, DataDirect Networks (DDN) refreshed its high performance SFA block storage line with the launch of SFA14K and SFA14KE, formerly codenamed “Wolfcreek.” DDN also took the wraps off its Infinite Memory Engine (IME14K), which leverages solid state and nonvolatile memory technologies to create a data caching tier between processor and Read more…

SC15 Video: IBM’s Dave Turek on Big Data, CORAL, and HPC’s Evolution

Dec 8, 2015 |

Accelerated computing certainly dominated the IBM message at SC15 but there were many sub-themes in Austin. Big data, the beneficial impact of software frameworks (think Apache Spark), workflow optimization, and a growing role for cloud in HPC delivery were all in the mix. HPCwire managing editor John Russell sat down with Dave Turek, IBM VP Read more…

Cray Lays Out Vision for HPC-Big Data Convergence

Dec 3, 2015 |

Messaging around HPC-big data convergence, which had been ramping up all year, reached new heights at SC15 — and you’d be hard-pressed to find a bigger champion of the unified platform strategy than American supercomputer-maker Cray. HPCwire met with Cray’s Barry Bolding at the show in Austin last month to discuss the company’s latest customer wins, its take on Read more…

Contrary View: CPUs Sometimes Best for Big Data Visualization

Dec 1, 2015 |

Contrary to conventional thinking, GPUs are often not the best vehicles for big data visualization. In this commentary, I discuss several key technical reasons why a CPU-based “Software Defined Visualization” approach can deliver a more flexible, performant, scalable, cost-effective, and power-efficient solution to big data visualization than a conventional GPU-based approach. An example Read more…

Hadoop and Spark Get RADICAL at SC15

Nov 13, 2015 |

The rapid maturation of the Apache Hadoop ecosystem has caught the eyes of HPC professionals who are eager to take advantage of emerging big data tools, such as Spark. One HPC group presenting on the topic at the SC15 show this week in Austin, Texas, is Rutgers University’s RADICAL team. The Research in Advanced Distributed Read more…

Mira is First Supercomputer to Simulate Large Hadron Collider Experiments

Nov 4, 2015 |

Argonne physicists are using Mira to perform simulations of Large Hadron Collider (LHC) experiments with a leadership-class supercomputer for the first time, shedding light on a path forward for interpreting future LHC data. Researchers at the Argonne Leadership Computing Facility (ALCF) helped the team optimize their code for the supercomputer, which has enabled them to Read more…

NSF Awards $5M for Four Regional Big Data Hubs in Science

Nov 2, 2015 |

The ability to access, analyze and draw insights from massive amounts of data already drives innovation in areas ranging from medicine to manufacturing, leading to greater efficiency and a higher quality of life. To accelerate this emerging field, the National Science Foundation (NSF) today announced four awards totaling more than $5 million to establish regional Read more…

Will SC15 Talk Provide Glimpse into NSCI Implementation Plans?

Oct 15, 2015 |

With Randy Bryant (OSTP) and Tim Polk (NIST) slated to give an invited talk at SC15 explaining the National Strategic Computing Initiative, the timing seems perfect for an early preview of specifics contained in NSCI’s implementation plan. The talk is highlighted on today’s SC15 Blog. President Obama’s Executive Order establishing NSCI was issued at Read more…

IBM ‘Returns to HPC’ with New Linux Server Line Says Gupta

Oct 8, 2015 |

IBM today launched a new line of Power8-based Linux servers – the Power LC (Linux cluster) Line – including one offering that marks IBM’s return to the HPC market, according to Sumit Gupta, vice president of HPC and OpenPOWER Operations at IBM (NYSE: IBM). Three servers were announced, aimed at cloud computing, data analytics, and Read more…

This Hospital Computer Knows When Your Days Are Numbered

Sep 25, 2015 |

Chalk “predict death” off the list of “what computers can’t do.” A supercomputer at Beth Israel Deaconess Medical Center in Boston, Mass., can decipher the likelihood of a patient’s imminent demise with uncanny accuracy. Patients at the hospital are linked up to the machine, which leverages all available patient data — doctor visits, lab results, medications and Read more…