Tag: big data

White House Launches National HPC Strategy

Jul 30, 2015

Yesterday’s executive order by President Barack Obama creating a National Strategic Computing Initiative (NSCI) is not only a powerful acknowledgment of the vital role HPC plays in modern society but also indicative of the government’s mounting worry that failure to coordinate and nourish HPC development on a broader scale would put the nation at risk. Not Read more…

Gearing Up for the Cluster of Tomorrow

Jul 23, 2015

While the shifting architectural landscape of elite supercomputing gets a lot of the spotlight, especially around TOP500 time, cluster computing has grown to comprise more than three-quarters of the HPC market. At a joint Cray-IDC webinar held earlier today, the partners discussed how clusters, and specifically scale-out “cluster supercomputers,” are evolving in ways that can benefit from supercomputing technology. Steve Conway, Read more…

White House Prioritizes HPC Investment

Jul 22, 2015

A memorandum released by the White House earlier this month cites high-performance computing and big data as essential technology areas for 21st-century governance. Building on the administration’s past budget priorities, the memo calls for federal agencies to prioritize a set of nine science and technology areas when making coordinated FY 2017 budget submissions to the Office of Read more…

IDC: The Changing Face of HPC

Jul 16, 2015

At IDC’s annual ISC breakfast there was a good deal more than market update numbers, although there were plenty of those: “We try to track every server sold, every quarter, worldwide,” said Earl Joseph, IDC program vice president and executive director of the HPC User Forum. Perhaps more revealing, and as important this year, was IDC’s unveiling Read more…

HP, Intel Partner to Expand HPC Into the Enterprise

Jul 13, 2015

When the going gets tough, the tough join together to innovate. This is the message we are hearing from HPC stakeholders across the government and vendor landscape. It would be an oversimplification to say that big data is responsible for driving this deeper partner integration, but the technology trend has something to do with it. More precisely, the Read more…

Dell Aims PowerEdge C-Series Platform at HPC and Beyond

Jun 30, 2015

Dell has positioned its latest PowerEdge C-series platform to meet the needs of both traditional HPC and the hyperscale market. The recently hatched PowerEdge C6320 is outfitted with the latest-generation Intel Xeon E5-2600 v3 processors, providing up to 18 cores per socket (144 cores per 2U chassis), up to 512GB of DDR4 memory and Read more…

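As a quick sanity check on those figures, a back-of-the-envelope sketch (assuming dual-socket nodes, which the quoted core counts imply): 144 cores per 2U chassis divided by 18 cores per socket works out to eight sockets, or four dual-socket nodes sharing one chassis.

```python
# Back-of-the-envelope check of the quoted C6320 figures
# (dual-socket nodes are an assumption implied by the core counts).
cores_per_socket = 18                            # Xeon E5-2600 v3 top-bin SKU
sockets_per_chassis = 144 // cores_per_socket    # 8 sockets per 2U chassis
nodes_per_chassis = sockets_per_chassis // 2     # 4 dual-socket nodes
print(sockets_per_chassis, nodes_per_chassis)    # -> 8 4
```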

The Necessary Marriage of Big Data with Exascale

Jun 29, 2015

Failure to incorporate big data computing insights into efforts to achieve exascale computing would be a critical mistake, argue Daniel Reed and Jack Dongarra in their article “Exascale Computing and Big Data,” published in the July issue of Communications of the ACM. While scientific and big data computing have historically taken different development Read more…

Japan Preps for HPC-Big Data Convergence

Jun 18, 2015

The task of drawing discrete boundaries around the tech spheres known as HPC and big data is getting more complicated with each passing year. Many are anticipating convergence between these camps. One prominent HPCer waving the convergence flag is Satoshi Matsuoka, a professor at Tokyo Institute of Technology’s Global Scientific Information and Computing Center. In his Read more…

HP Launches HPC & Big Data Global Business Unit

Jun 4, 2015

When HP finally divides into two pieces, HP Inc. (PCs and printers) and Hewlett Packard Enterprise (servers and services), how will the HPC portfolio fare? Views vary, of course. The split is meant to let the ‘new’ companies shed distraction and sharpen focus. HPC will live within HP Enterprise, but perhaps surprisingly not Read more…

Shining a Light on SKA’s Massive Data Processing Requirements

Jun 4, 2015

One of the many highlights of the fourth annual Asia Student Supercomputer Challenge (ASC15) was the MIC optimization test, which this year required students to optimize a gridding algorithm used in the world’s largest international astronomy effort, the Square Kilometre Array (SKA) project. Gridding is one of the most time-consuming steps in radio telescope data processing. Read more…

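The excerpt doesn’t include the contest code, but as a rough illustration of why gridding dominates radio telescope pipelines, here is a minimal NumPy sketch of convolutional gridding (function names and the Gaussian kernel are illustrative assumptions, not the ASC15 benchmark): each visibility sample is convolved onto nearby cells of a regular UV grid so that a single inverse FFT can then form an image, and the per-sample scatter loop is the hot spot that teams tune for accelerators like Xeon Phi (MIC).

```python
import numpy as np

def grid_visibilities(u, v, vis, grid_size, kernel):
    """Scatter irregular (u, v) visibility samples onto a regular UV grid.

    u, v   : sample coordinates, already scaled to grid units
    vis    : complex visibility values, one per (u, v) sample
    kernel : small (2w+1, 2w+1) convolution kernel (real codes use a
             prolate-spheroidal taper; a Gaussian stands in here)
    """
    w = kernel.shape[0] // 2
    grid = np.zeros((grid_size, grid_size), dtype=np.complex128)
    for k in range(len(vis)):
        iu, iv = int(round(u[k])), int(round(v[k]))  # nearest grid cell
        # Convolve this sample onto its (2w+1)^2 neighboring cells --
        # this scatter is where gridding spends most of its time.
        grid[iv - w:iv + w + 1, iu - w:iu + w + 1] += kernel * vis[k]
    return grid

# Toy run: 10,000 random samples, 5x5 Gaussian kernel, 256^2 grid.
rng = np.random.default_rng(0)
n, size, w = 10_000, 256, 2
u = rng.uniform(w, size - w - 1, n)
v = rng.uniform(w, size - w - 1, n)
vis = rng.normal(size=n) + 1j * rng.normal(size=n)
x = np.arange(-w, w + 1)
kern = np.exp(-(x[:, None] ** 2 + x[None, :] ** 2) / 2.0)
uv_grid = grid_visibilities(u, v, vis, size, kern)
dirty_image = np.fft.fftshift(np.abs(np.fft.ifft2(uv_grid)))
```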