Features

Earthquake Simulation Hits Petascale Milestone

Apr 15, 2014 |

German researchers are helping to push back the goalposts on large-scale simulation. Using the IBM “SuperMUC” high performance computer at the Leibniz Supercomputing Center (LRZ), a cross-disciplinary team of computer scientists, mathematicians and geophysicists successfully scaled an earthquake simulation to more than one petaflop/s, i.e., one quadrillion floating point operations per second. The collaboration included participants Read more…

Why Iterative Innovation is the Only Path to Exascale

Apr 14, 2014 |

If we’re out of “magic bullets” that can shoot across supercomputing space, shattering assumptions about how high performance computing operates efficiently at massive scale, we’re left with one option…refine and tweak that which exists, while pushing as much funding as possible toward the blue sky above with the hopes that another disruptive technology will emerge. Read more…

Big Science, Tiny Microservers: IBM Research Pushes 64-Bit Possibilities

Apr 10, 2014 |

Four years ago, a friend dropped a Sheeva Plug into the hands of Ronald Luijten, a system designer at IBM Research in Zurich. At the time, neither could have realized the development cycle this simple gift would spark. If you’re not familiar, Sheeva Plugs are compact devices that look a lot like your laptop power Read more…

IDC Details Further HPC Market Momentum

Apr 10, 2014 |

This week we traveled to Santa Fe for the IDC User Forum to get a better grasp on upcoming trends for the high performance computing market in 2014 and beyond. While the emphasis at this particular meeting was on the value of industrial partnerships, there were a number of talks around scientific computing initiatives, developments Read more…

DOE Exascale Roadmap Highlights Big Data

Apr 7, 2014 |

If you’ve been following the US exascale roadmap, then chances are you’ve been following the work of William (“Bill”) J. Harrod, Division Director for the Advanced Scientific Computing Research (ASCR), Office of Science with the US Department of Energy (DOE). In January, Harrod asserted that the DOE’s mission to push the frontiers of science and Read more…

Oil Company Drills into HPC Cloud Issues

Apr 7, 2014 |

Danish energy company Maersk Oil has seen a number of technology trends evolve since its founding in 1962, particularly in terms of its ability to stay ahead in the unending race for new sources of hydrocarbons. The company’s research and technology team hasn’t been immune to the promises of cloud computing in its evaluations of Read more…

Numascale Launches Scalable GPU Systems

Apr 7, 2014 |

Based on its innovative interconnect, Numascale offers scalable and expandable systems for high performance applications. Adding standard servers scales a GPU system within a single-image operating system and environment, with scalable shared memory, opening the way to better utilization of the GPUs’ computing power.
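The practical appeal of a single-image, shared-memory system is that ordinary threaded code can scale across added servers without explicit message passing. The sketch below is a generic illustration of that idea using plain C and OpenMP; it is an assumption-level example only and does not use or imply any Numascale-specific API or software stack.

/* Minimal sketch: a shared-memory reduction that scales with core count
 * on any cache-coherent, single-image system. Illustrative only; not
 * vendor-specific code. Compile with: cc -fopenmp example.c */
#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

int main(void)
{
    const size_t n = 1UL << 26;               /* ~67M doubles, ~512 MB */
    double *a = malloc(n * sizeof(double));
    if (!a) { perror("malloc"); return 1; }

    double sum = 0.0;

    /* All threads share one address space, so no message passing is
       needed; the loop simply spreads across however many cores the
       single system image exposes. */
    #pragma omp parallel for reduction(+:sum)
    for (size_t i = 0; i < n; i++) {
        a[i] = (double)i;
        sum += a[i];
    }

    printf("threads=%d sum=%.0f\n", omp_get_max_threads(), sum);
    free(a);
    return 0;
}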

Inside Major League Baseball’s “Hypothesis Machine”

Apr 3, 2014 |

When it comes to sports statistics, there’s no richer source of historical data than baseball. With over 140 years of detailed information on individual players, teams, and winning trends, the addition of digital data is powering even deeper analytical capability to help fans and team owners make decisions. Baseball data, over 95% of which has Read more…

The Bright Side of Decline: IDC Sheds Light on HPC Server Market

Apr 2, 2014 |

As some of you have already noted, the most recent figures from IDC’s sweep of the HPC server market are in—and from the surface, they don’t suggest a stellar season ahead for supercomputing. However, when put into some broader context, particularly on the international scale with a few massive, surprise systems added to the mix, Read more…

HPC and Big Data: A “Best of Both Worlds” Approach

Mar 31, 2014 |

While they may share a number of similar, overarching challenges, data-intensive computing and high performance computing have some rather different considerations, particularly in terms of management, emphasis on performance, storage and data movement. Still, there is plenty of room for the two areas to merge, according to Indiana University’s Dr. Geoffrey Fox. Fox and his Read more…