Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

Features

Earthquake Simulation Hits Petascale Milestone

Apr 15, 2014 |

German researchers are helping to push back the boundaries of large-scale simulation. Using the IBM “SuperMUC” high performance computer at the Leibniz Supercomputing Center (LRZ), a cross-disciplinary team of computer scientists, mathematicians and geophysicists successfully scaled an earthquake simulation to more than one petaflop/s, i.e., one quadrillion floating point operations per second. The collaboration included participants Read more…
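The unit definition is easy to make concrete: a sustained rate in petaflop/s is just the total floating-point operation count divided by wall-clock time, scaled by 10^15. The short Python sketch below illustrates the arithmetic; the operation count and runtime are made-up placeholders, not figures from the SuperMUC run.

    # Back-of-the-envelope petaflop/s calculation; the operation count and
    # wall-clock time below are illustrative placeholders, not numbers from
    # the actual SuperMUC run.
    total_flop = 3.0e18      # assumed: total floating-point operations performed
    wall_time_s = 2700.0     # assumed: wall-clock runtime in seconds (45 minutes)

    rate_flops = total_flop / wall_time_s    # sustained FLOP per second
    rate_pflops = rate_flops / 1.0e15        # 1 petaflop/s = 10**15 FLOP/s

    print(f"Sustained rate: {rate_pflops:.2f} petaflop/s")   # ~1.11 petaflop/s here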

Inside Major League Baseball’s “Hypothesis Machine”

Apr 3, 2014 |

When it comes to sports statistics, there’s no richer source of historical data than baseball, with over 140 years of detailed information on individual players, teams, and winning trends. The addition of digital data is powering even deeper analytical capability to help fans and team owners make decisions. Baseball data, over 95% of which has Read more…

Swiss Hybrid Petaflopper Opens for Research

Mar 24, 2014 |

During the 2013 NVIDIA GPU Technology Conference, the Swiss National Supercomputing Center (CSCS) revealed that its Cray XC30 “Piz Daint” supercomputer was on track to become Europe’s fastest GPU-accelerated number-cruncher, and the first Cray machine to be equipped with Intel Xeon processors and NVIDIA GPUs. Now, one year later, the revved-up Piz Daint is officially Read more…

Peek into China’s Plans for Top Supercomputer Shows No Slowdown

Mar 20, 2014 |

This week we learned more about Japan’s exascale plans for the 2020 timeframe, but China is also on the contender list to be among the first to reach exascale-class computing levels. For now, however, the country has its sights set on continuing to dominate the list in 2015 and beyond. To put this and the Read more…

Details Emerging on Japan’s Future Exascale System

Mar 18, 2014 |

The Big Data and Extreme Computing meeting in Fukuoka, Japan concluded recently, pushing a great deal of information about international progress toward exascale initiatives into the global community. As the host country, Japan had ample opportunity to gather many of the researchers building out the next incarnation of the K Computer, which is expected to Read more…

Short Takes

Russian-Bred Supercomputer in the Works

Apr 11, 2014 |

Russia is developing a home-grown supercomputer for military-industrial applications, according to a report in Prensa Latina. Ruselectronics CEO Andrei Zverev revealed that the state-sponsored electronics holding company is coordinating with the Ministry of Industry and Commerce to create a 1.2-petaflop computer to serve the needs of the Russian defense industry. “All of Read more…

Viglen Gives UK Science Facility JASMIN £4 Million Makeover

Apr 10, 2014 |

British systems integrator Viglen has won a £4 million contract to outfit JASMIN, a UK-based environmental scientific data analysis and simulation facility, with petascale-level data processing and storage capabilities. The contract calls for the design, supply and installation of a turnkey integrated HPC computing, storage and network solution at the site, which is run by Read more…

Leading Edge Versus Bleeding Edge

Apr 10, 2014 |

Enterprises are always looking for an edge to use against the competition, and information technology was created initially and specifically to be that edge. Decades later, computing in its various forms is the foundation of the modern corporation, and companies are still looking for new ways of gaining an advantage. More often than not, that Read more…

HPC ‘App’ for Industry Stresses Ease of Use

Apr 8, 2014 |

One of the main enterprise uses for high performance computing (HPC) is to bring product designs to market faster via a process known as rapid prototyping. This week, three major companies – Unilever, Syngenta and Infineum – have partnered with the HPC facilities at the Science and Technology Facilities Council’s (STFC’s) Hartree Centre, drawn by Read more…

How NASA Is Meeting the Big Data Challenge

Apr 7, 2014 |

As the scientific community pushes past petaflop into exascale territory, it is imperative that the tools to support ever-more data-intensive workloads keep pace. Nowhere is this more true than at the storied NASA research complex. With 100 active missions supporting cutting-edge science, NASA knows more than most about compute- and data-driven challenges. A recent paper Read more…

Off the Wire

Intel Selects Georgia Tech as Site for Next Parallel Computing Center

Apr 17, 2014 |

April 17 — As modern computer systems become more powerful, utilizing as many as millions of processor cores in parallel, Intel is looking for new ways to efficiently use these high performance computing (HPC) systems to accelerate scientific discovery. As part of this effort, Intel has selected Georgia Tech as the site of one of Read more…

NEC Selects Chelsio Adapters for Vector Supercomputer

Apr 16, 2014 |

SUNNYVALE, Calif., April 16 – Chelsio Communications, Inc., a leading provider of 10-Gigabit and 40-Gigabit Ethernet adapters and ASIC solutions (Terminator 4 and 5), today announced that its T4-based 10GbE adapters have been selected by NEC Corporation for NEC’s HPC systems, providing network and storage connectivity in SX-ACE, the latest SX-series vector supercomputer, targeted at Read more…

Mellanox Collaborates with Dell

Apr 16, 2014 |

SUNNYVALE, Calif. & YOKNEAM, Israel, April 16 – Mellanox Technologies, Ltd., a leading supplier of high-performance, end-to-end interconnect solutions for data center servers and storage systems, today announced its 10/40GbE adapters for rack servers and 10GbE mezzanine adapters for blades are now available with Dell Fluid Cache for SAN. The Mellanox high speed adapters help enable Read more…

Cluster 2014 Set for September

Apr 15, 2014 |

April 15 — Clusters have become the workhorse for computational science and engineering research, powering innovation and discovery that advance science and society. They are the base for building today’s rapidly evolving cloud and HPC infrastructures, and are used to solve some of the most complex problems. The challenge to make them scalable, efficient, and Read more…

SDSC Enables Large-Scale Data Sharing Using Globus

Apr 14, 2014 |

April 14 — The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, has implemented a new feature of the Globus software that will allow researchers using the Center’s computational and storage resources to easily and securely access and share large data sets with colleagues. In the era of “Big Data”-based science, accessing and Read more…
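The kind of sharing SDSC describes can be illustrated with a hedged sketch using the globus-sdk Python package: granting a collaborator read access to a directory on a shared Globus endpoint. The token, endpoint ID, identity ID, and path below are hypothetical placeholders, and this is a generic illustration of Globus sharing, not SDSC’s actual configuration.

    # A minimal sketch of Globus data sharing, assuming the globus-sdk Python
    # package; all tokens, IDs, and paths are hypothetical placeholders.
    import globus_sdk

    TRANSFER_TOKEN = "..."          # assumed: a valid Globus Transfer access token
    SHARED_ENDPOINT_ID = "..."      # assumed: UUID of a shared endpoint
    COLLABORATOR_IDENTITY = "..."   # assumed: Globus identity UUID of a colleague

    # Build a Transfer client authorized with the access token.
    tc = globus_sdk.TransferClient(
        authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN)
    )

    # Grant the collaborator read-only access to one directory on the shared
    # endpoint by adding an access (ACL) rule.
    rule = {
        "DATA_TYPE": "access",
        "principal_type": "identity",
        "principal": COLLABORATOR_IDENTITY,
        "path": "/results/",       # directory paths end with a trailing slash
        "permissions": "r",        # read-only
    }
    tc.add_endpoint_acl_rule(SHARED_ENDPOINT_ID, rule)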