Since 1986 - Covering the Fastest Computers in the World and the People Who Run Them

Mira Supercomputer Propels High-Intensity Beam Science

Mar 6, 2015 |

As CERN’s Large Hadron Collider (LHC) prepares to restart this March, a team of Fermilab physicists is using powerful Department of Energy supercomputing resources to reduce the risks and costs associated with producing these high-intensity particle beams. Led by Fermilab physicist James Amundson, the team is working with the Argonne Leadership Computing Facility (ALCF) Read more…

CERN Details OpenStack Journey

Nov 4, 2014 |

At the OpenStack Summit in Paris, France, CERN’s Infrastructure Services Manager Tim Bell gave the general session audience an overview of his institution’s experiences moving to OpenStack, which he characterizes as a “cultural and technology transformation.” CERN, the European Organization for Nuclear Research, supports 11,000 physicists from around the world. These scientists use the facilities Read more…

CERN Researchers Explore x86 Alternatives

Oct 28, 2014 |

Energy consumption has become a constraining feature on the growth of computing systems, such that the industry has shifted its focus from pure performance to performance-per-watt. As a result, there is increased interest in newer chip architectures that emphasize energy efficiency. An international group of researchers with ties to CERN are especially concerned with the effect these power constraints Read more…

ESnet Deploys 100G Connectivity Across Atlantic

Oct 21, 2014 |

The Department of Energy’s (DOE’s) Energy Sciences Network, or ESnet, is gearing up to deploy four new high-speed transatlantic links with a total capacity of 340 gigabits per second, significantly boosting network speeds between US research sites and European facilities. The transatlantic expansion adds four separate links connecting Boston, New York and Washington DC with Amsterdam, London Read more…

Pushing Parallel Processing Power at CERN

Feb 17, 2014 |

The Large Hadron Collider (LHC) relies on parallel processors, including coprocessors, to power its massive data acquisition system. Without the computational power afforded by these processors, discovery is hampered; the pace of the science depends in part on improvements in computational speed. Valerie Halyo, a research scientist in the Department of Physics at Princeton University, is Read more…

Window in Time Opens to CERN’s Supercomputing History

Jan 14, 2014 |

Yesterday, we brought you a story about the iconic CDC 6500 supercomputer, which is currently undergoing restoration at the Living Computer Museum in Seattle. The CDC 6500 system, built by Control Data Corporation in 1967, was part of the CDC 6000 line, designed by Seymour Cray in the 1960s. The most famous of these was Read more…

CERN and Rackspace Form OpenStack Partnership

Jul 3, 2013 |

Amazon Web Services is the cloud provider most often cited in scientific articles about high performance applications in the cloud. Meanwhile, cloud competitor Rackspace has not ventured far into the scientific computing arena. That changed this week when Rackspace, a noted provider of cloud services, announced on Monday that it is entering the cloud-based HPC game by partnering with CERN.

CloudSigma CEO Elaborates on Science Cloud

Jun 14, 2013 |

Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Included in that rundown was an article for HPC in the Cloud penned by CloudSigma co-founder and CEO Robert Jenkins, in which he discussed the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and ESA (the European Space Agency)…

The Science Cloud Cometh

May 28, 2013 |

Monumental scientific undertakings have very different goals, but one important feature in common: the huge amounts of data that must be processed efficiently in order to yield accurate results. The answer to this challenge may lie in one of today’s most innovative computing delivery technologies: cloud computing.

CERN, Google Drive Future of Global Science Initiatives

May 21, 2013 |

Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).