Already the world’s most energy-efficient petascale supercomputer, “Piz Daint” – the Cray XC30 system installed at the Swiss National Supercomputing Centre (CSCS) – is now even more powerful, thanks to a multicore Cray XC40 extension called “Piz Dora.” Following a local naming convention, the monikers were inspired by mountains of the Swiss Ortler Alps, with Piz Daint as the “inner peak” and Piz Dora the “outer peak.”
The heterogeneous system was built to deliver a variety of high-end computing services, including extreme-scale compute, data analytics, pre- and post-processing, and visualization. The impetus for the addition was to better support the science workflow of this premier Swiss research center and of partner institutions, including the University of Zurich and EPFL.
In a release published earlier today, ETH Zurich rep Andrea Schmits writes, “In order to offer even more computing resources for users, at the beginning of 2014 CSCS decided to expand its existing facilities with the purchase of a Cray supercomputer (multi-core Cray XC40).”
In May, officials from ETH Zurich approved the additional infrastructure, which the University of Zurich has agreed to fund. The new resources will also be made available to other Swiss universities for scientific inquiries.
“This further strengthens the cooperation between ETH Zurich and the University of Zurich,” says Roman Boutellier, Vice-President for Human Resources and Infrastructure at ETH Zurich. “Thanks to their involvement, we can spread our fixed costs across a larger volume of calculations, while the University of Zurich will not have to invest so much in expensive infrastructure itself. It is possible that we could also enter into partnerships with other universities and universities of applied science.”
On its own, Piz Daint, a Cray XC30 machine, has 5,272 compute nodes (with Intel Xeon E5-2670 and NVIDIA Tesla K20X parts), for a theoretical peak performance per node of 166.4 gigaflops (for the E5-2670) and 1311.0 gigaflops (for the Tesla K20X). In total, Piz Daint claims a theoretical peak performance of 7.787 petaflops. Benchmark tests show Piz Daint running DCA++, a quantum Monte Carlo code to simulate models of high-temperature superconductors, at a sustained 4.2 petaflops.
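The quoted system peak follows directly from the per-node figures. A minimal sketch of the arithmetic (numbers taken from the article; the small difference from the quoted 7.787 petaflops comes from rounding in the per-node figures):

```python
# Back-of-the-envelope check of Piz Daint's quoted peak performance.
nodes = 5272
cpu_peak_gf = 166.4      # Xeon E5-2670 quoted peak, in gigaflops
gpu_peak_gf = 1311.0     # Tesla K20X quoted peak, in gigaflops

node_peak_gf = cpu_peak_gf + gpu_peak_gf       # 1477.4 GF per node
system_peak_pf = nodes * node_peak_gf / 1e6    # gigaflops -> petaflops

print(f"Per-node peak: {node_peak_gf:.1f} GF")
print(f"System peak:   {system_peak_pf:.3f} PF")  # ~7.79 PF
```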
The extension Piz Dora – which adds a fourth row of cabinets behind the three rows that make up Piz Daint – has a maximum capability of 1.258 petaflops. The Cray XC40 packs 1,256 compute nodes, each touting two 12-core Intel Xeon E5-2690 v3 CPUs. At 24 cores per node, this brings the total core count to 30,144, bumped to 60,288 total virtual cores when hyperthreading is enabled. On 1,192 nodes, there is 64GB of RAM each, while the other 64 “fat nodes” have 128GB of RAM each, accessible under the SLURM partition bigmem.
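The core and memory totals above can be checked the same way; the aggregate RAM figure below is derived from the article's per-node numbers, not stated in the release:

```python
# Quick check of the Piz Dora node, core, and memory counts.
nodes = 1256
cores_per_node = 2 * 12                       # two 12-core Xeon E5-2690 v3 CPUs

total_cores = nodes * cores_per_node          # physical cores
virtual_cores = total_cores * 2               # with hyperthreading enabled

fat_nodes = 64                                # 128 GB "fat" nodes (bigmem partition)
standard_nodes = nodes - fat_nodes            # 1,192 nodes with 64 GB each
total_ram_tb = (standard_nodes * 64 + fat_nodes * 128) / 1024

print(total_cores, virtual_cores)             # 30144 60288
print(f"Aggregate RAM: {total_ram_tb:.1f} TB")
```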
In describing the new Cray system in the context of the evolution of HPC, CSCS Director Thomas C. Schulthess pointed to “the changing role of high-performance computing,” stating “HPC is now an essential tool for science, used by all scientists (for better or worse), rather than being limited to the domain of applied mathematics and providing numerical solution to theoretical problems only few understand.”
Given this new paradigm, Schulthess believes “we shouldn’t be worrying about individual computers anymore, but about platforms on which we deliver scientific computing services.”
The updated infrastructure replaces the University of Zurich’s Schrödinger supercomputer and gives researchers a streamlined process for submitting new algorithms to the competitive supercomputers at CSCS. The arrangement will also mean less data being piped between Zurich and Lugano.
The CSCS team has been preparing the working environment since early November and it is currently authorized for trial use.
“From January 2015, the system will be made available to a wider group of users,” reports CSCS spokesperson Angela Detjen.
Full normal production is scheduled for April 1, 2015.
In related news, CSCS recently became the first institution in Switzerland, besides the premier science center CERN, to have a 100 gigabit per second (Gbps) connection to SWITCH, the Swiss Education and Research Network.