SUPERCOMPUTING CENTRE DELIVERS EXCELLENCE ON SX-5

July 21, 2000

By Christopher Lazou

San Diego, CA — Since 1992, the Swiss Centre for Scientific Computing (CSCS) has been at the forefront of providing HPCN services to the Swiss scientific community, using NEC SX systems. Its work ranges from the mesoscale Alpine weather forecast project to the Car-Parrinello method in the study of molecular dynamics. In a recent interview (published below), Dr. D. Maric, the Chief Technology Officer for the Swiss National HPCN, described this work and added: “The choice of the NEC SX-5 just installed also underlines our confidence in future SX technology.”

CL: (Christopher Lazou). Can you briefly describe HPCN in Switzerland?

DM: (Dr. D. Maric) HPCN is one of the key strategic technologies enabling the current and future competitiveness of many fields of scientific research and technological development in Switzerland. Its role and sustainable future development are guaranteed by coherent strategic guidance from the Board of the Swiss Federal Institutes of Technology (FIT Board) and ETH Zurich within the framework of national science policy.

CL: The Swiss Centre for Scientific Computing (CSCS) has been in operation since 1992, and during this period you have been using NEC SX systems for the most performance-demanding scientific fields. What, in your opinion, is the significance of the new CSCS installation for HPC in Switzerland?

DM: This upgrade reflects the recognition of the excellent quality of the scientific work of the current SX-customers and our commitment to provide them with continued support.

CL: You have been using several generations of NEC SX systems so can you briefly say why?

DM: The decision to install NEC SX systems from the beginning turned out to be a very wise one. We conducted an evaluation during the summer of 1999, and this reinforced our view that SX technology remains supreme for the applications of our SX-4 user community. The choice of the NEC SX-5 just installed also underlines our confidence in future SX technology.

CL: CSCS is one of the major scientific/academic centres in Europe. Can you give a general sense of your experience in providing HPC services to a national community? Who are your users and how do they get resources on your NEC SX systems?

DM: Our users have access to a number of national centres, CSCS at Manno and the computing centres at EPF Lausanne and ETH Zurich. There is a peer review mechanism set up where scientists request computing resources which are then assessed by their peers to ensure that the available resources are focused on relevant research.

CL: A number of scientific projects are pursued under the “Task force” umbrella. Can you say something about the importance of the Task force to CSCS?

DM: When we bought the NEC SX-4/16 back in 1996, CSCS in partnership with NEC set up a “Task force” to help users port, optimise and parallelise their applications on the new system, and in the process take full advantage of the new parallel vector architecture and its power. The concept of a joint Task force of specialists in tuning and parallel processing, cooperating closely with end-users from academic and research institutes, turned out to be very effective. The new tools developed, and the synergy gained from the combined experience of the Task force and end-users, ultimately contributed significantly to the very efficient use of the installed compute power.

CL: Can you mention some of the important scientific projects studied under the “Task force” umbrella?

DM: The quality of our scientific users is high, and the Task force approach has been adopted by several other countries. Our approach in the Task force can be summarised as: “let the scientists do the science and we’ll do the rest”. As part of the Task force exercise, a set of important areas was studied: in numerical weather forecasting, the Mesoscale Alpine Programme (MAP); in materials science, the Car-Parrinello method, used in condensed matter physics to study molecular dynamics; and in quantum chemistry, semi-empirical molecular orbital methods, based on the MNDO model, for studying ground-state potential surfaces. In computational fluid dynamics (CFD), direct numerical simulations of turbulence were performed, along with some collaborative work with the European aerospace industry.

CL: I understand that Professor Car from Princeton also uses CSCS, so your community is in reality international. Can you elucidate what he and other international scientists are using it for?

DM: Professor Car, before going to Princeton, was working at the Institute for Numerical Research in the Physics of Materials (IRRMA), Lausanne and was one of the key strategic users participating in the Task force. He previously stated: “The performance achieved on the SX-4 by the Car-Parrinello code developed at IRRMA opens unprecedented possibilities for the simulation of complex molecular processes in physics and chemistry using highly accurate quantum-mechanical techniques”.

CL: Can you also say something about the code performance for Car-Parrinello and any other resource-demanding Task force codes?

DM: The Task force has accomplished a significant optimisation and parallelisation of the Car-Parrinello code. It achieved 90% efficiency on one processor and a speed-up factor of 10 on 16 processors, which amounts to 17.9 Gflop/s.

CL: What about performance on other codes?

DM: For the MNDO code the performance is even better, attaining 23 Gflop/s on 16 processors. The story is repeated with the direct numerical simulation of turbulence CFD code, which achieved 26 Gflop/s on 16 processors. This outstanding performance corresponds to more than 80% of the 32 Gflop/s peak performance of the SX-4 then available at CSCS, and it was achieved with a realistic, professional, state-of-the-art application, not with an artificial benchmark test.
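The quoted figures are internally consistent, as a quick back-of-the-envelope check shows. The sketch below is illustrative and not from the article; it assumes a per-processor peak of 2 Gflop/s, inferred from the stated 32 Gflop/s aggregate peak of the 16-processor SX-4.

```python
# Cross-check of the SX-4 performance figures quoted in the interview.
# Assumption: per-processor peak = 32 Gflop/s aggregate / 16 CPUs.
PEAK_PER_CPU = 32.0 / 16  # Gflop/s
N_CPUS = 16
peak = PEAK_PER_CPU * N_CPUS  # 32 Gflop/s aggregate

# Sustained rates quoted for the three Task force codes.
codes = {"Car-Parrinello": 17.9, "MNDO": 23.0, "DNS turbulence": 26.0}
for name, gflops in codes.items():
    print(f"{name}: {gflops / peak:.0%} of peak")

# Car-Parrinello: 90% single-CPU efficiency times a speed-up of 10
# predicts 0.9 * 2 Gflop/s * 10 = 18 Gflop/s, close to the quoted 17.9.
predicted = 0.9 * PEAK_PER_CPU * 10
print(f"Predicted Car-Parrinello rate: {predicted:.1f} Gflop/s")
```

The DNS code's 26 Gflop/s works out to just over 81% of the aggregate peak, consistent with the "more than 80%" claim.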

CL: One of the most important European climate experiments is the Mesoscale Alpine Programme (MAP). Can you briefly say what this is and in what way it benefited from the use of the NEC SX system at CSCS?

DM: On September 7, 1999, one of the largest field experiments in Alpine meteorological history was initiated, with the objective of making a major contribution to current problems in weather forecasting, with particular regard to extreme events such as heavy precipitation, river run-off and flooding. The Mesoscale Alpine Programme (MAP) is a collaboration of several hundred scientists from national weather services; university institutions; prominent research institutes such as the US National Center for Atmospheric Research (NCAR), Environment Canada and the French National Centre for Scientific Research (CNRS); and the World Meteorological Organisation (WMO). Currently the Swiss Meteorological Institute (SMA) hosts the MAP programme office, and the data centre is at ETH Zurich. Computer simulations at unprecedented resolution, with prospects for substantial improvements in weather prediction, are one of the key motivations of MAP. The SX-4 at CSCS was the only machine powerful enough to perform these simulations.

CL: What in your view are the most useful characteristics of HPC systems which benefit your users and what barriers need to be removed to allow users more freedom to perhaps incorporate new physics in their models?

DM: A parallel vector supercomputer with a flat shared memory allows users to achieve high efficiency with the least effort. For some of our very important users, the NEC SX-5 also has the advantage of mature software and continuity, with no porting disruption.

CL: Finally, let me congratulate you for being elected as the new Chairman (President) of the NEC User Group. Can you say a few words about the NUG and your vision on how it should develop for the benefit of both users and NEC?

DM: A strong user group is essential: it helps users by sharing experiences, and the vendor by providing input for a better understanding of user requirements. The NUG has been successful, but because of the relatively small number of sites its infrastructure has remained weak. With the expansion of the customer base, the time has come to put the NUG on a more permanent footing. This involves incorporation into an independent structure with its own legal framework, which would enable the NUG to provide broader and more authoritative input to NEC on future user requirements, and thus enable the vendor to develop new systems with some confidence that they match user needs. Finally, my vision is that NUG meetings should become attractive also to non-NEC users, as major international HPCN events of interest to the most demanding “key players”.

CL: I think we explored a number of issues. Thank you for your time in talking to me and I am sure our readers would find your views interesting.

For further details contact: Dr. D. Maric, Chief Technology Officer for the Swiss national HPCN, Swiss Federal Institute of Technology (ETH), Swiss Centre for Scientific Computing (CSCS), Galleria 2, via Cantonale, CH-6928 Manno, Switzerland. e-mail:[email protected]

Copyright: Christopher Lazou, Managing Director, HiPerCom Consultants Ltd. email: [email protected] (Chris Lazou)

The viewpoints in this article are those of its author and not necessarily those of the publisher or staff of HPCwire.
