European Cyberinfrastructure in the Making

By Christopher Lazou

December 1, 2006

In Rome on November 24-25, CASPUR (Consorzio per le Applicazioni del Supercalcolo Per Università e Ricerca) hosted an excellent “Forward Look” workshop on quantum molecular sciences at the prestigious Accademia dei Lincei, whose past members include Galileo Galilei. The workshop is part of the European Computational Science Forum, the Lincei Initiative: from computers to scientific excellence. The aim of “Forward Look” is to develop a vision of how the computational sciences will evolve in the coming 10 to 20 years and to create an infrastructure to support them. During this workshop on quantum molecular sciences I listened to the proceedings and spoke with many of the leading players.

I visited CASPUR in the eternal city of Rome, so before reporting on the workshop let's see what a medium-sized computer centre such as CASPUR offers the scientific community. Although the large centres steal the limelight, medium-sized ones are not only the “bread and butter” for scientists, but also account for a major share of HPC usage.

Established in 1992, CASPUR is a consortium of nine universities in central and southern Italy. Led by Dr Carlo Maria Serio, the consortium has about 80 staff, of whom about 30 are members of the HPC unit. CASPUR offers both computing resources and expertise in the field of scientific and technical computing. From the start, CASPUR has offered its users a choice of computer architectures suited to their specific needs. The current installation consists of three systems: an IBM Power5, an AMD Opteron cluster and an NEC SX-6. Together the machines provide an aggregate peak performance of 2.2 teraflops.
 
Thanks to this diversity of computers, CASPUR is able to satisfy the varied needs of its user community, which spans domains such as materials science, life sciences, fluid dynamics, earthquake simulation, oceanography and bioinformatics. Although its installed computing power is small compared to the most powerful European supercomputing centres, CASPUR follows a policy of continuous updating, so that at any moment in time one of its installed computers is of the latest generation. Located in Rome, with its four public universities, notably Università “La Sapienza”, the largest university in Europe with about 150,000 students, CASPUR offers HPC services to a very large number of research groups in both academia and research centres.

From the very beginning, CASPUR's HPC strategy has been to balance computing power with computing expertise and user support. This is reflected in the growth of the HPC unit's staff in line with the growth of users and computing power. CASPUR not only supports the porting of users' or community software onto its machines, but also supports the optimisation and development of new scientific software of professional quality. Applications developed and optimised in collaboration with CASPUR are then used at larger computer facilities.

“Our keyword is HPCSS: High Performance Computing Scientific Services,” said Dr Serio. “The strategy for implementing this vision is to maintain strong collaborations with leading research groups in many scientific areas. Sometimes these collaborations are of a scientific nature. This enables us to better understand the current and future HPC needs of computational scientists. For example, CASPUR is doing research in the field of opto-electronics in collaboration with Prof. Roberto Car from Princeton, Prof. M. Knupfer from the Leibniz Institute, Dresden, and Amadeo Palma of the Istituto per lo Studio dei Materiali Nanostrutturati (ISMM), Consiglio Nazionale delle Ricerche. They have recently demonstrated that it is possible to build nano-structured materials, in particular nano-wires, within an organic matrix. Although in this instance the organic molecule was PTCDA, the potential is there for other organic materials to be finely tuned to host different kinds of nano-materials.”

In the European and global HPC context, CASPUR believes it is crucially important to form a “Network of Competence” amongst small and medium-sized computing centres. Whilst these centres cannot compete with major centres in terms of computing power, they can make an important contribution by disseminating the HPC culture and supporting the broad scientific community, which often does not have access to the larger European computer facilities.

CASPUR's strategic policy is in line with the European “Lisbon Agenda”, which states that regions have a key role to play in building a knowledge-based society. This is especially important in light of the small role played by European hardware manufacturers on the global landscape. Implementing a strategy based on strengthening expertise in middleware and application software, as well as other HPC services, would make the European computing ecosystem complementary to that of the U.S. and Japan. It would encourage and promote the formation of European expertise, which is of crucial importance in the dynamic services market. CASPUR is promoting this approach at every level of the European institutions responsible for supercomputing programmes.

Let's move on to the quantum molecular sciences workshop. The workshop is part of the computational science forum, “Forward Look”, organised by the European Science Foundation (ESF). The initiative aims to define the scientific software infrastructure that should support the research activities of European computational scientists, complementing hardware developments. Although analogous software initiatives (e.g., the PITAC report, the Blue Ribbon report, the NSF cyberinfrastructure and the SciDAC program) are under way in the United States, with similar activities pursued in Japan, in Europe software is not yet widely recognised as a critical component of infrastructure. It is underrepresented in research budgets, and consequently there is a scarcity of experts able to build professional software. As Dr Serio put it: “This has some odd effects. Users practise self-censorship and depend on American and Japanese developers to define tomorrow's computing software, and this makes Europe fall even farther behind.”

This lack of a European software infrastructure for developing computer codes, together with the support to deploy, maintain and evolve these codes and the technical training of the next generation of computational scientists, is hindering the advancement of European science in many fields. Eventually it could lead to European computational scientists losing their world-leading position.

During this “Forward Look” exercise there will be six parallel, community-level workshops on chemistry, astrophysics and astronomy, fluid dynamics, life sciences, bio and non-bio materials sciences, and climate and environmental sciences. Based on how each field is expected to evolve and on the needs of the scientific community, a final report will present a strategy for the software, hardware and support infrastructures needed in the European context.

The concept of a cyberinfrastructure includes all the developments that have to be put in place to take full advantage of the possibilities offered by the rapid growth of hardware and the Grid. Code developers across Europe should be able to work in a collaborative and integrated way: sharing data, building specific codes, and using remote facilities and graphic tools. It is anticipated that this infrastructure will operate within a European context and will emerge from an analysis of the actual needs of the scientists involved and their vision of how these needs can be met.

The “Forward Look” initiative plans to issue a final report in mid-2007 containing recommendations and a roadmap which, once implemented, should enable European computational scientists to remain competitive with world-class capabilities. The report will be presented to funding agencies and influential European institutions such as the European Commission, ESFRI (European Strategy Forum on Research Infrastructures) and HPCN centres.

Computational sciences, and computer simulations in particular, are playing an ever-growing role in fundamental and applied sciences, and this success rests to a large extent on the development of computers. The Internet has also changed the way science is organised. In addition to a European supercomputer infrastructure, the scientific software developed by the various European groups is becoming increasingly important from a strategic point of view. Whereas ten years ago every research group could develop its own research codes, the complexity and sophistication of today's methods have grown to such an extent that many groups rely on third-party software. For example, in the field of ab-initio and molecular materials science, the four most important European codes are used as “research instruments” in over 2,000 publications in international journals per year.

There are presently many packages in use: ACESII, CPMD, Columbus, Crystal, Dalton, Dirac, FLEUR, GAMESS-UK, MNDO97, MolCas, MolPro, NWChem, Orca, Q-Chem, Quickstep, SIESTA, Turbomole, VASP and many more. Europe is not only the home of some of the most widely used packages for electronic structure simulations; it is also the cradle of the innovations that have made possible the extraordinary successes of this discipline over the last quarter century.

For Europe to maintain its leading position, and to ensure that as many European research groups as possible take advantage of this unique position, it is essential to develop a European-focused strategy for the efficient use of resources. Many HPC centres support local software development, but since most of the software is used across Europe, this support would be even more efficient if it were coordinated at the European level. More importantly, the availability of a cyberinfrastructure should create new opportunities for science. For example, combining the expertise of the Psi-k and MolSimu communities allows electronic structure calculations to be linked with techniques for computing the macroscopic phase diagram of a material. A well-designed European cyberinfrastructure can deal with the complexity of the various codes and the contributions of many different groups.
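To make that example concrete, here is a minimal Python sketch of such a multi-scale coupling, using entirely invented numbers and a deliberately crude quadratic equation of state: energy-versus-volume data of the kind an electronic structure code produces feed a classical thermodynamic step that estimates where two crystal phases coexist. It illustrates the shape of the workflow only; it is not the method of any package or community named above.

```python
# Illustrative multi-scale coupling: "ab-initio" E(V) data (made up here)
# per phase -> fitted equation of state -> coexistence pressure.
import numpy as np

# Placeholder energy-versus-volume points for two phases (eV, Angstrom^3),
# standing in for what an electronic structure code would compute.
volumes = np.linspace(60.0, 90.0, 16)
energy_phase_a = 0.004 * (volumes - 74.0) ** 2 - 8.0
energy_phase_b = 0.006 * (volumes - 68.0) ** 2 - 7.9

def fit_eos(v, e):
    """Fit a quadratic E(V) and return (E0, V0, curvature k)."""
    c2, c1, c0 = np.polyfit(v, e, 2)
    v0 = -c1 / (2.0 * c2)
    e0 = np.polyval([c2, c1, c0], v0)
    return e0, v0, c2

def transition_pressure(ea, va, ka, eb, vb, kb):
    """Pressure at which the two Gibbs energies are equal.

    For E = E0 + k (V - V0)^2 one gets G(P) = E0 + P*V0 - P^2/(4k),
    so setting G_a = G_b reduces to a quadratic in P. We take the
    smallest positive root (the second root is an artifact of the
    crude quadratic equation of state).
    """
    coeffs = [1.0 / (4.0 * kb) - 1.0 / (4.0 * ka), va - vb, ea - eb]
    roots = np.roots(coeffs)
    real_roots = roots[np.isreal(roots)].real
    return real_roots[real_roots > 0].min()

ea, va, ka = fit_eos(volumes, energy_phase_a)
eb, vb, kb = fit_eos(volumes, energy_phase_b)
print("estimated coexistence pressure (eV/A^3):",
      transition_pressure(ea, va, ka, eb, vb, kb))
```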

Another issue is that the range of users is broadening — from theoreticians developing improved algorithms and new types of applications, to experimentalists who need to use simulation to interpret complex data. Moreover, it is not just a matter of distributing software. Essential elements of the success of a cyberinfrastructure are appropriate training and user support, geared to the high level of expertise needed to use the software reliably.

Since the “Forward Look” initiative involves several “Grand Challenge” fields, each workshop is tasked to produce a report answering a set of general questions: the current and future importance of computational science in the field, with examples of applications; the current cyberinfrastructure in the field, with its strengths and weaknesses for computational science; and the cyberinfrastructure needs for the next 10 to 20 years, together with the path to achieve this new vision.

The desirable cyberinfrastructure presupposes that the key scientific questions to be addressed in the coming 10 to 20 years are understood. Setting up such an infrastructure is a multidisciplinary activity, which could benefit from expertise in state-of-the-art developments in hardware (grid, middleware, etc.) and computer science (integrated code development, good practice, multi-scale multi-code issues). One also has to evaluate emerging technologies and how they can be used.

How should the European computer infrastructure, in both hardware and software, be organised so that Europe has a competitive advantage? The view from this workshop was that a software infrastructure can take many forms, but it should have a distinct, independent organisational structure, say an institute, to look after the community's interests. The operational aspects can be distributed (i.e., virtual), with personnel attached to existing supercomputer centres, or housed in a single central building. In addition, the availability of a cyberinfrastructure may create opportunities for new science, for example by coupling computational methods to databases. Training and tutoring are also of great importance, both in setting up such a cyberinfrastructure and in ensuring it can be used by the European scientific community at large.

Another issue is how to build on existing groupings, for example the Psi-k and MolSimu communities, the UK's CCP1 for the electronic structure of molecules and CCP6 for heavy particle dynamics, as well as on other communities such as astrophysics, computational fluid dynamics and bioinformatics.

From the workshops that have taken place to date, a vision for computational science in the coming 10 to 20 years is beginning to crystallise. In the short term it includes consolidation and maintenance of the current major software packages. Another need is porting codes onto new architectures, as well as providing documentation and support. Standardisation of input/output, covering both file formats and data naming, for interoperability across languages and packages is also seen as a worthwhile goal.
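To illustrate what such input/output standardisation buys, the hedged Python sketch below maps two hypothetical package-native outputs onto a single self-describing record, so that downstream tools need to understand only one format. Every field name and both “native” formats here are invented for the example; none is drawn from a real code.

```python
# Illustrative I/O standardisation: package-specific results are mapped
# onto one common record, decoupling analysis tools from each package's
# native output format. Schema and formats are hypothetical.
import json
from dataclasses import dataclass, asdict

@dataclass
class ScfRecord:
    """Common record for a converged SCF result (illustrative schema)."""
    package: str
    total_energy_hartree: float
    n_basis_functions: int
    converged: bool

def from_package_a(text: str) -> ScfRecord:
    """Parse a hypothetical 'key=value' native output."""
    fields = dict(line.split("=") for line in text.strip().splitlines())
    return ScfRecord(
        package="package_a",
        total_energy_hartree=float(fields["etot"]),
        n_basis_functions=int(fields["nbf"]),
        converged=fields["scf_ok"].strip() == "yes",
    )

def from_package_b(blob: dict) -> ScfRecord:
    """Map a hypothetical JSON-native output onto the common record."""
    return ScfRecord(
        package="package_b",
        total_energy_hartree=blob["energy"]["total"],
        n_basis_functions=blob["basis"]["size"],
        converged=blob["status"] == "converged",
    )

# Whatever produced them, the records serialise identically downstream.
rec = from_package_a("etot=-76.0267\nnbf=58\nscf_ok=yes")
print(json.dumps(asdict(rec), indent=2))
```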

In the long term there is a need for an infrastructure to rationalise existing software efforts by creating generic libraries, toolkits and APIs where appropriate. The new researcher is expected to be proficient in multiple disciplines: not only a science domain such as chemistry, but also mathematics and computational science. This requires a funding stream to support current research and a programme to train the new generation of multidisciplinary researchers.
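The value of such generic toolkits and APIs is easiest to see in code. In the illustrative Python sketch below, a driver written once against a small abstract interface can reuse any backend that implements it; the Calculator protocol, the toy one-dimensional backend and the steepest-descent driver are all hypothetical, chosen only to make the architectural point.

```python
# Illustrative generic API: one driver, many interchangeable backends.
from typing import Protocol
import math

class Calculator(Protocol):
    """Minimal interface a compute backend must provide (hypothetical)."""
    def energy(self, x: float) -> float: ...
    def gradient(self, x: float) -> float: ...

class ToyMorse:
    """Stand-in for a real electronic-structure backend (1-D Morse-like well)."""
    def energy(self, x: float) -> float:
        return (1.0 - math.exp(-(x - 1.5))) ** 2
    def gradient(self, x: float) -> float:
        e = math.exp(-(x - 1.5))
        return 2.0 * (1.0 - e) * e

def minimise(calc: Calculator, x0: float,
             step: float = 0.5, tol: float = 1e-8) -> float:
    """Generic steepest-descent driver: works with any Calculator."""
    x = x0
    for _ in range(10_000):
        g = calc.gradient(x)
        if abs(g) < tol:
            break
        x -= step * g
    return x

print("minimum near:", minimise(ToyMorse(), x0=2.5))  # expect ~1.5
```

This is the same pattern that would let a single geometry-optimisation or phase-diagram driver sit on top of many of the electronic structure packages listed earlier, which is precisely the rationalisation the long-term vision calls for.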

I look forward to the final report to see whether Europe has the foresight and political will to fund cyberinfrastructure on a long-term basis in order to empower European scientists to remain competitive in this strategic field. The viability of the European economies and the welfare of 500 million Europeans may depend on it.

—–

Copyright (c) Christopher Lazou, HiPerCom Consultants, Ltd., UK. December 2006. Brands and names are the property of their respective owners.
