European Cyberinfrastructure in the Making

By Christopher Lazou

December 1, 2006

In Rome on November 24-25, CASPUR (Consorzio per le Applicazioni del Supercalcolo Per Università e Ricerca) hosted an excellent “Forward Look” workshop on quantum molecular sciences at the prestigious Accademia dei Lincei, whose past members include Galileo Galilei. The workshop is part of the European Computational Science Forum, “The Lincei Initiative: from computers to scientific excellence.” The aim of “Forward Look” is to develop a vision of how the computational sciences will evolve over the coming 10 to 20 years and to create an infrastructure to support them. During the workshop I listened to the proceedings and spoke with many of the leading players.

I visited CASPUR in the eternal city of Rome, so before I report on the workshop let's see what a medium-size computer centre such as CASPUR offers to the scientific community. Although the large centres steal the limelight, medium-size centres not only provide the “bread and butter” for scientists, but also account for a major part of HPC usage.

Established in 1992, CASPUR is a consortium of nine universities in central and southern Italy. Led by Dr Carlo Maria Serio, the consortium has about 80 staff, of whom about 30 are members of the HPC unit. CASPUR offers both computing resources and expertise in the field of scientific and technical computing. From the start, CASPUR has offered its users a choice of computer architectures suited to their specific needs. The current installation consists of three systems: an IBM Power5, an AMD Opteron cluster and an NEC SX-6, providing an aggregate peak performance of 2.2 teraflops.
 
Thanks to this diversity of computers, CASPUR is able to satisfy the different needs of its user community, which spans domains such as materials science, life sciences, fluid dynamics, earthquake simulation, oceanography and bioinformatics. Although the installed computing power is small compared to the most powerful European supercomputing centres, CASPUR follows a policy of continuous updating, so that at any moment in time one of its installed computers is of the latest generation. Located in Rome, with its four public universities, and in particular Università “La Sapienza”, the largest European university with about 150,000 students, CASPUR offers HPC services to a very large number of research groups, both in academia and in research centres.

From the very beginning, CASPUR's HPC strategy has been to balance computing power with computing expertise and user support. This is reflected in the growth of the HPC unit's staff in line with the growth in users and computing power. CASPUR not only offers support for porting users' or community software onto its machines, but also supports the optimisation and development of new scientific software of professional quality. These applications, developed and optimised in collaboration with CASPUR, are then used at larger computer facilities.

“Our keyword is HPCSS: High Performance Computing Scientific Services,” said Dr Serio. “The strategy for implementing such a vision is to have strong collaborations with leading research groups in many scientific areas. Sometimes these collaborations are of a scientific nature. This enables us to better understand the current and future HPC needs of computational scientists. For example, CASPUR is doing research in the field of opto-electronics in collaboration with Prof. Roberto Car from Princeton, Prof. M. Knupfer from the Leibniz Institute, Dresden, and Amadeo Palma of the Istituto per lo Studio dei Materiali Nanostrutturati (ISMM), Consiglio Nazionale delle Ricerche. They have recently demonstrated that it is possible to build nano-structured materials, in particular nano-wires, within an organic matrix. Although in this instance the organic molecule was PTCDA, the potential is there for other organic materials to be finely tuned to host different kinds of nano-materials.”

In the European and global HPC context, CASPUR believes it is crucially important to form a “Network of Competence” amongst small and medium computing centres. Whilst these centres cannot compete with major centres in terms of computing power, they can make an important contribution by disseminating the HPC culture and supporting the broad scientific community, which often does not have access to the larger European computer facilities.

CASPUR's strategic policy is in line with the European “Lisbon Agenda”, which states that regions have a key role to play in building a knowledge-based society. This is very important in light of the small role played by European hardware manufacturers on the global landscape. Implementing a strategy based on strengthening expertise in middleware and application software, as well as other HPC services, would make the European computing ecosystem complementary to that of the U.S. and Japan. It would encourage and promote the formation of European expertise, which is of crucial importance in the dynamic services market. CASPUR is promoting this approach at every level of the European institutions responsible for supercomputing programmes.

Let's move on to the quantum molecular sciences workshop. The workshop is part of the computational science forum, “Forward Look”, organised by the European Science Foundation (ESF). This initiative aims to define the scientific software infrastructure that should support the research activities of European computational scientists, complementing hardware developments. Although analogous software initiatives (e.g., the PITAC report, the Blue Ribbon report, the NSF cyberinfrastructure and the SciDAC program) are ongoing in the United States, with similar activities pursued in Japan, in Europe software is not yet widely recognised as a critical component of infrastructure. It is under-represented in research budgets, and consequently there is a scarcity of experts for building professional software. As Dr Serio put it: “This has some odd effects. Users practise self-censorship and depend on the American and Japanese developers to define tomorrow's computing software, and this makes Europe fall even farther behind.”

The lack of a European software infrastructure for developing computer codes, of support to deploy, maintain and evolve these codes, and of technical training for the next generation of computational scientists is hindering the advancement of European science in many fields. Eventually it would lead to European computational scientists losing their world-leading position.

During this “Forward Look” exercise there will be six parallel, community-level workshops on chemistry, astrophysics and astronomy, fluid dynamics, life sciences, bio and non-bio materials sciences, and climate and environmental sciences. Based on how each field is expected to evolve and on the needs of its scientific community, a final report will set out a strategy for the software, hardware and support infrastructures needed in the European context.

The concept of a cyberinfrastructure includes all the developments that have to be put in place to take full advantage of the possibilities offered by the rapid growth of hardware and the Grid. Code developers across Europe should be able to work in a collaborative and integrated way, sharing data, building specific codes, and using remote facilities and graphic tools. It is anticipated that this infrastructure will operate within a European context and will result from an analysis of the actual needs of the scientists involved and their vision of how these needs can be met.

The “Forward Look” initiative plans to issue a final report in mid-2007 containing recommendations and a roadmap which, once implemented, should enable European computational scientists to remain competitive with world-class capabilities. The report will be presented to funding agencies and influential European institutions, such as the European Commission, ESFRI (the European Strategy Forum on Research Infrastructures) and HPCN centres.

Computational sciences, and computer simulations in particular, are playing an ever-growing role in fundamental and applied sciences, and this success rests to a large extent on the development of computers. The Internet has also changed the way science is organized. In addition to a European supercomputer infrastructure, the scientific software developed by the various European groups is becoming increasingly important from a strategic point of view. Whereas ten years ago every research group could develop its own research codes, nowadays the complexity and sophistication of the methods in use have grown to such an extent that many groups rely on third-party software. For example, in the field of (ab-initio and molecular) materials science, the four most important European codes are used as a “research instrument” in over 2000 publications in international journals per year.

There are presently many packages in use: ACESII, CPMD, Columbus, Crystal, Dalton, Dirac, FLEUR, GAMESS-UK, MNDO97, MolCas, MolPro, NWChem, Orca, Q-Chem, Quickstep, SIESTA, Turbomole, VASP and many more. Europe is not only home to some of the most widely used packages for electronic structure simulations, but also the cradle of the innovations that have made possible the extraordinary successes in this discipline over the last quarter century.

For Europe to maintain a leading position and also ensure that as many European research groups as possible take advantage of this unique position, it is essential to develop a European-focused strategy for efficient use of resources. Many HPC centres give support for local software development, but since most of the software is used across Europe, this support could be even more efficient if it were coordinated at the European level. More importantly, the availability of a cyberinfrastructure should create new opportunities for science. For example, combining the expertise of the Psi-k and MolSimu communities allows them to link electronic structure calculations with techniques to compute a macroscopic phase diagram of a material. A well-designed European cyberinfrastructure can deal with the complexity of the various codes and the contributions of many different groups.
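To make the multi-scale idea concrete, here is a deliberately simplified sketch. It is my own toy illustration, not the Psi-k or MolSimu communities' actual workflow or code, and every numerical value in it is hypothetical: pair energies of the kind an electronic structure code would produce are fitted to a classical Lennard-Jones potential, which is then used to estimate a macroscopic thermodynamic quantity, the second virial coefficient, which enters the equation of state and hence the phase behaviour of a material.

```python
# Toy multi-scale sketch: from "ab initio"-style pair energies to a
# macroscopic thermodynamic quantity. Illustration only; the energies,
# parameters and units below are hypothetical.
import numpy as np

# Step 1: pair energies that would normally come from an electronic
# structure code (separation in Angstrom, energy in eV).
r_data = np.array([3.0, 3.4, 3.8, 4.2, 4.6, 5.0, 6.0])
e_data = np.array([0.14, -0.001, -0.016, -0.013, -0.009, -0.006, -0.002])

# Step 2: fit a Lennard-Jones potential u(r) = A/r^12 - B/r^6,
# which is linear in A and B, by ordinary least squares.
design = np.column_stack([r_data**-12.0, -(r_data**-6.0)])
(A, B), *_ = np.linalg.lstsq(design, e_data, rcond=None)
sigma = (A / B) ** (1.0 / 6.0)       # LJ size parameter (Angstrom)
epsilon = B * B / (4.0 * A)          # LJ well depth (eV)

def u_lj(r):
    """Fitted classical pair potential."""
    return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

# Step 3: second virial coefficient
# B2(T) = -2*pi * integral of (exp(-u(r)/kT) - 1) r^2 dr,
# a macroscopic quantity linked to the equation of state.
kB = 8.617e-5                        # Boltzmann constant, eV/K
r, dr = np.linspace(0.6 * sigma, 10.0 * sigma, 4000, retstep=True)
for T in (150.0, 300.0, 600.0):
    integrand = (np.exp(-u_lj(r) / (kB * T)) - 1.0) * r**2
    B2 = -2.0 * np.pi * np.sum(integrand) * dr
    print(f"T = {T:5.0f} K   B2 ~ {B2:9.1f} cubic Angstrom")
```

In a real setting the quantum-level energies would come from one of the electronic structure packages listed above and the fitted model would drive a full molecular simulation, but the division of labour between the quantum and classical scales is the same, and it is exactly this hand-over between codes and communities that a well-designed cyberinfrastructure has to support.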

Another issue is that the range of users is broadening — from theoreticians developing improved algorithms and new types of applications, to experimentalists who need to use simulation to interpret complex data. Moreover, it is not just a matter of distributing software. Essential elements of the success of a cyberinfrastructure are appropriate training and user support, geared to the high level of expertise needed to use the software reliably.

Since the “Forward Look” initiative involves several “Grand Challenge” fields, each workshop is tasked with producing a report answering a set of general questions: the current and future importance of computational science in its field, with examples of applications; the current cyberinfrastructure in that field, describing its strengths and weaknesses for computational science; and the cyberinfrastructure needs for the next 10-20 years, together with the path to achieve this new vision.

The desirable cyberinfrastructure presupposes that the key scientific questions to be addressed in the coming 10-20 years are understood. Setting up such an infrastructure is a multidisciplinary activity, which could benefit from expertise in state-of-the-art developments in hardware (grid, middleware, etc.) and computer science (integrated code development, good practice, multi-scale and multi-code issues). One has to evaluate the emerging technologies and how they can be used.

How should the European computing infrastructure, both hardware and software, be organised so that Europe has a competitive advantage? The view from this workshop was that a software infrastructure can take many forms, but should have an independent, distinct organizational structure, say an institute, to look after the community's interests. The operational aspects can be distributed (i.e., virtual), with personnel attached to existing supercomputer centres, or concentrated in a single central building. In addition, the availability of a cyberinfrastructure may create opportunities for new science, for example by coupling computational methods to databases. Training and tutoring on how to set up such a cyberinfrastructure is of great importance, and also ensures it can be used by the European scientific community at large.

Another issue is how to build on existing groupings, for example the Psi-k and MolSimu communities, the UK's CCP1 for the electronic structure of molecules and CCP6 for heavy particle dynamics, as well as on other communities such as astrophysics, computational fluid dynamics and bioinformatics.

From the workshops that have taken place to date, a vision for computational science in the coming 10 to 20 years is beginning to crystallize. In the short term it includes consolidation and maintenance of the current major software, along with porting codes onto new architectures and providing documentation and support. Standardisation of input/output, covering both file formats and data naming, for interoperability across the various languages and packages is also seen as a worthwhile goal.
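As a small illustration of what such input/output standardisation could mean in practice, here is a minimal sketch of a shared, self-describing summary format. The schema, field names, units and values are hypothetical rather than any agreed standard; the point is simply that if every package also wrote a small summary with agreed names, results from different codes would become directly comparable and machine-readable.

```python
# Minimal sketch of a package-independent result summary. The schema,
# key names and numbers are hypothetical, used only to illustrate the
# idea of standardised output for interoperability.
import json

def write_summary(path, code_name, total_energy_hartree, geometry_angstrom):
    """Write a small summary file with agreed key names and units."""
    record = {
        "schema_version": "0.1",                 # hypothetical versioned schema
        "code_name": code_name,                  # which package produced the run
        "total_energy_hartree": total_energy_hartree,
        "geometry_angstrom": geometry_angstrom,  # list of [symbol, x, y, z]
    }
    with open(path, "w") as f:
        json.dump(record, f, indent=2)

def read_summary(path):
    """Read a summary regardless of which package wrote it."""
    with open(path) as f:
        return json.load(f)

# Two fictitious runs from different codes become directly comparable.
water = [["O", 0.000, 0.000, 0.000],
         ["H", 0.957, 0.000, 0.000],
         ["H", -0.240, 0.927, 0.000]]
write_summary("run_a.json", "CodeA", -76.0267, water)
write_summary("run_b.json", "CodeB", -76.0271, water)
for path in ("run_a.json", "run_b.json"):
    run = read_summary(path)
    print(run["code_name"], run["total_energy_hartree"])
```

Whether an eventual convention were built on XML, HDF5 or something else is a secondary question; what matters for interoperability is community agreement on the names, units and structure of the data being exchanged.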

In the long term, there is a need for an infrastructure to rationalize existing software efforts by creating generic libraries, toolkits and APIs where appropriate. The new researcher is expected to be proficient in multiple disciplines: not only a science domain such as chemistry, but also mathematics and computational science. This requires a funding stream to support current research and a programme to train the new generation of multi-disciplinary researchers.

I look forward to the final report to see whether Europe has the foresight and political will to fund cyberinfrastructure on a long-term basis in order to empower European scientists to remain competitive in this strategic field. The viability of the European economies and the welfare of 500 million Europeans may depend on it.

—–

Copyright (c) Christopher Lazou, HiPerCom Consultants, Ltd., UK. December 2006. Brands and names are the property of their respective owners.
