European Cyberinfrastructure in the Making

By Christopher Lazou

December 1, 2006

In Rome on November 24-25, CASPUR (Consorzio per le Applicazioni del Supercalcolo Per Università e Ricerca) hosted an excellent “Forward Look” workshop on quantum molecular sciences at the prestigious Accademia dei Lincei, whose past members include Galileo Galilei. This is part of the European Computational Science Forum: the Lincei Initiative: from computers to scientific excellence. The aim of “Forward Look” is to develop a vision of how the computational sciences will evolve in the coming 10 to 20 years and to create an infrastructure to support them. During this workshop on quantum molecular sciences I listened to the proceedings and spoke with many of the leading players.

I visited CASPUR in the eternal city of Rome, so before I report on the workshop let's see what a medium-sized computer centre such as CASPUR offers to the scientific community. Although the large centres steal the limelight, medium-sized centres are not only the “bread and butter” for scientists, but also account for a major part of HPC usage.

Established in 1992, CASPUR is a consortium of nine universities in central and southern Italy. Led by Dr Carlo Maria Serio, the consortium has about 80 staff, of whom about 30 are members of the HPC unit. CASPUR offers both computing resources and expertise in the field of scientific and technical computing. From the start, CASPUR has offered its users a choice of computer architectures suited to their specific needs. The current installation consists of three systems: an IBM Power5, an AMD Opteron cluster and an NEC SX-6 computer. The machines provide an aggregate of 2.2 teraflops peak performance.
Thanks to this diversity of computers, CASPUR is able to satisfy the different needs of its user community, representing domains such as materials science, life sciences, fluid dynamics, earthquake simulation, oceanography and bioinformatics. Although the installed computing power is small compared to the most powerful European supercomputing centres, CASPUR follows a policy of continuous updating, so that at any moment in time one of the installed computers is of the latest generation. Located in Rome, with its four public universities, notably Università “La Sapienza”, the largest European university with about 150,000 students, CASPUR offers HPC services to a very large number of research groups, both in academia and in research centres.

CASPUR's HPC strategy, from the very beginning, has been to balance computing power with computing expertise and user support. This is reflected in the growth of the HPC unit's staff in line with the growth of users and computing power. CASPUR not only offers support for porting users' or community software to its machines, but also supports the optimisation and development of new scientific software of professional quality. These applications, developed and optimised in collaboration with CASPUR, are then used at larger computer facilities.

“Our keyword is HPCSS: High Performance Computing Scientific Services”, said Dr Serio. “The strategy for implementing such a vision is to have strong collaborations with leading research groups in many scientific areas. Sometimes these collaborations are of a scientific nature. This enables us to better understand the current and future HPC needs of computational scientists. For example, CASPUR is doing research in the field of opto-electronics in collaboration with Prof. Roberto Car, from Princeton, Prof. M. Knupfer from the Leibniz Institute, Dresden, and Amadeo Palma, Istituto per lo Studio dei Materiali Nanostrutturati (ISMM), Consiglio Nazionale delle Ricerche. They have recently demonstrated that it is possible to build nano-structured materials, in particular nano-wires, within an organic matrix. Although in this instance the organic molecule was PTCDA, the potential is there for other organic materials to be finely tuned to host different kinds of nano-materials”.

In the European and global HPC context, CASPUR believes it is crucially important to form a “Network of Competence” amongst small and medium computing centres. Whilst these centres cannot compete with major centres in terms of computing power, they can provide an important contribution in disseminating the HPC culture and supporting the broad scientific community, which often does not have access to larger European computer facilities.

CASPUR's strategic policy is in line with the European “Lisbon Agenda”, which states that regions have a key role to play in building a knowledge-based society. This is very important in light of the small role played by European hardware manufacturers on the global landscape. The implementation of a strategy based on strengthening expertise in middleware and application software, as well as other HPC services, would make the European computing ecosystem complementary to that of the U.S. and Japan. It would encourage and promote the formation of European expertise, which is of crucial importance in the dynamic services market. CASPUR is promoting this approach at every level of the European institutions responsible for supercomputing programmes.

Let's move on to the quantum molecular sciences workshop. The workshop is part of the computational science forum, “Forward Look”, organised by the European Science Foundation (ESF). This initiative aims to define the scientific software infrastructure that should support the research activities of European computational scientists, complementing hardware developments. Although analogous software initiatives (e.g., the PITAC report, the Blue Ribbon report, the NSF cyberinfrastructure and the SciDAC program) are ongoing in the United States, with similar activities pursued in Japan, in Europe software is not yet widely recognised as a critical component of infrastructure. It is underrepresented in research budgets, and consequently there is a scarcity of experts for building professional software. As Dr Serio put it: “This has some odd effects. Users practise self-censorship and depend on the American and Japanese developers to define tomorrow's computing software, and this makes Europe fall even farther behind”.

The lack of a European software infrastructure for developing computer codes, of support to deploy, maintain and evolve these codes, and of technical training for the next generation of computational scientists is hindering the advancement of European science in many fields. Eventually this could lead to European computational scientists losing their world-leading position.

During this “Forward Look” exercise there will be six parallel, community-level workshops on chemistry, astrophysics and astronomy, fluid dynamics, life sciences, bio and non-bio materials sciences, and climate and environmental sciences. Based on how each field is expected to evolve and on the needs of the scientific community, a final report will set out a strategy for the software, hardware and support infrastructures needed in the European context.

The concept of a cyberinfrastructure includes all the developments that have to be put in place to take full advantage of the possibilities offered by the rapid growth of hardware and the Grid. Code developers across Europe should be able to work in a collaborative and integrated way — sharing data, building specific codes, and using remote facilities and graphic tools. It is anticipated that this infrastructure will operate within a European context and will result from an analysis of the actual needs of the scientists involved and their vision of how these needs can be met.

The “Forward Look” initiative plans to issue a final report in mid-2007 containing recommendations and a roadmap which, once implemented, should enable European computational scientists to remain competitive with world-class capabilities. The report will be presented to funding agencies and influential European institutions, such as the European Commission, ESFRI (European Strategy Forum on Research Infrastructures) and HPCN centres.

Computational sciences, and computer simulations in particular, are playing an ever-growing role in fundamental and applied sciences, and this success is based to a large extent on the development of computers. The Internet has also changed the way science is organized. In addition to a European supercomputer infrastructure, the scientific software developed by the various European groups is becoming increasingly important from a strategic point of view. Whereas ten years ago every research group could develop its own research codes, nowadays the complexity and sophistication of the methods in use have grown to such an extent that many groups rely on third-party software. For example, in the field of ab-initio and molecular materials science, the four most important European codes have been used as a “research instrument” in over 2000 publications in international journals per year.

There are presently many packages in use: ACESII, CPMD, Columbus, Crystal, Dalton, Dirac, FLEUR, GAMESS-UK, MNDO97, MolCas, MolPro, NWChem, Orca, Q-Chem, Quickstep, SIESTA, Turbomole, VASP and many more. Europe is not only the home of some of the most widely used packages for electronic structure simulations, but also the cradle of the innovations that have made possible the extraordinary successes in this discipline over the last quarter century.

For Europe to maintain a leading position, and to ensure that as many European research groups as possible take advantage of this unique position, it is essential to develop a European-focused strategy for the efficient use of resources. Many HPC centres support local software development, but since most of the software is used across Europe, this support could be even more efficient if it were coordinated at the European level. More importantly, the availability of a cyberinfrastructure should create new opportunities for science. For example, combining the expertise of the Psi-k and MolSimu communities allows them to link electronic structure calculations with techniques for computing the macroscopic phase diagram of a material. A well-designed European cyberinfrastructure can deal with the complexity of the various codes and the contributions of many different groups.

Another issue is that the range of users is broadening — from theoreticians developing improved algorithms and new types of applications, to experimentalists who need to use simulation to interpret complex data. Moreover, it is not just a matter of distributing software. Essential elements of the success of a cyberinfrastructure are appropriate training and user support, geared to the high level of expertise needed to use the software reliably.

Since the “Forward Look” initiative involves several “Grand Challenge” fields, each workshop is tasked to produce a report answering general questions: the current and future importance of computational science in the field, with examples of applications; the current cyberinfrastructure in the field, describing its strengths and weaknesses for computational science; and the cyberinfrastructure needs for the next 10-20 years, with the path to achieve this new vision.

The desirable cyberinfrastructure presupposes that the key scientific questions to be addressed in the coming 10-20 years are understood. Setting up such an infrastructure is a multidisciplinary activity, which could benefit from expertise in state-of-the-art developments in hardware (grid, middleware, etc.) and computer science (integrated code development, good practice, multi-scale multi-code issues). One has to evaluate the emerging technologies and how they can be used.

How should the European computer infrastructure, in both hardware and software, be organised so that Europe has a competitive advantage? The view from this workshop was that a software infrastructure can take many forms, but should have an independent, distinct organizational structure — say, an institute — to look after the community's interests. The operational aspects can be distributed (i.e., virtual), with personnel attached to existing supercomputer centres, or concentrated in a single central building. In addition, the availability of a cyberinfrastructure may create opportunities for new science, for example, by coupling computational methods to databases. Training and tutoring on how to set up such a cyberinfrastructure is of great importance, and also ensures it can be used by the European scientific community at large.

Another issue is how to build on existing groupings, for example, the Psi-k and MolSimu communities, the UK CCP1 for electronic structure of molecules, CCP6 for heavy particle dynamics, and on other communities such as astrophysics, computational fluid dynamics, bioinformatics, and so on.

From the workshops that have taken place to date, a vision for computational science in the coming 10 to 20 years is beginning to crystallize. In the short term it includes consolidation and maintenance of current major software. Another need is porting codes to new architectures, as well as providing documentation and support. Standardisation of input/output, in both file formats and data naming, for interoperability across various languages and packages is also seen as a worthwhile goal.

In the long term, there is a need for an infrastructure to rationalize existing software efforts by creating generic libraries, toolkits and APIs where appropriate. The new researcher is expected to be proficient in multiple disciplines: not only a science domain such as chemistry, but also mathematics and computational science. This requires a funding stream to support current research and a programme to train the new generation of multi-disciplinary researchers.

I look forward to the final report, to see whether Europe has the foresight and political will to fund cyberinfrastructure on a long-term basis and so empower European scientists to remain competitive in this strategic field. The viability of the European economies and the welfare of 500 million Europeans may depend on it.


Copyright (c) Christopher Lazou, HiPerCom Consultants, Ltd., UK. December 2006. Brands and names are the property of their respective owners.
