When I touched down in Barcelona on Saturday afternoon, it was warm, sunny, and oh so Spanish. I was greeted at my hotel with a glass of Cava to sip and treated to a tour of the historic building. A short rest, a walk around Barcelona, and a little bit of work filled the time until dinner, at 8pm.
On Tuesday morning, PRACEdays 2017 commenced as part of the European HPC Summit Week. The program began with a welcome by Sergi Girona, EXDCI coordinator, and Serge Bogaerts, managing director of PRACE, outlining the week of plenaries, keynotes, breakout sessions, BoFs, and poster sessions. There will be a lot to see and learn this week in Barcelona!
PRACE Council Chair Anwar Osseyran was next with a detailed overview of PRACE achievements and the challenges ahead. PRACE prides itself on providing open access to the best HPC systems for European scientists. Its criterion: scientific excellence.
The PRACE partnership offers seven “Tier 0” systems (top systems available for international use), including the recent addition of Piz Daint, currently number eight on the TOP500 list. Together, these seven world-class systems provide over 60 petaflops of peak performance, enabling 524 scientific projects.
Anwar framed the challenges PRACE sees in adapting and modernizing its HPC infrastructure as four quadrants:
- The European Open Science Cloud: enabling persistent access to data, a huge challenge affecting health care.
- Strong HPC infrastructures for data processing.
- Adapting HPC solutions for cloud environments so they are easy and accessible for scientists.
- Achieving exascale.
As PRACE considers these challenges, the question of funding comes up. How will PRACE fund all of its ambitions? If it can’t do everything, which technologies and applications should it focus on? As Anwar put it, consider the “mundane versus heavenly science. It’s about choices.”
On more than one occasion during his presentation, Anwar discussed collaboration among the communities versus the benefits of competition, suggesting that competition among scientists produces better results. I would have thought that collaboration among the supercomputing centers would be more the norm: sharing resources and results, all contributing to better science.
As Anwar said in his closing statement: “It’s about finding a balance between traditional, disruptive, and fundamental science.”
The first keynote was delivered by Minna Palmroth, titled Understanding Near-Earth Space in Six Dimensions.
The thing I really love about events like this is the opportunity to learn more about the science, the big science problems, and things I’ve never thought about. Minna hit the mark in her presentation on near-space problems.
The Earth has radiation belts. Navigation and weather satellites sail in plasma around the Earth, traversing these belts. Two types of phenomena affect spacecraft and satellites: single-event upsets (such as a sudden system failure), and the aging of spacecraft due to the harsh radiation they experience in the belts.
The radiation belt situation is already extremely important, and it will only become more so as the number of spacecraft grows. The challenge in a nutshell: simulating the large and ever-increasing number of spacecraft in the radiation belts requires a dense grid and complex grid calculations in multiple dimensions simultaneously.
Minna is a research professor and unit head in the Department of Physics at the University of Helsinki, Finland. Her team parallelizes their code on three levels:
- Across nodes on clusters and supercomputers using MPI.
- Across multiple cores within a node using OpenMP.
- Within cores with vectorization.
Their most recent development adds support for multiple ion species and an optimized boundary-conditions implementation, resulting in improved scaling. This has given them the processing power and speed to do the math needed for the near-space problems they have identified.
The simulations Minna shared of solar winds and radiation belts as they hit the Earth’s atmosphere are fascinating. The solar winds create significant amounts of heat that dissipate and spread around the Earth’s atmosphere.
Vlasiator is a newly developed large-scale space physics model. The goal is to model the entire near-Earth space, going far beyond existing large-scale plasma simulations and extending the modeling from today’s solar winds and radiation belts to space weather and spacecraft instrument optimization. Vlasiator has been used to discover phenomena no one thought existed, and with continued modeling improvements such as the addition of machine learning, it will be an important tool for understanding space phenomena and for protecting spacecraft, technological systems, and human life in space.
The second keynote, Using Big-Data Methodologies in the Chemical Industry, was given by Telli van der Lei.
The information Telli shared is not surprising; we have long known that modeling supply chains can produce positive results. Regardless, this is a topic that can’t be discussed enough, especially at a conference that is heavily research and academic (approximately 73% of the attendees). Talking about the business application of the science being modeled and the improvements it enables is a good thing: it takes science and computation developed in one place, applies and enhances it in another, and demonstrates the results back to the first.
Telli is an academic now working in industry, as a senior scientist in Supply Chain and Process Modeling at DSM. Doing this modeling in industry, she professes, can be quite hard. In her presentation, Telli covered the industry issues DSM thinks about, the results achieved with its supply chain modeling, and the challenges she sees going forward.
Among those industry issues, health is number one; I would argue that nearly every country in the world faces an aging population, healthcare pressures, and questions of optimal food composition. After health comes nutrition: how to feed a growing population while urbanization clusters people together and shrinks farming space. Lastly, resource constraints on the materials available to feed the supply chain are a major issue.
DSM uses computer modeling to simulate the supply chain from raw materials to manufacturing, warehouse to the client. Using the modeling and incorporating the process into their supply chain, they have realized some amazing results in correctness of orders, reduction in supply chain costs, reduction of inventory, and a more efficient, flexible, and responsive supply chain.
Out of all this, DSM now has modeling and advanced analytics capabilities built on proven successes. Challenges remain: from a modeling perspective, how do you optimize the input, output, and runtime of existing models, or incorporate business choices into them? You can use HPC to simulate, but how do you convince yourself the results have value? As Telli said: “It’s not yay – here we go [with our results], it’s how you change your business.”
Feature image by Bernard Gagnon via Creative Commons