In this bimonthly feature, HPCwire highlights newly published research in the high-performance computing community and related domains. From parallel programming to exascale to quantum computing, the details are here.
Using small-scale history data to predict large-scale HPC performance
Performance modeling is crucial for HPC, and machine learning can help – but extrapolating large-scale performance from small-scale runs is often unreliable. These authors, a team from the University of Science and Technology of China, present a technique that combines interpolation and extrapolation to address this problem, showing that it achieves higher predictive accuracy on real-world HPC systems.
Authors: Wenju Zhou, Jiepeng Zhang, Jingwei Sun and Guangzhong Sun.
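The core idea – predicting large-scale runtime from a handful of small-scale measurements – can be illustrated with a minimal sketch. This is not the authors' method, just a hedged example: fitting a hypothetical power-law scaling model (runtime ≈ a · cores^b) to small-run history data and extrapolating to a larger core count. The measurements below are invented for illustration.

```python
# Minimal sketch (not the paper's technique): fit runtime = a * cores^b
# to small-scale history data via least squares in log-log space, then
# extrapolate to a larger scale.
import math

def fit_power_law(cores, runtimes):
    """Fit runtime = a * cores^b using ordinary least squares on logs."""
    xs = [math.log(c) for c in cores]
    ys = [math.log(t) for t in runtimes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope of the log-log regression line gives the scaling exponent b.
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def predict(a, b, cores):
    """Extrapolated runtime at a given core count under the fitted model."""
    return a * cores ** b

# Hypothetical small-scale measurements: runtime roughly halves per doubling.
cores = [16, 32, 64, 128]
runtimes = [100.0, 52.0, 27.0, 14.5]
a, b = fit_power_law(cores, runtimes)
print(f"exponent b = {b:.2f}, predicted runtime at 4096 cores = "
      f"{predict(a, b, 4096):.2f}")
```

A pure extrapolation like this breaks down when communication costs or memory limits kick in at scale – which is exactly the gap interpolation-plus-extrapolation approaches such as this paper's aim to close.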
Processing crowdsourced aircraft observations in an HPC environment
With the advent (and growing prevalence) of unmanned aircraft systems (UASs), the risk of collisions between manned and unmanned aircraft is rising. In this paper, researchers from MIT’s Lincoln Laboratory use the Lincoln Laboratory Supercomputing Center to process nearly four billion aircraft observations from the OpenSky Network to develop an open-source workflow for aircraft monitoring.
Authors: Andrew Weinert, Ngaire Underhill, Bilal Gill and Ashley Wicks.
Optimizing HPC systems for biomedical workloads
“Most biomedical researchers are focused on better understanding scientific phenomena rather than developing and optimizing code,” write these authors from the Icahn School of Medicine. To that end, they present a case study of a biomedical computational workload at a leading academic medical center, outlining how they upgraded the system over the years to keep pace with user needs and optimization requirements.
Authors: Patricia Kovatch, Lili Gai, Hyung Min Cho, Eugene Fluder and Dansha Jiang.
Moving towards exascale simulations of the magnetic universe
These authors discuss the EXAMAG project, an effort to prepare high-performance simulations of cosmic structure formation for the exascale era. “These calculations are presently at the forefront of today’s use of supercomputers,” the authors (from universities in Germany, India and the UK) write, “and are important scientific drivers for the future use of exaflop computing platforms.”
Authors: Christian Klingenberg, Rüdiger Pakmor, Thomas Guillet and Praveen Chandrashekar.
Integrating deep learning in domain sciences at exascale
Increasingly, machine learning and HPC are becoming integrated at scale. In this paper, researchers from Oak Ridge National Laboratory, the Georgia Institute of Technology and the University of Tennessee, Knoxville, outline challenges with that integration, evaluating existing deep learning packages for their ability to run efficiently on HPC systems. The authors go on to discuss the need for an HPC deep learning framework and make suggestions as to how those needs can be met.
Authors: Rick Archibald, Edmond Chow, Eduardo D’Azevedo, Jack Dongarra, Markus Eisenbach, Rocco Febbo, Florent Lopez, Daniel Nichols, Stanimire Tomov, Kwai Wong and Junqi Yin.
Simulating peridynamics on the TaihuLight supercomputer
Peridynamics, a theory of solid mechanics well-suited for modeling fractures in materials, has growing applications in materials science, human health, manufacturing and more. These authors from the Chinese Academy of Sciences worked to optimize peridynamics simulations on the Sunway TaihuLight supercomputer. The team reports speedups of 6 to 181 times across various processors, along with high parallel efficiency at scale.
Authors: Xinyuan Li, Huang Ye and Jian Zhang.
Examining training efforts in the Exascale Computing Project
The Exascale Computing Project (ECP), which describes itself as “a collaborative effort toward the national imperative for exascale computing power to advance quality of life, the economy, and national security,” engages in a variety of training activities. In this paper, a duo of researchers from Lawrence Berkeley National Laboratory and Oak Ridge National Laboratory describe these training activities, some of which, they say, “can be beneficial to the community at large.” Their aim is to raise awareness of resources they expect to extend beyond the ECP’s scope and life cycle.
Authors: Osni Marques and Ashley Barker.
Do you know about research that should be included in the next edition of this list? If so, send us an email at [email protected]. We look forward to hearing from you.