In this bimonthly feature, HPCwire highlights newly published research in the high-performance computing community and related domains. From parallel programming to exascale to quantum computing, the details are here.
The Australian Square Kilometre Array Pathfinder (ASKAP) telescope (itself a pilot project for the record-setting Square Kilometre Array planned for construction in the coming years) will enable highly sensitive radio astronomy that produces a tremendous amount of data. In this paper, researchers from the Commonwealth Scientific and Industrial Research Organisation (CSIRO) highlight how they are preparing a dedicated HPC platform, called ASKAPsoft, to handle the expected 5 PB/year of data produced by ASKAP.
Authors: Juan C. Guzman, Eric Bastholm, Wasim Raja, Matthew Whiting, Daniel Mitchell, Stephen Ord and Max Voronkov.
In an expert field like HPC, institutional memory and information-sharing are crucial for maintaining and building on expertise – but institutions often lack a cohesive infrastructure to perpetuate that knowledge. These authors, a team from North Carolina State University and Lawrence Livermore National Laboratory, introduce “OpenK,” an open, ontology-based infrastructure aimed at facilitating the accumulation, sharing and reuse of HPC knowledge.
Authors: Yue Zhao, Xipeng Shen and Chunhua Liao.
High-performance data analysis (HPDA) is an emerging tool for scientific disciplines like bioscience, climate science and security – and now, it’s being used to prepare astrophysics research for exascale. In this paper, written by a team from the Astronomical Observatory of Trieste, Italy, the authors discuss the ExaNeSt and EuroExa projects, which built a prototype of a low-power exascale facility for HPDA and astrophysics.
Authors: Giuliano Taffoni, David Goz, Luca Tornatore, Marco Frailis, Gianmarco Maggio and Fabio Pasian.
“Monitoring users on large computing platforms such as [HPC] and cloud computing systems,” these authors – a duo from Lawrence Berkeley National Laboratory – write, “is non-trivial.” Users can (and do) abuse access to HPC systems, they say, but process viewers and other monitoring tools can impose substantial overhead. To address this, they introduce a technique for identifying running programs with 97% accuracy using only the system’s power consumption.
Authors: Bogdan Copos and Sean Peisert.
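To make the idea concrete – and only as an illustrative sketch, not the authors’ actual method – program identification from power draw can be framed as classifying a power time-series by its statistical signature. The program names, feature choices and synthetic traces below are assumptions for demonstration:

```python
import statistics

def trace_features(trace):
    # Summarize a power trace (watts sampled over time) into simple
    # features: mean draw, variability, and peak draw. A real system
    # would use richer features (spectral content, phase patterns, etc.).
    return (statistics.mean(trace), statistics.pstdev(trace), max(trace))

def classify(trace, labeled_traces):
    # 1-nearest-neighbor in feature space: label the unknown trace with
    # the known program whose feature vector is closest (squared
    # Euclidean distance).
    fx = trace_features(trace)
    def dist(item):
        fy = trace_features(item[1])
        return sum((a - b) ** 2 for a, b in zip(fx, fy))
    return min(labeled_traces, key=dist)[0]

# Hypothetical reference traces for two workloads.
known = [
    ("matmul", [180, 185, 190, 188, 182]),   # steady, high draw
    ("idle_io", [60, 95, 58, 97, 61]),       # bursty, low draw
]

print(classify([181, 186, 189, 184, 183], known))  # → matmul
```

The appeal of such an approach is that power telemetry is already collected on most large systems, so classification adds no per-process monitoring overhead.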
In numerical weather and climate prediction (NWP), accuracy depends strongly on available computing power – but the increasing number of cores in top systems is leading to a higher frequency of hardware and software failures for NWP simulations. This report (from researchers at eight different institutions) examines approaches for fault tolerance in numerical algorithms and system resilience in parallel simulations for those NWP tools.
Authors: Tommaso Benacchio, Luca Bonaventura, Mirco Altenbernd, Chris D. Cantwell, Peter D. Düben, Mike Gillard, Luc Giraud, Dominik Göddeke, Erwan Raffin, Keita Teranishi and Nils Wedi.
Another team – this time, from SURF, a collaborative organization for Dutch research – also investigated the intersection of astronomy and the exascale era. This paper, written by three researchers from SURF, highlights a new, OpenStack-based cloud infrastructure layer and Spider, a new addition to SURF’s high-throughput data processing platform. The authors explore how these additions help to prepare the astronomical research community for the exascale era, in particular with regard to data-intensive experiments like the Square Kilometre Array.
Authors: J. B. R. Oonk, C. Schrijvers and Y. van den Berg.
As the exascale era approaches, HPC systems are growing in complexity, improving performance but making the systems less accessible for new users. These authors – a duo from the Ludwig Maximilian University of Munich – propose a support framework for these future HPC architectures called EASEY (for Enable exAScale for EverYone) that “can automatically deploy optimized container computations with negligible overhead[.]”
Authors: Maximilian Höb and Dieter Kranzlmüller.
Do you know about research that should be included in next month’s list? If so, send us an email at firstname.lastname@example.org. We look forward to hearing from you.