Last month the National Energy Research Scientific Computing Center (NERSC) announced a collaboration with supercomputing vendors Intel and Cray to prepare for Cori, the Cray XC supercomputer slated to be deployed at NERSC in 2016. To ensure that the highly diverse workloads of the DOE science community continue to be supported as over 5,000 users make the transition to Cori, the partners launched the NERSC Exascale Science Applications Program (NESAP).
As part of this effort, NERSC earlier put out a call for submissions to the code optimization program. Out of 50 submissions, NERSC narrowed the pool to 20, which it shared publicly last week. Through the acceptance of the Cori system, NESAP will connect these 20 application teams with resources at NERSC, Cray, and Intel. The teams will have access to early hardware as well as special training and preparation sessions with Intel and Cray staff.
“We were very impressed with the response to our call for participation—we received 50 proposals,” said Harvey Wasserman, HPC consultant at NERSC and NESAP post-doc lead. “NERSC is very grateful for the user participation; they’ll be doing the ‘heavy lifting’ during the project and will help us ensure that the workload is ready when Cori is deployed. This exciting machine architecture is now being followed by exciting science in the national interest.”
In addition to the 20 teams, 24 other science code teams will be included in the program, benefiting from specialized training and early access to hardware. The goal of NESAP is for all 44 projects to be ready to leverage Cori as soon as it's available.
The review process was conducted by NERSC and other DOE staff. The projects encompass a wide range of NERSC disciplines, including astrophysics, genomics, materials science, climate and weather modeling, plasma fusion physics and accelerator science. The core task will be adapting the codes to take advantage of Cori's Knights Landing manycore architecture so that they can enable important scientific advances. Early access to prototype and production hardware is thus a crucial component of NESAP.
As announced by NERSC, the 20 projects are listed below, grouped by the relevant DOE program office.
Advanced Scientific Computing Research (ASCR):
- Optimization of the BoxLib Adaptive Mesh Refinement Framework for Scientific Application Codes, Ann Almgren (Lawrence Berkeley National Laboratory)
- High-Resolution CFD and Transport in Complex Geometries Using Chombo-Crunch, David Trebotich (Lawrence Berkeley National Laboratory)
Biological and Environmental Research (BER):
- CESM Global Climate Modeling, John Dennis (National Center for Atmospheric Research)
- High-Resolution Global Coupled Climate Simulation Using The Accelerated Climate Model for Energy (ACME), Hans Johansen (Lawrence Berkeley National Laboratory)
- Multi-Scale Ocean Simulation for Studying Global to Regional Climate Change, Todd Ringler (Los Alamos National Laboratory)
- Gromacs Molecular Dynamics (MD) Simulation for Bioenergy and Environmental Biosciences, Jeremy C. Smith (Oak Ridge National Laboratory)
- Meraculous, a Production de novo Genome Assembler for Energy-Related Genomics Problems, Katherine Yelick (Lawrence Berkeley National Laboratory)
Basic Energy Science (BES):
- Large-Scale Molecular Simulations with NWChem, Eric Jon Bylaska (Pacific Northwest National Laboratory)
- Parsec: A Scalable Computational Tool for Discovery and Design of Excited State Phenomena in Energy Materials, James Chelikowsky (University of Texas, Austin)
- BerkeleyGW: Massively Parallel Quasiparticle and Optical Properties Computation for Materials and Nanostructures, Jack Deslippe (NERSC)
- Materials Science using Quantum Espresso, Paul Kent (Oak Ridge National Laboratory)
- Large-Scale 3-D Geophysical Inverse Modeling of the Earth, Greg Newman (Lawrence Berkeley National Laboratory)
Fusion Energy Sciences (FES):
- Understanding Fusion Edge Physics Using the Global Gyrokinetic XGC1 Code, Choong-Seock Chang (Princeton Plasma Physics Laboratory)
- Addressing Non-Ideal Fusion Plasma Magnetohydrodynamics Using M3D-C1, Stephen Jardin (Princeton Plasma Physics Laboratory)
High Energy Physics (HEP):
- HACC (Hardware/Hybrid Accelerated Cosmology Code) for Extreme Scale Cosmology, Salman Habib (Argonne National Laboratory)
- The MILC Code Suite for Quantum Chromodynamics (QCD) Simulation and Analysis, Doug Toussaint (University of Arizona)
- Advanced Modeling of Particle Accelerators, Jean-Luc Vay (Lawrence Berkeley National Laboratory)
Nuclear Physics (NP):
- Domain Wall Fermions and Highly Improved Staggered Quarks for Lattice QCD, Norman Christ (Columbia University) and Frithjof Karsch (Brookhaven National Laboratory)
- Chroma Lattice QCD Code Suite, Balint Joo (Jefferson National Accelerator Facility)
- Weakly Bound and Resonant States in Light Isotope Chains Using MFDn (Many Fermion Dynamics, Nuclear Physics), James Vary and Pieter Maris (Iowa State University)
The 24 additional codes that will be targeted are listed below along with their principal investigators.
GTC-P (Stephane Ethier/PPPL)
GTS (William Tang/PPPL)
VORPAL (John Cary/TechX)
TOAST (Julian Borrill/LBNL)
Qbox/Qb@ll (Yosuke Kanai/U. North Carolina)
CALCLENS and ROCKSTAR (Risa Wechsler/Stanford)
WEST (Marco Govoni/U. Chicago)
QLUA (William Detmold/MIT)
P3D (James Drake/U. Maryland)
WRF (John Michalakes/ANL)
PHOSIM (Andrew Connolly/U. Washington)
SDAV tools (Hank Childs/U. Oregon)
M3D/M3D-K (Linda Sugiyama/MIT)
DGDFT (Lin Lin/U.C. Berkeley)
GIZMO/GADGET (Joel Primack/U.C. Santa Cruz)
ZELMANI (Christian Ott/Caltech)
VASP (Martijn Marsman/U. Vienna)
NAMD (James Phillips/U. Illinois)
PHOENIX-3D (Eddie Baron/U. Oklahoma)
ACE3P (Cho-Kuen Ng/SLAC)
S3D (Jacqueline Chen/SNL)
ATLAS (Paolo Calafiura/LBNL)
BBTools genomics tools (Jon Rood/LBNL, JGI)
DOE MiniApps (Alice Koniges/LBNL)