FY18 Budget & CORAL-2 – Exascale USA Continues to Move Ahead

By Alex R. Larzelere

April 2, 2018

It was not pretty. However, despite some twists and turns, the federal government’s Fiscal Year 2018 (FY18) budget is complete and ended with some very positive news for the DOE and NNSA’s exascale activities. When we last looked at the budget situation, Congress and the Administration had stacked the deck to make passing the 12 appropriations bills as easy as possible. The last Continuing Resolution (CR) lifted the budget sequestration caps and allowed the Senate and House Appropriations committees to basically ignore the country’s debt limit. However, even with this added flexibility, Congress was unable to complete the process to reconcile the differences in the appropriations bills passed by the House and the Senate. That resulted in the creation of an omnibus bill that covered the discretionary elements of the entire federal government.

The FY18 omnibus budget is huge. The bill was 2,232 pages long and covered over $1.3 trillion of government spending. It basically contains the text of the 12 appropriations bills that would have been the subject of House and Senate conference committees if “regular order” had been followed. The “must pass” nature of an omnibus gives Congress an opportunity to slip non-budget-related language into what eventually becomes law. It was some of those provisions that almost caused the demise of the omnibus and generated a lot of excitement before the President signed the bill into law.

The text of the omnibus bill was unveiled on the evening of Wednesday, March 21st. The bill was then passed by both the House and Senate on Thursday. The next and final step was the President signing the bill into law. Historically, given the alternative of shutting down the government, the Presidential signing would be considered automatic. However, in a tweet on the morning of Friday, March 23rd, President Trump expressed his dissatisfaction with the bill and the process that generated it. He threatened to veto the bill, which in turn generated a great deal of uncertainty about what would happen next. This was particularly troublesome because Congress was in the process of heading out of town for its two-week spring recess. However, in the end, and to the relief of many, the President did sign the omnibus bill and the $1.3 trillion budget became law.

The very good news is that in the omnibus, the Department of Energy’s (DOE) Exascale Computing Initiative (ECI) saw significant budget growth. As you will recall, ECI is the joint effort by the DOE’s Office of Science (SC) and the National Nuclear Security Administration (NNSA). The goal of ECI is to establish a “productive” exascale system in the United States by 2021, with several follow-on installations. ECI consists of two major parts. One is the set of activities associated with the procurement of systems to support the computing capabilities at the national laboratory facilities. This work is being done through the CORAL-2 acquisition process and will include Non-Recurring Engineering work, site preparation, and installation of the exascale computing systems. CORAL stands for Collaboration of Oak Ridge, Argonne, and Livermore. The “2” indicates that this is the second time the process is being used. The first CORAL Request for Proposals (RFP) was used to acquire the Aurora system at Argonne National Laboratory (ANL), the Summit system at Oak Ridge National Laboratory (ORNL), and the Sierra system at Lawrence Livermore National Laboratory (LLNL). The CORAL-2 procurement process is just starting and the RFP is expected to be released any day now.

The other part of ECI is known as the Exascale Computing Project (ECP). This is an official SC project (governed by DOE orders) that is jointly supported by SC’s Advanced Scientific Computing Research (ASCR) program and the NNSA Advanced Simulation and Computing (ASC) program. ECP is closely coordinated with complementary exascale work funded by NNSA at its national labs. The focus of ECP and the NNSA work is on conducting research that will improve the productivity of the systems procured through CORAL-2. ECP also supports research in hardware technologies and middleware software (operating systems, file systems, libraries, etc.) to improve exascale system productivity. A major emphasis of ECP is creating the applications that can use the power of exascale computing to solve important science and national security challenges. Finally, ECP supports co-design centers that bring all the elements of ECP together.

The final ECI numbers grew from an initial request of $508 million to $663 million. Most of this growth occurred in the Office of Science budget. The initial request for NNSA ECI was $183 million and grew only to $186 million in the FY18 omnibus. On the SC side of the ledger, the initial request was for $347 million; the final omnibus number is $477 million, a big $130 million increase. Most of that growth occurred within the facilities element of the budget, which is used for hardware procurement and site preparation.
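For readers keeping score, here is a minimal sketch (in Python) that tabulates the request-to-omnibus changes. It uses only the figures cited above, taken from this article rather than from official DOE budget tables:

```python
# Tabulate the FY18 ECI figures cited in this article (millions of dollars).
# Figures are as reported above, not drawn from official budget documents.
fy18 = {
    "NNSA ECI":  {"request": 183, "omnibus": 186},
    "SC ECI":    {"request": 347, "omnibus": 477},
    "ECI total": {"request": 508, "omnibus": 663},
}

for line, amounts in fy18.items():
    delta = amounts["omnibus"] - amounts["request"]
    print(f"{line:9s}: ${amounts['request']}M requested -> "
          f"${amounts['omnibus']}M enacted (+${delta}M)")
```

Running this shows the SC line accounting for the $130 million of the growth, with the NNSA line essentially flat.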

An important aspect of the FY18 omnibus numbers is that they will feed into the exascale program budget for FY19. In an earlier article, we outlined the ECI budget request made by the President for FY19. Those numbers were also very encouraging. The President’s FY19 ECI budget request was for a total of $636 million in the Department of Energy. This is below the FY18 omnibus number, but that will likely be adjusted as the request works its way through the Congressional process. The NNSA FY19 request for ECI is $163 million, which is below the FY18 omnibus number. The SC ECI FY19 request is for $473 million; that is also below the final FY18 omnibus number, but again can be adjusted by Congress.

With these budget numbers, it is now up to the DOE SC and NNSA programs to continue to execute the work needed to deliver “capable” exascale computers starting in 2021. A big part of that will be the CORAL-2 procurement, whose RFP should be released shortly. Just as with the first CORAL procurement, to mitigate technology risks, CORAL-2 expects to procure machines with two different architectures. One is slated to go to the Oak Ridge Leadership Computing Facility (OLCF) for 2021 delivery and the other, depending on the availability of funds, to the Argonne Leadership Computing Facility (ALCF) for delivery in 2022. The ALCF exascale system would be in addition to the “novel” architecture system currently being built by Intel and known as A21 (formerly Aurora). The NNSA exascale system is expected to start its installation at LLNL in 2022. The NNSA is expected to choose one of the two SC architectures, but reserves the option of choosing a third.

Also, the ECP will not have any funding excuses for delivering the technologies needed to make the future exascale computers “productive.” The ECP leadership transition from Paul Messina to Doug Kothe has been completed, along with several leadership changes at the sub-project level. During the December advisory committee meeting, Kothe talked about installing project planning processes that provide a good view of tasks, milestones, interdependencies, and risks. This is similar to the approach he used with great success as Director of the Consortium for Advanced Simulation of Light Water Reactors (CASL). CASL is a DOE Energy Innovation Hub, which in the words of former Secretary of Energy Steven Chu was required to have a “fierce sense of urgency.” Given all the recent good news, ECI also seems to be getting a similar sense of urgency.

As messy as the end was, the FY18 budget process is now complete. The numbers for the U.S. exascale program look very good, and the prospects for the next year are at least as good and could get better. The worldwide competition for exascale supremacy is still on. China, Japan, and Europe are all making announcements about their plans. The great news is that there is no question that the U.S. is in the race. In the final analysis, there is no reason to doubt the country’s commitment to retaining and building its leadership in the important strategic technology of exascale computing.

About the Author

Alex Larzelere is a senior fellow at the U.S. Council on Competitiveness, the president of Larzelere & Associates Consulting and HPCwire’s policy editor. He is currently a technologist, speaker and author on a number of disruptive technologies that include: advanced modeling and simulation; high performance computing; artificial intelligence; the Internet of Things; and additive manufacturing. Alex’s career has included time in federal service (working closely with DOE national labs), private industry, and as founder of a small business. Throughout that time, he led programs that implemented the use of cutting-edge advanced computing technologies to enable high-resolution, multi-physics simulations of complex physical systems. Alex is the author of “Delivering Insight: The History of the Accelerated Strategic Computing Initiative (ASCI).”
