$100B Plan Submitted for Massive Remake and Expansion of NSF

By John Russell

May 27, 2020

Legislation to reshape, expand – and rename – the National Science Foundation has been submitted in both the U.S. House and Senate. The proposal, which seems to have bipartisan support, calls for giving NSF $100 billion over five years and expanding its mission to include a technology directorate. Early response from the science community has been a mix of enthusiasm for added resources and worry over shifting NSF’s emphasis away from curiosity-driven basic science to more policy-driven technology development.

Given the scope of the plan and the current mobilization of government to deal with the COVID-19 pandemic, fast action on the bill seems unlikely. The initiative is co-sponsored in the Senate by Democrat Chuck Schumer (NY) and Republican Todd Young (IN). Early coverage of the proposed changes appears in Science Magazine. (UPDATE: Early reaction from four prominent HPC leaders is presented at the end of the article.)

Here’s a brief excerpt from the Science report, written by Jeffrey Mervis:

“The Endless Frontiers Act (S. 3832) proposes a major reorganization of NSF, creating a technology directorate that, within 4 years, would grow to more than four times the size of the entire agency’s existing $8 billion budget. NSF would be renamed the National Science and Technology Foundation, and both the science and technology arms would be led by a deputy reporting to the NSF director. (NSF now has a single director; the deputy director slot has been unfilled since 2014.) Many academic leaders are praising the legislation, which was spearheaded by the Senate’s top Democrat, Chuck Schumer (NY), and co-sponsored by Senator Todd Young (R–IN). They see it as a huge vote of confidence in NSF, which this year is celebrating its 70th anniversary.

“These funds—which complement, not supplant, existing resources, an important condition—build on the NSF’s strengths and would fill gaps in our research enterprise, while allowing the foundation’s curiosity-driven research to continue to thrive,” says Rafael Reif, president of the Massachusetts Institute of Technology. “These investments will help NSF catalyze innovation, support scientific leadership, and keep America globally competitive,” adds Mary Sue Coleman, president of the Association of American Universities, a 65-member consortium of the nation’s leading research institutions.

But at least one former NSF director fears the bill would take the agency into dangerous territory by asking it to lead the government’s effort to develop new technologies. “I believe it would be a mistake for a technology directorate at NSF to serve as an offset to private funding for commercial innovation and entrepreneurship,” says Arden Bement, who led NSF from 2004 to 2010. “Federal funding for applied technology research and development should be need-based and channeled through mission agencies.”

The new directorate’s efforts would concentrate on a periodically updated list of no more than 10 “key technology focus areas,” beginning with the following:

  • artificial intelligence and machine learning
  • high performance computing, semiconductors, and advanced computer hardware
  • quantum computing and information systems
  • robotics, automation, and advanced manufacturing
  • natural or anthropogenic disaster prevention
  • advanced communications technology
  • biotechnology, genomics, and synthetic biology
  • cybersecurity, data storage, and data management technologies
  • advanced energy
  • materials science, engineering, and exploration relevant to the other key technology focus areas

The bill recommends the directorate’s budget rise from $2 billion in fiscal year 2021 to $35 billion in fiscal years 2024 and 2025, with a “hold harmless” provision mandating it cannot receive any funds in a given fiscal year if the budget for the rest of NSF declines. NSF’s annual budget is currently about $8 billion. Schumer and Young’s bill would also establish a Regional Technology Hub Program administered by the U.S. Economic Development Administration and the National Institute of Standards and Technology that would provide grants to consortia working in specified technology areas. The legislation would recommend a total budget of $10 billion for the program covering fiscal years 2021 through 2025. Reps. Ro Khanna (D-CA) and Mike Gallagher (R-WI) are expected to introduce the bill in the House.

UPDATE:
There’s a lot to unpack here. While the HPC community will need time to absorb the full proposal, four prominent HPC community members offered early thoughts.

Thomas Sterling, professor and director of AI Computing Systems Laboratory (AICSL), Indiana University, Bloomington:

“The strategic vision proposed to update the National Science Foundation after seven decades of establishing science as an American imperative is enlightened and timely. A new National Science and Technology Foundation acknowledges the complex interrelationships between science goals and discoveries on the one hand and technology advancements and innovations on the other. In particular, the NSF has proven ambivalent about sponsoring projects in the domain of HPC systems, applying less than inspired criteria in judging which curiosity-driven research merits support. Technology innovation has been excluded by some at NSF as not meriting support.

“In HPC technology, confusion exists about the role of industry in providing next-generation products and the need to create future-generation concepts through research. Although it is not politically fashionable to say so, the review system for proposed research is highly constraining: imaginative but risky ideas in HPC are met with limited enthusiasm, if not disdain. Adding ‘Technology’ to the agency’s charter and mandate will force a new competitive culture for HPC systems. Perhaps the NSTF will elevate systems research in hardware as well as software. Most importantly, mission-critical agencies are not responsible for, nor do they have programs supporting, creativity beyond the near term in HPC systems. Recent initiatives from DARPA, DOE, IARPA, and NIH have delivered responsible incremental advances but have largely ignored the revolutionary ideas and directions that are the only way the US will be able to leapfrog the threatening international competition.

“For these reasons, expanding beyond the legacy traditions of NSF to release the imagination and creativity of the nation’s inspired technology researchers may catalyze a renaissance in US leadership in HPC and systems.”

Dan Stanzione, executive director and associate VP for research, the Texas Advanced Computing Center, UT Austin:

“We’re still evaluating the details of the proposed legislation. At TACC, we strongly support any efforts to strengthen the National Science Foundation and to strengthen US investments in key strategic technologies, particularly when other nations are ramping up their corresponding investments. We do believe it is critically important that any new legislation protect NSF’s critical role in funding basic research. NSF’s role in basic research is unique among federal agencies and provides foundational discoveries key to creating future industries and improving the quality of life.”

Rick Stevens, associate laboratory director for computing, environment and life sciences, Argonne National Laboratory:

“I think improving the funding prospects for NSF’s basic science mission is very important. Expanding basic science funding for universities is a good idea, including expanding engineering research and research infrastructure. I think the government mission agencies are likely the better mechanism for applied science and technology research, as they have government mission needs that serve as drivers and can shape priority investments in technology to meet those needs. They are also better staffed and organized to manage technology development projects through national labs and contracts with industry.

“I think it is also important to increase the scale and sustainability of private sector investments in advanced technology development via tax policy and other means, such as intelligent government procurements that push the envelope. Increasing overall government spending on science and technology is important, and making those increases sustainable is even more important, as the research community needs that stability to work on long-term problems.”

Jack Dongarra, Innovative Computing Laboratory, University of Tennessee:

“This looks like a great opportunity for NS(T)F to expand its scope. Traditionally, NSF was all about the science: once the academic research describing the science was submitted as a paper, further development stopped. This may allow NSTF to extend development into more useful, hardened technology that can actually be used. Since government budgets are a zero-sum game, this may affect other agencies in an adverse fashion.”

Link to Science Magazine’s coverage: https://www.sciencemag.org/news/2020/05/us-lawmakers-unveil-bold-100-billion-plan-remake-nsf
