Debt Deal Casts Shadow on US Research Funding

By Michael Feldman

August 4, 2011

The US government’s flirtation with default ended on August 2nd when the President signed the debt agreement into law. Officially known as the Budget Control Act of 2011, it purports to shave at least 2.1 trillion dollars off the federal deficit over the next ten years. What this means for federal funding of science, education, and R&D is still unclear, but given the government’s current obsession with downsizing itself, it’s hard to envision that research-centric agencies like the NSF, DOE Office of Science, and DARPA will remain unscathed.

Currently all the deficit reduction is being done on the spending side, with no new revenues in the mix. The initial $917 billion promised in cuts splits the pain between discretionary and non-discretionary spending, with the remaining $1.2-1.5 trillion to be decided later (which we’ll get to in a moment). None of the initial cuts go into effect in the current fiscal year; just $22 billion or so is targeted for 2012, with the remainder spread across 2013 through 2022. Last Sunday, President Obama tried to reassure the public that investments in education and research would be preserved, at least in the initial discretionary cuts.

The second phase of deficit reduction will be designed by a so-called congressional “Super Committee” of six Democrats and six Republicans. They’re tasked with coming up with an additional $1.5 trillion in reductions over the next ten years. If the Super Committee can’t come to an agreement, or Congress votes down its deal (a likely outcome, given the hostile political climate), an automatic $1.2 trillion in cuts is triggered. That would bring the grand total to $2.1 trillion over the next decade.

So where does this leave R&D funding? From the glass-half-full perspective, no programs at the NSF, DOE Office of Science, or DARPA are specifically called out in the legislation, and they probably won’t be in any subsequent deal the Super Committee comes up with. Better yet, in the short term, the cuts on the discretionary spending side (where all the R&D funding comes from) are not really cuts per se; they are better characterized as caps on future spending increases.

According to a Science Insider report, the effect will be to basically freeze discretionary spending for the next two years, while allowing for absolute increases of $20 to $25 billion per year over the remainder of the decade. The article probably pegs it about right as far as the near-term effect on the research community:

While that’s hardly good news for researchers lobbying for the double-digit increases proposed by President Obama for some research agencies, it’s a lot better than the Republican drive to roll back spending to 2008 levels.

But another article, in The Scientist, is more worrisome, noting that health agencies like the NIH, CDC, and FDA could be hard hit:

[T]he proposed deficit reduction is too steep to avoid real damage, said Mary Woolley, president and CEO of Research!America, an advocacy group that promotes health research. “These are horrifying cuts that could set us back for decades,” she said.

DARPA, the research agency of the US Department of Defense (DoD), may be particularly unlucky. The DoD has been singled out to endure $350 billion in cuts from the initial phase of the debt deal and another $500 to $600 billion in the second phase if the Super Committee fails and the trigger is pulled. DARPA’s total budget, which funds high-profile supercomputing projects like the Ubiquitous High Performance Computing (UHPC) program, is only about $3 billion a year, so it may not be a prime target when large cuts have to be made. But if the Pentagon really has to swallow nearly a trillion dollars in funding reductions over the next decade — and there is some skepticism that this will come to pass — one can assume that the research arm will not be able to escape harm completely.

The larger problem is that budget reductions of this magnitude threaten both parties’ most cherished programs, leaving other discretionary spending, like science, education, and R&D, as secondary priorities. Democrats want to protect programs like Social Security and Medicare (off the table for the time being), while Republicans are circling the wagons around national defense and remain adamant about not raising taxes.

In such a political environment, funding for research agencies, which normally enjoy some measure of bipartisan support, could be sacrificed. Certainly the Republicans’ increasing aversion to scientific research, and the Democrats’ willingness to capitulate to Republican demands, don’t bode well for these agencies and their R&D programs.

The best hope for the science and research community is that this debt deal is superseded by more level-headed legislation down the road. That will certainly require a much more reasonable approach to taxes and spending than we have now. The most recent blueprint for balancing the budget dates to the latter part of the Clinton administration, when actual surpluses were being projected. But we have veered rather far from that revenue-spending model.

Without raising taxes, balancing our budget over the long term (which this latest deal will not do) will be impossible unless we’re willing to shrink the government down to its pre-World-War-II level. No respectable economist believes that the spending-cut fairy will magically increase revenues by growing the US economy. The debt deal signed into law this week is actually projected to reduce GDP by about 0.1 percent in 2012, according to Troy Davig, US economist at Barclays Capital.

It would be easy to blame Congress, particularly the Tea Party wing of the Republican Party, for its inability to come up with a rational budget approach. And it surely deserves some of the blame. Holding the economy hostage by threatening to default on the debt was just plain dangerous and irresponsible.

But in a more fundamental way, the politicians are just reflecting the public’s ignorance of how federal budgets work. A number of polls show that people believe they can keep their entitlements and other programs with little or no revenue increases. There is also widespread ignorance of how the government allocates its money, and of the value of funding scientific research and education.

With such a lack of understanding by the public, it’s no big mystery that we elect politicians who promise contradictory policies. Until that changes, it’s hard to imagine how we’ll get the government to behave responsibly with our money.
