Debt Deal Casts Shadow on US Research Funding

By Michael Feldman

August 4, 2011

The US government’s flirtation with default ended on August 2nd when the President signed the debt agreement into law. Officially known as the Budget Control Act of 2011, it purports to shave at least $2.1 trillion off the federal deficit over the next ten years. What this means for federal funding of science, education, and R&D is still unclear, but given the government’s current obsession with downsizing itself, it’s hard to envision that research-centric agencies like the NSF, DOE Office of Science, and DARPA will remain unscathed.

Currently all the deficit reduction is being done on the spending side, with no new revenues in the mix. The initial $917 billion promised in cuts splits the pain between discretionary and non-discretionary spending, with the other $1.2-1.5 trillion to be decided later (which we’ll get to in a moment). None of the initial cuts go into effect in the current fiscal year, with just $22 billion or so targeted for 2012 and the remainder spread out across 2013 through 2022. Last Sunday, President Obama tried to reassure the public that investments in education and research would be preserved, at least in the initial discretionary cuts.

The second phase of deficit reduction will be designed by a so-called congressional “Super Committee” of six Democrats and six Republicans. They’re tasked with coming up with an additional $1.5 trillion over the next ten years. If the Super Committee can’t come to an agreement or the Congress votes down the deal — a likely outcome, given the hostile political climate — an automatic $1.2 trillion in cuts is triggered. That would bring the grand total to $2.1 trillion over the next decade.

So where does this leave R&D funding? From the glass-half-full perspective, none of the programs at the NSF, DOE Office of Science, or DARPA are specifically called out in the legislation, and probably won’t be in any subsequent deal the Super Committee comes up with. Better yet, in the short term, the cuts on the discretionary spending side (where all the R&D funding comes from) are not really cuts per se; they are better characterized as caps on future spending increases.

According to a Science Insider report, the effect will be to basically freeze discretionary spending for the next two years, while allowing for absolute increases of $20 to $25 billion per year over the remainder of the decade. The article probably pegs it about right as far as the near-term effect on the research community:

While that’s hardly good news for researchers lobbying for the double-digit increases proposed by President Obama for some research agencies, it’s a lot better than the Republican drive to roll back spending to 2008 levels.

But another article, in The Scientist, is more worrisome, noting that health agencies like the NIH, CDC and the FDA could be hard hit:

[T]he proposed deficit reduction is too steep to avoid real damage, said Mary Woolley, president and CEO of Research!America, an advocacy group that promotes health research. “These are horrifying cuts that could set us back for decades,” she said.

DARPA, the research agency of the US Department of Defense (DoD), may be particularly unlucky. The DoD has been singled out to endure $350 billion in cuts from the initial phase of the debt deal and $500 to $600 billion in the second phase if the Super Committee fails and the trigger is pulled. DARPA’s total budget, which funds high-profile supercomputing projects like the Ubiquitous High Performance Computing (UHPC) program, is only about $3 billion a year, so it may not be a prime target when large cuts have to be made. But if the Pentagon really has to swallow nearly a trillion dollars in funding reductions over the next decade — and there is some skepticism that this will come to pass — one can assume that the research arm will not be able to escape harm completely.

The larger problem is that budget reductions of this magnitude threaten both parties’ most cherished programs, leaving other discretionary spending, like science, education and R&D, as secondary priorities. Democrats want to protect things like Social Security and Medicare (off the table for the time being), while Republicans are circling the wagons around national defense and are adamant about not raising taxes.

In such a political environment, funding for research agencies, which normally enjoy some measure of bipartisan support, could be sacrificed. Certainly the Republicans’ increasing aversion to scientific research and the Democrats’ willingness to capitulate to Republican demands don’t bode well for these agencies and their R&D programs.

The best hope for the science and research community is that this debt deal is superseded by more level-headed legislation down the road. That’s certainly going to require a much more reasonable approach to taxes and spending than we have now. The most recent blueprint for balancing the budget dates from the latter part of the Clinton administration, when actual surpluses were being projected. But we have veered rather far from that revenue-spending model.

Without raising taxes, balancing our budget over the long term (which this latest deal will not do) will be impossible unless we’re willing to shrink the government down to its pre-World-War-II level. No respectable economist believes that the spending-cut fairy will magically increase revenues by growing the US economy. The debt deal signed into law this week is actually projected to reduce GDP by about 0.1 percent in 2012, according to Troy Davig, US economist at Barclays Capital.

It would be easy to blame the Congress, particularly the Tea Party wing of the Republicans, for its inability to come up with a rational budget approach. And its members surely deserve some of the blame. Holding the economy hostage by threatening to default on the debt was just plain dangerous and irresponsible.

But in a more fundamental way, the politicians are just reflecting the public’s ignorance of how federal budgets work. A number of polls show that people believe they can keep their entitlements and other programs with little or no revenue increases. There is also widespread ignorance of how the government allocates its money, and of the value of funding scientific research and education.

With such a lack of understanding by the public, it’s no big mystery that we elect politicians who promise contradictory policies. Until that changes, it’s hard to imagine how we’ll get the government to behave responsibly with our money.
