August 04, 2011
The US government's flirtation with default ended on August 2nd when the President signed the debt agreement into law. Officially known as the Budget Control Act of 2011, it purports to shave at least 2.1 trillion dollars off the federal deficit over the next ten years. What this means for federal funding of science, education, and R&D is still unclear, but given the government's current obsession with downsizing itself, it's hard to envision that research-centric agencies like the NSF, DOE Office of Science, and DARPA will remain unscathed.
Currently all the deficit reduction comes from the spending side, with no new revenues in the mix. The initial $917 billion promised in cuts splits the pain between discretionary and non-discretionary spending, with the other $1.2-1.5 trillion to be decided later (which we'll get to in a moment). None of the initial cuts take effect in the current fiscal year; just $22 billion or so is targeted for 2012, with the remainder spread across 2013 through 2022. Last Sunday, President Obama tried to reassure the public that investments in education and research would be preserved, at least in the initial discretionary cuts.
The second phase of deficit reduction will be designed by a so-called congressional "Super Committee" of six Democrats and six Republicans, tasked with coming up with an additional $1.5 trillion in reductions over the next ten years. If the Super Committee can't come to an agreement, or Congress votes down its deal -- a likely outcome, given the hostile political climate -- an automatic $1.2 trillion in cuts is triggered. That would bring the grand total to $2.1 trillion over the next decade.
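The two-phase arithmetic above can be tallied in a quick back-of-the-envelope sketch, using only the figures cited in this article (all amounts in billions of dollars over ten years):

```python
# Budget Control Act of 2011: two possible ten-year outcomes,
# using the figures cited in the article (billions of dollars).
initial_cuts = 917        # phase one: discretionary spending caps
trigger_cuts = 1200       # automatic cuts if the Super Committee fails
committee_target = 1500   # Super Committee's phase-two goal

# If the trigger is pulled:
total_with_trigger = initial_cuts + trigger_cuts        # ~$2.1 trillion
# If the committee hits its full target:
total_with_committee = initial_cuts + committee_target  # ~$2.4 trillion

print(f"Trigger scenario:   ${total_with_trigger / 1000:.1f} trillion")
print(f"Committee scenario: ${total_with_committee / 1000:.1f} trillion")
```

Note that the $2.1 trillion "grand total" in the law corresponds to the trigger scenario; a fully successful Super Committee would actually cut somewhat more.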
So where does this leave R&D funding? From the glass-half-full perspective, no programs at the NSF, DOE Office of Science, or DARPA are specifically called out in the legislation, and probably won't be in any subsequent deal the Super Committee comes up with. Better yet, in the short term, the cuts on the discretionary spending side (where all the R&D funding comes from) are not really cuts per se; they are better characterized as caps on future spending increases.
According to a Science Insider report, the effect will be to basically freeze discretionary spending for the next two years, while allowing for absolute increases of $20 to $25 billion per year over the remainder of the decade. The article probably pegs it about right as far as the near-term effect on the research community:
While that's hardly good news for researchers lobbying for the double-digit increases proposed by President Obama for some research agencies, it's a lot better than the Republican drive to roll back spending to 2008 levels.
But another article, in The Scientist, is more worrisome, noting that health agencies like the NIH, CDC, and FDA could be hard hit:
[T]he proposed deficit reduction is too steep to avoid real damage, said Mary Woolley, president and CEO of Research!America, an advocacy group that promotes health research. “These are horrifying cuts that could set us back for decades,” she said.
DARPA, the research agency of the US Department of Defense (DoD), may be particularly unlucky. The DoD has been singled out to endure $350 billion in cuts from the initial phase of the debt deal and $500 to $600 billion in the second phase if the Super Committee fails and the trigger is pulled. DARPA's total budget, which funds high-profile supercomputing projects like the Ubiquitous High Performance Computing (UHPC) program, is only about $3 billion a year, so it may not be a prime target when large cuts have to be made. But if the Pentagon really has to swallow nearly a trillion in funding reductions over the next decade -- and there is some skepticism that this will come to pass -- one can assume that the research arm will not be able to escape harm completely.
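To get a sense of DARPA's exposure, consider a rough sketch using the article's figures. The 1% share below is purely a hypothetical assumption for illustration, not anything in the legislation:

```python
# Rough scale of DARPA's exposure to Pentagon cuts (billions of dollars,
# figures from the article; the 1% share is a hypothetical assumption).
darpa_annual_budget = 3
darpa_decade_budget = darpa_annual_budget * 10   # ~$30B over ten years

dod_cuts_low = 350 + 500    # ~$850B if the trigger fires
dod_cuts_high = 350 + 600   # ~$950B at the high end

# If DARPA absorbed even a 1% share of the Pentagon's total cuts:
share = 0.01
hit_low = dod_cuts_low * share    # $8.5B
hit_high = dod_cuts_high * share  # $9.5B

print(f"Hypothetical 1% share: ${hit_low:.1f}B to ${hit_high:.1f}B, "
      f"or {hit_low / darpa_decade_budget:.0%} to "
      f"{hit_high / darpa_decade_budget:.0%} of DARPA's decade budget")
```

Even under this modest assumed share, the agency would lose roughly a third of a decade's funding, which is why its small budget offers only partial shelter.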
The larger problem is that budget reductions of this magnitude threaten both parties' most cherished programs, leaving other discretionary spending, like science, education, and R&D, as secondary priorities. Democrats want to protect things like Social Security and Medicare (off the table for the time being), while the Republicans are circling the wagons around national defense and are adamant about not raising taxes.
In such a political environment, funding for research agencies, which normally get some measure of bipartisan support, could be sacrificed. Certainly the Republicans' increasing aversion to scientific research and the Democrats' willingness to capitulate to Republican demands don't bode well for these agencies and their R&D programs.
The best hope for the science and research community is that this debt deal is superseded by more level-headed legislation down the road. That will certainly require a much more reasonable approach to taxes and spending than we have now. The most recent blueprint for balancing the budget can be found in the latter part of the Clinton administration, when actual surpluses were being projected. But we have veered rather far from that revenue-spending model.
Without raising taxes, balancing our budget over the long term (which this latest deal will not do) will be impossible unless we're willing to shrink the government down to its pre-World-War-II level. No respectable economist believes that the spending-cut fairy will magically increase revenues by growing the US economy. The debt deal signed into law this week is actually projected to reduce GDP by about 0.1 percent in 2012, according to Troy Davig, US economist at Barclays Capital.
It would be easy to blame Congress, particularly the Tea Party wing of the Republican Party, for the inability to come up with a rational budget approach, and they surely deserve some of the blame. Holding the economy hostage by threatening to default on the debt was just plain dangerous and irresponsible.
But in a more fundamental way, the politicians are just reflecting the public's ignorance of how federal budgets work. A number of polls show that people believe they can keep their entitlements and other programs with little or no revenue increases. There is also widespread ignorance of how the government allocates its money, and of the value of funding scientific research and education.
With such a lack of understanding by the public, it's no big mystery that we elect politicians who promise contradictory policies. Until that changes, it's hard to imagine how we'll get the government to behave responsibly with our money.
Posted by Michael Feldman - August 04, 2011 @ 5:39 PM, Pacific Daylight Time
Michael Feldman is the editor of HPCwire.