January 21, 2009
With U.S. businesses in full retreat, the new Obama administration and Congress are committed to injecting an enormous stimulus of federal money into the economy. At least some of this seems destined to end up as increased spending on science and technology R&D, and by extension, high performance computing.
That will be welcome news for the tech community, which has been pleading for more funding for basic science research for over a decade. The America COMPETES Act, signed into law in 2007, supported doubling funding for basic research programs in the physical sciences, including nanotechnology, alternative energy and supercomputing. But the money was never appropriated at the levels COMPETES called for. Measured in real dollars, U.S. government spending on physical sciences R&D has generally been on the decline since the 1990s, and even funding for life sciences has been dropping since 2004.
President Obama has repeatedly called for a doubling of federal funding for basic research over ten years. With that in mind, the American Recovery and Reinvestment Act currently before Congress provides $10 billion for science facilities, research, and instrumentation (out of a total allocation of $550 billion in government spending, plus $275 billion in tax cuts). The breakdown for the science research funding is as follows:
If the bill is passed and the money is appropriated as is, that would be a huge boost for these organizations' research efforts. There's no telling how much of this would trickle down to HPC infrastructure, programs and jobs since the spending details would ultimately be up to the individual agencies. For the long-term, the more salient issue is whether these increases would be maintained to keep R&D funding on the kind of trajectory called out by the COMPETES Act.
Peter Harsha, the director of Government Affairs at the Computing Research Association, reports that at least the money targeted for R&D infrastructure in the stimulus bill may be a one-time deal:
[I]n our meetings with congressional staff over the last couple of weeks, there has been some concern about managing expectations about the sustainability of any of this funding beyond the stimulus. There are no promises that this stimulus funding will establish a new baseline funding level for these science agencies. There is the possibility that this truly is "one and done." The report language doesn't speak to that directly, but seems to suggest that the idea with this influx of research funding in what was thought to be simply an "infrastructure" bill is to reestablish a trajectory towards the doubling targets in the America COMPETES Act. If that's the case, we should expect that future appropriations bills will start with a funding level of $8 billion for NSF, for example (because $1 billion of the $3 billion increase is for a "one-time" infrastructure investment, while the remaining $2 billion is a research investment), and not revert back to the $6 billion pre-stimulus level. Hard to know exactly what the intent is and it's hard to reach the appropriations staff to hear it from them directly.
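Harsha's reading of the NSF numbers can be sanity-checked with a quick back-of-the-envelope calculation. The figures below are the ones quoted above; the split between one-time infrastructure money and recurring research money is his interpretation, not an official breakdown:

```python
# NSF funding figures quoted above, in billions of dollars
pre_stimulus = 6.0       # pre-stimulus NSF appropriation
stimulus_increase = 3.0  # total stimulus increase for NSF
one_time_infra = 1.0     # portion read as a one-time infrastructure investment

# If only the recurring research money carries forward, future
# appropriations would start from this baseline rather than $6 billion:
new_baseline = pre_stimulus + (stimulus_increase - one_time_infra)
print(new_baseline)  # 8.0
```

On that reading, the stimulus would reset the NSF trajectory toward the COMPETES doubling targets rather than serve as a one-off windfall.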
If so, post-stimulus R&D funding will revert to the classic discretionary-spending struggle between budget hawks and doves. But the political winds may indeed be shifting. President Obama's commitment to boost federally funded research should find a receptive audience in the Democratic-controlled Congress. In his inaugural speech, the new President pledged to "restore science to its rightful place." After years of uninspired government support for science and technology, the federal R&D machine that produced the Internet and decoded the human genome may be back in business.