March 20, 2013
On Friday, March 15, President Obama gave a speech at DOE's Argonne National Laboratory, and light-heartedly expressed his concerns about the effects of sequestration on budgets at the country's national laboratories.
Noting that some of the employees were standing in the crowded auditorium, he quipped, "I thought [that at] Argonne, one of the effects of the sequester [was that] you had to get rid of chairs!"
People laughed. Outside of that speech, however, nobody in a federal lab is chuckling over the possible impact of sequestration. Prominent heads of national labs, university researchers and technology executives are very concerned about how budget stalemates between the White House and Congress will affect government-funded research across the country.
Sequestration, because it demands cuts in government spending almost across the board, has brought the issue directly to the datacenter. If left in place, it will put federally funded R&D this year at a level $12.5 billion less than the amount spent in 2011 – an 8.7% decrease. Several organizations have already instituted budget cuts to prepare for the decrease in funding. The National Institutes of Health has said it is cutting grant levels by 10 percent and will offer fewer grants. The National Science Foundation says it will eliminate 1,000 grants this year.
Moreover, sequestration has sparked an op-ed debate over the value of government-funded research itself. It's a debate that could extend well beyond the current stalemate.
Locating the speech at Argonne and putting energy research on the table was itself a strategic move to highlight the importance of funding national labs. President Obama also tried to offer new funding in a palatable way. He did not call for additional taxes or even preventing future cuts, but suggested using a non-tax form of revenue to fund energy research. The approach would take $2 billion over the next 10 years from leases paid by energy companies that develop fossil fuel resources on federal land. That money would fund a very specific type of research: developing electric vehicles, homegrown biofuels, and domestically produced natural gas.
But that still leaves the longer-term question open. Is it a good idea to use tax revenue to fund research that may or may not have future benefits to the country? The heads of government organizations, national labs, universities and other supporters of technology are now defending the concept in hearings and in editorial pages across the country.
William Brinkman, director of the Office of Science at DOE, testified before a House Appropriations Subcommittee on Energy and Water Development on March 5. He said that sequestration would cut this year's budget for the Office of Science by $215 million from 2012, something the country cannot afford at a time when "other countries around the world are challenging our scientific leadership in essentially all the scientific disciplines that we steward." HPC research is a big part of that. "Since the inception of high-performance computing, the United States has been a world leader in this field," Brinkman continued.
But that may no longer be the case. Budget cuts will affect research intended to "accelerate the next generation of supercomputers at a time when international competition in this domain is growing," he said.
In fact, the US is not the clear leader it once was. In 2011, the 700,000-core Fujitsu K computer installed at the RIKEN Advanced Institute for Computational Science (AICS) topped the TOP500 list. It has since dropped to third position on the November 2012 list because of competition from newer machines, and 31 of the 50 most powerful computers on that list are based outside the US. Around the world, countries such as China, Japan, the UK, Germany, India and, most recently, Switzerland are touting the competitive benefits of new supercomputers.
China has joined the competition to become the first country with an exascale computer, as has a European consortium, the Partnership for Advanced Computing in Europe (PRACE). The Indian Institute of Technology Delhi (IIT Delhi) is partnering with NVIDIA to create a research lab to try to reach exascale computing in India by 2017.
Brinkman also argues that federally funded HPC research is an enormous boon to industry at home. "Growth in computing performance has the potential to advance multiple sectors of our economy, including science, manufacturing, and national defense," he testified before Congress. As one example, he pointed out that corporations are conducting 15 projects in the Industrial High Performance Computing Partnerships Program at Oak Ridge National Laboratory (ORNL).
Others have also become very vocal in defending federal R&D in general as a boon to the economy. The Washington think tank ITIF estimates that projected cuts in R&D will reduce the GDP by between $203 billion and $860 billion over the next nine years. It also says that sequestration will put the US "$511 billion behind in R&D investment when compared to expected Chinese R&D expenditure growth rates."
In an editorial in The Atlantic, National Lab Directors Paul Alivisatos (Lawrence Berkeley National Laboratory), Eric D. Isaacs (Argonne) and Thom Mason (ORNL) write that the impact of sequestration "will be felt years – or even decades – in the future, when the nation begins to feel the loss of important new scientific ideas that now will not be explored, and of brilliant young scientists who now will take their talents overseas or perhaps even abandon research entirely." Federal R&D spending amounts to less than one percent of the federal budget, they argue, and cuts will result in "gaps in the innovation pipeline [that] could cost billions of dollars and hurt the national economy for decades to come."
In an editorial in The Financial Times, MIT president Rafael Reif and former Intel CEO Craig Barrett argue that "scientific discovery improves life and creates wealth like nothing else. But that notion has essentially been on trial in the US for decades." They point out that the Commerce Department has estimated that 75 percent of postwar growth came from technological innovation.
Some people, however, dispute those numbers. Roger Pielke, a professor of environmental studies at the Center for Science and Technology Policy Research at the University of Colorado at Boulder, has become something of a de facto spokesman countering the economic arguments. He is also a Senior Fellow at The Breakthrough Institute, which he describes as a "progressive think tank." He argues that the numbers claiming economic growth from R&D are bogus. "It would be remarkable if true," he writes on the organization's website. "Unfortunately, it is not." He says there is no statistical basis for the claims. He also says that the early proponents of the theories underpinning those arguments – Joseph Schumpeter, who attributed economic growth to "creative destruction," and Robert Solow, who credited "technical change" – have been misunderstood.
Many fiscal conservatives in Congress are likely to agree. The result so far is that the debate continues and budget cuts may still slice into funding of HPC centers, federal labs, and federal R&D in general. It's an impact that may be felt for years to come.