Here is a collection of highlights from this week’s news stream as reported by HPCwire.
Air Force Selects IBM to Design Cloud Architecture for Cyber Security
Four ORNL Researchers Selected for Recovery Act Early Career Funds
mental images Introduces RealityServer AppLab
LSU-led Research Team Receives INCITE Award
CFD Simulation Used to Design World’s Biggest Wing
IDC Leads Consortium Awarded Contract to Help Develop HPC Strategy for the EU
DOE Awards Supercomputing Time to UC San Diego, SDSC Researchers
Developing a Cyberinfrastructure for Comparative Effectiveness in Cancer Research
DataDirect Plans Middle East Office
Bell Microproducts, Force10 Partner for Networking Solutions
NASA Selects Parabon to Develop ‘Modeling and Simulation as a Service’ Solution
Chelsio Delivers End-To-End Unified Storage
TACC’s Ranger Turns Two
It seems like just yesterday that we witnessed Ranger’s debut, which marked the beginning of the Petascale Era in high-performance computing. When Ranger came online, it offered more than six times the performance of the previous largest system for open science research. Today, the Texas Advanced Computing Center (TACC) celebrates two years of Ranger enabling groundbreaking computational science in Texas and across the nation.
The first system in the NSF “Path to Petascale” program and one of the most powerful systems in the world for open science research, Ranger offers a peak performance of 579.4 teraflops, which has secured its place among the top 10 systems on the TOP500 list for the past two years. Ranger also ranks among the top systems in terms of total memory at 123 terabytes. When Ranger first came online, that scale of memory allowed it to tackle problems that could not be run elsewhere.
A Sun Constellation cluster comprising 15,744 quad-core AMD Opteron processors running Linux, Ranger has also done a lot to establish Linux clusters as viable platforms for high-end supercomputing. According to Jay Boisseau, principal investigator of the Ranger project and director of TACC:
It’s no longer a question as to whether Linux clusters can scale to petaflops. Ranger is a robust and powerful science instrument. We designed it with large memory as well as large compute capability so it would be a general purpose system, and it now enables scientific research across fields and domains.
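Those headline numbers are easy to sanity-check, by the way. Here’s a back-of-the-envelope look at where the 579.4-teraflop peak figure comes from; the processor count is from TACC, while the 2.3 GHz clock and four floating-point operations per core per cycle are our own assumed values, typical of Barcelona-generation Opterons, not figures from TACC’s announcement:

```python
# Back-of-the-envelope check of Ranger's 579.4 TF peak rating.
# Processor count comes from TACC; the clock speed and flops per
# cycle are assumed values typical of Barcelona-era Opterons.

processors = 15_744       # quad-core AMD Opterons (per TACC)
cores = processors * 4    # 62,976 cores in total
clock_hz = 2.3e9          # assumed 2.3 GHz core clock
flops_per_cycle = 4       # assumed 4 double-precision flops/cycle/core

peak_tflops = cores * clock_hz * flops_per_cycle / 1e12
print(f"peak: {peak_tflops:.1f} teraflops")   # -> peak: 579.4 teraflops
```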
In Nov. 2009, TACC announced that the Ranger supercomputer had run over one million jobs in under two years. Since entering full production on Feb. 4, 2008, Ranger has completed more than 1,089,075 jobs and logged 754,873,713.8 hours of processing time, with an impressive 97 percent uptime. The system counts 2,863 users across 981 unique research projects.
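For a sense of what the “average” job looks like, a quick division of those totals (a rough sketch that assumes the reported hours of processing time are core-hours, which the announcement doesn’t state explicitly):

```python
# Rough per-job average implied by the production statistics above.
# Assumes "hours of processing time" means core-hours (our reading;
# not stated explicitly in TACC's announcement).

jobs = 1_089_075
core_hours = 754_873_713.8

avg = core_hours / jobs
print(f"average job: {avg:,.0f} core-hours")  # -> average job: 693 core-hours

# On Ranger's 16-core nodes, ~693 core-hours is roughly a 43-hour
# run on one node, or under 3 hours spread across 16 nodes.
```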
Omar Ghattas, a co-principal investigator on the project, explains how Ranger changed the playing field, especially with regard to the petascale era:
Ranger’s enormous speed and memory drove computational scientists to rethink their underlying models, data structures and algorithms so that their codes could, for the first time, capitalize on the tens of thousands of processor cores provided by a new generation of systems such as Ranger. In this sense, Ranger has served as the gateway to the petascale era for the computational science community.
Ranger is in the best part of its life, says Boisseau. Future plans include exploring more data-intensive computing projects, improving the performance of the archival storage system, and adding more fast disk storage.
Microsoft Gives NSF Researchers Free Access to Azure
Microsoft Corp. and the National Science Foundation (NSF) have partnered on a project that gives selected NSF researchers free use of Windows Azure for three years. Projects and researchers will be selected and managed by the NSF.
From Microsoft’s announcement:
Windows Azure provides on-demand compute and storage to host, scale and manage Web applications on the Internet through Microsoft datacenters. Microsoft researchers and developers will work with grant recipients to equip them with a set of common tools, applications and data collections that can be shared with the broad academic community, and also provide its expertise in research, science and cloud computing.
In this increasingly data-driven world, the goal of the program is to make it simpler for researchers to share and access data and to collaborate on projects. Jeannette M. Wing, assistant director for the NSF Computer and Information Science and Engineering directorate, explains further:
We’ve entered a new era of science — one based on data-driven exploration — and each new generation of computing technology, such as cloud computing, creates unprecedented opportunities for discovery. We are working with Microsoft to provide the academic community a novel cloud computing service with which to experiment and explore, with the grander goal of advancing the frontiers of science and engineering as we tackle societal grand challenges.
NSF Submits $7.4 Billion Budget Request
This week the National Science Foundation (NSF), in its 60th anniversary year, submitted to Congress its $7.4 billion budget request for fiscal year 2011. And it is filled with HPC-sounding goodies.
Some basics from the announcement:
The request represents an 8-percent increase over 2010 and supports the President’s goal of increasing the nation’s total public and private investment in research and development to at least 3 percent of the gross domestic product.
Three main thrusts of the proposal are NSF programs that aim to prepare the next generation of high-tech workers. They are part of the President’s National Innovation Strategy, unveiled last year:
- The Advanced Technological Education (ATE) program, which supports new and enhanced two-year college programs that educate technicians for the high-technology workforce.
- The Graduate Research Fellowship program and the Faculty Early Career Development (CAREER) program, which support, respectively, students and early-career investigators in order to foster the nation’s next generation of scientists and engineers.
- Climate Change Education, which addresses learning at all levels, and is another initiative designed to develop future scientists and engineers.
Other key points include increased funding for the Networking and Information Technology R&D program (NITRD), which would support research funding for large-scale networking, high-end computing and human-computer interaction. The budget would also enhance support for Science and Engineering Beyond Moore’s Law (SEBML), a program that is designed to solve great computational challenges by focusing on new frameworks for computing.
It’s nice to see advanced computation given such an integral role in the strategy for advancing the nation’s science and research goals. This was highlighted when NSF Director Arden L. Bement, Jr. said NSF’s fiscal year 2011 budget proposal is “designed to keep the agency at the forefront and, in turn, to insure the future well-being of not only the United States but humanity generally.”