November 14, 2011
SEATTLE, Nov. 14 -- The University of Illinois' National Center for Supercomputing Applications (NCSA) has finalized a contract with Cray Inc. to provide the supercomputer for the National Science Foundation's Blue Waters project.
This new Cray supercomputer will support significant research advances in a broad range of science and engineering domains, meeting the needs of the most compute-intensive, memory-intensive, and data-intensive applications. Blue Waters is expected to deliver sustained performance averaging more than one petaflops on a set of benchmark codes that represent those applications and domains.
More than 25 teams from a dozen research fields are preparing to achieve breakthroughs by using Blue Waters to model a broad range of phenomena, including nanotechnology's minute molecular assemblies, the evolution of the universe since the Big Bang, the damage caused by earthquakes and the formation of tornadoes, the mechanisms by which viruses enter cells, and improved climate change predictions.
Blue Waters will combine more than 235 Cray XE6 cabinets, based on the recently announced AMD Interlagos microprocessor, with more than 30 cabinets of a future version of the recently announced Cray XK6 supercomputer featuring NVIDIA Tesla GPU computing capability, all incorporated into a single, powerful hybrid supercomputer. These Cray XK nodes will further increase the measured sustained performance on real science problems.
"We are extremely pleased to have forged a strong partnership with Cray. This configuration will be the most balanced, powerful, and useable system available when it comes online. By incorporating a future version of the XK6 system, Blue Waters will also provide a bridge to the future of scientific computing," said NCSA Director Thom Dunning.
"The project is an incredible undertaking, requiring commitment and dedication not only from NSF, NCSA, the University of Illinois, and the science teams, but also from our computing systems partner – Cray. This strong partnership further establishes our place at the forefront high-performance computing," said University of Illinois President Michael Hogan.
"The Blue Waters team has the technological capability and the commitment to make this important resource a reality – a resource that will help scientists and engineers solve their most challenging problems," said Phyllis Wise, chancellor of the University of Illinois at Urbana-Champaign.
"We are extremely proud to have been selected to deliver the Blue Waters system through this important partnership with the NSF, the University of Illinois, and NCSA," said Peter Ungaro, president and CEO of Cray. "It's a honor to be able provide the NSF's vast user community with a Cray supercomputer specifically designed for delivering real, sustained petascale performance across a broad range of breakthrough science and engineering applications. It's a passion that drives all the members of this partnership, and we are pleased to be a part of it."
The multi-year, multi-phase contract, which covers both products and services, is valued at more than $188 million. Cray will begin installing hardware in the University of Illinois' National Petascale Computing Facility soon, with an early science system expected to be available in early 2012. Blue Waters is expected to be fully deployed by the end of 2012.
As supercomputers continue to grow in scale and complexity, it becomes more challenging to effectively harness their power. Since the Blue Waters project was launched in 2008, NCSA has helped researchers prepare their codes for the massive scale of this and other extreme-scale systems. NCSA also initiated a broad range of R&D projects designed to improve the performance of the existing HPC software stack and facilitate the development and use of applications on Blue Waters and other petascale computers.
The Blue Waters project is now prepared to mount a major, community-based effort to move the state of computational science into the petascale era. The center will work with the computational and computer science and engineering communities to help them take full advantage of Blue Waters as well as future supercomputers. The effort will focus on scalability and resilience of algorithms and applications, the use of accelerators to improve time to solution for science and engineering problems, and enabling applications to simultaneously use computational components with different characteristics.
For more information about the Blue Waters project, see http://www.ncsa.illinois.edu/BlueWaters/.
For a Cray press release with more information regarding the financial details of the contract and its expected impact on Cray's 2012 outlook, see http://www.cray.com/rd/nov2011.html.
About Cray Inc.
As a global leader in supercomputing, Cray (Nasdaq: CRAY) provides highly advanced supercomputers and world-class services and support to government, industry, and academia. Cray technology is designed to enable scientists and engineers to achieve remarkable breakthroughs by accelerating performance, improving efficiency, and extending the capabilities of their most demanding applications. Cray's Adaptive Supercomputing vision is focused on delivering innovative next-generation products that integrate diverse processing technologies into a unified architecture, allowing customers to surpass today's limitations and meet the market's continued demand for realized performance. Go to www.cray.com for more information.
About the National Center for Supercomputing Applications
The National Center for Supercomputing Applications (NCSA), located at the University of Illinois at Urbana-Champaign, provides powerful computers and expert support that help thousands of scientists and engineers across the country improve our world. Established in 1986 as one of the original sites of the National Science Foundation's Supercomputer Centers Program, NCSA is supported by the state of Illinois, the University of Illinois, the National Science Foundation, and grants from other federal agencies.
Source: Cray; NCSA