Since 1987 - Covering the Fastest Computers in the World and the People Who Run Them

August 2, 2013

Stanford Gets Federal Funding to Bring Solar Research to Exascale Levels

Alex Woodie

Stanford University will receive $16 million over the next five years from the National Nuclear Security Administration (NNSA) to use supercomputers to find ways to increase the efficiency of solar energy concentrators. The research project involves developing new models that will help solve vexing engineering challenges on the next generation of exascale supercomputers.

The crux of the research will focus on modeling complex physical and chemical interactions that take place in solar-thermal systems, which use mirrors to concentrate sunlight into a fluid that powers a turbine. Key variables that impact the efficiency of such industrial-scale solar systems include the alignment of the mirrors and the size of fine particles that are suspended in the fluid to serve as energy conduits.

The research at Stanford will focus on better understanding and modeling these variables, according to Gianluca Iaccarino, an associate professor of Mechanical Engineering and the leader of the new research project at Stanford.

“We need to rigorously assess the impacts of these sensitivities to be able to compute the efficiency of a system like this,” Iaccarino said in a news story that appeared on the Stanford website. “There is currently no supercomputer in the world that can do this, and no physical model.” 
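Assessing how input sensitivities affect a computed efficiency is the core of uncertainty quantification. As a purely illustrative sketch, and not the Stanford models themselves, a Monte Carlo sweep over two of the variables the article mentions, mirror misalignment and suspended-particle size, might look like this; the `toy_efficiency` surrogate and both input distributions are invented for the example:

```python
import random
import statistics

def toy_efficiency(misalign_mrad, particle_um):
    """Hypothetical efficiency surrogate (not a real solar-thermal model).

    Optical loss grows with mirror misalignment; absorption is assumed
    to peak near a particle diameter of 10 microns.
    """
    optical = max(0.0, 1.0 - 0.05 * misalign_mrad ** 2)
    absorption = 1.0 / (1.0 + ((particle_um - 10.0) / 10.0) ** 2)
    return 0.9 * optical * absorption

random.seed(42)
samples = []
for _ in range(10_000):
    misalign = abs(random.gauss(0.0, 1.0))   # mrad; assumed input distribution
    particle = random.gauss(10.0, 2.0)       # microns; assumed input distribution
    samples.append(toy_efficiency(misalign, particle))

mean_eff = statistics.mean(samples)
std_eff = statistics.stdev(samples)
print(f"mean efficiency: {mean_eff:.3f} +/- {std_eff:.3f}")
```

Propagating input uncertainty this way is cheap for a toy surrogate; doing it with high-fidelity multiphysics simulations in the loop is what pushes the computational cost toward exascale.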

Hence the need for an exascale supercomputer, which brings in the second part of the NNSA’s directive. The NNSA and the Department of Energy have set an ambitious goal of developing an exascale supercomputer by 2018, and meeting that deadline is a major challenge in its own right.

“The supercomputer paradigm has reached a physical apex,” Iaccarino said in the Stanford story. “Energy consumption is too high, the computers get too hot, and it’s too expensive to compute with millions of commodity computers bundled together. Next generation supercomputers will have completely different architectures.” 

The researchers will need to get creative and be flexible in their models, which will need to adapt to whatever architecture emerges in the exascale period. This basically amounts to “programming blind,” the Stanford story says. 

The research at Stanford will involve several of the university’s departments, including the Mechanical Engineering, Aeronautics and Astronautics, Computer Science, and Math departments. Stanford has a long history of multi-disciplinary research work in HPC, including a collaboration that started 15 years ago between the Computer Science Department and the Mechanical Engineering Department to solve physics problems on massively parallel computers.

In addition to the computational work, Stanford will build and operate a physical solar collector experiment. The university will work with five other universities on the project: the University of Michigan, the University of Minnesota, the University of Colorado-Boulder, the University of Texas-Austin, and the State University of New York-Stony Brook.

Stanford, which was one of three universities selected by the NNSA for the project, will receive $3.2 million per year for the next five years. Other universities selected to house research centers under the NNSA’s Predictive Science Academic Alliance Program II (PSAAP II) include the University of Utah and the University of Illinois-Urbana-Champaign.
