

Tag: NNSA

The Nuts and Bolts of Nuclear Stewardship

Sep 10, 2014 |

Sometimes the impetus behind large-scale computing endeavors can be surprising. Take the case of nuts and bolts. Given the right context, these everyday objects become a much bigger deal, such as when the context is nuclear missile design. Every component of a nuclear weapon body must go through a painstaking review process. As an article at Deixis Magazine Read more…

Los Alamos Lead Shares ‘Trinity’ Feeds and Speeds

Jul 10, 2014 |

We’ve been anticipating news around the Trinity supercomputer for some time now, and today we’re graced with the news that Cray will be supplying the machine in two phases, with the final phase to be complete in 2016. For the original background, the first run of the story can be found here. Since that time this Read more…

First Details Emerge from Cray on Trinity Supercomputer

Jul 10, 2014 |

Note – 7:32 p.m. Eastern: We have full details from Los Alamos about the system in a detailed update article. Cray has been granted one of the largest awards in its history for the long-awaited “Trinity” supercomputer. This morning the company announced a $174 million deal to provide the National Nuclear Security Administration (NNSA) with Read more…

DOE Sets Exascale Pricetag

Sep 16, 2013 |

The United States Department of Energy has announced a plan to field an exascale system by 2022, but says that meeting this objective will require an investment of $1 billion to $1.4 billion in targeted research and development.

Stanford Gets Federal Funding to Bring Solar Research to Exascale Levels

Aug 2, 2013 |

Stanford University will receive $16 million over the next five years from the National Nuclear Security Administration (NNSA) to use supercomputers to find ways to increase the efficiency of solar energy concentrators. The research project involves developing new models that will help solve vexing engineering challenges on the next generation of exascale supercomputers.

Exascale Advocates Stand on Nuclear Stockpiles

May 23, 2013 |

In quieter times, the call to fund big science with big systems resonates further than it does when ears are already burning with sour economic and national security news. For exascale’s future, however, the time could be ripe to instill some sense of urgency…

Sequoia Goes Core-AZY

Mar 20, 2013 |

LLNL researchers have successfully harnessed all 1,572,864 of Sequoia’s cores for one impressive simulation.

Sequoia Supercomputer Runs Cosmology Code at 14 Petaflops

Nov 29, 2012 |

Lawrence Livermore machine sets new record for sustained application performance.

DOE Primes Pump for Exascale Supercomputers

Jul 12, 2012 |

Intel, AMD, NVIDIA, and Whamcloud have been awarded tens of millions of dollars by the US Department of Energy (DOE) to kick-start research and development required to build exascale supercomputers. The work will be performed under the FastForward program, a joint effort run by the DOE Office of Science and the National Nuclear Security Administration (NNSA) that will focus on developing future hardware and software technologies capable of supporting such machines.

NetApp Gets First Petascale Supercomputer Win

Sep 28, 2011 |

NetApp flexed its newly acquired supercomputing muscles this week when it announced it would be supplying one of the largest Lustre storage systems in the world for the Sequoia supercomputer to be installed at Lawrence Livermore National Laboratory next year. NetApp’s E-Series storage, which it acquired when it purchased LSI’s Engenio business, will be used to provide 55 petabytes of disk arrays for the 20-petaflop Sequoia machine.