Tag: Lawrence Berkeley National Laboratory

Application Readiness at the DOE, Part II: NERSC Preps for Cori

Apr 17, 2015 |

In our second video feature from the HPC User Forum panel, “The Who-What-When of Getting Applications Ready to Run On, And Across, Office of Science Next-Gen Leadership Computing Systems,” we learn more about the goals and challenges of getting science applications ready for the coming crop of Department of Energy (DOE) supercomputers, which in addition to being five to seven times faster than…

‘Edison’ Lights Up Research at NERSC

Jan 31, 2014 |

The National Energy Research Scientific Computing Center (NERSC), located at Lawrence Berkeley National Laboratory, has formally accepted “Edison,” a Cray XC30 supercomputer named in honor of famed American inventor Thomas Alva Edison. The milestone comes just as NERSC commemorates 40 years of scientific advances, prompting NERSC Director Sudip Dosanjh to comment: “As…

Supercomputing Targets Cleaner Combustion

Oct 1, 2013 |

A team of scientists and mathematicians at the DOE’s Lawrence Berkeley National Laboratory is using powerful NERSC supercomputers together with sophisticated algorithms to create cleaner combustion technologies.

Toward Stable Quantum Computing

Sep 18, 2013 |

What good is computing if it’s not reliable? An international team of researchers just got a little closer to realizing the grand challenge that is practical quantum computing.

Intermolecular Lends Genomics Data to Materials Project

Jun 26, 2013 |

The hunt for new and useful materials got a big boost this week when Intermolecular agreed to lend its advanced combinatorial processing technology to the Materials Project, a materials-discovery computing project launched by Lawrence Berkeley National Lab and the Massachusetts Institute of Technology (MIT).

“No Exascale for You!” An Interview with Berkeley Lab’s Horst Simon

May 15, 2013 |

Although Horst Simon was named Deputy Director of Lawrence Berkeley National Laboratory, he maintains his strong ties to the scientific computing community as an editor of the TOP500 list and as an invited speaker at conferences.

What Will the Sequester Mean to HPC (and Federal) Research?

Mar 20, 2013 |

Prominent figures in government, national labs, universities, and other research organizations are worried about the effect that sequestration and budget cuts may have on federally funded R&D in general, and on HPC research in particular. They have been defending the value of that research in hearings and on editorial pages across the country. It may be a tough argument to sell.

The Week in HPC Research

Mar 14, 2013 |

The top research stories of the week include the 2012 Turing Award winners; an examination of MIC acceleration in short-range molecular dynamics simulations; a new computer model to help predict the best HIV treatment; the role of atmospheric clouds in climate change models; and more reliable HPC cloud computing.

IBM Supercomputer Reveals More Pieces of Pi

Apr 20, 2011 |

A US-Australia research team solves an “impossible” mathematical calculation.

Supernova Factory Employs EC2, Puts Cloud to the Test

Jul 9, 2010 |

Researchers from Berkeley Lab are evaluating options for scientific computing users to move beyond physical infrastructure, including the possibility of deploying public clouds. A recently published study of Amazon EC2’s handling of data from the Nearby Supernova Factory sheds light on putting large-scale scientific computing into the cloud, in practice and in theory.