Tag: Lawrence Berkeley National Laboratory
In our second video feature from the HPC User Forum panel, “The Who-What-When of Getting Applications Ready to Run On, And Across, Office of Science Next-Gen Leadership Computing Systems,” we learn more about the goals and challenges associated with getting science applications ready for the coming crop of Department of Energy (DOE) supercomputers, which in addition to being five-to-seven times faster than Read more…
The National Energy Research Scientific Computing (NERSC) Center, located at Lawrence Berkeley National Laboratory, has formally accepted “Edison,” a Cray XC30 supercomputer named in honor of famed American inventor Thomas Alva Edison. The milestone comes just as NERSC is commemorating 40 years of scientific advances, prompting NERSC Director Sudip Dosanjh to comment: “As Read more…
A team of scientists and mathematicians at the DOE’s Lawrence Berkeley National Laboratory is using powerful NERSC supercomputers together with sophisticated algorithms to create cleaner combustion technologies.
What good is computing if it’s not reliable? An international team of researchers just got a little closer to realizing the grand challenge that is practical quantum computing.
The hunt for new and useful materials got a big boost this week when Intermolecular agreed to lend its advanced combinatorial processing technology to the Materials Project, a materials-discovery computing project launched by Lawrence Berkeley National Lab and the Massachusetts Institute of Technology (MIT).
Although Horst Simon was named Deputy Director of Lawrence Berkeley National Laboratory, he maintains his strong ties to the scientific computing community as an editor of the TOP500 list and as an invited speaker at conferences.
Prominent figures in government, national labs, universities and other research organizations are worried about the effect that sequestration and budget cuts may have on federally funded R&D in general, and on HPC research in particular. They have been defending the concept in hearings and in editorial pages across the country. It may be a tough argument to sell.
The top research stories of the week include the 2012 Turing Award winners; an examination of MIC acceleration in short-range molecular dynamics simulations; a new computer model to help predict the best HIV treatment; the role of atmospheric clouds in climate change models; and more reliable HPC cloud computing.
US-Australia research team solves “impossible” mathematical calculation.
Researchers from Berkeley Lab are examining options for scientific computing users to move beyond physical infrastructure, including the use of public clouds. A recently published study of Amazon EC2’s handling of data from the Nearby Supernova Factory sheds light on putting large-scale scientific computing into the cloud, in practice and in theory.