Tag: Oak Ridge National Laboratory

Debugging at Titan Scale

Apr 15, 2013 |

Getting scientific applications to scale across Titan’s 300,000 compute cores means there will be bugs. Finding those bugs is where Allinea DDT comes in.

Releasing the Kraken on Protoplanetary Disks

Apr 2, 2013 |

The large-scale classical physics problems that remain unsolved must, for the most part, be run in parallel on high-performance machines like the Kraken supercomputer. Millions of variables culled from billions of particles combine to make this type of research intractable for ordinary computational physics.

What Will the Sequester Mean to HPC (and Federal) Research?

Mar 20, 2013 |

Prominent figures in government, national labs, universities and other research organizations are worried about the effect that sequestration and budget cuts may have on federally funded R&D in general, and on HPC research in particular. They have been defending such funding in hearings and on editorial pages across the country. It may be a tough argument to sell.

World’s Fastest Supercomputer Hits Speed Bump

Feb 27, 2013 |

When it comes to Titan’s final acceptance testing, ORNL says not so fast.

Neutron Science and Supercomputing Come Together at Oak Ridge National Lab

Dec 4, 2012 |

As the data sets generated by the increasingly powerful neutron scattering instruments at ORNL’s Spallation Neutron Source (SNS) grow ever more massive, the facility’s users require significant advances in data reduction and analysis tools. To meet the challenge, SNS data specialists have teamed with ORNL’s Computing and Computational Sciences Directorate.

Software Tools Will Need Refresh for ORNL’s Titan Supercomputer

Nov 21, 2011 |

In 2012, Oak Ridge National Laboratory will initiate a major upgrade of Jaguar using the latest CPUs and GPUs, resulting in a new 10-20 petaflop supercomputer called Titan. Such a system will require a concerted effort by many teams at ORNL, including the Application Performance Tools Group, headed by Richard Graham. In this interview he describes the challenges of bringing all the supercomputing software tools up to speed for the new system.

Oak Ridge Supercomputers Modeling Nuclear Future

May 9, 2011 |

The Department of Energy has backed the Consortium for Advanced Simulation of Light Water Reactors at Oak Ridge National Laboratory. This sweeping five-year effort will unleash the power of HPC to simulate innovative designs that could dramatically improve nuclear safety, output, and waste reduction.

Thoughts on a US-Chinese HPC Partnership

Apr 13, 2011 |

An ORNL representative addresses the idea of a US-Chinese supercomputing alliance.

Oak Ridge Looks Toward 20 Petaflop Super

Mar 7, 2011 |

Future NVIDIA Tesla-equipped Cray machine will put lab at the forefront of GPU computing.

NOAA-ORNL Climate Research Collaboration Sets Lofty Goals for New Supercomputer

Jul 26, 2010 |

A year ago, NOAA and DOE signed an agreement calling for closer cooperation between NOAA and Oak Ridge National Laboratory. Jim Rogers, director of operations for the National Center for Computational Sciences at ORNL, discusses the agreement and the goals for the Climate Modeling and Research System (CMRS), the initial supercomputer chosen for the collaborative work.