Tag: Oak Ridge National Laboratory

Ford Taps ORNL to Boost Vehicle Airflow, Fuel Efficiency

Aug 19, 2013 |

Anybody who drives one of Ford’s recent vehicles spends a little less money on gasoline thanks to HPC work the carmaker undertook with Oak Ridge National Laboratory, where more than one million processor hours were spent getting a handle on the complex fluid dynamics governing airflow under the hood.

Vampir Rises to the Occasion at ORNL

Jul 31, 2013 |

Researchers are licking their chops at the potential to speed the execution of parallel applications on the largest supercomputers using Vampir, a performance tool that traces events and identifies problems in HPC applications.

Kraken Chews on Gribble Data for Industrial Enzyme Research

Jun 25, 2013 |

A diminutive marine crustacean called the gribble landed on the biofuel industry’s radar for its unique ability to digest wood in salty conditions. Now, researchers in the US and the UK are putting the University of Tennessee’s Kraken supercomputer to work modeling an enzyme in the gribble’s gut, which could hold the key to developing better industrial enzymes in the future.

Titan Didn’t Redo LINPACK for June Top 500 List

Jun 13, 2013 |

Titan, the Cray XK7 at the Oak Ridge National Lab that debuted last fall as the fastest supercomputer in the world with 17.59 petaflops of sustained computing power, will rely on its previous LINPACK test for the upcoming edition of the Top 500 list.

Debugging at Titan Scale

Apr 15, 2013 |

Getting scientific applications to scale across Titan’s 300,000 compute cores means there will be bugs. Finding those bugs is where Allinea DDT comes in.

Releasing the Kraken on Protoplanetary Disks

Apr 2, 2013 |

The large-scale classical physics problems that remain unsolved must, for the most part, be run in parallel on high-performance machines like the Kraken supercomputer. Millions of variables culled from billions of particles make this type of research impractical with ordinary computational resources.

What Will the Sequester Mean to HPC (and Federal) Research?

Mar 20, 2013 |

Prominent figures in government, national labs, universities and other research organizations are worried about the effect that sequestration and budget cuts may have on federally funded R&D in general, and on HPC research in particular. They have been defending the concept in hearings and on editorial pages across the country. It may be a tough argument to sell.

World’s Fastest Supercomputer Hits Speed Bump

Feb 27, 2013 |

When it comes to Titan’s final acceptance testing, ORNL says not so fast.

Neutron Science and Supercomputing Come Together at Oak Ridge National Lab

Dec 4, 2012 |

As the data sets generated by the increasingly powerful neutron scattering instruments at ORNL’s Spallation Neutron Source (SNS) grow ever more massive, the facility’s users require significant advances in data reduction and analysis tools. To meet the challenge, SNS data specialists have teamed with ORNL’s Computing and Computational Sciences Directorate.

Software Tools Will Need Refresh for ORNL’s Titan Supercomputer

Nov 21, 2011 |

In 2012 Oak Ridge National Laboratory will initiate a major upgrade of Jaguar using the latest CPUs and GPUs, resulting in a new 10-20 petaflop supercomputer called Titan. Such a system will require a concerted effort by many teams at ORNL, including the Application Performance Tools Group, headed by Richard Graham. In this interview he describes the challenges of bringing all the supercomputing software tools up to speed for the new system.