Tag: Oak Ridge National Laboratory
Anybody who drives one of Ford’s recent vehicles spends a little less money on gasoline thanks to HPC work the carmaker undertook with Oak Ridge National Laboratory, where more than one million processor hours were spent getting a handle on the complex fluid dynamics governing airflow under the hood.
Researchers are licking their chops at the potential to speed the execution of parallel applications on the largest supercomputers using Vampir, a performance tool that traces events and identifies problems in HPC applications.
A diminutive marine crustacean called the Gribble landed on the biofuel industry’s radar for its unique ability to digest wood in salty conditions. Now, researchers in the US and the UK are putting the University of Tennessee’s Kraken supercomputer to work modeling an enzyme in the Gribble’s gut, which could unlock the key to developing better industrial enzymes in the future.
Titan, the Cray XK7 at Oak Ridge National Laboratory that debuted last fall as the fastest supercomputer in the world with 17.59 petaflops of sustained computing power, will rely on its previous LINPACK result for the upcoming edition of the TOP500 list.
Getting scientific applications to scale across Titan’s 300,000 compute cores means there will be bugs. Finding those bugs is where Allinea DDT comes in.
The large-scale classical physics problems that remain unsolved must for the most part be run in parallel on high-performance machines like the Kraken supercomputer. Millions of variables culled from billions of particles make this type of research intractable for ordinary computational resources.
Prominent figures in government, national labs, universities and other research organizations are worried about the effect that sequestration and budget cuts may have on federally funded R&D in general, and on HPC research in particular. They have been defending the concept in hearings and in editorial pages across the country. It may be a tough argument to sell.
When it comes to Titan’s final acceptance testing, ORNL says not so fast.
As the data sets generated by the increasingly powerful neutron scattering instruments at ORNL’s Spallation Neutron Source (SNS) grow ever more massive, the facility’s users require significant advances in data reduction and analysis tools. To meet the challenge, SNS data specialists have teamed with ORNL’s Computing and Computational Sciences Directorate.
In 2012 Oak Ridge National Laboratory will initiate a major upgrade of Jaguar using the latest CPUs and GPUs, resulting in a new 10-20 petaflop supercomputer called Titan. Such a system will require a concerted effort from many teams at ORNL, including the Application Performance Tools Group, headed by Richard Graham. In this interview he describes the challenges of bringing all the supercomputing software tools up to speed for the new system.