Researchers are eager to speed the execution of parallel applications on the largest supercomputers using Vampir, a performance tool that traces events and identifies problems in HPC applications.
Getting scientific applications to scale across Titan’s 300,000 compute cores means there will be bugs. Finding those bugs is where Allinea DDT comes in.
<img src="http://media2.hpcwire.com/hpcwire/argonne_crop.jpg" alt="" width="94" height="72" />Prominent figures in government, national labs, universities and other research organizations are worried about the effect that sequestration and budget cuts may have on federally funded R&D in general, and on HPC research in particular. They have been defending the concept in hearings and on editorial pages across the country. It may be a tough argument to sell.
When it comes to Titan’s final acceptance testing, ORNL says not so fast.
<img src="http://media2.hpcwire.com/hpcwire/puzzle.jpg" alt="" width="95" height="95" />The TOP500 list provides a valuable source of information to the HPC community. But every year, some of the data requested by the organizers is missing. And wouldn’t it be a good idea to add some new data points to the list?
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/doe-logo-small.png" alt="" width="96" height="96" />The national labs at Oak Ridge, Argonne and Lawrence Livermore are banding together for their next refresh of supercomputers. In late 2016 or early 2017, all three Department of Energy (DOE) centers are looking to deploy their first 100-plus petaflop systems, which will serve as precursors to their exascale machines further down the line. The labs will issue a request for proposal (RFP) later this year with the goal of awarding the work to two prime subcontractors.
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/Fusion_simulation.bmp" alt="" width="93" height="89" />As the data sets generated by the increasingly powerful neutron scattering instruments at ORNL’s Spallation Neutron Source (SNS) grow ever more massive, the facility’s users require significant advances in data reduction and analysis tools. To meet the challenge, SNS data specialists have teamed with ORNL’s Computing and Computational Sciences Directorate.
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/Titan_supercomputer_small.jpg" alt="" width="98" height="77" />Oak Ridge National Laboratory has officially launched its much-anticipated Titan supercomputer, a Cray XK7 machine that will challenge IBM’s Sequoia for petaflop supremacy. With Titan, ORNL gets a system that is 10 times as powerful as Jaguar, the lab’s previous top system upon which the new machine is based. With a reported 27 peak petaflops, Titan now represents the most powerful number-cruncher in the world.
ORNL machine gets initial taste of NVIDIA’s new K20 GPUs.
<img style="float: left;" src="http://media2.hpcwire.com/hpcwire/Galen_Shipman_climate_small.jpg" alt="" width="90" height="90" />Supercomputers at Oak Ridge National Laboratory produce some of the world’s largest scientific datasets, many of which are related to climate change research. In this interview, Galen Shipman, data-systems architect for ORNL’s Computing and Computational Sciences Directorate and the person who oversees data management at the OLCF, discusses strategies for coping with the “3 Vs” of big data: variety, velocity, and volume.