Tag: exascale computing
At the tail end of 2013, Congress passed a law directing the Department of Energy to develop exascale computing capability within the next decade in order to meet the objectives of the nuclear stockpile stewardship program. The directive is part of the 2014 National Defense Authorization Act, which President Obama signed into law in December.
The United States Department of Energy has announced a plan to field an exascale system by 2022, but says in order to meet this objective it will require an investment of $1 billion to $1.4 billion for targeted research and development.
On the hardware side, power is probably the biggest technical hurdle to building exascale systems. The more immediate challenge in getting to exascale, though, is demonstrating the value of supercomputing to funders and the public. And the top roadblock to realizing the potential benefits of exascale is software.
Moore’s Law is projected to come to an end sometime around the middle of the next decade — a timeframe that coincides with the epoch of exascale computing. A white paper by Marc Snir, Bill Gropp and Peter Kogge discusses what we should be doing now to prepare high performance computing for the post-Moore’s Law era.
The first international effort to bring climate simulation software onto next-generation exascale platforms got underway earlier this spring. The project, named Enabling Climate Simulation (ECS) at Extreme Scale, is funded by the G8 Research Councils Initiative on Multilateral Research and brings together some of the heavyweight organizations in climate research and computer science, not to mention some of the top supercomputers on the planet.
The challenge of climate change brings out the worst in us.
Hewlett Packard’s Partha Ranganathan outlines a path for exascale computing.
In his third column on programming for exascale systems, Michael Wolfe shares his views on what programming at the exascale level is likely to require, and how we can get there from where we are today. He explains that it will take some work, but not a wholesale rewrite of 50 years of high-performance computing expertise.
Is the HPC community too focused on the 10-year milestone?