Tag: Lawrence Livermore National Laboratory
The old adage “you cannot improve what you do not measure” is fresh again in the age of ubiquitous data. When considering the challenges of exascale computing, power is right at the top of the list and the major leadership-class centers want to make sure they’re doing everything they can to manage the demands of power today – which can run as high as 10 MW at peak for the largest machines – and in the coming exascale era, when the number could be three times that high. At loads of this magnitude, the largest HPC facilities need to have all the relevant power data within arm’s reach.
Come early March, grants worth $300,000 are up for grabs, giving manufacturers year-long access to national lab supercomputing cycles and half the staff hours of computer scientists with domain expertise under a U.S. Department of Energy program called High Performance Computing for Manufacturing (HPC4Mfg). In Phase 1 of the program, manufacturers as large as Read more…
In the realm of pathogenic threats, there are the usual suspects – anthrax, botulism, tuberculosis – but in reality there is a whole host of bacterial and viral pathogenic microbes that can threaten human and animal health. Protecting the public against these threats falls under the domain of biosecurity. While staying ahead of Read more…
It’s an unfortunate side effect of the drug development process that unsafe drugs sometimes slip through the extensive vetting period. So far no magic-bullet drug has been developed that is completely without risk, but scientists are working hard to root out dangerous side effects, which, according to the journal Nature, kill at least 100,000 patients Read more…
Lawrence Livermore National Laboratory’s High Performance Computing Innovation Center (HPCIC) in the US and the Science and Technology Facilities Council (STFC) in the United Kingdom are combining efforts to help industry stakeholders in both countries leverage supercomputing to accelerate innovation and boost economic competitiveness.
In this week’s hand-picked assortment, researchers consider virtualized HPC as a Service, low latency on global cloud systems, accelerators, and a survey of the HPC cloud environment as a whole.
HPC can save energy companies enormous amounts of time in the development of new products and technologies compared to traditional methods, according to hpc4energy.org, an incubator project at Lawrence Livermore National Laboratory.
Chalk up another win for Sequoia and high-performance computing: the IBM Blue Gene system breaks two more records.
LLNL researchers have successfully harnessed all 1,572,864 of Sequoia’s cores for one impressive simulation.
Everybody loves predictions. Here are a few made by IEEE group members at SC12, in case you missed them.