Australian Government Unveils New Defense, Weather Supercomputers

August 15, 2022

The Australian government has been busy on the supercomputing front. In just the last two weeks, the Australian Department of Defence and the Australian Bureau of Meteorology have both revealed major supercomputing updates. The Department of Defence, for its part, unveiled a new supercomputer: Taingiwilta, named after the word for “powerful” in the language of the... Read more…

Supercomputer Simulations Elucidate Shakes of Explosions, Earthquakes

August 13, 2022

Explosions are the bread and butter of Lawrence Livermore National Laboratory (LLNL), one of the labs under the auspices of the National Nuclear Security Administration (NNSA). But not all explosions, of course, are nuclear – and now, researchers at LLNL and waveform simulation startup Mondaic have created a new, supercomputer-powered model to simulate the ground shaking that ensues... Read more…

Supercomputer Models Explosives Critical for Nuclear Weapons

August 6, 2022

Lawrence Livermore National Laboratory (LLNL) is one of the laboratories that operates under the auspices of the National Nuclear Security Administration (NNSA), which manages the United States’ stockpile of nuclear weapons. Amid major efforts to modernize that stockpile, LLNL has announced that researchers from its own Energetic Materials Center... Read more…

Exascale Climate Model Used to Examine the Climate Consequences of Nuclear War

December 3, 2020

Often, we tend to examine global crises in isolation – however, they can have surprising feedback effects on one another. The COVID-19 pandemic, for instance... Read more…

HPE Strikes Deal to Provide Crossroads, a New Supercomputer for the Nuclear Stockpile

October 1, 2020

The three national laboratories (Lawrence Livermore, Los Alamos and Sandia) that support the National Nuclear Security Administration (NNSA) occupy a strange role in the landscape of government-funded research and supercomputing. The NNSA manages the military applications of nuclear science... Read more…

Nuclear Deterrence: In Supercomputing We Trust

July 15, 2011

Not everyone is on board with the NNSA's Stockpile Stewardship Program. Read more…

Lawrence Livermore Prepares for 20 Petaflop Blue Gene/Q

February 3, 2009

Roadrunner and Jaguar, the DOE supercomputers that launched the petaflop era last year, will soon be eclipsed by new machines more than ten times as powerful. IBM and the US National Nuclear Security Administration announced on Tuesday that in 2011 Lawrence Livermore National Laboratory will install a 20 petaflop system to provide computational support for the country's aging nuclear weapons. Read more…


Whitepaper

Streamlining AI Data Management

Five Recommendations to Optimize Data Pipelines

When building AI systems at scale, managing the flow of data can make or break a business. The various stages of the AI data pipeline pose unique challenges that can disrupt or misdirect the flow of data, ultimately impacting the effectiveness of AI storage and systems.

With so many applications and diverse requirements for data types, management systems, workloads, and compliance regulations, these challenges are only amplified. Without a clear, continuous flow of data throughout the AI data lifecycle, AI models can perform poorly or even dangerously.

To ensure your AI systems are optimized, follow these five essential steps to eliminate bottlenecks and maximize efficiency.


Sponsored by DDN

Whitepaper

Taking research further with extraordinary compute power and efficiency

Karlsruhe Institute of Technology (KIT) is an elite public research university located in Karlsruhe, Germany, engaged in a broad range of disciplines across the natural sciences, engineering, economics, humanities, and social sciences. For institutions like KIT, HPC has become indispensable to cutting-edge research in these areas.

KIT’s HoreKa supercomputer supports hundreds of research initiatives, including a project aimed at predicting when the Earth’s ozone layer will be fully healed. With HoreKa, projects like these can process larger amounts of data, enabling researchers to deepen their understanding of highly complex natural processes.

Read this case study to learn how KIT implemented their supercomputer powered by Lenovo ThinkSystem servers, featuring Lenovo Neptune™ liquid cooling technology, to attain higher performance while reducing power consumption.


Sponsored by Lenovo
