Scientists at Stanford have devised a “virtual earthquake” technique capable of predicting the effects of a major quake along the southern San Andreas Fault. The remarkable thing about the new technique is that it relies on weak vibrations generated by the Earth’s oceans to create these “virtual earthquakes” and forecast the resulting ground motion.
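The core idea behind using ocean-generated vibrations this way is ambient-noise interferometry: cross-correlating long stretches of the weak background wavefield recorded at two stations to estimate the impulse response between them. The sketch below is a minimal, hypothetical illustration of that step on synthetic data (the station records, sampling rate, and travel time are all assumed for the example), not the Stanford team's code.

```python
# Minimal sketch of ambient-noise cross-correlation on synthetic data.
# Station B records the same "ocean noise" as station A, delayed by the
# travel time along the path between them, plus incoherent local noise.
import numpy as np

rng = np.random.default_rng(0)
fs = 20.0                          # sampling rate in Hz (assumed)
n = 20 * 3600 * int(fs)            # 20 hours of synthetic noise

travel_time_s = 12.0               # hypothetical inter-station travel time
delay = int(travel_time_s * fs)
source = rng.standard_normal(n + delay)
station_a = source[delay:] + 0.5 * rng.standard_normal(n)
station_b = source[:n] + 0.5 * rng.standard_normal(n)   # delayed copy of A

# Frequency-domain cross-correlation; the lag of the peak estimates the
# travel time, i.e. one feature of the inter-station impulse response.
nfft = 2 * n
spec = np.fft.rfft(station_b, nfft) * np.conj(np.fft.rfft(station_a, nfft))
xcorr = np.roll(np.fft.irfft(spec, nfft), n)   # center zero lag
lags = (np.arange(2 * n) - n) / fs
print(f"estimated travel time: {lags[np.argmax(xcorr)]:.2f} s "
      f"(true: {travel_time_s:.2f} s)")
```

In practice such impulse responses, recovered from real noise records, can be scaled to simulate the strong ground motion a large rupture would produce, which is what makes the "virtual earthquake" forecast possible.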
In 2011, South Carolina-based BMI Corp. worked with researchers at Oak Ridge National Laboratory (ORNL) to develop a technology that improves the aerodynamics of long-haul tractor-trailers, thereby boosting fuel efficiency. Two years later, the company and the partnership are still going strong.
Chalk up another win for Sequoia and high-performance computing: the IBM Blue Gene/Q system breaks two more records.
Japanese project marries brain research and robotics, with big-time implications.
University of Oklahoma researcher zeros in on why some storms generate tornadoes while others don’t.
Huawei falls under scrutiny for its deal with the National Center for Computational Engineering.
Projects like the Sloan Digital Sky Survey have provided a wealth of cosmological data for scientists to explore in detail. However, making use of those terabytes — and generating far more data in the process of simulating and analyzing new concepts — is highlighting the bottlenecks for scientific computing at massive scale.
Advanced computing resources optimize site selection for wind farms.
The first international effort to bring climate simulation software onto the next-generation exascale platforms got underway earlier this spring. The project, named Enabling Climate Simulation (ECS) at Extreme Scale, is being funded by the G8 Research Councils Initiative on Multilateral Research and brings together some of the heavyweight organizations in climate research and computer science, not to mention some of the top supercomputers on the planet.
Researchers mitigate multicore challenges to refine current geological simulation capabilities.