Scientists at Stanford have devised a “virtual earthquake” technique capable of predicting the effects of a major quake along the southern San Andreas Fault. The remarkable thing about the new technique is that it relies on weak vibrations generated by the Earth’s oceans to create these “virtual earthquakes” and forecast the resulting ground motion.
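The technique rests on the idea that cross-correlating the weak, ocean-generated noise recorded at two seismometers can recover an estimate of the impulse response between them, which then stands in for a real earthquake source. Below is a minimal Python sketch of that cross-correlation-and-stacking step using synthetic data; the function name, window length, delay, and normalization are illustrative assumptions, not the Stanford group's actual code.

```python
import numpy as np
from scipy.signal import fftconvolve

def noise_cross_correlation(trace_a, trace_b, max_lag):
    """Cross-correlate two noise records, keeping lags within +/- max_lag samples."""
    # Normalize each trace so amplitude differences between stations
    # do not dominate the estimate.
    a = (trace_a - trace_a.mean()) / trace_a.std()
    b = (trace_b - trace_b.mean()) / trace_b.std()
    full = fftconvolve(a, b[::-1], mode="full")   # correlation computed via FFT
    mid = len(full) // 2                          # index of zero lag
    return full[mid - max_lag: mid + max_lag + 1] / len(a)

# Hypothetical usage: station B records the same ocean noise as station A,
# delayed by 120 samples and buried in incoherent noise. Stacking correlations
# over many windows lets the coherent station-to-station delay emerge.
rng = np.random.default_rng(0)
stack = np.zeros(2 * 500 + 1)
for _ in range(100):
    window_a = rng.standard_normal(86400)                             # synthetic noise at station A
    window_b = np.roll(window_a, 120) + rng.standard_normal(86400)    # delayed, noisier copy at station B
    stack += noise_cross_correlation(window_b, window_a, max_lag=500)

print("estimated station-to-station delay (samples):", np.argmax(stack) - 500)
```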
A new computer made of carbon nanotubes, created by a team of Stanford engineers, may be the first serious challenger to silicon.
Stanford University will receive $16 million over the next five years from the National Nuclear Security Administration (NNSA) to use supercomputers to find ways to increase the efficiency of solar energy concentrators. The research project involves developing new models that will help solve vexing engineering challenges on the next generation of exascale supercomputers.
The 20-petaflop, third-generation IBM BlueGene system, Sequoia, may be the number two supercomputer according to the latest TOP500 rankings, but when it comes to maximum core usage, Sequoia has apparently set a new record. A team of Stanford engineers harnessed one million of Sequoia’s nearly 1.6 million cores in parallel to solve a sophisticated fluid dynamics problem.
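Codes of this kind typically scale by domain decomposition: each core owns a piece of the grid and trades thin “halo” layers of boundary data with its neighbors every time step. The sketch below is a minimal, hypothetical mpi4py illustration of that pattern on a 1-D diffusion problem; the grid size, coefficients, and the physics itself are assumptions, not the Stanford solver.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 1000                         # cells owned by this rank (assumed)
u = np.zeros(n_local + 2)              # +2 ghost cells to hold neighbor halos
u[1:-1] = rank                         # trivially different initial data per rank

left  = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

nu, dt, dx = 0.1, 0.1, 1.0             # diffusion coefficient and step sizes (assumed)
for step in range(100):
    # Exchange halo cells so the stencil at the slab edges sees valid neighbor data.
    comm.Sendrecv(sendbuf=u[1:2],   dest=left,  recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Explicit diffusion update on the interior cells only.
    u[1:-1] += nu * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])

print(f"rank {rank}: mean interior value {u[1:-1].mean():.4f}")
```

Saved as, say, halo_demo.py, this runs with `mpiexec -n 4 python halo_demo.py`; the same pattern, extended to 3-D blocks and far richer physics, is what lets a production solver keep a million cores busy.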
Chief scientist discusses memory stacks, interconnects, and US technology leadership.
Projects like the Sloan Digital Sky Survey have provided a wealth of cosmological data for scientists to explore in detail. However, making use of those terabytes — and generating far more data in the process of simulating and analyzing new concepts — is highlighting the bottlenecks for scientific computing at massive scale.