December 07, 2007
Dec. 3 -- Researchers from UQ's Earth Systems Science Computational Centre (ESSCC), who were able to predict a series of three large Sumatran earthquakes that occurred in September, will present their ground-breaking research at the Fall Meeting of the American Geophysical Union (AGU), held from Dec. 10 to 14.
In this, the Union's 40th year, the meeting is expected to draw more than 15,000 of the world's leading geophysicists to present and review the latest breakthroughs on issues affecting the Earth, the planets and their environments in space.
Research team leader Dr Huilin Xing said the AGU's last-minute inclusion of the UQ research in an added special session entitled "The 2007 Sumatra Seismic Sequence" reflected the significance of the work.
The predictions were made using advanced computer simulation software developed as part of a research program under Dr Xing, with researchers utilising the ESSCC's Altix supercomputer -- one of the fastest in Australia -- to model scenarios and determine the highest risk areas.
As a result of their simulations, Dr Xing and his colleagues identified the part of the subduction zone where the Eurasian and Indian/Australian tectonic plates meet between latitude S1° and S5.5° as having the highest earthquake risk -- exactly the zone in which the series of quakes occurred.
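The ESSCC software itself is far more sophisticated than anything that can be reproduced here, but the basic idea of ranking fault segments by how much unreleased slip they have accumulated can be sketched in a few lines. The toy example below is purely illustrative: the segment boundaries, locking times, convergence rate and fault dimensions are assumed values, not the team's model inputs or results.

    # Toy sketch of slip-deficit ranking along a subduction zone.
    # All numbers below are illustrative assumptions, not ESSCC model data.
    RIGIDITY_PA = 3.0e10        # shear modulus of crustal rock (~30 GPa)
    WIDTH_M = 150e3             # assumed down-dip width of the locked zone (m)
    M_PER_DEG_LAT = 111e3       # metres per degree of latitude

    # (southern latitude range, years since last large rupture, convergence rate in m/yr)
    segments = [
        ((1.0, 5.5), 170, 0.05),   # long-quiescent ("locked") segment
        ((5.5, 8.0),  70, 0.05),
        ((0.0, 1.0),   3, 0.05),   # ruptured recently, little stored slip
    ]

    def stored_moment(lat_range, years_locked, rate):
        """Seismic moment (N*m) accumulated as slip deficit on a rectangular segment."""
        length = (lat_range[1] - lat_range[0]) * M_PER_DEG_LAT
        slip_deficit = years_locked * rate
        return RIGIDITY_PA * length * WIDTH_M * slip_deficit

    for seg in sorted(segments, key=lambda s: stored_moment(*s), reverse=True):
        (lat0, lat1), years, _ = seg
        print(f"S{lat0}-S{lat1}: ~{stored_moment(*seg):.1e} N*m stored after {years} years")

Ranked this way, the long-quiescent segment stands out, which is the intuition behind the team's far more detailed finite element scenario modelling.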
Dr Xing said that in the wake of the 2004 Boxing Day tsunami, the Sumatra region was one of the first areas of application for the modelling software.
"Not too long after we developed the software the 2004 Boxing Day tsunami occurred and as a result, we began a project specially focused on the tsunami generation process induced by earthquakes and from there, we really began the research for the Sumatra area," Dr Xing said.
"The region had a lot of data and papers related to it as a very hot topic, and all that information was ideal for helping us conduct the simulations.
"We presented our results as early as last April in Hawaii, highlighting the high earthquake risk in this very specific area… and now already the event has happened with the three earthquakes occurring in exactly the place we had predicted -- and this is why we're very excited but in some ways quite shocked.
"This sort of event is very rare in earthquake history -- to have three very large earthquakes occur so close together but also in a very narrow area."
The three quakes, which occurred in the space of just two days, were measured on the Richter scale at magnitudes of 8.4 and 7.9 (September 12), and 7.0 (September 13) respectively. Residents living around the Indian Ocean were quick to register their shock at the magnitude of the tremors, which were felt as far away as Singapore and Malaysia.
Interestingly, the area was one of very few in the wider Sumatran region that had not experienced earthquake activity for some time. But the extended period of quiescence did not discourage Dr Xing and his colleagues from pinpointing it as a high-risk zone.
"It was very strange that even in this region of high earthquake activity and in which the tsunami-inducing earthquake occurred, that this particular area seemed to be locked.
"But while on the one side this could have meant that perhaps this area was very safe because there was no slip, on the other side the lack of any slip meant a significant build-up of force and that the area had a large amount of energy to release.
"When we looked at the earthquake history around this area we found that about 170 years ago there were two very large earthquakes exactly in this area, so we began to think this area might have potential for a large, destructive earthquake in between the relatively long periods of quiet."
Despite the accuracy of the UQ forecast, Dr Xing was quick to point out that the prediction of earthquakes is not an exact science, and said the recent series of earthquakes has in many ways only added to the many questions surrounding the subject.
"For example, from research we know we can expect that if an earthquake is larger than magnitude 6.5 there may be a tsunami, and while this is not directly or linearly related to size, it is very important.
"But in this case the first earthquake was of magnitude 8.4, and there was almost no tsunami…and I think this means we really need to keep looking deeper to work out what kinds of earthquakes can generate tsunamis and how big the tsunami might be.
"If we continue this research, we can help to make the prediction of earthquakes and tsunamis a more accurate process, contributing some further understanding of the factors involved and with respect to this particular area, modelling where the next event might occur."
In the meantime, Dr Xing's finite element crustal dynamics software is being applied in the supercomputer simulation of hot fractured rock geothermal reservoir systems in the field of alternative energy, and has demonstrated other significant potential applications in modelling the deep geological disposal of nuclear waste and carbon dioxide.
Dr Xing said it was important to acknowledge the support he has received from the Australian Computational Earth Systems Simulator (ACcESS) -- a major national research facility hosted by the ESSCC; as well as the Australian Research Council and industry collaborators, such as Geodynamics Ltd.
The ESSCC conducts research on the mechanics and physics of solid Earth processes on all scales using supercomputer simulation and by applying the methodologies of geophysical fluid and solid mechanics.
Source: University of Queensland