September 29, 2010
BATON ROUGE, La., Sept. 29 -- Hurricanes pose an annual threat to Louisiana and other coastal states. But this year, many coastal residents wondered whether the Gulf of Mexico oil spill would worsen the impact of storm surge on the Louisiana coast.
A group of researchers with the LSU Center for Computation & Technology, or CCT, received complementary grant awards from the Louisiana Optical Network Initiative, or LONI, to model and visualize how severe storms would affect the movement of spilled oil particles. Gabrielle Allen and Bijaya Karki, both professors with the LSU Department of Computer Science, each received $10,000 grants and 500,000 CPU cycles on LONI to conduct a two-month study. Their awards also tie into research by LSU's Q. Jim Chen, a professor with the Department of Civil and Environmental Engineering and CCT, who received grant awards from LONI and TeraGrid, a national cyberinfrastructure for scientific research, to apply those resources to oil spill modeling.
These awards involve joint work among the Coastal Modeling, Scientific Visualization and Computational Frameworks research groups at the CCT. The team includes Sumanta Acharya of the Department of Mechanical Engineering; Carola Kaiser of the School of the Coast and Environment; CCT researchers Jian Tao, Peter Diener, Werner Benger and Marcel Ritter; post-doctoral researchers Kelin Hu and Haihong Zhao of the LSU Department of Civil and Environmental Engineering; computer science graduate student Bidur Bohara; mechanical engineering graduate student Somnath Roy; and undergraduate mechanical engineering student Edwin Mathews.
"This is preliminary research into an area of great concern, and we plan to generate visualizations from our models that give scientists more insight into how severe storms would affect the path and spread of such an oil spill," Allen said. "Ultimately, we want to create comprehensive and quickly deployable environmental models that incorporate multiple elements such as hurricane winds, storm surge, oil spill and fish populations along with advanced visualization and analysis to better understand the coastal environment."
Most hurricane and storm surge models are not designed to include the three-dimensional transport of oil. To capture both, scientists must couple models built on different physics. This research project will create a preliminary hurricane model that incorporates oil as an element.
"This project addresses some interesting challenges for computational science researchers, in terms of understanding the evolution of the oil spill as emerging from its underground source into the seawater," Karki said. "A wide variety of data related to the oil spill in the Gulf of Mexico is currently available and will be generated in the future. We plan to visualize some of these data to extract important information about the nature and extent of the oil spill."
For the first part of the project, the researchers who are part of Allen's grant will create an oil spill model that treats the spilled oil as individual particles using the Cactus Computational Framework, an open-source environment that allows scientists and engineers to use high-performance computing resources more effectively. Once a model for oil particles is in place, the researchers will input data on different wind speeds and water current movements to see how the oil particles behave in extreme weather conditions.
For initial examples, the research group will use wind speed and water current data they already have available from Hurricane Katrina (2005) and Hurricane Gustav (2008), expanding on data from these storms to simulate how and where oil from the spill would move under hurricane-force winds and storm surge from hurricanes on similar tracks.
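The particle approach described above can be sketched as a simple Lagrangian transport rule: each oil particle drifts with the water current plus a fraction of the wind speed, with a random-walk term for turbulent spreading. This is a minimal illustration only; the function name, the ~3% wind-drift factor and all forcing values are invented stand-ins, not the group's actual Cactus implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def advect_particles(pos, current, wind, dt, wind_factor=0.03, diffusion=1.0):
    """Advance oil-particle positions by one time step.

    pos         : (N, 2) array of particle x/y positions (m)
    current     : (2,) water-current velocity (m/s)
    wind        : (2,) wind velocity (m/s)
    wind_factor : fraction of wind speed imparted to surface oil
                  (~3% is a common rule of thumb, assumed here)
    diffusion   : random-walk amplitude for turbulent spreading (m/sqrt(s))
    """
    drift = current + wind_factor * wind                      # deterministic drift
    spread = diffusion * np.sqrt(dt) * rng.standard_normal(pos.shape)
    return pos + drift * dt + spread

# Release 1,000 particles at a point source and step them for 24 hours
# under illustrative hurricane-force winds.
pos = np.zeros((1000, 2))
for _ in range(24):                                           # 24 one-hour steps
    pos = advect_particles(pos,
                           current=np.array([0.5, 0.1]),
                           wind=np.array([-40.0, 15.0]),
                           dt=3600.0)
```

In a real coupled run, the constant `current` and `wind` vectors would be replaced by fields interpolated from the hurricane and surge models at each particle's position and time.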
For the second part of the project, the researchers working on Karki's grant, through the CCT Scientific Visualization research group, will carry out the final step: creating images from the data that describe the plume of oil in the water and show where the oil rises to the surface under immense source pressure, high temperature and buoyancy. They will produce movies and still images that show researchers how spilled oil would move during a hurricane under different scenarios.
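One common way to turn particle output into the kind of imagery described above is to bin particle positions onto a grid, yielding a concentration field that can be rendered frame by frame into a movie. The sketch below shows that gridding step with hypothetical names and domain extents; it is not the VISH pipeline itself.

```python
import numpy as np

def particle_density(xy, bins=100, extent=(-1e5, 1e5)):
    """Bin (N, 2) particle positions into a 2D grid -- the raw material
    for one frame of an oil-concentration animation."""
    hist, _, _ = np.histogram2d(xy[:, 0], xy[:, 1],
                                bins=bins, range=[extent, extent])
    return hist / hist.sum()          # normalize to a concentration fraction

# Illustrative particle cloud standing in for one model output snapshot.
rng = np.random.default_rng(1)
cloud = rng.normal(loc=[-6e4, 4.8e4], scale=5e3, size=(5000, 2))
density = particle_density(cloud)
```

Each such density array can be color-mapped and written out as an image; stringing the per-timestep images together produces the movies mentioned above.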
"Being from southern Louisiana, it is very exciting to be a part of a project that can directly affect my home and the entire Gulf South," said Edwin Mathews, an undergraduate mechanical engineering student working on the project. "On top of that, getting to work with a research group at a high-end facility like the CCT is a unique and eye-opening experience for an undergraduate student. Visualization is essential to the interpretation of the computational oil spill data, and being exposed to the inter-disciplinary workings of the project is something that will be of great value to me in the future."
All of the researchers on the LONI oil spill project gained experience creating multi-element, data-driven hurricane models through their work on Cybertools, a National Science Foundation-funded project to develop tools and applications that allow scientists to use modern cyberinfrastructure to its full potential.
The research groups will use LONI's high-performance computing and advanced networking capabilities to make data input easier for the numerical model, and create visualizations using VISH, a software program developed at the CCT in collaboration with colleagues in Austria and the United Kingdom. VISH stands for "visualization shell," meaning it is an environment that can incorporate multiple elements of the visualization process while implementing newly developed algorithms and deploying them to researchers easily.
They plan to present the initial results of their research during the annual Supercomputing Conference (SC10), which will take place in New Orleans, Nov. 13-19, 2010.
Source: LSU Center for Computation & Technology