December 09, 2005
"We're experiencing a little turbulence, folks," the pilot says, as the plane plummets fifty feet like a car in a funhouse ride. White knuckles, churning stomach - on an airplane, a word you'd prefer not to hear is turbulence.
As P. K. Yeung is quick to tell you, however, turbulence is often beneficial and modern air travel couldn't exist without it. "But for the turbulent mixing of fuel and air in a jet engine, jet flight wouldn't be possible," says Yeung, a professor of aerospace engineering at Georgia Tech. He has applied his deep knowledge of turbulence to carry out some of the largest computational studies of this widespread and important phenomenon, and his recent work at PSC, using 2,048 processors of LeMieux, PSC's terascale system, sets a new milestone for large-scale turbulence simulation.
Part of our lives in many ways - from the cream we stir in our coffee to thunderstorms that ruin a night at the ballgame - turbulence defies easy definition, but is, roughly, a state of fluid flow in which the velocities at any point fluctuate randomly.
But for these random fluctuations, many important industrial chemical reactions would happen very slowly or not at all. On a larger scale, turbulent mixing in the lower atmosphere coupled with phenomena at high altitudes has a great effect on weather in the short term and climate in the long term. Turbulent mixing in ocean currents spanning thousands of miles, such as the Gulf Stream, helps to maintain the heat balance and ecology of the oceans.
Better understanding of turbulence, especially since the advent of supercomputers, has led to improvements in how we live, including better airplane wings, which lower the fuel-cost of air travel, and better artificial heart valves, which save lives. But it's an extremely complex phenomenon - one that Nobel Prize-winning physicist Richard Feynman once referred to as the "last unsolved problem in physics" - and many challenges remain.
One of the more pressing turbulence-related issues, says Yeung, is in preserving environmental quality. Where, for instance, will particles of pollutants from a smokestack end up minutes, hours, and days from now? "In order to maintain air quality, we need to understand the behavior of smoke emanating from pollution sources. Almost always the flow out of a chimney will be turbulent. We can see that in the sky - the smoke follows an irregular path - and we want to be able to describe the motion of those pieces of fluid, which constitute that cloud. If a certain part of that fluid has been contaminated, we want to know where it goes."
Using a powerful method, called "direct numerical simulation," with the advanced parallel-processing capability of LeMieux, Yeung produced results that are a significant step toward this goal.
Tracking the Particles
A major challenge in simulating turbulence is that the random fluctuations - the eddies and vortices - occur over a very wide range of scales, all of which must be taken into account in a realistic model. In the atmosphere, for instance, the swirls and eddies of air that make up the overall flow vary from several centimeters in diameter to thousands of kilometers, with every size in between. The ratio of scales can be in the thousands, with the number of variables - and thus the amount of computing required to keep track of them - increasing rapidly as the ratio increases. This imposes a daunting computational demand.
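The scaling behind this demand can be made concrete. By standard Kolmogorov estimates (not figures from the article), the ratio of largest to smallest eddy size grows as roughly Re^(3/4), so a three-dimensional grid that resolves every scale needs on the order of Re^(9/4) points:

```python
# Rough scaling sketch using standard Kolmogorov estimates (illustrative only):
# the large-to-small eddy size ratio grows ~ Re^(3/4), so a 3-D grid resolving
# all scales needs ~ Re^(9/4) points.
for Re in (1e3, 1e4, 1e5):
    scale_ratio = Re ** 0.75   # largest eddy size / smallest eddy size
    grid_points = Re ** 2.25   # grid points needed in three dimensions
    print(f"Re={Re:.0e}: scale ratio ~{scale_ratio:,.0f}, grid points ~{grid_points:.1e}")
```

Even a tenfold increase in Reynolds number thus multiplies the grid-point count by more than a hundredfold, which is why each step up in simulated turbulence level demands a new generation of supercomputer.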
Yeung tackles this problem directly. Direct numerical simulation (DNS) starts with the fundamental equations of fluid flow and calculates speed and direction for each fluid particle. "Direct" means that velocities are calculated at each time step as the flow progresses, without reliance on experimental data to supply parameters. DNS tracks each particle - such as the particles in a plume of smoke - as it moves step-by-step within a high-resolution grid.
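The particle-tracking side of this idea can be sketched in a few lines. The example below is purely illustrative, not Yeung's code: it advects a small cloud of tracer particles through a simple analytic two-dimensional flow that stands in for the velocity field a real DNS would compute by solving the Navier-Stokes equations on the grid and interpolating to each particle position.

```python
import numpy as np

def velocity(x, y, t):
    """A simple divergence-free 'cellular' flow, standing in for DNS velocity output."""
    u = -np.sin(np.pi * x) * np.cos(np.pi * y)
    v = np.cos(np.pi * x) * np.sin(np.pi * y)
    return u, v

def advect(x, y, t, dt):
    """Advance particle positions one step with second-order Runge-Kutta (midpoint)."""
    u1, v1 = velocity(x, y, t)
    u2, v2 = velocity(x + 0.5 * dt * u1, y + 0.5 * dt * v1, t + 0.5 * dt)
    return x + dt * u2, y + dt * v2

# Seed a small cloud of particles near a localized "source" and follow it in time.
rng = np.random.default_rng(0)
x = 0.5 + 0.01 * rng.standard_normal(100)
y = 0.5 + 0.01 * rng.standard_normal(100)
t, dt = 0.0, 0.01
for step in range(500):
    x, y = advect(x, y, t, dt)
    t += dt
```

In a production DNS the expensive part is computing the evolving velocity field itself; once that field is available at each time step, following the particles is conceptually this simple.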
"We are acting as if we could measure the velocity everywhere in space and over a sustained period of time," Yeung explains. "With DNS, we are able to follow the irregular pathways or trajectories of fluid elements exiting a localized contaminant source."
Visualize a plume of smoke rising from a smokestack three feet in diameter. Any two smoke particles are separated by three feet, at most, as they exit the stack. DNS makes it possible to ask how far apart these two particles will be after they wander about in the atmosphere for a sustained period of time. Do they separate or come together over time? To answer this experimentally - identifying and keeping track of even a single pair of fluid particles, let alone all the particles in a large cloud - would be impossible. DNS, Yeung points out, provides more data, more accurately, than is possible to gather experimentally.
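The statistic behind that question is the mean-square pair separation. The sketch below is illustrative only - it substitutes independent random walks for real DNS trajectories - but it shows how, once trajectories are stored, the separation statistic falls out of the data:

```python
import numpy as np

# Illustrative only: compute the mean-square pair separation <|r(t)|^2> from
# stored particle trajectories. Independent 3-D random walks stand in here for
# real DNS trajectories.
rng = np.random.default_rng(1)
n_pairs, n_steps, dt = 200, 400, 0.01

# steps shape: (2 particles per pair, n_pairs, n_steps, 3 coordinates)
steps = rng.standard_normal((2, n_pairs, n_steps, 3)) * np.sqrt(dt)
traj = np.cumsum(steps, axis=2)   # integrate steps into trajectories
traj[1] += 0.05                   # small initial offset for the second particle

r = traj[1] - traj[0]                      # pair separation vectors over time
msd = (r ** 2).sum(axis=-1).mean(axis=0)   # <|r|^2>, averaged over all pairs

# For independent Brownian particles <|r(t)|^2> grows only linearly in time;
# in real turbulence pairs separate much faster in the inertial range
# (Richardson's t^3 law), which is what DNS tracking can quantify directly.
```

The answer to "do they separate or come together?" is read off from whether this curve grows or shrinks with time, and how fast.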
Scientists quantify the degree of turbulence in a fluid flow by the Reynolds number. Higher Reynolds numbers correspond to a wider spread in the range of eddy sizes, equivalent to higher levels of turbulence. Scientists have long been interested in simulating high Reynolds number flows, but have been limited by computing power. "The availability of computers like LeMieux allows us to increase the Reynolds number by expanding the number of grid points," says Yeung, "and this allows us to simulate a wider range of scales."
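The Reynolds number itself is a simple ratio, Re = UL/ν, built from a characteristic velocity U, a characteristic length L, and the fluid's kinematic viscosity ν. The values below are illustrative back-of-envelope choices, not figures from the article:

```python
def reynolds(U, L, nu):
    """Reynolds number Re = U * L / nu.

    U  -- characteristic velocity (m/s)
    L  -- characteristic length (m)
    nu -- kinematic viscosity (m^2/s)
    """
    return U * L / nu

# Illustrative values: air at ~20 C has nu ~ 1.5e-5 m^2/s; water ~ 1.0e-6 m^2/s.
smoke_plume = reynolds(U=2.0, L=1.0, nu=1.5e-5)   # order 1e5: strongly turbulent
coffee_stir = reynolds(U=0.1, L=0.05, nu=1.0e-6)  # order 5e3: turbulent
```

Everyday flows like a chimney plume already sit at Reynolds numbers in the hundreds of thousands, well beyond what early DNS could reach, which is why each increase in simulated Reynolds number matters.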
A Turbulence Database and the Kolmogorov Constant
Over the past year, Yeung employed 2,048 LeMieux processors simultaneously solving the fundamental fluid equations in a three-dimensional grid with eight-billion grid points. This is the largest DNS ever done that tracks the path of particles over time. Getting started on LeMieux required porting his software to a system where it hadn't run before, a major challenge. "We've had very capable and dedicated assistance from PSC consulting."
LeMieux's ability to communicate efficiently among processors has been an important factor in Yeung's ability to carry out his large-scale DNS work. His software exploits LeMieux efficiently, scaling to thousands of processors with minimal added communication overhead as processors are added - a major advantage for his work.
With more than a million processor-hours of LeMieux time, Yeung's simulations produced terabytes of data that yield the highest Reynolds number ever calculated with the DNS approach. Previously, researchers had to extrapolate data from low turbulence simulations if they wanted to apply it in high turbulence situations, which led to uncertainties. "We will now be approaching the Reynolds numbers typical in applications more closely," says Yeung, "and if we still have to extrapolate we can do so with much greater confidence."
Beyond his immediate goal of understanding pollutant dispersion for environmental purposes, Yeung's simulations create a valuable database - which can be made available at PSC to the wider community of turbulence scientists, who can use it to test their turbulence models. Because of the fundamental nature of his DNS simulations - free of assumptions derived from observation - the data is useful for turbulent flows in many different applications. For pollutant dispersion, such as a smoke plume, from a localized source, other researchers can compare their model results with Yeung's DNS data for a similar Reynolds number. "Using DNS we can obtain the fundamental data that would allow us to formulate those models more carefully," says Yeung, "and eventually to evaluate the performance of the model and suggest improvements."
In extending his DNS studies to higher Reynolds numbers, Yeung also is getting closer to pinpointing an elusive number called the Lagrangian Kolmogorov constant. In 1941, Russian mathematician A. N. Kolmogorov posited that, at high enough Reynolds number, small-scale features of turbulence are independent of the large-scale flow geometry. This theory has been widely influential in turbulence research. "The Kolmogorov constant is of great interest because of the supposition that it is universal," says Yeung, "being the same for turbulent flows of various types of geometry as long as the Reynolds number is sufficiently high."
In the laboratory, with data from fixed measurement locations, it is straightforward to apply Kolmogorov's hypotheses, and this version of the constant is well established. Models of pollutant transport, however, use what's called a Lagrangian reference frame, which mimics an observer moving with the fluid flow - like a weather balloon that drifts with the wind. Research by Yeung and others has indicated that it takes very high Reynolds number simulations to establish this version of the constant. "This constant is very important to modeling," says Yeung. "Our group's large simulations on LeMieux have given quite clear evidence that the value is approaching a constant as the Reynolds number increases without limit."
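In practice, the Lagrangian constant (usually written C0) is extracted from the second-order Lagrangian structure function, D2(tau) = ⟨(u(t+tau) − u(t))²⟩, which Kolmogorov similarity predicts equals C0·ε·tau in the inertial time range, where u(t) is a velocity component following a fluid particle and ε is the mean energy dissipation rate. The sketch below is a stand-in, not DNS data: it uses an Ornstein-Uhlenbeck process, a common stochastic model of Lagrangian velocity, to show how the ratio D2(tau)/tau plateaus at small lags - the same kind of plateau DNS studies look for to read off C0·ε.

```python
import numpy as np

# Stand-in for a tracked particle's velocity: an Ornstein-Uhlenbeck process
# with assumed integral time scale T_L and rms velocity sigma (illustrative).
rng = np.random.default_rng(2)
n, dt = 200_000, 1e-3
T_L, sigma = 0.1, 1.0

u = np.empty(n)
u[0] = 0.0
for i in range(1, n):
    u[i] = u[i - 1] * (1 - dt / T_L) + sigma * np.sqrt(2 * dt / T_L) * rng.standard_normal()

def D2(u, lag):
    """Second-order Lagrangian structure function at integer lag."""
    du = u[lag:] - u[:-lag]
    return float(np.mean(du ** 2))

# For the OU model, D2(tau) ~ (2 * sigma**2 / T_L) * tau at small tau, so the
# ratio D2/tau plateaus near 2 * sigma**2 / T_L (= 20 here) -- analogous to the
# plateau at C0 * epsilon that DNS uses to estimate the Kolmogorov constant.
ratios = {lag: D2(u, lag) / (lag * dt) for lag in (5, 10, 20)}
```

The difficulty the article describes is that a clean plateau only emerges when the Reynolds number is high enough to open up a wide inertial range - hence the need for simulations on the scale of Yeung's LeMieux runs.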
For more information, including graphics, visit http://www.psc.edu/science/2005/yeung/