CLEAN POWER

January 19, 2001

by Michael Schneider, Pittsburgh Supercomputing Center

With 85 percent of U.S. power consumption coming from fossil-fuel combustion, researchers at the National Energy Technology Laboratory face the challenge of developing technologies for high-efficiency, low-emission combustion. In simulations at the Pittsburgh Supercomputing Center, they’re making progress toward this goal.

On the voyage home to Ithaca, Odysseus and his sailors had to navigate between Scylla and Charybdis — dangerous rocks and a whirlpool. Maneuver to avoid one peril and you risk the other. Researchers at the National Energy Technology Laboratory in Morgantown, West Virginia, know the feeling. Their job is to steer the course of environmental stewardship in the face of accelerating demand for electrical power around the globe.

“America is running short of electricity,” said a front-page story in the Wall Street Journal last spring (May 11, 2000). The information age — temperature-controlled machine rooms and offices — and surging appliance purchases have juiced power consumption. Summertime U.S. peak demand is now about 700,000 megawatts, up from 525,000 in 1989, a rise that threatens to outstrip capacity, now about 780,000 megawatts. Complicating matters, deregulation of the electric utility industry has spawned uncertainty about the return on investment in new plants.

Adding fuel to the fire, literally, developing countries are a burgeoning market for energy. One recent projection holds that over the next few years 300 megawatts of new electric generating capacity will be installed somewhere in the world each day!

What about acid rain? What about greenhouse gases? These and other environmental imperatives drive research to provide clean-power options for the world’s energy. At present, 85 percent of U.S. consumption and 90 percent of the world’s come from fossil fuel, and as the president’s commission of science and technology advisors reported last year, the current best opportunity for environmental progress in power generation is high-efficiency, low-emission combustion.

“The challenge is to convert fuel to energy without creating pollutants,” says George Richards, who leads NETL’s combustion dynamics team. The workhorses of electrical-power generation are the jet-engine-like gas turbines that convert fossil fuel into megawatts of electricity, and the mission of Richards’ team is to help develop the engineering knowledge to make 21st century turbines more efficient, cleaner and cheaper to operate. In a recent series of simulations at the Pittsburgh Supercomputing Center, they’ve made progress toward this goal.

Lean, Pre-Mixed Combustion

The power industry began to shift its new installations toward low-emission technology about 10 years ago, says Richards, and many new power plants employ low-emission turbines. The key to these advanced systems is “lean, pre-mixed combustion” — mixing the fuel, typically natural gas, with a relatively high proportion of air prior to burning. This substantially reduces nitrogen oxide pollutants (known as NOx) while allowing high-efficiency operation. The high efficiency reduces carbon dioxide, a major greenhouse gas, and lowered NOx alleviates smog and decreases other byproducts that affect air quality.
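How lean is “lean” is usually expressed with the equivalence ratio — the actual fuel-air ratio divided by the stoichiometric one — and lean premixed combustors run well below one. A minimal sketch of that calculation for methane, the main component of natural gas (the numbers are textbook stoichiometry, not NETL’s operating conditions):

```python
# Equivalence ratio for a lean methane-air mixture.
# Stoichiometry: CH4 + 2 (O2 + 3.76 N2) -> CO2 + 2 H2O + 7.52 N2

M_CH4 = 16.04            # g/mol
M_O2, M_N2 = 32.00, 28.01

# Mass of air needed to burn 1 mol of CH4 completely
air_mass = 2 * M_O2 + 2 * 3.76 * M_N2   # grams of air per mole of fuel
afr_stoich = air_mass / M_CH4           # stoichiometric air-fuel ratio (by mass)

def equivalence_ratio(afr_actual):
    """phi < 1 means lean (excess air); phi > 1 means rich (excess fuel)."""
    return afr_stoich / afr_actual

print(round(afr_stoich, 1))               # ~17.1 kg of air per kg of methane
print(round(equivalence_ratio(30.0), 2))  # a lean premix: phi ~ 0.57
```

Running the mixture at, say, 30 kilograms of air per kilogram of fuel — nearly twice the stoichiometric amount — is what keeps flame temperatures low enough to suppress NOx formation.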

But a nasty problem bedevils these systems. With a lean-fuel mix, the combustor flame burns on the thin edge of not having enough fuel to keep burning, and a phenomenon analogous to a flickering candle sets up pressure oscillations — like a series of very rapid small explosions rather than a steadily burning flame. These oscillations can resonate with the vibration modes of the combustion unit and, literally, shake it to pieces.

“This instability is a major issue that every turbine developer using pre-mix combustion has to face,” says Richards. “It comes up in every conceivable stage — in development, during engine commissioning, in engine-fielding applications. It comes up in permitting these engines and in keeping them operating. It’s a very tricky problem. I’m happy to say that there’s been a lot of progress, and we can now see fielded engines using these incredibly clean combustors. But we also know that avoiding instability places very tight restrictions on how the engine can operate. Adding desirable features, like fuel flexibility, or a wider operating range, can lead to the same old problem.”

To zero in on the problem, NETL researchers conducted extensive experiments with their Dynamic Gas Turbine Combustor. This state-of-the-art test facility makes it possible to adjust parameters involved in turbine-combustor design — such as the location of the fuel injector relative to the flame — and to observe and measure what happens.

The experiments revealed an unexpected result. Changing the location of a nozzle component called the “swirl vane” affected the pressure oscillations. The swirl vane — so-called because it swirls the air flow to create aerodynamics that mix the fuel and air — sits upstream of the fuel injector. In experiments comparing two swirl-vane locations, with other parameters unchanged, when the swirl vane was moved two inches farther upstream the pressure oscillations virtually disappeared. Why?

What to Measure?

The objective, stresses Richards, is to understand the physics behind the observed data, so it can be incorporated rationally into turbine design. Moving the swirl vane gave better performance under one set of conditions, but the data was inconclusive when it came to explaining the results. Prior research suggested that the time lag between when fuel is injected and when it burns is a key factor in the oscillations, but presumably, since the fuel injector didn’t move, the swirl vane would have little or no effect on this.

“You can place the swirl vane either closer to the flame or farther away,” says Richards, “and it makes a difference. But we didn’t know why. We had some conjectures, and we tested those, but we still couldn’t prove what was going on. There’s subtle effects, like decay of turbulence and swirling flow, that impact the important time scales — multiple, simultaneous processes, and you can’t interpret the experimental data without quantifying the contributions from these simultaneous events.”

To sort out the details, Richards and his colleagues turned to simulations on PSC’s CRAY T3E. In recent years, the NETL team worked with consultants for FLUENT, commercial fluid-dynamics software, to develop 3D modeling that realistically simulates experiments in the experimental combustor. In summer 1999, with help from PSC scientists, they adapted FLUENT to the CRAY T3E and ran a series of simulations replicating the experiments.

Each computation — one for each experiment — required about a week of computing on 20 T3E processors to simulate 30 milliseconds of combustion. Each produced 20 gigabytes of compressed data, an enormous amount of information, which itself created a huge post-processing task.

When the results were in, they told an interesting story: The aerodynamics in the nozzle are such that moving the swirl vane, with no change to the fuel injector, significantly affects the time lag between injection and burning. In the two cases of interest, moving the swirl vane two inches upstream lengthens this lag by a millisecond, and that millisecond makes a big difference in combustion stability.
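Why a millisecond matters can be seen through the classical Rayleigh criterion: oscillations grow when heat-release fluctuations arrive in phase with pressure fluctuations, and a time lag shifts that phase by (angular frequency × lag). A hedged numerical sketch — the 300 Hz mode and unit-amplitude signals are illustrative, not NETL’s measured values:

```python
import math

def rayleigh_index(freq_hz, lag_s, n=10000):
    """Average of p'(t) * q'(t) over one acoustic period.
    Positive -> heat release drives the oscillation; negative -> damps it."""
    omega = 2 * math.pi * freq_hz
    period = 1.0 / freq_hz
    total = 0.0
    for i in range(n):
        t = i * period / n
        p = math.cos(omega * t)             # pressure fluctuation
        q = math.cos(omega * (t - lag_s))   # heat release, delayed by the lag
        total += p * q
    return total / n

# For a 300 Hz combustor mode, shifting the injection-to-burning lag
# by one millisecond flips the sign of the Rayleigh index:
print(rayleigh_index(300.0, 0.0))   # positive: heat release drives the mode
print(rayleigh_index(300.0, 1e-3))  # negative: heat release damps the mode
```

In this toy picture the index works out to ½·cos(ωτ), so whether a given lag stabilizes or destabilizes the combustor depends on where it falls relative to the acoustic period — which is why pinning down that time scale experimentally became the team’s next goal.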

“We looked at the simulations,” says Richards, “and said ‘ah-ha.’ It was obvious. The change in this time lag from the point of injection is what we need to measure. That’s a whole different universe to work in from where we were, a definite conclusion. It helped us set up the next set of experiments in which we’ve been trying to make a verifiable measurement of those time scales. And we’ve made some progress on that.”

Flame Volume & Reaction Rate

Beyond sharpening their analysis of the swirl-vane results, the CRAY T3E simulations also give the NETL team a way to look deeper into the physics of turbine combustion. A key factor in combustor stability is the flame’s reaction rate, the speed of burning, which varies with time. The NETL group would like to know what drives this variation. Does the volume of the flame change, as when a candle flame flickers, or does the flame volume stay constant as the burning rate varies?

“We don’t know which occurs in practical systems,” says Richards. “We want to use these simulations and identify the dominant mechanism. It’s probably some of each, but is it 90/10, 50/50 or 20/80? We may find that it’s different under different conditions. That’s where the simulations really help. If we show that you go from one mechanism to the other in the same combustor, depending on operating conditions, you’d have to do different things to make the system quiet. With simulations, and going back and forth iteratively with the experiments, we’re learning a lot about fundamental physics.”
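The “90/10 or 50/50” question can be framed with a first-order decomposition: if total heat release is flame volume times volumetric burning rate, its relative fluctuation is roughly the sum of the two terms’ relative fluctuations, and simulation output lets each term’s share be measured. A toy sketch with synthetic signals — the 10% volume swing and 5% rate swing below are made up for illustration, not NETL results:

```python
import math

def fluctuation_shares(volume, rate):
    """First-order split of heat-release fluctuations Q = V * r into the
    share from flame-volume changes and the share from rate changes."""
    n = len(volume)
    v_mean = sum(volume) / n
    r_mean = sum(rate) / n
    v_var = sum((v - v_mean) ** 2 for v in volume) / n
    r_var = sum((r - r_mean) ** 2 for r in rate) / n
    # relative variance of each linearized term
    v_term = v_var / v_mean ** 2
    r_term = r_var / r_mean ** 2
    total = v_term + r_term
    return v_term / total, r_term / total   # (volume share, rate share)

# Synthetic time series: volume fluctuates twice as strongly as the rate.
t = [i / 100 for i in range(100)]
V = [1.0 + 0.10 * math.sin(2 * math.pi * x) for x in t]  # 10% volume swing
r = [1.0 + 0.05 * math.sin(2 * math.pi * x) for x in t]  # 5% rate swing
v_share, r_share = fluctuation_shares(V, r)
print(round(v_share, 2), round(r_share, 2))   # here, an 80/20 split
```

Because variance scales with the square of the swing, a 2-to-1 ratio of fluctuation amplitudes yields an 80/20 split — the kind of quantitative attribution that is difficult to pull from experimental data alone but falls out of the simulations directly.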

More information, including graphics: http://www.psc.edu/science/richards.html
