First Warning

By Tim Green

July 7, 2006

It's hurricane season.

For residents of the Texas Gulf Coast, that means keeping the car gassed up, plywood in the garage for boarding up windows and the phone number of the relatives in Dallas on speed dial.

For Gordon Wells and his group at the Center for Space Research (CSR) at The University of Texas at Austin, hurricane season means making sure those residents — and officials responsible for public safety — know what's coming, when it's coming and when they should leave.

Using satellite data; databases of infrastructure, population centers and natural features; and advanced computing, the CSR team generates computer models that show approaching storms, graphics that help determine evacuation routes and the order in which residents should leave, and graphics that help first responders know what they'll have to deal with.

The team is part of the Governor's Emergency Management Council, a group of state agencies and organizations that gather in the State Operations Center to organize the state's response to an impending disaster.

The team brings together resources of The University of Texas at Austin that can make an immediate difference in dealing with hurricanes and other disasters.

Hurricane season began June 1 and the forecast from the National Hurricane Center calls for an 80 percent chance that there will be 13 to 16 tropical storms, more than during an average season. Eight to 10 of those storms might reach hurricane strength, and four to six of those might be major hurricanes, with sustained winds of at least 111 mph.

Getting ready

Even before hurricanes Katrina and Rita in 2005, the computer models generated by the CSR team had been an essential part of preparing for tropical storms and dealing with their aftermath.

The team's work begins before the start of the season with the development of three-dimensional models based on generic storms and databases. State and local officials can use the model results and visualizations to plan their responses.

“You're presenting model results in a dynamic context so that it's sort of like a film that covers several hours duration, where you can see each individual part of the event as it occurs — as the surge dome moves inland, as the wind fields sweep across and create damage,” Wells said.

Those viewing the images can see what areas would be underwater, what areas might be subject to damage and what routes are safest for evacuation.

“All of this is done as a preview to the hurricane season so that county judges, mayors and local emergency managers are able to organize some of their exercises and drills and have these immersive, or at least fairly detailed, re-creations to train with,” Wells said. “They can think about what areas they would need to evacuate and in what sequence to get people out of harm's way.”

Modeling on the fly

Those models are fine for practice and drills, but it is highly unlikely they would replicate the exact characteristics of a specific storm.

As hurricane season progresses, Wells and the team are watching the weather to model developing storms that could threaten Texas.

“As a depression forms over the Bahamas, as did Katrina and Rita, we're immediately on alert,” Wells said. “We'd be in that group of four or five agencies that are already advising and beginning to make preparations at a very early stage. This would be 120 hours out, or more in some cases.”

If the hurricane looks like it's going to enter the Gulf of Mexico, Wells' team moves to its position in the operations center, where it is among the first handful of agencies preparing for a storm that may hold consequences for Texas.

“We begin to follow a storm very closely and as it enters the Gulf we start to look at predictions of what that storm will be on landfall,” he said, by checking models that reflect dynamic characteristics similar to those of the approaching storm.

As the storm gets closer to the Texas coast, the team begins to analyze the results of surge models based on current observations and forecasts of the storm's track and intensity.

A storm surge is the dome of water pushed ashore by the force of the hurricane. The surge was higher than 25 feet in some areas during Hurricane Katrina. Most of the damage, and the great majority of deaths, caused by large hurricanes comes from the storm surge.

“We begin examining storm surge models based upon the forecast characteristics of the storm at landfall between hour-96 and hour-72 (before tropical storm-force winds strike the coast),” Wells said. “We then generate new results whenever the storm path or predicted intensity at landfall changes. The final model typically follows the last National Hurricane Center advisory before landfall.”
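The re-run-on-change workflow Wells describes can be sketched in code. This is a minimal illustration, not the CSR system: the `Advisory` fields, the tolerance thresholds and the `surge_model_inputs` stub are all hypothetical stand-ins for the National Hurricane Center advisory feed and the actual surge model.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Advisory:
    """Hypothetical landfall parameters taken from an NHC advisory."""
    track_lon: float      # forecast landfall longitude
    track_lat: float      # forecast landfall latitude
    intensity_mph: float  # forecast sustained winds at landfall

def surge_model_inputs(adv: Advisory) -> dict:
    """Placeholder for assembling the inputs to one surge model run."""
    return {"lon": adv.track_lon, "lat": adv.track_lat, "wind": adv.intensity_mph}

def should_rerun(prev: Optional[Advisory], new: Advisory,
                 track_tol_deg: float = 0.1, wind_tol_mph: float = 5.0) -> bool:
    """Re-run whenever the forecast track or intensity shifts meaningfully."""
    if prev is None:
        return True  # first advisory inside the hour-96 window always runs
    moved = (abs(new.track_lon - prev.track_lon) > track_tol_deg or
             abs(new.track_lat - prev.track_lat) > track_tol_deg)
    strengthened = abs(new.intensity_mph - prev.intensity_mph) > wind_tol_mph
    return moved or strengthened

# Simulated advisory sequence between hour-96 and landfall
advisories = [
    Advisory(-94.7, 29.3, 120.0),
    Advisory(-94.7, 29.3, 122.0),   # small intensity wobble: no re-run
    Advisory(-93.9, 29.7, 135.0),   # track shift plus strengthening: re-run
]

last_run: Optional[Advisory] = None
runs = []
for adv in advisories:
    if should_rerun(last_run, adv):
        runs.append(surge_model_inputs(adv))
        last_run = adv

print(len(runs))  # 2 model runs for this sequence
```

The final entry in `runs` corresponds to the last advisory before landfall, matching the workflow Wells describes.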

High-powered graphics

To update and run models and create visualizations of their results quickly and in detail, Wells relies on the Texas Advanced Computing Center (TACC) of The University of Texas at Austin.

“The graphics supercomputers at TACC, particularly Maverick, are important for rapid visualization of storm surge models using high-resolution data rendered with the maximum level of detail,” Wells said. “An operator can easily change the viewpoint on the area of interest and zoom into the model results to examine the interaction of storm surge with terrain features, vegetation and structures. The results are photorealistic and allow decision makers to gain an instant awareness of the magnitude of events.”

The visualization tools developed at TACC enable emergency managers and responders to view the areas they're interested in on handheld devices, according to Gregory S. Johnson, manager of the TACC Visualization and Data Analysis group.

“What that means for an emergency response team is that they can load a Web page, log into that machine (Maverick), load the visualization of the region they're interested in, approximate the flood surge of the type they're interested in and interact with that environment,” Johnson said. “They can fly around the scene and fly up and down the streets. It's actually pretty amazing.”

The forecast of the surge model is important because it can help get people out of the way.

“Populations living in a surge zone have to be allowed to evacuate first,” Wells said. “The inland folks may be in jeopardy but not in nearly the degree of jeopardy that people in the surge zone are.”

During Rita, which made landfall near the Texas-Louisiana border, he said, people inland left voluntarily, creating massive traffic on the highways needed by the coastal evacuees.

“So folks from the coast were getting on the road and their path was blocked by people inland,” he said. “Too many tried to get on the road at the same time.”

Technology for special needs

Over the past two years, Dr. Linda Prosperie, a member of Wells' group, has developed databases for planning and tracking shelters for the general population and has assisted in the development of the “shelter hub” concept. Shelter hubs are centralized areas with existing infrastructure that can be used to provide mass care resources before and after the storm.

Much of her work is to help the state deal with special needs populations who need to be evacuated.

“One of the things learned during Katrina and Rita is that you'll spend probably half or more of your effort dealing with special needs individuals,” Wells said. “Their care requirements are just so extreme and getting resources to folks like that on the fly is a very difficult matter. Things like oxygen and pharmaceuticals and the resources for people who need dialysis—you've got a lot of people on the move all at once under highly stressed conditions.”

Prosperie, a research associate at CSR, said she has worked with the Department of State Health Services, the Department of Aging and Disability Services and the Texas Health Care Association to build databases for medical special needs evacuees. The data includes nursing homes, assisted-living and intermediate-care facilities, hospitals and other state facilities.

Wells said lessons learned from the 2005 storms are being acted on to make working with special needs people more effective. One step is the development of a statewide database that identifies those living in the hurricane evacuation zone who will require assistance to evacuate. The Governor's Task Force on Evacuation, Transportation and Logistics made this recommendation last March.

Constant communication

Wells and others are working on a plan that would enable evacuation vehicles to be monitored at all times. Buses would be tracked with global positioning systems (GPS), and the drivers would receive special cell phones.

He said that such devices enable those in the operations center to keep track of where buses are. Last year, a bus convoy was headed for Texarkana — after finding several other shelter areas full — before it could be located and directed to a more appropriate site.
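A GPS-monitoring scheme like the one described could work along these lines. This is a hedged sketch, not the state's system: the bus IDs, shelter coordinates and tolerance radius are invented for illustration, and a real deployment would use live GPS pings rather than hand-entered fixes.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in statute miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Last-known GPS fix per bus, updated as pings arrive (bus_id -> (lat, lon))
positions = {}

def record_ping(bus_id, lat, lon):
    positions[bus_id] = (lat, lon)

def buses_off_course(destinations, tolerance_miles=150.0):
    """Flag buses farther from their assigned shelter than expected en route."""
    flagged = []
    for bus_id, (lat, lon) in positions.items():
        dest_lat, dest_lon = destinations[bus_id]
        if haversine_miles(lat, lon, dest_lat, dest_lon) > tolerance_miles:
            flagged.append(bus_id)
    return flagged

# Hypothetical convoy: bus 12 assigned to a San Antonio shelter hub
destinations = {"bus-12": (29.42, -98.49)}
record_ping("bus-12", 33.43, -94.05)   # fix near Texarkana, far off course
print(buses_off_course(destinations))  # the wayward bus is flagged
```

An operations center running a check like this could have redirected the Texarkana-bound convoy before it strayed hundreds of miles from its assigned shelter.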

Constant communication allows the authorities to get emergency aid and supplies to a bus quickly, whether through local EMS vehicles, air ambulances or National Guard helicopters.

The next phase of evacuee tracking will put a radio frequency identification tag with each person, Wells said. As each person steps on the bus, he or she will be registered on a roster that automatically updates a statewide database.

“We want to design an end-to-end system where we've got the needs of individuals pre-identified through the registration process before an event and we can track him or her through each stage of evacuation and sheltering from start to finish,” he said. “That way we'll know who was left behind in the impact area and we'll have the absolute geographic location of their residence. And our search-and-rescue teams will have that inventory of sites on their GPS logs and can go in and check each one of those residences as they search through neighborhoods.”
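The end-to-end tracking Wells outlines amounts to joining a pre-event registration database against a boarding roster fed by RFID scans. A minimal sketch, with invented tag IDs, names and coordinates, might look like this:

```python
# Pre-event registry: people who requested evacuation assistance,
# with the geographic location of each residence (hypothetical data)
registry = {
    "tag-001": {"name": "resident A", "home": (29.30, -94.79)},
    "tag-002": {"name": "resident B", "home": (29.31, -94.80)},
    "tag-003": {"name": "resident C", "home": (29.55, -95.10)},
}

boarded = {}  # tag_id -> bus_id: the statewide roster, updated per scan

def scan_boarding(tag_id, bus_id):
    """Record an RFID scan as a person steps onto an evacuation bus."""
    if tag_id in registry:
        boarded[tag_id] = bus_id

def unaccounted_for():
    """Registered evacuees never scanned aboard: residences for search-and-rescue."""
    return [(tag, registry[tag]["home"]) for tag in registry if tag not in boarded]

scan_boarding("tag-001", "bus-12")
scan_boarding("tag-002", "bus-12")
print(unaccounted_for())  # tag-003's residence stays on the search list
```

The `unaccounted_for` list is the payload Wells describes loading onto search-and-rescue GPS logs: every registered residence whose occupant was never scanned aboard a bus.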

Even before the storm makes landfall, first responders move in close to the action. They set up in places close to where the storm will hit, but far enough away that they won't be harmed by it.

“They'll be able to shelter themselves as the storm passes through, but they'll be able to make the most rapid entrance into the area that's impacted afterwards,” he said.

As an example, in the final hours before Hurricane Rita made landfall, the team prepared maps to show the responders when and where to re-enter the region after the worst of the storm had passed.

“Teams had digital copies and hard copies of that map out in the field and they could determine, using the local maps, exactly when they could attempt to get in and by what route and when storm winds had diminished to the point that they could get into the air,” he said. “You're going to have swift water rescues and other potential air rescues as soon as you can get a plane into an area that's below tropical storm-force wind.”

The group keeps working after the storm, using satellite information to track the impact of pollution and power outages. Following Hurricane Rita, the team used orbital radar imagery from the Canadian Radarsat satellite to detect oil slicks from sunken vessels and ruptured storage tanks in the Sabine Lake area near Port Arthur.

At the height of a storm, Wells' group works with many state agencies, voluntary organizations and private sector service providers.

“Including the federal agencies, I would say that 25 to 30 agencies would be requesting and receiving products of some kind during a major event,” he said.

If the forecasts are correct, there could be a major event this year. Get ready. It's hurricane season.


Source: Texas Advanced Computing Center
