SDSU Visualizes ‘Blue Marble’ Imagery

By Nicole Hemsoth

November 4, 2005

The Visualization Center at San Diego State University, which relies on visualization technology from Silicon Graphics Inc. (SGI) to create and disseminate 3D geospatial datasets for projects including natural disaster mitigation and response, has added the processing and serving of NASA's “Blue Marble: Next Generation” satellite imagery of Earth to its list of achievements.

Beginning with high-resolution satellite imagery of Banda Aceh, Indonesia, acquired before and after last year's tsunami, and continuing through this year's U.S. hurricane season, especially the devastation of Hurricane Katrina in the Gulf states, SGI compute power and speed help SDSU deliver before-and-after 3D geospatial visualizations to relief workers and government officials. The Silicon Graphics Prism visualization system is an integral part of the geospatial image-processing pipeline for the Visualization Center's many efforts, including homeland security, remote sensing and environmental monitoring, global information sharing, and collaborative visualization.

“Blue Marble: Next Generation” uses imagery from NASA's 18 Earth-observing satellites, down-linked at the NASA Earth Observatory at Goddard Space Flight Center. SDSU both processes and serves out the images using a Silicon Graphics Prism system. “Blue Marble” offers a year's worth of monthly composites at a spatial resolution of 500 meters per pixel. These monthly images reveal seasonal changes to the land surface: the green-up and dying-back of vegetation in temperate regions such as North America and Europe, dry and wet seasons in the tropics, and advancing and retreating Northern Hemisphere snow cover, helping scientists in many disciplines make more detailed observations of our world.

According to NASA's web site, commenting on the upgrade from the original “Blue Marble,” “From a computer processing standpoint, the major improvement [in 'Blue Marble: Next Generation'] is the development of a new technique for allowing the computer to automatically recognize and remove cloud-contaminated or otherwise bad data — a process that was previously done manually.”

Additional processing is accomplished using the GeoMatrix Toolkit from GeoFusion, Inc. (http://www.geofusion.com), which is the backbone of the Visualization Center's high-performance imaging and GIS environment. The scalable computing power and large memory of the Silicon Graphics Prism system allow researchers to use GeoMatrix tools to process data for serving through the GeoPlayer ActiveX web browser plugin.

By visualizing hundreds of gigabytes to many terabytes of geospatial data, the researchers at the Visualization Center at SDSU are able to continuously create up-to-date 3D fly-throughs that depict the changes wrought by a natural disaster. The Silicon Graphics Prism system is the heart of the process and is used to create the new datasets, routinely processing 500 or more image files, each up to 200 MB in size, every night to build mosaics of a terabyte or more. The Silicon Graphics Prism system at SDSU has 24 GB of RAM and eight Intel Itanium 2 processors, and runs GeoMatrix and the open-source OSSIM tools under Linux. This configuration allows conversion of all data into easily accessible, open formats; the data is then stored back out to the servers at SDSC for public access on the Internet. In the case of Hurricane Katrina, WMS (Web Map Service) data were served directly from the Silicon Graphics Prism system.
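The article does not spell out the conversion commands, but a rough sense of such a nightly convert-and-mosaic step can be sketched with the open-source GDAL library. This is purely an illustrative assumption, not the center's actual GeoMatrix/OSSIM pipeline, and all file names and formats below are hypothetical.

```python
# Illustrative sketch only: the center's actual pipeline used GeoMatrix and
# OSSIM tools; this uses the open-source GDAL Python bindings instead, and
# all file names and formats are hypothetical.
import glob
import os
from osgeo import gdal

gdal.UseExceptions()
os.makedirs("geotiff", exist_ok=True)

# 1. Convert each incoming aerial frame to GeoTIFF, an open, widely readable format.
frames = sorted(glob.glob("incoming/*.ntf"))   # hypothetical nightly drop of ~500 frames
tifs = []
for src in frames:
    dst = os.path.join("geotiff", os.path.basename(src).replace(".ntf", ".tif"))
    gdal.Translate(dst, src, format="GTiff",
                   creationOptions=["TILED=YES", "COMPRESS=DEFLATE"])
    tifs.append(dst)

# 2. Stitch the converted frames into a single virtual mosaic.
gdal.BuildVRT("katrina_mosaic.vrt", tifs)

# 3. Materialize the mosaic and add reduced-resolution overviews so a map
#    server (for example, a WMS front end) can answer zoomed-out requests
#    without reading every full-resolution pixel.
gdal.Translate("katrina_mosaic.tif", "katrina_mosaic.vrt",
               creationOptions=["TILED=YES", "BIGTIFF=YES"])
ds = gdal.Open("katrina_mosaic.tif", gdal.GA_Update)
ds.BuildOverviews("AVERAGE", [2, 4, 8, 16, 32])
ds = None  # close the dataset to flush overviews to disk
```

In practice the center's GeoMatrix and OSSIM tools handled these steps at far larger scale, but the shape of the workflow is the same: convert, mosaic, build overviews, serve.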

To create the mosaics and 3D fly-throughs that Red Cross and other relief workers would use to determine whether Katrina's victims had a house to return to, SDSU acquired data from a number of sources. Most of the imagery of the affected Gulf states was collected by NOAA, which flew a specially equipped airplane to rapidly acquire high-resolution imagery over the damaged area. Similar high-resolution photography, especially the “before” imagery, was acquired from the USGS EROS Data Center and from other groups such as the Army Corps of Engineers. NASA satellites also provided before and after imagery, which offers a regional, multi-spectral perspective that can be combined with the high-resolution photography to provide location and context.

The DigitalGlobe satellite provided 60-centimeter imagery from before and after Katrina, offering one of the first compelling views of the storm's extraordinary impact in both flooding and destruction. The high-resolution color photography was acquired repeatedly as the water drained, providing insight into how conditions changed as levees broke and water flooded in and receded.

“The Silicon Graphics Prism performed incredibly well,” said Eric Frost, co-director of the Visualization Center at SDSU. “There were 5,000 aerial shots in the first batch of after-Katrina photos, and each photo was between 150 and 200 MB. Then 2,500 more would come in the next day, and the next, and so on. An inherent aspect of the photos, especially with low-altitude photography, is that the scale is different from the center of the photo to the outside edge because the airplane is much closer to the center of the shot than it is to the outside. It's like taking a picture of someone when you're too close, where the nose looks bigger than the rest of the face. In order to put all those photos together into a mosaic and to add GIS data onto them, you actually have to process them in a way where you know exactly what the errors are and you can move all the pixels to the right place,” Frost continued. “So just to begin work on the first 5,000 photos, the team of experts that came together around the Silicon Graphics Prism system color-balanced the photos and then geo-rectified them, meaning that you're putting all the pixels where they actually are on the Earth. That normally would be weeks or months of processing, but there was a very special coalition of extremely talented image processors and computer scientists that worked together from a number of different institutions.”
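As a minimal illustration of what geo-rectification means in practice (not the team's actual workflow), the sketch below ties a raw aerial frame to known ground coordinates with a few ground control points and warps it into a map projection using the open-source GDAL library; the coordinates and file names are hypothetical.

```python
# Minimal geo-rectification sketch, not the team's actual workflow.
# Ground control points (GCPs) tie pixel positions in the raw aerial frame to
# known longitude/latitude locations; warping then resamples every pixel to
# its correct place on the Earth.
from osgeo import gdal

gdal.UseExceptions()

# Hypothetical GCPs: gdal.GCP(lon, lat, elevation, pixel_column, pixel_row)
gcps = [
    gdal.GCP(-90.10, 29.98, 0.0,    0.0,    0.0),   # upper-left corner
    gdal.GCP(-90.05, 29.98, 0.0, 9000.0,    0.0),   # upper-right corner
    gdal.GCP(-90.05, 29.94, 0.0, 9000.0, 9000.0),   # lower-right corner
    gdal.GCP(-90.10, 29.94, 0.0,    0.0, 9000.0),   # lower-left corner
]

# Attach the GCPs, then warp the frame into geographic coordinates (EPSG:4326).
gdal.Translate("frame_gcp.tif", "frame_raw.tif",
               GCPs=gcps, outputSRS="EPSG:4326")
gdal.Warp("frame_georectified.tif", "frame_gcp.tif",
          dstSRS="EPSG:4326", resampleAlg="bilinear")
```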

Once the photos are geo-rectified, data on locations of roads, city boundaries, hospitals, schools, police stations, and fire stations are added. As more and more information comes in, other data sets are laid on: where the damage is; what assets belong to HUD; where the refineries are; where the gas stations are; and where hazardous waste materials are.

The U.S. Census Bureau's TIGER data, linked to the imagery by Howard Butler of Iowa State University, enabled the Katrina relief team to also include something called “geo-coding.” Geo-coding means that anyone working for the Red Cross at any shelter in the U.S. could type in an evacuee's address and the computer would immediately fly through the 3D geospatial dataset or a flat map to where that address is — or was — on that particular city block or country road.
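At its core, geo-coding is a lookup from a street address to map coordinates that the viewer can fly to. The toy sketch below illustrates the idea with a hypothetical in-memory table; the actual service was backed by the Census Bureau's TIGER street database rather than a hard-coded dictionary.

```python
# Toy geo-coding sketch with a hypothetical in-memory lookup table; the real
# service was backed by the Census Bureau's TIGER street database.
from typing import Optional, Tuple

# address -> (longitude, latitude); a couple of made-up entries for illustration
ADDRESS_TABLE = {
    "123 MAIN ST, NEW ORLEANS, LA": (-90.0715, 29.9511),
    "456 OAK AVE, BILOXI, MS": (-88.8853, 30.3960),
}

def geocode(address: str) -> Optional[Tuple[float, float]]:
    """Return (lon, lat) for an address, or None if it cannot be resolved."""
    return ADDRESS_TABLE.get(address.strip().upper())

def fly_to(address: str) -> None:
    """Resolve an address and print the fly-to target a 3D viewer would use."""
    coords = geocode(address)
    if coords is None:
        print(f"Address not found: {address}")
        return
    lon, lat = coords
    print(f"Fly to lon={lon:.4f}, lat={lat:.4f} for {address}")

fly_to("123 Main St, New Orleans, LA")
```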

The U.S. Navy also came to the team for geo-coding. The Navy wanted to know how many of its personnel, contractors, and their dependents had been affected by the disaster. Chuck Stein took the address dataset as a database file, used the GeoMatrix format to turn the addresses into 3D icons, and ran the data through GeoFusion on the Silicon Graphics Prism system out to a monitor. The result: little yellow flags came up at the addresses of all 8,000 people, and it was obvious that several thousand had lost their homes or had likely been evacuated because of the damage to, or location of, their homes. Within one day, the Navy was able to narrow the search to about a hundred people, and it found almost all of them the next day.

Once the data is uploaded to SDSU's Web site, any number of researchers, government agencies, or members of the public can use it. One of the most powerful uses of the data has been overlaying toxicity and medical data onto the images and helping interpret the ongoing stream of environmental monitoring data being collected by the many groups involved in decision-making about the future of the region and its people.

Duke University professor Marie Lynn Miranda, a specialist in children's environmental health, considers lead contamination one of the biggest issues in the recovery. Professor Miranda is leading the National Institute of Environmental Health Sciences (NIEHS) effort, and the Visualization Center at SDSU is providing the computational power on the Silicon Graphics Prism system for the work she and others are doing.

Researchers at UCSD who work with NIEHS on Superfund site efforts, including professors Bob Tukey and Mark Ellisman, connected SDSU and its imaging capabilities with Professor Miranda, whose GIS expertise and strategic position help decision-makers understand what the problems are and what possible solutions might be. Researchers like Professors Miranda, Tukey and Ellisman are using the completed datasets, adding census data and ongoing environmental measurements, to determine areas where people could move back and sections where people, especially children, who are more likely to develop severe physical problems from exposure to high lead levels, should never return. Being able to serve terabytes of imagery and GIS data to researchers and field workers, both in the affected area and in decision centers focused on helping the people of all the affected states, should provide a significant service to the nation for many years to come as the long-term medical impacts of Katrina and other hurricanes come to light.

“To really do the right thing for the people ravaged by Katrina, it takes a massive amount of data fusion, compositing the toxicity and damage data with imagery and then tracking the changes through time,” said John Graham, the Visualization Center's senior research scientist, who helped lead the processing effort and build the social network of specialists who made the Silicon Graphics Prism perform so remarkably. “When you're working with satellite and aerial photography, you can be dealing with multiple terabytes of data. This is where the Silicon Graphics Prism system really shows off the power of its shared-memory architecture, with its ability to take all the bricks and connect them to appear as one large computer with lots of memory. We then take the geo-referenced imagery, 'cook' it into GeoFusion OpenGL texture format and store it on high-speed servers, allowing anybody with a Windows PC that has an OpenGL video card and Internet Explorer — which almost all new machines do — to use the ActiveX web browser plugin and fly through those terabytes of data. But it's SGI technology, processing the data on the backend, that is making this all possible.”
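The “cooking” Graham describes is, in general terms, the pre-computation of a multiresolution tile pyramid, so a thin client only ever fetches the handful of tiles covering its current view at its current zoom level. The sketch below is a back-of-the-envelope illustration of how a terabyte-scale mosaic breaks down into pyramid levels; it assumes nothing about GeoFusion's proprietary texture format, and the mosaic dimensions are hypothetical.

```python
# Back-of-the-envelope sketch of a multiresolution tile pyramid; GeoFusion's
# actual texture format is proprietary, so the numbers here are illustrative.
import math

def pyramid_levels(width_px: int, height_px: int, tile: int = 256):
    """Yield (level, tiles_x, tiles_y) from full resolution down to one tile."""
    level = 0
    w, h = width_px, height_px
    while True:
        tiles_x = math.ceil(w / tile)
        tiles_y = math.ceil(h / tile)
        yield level, tiles_x, tiles_y
        if tiles_x == 1 and tiles_y == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
        level += 1

# A hypothetical 600,000 x 400,000 pixel mosaic (roughly a terabyte at 3 bytes per pixel)
for level, tx, ty in pyramid_levels(600_000, 400_000):
    print(f"level {level}: {tx} x {ty} tiles ({tx * ty:,} total)")
```

Because only the visible tiles at the visible level cross the network, a desktop PC with an ordinary OpenGL card can fly through a dataset that lives entirely on the back-end servers.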

While gratified by the huge amount of work by fellow scientists, researchers, and volunteers it took to deliver these datasets to help Katrina victims, including GeoFusion, which wrote code to help everything move even faster, Frost envisions much faster access to the original data. He sees that coming both through networks like the National LambdaRail, a dark-fiber grid used by universities, and through the addition of 10 Gigabit Ethernet connections, especially at government facilities such as the EROS Data Center and NASA's Goddard Space Flight Center, as well as at groups in the Washington, D.C. area such as the Naval Research Laboratory and the OSSIM researchers involved with the effort.

SDSU's first source of Katrina data was the U.S. Geological Survey's Earth Resources Observation and Science (EROS) Data Center in Sioux Falls, S.D., the nation's central archive for remotely sensed land data. When Graham first accessed EROS to download the data and FTP it, the screen said, “Estimated time: 218 hours” for one dataset. All the needed datasets were eventually copied to disk and FedExed or flown to SDSU.

“It is clear that scientific visualization has reached a new and powerful level to model, predict and plan for natural disasters,” said SGI's Bishop. “The Visualization Center at San Diego State University offers a clear demonstration of how this technology can be used to triumph over adversity.”
