April 4, 2014

Data Management in Times of Disaster

Tiffany Trader

When natural disaster strikes – be it a flood, an earthquake or a tsunami – every second counts. Just as emergency teams must be ready to go at a moment's notice, so must critical data management systems. This important topic, an essential element of civil protection around the world, is the focus of a research paper recently published in Geophysical Research Abstracts.

Written by a team of German researchers from Leibniz Supercomputing Centre and Ludwig-Maximilians-Universität, the paper describes a resource-independent data management system for urgent natural disaster computing. Such systems are critical to helping public officials make timely decisions for affected areas, getting care to the injured and reducing casualties. Computer simulations can offer predictive scenarios that further aid this decision-making process, the authors assert, but doing so requires routing the data to the required resources in time.

Because of the unpredictable nature of natural disasters, an urgent data management system for natural disaster computing has some unique requirements. Key among these are resource-independence, the ability to manage deadlines and the capacity to handle huge volumes of data. Other desirable characteristics include fault tolerance, reliability, flexibility to adapt to changes, and ease of use.

The authors have proposed a data management system that addresses these needs by employing multiple managers. These include “a service manager to provide a uniform and extensible interface for supported data protocols, a configuration manager to check and retrieve configurations of available resources, a scheduler manager to ensure that the deadlines can be met, a fault tolerance manager to increase the reliability of the platform and a data manager to initiate and perform the data activities.”

The service manager works with the four other managers to orchestrate the data activities so that deadlines are met for a given task, be it data staging or computation. The researchers associate two types of deadlines with the urgent computing system, illustrated in the sketch that follows the list below.

1. Soft-firm deadline: Missing a soft-firm deadline renders the computation less useful, resulting in a cost that can have severe consequences.
2. Hard deadline: Missing a hard deadline renders the computation useless and results in catastrophic consequences.
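To make the distinction concrete, here is a minimal sketch of how a deadline-aware scheduler might value a result under the two deadline classes; the linear cost model and all names are assumptions made for illustration, not taken from the paper.

```python
# Minimal sketch of how a scheduler might value results under the two deadline
# classes; the linear cost model and all names are illustrative assumptions.
from enum import Enum


class DeadlineClass(Enum):
    SOFT_FIRM = "soft-firm"  # missed: result less useful, at a potentially severe cost
    HARD = "hard"            # missed: result useless


def value_of_result(deadline_class, finished_at, deadline,
                    base_value=1.0, penalty_per_second=0.05):
    """Return an illustrative 'usefulness' score for a computation that
    finishes at `finished_at` against a `deadline` (both in seconds)."""
    if finished_at <= deadline:
        return base_value                 # deadline met: full value
    if deadline_class is DeadlineClass.HARD:
        return 0.0                        # hard deadline missed: useless
    # Soft-firm deadline missed: still usable, but value degrades with delay.
    return max(0.0, base_value - penalty_per_second * (finished_at - deadline))


print(value_of_result(DeadlineClass.HARD, finished_at=130.0, deadline=120.0))       # 0.0
print(value_of_result(DeadlineClass.SOFT_FIRM, finished_at=130.0, deadline=120.0))  # 0.5
```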

The researchers have built a prototype data management system around a REST-based service manager, which they believe provides a uniform, easy-to-use and resource-independent interface, making it a crucial computing capability in support of disaster relief efforts.
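The abstract does not detail the prototype's endpoints. As a rough illustration of what a REST-based service manager could look like, the sketch below exposes a single hypothetical /stage endpoint using Flask; the framework, the route and the payload fields are assumptions, not the authors' prototype.

```python
# Hypothetical REST front end for the service manager; the /stage route, the
# payload fields and the use of Flask are assumptions, not the authors' prototype.
from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder in-memory resource catalogue; in a full system this would come
# from the configuration manager.
RESOURCES = {"hpc-a": {"online": True, "bandwidth_bps": 1.25e9}}


@app.route("/stage", methods=["POST"])
def stage():
    """Accept a data-staging request carrying a size and a deadline in seconds."""
    req = request.get_json(force=True)
    size_bytes = float(req["size_bytes"])
    deadline_seconds = float(req["deadline_seconds"])

    for name, cfg in RESOURCES.items():
        if cfg["online"] and size_bytes / cfg["bandwidth_bps"] <= deadline_seconds:
            # A full system would now hand the transfer to the data manager.
            return jsonify({"accepted": True, "resource": name}), 202

    return jsonify({"accepted": False, "reason": "deadline cannot be met"}), 409


if __name__ == "__main__":
    app.run(port=8080)
```

A client would POST a JSON body such as {"size_bytes": 1e9, "deadline_seconds": 10} and receive an accept or reject decision, keeping the interface uniform regardless of the underlying data protocol or resource.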

The research paper will be presented during the poster session at the European Geosciences Union General Assembly 2014, on April 28, 2014, in Vienna, Austria.
