December 14, 2009
Internet-based access will focus on large, community-oriented data sets
Dec. 11 -- The San Diego Supercomputer Center (SDSC) at UC San Diego and Arizona State University have been awarded a $1.7 million grant from the National Science Foundation (NSF) to operate an Internet-based national data facility for high-resolution topographic data acquired with LiDAR (Light Detection and Ranging) technology. The facility will also provide online processing tools and act as a community repository for information, software and training materials.
The three-year project, which includes a grant of $1.4 million to SDSC and $300,000 to the School of Earth and Space Exploration at Arizona State University, will be based on SDSC's OpenTopography portal, which will be scaled up to a national facility to make topography data available in multiple formats. This includes "raw" LiDAR point cloud data, standard LiDAR-derived digital elevation models, and easily accessible Google Earth products to better serve LiDAR users at various levels of expertise.
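As a purely illustrative example of the Google Earth tier, the short Python sketch below wraps a pre-rendered hillshade image of a DEM in a KML GroundOverlay that can be opened directly in Google Earth; the file names and bounding box are hypothetical placeholders, not actual OpenTopography products.

    # Hypothetical helper: wrap a hillshade image of a DEM in a KML
    # GroundOverlay for viewing in Google Earth. The file name and
    # coordinates are placeholders.
    def dem_to_kml(image_href, north, south, east, west, name="LiDAR hillshade"):
        return f"""<?xml version="1.0" encoding="UTF-8"?>
    <kml xmlns="http://www.opengis.net/kml/2.2">
      <GroundOverlay>
        <name>{name}</name>
        <Icon><href>{image_href}</href></Icon>
        <LatLonBox>
          <north>{north}</north><south>{south}</south>
          <east>{east}</east><west>{west}</west>
        </LatLonBox>
      </GroundOverlay>
    </kml>"""

    with open("hillshade.kml", "w") as f:
        f.write(dem_to_kml("hillshade.png", 33.40, 33.35, -116.55, -116.60))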
OpenTopography currently hosts and distributes a limited number of data sets acquired with funding from the NSF, NASA, and the U.S. Geological Survey (USGS). It is the product of the NSF-funded GEON (GeoSciences Network) project that has developed cyberinfrastructure for the integration of three- and four-dimensional earth science data.
"The fundamental goal of this project is to provide centralized access to community earth science LiDAR topography data," said Christopher Crosby, SDSC's project manager for the OpenTopography Facility. "There is wealth of public domain LiDAR data available, but much of it is not yet easily accessible. We intend to leverage available cyberinfrastructure to make these powerful data sets, as well as online processing tools and knowledge resources, accessible to a large and diverse user community."
The OpenTopography Facility will be primarily focused on large, community-oriented, scientific data sets, while building collaborations with existing LiDAR topography data providers and hosts such as the USGS and the NSF-funded National Center for Airborne Laser Mapping (NCALM) to link to their data archives and/or to host and distribute their data. An advisory committee representing OpenTopography users will prioritize which data sets are of greatest value to the community.
One of the most powerful tools available for studying the Earth's surface, overlying vegetation and man-made structures, high-resolution LiDAR data is widely regarded as revolutionary for earth science, environmental and engineering applications, as well as natural hazard studies. LiDAR makes it possible to generate digital elevation models (DEMs) at resolutions more than an order of magnitude better than those otherwise available. Moreover, large geographic areas can be surveyed at relatively low expense.
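To make the point-cloud-to-DEM step concrete, here is a minimal numpy sketch of one common gridding approach, local binning, in which each grid cell takes the mean elevation of the LiDAR returns that fall inside it. The function and variable names are ours, and this is a simplified illustration rather than the facility's production pipeline.

    import numpy as np

    def points_to_dem(x, y, z, cell_size):
        """Grid irregular LiDAR returns into a DEM by local binning:
        each cell's value is the mean elevation of the points in it."""
        # Snap the grid origin to the lower-left corner of the point cloud.
        x0, y0 = x.min(), y.min()
        cols = np.floor((x - x0) / cell_size).astype(int)
        rows = np.floor((y - y0) / cell_size).astype(int)
        shape = (rows.max() + 1, cols.max() + 1)

        # Accumulate elevation sums and point counts per cell.
        sums = np.zeros(shape)
        counts = np.zeros(shape)
        np.add.at(sums, (rows, cols), z)
        np.add.at(counts, (rows, cols), 1)

        # Cells with no returns become NaN (data gaps).
        return np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)

    # Example: one million synthetic returns gridded at 1-meter resolution.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1000, 1_000_000)
    y = rng.uniform(0, 1000, 1_000_000)
    z = 100 + 10 * np.sin(x / 100) + rng.normal(0, 0.1, x.size)
    dem = points_to_dem(x, y, z, cell_size=1.0)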
"LiDAR topography data is revolutionizing the way we study the geomorphic processes acting along the Earth's surface," said Ramon Arrowsmith, associate professor in the School of Earth and Space Exploration at Arizona State University and project co-investigator. "From earthquake hazards research to examining the impact of human development on natural systems, LiDAR is emerging as a fundamental tool."
"High-resolution topographic data collection is burgeoning for research, planning and regulatory activities, yet the massive size of the data sets has made online community access to them difficult," said Chaitan Baru, SDSC Distinguished Scientist and principal investigator for OpenTopography and GEON. "LiDAR is an interesting test case because of those data volumes and the on-demand access our users require, but ultimately the strategies developed in this work could be applied to all types of scientific data over a very wide range of domains."
OpenTopography addresses the basic challenge of how to efficiently manage, archive, distribute, process and integrate tens of terabytes of community geospatial data. Many organizations that acquire LiDAR topography do not have the disk space, bandwidth, and in-house expertise necessary to make these data available via the Internet for community-level access and analysis.
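One common strategy for that problem, sketched below in Python, is to tile the archive and index each tile by its bounding box, so that an on-demand request reads only the few files that intersect the user's area of interest rather than scanning terabytes. The tile catalog and file paths here are hypothetical stand-ins, not OpenTopography's actual implementation.

    from dataclasses import dataclass

    @dataclass
    class Tile:
        path: str     # location of the tile in the archive (hypothetical)
        xmin: float   # tile bounding box
        ymin: float
        xmax: float
        ymax: float

    def intersects(t, xmin, ymin, xmax, ymax):
        """True if the tile's bounding box overlaps the query box."""
        return not (t.xmax < xmin or t.xmin > xmax or
                    t.ymax < ymin or t.ymin > ymax)

    def subset(index, xmin, ymin, xmax, ymax):
        """Select only the tiles a query needs, instead of scanning all of them."""
        return [t for t in index if intersects(t, xmin, ymin, xmax, ymax)]

    index = [Tile("tiles/0_0.las", 0, 0, 1000, 1000),
             Tile("tiles/1_0.las", 1000, 0, 2000, 1000)]
    hits = subset(index, 900, 100, 1100, 200)   # query straddles both tiles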
The OpenTopography LiDAR Facility is funded under NSF award numbers 0930731 (SDSC) and 0930643 (ASU).
As an organized research unit of UC San Diego, SDSC is a national leader in creating and providing cyberinfrastructure for data-intensive research. Cyberinfrastructure refers to an accessible and integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC recently doubled its size to 160,000 square feet with a new, energy-efficient building and datacenter extension, and is a founding member of TeraGrid, the nation's largest open-access scientific discovery infrastructure.
San Diego Supercomputer Center (SDSC): www.sdsc.edu
School of Earth and Space Exploration, Arizona State University: http://sese.asu.edu
OpenTopography Facility: www.opentopography.org
Geosciences Network (GEON) project: www.geongrid.org
National Science Foundation: www.nsf.gov
UC San Diego: www.ucsd.edu