August 12, 2010
Dr. Tim Killeen, representing the National Science Foundation (NSF), last week addressed the annual TeraGrid '10 conference in Pittsburgh, Pa. His keynote emphasized the urgent need for sustainable cyberinfrastructure in the geosciences and across all domains of science.
"The geosciences is a domain in which cyberinfrastructure is incredibly important," Killeen, the NSF assistant director for geosciences, said. "There is need for end-to-end cyberinfrastructure that is accessible to brilliant young career professionals across the country. We need the capabilities now."
One year ago, the NSF and funding agencies from countries including Brazil, Australia, Russia, Canada, France, Germany, Great Britain, and Japan declared that they would work collectively "to deliver knowledge to support human action and adaptation to regional environmental change" on global issues such as climate change and the availability of fresh water on the planet.
"We don't have a century to get this right," Killeen said. "We need the resources, sustained investments, smooth transitions, and accessibility of these resources to the brain trust of the nation and internationally. It's amazing when you look at the strategic plans of other countries and see how parallel they are to our own."
The crux of the challenge is developing an earth-human knowledge management system (aka "Earth-Cubed") to support a more complete understanding of the earth system and human interactions with that system. "It's a scientific and technical challenge for the 21st century," Killeen said.
Killeen asked each conference participant to consider their role and the TeraGrid's role with regard to "Earth-Cubed" and cited the geosciences as an example of the domain requirements and cyberinfrastructure needs put on the TeraGrid community.
"I have yet to see a high-performance computing center that doesn't use the geosciences as a driver or rationale as to why we need this type of capability," he said.
Yet, focusing on sustainability is going to stretch the NSF and other agencies to do this in a robust way, Killeen said. "It's going to place demands on the types of products and services that come out of TeraGrid. Cyberinfrastructure's interface with science and society is going to be challenging -- no question about it."
Currently, the NSF's Geosciences Directorate invests 10 percent of its overall budget in cyberinfrastructure, in addition to the investments made by the NSF Office of Cyberinfrastructure (OCI). "We like to invest in multiple approaches with multiple outcomes...things that enhance productivity and capability. We look to the community for direction and priority. We like to understand the full life cycle costs and process. We want to anticipate increases in demand, make new investments, and address workforce issues."
It's a very exciting time for the geosciences, according to Killeen. There are new ways of looking at the Earth system, and new methods for tackling much tougher problems that require advanced computing simulation. In addition, the data volumes are large and the data return is in the upper 90th percentile. "It's about the data and the value that data brings to understanding. Data-intensive computing is a very high priority," he said.
According to Killeen, on-demand, global experiments in the geosciences are becoming the norm. But, the changing context as it relates to human interactions with the Earth system is much more complex. It requires another level of integrated assessment models that can only be achieved through advanced computing simulation.
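To make the idea of an integrated assessment model concrete, here is a minimal toy sketch of the kind of coupling Killeen describes: a simple economy drives emissions, emissions drive warming, and warming feeds back as economic damages. All function names and coefficients are illustrative assumptions, not any actual NSF or community model; production integrated assessment models resolve far more processes and run on the advanced computing systems the article discusses.

```python
# Toy integrated assessment model: couples a one-equation economy
# to a one-equation climate, with warming feeding back as damages.
# All coefficients are illustrative placeholders, not calibrated values.

def run_toy_iam(years=100, output0=100.0, growth=0.02,
                emis_intensity=0.5, tcre=0.0005, damage_coef=0.002):
    """Step the coupled economy-climate system one year at a time.

    Returns a list of (output, temperature) pairs, one per year.
    """
    output = output0              # economic output (arbitrary units)
    cumulative_emissions = 0.0    # total emissions to date
    temperature = 0.0             # warming above baseline (deg C)
    history = []
    for _ in range(years):
        # economy -> climate: emissions scale with economic output
        emissions = emis_intensity * output
        cumulative_emissions += emissions
        # warming responds to cumulative emissions (a crude linear
        # stand-in for a transient climate response)
        temperature = tcre * cumulative_emissions
        # climate -> economy: quadratic damages reduce effective growth
        damages = damage_coef * temperature ** 2
        output *= (1.0 + growth - damages)
        history.append((output, temperature))
    return history

history = run_toy_iam()
final_output, final_temperature = history[-1]
```

Even this toy version shows why the feedback matters: damages eventually offset growth, so the trajectory is self-limiting rather than exponential. Realistic versions replace each one-line component with a full model of the oceans, atmosphere, land, and human behavior, which is what pushes the computational demand toward the advanced simulation capability Killeen cites.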
The next generation of models will be able to resolve detailed processes in the oceans, atmosphere and land. "We've reached this threshold, in part, by TeraGrid's efforts in building the appropriate cyberinfrastructure and computational capability. We're coming to a point where the sensor arrays and the development of the earth models are maturing at the same time. Models have become substantially more complete; they drive the most capable computation."
Killeen said the common architecture for cyberinfrastructure includes hardware, software, data provision, networking, sensor deployment, model assimilation, middleware, tools, Web services, cloud computing, free global access to information, and single-password authentication, as well as integrated services (data and instrument services, governance activities, computational expertise).
"Overall, it's a sustained investment and a balanced approach to drive transformation in the scientific disciplines. This is what NSF wants to get to...it involves training people to address incredibly important societal challenges with all the tools at our command. It's going to be the cyberinfrastructure that transforms the geosciences and takes it to the next level. We're now poised to do it in the next 10 years."