April 15, 2009
April 14 -- Private companies, universities and government agencies are joining forces to bring scientific research into the era of "cloud computing," the name for massive clusters of computers connected through the Internet.
The University of Washington has won three recent awards from the National Science Foundation related to cloud computing. Two of the grants will fund projects examining ocean climate simulations and analyzing astronomical images; both will build tools that let researchers use cloud computing to interact easily with the massive datasets becoming increasingly common in science. A third grant to the UW provides curriculum and training for teaching cloud computing.
The projects are funded through NSF's Cluster Exploratory program, which gives researchers access to a cloud datacenter established for educational use in 2007 through a partnership between Google, IBM and six academic institutions; the UW was the partnership's first academic member. NSF joined the group last year.
Climate modelers are beginning to use computer simulations in more exploratory ways, said Bill Howe, a researcher at the UW's eScience Institute, a newly established group that supports data-intensive research at the university. Instead of running a simulation to test a single hypothesis, climate scientists are now running long-term simulations and then sifting through tens of thousands of gigabytes of resulting data to discover trends.
"Using current tools, you can comfortably analyze and visualize datasets that fit in the computer underneath your desk," Howe said. "But you can't comfortably and interactively explore datasets at this new scale."
Howe's project aims to provide that interactivity for tens of thousands of gigabytes of simulation results. He created a tool, GridFields, to visualize the polygonal mesh of climate simulation output, and is now working to redesign GridFields to be efficient in a cloud computing environment. Collaborators at the University of Utah have an award under the same program to extend an accompanying system that makes it easier to write and keep track of computer programs.
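The trend-discovery workload Howe describes can be sketched in a few lines: stream through many timesteps of simulation output and accumulate a per-cell temperature trend, so the full dataset never has to fit in memory. The data layout and field meanings below are hypothetical, and this is a single-machine stand-in for what a tool like GridFields would do across a cloud cluster, not its actual API.

```python
import numpy as np

def per_cell_trend(timesteps):
    """Accumulate least-squares temperature trends for every mesh cell.

    `timesteps` yields (time, values) pairs, where `values` holds one
    temperature per mesh cell. Only running sums are kept, so an
    arbitrarily long simulation can be streamed through.
    """
    n = 0
    sum_t = sum_tt = 0.0
    sum_y = sum_ty = None
    for t, y in timesteps:
        y = np.asarray(y, dtype=float)
        if sum_y is None:
            sum_y = np.zeros_like(y)
            sum_ty = np.zeros_like(y)
        n += 1
        sum_t += t
        sum_tt += t * t
        sum_y += y
        sum_ty += t * y
    # Slope of the least-squares line fit, computed per cell.
    denom = n * sum_tt - sum_t ** 2
    return (n * sum_ty - sum_t * sum_y) / denom

# Hypothetical data: cell 0 warms by 0.1 degrees per step,
# cell 1 cools by 0.1 degrees per step.
steps = [(t, [20.0 + 0.1 * t, 20.0 - 0.1 * t]) for t in range(100)]
trend = per_cell_trend(steps)
```

The streaming accumulation is the design point: the same running sums can be computed in parallel over chunks of timesteps and merged, which is what makes the pattern a natural fit for a cloud cluster.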
"We need to get smart sooner rather than later on how to design and build a system that doesn't just live out on these machines at government or company data centers, but extends the cloud right down to your computer," Howe said.
Someday the tool should be easy enough that undergraduates and high-school students could sift through raw data themselves, he said.
A second grant will use cloud computing to study astronomical images. Astronomy has changed dramatically during the past decade, said Andrew Connolly, a UW associate professor of astronomy who was awarded the grant with UW research scientist Jeffrey Gardner. Scientists once competed for time on telescopes, recorded data and then studied the individual images in detail. Now telescopes continuously record high-resolution images that are available to all, providing millions of times more information.
"In the past I could have spent a couple of hours working on a single image. But now, if I have to multiply it by factors of many tens of thousands, that couple of hours each becomes something that's not feasible," Connolly said.
Companies such as Google, Microsoft, Amazon and Yahoo! have now created frameworks that make it easier to store and process information in the cloud and make the information available over the Web.
"We want to use these frameworks to enable science, and make it so that astronomers can come in and do the work that they need to do without needing to learn the intricacies of how to work with thousands of machines," Connolly said.
His grant will prepare astronomers to deal with data coming from telescopes scheduled to come online in coming years, such as the Large Synoptic Survey Telescope, of which the UW is a founding institution. The telescope's 27-foot mirror is connected to a 3.2 billion-pixel camera that takes pictures every 15 seconds. It is expected to record more than 30,000 gigabytes of data and detect more than 100 million astronomical sources every night.
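The quoted nightly volume can be sanity-checked with a rough back-of-envelope. The bytes-per-pixel figure and the length of the observing night below are assumptions, not numbers from the article:

```python
# Back-of-envelope for the nightly LSST raw data volume.
# Assumptions (not from the article): 2 bytes per raw pixel,
# a 10-hour observing night, one exposure every 15 seconds.
pixels_per_image = 3_200_000_000                 # 3.2-gigapixel camera
bytes_per_pixel = 2
gb_per_image = pixels_per_image * bytes_per_pixel / 1e9   # 6.4 GB

night_seconds = 10 * 3600
images_per_night = night_seconds // 15           # 2,400 exposures

raw_gb_per_night = gb_per_image * images_per_night
```

Under these assumptions the raw pixels alone come to roughly 15,000 GB a night; calibration frames and processed data products could plausibly account for the rest of the article's 30,000 GB figure, though that split is itself an assumption.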
"Cloud computing enables us to scale to the point where we can actually analyze that sort of data," Connolly said.
The third grant funded a three-day workshop held in Seattle last July, in which computer science professors learned from UW computer science and engineering faculty and students how to teach cloud computing skills.
"The rapid evolution of sensors is transforming all sciences from data-poor to data-rich," said Ed Lazowska, a UW professor of computer science and engineering who led the workshops. "The challenge is to use modern cloud computing resources, such as Amazon Web Services, and modern computer science advances, such as data mining and machine learning, to explore these massive volumes of data. This new computational science will be pervasive and will have enormous impact. UW is fortunate to be in on the ground floor."
The UW is the only institution to have won three awards through NSF's new data-intensive computing programs, and its combined award value of nearly $700,000 is the largest of any institution.
Source: the University of Washington