January 09, 2013
SEATTLE, Wash., Jan. 9 – The deluge of data coming from today's countless electronic devices will be harnessed to take on the most pressing problems facing science and society at a new computational institute in Seattle.
The Northwest Institute for Advanced Computing is being formed by the University of Washington and the Department of Energy's Pacific Northwest National Laboratory in Richland, Wash. Researchers associated with the institute will work to ensure the next generation of computers and the methods used to run them can address challenges ranging from climate change to energy management.
"Computing has transformed science, engineering and society in remarkable ways," said Doug Ray, associate director of PNNL's Fundamental & Computational Sciences Directorate. "But as huge amounts of new data are generated daily by scientific instruments and household electronics, new technologies and approaches are needed to give that information more meaning. Researchers at the Northwest Institute for Advanced Computing will tackle 'big data' and help improve the quality of life for many U.S. citizens."
Located on UW's campus, the institute will be a center of collaboration where UW and PNNL researchers jointly explore advanced computer system designs, accelerate data-driven scientific discovery and improve computational modeling and simulation. Scientists and engineers at the institute will also train future researchers in modern computational approaches.
The institute's research will aim to help solve a wide variety of the world's growing problems. For example, improved computational techniques can help design a smart electric grid that reliably delivers energy to keep homes warm and lit. Better analysis of biological data can help determine the causes of disease and how to treat health ailments.
Computer modeling can help explain how climate change affects natural resources such as snowpack and the formation of greenhouse gas-capturing molecules in the atmosphere. And smartphone data can be used to improve urban life, for instance by reducing traffic congestion and the carbon emissions from idling cars.
"The expanded partnership between UW and PNNL will create tremendous new opportunities for both organizations," said Ed Lazowska, UW's Bill & Melinda Gates Chair in Computer Science & Engineering and director of the UW eScience Institute. "'Big data' is transforming the process of discovery in all fields. UW and PNNL have significant and complementary strengths; together we'll be able to do amazing things."
UW and PNNL also hope to strengthen the Northwest's economy. The institute will build on UW's and PNNL's strong existing relationships with the region's private technology industry. The institute will also help grow the region's skilled workforce for UW, PNNL, the Northwest technology sector and beyond.
Two co-directors will lead the institute: UW electrical engineering chair and Applied Computational Engineering Lab Director Vikram Jandhyala and PNNL Fellow Moe Khaleel, who directs PNNL's Computational Science and Mathematics research division. PNNL is funding the time spent by both Jandhyala and Khaleel leading the institute. Institute members from UW and PNNL will jointly submit proposals to various funding agencies for new research projects.
"This collaboration will open up new avenues for research at the interface of computational advances and applications, and is a great synergy for UW and PNNL," Jandhyala said.
"This will be an interdisciplinary place for UW faculty in computer science, electrical engineering and applied math to work with PNNL colleagues on areas such as computational physics, big data, cyber security and computing for the smart grid," Khaleel said.
The institute's headquarters are inside UW's Sieg Hall, but its activities will extend well beyond that location. Its research members will hail from many of UW's schools and colleges, and PNNL scientists and engineers will work from both Seattle and the national laboratory's main campus in Richland.
PNNL currently has two scientists who conduct DOE-funded research related to big data and nuclear physics from UW's Seattle campus. About eight more PNNL researchers are expected to join them in Seattle by the end of 2013. All Seattle-based PNNL researchers involved in advanced computing will be associated with the institute. And initially more than a dozen of UW's faculty members are expected to join the institute.
Institute members will use computational resources already in place at their home institutions. In Seattle, that includes the Hyak supercomputer developed by UW's eScience Institute and UW-IT. Richland resources include components of the PNNL Institutional Computing program, which features the Olympus supercomputer. Cloud resources will also be used extensively.
Both UW and PNNL are well known for their contributions to advanced computing. UW's computer science and engineering, electrical engineering and applied mathematics programs are widely respected; its eScience Institute has advanced data-driven discovery; and the university's computational programs in physics, chemistry and astronomy are highly regarded.
And PNNL is known for designing and programming high-performance computers and evaluating their performance. PNNL leads research in computational molecular science, multi-scale mathematics, regional climate modeling and the modeling of underground fluids such as water.
Founded in 1861, the University of Washington is one of the oldest state-supported institutions of higher education on the West Coast and is one of the world's preeminent research universities.
Source: University of Washington