January 09, 2013
SEATTLE, Wash. Jan. 9 – The deluge of data coming from today's countless electronic devices will be harnessed to take on the most pressing problems facing science and society at a new computational institute in Seattle.
The Northwest Institute for Advanced Computing is being formed by the University of Washington and the Department of Energy's Pacific Northwest National Laboratory in Richland, Wash. Researchers associated with the institute will work to ensure the next generation of computers and the methods used to run them can address challenges ranging from climate change to energy management.
"Computing has transformed science, engineering and society in remarkable ways," said Doug Ray, associate director of PNNL's Fundamental & Computational Sciences Directorate. "But as huge amounts of new data are generated daily by scientific instruments and household electronics, new technologies and approaches are needed to give that information more meaning. Researchers at the Northwest Institute for Advanced Computing will tackle 'big data' and help improve the quality of life for many U.S. citizens."
Located on UW's campus, the institute will be a center of collaboration where UW and PNNL researchers jointly explore advanced computer system designs, accelerate data-driven scientific discovery and improve computational modeling and simulation. Scientists and engineers at the institute will also train future researchers in modern computational approaches.
The institute's research will aim to help solve a wide range of growing global problems. For example, improved computational techniques can help design a smart electric grid that reliably delivers energy to keep homes warm and lit. Better analysis of biological data can help determine the causes of diseases and how best to treat them.
Computer modeling can help explain how climate change affects natural resources such as snowpacks, as well as the formation of greenhouse gas-capturing molecules in the atmosphere. And smartphone data can be used to improve urban life, such as by reducing the time cars spend idling in traffic, which also cuts their carbon emissions.
"The expanded partnership between UW and PNNL will create tremendous new opportunities for both organizations," said Ed Lazowska, UW's Bill & Melinda Gates Chair in Computer Science & Engineering and director of the UW eScience Institute. "'Big data' is transforming the process of discovery in all fields. UW and PNNL have significant and complementary strengths; together we'll be able to do amazing things."
UW and PNNL also hope to strengthen the Northwest's economy. The institute will build on UW's and PNNL's existing and already-strong relationships with the region's private technology industry. The institute will also help grow the region's skilled workforce for UW, PNNL, the Northwest technology sector and beyond.
Two co-directors will lead the institute: UW electrical engineering chair and Applied Computational Engineering Lab Director Vikram Jandhyala and PNNL Fellow Moe Khaleel, who directs PNNL's Computational Science and Mathematics research division. PNNL is funding the time spent by both Jandhyala and Khaleel leading the institute. Institute members from UW and PNNL will jointly submit proposals to various funding agencies for new research projects.
"This collaboration will open up new avenues for research at the interface of computational advances and applications, and is a great synergy for UW and PNNL," Jandhyala said.
"This will be an interdisciplinary place for UW faculty in computer science, electrical engineering and applied math to work with PNNL colleagues on areas such as computational physics, big data, cyber security and computing for the smart grid," Khaleel said.
The institute's headquarters are inside UW's Sieg Hall, but its activities will extend well beyond that single location. Its research members will hail from many of UW's schools and colleges, and PNNL scientists and engineers will work from both Seattle and the national laboratory's main campus in Richland.
PNNL currently has two scientists who conduct DOE-funded research related to big data and nuclear physics from UW's Seattle campus. About eight more PNNL researchers are expected to join them in Seattle by the end of 2013. All Seattle-based PNNL researchers involved in advanced computing will be associated with the institute. And initially more than a dozen of UW's faculty members are expected to join the institute.
Institute members will use computational resources already in place at their home institutions. In Seattle, that includes the Hyak supercomputer developed by UW's eScience Institute and UW-IT. Richland resources include components of the PNNL Institutional Computing program, which features the Olympus supercomputer. Cloud resources will also be used extensively.
Both UW and PNNL are well known for their contributions to advanced computing. UW is known for its computer science and engineering, electrical engineering and applied mathematics programs. UW's eScience Institute has advanced data-driven discovery, and the university's computational programs in physics, chemistry and astronomy are highly regarded.
And PNNL is known for designing and programming high-performance computers and evaluating their performance. PNNL leads research in computational molecular science, multi-scale mathematics, regional climate modeling and the modeling of underground fluids such as water.
Founded in 1861, the University of Washington is one of the oldest state-supported institutions of higher education on the West Coast and is one of the world's preeminent research universities.
Source: University of Washington