January 15, 2009
Green supercomputing paves the way for sharing scientific research and collaboration
Jan. 14 -- As a solution to climate change, energy efficiency often ranks second to renewable power generation, but for Phillip Dickens, a professor at the University of Maine, it comes first in the world of supercomputing.
With the evolution of supercomputers over the last few years, speed, memory and capability have increased tremendously. The early models of computers were enormous in size, often filling a large room. One of the challenges of having a computer this size was the large amount of heat that it generated and how to keep it cool. Two dominant factors that have influenced supercomputer design are Moore's Law and economies of scale.
Today, a modern desktop computer running at 2.66 GHz is more powerful than a 10-year-old supercomputer, and costs considerably less.
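The comparison with a decade-old supercomputer is consistent with a quick Moore's Law estimate. A minimal back-of-envelope sketch, assuming the commonly quoted two-year doubling period:

```python
# Back-of-envelope Moore's Law estimate: transistor counts double
# roughly every two years, so ten years separates a new machine
# from an old one by about five doublings.
years = 10
doubling_period_years = 2  # assumed; the commonly quoted figure
doublings = years / doubling_period_years
growth_factor = 2 ** doublings
print(f"~{growth_factor:.0f}x more transistors after {years} years")
```

Roughly a 32-fold increase in transistor budget, before counting clock-speed and architectural gains, which is why a commodity desktop can overtake yesterday's supercomputer.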
In addition, parallelization allows many smaller processors to work on a problem simultaneously, although the amount of information that must be transferred between processing units becomes a limiting factor. Clusters of computers can also be programmed to function as a single large machine. There are many possibilities for maximizing the energy efficiency of supercomputer hardware and software.
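The parallelization idea described above can be sketched in a few lines: split a problem into independent chunks, let each worker process compute its piece, and exchange only the small per-chunk results rather than the full data. This is an illustrative sketch, not code from the Maine project:

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Each worker sums its own slice of the range independently."""
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    with Pool(workers) as pool:
        # Only the small per-chunk sums cross process boundaries,
        # not the underlying data -- communication is the scarce resource.
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # equals sum(range(n)), computed in parallel
```

The same decomposition pattern scales from the cores of one machine to a cluster programmed to act as a single computer, with inter-node communication playing the role the process boundary plays here.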
Dickens received a two-year, $200,000 National Science Foundation (NSF) research grant that funded the development of a scientific grid portal in Maine and the purchase of an energy-efficient supercomputer. The purpose of the grid portal is to make research from the University of Maine Institute of Climate Modeling available to everyone from top research scientists to school-age children.
Results of widely used ice sheet models, tools for climate change research, prototype versions of an object-based caching system, and real-time animations and video are just a few of the applications that will be available. In addition, the grid portal will provide the larger community with the computing power, storage capacity and rendering engine needed to execute very high-resolution models and receive animations and other visual information in real time.
In a demonstration of energy efficiency, the University of Maine, Department of Computer Science, unveiled the first cyclist-powered "green" supercomputer. Powered by 10 cyclists, the eco-friendly SiCortex SC648 supercomputer successfully ran a program demonstrating glacial melting for 20 minutes.
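A quick sanity check makes the demonstration plausible: the SC648's quoted power draw is about 1,000 watts (a figure given later in the article), so ten riders would each need to sustain roughly 100 watts, a comfortable recreational cycling pace. The arithmetic, purely illustrative:

```python
# Per-rider power needed to run the SC648 demo.
system_watts = 1000  # SC648 draw quoted in the article
cyclists = 10
per_cyclist = system_watts / cyclists
print(f"{per_cyclist:.0f} W per rider")  # a modest sustained effort
```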
"The fact that a computer can be powered by a team of cyclists underscores how efficient computers have become," said University of Maine professor George Markowsky.
Computer scientists are continually searching for new ways to reduce the amount of energy it takes to operate computer systems. The SiCortex SC648 is the first of two low-power HPC systems to be deployed in the state of Maine. It combines desktop accessibility and the low cooling requirements of a standard PC with the speed and power needed for high-productivity computing. The SC648 can develop, distribute and run multiple applications while using only 1,000 watts of power. In addition to the SC648, the University of Maine has also invested in the SiCortex SC072. It answers complex scientific computing challenges as quickly as conventional cluster computers, but at a fraction of the energy requirement, using just 300 watts of power. Both systems will power the University of Maine Scientific Grid Portal and provide access to computing resources, scientific applications and research animations.
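The quoted power draws translate directly into running costs. A minimal sketch of the annual energy comparison, assuming continuous operation and an electricity price of $0.12/kWh (the price is an assumption, not a figure from the article):

```python
# Rough annual energy cost of the two systems at their quoted draws.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.12  # assumed rate, in dollars

def annual_cost(watts):
    """Yearly electricity cost for a system drawing `watts` continuously."""
    kwh = watts / 1000 * HOURS_PER_YEAR
    return kwh * PRICE_PER_KWH

sc648_cost = annual_cost(1000)  # SC648 at ~1,000 W
sc072_cost = annual_cost(300)   # SC072 at ~300 W
print(f"SC648: ${sc648_cost:.0f}/yr, SC072: ${sc072_cost:.0f}/yr")
```

Either figure is small next to a conventional cluster of comparable throughput, which is the point of the low-power designs; cooling savings would lower the effective cost further.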
"The computer industry is still in the early phase of realizing the importance of power consumption. Most computer users today don't think about energy and infrastructure costs when making a buying decision. With climate change looming, energy costs skyrocketing and the economy stumbling, power requirements are being thrust into the foreground," said James Bailey, marketing director of SiCortex.
Although Dickens is at the forefront of "green" supercomputing, his desire exceeds the current eco-friendly capabilities. "We are still in the beginning phase. It would have been nice to show multiple displays, all executing different phases, showing different processes. We have one application now and we are looking to get more because there is a lot of demand," he said.
An open exchange of scientific ideas and research
Most scientists conceal the findings of their research until their work is published, but Phillip Dickens is different. He would rather share. "What we are doing is trying to make some of our research available to the public and we are designing a special interface so it can be accessed by the public and students in K-12," said Dickens.
The grid portal is scheduled to launch in the summer of 2009 and will give users the opportunity to experiment with environmental parameters and examine various climate change scenarios.
Fostering collaboration and sharing research were once unthinkable in science, but the atmosphere is changing. An open exchange of scientific ideas and research materials requires a certain amount of trust, but the benefits are enormous.
Dickens' vision for outreach is that school children and powerful research groups alike will utilize the research and tools that are available through the Grid Portal, which will connect to the state's High Performance Optical Network.
"I'm working with people at Jackson Laboratory and they are doing very intensive applications, and have the ability to upload data and access it remotely. The supercomputer will work on the models and send back the visualizations. We can work together on large problems, but it basically moved us to a more joint effort of research. This is something that hasn't been done before in the state," Dickens said.
For the last few years, the general trend in high performance computing has moved to a Grid Model, where virtual organizations can cooperate even though they are geographically distributed. System-level science concerns the complex and multidisciplinary understanding of the behavior of large physical, biological or social systems. For the Climate Change Institute at the University of Maine, acquiring the topographical and climate data to construct a dynamic computer-generated ice sheet model is a multidisciplinary effort.
"I think the trend of sharing that we are doing, we are letting people not only run other people's models, but run our models as well. My hope is that by providing this infrastructure and access to models that it will start to establish a community of trust where people can share data and research," Dickens said.
Broader Impact: The Next Generation
The cyberinfrastructure that Dickens and his colleagues at the University of Maine have established provides opportunities for undergraduate students to become involved in research and development. "You can make projects that are very well suited for undergraduates. I've watched the evolution of students, and by the time summer was over, they had a good understanding of working on a team," Dickens said. "That is one of the tremendous benefits that the NSF grant has provided -- a perfect platform for undergraduate research."
The University of Maine is focused on responsible computing and hopes to pass its vision and mission along to its students. The development of the grid portal and the purchase of the eco-friendly supercomputer have transformed research in the state. Students have gained invaluable hands-on experience, an understanding of real-world challenges and an appreciation of the value of collaboration. In its efforts to foster collaboration between the lab and the outside world, the university is creating a new generation of IT experts who will be open to embracing a community-centric model of research.
"Beyond being an innovator in green computing, the University of Maine is at the head of the class in recognizing the future of computing and preparing its students for leadership," Bailey said. "Dr. Dickens is paving the way for students and faculty at the University of Maine to both advance in the state of computer science and to achieve more results using less power."
Related Web sites
For more information about the University of Maine's climate change programs, visit http://www.climatechange.umaine.edu/.
Source: University of Maine